NASA Astrophysics Data System (ADS)
Husain, S. Z.; Separovic, L.; Yu, W.; Fernig, D.
2014-12-01
Extended-range, high-resolution mesoscale simulations with limited-area atmospheric models, when applied to downscale regional analysis fields over large spatial domains, can provide valuable information for many applications, including the weather-dependent renewable energy industry. Long-term simulations over a continental-scale spatial domain, however, require mechanisms to control large-scale deviations of the high-resolution simulated fields from the coarse-resolution driving fields. Because enforcement of the lateral boundary conditions alone is insufficient to restrict such deviations, the large scales in the simulated high-resolution meteorological fields are spectrally nudged toward the driving fields. Different spectral nudging approaches, including the appropriate nudging length scales as well as the vertical profiles and temporal relaxations for nudging, have been investigated to propose an optimal nudging strategy. Impacts of time-varying nudging and generation of hourly analysis estimates are explored to circumvent problems arising from the coarse temporal resolution of the regional analysis fields. Although controlling the evolution of the atmospheric large scales generally improves the outputs of high-resolution mesoscale simulations within the surface layer, the prognostically evolving surface fields can nevertheless deviate from their expected values, leading to significant inaccuracies in the predicted surface-layer meteorology. A forcing strategy based on grid nudging of the different surface fields, including surface temperature, soil moisture, and snow conditions, toward their expected values obtained from a high-resolution offline surface scheme is therefore proposed to limit any considerable deviation.
Finally, wind speed and temperature at wind turbine hub height predicted by different spectrally nudged extended-range simulations are compared against observations to demonstrate possible improvements achievable using higher spatiotemporal resolution.
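The spectral nudging described above, relaxing only the low-wavenumber part of a simulated field toward the driving analysis while leaving small scales free to evolve, can be sketched in a few lines. The 1-D periodic setup, cutoff wavenumber, and relaxation strength below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def spectral_nudge(high_res, driving, cutoff_k, alpha=0.5):
    """Relax the large-scale (low-wavenumber) part of a 1-D periodic
    field toward a coarse driving field, leaving small scales untouched.

    high_res, driving : arrays on the same grid
    cutoff_k : wavenumbers <= cutoff_k are nudged ("large scales")
    alpha    : relaxation strength per step, in [0, 1]
    """
    fh = np.fft.rfft(high_res)
    fd = np.fft.rfft(driving)
    k = np.arange(fh.size)
    mask = k <= cutoff_k                       # large-scale band only
    fh[mask] += alpha * (fd[mask] - fh[mask])  # relax toward driver
    return np.fft.irfft(fh, n=high_res.size)

# Example: a field whose large-scale component has drifted from the driver
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
driving = np.sin(x)                              # large-scale "truth"
field = 1.5 * np.sin(x) + 0.2 * np.sin(20 * x)   # drifted, plus small scales
nudged = spectral_nudge(field, driving, cutoff_k=5, alpha=0.5)
```

In a real setup the relaxation strength would also vary with height and time, in line with the vertical profiles and temporal relaxations investigated in the abstract.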
OpenMP parallelization of a gridded SWAT (SWATG)
NASA Astrophysics Data System (ADS)
Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin
2017-12-01
Large-scale, long-term, high-spatial-resolution simulation is a common challenge in environmental modeling. A gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG), which integrates a grid modeling scheme with different spatial representations, faces the same challenge: its computational cost limits applications to very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is therefore integrated with SWATG (the combination is called SWATGP) to accelerate grid modeling at the HRU level. This parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed with one CPU in a 15-thread configuration. The results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale, high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
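The speedup rests on the fact that HRU computations within a time step are independent, so the HRU loop can be split across threads much as an OpenMP parallel-for would split it. A hypothetical Python sketch of that decomposition follows; the toy SCS curve-number runoff formula and all names are illustrative, not SWAT code, and the chunks here run sequentially where OpenMP would dispatch them to threads.

```python
import numpy as np

def runoff_per_hru(precip, cn):
    """Toy SCS curve-number runoff for each HRU (illustrative only).
    precip in mm, cn = curve number."""
    s = 25400.0 / cn - 254.0              # potential retention (mm)
    return np.where(precip > 0.2 * s,
                    (precip - 0.2 * s) ** 2 / (precip + 0.8 * s),
                    0.0)

def simulate(precip, cn, chunks=4):
    """Split HRUs into independent chunks, as an OpenMP 'parallel for'
    over HRUs would; each chunk could be handed to its own thread,
    since no chunk reads another chunk's output."""
    out = np.empty_like(precip)
    for part in np.array_split(np.arange(precip.size), chunks):
        out[part] = runoff_per_hru(precip[part], cn[part])
    return out

rng = np.random.default_rng(0)
precip = rng.uniform(0, 50, 1000)   # one storm depth per HRU (mm)
cn = rng.uniform(60, 95, 1000)      # curve numbers
q_chunked = simulate(precip, cn, chunks=8)
q_serial = runoff_per_hru(precip, cn)   # unchunked reference
```

Because the chunked and serial results are identical, the decomposition changes only wall-clock time, not the answer, which is exactly the property SWATGP exploits.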
Large-scale magnetic fields at high Reynolds numbers in magnetohydrodynamic simulations.
Hotta, H; Rempel, M; Yokoyama, T
2016-03-25
The 11-year solar magnetic cycle shows a high degree of coherence in spite of the turbulent nature of the solar convection zone. Recent high-resolution magnetohydrodynamics simulations have found that maintaining a large-scale coherent magnetic field is difficult with small viscosity and magnetic diffusivity (≲10^12 square centimeters per second). We reproduced previous findings that indicate a reduction of the energy in the large-scale magnetic field for lower diffusivities, and we demonstrate the recovery of the global-scale magnetic field using unprecedentedly high resolution. We found an efficient small-scale dynamo that suppresses small-scale flows, which mimics the properties of large diffusivity. As a result, the global-scale magnetic field is maintained even in the regime of small diffusivities, that is, large Reynolds numbers. Copyright © 2016, American Association for the Advancement of Science.
Trajectory Segmentation Map-Matching Approach for Large-Scale, High-Resolution GPS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Lei; Holden, Jacob R.; Gonder, Jeffrey D.
2017-01-01
With the development of smartphones and portable GPS devices, large-scale, high-resolution GPS data can be collected. Map matching is a critical step in studying vehicle driving activity and recognizing network traffic conditions from the data. A new trajectory segmentation map-matching algorithm is proposed to deal accurately and efficiently with large-scale, high-resolution GPS trajectory data. The new algorithm separates the GPS trajectory into segments, finds the shortest path for each segment, and ultimately generates a best-matched path for the entire trajectory. The similarity of a trajectory segment and its matched path is described by a similarity score system based on the longest common subsequence. Numerical experiments indicated that the proposed map-matching algorithm is very promising in terms of accuracy and computational efficiency. Large-scale data set applications verified that the proposed method is robust and capable of dealing with real-world, large-scale GPS data in a computationally efficient and accurate manner.
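The longest-common-subsequence similarity at the heart of the scoring system can be sketched as the classic dynamic program below. The normalization by the longer sequence length is an assumption for illustration, not necessarily the paper's exact score.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two sequences,
    via the standard O(len(a) * len(b)) dynamic program."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    return dp[m][n]

def similarity_score(traj_links, path_links):
    """Similarity of a GPS trajectory segment (as a sequence of road
    links) and a candidate matched path, normalized to [0, 1].
    The normalization choice here is a hypothetical one."""
    if not traj_links or not path_links:
        return 0.0
    return lcs_length(traj_links, path_links) / max(len(traj_links),
                                                    len(path_links))
```

A segment whose link sequence is entirely contained, in order, in the candidate path of the same length scores 1.0; unrelated sequences score near 0.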
NASA Astrophysics Data System (ADS)
Rasera, L. G.; Mariethoz, G.; Lane, S. N.
2017-12-01
Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface but at coarser spatial and temporal resolutions. Although useful for large scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution. In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.
Multi-scale approaches for high-speed imaging and analysis of large neural populations
Ahrens, Misha B.; Yuste, Rafael; Peterka, Darcy S.; Paninski, Liam
2017-01-01
Progress in modern neuroscience critically depends on our ability to observe the activity of large neuronal populations with cellular spatial and high temporal resolution. However, two bottlenecks constrain efforts towards fast imaging of large populations. First, the resulting large video data is challenging to analyze. Second, there is an explicit tradeoff between imaging speed, signal-to-noise, and field of view: with current recording technology we cannot image very large neuronal populations with simultaneously high spatial and temporal resolution. Here we describe multi-scale approaches for alleviating both of these bottlenecks. First, we show that spatial and temporal decimation techniques based on simple local averaging provide order-of-magnitude speedups in spatiotemporally demixing calcium video data into estimates of single-cell neural activity. Second, once the shapes of individual neurons have been identified at fine scale (e.g., after an initial phase of conventional imaging with standard temporal and spatial resolution), we find that the spatial/temporal resolution tradeoff shifts dramatically: after demixing we can accurately recover denoised fluorescence traces and deconvolved neural activity of each individual neuron from coarse scale data that has been spatially decimated by an order of magnitude. This offers a cheap method for compressing this large video data, and also implies that it is possible to either speed up imaging significantly, or to “zoom out” by a corresponding factor to image order-of-magnitude larger neuronal populations with minimal loss in accuracy or temporal resolution. PMID:28771570
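The spatial and temporal decimation by simple local averaging that drives the reported speedups can be sketched as a block mean over frames and pixel blocks. The movie shape and decimation factors below are illustrative.

```python
import numpy as np

def decimate(movie, st, ss):
    """Temporally and spatially decimate a (T, H, W) calcium movie by
    simple local averaging over st consecutive frames and ss x ss
    pixel blocks (trailing remainders are dropped)."""
    T, H, W = movie.shape
    T2, H2, W2 = T // st, H // ss, W // ss
    m = movie[:T2 * st, :H2 * ss, :W2 * ss]
    # Reshape so each averaging block gets its own axes, then mean them
    return m.reshape(T2, st, H2, ss, W2, ss).mean(axis=(1, 3, 5))

movie = np.arange(4 * 4 * 4, dtype=float).reshape(4, 4, 4)
small = decimate(movie, st=2, ss=2)   # 8x fewer samples
```

Decimating by a factor of 2 in each dimension shrinks the data eightfold while preserving block-averaged fluorescence, which is what makes demixing on the coarse data so much cheaper.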
Scaling an in situ network for high resolution modeling during SMAPVEX15
USDA-ARS?s Scientific Manuscript database
Among the greatest challenges within the field of soil moisture estimation is that of scaling sparse point measurements within a network to produce higher resolution map products. Large-scale field experiments present an ideal opportunity to develop methodologies for this scaling, by coupling in si...
Roughness of stylolites: implications of 3D high resolution topography measurements.
Schmittbuhl, J; Renard, F; Gratier, J P; Toussaint, R
2004-12-03
Stylolites are natural pressure-dissolution surfaces in sedimentary rocks. We present 3D high-resolution measurements at laboratory scales of their complex roughness. The topography is shown to be described by a self-affine scaling invariance. At large scales, the Hurst exponent is ζ1 ≈ 0.5, very different from that at small scales, where ζ2 ≈ 1.2. A crossover length scale at around Lc = 1 mm is well characterized. The measurements are consistent with a Langevin equation that describes the growth of a stylolitic interface as a competition between a destabilizing quenched material disorder and stabilizing forces: long-range elastic interactions at large scales and local surface tension effects at small scales.
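A Hurst exponent such as ζ1 can be estimated from how height-difference fluctuations grow with lag, std(h(x+L) - h(x)) ~ L^H. The sketch below applies this to a synthetic Brownian profile, for which H ≈ 0.5 is expected; the scales and profile are illustrative assumptions, not the stylolite data.

```python
import numpy as np

def hurst_exponent(h, scales):
    """Estimate a Hurst exponent from a 1-D profile h(x) via the
    scaling of height-difference fluctuations with lag L:
    std(h[x+L] - h[x]) ~ L^H, fit as a slope in log-log space."""
    w = [np.std(h[L:] - h[:-L]) for L in scales]
    slope, _ = np.polyfit(np.log(scales), np.log(w), 1)
    return slope

# Synthetic Brownian profile: increments are white noise, so H ~ 0.5
rng = np.random.default_rng(42)
profile = np.cumsum(rng.standard_normal(100_000))
H = hurst_exponent(profile, scales=[2, 4, 8, 16, 32, 64])
```

On real stylolite topography, fitting this slope separately below and above the ~1 mm crossover is what yields the two regimes ζ2 ≈ 1.2 and ζ1 ≈ 0.5.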
Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction
NASA Technical Reports Server (NTRS)
Li, Zhijin; Chao, Yi; Li, P. Peggy
2012-01-01
A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated, and the associated software system developed, to improve high-resolution coastal ocean prediction. The system improves coastal ocean prediction skill and has been used in support of operational coastal ocean forecasting systems and field experiments. It is designed to assimilate, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, while also constraining model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. Data assimilation is thus implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme allows the large scales and model bias to be constrained effectively by assimilating sparse vertical profiles, and the small scales by assimilating high-resolution surface measurements. MS-3DVAR thus enhances the capability of traditional 3DVAR to assimilate highly heterogeneously distributed observations, such as along-track satellite altimetry data, and in particular maximizes the extraction of information from limited numbers of vertical profile observations.
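The sequential large-to-small-scale idea can be illustrated in one dimension: apply the smooth part of the innovation first, then a separate, weaker correction to the residual small scales. This is a toy sketch with a moving-average filter and scalar gains, not the MS-3DVAR cost-function machinery; all parameters are illustrative.

```python
import numpy as np

def smooth(x, width):
    """Simple moving-average low-pass filter (reflects at the edges)."""
    k = np.ones(width) / width
    padded = np.pad(x, width, mode="reflect")
    return np.convolve(padded, k, "same")[width:-width]

def two_scale_analysis(background, obs, width=15, g_large=0.8, g_small=0.5):
    """Sequential two-scale correction of a 1-D background toward dense
    observations: the innovation's smooth (large-scale) part is applied
    first with one gain, then the small-scale residual with another."""
    innov = obs - background
    large = smooth(innov, width)
    analysis = background + g_large * large       # large scales first
    analysis += g_small * (innov - large)         # then small scales
    return analysis

x = np.linspace(0, 2 * np.pi, 200)
truth = np.sin(x) + 0.3 * np.sin(12 * x)          # two scales of signal
background = 0.5 * np.sin(x)                      # biased large scale
analysis = two_scale_analysis(background, truth)
```

Using a larger gain on the smooth component mirrors the scheme's intent: sparse but trusted profile data constrain the large scales strongly, while dense surface data correct the small scales.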
NASA Astrophysics Data System (ADS)
Xu, Jincheng; Liu, Wei; Wang, Jin; Liu, Linong; Zhang, Jianfeng
2018-02-01
De-absorption pre-stack time migration (QPSTM) compensates for the absorption and dispersion of seismic waves by introducing an effective Q parameter, making it an effective tool for 3D, high-resolution imaging of seismic data. Although the optimal aperture obtained via stationary-phase migration reduces the computational cost of 3D QPSTM and yields 3D stationary-phase QPSTM, computational efficiency remains the main problem in producing 3D, high-resolution images from real large-scale seismic data. In this paper, we propose a division method for large-scale, 3D seismic data to optimize the performance of stationary-phase QPSTM on clusters of graphics processing units (GPUs). We then design an imaging-point parallel strategy to achieve optimal parallel computing performance, and adopt an asynchronous double-buffering scheme for multiple streams to perform GPU/CPU parallel computing. Moreover, several key optimization strategies for computation and storage based on the compute unified device architecture (CUDA) were adopted to accelerate the 3D stationary-phase QPSTM algorithm. Compared with the initial GPU code, implementing the key optimization steps, including thread optimization, shared-memory optimization, register optimization, and use of special function units (SFUs), greatly improved efficiency. A numerical example employing real large-scale, 3D seismic data showed that our scheme is nearly 80 times faster than the CPU-QPSTM algorithm. Our GPU/CPU heterogeneous parallel computing framework significantly reduces the computational cost and facilitates 3D high-resolution imaging for large-scale seismic data.
NASA Astrophysics Data System (ADS)
Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela
2016-10-01
Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip with a micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.
Li, Zhijin; Vogelmann, Andrew M.; Feng, Sha; ...
2015-01-20
We produce fine-resolution, three-dimensional fields of meteorological and other variables for the U.S. Department of Energy's Atmospheric Radiation Measurement (ARM) Southern Great Plains site. The Community Gridpoint Statistical Interpolation system is implemented in a multiscale data assimilation (MS-DA) framework that is used within the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. The MS-DA algorithm uses existing reanalysis products and constrains fine-scale atmospheric properties by assimilating high-resolution observations. A set of experiments show that the data assimilation analysis realistically reproduces the intensity, structure, and time evolution of clouds and precipitation associated with a mesoscale convective system. Evaluations also show that the large-scale forcing derived from the fine-resolution analysis has an overall accuracy comparable to the existing ARM operational product. For enhanced applications, the fine-resolution fields are used to characterize the contribution of subgrid variability to the large-scale forcing and to derive hydrometeor forcing, which are presented in companion papers.
Bridging the scales in atmospheric composition simulations using a nudging technique
NASA Astrophysics Data System (ADS)
D'Isidoro, Massimo; Maurizi, Alberto; Russo, Felicita; Tampieri, Francesco
2010-05-01
Studying the interaction between climate and anthropogenic activities, specifically those concentrated in megacities/hot spots, requires describing processes over a very wide range of scales: from the local scale, where anthropogenic emissions are concentrated, to the global scale, where the impact of these sources is of interest. Describing all the processes at all scales within the same numerical implementation is not feasible because of limited computer resources, so different phenomena are studied by means of different numerical models covering different ranges of scales. The exchange of information from small to large scales is highly non-trivial, though of high interest. In fact, uncertainties in large-scale simulations are expected to receive a large contribution from the most polluted areas, where the highly inhomogeneous distribution of sources, combined with the intrinsic non-linearity of the processes involved, can generate non-negligible departures between coarse- and fine-scale simulations. In this work a new method is proposed and investigated in a case study (August 2009) using the BOLCHEM model. Monthly simulations at coarse (0.5°, European domain; run A) and fine (0.1°, central Mediterranean domain; run B) horizontal resolution are performed, using the coarse resolution as the boundary condition for the fine one. Another coarse-resolution run (run C) is then performed, in which the high-resolution fields, remapped onto the coarse grid, are used to nudge the concentrations over the Po Valley area. The nudging is applied to all gas and aerosol species of BOLCHEM. Averaged concentrations and variances over the Po Valley and other selected areas are computed for O3 and PM. Although the variance of run B is markedly larger than that of run A, the variance of run C is smaller, because the remapping procedure removes a large portion of the variance from the run B fields.
Mean concentrations show some differences depending on species: in general, mean values of run C lie between those of runs A and B. A propagation of the signal outside the nudging region is observed and is evaluated in terms of differences between the coarse-resolution runs (with and without nudging) and the fine-resolution simulation.
NASA Technical Reports Server (NTRS)
Dominguez, Anthony; Kleissl, Jan P.; Luvall, Jeffrey C.
2011-01-01
Large-eddy Simulation (LES) was used to study convective boundary layer (CBL) flow through suburban regions with both large and small scale heterogeneities in surface temperature. Constant remotely sensed surface temperatures were applied at the surface boundary at resolutions of 10 m, 90 m, 200 m, and 1 km. Increasing the surface resolution from 1 km to 200 m had the most significant impact on the mean and turbulent flow characteristics as the larger scale heterogeneities became resolved. While previous studies concluded that scales of heterogeneity much smaller than the CBL inversion height have little impact on the CBL characteristics, we found that further increasing the surface resolution (resolving smaller scale heterogeneities) results in an increase in mean surface heat flux, thermal blending height, and potential temperature profile. The results of this study will help to better inform sub-grid parameterization for meso-scale meteorological models. The simulation tool developed through this study (combining LES and high resolution remotely sensed surface conditions) is a significant step towards future studies on the micro-scale meteorology in urban areas.
NASA Astrophysics Data System (ADS)
Hendrickx, J. M. H.; Allen, R. G.; Myint, S. W.; Ogden, F. L.
2015-12-01
Large-scale mapping of evapotranspiration and root zone soil moisture is only possible when satellite images are used. The spatial resolution of this imagery typically depends on its temporal resolution, or the satellite overpass time. For example, the Landsat satellite acquires images at 30 m resolution every 16 days, while the MODIS satellite acquires images at 250 m resolution every day. In this study we deal with optical/thermal imagery, which is affected by cloudiness, in contrast to radar imagery, which penetrates clouds. Due to cloudiness, the effective temporal resolution of Landsat drops from 16 days to about one clear-sky image per month in the southwestern USA, and to about one every ten years in the humid tropics of Panama. Only by launching additional satellites can the temporal resolution be improved. Since this is too costly, an alternative is found in ground measurements with high temporal resolution (from minutes to days) but poor spatial resolution. The challenge for large-scale evapotranspiration and root zone soil moisture mapping is to construct a layer stack consisting of N time layers covering the period of interest, each containing M pixels covering the region of interest. We present examples from the Phoenix Active Management Area in AZ (14,600 km2), the Green River Basin in WY (44,000 km2), the Kishwaukee Watershed in IL (3,150 km2), the area covered by Landsat Path 28/Row 35 in OK (30,000 km2), and the Agua Salud Watershed in Panama (200 km2). In these regions we used Landsat or MODIS imagery for mapping evapotranspiration and root zone soil moisture with the algorithm Mapping EvapoTranspiration at high Resolution with Internalized Calibration (METRIC), together with meteorological measurements and, in some cases, either Large Aperture Scintillometers (LAS) or Eddy Covariance (EC). We conclude with lessons learned for future large-scale hydrological studies.
Improving crop condition monitoring at field scale by using optimal Landsat and MODIS images
USDA-ARS?s Scientific Manuscript database
Satellite remote sensing data at coarse resolution (kilometers) have been widely used in monitoring crop condition for decades. However, crop condition monitoring at field scale requires high resolution data in both time and space. Although a large number of remote sensing instruments with different...
Obtaining high-resolution stage forecasts by coupling large-scale hydrologic models with sensor data
NASA Astrophysics Data System (ADS)
Fries, K. J.; Kerkez, B.
2017-12-01
We investigate how "big" quantities of distributed sensor data can be coupled with a large-scale hydrologic model, in particular the National Water Model (NWM), to obtain hyper-resolution forecasts. The recent launch of the NWM is a prime example of how growing computational capacity is enabling a new generation of massive hydrologic models. While the NWM spans an unprecedented spatial extent, many questions remain about how to improve forecasts at the street level, the resolution at which many stakeholders make critical decisions. Further, the NWM runs on supercomputers, so water managers who have access to their own high-resolution measurements may not readily be able to assimilate them into the model. To that end, we ask: how can the advances of the large-scale NWM be coupled with new local observations to enable hyper-resolution hydrologic forecasts? A methodology is proposed whereby the flow forecasts of the NWM are directly mapped to high-resolution stream levels using dynamical system identification. We apply the methodology across a sensor network of 182 gages in Iowa. Approximately one third of these sites perform well in high-resolution flood forecasting when coupled with the outputs of the NWM. The quality of these forecasts is characterized using principal component analysis and random forests to identify where the NWM may benefit from new sources of local observations. We also discuss how this approach can help municipalities identify where they should place low-cost sensors to benefit most from the NWM's flood forecasts.
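A minimal stand-in for the flow-to-stage mapping is a first-order ARX model fit by least squares: stage_t = a*stage_{t-1} + b*flow_t + c. Everything below (the model form, the synthetic coefficients, the data) is an illustrative assumption, not the paper's actual identification procedure.

```python
import numpy as np

def fit_arx(stage, flow):
    """Fit stage_t = a*stage_{t-1} + b*flow_t + c by least squares,
    a minimal sketch of mapping modeled flow to observed stage."""
    A = np.column_stack([stage[:-1], flow[1:], np.ones(len(stage) - 1)])
    coef, *_ = np.linalg.lstsq(A, stage[1:], rcond=None)
    return coef

def forecast(coef, stage0, flow):
    """Roll the fitted model forward from an initial stage, driven by a
    flow series (e.g., an NWM flow forecast)."""
    a, b, c = coef
    out, s = [], stage0
    for q in flow:
        s = a * s + b * q + c
        out.append(s)
    return np.array(out)

# Synthetic truth generated by exactly such a model
rng = np.random.default_rng(1)
flow = 50 + 10 * np.sin(np.linspace(0, 6, 300)) + rng.normal(0, 1, 300)
stage = np.empty(300)
stage[0] = 2.0
for t in range(1, 300):
    stage[t] = 0.7 * stage[t - 1] + 0.02 * flow[t] + 0.1
coef = fit_arx(stage, flow)
```

Once fitted from historical records at a gage, the same recursion turns an NWM flow forecast into a local stage forecast at that gage.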
Advances in DNA sequencing technologies for high resolution HLA typing.
Cereb, Nezih; Kim, Hwa Ran; Ryu, Jaejun; Yang, Soo Young
2015-12-01
This communication describes our experience in large-scale, G group-level, high-resolution HLA typing using three different DNA sequencing platforms: ABI 3730 xl, Illumina MiSeq, and PacBio RS II. Recent advances in DNA sequencing technologies, so-called next-generation sequencing (NGS), have brought breakthroughs in deciphering genetic information in all living species at a large scale and at an affordable cost. The NGS DNA indexing system allows sequencing of multiple genes for a large number of individuals in a single run. Our laboratory has adopted and used these technologies for HLA molecular testing services. We found that each sequencing technology has its own strengths and weaknesses, and their sequencing performances complement each other. HLA genes are highly complex, and genotyping them is quite challenging. Using these three sequencing platforms, we were able to meet all requirements for G group-level, high-resolution, high-volume HLA typing. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
High-Resolution Climate Data Visualization through GIS- and Web-based Data Portals
NASA Astrophysics Data System (ADS)
WANG, X.; Huang, G.
2017-12-01
Sound decisions on climate change adaptation rely on an in-depth assessment of potential climate change impacts at regional and local scales, which usually requires finer-resolution climate projections at both spatial and temporal scales. However, effective downscaling of global climate projections is practically difficult owing to the lack of computational resources and/or long-term reference data. Although a large volume of downscaled climate data has been made available to the public, understanding and interpreting these large-volume climate data, and using them to drive impact assessment and adaptation studies, remain challenging for both impact researchers and decision makers. Such difficulties have become major barriers to informed climate change adaptation planning at regional scales. This research therefore explores new GIS- and web-based technologies to help visualize large-volume regional climate data with high spatiotemporal resolutions. A user-friendly public data portal, named the Climate Change Data Portal (CCDP, http://ccdp.network), will be established to allow intuitive and open access to high-resolution regional climate projections at local scales. The CCDP offers visual representation through geospatial maps and data downloading for a variety of climate variables (e.g., temperature, precipitation, relative humidity, solar radiation, and wind) at multiple spatial resolutions (i.e., 25-50 km) and temporal resolutions (i.e., annual, seasonal, monthly, daily, and hourly). The vast amount of information the CCDP encompasses can provide a crucial basis for assessing the impacts of climate change on local communities and ecosystems and for supporting better decision making under a changing climate.
A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume
NASA Astrophysics Data System (ADS)
Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration
2017-11-01
An accurate representation of sub-grid-scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly resolved instantaneous field measurements in unconfined turbulent plumes, useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated by applying a virtual interrogation window of varying size to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e., statistics at the sub-grid scale) and on interrogation windows (i.e., statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined for achieving converged statistics on the filtered measurements. This criterion was then used to establish the relative importance of large- and small-scale turbulence phenomena while investigating specific scales of the turbulent flow. First-order statistics start to collapse at a resolution of 0.3D*, while second- and higher-order statistical moments require an interrogation window size of 0.2D*.
A Bayesian Nonparametric Approach to Image Super-Resolution.
Polatkan, Gungor; Zhou, Mingyuan; Carin, Lawrence; Blei, David; Daubechies, Ingrid
2015-02-01
Super-resolution methods form high-resolution images from low-resolution images. In this paper, we develop a new Bayesian nonparametric model for super-resolution. Our method uses a beta-Bernoulli process to learn a set of recurring visual patterns, called dictionary elements, from the data. Because it is nonparametric, the number of elements found is also determined from the data. We test the results on both benchmark and natural images, comparing with several other models from the research literature. We perform large-scale human evaluation experiments to assess the visual quality of the results. In a first implementation, we use Gibbs sampling to approximate the posterior. However, this algorithm is not feasible for large-scale data. To circumvent this, we then develop an online variational Bayes (VB) algorithm. This algorithm finds high quality dictionaries in a fraction of the time needed by the Gibbs sampler.
High-resolution mapping of forest carbon stocks in the Colombian Amazon
NASA Astrophysics Data System (ADS)
Asner, G. P.; Clark, J. K.; Mascaro, J.; Galindo García, G. A.; Chadwick, K. D.; Navarrete Encinales, D. A.; Paez-Acosta, G.; Cabrera Montenegro, E.; Kennedy-Bowdoin, T.; Duque, Á.; Balaji, A.; von Hildebrand, P.; Maatoug, L.; Bernal, J. F. Phillips; Yepes Quintero, A. P.; Knapp, D. E.; García Dávila, M. C.; Jacobson, J.; Ordóñez, M. F.
2012-07-01
High-resolution mapping of tropical forest carbon stocks can assist forest management and improve implementation of large-scale carbon retention and enhancement programs. Previous high-resolution approaches have relied on field plot and/or light detection and ranging (LiDAR) samples of aboveground carbon density, which are typically upscaled to larger geographic areas using stratification maps. Such efforts often rely on detailed vegetation maps to stratify the region for sampling, but existing tropical forest maps are often too coarse and field plots too sparse for high-resolution carbon assessments. We developed a top-down approach for high-resolution carbon mapping in a 16.5 million ha region (> 40%) of the Colombian Amazon - a remote landscape seldom documented. We report on three advances for large-scale carbon mapping: (i) employing a universal approach to airborne LiDAR-calibration with limited field data; (ii) quantifying environmental controls over carbon densities; and (iii) developing stratification- and regression-based approaches for scaling up to regions outside of LiDAR coverage. We found that carbon stocks are predicted by a combination of satellite-derived elevation, fractional canopy cover and terrain ruggedness, allowing upscaling of the LiDAR samples to the full 16.5 million ha region. LiDAR-derived carbon maps have 14% uncertainty at 1 ha resolution, and the regional map based on stratification has 28% uncertainty in any given hectare. High-resolution approaches with quantifiable pixel-scale uncertainties will provide the most confidence for monitoring changes in tropical forest carbon stocks. Improved confidence will allow resource managers and decision makers to more rapidly and effectively implement actions that better conserve and utilize forests in tropical regions.
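The regression-based upscaling step, predicting carbon density wall-to-wall from satellite-derived elevation, fractional canopy cover, and terrain ruggedness fit against LiDAR-sampled hectares, can be sketched as an ordinary least-squares fit. The synthetic coefficients, noise level, and variable ranges below are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

def fit_carbon_model(X, y):
    """Linear regression of LiDAR-sampled carbon density on wall-to-wall
    predictors; a minimal sketch of regression-based upscaling."""
    A = np.column_stack([X, np.ones(len(X))])   # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_carbon(coef, X):
    """Apply the fitted model to predictor pixels outside LiDAR coverage."""
    return np.column_stack([X, np.ones(len(X))]) @ coef

# Synthetic "LiDAR-sampled" hectares with plausible signs on each predictor
rng = np.random.default_rng(7)
n = 2000
elev = rng.uniform(100, 500, n)     # elevation (m)
cover = rng.uniform(0.3, 1.0, n)    # fractional canopy cover
rough = rng.uniform(0, 30, n)       # terrain ruggedness index
X = np.column_stack([elev, cover, rough])
y = 120 - 0.05 * elev + 60 * cover - 0.5 * rough + rng.normal(0, 5, n)
coef = fit_carbon_model(X, y)
pred = predict_carbon(coef, X)
```

In the study's setting, the same fitted coefficients would then be applied to every hectare of the 16.5 million ha region where only the satellite predictors, not LiDAR, are available.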
High-resolution Mapping of Forest Carbon Stocks in the Colombian Amazon
NASA Astrophysics Data System (ADS)
Asner, G. P.; Clark, J. K.; Mascaro, J.; Galindo García, G. A.; Chadwick, K. D.; Navarrete Encinales, D. A.; Paez-Acosta, G.; Cabrera Montenegro, E.; Kennedy-Bowdoin, T.; Duque, Á.; Balaji, A.; von Hildebrand, P.; Maatoug, L.; Bernal, J. F. Phillips; Knapp, D. E.; García Dávila, M. C.; Jacobson, J.; Ordóñez, M. F.
2012-03-01
High-resolution mapping of tropical forest carbon stocks can assist forest management and improve implementation of large-scale carbon retention and enhancement programs. Previous high-resolution approaches have relied on field plot and/or Light Detection and Ranging (LiDAR) samples of aboveground carbon density, which are typically upscaled to larger geographic areas using stratification maps. Such efforts often rely on detailed vegetation maps to stratify the region for sampling, but existing tropical forest maps are often too coarse and field plots too sparse for high-resolution carbon assessments. We developed a top-down approach for high-resolution carbon mapping in a 16.5 million ha region (>40%) of the Colombian Amazon - a remote landscape seldom documented. We report on three advances for large-scale carbon mapping: (i) employing a universal approach to airborne LiDAR-calibration with limited field data; (ii) quantifying environmental controls over carbon densities; and (iii) developing stratification- and regression-based approaches for scaling up to regions outside of LiDAR coverage. We found that carbon stocks are predicted by a combination of satellite-derived elevation, fractional canopy cover and terrain ruggedness, allowing upscaling of the LiDAR samples to the full 16.5 million ha region. LiDAR-derived carbon mapping samples had 14.6% uncertainty at 1 ha resolution, and regional maps based on stratification and regression approaches had 25.6% and 29.6% uncertainty, respectively, in any given hectare. High-resolution approaches with reported local-scale uncertainties will provide the most confidence for monitoring changes in tropical forest carbon stocks. Improved confidence will allow resource managers and decision-makers to more rapidly and effectively implement actions that better conserve and utilize forests in tropical regions.
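The regression-based upscaling step described in these records can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a single satellite covariate (fractional canopy cover) and ordinary least squares, whereas the study combines several satellite-derived predictors; all names are illustrative.

```python
def fit_linear(xs, ys):
    """Ordinary least squares fit for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

def upscale_carbon(calib_cover, calib_acd, region_cover):
    """Fit aboveground carbon density (ACD) against a covariate over the
    LiDAR-calibrated cells, then predict ACD for every cell in the region."""
    a, b = fit_linear(calib_cover, calib_acd)
    return [a * c + b for c in region_cover]
```

In practice the calibration cells would be the LiDAR-sampled hectares and the prediction grid the full 16.5 million ha region, with per-cell uncertainty tracked alongside the prediction.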
Huang, Yongjun; Flores, Jaime Gonzalo Flor; Cai, Ziqiang; Yu, Mingbin; Kwong, Dim-Lee; Wen, Guangjun; Churchill, Layne; Wong, Chee Wei
2017-06-29
For sensitive high-resolution force- and field-sensing applications, large-mass microelectromechanical system (MEMS) and optomechanical cavities have been proposed to realize sub-aN/Hz^(1/2) resolution levels. For optomechanical cavity-based force and field sensors, the optomechanical coupling is the key parameter for achieving high sensitivity and resolution. Here we demonstrate a chip-scale optomechanical cavity with large mass which operates at a fundamental mode of ≈77.7 kHz and intrinsically exhibits a large optomechanical coupling of 44 GHz/nm or more for both optical resonance modes. A mechanical stiffening range of ≈58 kHz and more than 100th-order harmonics are obtained, with which the free-running frequency instability is lower than 10^-6 at a 100 ms integration time. These results can be applied to further improve the sensing performance of optomechanics-inspired chip-scale sensors.
Spatial resolution requirements for urban land cover mapping from space
NASA Technical Reports Server (NTRS)
Todd, William J.; Wrigley, Robert C.
1986-01-01
Very low resolution (VLR) satellite data (Advanced Very High Resolution Radiometer, DMSP Operational Linescan System), low resolution (LR) data (Landsat MSS), medium resolution (MR) data (Landsat TM), and high resolution (HR) satellite data (Spot HRV, Large Format Camera) were evaluated and compared for interpretability at differing spatial resolutions. VLR data (500 m - 1.0 km) are useful for Level I (urban/rural distinction) mapping at 1:1,000,000 scale. Feature tone/color is utilized to distinguish generalized urban land cover using LR data (80 m) for 1:250,000 scale mapping. Advancing to MR data (30 m) and 1:100,000 scale mapping, confidence in land cover mapping is greatly increased, owing to the element of texture/pattern which becomes evident in the imagery. Shape and shadow make detailed Level II/III urban land use mapping possible if the interpreter can use HR (10-15 m) satellite data; mapping scales can be 1:25,000 - 1:50,000.
Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset
NASA Astrophysics Data System (ADS)
Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.
2017-12-01
Here we present the first national-scale flood risk analyses, using high resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodeled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that for robust flood risk analysis to be undertaken, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks are enabling flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, representing a resolution increase over previous countrywide data sets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison of flood risk analyses undertaken using pre-existing population datasets.
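The hazard-exposure integration described above can be sketched in a few lines. This is a hypothetical illustration, not the study's code: it assumes the depth and population grids share one resolution after a simple mass-conserving nearest-neighbour resample, and the flood threshold is a free parameter.

```python
def resample_nearest(pop, factor):
    """Split each coarse population cell into factor x factor fine cells,
    dividing the count so the regional total is conserved."""
    out = []
    for row in pop:
        fine_row = []
        for v in row:
            fine_row.extend([v / (factor * factor)] * factor)
        out.extend([list(fine_row) for _ in range(factor)])
    return out

def exposed_population(depth, pop, threshold=0.0):
    """Sum population in cells where modelled water depth exceeds the threshold."""
    return sum(p for drow, prow in zip(depth, pop)
                 for d, p in zip(drow, prow) if d > threshold)
```

The resolution mismatch the abstract highlights shows up here directly: a coarse population grid must be disaggregated before it can be intersected with 90 m (or finer) hazard output.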
SOLAR WIND TURBULENCE FROM MHD TO SUB-ION SCALES: HIGH-RESOLUTION HYBRID SIMULATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franci, Luca; Verdini, Andrea; Landi, Simone
2015-05-10
We present results from a high-resolution and large-scale hybrid (fluid electrons and particle-in-cell protons) two-dimensional numerical simulation of decaying turbulence. Two distinct spectral regions (separated by a smooth break at proton scales) develop with clear power-law scaling, each one occupying about a decade in wavenumbers. The simulation results simultaneously exhibit several properties of the observed solar wind fluctuations: spectral indices of the magnetic, kinetic, and residual energy spectra in the magnetohydrodynamic (MHD) inertial range along with a flattening of the electric field spectrum, an increase in magnetic compressibility, and a strong coupling of the cascade with the density and the parallel component of the magnetic fluctuations at sub-proton scales. Our findings support the interpretation that in the solar wind, large-scale MHD fluctuations naturally evolve beyond proton scales into a turbulent regime that is governed by the generalized Ohm’s law.
Solar Wind Turbulence from MHD to Sub-ion Scales: High-resolution Hybrid Simulations
NASA Astrophysics Data System (ADS)
Franci, Luca; Verdini, Andrea; Matteini, Lorenzo; Landi, Simone; Hellinger, Petr
2015-05-01
We present results from a high-resolution and large-scale hybrid (fluid electrons and particle-in-cell protons) two-dimensional numerical simulation of decaying turbulence. Two distinct spectral regions (separated by a smooth break at proton scales) develop with clear power-law scaling, each one occupying about a decade in wavenumbers. The simulation results simultaneously exhibit several properties of the observed solar wind fluctuations: spectral indices of the magnetic, kinetic, and residual energy spectra in the magnetohydrodynamic (MHD) inertial range along with a flattening of the electric field spectrum, an increase in magnetic compressibility, and a strong coupling of the cascade with the density and the parallel component of the magnetic fluctuations at sub-proton scales. Our findings support the interpretation that in the solar wind, large-scale MHD fluctuations naturally evolve beyond proton scales into a turbulent regime that is governed by the generalized Ohm’s law.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sakaguchi, Koichi; Leung, Lai-Yung R.; Zhao, Chun
This study presents a diagnosis of a multi-resolution approach using the Model for Prediction Across Scales - Atmosphere (MPAS-A) for simulating regional climate. Four AMIP experiments are conducted for 1999-2009. In the first two experiments, MPAS-A is configured using global quasi-uniform grids at 120 km and 30 km grid spacing. In the other two experiments, MPAS-A is configured using variable-resolution (VR) mesh with local refinement at 30 km over North America and South America embedded inside a quasi-uniform domain at 120 km elsewhere. Precipitation and related fields in the four simulations are examined to determine how well the VR simulations reproduce the features simulated by the globally high-resolution model in the refined domain. In previous analyses of idealized aqua-planet simulations, the characteristics of the global high-resolution simulation in moist processes only developed near the boundary of the refined region. In contrast, the AMIP simulations with VR grids are able to reproduce the high-resolution characteristics across the refined domain, particularly in South America. This indicates the importance of finely resolved lower-boundary forcing such as topography and surface heterogeneity for the regional climate, and demonstrates the ability of the MPAS-A VR to replicate the large-scale moisture transport as simulated in the quasi-uniform high-resolution model. Outside of the refined domain, some upscale effects are detected through large-scale circulation but the overall climatic signals are not significant at regional scales. Our results provide support for the multi-resolution approach as a computationally efficient and physically consistent method for modeling regional climate.
NASA Technical Reports Server (NTRS)
Ormsby, J. P.
1982-01-01
An examination of the possibilities of using Landsat data to simulate NOAA-6 Advanced Very High Resolution Radiometer (AVHRR) data on two channels, as well as using actual NOAA-6 imagery, for large-scale hydrological studies is presented. A running average of 18 consecutive pixels from the Landsat scanners was obtained to simulate 1 km resolution; the averaged data were scaled up to 8 bits and investigated at different gray levels. AVHRR data comprising five channels of 10-bit, band-interleaved information covering 10 deg latitude were analyzed and a suitable pixel grid was chosen for comparison with the Landsat data in a supervised classification format, an unsupervised mode, and with ground truth. Landcover delineation was explored by removing snow, water, and cloud features from the cluster analysis, and resulted in less than 10% difference. Low-resolution large-scale data were determined to be useful for characterizing some landcover features if weekly and/or monthly updates are maintained.
Extended-Range High-Resolution Dynamical Downscaling over a Continental-Scale Domain
NASA Astrophysics Data System (ADS)
Husain, S. Z.; Separovic, L.; Yu, W.; Fernig, D.
2014-12-01
High-resolution mesoscale simulations, when applied for downscaling meteorological fields over large spatial domains and for extended time periods, can provide valuable information for many practical application scenarios including the weather-dependent renewable energy industry. In the present study, a strategy has been proposed to dynamically downscale coarse-resolution meteorological fields from Environment Canada's regional analyses for a period of multiple years over the entire Canadian territory. The study demonstrates that a continuous mesoscale simulation over the entire domain is the most suitable approach in this regard. Large-scale deviations in the different meteorological fields pose the biggest challenge for extended-range simulations over continental-scale domains, and the enforcement of the lateral boundary conditions is not sufficient to restrict such deviations. A scheme has therefore been developed to spectrally nudge the simulated high-resolution meteorological fields at the different model vertical levels towards those embedded in the coarse-resolution driving fields derived from the regional analyses. A series of experiments were carried out to determine the optimal nudging strategy including the appropriate nudging length scales, nudging vertical profile and temporal relaxation. A forcing strategy based on grid nudging of the different surface fields, including surface temperature, soil moisture, and snow conditions, towards their expected values obtained from a high-resolution offline surface scheme was also devised to limit any considerable deviation in the evolving surface fields due to extended-range temporal integrations. The study shows that ensuring large-scale atmospheric similarities helps to deliver near-surface statistical scores for temperature, dew point temperature and horizontal wind speed that are better than or comparable to the operational regional forecasts issued by Environment Canada.
Furthermore, the meteorological fields resulting from the proposed downscaling strategy have significantly improved spatiotemporal variance compared to those from the operational forecasts, and any time series generated from the downscaled fields do not suffer from discontinuities due to switching between the consecutive forecasts.
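The spectral nudging at the heart of this record can be sketched in one dimension. A minimal illustration, not the operational scheme: it assumes a periodic series, a sharp wavenumber cutoff `kmax`, and a single relaxation time `tau`, whereas the study tunes nudging length scales, vertical profiles and temporal relaxation; the plain O(n^2) DFT is for clarity only.

```python
import cmath

def lowpass(series, kmax):
    """Return the large-scale part of a periodic series: Fourier modes
    with wavenumber <= kmax (plain O(n^2) DFT for clarity)."""
    n = len(series)
    coeffs = [sum(series[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                  for j in range(n)) / n for k in range(n)]
    for k in range(n):
        if min(k, n - k) > kmax:   # zero out small-scale modes
            coeffs[k] = 0
    return [sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * j / n)
                for k in range(n)).real for j in range(n)]

def spectral_nudge(model, driving, kmax, dt, tau):
    """One nudging step: relax only the large scales of the model field
    toward the large scales of the driving field (relaxation time tau)."""
    ml, dl = lowpass(model, kmax), lowpass(driving, kmax)
    return [m + (dt / tau) * (d - l) for m, l, d in zip(model, ml, dl)]
```

The point of the design is visible in the update: small-scale structure generated by the high-resolution model passes through untouched, while drift in the retained large-scale modes is pulled back toward the driving analyses.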
NASA Astrophysics Data System (ADS)
Postadjian, T.; Le Bris, A.; Sahbi, H.; Mallet, C.
2017-05-01
Semantic classification is a core remote sensing task as it provides the fundamental input for land-cover map generation. The very recent literature has shown the superior performance of deep convolutional neural networks (DCNN) for many classification tasks including the automatic analysis of Very High Spatial Resolution (VHR) geospatial images. Most of the recent initiatives have focused on very high discrimination capacity combined with accurate object boundary retrieval. Current architectures are therefore well tailored to urban areas over restricted extents, but are not designed for large-scale purposes. This paper presents an end-to-end automatic processing chain, based on DCNNs, that aims at performing large-scale classification of VHR satellite images (here SPOT 6/7). Since this work assesses, through various experiments, the potential of DCNNs for country-scale VHR land-cover map generation, a simple yet effective architecture is proposed, efficiently discriminating the main classes of interest (namely buildings, roads, water, crops, vegetated areas) by exploiting existing VHR land-cover maps for training.
Nested high-resolution large-eddy simulations in WRF to support wind power
NASA Astrophysics Data System (ADS)
Mirocha, J.; Kirkil, G.; Kosovic, B.; Lundquist, J. K.
2009-12-01
The WRF model’s grid nesting capability provides a potentially powerful framework for simulating flow over a wide range of scales. One such application is computation of realistic inflow boundary conditions for large eddy simulations (LES) by nesting LES domains within mesoscale domains. While nesting has been widely and successfully applied at GCM to mesoscale resolutions, the WRF model’s nesting behavior at the high-resolution (Δx < 1000 m) end of the spectrum is less well understood. Nesting LES within mesoscale domains can significantly improve turbulent flow prediction at the scale of a wind park, providing a basis for superior site characterization, or for improved simulation of turbulent inflows encountered by turbines. We investigate WRF’s grid nesting capability at high mesh resolutions using nested mesoscale and large-eddy simulations. We examine the spatial scales required for flow structures to equilibrate to the finer mesh as flow enters a nest, and how the process depends on several parameters, including grid resolution, turbulence subfilter stress models, relaxation zones at nest interfaces, flow velocities, surface roughnesses, terrain complexity and atmospheric stability. Guidance on appropriate domain sizes and turbulence models for LES in light of these results is provided. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 (LLNL-ABS-416482).
Light sheet theta microscopy for rapid high-resolution imaging of large biological samples.
Migliori, Bianca; Datta, Malika S; Dupre, Christophe; Apak, Mehmet C; Asano, Shoh; Gao, Ruixuan; Boyden, Edward S; Hermanson, Ola; Yuste, Rafael; Tomer, Raju
2018-05-29
Advances in tissue clearing and molecular labeling methods are enabling unprecedented optical access to large intact biological systems. These developments fuel the need for high-speed microscopy approaches to image large samples quantitatively and at high resolution. While light sheet microscopy (LSM), with its high planar imaging speed and low photo-bleaching, can be effective, scaling up to larger imaging volumes has been hindered by the use of orthogonal light sheet illumination. To address this fundamental limitation, we have developed light sheet theta microscopy (LSTM), which uniformly illuminates samples from the same side as the detection objective, thereby eliminating limits on lateral dimensions without sacrificing the imaging resolution, depth, and speed. We present a detailed characterization of LSTM, and demonstrate its complementary advantages over LSM for rapid high-resolution quantitative imaging of large intact samples with high uniform quality. The reported LSTM approach is a significant step for the rapid high-resolution quantitative mapping of the structure and function of very large biological systems, such as a clarified thick coronal slab of human brain and uniformly expanded tissues, and also for rapid volumetric calcium imaging of highly motile animals, such as Hydra, undergoing non-isomorphic body shape changes.
NASA Astrophysics Data System (ADS)
Sankey, T.; Donald, J.; McVay, J.
2015-12-01
High resolution remote sensing images and datasets are typically acquired at a large cost, which poses a big challenge for many scientists. Northern Arizona University recently acquired a custom-engineered, cutting-edge UAV, and we can now generate our own images with the instrument. The UAV has a unique capability to carry a large payload including a hyperspectral sensor, which images the Earth surface in over 350 spectral bands at 5 cm resolution, and a lidar scanner, which images the land surface and vegetation in three dimensions. Both sensors represent the newest available technology with very high resolution, precision, and accuracy. Using the UAV sensors, we are monitoring the effects of regional forest restoration treatment efforts. Individual tree canopy width and height are measured in the field and via the UAV sensors. The high-resolution UAV images are then used to segment individual tree canopies and to derive 3-dimensional estimates. The UAV image-derived variables are then correlated to the field-based measurements and scaled to satellite-derived tree canopy measurements. The relationships between the field-based and UAV-derived estimates are then extrapolated to a larger area to scale the tree canopy dimensions and to estimate tree density within restored and control forest sites.
NUMERICAL SIMULATIONS OF CORONAL HEATING THROUGH FOOTPOINT BRAIDING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansteen, V.; Pontieu, B. De; Carlsson, M.
2015-10-01
Advanced three-dimensional (3D) radiative MHD simulations now reproduce many properties of the outer solar atmosphere. When including a domain from the convection zone into the corona, a hot chromosphere and corona are self-consistently maintained. Here we study two realistic models, with different simulated areas, magnetic field strength and topology, and numerical resolution. These are compared in order to characterize the heating in the 3D-MHD simulations which self-consistently maintains the structure of the atmosphere. We analyze the heating at both large and small scales and find that heating is episodic and highly structured in space, but occurs along loop-shaped structures and moves along with the magnetic field. On large scales we find that the heating per particle is maximal near the transition region and that widely distributed opposite-polarity field in the photosphere leads to a greater heating scale height in the corona. On smaller scales, heating is concentrated in current sheets, the thicknesses of which are set by the numerical resolution. Some current sheets fragment in time, this process occurring more readily in the higher-resolution model, leading to spatially highly intermittent heating. The large-scale heating structures are found to fade in less than about five minutes, while the smaller, local, heating shows timescales of the order of two minutes in one model and one minute in the other, higher-resolution, model.
NASA Astrophysics Data System (ADS)
Langford, Z. L.; Kumar, J.; Hoffman, F. M.
2015-12-01
Observations indicate that over the past several decades, landscape processes in the Arctic have been changing or intensifying. A dynamic Arctic landscape has the potential to alter ecosystems across a broad range of scales. Accurate characterization is useful to understand the properties and organization of the landscape, optimal sampling network design, measurement and process upscaling, and to establish a landscape-based framework for multi-scale modeling of ecosystem processes. This study seeks to delineate the landscape at the Seward Peninsula of Alaska into ecoregions using large volumes (terabytes) of high spatial resolution satellite remote-sensing data. Defining high-resolution ecoregion boundaries is difficult because many ecosystem processes in Arctic ecosystems occur at small local to regional scales, which are often unresolved by coarse-resolution satellites (e.g., MODIS). We seek to use data-fusion techniques and data analytics algorithms applied to Phased Array type L-band Synthetic Aperture Radar (PALSAR), Interferometric Synthetic Aperture Radar (IFSAR), Satellite for Observation of Earth (SPOT), WorldView-2, WorldView-3, and QuickBird-2 to develop high-resolution (~5 m) ecoregion maps for multiple time periods. Traditional analysis methods and algorithms are insufficient for analyzing and synthesizing such large geospatial data sets, and those algorithms rarely scale out onto large distributed-memory parallel computer systems. We seek to develop computationally efficient algorithms and techniques using high-performance computing for characterization of Arctic landscapes. We will apply a variety of data analytics algorithms, such as cluster analysis, complex object-based image analysis (COBIA), and neural networks. We also propose to use representativeness analysis within the Seward Peninsula domain to determine optimal sampling locations for fine-scale measurements. 
This methodology should provide an initial framework for analyzing dynamic landscape trends in Arctic ecosystems, such as shrubification and disturbances, and integration of ecoregions into multi-scale models.
NASA Astrophysics Data System (ADS)
Zarzycki, C. M.; Gettelman, A.; Callaghan, P.
2017-12-01
Accurately predicting weather extremes such as precipitation (floods and droughts) and temperature (heat waves) requires high resolution to resolve mesoscale dynamics and topography at horizontal scales of 10-30 km. Simulating such resolutions globally for climate scales (years to decades) remains computationally impractical. Simulating only a small region of the planet is more tractable at these scales for climate applications. This work describes global simulations using variable-resolution static meshes with multiple dynamical cores that target the continental United States using developmental versions of the Community Earth System Model version 2 (CESM2). CESM2 is tested in idealized, aquaplanet and full physics configurations to evaluate variable mesh simulations against uniform high and uniform low resolution simulations at resolutions down to 15 km. Different physical parameterization suites are also evaluated to gauge their sensitivity to resolution. Idealized variable-resolution mesh cases compare well to high resolution tests. More recent versions of the atmospheric physics, including cloud schemes for CESM2, are more stable with respect to changes in horizontal resolution. Most of the sensitivity is due to sensitivity to timestep and interactions between deep convection and large-scale condensation, expected from the closure methods. The resulting full physics model produces a comparable climate to the global low resolution mesh and similar high-frequency statistics in the high resolution region. Some biases are reduced (orographic precipitation in the western United States), but biases do not necessarily go away at high resolution (e.g., summertime (JJA) surface temperature). The simulations are able to reproduce uniform high resolution results, making them an effective tool for regional climate studies; these capabilities are available in CESM2.
NASA Technical Reports Server (NTRS)
Weinan, E.; Shu, Chi-Wang
1994-01-01
High order essentially non-oscillatory (ENO) schemes, originally designed for compressible flow and in general for hyperbolic conservation laws, are applied to incompressible Euler and Navier-Stokes equations with periodic boundary conditions. The projection to divergence-free velocity fields is achieved by fourth-order central differences through fast Fourier transforms (FFT) and a mild high-order filtering. The objective of this work is to assess the resolution of ENO schemes for large scale features of the flow when a coarse grid is used and small scale features of the flow, such as shears and roll-ups, are not fully resolved. It is found that high-order ENO schemes remain stable under such situations and quantities related to large scale features, such as the total circulation around the roll-up region, are adequately resolved.
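The large-scale diagnostic mentioned at the end, total circulation around the roll-up region, can be computed as a discrete line integral of the velocity field. A sketch under the assumption of node-centred velocities on a uniform grid; the loop placement and left-endpoint quadrature are illustrative, not the authors' numerics.

```python
def circulation(u, v, i0, j0, n, dx):
    """Discrete line integral of velocity around a square loop of n cells
    per side, lower-left node (i0, j0), grid spacing dx.
    u[i][j], v[i][j] are velocity components at node (y = i*dx, x = j*dx)."""
    bottom = sum(u[i0][j0 + k] for k in range(n))      # traversed in +x
    right  = sum(v[i0 + k][j0 + n] for k in range(n))  # traversed in +y
    top    = sum(u[i0 + n][j0 + k] for k in range(n))  # traversed in -x
    left   = sum(v[i0 + k][j0] for k in range(n))      # traversed in -y
    return dx * (bottom + right - top - left)
```

For solid-body rotation (u = -y, v = x, vorticity 2) this quadrature is exact, returning twice the loop area, which makes it a convenient check on how well a coarse grid preserves the large-scale circulation.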
NASA Technical Reports Server (NTRS)
Weinan, E.; Shu, Chi-Wang
1992-01-01
High order essentially non-oscillatory (ENO) schemes, originally designed for compressible flow and in general for hyperbolic conservation laws, are applied to incompressible Euler and Navier-Stokes equations with periodic boundary conditions. The projection to divergence-free velocity fields is achieved by fourth order central differences through Fast Fourier Transforms (FFT) and a mild high-order filtering. The objective of this work is to assess the resolution of ENO schemes for large scale features of the flow when a coarse grid is used and small scale features of the flow, such as shears and roll-ups, are not fully resolved. It is found that high-order ENO schemes remain stable under such situations and quantities related to large-scale features, such as the total circulation around the roll-up region, are adequately resolved.
Van de Kamer, J B; Lagendijk, J J W
2002-05-21
SAR distributions in a healthy female adult head as a result of a radiating vertical dipole antenna (frequency 915 MHz) representing a hand-held mobile phone have been computed for three different resolutions: 2 mm, 1 mm and 0.4 mm. The extremely high resolution of 0.4 mm was obtained with our quasistatic zooming technique, which is briefly described in this paper. For an effectively transmitted power of 0.25 W, the maximum averaged SAR values in both cubic- and arbitrary-shaped volumes are, respectively, about 1.72 and 2.55 W kg^-1 for 1 g and 0.98 and 1.73 W kg^-1 for 10 g of tissue. These numbers do not vary much (<8%) for the different resolutions, indicating that SAR computations at a resolution of 2 mm are sufficiently accurate to describe the large-scale distribution. However, considering the detailed SAR pattern in the head, large differences may occur if high-resolution computations are performed rather than low-resolution ones. These deviations are caused by both increased modelling accuracy and improved anatomical description in higher resolution simulations. For example, the SAR profile across a boundary between tissues with high dielectric contrast is much more accurately described at higher resolutions. Furthermore, low-resolution dielectric geometries may suffer from loss of anatomical detail, which greatly affects small-scale SAR distributions. Thus, for strongly inhomogeneous regions, high-resolution SAR modelling is an absolute necessity.
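The cubic-volume averaging behind the reported 1 g and 10 g SAR values can be sketched as follows. A simplified illustration, not the paper's method: it assumes a fixed cube side in voxels rather than growing the cube until it encloses exactly 1 g or 10 g of tissue, and all grids and units are hypothetical.

```python
def cube_avg_sar(sar, density, voxel_vol, corner, side):
    """Mass-averaged SAR (W/kg) over a cubic block of side^3 voxels."""
    x0, y0, z0 = corner
    mass = energy = 0.0
    for i in range(x0, x0 + side):
        for j in range(y0, y0 + side):
            for k in range(z0, z0 + side):
                m = density[i][j][k] * voxel_vol  # voxel mass (kg)
                mass += m
                energy += sar[i][j][k] * m        # absorbed power (W)
    return energy / mass

def peak_cube_sar(sar, density, voxel_vol, side):
    """Slide the cube over the whole grid; report the worst-case average."""
    n = len(sar)
    return max(cube_avg_sar(sar, density, voxel_vol, (i, j, k), side)
               for i in range(n - side + 1)
               for j in range(n - side + 1)
               for k in range(n - side + 1))
```

The averaging explains the resolution finding above: a hot boundary voxel that appears only at 0.4 mm is diluted over the whole cube mass, so the mass-averaged peak changes far less with resolution than the local SAR pattern does.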
The scale dependence of optical diversity in a prairie ecosystem
NASA Astrophysics Data System (ADS)
Gamon, J. A.; Wang, R.; Stilwell, A.; Zygielbaum, A. I.; Cavender-Bares, J.; Townsend, P. A.
2015-12-01
Biodiversity loss, one of the most crucial challenges of our time, endangers ecosystem services that maintain human wellbeing. Traditional methods of measuring biodiversity require extensive and costly field sampling by biologists with extensive experience in species identification. Remote sensing can be used for such assessment based upon patterns of optical variation. This provides an efficient and cost-effective means to determine ecosystem diversity at different scales and over large areas. Sampling scale has been described as a "fundamental conceptual problem" in ecology, and is an important practical consideration in both remote sensing and traditional biodiversity studies. On the one hand, with decreasing spatial and spectral resolution, the differences among optical types may become weak or even disappear. On the other hand, high spatial and/or spectral resolution may introduce redundant or contradictory information. For example, at high resolution, the variation within optical types (e.g., between leaves in a single plant canopy) may add complexity unrelated to species richness. We studied the scale dependence of optical diversity in a prairie ecosystem at Cedar Creek Ecosystem Science Reserve, Minnesota, USA using a variety of spectrometers from several platforms on the ground and in the air. Using the coefficient of variation (CV) of spectra as an indicator of optical diversity, we found that high richness plots generally have a higher coefficient of variation. High resolution imaging spectrometer data (1 mm pixels) showed the highest sensitivity to richness level. With decreasing spatial resolution, the difference in CV between richness levels decreased, but remained significant. These findings can be used to guide airborne studies of biodiversity and develop more effective large-scale biodiversity sampling methods.
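The coefficient-of-variation indicator used in this study is straightforward to compute. A minimal sketch, assuming one CV per band taken over the pixels of a plot and then averaged across bands; the study's exact formulation may differ.

```python
def optical_diversity_cv(spectra):
    """Mean coefficient of variation across bands, computed over pixels.
    spectra: list of per-pixel reflectance spectra (equal-length lists)."""
    nbands = len(spectra[0])
    cvs = []
    for b in range(nbands):
        vals = [s[b] for s in spectra]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        cvs.append(var ** 0.5 / mean if mean else 0.0)
    return sum(cvs) / nbands
```

A plot of identical spectra scores zero, while spectrally mixed plots score higher, which is the behaviour the abstract reports for high-richness plots.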
High-Resolution Large Field-of-View FUV Compact Camera
NASA Technical Reports Server (NTRS)
Spann, James F.
2006-01-01
The need for a high-resolution camera with a large field of view, capable of imaging dim emissions in the far-ultraviolet, is driven by the widely varying intensities of FUV emissions and the spatial/temporal scales of phenomena of interest in the Earth's ionosphere. In this paper, the concept of a camera is presented that is designed to achieve these goals in a lightweight package with sufficient visible light rejection to be useful for dayside and nightside emissions. The camera employs the concept of self-filtering to achieve good spectral resolution tuned to specific wavelengths. The large field of view is sufficient to image the Earth's disk at geosynchronous altitudes with a spatial resolution of >20 km. The optics and filters are emphasized.
Chang, Xueli; Du, Siliang; Li, Yingying; Fang, Shenghui
2018-01-01
Large-size high-resolution (HR) satellite image matching is a challenging task due to local distortion, repetitive structures, intensity changes, and low efficiency. In this paper, a novel matching approach is proposed for large-size HR satellite image registration, based on a coarse-to-fine strategy and geometric scale-invariant feature transform (SIFT). In the coarse matching step, a robust matching method, scale-restrict (SR) SIFT, is applied at a low resolution level. The matching results provide geometric constraints, which are then used to guide block division and geometric SIFT in the fine matching step. The block matching method overcomes the memory problem, and the area constraints in geometric SIFT help validate candidate matches and decrease search complexity. To further improve matching efficiency, the proposed method is parallelized using OpenMP. Finally, the sensing image is rectified to the coordinate system of the reference image via a Triangulated Irregular Network (TIN) transformation. Experiments were designed to test the performance of the proposed matching method. The experimental results show that the proposed method decreases the matching time and increases the number of matching points while maintaining high registration accuracy. PMID:29702589
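The match-validation step at the core of SIFT-style pipelines is nearest-neighbour descriptor matching with Lowe's ratio test. A minimal numpy sketch of that step on toy descriptors (not the paper's SR-SIFT or geometric-constraint logic, which add more machinery on top):

```python
import numpy as np

def ratio_test_match(desc_query, desc_ref, ratio=0.8):
    """Nearest-neighbour matching with Lowe's ratio test.

    A query descriptor is accepted only if its nearest reference
    descriptor is clearly closer than the second nearest, which rejects
    ambiguous matches caused by repetitive structures.
    Returns a list of (query_index, reference_index) pairs.
    """
    matches = []
    for i, d in enumerate(desc_query):
        dists = np.linalg.norm(desc_ref - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

# Toy check: each query descriptor is a noisy copy of one reference.
rng = np.random.default_rng(1)
ref = rng.random((20, 8))
query = ref[:5] + 0.01 * rng.standard_normal((5, 8))
pairs = ratio_test_match(query, ref)
```

In a coarse-to-fine scheme like the one described, matches found this way at low resolution would then constrain where fine-level matching needs to search.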
Measuring Large-Scale Social Networks with High Resolution
Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune
2014-01-01
This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years: the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data types measured, the technical infrastructure in terms of both backend and phone software, and an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper concludes with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection. PMID:24770359
Techniques for automatic large scale change analysis of temporal multispectral imagery
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
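The distinction between mapping the magnitude of change and identifying its type can be illustrated with a minimal change-vector-analysis sketch. This is a generic illustration, not the dissertation's metric set: the per-pixel spectral difference vector gives a change magnitude, and the band dominating that vector gives a crude change type.

```python
import numpy as np

def change_vector_analysis(img_t1, img_t2):
    """Per-pixel change magnitude and dominant-band label between two
    co-registered multispectral images of shape (rows, cols, bands)."""
    delta = img_t2.astype(float) - img_t1.astype(float)
    magnitude = np.linalg.norm(delta, axis=2)        # how much changed
    direction = np.argmax(np.abs(delta), axis=2)     # which band drove it
    return magnitude, direction

# Toy two-band scene: two pixels change, each in a different band.
t1 = np.zeros((2, 2, 2))
t2 = np.zeros((2, 2, 2))
t2[0, 0, 1] = 3.0   # change in band 1 at pixel (0, 0)
t2[1, 1, 0] = 4.0   # change in band 0 at pixel (1, 1)
mag, band = change_vector_analysis(t1, t2)
```

Grouping pixels by direction as well as magnitude is what lets a display distinguish types of change rather than producing a single change map.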
Atmospheric gravity waves with small vertical-to-horizontal wavelength ratios
NASA Astrophysics Data System (ADS)
Song, I. S.; Jee, G.; Kim, Y. H.; Chun, H. Y.
2017-12-01
Gravity wave modes with small vertical-to-horizontal wavelength ratios on the order of 10^-3 are investigated through the systematic scale analysis of governing equations for gravity wave perturbations embedded in the quasi-geostrophic large-scale flow. These waves can be categorized as acoustic gravity wave modes because their total energy is given by the sum of kinetic, potential, and elastic parts. It is found that these waves can be forced by density fluctuations multiplied by the horizontal gradients of the large-scale pressure (geopotential) fields. These theoretical findings are evaluated using the results of a high-resolution global model (Specified Chemistry WACCM with horizontal resolution of 25 km and vertical resolution of 600 m) by computing the density-related gravity-wave forcing terms from the modeling results.
NASA Astrophysics Data System (ADS)
Welle, Paul D.; Mauter, Meagan S.
2017-09-01
This work introduces a generalizable approach for estimating field-scale agricultural yield losses due to soil salinization. When integrated with regional data on crop yields and prices, this model provides high-resolution estimates of revenue losses over large agricultural regions. These methods account for the uncertainty inherent in model inputs derived from satellites, experimental field data, and interpreted model results. We apply this method to estimate the effect of soil salinity on agricultural outputs in California, performing the analysis with both high-resolution (i.e., field-scale) and low-resolution (i.e., county-scale) data sources to highlight the importance of spatial resolution in agricultural analysis. We estimate that soil salinity reduced agricultural revenues by $3.7 billion ($1.7-7.0 billion) in 2014, amounting to 8.0 million tons of lost production relative to soil salinities below the crop-specific thresholds. When using low-resolution data sources, we find that the costs of salinization are underestimated by a factor of three. These results highlight the need for high-resolution data in agro-environmental assessment as well as the challenges associated with their integration.
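The crop-specific salinity thresholds mentioned here are commonly modeled with a piecewise-linear response (the Maas-Hoffman form): full yield below a threshold root-zone salinity, then a linear decline. A minimal sketch with illustrative threshold and slope values, not the paper's calibrated inputs:

```python
def relative_yield(ece, threshold, slope):
    """Piecewise-linear crop salinity response (Maas-Hoffman form).

    ece: root-zone salinity (dS/m); threshold: crop-specific salinity
    below which yield is unaffected; slope: percent yield loss per dS/m
    above the threshold. Returns relative yield in percent.
    """
    if ece <= threshold:
        return 100.0
    return max(0.0, 100.0 - slope * (ece - threshold))

# Hypothetical crop with a 6.0 dS/m threshold and 7.1 %/(dS/m) slope.
unaffected = relative_yield(4.0, 6.0, 7.1)   # below threshold
reduced = relative_yield(8.0, 6.0, 7.1)      # 2 dS/m above threshold
```

Multiplying such per-field yield fractions by crop prices and areas is the generic route from salinity maps to the revenue-loss figures the abstract reports.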
Large and small-scale structures in Saturn's rings
NASA Astrophysics Data System (ADS)
Albers, N.; Rehnberg, M. E.; Brown, Z. L.; Sremcevic, M.; Esposito, L. W.
2017-09-01
Observations made by the Cassini spacecraft have revealed both large and small scale structures in Saturn's rings in unprecedented detail. Analysis of high-resolution measurements by the Cassini Ultraviolet Spectrograph (UVIS) High Speed Photometer (HSP) and the Imaging Science Subsystem (ISS) show an abundance of intrinsic small-scale structures (or clumping) seen across the entire ring system. These include self-gravity wakes (50-100m), sub-km structure at the A and B ring edges, and "straw"/"ropy" structures (1-3km).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Gang
Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed both to the lack of spatial resolution in the models and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configurations of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both on the basis of the well-tested preferred circulation regime approach and on very recently developed measures, the finite-amplitude wave activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.
Dual-axis confocal microscope for high-resolution in vivo imaging
Wang, Thomas D.; Mandella, Michael J.; Contag, Christopher H.; Kino, Gordon S.
2007-01-01
We describe a novel confocal microscope that uses separate low-numerical-aperture objectives with the illumination and collection axes crossed at angle θ from the midline. This architecture collects images in scattering media with high transverse and axial resolution, long working distance, large field of view, and reduced noise from scattered light. We measured transverse and axial (FWHM) resolution of 1.3 and 2.1 μm, respectively, in free space, and confirm subcellular resolution in excised esophageal mucosa. The optics may be scaled to millimeter dimensions and fiber coupled for collection of high-resolution images in vivo. PMID:12659264
NASA Astrophysics Data System (ADS)
Lewis, Q. W.; Rhoads, B. L.
2017-12-01
The merging of rivers at confluences results in complex three-dimensional flow patterns that influence sediment transport, bed morphology, downstream mixing, and physical habitat conditions. The capacity to comprehensively characterize flow at confluences using traditional sensors, such as acoustic Doppler velocimeters and profilers, is limited by the restricted spatial resolution of these sensors and difficulties in measuring velocities simultaneously at many locations within a confluence. This study assesses two-dimensional surficial patterns of flow structure at a small stream confluence in Illinois, USA, using large-scale particle image velocimetry (LSPIV) derived from videos captured by unmanned aerial systems (UAS). The method captures surface velocity patterns at high spatial and temporal resolution over multiple scales, ranging from the entire confluence to details of flow within the confluence mixing interface. Flow patterns at high momentum ratio are compared to flow patterns when the two incoming flows have nearly equal momentum flux. Mean surface flow patterns during the two types of events provide details on mean patterns of surface flow in different hydrodynamic regions of the confluence and on changes in these patterns with changing momentum flux ratio. LSPIV data derived from the highest resolution imagery also reveal general characteristics of large-scale vortices that form along the shear layer between the flows during the high-momentum-ratio event. The results indicate that the use of LSPIV and UAS is well suited to capturing mean surface flow patterns at small confluences in detail, but that characterization of evolving turbulent structures is limited by scale considerations related to structure size, image resolution, and camera instability. Complementary methods, including camera platforms mounted at fixed positions close to the water surface, provide opportunities to accurately characterize evolving turbulent flow structures in confluences.
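At its core, LSPIV estimates surface velocity by cross-correlating an interrogation patch from one video frame against the next frame: the offset of the correlation peak, divided by the frame interval and the pixel-to-metre scale, gives a velocity vector. A brute-force normalized cross-correlation sketch (illustrative; production LSPIV codes use FFT-based correlation and sub-pixel peak fitting):

```python
import numpy as np

def lspiv_displacement(patch_t0, frame_t1):
    """Find where an interrogation patch from frame t0 best matches in
    frame t1 via normalized cross-correlation; returns (row, col) of the
    best-matching window's top-left corner."""
    ph, pw = patch_t0.shape
    fh, fw = frame_t1.shape
    p = (patch_t0 - patch_t0.mean()) / patch_t0.std()
    best, best_score = (0, 0), -np.inf
    for r in range(fh - ph + 1):
        for c in range(fw - pw + 1):
            w = frame_t1[r:r + ph, c:c + pw]
            w = (w - w.mean()) / (w.std() + 1e-12)
            score = float((p * w).mean())
            if score > best_score:
                best_score, best = score, (r, c)
    return best

# Toy tracer texture advected 3 px down and 2 px right between frames.
rng = np.random.default_rng(2)
frame0 = rng.random((40, 40))
frame1 = np.roll(np.roll(frame0, 3, axis=0), 2, axis=1)
patch = frame0[10:18, 10:18]
displacement = lspiv_displacement(patch, frame1)
```

Repeating this over a grid of patches yields the two-dimensional surface velocity fields the study analyzes.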
High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6
NASA Astrophysics Data System (ADS)
Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; Senior, Catherine A.; Bellucci, Alessio; Bao, Qing; Chang, Ping; Corti, Susanna; Fučkar, Neven S.; Guemas, Virginie; von Hardenberg, Jost; Hazeleger, Wilco; Kodama, Chihiro; Koenigk, Torben; Leung, L. Ruby; Lu, Jian; Luo, Jing-Jia; Mao, Jiafu; Mizielinski, Matthew S.; Mizuta, Ryo; Nobre, Paulo; Satoh, Masaki; Scoccimarro, Enrico; Semmler, Tido; Small, Justin; von Storch, Jin-Song
2016-11-01
Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Programme (WCRP) grand challenges.
Wu, Yiming; Zhang, Xiujuan; Pan, Huanhuan; Deng, Wei; Zhang, Xiaohong; Zhang, Xiwei; Jie, Jiansheng
2013-01-01
Single-crystalline organic nanowires (NWs) are important building blocks for future low-cost and efficient nano-optoelectronic devices due to their extraordinary properties. However, it remains a critical challenge to achieve large-scale organic NW array assembly and device integration. Herein, we demonstrate a feasible one-step method for large-area patterned growth of cross-aligned single-crystalline organic NW arrays and their in-situ device integration for optical image sensors. The integrated image sensor circuitry contained a 10 × 10 pixel array in an area of 1.3 × 1.3 mm2, showing high spatial resolution, excellent stability and reproducibility. More importantly, 100% of the pixels successfully operated at a high response speed and relatively small pixel-to-pixel variation. The high yield and high spatial resolution of the operational pixels, along with the high integration level of the device, clearly demonstrate the great potential of the one-step organic NW array growth and device construction approach for large-scale optoelectronic device integration. PMID:24287887
NASA Astrophysics Data System (ADS)
Rasouli, K.; Pomeroy, J. W.; Hayashi, M.; Fang, X.; Gutmann, E. D.; Li, Y.
2017-12-01
The hydrology of mountainous cold regions has a large spatial variability that is driven both by climate variability and by near-surface process variability associated with complex terrain and patterns of vegetation, soils, and hydrogeology. There is a need to downscale large-scale atmospheric circulations to the fine scales at which cold-regions hydrological processes operate, in order to assess their spatial variability in complex terrain and to quantify uncertainties by comparison with field observations. In this research, three high-resolution numerical weather prediction models, namely the Intermediate Complexity Atmosphere Research (ICAR), Weather Research and Forecasting (WRF), and Global Environmental Multiscale (GEM) models, are used to represent spatial and temporal patterns of atmospheric conditions appropriate for hydrological modelling. An area covering high mountains and foothills of the Canadian Rockies was selected to assess and compare high-resolution ICAR (1 km × 1 km), WRF (4 km × 4 km), and GEM (2.5 km × 2.5 km) model outputs with station-based meteorological measurements. ICAR, with its very low computational cost, was run with different initial and boundary conditions and with finer spatial resolution, which allowed an assessment of modelling uncertainty and scaling that was difficult with WRF. Results show that ICAR, when compared with WRF and GEM, performs very well in precipitation and air temperature modelling in the Canadian Rockies, while all three models show a fair performance in simulating wind and humidity fields. Representation of local-scale atmospheric dynamics leading to realistic fields of temperature and precipitation by ICAR, WRF, and GEM makes these models suitable for high-resolution cold-regions hydrological predictions in complex terrain, which is a key factor in estimating water security in western Canada.
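Comparing gridded model output against station measurements, as done here for ICAR, WRF, and GEM, typically reduces to a few standard verification statistics. A minimal sketch (the statistic choice is illustrative, not the study's full evaluation protocol):

```python
import numpy as np

def verify(model, obs):
    """Bias and RMSE of model values sampled at station locations,
    paired with the corresponding observations."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    err = model - obs
    return {"bias": float(err.mean()),
            "rmse": float(np.sqrt((err ** 2).mean()))}

# Toy example: three stations, one overpredicted value.
stats = verify([1.0, 2.0, 4.0], [1.0, 1.0, 4.0])
```

Computed per variable (precipitation, temperature, wind, humidity) and per model, such scores are what statements like "ICAR performs very well in precipitation and air temperature" are grounded in.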
NASA Astrophysics Data System (ADS)
Dipankar, A.; Stevens, B. B.; Zängl, G.; Pondkule, M.; Brdar, S.
2014-12-01
The effect of clouds on large-scale dynamics is represented in climate models through parameterization of various processes, of which the parameterizations of shallow and deep convection are particularly uncertain. The atmospheric boundary layer, which controls the coupling to the surface and defines the scale of shallow convection, is typically 1 km in depth. Thus, simulations on a O(100 m) grid largely obviate the need for such parameterizations. By crossing this threshold of O(100 m) grid resolution one can begin thinking of large-eddy simulation (LES), wherein the sub-grid scale parameterizations have a sounder theoretical foundation. Substantial initiatives have been taken internationally to approach this threshold. For example, Miura et al., 2007 and Mirakawa et al., 2014 approach this threshold by performing global simulations, with gradually decreasing grid spacing, to understand the effect of cloud-resolving scales on the general circulation. Our strategy, on the other hand, is to take a big leap forward by fixing the resolution at O(100 m) and gradually increasing the domain size. We believe that breaking this threshold would greatly help in improving the parameterization schemes and reducing the uncertainty in climate predictions. To take this forward, the German Federal Ministry of Education and Research has initiated the HD(CP)2 project, which aims for a limited-area LES at O(100 m) resolution using the new unified modeling system ICON (Zängl et al., 2014). In the talk, results from the HD(CP)2 evaluation simulation will be shown, targeting high-resolution simulation over a small domain around Jülich, Germany. This site was chosen because the high-resolution HD(CP)2 Observational Prototype Experiment took place in this region from 1.04.2013 to 31.05.2013, in order to critically evaluate the model. 
The nesting capability of ICON is used to gradually increase the resolution from the outermost domain, which is forced with COSMO-DE data, to the innermost and finest-resolution domain centered on Jülich (see Fig. 1, top panel). Furthermore, detailed analyses of the simulation results against the observation data will be presented. A representative figure showing time series of column-integrated water vapor (IWV) for both model and observation on 24.04.2013 is shown in the bottom panel of Fig. 1.
Tethys – A Python Package for Spatial and Temporal Downscaling of Global Water Withdrawals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.
Downscaling of water withdrawals from regional/national to local scales is a fundamental step, and also a common problem, when integrating large-scale economic and integrated assessment models with high-resolution, detailed sectoral models. Tethys, an open-access software package written in Python, was developed with statistical downscaling algorithms to spatially and temporally downscale water withdrawal data to a finer scale. The spatial resolution is downscaled from the region/basin scale to the grid (0.5 geographic degree) scale, and the temporal resolution from year to month. Tethys is used to produce monthly global gridded water withdrawal products based on estimates from the Global Change Assessment Model (GCAM).
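The generic idea behind such statistical downscaling is proportional allocation: spread a regional annual total over grid cells according to a spatial proxy (e.g., population or irrigated area), then over months according to a temporal profile. A minimal sketch of that pattern (Tethys's actual proxies and algorithms are more elaborate; the names here are illustrative):

```python
import numpy as np

def downscale_withdrawal(regional_total, cell_weights, monthly_profile):
    """Allocate an annual regional withdrawal to grid cells in proportion
    to cell_weights, then to months by a 12-value profile.
    Returns an array of shape (n_cells, 12)."""
    w = np.asarray(cell_weights, dtype=float)
    cells = regional_total * w / w.sum()          # spatial downscaling
    p = np.asarray(monthly_profile, dtype=float)
    p = p / p.sum()                               # temporal downscaling
    return np.outer(cells, p)

# Toy region: 120 units/yr split over two cells (weights 1:3),
# with a flat monthly profile.
grid = downscale_withdrawal(120.0, [1, 3], [1] * 12)
```

By construction the gridded monthly values sum back to the regional annual total, which is the consistency property downscaled products must preserve.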
Demonstration of nanoimprinted hyperlens array for high-throughput sub-diffraction imaging
NASA Astrophysics Data System (ADS)
Byun, Minsueop; Lee, Dasol; Kim, Minkyung; Kim, Yangdoo; Kim, Kwan; Ok, Jong G.; Rho, Junsuk; Lee, Heon
2017-04-01
Overcoming the resolution limit of conventional optics is regarded as the most important issue in optical imaging science and technology. Although hyperlenses, super-resolution imaging devices based on highly anisotropic dispersion relations that allow access to high-wavevector components, have recently achieved far-field sub-diffraction imaging in real time, the previously demonstrated devices have suffered from extreme difficulties in both the fabrication process and the placement of non-artificial objects. This has restricted the practical application of hyperlens devices. While implementing large-scale hyperlens arrays in conventional microscopy is desirable to solve such issues, it has not been feasible to fabricate such large-scale hyperlens arrays with the previously used nanofabrication methods. Here, we suggest a scalable and reliable fabrication process for a large-scale hyperlens device based on direct pattern transfer techniques. We fabricate a 5 cm × 5 cm hyperlens array and experimentally demonstrate that it can resolve sub-diffraction features down to 160 nm under 410 nm wavelength visible light. The array-based hyperlens device will provide a simple solution for much more practical far-field, real-time super-resolution imaging that can be widely used in optics, biology, medical science, nanotechnology and other closely related interdisciplinary fields.
Towards large-scale, human-based, mesoscopic neurotechnologies.
Chang, Edward F
2015-04-08
Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation. Copyright © 2015 Elsevier Inc. All rights reserved.
Optical mapping and its potential for large-scale sequencing projects.
Aston, C; Mishra, B; Schwartz, D C
1999-07-01
Physical mapping has been rediscovered as an important component of large-scale sequencing projects. Restriction maps provide landmark sequences at defined intervals, and high-resolution restriction maps can be assembled from ensembles of single molecules by optical means. Such optical maps can be constructed from both large-insert clones and genomic DNA, and are used as a scaffold for accurately aligning sequence contigs generated by shotgun sequencing.
Scale and modeling issues in water resources planning
Lins, H.F.; Wolock, D.M.; McCabe, G.J.
1997-01-01
Resource planners and managers interested in utilizing climate model output as part of their operational activities immediately confront the dilemma of scale discordance. Their functional responsibilities cover relatively small geographical areas and necessarily require data of relatively high spatial resolution. Climate models cover a large geographical, i.e. global, domain and produce data at comparatively low spatial resolution. Although the scale differences between model output and planning input are large, several techniques have been developed for disaggregating climate model output to a scale appropriate for use in water resource planning and management applications. With techniques in hand to reduce the limitations imposed by scale discordance, water resource professionals must now confront a more fundamental constraint on the use of climate models: the inability to produce accurate representations and forecasts of regional climate. Given the current capabilities of climate models, and the likelihood that the uncertainty associated with long-term climate model forecasts will remain high for some years to come, the water resources planning community may find it impractical to utilize such forecasts operationally.
NASA Astrophysics Data System (ADS)
Ba, Yu Tao; xian Liu, Bao; Sun, Feng; Wang, Li hua; Tang, Yu jia; Zhang, Da wei
2017-04-01
High-resolution mapping of PM2.5 is a prerequisite for precise analytics and subsequent anti-pollution interventions. Considering the large variance of particulate distributions, urban-scale mapping is challenging, whether with ground-based fixed stations, with satellites, or via models. In this study, a dynamic fusion method combining a high-density sensor network with MODIS Aerosol Optical Depth (AOD) was introduced. The sensor network was deployed in Beijing ( > 1000 fixed monitors across a 16000 km2 area) to provide raw observations with high temporal resolution (sampling interval < 1 hour), high spatial resolution in flat areas ( < 1 km), and lower spatial resolution in mountainous areas ( > 5 km). The MODIS AOD was calibrated to provide a distribution map with low temporal resolution (daily) and moderate spatial resolution ( = 3 km). By encoding the data quality and defects (e.g., cloud, reflectance, abnormal readings), a hybrid interpolation procedure with cross-validation generated PM2.5 distributions with both high temporal and high spatial resolution. Several pollution-free and high-pollution periods were tested to validate the proposed fusion method for capturing the instantaneous patterns of PM2.5 emission.
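One simple ingredient of such a fusion map is spatial interpolation of the scattered sensor readings onto a regular grid, for example by inverse-distance weighting. A minimal sketch (the study's hybrid procedure, with quality encoding and AOD calibration, is considerably more involved):

```python
import numpy as np

def idw(xy_obs, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation of scattered readings
    (e.g., PM2.5 from fixed monitors) onto query points."""
    xy_obs = np.atleast_2d(np.asarray(xy_obs, dtype=float))
    values = np.asarray(values, dtype=float)
    out = []
    for q in np.atleast_2d(np.asarray(xy_query, dtype=float)):
        d = np.linalg.norm(xy_obs - q, axis=1)
        if np.any(d < 1e-12):            # query coincides with a monitor
            out.append(float(values[d < 1e-12][0]))
            continue
        w = 1.0 / d ** power
        out.append(float(np.dot(w, values) / w.sum()))
    return np.array(out)

# Two monitors reading 10 and 30 ug/m3; the midpoint is equidistant,
# so it receives the mean of the two readings.
obs_xy = [(0.0, 0.0), (1.0, 0.0)]
pm25 = [10.0, 30.0]
est = idw(obs_xy, pm25, [(0.5, 0.0)])
```

In a fusion setting, weights would additionally reflect per-observation quality flags and the calibrated AOD field would constrain the map where sensors are sparse.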
McShane, Ryan R.; Driscoll, Katelyn P.; Sando, Roy
2017-09-27
Many approaches have been developed for measuring or estimating actual evapotranspiration (ETa), and research over many years has led to the development of remote sensing methods that are reliably reproducible and effective in estimating ETa. Several remote sensing methods can be used to estimate ETa at the high spatial resolution of agricultural fields and the large extent of river basins. More complex remote sensing methods apply an analytical approach to ETa estimation using physically based models of varied complexity that require a combination of ground-based and remote sensing data, and are grounded in the theory behind the surface energy balance model. This report, funded through cooperation with the International Joint Commission, provides an overview of selected remote sensing methods used for estimating water consumed through ETa and focuses on Mapping Evapotranspiration at High Resolution with Internalized Calibration (METRIC) and Operational Simplified Surface Energy Balance (SSEBop), two energy balance models for estimating ETa that are currently applied successfully in the United States. The METRIC model can produce maps of ETa at high spatial resolution (30 meters using Landsat data) for specific areas smaller than several hundred square kilometers in extent, an improvement in practice over methods used more generally at larger scales. Many studies validating METRIC estimates of ETa against measurements from lysimeters have shown model accuracies on daily to seasonal time scales ranging from 85 to 95 percent. The METRIC model is accurate, but the greater complexity of METRIC results in greater data requirements, and the internalized calibration of METRIC leads to greater skill required for implementation. In contrast, SSEBop is a simpler model, having reduced data requirements and greater ease of implementation without a substantial loss of accuracy in estimating ETa. 
The SSEBop model has been used to produce maps of ETa over very large extents (the conterminous United States) using lower spatial resolution (1 kilometer) Moderate Resolution Imaging Spectroradiometer (MODIS) data. Model accuracies ranging from 80 to 95 percent on daily to annual time scales have been shown in numerous studies that validated ETa estimates from SSEBop against eddy covariance measurements. The METRIC and SSEBop models can incorporate low and high spatial resolution data from MODIS and Landsat, but the high spatiotemporal resolution of ETa estimates using Landsat data over large extents takes immense computing power. Cloud computing is providing an opportunity for processing an increasing amount of geospatial “big data” in a decreasing period of time. For example, Google Earth Engine™ has been used to implement METRIC with automated calibration for regional-scale estimates of ETa using Landsat data. The U.S. Geological Survey also is using Google Earth Engine™ to implement SSEBop for estimating ETa in the United States at a continental scale using Landsat data.
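The simplicity of SSEBop relative to METRIC comes from its ET-fraction formulation: the observed land-surface temperature is placed between predefined cold (wet) and hot (dry) reference temperatures, and that fraction scales reference ET. A minimal sketch of this core relation (the operational model's derivation of the reference temperatures is more involved; the clamp bound here is illustrative):

```python
def ssebop_eta(ts, t_cold, t_hot, eto, k=1.0):
    """SSEBop-style actual ET estimate.

    ts: observed land-surface temperature (K); t_cold / t_hot: idealized
    wet and dry reference temperatures (K); eto: reference ET (mm/day);
    k: scaling coefficient. ET fraction is 1 at the cold reference
    (well-watered surface) and 0 at the hot reference (dry surface).
    """
    etf = (t_hot - ts) / (t_hot - t_cold)
    etf = min(max(etf, 0.0), 1.05)   # clamp to a plausible range
    return k * etf * eto

# A pixel halfway between the references transpires at half of ETo.
eta = ssebop_eta(ts=300.0, t_cold=295.0, t_hot=305.0, eto=6.0)
```

Because everything reduces to per-pixel arithmetic on a temperature map, the method scales easily to continental extents, which is the trade-off against METRIC's internally calibrated approach noted above.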
Targeted carbon conservation at national scales with high-resolution monitoring
Asner, Gregory P.; Knapp, David E.; Martin, Roberta E.; Tupayachi, Raul; Anderson, Christopher B.; Mascaro, Joseph; Sinca, Felipe; Chadwick, K. Dana; Higgins, Mark; Farfan, William; Llactayo, William; Silman, Miles R.
2014-01-01
Terrestrial carbon conservation can provide critical environmental, social, and climate benefits. Yet, the geographically complex mosaic of threats to, and opportunities for, conserving carbon in landscapes remains largely unresolved at national scales. Using a new high-resolution carbon mapping approach applied to Perú, a megadiverse country undergoing rapid land use change, we found that at least 0.8 Pg of aboveground carbon stocks are at imminent risk of emission from land use activities. Map-based information on the natural controls over carbon density, as well as current ecosystem threats and protections, revealed three biogeographically explicit strategies that fully offset forthcoming land-use emissions. High-resolution carbon mapping affords targeted interventions to reduce greenhouse gas emissions in rapidly developing tropical nations. PMID:25385593
Guitet, Stéphane; Hérault, Bruno; Molto, Quentin; Brunaux, Olivier; Couteron, Pierre
2015-01-01
Precise mapping of above-ground biomass (AGB) is a major challenge for the success of REDD+ processes in tropical rainforests. The usual mapping methods are based on two hypotheses: a large and long-ranged spatial autocorrelation and a strong environmental influence at the regional scale. However, there are no studies of the spatial structure of AGB at the landscape scale to support these assumptions. We studied spatial variation in AGB at various scales using two large forest inventories conducted in French Guiana. The dataset comprised 2507 plots (0.4 to 0.5 ha) of undisturbed rainforest distributed over the whole region. After checking the uncertainties of estimates obtained from these data, we used half of the dataset to develop explicit predictive models including spatial and environmental effects and tested the accuracy of the resulting maps according to their resolution using the rest of the data. Forest inventories provided accurate AGB estimates at the plot scale, for a mean of 325 Mg.ha-1. They revealed high local variability combined with a weak autocorrelation up to distances of no more than 10 km. Environmental variables accounted for a minor part of spatial variation. The error of the best model including spatial effects was 90 Mg.ha-1 at the plot scale, but coarse graining up to 2-km resolution allowed mapping AGB with errors below 50 Mg.ha-1. No agreement was found with available pan-tropical reference maps at any resolution. We concluded that the combination of weak autocorrelation and weak environmental effects limits the accuracy of AGB maps in rainforests, and that a trade-off has to be found between spatial resolution and effective accuracy until adequate "wall-to-wall" remote sensing signals provide reliable AGB predictions. In the meantime, using large forest inventories with low sampling rates (<0.5%) may be an efficient way to increase the global coverage of AGB maps with acceptable accuracy at kilometric resolution.
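The coarse-graining step described above (averaging plot-level AGB onto a kilometric grid) can be sketched with numpy. This is a minimal illustration under our own assumptions about the inputs (plot coordinates in km within a square region); it is not the authors' mapping model.

```python
import numpy as np

def coarse_grain_agb(xy_km, agb, cell_km, extent_km):
    """Average plot-level AGB (Mg/ha) onto a regular grid at cell_km resolution.

    xy_km -- (n, 2) array of plot coordinates in km within a square region
    agb   -- (n,) array of plot-level AGB estimates
    Returns a 2-D map; cells containing no plots are NaN.
    """
    n = int(np.ceil(extent_km / cell_km))
    sums = np.zeros((n, n))
    counts = np.zeros((n, n))
    ix = np.clip((xy_km[:, 0] // cell_km).astype(int), 0, n - 1)
    iy = np.clip((xy_km[:, 1] // cell_km).astype(int), 0, n - 1)
    np.add.at(sums, (iy, ix), agb)    # accumulate plots falling in each cell
    np.add.at(counts, (iy, ix), 1)
    with np.errstate(invalid="ignore"):
        return np.where(counts > 0, sums / counts, np.nan)
```

Averaging several plots per cell reduces the plot-to-plot sampling error, which is why the mapped error drops from 90 Mg.ha-1 at plot scale toward 50 Mg.ha-1 at 2-km resolution.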
Atomic-scale imaging of DNA using scanning tunnelling microscopy.
Driscoll, R J; Youngquist, M G; Baldeschwieler, J D
1990-07-19
The scanning tunnelling microscope (STM) has been used to visualize DNA under water, under oil and in air. Images of single-stranded DNA have shown that submolecular resolution is possible. Here we describe atomic-resolution imaging of duplex DNA. Topographic STM images of uncoated duplex DNA on a graphite substrate obtained in ultra-high vacuum are presented that show double-helical structure, base pairs, and atomic-scale substructure. Experimental STM profiles show excellent correlation with atomic contours of the van der Waals surface of A-form DNA derived from X-ray crystallography. A comparison of variations in the barrier to quantum mechanical tunnelling (barrier-height) with atomic-scale topography shows correlation over the phosphate-sugar backbone but anticorrelation over the base pairs. This relationship may be due to the different chemical characteristics of parts of the molecule. Further investigation of this phenomenon should lead to a better understanding of the physics of imaging adsorbates with the STM and may prove useful in sequencing DNA. The improved resolution compared with previously published STM images of DNA may be attributable to ultra-high vacuum, high data-pixel density, slow scan rate, a fortuitously clean and sharp tip and/or a relatively dilute and extremely clean sample solution. This work demonstrates the potential of the STM for characterization of large biomolecular structures, but additional development will be required to make such high resolution imaging of DNA and other large molecules routine.
Turbulent kinetics of a large wind farm and their impact in the neutral boundary layer
Na, Ji Sung; Koo, Eunmo; Munoz-Esparza, Domingo; ...
2015-12-28
High-resolution large-eddy simulation of the flow over a large wind farm (64 wind turbines) is performed using the HIGRAD/FIRETEC-WindBlade model, which is a high-performance computing wind turbine–atmosphere interaction model that uses the Lagrangian actuator line method to represent rotating turbine blades. These high-resolution large-eddy simulation results are used to parameterize the thrust and power coefficients that contain information about turbine interference effects within the wind farm. Those coefficients are then incorporated into the WRF (Weather Research and Forecasting) model in order to evaluate interference effects in larger-scale models. In the high-resolution WindBlade wind farm simulation, insufficient distance between turbines creates the interference between turbines, including significant vertical variations in momentum and turbulent intensity. The characteristics of the wake are further investigated by analyzing the distribution of the vorticity and turbulent intensity. Quadrant analysis in the turbine and post-turbine areas reveals that the ejection motion induced by the presence of the wind turbines is dominant compared to that in the other quadrants, indicating that the sweep motion is increased at the location where strong wake recovery occurs. Regional-scale WRF simulations reveal that although the turbulent mixing induced by the wind farm is partly diffused to the upper region, there is no significant change in the boundary layer depth. The velocity deficit does not appear to be very sensitive to the local distribution of turbine coefficients. However, differences of about 5% on parameterized turbulent kinetic energy were found depending on the turbine coefficient distribution. Furthermore, turbine coefficients that consider interference in the wind farm should be used in wind farm parameterization for larger-scale models to better describe sub-grid scale turbulent processes.
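The quadrant analysis used above can be sketched simply: velocity fluctuations (u', w') are classified by the sign of each component, with Q2 events (ejections) and Q4 events (sweeps) of particular interest for wake recovery. This is a generic textbook sketch, not the study's analysis code.

```python
import numpy as np

def quadrant_fractions(u, w):
    """Fraction of samples in each quadrant of the (u', w') plane.

    Q2 (u' < 0, w' > 0) corresponds to ejections (slow fluid moving up);
    Q4 (u' > 0, w' < 0) corresponds to sweeps (fast fluid moving down).
    """
    up = u - u.mean()   # streamwise fluctuation
    wp = w - w.mean()   # vertical fluctuation
    return {
        "Q1": float(np.mean((up > 0) & (wp > 0))),   # outward interaction
        "Q2": float(np.mean((up < 0) & (wp > 0))),   # ejection
        "Q3": float(np.mean((up < 0) & (wp < 0))),   # inward interaction
        "Q4": float(np.mean((up > 0) & (wp < 0))),   # sweep
    }
```

Comparing the Q2 and Q4 fractions (or their contributions to the momentum flux u'w') upstream and downstream of a turbine is how dominance of ejections versus sweeps is diagnosed.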
NASA Astrophysics Data System (ADS)
Michaelis, Dirk; Schroeder, Andreas
2012-11-01
Tomographic PIV has triggered vivid activity, reflected in a large number of publications covering both development of the technique and a wide range of fluid dynamic experiments. The maturing of tomographic PIV allows its application in medium- to large-scale wind tunnels. The limiting factor for wind tunnel application is the small size of the measurement volume, typically about 50 × 50 × 15 mm3. The aim of this study is optimization towards large measurement volumes and high spatial resolution, performing cylinder wake measurements in a 1 meter wind tunnel. The main limiting factors for the volume size are the laser power and the camera sensitivity. Hence, a high power laser with 800 mJ per pulse is used together with low-noise sCMOS cameras, mounted in the forward scattering direction to gain intensity from the Mie scattering characteristics. A mirror is used to bounce the light back, so that all cameras are in forward scattering. The achievable particle density grows with the number of cameras, so eight cameras are used for high spatial resolution. These optimizations lead to a volume size of 230 × 200 × 52 mm3 = 2392 cm3, more than 60 times larger than previously achieved. 281 × 323 × 68 vectors are calculated with a spacing of 0.76 mm. The achieved measurement volume size and spatial resolution are regarded as a major step forward in the application of tomographic PIV in wind tunnels. Supported by EU project no. 265695.
Towards a New Assessment of Urban Areas from Local to Global Scales
NASA Astrophysics Data System (ADS)
Bhaduri, B. L.; Roy Chowdhury, P. K.; McKee, J.; Weaver, J.; Bright, E.; Weber, E.
2015-12-01
Since the early 2000s, starting with NASA MODIS, satellite-based remote sensing has facilitated collection of imagery with medium spatial resolution but high temporal resolution (daily). This trend continues with an increasing number of sensors and data products. Increasing spatial and temporal resolutions of remotely sensed data archives, from both public and commercial sources, have significantly enhanced the quality of mapping and change data products. However, even with automation of such analysis on evolving computing platforms, rates of data processing have been suboptimal, largely because of the ever-increasing pixel-to-processor ratio coupled with limitations of the computing architectures. Novel approaches utilizing spatiotemporal data mining techniques and computational architectures have emerged that demonstrate the potential for sustained and geographically scalable landscape monitoring to become operational. We exemplify this challenge with two broad research initiatives on High Performance Geocomputation at Oak Ridge National Laboratory: (a) mapping global settlement distribution; (b) developing national critical infrastructure databases. Our present effort, on large GPU-based architectures, to exploit high resolution (1 m or less) satellite and airborne imagery for extracting settlements at the global scale is yielding an understanding of human settlement patterns and urban areas at unprecedented resolution. Comparison of this urban land cover database with existing national and global land cover products, at various geographic scales in selected parts of the world, is revealing intriguing patterns and insights for urban assessment. Early results, from the USA, Taiwan, and Egypt, indicate closer agreements (5-10%) in urban area assessments among databases at larger, aggregated geographic extents. However, spatial variability at local scales could be significantly different (over 50% disagreement).
Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate
Dotson, Nicholas M.; Goodell, Baldwin; Salazar, Rodrigo F.; Hoffman, Steven J.; Gray, Charles M.
2015-01-01
Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field. PMID:26578906
NASA Astrophysics Data System (ADS)
Sun, Y. S.; Zhang, L.; Xu, B.; Zhang, Y.
2018-04-01
Accurate positioning of optical satellite imagery without ground control is a precondition for remote sensing applications and small/medium-scale mapping over large areas abroad or with large image blocks. In this paper, considering the geometric features of optical satellite imagery, and building on a widely used optimization method for constrained problems, the Alternating Direction Method of Multipliers (ADMM), together with RFM least-squares block adjustment, we propose a GCP-independent block adjustment method for large-scale domestic high resolution optical satellite imagery - GISIBA (GCP-Independent Satellite Imagery Block Adjustment) - which is easy to parallelize and highly efficient. In this method, virtual "average" control points are built to solve the rank deficiency problem and to enable qualitative and quantitative analysis in block adjustment without ground control. The test results show that the horizontal and vertical accuracies of multi-covered and multi-temporal satellite images are better than 10 m and 6 m, respectively. Meanwhile, the mosaic problem between adjacent areas in large-area DOM production can be solved if public geographic information data are introduced as horizontal and vertical constraints in the block adjustment process. Finally, through experiments using GF-1 and ZY-3 satellite images over several typical test areas, the reliability, accuracy, and performance of the developed procedure are presented and studied.
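The abstract leans on ADMM as its optimization workhorse. As a generic illustration of the ADMM update cycle (x-update, z-update, dual update) - not the GISIBA adjustment itself - here is the standard ADMM solver for an L1-regularized least-squares problem; all names are ours.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via the standard ADMM splitting.

    x-update: ridge-like linear solve; z-update: soft thresholding;
    u: scaled dual variable accumulating the constraint residual x - z.
    """
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        u = u + x - z
    return z  # the sparse iterate
```

Each sub-step touches only one term of the objective, which is what makes ADMM-style schemes easy to parallelize across blocks of a large adjustment problem.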
Midekisa, Alemayehu; Holl, Felix; Savory, David J; Andrade-Pacheco, Ricardo; Gething, Peter W; Bennett, Adam; Sturrock, Hugh J W
2017-01-01
Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using high resolution Landsat satellite data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here to overcome the computational challenges of handling big earth observation data by using cloud computing can help scientists and practitioners who lack high-performance computational resources.
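The change-quantification step described above reduces, per pixel, to tabulating class transitions between two epochs. A minimal numpy sketch (the function name and class encoding are ours; the study's actual pipeline runs in Earth Engine):

```python
import numpy as np

def transition_matrix(before, after, n_classes):
    """Count per-class transitions between two co-registered classified rasters.

    Entry [i, j] is the number of pixels mapped as class i in the first epoch
    and class j in the second; the diagonal holds unchanged pixels.
    """
    pairs = before.ravel().astype(np.int64) * n_classes + after.ravel()
    return np.bincount(pairs, minlength=n_classes**2).reshape(n_classes, n_classes)
```

Summing a column of the matrix for an "impervious" class, for example, gives the total area converted to impervious surface over the 15-year period.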
NASA Astrophysics Data System (ADS)
Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.
2014-04-01
Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, that is, their consistency with the real world, plays a critical role in their value and applications. The continuous updating of map databases at 1:50,000 scale is a massive and difficult task for large countries covering several million square kilometers. This paper presents the research and technological development supporting national map updating at 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of a continuous updating workflow. The work required the use of many data sources and their integration into a high-accuracy, quality-checked product, which in turn required up-to-date techniques of image matching, semantic integration, generalization, database management, and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high resolution imagery and large-scale data generalization, covering map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control, a series of technical production specifications, and a network of updating production units in different geographic places across the country.
Simulation of Deep Convective Clouds with the Dynamic Reconstruction Turbulence Closure
NASA Astrophysics Data System (ADS)
Shi, X.; Chow, F. K.; Street, R. L.; Bryan, G. H.
2017-12-01
The terra incognita (TI), or gray zone, in simulations is a range of grid spacings comparable to the diameter of the most energetic eddies. Grid spacing in mesoscale simulations is much larger than these eddies, and turbulence is parameterized with one-dimensional vertical-mixing schemes. Large eddy simulations (LES) have grid spacing much smaller than the energetic eddies and use three-dimensional models of turbulence. Studies of convective weather use convection-permitting resolutions, which fall in the TI. Neither mesoscale turbulence models nor LES models are designed for the TI, so TI turbulence parameterization needs to be examined. Here, the effects of sub-filter scale (SFS) closure schemes on the simulation of deep tropical convection are evaluated by comparing three closures: the Smagorinsky model, a Deardorff-type TKE model, and the dynamic reconstruction model (DRM), which partitions SFS turbulence into resolvable sub-filter scales (RSFS) and unresolved sub-grid scales (SGS). The RSFS are reconstructed, and the SGS are modeled with a dynamic eddy viscosity/diffusivity model. The RSFS stresses/fluxes allow backscatter of energy/variance via counter-gradient stresses/fluxes. In high-resolution (100 m) simulations of tropical convection, use of these turbulence models did not lead to significant differences in cloud water/ice distribution, precipitation flux, or vertical fluxes of momentum and heat. When model resolutions are coarsened, the Smagorinsky and TKE models overestimate cloud ice and produce large-amplitude downward heat flux in the middle troposphere (not found in the high-resolution simulations). This error is a result of unrealistically large eddy diffusivities: the eddy diffusivity of the DRM is on the order of 1 for the coarse resolution simulations, while that of the Smagorinsky and TKE models is on the order of 100.
Splitting the eddy viscosity/diffusivity scalars into vertical and horizontal components by using different length scales and strain rate components helps to reduce the errors, but does not completely remedy the problem. In contrast, the coarse resolution simulations using the DRM produce results that are more consistent with the high-resolution results, suggesting that the DRM is a more appropriate turbulence model for simulating convection in the TI.
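The grid-spacing dependence underlying these diffusivity differences is visible in the standard Smagorinsky closure, nu_t = (Cs Δ)² |S|. A one-line sketch (the formula is the textbook model; the function name and the typical Cs value are our choices):

```python
def smagorinsky_viscosity(strain_mag, delta, cs=0.17):
    """Smagorinsky eddy viscosity nu_t = (Cs * Delta)^2 * |S|, in m^2/s.

    strain_mag -- magnitude of the resolved strain-rate tensor |S| (1/s)
    delta      -- filter width, typically tied to the grid spacing (m)
    cs         -- Smagorinsky constant (a commonly quoted value)
    """
    return (cs * delta) ** 2 * strain_mag
```

Because nu_t scales with Δ², coarsening the grid by a factor of 10 inflates the eddy viscosity by a factor of 100 for the same resolved strain, which mirrors the order-1 versus order-100 diffusivities contrasted above.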
Photosynthesis in high definition
NASA Astrophysics Data System (ADS)
Hilton, Timothy W.
2018-01-01
Photosynthesis is the foundation for almost all known life, but quantifying it at scales above a single plant is difficult. A new satellite illuminates plants' molecular machinery at much-improved spatial resolution, taking us one step closer to combined `inside-outside' insights into large-scale photosynthesis.
Multi-fidelity methods for uncertainty quantification in transport problems
NASA Astrophysics Data System (ADS)
Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.
2016-12-01
We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, the re-scaled Multi Level Monte Carlo (rMLMC) method, which is based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss the advantages of each approach.
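The plain MLMC estimator that rMLMC builds on is a telescoping sum: the mean at the finest level is estimated as the coarse-level mean plus correction terms between adjacent levels, with most samples drawn on the cheap coarse levels. A minimal sketch under our own interface assumptions (a user-supplied `sampler(level, n)` returning paired fine/coarse samples):

```python
import numpy as np

def mlmc_estimate(sampler, samples_per_level):
    """Telescoping MLMC estimator: E[Q_L] ~ E[Q_0] + sum_l E[Q_l - Q_{l-1}].

    sampler(level, n) must return (q_fine, q_coarse): n paired samples of the
    quantity of interest at this level and at the next-coarser level
    (q_coarse is ignored at level 0).
    """
    total = 0.0
    for level, n in enumerate(samples_per_level):
        q_fine, q_coarse = sampler(level, n)
        if level == 0:
            total += np.mean(q_fine)          # coarse baseline
        else:
            total += np.mean(q_fine - q_coarse)  # level-to-level correction
    return total
```

The savings come from the corrections having small variance: few expensive fine-level samples are needed, while the cheap level-0 term absorbs most of the sampling effort.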
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Chun; Leung, L. Ruby; Park, Sang-Hun
Advances in computing resources are gradually moving regional and global numerical forecasting simulations towards sub-10 km resolution, but global high resolution climate simulations remain a challenge. The non-hydrostatic Model for Prediction Across Scales (MPAS) provides a global framework to achieve very high resolution using regional mesh refinement. Previous studies using the hydrostatic version of MPAS (H-MPAS) with the physics parameterizations of Community Atmosphere Model version 4 (CAM4) found notable resolution dependent behaviors. This study revisits the resolution sensitivity using the non-hydrostatic version of MPAS (NH-MPAS) with both CAM4 and CAM5 physics. A series of aqua-planet simulations at global quasi-uniform resolutions ranging from 240 km to 30 km and global variable resolution simulations with a regional mesh refinement of 30 km resolution over the tropics are analyzed, with a primary focus on the distinct characteristics of NH-MPAS in simulating precipitation, clouds, and large-scale circulation features compared to H-MPAS-CAM4. The resolution sensitivity of total precipitation and column integrated moisture in NH-MPAS is smaller than that in H-MPAS-CAM4. This contributes importantly to the reduced resolution sensitivity of large-scale circulation features such as the inter-tropical convergence zone and Hadley circulation in NH-MPAS compared to H-MPAS. In addition, NH-MPAS shows almost no resolution sensitivity in the simulated westerly jet, in contrast to the obvious poleward shift in H-MPAS with increasing resolution, which is partly explained by differences in the hyperdiffusion coefficients used in the two models that influence wave activity.
With the reduced resolution sensitivity, simulations in the refined region of the NH-MPAS global variable resolution configuration exhibit zonally symmetric features that are more comparable to the quasi-uniform high-resolution simulations than those from H-MPAS, which displays zonal asymmetry inside the refined region. Overall, NH-MPAS with CAM5 physics shows less resolution sensitivity compared to CAM4. These results provide a reference for future studies to further explore the use of NH-MPAS for high-resolution climate simulations in idealized and realistic configurations.
Large-scale Density Structures in Magneto-rotational Disk Turbulence
NASA Astrophysics Data System (ADS)
Youdin, Andrew; Johansen, A.; Klahr, H.
2009-01-01
Turbulence generated by the magneto-rotational instability (MRI) is a strong candidate to drive accretion flows in disks, including sufficiently ionized regions of protoplanetary disks. The MRI is often studied in local shearing boxes, which model a small section of the disk at high resolution. I will present simulations of large, stratified shearing boxes which extend up to 10 gas scale-heights across. These simulations are a useful bridge to fully global disk simulations. We find that MRI turbulence produces large-scale, axisymmetric density perturbations. These structures are part of a zonal flow --- analogous to the banded flow in Jupiter's atmosphere --- which survives in near geostrophic balance for tens of orbits. The launching mechanism is large-scale magnetic tension generated by an inverse cascade. We demonstrate the robustness of these results by careful study of various box sizes, grid resolutions, and microscopic diffusion parameterizations. These gas structures can trap solid material (in the form of large dust or ice particles), with important implications for planet formation. Resolved disk images at mm-wavelengths (e.g. from ALMA) will verify or constrain the existence of these structures.
DMI's Baltic Sea Coastal operational forecasting system
NASA Astrophysics Data System (ADS)
Murawski, Jens; Berg, Per; Weismann Poulsen, Jacob
2017-04-01
Operational forecasting is challenged with bridging the gap between the large scales of the driving weather systems and the local, human scales of the model applications. The limit of what can be represented by local models has been continuously shifted to higher and higher spatial resolution, with the aim of better resolving the local dynamics and making it possible to describe processes that could only be parameterized in older versions, with the ultimate goal of improving the quality of the forecast. Current hardware trends demand a stronger focus on the development of efficient, highly parallelized software and require a refactoring of the code with a solid focus on portable performance. The gained performance can be used for running high resolution models with larger coverage. Together with the development of efficient two-way nesting routines, this has made it possible to approach the near-coastal zone with model applications that can run in a time-effective way. Denmark's Meteorological Institute uses the HBM(1) ocean circulation model for applications that cover the entire Baltic Sea and North Sea with an integrated model set-up that spans the range of horizontal resolution from 1 nm for the entire Baltic Sea to approx. 200 m resolution in local fjords (Limfjord). For the next model generation, the high resolution set-ups are going to be extended, and new high resolution domains in coastal zones are either implemented or tested for operational use. For the first time it will be possible to cover large stretches of the Baltic coastal zone with sufficiently high resolution to model the local hydrodynamics adequately. (1) HBM stands for HIROMB-BOOS-Model, where HIROMB stands for "High Resolution Model for the Baltic Sea" and BOOS stands for "Baltic Operational Oceanography System".
Li, Quanfeng; Wang, Qi; Hou, Yubin; Lu, Qingyou
2012-04-01
We present a home-built 18/20 T high magnetic field scanning tunneling microscope (STM) featuring fully low voltage (lower than ±15 V) operability at low temperatures, a large-scale searching ability, and 20 fA high current resolution (measured by using a 100 GOhm dummy resistor to replace the tip-sample junction) with a bandwidth of 3.03 kHz. To accomplish low voltage operation, which is important for achieving high precision, low noise, and low interference with the strong magnetic field, the coarse approach is implemented with an inertial slider driven by the lateral bending of a piezoelectric scanner tube (PST) whose inner electrode is axially split into two for enhanced bending per volt. The PST can also drive the same sliding piece to slide inertially in the other bending direction (along the sample surface) of the PST, which realizes the large-area searching ability. The STM head is housed in a three-segment tubular chamber, which is detachable near the STM head for the convenience of sample and tip changes. Atomic resolution images of a graphite sample taken at 17.6 T and 18.0001 T are presented to show its performance. © 2012 American Institute of Physics
High-resolution Observations of Hα Spectra with a Subtractive Double Pass
NASA Astrophysics Data System (ADS)
Beck, C.; Rezaei, R.; Choudhary, D. P.; Gosain, S.; Tritschler, A.; Louis, R. E.
2018-02-01
High-resolution imaging spectroscopy in solar physics has relied on Fabry-Pérot interferometers (FPIs) in recent years. FPI systems, however, become technically challenging and expensive for telescopes larger than the 1 m class. A conventional slit spectrograph with a diffraction-limited performance over a large field of view (FOV) can be built at much lower cost and effort. It can be converted into an imaging spectro(polari)meter using the concept of a subtractive double pass (SDP). We demonstrate that an SDP system can reach a similar performance as FPI-based systems with a high spatial and moderate spectral resolution across a FOV of 100'' × 100'' with a spectral coverage of 1 nm. We use Hα spectra taken with an SDP system at the Dunn Solar Telescope and complementary full-disc data to infer the properties of small-scale superpenumbral filaments. We find that the majority of all filaments end in patches of opposite-polarity fields. The internal fine-structure in the line-core intensity of Hα at spatial scales of about 0.5'' exceeds that in other parameters such as the line width, indicating small-scale opacity effects in a larger-scale structure with common properties. We conclude that SDP systems in combination with (multi-conjugate) adaptive optics are a valid alternative to FPI systems when high spatial resolution and a large FOV are required. They can also reach a cadence that is comparable to that of FPI systems, while providing a much larger spectral range and a simultaneous multi-line capability.
NASA Astrophysics Data System (ADS)
Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Subramanian, Aneesh; Weisheimer, Antje; Christensen, Hannah; Juricke, Stephan; Palmer, Tim
2016-04-01
The PRACE Climate SPHINX project investigates the sensitivity of climate simulations to model resolution and stochastic parameterization. The EC-Earth Earth-System Model is used to explore the impact of stochastic physics in 30-year climate integrations as a function of model resolution (from 80 km up to 16 km for the atmosphere). The experiments include more than 70 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), using RCP8.5 CMIP5 forcing. A total of 20 million core hours will have been used by the end of the project (March 2016), and about 150 TBytes of post-processed data will be available to the climate community. Preliminary results show a clear improvement in the representation of climate variability over the Euro-Atlantic sector as resolution increases. More specifically, the well-known negative atmospheric-blocking bias over Europe is effectively removed. High-resolution runs also show improved fidelity in the representation of tropical variability - such as the MJO and its propagation - over the low-resolution simulations. Including stochastic parameterization in the low-resolution runs is shown to improve some aspects of MJO propagation further. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
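Stochastic physics schemes of the kind used here typically multiply the parameterized tendencies by a red-noise factor (as in SPPT, stochastically perturbed parameterization tendencies). A toy illustration of the idea; the AR(1) parameters and the example tendency below are invented, not the model's actual settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def sppt_factor(n_steps, phi=0.95, sigma=0.1):
    """AR(1) red-noise multiplier stream for SPPT-style perturbations:
    r_{t+1} = phi * r_t + sigma * sqrt(1 - phi**2) * eps_t; multiplier = 1 + r."""
    r = np.zeros(n_steps)
    for t in range(1, n_steps):
        r[t] = phi * r[t - 1] + sigma * np.sqrt(1 - phi**2) * rng.standard_normal()
    return 1.0 + r

# Perturb a hypothetical physics tendency of -2 K/day over 100 time steps
tendency = -2.0 * sppt_factor(100)
print(tendency.mean())
```

The multiplier is correlated in time (and, in real schemes, in space), so the perturbations represent coherent uncertainty in the subgrid physics rather than white noise.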
Large-Scale, High-Resolution Neurophysiological Maps Underlying fMRI of Macaque Temporal Lobe
Papanastassiou, Alex M.; DiCarlo, James J.
2013-01-01
Maps obtained by functional magnetic resonance imaging (fMRI) are thought to reflect the underlying spatial layout of neural activity. However, previous studies have not been able to directly compare fMRI maps to high-resolution neurophysiological maps, particularly in higher level visual areas. Here, we used a novel stereo microfocal x-ray system to localize thousands of neural recordings across monkey inferior temporal cortex (IT), construct large-scale maps of neuronal object selectivity at subvoxel resolution, and compare those neurophysiology maps with fMRI maps from the same subjects. While neurophysiology maps contained reliable structure at the sub-millimeter scale, fMRI maps of object selectivity contained information at larger scales (>2.5 mm) and were only partly correlated with raw neurophysiology maps collected in the same subjects. However, spatial smoothing of neurophysiology maps more than doubled that correlation, while a variety of alternative transforms led to no significant improvement. Furthermore, raw spiking signals, once spatially smoothed, were as predictive of fMRI maps as local field potential signals. Thus, fMRI of the inferior temporal lobe reflects a spatially low-passed version of neurophysiology signals. These findings strongly validate the widespread use of fMRI for detecting large (>2.5 mm) neuronal domains of object selectivity but show that a complete understanding of even the most pure domains (e.g., faces vs nonface objects) requires investigation at fine scales that can currently only be obtained with invasive neurophysiological methods. PMID:24048850
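The "spatially low-passed" relationship can be illustrated with synthetic maps: a shared large-scale pattern plus fine-scale structure, where smoothing the fine-scale map markedly raises its correlation with the low-passed one. A toy sketch in which all fields, scales, and amplitudes are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

def smooth2d(a, k):
    """k x k running-mean (box) filter, valid region only."""
    out = np.zeros((a.shape[0] - k + 1, a.shape[1] - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = a[i:i + k, j:j + k].mean()
    return out

n, k = 60, 9
yy, xx = np.mgrid[0:n, 0:n]
large = np.sin(xx / 10.0) * np.cos(yy / 10.0)        # shared large-scale map
neuro = large + 0.8 * rng.standard_normal((n, n))    # plus fine-scale structure
fmri = smooth2d(large, k)                            # "fMRI" sees the smooth part
half = k // 2
r_raw = np.corrcoef(neuro[half:n - half, half:n - half].ravel(), fmri.ravel())[0, 1]
r_smooth = np.corrcoef(smooth2d(neuro, k).ravel(), fmri.ravel())[0, 1]
print(round(r_raw, 2), round(r_smooth, 2))
```

As in the study, smoothing the high-resolution map substantially increases its correlation with the low-passed map, because the fine-scale variance is averaged away while the shared large-scale pattern survives.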
NASA Technical Reports Server (NTRS)
Duvall, T. L., Jr.; Wilcox, J. M.; Svalgaard, L.; Scherrer, P. H.; Mcintosh, P. S.
1977-01-01
Two methods of observing the neutral line of the large-scale photospheric magnetic field are compared: neutral-line positions inferred from H-alpha photographs (McIntosh and Nolte, 1975) and observations of the photospheric magnetic field made with low spatial resolution (three arc minutes) and high sensitivity using the Stanford magnetograph. The comparison is found to be very favorable.
NASA Astrophysics Data System (ADS)
Mesinger, F.
The traditional views hold that high-resolution limited-area models (LAMs) downscale large-scale lateral boundary information, and that the predictability of small scales is short. Inspection of various rms fits/errors has contributed to these views. It would follow that the skill of LAMs should visibly deteriorate compared with that of their driver models at more extended forecast times. The limited-area Eta Model at NCEP has the additional handicap of being driven by lateral boundary conditions (LBCs) from the previous Avn global model run, estimated at 0000 and 1200 UTC to amount to about an 8 h loss in accuracy. This should make its skill relative to the Avn deteriorate even faster. These views are challenged by various Eta results, including rms fits to raobs out to 84 h. It is argued that it is the largest scales that contribute the most to the skill of the Eta relative to that of the Avn.
NASA Astrophysics Data System (ADS)
Ji, X.; Shen, C.
2017-12-01
Flood inundation presents substantial societal hazards and also changes biogeochemistry for systems like the Amazon. It is often expensive to simulate high-resolution flood inundation and propagation in a long-term watershed-scale model. Due to the Courant-Friedrichs-Lewy (CFL) restriction, high resolution and large local flow velocities both demand prohibitively small time steps, even for parallel codes. Here we develop a parallel surface-subsurface process-based model enhanced by multi-resolution meshes that are adaptively switched on or off. The high-resolution overland flow meshes are enabled only when the flood wave invades the floodplains. The model applies a semi-implicit, semi-Lagrangian (SISL) scheme to solve the dynamic wave equations and, with the assistance of the multi-mesh method, adaptively solves the dynamic wave equation only in areas of deep inundation. The model thereby achieves a balance between accuracy and computational cost.
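The CFL restriction mentioned above ties the maximum stable time step of an explicit scheme to mesh size and wave speed, which is why uniformly high resolution is so expensive. A minimal sketch; the Courant number and the mesh/velocity values are illustrative:

```python
def cfl_max_dt(dx, velocity, courant=0.9):
    """Largest stable explicit time step under the CFL condition
    dt <= C * dx / |v|, for mesh size dx (m) and wave speed v (m/s)."""
    return courant * dx / abs(velocity)

# Halving the mesh size at a fixed flow speed halves the allowed time step:
print(cfl_max_dt(dx=10.0, velocity=2.0))  # 4.5 s
print(cfl_max_dt(dx=5.0, velocity=2.0))   # 2.25 s
```

This is the motivation for both the semi-implicit SISL discretization (which relaxes the restriction) and the adaptive meshes (which confine the small dx to where it is needed).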
High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6
Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; ...
2016-11-22
Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean.
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, and the connection with the other CMIP6-endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.
NASA Astrophysics Data System (ADS)
Burney, J. A.; Goldblatt, R.
2016-12-01
Understanding drivers of land use change - and in particular, levels of ecosystem degradation - in semi-arid regions is of critical importance because these agroecosystems (1) are home to the world's poorest populations, almost all of whom depend on agriculture for their livelihoods, (2) play a critical role in the global carbon and climate cycles, and (3) have in many cases seen dramatic changes in temperature and precipitation, relative to global averages, over the past several decades. However, assessing ecosystem health (or, conversely, degradation) presents a difficult measurement problem. Established methods are very labor intensive and rest on detailed questionnaires and field assessments. High-resolution satellite imagery has a unique role in semi-arid ecosystem assessment in that it can be used for rapid (or repeated) and very simple measurements of tree and shrub density, an excellent overall indicator of dryland ecosystem health. Because trees and large shrubs are sparser in semi-arid regions, sub-meter resolution imagery in conjunction with automated image analysis can be used to assess density differences at high spatial resolution without expensive and time-consuming ground-truthing. This could be used down to the farm level, for example, to better assess the larger-scale ecosystem impacts of different management practices, to assess compliance with REDD+ carbon offset protocols, or to evaluate implementation of conservation goals. Here we present results comparing spatial and spectral remote sensing methods for semi-arid ecosystem assessment across new data sources, using the Brazilian Sertão as an example, and discuss the implications for large-scale use in semi-arid ecosystem science.
DEM Based Modeling: Grid or TIN? The Answer Depends
NASA Astrophysics Data System (ADS)
Ogden, F. L.; Moreno, H. A.
2015-12-01
The availability of petascale supercomputing power has enabled process-based hydrological simulations on large watersheds and two-way coupling with mesoscale atmospheric models. With increasing watershed scale, however, come corresponding increases in watershed complexity, including wide-ranging water management infrastructure and objectives, and ever increasing demands for forcing data. Simulations of large watersheds using grid-based models apply a fixed resolution over the entire watershed. In large watersheds, this means an enormous number of grid cells, or coarsening of the grid resolution to reduce memory requirements. One alternative to grid-based methods is the triangular irregular network (TIN) approach. TINs provide the flexibility of variable resolution, which allows optimization of computational resources by providing high resolution where necessary and low resolution elsewhere. TINs, however, increase the required effort in model setup, parameter estimation, and coupling with forcing data, which are often gridded. This presentation discusses the costs and benefits of TINs compared with grid-based methods, in the context of large-watershed simulations within the traditional gridded WRF-HYDRO framework and the new TIN-based ADHydro high-performance computing watershed simulator.
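The memory argument for variable resolution can be made concrete with a back-of-the-envelope cell count; the watershed size and the 2%/98% resolution split below are invented purely for illustration:

```python
def n_grid_cells(area_km2, dx_m):
    """Number of uniform square cells of side dx_m needed to cover a watershed."""
    return int(area_km2 * 1e6 / dx_m**2)

# A 100,000 km^2 watershed at uniform 10 m resolution...
uniform = n_grid_cells(1e5, 10.0)
# ...versus a variable-resolution mesh: 10 m on the 2% of the area near
# channels, 100 m elsewhere (an illustrative split, as a TIN might achieve)
variable = n_grid_cells(0.02 * 1e5, 10.0) + n_grid_cells(0.98 * 1e5, 100.0)
print(uniform, variable, uniform / variable)
```

Even this crude split cuts the cell count by more than an order of magnitude, which is the essential trade a TIN offers against its extra setup and coupling effort.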
Shear-driven dynamo waves at high magnetic Reynolds number.
Tobias, S M; Cattaneo, F
2013-05-23
Astrophysical magnetic fields often display remarkable organization, despite being generated by dynamo action driven by turbulent flows at high conductivity. An example is the eleven-year solar cycle, which shows spatial coherence over the entire solar surface. The difficulty in understanding the emergence of this large-scale organization is that whereas at low conductivity (measured by the magnetic Reynolds number, Rm) dynamo fields are well organized, at high Rm their structure is dominated by rapidly varying small-scale fluctuations. This arises because the smallest scales have the highest rate of strain, and can amplify magnetic field most efficiently. Therefore most of the effort to find flows whose large-scale dynamo properties persist at high Rm has been frustrated. Here we report high-resolution simulations of a dynamo that can generate organized fields at high Rm; indeed, the generation mechanism, which involves the interaction between helical flows and shear, only becomes effective at large Rm. The shear does not enhance generation at large scales, as is commonly thought; instead it reduces generation at small scales. The solution consists of propagating dynamo waves, whose existence was postulated more than 60 years ago and which have since been used to model the solar cycle.
Stochastic Downscaling of Digital Elevation Models
NASA Astrophysics Data System (ADS)
Rasera, Luiz Gustavo; Mariethoz, Gregoire; Lane, Stuart N.
2016-04-01
High-resolution digital elevation models (HR-DEMs) are extremely important for the understanding of small-scale geomorphic processes in Alpine environments. In the last decade, remote sensing techniques have experienced a major technological evolution, enabling fast and precise acquisition of HR-DEMs. However, sensors designed to measure elevation data still feature different spatial resolution and coverage capabilities. Terrestrial altimetry allows the acquisition of HR-DEMs with centimeter to millimeter-level precision, but only within small spatial extents and often with dead ground problems. Conversely, satellite radiometric sensors are able to gather elevation measurements over large areas but with limited spatial resolution. In the present study, we propose an algorithm to downscale low-resolution satellite-based DEMs using topographic patterns extracted from HR-DEMs derived for example from ground-based and airborne altimetry. The method consists of a multiple-point geostatistical simulation technique able to generate high-resolution elevation data from low-resolution digital elevation models (LR-DEMs). Initially, two collocated DEMs with different spatial resolutions serve as an input to construct a database of topographic patterns, which is also used to infer the statistical relationships between the two scales. High-resolution elevation patterns are then retrieved from the database to downscale an LR-DEM through a stochastic simulation process. The outputs of the simulations are multiple equally probable DEMs with higher spatial resolution that also depict the large-scale geomorphic structures present in the original LR-DEM. As these multiple models reflect the uncertainty related to the downscaling, they can be employed to quantify the uncertainty of phenomena that are dependent on fine topography, such as catchment hydrological processes. The proposed methodology is illustrated for a case study in the Swiss Alps.
A swissALTI3D HR-DEM (with 5 m resolution) and a SRTM-derived LR-DEM from the Western Alps are used to downscale a SRTM-based LR-DEM from the eastern part of the Alps. The results show that the method is capable of generating multiple high-resolution synthetic DEMs that reproduce the spatial structure and statistics of the original DEM.
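The core of the approach is a database of coarse-to-fine elevation patterns learned from collocated DEMs, from which fine-scale blocks are sampled stochastically. A deliberately tiny sketch of that idea, in which nearest-value matching on single cells stands in for true multiple-point pattern matching:

```python
import numpy as np

rng = np.random.default_rng(1)

def build_pattern_db(coarse, fine, f):
    """Pair each coarse-grid cell with its f x f block of fine-grid elevations."""
    db = []
    for i in range(coarse.shape[0]):
        for j in range(coarse.shape[1]):
            db.append((coarse[i, j], fine[i * f:(i + 1) * f, j * f:(j + 1) * f]))
    return db

def downscale(lr, db, f):
    """For each LR cell, paste the fine block whose coarse value is closest,
    breaking ties at random -- a toy stand-in for MPS simulation."""
    out = np.zeros((lr.shape[0] * f, lr.shape[1] * f))
    keys = np.array([k for k, _ in db])
    for i in range(lr.shape[0]):
        for j in range(lr.shape[1]):
            d = np.abs(keys - lr[i, j])
            cand = np.flatnonzero(d == d.min())
            out[i * f:(i + 1) * f, j * f:(j + 1) * f] = db[rng.choice(cand)][1]
    return out

f = 3
fine = rng.normal(1000, 50, (9, 9))                   # training HR-DEM
coarse = fine.reshape(3, f, 3, f).mean(axis=(1, 3))   # collocated LR-DEM
dem = downscale(coarse, build_pattern_db(coarse, fine, f), f)
print(dem.shape)
```

Sampling the database repeatedly with different random tie-breaks (or, in the real method, different stochastic simulation paths) yields the ensemble of equally probable HR-DEMs described in the abstract.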
Monitoring Termite-Mediated Ecosystem Processes Using Moderate and High Resolution Satellite Imagery
NASA Astrophysics Data System (ADS)
Lind, B. M.; Hanan, N. P.
2016-12-01
Termites are considered dominant decomposers and prominent ecosystem engineers in the global tropics, and they build some of the largest and architecturally most complex non-human-made structures in the world. Termite mounds significantly alter soil texture, structure, and nutrients, and have major implications for local hydrological dynamics, vegetation characteristics, and biological diversity. An understanding of how these processes change across large scales has been limited by the difficulty of detecting termite mounds at high spatial resolution over wide areas. Our research develops methods to detect large termite mounds in savannas across extensive geographic areas using moderate and high resolution satellite imagery. We also investigate the effect of termite mounds on vegetation productivity using Landsat-8 maximum-composite NDVI data as a proxy for production. Large termite mounds in arid and semi-arid Senegal generate highly reflective 'mound scars' with diameters ranging from 10 m to greater than 30 m. As Sentinel-2 has several bands with 10 m resolution and Landsat-8 has improved calibration, higher radiometric resolution, 15 m spatial resolution (pansharpened), and improved contrast between vegetated and bare surfaces compared with previous Landsat missions, we find that the largest and most influential mounds in the landscape can be detected. Because mounds as small as 4 m in diameter are easily detected in high resolution imagery, we used these data to validate detection results and quantify omission errors for smaller mounds.
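Detection of bare 'mound scars' against vegetated savanna rests on the kind of spectral contrast captured by NDVI. A minimal sketch with invented reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Bare 'mound scar' pixels show low NDVI relative to vegetated surroundings
# (reflectances below are illustrative, not measured values)
vegetated = ndvi(0.45, 0.08)
bare = ndvi(0.30, 0.22)
print(vegetated, bare)
```

A simple threshold on an NDVI composite therefore already separates scar candidates from the vegetated matrix, before any shape or size filtering.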
A High Resolution View of Galactic Centers: Arp 220 and M31
NASA Astrophysics Data System (ADS)
Lockhart, Kelly E.
The centers of galaxies are small in size yet incredibly complex. They play host to supermassive black holes and nuclear star clusters (NSCs) and are subject to large gas inflows, nuclear starbursts, and active galactic nucleus (AGN) activity. They can also be the launching sites for large-scale galactic outflows. Although these systems are important to galactic evolution, observations are difficult because of their small size. Using high spatial resolution narrowband imaging of Arp 220, a late-stage galaxy merger, with HST/WFC3, I discover an ionized gas bubble feature (r = 600 pc) just off the nucleus. The bubble is aligned with both the western nucleus and the large-scale galactic outflow. Using energetics arguments, I link the bubble with a young, obscured AGN or with an intense nuclear starburst. Given its alignment along the large-scale outflow axis, I argue that the bubble presents evidence for a link between the galactic center and the large-scale outflow. I also present new observations of the NSC in M31, the closest large spiral galaxy to our own. Using the OSIRIS near-infrared integral field spectrograph (IFS) on Keck, I map the kinematics of the old stellar population in the eccentric disk of the NSC. I compare the observations to models to derive a precession speed of the disk of 0 ± 5 km s⁻¹ pc⁻¹, and hence confirm that winds from the old stellar population may be the source of gas needed to form the young stellar population in the NSC. Studies of galactic centers depend on high spatial resolution observations. In particular, IFSs are ideal instruments for these studies as they provide two-dimensional spectroscopy of the field of view, enabling 2D kinematic studies. I report on work to characterize and improve the data reduction pipeline of the OSIRIS IFS, and discuss implications for future generations of IFS instrumentation.
NASA Astrophysics Data System (ADS)
Yan, Hui; Wang, K. G.; Jones, Jim E.
2016-06-01
A parallel algorithm for large-scale three-dimensional phase-field simulations of phase coarsening is developed and implemented on high-performance architectures. From the large-scale simulations, a new kinetics of phase coarsening in the region of ultrahigh volume fraction is found. The parallel implementation is capable of harnessing the greater computing power available from high-performance architectures. The parallelized code enables an increase in three-dimensional simulation system size up to a 512³ grid cube. Through the parallelized code, practical runtimes can be achieved for three-dimensional large-scale simulations, and the statistical significance of the results from these high-resolution parallel simulations is greatly improved over that obtainable from serial simulations. A detailed performance analysis of speed-up and scalability is presented, showing good scalability that improves with increasing problem size. In addition, a model for predicting runtime is developed, which shows good agreement with actual run times from numerical tests.
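Speed-up and scalability analyses of this kind are often summarized against an Amdahl-type model, in which a small serial fraction caps the achievable parallel gain. A sketch; the 2% serial fraction is illustrative, not the paper's measured value:

```python
def amdahl_speedup(p_cores, serial_frac):
    """Predicted speed-up on p cores when a fraction s of the runtime is
    inherently serial: S(p) = 1 / (s + (1 - s) / p)."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / p_cores)

# With 2% serial work, speed-up saturates well below the core count:
for p in (8, 64, 512):
    print(p, round(amdahl_speedup(p, 0.02), 1))
```

The observation that scalability improves with problem size is the complementary Gustafson regime: growing the grid shrinks the effective serial fraction per core.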
Measuring large-scale vertical motion in the atmosphere with dropsondes
NASA Astrophysics Data System (ADS)
Bony, Sandrine; Stevens, Bjorn
2017-04-01
Large-scale vertical velocity modulates important processes in the atmosphere, including the formation of clouds, and constitutes a key component of the large-scale forcing of Single-Column Model simulations and Large-Eddy Simulations. Its measurement has also been a long-standing challenge for observationalists. We will show that it is possible to measure the vertical profile of large-scale wind divergence and vertical velocity from aircraft by using dropsondes. This methodology was tested in August 2016 during the NARVAL2 campaign in the lower Atlantic trades. Results will be shown for several research flights, the robustness and uncertainty of the measurements will be assessed, and observational estimates will be compared with data from high-resolution numerical forecasts.
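The dropsonde method exploits Gauss's theorem: winds measured around a closed flight pattern give the area-mean horizontal divergence at each level, and vertical velocity then follows by integrating mass continuity in the vertical. A minimal sketch for one level, with idealized sonde positions and an analytic divergent wind field as a check:

```python
import numpy as np

def area_divergence(x, y, u, v):
    """Area-mean horizontal divergence from sondes on a closed loop
    (Gauss's theorem): D = (1/A) * closed-integral of (u, v) . n dl.
    Vertices (x, y in m; u, v in m/s) are traversed counter-clockwise."""
    x, y, u, v = map(np.asarray, (x, y, u, v))
    xn, yn = np.roll(x, -1), np.roll(y, -1)
    area = 0.5 * np.sum(x * yn - xn * y)  # shoelace formula
    # outward flux through each edge, using midpoint winds
    flux = np.sum(0.5 * (u + np.roll(u, -1)) * (yn - y)
                  - 0.5 * (v + np.roll(v, -1)) * (xn - x))
    return flux / area

# Square of 4 sondes, 100 km on a side, pure divergent flow u = a*x, v = a*y
a = 1e-5  # 1/s
xs = np.array([0.0, 1e5, 1e5, 0.0])
ys = np.array([0.0, 0.0, 1e5, 1e5])
D = area_divergence(xs, ys, a * xs, a * ys)
print(D)  # recovers du/dx + dv/dy = 2e-5 s^-1
```

Repeating this at each pressure level and integrating -D with height (mass continuity) yields the large-scale vertical velocity profile the abstract refers to.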
Large-scale runoff generation - parsimonious parameterisation using high-resolution topography
NASA Astrophysics Data System (ADS)
Gong, L.; Halldin, S.; Xu, C.-Y.
2011-08-01
World water resources have primarily been analysed by global-scale hydrological models in recent decades. Runoff generation in many of these models is based on process formulations developed at catchment scales. The division between slow runoff (baseflow) and fast runoff is primarily governed by slope and the spatial distribution of effective water storage capacity, both acting at very small scales. Many hydrological models, e.g. VIC, account for the spatial storage variability in terms of statistical distributions; such models generally perform well. The statistical approaches, however, use the same runoff-generation parameters everywhere in a basin. The TOPMODEL concept, on the other hand, links the effective maximum storage capacity with real-world topography. The recent availability of global high-quality, high-resolution topographic data makes TOPMODEL attractive as a basis for a physically based runoff-generation algorithm at large scales, even if its assumptions are not valid in flat terrain or for deep groundwater systems. We present a new runoff-generation algorithm for large-scale hydrology based on TOPMODEL concepts intended to overcome these problems. The TRG (topography-derived runoff generation) algorithm relaxes the TOPMODEL equilibrium assumption so that baseflow generation is not tied to topography. TRG only uses the topographic index to distribute average storage to each topographic index class. The maximum storage capacity is proportional to the range of the topographic index and is scaled by one parameter. The distribution of storage capacity within large-scale grid cells is obtained numerically through topographic analysis. The new topography-derived distribution function is then inserted into a runoff-generation framework similar to VIC's. Different basin parts are parameterised by different storage capacities, and different shapes of the storage-distribution curves depend on their topographic characteristics.
The TRG algorithm is driven by the HydroSHEDS dataset with a resolution of 3" (around 90 m at the equator). The TRG algorithm was validated against the VIC algorithm in a common model framework in 3 river basins in different climates. The TRG algorithm performed equally well or marginally better than the VIC algorithm with one less parameter to be calibrated. The TRG algorithm also lacked equifinality problems and offered a realistic spatial pattern for runoff generation and evaporation.
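The topographic index that TRG distributes storage over is the standard TOPMODEL wetness index ln(a / tan β). A minimal sketch with invented cell values:

```python
import numpy as np

def topographic_index(upslope_area, slope_rad):
    """TOPMODEL topographic wetness index ln(a / tan(beta)), where a is the
    specific upslope contributing area (m) and beta the local slope angle."""
    return np.log(upslope_area / np.tan(slope_rad))

# Flat valley-bottom cell (large area, gentle slope) vs steep hillslope cell;
# both cells are hypothetical examples
valley = topographic_index(5000.0, np.radians(1.0))
ridge = topographic_index(50.0, np.radians(20.0))
print(valley, ridge)  # valley cells get the higher (wetter) index
```

Computing this index for every ~90 m HydroSHEDS cell and binning it into classes gives the within-grid-cell storage distribution that replaces VIC's purely statistical curve.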
Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling
USDA-ARS's Scientific Manuscript database
We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...
High resolution modeling of reservoir storage and extent dynamics at the continental scale
NASA Astrophysics Data System (ADS)
Shin, S.; Pokhrel, Y. N.
2017-12-01
Over the past decade, significant progress has been made in developing reservoir schemes in large-scale hydrological models to better simulate hydrological fluxes and storages in highly managed river basins. These schemes have been successfully used to study the impact of reservoir operation on global river basins. However, improvements in the existing schemes are needed for hydrological fluxes and storages, especially at the spatial resolutions to be used in hyper-resolution hydrological modeling. In this study, we developed a reservoir routing scheme with explicit representation of reservoir storage and extent at a grid scale of 5 km or less. Instead of setting the reservoir area to a fixed value or diagnosing it from the area-storage equation, which is the commonly used approach in existing reservoir schemes, we explicitly simulate the inundated storage and area for all grid cells that are within the reservoir extent. This approach enables a better simulation of river-floodplain-reservoir storage by considering both natural flooding and man-made reservoir storage. Results for the seasonal dynamics of reservoir storage, river discharge downstream of dams, and reservoir inundation extent are evaluated with various datasets from ground observations and satellite measurements. The new model captures the dynamics of these variables with good accuracy for most of the large reservoirs in the western United States. It is expected that the incorporation of the newly developed reservoir scheme in large-scale land surface models (LSMs) will lead to improved simulation of river flow and terrestrial water storage in highly managed river basins.
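A reservoir routing scheme of this kind is built on a level-pool mass balance per time step. A heavily simplified single-reservoir sketch; the storage, capacity, and flow numbers are invented, and the real scheme additionally distributes storage over the grid cells within the reservoir extent:

```python
def level_pool_route(storage, inflow, release, dt, capacity):
    """One explicit step of level-pool mass balance for a reservoir:
    S_{t+1} = clip(S_t + (I - O) * dt, 0, S_max); any excess becomes spill.
    Units: storage/capacity in m^3, flows in m^3/s, dt in s."""
    s = storage + (inflow - release) * dt
    spill = max(0.0, s - capacity)
    return min(max(s, 0.0), capacity), spill

# One day with inflow 500 m^3/s, release 300 m^3/s, on a near-full reservoir
s, spill = level_pool_route(storage=1.0e9, inflow=500.0, release=300.0,
                            dt=86400.0, capacity=1.01e9)
print(s, spill)
```

Mapping the updated storage back to an inundated area per grid cell, rather than through a single area-storage curve, is the step that distinguishes the explicit-extent approach described in the abstract.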
DOE Office of Scientific and Technical Information (OSTI.GOV)
Na, Ji Sung; Koo, Eunmo; Munoz-Esparza, Domingo
High-resolution large-eddy simulation of the flow over a large wind farm (64 wind turbines) is performed using the HIGRAD/FIRETEC-WindBlade model, which is a high-performance computing wind turbine–atmosphere interaction model that uses the Lagrangian actuator line method to represent rotating turbine blades. These high-resolution large-eddy simulation results are used to parameterize the thrust and power coefficients that contain information about turbine interference effects within the wind farm. Those coefficients are then incorporated into the WRF (Weather Research and Forecasting) model in order to evaluate interference effects in larger-scale models. In the high-resolution WindBlade wind farm simulation, insufficient distance between turbines creates interference between turbines, including significant vertical variations in momentum and turbulent intensity. The characteristics of the wake are further investigated by analyzing the distribution of vorticity and turbulent intensity. Quadrant analysis in the turbine and post-turbine areas reveals that the ejection motion induced by the presence of the wind turbines is dominant compared with that in the other quadrants, indicating that the sweep motion is increased at the location where strong wake recovery occurs. Regional-scale WRF simulations reveal that although the turbulent mixing induced by the wind farm is partly diffused to the upper region, there is no significant change in the boundary-layer depth. The velocity deficit does not appear to be very sensitive to the local distribution of turbine coefficients. However, differences of about 5% in parameterized turbulent kinetic energy were found depending on the turbine coefficient distribution. Furthermore, turbine coefficients that consider interference in the wind farm should be used in wind-farm parameterization for larger-scale models to better describe sub-grid-scale turbulent processes.
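The thrust coefficients being parameterized enter the larger-scale model's momentum sink through the standard rotor thrust relation T = ½ ρ C_T A U². A minimal sketch; the turbine diameter, wind speeds, and coefficient values are illustrative, not taken from the simulations:

```python
import math

def rotor_thrust(ct, u, diameter, rho=1.225):
    """Rotor thrust T = 0.5 * rho * Ct * A * U^2 (N) for hub-height wind
    speed u (m/s), rotor diameter (m), and air density rho (kg/m^3)."""
    area = math.pi * (diameter / 2.0) ** 2
    return 0.5 * rho * ct * area * u ** 2

# A waked turbine sees both a modified Ct and a lower hub-height wind
# than the front row (interference effect; numbers are hypothetical)
front = rotor_thrust(ct=0.80, u=10.0, diameter=100.0)
waked = rotor_thrust(ct=0.75, u=8.0, diameter=100.0)
print(front / 1e3, waked / 1e3)  # kN
```

Because T scales with U², even modest waking changes the momentum extracted per turbine substantially, which is why coefficients that encode interference matter for the WRF wind-farm parameterization.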
Feng, Sha; Vogelmann, Andrew M.; Li, Zhijin; ...
2015-01-20
Fine-resolution three-dimensional fields have been produced using the Community Gridpoint Statistical Interpolation (GSI) data assimilation system for the U.S. Department of Energy's Atmospheric Radiation Measurement Program (ARM) Southern Great Plains region. The GSI system is implemented in a multi-scale data assimilation framework using the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. From the fine-resolution three-dimensional fields, large-scale forcing is derived explicitly at grid-scale resolution; a subgrid-scale dynamic component is derived separately, representing subgrid-scale horizontal dynamic processes. Analyses show that the subgrid-scale dynamic component is often a major component of the large-scale forcing for grid scales larger than 200 km. The single-column model (SCM) of the Community Atmospheric Model version 5 (CAM5) is used to examine the impact of the grid-scale and subgrid-scale dynamic components on simulated precipitation and cloud fields associated with a mesoscale convective system. It is found that grid-scale size impacts simulated precipitation, resulting in an overestimation for grid scales of about 200 km but an underestimation for smaller grids. The subgrid-scale dynamic component has an appreciable impact on the simulations, suggesting that grid-scale and subgrid-scale dynamic components should be considered in the interpretation of SCM simulations.
Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus
2018-01-01
Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.
The Use of Scale-Dependent Precision to Increase Forecast Accuracy in Earth System Modelling
NASA Astrophysics Data System (ADS)
Thornes, Tobias; Duben, Peter; Palmer, Tim
2016-04-01
At the current pace of development, it may be decades before the 'exa-scale' computers needed to resolve individual convective clouds in weather and climate models become available to forecasters, and such machines will incur very high power demands. But the resolution could be improved today by switching to more efficient, 'inexact' hardware with which variables can be represented in 'reduced precision'. Currently, all numbers in our models are represented as double-precision floating points - each requiring 64 bits of memory - to minimise rounding errors, regardless of spatial scale. Yet observational and modelling constraints mean that values of atmospheric variables are inevitably known less precisely on smaller scales, suggesting that this may be a waste of computer resources. More accurate forecasts might therefore be obtained by taking a scale-selective approach whereby the precision of variables is gradually decreased at smaller spatial scales to optimise the overall efficiency of the model. To study the effect of reducing precision to different levels on multiple spatial scales, we here introduce a new model atmosphere developed by extending the Lorenz '96 idealised system to encompass three tiers of variables - which represent large-, medium- and small-scale features - for the first time. In this chaotic but computationally tractable system, the 'true' state can be defined by explicitly resolving all three tiers. The abilities of low resolution (single-tier) double-precision models and similar-cost high resolution (two-tier) models in mixed-precision to produce accurate forecasts of this 'truth' are compared. The high resolution models outperform the low resolution ones even when small-scale variables are resolved in half-precision (16 bits). This suggests that using scale-dependent levels of precision in more complicated real-world Earth System models could allow forecasts to be made at higher resolution and with improved accuracy. 
If adopted, this new paradigm would represent a revolution in numerical modelling that could be of great benefit to the world.
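The core experiment can be sketched with the classic one-tier Lorenz '96 system rather than the paper's three-tier extension: the same integration is run once in double precision and once in half precision, illustrating how reduced-precision arithmetic perturbs a chaotic forecast. The time step, forcing, and step count below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def l96_tendency(x, F=8.0):
    # One-tier Lorenz '96: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(x0, dt=0.005, nsteps=200, dtype=np.float64):
    # Forward-Euler integration carried out entirely in `dtype`;
    # dtype=np.float16 mimics a half-precision ('reduced precision') run.
    x = x0.astype(dtype)
    for _ in range(nsteps):
        x = (x + dtype(dt) * l96_tendency(x)).astype(dtype)
    return x.astype(np.float64)
```

Running both precisions from the same initial state and comparing forecast error against an explicitly resolved "truth" is the kind of controlled comparison the abstract describes, here in a deliberately minimal form.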
NASA Astrophysics Data System (ADS)
Gao, Yang; Leung, L. Ruby; Zhao, Chun; Hagos, Samson
2017-03-01
Simulating summer precipitation is a significant challenge for climate models that rely on cumulus parameterizations to represent moist convection processes. Motivated by recent advances in computing that support very high-resolution modeling, this study aims to systematically evaluate the effects of model resolution and convective parameterizations across the gray zone resolutions. Simulations using the Weather Research and Forecasting model were conducted at grid spacings of 36 km, 12 km, and 4 km for two summers over the conterminous U.S. The convection-permitting simulations at 4 km grid spacing are most skillful in reproducing the observed precipitation spatial distributions and diurnal variability. Notable differences are found between simulations with the traditional Kain-Fritsch (KF) and the scale-aware Grell-Freitas (GF) convection schemes, with the latter more skillful in capturing the nocturnal timing in the Great Plains and North American monsoon regions. The GF scheme also simulates a smoother transition from convective to large-scale precipitation as resolution increases, resulting in reduced sensitivity to model resolution compared to the KF scheme. Nonhydrostatic dynamics has a positive impact on precipitation over complex terrain even at 12 km and 36 km grid spacings. With nudging of the winds toward observations, we show that the conspicuous warm biases in the Southern Great Plains are related to precipitation biases induced by large-scale circulation biases, which are insensitive to model resolution. Overall, notable improvements in simulating summer rainfall and its diurnal variability through convection-permitting modeling and scale-aware parameterizations suggest promising venues for improving climate simulations of water cycle processes.
Guitet, Stéphane; Hérault, Bruno; Molto, Quentin; Brunaux, Olivier; Couteron, Pierre
2015-01-01
Precise mapping of above-ground biomass (AGB) is a major challenge for the success of REDD+ processes in tropical rainforest. The usual mapping methods are based on two hypotheses: a large and long-ranged spatial autocorrelation and a strong environmental influence at the regional scale. However, there are no studies of the spatial structure of AGB at the landscape scale to support these assumptions. We studied spatial variation in AGB at various scales using two large forest inventories conducted in French Guiana. The dataset comprised 2507 plots (0.4 to 0.5 ha) of undisturbed rainforest distributed over the whole region. After checking the uncertainties of estimates obtained from these data, we used half of the dataset to develop explicit predictive models including spatial and environmental effects, and tested the accuracy of the resulting maps at several resolutions using the rest of the data. Forest inventories provided accurate AGB estimates at the plot scale, with a mean of 325 Mg.ha-1. They revealed high local variability combined with weak autocorrelation up to distances of no more than 10 km. Environmental variables accounted for a minor part of the spatial variation. The accuracy of the best model including spatial effects was 90 Mg.ha-1 at the plot scale, but coarse graining up to 2-km resolution reduced the error to below 50 Mg.ha-1. No agreement was found with available pan-tropical reference maps at any resolution. We concluded that the combination of weak autocorrelation and weak environmental effects limits the accuracy of AGB maps in rainforest, and that a trade-off has to be found between spatial resolution and effective accuracy until adequate “wall-to-wall” remote sensing signals provide reliable AGB predictions. In the meantime, using large forest inventories with a low sampling rate (<0.5%) may be an efficient way to increase the global coverage of AGB maps with acceptable accuracy at kilometric resolution. PMID:26402522
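The coarse-graining step, aggregating a high-resolution AGB map onto a coarser grid by block averaging, can be sketched as below. This is a generic block average for illustration, not the authors' procedure:

```python
import numpy as np

def coarse_grain(field, k):
    # Block-average a 2-D field over k-by-k windows, discarding any
    # trailing rows/columns that do not fill a complete block.
    n0 = (field.shape[0] // k) * k
    n1 = (field.shape[1] // k) * k
    f = field[:n0, :n1].astype(float)
    return f.reshape(n0 // k, k, n1 // k, k).mean(axis=(1, 3))
```

Averaging over larger blocks suppresses the short-range plot-to-plot variability, which is why the map error drops as the resolution is coarsened toward 2 km.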
Beyond RGB: Very high resolution urban remote sensing with multimodal deep networks
NASA Astrophysics Data System (ADS)
Audebert, Nicolas; Le Saux, Bertrand; Lefèvre, Sébastien
2018-06-01
In this work, we investigate various methods for the semantic labeling of very high resolution multi-modal remote sensing data. In particular, we study how deep fully convolutional networks can be adapted to deal with multi-modal and multi-scale remote sensing data for semantic labeling. Our contributions are threefold: (a) we present an efficient multi-scale approach to leverage both a large spatial context and the high resolution data, (b) we investigate early and late fusion of Lidar and multispectral data, and (c) we validate our methods on two public datasets with state-of-the-art results. Our results indicate that late fusion makes it possible to recover errors stemming from ambiguous data, while early fusion allows for better joint feature learning, but at the cost of higher sensitivity to missing data.
Intermediate-scale plasma irregularities in the polar ionosphere inferred from GPS radio occultation
NASA Astrophysics Data System (ADS)
Shume, E. B.; Komjathy, A.; Langley, R. B.; Verkhoglyadova, O.; Butala, M. D.; Mannucci, A. J.
2015-02-01
We report intermediate-scale plasma irregularities in the polar ionosphere inferred from high-resolution radio occultation (RO) measurements using GPS (Global Positioning System) to CASSIOPE (CAScade Smallsat and IOnospheric Polar Explorer) satellite radio links. The high inclination of CASSIOPE and the high rate of signal reception by the GPS Attitude, Positioning, and Profiling RO receiver on CASSIOPE enable a high-resolution investigation of the dynamics of the polar ionosphere with unprecedented detail. Intermediate-scale, scintillation-producing irregularities, which correspond to 1 to 40 km scales, were inferred by applying multiscale spectral analysis on the RO phase measurements. Using our multiscale spectral analysis approach and satellite data (Polar Operational Environmental Satellites and Defense Meteorological Satellite Program), we discovered that the irregularity scales and phase scintillations have distinct features in the auroral oval and polar cap. We found that large length scales and more intense phase scintillations are prevalent in the auroral oval compared to the polar cap implying that the irregularity scales and phase scintillation characteristics are a function of the solar wind and magnetospheric forcings.
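The mapping from along-track spatial scales to spectral bands of the RO phase measurements can be sketched as follows. This is a generic Welch-periodogram band-power estimate, not the authors' multiscale method; the relative scan speed and sampling rate are assumed illustrative values:

```python
import numpy as np
from scipy.signal import welch

def band_power(phase, v_rel=7.5e3, fs=50.0, scales=(1e3, 40e3)):
    # Phase-fluctuation power within a band of along-track spatial
    # scales (metres).  Assuming the occultation tangent point sweeps
    # the ionosphere at v_rel (m/s) while the receiver samples at fs
    # (Hz), a spatial scale L maps to the temporal frequency f = v_rel/L.
    f, pxx = welch(phase - np.mean(phase), fs=fs, nperseg=256)
    f_lo, f_hi = v_rel / scales[1], v_rel / scales[0]
    mask = (f >= f_lo) & (f <= f_hi)
    return pxx[mask].sum() * (f[1] - f[0])
```

Comparing band power between auroral-oval and polar-cap segments of the occultation track is one simple way to contrast irregularity intensity between the two regions.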
NASA Technical Reports Server (NTRS)
Mankbadi, M. R.; Georgiadis, N. J.; DeBonis, J. R.
2015-01-01
The objective of this work is to compare a high-order solver with a low-order solver for performing large-eddy simulations (LES) of a compressible mixing layer. The high-order method is the Wave-Resolving LES (WRLES) solver employing a Dispersion Relation Preserving (DRP) scheme. The low-order solver is the Wind-US code, which employs the second-order Roe Physical scheme. Both solvers are used to perform LES of the turbulent mixing between two supersonic streams at a convective Mach number of 0.46. The high-order and low-order methods are evaluated at two different levels of grid resolution. For a fine grid resolution, the low-order method produces a very similar solution to the high-order method. At this fine resolution the effects of numerical scheme, subgrid scale modeling, and filtering were found to be negligible. Both methods predict turbulent stresses that are in reasonable agreement with experimental data. However, when the grid resolution is coarsened, the difference between the two solvers becomes apparent. The low-order method deviates from experimental results when the resolution is no longer adequate. The high-order DRP solution shows minimal grid dependence. The effects of subgrid scale modeling and spatial filtering were found to be negligible at both resolutions. For the high-order solver on the fine mesh, a parametric study of the spanwise width was conducted to determine its effect on solution accuracy. An insufficient spanwise width was found to impose an artificial spanwise mode and limit the resolved spanwise modes. We estimate that the spanwise depth needs to be 2.5 times larger than the largest coherent structures to capture the largest spanwise mode and accurately predict turbulent mixing.
NASA Astrophysics Data System (ADS)
Macander, M. J.; Frost, G. V., Jr.
2015-12-01
Regional-scale mapping of vegetation and other ecosystem properties has traditionally relied on medium-resolution remote sensing such as Landsat (30 m) and MODIS (250 m). Yet the burgeoning availability of high-resolution (<=2 m) imagery and ongoing advances in computing power and analysis tools raise the prospect of performing ecosystem mapping at fine spatial scales over large study domains. Here we demonstrate cutting-edge mapping approaches over a ~35,000 km² study area on Alaska's North Slope using calibrated and atmospherically corrected mosaics of high-resolution WorldView-2 and GeoEye-1 imagery: (1) an a priori spectral approach incorporating the Satellite Imagery Automatic Mapper (SIAM) algorithms; (2) image segmentation techniques; and (3) texture metrics. The SIAM spectral approach classifies radiometrically calibrated imagery into general vegetation density categories and non-vegetated classes. The SIAM classes were developed globally, and their applicability in arctic tundra environments has not been previously evaluated. Image segmentation, or object-based image analysis, automatically partitions high-resolution imagery into homogeneous image regions that can then be analyzed based on spectral, textural, and contextual information. We applied eCognition software to delineate waterbodies and vegetation classes, in combination with other techniques. Texture metrics were evaluated to determine the feasibility of using high-resolution imagery to algorithmically characterize periglacial surface forms (e.g., ice-wedge polygons), which are an important physical characteristic of permafrost-dominated regions but cannot be distinguished by medium-resolution remote sensing. These advanced techniques yield products that can provide essential information supporting a broad range of ecosystem science and land-use planning applications in northern Alaska and elsewhere in the circumpolar Arctic.
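A simple texture metric of the kind evaluated for periglacial surface forms is a moving-window standard deviation, which highlights fine-scale relief invisible at medium resolution. The sketch below is generic (the window size is an assumption), not the metrics actually used in the study:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_std(img, size=5):
    # Moving-window standard deviation: Var = E[x^2] - E[x]^2 computed
    # with two box filters; high values flag textured surfaces such as
    # ice-wedge polygon networks, low values flag smooth terrain.
    mean = uniform_filter(img.astype(float), size)
    mean_sq = uniform_filter(img.astype(float) ** 2, size)
    return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
```

Thresholding such a texture layer, possibly in combination with segmentation objects, is one plausible route to delineating polygonal ground in sub-metre imagery.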
NASA Astrophysics Data System (ADS)
Shume, E. B.; Komjathy, A.; Langley, R. B.; Verkhoglyadova, O. P.; Butala, M.; Mannucci, A. J.
2014-12-01
In this research, we report intermediate-scale plasma density irregularities in the high-latitude ionosphere inferred from high-resolution radio occultation (RO) measurements in the CASSIOPE (CAScade Smallsat and IOnospheric Polar Explorer)-GPS (Global Positioning System) satellite radio link. The high inclination of the CASSIOPE satellite and the high rate of signal reception by the occultation antenna of the GPS Attitude, Positioning and Profiling (GAP) instrument on the Enhanced Polar Outflow Probe platform on CASSIOPE enable a high temporal and spatial resolution investigation of the dynamics of the polar ionosphere, magnetosphere-ionosphere coupling, and solar wind effects with unprecedented detail. We have carried out a high spatial resolution analysis, in altitude and geomagnetic latitude, of scintillation-producing plasma density irregularities in the polar ionosphere. Intermediate-scale, scintillation-producing plasma density irregularities, corresponding to 2 to 40 km spatial scales, were inferred by applying multi-scale spectral analysis to the RO phase delay measurements. Using our multi-scale spectral analysis approach and Polar Operational Environmental Satellites (POES) and Defense Meteorological Satellite Program (DMSP) observations, we infer that the irregularity scales and phase scintillations have distinct features in the auroral oval and polar cap regions. Specifically, we found that larger length scales and more intense phase scintillations are prevalent in the auroral oval compared to the polar cap region. Hence, the irregularity scales and phase scintillation characteristics are a function of the solar wind and magnetospheric forcing. Multi-scale analysis may become a powerful diagnostic tool for characterizing how the ionosphere is dynamically driven by these factors.
Leempoel, Kevin; Parisod, Christian; Geiser, Céline; Joost, Stéphane
2018-02-01
Plant species are known to adapt locally to their environment, particularly in mountainous areas where conditions can vary drastically over short distances. The climate of such landscapes being largely influenced by topography, using fine-scale models to evaluate environmental heterogeneity may help detecting adaptation to micro-habitats. Here, we applied a multiscale landscape genomic approach to detect evidence of local adaptation in the alpine plant Biscutella laevigata . The two gene pools identified, experiencing limited gene flow along a 1-km ridge, were different in regard to several habitat features derived from a very high resolution (VHR) digital elevation model (DEM). A correlative approach detected signatures of selection along environmental gradients such as altitude, wind exposure, and solar radiation, indicating adaptive pressures likely driven by fine-scale topography. Using a large panel of DEM-derived variables as ecologically relevant proxies, our results highlighted the critical role of spatial resolution. These high-resolution multiscale variables indeed indicate that the robustness of associations between genetic loci and environmental features depends on spatial parameters that are poorly documented. We argue that the scale issue is critical in landscape genomics and that multiscale ecological variables are key to improve our understanding of local adaptation in highly heterogeneous landscapes.
Kooperman, Gabriel J.; Pritchard, Michael S.; O'Brien, Travis A.; ...
2018-04-01
Deficiencies in the parameterizations of convection used in global climate models often lead to a distorted representation of the simulated rainfall intensity distribution (i.e., too much rainfall from weak rain rates). While encouraging improvements in high-percentile rainfall intensity have been found as the horizontal resolution of the Community Atmosphere Model is increased to ~25 km, we demonstrate no corresponding improvement in the moderate rain rates that generate the majority of accumulated rainfall. Using a statistical framework designed to emphasize links between precipitation intensity and accumulated rainfall beyond just the frequency distribution, we show that CAM cannot realistically simulate moderate rain rates, and cannot capture their intensification with climate change, even as resolution is increased. However, by separating the parameterized convective and large-scale resolved contributions to total rainfall, we find that the intensity, geographic pattern, and climate change response of CAM's large-scale rain rates are more consistent with observations (TRMM 3B42), superparameterization, and theoretical expectations, despite issues with parameterized convection. Increasing CAM's horizontal resolution does improve the representation of total rainfall intensity, but not due to changes in the intensity of large-scale rain rates, which are surprisingly insensitive to horizontal resolution. Rather, improvements occur through an increase in the relative contribution of the large-scale component to the total amount of accumulated rainfall. Analysis of sensitivities to convective timescale and entrainment rate confirms the importance of these parameters in the possible development of scale-aware parameterizations, but also reveals unrecognized trade-offs from the entanglement of precipitation frequency and total amount.
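The "amount" statistic that this framework emphasizes, i.e. the share of accumulated rainfall contributed by each rain-rate bin as distinct from how often each rate occurs, can be sketched in a few lines. This is a hypothetical minimal implementation, not the authors' diagnostic code:

```python
import numpy as np

def rain_amount_distribution(rates, bins):
    # Rain-amount distribution: the fraction of total accumulated
    # rainfall contributed by each rain-rate bin (weights=rates),
    # as opposed to the frequency distribution (weights=None).
    amount, _ = np.histogram(rates, bins=bins, weights=rates)
    return amount / rates.sum()
```

A model can match the frequency distribution while still misplacing where the rain *amount* comes from, which is exactly the distinction the abstract draws for moderate rain rates.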
NASA Technical Reports Server (NTRS)
Stoller, Ray A.; Wedding, Donald K.; Friedman, Peter S.
1993-01-01
A development status evaluation is presented for gas plasma display technology, noting how tradeoffs among the parameters of size, resolution, speed, portability, color, and image quality can yield cost-effective solutions for medical imaging, CAD, teleconferencing, multimedia, and both civil and military applications. Attention is given to plasma-based large-area displays' suitability for radar, sonar, and IR, due to their lack of EM susceptibility. Both monochrome and color displays are available.
High Resolution IRAS Maps and IR Emission of M31 -- II. Diffuse Component and Interstellar Dust
NASA Technical Reports Server (NTRS)
Xu, C.; Helou, G.
1995-01-01
Large-scale dust heating and cooling in the diffuse medium of M31 is studied using the high resolution (HiRes) IRAS maps in conjunction with UV, optical (UBV), and HI maps. A dust heating/cooling model is developed based on a radiative transfer model which assumes a 'Sandwich' configuration of dust and stars and takes account of the effect of dust grain scattering.
NASA Technical Reports Server (NTRS)
Engquist, B. E. (Editor); Osher, S. (Editor); Somerville, R. C. J. (Editor)
1985-01-01
Papers are presented on such topics as the use of semi-Lagrangian advective schemes in meteorological modeling; computation with high-resolution upwind schemes for hyperbolic equations; dynamics of flame propagation in a turbulent field; a modified finite element method for solving the incompressible Navier-Stokes equations; computational fusion magnetohydrodynamics; and a nonoscillatory shock capturing scheme using flux-limited dissipation. Consideration is also given to the use of spectral techniques in numerical weather prediction; numerical methods for the incorporation of mountains in atmospheric models; techniques for the numerical simulation of large-scale eddies in geophysical fluid dynamics; high-resolution TVD schemes using flux limiters; upwind-difference methods for aerodynamic problems governed by the Euler equations; and an MHD model of the earth's magnetosphere.
Remote sensing in support of high-resolution terrestrial carbon monitoring and modeling
NASA Astrophysics Data System (ADS)
Hurtt, G. C.; Zhao, M.; Dubayah, R.; Huang, C.; Swatantran, A.; ONeil-Dunne, J.; Johnson, K. D.; Birdsey, R.; Fisk, J.; Flanagan, S.; Sahajpal, R.; Huang, W.; Tang, H.; Armstrong, A. H.
2014-12-01
As part of its Phase 1 Carbon Monitoring System (CMS) activities, NASA initiated a Local-Scale Biomass Pilot study. The goals of the pilot study were to develop protocols for fusing high-resolution remotely sensed observations with field data, provide accurate validation test areas for the continental-scale biomass product, and demonstrate efficacy for prognostic terrestrial ecosystem modeling. In Phase 2, this effort was expanded to the state scale. Here, we present results of this activity focusing on the use of remote sensing in high-resolution ecosystem modeling. The Ecosystem Demography (ED) model was implemented at 90 m spatial resolution for the entire state of Maryland. We rasterized soil depth and soil texture data from SSURGO. For hourly meteorological data, we spatially interpolated 32-km 3-hourly NARR into 1-km hourly and further corrected them at monthly level using PRISM data. NLCD data were used to mask sand, seashore, and wetland. High-resolution 1 m forest/non-forest mapping was used to define forest fraction of 90 m cells. Three alternative strategies were evaluated for initialization of forest structure using high-resolution lidar, and the model was used to calculate statewide estimates of forest biomass, carbon sequestration potential, time to reach sequestration potential, and sensitivity to future forest growth and disturbance rates, all at 90 m resolution. To our knowledge, no dynamic ecosystem model has been run at such high spatial resolution over such large areas utilizing remote sensing and validated as extensively. There are over 3 million 90 m land cells in Maryland, greater than 43 times the ~73,000 half-degree cells in a state-of-the-art global land model.
Multiscale/multiresolution landslides susceptibility mapping
NASA Astrophysics Data System (ADS)
Grozavu, Adrian; Cătălin Stanga, Iulian; Valeriu Patriche, Cristian; Toader Juravle, Doru
2014-05-01
Within the European strategies, landslides are considered an important threat that requires detailed studies to identify areas where these processes could occur in the future and to design scientific and technical plans for landslide risk mitigation. To this end, assessing and mapping landslide susceptibility is an important preliminary step. Generally, landslide susceptibility at small scale (for large regions) can be assessed through a qualitative approach (expert judgement) based on a few variables, while studies at medium and large scales require a quantitative approach (e.g. multivariate statistics), a larger set of variables and, necessarily, a landslide inventory. Obviously, the results vary more or less from one scale to another, depending on the available input data but also on the applied methodology. Since it is almost impossible to have a complete landslide inventory for large regions (e.g. at the continental level), it is very important to verify the compatibility and validity of results obtained at different scales, identifying the differences and fixing the inherent errors. This paper aims at assessing and mapping landslide susceptibility at the regional level through a multiscale-multiresolution approach, from small scale and low resolution to large scale and high resolution of data and results, comparing the compatibility of the results. While the former could be used for studies at the European and national levels, the latter allow validation of the results, including through field surveys. The test area, namely the Barlad Plateau (more than 9000 sq.km), is located in Eastern Romania, covering a region where both the natural environment and the human factor create a causal context that favors these processes. 
The landslide predictors were initially derived from various databases available at the pan-European level and progressively completed and/or enhanced along with the scale and resolution: topography (from SRTM at 90 meters to digital elevation models based on topographical maps, 1:25,000 and 1:5,000), lithology (from geological maps, 1:200,000), land cover and land use (from CLC 2006 to maps derived from orthorectified aerial images, 0.5 m resolution), rainfall (from Worldclim and ECAD to our own data), and seismicity (the seismic zonation of Romania). The landslide inventory was created as polygonal data based on aerial images (0.5 m resolution), the information being considered at the county level (NUTS 3) and, eventually, at the communal level (LAU2). The methodological framework is based on logistic regression as a quantitative method and the analytic hierarchy process as a semi-qualitative method, both being applied once identically for all scales and once recalibrated for each scale and resolution (from 1:1,000,000 at one-km pixel resolution to 1:25,000 at ten-meter resolution). The predictive performance of the two models was assessed using the ROC (Receiver Operating Characteristic) curve and the AUC (Area Under Curve) parameter, and the results indicate a good correspondence between the susceptibility estimated for the test samples (0.855-0.890) and for the validation samples (0.830-0.865). Finally, the results were compared in pairs in order to fix the errors at small scale and low resolution and to optimize the methodology for landslide susceptibility mapping over large areas.
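A toy version of the quantitative branch (logistic regression validated with ROC AUC) might look like the following. The predictor table, generating coefficients, and sample sizes are entirely synthetic assumptions, shown only to illustrate the workflow, not the study's data or model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the predictor table: one row per grid cell.
# Slope and rainfall are hypothetical predictors; the coefficients of
# the generating model below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
n = 2000
slope = rng.uniform(0.0, 35.0, n)        # degrees
rain = rng.uniform(400.0, 900.0, n)      # mm/yr
logit = 0.15 * slope + 0.004 * rain - 6.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # landslide presence

X = np.column_stack([slope, rain])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Susceptibility = predicted probability; predictive performance is
# assessed with the ROC AUC on the held-out validation sample.
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
```

The test/validation split mirrors the paper's comparison of AUC on test samples versus independent validation samples.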
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Khee-Gan; Hennawi, Joseph F.; Eilers, Anna-Christina
2014-11-01
We present the first observations of foreground Lyα forest absorption from high-redshift galaxies, targeting 24 star-forming galaxies (SFGs) with z ∼ 2.3-2.8 within a 5' × 14' region of the COSMOS field. The transverse sightline separation is ∼2 h⁻¹ Mpc comoving, allowing us to create a tomographic reconstruction of the three-dimensional (3D) Lyα forest absorption field over the redshift range 2.20 ≤ z ≤ 2.45. The resulting map covers 6 h⁻¹ Mpc × 14 h⁻¹ Mpc in the transverse plane and 230 h⁻¹ Mpc along the line of sight with a spatial resolution of ≈3.5 h⁻¹ Mpc, and is the first high-fidelity map of large-scale structure on ∼Mpc scales at z > 2. Our map reveals significant structures with ≳10 h⁻¹ Mpc extent, including several spanning the entire transverse breadth, providing qualitative evidence for the filamentary structures predicted to exist in the high-redshift cosmic web. Simulated reconstructions with the same sightline sampling, spectral resolution, and signal-to-noise ratio recover the salient structures present in the underlying 3D absorption fields. Using data from other surveys, we identified 18 galaxies with known redshifts coeval with our map volume, enabling a direct comparison with our tomographic map. This shows that galaxies preferentially occupy high-density regions, in qualitative agreement with the same comparison applied to simulations. Our results establish the feasibility of the CLAMATO survey, which aims to obtain Lyα forest spectra for ∼1000 SFGs over ∼1 deg² of the COSMOS field, in order to map out the intergalactic medium large-scale structure at ⟨z⟩ ∼ 2.3 over a large volume (100 h⁻¹ Mpc)³.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Guang; Fan, Jiwen; Xu, Kuan-Man
2015-06-01
Arakawa and Wu (2013, hereafter referred to as AW13) recently developed a formal approach to a unified parameterization of atmospheric convection for high-resolution numerical models. The work is based on ideas formulated by Arakawa et al. (2011) and lays the foundation for a new parameterization pathway in the era of high-resolution numerical modeling of the atmosphere. The key parameter in this approach is the convective cloud fraction σ. In conventional parameterization, it is assumed that σ << 1. This assumption is no longer valid when the horizontal resolution of numerical models approaches a few to a few tens of kilometers, since in such situations the convective cloud fraction can be comparable to unity. Therefore, they argue that the conventional approach to parameterizing convective transport must include a factor 1 − σ in order to unify the parameterization for the full range of model resolutions so that it is scale-aware and valid for large convective cloud fractions. While AW13's approach provides important guidance for future convective parameterization development, in this note we intend to show that the conventional approach already has this scale-awareness factor 1 − σ built in, although it has not been recognized for the last forty years. Therefore, it should work well even in situations of large convective cloud fractions in high-resolution numerical models.
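The scale-aware flux form at the heart of this discussion can be sketched schematically as follows. This is a sketch in generic notation (σ for convective cloud fraction, subscripts c and e for cloud and environment values, ψ a transported quantity); it is not a verbatim reproduction of AW13's equations:

```latex
% Unified eddy-flux form carrying the scale-awareness factor sigma(1-sigma):
\[
  \overline{w'\psi'} \;=\; \sigma(1-\sigma)\,(w_c - w_e)\,(\psi_c - \psi_e)
\]
% For sigma << 1 this reduces to the conventional mass-flux form, and it
% vanishes as sigma -> 1, when the grid column is entirely convective and
% there is no eddy transport left to parameterize.
```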
The impact of Lyman-α radiative transfer on large-scale clustering in the Illustris simulation
NASA Astrophysics Data System (ADS)
Behrens, C.; Byrohl, C.; Saito, S.; Niemeyer, J. C.
2018-06-01
Context. Lyman-α emitters (LAEs) are a promising probe of the large-scale structure at high redshift, z ≳ 2. In particular, the Hobby-Eberly Telescope Dark Energy Experiment aims at observing LAEs at 1.9 < z < 3.5 to measure the baryon acoustic oscillation (BAO) scale and the redshift-space distortion (RSD). However, it has been pointed out that the complicated radiative transfer (RT) of the resonant Lyman-α emission line generates an anisotropic selection bias in the LAE clustering on large scales, s ≳ 10 Mpc. This effect could potentially induce a systematic error in the BAO and RSD measurements. Also, there exists a recent claim to have observational evidence of the effect in the Lyman-α intensity map, albeit statistically insignificant. Aims: We aim at quantifying the impact of the Lyman-α RT on the large-scale galaxy clustering in detail. For this purpose, we study the correlations between the large-scale environment and the ratio of an apparent Lyman-α luminosity to an intrinsic one, which we call the "observed fraction", at 2 < z < 6. Methods: We apply our Lyman-α RT code by post-processing the full Illustris simulations. We simply assume that the intrinsic luminosity of the Lyman-α emission is proportional to the star formation rate of galaxies in Illustris, yielding a sufficiently large sample of LAEs to measure the anisotropic selection bias. Results: We find little correlation between large-scale environment and the observed fraction induced by the RT, and hence a smaller anisotropic selection bias than has previously been claimed. We argue that the anisotropy was overestimated in previous work due to insufficient spatial resolution; it is important to keep the resolution such that it resolves the high-density region down to the scale of the interstellar medium, that is, 1 physical kpc. We also find that the correlation can be further enhanced by assumptions in modeling intrinsic Lyman-α emission.
NASA Technical Reports Server (NTRS)
Mankbadi, Mina R.; Georgiadis, Nicholas J.; DeBonis, James R.
2015-01-01
The objective of this work is to compare a high-order solver with a low-order solver for performing Large-Eddy Simulations (LES) of a compressible mixing layer. The high-order method is the Wave-Resolving LES (WRLES) solver employing a Dispersion Relation Preserving (DRP) scheme. The low-order solver is the Wind-US code, which employs the second-order Roe Physical scheme. Both solvers are used to perform LES of the turbulent mixing between two supersonic streams at a convective Mach number of 0.46. The high-order and low-order methods are evaluated at two different levels of grid resolution. For a fine grid resolution, the low-order method produces a very similar solution to the high-order method. At this fine resolution the effects of numerical scheme, subgrid scale modeling, and filtering were found to be negligible. Both methods predict turbulent stresses that are in reasonable agreement with experimental data. However, when the grid resolution is coarsened, the difference between the two solvers becomes apparent. The low-order method deviates from experimental results when the resolution is no longer adequate. The high-order DRP solution shows minimal grid dependence. The effects of subgrid scale modeling and spatial filtering were found to be negligible at both resolutions. For the high-order solver on the fine mesh, a parametric study of the spanwise width was conducted to determine its effect on solution accuracy. An insufficient spanwise width was found to impose an artificial spanwise mode and limit the resolved spanwise modes. We estimate that the spanwise depth needs to be 2.5 times larger than the largest coherent structures to capture the largest spanwise mode and accurately predict turbulent mixing.
NASA Technical Reports Server (NTRS)
Myneni, Ranga
2003-01-01
The problem of how the scale, or spatial resolution, of reflectance data impacts retrievals of vegetation leaf area index (LAI) and fraction of absorbed photosynthetically active radiation (FPAR) has been investigated. We define the goal of scaling as the process by which it is established that LAI and FPAR values derived from coarse resolution sensor data equal the arithmetic average of values derived independently from fine resolution sensor data. The increasing probability of land cover mixtures with decreasing resolution is defined as heterogeneity, which is a key concept in scaling studies. The effect of pixel heterogeneity on spectral reflectances and LAI/FPAR retrievals is investigated with 1 km Advanced Very High Resolution Radiometer (AVHRR) data aggregated to different coarse spatial resolutions. It is shown that LAI retrieval errors at coarse resolution are inversely related to the proportion of the dominant land cover in such pixels. Further, large errors in LAI retrievals are incurred when forests are minority biomes in non-forest pixels compared to when forest biomes are mixed with one another, and vice versa. A physically based technique for scaling with an explicit spatial-resolution-dependent radiative transfer formulation is developed. The successful application of this theory to scaling LAI retrievals from AVHRR data of different resolutions is demonstrated.
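The scaling mismatch described above, between a retrieval applied to aggregated coarse reflectance and the average of fine-resolution retrievals, can be illustrated with a toy example. The saturating NDVI-to-LAI function and the pixel values below are hypothetical stand-ins, not the paper's retrieval algorithm:

```python
import math

# Hypothetical saturating NDVI-to-LAI retrieval (illustrative only):
# NDVI approaches 0.9 as LAI grows, so the inverse is convex in NDVI.
def lai_from_ndvi(ndvi):
    return -2.0 * math.log(1.0 - ndvi / 0.9)

fine_ndvi = [0.2, 0.8, 0.3, 0.7]   # four heterogeneous fine-resolution pixels

# Average of fine-resolution retrievals...
mean_fine_lai = sum(lai_from_ndvi(v) for v in fine_ndvi) / len(fine_ndvi)
# ...versus a retrieval from the aggregated (coarse-pixel) reflectance.
coarse_lai = lai_from_ndvi(sum(fine_ndvi) / len(fine_ndvi))

# Because the retrieval is nonlinear and the pixel is heterogeneous,
# the two disagree (Jensen's inequality for a convex retrieval):
print(mean_fine_lai > coarse_lai)  # → True
```

The more heterogeneous the coarse pixel, the larger this gap, which is consistent with the abstract's finding that retrieval errors grow as the dominant land cover's proportion shrinks.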
Towards a Fine-Resolution Global Coupled Climate System for Prediction on Decadal/Centennial Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClean, Julie L.
The over-arching goal of this project was to contribute to the realization of a fully coupled fine resolution Earth System Model simulation in which a weather-scale atmosphere is coupled to an ocean in which mesoscale eddies are largely resolved. Both a prototype fine-resolution fully coupled ESM simulation and a first-ever multi-decadal forced fine-resolution global coupled ocean/ice simulation were configured, tested, run, and analyzed as part of this grant. Science questions focused on the gains from the use of high horizontal resolution, particularly in the ocean and sea ice, with respect to climatically important processes. Both these fine resolution coupled ocean/sea ice and fully coupled simulations and earlier stand-alone eddy-resolving ocean and eddy-permitting coupled ocean/ice simulations were used to explore the high resolution regime. Overall, these studies showed that the presence of mesoscale eddies significantly impacted mixing processes and the global meridional overturning circulation in the ocean simulations. Fourteen refereed publications and a Ph.D. dissertation resulted from this grant.
Map Scale, Proportion, and Google[TM] Earth
ERIC Educational Resources Information Center
Roberge, Martin C.; Cooper, Linda L.
2010-01-01
Aerial imagery has a great capacity to engage and maintain student interest while providing a contextual setting to strengthen their ability to reason proportionally. Free, on-demand, high-resolution, large-scale aerial photography provides both a bird's eye view of the world and a new perspective on one's own community. This article presents an…
Tait, E. W.; Ratcliff, L. E.; Payne, M. C.; ...
2016-04-20
Experimental techniques for electron energy loss spectroscopy (EELS) combine high energy resolution with high spatial resolution. They are therefore powerful tools for investigating the local electronic structure of complex systems such as nanostructures, interfaces and even individual defects. Interpretation of experimental electron energy loss spectra is often challenging and can require theoretical modelling of candidate structures, which themselves may be large and complex, beyond the capabilities of traditional cubic-scaling density functional theory. In this work, we present functionality to compute electron energy loss spectra within the onetep linear-scaling density functional theory code. We first demonstrate that simulated spectra agree with those computed using conventional plane wave pseudopotential methods to a high degree of precision. The ability of onetep to tackle large problems is then exploited to investigate convergence of spectra with respect to supercell size. Finally, we apply the novel functionality to a study of the electron energy loss spectra of defects on the (1 0 1) surface of an anatase slab and determine concentrations of defects which might be experimentally detectable.
NASA Astrophysics Data System (ADS)
Bai, Rui; Li, Tiejian; Huang, Yuefei; Li, Jiaye; Wang, Guangqian; Yin, Dongqin
2015-12-01
The increasing resolution of Digital Elevation Models (DEMs) and the development of drainage network extraction algorithms make it possible to develop high-resolution drainage networks for large river basins. These vector networks contain massive numbers of river reaches with associated geographical features, including topological connections and topographical parameters. These features create challenges for efficient map display and data management. Of particular interest are the requirements of data management for multi-scale hydrological simulations using multi-resolution river networks. In this paper, a hierarchical pyramid method is proposed, which generates coarsened vector drainage networks from the originals iteratively. The method is based on the Horton-Strahler (H-S) order scheme. At each coarsening step, the river reaches with the lowest H-S order are pruned, and their related sub-basins are merged. At the same time, the topological connections and topographical parameters of each coarsened drainage network are inherited from the former level using formulas that are presented in this study. The method was applied to the original drainage networks of a watershed in the Huangfuchuan River basin extracted from a 1-m-resolution airborne LiDAR DEM and applied to the full Yangtze River basin in China, which was extracted from a 30-m-resolution ASTER GDEM. In addition, a map-display and parameter-query web service was published for the Mississippi River basin, and its data were extracted from the 30-m-resolution ASTER GDEM. The results presented in this study indicate that the developed method can effectively manage and display massive amounts of drainage network data and can facilitate multi-scale hydrological simulations.
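The pruning idea at the heart of the pyramid method can be sketched as follows. The tree representation, function names, and the toy network are illustrative assumptions, not the authors' data structures:

```python
# Horton-Strahler order on a river-network tree, and one coarsening step
# that prunes every reach of the lowest order (illustrative sketch).

def strahler(children, node):
    """H-S order of `node`, given a map from node to its child (upstream) nodes."""
    kids = children.get(node, [])
    if not kids:
        return 1  # headwater reach
    orders = sorted((strahler(children, k) for k in kids), reverse=True)
    # Order increases only where two branches of equal order meet.
    return orders[0] + 1 if len(orders) > 1 and orders[0] == orders[1] else orders[0]

def prune_lowest(children):
    """One pyramid level: drop all subtrees whose order is the network minimum."""
    low = min(strahler(children, n) for n in children)
    return {n: [k for k in kids if strahler(children, k) > low]
            for n, kids in children.items()}

# Toy network: headwaters a and b join at confluence c; c and headwater d
# join at the outlet r.
net = {'r': ['c', 'd'], 'c': ['a', 'b'], 'a': [], 'b': [], 'd': []}
print(strahler(net, 'r'))           # → 2
print(prune_lowest(net)['r'])       # → ['c']
```

After one step, the order-1 headwaters a, b, and d are pruned and only the higher-order trunk remains, which is the coarsened network the next pyramid level starts from.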
Photogrammetric portrayal of Mars topography.
Wu, S.S.C.
1979-01-01
Special photogrammetric techniques have been developed to portray Mars topography, using Mariner and Viking imaging and nonimaging topographic information and earth-based radar data. Topography is represented by the compilation of maps at three scales: global, intermediate, and very large scale. The global map is a synthesis of topographic information obtained from Mariner 9 and earth-based radar, compiled at a scale of 1:25,000,000 with a contour interval of 1 km; it gives a broad quantitative view of the planet. At intermediate scales, Viking Orbiter photographs of various resolutions are used to compile detailed contour maps of a broad spectrum of prominent geologic features; a contour interval as small as 20 m has been obtained from very high resolution orbital photography. Imagery from the Viking lander facsimile cameras permits construction of detailed, very large scale (1:10) topographic maps of the terrain surrounding the two landers; these maps have a contour interval of 1 cm. This paper presents several new detailed topographic maps of Mars.
Xenia Mission: Spacecraft Design Concept
NASA Technical Reports Server (NTRS)
Hopkins, R. C.; Johnson, C. L.; Kouveliotou, C.; Jones, D.; Baysinger, M.; Bedsole, T.; Maples, C. C.; Benfield, P. J.; Turner, M.; Capizzo, P.;
2009-01-01
The proposed Xenia mission will, for the first time, chart the chemical and dynamical state of the majority of baryonic matter in the universe. Using high-resolution spectroscopy, Xenia will collect essential information on major tracers of the formation and evolution of structures from the early universe to the present time. The mission is based on innovative instrumental and observational approaches: observing gamma-ray bursts (GRBs) with fast reaction and high spectral resolution, which enables the study of their (star-forming) environments from the dark ages to the local universe and the use of GRBs as backlights of large-scale cosmological structures; and observing and surveying extended sources with high sensitivity using two wide field-of-view X-ray telescopes, one with high angular resolution and the other with high spectral resolution.
NASA Astrophysics Data System (ADS)
Beers, A.; Ray, C.
2015-12-01
Climate change is likely to affect mountainous areas unevenly due to the complex interactions between topography, vegetation, and the accumulation of snow and ice. This heterogeneity will complicate relationships between species presence and large-scale drivers such as precipitation and make predicting habitat extent and connectivity much more difficult. We studied the potential for fine-scale variation in climate and habitat use throughout the year in the American pika (Ochotona princeps), a talus specialist of mountainous western North America known for strong microhabitat affiliation. Not all areas of talus are likely to be equally hospitable, which may reduce connectivity more than predicted by large-scale occupancy drivers. We used high resolution remotely sensed data to create metrics of the terrain and land cover in the Niwot Ridge (NWT) LTER site in Colorado. We hypothesized that pikas preferentially use heterogeneous terrain, as it might foster greater snow accumulation, and used radio telemetry to test this with radio-collared pikas. Pikas use heterogeneous terrain during snow covered periods and less heterogeneous area during the summer. This suggests that not all areas of talus habitat are equally suitable as shelter from extreme conditions but that pikas need more than just shelter from winter cold. With those results we created a predictive map using the same habitat metrics to model the extent of suitable habitat across the NWT area. These strong effects of terrain on pika habitat use and territory occupancy show the great utility that high resolution remotely sensed data can have in ecological applications. With increasing effects of climate change in mountainous regions, this modeling approach is crucial for quantifying habitat connectivity at both small and large scales and to identify potential refugia for threatened or isolated species.
Validation of Satellite Retrieved Land Surface Variables
NASA Technical Reports Server (NTRS)
Lakshmi, Venkataraman; Susskind, Joel
1999-01-01
The effective use of satellite observations of the land surface is limited by the lack of high spatial resolution ground data sets for validation of satellite products. Recent large scale field experiments include FIFE, HAPEX-Sahel and BOREAS, which provide us with data sets that have large spatial coverage and long time coverage. It is the objective of this paper to characterize the difference between the satellite estimates and the ground observations. This study and others along similar lines will help us utilize satellite-retrieved data in large scale modeling studies.
NASA Astrophysics Data System (ADS)
Goode, J. R.; Candelaria, T.; Kramer, N. R.; Hill, A. F.
2016-12-01
As global energy demands increase, generating hydroelectric power by constructing dams and reservoirs on large river systems is increasingly seen as a renewable alternative to fossil fuels, especially in emerging economies. Many large-scale hydropower projects are located in steep mountainous terrain, where environmental factors have the potential to conspire against the sustainability and success of such projects. As reservoir storage capacity decreases when sediment builds up behind dams, high sediment yields can limit project life expectancy and overall hydropower viability. In addition, episodically delivered sediment from landslides can make quantifying sediment loads difficult. These factors, combined with remote access, limit the critical data needed to effectively evaluate development decisions. In the summer of 2015, we conducted a basic survey to characterize the geomorphology, hydrology and ecology of 620 km of the Rio Maranon, Peru - a major tributary to the Amazon River, which flows north from the semi-arid Peruvian Andes - prior to its dissection by several large hydropower dams. Here we present one component of this larger study: a first-order analysis of potential sediment inputs to the Rio Maranon, Peru. To evaluate sediment delivery and storage in this system, we used high resolution Google Earth imagery to delineate landslides, combined with high resolution imagery from a DJI Phantom 3 Drone, flown at alluvial fan inputs to the river in the field. Because hillslope-derived sediment inputs from headwater tributaries are important to overall ecosystem health in large river systems, our study has the potential to contribute to understanding the impacts of large Andean dams on sediment connectivity to the Amazon basin.
Imaging mouse cerebellum with serial optical coherence scanner (Conference Presentation)
NASA Astrophysics Data System (ADS)
Liu, Chao J.; Williams, Kristen; Orr, Harry; Akkin, Taner
2017-02-01
We present the serial optical coherence scanner (SOCS), which consists of a polarization sensitive optical coherence tomography system and a vibratome with associated controls for serial imaging, to visualize the cerebellum and adjacent brainstem of the mouse. The cerebellar cortical layers and white matter are distinguished by using intrinsic optical contrasts. Images from serial scans reveal the large-scale anatomy in detail and map the nerve fiber pathways in the cerebellum and adjacent brainstem. The optical system, which has 5.5 μm axial resolution, utilizes a scan lens or a water-immersion microscope objective, resulting in 10 μm or 4 μm lateral resolution, respectively. Large-scale brain imaging at high resolution requires an efficient way to collect large datasets, so it is important to improve the SOCS system to handle large-scale samples, and large numbers of samples, in a reasonable time. The imaging and slicing procedure for a section took about 4 minutes due to the low speed of the vibratome blade needed to maintain slicing quality. SOCS has potential to investigate pathological changes and monitor the effects of therapeutic drugs in cerebellar diseases such as spinocerebellar ataxia 1 (SCA1). SCA1 is a neurodegenerative disease characterized by atrophy and eventual loss of Purkinje cells from the cerebellar cortex, and the optical contrasts provided by SOCS are being evaluated as biomarkers of the disease.
High-resolution modeling of local air-sea interaction within the Marine Continent using COAMPS
NASA Astrophysics Data System (ADS)
Jensen, T. G.; Chen, S.; Flatau, M. K.; Smith, T.; Rydbeck, A.
2016-12-01
The Maritime Continent (MC) is a region of intense deep atmospheric convection that serves as an important source of forcing for the Hadley and Walker circulations. The convective activity in the MC region spans multiple scales from local mesoscales to regional scales, and impacts equatorial wave propagation, coupled air-sea interaction and intraseasonal oscillations. The complex distribution of islands, shallow seas with fairly small heat storage and deep seas with large heat capacity is challenging to model. Diurnal convection over land and sea is part of a small-scale land-sea breeze system, and is highly influenced by large variations in orography over land and marginal seas. Daytime solar insolation, runoff from the archipelago and nighttime rainfall tend to stabilize the water column, while mixing by tidal currents and locally forced winds promotes vertical mixing. The runoff from land and rivers and high net precipitation result in fresh water lenses that enhance vertical stability in the water column and help maintain high SST. We use the fully coupled atmosphere-ocean-wave version of the Coupled Ocean-Atmosphere Mesoscale Prediction System (COAMPS) developed at NRL, with resolution of a few kilometers, to investigate the air-sea interaction associated with the land-sea breeze system in the MC under active and inactive phases of the Madden-Julian Oscillation. The high resolution enables simulation of strong SST gradients associated with local upwelling in deeper waters and strong salinity gradients near rivers and from heavy precipitation.
The Large-scale Structure of the Universe: Probes of Cosmology and Structure Formation
NASA Astrophysics Data System (ADS)
Noh, Yookyung
The usefulness of large-scale structure as a probe of cosmology and structure formation is increasing as large deep surveys in multi-wavelength bands are becoming possible. The observational analysis of large-scale structure guided by large volume numerical simulations is beginning to offer us complementary information and crosschecks of cosmological parameters estimated from the anisotropies in Cosmic Microwave Background (CMB) radiation. Understanding structure formation and evolution and even galaxy formation history is also being aided by observations of different redshift snapshots of the Universe, using various tracers of large-scale structure. This dissertation work covers aspects of large-scale structure from the baryon acoustic oscillation scale, to that of large-scale filaments and galaxy clusters. First, I discuss the use of large-scale structure for high-precision cosmology. I investigate the reconstruction of the Baryon Acoustic Oscillation (BAO) peak within the context of Lagrangian perturbation theory, testing its validity in a large suite of cosmological volume N-body simulations. Then I consider galaxy clusters and the large-scale filaments surrounding them in a high resolution N-body simulation. I investigate the geometrical properties of galaxy cluster neighborhoods, focusing on the filaments connected to clusters. Using mock observations of galaxy clusters, I explore the correlations of scatter in galaxy cluster mass estimates from multi-wavelength observations and different measurement techniques. I also examine the sources of the correlated scatter by considering the intrinsic and environmental properties of clusters.
NASA Astrophysics Data System (ADS)
Toigo, Anthony D.; Lee, Christopher; Newman, Claire E.; Richardson, Mark I.
2012-09-01
We investigate the sensitivity of the circulation and thermal structure of the martian atmosphere to numerical model resolution in a general circulation model (GCM) using the martian implementation (MarsWRF) of the planetWRF atmospheric model. We provide a description of the MarsWRF GCM and use it to study the global atmosphere at horizontal resolutions from 7.5° × 9° to 0.5° × 0.5°, encompassing the range from standard Mars GCMs to global mesoscale modeling. We find that while most of the gross-scale features of the circulation (the rough location of jets, the qualitative thermal structure, and the major large-scale features of the surface level winds) are insensitive to horizontal resolution over this range, several major features of the circulation are sensitive in detail. The northern winter polar circulation shows the greatest sensitivity, showing a continuous transition from a smooth polar winter jet at low resolution, to a distinct vertically “split” jet as resolution increases. The separation of the lower and middle atmosphere polar jet occurs at roughly 10 Pa, with the split jet structure developing in concert with the intensification of meridional jets at roughly 10 Pa and above 0.1 Pa. These meridional jets appear to represent the separation of lower and middle atmosphere mean overturning circulations (with the former being consistent with the usual concept of the “Hadley cell”). Further, the transition in polar jet structure is more sensitive to changes in zonal than meridional horizontal resolution, suggesting that representation of small-scale wave-mean flow interactions is more important than fine-scale representation of the meridional thermal gradient across the polar front. Increasing the horizontal resolution improves the match between the modeled thermal structure and the Mars Climate Sounder retrievals for northern winter high latitudes. 
While increased horizontal resolution also improves the simulation of the northern high latitudes at equinox, even the lowest model resolution considered here appears to do a good job for the southern winter and southern equinoctial pole (although in detail some discrepancies remain). These results suggest that studies of the northern winter jet (e.g., transient waves and cyclogenesis) will be more sensitive to global model resolution than those of the south (e.g., the confining dynamics of the southern polar vortex relevant to studies of argon transport). For surface winds, the major effect of increased horizontal resolution is in the superposition of circulations forced by local-scale topography upon the large-scale surface wind patterns. While passive predictions of dust lifting are generally insensitive to model horizontal resolution when no lifting threshold is considered, increasing the stress threshold produces significantly more lifting in higher resolution simulations with the generation of finer-scale, higher-stress winds due primarily to better-resolved topography. Considering the positive feedbacks expected for radiatively active dust lifting, we expect this bias to increase when such feedbacks are permitted.
Agent-based large-scale emergency evacuation using real-time open government data.
DOT National Transportation Integrated Search
2014-01-01
The open government initiatives have provided tremendous data resources for the transportation system and emergency services in urban areas. This paper proposes a traffic simulation framework using high temporal resolution demographic data and ...
NASA Technical Reports Server (NTRS)
Goldsmith, Paul F.
2012-01-01
Surveys of all different types provide basic data using different tracers. Molecular clouds have structure over a very wide range of scales. Thus, "high resolution" surveys and studies of selected nearby clouds add critical information. The combination of large area and high resolution allows increased spatial dynamic range, which in turn enables detection of new and perhaps critical morphology (e.g. filaments). Theoretical modeling has made major progress, and suggests that multiple forces are at work. Galactic-scale modeling is also progressing and indicates that stellar feedback is required. Models must strive to reproduce observed cloud structure at all scales. Astrochemical observations are not unrelated to questions of cloud evolution and star formation, but we are still learning how to use this capability.
Rodríguez, José-Rodrigo; Turégano-López, Marta; DeFelipe, Javier; Merchán-Pérez, Angel
2018-01-01
Semithin sections are commonly used to examine large areas of tissue with an optical microscope, in order to locate and trim the regions that will later be studied with the electron microscope. Ideally, the observation of semithin sections would be from mesoscopic to nanoscopic scales directly, instead of using light microscopy and then electron microscopy (EM). Here we propose a method that makes it possible to obtain high-resolution scanning EM images of large areas of the brain in the millimeter to nanometer range. Since our method is compatible with light microscopy, it is also feasible to generate hybrid light and electron microscopic maps. Additionally, the same tissue blocks that have been used to obtain semithin sections can later be used, if necessary, for transmission EM, or for focused ion beam milling and scanning electron microscopy (FIB-SEM). PMID:29568263
High-resolution observations of combustion in heterogeneous surface fuels
E. Louise Loudermilk; Gary L. Achtemeier; Joseph J. O' Brien; J. Kevin Hiers; Benjamin S. Hornsby
2014-01-01
In ecosystems with frequent surface fires, fire and fuel heterogeneity at relevant scales have been largely ignored. This could be because complete burns give an impression of homogeneity, or due to the difficulty in capturing fine-scale variation in fuel characteristics and fire behaviour. Fire movement between patches of fuel can have implications for modelling fire...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guba, O.; Taylor, M. A.; Ullrich, P. A.
2014-11-27
We evaluate the performance of the Community Atmosphere Model's (CAM) spectral element method on variable-resolution grids using the shallow-water equations in spherical geometry. We configure the method as it is used in CAM, with dissipation of grid-scale variance implemented using hyperviscosity. Hyperviscosity is highly scale selective and grid independent, but does require a resolution-dependent coefficient. For the spectral element method with variable-resolution grids and highly distorted elements, we obtain the best results if we introduce a tensor-based hyperviscosity with tensor coefficients tied to the eigenvalues of the local element metric tensor. The tensor hyperviscosity is constructed so that, for regions of uniform resolution, it matches the traditional constant-coefficient hyperviscosity. With the tensor hyperviscosity, the large-scale solution is almost completely unaffected by the presence of grid refinement. This latter point is important for climate applications in which long-term climatological averages can be imprinted by stationary inhomogeneities in the truncation error. We also evaluate the robustness of the approach with respect to grid quality by considering unstructured conforming quadrilateral grids generated with a well-known grid-generating toolkit and grids generated by SQuadGen, a new open-source alternative which produces lower-valence nodes.
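The scale selectivity that makes hyperviscosity attractive can be seen in a minimal 1-D sketch (illustrative only; the function name and the periodic finite-difference setup below are assumptions, not the CAM spectral-element operator): applying the Laplacian twice damps a mode of wavenumber k at a rate proportional to k^4, so grid-scale variance decays orders of magnitude faster than the large-scale solution.

```python
import numpy as np

def hyperviscosity_tendency(u, dx, nu4):
    """Fourth-order hyperviscosity tendency -nu4 * d4u/dx4 on a periodic 1-D grid.

    Sketch only: CAM applies the operator on 2-D spectral elements, with
    tensor coefficients on variable-resolution grids.
    """
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2         # d2u/dx2
    lap2 = (np.roll(lap, -1) - 2.0 * lap + np.roll(lap, 1)) / dx**2  # d4u/dx4
    return -nu4 * lap2

n, L = 128, 2.0 * np.pi
x = np.linspace(0.0, L, n, endpoint=False)
dx = L / n
large = np.sin(x)        # wavenumber-1 mode (large scale)
small = np.sin(32 * x)   # wavenumber-32 mode (near grid scale)

# Damping rate scales roughly like k**4: the near-grid-scale mode is damped
# several orders of magnitude faster than the large-scale mode.
ratio = np.abs(hyperviscosity_tendency(small, dx, 1e-6)).max() \
      / np.abs(hyperviscosity_tendency(large, dx, 1e-6)).max()
print(ratio)
```

The resolution-dependent coefficient mentioned in the abstract corresponds to choosing `nu4` per region; the tensor form generalizes this to anisotropic, distorted elements.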
Physical basis for river segmentation from water surface observables
NASA Astrophysics Data System (ADS)
Samine Montazem, A.; Garambois, P. A.; Calmant, S.; Moreira, D. M.; Monnier, J.; Biancamaria, S.
2017-12-01
With the advent of satellite missions such as SWOT, we will have access to high-resolution estimates of the elevation, slope and width of the free surface. A segmentation strategy is required in order to sub-sample the data set into reach master points for further hydraulic analyses and inverse modelling. The question that arises is: what is the best node repartition strategy that preserves the hydraulic properties of river flow? The concept of hydraulic visibility introduced by Garambois et al. (2016) is investigated in order to highlight and characterize the spatio-temporal variations of water surface slope and curvature for different flow regimes and reach geometries. We show that free surface curvature is a powerful proxy for characterizing the hydraulic behavior of a reach, since the concavity of the water surface is driven by variations in channel geometry that impact the hydraulic properties of the flow. We evaluated the performance of three segmentation strategies by means of a well-documented case, that of the Garonne river in France. We conclude that local extrema of free surface curvature appear to be the best candidates for locating segment boundaries for an optimal hydraulic representation of the segmented river. We show that for a given river different segmentation scales are possible, from a fine-scale segmentation driven by fine-scale hydraulics to a large-scale segmentation driven by large-scale geomorphology. The segmentation technique is then applied to high-resolution GPS profiles of free surface elevation collected on the Negro river basin, a major contributor of the Amazon river. We propose two segmentations: a low-resolution one that can be used for basin hydrology and a higher-resolution one better suited for local hydrodynamic studies.
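As a toy illustration of the boundary-placement idea (the function and the synthetic profile below are hypothetical sketches, not the authors' algorithm), candidate segment boundaries can be taken at local extrema of the free-surface curvature estimated from an elevation profile:

```python
import numpy as np

def curvature_extrema(z, dx=1.0):
    """Locate candidate reach boundaries as local extrema of free-surface
    curvature, approximated here by the second derivative of elevation.

    Illustrative only: the study works with SWOT-like elevation/slope/width
    observables, not this simple finite-difference proxy.
    """
    curv = np.gradient(np.gradient(z, dx), dx)   # d2z/dx2
    # interior points where curvature is a strict local max or min
    left, mid, right = curv[:-2], curv[1:-1], curv[2:]
    is_max = (mid > left) & (mid > right)
    is_min = (mid < left) & (mid < right)
    return np.where(is_max | is_min)[0] + 1

x = np.linspace(0.0, 10.0, 201)
z = -0.01 * x + 0.05 * np.sin(x)   # sloping water surface with undulations
idx = curvature_extrema(z, dx=x[1] - x[0])
print(idx)
```

For this synthetic profile the detected boundaries fall where the sinusoidal undulation changes concavity fastest, mimicking how channel-geometry changes imprint on the water surface.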
A High-Resolution WRF Tropical Channel Simulation Driven by a Global Reanalysis
NASA Astrophysics Data System (ADS)
Holland, G.; Leung, L.; Kuo, Y.; Hurrell, J.
2006-12-01
Since 2003, NCAR has invested in the development and application of the Nested Regional Climate Model (NRCM), based on the Weather Research and Forecasting (WRF) model and the Community Climate System Model, as a key component of the Prediction Across Scales Initiative. A prototype tropical channel model has been developed to investigate scale interactions and the influence of tropical convection on large-scale circulation and tropical modes. The model was developed based on the NCAR Weather Research and Forecasting Model (WRF), configured as a tropical channel between 30°S and 45°N, wide enough to allow teleconnection effects over the mid-latitudes. Compared to the limited-area domains that WRF is typically applied over, the channel mode alleviates issues with reflection of tropical modes that could result from imposing east/west boundaries. Using a large amount of available computing resources on a supercomputer (Blue Vista) during its bedding-in period, a simulation has been completed with the tropical channel applied at 36 km horizontal resolution for 5 years from 1996 to 2000, with large-scale circulation provided by the NCEP/NCAR global reanalysis at the north/south boundaries. Shorter simulations of 2 years and 6 months have also been performed to include two-way nests at 12 km and 4 km resolution, respectively, over the western Pacific warm pool, to explicitly resolve tropical convection in the Maritime Continent. The simulations realistically captured the large-scale circulation, including the trade winds over the tropical Pacific and Atlantic, the Australian and Asian monsoon circulations, and hurricane statistics. Preliminary analysis and evaluation of the simulations will be presented.
NASA Technical Reports Server (NTRS)
Lim, Young-Kwon; Stefanova, Lydia B.; Chan, Steven C.; Schubert, Siegfried D.; O'Brien, James J.
2010-01-01
This study assesses the regional-scale summer precipitation produced by the dynamical downscaling of analyzed large-scale fields. The main goal of this study is to investigate how much smaller-scale precipitation information the regional model adds that the large-scale fields do not resolve. The modeling region for this study covers the southeastern United States (Florida, Georgia, Alabama, South Carolina, and North Carolina), where the summer climate is subtropical in nature, with a heavy influence of regional-scale convection. The coarse-resolution (2.5° latitude/longitude) large-scale atmospheric variables from the National Center for Environmental Prediction (NCEP)/DOE reanalysis (R2) are downscaled using the NCEP Environmental Climate Prediction Center regional spectral model (RSM) to produce precipitation at 20 km resolution for 16 summer seasons (1990-2005). The RSM produces realistic details in the regional summer precipitation at 20 km resolution. Compared to R2, the RSM-produced monthly precipitation shows better agreement with observations. There is a reduced wet bias and a more realistic spatial pattern of the precipitation climatology compared with the interpolated R2 values. The root mean square errors of the monthly R2 precipitation are reduced over 93% (1,697) of all the grid points in the five states (1,821 in total). The temporal correlation also improves over 92% (1,675) of all grid points, such that the domain-averaged correlation increases from 0.38 (R2) to 0.55 (RSM). The RSM accurately reproduces the first two observed eigenmodes, compared with the R2 product, for which the second mode is not properly reproduced. The spatial patterns for wet versus dry summer years are also successfully simulated in RSM. For shorter time scales, the RSM resolves heavy rainfall events and their frequency better than R2.
Correlation and categorical classification (above/near/below average) for the monthly frequency of heavy precipitation days is also significantly improved by the RSM.
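The above/near/below-average categories used in that evaluation can be sketched with a simple tercile classifier (a hypothetical illustration; the study's exact category thresholds are not specified here):

```python
import numpy as np

def tercile_class(series):
    """Label each value below/near/above average using the terciles of
    the series itself (illustrative; thresholds could also come from a
    climatological reference period)."""
    lo, hi = np.percentile(series, [100.0 / 3.0, 200.0 / 3.0])
    return np.where(series < lo, "below",
                    np.where(series > hi, "above", "near"))

# Hypothetical monthly counts of heavy-precipitation days
freq = np.array([2, 5, 9, 4, 7, 1, 6, 8, 3])
print(tercile_class(freq))
```

Categorical skill is then scored by comparing the simulated label against the observed label month by month.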
NASA Astrophysics Data System (ADS)
Kourafalou, V.; Kang, H.; Perlin, N.; Le Henaff, M.; Lamkin, J. T.
2016-02-01
Connectivity around the South Florida coastal regions and between South Florida and Cuba is largely influenced by a) local coastal processes and b) circulation in the Florida Straits, which is controlled by the larger-scale Florida Current variability. Prediction of the physical connectivity is a necessary component for several activities that require ocean forecasts, such as oil spills, fisheries research, and search and rescue. This requires a predictive system that can accommodate the intense coastal-to-offshore interactions and the linkages to the complex regional circulation. The Florida Straits, South Florida and Florida Keys Hybrid Coordinate Ocean Model is such a regional ocean predictive system, covering a large area over the Florida Straits and the adjacent land areas and representing both coastal and oceanic processes. The real-time ocean forecast system is high resolution (~900 m), embedded in larger-scale predictive models. It includes detailed coastal bathymetry and high-resolution/high-frequency atmospheric forcing, and provides 7-day forecasts, updated daily (see: http://coastalmodeling.rsmas.miami.edu/). The unprecedented high resolution and coastal detail of this system add value to global forecasts through downscaling and allow a variety of applications. Examples will be presented, focusing on the period of a 2015 fisheries cruise around the coastal areas of Cuba, where model predictions helped guide measurements of biophysical connectivity, under intense variability of the mesoscale eddy field and the subsequent Florida Current meandering.
NASA Astrophysics Data System (ADS)
Maxwell, R. M.; Condon, L. E.; Kollet, S. J.
2013-12-01
Groundwater is an important component of the hydrologic cycle, yet its importance is often overlooked. Aquifers are a critical water resource, particularly for irrigation, but they also participate in moderating the land-energy balance over the so-called critical zone of 2-10 m in water table depth. Yet the scaling behavior of groundwater is not well known. Here, we present the results of a fully integrated hydrologic model run over a 6.3M km2 domain that covers much of North America, focused on the continental United States. This model encompasses both the Mississippi and Colorado River watersheds in their entirety at 1 km resolution and is constructed using the fully integrated groundwater-vadose zone-surface water-land surface model ParFlow. Results from this work are compared to observations (of both surface water flow and groundwater depths), and approaches are presented for observing these integrated systems. Furthermore, results are used to understand the scaling behavior of groundwater over the continent at high resolution. Implications for understanding dominant hydrological processes at large scales will be discussed.
2013-09-30
flow models, such as Delft3D, with our developed Boussinesq-type model. The vision of this project is to develop an operational tool for the...situ measurements or large-scale wave models. This information will be used to drive the offshore wave boundary condition. • Execute the Boussinesq...model to match with the Boussinesq-type theory would be one which can simulate sheared and stratified currents due to large-scale (non-wave) forcings
Global high-resolution simulations of tropospheric nitrogen dioxide using CHASER V4.0
NASA Astrophysics Data System (ADS)
Sekiya, Takashi; Miyazaki, Kazuyuki; Ogochi, Koji; Sudo, Kengo; Takigawa, Masayuki
2018-03-01
We evaluate global tropospheric nitrogen dioxide (NO2) simulations using the CHASER V4.0 global chemical transport model (CTM) at horizontal resolutions of 0.56, 1.1, and 2.8°. Model evaluation was conducted using satellite tropospheric NO2 retrievals from the Ozone Monitoring Instrument (OMI) and the Global Ozone Monitoring Experiment-2 (GOME-2) and aircraft observations from the 2014 Front Range Air Pollution and Photochemistry Experiment (FRAPPÉ). Agreement against satellite retrievals improved greatly at 1.1 and 0.56° resolutions (compared to 2.8° resolution) over polluted and biomass burning regions. The 1.1° simulation generally captured the regional distribution of the tropospheric NO2 column well, whereas 0.56° resolution was necessary to improve the model performance over areas with strong local sources, with mean bias reductions of 67 % over Beijing and 73 % over San Francisco in summer. Validation using aircraft observations indicated that high-resolution simulations reduced negative NO2 biases below 700 hPa over the Denver metropolitan area. These improvements in high-resolution simulations were attributable to (1) closer spatial representativeness between simulations and observations and (2) better representation of large-scale concentration fields (i.e., at 2.8°) through the consideration of small-scale processes. Model evaluations conducted at 0.5 and 2.8° bin grids indicated that the contributions of both these processes were comparable over most polluted regions, whereas the latter effect (2) made a larger contribution over eastern China and biomass burning areas. The evaluations presented in this paper demonstrate the potential of using a high-resolution global CTM for studying megacity-scale air pollutants across the entire globe, potentially also contributing to global satellite retrievals and chemical data assimilation.
Detecting Multi-scale Structures in Chandra Images of Centaurus A
NASA Astrophysics Data System (ADS)
Karovska, M.; Fabbiano, G.; Elvis, M. S.; Evans, I. N.; Kim, D. W.; Prestwich, A. H.; Schwartz, D. A.; Murray, S. S.; Forman, W.; Jones, C.; Kraft, R. P.; Isobe, T.; Cui, W.; Schreier, E. J.
1999-12-01
Centaurus A (NGC 5128) is a giant early-type galaxy with a merger history, containing the nearest radio-bright AGN. Recent Chandra High Resolution Camera (HRC) observations of Cen A reveal X-ray multi-scale structures in this object with unprecedented detail and clarity. We show the results of an analysis of the Chandra data with smoothing and edge-enhancement techniques that allow us to enhance and quantify the multi-scale structures present in the HRC images. These techniques include an adaptive smoothing algorithm (Ebeling et al. 1999) and a multi-directional gradient detection algorithm (Karovska et al. 1994). The Ebeling et al. adaptive smoothing algorithm, which is incorporated in the CXC analysis software package, is a powerful tool for smoothing images containing complex structures at various spatial scales. The adaptively smoothed images of Centaurus A simultaneously show the high-angular-resolution bright structures at scales as small as an arcsecond and the extended faint structures as large as several arcminutes. The large-scale structures suggest complex symmetry, including a component possibly associated with the inner radio lobes (as suggested by the ROSAT HRI data; Dobereiner et al. 1996), and a separate component with an orthogonal symmetry that may be associated with the galaxy as a whole. The dust lane and the X-ray ridges are very clearly visible. The adaptively smoothed images and the edge-enhanced images also suggest several filamentary features, including a large filament-like structure extending as far as about 5 arcminutes to the north-west.
A Large Scale, High Resolution Agent-Based Insurgency Model
2013-09-30
CUDA) is NVIDIA Corporation's software development model for General Purpose Programming on Graphics Processing Units (GPGPU) (NVIDIA Corporation...Conference. Argonne National Laboratory, Argonne, IL, October 2005. NVIDIA Corporation. NVIDIA CUDA Programming Guide 2.0 [Online]. NVIDIA Corporation
Cross-Scale Molecular Analysis of Chemical Heterogeneity in Shale Rocks
Hao, Zhao; Bechtel, Hans A.; Kneafsey, Timothy; ...
2018-02-07
The organic and mineralogical heterogeneity in shale at micrometer and nanometer spatial scales contributes to the quality of gas reserves, gas flow mechanisms and gas production. Here, we demonstrate two molecular imaging approaches based on infrared spectroscopy to obtain mineral and kerogen information at these mesoscale spatial resolutions in large-sized shale rock samples. The first method is a modified microscopic attenuated total reflectance measurement that utilizes a large germanium hemisphere combined with a focal plane array detector to rapidly capture chemical images of shale rock surfaces spanning hundreds of micrometers with micrometer spatial resolution. The second method, synchrotron infrared nano-spectroscopy, utilizes a metallic atomic force microscope tip to obtain chemical images of micrometer dimensions but with nanometer spatial resolution. This chemically "deconvoluted" imaging at the nano-pore scale is then used to build a machine learning model to generate a molecular distribution map across scales with a spatial span of 1000 times, which enables high-throughput geochemical characterization in greater detail across the nano-pore and micro-grain scales and allows us to identify co-localization of mineral phases with chemically distinct organics and even with gas-phase sorbents. Finally, this characterization is fundamental to understanding the mineral and organic compositions affecting the behavior of shales.
The Advanced Telescope for High Energy Astrophysics
NASA Astrophysics Data System (ADS)
Guainazzi, Matteo
2017-08-01
Athena (the Advanced Telescope for High Energy Astrophysics) is a next-generation X-ray observatory currently under study by ESA for launch in 2028. Athena is designed to address the Hot and Energetic Universe science theme, which poses two key questions: 1) How did ordinary matter evolve into the large-scale structures we see today? 2) How do black holes grow and shape the Universe? To address these topics Athena employs an innovative X-ray telescope based on Silicon Pore Optics technology to deliver extremely light weight and high throughput, while retaining excellent angular resolution. The mirror can be adjusted to focus onto one of two focal plane instruments: the X-ray Integral Field Unit (X-IFU), which provides spatially resolved, high-resolution spectroscopy, and the Wide Field Imager (WFI), which provides spectral imaging over a large field of view, as well as high time resolution and count-rate tolerance. Athena is currently in Phase A, and the study status will be reviewed, along with the scientific motivations behind the mission.
NASA Astrophysics Data System (ADS)
Wylezalek, Dominika; Schnorr Müller, Allan; Zakamska, Nadia L.; Storchi-Bergmann, Thaisa; Greene, Jenny E.; Müller-Sánchez, Francisco; Kelly, Michael; Liu, Guilin; Law, David R.; Barrera-Ballesteros, Jorge K.; Riffel, Rogemar A.; Thomas, Daniel
2017-05-01
Ionized gas outflows driven by active galactic nuclei (AGN) are ubiquitous in high-luminosity AGN, with outflow speeds apparently correlated with the total bolometric luminosity of the AGN. This empirical relation and theoretical work suggest that in the range Lbol ~ 10^43-10^45 erg s^-1 there must exist a threshold luminosity above which the AGN becomes powerful enough to launch winds that will be able to escape the galaxy potential. In this paper, we present pilot observations of two AGN in this transitional range that were taken with the Gemini North Multi-Object Spectrograph integral field unit (IFU). Both sources have also previously been observed within the Sloan Digital Sky Survey-IV (SDSS) Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey. While the MaNGA IFU maps probe the gas fields on galaxy-wide scales and show that some regions are dominated by AGN ionization, the new Gemini IFU data zoom into the centre with four times better spatial resolution. In the object with the lower Lbol we find evidence of a young or stalled biconical AGN-driven outflow where none was obvious at the MaNGA resolution. In the object with the higher Lbol we trace the large-scale biconical outflow into the nuclear region and connect the outflow from small to large scales. These observations suggest that AGN luminosity and galaxy potential are crucial in shaping wind launching and propagation in low-luminosity AGN. The transition from small and young outflows to galaxy-wide feedback can only be understood by combining large-scale IFU data that trace the galaxy velocity field with higher-resolution, small-scale IFU maps.
NASA Astrophysics Data System (ADS)
Mateo, Cherry May R.; Yamazaki, Dai; Kim, Hyungjun; Champathong, Adisorn; Vaze, Jai; Oki, Taikan
2017-10-01
Global-scale river models (GRMs) are core tools for providing consistent estimates of global flood hazard, especially in data-scarce regions. Due to former limitations in computational power and input datasets, most GRMs have been developed to use simplified representations of flow physics and run at coarse spatial resolutions. With increasing computational power and improved datasets, the application of GRMs to finer resolutions is becoming a reality. To support development in this direction, the suitability of GRMs for application to finer resolutions needs to be assessed. This study investigates the impacts of spatial resolution and flow connectivity representation on the predictive capability of a GRM, CaMa-Flood, in simulating the 2011 extreme flood in Thailand. Analyses show that when single downstream connectivity (SDC) is assumed, simulation results deteriorate with finer spatial resolution; Nash-Sutcliffe efficiency coefficients decreased by more than 50 % between simulation results at 10 km resolution and 1 km resolution. When multiple downstream connectivity (MDC) is represented, simulation results slightly improve with finer spatial resolution. The SDC simulations result in excessive backflows on very flat floodplains due to the restrictive flow directions at finer resolutions. MDC channels attenuated these effects by maintaining flow connectivity and flow capacity between floodplains in varying spatial resolutions. While a regional-scale flood was chosen as a test case, these findings should be universal and may have significant impacts on large- to global-scale simulations, especially in regions where mega deltas exist. These results demonstrate that a GRM can be used for higher resolution simulations of large-scale floods, provided that MDC in rivers and floodplains is adequately represented in the model structure.
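The Nash-Sutcliffe efficiency used in that comparison is straightforward to compute; a minimal sketch (the discharge numbers below are invented for illustration, not from the study):

```python
import numpy as np

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2).

    NSE = 1 is a perfect fit; NSE <= 0 means the simulation is no better
    than predicting the observed mean.
    """
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical discharge series (m^3/s) for one gauge
obs = np.array([100.0, 150.0, 400.0, 900.0, 600.0, 300.0, 150.0])
sim_good = obs + np.array([10.0, -20.0, 30.0, -50.0, 40.0, -10.0, 5.0])
sim_poor = obs * 0.5 + 200.0   # damped, biased simulation

print(nash_sutcliffe(sim_good, obs))  # higher NSE
print(nash_sutcliffe(sim_poor, obs))  # lower NSE
```

Comparing NSE across resolutions, as the study does, then reduces to evaluating this score for each simulation against the same observed series.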
An efficient photogrammetric stereo matching method for high-resolution images
NASA Astrophysics Data System (ADS)
Li, Yingsong; Zheng, Shunyi; Wang, Xiaonan; Ma, Hao
2016-12-01
Stereo matching of high-resolution images is a great challenge in photogrammetry. The main difficulty is the enormous processing workload, which involves substantial computing time and memory consumption. In recent years, the semi-global matching (SGM) method has been a promising approach for solving stereo problems in different data sets. However, the time complexity and memory demand of SGM are proportional to the scale of the images involved, which leads to very high consumption when dealing with large images. To solve this, this paper presents an efficient hierarchical matching strategy based on the SGM algorithm using single instruction multiple data instructions and structured parallelism in the central processing unit. The proposed method can significantly reduce the computational time and memory required for large-scale stereo matching. The three-dimensional (3D) surface is reconstructed by triangulating and fusing redundant reconstruction information from multi-view matching results. Finally, three high-resolution aerial data sets are used to evaluate our improvements. Furthermore, precise airborne laser scanner data for one data set is used to measure the accuracy of our reconstruction. Experimental results demonstrate that our method offers remarkable time and memory savings while maintaining the density and precision of the derived 3D point cloud.
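A minimal single-path sketch of the SGM cost aggregation underlying the paper's baseline (a hypothetical simplification: real SGM aggregates along 8 or 16 scanline directions and sums them, and the names below are mine):

```python
import numpy as np

def sgm_scanline(cost, P1=1.0, P2=8.0):
    """Aggregate matching costs left-to-right along one scanline.

    cost: (width, ndisp) array of per-pixel matching costs.
    Implements the SGM recurrence: small penalty P1 for +-1 disparity
    changes, larger penalty P2 for bigger jumps, minus the previous
    minimum to keep values bounded.
    """
    W, D = cost.shape
    L = np.empty_like(cost, dtype=float)
    L[0] = cost[0]
    for col in range(1, W):
        prev = L[col - 1]
        m = prev.min()
        up = np.concatenate(([np.inf], prev[:-1])) + P1    # from disparity d-1
        down = np.concatenate((prev[1:], [np.inf])) + P1   # from disparity d+1
        best = np.minimum(np.minimum(prev, np.minimum(up, down)), m + P2)
        L[col] = cost[col] + best - m
    return L

# Synthetic cost volume whose true disparity is 3 at every pixel
cost = np.ones((10, 8))
cost[:, 3] = 0.0
disparity = sgm_scanline(cost).argmin(axis=1)
print(disparity)  # all 3s
```

The memory issue the paper targets is visible here: the `(width, ndisp)` slice grows with image size, and full SGM keeps such state for the whole image across all paths.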
Automated AFM for small-scale and large-scale surface profiling in CMP applications
NASA Astrophysics Data System (ADS)
Zandiatashbar, Ardavan; Kim, Byong; Yoo, Young-kook; Lee, Keibock; Jo, Ahjin; Lee, Ju Suk; Cho, Sang-Joon; Park, Sang-il
2018-03-01
As feature sizes shrink in the foundries, the need for inline high-resolution surface profiling with versatile capabilities is increasing. One important area of this need is the chemical mechanical planarization (CMP) process. We introduce a new generation of atomic force profiler (AFP) using a decoupled-scanners design. The system is capable of providing small-scale profiling using the XY scanner and large-scale profiling using the sliding stage. The decoupled-scanners design enables enhanced vision, which helps minimize the positioning error for locations of interest in the case of highly polished dies. Non-contact mode imaging is another feature of interest in this system, used for surface roughness measurement, automatic defect review, and deep trench measurement. Examples of measurements performed using the atomic force profiler are demonstrated.
The Advanced Pair Telescope (APT) Mission Concept
NASA Technical Reports Server (NTRS)
Hunter, Stanley; Buckley, James H.
2008-01-01
We present a mission concept for the Advanced Pair Telescope (APT), a high-energy gamma-ray instrument with an order-of-magnitude improvement in sensitivity, a 6 sr field of view, and angular resolution a factor of 3-10 better than that of GLAST. With its very wide instantaneous field of view and large effective area, this instrument would be capable of detecting GRBs at very large redshifts, would enable very high resolution study of SNRs and PWN, and could provide hour-scale temporal resolution of transients from many AGN and galactic sources. The APT instrument will consist of a Xe time-projection-chamber tracker that bridges the energy regime between Compton scattering and pair production and will provide an unprecedented improvement in angular resolution; a thick scintillating-fiber tracker/calorimeter that will provide sensitivity and energy resolution to higher energies and will possess a factor of 10 improvement in geometric factor over GLAST; and an anticoincidence detector using scintillator tiles to reject charged particles. After the anticipated 10 years of GLAST operation, the APT instrument would provide continued coverage of the critical high-energy gamma-ray band (30 MeV to 100 GeV), providing an essential component of broad-band multiwavelength studies of the high-energy universe.
Analysis of Radar and Optical Space Borne Data for Large Scale Topographical Mapping
NASA Astrophysics Data System (ADS)
Tampubolon, W.; Reinhardt, W.
2015-03-01
Normally, in order to provide high-resolution three-dimensional (3D) geospatial data, large-scale topographical mapping needs input from conventional airborne campaigns, which in Indonesia are bureaucratically complicated, especially during legal administration procedures, i.e. security clearance from the military/defense ministry. This often causes additional time delays besides technical constraints such as weather and limited aircraft availability for airborne campaigns. Of course, geospatial data quality is an important issue for many applications. The increasing demand for geospatial data nowadays consequently requires high-resolution datasets as well as a sufficient level of accuracy. Therefore an integration of different technologies is required in many cases to gain the expected result, especially in the context of disaster preparedness and emergency response. Another important issue in this context is the fast delivery of relevant data, which is expressed by the term "Rapid Mapping". In this paper we present first results of an on-going research effort to integrate different data sources like space-borne radar and optical platforms. Initially, the orthorectification of Very High Resolution Satellite (VHRS) imagery, i.e. SPOT-6, has been done as a continuous process to the DEM generation using TerraSAR-X/TanDEM-X data. The role of Ground Control Points (GCPs) from GNSS surveys is mandatory in order to fulfil the geometrical accuracy. In addition, this research aims at providing a suitable processing algorithm for space-borne data for large-scale topographical mapping, as described in section 3.2. Recently, radar space-borne data has been used for medium-scale topographical mapping, e.g. for the 1:50.000 map scale in Indonesian territories. The goal of this on-going research is to increase the accuracy of remote sensing data by different activities, e.g. the integration of different data sources (optical and radar) or the usage of GCPs in both the optical and the radar satellite data processing. Finally, these results will be used in the future as a reference for further geospatial data acquisitions to support topographical mapping at even larger scales, up to the 1:10.000 map scale.
D.J. Hayes; W.B. Cohen
2006-01-01
This article describes the development of a methodology for scaling observations of changes in tropical forest cover to large areas at high temporal frequency from coarse-resolution satellite imagery. The approach for estimating proportional forest cover change as a continuous variable is based on a regression model that relates multispectral, multitemporal Moderate...
Full-scale high-speed ``Edgerton'' retroreflective shadowgraphy of gunshots
NASA Astrophysics Data System (ADS)
Settles, Gary
2005-11-01
Almost half a century ago, H. E. ``Doc'' Edgerton demonstrated a simple and elegant direct-shadowgraph technique for imaging large-scale events like explosions and gunshots. Only a retroreflective screen, flashlamp illumination, and an ordinary view camera were required. Retroreflective shadowgraphy has seen occasional use since then, but its unique combination of large scale, simplicity, and portability has barely been tapped. It functions well in environments hostile to most optical diagnostics, such as full-scale outdoor daylight ballistics and explosives testing. Here, shadowgrams cast upon a 2.4 m square retroreflective screen are imaged by a Photron Fastcam APX-RS digital camera that is capable of megapixel image resolution at 3000 frames/sec, up to 250,000 frames/sec at lower resolution. Microsecond frame exposures are used to examine the external ballistics of several firearms, including a high-powered rifle, an AK-47 rifle, and several pistols and revolvers. Muzzle blast phenomena and the mechanism of gunpowder residue deposition on the shooter's hands are clearly visualized. In particular, observing the firing of a pistol with and without a silencer (suppressor) suggests that some of the muzzle blast energy is converted by the silencer into supersonic jet noise.
NASA Astrophysics Data System (ADS)
Chirayath, V.
2014-12-01
Fluid Lensing is a theoretical model and algorithm for fluid-optical interactions in turbulent flows and at two-fluid surface boundaries that, when coupled with a unique computer vision and image-processing pipeline, may be used to significantly enhance the angular resolution of a remote sensing optical system, with applicability to high-resolution 3D imaging of subaqueous regions and imaging through turbulent fluid flows. This novel remote sensing technology has recently been implemented on a quadcopter-based UAS for imaging shallow benthic systems, creating the first dataset of a biosphere with unprecedented sub-cm-level imagery in 3D over areas as large as 15 square kilometers. Perturbed two-fluid boundaries with different refractive indices, such as the surface between the ocean and air, may be exploited as lensing elements for imaging targets on either side of the interface with enhanced angular resolution. I present the theoretical developments behind Fluid Lensing and experimental results from its recent implementation for the Reactive Reefs project to image shallow reef ecosystems at cm scales. Preliminary results from petabyte-scale aerial survey efforts using Fluid Lensing to image at-risk coral reefs in American Samoa (August 2013) show broad applicability to large-scale automated species identification, morphology studies, and reef ecosystem characterization for shallow marine environments and terrestrial biospheres, of crucial importance to understanding climate change's impact on coastal zones, global oxygen production, and carbon sequestration.
High-resolution hybrid simulations of turbulence from inertial to sub-proton scales
NASA Astrophysics Data System (ADS)
Franci, Luca; Hellinger, Petr; Landi, Simone; Matteini, Lorenzo; Verdini, Andrea
2015-04-01
We investigate properties of turbulence from MHD scales to ion scales by means of two-dimensional, large-scale, high-resolution hybrid particle-in-cell simulations, which to our knowledge constitute the most accurate hybrid simulations of ion-scale turbulence presented so far. We impose an initial ambient magnetic field perpendicular to the simulation box, and we add a spectrum of large-scale, linearly polarized Alfvén waves, balanced and Alfvénically equipartitioned on average. When turbulence is fully developed, we observe an inertial range in which the power spectrum of perpendicular magnetic field fluctuations follows a Kolmogorov law with spectral index close to -5/3, while the proton bulk velocity fluctuations exhibit a shallower slope with index close to -3/2. Both these trends hold over a full decade. A definite transition is observed at a scale of the order of the proton inertial length, below which both spectra steepen, with the perpendicular magnetic field still exhibiting a power law with spectral index of about -3 over another full decade. The spectrum of perpendicular electric fluctuations follows that of the proton bulk velocity at MHD scales and reaches a sort of plateau at small scales. The turbulent nature of our data is also supported by the presence of intermittency, revealed by the non-Gaussianity of the probability distribution functions of MHD primitive variables, which increases as kinetic scales are approached. All these features are in good agreement with solar wind observations.
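The -5/3 and -3/2 spectral indices quoted above are typically estimated by a least-squares fit of log power versus log wavenumber over the inertial range. A minimal sketch of such a fit, using a synthetic Kolmogorov-like spectrum rather than the simulation data:

```python
import numpy as np

def spectral_index(k, power, k_min, k_max):
    """Least-squares slope of log10(power) vs log10(k) over [k_min, k_max]."""
    mask = (k >= k_min) & (k <= k_max)
    slope, _intercept = np.polyfit(np.log10(k[mask]), np.log10(power[mask]), 1)
    return slope

# Synthetic spectrum with a Kolmogorov-like inertial range (index -5/3)
k = np.logspace(-2, 1, 400)
power = k ** (-5.0 / 3.0)
print(round(spectral_index(k, power, 0.01, 0.1), 2))  # → -1.67
```

The same fit applied over a decade below the spectral break would recover the steeper index near -3 reported for the kinetic range.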
Continuous data assimilation for downscaling large-footprint soil moisture retrievals
NASA Astrophysics Data System (ADS)
Altaf, Muhammad U.; Jana, Raghavendra B.; Hoteit, Ibrahim; McCabe, Matthew F.
2016-10-01
Soil moisture is a key component of the hydrologic cycle, influencing processes leading to runoff generation, infiltration and groundwater recharge, evaporation, and transpiration. Generally, the measurement scale for soil moisture differs from the modeling scales of these processes. Reducing this mismatch between observation and model scales is necessary for improved hydrological modeling. An innovative approach to downscaling coarse-resolution soil moisture data by combining continuous data assimilation and physically based modeling is presented. In this approach, we exploit the features of Continuous Data Assimilation (CDA), which was initially designed for general dissipative dynamical systems and later tested numerically on the incompressible Navier-Stokes equations and the Bénard equation. A nudging term, estimated as the misfit between interpolants of the assimilated coarse-grid measurements and the fine-grid model solution, is added to the model equations to constrain the model's large-scale variability by the available measurements. Soil moisture fields generated at a fine resolution by a physically based vadose zone model (HYDRUS) are subjected to data assimilation conditioned upon coarse-resolution observations. This nudges the model outputs towards values that honor the coarse-resolution dynamics while still being generated at the fine scale. Results show that the approach is able to generate fine-scale soil moisture fields across large extents based on coarse-scale observations. A likely application of this approach is the generation of fine- and intermediate-resolution soil moisture fields conditioned on the radiometer-based, coarse-resolution products from remote sensing satellites.
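The CDA nudging term described above can be illustrated on a toy model: here a 1-D diffusion equation replaces the vadose zone physics, and the misfit between a coarse-grid interpolant of the observations and the fine-grid state is relaxed at a rate mu. The model, grid, and all parameter values are illustrative stand-ins, not those of HYDRUS or the paper:

```python
import numpy as np

def cda_step(u, u_obs_coarse, dt, mu, kappa, dx, stride):
    """One explicit step of a 1-D diffusion model with a CDA nudging term.

    The nudging term -mu * (u - I_h(u_obs)) constrains the large scales of
    the fine-grid state toward the coarse observations (hypothetical setup).
    """
    # Diffusion with periodic boundaries
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    # Interpolant of the coarse-grid observations on the fine grid
    x_fine = np.arange(u.size) * dx
    x_coarse = x_fine[::stride]
    obs_interp = np.interp(x_fine, x_coarse, u_obs_coarse, period=u.size * dx)
    return u + dt * (kappa * lap - mu * (u - obs_interp))

n, dx, dt = 128, 1.0, 0.1
truth = np.sin(2 * np.pi * np.arange(n) / n)   # "true" fine-scale field
u = np.zeros(n)                                 # model starts from rest
for _ in range(2000):
    u = cda_step(u, truth[::8], dt, mu=0.5, kappa=0.2, dx=dx, stride=8)
print(np.max(np.abs(u - truth)) < 0.05)  # nudged state tracks the truth
```

The residual error here comes from the linear interpolation of the coarse observations, mirroring how the real scheme honors coarse-scale dynamics while the fine-scale detail is supplied by the model.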
Supporting observation campaigns with high resolution modeling
NASA Astrophysics Data System (ADS)
Klocke, Daniel; Brueck, Matthias; Voigt, Aiko
2017-04-01
High-resolution simulation in support of measurement campaigns offers a promising and emerging way to create large-scale context for small-scale observations of clouds and precipitation processes. As these simulations couple measured small-scale processes with the circulation, they also help to integrate the modeling and observational research communities and allow for detailed model evaluation against dedicated observations. In connection with the measurement campaign NARVAL (August 2016 and December 2013), simulations with a grid spacing of 2.5 km for the tropical Atlantic region (9000x3300 km), with local refinement to 1.2 km for the western part of the domain, were performed using the icosahedral non-hydrostatic (ICON) general circulation model. These simulations are in turn used to drive large-eddy-resolving simulations with the same model for selected days in the High Definition Clouds and Precipitation for advancing Climate Prediction (HD(CP)2) project. The simulations are presented with a focus on selected results showing the benefit for the scientific communities engaged in atmospheric measurements and in numerical modeling of climate and weather. Additionally, an outlook is given on how similar simulations will support the NAWDEX measurement campaign in the North Atlantic and the AC3 measurement campaign in the Arctic.
Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics
NASA Astrophysics Data System (ADS)
Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.
2016-12-01
The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop Distributed File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements, from traditional TeraSort and WordCount to large-scale climate analytical operations on NetCDF data.
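The MapReduce pattern underlying such analytics can be sketched without Hadoop or Spark: a map step emits per-chunk partial statistics (e.g., for each time slice read from a NetCDF variable) and a reduce step merges them into a global result. A minimal, purely illustrative Python sketch with synthetic data:

```python
from functools import reduce
import numpy as np

def map_chunk(chunk):
    """Map: per-chunk partial sum and count (e.g., one NetCDF time slice)."""
    return chunk.sum(), chunk.size

def reduce_partials(a, b):
    """Reduce: merge two partial (sum, count) pairs."""
    return a[0] + b[0], a[1] + b[1]

# Stand-in for slices read from a NetCDF variable (hypothetical data)
chunks = [np.full((4, 4), t, dtype=float) for t in range(10)]
total, count = reduce(reduce_partials, map(map_chunk, chunks))
print(total / count)  # → 4.5
```

Because partials are merged pairwise, the reduce step can run on whichever storage server holds each chunk, which is the locality benefit the DASS connectors provide.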
A framework for global river flood risk assessment
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.
2012-04-01
There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) International Financing Institutes and Disaster Management Agencies to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse-resolution hazard probability distributions, derived from global hydrological model runs (typical scale about 0.5 degree resolution), with high-resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use, and economic value generally vary strongly in space and time. The framework therefore estimates hazard at a high resolution (~1 km2) by using (a) global forcing data sets of the current (or, in scenario mode, future) climate; (b) a global hydrological model; (c) a global flood routing model; and (d), importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models, and the sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model. 
Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm applied to a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show the sensitivity of the estimated risks to the use of different climate input datasets, to decisions made in the downscaling algorithm, and to different approaches for establishing distributed estimates of GDP and asset exposure to flooding.
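The risk indicator the framework computes, combining a hazard probability distribution with an impact model, amounts to integrating damage over exceedance probability, i.e., the expected annual damage. A hedged sketch with invented damages, not values from the paper:

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Expected annual damage: trapezoidal integration of damage over the
    exceedance probabilities p = 1/T of each flood return period T."""
    p = 1.0 / np.asarray(return_periods, dtype=float)  # descending in p
    d = np.asarray(damages, dtype=float)
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * (p[:-1] - p[1:])))

# Hypothetical damages (million USD) for 2- to 1000-year floods
rp = [2, 5, 10, 50, 100, 500, 1000]
dmg = [0, 10, 30, 120, 200, 400, 500]
print(expected_annual_damage(rp, dmg))
```

Evaluating this integral per grid cell with downscaled hazard and gridded exposure yields the distributed risk maps the framework aims at.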
LAI inversion algorithm based on directional reflectance kernels.
Tang, S; Chen, J M; Zhu, Q; Li, X; Chen, M; Sun, R; Zhou, Y; Deng, F; Xie, D
2007-11-01
Leaf area index (LAI) is an important ecological and environmental parameter. A new LAI algorithm is developed using the principles of ground LAI measurement based on canopy gap fraction. First, the relationship between LAI and gap fraction at various zenith angles is derived from the definition of LAI. Then, the directional gap fraction is acquired from a remote sensing bidirectional reflectance distribution function (BRDF) product, using a kernel-driven model and a large-scale directional gap fraction algorithm. The algorithm has been applied to estimate the LAI distribution in China in mid-July 2002. Ground data acquired from two field experiments, in Changbai Mountain and Qilian Mountain, were used to validate the algorithm. To resolve the scale discrepancy between high-resolution ground observations and low-resolution remote sensing data, two TM images with a resolution approaching the size of the ground plots were used to relate the coarse-resolution LAI map to ground measurements. First, an empirical relationship between the measured LAI and a vegetation index was established. Next, a high-resolution LAI map was generated using this relationship. The LAI value of a low-resolution pixel was then calculated as the area-weighted sum of the high-resolution LAIs composing that pixel. The results of this comparison showed that the inversion algorithm has an accuracy of 82%. Factors that may influence the accuracy are also discussed in this paper.
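The LAI-gap fraction relationship such algorithms start from is commonly written as the Beer-Lambert law P(θ) = exp(-G(θ)·L/cos θ), which inverts to L = -ln P(θ)·cos θ/G(θ). A minimal sketch of this inversion (the paper's kernel-driven formulation is more involved; G = 0.5, assuming a spherical leaf-angle distribution, and the numbers are illustrative):

```python
import math

def lai_from_gap_fraction(gap_fraction, theta_deg, G=0.5):
    """Invert the Beer-Lambert gap-fraction law P(theta) = exp(-G*L/cos(theta)).

    G is the leaf projection coefficient (0.5 for a spherical leaf-angle
    distribution); inputs here are illustrative, not from the paper.
    """
    theta = math.radians(theta_deg)
    return -math.log(gap_fraction) * math.cos(theta) / G

# A 20% directional gap fraction at a 30-degree view zenith angle
print(round(lai_from_gap_fraction(0.2, 30.0), 2))  # → 2.79
```

In the paper's setting, the directional gap fraction fed into such an inversion comes from the BRDF product rather than from ground-based measurements.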
High-resolution RCMs as pioneers for future GCMs
NASA Astrophysics Data System (ADS)
Schar, C.; Ban, N.; Arteaga, A.; Charpilloz, C.; Di Girolamo, S.; Fuhrer, O.; Hoefler, T.; Leutwyler, D.; Lüthi, D.; Piaget, N.; Ruedisuehli, S.; Schlemmer, L.; Schulthess, T. C.; Wernli, H.
2017-12-01
Currently large efforts are underway to refine the horizontal resolution of global and regional climate models to O(1 km), with the intent to represent convective clouds explicitly rather than using semi-empirical parameterizations. This refinement will move the governing equations closer to first principles and is expected to reduce the uncertainties of climate models. High resolution is particularly attractive in order to better represent critical cloud feedback processes (e.g. related to global climate sensitivity and extratropical summer convection) and extreme events (such as heavy precipitation events, floods, and hurricanes). The presentation will be illustrated using decade-long simulations at 2 km horizontal grid spacing, some of these covering the European continent on a computational mesh with 1536x1536x60 grid points. To accomplish such simulations, use is made of emerging heterogeneous supercomputing architectures, using a version of the COSMO limited-area weather and climate model that is able to run entirely on GPUs. Results show that kilometer-scale resolution dramatically improves the simulation of precipitation in terms of the diurnal cycle and short-term extremes. The modeling framework is used to address changes of precipitation scaling with climate change. It is argued that already today, modern supercomputers would in principle enable global atmospheric convection-resolving climate simulations, provided appropriately refactored codes were available, and provided solutions were found to cope with the rapidly growing output volume. A discussion will be provided of key challenges affecting the design of future high-resolution climate models. It is suggested that km-scale RCMs should be exploited to pioneer this terrain, at a time when GCMs are not yet available at such resolutions. 
Areas of interest include the development of new parameterization schemes adequate for km-scale resolution, the exploration of new validation methodologies and data sets, the assessment of regional-scale climate feedback processes, and the development of alternative output analysis methodologies.
Computational study of 3-D hot-spot initiation in shocked insensitive high-explosive
NASA Astrophysics Data System (ADS)
Najjar, F. M.; Howard, W. M.; Fried, L. E.; Manaa, M. R.; Nichols, A., III; Levesque, G.
2012-03-01
High-explosive (HE) material consists of large grains with micron-sized embedded impurities and pores. Under various mechanical/thermal insults, these pores collapse, generating high-temperature regions that lead to ignition. A hydrodynamic study has been performed to investigate the mechanisms of pore collapse and hot-spot initiation in TATB crystals, employing a multiphysics code, ALE3D, coupled to the chemistry module Cheetah. This computational study includes reactive dynamics. Two-dimensional, high-resolution, large-scale mesoscale simulations have been performed, and the parameter space is systematically studied by considering various shock strengths, pore diameters, and multiple pore configurations. Preliminary 3-D simulations are undertaken to quantify the 3-D dynamics.
Obscuring and Feeding Supermassive Black Holes with Evolving Nuclear Star Clusters
NASA Astrophysics Data System (ADS)
Schartmann, M.; Burkert, A.; Krause, M.; Camenzind, M.; Meisenheimer, K.; Davies, R. I.
2010-05-01
Recently, high-resolution observations made with the help of the near-infrared adaptive optics integral field spectrograph SINFONI at the VLT proved the existence of massive and young nuclear star clusters in the centers of a sample of Seyfert galaxies. With the help of high-resolution hydrodynamical simulations with the pluto code, we follow the evolution of such clusters, especially focusing on mass and energy feedback from young stars. This leads to a filamentary inflow of gas on large scales (tens of parsecs), whereas a turbulent and very dense disk builds up on the parsec scale. Here we concentrate on the long-term evolution of the nuclear disk in NGC 1068 with the help of an effective viscous disk model, using the mass input from the large-scale simulations and accounting for star formation in the disk. This two-stage modeling enables us to connect the tens-of-parsecs scale region (observable with SINFONI) with the parsec-scale environment (MIDI observations). At the current age of the nuclear star cluster, our simulations predict disk sizes of order 0.8 to 0.9 pc, gas masses of order 10^6 M⊙, and mass transfer rates through the inner boundary of order 0.025 M⊙ yr^-1, in good agreement with values derived from observations.
Wen, C; Ma, Y J
2018-03-01
The determination of atomic structures and further quantitative information such as chemical compositions at atomic scale for semiconductor defects or heteroepitaxial interfaces can provide direct evidence to understand their formation, modification, and/or effects on the properties of semiconductor films. The commonly used method, high-resolution transmission electron microscopy (HRTEM), suffers from difficulty in acquiring images that correctly show the crystal structure at atomic resolution, because of the limitation in microscope resolution or deviation from the Scherzer-defocus conditions. In this study, an image processing method, image deconvolution, was used to achieve atomic-resolution (∼1.0 Å) structure images of small lattice-mismatch (∼1.0%) AlN/6H-SiC (0001) and large lattice-mismatch (∼8.5%) AlSb/GaAs (001) heteroepitaxial interfaces using simulated HRTEM images of a conventional 300-kV field-emission-gun transmission electron microscope under non-Scherzer-defocus conditions. Then, atomic-scale chemical compositions at the interface were determined for the atomic intermixing and Lomer dislocation with an atomic step by analyzing the deconvoluted image contrast. Furthermore, the effect of dynamical scattering on contrast analysis was also evaluated for differently weighted atomic columns in the compositions.
4D electron microscopy: principles and applications.
Flannigan, David J; Zewail, Ahmed H
2012-10-16
The transmission electron microscope (TEM) is a powerful tool enabling the visualization of atoms with length scales smaller than the Bohr radius at a factor of only 20 larger than the relativistic electron wavelength of 2.5 pm at 200 keV. The ability to visualize matter at these scales in a TEM is largely due to the efforts made in correcting for the imperfections in the lens systems which introduce aberrations and ultimately limit the achievable spatial resolution. In addition to the progress made in increasing the spatial resolution, the TEM has become an all-in-one characterization tool. Indeed, most of the properties of a material can be directly mapped in the TEM, including the composition, structure, bonding, morphology, and defects. The scope of applications spans essentially all of the physical sciences and includes biology. Until recently, however, high resolution visualization of structural changes occurring on sub-millisecond time scales was not possible. In order to reach the ultrashort temporal domain within which fundamental atomic motions take place, while simultaneously retaining high spatial resolution, an entirely new approach from that of millisecond-limited TEM cameras had to be conceived. As shown below, the approach is also different from that of nanosecond-limited TEM, whose resolution cannot offer the ultrafast regimes of dynamics. For this reason "ultrafast electron microscopy" is reserved for the field which is concerned with femtosecond to picosecond resolution capability of structural dynamics. In conventional TEMs, electrons are produced by heating a source or by applying a strong extraction field. Both methods result in the stochastic emission of electrons, with no control over temporal spacing or relative arrival time at the specimen. The timing issue can be overcome by exploiting the photoelectric effect and using pulsed lasers to generate precisely timed electron packets of ultrashort duration. 
The spatial and temporal resolutions achievable with short intense pulses containing a large number of electrons, however, are limited to tens of nanometers and nanoseconds, respectively. This is because Coulomb repulsion is significant in such a pulse, and the electrons spread in space and time, thus limiting the beam coherence. It is therefore not possible to image the ultrafast elementary dynamics of complex transformations. The challenge was to retain the high spatial resolution of a conventional TEM while simultaneously enabling the temporal resolution required to visualize atomic-scale motions. In this Account, we discuss the development of four-dimensional ultrafast electron microscopy (4D UEM) and summarize techniques and applications that illustrate the power of the approach. In UEM, images are obtained either stroboscopically with coherent single-electron packets or with a single electron bunch. Coulomb repulsion is absent under the single-electron condition, thus permitting imaging, diffraction, and spectroscopy, all with high spatiotemporal resolution, the atomic scale (sub-nanometer and femtosecond). The time resolution is limited only by the laser pulse duration and energy carried by the electron packets; the CCD camera has no bearing on the temporal resolution. In the regime of single pulses of electrons, the temporal resolution of picoseconds can be attained when hundreds of electrons are in the bunch. The applications given here are selected to highlight phenomena of different length and time scales, from atomic motions during structural dynamics to phase transitions and nanomechanical oscillations. We conclude with a brief discussion of emerging methods, which include scanning ultrafast electron microscopy (S-UEM), scanning transmission ultrafast electron microscopy (ST-UEM) with convergent beams, and time-resolved imaging of biological structures at ambient conditions with environmental cells.
NASA Astrophysics Data System (ADS)
Huisman, J. A.; Brogi, C.; Pätzold, S.; Weihermueller, L.; von Hebel, C.; Van Der Kruk, J.; Vereecken, H.
2017-12-01
Subsurface structures of the vadose zone can play a key role in crop yield potential, especially during water stress periods. Geophysical techniques like electromagnetic induction (EMI) can provide information about dominant shallow subsurface features. However, previous studies with EMI have typically not reached beyond the field scale. We used high-resolution, large-scale, multi-configuration EMI measurements to characterize patterns of soil structural organization (layering and texture) and their impact on crop productivity at the km2 scale. We collected EMI data over an agricultural area of 1 km2 (102 ha) near Selhausen (NRW, Germany). The area consists of 51 agricultural fields cropped in rotation; measurements were therefore collected between April and December 2016, preferably within a few days after harvest. EMI data were automatically filtered, temperature corrected, and interpolated onto a common grid of 1 m resolution. Inspecting the ECa maps, we identified three main sub-areas with different subsurface heterogeneity. We also identified small-scale geomorphological structures as well as signatures of anthropogenic activities such as soil management and buried drainage networks. To identify areas with similar subsurface structures, we applied image classification techniques: we fused the ECa maps obtained with different coil distances into a multiband image and applied supervised and unsupervised classification methodologies. Both showed good results in reconstructing the observed patterns in plant productivity and the subsurface structures associated with them; however, the supervised methodology proved more efficient in classifying the whole study area. In a second step, we selected one hundred locations within the study area and obtained a soil profile description with the type, depth, and thickness of the soil horizons. Using this ground truth data, it was possible to assign a typical soil profile to each of the main classes obtained from the classification. 
The proposed methodology was effective in producing a high resolution subsurface model in a large and complex study area that extends well beyond the field scale.
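The unsupervised branch of such a classification can be sketched with a minimal k-means over pixels whose bands are ECa values from different coil distances. The data below are synthetic stand-ins for the fused multiband image, not the Selhausen measurements:

```python
import numpy as np

def kmeans(pixels, k, n_iter=20, seed=0):
    """Minimal k-means for stacked ECa coil-distance bands (one row per pixel)."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(n_iter):
        # Assign each pixel to the nearest cluster center
        labels = np.argmin(np.linalg.norm(pixels[:, None] - centers, axis=2), axis=1)
        # Recompute centers as the mean of their assigned pixels
        centers = np.array([pixels[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Two synthetic "sub-areas" with distinct ECa signatures across 3 coil bands
rng = np.random.default_rng(1)
a = rng.normal([10, 12, 15], 0.5, (50, 3))
b = rng.normal([30, 28, 25], 0.5, (50, 3))
labels = kmeans(np.vstack([a, b]), k=2)
print(len(set(labels[:50])), len(set(labels[50:])))  # well-separated blobs each map to one class
```

A supervised counterpart would instead train on pixels labeled from the one hundred ground-truth soil profiles, which is the step the study found more efficient for the full area.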
The Price of Precision: Large-Scale Mapping of Forest Structure and Biomass Using Airborne Lidar
NASA Astrophysics Data System (ADS)
Dubayah, R.
2015-12-01
Lidar remote sensing provides one of the best means for acquiring detailed information on forest structure. However, its application over large areas has been limited largely because of its expense. Nonetheless, extant data exist over many states in the U.S., funded largely by state and federal consortia and mainly for infrastructure, emergency response, flood plain and coastal mapping. These lidar data are almost always acquired in leaf-off seasons, and until recently, usually with low point count densities. Even with these limitations, they provide unprecedented wall-to-wall mappings that enable development of appropriate methodologies for large-scale deployment of lidar. In this talk we summarize our research and lessons learned in deriving forest structure over regional areas as part of NASA's Carbon Monitoring System (CMS). We focus on two areas: the entire state of Maryland and Sonoma County, California. The Maryland effort used low density, leaf-off data acquired by each county in varying epochs, while the on-going Sonoma work employs state-of-the-art, high density, wall-to-wall, leaf-on lidar data. In each area we combine these lidar coverages with high-resolution multispectral imagery from the National Agricultural Imagery Program (NAIP) and in situ plot data to produce maps of canopy height, tree cover and biomass, and compare our results against FIA plot data and national biomass maps. Our work demonstrates that large-scale mapping of forest structure at high spatial resolution is achievable but products may be complex to produce and validate over large areas. Furthermore, fundamental issues involving statistical approaches, plot types and sizes, geolocation, modeling scales, allometry, and even the definitions of "forest" and "non-forest" must be approached carefully. 
Ultimately, determining the "price of precision", that is, whether the value of wall-to-wall forest structure data justifies its expense, should consider not only carbon market applications but also the other ways the underlying lidar data may be used.
NASA Astrophysics Data System (ADS)
Noh, S. J.; Kim, S.; Habibi, H.; Seo, D. J.; Welles, E.; Philips, B.; Adams, E.; Smith, M. B.; Wells, E.
2017-12-01
With the development of the National Water Model (NWM), the NWS has made a step-change advance in operational water forecasting by enabling high-resolution hydrologic modeling across the US. As a part of a separate initiative to enhance flash flood forecasting and inundation mapping capacity, the NWS has been mandated to provide forecasts at even finer spatiotemporal resolutions when and where such information is demanded. In this presentation, we describe implementation of the NWM at a hyper resolution over a nested domain. We use WRF-Hydro as the core model but at significantly higher resolutions with scale-commensurate model parameters. The demonstration domain is multiple urban catchments within the Cities of Arlington and Grand Prairie in the Dallas-Fort Worth Metroplex. This area is susceptible to urban flooding due to the hydroclimatology coupled with large impervious cover. The nested model is based on hyper-resolution terrain data to resolve significant land surface features such as streets and large man-made structures, and forced by the high-resolution radar-based quantitative precipitation information. In this presentation, we summarize progress and preliminary results and share issues and challenges.
NASA Technical Reports Server (NTRS)
Schubert, Siegfried; Kang, In-Sik; Reale, Oreste
2009-01-01
This talk gives an update on the progress and further plans of a coordinated project to carry out and analyze high-resolution simulations of tropical storm activity with a number of state-of-the-art global climate models. Issues addressed include the mechanisms by which SSTs control tropical storm activity on inter-annual and longer time scales, the modulation of that activity by the Madden-Julian Oscillation on sub-seasonal time scales, and the sensitivity of the results to model formulation. The project also encourages companion coarser-resolution runs to help assess resolution dependence and the ability of the models to capture the large-scale and long-term changes in the parameters important for hurricane development. Addressing the above science questions is critical to understanding the nature of the variability of the Asian-Australian monsoon and its regional impacts, and thus CLIVAR RAMP fully endorses the proposed tropical storm simulation activity. The project is open to all interested organizations and investigators, and the results from the runs will be shared among the participants as well as made available to the broader scientific community for analysis.
Soil organic carbon - a large scale paired catchment assessment
NASA Astrophysics Data System (ADS)
Kunkel, V.; Hancock, G. R.; Wells, T.
2016-12-01
Soil organic carbon (SOC) concentration can vary both spatially and temporally, driven by differences in soil properties, topography and climate. However, most studies have focused on point-scale data sets, with a paucity of studies examining larger-scale catchments. Here we examine the spatial and temporal distribution of SOC for two large catchments, the Krui (575 km2) and Merriwa River (675 km2) catchments (New South Wales, Australia), which have similar shape, soils, topography and orientation. We show that the SOC distribution is very similar for both catchments and that elevation (and the associated increase in soil moisture) is a major influence on SOC. We also show that there is little change in SOC from the initial assessment in 2006 to 2015, despite a major drought from 2003 to 2010 and extreme rainfall events in 2007 and 2010; SOC concentration therefore appears robust. However, we found significant relationships between erosion and deposition patterns (as quantified using 137Cs) and SOC for both catchments, again demonstrating a strong geomorphic relationship. Vegetation across the catchments was assessed using remote sensing (Landsat and MODIS). Vegetation patterns were temporally consistent, with above-ground biomass increasing with elevation. SOC could be predicted using both the low- and high-resolution remote sensing platforms. Results indicate that, although moderate resolution (250 m) allows reasonable prediction of the spatial distribution of SOC, the higher resolution (30 m) improved the strength of the SOC-NDVI relationship. The relationship between SOC and 137Cs, as a surrogate for the erosion and deposition of SOC, suggests that sediment transport and deposition influence the distribution of SOC within the catchment. The findings demonstrate that, at the large catchment scale and at the decadal time scale, SOC is relatively constant and can largely be predicted from topography.
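The SOC-NDVI prediction described above is essentially a regression problem. A minimal sketch with synthetic data follows; all coefficients and noise levels are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic transect: NDVI increases with elevation, SOC tracks NDVI (plus noise).
elevation = rng.uniform(200.0, 600.0, 100)               # m
ndvi = 0.2 + 0.001 * elevation + 0.02 * rng.standard_normal(100)
soc = 1.0 + 8.0 * ndvi + 0.3 * rng.standard_normal(100)  # % SOC, hypothetical units

# Fit SOC as a linear function of NDVI and measure the strength of the relationship.
slope, intercept = np.polyfit(ndvi, soc, 1)
r = np.corrcoef(ndvi, soc)[0, 1]
```

A coarser remote-sensing resolution would effectively average NDVI over larger pixels, weakening `r`, which mirrors the 250 m versus 30 m comparison in the abstract.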
Three-Dimensional Terahertz Coded-Aperture Imaging Based on Single Input Multiple Output Technology.
Chen, Shuo; Luo, Chenggao; Deng, Bin; Wang, Hongqiang; Cheng, Yongqiang; Zhuang, Zhaowen
2018-01-19
As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, staring imaging by producing spatiotemporally independent signals with coded apertures. In this paper, we propose a three-dimensional (3D) TCAI architecture based on single-input multiple-output (SIMO) technology, which can sharply reduce the coding and sampling times. The coded aperture applied in the proposed TCAI architecture loads either a purposive or a random phase-modulation factor. In the transmitting process, the purposive phase-modulation factor drives the terahertz beam to scan the divided 3D imaging cells. In the receiving process, the random phase-modulation factor modulates the terahertz wave to be spatiotemporally independent, for high resolution. Considering human-scale targets, images of each 3D imaging cell are reconstructed one by one to decompose the global computational complexity, and are then synthesized to obtain the complete high-resolution image. For each imaging cell, the multi-resolution imaging method helps reduce the computational burden of a large-scale reference-signal matrix. The experimental results demonstrate that the proposed architecture can achieve high-resolution imaging of 3D targets in much less time, and has great potential in applications such as security screening, nondestructive detection, and medical diagnosis.
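The cell-by-cell reconstruction step described above amounts to solving a linear inverse problem against a reference-signal matrix. A minimal least-squares sketch follows; the sizes, the random Gaussian matrix, and the sparse scene are illustrative assumptions, not the paper's actual signal model.

```python
import numpy as np

rng = np.random.default_rng(3)

n_meas, n_cells = 120, 64                    # measurements vs. scatterers in one imaging cell
S = rng.standard_normal((n_meas, n_cells))   # stand-in for the reference-signal matrix

# Hypothetical scene: a few reflective scatterers inside the cell.
x_true = np.zeros(n_cells)
x_true[rng.choice(n_cells, 5, replace=False)] = 1.0
y = S @ x_true                               # simulated echo measurements

# Overdetermined, spatiotemporally independent measurements: least squares recovers the cell.
x_hat, *_ = np.linalg.lstsq(S, y, rcond=None)
```

Decomposing the target into cells keeps each `S` small; solving one global system for the full human-scale scene would require a far larger reference-signal matrix, which is the computational burden the multi-resolution method avoids.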
Large-scale electrophysiology: acquisition, compression, encryption, and storage of big data.
Brinkmann, Benjamin H; Bower, Mark R; Stengel, Keith A; Worrell, Gregory A; Stead, Matt
2009-05-30
The use of large-scale electrophysiology to obtain high-spatiotemporal-resolution brain recordings (>100 channels), capable of probing the range of neural activity from local field potential oscillations to single-neuron action potentials, presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery, using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32 kHz per channel with 18-bit A/D resolution, can resolve extracellular voltages spanning single-neuron action potentials, high-frequency oscillations, and high-amplitude ultra-slow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real time, in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range-encoded differences, a 32-bit cyclic redundancy checksum to ensure data integrity, and 128-bit encryption for protection of patient information.
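As a sanity check on the quoted data rate, the abstract's numbers multiply out as follows (a simple back-of-envelope calculation, assuming all 320 channels run continuously at the full 32 kHz):

```python
# 320 channels x 32 kHz sampling x 4 bytes per sample, sustained for a day.
channels = 320
rate_hz = 32_000
bytes_per_sample = 4

bytes_per_second = channels * rate_hz * bytes_per_sample      # ~41 MB/s sustained
terabytes_per_day = bytes_per_second * 86_400 / 1e12

print(f"{terabytes_per_day:.2f} TB/day")  # ≈ 3.54 TB/day, consistent with the ~3 TB figure
```

At this rate, even modest lossless compression ratios translate into terabytes saved per day, which is why the real-time, appendable compression format matters.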
SOMAR-LES: A framework for multi-scale modeling of turbulent stratified oceanic flows
NASA Astrophysics Data System (ADS)
Chalamalla, Vamsi K.; Santilli, Edward; Scotti, Alberto; Jalali, Masoud; Sarkar, Sutanu
2017-12-01
A new multi-scale modeling technique, SOMAR-LES, is presented in this paper. Localized grid refinement gives SOMAR (the Stratified Ocean Model with Adaptive Resolution) access to small scales of the flow which are normally inaccessible to general circulation models (GCMs). SOMAR-LES drives an LES (Large Eddy Simulation) on SOMAR's finest grids, forced with large-scale forcing from the coarser grids. Three-dimensional simulations of internal tide generation, propagation, and scattering are performed to demonstrate this multi-scale modeling technique. In the case of internal tide generation at a two-dimensional bathymetry, SOMAR-LES is able to balance the baroclinic energy budget and accurately model turbulence losses at only 10% of the computational cost required by a non-adaptive solver running at SOMAR-LES's fine-grid resolution. This relative cost is reduced further in situations with intermittent turbulence, or where the location of the turbulence is not known a priori, because SOMAR-LES does not require persistent, global high resolution. To illustrate this point, we consider a three-dimensional bathymetry with grids adaptively refined along the tidally generated internal waves to capture remote mixing in regions of wave focusing. The computational cost in this case is found to be nearly 25 times smaller than that of a non-adaptive solver at comparable resolution. In the final test case, we consider the scattering of a mode-1 internal wave at isolated two-dimensional and three-dimensional topographies, and we compare the results with the numerical experiments of Legg (2014). We find good agreement with theoretical estimates. SOMAR-LES is less dissipative than the closure scheme employed by Legg (2014) near the bathymetry. Depending on the flow configuration and resolution employed, a reduction of more than an order of magnitude in computational cost is expected relative to traditional solvers.
Study of Structure and Small-Scale Fragmentation in TMC-1
NASA Technical Reports Server (NTRS)
Langer, W. D.; Velusamy, T.; Kuiper, T. B.; Levin, S.; Olsen, E.; Migenes, V.
1995-01-01
Large-scale C(sup 18)O maps show that the Taurus molecular cloud 1 (TMC-1) has numerous cores located along a ridge which extends about 12 arcmin by at least 35 arcmin. The cores traced by C(sup 18)O are a few arcminutes (0.1-0.2 pc) in extent, typically contain about 0.5-3 solar masses, and are probably gravitationally bound. We present a detailed study of the small-scale fragmentary structure of one of these cores, called core D, within TMC-1, using very high spectral and spatial resolution maps of CCS and CS. The CCS lines are excellent tracers for investigating the density, temperature, and velocity structure in dense cores. The high spectral resolution (0.008 km/s) data consist mainly of single-dish, Nyquist-sampled maps of CCS at 22 GHz with 45 arcsec spatial resolution, taken with NASA's 70 m DSN antenna at Goldstone. The high spatial resolution spectral line maps were made with the Very Large Array (9 arcsec resolution) at 22 GHz and with the OVRO millimeter array in CCS and CS at 93 GHz and 98 GHz, respectively, with 6 arcsec resolution. These maps are supplemented with single-dish observations of CCS and CC(sup 34)S spectra at 33 GHz using a NASA 34 m DSN antenna, and with CCS (93 GHz), C(sup 34)S (2-1), and C(sup 18)O (1-0) single-dish observations made with the AT&T Bell Laboratories 7 m antenna. Our high spectral and spatial resolution CCS and CS maps show that core D is highly fragmented. The single-dish CCS observations map out several clumps which range in size from approx. 45 arcsec to 90 arcsec (0.03-0.06 pc). These clumps have very narrow intrinsic line widths, 0.11-0.25 km/s, slightly larger than the thermal line width for CCS at 10 K, and masses of about 0.03-0.2 solar mass. Interferometer observations of some of these clumps show that they have considerable additional internal structure, consisting of several condensations ranging in size from approx. 10-30 arcsec (0.007-0.021 pc), also with narrow line widths. The mass of these smallest fragments is of order 0.01 solar mass. 
These small-scale structures traced by CCS appear to be gravitationally unbound by a large factor. Most of these objects have masses that fall below those of the putative proto-brown dwarfs (less than about 0.1 solar mass). The presence of many small gravitationally unbound clumps suggests that fragmentation mechanisms other than a purely Jeans gravitational instability may be important for the dynamics of these cold dense cores.
NASA Astrophysics Data System (ADS)
Aalbers, Emma E.; Lenderink, Geert; van Meijgaard, Erik; van den Hurk, Bart J. J. M.
2018-06-01
High-resolution climate information provided by, e.g., regional climate models (RCMs) is valuable for exploring the changing weather under global warming and assessing the local impact of climate change. While there is generally more confidence in the representativeness of simulated processes at higher resolutions, internal variability of the climate system—'noise', intrinsic to the chaotic nature of atmospheric and oceanic processes—is also larger at smaller spatial scales, limiting the predictability of the climate signal. To quantify the internal variability and robustly estimate the climate signal, large initial-condition ensembles of climate simulations conducted with a single model provide essential information. We analyze a regional downscaling of a 16-member initial-condition ensemble over western Europe and the Alps at 0.11° resolution, similar to the highest-resolution EURO-CORDEX simulations. We examine the strength of the forced climate response (signal) in mean and extreme daily precipitation with respect to noise due to internal variability, and find robust small-scale geographical features in the forced response, indicating regional differences in changes in the probability of events. However, individual ensemble members provide only limited information on the forced climate response, even for high levels of global warming. Although the results are based on a single RCM-GCM chain, we believe that they have general value in providing insight into the fraction of the uncertainty in high-resolution climate information that is irreducible, and can assist in the correct interpretation of fine-scale information in multi-model ensembles in terms of a forced response and noise due to internal variability.
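The signal-versus-noise decomposition described above can be illustrated with a toy 16-member ensemble (the values below are synthetic, not model output): the forced response is estimated by the ensemble mean, and internal variability by the spread across members.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical precipitation change at one grid point for each ensemble member:
# each member = common forced signal + a realization of internal variability.
true_signal, noise_sd = 0.8, 1.0
members = true_signal + noise_sd * rng.standard_normal(16)

signal = members.mean()                  # forced response, estimated by the ensemble mean
noise = members.std(ddof=1)              # internal variability across members
stderr = noise / np.sqrt(len(members))   # uncertainty of the estimated signal

print(f"signal {signal:.2f} +/- {stderr:.2f}, noise {noise:.2f}")
```

With 16 members the standard error of the signal is a quarter of the member-to-member spread, which is why a single member (one realization) provides only limited information about the forced response.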
Low rank approximation methods for MR fingerprinting with large scale dictionaries.
Yang, Mingrui; Ma, Dan; Jiang, Yun; Hamilton, Jesse; Seiberlich, Nicole; Griswold, Mark A; McGivney, Debra
2018-04-01
This work proposes new low-rank approximation approaches with significant memory savings for large-scale MR fingerprinting (MRF) problems. We introduce a compressed MRF with randomized singular value decomposition method to significantly reduce the memory required to calculate a low-rank approximation of large MRF dictionaries. We further relax this requirement by exploiting the structure of MRF dictionaries in the randomized singular value decomposition space and fitting them to low-degree polynomials to generate high-resolution MRF parameter maps. In vivo 1.5T and 3T brain scan data are used to validate the approaches. T1, T2, and off-resonance maps are in good agreement with those of the standard MRF approach. Moreover, the memory savings are up to 1000-fold for the MRF fast imaging with steady-state precession sequence and more than 15-fold for the MRF balanced steady-state free precession sequence. The proposed compressed MRF with randomized singular value decomposition and dictionary-fitting methods are memory-efficient low-rank approximation methods which can benefit the use of MRF in clinical settings. They also have great potential in large-scale MRF problems, such as problems considering multi-component MRF parameters or high resolution in the parameter space. Magn Reson Med 79:2392-2400, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
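The randomized SVD at the heart of the compression step can be sketched in a few lines. This is the generic range-sketch algorithm applied to a toy low-rank "dictionary"; the matrix sizes and rank are illustrative, not those of a real MRF dictionary.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, seed=0):
    """Rank-k approximation of A via a random range sketch."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)              # orthonormal basis for the range of A
    B = Q.T @ A                                 # small projected matrix; only B need be dense
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k, :]

# Toy "dictionary": exactly rank-20 signal evolution matrix.
rng = np.random.default_rng(1)
D = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 300))

U, s, Vt = randomized_svd(D, k=20)
err = np.linalg.norm(D - U @ np.diag(s) @ Vt) / np.linalg.norm(D)
```

The memory saving comes from never forming the full SVD of `D`: only the sketched matrix `B`, with `k + oversample` rows, is decomposed.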
NASA Astrophysics Data System (ADS)
Scholz, L. T.; Bierer, B.; Ortiz Perez, A.; Woellenstein, J.; Sachs, T.; Palzer, S.
2016-12-01
The determination of carbon dioxide (CO2) fluxes between ecosystems and the atmosphere is crucial for understanding ecological processes on regional and global scales. High-quality data sets with full uncertainty estimates are needed to evaluate model simulations. However, current flux monitoring techniques are unsuitable for providing reliable data over a large area at both a detailed level and an appropriate resolution, ideally in combination with a high sampling rate. Currently used sensing technologies, such as non-dispersive infrared (NDIR) gas analyzers, cannot be deployed in large numbers to provide high spatial resolution because of their costs and complex maintenance requirements. Here, we propose a novel CO2 measurement system whose gas sensing unit is made up of only low-cost, low-power components, such as an IR LED and a photoacoustic detector. The sensor offers a resolution of <50 ppm over the concentration range of interest, up to 5000 ppm, and an almost linear, fast response of just a few seconds. Since the sensor can be applied in situ without special precautions, it allows environmental monitoring in a non-invasive way. Its low energy consumption enables long-term measurements, and the low overall cost favors manufacturing in large quantities. This allows the operation of multiple sensors at a reasonable price and thus provides concentration measurements at any desired spatial coverage and at high temporal resolution. With an appropriate 3D configuration of the units, vertical and horizontal fluxes can be determined. By applying a closely meshed wireless sensor network, inhomogeneities as well as CO2 sources and sinks in the lower atmosphere can be monitored. In combination with sensors for temperature, pressure, and humidity, our sensor paves the way toward reliable and extensive monitoring of ecosystem-atmosphere exchange rates. The technique can also be easily adapted to other relevant greenhouse gases.
Higher resolution satellite remote sensing and the impact on image mapping
Watkins, Allen H.; Thormodsgard, June M.
1987-01-01
Recent advances in the spatial, spectral, and temporal resolution of civil land remote sensing satellite data are presenting new opportunities for image mapping applications. The U.S. Geological Survey's experimental satellite image mapping program is evolving toward larger-scale image map products with increased information content as a result of improved image processing techniques and increased resolution. Thematic mapper data are being used to produce experimental image maps at 1:100,000 scale that meet established U.S. and European map accuracy standards. The availability of high-quality, cloud-free, 30-meter ground resolution multispectral data from the Landsat thematic mapper sensor, along with 10-meter ground resolution panchromatic and 20-meter ground resolution multispectral data from the recently launched French SPOT satellite, presents new cartographic and image processing challenges. The need to fully exploit these higher-resolution data increases the complexity of processing the images into large-scale image maps. The removal of radiometric artifacts and noise prior to geometric correction can be accomplished by using a variety of image processing filters and transforms. Sensor modeling and image restoration techniques allow maximum retention of spatial and radiometric information. An optimum combination of spectral information and spatial resolution can be obtained by merging data from different sensor types. These processing techniques are discussed and examples are presented.
Combining points and lines in rectifying satellite images
NASA Astrophysics Data System (ADS)
Elaksher, Ahmed F.
2017-09-01
Rapid advances in remote sensing technology have established the potential to gather accurate and reliable information about the Earth's surface from high-resolution satellite images. Remote sensing satellite images with a pixel size of less than one meter are currently used in large-scale mapping. Rigorous photogrammetric equations are usually used to describe the relationship between image coordinates and ground coordinates. These equations require knowledge of the exterior and interior orientation parameters of the image, which might not be available. On the other hand, the parallel projection transformation can be used to represent the mathematical relationship between the image-space and object-space coordinate systems, and it provides the accuracy required for large-scale mapping using fewer ground control features. This article investigates the differences between point-based and line-based parallel projection transformation models in rectifying satellite images of different resolutions. The point-based parallel projection transformation model and its extended form are presented, and the corresponding line-based forms are developed. Results showed that the RMS errors computed using the point- and line-based transformation models are equivalent and satisfy the requirements for large-scale mapping. The differences between the transformation parameters computed using the point- and line-based models are insignificant. The results also showed a high correlation between differences in ground elevation and the RMS error.
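The point-based parallel projection model is linear in its eight parameters, so it can be estimated from ground control points by ordinary least squares. A sketch with synthetic control points follows; the parameter values and point coordinates are invented for illustration, and real workflows would of course use measured image and ground coordinates.

```python
import numpy as np

def fit_parallel_projection(xy, XYZ):
    """Least-squares fit of the 8-parameter parallel projection model:
       x = a1*X + a2*Y + a3*Z + a4,   y = a5*X + a6*Y + a7*Z + a8."""
    A = np.hstack([XYZ, np.ones((len(XYZ), 1))])       # design matrix [X Y Z 1]
    ax, *_ = np.linalg.lstsq(A, xy[:, 0], rcond=None)
    ay, *_ = np.linalg.lstsq(A, xy[:, 1], rcond=None)
    return ax, ay

# Synthetic check: generate control points from known parameters, then recover them.
rng = np.random.default_rng(2)
XYZ = rng.uniform(0.0, 1000.0, size=(12, 3))            # ground coordinates
true_ax = np.array([0.5, 0.1, 0.02, 30.0])
true_ay = np.array([-0.1, 0.6, 0.01, 12.0])
xy = np.column_stack([XYZ @ true_ax[:3] + true_ax[3],
                      XYZ @ true_ay[:3] + true_ay[3]])  # image coordinates

ax, ay = fit_parallel_projection(xy, XYZ)
```

The line-based forms replace point correspondences with constraints from conjugate straight lines, but the normal equations remain linear in the same eight parameters.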
The Role of Jet Adjustment Processes in Subtropical Dust Storms
NASA Astrophysics Data System (ADS)
Pokharel, Ashok Kumar; Kaplan, Michael L.; Fiedler, Stephanie
2017-11-01
Meso-α/β/γ-scale jet dynamics responsible for generating Harmattan, Saudi Arabian, and Bodélé Depression dust storms are analyzed with observations and high-resolution modeling. The analysis of the role of jet adjustment processes in each dust storm shows the following similarities: (1) the presence of a well-organized baroclinic synoptic-scale system, (2) cross-mountain flows that produced a leeside inversion layer prior to the large-scale dust storm, (3) the presence of thermal wind imbalance in the exit region of the midtropospheric jet streak in the lee of the respective mountains shortly after the inversion formed, (4) dust storm formation accompanied by large-magnitude ageostrophic isallobaric low-level winds as part of the meso-β-scale adjustment process, (5) substantial low-level turbulence kinetic energy (TKE), and (6) emission and uplift of mineral dust in the lee of nearby mountains. The thermally forced meso-γ-scale adjustment processes in the canyons and small valleys may have caused the numerous observed dust streaks, leading to the entry of dust into the atmosphere through significant vertical motion and TKE generation. This study points to the importance of meso-β- to meso-γ-scale adjustment processes at low atmospheric levels, driven by an imbalance within the exit region of an upper-level jet streak, for the formation of severe dust storms. The low-level TKE, one of the prerequisites for deflating dust from the surface, cannot be detected with low-resolution data sets, so our results show that high spatial resolution is required to better represent TKE as a proxy for dust emission.
NASA Astrophysics Data System (ADS)
Hull, Charles L. H.; Girart, Josep M.; Tychoniec, Łukasz; Rao, Ramprasad; Cortés, Paulo C.; Pokhrel, Riwaj; Zhang, Qizhou; Houde, Martin; Dunham, Michael M.; Kristensen, Lars E.; Lai, Shih-Ping; Li, Zhi-Yun; Plambeck, Richard L.
2017-10-01
We present high-angular-resolution dust polarization and molecular line observations carried out with the Atacama Large Millimeter/submillimeter Array (ALMA) toward the Class 0 protostar Serpens SMM1. By complementing these observations with new polarization observations from the Submillimeter Array (SMA) and archival data from the Combined Array for Research in Millimeter-wave Astronomy (CARMA) and the James Clerk Maxwell Telescope (JCMT), we can compare the magnetic field orientations at different spatial scales. We find major changes in the magnetic field orientation between large (~0.1 pc) scales—where the magnetic field is oriented E-W, perpendicular to the major axis of the dusty filament in which SMM1 is embedded—and the intermediate and small scales probed by CARMA (~1000 au resolution), the SMA (~350 au resolution), and ALMA (~140 au resolution). The ALMA maps reveal that the redshifted lobe of the bipolar outflow is shaping the magnetic field in SMM1 on the southeast side of the source; however, on the northwestern side and elsewhere in the source, low-velocity shocks may be causing the observed chaotic magnetic field pattern. High-spatial-resolution continuum and spectral-line observations also reveal a tight (~130 au) protobinary system in SMM1-b, the eastern component of which is launching an extremely high-velocity, one-sided jet visible in both CO (J = 2→1) and SiO (J = 5→4); however, that jet does not appear to be shaping the magnetic field. These observations show that with the sensitivity and resolution of ALMA, we can now begin to understand the role that feedback (e.g., from protostellar outflows) plays in shaping the magnetic field in very young, star-forming sources like SMM1.
Microwave sensing technology issues related to a global change technology architecture trade study
NASA Technical Reports Server (NTRS)
Campbell, Thomas G.; Shiue, Jim; Connolly, Denis; Woo, Ken
1991-01-01
The objective is to enable the development of lighter, less power-consuming, high-resolution microwave sensors operating at frequencies from 1 to 200 GHz. These systems will use large-aperture antenna systems (both reflectors and phased arrays) capable of wide scan angles and high polarization purity, and will utilize sidelobe suppression techniques as required. The success of this technology program will enable high-resolution microwave radiometers in geostationary orbit and lighter, more efficient radar systems in low Earth orbit, and will eliminate mechanical scanning methods, a main source of platform instability in large space systems, to the fullest extent possible. The Global Change Technology Initiative (GCTI) will develop technology that will enable the use of satellite systems for Earth observations on a global scale.
NASA Astrophysics Data System (ADS)
Omrani, Hiba; Drobinski, Philippe; Dubos, Thomas
2010-05-01
In this work, we consider the effect of indiscriminate and spectral nudging on the large and small scales of an idealized model simulation. The model is a two-layer quasi-geostrophic model on the beta-plane, driven at its boundaries by the "global" version with periodic boundary conditions. This setup mimics the configuration used for regional climate modelling. The effect of large-scale nudging is studied using the "perfect model" approach. Two sets of experiments are performed: (1) the effect of nudging is investigated with a "global" high-resolution two-layer quasi-geostrophic model driven by a low-resolution two-layer quasi-geostrophic model; (2) similar simulations are conducted with the two-layer quasi-geostrophic Limited Area Model (LAM), where the size of the LAM domain comes into play in addition to the factors in the first set of simulations. The study shows that the indiscriminate nudging time that minimizes the error at both the large and small scales is close to the predictability time. For spectral nudging, the optimum nudging time should in principle tend to zero, since the best large-scale dynamics is supposed to be given by the driving fields. However, because the driving large-scale fields are generally available at a much lower frequency than the model time step (e.g., 6-hourly analyses), with basic interpolation between fields, the optimum nudging time differs from zero while remaining smaller than the predictability time.
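The spectral nudging idea discussed above, relaxing only the large-scale Fourier modes toward the driving fields, can be sketched as follows. The wavenumber cutoff, time step, and relaxation time are illustrative parameters, and the generic 2-D field stands in for the model's actual prognostic variables.

```python
import numpy as np

def spectral_nudge(field, driver, n_keep, dt, tau):
    """One Newtonian-relaxation step applied only to the large scales:
    Fourier modes with |k| <= n_keep are nudged toward `driver` with time scale tau;
    all smaller scales are left free to evolve."""
    fk = np.fft.fft2(field)
    dk = np.fft.fft2(driver)
    ky = np.fft.fftfreq(field.shape[0]) * field.shape[0]
    kx = np.fft.fftfreq(field.shape[1]) * field.shape[1]
    mask = (np.abs(ky)[:, None] <= n_keep) & (np.abs(kx)[None, :] <= n_keep)
    fk[mask] += (dt / tau) * (dk[mask] - fk[mask])   # nudge large scales only
    return np.real(np.fft.ifft2(fk))
```

Applied every time step, `dt / tau` controls the relaxation strength, and `n_keep` sets the cutoff separating the nudged large scales from the freely evolving small scales; indiscriminate (grid) nudging corresponds to relaxing all modes at once.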
A high-resolution European dataset for hydrologic modeling
NASA Astrophysics Data System (ADS)
Ntegeka, Victor; Salamon, Peter; Gomes, Goncalo; Sint, Hadewij; Lorini, Valerio; Thielen, Jutta
2013-04-01
There is an increasing demand for large-scale hydrological models, not only for modeling the impact of climate change on water resources but also for disaster risk assessments and flood or drought early warning systems. These large-scale models need to be calibrated and verified against large amounts of observations in order to judge their capability to predict the future. However, the creation of large-scale datasets is challenging, for it requires the collection, harmonization, and quality checking of large amounts of observations. For this reason, only a limited number of such datasets exist. In this work, we present a pan-European, high-resolution gridded dataset of meteorological observations (EFAS-Meteo) which was designed to drive a large-scale hydrological model. Similar European and global gridded datasets already exist, such as the HadGHCND (Caesar et al., 2006), the JRC MARS-STAT database (van der Goot and Orlandi, 2003) and the E-OBS gridded dataset (Haylock et al., 2008). However, none of these provides a similarly high spatial resolution and/or a complete set of variables to force a hydrologic model. EFAS-Meteo contains daily maps of precipitation, surface temperature (mean, minimum and maximum), wind speed and vapour pressure at a spatial grid resolution of 5 x 5 km for the period 1 January 1990 - 31 December 2011. It furthermore contains radiation, calculated using a staggered approach depending on the availability of sunshine duration, cloud cover and minimum and maximum temperature, as well as evapotranspiration (potential evapotranspiration, bare soil and open water evapotranspiration). The potential evapotranspiration was calculated using the Penman-Monteith equation with the above-mentioned meteorological variables. The dataset was created as part of the development of the European Flood Awareness System (EFAS) and has been continuously updated in recent years. 
The dataset variables are used as inputs to the hydrological calibration and validation of EFAS, as well as for establishing long-term discharge "proxy" climatologies, which can in turn be used in statistical analyses to derive return periods or other time-series derivatives. In addition, this dataset will be used to assess climatological trends in Europe. Unfortunately, to date no baseline dataset at the European scale exists against which to test the quality of the data presented here, so a comparison against other existing datasets can only be an indication of data quality. Due to availability, a comparison was made for precipitation and temperature only, arguably the most important meteorological drivers for hydrologic models. A variety of analyses was undertaken at the country scale against data reported to EUROSTAT and against the E-OBS dataset. The comparison revealed that while the datasets showed overall similar temporal and spatial patterns, there were some differences in magnitude, especially for precipitation. It is not straightforward to identify the specific cause of these differences; however, in most cases the comparatively low observation station density appears to be the principal reason for the differences in magnitude.
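The abstract names the Penman-Monteith equation for potential evapotranspiration. A sketch of the standard FAO-56 daily form is below; this is the textbook formulation, not necessarily the exact EFAS implementation, and the input values in the comment are invented for illustration.

```python
import math

def fao56_reference_et(t_mean, rn, g, u2, es, ea, pressure=101.3):
    """Daily FAO-56 Penman-Monteith reference evapotranspiration [mm/day].
    t_mean: mean air temperature [degC]; rn, g: net radiation and soil heat
    flux [MJ m-2 day-1]; u2: wind speed at 2 m [m/s]; es, ea: saturation and
    actual vapour pressure [kPa]; pressure: atmospheric pressure [kPa]."""
    svp = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))
    delta = 4098.0 * svp / (t_mean + 237.3) ** 2      # slope of SVP curve [kPa/degC]
    gamma = 0.665e-3 * pressure                       # psychrometric constant [kPa/degC]
    num = (0.408 * delta * (rn - g)
           + gamma * 900.0 / (t_mean + 273.0) * u2 * (es - ea))
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# e.g. fao56_reference_et(20.0, 13.3, 0.0, 2.0, 2.34, 1.4) gives roughly 4.5 mm/day
```

Every input here maps onto an EFAS-Meteo variable: temperature and vapour pressure are stored directly, radiation comes from the staggered calculation, and wind speed is gridded at the same 5 km resolution.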
Large-scale horizontal flows from SOUP observations of solar granulation
NASA Technical Reports Server (NTRS)
November, L. J.; Simon, G. W.; Tarbell, T. D.; Title, A. M.; Ferguson, S. H.
1987-01-01
Using high-resolution time-sequence photographs of solar granulation from the SOUP experiment on Spacelab 2, large-scale horizontal flows were observed at the solar surface. The measurement method is based on a local spatial cross-correlation analysis. The horizontal motions have amplitudes in the range 300 to 1000 m/s. Radial outflow of granulation from a sunspot penumbra into the surrounding photosphere is a striking new discovery. Both the supergranulation pattern and cellular structures on the scale of mesogranulation are seen. The vertical flows inferred by continuity of mass from these observed horizontal flows have larger upflow amplitudes in cell centers than downflow amplitudes at cell boundaries.
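The local spatial cross-correlation idea underlying such flow measurements can be sketched as a brute-force search for the displacement that maximises the correlation between co-located windows of two successive frames. This is a toy illustration on synthetic data, not the SOUP analysis pipeline:

```python
import numpy as np

def local_shift(win0, win1, max_shift=3):
    """Estimate the (dy, dx) pixel shift of win1 relative to win0 by
    brute-force maximisation of the zero-mean cross-correlation."""
    best, best_shift = -np.inf, (0, 0)
    a = win0 - win0.mean()
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Shift win1 back by the candidate displacement and correlate
            b = np.roll(win1, (-dy, -dx), axis=(0, 1))
            b = b - b.mean()
            c = (a * b).sum()
            if c > best:
                best, best_shift = c, (dy, dx)
    return best_shift

# Synthetic test: a granulation-like random pattern shifted by (1, 2) pixels
rng = np.random.default_rng(0)
frame0 = rng.random((32, 32))
frame1 = np.roll(frame0, (1, 2), axis=(0, 1))
shift = local_shift(frame0, frame1)
```

Dividing the displacement by the frame cadence converts the recovered shift into a horizontal velocity; applying the window over a grid of positions yields a flow map.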
Scales of snow depth variability in high elevation rangeland sagebrush
NASA Astrophysics Data System (ADS)
Tedesche, Molly E.; Fassnacht, Steven R.; Meiman, Paul J.
2017-09-01
In high elevation semi-arid rangelands, sagebrush and other shrubs can affect the transport and deposition of wind-blown snow, enabling the formation of snowdrifts. Datasets from three field experiments were used to investigate the scales of spatial variability of snow depth around mountain big sagebrush (Artemisia tridentata Nutt.) at a high elevation plateau rangeland in North Park, Colorado, during the winters of 2002, 2003, and 2008. Data were collected at multiple resolutions (0.05 to 25 m) and extents (2 to 1000 m). Finer-scale data were collected specifically for this study to examine the correlation between snow depth, sagebrush microtopography, the ground surface, and the snow surface, as well as the temporal consistency of snow depth patterns. Variograms were used to identify the spatial structure, and the Moran's I statistic was used to determine the spatial correlation. Results show some temporal consistency in snow depth at several scales. Plot-scale snow depth variability is partly a function of the nature of individual shrubs, as there is some correlation between the spatial structure of snow depth and sagebrush, as well as between the ground and snow depth. The optimal sampling resolution appears to be 25 cm, but over a large area this would require a multitude of samples; a stratified random approach with a fine measurement resolution of 5 cm is therefore recommended.
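As an illustration of the Moran's I statistic used above, here is a minimal sketch for a regular snow-depth grid with rook (4-neighbour) binary weights; the field data in the study were not necessarily gridded, so the regular grid is an assumption for demonstration only:

```python
import numpy as np

def morans_i(grid):
    """Global Moran's I on a 2-D grid using rook (4-neighbour) binary weights."""
    z = grid - grid.mean()
    num = 0.0    # sum over ordered neighbour pairs of w_ij * z_i * z_j
    w_sum = 0.0  # total weight (number of ordered neighbour pairs)
    # Horizontal neighbours (each unordered pair counted twice for symmetry)
    num += 2.0 * (z[:, :-1] * z[:, 1:]).sum()
    w_sum += 2.0 * z[:, :-1].size
    # Vertical neighbours
    num += 2.0 * (z[:-1, :] * z[1:, :]).sum()
    w_sum += 2.0 * z[:-1, :].size
    n = grid.size
    return (n / w_sum) * (num / (z ** 2).sum())

# A smooth depth gradient is strongly spatially autocorrelated: I near +1
depth = np.add.outer(np.arange(10.0), np.arange(10.0))
i_stat = morans_i(depth)
```

Values near +1 indicate clustering (drifts, smooth gradients), values near 0 spatial randomness, and negative values a checkerboard-like alternation.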
NASA Astrophysics Data System (ADS)
Xie, Hongbo; Mao, Chensheng; Ren, Yongjie; Zhu, Jigui; Wang, Chao; Yang, Lei
2017-10-01
In high-precision, large-scale coordinate measurement, one commonly used approach to determine the coordinates of a target point is to exploit the spatial trigonometric relationships between multiple laser transmitter stations and the target point. A light-receiving device at the target point is the key element in large-scale coordinate measurement systems. To ensure high-resolution and highly sensitive spatial coordinate measurement, a high-performance, miniaturized omnidirectional single-point photodetector (OSPD) is greatly desired. We report a design of OSPD using an aspheric lens, which achieves an enhanced reception angle of -5 deg to 45 deg in the vertical and 360 deg in the horizontal. As the heart of our OSPD, the aspheric lens is designed with a geometric model and optimized with the LightTools software, which enables the reflection of a wide-angle incident light beam onto the single-point photodiode. The performance of the home-made OSPD is characterized at working distances from 1 to 13 m and further analyzed using the developed geometric model. The experimental and analytic results verify that our device is highly suitable for large-scale coordinate metrology. The developed device also holds great potential in various applications such as omnidirectional vision sensors, indoor global positioning systems, and optical wireless communication systems.
NASA Astrophysics Data System (ADS)
Linkmann, Moritz; Buzzicotti, Michele; Biferale, Luca
2018-06-01
We provide analytical and numerical results concerning multi-scale correlations between the resolved velocity field and the subgrid-scale (SGS) stress tensor in large eddy simulations (LES). Following previous studies for the Navier-Stokes equations, we derive the exact hierarchy of LES equations governing the spatio-temporal evolution of velocity structure functions of any order. The aim is to assess the influence of the subgrid model on inertial range intermittency. We provide a series of predictions, within the multifractal theory, for the scaling of correlations involving the SGS stress, and we compare them against numerical results from high-resolution Smagorinsky LES and from a priori filtered data generated from direct numerical simulations (DNS). We find that the LES data generally agree very well with the filtered DNS results and with the multifractal prediction for all leading terms in the balance equations. Discrepancies are measured for some of the sub-leading terms involving cross-correlations between resolved velocity increments and the SGS tensor or the SGS energy transfer, suggesting that there is room to improve the SGS modelling to further extend the inertial range properties at any fixed LES resolution.
NASA Astrophysics Data System (ADS)
Lin, S.; Li, J.; Liu, Q.
2018-04-01
Satellite remote sensing data provide spatially continuous and temporally repetitive observations of land surfaces, and they have become increasingly important for monitoring vegetation photosynthetic dynamics over large regions. However, remote sensing data are limited in their spatial and temporal scales: for example, higher-spatial-resolution data such as Landsat have 30-m spatial resolution but a 16-day revisit period, while high-temporal-resolution data such as geostationary imagery have a 30-minute imaging period but lower spatial resolution (> 1 km). The objective of this study is to investigate whether combining high spatial and high temporal resolution remote sensing data can improve the accuracy of gross primary production (GPP) estimation in cropland. For this analysis we used three years (2010 to 2012) of Landsat-based NDVI data, the MOD13 vegetation index product, and Geostationary Operational Environmental Satellite (GOES) data as input parameters to estimate GPP in a small cropland region of Nebraska, US. We then validated the remote-sensing-based GPP against in-situ carbon flux measurements. Results showed that: 1) the GOES visible band explains about half of the variance in in-situ photosynthetically active radiation (PAR) (R2 = 0.52), while the European Centre for Medium-Range Weather Forecasts ERA-Interim reanalysis explains 64% of the PAR variance (R2 = 0.64); 2) estimating GPP with Landsat 30-m spatial resolution data and ERA daily meteorology data has the highest accuracy (R2 = 0.85, RMSE < 3 gC/m2/day), performing better than using the MODIS 1-km NDVI/EVI product as input; 3) using daily meteorology data as input for GPP estimation with high-spatial-resolution data performs better than using 8-day or 16-day inputs. Generally speaking, using high-spatial-resolution, high-frequency satellite remote sensing data can improve the accuracy of GPP estimation in cropland.
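The accuracy metrics quoted above (R2 and RMSE) can be computed as follows. This minimal sketch uses hypothetical GPP values and takes R2 as the coefficient of determination, one common convention; the study may instead report the squared correlation coefficient:

```python
import math

def r2_rmse(obs, pred):
    """Coefficient of determination (R2) and root-mean-square error between
    observed and predicted series, as used to score GPP estimates against
    flux-tower data."""
    n = len(obs)
    mean_obs = sum(obs) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))  # residual sum of squares
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)          # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    return r2, rmse

obs = [2.0, 5.0, 9.0, 12.0, 7.0]    # hypothetical tower GPP, gC/m2/day
pred = [2.5, 4.5, 8.0, 12.5, 7.5]   # hypothetical remote-sensing GPP
r2, rmse = r2_rmse(obs, pred)
```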
Hierarchical algorithms for modeling the ocean on hierarchical architectures
NASA Astrophysics Data System (ADS)
Hill, C. N.
2012-12-01
This presentation will describe an approach to using accelerator/co-processor technology that maps hierarchical, multi-scale modeling techniques to an underlying hierarchical hardware architecture. The focus of this work is on making effective use of both CPU and accelerator/co-processor parts of a system, for large scale ocean modeling. In the work, a lower resolution basin scale ocean model is locally coupled to multiple, "embedded", limited area higher resolution sub-models. The higher resolution models execute on co-processor/accelerator hardware and do not interact directly with other sub-models. The lower resolution basin scale model executes on the system CPU(s). The result is a multi-scale algorithm that aligns with hardware designs in the co-processor/accelerator space. We demonstrate this approach being used to substitute explicit process models for standard parameterizations. Code for our sub-models is implemented through a generic abstraction layer, so that we can target multiple accelerator architectures with different programming environments. We will present two application and implementation examples. One uses the CUDA programming environment and targets GPU hardware. This example employs a simple non-hydrostatic two dimensional sub-model to represent vertical motion more accurately. The second example uses a highly threaded three-dimensional model at high resolution. This targets a MIC/Xeon Phi like environment and uses sub-models as a way to explicitly compute sub-mesoscale terms. In both cases the accelerator/co-processor capability provides extra compute cycles that allow improved model fidelity for little or no extra wall-clock time cost.
NASA Astrophysics Data System (ADS)
Franci, Luca; Landi, Simone; Matteini, Lorenzo; Verdini, Andrea; Hellinger, Petr
2016-04-01
We investigate the properties of the ion-scale spectral break of solar wind turbulence by means of two-dimensional, large-scale, high-resolution hybrid particle-in-cell simulations. We impose an initial ambient magnetic field perpendicular to the simulation box, and we add a spectrum of in-plane large-scale magnetic and kinetic fluctuations, with energy equipartition and vanishing correlation. We perform a set of ten simulations with different values of the ion plasma beta, β_i. In all cases, we observe the power spectrum of the total magnetic fluctuations following a power law with a spectral index of -5/3 in the inertial range, with a smooth break around ion scales and a steeper power law in the sub-ion range. This spectral break always occurs at spatial scales of the order of the proton gyroradius, ρ_i, and the proton inertial length, d_i = ρ_i/√β_i. When the plasma beta is of the order of 1, the two scales are very close to each other, and determining which of them is directly related to the steepening of the spectra is not straightforward. In order to overcome this limitation, we extended the range of values of β_i over three orders of magnitude, from 0.01 to 10, so that the two ion scales were well separated. This allowed us to observe that the break always seems to occur at the larger of the two scales, i.e., at d_i for β_i < 1 and at ρ_i for β_i > 1. The effect of β_i on the spectra of the parallel and perpendicular magnetic components separately, and of the density fluctuations, is also investigated. We compare all our numerical results with solar wind observations and suggest possible explanations for our findings.
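The relation d_i = ρ_i/√β_i quoted above directly determines which ion scale is larger for a given plasma beta; a trivial sketch (scales expressed in units of ρ_i, function name hypothetical):

```python
import math

def larger_ion_scale(beta_i, rho_i=1.0):
    """Given the ion plasma beta, return which ion scale is larger and the
    proton inertial length, using d_i = rho_i / sqrt(beta_i)."""
    d_i = rho_i / math.sqrt(beta_i)
    return ("d_i" if d_i > rho_i else "rho_i"), d_i

# beta_i = 0.01: d_i = 10 rho_i, so the break is expected at d_i
scale_low, d_low = larger_ion_scale(0.01)
# beta_i = 10: d_i ~ 0.32 rho_i, so the break is expected at rho_i
scale_high, d_high = larger_ion_scale(10.0)
```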
A rapid extraction of landslide disaster information research based on GF-1 image
NASA Astrophysics Data System (ADS)
Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na
2015-08-01
In recent years, landslide disasters have occurred frequently because of seismic activity. They bring great harm to people's lives and have attracted high attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing imagery can effectively improve the accuracy of information extraction thanks to its rich texture and geometric information. It is therefore feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan county as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the Estimation of Scale Parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, texture, geometric and landform features of the image, we establish extraction rules to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, and it provides important technical support for post-disaster emergency investigation and disaster assessment.
NASA Astrophysics Data System (ADS)
Broxton, P. D.; Harpold, A. A.; van Leeuwen, W.; Biederman, J. A.
2016-12-01
Quantifying the amount of snow in forested mountainous environments, as well as how it may change due to warming and forest disturbance, is critical given its importance for water supply and ecosystem health. Forest canopies affect snow accumulation and ablation in ways that are difficult to observe and model. Furthermore, fine-scale forest structure can accentuate or diminish the effects of forest-snow interactions. Despite decades of research demonstrating the importance of fine-scale forest structure (e.g. canopy edges and gaps) on snow, we still lack a comprehensive understanding of where and when forest structure has the largest impact on snowpack mass and energy budgets. Here, we use a hyper-resolution (1 meter spatial resolution) mass and energy balance snow model called the Snow Physics and Laser Mapping (SnowPALM) model along with LIDAR-derived forest structure to determine where spatial variability of fine-scale forest structure has the largest influence on large scale mass and energy budgets. SnowPALM was set up and calibrated at sites representing diverse climates in New Mexico, Arizona, and California. Then, we compared simulations at different model resolutions (i.e. 1, 10, and 100 m) to elucidate the effects of including versus not including information about fine scale canopy structure. These experiments were repeated for different prescribed topographies (i.e. flat, 30% slope north, and south-facing) at each site. Higher resolution simulations had more snow at lower canopy cover, with the opposite being true at high canopy cover. Furthermore, there is considerable scatter, indicating that different canopy arrangements can lead to different amounts of snow, even when the overall canopy coverage is the same. 
This modeling is contributing to the development of a high resolution machine learning algorithm called the Snow Water Artificial Network (SWANN) model to generate predictions of snow distributions over much larger domains, which has implications for improving land surface models that do not currently resolve or parameterize fine-scale canopy structure. In addition, these findings have implications for understanding the potential of different forest management strategies (i.e. thinning) based on local topography and climate to maximize the amount and retention of snow.
Monitoring black-tailed prairie dog colonies with high-resolution satellite imagery
Sidle, John G.; Johnson, D.H.; Euliss, B.R.; Tooze, M.
2002-01-01
The United States Fish and Wildlife Service has determined that the black-tailed prairie dog (Cynomys ludovicianus) warrants listing as a threatened species under the Endangered Species Act. Central to any conservation planning for the black-tailed prairie dog is an appropriate detection and monitoring technique. Because coarse-resolution satellite imagery is not adequate to detect black-tailed prairie dog colonies, we examined the usefulness of recently available high-resolution (1-m) satellite imagery. In 6 purchased scenes of national grasslands, we were easily able to visually detect small and large colonies without using image-processing algorithms. The Ikonos (Space Imaging™) satellite imagery was as adequate as large-scale aerial photography for delineating colonies. Based on the high quality of imagery, we discuss a possible monitoring program for black-tailed prairie dog colonies throughout the Great Plains, using the species' distribution in North Dakota as an example. Monitoring plots could be established and imagery acquired periodically to track the expansion and contraction of colonies.
Towards native-state imaging in biological context in the electron microscope
Weston, Anne E.; Armer, Hannah E. J.
2009-01-01
Modern cell biology is reliant on light and fluorescence microscopy for analysis of cells, tissues and protein localisation. However, these powerful techniques are ultimately limited in resolution by the wavelength of light. Electron microscopes offer much greater resolution due to the shorter effective wavelength of electrons, allowing direct imaging of sub-cellular architecture. The harsh environment of the electron microscope chamber and the properties of the electron beam have led to complex chemical and mechanical preparation techniques, which distance biological samples from their native state and complicate data interpretation. Here we describe recent advances in sample preparation and instrumentation, which push the boundaries of high-resolution imaging. Cryopreparation, cryoelectron microscopy and environmental scanning electron microscopy strive to image samples in near native state. Advances in correlative microscopy and markers enable high-resolution localisation of proteins. Innovation in microscope design has pushed the boundaries of resolution to atomic scale, whilst automatic acquisition of high-resolution electron microscopy data through large volumes is finally able to place ultrastructure in biological context. PMID:19916039
NASA Astrophysics Data System (ADS)
Shin, S.; Pokhrel, Y. N.
2016-12-01
Land surface models have been used to assess water resources sustainability under a changing Earth environment and increasing human water needs. Overwhelming observational records indicate that human activities have ubiquitous and pertinent effects on the hydrologic cycle; however, these activities have been only crudely represented in large-scale land surface models. In this study, we enhance an integrated continental-scale land hydrology model named Leaf-Hydro-Flood to better represent land-water management. The model is implemented at high resolution (5-km grids) over the continental US. Surface water and groundwater are withdrawn based on actual practices. Newly added irrigation, water diversion, and dam operation schemes allow better simulations of streamflow, evapotranspiration, and infiltration. Results for various hydrologic fluxes and stores from two sets of simulations (one with and the other without human activities) are compared over a range of river basin and aquifer scales. The improved simulations of land hydrology have the potential to provide a consistent modeling framework for human-water-climate interactions.
NASA Astrophysics Data System (ADS)
Mizyuk, Artem; Senderov, Maxim; Korotaev, Gennady
2016-04-01
A large number of numerical ocean models have been implemented for the Black Sea basin during the last two decades. They reproduce a rather similar structure of the synoptic variability of the circulation. Since the 2000s, numerical studies of the mesoscale structure have been carried out using high-performance computing (HPC). With the growing capacity of computing resources it is now possible to reconstruct the Black Sea currents with a spatial resolution of several hundred metres. But how realistic can these results be? In the proposed study an attempt is made to understand which spatial scales are reproduced by an ocean model of the Black Sea. Simulations are made using a parallel version of NEMO (Nucleus for European Modelling of the Ocean). Two regional configurations, with spatial resolutions of 5 km and 2.5 km, are described. Comparison of the SST from the simulations at the two resolutions shows a rather qualitative difference in the spatial structures. Results of the high-resolution simulation are also compared with satellite observations and observation-based products from Copernicus using spatial correlation and spectral analysis. The spatial scales of the correlation functions for simulated and observed SST are rather close to each other and differ markedly from those of the satellite SST reanalysis. The evolution of the spectral density for the modelled SST and the reanalysis shows agreement in the periods of small-scale intensification. Applying spectral analysis to satellite measurements is complicated by gaps in the data. The research leading to these results has received funding from the Russian Science Foundation (project № 15-17-20020).
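The spectral analysis mentioned above can be sketched, under simplifying assumptions (a zero-mean 1-D SST transect with uniform sampling and no gaps, hypothetical function name), as a discrete power spectrum computed via the FFT:

```python
import numpy as np

def power_spectrum(field_1d, dx_km):
    """One-sided power spectrum of a 1-D SST transect.
    Returns wavenumbers (cycles/km) and spectral power."""
    z = field_1d - field_1d.mean()     # remove the mean before transforming
    n = z.size
    fz = np.fft.rfft(z)
    power = (np.abs(fz) ** 2) / n
    k = np.fft.rfftfreq(n, d=dx_km)    # cycles per km
    return k, power

# Synthetic transect: a 64-km wave sampled every 2.5 km over 640 km
x = np.arange(256) * 2.5
sst = np.sin(2 * np.pi * x / 64.0)
k, p = power_spectrum(sst, 2.5)
```

For a single sinusoid the spectral peak falls at the wavenumber 1/64 cycles/km; gaps in real satellite data break the uniform-sampling assumption, which is exactly the complication the abstract notes.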
Exploring image data assimilation in the prospect of high-resolution satellite data
NASA Astrophysics Data System (ADS)
Verron, J. A.; Duran, M.; Gaultier, L.; Brankart, J. M.; Brasseur, P.
2016-02-01
Many recent works show the key importance of studying the ocean at fine scales, including the meso- and submesoscales. Satellite observations such as ocean color data provide information on a wide range of scales but do not directly provide information on ocean dynamics. Satellite altimetry provides information on the ocean dynamic topography (SSH), but so far with limited resolution in space and, even more so, in time. In the near future, however, high-resolution SSH data (e.g. SWOT) will give a vision of the dynamic topography at very fine spatial resolution. This raises some challenging issues for data assimilation in physical oceanography: developing reliable methodologies to assimilate high-resolution data, making integrated use of various data sets including biogeochemical data, and, more simply, handling large amounts of data and huge state vectors. In this work, we propose to consider structured information rather than pointwise data. First, we take an image data assimilation approach in studying the feasibility of inverting tracer observations from sea surface temperature and/or ocean color datasets to improve the description of mesoscale dynamics provided by altimetric observations. Finite-size Lyapunov exponents are used as an image proxy. The inverse problem is formulated in a Bayesian framework and expressed in terms of a cost function measuring the misfits between the two images. Second, we explore the inversion of SWOT-like high-resolution SSH data and, more specifically, the various possible proxies of the actual SSH that could be used to control the ocean circulation at various scales. One focus is on controlling the subsurface ocean from surface-only data. A key point lies in the errors and uncertainties associated with SWOT data.
Mechanisms and Model Diversity of Trade-Wind Shallow Cumulus Cloud Feedbacks: A Review.
Vial, Jessica; Bony, Sandrine; Stevens, Bjorn; Vogel, Raphaela
2017-01-01
Shallow cumulus clouds in the trade-wind regions are at the heart of the long-standing uncertainty in climate sensitivity estimates. In current climate models, cloud feedbacks are strongly influenced by cloud-base cloud amount in the trades. Therefore, understanding the key factors controlling cloudiness near cloud base in shallow convective regimes has emerged as an important topic of investigation. We review the physical understanding of these key controlling factors and discuss the value of the different approaches that have been developed so far, based on global and high-resolution model experiments and on process-oriented analyses across a range of models and observations. The trade-wind cloud feedbacks appear to depend on two important aspects: (1) how cloudiness near cloud base is controlled by the local interplay between turbulent, convective and radiative processes; (2) how these processes interact with their surrounding environment and are influenced by mesoscale organization. Our synthesis of studies that have explored these aspects suggests that the large diversity of model responses is related to fundamental differences in how the processes controlling trade cumulus operate in models, notably, whether they are parameterized or resolved. In models with parameterized convection, cloudiness near cloud base is very sensitive to the vigor of convective mixing in response to changes in environmental conditions. This is in contrast with results from high-resolution models, which suggest that cloudiness near cloud base is nearly invariant with warming and independent of large-scale environmental changes. Uncertainties are difficult to narrow using current observations, as the trade cumulus variability and its relation to large-scale environmental factors strongly depend on the time and/or spatial scales at which the mechanisms are evaluated. 
New opportunities for testing physical understanding of the factors controlling shallow cumulus cloud responses using observations and high-resolution modeling on large domains are discussed.
Measuring Cosmic Expansion and Large Scale Structure with Destiny
NASA Technical Reports Server (NTRS)
Benford, Dominic J.; Lauer, Tod R.
2007-01-01
Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over the redshift interval 0.4
A ``Cyber Wind Facility'' for HPC Wind Turbine Field Experiments
NASA Astrophysics Data System (ADS)
Brasseur, James; Paterson, Eric; Schmitz, Sven; Campbell, Robert; Vijayakumar, Ganesh; Lavely, Adam; Jayaraman, Balaji; Nandi, Tarak; Jha, Pankaj; Dunbar, Alex; Motta-Mena, Javier; Craven, Brent; Haupt, Sue
2013-03-01
The Penn State ``Cyber Wind Facility'' (CWF) is a high-fidelity multi-scale high performance computing (HPC) environment in which ``cyber field experiments'' are designed and ``cyber data'' are collected from wind turbines operating within the atmospheric boundary layer (ABL). Conceptually the ``facility'' is akin to a high-tech wind tunnel with a controlled physical environment, but unlike a wind tunnel it replicates commercial-scale wind turbines operating in the field, forced by true atmospheric turbulence with controlled stability state. The CWF is built on state-of-the-art high-accuracy geometry and grid design and numerical methods, with high-resolution simulation strategies that blend unsteady RANS near the surface with high-fidelity large-eddy simulation (LES) in separated boundary layer, blade and rotor wake regions, embedded within a high-resolution LES of the ABL. CWF experiments complement physical field experiments, which can capture wider ranges of meteorological events but offer minimal control over the environment and only very small numbers of sensors at low spatial resolution. I shall report on the first CWF experiments, aimed at the dynamical interactions between ABL turbulence and space-time wind turbine loadings. Supported by DOE and NSF.
High-resolution seismic reflection profiling for mapping shallow aquifers in Lee County, Florida
Missimer, T.M.; Gardner, Richard Alfred
1976-01-01
High-resolution continuous seismic reflection profiling equipment was utilized to define the configuration of sedimentary layers underlying part of Lee County, Florida. About 45 miles (72 kilometers) of profile were made on the Caloosahatchee River Estuary and San Carlos Bay. Two different acoustic energy sources, a high resolution boomer and a 45-electrode high resolution sparker, both having a power input of 300 joules, were used to obtain both adequate penetration and good resolution. The seismic profiles show that much of the strata of middle Miocene to Holocene age apparently are extensively folded but not faulted. Initial interpretations indicate that: (1) the top of the Hawthorn Formation (which contains the upper Hawthorn aquifer) has much relief due chiefly to apparent folding; (2) the limestone, sandstone, and unconsolidated sand and phosphorite, which together compose the sandstone aquifer, appear to be discontinuous; (3) the green clay unit of the Tamiami Formation contains large scale angular beds dipping eastward; and (4) numerous deeply cut alluvium-filled paleochannels underlie the Caloosahatchee River. (Woodard-USGS)
Crust and Mantle Deformation Revealed from High-Resolution Radially Anisotropic Velocity Models
NASA Astrophysics Data System (ADS)
Li, A.; Dave, R.; Yao, Y.
2017-12-01
Love wave tomography, which can achieve a similar model resolution as Rayleigh wave, so far has limited applications to the USArray data. Recently, we have developed high-resolution Love wave phase velocity maps in the Wyoming craton and Texas using data at the Transportable Array stations. 3-D, radially anisotropic velocity models are obtained by jointly inverting Love and Rayleigh wave phase velocities. A high-velocity anomaly extending to about 200 km depth beneath central Wyoming correlates with negative radial anisotropy (Vsv>Vsh), suggesting that mantle downwelling develops under the cratonic lithosphere. Surprisingly, the significantly low velocity beneath the Yellowstone hotspot, which has been interpreted as partial melting and asthenospheric upwelling, is associated with the largest radial anisotropy (Vsh>Vsv) in the area. This observation does not support mantle upwelling. Instead, it indicates that the upper mantle beneath the hotspot has experienced strong shear deformation probably by the plate motion and large-scale mantle flow. In Texas, positive radial anisotropy in the lower crust extends from the coast to the Ouachita belt, which is characterized by high velocity and negative radial anisotropy. In the upper mantle, large variations of velocity and anisotropy exit under the coastal plain. A common feature in these anisotropic models is that high-velocity anomalies in the upper mantle often correlate with negative anisotropy (Vsv>Vsh) while low-velocity anomalies are associated with positive anisotropy (Vsh>Vsv). The manifestation of mantle downweling as negative radial anisotropy is largely due to the relatively high viscosity of the high-velocity mantle block, which is less affected by the surrounding large-scale horizontal flow. However, mantle upwelling, which is often associated with low-velocity anomalies, presumably low-viscosity mantle blocks, is invisible in radial anisotropy models. 
Such upwelling may happen too quickly to leave lasting effects, or too slowly to alter the dominant shear deformation in the asthenosphere.
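The sign convention used above can be made concrete with the radial anisotropy parameter ξ = (Vsh/Vsv)², where ξ > 1 corresponds to positive anisotropy (Vsh>Vsv) and ξ < 1 to negative anisotropy (Vsv>Vsh). A minimal sketch; the velocity values are hypothetical, not taken from the study:

```python
import numpy as np

def radial_anisotropy(vsh, vsv):
    """Radial anisotropy parameter xi = (Vsh / Vsv)**2.

    xi > 1: positive anisotropy (Vsh > Vsv), often read as horizontally
    sheared flow; xi < 1: negative anisotropy (Vsv > Vsh), consistent
    with vertical flow such as downwelling.
    """
    vsh = np.asarray(vsh, dtype=float)
    vsv = np.asarray(vsv, dtype=float)
    return (vsh / vsv) ** 2

# Hypothetical upper-mantle shear velocities in km/s (not from the study):
xi = radial_anisotropy([4.60, 4.40], [4.45, 4.55])
signs = np.where(xi > 1.0, "positive (Vsh>Vsv)", "negative (Vsv>Vsh)")
```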
Sakaguchi, Koichi; Lu, Jian; Leung, L. Ruby; ...
2016-10-22
Impacts of regional grid refinement on large-scale circulations (“upscale effects”) were detected in a previous study that used the Model for Prediction Across Scales-Atmosphere coupled to the physics parameterizations of the Community Atmosphere Model version 4. The strongest upscale effect was identified in the Southern Hemisphere jet during austral winter. This study examines the detailed underlying processes by comparing two simulations at quasi-uniform resolutions of 30 and 120 km to three variable-resolution simulations in which the horizontal grids are regionally refined to 30 km in North America, South America, or Asia from 120 km elsewhere. In all the variable-resolution simulations, precipitation increases in convective areas inside the high-resolution domains, as in the reference quasi-uniform high-resolution simulation. With grid refinement encompassing the tropical Americas, the increased condensational heating expands the local divergent circulations (Hadley cell) meridionally such that their descending branch is shifted poleward, which also pushes the baroclinically unstable regions, momentum flux convergence, and the eddy-driven jet poleward. This teleconnection pathway is not found in the reference high-resolution simulation due to a strong resolution sensitivity of cloud radiative forcing that dominates the aforementioned teleconnection signals. The regional refinement over Asia enhances Rossby wave sources and strengthens the upper-level southerly flow, both facilitating the cross-equatorial propagation of stationary waves. Evidence indicates that this teleconnection pathway is also found in the reference high-resolution simulation. Lastly, the result underlines the intricate diagnoses needed to understand the upscale effects in global variable-resolution simulations, with implications for science investigations using the computationally efficient modeling framework.
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2012-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year-long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, forms the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA-based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. 
On a global scale, we are currently building the computational framework for double-difference processing the combined parametric and waveform archives of the ISC, NEIC, and IRIS with over three million recorded earthquakes worldwide. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of such archives may thus become a routine procedure to continuously improve resolution in existing global earthquake catalogs. Results from subduction zones and aftershock sequences of recent great earthquakes demonstrate the considerable social and economic impact that high-resolution images of active faults, when available in real-time, will have in the prompt evaluation and mitigation of seismic hazards. These results also highlight the need for consistent long-term seismic monitoring and archiving of records.
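The correlation detector described above can be illustrated schematically. This is a minimal sketch of a normalized sliding-window cross-correlation detector, not the network's actual implementation; the brute-force loop and fixed threshold are simplifications (production systems use FFT-based correlation, multi-channel stacking, and noise-adaptive scaled thresholds):

```python
import numpy as np

def xcorr_detect(trace, template, threshold=0.8):
    """Slide `template` along `trace` and return (offset, cc) pairs
    where the normalized correlation coefficient cc exceeds
    `threshold`. Brute-force for clarity only."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    hits = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        s = w.std()
        if s == 0.0:
            continue  # skip flat (dead-channel) windows
        cc = float(np.dot(t, (w - w.mean()) / s))  # Pearson r in [-1, 1]
        if cc >= threshold:
            hits.append((i, cc))
    return hits

# Embed a synthetic "event" waveform in noise and recover its offset:
rng = np.random.default_rng(42)
template = np.sin(np.linspace(0.0, 6.0 * np.pi, 100))
trace = 0.05 * rng.standard_normal(2000)
trace[300:400] += template
hits = xcorr_detect(trace, template, threshold=0.9)
```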
Estimating planktonic diversity through spatial dominance patterns in a model ocean.
Soccodato, Alice; d'Ovidio, Francesco; Lévy, Marina; Jahn, Oliver; Follows, Michael J; De Monte, Silvia
2016-10-01
In the open ocean, the observation and quantification of biodiversity patterns is challenging. Marine ecosystems are indeed largely composed of microbial planktonic communities whose niches are affected by highly dynamical physico-chemical conditions, and whose observation requires advanced methods for morphological and molecular classification. Optical remote sensing offers an appealing complement to these in-situ techniques. Global-scale coverage at high spatiotemporal resolution is, however, achieved at the cost of limited information on the local assemblage. Here, we use a coupled physical and ecological model ocean simulation to explore one possible metric for comparing measurements performed on such different scales. We show that a large part of the local diversity of the virtual plankton ecosystem - corresponding to what is accessible by genomic methods - can be inferred from crude, but spatially extended, information - as conveyed by remote sensing. Shannon diversity of the local community is indeed highly correlated with a 'seascape' index, which quantifies the surrounding spatial heterogeneity of the most abundant functional group. The error implied in drastically reducing the resolution of the plankton community is shown to be smaller in frontal regions as well as in regions of intermediate turbulent energy. On the spatial scale of hundreds of kilometres, patterns of virtual plankton diversity are thus largely sustained by mixing of communities that occupy adjacent niches. We provide a proof of principle that in the open ocean information on spatial variability of communities can compensate for limited local knowledge, suggesting the possibility of integrating in-situ and satellite observations to monitor biodiversity distribution at the global scale. Copyright © 2016 Elsevier B.V. All rights reserved.
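The local diversity measure referred to above is the Shannon index, H = -Σ p_i ln p_i, computed over the relative abundances p_i of the plankton types in a community. The 'seascape' index itself is specific to the study, but the Shannon index can be sketched as:

```python
import numpy as np

def shannon_diversity(abundances):
    """Shannon index H = -sum(p_i * ln p_i) over relative abundances."""
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0]          # zero-abundance types contribute nothing
    p = p / p.sum()       # normalize to relative abundances
    return float(-(p * np.log(p)).sum())

# A community evenly split among four types attains H = ln(4);
# a community dominated by one type has much lower diversity.
H_even = shannon_diversity([25, 25, 25, 25])
H_skew = shannon_diversity([97, 1, 1, 1])
```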
NASA Astrophysics Data System (ADS)
Aires, Filipe; Miolane, Léo; Prigent, Catherine; Pham Duc, Binh; Papa, Fabrice; Fluet-Chouinard, Etienne; Lehner, Bernhard
2017-04-01
The Global Inundation Extent from Multi-Satellites (GIEMS) provides multi-year monthly variations of the global surface water extent at 25 km × 25 km resolution. It is derived from multiple satellite observations. Its spatial resolution is usually compatible with climate model outputs and with global land surface model grids, but is clearly not adequate for local applications that require the characterization of small individual water bodies. There is today a strong demand for high-resolution inundation extent datasets for a large variety of applications, such as water management, regional hydrological modeling, or the analysis of mosquito-borne diseases. A new procedure is introduced to downscale the GIEMS low-spatial-resolution inundations to a 3 arc second (90 m) dataset. The methodology is based on topography and hydrography information from the HydroSHEDS database. A new floodability index is adopted, and an innovative smoothing procedure is developed to ensure a smooth transition, in the high-resolution maps, between the low-resolution boxes from GIEMS. Topography information is relevant for natural hydrology environments controlled by elevation, but is more limited in human-modified basins. However, the proposed downscaling approach is compatible with forthcoming fusion with other, more pertinent satellite information in these difficult regions. The resulting GIEMS-D3 database is the only high-spatial-resolution inundation database available globally at the monthly time scale over the 1993-2007 period. GIEMS-D3 is assessed by analyzing its spatial and temporal variability, and evaluated by comparisons to other independent satellite observations from visible (Google Earth and Landsat), infrared (MODIS) and active microwave (SAR) instruments.
Daily time series evapotranspiration maps for Oklahoma and Texas panhandle
USDA-ARS?s Scientific Manuscript database
Evapotranspiration (ET) is an important process in an ecosystem's water budget and is closely linked to its productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. ...
A novel representation of groundwater dynamics in large-scale land surface modelling
NASA Astrophysics Data System (ADS)
Rahman, Mostaquimur; Rosolem, Rafael; Kollet, Stefan
2017-04-01
Land surface processes are connected to groundwater dynamics via shallow soil moisture. For example, groundwater affects evapotranspiration (by influencing the variability of soil moisture) and runoff generation mechanisms. However, contemporary Land Surface Models (LSMs) generally simulate hydrology using isolated soil columns with a free-drainage lower boundary condition. This is mainly because incorporating detailed groundwater dynamics in LSMs usually requires considerable computing resources, especially for large-scale applications (e.g., continental to global). Yet these simplifications neglect the potential effect of groundwater dynamics on land surface mass and energy fluxes. In this study, we present a novel approach for representing high-resolution groundwater dynamics in LSMs that is computationally efficient for large-scale applications. This new parameterization is incorporated in the Joint UK Land Environment Simulator (JULES) and tested at the continental scale.
NASA Astrophysics Data System (ADS)
Gowda, P. H.
2016-12-01
Evapotranspiration (ET) is an important process in an ecosystem's water budget and is closely linked to its productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. There are efforts to develop such datasets on a regional to global scale, but they often face the limitations of spatial-temporal resolution tradeoffs in satellite remote sensing technology. In this study, we developed frameworks for generating high- and medium-resolution daily ET maps from Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data, respectively. For developing high-resolution (30-m) daily time series ET maps with Landsat TM data, the series version of the Two Source Energy Balance (TSEB) model was used to compute sensible and latent heat fluxes of soil and canopy separately. Landsat 5 (2000-2011) and Landsat 8 (2013-2014) imagery for path/row 28/35 and 27/36 covering central Oklahoma was used. MODIS data (2001-2014) covering Oklahoma and the Texas Panhandle were used to develop medium-resolution (250-m) time series daily ET maps with the SEBS (Surface Energy Balance System) model. An extensive network of weather stations managed by the Texas High Plains ET Network and the Oklahoma Mesonet was used to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. A linear interpolation sub-model was used to estimate the daily ET between the image acquisition days. Accuracy assessment of the daily ET maps was performed against eddy covariance data from two grassland sites at El Reno, OK. Statistical results indicated good performance by the modeling frameworks developed for deriving time series ET maps. Results indicated that the proposed ET mapping framework is suitable for deriving daily time series ET maps at regional scale with Landsat and MODIS data.
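The linear interpolation sub-model mentioned above can be sketched as follows. This is a hypothetical single-pixel stand-in: the actual framework interpolates per pixel over full maps and draws on daily reference ET from the weather station network.

```python
from datetime import date

def interp_daily_et(d0, et0, d1, et1, d):
    """Linearly interpolate ET (mm/day) for day `d` between two image
    acquisition days `d0` and `d1` with retrieved ET `et0` and `et1`."""
    if not (d0 <= d <= d1):
        raise ValueError("d must lie between the acquisition days")
    span = (d1 - d0).days
    if span == 0:
        return et0
    w = (d - d0).days / span      # fractional position between images
    return et0 + w * (et1 - et0)

# Halfway between two acquisitions the estimate is the midpoint value:
et_mid = interp_daily_et(date(2014, 6, 1), 4.0, date(2014, 6, 17), 6.0,
                         date(2014, 6, 9))
```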
NASA Astrophysics Data System (ADS)
Guenther, A. B.; Duhl, T.
2011-12-01
Increasing computational resources have enabled a steady improvement in the spatial resolution used for earth system models. Land surface models and landcover distributions have kept ahead by providing higher spatial resolution than typically used in these models. Satellite observations have played a major role in providing high-resolution landcover distributions over large regions or the entire earth surface, but ground observations are needed to calibrate these data and provide accurate inputs for models. As our ability to resolve individual landscape components improves, it is important to consider what scale is sufficient for providing inputs to earth system models. The required spatial scale is dependent on the processes being represented and the scientific questions being addressed. This presentation will describe the development of a contiguous U.S. landcover database using high-resolution imagery (1 to 1000 meters) and surface observations of species composition and other landcover characteristics. The database includes plant functional types and species composition and is suitable for driving land surface models (CLM and MEGAN) that predict land surface exchange of carbon, water, energy, and biogenic reactive gases (e.g., isoprene, sesquiterpenes, and NO). We investigate the sensitivity of model results to landcover distributions with spatial scales ranging over six orders of magnitude (1 meter to 1,000,000 meters). The implications for predictions of regional climate and air quality will be discussed, along with recommendations for regional and global earth system modeling.
Tropical Waves and the Quasi-Biennial Oscillation in a 7-km Global Climate Simulation
NASA Technical Reports Server (NTRS)
Holt, Laura A.; Alexander, M. Joan; Coy, Lawrence; Molod, Andrea; Putman, William; Pawson, Steven
2016-01-01
This study investigates tropical waves and their role in driving a quasi-biennial oscillation (QBO)-like signal in stratospheric winds in a global 7-km-horizontal-resolution atmospheric general circulation model. The Nature Run (NR) is a 2-year global mesoscale simulation of the Goddard Earth Observing System Model, version 5 (GEOS-5). In the tropics, there is evidence that the NR supports a broad range of convectively generated waves. The NR precipitation spectrum resembles the observed spectrum in many aspects, including the preference for westward-propagating waves. However, even with very high horizontal resolution and a healthy population of resolved waves, the zonal force provided by the resolved waves is still too low in the QBO region, and parameterized gravity wave drag is the main driver of the NR QBO-like oscillation (NR-QBO). The authors suggest that causes include coarse vertical resolution and excessive dissipation. Nevertheless, the very-high-resolution NR provides an opportunity to analyze the resolved wave forcing of the NR-QBO. In agreement with previous studies, large-scale Kelvin and small-scale waves contribute to the NR-QBO driving in eastward shear zones, and small-scale waves dominate the NR-QBO driving in westward shear zones. Waves with zonal wavelengths <1000 km account for up to half of the small-scale (<3300 km) resolved wave forcing in eastward shear zones and up to 70% of the small-scale resolved wave forcing in westward shear zones of the NR-QBO.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leroy, Adam K.; Hughes, Annie; Schruba, Andreas
2016-11-01
The cloud-scale density, velocity dispersion, and gravitational boundedness of the interstellar medium (ISM) vary within and among galaxies. In turbulent models, these properties play key roles in the ability of gas to form stars. New high-fidelity, high-resolution surveys offer the prospect to measure these quantities across galaxies. We present a simple approach to make such measurements and to test hypotheses that link small-scale gas structure to star formation and galactic environment. Our calculations capture the key physics of the Larson scaling relations, and we show good correspondence between our approach and a traditional “cloud properties” treatment. However, we argue that our method is preferable in many cases because of its simple, reproducible characterization of all emission. Using low-J ¹²CO data from recent surveys, we characterize the molecular ISM at 60 pc resolution in the Antennae, the Large Magellanic Cloud (LMC), M31, M33, M51, and M74. We report the distributions of surface density, velocity dispersion, and gravitational boundedness at 60 pc scales and show galaxy-to-galaxy and intragalaxy variations in each. The distribution of flux as a function of surface density appears roughly lognormal with a 1σ width of ∼0.3 dex, though the center of this distribution varies from galaxy to galaxy. The 60 pc resolution line width and molecular gas surface density correlate well, which is a fundamental behavior expected for virialized or free-falling gas. Varying the measurement scale for the LMC and M31, we show that the molecular ISM has higher surface densities, lower line widths, and more self-gravity at smaller scales.
Regional sea level variability in a high-resolution global coupled climate model
NASA Astrophysics Data System (ADS)
Palko, D.; Kirtman, B. P.
2016-12-01
The prediction of trends at regional scales is essential in order to adapt to and prepare for the effects of climate change. However, GCMs are unable to make reliable predictions at regional scales. The prediction of local sea level trends is particularly critical. The main goal of this research is to utilize high-resolution (HR) (0.1° resolution in the ocean) coupled model runs of CCSM4 to analyze regional sea surface height (SSH) trends. Unlike typical, lower-resolution (1.0°) GCM runs, these HR runs resolve features in the ocean, like the Gulf Stream, which may have a large effect on regional sea level. We characterize the variability of regional SSH along the Atlantic coast of the US using tide gauge observations along with fixed radiative forcing runs of CCSM4 and HR interactive ensemble runs. The interactive ensemble couples an ensemble-mean atmosphere with a single ocean realization. This coupling results in a 30% decrease in the strength of the Atlantic meridional overturning circulation; therefore, the HR interactive ensemble is analogous to a HR hosing experiment. By characterizing the variability in these high-resolution GCM runs and observations, we seek to understand what processes influence coastal SSH along the Eastern Coast of the United States and better predict future sea level rise (SLR).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Held, Isaac; V. Balaji; Fueglistaler, Stephan
We have constructed and analyzed a series of idealized models of tropical convection interacting with large-scale circulations, with 25-50 km resolution and with 1-2 km cloud-resolving resolution, to set the stage for rigorous tests of convection closure schemes in high-resolution global climate models. Much of the focus has been on the climatology of tropical cyclogenesis in rotating systems and the related problem of the spontaneous aggregation of convection in non-rotating systems. The PI (Held) will be delivering the honorary Bjerknes lecture at the Fall 2016 AGU meeting in December on this work. We have also provided new analyses of long-standing issues related to the interaction between convection and the large-scale circulation: Kelvin waves in the upper troposphere and lower stratosphere, water vapor transport into the stratosphere, and upper tropospheric temperature trends. The results of these analyses help to improve our understanding of processes, and provide tests for future high-resolution global modeling. Our final goal of testing new convection schemes in next-generation global atmospheric models at GFDL has been left for future work, owing both to the complexity of the idealized model results (meant as tests for these models) uncovered in this work and to computational resource limitations. 11 papers have been published with support from this grant, 2 are in review, and another major summary paper is in preparation.
NASA Technical Reports Server (NTRS)
Strom, Stephen; Sargent, Wallace L. W.; Wolff, Sidney; Ahearn, Michael F.; Angel, J. Roger; Beckwith, Steven V. W.; Carney, Bruce W.; Conti, Peter S.; Edwards, Suzan; Grasdalen, Gary
1991-01-01
Optical/infrared (O/IR) astronomy in the 1990's is reviewed. The following subject areas are included: research environment; science opportunities; technical development of the 1980's and opportunities for the 1990's; and ground-based O/IR astronomy outside the U.S. Recommendations are presented for: (1) large scale programs (Priority 1: a coordinated program for large O/IR telescopes); (2) medium scale programs (Priority 1: a coordinated program for high angular resolution; Priority 2: a new generation of 4-m class telescopes); (3) small scale programs (Priority 1: near-IR and optical all-sky surveys; Priority 2: a National Astrometric Facility); and (4) infrastructure issues (develop, purchase, and distribute optical CCDs and infrared arrays; a program to support large optics technology; a new generation of large filled-aperture telescopes; a program to archive and disseminate astronomical databases; and a program for training new instrumentalists).
Defining habitat covariates in camera-trap based occupancy studies
Niedballa, Jürgen; Sollmann, Rahel; Mohamed, Azlan bin; Bender, Johannes; Wilting, Andreas
2015-01-01
In species-habitat association studies, both the type and spatial scale of habitat covariates need to match the ecology of the focal species. We assessed the potential of high-resolution satellite imagery for generating habitat covariates using camera-trapping data from Sabah, Malaysian Borneo, within an occupancy framework. We tested the predictive power of covariates generated from satellite imagery at different resolutions and extents (focal patch sizes, 10–500 m around sample points) on estimates of occupancy patterns of six small- to medium-sized mammal species/species groups. High-resolution land cover information had considerably more model support for small, patchily distributed habitat features, whereas it had no advantage for large, homogeneous habitat features. A comparison of different focal patch sizes including remote sensing data and an in-situ measure showed that patches with a 50-m radius had the most support for the target species. Thus, high-resolution satellite imagery proved to be particularly useful in heterogeneous landscapes, and can be used as a surrogate for certain in-situ measures, reducing field effort in logistically challenging environments. Additionally, remotely sensed data provide more flexibility in defining appropriate spatial scales, which we show to impact estimates of wildlife-habitat associations. PMID:26596779
NASA Astrophysics Data System (ADS)
Tourigny, E.; Nobre, C.; Cardoso, M. F.
2012-12-01
Deforestation of tropical forests for logging and agriculture, associated with slash-and-burn practices, is a major source of CO2 emissions, both immediate due to biomass burning and future due to the elimination of a potential CO2 sink. Feedbacks between climate change and LUCC (Land-Use and Land-Cover Change) can potentially increase the loss of tropical forests and increase the rate of CO2 emissions, through mechanisms such as land and soil degradation and the increase in wildfire occurrence and severity. However, the processes of fires (including ignition, spread and consequences) in tropical forests and their climatic feedbacks are poorly understood and need further research. As the processes of LUCC and associated fires occur at local scales, linking them to large-scale atmospheric processes requires a means of upscaling higher-resolution processes to lower resolutions. Our approach is to couple models which operate at various spatial and temporal scales: a Global Climate Model (GCM), a Dynamic Global Vegetation Model (DGVM), and a local-scale LUCC and fire spread model. The climate model resolves large-scale atmospheric processes and forcings, which are imposed on the surface DGVM and fed back to the climate. Higher-resolution processes such as deforestation, land use management, and associated (as well as natural) fires are resolved at the local level. A dynamic tiling scheme allows local-scale heterogeneity to be represented while maintaining the computational efficiency of the land surface model, compared to traditional landscape models. Fire behavior is modeled at the regional scale (~500 m) to represent the detailed landscape using a semi-empirical fire spread model. The relatively coarse scale (compared to other fire spread models) is necessary due to the paucity of detailed land-cover information and fire history (particularly in the tropics and developing countries). 
This work presents initial results of a spatially explicit fire spread model coupled to the IBIS DGVM. Our area of study comprises selected regions in and near the Brazilian "arc of deforestation". For model training and evaluation, several areas have been mapped using high-resolution imagery from the Landsat TM/ETM+ sensors (Figure 1). These high-resolution reference data are used for local-scale simulations and also to evaluate the accuracy of the global MCD45 burned area product, which will be used in future studies covering the entire "arc of deforestation". (Figure 1 caption: Area of study along the arc of deforestation and cerrado; Landsat scenes used and burned area (2010) from the MCD45 product.)
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
Stochastic Ocean Eddy Perturbations in a Coupled General Circulation Model.
NASA Astrophysics Data System (ADS)
Howe, N.; Williams, P. D.; Gregory, J. M.; Smith, R. S.
2014-12-01
High-resolution ocean models, which are eddy-permitting and eddy-resolving, require large computing resources to produce centuries' worth of data. Also, some previous studies have suggested that increasing resolution does not necessarily solve the problem of unresolved scales, because it simply introduces a new set of unresolved scales. Applying stochastic parameterisations to ocean models is one solution that is expected to improve the representation of small-scale (eddy) effects without increasing run-time. Stochastic parameterisation has been shown to have an impact in atmosphere-only models and idealised ocean models, but has not previously been studied in ocean general circulation models. Here we apply simple stochastic perturbations to the ocean temperature and salinity tendencies in the low-resolution coupled climate model FAMOUS. The stochastic perturbations are implemented according to T(t) = T(t-1) + ΔT(t) + ξ(t), where T is temperature or salinity, ΔT is the corresponding deterministic increment in one time step, and ξ(t) is Gaussian noise. We use high-resolution HiGEM data coarse-grained to the FAMOUS grid to provide information about the magnitude and spatio-temporal correlation structure of the noise to be added to the lower-resolution model. Here we present results of adding white and red noise, showing the impacts of an additive stochastic perturbation on the mean climate state and variability in an AOGCM.
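A minimal sketch of the perturbed update T(t) = T(t-1) + ΔT(t) + ξ(t), with a white-noise step and an AR(1) recurrence for the red-noise case. The function names and the scalar sigma are illustrative: in the study the noise magnitude and correlation structure are fields derived from coarse-grained HiGEM data.

```python
import numpy as np

def step_with_noise(T, dT, sigma, rng):
    """One perturbed step: T(t) = T(t-1) + dT(t) + xi(t), with white
    Gaussian noise xi ~ N(0, sigma^2) added to the deterministic
    temperature/salinity increment dT."""
    xi = rng.normal(0.0, sigma, size=np.shape(T))
    return T + dT + xi

def ar1_noise(xi_prev, sigma, alpha, rng):
    """Red-noise (AR(1)) variant: xi(t) = alpha*xi(t-1)
    + sqrt(1 - alpha^2)*eps(t), which keeps the variance at sigma^2
    while introducing temporal correlation set by alpha."""
    eps = rng.normal(0.0, sigma, size=np.shape(xi_prev))
    return alpha * xi_prev + np.sqrt(1.0 - alpha ** 2) * eps
```

With sigma = 0 the scheme reduces to the purely deterministic update, which is a convenient sanity check.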
Overcoming complexities for consistent, continental-scale flood mapping
NASA Astrophysics Data System (ADS)
Smith, Helen; Zaidman, Maxine; Davison, Charlotte
2013-04-01
The EU Floods Directive requires all member states to produce flood hazard maps by 2013. Although flood mapping practices are well developed in Europe, there are huge variations in the scale and resolution of the maps between individual countries. Since extreme flood events are rarely confined to a single country, this is problematic, particularly for the re/insurance industry, whose exposures often extend beyond country boundaries. Here, we discuss the challenges of large-scale hydrological and hydraulic modelling, using our experience of developing a 12-country model and set of maps to illustrate how consistent, high-resolution river flood maps across Europe can be produced. The main challenges addressed include: data acquisition; manipulating the vast quantities of high-resolution data; and computational resources. Our starting point was to develop robust flood-frequency models that are suitable for estimating peak flows for a range of design flood return periods. We used the index flood approach, based on a statistical analysis of historic river flow data pooled on the basis of catchment characteristics. Historical flow data were therefore sourced for each country and collated into a large pan-European database. After a lengthy validation, these data were collated into 21 separate analysis zones or regions, grouping smaller river basins according to their physical and climatic characteristics. The very large continental-scale basins were each modelled separately on account of their size (e.g. Danube, Elbe, Drava and Rhine). Our methodology allows the design flood hydrograph to be predicted at any point on the river network for a range of return periods. Using JFlow+, JBA's proprietary 2D hydrodynamic model, the calculated out-of-bank flows for all watercourses with an upstream drainage area exceeding 50 km2 were routed across two different Digital Terrain Models in order to map the extent and depth of floodplain inundation. 
This generated modelling for a total river length of approximately 250,000 km. Such a large-scale, high-resolution modelling exercise is extremely demanding on computational resources and would have been unfeasible without the use of Graphics Processing Units on a network of standard-specification gaming computers. Our GPU grid is the world's largest flood-dedicated computer grid. The European river basins were split into approximately 100 separate hydraulic models and managed individually, although care was taken to ensure flow continuity was maintained between models. The flood hazard maps from the modelling were pieced together using GIS techniques to provide flood depth and extent information across Europe to a consistent scale and standard. After discussing the methodological challenges, we shall present our flood hazard maps and, from extensive validation work, compare these against historical flow records and observed flood extents.
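The index flood approach described above can be illustrated with a toy sketch: each site's annual maxima are scaled by its index flood (taken here as the median annual maximum, QMED), the pooled dimensionless data are fitted with an extreme-value distribution, and the resulting growth factor multiplies the index flood to give the design flow for a return period T. The Gumbel (EV1) distribution and the method-of-moments fit are illustrative assumptions, not the paper's actual implementation.

```python
import math

def gumbel_growth_factor(pooled_ratios, T):
    # Fit a Gumbel (EV1) distribution by method of moments to pooled
    # dimensionless annual maxima (each site's flows divided by its
    # index flood), then return the growth factor for return period T.
    n = len(pooled_ratios)
    mean = sum(pooled_ratios) / n
    var = sum((x - mean) ** 2 for x in pooled_ratios) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi      # Gumbel scale parameter
    mu = mean - 0.5772 * beta                # Gumbel location parameter
    y_T = -math.log(-math.log(1 - 1 / T))    # reduced variate for T
    return mu + beta * y_T

def design_flood(annual_maxima, pooled_ratios, T):
    # Index flood taken as the median annual maximum (QMED) at the site.
    index_flood = sorted(annual_maxima)[len(annual_maxima) // 2]
    return index_flood * gumbel_growth_factor(pooled_ratios, T)
```

For example, the 100-year growth factor applied to a site's QMED gives that site's 100-year design peak flow, which can then drive the hydraulic routing.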
Real-time and sub-wavelength ultrafast coherent diffraction imaging in the extreme ultraviolet.
Zürch, M; Rothhardt, J; Hädrich, S; Demmler, S; Krebs, M; Limpert, J; Tünnermann, A; Guggenmos, A; Kleineberg, U; Spielmann, C
2014-12-08
Coherent Diffraction Imaging is a technique to study matter with nanometer-scale spatial resolution based on coherent illumination of the sample with hard X-ray, soft X-ray or extreme ultraviolet light delivered from synchrotrons or, more recently, X-ray Free-Electron Lasers. This robust technique simultaneously allows quantitative amplitude and phase contrast imaging. Laser-driven high-harmonic-generation XUV sources allow table-top realizations. However, the low conversion efficiency of lab-based sources requires either a large-scale laser system or long exposure times, preventing many applications. Here we present a lensless imaging experiment combining a high-numerical-aperture (NA = 0.8) setup with a high-average-power, fibre-laser-driven high-harmonic source. The high flux and narrow-band harmonic line at 33.2 nm enables either sub-wavelength spatial resolution close to the Abbe limit (Δr = 0.8λ) for long exposure times, or sub-70 nm imaging in less than one second. The unprecedented high spatial resolution and compactness of the setup, together with the real-time capability, pave the way for a plethora of applications in fundamental and life sciences.
Phytoplankton plasticity drives large variability in carbon fixation efficiency
NASA Astrophysics Data System (ADS)
Ayata, Sakina-Dorothée.; Lévy, Marina; Aumont, Olivier; Resplandy, Laure; Tagliabue, Alessandro; Sciandra, Antoine; Bernard, Olivier
2014-12-01
Phytoplankton C:N stoichiometry is highly flexible due to physiological plasticity, which could lead to high variations in carbon fixation efficiency (carbon consumption relative to nitrogen). However, the magnitude, as well as the spatial and temporal scales of this variability, remains poorly constrained. We used a high-resolution biogeochemical model resolving a wide range of spatial and temporal scales in order to quantify and better understand this variability. We find that the phytoplankton C:N ratio is highly variable at all spatial and temporal scales (5-12 molC/molN), from the mesoscale to the regional scale, and is mainly driven by nitrogen supply. Carbon fixation efficiency varies accordingly at all scales (±30%), with higher values under oligotrophic conditions and lower values under eutrophic conditions. Hence, phytoplankton plasticity may act as a buffer by attenuating carbon sequestration variability. Our results have implications for in situ estimations of C:N ratios and for future predictions in a high-CO2 world.
Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado
NASA Astrophysics Data System (ADS)
Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.
2015-12-01
Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface-infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as the urban population is expected to continue growing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national-scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such, it is important to address how urban landscapes can be represented in large-scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve flow prediction and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling the standard data needs necessary for fine-scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high-resolution urban models for short-term flood prediction.
NASA Astrophysics Data System (ADS)
Rengarajan, Rajagopalan; Goodenough, Adam A.; Schott, John R.
2016-10-01
Many remote sensing applications rely on simulated scenes to perform complex interaction and sensitivity studies that are not possible with real-world scenes. These applications include the development and validation of new and existing algorithms, understanding of the sensor's performance prior to launch, and trade studies to determine ideal sensor configurations. The accuracy of these applications is dependent on the realism of the modeled scenes and sensors. The Digital Image and Remote Sensing Image Generation (DIRSIG) tool has been used extensively to model the complex spectral and spatial texture variation expected in large city-scale scenes and natural biomes. In the past, material properties that were used to represent targets in the simulated scenes were often assumed to be Lambertian in the absence of hand-measured directional data. However, this assumption presents a limitation for new algorithms that need to recognize the anisotropic behavior of targets. We have developed a new method to model and simulate large-scale high-resolution terrestrial scenes by combining bi-directional reflectance distribution function (BRDF) products from Moderate Resolution Imaging Spectroradiometer (MODIS) data, high spatial resolution data, and hyperspectral data. The high spatial resolution data is used to separate materials and add textural variations to the scene, and the directional hemispherical reflectance from the hyperspectral data is used to adjust the magnitude of the MODIS BRDF. In this method, the shape of the BRDF is preserved since it changes very slowly, but its magnitude is varied based on the high resolution texture and hyperspectral data. In addition to the MODIS derived BRDF, target/class specific BRDF values or functions can also be applied to features of specific interest. The purpose of this paper is to discuss the techniques and the methodology used to model a forest region at a high resolution. 
The simulated scenes using this method for varying view angles show the expected variations in reflectance due to the BRDF effects of the Harvard forest. The effectiveness of this technique in simulating real sensor data is evaluated by comparing the simulated data with Landsat 8 Operational Land Imager (OLI) data over the Harvard forest. Regions of interest were selected from the simulated and the real data for different targets and their Top-of-Atmosphere (TOA) radiances were compared. After applying a scaling correction for the difference in atmospheric conditions between the simulated and the real data, the TOA radiance is found to agree within 5% in the NIR band and 10% in the visible bands for forest targets under similar illumination conditions. The technique presented in this paper can be extended to other biomes (e.g. desert regions and agricultural regions) by using the appropriate geographic regions. Since the entire scene is constructed in a simulated environment, parameters such as the BRDF or its effects can be analyzed for general or target-specific algorithm improvements. Also, the modeling and simulation techniques can be used as a baseline for the development and comparison of new sensor designs and to investigate the operational and environmental factors that affect sensor constellations such as the Sentinel and Landsat missions.
High-resolution simulation of deep pencil beam surveys - analysis of quasi-periodicity
NASA Astrophysics Data System (ADS)
Weiss, A. G.; Buchert, T.
1993-07-01
We carry out pencil beam constructions in a high-resolution simulation of the large-scale structure of galaxies. The initial density fluctuations are taken to have a truncated power spectrum. All the models have {OMEGA} = 1. As an example we present the results for the case of "Hot-Dark-Matter" (HDM) initial conditions with scale-free n = 1 power index on large scales as a representative of models with sufficient large-scale power. We use an analytic approximation for particle trajectories of a self-gravitating dust continuum and apply a local dynamical biasing of volume elements to identify luminous matter in the model. Using this method, we are able to resolve formally a simulation box of 1200h^-1^ Mpc (e.g. for HDM initial conditions) down to the scale of galactic halos using 2160^3^ particles. We consider this the minimal resolution necessary for a sensible simulation of deep pencil beam data. Pencil beam probes are taken for a given epoch using the parameters of observed beams. In particular, our analysis concentrates on the detection of a quasi-periodicity in the beam probes using several different methods. The resulting beam ensembles are analyzed statistically using number distributions, pair-count histograms, unnormalized pair-counts, power spectrum analysis and trial-period folding. Periodicities are classified according to their significance level in the power spectrum of the beams. The simulation is designed for application to parameter studies which prepare future observational projects. We find that a large percentage of the beams show quasi-periodicities with periods which cluster at a certain length scale. The periods found range between one and eight times the cutoff length in the initial fluctuation spectrum. At significance levels similar to those of the data of Broadhurst et al.
(1990), we find about 15% of the pencil beams to show periodicities, about 30% of which are around the mean separation of rich clusters, while the distribution of scales reaches values of more than 200h^-1^ Mpc. The detection of periodicities larger than the typical void size need not be due to missed "walls" (like the so-called "Great Wall" seen in the CfA catalogue of galaxies), but can instead reflect different clustering properties of galaxies along the beams.
NASA Astrophysics Data System (ADS)
Li, Hui; Sriver, Ryan L.
2018-01-01
High-resolution Atmosphere General Circulation Models (AGCMs) are capable of directly simulating realistic tropical cyclone (TC) statistics, providing a promising approach for TC-climate studies. Active air-sea coupling in a coupled model framework is essential to capturing TC-ocean interactions, which can influence TC-climate connections on interannual to decadal time scales. Here we investigate how the choice of ocean coupling can affect the directly simulated TCs using high-resolution configurations of the Community Earth System Model (CESM). We performed a suite of high-resolution, multidecadal, global-scale CESM simulations in which the atmosphere (~0.25° grid spacing) is configured with three different levels of ocean coupling: prescribed climatological sea surface temperature (SST) (ATM), mixed layer ocean (SLAB), and dynamic ocean (CPL). We find that different levels of ocean coupling can influence simulated TC frequency, geographical distributions, and storm intensity. ATM simulates more storms and higher overall storm intensity than the coupled simulations. It also simulates higher TC track density over the eastern Pacific and the North Atlantic, while TC tracks are relatively sparse within CPL and SLAB for these regions. Storm intensification and the maximum wind speed are sensitive to the representations of local surface flux feedbacks in different coupling configurations. Key differences in storm number and distribution can be attributed to variations in the modeled large-scale climate mean state and variability that arise from the combined effect of intrinsic model biases and air-sea interactions. Results help to improve our understanding of the representation of TCs in high-resolution coupled Earth system models, with important implications for TC-climate applications.
NASA Astrophysics Data System (ADS)
Hewitt, Helene T.; Bell, Michael J.; Chassignet, Eric P.; Czaja, Arnaud; Ferreira, David; Griffies, Stephen M.; Hyder, Pat; McClean, Julie L.; New, Adrian L.; Roberts, Malcolm J.
2017-12-01
As the importance of the ocean in the weather and climate system is increasingly recognised, operational systems are now moving towards coupled prediction not only for seasonal to climate timescales but also for short-range forecasts. A three-way tension exists between the allocation of computing resources to refine model resolution, the expansion of model complexity/capability, and the increase of ensemble size. Here we review evidence for the benefits of increased ocean resolution in global coupled models, where the ocean component explicitly represents transient mesoscale eddies and narrow boundary currents. We consider lessons learned from forced ocean/sea-ice simulations; from studies concerning the SST resolution required to impact atmospheric simulations; and from coupled predictions. Impacts of the mesoscale ocean in western boundary current regions on the large-scale atmospheric state have been identified. Understanding of air-sea feedback in western boundary currents is modifying our view of the dynamics in these key regions. It remains unclear whether variability associated with open-ocean mesoscale eddies is equally important to the large-scale atmospheric state. We include a discussion of what processes can presently be parameterised in coupled models with coarse-resolution, non-eddying ocean components, and where parameterisations may fall short. We discuss the benefits of resolution and identify gaps in the current literature that leave important questions unanswered.
Automated geographic registration and radiometric correction for UAV-based mosaics
USDA-ARS?s Scientific Manuscript database
Texas A&M University has been operating a large-scale, UAV-based, agricultural remote-sensing research project since 2015. To use UAV-based images in agricultural production, many high-resolution images must be mosaicked together to create an image of an agricultural field. Two key difficulties to s...
Large-Scale Biomonitoring of Remote and Threatened Ecosystems via High-Throughput Sequencing
Gibson, Joel F.; Shokralla, Shadi; Curry, Colin; Baird, Donald J.; Monk, Wendy A.; King, Ian; Hajibabaei, Mehrdad
2015-01-01
Biodiversity metrics are critical for assessment and monitoring of ecosystems threatened by anthropogenic stressors. Existing sorting and identification methods are too expensive and labour-intensive to be scaled up to meet management needs. Alternatively, a high-throughput DNA sequencing approach could be used to determine biodiversity metrics from bulk environmental samples collected as part of a large-scale biomonitoring program. Here we show that both morphological and DNA sequence-based analyses are suitable for recovery of individual taxonomic richness, estimation of proportional abundance, and calculation of biodiversity metrics using a set of 24 benthic samples collected in the Peace-Athabasca Delta region of Canada. The high-throughput sequencing approach was able to recover all metrics with a higher degree of taxonomic resolution than morphological analysis. The reduced cost and increased capacity of DNA sequence-based approaches will finally allow environmental monitoring programs to operate at the geographical and temporal scale required by industrial and regulatory end-users. PMID:26488407
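The simplest of the metrics above, taxonomic richness and proportional abundance, reduce to counting once each specimen or sequence read has been assigned to a taxon. A minimal sketch (the taxon names are hypothetical, and real pipelines work from assigned reads rather than a flat list):

```python
from collections import Counter

def biodiversity_metrics(taxon_records):
    # taxon_records: one taxon name per identified specimen or read.
    # Returns taxonomic richness (number of distinct taxa) and the
    # proportional abundance of each taxon.
    counts = Counter(taxon_records)
    total = sum(counts.values())
    richness = len(counts)
    proportions = {taxon: n / total for taxon, n in counts.items()}
    return richness, proportions
```

Either a morphological sort or a high-throughput sequencing run ultimately yields such a record list, which is why the two approaches can be compared on the same metrics.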
StePS: Stereographically Projected Cosmological Simulations
NASA Astrophysics Data System (ADS)
Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László
2018-05-01
StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and a naive O(N^2) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The O(N^2) force calculation is easy to adapt to modern graphics cards, hence StePS can function as a high-speed prediction tool for modern large-scale surveys.
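The direct-summation O(N^2) force calculation at the heart of such codes can be sketched in a few lines. This toy version uses Plummer softening and G = 1 units, both assumptions for illustration, and is plain Python rather than StePS's actual GPU kernels:

```python
def pairwise_forces(pos, mass, G=1.0, eps=1e-3):
    # Direct-summation gravitational accelerations: every pair of
    # particles interacts, giving O(N^2) work. eps is a Plummer
    # softening length that prevents divergence at zero separation.
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = dx[0] ** 2 + dx[1] ** 2 + dx[2] ** 2 + eps ** 2
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * mass[j] * dx[k] * inv_r3
    return acc
```

The doubly nested loop is exactly the structure that maps well onto graphics cards: each particle's acceleration sum is independent, so one GPU thread per particle suffices.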
NASA Astrophysics Data System (ADS)
Kaiser, Jennifer; Jacob, Daniel J.; Zhu, Lei; Travis, Katherine R.; Fisher, Jenny A.; González Abad, Gonzalo; Zhang, Lin; Zhang, Xuesong; Fried, Alan; Crounse, John D.; St. Clair, Jason M.; Wisthaler, Armin
2018-04-01
Isoprene emissions from vegetation have a large effect on atmospheric chemistry and air quality. Bottom-up isoprene emission inventories used in atmospheric models are based on limited vegetation information and uncertain land cover data, leading to potentially large errors. Satellite observations of atmospheric formaldehyde (HCHO), a high-yield isoprene oxidation product, provide top-down information to evaluate isoprene emission inventories through inverse analyses. Past inverse analyses have, however, been hampered by uncertainty in the HCHO satellite data, uncertainty in the time- and NOx-dependent yield of HCHO from isoprene oxidation, and the coarse resolution of the atmospheric models used for the inversion. Here we demonstrate the ability to use HCHO satellite data from OMI in a high-resolution inversion to constrain isoprene emissions on ecosystem-relevant scales. The inversion uses the adjoint of the GEOS-Chem chemical transport model at 0.25° × 0.3125° horizontal resolution to interpret observations over the southeast US in August-September 2013. It takes advantage of concurrent NASA SEAC4RS aircraft observations of isoprene and its oxidation products including HCHO to validate the OMI HCHO data over the region, test the GEOS-Chem isoprene oxidation mechanism and NOx environment, and independently evaluate the inversion. This evaluation shows in particular that local model errors in NOx concentrations propagate to biases in inferring isoprene emissions from HCHO data. It is thus essential to correct model NOx biases, which was done here using SEAC4RS observations but can be done more generally using satellite NO2 data concurrently with HCHO. We find in our inversion that isoprene emissions from the widely used MEGAN v2.1 inventory are biased high over the southeast US by 40% on average, although the broad-scale distributions are correct, including maximum emissions in Arkansas/Louisiana and high base emission factors in the oak-covered Ozarks of southeast Missouri. A particularly large discrepancy is in the Edwards Plateau of central Texas, where MEGAN v2.1 is too high by a factor of 3, possibly reflecting errors in land cover. The lower isoprene emissions inferred from our inversion, when implemented into GEOS-Chem, decrease surface ozone over the southeast US by 1-3 ppb and decrease the isoprene contribution to organic aerosol from 40% to 20%.
NASA Astrophysics Data System (ADS)
Kang, H.; Kourafalou, V. H.; Hogan, P. J.; Smedstad, O.
2008-12-01
The South Florida coastal seas include shelf areas and shallow water bodies around ecologically fragile environments and Marine Protected Areas, such as Florida Bay, the Florida Keys National Marine Sanctuary (around the largest coral reef system of the continental U.S.) and the Dry Tortugas Ecological Reserve. Man-made changes in the hydrology of the Everglades have caused dramatic degradation of the coastal ecosystem through discharge in Florida Bay. New management scenarios are under way to restore historical flows. The environmental impacts of the management proposals are examined with an inter-disciplinary, multi-nested modeling system. The HYbrid Coordinate Ocean Model (HYCOM) has been employed for the Regional Model for South Florida Coastal Seas (SoFLA-HYCOM, 1/25 degree resolution) and for the embedded, high-resolution coastal Florida Keys model (FKEYS-HYCOM, 1/100 degree). Boundary conditions are extracted from GODAE products: the large-scale North Atlantic model (ATL-HYCOM, 1/12 degree) and the intermediate-scale Gulf of Mexico model (GOM-HYCOM, 1/25 degree). The study targets the impacts of large-scale oceanic features on coastal dynamics. Eddies that travel along the Loop Current/Florida Current front are known to be an important mechanism for the interaction of nearshore and offshore flows. The high-resolution FKEYS simulations reveal both mesoscale and sub-mesoscale eddy passages during a targeted 2-year simulation period (2004-2005), forced with high-resolution/high-frequency atmospheric forcing. Eddies influence sea level changes in the vicinity of Florida Bay with possible implications for current and future flushing patterns.
They also enable upwelling of cooler, nutrient-rich waters in the vicinity of the Reef Tract and they influence transport and recruitment pathways for coral fish larvae, as they carry waters of different properties (such as river-borne low-salinity/nutrient-rich waters from as far as the Mississippi River) and waters containing larvae from upstream sources (such as from the Dry Tortugas spawning grounds).
NASA Astrophysics Data System (ADS)
Hamada, Y.; O'Connor, B. L.
2012-12-01
Development in arid environments often results in the loss and degradation of the ephemeral streams that provide habitat and critical ecosystem functions such as water delivery, sediment transport, and groundwater recharge. Quantification of these ecosystem functions is challenging because of the episodic nature of runoff events in desert landscapes and the large spatial scale of watersheds that potentially can be impacted by large-scale development. Low-impact development guidelines and regulatory protection of ephemeral streams are often lacking due to the difficulty of accurately mapping and quantifying the critical functions of ephemeral streams at scales larger than individual reaches. Renewable energy development in arid regions has the potential to disturb ephemeral streams at the watershed scale, and it is necessary to develop environmental monitoring applications for ephemeral streams to help inform land management and regulatory actions aimed at protecting and mitigating for impacts related to large-scale land disturbances. This study focuses on developing remote sensing methodologies to identify and monitor impacts on ephemeral streams resulting from the land disturbance associated with utility-scale solar energy development in the desert southwest of the United States. Airborne very high resolution (VHR) multispectral imagery is used to produce stereoscopic, three-dimensional landscape models that can be used to (1) identify and map ephemeral stream channel networks, and (2) support analyses and models of hydrologic and sediment transport processes that pertain to the critical functionality of ephemeral streams. Spectral and statistical analyses are being developed to extract information about ephemeral channel location and extent, micro-topography, riparian vegetation, and soil moisture characteristics. 
This presentation will demonstrate initial results and provide a framework for future work associated with this project, for developing the field measurements necessary to verify remote sensing landscape models, and for generating hydrologic models and analyses.
Large-scale horizontal flows from SOUP observations of solar granulation
NASA Astrophysics Data System (ADS)
November, L. J.; Simon, G. W.; Tarbell, T. D.; Title, A. M.; Ferguson, S. H.
1987-09-01
Using high-resolution time-sequence photographs of solar granulation from the SOUP experiment on Spacelab 2, the authors observed large-scale horizontal flows in the solar surface. The measurement method is based upon a local spatial cross-correlation analysis. The horizontal motions have amplitudes in the range 300 to 1000 m/s. Radial outflow of granulation from a sunspot penumbra into the surrounding photosphere is a striking new discovery. Both the supergranulation pattern and cellular structures having the scale of mesogranulation are seen. The vertical flows that are inferred by continuity of mass from these observed horizontal flows have larger upflow amplitudes in cell centers than downflow amplitudes at cell boundaries.
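A local spatial cross-correlation measurement of this kind can be sketched as a search for the integer-pixel shift between two successive frames that maximizes their overlap correlation. Real local correlation tracking uses apodized windows and subpixel interpolation; this brute-force toy, with assumed periodic wrap-around, only illustrates the principle:

```python
import numpy as np

def local_displacement(frame1, frame2, max_shift=3):
    # Find the integer shift (dy, dx) of frame2 relative to frame1
    # that maximizes the spatial cross-correlation. np.roll treats
    # the patch as periodic, a simplifying assumption for the sketch.
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(frame2, -dy, axis=0), -dx, axis=1)
            score = np.sum(frame1 * shifted)
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift
```

Dividing the recovered displacement (in pixels) by the frame cadence and multiplying by the pixel scale converts it to a horizontal velocity, the quantity quoted above in the range 300 to 1000 m/s.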
Mini-Sosie high-resolution seismic method aids hazards studies
Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.
1992-01-01
The Mini-Sosie high-resolution seismic method has been effective in imaging shallow-structure and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians.
The Athena X-ray Integral Field Unit (X-IFU)
NASA Astrophysics Data System (ADS)
Pajot, F.; Barret, D.; Lam-Trong, T.; den Herder, J.-W.; Piro, L.; Cappi, M.; Huovelin, J.; Kelley, R.; Mas-Hesse, J. M.; Mitsuda, K.; Paltani, S.; Rauw, G.; Rozanska, A.; Wilms, J.; Barbera, M.; Douchin, F.; Geoffray, H.; den Hartog, R.; Kilbourne, C.; Le Du, M.; Macculi, C.; Mesnager, J.-M.; Peille, P.
2018-04-01
The X-ray Integral Field Unit (X-IFU) of the Advanced Telescope for High-ENergy Astrophysics (Athena) large-scale mission of ESA will provide spatially resolved high-resolution X-ray spectroscopy from 0.2 to 12 keV, with 5″ pixels over a field of view of 5 arc minutes equivalent diameter and a spectral resolution of 2.5 eV (FWHM) up to 7 keV. The core scientific objectives of Athena drive the main performance parameters of the X-IFU. We present the current reference configuration of the X-IFU, and the key issues driving the design of the instrument.
NASA Astrophysics Data System (ADS)
Lu, Chieh Han; Chen, Peilin; Chen, Bi-Chang
2017-02-01
Optical imaging techniques provide much important information for understanding the life sciences, especially cellular structure and morphology, because "seeing is believing". However, the resolution of optical imaging is bounded by the diffraction limit discovered by Ernst Abbe, i.e. λ/(2NA), where NA is the numerical aperture of the objective lens. Fluorescence super-resolution microscopy techniques such as stimulated emission depletion microscopy (STED), photoactivated localization microscopy (PALM), and stochastic optical reconstruction microscopy (STORM) were invented to see biological entities down to the molecular level, smaller than the diffraction limit (around 200 nm in lateral resolution). These techniques do not physically violate the Abbe limit of resolution but exploit the photoluminescence properties and labelling specificity of fluorescent molecules to achieve super-resolution imaging. However, these super-resolution techniques are largely limited to 2D imaging of fixed or dead samples due to the high laser power needed or the slow localization process. Beyond 2D imaging, light-sheet microscopy has proven valuable for 3D imaging at much better spatiotemporal resolution due to its intrinsic optical sectioning and high imaging speed. Herein, we combine the advantages of localization microscopy and light-sheet microscopy to achieve super-resolved cellular imaging in 3D across a large field of view. With high-density labeling of spontaneously blinking fluorophores and the wide-field detection of light-sheet microscopy, we can construct 3D super-resolution multi-cellular images at high speed (within minutes) by light-sheet single-molecule localization microscopy.
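The Abbe limit quoted above is simply λ/(2NA). As a quick check, a 500 nm wavelength with a high-NA (1.25) oil-immersion objective, illustrative values not taken from the abstract, reproduces the ~200 nm lateral-resolution figure cited:

```python
def abbe_limit_nm(wavelength_nm, numerical_aperture):
    # Abbe diffraction limit on lateral resolution: d = λ / (2 NA).
    return wavelength_nm / (2.0 * numerical_aperture)
```

Lowering the NA or lengthening the wavelength degrades the limit proportionally, which is why super-resolution methods must sidestep rather than beat this formula.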
Towards a Full-sky, High-resolution Dust Extinction Map with WISE and Planck
NASA Astrophysics Data System (ADS)
Meisner, Aaron M.; Finkbeiner, D. P.
2014-01-01
We have recently completed a custom processing of the entire WISE 12 micron All-sky imaging data set. The result is a full-sky map of diffuse, mid-infrared Galactic dust emission with angular resolution of 15 arcseconds, and with contaminating artifacts such as compact sources removed. At the same time, the 2013 Planck HFI maps represent a complementary data set in the far-infrared, with zero-point relatively immune to zodiacal contamination and angular resolution superior to previous full-sky data sets at similar frequencies. Taken together, these WISE and Planck data products present an opportunity to improve upon the SFD (1998) dust extinction map, by virtue of enhanced angular resolution and potentially better-controlled systematics on large scales. We describe our continuing efforts to construct and test high-resolution dust extinction and temperature maps based on our custom WISE processing and Planck HFI data.
Beyond Solar-B: MTRAP, the Magnetic Transition Region Probe
NASA Technical Reports Server (NTRS)
Davis, John M.; Moore, Ronald L.; Hathaway, David H.
2003-01-01
The next generation of solar missions will reveal and measure fine-scale solar magnetic fields and their effects in the solar atmosphere at heights, small scales, sensitivities, and fields of view well beyond the reach of Solar-B. The necessity for, and potential of, such observations for understanding solar magnetic fields, their generation in and below the photosphere, and their control of the solar atmosphere and heliosphere, were the focus of a science definition workshop, 'High-Resolution Solar Magnetography from Space: Beyond Solar-B,' held in Huntsville Alabama in April 2001. Forty internationally prominent scientists active in solar research involving fine-scale solar magnetism participated in this Workshop and reached consensus that the key science objective to be pursued beyond Solar-B is a physical understanding of the fine-scale magnetic structure and activity in the magnetic transition region, defined as the region between the photosphere and corona where neither the plasma nor the magnetic field strongly dominates the other. The observational objective requires high cadence (less than 10s) vector magnetic field maps, and spatially resolved spectra from the IR, visible, vacuum UV, to the EUV at high resolution (less than 50km) over a large FOV (approximately 140,000 km). A polarimetric resolution of one part in ten thousand is required to measure transverse magnetic fields of less than 30G. The latest SEC Roadmap includes a mission identified as MTRAP to meet these requirements. Enabling technology development requirements include large, lightweight, reflecting optics, large format sensors (16K x 16K pixels) with high QE at 150 nm, and extendable spacecraft structures. 
The Science Organizing Committee of the Beyond Solar-B Workshop recommends that: (1) Science and Technology Definition Teams should be established in FY04 to finalize the science requirements and to define technology development efforts needed to ensure the practicality of MTRAP's observational goals; (2) The necessary technology development funding should be included in Code S budgets for FY06 and beyond to prepare MTRAP for a new start no later than the nominal end of the Solar-B mission, around 2010.
Beyond Solar-B: MTRAP, the Magnetic TRAnsition Region Probe
NASA Astrophysics Data System (ADS)
Davis, J. M.; Moore, R. L.; Hathaway, D. H.; Science Definition Committee, High-Resolution Solar Magnetography Beyond Solar-B Team
2003-05-01
The next generation of solar missions will reveal and measure fine-scale solar magnetic fields and their effects in the solar atmosphere at heights, small scales, sensitivities, and fields of view well beyond the reach of Solar-B. The necessity for, and potential of, such observations for understanding solar magnetic fields, their generation in and below the photosphere, and their control of the solar atmosphere and heliosphere, were the focus of a science definition workshop, "High-Resolution Solar Magnetography from Space: Beyond Solar-B," held in Huntsville, Alabama, in April 2001. Forty internationally prominent scientists active in solar research involving fine-scale solar magnetism participated in this Workshop and reached consensus that the key science objective to be pursued beyond Solar-B is a physical understanding of the fine-scale magnetic structure and activity in the magnetic transition region, defined as the region between the photosphere and corona where neither the plasma nor the magnetic field strongly dominates the other. The observational objective requires high-cadence (< 10 s) vector magnetic field maps, and spatially resolved spectra from the IR, visible, and vacuum UV to the EUV at high resolution (< 50 km) over a large FOV (~140,000 km). A polarimetric resolution of one part in ten thousand is required to measure transverse magnetic fields of < 30 G. The latest SEC Roadmap includes a mission identified as MTRAP to meet these requirements. Enabling technology development requirements include large, lightweight, reflecting optics, large-format sensors (16K x 16K pixels) with high QE at 150 nm, and extendable spacecraft structures. The Science Organizing Committee of the Beyond Solar-B Workshop recommends that: (1) Science and Technology Definition Teams should be established in FY04 to finalize the science requirements and to define technology development efforts needed to ensure the practicality of MTRAP's observational goals; and (2) 
the necessary technology development funding should be included in Code S budgets for FY06 and beyond to prepare MTRAP for a new start no later than the nominal end of the Solar-B mission, around 2010.
NASA Astrophysics Data System (ADS)
Miles, B.; Chepudira, K.; LaBar, W.
2017-12-01
The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomenon types (e.g., fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its Message Queue Telemetry Transport (MQTT) extension, STA is also well suited to efficient real-time data publishing and discovery. All these attributes make STA potentially valuable for environmental monitoring sensor networks. Here we present Kinota(TM), an open-source NoSQL implementation of OGC SensorThings for large-scale, high-resolution, real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, Kinota's architecture is very modular, allowing adopters to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose among cloud providers such as Azure, Amazon, and Google. 
The scalable, flexible, and cloud-friendly architecture of Kinota makes it ideal for next-generation large-scale, high-resolution real-time environmental monitoring networks used in domains such as hydrology, geomorphology, and geophysics, as well as in management applications such as flood early warning and regulatory enforcement.
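As a concrete illustration of the OData-based query language and MQTT publishing described above, the sketch below builds an Observations query URL and the corresponding MQTT subscription topic. The entity names (`Datastreams`, `Observations`, `phenomenonTime`) follow the STA specification; the function names, server URL, and identifiers are hypothetical, for illustration only.

```python
from urllib.parse import urlencode

def sta_observations_url(base_url, datastream_id, start_iso, end_iso, top=100):
    """Build a SensorThings API URL selecting a Datastream's Observations
    inside a time window, oldest first, using OData query options."""
    filt = f"phenomenonTime ge {start_iso} and phenomenonTime le {end_iso}"
    query = urlencode({"$filter": filt,
                       "$orderby": "phenomenonTime asc",
                       "$top": top})
    return f"{base_url}/Datastreams({datastream_id})/Observations?{query}"

def sta_mqtt_topic(datastream_id, version="v1.0"):
    """MQTT topic on which an STA server publishes new Observations
    for a Datastream, per the STA MQTT extension."""
    return f"{version}/Datastreams({datastream_id})/Observations"
```

A real-time client would subscribe to the MQTT topic for push updates and fall back to the HTTP query for historical back-fill.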
NASA Astrophysics Data System (ADS)
Hugue, F.; Lapointe, M.; Eaton, B. C.; Lepoutre, A.
2016-01-01
We illustrate an approach to quantify patterns in hydraulic habitat composition and local heterogeneity applicable at low cost over very large river extents, with selectable reach window scales. Ongoing developments in remote sensing and geographical information science massively improve efficiencies in analyzing earth surface features. With the development of new satellite sensors and drone platforms, and with the lowered cost of high-resolution multispectral imagery, fluvial geomorphology is experiencing a revolution in mapping streams at high resolution. Exploiting the power of aerial or satellite imagery is particularly useful in a riverscape research framework (Fausch et al., 2002), where high-resolution sampling of fluvial features and very large coverage extents are needed. This study presents a satellite remote sensing method that requires very limited field calibration data to estimate, over scales ranging from 1 m to many tens of river kilometers, (i) spatial composition metrics for key hydraulic mesohabitat types and (ii) reach-scale wetted habitat heterogeneity indices such as the hydromorphological index of diversity (HMID). When the purpose is hydraulic habitat characterization applied over long river networks, the proposed method (although less accurate) is much less computationally expensive and less data demanding than two-dimensional computational fluid dynamics (CFD). Here, we illustrate the tools based on a WorldView-2 satellite image of the Kiamika River, near Mont-Laurier, Quebec, Canada, specifically over a 17-km river reach below the Kiamika dam. In the first step, a high-resolution water depth (D) map is produced from a spectral band ratio (calculated from the multispectral image), calibrated with limited field measurements. 
Next, based only on known river discharge and estimated cross-section depths at the time of image capture, empirically based pseudo-2D hydraulic rules are used to rapidly generate a two-dimensional map of flow velocity (V) over the 17-km Kiamika reach. The joint distribution of the D and V variables over wetted zones is then used to reveal structural patterns in hydraulic habitat availability at patch, reach, and segment scales. Here we analyze 156 bivariate (D, V) density function plots estimated over moving reach windows along the satellite scene extent to extract 14 physical habitat metrics (such as river width, mean and modal depth and velocity, variances and covariance in D and V over 1-m pixels, HMID, and entropy). A principal component analysis on the set of metrics is then used to cluster river reaches according to the similarity of their hydraulic habitat composition and heterogeneity. Applications of this approach can include (i) detection of specific fish habitats at riverscape scales (e.g., large areas of riffle spawning beds, deeper pools) for regional management, (ii) studying how river habitat heterogeneity is correlated with fish distribution, and (iii) guidance in siting restoration of key habitats or post-regulation monitoring of representative reaches of various types.
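The HMID mentioned above is commonly defined (following Gostner et al.) as the product of squared terms built from the coefficients of variation of depth and velocity over wetted pixels; a minimal sketch, assuming that formulation:

```python
import numpy as np

def hmid(depth, velocity):
    """Hydromorphological index of diversity over wetted pixels:
    HMID = (1 + CV_d)^2 * (1 + CV_v)^2, where CV is the coefficient
    of variation (std / mean). A perfectly uniform reach gives 1."""
    d = np.asarray(depth, dtype=float)
    v = np.asarray(velocity, dtype=float)
    cv_d = d.std() / d.mean()
    cv_v = v.std() / v.mean()
    return (1.0 + cv_d) ** 2 * (1.0 + cv_v) ** 2
```

Applied per moving reach window over the D and V rasters, this yields the heterogeneity component of the 14-metric set described in the abstract.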
NASA Technical Reports Server (NTRS)
Campbell, W. J.; Josberger, E. G.; Gloersen, P.; Johannessen, O. M.; Guest, P. S.
1987-01-01
The data acquired during the summer 1984 Marginal Ice Zone Experiment in the Fram Strait-Greenland Sea marginal ice zone, using airborne active and passive microwave sensors and the Nimbus 7 SMMR, were analyzed to compile a sequential description of the mesoscale and large-scale ice morphology variations during the period of June 6 - July 16, 1984. Throughout the experiment, the long ice edge between northwest Svalbard and central Greenland meandered; eddies were repeatedly formed, moved, and disappeared but the ice edge remained within a 100-km-wide zone. The ice pack behind this alternately diffuse and compact edge underwent rapid and pronounced variations in ice concentration over a 200-km-wide zone. The high-resolution ice concentration distributions obtained in the aircraft images agree well with the low-resolution distributions of SMMR images.
An Eye Model for Computational Dosimetry Using A Multi-Scale Voxel Phantom
NASA Astrophysics Data System (ADS)
Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek
2014-06-01
The lens of the eye is a radiosensitive tissue, with cataract formation being the major concern. Recently reduced recommended dose limits for the lens of the eye have made understanding the dose to this tissue increasingly important. Due to memory limitations, the voxel resolution of computational phantoms used for radiation dose calculations is too coarse to accurately represent the dimensions of the eye. A revised eye model is constructed using physiological data for the dimensions of radiosensitive tissues and is then transformed into a high-resolution voxel model. This eye model is combined with an existing set of whole-body models to form a multi-scale voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method are compared against existing computational phantoms.
Visualization of the Eastern Renewable Generation Integration Study: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron
The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.
Azmy, Muna Maryam; Hashim, Mazlan; Numata, Shinya; Hosaka, Tetsuro; Noor, Nur Supardi Md.; Fletcher, Christine
2016-01-01
General flowering (GF) is a unique phenomenon wherein, at irregular intervals, taxonomically diverse trees in Southeast Asian dipterocarp forests synchronize their reproduction at the community level. Triggers of GF, including drought and low minimum temperatures a few months beforehand, have been observed only to a limited extent across large regional scales due to the lack of meteorological stations. Here, we aim to identify the climatic conditions that trigger large-scale GF in Peninsular Malaysia using satellite sensors, the Tropical Rainfall Measuring Mission (TRMM) and the Moderate Resolution Imaging Spectroradiometer (MODIS), to evaluate the climatic conditions of focal forests. We observed antecedent drought, low temperature, and high photosynthetic radiation conditions before large-scale GF events, suggesting that large-scale GF events could be triggered by these factors. In contrast, we found higher-magnitude GF in forests where lower precipitation preceded large-scale GF events. GF magnitude was also negatively influenced by land surface temperature (LST) for a large-scale GF event. Therefore, we suggest that the spatial extent of drought may be related to that of GF forests, and that the spatial pattern of LST may be related to that of GF occurrence. With significant new findings and other results consistent with previous research, we clarified complicated environmental correlates of the GF phenomenon. PMID:27561887
Azmy, Muna Maryam; Hashim, Mazlan; Numata, Shinya; Hosaka, Tetsuro; Noor, Nur Supardi Md; Fletcher, Christine
2016-08-26
General flowering (GF) is a unique phenomenon wherein, at irregular intervals, taxonomically diverse trees in Southeast Asian dipterocarp forests synchronize their reproduction at the community level. Triggers of GF, including drought and low minimum temperatures a few months beforehand, have been observed only to a limited extent across large regional scales due to the lack of meteorological stations. Here, we aim to identify the climatic conditions that trigger large-scale GF in Peninsular Malaysia using satellite sensors, the Tropical Rainfall Measuring Mission (TRMM) and the Moderate Resolution Imaging Spectroradiometer (MODIS), to evaluate the climatic conditions of focal forests. We observed antecedent drought, low temperature, and high photosynthetic radiation conditions before large-scale GF events, suggesting that large-scale GF events could be triggered by these factors. In contrast, we found higher-magnitude GF in forests where lower precipitation preceded large-scale GF events. GF magnitude was also negatively influenced by land surface temperature (LST) for a large-scale GF event. Therefore, we suggest that the spatial extent of drought may be related to that of GF forests, and that the spatial pattern of LST may be related to that of GF occurrence. With significant new findings and other results consistent with previous research, we clarified complicated environmental correlates of the GF phenomenon.
Podshivalov, L; Fischer, A; Bar-Yoseph, P Z
2011-04-01
This paper describes a new alternative for individualized mechanical analysis of bone trabecular structure. This new method closes the gap between the classic homogenization approach, applied to macro-scale models, and the modern micro-finite-element method, applied directly to micro-scale high-resolution models. The method is based on multiresolution geometrical modeling that generates intermediate structural levels. A new method for estimating multiscale material properties has also been developed to facilitate reliable and efficient mechanical analysis. What makes this method unique is that it enables direct and interactive analysis of the model at every intermediate level. Such flexibility is of principal importance in the analysis of trabecular porous structure. The method enables physicians to zoom in dynamically and focus on the volume of interest (VOI), thus paving the way for a large class of investigations into the mechanical behavior of bone structure. This is one of the very few methods in the field of computational biomechanics that applies mechanical analysis adaptively to large-scale, high-resolution models. The proposed computational multiscale FE method can serve as an infrastructure for a future comprehensive computerized system for diagnosis of bone structures. The aim of such a system is to assist physicians in diagnosis, prognosis, drug treatment simulation, and monitoring. Such a system can provide a better understanding of the disease and hence benefit patients by providing better, more individualized treatment and high-quality healthcare. In this paper, we demonstrate the feasibility of our method on a high-resolution model of the L3 vertebra. Copyright © 2010 Elsevier Inc. All rights reserved.
Regional climate model sensitivity to domain size
NASA Astrophysics Data System (ADS)
Leduc, Martin; Laprise, René
2009-05-01
Regional climate models are increasingly used to add small-scale features that are not present in their lateral boundary conditions (LBC). It is well known that the limited area over which a model is integrated must be large enough to allow the full development of small-scale features. On the other hand, integrations on very large domains have shown important departures from the driving data, unless large-scale nudging is applied. The issue of domain size is studied here using the “perfect model” approach. This method consists first of generating a high-resolution climatic simulation, nicknamed the big brother (BB), over a large domain of integration. The next step is to degrade this dataset with a low-pass filter emulating the usual coarse-resolution LBC. The filtered nesting data (FBB) are then used to drive a set of four simulations (LBs, for little brothers), with the same model, but on progressively smaller domain sizes. The LB statistics for a climate sample of four winter months are compared with BB over a common region. The time-averaged (stationary) and transient-eddy standard deviation patterns of the LB atmospheric fields generally improve in terms of spatial correlation with the reference (BB) as the domain gets smaller. The extraction of the small-scale features using a spectral filter allows the detection of important underestimations of the transient-eddy variability in the vicinity of the inflow boundary, which can penalize the use of small domains (less than 100 × 100 grid points). The permanent “spatial spin-up” corresponds to the characteristic distance that the large-scale flow needs to travel before developing small-scale features. The spin-up distance tends to grow at higher levels in the atmosphere.
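The low-pass filtering step that turns the big-brother fields into coarse-resolution driving data (FBB) can be sketched with a spectral cutoff; a minimal example for a doubly periodic 2-D field, assuming a simple sharp cutoff in total wavenumber (the published experiments use more carefully designed filters):

```python
import numpy as np

def lowpass_2d(field, keep_wavenumber):
    """Zero all Fourier modes with total wavenumber above `keep_wavenumber`,
    retaining only the large scales of a doubly periodic 2-D field."""
    F = np.fft.fft2(field)
    ny, nx = field.shape
    ky = np.fft.fftfreq(ny) * ny          # integer wavenumbers along y
    kx = np.fft.fftfreq(nx) * nx          # integer wavenumbers along x
    K = np.sqrt(ky[:, None] ** 2 + kx[None, :] ** 2)
    F[K > keep_wavenumber] = 0.0          # sharp spectral cutoff
    return np.real(np.fft.ifft2(F))
```

Feeding such filtered fields to the nested model, and comparing its regenerated small scales against the unfiltered reference, is the essence of the big-brother diagnostic.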
Analysis of terrestrial conditions and dynamics
NASA Technical Reports Server (NTRS)
Goward, S. N. (Principal Investigator)
1984-01-01
Land spectral reflectance properties for selected locations, including the Goddard Space Flight Center, the Wallops Flight Facility, a MLA test site in Cambridge, Maryland, and an acid test site in Burlington, Vermont, were measured. Methods to simulate the bidirectional reflectance properties of vegetated landscapes and a data base for spatial resolution were developed. North American vegetation patterns observed with the Advanced Very High Resolution Radiometer were assessed. Data and methods needed to model large-scale vegetation activity with remotely sensed observations and climate data were compiled.
NASA Astrophysics Data System (ADS)
Candela, S. G.; Howat, I.; Noh, M. J.; Porter, C. C.; Morin, P. J.
2016-12-01
In the last decade, high-resolution satellite imagery has become an increasingly accessible tool for geoscientists to quantify changes in the Arctic land surface due to geophysical, ecological, and anthropogenic processes. However, the trade-off between spatial coverage and spatio-temporal resolution has limited detailed, process-level change detection over large (i.e., continental) scales. The ArcticDEM project utilized over 300,000 WorldView image pairs to produce a nearly 100%-coverage elevation model (above 60°N), offering the first polar, high-spatial-resolution (2-8 m by region) dataset, often with multiple repeats in areas of particular interest to geoscientists. A dataset of this size (nearly 250 TB) offers endless new avenues of scientific inquiry, but quickly becomes unmanageable computationally and logistically for the computing resources available to the average scientist. Here we present TopoDiff, a framework for a generalized, automated workflow that requires minimal input from the end user about a study site and utilizes cloud computing resources to provide a temporally sorted and differenced dataset, ready for geostatistical analysis. This hands-off approach allows the end user to focus on the science without having to manage thousands of files or hundreds of terabytes of data. At the same time, TopoDiff provides a consistent and accurate workflow for image sorting, selection, and co-registration, enabling cross-comparisons between research projects.
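The temporal sorting and pairing step of such a workflow can be sketched in a few lines; TopoDiff itself is not reproduced here, so the record layout below (acquisition date plus file path) is an assumption made for illustration:

```python
from datetime import date

def pair_sequential(dems):
    """Given DEM records as (acquisition_date, path) tuples in any order,
    return the chronologically ordered (earlier, later) pairs that a
    differencing step would subtract (later minus earlier)."""
    ordered = sorted(dems, key=lambda rec: rec[0])
    return list(zip(ordered[:-1], ordered[1:]))
```

Each emitted pair would then be co-registered and differenced on cloud workers, keeping the per-scientist bookkeeping to a single list of pairs.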
Inflationary tensor perturbations after BICEP2.
Caligiuri, Jerod; Kosowsky, Arthur
2014-05-16
The measurement of B-mode polarization of the cosmic microwave background at large angular scales by the BICEP experiment suggests a stochastic gravitational wave background from early-Universe inflation with a surprisingly large amplitude. The power spectrum of these tensor perturbations can be probed both with further measurements of the microwave background polarization at smaller scales and also directly via interferometry in space. We show that sufficiently sensitive high-resolution B-mode measurements will ultimately have the ability to test the inflationary consistency relation between the amplitude and spectrum of the tensor perturbations, confirming their inflationary origin. Additionally, a precise B-mode measurement of the tensor spectrum will predict the tensor amplitude on solar system scales to 20% accuracy for an exact power-law tensor spectrum, so a direct detection will then measure the running of the tensor spectral index to high precision.
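For a single-field slow-roll model, the inflationary consistency relation referenced above ties the tensor spectral index to the tensor-to-scalar ratio; in standard notation (with $k_*$ the pivot scale), the power-law tensor spectrum and the relation to be tested read:

```latex
\mathcal{P}_t(k) = A_t \left(\frac{k}{k_*}\right)^{n_t},
\qquad n_t = -\frac{r}{8}.
```

Extrapolating this power law from CMB scales to direct-detection (interferometer) scales spans many orders of magnitude in $k$, which is why a joint B-mode and direct measurement constrains any running of $n_t$ so tightly.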
Terrestrial photography as a complementary measurement in weather stations for snow monitoring
NASA Astrophysics Data System (ADS)
Pimentel, Rafael; José Pérez-Palazón, María; Herrero, Javier; José Polo, María
2015-04-01
Snow monitoring is key to understanding snow behaviour and evolution, which have particular features in semiarid regions (i.e., strong spatiotemporal variability and the occurrence of several accumulation-melting cycles throughout the year). On the one hand, traditional snow observations, such as snow surveys and snow pillows, suffer from limited accessibility during the snow season and cannot cover a vast extension. On the other hand, satellite remote sensing techniques, largely employed in medium- to large-scale regional studies, have the disadvantage of fixed spatial and temporal resolutions, which in some cases cannot reproduce snow processes at small scales. An economical alternative is terrestrial photography, whose scales can be adapted to the study problem. At the microscale it permits continuous monitoring of snow, adapting the resolution of the observation to the scales of the processes. Besides its use as a raw observation dataset to calibrate and validate model results, terrestrial photography constitutes valuable information to complement weather station observations. It allows the identification of possible mistakes in meteorological observations (e.g., overestimation of rain measurements) and a better understanding of snow behaviour under certain weather agents (e.g., blowing snow). Thus, terrestrial photography is a feasible and convenient technique to include in weather monitoring stations in mountainous areas of semiarid regions.
Introducing CGOLS: The Cholla Galactic Outflow Simulation Suite
NASA Astrophysics Data System (ADS)
Schneider, Evan E.; Robertson, Brant E.
2018-06-01
We present the Cholla Galactic OutfLow Simulations (CGOLS) suite, a set of extremely high resolution global simulations of isolated disk galaxies designed to clarify the nature of multiphase structure in galactic winds. Using the GPU-based code Cholla, we achieve unprecedented resolution in these simulations, modeling galaxies over a 20 kpc region at a constant resolution of 5 pc. The simulations include a feedback model designed to test the effects of different mass- and energy-loading factors on galactic outflows over kiloparsec scales. In addition to describing the simulation methodology in detail, we also present the results from an adiabatic simulation that tests the frequently adopted analytic galactic wind model of Chevalier & Clegg. Our results indicate that the Chevalier & Clegg model is a good fit to nuclear starburst winds in the nonradiative region of parameter space. Finally, we investigate the role of resolution and convergence in large-scale simulations of multiphase galactic winds. While our largest-scale simulations show convergence of observable features like soft X-ray emission, our tests demonstrate that simulations of this kind with resolutions greater than 10 pc are not yet converged, confirming the need for extreme resolution in order to study the structure of winds and their effects on the circumgalactic medium.
Recent Developments in Transition-Edge Strip Detectors for Solar X-Rays
NASA Technical Reports Server (NTRS)
Rausch, Adam J.; Deiker, Steven W.; Hilton, Gene; Irwin, Kent D.; Martinez-Galarce, Dennis S.; Shing, Lawrence; Stern, Robert A.; Ullom, Joel N.; Vale, Leila R.
2008-01-01
LMSAL and NIST are developing position-sensitive X-ray strip detectors based on Transition Edge Sensor (TES) microcalorimeters optimized for solar physics. By combining high spectral (E/ΔE ≈ 1600) and temporal (single-photon Δt ≈ 10 μs) resolutions with imaging capabilities, these devices will be able to study high-temperature (>10 MK) X-ray lines as never before. Diagnostics from these lines should provide significant new insight into the physics of both microflares and the early stages of flares. Previously, the large size of traditional TESs, along with the heat loads associated with wiring large arrays, presented obstacles to using these cryogenic detectors for solar missions. Implementing strip-detector technology at small scales, however, addresses both issues: here, a line of substantially smaller effective pixels requires only two TESs, decreasing both the total array size and the wiring requirements for the same spatial resolution. Early results show energy resolutions of ΔE(FWHM) ≈ 30 eV and spatial resolutions of approximately 10-15 μm, suggesting the strip-detector concept is viable.
A digital gigapixel large-format tile-scan camera.
Ben-Ezra, M
2011-01-01
Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications in cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and thus can provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
NASA Astrophysics Data System (ADS)
Duperret, Anne; Raimbault, Céline; Duguet, Timothée; Le Gall, Bernard; Costa, Stéphane; Vandycke, Sara
2017-04-01
During the EC2CO/DRIL/CROCODYL project, high-resolution land-sea DEMs were produced for the NW Normandy and SW Brittany rocky coastal zones, using high-resolution bathymetry from the shallow-water cruises CROCOLIT-1, -2, -3 (Duperret, 2013), SPLASHALIOT-3 (Maillet, 2014), and THAPENFROM-1 (Duperret, 2015), together with aerial topographic LiDAR data from the Litto3D project. Two study sites were selected to map the detailed geomorphology of shore platforms in order to better understand rock coast evolution processes through time and long-term rates of rocky coastal erosion versus geological context. The eastern English Channel is bordered by coastal chalk cliffs that are currently eroding at fast mean rates of the order of a few decimeters per year. On the Normandy coast (NW France), this results in the generation of roughly linear coastal segments, each about 20-30 km long. On coastal segments made solely of Upper Cretaceous Chalk, erosion occurs by present-day sudden and repeated vertical failures and cliff collapses. The cliff collapse process shapes vertical chalk cliffs in association with roughly flat shore platforms. Even though shore platform widths are short and homogeneous (a few hundred meters), the detailed morphology observed in the high-resolution bathymetry evidences two main submarine geomorphological types. One is linear and regular and associated with linear coastal sections; this corresponds to a homogeneous Chalk Formation and a lack of large-scale tectonic features. Coastal sections with chalk lithology variations, local folding, large-scale fractures oriented transverse to the coastline, and onshore valley incision show chaotic shore platform morphologies. These lead to variations in coastline orientation and to meter-scale shoreline indentations. The southwestern part of Brittany consists of low-lying granitic headlands and indented bays cut into meta/granitic rocks. Erosion rates there are poorly known, due to slow coastal evolution through contemporary times. 
The land-sea DEMs show similar onshore and offshore morphologies, with flat, wide superposed plains, each bounded by 10-m-high scarps. In this case, shore platform extension reaches a few kilometers in width and appears as superposed paleo-shore platforms generated since the Pleistocene (Raimbault et al., in press). The erosive process is thus linked to a long-term alteration of granitic rocks since the Cenozoic, mainly cleared and etched during recent past sea-level highstands. Coastal areas with large bays appear locally to be guided by large-scale Cenozoic fractures. In some places, km-scale fractures favor a spatial concentration of erosion; they shape coastline orientation and shore platform terminations at the kilometer scale.
High Resolution Imaging of the Sun with CORONAS-1
NASA Technical Reports Server (NTRS)
Karovska, Margarita
1998-01-01
We applied several image restoration and enhancement techniques to CORONAS-I images. We carried out the characterization of the Point Spread Function (PSF) using the unique capability of the Blind Iterative Deconvolution (BID) technique, which recovers the real PSF at a given location and time of observation when limited a priori information is available on its characteristics. We also applied image enhancement techniques to extract the small-scale structure embedded in bright large-scale structures on the disk and on the limb. The results demonstrate the capability of image post-processing to substantially increase the yield from space observations by improving the resolution and reducing noise in the images.
NASA Astrophysics Data System (ADS)
Vrieling, Anton; Skidmore, Andrew K.; Wang, Tiejun; Meroni, Michele; Ens, Bruno J.; Oosterbeek, Kees; O'Connor, Brian; Darvishzadeh, Roshanak; Heurich, Marco; Shepherd, Anita; Paganini, Marc
2017-07-01
Vegetation indices derived from satellite image time series have been extensively used to estimate the timing of phenological events like season onset. Medium spatial resolution (≥250 m) satellite sensors with daily revisit capability are typically employed for this purpose. In recent years, phenology has increasingly been retrieved at higher resolution (≤30 m) in response to the growing availability of high-resolution satellite data. To overcome the reduced acquisition frequency of such data, previous attempts involved fusion between high- and medium-resolution data, or combinations of multi-year acquisitions in a single phenological reconstruction. The objectives of this study are to demonstrate that phenological parameters can now be retrieved from single-season high-resolution time series, and to compare these retrievals against those derived from multi-year high-resolution and single-season medium-resolution satellite data. The study focuses on the island of Schiermonnikoog, the Netherlands, which comprises a highly dynamic saltmarsh, dune vegetation, and agricultural land. Combining NDVI series derived from atmospherically corrected images from RapidEye (5 m-resolution) and the SPOT5 Take5 experiment (10 m-resolution) acquired between March and August 2015, phenological parameters were estimated using a function-fitting approach. We then compared results with phenology retrieved from four years of 30 m Landsat 8 OLI data, and single-year 100 m Proba-V and 250 m MODIS temporal composites of the same period. Retrieved phenological parameters from combined RapidEye/SPOT5 displayed spatially consistent results with large spatial variability, providing complementary information to existing vegetation community maps.
Retrievals that combined four years of Landsat observations into a single synthetic year were affected by the inclusion of years with warmer spring temperatures, whereas adjusting the average phenology to the 2015 observations was feasible for only a few pixels because of cloud cover around the phenological transition dates. The Proba-V and MODIS phenology retrievals scaled poorly relative to their high-resolution equivalents, indicating that medium-resolution phenology retrievals need to be interpreted with care, particularly in landscapes with fine-scale land cover variability.
NASA Astrophysics Data System (ADS)
Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration
2014-11-01
We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow at very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely big so that, even with the large separation of scales, the smallest scales remain well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8 m-diameter test section) was chosen as the practical limit of the largest scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly decaying homogeneous isotropic turbulence in a large region of the test section with minimal structural risk. An international team of researchers collected hot-wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold-wire thermometry, and high-speed particle tracking data for this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next two years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is an international team of scientists formed to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.
NASA Astrophysics Data System (ADS)
Ukhorskiy, A. Y.; Sorathia, K.; Merkin, V. G.; Sitnov, M. I.; Mitchell, D. G.; Wiltberger, M. J.; Lyon, J.
2017-12-01
Much of the plasma heating and transport from the magnetotail into the inner magnetosphere occurs in the form of mesoscale discrete injections associated with sharp dipolarizations of the magnetic field (dipolarization fronts). In this study we investigate the mechanisms of ion acceleration at dipolarization fronts in a high-resolution global magnetospheric MHD model (LFM). We use large-scale three-dimensional test-particle simulations (CHIMP) to address the following science questions: 1) What are the characteristic scales of dipolarization regions that can stably trap ions? 2) What role does trapping play in ion transport and acceleration? 3) How does it depend on particle energy and distance from Earth? 4) To what extent is ion acceleration adiabatic? The high-resolution LFM was run using idealized solar wind conditions with fixed nominal values of density and velocity and a southward IMF component of -5 nT. To simulate ion interaction with dipolarization fronts, a large ensemble of test particles distributed in energy, pitch angle, and gyrophase was initialized inside one of the LFM dipolarization channels in the magnetotail. Full Lorentz ion trajectories were then computed over the course of the front's inward propagation from a distance of 17 to 6 Earth radii. A large fraction of ions with different initial energies stayed in phase with the front over the entire distance. The effect of magnetic trapping at different energies was elucidated by correlating the ion guiding-center and ExB drift velocities. The role of trapping in ion energization was quantified by comparing the partial pressure of ions that exhibit trapping to the total ion pressure.
Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems
NASA Astrophysics Data System (ADS)
Demir, I.; Sermet, M. Y.; Sit, M. A.
2016-12-01
Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large-scale spatial data from various gauges and sensor networks. This growing collection of environmental data has increased demand for applications capable of managing and processing large-scale, high-resolution data sets. With the amount and resolution of data sets now available, one of the challenging tasks in organizing and customizing hydrological data sets is on-demand delineation of watersheds. Watershed delineation is the process of creating a boundary that represents the contributing area for a specific control point or water outlet, with the intent of characterizing and analyzing portions of a study area. Although many GIS tools and software packages for watershed analysis are available on desktop systems, web-based and client-side techniques are needed to create a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrate several watershed delineation techniques on the web, implemented on the client side using JavaScript and WebGL and on the server side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation, which allows parallelization on the GPU. Web-based real-time watershed analysis can be helpful for decision-makers and interested stakeholders while eliminating the need to install complex software packages and handle large-scale data sets. Utilizing client-side hardware resources also reduces the need for servers, owing to the approach's crowdsourced nature. Our goal for future work is to improve other hydrologic analysis methods, such as rain flow tracking, by adapting the presented approaches.
Dust Polarization toward Embedded Protostars in Ophiuchus with ALMA. I. VLA 1623
NASA Astrophysics Data System (ADS)
Sadavoy, Sarah I.; Myers, Philip C.; Stephens, Ian W.; Tobin, John; Commerçon, Benoît; Henning, Thomas; Looney, Leslie; Kwon, Woojin; Segura-Cox, Dominique; Harris, Robert
2018-06-01
We present high-resolution (∼30 au) ALMA Band 6 dust polarization observations of VLA 1623. The VLA 1623 data resolve compact ∼40 au inner disks around the two protobinary sources, VLA 1623-A and VLA 1623-B, and also an extended ∼180 au ring of dust around VLA 1623-A. This dust ring was previously identified as a large disk in lower-resolution observations. We detect highly structured dust polarization toward the inner disks and the extended ring with typical polarization fractions ≈1.7% and ≈2.4%, respectively. The two components also show distinct polarization morphologies. The inner disks have uniform polarization angles aligned with their minor axes. This morphology is consistent with expectations from dust scattering. By contrast, the extended dust ring has an azimuthal polarization morphology not previously seen in lower-resolution observations. We find that our observations are well-fit by a static, oblate spheroid model with a flux-frozen, poloidal magnetic field. We propose that the polarization traces magnetic grain alignment likely from flux freezing on large scales and magnetic diffusion on small scales. Alternatively, the azimuthal polarization may be attributed to grain alignment by the anisotropic radiation field. If the grains are radiatively aligned, then our observations indicate that large (∼100 μm) dust grains grow quickly at large angular extents. Finally, we identify significant proper motion of VLA 1623 using our observations and those in the literature. This result indicates that the proper motion of nearby systems must be corrected for when combining ALMA data from different epochs.
Multiplexed, High Density Electrophysiology with Nanofabricated Neural Probes
Du, Jiangang; Blanche, Timothy J.; Harrison, Reid R.; Lester, Henry A.; Masmanidis, Sotiris C.
2011-01-01
Extracellular electrode arrays can reveal the neuronal network correlates of behavior with single-cell, single-spike, and sub-millisecond resolution. However, implantable electrodes are inherently invasive, and efforts to scale up the number and density of recording sites must compromise on device size in order to connect the electrodes. Here, we report on silicon-based neural probes employing nanofabricated, high-density electrical leads. Furthermore, we address the challenge of reading out multichannel data with an application-specific integrated circuit (ASIC) performing signal amplification, band-pass filtering, and multiplexing functions. We demonstrate high spatial resolution extracellular measurements with a fully integrated, low noise 64-channel system weighing just 330 mg. The on-chip multiplexers make possible recordings with substantially fewer external wires than the number of input channels. By combining nanofabricated probes with ASICs we have implemented a system for performing large-scale, high-density electrophysiology in small, freely behaving animals that is both minimally invasive and highly scalable. PMID:22022568
NASA Technical Reports Server (NTRS)
Ambrosia, Vincent G.; Myers, Jeffrey S.; Ekstrand, Robert E.; Fitzgerald, Michael T.
1991-01-01
A simple method for enhancing the spatial and spectral resolution of disparate data sets is presented. Two data sets, digitized aerial photography at a nominal spatial resolution of 3.7 meters and TMS digital data at 24.6 meters, were coregistered through a bilinear interpolation to solve the problem of blocky pixel groups resulting from rectification expansion. The two data sets were then subjected to intensity-saturation-hue (ISH) transformations in order to 'blend' the high-spatial-resolution (3.7 m) digitized RC-10 photography with the high-spectral (12-band), lower-spatial-resolution (24.6 m) TMS digital data. The resulting merged products make it possible to perform large-scale mapping, ease photointerpretation, and can be derived for any of the 12 available TMS spectral bands.
Nishibayashi, Yoshiaki; Yamauchi, Akiyoshi; Onodera, Gen; Uemura, Sakae
2003-07-25
Oxidative kinetic resolution of racemic secondary alcohols, using acetone as a hydrogen acceptor in the presence of a catalytic amount of [RuCl(2)(PPh(3))(ferrocenyloxazolinylphosphine)] (2), proceeds effectively to recover the corresponding alcohols in high yields with excellent enantioselectivity. When 1-indanol is employed as the racemic alcohol, the oxidation proceeds quite smoothly even with 0.0025 mol % of catalyst 2, giving optically active 1-indanol in good yield with high enantioselectivity (up to 94% ee) and a turnover frequency (TOF) exceeding 80,000 h(-1). From a practical viewpoint, the kinetic resolution was investigated on a large scale: optically pure (S)-1-indanol (75 g, 56% yield, >99% ee) was obtained from racemic 1-indanol (134 g) by applying this kinetic resolution method twice.
Double inflation - A possible resolution of the large-scale structure problem
NASA Technical Reports Server (NTRS)
Turner, Michael S.; Villumsen, Jens V.; Vittorio, Nicola; Silk, Joseph; Juszkiewicz, Roman
1987-01-01
A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Omega = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of about 100 Mpc, while the small-scale structure over less than about 10 Mpc resembles that in a low-density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations.
Using pan-sharpened high resolution satellite data to improve impervious surfaces estimation
NASA Astrophysics Data System (ADS)
Xu, Ru; Zhang, Hongsheng; Wang, Ting; Lin, Hui
2017-05-01
Impervious surface is an important environmental and socio-economic indicator for numerous urban studies. While a large number of studies have been conducted to estimate the area and distribution of impervious surface from satellite data, the accuracy of impervious surface estimation (ISE) remains insufficient due to the high diversity of urban land cover types. This study evaluated the use of panchromatic (PAN) data from very high resolution satellite imagery for improving the accuracy of ISE with various pan-sharpening approaches, together with a comprehensive analysis of the scale effects involved. Three benchmark pan-sharpening approaches, Gram-Schmidt (GS), PANSHARP, and principal component analysis (PCA), were applied to WorldView-2 imagery of three sites in Hong Kong. On-screen digitization was carried out based on Google Maps, and the results were treated as the reference impervious surfaces. The reference impervious surfaces and the ISE results were then re-scaled to various spatial resolutions to obtain the percentage of impervious surface. The correlation coefficient (CC) and root mean square error (RMSE) were adopted as quantitative accuracy indicators. Accuracy differences among the three study areas were further explained by the average local variance (ALV), used for landscape pattern analysis. The experimental results suggest that 1) the three study regions have distinct landscape patterns; 2) ISE accuracy from pan-sharpened data was better than that from the original multispectral (MS) data; and 3) this improvement exhibits noticeable scale effects across resolutions, diminishing slightly as the resolution becomes coarser.
NASA Astrophysics Data System (ADS)
Calderer, Antoni; Guo, Xin; Shen, Lian; Sotiropoulos, Fotis
2018-02-01
We develop a numerical method for simulating coupled interactions of complex floating structures with large-scale ocean waves and atmospheric turbulence. We employ an efficient large-scale model to develop offshore wind and wave environmental conditions, which are then incorporated into a high-resolution two-phase flow solver with fluid-structure interaction (FSI). The large-scale wind-wave interaction model is based on a two-fluid, dynamically coupled approach that employs a high-order spectral method for simulating the water motion and a viscous solver with undulatory boundaries for the air motion. The two-phase flow FSI solver is based on the level set method and is capable of simulating the coupled dynamic interaction of arbitrarily complex bodies with airflow and waves. The large-scale wave field solver is coupled with the near-field FSI solver through a one-way coupling approach, feeding waves into the latter via a pressure-forcing method combined with the level set method. We validate the model for both simple wave trains and three-dimensional directional waves and compare the results with experimental and theoretical solutions. Finally, we demonstrate the capabilities of the new computational framework by carrying out large-eddy simulation of a floating offshore wind turbine interacting with realistic ocean wind and waves.
Non-Gaussian Multi-resolution Modeling of Magnetosphere-Ionosphere Coupling Processes
NASA Astrophysics Data System (ADS)
Fan, M.; Paul, D.; Lee, T. C. M.; Matsuo, T.
2016-12-01
The most dynamic coupling between the magnetosphere and ionosphere occurs in the Earth's polar atmosphere. Our objective is to model scale-dependent stochastic characteristics of high-latitude ionospheric electric fields that originate from solar wind magnetosphere-ionosphere interactions. The Earth's high-latitude ionospheric electric field exhibits considerable variability, with increasing non-Gaussian characteristics at decreasing spatio-temporal scales. Accurately representing the underlying stochastic physical process through random field modeling is crucial not only for scientific understanding of the energy, momentum, and mass exchanges between the Earth's magnetosphere and ionosphere, but also for modern technological systems including telecommunication, navigation, positioning, and satellite tracking. While considerable effort has been made to characterize the large-scale variability of the electric field in the context of Gaussian processes, no attempt has been made so far to model the small-scale non-Gaussian stochastic process observed in the high-latitude ionosphere. We construct a novel random field model using spherical needlets as building blocks. The double localization of spherical needlets in both the spatial and frequency domains enables the model to capture the non-Gaussian and multi-resolution characteristics of the small-scale variability. The estimation procedure is computationally feasible owing to the use of an adaptive Gibbs sampler. We apply the proposed methodology to the computational simulation output from the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamics (MHD) magnetosphere model. Our non-Gaussian multi-resolution model characterizes significantly more energy associated with the small-scale ionospheric electric field variability than Gaussian models do.
By accurately representing unaccounted-for additional energy and momentum sources to the Earth's upper atmosphere, our novel random field modeling approach will provide a viable remedy to the current numerical models' systematic biases resulting from the underestimation of high-latitude energy and momentum sources.
SLIDE - a web-based tool for interactive visualization of large-scale -omics data.
Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon
2018-06-28
Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow real-time customization of graphics through user interaction. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions on a single screen. SLIDE is publicly available under the BSD license, both as an online version and as a stand-alone version, at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.
Lei, Yu; Lin, Guan-yu
2013-01-01
A tandem-grating, double-dispersion mount makes it possible to design an imaging spectrometer for weak-light observation with high spatial resolution, high spectral resolution, and high optical transmission efficiency. The traditional tandem Wadsworth mounting was originally designed to match a coaxial telescope and a large-scale imaging spectrometer. When used with an off-axis telescope, such as an off-axis parabolic mirror, it yields lower imaging quality than with a coaxial telescope. It may also introduce interference among the detector and the optical elements when applied to a short-focal-length, small-scale spectrometer confined to the close volume of a satellite. An advanced tandem Wadsworth mounting has been investigated to deal with this situation. The astigmatism-corrected Wadsworth mounting condition, expressed as the distance between the second concave grating and the imaging plane, is calculated. Then the optimum arrangement of the first plane grating and the second concave grating, which makes the preceding Wadsworth condition hold for each wavelength, is analyzed through geometric and first-order differential calculation. These two arrangements comprise the advanced Wadsworth mounting condition. The spectral resolution has also been calculated from these conditions. A design example based on the optimum theory proves that the advanced tandem Wadsworth mounting performs excellently over a broad spectral band.
NASA Astrophysics Data System (ADS)
Stössel, Achim; von Storch, Jin-Song; Notz, Dirk; Haak, Helmuth; Gerdes, Rüdiger
2018-03-01
This study examines the high-frequency temporal variability (HFV) and meso-scale spatial variability (MSV) of winter sea-ice drift in the Southern Ocean simulated with a global high-resolution (0.1°) sea ice-ocean model. Hourly model output is used to distinguish MSV characteristics via patterns of mean kinetic energy (MKE) and turbulent kinetic energy (TKE) of ice drift, surface currents, and wind stress, and HFV characteristics via time series of raw variables and correlations. We find that (1) along the ice edge, the MSV of ice drift coincides with that of surface currents, in particular that due to ocean eddies; (2) along the coast, the MKE of ice drift is substantially larger than its TKE and coincides with the MKE of wind stress; (3) in the interior of the ice pack, the TKE of ice drift is larger than its MKE, mostly following the TKE pattern of wind stress; (4) the HFV of ice drift is dominated by weather events and, in the absence of tidal currents, locally and to a much smaller degree by inertial oscillations; (5) along the ice edge, the curl of the ice drift is highly correlated with that of the surface currents, mostly reflecting the impact of ocean eddies. Where ocean eddies occur and the ice is relatively thin, ice velocity is characterized by enhanced relative vorticity, largely matching that of the surface currents. Along the ice edge, ocean eddies produce distinct ice filaments, the realism of which is largely confirmed by high-resolution satellite passive-microwave data.
Catastrophic flooding origin of shelf valley systems in the English Channel.
Gupta, Sanjeev; Collier, Jenny S; Palmer-Felgate, Andy; Potter, Graeme
2007-07-19
Megaflood events involving sudden discharges of exceptionally large volumes of water are rare, but can significantly affect landscape evolution, continental-scale drainage patterns and climate change. It has been proposed that a significant flood event eroded a network of large ancient valleys on the floor of the English Channel-the narrow seaway between England and France. This hypothesis has remained untested through lack of direct evidence, and alternative non-catastrophist ideas have been entertained for valley formation. Here we analyse a new regional bathymetric map of part of the English Channel derived from high-resolution sonar data, which shows the morphology of the valley in unprecedented detail. We observe a large bedrock-floored valley that contains a distinct assemblage of landforms, including streamlined islands and longitudinal erosional grooves, which are indicative of large-scale subaerial erosion by high-magnitude water discharges. Our observations support the megaflood model, in which breaching of a rock dam at the Dover Strait instigated catastrophic drainage of a large pro-glacial lake in the southern North Sea basin. We suggest that megaflooding provides an explanation for the permanent isolation of Britain from mainland Europe during interglacial high-sea-level stands, and consequently for patterns of early human colonisation of Britain together with the large-scale reorganization of palaeodrainage in northwest Europe.
The High-Resolution Wave-Propagation Method Applied to Meso- and Micro-Scale Flows
NASA Technical Reports Server (NTRS)
Ahmad, Nashat N.; Proctor, Fred H.
2012-01-01
The high-resolution wave-propagation method for computing nonhydrostatic atmospheric flows on the meso- and micro-scales is described. The design and implementation of the Riemann solver used for computing the Godunov fluxes is discussed in detail. The method uses a flux-based wave decomposition in which the flux differences are written directly as a linear combination of the right eigenvectors of the hyperbolic system. The two advantages of the technique are: 1) the need for an explicit definition of the Roe matrix is eliminated, and 2) the inclusion of the source term due to gravity does not result in discretization errors. The resulting flow solver is conservative and able to resolve regions of large gradients without introducing dispersion errors. The methodology is validated against exact analytical solutions and benchmark cases for non-hydrostatic atmospheric flows.
Demodulation algorithm for optical fiber F-P sensor.
Yang, Huadong; Tong, Xinglin; Cui, Zhang; Deng, Chengwei; Guo, Qian; Hu, Pan
2017-09-10
The demodulation algorithm is very important for improving the measurement accuracy of a sensing system. In this paper, a variable-step-size hill-climbing search method is applied for the first time to optical fiber Fabry-Perot (F-P) sensor demodulation. Compared with the traditional discrete gap transformation demodulation algorithm, the computation is greatly reduced by changing the step size of each climb, achieving nano-scale resolution, high measurement accuracy, high demodulation rates, and a large dynamic demodulation range. An optical fiber F-P pressure sensor based on a micro-electro-mechanical system (MEMS) was fabricated to carry out the experiment. The results show that the resolution of the algorithm reaches the nanometer scale, the sensor's sensitivity is about 2.5 nm/kPa, close to the theoretical value, and the sensor shows good reproducibility.
The impact of mesoscale convective systems on global precipitation: A modeling study
NASA Astrophysics Data System (ADS)
Tao, Wei-Kuo
2017-04-01
The importance of precipitating mesoscale convective systems (MCSs) has been quantified from TRMM precipitation radar and microwave imager retrievals. MCSs generate more than 50% of the rainfall in most tropical regions. Typical MCSs have horizontal scales of a few hundred kilometers (km); therefore, a large domain and high resolution are required for realistic simulations of MCSs in cloud-resolving models (CRMs). Almost all traditional global and climate models do not have adequate parameterizations to represent MCSs. Typical multi-scale modeling frameworks (MMFs) with 32 CRM grid points and 4 km grid spacing also might not have sufficient resolution and domain size for realistically simulating MCSs. In this study, the impact of MCSs on precipitation processes is examined by conducting numerical model simulations using the Goddard Cumulus Ensemble model (GCE) and Goddard MMF (GMMF). The results indicate that both models can realistically simulate MCSs with more grid points (i.e., 128 and 256) and higher resolution (1 or 2 km) compared to simulations with fewer grid points (i.e., 32 and 64) and lower resolution (4 km). The modeling results also show that the strengths of the Hadley circulations, mean zonal and regional vertical velocities, surface evaporation, and amount of surface rainfall are either weaker or reduced in the GMMF when using more CRM grid points and higher CRM resolution. In addition, the results indicate that large-scale surface evaporation and wind feedback are key processes in determining the surface rainfall amount in the GMMF. A sensitivity test with reduced sea surface temperatures (SSTs) was conducted and resulted in both reduced surface rainfall and evaporation.
Potential for geophysical experiments in large scale tests.
Dieterich, J.H.
1981-01-01
Potential research applications for large-specimen geophysical experiments include measurements of the scale dependence of physical parameters and examination of interactions with heterogeneities, especially flaws such as cracks. In addition, increased specimen size provides opportunities for improved recording resolution and greater control of experimental variables. Large-scale experiments using a special purpose low stress (100 MPa) …
NASA Astrophysics Data System (ADS)
Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.
2015-05-01
A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe groundwater quantity, and in particular quality, has come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economies of the eastern European bloc of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of the water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at the subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically based, data-driven simulation. In this article we discuss issues with data availability and calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and its results provide information in support of the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
Calculations of High-Temperature Jet Flow Using Hybrid Reynolds-Averaged Navier-Stokes Formulations
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.; Elmiligui, Alaa; Girimaji, Sharath S.
2008-01-01
Two multiscale-type turbulence models are implemented in the PAB3D solver. The models are based on modifying the Reynolds-averaged Navier Stokes equations. The first scheme is a hybrid Reynolds-averaged Navier Stokes/large-eddy-simulation model using the two-equation k-epsilon model with a Reynolds-averaged Navier Stokes/large-eddy-simulation transition function dependent on grid spacing and the computed turbulence length scale. The second scheme is a modified version of the partially averaged Navier Stokes model in which the unresolved kinetic energy parameter f(sub k) is allowed to vary as a function of grid spacing and the turbulence length scale. This parameter is estimated based on a novel two-stage procedure to efficiently estimate the level of scale resolution possible for a given flow on a given grid for partially averaged Navier Stokes. It has been found that the prescribed scale resolution can play a major role in obtaining accurate flow solutions. The parameter f(sub k) varies between zero and one and is equal to one in the viscous sublayer and when the Reynolds-averaged Navier Stokes turbulent viscosity becomes smaller than the large-eddy-simulation viscosity. The formulation, usage methodology, and validation examples are presented to demonstrate the enhancement of PAB3D's time-accurate turbulence modeling capabilities. The accurate simulations of flow and turbulent quantities will provide a valuable tool for accurate jet noise predictions. Solutions from these models are compared with Reynolds-averaged Navier Stokes results and experimental data for high-temperature jet flows. The current results show promise for the capability of hybrid Reynolds-averaged Navier Stokes/large-eddy simulation and partially averaged Navier Stokes in simulating such flow phenomena.
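The grid- and length-scale-dependent resolution parameter f(sub k) described in this abstract can be sketched numerically. The functional form used below, f_k = min(1, c·(Δ/Λ)^(2/3)) with turbulence length scale Λ = k^(3/2)/ε, is an assumption drawn from the general PANS literature, not necessarily the exact PAB3D implementation:

```python
import numpy as np

def pans_fk(k, eps, delta, c=3.0):
    """Unresolved-kinetic-energy parameter f_k, clipped to [0, 1].

    The form c * (delta / Lambda)**(2/3), with Lambda = k**1.5 / eps,
    is an assumption based on the PANS literature; the exact PAB3D
    formulation may differ."""
    lam = k**1.5 / eps                       # turbulence length scale
    fk = c * (delta / lam) ** (2.0 / 3.0)
    return float(np.clip(fk, 0.0, 1.0))

# Coarse grid relative to the turbulence scale: fk -> 1 (RANS-like limit).
print(pans_fk(k=1.0, eps=1.0, delta=10.0))   # 1.0
# Fine grid: fk drops toward 0, so more scales are resolved (LES-like limit).
print(pans_fk(k=1.0, eps=1.0, delta=0.001))
```

The clipping reproduces the stated behavior that f(sub k) varies between zero and one, approaching the RANS limit on grids too coarse to resolve the local turbulence length scale.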
NASA Astrophysics Data System (ADS)
Freychet, N.; Duchez, A.; Wu, C.-H.; Chen, C.-A.; Hsu, H.-H.; Hirschi, J.; Forryan, A.; Sinha, B.; New, A. L.; Graham, T.; Andrews, M. B.; Tu, C.-Y.; Lin, S.-J.
2017-02-01
This work investigates the variability of extreme weather events (drought spells, DS15, and daily heavy rainfall, PR99) over East Asia. It particularly focuses on the large-scale atmospheric circulation associated with a high occurrence of these extreme events. Two observational datasets (APHRODITE and PERSIANN) are compared with two high-resolution global climate models (HiRAM and HadGEM3-GC2) and an ensemble of other lower-resolution climate models from CMIP5. We first evaluate the performance of the high-resolution models. They both exhibit good skill in reproducing extreme events, especially when compared with CMIP5 results. Significant differences exist between the two observational datasets, highlighting the difficulty of obtaining a clear estimate of extreme events. The link between the variability of the extremes and the large-scale circulation is investigated, on monthly and interannual timescales, using composite and correlation analyses. Both extreme indices, DS15 and PR99, are significantly linked to the low-level wind intensity over East Asia, i.e. the monsoon circulation. It is also found that DS15 events are strongly linked to the surface temperature over the Siberian region and to the land-sea pressure contrast, while PR99 events are linked to sea surface temperature anomalies over the West North Pacific. These results illustrate the importance of the monsoon circulation for extremes over East Asia. The dependence on surface temperature over the continent and on sea surface temperature raises the question of the extent to which they could affect the occurrence of extremes over tropical regions in future projections.
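The composite analysis named in this abstract can be sketched in a few lines: group months into high and low terciles of an extreme-event index and difference the mean circulation field between the groups. The synthetic data and variable names below are purely illustrative, not the study's datasets:

```python
import numpy as np

def composite_difference(index, field):
    """Composite analysis sketch: mean field in high-index months minus
    mean field in low-index months, using terciles of the index.
    index: shape (n_months,); field: shape (n_months, ny, nx)."""
    lo, hi = np.percentile(index, [100.0 / 3.0, 200.0 / 3.0])
    return field[index >= hi].mean(axis=0) - field[index <= lo].mean(axis=0)

# Synthetic data with a known linkage between the index (e.g. a monthly
# drought-spell count) and a circulation field (e.g. low-level wind speed).
rng = np.random.default_rng(0)
idx = rng.normal(size=120)
wind = rng.normal(size=(120, 10, 12)) + 0.5 * idx[:, None, None]
comp = composite_difference(idx, wind)
print(comp.mean())
```

With the embedded linkage, the composite difference comes out clearly positive, which is the signature such analyses look for in the real fields.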
Large-scale imputation of epigenomic datasets for systematic annotation of diverse human tissues.
Ernst, Jason; Kellis, Manolis
2015-04-01
With hundreds of epigenomic maps, the opportunity arises to exploit the correlated nature of epigenetic signals, across both marks and samples, for large-scale prediction of additional datasets. Here, we undertake epigenome imputation by leveraging such correlations through an ensemble of regression trees. We impute 4,315 high-resolution signal maps, of which 26% are also experimentally observed. Imputed signal tracks show overall similarity to observed signals and surpass experimental datasets in consistency, recovery of gene annotations and enrichment for disease-associated variants. We use the imputed data to detect low-quality experimental datasets, to find genomic sites with unexpected epigenomic signals, to define high-priority marks for new experiments and to delineate chromatin states in 127 reference epigenomes spanning diverse tissues and cell types. Our imputed datasets provide the most comprehensive human regulatory region annotation to date, and our approach and the ChromImpute software constitute a useful complement to large-scale experimental mapping of epigenomic information.
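The ensemble-of-regression-trees imputation idea can be illustrated with a toy sketch: predict a target signal track from correlated marks using an ensemble of bootstrap-fitted regression stumps. This is a deliberately minimal stand-in for ChromImpute's much richer feature construction and tree ensembles; all data and function names are invented for illustration:

```python
import numpy as np

def fit_stump(X, y):
    """Depth-1 regression tree: pick the (feature, threshold) split that
    minimizes squared error; predict the mean of y on each side."""
    best = (np.inf, None)
    for j in range(X.shape[1]):
        for t in np.percentile(X[:, j], [25, 50, 75]):
            left = X[:, j] <= t
            if not left.any() or left.all():
                continue
            err = ((y[left] - y[left].mean()) ** 2).sum() + \
                  ((y[~left] - y[~left].mean()) ** 2).sum()
            if err < best[0]:
                best = (err, (j, t, y[left].mean(), y[~left].mean()))
    return best[1]

def predict_stump(stump, X):
    j, t, left_mean, right_mean = stump
    return np.where(X[:, j] <= t, left_mean, right_mean)

def impute(X, y, n_trees=50, rng=None):
    """Average of stumps fit on bootstrap resamples (a tiny tree ensemble)."""
    if rng is None:
        rng = np.random.default_rng(0)
    preds = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), len(y))
        preds.append(predict_stump(fit_stump(X[idx], y[idx]), X))
    return np.mean(preds, axis=0)

rng = np.random.default_rng(1)
other_marks = rng.normal(size=(500, 3))   # observed, correlated marks
target = other_marks[:, 0] * 2.0 + rng.normal(scale=0.3, size=500)
imputed = impute(other_marks, target)     # predicted on the same positions
corr = np.corrcoef(imputed, target)[0, 1]
print(round(corr, 2))
```

Even this crude ensemble recovers a strong correlation with the held-out signal, which is the property the abstract exploits at scale across marks and samples.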
Correction of eddy current distortions in high angular resolution diffusion imaging.
Zhuang, Jiancheng; Lu, Zhong-Lin; Vidal, Christine Bouteiller; Damasio, Hanna
2013-06-01
To correct distortions caused by eddy currents induced by large diffusion gradients during high angular resolution diffusion imaging without any auxiliary reference scans. Image distortion parameters were obtained by image coregistration, performed only between diffusion-weighted images with close diffusion gradient orientations. A linear model that describes distortion parameters (translation, scale, and shear) as a function of diffusion gradient directions was numerically computed to allow individualized distortion correction for every diffusion-weighted image. The assumptions of the algorithm were successfully verified in a series of experiments on phantom and human scans. Application of the proposed algorithm in high angular resolution diffusion images markedly reduced eddy current distortions when compared to results obtained with previously published methods. The method can correct eddy current artifacts in the high angular resolution diffusion images, and it avoids the problematic procedure of cross-correlating images with significantly different contrasts resulting from very different gradient orientations or strengths. Copyright © 2012 Wiley Periodicals, Inc.
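The linear model described here, distortion parameters (translation, scale, shear) as a function of the diffusion gradient direction, amounts to a least-squares fit. The sketch below synthesizes distortion parameters from a known linear model and recovers the coefficients; the coefficient values are invented for illustration:

```python
import numpy as np

# Each diffusion-weighted image i has a unit gradient direction g_i and
# measured distortion parameters p_i = (translation, scale, shear), here
# generated from a known linear model A plus measurement noise.
rng = np.random.default_rng(0)
g = rng.normal(size=(60, 3))
g /= np.linalg.norm(g, axis=1, keepdims=True)
true_A = np.array([[0.5, -0.2, 0.1],     # translation coefficients
                   [0.05, 0.3, -0.1],    # scale coefficients
                   [-0.15, 0.1, 0.2]])   # shear coefficients
params = g @ true_A.T + rng.normal(scale=0.01, size=(60, 3))

# Least-squares estimate of the linear model: params ~ g @ A.T
A_hat, *_ = np.linalg.lstsq(g, params, rcond=None)
A_hat = A_hat.T

# Individualized distortion prediction for any new gradient direction:
g_new = np.array([0.0, 0.0, 1.0])
print(A_hat @ g_new)   # approximately the third column of true_A
```

Once A is estimated from coregistration of images with nearby gradient orientations, every diffusion-weighted image gets its own predicted correction, which is the individualized-correction idea of the abstract.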
ERIC Educational Resources Information Center
Patel, Vimla L.; Branch, Timothy; Gutnik, Lily; Arocha, Jose F.
2006-01-01
High-risk behavior in youths related to HIV transmission continues to occur despite large-scale efforts to disseminate information about safe sexual practices through education. Our study examined the relationships among knowledge, decision-making strategies, and risk assessment about HIV by youths during peer group focused discussions. Two focus…
Selective logging in the Brazilian Amazon.
G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva
2005-01-01
Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...
Subsurface Monitoring of CO2 Sequestration - A Review and Look Forward
NASA Astrophysics Data System (ADS)
Daley, T. M.
2012-12-01
The injection of CO2 into subsurface formations is at least 50 years old with large-scale utilization of CO2 for enhanced oil recovery (CO2-EOR) beginning in the 1970s. Early monitoring efforts had limited measurements in available boreholes. With growing interest in CO2 sequestration beginning in the 1990's, along with growth in geophysical reservoir monitoring, small to mid-size sequestration monitoring projects began to appear. The overall goals of a subsurface monitoring plan are to provide measurement of CO2 induced changes in subsurface properties at a range of spatial and temporal scales. The range of spatial scales allows tracking of the location and saturation of the plume with varying detail, while finer temporal sampling (up to continuous) allows better understanding of dynamic processes (e.g. multi-phase flow) and constraining of reservoir models. Early monitoring of small scale pilots associated with CO2-EOR (e.g., the McElroy field and the Lost Hills field), developed many of the methodologies including tomographic imaging and multi-physics measurements. Large (reservoir) scale sequestration monitoring began with the Sleipner and Weyburn projects. Typically, large scale monitoring, such as 4D surface seismic, has limited temporal sampling due to costs. Smaller scale pilots can allow more frequent measurements as either individual time-lapse 'snapshots' or as continuous monitoring. Pilot monitoring examples include the Frio, Nagaoka and Otway pilots using repeated well logging, crosswell imaging, vertical seismic profiles and CASSM (continuous active-source seismic monitoring). For saline reservoir sequestration projects, there is typically integration of characterization and monitoring, since the sites are not pre-characterized resource developments (oil or gas), which reinforces the need for multi-scale measurements. As we move beyond pilot sites, we need to quantify CO2 plume and reservoir properties (e.g. 
pressure) over large scales, while still obtaining high resolution. Typically the high-resolution (spatial and temporal) tools are deployed in permanent or semi-permanent borehole installations, where special well design may be necessary, such as non-conductive casing for electrical surveys. Effective utilization of monitoring wells requires an approach of modular borehole monitoring (MBM) where multiple measurements can be made. An example is recent work at the Citronelle pilot injection site, where an MBM package with seismic, fluid sampling, and distributed fiber sensing was deployed. For future large-scale sequestration monitoring, an adaptive borehole-monitoring program is proposed.
NASA Astrophysics Data System (ADS)
Luo, D.; Cai, F.
2017-12-01
Small-scale, high-resolution marine multi-channel seismic surveys using large-energy sparkers are characterized by a high dominant source frequency, wide bandwidth, and high resolution. The technology, with its high resolution and high detection precision, was designed to improve the imaging quality of shallow sediments. In this study, a 20 kJ sparker and a 24-channel streamer cable with a 6.25 m group interval were used as the seismic source and receiver system, respectively. Key factors for seismic imaging of gas hydrate are enhancement of the S/N ratio, amplitude compensation, and detailed velocity analysis. However, the data in this study have the following characteristics: 1. Small maximum offsets, which are adverse to velocity analysis and multiple attenuation. 2. Lack of low-frequency information; that is, frequencies below 100 Hz are absent. 3. Low S/N ratio owing to the low fold (only 12). These characteristics make it difficult to reach the targets of seismic imaging. In this study, targeted processing methods are used to improve the seismic imaging quality of gas hydrate. First, several noise-suppression technologies are used in combination on the pre-stack seismic data to suppress noise and improve the S/N ratio; these include a spectrum-sharing noise elimination method, median filtering, and an exogenous-interference suppression method. Second, a combination of three technologies, SRME, τ-p deconvolution, and high-precision Radon transformation, is used to remove multiples. Third, an accurate velocity field is used in amplitude energy compensation to highlight the Bottom Simulating Reflector (BSR, the indicator of gas hydrates) and gas migration pathways (such as gas chimneys and hot spots). Fourth, fine velocity analysis is used to improve the accuracy of the velocity field. 
Fifth, pre-stack deconvolution is used to compensate for low-frequency energy and suppress the ghost, so that formation reflection characteristics are highlighted. The results show that small-scale, high-resolution marine sparker multi-channel seismic surveys are more effective than conventional seismic acquisition technology in improving the resolution and quality of gas hydrate imaging.
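One of the noise-suppression steps named in this abstract, median filtering, can be sketched on a synthetic trace: a sliding-window median removes spike noise while leaving the smooth waveform largely intact. The trace and parameters are invented for illustration, not the survey's actual processing code:

```python
import numpy as np

def median_filter_trace(trace, half_width=2):
    """Sliding-window median filter over a 1-D seismic trace.
    Spike-like outliers are rejected because a single outlier cannot
    shift the median of the window."""
    out = np.empty_like(trace)
    n = len(trace)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        out[i] = np.median(trace[lo:hi])
    return out

t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)   # synthetic decaying wavelet
noisy = clean.copy()
noisy[[40, 90, 150]] += 5.0                           # spike noise bursts
filtered = median_filter_trace(noisy)

# The spikes are removed while the waveform is preserved:
print(np.abs(noisy - clean).max(), np.abs(filtered - clean).max())
```

In a production flow this would be one step among the several combined above (spectrum-sharing elimination, exogenous-interference suppression, etc.), applied pre-stack across all traces.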
Meteorological impact assessment of possible large scale irrigation in Southwest Saudi Arabia
NASA Astrophysics Data System (ADS)
Ter Maat, H. W.; Hutjes, R. W. A.; Ohba, R.; Ueda, H.; Bisselink, B.; Bauer, T.
2006-11-01
On continental to regional scales, feedbacks between land-use and land-cover change and climate have been widely documented over the past 10-15 years. In the present study we explore the possibility that vegetation changes over much smaller areas may also affect local precipitation regimes. Large-scale (~10^5 ha) irrigated plantations in semi-arid environments may, under particular conditions, affect local circulations and induce additional rainfall. Capturing this rainfall 'surplus' could then reduce the need for external irrigation sources and eventually lead to self-sustained water cycling. This concept is studied in the coastal plains of South West Saudi Arabia, where the mountains of the Asir region exhibit the highest rainfall of the peninsula due to orographic lifting and condensation of moisture imported with the Indian Ocean monsoon and with disturbances from the Mediterranean Sea. We use a regional atmospheric modeling system (RAMS) forced by ECMWF analysis data to resolve the effect of complex surface conditions at high resolution (Δx = 4 km). After validation, these simulations are analysed with a focus on the role of local processes (sea breezes, orographic lifting and the formation of fog in the coastal mountains) in generating rainfall, and on how these will be affected by large-scale irrigated plantations in the coastal desert. The validation showed that the model simulates the regional and local weather reasonably well. The simulations exhibit a slightly larger diurnal temperature range than that captured by the observations, but seem to capture daily sea-breeze phenomena well. Monthly rainfall is well reproduced at coarse resolutions, but appears more localized at high resolutions. The hypothetical irrigated plantation (3.25 × 10^5 ha) has significant effects on atmospheric moisture, but due to weakened sea breezes this leads to only limited increases in rainfall. 
In terms of recycling of the applied irrigation water, the rainfall enhancement in this particular setting is rather insignificant.
Pushing the limits of spatial resolution with the Kuiper Airborne observatory
NASA Technical Reports Server (NTRS)
Lester, Daniel
1994-01-01
Achieving high spatial resolution for astronomical objects in the far-IR is one of the most serious limitations to our work at these wavelengths, which carry information about the luminosity of dusty and obscured sources. At IR wavelengths shorter than 30 microns, ground-based telescopes with large apertures at superb sites achieve diffraction-limited performance close to the seeing limit in the optical. At millimeter wavelengths, ground-based interferometers achieve resolution that is close to this. The inaccessibility of the far-IR from the ground makes it difficult, however, to achieve complementary resolution in the far-IR. The 1983 IRAS survey, while extraordinarily sensitive, provides us with a sky map at a spatial resolution that is limited by detector size, on a spatial scale that is far coarser than that available at other wavelengths from the ground. The survey resolution is of order 4 arcmin in the 100 micron bandpass, and 2 arcmin at 60 microns (IRAS Explanatory Supplement, 1988). Information on a scale of 1' is available on some sources from the CPC. Deconvolution and image resolution using this database is one of the subjects of this workshop.
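The diffraction limit underlying this discussion can be made concrete with the Rayleigh criterion, θ ≈ 1.22 λ/D. Taking the KAO's roughly 0.9 m aperture (an assumption of this sketch, not stated in the text) at 100 µm gives about 28 arcsec, an order of magnitude finer than the ~4 arcmin IRAS survey resolution quoted above at the same wavelength:

```python
import math

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion theta = 1.22 * lambda / D, converted to arcsec."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

# KAO telescope aperture of 0.9 m is assumed here for illustration.
print(round(diffraction_limit_arcsec(100e-6, 0.9), 1))  # ~28 arcsec at 100 um
```

The same formula shows why far-IR resolution is so hard from any platform: at 100 µm even a meter-class aperture is hundreds of times coarser than an optical telescope of the same size.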
NASA Astrophysics Data System (ADS)
Turner, Alexander J.; Jacob, Daniel J.; Benmergui, Joshua; Brandman, Jeremy; White, Laurent; Randles, Cynthia A.
2018-06-01
Anthropogenic methane emissions originate from a large number of fine-scale and often transient point sources. Satellite observations of atmospheric methane columns are an attractive approach for monitoring these emissions but have limitations from instrument precision, pixel resolution, and measurement frequency. Dense observations will soon be available in both low-Earth and geostationary orbits, but the extent to which they can provide fine-scale information on methane sources has yet to be explored. Here we present an observing system simulation experiment (OSSE) to assess the capabilities of different satellite observing system configurations. We conduct a 1-week WRF-STILT simulation to generate methane column footprints at 1.3 × 1.3 km2 spatial resolution and hourly temporal resolution over a 290 × 235 km2 domain in the Barnett Shale, a major oil and gas field in Texas with a large number of point sources. We sub-sample these footprints to match the observing characteristics of the recently launched TROPOMI instrument (7 × 7 km2 pixels, 11 ppb precision, daily frequency), the planned GeoCARB instrument (2.7 × 3.0 km2 pixels, 4 ppb precision, nominal twice-daily frequency), and other proposed observing configurations. The information content of the various observing systems is evaluated using the Fisher information matrix and its eigenvalues. We find that a week of TROPOMI observations should provide information on temporally invariant emissions at ~30 km spatial resolution. GeoCARB should provide information on temporally invariant emissions at ~2-7 km spatial resolution depending on sampling frequency (hourly to daily). Improvements to the instrument precision yield greater increases in information content than improved sampling frequency. A precision better than 6 ppb is critical for GeoCARB to achieve fine resolution of emissions. Transient emissions would be missed with either TROPOMI or GeoCARB. 
An aspirational high-resolution geostationary instrument with 1.3 × 1.3 km2 pixel resolution, hourly return time, and 1 ppb precision would effectively constrain the temporally invariant emissions in the Barnett Shale at the kilometer scale and provide some information on hourly variability of sources.
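The eigenvalue-based information metric used in this abstract can be sketched as follows: for a Jacobian K of column footprints and a diagonal observation-error covariance S_o = σ²I, the Fisher information matrix is F = KᵀS_o⁻¹K, and eigenvalues exceeding the prior precision count resolvable emission modes. The matrix sizes, scaling, and unit prior threshold below are invented for illustration; the study's exact formulation may differ:

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_state = 200, 30
K = 0.5 * rng.normal(size=(n_obs, n_state))   # synthetic column footprints

def fisher_information(K, precision_ppb):
    """F = K^T S_o^-1 K with S_o = precision^2 * I (diagonal errors)."""
    return (K.T @ K) / precision_ppb**2

F_tropomi_like = fisher_information(K, 11.0)  # coarser, 11 ppb precision
F_1ppb = fisher_information(K, 1.0)           # aspirational 1 ppb precision

# Count eigenvalues above a unit prior precision as "resolvable" modes:
resolvable = lambda F: int((np.linalg.eigvalsh(F) > 1.0).sum())
print(resolvable(F_tropomi_like), resolvable(F_1ppb))
```

Because precision enters as 1/σ², improving precision from 11 to 1 ppb multiplies every eigenvalue by 121, which is the mechanism behind the abstract's finding that precision matters more than revisit frequency.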
NASA Astrophysics Data System (ADS)
Trinks, Immo; Neubauer, Wolfgang; Hinterleitner, Alois; Kucera, Matthias; Löcker, Klaus; Nau, Erich; Wallner, Mario; Gabler, Manuel; Zitz, Thomas
2014-05-01
Over the past three years the 2010 in Vienna founded Ludwig Boltzmann Institute for Archaeological Prospection and Virtual Archaeology (http://archpro.lbg.ac.at), in collaboration with its ten European partner organizations, has made considerable progress in the development and application of near-surface geophysical survey technology and methodology mapping square kilometres rather than hectares in unprecedented spatial resolution. The use of multiple novel motorized multichannel GPR and magnetometer systems (both Förster/Fluxgate and Cesium type) in combination with advanced and centimetre precise positioning systems (robotic totalstations and Realtime Kinematic GPS) permitting efficient navigation in open fields have resulted in comprehensive blanket coverage archaeological prospection surveys of important cultural heritage sites, such as the landscape surrounding Stonehenge in the framework of the Stonehenge Hidden Landscape Project, the mapping of the World Cultural Heritage site Birka-Hovgården in Sweden, or the detailed investigation of the Roman urban landscape of Carnuntum near Vienna. Efficient state-of-the-art archaeological prospection survey solutions require adequate fieldwork methodologies and appropriate data processing tools for timely quality control of the data in the field and large-scale data visualisations after arrival back in the office. The processed and optimized visualisations of the geophysical measurement data provide the basis for subsequent archaeological interpretation. Integration of the high-resolution geophysical prospection data with remote sensing data acquired through aerial photography, airborne laser- and hyperspectral-scanning, terrestrial laser-scanning or detailed digital terrain models derived through photogrammetric methods permits improved understanding and spatial analysis as well as the preparation of comprehensible presentations for the stakeholders (scientific community, cultural heritage managers, public). 
Of paramount importance in regard to large-scale high-resolution data acquisition when using motorized survey systems is the exact data positioning as well as the removal of any measurement effects caused by the survey vehicle. The large amount of generated data requires efficient semi-automatic and automatized tools for the extraction and rendering of important information. Semi-automatic data segmentation and classification precede the detailed 3D archaeological interpretation, which still requires considerable manual input. We present the latest technological and methodological developments in regard to motorized near-surface GPR and magnetometer prospection as well as application examples from different iconic European archaeological sites.
NASA Astrophysics Data System (ADS)
Kyrke-Smith, Teresa M.; Gudmundsson, G. Hilmar; Farrell, Patrick E.
2017-11-01
We investigate correlations between seismically derived estimates of basal acoustic impedance and basal slipperiness values obtained from a surface-to-bed inversion using a Stokes ice flow model. Using high-resolution measurements along several seismic profiles on Pine Island Glacier (PIG), we find no significant correlation at kilometer scale between acoustic impedance and either retrieved basal slipperiness or basal drag. However, there is a stronger correlation when comparing average values along the individual profiles. We hypothesize that the correlation appears at the length scales over which basal variations are important to large-scale ice sheet flow. Although the seismic technique is sensitive to the material properties of the bed, at present there is no clear way of incorporating high-resolution seismic measurements of bed properties on ice streams into ice flow models. We conclude that more theoretical work needs to be done before constraints on mechanical conditions at the ice-bed interface from acoustic impedance measurements can be of direct use to ice sheet models.
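The scale dependence reported here, weak point-by-point correlation but stronger correlation between along-profile averages, arises naturally when two fields share a large-scale component with independent small-scale variability. The toy model below reproduces that behavior; all numbers are synthetic, not PIG data:

```python
import numpy as np

# Acoustic impedance and basal slipperiness share a per-profile
# (large-scale) component but have independent small-scale noise.
rng = np.random.default_rng(0)
n_profiles, n_points = 20, 100
shared = rng.normal(size=(n_profiles, 1))              # large-scale signal
impedance = shared + 3.0 * rng.normal(size=(n_profiles, n_points))
slipperiness = shared + 3.0 * rng.normal(size=(n_profiles, n_points))

# Kilometer-scale (point-by-point) correlation: weak
r_point = np.corrcoef(impedance.ravel(), slipperiness.ravel())[0, 1]
# Along-profile averages: the shared large-scale component dominates
r_profile = np.corrcoef(impedance.mean(axis=1),
                        slipperiness.mean(axis=1))[0, 1]
print(round(r_point, 2), round(r_profile, 2))
```

Averaging suppresses the independent small-scale noise by a factor of about 1/√n_points, so the shared signal emerges, consistent with the hypothesis that the correlation appears at the length scales relevant to large-scale ice flow.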
NASA Astrophysics Data System (ADS)
Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Christensen, Hannah M.; Juricke, Stephan; Subramanian, Aneesh; Watson, Peter A. G.; Weisheimer, Antje; Palmer, Tim N.
2017-03-01
The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km down to 16 km). The project includes more than 120 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), together with coupled transient runs (1850-2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored in the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate, specifically the Madden-Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
Dynamic Moss Observed with Hi-C
NASA Technical Reports Server (NTRS)
Alexander, Caroline; Winebarger, Amy; Morton, Richard; Savage, Sabrina
2014-01-01
The High-resolution Coronal Imager (Hi-C), flown on 11 July 2012, has revealed an unprecedented level of detail and substructure within the solar corona. Hi-C imaged a large active region (AR11520) with 0.2-0.3'' spatial resolution and 5.5 s cadence over a 5 minute period. An additional dataset with a smaller FOV and the same resolution, but with a higher temporal cadence (1 s), was also taken during the rocket flight. This dataset was centered on a large patch of 'moss' emission that initially seemed to show very little variability. Image processing revealed this region to be much more dynamic than first thought, with numerous bright and dark features observed to appear, move, and disappear over the 5 minute observation. Moss is thought to be emission from the upper transition region component of hot loops, so studying its dynamics and the relation between the bright/dark features and underlying magnetic features is important for tying the interaction of the different atmospheric layers together. Hi-C allows us to study the coronal emission of the moss at the smallest scales, while data from SDO/AIA and HMI are used to give information on these structures at different heights/temperatures. Using the high temporal and spatial resolution of Hi-C, the observed moss features were tracked and the distributions of displacements, speeds, and sizes were measured. This allows us to comment on both the physical processes occurring within the dynamic moss and the scales at which these changes are occurring.
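The feature-tracking measurement described here, displacements and speeds of moss features between frames, can be sketched with simple nearest-neighbour centroid matching. This is a minimal illustration, not the actual Hi-C analysis pipeline; the plate scale used (0.1 arcsec/px ≈ 73 km on the Sun) and the centroid values are assumptions of the sketch:

```python
import numpy as np

def track_nearest(centroids_a, centroids_b, cadence_s, km_per_px):
    """Match each feature centroid in frame A to the nearest centroid in
    frame B; return displacements (km) and speeds (km/s)."""
    d = np.linalg.norm(centroids_a[:, None, :] - centroids_b[None, :, :],
                       axis=2)
    match = d.argmin(axis=1)        # closest frame-B feature for each A
    disp_km = np.linalg.norm(centroids_b[match] - centroids_a,
                             axis=1) * km_per_px
    return disp_km, disp_km / cadence_s

frame_a = np.array([[10.0, 10.0], [40.0, 25.0]])  # centroids, frame 1 (px)
frame_b = np.array([[11.0, 10.5], [41.5, 25.0]])  # same features, moved
disp, speed = track_nearest(frame_a, frame_b, cadence_s=5.5, km_per_px=73.0)
print(disp, speed)
```

Repeating this over all frame pairs at the 5.5 s (or 1 s) cadence yields the distributions of displacements and speeds the abstract reports.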
Forced Imbibition in Porous Media: A Fourfold Scenario
NASA Astrophysics Data System (ADS)
Odier, Céleste; Levaché, Bertrand; Santanach-Carreras, Enric; Bartolo, Denis
2017-11-01
We establish a comprehensive description of the patterns formed when a wetting liquid displaces a viscous fluid confined in a porous medium. Building on model microfluidic experiments, we evidence four imbibition scenarios all yielding different large-scale morphologies. Combining high-resolution imaging and confocal microscopy, we show that they originate from two liquid-entrainment transitions and a Rayleigh-Plateau instability at the pore scale. Finally, we demonstrate and explain the long-time coarsening of the resulting patterns.
Endalamaw, Abraham; Bolton, W. Robert; Young-Robertson, Jessica M.; ...
2017-09-14
Modeling hydrological processes in the Alaskan sub-arctic is challenging because of the extreme spatial heterogeneity in soil properties and vegetation communities. Nevertheless, modeling and predicting hydrological processes is critical in this region due to its vulnerability to the effects of climate change. Coarse-spatial-resolution datasets used in land surface modeling pose a new challenge in simulating the spatially distributed and basin-integrated processes since these datasets do not adequately represent the small-scale hydrological, thermal, and ecological heterogeneity. The goal of this study is to improve the prediction capacity of mesoscale to large-scale hydrological models by introducing a small-scale parameterization scheme, which better represents the spatial heterogeneity of soil properties and vegetation cover in the Alaskan sub-arctic. The small-scale parameterization schemes are derived from observations and a sub-grid parameterization method in the two contrasting sub-basins of the Caribou Poker Creek Research Watershed (CPCRW) in Interior Alaska: one nearly permafrost-free (LowP) sub-basin and one permafrost-dominated (HighP) sub-basin. The sub-grid parameterization method used in the small-scale parameterization scheme is derived from the watershed topography. We found that observed soil thermal and hydraulic properties, including the distribution of permafrost and vegetation cover heterogeneity, are better represented in the sub-grid parameterization method than in the coarse-resolution datasets. Parameters derived from the coarse-resolution datasets and from the sub-grid parameterization method are implemented into the variable infiltration capacity (VIC) mesoscale hydrological model to simulate runoff, evapotranspiration (ET), and soil moisture in the two sub-basins of the CPCRW. 
Simulated hydrographs based on the small-scale parameterization capture most of the peak and low flows, with similar accuracy in both sub-basins, compared to simulated hydrographs based on the coarse-resolution datasets. On average, the small-scale parameterization scheme improves the total runoff simulation by up to 50 % in the LowP sub-basin and by up to 10 % in the HighP sub-basin relative to the large-scale parameterization. This study shows that the proposed sub-grid parameterization method can be used to improve the performance of mesoscale hydrological models in Alaskan sub-arctic watersheds.
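A topography-derived sub-grid parameterization of the kind described can be sketched as follows: classify each fine DEM cell by aspect (north-facing slopes taken as permafrost-prone, an assumption of this sketch motivated by these watersheds), then aggregate a fractional permafrost coverage per coarse grid cell. The DEM, block size, and aspect convention are all illustrative:

```python
import numpy as np

def aspect_from_dem(dem, cellsize=1.0):
    """Aspect (radians) from finite-difference slope components;
    0 = north-facing under the sign convention assumed here."""
    dzdy, dzdx = np.gradient(dem, cellsize)
    return np.arctan2(-dzdx, dzdy)

def permafrost_fraction(dem, block=10):
    """Sub-grid parameterization sketch: fraction of north-facing
    (assumed permafrost-prone) fine cells within each coarse cell."""
    aspect = aspect_from_dem(dem)
    north = np.cos(aspect) > 0.5          # within ~60 degrees of north
    ny, nx = dem.shape
    coarse = north[:ny - ny % block, :nx - nx % block]
    coarse = coarse.reshape(ny // block, block, nx // block, block)
    return coarse.mean(axis=(1, 3))

# Synthetic east-west ridge: one flank faces north, the other south.
y = np.linspace(-1.0, 1.0, 100)[:, None]
dem = -np.abs(y) * np.ones((100, 100)) * 500.0
frac = permafrost_fraction(dem)
print(frac.shape)
```

The resulting coarse-cell fractions could then feed a model such as VIC as sub-grid tile weights, rather than assigning each coarse cell a single dominant soil type from a coarse-resolution dataset.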
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endalamaw, Abraham; Bolton, W. Robert; Young-Robertson, Jessica M.
Modeling hydrological processes in the Alaskan sub-arctic is challenging because of the extreme spatial heterogeneity in soil properties and vegetation communities. Nevertheless, modeling and predicting hydrological processes is critical in this region due to its vulnerability to the effects of climate change. Coarse-spatial-resolution datasets used in land surface modeling pose a new challenge in simulating the spatially distributed and basin-integrated processes since these datasets do not adequately represent the small-scale hydrological, thermal, and ecological heterogeneity. The goal of this study is to improve the prediction capacity of mesoscale to large-scale hydrological models by introducing a small-scale parameterization scheme, which bettermore » represents the spatial heterogeneity of soil properties and vegetation cover in the Alaskan sub-arctic. The small-scale parameterization schemes are derived from observations and a sub-grid parameterization method in the two contrasting sub-basins of the Caribou Poker Creek Research Watershed (CPCRW) in Interior Alaska: one nearly permafrost-free (LowP) sub-basin and one permafrost-dominated (HighP) sub-basin. The sub-grid parameterization method used in the small-scale parameterization scheme is derived from the watershed topography. We found that observed soil thermal and hydraulic properties – including the distribution of permafrost and vegetation cover heterogeneity – are better represented in the sub-grid parameterization method than the coarse-resolution datasets. Parameters derived from the coarse-resolution datasets and from the sub-grid parameterization method are implemented into the variable infiltration capacity (VIC) mesoscale hydrological model to simulate runoff, evapotranspiration (ET), and soil moisture in the two sub-basins of the CPCRW. 
Simulated hydrographs based on the small-scale parameterization capture most of the peak and low flows, with similar accuracy in both sub-basins, compared to simulated hydrographs based on the coarse-resolution datasets. On average, the small-scale parameterization scheme improves the total runoff simulation by up to 50 % in the LowP sub-basin and by up to 10 % in the HighP sub-basin relative to the large-scale parameterization. This study shows that the proposed sub-grid parameterization method can be used to improve the performance of mesoscale hydrological models in Alaskan sub-arctic watersheds.
On the use of kinetic energy preserving DG-schemes for large eddy simulation
NASA Astrophysics Data System (ADS)
Flad, David; Gassner, Gregor
2017-12-01
Recently, element based high order methods such as Discontinuous Galerkin (DG) methods and the closely related flux reconstruction (FR) schemes have become popular for compressible large eddy simulation (LES). Element based high order methods with Riemann solver based interface numerical flux functions offer an interesting dispersion dissipation behavior for multi-scale problems: dispersion errors are very low for a broad range of scales, while dissipation errors are very low for well resolved scales and very high for scales close to the Nyquist cutoff. In some sense, the inherent numerical dissipation caused by the interface Riemann solver acts as a filter of high frequency solution components. This observation motivates the trend of using element based high order methods with Riemann solvers without an explicit LES model added. Only the high frequency type inherent dissipation caused by the Riemann solver at the element interfaces is used to account for the missing sub-grid scale dissipation. Due to under-resolution of the vorticity-dominated structures typical of LES setups, element based high order methods suffer from stability issues caused by aliasing errors of the non-linear flux terms. A very common strategy to fight these aliasing issues (and instabilities) is so-called polynomial de-aliasing, where interpolation is exchanged with projection based on an increased number of quadrature points. In this paper, we start with this common no-model or implicit LES (iLES) DG approach with polynomial de-aliasing and Riemann solver dissipation and review its capabilities and limitations. We find that the strategy gives excellent results, but only when the resolution is such that about 40% of the dissipation is resolved. For the more realistic, coarser resolutions used in classical LES, e.g. of industrial applications, the iLES DG strategy becomes quite inaccurate.
We show that there is no obvious fix to this strategy, as adding, for instance, a sub-grid-scale model on top changes little or, in the worst case, decreases the fidelity even further. Finally, the core of this work is a novel LES strategy based on split form DG methods that are kinetic energy preserving. The scheme offers excellent stability with full control over the amount and shape of the added artificial dissipation. This is the main idea of the work, and we assess the LES capabilities of the novel split form DG approach when applied to shock-free, moderate Mach number turbulence. We demonstrate that the novel DG LES strategy offers accuracy similar to the iLES methodology for well resolved cases, but strongly increases fidelity for more realistic coarse resolutions.
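The kinetic-energy-preserving split form can be illustrated on a much simpler model problem than the compressible equations treated in the paper. The sketch below is an illustrative assumption, not the authors' DG scheme: it uses the well-known energy-conservative two-point flux for the 1-D Burgers equation, f*(uL, uR) = (uL² + uL·uR + uR²)/6, on a periodic grid, and checks that the semi-discrete kinetic-energy rate vanishes to machine precision.

```python
import numpy as np

def ec_flux(uL, uR):
    # Energy-conservative two-point flux for Burgers' equation:
    # f*(uL, uR) = (uL^2 + uL*uR + uR^2) / 6
    return (uL**2 + uL * uR + uR**2) / 6.0

N = 64
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x) + 0.5

F = ec_flux(u, np.roll(u, -1))          # flux at interface i+1/2 (periodic)
dudt = -(F - np.roll(F, 1)) / dx        # semi-discrete update

# Kinetic-energy rate: telescopes to zero for the split-form flux
dEdt = np.sum(u * dudt) * dx
```

With a standard (e.g. central or upwind) flux, dEdt would not vanish; the split form gives the scheme its stability without adding dissipation.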
Simulation of the Atmospheric Boundary Layer for Wind Energy Applications
NASA Astrophysics Data System (ADS)
Marjanovic, Nikola
Energy production from wind is an increasingly important component of overall global power generation, and will likely continue to gain an even greater share of electricity production as world governments attempt to mitigate climate change and wind energy production costs decrease. Wind energy generation depends on wind speed, which is greatly influenced by local and synoptic environmental forcings. Synoptic forcing, such as a cold frontal passage, exists on a large spatial scale while local forcing manifests itself on a much smaller scale and could result from topographic effects or land-surface heat fluxes. Synoptic forcing, if strong enough, may suppress the effects of generally weaker local forcing. At the even smaller scale of a wind farm, upstream turbines generate wakes that decrease the wind speed and increase the atmospheric turbulence at the downwind turbines, thereby reducing power production and increasing fatigue loading that may damage turbine components, respectively. Simulation of atmospheric processes that span a considerable range of spatial and temporal scales is essential to improve wind energy forecasting, wind turbine siting, turbine maintenance scheduling, and wind turbine design. Mesoscale atmospheric models predict atmospheric conditions using observed data, for a wide range of meteorological applications across scales from thousands of kilometers to hundreds of meters. Mesoscale models include parameterizations for the major atmospheric physical processes that modulate wind speed and turbulence dynamics, such as cloud evolution and surface-atmosphere interactions. The Weather Research and Forecasting (WRF) model is used in this dissertation to investigate the effects of model parameters on wind energy forecasting. WRF is used for case study simulations at two West Coast North American wind farms, one with simple and one with complex terrain, during both synoptically and locally-driven weather events. 
The model's performance with different grid nesting configurations, turbulence closures, and grid resolutions is evaluated by comparison to observation data. Improvement to simulation results from the use of more computationally expensive high resolution simulations is only found for the complex terrain simulation during the locally-driven event. Physical parameters, such as soil moisture, have a large effect on locally-forced events, and prognostic turbulence kinetic energy (TKE) schemes are found to perform better than non-local eddy viscosity turbulence closure schemes. Mesoscale models, however, do not resolve turbulence directly, which is important at finer grid resolutions capable of resolving wind turbine components and their interactions with atmospheric turbulence. Large-eddy simulation (LES) is a numerical approach that resolves the largest scales of turbulence directly by separating large-scale, energetically important eddies from smaller scales with the application of a spatial filter. LES allows higher fidelity representation of the wind speed and turbulence intensity at the scale of a wind turbine which parameterizations have difficulty representing. Use of high-resolution LES enables the implementation of more sophisticated wind turbine parameterizations to create a robust model for wind energy applications using grid spacing small enough to resolve individual elements of a turbine such as its rotor blades or rotation area. Generalized actuator disk (GAD) and line (GAL) parameterizations are integrated into WRF to complement its real-world weather modeling capabilities and better represent wind turbine airflow interactions, including wake effects. The GAD parameterization represents the wind turbine as a two-dimensional disk resulting from the rotation of the turbine blades. Forces on the atmosphere are computed along each blade and distributed over rotating, annular rings intersecting the disk. 
While typical LES resolution (10-20 m) is normally sufficient to resolve the GAD, the GAL parameterization requires significantly higher resolution (1-3 m) as it does not distribute the forces from the blades over annular elements, but applies them along lines representing individual blades. In this dissertation, the GAL is implemented into WRF and evaluated against the GAD parameterization using data from two field campaigns that measured the inflow and near-wake regions of a single turbine. The data sets are chosen to allow validation under the weakly convective and weakly stable conditions characterizing most turbine operations. The parameterizations are evaluated with respect to their ability to represent wake wind speed, variance, and vorticity by comparing fine-resolution GAD and GAL simulations along with coarse-resolution GAD simulations. Coarse-resolution GAD simulations produce aggregated wake characteristics similar to both fine-resolution GAD and GAL simulations (saving on computational cost), while the GAL parameterization enables resolution of near-wake physics (such as vorticity shedding and wake expansion) for high-fidelity applications. (Abstract shortened by ProQuest.)
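For orientation, the momentum-theory relations that underlie any actuator-disk representation of a turbine can be sketched in a few lines. This is a simplified 1-D sketch under standard momentum-theory assumptions, not the WRF GAD/GAL implementation described above, which distributes blade-element forces over rotating annular rings.

```python
import numpy as np

def actuator_disk(U_inf, rho, D, C_T):
    """1-D momentum-theory view of an actuator disk: the rotor is a
    permeable disk extracting momentum from the flow."""
    A = np.pi * (D / 2.0) ** 2
    thrust = 0.5 * rho * A * C_T * U_inf**2   # total axial force on the flow
    a = 0.5 * (1.0 - np.sqrt(1.0 - C_T))      # axial induction factor
    u_disk = U_inf * (1.0 - a)                # velocity at the disk plane
    u_wake = U_inf * (1.0 - 2.0 * a)          # idealized far-wake velocity
    return thrust, u_disk, u_wake

# Illustrative numbers: 8 m/s inflow, 80 m rotor, thrust coefficient 0.75
thrust, u_disk, u_wake = actuator_disk(U_inf=8.0, rho=1.225, D=80.0, C_T=0.75)
```

In the GAD itself, the single thrust value is replaced by lift and drag forces computed along each blade and spread over the disk; the momentum-theory wake deficit is what a coarse-resolution simulation effectively recovers in aggregate.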
Bryson, Mitch; Johnson-Roberson, Matthew; Murphy, Richard J; Bongiorno, Daniel
2013-01-01
Intertidal ecosystems have primarily been studied using field-based sampling; remote sensing offers the ability to collect data over large areas in a snapshot of time that could complement field-based sampling methods by extrapolating them into the wider spatial and temporal context. Conventional remote sensing tools (such as satellite and aircraft imaging) provide data at limited spatial and temporal resolutions and relatively high costs for small-scale environmental science and ecologically-focussed studies. In this paper, we describe a low-cost, kite-based imaging system and photogrammetric/mapping procedure that was developed for constructing high-resolution, three-dimensional, multi-spectral terrain models of intertidal rocky shores. The processing procedure uses automatic image feature detection and matching, structure-from-motion and photo-textured terrain surface reconstruction algorithms that require minimal human input and only a small number of ground control points and allow the use of cheap, consumer-grade digital cameras. The resulting maps combine imagery at visible and near-infrared wavelengths and topographic information at sub-centimeter resolutions over an intertidal shoreline 200 m long, thus enabling spatial properties of the intertidal environment to be determined across a hierarchy of spatial scales. Results of the system are presented for an intertidal rocky shore at Jervis Bay, New South Wales, Australia. Potential uses of this technique include mapping of plant (micro- and macro-algae) and animal (e.g. gastropods) assemblages at multiple spatial and temporal scales.
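The structure-from-motion step at the heart of such a photogrammetric pipeline reduces, for each matched image feature, to triangulating a 3-D point from two camera views. Below is a minimal sketch of linear (DLT) triangulation; the camera matrices and the ground point are synthetic stand-ins for the kite imagery, and all names and numbers are illustrative assumptions.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coords."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                      # dehomogenise

# Synthetic check: two cameras one metre apart observing one ground point
K = np.array([[1000.0, 0, 320], [0, 1000.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.3, -0.2, 5.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
X_est = triangulate(P1, P2, x1, x2)
```

A full pipeline repeats this over thousands of matched features while jointly refining the camera poses (bundle adjustment), which is what yields the sub-centimeter terrain models described above.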
NASA Astrophysics Data System (ADS)
Roesler, E. L.; Bosler, P. A.; Taylor, M.
2016-12-01
The impact of strong extratropical storms on coastal communities is large, and the extent to which storms will change with a warming Arctic is unknown. Understanding storms in reanalysis and in climate models is important for future predictions. We know that the number of detected Arctic storms in reanalysis is sensitive to grid resolution. To understand Arctic storm sensitivity to resolution in climate models, we describe simulations designed to identify and compare Arctic storms at uniform low resolution (1 degree), at uniform high resolution (1/8 degree), and at variable resolution (1 degree to 1/8 degree). High-resolution simulations resolve more fine-scale structure and extremes, such as storms, in the atmosphere than a uniform low-resolution simulation. However, the computational cost of running a globally uniform high-resolution simulation is often prohibitive. The variable resolution tool in atmospheric general circulation models permits regional high-resolution solutions at a fraction of the computational cost. The storms are identified using the open-source search algorithm Stride Search. The uniform high-resolution simulation has over 50% more storms than the uniform low-resolution and over 25% more storms than the variable resolution simulations. Storm statistics from each of the simulations are presented and compared with reanalysis. We propose variable resolution as a cost-effective means of investigating physics/dynamics coupling in the Arctic environment. Future work will include comparisons with observed storms to investigate tuning parameters for high resolution models. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND2016-7402 A
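As a rough illustration of what a storm-detection pass over gridded model output involves, one can flag local minima of sea-level pressure below a threshold. This toy criterion is not the Stride Search algorithm itself, which searches overlapping sectors on the sphere and applies multiple criteria; it only sketches the kind of test such a search applies at each location.

```python
import numpy as np

def detect_storms(slp, threshold=1000.0):
    """Flag grid points that are strict local minima of sea-level
    pressure (hPa) below a threshold, on a 2-D lat-lon array with
    periodic longitude. A toy stand-in for a storm-detection pass."""
    candidates = []
    nlat, nlon = slp.shape
    for i in range(1, nlat - 1):              # skip the polar rows
        for j in range(nlon):
            nbrs = [slp[i - 1, j], slp[i + 1, j],
                    slp[i, (j - 1) % nlon], slp[i, (j + 1) % nlon]]
            if slp[i, j] < threshold and slp[i, j] < min(nbrs):
                candidates.append((i, j))
    return candidates

# Synthetic field: uniform 1012 hPa background with one 980 hPa low
slp = np.full((90, 180), 1012.0)
slp[40, 100] = 980.0
```

Resolution sensitivity, as discussed above, enters because a coarse grid smooths pressure minima until some no longer pass the detection criteria.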
Distant Influence of Kuroshio Eddies on North Pacific Weather Patterns?
Ma, Xiaohui; Chang, Ping; Saravanan, R; Montuoro, Raffaele; Hsieh, Jen-Shan; Wu, Dexing; Lin, Xiaopei; Wu, Lixin; Jing, Zhao
2015-12-04
High-resolution satellite measurements of surface winds and sea-surface temperature (SST) reveal strong coupling between meso-scale ocean eddies and near-surface atmospheric flow over eddy-rich oceanic regions, such as the Kuroshio and Gulf Stream, highlighting the importance of meso-scale oceanic features in forcing the atmospheric planetary boundary layer (PBL). Here, we present high-resolution regional climate modeling results, supported by observational analyses, demonstrating that meso-scale SST variability, largely confined to the Kuroshio-Oyashio confluence region (KOCR), can further exert a significant distant influence on winter rainfall variability along the U.S. Northern Pacific coast. The presence of meso-scale SST anomalies enhances the diabatic conversion of latent heat energy to transient eddy energy, intensifying winter cyclogenesis via moist baroclinic instability, which in turn leads to an equivalent barotropic downstream anticyclone anomaly with reduced rainfall. The finding points to the potential of improving forecasts of extratropical winter cyclones and storm systems, and projections of their response to future climate change, which are known to have major social and economic impacts, by improving the representation of ocean eddy-atmosphere interaction in forecast and climate models.
Using an SLR inversion to measure the mass balance of Greenland before and during GRACE
NASA Astrophysics Data System (ADS)
Bonin, Jennifer
2016-04-01
The GRACE mission has done an admirable job of measuring large-scale mass changes over Greenland since its launch in 2002. Before that time, however, measurements of large-scale ice mass balance were few and far between, leading to a lack of baseline knowledge. High-quality Satellite Laser Ranging (SLR) data existed a decade earlier, but normally has too low a spatial resolution to be used for this purpose. I demonstrate that a least squares inversion technique can reconstitute the SLR data and use it to measure ice loss over Greenland. To do so, I first simulate the problem by degrading today's GRACE data to a level comparable with SLR, then demonstrate that the inversion can re-localize Greenland's contribution to the low-resolution signal, giving an accurate time series of mass change over all of Greenland that compares well with the full-resolution GRACE estimates. I then apply that method to the actual SLR data, resulting in an independent 1994-2014 time series of mass change over Greenland. I find favorable agreement between the pure-SLR inverted results and the 2012 Ice-sheet Mass Balance Inter-comparison Exercise (IMBIE) results, which are largely based on the "input-output" modeling method before GRACE's launch.
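The least squares step of such an inversion can be sketched with a synthetic sensitivity matrix. The matrix A below is an illustrative stand-in for the smoothed, low-degree signature of each region in the observations, not the actual SLR geometry or the study's basin definitions.

```python
import numpy as np

def invert_regions(A, y):
    """Least-squares inversion: recover per-region mass changes x from
    low-resolution observations y, given a sensitivity matrix A whose
    columns are each region's smoothed, low-degree signature."""
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return x

# Toy setup: 3 regions observed through 10 coarse, overlapping measurements
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 3))             # synthetic sensitivity matrix
x_true = np.array([-200.0, 5.0, 12.0])   # per-region mass change (e.g. Gt)
y = A @ x_true                           # noise-free synthetic observations
x_est = invert_regions(A, y)
```

The real problem is harder because the observations are noisy and the regional signatures overlap strongly at SLR resolution, which is why the simulation with degraded GRACE data is used to validate the localization before trusting the SLR result.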
Large-area super-resolution optical imaging by using core-shell microfibers
NASA Astrophysics Data System (ADS)
Liu, Cheng-Yang; Lo, Wei-Chieh
2017-09-01
We present, numerically and experimentally, the first demonstration of large-area super-resolution optical imaging achieved using core-shell microfibers. The spatial electromagnetic waves for different core-shell microfibers are studied using finite-difference time-domain and ray tracing calculations. The focusing properties of photonic nanojets are evaluated in terms of intensity profile and full width at half-maximum along the propagation and transversal directions. In the experiment, a standard optical fiber is chemically etched down to a 6 μm diameter and coated with different metallic thin films using glancing angle deposition. Direct imaging of the photonic nanojets for different core-shell microfibers is performed with a scanning optical microscope system. We show that the intensity distribution of a photonic nanojet is highly dependent on the metallic shell due to surface plasmon polaritons. Furthermore, large-area super-resolution optical imaging is performed using different core-shell microfibers placed over a nano-scale grating with 150 nm line width. The core-shell microfiber-assisted imaging achieves super-resolution with a field of view hundreds of times larger than that of microspheres. Possible applications of these core-shell optical microfibers include real-time large-area micro-fluidics and nano-structure inspections.
NASA Astrophysics Data System (ADS)
Deo, Ram K.; Domke, Grant M.; Russell, Matthew B.; Woodall, Christopher W.; Andersen, Hans-Erik
2018-05-01
Aboveground biomass (AGB) estimates for regional-scale forest planning have become cost-effective with free access to satellite data from sensors such as Landsat and MODIS. However, the accuracy of AGB predictions based on passive optical data depends on the spatial resolution and spatial extent of the target area, as fine-resolution (small-pixel) data are associated with smaller coverage and longer repeat cycles compared to coarse-resolution data. This study evaluated various spatial resolutions of Landsat-derived predictors on the accuracy of regional AGB models at three different sites in the eastern USA: Maine, Pennsylvania-New Jersey, and South Carolina. We combined national forest inventory data with Landsat-derived predictors at spatial resolutions ranging from 30–1000 m to understand the optimal spatial resolution of optical data for large-area (regional) AGB estimation. Ten generic models were developed using the data collected in 2014, 2015 and 2016, and the predictions were evaluated (i) at the county level against the estimates of the USFS Forest Inventory and Analysis Program, which relied on the EVALIDator tool and national forest inventory data from the 2009–2013 cycle, and (ii) within a large number of strips (~1 km wide) predicted via LiDAR metrics at 30 m spatial resolution. The county-level estimates by the EVALIDator and Landsat models were highly related (R² > 0.66), although the R² varied significantly across sites and resolutions of predictors. The mean and standard deviation of county-level estimates followed increasing and decreasing trends, respectively, with models of coarser resolution. The Landsat-based total AGB estimates were larger than the LiDAR-based total estimates within the strips; however, the mean AGB predictions by LiDAR were mostly within one standard deviation of the mean predictions obtained from the Landsat-based model at any of the resolutions.
We conclude that satellite data at resolutions up to 1000 m provide acceptable accuracy for continental scale analysis of AGB.
Impact of oceanic-scale interactions on the seasonal modulation of ocean dynamics by the atmosphere.
Sasaki, Hideharu; Klein, Patrice; Qiu, Bo; Sasai, Yoshikazu
2014-12-15
Ocean eddies (with a size of 100-300 km), ubiquitous in satellite observations, are known to represent about 80% of the total ocean kinetic energy. Recent studies have pointed out the unexpected role of smaller oceanic structures (with 1-50 km scales) in generating and sustaining these eddies. The interpretation proposed so far invokes the internal instability resulting from the large-scale interaction between upper and interior oceanic layers. Here we show, using a new high-resolution simulation of the realistic North Pacific Ocean, that ocean eddies are instead sustained by a different process that involves small-scale mixed-layer instabilities set up by large-scale atmospheric forcing in winter. This leads to a seasonal evolution of the eddy kinetic energy in a very large part of this ocean, with an amplitude varying by a factor almost equal to 2. Perspectives in terms of the impacts on climate dynamics and future satellite observational systems are briefly discussed.
What if we took a global look?
NASA Astrophysics Data System (ADS)
Ouellet Dallaire, C.; Lehner, B.
2014-12-01
Freshwater resources are facing unprecedented pressures. In the hope of coping with these, environmental hydrology, freshwater biology, and fluvial geomorphology have defined conceptual approaches such as "environmental flow requirements", "instream flow requirements" or the "normative flow regime" to determine the flow regime needed to maintain a given ecological status. These advances in freshwater resources management ask scientists to build bridges across disciplines. Holistic, multi-scale approaches are becoming more and more common in water sciences research. The intrinsic nature of river systems demands that these approaches account for the upstream-downstream connectivity of watersheds. Before recent technological developments, large-scale analyses were cumbersome and the necessary data were often unavailable. However, new technologies, both for information collection and computing capacity, enable a high-resolution look at the global scale. For rivers around the world, this new outlook is facilitated by the hydrologically relevant geospatial database HydroSHEDS. This database now offers more than 24 million kilometers of rivers, some never mapped before, at the click of a button. Large-scale and even global-scale assessments can now be used to compare rivers around the world. A river classification framework called GloRiC (Global River Classification) was developed using HydroSHEDS. This framework advocates a holistic approach to river systems by using sub-classifications drawn from six disciplines related to river sciences: hydrology, physiography and climate, geomorphology, chemistry, biology, and human impact. Each of these disciplines brings complementary information on rivers that is relevant at different scales. A first version of a global river reach classification was produced at 500 m resolution. Variables used in the classification influence processes at different scales (e.g., topographic index vs. pH). 
However, all variables are computed at the same high spatial resolution. This way, we can take a global look at local phenomena.
Shields, Christine A.; Kiehl, Jeffrey T.; Meehl, Gerald A.
2016-06-02
The global fully coupled half-degree Community Climate System Model Version 4 (CCSM4) was integrated for a suite of climate change ensemble simulations including five historical runs, five Representative Concentration Pathway 8.5 (RCP8.5) runs, and a long Pre-Industrial control run. This study focuses on precipitation at regional scales and its sensitivity to horizontal resolution. The half-degree historical CCSM4 simulations are compared to observations, where relevant, and to the standard 1° CCSM4. Both the half-degree and 1° resolutions are coupled to a nominal 1° ocean. North American and South Asian/Indian monsoon regimes are highlighted because these regimes demonstrate improvements due to higher resolution, primarily because of better-resolved topography. Agriculturally sensitive areas are analyzed and include the Southwest, Central, and Southeast U.S., Southern Europe, and Australia. Both mean and extreme precipitation are discussed for convective and large-scale precipitation processes. Convective precipitation tends to decrease with increasing resolution and large-scale precipitation tends to increase. Improvements for the half-degree agricultural regions can be found for mean and extreme precipitation in the Southeast U.S., Southern Europe, and Australian regions. Climate change responses differ between the model resolutions for the U.S. Southwest/Central regions and are seasonally dependent in the Southeast and Australian regions. Both resolutions project a clear drying signal across Southern Europe due to increased greenhouse warming. As a result, differences between resolutions tied to the representation of convective and large-scale precipitation play an important role in the character of the climate change and depend on regional influences.
Ultra-Scale Computing for Emergency Evacuation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhaduri, Budhendra L; Nutaro, James J; Liu, Cheng
2010-01-01
Emergency evacuations are carried out in anticipation of a disaster such as hurricane landfall or flooding, and in response to a disaster that strikes without warning. Existing emergency evacuation modeling and simulation tools are primarily designed for evacuation planning and are of limited value in operational support for real-time evacuation management. In order to align with desktop computing, these models reduce the data and computational complexities through simple approximations and representations of real network conditions and traffic behaviors, which rarely represent real-world scenarios. With the emergence of high resolution physiographic, demographic, and socioeconomic data and supercomputing platforms, it is possible to develop micro-simulation based emergency evacuation models that can foster development of novel algorithms for human behavior and traffic assignments, and can simulate evacuation of millions of people over a large geographic area. However, such advances in evacuation modeling and simulations demand computational capacity beyond the desktop scale and can be supported by high performance computing platforms. This paper explores the motivation and feasibility of ultra-scale computing for increasing the speed of high resolution emergency evacuation simulations.
NASA Technical Reports Server (NTRS)
Putnam, William M.
2011-01-01
Earth system models like the Goddard Earth Observing System model (GEOS-5) have been pushing the limits of large clusters of multi-core microprocessors, producing breathtaking fidelity in resolving cloud systems at a global scale. GPU computing presents an opportunity for improving the efficiency of these leading-edge models. A GPU implementation of GEOS-5 will facilitate the use of cloud-system-resolving resolutions in data assimilation and weather prediction, at resolutions near 3.5 km, improving our ability to extract detailed information from high-resolution satellite observations and ultimately produce better weather and climate predictions.
Hazardous thunderstorm intensification over Lake Victoria
Thiery, Wim; Davin, Edouard L.; Seneviratne, Sonia I.; Bedka, Kristopher; Lhermitte, Stef; van Lipzig, Nicole P. M.
2016-01-01
Weather extremes have harmful impacts on communities around Lake Victoria, where thousands of fishermen die every year because of intense night-time thunderstorms. Yet how these thunderstorms will evolve in a future warmer climate is still unknown. Here we show that Lake Victoria is projected to be a hotspot of future extreme precipitation intensification by using new satellite-based observations, a high-resolution climate projection for the African Great Lakes and coarser-scale ensemble projections. Land precipitation on the previous day exerts a control on night-time occurrence of extremes on the lake by enhancing atmospheric convergence (74%) and moisture availability (26%). The future increase in extremes over Lake Victoria is about twice as large relative to surrounding land under a high-emission scenario, as only over-lake moisture advection is high enough to sustain Clausius–Clapeyron scaling. Our results highlight a major hazard associated with climate change over East Africa and underline the need for high-resolution projections to assess local climate change. PMID:27658848
Compactified cosmological simulations of the infinite universe
NASA Astrophysics Data System (ADS)
Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László
2018-06-01
We present a novel N-body simulation method that compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to follow the evolution of the large-scale structure. Our approach eliminates the need for periodic boundary conditions, a mere numerical convenience which is not supported by observation and which modifies the law of force on large scales in an unrealistic fashion. We demonstrate that our method outclasses standard simulations executed on workstation-scale hardware in dynamic range; it is balanced in following a comparable number of high and low k modes; and its fundamental geometry and topology match observations. Our approach is also capable of simulating an expanding, infinite universe in static coordinates with Newtonian dynamics. The price of these achievements is that most of the simulated volume has smoothly varying mass and spatial resolution, an approximation that carries different systematics than periodic simulations. Our initial implementation of the method is called StePS, which stands for Stereographically projected cosmological simulations. It uses stereographic projection for space compactification and a naive O(N^2) force calculation, which nevertheless arrives at a correlation function of the same quality faster than any standard (tree or P3M) algorithm with similar spatial and mass resolution. The O(N^2) force calculation is easy to adapt to modern graphics cards, hence our code can function as a high-speed prediction tool for modern large-scale surveys. To learn about the limits of the respective methods, we compare StePS with GADGET-2 running matching initial conditions.
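The all-pairs O(N^2) force summation mentioned above can be sketched as follows. This is a generic softened direct-summation kernel for Newtonian gravity, given as an illustrative assumption rather than the StePS code itself; it is the structure of kernel that maps well onto GPUs because every pair interaction is independent.

```python
import numpy as np

def direct_forces(pos, mass, G=1.0, eps=1e-3):
    """Direct O(N^2) Newtonian acceleration summation with Plummer
    softening eps to regularize close encounters."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                         # vectors to all particles
        r2 = np.sum(d * d, axis=1) + eps**2      # softened squared distances
        r2[i] = np.inf                           # exclude self-interaction
        acc[i] = G * np.sum((mass / r2**1.5)[:, None] * d, axis=0)
    return acc

# Two equal unit masses one unit apart attract with |a| ~ G / r^2
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
acc = direct_forces(pos, np.array([1.0, 1.0]))
```

Tree and P3M codes approximate distant interactions to reach O(N log N), which is why the claim that a well-tuned direct sum can still win on quality per wall-clock time for this compactified geometry is notable.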
Magnetic Doppler imaging of Ap stars
NASA Astrophysics Data System (ADS)
Silvester, J.; Wade, G. A.; Kochukhov, O.; Landstreet, J. D.; Bagnulo, S.
2008-04-01
Historically, the magnetic field geometries of the chemically peculiar Ap stars were modelled in the context of a simple dipole field. However, with the acquisition of increasingly sophisticated diagnostic data, it has become clear that the large-scale field topologies exhibit important departures from this simple model. Recently, new high-resolution circular and linear polarisation spectroscopy has even hinted at the presence of strong, small-scale field structures, which were completely unexpected based on earlier modelling. This project investigates the detailed structure of these strong fossil magnetic fields, in particular the large-scale field geometry, as well as small scale magnetic structures, by mapping the magnetic and chemical surface structure of a selected sample of Ap stars. These maps will be used to investigate the relationship between the local field vector and local surface chemistry, looking for the influence the field may have on the various chemical transport mechanisms (i.e., diffusion, convection and mass loss). This will lead to better constraints on the origin and evolution, as well as refining the magnetic field model for Ap stars. Mapping will be performed using high resolution and signal-to-noise ratio time-series of spectra in both circular and linear polarisation obtained using the new-generation ESPaDOnS (CFHT, Mauna Kea, Hawaii) and NARVAL spectropolarimeters (Pic du Midi Observatory). With these data we will perform tomographic inversion of Doppler-broadened Stokes IQUV Zeeman profiles of a large variety of spectral lines using the INVERS10 magnetic Doppler imaging code, simultaneously recovering the detailed surface maps of the vector magnetic field and chemical abundances.
Pattern-based, multi-scale segmentation and regionalization of EOSD land cover
NASA Astrophysics Data System (ADS)
Niesterowicz, Jacek; Stepinski, Tomasz F.
2017-10-01
The Earth Observation for Sustainable Development of Forests (EOSD) map is a 25 m resolution thematic map of Canadian forests. Because of its large spatial extent and relatively high resolution, the EOSD is difficult to analyze using standard GIS methods. In this paper we propose multi-scale segmentation and regionalization of the EOSD as new methods for analyzing it on large spatial scales. Segments, which we refer to as forest land units (FLUs), are delineated as tracts of forest characterized by cohesive patterns of EOSD categories; we delineated from 727 to 91,885 FLUs within the spatial extent of the EOSD, depending on the selected scale of a pattern. The pattern of EOSD categories within each FLU is described by 1037 landscape metrics. A shapefile containing the boundaries of all FLUs, together with an attribute table listing the landscape metrics, makes up an SQL-searchable spatial database providing detailed information on the composition and pattern of land cover types in Canadian forests. The shapefile format and an extensive attribute table covering the entire legend of the EOSD are designed to facilitate a broad range of investigations in which assessment of the composition and pattern of forest over large areas is needed. We calculated four such databases using different spatial scales of pattern. We illustrate the use of the FLU database by producing forest regionalization maps of two Canadian provinces, Quebec and Ontario. Such maps capture the broad-scale variability of forest at the spatial scale of the entire province. We also demonstrate how the FLU database can be used to map the variability of landscape metrics, and thus the character of the landscape, across the whole of Canada.
Space-based observations of nitrogen dioxide: Trends in anthropogenic emissions
NASA Astrophysics Data System (ADS)
Russell, Ashley Ray
Space-based instruments provide routine global observations, offering a unique perspective on the spatial and temporal variation of atmospheric constituents. In this dissertation, trends in regional-scale anthropogenic nitrogen oxide emissions (NO + NO2 ≡ NOx) are investigated using high-resolution observations from the Ozone Monitoring Instrument (OMI). By comparing trends in OMI observations with those from ground-based measurements and an emissions inventory, I show that satellite observations are well-suited for capturing changes in emissions over time. The high spatial and temporal resolutions of the observations provide a uniquely complete view of regional-scale changes in the spatial patterns of NO2. I show that NOx concentrations decreased significantly in urban regions of the United States between 2005 and 2011, with an average reduction of 32 ± 7%. By examining day-of-week and interannual trends, I show that these reductions can largely be attributed to improved emission control technology in the mobile source fleet; however, I also show that the economic downturn of the late 2000s has impacted emissions. Additionally, I describe the development of a high-resolution retrieval of NO2 from OMI observations known as the Berkeley High Resolution (BEHR) retrieval. The BEHR product uses terrain and profile parameters of higher spatial and temporal resolution than the operational retrievals and is shown to provide a more quantitative measure of tropospheric NO2 column density. These results have important implications for future retrievals of NO2 from space-based observations.
USDA-ARS?s Scientific Manuscript database
Thermal and multispectral remote sensing data from low-altitude aircraft can provide high spatial resolution necessary for sub-field (= 10 m) and plant canopy (= 1 m) scale evapotranspiration (ET) monitoring. In this study, high resolution aircraft sub-meter scale thermal infrared and multispectral...
Large scale track analysis for wide area motion imagery surveillance
NASA Astrophysics Data System (ADS)
van Leeuwen, C. J.; van Huis, J. R.; Baan, J.
2016-10-01
Wide Area Motion Imagery (WAMI) enables image-based surveillance of areas that can cover multiple square kilometers. Interpreting and analyzing information from such sources becomes increasingly time consuming as more data are added from newly developed methods for information extraction. Captured from a moving Unmanned Aerial Vehicle (UAV), the high-resolution images allow detection and tracking of moving vehicles, but this is a highly challenging task. By using a chain of computer vision detectors and machine learning techniques, we are capable of producing high-quality track information for more than 40,000 vehicles per five minutes. When faced with such a vast number of vehicular tracks, it is useful for analysts to be able to quickly query information based on region of interest, color, maneuvers, or other high-level types of information, to gain insight and find relevant activities in the flood of information. In this paper we propose a set of tools, combined in a graphical user interface, which allows data analysts to survey vehicles in a large observed area. In order to retrieve (parts of) images from the high-resolution data, we developed a multi-scale tile-based video file format that makes it possible to quickly obtain only a part, or a sub-sampling, of the original high-resolution image. By storing tiles of a still image according to a predefined order, we can quickly retrieve a particular region of the image at any relevant scale by skipping to the correct frames and reconstructing the image. Location-based queries allow a user to select tracks around a particular region of interest such as a landmark, building, or street. By using an integrated search engine, users can quickly select tracks that are in the vicinity of locations of interest. Another time-reducing method when searching for a particular vehicle is to filter on color or color intensity.
Automatic maneuver detection adds information to the tracks that can be used to find vehicles based on their behavior.
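The tile-based retrieval scheme described above can be sketched as follows. The tile size, the pyramid parametrization, and the row-major storage order are illustrative assumptions, not details from the paper.

```python
def tiles_for_region(x0, y0, x1, y1, level, tile=256):
    # Tile indices (at pyramid 'level') needed to reconstruct the
    # source-pixel region [x0, x1) x [y0, y1); each tile at this level
    # covers tile * 2**level source pixels per axis.
    step = tile * (2 ** level)
    return [(tx, ty)
            for ty in range(y0 // step, (y1 - 1) // step + 1)
            for tx in range(x0 // step, (x1 - 1) // step + 1)]

def frame_of(tx, ty, tiles_per_row):
    # With tiles stored in a predefined (here: row-major) order, the
    # frame holding a given tile follows directly, so a reader can skip
    # straight to it without decoding the whole image.
    return ty * tiles_per_row + tx
```

Because the frame index is computable, retrieving a small region of interest at any scale costs only a handful of frame reads rather than a full-image decode.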
Utilization of Large Scale Surface Models for Detailed Visibility Analyses
NASA Astrophysics Data System (ADS)
Caha, J.; Kačmařík, M.
2017-11-01
This article demonstrates the utilization of large scale surface models with fine spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses at the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on classic Boolean visibility, which is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
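A Boolean viewshed reduces to repeated line-of-sight tests against the surface model. A minimal one-dimensional sketch follows; the profile representation, observer height, and linear sight-line interpolation are simplifying assumptions rather than the method used in the article.

```python
def visible(dem, observer, target, obs_height=1.6):
    # Boolean line-of-sight along a 1D terrain profile (list of cell
    # elevations): the target is visible if no intermediate cell rises
    # above the straight sight line from observer eye to target.
    z0 = dem[observer] + obs_height
    zt = dem[target]
    d = target - observer
    for i in range(observer + 1, target):
        frac = (i - observer) / d          # position along the line
        line_z = z0 + frac * (zt - z0)     # sight-line elevation here
        if dem[i] > line_z:
            return False
    return True
```

A full 2D viewshed applies this test along rays from the observer to every cell; extended viewsheds additionally record quantities such as the angle by which a cell clears or misses the local horizon.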
NASA Technical Reports Server (NTRS)
Waugh, Darryn W.; Plumb, R. Alan
1994-01-01
We present a trajectory technique, contour advection with surgery (CAS), for tracing the evolution of material contours in a specified (including observed) evolving flow. CAS uses the algorithms developed by Dritschel for contour dynamics/surgery to trace the evolution of specified contours. The contours are represented by a series of particles, which are advected by a specified, gridded wind distribution. The resolution of the contours is preserved by continually adjusting the number of particles, and fine-scale features are produced that are not present in the input data (and cannot easily be generated using standard trajectory techniques). The reliability of the CAS procedure, and its dependence on the spatial and temporal resolution of the wind field, are examined by comparisons with high-resolution numerical data (from contour dynamics calculations and from a general circulation model) and with routine stratospheric analyses. These comparisons show that the large-scale motions dominate the deformation field and that CAS can accurately reproduce small scales from low-resolution wind fields. The CAS technique therefore enables examination of atmospheric tracer transport at previously unattainable resolution.
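The core CAS loop, advecting the contour particles and then re-resolving the contour, can be sketched as below. This is a toy version under stated assumptions: a midpoint (RK2) integrator, an analytic velocity function in place of gridded winds, and a fixed insertion threshold; the full algorithm also removes redundant nodes and performs contour surgery.

```python
import math

def refine(contour, dmax=0.05):
    # Preserve contour resolution by inserting a midpoint wherever two
    # adjacent particles have drifted further apart than dmax.
    out = [contour[0]]
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        if math.hypot(x1 - x0, y1 - y0) > dmax:
            out.append(((x0 + x1) / 2, (y0 + y1) / 2))
        out.append((x1, y1))
    return out

def advect(contour, velocity, dt, steps, dmax=0.05):
    # Midpoint (RK2) advection of each contour particle in the flow
    # velocity(x, y) -> (u, v), followed by node insertion.
    for _ in range(steps):
        new = []
        for (x, y) in contour:
            u1, v1 = velocity(x, y)
            u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1)
            new.append((x + dt * u2, y + dt * v2))
        contour = refine(new, dmax)
    return contour
```

Even a simple shear flow stretches contour segments past the threshold, so the particle count grows to keep the contour resolved; this is how CAS produces fine-scale structure not present in the coarse input winds.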
NASA Technical Reports Server (NTRS)
Meng, Ran; Wu, Jin; Schwager, Kathy L.; Zhao, Feng; Dennison, Philip E.; Cook, Bruce D.; Brewster, Kristen; Green, Timothy M.; Serbin, Shawn P.
2017-01-01
As a primary disturbance agent, fire significantly influences local processes and services of forest ecosystems. Although a variety of remote sensing based approaches have been developed and applied to Landsat mission imagery to infer burn severity at 30 m spatial resolution, forest burn severity has still seldom been assessed at fine spatial scales (less than or equal to 5 m) from very-high-resolution (VHR) data. We assessed a 432 ha forest fire that occurred in April 2012 on Long Island, New York, within the Pine Barrens region, a unique but imperiled fire-dependent ecosystem in the northeastern United States. The mapping of forest burn severity was explored here at fine spatial scales, for the first time using remotely sensed spectral indices and a set of Multiple Endmember Spectral Mixture Analysis (MESMA) fraction images from bi-temporal (pre- and post-fire event) WorldView-2 (WV-2) imagery at 2 m spatial resolution. We first evaluated our approach using 1 m by 1 m validation points at the sub-crown scale per severity class (i.e., unburned, low, moderate, and high severity) from post-fire 0.10 m color aerial ortho-photos; then, we validated the burn severity mapping of geo-referenced dominant tree crowns (crown scale) and 15 m by 15 m fixed-area plots (inter-crown scale) with the post-fire 0.10 m aerial ortho-photos and measured crown information from twenty forest inventory plots. Our approach can accurately assess forest burn severity at the sub-crown (overall accuracy of 84%, with a Kappa value of 0.77), crown (overall accuracy of 82%, with a Kappa value of 0.76), and inter-crown scales (explaining 89% of the variation in estimated burn severity ratings, i.e., the Geo-Composite Burn Index (CBI)). This work highlights that forest burn severity mapping from VHR data can capture heterogeneous fire patterns at fine spatial scales over large spatial extents. 
This is important since most ecological processes associated with fire effects vary at scales finer than 30 m, and VHR approaches could significantly advance our ability to characterize fire effects on forest ecosystems.
Video-rate volumetric neuronal imaging using 3D targeted illumination.
Xiao, Sheng; Tseng, Hua-An; Gritton, Howard; Han, Xue; Mertz, Jerome
2018-05-21
Fast volumetric microscopy is required to monitor large-scale neural ensembles with high spatio-temporal resolution. Widefield fluorescence microscopy can image large 2D fields of view at high resolution and speed while remaining simple and cost-effective. A focal sweep add-on can further extend the capacity of widefield microscopy by enabling extended-depth-of-field (EDOF) imaging, but suffers from an inability to reject out-of-focus fluorescence background. Here, by using a digital micromirror device to target only in-focus sample features, we perform EDOF imaging with greatly enhanced contrast and signal-to-noise ratio, while reducing the light dosage delivered to the sample. Image quality is further improved by the application of a robust deconvolution algorithm. We demonstrate the advantages of our technique for in vivo calcium imaging in the mouse brain.
NASA Astrophysics Data System (ADS)
Camera, Corrado; Bruggeman, Adriana; Hadjinicolaou, Panos; Pashiardis, Stelios; Lange, Manfred A.
2014-01-01
High-resolution gridded daily data sets are essential for natural resource management and the analyses of climate changes and their effects. This study aims to evaluate the performance of 15 simple or complex interpolation techniques in reproducing daily precipitation at a resolution of 1 km2 over topographically complex areas. Methods are tested considering two different sets of observation densities and different rainfall amounts. We used rainfall data that were recorded at 74 and 145 observational stations, respectively, spread over the 5760 km2 of the Republic of Cyprus, in the Eastern Mediterranean. Regression analyses utilizing geographical co-predictors and neighboring interpolation techniques were evaluated both in isolation and in combination. Linear multiple regression (LMR) and geographically weighted regression (GWR) methods, including step-wise selection of covariables, were tested, as well as inverse distance weighting (IDW), kriging, and 3D thin-plate splines (TPS). The relative rank of the different techniques changes with station density and rainfall amount. Our results indicate that TPS performs well for low station density and large-scale events, and also when coupled with regression models; it performs poorly for high station density. The opposite is observed when using IDW. Simple IDW performs best for local events, while a combination of step-wise GWR and IDW proves to be the best method for large-scale events and high station density. This study indicates that the use of step-wise regression with a variable set of geographic parameters can improve the interpolation of large-scale events because it facilitates the representation of local climate dynamics.
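Of the neighboring interpolators compared above, inverse distance weighting is the simplest to state. A minimal sketch follows; the power-2 weighting and the exact-hit shortcut are conventional choices, not necessarily the settings used in the study.

```python
def idw(stations, x, y, power=2.0):
    # Inverse distance weighting: estimate the value at (x, y) as a
    # weighted mean of (xi, yi, value) observations, with weights
    # decaying as distance**(-power). Exact at station locations.
    num = den = 0.0
    for xi, yi, v in stations:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v                       # query point is a station
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den
```

The same query loop, evaluated at every cell of a 1 km grid, yields the gridded daily field; the regression-based methods instead fit precipitation against geographic co-predictors such as elevation and interpolate only the residuals.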
Duan, Xiaojie; Lieber, Charles M.
2013-01-01
High spatio-temporal resolution interfacing between electrical sensors and biological systems, from single live cells to tissues, is crucial for many areas, including fundamental biophysical studies as well as medical monitoring and intervention. This focused review summarizes recent progresses in the development and application of novel nanoscale devices for intracellular electrical recordings of action potentials, and the effort of merging electronic and biological systems seamlessly in three dimension using macroporous nanoelectronic scaffolds. The uniqueness of these nanoscale devices for minimally invasive, large scale, high spatial resolution, and three dimensional neural activity mapping will be highlighted. PMID:23946279
Low-Cost Ultra-High Spatial and Temporal Resolution Mapping of Intertidal Rock Platforms
NASA Astrophysics Data System (ADS)
Bryson, M.; Johnson-Roberson, M.; Murphy, R.
2012-07-01
Intertidal ecosystems have primarily been studied using field-based sampling; remote sensing offers the ability to collect data over large areas in a snapshot of time, which could complement field-based sampling methods by extrapolating them into a wider spatial and temporal context. Conventional remote sensing tools (such as satellite and aircraft imaging) provide data at relatively coarse, sub-meter resolutions or with limited temporal resolution, and at relatively high cost for small-scale environmental science and ecology studies. In this paper, we describe a low-cost, kite-based imaging system and photogrammetric pipeline that was developed for constructing high-resolution, 3D, photo-realistic terrain models of intertidal rocky shores. The processing pipeline uses automatic image feature detection and matching, structure-from-motion and photo-textured terrain surface reconstruction algorithms that require minimal human input and only a small number of ground control points, and allows the use of cheap, consumer-grade digital cameras. The resulting maps combine colour and topographic information at sub-centimeter resolutions over an area of approximately 100 m, thus enabling spatial properties of the intertidal environment to be determined across a hierarchy of spatial scales. Results of the system are presented for an intertidal rock platform at Cape Banks, Sydney, Australia. Potential uses of this technique include mapping of plant (micro- and macro-algae) and animal (e.g. gastropods) assemblages at multiple spatial and temporal scales.
Large-Scale Astrophysical Visualization on Smartphones
NASA Astrophysics Data System (ADS)
Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.
2011-07-01
Nowadays, digital sky surveys and long-duration, high-resolution numerical simulations using high-performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., the formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance, large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.
High-resolution face verification using pore-scale facial features.
Li, Dong; Zhou, Huiling; Lam, Kin-Man
2015-08-01
Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also suffers severe degradation under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, robust to alignment errors, that uses HR information based on pore-scale facial features. A new keypoint descriptor, namely pore-Principal Component Analysis (PCA)-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods, and can achieve excellent accuracy even when the faces are under large variations in expression and pose.
NASA Astrophysics Data System (ADS)
Deo, R. K.; Domke, G. M.; Russell, M.; Woodall, C. W.
2017-12-01
Landsat data have been widely used to support strategic forest inventory and management decisions despite the limited success of passive optical remote sensing for accurate estimation of aboveground biomass (AGB). The archive of publicly available Landsat data, available at 30 m spatial resolution since 1984, has been a valuable resource for cost-effective large-area estimation of AGB to inform national requirements such as the US national greenhouse gas inventory (NGHGI). In addition, other optical satellite data, such as MODIS imagery of wider spatial coverage and higher temporal resolution, are enriching the domain of spatial predictors for regional-scale mapping of AGB. Because NGHGIs require national-scale AGB information and there are tradeoffs between the prediction accuracy and operational efficiency of Landsat, this study evaluated the impact of various resolutions of Landsat predictors on the accuracy of regional AGB models across three different sites in the eastern USA: Maine, Pennsylvania-New Jersey, and South Carolina. We used recent national forest inventory (NFI) data with numerous Landsat-derived predictors at ten different spatial resolutions, ranging from 30 to 1000 m, to understand the optimal spatial resolution of the optical data for enhanced spatial inventory of AGB for NGHGI reporting. Ten generic spatial models at different spatial resolutions were developed for all sites, and large-area estimates were evaluated (i) at the county level against independent design-based estimates via the US NFI Evalidator tool and (ii) within a large number of strips (~1 km wide) predicted via LiDAR metrics at high spatial resolution. The county-level estimates by the Evalidator and the Landsat models were statistically equivalent and produced coefficients of determination (R2) above 0.85 that varied with site and resolution of predictors. 
The mean and standard deviation of county-level estimates followed increasing and decreasing trends, respectively, with models of decreasing resolution. The Landsat-based total AGB estimates within the strips did not differ significantly from the totals obtained using LiDAR metrics and were within ±15 Mg/ha for each of the sites. We conclude that optical satellite data at resolutions up to 1000 m provide acceptable accuracy for the US NGHGI.
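Deriving predictors at ten coarser resolutions from 30 m input amounts to block aggregation of the raster. A minimal sketch follows; simple averaging over square blocks is an assumption for illustration, and the study's actual resampling scheme may differ.

```python
def aggregate(grid, factor):
    # Block-average a fine raster (list of rows) to a coarser
    # resolution: factor=2 turns 30 m cells into 60 m cells, and so on.
    # Trailing cells that do not fill a whole block are dropped.
    n, m = len(grid), len(grid[0])
    out = []
    for i in range(0, n - n % factor, factor):
        row = []
        for j in range(0, m - m % factor, factor):
            block = [grid[i + a][j + b]
                     for a in range(factor) for b in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

Repeating this for factors such as 2, 4, ... up to roughly 33 (1000 m / 30 m) gives the family of predictor stacks whose model accuracies can then be compared.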
Electrophysiological Source Imaging: A Noninvasive Window to Brain Dynamics.
He, Bin; Sohrabpour, Abbas; Brown, Emery; Liu, Zhongming
2018-06-04
Brain activity and connectivity are distributed in the three-dimensional space and evolve in time. It is important to image brain dynamics with high spatial and temporal resolution. Electroencephalography (EEG) and magnetoencephalography (MEG) are noninvasive measurements associated with complex neural activations and interactions that encode brain functions. Electrophysiological source imaging estimates the underlying brain electrical sources from EEG and MEG measurements. It offers increasingly improved spatial resolution and intrinsically high temporal resolution for imaging large-scale brain activity and connectivity on a wide range of timescales. Integration of electrophysiological source imaging and functional magnetic resonance imaging could further enhance spatiotemporal resolution and specificity to an extent that is not attainable with either technique alone. We review methodological developments in electrophysiological source imaging over the past three decades and envision its future advancement into a powerful functional neuroimaging technology for basic and clinical neuroscience applications.
First results of high-resolution modeling of Cenozoic subduction orogeny in Andes
NASA Astrophysics Data System (ADS)
Liu, S.; Sobolev, S. V.; Babeyko, A. Y.; Krueger, F.; Quinteros, J.; Popov, A.
2016-12-01
The Andean Orogeny is the result of upper-plate crustal shortening during the Cenozoic subduction of the Nazca plate beneath the South American plate. With up to 300 km of shortening, the Altiplano-Puna Plateau, Earth's second-highest plateau, was formed with a pronounced N-S oriented diversity of deformation. Furthermore, tectonic shortening in the Southern Andes was much less intensive and started much later. The mechanism of the shortening and the nature of the N-S variation of its magnitude remain controversial. Previous studies of the Central Andes suggested that they might be related to N-S variations in the strength of the lithosphere and in the frictional coupling at the slab interface, and are probably influenced by the interaction of the climate and tectonic systems. However, the exact nature of the strength variation was not explored, due to the lack of high numerical resolution and 3D numerical models at that time. Here we employ large-scale subduction models with high resolution to reveal and quantify the factors controlling the strength of lithospheric structures and their effect on the magnitude of tectonic shortening in the South American plate between 18°-35°S. These high-resolution models are performed using the highly scalable parallel 3D code LaMEM (Lithosphere and Mantle Evolution Model). This code is based on a finite-difference staggered-grid approach and employs massive linear and non-linear solvers within the PETSc library to achieve high-performance MPI-based parallelization in geodynamic modeling. Currently, in addition to benchmark models, we are developing high-resolution (< 1 km) 2D subduction models with application to Nazca-South America convergence. In particular, we will present models focusing on the effect of friction reduction in the Paleozoic-Cenozoic sediments above the uppermost crust in the Subandean Ranges. 
Future work will focus on the origin of the different styles of deformation and topography evolution in the Altiplano-Puna Plateau and the Central-Southern Andes through 3D modeling of the large-scale interaction of the subducting and overriding plates.
NASA Astrophysics Data System (ADS)
Plebe, Alice; Grasso, Giorgio
2016-12-01
This paper describes a system developed for the simulation of flames inside an open-source 3D computer graphics package, Blender, with the aim of analyzing, in virtual reality, hazard scenarios in large-scale industrial plants. The advantages of Blender are its ability to render the very complex structure of large industrial plants at high resolution, and its embedded physics engine based on smoothed particle hydrodynamics. This particle system is used to evolve a simulated fire. The interaction of this fire with the components of the plant is computed using polyhedron separation distances, adopting a Voronoi-based strategy that optimizes the number of feature-distance computations. Results for a real oil and gas refinery are presented.
Mission Concepts for High-Resolution Solar Imaging with a Photon Sieve
NASA Astrophysics Data System (ADS)
Rabin, Douglas M.; Davila, Joseph; Daw, Adrian N.; Denis, Kevin L.; Novo-Gradac, Anne-Marie; Shah, Neerav; Widmyer, Thomas R.
2017-08-01
The best EUV coronal imagers are unable to probe the expected energy dissipation scales of the solar corona (<100 km) because conventional optics cannot be figured to near diffraction-limited accuracy at these wavelengths. Davila (2011) has proposed that a photon sieve, a diffractive imaging element similar to a Fresnel zone plate, provides a technically feasible path to the required angular resolution. We have produced photon sieves as large as 80 mm clear aperture. We discuss laboratory measurements of these devices and the path to larger apertures. The focal length of a sieve with high EUV resolution is at least 10 m. Options for solar imaging with such a sieve include a sounding rocket, a single spacecraft with a deployed boom, and two spacecraft flying in precise formation.
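As a rough sanity check of the numbers above, the standard zone-plate relations can be evaluated. Only the 80 mm aperture and the ~10 m focal length come from the abstract; the 30.4 nm (He II) wavelength and the outer-zone width are assumed values for illustration:

```python
import math  # kept for parity with the other sketches; only arithmetic is used

# Assumed EUV wavelength and finest zone width; 80 mm aperture is from the abstract
wavelength = 30.4e-9   # m (He II line, an assumption)
aperture = 0.08        # m
outer_zone = 3.8e-6    # m (assumed width of the finest sieve zones)

# Zone-plate-like first-order focal length: f = D * dr / lambda
focal_length = aperture * outer_zone / wavelength

# Diffraction-limited angular resolution, and the scale it resolves
# on the Sun as seen from 1 AU
theta = 1.22 * wavelength / aperture   # rad
au = 1.496e11                          # m
solar_scale_km = theta * au / 1e3

print(f"f = {focal_length:.1f} m, resolves ~{solar_scale_km:.0f} km on the Sun")
```

With these assumed values the focal length comes out near 10 m and the resolved solar scale falls below the <100 km dissipation-scale target, consistent with the abstract's figures.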
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system
Jensen, Tue V.; Pinson, Pierre
2017-01-01
Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation. PMID:29182600
NASA Astrophysics Data System (ADS)
Tijerina, D.; Gochis, D.; Condon, L. E.; Maxwell, R. M.
2017-12-01
Development of integrated hydrology modeling systems that couple atmospheric, land surface, and subsurface flow is a growing trend in hydrologic modeling. Using an integrated modeling framework, subsurface hydrologic processes, such as lateral flow and soil moisture redistribution, are represented in a single cohesive framework together with surface processes like overland flow and evapotranspiration. There is a need for these more intricate models in comprehensive hydrologic forecasting and water management over large spatial areas, specifically the Continental US (CONUS). Currently, two high-resolution, coupled hydrologic modeling applications have been developed for this domain: CONUS-ParFlow, built using the integrated hydrologic model ParFlow, and the National Water Model, which uses the NCAR Weather Research and Forecasting hydrological extension package (WRF-Hydro). Both ParFlow and WRF-Hydro include land surface models and overland flow and take advantage of parallelization and high-performance computing (HPC) capabilities; however, they take different approaches to overland and subsurface flow and to groundwater-surface water interactions. Accurately representing large domains remains a challenge given the difficulty of representing complex hydrologic processes, the computational expense, and extensive data needs; both models have accomplished this, but they differ in approach and remain difficult to validate. Further exploration of effective methodologies for accurately representing large-scale hydrology with integrated models is needed to advance this growing field. Here we compare the outputs of CONUS-ParFlow and the National Water Model to each other and to observations to study the performance of hyper-resolution models over large domains. The models were compared over a range of scales for major watersheds within the CONUS, with a specific focus on the Mississippi, Ohio, and Colorado River basins.
We use a novel set of approaches and analyses for this comparison to better understand differences in process representation and bias. This intercomparison is a step toward better quantifying how much water we have and the interactions between the surface and subsurface. Our goal is to advance our understanding and simulation of the hydrologic system and ultimately improve hydrologic forecasts.
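The abstract does not name its comparison metrics; a common skill score for comparing simulated and observed streamflow is the Nash-Sutcliffe efficiency, sketched here with made-up gauge data (the function name and flow values are illustrative assumptions):

```python
def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect match, 0 means the
    model does no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

# hypothetical daily streamflow (m^3/s) at one gauge
obs = [10.0, 12.0, 30.0, 22.0, 15.0]
parflow_like = [11.0, 13.0, 27.0, 20.0, 14.0]
nwm_like = [9.0, 15.0, 35.0, 25.0, 18.0]
print(nse(parflow_like, obs), nse(nwm_like, obs))
```

The same bookkeeping applies per basin and per scale; model intercomparisons typically report such scores alongside bias decompositions.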
Thermal-chemical Mantle Convection Models With Adaptive Mesh Refinement
NASA Astrophysics Data System (ADS)
Leng, W.; Zhong, S.
2008-12-01
In numerical modeling of mantle convection, resolution is often crucial for resolving small-scale features. Adaptive mesh refinement (AMR) techniques allow local mesh refinement wherever high resolution is needed, while leaving other regions at relatively low resolution. Both computational efficiency for large-scale simulation and accuracy for small-scale features can thus be achieved with AMR. Based on the octree data structure [Tu et al. 2005], we implement AMR techniques in 2-D mantle convection models. For purely thermal convection models, benchmark tests show that our code achieves high accuracy with a relatively small number of elements, both for isoviscous cases (7492 AMR elements vs. 65536 uniform elements) and for temperature-dependent viscosity cases (14620 AMR elements vs. 65536 uniform elements). We further implement a tracer method in the models to simulate thermal-chemical convection. By appropriately adding and removing tracers as the mesh is refined, our code reproduces the benchmark results of van Keken et al. [1997] with far fewer elements and tracers than uniform-mesh models (7552 AMR elements vs. 16384 uniform elements, and ~83000 tracers vs. ~410000 tracers). The boundaries of the chemical piles in our AMR code can easily be refined to scales of a few kilometers for the Earth's mantle, and the tracers are concentrated near the chemical boundaries to trace their evolution precisely. Our AMR code is thus well suited to thermal-chemical convection problems that require high resolution to resolve the evolution of chemical boundaries, such as entrainment problems [Sleep, 1988].
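As a minimal illustration of the element savings AMR provides (the actual code uses an octree with finite differences; this is not its algorithm), a quadtree-style refinement driven by a field-variation criterion might look like the following. The boundary-layer field, tolerance, and depth limit are invented for the example:

```python
import math

def refine(x0, y0, size, depth, max_depth, field, tol, cells):
    """Subdivide a square cell while the field varies by more than
    tol across its corners; otherwise keep it as a leaf element."""
    corners = [field(x0, y0), field(x0 + size, y0),
               field(x0, y0 + size), field(x0 + size, y0 + size)]
    if depth < max_depth and max(corners) - min(corners) > tol:
        half = size / 2.0
        for dx in (0.0, half):
            for dy in (0.0, half):
                refine(x0 + dx, y0 + dy, half, depth + 1,
                       max_depth, field, tol, cells)
    else:
        cells.append((x0, y0, size))

# sharp thermal boundary layer near y = 0, as in bottom-heated convection
temperature = lambda x, y: math.exp(-y / 0.05)
cells = []
refine(0.0, 0.0, 1.0, 0, 6, temperature, 0.05, cells)
print(len(cells), "adaptive elements vs", 4 ** 6, "uniform elements")
```

The mesh ends up fine only inside the boundary layer, which is the same qualitative effect behind the element counts quoted in the abstract.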
A Comparison Between Gravity Wave Momentum Fluxes in Observations and Climate Models
NASA Technical Reports Server (NTRS)
Geller, Marvin A.; Alexander, M. Joan; Love, Peter T.; Bacmeister, Julio; Ern, Manfred; Hertzog, Albert; Manzini, Elisa; Preusse, Peter; Sato, Kaoru; Scaife, Adam A.;
2013-01-01
For the first time, a formal comparison is made between gravity wave momentum fluxes in models and those derived from observations. Although gravity waves occur over a wide range of spatial and temporal scales, the focus of this paper is on scales that are being parameterized in present climate models, sub-1000-km scales. Only observational methods that permit derivation of gravity wave momentum fluxes over large geographical areas are discussed; these are satellite temperature measurements, constant-density long-duration balloons, and high-vertical-resolution radiosonde data. The models discussed include two high-resolution models in which gravity waves are explicitly modeled, Kanto and the Community Atmosphere Model, version 5 (CAM5), and three climate models containing gravity wave parameterizations, MAECHAM5, Hadley Centre Global Environmental Model 3 (HadGEM3), and the Goddard Institute for Space Studies (GISS) model. Measurements generally show flux magnitudes similar to those in models, except that the fluxes derived from satellite measurements fall off more rapidly with height. This is likely due to limitations on the observable range of wavelengths, although other factors may contribute. When one accounts for this more rapid falloff, the geographical distributions of the fluxes from observations and models compare reasonably well, except for certain features that depend on the specification of the nonorographic gravity wave source functions in the climate models. For instance, both the observed fluxes and those in the high-resolution models are very small at summer high latitudes, but this is not the case for some of the climate models. This comparison between gravity wave fluxes from climate models, high-resolution models, and fluxes derived from observations indicates that such efforts offer a promising path toward improving specifications of gravity wave sources in climate models.
NASA Astrophysics Data System (ADS)
McClellan, M. D.; Wright, W. J.; Job, M. J.; Comas, X.
2015-12-01
Peatlands can produce and release significant amounts of free-phase biogenic gases (CO2, CH4) and are thus regarded as key contributors of greenhouse gases to the atmosphere. Many studies over the past two decades have investigated gas flux dynamics in peat soils; however, a high-resolution temporal understanding of the variability of these fluxes (particularly at the matrix scale) is still lacking. This study implements an array of hydrogeophysical methods to investigate, at high resolution, the temporal variability in biogenic gas accumulation and release for a large (0.073 m3) peat monolith from the Blue Cypress Preserve in central Florida. An autonomous rail system was constructed to estimate gas content variability (i.e., build-up and release) within the peat matrix using a series of continuous, uninterrupted ground-penetrating radar (GPR) transects along the sample. The system ran continuously with a 0.01 m shot interval using high-frequency (1.2 GHz) antennas. GPR measurements were constrained with an array of six gas traps fitted with time-lapse cameras to capture gas releases at 15-minute intervals. A gas chromatograph was used to determine the CH4 and CO2 content of the gas collected in the traps. The aim of this study is to investigate the temporal variability in the accumulation and release of biogenic gases in subtropical peat soils at the lab scale. This work has implications for better understanding carbon dynamics in subtropical freshwater peatlands and how climate change may alter those dynamics.
A Comparative Study of Point Cloud Data Collection and Processing
NASA Astrophysics Data System (ADS)
Pippin, J. E.; Matheney, M.; Gentle, J. N., Jr.; Pierce, S. A.; Fuentes-Pineda, G.
2016-12-01
Over the past decade, there has been dramatic growth in the acquisition of publicly funded high-resolution topographic data for scientific, environmental, engineering and planning purposes. These datasets are valuable for applications of interest across a large and varied user community. However, because of the large volumes of data produced by high-resolution mapping technologies and the expense of aerial data collection, it is often difficult to collect and distribute these datasets. Furthermore, the data can be technically challenging to process, requiring software and computing resources not readily available to many users. This study presents a comparison of advanced computing hardware and software used to collect and process point cloud datasets, such as LIDAR scans. Activities included implementation and testing of open-source libraries and applications for point cloud data processing, such as Meshlab, Blender, PDAL, and PCL. Additionally, a suite of commercial-scale applications, Skanect and CloudCompare, were applied to raw datasets. Handheld hardware solutions, a Structure Scanner and an Xbox 360 Kinect V1, were tested for their ability to scan at three field locations. The resulting projects successfully scanned and processed subsurface karst features ranging from small stalactites to large rooms, as well as a surface waterfall feature. The outcomes support the feasibility of rapid 3D sensing at field scales.
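One of the standard point-cloud operations provided by libraries like PCL and CloudCompare is voxel-grid downsampling, which thins a dense scan by replacing all points in each voxel with their centroid. A self-contained sketch (the point data are invented; real pipelines would call the libraries' built-in filters):

```python
from collections import defaultdict

def voxel_downsample(points, voxel):
    """Replace all points falling in the same voxel-sized cube
    with their centroid, thinning dense regions of a scan."""
    buckets = defaultdict(list)
    for p in points:
        # integer voxel coordinates index the bucket
        buckets[tuple(int(c // voxel) for c in p)].append(p)
    return [tuple(sum(c) / len(group) for c in zip(*group))
            for group in buckets.values()]

# a dense cluster near the origin plus one distant point (meters)
cloud = [(0.01, 0.0, 0.0), (0.02, 0.01, 0.0),
         (0.0, 0.02, 0.01), (5.0, 5.0, 5.0)]
thinned = voxel_downsample(cloud, 0.1)
print(len(cloud), "->", len(thinned))
```

The cluster collapses to a single centroid while the isolated point survives, which is why this filter is a common first step before meshing or feature extraction.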
Wang, Ran; Gamon, John A; Cavender-Bares, Jeannine; Townsend, Philip A; Zygielbaum, Arthur I
2018-03-01
Remote sensing has been used to detect plant biodiversity in a range of ecosystems based on the varying spectral properties of different species or functional groups. However, the most appropriate spatial resolution necessary to detect diversity remains unclear. At coarse resolution, differences among spectral patterns may be too weak to detect. In contrast, at fine resolution, redundant information may be introduced. To explore the effect of spatial resolution, we studied the scale dependence of spectral diversity in a prairie ecosystem experiment at Cedar Creek Ecosystem Science Reserve, Minnesota, USA. Our study involved a scaling exercise comparing synthetic pixels resampled from high-resolution images within manipulated diversity treatments. Hyperspectral data were collected using several instruments on both ground and airborne platforms. We used the coefficient of variation (CV) of spectral reflectance in space as the indicator of spectral diversity and then compared CV at different scales ranging from 1 mm² to 1 m² to conventional biodiversity metrics, including species richness, Shannon's index, Simpson's index, phylogenetic species variation, and phylogenetic species evenness. In this study, higher species richness plots generally had higher CV. CV showed higher correlations with Shannon's index and Simpson's index than did species richness alone, indicating evenness contributed to the spectral diversity. Correlations with species richness and Simpson's index were generally higher than with phylogenetic species variation and evenness measured at comparable spatial scales, indicating weaker relationships between spectral diversity and phylogenetic diversity metrics than with species diversity metrics. High resolution imaging spectrometer data (1 mm² pixels) showed the highest sensitivity to diversity level.
With decreasing spatial resolution, the difference in CV between diversity levels decreased and greatly reduced the optical detectability of biodiversity. The optimal pixel size for distinguishing α diversity in these prairie plots appeared to be around 1 mm to 10 cm, a spatial scale similar to the size of an individual herbaceous plant. These results indicate a strong scale-dependence of the spectral diversity-biodiversity relationships, with spectral diversity best able to detect a combination of species richness and evenness, and more weakly detecting phylogenetic diversity. These findings can be used to guide airborne studies of biodiversity and develop more effective large-scale biodiversity sampling methods.
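The CV-based spectral diversity metric is simple to state in code: the per-band standard deviation of reflectance across pixels, divided by the per-band mean, averaged over bands. The sketch below applies it to two synthetic plots; the spectra and noise levels are invented, whereas the real study used measured hyperspectral reflectance:

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_cv(reflectance):
    """Band-averaged coefficient of variation across pixels.

    reflectance: array of shape (n_pixels, n_bands)
    """
    mu = reflectance.mean(axis=0)
    sigma = reflectance.std(axis=0)
    return float((sigma / mu).mean())

# hypothetical synthetic plots: 100 pixels x 50 bands
base = 0.3 + 0.1 * np.sin(np.linspace(0, np.pi, 50))  # one species' spectrum
monoculture = base + rng.normal(0, 0.005, (100, 50))  # low spectral variation
# diverse plot: pixels drawn from five distinct scaled spectra
species = [base * s for s in (0.6, 0.8, 1.0, 1.2, 1.4)]
mixed = np.array([species[i % 5] for i in range(100)]) + rng.normal(0, 0.005, (100, 50))

print(f"CV monoculture = {spectral_cv(monoculture):.3f}, "
      f"CV mixed = {spectral_cv(mixed):.3f}")
```

The mixed plot's between-species variation dominates the sensor noise, so its CV is much higher, mirroring the richness-CV relationship reported above. Coarsening the pixels (averaging neighbors before computing CV) would shrink that gap, which is the scale effect the study quantifies.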
Remote Sensing Analysis of Forest Disturbances
NASA Technical Reports Server (NTRS)
Asner, Gregory P. (Inventor)
2015-01-01
The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.
Preservation and Access to Manuscript Collections of the Czech National Library.
ERIC Educational Resources Information Center
Karen, Vladimir; Psohlavec, Stanislav
In 1996, the Czech National Library started a large-scale digitization of its extensive and invaluable collection of historical manuscripts and printed books. Each page of the selected documents is scanned using a high-resolution, full-color digital camera, processed, and archived on a CD-ROM disk. HTML coded description is added to the entire…
Evaluation of a Mesoscale Convective System in Variable-Resolution CESM
NASA Astrophysics Data System (ADS)
Payne, A. E.; Jablonowski, C.
2017-12-01
Warm season precipitation over the Southern Great Plains (SGP) follows a well-observed diurnal pattern of variability, peaking at night, due to the eastward propagation of mesoscale convective systems that develop over the eastern slopes of the Rockies in the late afternoon. While most climate models are unable to adequately capture the organization of convection and the characteristic pattern of precipitation over this region, models with resolution high enough to explicitly resolve convection show improvement. However, high-resolution simulations are computationally expensive and, in the case of regional climate models, are subject to boundary conditions. Newly developed variable-resolution global climate models combine the benefits of high-resolution regional climate models with the large-scale dynamics and lower computational cost of global climate models. Recently developed parameterizations that are insensitive to the model grid scale provide a way to improve model performance. Here, we present an evaluation of the newly available Cloud Layers Unified by Binormals (CLUBB) parameterization scheme in a suite of variable-resolution CESM simulations with resolutions ranging from 110 km to 7 km within a regionally refined domain centered over the SGP Atmospheric Radiation Measurement (ARM) site. The simulations use the hindcast approach developed by the Department of Energy's Cloud-Associated Parameterizations Testbed (CAPT) for the assessment of climate models. We limit our evaluation to a single mesoscale convective system that passed over the region on May 24, 2008. The effects of grid resolution on the timing and intensity of precipitation, as well as on the transition from shallow to deep convection, are assessed against ground-based observations from the SGP ARM site, satellite observations, and ERA-Interim reanalysis.
Flexible Organic Electronics for Use in Neural Sensing
Bink, Hank; Lai, Yuming; Saudari, Sangameshwar R.; Helfer, Brian; Viventi, Jonathan; Van der Spiegel, Jan; Litt, Brian; Kagan, Cherie
2016-01-01
Recent research in brain-machine interfaces and devices to treat neurological disease indicates that important network activity exists at temporal and spatial scales beyond the resolution of existing implantable devices. High-density, active electrode arrays hold great promise in enabling a high-resolution interface with the brain to access and influence this network activity. Integrating flexible electronic devices directly at the neural interface can enable thousands of multiplexed electrodes to be connected using many fewer wires. Active electrode arrays have been demonstrated using flexible, inorganic silicon transistors. However, these approaches may be limited in their ability to be cost-effectively scaled to large array sizes (8×8 cm). Here we show amplifiers built using flexible organic transistors with sufficient performance for neural signal recording. We also demonstrate a pathway to a fully integrated, amplified and multiplexed electrode array built from these devices. PMID:22255558
Pinxterhuis, Erik B.; Gualtierotti, Jean-Baptiste; Heeres, Hero J.
2017-01-01
Access to enantiopure compounds on a large scale in an environmentally friendly and cost-efficient manner remains one of the greatest challenges in chemistry. Resolution of racemates using enantioselective liquid–liquid extraction (ELLE) has great potential to meet that challenge. However, a relatively feeble understanding of the chemical principles and physical properties behind this technique has hampered the development of hosts possessing sufficient resolving power for application to large-scale processes. Herein we present, employing the previously untested SPINOL-based phosphoric acid host family, an in-depth study of the parameters affecting the efficiency of the resolution of amino-alcohols, with the aim of further understanding the core principles behind ELLE. We systematically investigated the dependence of the enantioselection on parameters such as the choice of solvent, the temperature, and the pH, bringing to light many previously unsuspected and highly intriguing interactions. Furthermore, using these new insights to our advantage, we developed novel, highly efficient extraction and resolution protocols that provide remarkable levels of enantioselectivity. We showed that the extraction is catalytic in the host by demonstrating transport in a U-tube, and finally we demonstrated how the solvent dependency can be exploited in an unprecedented triphasic resolution system. PMID:28989671
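The abstract's hosts and conditions are not reproduced here, but the bookkeeping of a single ELLE equilibrium stage can be sketched: each enantiomer partitions between phases according to its distribution ratio, and the resulting enantiomeric excesses follow directly. The function name and the distribution values below are made-up illustrations, not measured data:

```python
def single_stage_elle(feed_R, feed_S, D_R, D_S):
    """One equilibrium stage with equal phase volumes. Each enantiomer
    partitions between organic (host-containing) and aqueous phases
    according to its distribution ratio D = [org]/[aq]."""
    org_R = feed_R * D_R / (1 + D_R)
    org_S = feed_S * D_S / (1 + D_S)
    aq_R, aq_S = feed_R - org_R, feed_S - org_S
    # enantiomeric excess of the favored enantiomer in each phase
    ee_org = (org_R - org_S) / (org_R + org_S)
    ee_aq = (aq_S - aq_R) / (aq_S + aq_R)
    return ee_org, ee_aq

# racemic feed; hypothetical distribution ratios for an R-selective host
ee_org, ee_aq = single_stage_elle(1.0, 1.0, D_R=3.0, D_S=1.0)
print(f"ee(organic) = {ee_org:.2f}, ee(aqueous) = {ee_aq:.2f}")
```

A single stage gives only modest enrichment, which is why practical ELLE processes chain many stages (or run countercurrently) and why host selectivity matters so much.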
High Resolution Laser Mass Spectrometry Bioimaging
Murray, Kermit K.; Seneviratne, Chinthaka A.; Ghorai, Suman
2016-01-01
Mass spectrometry imaging (MSI) was introduced more than five decades ago with secondary ion mass spectrometry (SIMS) and a decade later with laser desorption/ionization (LDI) mass spectrometry (MS). Large biomolecule imaging by matrix-assisted laser desorption/ionization (MALDI) was developed in the 1990s and ambient laser MS a decade ago. Although SIMS has been capable of imaging with a moderate mass range at sub-micrometer lateral resolution from its inception, laser MS requires additional effort to achieve a lateral resolution of 10 μm or below, which is required to image at the size scale of single mammalian cells. This review covers untargeted large biomolecule MSI using lasers for desorption/ionization or laser desorption and post-ionization. These methods include laser microprobe (LDI) MSI, MALDI MSI, laser ambient and atmospheric pressure MSI, and near-field laser ablation MS. Novel approaches to improving lateral resolution are discussed, including oversampling, beam shaping, transmission geometry, reflective and through-hole objectives, microscope mode, and near-field optics. PMID:26972785
High-performance holographic technologies for fluid-dynamics experiments
Orlov, Sergei S.; Abarzhi, Snezhana I.; Oh, Se Baek; Barbastathis, George; Sreenivasan, Katepalli R.
2010-01-01
Modern technologies offer new opportunities for experimentalists in a variety of research areas of fluid dynamics. Improvements are now possible in the state-of-the-art in precision, dynamic range, reproducibility, motion-control accuracy, data-acquisition rate and information capacity. These improvements are required for understanding complex turbulent flows under realistic conditions, and for allowing unambiguous comparisons to be made with new theoretical approaches and large-scale numerical simulations. One of the new technologies is high-performance digital holography. State-of-the-art motion control, electronics and optical imaging allow for the realization of turbulent flows with very high Reynolds number (more than 10⁷) on a relatively small laboratory scale, and quantification of their properties with high space–time resolutions and bandwidth. In-line digital holographic technology can provide complete three-dimensional mapping of the flow velocity and density fields at high data rates (over 1000 frames per second) over a relatively large spatial area with high spatial (1–10 μm) and temporal (better than a few nanoseconds) resolution, and can give accurate quantitative description of the fluid flows, including those of multi-phase and unsteady conditions. This technology can be applied in a variety of problems to study fundamental properties of flow–particle interactions, rotating flows, non-canonical boundary layers and Rayleigh–Taylor mixing. Some of these examples are discussed briefly. PMID:20211881
Hayworth, Kenneth J.; Morgan, Josh L.; Schalek, Richard; Berger, Daniel R.; Hildebrand, David G. C.; Lichtman, Jeff W.
2014-01-01
The automated tape-collecting ultramicrotome (ATUM) makes it possible to collect large numbers of ultrathin sections quickly—the equivalent of a petabyte of high resolution images each day. However, even high throughput image acquisition strategies generate images far more slowly (at present ~1 terabyte per day). We therefore developed WaferMapper, a software package that takes a multi-resolution approach to mapping and imaging select regions within a library of ultrathin sections. This automated method selects and directs imaging of corresponding regions within each section of an ultrathin section library (UTSL) that may contain many thousands of sections. Using WaferMapper, it is possible to map thousands of tissue sections at low resolution and target multiple points of interest for high resolution imaging based on anatomical landmarks. The program can also be used to expand previously imaged regions, acquire data under different imaging conditions, or re-image after additional tissue treatments. PMID:25018701
Larue, Michelle A; Knight, Joseph
2014-12-01
The Southern Ocean is one of the most rapidly changing ecosystems on the planet due to the effects of climate change and commercial fishing for ecologically important krill and fish. Because sea ice loss is expected to be accompanied by declines in krill and fish predators, decoupling the effects of climate and anthropogenic changes on these predator populations is crucial for ecosystem-based management of the Southern Ocean. We reviewed research published from 2007 to 2014 that incorporated very high-resolution satellite imagery to assess distribution, abundance, and effects of climate and other anthropogenic changes on populations of predators in polar regions. Very high-resolution imagery has been used to study 7 species of polar animals in 13 papers, many of which provide methods through which further research can be conducted. Use of very high-resolution imagery in the Southern Ocean can provide a broader understanding of climate and anthropogenic forces on populations and inform management and conservation recommendations. We recommend that conservation biologists continue to integrate high-resolution remote sensing into broad-scale biodiversity and population studies in remote areas, where it can provide much needed detail.
Fukuda, Kenjiro; Someya, Takao
2017-07-01
Printed electronics enable the fabrication of large-scale, low-cost electronic devices and systems, and thus offer significant possibilities in terms of developing new electronics/optics applications in various fields. Almost all electronic applications require information processing using logic circuits. Hence, realizing the high-speed operation of logic circuits is also important for printed devices. This report summarizes recent progress in the development of printed thin-film transistors (TFTs) and integrated circuits in terms of materials, printing technologies, and applications. The first part of this report gives an overview of the development of functional inks such as semiconductors, electrodes, and dielectrics. The second part discusses high-resolution printing technologies and strategies to enable high-resolution patterning. The main focus of this report is on obtaining printed electrodes with high-resolution patterning and the electrical performance of printed TFTs using such printed electrodes. In the final part, some applications of printed electronics are introduced to exemplify their potential.
NASA Astrophysics Data System (ADS)
Tang, Y.; Birch, S.; Hayes, A.; Kirk, R. L.; Kutsop, N. W. S.; Squyres, S. W.
2017-12-01
Observations from ESA's Rosetta spacecraft of comet 67P/Churyumov-Gerasimenko (67P) have provided insights into the geological processes that act to modify the surface of a small, primitive body. The landscapes of 67P are shaped both by large-scale violent changes, such as cliff collapses and jet events, and by smaller, more subtle changes such as the formation of pits and ripples within the larger-scale granular deposits. Explosive jets are located by triangulating the same jet in multiple images. They appear to originate from locations close to numerous newly formed, small-scale pits, which were only observed after known jet events (for example, the jet observed on March 11th, 2015, in image N20150311T053737597ID30F22). This implies a possible link between these two dynamical processes. We generated high-resolution photoclinometric digital terrain models (DTMs) of the surface of 67P (at 1.5 m/pixel) in locations where recent jet events were observed and over surfaces where newly formed pits are observed. A comparison of DTMs generated both before and after the appearance of the pits provides insight into the magnitude of the dynamical changes, including the volume of the ejected material. By tracking the change in surface topography at such high resolution, we constrain both the volume of material ejected from the surface during a jet event and the volume retained in nearby deposits. By studying these events and their aftermath, it will be possible to formulate numerical models of jet formation and to explain why and how jets occur. We will use this information in conjunction with numerical modeling of the large-scale global transport of sedimentary materials on 67P to facilitate a better understanding of cometary landscape evolution.
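Estimating ejected volume from before/after DTM differencing amounts to summing elevation losses times the cell footprint area. The sketch below uses the abstract's 1.5 m/pixel resolution, but the synthetic grids and the pit geometry are invented for illustration:

```python
import numpy as np

def ejected_volume(before, after, cell_size):
    """Volume lost between two co-registered DTMs: sum of elevation
    decreases multiplied by the footprint area of one cell."""
    drop = np.clip(before - after, 0.0, None)  # ignore deposition
    return float(drop.sum() * cell_size ** 2)

cell = 1.5                      # m/pixel, matching the abstract's DTMs
before = np.zeros((4, 4))       # flat reference surface (synthetic)
after = before.copy()
after[1:3, 1:3] = -2.0          # a new 2 m deep pit over a 2x2-pixel area
print(ejected_volume(before, after, cell), "m^3")
```

Clipping only the decreases separates excavation from nearby deposition; summing the increases instead would give the volume retained in deposits, the other quantity the abstract constrains.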
Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.
2012-01-01
Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394
A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data
Venkat, A.; Christensen, C.; Gyulassy, A.; Summa, B.; Federer, F.; Angelucci, A.; Pascucci, V.
2017-01-01
The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large-scale panoramic images. The platform is organized around a hierarchical cache-oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the ViSUS framework used in an interactive setting with the microscopy data. PMID:28638896
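The "hierarchical cache-oblivious data layout" mentioned above is, in spirit, a space-filling-curve reordering: nearby samples in 2D/3D stay nearby on disk at every scale. A sketch using a 2D Morton (Z-order) index follows; the actual IDX format uses a related but distinct hierarchical ordering, so this is only an illustration of the idea, not the IDX specification.

```python
# Interleave the bits of x and y into a single Z-order index. Samples that
# are close in (x, y) tend to be close in this 1D ordering, so a block of
# the image maps to a mostly contiguous byte range at any zoom level.

def morton2d(x, y, bits=16):
    """Return the Z-order (Morton) index of integer coordinates (x, y)."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)       # x bits go to even positions
        code |= ((y >> i) & 1) << (2 * i + 1)   # y bits go to odd positions
    return code

# The four pixels of a 2x2 block land on four consecutive indices:
print(morton2d(0, 0), morton2d(1, 0), morton2d(0, 1), morton2d(1, 1))   # 0 1 2 3
```

Laying out samples in this order is what makes the access pattern "cache oblivious": any power-of-two block is contiguous without tuning for a particular cache or page size.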
Unstructured-grid coastal ocean modelling in Southern Adriatic and Northern Ionian Seas
NASA Astrophysics Data System (ADS)
Federico, Ivan; Pinardi, Nadia; Coppini, Giovanni; Oddo, Paolo
2016-04-01
The Southern Adriatic Northern Ionian coastal Forecasting System (SANIFS) is a short-term forecasting system based on an unstructured-grid approach. The model component is built on the SHYFEM finite-element three-dimensional hydrodynamic model. The operational chain exploits a downscaling approach starting from the Mediterranean oceanographic-scale model MFS (Mediterranean Forecasting System, operated by INGV). The implementation has been designed to provide accurate hydrodynamics and active tracer processes in the coastal waters of south-eastern Italy (Apulia, Basilicata and Calabria regions), where the model resolution varies in the range of 50-500 m. The horizontal resolution remains relatively high even in open-sea areas, where the element size is approximately 3 km. The model is forced: (i) at the lateral open boundaries through a full nesting strategy directly with the MFS (temperature, salinity, non-tidal sea surface height and currents) and OTPS (tidal forcing) fields; (ii) at the surface through two alternative atmospheric forcing datasets (ECMWF and COSMO-ME) via MFS bulk formulae. Given that the coastal fields are driven by a combination of local/coastal and deep-ocean forcings propagating along the shelf, the performance of SANIFS was verified first (i) at the large and shelf-coastal scales by comparison with a large-scale CTD survey and then (ii) at the coastal-harbour scale by comparison with CTD, ADCP and tide gauge data. Sensitivity tests were performed on initialization conditions (mainly focused on spin-up procedures) and on surface boundary conditions by assessing the reliability of two alternative datasets at different horizontal resolutions (12.5 and 7 km). The present work highlights how downscaling can improve the simulation of the flow field going from typical open-ocean scales of the order of several km to the coastal (and harbour) scales of tens to hundreds of meters.
NASA Astrophysics Data System (ADS)
Schaaf, Benjamin; Feser, Frauke
2015-04-01
The evaluation of long-term changes in wind speed is very important for coastal areas and their protection measures. The wind variability at the regional scale along the coast of Northern Germany is therefore analysed. In order to derive changes in storminess it is essential to analyse long, homogeneous meteorological time series. Wind measurements often suffer from inconsistencies which arise from changes in instrumentation, observation method, or station location. Reanalysis data take such inhomogeneities of the observations into account and convert the measurements into a consistent, gridded data set with fixed grid spacing and time intervals. This yields a smooth, homogeneous data set, but with relatively low resolution (about 210 km for the longest reanalysis data set, the NCEP reanalysis starting in 1948). A high-resolution regional atmospheric model is therefore used to downscale these reanalyses, applying the spectral nudging technique in addition to the dynamical downscaling. This method 'nudges' the large spatial scales of the regional climate model towards the reanalysis, while the smaller spatial scales are left unchanged. It has been applied successfully in a number of studies, leading to realistic atmospheric descriptions of past weather. With the regional climate model COSMO-CLM, a very high-resolution data set was calculated for the 67-year period from 1948 until now. The model area is Northern Germany with the coastal area of the North Sea and parts of the Baltic Sea. This is one of the first climate-scale model simulations with a very high resolution of 2.8 km, so even small-scale effects can be detected. This hindcast simulation offers numerous evaluation options: one can create wind climatologies for regional areas such as the metropolitan region of Hamburg, or investigate individual storms in case studies.
With a filtering and tracking program the course of individual storms can be followed and compared with observations. Statistical analyses are also possible, such as percentiles, return periods and other extreme-value statistics. Later, with a further nesting step, the grid spacing can be reduced to 1 km for individual areas of interest, to analyse small islands (such as Foehr or Amrum) and their effects on the atmospheric flow more closely.
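The spectral nudging idea described in the abstract above can be illustrated in one dimension: only the large-scale (low-wavenumber) part of the difference between the driving field and the regional-model field is relaxed toward the driving field, while small scales are left to the regional model. The field values, cutoff wavenumber and relaxation coefficient below are illustrative, not COSMO-CLM settings.

```python
# 1-D toy of spectral nudging using a plain discrete Fourier transform.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [(sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n).real
            for t in range(n)]

def spectral_nudge(model, driving, k_max, alpha):
    """Relax wavenumbers <= k_max of `model` toward `driving` with weight alpha."""
    M, D = dft(model), dft(driving)
    n = len(model)
    out = []
    for k in range(n):
        wavenumber = min(k, n - k)           # account for negative frequencies
        if wavenumber <= k_max:
            out.append(M[k] + alpha * (D[k] - M[k]))   # nudge large scales
        else:
            out.append(M[k])                 # small scales untouched
    return idft(out)

model   = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]   # small-scale wave only
driving = [2.0] * 8                                     # large-scale (k=0) offset
nudged  = spectral_nudge(model, driving, k_max=1, alpha=0.5)
print([round(v, 6) for v in nudged])   # approximately [1.0, 2.0, 1.0, 0.0, ...]
```

The small-scale wave survives intact while the mean is pulled halfway toward the driving field, which is the behaviour that keeps a long regional hindcast from drifting away from the reanalysis.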
Local short-duration precipitation extremes in Sweden: observations, forecasts and projections
NASA Astrophysics Data System (ADS)
Olsson, Jonas; Berg, Peter; Simonsson, Lennart
2015-04-01
Local short-duration precipitation extremes (LSPEs) are a key driver of hydrological hazards, notably in steep catchments with thin soils and in urban environments. The floods, landslides, etc. that they trigger have large consequences for society in terms of both economy and health. Accurate estimation of LSPEs, on climatological time-scales (past, present, future) as well as in real time, is thus of great importance for improved hydrological predictions and for the design of constructions and infrastructure affected by hydrological fluxes. Analysis of LSPEs is, however, associated with various limitations and uncertainties. These stem to a large degree from the small-scale nature of the meteorological processes behind LSPEs and the associated requirements on observation sensors as well as model descriptions. Some examples of the limitations involved are given in the following. - Observations: High-resolution data sets available for LSPE analyses are often limited to either relatively long series from one or a few stations or relatively short series from larger station networks. Radar data have excellent resolution in both time and space, but the estimated local precipitation intensity is still highly uncertain. New and promising techniques (e.g. microwave links) are still in their infancy. - Weather forecasts (short-range): Although forecasts with the spatial resolution required to potentially generate LSPEs (around 2-4 km) are becoming operationally available, the actual forecast precision for LSPEs is largely unknown. Forecasted LSPEs may be displaced in time or, more critically, in space, which strongly affects the possibility to assess hydrological risk. - Climate projections: The spatial resolution of the current RCM generation (around 25 km) is not sufficient for a proper description of LSPEs. Statistical post-processing (i.e. downscaling) is required, which adds substantial uncertainty to the final result. 
Ensemble generation of sufficiently high-resolution RCM projections is not yet computationally feasible. In this presentation, examples of recent research in Sweden related to these aspects will be given with some main findings shown and discussed. Finally, some ongoing and future research directions will be outlined (the former hopefully accompanied by some brand-new results).
The Holocene Geomagnetic Field: Spikes, Low Field Anomalies, and Asymmetries
NASA Astrophysics Data System (ADS)
Constable, C.
2017-12-01
Our understanding of the Holocene magnetic field is constrained by individual paleomagnetic records of variable quality and resolution, composite regional secular variation curves, and low resolution global time-varying geomagnetic field models. Although spatial and temporal data coverages have greatly improved in recent years, typical views of millennial-scale secular variation and the underlying physical processes continue to be heavily influenced by more detailed field structure and short term variability inferred from the historical record and modern observations. Recent models of gyre driven decay of the geomagnetic dipole on centennial time scales, and studies of the evolution of the South Atlantic Anomaly provide one prominent example. Since 1840 dipole decay has largely been driven by meridional flux advection, with generally smaller fairly steady contributions from magnetic diffusion. The decay is dominantly associated with geomagnetic activity in the Southern Hemisphere. In contrast to the present decay, dipole strength generally grew between 1500 and 1000 BC, sustaining high but fluctuating values around 90-100 ZAm^2 until after 1500 AD. Thus high dipole moments appear to have been present shortly after 1000 BC at the time of the Levantine spikes, which represent extreme variations in regional geomagnetic field strength. It has been speculated that the growth in dipole moment originated from a strong flux patch near the equatorial region at the core-mantle boundary that migrated north and west to augment the dipole strength, suggesting the presence of a large-scale anticyclonic gyre in the northern hemisphere, not totally unlike the southern hemisphere flow that dominates present day dipole decay. The later brief episodes of high field strength in the Levant may have contributed to prolonged values of high dipole strength until the onset of dipole decay in the late second millennium AD. 
This could support the concept of a large-scale stable flow configuration for several millennia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gritsenko, Marina A.; Xu, Zhe; Liu, Tao; Smith, Richard D.
Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high-resolution and high-mass-accuracy MS analysis for high-confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Evaluation of Multi-Channel ADCs for Gamma-Ray Spectroscopy
NASA Astrophysics Data System (ADS)
Tan, Hui; Hennig, Wolfgang; Walby, Mark D.; Breus, Dimitry; Harris, Jackson
2013-04-01
As nuclear physicists increasingly design large-scale experiments with hundreds or thousands of detector channels, there are growing needs for high-density readout electronics with good timing and energy resolution that at the same time offer a lower cost per channel than existing commercial solutions. Recent improvements in the design of commercial analog-to-digital converters (ADCs) have resulted in a variety of multi-channel ADCs that are a natural choice for designing such high-density readout modules. However, multi-channel ADCs are typically designed for medical imaging/ultrasound applications and are therefore not rated for their spectroscopic characteristics. In this work, we evaluated the gamma-ray spectroscopic performance of several multi-channel ADCs, including their energy resolution, nonlinearity, and timing resolution. Some of these ADCs demonstrated excellent energy resolution, 2.66% FWHM at 662 keV with a LaBr3 detector or 1.78 keV FWHM at 1332.5 keV with a high-purity germanium (HPGe) detector, and sub-nanosecond timing resolution with LaBr3. We present results from these measurements to illustrate their suitability for gamma-ray spectroscopy.
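The energy-resolution figures quoted in the abstract (e.g. 2.66% FWHM at 662 keV) relate a photopeak's full width at half maximum to its centroid energy. A sketch of that conversion follows, assuming a Gaussian photopeak so that FWHM = 2*sqrt(2*ln 2)*sigma; the sigma value used is a hypothetical one chosen to land near the quoted figure, not a measurement from the paper.

```python
import math

FWHM_PER_SIGMA = 2.0 * math.sqrt(2.0 * math.log(2.0))   # ~2.3548 for a Gaussian

def resolution_percent(sigma_kev, centroid_kev):
    """Relative energy resolution (FWHM / centroid) in percent."""
    return 100.0 * FWHM_PER_SIGMA * sigma_kev / centroid_kev

# A detector with sigma ~7.5 keV at the 662 keV 137Cs line:
print(round(resolution_percent(7.5, 662.0), 2))   # 2.67
```

The same formula explains why HPGe figures are quoted in absolute keV (1.78 keV FWHM at 1332.5 keV corresponds to roughly 0.13% relative resolution).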
NASA Astrophysics Data System (ADS)
Comas, X.; Wright, W. J.; Hynek, S. A.; Ntarlagiannis, D.; Terry, N.; Job, M. J.; Fletcher, R. C.; Brantley, S.
2017-12-01
Previous studies in the Rio Icacos watershed in the Luquillo Mountains (Puerto Rico) have shown that regolith materials are rapidly developed from the alteration of quartz diorite bedrock, and create a blanket on top of the bedrock with a thickness that decreases with proximity to the knickpoint. The watershed is also characterized by a system of heterogeneous fractures that likely drive bedrock weathering and the formation of corestones and associated spheroidal fracturing and rindlets. Previous efforts to characterize the spatial distribution of fractures were based on aerial images that did not account for the architecture of the critical zone below the surface. In this study we use an array of near-surface geophysical methods at multiple scales to better understand how the spatial distribution and density of fractures varies with topography and proximity to the knickpoint. Large km-scale surveys using ground penetrating radar (GPR), terrain conductivity, and capacitively coupled resistivity were combined with smaller-scale surveys (10-100 m) using electrical resistivity imaging (ERI) and shallow seismics, and were directly constrained with boreholes from previous studies. Geophysical results were compared to theoretical models of compressive stress due to gravity and regional compression, and showed consistency in describing increased dilation of fractures with proximity to the knickpoint. This study shows the potential of multidisciplinary approaches to model critical zone processes at multiple scales of measurement and high spatial resolution. The approach can be particularly efficient at large km-scales when applying geophysical methods that allow for rapid data acquisition (i.e. walking pace) at high spatial resolution (i.e. cm scales).
Investigating the scale-adaptivity of a shallow cumulus parameterization scheme with LES
NASA Astrophysics Data System (ADS)
Brast, Maren; Schemann, Vera; Neggers, Roel
2017-04-01
In this study we investigate the scale-adaptivity of a new parameterization scheme for shallow cumulus clouds in the gray zone. The Eddy-Diffusivity Multiple Mass-Flux (or ED(MF)n) scheme is a bin-macrophysics scheme, in which subgrid transport is formulated in terms of discretized size densities. While scale-adaptivity in the ED-component is achieved using a pragmatic blending approach, the MF-component is filtered such that only the transport by plumes smaller than the grid size is maintained. For testing, ED(MF)n is implemented in a large-eddy simulation (LES) model, replacing the original subgrid-scheme for turbulent transport. LES thus plays the role of a non-hydrostatic testing ground, which can be run at different resolutions to study the behavior of the parameterization scheme in the boundary-layer gray zone. In this range convective cumulus clouds are partially resolved. We find that at high resolutions the clouds and the turbulent transport are predominantly resolved by the LES, and the transport represented by ED(MF)n is small. This partitioning changes towards coarser resolutions, with the representation of shallow cumulus clouds becoming exclusively carried by the ED(MF)n. The way the partitioning changes with grid-spacing matches the results of previous LES studies, suggesting some scale-adaptivity is captured. Sensitivity studies show that a scale-inadaptive ED component stays too active at high resolutions, and that the results are fairly insensitive to the number of transporting updrafts in the ED(MF)n scheme. Other assumptions in the scheme, such as the distribution of updrafts across sizes and the value of the area fraction covered by updrafts, are found to affect the location of the gray zone.
Flank vents and graben as indicators of Late Amazonian volcanotectonic activity on Olympus Mons
NASA Astrophysics Data System (ADS)
Peters, S. I.; Christensen, P. R.
2017-03-01
Previous studies have focused on large-scale features on Olympus Mons, such as its flank terraces, the summit caldera complex, and the basal escarpment and aureole deposits. Here we identify and characterize previously unrecognized and unmapped small scale features to help further understand the volcanotectonic evolution of this enormous volcano. Using Context Camera, High Resolution Imaging Science Experiment, Thermal Emission Imaging System, High Resolution Stereo Camera Digital Terrain Model, and Mars Orbiter Laser Altimeter data, we identified and characterized the morphology and distribution of 60 flank vents and 84 grabens on Olympus Mons. We find that effusive eruptions have dominated volcanic activity on Olympus Mons in the Late Amazonian. Explosive eruptions were rare, implying volatile-poor magmas and/or a lack of magma-water interactions during the Late Amazonian. The distribution of flank vents suggests dike propagation of hundreds of kilometers and shallow magma storage. Small grabens, not previously observed in lower-resolution data, occur primarily on the lower flanks of Olympus Mons and indicate late-stage extensional tectonism. Based on superposition relationships, we have concluded two stages of development for Olympus Mons during the Late Amazonian: (1) primarily effusive resurfacing and formation of flank vents followed by (2) waning effusive volcanism and graben formation and/or reactivation. This developmental sequence resembles that proposed for Ascraeus Mons and other large Martian shields, suggesting a similar geologic evolution for these volcanoes.
Simulation Based Exploration of Critical Zone Dynamics in Intensively Managed Landscapes
NASA Astrophysics Data System (ADS)
Kumar, P.
2017-12-01
The advent of high-resolution measurements of topographic and (vertical) vegetation features using aerial LiDAR is enabling us to resolve micro-scale (~1 m) landscape structural characteristics over large areas. The availability of hyperspectral measurements further augments these LiDAR data by enabling the biogeochemical characterization of vegetation and soils at unprecedented spatial resolutions (~1-10 m). Such data have opened up novel opportunities for modeling Critical Zone processes and exploring questions that were not possible before. We show how an integrated 3-D model at 1 m grid resolution can enable us to resolve micro-topographic and ecological dynamics and their control on hydrologic and biogeochemical processes over large areas. We address the computational challenge of such detailed modeling by exploiting hybrid CPU and GPU computing technologies. We show results of moisture, biogeochemical, and vegetation dynamics from studies in the Critical Zone Observatory for Intensively Managed Landscapes (IMLCZO) in the Midwestern United States.
NASA Astrophysics Data System (ADS)
Rajib, A.; Merwade, V.; Liu, Z.; Lane, C.; Golden, H. E.; Tavakoly, A. A.; Follum, M. L.
2017-12-01
There have been many initiatives to develop frameworks for continental-scale modeling and mapping of floodplain dynamics. The choice of a model for such needs should be governed by its suitability for execution on high-performance cyber platforms, its ability to integrate supporting hydraulic/hydrodynamic tools, and its ability to assimilate earth observations. Furthermore, dissemination of large volumes of outputs for public use and interoperability with similar frameworks should be considered. Considering these factors, we have conducted a series of modeling experiments and developed a suite of cyber-enabled platforms that have transformed the Soil and Water Assessment Tool (SWAT) into an appropriate model for use in a continental-scale, high-resolution, near real-time flood information framework. Our first experiment uses a medium-size watershed in Indiana, USA and attempts burning in the high-resolution National Hydrography Dataset Plus (NHDPlus) into the SWAT model. This is crucial to make the outputs comparable with other global/national initiatives. The second experiment builds upon the first to add a modified landscape representation in the model which differentiates between upland and floodplain processes. Our third experiment involves two separate efforts: coupling SWAT with the hydrodynamic model LISFLOOD-FP and with a new-generation, low-complexity hydraulic model, AutoRoute. We have executed the prototype "loosely-coupled" models for the Upper Mississippi-Ohio River Basin in the USA, encompassing 1 million square km of drainage area and nearly 0.2 million NHDPlus river reaches. The preliminary results suggest reasonable accuracy for both streamflow and flood inundation. 
In this presentation, we will also showcase three cyber-enabled platforms, including SWATShare to run and calibrate large scale SWAT models online using high performance computational resources, HydroGlobe to automatically extract and assimilate multiple remotely sensed earth observations in model sub-basins, and SWATFlow to visualize/download streamflow and flood inundation maps through an interactive interface. With all these transformational changes to enhance and support SWAT, it is expected that the model can be a sustainable alternative in the Global Flood Partnership program.
NASA Astrophysics Data System (ADS)
McClain, Bobbi J.; Porter, William F.
2000-11-01
Satellite imagery is a useful tool for large-scale habitat analysis; however, its limitations need to be tested. We tested these limitations by varying the methods of a habitat evaluation for white-tailed deer (Odocoileus virginianus) in the Adirondack Park, New York, USA, utilizing harvest data to create and validate the assessment models. We used two classified images, one with a large minimum mapping unit but high accuracy and one with no minimum mapping unit but slightly lower accuracy, to test the sensitivity of the evaluation to these differences. We tested the utility of two methods of assessment, habitat suitability index modeling and pattern recognition modeling. We varied the scale at which the models were applied by using five separate sizes of analysis windows. Results showed that the presence of a large minimum mapping unit eliminates important details of the habitat. Window size is relatively unimportant if the data are averaged to a large resolution (i.e., township), but if the data are used at the smaller resolution, then the window size is an important consideration. In the Adirondacks, the proportion of hardwood and softwood in an area is most important to the spatial dynamics of deer populations. The low occurrence of open area in all parts of the park either limits the effect of this cover type on the population or limits our ability to detect the effect. The arrangement and interspersion of cover types were not significant to deer populations.
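The window-size sensitivity discussed above can be illustrated with a moving-window proportion of a cover type over a classified raster. The tiny grid and class codes below (1 = hardwood, 0 = other) are hypothetical, not the Adirondack data.

```python
def window_proportion(grid, row, col, half):
    """Proportion of cells equal to 1 in the (2*half+1)-cell square window
    centred on (row, col), clipped at the grid edges."""
    rows, cols = len(grid), len(grid[0])
    cells = [grid[r][c]
             for r in range(max(0, row - half), min(rows, row + half + 1))
             for c in range(max(0, col - half), min(cols, col + half + 1))]
    return sum(1 for v in cells if v == 1) / len(cells)

landcover = [
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# A small window sees local detail; a larger window smooths it out.
print(window_proportion(landcover, 0, 0, 1))   # 3/4 = 0.75
print(window_proportion(landcover, 0, 0, 3))   # whole grid: 7/16 = 0.4375
```

The same cell can thus score very differently depending on window size, which is the effect the study measures when data are used at fine rather than township-averaged resolution.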
Interactive Correlation Analysis and Visualization of Climate Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Kwan-Liu
The relationship between our ability to analyze and extract insights from visualization of climate model output and the capability of the available resources to make those visualizations has reached a crisis point. The large volume of data currently produced by climate models is overwhelming the current, decades-old visualization workflow. The traditional methods for visualizing climate output also have not kept pace with changes in the types of grids used, the number of variables involved, the number of different simulations performed with a climate model, or the feature-richness of high-resolution simulations. This project has developed new and faster methods for visualization in order to get the most knowledge out of the new generation of high-resolution climate models. While traditional climate images will continue to be useful, there is need for new approaches to visualization and analysis of climate data if we are to gain all the insights available in ultra-large data sets produced by high-resolution model output and ensemble integrations of climate models such as those produced for the Coupled Model Intercomparison Project. Towards that end, we have developed new visualization techniques for performing correlation analysis. We have also introduced highly scalable, parallel rendering methods for visualizing large-scale 3D data. This project was done jointly with climate scientists and visualization researchers at Argonne National Laboratory and NCAR.
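One simple form of the correlation analysis mentioned above is the Pearson correlation of two variables' time series at each grid point, producing a correlation field that can then be rendered. The sketch below uses synthetic data and is only an illustration of the computation, not this project's software.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_field(var_a, var_b):
    """var_a, var_b: dicts mapping grid point -> time series of equal length."""
    return {pt: pearson(var_a[pt], var_b[pt]) for pt in var_a}

temp   = {(0, 0): [1.0, 2.0, 3.0, 4.0], (0, 1): [1.0, 2.0, 1.0, 2.0]}
precip = {(0, 0): [2.0, 4.0, 6.0, 8.0], (0, 1): [2.0, 1.0, 2.0, 1.0]}
field = correlation_field(temp, precip)
print(round(field[(0, 0)], 3), round(field[(0, 1)], 3))   # 1.0 -1.0
```

At scale, the per-point independence of this computation is exactly what makes it amenable to the parallel rendering and analysis methods the project describes.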
A high-resolution regional reanalysis for the European CORDEX region
NASA Astrophysics Data System (ADS)
Bollmeyer, Christoph; Keller, Jan; Ohlwein, Christian; Wahl, Sabrina
2015-04-01
Within the Hans-Ertel-Centre for Weather Research (HErZ), the climate monitoring branch concentrates its efforts on the assessment and analysis of regional climate in Germany and Europe. In joint cooperation with DWD (German Weather Service), a high-resolution reanalysis system based on the COSMO model has been developed. Reanalyses are gaining more and more importance as a source of meteorological information for many purposes and applications. Several global reanalysis projects (e.g., ERA, MERRA, CFSR, JRA) produce and verify these data sets to provide time series as long as possible combined with high data quality. Due to a spatial resolution of only 50-70 km and 3-hourly temporal output, however, they are not suitable for small-scale problems (e.g., regional climate assessment, meso-scale NWP verification, input for subsequent models such as river runoff simulations, renewable energy applications). The implementation of regional reanalyses, based on a limited-area model along with a data assimilation scheme, can generate reanalysis data sets with high spatio-temporal resolution. The work presented here focuses on two regional reanalyses for Europe and Germany. The European reanalysis COSMO-REA6 matches the CORDEX EURO-11 specifications, albeit at a higher spatial resolution, i.e., 0.055° (6 km) instead of 0.11° (12 km). Nested into COSMO-REA6 is COSMO-REA2, a convective-scale reanalysis with 2 km resolution for Germany. COSMO-REA6 comprises the assimilation of observational data using the existing nudging scheme of COSMO and is complemented by a special soil moisture analysis and boundary conditions given by ERA-Interim data. COSMO-REA2 also uses the nudging scheme, complemented by latent heat nudging of radar information. The reanalysis data set currently covers 17 years (1997-2013) for COSMO-REA6 and 4 years (2010-2013) for COSMO-REA2, with a very large set of output variables and a high temporal output frequency of hourly 3D fields and quarter-hourly 2D fields. 
The reanalyses are evaluated against independent observations of the most important meteorological parameters, with special emphasis on precipitation and high-impact weather situations.
Tang, Shiming; Zhang, Yimeng; Li, Zhihao; Li, Ming; Liu, Fang; Jiang, Hongfei; Lee, Tai Sing
2018-04-26
One general principle of sensory information processing is that the brain must optimize efficiency by reducing the number of neurons that process the same information. The sparseness of the sensory representations in a population of neurons reflects the efficiency of the neural code. Here, we employ large-scale two-photon calcium imaging to examine the responses of a large population of neurons within the superficial layers of area V1 with single-cell resolution, while simultaneously presenting a large set of natural visual stimuli, to provide the first direct measure of the population sparseness in awake primates. The results show that only 0.5% of neurons respond strongly to any given natural image - indicating a ten-fold increase in the inferred sparseness over previous measurements. These population activities are nevertheless necessary and sufficient to discriminate visual stimuli with high accuracy, suggesting that the neural code in the primary visual cortex is both super-sparse and highly efficient. © 2018, Tang et al.
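The sparseness measure described above can be illustrated with a minimal numerical sketch (synthetic data and an arbitrary response threshold, not the authors' imaging pipeline): count, per stimulus, the fraction of neurons whose response exceeds a "strong response" criterion, then average over stimuli.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: responses of 1000 neurons to 200 natural images,
# drawn from a heavy-tailed distribution so only a few respond strongly.
responses = rng.exponential(scale=0.1, size=(200, 1000))

def population_sparseness(resp, threshold):
    """Fraction of neurons responding above threshold, averaged over stimuli."""
    strong = resp > threshold
    return strong.mean(axis=1).mean()

frac = population_sparseness(responses, threshold=1.0)
```

A fraction of 0.005 would correspond to the 0.5% of strongly responding neurons reported above.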
SMOS L1C and L2 Validation in Australia
NASA Technical Reports Server (NTRS)
Rudiger, Christoph; Walker, Jeffrey P.; Kerr, Yann H.; Mialon, Arnaud; Merlin, Olivier; Kim, Edward J.
2012-01-01
Extensive airborne field campaigns (Australian Airborne Cal/val Experiments for SMOS - AACES) were undertaken during the 2010 summer and winter seasons of the southern hemisphere. The purpose of those campaigns was the validation of the Level 1c (brightness temperature) and Level 2 (soil moisture) products of the ESA-led Soil Moisture and Ocean Salinity (SMOS) mission. As SMOS is the first satellite to globally map L-band (1.4 GHz) emissions from the Earth's surface, and the first 2-dimensional interferometric microwave radiometer used for Earth observation, large-scale and long-term validation campaigns have been conducted world-wide, of which AACES is the most extensive. AACES combined large-scale, medium-resolution airborne L-band and spectral observations with high-resolution in-situ measurements of soil moisture across a 50,000 km2 area of the Murrumbidgee River catchment, located in south-eastern Australia. This paper presents a qualitative assessment of the SMOS brightness temperature and soil moisture products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Renzi, Fabiana; Panetta, Gianna; Vallone, Beatrice
Recombinant His-tagged XendoU, a eukaryotic endoribonuclease, appeared to aggregate in the presence of divalent cations. Monodisperse protein which yielded crystals diffracting to 2.2 Å was obtained by addition of EDTA. XendoU is the first endoribonuclease described in higher eukaryotes as being involved in the endonucleolytic processing of intron-encoded small nucleolar RNAs. It is conserved among eukaryotes and its viral homologue is essential in SARS replication and transcription. The large-scale purification and crystallization of recombinant XendoU are reported. The tendency of the recombinant enzyme to aggregate could be reversed upon the addition of chelating agents (EDTA, imidazole): aggregation is a potential drawback when purifying and crystallizing His-tagged proteins, which are widely used, especially in high-throughput structural studies. Purified monodisperse XendoU crystallized in two different space groups: trigonal P3121, diffracting to low resolution, and monoclinic C2, diffracting to higher resolution.
Scaling an in situ network for high resolution modeling during SMAPVEX15
NASA Astrophysics Data System (ADS)
Coopersmith, E. J.; Cosh, M. H.; Jacobs, J. M.; Jackson, T. J.; Crow, W. T.; Holifield Collins, C.; Goodrich, D. C.; Colliander, A.
2015-12-01
Among the greatest challenges within the field of soil moisture estimation is that of scaling sparse point measurements within a network to produce higher resolution map products. Large-scale field experiments present an ideal opportunity to develop methodologies for this scaling, by coupling in situ networks, temporary networks, and aerial mapping of soil moisture. During the Soil Moisture Active Passive Validation Experiments in 2015 (SMAPVEX15) in and around the USDA-ARS Walnut Gulch Experimental Watershed and LTAR site in southeastern Arizona, USA, a high-density network of soil moisture stations was deployed across a sparse, permanent in situ network in coordination with intensive soil moisture sampling and an aircraft campaign. This watershed is also densely instrumented with precipitation gauges (one gauge per 0.57 km2) to monitor the North American Monsoon System, which dominates the hydrologic cycle during the summer months in this region. Using the precipitation and soil moisture time series provided, a physically-based model is calibrated to provide estimates at the 3 km, 9 km, and 36 km scales. The results from this model will be compared with the point-scale gravimetric samples, the aircraft-based sensor, and the satellite-based products retrieved from NASA's Soil Moisture Active Passive mission.
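A minimal sketch of one ingredient of such upscaling, assuming a regular 1 km grid and simple block averaging (the actual SMAPVEX15 model calibration is far richer than this):

```python
import numpy as np

def block_average(field, factor):
    """Aggregate a 2-D soil moisture field to a coarser grid by block
    averaging; grid dimensions must be divisible by `factor`."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Hypothetical 1 km field covering a 36 km x 36 km box
rng = np.random.default_rng(1)
sm_1km = rng.uniform(0.05, 0.35, size=(36, 36))
sm_3km = block_average(sm_1km, 3)    # 12 x 12 grid
sm_9km = block_average(sm_1km, 9)    # 4 x 4 grid
sm_36km = block_average(sm_1km, 36)  # single 36 km value
```

Block averaging conserves the domain mean, so the 36 km value equals the mean of the 1 km field.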
NASA Astrophysics Data System (ADS)
Piniewski, Mikołaj
2016-05-01
The objective of this study was to apply a previously developed large-scale and high-resolution SWAT model of the Vistula and the Odra basins, calibrated with a focus on natural flow simulation, in order to assess the impact of three different dam reservoirs on streamflow using the Indicators of Hydrologic Alteration (IHA). A tailored spatial calibration approach was designed, in which calibration was focused on a large set of relatively small, non-nested sub-catchments with semi-natural flow regimes. These were classified into calibration clusters based on flow statistics similarity. After calibration and validation gave overall positive results, the calibrated parameter values were transferred to the remaining part of the basins using an approach based on hydrological similarity of donor and target catchments. The calibrated model was applied in three case studies with the purpose of assessing the effect of dam reservoirs (the Włocławek, Siemianówka and Czorsztyn Reservoirs) on streamflow alteration. Both the assessment based on gauged streamflow (Before-After design) and the one based on simulated natural streamflow showed large alterations in selected flow statistics related to magnitude, duration, high and low flow pulses, and rate of change. Benefits of using a large-scale and high-resolution hydrological model for the assessment of streamflow alteration include: (1) providing an alternative or complementary approach to the classical Before-After designs, (2) isolating the climate variability effect from the dam (or any other source of alteration) effect, and (3) providing a practical tool that can be applied uniformly at a range of spatial scales over a large area such as a country. Thus, the presented approach can be applied for designing more natural flow regimes, which is crucial for river and floodplain ecosystem restoration in the context of the European Union's policy on environmental flows.
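A simplified, hypothetical illustration of the Before-After style of comparison behind the IHA statistics (the real IHA suite comprises dozens of parameters covering magnitude, timing, duration, frequency, and rate of change):

```python
import numpy as np

def iha_alteration(pre, post, stat=np.median):
    """Percent change in a flow statistic between pre- and post-impact
    periods (a simplified IHA-style Before-After comparison)."""
    ref, alt = stat(pre), stat(post)
    return 100.0 * (alt - ref) / ref

# Hypothetical annual 7-day low flows (m3/s) before and after dam closure
pre_dam = np.array([12.0, 9.5, 11.2, 10.4, 13.1])
post_dam = np.array([6.1, 5.4, 7.0, 6.6, 5.9])
change = iha_alteration(pre_dam, post_dam)  # negative: low flows reduced
```

The model-based variant described above would substitute simulated natural streamflow for the pre-impact record, isolating the dam effect from climate variability.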
Solar Confocal Interferometers for Sub-Picometer-Resolution Spectral Filters
NASA Technical Reports Server (NTRS)
Gary, G. Allen; Pietraszewski, Chris; West, Edward A.; Dines, Terence C.
2006-01-01
The confocal Fabry-Perot interferometer allows sub-picometer spectral resolution of Fraunhofer line profiles. Such high spectral resolution is needed to keep pace with the higher spatial resolution of the new set of large-aperture solar telescopes: the line-of-sight spatial resolution derived from line-profile inversions would then track the improvements in the transverse spatial scale provided by the larger apertures. The confocal interferometer's unique properties allow a simultaneous increase in both etendue and spectral power. We have constructed and tested two confocal interferometers. In this paper we compare the confocal interferometer with other spectral imaging filters, provide initial design parameters, show construction details for the two designs, report the laboratory test results for these interferometers, and propose a multiple-etalon system for future testing of these units to obtain sub-picometer spectral information on the photosphere in both the visible and near-infrared.
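The sub-picometer claim can be sanity-checked with the standard textbook relations for a confocal cavity (free spectral range c/4L; resolution = FSR/finesse); the cavity length, finesse, and wavelength below are hypothetical, not the instrument parameters reported here:

```python
c = 299_792_458.0  # speed of light, m/s

def confocal_fsr_hz(cavity_length_m):
    """Free spectral range of a confocal Fabry-Perot cavity: c / (4 L)."""
    return c / (4.0 * cavity_length_m)

def resolution_pm(cavity_length_m, finesse, wavelength_m):
    """Spectral resolution in picometers, via delta_nu = FSR / finesse
    and delta_lambda = (lambda**2 / c) * delta_nu."""
    dnu = confocal_fsr_hz(cavity_length_m) / finesse
    return (wavelength_m ** 2 / c) * dnu * 1e12

# Hypothetical: 10 cm cavity, finesse 200, at 630 nm
dl = resolution_pm(0.10, 200.0, 630e-9)
```

Even with these modest assumed numbers the resolution comes out well below a picometer, consistent with the regime discussed in the paper.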
Preliminary Assessment of Microwave Readout Multiplexing Factor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croce, Mark Philip; Koehler, Katrina Elizabeth; Rabin, Michael W.
2017-01-23
Ultra-high resolution microcalorimeter gamma spectroscopy is a new non-destructive assay technology for measurement of plutonium isotopic composition, with the potential to reduce total measurement uncertainty to a level competitive with destructive analysis methods [1-4]. Achieving this level of performance in practical applications requires not only the energy resolution now routinely achieved with transition-edge sensor microcalorimeter arrays (an order of magnitude better than for germanium detectors) but also high throughput. Microcalorimeter gamma spectrometers have not yet achieved detection efficiency and count rate capability comparable to germanium detectors, largely because of limits from existing readout technology. Microcalorimeter detectors must be operated at low temperature to achieve their exceptional energy resolution. Although the typical 100 mK operating temperatures can be achieved with reliable, cryogen-free systems, the cryogenic complexity and heat load from individual readout channels for large sensor arrays is prohibitive. Multiplexing is required for practical systems. The most mature multiplexing technology at present is time-division multiplexing (TDM) [3, 5-6]. In TDM, the sensor outputs are switched by applying bias current to one SQUID amplifier at a time. Transition-edge sensor (TES) microcalorimeter arrays as large as 256 pixels have been developed for X-ray and gamma-ray spectroscopy using TDM technology. Due to bandwidth limits and noise scaling, TDM is limited to a maximum multiplexing factor of approximately 32-40 sensors on one readout line [8]. Increasing the size of microcalorimeter arrays above the kilopixel scale, as required to match the throughput of germanium detectors, requires the development of a new readout technology with a much higher multiplexing factor.
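The quoted multiplexing ceiling is, at heart, a bandwidth budget. A toy calculation (all numbers hypothetical, chosen only to land in the quoted 32-40 range; real TDM limits also involve SQUID noise scaling with the switching rate):

```python
def max_mux_factor(readout_bandwidth_hz, sensor_bandwidth_hz, noise_margin):
    """Illustrative bandwidth budget: how many sensors fit on one readout
    line when each needs sensor_bandwidth_hz and switched (TDM) sampling
    costs an extra noise_margin factor of bandwidth per channel."""
    return int(readout_bandwidth_hz / (sensor_bandwidth_hz * noise_margin))

# Hypothetical numbers, not measured values from this work
n = max_mux_factor(readout_bandwidth_hz=6e6,
                   sensor_bandwidth_hz=50e3,
                   noise_margin=3.0)
```

Raising the multiplexing factor by an order of magnitude therefore requires either far more readout bandwidth per line or a scheme whose noise penalty does not grow with channel count, which motivates the microwave readout assessed in this report.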
Development and Application of a Process-based River System Model at a Continental Scale
NASA Astrophysics Data System (ADS)
Kim, S. S. H.; Dutta, D.; Vaze, J.; Hughes, J. D.; Yang, A.; Teng, J.
2014-12-01
Existing global and continental-scale river models, mainly designed for integration with global climate models, have very coarse spatial resolutions and lack many important hydrological processes, such as overbank flow, irrigation diversion, and groundwater seepage/recharge, which operate at much finer resolution. Thus, these models are not suitable for producing streamflow forecasts at fine spatial resolution and water accounts at sub-catchment levels, which are important for water resources planning and management at regional and national scales. A large-scale river system model has been developed and implemented for water accounting in Australia as part of the Water Information Research and Development Alliance between Australia's Bureau of Meteorology (BoM) and CSIRO. The model, developed using a node-link architecture, includes all major hydrological processes, anthropogenic water utilisation, and storage routing that influence streamflow in both regulated and unregulated river systems. It includes an irrigation model to compute water diversion for irrigation use and the associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and the associated floodplain fluxes and stores. An auto-calibration tool has been built within the modelling system to automatically calibrate the model in large river systems using the Shuffled Complex Evolution optimiser and user-defined objective functions. The auto-calibration tool makes the model computationally efficient and practical for large basin applications. The model has been implemented in several large basins in Australia, including the Murray-Darling Basin, covering more than 2 million km2. The results of calibration and validation show highly satisfactory performance. The model has been operationalised in BoM for producing various fluxes and stores for national water accounting.
This paper introduces this newly developed river system model describing the conceptual hydrological framework, methods used for representing different hydrological processes in the model and the results and evaluation of the model performance. The operational implementation of the model for water accounting is discussed.
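A node-link river model of the kind described can be sketched, at its most basic, as a mass balance passed down a topologically ordered network; this toy version (not the CSIRO/BoM model) ignores routing delay, storage, and losses:

```python
def route(network, local_inflow, diversion):
    """Minimal node-link mass balance.
    network: dict mapping node -> downstream node (None at the outlet),
    listed in upstream-to-downstream (topological) order.
    Each node receives upstream flow plus local inflow, subtracts its
    diversion, and passes the remainder downstream."""
    inflow = {n: 0.0 for n in network}
    outflow = {}
    for node, downstream in network.items():
        q = inflow[node] + local_inflow.get(node, 0.0) - diversion.get(node, 0.0)
        outflow[node] = max(q, 0.0)  # no negative flows
        if downstream is not None:
            inflow[downstream] += outflow[node]
    return outflow

net = {"A": "C", "B": "C", "C": None}   # two headwaters joining at C
q = route(net,
          local_inflow={"A": 10.0, "B": 5.0, "C": 2.0},
          diversion={"C": 4.0})        # e.g. an irrigation offtake at C
```

The full model layers storage routing, irrigation demand, and floodplain exchange onto this same node-link skeleton.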
Towards a muon radiography of the Puy de Dôme
NASA Astrophysics Data System (ADS)
Cârloganu, C.; Niess, V.; Béné, S.; Busato, E.; Dupieux, P.; Fehr, F.; Gay, P.; Miallier, D.; Vulpescu, B.; Boivin, P.; Combaret, C.; Labazuy, P.; Laktineh, I.; Lénat, J.-F.; Mirabito, L.; Portal, A.
2013-02-01
High-energy (above a few hundred GeV) atmospheric muons are a natural probe for geophysical studies. They can travel through kilometres of rock, allowing for a radiography of the density distribution within large structures such as mountains or volcanoes. A collaboration between volcanologists, astroparticle and particle physicists, Tomuvol, was formed in 2009 to study tomographic muon imaging of volcanoes with high-resolution, large-scale tracking detectors. We report on two campaigns of measurements at the flank of the Puy de Dôme using glass resistive plate chambers (GRPCs) developed for particle physics within the CALICE collaboration.
Towards a muon radiography of the Puy de Dôme
NASA Astrophysics Data System (ADS)
Cârloganu, C.; Niess, V.; Béné, S.; Busato, E.; Dupieux, P.; Fehr, F.; Gay, P.; Miallier, D.; Vulpescu, B.; Boivin, P.; Combaret, C.; Labazuy, P.; Laktineh, I.; Lénat, J.-F.; Mirabito, L.; Portal, A.
2012-09-01
High energy (above 100 GeV) atmospheric muons are a natural probe for geophysical studies. They can travel through kilometres of rock allowing for a radiography of the density distribution within large structures, like mountains or volcanoes. A collaboration between volcanologists, astroparticle and particle physicists, TOMUVOL, was formed in 2009 to study tomographic muon imaging of volcanoes with high resolution, large scale tracking detectors. We report on two campaigns of measurements at the flank of the Puy de Dôme using Glass Resistive Plate Chambers (GRPCs) developed for Particle Physics, within the CALICE collaboration.
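At the core of density muon radiography is the relation between opacity (density integrated along a muon path, inferred from the measured flux attenuation) and path length: X = ρ̄ L. A minimal sketch with hypothetical numbers:

```python
def mean_density(opacity_g_cm2, path_length_m):
    """Average density (g/cm^3) along a muon path through the edifice,
    given the opacity X = integral of rho dl inferred from the measured
    muon flux attenuation, and the chord length from topography."""
    return opacity_g_cm2 / (path_length_m * 100.0)  # convert path to cm

# Hypothetical: opacity of 2.5e5 g/cm^2 across a 1 km chord
rho = mean_density(2.5e5, 1000.0)
```

A result near 2-3 g/cm^3 would be typical of volcanic rock; mapping such averages over many crossing directions is what yields the density radiography.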
NASA Astrophysics Data System (ADS)
Thomas, N.; Rueda, X.; Lambin, E.; Mendenhall, C. D.
2012-12-01
Large intact forested regions of the world are known to be critical to maintaining Earth's climate, ecosystem health, and human livelihoods. Remote sensing has been successfully implemented as a tool to monitor forest cover and landscape dynamics over broad regions. Much of this work has been done using coarse-resolution sensors such as AVHRR and MODIS in combination with moderate-resolution sensors, particularly Landsat. Finer-scale analysis of heterogeneous and fragmented landscapes is commonly performed with medium-resolution data and has had varying success depending on many factors, including the level of fragmentation, variability of land cover types, patch size, and image availability. Fine-scale tree cover in mixed agricultural areas can have a major impact on biodiversity and ecosystem sustainability but may often be inadequately captured with the global to regional (coarse-resolution and moderate-resolution) satellite sensors and processing techniques widely used to detect land use and land cover changes. This study investigates whether advanced remote sensing methods are able to assess and monitor percent tree canopy cover in spatially complex, human-dominated agricultural landscapes that prove challenging for traditional mapping techniques. Our study areas are in high-altitude, mixed agricultural coffee-growing regions in Costa Rica and the Colombian Andes. We applied Random Forests regression tree analysis to Landsat data along with additional spectral, environmental, and spatial variables to predict percent tree canopy cover at 30 m resolution. Image object-based texture, shape, and neighborhood metrics were generated at the Landsat scale using eCognition and included in the variable suite. Training and validation data were generated using high-resolution imagery from digital aerial photography at 1 m to 2.5 m resolution.
Our results are promising, with Pearson's correlation coefficients between observed and predicted percent tree canopy cover of 0.86 (Costa Rica) and 0.83 (Colombia). The tree cover mapping developed here supports two distinct projects on sustaining biodiversity and natural and human capital: in Costa Rica the tree canopy cover map is utilized to predict bird community composition, and in Colombia the mapping is performed for two time periods and used to assess the impact of coffee eco-certification programs on the landscape. This research identifies ways to leverage readily available, high-quality, and cost-free Landsat data or other medium-resolution satellite data sources in combination with high-resolution data, such as that frequently available through Google Earth, to monitor and support sustainability efforts in fragmented and heterogeneous landscapes.
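A schematic version of the regression setup, using scikit-learn's RandomForestRegressor on synthetic stand-ins for the Landsat-derived predictors and air-photo canopy-cover labels (the variable suite, sample sizes, and train/test split here are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
# Synthetic predictors standing in for spectral bands, texture, terrain, etc.
X = rng.uniform(size=(500, 6))
# Synthetic % canopy cover driven by two of the predictors, plus noise
y = np.clip(60 * X[:, 0] + 30 * X[:, 1] + rng.normal(0, 5, 500), 0, 100)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:400], y[:400])          # train on 400 samples
pred = model.predict(X[400:])        # hold out 100 for validation
r = np.corrcoef(pred, y[400:])[0, 1]  # Pearson's r, the statistic reported above
```

In the actual study the labels come from interpreted 1-2.5 m aerial photography and the predictors from Landsat plus eCognition object metrics.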
Temporal and spatial scaling impacts on extreme precipitation
NASA Astrophysics Data System (ADS)
Eggert, B.; Berg, P.; Haerter, J. O.; Jacob, D.; Moseley, C.
2015-01-01
Both in the current climate and in the light of climate change, understanding of the causes and risk of precipitation extremes is essential for protection of human life and adequate design of infrastructure. Precipitation extreme events depend qualitatively on the temporal and spatial scales at which they are measured, in part due to the distinct types of rain formation processes that dominate extremes at different scales. To capture these differences, we first filter large datasets of high-resolution radar measurements over Germany (5 min temporally and 1 km spatially) using synoptic cloud observations, to distinguish convective and stratiform rain events. In a second step, for each precipitation type, the observed data are aggregated over a sequence of time intervals and spatial areas. The resulting matrix allows a detailed investigation of the resolutions at which convective or stratiform events are expected to contribute most to the extremes. We analyze where the statistics of the two types differ and discuss at which resolutions transitions occur between dominance of either of the two precipitation types. We characterize the scales at which the convective or stratiform events will dominate the statistics. For both types, we further develop a mapping between pairs of spatially and temporally aggregated statistics. The resulting curve is relevant when deciding on data resolutions where statistical information in space and time is balanced. Our study may hence also serve as a practical guide for modelers, and for planning the space-time layout of measurement campaigns. We also describe a mapping between different pairs of resolutions, possibly relevant when working with mismatched model and observational resolutions, such as in statistical bias correction.
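The space-time aggregation matrix at the heart of this analysis can be sketched as follows, assuming a (time, y, x) radar array and simple averaging (synthetic data; the real analysis additionally separates convective from stratiform events using synoptic cloud observations):

```python
import numpy as np

def aggregate(field_t_y_x, t_fac, s_fac):
    """Aggregate a (time, y, x) precipitation array by averaging over
    t_fac consecutive time steps and s_fac x s_fac spatial blocks,
    trimming any remainder that does not fill a full block."""
    nt, ny, nx = field_t_y_x.shape
    a = field_t_y_x[: nt - nt % t_fac, : ny - ny % s_fac, : nx - nx % s_fac]
    a = a.reshape(-1, t_fac, a.shape[1] // s_fac, s_fac, a.shape[2] // s_fac, s_fac)
    return a.mean(axis=(1, 3, 5))

rng = np.random.default_rng(3)
radar = rng.gamma(0.2, 2.0, size=(288, 60, 60))     # one day of 5-min, 1-km data
hourly_10km = aggregate(radar, t_fac=12, s_fac=10)  # 24 x 6 x 6
p999 = np.quantile(hourly_10km, 0.999)              # an extreme-value statistic
```

Repeating this over a grid of (t_fac, s_fac) pairs and recomputing extreme quantiles at each yields the resolution matrix the abstract describes.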
Scale criticality in estimating ecosystem carbon dynamics
Zhao, Shuqing; Liu, Shuguang
2014-01-01
Scaling is central to ecology and Earth system sciences. However, the importance of scale (i.e. resolution and extent) for understanding carbon dynamics across scales is poorly understood and quantified. We simulated carbon dynamics under a wide range of combinations of resolution (nine spatial resolutions of 250 m, 500 m, 1 km, 2 km, 5 km, 10 km, 20 km, 50 km, and 100 km) and extent (57 geospatial extents ranging from 108 to 1 247 034 km2) in the southeastern United States to explore the existence of scale dependence of the simulated regional carbon balance. Results clearly show the existence of a critical threshold resolution for estimating carbon sequestration within a given extent and an error limit. Furthermore, an invariant power law scaling relationship was found between the critical resolution and the spatial extent, with the critical resolution proportional to A^n (where n is a constant and A is the extent). Scale criticality and the power law relationship might be driven by the power law probability distributions of land surface and ecological quantities, including disturbances at landscape to regional scales. The current overwhelming practices without considering scale criticality might have largely contributed to difficulties in balancing carbon budgets at regional and global scales.
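The reported power law, critical resolution proportional to A^n, is linear in log-log space, so the exponent can be recovered by a straight-line fit; a sketch on synthetic data with a known exponent (the extents and exponent below are invented, not the study's values):

```python
import numpy as np

# If r_c = c * A**n, then log(r_c) = n * log(A) + log(c): a line in log-log space.
A = np.array([1e2, 1e3, 1e4, 1e5, 1e6])  # hypothetical extents (km^2)
rc = 0.5 * A ** 0.25                      # synthetic critical resolutions, n = 0.25
n_fit, log_c = np.polyfit(np.log(A), np.log(rc), 1)  # slope recovers n
```

The same fit applied to the 57 extents in the study is what establishes the invariant scaling relationship.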
NASA Astrophysics Data System (ADS)
Rauser, F.
2013-12-01
We present results from the German BMBF initiative 'High Definition Cloud and Precipitation for advancing Climate Prediction' (HD(CP)2). This initiative addresses most of the problems discussed in this session in one unified approach: cloud physics, convection, boundary layer development, radiation, and subgrid variability are approached in one organizational framework. HD(CP)2 merges the observation and high-performance computing / model development communities to tackle a shared problem: how to improve the understanding of the most important subgrid-scale processes of cloud and precipitation physics, and how to utilize this knowledge for improved climate predictions. HD(CP)2 is a coordinated initiative to (i) realize, (ii) evaluate, and (iii) statistically characterize and exploit, for the purpose of both parameterization development and cloud/precipitation feedback analysis, ultra-high-resolution (100 m in the horizontal, 10-50 m in the vertical) regional hindcasts over time periods (3-15 y) and spatial scales (1000-1500 km) that are climatically meaningful. HD(CP)2 thus consists of three elements (the model development and simulations, their observational evaluation, and their exploitation/synthesis to advance cloud and precipitation prediction), and its first three-year phase started on 1 October 2012. As a central part of HD(CP)2, the HD(CP)2 Observational Prototype Experiment (HOPE) was carried out in spring 2013. In this campaign, high-resolution measurements with a multitude of instruments from all major centers in Germany were carried out in a limited domain, allowing for unprecedented resolution and precision in the observation of microphysical parameters, at a resolution that will allow for evaluation and improvement of ultra-high-resolution models.
At the same time, a local-area version of ICON, the new climate model of the Max Planck Institute and the German weather service, has been developed that allows LES-type simulations at high resolutions on limited domains. The advantage of modifying an existing, evolving climate model is to share insights from high-resolution runs directly with the large-scale modelers and to allow for easy intercomparison and evaluation later on. In this presentation, we will give a short overview of HD(CP)2, show results from the observation campaign HOPE and from LES simulations of the same domain and conditions, and discuss how these will lead to an improved understanding and evaluation background for the efforts to improve fast physics in our climate model.
Large eddy simulations of compressible magnetohydrodynamic turbulence
NASA Astrophysics Data System (ADS)
Grete, Philipp
2017-02-01
Supersonic, magnetohydrodynamic (MHD) turbulence is thought to play an important role in many processes - especially in astrophysics, where detailed three-dimensional observations are scarce. Simulations can partially fill this gap and help to understand these processes. However, direct simulations with realistic parameters are often not feasible. Consequently, large eddy simulations (LES) have emerged as a viable alternative. In LES the overall complexity is reduced by simulating only large and intermediate scales directly. The smallest scales, usually referred to as subgrid-scales (SGS), are introduced to the simulation by means of an SGS model. Thus, the overall quality of an LES with respect to properly accounting for small-scale physics crucially depends on the quality of the SGS model. While there has been a lot of successful research on SGS models in the hydrodynamic regime for decades, SGS modeling in MHD is a rather recent topic, in particular, in the compressible regime. In this thesis, we derive and validate a new nonlinear MHD SGS model that explicitly takes compressibility effects into account. A filter is used to separate the large and intermediate scales, and it is thought to mimic finite resolution effects. In the derivation, we use a deconvolution approach on the filter kernel. With this approach, we are able to derive nonlinear closures for all SGS terms in MHD: the turbulent Reynolds and Maxwell stresses, and the turbulent electromotive force (EMF). We validate the new closures both a priori and a posteriori. In the a priori tests, we use high-resolution reference data of stationary, homogeneous, isotropic MHD turbulence to compare exact SGS quantities against predictions by the closures. The comparison includes, for example, correlations of turbulent fluxes, the average dissipative behavior, and alignment of SGS vectors such as the EMF. 
In order to quantify the performance of the new nonlinear closure, this comparison is conducted from the subsonic (sonic Mach number Ms ≈ 0.2) to the highly supersonic (Ms ≈ 20) regime, and against other SGS closures. The latter include established closures of eddy-viscosity and scale-similarity type. In all tests and over the entire parameter space, we find that the proposed closures are (significantly) closer to the reference data than the other closures. In the a posteriori tests, we perform large eddy simulations of decaying, supersonic MHD turbulence with initial Ms ≈ 3. We implemented closures of all types, i.e. of eddy-viscosity, scale-similarity, and nonlinear type, as an SGS model and evaluated their performance in comparison to simulations without a model (and at higher resolution). We find that the models need to be calculated on a scale larger than the grid scale, e.g. by an explicit filter, to have any influence on the dynamics at all. Furthermore, we show that only the proposed nonlinear closure improves higher-order statistics.
High-Resolution Land Use and Land Cover Mapping
,
1999-01-01
As the Nation's population grows, quantifying, monitoring, and managing land use becomes increasingly important. The U.S. Geological Survey (USGS) has a long heritage of leadership and innovation in land use and land cover (LULC) mapping that has been the model both nationally and internationally for over 20 years. At present, the USGS is producing high-resolution LULC data for several watershed and urban areas within the United States. This high-resolution LULC mapping is part of an ongoing USGS Land Cover Characterization Program (LCCP). The four components of the LCCP are global (1:2,000,000-scale), national (1:100,000-scale), urban (1:24,000-scale), and special projects (various scales and time periods). Within the urban and special project components, the USGS Rocky Mountain Mapping Center (RMMC) is collecting historical as well as contemporary high-resolution LULC data. RMMC's high-resolution LULC mapping builds on the heritage and success of previous USGS LULC programs and provides LULC information to meet user requirements.
Climate Modeling: Ocean Cavities below Ice Shelves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petersen, Mark Roger
The Accelerated Climate Model for Energy (ACME), a new initiative by the U.S. Department of Energy, includes unstructured-mesh ocean, land-ice, and sea-ice components using the Model for Prediction Across Scales (MPAS) framework. The ability to run coupled high-resolution global simulations efficiently on large, high-performance computers is a priority for ACME. Sub-ice shelf ocean cavities are a significant new capability in ACME, and will be used to better understand how changing ocean temperature and currents influence glacial melting and retreat. These simulations take advantage of the horizontal variable-resolution mesh and adaptive vertical coordinate in MPAS-Ocean, in order to place high resolution below ice shelves and near grounding lines.
Local structure of scalar flux in turbulent passive scalar mixing
NASA Astrophysics Data System (ADS)
Konduri, Aditya; Donzis, Diego
2012-11-01
Understanding the properties of scalar flux is important in the study of turbulent mixing. Classical theories suggest that it mainly depends on the large-scale structures in the flow. Recent studies suggest that the mean scalar flux reaches an asymptotic value at high Peclet numbers, independent of the molecular transport properties of the fluid. A large DNS database of isotropic turbulence with passive scalars forced with a mean scalar gradient, with resolution up to 4096^3, is used to explore the structure of scalar flux based on the local topology of the flow. It is found that regions of small velocity gradients, where dissipation and enstrophy are small, constitute the main contribution to scalar flux. On the other hand, regions of very small scalar gradient (and scalar dissipation) become less important to the scalar flux at high Reynolds numbers. The scaling of the scalar flux spectra is also investigated. The k^(-7/3) scaling proposed by Lumley (1964) is observed at high Reynolds numbers, but collapse is not complete. A spectral bump similar to that in the velocity spectrum is observed close to the dissipative scales. A number of features, including the height of the bump, appear to reach an asymptotic value at high Schmidt number.
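One common way to test the Lumley k^(-7/3) prediction is to plot the compensated spectrum k^(7/3) F(k), which is flat wherever the scaling holds; a sketch on an idealized spectrum (synthetic, not the DNS data discussed above):

```python
import numpy as np

# If the scalar-flux spectrum follows F(k) ~ k**(-7/3) in the inertial range,
# the compensated spectrum k**(7/3) * F(k) is constant there.
k = np.arange(1, 1001, dtype=float)
F = 2.0 * k ** (-7.0 / 3.0)          # synthetic spectrum with the Lumley slope
compensated = k ** (7.0 / 3.0) * F   # flat (= 2.0) where the scaling holds
```

In real DNS spectra the compensated curve is flat only over a limited range, and deviations from flatness quantify the incomplete collapse noted above.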
NASA Astrophysics Data System (ADS)
Cropp, E. L.; Hazenberg, P.; Castro, C. L.; Demaria, E. M.
2017-12-01
In the southwestern US, the summertime North American Monsoon (NAM) provides about 60% of the region's annual precipitation. Recent research using high-resolution atmospheric model simulations and retrospective predictions has shown that since the 1950s, and more specifically in the last few decades, the mean daily precipitation in the southwestern U.S. during the NAM has followed a decreasing trend. Furthermore, days with more extreme precipitation have intensified. The current work focuses on the impact of these long-term changes on the observed small-scale spatial variability of intense precipitation. Since limited long-term high-resolution observational data exist to support such climatologically induced spatial changes in precipitation frequency and intensity, the current work utilizes observations from the USDA-ARS Walnut Gulch Experimental Watershed (WGEW) in southeastern Arizona. Within this 150 km^2 catchment, over 90 rain gauges have been installed since the 1950s, measuring at sub-hourly resolution. We have applied geospatial analyses and the kriging interpolation technique to identify long-term changes in the spatial and temporal correlation and anisotropy of intense precipitation. The observed results will be compared with the previously simulated model results, as well as related to large-scale variations in climate patterns, such as the El Niño Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO).
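A sketch of the kind of spatial-correlation analysis described: an isotropic empirical semivariogram on synthetic gauge data (the gauge layout, rainfall field, and distance bins below are invented; the real analysis uses kriging and examines anisotropy and its change over time):

```python
import numpy as np

def empirical_variogram(coords, values, bins):
    """Isotropic empirical semivariogram: half the mean squared difference
    between all station pairs, binned by separation distance."""
    n = len(values)
    i, j = np.triu_indices(n, k=1)                 # all unique pairs
    d = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq = 0.5 * (values[i] - values[j]) ** 2
    which = np.digitize(d, bins)
    return np.array([sq[which == b].mean() for b in range(1, len(bins))])

rng = np.random.default_rng(7)
xy = rng.uniform(0, 12, size=(90, 2))          # ~90 gauges across a ~12 km box
rain = 10 + xy[:, 0] + rng.normal(0, 1, 90)    # simple west-east trend + noise
gamma = empirical_variogram(xy, rain, bins=np.array([0.0, 3.0, 6.0, 9.0, 13.0]))
```

Because the synthetic field has spatial structure, the semivariance rises with separation distance; fitting a model to such curves is the step that feeds the kriging interpolation.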
Observational constraints on earthquake source scaling: Understanding the limits in resolution
Hough, S.E.
1996-01-01
I examine the resolution of the type of stress drop estimates that have been used to place observational constraints on the scaling of earthquake source processes. I first show that apparent stress and Brune stress drop are equivalent to within a constant given any source spectral decay between ω^-1.5 and ω^-3 (i.e., any plausible value), and so consistent scaling is expected for the two estimates. I then discuss the resolution and scaling of Brune stress drop estimates, in the context of empirical Green's function results from recent earthquake sequences, including the 1992 Joshua Tree, California, mainshock and its aftershocks. I show that no definitive scaling of stress drop with moment is revealed over the moment range 10^19-10^25; within this sequence, however, there is a tendency for moderate-sized (M 4-5) events to be characterized by high stress drops. However, well-resolved results for recent M > 6 events are inconsistent with any extrapolated stress increase with moment for the aftershocks. Focusing on corner frequency estimates for smaller (M < 3.5) events, I show that resolution is extremely limited even after empirical Green's function deconvolutions. A fundamental limitation to resolution is the paucity of good signal-to-noise at frequencies above 60 Hz, a limitation that will affect nearly all surficial recordings of ground motion in California and many other regions. Thus, while the best available observational results support a constant stress drop for moderate- to large-sized events, very little robust observational evidence exists to constrain the quantities that bear most critically on our understanding of source processes: stress drop values and stress drop scaling for small events.
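For reference, the standard Brune-model relations behind such stress drop estimates are the source radius r = 2.34β/(2πfc) and Δσ = 7M0/(16r³). A sketch in SI units with hypothetical event parameters (not values from this study):

```python
import math

def brune_stress_drop(m0_nm, fc_hz, beta_m_s=3500.0):
    """Brune-model stress drop (Pa) from seismic moment (N m) and corner
    frequency (Hz), via source radius r = 2.34 * beta / (2 * pi * fc)
    and delta_sigma = 7 * M0 / (16 * r**3)."""
    r = 2.34 * beta_m_s / (2.0 * math.pi * fc_hz)
    return 7.0 * m0_nm / (16.0 * r ** 3)

# Hypothetical moderate event: M0 = 1e15 N m (~M 4), fc = 2 Hz
dsigma_mpa = brune_stress_drop(1e15, 2.0) / 1e6
```

The cubic dependence on fc is what makes the estimates so sensitive to corner frequency resolution: the band limitation above 60 Hz noted in the abstract translates directly into large stress drop uncertainty for small events.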
Laser jetting of femto-liter metal droplets for high resolution 3D printed structures
NASA Astrophysics Data System (ADS)
Zenou, M.; Sa'Ar, A.; Kotler, Z.
2015-11-01
Laser induced forward transfer (LIFT) is employed in a special, high-accuracy jetting regime, by adequately matching the sub-nanosecond pulse duration to the metal donor layer thickness. Under such conditions, an effective solid nozzle is formed, providing stability and directionality to the femto-liter droplets, which are printed across a large gap in excess of 400 μm. We illustrate the wide applicability of this method by printing several 3D metal objects: first, very-high-aspect-ratio (A/R > 20), micron-scale copper pillars in various configurations, upright and arbitrarily bent, and then a micron-scale 3D object composed of gold and copper. Such a digital printing method could serve in the generation of complex, multi-material, micron-scale 3D materials and novel structures.
A unified large/small-scale dynamo in helical turbulence
NASA Astrophysics Data System (ADS)
Bhat, Pallavi; Subramanian, Kandaswamy; Brandenburg, Axel
2016-09-01
We use high-resolution direct numerical simulations (DNS) to show that helical turbulence can generate significant large-scale fields even in the presence of strong small-scale dynamo action. During the kinematic stage, the unified large/small-scale dynamo grows fields with a shape-invariant eigenfunction, with most power peaked at small scales or large k, as in Subramanian & Brandenburg. Nevertheless, the large-scale field can be clearly detected as an excess of power at small k in the negatively polarized component of the energy spectrum for a forcing with positively polarized waves. Its strength B̄, relative to the total rms field B_rms, decreases with increasing magnetic Reynolds number, Re_M. However, as the Lorentz force becomes important, the field generated by the unified dynamo orders itself by saturating on successively larger scales. The magnetic integral scale for the positively polarized waves, characterizing the small-scale field, increases significantly from the kinematic stage to saturation. This implies that the small-scale field becomes as coherent as possible for a given forcing scale, which averts the Re_M-dependent quenching of B̄/B_rms. These results are obtained for 1024^3 DNS with magnetic Prandtl numbers of Pr_M = 0.1 and 10. For Pr_M = 0.1, B̄/B_rms grows from about 0.04 to about 0.4 at saturation, aided in the final stages by helicity dissipation. For Pr_M = 10, B̄/B_rms grows from much less than 0.01 to values of the order of 0.2. Our results confirm that there is a unified large/small-scale dynamo in helical turbulence.
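The magnetic integral scale used above is a spectrum-weighted measure of the dominant field scale. A minimal sketch with two made-up model spectra (the spectral shapes are illustrative assumptions, not the paper's DNS output) shows how saturation at larger scales raises it:

```python
import numpy as np

def integral_scale(k, Ek):
    """Spectral integral scale, l ~ (∫ k^-1 E(k) dk) / (∫ E(k) dk).
    The k grid is uniform, so the dk factors cancel in the ratio."""
    return (Ek / k).sum() / Ek.sum()

k = np.linspace(1.0, 256.0, 2048)
# kinematic-like spectrum: power peaked at small scales (k ~ 16)
E_kin = k**4 * np.exp(-k / 4.0)
# saturated-like spectrum: power shifted to larger scales (k ~ 4)
E_sat = k**4 * np.exp(-k / 1.0)

l_kin = integral_scale(k, E_kin)
l_sat = integral_scale(k, E_sat)   # larger: field has ordered itself
```

The shift of spectral power toward small k on saturation is exactly what makes l_sat exceed l_kin here, mirroring the growth of the magnetic integral scale from the kinematic stage to saturation.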
National Laboratory for Advanced Scientific Visualization at UNAM - Mexico
NASA Astrophysics Data System (ADS)
Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo
2016-04-01
In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services spanning many areas related to scientific visualization, among them neuroanatomy, embryonic development, genome-related studies, geosciences, geography, physics, and mathematics. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the fully immersive 3D display system (the Cave), the high-resolution parallel visualization system (the Powerwall), and the high-resolution spherical display (the Earth Simulator). The entire visualization infrastructure is interconnected with a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large, 3.6 m-wide room with images projected on the front, left, and right walls as well as the floor. Specialized crystal-eyes LCD-shutter glasses provide strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head, and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6×4) high-resolution, ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale data visualization of geophysical, meteorological, climate, and ecology data. 
The HPCC ADA is a 1000+ computing-core system that offers parallel computing resources to applications requiring large quantities of memory as well as large, fast parallel storage. The entire system's temperature is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at the undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.
NASA Astrophysics Data System (ADS)
Dube, Timothy; Mutanga, Onisimo
2015-03-01
Aboveground biomass estimation is critical in understanding forest contributions to regional carbon cycles. Despite the successful application of high spatial and spectral resolution sensors in aboveground biomass (AGB) estimation, there are challenges related to high acquisition costs, small area coverage, multicollinearity and limited availability. These challenges hamper successful regional-scale AGB quantification. The aim of this study was to assess the utility of the newly launched medium-resolution multispectral Landsat 8 Operational Land Imager (OLI) dataset, with its large swath width, in quantifying AGB in a forest plantation. We applied different sets of spectral analysis (test I: spectral bands; test II: spectral vegetation indices; and test III: spectral bands + spectral vegetation indices) in testing the utility of Landsat 8 OLI using two non-parametric algorithms: stochastic gradient boosting and random forest ensembles. The results of the study show that the medium-resolution multispectral Landsat 8 OLI dataset provides better AGB estimates for Eucalyptus dunii, Eucalyptus grandis and Pinus taeda, especially when using the extracted spectral information together with the derived spectral vegetation indices. We also noted that incorporating the optimal subset of the most important selected medium-resolution multispectral Landsat 8 OLI bands improved AGB accuracies. We compared medium-resolution multispectral Landsat 8 OLI AGB estimates with Landsat 7 ETM+ estimates, and the latter yielded lower estimation accuracies. Overall, this study demonstrates the invaluable potential and strength of applying the relatively affordable and readily available newly launched medium-resolution Landsat 8 OLI dataset, with its large swath width (185 km), in precisely estimating AGB. This strength of the Landsat OLI dataset is crucial especially in sub-Saharan Africa, where high-resolution remote sensing data availability remains a challenge.
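The test II and test III predictor sets above combine raw bands with derived vegetation indices. A minimal sketch of that feature construction, using NDVI as a representative index (the reflectance values are made-up illustrative numbers, not the study's imagery):

```python
import numpy as np

# hypothetical Landsat 8 OLI surface reflectances for three pixels
# (band 4 = red, band 5 = near-infrared)
red = np.array([0.10, 0.08, 0.20])
nir = np.array([0.40, 0.35, 0.25])

# test II-style spectral vegetation index
ndvi = (nir - red) / (nir + red)

# test III-style predictor matrix: raw bands stacked with the derived index,
# which would then feed a gradient-boosting or random-forest regressor
X = np.column_stack([red, nir, ndvi])
```

Stacking bands with indices lets the ensemble learner exploit both absolute reflectance and band ratios, which is why the combined test III set tends to yield the better AGB estimates reported above.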
Spatial Downscaling of Alien Species Presences using Machine Learning
NASA Astrophysics Data System (ADS)
Daliakopoulos, Ioannis N.; Katsanevakis, Stelios; Moustakas, Aristides
2017-07-01
Large-scale, high-resolution data on alien species distributions are essential for spatially explicit assessments of their environmental and socio-economic impacts, and for management interventions aimed at mitigation. However, these data are often unavailable. This paper presents a method that relies on Random Forest (RF) models to distribute alien species presence counts onto a finer-resolution grid, thus achieving spatial downscaling. A sufficiently large number of RF models are trained using random subsets of the dataset as predictors, in a bootstrapping approach that accounts for the uncertainty introduced by the subset selection. The method is tested with an approximately 8×8 km² grid containing floral alien species presences and several climatic, habitat, and land-use covariates for the Mediterranean island of Crete, Greece. Alien species presence is aggregated at 16×16 km² and used as a predictor of presence at the original resolution, thus simulating spatial downscaling. Potential explanatory variables included habitat types, land cover richness, endemic species richness, soil type, temperature, precipitation, and freshwater availability. Uncertainty assessment of the spatial downscaling of alien species’ occurrences was also performed, and true/false presences and absences were quantified. The approach is promising for downscaling alien species datasets of larger spatial scale but coarse resolution, where the underlying environmental information is available at a finer resolution than the alien species data. Furthermore, the RF architecture allows tuning towards operationally optimal sensitivity and specificity, thus providing a decision-support tool for designing a resource-efficient alien species census.
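The bootstrapping scheme above can be sketched in a few lines. This is a dependency-free toy: a plain k-nearest-neighbour regressor stands in for the paper's Random Forest base learner, and the covariates and presence counts are synthetic, so only the subset-resampling/aggregation pattern is the point:

```python
import numpy as np

rng = np.random.default_rng(42)

def knn_predict(Xtr, ytr, Xte, k=5):
    """Plain k-nearest-neighbour regression (a stand-in for the
    Random Forest base learner, kept dependency-free)."""
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return ytr[idx].mean(axis=1)

def bootstrap_downscale(X, y, Xfine, n_models=50, frac=0.8):
    """Train many models on random subsets and aggregate, mirroring the
    bootstrapping used to quantify subset-selection uncertainty."""
    preds = []
    n = len(y)
    for _ in range(n_models):
        sub = rng.choice(n, size=int(frac * n), replace=False)
        preds.append(knn_predict(X[sub], y[sub], Xfine))
    preds = np.array(preds)
    return preds.mean(axis=0), preds.std(axis=0)  # estimate + uncertainty

# toy data: presence counts driven mostly by the first covariate
X = rng.uniform(0, 1, size=(200, 2))
y = 10 * X[:, 0] + rng.poisson(1.0, 200)
Xfine = rng.uniform(0, 1, size=(20, 2))          # finer-grid cells
mean, sd = bootstrap_downscale(X, y, Xfine)
```

The per-cell standard deviation across the ensemble is what supports the uncertainty assessment mentioned above, and thresholding the mean prediction is where sensitivity/specificity tuning enters.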
Multi-GNSS PPP-RTK: From Large- to Small-Scale Networks
Nadarajah, Nandakumaran; Wang, Kan; Choudhury, Mazher
2018-01-01
Precise point positioning (PPP) and its integer ambiguity resolution-enabled variant, PPP-RTK (real-time kinematic), can benefit enormously from the integration of multiple global navigation satellite systems (GNSS). In such a multi-GNSS landscape, the positioning convergence time is expected to be reduced considerably as compared to the one obtained by a single-GNSS setup. It is therefore the goal of the present contribution to provide numerical insights into the role taken by the multi-GNSS integration in delivering fast and high-precision positioning solutions (sub-decimeter and centimeter levels) using PPP-RTK. To that end, we employ the Curtin PPP-RTK platform and process data-sets of GPS, BeiDou Navigation Satellite System (BDS) and Galileo in stand-alone and combined forms. The data-sets are collected by various receiver types, ranging from high-end multi-frequency geodetic receivers to low-cost single-frequency mass-market receivers. The corresponding stations form a large-scale (Australia-wide) network as well as a small-scale network with inter-station distances less than 30 km. In the case of the Australia-wide GPS-only ambiguity-float setup, 90% of the horizontal positioning errors (kinematic mode) are shown to become less than five centimeters after 103 min. The required time is reduced to 66 min for the corresponding GPS + BDS + Galileo setup. The time is further reduced to 15 min by applying single-receiver ambiguity resolution. The outcomes are supported by the positioning results of the small-scale network. PMID:29614040
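The convergence-time figures quoted above (e.g. 90% of horizontal errors below 5 cm after 103 min) come from a percentile-over-sessions statistic. A minimal sketch of that computation on a synthetic error ensemble (the decay model and numbers are assumptions for illustration, not the Curtin platform's output):

```python
import numpy as np

def convergence_time(errors_m, times_min, thresh_m=0.05, pctl=90):
    """First epoch at which the given percentile of horizontal errors
    (taken across sessions) drops below the threshold and stays below it."""
    p = np.percentile(errors_m, pctl, axis=0)      # per-epoch percentile
    below = p < thresh_m
    # count of above-threshold epochs from each index to the end
    tail_bad = np.cumsum((~below)[::-1])[::-1]
    ok = np.flatnonzero(below & (tail_bad == 0))
    return times_min[ok[0]] if ok.size else None

# toy ensemble: 20 sessions whose errors decay exponentially at varied levels
times = np.arange(0, 120)                          # minutes since start
scales = np.linspace(0.5, 1.5, 20)[:, None]        # per-session spread
errors = 0.5 * scales * np.exp(-times / 30.0)      # horizontal error, metres
tc = convergence_time(errors, times)
```

Re-running the same statistic on multi-GNSS or ambiguity-fixed solutions, whose error series decay faster, is what yields the shorter 66 min and 15 min convergence times reported above.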
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herrmann, A M; Ritz, K; Nunan, N
Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, and in material science, biology, geology, and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the fields of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference, and quantification of the isotopes of interest) are discussed. Adequate sample preparation to avoid biases in the interpretation of NanoSIMS data due to artifacts, and identification of regions of interest, are among the chief concerns in using NanoSIMS as a new tool in biogeochemistry and soil ecology. 
Finally, we review the areas of research most likely to benefit from the high resolving power attainable with this new approach.
NASA Astrophysics Data System (ADS)
Grandin, Robert John
Safely using materials in high-performance applications requires adequately understanding the mechanisms which control the nucleation and evolution of damage. Most of a material's operational life is spent in a state with noncritical damage, and, for example in metals, only a small portion of its life falls within the classical Paris Law regime of crack growth. Developing proper structural health and prognosis models requires understanding the behavior of damage in these early stages of the material's life, and this early-stage damage occurs on length scales at which the material may be considered "granular" in the sense that the discrete regions which comprise the whole are large enough to require special consideration. Material performance depends upon the characteristics of the granules themselves as well as the interfaces between granules. As a result, properly studying early-stage damage in complex, granular materials requires a means to characterize changes in the granules and interfaces. The granular scale can range from tenths of microns in ceramics, to single microns in fiber-reinforced composites, to tens of millimeters in concrete. The difficulty of direct study is often overcome by exhaustive testing of macro-scale damage caused by gross material loads and abuse. Such testing, for example by optical or electron microscopy, is destructive and, furthermore, costly when used to study the evolution of damage within a material, and it often limits the study to a few snapshots. New developments in high-resolution computed tomography (HRCT) provide the necessary spatial resolution to directly image the granule length scale of many materials. Successful application of HRCT with fiber-reinforced composites, however, requires extending the HRCT performance beyond current limits. 
This dissertation will discuss improvements made in the field of CT reconstruction which enable resolutions to be pushed to the point of being able to image the fiber-scale damage structures and the application of this new capability to the study of early-stage damage.
NASA Astrophysics Data System (ADS)
Matthews, L. D.; Crew, G. B.; Doeleman, S. S.; Lacasse, R.; Saez, A. F.; Alef, W.; Akiyama, K.; Amestica, R.; Anderson, J. M.; Barkats, D. A.; Baudry, A.; Broguière, D.; Escoffier, R.; Fish, V. L.; Greenberg, J.; Hecht, M. H.; Hiriart, R.; Hirota, A.; Honma, M.; Ho, P. T. P.; Impellizzeri, C. M. V.; Inoue, M.; Kohno, Y.; Lopez, B.; Martí-Vidal, I.; Messias, H.; Meyer-Zhao, Z.; Mora-Klein, M.; Nagar, N. M.; Nishioka, H.; Oyama, T.; Pankratius, V.; Perez, J.; Phillips, N.; Pradel, N.; Rottmann, H.; Roy, A. L.; Ruszczyk, C. A.; Shillue, B.; Suzuki, S.; Treacy, R.
2018-01-01
The Atacama Large Millimeter/submillimeter Array (ALMA) Phasing Project (APP) has developed and deployed the hardware and software necessary to coherently sum the signals of individual ALMA antennas and record the aggregate sum in Very Long Baseline Interferometry (VLBI) Data Exchange Format. These beamforming capabilities allow the ALMA array to collectively function as the equivalent of a single large aperture and participate in global VLBI arrays. The inclusion of phased ALMA in current VLBI networks operating at (sub)millimeter wavelengths provides an order of magnitude improvement in sensitivity, as well as enhancements in u–v coverage and north–south angular resolution. The availability of a phased ALMA enables a wide range of new ultra-high angular resolution science applications, including the resolution of supermassive black holes on event horizon scales and studies of the launch and collimation of astrophysical jets. It also provides a high-sensitivity aperture that may be used for investigations such as pulsar searches at high frequencies. This paper provides an overview of the ALMA Phasing System design, implementation, and performance characteristics.
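The sensitivity gain from beamforming comes from coherently (phase-aligned) summing antennas: the common signal adds in amplitude while independent receiver noise adds only in power. A toy numpy demonstration with an idealized common tone (the antenna count, tone, and noise levels are illustrative assumptions, not ALMA parameters):

```python
import numpy as np

rng = np.random.default_rng(7)
n_ant, n_samp = 40, 100_000
t = np.arange(n_samp)
carrier = np.cos(2 * np.pi * 0.01 * t)
signal = 0.2 * carrier                               # weak common sky signal
noise = rng.normal(0.0, 1.0, size=(n_ant, n_samp))   # independent receiver noise

single = signal + noise[0]                  # one antenna alone
phased = n_ant * signal + noise.sum(axis=0) # coherent, phase-aligned sum

def tone_snr(x):
    """Power SNR of the known tone: project onto the carrier, then compare
    tone power (amp^2 / 2) with the residual noise power."""
    amp = 2.0 * np.mean(x * carrier)
    resid = x - amp * carrier
    return amp ** 2 / (2.0 * resid.var())

gain = tone_snr(phased) / tone_snr(single)  # ~ n_ant in power SNR
```

The power-SNR gain of roughly n_ant is what lets a phased array of many dishes act as the equivalent of a single large aperture in a VLBI network.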
Srivastava, Rishi; Singh, Mohar; Bajaj, Deepak; Parida, Swarup K.
2016-01-01
Development and large-scale genotyping of user-friendly, informative genome/gene-derived InDel markers in natural and mapping populations is vital for accelerating genomics-assisted breeding applications of chickpea with minimal resource expense. The present investigation employed a high-throughput whole genome next-generation resequencing strategy in low and high pod number parental accessions and in the homozygous individuals constituting the bulks from each of two inter-specific mapping populations [(Pusa 1103 × ILWC 46) and (Pusa 256 × ILWC 46)] to develop non-erroneous InDel markers at a genome-wide scale. Comparing these high-quality genomic sequences, 82,360 InDel markers with reference to the kabuli genome, and 13,891 InDel markers exhibiting differentiation between the low and high pod number parental accessions and bulks of the aforementioned mapping populations, were developed. These informative markers were structurally and functionally annotated in diverse coding and non-coding sequence components of the genome/genes of kabuli chickpea. The functional significance of regulatory and coding (frameshift and large-effect mutation) InDel markers for establishing marker-trait linkages through association/genetic mapping was apparent. The markers detected greater amplification (97%) and intra-specific polymorphic potential (58–87%) among a diverse panel of cultivated desi, kabuli, and wild accessions, even using a simple, cost-efficient agarose gel-based assay, indicating their utility in large-scale genetic analysis, especially in domesticated chickpea with its narrow genetic base. Two high-density inter-specific genetic linkage maps generated using the aforesaid mapping populations were integrated to construct a consensus 1479 InDel markers-anchored high-resolution (inter-marker distance: 0.66 cM) genetic map for efficient molecular mapping of major QTLs governing pod number and seed yield per plant in chickpea. 
Utilizing these high-density genetic maps as anchors, three major genomic regions harboring robust QTLs for each of pod number and seed yield (15–28% phenotypic variation explained) were identified on chromosomes 2, 4, and 6. The integration of genetic and physical maps at these QTLs narrowed the long major QTL intervals into short, high-resolution physical intervals (0.89–2.94 Mb) for pod number and seed yield, which were validated in multiple genetic backgrounds of the two chickpea mapping populations. The genome-wide InDel markers, including natural allelic variants and genomic loci/genes delineated at the six major robust pod number and seed yield QTLs, especially the one colocalized novel congruent QTL mapped on a high-density consensus genetic map, were found most promising in chickpea. These functionally relevant molecular tags can drive marker-assisted genetic enhancement to develop high-yielding cultivars with increased seed/pod number and yield in chickpea. PMID:27695461
Enhanced mixing and spatial instability in concentrated bacterial suspensions
NASA Astrophysics Data System (ADS)
Sokolov, Andrey; Goldstein, Raymond E.; Feldchtein, Felix I.; Aranson, Igor S.
2009-09-01
High-resolution optical coherence tomography is used to study the onset of a large-scale convective motion in free-standing thin films of adjustable thickness containing suspensions of swimming aerobic bacteria. Clear evidence is found that beyond a threshold film thickness there exists a transition from quasi-two-dimensional collective swimming to three-dimensional turbulent behavior. The latter state, qualitatively different from bioconvection in dilute bacterial suspensions, is characterized by enhanced diffusivities of oxygen and bacteria. These results emphasize the impact of self-organized bacterial locomotion on the onset of three-dimensional dynamics, and suggest key ingredients necessary to extend standard models of bioconvection to incorporate effects of large-scale collective motion.
Simultaneous wall-shear-stress and wide-field PIV measurements in a turbulent boundary layer
NASA Astrophysics Data System (ADS)
Gomit, Guillaume; Fourrie, Gregoire; de Kat, Roeland; Ganapathisubramani, Bharathram
2015-11-01
Simultaneous particle image velocimetry (PIV) and hot-film shear-stress sensor measurements were performed to study the large-scale structures associated with shear-stress events in a flat-plate turbulent boundary layer at a high Reynolds number (Reτ ~ 4000). The PIV measurement was performed in a streamwise-wall-normal plane using an array of six high-resolution cameras (4 × 16 MP and 2 × 29 MP). The resulting field of view covers 8δ (where δ is the boundary layer thickness) in the streamwise direction and captures the entire boundary layer in the wall-normal direction. The spatial resolution of the measurement is approximately 70 wall units (1.8 mm), sampled every 35 wall units (0.9 mm). In association with the PIV setup, a spanwise array of 10 skin-friction sensors (spanning one δ) was used to capture the footprint of the large-scale structures. This combination of measurements allowed the analysis of three-dimensional conditional structures in the boundary layer. In particular, from conditional averages, the 3D organisation of the wall-normal and streamwise velocity components (u and v) and the Reynolds shear stress (-u'v') related to low and high shear-stress events can be extracted. European Research Council Grant No-277472-WBT.
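Wall units express lengths in multiples of the viscous length scale ν/u_τ, so a spacing of Δx⁺ = Δx·u_τ/ν. A quick consistency check of the quoted 70 wall units / 1.8 mm figure; the friction velocity and viscosity below are assumed illustrative values (not given in the abstract), chosen to be typical for such an air facility:

```python
# Convert the PIV vector spacing to viscous ("wall") units: dx+ = dx * u_tau / nu.
# u_tau and nu are illustrative assumptions, not values from the abstract.
nu = 1.5e-5            # kinematic viscosity of air, m^2/s
u_tau = 0.6            # friction velocity, m/s (assumed)
l_visc = nu / u_tau    # viscous length scale: 2.5e-5 m = 25 microns
dx = 1.8e-3            # PIV vector spacing, m (from the abstract)
dx_plus = dx / l_visc  # ~72 wall units, consistent with the reported ~70
```

The same conversion applied to δ and u_τ gives Reτ = δ·u_τ/ν, the friction Reynolds number quoted as ~4000.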
Prediction of Indian Summer-Monsoon Onset Variability: A Season in Advance.
Pradhan, Maheswar; Rao, A Suryachandra; Srivastava, Ankur; Dakate, Ashish; Salunke, Kiran; Shameera, K S
2017-10-27
Monsoon onset is an inherently transient phenomenon of the Indian Summer Monsoon, and it was never envisaged that this transience could be predicted at long lead times. Though onset is precipitous, its variability exhibits strong teleconnections with large-scale forcing such as ENSO and the IOD and hence may be predictable. Despite the tremendous skill achieved by state-of-the-art models in predicting such large-scale processes, model prediction of monsoon onset variability is still limited to just 2-3 weeks in advance. Using an objective definition of onset in a global coupled ocean-atmosphere model, it is shown that skillful prediction of onset variability is feasible within a seasonal prediction framework. Better representation of not only the large-scale processes but also the synoptic and intraseasonal features during the evolution of monsoon onset underlies the skillful simulation of monsoon onset variability. The changes observed in convection, tropospheric circulation, and moisture availability prior to and after onset are evident in the model simulations, which resulted in a high hit rate for early/delayed monsoon onset in the high-resolution model.
Ferrari, Renata; Marzinelli, Ezequiel M; Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F; Byrne, Maria; Malcolm, Hamish A; Williams, Stefan B; Steinberg, Peter D
2018-01-01
Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate 'no-take' and 'general-use' (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across the three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5-10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful to document changes in marine communities and identify adequate scales to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of the three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short- and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid-to-high taxonomic resolution may be necessary to determine zoning effects on key taxa. 
Third, we provide an example of statistical analyses and sampling design that once temporal sampling is incorporated will be useful to detect changes of marine benthic communities across multiple spatial and temporal scales.
Christopher Potter; Tan Pang-Ning; Vipin Kumar; Chris Kucharik; Steven Klooster; Vanessa Genovese; Warren Cohen; Sean Healey
2005-01-01
Ecosystem structure and function are strongly affected by disturbance events, many of which in North America are associated with seasonal temperature extremes, wildfires, and tropical storms. This study was conducted to evaluate patterns in a 19-year record of global satellite observations of vegetation phenology from the advanced very high resolution radiometer (AVHRR...
North Pacific Mesoscale Coupled Air-Ocean Simulations Compared with Observations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cerovecki, Ivana; McClean, Julie; Koracin, Darko
2014-11-14
The overall objective of this study was to improve the representation of regional ocean circulation in the North Pacific by using high-resolution atmospheric forcing that accurately represents mesoscale processes in an ocean-atmosphere regional (North Pacific) model configuration. The goal was to assess the importance of accurately representing mesoscale processes in the atmosphere and the ocean for the large-scale circulation. This is an important question, as mesoscale processes in the atmosphere that are resolved by high-resolution mesoscale atmospheric models such as the Weather Research and Forecasting (WRF) model are absent in commonly used atmospheric forcing such as the CORE forcing employed in, e.g., the Community Climate System Model (CCSM).
Duan, Xiaojie; Lieber, Charles M
2013-10-01
High spatiotemporal resolution interfaces between electrical sensors and biological systems, from single live cells to tissues, are crucial for many areas, including fundamental biophysical studies as well as medical monitoring and intervention. Herein, we summarize recent progress in the development and application of novel nanoscale devices for intracellular electrical recording of action potentials and the effort of merging electronic and biological systems seamlessly in three dimensions by using macroporous nanoelectronic scaffolds. The uniqueness of these nanoscale devices for minimally invasive, large-scale, high-spatial-resolution, and three-dimensional neural activity mapping is highlighted. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Dou, Hao; Sun, Xiao; Li, Bin; Deng, Qianqian; Yang, Xubo; Liu, Di; Tian, Jinwen
2018-03-01
Aircraft detection from very high resolution remote sensing images has gained increasing interest in recent years due to successful civil and military applications. However, several problems still exist: 1) extracting high-level features of aircraft; 2) locating objects within such large images, which is difficult and time-consuming; and 3) handling the multiple resolutions of satellite images. In this paper, inspired by biological visual mechanisms, a fusion detection framework is proposed that fuses a top-down visual mechanism (a deep CNN model) and a bottom-up visual mechanism (GBVS) to detect aircraft. In addition, we use a multi-scale training method for the deep CNN model to address the problem of multiple resolutions. Experimental results demonstrate that our method achieves better detection results than the other methods.
The spatial and temporal domains of modern ecology.
Estes, Lyndon; Elsen, Paul R; Treuer, Timothy; Ahmed, Labeeb; Caylor, Kelly; Chang, Jason; Choi, Jonathan J; Ellis, Erle C
2018-05-01
To understand ecological phenomena, it is necessary to observe their behaviour across multiple spatial and temporal scales. Since this need was first highlighted in the 1980s, technology has opened previously inaccessible scales to observation. To help to determine whether there have been corresponding changes in the scales observed by modern ecologists, we analysed the resolution, extent, interval and duration of observations (excluding experiments) in 348 studies that have been published between 2004 and 2014. We found that observational scales were generally narrow, because ecologists still primarily use conventional field techniques. In the spatial domain, most observations had resolutions ≤1 m² and extents ≤10,000 ha. In the temporal domain, most observations were either unreplicated or infrequently repeated (>1 month interval) and ≤1 year in duration. Compared with studies conducted before 2004, observational durations and resolutions appear largely unchanged, but intervals have become finer and extents larger. We also found a large gulf between the scales at which phenomena are actually observed and the scales those observations ostensibly represent, raising concerns about observational comprehensiveness. Furthermore, most studies did not clearly report scale, suggesting that it remains a minor concern. Ecologists can better understand the scales represented by observations by incorporating autocorrelation measures, while journals can promote attentiveness to scale by implementing scale-reporting standards.
NASA Astrophysics Data System (ADS)
Ramsdale, Jason D.; Balme, Matthew R.; Conway, Susan J.; Gallagher, Colman; van Gasselt, Stephan A.; Hauber, Ernst; Orgel, Csilla; Séjourné, Antoine; Skinner, James A.; Costard, Francois; Johnsson, Andreas; Losiak, Anna; Reiss, Dennis; Swirad, Zuzanna M.; Kereszturi, Akos; Smith, Isaac B.; Platz, Thomas
2017-06-01
The increased volume, spatial resolution, and areal coverage of high-resolution images of Mars over the past 15 years have led to an increased quantity and variety of small-scale landform identifications. Though many such landforms are too small to represent individually on regional-scale maps, determining their presence or absence across large areas helps form the observational basis for developing hypotheses on the geological nature and environmental history of a study area. The combination of improved spatial resolution and near-continuous coverage significantly increases the time required to analyse the data. This becomes problematic when attempting regional or global-scale studies of metre and decametre-scale landforms. Here, we describe an approach for mapping small features (from decimetre to kilometre scale) across large areas, formulated for a project to study the northern plains of Mars, and provide context on how this method was developed and how it can be implemented. Rather than ‘mapping’ with points and polygons, grid-based mapping uses a ‘tick box’ approach to efficiently record the locations of specific landforms (we use an example suite of glacial landforms, including viscous flow features, the latitude-dependent mantle and polygonised ground). A grid of squares (e.g. 20 km by 20 km) is created over the mapping area. Then the basemap data are systematically examined, grid-square by grid-square at full resolution, in order to identify the landforms while recording the presence or absence of selected landforms in each grid-square to determine spatial distributions. The result is a series of grids recording the distribution of all the mapped landforms across the study area. In some ways, these are equivalent to raster images, as they show a continuous distribution-field of the various landforms across a defined (rectangular, in most cases) area. When overlain on context maps, these form a coarse, digital landform map.
We find that grid-based mapping provides an efficient solution to the problems of mapping small landforms over large areas, by providing a consistent and standardised approach to spatial data collection. The simplicity of the grid-based mapping approach makes it extremely scalable and workable for group efforts, requiring minimal user experience and producing consistent and repeatable results. The discrete nature of the datasets, simplicity of approach, and divisibility of tasks, open up the possibility for citizen science in which crowdsourcing large grid-based mapping areas could be applied.
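The ‘tick box’ grid-mapping workflow described above can be sketched in code. This is a minimal illustration, not the authors' implementation; the grid-square size, landform names, and helper functions are hypothetical:

```python
from dataclasses import dataclass, field

GRID_KM = 20  # grid-square size in km (the paper's example: 20 km x 20 km)

@dataclass
class GridSquare:
    col: int
    row: int
    landforms: dict = field(default_factory=dict)  # landform name -> present?

def tick(grid, col, row, landform, present):
    """Record presence/absence of one landform in one grid square."""
    sq = grid.setdefault((col, row), GridSquare(col, row))
    sq.landforms[landform] = present

def distribution(grid, landform):
    """Return the grid squares where a landform was ticked present."""
    return {key for key, sq in grid.items() if sq.landforms.get(landform)}

# Examining two grid squares and ticking boxes for two landform types:
grid = {}
tick(grid, 0, 0, "viscous_flow_features", True)
tick(grid, 1, 0, "viscous_flow_features", False)
tick(grid, 1, 0, "polygonised_ground", True)
print(distribution(grid, "viscous_flow_features"))  # {(0, 0)}
```

Each per-landform distribution is effectively a coarse raster of presence/absence, which is what makes the results easy to overlay on context maps and to merge across many mappers.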
Multiresolution persistent homology for excessively large biomolecular datasets
NASA Astrophysics Data System (ADS)
Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei
2015-10-01
Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large scale datasets with appropriate resolution. We utilize flexibility-rigidity index to access the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed which would otherwise be inaccessible to the normal point cloud method and unreliable by using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to the protein domain classification, which is the first time that persistent homology is used for practical protein domain analysis, to our knowledge. The proposed multiresolution topological method has potential applications in arbitrary data sets, such as social networks, biological networks, and graphs.
Observations of Seafloor Roughness in a Tidally Modulated Inlet
NASA Astrophysics Data System (ADS)
Lippmann, T. C.; Hunt, J.
2014-12-01
The vertical structure of shallow water flows is influenced by the presence of a bottom boundary layer, which spans the water column for long period waves or mean flows. The nature of the boundary is determined in part by the roughness elements that make up the seafloor, and includes sometimes complex undulations associated with regular and irregular shaped bedforms whose scales range several orders of magnitude from orbital wave ripples (10⁻¹ m) to mega-ripples (10⁰ m) and even larger features (10¹-10³ m) such as sand waves, bars, and dunes. Modeling efforts often parameterize the effects of roughness elements on flow fields, depending on the complexity of the boundary layer formulations. The problem is exacerbated by the transient nature of bedforms and their large spatial extent and variability. This is particularly important in high flow areas with large sediment transport, such as tidally dominated sandy inlets like New River Inlet, NC. Quantification of small scale seafloor variability over large spatial areas requires the use of mobile platforms that can measure with fine scale (order cm) accuracy in wide swaths. The problem is difficult in shallow water where waves and currents are large, and water clarity is often limited. In this work, we present results from bathymetric surveys obtained with the Coastal Bathymetry Survey System, a personal watercraft equipped with an Imagenex multibeam acoustic echosounder and Applanix POS-MV 320 GPS-aided inertial measurement unit. This system is able to measure shallow water seafloor bathymetry and backscatter intensity with very fine scale (10⁻¹ m) resolution and over relatively large scales (10³ m) in the presence of high waves and currents. Wavenumber spectra show that the noise floor of the resolved multibeam bathymetry is on the order of 2.5 - 5 cm in amplitude, depending on water depths ranging 2 - 6 m, and about 30 cm in wavelength.
Seafloor roughness elements are estimated from wavenumber spectra across the inlet from bathymetric maps of the seafloor obtained with 10-25 cm horizontal resolution. Implications of the effects of the bottom variability on the vertical structure of the currents will be discussed. This work was supported by ONR and NOAA.
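Estimating roughness from wavenumber spectra, as described above, amounts to detrending a bathymetry transect and taking its power spectral density. A minimal 1-D sketch (the authors' processing is surely more elaborate; the synthetic 10 m bedform wavelength and normalization are illustrative assumptions):

```python
import numpy as np

def wavenumber_spectrum(z, dx):
    """One-sided power spectral density of a detrended 1-D bathymetry transect.
    z: depths (m) sampled every dx metres; returns wavenumbers (cpm) and PSD."""
    idx = np.arange(z.size)
    z = z - np.polyval(np.polyfit(idx, z, 1), idx)  # remove linear trend
    Z = np.fft.rfft(z)
    k = np.fft.rfftfreq(z.size, d=dx)               # cycles per metre
    psd = 2.0 * np.abs(Z)**2 * dx / z.size          # approximate one-sided PSD
    return k, psd

# Synthetic transect: ~10 m wavelength bedforms plus small measurement noise,
# sampled at 10 cm spacing over 200 m.
dx = 0.1
x = np.arange(0.0, 200.0, dx)
rng = np.random.default_rng(0)
z = 0.3 * np.sin(2 * np.pi * x / 10.0) + 0.01 * rng.standard_normal(x.size)
k, psd = wavenumber_spectrum(z, dx)
k_peak = k[np.argmax(psd[1:]) + 1]  # skip the DC bin
print(k_peak)  # ≈ 0.1 cpm, i.e. the 10 m bedform wavelength
```

The spectral peak recovers the dominant roughness wavelength; in practice the noise floor of the sonar (cited above as 2.5-5 cm in amplitude) sets the smallest resolvable bedform scale.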
Joseph St. Peter; John Hogland; Nathaniel Anderson; Jason Drake; Paul Medley
2018-01-01
Land cover classification provides valuable information for prioritizing management and conservation operations across large landscapes. Current regional scale land cover geospatial products within the United States have a spatial resolution that is too coarse to provide the necessary information for operations at the local and project scales. This paper describes a...
Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; ...
2015-06-19
Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60-hour case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in-situ measurements from the RACORO field campaign and remote-sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, κ, are derived from observations to be ~0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing datasets are derived from the ARM variational analysis, ECMWF forecasts, and a multi-scale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in 'trial' large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. The cases developed are available to the general modeling community for studying continental boundary layer clouds.
USGS Releases New Digital Aerial Products
,
2005-01-01
The U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) has initiated distribution of digital aerial photographic products produced by scanning or digitizing film from its historical aerial photography film archive. This archive, located in Sioux Falls, South Dakota, contains thousands of rolls of film that contain more than 8 million frames of historic aerial photographs. The largest portion of this archive consists of original film acquired by Federal agencies from the 1930s through the 1970s to produce 1:24,000-scale USGS topographic quadrangle maps. Most of this photography is reasonably large scale (USGS photography ranges from 1:8,000 to 1:80,000) to support the production of the maps. Two digital products are currently available for ordering: high-resolution scanned products and medium-resolution digitized products.
Development of dynamic kinetic resolution on large scale for (±)-1-phenylethylamine.
Thalén, Lisa K; Bäckvall, Jan-E
2010-09-13
Candida antarctica lipase B (CALB) and racemization catalyst 4 were combined in the dynamic kinetic resolution (DKR) of (±)-1-phenylethylamine (1). Several reaction parameters have been investigated to modify the method for application on multigram scale. A comparison of isopropyl acetate and alkyl methoxyacetates as acyl donors was carried out. It was found that lower catalyst loadings could be used to obtain (R)-2-methoxy-N-(1-phenylethyl)acetamide (3) in good yield and high ee when alkyl methoxyacetates were used as acyl donors compared to when isopropyl acetate was used as the acyl donor. The catalyst loading could be decreased to 1.25 mol % Ru-catalyst 4 and 10 mg CALB per mmol 1 when alkyl methoxyacetates were used as the acyl donor.
ALMA Observations of a Quiescent Molecular Cloud in the Large Magellanic Cloud
NASA Astrophysics Data System (ADS)
Wong, Tony; Hughes, Annie; Tokuda, Kazuki; Indebetouw, Rémy; Bernard, Jean-Philippe; Onishi, Toshikazu; Wojciechowski, Evan; Bandurski, Jeffrey B.; Kawamura, Akiko; Roman-Duval, Julia; Cao, Yixian; Chen, C.-H. Rosie; Chu, You-hua; Cui, Chaoyue; Fukui, Yasuo; Montier, Ludovic; Muller, Erik; Ott, Juergen; Paradis, Deborah; Pineda, Jorge L.; Rosolowsky, Erik; Sewiło, Marta
2017-12-01
We present high-resolution (subparsec) observations of a giant molecular cloud in the nearest star-forming galaxy, the Large Magellanic Cloud. ALMA Band 6 observations trace the bulk of the molecular gas in 12CO(2-1) and the high column density regions in 13CO(2-1). Our target is a quiescent cloud (PGCC G282.98-32.40, which we refer to as the “Planck cold cloud” or PCC) in the southern outskirts of the galaxy where star formation activity is very low and largely confined to one location. We decompose the cloud into structures using a dendrogram and apply an identical analysis to matched-resolution cubes of the 30 Doradus molecular cloud (located near intense star formation) for comparison. Structures in the PCC exhibit roughly 10 times lower surface density and five times lower velocity dispersion than comparably sized structures in 30 Dor, underscoring the non-universality of molecular cloud properties. In both clouds, structures with relatively higher surface density lie closer to simple virial equilibrium, whereas lower surface-density structures tend to exhibit supervirial line widths. In the PCC, relatively high line widths are found in the vicinity of an infrared source whose properties are consistent with a luminous young stellar object. More generally, we find that the smallest resolved structures (“leaves”) of the dendrogram span close to the full range of line widths observed across all scales. As a result, while the bulk of the kinetic energy is found on the largest scales, the small-scale energetics tend to be dominated by only a few structures, leading to substantial scatter in observed size-line-width relationships.
Chung, Ji Ryang; Sung, Chul; Mayerich, David; Kwon, Jaerock; Miller, Daniel E.; Huffman, Todd; Keyser, John; Abbott, Louise C.; Choe, Yoonsuck
2011-01-01
Connectomics is the study of the full connection matrix of the brain. Recent advances in high-throughput, high-resolution 3D microscopy methods have enabled the imaging of whole small animal brains at a sub-micrometer resolution, potentially opening the road to full-blown connectomics research. One of the first such instruments to achieve whole-brain-scale imaging at sub-micrometer resolution is the Knife-Edge Scanning Microscope (KESM). KESM whole-brain data sets now include Golgi (neuronal circuits), Nissl (soma distribution), and India ink (vascular networks). KESM data can contribute greatly to connectomics research, since they fill the gap between lower resolution, large volume imaging methods (such as diffusion MRI) and higher resolution, small volume methods (e.g., serial sectioning electron microscopy). Furthermore, KESM data are by their nature multiscale, ranging from the subcellular to the whole organ scale. Due to this, visualization alone is a huge challenge, before we even start worrying about quantitative connectivity analysis. To solve this issue, we developed a web-based neuroinformatics framework for efficient visualization and analysis of the multiscale KESM data sets. In this paper, we will first provide an overview of KESM, then discuss in detail the KESM data sets and the web-based neuroinformatics framework, which is called the KESM brain atlas (KESMBA). Finally, we will discuss the relevance of the KESMBA to connectomics research, and identify challenges and future directions. PMID:22275895
Graphics Processing Unit (GPU) Acceleration of the Goddard Earth Observing System Atmospheric Model
NASA Technical Reports Server (NTRS)
Putnam, Williama
2011-01-01
The Goddard Earth Observing System 5 (GEOS-5) is the atmospheric model used by the Global Modeling and Assimilation Office (GMAO) for a variety of applications, from long-term climate prediction at relatively coarse resolution, to data assimilation and numerical weather prediction, to very high-resolution cloud-resolving simulations. GEOS-5 is being ported to a graphics processing unit (GPU) cluster at the NASA Center for Climate Simulation (NCCS). By utilizing GPU co-processor technology, we expect to increase the throughput of GEOS-5 by at least an order of magnitude, and accelerate the process of scientific exploration across all scales of global modeling, including: large-scale, high-end, non-hydrostatic, global, cloud-resolving modeling at 10- to 1-kilometer (km) global resolutions; intermediate-resolution seasonal climate and weather prediction at 50- to 25-km resolution on small clusters of GPUs; and long-range, coarse-resolution climate modeling, enabled on a small box of GPUs for the individual researcher. After being ported to the GPU cluster, the primary physics components and the dynamical core of GEOS-5 have demonstrated a potential speedup of 15-40 times over conventional processor cores. Performance improvements of this magnitude reduce the required scalability of 1-km, global, cloud-resolving models from an unfathomable 6 million cores to an attainable 200,000 GPU-enabled cores.
Distant Influence of Kuroshio Eddies on North Pacific Weather Patterns?
Ma, Xiaohui; Chang, Ping; Saravanan, R.; Montuoro, Raffaele; Hsieh, Jen-Shan; Wu, Dexing; Lin, Xiaopei; Wu, Lixin; Jing, Zhao
2015-01-01
High-resolution satellite measurements of surface winds and sea-surface temperature (SST) reveal strong coupling between meso-scale ocean eddies and near-surface atmospheric flow over eddy-rich oceanic regions, such as the Kuroshio and Gulf Stream, highlighting the importance of meso-scale oceanic features in forcing the atmospheric planetary boundary layer (PBL). Here, we present high-resolution regional climate modeling results, supported by observational analyses, demonstrating that meso-scale SST variability, largely confined in the Kuroshio-Oyashio confluence region (KOCR), can further exert a significant distant influence on winter rainfall variability along the U.S. Northern Pacific coast. The presence of meso-scale SST anomalies enhances the diabatic conversion of latent heat energy to transient eddy energy, intensifying winter cyclogenesis via moist baroclinic instability, which in turn leads to an equivalent barotropic downstream anticyclone anomaly with reduced rainfall. The finding points to the potential of improving forecasts of extratropical winter cyclones and storm systems and projections of their response to future climate change, which are known to have major social and economic impacts, by improving the representation of ocean eddy–atmosphere interaction in forecast and climate models. PMID:26635077
NASA Astrophysics Data System (ADS)
Claessens, S. J.
2016-12-01
Mass density contrasts in the Earth's crust can be detected using an inversion of terrestrial or airborne gravity data. This contribution shows a technique to detect short-scale density contrasts using in-situ gravity observations in combination with a high-resolution global gravity model that includes variations in the gravity field due to topography. The technique is exemplified at various test sites using the Global Gravity Model Plus (GGMplus), which is a 7.2 arcsec resolution model of the Earth's gravitational field, covering all land masses and near-coastal areas within +/- 60° latitude. The model is a composite of GRACE and GOCE satellite observations, the EGM2008 global gravity model, and short-scale topographic gravity effects. Since variations in the Earth's gravity field due to topography are successfully modelled by GGMplus, any remaining differences with in-situ gravity observations are primarily due to mass density variations. It is shown that this technique effectively filters out large-scale density variations, and highlights short-scale near-surface density contrasts in the Earth's crust. Numerical results using recent high-density gravity surveys are presented, which indicate a strong correlation between density contrasts found and known lines of geological significance.
Automatic Matching of Large Scale Images and Terrestrial LIDAR Based on App Synergy of Mobile Phone
NASA Astrophysics Data System (ADS)
Xia, G.; Hu, C.
2018-04-01
The digitization of cultural heritage based on ground laser scanning technology has been widely applied. High-precision scanning and high-resolution photography of cultural relics are the main methods of data acquisition. Reconstruction from a complete point cloud and high-resolution images requires matching images to the point cloud, acquiring corresponding feature points, registering the data, etc. However, establishing the one-to-one correspondence between an image and its point cloud currently depends on inefficient manual search. The efficient classification and management of large numbers of images, and the matching of large-scale images to their corresponding point clouds, are therefore the focus of this research. In this paper, we propose automatic matching of large-scale images and terrestrial LiDAR data based on the synergy of a mobile-phone app. First, we develop an Android app that takes pictures and records the related classification information. Second, all the images are automatically grouped using the recorded information. Third, a matching algorithm is used to match the global and local images. Based on the one-to-one correspondence between the global image and the point-cloud reflection-intensity image, automatic matching of each image to its corresponding LiDAR point cloud is realized. Finally, the mapping relationships among the global image, the local images, and the intensity image are established from corresponding feature points. This allows us to build a data structure linking each global image, the local images within it, and their corresponding point clouds, and to support visual management and querying of the images.
NASA Astrophysics Data System (ADS)
Nyland, K.; Harwood, J. J.; Mukherjee, D.; Jagannathan, P.; Rujopakarn, W.; Emonts, B.; Alatalo, K.; Bicknell, G. V.; Davis, T. A.; Greene, J. E.; Kimball, A.; Lacy, M.; Lonsdale, Carol; Lonsdale, Colin; Maksym, W. P.; Molnár, D. C.; Morabito, L.; Murphy, E. J.; Patil, P.; Prandoni, I.; Sargent, M.; Vlahakis, C.
2018-05-01
Energetic feedback by active galactic nuclei (AGNs) plays an important evolutionary role in the regulation of star formation on galactic scales. However, the effects of this feedback as a function of redshift and galaxy properties such as mass, environment, and cold gas content remain poorly understood. The broad frequency coverage (1 to 116 GHz), high sensitivity (up to ten times higher than the Karl G. Jansky Very Large Array), and superb angular resolution (maximum baselines of at least a few hundred kilometers) of the proposed next-generation Very Large Array (ngVLA) are uniquely poised to revolutionize our understanding of AGNs and their role in galaxy evolution. Here, we provide an overview of the science related to AGN feedback that will be possible in the ngVLA era and present new continuum ngVLA imaging simulations of resolved radio jets spanning a wide range of intrinsic extents. We also consider key computational challenges and discuss exciting opportunities for multiwavelength synergy with other next-generation instruments, such as the Square Kilometer Array and the James Webb Space Telescope. The unique combination of high-resolution, large collecting area, and wide frequency range will enable significant advancements in our understanding of the effects of jet-driven feedback on sub-galactic scales, particularly for sources with extents of a few parsec to a few kiloparsec, such as young and/or lower-power radio AGNs, AGNs hosted by low-mass galaxies, radio jets that are interacting strongly with the interstellar medium of the host galaxy, and AGNs at high redshift.
Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.
2017-12-01
Current Earth-observation (EO) applications for image classification have to deal with an unprecedentedly large amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP, and FLEX will soon provide unprecedented data streams. Very high resolution (VHR) sensors like Worldview-3 also pose big challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the availability of the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, Cosmo-SkyMED, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large scale kernel methods for both atmospheric parameter retrieval and cloud detection using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and a high number of input features. The algorithms can cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation for temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
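One of the speed-up strategies named above, random Fourier features, replaces an exact kernel matrix with an explicit low-dimensional feature map. A generic sketch for the RBF kernel (the lengthscale, feature count, and test data here are illustrative, not tied to the IASI/Seviri pipelines):

```python
import numpy as np

def rff(X, n_features, lengthscale, rng):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 l^2)) ≈ phi(x) @ phi(y).
    Training on phi(X) costs O(n D^2) instead of the O(n^3) of exact GPs."""
    d = X.shape[1]
    W = rng.standard_normal((d, n_features)) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
Phi = rff(X, n_features=20000, lengthscale=1.0, rng=rng)
K_approx = Phi @ Phi.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :])**2).sum(-1))
err = np.max(np.abs(K_approx - K_exact))
print(err)  # Monte Carlo error shrinks like 1/sqrt(n_features); typically < 0.05 here
```

Because the features are explicit, any linear model (ridge regression, variational GP approximations) can then be trained on millions of instances without ever forming the full kernel matrix.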
The velocity characteristics of dusty filaments in the JCMT GBS clouds
NASA Astrophysics Data System (ADS)
Buckle, J. V.; Salji, C.; Richer, J. S.
2013-07-01
Large scale, high resolution spectral and continuum imaging maps have revealed, to an unprecedented extent, the characteristics of filamentary structure in star-forming molecular clouds, and their close association with star-forming cores. The filaments are associated with the formation of dense molecular cores where star formation occurs, and recent models highlight the important relationship between filaments and star-forming clusters. Velocity-coherent filaments have been proposed as the parent structures of star forming cores in Taurus. In Serpens, accretion flows along filaments have been proposed as the continuous source of mass for the star forming cluster. An evolutionary scenario for filaments based on velocity dispersion and column density measurements has recently been proposed, which we test with large scale molecular line and dust continuum maps. The JCMT Gould Belt Survey with SCUBA-2 and HARP provides dust continuum observations at 850 and 450 micron, and 12CO/13CO/C18O J=3-2 spectral line mapping of several nearby molecular clouds, covering large angular scales at high resolution. Velocities and linewidths of optically thin species, such as C18O which traces the warm, dense gas associated with star formation, are critical for an estimate of the virial stability of filamentary structures. The data and analyses that we present provide robust statistics over a large range of starless and protostellar evolutionary states. We present the velocity characteristics of dusty filaments in Orion, probing the physics at the boundary of filamentary structure and star formation. Using C18O, we investigate the internal structure of filaments, based on fragmentation and velocity coherence in the molecular line data. Through velocity dispersion measurements, we determine whether the filamentary structures are bound, and compare results between clouds of different star formation characteristics.
The use of imprecise processing to improve accuracy in weather & climate prediction
NASA Astrophysics Data System (ADS)
Düben, Peter D.; McNamara, Hugh; Palmer, T. N.
2014-08-01
The use of stochastic processing hardware and low precision arithmetic in atmospheric models is investigated. Stochastic processors allow hardware-induced faults in calculations, sacrificing bit-reproducibility and precision in exchange for improvements in performance and potentially accuracy of forecasts, due to a reduction in power consumption that could allow higher resolution. A similar trade-off is achieved using low precision arithmetic, with improvements in computation and communication speed and savings in storage and memory requirements. As high-performance computing becomes more massively parallel and power intensive, these two approaches may be important stepping stones in the pursuit of global cloud-resolving atmospheric modelling. The impact of both hardware induced faults and low precision arithmetic is tested using the Lorenz '96 model and the dynamical core of a global atmosphere model. In the Lorenz '96 model there is a natural scale separation; the spectral discretisation used in the dynamical core also allows large and small scale dynamics to be treated separately within the code. Such scale separation allows the impact of lower-accuracy arithmetic to be restricted to components close to the truncation scales and hence close to the necessarily inexact parametrised representations of unresolved processes. By contrast, the larger scales are calculated using high precision deterministic arithmetic. Hardware faults from stochastic processors are emulated using a bit-flip model with different fault rates. Our simulations show that both approaches to inexact calculations do not substantially affect the large scale behaviour, provided they are restricted to act only on smaller scales. By contrast, results from the Lorenz '96 simulations are superior when small scales are calculated on an emulated stochastic processor than when those small scales are parametrised. 
This suggests that inexact calculations at the small scale could reduce computation and power costs without adversely affecting the quality of the simulations. This would allow higher resolution models to be run at the same computational cost.
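The bit-flip emulation of stochastic processors can be sketched as follows. This is a minimal illustration, not the paper's exact configuration: the fault rate, the per-bit independence, and applying faults to a generic float64 array standing in for small-scale tendencies are all assumptions.

```python
import numpy as np

def flip_random_bits(x, fault_rate, rng):
    """Emulate a stochastic processor: flip each bit of a float64 array
    independently with probability `fault_rate` (illustrative fault model)."""
    bits = x.view(np.uint64).copy()
    for b in range(64):
        mask = rng.random(bits.shape) < fault_rate
        bits[mask] ^= np.uint64(1) << np.uint64(b)
    return bits.view(np.float64)

rng = np.random.default_rng(0)
small_scales = np.linspace(0.0, 1.0, 8)   # stand-in for small-scale variables
faulty = flip_random_bits(small_scales, fault_rate=1e-3, rng=rng)
```

In a Lorenz '96 setting, only the small-scale variables would pass through such a fault model at each time step, while the large scales remain exact and deterministic.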
Effect of small scale transport processes on phytoplankton distribution in coastal seas.
Hernández-Carrasco, Ismael; Orfila, Alejandro; Rossi, Vincent; Garçon, Veronique
2018-06-05
Coastal ocean ecosystems are major contributors to the global biogeochemical cycles and biological productivity. Physical factors induced by the turbulent flow play a crucial role in regulating marine ecosystems. However, while large-scale open-ocean dynamics is well described by geostrophy, the role of multiscale transport processes in coastal regions is still poorly understood due to the lack of continuous high-resolution observations. Here, the influence of small-scale dynamics (O(3.5-25) km, i.e. spanning upper submesoscale and mesoscale processes) on surface phytoplankton derived from satellite chlorophyll-a (Chl-a) is studied using Lagrangian metrics computed from High-Frequency Radar currents. The combination of complementary Lagrangian diagnostics, including the Lagrangian divergence along fluid trajectories, provides an improved description of the 3D flow geometry which facilitates the interpretation of two non-exclusive physical mechanisms affecting phytoplankton dynamics and patchiness. Attracting small-scale fronts, unveiled by backward Lagrangian Coherent Structures, are associated with negative divergence where particles and Chl-a standing stocks cluster. Filaments of positive divergence, representing large accumulated upward vertical velocities and suggesting accrued injection of subsurface nutrients, match areas with large Chl-a concentrations. Our findings demonstrate that an accurate characterization of small-scale transport processes is necessary to comprehend bio-physical interactions in coastal seas.
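The Lagrangian divergence diagnostic amounts to integrating the local horizontal divergence sampled along fluid trajectories. A minimal sketch, using a hypothetical analytic velocity field in place of HF-radar currents and simple forward-Euler advection (both assumptions for illustration):

```python
import numpy as np

def velocity(x, y):
    # Hypothetical analytic surface-current field (stand-in for HF radar data)
    return np.sin(x) * np.cos(y), np.cos(x) * np.sin(y)

def divergence(x, y, h=1e-5):
    # du/dx + dv/dy by centred finite differences
    u1, _ = velocity(x + h, y); u0, _ = velocity(x - h, y)
    _, v1 = velocity(x, y + h); _, v0 = velocity(x, y - h)
    return (u1 - u0) / (2 * h) + (v1 - v0) / (2 * h)

def lagrangian_divergence(x0, y0, dt=0.01, steps=200):
    """Advect a particle with forward Euler and accumulate the divergence
    sampled along its trajectory."""
    x, y, acc = x0, y0, 0.0
    for _ in range(steps):
        acc += divergence(x, y) * dt
        u, v = velocity(x, y)
        x, y = x + u * dt, y + v * dt
    return acc
```

Strongly negative accumulated values mark attracting regions where tracers (and hence Chl-a standing stocks) tend to cluster; positive filaments flag sustained surface divergence.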
Assessing sufficiency of thermal riverscapes for resilient ...
Resilient salmon populations require river networks that provide water temperature regimes sufficient to support a diversity of salmonid life histories across space and time. Efforts to protect, enhance and restore watershed thermal regimes for salmon may target specific locations and features within stream networks hypothesized to provide disproportionately high-value functional resilience to salmon populations. These include relatively small-scale features such as thermal refuges, and larger-scale features such as entire watersheds or aquifers that support thermal regimes buffered from local climatic conditions. Quantifying the value of both small- and large-scale thermal features to salmon populations has been challenged both by the difficulty of mapping thermal regimes at sufficient spatial and temporal resolutions and by the difficulty of integrating thermal regimes into population models. We attempt to address these challenges by using newly-available datasets and modeling approaches to link thermal regimes to salmon populations across scales. We will describe an individual-based modeling approach for assessing sufficiency of thermal refuges for migrating salmon and steelhead in large rivers, as well as a population modeling approach for assessing large-scale climate refugia for salmon in the Pacific Northwest. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec
NASA Astrophysics Data System (ADS)
Fernández, V.; Dietrich, D. E.; Haney, R. L.; Tintoré, J.
In situ and satellite data obtained during the last ten years have shown that the circulation in the Mediterranean Sea is extremely complex in space, with significant features ranging from mesoscale to sub-basin and basin scale, and highly variable in time, with mesoscale to seasonal and interannual signals. Moreover, the steep bottom topography and the atmospheric conditions that vary from one sub-basin to another cause the circulation to be composed of numerous energetic and narrow coastal currents, density fronts and mesoscale structures that interact at sub-basin scale with the large scale circulation. To simulate numerically and better understand these features, besides high grid resolution, an ocean model with low numerical dispersion and low physical dissipation is required. We present the results from a 1/8° horizontal resolution numerical simulation of the Mediterranean Sea using the DieCAST ocean model, which meets the above requirements since it is stable with low general dissipation and uses fourth-order-accurate approximations with low numerical dispersion. The simulations are carried out with climatological surface forcing using monthly mean winds and relaxation towards climatological values of temperature and salinity. The model reproduces the main features of the large basin scale circulation, as well as the seasonal variability of sub-basin scale currents that are well documented by observations in straits and channels. In addition, DieCAST brings out natural fronts and eddies that usually do not appear in numerical simulations of the Mediterranean and that lead to a natural interannual variability. The role of this intrinsic variability in the general circulation will be discussed.
On the use of high-resolution topographic data as a proxy for seismic site conditions (VS30)
Allen, T.I.; Wald, D.J.
2009-01-01
An alternative method has recently been proposed for evaluating global seismic site conditions, or the average shear velocity to 30 m depth (VS30), from the Shuttle Radar Topography Mission (SRTM) 30 arcsec digital elevation models (DEMs). The basic premise of the method is that topographic slope can be used as a reliable proxy for VS30 in the absence of geologically and geotechnically based site-condition maps, through correlations between VS30 measurements and topographic gradient. Here we evaluate the use of higher-resolution (3 and 9 arcsec) DEMs to examine whether we are able to resolve VS30 in more detail than can be achieved using the lower-resolution SRTM data. High-quality DEMs at resolutions finer than 30 arcsec are not uniformly available at the global scale. However, in many regions where such data exist, they may be employed to resolve finer-scale variations in topographic gradient and, consequently, VS30. We use the U.S. Geological Survey Earth Resources Observation and Science (EROS) Data Center's National Elevation Dataset (NED) to investigate the use of high-resolution DEMs for estimating VS30 in several regions across the United States, including the San Francisco Bay area in California; Los Angeles, California; and St. Louis, Missouri. We compare these results with an example from Taipei, Taiwan, that uses 9 arcsec SRTM data, which are globally available. The use of higher-resolution NED data recovers finer-scale variations in topographic gradient, which better correlate with geological and geomorphic features, in particular at the transition between hills and basins, warranting their use over 30 arcsec SRTM data where available. However, statistical analyses indicate little to no improvement over lower-resolution topography when compared to VS30 measurements, suggesting that some topographic smoothing may provide more stable VS30 estimates. 
Furthermore, we find that elevation variability in canopy-based SRTM measurements at resolutions finer than 30 arcsec is too large to resolve reliable slopes, particularly in low-gradient sedimentary basins.
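The slope-based proxy reduces to two steps: compute the topographic gradient from the DEM, then map slope ranges to VS30 classes. A minimal sketch; the lookup bins below are placeholder values for illustration, not the published Wald and Allen correlation coefficients:

```python
import numpy as np

def topographic_slope(dem, cell_size):
    """Slope magnitude (m/m) of a DEM grid via central differences."""
    dzdy, dzdx = np.gradient(dem, cell_size)
    return np.hypot(dzdx, dzdy)

# Placeholder slope (m/m) -> VS30 (m/s) bins, for illustration only.
SLOPE_BINS = [(0.0, 1e-4, 180.0), (1e-4, 2e-3, 240.0),
              (2e-3, 1e-2, 360.0), (1e-2, 1e-1, 490.0), (1e-1, np.inf, 620.0)]

def vs30_from_slope(slope):
    out = np.empty_like(slope)
    for lo, hi, vs in SLOPE_BINS:
        out[(slope >= lo) & (slope < hi)] = vs
    return out

dem = np.outer(np.arange(5.0), np.ones(5))   # plane rising 1 m per 10 m cell
vs30 = vs30_from_slope(topographic_slope(dem, cell_size=10.0))
```

The resolution question in the abstract enters through `cell_size` and the DEM grid itself: a finer grid resolves sharper gradients at hill-basin transitions, but also noisier slopes, which is why some smoothing can stabilise the VS30 estimate.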
Multilevel Cloud Structures above Svalbard
NASA Astrophysics Data System (ADS)
Dörnbrack, Andreas; Pitts, Michael; Poole, Lamont; Gisinger, Sonja; Maturilli, Marion
2017-04-01
The presentation focusses on the results recently published by the authors under the heading "picture of the month" in Monthly Weather Review. The presented picture of the month is a superposition of space-borne lidar observations and high-resolution temperature fields of the ECMWF Integrated Forecast System (IFS). It displays complex tropospheric and stratospheric clouds in the Arctic winter 2015/16. Near the end of December 2015, the unusual northeastward propagation of warm and humid subtropical air masses as far north as 80°N lifted the tropopause by more than 3 km in 24 h and cooled the stratosphere on a large scale. A widespread formation of thick cirrus clouds near the tropopause and of synoptic-scale polar stratospheric clouds (PSCs) occurred as the temperature dropped below the thresholds for the existence of cloud particles. Additionally, mountain waves were excited by the strong flow at the western edge of the ridge across Svalbard, leading to the formation of mesoscale ice PSCs. The most recent IFS cycle, using a horizontal resolution of 8 km globally, reproduces the large-scale and mesoscale flow features and leads to a remarkable agreement with the wave structure revealed by the space-borne observations.
Nanoposition sensors with superior linear response to position and unlimited travel ranges
NASA Astrophysics Data System (ADS)
Lee, Sheng-Chiang; Peters, Randall D.
2009-04-01
With the advancement in nanotechnology, the ability to position and measure at the subnanometer scale has become one of the most critical issues for the nanofabrication industry and for researchers using scanning probe microscopy. Commercial nanopositioners have achieved direct measurements at the scale of 0.01 nm with capacitive sensing metrology. However, the commercial sensors have small dynamic ranges (up to only a few hundred micrometers) and are relatively large in size (centimeters in the directions transverse to the motion), which is necessary for robust signal detection but makes them difficult to use on smaller devices. This limits applications in which large materials (on the scale of centimeters or greater) are handled with needs of subnanometer resolution. The past approach has been to combine fine and coarse translation stages with different dynamic ranges to simultaneously achieve long travel range and high spatial resolution. In this paper, we present a novel capacitive position-sensing metrology with an ultrawide dynamic range, from subnanometers to virtually any desired travel length for a translation stage. This sensor will greatly simplify the task and enhance the performance of direct metrology in a hybrid translation stage covering translation tasks from subnanometers to centimeters.
Coarse climate change projections for species living in a fine-scaled world.
Nadeau, Christopher P; Urban, Mark C; Bridle, Jon R
2017-01-01
Accurately predicting biological impacts of climate change is necessary to guide policy. However, the resolution of climate data could be affecting the accuracy of climate change impact assessments. Here, we review the spatial and temporal resolution of climate data used in impact assessments and demonstrate that these resolutions are often too coarse relative to biologically relevant scales. We then develop a framework that partitions climate into three important components: trend, variance, and autocorrelation. We apply this framework to map different global climate regimes and identify where coarse climate data is most and least likely to reduce the accuracy of impact assessments. We show that impact assessments for many large mammals and birds use climate data with a spatial resolution similar to the biologically relevant area encompassing population dynamics. Conversely, impact assessments for many small mammals, herpetofauna, and plants use climate data with a spatial resolution that is orders of magnitude larger than the area encompassing population dynamics. Most impact assessments also use climate data with a coarse temporal resolution. We suggest that climate data with a coarse spatial resolution is likely to reduce the accuracy of impact assessments the most in climates with high spatial trend and variance (e.g., much of western North and South America) and the least in climates with low spatial trend and variance (e.g., the Great Plains of the USA). Climate data with a coarse temporal resolution is likely to reduce the accuracy of impact assessments the most in the northern half of the northern hemisphere where temporal climatic variance is high. Our framework provides one way to identify where improving the resolution of climate data will have the largest impact on the accuracy of biological predictions under climate change. © 2016 John Wiley & Sons Ltd.
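The framework's three climate components can be estimated from a local time series roughly as follows. This is a sketch under stated assumptions: a linear fit as the trend, residual variance as the variance component, and lag-1 autocorrelation as the autocorrelation measure, which may differ from the paper's exact definitions.

```python
import numpy as np

def climate_components(series, dt=1.0):
    """Split a climate time series into linear trend (per unit time),
    residual variance, and lag-1 autocorrelation of the residuals."""
    t = np.arange(len(series)) * dt
    slope, intercept = np.polyfit(t, series, 1)
    resid = series - (slope * t + intercept)
    variance = float(resid.var())
    autocorr = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])
    return slope, variance, autocorr

rng = np.random.default_rng(1)
temps = 0.02 * np.arange(600) + rng.normal(0.0, 0.5, 600)  # synthetic warming
trend, var, r1 = climate_components(temps)
```

Applied to gridded climate data, mapping these three quantities per cell is what identifies the regimes where coarse-resolution inputs are most likely to mislead an impact assessment.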
NASA Astrophysics Data System (ADS)
Katavouta, Anna; Thompson, Keith
2017-04-01
A high resolution regional model (1/36 degree) of the Gulf of Maine, Scotian Shelf and adjacent deep ocean (GoMSS) is developed to downscale ocean conditions from an existing global operational system. First, predictions from the regional GoMSS model in a one-way nesting setup are evaluated using observations from multiple sources including satellite-borne sensors of surface temperature and sea level, CTDs, Argo floats and moored current meters. It is shown that on the shelf, the regional model predicts more realistic fields than the global system because it has higher resolution and includes tides that are absent from the global system. However, in deep water the regional model misplaces deep ocean eddies and meanders associated with the Gulf Stream. This is because of unrealistic internally generated variability (associated with the one-way nesting setup) that leads to decoupling of the regional model from the global system in deep water. To overcome this problem, the large scales (length scales > 90 km) of the regional model are spectrally nudged towards the global system fields. This leads to more realistic predictions off the shelf. Wavenumber spectra show that even though spectral nudging constrains the large scales, it does not suppress the variability on small scales; on the contrary, it favours the formation of eddies with length scales below the cut-off wavelength of the spectral nudging.
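Spectral nudging of this kind can be sketched for a doubly periodic 2D field: low-pass the model-minus-driver difference in wavenumber space and relax only the retained large scales. The square periodic grid, the sharp cutoff, and the relaxation coefficient `alpha` are simplifying assumptions, not the GoMSS implementation:

```python
import numpy as np

def spectral_nudge(field, driver, cutoff_km, dx_km, alpha):
    """Relax wavelengths longer than `cutoff_km` toward `driver`,
    leaving smaller scales untouched (assumes a periodic square grid)."""
    k = np.fft.fftfreq(field.shape[0], d=dx_km)          # cycles per km
    kx, ky = np.meshgrid(k, k, indexing="ij")
    keep = np.hypot(kx, ky) < 1.0 / cutoff_km            # large scales only
    diff_hat = np.where(keep, np.fft.fft2(driver - field), 0.0)
    return field + alpha * np.fft.ifft2(diff_hat).real

n, dx = 64, 5.0                                          # 320 km domain
x = np.arange(n) * dx
driver = np.cos(2 * np.pi * x / 160.0)[:, None] * np.ones(n)  # 160 km wave
nudged = spectral_nudge(np.zeros((n, n)), driver, cutoff_km=90.0,
                        dx_km=dx, alpha=1.0)
```

Because the filter zeroes all wavenumbers above the cutoff before the correction is applied, eddies smaller than 90 km evolve freely, consistent with the wavenumber-spectra result reported in the abstract.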
A framework for WRF to WRF-IBM grid nesting to enable multiscale simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiersema, David John; Lundquist, Katherine A.; Chow, Fotini Katapodes
With advances in computational power, mesoscale models, such as the Weather Research and Forecasting (WRF) model, are often pushed to higher resolutions. As the model’s horizontal resolution is refined, the maximum resolved terrain slope will increase. Because WRF uses a terrain-following coordinate, this increase in resolved terrain slopes introduces additional grid skewness. At high resolutions and over complex terrain, this grid skewness can introduce large numerical errors that require methods, such as the immersed boundary method, to keep the model accurate and stable. Our implementation of the immersed boundary method in the WRF model, WRF-IBM, has proven effective at microscale simulations over complex terrain. WRF-IBM uses a non-conforming grid that extends beneath the model’s terrain. Boundary conditions at the immersed boundary, the terrain, are enforced by introducing a body force term to the governing equations at points directly beneath the immersed boundary. Nesting between a WRF parent grid and a WRF-IBM child grid requires a new framework for initialization and forcing of the child WRF-IBM grid. This framework will enable concurrent multi-scale simulations within the WRF model, improving the accuracy of high-resolution simulations and enabling simulations across a wide range of scales.
NASA Astrophysics Data System (ADS)
Scher, C.; Tennant, C.; Larsen, L.; Bellugi, D. G.
2016-12-01
Advances in remote-sensing technology allow for cost-effective, accurate, high-resolution mapping of river-channel topography and shallow aquatic bathymetry over large spatial scales. A combination of near-infrared and green-spectrum airborne laser swath mapping was used to map river channel bathymetry and watershed geometry over 90+ river-kilometers (75-1175 km²) of the Greys River in Wyoming. The day-of-flight wetted channel was identified from green LiDAR returns, and more than 1800 valley-bottom cross-sections were extracted at regular 50-m intervals. The bankfull channel geometry was identified using a "watershed-based" algorithm that incrementally filled local minima to a "spill" point, thereby constraining areas of local convergence and delineating all the potential channels along the cross-section for each distinct "spill stage." Multiple potential channels in alluvial floodplains and the lack of clearly defined channel banks in bedrock reaches challenge identification of the bankfull channel based on topology alone. Here we combine a variety of topological measures, geometrical considerations, and stage levels to define a stage-dependent bankfull channel geometry, and compare the results with day-of-flight wetted channel data. Initial results suggest that channel hydraulic geometry and basin hydrology power-law scaling may not accurately capture downstream channel adjustments for rivers draining complex mountain topography.
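The "watershed-based" idea (fill local minima to a spill point and read off the flooded runs) can be illustrated on a single cross-section. This is a simplified sketch: it takes the spill stage as given rather than filling minima incrementally.

```python
def channels_at_stage(elev, stage):
    """Contiguous runs of cross-section points below a spill stage,
    i.e. the candidate channels at that stage."""
    channels, start = [], None
    for i, z in enumerate(elev):
        if z < stage and start is None:
            start = i
        elif z >= stage and start is not None:
            channels.append((start, i - 1))
            start = None
    if start is not None:
        channels.append((start, len(elev) - 1))
    return channels

# Two depressions separated by a bar at elevation 4.0
section = [5.0, 3.0, 2.0, 4.0, 3.0, 1.0, 2.0, 6.0]
```

Raising `stage` past the bar at 4.0 merges the two candidate channels into one, which is exactly the multiple-potential-channel ambiguity in alluvial floodplains that the abstract describes.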
The FRIGG project: From intermediate galactic scales to self-gravitating cores
NASA Astrophysics Data System (ADS)
Hennebelle, Patrick
2018-03-01
Context: Understanding the detailed structure of the interstellar gas is essential for our knowledge of the star formation process. Aims: The small-scale structure of the interstellar medium (ISM) is a direct consequence of the galactic scales, and making the link between the two is essential. Methods: We perform adaptive mesh simulations that aim to bridge the gap between the intermediate galactic scales and the self-gravitating prestellar cores. For this purpose we use stratified supernova-regulated ISM magneto-hydrodynamical simulations at the kpc scale to set up the initial conditions. We then zoom, performing a series of concentric uniform refinements and then refining on the Jeans length for the last levels. This allows us to reach a spatial resolution of a few 10⁻³ pc. The cores are identified using a clump finder and various criteria based on virial analysis. Their most relevant properties are computed and, owing to the large number of objects formed in the simulations, reliable statistics are obtained. Results: The core properties show encouraging agreement with observations. The mass spectrum presents a clear power law at high masses, with an exponent close to −1.3 and a peak at about 1-2 M⊙. The velocity dispersions and the angular momenta are, respectively, a few times the local sound speed and a few 10⁻² pc km s⁻¹. We also find that the thermally supercritical cores present a range of mass-to-flux over critical mass-to-flux ratios, typically between ≃0.3 and 3, indicating that they are significantly magnetized. Investigating the time and spatial dependence of these statistical properties, we conclude that they are not significantly affected by the zooming procedure and that they do not present very large fluctuations. The most severe issue appears to be the dependence of the core mass function (CMF) on the numerical resolution. While the core definition process may introduce some biases, the peak tends to shift to smaller values as the resolution improves. Conclusions: Our simulations, which use self-consistently generated initial conditions at the kpc scale, produce a large number of prestellar cores from which reliable statistics can be inferred. Preliminary comparisons with observations show encouraging agreement. In particular, the computed CMFs resemble those inferred from recent observations. We stress, however, a possible issue with the peak position shifting with numerical resolution.
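The magnetization criterion quoted in the abstract is conventionally expressed through the normalized mass-to-flux ratio; a standard form is (the exact normalization used in the paper may differ):

```latex
\mu \;=\; \frac{M/\Phi}{\left(M/\Phi\right)_{\mathrm{crit}}},
\qquad
\left(\frac{M}{\Phi}\right)_{\mathrm{crit}} \;\simeq\; \frac{1}{2\pi\sqrt{G}},
```

where M is the core mass and Φ the magnetic flux threading it. Values of μ between ≃0.3 and 3 therefore place the cores near magnetic criticality, i.e. the magnetic field is dynamically significant but not necessarily dominant.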
Realism of Indian Summer Monsoon Simulation in a Quarter Degree Global Climate Model
NASA Astrophysics Data System (ADS)
Salunke, P.; Mishra, S. K.; Sahany, S.; Gupta, K.
2017-12-01
This study assesses the fidelity of Indian Summer Monsoon (ISM) simulations using a global model at an ultra-high horizontal resolution (UHR) of 0.25°. The model used was the atmospheric component of the Community Earth System Model version 1.2.0 (CESM 1.2.0) developed at the National Center for Atmospheric Research (NCAR). Precipitation and temperature over the Indian region were analyzed for a wide range of space and time scales to evaluate the fidelity of the model under UHR, with special emphasis on the ISM simulations during June through September (JJAS). Comparing the UHR simulations with observed data from the India Meteorological Department (IMD) over Indian land, it was found that 0.25° resolution significantly improved spatial rainfall patterns over many regions, including the Western Ghats and the south-eastern peninsula, as compared to the standard model resolution. Convective and large-scale rainfall components were analyzed using the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis (ERA-I) data, and it was found that at 0.25° resolution there was an overall increase in the large-scale component and an associated decrease in the convective component of rainfall as compared to the standard model resolution. Analysis of the diurnal cycle of rainfall suggests a significant improvement in the phase characteristics simulated by the UHR model as compared to the standard model resolution. Analysis of the annual cycle of rainfall, however, failed to show any significant improvement in the UHR model as compared to the standard version. Surface temperature analysis showed small improvements in the UHR model simulations as compared to the standard version. Thus, one may conclude that there are some significant improvements in the ISM simulations using a 0.25° global model, although there is still plenty of scope for further improvement in certain aspects of the annual cycle of rainfall.
Extreme weather: Subtropical floods and tropical cyclones
NASA Astrophysics Data System (ADS)
Shaevitz, Daniel A.
Extreme weather events have a large effect on society. As such, it is important to understand these events and to project how they may change in a future, warmer climate. The aim of this thesis is to develop a deeper understanding of two types of extreme weather events: subtropical floods and tropical cyclones (TCs). In the subtropics, the latitude is high enough that quasi-geostrophic dynamics are at least qualitatively relevant, while low enough that moisture may be abundant and convection strong. Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. In the first part of this thesis, I examine the possible triggering of convection by the large-scale dynamics and investigate the coupling between the two. Specifically, two examples of extreme precipitation events in the subtropics are analyzed: the 2010 and 2014 floods of India and Pakistan and the 2015 flood of Texas and Oklahoma. I invert the quasi-geostrophic omega equation to decompose the large-scale vertical motion profile into components due to synoptic forcing and diabatic heating. Additionally, I present model results from within the Column Quasi-Geostrophic framework. A single-column model and a cloud-resolving model are forced with the large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation with input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. It is found that convection was triggered primarily by mechanically forced orographic ascent over the Himalayas during the India/Pakistan floods and by upper-level potential vorticity disturbances during the Texas/Oklahoma flood. 
Furthermore, a climate attribution analysis was conducted for the Texas/Oklahoma flood and it is found that anthropogenic climate change was responsible for a small amount of rainfall during the event but the intensity of this event may be greatly increased if it occurs in a future climate. In the second part of this thesis, I examine the ability of high-resolution global atmospheric models to simulate TCs. Specifically, I present an intercomparison of several models' ability to simulate the global characteristics of TCs in the current climate. This is a necessary first step before using these models to project future changes in TCs. Overall, the models were able to reproduce the geographic distribution of TCs reasonably well, with some of the models performing remarkably well. The intensity of TCs varied widely between the models, with some of this difference being due to model resolution.
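The quasi-geostrophic omega equation being inverted takes, in one standard pressure-coordinate form (notation follows common textbook convention and may differ in detail from the thesis):

```latex
\sigma \nabla^2 \omega + f_0^2 \frac{\partial^2 \omega}{\partial p^2}
= f_0 \frac{\partial}{\partial p}\left[\mathbf{V}_g \cdot \nabla\left(\zeta_g + f\right)\right]
+ \nabla^2\left[\mathbf{V}_g \cdot \nabla\left(-\frac{\partial \Phi}{\partial p}\right)\right]
- \frac{\kappa}{p}\nabla^2 J,
```

where σ is the static stability parameter, V_g the geostrophic wind, ζ_g the geostrophic relative vorticity, Φ the geopotential, and J the diabatic heating rate. Inverting with and without the J term is what separates the synoptically forced and diabatically forced parts of the vertical motion ω.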
Scherer, Laura; Venkatesh, Aranya; Karuppiah, Ramkumar; Pfister, Stephan
2015-04-21
Physical water scarcities can be described by water stress indices. These are often determined at an annual scale and a watershed level; however, such scales mask seasonal fluctuations and spatial heterogeneity within a watershed. In order to account for this level of detail, first and foremost, water availability estimates must be improved and refined. State-of-the-art global hydrological models such as WaterGAP and UNH/GRDC have previously been unable to reliably reflect water availability at the subbasin scale. In this study, the Soil and Water Assessment Tool (SWAT) was tested as an alternative to global models, using the case study of the Mississippi watershed. While SWAT clearly outperformed the global models at the scale of a large watershed, it was judged to be unsuitable for global scale simulations due to the high calibration efforts required. The results obtained in this study show that global assessments miss out on key aspects related to upstream/downstream relations and monthly fluctuations, which are important both for the characterization of water scarcity in the Mississippi watershed and for water footprints. Especially in arid regions, where scarcity is high, these models provide unsatisfying results.
Viking High-Resolution Topography and Mars '01 Site Selection: Application to the White Rock Area
NASA Astrophysics Data System (ADS)
Tanaka, K. L.; Kirk, Randolph L.; Mackinnon, D. J.; Howington-Kraus, E.
1999-06-01
Definition of the local topography of the Mars '01 Lander site is crucial for assessment of lander safety and rover trafficability. According to Golombek et al., (1) steep surface slopes may cause retro-rockets to fire too early or too late for a safe landing, (2) the landing-site slope must be < 1° to ensure lander stability, and (3) a nearly level site is better both for power generation by the lander and the rover and for rover trafficability. Presently available datasets are largely inadequate to determine surface slope at scales pertinent to landing-site issues. Ideally, a topographic model of the entire landing site at meter-scale resolution would permit the best assessment of the pertinent topographic issues. MOLA data, while providing highly accurate vertical measurements, are inadequate to address slopes along paths of less than several hundred meters, because of along-track data spacings of hundreds of meters and horizontal errors in positioning of 500 to 2000 m. The capability to produce stereotopography from MOC image pairs is not yet in hand, nor can we necessarily expect a suitable number of stereo image pairs to be acquired. However, for a limited number of sites, high-resolution Viking stereo imaging is available at tens of meters horizontal resolution, capable of covering landing-ellipse sized areas. Although we would not necessarily suggest that the chosen Mars '01 Lander site should be located where good Viking stereotopography is available, an assessment of typical surface slopes at these scales for a range of surface types may be quite valuable in landing-site selection. Thus this study has a two-fold application: (1) to support the proposal of White Rock as a candidate Mars '01 Lander site, and (2) to evaluate how Viking high-resolution stereotopography may be of value in the overall Mars '01 Lander site selection process.
An Outflow-shaped Magnetic Field Toward the Class 0 Protostellar Source Serpens SMM1
NASA Astrophysics Data System (ADS)
Hull, Charles; Girart, Josep M.; Tychoniec, Lukasz; Rao, Ramprasad; Cortés, Paulo; Pokhrel, Riwaj; Zhang, Qizhou; Houde, Martin; Dunham, Michael; Kristensen, Lars; Lai, Shih-Ping; Li, Zhi-Yun; Plambeck, Richard
2018-01-01
The results from the polarization system at the Atacama Large Millimeter/submillimeter Array (ALMA) have begun both to expand and to confound our understanding of the role of the magnetic field in low-mass star formation. Here we show the highest resolution and highest sensitivity polarization images made to date toward the very young, intermediate-mass Class 0 protostellar source Serpens SMM1, the brightest source in the Serpens Main star-forming region. These ALMA observations achieve ~140 au resolution, allowing us to probe dust polarization—and thus magnetic field orientation—in the innermost regions surrounding the protostar. By complementing these observations with polarization observations from the Submillimeter Array (SMA) and archival data from the Combined Array for Research in Millimeter-wave Astronomy (CARMA) and the James Clerk Maxwell Telescope (JCMT), we can compare the magnetic field orientations at different spatial scales. We find major changes in the magnetic field orientation between large (~0.1 pc) scales—where the magnetic field is oriented E–W, perpendicular to the major axis of the dusty filament where SMM1 is embedded—and the intermediate and small scales probed by CARMA (~1000 au resolution), the SMA (~350 au resolution), and ALMA. The ALMA maps reveal that the redshifted lobe of the bipolar outflow is clearly shaping the magnetic field in SMM1 on the southeast side of the source. High-spatial-resolution continuum and spectral-line observations also reveal a tight (~130 au) protobinary system in SMM1-b, the eastern component of which is launching an extremely high-velocity, one-sided jet visible in both CO(2-1) and SiO(5-4); however, that jet does not appear to be shaping the magnetic field. These observations show that with the sensitivity and resolution of ALMA, we can now begin to understand the role that feedback (e.g., from protostellar outflows) plays in shaping the magnetic field in very young, star-forming sources like SMM1.
Delensing CMB polarization with external datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kendrick M.; Hanson, Duncan; LoVerde, Marilena
2012-06-01
One of the primary scientific targets of current and future CMB polarization experiments is the search for a stochastic background of gravity waves in the early universe. As instrumental sensitivity improves, the limiting factor will eventually be B-mode power generated by gravitational lensing, which can be removed through use of so-called "delensing" algorithms. We forecast prospects for delensing using lensing maps which are obtained externally to CMB polarization: either from large-scale structure observations, or from high-resolution maps of CMB temperature. We conclude that the forecasts in either case are not encouraging, and that significantly delensing large-scale CMB polarization requires high-resolution polarization maps with sufficient sensitivity to measure the lensing B-mode. We also present a simple formalism for including delensing in CMB forecasts which is computationally fast and agrees well with Monte Carlos.
NASA Astrophysics Data System (ADS)
Harris, B.; McDougall, K.; Barry, M.
2012-07-01
Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, such analyses are rarely developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km² to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the Wivenhoe catchment, 543 km², and a detailed 13 km² area within the Wivenhoe catchment) including various data types, scales, quality, and variable catchment input parameters. Historical and currently available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
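Waterway delineation from a DEM typically rests on steepest-descent (D8) flow routing followed by flow accumulation; stream cells are those whose accumulation exceeds a threshold. A compact sketch of that core step (single flow direction, no pit filling or edge handling, illustrative only and not the paper's GIS workflow):

```python
import numpy as np

NBRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_flow_dir(dem):
    """Each cell drains to its steepest downslope neighbour (None for pits)."""
    n, m = dem.shape
    fdir = {}
    for i in range(n):
        for j in range(m):
            best, steepest = None, 0.0
            for di, dj in NBRS:
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < m:
                    drop = (dem[i, j] - dem[ni, nj]) / np.hypot(di, dj)
                    if drop > steepest:
                        best, steepest = (ni, nj), drop
            fdir[(i, j)] = best
    return fdir

def flow_accumulation(dem):
    """Number of cells draining through each cell, itself included."""
    fdir = d8_flow_dir(dem)
    acc = np.ones(dem.shape, dtype=int)
    # Visit cells from highest to lowest so upstream counts are final
    for ij in sorted(fdir, key=lambda ij: dem[ij], reverse=True):
        if fdir[ij] is not None:
            acc[fdir[ij]] += acc[ij]
    return acc

dem = np.array([[3.0, 3.0, 3.0], [2.0, 2.0, 2.0], [1.0, 1.0, 1.0]])
acc = flow_accumulation(dem)   # each column drains straight downslope
```

The DEM cell size enters through the drop computation and the accumulation threshold, which is why coarsening the grid changes where delineated streams begin, the effect the study quantifies across scales.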
NASA Astrophysics Data System (ADS)
Hori, T.; Agata, R.; Ichimura, T.; Fujita, K.; Yamaguchi, T.; Takahashi, N.
2017-12-01
Recently, continuous dense surface deformation data have become available on land and, in part, on the sea floor, but the obtained data are not fully utilized for monitoring and forecasting of crustal activity, such as spatio-temporal variation in slip velocity on the plate interface including earthquakes, seismic wave propagation, and crustal deformation. To construct a system for monitoring and forecasting, it is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) calculation codes for crustal deformation and seismic wave propagation using (1), and (3) inverse analysis or data assimilation codes for both structure and fault slip using (1) and (2). To accomplish this, it is at least necessary to develop highly reliable large-scale simulation codes to calculate crustal deformation and seismic wave propagation in 3D heterogeneous structure. An unstructured FE non-linear seismic wave simulation code has been developed, which achieved physics-based urban earthquake simulation with 1.08 T DOF and 6.6 K time steps. A high-fidelity FEM simulation code with a mesh generator has also been developed to calculate crustal deformation in and around Japan with complicated surface topography and subducting plate geometry at 1 km mesh resolution. The crustal deformation code has since been improved to reach 2.05 T DOF with 45 m resolution on the plate interface. This high-resolution analysis enables computation of the change of stress acting on the plate interface. Further, for inverse analyses, a waveform inversion code for modeling 3D crustal structure has been developed, and the high-fidelity FEM code has been extended with an adjoint method for estimating fault slip and asthenosphere viscosity. Hence, we have large-scale simulation and analysis tools for monitoring. 
We are developing methods for forecasting the slip velocity variation on the plate interface. Although the prototype assumes an elastic half-space model, we are applying it to 3D heterogeneous structure with the high-fidelity FE model. Furthermore, the large-scale simulation codes for monitoring are being implemented on GPU clusters, and analysis tools are being developed to include other functions such as examination of model errors.
NASA Astrophysics Data System (ADS)
Oaida, C. M.; Andreadis, K.; Reager, J. T., II; Famiglietti, J. S.; Levoe, S.
2017-12-01
Accurately estimating how much snow water equivalent (SWE) is stored in mountainous regions characterized by complex terrain and snowmelt-driven hydrologic cycles is not only greatly desirable, but also a big challenge. Mountain snowpack exhibits high spatial variability across a broad range of spatial and temporal scales due to a multitude of physical and climatic factors, making it difficult to observe or estimate in its entirety. Combining remotely sensed data and high resolution hydrologic modeling through data assimilation (DA) has the potential to provide a spatially and temporally continuous SWE dataset at horizontal scales that capture sub-grid snow spatial variability and are also relevant to stakeholders such as water resource managers. Here, we present the evaluation of a new snow DA approach that uses a Local Ensemble Transform Kalman Filter (LETKF) in tandem with the Variable Infiltration Capacity macro-scale hydrologic model across the Western United States, at a daily temporal resolution, and a horizontal resolution of 1.75 km x 1.75 km. The LETKF is chosen for its relative simplicity, ease of implementation, and computational efficiency and scalability. The modeling/DA system assimilates daily MODIS Snow Covered Area and Grain Size (MODSCAG) fractional snow cover, and has been developed to efficiently calculate SWE estimates over extended periods of time and covering large regional-scale areas at relatively high spatial resolution, ultimately producing a snow reanalysis-type dataset. Here we focus on the assessment of SWE produced by the DA scheme over several basins in California's Sierra Nevada Mountain range where Airborne Snow Observatory data are available, during the last five water years (2013-2017), which include both one of the driest and one of the wettest years. 
Comparison against such a spatially distributed SWE observational product provides a greater understanding of the model's ability to estimate SWE and SWE spatial variability, and highlights under which conditions snow cover DA can add value in estimating SWE.
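The LETKF analysis step the abstract refers to can be sketched for a single local grid point. This follows the standard Hunt et al. (2007) formulation, not the authors' implementation; the variable names and the linear observation operator are illustrative (in practice fractional snow cover maps to SWE through a nonlinear snow-depletion curve):

```python
import numpy as np

def letkf_update(Xb, y, H, R):
    """One LETKF analysis step at a single local grid point.

    Xb : (n, k) background ensemble (n state variables, k members)
    y  : (m,)  observation vector (e.g. fractional snow cover)
    H  : (m, n) linear observation operator
    R  : (m, m) observation-error covariance
    Returns the (n, k) analysis ensemble.
    """
    n, k = Xb.shape
    xb = Xb.mean(axis=1)
    Xp = Xb - xb[:, None]                    # background perturbations
    Yb = H @ Xb
    yb = Yb.mean(axis=1)
    Yp = Yb - yb[:, None]                    # perturbations in obs space
    C = Yp.T @ np.linalg.inv(R)              # (k, m)
    Pa = np.linalg.inv((k - 1) * np.eye(k) + C @ Yp)
    w = Pa @ C @ (y - yb)                    # mean weight vector
    # Symmetric square root of (k-1)*Pa for the perturbation update.
    evals, evecs = np.linalg.eigh((k - 1) * Pa)
    Wa = evecs @ np.diag(np.sqrt(np.maximum(evals, 0.0))) @ evecs.T
    return xb[:, None] + Xp @ (w[:, None] + Wa)
```

For a scalar state with equal background and observation error variance, the analysis mean lands halfway between background mean and observation, as the Kalman gain predicts, and the ensemble spread shrinks accordingly.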
High-resolution modeling assessment of tidal stream resource in Western Passage of Maine, USA
NASA Astrophysics Data System (ADS)
Yang, Zhaoqing; Wang, Taiping; Feng, Xi; Xue, Huijie; Kilcher, Levi
2017-04-01
Although significant efforts have been made to assess the maximum potential of tidal stream energy at the system-wide scale, accurate assessment of the tidal stream energy resource at the project design scale requires detailed hydrodynamic simulations using high-resolution three-dimensional (3-D) numerical models. Extended model validation against high quality measured data is essential to minimize the uncertainties of the resource assessment. Western Passage in the State of Maine has been identified as one of the top-ranking sites for tidal stream energy development in U.S. coastal waters, based on a number of criteria including tidal power density, market value and transmission distance. This study presents an on-going modeling effort for simulating the tidal hydrodynamics in Western Passage using the 3-D unstructured-grid Finite Volume Community Ocean Model (FVCOM). The model domain covers a large region including the entire Bay of Fundy, with grid resolution varying from 20 m in the Western Passage to approximately 1000 m along the open boundary near the mouth of the Bay of Fundy. Preliminary model validation was conducted using existing NOAA measurements within the model domain. Spatial distributions of tidal power density were calculated, and extractable tidal energy was estimated using a tidal turbine module embedded in FVCOM under different tidal farm scenarios. Additional field measurements to characterize the resource and support model validation are discussed. This study provides an example of high resolution resource assessment based on the guidance recommended by the International Electrotechnical Commission Technical Specification.
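The tidal power density maps mentioned above follow from the kinetic energy flux of the current, P = ½ρU³ per unit area. A minimal sketch with illustrative constants (FVCOM's turbine module is more involved; the power coefficient below is a hypothetical device value):

```python
def tidal_power_density(speed, rho=1025.0):
    """Kinetic power flux per unit area (W/m^2): P = 0.5 * rho * U**3,
    with rho the seawater density (kg/m^3) and U the current speed (m/s)."""
    return 0.5 * rho * speed ** 3

def turbine_power(speed, area, cp=0.35, rho=1025.0):
    """Power (W) extracted by a turbine of swept area `area` (m^2);
    `cp` is an illustrative, device-dependent power coefficient."""
    return cp * tidal_power_density(speed, rho) * area
```

The cubic dependence on speed is why resolving narrow, fast channels like Western Passage at 20 m grid spacing matters: averaging the flow over a coarse cell systematically underestimates the extractable power.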
Numerical Upscaling of Solute Transport in Fractured Porous Media Based on Flow Aligned Blocks
NASA Astrophysics Data System (ADS)
Leube, P.; Nowak, W.; Sanchez-Vila, X.
2013-12-01
High-contrast or fractured-porous media (FPM) pose one of the largest unresolved challenges for simulating large hydrogeological systems. The high contrast in advective transport between fast conduits and low-permeability rock matrix, including complex mass transfer processes, leads to the typical complex characteristics of early bulk arrivals and long tailings. Adequate direct representation of FPM requires enormous numerical resolutions. For large scales, e.g. the catchment scale, and when allowing for uncertainty in the fracture network architecture or in matrix properties, computational costs quickly reach an intractable level. In such cases, multi-scale simulation techniques have become useful tools. They allow decreasing the complexity of models by aggregating and transferring their parameters to coarser scales and so drastically reduce the computational costs. However, these advantages come at a loss of detail and accuracy. In this work, we develop and test a new multi-scale or upscaled modeling approach based on block upscaling. The novelty is that individual blocks are defined by and aligned with the local flow coordinates. We choose a multi-rate mass transfer (MRMT) model to represent the remaining sub-block non-Fickian behavior within these blocks on the coarse scale. To make the scale transition simple and to save computational costs, we capture sub-block features by temporal moments (TM) of block-wise particle arrival times to be matched with the MRMT model. By predicting spatial mass distributions of injected tracers in a synthetic test scenario, our coarse-scale solution matches reasonably well with the corresponding fine-scale reference solution. For predicting higher TM-orders (such as arrival time and effective dispersion), the prediction accuracy steadily decreases. This is compensated to some extent by the MRMT model. If the MRMT model becomes too complex, it loses its effect. 
We also found that prediction accuracy is sensitive to the choice of the effective dispersion coefficients and to the block resolution. A key advantage of the flow-aligned blocks is that the small-scale velocity field is reproduced quite accurately on the block scale through the flow alignment. Thus, the block-scale transverse dispersivities remain of similar magnitude to the local ones, and they do not have to represent macroscopic uncertainty. Also, the flow-aligned blocks minimize numerical dispersion when solving the large-scale transport problem.
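The block-wise temporal moments used to match the MRMT model can be computed directly from a breakthrough curve. A sketch, assuming a concentration history sampled on a regular time grid (function names are illustrative):

```python
import numpy as np

def _trapz(y, t):
    """Trapezoidal integration (kept explicit to avoid NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

def temporal_moments(t, c):
    """Low-order temporal moments of a breakthrough curve c(t).

    Returns the zeroth moment (recovered mass), the mean arrival time
    (first normalized moment) and the second central moment, which
    measures the effective spreading that an MRMT model must reproduce.
    """
    m0 = _trapz(c, t)
    t_mean = _trapz(t * c, t) / m0
    var = _trapz((t - t_mean) ** 2 * c, t) / m0
    return m0, t_mean, var
```

For a Gaussian pulse the moments recover its area, centre and variance, which is the sense in which matching low-order moments transfers fine-scale arrival behavior to the coarse blocks.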
Organic electronics for high-resolution electrocorticography of the human brain.
Khodagholy, Dion; Gelinas, Jennifer N; Zhao, Zifang; Yeh, Malcolm; Long, Michael; Greenlee, Jeremy D; Doyle, Werner; Devinsky, Orrin; Buzsáki, György
2016-11-01
Localizing neuronal patterns that generate pathological brain signals may assist with tissue resection and intervention strategies in patients with neurological diseases. Precise localization requires high spatiotemporal recording from populations of neurons while minimizing invasiveness and adverse events. We describe a large-scale, high-density, organic material-based, conformable neural interface device ("NeuroGrid") capable of simultaneously recording local field potentials (LFPs) and action potentials from the cortical surface. We demonstrate the feasibility and safety of intraoperative recording with NeuroGrids in anesthetized and awake subjects. Highly localized and propagating physiological and pathological LFP patterns were recorded, and correlated neural firing provided evidence about their local generation. Application of NeuroGrids to brain disorders, such as epilepsy, may improve diagnostic precision and therapeutic outcomes while reducing complications associated with invasive electrodes conventionally used to acquire high-resolution and spiking data.
Superconducting transition detectors for low-energy gamma-ray astrophysics
NASA Astrophysics Data System (ADS)
Kurfess, J. D.; Johnson, W. N.; Fritz, G. G.; Strickman, M. S.; Kinzer, R. L.; Jung, G.; Drukier, A. K.; Chmielowski, M.
1990-08-01
A program to investigate superconducting devices such as superconducting transition detectors (STDs) for use in high-resolution Compton telescopes and coded-aperture detectors is presented. For higher energy applications, techniques are investigated with potential for scaling to large detectors, while also providing excellent energy and positional resolution. STDs are discussed, utilizing a uniform array of spherical granules tens of microns in diameter. The typical temperature-magnetic field phase diagram for a low-temperature superconductor, the signal produced by the superconducting-normal transition in the 32-μm diameter Sn granule, and the temperature history of an STD granule following heating by an ionizing particle are illustrated.
The Wide Field Imager instrument for Athena
NASA Astrophysics Data System (ADS)
Meidinger, Norbert; Barbera, Marco; Emberger, Valentin; Fürmetz, Maria; Manhart, Markus; Müller-Seidlitz, Johannes; Nandra, Kirpal; Plattner, Markus; Rau, Arne; Treberspurg, Wolfgang
2017-08-01
ESA's next large X-ray mission ATHENA is designed to address the Cosmic Vision science theme 'The Hot and Energetic Universe'. It will provide answers to two key astrophysical questions: how does ordinary matter assemble into the large-scale structures we see today, and how do black holes grow and shape the Universe? The ATHENA spacecraft will be equipped with two focal plane cameras, a Wide Field Imager (WFI) and an X-ray Integral Field Unit (X-IFU). The WFI instrument is optimized for state-of-the-art resolution spectroscopy over a large field of view of 40 amin x 40 amin and high count rates up to and beyond 1 Crab source intensity. The cryogenic X-IFU camera is designed for high-spectral-resolution imaging. The two cameras alternately share a mirror system based on silicon pore optics with a focal length of 12 m and a large effective area of about 2 m2 at an energy of 1 keV. Although the mission is still in phase A, i.e. studying the feasibility and developing the necessary technology, the definition and development of the instrumentation have already made significant progress. The WFI focal plane camera described here covers the energy band from 0.2 keV to 15 keV with 450 μm thick, fully depleted, back-illuminated silicon active pixel sensors of DEPFET type. The spatial resolution will be provided by one million pixels, each with a size of 130 μm x 130 μm. The time resolution requirement is 5 ms for the WFI large detector array and 80 μs for the WFI fast detector. The large effective area of the mirror system will be complemented by a high quantum efficiency above 90% for medium and higher energies. The status of the various WFI subsystems needed to achieve this performance is described and recent changes are explained.
Evaluation of coarse scale land surface remote sensing albedo product over rugged terrain
NASA Astrophysics Data System (ADS)
Wen, J.; Xinwen, L.; You, D.; Dou, B.
2017-12-01
Satellite-derived land surface albedo is an essential climate variable that controls the Earth's energy budget and is used in applications such as climate change studies, hydrology, and numerical weather prediction. The accuracy and uncertainty of surface albedo products should be evaluated against reliable reference truth data prior to such applications. Most published validation work has addressed albedo over flat or homogeneous surfaces; the performance of albedo products over rugged terrain remains unknown owing to the limitations of existing validation methods. A multi-step validation strategy is implemented here to give a comprehensive albedo assessment, involving high-resolution albedo processing, validation of the high-resolution albedo against in situ albedo, and a method to upscale the high-resolution albedo to the coarse scale. Among these, the high-resolution albedo generation and the upscaling method are the core steps for coarse-scale albedo validation. In this paper, the high-resolution albedo is generated by the Angular Bin algorithm, and an albedo upscaling method over rugged terrain is developed to obtain the coarse-scale albedo truth. In situ albedo from 40 sites in mountainous areas worldwide is used to validate the high-resolution albedo, which is then upscaled to the coarse scale with the upscaling method. Taking the MODIS and GLASS albedo products as examples, preliminary results show RMSEs over rugged terrain of 0.047 and 0.057, respectively, while the RMSE of the high-resolution albedo is 0.036.
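The evaluation chain described above (validate the fine-scale albedo in situ, upscale it, score the coarse product) reduces to two simple operations. A sketch; the block averaging below is a placeholder for the terrain-aware upscaling the study develops, which would weight pixels by slope- and aspect-dependent terms:

```python
import numpy as np

def rmse(product, reference):
    """Root-mean-square error of an albedo product against reference truth."""
    product, reference = np.asarray(product), np.asarray(reference)
    return float(np.sqrt(np.mean((product - reference) ** 2)))

def upscale_mean(fine_albedo, factor):
    """Aggregate a high-resolution albedo grid to coarse cells by block
    averaging (illustrative stand-in for a terrain-aware upscaling method)."""
    a = np.asarray(fine_albedo, dtype=float)
    ny, nx = a.shape
    return a.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))
```

Scoring the coarse product against the upscaled truth rather than against single stations is what makes the comparison meaningful over rugged terrain, where one point rarely represents a whole coarse cell.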
NASA Astrophysics Data System (ADS)
Fucugauchi, J. U.; Ortiz-Aleman, C.; Martin, R.
2017-12-01
Large complex craters are characterized by central uplifts that represent large-scale differential movement of deep basement from the transient cavity. Here we investigate the central sector of the large multiring Chicxulub crater, which has been surveyed by an array of marine, aerial and land-borne geophysical methods. Despite high contrasts in physical properties, contrasting results for the central uplift have been obtained, with seismic reflection surveys showing lack of resolution in the central zone. We develop an integrated seismic and gravity model for the main structural elements, imaging the central basement uplift and melt and breccia units. The 3-D velocity model built from interpolation of seismic data is validated using perfectly matched layer seismic acoustic wave propagation modeling, optimized at grazing incidence using shift in the frequency domain. Modeling shows significant lack of illumination in the central sector, masking presence of the central uplift. Seismic energy remains trapped in an upper low velocity zone corresponding to the sedimentary infill, melt/breccias and surrounding faulted blocks. After conversion of seismic velocities into a volume of density values, we use massive parallel forward gravity modeling to constrain the size and shape of the central uplift that lies at 4.5 km depth, providing a high-resolution image of crater structure. The Bouguer anomaly and gravity response of modeled units show asymmetries, corresponding to the crater structure and distribution of post-impact carbonates, breccias, melt and target sediments.
Artificial fluid properties for large-eddy simulation of compressible turbulent mixing
NASA Astrophysics Data System (ADS)
Cook, Andrew W.
2007-05-01
An alternative methodology is described for large-eddy simulation (LES) of flows involving shocks, turbulence, and mixing. In lieu of filtering the governing equations, it is postulated that the large-scale behavior of a LES fluid, i.e., a fluid with artificial properties, will be similar to that of a real fluid, provided the artificial properties obey certain constraints. The artificial properties consist of modifications to the shear viscosity, bulk viscosity, thermal conductivity, and species diffusivity of a fluid. The modified transport coefficients are designed to damp out high wavenumber modes, close to the resolution limit, without corrupting lower modes. Requisite behavior of the artificial properties is discussed and results are shown for a variety of test problems, each designed to exercise different aspects of the models. When combined with a tenth-order compact scheme, the overall method exhibits excellent resolution characteristics for turbulent mixing, while capturing shocks and material interfaces in a crisp fashion.
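The wavenumber-selective damping such artificial transport coefficients are designed to deliver can be illustrated in spectral space: a high-order dissipation multiplies each Fourier mode by a factor that is essentially unity for resolved scales and strong only near the grid cutoff. An illustrative sketch, not Cook's actual formulation (which constructs the coefficients in physical space from high-order derivatives of the strain rate):

```python
import numpy as np

def hyperviscous_damping(nx=64, nu_h=1.0, order=4, dt=1.0):
    """Per-step spectral damping factor exp(-nu_h * (k/kmax)**(2*order) * dt).

    High-order (hyper)viscosity damps modes near the resolution limit
    strongly while leaving well-resolved low wavenumbers untouched,
    mimicking the intent of artificial-fluid LES transport coefficients.
    """
    k = np.fft.rfftfreq(nx, d=1.0 / nx)   # integer wavenumbers 0 .. nx/2
    kmax = k.max()
    return np.exp(-nu_h * (k / kmax) ** (2 * order) * dt)
```

With order 4, the damping at the highest resolved wavenumber is e^-1 per step while the first nonzero mode is attenuated by less than one part in 10^12, which is the "damp high wavenumbers without corrupting lower modes" property the abstract describes.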
A High Resolution Scale-of-four
DOE R&D Accomplishments Database
Fitch, V.
1949-08-25
A high resolution scale-of-four has been developed to be used in conjunction with the nuclear particle detection devices in applications where the counting rate is unusually high. Specifically, it is intended to precede the commercially available medium resolution scaling circuits and so decrease the resolving time of the counting system. The circuit will function reliably on continuously recurring pulses separated by less than 0.1 microseconds. It will resolve two pulses (occurring at a moderate repetition rate) which are spaced at 0.04 microseconds. A five-volt input signal is sufficient to actuate the device.
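The quoted resolving times can be interpreted with a simple non-paralyzable dead-time model: a pulse arriving within one resolving time of the previously registered pulse is lost. A sketch of that counting model (an illustration of the specification, not the circuit itself):

```python
def resolved_counts(pulse_times_us, resolving_time_us=0.04):
    """Count the pulses a counter with the given resolving time registers.

    Non-paralyzable dead-time model: a pulse is lost if it arrives within
    one resolving time of the previously *registered* pulse.
    Times are in microseconds.
    """
    counted = 0
    last = None
    for t in sorted(pulse_times_us):
        if last is None or t - last >= resolving_time_us:
            counted += 1
            last = t
    return counted
```

With a 0.04 µs resolving time, a pulse pair spaced 0.02 µs apart merges into one count, while pulses at or beyond the 0.04 µs spacing the circuit is specified to resolve are all registered.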
The importance of calorimetry for highly-boosted jet substructure
Coleman, Evan; Freytsis, Marat; Hinzmann, Andreas; ...
2018-01-09
Here, jet substructure techniques are playing an essential role in exploring the TeV scale at the Large Hadron Collider (LHC), since they facilitate the efficient reconstruction and identification of highly-boosted objects. Both for the LHC and for future colliders, there is a growing interest in using jet substructure methods based only on charged-particle information. The reason is that silicon-based tracking detectors offer excellent granularity and precise vertexing, which can improve the angular resolution on highly-collimated jets and mitigate the impact of pileup. In this paper, we assess how much jet substructure performance degrades by using track-only information, and we demonstrate physics contexts in which calorimetry is most beneficial. Specifically, we consider five different hadronic final states - W bosons, Z bosons, top quarks, light quarks, gluons - and test the pairwise discrimination power with a multi-variate combination of substructure observables. In the idealized case of perfect reconstruction, we quantify the loss in discrimination performance when using just charged particles compared to using all detected particles. We also consider the intermediate case of using charged particles plus photons, which provides valuable information about neutral pions. In the more realistic case of a segmented calorimeter, we assess the potential performance gains from improving calorimeter granularity and resolution, comparing a CMS-like detector to more ambitious future detector concepts. Broadly speaking, we find large performance gains from neutral-particle information and from improved calorimetry in cases where jet mass resolution drives the discrimination power, whereas the gains are more modest if an absolute mass scale calibration is not required.
SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis
NASA Astrophysics Data System (ADS)
Young, M. D.; Hayashi, S.; Gopu, A.
2014-05-01
As a new generation of large format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.) we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be unfeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in-between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as overlaying data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
The stratified two-sided jet of Cygnus A. Acceleration and collimation
NASA Astrophysics Data System (ADS)
Boccardi, B.; Krichbaum, T. P.; Bach, U.; Mertens, F.; Ros, E.; Alef, W.; Zensus, J. A.
2016-01-01
Aims: High-resolution Very-Long-Baseline Interferometry (VLBI) observations of relativistic jets are essential for constraining the fundamental parameters of jet formation models. At a distance of 249 Mpc, Cygnus A is a unique target for such studies, since it is the only Fanaroff-Riley Class II radio galaxy for which a detailed subparsec scale imaging of the base of both jet and counter-jet can be obtained. Observing at millimeter wavelengths unveils those regions that appear self-absorbed at longer wavelengths and enables an extremely sharp view toward the nucleus to be obtained. Methods: We performed 7 mm Global VLBI observations, achieving ultra-high resolution imaging on scales down to 90 μas. This resolution corresponds to a linear scale of only ~400 Schwarzschild radii (for MBH = 2.5 × 109M⊙). We studied the kinematic properties of the main emission features of the two-sided flow and probed its transverse structure through a pixel-based analysis. Results: We suggest that a fast and a slow layer with different acceleration gradients exist in the flow. The extension of the acceleration region is large (~ 104RS), indicating that the jet is magnetically driven. The limb brightening of both jet and counter-jet and their large opening angles (φJ ~ 10°) strongly favour a spine-sheath structure. In the acceleration zone, the flow has a parabolic shape (r ∝ z0.55 ± 0.07). The acceleration gradients and the collimation profile are consistent with the expectations for a jet in "equilibrium", achieved in the presence of a mild gradient of the external pressure (p ∝ z- k,k ≤ 2).
Fennec dust forecast intercomparison over the Sahara in June 2011
NASA Astrophysics Data System (ADS)
Chaboureau, Jean-Pierre; Flamant, Cyrille; Dauhut, Thibaut; Kocha, Cécile; Lafore, Jean-Philippe; Lavaysse, Chistophe; Marnas, Fabien; Mokhtari, Mohamed; Pelon, Jacques; Reinares Martínez, Irene; Schepanski, Kerstin; Tulet, Pierre
2016-06-01
In the framework of the Fennec international programme, a field campaign was conducted in June 2011 over the western Sahara. It led to the first observational data set ever obtained that documents the dynamics, thermodynamics and composition of the Saharan atmospheric boundary layer (SABL) under the influence of the heat low. In support of the aircraft operations, four dust forecasts were run daily at low and high resolutions with convection-parameterizing and convection-permitting models, respectively. The unique airborne and ground-based data sets allowed the first ever intercomparison of dust forecasts over the western Sahara. At monthly scale, large aerosol optical depths (AODs) were forecast over the Sahara, a feature observed by satellite retrievals but with different magnitudes. The AOD intensity was correctly predicted by the high-resolution models, while it was underestimated by the low-resolution models. This was partly because of the generation of strong near-surface wind associated with thunderstorm-related density currents that could only be reproduced by models representing convection explicitly. Such models yield emissions mainly in the afternoon that dominate the total emission over the western fringes of the Adrar des Iforas and the Aïr Mountains in the high-resolution forecasts. Over the western Sahara, where the harmattan contributes up to 80 % of dust emission, all the models were successful in forecasting the deep well-mixed SABL. Some of them, however, missed the large near-surface dust concentration generated by density currents and low-level winds. This feature, observed repeatedly by the airborne lidar, was partly forecast by one high-resolution model only.
Strategies for Large Scale Implementation of a Multiscale, Multiprocess Integrated Hydrologic Model
NASA Astrophysics Data System (ADS)
Kumar, M.; Duffy, C.
2006-05-01
Distributed models simulate hydrologic state variables in space and time while taking into account heterogeneities in terrain, surface and subsurface properties, and meteorological forcings. The computational cost and complexity associated with these models increase with their tendency to accurately simulate the large number of interacting physical processes at fine spatio-temporal resolution in a large basin. A hydrologic model run on a coarse spatial discretization of the watershed with a limited number of physical processes imposes a lighter computational load, but this negatively affects the accuracy of model results and restricts the physical realism of the problem. It is therefore imperative to have an integrated modeling strategy (a) which can be universally applied at various scales in order to study the tradeoffs between computational complexity (determined by spatio-temporal resolution), accuracy, and predictive uncertainty in relation to various approximations of physical processes, (b) which can be applied at adaptively different spatial scales in the same domain by taking into account the local heterogeneity of topography and hydrogeologic variables, and (c) which is flexible enough to incorporate different numbers and approximations of process equations depending on model purpose and computational constraints. An efficient implementation of this strategy becomes all the more important for the Great Salt Lake river basin, which is relatively large (~89,000 sq. km) and complex in terms of hydrologic and geomorphic conditions. Moreover, the types and time scales of the hydrologic processes that dominate differ across the basin. Part of the snowmelt runoff generated in the Uinta Mountains infiltrates and contributes as base flow to the Great Salt Lake over a time scale of decades to centuries. The adaptive strategy helps capture the steep topographic and climatic gradients along the Wasatch Front.
Here we present the aforementioned modeling strategy along with an associated hydrologic modeling framework which facilitates a seamless, computationally efficient and accurate integration of the process model with the data model. The flexibility of this framework allows multiscale, multiresolution, adaptive refinement/de-refinement and nested modeling simulations with minimal computational burden. However, performing these simulations, and calibrating these models, over a large basin at higher spatio-temporal resolutions is computationally intensive and requires increasing computing power. With the advent of parallel processing architectures, high computing performance can be achieved by parallelizing existing serial integrated-hydrologic-model code, i.e., running the same model simulation on a network of many processors and thereby reducing the time needed to obtain a solution. The paper also discusses the implementation of the integrated model on parallel processors: the mapping of the problem onto a multi-processor environment, methods to couple hydrologic processes using interprocessor communication models, the model data structure, and parallel numerical algorithms to obtain high performance.
NASA Astrophysics Data System (ADS)
Gruber, S.; Fiddes, J.
2013-12-01
In mountainous topography, the difference in scale between atmospheric reanalyses (typically tens of kilometres) and relevant processes and phenomena near the Earth's surface, such as permafrost or snow cover (metres to tens of metres), is most obvious. This contrast of scales is one of the major obstacles to using reanalysis data for the simulation of surface phenomena and to confronting reanalyses with independent observations. Using the example of modelling permafrost in mountain areas (though easily generalized to other phenomena and heterogeneous environments), we present and test methods against measurements for (A) scaling atmospheric data from the reanalysis to the ground level and (B) smart sampling of the heterogeneous landscape in order to set up a lumped model simulation that represents the high-resolution land surface. TopoSCALE (Part A, see http://dx.doi.org/10.5194/gmdd-6-3381-2013) is a scheme which scales coarse-grid climate fields to fine-grid topography using pressure-level data. In addition, it applies the necessary topographic corrections, e.g. to those variables required for the computation of radiation fields. This provides the necessary driving fields to the land surface model (LSM). Tested against independent ground data, this scheme has been shown to improve the scaling and distribution of meteorological parameters in complex terrain compared to conventional methods, e.g. lapse-rate-based approaches. TopoSUB (Part B, see http://dx.doi.org/10.5194/gmd-5-1245-2012) is a surface pre-processor designed to sample a fine-grid domain (defined by a digital elevation model) along important topographical (or other) dimensions through a clustering scheme. This allows constructing a lumped model representing the main sources of fine-grid variability and applying a 1D LSM efficiently over large areas.
Results can be processed to derive (i) summary statistics at the coarse-scale reanalysis grid resolution, (ii) high-resolution data fields spatialized to, e.g., the fine-scale digital elevation model grid, or (iii) validation products only for locations at which measurements exist. The ability of TopoSUB to approximate results simulated by a 2D distributed numerical LSM with a factor of ~10,000 fewer computations is demonstrated by comparison of 2D and lumped simulations. Successful application of the combined scheme in the European Alps is reported, and based on its results, open issues for future research are outlined.
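The cluster-then-simulate idea behind TopoSUB can be illustrated with a minimal sketch: group fine-grid cells by terrain attributes, run the 1D model once per cluster, and spatialize the result back to every member cell. The k-means routine, attribute choice (elevation, slope), and toy lapse-rate "model" below are illustrative assumptions, not the actual TopoSUB implementation:

```python
import numpy as np

def cluster_terrain(attrs, k=5, n_iter=20, seed=0):
    """Simple k-means over terrain attributes.
    attrs: (n_cells, n_attrs) array, e.g. elevation and slope."""
    rng = np.random.default_rng(seed)
    # normalise each attribute so no single one dominates the distance
    z = (attrs - attrs.mean(axis=0)) / attrs.std(axis=0)
    centroids = z[rng.choice(len(z), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((z[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = z[labels == j].mean(axis=0)
    return labels

# toy fine grid: 10,000 cells instead of millions
rng = np.random.default_rng(1)
elev = rng.uniform(500, 3500, 10_000)     # m
slope = rng.uniform(0, 45, 10_000)        # degrees
labels = cluster_terrain(np.column_stack([elev, slope]), k=5)

# run the 1D "LSM" once per cluster (here a toy lapse-rate model),
# then spatialize: every cell inherits its cluster's result
def lsm_1d(mean_elev):
    return -0.0065 * mean_elev + 15.0

cluster_out = {j: lsm_1d(elev[labels == j].mean()) for j in np.unique(labels)}
field = np.array([cluster_out[j] for j in labels])
```

The saving is the ratio of cells to clusters: 5 model runs stand in for 10,000, which is the same mechanism behind the ~10,000-fold reduction reported for TopoSUB.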
Mass Spectrometry as a Preparative Tool for the Surface Science of Large Molecules
NASA Astrophysics Data System (ADS)
Rauschenbach, Stephan; Ternes, Markus; Harnau, Ludger; Kern, Klaus
2016-06-01
Measuring and understanding the complexity that arises when nanostructures interact with their environment is one of the major current challenges of nanoscale science and technology. High-resolution microscopy methods such as scanning probe microscopy have the capacity to investigate nanoscale systems with ultimate precision, for which, however, the atomically precise preparation methods of surface science are a necessity. Preparative mass spectrometry (pMS), defined as the controlled deposition of m/z-filtered ion beams, links the world of large, biological molecules to surface science through soft ionization sources, enabling atomic-scale chemical control of molecular deposition in ultrahigh vacuum (UHV). Here we explore the application of high-resolution scanning probe microscopy and spectroscopy to the characterization of the structure and properties of large molecules. We introduce the fundamental principles of the combined experiments of electrospray ion beam deposition and scanning tunneling microscopy. Examples of the deposition and investigation of single particles, of layer and film growth, and of the investigation of the electronic properties of individual nonvolatile molecules show that state-of-the-art pMS technology provides a platform analogous to thermal evaporation in conventional molecular beam epitaxy. Moreover, it offers unique features owing to the use of charged polyatomic particles. This new field is an enormous sandbox for novel molecular materials research and demands the development of advanced molecular ion beam technology.
Mapping nonlinear receptive field structure in primate retina at single cone resolution
Li, Peter H; Greschner, Martin; Gunning, Deborah E; Mathieson, Keith; Sher, Alexander; Litke, Alan M; Paninski, Liam
2015-01-01
The function of a neural circuit is shaped by the computations performed by its interneurons, which in many cases are not easily accessible to experimental investigation. Here, we elucidate the transformation of visual signals flowing from the input to the output of the primate retina, using a combination of large-scale multi-electrode recordings from an identified ganglion cell type, visual stimulation targeted at individual cone photoreceptors, and a hierarchical computational model. The results reveal nonlinear subunits in the circuitry of OFF midget ganglion cells, which subserve high-resolution vision. The model explains light responses to a variety of stimuli more accurately than a linear model, including stimuli targeted to cones within and across subunits. The recovered model components are consistent with the known anatomical organization of midget bipolar interneurons. These results reveal the spatial structure of linear and nonlinear encoding, at the resolution of single cells and at the scale of complete circuits. DOI: http://dx.doi.org/10.7554/eLife.05241.001 PMID:26517879
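The hierarchical subunit structure described above can be sketched as a two-layer model: each subunit pools its cones linearly, the pooled signal is rectified, and the ganglion cell sums the rectified subunit outputs. The weights, rectifier, and grating stimulus below are illustrative assumptions, not the fitted model from the study; they show why such a model responds where a purely linear model predicts nothing:

```python
import numpy as np

def subunit_model(cone_inputs, subunit_weights):
    """Two-layer subunit model: linear pooling per subunit,
    half-wave rectification, then summation by the ganglion cell."""
    drive = np.array([w @ cone_inputs for w in subunit_weights])
    return np.maximum(drive, 0.0).sum()

# two subunits of two cones each (illustrative weights)
w1 = np.array([1.0, 1.0, 0.0, 0.0])
w2 = np.array([0.0, 0.0, 1.0, 1.0])

# a contrast-reversing grating cancels exactly in a linear model ...
grating = np.array([1.0, 1.0, -1.0, -1.0])
linear_response = (w1 + w2) @ grating        # 0.0 at either phase

# ... but the rectified subunits respond at both grating phases
r_phase0 = subunit_model(grating, [w1, w2])   # 2.0
r_phase1 = subunit_model(-grating, [w1, w2])  # 2.0
```

This phase-independent response to a stimulus that is linearly null is the classic experimental signature used to reveal nonlinear subunits.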
What is the effect of LiDAR-derived DEM resolution on large-scale watershed model results?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ping Yang; Daniel B. Ames; Andre Fonseca
This paper examines the effect of raster cell size on hydrographic feature extraction and hydrological modeling using LiDAR-derived DEMs. LiDAR datasets for three experimental watersheds were converted to DEMs at various cell sizes. Watershed boundaries and stream networks were delineated from each DEM and compared to reference data. Hydrological simulations were conducted and the outputs were compared. Smaller cell sizes consistently resulted in less difference between DEM-delineated features and reference data. However, only minor differences were found between streamflow simulations from a lumped watershed model run at a daily time step and aggregated to an annual average. These findings indicate that while higher-resolution DEM grids may result in a more accurate representation of terrain characteristics, such variations do not necessarily improve watershed-scale simulation modeling. Hence, the additional expense of generating high-resolution DEMs for the purpose of watershed modeling at daily or longer time steps may not be warranted.
NASA Astrophysics Data System (ADS)
Lamb, Derek A.
2016-10-01
While sunspots follow a well-defined pattern of emergence in space and time, small-scale flux emergence is assumed to occur randomly at all times in the quiet Sun. HMI's full-disk coverage, high cadence, spatial resolution, and duty cycle allow us to probe that basic assumption. Some case studies of emergence suggest that temporal clustering on spatial scales of 50-150 Mm may occur. If clustering is present, it could serve as a diagnostic of large-scale subsurface magnetic field structures. We present the results of a manual survey of small-scale flux emergence events over a short time period, and a statistical analysis addressing the question of whether these events show spatio-temporal behavior that is anything other than random.
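One standard way to ask whether events cluster in space and time, in the spirit of the statistical analysis described, is a Knox-style permutation test: count event pairs that are close in both space and time, then compare that count against a null distribution built by shuffling the event times. The thresholds and toy event catalogue below are illustrative assumptions, not the survey's actual data or method:

```python
import numpy as np

def knox_statistic(xy, t, d_space, d_time):
    """Number of event pairs closer than d_space AND d_time."""
    dx = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    dt = np.abs(t[:, None] - t[None, :])
    close = (dx < d_space) & (dt < d_time)
    iu = np.triu_indices(len(t), k=1)   # each pair counted once
    return int(close[iu].sum())

def knox_pvalue(xy, t, d_space, d_time, n_perm=199, seed=0):
    """One-sided p-value: fraction of time-shuffled catalogues at
    least as clustered as the observed one."""
    rng = np.random.default_rng(seed)
    obs = knox_statistic(xy, t, d_space, d_time)
    null = [knox_statistic(xy, rng.permutation(t), d_space, d_time)
            for _ in range(n_perm)]
    return (1 + sum(s >= obs for s in null)) / (n_perm + 1)

# toy "emergence events": positions in Mm, times in hours
rng = np.random.default_rng(2)
xy = rng.uniform(0, 1000, size=(60, 2))
t = rng.uniform(0, 72, size=60)

# thresholds motivated by the 50-150 Mm scale mentioned above
p = knox_pvalue(xy, t, d_space=100.0, d_time=6.0)
```

A small p would indicate space-time clustering beyond chance; for purely random events, as here, p is typically unremarkable.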
USDA-ARS?s Scientific Manuscript database
In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...
Bottom-up production of meta-atoms for optical magnetism in visible and NIR light
NASA Astrophysics Data System (ADS)
Barois, Philippe; Ponsinet, Virginie; Baron, Alexandre; Richetti, Philippe
2018-02-01
Many unusual optical properties of metamaterials arise from the magnetic response of engineered structures of sub-wavelength size (meta-atoms) exposed to light. The top-down approach, whereby engineered nanostructures of well-defined morphology are engraved on a surface, has proved successful for the generation of strong optical magnetism. However, it faces the limitations of high cost and small active area in visible light, where nanometre resolution is needed. The bottom-up approach, whereby metamaterials of large volume or large area are fabricated by combining nanochemistry and self-assembly techniques, may constitute a cost-effective alternative. This approach nevertheless requires the large-scale production of functional building blocks (meta-atoms) bearing a strong magnetic optical response. In this paper, we propose a few routes toward the large-scale synthesis of magnetic metamaterials operating in visible or near-IR light.