Science.gov

Sample records for dems method evaluation

  1. Performance Evaluation of Four DEM-Based Fluvial Terrace Mapping Methods Across Variable Geomorphic Settings: Application to the Sheepscot River Watershed, Maine

    NASA Astrophysics Data System (ADS)

    Hopkins, A. J.; Snyder, N. P.

    2014-12-01

Fluvial terraces are utilized in geomorphic studies as recorders of land-use, climate, and tectonic history. Advances in digital topographic data, such as high-resolution digital elevation models (DEMs) derived from airborne lidar surveys, have promoted the development of several methods for extracting terraces from DEMs based on their characteristic morphology. The post-glacial landscape of the Sheepscot River watershed, Maine, where strath and fill terraces record Pleistocene deglaciation, Holocene eustatic forcing, and Anthropocene land-use change, was selected for a comparison of terrace mapping methodologies. At four study sites within the watershed, terraces were manually mapped to facilitate the comparison between fully and semi-automated DEM-based mapping procedures, including: (1) edge detection functions in Matlab, (2) feature classification algorithms developed by Wood (1996), (3) spatial relationships between interpreted terraces and surrounding topography (Walter et al., 2007), and (4) the TerEx terrace mapping toolbox developed by Stout and Belmont (2014). Each method was evaluated on its accuracy and ease of implementation. The four study sites vary in longitudinal slope (0.1% - 5%), channel width (<5 m - 30 m), relief of the surrounding landscape (15 m - 75 m), type and density of surrounding land use, and mapped surficial geologic units. In general, all methods overestimate terrace areas (average predicted area 136% of the manually defined area). Surrounding topographic relief appears to exert the greatest control on mapping accuracy, with the most accurate results (92% of terrace area mapped by the Walter et al., 2007 method) achieved where the river valley was most confined by adjacent hillslopes. Accuracy decreased for study sites surrounded by a low-relief landscape, where the most accurate results were achieved by the TerEx toolbox (Stout and Belmont, 2014; predicted areas were 45% and 89% of manual delineations).

  2. Pre-Conditioning Optimization Methods and Display for Mega-Pixel DEM Reconstructions

    NASA Astrophysics Data System (ADS)

    Sette, A. L.; DeLuca, E. E.; Weber, M. A.; Golub, L.

    2004-05-01

The Atmospheric Imaging Assembly (AIA) for the Solar Dynamics Observatory will provide mega-pixel solar corona data at an unprecedented rate. This heightens the need for faster differential emission measure (DEM) reconstruction methods, as well as scientifically useful ways of displaying this information for mega-pixel datasets. We investigate pre-conditioning methods, which optimize DEM reconstruction by making an informed initial DEM guess that takes advantage of the sharing of DEM information among the pixels in an image. In addition, we evaluate the effectiveness of different DEM image display options, including single-temperature emission maps and time-progression DEM movies. This work is supported under contract SP02D4301R to the Lockheed Martin Corp.

  3. Gauss-Newton method for DEM co-registration

    NASA Astrophysics Data System (ADS)

    Wang, Kunlun; Zhang, Tonggang

    2015-12-01

Digital elevation model (DEM) co-registration is an active research problem and a critical technology for multi-temporal DEM analysis, with wide potential application in fields such as geological hazard assessment. Most current DEM co-registration methods are based on the least-squares principle, in which the matching parameters are obtained by iteration and the surfaces are then co-registered. To improve the iterative convergence rate, a Gauss-Newton method for DEM co-registration (G-N) is proposed in this paper. A gradient formula based on a gridded discrete surface is derived in theory, resolving the difficulty of applying the Gauss-Newton method to DEM matching. With the G-N algorithm, the surfaces approach each other along the maximal gradient direction, so the iterative convergence and the efficiency of the new method are greatly enhanced. In experiments on simulated datasets, the average convergence rates of the rotation and translation parameters of the G-N algorithm were increased by 40% and 15%, respectively, compared to those of the ICP algorithm, and the computational efficiency of the G-N algorithm was 74.9% better.
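The abstract does not reproduce the paper's gradient formula, but the general shape of a Gauss-Newton surface co-registration can be sketched as follows. This is an illustrative reimplementation, not the authors' code: it assumes a simplified three-parameter model (horizontal shift plus vertical offset) with nearest-cell sampling, and the function name `gauss_newton_coregister` is hypothetical.

```python
import numpy as np

def gauss_newton_coregister(ref, mov, iters=30):
    """Gauss-Newton estimate of a shift p = (dx, dy, dz) such that
    mov(x + dx, y + dy) + dz ~= ref(x, y) for two gridded surfaces.
    Gradients of the discrete (gridded) surface supply the Jacobian."""
    gy, gx = np.gradient(mov)                       # d/drow, d/dcol of the grid
    ys, xs = np.mgrid[1:ref.shape[0] - 1, 1:ref.shape[1] - 1]
    p = np.zeros(3)                                 # (dx, dy, dz)
    for _ in range(iters):
        # nearest-cell sampling of the moving surface at shifted positions
        yy = np.clip(np.rint(ys + p[1]).astype(int), 0, mov.shape[0] - 1)
        xx = np.clip(np.rint(xs + p[0]).astype(int), 0, mov.shape[1] - 1)
        r = (mov[yy, xx] + p[2] - ref[ys, xs]).ravel()      # residuals
        J = np.column_stack([gx[yy, xx].ravel(),            # dr/d(dx)
                             gy[yy, xx].ravel(),            # dr/d(dy)
                             np.ones(r.size)])              # dr/d(dz)
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)       # Gauss-Newton step
        p += step
        if np.linalg.norm(step) < 1e-6:                     # converged
            break
    return p
```

Because the surfaces are matched along their own gradients, the step direction is the steepest-descent direction of the residual, which is the intuition behind the faster convergence claimed above.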

  4. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

Modelling glacial lake outburst floods (GLOFs), or 'jökulhlaups', necessarily involves propagating large and often stochastic uncertainties through the source-to-impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs of different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results based on the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those based on the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those based on a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating the potential socio-economic implications of a GLOF event. Incorporating a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
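The MC-LCP internals are only available in the paper's supplementary information; as a hedged illustration of the general idea, the sketch below perturbs a DEM with Gaussian vertical error, routes a least-cost path (Dijkstra, with cell elevation as a deliberately simple cost) through each realization, and reports per-cell path frequencies. All names and the cost function are assumptions, not the authors' implementation.

```python
import heapq, random

def least_cost_path(dem, start, end):
    """Dijkstra least-cost path on a grid; the cost of stepping into a cell
    is its elevation (an illustrative proxy: water prefers low ground).
    Returns the set of (row, col) cells on the path."""
    rows, cols = len(dem), len(dem[0])
    dist, prev, seen = {start: 0.0}, {}, set()
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell in seen:
            continue
        seen.add(cell)
        if cell == end:
            break
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + dem[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)], prev[(nr, nc)] = nd, cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, cell = {start}, end
    while cell != start:            # walk predecessors back to the start
        path.add(cell)
        cell = prev[cell]
    return path

def mc_lcp(dem, start, end, sigma=1.0, n=100, seed=0):
    """Monte Carlo LCP sketch: perturb the DEM n times with N(0, sigma)
    vertical error and count how often each cell lies on the least-cost
    path, yielding a per-cell inundation-likelihood continuum."""
    rng, counts = random.Random(seed), {}
    for _ in range(n):
        noisy = [[max(0.0, z + rng.gauss(0, sigma)) for z in row] for row in dem]
        for cell in least_cost_path(noisy, start, end):
            counts[cell] = counts.get(cell, 0) + 1
    return {cell: k / n for cell, k in counts.items()}
```

Cells that appear on the path in every realization get probability 1.0; cells whose membership depends on the DEM error realization form the uncertainty continuum described above.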

  5. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2009-09-01

In DEM construction, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modeling method based on surface theory. To date, HASM has only been used for interpolation of scattered points, so this paper attempts to construct a DEM from characteristic terrain information, namely stream lines and scattered points, using the HASM method. The procedure comprises the following steps. First, a TIN (Triangulated Irregular Network) is generated from the scattered points. Second, each segment of the stream lines is oriented to represent the flow direction, and a tree data structure (with parent, children and brothers) is used to represent all stream-line segments; a segment is a curve that does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlap or repetition, from the uppermost to the lowermost reaches. From the stream lines' tree structure, all possible WCF lines are enumerated, and the start and end point of each WCF line is predicted by searching the TIN. Third, given a cell size, a 2-D matrix for the study region is built, and the cells traversed by the stream lines are assigned values by linear interpolation along each WCF line. Fourth, all the valued cells along the stream lines and those from the scattered points are gathered as known scattered sampling points, and HASM is used to construct the final DEM. A case study on a typical plateau landform of China, the KongTong gully of the Dongzhi Plateau, Qingyang, Gansu Province, is presented. 
The original data were manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines
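HASM itself is not reproduced here, but the third step, assigning values to the grid cells traversed by a WCF line by linear interpolation along the line, can be sketched as follows. The function name and the quarter-cell sampling density are illustrative assumptions.

```python
import math

def rasterize_wcf_line(vertices, cell_size):
    """vertices: [(x, y, z), ...] along one WCF line, upstream to downstream.
    Returns {(col, row): z} for every grid cell the line passes through,
    with z linearly interpolated along each segment of the line."""
    cells = {}
    step = cell_size / 4.0                    # sample finer than the grid
    for (x0, y0, z0), (x1, y1, z1) in zip(vertices, vertices[1:]):
        length = math.hypot(x1 - x0, y1 - y0)
        n = max(1, int(length / step))
        for i in range(n + 1):
            t = i / n                         # position along the segment
            x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            z = z0 + t * (z1 - z0)            # linear interpolation of height
            # keep the first (upstream) value seen for a cell
            cells.setdefault((int(x // cell_size), int(y // cell_size)), z)
    return cells
```

The valued cells returned here would then be merged with the scattered points and passed to HASM as the known sampling set, as in the fourth step above.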

  7. Laser Altimeter Evaluation of an SRTM DEM for Western Washington State

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Harding, D. J.

    2002-05-01

Interferometric Synthetic Aperture Radar (InSAR) and laser altimeter measurements of topography provide complementary approaches to characterize landforms. Results from the Shuttle Radar Topography Mission (SRTM) will provide an unprecedented, near-global, Digital Elevation Model (DEM) at 30 m resolution using a single-pass C-band (5.6 cm wavelength) radar interferometer. In vegetated terrains, the C-band radar energy penetrates partway into the vegetation cover. The elevation of the resulting radar phase center, somewhere between the canopy top and the underlying ground, depends on the vegetation height, density, structure, and presence or absence of foliage. The high vertical accuracy and spatial resolution achieved by laser altimeters, and their capability to directly measure the vertical distribution of vegetation and underlying ground topography, provide a method to evaluate InSAR representations of topography. In order to provide an independent assessment of SRTM DEM accuracy and error characteristics, a simple but rigorous methodology based on comparisons to airborne and satellite laser altimeter profiles has been developed and tested. Initially, an SRTM DEM produced for a large part of western Washington State by the JPL PI processor has been compared to Shuttle Laser Altimeter (SLA) and airborne Scanning Lidar Imager of Canopies by Echo Recovery (SLICER) data. The accuracy of the laser altimeter data sets has been previously characterized. For SLICER profiles, each about 40 km long, the mean and standard deviation of elevation differences between the SRTM DEM and the SLICER-defined canopy top and ground are computed. The SRTM DEM is usually located between the canopy top and the ground. A poor correlation is observed between the per-pixel error estimate provided with the SRTM DEM and the observed SLICER-to-SRTM elevation differences. In addition to these profile comparisons, a very high resolution DEM acquired by Terrapoint, LLC for the Puget Sound Lidar Consortium

  8. Stochastic Discrete Equation Method (sDEM) for two-phase flows

    SciTech Connect

    Abgrall, R.; Congedo, P.M.; Geraci, G.; Rodio, M.G.

    2015-10-15

A new scheme for the numerical approximation of a five-equation model taking into account Uncertainty Quantification (UQ) is presented. In particular, the Discrete Equation Method (DEM) for the discretization of the five-equation model is modified to include a formulation based on the adaptive Semi-Intrusive (aSI) scheme, thus yielding a new intrusive scheme (sDEM) for simulating stochastic two-phase flows. Some reference test cases are performed to demonstrate the convergence properties and efficiency of the overall scheme. The propagation of initial-condition uncertainties is evaluated in terms of the mean and variance of several thermodynamic properties of the two phases.

  9. The Discrepancy Evaluation Model. II. The Application of the DEM to an Educational Program.

    ERIC Educational Resources Information Center

    Steinmetz, Andres

    1976-01-01

    The discrepancy evaluation model (DEM) specifies that evaluation consists of comparing performance with a standard, yielding discrepancy information. DEM is applied to programs in order to improve the program by making standards-performance-discrepancy cycles explicit and public. Action-oriented planning is involved in creating standards; a useful…

  10. A New DEM Generalization Method Based on Watershed and Tree Structure

    PubMed Central

    Chen, Yonggang; Ma, Tianwu; Chen, Xiaoyin; Chen, Zhende; Yang, Chunju; Lin, Chenzhi; Shan, Ligang

    2016-01-01

DEM generalization is the basis of multi-dimensional terrain observation, expression, and analysis, and is central to building multi-scale geographic databases; accordingly, many researchers have studied both its theory and its methods. This paper proposes a new terrain generalization method that extracts feature points based on a tree-model construction that considers the nested relationship of watershed characteristics. Using the 5 m resolution DEM of the Jiuyuan gully watersheds in the Loess Plateau as the original data, the feature points in every single watershed were extracted to reconstruct the DEM. Generalization from a 1:10000 DEM to a 1:50000 DEM was achieved by computing the best threshold, which was found to be 0.06. Finally, the height accuracy of the generalized DEM is analyzed by comparing it against classic methods such as aggregation, resampling, and VIP, based on the original 1:50000 DEM. The outcome shows that the method performs well: it can choose the best threshold according to the target generalization scale to set the density of feature points in the watershed, and it preserves the skeleton of the terrain, meeting the needs of different levels of generalization. Additionally, through overlapped contour comparison, elevation statistical parameters, and slope and aspect analysis, we found that the W8D algorithm performs well and represents terrain effectively. PMID:27517296

  12. Evaluation of DEM generation accuracy from UAS imagery

    NASA Astrophysics Data System (ADS)

    Santise, M.; Fornari, M.; Forlani, G.; Roncella, R.

    2014-06-01

The growing use of UAS platforms for aerial photogrammetry comes with a new family of highly automated Computer Vision processing packages expressly built to manage the peculiar characteristics of these image blocks. It is therefore of interest to photogrammetrists and professionals to find out whether the image orientation and DSM generation methods implemented in such software are reliable, and whether the DSMs and orthophotos are accurate. More generally, it is interesting to determine whether it is still worth applying the standard rules of aerial photogrammetry to drones, achieving the same inner strength and the same accuracies. With these goals in mind, a test area was set up at the University Campus in Parma. A large number of ground points was measured on natural as well as signalized points to provide a comprehensive test field for checking the accuracy performance of different UAS systems. Points both at ground level and on building roofs were measured, in order to obtain support that is also well distributed altimetrically. Control points were set on different types of surfaces (buildings, asphalt, targets, grass fields and bumps); break lines were also employed. The paper presents the results of a comparison between two different surveys for DEM (Digital Elevation Model) generation, performed at 70 m and 140 m flying height, using a Falcon 8 UAS.

  13. Development and Evaluation of Simple Measurement System Using the Oblique Photo and dem

    NASA Astrophysics Data System (ADS)

    Nonaka, H.; Sasaki, H.; Fujimaki, S.; Naruke, S.; Kishimoto, H.

    2016-06-01

When a disaster occurs, we must grasp and evaluate the damage as soon as possible, typically by estimating it from photographs of some kind, such as surveillance camera imagery, satellite imagery, or photographs taken from a helicopter. Especially in the initial stage, a quick, approximate assessment of the damage is more important than a lengthy, detailed investigation. If we can measure targets in such imagery, we can estimate the length of a lava flow, the reach of ejected cinders, or a sediment volume in a volcanic eruption or landslide. We therefore developed a simplified measurement system that uses these photographs to obtain such information quickly. The system requires a DEM in addition to the photographs, but a previously acquired DEM can be used. Measuring an object takes only two steps. The first is determining the position and posture from which the photograph was shot, which we do using the DEM. The second is measuring the object in the photograph. In this paper, we describe the system and present experimental results evaluating it. In the experiment we measured the summit of Mt. Usu using the system's two measurement methods; the measurement took about one hour, and the differences between the results and airborne LiDAR data were less than 10 meters.

  14. A coupled DEM-CFD method for impulse wave modelling

    NASA Astrophysics Data System (ADS)

    Zhao, Tao; Utili, Stefano; Crosta, GiovanBattista

    2015-04-01

Rockslides can be characterized by a rapid evolution, up to a possible transition into a rock avalanche, which can be associated with an almost instantaneous collapse and spreading. Different examples are available in the literature, but the Vajont rockslide is quite unique for its morphological and geological characteristics, for its type of evolution, and for the availability of long-term monitoring data. This study advocates the use of a DEM-CFD framework for modelling the generation of hydrodynamic waves due to the impact of a rapidly moving rockslide or rock-debris avalanche. 3D DEM analyses in plane strain by a coupled DEM-CFD code were performed to simulate the rockslide from its onset to the impact with still water and the subsequent wave generation (Zhao et al., 2014). The predicted physical response is in broad agreement with the available observations. The numerical results are compared to those published in the literature and especially to Crosta et al. (2014). According to our results, the maximum computed run-up amounts to ca. 120 m and 170 m for the eastern and western lobe cross sections, respectively; these values are reasonably similar to those recorded during the event (ca. 130 m and 190 m, respectively). In these simulations the slope mass is considered permeable, so that the toe region of the slope can move submerged in the reservoir and the impulse water wave can also flow back into the slope mass. However, the upscaling of the grain size in the DEM model leads to an unrealistically high hydraulic conductivity, such that only a small amount of water is splashed onto the northern bank of the Vajont valley. The use of a high fluid viscosity and a coarse-grain model has shown the possibility of modelling both the slope and wave motions more realistically. However, more detailed slope and fluid properties, and the need for computational efficiency, should be considered in future research work. This aspect has also been

  15. Evaluation of terrain datasets for LiDAR data thinning and DEM generation for watershed delineation applications

    NASA Astrophysics Data System (ADS)

    Olivera, F.; Ferreira, C.; Djokic, D.

    2010-12-01

Watershed delineation based on Digital Elevation Models (DEMs) is now standard practice in hydrologic studies. Efforts to develop high-resolution DEMs continue, although the advantages of increased accuracy are partially offset by larger file sizes, harder data handling, slower screen rendering and greater computational effort. Among these efforts, those based on Light Detection and Ranging (LiDAR) pose the problem that the interpolation techniques in commercially available GIS software packages (e.g., IDW, Spline, Kriging and TOPORASTER, among others) for developing DEMs from point elevations have difficulty processing large amounts of data. The terrain dataset is an alternative format for storing topographic data that intelligently decimates data points and creates simplified, yet equally accurate for practical purposes, DEMs or Triangular Irregular Networks (TINs). This study uses terrain datasets to evaluate the impact that the thinning method (i.e., window size and z-value), pyramid level and interpolation technique (linear or natural neighbor) used to create the DEMs have on the watersheds delineated from them. Two case studies were considered: dendritic topography in Williamson Creek, Austin, Texas, and deranged topography in Hillsborough County, Florida. The results were compared using three standardized error metrics that measure the accuracy of the watershed boundaries, as well as computational effort. For Williamson Creek (steeper terrain), neither point thinning during terrain creation nor the choice of interpolation method affected the watershed delineation, while in Hillsborough County (flat terrain) the point thinning method and interpolation technique strongly influenced the resulting delineation.
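The terrain dataset's internal thinning rules belong to the GIS software; as a hedged sketch of the general window-size/z-value idea, the code below buckets LiDAR points into square windows and collapses a window to a single mean point only when its elevation range is within a z-tolerance. The function name and the collapse rule are assumptions, not the actual implementation.

```python
from collections import defaultdict

def thin_points(points, window, z_tol):
    """points: iterable of (x, y, z). Bucket points into square windows of
    side `window`; if a window's elevation range is within `z_tol`, replace
    its points with one mean point (flat ground thins heavily), otherwise
    keep them all (breaks and steep ground are preserved)."""
    buckets = defaultdict(list)
    for x, y, z in points:
        buckets[(int(x // window), int(y // window))].append((x, y, z))
    kept = []
    for pts in buckets.values():
        zs = [p[2] for p in pts]
        if max(zs) - min(zs) <= z_tol:        # locally flat: one point suffices
            n = len(pts)
            kept.append((sum(p[0] for p in pts) / n,
                         sum(p[1] for p in pts) / n,
                         sum(zs) / n))
        else:                                  # locally rough: keep detail
            kept.extend(pts)
    return kept
```

This is why, in the results above, thinning barely matters on steep terrain (few windows pass the tolerance test, so little is removed) but matters greatly on flat terrain, where almost every window collapses.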

  16. Research of the gas-solid flow character based on the DEM method

    NASA Astrophysics Data System (ADS)

    Wang, Xueyao; Xiao, Yunhan

    2011-12-01

Numerical simulation of gas-solid flow behavior in a rectangular fluidized bed is carried out in three dimensions by the discrete element method (DEM). An Euler method and a Lagrange method are employed for the gas phase and the solid phase, respectively. Inter-particle collision forces, particle-wall impact forces, drag, gravity, the Magnus lift force and the Saffman lift force are considered in the mathematical models, and a soft-sphere model is used to describe particle collisions. In addition, an Euler method is also applied to the solid phase for comparison with the DEM results. The flow patterns, mean particle velocities, particle diffusion and pressure drop of the bed under typical operating conditions are obtained. The results show that the DEM can describe detailed inter-particle information, whereas the Euler-Euler method cannot capture micro-scale behavior. With either method, particle diffusion increases with gas velocity; however, because the Euler-Euler method cannot simulate the clustering and collision of particles, the energy lost in collisions cannot be calculated and the diffusion it predicts is larger. The DEM also shows that, as the carrying capacity strengthens, more and more particles are carried upward and a dense suspension upflow pattern forms, whereas the Euler-Euler results are not consistent with the real situation.
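The soft-sphere model mentioned above is commonly implemented as a linear spring-dashpot acting on the particle overlap; a minimal 2-D sketch of the normal contact force follows. The stiffness and damping values are illustrative placeholders, and this is not the authors' code.

```python
import math

def normal_contact_force(x1, v1, r1, x2, v2, r2, kn=1.0e4, cn=5.0):
    """Linear spring-dashpot (soft-sphere) normal force on particle 1 from
    contact with particle 2, in 2-D. x*, v* are (x, y) position/velocity
    tuples, r* are radii. Returns (fx, fy); zero if there is no overlap.
    kn (spring stiffness) and cn (damping) are illustrative values."""
    dx, dy = x1[0] - x2[0], x1[1] - x2[1]
    dist = math.hypot(dx, dy)
    overlap = r1 + r2 - dist                     # positive when in contact
    if overlap <= 0 or dist == 0:
        return (0.0, 0.0)
    nx, ny = dx / dist, dy / dist                # unit normal from 2 to 1
    vn = (v1[0] - v2[0]) * nx + (v1[1] - v2[1]) * ny   # approach speed
    f = kn * overlap - cn * vn                   # spring minus dashpot
    return (f * nx, f * ny)
```

Summing such pairwise forces (plus drag, gravity and the lift forces listed above) over every contact is what makes DEM resolve the micro-scale particle information that the Euler-Euler description averages away.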

  17. Structural and Volumetric re-evaluation of the Vaiont landslide using DEM techniques

    NASA Astrophysics Data System (ADS)

    Superchi, Laura; Pedrazzini, Andrea; Floris, Mario; Genevois, Rinaldo; Ghirotti, Monica; Jaboyedoff, Michel

    2010-05-01

On 9 October 1963 a catastrophic landslide occurred on the southern slope of the Vaiont dam reservoir. A mass of approximately 270 million m3 collapsed into the reservoir, generating a wave that overtopped the dam and hit the town of Longarone and other villages: almost 2000 people lost their lives. The large volume and high velocity of the landslide, combined with the great destruction and loss of life, make the Vaiont landslide a natural laboratory for investigating landslide failure mechanisms and propagation. Geological, structural, geomorphological, hydrogeological and geomechanical elements should therefore be re-analyzed using methods and techniques not available in the 1960s. To better quantify the volume involved in the movement and to assess the failure mechanism, a structural study is a necessary preliminary step. The structural features were investigated using a digital elevation model (DEM) of the pre- and post-landslide topography at a pixel size of 5 m and associated software (COLTOP-3D) to create a colored shaded relief map revealing the orientation of morphological features. The results identified six main discontinuity sets on both the pre- and post-slide surfaces, some of which directly influence the Vaiont landslide morphology; recent and old field surveys validated the COLTOP-3D analysis results. To estimate the location and shape of the sliding surface and to evaluate the volume of the landslide, the SLBL (Sloping Local Base Level) method was used: a simple and efficient tool that allows a geometric interpretation of the failure surface based on a DEM. The SLBL application required a geological interpretation to define the contours of the landslide and to estimate the possible curvature of the sliding surface, which is defined by interpolating between points considered as limits of the landslide. 
The SLBL surface of the Vaiont landslide was obtained from the DEM reconstruction
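The SLBL idea can be sketched as an iterative lowering of the DEM inside the mapped landslide limits. The grid formulation below (a common variant, with an illustrative tolerance, and not necessarily the exact scheme used in the study) lowers each masked cell toward the mean of its four neighbours minus a tolerance, never raising it, until the surface converges.

```python
import numpy as np

def slbl(dem, mask, tol=0.1, iters=500):
    """Sloping Local Base Level sketch: inside the boolean `mask` (the
    landslide limits), iteratively replace each cell by
    min(current, mean-of-4-neighbours - tol). The converged surface is an
    estimate of the failure surface; dem - slbl(...) gives the volume."""
    z = dem.astype(float).copy()
    for _ in range(iters):
        nb = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
              np.roll(z, 1, 1) + np.roll(z, -1, 1)) / 4.0
        new = np.where(mask, np.minimum(z, nb - tol), z)  # only lower masked cells
        if np.max(np.abs(new - z)) < 1e-6:                # converged
            return new
        z = new
    return z
```

The landslide volume estimate is then simply the cell area times `np.sum(dem - slbl(dem, mask))` over the mask, which is the geometric interpretation described above.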

  18. DEM-based Watershed Delineation - Comparison of Different Methods and applications

    NASA Astrophysics Data System (ADS)

    Chu, X.; Zhang, J.; Tahmasebi Nasab, M.

    2015-12-01

Digital elevation models (DEMs) are commonly used for large-scale watershed hydrologic and water quality modeling. With the aid of the latest LiDAR technology, submeter-scale DEM data are often available for many areas in the United States. Precise characterization of the detailed variations in surface microtopography using such high-resolution DEMs is crucial to the related watershed modeling. Various methods have been developed to delineate a watershed, including determination of flow directions and accumulations, identification of subbasin boundaries, and calculation of the relevant topographic parameters. The objective of this study is to examine different DEM-based watershed delineation methods by comparing their unique features and the discrepancies in their results. The study covers not only the traditional watershed delineation methods but also a new puddle-based unit (PBU) delineation method. The specific topics and issues presented involve flow directions (the D8 single flow direction method vs. multi-direction methods), segmentation of stream channels, drainage systems (a single "depressionless" drainage network vs. a hierarchical depression-dominated drainage system), and hydrologic connectivity (static structural connectivity vs. dynamic functional connectivity). A variety of real topographic surfaces are selected and delineated using the selected methods. Comparison of the delineation results emphasizes the importance of method selection and highlights the methods' applicability and potential impacts on watershed modeling.
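The D8 single-flow-direction rule mentioned above assigns each cell to the one of its eight neighbours with the steepest downward slope (diagonal steps use a distance of sqrt(2)). A minimal sketch, returning (row, col) offsets and marking pits and flat cells with `None`, might look like:

```python
import math

def d8_flow_directions(dem):
    """D8 single flow direction: each cell drains to the neighbour with the
    steepest downward slope. `dem` is a list of rows of elevations.
    Returns a grid of (dr, dc) offsets, or None for pits/flats/edges."""
    rows, cols = len(dem), len(dem[0])
    out = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            best, best_d = 0.0, None          # only strictly downward slopes
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == 0 and dc == 0:
                        continue
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        # drop divided by horizontal distance (1 or sqrt(2))
                        slope = (dem[r][c] - dem[nr][nc]) / math.hypot(dr, dc)
                        if slope > best:
                            best, best_d = slope, (dr, dc)
            out[r][c] = best_d
    return out
```

A multi-direction method would instead split the outflow among all downslope neighbours in proportion to slope, which is exactly the D8-vs-multi-direction contrast the study examines.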

  19. Discrete Element Method (DEM) Application to The Cone Penetration Test Using COUPi Model

    NASA Astrophysics Data System (ADS)

    Kulchitsky, A. V.; Johnson, J.; Wilkinson, A.; DeGennaro, A. J.; Duvoy, P.

    2011-12-01

    The cone penetration test (CPT) is a soil strength measurement method to determine the tip resistance and sleeve friction versus depth while pushing a cone into regolith at a controlled, slow quasi-static speed. This test can also be used as an excellent tool to validate the discrete element method (DEM) model by comparing tip resistance and sleeve friction from experiments to model results. DEM by nature requires significant computational resources even for a limited number of particles. Thus, it is important to find particle and ensemble parameters that produce valuable results within reasonable computation times. The Controllable Objects Unbounded Particles Interaction (COUPi) model is a general physical DEM code being developed to model machine/regolith interactions as part of a NASA Lunar Science Institute sponsored project on excavation and mobility modeling. In this work, we consider how different particle shape and size distributions defined in the DEM influence the cone tip and friction sleeve resistance in a CPT DEM simulation. The results are compared to experiments with cone penetration in JSC-1A lunar regolith simulant. The particle shapes include spherical particles, particles composed from the union of three spheres, and some simple polyhedra. This focus is driven by the soil mechanics rule of thumb that particle size and shape distributions are the two most significant factors affecting soil strength. In addition to the particle properties, the packing configuration of an ensemble strongly affects soil strength. Bulk density of the regolith is an important characteristic that significantly influences the tip resistance and sleeve friction (Figure 1). We discuss different approaches used to control granular density in the DEM, including how to obtain higher bulk densities using numerical "shaking" techniques and varying the friction coefficient during computations.

  20. A Review of Discrete Element Method (DEM) Particle Shapes and Size Distributions for Lunar Soil

    NASA Technical Reports Server (NTRS)

    Lane, John E.; Metzger, Philip T.; Wilkinson, R. Allen

    2010-01-01

    As part of ongoing efforts to develop models of lunar soil mechanics, this report reviews two topics that are important to discrete element method (DEM) modeling the behavior of soils (such as lunar soils): (1) methods of modeling particle shapes and (2) analytical representations of particle size distribution. The choice of particle shape complexity is driven primarily by opposing tradeoffs with total number of particles, computer memory, and total simulation computer processing time. The choice is also dependent on available DEM software capabilities. For example, PFC2D/PFC3D and EDEM support clustering of spheres; MIMES incorporates superquadric particle shapes; and BLOKS3D provides polyhedra shapes. Most commercial and custom DEM software supports some type of complex particle shape beyond the standard sphere. Convex polyhedra, clusters of spheres and single parametric particle shapes such as the ellipsoid, polyellipsoid, and superquadric, are all motivated by the desire to introduce asymmetry into the particle shape, as well as edges and corners, in order to better simulate actual granular particle shapes and behavior. An empirical particle size distribution (PSD) formula is shown to fit desert sand data from Bagnold. Particle size data of JSC-1a obtained from a fine particle analyzer at the NASA Kennedy Space Center is also fitted to a similar empirical PSD function.
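    The review fits an empirical PSD formula to sieve data; since the exact formula is not reproduced here, the sketch below uses the common Rosin-Rammler (Weibull) form as a stand-in, with made-up sieve data:

```python
import numpy as np
from scipy.optimize import curve_fit

def rosin_rammler(d, d63, n):
    """Cumulative mass fraction finer than diameter d. This Weibull
    (Rosin-Rammler) form is one common empirical PSD; the report's
    exact formula may differ, so treat this as an illustration."""
    return 1.0 - np.exp(-(d / d63) ** n)

# hypothetical sieve data: diameters (um) and cumulative fraction finer
d = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
frac = np.array([0.05, 0.20, 0.55, 0.85, 0.98])

# least-squares fit of the two PSD parameters
(d63, n), _ = curve_fit(rosin_rammler, d, frac, p0=(200.0, 1.5))
```

    The fitted `d63` is the characteristic size at which ~63% of the mass is finer, and `n` controls the spread of the distribution.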

  1. DEM Extraction from Worldview-3 Stereo-Images and Accuracy Evaluation

    NASA Astrophysics Data System (ADS)

    Hu, F.; Gao, X. M.; Li, G. Y.; Li, M.

    2016-06-01

    This paper validates the potential of Worldview-3 satellite images for large-scale topographic mapping, by choosing Worldview-3 along-track stereo-images of the Yi Mountain area in Shandong province, China, for DEM extraction and accuracy evaluation. Firstly, eighteen accurate and evenly distributed GPS points are collected in the field and used as GCPs/check points, the image points of which are accurately measured, and tie points are extracted from image matching; then, RFM-based block adjustment to compensate the systematic error in image orientation is carried out and the geo-positioning accuracy is calculated and analysed; next, for the two stereo-pairs of the block, DSMs are separately constructed and mosaicked as an entirety, and the corresponding DEM is subsequently generated; finally, compared with the selected check points from a high-precision airborne LiDAR point cloud covering the same test area, the accuracy of the generated DEM with 2-meter grid spacing is evaluated by the maximum (max.), minimum (min.), mean and standard deviation (std.) values of elevation biases. It is demonstrated that, for the Worldview-3 stereo-images used in our research, the planimetric accuracy without GCPs is about 2.16 m (mean error) and 0.55 m (std. error), which is superior to the nominal value, while the vertical accuracy is about -1.61 m (mean error) and 0.49 m (std. error); with a small number of GCPs located in the center and four corners of the test area, the systematic error can be well compensated. The std. value of elevation biases between the generated DEM and the 7256 LiDAR check points is about 0.62 m. Considering the potential uncertainties in image point measurement, stereo matching and elevation editing, the actual accuracy of the DEM generated from Worldview-3 stereo-images is likely even better. Judging from the results, Worldview-3 has the potential for 1:5000 or even larger scale mapping applications.
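    The accuracy evaluation above reduces to simple statistics of the elevation biases at check points; a minimal sketch (the sample values below are invented):

```python
import numpy as np

def elevation_bias_stats(dem_z, check_z):
    """Max/min/mean/standard deviation of elevation biases between a
    generated DEM (sampled at check-point locations) and reference
    check-point elevations, as in the evaluation described above."""
    bias = np.asarray(dem_z) - np.asarray(check_z)
    return {"max": bias.max(), "min": bias.min(),
            "mean": bias.mean(), "std": bias.std(ddof=1)}

# invented DEM vs. check-point elevations (meters)
stats = elevation_bias_stats([101.2, 99.5, 100.8], [100.6, 99.0, 100.1])
```

    A nonzero mean bias indicates a systematic vertical offset (correctable with GCPs), while the standard deviation reflects random error.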

  2. Use of thermal infrared pictures for retrieving intertidal DEM by the waterline method: advantages and limitations

    NASA Astrophysics Data System (ADS)

    Gaudin, D.; Delacourt, C.; Allemand, P.

    2010-12-01

    Digital Elevation Models (DEMs) of intertidal zones are of growing interest for ecological and land-development purposes. They are also a fundamental tool for monitoring current sedimentary movements in those low-energy environments. Such DEMs have to be constructed with a centimetric resolution, as the topographic changes are not predictable and sediment displacements are weak. Direct construction of DEMs by GPS in these muddy environments is difficult: photogrammetric techniques are not efficient on uniformly coloured surfaces, and terrestrial laser scanners are difficult to stabilize on the mud, due to humidity. In this study, we propose to improve and apply the waterline method to retrieve DEMs in intertidal zones. This technique is based on accurately monitoring the boundary between sand and water during a whole tide rise with thermal infrared images. The DEM is made by stacking all these lines, calibrated by an immersed pressure sensor. Using thermal infrared pictures, instead of optical ones, improves the detection of the waterline, since mud and water have very different responses to sun heating and a large emissivity contrast. However, temperature retrieval from thermal infrared data is not trivial, since the luminance of an object is the sum of a radiative part and a reflective part, whose relative proportions are given by the emissivity. In the following equation, B is the equivalent blackbody luminance and L_inc the incident luminance: L_tot = L_rad + L_refl = εB + (1 - ε)L_inc. The infrared waterline technique has been used for the monitoring of a beach located on the Aber Benoit, 8.5 km away from the open sea. The site is mainly constituted of mud, and waves are very small (less than one centimeter high), which are ideal conditions for using the waterline method. A few measurements have been made to produce differential height maps of sediments. We reached a mean resolution of 2 cm and a vertical accuracy better than one centimeter. The results
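    The radiance decomposition in the abstract is easy to illustrate numerically; the emissivity and luminance values below are invented purely for illustration:

```python
# Total luminance seen by the sensor is an emissivity-weighted mix of
# the emitted (blackbody) term and the reflected incident term:
# L_tot = eps * B + (1 - eps) * L_inc. Values below are made up.
def total_luminance(emissivity, blackbody_l, incident_l):
    return emissivity * blackbody_l + (1.0 - emissivity) * incident_l

# water (high emissivity) vs. wet mud (assumed lower) under the same sky
L_water = total_luminance(0.98, blackbody_l=460.0, incident_l=300.0)
L_mud = total_luminance(0.95, blackbody_l=430.0, incident_l=300.0)
```

    The larger the emissivity contrast between mud and water, the sharper the waterline appears in the thermal image.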

  3. Combined DEM Extraction Method from StereoSAR and InSAR

    NASA Astrophysics Data System (ADS)

    Zhao, Z.; Zhang, J. X.; Duan, M. Y.; Huang, G. M.; Yang, S. C.

    2015-06-01

    A pair of SAR images acquired from different positions can be used to generate a digital elevation model (DEM). Two techniques exploiting this characteristic have been introduced: stereo SAR and interferometric SAR. They permit recovery of the third dimension (topography) and, at the same time, identification of the absolute position (geolocation) of pixels included in the imaged area, thus allowing the generation of DEMs. In this paper, a combined StereoSAR and InSAR adjustment model is constructed that unifies DEM extraction from InSAR and StereoSAR into the same coordinate system and thereby improves the three-dimensional positioning accuracy of the target. We assume that there are four images 1, 2, 3 and 4. One pair of SAR images (1, 2) meets the required conditions for InSAR technology, while the other pair (3, 4) forms a stereo image pair. The phase model is based on the InSAR rigorous imaging geometric model. The master image 1 and the slave image 2 are used in InSAR processing, but the slave image 2 is only used in establishing the model: the pixels of the slave image 2 are related to the corresponding pixels of the master image 1 through image coregistration coefficients, from which the corresponding phase is calculated. The slave image is not required in the construction of the phase model. In the Range-Doppler (RD) model, the range equation and Doppler equation are functions of the target geolocation, while in the phase equation the phase is also a function of the target geolocation. We exploit the combined adjustment model to solve for the target geolocation, so the problem reduces to solving for three unknowns from seven equations. The model was tested for DEM extraction using spaceborne InSAR and StereoSAR data and compared with the InSAR and StereoSAR methods respectively. The results showed that the model delivered a better performance on experimental imagery and can be used for DEM extraction applications.
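    Solving three unknowns from seven equations, as described above, is an overdetermined least-squares problem; a synthetic sketch (the design matrix here is random, not the paper's actual observation equations):

```python
import numpy as np

# The combined adjustment yields an overdetermined system: seven
# observation equations in three geolocation unknowns. A linearized
# least-squares solve is sketched with a synthetic Jacobian.
rng = np.random.default_rng(0)
A = rng.standard_normal((7, 3))       # partial derivatives (Jacobian)
x_true = np.array([1.0, -2.0, 0.5])   # "true" coordinate corrections
b = A @ x_true                        # simulated, noise-free observations

# least-squares estimate of the three unknowns
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
```

    With real (noisy) observations the redundancy of the extra four equations is what improves the positioning accuracy.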

  4. "blue Line" Conditioning of A New Physically-based Method For Dem Interpolation

    NASA Astrophysics Data System (ADS)

    Grimaldi, S.; Teles, V.; Bras, R. L.

    A basic issue for Earth Science studies is the accurate representation of the topography. The increase of distributed modeling has stimulated the use of Digital Elevation Models (DEMs). Usually DEMs are created with mathematical and statistical interpolators that seek the best fit to the observed data. They generally create surfaces with unrealistic normal or log-normal distributions of slopes and curvatures. Landscape topography is known to have particular properties such as roughness and self-similarity. The commonly used interpolators do not reproduce those characteristics. In order to build a topography with these characteristic properties, a new physically-based approach was recently developed. This method consists in coupling a standard interpolation method and an erosion model, which constrains the interpolation with geomorphologic laws. This new approach gives more realistic surfaces than the commonly used interpolators. However, for some basins or when data are scarce, this method, like common interpolators, is not able to recognize the main channels and ridges. In these cases, the interpolated surface has a wrong drainage network and a completely different landscape. In order to overcome this difficulty and to improve the performance of the physically-based model, a new procedure uses the "blue line" information to constrain the interpolated surface with the actual network. This information can be easily and accurately obtained from maps or aerial photographs, because only the planar coordinates of the network are needed as input. The steps of the "blue line" procedure are described. Some case studies show the improvement due to the "blue line" information.

  5. High-resolution Pleiades DEMs and improved mapping methods for the E-Corinth marine terraces

    NASA Astrophysics Data System (ADS)

    de Gelder, Giovanni; Fernández-Blanco, David; Delorme, Arthur; Jara-Muñoz, Julius; Melnick, Daniel; Lacassin, Robin; Armijo, Rolando

    2016-04-01

    The newest generation of satellite imagery provides exciting new possibilities for highly detailed mapping, with ground resolution of sub-metric pixels and absolute accuracy within a few meters. This opens new avenues for the analysis of geologic and geomorphic landscape features, especially since photogrammetric methods allow the extraction of detailed topographic information from these satellite images. We used tri-stereo imagery from the Pleiades platform of the CNES in combination with Euclidium software for image orientation, and Micmac software for dense matching, to develop state-of-the-art, 2 m-resolution digital elevation models (DEMs) for eight areas in Greece. Here, we present our mapping results for an area in the eastern Gulf of Corinth, which contains one of the most extensive and well-preserved flights of marine terraces worldwide. The spatial extent of the terraces has been determined by an iterative combination of an automated surface classification model for terrain slope and roughness, and qualitative assessment of satellite imagery, DEM hillshade maps, slope maps, as well as detailed topographic analyses of profiles and contours. We determined marine terrace shoreline angles by means of swath profiles that run perpendicular to the paleo-seacliffs, using the graphical interface TerraceM. Our analysis provided us with minimum and maximum estimates of the paleoshoreline location on ~750 swath profiles, by using the present-day cliff slope as an approximation for its paleo-cliff counterpart. After correlating the marine terraces laterally we obtained 16 different terrace levels, recording Quaternary sea-level highstands of both major interglacial and several interstadial periods. Our high-resolution Pleiades DEMs and improved method for paleoshoreline determination allowed us to produce a marine terrace map of unprecedented detail, containing more terrace sub-levels than hitherto. Our mapping demonstrates that we are no longer limited by the

  6. Sensitivity of watershed attributes to spatial resolution and interpolation method of LiDAR DEMs in three distinct landscapes

    NASA Astrophysics Data System (ADS)

    Goulden, T.; Hopkinson, C.; Jamieson, R.; Sterling, S.

    2014-03-01

    This study investigates scaling relationships of watershed area and stream networks delineated from LiDAR DEMs. The delineations are tested against spatial resolution, including 1, 5, 10, 25, and 50 m, and interpolation method, including Inverse Distance Weighting (IDW), Moving Average (MA), Universal Kriging (UK), Natural Neighbor (NN), and Triangular Irregular Networks (TIN). Study sites include Mosquito Creek, Scotty Creek, and Thomas Brook, representing landscapes with high, low, and moderate change in elevation, respectively. Results show scale-dependent irregularities in watershed area due to spatial resolution at Thomas Brook and Mosquito Creek. The highest sensitivity of watershed area to spatial resolution occurred at Scotty Creek, due to high incidence of LiDAR sensor measurement error and subtle changes in elevation. Length of drainage networks did not show a scaling relationship with spatial resolution, due to algorithmic complications of the stream initiation threshold. Stream lengths of main channels at Thomas Brook and Mosquito Creek displayed systematic increases in length with increasing spatial resolution, described through an average fractal dimension of 1.059. The scaling relationship between stream length and DEM resolution allows estimation of stream lengths from low-resolution DEMs in the absence of high-resolution DEMs. Single stream validation at Thomas Brook showed the 1 m DEM produced the lowest length error and highest spatial accuracy, at 3.7% and 71.3%, respectively. Single stream validation at Mosquito Creek showed the 25 m DEM produced the lowest length error, and the 1 m DEM the highest spatial accuracy, at 0.6% and 61.0%, respectively.
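    The reported scaling of stream length with DEM cell size can be estimated with a log-log regression, following the Richardson-style relation L ∝ g^(1−D); a sketch with hypothetical lengths that follow the scaling exactly (the paper reports an average D of 1.059):

```python
import numpy as np

def fractal_dimension(cell_sizes, stream_lengths):
    """Estimate fractal dimension D from the scaling L ~ g**(1 - D)
    of measured stream length L versus DEM cell size g, via a
    log-log linear fit (illustrative, not the paper's exact workflow)."""
    slope, _ = np.polyfit(np.log(cell_sizes), np.log(stream_lengths), 1)
    return 1.0 - slope

# hypothetical stream lengths following D = 1.06 exactly
g = np.array([1.0, 5.0, 10.0, 25.0, 50.0])   # cell sizes (m)
L = 1000.0 * g ** (1 - 1.06)                 # lengths (m)
```

    Such a fitted D lets one extrapolate main-channel length from a coarse DEM when a fine one is unavailable, as the abstract suggests.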

  7. Evaluating the influence of spatial resolutions of DEM on watershed runoff and sediment yield using SWAT

    NASA Astrophysics Data System (ADS)

    Reddy, A. Sivasena; Reddy, M. Janga

    2015-10-01

    The digital elevation model (DEM) of a watershed forms a key basis for hydrologic modelling, and its resolution plays a key role in accurate prediction of various hydrological processes. This study appraises the effect of DEMs with varied spatial resolutions (namely TOPO 20 m, CARTO 30 m, ASTER 30 m, SRTM 90 m, GEO-AUS 500 m and USGS 1000 m) on the hydrological response of a watershed using the Soil and Water Assessment Tool (SWAT), applied to a case study of the Kaddam watershed in India for estimating runoff and sediment yield. From the results of the case study, it was observed that reach lengths, reach slopes, minimum and maximum elevations, sub-watershed areas, land use mapping areas within the sub-watersheds and the number of HRUs varied substantially with DEM resolution, and consequently resulted in considerable variability in estimated daily runoff and sediment yields. It was also observed that daily runoff values increased (decreased) on low (high) rainfall days with coarser DEM resolution. The daily sediment yield values from each sub-watershed decreased with coarser DEM resolution. The study found that the performance of SWAT model prediction was not influenced much by finer resolution DEMs up to 90 m for estimation of runoff, but resolution certainly influenced the estimation of sediment yields. The TOPO 20 m and CARTO 30 m DEMs provided better estimates of sub-watershed areas, runoff and sediment yield values than the other DEMs.

  8. A practical method for SRTM DEM correction over vegetated mountain areas

    NASA Astrophysics Data System (ADS)

    Su, Yanjun; Guo, Qinghua

    2014-01-01

    Digital elevation models (DEMs) are essential to various applications in topography, geomorphology, hydrology, and ecology. The Shuttle Radar Topographic Mission (SRTM) DEM data set is one of the most complete and most widely used DEM data sets; it provides accurate information on elevations over bare land areas. However, the accuracy of SRTM data over vegetated mountain areas is relatively low as a result of the high relief and the penetration limitation of the C-band used for obtaining global DEM products. The objective of this study is to assess the performance of SRTM DEMs and correct them over vegetated mountain areas with small-footprint airborne Light Detection and Ranging (Lidar) data, which can produce elevation and vegetation products [e.g., vegetation height, Leaf Area Index (LAI)] of high accuracy. The assessment shows that SRTM elevations are systematically higher than those of the actual land surfaces over vegetated mountain areas. The mean difference between the SRTM DEM and the Lidar DEM increases with vegetation height, whereas the standard deviation of the difference increases with slope. To improve the accuracy of the SRTM DEM over vegetated mountain areas, a regression model between the SRTM elevation bias and vegetation height, LAI, and slope was developed based on one control site. Without changing any coefficients, this model proved applicable in all nine study sites, which have various topography and vegetation conditions. The mean bias of the corrected SRTM DEM at the nine study sites using this model (absolute value) is 89% smaller than that of the original SRTM DEM, and the standard deviation of the corrected SRTM elevation bias is 11% smaller.
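    The correction idea above, regressing the SRTM elevation bias on vegetation height, LAI and slope and then subtracting the predicted bias, can be sketched with a linear model (the linear form and the synthetic data are our assumptions; the paper's regression may differ):

```python
import numpy as np

def fit_bias_model(veg_height, lai, slope, bias):
    """Fit bias ~ a*veg_height + b*lai + c*slope + d by least squares
    at a control site (linear form is an illustrative assumption)."""
    X = np.column_stack([veg_height, lai, slope, np.ones(len(bias))])
    coef, *_ = np.linalg.lstsq(X, bias, rcond=None)
    return coef

def correct_srtm(srtm_z, veg_height, lai, slope, coef):
    """Subtract the predicted bias from SRTM elevations elsewhere."""
    X = np.column_stack([veg_height, lai, slope, np.ones(len(srtm_z))])
    return np.asarray(srtm_z) - X @ coef
```

    In practice the predictors themselves come from the Lidar vegetation products mentioned in the abstract.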

  9. Production of Optimized DEM Using IDW Interpolation Method (Case Study; Jam and Riz Basin-Assaloyeh)

    NASA Astrophysics Data System (ADS)

    Soleimani, K.; Modallaldoust, S.

    In this research, preparation of an optimized Digital Elevation Model (DEM) of the Jam and Riz basin was studied using Inverse Distance Weighting (IDW) and GIS techniques. The performance of the IDW method depends on several factors, including cell size, number of neighboring points, point-searching radius and the optimized power. On this basis, two geostatistical methods were used for determining the point-searching radius: the standard ellipse and the standard deviation ellipse. Considering a fixed cell size with a value of 3, which represents the weighting degree of points, and determining the rotation angle and axis lengths of the standard deviation ellipse and calculating the optimized radius of the standard ellipse by statistical methods, the optimized power was automatically derived in the ArcGIS 9.2 environment. In this method the number of neighboring points was varied over four values: 3, 5, 7 and 15. In total, 8 digital elevation models were obtained after the mentioned processes. Finally, digital elevation models 1 to 8 were compared with control points using a compare-means test in SPSS 11.5 statistical software, which showed that IDW-3 performed best and is recommended as the optimized model. Although the results show similar forms, the IDW-3 model, which used seven neighboring points, has the lowest mean standard error of 0.26842.
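    A minimal IDW implementation clarifies the roles of the power and neighbor count tuned above (power=3 and seven neighbors echo the recommended IDW-3 setup; the code is an illustrative sketch, not the ArcGIS procedure):

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=3, n_neighbors=7):
    """Inverse Distance Weighting: each query point gets a weighted
    average of the n nearest known points, with weights 1/d**power."""
    xy_known = np.asarray(xy_known, float)
    z_known = np.asarray(z_known, float)
    out = []
    for q in np.asarray(xy_query, float):
        d = np.hypot(*(xy_known - q).T)
        if d.min() == 0:                     # query coincides with a sample
            out.append(z_known[d.argmin()])
            continue
        idx = np.argsort(d)[:n_neighbors]    # nearest neighbors only
        w = 1.0 / d[idx] ** power
        out.append(np.sum(w * z_known[idx]) / np.sum(w))
    return np.array(out)
```

    A higher power localizes the interpolation (near points dominate); a larger neighbor count smooths it.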

  10. Evaluation of TanDEM-X elevation data for geomorphological mapping and interpretation in high mountain environments - A case study from SE Tibet, China

    NASA Astrophysics Data System (ADS)

    Pipaud, Isabel; Loibl, David; Lehmkuhl, Frank

    2015-10-01

    Digital elevation models (DEMs) are a prerequisite for many different applications in the field of geomorphology. In this context, the two near-global medium-resolution DEMs originating from the SRTM and ASTER missions are widely used. For detailed geomorphological studies, particularly in high mountain environments, these datasets are, however, known to have substantial disadvantages beyond their posting, i.e., data gaps and miscellaneous artifacts. The upcoming TanDEM-X DEM is a promising candidate to improve this situation by application of state-of-the-art radar technology, exhibiting a posting of 12 m and less proneness to errors. In this study, we present a DEM processed from a single TanDEM-X CoSSC scene, covering a study area in the extreme relief of the eastern Nyainqêntanglha Range, southeastern Tibet. The potential of the resulting experimental TanDEM-X DEM for geomorphological applications was evaluated by geomorphometric analyses and an assessment of landform cognoscibility and artifacts in comparison to the ASTER GDEM and the recently released SRTM 1″ DEM. Detailed geomorphological mapping was conducted for four selected core study areas in a manual approach, based exclusively on the TanDEM-X DEM and its basic derivatives. The results show that the self-processed TanDEM-X DEM yields a detailed and largely consistent landscape representation. It thus fosters geomorphological analysis by visual and quantitative means, allowing delineation of landforms down to footprints of ~ 30 m. Even at this preliminary stage, the TanDEM-X elevation data are clearly superior to the ASTER and SRTM datasets, primarily owing to their significantly higher resolution and lower susceptibility to artifacts that hamper landform interpretation. Conversely, challenges for interferometric DEM generation were identified, including (i) triangulation facets and missing topographic information resulting from radar layover on steep slopes facing toward the radar sensor, (ii) low

  11. Development of a coupled discrete element (DEM)-smoothed particle hydrodynamics (SPH) simulation method for polyhedral particles

    NASA Astrophysics Data System (ADS)

    Nassauer, Benjamin; Liedke, Thomas; Kuna, Meinhard

    2016-03-01

    In the present paper, the direct coupling of a discrete element method (DEM) with polyhedral particles and smoothed particle hydrodynamics (SPH) is presented. The two simulation techniques are fully coupled in both ways through interaction forces between the solid DEM particles and the fluid SPH particles. Thus this simulation method provides the possibility to simulate the individual movement of polyhedral, sharp-edged particles as well as the flow field around these particles in fluid-saturated granular matter which occurs in many technical processes e.g. wire sawing, grinding or lapping. The coupled method is exemplified and validated by the simulation of a particle in a shear flow, which shows good agreement with analytical solutions.

  12. Shoreline Mapping with Integrated HSI-DEM using Active Contour Method

    NASA Astrophysics Data System (ADS)

    Sukcharoenpong, Anuchit

    Shoreline mapping has been a critical task for federal/state agencies and coastal communities. It supports important applications such as nautical charting, coastal zone management, and legal boundary determination. Current attempts to incorporate data from hyperspectral imagery to increase the efficiency and efficacy of shoreline mapping have been limited due to the complexity in processing its data as well as its inferior spatial resolution when compared to multispectral imagery or to sensors such as LiDAR. As advancements in remote-sensing technologies increase sensor capabilities, the ability to exploit the spectral information carried in hyperspectral images becomes more imperative. This work employs a new approach to extracting shorelines from AVIRIS hyperspectral images by combination with a LiDAR-based DEM using a multiphase active contour segmentation technique. Several techniques, such as study of object spectra and knowledge-based segmentation for initial contour generation, have been employed in order to achieve a sub-pixel level of accuracy and maintain low computational expenses. Introducing a DEM into hyperspectral image segmentation proves to be a useful tool to eliminate misclassifications and improve shoreline positional accuracy. Experimental results show that mapping shorelines from hyperspectral imagery and a DEM can be a promising approach as many further applications can be developed to exploit the rich information found in hyperspectral imagery.

  13. A Screening Method for Flash Flooding Risk using Instantaneous Unit Hydrographs Derived from High Resolution DEM data

    NASA Astrophysics Data System (ADS)

    Senevirathne, Nalin; Willgoose, Garry

    2015-04-01

    Flash flooding is considered a severe natural hazard and has had significant impacts on people and infrastructure throughout history. Modelling techniques and the understanding of flash flooding are improving with the availability of better quality data such as high-resolution Digital Elevation Models (DEMs). DEMs allow the automated characterization of the influence of geomorphology on the hydrologic response of catchments. They are particularly useful for small ungauged catchments where available hydrologic data (e.g. rainfall, runoff) are sparse and where site-specific studies are rarely done unless some evidence of high risk is available. In this paper, we present new risk indicators, derived directly from instantaneous unit hydrographs (IUH), which can be used to identify flash flooding risk areas within catchments. The study area includes 35 major river basins covering a 1700 km long by 50 km wide coastal strip of Eastern Australia. Standard terrain analysis methods (pit filling, flow direction, local slope, contributing area, flow velocity and travel time) were used to produce IUHs for every pixel in the study area using a high-resolution (1 arc second) DEM. When computing the IUHs, each pixel was considered as the outlet of its own catchment bounded by its contributing area. This allows us to characterise the hydrological response at the finest scale possible for a DEM. Risk indicators related to the rate of storm rise and catchment lag time were derived from the IUHs. Flash flood risk maps were produced at the catchment scale, and they match well with records of the severe flash flooding that occurred around Toowoomba (at the northern end of the coastal strip studied) in January 2011.
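    The IUH construction described above, binning DEM-derived travel times for all cells draining to an outlet, can be sketched as follows (bin width and units are illustrative):

```python
import numpy as np

def unit_hydrograph(travel_times, dt=0.5):
    """Instantaneous unit hydrograph as the normalized histogram of
    cell travel times to the outlet (the DEM-derived construction
    sketched in the abstract; times in hours, bin width dt)."""
    t = np.asarray(travel_times, float)
    bins = np.arange(0.0, t.max() + dt, dt)
    counts, edges = np.histogram(t, bins=bins)
    iuh = counts / (counts.sum() * dt)   # normalize so it integrates to 1
    return edges[:-1], iuh
```

    Indicators such as catchment lag time (the IUH centroid) and rate of rise then follow directly from the binned response.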

  14. 2D Distinct Element Method (DEM) models of the initiation, propagation and saturation of rock joints

    NASA Astrophysics Data System (ADS)

    Arslan, A.; Schöpfer, M. P.; Walsh, J. J.; Childs, C.

    2009-12-01

    In layered sequences, rock joints usually best develop within the more brittle layers and commonly display a regular spacing that scales with layer thickness. A variety of conceptual and mechanical models have been developed for these observations. A limitation of previous approaches, however, is that fracture initiation and associated interface slip are not explicitly simulated; instead, fractures were predefined and interfaces were welded. To surmount this problem, we have modelled the formation and growth of joints in layered sequences by using the two-dimensional Distinct Element Method (DEM) as implemented in the Particle Flow Code (PFC-2D). In PFC-2D, rock is represented by an assemblage of circular particles that are bonded at particle-particle contacts. Failure occurs if either the tensile or shear strength of a bond is exceeded. The models comprise a central brittle layer with high Young’s modulus, which is embedded in a low Young’s modulus matrix. The interfaces between the layers are defined by ‘smooth joint’ contacts, a modelling feature that eliminates interparticle bumpiness and associated interlocking friction. Consequently, this feature allows the user to assign macroscopic properties such as friction and cohesion along layer interfaces in a controlled manner. Layer parallel extension is applied by assigning a velocity to particles at the lateral boundaries of the model while maintaining a constant vertical confining pressure. Models were extended until joint saturation in the central layer was reached. We thereby explored the impact of confining pressure and interface properties (friction, cohesion) on joint spacing. A number of important conclusions can be drawn from our models: (i) The distributions of average horizontal normal stress within the layer and of shear stress at the interface are consistent with analytical solutions (stress-transfer theory). (ii) At low interfacial shear strength, new joints form preferentially midway between

  15. A method of detecting land use change of remote sensing images based on texture features and DEM

    NASA Astrophysics Data System (ADS)

    Huang, Dong-ming; Wei, Chun-tao; Yu, Jun-chen; Wang, Jian-lin

    2015-12-01

    In this paper, a combination method of neural networks and texture information is proposed for remote sensing image classification. The methodology involves extraction of texture features using the gray-level co-occurrence matrix and image classification with a BP artificial neural network. The combination of texture features and the digital elevation model (DEM), used as classification bands for the neural network, served to recognize the different classes. This scheme shows high recognition accuracy in the classification of remote sensing images. In the experiments, the proposed method was successfully applied to remote sensing image classification and land use change detection, and its effectiveness was verified.
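    The gray-level co-occurrence matrix at the core of the texture extraction can be computed directly; the sketch below also derives two common GLCM features (the paper's exact feature set and offsets are not specified here):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Gray-level co-occurrence matrix for one offset (dx, dy), plus
    the contrast and homogeneity features often fed to a classifier
    (a small stand-in for the GLCM + BP-network pipeline above)."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h - dy):
        for j in range(w - dx):
            # count co-occurrences of gray level pairs at the offset
            m[img[i, j], img[i + dy, j + dx]] += 1
    m /= m.sum()
    idx_i, idx_j = np.indices(m.shape)
    contrast = np.sum(m * (idx_i - idx_j) ** 2)
    homogeneity = np.sum(m / (1.0 + np.abs(idx_i - idx_j)))
    return m, contrast, homogeneity
```

    The features (and a DEM band) would then form the input vector to the BP network for each pixel or window.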

  16. High-resolution DEMs in the study of rainfall- and earthquake-induced landslides: Use of a variable window size method in digital terrain analysis

    NASA Astrophysics Data System (ADS)

    Iwahashi, Junko; Kamiya, Izumi; Yamagishi, Hiromitsu

    2012-06-01

    We undertake digital terrain analyses of rainfall- and earthquake-induced landslides in Japan, using high-resolution orthoimagery and Light Detection and Ranging (LiDAR) DEMs. Our aims are twofold: to demonstrate an effective method for dealing with high-resolution DEMs, which are often too detailed for landslide assessments, and to evaluate the topographic differences between rainfall- and earthquake-induced landslides. The study areas include the Izumozaki (1961 and 2004 heavy rainfalls), Niihama (2004 heavy rainfalls), Houfu (2009 heavy rainfalls), and Hanokidachi/Kurikoma-dam regions (the 2008 M 7.2 Iwate-Miyagi Nairiku earthquake). Together, these five regions contain 7,106 landslides. We use two topographic attributes (the slope gradient and the Laplacian) calculated from DEMs with varying window sizes. The hit rates for statistical prediction of landslide cells through discriminant analyses are calculated using the two topographic attributes as explanatory variables and the landslide inventory data as the dependent variable. In cases of surface failure, the hit rates are found to diminish when the window size of the topographic attributes is too large or too small, indicating that an optimal scale factor is key in assessing shallow landslides. The representative window sizes are approximately 30 m for shallow landslides; the optimal window size may be directly related to the average size of landslides in each region. We also find a stark contrast between rainfall- and earthquake-induced landslides. Rainfall-induced landslides are always most common at a slope gradient of 30°, but the frequency of earthquake-induced landslides increases exponentially with slope gradient. We find that the Laplacian, i.e., the attributes of surface convexity and concavity, and the slope gradient are both important factors for rainfall-induced landslides, whereas earthquake-induced landslides are influenced mainly by slope steepness.
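    The two attributes used above, slope gradient and Laplacian over a variable window, can be approximated by smoothing the DEM with a mean filter of the chosen window size before differencing (a simplified stand-in for the paper's exact attribute definitions):

```python
import numpy as np

def slope_and_laplacian(dem, cellsize=1.0, window=3):
    """Slope gradient (degrees) and Laplacian of a DEM computed on a
    mean-smoothed surface; `window` (in cells) plays the role of the
    variable window size discussed in the abstract."""
    k = np.ones((window, window)) / window**2
    h, w = dem.shape
    # mean filter via direct convolution (valid region only)
    sm = np.empty((h - window + 1, w - window + 1))
    for i in range(sm.shape[0]):
        for j in range(sm.shape[1]):
            sm[i, j] = np.sum(dem[i:i + window, j:j + window] * k)
    gy, gx = np.gradient(sm, cellsize)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))
    lap = (np.gradient(gy, cellsize, axis=0)
           + np.gradient(gx, cellsize, axis=1))
    return slope, lap
```

    Sweeping `window` and re-running the discriminant analysis at each size is how an optimal scale (here ~30 m) would be identified.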

  17. Effective Thermal Property Estimation of Unitary Pebble Beds Based on a CFD-DEM Coupled Method for a Fusion Blanket

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Chen, Youhua; Huang, Kai; Liu, Songlin

    2015-12-01

Lithium ceramic pebble beds have been considered in the solid blanket design for fusion reactors. To characterize the thermal performance of the fusion solid blanket, studies of the effective thermal properties, i.e. the effective thermal conductivity and the heat transfer coefficient, of the pebble beds are necessary. In this paper, a 3D computational fluid dynamics-discrete element method (CFD-DEM) coupled numerical model was proposed to simulate heat transfer and thereby estimate the effective thermal properties. The DEM was applied to produce a geometric topology of a prototypical blanket pebble bed by directly simulating the contact state of each individual particle using basic interaction laws. Based on this geometric topology, a CFD model was built to analyze the temperature distribution and obtain the effective thermal properties. The current numerical model was shown to be in good agreement with the existing experimental data for effective thermal conductivity available in the literature. This work was supported by the National Special Project for Magnetic Confined Nuclear Fusion Energy of China (Nos. 2013GB108004, 2015GB108002, 2014GB122000 and 2014GB119000) and the National Natural Science Foundation of China (No. 11175207).
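
The "basic interaction laws" in such DEM simulations are contact force models; a minimal sketch of one common choice, a linear spring-dashpot normal contact, where the stiffness `kn` and damping `cn` are placeholder values and not taken from the paper:

```python
import numpy as np

def normal_contact_force(x1, x2, v1, v2, r1, r2, kn=1e4, cn=5.0):
    """Linear spring-dashpot normal contact force acting on particle 1,
    given positions, velocities, and radii of two spherical particles.
    Returns a zero vector when the particles do not overlap."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    d = x2 - x1
    dist = np.linalg.norm(d)
    overlap = r1 + r2 - dist
    if overlap <= 0.0:
        return np.zeros_like(d)
    n = d / dist  # unit normal pointing from particle 1 to particle 2
    vn = np.dot(np.asarray(v2, float) - np.asarray(v1, float), n)
    # spring pushes particle 1 away from 2; dashpot damps relative approach
    return (-kn * overlap + cn * vn) * n
```

In a full DEM code this force law would be evaluated for every contact pair at each timestep to evolve the packing toward the bed's geometric topology.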

  18. (Yet) A New Method for the Determination of Flow Directions and Contributing Areas over Gridded DEMs

    NASA Astrophysics Data System (ADS)

    Shelef, E.; Hilley, G. E.

    2012-12-01

over-dispersive routing schemes. We used these, as well as complementary methods that prescribe discharge to specific cells, to evaluate these flow-routing schemes. Additionally, we compared results from the different methods to ALSM topography to evaluate areas of the landscape where the results of different flow-routing methods diverged. Finally, we evolved modeled topography using each of these methods and found substantive differences in the structure of the modeled topography depending on the flow-routing scheme used. None of these methods directly addresses the physics of flow over the surface, and as such they are ad hoc representations of surface flow. It is therefore difficult to determine which of these methods is most appropriate without benchmarking against more physically based (and computationally expensive) flow-routing models. Nonetheless, the proposed method accomplishes the necessary dispersion of flow over a topographic surface while maintaining consistency with the gridded coordinate system, and is preferable from this standpoint.

  19. A hierarchical pyramid method for managing large-scale high-resolution drainage networks extracted from DEM

    NASA Astrophysics Data System (ADS)

    Bai, Rui; Tiejian, Li; Huang, Yuefei; Jiaye, Li; Wang, Guangqian; Yin, Dongqin

    2015-12-01

The increasing resolution of Digital Elevation Models (DEMs) and the development of drainage network extraction algorithms make it possible to develop high-resolution drainage networks for large river basins. These vector networks contain massive numbers of river reaches with associated geographical features, including topological connections and topographical parameters. These features create challenges for efficient map display and data management, and particularly for multi-scale hydrological simulations using multi-resolution river networks. In this paper, a hierarchical pyramid method is proposed that iteratively generates coarsened vector drainage networks from the original. The method is based on the Horton-Strahler (H-S) ordering scheme. At each coarsening step, the river reaches with the lowest H-S order are pruned and their related sub-basins are merged. At the same time, the topological connections and topographical parameters of each coarsened drainage network are inherited from the previous level using formulas presented in this study. The method was applied to the drainage networks of a watershed in the Huangfuchuan River basin, extracted from a 1-m-resolution airborne LiDAR DEM, and to the full Yangtze River basin in China, extracted from the 30-m-resolution ASTER GDEM. In addition, a map-display and parameter-query web service was published for the Mississippi River basin, with data extracted from the 30-m-resolution ASTER GDEM. The results presented in this study indicate that the developed method can effectively manage and display massive amounts of drainage network data and can facilitate multi-scale hydrological simulations.
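
The coarsening step described above, pruning the lowest-order reaches, can be sketched on a toy reach network; this covers Strahler ordering and pruning only, omitting the paper's sub-basin merging and parameter-inheritance formulas:

```python
from collections import defaultdict

def strahler_orders(downstream):
    """Compute Horton-Strahler orders for reaches, given a mapping
    reach -> downstream reach (None at the outlet)."""
    upstream = defaultdict(list)
    for r, d in downstream.items():
        if d is not None:
            upstream[d].append(r)
    order = {}
    def visit(r):
        if r in order:
            return order[r]
        ups = [visit(u) for u in upstream[r]]
        if not ups:
            order[r] = 1  # headwater reach
        else:
            m = max(ups)
            # order increases only where two equal-order reaches join
            order[r] = m + 1 if ups.count(m) >= 2 else m
        return order[r]
    outlet = next(r for r, d in downstream.items() if d is None)
    visit(outlet)
    return order

def prune_lowest_order(downstream, order):
    """One pyramid coarsening step: drop reaches of the minimum order."""
    omin = min(order.values())
    return {r: d for r, d in downstream.items() if order[r] > omin}
```

Applying `prune_lowest_order` repeatedly yields the hierarchy of progressively coarser networks that the pyramid manages.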

  20. ASTER DEM performance

    USGS Publications Warehouse

    Fujisada, H.; Bailey, G.B.; Kelly, Glen G.; Hara, S.; Abrams, M.J.

    2005-01-01

The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument onboard the National Aeronautics and Space Administration's Terra spacecraft has an along-track stereoscopic capability, using its near-infrared spectral band to acquire the stereo data. ASTER has two telescopes, one for nadir viewing and another for backward viewing, with a base-to-height ratio of 0.6. The spatial resolution is 15 m in the horizontal plane. Parameters such as the line-of-sight vectors and the pointing axis were adjusted during the initial operation period to generate Level-1 data products with high-quality stereo system performance. The evaluation of the digital elevation model (DEM) data was carried out by Japanese and U.S. science teams separately, using different DEM generation software and reference databases. The vertical accuracy of the DEM data generated from the Level-1A data is 20 m with 95% confidence, without ground control point (GCP) correction, for individual scenes. Geolocation accuracy, which is important for DEM datasets, is better than 50 m; this appears to be limited by the spacecraft position accuracy. In addition, a slight increase in accuracy is observed when GCPs are used to generate the stereo data. © 2005 IEEE.

  1. Evaluation of morphometric parameters of drainage networks derived from topographic maps and DEM in point of floods

    NASA Astrophysics Data System (ADS)

    Ozdemir, Hasan; Bird, Deanne

    2009-02-01

An evaluation of the morphometric parameters of two drainage networks derived from different sources was carried out to determine the influence of sub-basins on flooding of the main channel in the Havran River basin (Balıkesir, Turkey). Drainage networks for the sub-basins were derived both from topographic maps at 1:25,000 scale and from a 10-m-resolution digital elevation model (DEM) using geographic information systems (GIS). The blue lines representing fluvial channels on the topographic maps were accepted as one drainage network, although they do not depict all exterior links in the basin. The second drainage network was extracted from the DEM using a minimum accumulation area threshold so as to include all exterior links. Morphometric parameters were computed for the two types of drainage networks at the sub-basin level and used to assess the influence of the sub-basins on flooding of the main channel. The results show that the drainage network of sub-basin 4 (where a dam was constructed at its outlet to mitigate potential floods) has a lower morphometric propensity to produce floods on the main channel than those of sub-basins 1, 3, and 5. The construction of the dam will help reduce flooding on the main channel from sub-basin 4, but it will not prevent potential flooding from sub-basins 1, 3, and 5, which join the main channel downstream of sub-basin 4. Therefore, flood mitigation efforts should be considered in order to protect the settlements and agricultural lands on the floodplain downstream of the dam. In order to increase our understanding of flood hazards and to determine appropriate mitigation solutions, drainage morphometry research should be included as an essential component of hydrologic studies.
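
One classic morphometric parameter of the kind compared in such studies is the bifurcation ratio; a minimal sketch (the abstract does not list the exact parameters used, so this is illustrative):

```python
def bifurcation_ratios(stream_counts):
    """Bifurcation ratios Rb_u = N_u / N_(u+1), given the number of
    streams N_u of each Horton-Strahler order u, listed in ascending
    order. High ratios indicate elongated, flood-attenuating networks;
    low ratios indicate compact networks prone to sharp flood peaks."""
    return [n / m for n, m in zip(stream_counts, stream_counts[1:])]
```

For example, a basin with 40 first-order, 10 second-order, 2 third-order, and 1 fourth-order stream has ratios 4.0, 5.0, and 2.0.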

  2. Evaluating DEM conditioning techniques, elevation source data, and grid resolution for field-scale hydrological parameter extraction

    NASA Astrophysics Data System (ADS)

    Woodrow, Kathryn; Lindsay, John B.; Berg, Aaron A.

    2016-09-01

Although digital elevation models (DEMs) prove useful for a number of hydrological applications, they are often the end result of numerous processing steps, each of which contains uncertainty. These uncertainties can greatly influence DEM quality and propagate further to DEM-derived attributes, including derived surface and near-surface drainage patterns. This research examines the impacts of DEM grid resolution, elevation source data, and conditioning techniques on the spatial and statistical distributions of field-scale hydrological attributes for a 12,000 ha watershed in an agricultural area of southwestern Ontario, Canada. Three conditioning techniques were examined: depression filling (DF), depression breaching (DB), and stream burning (SB). The catchments draining to each boundary of 7933 agricultural fields were delineated using the surface drainage patterns modeled from LiDAR data interpolated to 1 m, 5 m, and 10 m resolution DEMs, and from a 10 m resolution photogrammetric DEM. The results showed that variation in DEM grid resolution produced significant differences in the spatial and statistical distributions of contributing areas and in the distributions of downslope flowpath length. Degrading the grid resolution of the LiDAR data from 1 m to 10 m resulted in a disagreement in mapped contributing areas of between 29.4% and 37.3% of the study area, depending on the DEM conditioning technique. The disagreements among the field-scale contributing areas mapped from the 10 m LiDAR DEM and the photogrammetric DEM were large, with nearly half of the study area draining to alternate field boundaries.
Differences in derived contributing areas and flowpaths among the various conditioning techniques increased substantially at finer grid resolutions, with the largest disagreements among mapped contributing areas occurring between the 1 m resolution DB DEM and the SB DEM (37% disagreement) and in the DB-DF comparison (36.5% disagreement in mapped
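
Of the three conditioning techniques compared above, stream burning is the simplest to illustrate; a minimal sketch with a fixed burn depth (an assumed parameter, not taken from the study):

```python
import numpy as np

def burn_streams(dem, stream_mask, depth=5.0):
    """Stream burning (SB): lower DEM cells along a mapped stream
    network by a fixed depth so that extracted flow paths are forced
    to follow the known channels. `stream_mask` is a boolean raster
    of the rasterized vector stream network."""
    burned = dem.astype(float).copy()
    burned[stream_mask] -= depth
    return burned
```

Depression filling and breaching, by contrast, modify the DEM only where closed depressions interrupt continuous downslope flow.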

  3. The evaluation of unmanned aerial system-based photogrammetry and terrestrial laser scanning to generate DEMs of agricultural watersheds

    NASA Astrophysics Data System (ADS)

    Ouédraogo, Mohamar Moussa; Degré, Aurore; Debouche, Charles; Lisein, Jonathan

    2014-06-01

Agricultural watersheds tend to be places of intensive farming activities that permanently modify their microtopography. The surface characteristics of the soil vary depending on the crops that are cultivated in these areas. Agricultural soil microtopography plays an important role in the quantification of runoff and sediment transport because the presence of crops, crop residues, furrows and ridges may impact the direction of water flow. To better assess such phenomena, 3-D reconstructions of high-resolution agricultural watershed topography are essential. Fine-resolution topographic data collection technologies can be used to discern highly detailed elevation variability in these areas. Knowledge of the strengths and weaknesses of existing technologies used for data collection on agricultural watersheds may be helpful in choosing an appropriate technology. This study assesses the suitability of terrestrial laser scanning (TLS) and unmanned aerial system (UAS) photogrammetry for collecting the fine-resolution topographic data required to generate accurate, high-resolution digital elevation models (DEMs) in a small watershed area (12 ha). Because of farming activity, 14 TLS scans (≈25 points m-2) were collected without using high-definition surveying (HDS) targets, which are generally used to mesh adjacent scans. To evaluate the accuracy of the DEMs created from the TLS scan data, 1098 ground control points (GCPs) were surveyed using a real-time kinematic global positioning system (RTK-GPS). Linear regressions were then applied to each DEM to remove vertical errors from the TLS point elevations, errors caused by the non-perpendicularity of the scanner's vertical axis to the local horizontal plane, and errors correlated with the distance to the scanner's position. The scans were then meshed to generate a DEMTLS with a 1 × 1 m spatial resolution. The Agisoft PhotoScan and MicMac software packages were used to process the aerial photographs and generate a DEMPSC

4. Numerical slope stability simulations of chasma walls in Valles Marineris/Mars using a distinct element method (DEM).

    NASA Astrophysics Data System (ADS)

    Imre, B.

    2003-04-01

The 8- to 10-km depths of Valles Marineris (VM) offer excellent views into the upper Martian crust. Layering, fracturing, lithology, stratigraphy and the content of volatiles have influenced the evolution of the Valles Marineris wall slopes, but these parameters also reflect the development of VM and its wall slopes. The scope of this work is to gain understanding of these parameters by back-simulating the development of the wall slopes. For that purpose, the two-dimensional Particle Flow Code PFC2D has been chosen (ITASCA, version 2.00-103). PFC2D is a distinct element code for numerical modelling of the movements and interactions of assemblies of arbitrarily sized circular particles. Particles may be bonded together to represent a solid material. Movements of particles are unlimited, which is important because the results of open systems with numerous unknown variables are non-unique and therefore highly path dependent. This DEM allows the simulation of whole development paths of VM walls, which makes confirmation of the model more complete (e.g. Oreskes et al., Science 263, 1994). To reduce the number of unknown variables, a proper (that is, as simple as possible) field site had to be selected: the northern wall of eastern Candor Chasma. This wall is up to 8 km high and represents a significant outcrop of the upper Martian crust; it is quite uncomplex, well aligned, and of simple morphology. Currently the work on the model is at the stage of the parameter study; results will be presented as a poster at the EGS meeting.

5. Terrain Classification of ASTER gDEM for Seismic Microzonation of Port-au-Prince, Haiti, Using Pixel- and Object-Based Analytic Methods

    NASA Astrophysics Data System (ADS)

    Yong, A.; Hough, S. E.; Cox, B. R.; Rathje, E. M.; Bachhuber, J.; Hulslander, D.; Christiansen, L.; Abrams, M.

    2010-12-01

The aftermath of the M7.0 Haiti earthquake of 12 January 2010 witnessed an impressive scientific response from the international community. In addition to conventional post-earthquake investigations, there was an unprecedented reliance on remote-sensing technologies for scientific investigation and damage assessment. These technologies include sensors on both aerial and space-borne observational platforms. As part of the Haiti earthquake response and recovery effort, we develop a seismic zonation map of Port-au-Prince based on high-resolution satellite imagery as well as data from traditional seismographic monitoring stations and geotechnical site characterizations. Our imagery consists of a global digital elevation model (gDEM) of Hispaniola derived from data recorded by NASA-JPL's Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument onboard the multi-platform satellite Terra. To develop our model we also consider recorded waveforms from portable seismographic stations (Hough et al., in review) and 36 geotechnical shear-wave velocity surveys (Cox et al., in review). Following the approach developed by Yong et al. (2008; Bull. Seism. Soc. Am.), we use both pixel- and object-based imaging analytic methods to systematically identify and extract local terrain features that are expected to amplify seismic ground motion. Using histogram-stretching techniques applied to the gDEM values, followed by multi-resolution segmentations of the imagery into terrain types, we systematically classify the terrains of Hispaniola. By associating the available Vs30 values (average shear-wave velocity in the upper 30 m) calculated from the MASW (Multi-channel Analysis of Surface Waves) survey method, we develop a first-order site characterization map. Our results indicate that the terrain-based Vs30 estimates are significantly associated with amplitudes recorded at station sites. We also find that the damage distribution inferred from UNOSAT

  6. Evaluating topographic and hydrologic attribute sensitivity to upscaled resolution DEMs from LIDAR data

    NASA Astrophysics Data System (ADS)

    Petroselli, A.; Santini, M.; Nardi, F.; Tarolli, P.; Grimaldi, S.

    2008-12-01

Raster-based Digital Terrain Models (DTMs) have been used extensively to derive the topographic attributes required by topographically based hydrologic modelling. Several studies have reported that hillslope hydrologic response is strongly affected by local topography. Despite the increasing availability of fine-resolution topographic data captured by the Light Detection And Ranging (LIDAR) technique, some drawbacks arise, both from a computational point of view and because the higher detail does not match that of other spatial inputs (e.g. land use, vegetation cover, climate, etc.). A compromise is then needed to limit the computational effort and, at the same time, make the spatial inputs homogeneous, by either downscaling the coarsest data or upscaling the finest. During resampling of the original DTM, topographic detail can be lost because of smoothing effects. It is therefore necessary to investigate whether and how a coarser-resolution DTM preserves the hydrologic information crucial for model performance and reliability, since uncertainties in the inputs propagate into the output predictions, producing biases. In this work two case studies are presented using 1 m LIDAR DTMs. A series of DTMs with 5, 10, and 20 m grid sizes is derived from the finest (1 m) DTM by applying standard resampling methods. Several topographic and hydrologic characteristics, e.g. the wetness index and the flow path length of the main channel, are tested at different grid cell sizes in order to assess changes in the lag time between precipitation and peak discharge, which result in different hydrographs. Even if some of these attributes show few differences in basin averages as DTM resolution changes, it is shown here that, unlike for lumped models, where heterogeneities are ignored, for semi-distributed and distributed models the spatial variability of the input can significantly affect the results.
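
Degrading a fine LiDAR DTM to a coarser grid is often done by block averaging; a minimal sketch of one standard resampling method (not necessarily the one used in this study):

```python
import numpy as np

def upscale_dem(dem, factor):
    """Degrade a DEM to a coarser grid by block averaging, e.g. a 1 m
    grid to 5 m with factor=5. Rows and columns are trimmed to a
    multiple of `factor`; the averaging smooths fine-scale topographic
    detail, which is exactly the information loss the study examines."""
    r = (dem.shape[0] // factor) * factor
    c = (dem.shape[1] // factor) * factor
    blocks = dem[:r, :c].reshape(r // factor, factor, c // factor, factor)
    return blocks.mean(axis=(1, 3))
```

Comparing attributes such as the wetness index on the original and upscaled grids then quantifies the smoothing effect at each resolution.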

  7. An efficient and comprehensive method for drainage network extraction from DEM with billions of pixels using a size-balanced binary search tree

    NASA Astrophysics Data System (ADS)

    Bai, Rui; Li, Tiejian; Huang, Yuefei; Li, Jiaye; Wang, Guangqian

    2015-06-01

With the increasing resolution of digital elevation models (DEMs), computational efficiency problems are encountered when extracting the drainage network of a large river basin at billion-pixel scales. The efficiency of the most time-consuming step, the depression-filling pretreatment, has previously been improved with an O(N log N) least-cost path search method, but complete extraction steps following this method have not been proposed and tested. In this paper, an improved O(N log N) algorithm is proposed that introduces a size-balanced binary search tree (BST) to further improve the efficiency of the depression-filling pretreatment. The subsequent extraction steps, including flow direction determination and upslope area accumulation, were also redesigned to benefit from this improvement, yielding an efficient and comprehensive method. The method was tested by extracting the drainage networks of 31 river basins with areas greater than 500,000 km2 from the 30-m-resolution ASTER GDEM and of two sub-basins with areas of approximately 1000 km2 from a 1-m-resolution airborne LiDAR DEM. Complete drainage networks with both vector features and topographic parameters were obtained with O(N log N) time consumption. The results indicate that the developed method can extract entire drainage networks from DEMs with billions of pixels with high efficiency.
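
The depression-filling pretreatment with an O(N log N) priority structure can be sketched with a binary heap standing in for the paper's size-balanced BST (same asymptotic complexity, different data structure):

```python
import heapq
import numpy as np

def priority_flood_fill(dem):
    """Priority-flood depression filling in O(N log N): seed a min-heap
    with the DEM boundary, then grow inward from the lowest frontier
    cell, raising any cell that lies below the spill elevation of the
    cell it is reached from."""
    dem = dem.astype(float)
    filled = dem.copy()
    rows, cols = dem.shape
    visited = np.zeros((rows, cols), dtype=bool)
    heap = []
    for i in range(rows):
        for j in range(cols):
            if i in (0, rows - 1) or j in (0, cols - 1):
                heapq.heappush(heap, (dem[i, j], i, j))
                visited[i, j] = True
    while heap:
        z, i, j = heapq.heappop(heap)
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols and not visited[ni, nj]:
                visited[ni, nj] = True
                filled[ni, nj] = max(dem[ni, nj], z)  # raise pit cells
                heapq.heappush(heap, (filled[ni, nj], ni, nj))
    return filled
```

After filling, every cell has a monotone downslope path to the boundary, so flow directions and upslope area accumulation can proceed without interruption.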

  8. A Comparison of Elevation Between InSAR DEM and Reference DEMs

    NASA Astrophysics Data System (ADS)

    Yun, Ye; Zeng, Qiming; Jiao, Jian; Yan, Dapeng; Liang, Cunren; Wang, Qing; Zhou, Xiao

    2013-01-01

Introduction. (1) DEM generation: spaceborne SAR interferometry is one of the methods for generating digital elevation models (DEMs). (2) Common methods to generate DEMs: same antenna with two passes (e.g. ERS-1/2); single-pass interferometry (e.g. SRTM); stereopair geometry (e.g. SPOT and ASTER); combination of air photographs, satellite images, topographic maps and field measurements (e.g. NGCC, the National Geomatics Center of China, which has completed the establishment of 1:50,000 topographic databases of China). (3) Purpose of this study: compare the DEM derived from ERS-1/2 tandem data with reference DEMs, namely the SRTM DEM, ASTER GDEM and NGCC DEM. Qualitative and quantitative assessments of the elevation were used to estimate the differences.

  9. The role of method of production and resolution of the DEM on slope-units delineation for landslide susceptibility assessment - Ubaye Valley, French Alps case study

    NASA Astrophysics Data System (ADS)

    Schlögel, Romy; Marchesini, Ivan; Alvioli, Massimiliano; Reichenbach, Paola; Rossi, Mauro; Malet, Jean-Philippe

    2016-04-01

Landslide susceptibility assessment forms the basis of any hazard mapping, which is one of the essential parts of quantitative risk mapping. For the same study area, different susceptibility maps can be obtained depending on the susceptibility mapping method, the mapping unit, and the scale. In the Ubaye Valley (South French Alps), we investigate the effect of the resolution and method of production of the DEM on the delineation of slope units for landslide susceptibility mapping. Slope-unit delineation was performed using multiple combinations of circular variance and minimum area values, the input parameters of a new software tool for terrain partitioning. The method accounts for homogeneity of aspect direction inside each unit and inhomogeneity between different units. We computed slope-unit delineations for 5, 10, and 25 m resolution DEMs and investigated the statistical distributions of morphometric variables within the resulting polygons. Then, for each slope-unit partitioning, we calibrated a landslide susceptibility model, considering landslide bodies and scarps as the dependent variable (binary response). This work aims to analyse the role of DEM resolution in slope-unit delineation for landslide susceptibility assessment. The Area Under the Receiver Operating Characteristic Curve is used to evaluate the susceptibility models. In addition, we further analysed the performance of the logistic regression model by examining the percentage of significant variables in the statistical analyses. Results show that smaller slope units have a better chance of containing a smaller number of thematic and morphometric variables, allowing for an easier classification. The reliability of the models according to DEM resolution, and according to the use of scarp areas versus landslide bodies as the dependent variable, is discussed.
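
The circular variance used here as the aspect-homogeneity criterion for slope units can be computed as one minus the mean resultant length of the aspect angles; a minimal sketch:

```python
import numpy as np

def circular_variance(aspect_deg):
    """Circular variance of terrain aspect angles (degrees): 0 when all
    aspects are identical, approaching 1 when directions are uniformly
    scattered. Used as a homogeneity measure when partitioning terrain
    into slope units."""
    a = np.radians(np.asarray(aspect_deg, dtype=float))
    # mean resultant length of the unit vectors (cos a, sin a)
    R = np.hypot(np.cos(a).mean(), np.sin(a).mean())
    return 1.0 - R
```

A partitioning algorithm would accept a candidate slope unit only while its internal circular variance stays below the chosen threshold.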

  10. On the investigation of the performances of a DEM-based hydrogeomorphic floodplain identification method in a large urbanized river basin: the Tiber river case study in Italy

    NASA Astrophysics Data System (ADS)

    Nardi, Fernando; Biscarini, Chiara; Di Francesco, Silvia; Manciola, Piergiorgio

    2013-04-01

consequently identified as those river buffers, draining towards the channel, with an elevation less than the maximum flow depth of the corresponding outlet. Keeping in mind that this hydrogeomorphic model's performance is strictly related to the quality and properties of the input DEM, and that the intent of this kind of methodology is not to substitute standard flood modeling and mapping methods, in this work the performance of the approach is qualitatively evaluated by comparing its results with standard flood maps. The Tiber River basin, one of the main river basins in Italy, covering a drainage area of approximately 17,000 km2, was selected as the case study. This comparison is interesting for understanding the performance of the model in a large and complex domain where the impact of the urbanization matrix is significant. The results of this investigation confirm the potential of such DEM-based floodplain mapping models to provide a fast, homogeneous, and continuous inundation scenario to urban planners and decision makers, but also the drawbacks of using such a methodology where humans are significantly and rapidly modifying the surface properties.

  11. Shading-based DEM refinement under a comprehensive imaging model

    NASA Astrophysics Data System (ADS)

    Peng, Jianwei; Zhang, Yi; Shan, Jie

    2015-12-01

This paper introduces an approach to refine coarse digital elevation models (DEMs) based on the shape-from-shading (SfS) technique using a single image. Different from previous studies, this approach is designed for heterogeneous terrain and derived from a comprehensive (extended) imaging model accounting for the combined effects of atmosphere, reflectance, and shading. To solve this intrinsically ill-posed problem, the least squares method and a subsequent optimization procedure are applied to estimate the shading component, from which the terrain gradient is recovered with a modified optimization method. Integrating the resultant gradients then yields a refined DEM at the same resolution as the input image. The proposed SfS method is evaluated using 30 m Landsat-8 OLI multispectral images and 30 m SRTM DEMs. As demonstrated in this paper, the proposed approach is able to reproduce terrain structures with higher fidelity and, at medium to large up-scaling ratios, can achieve elevation accuracy 20-30% better than conventional interpolation methods. Further, this property is shown to be stable and independent of topographic complexity. With the ever-increasing public availability of satellite images and DEMs, the developed technique is meaningful for global and local DEM product refinement.
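
The final step above, integrating a recovered gradient field back into a surface, is commonly done in the frequency domain; a sketch using the Frankot-Chellappa least-squares integrator, a standard technique and not necessarily the optimization used in the paper:

```python
import numpy as np

def integrate_gradients(p, q):
    """Frankot-Chellappa least-squares integration of gradient fields
    p = dz/dx, q = dz/dy into a surface z (up to an additive constant).
    Assumes periodic boundaries and unit grid spacing."""
    rows, cols = p.shape
    wx = np.fft.fftfreq(cols) * 2 * np.pi  # angular frequencies per axis
    wy = np.fft.fftfreq(rows) * 2 * np.pi
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0  # avoid division by zero at the DC term
    Z = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0      # fix the free additive constant
    return np.real(np.fft.ifft2(Z))
```

On a smooth periodic test surface with analytically known gradients, the reconstruction matches the original up to the undetermined constant.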

  12. Extract relevant features from DEM for groundwater potential mapping

    NASA Astrophysics Data System (ADS)

    Liu, T.; Yan, H.; Zhai, L.

    2015-06-01

The multi-criteria evaluation (MCE) method has been applied widely in groundwater potential mapping research, but in data-scarce areas it encounters many problems due to limited data. A Digital Elevation Model (DEM) is a digital representation of the topography and has applications in many fields. Previous research has shown that much of the information relevant to groundwater potential mapping (such as geological features, terrain features, and hydrological features) can be extracted from DEM data, which makes using DEM data for groundwater potential mapping feasible. In this research, DEM data, one of the most widely used and easily accessible data sources in GIS, was used to extract information for groundwater potential mapping in the Batter River basin in Alberta, Canada. First, five determining factors for groundwater potential mapping were put forward based on previous studies: lineaments and lineament density, drainage networks and their density, topographic wetness index (TWI), relief, and convergence index (CI). Methods for extracting the five determining factors from the DEM were put forward and thematic maps were produced accordingly. A cumulative effects matrix was used for weight assignment, and a multi-criteria evaluation was carried out in ArcGIS to delineate the groundwater potential map. The final groundwater potential map was divided into five categories: non-potential, poor, moderate, good, and excellent zones. Finally, the success rate curve was drawn and the area under the curve (AUC) was computed for validation. The validation showed a success rate of 79%, confirming the method's feasibility. The method affords a new approach for groundwater management research in areas suffering from data scarcity and also broadens the application area of DEM data.
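
Of the five factors, the topographic wetness index has a closed form, TWI = ln(a / tan β); a minimal sketch, where the slope floor and the cell-count convention for the specific catchment area are assumptions:

```python
import numpy as np

def topographic_wetness_index(upslope_cells, slope_deg, cellsize=30.0):
    """TWI = ln(a / tan(beta)): a is the specific catchment area
    (upslope cell count times cell size, i.e. contributing area per
    unit contour width) and beta is the local slope angle. A small
    slope floor keeps flat cells finite."""
    a = np.asarray(upslope_cells, dtype=float) * cellsize
    beta = np.radians(np.maximum(slope_deg, 0.1))
    return np.log(a / np.tan(beta))
```

High TWI values flag flat, convergent cells where water accumulates, which is why the index serves as a groundwater-potential indicator.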

  13. The Oracle of DEM

    NASA Astrophysics Data System (ADS)

    Gayley, Kenneth

    2013-06-01

The predictions of the famous Greek oracle of Delphi were just ambiguous enough to seem to convey information, yet the user was only seeing their own thoughts. Are there ways in which X-ray spectral analysis is like that oracle? It is shown, using heuristic, generic response functions to mimic actual spectral inversion, that the widely known ill-conditioning, which makes formal inversion impossible in the presence of random noise, also makes a wide variety of different source distributions (DEMs) produce quite similar X-ray continua and resonance-line fluxes. Indeed, the sole robustly inferable attribute of a thermal, optically thin resonance-line spectrum with normal abundances in CIE is its average temperature. The shape of the DEM distribution, on the other hand, is not well constrained, and may actually depend more on the analysis method, no matter how sophisticated, than on the source plasma. The case is made that X-ray spectra can tell us average temperature, metallicity, and absorbing column, but the main thing they cannot tell us is the main thing they are most often used to infer: the differential emission measure distribution.
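
The ill-conditioning argument can be demonstrated numerically: two very different DEM shapes with the same average temperature yield nearly identical channel fluxes when observed through broad response kernels. All kernel widths and DEM shapes below are illustrative stand-ins, not fitted to any real instrument:

```python
import numpy as np

# log-spaced temperature grid and broad Gaussian response kernels
logT = np.linspace(6.0, 7.5, 200)
centers = np.linspace(6.2, 7.3, 8)          # 8 hypothetical channels
K = np.exp(-((logT[None, :] - centers[:, None]) / 0.5) ** 2)

def fluxes(dem):
    """Predicted channel fluxes for a DEM sampled on logT."""
    return K @ dem

# two very different DEM shapes centered on the same temperature
narrow = np.exp(-((logT - 6.75) / 0.05) ** 2)
broad = np.exp(-((logT - 6.75) / 0.2) ** 2)
narrow /= narrow.sum()
broad /= broad.sum()

f1, f2 = fluxes(narrow), fluxes(broad)
rel_diff = np.max(np.abs(f1 - f2)) / f1.max()
```

The relative flux difference between the two source models is only a few percent, comfortably within typical noise, even though the DEM shapes themselves differ drastically: the essence of the oracle analogy.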

14. Convolutional Neural Network Based DEM Super Resolution

    NASA Astrophysics Data System (ADS)

    Chen, Zixuan; Wang, Xuewen; Xu, Zekai; Hou, Wenguang

    2016-06-01

DEM super resolution was proposed in our previous publication to improve the resolution of a DEM on the basis of learning examples, using a nonlocal algorithm; many experiments showed that the strategy is feasible. In that publication, the learning examples were defined as parts of the original DEM and their related high-resolution measurements, because this avoids incompatibility between the data to be processed and the learning examples. To further extend the applications of this strategy, the learning examples should be diverse and easy to obtain, yet this may cause problems of incompatibility and a lack of robustness. To overcome this, we investigate a convolutional neural network (CNN) based method. The input of the network is a low-resolution DEM and the output is expected to be its high-resolution counterpart. A three-layer model is adopted: the first layer detects features from the input, the second integrates the detected features into compressed ones, and the final layer transforms the compressed features into a new DEM. The network is trained on learning DEMs by minimizing the error between the output and the expected high-resolution DEM. In practical applications, a test DEM is input to the network and a super-resolved DEM is obtained. Many experiments show that the CNN-based method obtains better reconstructions than many classic interpolation methods.
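
The three-layer model described above can be sketched as a plain forward pass; the weights here are placeholders, since in practice they would be learned by minimizing the reconstruction error against high-resolution DEMs:

```python
import numpy as np

def conv2d(x, kernels, bias):
    """'Same' 2D convolution of a (C, H, W) input with (K, C, kh, kw)
    kernels, implemented with shifted views for clarity, not speed."""
    K, C, kh, kw = kernels.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((0, 0), (ph, ph), (pw, pw)))
    H, W = x.shape[1], x.shape[2]
    out = np.zeros((K, H, W))
    for k in range(K):
        for c in range(C):
            for i in range(kh):
                for j in range(kw):
                    out[k] += kernels[k, c, i, j] * xp[c, i:i + H, j:j + W]
        out[k] += bias[k]
    return out

def srcnn_forward(lowres_dem, params):
    """Forward pass of the three-layer model: feature detection,
    feature compression, then reconstruction of the output DEM.
    `params` holds a (kernels, bias) pair per layer."""
    h = np.maximum(conv2d(lowres_dem[None], *params[0]), 0)  # detect
    h = np.maximum(conv2d(h, *params[1]), 0)                 # compress
    return conv2d(h, *params[2])[0]                          # reconstruct
```

With identity weights the network passes a positive DEM through unchanged, which makes for a simple sanity check of the wiring.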

  15. The influence of slope profile extraction techniques and DEM resolution on 2D rockfall simulation

    NASA Astrophysics Data System (ADS)

    Wang, X.; Frattini, P.; Agliardi, F.; Crosta, G. B.

    2012-04-01

    The development of advanced 3D rockfall modelling algorithms and tools during the last decade has provided insights into the topographic controls on the quality and reliability of rockfall simulation results. These controls include DEM resolution and roughness, and depend on the adopted rockfall simulation approach and DEM generation techniques. Despite the development of 3D simulations, the 2D modelling approach remains suitable and convenient in some cases. Therefore, the accuracy of high-quality 3D descriptions of topography must be preserved when extracting slope profiles for 2D simulations. In this perspective, this study compares and evaluates three different techniques commonly used to extract slope profiles from DEMs, in order to assess their suitability and effects on rockfall simulation results. These methods include: (A) an "interpolated shape" method (ESRI 3D Analyst), (B) a raw raster sampling method (EZ Profiler), and (C) a vector TIN sampling method (ESRI 3D Analyst). The raster DEMs used in the study were all derived from the same TIN DEM used for method C. For raster DEMs, the "interpolated shape" method (A) extracts the profile by bilinearly interpolating the elevation among the four neighbouring cells at each sampling location along the profile trace. The EZ Profiler extension (B) extracts the profile by sampling elevation values directly from the DEM raster grid at each sampling location. These methods were compared to the extraction of profiles from the TIN DEM (C), where slope profile elevations are obtained directly by sampling the TIN triangular facets. 2D rockfall simulations performed with a widely used commercial code (Rocfall™) on the different profiles show that: (1) methods A and C provide similar results; (2) runout simulated using profiles obtained by method A is usually shorter than with method C; (3) method B produces abrupt horizontal steps in the profiles, resulting in unrealistic runout.
To study the influence of DEM
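The difference between the bilinear "interpolated shape" sampling (method A) and raw raster sampling (method B) can be sketched in a few lines; the grid values and sample coordinates below are illustrative, with a cell size of 1:

```python
def sample_nearest(dem, x, y):
    # Method B style: read the nearest grid cell directly (can produce steps)
    return dem[int(round(y))][int(round(x))]

def sample_bilinear(dem, x, y):
    # Method A style: bilinear interpolation among the four neighbouring cells
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    return (dem[y0][x0] * (1 - fx) * (1 - fy)
            + dem[y0][x0 + 1] * fx * (1 - fy)
            + dem[y0 + 1][x0] * (1 - fx) * fy
            + dem[y0 + 1][x0 + 1] * fx * fy)

# A plane dipping to the east, one unit of elevation per cell
dem = [[0.0, 1.0, 2.0],
       [0.0, 1.0, 2.0],
       [0.0, 1.0, 2.0]]
print(sample_bilinear(dem, 0.6, 1.0))  # -> 0.6 (smooth along the profile)
print(sample_nearest(dem, 0.6, 1.0))   # -> 1.0 (jumps to the nearest cell)
```

On a uniform slope, bilinear sampling reproduces the plane exactly, while nearest-cell sampling quantizes the profile into the horizontal steps that the study found to cause unrealistic runout.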

  16. Operational TanDEM-X DEM calibration and first validation results

    NASA Astrophysics Data System (ADS)

    Gruber, Astrid; Wessel, Birgit; Huber, Martin; Roth, Achim

    2012-09-01

    In June 2010, the German TanDEM-X satellite was launched. Together with its twin satellite TerraSAR-X it flies in a close formation enabling single-pass SAR interferometry. The primary goal of the TanDEM-X mission is the derivation of a global digital elevation model (DEM) with unprecedented global accuracies of 10 m absolute and 2 m relative height. A significant calibration effort is required to achieve this high quality world-wide. Despite intensive instrument calibration and highly accurate orbit and baseline determination, some systematic height errors such as offsets and tilts on the order of a few meters remain in the interferometric DEMs and have to be determined and removed during TanDEM-X DEM calibration. The objective of this article is to present an approach for estimating correction parameters for the remaining systematic height errors, applicable to interferometric height models. The approach is based on a least-squares block adjustment using the elevations of ICESat GLA 14 data as ground control points and connecting points of adjacent, overlapping DEMs as tie-points. In the first part, its implementation in DLR's ground segment is outlined. In the second part, the approach is applied and validated for two of the first TanDEM-X DEM test sites, using independent reference data, in particular high-resolution reference DEMs and GPS tracks. The results show that the absolute height errors of the TanDEM-X DEM are small in these cases, mostly on the order of 1-2 m. An additional benefit of the proposed block adjustment method is that it improves the relative accuracy of adjacent DEMs.
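The core of the correction step, estimating an offset and two tilt parameters from height differences at ground control points, can be sketched as a small least-squares problem. This is a simplified stand-in for the full block adjustment (a single scene, no tie-points, unit weights; all numbers are made up):

```python
def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Ground control points: (x, y, height error = InSAR height - ICESat height)
gcps = [(0, 0, 1.2), (10, 0, 1.7), (0, 10, 0.9), (10, 10, 1.4), (5, 5, 1.3)]

# Build the normal equations for the model dz = a + b*x + c*y
A = [[0.0] * 3 for _ in range(3)]
b = [0.0] * 3
for x, y, dz in gcps:
    row = (1.0, x, y)
    for i in range(3):
        for j in range(3):
            A[i][j] += row[i] * row[j]
        b[i] += row[i] * dz

offset, tilt_x, tilt_y = solve3(A, b)
print(offset, tilt_x, tilt_y)
```

Subtracting the fitted plane `a + b*x + c*y` from the interferometric DEM then removes the systematic offset and tilt; the operational system solves the analogous (much larger) system jointly over many overlapping scenes.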

  17. Statistic Tests Aided Multi-Source DEM Fusion

    NASA Astrophysics Data System (ADS)

    Fu, C. Y.; Tsay, J. R.

    2016-06-01

    Since the land surface changes both naturally and through human activity, DEMs have to be updated continually so that applications can use the latest DEM. However, the cost of wide-area DEM production is high. DEMs that cover the same area but differ in quality, grid size, acquisition time or production method are called multi-source DEMs. Fusing multi-source DEMs provides a solution for low-cost DEM updating. The DEM coverage first has to be classified according to slope and visibility, because the precision of DEM grid points differs between areas with different slopes and visibilities. Next, a difference DEM (dDEM) is computed by subtracting one DEM from the other. It is assumed that a dDEM containing only random error follows a normal distribution, so Student's t-test is applied for blunder detection, producing three kinds of rejected grid points. The first kind are blunders and have to be eliminated. The second are points in change areas, where the latest data are taken as the fusion result. The third are grid points rejected by type I error; these are correct data and have to be retained for fusion. The experimental results show that blunder detection performs better when the DEMs are classified by terrain, and that a proper setting of the significance level (α) detects real blunders without creating too many type I errors. Weighted averaging is chosen as the DEM fusion algorithm, with weights defined from the a-priori precisions estimated by our national DEM production guideline. Fisher's test is applied to verify that the a-priori precisions correspond to the RMSEs of the blunder detection result.
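The blunder test on the difference DEM can be sketched as follows; for simplicity this uses a z-type test with known a-priori sigmas per terrain class rather than the paper's Student test, and the sigmas, significance level and dDEM values are illustrative:

```python
# a-priori precision of dDEM values per terrain class (illustrative)
sigma_by_class = {"flat": 0.5, "steep": 2.0}
z_crit = 1.96   # two-sided critical value for alpha = 0.05

# dDEM samples: (height difference in m, terrain class of the grid point)
ddem = [(0.3, "flat"), (-0.4, "flat"), (5.0, "flat"),
        (1.5, "steep"), (-2.5, "steep"), (9.0, "steep")]

# Flag a point as a blunder when its standardised difference exceeds the
# critical value for its own terrain class
blunders = [(d, c) for d, c in ddem if abs(d) / sigma_by_class[c] > z_crit]
print(blunders)  # -> [(5.0, 'flat'), (9.0, 'steep')]
```

Note how the same 2.5 m difference that would be a blunder on flat ground passes on steep terrain, which is why classifying the coverage by slope and visibility before testing matters.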

  18. Morphometric evaluation of the Afşin-Elbistan lignite basin using kernel density estimation and Getis-Ord's statistics of DEM derived indices, SE Turkey

    NASA Astrophysics Data System (ADS)

    Sarp, Gulcan; Duzgun, Sebnem

    2015-11-01

    A morphometric analysis of river networks, basins and relief using geomorphic indices and geostatistical analyses of a Digital Elevation Model (DEM) is a useful tool for discussing the morphometric evolution of a basin area. In this study, three indices, the valley floor width to height ratio (Vf), the stream gradient (SL), and stream sinuosity, were applied to the Afşin-Elbistan lignite basin to test for imprints of tectonic activity. Perturbations of these indices are usually indicative of differences in the resistance of outcropping lithological units to erosion and of active faulting. To map clusters of high and low index values, kernel density estimation (K) and the Getis-Ord Gi∗ statistic were applied to the DEM-derived indices. The K method and the Gi∗ statistic, by highlighting hot spots and cold spots of the SL index, stream sinuosity and Vf index values, helped to identify the relative tectonic activity of the basin area. The results indicated that estimation by K and Gi∗, including three conceptualizations of spatial relationships (CSR) for hot spots (percent volume contours 50 and 95, categorized as high and low respectively), yielded similar results in regions of high and of low tectonic activity. According to the K and Getis-Ord Gi∗ statistics, the northern, northwestern and southern parts of the basin indicate high tectonic activity, whereas the low-elevation plain in the central part of the basin shows relatively low tectonic activity.
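The Getis-Ord Gi∗ statistic used above has a closed form. A minimal sketch for one location on a toy transect, with binary contiguity weights and illustrative index values:

```python
import math

def getis_ord_gi_star(values, weights, i):
    # weights[i][j] = 1 if j lies in i's neighbourhood (i itself included for Gi*)
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar ** 2)
    wsum = sum(weights[i])
    wsq = sum(w * w for w in weights[i])
    num = sum(w * v for w, v in zip(weights[i], values)) - xbar * wsum
    den = s * math.sqrt((n * wsq - wsum ** 2) / (n - 1))
    return num / den

# Nine cells in a row; self plus immediate neighbours form each neighbourhood
vals = [1, 1, 1, 1, 9, 9, 9, 1, 1]
W = [[1 if abs(i - j) <= 1 else 0 for j in range(9)] for i in range(9)]
z = getis_ord_gi_star(vals, W, 5)   # centre of the high-value run
print(round(z, 2))  # -> 2.83, a significant hot spot at alpha = 0.05
```

A large positive z marks a hot spot (cluster of high index values) and a large negative z a cold spot; the choice of neighbourhood is exactly the "conceptualization of spatial relationships" varied in the study.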

  19. Rapid Geometric Correction of SSC Terrasar-X Images with Direct Georeferencing, Global DEM and Global Geoid Models

    NASA Astrophysics Data System (ADS)

    Vassilaki, D. I.; Stamos, A. A.; Ioannidis, C.

    2013-05-01

    In this paper a process for rapid geometric correction of slant-range SAR images is presented. The process is completely independent of ground control information thanks to the direct georeferencing capabilities offered by the TerraSAR-X sensor. It is especially rapid due to the use of readily available global DEMs and global geoid models. An additional advantage of the process is its flexibility: if a more accurate local DEM or local geoid model is readily available, it can be used instead of the global DEM or global geoid model. The process is applied to geometrically correct an SSC TerraSAR-X image over a suburban mountainous area using the SRTM and ASTER global DEMs and the EGM2008 global geoid model. Additionally, two more accurate local DEMs are used. The accuracy of the process is evaluated using independent check points.

  20. Accuracy and reliability of the Hungarian digital elevation model (DEM)

    NASA Astrophysics Data System (ADS)

    Detrekoi, Akos; Melykuti, Gabor; Szabo, Gyorgy

    1994-08-01

    In the period 1991-92 a 50 m × 50 m grid digital elevation model (DEM) was created in Hungary. The design and quality control of the DEM are discussed in this paper. The paper has three parts: (1) the data acquisition methods for the DEM by scanning and photogrammetry are discussed, (2) a general overview of the accuracy and reliability of DEMs is given, and (3) the algorithm for checking the data and some general conclusions about the control activity of the Hungarian DEM are reviewed.

  1. Quality Test Various Existing DEM in Indonesia Toward 10 Meter National DEM

    NASA Astrophysics Data System (ADS)

    Amhar, Fahmi

    2016-06-01

    Indonesia has various DEMs from many sources, with acquisition dates spread over the past two decades. There are DEMs from spaceborne systems (Radarsat, TerraSAR-X, ALOS, ASTER-GDEM, SRTM), airborne systems (IFSAR, lidar, aerial photos) and terrestrial surveys. The research objective is to test their quality and to determine how to extract the best DEM for a particular area. The method is differential GPS levelling with geodetic GPS equipment at locations verified as unchanged over the past 20 years. The results show that the DEMs from TerraSAR-X and SRTM30 have the best quality (RMSE 3.1 m and 3.5 m, respectively). Based on this research, it was inferred that the results remain consistent with the basic expectation, namely that the coarser the spatial resolution of a DEM, the less precise the resulting heights.
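A figure such as "RMSE 3.1 m" comes from comparing DEM heights against GPS-levelled check points; the values below are made up purely to show the computation:

```python
import math

# (DEM height, GPS-levelled height) at each check point, in metres
checks = [(105.2, 104.1), (98.7, 99.5), (120.3, 118.9), (87.0, 87.4)]

# Root mean square error of the DEM against the check points
rmse = math.sqrt(sum((d - g) ** 2 for d, g in checks) / len(checks))
print(round(rmse, 2))  # -> 1.0
```

The same formula, applied per DEM product over the stable check-point set, is what allows the products to be ranked.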

  2. Nonlocal similarity based DEM super resolution

    NASA Astrophysics Data System (ADS)

    Xu, Zekai; Wang, Xuewen; Chen, Zixuan; Xiong, Dongping; Ding, Mingyue; Hou, Wenguang

    2015-12-01

    This paper discusses a new topic, DEM super resolution, which improves the resolution of an original DEM based on partial new measurements obtained at high resolution. A nonlocal algorithm is introduced to perform this task. The original DEM is first divided into overlapping patches, which are classified either as "test" or "learning" data depending on whether or not they are related to high-resolution measurements. For each test patch, similar patches in the learning dataset are identified via template matching. Finally, the high-resolution DEM of the test patch is restored by a weighted sum of similar patches, under the condition that the reconstruction weights are the same at the different resolutions. A key assumption of this strategy is that the original DEM contains some repeated or similar modes, which is quite common. Experiments demonstrate that a DEM can be restored in this way, preserving details without introducing artifacts. Statistical analysis also shows that this method obtains higher accuracy than traditional interpolation methods.
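The nonlocal reconstruction step can be sketched as follows: similarity weights are computed between low-resolution patches and then reused to combine the corresponding high-resolution patches. The patch contents, the Gaussian weighting and the bandwidth `h` are illustrative assumptions:

```python
import math

def weights(test_lo, candidates_lo, h=1.0):
    # Nonlocal-means style weights from low-resolution patch distances (SSD)
    w = [math.exp(-sum((a - b) ** 2 for a, b in zip(test_lo, c)) / (h * h))
         for c in candidates_lo]
    s = sum(w)
    return [x / s for x in w]

# Learning examples: (low-res patch, matching high-res patch)
learning = [
    ([1.0, 2.0], [1.0, 1.5, 2.0, 2.5]),
    ([1.1, 2.1], [1.1, 1.6, 2.1, 2.6]),
    ([5.0, 9.0], [5.0, 7.0, 9.0, 11.0]),   # dissimilar patch, gets ~zero weight
]

test_lo = [1.05, 2.05]
w = weights(test_lo, [lo for lo, _ in learning])

# Reuse the low-resolution weights to blend the high-resolution patches
sr = [sum(wi * hi[k] for wi, (_, hi) in zip(w, learning)) for k in range(4)]
print([round(v, 2) for v in sr])  # -> [1.05, 1.55, 2.05, 2.55]
```

The two similar learning patches share the weight almost equally while the dissimilar one is suppressed, so the restored high-resolution patch interpolates between plausible fine-scale terrain modes rather than smoothing them away.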

  3. Development of parallel DEM for the open source code MFIX

    SciTech Connect

    Gopalakrishnan, Pradeep; Tafti, Danesh

    2013-02-01

    The paper presents the development of a parallel Discrete Element Method (DEM) solver for the open source code, Multiphase Flow with Interphase eXchange (MFIX) based on the domain decomposition method. The performance of the code was evaluated by simulating a bubbling fluidized bed with 2.5 million particles. The DEM solver shows strong scalability up to 256 processors with an efficiency of 81%. Further, to analyze weak scaling, the static height of the fluidized bed was increased to hold 5 and 10 million particles. The results show that global communication cost increases with problem size while the computational cost remains constant. Further, the effects of static bed height on the bubble hydrodynamics and mixing characteristics are analyzed.

  4. Evaluation Methods Sourcebook.

    ERIC Educational Resources Information Center

    Love, Arnold J., Ed.

    The chapters commissioned for this book describe key aspects of evaluation methodology as they are practiced in a Canadian context, providing representative illustrations of recent developments in evaluation methodology as it is currently applied. The following chapters are included: (1) "Program Evaluation with Limited Fiscal and Human Resources"…

  5. Selection: Evaluation and methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Procedures to collect and to analyze data for genetic improvement of dairy cattle are described. Methods of identification and milk recording are presented. Selection traits include production (milk, fat, and protein yields and component percentages), conformation (final score and linear type traits...

  6. Failure and frictional sliding envelopes in three-dimensional stress space: Insights from Distinct Element Method (DEM) models and implications for the brittle-ductile transition of rock

    NASA Astrophysics Data System (ADS)

    Schöpfer, Martin; Childs, Conrad; Manzocchi, Tom

    2013-04-01

    Rocks deformed at low confining pressure are brittle, meaning that after peak stress the strength decreases to a residual value determined by frictional sliding. The difference between the peak and residual value is the stress drop. At high confining pressure, however, no stress drop occurs. The transition pressure at which no loss in strength occurs is a possible definition of the brittle-ductile transition. The Distinct Element Method (DEM) is used to illustrate how this type of brittle-ductile transition emerges from a simple model in which rock is idealised as an assemblage of cemented spherical unbreakable grains. These bonded particle models are subjected to loading under constant mean stress and stress ratio conditions using distortional periodic space, which eliminates possible boundary effects arising from the usage of rigid loading platens. Systematic variation of both mean stress and stress ratio allowed determination of the complete three dimensional yield, peak stress and residual strength envelopes. The models suggest that the brittle-ductile transition is a mean stress and stress ratio dependent space curve, which cannot be adequately described by commonly used failure criteria (e.g., Mohr-Coulomb, Drucker-Prager). The model peak strength data exhibit an intermediate principal stress dependency which is, at least qualitatively, similar to that observed for natural rocks deformed under polyaxial laboratory conditions. Comparison of failure envelopes determined for bonded particle models with and without bond shear failure suggests that the non-linear pressure dependence of strength (concave failure envelopes) is, at high mean stress, the result of microscopic shear failure, a result consistent with earlier two-dimensional numerical multiple-crack simulations [D. A. Lockner & T. R. Madden, JGR, Vol. 96, No. B12, 1991]. 
Our results may have implications for a wide range of geophysical research areas, including the strength of the crust, the seismogenic

  7. EVALUATION OF IGNITABILITY METHODS (LIQUIDS)

    EPA Science Inventory

    The purpose of the research was to evaluate the ignitability Methods 1010 (Pensky-Martens) and 1020 (Setaflash) as described by OSW Manual SW846 (1). The effort was designed to provide information on accuracy and precision of the two methods. During Phase I of the task, six stand...

  8. Evaluation of modal testing methods

    NASA Technical Reports Server (NTRS)

    Chen, J.-C.

    1984-01-01

    Modal tests are playing an increasingly important role in structural dynamics efforts that need analytical model verification or troubleshooting. In the meantime, existing modal testing methods are undergoing great changes, and new methods are being created. Although devoted advocates of each method can be found to argue its relative advantages and disadvantages, the general superiority, if any, of one method over another is not yet evident. The Galileo spacecraft, a realistic, complex structural system, will be used as a test article for performing modal tests by various methods. The results will be used to evaluate the relative merits of the various modal testing methods.

  9. Satellite-derived Digital Elevation Model (DEM) selection, preparation and correction for hydrodynamic modelling in large, low-gradient and data-sparse catchments

    NASA Astrophysics Data System (ADS)

    Jarihani, Abdollah A.; Callow, John N.; McVicar, Tim R.; Van Niel, Thomas G.; Larsen, Joshua R.

    2015-05-01

    Digital Elevation Models (DEMs) that accurately replicate both landscape form and processes are critical to support modelling of environmental processes. Topographic accuracy, methods of preparation and grid size are all important for hydrodynamic models to efficiently replicate flow processes. In remote and data-scarce regions, high-resolution DEMs are often not available and it is therefore necessary to evaluate lower-resolution data such as the Shuttle Radar Topography Mission (SRTM) and Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) products for use within hydrodynamic models. This paper does so in three ways: (i) assessing point accuracy and geometric co-registration error of the original DEMs; (ii) quantifying the effects of DEM preparation methods (vegetation smoothing and hydrological correction) on hydrodynamic modelling relative accuracy; and (iii) quantifying the effect of the hydrodynamic model grid size (30-2000 m) and the associated relative computational costs (run time) on relative accuracy in model outputs. We initially evaluated the accuracy of the original SRTM (∼30 m) seamless C-band DEM (SRTM DEM) and second-generation products from ASTER (ASTER GDEM) against registered survey marks and altimetry data points from the Ice, Cloud, and land Elevation Satellite (ICESat). The SRTM DEM (RMSE = 3.25 m) had higher accuracy than the ASTER GDEM (RMSE = 7.43 m). Based on these results, the original SRTM DEM and ASTER GDEM, along with vegetation-smoothed and hydrologically corrected versions, were prepared and used to simulate three flood events along a 200 km stretch of the low-gradient Thompson River in arid Australia (using five metrics: peak discharge, peak height, travel time, terminal water storage and flood extent). The hydrologically corrected DEMs performed best across these metrics in simulating floods, compared with the vegetation-smoothed and original DEMs. The response of model performance to grid size was non

  10. Separability of soils in a tallgrass prairie using SPOT and DEM data

    NASA Technical Reports Server (NTRS)

    Su, Haiping; Ransom, Michel D.; Yang, Shie-Shien; Kanemasu, Edward T.

    1990-01-01

    An investigation is conducted which uses a canonical transformation technique to reduce the features from SPOT and DEM data and evaluates the statistical separability of several prairie soils from the canonically transformed variables. Both SPOT and DEM data were gathered for a tallgrass prairie near Manhattan, Kansas, and high-resolution SPOT satellite images were integrated with DEM data. Two canonical variables derived from training samples were selected, and it is suggested that the canonically transformed data were superior to the combined SPOT and DEM data. High-resolution SPOT images and DEM data can be used to aid second-order soil surveys in grasslands.

  11. Hydrologic enforcement of lidar DEMs

    USGS Publications Warehouse

    Poppenga, Sandra K.; Worstell, Bruce B.; Danielson, Jeffrey J.; Brock, John C.; Evans, Gayla A.; Heidemann, H. Karl

    2014-01-01

    Hydrologic-enforcement (hydro-enforcement) of light detection and ranging (lidar)-derived digital elevation models (DEMs) modifies the elevations of artificial impediments (such as road fills or railroad grades) to simulate how man-made drainage structures such as culverts or bridges allow continuous downslope flow. Lidar-derived DEMs contain an extremely high level of topographic detail; thus, hydro-enforced lidar-derived DEMs are essential to the U.S. Geological Survey (USGS) for complex modeling of riverine flow. The USGS Coastal and Marine Geology Program (CMGP) is integrating hydro-enforced lidar-derived DEMs (land elevation) and lidar-derived bathymetry (water depth) to enhance storm surge modeling in vulnerable coastal zones.
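The essence of hydro-enforcement along a single flow path can be sketched in a few lines: where an artificial impediment such as a road fill raises the elevation along the path, the barrier cells are lowered so that elevation never rises in the downstream direction, as if a culvert carried the flow through. The profile values are illustrative and this ignores the 2D flow-routing the real workflow performs:

```python
def hydro_enforce(profile):
    # Enforce a monotonically non-increasing elevation along the flow path
    out = [profile[0]]
    for z in profile[1:]:
        out.append(min(z, out[-1]))   # carve through any uphill barrier
    return out

# Stream profile (m) with a road fill blocking flow at indices 3-4
stream = [10.0, 9.5, 9.2, 12.0, 12.1, 8.8, 8.5]
print(hydro_enforce(stream))  # -> [10.0, 9.5, 9.2, 9.2, 9.2, 8.8, 8.5]
```

Only the artificial high points are modified; everywhere else the lidar elevations pass through unchanged, which is why hydro-enforced DEMs preserve the fine topographic detail that riverine modelling depends on.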

  12. LNG Safety Assessment Evaluation Methods

    SciTech Connect

    Muna, Alice Baca; LaFleur, Angela Christine

    2015-05-01

    Sandia National Laboratories evaluated published safety assessment methods across a variety of industries including Liquefied Natural Gas (LNG), hydrogen, land and marine transportation, as well as the US Department of Defense (DOD). All the methods were evaluated for their potential applicability to the LNG railroad application. After reviewing the documents included in this report, as well as others not included because of repetition, the Department of Energy (DOE) Hydrogen Safety Plan Checklist was found most suitable to be adapted to the LNG railroad application. This report was developed to survey industries related to rail transportation for methodologies and tools that can be used by the FRA to review and evaluate safety assessments submitted by the railroad industry as a part of their implementation plans for liquefied or compressed natural gas storage (on-board or tender) and engine fueling delivery systems. The main sections of this report provide an overview of the various methods found during this survey. In most cases, the reference document is quoted directly. The final section provides discussion and a recommendation for the most appropriate methodology that will allow efficient and consistent evaluations to be made. The DOE Hydrogen Safety Plan Checklist was then revised to adapt it as a methodology for the Federal Railroad Administration’s use in evaluating safety plans submitted by the railroad industry.

  13. Improving merge methods for grid-based digital elevation models

    NASA Astrophysics Data System (ADS)

    Leitão, J. P.; Prodanović, D.; Maksimović, Č.

    2016-03-01

    Digital Elevation Models (DEMs) are used to represent the terrain in applications such as overland flow modelling or viewshed analysis. DEMs generated from digitising contour lines or obtained from LiDAR or satellite data are now widely available. However, in some cases the area of study is covered by more than one of the available elevation data sets, and the relevant DEMs may need to be merged. The merged DEM must retain the most accurate elevation information available while generating consistent slopes and aspects. In this paper we present a thorough analysis of three conventional grid-based DEM merging methods that are available in commercial GIS software. These methods are evaluated for their applicability in merging DEMs and, based on the evaluation results, a method for improving the merging of grid-based DEMs is proposed. DEMs generated by the proposed method, called MBlend, showed significant improvements over DEMs produced by the three conventional methods in terms of elevation, slope and aspect accuracy, while also ensuring smooth elevation transitions between the original DEMs. The results produced by the improved method are highly relevant to different applications in terrain analysis, e.g., visibility analysis or spotting irregularities in landforms, and to modelling terrain phenomena such as overland flow.
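The smooth-transition goal can be illustrated with a generic distance-weighted ("feathered") merge across the overlap of two DEMs; this is not the MBlend algorithm itself, just a 1D sketch of why simple priority merges create elevation steps and weighted blending avoids them:

```python
def feather_merge(a, b, overlap_start, overlap_end):
    # Blend two co-registered profiles: the weight of profile A ramps from 1
    # to 0 across the overlap; None marks no-data in either profile.
    merged = []
    width = overlap_end - overlap_start
    for i in range(len(a)):
        if a[i] is None:
            merged.append(b[i])
        elif b[i] is None:
            merged.append(a[i])
        else:
            w = (overlap_end - i) / width   # A's weight decays left to right
            merged.append(w * a[i] + (1 - w) * b[i])
    return merged

# Two 1D profiles that disagree by a 1 m bias in their overlap (indices 2-3)
A = [10.0, 10.0, 10.0, 10.0, None, None]
B = [None, None, 11.0, 11.0, 11.0, 11.0]
print(feather_merge(A, B, 2, 4))  # -> [10.0, 10.0, 10.0, 10.5, 11.0, 11.0]
```

A priority merge would jump from 10.0 straight to 11.0 at the seam, producing a spurious 1 m scarp in slope and aspect maps; the feathered result spreads the disagreement gradually across the overlap.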

  14. Evaluation of turbulence mitigation methods

    NASA Astrophysics Data System (ADS)

    van Eekeren, Adam W. M.; Huebner, Claudia S.; Dijk, Judith; Schutte, Klamer; Schwering, Piet B. W.

    2014-05-01

    Atmospheric turbulence is a well-known phenomenon that diminishes the recognition range in visual and infrared image sequences. Many different methods exist to compensate for the effects of turbulence. This paper focuses on the performance of two software-based methods for mitigating the effects of low- and medium-turbulence conditions. Both methods are capable of processing static and dynamic scenes. The first method consists of local registration, frame selection, blur estimation and deconvolution. The second method consists of local motion compensation, fore-/background segmentation and weighted iterative blind deconvolution. A comparative evaluation using quantitative measures was carried out on representative sequences captured during a NATO SET 165 trial in Dayton. The amounts of blurring and tilt in the imagery appear to be relevant measures for such an evaluation. It is shown that both methods improve the imagery by reducing blurring and tilt and therefore enlarge the recognition range. Furthermore, results of a recognition experiment using simulated data are presented, showing that turbulence mitigation with the first method improves the recognition range by up to 25% for an operational optical system.

  15. The Double Hierarchy Method. A parallel 3D contact method for the interaction of spherical particles with rigid FE boundaries using the DEM

    NASA Astrophysics Data System (ADS)

    Santasusana, Miquel; Irazábal, Joaquín; Oñate, Eugenio; Carbonell, Josep Maria

    2016-04-01

    In this work, we present a new methodology for the treatment of the contact interaction between rigid boundaries and spherical discrete elements (DE). Rigid body parts are present in most large-scale simulations, and their surfaces are commonly meshed with a finite element-like (FE) discretization. The contact detection and calculation between DEs and such discretized boundaries is not straightforward and has been addressed by different approaches. The algorithm presented in this paper considers the contact of the DEs with the geometric primitives of an FE mesh, i.e. facet, edge or vertex. To do so, the original hierarchical method presented by Horner et al. (J Eng Mech 127(10):1027-1032, 2001) is extended with a new insight, leading to a robust, fast and accurate 3D contact algorithm which is fully parallelizable. The implementation of the method has been developed to deal ideally with triangles and quadrilaterals; if the boundaries are discretized with other geometries, the method can easily be extended to higher-order planar convex polyhedra. A detailed description of the procedure followed to treat a wide range of cases is presented. The developed algorithm is described in detail and validated with several practical examples, and the parallelization capabilities and the obtained performance are presented through the study of an industrial application example.


  17. A comparative appraisal of hydrological behavior of SRTM DEM at catchment level

    NASA Astrophysics Data System (ADS)

    Sharma, Arabinda; Tiwari, K. N.

    2014-11-01

    The Shuttle Radar Topography Mission (SRTM) data has emerged as a global elevation dataset over the past decade because of its free availability, homogeneity and consistent accuracy compared to other global elevation datasets. The present study explores the consistency in hydrological behavior of the SRTM digital elevation model (DEM) with reference to an easily available regional DEM interpolated from 20 m contours (TOPO DEM). Analyses ranging from simple vertical accuracy assessment to hydrological simulation of the studied Maithon catchment, using the empirical USLE model and the semi-distributed, physical SWAT model, were carried out. Moreover, terrain analysis involving hydrological indices was performed for comparative assessment of the SRTM DEM with respect to the TOPO DEM. Results reveal that the vertical accuracy of the SRTM DEM (±27.58 m) in the region is lower than the specified standard (±16 m). Statistical analysis of hydrological indices such as the topographic wetness index (TWI), stream power index (SPI), slope length factor (SLF) and geometry number (GN) shows significant differences in the hydrological properties of the two DEMs. Estimates of the soil erosion potential of the catchment and of the conservation priorities of its microwatersheds using the SRTM DEM and the TOPO DEM differ considerably: the soil erosion potential predicted using the SRTM DEM is far higher than that obtained using the TOPO DEM, and the conservation priorities determined using the two DEMs agree for only 34% of the microwatersheds of the catchment. ArcSWAT simulation reveals that runoff predictions are less sensitive to the choice between the two DEMs than sediment yield predictions. The results obtained in the present study are vital to hydrological analysis as they help in understanding the hydrological behavior of a DEM without the influence of model structural and parameter uncertainty. They also re-emphasize that the SRTM DEM can be a valuable dataset for
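Of the indices compared above, the topographic wetness index has the simplest closed form, TWI = ln(a / tan β), where a is the specific catchment area and β the local slope; a quick sketch with illustrative values shows why it discriminates between DEMs that disagree on slope:

```python
import math

def twi(specific_catchment_area, slope_deg):
    # Topographic wetness index: ln(a / tan(beta)); a in m^2 per unit contour
    # width, beta the local slope angle derived from the DEM
    return math.log(specific_catchment_area / math.tan(math.radians(slope_deg)))

print(round(twi(1000.0, 5.0), 2))   # -> 9.34: gentle cell, large upslope area (wet)
print(round(twi(50.0, 30.0), 2))    # -> 4.46: steep cell, small upslope area (dry)
```

Because tan β enters the denominator, even modest slope disagreements between the SRTM DEM and the TOPO DEM propagate directly into different TWI (and SPI) distributions, which is what the statistical comparison above detects.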

  18. EMDataBank unified data resource for 3DEM

    PubMed Central

    Lawson, Catherine L.; Patwardhan, Ardan; Baker, Matthew L.; Hryc, Corey; Garcia, Eduardo Sanz; Hudson, Brian P.; Lagerstedt, Ingvar; Ludtke, Steven J.; Pintilie, Grigore; Sala, Raul; Westbrook, John D.; Berman, Helen M.; Kleywegt, Gerard J.; Chiu, Wah

    2016-01-01

    Three-dimensional Electron Microscopy (3DEM) has become a key experimental method in structural biology for a broad spectrum of biological specimens from molecules to cells. The EMDataBank project provides a unified portal for deposition, retrieval and analysis of 3DEM density maps, atomic models and associated metadata (emdatabank.org). We provide here an overview of the rapidly growing 3DEM structural data archives, which include maps in EM Data Bank and map-derived models in the Protein Data Bank. In addition, we describe progress and approaches toward development of validation protocols and methods, working with the scientific community, in order to create a validation pipeline for 3DEM data. PMID:26578576

  20. Validation of an LC-MS/MS method for the determination of epirubicin in human serum of patients undergoing drug eluting microsphere-transarterial chemoembolization (DEM-TACE).

    PubMed

    Sottani, Cristina; Leoni, Emanuela; Porro, Benedetta; Montagna, Benedetta; Amatu, Alessio; Sottotetti, Federico; Quaretti, Pietro; Poggi, Guido; Minoia, Claudio

    2009-11-01

    Drug Eluting Microsphere-Transarterial Chemoembolization (DEM-TACE) is a new delivery system for administering drugs in a controlled manner, useful for the chemoembolization of colorectal cancer metastases to the liver. DEM-TACE aims to achieve higher drug concentrations in the tumor with lower systemic concentrations than traditional cancer chemotherapy. Therefore, a specific, precise and sensitive LC-ESI-MS/MS assay was designed to detect and quantify epirubicin at the concentrations expected from transarterial chemoembolization with microspheres. Serum samples were kept acidic (pH of approximately 3.5), and sample preparation consisted of a solid phase extraction (SPE) procedure with HLB OASIS cartridges using a methylene chloride/2-propanol/methanol mixture to recover epirubicin. The analyses consisted of reversed-phase high-performance liquid chromatography (rp-HPLC) coupled with tandem mass spectrometry (MS/MS). Accuracy, precision and matrix effect were assessed by analyzing four quality control samples (QCs) on five separate days. The validation parameters were assessed by recovery studies of spiked serum samples. Recoveries varied between 92 and 98% at the QC levels (5, 40, 80 and 150 microg/L), with relative standard deviations (RSD) always less than 3.7%. The limit of detection (LOD) was set at 1 microg/L. The procedure was also applied to investigate the differing capabilities of two types of commercially available microspheres to release epirubicin into the human circulatory system. PMID:19783235

  1. Incorporating DEM Uncertainty in Coastal Inundation Mapping

    PubMed Central

    Leon, Javier X.; Heuvelink, Gerard B. M.; Phinn, Stuart R.

    2014-01-01

    Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial resolution DEMs for mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. 1,000 error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation to a scenario combining a 1 in 100 year storm event over a 1 m SLR was calculated by counting the proportion of times from the 1,000 simulations that a location was inundated. This probabilistic approach can be used in a risk-aversive decision making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% probability exceedance, the inundated area was approximately 11% larger than mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey
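
    The probabilistic workflow described above can be sketched as follows. For brevity this toy version adds spatially uncorrelated Gaussian errors instead of the regression-kriging / sequential Gaussian simulation used in the study, and the elevations, error standard deviation and water level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 1-D shore profile (elevations in metres); a real DEM is a 2-D grid.
dem = np.array([0.2, 0.6, 1.1, 1.7, 2.5])
water_level = 1.0      # e.g. storm surge on top of SLR, per scenario
sigma = 0.3            # assumed DEM error std. dev. (no spatial correlation)
n_sim = 1000

# In each realisation, perturb the DEM and apply the bathtub rule,
# then count how often each cell ends up below the water level.
inundated = np.zeros_like(dem)
for _ in range(n_sim):
    noisy = dem + rng.normal(0.0, sigma, size=dem.shape)
    inundated += noisy <= water_level

prob = inundated / n_sim   # per-cell probability of inundation
```

    Mapping, say, the cells with prob ≥ 0.01 reproduces the "1% probability exceedance" contour discussed above, which is generally wider than the deterministic bathtub extent.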

  3. Radar and Lidar Radar DEM

    NASA Technical Reports Server (NTRS)

    Liskovich, Diana; Simard, Marc

    2011-01-01

    Using radar and lidar data, the aim is to improve 3D rendering of terrain, including digital elevation models (DEM) and estimates of vegetation height and biomass in a variety of forest types and terrains. The 3D mapping of vegetation structure and the analysis are useful to determine the role of forest in climate change (carbon cycle), in providing habitat and as a provider of socio-economic services. This in turn will lead to potential for development of more effective land-use management. The first part of the project was to characterize the Shuttle Radar Topography Mission DEM error with respect to ICESat/GLAS point estimates of elevation. We investigated potential trends with latitude, canopy height, signal to noise ratio (SNR), number of LiDAR waveform peaks, and maximum peak width. Scatter plots were produced for each variable and were fitted with 1st and 2nd degree polynomials. Higher order trends were visually inspected through filtering with a mean and median filter. We also assessed trends in the DEM error variance. Finally, a map showing how DEM error was geographically distributed globally was created.

  4. Impacts of DEM uncertainties on critical source areas identification for non-point source pollution control based on SWAT model

    NASA Astrophysics Data System (ADS)

    Xu, Fei; Dong, Guangxia; Wang, Qingrui; Liu, Lumeng; Yu, Wenwen; Men, Cong; Liu, Ruimin

    2016-09-01

    The impacts of different digital elevation model (DEM) resolutions, sources and resampling techniques on nutrient simulations using the Soil and Water Assessment Tool (SWAT) model have not been well studied. The objective of this study was to evaluate the sensitivity of non-point source (NPS) critical source area (CSA) identification, based on nutrient loads from the SWAT model, to DEM resolution (from 30 m to 1000 m), source (ASTER GDEM2, SRTM and Topo-DEM) and resampling technique (nearest neighbor, bilinear interpolation, cubic convolution and majority). The Xiangxi River, one of the main tributaries of the Three Gorges Reservoir in China, was selected as the study area. The following findings were obtained: (1) Elevation and slope extracted from the DEMs were more sensitive to DEM resolution changes. Compared with the results of the 30 m DEM, the 1000 m DEM underestimated the elevation and slope by 104 m and 41.57°, respectively; (2) The numbers of subwatersheds and hydrologic response units (HRUs) were considerably influenced by DEM resolution, while the total nitrogen (TN) and total phosphorus (TP) loads of each subwatershed showed higher correlations with DEM source; (3) DEM resolution and source had larger effects on CSA identification, and TN and TP CSAs showed different responses to DEM uncertainties. TN CSAs were more sensitive to resolution changes, exhibiting six distribution patterns across the DEM resolutions. TP CSAs were sensitive to source and resampling technique changes, exhibiting three distribution patterns for DEM sources and two for DEM resampling techniques. DEM resolution and source are the two most sensitive SWAT model DEM parameters and must be considered when nutrient CSAs are identified.

  5. DEM Particle Fracture Model

    SciTech Connect

    Zhang, Boning; Herbold, Eric B.; Homel, Michael A.; Regueiro, Richard A.

    2015-12-01

    An adaptive particle fracture model for the poly-ellipsoidal Discrete Element Method is developed. A poly-ellipsoidal particle breaks into several sub-poly-ellipsoids according to the Hoek-Brown fracture criterion, based on the continuum stress and the maximum tensile stress at contacts. Weibull theory is also introduced to account for statistical and size effects on particle strength. Finally, a high strain-rate split Hopkinson pressure bar experiment on silica sand is simulated using the newly developed model. Comparisons with experiments show that the particle fracture model captures the mechanical behavior of this experiment very well, in both the stress-strain response and the particle size redistribution. The effects of density and packing of the samples are also studied in numerical examples.
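
    The Weibull size effect mentioned above is commonly expressed as σ(V) = σ0 (V0 / V)^(1/m): larger particles sample more flaws and are statistically weaker. A minimal sketch with assumed, not paper-derived, parameter values:

```python
def weibull_strength(sigma0, v0, v, m):
    """Characteristic strength of a particle of volume v, scaled from a
    reference volume v0 with Weibull modulus m (weakest-link statistics)."""
    return sigma0 * (v0 / v) ** (1.0 / m)

# Illustrative values: 100 MPa reference strength at 1 mm^3, modulus m = 3.
strengths = [weibull_strength(100.0, 1.0, v, 3.0) for v in (0.125, 1.0, 8.0)]
```

    For m = 3, an eightfold reduction in volume doubles the characteristic strength, which is why fragments become progressively harder to break in such simulations.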

  6. Processing, validating, and comparing DEMs for geomorphic application on the Puna de Atacama Plateau, northwest Argentina

    NASA Astrophysics Data System (ADS)

    Purinton, Benjamin; Bookhagen, Bodo

    2016-04-01

    This study analyzes multiple topographic datasets derived from various remote-sensing methods for the Pocitos Basin of the central Puna Plateau in northwest Argentina, on the border with Chile. Here, the arid climate, clear atmospheric conditions and lack of vegetation provide ideal conditions for remote sensing and Digital Elevation Model (DEM) comparison. We compare the following freely available DEMs: SRTM-X (spatial resolution of ~30 m), SRTM-C v4.1 (90 m), and ASTER GDEM2 (30 m). Additional DEMs for comparison are generated from optical and radar datasets acquired freely (ASTER Level 1B stereo pairs and Sentinel-1A radar), through research agreements (RapidEye Level 1B scenes, ALOS radar, and ENVISAT radar), and through commercial sources (TerraSAR-X / TanDEM-X radar). DEMs from ASTER (spatial resolution of 15 m) and RapidEye (~5-10 m) optical datasets are produced by standard photogrammetric techniques and have been post-processed for validation and alignment purposes. Because RapidEye scenes are captured at a low incidence angle (<20°) and stereo pairs are unavailable, merging and averaging of two to four overlapping scenes are explored for effective DEM generation. Sentinel-1A, TerraSAR-X / TanDEM-X, ALOS, and ENVISAT radar data are processed through interferometry, resulting in DEMs with spatial resolutions ranging from 5 to 30 meters. The SRTM-X dataset serves as a control in the creation of further DEMs, as it is widely used in the geosciences and represents the highest-quality DEM currently available. All DEMs are validated against over 400,000 differential GPS (dGPS) measurements gathered during four field campaigns in 2012 and 2014 to 2016. Of these points, more than 250,000 lie within the Pocitos Basin, with average vertical and horizontal accuracies of 0.95 m and 0.69 m, respectively. Dataset accuracy is judged by the lowest standard deviations of elevation compared with the dGPS data and with the SRTM-X control DEM. Of particular interest in

  7. Pragmatism, Evidence, and Mixed Methods Evaluation

    ERIC Educational Resources Information Center

    Hall, Jori N.

    2013-01-01

    Mixed methods evaluation has a long-standing history of enhancing the credibility of evaluation findings. However, using mixed methods in a utilitarian way implicitly emphasizes convenience over engaging with its philosophical underpinnings (Denscombe, 2008). Because of this, some mixed methods evaluators and social science researchers have been…

  8. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods motivated the aim of this study: to assess the algorithms available in various software tools for classifying LIDAR "point cloud" data, through a careful examination of the Digital Elevation Models (DEMs) generated from LIDAR data using these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study, each covering 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset generated using manual methods to eliminate non-ground points. The surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the root mean square error (RMSE). Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
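
    The raster comparison described above (minimum, maximum and mean differences plus RMSE against the reference DEM) reduces to a few array operations; the two small arrays below are illustrative stand-ins for the reference and a tool-generated DEM:

```python
import numpy as np

def dem_comparison_stats(reference, candidate):
    """Per-cell difference statistics between a reference DEM and a DEM
    generated by one of the analysed tools (equal-sized arrays assumed)."""
    diff = candidate - reference
    return {
        "min": float(diff.min()),
        "max": float(diff.max()),
        "mean": float(diff.mean()),
        "rmse": float(np.sqrt(np.mean(diff ** 2))),
    }

ref = np.array([[100.0, 101.0], [102.0, 103.0]])
cand = np.array([[100.5, 100.5], [102.0, 104.0]])
stats = dem_comparison_stats(ref, cand)
```

    The mean difference exposes systematic bias while the RMSE captures overall scatter, which is why both are reported alongside the extremes.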

  9. EVALUATION OF COMPOSITE RECEPTOR METHODS

    EPA Science Inventory

    A composite receptor model for PM-10 apportionment was evaluated to determine the stability of its solutions and to devise cost-effective measurement strategies. Ambient aerosol samples used in the evaluation were obtained with dichotomous samplers at three sites in the vicinity ...

  10. Local validation of EU-DEM using Least Squares Collocation

    NASA Astrophysics Data System (ADS)

    Ampatzidis, Dimitrios; Mouratidis, Antonios; Gruber, Christian; Kampouris, Vassilios

    2016-04-01

    In the present study we evaluate the European Digital Elevation Model (EU-DEM) over a limited area covering a few kilometers. We compare EU-DEM-derived vertical information against orthometric heights obtained by classical trigonometric leveling for an area located in Northern Greece. We apply several statistical tests and initially fit a surface model in order to quantify the existing biases and outliers. Finally, we implement a methodology for orthometric height prediction, applying Least Squares Collocation to the residuals remaining after the fitted surface is removed. Our results, based on cross-validation points, reveal a local consistency between EU-DEM and official heights that is better than 1.4 meters.
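
    For the collocation step on the de-trended residuals, a minimal 1-D sketch is shown below; the Gaussian covariance model, correlation length, noise level and data values are all assumptions for illustration, not the study's:

```python
import numpy as np

def gaussian_cov(d, c0=1.0, corr_len=500.0):
    """Assumed covariance model C(d) = C0 * exp(-(d / L)^2)."""
    return c0 * np.exp(-(d / corr_len) ** 2)

# Residual heights (m) at 1-D point coordinates (m), left over after
# removing the fitted trend surface.
x_obs = np.array([0.0, 400.0, 900.0, 1500.0])
l_obs = np.array([0.30, 0.10, -0.20, 0.05])
noise_var = 0.01 ** 2                      # assumed observation noise

# Least Squares Collocation prediction: s_hat = C_pm (C_mm + D)^-1 l
d_mm = np.abs(x_obs[:, None] - x_obs[None, :])
C_mm = gaussian_cov(d_mm) + noise_var * np.eye(len(x_obs))

x_new = np.array([0.0, 700.0])             # prediction points
d_pm = np.abs(x_new[:, None] - x_obs[None, :])
s_hat = gaussian_cov(d_pm) @ np.linalg.solve(C_mm, l_obs)
```

    At an observation point the prediction nearly reproduces the observed residual (the noise term prevents exact interpolation); between points it gives a smooth, covariance-weighted estimate that is added back to the trend surface.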

  11. Visualising DEM-related flood-map uncertainties using a disparity-distance equation algorithm

    NASA Astrophysics Data System (ADS)

    Brandt, S. Anders; Lim, Nancy J.

    2016-05-01

    The apparent absoluteness of information presented by crisp-delineated flood boundaries can lead to misconceptions among planners about the inherent uncertainties associated with generated flood maps. Even maps based on hydraulic modelling using the highest-resolution digital elevation models (DEMs), and calibrated with optimal Manning's roughness (n) coefficients, are susceptible to errors when compared to actual flood boundaries, specifically in flat areas. Therefore, the inaccuracies in inundation extents, brought about by the characteristics of the slope perpendicular to the flow direction of the river, have to be accounted for. Instead of using the typical Monte Carlo simulation and probabilistic methods for uncertainty quantification, an empirical disparity-distance equation that considers the effects of both DEM resolution and slope was used to create prediction-uncertainty zones around the resulting inundation extents of a one-dimensional (1-D) hydraulic model. The equation was originally derived for the Eskilstuna River, where flood maps based on DEM data of different resolutions were evaluated for the slope-disparity relationship. To assess whether the equation is applicable to another river with different characteristics, modelled inundation extents from the Testebo River were utilised and tested with the equation. By using the cross-sectional locations, water surface elevations, and DEM, uncertainty zones around the original inundation boundary line can be produced for different confidences. The results show that (1) the proposed method is useful both for estimating and directly visualising model inaccuracies caused by the combined effects of slope and DEM resolution, and (2) the DEM-related uncertainties alone do not account for the total inaccuracy of the derived flood map. Decision-makers can apply it to already existing flood maps, thereby recapitulating and re-analysing the inundation boundaries and the areas that are uncertain

  12. Influence of DEM resolution on drainage network extraction: A multifractal analysis

    NASA Astrophysics Data System (ADS)

    Ariza-Villaverde, A. B.; Jiménez-Hornero, F. J.; Gutiérrez de Ravé, E.

    2015-07-01

    Different hydrological algorithms have been developed to automatically extract drainage networks from digital elevation models (DEMs). D8 is the most widely used algorithm to delineate drainage networks and catchments from a DEM. This algorithm has certain advantages such as simplicity, the provision of a reasonable representation for convergent flow conditions and consistency among flow patterns, calculated contributing areas and the spatial representation of subcatchments. However, it has limitations in selecting suitable flow accumulation threshold values to determine the pixels that belong to drainage networks. Although the effects of DEM resolution on some terrain attributes, stream characterisation and watershed delineation have been studied, analyses of the influence of DEM resolution on flow accumulation threshold values have been limited. Recently, multifractal analyses have been successfully used to find appropriate flow accumulation threshold values. The application of this type of analysis to evaluate the relationship between DEM resolution and flow accumulation threshold value needs to be explored. Therefore, this study tested three DEM resolutions for four drainage basins with different levels of drainage network distribution by comparing the Rényi spectra of the drainage networks that were obtained with the D8 algorithm against those determined by photogrammetric restitution. According to the results, DEM resolution influences the selected flow accumulation threshold value and the simulated network morphology. The suitable flow accumulation threshold value increases as the DEM resolution increases and shows greater variability for basins with lower drainage densities. The links between DEM resolution and terrain attributes were also examined.
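
    A compact sketch of the D8 idea discussed above: each cell drains to its steepest downslope neighbour, accumulated area is passed from high to low cells, and a flow-accumulation threshold then selects the drainage network. The tiny DEM and threshold are toy values; production implementations also handle pits, flats and edge effects:

```python
import numpy as np

def d8_flow_accumulation(dem):
    """Minimal D8: route each cell to its steepest downslope neighbour,
    visiting cells from highest to lowest so donors are processed first."""
    rows, cols = dem.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    acc = np.ones_like(dem)                 # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]
    for r, c in zip(*np.unravel_index(order, dem.shape)):
        best_drop, target = 0.0, None
        for dr, dc in offsets:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                if drop > best_drop:
                    best_drop, target = drop, (rr, cc)
        if target is not None:              # edge/pit cells drain nowhere
            acc[target] += acc[r, c]
    return acc

# Tilted block with a central trench that should collect the flow.
dem = np.array([[9.0, 5.0, 9.0],
                [8.0, 4.0, 8.0],
                [7.0, 1.0, 7.0]])
acc = d8_flow_accumulation(dem)
network = acc >= 3.0    # flow-accumulation threshold (here: 3 cells)
```

    Raising or lowering the threshold grows or prunes the extracted network, which is exactly the sensitivity the multifractal analysis above is used to resolve across DEM resolutions.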

  13. The Importance of Precise Digital Elevation Models (DEM) in Modelling Floods

    NASA Astrophysics Data System (ADS)

    Demir, Gokben; Akyurek, Zuhal

    2016-04-01

    Digital Elevation Models (DEMs) are important topographic inputs for the accurate modelling of floodplain hydrodynamics. Floodplains play a key role as natural retarding pools that attenuate flood waves and suppress flood peaks. GPS, LIDAR and bathymetric surveys are well-known methods for acquiring topographic data. Obtaining topographic data through surveying is not only time consuming and expensive but also sometimes impossible for remote areas. This study aims to demonstrate the importance of accurately modelled topography for flood modelling, using a flood model for Samsun-Terme in the Black Sea region of Turkey. One DEM is obtained from point observations retrieved from 1/5000-scale orthophotos and 1/1000-scale point elevation data from field surveys at cross-sections, with the river banks corrected using the orthophotos and elevation values; this DEM is named the scaled DEM. The other DEM is obtained from bathymetric surveys: 296,538 points and the left/right bank slopes were used to construct a DEM with 1 m spatial resolution, named the base DEM. The two DEMs were compared at 27 cross-sections. The maximum difference at the thalweg of the river bed is 2 m and the minimum difference is 20 cm. The channel conveyance capacity in the base DEM is larger than in the scaled DEM, and the floodplain is modelled in more detail in the base DEM. MIKE21 with a flexible grid is used for two-dimensional shallow water flow modelling. The models built with the two DEMs were calibrated for a flood event (July 9, 2012), with roughness as the calibration parameter. Comparing the input hydrograph at the upstream end of the river with the output hydrograph at the downstream end, the attenuation is 91% and 84% for the base DEM and the scaled DEM, respectively. The time lag in the hydrographs does not differ between the two DEMs and is obtained as 3 hours. Maximum flood extents differ for the two DEMs
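
    The attenuation percentages quoted above follow directly from the peak discharges of the hydrographs at the upstream and downstream ends of the reach; the discharge values below are illustrative, not the study's:

```python
# Peak flow attenuation between inflow and outflow hydrographs.
q_in_peak = 250.0     # m^3/s at the upstream boundary (assumed)
q_out_peak = 22.5     # m^3/s at the downstream end (assumed)
attenuation = (1.0 - q_out_peak / q_in_peak) * 100.0   # percent
```

    With these assumed peaks the attenuation is 91%, matching the form of the base-DEM figure quoted above: the more detailed floodplain stores more of the flood wave.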

  14. Evaluation Methods of The Text Entities

    ERIC Educational Resources Information Center

    Popa, Marius

    2006-01-01

    The paper highlights some evaluation methods to assess the quality characteristics of the text entities. The main concepts used in building and evaluation processes of the text entities are presented. Also, some aggregated metrics for orthogonality measurements are presented. The evaluation process for automatic evaluation of the text entities is…

15. TES overlaid on MOLA DEM

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This image is TES thermal data (Orbit 222) overlaid on the MOLA DEM. The color scale is TES T18-T25, a cold spot index. The grey scale is MOLA elevation in kilometers. Most cold spots can be attributed to surface spectral emissivity effects. Regions colored black-violet-blue have near-unity emissivity and are coarse-grained CO2. Regions that are yellow-red are fine-grained CO2. The red-white spot located at approximately 300°W, 85°N is our most likely candidate for a CO2 snow storm.

  16. The Safeguards Evaluation Method for evaluating vulnerability to insider threats

    SciTech Connect

    Al-Ayat, R.A.; Judd, B.R.; Renis, T.A.

    1986-01-01

    As protection of DOE facilities against outsiders increases to acceptable levels, attention is shifting toward achieving comparable protection against insiders. Since threats and protection measures for insiders are substantially different from those for outsiders, new perspectives and approaches are needed. One such approach is the Safeguards Evaluation Method. This method helps in assessing safeguards vulnerabilities to theft or diversion of special nuclear material (SNM) by insiders. The Safeguards Evaluation Method-Insider Threat is a simple model that can be used by safeguards and security planners to evaluate safeguards and proposed upgrades at their own facilities. A discussion of the Safeguards Evaluation Method is presented in this paper.

  17. An evaluation method for nanoscale wrinkle

    NASA Astrophysics Data System (ADS)

    Liu, Y. P.; Wang, C. G.; Zhang, L. M.; Tan, H. F.

    2016-06-01

    In this paper, a spectrum-based wrinkling analysis method via two-dimensional Fourier transformation is proposed to address the difficulty of evaluating nanoscale wrinkles. It evaluates wrinkle characteristics, including wrinkling wavelength and direction, from a single wrinkling image. Based on this method, the evaluation results for nanoscale wrinkle characteristics agree with published experimental results within an error of 6%. The method is also verified to be appropriate for macro-scale wrinkle evaluation, without scale limitations. Spectrum-based wrinkling analysis is an effective method for nanoscale evaluation and helps to reveal the mechanism of nanoscale wrinkling.
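
    The spectrum-based idea can be sketched as follows: the dominant off-centre peak of the 2-D Fourier spectrum of a wrinkle image gives the wrinkling wavelength (from its radial frequency) and direction (from its angular position). The synthetic image and its wave vector are assumptions for illustration:

```python
import numpy as np

n = 128
fx_true, fy_true = 12 / n, 8 / n             # assumed wrinkle wave vector
y, x = np.mgrid[0:n, 0:n]
img = np.sin(2 * np.pi * (fx_true * x + fy_true * y))  # synthetic wrinkles

# 2-D FFT magnitude spectrum; the strongest off-centre peak encodes
# the wrinkle wavelength and direction.
spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
spec[n // 2, n // 2] = 0.0                   # suppress any DC component
py, px = np.unravel_index(np.argmax(spec), spec.shape)
fy, fx = (py - n // 2) / n, (px - n // 2) / n

est_wavelength = 1.0 / np.hypot(fx, fy)               # pixels per period
est_direction = np.degrees(np.arctan2(fy, fx)) % 180.0  # degrees
```

    The same code applies unchanged to macro-scale images, consistent with the scale-independence claimed above; only the pixel-to-length calibration differs.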

  18. Wiederbeginn nach dem Zweiten Weltkrieg

    NASA Astrophysics Data System (ADS)

    Strecker, Heinrich; Bassenge-Strecker, Rosemarie

    This chapter first describes the situation statistics faced in Germany after the Second World War: the statistical services in the occupation zones had in part to be rebuilt, and statistical teaching at the universities had to be restarted. In this situation the president of the Bavarian Statistical Office (Bayerisches Statistisches Landesamt), Karl Wagner, energetically supported by Gerhard Fürst, later president of the Federal Statistical Office (Statistisches Bundesamt), took the initiative to re-found the German Statistical Society (Deutsche Statistische Gesellschaft, DStatG). The founding meeting in Munich in 1948 became a milestone in the history of the DStatG. The aim was to encourage all statisticians to cooperate, to raise their qualifications to the international level, and to promote the application of newer statistical methods in practice. There followed 24 years of fruitful work under Karl Wagner (1948-1960) and Gerhard Fürst (1960-1972). The contribution outlines the Statistical Weeks, the work of the committees, and the publications of this period.

  19. TecDEM: A MATLAB based toolbox for tectonic geomorphology, Part 1: Drainage network preprocessing and stream profile analysis

    NASA Astrophysics Data System (ADS)

    Shahzad, Faisal; Gloaguen, Richard

    2011-02-01

    We present TecDEM, a software shell implemented in MATLAB that applies tectonic geomorphology tasks to digital elevation models (DEMs). The first part of this paper series describes drainage partitioning schemes and stream profile analysis. The graphical user interface of TecDEM provides several options: determining flow directions, stream vectorization, watershed delineation, Strahler order labeling, stream profile generation, knickpoint selection, and calculation of concavity, steepness and Hack indices. Knickpoints along selected streams, stream profile analysis, and the Hack index per stream profile are computed using a semi-automatic method. TecDEM was used to extract and investigate stream profiles in the Kaghan Valley (Northern Pakistan). Our interpretations of the TecDEM results correlate well with previous tectonic evolution models for this region. TecDEM is designed to assist geoscientists in applying complex tectonic geomorphology tasks to global DEM data.

  20. Genetics | Selection: Evaluation and Methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The procedures used for collecting and analyzing data for genetic improvement of dairy cattle are described. Methods of identification and milk recording are presented. Selection traits include production (milk, fat, and protein yields and component percentages), conformation (final score and linear...

  1. Shuttle radar DEM hydrological correction for erosion modelling in small catchments

    NASA Astrophysics Data System (ADS)

    Jarihani, Ben; Sidle, Roy; Bartley, Rebecca

    2016-04-01

    Digital Elevation Models (DEMs) that accurately replicate both landscape form and processes are critical to support modelling of environmental processes. Catchment- and hillslope-scale runoff and sediment processes (i.e., patterns of overland flow, infiltration, subsurface stormflow and erosion) are all topographically mediated. In remote and data-scarce regions, high-resolution DEMs (LiDAR) are often not available, and moderate- to coarse-resolution digital elevation models (e.g., SRTM) have difficulty replicating detailed hydrological patterns, especially in relatively flat landscapes. Several surface reconditioning algorithms (e.g., smoothing) and "stream burning" techniques (e.g., Agree or ANUDEM), in conjunction with representation of the known stream networks, have been used to improve DEM performance in replicating known hydrology. Detailed stream network data are not available at regional and national scales, but can be derived at local scales from remotely sensed data. This research explores the implications of using high-resolution stream network data derived from Google Earth images for DEM hydrological correction, instead of coarse-resolution stream networks derived from topographic maps. The accuracy of the implemented method in producing hydrologically efficient DEMs was assessed by comparing hydrological parameters derived from the modified DEMs with those from limited high-resolution airborne LiDAR DEMs. The degree of modification is dominated by the method used and the availability of stream network data. Although stream burning techniques improve DEMs hydrologically, they alter DEM characteristics in ways that may affect catchment boundaries, stream position and length, as well as secondary terrain derivatives (e.g., slope, aspect). Modification of a DEM to better reflect known hydrology can be useful; however, knowledge of the magnitude and spatial pattern of the changes is required before using a DEM for subsequent analyses.
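
    A minimal sketch of the stream-burning step discussed above: cells under the mapped stream network are lowered by a fixed drop so that subsequent flow routing follows the known channels. The AGREE method additionally smooths a buffer around the streams; that refinement is omitted here and all values are illustrative:

```python
import numpy as np

def burn_streams(dem, stream_mask, drop=10.0):
    """Lower DEM cells along the mapped stream network by a fixed drop
    (simplified stream burning; buffer smoothing as in AGREE omitted)."""
    burned = dem.copy()
    burned[stream_mask] -= drop
    return burned

dem = np.full((3, 3), 5.0)                # flat toy DEM
streams = np.zeros((3, 3), dtype=bool)
streams[:, 1] = True                      # mapped channel down the middle
burned = burn_streams(dem, streams)
```

    As the abstract cautions, the burned surface should be used for flow routing only: the artificial trench changes slopes and aspects, so secondary terrain derivatives should still be taken from the original DEM.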

  2. Methods of evaluating hair growth.

    PubMed

    Chamberlain, Alexander J; Dawber, Rodney P R

    2003-02-01

    For decades, scientists and clinicians have examined methods of measuring scalp hair growth. With the development of drugs that stem or even reverse the miniaturization of androgenetic alopecia, there has been a greater need for reliable, economical and minimally invasive means of measuring hair growth and, specifically, response to therapy. We review the various methods of measurement described to date, their limitations and value to the clinician. In our opinion, the potential of computer-assisted technology in this field is yet to be maximized and the currently available tools are less than ideal. The most valuable means of measurement at the present time are global photography and phototrichogram-based techniques (with digital image analysis) such as the 'TrichoScan'. Subjective scoring systems are also of value in the overall assessment of response to therapy and these are under-utilized and merit further refinement. PMID:12581076

  3. Evaluation of Rhenium Joining Methods

    NASA Technical Reports Server (NTRS)

    Reed, Brian D.; Morren, Sybil H.

    1995-01-01

    Coupons of rhenium-to-C-103 flat plate joints, formed by explosive and diffusion bonding, were evaluated in a series of shear tests. Shear testing was conducted on as-received, thermally-cycled (100 cycles, from 21 to 1100 C), and thermally-aged (3 and 6 hrs at 1100 C) joint coupons. Shear tests were also conducted on joint coupons with rhenium and/or C-103 electron beam welded tabs to simulate the joint's incorporation into a structure. Ultimate shear strength was used as a figure of merit to assess the effects of the thermal treatment and the electron beam welding of tabs on the joint coupons. All of the coupons survived thermal testing intact and without any visible degradation. Two different lots of as-received, explosively-bonded joint coupons had ultimate shear strengths of 281 and 310 MPa and 162 and 223 MPa, respectively. As-received, diffusion-bonded coupons had ultimate shear strengths of 199 and 348 MPa. For the most part, the thermally-treated and rhenium weld tab coupons had shear strengths slightly reduced or within the range of the as-received values. Coupons with C-103 weld tabs experienced a significant reduction in shear strength. The degradation of strength appeared to be the result of a poor heat sink provided during the electron beam welding. The C-103 base material could not dissipate heat as effectively as rhenium, leading to the formation of a brittle rhenium-niobium intermetallic.

  4. Methods of Generating and Evaluating Hypertext.

    ERIC Educational Resources Information Center

    Blustein, James; Staveley, Mark S.

    2001-01-01

    Focuses on methods of generating and evaluating hypertext. Highlights include historical landmarks; nonlinearity; literary hypertext; models of hypertext; manual, automatic, and semi-automatic generation of hypertext; mathematical models for hypertext evaluation, including computing coverage and correlation; human factors in evaluation; and…

  5. Evaluation Methods for Intelligent Tutoring Systems Revisited

    ERIC Educational Resources Information Center

    Greer, Jim; Mark, Mary

    2016-01-01

    The 1993 paper in "IJAIED" on evaluation methods for Intelligent Tutoring Systems (ITS) still holds up well today. Basic evaluation techniques described in that paper remain in use. Approaches such as kappa scores, simulated learners and learning curves are refinements on past evaluation techniques. New approaches have also arisen, in…

  6. Evaluation Measures and Methods: Some Intersections.

    ERIC Educational Resources Information Center

    Elliott, John

    The literature is reviewed for four combinations of evaluation measures and methods: traditional methods with traditional measures (T-Meth/T-Mea), nontraditional methods with traditional measures (N-Meth/T-Mea), traditional methods with nontraditional measures (T-Meth/N-Mea), and nontraditional methods with nontraditional measures (N-Meth/N-Mea).…

  7. Safeguards Evaluation Method for evaluating vulnerability to insider threats

    SciTech Connect

    Al-Ayat, R.A.; Judd, B.R.; Renis, T.A.

    1986-01-01

    As protection of DOE facilities against outsiders increases to acceptable levels, attention is shifting toward achieving comparable protection against insiders. Since threats and protection measures for insiders are substantially different from those for outsiders, new perspectives and approaches are needed. One such approach is the Safeguards Evaluation Method. This method helps in assessing safeguards vulnerabilities to theft or diversion of special nuclear material (SNM) by insiders. The Safeguards Evaluation Method-Insider Threat is a simple model that can be used by safeguards and security planners to evaluate safeguards and proposed upgrades at their own facilities. The method is used to evaluate the effectiveness of safeguards in both timely detection (in time to prevent theft) and late detection (after-the-fact). The method considers the various types of potential insider adversaries working alone or in collusion with other insiders. The approach can be used for a wide variety of facilities with various quantities and forms of SNM. An Evaluation Workbook provides documentation of the baseline assessment; this simplifies subsequent on-site appraisals. Quantitative evaluation is facilitated by an accompanying computer program. The method significantly increases an evaluation team's on-site analytical capabilities, thereby producing a more thorough and accurate safeguards evaluation.

  8. Hair Evaluation Methods: Merits and Demerits

    PubMed Central

    Dhurat, Rachita; Saraogi, Punit

    2009-01-01

    Various methods are available for evaluation (for diagnosis and/or quantification) of a patient presenting with hair loss. Hair evaluation methods are grouped into three main categories: Non-invasive methods (e.g., questionnaire, daily hair counts, standardized wash test, 60-s hair count, global photographs, dermoscopy, hair weight, contrasting felt examination, phototrichogram, TrichoScan and polarizing and surface electron microscopy), semi-invasive methods (e.g., trichogram and unit area trichogram) and invasive methods (e.g., scalp biopsy). No single method is both 'ideal' and feasible. However, when interpreted with caution, these are valuable tools for patient diagnosis and monitoring. Daily hair counts, wash test, etc. are good methods for primary evaluation of the patient and to get an approximate assessment of the amount of shedding. Some methods like global photography form an important part of any hair clinic. Analytical methods like phototrichogram are usually possible only in the setting of a clinical trial. Many of these methods (like the scalp biopsy) require expertise for both processing and interpreting. We reviewed the available literature in detail in light of merits and demerits of each method. A plethora of newer methods is being introduced, which are relevant to the cosmetic industry/research. Such methods as well as metabolic/hormonal evaluation are not included in this review. PMID:20927232

  9. TanDEM-X high resolution DEMs and their applications to flow modeling

    NASA Astrophysics Data System (ADS)

    Wooten, Kelly M.

    Lava flow modeling can be a powerful tool in hazard assessments; however, the ability to produce accurate models is usually limited by a lack of high resolution, up-to-date Digital Elevation Models (DEMs). This is especially obvious in places such as Kilauea Volcano (Hawaii), where active lava flows frequently alter the terrain. In this study, we use a new technique to create high resolution DEMs on Kilauea using synthetic aperture radar (SAR) data from the TanDEM-X (TDX) satellite. We convert raw TDX SAR data into a geocoded DEM using GAMMA software [Werner et al., 2000]. This process can be completed in several hours and permits creation of updated DEMs as soon as new TDX data are available. To test the DEMs, we use the Harris and Rowland [2001] FLOWGO lava flow model combined with the Favalli et al. [2005] DOWNFLOW model to simulate the 3-15 August 2011 eruption on Kilauea's East Rift Zone. Results were compared with simulations using the older, lower resolution 2000 SRTM DEM of Hawaii. Effusion rates used in the model are derived from MODIS thermal infrared satellite imagery. FLOWGO simulations using the TDX DEM produced a single flow line that matched the August 2011 flow almost perfectly, but could not recreate the entire flow field due to the relatively high DEM noise level. The issues with short model flow lengths can be resolved by filtering noise from the DEM. Model simulations using the outdated SRTM DEM produced a flow field that followed a different trajectory to that observed. Numerous lava flows have been emplaced at Kilauea since the creation of the SRTM DEM, leading the model to project flow lines in areas that have since been covered by fresh lava flows. These results show that DEMs can quickly become outdated on active volcanoes, but our new technique offers the potential to produce accurate, updated DEMs for modeling lava flow hazards.

  10. Statistical Morphometry of Small Martian Craters: New Methods and Results

    NASA Astrophysics Data System (ADS)

    Watters, W. A.; Geiger, L.; Fendrock, M.; Gibson, R.; Radford, A.

    2015-05-01

    Methods for automatic morphometric characterization of craters for large statistical studies; measured dependence of shape on size, terrain, modification, and velocity (via primary-to-secondary distance); evaluation of Ames Stereo Pipeline DEMs.

  11. How does modifying a DEM to reflect known hydrology affect subsequent terrain analysis?

    NASA Astrophysics Data System (ADS)

    Callow, John Nikolaus; Van Niel, Kimberly P.; Boggs, Guy S.

    2007-01-01

    Many digital elevation models (DEMs) have difficulty replicating hydrological patterns in flat landscapes. Efforts to improve DEM performance in replicating known hydrology have included a variety of soft (i.e. algorithm-based) and hard techniques, such as "stream burning" or "surface reconditioning" (e.g. Agree or ANUDEM). Using a representation of the known stream network, these methods trench or mathematically warp the original DEM to improve how accurately stream position, stream length and catchment boundaries replicate known hydrological conditions. However, these techniques permanently alter the DEM and may affect further analyses (e.g. slope). This paper explores the impact that commonly used hydrological correction methods (stream burning, Agree.aml, ANUDEM v4.6.3 and ANUDEM v5.1) have on the overall nature of a DEM, finding that different methods produce non-convergent outcomes for catchment parameters (such as catchment boundaries, stream position and length) and differentially compromise secondary terrain analysis. All hydrological correction methods improved calculation of catchment area, stream position and length compared with the unmodified DEM, but all increased catchment slope, and no single method performed best across all categories. Different hydrological correction methods changed elevation and slope in different spatial patterns and magnitudes, compromising the ability to derive catchment parameters and conduct secondary terrain analysis from a single DEM. Modification of a DEM to better reflect known hydrology can be useful; however, knowledge of the magnitude and spatial pattern of the changes is required before using a DEM for subsequent analyses.
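The paper's central observation, that hydrological correction inflates secondary derivatives such as slope, can be reproduced with a toy experiment. The flat synthetic DEM, 30 m cell size and 5 m trench depth below are assumptions for illustration, not the authors' data or methods.

```python
import numpy as np

def mean_slope_deg(dem, cell=30.0):
    # Slope from finite-difference gradients (magnitude), in degrees.
    dzdy, dzdx = np.gradient(dem, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy))).mean()

dem = np.full((5, 5), 100.0)      # flat synthetic DEM, assumed 30 m cells
burned = dem.copy()
burned[:, 2] -= 5.0               # trench a stream down the middle column

slope_before = mean_slope_deg(dem)
slope_after = mean_slope_deg(burned)
```

On the flat surface the mean slope is exactly zero; after burning it is positive, illustrating why slope (and anything derived from it) should be computed from the unmodified DEM.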

  12. Integration of 2-D hydraulic model and high-resolution LiDAR-derived DEM for floodplain flow modeling

    NASA Astrophysics Data System (ADS)

    Shen, D.; Wang, J.; Cheng, X.; Rui, Y.; Ye, S.

    2015-02-01

    The rapid progress of Light Detection And Ranging (LiDAR) technology has made the acquisition and application of high-resolution digital elevation model (DEM) data increasingly popular, especially in the study of floodplain flow modeling. However, high-resolution DEM data contain many redundant interpolation points, demand a large amount of computation, and do not match the size of the computational mesh; these disadvantages are a common problem for floodplain flow modeling studies. Two-dimensional (2-D) hydraulic modeling, a popular method of analyzing floodplain flow, offers high precision of elevation parameterization for the computational mesh while ignoring much of the micro-topographic information in the DEM data itself. We offer a flood simulation method that integrates 2-D hydraulic model results and high-resolution DEM data, enabling the calculation of flood water levels in DEM grid cells through local inverse distance weighted interpolation. To remove false inundation areas produced during interpolation, the method employs run-length encoding to mark the inundated DEM grid cells and determines the real inundation areas through run-length boundary tracing, which solves the complicated problem of connectivity between DEM grid cells. We constructed a 2-D hydraulic model for the Gongshuangcha polder, a flood storage area of Dongting Lake, using our integrated method to simulate the floodplain flow. The results demonstrate that this method can solve DEM-associated problems efficiently and simulate flooding processes with greater accuracy than DEM-only simulations.
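The local inverse distance weighted step described above might look like this in outline. The node coordinates, water levels and power exponent are hypothetical, and the run-length connectivity filtering is omitted; this is a sketch of the interpolation idea only.

```python
import numpy as np

def idw(points, values, targets, power=2.0):
    """Inverse distance weighted interpolation: transfer water levels from
    hydraulic-mesh nodes onto fine DEM grid-cell centres."""
    out = np.empty(len(targets))
    for i, t in enumerate(targets):
        d = np.linalg.norm(points - t, axis=1)
        if np.any(d == 0):              # target coincides with a mesh node
            out[i] = values[np.argmin(d)]
            continue
        w = 1.0 / d**power
        out[i] = np.sum(w * values) / np.sum(w)
    return out

# Water levels at three hydraulic-mesh nodes (hypothetical coordinates, m).
nodes = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
levels = np.array([2.0, 3.0, 4.0])
cells = np.array([[5.0, 0.0], [0.0, 0.0]])   # two DEM cell centres
wl = idw(nodes, levels, cells)
```

A cell is then flagged as inundated where the interpolated water level exceeds the DEM elevation, after which the connectivity check removes isolated false-positive patches.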

  13. ICESat Validation of TanDEM-X iDEMs over the UK

    NASA Astrophysics Data System (ADS)

    Feng, L.; Muller, J.-P.

    2016-06-01

    From the latest TanDEM-X mission (bistatic X-band interferometric SAR), globally consistent Digital Elevation Models (DEMs) will be available from 2017, but their accuracy has not yet been fully characterised. This paper presents the methods and implementation of statistical procedures for validating the vertical accuracy of TanDEM-X iDEMs at grid spacings of approximately 12.5 m, 30 m and 90 m, based on processed ICESat data over the UK, in order to assess their potential extrapolation across the globe. Against ICESat GLA14 elevation data, the TanDEM-X iDEM has a vertical accuracy of -0.028 ± 3.654 m over England and Wales and 0.316 ± 5.286 m over Scotland at 12.5 m grid spacing, -0.073 ± 6.575 m at 30 m, and 0.0225 ± 9.251 m at 90 m. Moreover, 90 % of all results at the three resolutions of TanDEM-X iDEM data (with a linear error at 90 % confidence level) are below 16.2 m. These validation results also indicate that derivative topographic parameters (slope, aspect and relief) have a strong effect on the vertical accuracy of the TanDEM-X iDEMs. In high-relief and steep-slope terrain, large errors and data voids are frequent, and their location is strongly influenced by topography, whilst in low- to medium-relief and low-slope sites, errors are smaller. ICESat-derived elevations are heavily influenced by surface slope within the 70 m footprint, and there are also slope-dependent errors in the TanDEM-X iDEMs.
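The reported statistics (mean bias ± standard deviation, plus a 90 % linear error) can be computed along these lines; the synthetic elevations below are illustrative stand-ins, not the UK validation data.

```python
import numpy as np

def vertical_accuracy(dem_h, ref_h):
    """Mean bias ± std and 90 % linear error (LE90) of DEM minus reference.

    dem_h, ref_h : 1-D arrays of DEM and ICESat-style reference elevations (m)
    """
    diff = np.asarray(dem_h) - np.asarray(ref_h)
    bias = diff.mean()
    std = diff.std(ddof=1)
    le90 = np.percentile(np.abs(diff), 90)   # linear error at 90 % confidence
    return bias, std, le90

# Synthetic test: reference heights plus assumed 0.1 m bias, 3 m noise.
rng = np.random.default_rng(0)
ref = rng.uniform(0, 500, 1000)
dem = ref + rng.normal(0.1, 3.0, 1000)
bias, std, le90 = vertical_accuracy(dem, ref)
```

In practice the differences would also be binned by slope, aspect and relief classes to expose the terrain dependence the abstract reports.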

  14. An Operator Method for Evaluating Laplace Transforms

    ERIC Educational Resources Information Center

    Lanoue, B. G.; Yurekli, O.

    2005-01-01

    This note discusses a simple operator technique based on the differentiation and shifting properties of the Laplace transform to find Laplace transforms for various elementary functions. The method is simpler than known integration techniques to evaluate Laplace transforms.
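As an illustration of the kind of manipulation such an operator technique enables (an example chosen here for concreteness, not necessarily one from the note): combining the shifting property with the known transform of f(t) = t evaluates a transform without any integration.

```latex
% Shifting property: \mathcal{L}\{e^{at} f(t)\}(s) = F(s-a).
% With f(t) = t, so F(s) = \mathcal{L}\{t\}(s) = 1/s^2:
\mathcal{L}\{t\,e^{at}\}(s) = \frac{1}{(s-a)^2}, \qquad s > a.
% The differentiation property gives the same result:
% \mathcal{L}\{t\,e^{at}\}(s) = -\frac{d}{ds}\,\frac{1}{s-a} = \frac{1}{(s-a)^2}.
```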

  15. [Evaluation methods of HME with tracheostomized patients].

    PubMed

    Li, Min

    2014-03-01

    This paper introduces measurement methods for heat and moisture exchangers (HMEs) used with tracheostomized patients, based on two main parameters (water loss and pressure drop), and proposes additional HME evaluation indicators, such as dead space and heat exchange rate, that can be used to assess the adequacy of HME performance. PMID:24941781

  16. DEVELOPMENT AND EVALUATION OF COMPOSITE RECEPTOR METHODS

    EPA Science Inventory

    A composite receptor method for PM-10 apportionment was evaluated to determine the stability of its solutions and to devise cost-effective measurement strategies. Aerosol samples used in the evaluation were collected during summer, 1982, by dichotomous samplers at three sites in ...

  17. Creating Alternative Methods for Educational Evaluation.

    ERIC Educational Resources Information Center

    Smith, Nick L.

    1981-01-01

    A project supported by the National Institute of Education is adapting evaluation procedures from such areas as philosophy, geography, operations research, journalism, film criticism, and other areas. The need for such methods is reviewed, as is the context in which they function, and their contributions to evaluation methodology. (Author/GK)

  18. Mapping debris-flow hazard in Honolulu using a DEM

    USGS Publications Warehouse

    Ellen, Stephen D.; Mark, Robert K.

    1993-01-01

    A method for mapping hazard posed by debris flows has been developed and applied to an area near Honolulu, Hawaii. The method uses studies of past debris flows to characterize sites of initiation, volume at initiation, and volume-change behavior during flow. Digital simulations of debris flows based on these characteristics are then routed through a digital elevation model (DEM) to estimate degree of hazard over the area.
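The routing step can be illustrated with a minimal D8-style steepest-descent walk over a DEM. This is a simplified, hypothetical stand-in for the actual debris-flow simulation, which also tracks initiation volume and volume-change behavior along the path.

```python
import numpy as np

def steepest_descent_path(dem, start):
    """Trace a flow path by repeatedly stepping to the lowest of the eight
    neighbours (D8-style descent); stop at a pit or flat."""
    path = [start]
    r, c = start
    while True:
        best = None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr, dc) == (0, 0):
                    continue
                rr, cc = r + dr, c + dc
                if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                    if best is None or dem[rr, cc] < dem[best]:
                        best = (rr, cc)
        if best is None or dem[best] >= dem[r, c]:
            break                     # reached a pit or flat; stop routing
        path.append(best)
        r, c = best
    return path

# Synthetic hillslope: elevations rise with row and column; lowest at (0, 0).
dem = np.arange(25, dtype=float).reshape(5, 5)
path = steepest_descent_path(dem, (4, 4))
```

Hazard mapping then aggregates many such simulated paths, weighted by initiation likelihood and flow volume, into a per-cell degree of hazard.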

  19. DEM simulation of oblique boudinage

    NASA Astrophysics Data System (ADS)

    Komoroczi, Andrea; Abe, Steffen; Urai, Janos L.

    2013-04-01

    Boudinage occurs in mechanically layered rocks if there is a component of lengthening parallel to a brittle layer in a ductile matrix. Asymmetric boudin structures develop if the extension is not layer-parallel, and the boudin blocks rotate. The amount of block rotation is commonly used as a shear indicator and has therefore been well studied; however, full oblique boudinage has not yet been modeled. We simulated full boudinage processes during layer-oblique extension using DEM simulation software. In our boudinage model, the initial setup consists of three layers: a brittle, obliquely oriented central layer in a ductile matrix. We simulated horizontal extension by applying vertical displacement: the top and bottom boundaries of the model were moved at a constant velocity, while the side boundaries were force-controlled by applying a constant confining force. By varying the cohesion of the competent layer, boudin blocks of various types and shapes developed. By varying the angle of the competent layer, the rotation of the boudin blocks changed; with a higher dip of the competent layer, the rotation of the boudin blocks is more consistent. We also studied the stress field during the simulation. The results show that, in the case of ductile material, disruption of the layer is driven by the angle of the layer rather than by the orientation of the external stress field.

  20. Bathymetric survey of water reservoirs in north-eastern Brazil based on TanDEM-X satellite data.

    PubMed

    Zhang, Shuping; Foerster, Saskia; Medeiros, Pedro; de Araújo, José Carlos; Motagh, Mahdi; Waske, Bjoern

    2016-11-15

    Water scarcity in the dry season is a vital problem in dryland regions such as northeastern Brazil. Water supplies in these areas often come from numerous reservoirs of various sizes. However, inventory data for these reservoirs is often limited due to the expense and time required for their acquisition via field surveys, particularly in remote areas. Remote sensing techniques provide a valuable alternative to conventional reservoir bathymetric surveys for water resource management. In this study single-pass TanDEM-X data acquired in bistatic mode were used to generate digital elevation models (DEMs) in the Madalena catchment, northeastern Brazil. Validation with differential global positioning system (DGPS) data from field measurements indicated an absolute elevation accuracy of approximately 1 m for the TanDEM-X derived DEMs (TDX DEMs). The DEMs derived from TanDEM-X data acquired at low water levels show significant advantages over bathymetric maps derived from field survey, particularly with regard to coverage, evenly distributed measurements and replication of reservoir shape. Furthermore, by mapping the dry reservoir bottoms with TanDEM-X data, TDX DEMs are free of emergent and submerged macrophytes, independent of water depth (e.g. >10 m), water quality and even weather conditions. Thus, the method is superior to other existing bathymetric mapping approaches, particularly for inland water bodies. The proposed approach relies on (nearly) dry reservoir conditions at times of image acquisition and is thus restricted to areas that show considerable water-level variations. However, comparisons between the TDX DEM and the bathymetric map derived from field surveys show that the amount of water retained during the dry phase has only a marginal impact on the total water volume derived from the TDX DEM. Overall, DEMs generated from bistatic TanDEM-X data acquired in low water periods constitute a useful and efficient data source for deriving reservoir bathymetry and show

  1. Fusion of high-resolution DEMs derived from COSMO-SkyMed and TerraSAR-X InSAR datasets

    NASA Astrophysics Data System (ADS)

    Jiang, Houjun; Zhang, Lu; Wang, Yong; Liao, Mingsheng

    2014-06-01

    Voids caused by shadow, layover, and decorrelation usually occur in digital elevation models (DEMs) of mountainous areas that are derived from interferometric synthetic aperture radar (InSAR) datasets. The presence of voids degrades the quality and usability of the DEMs. Thus, void removal is considered an integral part of DEM production using InSAR data. The fusion of multiple DEMs has been widely recognized as a promising way to remove voids. Because the vertical accuracy of multiple DEMs can differ, the selection of optimum weights becomes a key problem in the fusion and is studied in this article. As a showcase, two high-resolution InSAR DEMs near Mt. Qilian in northwest China are created and then merged. The two pairs of InSAR data were acquired by TerraSAR-X from an ascending orbit and COSMO-SkyMed from a descending orbit. A maximum likelihood fusion scheme, with the weights optimally determined by the height of ambiguity and the variance of phase noise, is adopted to merge the two DEMs in our study. The fused DEM has a fine spatial resolution of 10 m and depicts the landform of the study area well. The percentage of void cells in the fused DEM is only 0.13 %, while 6.9 and 5.7 % of the cells in the COSMO-SkyMed DEM and the TerraSAR-X DEM are originally voids. Using the ICESat/GLAS elevation data and the Chinese national DEM of scale 1:50,000 as references, we evaluate the vertical accuracy of the fused DEM as well as the original InSAR DEMs. The results show that substantial improvements could be achieved by DEM fusion after atmospheric phase screen removal. The quality of the fused DEM can even meet the high-resolution terrain information (HRTI) standard.
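The inverse-variance weighting at the heart of a maximum-likelihood fusion can be sketched as below. The scalar variances and toy grids are hypothetical stand-ins for the per-pixel weights the authors derive from height of ambiguity and phase-noise variance.

```python
import numpy as np

def fuse_dems(dem_a, var_a, dem_b, var_b):
    """Inverse-variance (maximum-likelihood) fusion of two DEMs, filling
    voids (NaN) from whichever input is valid."""
    wa, wb = 1.0 / var_a, 1.0 / var_b
    fused = (wa * dem_a + wb * dem_b) / (wa + wb)
    fused = np.where(np.isnan(dem_a), dem_b, fused)   # void in A -> take B
    fused = np.where(np.isnan(dem_b),                 # void in B -> take A
                     np.where(np.isnan(dem_a), np.nan, dem_a), fused)
    return fused

# Toy 2x2 DEMs with one void each; assumed variances of 1 and 4 m^2.
a = np.array([[100.0, np.nan], [102.0, 103.0]])
b = np.array([[101.0, 99.0], [np.nan, 104.0]])
fused = fuse_dems(a, 1.0, b, 4.0)
```

Where both inputs are valid, the more accurate DEM (smaller variance) dominates the weighted mean; where exactly one is void, the other fills the gap, which is how the fused product reaches a 0.13 % void fraction.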

  2. MARE2DEM: an open-source code for anisotropic inversion of controlled-source electromagnetic and magnetotelluric data using parallel adaptive 2D finite elements (Invited)

    NASA Astrophysics Data System (ADS)

    Key, K.

    2013-12-01

    This work announces the public release of an open-source inversion code named MARE2DEM (Modeling with Adaptively Refined Elements for 2D Electromagnetics). Although initially designed for the rapid inversion of marine electromagnetic data, MARE2DEM now supports a wide variety of acquisition configurations for both offshore and onshore surveys that utilize electric and magnetic dipole transmitters or magnetotelluric plane waves. The model domain is flexibly parameterized using a grid of arbitrarily shaped polygonal regions, allowing complicated structures such as topography or seismically imaged horizons to be easily assimilated. MARE2DEM efficiently solves the forward problem in parallel by dividing the input data parameters into smaller subsets using a parallel data decomposition algorithm. The data subsets are then solved in parallel using an automatic adaptive finite element method that iteratively solves the forward problem on successively refined finite element meshes until a specified accuracy tolerance is met, thus freeing the end user from the burden of designing an accurate numerical modeling grid. Regularized non-linear inversion for isotropic or anisotropic conductivity is accomplished with a new implementation of Occam's method referred to as fast-Occam, which is able to minimize the objective function in far fewer forward evaluations than required by the original method. This presentation will review the theoretical considerations behind MARE2DEM and use a few recent offshore EM data sets to demonstrate its capabilities and to showcase the software interface tools that streamline model building and data inversion.

  3. Evaluation of Sight, Sound, Symbol Instructional Method.

    ERIC Educational Resources Information Center

    Massarotti, Michael C.; Slaichert, William M.

    Evaluated was the Sight-Sound-Symbol (S-S-S) method of teaching basic reading skills with four groups of 16 trainable mentally retarded children. The method involved use of a musical keyboard to teach children to identify numbers, letters, colors, and shapes. Groups either received individual S-S-S instruction for 10 minutes daily, received S-S-S…

  4. AN EVALUATION STUDY OF EPA METHOD 8

    EPA Science Inventory

    Techniques used in EPA Method 8, the source test method for acid mist and sulfur dioxide emissions from sulfuric acid plants, have been evaluated. Evidence is shown that trace amounts of peroxides in isopropyl alcohol result in the conversion of sulfur dioxide to sulfate and caus...

  5. Assessment and Evaluation Methods for Access Services

    ERIC Educational Resources Information Center

    Long, Dallas

    2014-01-01

    This article serves as a primer for assessment and evaluation design by describing the range of methods commonly employed in library settings. Quantitative methods, such as counting and benchmarking measures, are useful for investigating the internal operations of an access services department in order to identify workflow inefficiencies or…

  6. Creating High Quality DEMs of Large Scale Fluvial Environments Using Structure-from-Motion

    NASA Astrophysics Data System (ADS)

    Javernick, L. A.; Brasington, J.; Caruso, B. S.; Hicks, M.; Davies, T. R.

    2012-12-01

    During the past decade, advances in survey and sensor technology have generated new opportunities to investigate the structure and dynamics of fluvial systems. Key geomatic technologies include the Global Positioning System (GPS), digital photogrammetry, LiDAR, and terrestrial laser scanning (TLS). The application of these technologies has resulted in a profound increase in the dimensionality of topographic surveys - from cross-sections to distributed 3d point clouds and digital elevation models (DEMs). Each of these technologies has been used successfully to derive high quality DEMs of fluvial environments; however, they often require specialized and expensive equipment, such as a TLS or large format camera, and bespoke platforms such as survey aircraft, and consequently make data acquisition prohibitively expensive or highly labour intensive, thus restricting the extent and frequency of surveys. Recently, advances in computer vision and image analysis have led to the development of a novel photogrammetric approach that is fully automated and suitable for use with simple compact (non-metric) cameras. In this paper, we evaluate a new photogrammetric method, Structure-from-Motion (SfM), and demonstrate how this can be used to generate DEMs of comparable quality to airborne LiDAR, using consumer grade cameras at low costs. Using the SfM software PhotoScan (version 0.8.5), high quality DEMs were produced for a 1.6 km reach and a 3.3 km reach of the braided Ahuriri River, New Zealand. Photographs used for DEM creation were acquired from a helicopter flying at 600 m and 800 m above ground level using a consumer grade 10.1 mega-pixel, non-metric digital camera, resulting in object space resolution imagery of 0.12 m and 0.16 m respectively. Point clouds for the two study reaches were generated using 147 and 224 photographs respectively, and were extracted automatically in an arbitrary coordinate system; RTK-GPS located ground control points (GCPs) were used to define a 3d non

  7. An efficient method to evaluate energy variances for extrapolation methods

    NASA Astrophysics Data System (ADS)

    Puddu, G.

    2012-08-01

    The energy variance extrapolation method consists of relating the approximate energies in many-body calculations to the corresponding energy variances and inferring eigenvalues by extrapolating to zero variance. The method requires a fast evaluation of the energy variances. For many-body methods that expand the nuclear wavefunctions in terms of deformed Slater determinants, the best available method for the evaluation of energy variances scales with the sixth power of the number of single-particle states. We propose a new method which depends on the number of single-particle orbits and the number of particles rather than the number of single-particle states. We discuss as an example the case of 4He using the chiral N3LO interaction in a basis consisting of up to 184 single-particle states.
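The extrapolation step itself reduces to fitting approximate energies against their variances and reading off the zero-variance intercept as the eigenvalue estimate. The synthetic, exactly linear data below are illustrative; real calculations produce scattered points and may warrant higher-order fits.

```python
import numpy as np

# Assumed "exact" energy and linear coefficient for the synthetic model
# E(v) = E0 + c * v  (energies in MeV, variances in MeV^2).
E0, c = -28.3, 0.5
v = np.array([4.0, 2.5, 1.2, 0.6])      # decreasing energy variances
E = E0 + c * v                          # corresponding approximate energies

coeffs = np.polyfit(v, E, 1)            # linear fit: E = slope * v + intercept
E_extrapolated = coeffs[1]              # zero-variance intercept = estimate of E0
```

The paper's contribution is making the variances `v` cheap to evaluate; the extrapolation shown here is the standard final step.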

  8. An assessment of TanDEM-X GlobalDEM over rural and urban areas

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Huber, Martin; Rudari, Roberto; Eddy, Andrew; Lucas, Richard

    2014-10-01

    A Digital Elevation Model (DEM) is a key input for the development of risk management systems. The main limitation of currently available DEMs is their low resolution. DEMs such as SRTM 90 m or ASTER are globally available free of charge but offer limited use, for example, to flood modelers in most geographic areas. TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement), the first bistatic SAR mission, can fill this gap. The mission objective is the generation of a consistent global digital elevation model with unprecedented accuracy according to the HRTI-3 (High Resolution Terrain Information) specifications. The mission opens a new era in risk assessment. In the framework of ALTAMIRA INFORMATION research activities, the DIAPASON (Differential Interferometric Automated Process Applied to Survey Of Nature) processing chain has been successfully adapted to TanDEM-X CoSSC (Coregistered Slant Range Single Look Complex) data processing. In this study the capability of CoSSC data for DEM generation is investigated. Within the on-going FP7 RASOR project (Rapid Analysis and Spatialisation of Risk), the generated DEMs are compared with the Intermediate DEM derived from the TanDEM-X first global coverage. The results are presented and discussed.

  9. Further evaluation of traditional icing scaling methods

    NASA Technical Reports Server (NTRS)

    Anderson, David N.

    1996-01-01

    This report provides additional evaluations of two methods to scale icing test conditions; it also describes a hybrid technique for use when scaled conditions are outside the operating envelope of the test facility. The first evaluation is of the Olsen method, which can be used to scale the liquid-water content in icing tests, and the second is of the AEDC (Ruff) method, which is used when the test model is less than full size. Equations for both scaling methods are presented in the paper, and the methods were evaluated by performing icing tests in the NASA Lewis Icing Research Tunnel (IRT). The Olsen method was tested using 53 cm diameter NACA 0012 airfoils. Tests covered liquid-water contents that varied by as much as a factor of 1.8. The Olsen method was generally effective in giving scale ice shapes which matched the reference shapes for these tests. The AEDC method was tested with NACA 0012 airfoils with chords from 18 cm to 53 cm. The 53 cm chord airfoils were used in reference tests, and 1/2 and 1/3 scale tests were made at conditions determined by applying the AEDC scaling method. The scale and reference airspeeds were matched in these tests. The AEDC method was found to provide fairly effective scaling for 1/2 size tests, but for 1/3 size models, scaling was generally less effective. In addition to these two scaling methods, a hybrid approach was also tested in which the Olsen method was used to adjust the LWC after size was scaled using the constant Weber number method. This approach was found to be an effective way to test when scaled conditions would otherwise be outside the capability of the test facility.

  10. Assessment of Uncertainty Propagation from DEM's on Small Scale Typologically-Differentiated Landslide Susceptibility in Romania

    NASA Astrophysics Data System (ADS)

    Cosmin Sandric, Ionut; Chitu, Zenaida; Jurchescu, Marta; Malet, Jean-Philippe; Ciprian Margarint, Mihai; Micu, Mihai

    2015-04-01

    An increasing number of free and open access global digital elevation models have become available in the past 15 years, and these DEMs have been widely used for the assessment of landslide susceptibility at medium and small scales. Even though the global vertical and horizontal accuracies of each DEM are known, what is still unknown is the uncertainty that propagates from the first and second derivatives of DEMs, like slope gradient, into the final landslide susceptibility map. For the present study we focused on the assessment of the uncertainty propagated from the following digital elevation models: SRTM at 90m spatial resolution, ASTER GDEM at 30m spatial resolution, EU-DEM at 30m spatial resolution and the latest release of SRTM at 30m spatial resolution. From each DEM dataset the slope gradient was generated and used in the landslide susceptibility analysis. A restricted number of spatial predictors is used for landslide susceptibility assessment, represented by lithology, land-cover and slope, where slope is the only predictor that changes with each DEM. The study makes use of the first national landslide inventory (Micu et al., 2014), obtained by compiling literature data and personal or institutional landslide inventories. The landslide inventory contains more than 27,900 cases classified in three main categories: slides, flows and falls. The results present landslide susceptibility maps obtained from each DEM and from combinations of DEM datasets. Maps with uncertainty propagation at country level, differentiated by topographic regions of Romania and by landslide typology (slides, flows and falls), are obtained for each DEM dataset and for combinations of these. An objective evaluation of each DEM dataset and a final map of landslide susceptibility with the associated uncertainty are provided.
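    Slope is the only predictor above that changes with the DEM because it is a first derivative of elevation. A minimal sketch (not the authors' code) of deriving slope gradient from a gridded DEM with NumPy, assuming square cells:

```python
import numpy as np

def slope_degrees(dem, cell_size):
    """Slope gradient (degrees) from a gridded DEM via central differences."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)            # partial derivatives
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))  # rise/run -> angle

# A plane rising 1 m per metre in x has a 45-degree slope everywhere.
dem = np.fromfunction(lambda i, j: j * 30.0, (5, 5))  # 30 m cells, dz/dx = 1
print(slope_degrees(dem, 30.0)[2, 2])
```

Uncertainty propagation then amounts to running the same susceptibility model on the slope grid of each DEM and comparing the resulting maps.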

  11. APS Removal And Void Filling For DEM Reconstruction From High-Resolution INSAR Data

    NASA Astrophysics Data System (ADS)

    Liao, Mingsheng; Jiang, Houjun; Wang, Teng; Zhang, Lu

    2012-01-01

    The quality and accuracy of DEMs derived from repeat-pass InSAR are limited by atmospheric phase screen (APS) differences and decorrelation between SAR images. In this paper, we present a pragmatic but effective approach to avoid DEM gaps and remove height errors induced by the atmosphere. Existing low-resolution DEMs are used as external data to improve the quality of the interferometric DEM. Our approach focuses on two aspects: 1) estimating the APS from a differential interferogram with a low-pass filter in the frequency domain, and removing the height errors caused by the APS; 2) filling data voids and calibrating the height with an external DEM. The proposed method has been applied to high-resolution COSMO-SkyMed Tandem data with a one-day temporal baseline over Mt. Qilian in north-western China. The resultant DEM has been validated in comparison with an officially-issued 1:50,000 DEM. Our preliminary result shows that atmospheric artifacts and data voids have been removed effectively.

  12. Graphical methods for evaluating covering arrays

    SciTech Connect

    Kim, Youngil; Jang, Dae -Heung; Anderson-Cook, Christine M.

    2015-08-10

    Covering arrays relax the condition of orthogonal arrays by only requiring that all combinations of levels be covered, not that the appearances of all combinations of levels be balanced. This allows a much larger number of factors to be considered simultaneously, but at the cost of poorer estimation of the factor effects. To better understand patterns between sets of columns and to evaluate the degree of coverage when comparing and selecting between alternative arrays, we suggest several new graphical methods that show some of the patterns of coverage for different designs. These graphical methods for evaluating covering arrays are illustrated with a few examples.
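    The coverage condition being relaxed can be quantified directly. The sketch below (an illustration, not one of the paper's graphical methods) computes the fraction of level pairs covered across every pair of columns, the quantity such plots would visualize:

```python
from itertools import combinations, product

def pair_coverage(array, levels):
    """Fraction of all level pairs covered over every pair of columns.
    `array` is a list of runs (rows); `levels` is levels per factor."""
    k = len(array[0])
    covered = total = 0
    for c1, c2 in combinations(range(k), 2):
        seen = {(row[c1], row[c2]) for row in array}  # distinct pairs seen
        covered += len(seen)
        total += levels * levels                      # pairs that must appear
    return covered / total

# A full 2^2 factorial covers every pair of levels in its two columns.
full = list(product([0, 1], repeat=2))
print(pair_coverage(full, 2))  # -> 1.0
```

A strength-2 covering array is exactly an array whose pair coverage equals 1.0 with as few runs as possible.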

  13. Energy efficiency assessment methods and tools evaluation

    SciTech Connect

    McMordie, K.L.; Richman, E.E.; Keller, J.M.; Dixon, D.R.

    1994-08-01

    Many different methods of assessing the energy savings potential at federal installations and identifying attractive projects for capital investment have been used by the different federal agencies. These methods range from high-level estimating tools to detailed design tools, both manual and software-assisted. The methods have different purposes and provide results that are used for different parts of the project identification and implementation process. Seven different assessment methods are evaluated in this study. These methods were selected by the program managers at the DoD Energy Policy Office and the DOE Federal Energy Management Program (FEMP). Each of the methods was applied to similar buildings at Bolling Air Force Base (AFB), unless this was inappropriate or the method was designed to make an installation-wide analysis rather than focusing on particular buildings. Staff at Bolling AFB controlled the collection of data.

  14. Veterinary and human vaccine evaluation methods

    PubMed Central

    Knight-Jones, T. J. D.; Edmond, K.; Gubbins, S.; Paton, D. J.

    2014-01-01

    Despite the universal importance of vaccines, approaches to human and veterinary vaccine evaluation differ markedly. For human vaccines, vaccine efficacy is the proportion of vaccinated individuals protected by the vaccine against a defined outcome under ideal conditions, whereas for veterinary vaccines the term is used for a range of measures of vaccine protection. The evaluation of vaccine effectiveness, vaccine protection assessed under routine programme conditions, is largely limited to human vaccines. Challenge studies under controlled conditions and sero-conversion studies are widely used when evaluating veterinary vaccines, whereas human vaccines are generally evaluated in terms of protection against natural challenge assessed in trials or post-marketing observational studies. Although challenge studies provide a standardized platform on which to compare different vaccines, they do not capture the variation that occurs under field conditions. Field studies of vaccine effectiveness are needed to assess the performance of a vaccination programme. However, if vaccination is performed without central co-ordination, as is often the case for veterinary vaccines, evaluation will be limited. This paper reviews approaches to veterinary vaccine evaluation in comparison to evaluation methods used for human vaccines. Foot-and-mouth disease has been used to illustrate the veterinary approach. Recommendations are made for standardization of terminology and for rigorous evaluation of veterinary vaccines. PMID:24741009
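    The human-vaccine definition of efficacy above is conventionally computed as one minus the relative risk of the defined outcome in vaccinated versus unvaccinated groups. A minimal sketch with purely illustrative counts (not from any trial):

```python
def vaccine_efficacy(cases_vacc, n_vacc, cases_unvacc, n_unvacc):
    """Vaccine efficacy = 1 - relative risk (attack-rate ratio)."""
    risk_v = cases_vacc / n_vacc
    risk_u = cases_unvacc / n_unvacc
    return 1.0 - risk_v / risk_u

# Illustrative numbers only: 5/1000 cases among vaccinated,
# 50/1000 among unvaccinated.
print(vaccine_efficacy(5, 1000, 50, 1000))  # -> 0.9, i.e. 90% efficacy
```

Effectiveness studies apply the same ratio to outcomes observed under routine programme conditions rather than ideal trial conditions.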

  15. Improving the TanDEM-X DEM for flood modelling using flood extents from Synthetic Aperture Radar images.

    NASA Astrophysics Data System (ADS)

    Mason, David; Trigg, Mark; Garcia-Pintado, Javier; Cloke, Hannah; Neal, Jeffrey; Bates, Paul

    2015-04-01

    Many floodplains in the developed world have now been imaged with high-resolution airborne LiDAR or InSAR, giving accurate DEMs that facilitate accurate flood inundation modelling. This is not always the case for remote rivers in developing countries. However, the accuracy of DEMs produced for modelling studies on such rivers should be enhanced in the near future by the high-resolution TanDEM-X World DEM. In a parallel development, increasing use is now being made of flood extents derived from high-resolution SAR images for calibrating, validating and assimilating observations into flood inundation models in order to improve them. The paper discusses an additional use of SAR flood extents: improving the accuracy of the TanDEM-X DEM in the floodplain covered by the flood extents, thereby permanently improving the DEM for future flood modelling studies in this area. The method is based on the fact that for larger rivers the water elevation changes only slowly along a reach, so that the boundary of the flood extent (the waterline) can be regarded locally as a quasi-contour. As a result, the heights of adjacent pixels along a small section of waterline can be regarded as a sample of heights with a common population mean. The height of the central pixel in the section can be replaced with the average of these heights, leading to a more accurate height estimate. While this results in a reduction of the height errors along a waterline, the waterline is only a linear feature in a two-dimensional space. However, improvements to the DEM heights between adjacent pairs of waterlines can also be made, because DEM heights enclosed by the higher waterline of a pair must be no higher than the refined heights along the higher waterline, whereas DEM heights not enclosed by the lower waterline must be no lower than the refined heights along the lower waterline. In addition, DEM heights between the higher and lower waterlines can also be assigned smaller errors because of the
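    The quasi-contour averaging step can be sketched as a sliding mean along the waterline samples. This is a hypothetical illustration of the idea, not the authors' implementation:

```python
import numpy as np

def refine_waterline_heights(heights, window=5):
    """Replace each height along a waterline with the mean of a sliding
    window of neighbours, under the quasi-contour assumption that each
    short section shares a common population mean."""
    half = window // 2
    padded = np.pad(heights, half, mode="edge")          # extend the ends
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode="valid")     # sliding mean

# Noisy DEM samples around a true water level of 10 m tighten toward 10 m.
noisy = np.array([10.4, 9.7, 10.1, 9.9, 10.3, 9.6, 10.0])
refined = refine_waterline_heights(noisy)
print(refined.round(2))
```

Averaging n samples reduces the standard error of each height by roughly sqrt(n), which is what makes the refined waterline a useful constraint on the surrounding floodplain heights.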

  16. Scenario-Based Validation of Moderate Resolution DEMs Freely Available for Complex Himalayan Terrain

    NASA Astrophysics Data System (ADS)

    Singh, Mritunjay Kumar; Gupta, R. D.; Snehmani; Bhardwaj, Anshuman; Ganju, Ashwagosha

    2016-02-01

    The accuracy of a Digital Elevation Model (DEM) affects the accuracy of various geoscience and environmental modelling results. This study evaluates the accuracies of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global DEM Version-2 (GDEM V2), the Shuttle Radar Topography Mission (SRTM) X-band DEM and the NRSC Cartosat-1 DEM V1 (CartoDEM). A high-resolution (1 m) photogrammetric DEM (ADS80 DEM), having a high absolute accuracy [1.60 m linear error at 90% confidence (LE90)], resampled to a 30 m cell size, was used as reference. The overall root mean square error (RMSE) in vertical accuracy was 23, 73, and 166 m, and the LE90 was 36, 75, and 256 m for the ASTER GDEM V2, SRTM X-band DEM and CartoDEM, respectively. A detailed error analysis was performed for individual as well as combined classes of aspect, slope, land cover and elevation zones for the study area. For the ASTER GDEM V2, forest areas with North-facing slopes (0°-5°) in the 4th elevation zone (3773-4369 m) showed a minimum LE90 of 0.99 m, and barren areas with East-facing slopes (>60°) in the 2nd elevation zone (2581-3177 m) showed a maximum LE90 of 166 m. For the SRTM DEM, pixels with South-East-facing slopes of 0°-5° in the 4th elevation zone covered with forest showed the least LE90 of 0.33 m, and a maximum LE90 of 521 m was observed in the barren area with North-East-facing slopes (>60°) in the 4th elevation zone. In the case of the CartoDEM, snow pixels in the 2nd elevation zone with South-East-facing slopes of 5°-15° showed the least LE90 of 0.71 m, and a maximum LE90 of 1266 m was observed for snow pixels in the 3rd elevation zone (3177-3773 m) on South-facing slopes of 45°-60°. These results can be highly useful for researchers using DEM products in various modelling exercises.
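    RMSE and LE90 summarize the vertical error distribution in different ways: RMSE weights large errors quadratically, while LE90 is the error magnitude not exceeded by 90% of checkpoints. A sketch of both metrics on illustrative elevation differences (invented numbers, not the study's data):

```python
import numpy as np

def rmse(errors):
    """Root mean square error of a sample of elevation differences."""
    return float(np.sqrt(np.mean(np.square(errors))))

def le90(errors):
    """Linear error at 90% confidence: 90th percentile of |error|."""
    return float(np.percentile(np.abs(errors), 90))

# Illustrative differences (test DEM minus reference DEM, metres).
diff = np.array([-2.0, 1.5, 0.5, -3.0, 2.5, -1.0, 0.0, 4.0, -2.5, 1.0])
print(round(rmse(diff), 2), round(le90(diff), 2))
```

Because LE90 is a quantile of the absolute error, a few extreme blunders inflate it far more than they inflate the RMSE's square-root averaging, which is why the two can diverge as strongly as the 166 m vs 256 m figures above.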

  17. Evaluating Composition Skills: A Method and Example.

    ERIC Educational Resources Information Center

    McLean, James E.; Chissom, Brad S.

    Holistic evaluation is a reliable, valid, and cost-effective alternative to the usual mechanical assessment of writing. Writing samples are scored on a five-point scale against an overall impression of development, organization, and coherentness. The method was applied to the Communication Activities Skills Project (CASP) for grades 3-12. Writing…

  18. A description of rotations for DEM models of particle systems

    NASA Astrophysics Data System (ADS)

    Campello, Eduardo M. B.

    2015-06-01

    In this work, we show how a vector parameterization of rotations can be adopted to describe the rotational motion of particles within the framework of the discrete element method (DEM). It is based on the use of a special rotation vector, called Rodrigues rotation vector, and accounts for finite rotations in a fully exact manner. The use of fictitious entities such as quaternions or complicated structures such as Euler angles is thereby circumvented. As an additional advantage, stick-slip friction models with inter-particle rolling motion are made possible in a consistent and elegant way. A few examples are provided to illustrate the applicability of the scheme. We believe that simple vector descriptions of rotations are very useful for DEM models of particle systems.
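    As a plain illustration of the axis-angle idea (the classical Rodrigues rotation formula; the paper's Rodrigues-parameter algebra differs in its scaling, so this is a sketch, not the authors' scheme):

```python
import numpy as np

def rotate(v, r):
    """Rotate vector v by the rotation vector r (unit axis times angle),
    using the Rodrigues formula; exact for finite rotations, with no
    quaternions or Euler angles involved."""
    theta = np.linalg.norm(r)
    if theta < 1e-12:              # vanishing rotation: identity
        return v.copy()
    k = r / theta                  # unit rotation axis
    return (v * np.cos(theta)
            + np.cross(k, v) * np.sin(theta)
            + k * np.dot(k, v) * (1.0 - np.cos(theta)))

# A quarter turn about z maps the x axis onto the y axis.
r = np.array([0.0, 0.0, np.pi / 2])
print(rotate(np.array([1.0, 0.0, 0.0]), r).round(6))  # -> [0. 1. 0.]
```

In a DEM particle code, one such rotation vector per particle per time step is enough to update orientations consistently, which is what makes rolling-friction and stick-slip models tractable.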

  19. Influence of the external DEM on PS-InSAR processing and results on Northern Appennine slopes

    NASA Astrophysics Data System (ADS)

    Bayer, B.; Schmidt, D. A.; Simoni, A.

    2014-12-01

    We present an InSAR analysis of slow-moving landslides in the Northern Apennines, Italy, and assess the dependence of the results on the choice of DEM. In recent years, advanced processing techniques for synthetic aperture radar interferometry (InSAR) have been applied to measure slope movements. The persistent scatterers (PS-InSAR) approach is probably the most widely used, and some codes are now available in the public domain. The Stanford Method for Persistent Scatterers (StaMPS) has been successfully used to analyze landslide areas. One problematic step in the processing chain is the choice of an external DEM that is used to model and remove the topographic phase in a series of interferograms in order to obtain the phase contribution caused by surface deformation. The choice is not trivial, because the PS-InSAR results differ significantly in terms of PS identification, positioning, and the resulting deformation signal. We use four different DEMs to process a set of 18 ASAR (Envisat) scenes over a mountainous area (~350 km2) of the Northern Apennines of Italy, using StaMPS. Slow-moving landslides control the evolution of the landscape and cover approximately 30% of the territory. Our focus in this presentation is to evaluate the influence of DEM resolution and accuracy by comparing PS-InSAR results. On an areal basis, we perform a statistical analysis of displacement time-series to make the comparison. We also consider two case studies to illustrate the differences in terms of PS identification, number and estimated displacements. It is clearly shown that DEM accuracy positively influences the number of PS, while line-of-sight rates differ from case to case and can result in deformation signals that are difficult to interpret. We also take advantage of statistical tools to analyze the obtained time-series datasets for the whole study area. Results indicate differences in the style and amount of displacement that can be related to the accuracy of the employed DEM.

  20. Glacier Volume Change Estimation Using Time Series of Improved Aster Dems

    NASA Astrophysics Data System (ADS)

    Girod, Luc; Nuth, Christopher; Kääb, Andreas

    2016-06-01

    Volume change data is critical to understanding glacier response to climate change. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) system aboard the Terra (EOS AM-1) satellite has been a unique source of systematic stereoscopic images covering the whole globe at 15m resolution and at a consistent quality for over 15 years. While satellite stereo sensors with significantly improved radiometric and spatial resolution are available today, the potential of ASTER data lies in its long, consistent time series, which is unrivaled but not fully exploited for change analysis due to a lack of data accuracy and precision. Here, we developed an improved method for ASTER DEM generation and implemented it in the open-source photogrammetric library and software suite MicMac. The method relies on the computation of a rational polynomial coefficients (RPC) model and the detection and correction of cross-track sensor jitter in order to compute DEMs. ASTER data are strongly affected by attitude jitter, mainly at wavelengths of approximately 4 km and 30 km, and improving the generation of ASTER DEMs requires removal of this effect. Our sensor modeling does not require ground control points and thus potentially allows the automatic processing of large data volumes. As a proof of concept, we chose a set of glaciers with reference DEMs available to assess the quality of our measurements. We use time series of ASTER scenes from which we extracted DEMs with a ground sampling distance of 15m. Our method directly measures and accounts for the cross-track component of jitter, so that the resulting DEMs are not contaminated by this process. Since the along-track component of jitter has the same direction as the stereo parallaxes, the two cannot be separated and the extracted elevations are thus contaminated by along-track jitter. Initial tests reveal no clear relation between the cross-track and along-track components, so that the latter seems not to be

  1. Evaluating Sleep Disturbance: A Review of Methods

    NASA Technical Reports Server (NTRS)

    Smith, Roy M.; Oyung, R.; Gregory, K.; Miller, D.; Rosekind, M.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    There are three general approaches to evaluating sleep disturbance with regard to noise: subjective, behavioral, and physiological. Subjective methods range from standardized questionnaires and scales to self-report measures designed for specific research questions. There are two behavioral methods that provide useful sleep disturbance data. One is actigraphy, a motion detector that provides an empirical estimate of sleep quantity and quality. An actigraph, worn on the non-dominant wrist, provides a 24-hr estimate of the rest/activity cycle. The other method involves a behavioral response, either to a specific probe or stimulus, or subject-initiated (e.g., indicating wakefulness). The classic gold standard for evaluating sleep disturbance is continuous physiological monitoring of brain, eye, and muscle activity. This allows detailed distinctions of the states and stages of sleep, awakenings, and sleep continuity. Physiological data can be obtained in controlled laboratory settings and in natural environments. Current ambulatory physiological recording equipment allows evaluation in home and work settings. These approaches will be described and the relative strengths and limitations of each method will be discussed.

  2. Evaluation of Dynamic Methods for Earthwork Assessment

    NASA Astrophysics Data System (ADS)

    Vlček, Jozef; Ďureková, Dominika; Zgútová, Katarína

    2015-05-01

    The rapid development of road construction demands fast, high-quality methods for earthwork evaluation. Dynamic methods are now adopted in numerous civil engineering sectors, and evaluation of earthwork quality in particular can be sped up using dynamic equipment. This paper presents the results of parallel measurements with selected devices for determining the level of compaction of soils. The measurements were used to develop correlations between the values obtained from the various apparatuses. The correlations show that the examined apparatuses are suitable for assessing the compaction level of fine-grained soils, with consideration of the boundary conditions of the equipment used. The presented methods are quick and results can be obtained immediately after measurement; they are thus suitable when construction works have to be performed in a short period of time.

  3. DEM Simulation of Rotational Disruption of Rubble-Pile Asteroids

    NASA Astrophysics Data System (ADS)

    Sanchez, Paul; Scheeres, D. J.

    2010-10-01

    We report on our study of rotation-induced disruption of a self-gravitating granular aggregate using a Discrete Element Method (DEM) granular dynamics code, a class of simulation commonly used in the granular mechanics community. Specifically, we simulate the behavior of a modeled asteroid when subjected to an array of rotation rates that cross its disruption limit. The code used to carry out these studies implements a soft-sphere DEM method as applied to granular systems. In addition, a novel algorithm to calculate self-gravitating forces, which makes use of the DEM static grid, has been developed and implemented in the code. By using a DEM code, it is possible to model a poly-disperse aggregate with a specified size-distribution power law, incorporate contact forces such as dry cohesion and friction, and compute internal stresses within the gravitational aggregate. This approach to the modeling of gravitational aggregates is complementary to, and distinctly different from, other approaches reported in the literature. The simulations use both 2D and 3D modeling for analysis. One aim of this work is to understand the basic processes and dynamics of aggregates during the disruption process. We have used these simulations to understand how to form a contact binary that mimics observed asteroid shapes, how to accelerate the rotation rate of the aggregate so that it has enough time to reshape and find a stable configuration, and how to analyze a system with an occasionally changing shape. From a more physical point of view, we have focused on understanding the dynamics of the reshaping process, the evolution of internal stresses during this reshaping, and finding the critical disruption angular velocity. This research was supported by a grant from NASA's PG&G Program: NNX10AJ66G

  4. Collection of medical drug information in pharmacies: Drug Event Monitoring (DEM) in Japan.

    PubMed

    Hayashi, Sei-ichiro; Nanaumi, Akira; Akiba, Yasuji; Komiyama, Takako; Takeuchi, Koichi

    2005-07-01

    To establish a system for collecting and reporting information from community pharmacists such as that on adverse effects, the Japan Pharmaceutical Association (JPA) conducts Drug Event Monitoring (DEM). In the fiscal year 2002, a survey was carried out to clarify the incidence of sleepiness due to antiallergic drugs. The investigated active ingredients were ebastine, fexofenadine hydrochloride, cetirizine hydrochloride, and loratadine. Community pharmacists asked the following question to patients who visited their pharmacies: "Have you ever become sleepy after taking this drug?" During a 4-week survey period, reports of 94256 cases were collected. To evaluate the incidence of sleepiness, we analyzed cases in which reports showed alleged absence of concomitant oral drugs, and drug use in conformity with the dose and method described in package inserts. The incidence of sleepiness was significantly different among the drugs (chi(2)-test, p<0.001). The observed incidences of sleepiness due to the drugs (8.8-20.5%) were higher than those described in each package insert (1.8-6.35%). This may be because an active question was used ("Have you ever become sleepy after taking this drug?"). Active intervention by pharmacists may be useful for collecting more information on improvement in the QOL of patients and safety. In addition, the pharmacists were asked to report events other than "sleepiness" in the free description column of the report. Some symptoms not described in the package inserts were reported, suggesting that DEM may lead to the discovery of new adverse effects. These results suggest that community pharmacists have a good opportunity to collect information in DEM, and safety information such as that on adverse effects can be obtained from pharmacies. PMID:15997212
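    The significance test reported above is a Pearson chi-square test on a 2 x k table of sleepy versus not-sleepy counts per drug. A sketch with invented counts (not the DEM survey data; the reported 8.8-20.5% incidences only motivate the magnitudes):

```python
def chi_square_2xk(counts):
    """Pearson chi-square statistic for a 2 x k contingency table,
    given as a list of (events, non_events) pairs, one per group."""
    col_totals = [sum(c) for c in counts]          # per-drug totals
    ev = sum(e for e, _ in counts)                 # total events
    ne = sum(n for _, n in counts)                 # total non-events
    grand = ev + ne
    stat = 0.0
    for (e, n), tot in zip(counts, col_totals):
        for obs, row_total in ((e, ev), (n, ne)):
            exp = row_total * tot / grand          # expected count
            stat += (obs - exp) ** 2 / exp
    return stat

# Illustrative counts only: (sleepy, not sleepy) for four drugs.
table = [(205, 795), (88, 912), (160, 840), (120, 880)]
print(round(chi_square_2xk(table), 1))
```

With k = 4 drugs the statistic has 3 degrees of freedom, so values above roughly 16.3 correspond to p < 0.001, the significance level quoted in the abstract.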

  5. Automatic Detection and Boundary Extraction of Lunar Craters Based on LOLA DEM Data

    NASA Astrophysics Data System (ADS)

    Li, Bo; Ling, ZongCheng; Zhang, Jiang; Wu, ZhongChen

    2015-07-01

    Impact-induced circular structures, known as craters, are the most obvious geographic and geomorphic features on the Moon. Studies of lunar craters' patterns and spatial distributions play an important role in understanding the geologic processes of the Moon. In this paper, we propose a method based on digital elevation model (DEM) data from the Lunar Orbiter Laser Altimeter to detect lunar craters automatically. First, the DEM data of the study areas are converted to a series of spatial fields of different scales, in which all overlapping depressions are detected in order (larger depressions first, then smaller ones). Then, each depression's true boundary is calculated by Fourier expansion and shape parameters are computed. Finally, we recognize craters in training sets manually and build a binary decision tree to automatically classify the identified depressions into craters and non-craters. In addition, our crater-detection method can provide a fast and reliable evaluation of the ages of lunar geologic units, which is of great significance in lunar stratigraphy studies as well as global geologic mapping.
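    The pipeline's first stage looks for depressions in the DEM grid. As a toy stand-in for that stage (an illustration only, not the authors' multi-scale detector), the sketch below flags interior cells strictly lower than all eight neighbours as depression seeds:

```python
import numpy as np

def depression_seeds(dem):
    """Boolean mask of interior cells strictly lower than all 8 neighbours:
    a toy stand-in for the depression-detection stage."""
    seeds = np.zeros(dem.shape, dtype=bool)
    rows, cols = dem.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = dem[i - 1:i + 2, j - 1:j + 2].ravel()
            neighbours = np.delete(window, 4)      # drop the centre cell
            seeds[i, j] = bool((dem[i, j] < neighbours).all())
    return seeds

# A flat 5x5 surface with a single pit at the centre.
dem = np.full((5, 5), 10.0)
dem[2, 2] = 7.0
print(depression_seeds(dem).sum())  # -> 1
```

A real detector would then grow each seed to its spill elevation and pass the boundary to the Fourier-expansion and shape-parameter stages described above.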

  6. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, and model-based specific applications and project results, have been published based on a variety of approaches and parametrizations as well as calibrations. Digital Elevation Models (DEMs) come with many different resolution (scale) and quality (accuracy) properties, some resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question starts from simply demonstrating the differences in release risk areas and intensities obtained by applying identical models to DEMs with different properties, and then extends this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters, different metrics are established, based on simple value ranges and probabilities as well as fuzzy expressions and fractal metrics. As a specific approach, work on DEM resolution-dependent 'slope spectra' is considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study, which focuses on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large-area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  7. Ice volumes in the Himalayas and Karakoram: evaluating different assessment methods

    NASA Astrophysics Data System (ADS)

    Frey, Holger; Machguth, Horst; Huggel, Christian; Bajracharya, Samjwal; Bolch, Tobias; Kulkarni, Anil; Linsbauer, Andreas; Stoffel, Markus; Salzmann, Nadine

    2013-04-01

    Knowledge about the volumes and ice-thickness distribution of Himalayan and Karakoram (HK) glaciers is required for assessing the future evolution and estimating the sea-level rise potential of these ice bodies, as well as for predicting impacts on the hydrological cycle. As field measurements of glacier thickness are sparse and restricted to individual glaciers, ice thickness and volume assessments on a larger scale have to rely strongly on modeling approaches. Here, we estimate ice volumes of all glaciers in the HK region using three different approaches, compare the results, and examine the related uncertainties and variability. The approaches used include volume-area scaling relations with different scaling parameters, a slope-dependent thickness estimation, and a new approach to model the ice-thickness distribution based only on digital glacier outlines and a digital elevation model (DEM). By applying different combinations of model parameters and by altering glacier areas by ±5%, uncertainties related to the different methods are evaluated. Glacier outlines have been taken from the Randolph Glacier Inventory (RGI) and the International Centre for Integrated Mountain Development (ICIMOD), with minor changes and additions in some regions; topographic information has been obtained from the Shuttle Radar Topography Mission (SRTM) DEM for all methods. The volume-area scaling approach resulted in glacier volumes ranging from 3632 to 6455 km3, depending on the scaling parameters used. The slope-dependent thickness estimations generated a total ice volume of 3335 km3, and a total volume of 2955 km3 resulted from the modified ice-thickness distribution model. Results of the distributed ice-thickness modeling are clearly at the lowermost bound of previous estimates, and possibly hint at an overestimation of the potential contribution of HK glaciers to sea-level rise. The range of results also indicates that volume estimations are subject to large uncertainties. 
Although they are
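    Volume-area scaling has the form V = c * A**gamma. The sketch below uses one commonly cited parameter pair (c = 0.034, gamma = 1.375, with V in km3 and A in km2); the values are illustrative, since the study's 3632-6455 km3 spread comes precisely from varying such parameters:

```python
def glacier_volume_km3(area_km2, c=0.034, gamma=1.375):
    """Volume-area scaling V = c * A**gamma (V in km^3, A in km^2).
    c and gamma are illustrative scaling parameters, not the study's."""
    return c * area_km2 ** gamma

# Because gamma > 1, total volume must be summed over individual glaciers,
# not computed from the total glacierized area. Areas here are invented.
areas = [1.2, 5.8, 0.4, 23.0]
total = sum(glacier_volume_km3(a) for a in areas)
print(round(total, 2))
```

The superlinear exponent means a few large glaciers dominate the regional total, which is one reason different parameter choices spread the estimates so widely.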

  8. An automated, open-source pipeline for mass production of digital elevation models (DEMs) from very-high-resolution commercial stereo satellite imagery

    NASA Astrophysics Data System (ADS)

    Shean, David E.; Alexandrov, Oleg; Moratto, Zachary M.; Smith, Benjamin E.; Joughin, Ian R.; Porter, Claire; Morin, Paul

    2016-06-01

    We adapted the automated, open source NASA Ames Stereo Pipeline (ASP) to generate digital elevation models (DEMs) and orthoimages from very-high-resolution (VHR) commercial imagery of the Earth. These modifications include support for rigorous and rational polynomial coefficient (RPC) sensor models, sensor geometry correction, bundle adjustment, point cloud co-registration, and significant improvements to the ASP code base. We outline a processing workflow for ∼0.5 m ground sample distance (GSD) DigitalGlobe WorldView-1 and WorldView-2 along-track stereo image data, with an overview of ASP capabilities, an evaluation of ASP correlator options, benchmark test results, and two case studies of DEM accuracy. Output DEM products are posted at ∼2 m with direct geolocation accuracy of <5.0 m CE90/LE90. An automated iterative closest-point (ICP) co-registration tool reduces absolute vertical and horizontal error to <0.5 m where appropriate ground-control data are available, with observed standard deviation of ∼0.1-0.5 m for overlapping, co-registered DEMs (n = 14, 17). While ASP can be used to process individual stereo pairs on a local workstation, the methods presented here were developed for large-scale batch processing in a high-performance computing environment. We are leveraging these resources to produce dense time series and regional mosaics for the Earth's polar regions.

  9. Creating improved ASTER DEMs over glacierized terrain

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2006-12-01

    Digital elevation models (DEMs) produced from ASTER stereo imagery over glacierized terrain frequently contain data voids, which some software packages fill by interpolation. Even when interpolation is applied, the results are often not accurate enough for studies of glacier thickness changes. DEMs are created by automatic cross-correlation between the image pairs, and rely on spatial variability in the digital number (DN) values for this process. Voids occur in radiometrically homogeneous regions, such as glacier accumulation areas covered with uniform snow, due to lack of correlation. The same property that leads to lack of correlation makes possible the derivation of elevation information from photoclinometry, also known as shape-from-shading. We demonstrate a technique to produce improved DEMs from ASTER data by combining the results from conventional cross-correlation DEM-generation software with elevation information produced from shape-from-shading in the accumulation areas of glacierized terrain. The resulting DEMs incorporate more information from the imagery, and the filled voids more accurately represent the glacier surface. This will allow for more accurate determination of glacier hypsometry and thickness changes, leading to better predictions of response to climate change.

  10. Morphological changes at Mt. Etna detected by TanDEM-X

    NASA Astrophysics Data System (ADS)

    Wegmuller, Urs; Bonforte, Alessandro; De Beni, Emanuela; Guglielmino, Francesco; Strozzi, Tazio

    2014-05-01

We compared the 2012 TanDEM-X model with the 2000 SRTM DEM in order to evaluate the morphological changes that occurred on the volcano during the 12-year interval. The pixel size of the SRTM DEM is about 90 m, so we resampled the TanDEM-X model to match this value. The results show that most of the changes occurred in the Valle del Bove and in the summit crater areas. In order to compare DEMs with the same pixel size, we performed a further comparison with a 5 m ground-resolution optical DEM, produced in 2004 and covering only the summit area. The topographic changes were compared with ground mapping surveys, confirming a good correlation with the spatial extent of the lava flows and pyroclastic deposits emplaced on Mt. Etna over the last seven years. The comparison between the two DEMs (2004-2012) allows the volume of erupted volcanics to be calculated and the growth and development of the New South East Crater (NSEC) to be clearly monitored. TanDEM-X is a useful tool for monitoring volcanic areas characterized by quite frequent activity (a paroxysm every 5-10 days), such as Mt. Etna, especially where that activity is concentrated in areas that are not easily accessible.
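The volume estimate described above reduces to DEM differencing; the sketch below assumes the two DEMs are already co-registered and resampled to a common grid, and the function and variable names are illustrative rather than the authors' workflow.

```python
import numpy as np

def volume_change(dem_new, dem_old, pixel_size):
    """Net volume of deposited material between two co-registered DEMs.

    pixel_size: cell edge length in metres; returns cubic metres.
    """
    dh = dem_new - dem_old                         # elevation change per cell
    gain = np.nansum(np.where(dh > 0.0, dh, 0.0))  # deposits only
    return gain * pixel_size ** 2
```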

  11. Hydrologic validation of a structure-from-motion DEM derived from low-altitude UAV imagery

    NASA Astrophysics Data System (ADS)

    Steiner, Florian; Marzolff, Irene; d'Oleire-Oltmanns, Sebastian

    2015-04-01

The increasing ease of use of current Unmanned Aerial Vehicles (UAVs) and 3D image processing software has spurred the number of applications relying on high-resolution topographic datasets. Of particular significance in this field is "structure from motion" (SfM), a photogrammetric technique used to generate low-cost digital elevation models (DEMs) for erosion budgeting, measurement of glaciers and lava flows, archaeological applications, and others. Originally designed to generate 3D models of buildings from unordered collections of images, it has become increasingly common in geoscience applications during the last few years. Several studies on the accuracy of this technique already exist, in which the SfM data are mostly compared with lidar-generated terrain data. The results are mainly positive, indicating that the technique is suitable for such applications. This work aims at validating very high resolution SfM DEMs with a different approach: not the original elevation data, but terrain-related hydrological and geomorphometric parameters derived from the DEM are validated. The study site chosen for this analysis is an abandoned agricultural field near the city of Taroudant, in the semi-arid southern part of Morocco. The site is characterized by aggressive rill and gully erosion and is, apart from sparsely scattered shrub cover, mainly featureless. An area of 5.7 ha, equipped with 30 high-precision ground control points (GCPs), was covered by UAV flights at two different heights (85 and 170 m). A selection of 160 images was used to generate several high-resolution DEMs (2 and 5 cm resolution) of the area using the fully automated SfM software AGISOFT Photoscan. For comparison purposes, a conventional photogrammetry-based workflow using the Leica Photogrammetry Suite was used to generate a DEM with a resolution of 5 cm (LPS DEM).
The evaluation is done by comparison of the SfM DEM with the derived orthoimages and the LPS DEM

  12. Development of a 'bare-earth' SRTM DEM product

    NASA Astrophysics Data System (ADS)

    O'Loughlin, Fiachra; Paiva, Rodrigo; Durand, Michael; Alsdorf, Douglas; Bates, Paul

    2015-04-01

We present the methodology and results from the development of a near-global 'bare-earth' Digital Elevation Model (DEM) derived from the Shuttle Radar Topography Mission (SRTM) data. Digital Elevation Models are the most important input for hydraulic modelling, as the DEM quality governs the accuracy of the model outputs. While SRTM is currently the best near-globally [60N to 60S] available DEM, it requires adjustments to reduce the vegetation contamination and make it useful for hydrodynamic modelling over heavily vegetated areas (e.g. tropical wetlands). Unlike previous methods of accounting for vegetation contamination, which concentrated on correcting relatively small areas and usually applied a static adjustment, we account for vegetation contamination globally and apply a spatially varying correction based on information about canopy height and density. In creating the final 'bare-earth' SRTM DEM dataset, we produced three different 'bare-earth' SRTM products. The first applies global parameters, while the second and third apply parameters regionalised by climatic zone or vegetation type, respectively. We also tested two canopy density proxies of different spatial resolution. Using ground elevations obtained from the ICESat GLA14 satellite altimeter, we calculate the residual errors for the raw SRTM and the three 'bare-earth' SRTM products and compare their performance. The three 'bare-earth' products all show large improvements over the raw SRTM in vegetated areas, with the overall mean bias reduced by between 75 and 92%, from 4.94 m to as low as 0.40 m. The overall standard deviation is reduced by between 29 and 33%, from 7.12 m to as low as 4.80 m. As expected, improvements are greater in areas with denser vegetation. The final 'bare-earth' SRTM dataset is available at 3 arc-second resolution, with lower vertical height errors and less noise than the original SRTM product.
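The vegetation correction has the following general shape: subtract a fraction of canopy height, scaled by canopy density, from the raw elevations. The penetration factor and its regionalisation are the substance of the paper, so the constant and names used in this sketch are purely illustrative.

```python
import numpy as np

def bare_earth(srtm, canopy_height, canopy_density, penetration=0.5):
    """Illustrative vegetation-bias removal.

    All inputs are co-registered 2-D arrays; canopy_density in [0, 1].
    penetration is a placeholder for the paper's regionalised parameters.
    """
    correction = penetration * canopy_height * canopy_density
    return srtm - correction
```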

  13. Validation of DEM prediction for granular avalanches on irregular terrain

    NASA Astrophysics Data System (ADS)

    Mead, Stuart R.; Cleary, Paul W.

    2015-09-01

Accurate numerical simulation can provide crucial information useful for a greater understanding of destructive granular mass movements such as rock avalanches, landslides, and pyroclastic flows. It enables more informed and relatively low-cost investigation of significant risk factors, mitigation strategy effectiveness, and sensitivity to initial conditions, material, or soil properties. In this paper, a granular avalanche experiment from the literature is reanalyzed and used as a basis to assess the accuracy of discrete element method (DEM) predictions of avalanche flow. Discrete granular approaches such as DEM simulate the motion and collisions of individual particles and are useful for identifying and investigating the controlling processes within an avalanche. Using a superquadric shape representation, DEM simulations were found to accurately reproduce transient and static features of the avalanche. The effect of material properties on the shape of the avalanche deposit was investigated. The simulated avalanche deposits were found to be sensitive to particle shape and friction, with the particle shape causing the sensitivity to friction to vary. The importance of particle shape, coupled with its effect on the sensitivity to friction, highlights the importance of quantifying and including particle shape effects in numerical modeling of granular avalanches.

  14. Efficient parallel CFD-DEM simulations using OpenMP

    NASA Astrophysics Data System (ADS)

    Amritkar, Amit; Deb, Surya; Tafti, Danesh

    2014-01-01

The paper describes parallelization strategies for the Discrete Element Method (DEM) used for simulating dense particulate systems coupled to Computational Fluid Dynamics (CFD). While the field equations of CFD are best parallelized by spatial domain decomposition techniques, the N-body particulate phase is best parallelized over the number of particles. When the two are coupled together, both modes are needed for efficient parallelization. It is shown that under these requirements, OpenMP thread-based parallelization has advantages over MPI processes. Two representative examples, fairly typical of dense fluid-particulate systems, are investigated, including the validation of the DEM-CFD and thermal-DEM implementations against experiments. Fluidized bed calculations are performed on beds with uniform particle loading, parallelized with MPI and OpenMP. It is shown that as the number of processing cores and the number of particles increase, the communication overhead of building ghost particle lists at processor boundaries dominates time to solution, and OpenMP, which does not require this step, is about twice as fast as MPI. In rotary kiln heat transfer calculations, which are characterized by spatially non-uniform particle distributions, the low overhead of switching the parallelization mode in OpenMP eliminates the load imbalances but introduces increased overheads in fetching non-local data. In spite of this, it is shown that OpenMP is between 50 and 90% faster than MPI.

  15. Graphical methods for evaluating covering arrays

    DOE PAGESBeta

    Kim, Youngil; Jang, Dae -Heung; Anderson-Cook, Christine M.

    2016-06-01

Covering arrays relax the condition of orthogonal arrays by only requiring that all combinations of levels be covered, not that the appearances of all combinations of levels be balanced. This allows a much larger number of factors to be considered simultaneously, but at the cost of poorer estimation of the factor effects. To better understand patterns between sets of columns, and to evaluate the degree of coverage in order to compare and select between alternative arrays, we suggest several new graphical methods that show some of the patterns of coverage for different designs. These graphical methods for evaluating covering arrays are illustrated with a few examples.
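The coverage property being visualised can be made concrete with a small checker. This is a generic sketch of t-way coverage counting, not the paper's graphical methods; the function name is illustrative.

```python
from itertools import combinations, product

def coverage(array, t=2):
    """Fraction of all t-way level combinations covered by the rows of `array`.

    `array` is a list of equal-length tuples of factor levels; a covering
    array of strength t has coverage 1.0.
    """
    ncols = len(array[0])
    levels = [sorted({row[c] for row in array}) for c in range(ncols)]
    total = covered = 0
    for cols in combinations(range(ncols), t):
        # combinations actually appearing in these columns
        seen = {tuple(row[c] for c in cols) for row in array}
        for combo in product(*(levels[c] for c in cols)):
            total += 1
            covered += combo in seen
    return covered / total
```

A 4-run design on three binary factors covers every 2-way combination (coverage 1.0) while covering only half of the 3-way combinations.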

  16. Field evaluation of a VOST sampling method

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Fuerst, R.G.; McGaughey, J.F.; Bursey, J.T.; Merrill, R.G.

    1994-12-31

The VOST (SW-846 Method 0030) specifies the use of Tenax® and a particular petroleum-based charcoal (SKC Lot 104, or its equivalent) that is no longer commercially available. In field evaluation studies of VOST methodology, a replacement petroleum-based charcoal has been used: candidate replacement sorbents for charcoal were studied, and Anasorb® 747, a carbon-based sorbent, was selected for field testing. The sampling train was modified to use only Anasorb® in the back tube and Tenax® in the two front tubes, to avoid analytical difficulties associated with the analysis of the sequential-bed back tube used in the standard VOST train. The standard (SW-846 Method 0030) and the modified VOST methods were evaluated at a chemical manufacturing facility using a quadruple probe system with quadruple trains. In this field test, known concentrations of the halogenated volatile organic compounds listed in Title III of the Clean Air Act Amendments of 1990 were introduced into the VOST train and the modified VOST train, using the same certified gas cylinder as the source of test compounds. Statistical tests of the comparability of the methods were performed on a compound-by-compound basis. For most compounds, the VOST and modified VOST methods were found to be statistically equivalent.

  17. Electromagnetic Imaging Methods for Nondestructive Evaluation Applications

    PubMed Central

    Deng, Yiming; Liu, Xin

    2011-01-01

Electromagnetic nondestructive tests are important and widely used within the field of nondestructive evaluation (NDE). Recent advances in sensing technology, in hardware and software dedicated to imaging and image processing, and in materials science have greatly expanded the application fields and the sophistication of systems design, making the potential of electromagnetic NDE imaging seemingly unlimited. This review provides a comprehensive summary of research on electromagnetic imaging methods for NDE applications, followed by a summary and discussion of future directions. PMID:22247693

  18. ArcGeomorphometry: A toolbox for geomorphometric characterisation of DEMs in the ArcGIS environment

    NASA Astrophysics Data System (ADS)

    Rigol-Sanchez, Juan P.; Stuart, Neil; Pulido-Bosch, Antonio

    2015-12-01

A software tool is described for the extraction of geomorphometric land-surface variables and features from Digital Elevation Models (DEMs). The ArcGeomorphometry Toolbox consists of a series of Python/Numpy processing functions, presented through an easy-to-use graphical menu for the widely used ArcGIS package. Although many GIS provide some operations for analysing DEMs, the methods are often only partially implemented and can be difficult to find and use effectively. Since the results of automated characterisation of landscapes from DEMs are influenced by the extent being considered, the resolution of the source DEM and the size of the kernel (analysis window) used for processing, we have developed a tool to allow GIS users to flexibly apply several multi-scale analysis methods to parameterise and classify a DEM into discrete land-surface units. Users can control the threshold values for land-surface classifications. The size of the processing kernel can be used to identify land-surface features across a range of landscape scales. The pattern of land-surface units from each attempt at classification is displayed immediately and can then be processed in the GIS alongside additional data that can assist with a visual assessment and comparison of a series of results. The functionality of the ArcGeomorphometry toolbox is described using an example DEM.
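As a minimal illustration of the kind of land-surface variable such a toolbox extracts, slope can be computed from a DEM raster with central differences. The toolbox itself works inside ArcGIS and offers many more variables and kernel sizes; this NumPy sketch shows only the finite-difference idea.

```python
import numpy as np

def slope_degrees(dem, cellsize):
    """Slope in degrees from a 2-D elevation array using central differences."""
    dzdy, dzdx = np.gradient(dem, cellsize)      # per-axis elevation gradients
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
```

For a plane rising one metre per one-metre cell, the slope is 45° everywhere.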

  19. Uncertainty of SWAT model at different DEM resolutions in a large mountainous watershed.

    PubMed

    Zhang, Peipei; Liu, Ruimin; Bao, Yimeng; Wang, Jiawei; Yu, Wenwen; Shen, Zhenyao

    2014-04-15

The objective of this study was to enhance understanding of the sensitivity of the SWAT model to the resolution of Digital Elevation Models (DEMs), based on the analysis of multiple evaluation indicators. The Xiangxi River, a large tributary of the Three Gorges Reservoir in China, was selected as the study area. A range of 17 DEM spatial resolutions, from 30 to 1000 m, was examined, and the annual and monthly model outputs based on each resolution were compared. The following results were obtained: (i) sediment yield was greatly affected by DEM resolution; (ii) the prediction of dissolved oxygen load was significantly affected by DEM resolutions coarser than 500 m; (iii) Total Nitrogen (TN) load was not greatly affected by the DEM resolution; (iv) Nitrate Nitrogen (NO₃-N) and Total Phosphorus (TP) loads were slightly affected by the DEM resolution; and (v) flow and Ammonia Nitrogen (NH₄-N) load were essentially unaffected by the DEM resolution. The flow and dissolved oxygen load decreased more significantly in the dry season than in the wet and normal seasons. Excluding flow and dissolved oxygen, the uncertainties of the other Hydrology/Non-point Source (H/NPS) pollution indicators were greater in the wet season than in the dry and normal seasons. Considering the temporal distribution uncertainties, the optimal DEM resolution was 30-200 m for flow, 30-100 m for sediment and TP, 30-300 m for dissolved oxygen and NO₃-N, 30-70 m for NH₄-N, and 30-150 m for TN. PMID:24509347

  20. Comparative evaluation of pelvic allograft selection methods.

    PubMed

    Bousleiman, Habib; Paul, Laurent; Nolte, Lutz-Peter; Reyes, Mauricio

    2013-05-01

    This paper presents a firsthand comparative evaluation of three different existing methods for selecting a suitable allograft from a bone storage bank. The three examined methods are manual selection, automatic volume-based registration, and automatic surface-based registration. Although the methods were originally published for different bones, they were adapted to be systematically applied on the same data set of hemi-pelvises. A thorough experiment was designed and applied in order to highlight the advantages and disadvantages of each method. The methods were applied on the whole pelvis and on smaller fragments, thus producing a realistic set of clinical scenarios. Clinically relevant criteria are used for the assessment such as surface distances and the quality of the junctions between the donor and the receptor. The obtained results showed that both automatic methods outperform the manual counterpart. Additional advantages of the surface-based method are in the lower computational time requirements and the greater contact surfaces where the donor meets the recipient. PMID:23299829

  1. Assessment methods for the evaluation of vitiligo.

    PubMed

    Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K

    2012-12-01

    There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials. PMID:22416879

  2. Evaluation of Alternate Surface Passivation Methods (U)

    SciTech Connect

    Clark, E

    2005-05-31

Stainless steel containers were assembled from parts passivated by four commercial vendors using three passivation methods. The performance of these containers in storing hydrogen isotope mixtures was evaluated by monitoring the composition of an initially 50% H₂/50% D₂ gas mixture over time using mass spectroscopy. Commercial passivation by electropolishing appears to result in surfaces that do not catalyze hydrogen isotope exchange. This method of surface passivation shows promise for tritium service and should be studied further and considered for use. On the other hand, nitric acid passivation and citric acid passivation may not produce surfaces that resist catalyzing the isotope exchange reaction H₂ + D₂ → 2HD. These methods should not be considered as replacements for the proprietary passivation processes of the two current vendors used at the Savannah River Site Tritium Facility.

  3. Arthroscopic proficiency: methods in evaluating competency

    PubMed Central

    2013-01-01

Background The current paradigm of arthroscopic training lacks objective evaluation of technical ability, and its adequacy is concerning given the accelerating complexity of the field. To combat insufficiencies, emphasis is shifting towards skill acquisition outside the operating room and sophisticated assessment tools. We reviewed (1) the validity of cadaver and surgical simulation in arthroscopic training, (2) the role of psychomotor analysis and arthroscopic technical ability, (3) what validated assessment tools are available to evaluate technical competency, and (4) the quantification of arthroscopic proficiency. Methods The Medline and Embase databases were searched for published articles in the English literature pertaining to arthroscopic competence, arthroscopic assessment and evaluation, and objective measures of arthroscopic technical skill. Abstracts were independently evaluated, and exclusion criteria included articles outside the scope of knee and shoulder arthroscopy as well as original articles about specific therapies, outcomes and diagnoses, leaving 52 articles cited in this review. Results Simulated arthroscopic environments exhibit high levels of internal validity and consistency for simple arthroscopic tasks; however, the ability to transfer complex skills to the operating room has not yet been established. Instrument and force trajectory data can discriminate between technical ability for basic arthroscopic parameters and may serve as useful adjuncts to more comprehensive techniques. There is a need for arthroscopic assessment tools for standardized evaluation and objective feedback of technical skills, yet few comprehensive instruments exist, especially for the shoulder. Opinion on the required arthroscopic experience to obtain proficiency remains guarded, and few governing bodies specify absolute quantities. Conclusions Further validation is required to demonstrate the transfer of complex arthroscopic skills from simulated environments to the

  4. Alternative haplotype construction methods for genomic evaluation.

    PubMed

    Jónás, Dávid; Ducrocq, Vincent; Fouilloux, Marie-Noëlle; Croiseau, Pascal

    2016-06-01

    Genomic evaluation methods today use single nucleotide polymorphism (SNP) as genomic markers to trace quantitative trait loci (QTL). Today most genomic prediction procedures use biallelic SNP markers. However, SNP can be combined into short, multiallelic haplotypes that can improve genomic prediction due to higher linkage disequilibrium between the haplotypes and the linked QTL. The aim of this study was to develop a method to identify the haplotypes, which can be expected to be superior in genomic evaluation, as compared with either SNP or other haplotypes of the same size. We first identified the SNP (termed as QTL-SNP) from the bovine 50K SNP chip that had the largest effect on the analyzed trait. It was assumed that these SNP were not the causative mutations and they merely indicated the approximate location of the QTL. Haplotypes of 3, 4, or 5 SNP were selected from short genomic windows surrounding these markers to capture the effect of the QTL. Two methods described in this paper aim at selecting the most optimal haplotype for genomic evaluation. They assumed that if an allele has a high frequency, its allele effect can be accurately predicted. These methods were tested in a classical validation study using a dairy cattle population of 2,235 bulls with genotypes from the bovine 50K SNP chip and daughter yield deviations (DYD) on 5 dairy cattle production traits. Combining the SNP into haplotypes was beneficial with all tested haplotypes, leading to an average increase of 2% in terms of correlations between DYD and genomic breeding value estimates compared with the analysis when the same SNP were used individually. Compared with haplotypes built by merging the QTL-SNP with its flanking SNP, the haplotypes selected with the proposed criteria carried less under- and over-represented alleles: the proportion of alleles with frequencies <1 or >40% decreased, on average, by 17.4 and 43.4%, respectively. The correlations between DYD and genomic breeding value

  5. Evaluating conflation methods using uncertainty modeling

    NASA Astrophysics Data System (ADS)

    Doucette, Peter; Dolloff, John; Canavosio-Zuzelski, Roberto; Lenihan, Michael; Motsko, Dennis

    2013-05-01

    The classic problem of computer-assisted conflation involves the matching of individual features (e.g., point, polyline, or polygon vectors) as stored in a geographic information system (GIS), between two different sets (layers) of features. The classical goal of conflation is the transfer of feature metadata (attributes) from one layer to another. The age of free public and open source geospatial feature data has significantly increased the opportunity to conflate such data to create enhanced products. There are currently several spatial conflation tools in the marketplace with varying degrees of automation. An ability to evaluate conflation tool performance quantitatively is of operational value, although manual truthing of matched features is laborious and costly. In this paper, we present a novel methodology that uses spatial uncertainty modeling to simulate realistic feature layers to streamline evaluation of feature matching performance for conflation methods. Performance results are compiled for DCGIS street centerline features.
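The simulate-then-score idea can be sketched as follows. The paper's spatial uncertainty model is far richer than the independent Gaussian perturbation used here, and the matcher is a plain nearest-neighbour rule; names and thresholds are illustrative only.

```python
import numpy as np

def match_rate(truth, sigma, threshold, seed=0):
    """Perturb a 'truth' layer of 2-D point features with Gaussian positional
    error, then score a nearest-neighbour matcher against the known
    correspondences (row i of the perturbed layer should match row i of truth).
    """
    rng = np.random.default_rng(seed)
    perturbed = truth + rng.normal(0.0, sigma, truth.shape)
    # pairwise distances: perturbed features (rows) vs truth features (cols)
    d = np.linalg.norm(perturbed[:, None, :] - truth[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    correct = nearest == np.arange(len(truth))   # matched the right feature
    within = d.min(axis=1) <= threshold          # and within tolerance
    return float(np.mean(correct & within))
```

Because the truth layer is simulated, no manual truthing is needed: the correct correspondences are known by construction, which is the point of the approach.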

  6. DEM interpolation based on artificial neural networks

    NASA Astrophysics Data System (ADS)

    Jiao, Limin; Liu, Yaolin

    2005-10-01

This paper proposes a systematic scheme for Digital Elevation Model (DEM) interpolation based on Artificial Neural Networks (ANNs). We employ a BP network to fit the terrain surface, and then detect and eliminate the samples with gross errors. A Self-Organizing Feature Map (SOFM) is used to cluster the elevation samples; after clustering, the study area is divided into more homogeneous tiles, and a BP model is employed to interpolate the DEM within each cluster. Because error samples are eliminated and clusters are built, the interpolation result is better. The case study indicates that the ANN interpolation scheme is feasible, and that it achieves a more accurate result than polynomial and spline interpolation. ANN interpolation does not require the interpolation function to be determined beforehand, so man-made influence is lessened, making the approach more automatic and intelligent. At the end of the paper, we propose the idea of constructing an ANN surface model, which can be used in multi-scale DEM visualization, DEM generalization, etc.
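The BP-network idea can be sketched as a one-hidden-layer network trained by gradient descent to map scattered (x, y) samples to elevation. This is an illustrative toy on synthetic terrain, not the authors' implementation; the SOFM clustering and gross-error removal steps are not shown, and the network size and learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (400, 2))           # scattered (x, y) sample points
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1]     # synthetic terrain elevations

H = 32                                         # hidden units
W1 = rng.normal(0.0, 1.0, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):                          # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)                   # hidden activations
    pred = (h @ W2 + b2).ravel()
    err = pred - y
    gW2 = h.T @ err[:, None] / len(X)          # backpropagated gradients
    gb2 = np.array([err.mean()])
    dh = (err[:, None] * W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def interp(xy):
    """Interpolate elevation at new (x, y) locations."""
    return (np.tanh(np.asarray(xy) @ W1 + b1) @ W2 + b2).ravel()
```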

  7. Evaluation of Scaling Methods for Rotorcraft Icing

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Kreeger, Richard E.

    2010-01-01

This paper reports results of an experimental study in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the current recommended scaling methods, developed for fixed-wing unprotected-surface icing applications, might apply to representative rotor blades at finite angle of attack. Unlike the fixed-wing case, there is no single scaling method that has been systematically developed and evaluated for rotorcraft icing applications. In the present study, scaling was based on the modified Ruff method with scale velocity determined by maintaining constant Weber number. Models were unswept NACA 0012 wing sections. The reference model had a chord of 91.4 cm and the scale model had a chord of 35.6 cm. Reference tests were conducted with velocities of 76 and 100 kt (39 and 52 m/s), droplet MVDs of 150 and 195 μm, and with stagnation-point freezing fractions of 0.3 and 0.5 at angles of attack of 0° and 5°. It was shown that good ice shape scaling was achieved for NACA 0012 airfoils with angle of attack up to 5°.
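The constant-Weber-number velocity scaling can be written out explicitly. With the Weber number We = ρV²L/σ and the same fluid at both scales, ρ and σ cancel, so holding We constant implies V_scale = V_ref·√(L_ref/L_scale). Treating the model chord as the Weber length scale is an assumption of this sketch, not a statement of the modified Ruff method, which fixes several other similarity parameters as well.

```python
import math

def weber_scale_velocity(v_ref, length_ref, length_scale):
    """Scale-model velocity that holds We = rho * V**2 * L / sigma constant,
    assuming identical fluid properties at both scales."""
    return v_ref * math.sqrt(length_ref / length_scale)

# Reference velocity 76 kt, chords 91.4 cm (reference) and 35.6 cm (scale);
# chord as the Weber length scale is illustrative only.
v_scale = weber_scale_velocity(76.0, 91.4, 35.6)
```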

  8. An elementary calculus method for evaluating ?

    NASA Astrophysics Data System (ADS)

    Lee, Tuo Yeong; Xiong, Yuxuan

    2014-08-01

    We use freshman calculus to prove that for θ ∈ (0, π) and n = 0, 1, 2, … ; in particular, we obtain a simple unified method for evaluating the following infinite series:

  9. SHADED RELIEF, HILLSHADE, DIGITAL ELEVATION MODEL (DEM), NEVADA

    EPA Science Inventory

    Shaded relief of the state of Nevada developed from 1-degree US Geological Survey (USGS) Digital Elevation Models (DEMs). DEM is a terminology adopted by the USGS to describe terrain elevation data sets in a digital raster form.

  10. SHADED RELIEF, HILLSHADE, DIGITAL ELEVATION MODEL (DEM), ARIZONA

    EPA Science Inventory

    Shaded relief of the state of Arizona developed from 1-degree US Geological Survey (USGS) Digital Elevation Models (DEMs). DEM is a terminology adopted by the USGS to describe terrain elevation data sets in a digital raster form.

  11. Image Inpainting Methods Evaluation and Improvement

    PubMed Central

Vreja, Raluca; Brad, Remus

    2014-01-01

With the growth of digital image processing and film archiving, the need for assisted or unsupervised restoration has required the development of a series of methods and techniques. Among them, image inpainting is perhaps the most impressive and useful. Beyond techniques based on partial differential equations or texture synthesis, many hybrid techniques have been proposed recently. The need for an analytical comparison, besides the visual one, urged us to perform the studies shown in the present paper. Starting with an overview of the domain, an evaluation of five methods was performed using a common benchmark and measuring the PSNR. Conclusions regarding the performance of the investigated algorithms are presented, categorizing them according to the restored image structure. Based on these experiments, we propose an adaptation of Oliveira's and Hadhoud's algorithms, which perform well on images with natural defects. PMID:25136700

  13. 3D DEM analyses of the 1963 Vajont rock slide

    NASA Astrophysics Data System (ADS)

    Boon, Chia Weng; Houlsby, Guy; Utili, Stefano

    2013-04-01

The 1963 Vajont rock slide has been modelled using the distinct element method (DEM). The open-source DEM code, YADE (Kozicki & Donzé, 2008), was used together with the contact detection algorithm proposed by Boon et al. (2012). The critical sliding friction angle at the slide surface was sought using a strength reduction approach. A shear-softening contact model was used to model the shear resistance of the clayey layer at the slide surface. The results suggest that the critical sliding friction angle can be conservative if stability analyses are calculated based on the peak friction angles. The water table was assumed to be horizontal and the pore pressure at the clay layer was assumed to be hydrostatic. The influence of reservoir filling was marginal, increasing the sliding friction angle by only 1.6°. The results of the DEM calculations were found to be sensitive to the orientations of the bedding planes and cross-joints. Finally, the failure mechanism was investigated and arching was found to be present at the bend of the chair-shaped slope. References Boon C.W., Houlsby G.T., Utili S. (2012). A new algorithm for contact detection between convex polygonal and polyhedral particles in the discrete element method. Computers and Geotechnics, vol 44, 73-82, doi.org/10.1016/j.compgeo.2012.03.012. Kozicki, J., & Donzé, F. V. (2008). A new open-source software developed for numerical simulations using discrete modeling methods. Computer Methods in Applied Mechanics and Engineering, 197(49-50), 4429-4443.

  14. Economic methods for multipollutant analysis and evaluation

    SciTech Connect

    Baasel, W.D.

    1985-01-01

    Since 1572, when miners' lung problems were first linked to dust, man's industrial activity has been increasingly accused of causing disease in man and harm to the environment. Since that time, each compound or stream thought to be damaging has been looked at independently. If a gas stream caused the problem, the offending compounds were reduced to an acceptable level and the problem was considered solved. What happened to substances after they were removed usually was not fully considered until the finding of an adverse effect required it. Until 1970, one usual way of getting rid of many toxic wastes was to place them in landfills and forget about them. The discovery of sickness caused by substances escaping from the Love Canal landfill forced a total rethinking of that procedure. This and other incidents clearly showed that taking a substance out of one stream discharged to the environment and placing it in another may not be an adequate solution. What must be done is to look at all streams leaving an industrial plant and devise a way to reduce the potentially harmful emissions in those streams to an acceptable level, using methods that are inexpensive. To illustrate conceptually how the environmental assessment approach is a vast improvement over current methods, an example evaluating effluents from a coal-fired 500 MW power plant is presented. Initially only one substance in one stream is evaluated: the sulfur oxides leaving in the flue gas.

  15. Sparse Representation and Multiscale Methods - Application to Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Stefanescu, R. E. R.; Patra, A. K.; Bursik, M. I.

    2014-12-01

    In general, a Digital Elevation Model (DEM) is produced either by digitizing existing maps and interpolating elevation values from the contours, or by collecting elevation information from stereo imagery on digital photogrammetric workstations. Both methods produce a DEM to the required specification, but each contains a variety of possible production scenarios, and each results in DEM cells with a very different character. Common artifacts found in DEMs are missing values at various locations, which can influence the output of any application that uses the DEM. In this work we introduce a numerically stable multiscale scheme to evaluate the DEM's quantity of interest (elevation, slope, etc.) at missing-value locations. This method is very efficient when dealing with large high-resolution DEMs that cover large areas, resulting in O(10^6)-O(10^10) data points. Our scheme relies on graph-based algorithms and low-rank approximations of the entire adjacency matrix of the DEM's graph. For data sets as large as DEMs, the Laplacian or kernel matrix resulting from the interaction of the data points is extremely large, so one needs to identify a subspace that captures most of the action of the kernel matrix. By applying a randomized projection to the graph affinity matrix, a well-conditioned basis is identified for its numerical range. This basis is later used for out-of-sample extension at missing-value locations. In many cases, this method beats its classical competitors in terms of accuracy, speed, and robustness.
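
The randomized projection step can be sketched as a standard randomized range finder (a minimal illustration under our own naming, not the authors' implementation):

```python
import numpy as np

def randomized_basis(K, k, oversample=10, seed=0):
    """Approximate an orthonormal basis for the numerical range of a
    large kernel/affinity matrix K via randomized projection."""
    rng = np.random.default_rng(seed)
    n = K.shape[1]
    # Random Gaussian test matrix: k + oversample probe vectors.
    Omega = rng.standard_normal((n, k + oversample))
    # Sample the range of K, then orthonormalize into a
    # well-conditioned basis.
    Y = K @ Omega
    Q, _ = np.linalg.qr(Y)
    return Q

# Toy example: a low-rank symmetric matrix standing in for a DEM
# graph kernel (hypothetical data, not a real DEM).
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
K = A @ A.T                      # rank-5 symmetric kernel
Q = randomized_basis(K, k=5)
# Relative error of projecting K onto the captured subspace.
err = np.linalg.norm(K - Q @ (Q.T @ K)) / np.linalg.norm(K)
```

Because the toy kernel has exact rank 5 and the basis uses 15 probe vectors, the projection error is at machine-precision level; real DEM kernels decay more gradually, and the oversampling parameter trades accuracy against cost.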

  16. ALOS DEM quality assessment in a rugged topography, A Lebanese watershed as a case study

    NASA Astrophysics Data System (ADS)

    Abdallah, Chadi; El Hage, Mohamad; Termos, Samah; Abboud, Mohammad

    2014-05-01

    Deriving the morphometric descriptors of the Earth's surface from satellite images is a continuing application in remote sensing, one that has been distinctly pushed by the increasing availability of DEMs at different scales, specifically those derived from high to very high-resolution stereoscopic and triscopic image data. The extraction of the morphometric descriptors is affected by the errors of the DEM. This study presents a procedure for assessing the quality of the ALOS DEM in terms of position and morphometric indices. It involves evaluating the impact of the production parameters on the altimetric accuracy by checking height differences between Ground Control Points (GCPs) and the corresponding DEM points, on the planimetric accuracy by comparing extracted drainage lines with topographic maps, and on the morphometric indices by comparing profiles extracted from the DEM with those measured in the field. A set of twenty triplet-stereo images from the PRISM instrument on the ALOS satellite has been processed to acquire a 5 m DEM covering the whole Lebanese territory. The Lebanese topography is characterized by its ruggedness, with two parallel mountainous chains embedding a depression (the Bekaa Valley). The DEM was extracted via PCI Geomatica 2013. Each of the images required 15 GCPs and around 50 tie points. Field measurements were carried out using differential GPS (Trimble GeoXH6000, ProXRT receiver and the LaserACE 1000 Rangefinder) on the Al Awali watershed (482 km2, about 5% of the Lebanese terrain). 3545 GPS points were collected over all elevation ranges, reflecting the diversity of the Lebanese terrain, from cliffy through steep to gently undulating, along with narrow and wide flood plains, and including predetermined profiles. Moreover, well-defined points such as road intersections and river beds were also measured in order to assess the streams extracted from the DEM. ArcGIS 10.1 was also utilized to extract the drainage network.
Preliminary results

  17. Quality assessment of TanDEM-X DEMs using airborne LiDAR, photogrammetry and ICESat elevation data

    NASA Astrophysics Data System (ADS)

    Rao, Y. S.; Deo, R.; Nalini, J.; Pillai, A. M.; Muralikrishnan, S.; Dadhwal, V. K.

    2014-11-01

    The TanDEM-X mission has been acquiring InSAR data to produce a high-resolution global DEM with greater vertical accuracy since 2010. In this study, TanDEM-X CoSSC data were processed to produce DEMs at 6 m spatial resolution for two test areas of India. The generated DEMs were compared with DEMs available from airborne LiDAR, photogrammetry, SRTM and ICESat elevation point data. The first test site is in Bihar state of India, with almost flat terrain and sparse vegetation cover; the second is around the Godavari river in Andhra Pradesh (A.P.) state of India, with flat to moderately hilly terrain. The quality of the DEMs in these two test sites has been specified in terms of the most widely used accuracy measures, viz. mean, standard deviation, skew and RMSE. The TanDEM-X DEM over the Bihar test area gives an RMSE of 5.0 m taking airborne LiDAR data as reference. With ICESat elevation data available at 9000 point locations, an RMSE of 5.9 m is obtained. Similarly, the TanDEM-X DEM for the Godavari area was compared with a high-resolution aerial photogrammetric DEM and the SRTM DEM, yielding RMSEs of 5.3 m and 7.5 m respectively. When compared with ICESat elevation data at several point locations, and at the same point locations of the photogrammetric DEM and SRTM, the RMS errors are 4.1 m, 3.5 m and 4.3 m respectively. DEMs were also compared over an open-pit coal mining area where elevation ranges from -147 m to 189 m. X- and Y-profiles of all DEMs were also compared to examine their trends and differences.
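
The reported accuracy measures can be computed from DEM-minus-reference elevation differences as follows (a generic sketch, not the authors' code; the sample values are hypothetical):

```python
import numpy as np

def dem_accuracy(dem_elev, ref_elev):
    """Summary accuracy measures of DEM heights against reference
    elevations (e.g. LiDAR, photogrammetry or ICESat points):
    mean error (bias), standard deviation, skew and RMSE."""
    d = np.asarray(dem_elev, float) - np.asarray(ref_elev, float)
    mean = d.mean()
    std = d.std(ddof=1)                       # sample standard deviation
    skew = np.mean((d - mean) ** 3) / d.std() ** 3
    rmse = np.sqrt(np.mean(d ** 2))
    return {"mean": mean, "std": std, "skew": skew, "rmse": rmse}

# Hypothetical check: four DEM heights vs. reference elevations (metres).
m = dem_accuracy([101.2, 98.7, 103.4, 99.9],
                 [100.0, 100.0, 100.0, 100.0])
```

Note that RMSE mixes bias and dispersion (RMSE² = mean² + population variance), which is why the abstract reports the components separately.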

  18. Integration of 2-D hydraulic model and high-resolution lidar-derived DEM for floodplain flow modeling

    NASA Astrophysics Data System (ADS)

    Shen, D.; Wang, J.; Cheng, X.; Rui, Y.; Ye, S.

    2015-08-01

    The rapid progress of lidar technology has made the acquisition and application of high-resolution digital elevation model (DEM) data increasingly popular, especially in the study of floodplain flow. However, high-resolution DEM data pose several disadvantages for floodplain modeling studies; e.g., the data sets contain many redundant interpolation points, working with the data requires large numbers of calculations, and the data do not match the size of the computational mesh. Two-dimensional (2-D) hydraulic modeling, a popular method for analyzing floodplain flow, offers highly precise elevation parameterization for the computational mesh while ignoring much of the micro-topographic information in the DEM data itself. We offer a flood simulation method that integrates 2-D hydraulic model results and high-resolution DEM data, enabling the calculation of flood water levels in DEM grid cells through local inverse distance-weighted interpolation. To eliminate false inundation areas arising during interpolation, the method employs run-length encoding to mark the inundated DEM grid cells and determines the real inundation areas through run-length boundary tracing, which solves the complicated problem of connectivity between DEM grid cells. We constructed a 2-D hydraulic model for the Gongshuangcha detention basin, a flood storage area of Dongting Lake in China, and used our integrated method to simulate the floodplain flow. The results demonstrate that this method solves the DEM-associated problems efficiently and simulates flooding processes with greater accuracy than simulations using only the DEM.
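
The local inverse distance-weighted step can be sketched for a single grid cell as follows (a minimal illustration with hypothetical mesh nodes and levels, not the authors' code):

```python
import numpy as np

def idw_water_level(cell_xy, node_xy, node_level, power=2, eps=1e-12):
    """Assign a flood water level to one DEM grid cell by inverse
    distance-weighted interpolation from nearby 2-D mesh nodes."""
    d = np.linalg.norm(node_xy - cell_xy, axis=1)
    w = 1.0 / (d ** power + eps)          # closer nodes weigh more
    return float(np.sum(w * node_level) / np.sum(w))

# Hypothetical mesh nodes (x, y) with simulated water levels (m).
nodes = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
levels = np.array([5.0, 6.0, 7.0])
wl = idw_water_level(np.array([1.0, 1.0]), nodes, levels)

# A cell counts as inundated only if the interpolated level exceeds
# its ground elevation; false inundation patches would then be pruned
# by the run-length boundary tracing described in the abstract.
inundated = wl > 4.2   # hypothetical cell ground elevation of 4.2 m
```

The interpolated level is pulled toward the nearest node, so a cell close to the 5.0 m node receives a level slightly above 5 m.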

  19. CFD-DEM simulations of current-induced dune formation and morphological evolution

    NASA Astrophysics Data System (ADS)

    Sun, Rui; Xiao, Heng

    2016-06-01

    Understanding the fundamental mechanisms of sediment transport, particularly those during the formation and evolution of bedforms, is of critical scientific importance and has engineering relevance. Traditional approaches to sediment transport simulation rely heavily on empirical models, which are not able to capture the physics-rich, regime-dependent behaviors of the process. With the increase in available computational resources in the past decade, CFD-DEM (computational fluid dynamics-discrete element method) has emerged as a viable high-fidelity method for the study of sediment transport. However, a comprehensive, quantitative study of the generation and migration of different sediment bed patterns using CFD-DEM is still lacking. In this work, current-induced sediment transport problems in a wide range of regimes are simulated, including 'flat bed in motion', 'small dune', 'vortex dune' and suspended transport. Simulations are performed using SediFoam, an open-source, massively parallel CFD-DEM solver developed by the authors. This is a general-purpose solver for particle-laden flows tailored to particle transport problems. Validation tests are performed to demonstrate the capability of CFD-DEM across the full range of sediment transport regimes. Comparison of simulation results with experimental and numerical benchmark data demonstrates the merits of the CFD-DEM approach. In addition, the improvements of the present simulations over existing CFD-DEM studies are presented. The present solver gives more accurate predictions of sediment transport rate by properly accounting for the influence of particle volume fraction on the fluid flow. In summary, this work demonstrates that CFD-DEM is a promising particle-resolving approach for probing the physics of current-induced sediment transport.

  20. A Method for Missile Autopilot Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Eguchi, Hirofumi

    The essential benefit of HardWare-In-the-Loop (HWIL) simulation is that the performance of the autopilot system is evaluated realistically, without modeling error, by using actual hardware such as seeker systems, autopilot systems and servo equipment. HWIL simulation, however, requires very expensive facilities, in which the target model generator is an indispensable subsystem. In this paper, an example of an HWIL simulation facility with a target model generator for RF seeker systems is introduced first. This generator, like most others, has a functional limitation on the line-of-sight angle; a test method to overcome this limitation is therefore proposed.

  1. [Evaluation of autonomic dysfunction by novel methods].

    PubMed

    Ando, Yukio; Obayashi, Konen

    2004-07-01

    The autonomic nervous system innervates every organ in the body. Since autonomic disturbances affect patient survival, an understanding and recognition of these disturbances are important. We adopted several new methods to evaluate autonomic function accurately. 123I-metaiodobenzylguanidine scintigraphy can assess cardiac autonomic function even in the presence of cardiac arrhythmia. Laser-Doppler flowmetry, ultrasonographic study of the vessels and near-infrared spectroscopy serve as useful markers for screening vasomotor dysfunction and impaired blood circulation. Electrogastrography and the circadian rhythm of protein C secretion could be markers of the visceromotor nerves in the abdomen; electrogastrography is a particularly useful tool for reflecting functional changes in gastrointestinal motility. The evaluation of anemia could be a marker of autonomic dysfunction in the kidney and bone marrow in patients with familial amyloidotic polyneuropathy, pandysautonomia, and multiple system atrophy: normocytic, normochromic anemia correlated with the severity of autonomic dysfunction was shown in these patients. We also evaluated dysfunction of the neuroendocrine system and sudomotor neurons using our new autonomic function tests. The glucose-tolerance test could become one of the most useful clinical tools for detecting autonomic dysfunction in the endocrine system. Microhydrography and thermography could be useful tools for diagnosing the lesion site of dyshidrosis. Moreover, it is clinically important to check the systemic circulation and autonomic function in patients treated with sildenafil citrate or undergoing organ transplantation. Our new autonomic function tests, such as laser-Doppler flowmetry and 123I-metaiodobenzylguanidine scintigraphy, are crucial tools for supplying the best symptomatic treatment for such patients. PMID:15344558

  2. Spaceborne radar interferometry for coastal DEM construction

    USGS Publications Warehouse

    Hong, S.-H.; Lee, C.-W.; Won, J.-S.; Kwoun, Oh-Ig; Lu, Zhiming

    2005-01-01

    Topographic features in coastal regions, including tidal flats, change more significantly than landmasses and are characterized by extremely low slopes. High-precision DEMs are required to monitor dynamic changes in coastal topography. It is difficult to obtain coherent interferometric SAR pairs, especially over tidal flats, mainly because of variation in tidal conditions. Here we focus on i) the coherence of multi-pass ERS SAR interferometric pairs and ii) DEM construction from ERS-ENVISAT pairs. The coherence of multi-pass ERS interferograms was good enough to construct DEMs under favorable tidal conditions. Coherence in sand-dominant areas was generally higher than that on muddy surfaces; coarse-grained coastal areas are favorable for multi-pass interferometry. Utilization of ERS-ENVISAT interferometric pairs is attracting growing interest. We carried out an investigation using a cross-interferometric pair with a normal baseline of about 1.3 km, a 30-minute temporal separation and a height sensitivity of about 6 meters. Preliminary results of ERS-ENVISAT interferometry were not successful due to baseline and unfavorable scattering conditions. © 2005 IEEE.

  3. Multilevel summation method for electrostatic force evaluation.

    PubMed

    Hardy, David J; Wu, Zhe; Phillips, James C; Stone, John E; Skeel, Robert D; Schulten, Klaus

    2015-02-10

    The multilevel summation method (MSM) offers an efficient algorithm utilizing convolution for evaluating long-range forces arising in molecular dynamics simulations. Shifting the balance of computation and communication, MSM provides key advantages over the ubiquitous particle–mesh Ewald (PME) method, offering better scaling on parallel computers and permitting more modeling flexibility, with support for periodic systems as does PME but also for semiperiodic and nonperiodic systems. The version of MSM available in the simulation program NAMD is described, and its performance and accuracy are compared with the PME method. The accuracy feasible for MSM in practical applications reproduces PME results for water property calculations of density, diffusion constant, dielectric constant, surface tension, radial distribution function, and distance-dependent Kirkwood factor, even though the numerical accuracy of PME is higher than that of MSM. Excellent agreement between MSM and PME is found also for interface potentials of air–water and membrane–water interfaces, where long-range Coulombic interactions are crucial. Applications demonstrate also the suitability of MSM for systems with semiperiodic and nonperiodic boundaries. For this purpose, simulations have been performed with periodic boundaries along directions parallel to a membrane surface but not along the surface normal, yielding membrane pore formation induced by an imbalance of charge across the membrane. Using a similar semiperiodic boundary condition, ion conduction through a graphene nanopore driven by an ion gradient has been simulated. Furthermore, proteins have been simulated inside a single spherical water droplet. Finally, parallel scalability results show the ability of MSM to outperform PME when scaling a system of modest size (less than 100 K atoms) to over a thousand processors, demonstrating the suitability of MSM for large-scale parallel simulation. PMID:25691833

  4. Digital image envelope: method and evaluation

    NASA Astrophysics Data System (ADS)

    Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang

    2003-05-01

    Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines (such as the DICOM standard) addressing data security continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
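
The integrity check at the core of the envelope can be sketched with a plain hash digest standing in for the image signature (a simplification: the actual scheme also encrypts the DICOM header and embeds the envelope in the pixel data):

```python
import hashlib

def image_signature(pixel_bytes: bytes) -> str:
    """SHA-256 digest standing in for the image signature placed in
    the digital envelope."""
    return hashlib.sha256(pixel_bytes).hexdigest()

# Signature before transmission vs. after: identical digests mean
# the image was not altered in transit (pixel data is hypothetical).
original = b"\x00\x01\x02\x03"
received_ok = b"\x00\x01\x02\x03"
received_bad = b"\x00\x01\x02\x04"     # one byte flipped in transit
intact = image_signature(original) == image_signature(received_ok)
tampered = image_signature(original) != image_signature(received_bad)
```

Any single-bit alteration of the pixel data changes the digest, which is what allows the receiving site to detect tampering by comparing the two signatures.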

  6. Evaluation of methods to assess physical activity

    NASA Astrophysics Data System (ADS)

    Leenders, Nicole Y. J. M.

    Epidemiological evidence has accumulated demonstrating that the amount of physical activity-related energy expenditure during a week reduces the incidence of cardiovascular disease, diabetes, obesity, and all-cause mortality. To further understand the amount of daily physical activity and related energy expenditure necessary to maintain or improve functional health status and quality of life, instruments that estimate total (TDEE) and physical activity-related energy expenditure (PAEE) under free-living conditions should be determined to be valid and reliable. Without evaluation of the various methods that estimate TDEE and PAEE against the doubly labeled water (DLW) method in females, there will be significant limitations on assessing the efficacy of physical activity interventions on health status in this population. A triaxial accelerometer (Tritrac-R3D (TT)), a uniaxial activity monitor (Computer Science and Applications Inc. (CSA)), a Yamax Digiwalker-500 (YX-stepcounter), heart rate responses (HR method) and a 7-d Physical Activity Recall questionnaire (7-d PAR) were compared with the criterion method of DLW during a 7-d period in female adults. DLW-TDEE was underestimated on average by 9, 11 and 15% using the 7-d PAR, HR method and TT, respectively. The underestimation of DLW-PAEE by the 7-d PAR was 21%, compared to 47% and 67% for the TT and YX-stepcounter. Approximately 56% of the variance in DLW-PAEE per kilogram is explained by the registration of body movement with accelerometry. A larger proportion of the variance in DLW-PAEE per kilogram was explained by jointly incorporating information from the vertical and horizontal movement measured with the CSA and Tritrac-R3D (r2 = 0.87). Although only a small amount of variance in DLW-PAEE per kilogram is explained by the number of steps taken per day, because of its low cost and ease of use the Yamax stepcounter is useful in studies promoting daily walking. Thus, studies involving the

  7. On the Standardization of Vertical Accuracy Figures in Dems

    NASA Astrophysics Data System (ADS)

    Casella, V.; Padova, B.

    2013-01-01

    Digital Elevation Models (DEMs) play a key role in hydrological risk prevention and mitigation: hydraulic numerical simulations, slope maps and aspect maps all rely heavily on DEMs. Hydraulic numerical simulations require the DEM used to have a defined accuracy, in order to obtain reliable results. But are DEM accuracy figures clearly and uniquely defined? The paper focuses on some issues concerning DEM accuracy definition and assessment. Two DEM accuracy definitions can be found in the literature: accuracy at the interpolated point and accuracy at the nodes. The former can be estimated by means of randomly distributed check points, while the latter by means of check points coincident with the nodes. The two accuracy figures are often treated as equivalent, but they are not: given the same DEM, assessing it through one or the other approach gives different results. Our paper performs an in-depth characterization of the two figures and proposes standardization coefficients.
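
The difference between the two figures can be demonstrated on a synthetic DEM: checking at the nodes measures the raw grid error, while checking at randomly placed points measures the error after bilinear interpolation (a toy experiment of ours, not the paper's procedure):

```python
import numpy as np

def bilinear(dem, x, y):
    """Bilinear interpolation of a DEM grid (unit cell spacing) at (x, y)."""
    i, j = int(np.floor(y)), int(np.floor(x))
    fy, fx = y - i, x - j
    return (dem[i, j] * (1 - fx) * (1 - fy) + dem[i, j + 1] * fx * (1 - fy)
            + dem[i + 1, j] * (1 - fx) * fy + dem[i + 1, j + 1] * fx * fy)

rng = np.random.default_rng(0)
true_surface = lambda x, y: 0.01 * x * y                  # toy terrain
xx, yy = np.meshgrid(np.arange(50), np.arange(50))
dem = true_surface(xx, yy) + rng.normal(0, 0.1, xx.shape)  # noisy DEM

# Accuracy at the nodes: check points coincide with grid nodes.
rmse_nodes = np.sqrt(np.mean((dem - true_surface(xx, yy)) ** 2))

# Accuracy at the interpolated point: randomly placed check points.
px = rng.uniform(0, 48.999, 500)
py = rng.uniform(0, 48.999, 500)
z_interp = np.array([bilinear(dem, x, y) for x, y in zip(px, py)])
rmse_interp = np.sqrt(np.mean((z_interp - true_surface(px, py)) ** 2))
```

For white node noise the interpolation averages neighboring errors, so the interpolated-point RMSE comes out systematically below the node RMSE, illustrating why the two figures are not interchangeable.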

  8. Volcanic Landform Classification of Iwate Volcano from DEM-Derived Thematic Maps

    NASA Astrophysics Data System (ADS)

    Prima, A. O.; Yoshida, T.

    2004-12-01

    Over the last three decades, digital elevation models (DEMs) have been developed as surface data, instead of contour lines, to allow numerical analysis or modeling of terrain by computer. DEMs have allowed the development of algorithms to rapidly derive the slope, relief, convexity, concavity and aspect of any point of a surface, and have also allowed the definition of a number of new morphometric measures, e.g. openness (Yokoyama et al., 2002). Openness is an angular measure of the relation between surface relief and horizontal distance. Openness has two viewer perspectives. Positive values, expressing openness above the surface, are high for convex forms, whereas negative values describe this attribute below the surface and are high for concave forms. The emphasis of terrain convexity and concavity in openness maps facilitates the interpretation of landforms on the Earth's surface. Prima et al. (2003) proposed automated landform classification using openness and slope with genetic factors. This method has been shown to produce good classifications for constructional (alluvial plains, alluvial fans and volcanoes) and erosional (hills and mountains) landforms. The capability of this method to classify landforms from DEMs with genetic factors is important because it allows landform evolution to be numerically analyzed. In this study, we adopted this method to classify the volcanic landforms of Iwate Volcano in Honshu, Japan, where volcanic landforms were categorized with reference to the geological map of Iwate Volcano (Doi, 2000). This process took three steps. First, the characteristics of each category were evaluated against the mean and standard deviation of slope, and of both positive and negative openness, in two-dimensional feature spaces. Second, the characteristics of each category were observed and the combinations of mean and standard deviation of slope and openness showing high separabilities were selected. We found that the standard deviation of slope, positive and negative
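
One ingredient of such a feature space, a slope map and its per-unit statistics, can be sketched on a toy grid (our illustration, assuming unit cell size; the openness computation of Yokoyama et al. scans zenith and nadir angles along multiple azimuths and is omitted here):

```python
import numpy as np

def slope_degrees(dem, cellsize=1.0):
    """Slope map (degrees) from a DEM grid via finite differences."""
    dzdy, dzdx = np.gradient(dem, cellsize)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Toy DEM: an inclined plane rising 1 m per cell in x.
xx, yy = np.meshgrid(np.arange(20, dtype=float),
                     np.arange(20, dtype=float))
dem = xx.copy()
s = slope_degrees(dem)

# Per-landform-unit statistics of maps like this (together with
# positive/negative openness) form the 2-D feature spaces used
# for the classification.
mean_slope, sd_slope = s.mean(), s.std()
```

On the inclined plane the slope is uniformly 45 degrees, so the standard deviation is zero; real landform units separate precisely because these statistics differ between categories.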

  9. Quality assessment of Digital Elevation Model (DEM) in view of the Altiplano hydrological modeling

    NASA Astrophysics Data System (ADS)

    Satgé, F.; Arsen, A.; Bonnet, M.; Timouk, F.; Calmant, S.; Pilco, R.; Molina, J.; Lavado, W.; Crétaux, J.; HASM

    2013-05-01

    Topography is a crucial data input for hydrological modeling, but in many regions of the world the only way to characterize topography is to use satellite-based Digital Elevation Models (DEMs). In some regions the quality of these DEMs remains poor and induces modeling errors that may or may not be compensated by model parameter tuning. In such regions, the evaluation of these data uncertainties is an important step in the modeling procedure. In this study, which focuses on the Altiplano region, we present the evaluation of the two freely available DEMs. The Shuttle Radar Topography Mission (SRTM) DEM, a product of the National Aeronautics and Space Administration (NASA), and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Map (ASTER GDEM), provided by the Ministry of Economy, Trade and Industry of Japan (METI) in collaboration with NASA, are both widely used. While the former has a resolution of 3 arc-seconds (90 m), the latter is 1 arc-second (30 m). In order to select the most reliable DEM, we compared the DEM elevations with high-quality control point elevations. Because of its large spatial coverage (tracks spaced 30 km apart, with a measurement every 172 m) and its high vertical accuracy, better than 15 cm in good weather conditions, the Geoscience Laser Altimeter System (GLAS) on board NASA's Ice, Cloud and land Elevation Satellite (ICESat) represents the best solution for establishing a high-quality elevation database. After a quality check, more than 150 000 ICESat/GLAS measurements are suitable in terms of accuracy for the Altiplano watershed. This database has been used to evaluate the vertical accuracy of each DEM. Given the full spatial coverage, the comparison has been carried out for all land cover types, altitude ranges and mean slopes.

  10. Volcanic geomorphology using TanDEM-X

    NASA Astrophysics Data System (ADS)

    Poland, Michael; Kubanek, Julia

    2016-04-01

    Topography is perhaps the most fundamental dataset for any volcano, yet it is surprisingly difficult to collect, especially during the course of an eruption. For example, photogrammetry and lidar are time-intensive and often expensive, and they cannot be employed when the surface is obscured by clouds. Ground-based surveys can operate in poor weather but have poor spatial resolution and may expose personnel to hazardous conditions. Repeat passes of synthetic aperture radar (SAR) data provide excellent spatial resolution, but topography in areas of surface change (from vegetation swaying in the wind to physical changes in the landscape) between radar passes cannot be imaged. The German Space Agency's TanDEM-X satellite system, however, solves this issue by simultaneously acquiring SAR data of the surface using a pair of orbiting satellites, thereby removing temporal change as a complicating factor in SAR-based topographic mapping. TanDEM-X measurements have demonstrated exceptional value in mapping the topography of volcanic environments in as-yet limited applications. The data provide excellent resolution (down to ~3-m pixel size) and are useful for updating topographic data at volcanoes where surface change has occurred since the most recent topographic dataset was collected. Such data can be used for applications ranging from correcting radar interferograms for topography to modeling flow pathways in support of hazard mitigation. The most valuable contributions, however, relate to calculating volume changes related to eruptive activity. For example, limited datasets have provided critical measurements of lava dome growth and collapse at volcanoes including Merapi (Indonesia), Colima (Mexico), and Soufriere Hills (Montserrat), and of basaltic lava flow emplacement at Tolbachik (Kamchatka), Etna (Italy), and Kīlauea (Hawai`i). With topographic data spanning an eruption, it is possible to calculate eruption rates, information that might not otherwise be available.

  11. The HELI-DEM model estimation

    NASA Astrophysics Data System (ADS)

    Biagi, L.; Caldera, S.; Carcano, L.; Lucchese, A.; Negretti, M.; Sansò, F.; Triglione, D.; Visconti, M. G.

    2014-04-01

    Global DEMs are fundamental for global applications and are also necessary at the local scale in regions where local models are not available. Local DEMs are preferred where available, if they are characterized by better accuracies and resolutions. In general, two problems arise. First, a region of interest may be patched by several partly overlapping DEMs with similar accuracies and spatial resolutions: these should be merged into a unified model. Moreover, even when the region of interest is covered by one unified DEM, local DEMs with better accuracy may be available and should be used to improve it locally. All these problems have been addressed within the HELI-DEM project. HELI-DEM (HELvetia-Italy Digital Elevation Model) is a project funded by the European Regional Development Fund (ERDF) within the Italy-Switzerland cooperation program. It started in 2010 and finished at the end of 2013. The institutions involved in the project were Fondazione Politecnico di Milano, Politecnico di Milano, Politecnico di Torino, Regione Lombardia, Regione Piemonte and Scuola Universitaria della Svizzera Italiana. One specific aim of the project was the creation and publication of a unified digital elevation model for the part of the Alps between Italy and Switzerland. The area of interest is predominantly mountainous, with heights ranging from about 200 m to 4600 m. Three low-resolution DTMs (20, 25 and 50 m resolution) are available that partly overlap and patch the whole project area; they are characterized by accuracies of some metres. High-resolution DTMs (1-5 m) are also available: they have accuracies of some decimetres but cover limited parts of the project area. The various models are available in different reference frames (the European ETRF89 and the Italian Roma40) and are gridded in either cartographic or geographic coordinates. Before merging them, a validation of the input data was performed in three steps: cross validation of LR DTMs
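
The merging problem can be illustrated with a minimal inverse-variance weighting sketch (our illustration, not the project's actual fusion procedure; the grids are assumed already co-registered in a common reference frame, and NaN marks cells missing from one model):

```python
import numpy as np

def merge_dems(z1, z2, sigma1, sigma2):
    """Merge two co-registered, overlapping DEM grids, weighting each
    cell by the inverse variance of its model's stated accuracy."""
    w1 = np.where(np.isnan(z1), 0.0, 1.0 / sigma1 ** 2)
    w2 = np.where(np.isnan(z2), 0.0, 1.0 / sigma2 ** 2)
    zs = np.nan_to_num(z1) * w1 + np.nan_to_num(z2) * w2
    return zs / (w1 + w2)

# Toy overlap: a 3 m accuracy LR model and a 0.3 m accuracy HR model
# (hypothetical heights in metres).
lr = np.array([[500.0, 510.0], [520.0, np.nan]])
hr = np.array([[501.0, np.nan], [521.0, 530.0]])
merged = merge_dems(lr, hr, sigma1=3.0, sigma2=0.3)
```

Where both models cover a cell, the merged height sits close to the more accurate model; where only one model exists, its value passes through unchanged, which is the behavior needed when an HR patch locally improves a unified LR model.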

  12. International genomic evaluation methods for dairy cattle

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Background Genomic evaluations are rapidly replacing traditional evaluation systems used for dairy cattle selection. Economies of scale in genomics promote cooperation across country borders. Genomic information can be transferred across countries using simple conversion equations, by modifying mult...

  13. A Near-Global Bare-Earth DEM from SRTM

    NASA Astrophysics Data System (ADS)

    Gallant, J. C.; Read, A. M.

    2016-06-01

    The near-global elevation product from NASA's Shuttle Radar Topographic Mission (SRTM) has been widely used since its release in 2005 at 3 arcsecond resolution and the release of the 1 arcsecond version in late 2014 means that the full potential of the SRTM DEM can now be realised. However the routine use of SRTM for analytical purposes such as catchment hydrology, flood inundation, habitat mapping and soil mapping is still seriously impeded by the presence of artefacts in the data, primarily the offsets due to tree cover and the random noise. This paper describes the algorithms being developed to remove those offsets, based on the methods developed to produce the Australian national elevation model from SRTM data. The offsets due to trees are estimated using the GlobeLand30 (National Geomatics Center of China) and Global Forest Change (University of Maryland) products derived from Landsat, along with the ALOS PALSAR radar image data (JAXA) and the global forest canopy height map (NASA). The offsets are estimated using several processes and combined to produce a single continuous tree offset layer that is subtracted from the SRTM data. The DEM products will be made freely available on completion of the first draft product, and the assessment of that product is expected to drive further improvements to the methods.
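    The tree-offset removal step described above amounts to a per-pixel raster correction. The sketch below is illustrative only: the array names (`srtm`, `forest_mask`, `canopy_height`) and the `penetration` fraction are assumptions for demonstration, not values or code from the paper.

```python
import numpy as np

def remove_tree_offsets(srtm, forest_mask, canopy_height, penetration=0.5):
    """Subtract an estimated vegetation offset from a radar DSM.

    `penetration` is an assumed fraction of canopy height that the radar
    signal already penetrates (an illustrative parameter, not a value
    from the paper). Offsets are applied only inside forested pixels.
    """
    offset = np.where(forest_mask, (1.0 - penetration) * canopy_height, 0.0)
    return srtm - offset

# Toy 2x2 example: only the second column is forested.
srtm = np.array([[110.0, 130.0], [112.0, 131.0]])
forest = np.array([[False, True], [False, True]])
canopy = np.array([[0.0, 30.0], [0.0, 28.0]])
bare = remove_tree_offsets(srtm, forest, canopy)
```

    In practice the offset layer would be blended from several estimates (as the abstract describes) before subtraction; this sketch shows only the final subtraction step.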

  14. Integration of SAR and DEM data: Geometrical considerations

    NASA Technical Reports Server (NTRS)

    Kropatsch, Walter G.

    1991-01-01

    General principles for integrating data from different sources are derived from the experience of registering SAR images with digital elevation model (DEM) data. The integration consists of establishing geometrical relations between the data sets that allow us to accumulate information from both data sets for any given object point (e.g., elevation, slope, backscatter of ground cover, etc.). Since the geometries of the two data sets are completely different, they cannot be compared on a pixel-by-pixel basis. The presented approach detects instances of higher-level features in both data sets independently and performs the matching at the high level. Besides being efficient, this general strategy also allows the integration of additional knowledge sources: world knowledge and sensor characteristics are also useful sources of information. The SAR features layover and shadow can be detected easily in SAR images. An analytical method to find such regions in a DEM as well additionally needs the parameters of the flight path of the SAR sensor and the range projection model. The generation of the SAR layover and shadow maps is summarized and new extensions to this method are proposed.
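    The shadow-map idea can be illustrated in one dimension. This is a sketch under simplifying assumptions (flat-Earth geometry, constant incidence angle measured from vertical, a single range line swept from near to far range), not the authors' method: a cell is shadowed when it lies below the line-of-sight ray grazing the highest terrain seen so far.

```python
import math

def radar_shadow_1d(heights, dx, incidence_deg):
    """Flag radar-shadowed cells along one range line (near to far).

    The grazing ray drops by dx / tan(incidence) per cell for an
    incidence angle measured from vertical; a cell below the ray is
    in shadow, otherwise it becomes the new ray origin.
    """
    drop = dx / math.tan(math.radians(incidence_deg))
    shadowed = [False]          # the nearest cell is never shadowed
    ray = heights[0]
    for h in heights[1:]:
        ray -= drop             # ray height above the current cell
        if h < ray:
            shadowed.append(True)
        else:
            shadowed.append(False)
            ray = h             # terrain re-emerges; restart the ray
    return shadowed
```

    A peak followed by low ground produces a shadow zone whose length grows with the incidence angle, which is the geometric effect the DEM-based shadow map captures.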

  15. Multidisciplinary eHealth Survey Evaluation Methods

    ERIC Educational Resources Information Center

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  16. Democratizing Evaluation: Meanings and Methods from Practice.

    ERIC Educational Resources Information Center

    Ryan, Katherine E.; Johnson, Trav D.

    2000-01-01

    Uses the results of an instrumental case study to identify issues connected to evaluation participation and its representation and the role of the internal evaluator in democratic, deliberative evaluation. Identified direct participation and participation by representation, sanctioned or unsanctioned representation, and extrinsic and intrinsic…

  17. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2015-03-01

    One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two data sets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, built after careful identification of the different environments, with deforestation features corrected by a new method that increases pixel values of the DEM (Rennó, 2009); and (2) a set of 18 hydrological-topographic descriptors based on the corrected SRTM DEM. The deforestation features are related to the opening of an 800 km road in the central part of the interfluve and the occupation of its vicinity. We used topographic profiles from the pristine forest to the deforested features to evaluate the recovery of the original canopy coverage by minimizing canopy height variation (corrections ranged from 1 to 38 m). The hydrological-topographic description was obtained with the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (above sea level) by the elevation of the nearest hydrologically connected drainage. The validation of the HAND data set was done by in situ hydrological description of 110 km of walking trails, also available in this data set. The new SRTM DEM expands the applicability of SRTM data for landscape modelling; the data sets of hydrological features based on topographic modelling are undoubtedly appropriate for ecological modelling and an important contribution to the environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the
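    The HAND normalization can be sketched on a toy 1-D flow network. The function and parameter names below are hypothetical; real implementations (including the one referenced above) operate on D8 flow-direction rasters, but the core idea is the same: subtract from each cell's elevation the elevation of the drainage cell it ultimately flows into.

```python
import numpy as np

def hand_1d(elev, drains_to, is_drainage):
    """Height Above the Nearest Drainage for a toy flow network.

    elev        : elevation of each cell (above sea level)
    drains_to   : index of the cell each cell flows into (-1 = outlet)
    is_drainage : True where the cell belongs to the drainage network
    """
    elev = np.asarray(elev, dtype=float)
    hand = np.zeros_like(elev)
    for i in range(len(elev)):
        j = i
        # Follow the flow path downstream until a drainage cell is hit.
        while not is_drainage[j] and drains_to[j] >= 0:
            j = drains_to[j]
        hand[i] = elev[i] - elev[j]   # normalize by drainage elevation
    return hand
```

    Cells on the drainage network get HAND = 0 by construction, which is why HAND maps separate floodplains (low HAND) from uplands (high HAND) regardless of absolute elevation.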

  18. Volume changes during the 2010 Merapi eruption calculated from TanDEM-X interferometry

    NASA Astrophysics Data System (ADS)

    Kubanek, Julia; Westerhaus, Malte; Heck, Bernhard

    2013-04-01

    NE-SE and NW-SW sectors of the edifice, respectively. We use the DEMs to quantify the volume change at the summit caused by the 2010 eruption. As TanDEM-X is an innovative mission, the present study also serves as a test of employing data from a new satellite mission in volcano research. An error analysis of the DEMs to evaluate the volume quantifications was therefore also conducted.

  19. Shape and Albedo from Shading (SAfS) for Pixel-Level DEM Generation from Monocular Images Constrained by a Low-Resolution DEM

    NASA Astrophysics Data System (ADS)

    Wu, Bo; Chung Liu, Wai; Grumpe, Arne; Wöhler, Christian

    2016-06-01

    Lunar topographic information, e.g., a lunar DEM (Digital Elevation Model), is very important for lunar exploration missions and scientific research. Lunar DEMs are typically generated by photogrammetric image processing or laser altimetry, of which the photogrammetric methods require multiple stereo images of an area. DEMs generated by these methods usually rely on various interpolation techniques, leading to interpolation artifacts in the resulting DEM. On the other hand, photometric shape reconstruction, e.g., SfS (Shape from Shading), extensively studied in the field of Computer Vision, has been introduced for pixel-level resolution DEM refinement. SfS methods have the ability to reconstruct pixel-wise terrain details that explain a given image of the terrain. If the terrain and its corresponding pixel-wise albedo are to be estimated simultaneously, this becomes a SAfS (Shape and Albedo from Shading) problem, which is under-determined without additional information. Previous works show strong statistical regularities in the albedo of natural objects, and this is even more plausible for the lunar surface, whose albedo is less complex than the Earth's. In this paper we suggest a method that refines a lower-resolution DEM to pixel-level resolution given a monocular image of the area with known light source, while also estimating the corresponding pixel-wise albedo map. We regulate the behaviour of albedo and shape such that the optimized terrain and albedo are the likely solutions that explain the corresponding image. The parameters of the approach are optimized through a kernel-based relaxation framework to gain computational advantages. In this research we experimentally employ the Lunar-Lambertian model for reflectance modelling; the framework of the algorithm is expected to be independent of a specific reflectance model. Experiments are carried out using the monocular images from Lunar Reconnaissance Orbiter (LRO

  20. Monitoring lava dome changes by means of differential DEMs from TanDEM-X interferometry: Examples from Merapi, Indonesia and Volcán de Colima, Mexico

    NASA Astrophysics Data System (ADS)

    Kubanek, J.; Westerhaus, M.; Heck, B.

    2013-12-01

    derived by TanDEM-X interferometry taken before and after the eruption. Our results reveal that the eruption led to a topographic change of up to 200 m in the summit area of Merapi. We further show the ability of the TanDEM-X data to observe much smaller topographic changes, using Volcán de Colima as a second test site. An explosion at the crater rim signaled the end of magma ascent in June 2011. The bistatic TanDEM-X data give important information on this explosion, as we can observe topographic changes of 20 m and less in the summit area when comparing datasets taken before and after the event. We further analyzed datasets from the beginning of 2013, when Colima became active again after a dormant period. Our results indicate that repeated DEMs with great detail and good accuracy are obtainable, enabling a quantitative estimation of volume changes in the summit area of the volcano. As TanDEM-X is an innovative mission, the present study also serves as a test of employing data from a new satellite mission in volcano research. An error analysis of the DEMs to evaluate the volume quantifications was therefore also conducted.

  1. Co-seismic landslide topographic analysis based on multi-temporal DEM-A case study of the Wenchuan earthquake.

    PubMed

    Ren, Zhikun; Zhang, Zhuqi; Dai, Fuchu; Yin, Jinhui; Zhang, Huiping

    2013-01-01

    Hillslope instability has been thought to be one of the most important factors for landslide susceptibility. In this study, we apply geomorphic analysis using multi-temporal DEM data and shaking intensity analysis to evaluate the topographic characteristics of the landslide areas. Besides slope, other geomorphic measures such as roughness and slope aspect are equally useful for such analysis. The analyses indicate that most of the co-seismic landslides occurred in regions with roughness >1.2, hillslope >30°, and slope aspect between 90° and 270°. The intersection of the regions identified by these three criteria is more accurate than the result of any single topographic analysis method. The ground motion data indicate that the co-seismic landslides mainly occurred on the hanging wall side of the Longmen Shan Thrust Belt, within the vertical and horizontal peak ground acceleration (PGA) contours of 150 gal and 200 gal, respectively. The comparison of pre- and post-earthquake DEM data indicates that regions of medium roughness and slope increased, while the roughest and steepest regions decreased after the Wenchuan earthquake; slope aspect, however, barely changed. Our results indicate that co-seismic landslides mainly occurred in specific regions of high roughness on south-facing, steeply sloping areas under strong ground motion. Co-seismic landslides significantly modified the local topography, especially hillslope and roughness: the roughest relief and steepest slopes were significantly smoothed, while regions of medium relief and slope became rougher and steeper. PMID:24171155
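    Intersecting the three topographic criteria reported in the abstract (roughness > 1.2, hillslope > 30 degrees, aspect between 90 and 270 degrees) is a simple boolean raster operation. The thresholds come from the abstract; the function name and array layout are illustrative assumptions:

```python
import numpy as np

def landslide_prone(roughness, slope_deg, aspect_deg):
    """Combine the three per-pixel criteria by intersection.

    Returns a boolean mask that is True only where all three
    susceptibility conditions hold simultaneously.
    """
    return (roughness > 1.2) & (slope_deg > 30.0) & \
           (aspect_deg >= 90.0) & (aspect_deg <= 270.0)
```

    Because the mask is the logical AND of the three single-criterion masks, it is by construction a subset of each of them, which is why the intersection localizes landslides better than any single measure.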

  2. DEM Simulated Results And Seismic Interpretation of the Red River Fault Displacements in Vietnam

    NASA Astrophysics Data System (ADS)

    Bui, H. T.; Yamada, Y.; Matsuoka, T.

    2005-12-01

    The Song Hong basin is the largest Tertiary sedimentary basin in Vietnam. It began to form approximately 32 Ma ago, when left-lateral displacement on the Red River Fault commenced. The structure, formation and tectonic evolution of the Song Hong basin have long been studied, but several problems remain under discussion, such as the magnitude of the displacements, the magnitude of movement along the faults, and the timing of tectonic inversion and right-lateral displacement. In particular, the mechanism of the Song Hong basin's formation is still controversial, with many different hypotheses concerning the activity of the Red River fault. In this paper, PFC2D, based on the Distinct Element Method (DEM), was used to simulate the development of the Red River fault system that controlled the development of the Song Hong basin from onshore to its elongated offshore portion. The numerical results show the different parts of the stress field, such as the compressive field, the stress-free field and the pull-apart field, of the dynamic mechanism along the Red River fault in the onshore area. Its propagation to the offshore area is partitioned into two main branch faults, corresponding to the Song Chay and Song Lo fault systems, which are considered to bound the east and west flanks of the Song Hong basin. The simulation of the Red River motion also reproduced the left-lateral displacement since its onset. Although this is the first time the DEM has been applied to study the deformation and geodynamic evolution of the Song Hong basin, the results can reliably be applied to the evaluation of the basin's structural configuration.

  3. DEM, tide and velocity over sulzberger ice shelf, West Antarctica

    USGS Publications Warehouse

    Baek, S.; Shum, C.K.; Lee, H.; Yi, Y.; Kwoun, Oh-Ig; Lu, Zhiming; Braun, Andreas

    2005-01-01

    Arctic and Antarctic ice sheets preserve more than 77% of the global fresh water and could raise global sea level by several meters if completely melted. Ocean tides near and under ice shelves shift the grounding line position significantly and are one of the current limitations on studying glacier dynamics and mass balance. The Sulzberger ice shelf is an area of ice mass flux change in West Antarctica and has not yet been well studied. In this study, we use repeat-pass synthetic aperture radar (SAR) interferometry data from the ERS-1 and ERS-2 tandem missions for the generation of a high-resolution (60-m) Digital Elevation Model (DEM), including tidal deformation detection and ice stream velocity of the Sulzberger Ice Shelf. Other satellite data, such as laser altimeter measurements with fine footprints (70-m) from NASA's ICESat, are used for validation and analyses. The resulting DEM has an accuracy of -0.57 ± 5.88 m and is demonstrated to be useful for grounding line detection and ice mass balance studies. The deformation observed by InSAR is found to be primarily due to ocean tides and atmospheric pressure. The 2-D ice stream velocities computed agree qualitatively with previous methods on part of the Ice Shelf from passive remote-sensing data (i.e., LANDSAT). © 2005 IEEE.
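    DEM accuracy of the form reported here (a mean offset plus a spread against reference heights such as ICESat footprints) is the mean and standard deviation of the DEM-minus-reference differences. A generic sketch, not the authors' validation code:

```python
import numpy as np

def dem_accuracy(dem_heights, reference_heights):
    """Mean bias and sample standard deviation of DEM minus reference.

    Returns (bias, spread), the two numbers usually quoted as
    "bias ± spread" in DEM validation studies.
    """
    diff = np.asarray(dem_heights) - np.asarray(reference_heights)
    return diff.mean(), diff.std(ddof=1)

bias, spread = dem_accuracy([1.0, 2.0, 3.0], [0.0, 0.0, 0.0])
```

    Reporting bias and spread separately matters: a large bias with small spread suggests a systematic datum offset, while a small bias with large spread suggests noise or terrain-dependent error.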

  4. DEM Modelling of Non-linear Viscoelastic Stress Waves

    NASA Astrophysics Data System (ADS)

    Wang, Wenqiang; Tang, Zhiping; Horie, Yasuyuki

    2001-06-01

    A DEM (Discrete Element Method) simulation of nonlinear viscoelastic stress wave problems is carried out. The interaction forces among elements are described using a model in which neighbouring elements are linked by a nonlinear spring and a certain number of Maxwell components in parallel. By making use of exponential relaxation moduli, it is shown that numerical computation of the convolution integral does not require storing and repeatedly processing the strain history, which reduces the computational cost dramatically. To validate the viscoelastic DM2 code, stress wave propagation in a Maxwell rod with one end subjected to a constant stress loading is simulated. The results fit those of the method-of-characteristics calculation excellently. Satisfactory results are also obtained in the simulation of a one-dimensional plane wave in a plastic-bonded explosive. The code is then used to investigate meso-scale damage in this explosive under shock loading. The results not only show "compression damage", but also reveal a complex damage evolution. They demonstrate a unique capability of DEM in modeling heterogeneous materials.
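    The key computational point, that an exponential relaxation modulus turns the hereditary convolution integral into a one-step recurrence so no strain history needs to be stored, can be sketched for a single Maxwell branch. This uses a simple rectangle-rule update chosen for illustration; the paper's exact integration scheme may differ.

```python
import math

def maxwell_stress_update(h, d_strain, dt, E, tau):
    """Advance a Maxwell branch's internal stress by one time step.

    Because the relaxation modulus is E * exp(-t / tau), the convolution
    integral satisfies the recurrence
        h_{n+1} = exp(-dt / tau) * h_n + E * d_strain,
    i.e. decay the stored stress and add the new strain increment.
    Only the current value h is kept, never the strain history.
    """
    return math.exp(-dt / tau) * h + E * d_strain
```

    For a pure relaxation test (constant strain after an initial step), repeated application reproduces the exponential stress decay exactly, since the recurrence multiplies the stored stress by exp(-dt/tau) each step.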

  5. Simulation of triaxial response of granular materials by modified DEM

    NASA Astrophysics Data System (ADS)

    Wang, XiaoLiang; Li, JiaChun

    2014-12-01

    A modified discrete element method (DEM) with the rolling effect taken into consideration is developed to examine the macroscopic behavior of granular materials in this study. Dimensional analysis is first performed to establish the relationship between macroscopic mechanical behavior, mesoscale contact parameters at the particle level, and external loading rate. It is found that only four dimensionless parameters govern the macroscopic mechanical behavior in bulk. A numerical triaxial apparatus was used to study their influence on the mechanical behavior of granular materials. The parametric study indicates that Poisson's ratio varies only with the stiffness ratio, while Young's modulus is proportional to the contact modulus and grows with the stiffness ratio, both of which agree with the micromechanical model. The peak friction angle depends on both the inter-particle friction angle and the rolling resistance. The dilatancy angle relies on the inter-particle friction angle if the rolling stiffness coefficient is sufficiently large. Finally, we recommend a calibration procedure for cohesionless soil, which was then applied to the simulation of Chende sand using a series of triaxial compression tests. The responses of the DEM model are in quantitative agreement with experiments. In addition, the stress-strain response in triaxial extension was also obtained by numerical triaxial extension tests.

  6. Teaching Practical Public Health Evaluation Methods

    ERIC Educational Resources Information Center

    Davis, Mary V.

    2006-01-01

    Human service fields, and more specifically public health, are increasingly requiring evaluations to prove the worth of funded programs. Many public health practitioners, however, lack the required background and skills to conduct useful, appropriate evaluations. In the late 1990s, the Centers for Disease Control and Prevention (CDC) created the…

  7. Computational Evaluation of the Traceback Method

    ERIC Educational Resources Information Center

    Kol, Sheli; Nir, Bracha; Wintner, Shuly

    2014-01-01

    Several models of language acquisition have emerged in recent years that rely on computational algorithms for simulation and evaluation. Computational models are formal and precise, and can thus provide mathematically well-motivated insights into the process of language acquisition. Such models are amenable to robust computational evaluation,…

  8. DEM analyses of shear behaviour of rock joints by a novel bond contact model

    NASA Astrophysics Data System (ADS)

    Jiang, M. J.; Liu, J.; Sun, C.; Chen, H.

    2015-09-01

    The failure of rock joints is one of the potential causes of local and general rock instability, which may trigger devastating geohazards such as landslides. In this paper, the Distinct Element Method (DEM), featuring a novel bond contact model, was utilized to simulate the shear behaviour of centre and non-coplanar rock joints. The DEM results show that the complete shear behaviour of jointed rock includes four stages: the elastic shearing phase, crack propagation, the failure of rock bridges, and the through-going discontinuity. The peak shear strength of the centre joint increases as the joint connectivity rate decreases. For intermittent non-coplanar rock joints, as the inclination of the joints increases, the shear capacity decreases when the inclination angle is negative and increases when it is positive. Comparison with experimental results proves the capability of this DEM model in capturing the mechanical properties of jointed rocks.

  9. GPS-Based Precision Baseline Reconstruction for the TanDEM-X SAR-Formation

    NASA Technical Reports Server (NTRS)

    Montenbruck, O.; vanBarneveld, P. W. L.; Yoon, Y.; Visser, P. N. A. M.

    2007-01-01

    The TanDEM-X formation employs two separate spacecraft to collect interferometric Synthetic Aperture Radar (SAR) measurements over baselines of about 1 km. These will allow the generation of a global Digital Elevation Model (DEM) with a relative vertical accuracy of 2-4 m and a 10 m ground resolution. As part of the ground processing, the separation of the SAR antennas at the time of each data take must be reconstructed with 1 mm accuracy using measurements from two geodetic-grade GPS receivers. The paper discusses the TanDEM-X mission as well as the methods employed for determining the interferometric baseline with utmost precision. Measurements collected during the close fly-by of the two GRACE satellites serve as a reference case to illustrate the processing concept, expected accuracy and quality control strategies.

  10. Sensitivity Analysis of UAV Photogrammetry for Creating Digital Elevation Models (DEM)

    NASA Astrophysics Data System (ADS)

    Rock, G.; Ries, J. B.; Udelhoven, T.

    2011-09-01

    This study evaluates the potential that lies in the photogrammetric processing of aerial images captured by unmanned aerial vehicles. UAV systems have gained increasing attention during the last years. Miniaturization of electronic components often results in a reduction of quality; the accuracy of the GPS/IMU navigation unit and of the camera in particular is of the utmost importance for the photogrammetric evaluation of aerial images. To determine the accuracy of digital elevation models (DEMs), an experimental setup was chosen similar to the situation of data acquisition during a field campaign. A quarry was chosen for the experiment because of the presence of different geomorphologic units, such as vertical walls, piles of debris, vegetation and flat areas. In the experimental test field, 1042 ground control points (GCPs) were placed, used both as input data for the photogrammetric processing and as high-accuracy reference data for evaluating the DEMs. Further, an airborne LiDAR dataset covering the whole quarry and an additional 2000 reference points, measured by total station, were used as ground truth data. The aerial images were taken using a MAVinci Sirius I UAV equipped with a Canon 300D as the imaging system. The investigation addressed the influence of the number of GCPs on the accuracy of the indirect sensor orientation, and the dependency of the modelled DEMs' absolute deviations on different parameters. Nevertheless, the only significant factor affecting DEM accuracy that could be isolated was the flying height of the UAV.

  11. Method for evaluation of laboratory craters using crater detection algorithm for digital topography data

    NASA Astrophysics Data System (ADS)

    Salamunićcar, Goran; Vinković, Dejan; Lončarić, Sven; Vučina, Damir; Pehnec, Igor; Vojković, Marin; Gomerčić, Mladen; Hercigonja, Tomislav

    In our previous work the following was done: (1) a crater detection algorithm (CDA) based on digital elevation models (DEMs) was developed and the GT-115225 catalog was assembled [GRS, 48 (5), in press, doi:10.1109/TGRS.2009.2037750]; and (2) the results of a comparison between explosion-induced laboratory craters in stone powder surfaces and GT-115225 were presented using depth/diameter measurements [41st LPSC, Abstract #1428]. The next step achievable with the available technology is to create 3D scans of such laboratory craters, in order to compare their properties with those of simple Martian craters. In this work, we propose a formal method for the evaluation of laboratory craters, providing an objective, measurable and reproducible estimate of the level of similarity achieved between these laboratory craters and real impact craters. In the first step, a section of MOLA data for Mars (or SELENE LALT for the Moon) is replaced with one or several 3D scans of laboratory craters. Once this embedding is done, the CDA can be used to find out whether a laboratory crater is similar enough to real craters to be recognized as a crater by the CDA. The CDA evaluation using the ROC' curve represents how the true detection rate (TDR=TP/(TP+FN)=TP/GT) depends on the false detection rate (FDR=FP/(TP+FP)). Using this curve, it is now possible to define the measure of similarity between laboratory and real impact craters as a TDR or FDR value, or as the distance from the bottom-right origin of the ROC' curve. With such an approach, a reproducible (formally described) method for the evaluation of laboratory craters is provided.
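    The two rates defined in the abstract follow directly from the true-positive, false-positive and false-negative counts; a minimal sketch:

```python
def detection_rates(tp, fp, fn):
    """Rates as defined in the abstract:
    TDR = TP / (TP + FN)  (fraction of ground-truth craters detected)
    FDR = FP / (TP + FP)  (fraction of detections that are spurious)
    """
    tdr = tp / (tp + fn)
    fdr = fp / (tp + fp)
    return tdr, fdr
```

    Sweeping the CDA's detection threshold and plotting TDR against FDR traces the ROC' curve the abstract uses; a laboratory crater that is detected at thresholds where real craters are detected would count toward TP, raising TDR.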

  12. An Investigation of Transgressive Deposits in Late Pleistocene Lake Bonneville using GPR and UAV-produced DEMs.

    NASA Astrophysics Data System (ADS)

    Schide, K.; Jewell, P. W.; Oviatt, C. G.; Jol, H. M.

    2015-12-01

    Lake Bonneville was the largest of the Pleistocene pluvial lakes that once filled the Great Basin of the interior western United States. Its two most prominent shorelines, Bonneville and Provo, are well documented, but many of the lake's intermediate shoreline features have yet to be studied. These transgressive barriers and embankments mark short-term changes in the regional water budget and thus represent a proxy for local climate change. The internal and external structures of these features are analyzed using the following methods: ground penetrating radar, 5-meter auto-correlated DEMs, 1-meter DEMs generated from LiDAR, high-accuracy handheld GPS, and 3D imagery collected with an unmanned aerial vehicle. These mapping, surveying and imaging methods provide a quantitative analysis of regional sediment availability, transportation and deposition, as well as changes in wave and wind energy. These controls help define climate thresholds and rates of landscape evolution in the Great Basin during the Pleistocene, which are then evaluated in the context of global climate change.

  13. Precise Global DEM Generation by ALOS PRISM

    NASA Astrophysics Data System (ADS)

    Tadono, T.; Ishida, H.; Oda, F.; Naito, S.; Minakawa, K.; Iwamoto, H.

    2014-04-01

    The Japan Aerospace Exploration Agency (JAXA) generated a global digital elevation/surface model (DEM/DSM) and orthorectified images (ORI) using the archived data of the Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM) onboard the Advanced Land Observing Satellite (ALOS, nicknamed "Daichi"), which was operated from 2006 to 2011. PRISM consisted of three panchromatic radiometers that acquired along-track stereo images. It had a spatial resolution of 2.5 m in the nadir-looking radiometer and achieved global coverage, making it a suitable candidate for precise global DSM and ORI generation. Over the past 10 years or so, JAXA has conducted the calibration of PRISM's system-corrected standard products in order to improve absolute accuracies, as well as to validate high-level products such as the DSM and ORI. In this paper, we give an overview of the global DEM/DSM dataset generation project, including a summary of ALOS and PRISM and the status of the global data archive. Data processing strategies must also be considered, since the processing capabilities for the level 1 standard product and the high-level products must be developed in terms of both hardware and software to achieve the project aims. The automatic DSM/ORI processing software and its test processing results are also described.

  14. Evaluation of temperament scoring methods for beef cattle

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this study was to evaluate methods of temperament scoring. Crossbred (n=228) calves were evaluated for temperament by an individual evaluator at weaning by two methods of scoring: 1) pen score (1 to 5 scale, with higher scores indicating increasing degree of nervousness, aggressiven...

  15. GENERAL METHODS FOR REMEDIAL PERFORMANCE EVALUATIONS

    EPA Science Inventory

    This document was developed by an EPA-funded project to explain technical considerations and principles necessary to evaluate the performance of ground-water contamination remediations at hazardous waste sites. This is neither a "cookbook", nor an encyclopedia of recommended fi...

  16. A Ranking Method for Evaluating Constructed Responses

    ERIC Educational Resources Information Center

    Attali, Yigal

    2014-01-01

    This article presents a comparative judgment approach for holistically scored constructed response tasks. In this approach, the grader rank orders (rather than rate) the quality of a small set of responses. A prior automated evaluation of responses guides both set formation and scaling of rankings. Sets are formed to have similar prior scores and…

  17. Multi-Method Evaluation of College Teaching

    ERIC Educational Resources Information Center

    Algozzine, Bob; Beattie, John; Bray, Marty; Flowers, Claudia; Gretes, John; Mohanty, Ganesh; Spooner, Fred

    2010-01-01

    Student evaluation of instruction in college and university courses has been a routine and mandatory part of undergraduate and graduate education for some time. A major shortcoming of the process is that it relies exclusively on the opinions or qualitative judgments of students rather than on assessing the learning or transfer of knowledge that…

  18. DEM Extraction from CHANG'E-1 LAM Data by Surface Skinning Technology

    NASA Astrophysics Data System (ADS)

    Zhang, X.-B.; Zhang, W.-M.

    2011-08-01

    A DEM is a digital model or 3-D representation of a terrain's surface, created from terrain elevation data. The main models for DEM extraction from Lidar or Laser Altimeter data currently treat the point cloud as scattered; examples are the regular grid model, the TIN model and the contour model. Essentially, in these methods the discrete points are interpolated into regular or irregular grid data. In fact, a point cloud generated by a Laser Altimeter is not totally scattered but has some regularity. To exploit this regularity, the method proposed in this paper adopts surface skinning technology to generate a DEM from Chang'E-1 Laser Altimeter data. Surface skinning technology is widely used in the field of mechanical engineering. Surface skinning is the process of passing a smooth surface through a set of curves, called sectional curves, which, in general, may not be compatible. In generating the section lines, a curvature-based method must be used to obtain a set of characteristic points, which are then used to subdivide the segments; the next step is to generate several curves at key places. These curves describe the shape of the surface. The last step is to generate a surface that passes through these curves. The results show that this idea is feasible and useful, and it provides a novel way to generate an accurate DEM.

  19. Discrete element modelling (DEM) input parameters: understanding their impact on model predictions using statistical analysis

    NASA Astrophysics Data System (ADS)

    Yan, Z.; Wilkinson, S. K.; Stitt, E. H.; Marigo, M.

    2015-09-01

    Selection or calibration of particle property input parameters is one of the key problematic aspects for the implementation of the discrete element method (DEM). In the current study, a parametric multi-level sensitivity method is employed to understand the impact of the DEM input particle properties on the bulk responses for a given simple system: discharge of particles from a flat bottom cylindrical container onto a plate. In this case study, particle properties, such as Young's modulus, friction parameters and coefficient of restitution were systematically changed in order to assess their effect on material repose angles and particle flow rate (FR). It was shown that inter-particle static friction plays a primary role in determining both final angle of repose and FR, followed by the role of inter-particle rolling friction coefficient. The particle restitution coefficient and Young's modulus were found to have insignificant impacts and were strongly cross correlated. The proposed approach provides a systematic method that can be used to show the importance of specific DEM input parameters for a given system and then potentially facilitates their selection or calibration. It is concluded that shortening the process for input parameters selection and calibration can help in the implementation of DEM.
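The one-at-a-time sweep behind such a parametric sensitivity study can be sketched as follows. `run_dem` is a hypothetical surrogate for a full DEM discharge simulation; its coefficients are invented so that the response mirrors the paper's qualitative finding (static friction dominant, rolling friction secondary, restitution negligible), and the parameter levels are likewise illustrative.

```python
# Hypothetical surrogate for a full DEM discharge simulation, returning an
# angle-of-repose-like bulk response; coefficients are invented for this sketch.
def run_dem(static_friction, rolling_friction, restitution):
    return 20.0 + 25.0 * static_friction + 8.0 * rolling_friction + 0.1 * restitution

levels = {
    "static_friction": [0.2, 0.5, 0.8],
    "rolling_friction": [0.0, 0.05, 0.1],
    "restitution": [0.3, 0.6, 0.9],
}

# one-at-a-time sensitivity: spread of the response when one parameter varies
# over its levels while the others sit at their middle level
base = {k: v[1] for k, v in levels.items()}
sensitivity = {}
for name, values in levels.items():
    responses = [run_dem(**{**base, name: v}) for v in values]
    sensitivity[name] = max(responses) - min(responses)

print(sensitivity)  # static friction shows by far the largest spread
```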

  20. An evaluation of two VTO methods.

    PubMed

    Sample, L B; Sadowsky, P L; Bradley, E

    1998-10-01

    A sample of 34 growing Class II patients was used to assess the reliability of manual and computer-generated visual treatment objectives (VTOs) when compared with the actual treatment results. Skeletal, dental, and soft tissue measurements were performed on the VTO and on the posttreatment tracings. Using paired t-tests and Pearson correlation coefficients, comparisons were made between the VTO and posttreatment tracings. Both the manual and computer VTO methods were accurate when predicting skeletal changes that occurred during treatment. However, both methods were only moderately successful in forecasting dental and soft tissue alterations during treatment. Only slight differences were seen between the manual and computer VTO methods, with the computer being slightly more accurate with the soft tissue prediction. However, the differences between the two methods were not judged to be clinically significant. Overall, the prediction tracings were accurate to only a moderate degree, with marked individual variation evident throughout the sample. PMID:9770097

  1. Novel methods to evaluate fracture risk models

    PubMed Central

    Donaldson, M.G.; Cawthon, P. M.; Schousboe, J.T.; Ensrud, K.E.; Lui, L.Y.; Cauley, J.A.; Hillier, T.A.; Taylor, B.C.; Hochberg, M.C.; Bauer, D.C.; Cummings, S.R.

    2013-01-01

    Fracture prediction models help identify individuals at high risk who may benefit from treatment. Area Under the Curve (AUC) is used to compare prediction models. However, the AUC has limitations and may miss important differences between models. Novel reclassification methods quantify how accurately models classify patients who benefit from treatment and the proportion of patients above/below treatment thresholds. We applied two reclassification methods, using the NOF treatment thresholds, to compare two risk models: femoral neck BMD and age (“simple model”) and FRAX (”FRAX model”). The Pepe method classifies based on case/non-case status and examines the proportion of each above and below thresholds. The Cook method examines fracture rates above and below thresholds. We applied these to the Study of Osteoporotic Fractures. There were 6036 (1037 fractures) and 6232 (389 fractures) participants with complete data for major osteoporotic and hip fracture respectively. Both models for major osteoporotic fracture (0.68 vs. 0.69) and hip fracture (0.75 vs. 0.76) had similar AUCs. In contrast, using reclassification methods, each model classified a substantial number of women differently. Using the Pepe method, the FRAX model (vs. simple model), missed treating 70 (7%) cases of major osteoporotic fracture but avoided treating 285 (6%) non-cases. For hip fracture, the FRAX model missed treating 31 (8%) cases but avoided treating 1026 (18%) non-cases. The Cook method (both models, both fracture outcomes) had similar fracture rates above/below the treatment thresholds. Compared with the AUC, new methods provide more detailed information about how models classify patients. PMID:21351143
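A Pepe-style reclassification count can be sketched as below: among cases and non-cases separately, count the subjects that the second model moves below the treatment threshold relative to the first. The helper function and the toy risk scores are illustrative assumptions, not the published implementation.

```python
def reclassification_counts(risk_a, risk_b, is_case, threshold):
    """Relative to model A, count cases that model B moves below the
    treatment threshold ("missed treating") and non-cases it moves below
    ("avoided treating"). Illustrative sketch only."""
    missed_cases = sum(1 for a, b, c in zip(risk_a, risk_b, is_case)
                       if c and a >= threshold and b < threshold)
    spared_noncases = sum(1 for a, b, c in zip(risk_a, risk_b, is_case)
                          if not c and a >= threshold and b < threshold)
    return missed_cases, spared_noncases

# toy 10-year risk scores for four subjects, treatment threshold 20%
risk_simple = [0.25, 0.10, 0.30, 0.22]   # BMD + age model
risk_frax   = [0.15, 0.12, 0.35, 0.18]   # FRAX model
fractured   = [True, False, True, False]
print(reclassification_counts(risk_simple, risk_frax, fractured, 0.20))  # (1, 1)
```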

  2. [Evaluation of preventive measures using scientific methods].

    PubMed

    Göhlen, Britta; Bossmann, Hildegard

    2010-10-01

    Owing to ethical and financial considerations, the evaluation of preventive and health-promotion activities is gaining importance. But how can it be carried out at a high scientific level? A tool that suits this purpose is Health Technology Assessment (HTA). Provided that the appropriate methodology is selected and the scientific literature is evaluated, HTA can help appraise the outcomes of preventive activities. The German Institute of Medical Documentation and Information publishes HTA reports on behalf of the Federal Health Ministry, some of which deal with prevention-related topics. Examples from 2009 are reports on vaccination against human papillomavirus and on the nonmedicinal secondary prevention of coronary heart disease. PMID:20981592

  3. DEM time series of an agricultural watershed

    NASA Astrophysics Data System (ADS)

    Pineux, Nathalie; Lisein, Jonathan; Swerts, Gilles; Degré, Aurore

    2014-05-01

    In agricultural landscapes the soil surface evolves, notably due to erosion and deposition. Even though most field data come from plot-scale studies, the watershed scale seems more appropriate for understanding these phenomena. Small unmanned aircraft systems and image-processing techniques are currently improving, so that 3D models can be built from multiple overlapping shots. Where techniques for large areas would be too expensive for a watershed-level study and techniques for small areas would be too time-consuming, an unmanned aerial system is a promising way to quantify erosion and deposition patterns. Ongoing technical improvements in this growing field yield very good data quality and very high spatial resolution with high vertical (Z) accuracy. In the center of Belgium, we equipped an agricultural watershed of 124 ha. For three years (2011-2013), we have been monitoring weather (including rainfall erosivity using a spectropluviograph), discharge at three different locations, sediment in runoff water, and watershed microtopography through unmanned airborne imagery (Gatewing X100). We also collected all available historical data to try to capture the "long-term" changes in watershed morphology over the last decades: old topographic maps, historical soil descriptions, etc. An erosion model (LANDSOIL) is also used to assess the evolution of the relief. Short-term evolution of the surface is now observed through flights flown at 200 m height, with pictures taken at a side overlap of 80%. To georeference the DEM precisely, ground control points are placed on the study site and surveyed with a Leica GPS1200 (accuracy of 1 cm for the x and y coordinates and 1.5 cm for z). Flights are made each year in December, when the ground surface is as bare as possible. Specific treatments are being developed to counteract vegetation effects, which are known to be a key source of error in DEMs produced by small unmanned aircraft

  4. The Study on Educational Technology Abilities Evaluation Method

    NASA Astrophysics Data System (ADS)

    Jing, Duan

    With traditional evaluation methods based on tests, the tests did not really measure what we wanted to measure, so test results could not serve as a sound basis for evaluation, and the conclusions drawn from them were naturally of doubtful weight. The system described here makes full use of educational technology and is grounded in educational and psychological theory; taking the evaluation object as its basis and supported by evaluation tools, it aims to evaluate the educational technology abilities of primary and secondary school teachers and, using a variety of evaluation methods from various angles, establishes an informal evaluation system.

  5. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... where there is a human intrusion as specified by 10 CFR 63.322. DOE will model the performance of the... 10 Energy 4 2010-01-01 2010-01-01 false Postclosure suitability evaluation method. 963.16 Section... Determination, Methods, and Criteria § 963.16 Postclosure suitability evaluation method. (a) DOE will...

  6. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... where there is a human intrusion as specified by 10 CFR 63.322. DOE will model the performance of the... 10 Energy 4 2011-01-01 2011-01-01 false Postclosure suitability evaluation method. 963.16 Section... Determination, Methods, and Criteria § 963.16 Postclosure suitability evaluation method. (a) DOE will...

  7. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... where there is a human intrusion as specified by 10 CFR 63.322. DOE will model the performance of the... 10 Energy 4 2014-01-01 2014-01-01 false Postclosure suitability evaluation method. 963.16 Section... Determination, Methods, and Criteria § 963.16 Postclosure suitability evaluation method. (a) DOE will...

  8. Developmental Eye Movement (DEM) Test Norms for Mandarin Chinese-Speaking Chinese Children.

    PubMed

    Xie, Yachun; Shi, Chunmei; Tong, Meiling; Zhang, Min; Li, Tingting; Xu, Yaqin; Guo, Xirong; Hong, Qin; Chi, Xia

    2016-01-01

    The Developmental Eye Movement (DEM) test is commonly used as a clinical visual-verbal ocular motor assessment tool to screen and diagnose reading problems at the onset. No established norm exists for using the DEM test with Mandarin Chinese-speaking Chinese children. This study aims to establish the normative values of the DEM test for the Mandarin Chinese-speaking population in China; it also aims to compare the values with three other published norms for English-, Spanish-, and Cantonese-speaking Chinese children. A random stratified sampling method was used to recruit children from eight kindergartens and eight primary schools in the main urban and suburban areas of Nanjing. A total of 1,425 Mandarin Chinese-speaking children aged 5 to 12 years took the DEM test in Mandarin Chinese. A digital recorder was used to record the process. All of the subjects completed a symptomatology survey, and their DEM scores were determined by a trained tester. The scores were computed using the formula in the DEM manual, except that the "vertical scores" were adjusted by taking the vertical errors into consideration. The results were compared with the three other published norms. In our subjects, a general decrease with age was observed for the four eye movement indexes: vertical score, adjusted horizontal score, ratio, and total error. For both the vertical and adjusted horizontal scores, the Mandarin Chinese-speaking children completed the tests much more quickly than the norms for English- and Spanish-speaking children. However, the same group completed the test slightly more slowly than the norms for Cantonese-speaking children. The differences in the means were significant (P<0.001) in all age groups. For several ages, the scores obtained in this study were significantly different from the reported scores of Cantonese-speaking Chinese children (P<0.005). 
Compared with English-speaking children, only the vertical score of the 6-year-old group, the vertical-horizontal time
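The time adjustment described in this abstract can be sketched with the commonly cited DEM scoring form, in which the raw completion time is scaled by the effective number of targets actually named. The target count of 80 and this exact formula are assumptions here (the study notes its vertical scores were additionally adjusted for vertical errors).

```python
def adjusted_time(raw_time_s, omissions, additions, n_targets=80):
    """Scale the raw completion time by the effective number of targets
    named (assumed form of the DEM manual's adjustment formula)."""
    return raw_time_s * n_targets / (n_targets - omissions + additions)

def dem_ratio(vertical_time_s, adjusted_horizontal_time_s):
    # a ratio above 1 flags horizontal (reading-like) slowing
    # relative to vertical naming speed
    return adjusted_horizontal_time_s / vertical_time_s

adj_h = adjusted_time(60.0, omissions=4, additions=0)
print(round(adj_h, 2))                   # 63.16
print(round(dem_ratio(50.0, adj_h), 2))  # 1.26
```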

  9. Developmental Eye Movement (DEM) Test Norms for Mandarin Chinese-Speaking Chinese Children

    PubMed Central

    Tong, Meiling; Zhang, Min; Li, Tingting; Xu, Yaqin; Guo, Xirong; Hong, Qin; Chi, Xia

    2016-01-01

    The Developmental Eye Movement (DEM) test is commonly used as a clinical visual-verbal ocular motor assessment tool to screen and diagnose reading problems at the onset. No established norm exists for using the DEM test with Mandarin Chinese-speaking Chinese children. This study aims to establish the normative values of the DEM test for the Mandarin Chinese-speaking population in China; it also aims to compare the values with three other published norms for English-, Spanish-, and Cantonese-speaking Chinese children. A random stratified sampling method was used to recruit children from eight kindergartens and eight primary schools in the main urban and suburban areas of Nanjing. A total of 1,425 Mandarin Chinese-speaking children aged 5 to 12 years took the DEM test in Mandarin Chinese. A digital recorder was used to record the process. All of the subjects completed a symptomatology survey, and their DEM scores were determined by a trained tester. The scores were computed using the formula in the DEM manual, except that the “vertical scores” were adjusted by taking the vertical errors into consideration. The results were compared with the three other published norms. In our subjects, a general decrease with age was observed for the four eye movement indexes: vertical score, adjusted horizontal score, ratio, and total error. For both the vertical and adjusted horizontal scores, the Mandarin Chinese-speaking children completed the tests much more quickly than the norms for English- and Spanish-speaking children. However, the same group completed the test slightly more slowly than the norms for Cantonese-speaking children. The differences in the means were significant (P<0.001) in all age groups. For several ages, the scores obtained in this study were significantly different from the reported scores of Cantonese-speaking Chinese children (P<0.005). Compared with English-speaking children, only the vertical score of the 6-year-old group, the vertical

  10. Zusatz- und Weiterqualifikation nach dem Studium

    NASA Astrophysics Data System (ADS)

    Domnick, Ivonne

    Once the bachelor's degree is completed, the question of further qualification arises. Besides entering professional life, a master's program may add further decisive bonus points to the CV. With additional qualifications from outside one's own field, such as business administration or marketing, it is easier for natural scientists to get started in professional life. Many employers particularly like to see a doctorate in natural scientists; here it should be weighed carefully whether it can be completed within a given time span. Even after starting a job, the doctorate can, under certain circumstances, still be obtained later. Continuing education alongside the job, part time or by distance learning, is likewise possible. In addition, many private providers offer courses lasting several weeks or months in which basic business administration skills can be acquired.

  11. The Evaluation of Flammability Properties Regarding Testing Methods

    NASA Astrophysics Data System (ADS)

    Osvaldová, Linda Makovická; Gašpercová, Stanislava

    2015-12-01

    In this paper we compare historical methods with current methods for assessing the flammability characteristics of materials, especially wood, wood components, and wooden buildings. The European Union is currently harmonizing the relevant standards across its member countries, aiming at a single concept for evaluating flammability properties so that each country applies the same standard level when evaluating materials with respect to flammability. We focus mainly on improving the evaluation methods for the flammability characteristics of materials used in the building industry, and we present examples of different assessment approaches and their test methods from the perspective of fire prevention, comparing the older testing of materials to fire according to the STN, BS, and DIN methods with the newer methods of evaluating flammability properties according to EU standards, before and after flashover.

  12. Animal Methods for Evaluating Forage Quality

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Numerous methods are available that employ animals in the assessment of forage quality. Some of these procedures provide information needed to address very specific goals (e.g., monitoring protein adequacy), some serve as useful contributors to the efforts to accurately predict nutritive value, wher...

  13. Evaluation of Electrochemical Methods for Electrolyte Characterization

    NASA Technical Reports Server (NTRS)

    Heidersbach, Robert H.

    2001-01-01

    This report documents summer research efforts in an attempt to develop an electrochemical method of characterizing electrolytes. The ultimate objective of the characterization would be to determine the composition and corrosivity of Martian soil. Results are presented using potentiodynamic scans, Tafel extrapolations, and resistivity tests in a variety of water-based electrolytes.

  14. Test methods for evaluating reformulated fuels

    SciTech Connect

    Croudace, M.C.

    1994-12-31

    The US Environmental Protection Agency (EPA) introduced regulations in the 1989 Clean Air Act Amendment governing the reformulation of gasoline and diesel fuels to improve air quality. These statutes drove the need for a fast and accurate method for analyzing product composition, especially aromatic and oxygenate content. The current method, gas chromatography, is slow, expensive, not portable, and requires a trained chemist to perform the analysis. The new mid-infrared spectroscopic method uses light to identify and quantify the different components in fuels. Each individual fuel component absorbs a specific wavelength of light depending on the molecule's unique chemical structure. The quantity of light absorbed is proportional to the concentration of that fuel component in the mixture. The mid-infrared instrument has significant advantages; it is easy to use, rugged, portable, fully automated, and cost effective. It can be used to measure multiple oxygenate or aromatic components in unknown fuel mixtures. Regulatory agencies have begun using this method in field compliance testing; petroleum refiners and marketers use it to monitor compliance, product quality, and blending accuracy.

  15. EVALUATION OF POHC AND PIC SCREENING METHODS

    EPA Science Inventory

    A recurring theme in environmental work is the need to characterize emissions to the maximum extent at the minimum cost. Unfortunately, many projects have been carried out In the past with little thought or planning concerning the optimum application of analytical methods availab...

  16. Performance Evaluation Methods for Assistive Robotic Technology

    NASA Astrophysics Data System (ADS)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  17. The Discrepancy Evaluation Model: A Systematic Approach for the Evaluation of Career Planning and Placement Programs.

    ERIC Educational Resources Information Center

    Buttram, Joan L.; Covert, Robert W.

    The Discrepancy Evaluation Model (DEM), developed in 1966 by Malcolm Provus, provides information for program assessment and program improvement. Under the DEM, evaluation is defined as the comparison of an actual performance to a desired standard. The DEM embodies five stages of evaluation based upon a program's natural development: program…

  18. Rockslide and Impulse Wave Modelling in the Vajont Reservoir by DEM-CFD Analyses

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Utili, S.; Crosta, G. B.

    2016-06-01

    This paper investigates the generation of hydrodynamic water waves due to rockslides plunging into a water reservoir. Quasi-3D DEM analyses in plane strain by a coupled DEM-CFD code are adopted to simulate the rockslide from its onset to the impact with the still water and the subsequent generation of the wave. The employed numerical tools and the upscaling of hydraulic properties allow predicting a physical response in broad agreement with the observations, notwithstanding the assumptions and characteristics of the adopted methods. The results obtained by the DEM-CFD coupled approach are compared to those published in the literature and those presented by Crosta et al. (Landslide spreading, impulse waves and modelling of the Vajont rockslide. Rock mechanics, 2014) in a companion paper obtained through an ALE-FEM method. Analyses performed along two cross sections are representative of the limit conditions of the eastern and western slope sectors. The maximum average rockslide velocity and the water wave velocity reach ca. 22 and 20 m/s, respectively. The maximum computed run-up amounts to ca. 120 and 170 m for the eastern and western lobe cross sections, respectively. These values are reasonably similar to those recorded during the event (i.e. ca. 130 and 190 m, respectively). Therefore, the overall study lays out a possible DEM-CFD framework for modelling the generation of the hydrodynamic wave due to the impact of a rapidly moving rockslide or rock-debris avalanche.

  19. Extraction of Hydrological Proximity Measures from DEMs using Parallel Processing

    SciTech Connect

    Tesfa, Teklu K.; Tarboton, David G.; Watson, Daniel W.; Schreuders, Kimberly A.; Baker, Matthew M.; Wallace, Robert M.

    2011-12-01

    Land surface topography is one of the most important terrain properties that impact hydrological, geomorphological, and ecological processes active on a landscape. In our previous efforts to develop a soil depth model based upon topographic and land cover variables, we extracted a set of hydrological proximity measures (HPMs) from a Digital Elevation Model (DEM) as potential explanatory variables for soil depth. These HPMs may also have other, more general modeling applicability in hydrology, geomorphology and ecology, and so are described here from a general perspective. The HPMs we derived are variations of the distance up to ridge points (cells with no incoming flow) and variations of the distance down to stream points (cells with a contributing area greater than a threshold), following the flow path. These HPMs were computed using the D-infinity flow model, which apportions flow between adjacent neighbors based on the direction of steepest downward slope on the eight triangular facets constructed in a 3 x 3 grid cell window using the center cell and each pair of adjacent neighboring grid cells in turn. The D-infinity model typically results in multiple flow paths between two points on the topography, so distances may be computed as the minimum, maximum or average of the individual flow paths. In addition, each of the HPMs is calculated vertically, horizontally, and along the land surface. Previously, these HPMs were calculated using recursive serial algorithms which suffered from stack overflow problems when used to process large datasets, limiting the size of DEMs that could be analyzed using that method to approximately 7000 x 7000 cells. To overcome this limitation, we developed a message passing interface (MPI) parallel approach for calculating these HPMs. The parallel algorithms of the HPMs spatially partition the input grid into stripes which are each assigned to separate processes for computation. Each of those processes then uses a
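The minimum/maximum/average distance-down variants over multiple D-infinity flow paths can be sketched on a toy flow network. The network, flow proportions, and per-link distances below are invented for illustration, and the recursive form mirrors the serial algorithms the abstract describes rather than the MPI version.

```python
# Toy flow network standing in for D-infinity routing: each cell sends
# fractions of its flow to downslope neighbours, each link carrying a
# flow-path distance. "D" is the stream cell. All values are invented.
flow = {
    "A": [("B", 0.7, 10.0), ("C", 0.3, 14.1)],
    "B": [("D", 1.0, 10.0)],
    "C": [("D", 1.0, 10.0)],
    "D": [],
}

def distance_down(cell, reducer):
    """Flow-path distance from `cell` down to the stream, reduced over
    the multiple D-infinity paths with `reducer` (min or max)."""
    links = flow[cell]
    if not links:
        return 0.0
    return reducer(d + distance_down(nbr, reducer) for nbr, _, d in links)

def distance_down_avg(cell):
    # flow-proportion-weighted average over the multiple downslope paths
    links = flow[cell]
    if not links:
        return 0.0
    return sum(p * (d + distance_down_avg(nbr)) for nbr, p, d in links)

print(distance_down("A", min))            # 20.0
print(round(distance_down("A", max), 1))  # 24.1
print(round(distance_down_avg("A"), 2))   # 21.23
```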

  20. Synergy of Image and Digital Elevation Models (DEMS) Information for Virtual Reality

    NASA Astrophysics Data System (ADS)

    Maire, C.; Datcu, M.

    2004-09-01

    In the framework of 3D visualization and real-time rendering of large remote sensing image databases, several signal processing techniques are presented and evaluated to filter/enhance SAR Digital Elevation Models (DEMs). Through the SRTM DEM, the interest of InSAR data for such applications is illustrated. A nonstationary Bayesian filter is presented that removes noise and small artefacts pervading the SAR DEM while preserving structures and information content. The results obtained are very good; nevertheless, large artefacts cannot be filtered and some remain. Therefore, image information has to be inserted to produce more realistic views. This second step is done by using a segmentation algorithm on the image data. Through a topology analysis, the extracted objects are classified and stored in a tree structure that describes the topologic relations between the objects and reflects their interdependencies. An interactive learning procedure is carried out through a Graphical User Interface to link the signal classes to the semantic ones, i.e. to include human knowledge in the system. The selected information, in the form of objects, is merged/fused into the DEM by assigning regularisation constraints.

  1. Evaluations of Three Methods for Remote Training

    NASA Technical Reports Server (NTRS)

    Woolford, B.; Chmielewski, C.; Pandya, A.; Adolf, J.; Whitmore, M.; Berman, A.; Maida, J.

    1999-01-01

    Long duration space missions require a change in training methods and technologies. For Shuttle missions, crew members could train for all the planned procedures and carry documentation of planned procedures for a variety of contingencies. As International Space Station (ISS) missions of three months or longer are carried out, many more tasks will need to be performed for which little or no training was received prior to launch. Eventually, exploration missions will last several years, and communications with Earth will have long time delays or be impossible at times. This series of three studies was performed to identify the advantages and disadvantages of three types of training for self-instruction: video-conferencing, multimedia, and virtual reality. Each study compared two training methods on two different types of tasks. In two of the studies, the subjects were in an isolated, confined environment analogous to space flight; the third study was performed in a laboratory.

  2. Evaluation of toothbrush disinfection via different methods.

    PubMed

    Basman, Adil; Peker, Ilkay; Akca, Gulcin; Alkurt, Meryem Toraman; Sarikir, Cigdem; Celik, Irem

    2016-01-01

    The aim of this study was to compare the efficacy of using a dishwasher or different chemical agents, including 0.12% chlorhexidine gluconate, 2% sodium hypochlorite (NaOCl), a mouthrinse containing essential oils and alcohol, and 50% white vinegar, for toothbrush disinfection. Sixty volunteers were divided into five experimental groups and one control group (n = 10). Participants brushed their teeth using toothbrushes with standard bristles, and they disinfected the toothbrushes according to instructed methods. Bacterial contamination of the toothbrushes was compared between the experimental groups and the control group. Data were analyzed by Kruskal-Wallis and Duncan's multiple range tests, with 95% confidence intervals for multiple comparisons. Bacterial contamination of toothbrushes from individuals in the experimental groups differed from those in the control group (p < 0.05). The most effective method for elimination of all tested bacterial species was 50% white vinegar, followed in order by 2% NaOCl, mouthrinse containing essential oils and alcohol, 0.12% chlorhexidine gluconate, dishwasher use, and tap water (control). The results of this study show that the most effective method for disinfecting toothbrushes was submersion in 50% white vinegar, which is cost-effective, easy to access, and appropriate for household use. PMID:26676193

  3. Evaluation criteria and test methods for electrochromic windows

    SciTech Connect

    Czanderna, A.W. ); Lampert, C.M. )

    1990-07-01

    This report summarizes the test methods used for evaluating electrochromic (EC) windows, reviews what is known about the degradation of their performance, and recommends methods and procedures for advancing EC windows for building applications. 77 refs., 13 figs., 6 tabs.

  4. Method of evaluating subsurface fracturing operations

    SciTech Connect

    Soliman, M.Y.

    1989-06-06

    This patent describes a method of determining parameters of a subsurface operation fracturing an earth formation, comprising: fracturing the formation with a fracturing fluid; determining a first pressure decline value representative of the observed pressure decline of the fractured formation over a time interval, the first pressure decline value being functionally related to the properties of the fracturing fluid during the fracturing of the formation; determining a second pressure decline value representative of the pressure decline that would have been observed if the fracturing fluid were incompressible; and determining the parameters of the fracturing operation in response to the pressure decline values.

  5. Organic ion exchange resin separation methods evaluation

    SciTech Connect

    Witwer, K.S.

    1998-05-27

    This document describes testing to find effective methods to separate Organic Ion Exchange Resin (OIER) from a sludge simulant. This task supports a comprehensive strategy for treatment and processing of K-Basin sludge. The simulant to be used resembles sludge that has accumulated in the 105KE and 105KW Basins in the 1OOK area of the Hanford Site. The sludge is an accumulation of fuel element corrosion products, organic and inorganic ion exchange materials, canister gasket materials, iron and aluminum corrosion products, sand, dirt, and other minor amounts of organic matter.

  6. Lava emplacements at Shiveluch volcano (Kamchatka) from June 2011 to September 2014 observed by TanDEM-X SAR-Interferometry

    NASA Astrophysics Data System (ADS)

    Heck, Alexandra; Kubanek, Julia; Westerhaus, Malte; Gottschämmer, Ellen; Heck, Bernhard; Wenzel, Friedemann

    2016-04-01

    As part of the Ring of Fire, Shiveluch volcano is one of the largest and most active volcanoes on the Kamchatka Peninsula. During the Holocene, only the southern part of the Shiveluch massif was active. Since the last Plinian eruption in 1964, the activity of Shiveluch has been characterized by periods of dome growth and explosive eruptions. The recent active phase began in 1999 and continues until today. Owing to the special conditions at active volcanoes, such as smoke development, the danger of explosions or lava flows, as well as poor weather conditions and inaccessible terrain, it is difficult to observe the interaction between dome growth, dome destruction, and explosive eruptions at regular intervals. Consequently, a reconstruction of the eruption processes is hardly possible, though important for a better understanding of the eruption mechanism as well as for hazard forecasting and risk assessment. A new approach is provided by the bistatic radar data acquired by the TanDEM-X satellite mission. This mission is composed of two nearly identical satellites, TerraSAR-X and TanDEM-X, flying in a close helix formation. On one hand, the radar signals penetrate clouds and, in part, vegetation and snow, given the average wavelength of about 3.1 cm. On the other hand, in comparison with conventional InSAR methods, the bistatic radar mode has the advantage that there are no difficulties due to temporal decorrelation. By interferometric evaluation of the simultaneously recorded SAR images, it is possible to calculate high-resolution digital elevation models (DEMs) of Shiveluch volcano and its surroundings. Furthermore, the short recurrence interval of 11 days allows time series of DEMs to be generated, from which volumetric changes of the dome and of lava flows, as well as lava effusion rates, can be determined. Here, this method is applied to Shiveluch volcano using data acquired between June 2011 and September 2014. 
Although Shiveluch has a fissured topography with steep slopes
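    The DEM-differencing step described above reduces to integrating the elevation change between two epochs over the grid and dividing by the elapsed time. The following is a minimal sketch under the assumption of two co-registered DEMs on the same grid; the function names, grid posting, and toy values are illustrative and not part of the TanDEM-X processing chain.

    ```python
    import numpy as np

    def volume_change(dem_t0, dem_t1, cell_area_m2):
        """Net volume change (m^3) between two co-registered DEMs."""
        dh = dem_t1 - dem_t0                  # per-cell elevation change (m)
        return np.nansum(dh) * cell_area_m2   # integrate over the grid

    def mean_effusion_rate(dv_m3, dt_days):
        """Time-averaged effusion rate (m^3/s) over the acquisition interval."""
        return dv_m3 / (dt_days * 86400.0)

    # Toy example: a 3x3 grid with 10 m posting and uniform 2 m of dome growth
    dem_a = np.zeros((3, 3))
    dem_b = dem_a + 2.0
    dv = volume_change(dem_a, dem_b, cell_area_m2=10.0 * 10.0)   # 1800 m^3
    rate = mean_effusion_rate(dv, dt_days=11.0)                  # one repeat cycle
    print(dv, rate)
    ```

    In practice the differenced DEMs would first need co-registration and masking of decorrelated or cloud-free but layover-affected areas; `np.nansum` simply skips cells flagged as no-data.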

  7. Mixing equilibrium in two-density fluidized beds by DEM

    NASA Astrophysics Data System (ADS)

    Di Renzo, A.; Di Maio, F. P.

    2010-05-01

    Interaction of fluid and granular flows in dense two-phase systems is responsible for the significantly different behavior of units used in the chemical industry, such as fluidized beds. During gas fluidization of a binary mixture of solids differing in density, the momentum exchange is such that the continuous mixing action of the upward fluid flow counteracts the natural tendency of the two (fluidized) solids to segregate, with the heavier component fully settling at the bottom of the bed. In the present work the complex hydrodynamics of two-density gas-fluidized beds is studied by means of a DEM-CFD computational approach, combining the discrete element method (DEM) with a solution of the locally averaged equations of motion (CFD). The model is first validated against experimental data and then used to investigate the role of gas velocity versus the density ratio of the two components in determining the distribution of the components in the system. It is shown first that a unique equilibrium composition profile is reached, independent of the initial arrangement of the solids. Then, numerical simulations are used to find the equilibrium conditions of mixing/segregation as a function of the gas velocity in excess of the minimum fluidization velocity of the heavier component and of the density ratio of the two solid species. Finally, a mixing map on the gas velocity-density ratio plane is reconstructed by plotting iso-mixing lines, which show quantitatively how conditions ranging from full mixing to fully segregated components are obtained.
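    The equilibrium composition profiles and iso-mixing lines described above presuppose some scalar measure of mixing computed from the simulated bed. The abstract does not name one, but a common choice for binary beds is the Lacey index (0 = fully segregated, 1 = randomly mixed), computed from the variance of the heavy-species fraction across horizontal layers. The sketch below assumes per-particle heights and species labels are available from the DEM-CFD output; all names and sample data are illustrative.

    ```python
    import numpy as np

    def lacey_index(heights, is_heavy, n_layers=10):
        """Lacey mixing index from per-particle heights and species flags."""
        edges = np.linspace(heights.min(), heights.max(), n_layers + 1)
        # Assign each particle to a horizontal layer (clip the topmost particle in)
        idx = np.clip(np.digitize(heights, edges) - 1, 0, n_layers - 1)
        # Heavy-species fraction in each non-empty layer
        fracs = np.array([is_heavy[idx == k].mean()
                          for k in range(n_layers) if np.any(idx == k)])
        p = is_heavy.mean()                   # overall heavy fraction
        s2 = np.var(fracs)                    # observed composition variance
        s2_seg = p * (1.0 - p)                # fully segregated limit
        n_per = len(heights) / len(fracs)     # mean particles per layer
        s2_mix = s2_seg / n_per               # random-mixture limit
        return (s2_seg - s2) / (s2_seg - s2_mix)

    # Toy check: 1000 particles stacked in height order
    h = np.arange(1000) / 1000.0
    segregated = h < 0.5                      # all heavy particles at the bottom
    alternating = np.arange(1000) % 2 == 0    # heavy/light interleaved
    print(lacey_index(h, segregated), lacey_index(h, alternating))
    ```

    Iso-mixing lines on the gas velocity-density ratio plane could then be drawn by evaluating such an index over a grid of simulation runs and contouring the result.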

  8. Mixed Methods and Credibility of Evidence in Evaluation

    ERIC Educational Resources Information Center

    Mertens, Donna M.; Hesse-Biber, Sharlene

    2013-01-01

    We argue for a view of credible evidence that is multidimensional in philosophical and methodological terms. We advocate for the importance of deepening the meaning of credible evaluation practice and findings by bringing multiple philosophical and theoretical lenses to the evaluation process as a basis for the use of mixed methods in evaluation,…

  9. Household batteries: Evaluation of collection methods

    SciTech Connect

    Seeberger, D.A.

    1992-12-31

    While it is difficult to prove that a specific material is causing contamination in a landfill, tests have been conducted at waste-to-energy facilities that indicate that household batteries contribute significant amounts of heavy metals to both air emissions and ash residue. Hennepin County, MN, used a dual approach for developing and implementing a special household battery collection. Alternative collection methods were examined; test collections were conducted. The second phase examined operating and disposal policy issues. This report describes the results of the grant project, moving from a broad examination of the construction and content of batteries, to a description of the pilot collection programs, and ending with a discussion of variables affecting the cost and operation of a comprehensive battery collection program. Three out-of-state companies (PA, NY) were found that accept spent batteries; difficulties in reclaiming household batteries are discussed.
