Science.gov

Sample records for dems method evaluation

  1. SRTM DEM-Aided DEM Extraction Method for Island and Reef

    NASA Astrophysics Data System (ADS)

    Chen, X. W.; Zhao, C.; Guo, H. T.; Lin, Y. Z.; Yu, D. H.

    2017-09-01

    An SRTM DEM-aided DEM extraction method for islands and reefs is proposed to solve the problem of island and reef DEM extraction from satellite imagery. The SRTM DEM is fully integrated into this method: it provides initial elevations for the DEM and also marks the sea-area points, so as to avoid the adverse effect of sea-area imagery on DEM extraction. When determining the elevations of grid points, only the valid land-area points (VLPs) are taken into account. Starting from the initial elevations, the image coordinates of VLPs in multi-view images are determined and precise coordinates of conjugate points are obtained by least-squares matching; the ground coordinates of VLPs are then acquired by forward intersection. Finally, the elevations of VLPs are determined from these object-space points through data interpolation, and the sea-area points are set to a uniform value. Experimental results show that the method effectively solves the problem of island and reef DEM extraction: it extracts DEMs from island and reef satellite images regardless of the land-area proportion, and islands and reefs are extracted completely. The accuracy of the extracted DEM improves as DEM resolution increases; when the resolution is relatively high, the accuracy is consistent with the SRTM DEM. The computational efficiency depends on the land-area proportion and the DEM resolution.
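
    A minimal sketch of the final gridding step under stated assumptions, not the authors' implementation: sea cells are flagged from the SRTM tile (here simply elevation <= 0, a crude stand-in for the paper's sea-area marking), VLP object-space points are interpolated over land cells only, and sea cells receive one uniform value. The array names and toy data are hypothetical, and scipy is assumed to be available.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    def grid_island_dem(srtm, vlp_xyz, sea_value=0.0):
        """Interpolate valid land-area point (VLP) elevations onto the grid
        and assign sea cells a uniform value. Sea cells are taken from the
        SRTM tile (here: elevation <= 0)."""
        ny, nx = srtm.shape
        yy, xx = np.mgrid[0:ny, 0:nx]
        sea = srtm <= 0.0                      # crude sea mask from SRTM
        dem = np.full_like(srtm, sea_value, dtype=float)
        land = ~sea
        # interpolate VLP ground points (col, row, elevation) over land cells
        dem[land] = griddata(vlp_xyz[:, :2], vlp_xyz[:, 2],
                             (xx[land], yy[land]), method='linear',
                             fill_value=sea_value)
        return dem

    # usage: a 50x50 SRTM tile and 200 random VLPs (synthetic data)
    rng = np.random.default_rng(0)
    srtm = rng.uniform(-5, 100, (50, 50))
    vlps = np.column_stack([rng.uniform(0, 49, (200, 2)),
                            rng.uniform(0, 100, 200)])
    print(grid_island_dem(srtm, vlps).shape)
    ```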

  2. Performance Evaluation of Four DEM-Based Fluvial Terrace Mapping Methods Across Variable Geomorphic Settings: Application to the Sheepscot River Watershed, Maine

    NASA Astrophysics Data System (ADS)

    Hopkins, A. J.; Snyder, N. P.

    2014-12-01

    Fluvial terraces are utilized in geomorphic studies as recorders of land-use, climate, and tectonic history. Advances in digital topographic data, such as high-resolution digital elevation models (DEMs) derived from airborne lidar surveys, have promoted the development of several methods for extracting terraces from DEMs based on their characteristic morphology. The post-glacial landscape of the Sheepscot River watershed, Maine, where strath and fill terraces record Pleistocene deglaciation, Holocene eustatic forcing, and Anthropocene land-use change, was selected for a comparison of terrace mapping methodologies. At four study sites within the watershed, terraces were manually mapped to facilitate the comparison of fully and semi-automated DEM-based mapping procedures, including: (1) edge detection functions in Matlab, (2) feature classification algorithms developed by Wood (1996), (3) spatial relationships between interpreted terraces and surrounding topography (Walter et al., 2007), and (4) the TerEx terrace mapping toolbox developed by Stout and Belmont (2014). Each method was evaluated for its accuracy and ease of implementation. The four study sites vary in longitudinal slope (0.1% - 5%), channel width (<5 m - 30 m), relief of the surrounding landscape (15 m - 75 m), type and density of surrounding land use, and mapped surficial geologic units. In general, all methods overestimate terrace areas (average predicted area 136% of the manually defined area). Surrounding topographic relief appears to exert the greatest control on mapping accuracy, with the most accurate results (92% of terrace area mapped by the Walter et al., 2007 method) achieved where the river valley was most confined by adjacent hillslopes. Accuracy decreased for study sites surrounded by a low-relief landscape, where the most accurate results were achieved by the TerEx toolbox (Stout and Belmont, 2014; predicted areas were 45% and 89% of manual delineations).

  3. Which DEM is the best for glaciology? Evaluation of global-scale DEM products

    NASA Astrophysics Data System (ADS)

    Nagai, Hiroto; Tadono, Takeo

    2017-04-01

    Digital elevation models (DEMs) are fundamental geospatial data for studying glacier distribution, changes, dynamics, mass balance and various geomorphological conditions. This study evaluates the latest global-scale free DEMs in order to clarify their relative strengths and weaknesses for glaciological use. Three such DEMs are now available: the 1-arcsec product of the Shuttle Radar Topographic Mission (SRTM1), the second version of the Global Digital Elevation Model of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER GDEM2), and the first resampled dataset acquired by the Advanced Land Observing Satellite, ALOS World 3D-30m (AW3D30). These DEMs share common specifications: near-global coverage (<60° N/S for SRTM1), free download via the internet, and 1-arcsec (~30 m) pixel spacing. We carried out a quantitative accuracy evaluation and a spatial analysis of missing-data ("void") distribution for these DEMs. Elevation values of the three DEMs are validated at check points (CPs), where elevation was measured by the Geospatial Information Authority of Japan, in (A) the Japan Alps (steep glaciated mountains), (B) Mt. Fuji (a monotonous hillslope), and (C) the Tone river basin (a flat plain). In all study sites, AW3D30 has the smallest errors against the CP elevations (A: -6.1±8.6 m, B: +0.1±3.9 m, C: +0.1±2.5 m, as the mean and standard deviation of elevation differences). SRTM1 is the second most accurate (A: -17.8±16.3 m, B: +1.3±6.4 m, C: +0.1±3.1 m), followed by ASTER GDEM2 (A: -13.9±20.8 m, B: -3.9±10.0 m, C: +4.3±3.8 m). These accuracy differences among the DEMs are greater in steeper terrain (A>B>C). In the Tone river basin, SRTM1 has accuracy equivalent to AW3D30. The high resolution (2.5 m) of the original stereo-pair images for AW3D30 (i.e. ALOS PRISM imagery) contributes to its superior absolute accuracy. Glaciers on rather flat terrain are usually distributed at higher latitudes (e.g. Antarctica and Greenland
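
    The per-site statistics quoted above (mean ± standard deviation of DEM-minus-checkpoint differences, optionally with RMSE) are straightforward to reproduce. A minimal sketch with made-up numbers, not the study's data:

    ```python
    import numpy as np

    def elevation_error_stats(dem_at_cp, cp_elev):
        """Mean, sample standard deviation and RMSE of DEM-minus-checkpoint
        elevation differences, the statistic reported per site above."""
        d = np.asarray(dem_at_cp, float) - np.asarray(cp_elev, float)
        return d.mean(), d.std(ddof=1), float(np.sqrt((d ** 2).mean()))

    # usage with made-up elevations (not the paper's data)
    dem_h = [312.4, 87.9, 45.1, 230.0]
    gps_h = [310.0, 88.3, 44.8, 231.2]
    mean, std, rmse = elevation_error_stats(dem_h, gps_h)
    print(f"{mean:+.1f} ± {std:.1f} m (RMSE {rmse:.1f} m)")
    ```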

  4. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs), or 'jökulhlaups', necessarily involves the propagation of large and often stochastic uncertainties throughout the source-to-impact process chain. Since flood routing is primarily a function of the underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs of different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results using the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those using the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those using a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating the potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
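
    The paper distributes MC-LCP itself in its supplementary information; the sketch below is only a guess at the general idea, not the authors' code: perturb the DEM with Gaussian noise scaled to an assumed vertical error, route a least cost path through each realisation, and accumulate per-cell path frequencies into an uncertainty continuum. The cost function (one plus cell elevation) and all parameters are illustrative assumptions.

    ```python
    import heapq
    import numpy as np

    def least_cost_path(dem, start, end):
        """4-neighbour Dijkstra: stepping into a cell costs 1 plus its
        elevation, so the optimal flood path prefers low ground."""
        ny, nx = dem.shape
        dist = np.full(dem.shape, np.inf)
        prev = {}
        dist[start] = 0.0
        pq = [(0.0, start)]
        while pq:
            d, (i, j) = heapq.heappop(pq)
            if (i, j) == end:
                break
            if d > dist[i, j]:
                continue                       # stale queue entry
            for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                if 0 <= ni < ny and 0 <= nj < nx:
                    nd = d + 1.0 + dem[ni, nj]
                    if nd < dist[ni, nj]:
                        dist[ni, nj] = nd
                        prev[(ni, nj)] = (i, j)
                        heapq.heappush(pq, (nd, (ni, nj)))
        path, node = [end], end
        while node != start:                   # walk predecessors back
            node = prev[node]
            path.append(node)
        return path

    def mc_lcp(dem, start, end, sigma=5.0, n=100, seed=0):
        """Monte Carlo wrapper in the spirit of MC-LCP: perturb the DEM with
        Gaussian noise (sigma = assumed DEM vertical error, in metres) and
        count how often each cell lies on the least cost path."""
        rng = np.random.default_rng(seed)
        freq = np.zeros(dem.shape)
        for _ in range(n):
            noisy = np.clip(dem + rng.normal(0.0, sigma, dem.shape), 0.0, None)
            for cell in least_cost_path(noisy, start, end):
                freq[cell] += 1
        return freq / n    # per-cell continuum, usable as relative risk

    dem = np.random.default_rng(1).uniform(0, 50, (30, 30))
    print(mc_lcp(dem, (0, 0), (29, 29)).max())
    ```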

  5. Evaluating Error of LiDAR-Derived DEM Interpolation for Vegetation Areas

    NASA Astrophysics Data System (ADS)

    Ismail, Z.; Khanan, M. F. Abdul; Omar, F. Z.; Rahman, M. Z. Abdul; Mohd Salleh, M. R.

    2016-09-01

    Light Detection and Ranging (LiDAR) data are a source for deriving digital terrain models, and the resulting Digital Elevation Model (DEM) is usable within a Geographical Information System (GIS). The aim of this study is to evaluate the accuracy of LiDAR-derived DEMs generated with different interpolation methods and slope classes. Initially, the study area is divided into three slope classes: (a) slope class one (0° - 5°), (b) slope class two (6° - 10°) and (c) slope class three (11° - 15°). Secondly, each slope class is tested using three distinct interpolation methods: (a) Kriging, (b) Inverse Distance Weighting (IDW) and (c) Spline. Next, accuracy assessment is done based on field-surveyed tachymetry data. The findings reveal that the overall Root Mean Square Error (RMSE) for Kriging provided the lowest value of 0.727 m for both 0.5 m and 1 m spatial resolutions in the oil palm area, followed by Spline with values of 0.734 m for 0.5 m spatial resolution and 0.747 m for 1 m spatial resolution. IDW provided the highest RMSE value of 0.784 m for both the 0.5 m and 1 m spatial resolutions. For the rubber area, Spline provided the lowest RMSE value of 0.746 m for 0.5 m spatial resolution and 0.760 m for 1 m spatial resolution. The highest RMSE for the rubber area was from IDW, with a value of 1.061 m for both spatial resolutions; Kriging gave an RMSE of 0.790 m for both spatial resolutions.
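
    Of the three interpolators compared, IDW is simple enough to sketch directly (Kriging and spline fitting would typically come from libraries such as pykrige or scipy). A minimal sketch of IDW plus the RMSE check against tachymetry check points, with synthetic coordinates standing in for the survey data:

    ```python
    import numpy as np

    def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
        """Plain inverse-distance-weighting interpolator: each query point
        gets a distance-weighted average of all known elevations."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
        w = 1.0 / (d ** power + eps)
        return (w * z_known).sum(axis=1) / w.sum(axis=1)

    def rmse(pred, obs):
        return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

    # usage: fit LiDAR ground points, score against tachymetry check points
    rng = np.random.default_rng(2)
    lidar_xy, lidar_z = rng.uniform(0, 100, (500, 2)), rng.uniform(0, 10, 500)
    check_xy, check_z = rng.uniform(0, 100, (50, 2)), rng.uniform(0, 10, 50)
    print("IDW RMSE:", rmse(idw(lidar_xy, lidar_z, check_xy), check_z))
    ```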

  6. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2010-11-01

    In the construction of a DEM, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modeling method based on surface theory. Presently, HASM is used only for scattered-point interpolation, so the work in this paper attempts to construct a DEM from characteristic terrain information, namely stream lines and scattered points, by the HASM method. The procedure consists of the following steps. Firstly, a TIN (Triangulated Irregular Network) is generated from the scattered points. Secondly, each segment of the stream lines is oriented to represent the flow direction, and a tree data structure (with parent, child and sibling nodes) is used to represent all the stream-line segments. A segment is a curve that does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlap or repetition, from the uppermost reaches to the lowermost reaches. From the stream lines' tree data structure, all possible WCF lines are enumerated, and the start and end points of each WCF line are predicted by searching the TIN. Thirdly, given a cell size, a 2-D matrix for the study region is built, and the cells traversed by the stream lines are assigned values by linear interpolation along each WCF line, as sketched below. Fourthly, all the valued cells traversed by the stream lines and those from the scattered points are gathered as known scattered sampling points, and HASM is then used to construct the final DEM. A case study on a typical plateau landform of China, the KongTong gully of the Dongzhi Plateau, Qingyang, Gansu province, is presented. The original data were manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines
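
    A toy version of step three under stated assumptions: one WCF polyline with known start and end elevations is sampled densely, each sample is dropped into its grid cell, and the cell receives an elevation interpolated linearly by distance along the line. Function names, cell size and coordinates are all hypothetical.

    ```python
    import numpy as np

    def rasterize_wcf(line_xy, z_start, z_end, cellsize, shape):
        """Value the grid cells traversed by one water-course-flow (WCF)
        line: elevations fall linearly from z_start to z_end with distance
        along the polyline (a much simplified step three)."""
        pts = np.asarray(line_xy, float)
        seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])          # chainage
        grid = np.full(shape, np.nan)
        # sample the polyline densely, drop each sample into its cell
        for t in np.linspace(0.0, s[-1], max(int(s[-1] / cellsize) * 4, 2)):
            k = min(np.searchsorted(s, t, side='right') - 1, len(seg) - 1)
            frac = (t - s[k]) / seg[k] if seg[k] > 0 else 0.0
            x, y = pts[k] + frac * (pts[k + 1] - pts[k])
            i, j = int(y // cellsize), int(x // cellsize)
            if 0 <= i < shape[0] and 0 <= j < shape[1]:
                grid[i, j] = z_start + (z_end - z_start) * t / s[-1]
        return grid    # valued cells feed HASM as extra sampling points

    g = rasterize_wcf([(0, 0), (40, 35)], 120.0, 100.0, 5.0, (10, 10))
    print(np.count_nonzero(~np.isnan(g)), "cells valued")
    ```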

  7. Feasibility Analysis of DEM Differential Method on Tree Height Assessment with TerraSAR-X/TanDEM-X Data

    NASA Astrophysics Data System (ADS)

    Zhang, Wangfei; Chen, Erxue; Li, Zengyuan; Feng, Qi; Zhao, Lei

    2016-08-01

    The DEM differential method is an effective and efficient way to assess forest tree height with polarimetric and interferometric technology; however, its accuracy depends on the accuracy of the interferometric results and of the reference DEM. TerraSAR-X/TanDEM-X, which established the first spaceborne bistatic interferometer, can provide highly accurate cross-track interferometric images globally, without inherent accuracy limitations such as temporal decorrelation and atmospheric disturbance. These characteristics give TerraSAR-X/TanDEM-X great potential for global or regional tree height assessment, which has been constrained by temporal decorrelation in traditional repeat-pass interferometry. Currently, in China, it is costly to collect highly accurate DEMs with LiDAR. At the same time, it is also difficult to get truly representative ground survey samples to test and verify the assessment results. In this paper, we analyze the feasibility of using TerraSAR-X/TanDEM-X data to assess forest tree height with currently free DEM data such as ASTER GDEM and archived in-situ ground data such as forest management inventory (FMI) data. First, the accuracies of ASTER GDEM and the forest management inventory data were assessed against the DEM and canopy height model (CHM) extracted from LiDAR data. The results show that the average elevation RMSE between ASTER GDEM and the LiDAR DEM is about 13 meters, but the two are highly correlated, with a correlation coefficient of 0.96. With a linear regression model, we can compensate ASTER GDEM and improve its accuracy to nearly that of the LiDAR DEM at the same scale. The correlation coefficient between FMI and CHM is 0.40; its accuracy can be improved by a linear regression model within 95% confidence intervals. After compensation of ASTER GDEM and FMI, we calculated the tree height at the Mengla test site with the DEM differential method. The results showed that the corrected ASTER GDEM can effectively improve the assessment accuracy
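
    A minimal sketch of the two operations described above, on synthetic surfaces rather than the paper's data: a linear regression compensates a biased ground DEM against a reference, and differencing an InSAR-style surface model against the corrected ground gives tree height.

    ```python
    import numpy as np

    def linear_compensate(reference, biased):
        """Fit reference ~ a*biased + b over co-located samples and return
        the corrected surface (the linear-regression compensation above)."""
        a, b = np.polyfit(biased.ravel(), reference.ravel(), 1)
        return a * biased + b

    # usage with synthetic surfaces: an InSAR-like DSM minus the corrected
    # ground DEM gives tree height (the DEM differential method)
    rng = np.random.default_rng(3)
    lidar_dtm = rng.uniform(800, 900, (100, 100))                # reference
    aster = 0.95 * lidar_dtm + 55 + rng.normal(0, 13, (100, 100))  # biased
    insar_dsm = lidar_dtm + rng.uniform(5, 25, (100, 100))       # canopy top
    ground = linear_compensate(lidar_dtm, aster)
    tree_height = insar_dsm - ground
    print(f"mean tree height: {tree_height.mean():.1f} m")
    ```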

  8. Stochastic Discrete Equation Method (sDEM) for two-phase flows

    SciTech Connect

    Abgrall, R.; Congedo, P.M.; Geraci, G.; Rodio, M.G.

    2015-10-15

    A new scheme for the numerical approximation of a five-equation model taking into account Uncertainty Quantification (UQ) is presented. In particular, the Discrete Equation Method (DEM) for the discretization of the five-equation model is modified for including a formulation based on the adaptive Semi-Intrusive (aSI) scheme, thus yielding a new intrusive scheme (sDEM) for simulating stochastic two-phase flows. Some reference test-cases are performed in order to demonstrate the convergence properties and the efficiency of the overall scheme. The propagation of initial conditions uncertainties is evaluated in terms of mean and variance of several thermodynamic properties of the two phases.

  9. GPU-based contouring method on grid DEM data

    NASA Astrophysics Data System (ADS)

    Tan, Liheng; Wan, Gang; Li, Feng; Chen, Xiaohui; Du, Wenlong

    2017-08-01

    This paper presents a novel method for generating contour lines from grid DEM data based on the programmable GPU pipeline. Previous contouring approaches often use the CPU to construct a finite element mesh from the raw DEM data and then extract contour segments from the elements; they also need a tracing or sorting strategy to generate the final continuous contours. These approaches can be CPU-intensive and time-consuming, and the generated contours can be unsmooth if the raw data are sparsely distributed. Unlike the CPU approaches, we employ the GPU's vertex shader to generate a triangular mesh with arbitrary user-defined density, in which the height of each vertex is calculated through a third-order Cardinal spline function. Then, in the same frame, segments are extracted from the triangles by the geometry shader and transferred to the CPU side with an internal order in the GPU's transform feedback stage. Finally, we propose a "Grid Sorting" algorithm to achieve continuous contour lines by traversing the segments only once. Our method uses multiple stages of the GPU pipeline for computation, generates smooth contour lines, and is significantly faster than the previous CPU approaches. The algorithm can be easily implemented with the OpenGL 3.3 API or higher on consumer-level PCs.
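
    The per-triangle extraction that the paper assigns to the geometry shader reduces to edge interpolation; a CPU-side sketch of that one step (vertices lying exactly on the contour level are ignored here for brevity):

    ```python
    import numpy as np

    def contour_segment(tri_xy, tri_z, level):
        """Find where a contour level crosses the three triangle edges by
        linear interpolation; return the segment endpoints, or None."""
        pts = []
        for a, b in ((0, 1), (1, 2), (2, 0)):
            za, zb = tri_z[a], tri_z[b]
            if (za - level) * (zb - level) < 0:    # edge straddles the level
                t = (level - za) / (zb - za)
                pts.append(tri_xy[a] + t * (tri_xy[b] - tri_xy[a]))
        return tuple(pts) if len(pts) == 2 else None

    tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    print(contour_segment(tri, np.array([0.0, 2.0, 1.0]), 0.5))
    ```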

  10. An efficient method for DEM-based overland flow routing

    NASA Astrophysics Data System (ADS)

    Huang, Pin-Chun; Lee, Kwan Tun

    2013-05-01

    The digital elevation model (DEM) is frequently used to represent watershed topographic features based on a raster or a vector data format, and has been widely linked with flow routing equations for watershed runoff simulation. In this study, a recursive formulation was encoded into the conventional kinematic- and diffusion-wave routing algorithms to permit a larger time increment, even when the Courant-Friedrichs-Lewy condition is violated. To meet the requirements of the recursive formulation, a novel routing sequence was developed to determine the cell-to-cell computational order for the DEM database. The routing sequence can be set either according to grid elevation in descending order for kinematic-wave routing, or according to the water stage of the grid in descending order for diffusion-wave routing. The recursive formulation for 1D runoff routing was first applied to a conceptual overland plane, and its precision was verified against an analytical solution. The proposed routing sequence with the recursive formulation was then applied to two mountain watersheds for 2D runoff simulations. The results showed that the efficiency of the proposed method was significantly superior to that of the conventional algorithm, especially when applied to a steep watershed.
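
    A minimal sketch of the kinematic-wave routing sequence as described: cells are visited from highest to lowest elevation, so upstream inflows are known before each cell is processed (the diffusion-wave variant would sort by water stage instead). The function and test grid are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    def routing_sequence(dem):
        """Cell processing order for kinematic-wave routing: sort all cells
        by elevation, highest first, and return their (row, col) indices."""
        order = np.dstack(np.unravel_index(
            np.argsort(dem, axis=None)[::-1], dem.shape))[0]
        return [tuple(map(int, ij)) for ij in order]

    dem = np.array([[9., 8., 7.],
                    [6., 5., 4.],
                    [3., 2., 1.]])
    print(routing_sequence(dem)[:3])   # -> [(0, 0), (0, 1), (0, 2)]
    ```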

  11. Numerical Simulation of High Velocity Impact Phenomenon by the Distinct Element Method (DEM)

    NASA Astrophysics Data System (ADS)

    Tsukahara, Y.; Matsuo, A.; Tanaka, K.

    2007-12-01

    Continuous-DEM (Distinct Element Method) for impact analysis is proposed in this paper. Continuous-DEM is based on the DEM and on ideas from continuum theory. Numerical simulations of impacts between a SUS 304 projectile and a concrete target have been performed using the proposed method. The results agreed quantitatively with the impedance matching method. The experimental elastic-plastic behavior, with compression and rarefaction waves under plate impact, was also qualitatively reproduced, matching results obtained with AUTODYN®.

  12. A New DEM Generalization Method Based on Watershed and Tree Structure

    PubMed Central

    Chen, Yonggang; Ma, Tianwu; Chen, Xiaoyin; Chen, Zhende; Yang, Chunju; Lin, Chenzhi; Shan, Ligang

    2016-01-01

    DEM generalization is the basis of multi-scale terrain representation and analysis, and the core of building a multi-scale geographic database. Thus, many researchers have studied both the theory and the methods of DEM generalization. This paper proposes a new terrain generalization method that extracts feature points based on a tree model built from the nested relationship of watershed characteristics. The paper used the 5 m resolution DEM of the Jiuyuan gully watersheds in the Loess Plateau as the original data and extracted the feature points in every single watershed to reconstruct the DEM. The paper achieved generalization from a 1:10000 DEM to a 1:50000 DEM by computing the best threshold, which is 0.06. In the last part of the paper, the height accuracy of the generalized DEM is analyzed by comparing it with some other classic methods, such as aggregation, resampling, and VIP, against the original 1:50000 DEM. The outcome shows that the method performs well. The method can choose the best threshold according to the target generalization scale to decide the density of the feature points in the watershed. Meanwhile, the method preserves the skeleton of the terrain, which can meet the needs of different levels of generalization. Additionally, through overlapped contour contrast, elevation statistical parameters, and slope and aspect analysis, we found that the W8D algorithm performed well and effectively in terrain representation. PMID:27517296

  13. Evaluation of DEM generation accuracy from UAS imagery

    NASA Astrophysics Data System (ADS)

    Santise, M.; Fornari, M.; Forlani, G.; Roncella, R.

    2014-06-01

    The growing use of UAS platforms for aerial photogrammetry comes with a new family of highly automated Computer Vision processing software expressly built to manage the peculiar characteristics of these image blocks. It is therefore of interest to photogrammetrists and professionals to find out whether the image orientation and DSM generation methods implemented in such software are reliable, and whether the DSMs and orthophotos are accurate. On a more general basis, it is interesting to figure out whether it is still worth applying the standard rules of aerial photogrammetry to the case of drones, achieving the same block strength and the same accuracies. With such goals in mind, a test area was set up at the University Campus in Parma. A large number of ground points was measured on natural as well as signalized points, to provide a comprehensive test field for checking the accuracy performance of different UAS systems. In the test area, points both at ground level and on building roofs were measured, in order to obtain well-distributed vertical control as well. Control points were set on different types of surfaces (buildings, asphalt, targets, grass fields and bumps); break lines were also employed. The paper presents the results of a comparison between two different surveys for DEM (Digital Elevation Model) generation, performed at 70 m and 140 m flying height, using a Falcon 8 UAS.

  14. Open-Source Digital Elevation Models (DEMs) Evaluation with GPS and LiDAR Data

    NASA Astrophysics Data System (ADS)

    Khalid, N. F.; Din, A. H. M.; Omar, K. M.; Khanan, M. F. A.; Omar, A. H.; Hamid, A. I. A.; Pa'suya, M. F.

    2016-09-01

    The Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), the Shuttle Radar Topography Mission (SRTM), and the Global Multi-resolution Terrain Elevation Data 2010 (GMTED2010) are freely available Digital Elevation Model (DEM) datasets for environmental modeling and studies. The spatial resolution and vertical accuracy of the DEM data source have a great influence on accuracy, specifically for inundation mapping. Most coastal inundation risk studies have used publicly available DEMs to estimate coastal inundation and the associated damage, especially to human populations, under sea-level rise. In this study, ground truth data from Global Positioning System (GPS) observations are compared with each DEM to evaluate its accuracy. The vertical accuracy of SRTM is better than that of ASTER GDEM and GMTED2010, with an RMSE of 6.054 m; SRTM also shows the strongest correlation with the ground truth, with a coefficient of determination of 0.912. For the coastal zone, a DEM based on an airborne light detection and ranging (LiDAR) dataset was used as ground truth for terrain height, and the LiDAR DEM was compared against the new SRTM DEM after applying a scale factor. The findings show that the accuracy of the new SRTM model can be improved by applying the scale factor: the RMSE is reduced to 0.503 m. Hence, this new model is the most suitable and meets the accuracy requirement for coastal inundation risk assessment using open-source data. The suitability of these datasets for further analysis in coastal management studies is vital for assessing areas potentially vulnerable to coastal inundation.

  15. A coupled DEM-CFD method for impulse wave modelling

    NASA Astrophysics Data System (ADS)

    Zhao, Tao; Utili, Stefano; Crosta, GiovanBattista

    2015-04-01

    Rockslides can be characterized by a rapid evolution, up to a possible transition into a rock avalanche, which can be associated with an almost instantaneous collapse and spreading. Different examples are available in the literature, but the Vajont rockslide is quite unique for its morphological and geological characteristics, as well as for the type of evolution and the availability of long-term monitoring data. This study advocates the use of a DEM-CFD framework for modelling the generation of hydrodynamic waves due to the impact of a rapidly moving rockslide or rock-debris avalanche. 3D DEM analyses in plane strain by a coupled DEM-CFD code were performed to simulate the rockslide from its onset to the impact with still water and the subsequent wave generation (Zhao et al., 2014). The predicted physical response is in broad agreement with the available observations. The numerical results are compared to those published in the literature and especially to Crosta et al. (2014). According to our results, the maximum computed run-up amounts to ca. 120 m and 170 m for the eastern and western lobe cross sections, respectively. These values are reasonably similar to those recorded during the event (ca. 130 m and 190 m, respectively). In these simulations, the slope mass is considered permeable, such that the toe region of the slope can move submerged in the reservoir and the impulse water wave can also flow back into the slope mass. However, the upscaling of grain size in the DEM model leads to an unrealistically high hydraulic conductivity, such that only a small amount of water is splashed onto the northern bank of the Vajont valley. The use of a high fluid viscosity and a coarse-grain model has shown the possibility of modelling both the slope and wave motions more realistically. However, more detailed slope and fluid properties, and the need for computational efficiency, should be considered in future research work. This aspect has also been

  16. Development and Evaluation of a Simple Measurement System Using Oblique Photos and DEM

    NASA Astrophysics Data System (ADS)

    Nonaka, H.; Sasaki, H.; Fujimaki, S.; Naruke, S.; Kishimoto, H.

    2016-06-01

    When a disaster occurs, we must grasp and evaluate the damage as soon as possible. We then try to estimate it from various kinds of photographs, such as surveillance camera imagery, satellite imagery, photographs taken from a helicopter and so on. Especially in the initial stage, a rapid rough estimate of the damage is more important than a lengthy detailed investigation. Sources of damage information include images taken by surveillance cameras, satellite sensors and helicopters. If we can measure targets in such imagery, we can estimate the length of a lava flow, the reach of cinders or a sediment volume in a volcanic eruption or landslide. Therefore, in order to measure such information quickly, we developed a simplified measurement system that uses these photographs. The system requires a DEM in addition to the photographs, but a previously acquired DEM can be used. Measuring an object requires only two steps. One is the determination of the position and orientation from which the photograph was shot; we determine these parameters using the DEM. The other is the measurement of the object in the photograph. In this paper, we describe the system and show the experimental results used to evaluate it. In the experiment we measured the top of Mt. Usu using the system's two measurement methods. The measurement took about one hour, and the differences between the results and airborne LiDAR data were less than 10 meters.

  17. Evaluation of terrain datasets for LiDAR data thinning and DEM generation for watershed delineation applications

    NASA Astrophysics Data System (ADS)

    Olivera, F.; Ferreira, C.; Djokic, D.

    2010-12-01

    Watershed delineation based on Digital Elevation Models (DEMs) is currently standard practice in hydrologic studies. Efforts to develop high-resolution DEMs continue, although the advantages of increased accuracy are partially offset by larger file sizes, difficulty in handling them, slow screen rendering and increased computational effort. Among these efforts, those based on Light Detection and Ranging (LiDAR) pose the problem that the interpolation techniques in commercially available GIS software packages (e.g., IDW, Spline, Kriging and TOPORASTER, among others) for developing DEMs from point elevations have difficulty processing large amounts of data. The terrain dataset is an alternative format for storing topographic data that intelligently decimates data points and creates simplified, yet for practical purposes equally accurate, DEMs or Triangular Irregular Networks (TINs). This study uses terrain datasets to evaluate the impact that the thinning method (i.e., window size and z-value), pyramid level and interpolation technique (linear or natural neighbor) used to create the DEMs have on the watersheds delineated from them. Two case studies were considered for assessing the effect of the different methods and techniques: dendritic topography in Williamson Creek, Austin, Texas, and deranged topography in Hillsborough County, Florida. The results were compared using three standardized error metrics that measure the accuracy of the watershed boundaries, as well as computational effort. For Williamson Creek (steeper terrain), point thinning during terrain creation and the choice of interpolation method did not affect the watershed delineation, while in Hillsborough County (flat terrain) the point-thinning method and interpolation technique strongly influenced the resulting delineation.

  18. Research on gas-solid flow characteristics based on the DEM method

    NASA Astrophysics Data System (ADS)

    Wang, Xueyao; Xiao, Yunhan

    2011-12-01

    Numerical simulation of gas-solid flow behavior in a rectangular fluidized bed is carried out three-dimensionally by the discrete element method (DEM). An Eulerian method and a Lagrangian method are employed for the gas phase and the solid phase, respectively. Collision forces among particles, impact forces between particles and walls, drag force, gravity, the Magnus lift force and the Saffman lift force are considered in the mathematical models, and a soft-sphere model is used to describe particle collisions. In addition, an Eulerian method is also used to model the solid phase for comparison with the DEM results. The flow patterns, particle mean velocities, particle diffusion and pressure drop of the bed under typical operating conditions are obtained. The results show that the DEM can describe detailed particle-level information, while the Euler-Euler method cannot capture micro-scale behavior. Regardless of the method used, the diffusion of particles increases with gas velocity. With the Euler-Euler method, however, the clustering and crushing of particles cannot be simulated, so the energy loss from particle collisions cannot be calculated and the predicted diffusion is larger. The DEM results also show that, as the carrying capacity of the gas strengthens, more and more particles are transported upward and a dense suspension upflow pattern can form, whereas the Euler-Euler results are not consistent with the real situation.
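
    The soft-sphere model mentioned above resolves collisions through particle overlap; a minimal sketch of its normal spring force only (damping, tangential friction and the drag/lift terms listed above are omitted, and the stiffness value is an arbitrary assumption):

    ```python
    import numpy as np

    def soft_sphere_normal_force(x1, x2, r1, r2, kn=1e4):
        """Linear soft-sphere contact: normal repulsion proportional to the
        overlap of two spheres; returns the force on particle 1."""
        d = np.linalg.norm(x2 - x1)
        overlap = r1 + r2 - d
        if overlap <= 0.0:
            return np.zeros(3)          # no contact
        n = (x2 - x1) / d               # unit normal from 1 toward 2
        return -kn * overlap * n        # pushes particle 1 away from 2

    # usage: two 1 mm spheres overlapping by 0.2 mm
    print(soft_sphere_normal_force(np.zeros(3),
                                   np.array([0.0018, 0.0, 0.0]),
                                   1e-3, 1e-3))
    ```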

  19. On the choice of a phase interchange strategy for a multiscale DEM-VOF method

    NASA Astrophysics Data System (ADS)

    Pozzetti, Gabriele; Peters, Bernhard

    2017-07-01

    In this work a novel multiscale DEM-VOF method is adopted to study three-phase flows. It consists of solving the fluid momentum, mass conservation and phase advection at a different scale from the fluid-particle coupling problem. This allows the VOF scheme to resolve smaller fluid structures than a classic DEM-VOF method and opens the possibility of adopting different volume interchange techniques. Two volume interchange techniques are described and compared for high and low particle concentration scenarios, and their respective computational costs are discussed.

  1. DEM-based Watershed Delineation: Comparison of Different Methods and Applications

    NASA Astrophysics Data System (ADS)

    Chu, X.; Zhang, J.; Tahmasebi Nasab, M.

    2015-12-01

    Digital elevation models (DEMs) are commonly used for large-scale watershed hydrologic and water quality modeling. With the aid of the latest LiDAR technology, submeter-scale DEM data are often available for many areas in the United States. Precise characterization of the detailed variations in surface microtopography using such high-resolution DEMs is crucial to the related watershed modeling. Various methods have been developed to delineate a watershed, including determination of flow directions and accumulations, identification of subbasin boundaries, and calculation of the relevant topographic parameters. The objective of this study is to examine different DEM-based watershed delineation methods by comparing their unique features and the discrepancies in their results. The study covers not only traditional watershed delineation methods but also a new puddle-based unit (PBU) delineation method. The specific topics and issues presented involve flow directions (the D8 single flow direction method vs. multi-direction methods), segmentation of stream channels, drainage systems (a single "depressionless" drainage network vs. a hierarchical depression-dominated drainage system), and hydrologic connectivity (static structural connectivity vs. dynamic functional connectivity). A variety of real topographic surfaces are selected and delineated using the selected methods. Comparison of the delineation results emphasizes the importance of method selection and highlights the methods' applicability and potential impacts on watershed modeling.
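
    For reference, the D8 single-flow-direction rule contrasted above sends each cell to the steepest-descent neighbour among its eight; a minimal sketch (diagonal distance weighting included, boundary neighbours simply skipped, and the direction encoding 0-7 is an arbitrary choice):

    ```python
    import numpy as np

    def d8_flow_direction(dem):
        """D8 single flow direction: index 0-7 of the steepest-descent
        neighbour for each cell, or -1 for pits and flats."""
        offs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
                (1, 0), (1, -1), (0, -1), (-1, -1)]
        ny, nx = dem.shape
        fdir = np.full(dem.shape, -1, dtype=int)
        for i in range(ny):
            for j in range(nx):
                best, steepest = -1, 0.0
                for k, (di, dj) in enumerate(offs):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < ny and 0 <= nj < nx:
                        # drop per unit distance (diagonals travel sqrt(2))
                        drop = (dem[i, j] - dem[ni, nj]) / np.hypot(di, dj)
                        if drop > steepest:
                            best, steepest = k, drop
                fdir[i, j] = best
        return fdir

    dem = np.array([[5., 4., 3.],
                    [4., 3., 2.],
                    [3., 2., 1.]])
    print(d8_flow_direction(dem))   # bottom-right cell is a pit (-1)
    ```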

  2. Flow Dynamics of green sand in the DISAMATIC moulding process using Discrete element method (DEM)

    NASA Astrophysics Data System (ADS)

    Hovad, E.; Larsen, P.; Walther, J. H.; Thorborg, J.; Hattel, J. H.

    2015-06-01

    The production of sand moulds by the DISAMATIC casting process is simulated with the discrete element method (DEM). The main purpose is to simulate the dynamics of the green sand flow during production of the sand mould. The sand shot, the first stage of the DISAMATIC casting process, is simulated. Depending on the actual casting geometry, the mould can be geometrically quite complex, involving e.g. shadowing effects, and this is directly reflected in the sand flow during the moulding process. In the present work, a mould chamber with "ribs" on the walls is chosen as a baseline geometry to emulate some of these important conditions found in the real moulding process. The sand flow is simulated with the DEM and compared with corresponding video footage from the interior of the chamber during the moulding process. The effects of the rolling resistance and the static friction coefficient are analysed and discussed in relation to the experimental findings.

  3. Structural and Volumetric re-evaluation of the Vaiont landslide using DEM techniques

    NASA Astrophysics Data System (ADS)

    Superchi, Laura; Pedrazzini, Andrea; Floris, Mario; Genevois, Rinaldo; Ghirotti, Monica; Jaboyedoff, Michel

    2010-05-01

    On 9 October 1963 a catastrophic landslide occurred on the southern slope of the Vaiont dam reservoir. A mass of approximately 270 million m3 collapsed into the reservoir, generating a wave which overtopped the dam and hit the town of Longarone and other villages: almost 2000 people lost their lives. The large volume and high velocity of the landslide, combined with the great destruction and loss of life, make the Vaiont landslide a natural laboratory for investigating landslide failure mechanisms and propagation. Geological, structural, geomorphological, hydrogeological and geomechanical elements should therefore be re-analyzed using methods and techniques not available in the 1960s. In order to better quantify the volume involved in the movement and to assess the failure mechanism, a structural study is a preliminary and necessary step. The structural features were investigated using a digital elevation model (DEM) of the pre- and post-landslide topography at a pixel size of 5 m and associated software (COLTOP-3D) to create a colored shaded relief map revealing the orientation of morphological features. The results identified six main discontinuity sets on both the pre- and post-slide surfaces, some of which directly influence the Vaiont landslide morphology. Recent and older field surveys validated the COLTOP-3D analysis results. To estimate the location and shape of the sliding surface and to evaluate the volume of the landslide, the SLBL (Sloping Local Base Level) method was used: a simple and efficient tool that allows a geometric interpretation of the failure surface based on a DEM. The SLBL application required a geological interpretation to define the contours of the landslide and to estimate the possible curvature of the sliding surface, which is defined by interpolating between points considered as the limits of the landslide. The SLBL surface of the Vaiont landslide was obtained from the DEM reconstruction
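
    A minimal iterative sketch of the SLBL idea under stated assumptions (the published method has more machinery): inside a landslide mask, each cell is repeatedly lowered toward the mean of its opposite-neighbour pairs minus a tolerance, leaving a smooth concave surface under the DEM, and the volume is the DEM-minus-surface difference summed over the mask. The mask, tolerance and cell size here are hypothetical.

    ```python
    import numpy as np

    def slbl(dem, mask, tol=0.0, n_iter=500):
        """Iterative Sloping Local Base Level: lower each masked cell toward
        the mean of its N-S / E-W neighbour pairs minus a tolerance."""
        z = dem.astype(float).copy()
        for _ in range(n_iter):
            pair_mean = np.full(z.shape, np.inf)
            pair_mean[1:-1, :] = (z[:-2, :] + z[2:, :]) / 2.0
            pair_mean[:, 1:-1] = np.minimum(pair_mean[:, 1:-1],
                                            (z[:, :-2] + z[:, 2:]) / 2.0)
            z[mask] = np.minimum(z[mask], (pair_mean - tol)[mask])
        return z

    # usage on a synthetic 5 m grid with a square landslide mask
    rng = np.random.default_rng(4)
    dem = rng.uniform(500, 700, (60, 60))
    mask = np.zeros_like(dem, bool)
    mask[10:50, 10:50] = True
    surface = slbl(dem, mask, tol=0.1)
    volume = (dem - surface)[mask].sum() * 25.0   # cell area = 5 m x 5 m
    print(f"landslide volume ~ {volume:.0f} m^3")
    ```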

  4. Discrete Element Method (DEM) Application to The Cone Penetration Test Using COUPi Model

    NASA Astrophysics Data System (ADS)

    Kulchitsky, A. V.; Johnson, J.; Wilkinson, A.; DeGennaro, A. J.; Duvoy, P.

    2011-12-01

    The cone penetration test (CPT) is a soil strength measurement method to determine the tip resistance and sleeve friction versus depth while pushing a cone into regolith with controlled slow quasi-static speed. This test can also be used as an excellent tool to validate the discrete element method (DEM) model by comparing tip resistance and sleeve friction from experiments to model results. DEM by nature requires significant computational resources even for a limited number of particles. Thus, it is important to find particle and ensemble parameters that produce valuable results within reasonable computation times. The Controllable Objects Unbounded Particles Interaction (COUPi) model is a general physical DEM code being developed to model machine/regolith interactions as part of a NASA Lunar Science Institute sponsored project on excavation and mobility modeling. In this work, we consider how different particle shape and size distributions defined in the DEM influence the cone tip and friction sleeve resistance in a CPT DEM simulation. The results are compared to experiments with cone penetration in JSC-1A lunar regolith simulant. The particle shapes include spherical particles, particles composed from the union of three spheres, and some simple polyhedra. This focus is driven by the soil mechanics rule of thumb that particle size and shape distributions are the two most significant factors affecting soil strength. In addition to the particle properties, the packing configuration of an ensemble strongly affects soil strength. Bulk density of the regolith is an important characteristic that significantly influences the tip resistance and sleeve friction (Figure 1). We discuss different approaches used to control granular density in the DEM, including how to obtain higher bulk densities, using numerical "shaking" techniques and varying the friction coefficient during computations.

  5. Fusion of Multi-Scale DEMs from Descent and Navcam Images of Chang'e-3 Using a Compressed Sensing Method

    NASA Astrophysics Data System (ADS)

    Peng, M.; Wan, W.; Liu, Z.; Di, K.

    2017-07-01

    The multi-source DEMs generated using the images acquired in the descent and landing phase and after landing contain supplementary information, and this makes it possible and beneficial to produce a higher-quality DEM through fusing the multi-scale DEMs. The proposed fusion method consists of three steps. First, source DEMs are split into small DEM patches, then the DEM patches are classified into a few groups by local density peaks clustering. Next, the grouped DEM patches are used for sub-dictionary learning by stochastic coordinate coding. The trained sub-dictionaries are combined into a dictionary for sparse representation. Finally, the simultaneous orthogonal matching pursuit (SOMP) algorithm is used to achieve sparse representation. We use the real DEMs generated from Chang'e-3 descent images and navigation camera (Navcam) stereo images to validate the proposed method. Through the experiments, we have reconstructed a seamless DEM with the highest resolution and the largest spatial coverage among the input data. The experimental results demonstrated the feasibility of the proposed method.

  6. New land-based method for surveying sandy shores and extracting DEMs: the INSHORE system.

    PubMed

    Baptista, Paulo; Cunha, Telmo R; Matias, Ana; Gama, Cristina; Bernardes, Cristina; Ferreira, Oscar

    2011-11-01

    The INSHORE system (INtegrated System for High Operational REsolution in shore monitoring) is a land-based survey system designed and developed for the specific task of monitoring the evolution of sandy shores over time. The system was developed with two main objectives: (1) to produce highly accurate 3D coordinates of surface points (on the order of 0.02 to 0.03 m); and (2) to be extremely efficient in surveying beach stretches of several kilometres. Previous tests demonstrated that the INSHORE system fulfils these objectives. Here, the usefulness of the INSHORE system as a survey tool for producing Digital Elevation Models (DEMs) of sandy shores is demonstrated. For this purpose, DEMs obtained with the INSHORE system are compared with those from other relevant survey techniques. The comparison focuses on the final DEM accuracy and also on survey efficiency and its impact on the costs associated with regular monitoring programmes. The field survey method of the INSHORE system, based on profile networks, has a productivity of about 30 to 40 ha/h, depending on the beach surface characteristics. The final DEM precision, after interpolation of the global positioning system profile network, is approximately 0.08 to 0.12 m (RMS), depending on the profile network's density. Thus, this is a useful method for 3D representation of sandy shore surfaces and permits, after interpolation, reliable calculations of volume and other physical parameters.

  7. Sensitivity of Particle Size in Discrete Element Method to Particle Gas Method (DEM_PGM) Coupling in Underbody Blast Simulations

    DTIC Science & Technology

    In this paper, the capability of two methods of modelling detonation of high explosives (HE) buried in soil, viz., (1) coupled discrete element and...blast simulation method. The main focus of this study is to understand the strengths of DEM_PGM and identify the limitations/strengths compared to the ALE

  8. Calibration of DEM parameters on shear test experiments using Kriging method

    NASA Astrophysics Data System (ADS)

    Xavier, Bednarek; Sylvain, Martin; Abibatou, Ndiaye; Véronique, Peres; Olivier, Bonnefoy

    2017-06-01

    Calibration of powder mixing simulations using the Discrete Element Method (DEM) is still an issue. Achieving good agreement with experimental results is difficult because time-efficient use of DEM involves strong assumptions. This work presents a methodology to calibrate DEM parameters using the Efficient Global Optimization (EGO) algorithm, which is based on the Kriging interpolation method. Classical shear test experiments are used as calibration experiments. The calibration is made on two parameters: Young's modulus and the friction coefficient. Determining the minimum number of grains that has to be used is a critical step: simulating too few grains does not represent the realistic behavior of the powder, while using a huge number of grains is strongly time-consuming. The optimization goal is the minimization of the objective function, which is the distance between simulated and measured behaviors. The EGO algorithm maximizes the Expected Improvement criterion to find the next point to be simulated. This stochastic criterion draws on the two quantities provided by the Kriging method, the prediction of the objective function and the estimate of its error, and is thus able to quantify the improvement in the minimization that new simulations at specified DEM parameters would lead to.
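
    The Expected Improvement criterion has a standard closed form; a minimal sketch under the usual definition, where mu and sigma are the Kriging prediction and its standard error at candidate (Young's modulus, friction coefficient) settings and f_min is the best objective value simulated so far:

    ```python
    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, f_min):
        """EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z), z = (f_min - mu)/sigma.
        Balances exploiting low predictions against exploring uncertain ones."""
        sigma = np.maximum(np.asarray(sigma, float), 1e-12)
        z = (f_min - np.asarray(mu, float)) / sigma
        return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # usage: pick the candidate with the largest EI as the next DEM run
    mu = np.array([0.8, 1.1, 0.6])        # Kriging predictions
    se = np.array([0.05, 0.4, 0.3])       # Kriging standard errors
    print(np.argmax(expected_improvement(mu, se, f_min=0.7)))
    ```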

  9. A Review of Discrete Element Method (DEM) Particle Shapes and Size Distributions for Lunar Soil

    NASA Technical Reports Server (NTRS)

    Lane, John E.; Metzger, Philip T.; Wilkinson, R. Allen

    2010-01-01

    As part of ongoing efforts to develop models of lunar soil mechanics, this report reviews two topics that are important to discrete element method (DEM) modeling of the behavior of soils (such as lunar soils): (1) methods of modeling particle shapes and (2) analytical representations of particle size distribution. The choice of particle shape complexity is driven primarily by opposing tradeoffs with the total number of particles, computer memory, and total simulation processing time. The choice also depends on available DEM software capabilities. For example, PFC2D/PFC3D and EDEM support clustering of spheres; MIMES incorporates superquadric particle shapes; and BLOKS3D provides polyhedral shapes. Most commercial and custom DEM software supports some type of complex particle shape beyond the standard sphere. Convex polyhedra, clusters of spheres, and single parametric particle shapes such as the ellipsoid, polyellipsoid and superquadric are all motivated by the desire to introduce asymmetry into the particle shape, as well as edges and corners, in order to better simulate actual granular particle shapes and behavior. An empirical particle size distribution (PSD) formula is shown to fit desert sand data from Bagnold. Particle size data for JSC-1a, obtained from a fine particle analyzer at the NASA Kennedy Space Center, are also fitted to a similar empirical PSD function.

  10. Use of thermal infrared pictures for retrieving intertidal DEM by the waterline method: advantages and limitations

    NASA Astrophysics Data System (ADS)

    Gaudin, D.; Delacourt, C.; Allemand, P.

    2010-12-01

    Digital Elevation Models (DEMs) of intertidal zones are of growing interest for ecological and land development purposes, and are a fundamental tool for monitoring current sedimentary movements in these low-energy environments. Such DEMs have to be constructed with centimetric resolution, as the topographic changes are not predictable and sediment displacements are weak. Direct construction of DEMs by GPS in these muddy environments is difficult: photogrammetric techniques are not efficient on uniformly coloured surfaces, and terrestrial laser scanners are difficult to stabilize on the mud, due to humidity. In this study, we propose to improve and apply the waterline method to retrieve DEMs of intertidal zones. This technique is based on accurately monitoring the boundary between sand and water during a whole tide rise with thermal infrared images; the DEM is made by stacking all these lines, calibrated by an immersed pressure sensor. Using thermal infrared pictures instead of optical ones improves the detection of the waterline, since mud and water have very different responses to solar heating and a large emissivity contrast. However, retrieving temperature from thermal infrared data is not trivial, since the luminance of an object is the sum of a radiative part and a reflective part, whose relative proportions are given by the emissivity. In the following equation, B is the equivalent blackbody luminance and L_inc the incident luminance: L_tot = L_rad + L_refl = εB + (1 - ε)L_inc. The infrared waterline technique has been used to monitor a beach located on the Aber Benoit, 8.5 km from the open sea. The site consists mainly of mud, and waves are very small (less than one centimetre high), which are ideal conditions for the waterline method. A few measurements have been made to produce differential height maps of sediments. We reached a mean resolution of 2 cm and a vertical accuracy better than one centimetre. The results
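
    Inverting that relation isolates the radiative part before converting luminance to temperature; a minimal sketch (the emissivity and luminance values below are illustrative only):

    ```python
    import numpy as np

    def blackbody_luminance(l_tot, l_inc, emissivity):
        """Invert L_tot = eps*B + (1 - eps)*L_inc for the blackbody
        luminance B, removing the reflected part of the signal."""
        eps = np.asarray(emissivity, float)
        return (np.asarray(l_tot, float)
                - (1.0 - eps) * np.asarray(l_inc, float)) / eps

    # water (eps ~0.99) vs mud (eps ~0.95): the same measured luminance
    # implies different blackbody luminances, hence a detectable waterline
    print(blackbody_luminance(10.0, 4.0, 0.99),
          blackbody_luminance(10.0, 4.0, 0.95))
    ```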

  11. DEM Extraction from Worldview-3 Stereo Images and Accuracy Evaluation

    NASA Astrophysics Data System (ADS)

    Hu, F.; Gao, X. M.; Li, G. Y.; Li, M.

    2016-06-01

    This paper validates the potential of Worldview-3 satellite images for large-scale topographic mapping, using Worldview-3 along-track stereo images of the Yi Mountain area in Shandong province, China, for DEM extraction and accuracy evaluation. First, eighteen accurate and evenly distributed GPS points are collected in the field and used as GCPs/check points; their image points are accurately measured, and tie points are extracted by image matching. Then, an RFM-based block adjustment to compensate the systematic error in image orientation is carried out, and the geo-positioning accuracy is calculated and analysed. Next, for the two stereo pairs of the block, DSMs are separately constructed and mosaicked into a single surface, from which the corresponding DEM is generated. Finally, against check points selected from a high-precision airborne LiDAR point cloud covering the same test area, the accuracy of the generated DEM with 2-meter grid spacing is evaluated by the maximum (max.), minimum (min.), mean and standard deviation (std.) of the elevation biases. For the Worldview-3 stereo images used in our research, the planimetric accuracy without GCPs is about 2.16 m (mean error) and 0.55 m (std. error), which is superior to the nominal value, while the vertical accuracy is about -1.61 m (mean error) and 0.49 m (std. error); with a small number of GCPs located in the center and four corners of the test area, the systematic error can be well compensated. The std. of the elevation biases between the generated DEM and the 7256 LiDAR check points is about 0.62 m. Considering the potential uncertainties in image point measurement, stereo matching and elevation editing, the accuracy of DEM generation from Worldview-3 stereo images should be even better. Judging from the results, Worldview-3 has the potential for 1:5000 or even larger-scale mapping applications.

  12. Combined DEM Extraction Method from StereoSAR and InSAR

    NASA Astrophysics Data System (ADS)

    Zhao, Z.; Zhang, J. X.; Duan, M. Y.; Huang, G. M.; Yang, S. C.

    2015-06-01

    A pair of SAR images acquired from different positions can be used to generate a digital elevation model (DEM). Two techniques exploit this characteristic: stereo SAR and interferometric SAR. They recover the third dimension (topography) and, at the same time, identify the absolute position (geolocation) of pixels in the imaged area, thus allowing the generation of DEMs. In this paper, a combined StereoSAR and InSAR adjustment model is constructed, unifying DEM extraction from InSAR and StereoSAR into the same coordinate system and thereby improving the three-dimensional positioning accuracy of the target. We assume there are four images 1, 2, 3 and 4. One pair of SAR images (1, 2) meets the conditions required for InSAR processing, while the other pair (3, 4) forms a stereo image pair. The phase model is based on the rigorous InSAR imaging geometry. The master image 1 and the slave image 2 are used in InSAR processing, but the slave image 2 is used only when establishing the model: the pixels of the slave image are related to the corresponding pixels of the master image through image coregistration coefficients, from which the corresponding phase is calculated, so the slave image is not required in the construction of the phase model. In the Range-Doppler (RD) model, the range and Doppler equations are functions of the target geolocation, and in the phase equation the phase is also a function of the target geolocation. We apply the combined adjustment model to the deviation of the target geolocation, so the problem of solving for the target becomes one of solving for three unknowns from seven equations. The model was tested for DEM extraction with spaceborne InSAR and StereoSAR data and compared with the InSAR and StereoSAR methods separately. The results showed that the model delivered better performance on the experimental imagery and can be used for DEM extraction applications.
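
    Once linearised, seven observation equations in three unknown target coordinates form an overdetermined system solved by least squares; a minimal sketch with hypothetical design-matrix values (the actual linearisation of the range, Doppler and phase equations is in the paper):

    ```python
    import numpy as np

    def solve_geolocation(A, w):
        """Least-squares solution of the combined adjustment: A (7x3) holds
        the linearised coefficients of the seven observation equations and
        w (7,) the misclosures; returns the 3-vector coordinate update."""
        dx, _, _, _ = np.linalg.lstsq(A, w, rcond=None)
        return dx

    # usage with random stand-in values for the linearised system
    A = np.random.default_rng(5).normal(size=(7, 3))
    w = np.random.default_rng(6).normal(size=7)
    print(solve_geolocation(A, w))
    ```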

  13. Evaluation of lidar-derived DEMs through terrain analysis and field comparison

    Treesearch

    Cody P. Gillin; Scott W. Bailey; Kevin J. McGuire; Stephen P. Prisley

    2015-01-01

    Topographic analysis of watershed-scale soil and hydrological processes using digital elevation models (DEMs) is commonplace, but most studies have used DEMs of 10 m resolution or coarser. Availability of higher-resolution DEMs created from light detection and ranging (lidar) data is increasing but their suitability for such applications has received little critical...

  14. Research on the method of extracting DEM based on GBInSAR

    NASA Astrophysics Data System (ADS)

    Yue, Jianping; Yue, Shun; Qiu, Zhiwei; Wang, Xueqin; Guo, Leping

    2016-05-01

    Precise topographic information plays a very important role in geology, hydrology, natural resources surveys and deformation monitoring. DEM extraction based on synthetic aperture radar interferometry (InSAR) obtains the three-dimensional elevation of a target area from the phase information of radar image data, and offers large coverage, high precision and all-weather capability. By moving the ground-based radar system up and down along a track, a spatial baseline is formed, and the DEM of the target area can be obtained by acquiring image data from different angles. Three-dimensional laser scanning can quickly, efficiently and accurately obtain the DEM of a target area, and can therefore be used to verify the accuracy of a DEM extracted by GBInSAR; however, research on GBInSAR extraction of DEMs remains scarce. Given the lack of theory and the low accuracy of current GBInSAR DEM extraction, this article examines its principles in depth. The DEM of the target area was extracted from GBInSAR data, compared with the DEM obtained from three-dimensional laser scan data, and subjected to statistical analysis and a normal distribution test. The results show that the DEM obtained by GBInSAR is broadly consistent with the DEM obtained by three-dimensional laser scanning and its accuracy is high; the differences between the two DEMs approximately follow a normal distribution. This indicates that extracting the DEM of a target area with GBInSAR is feasible and provides a foundation for the promotion and application of GBInSAR.

  15. A Method for Improving SRTM DEMs in High-Relief Terrain

    NASA Astrophysics Data System (ADS)

    Falorni, G.; Istanbulluoglu, E.; Bras, R. L.

    2003-12-01

    The Shuttle Radar Topography Mission (SRTM) had the objective of mapping the Earth's surface between 56°S and 60°N to produce the first near-global high-resolution digital elevation model (DEM). The dataset, with a horizontal resolution of 1 arc second (~30 m), has now been released for the conterminous U.S. Recent investigations aimed at assessing the vertical accuracy of the dataset have revealed that elevation accuracy is well within dataset specifications in low- to moderate-relief areas but that errors generally increase in high-relief terrain. Statistical analyses performed with the objective of characterizing the error structure in two study sites within the U.S. have highlighted correlations between elevation residuals and slope gradient, slope bearing, and elevation. In particular, the analyses show that the largest errors occur on steep slopes and that slope bearing has a marked influence on the sign of the elevation residuals. Based on these findings, we are currently investigating a method for correcting relevant vertical errors in SRTM-derived DEMs according to their topographic location. We propose to use a combination of indices derived from the statistical analyses to predict the occurrence, magnitude, and sign of the vertical errors.
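
    The correction idea can be illustrated with a hedged sketch: fit the elevation residuals against slope gradient and slope bearing (the two predictors the analyses highlight) and subtract the predicted error. The linear form and the sine/cosine encoding of bearing below are illustrative assumptions, not the authors' final index combination; inputs are flattened 1-D arrays over the DEM cells.

    ```python
    import numpy as np

    def _design(slope, aspect_deg):
        # intercept, slope, and aspect-modulated slope terms; the sin/cos
        # pair captures the bearing-dependent sign of the residuals
        a = np.radians(aspect_deg)
        return np.column_stack([np.ones_like(slope), slope,
                                slope * np.sin(a), slope * np.cos(a)])

    def fit_error_model(residuals, slope, aspect_deg):
        coeffs, *_ = np.linalg.lstsq(_design(slope, aspect_deg),
                                     residuals, rcond=None)
        return coeffs

    def correct_elevations(elev, slope, aspect_deg, coeffs):
        # subtract the predicted topographically dependent error
        return elev - _design(slope, aspect_deg) @ coeffs
    ```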

  16. High-resolution Pleiades DEMs and improved mapping methods for the E-Corinth marine terraces

    NASA Astrophysics Data System (ADS)

    de Gelder, Giovanni; Fernández-Blanco, David; Delorme, Arthur; Jara-Muñoz, Julius; Melnick, Daniel; Lacassin, Robin; Armijo, Rolando

    2016-04-01

    The newest generation of satellite imagery provides exciting new possibilities for highly detailed mapping, with sub-metric ground resolution and absolute accuracy within a few meters. This opens new avenues for the analysis of geologic and geomorphic landscape features, especially since photogrammetric methods allow the extraction of detailed topographic information from these satellite images. We used tri-stereo imagery from the Pleiades platform of the CNES, in combination with the Euclidium software for image orientation and the Micmac software for dense matching, to develop state-of-the-art 2 m-resolution digital elevation models (DEMs) for eight areas in Greece. Here, we present our mapping results for an area in the eastern Gulf of Corinth, which contains one of the most extensive and well-preserved flights of marine terraces worldwide. The spatial extent of the terraces was determined by an iterative combination of an automated surface classification model for terrain slope and roughness with qualitative assessment of satellite imagery, DEM hillshade maps, slope maps, and detailed topographic analyses of profiles and contours. We determined marine terrace shoreline angles by means of swath profiles running perpendicular to the paleo-seacliffs, using the graphical interface TerraceM. Our analysis provided minimum and maximum estimates of the paleoshoreline location on ~750 swath profiles, using the present-day cliff slope as an approximation for its paleo-cliff counterpart. After correlating the marine terraces laterally, we obtained 16 different terrace levels, recording Quaternary sea-level highstands of both major interglacial and several interstadial periods. Our high-resolution Pleiades DEMs and improved method for paleoshoreline determination allowed us to produce a marine terrace map of unprecedented detail, containing more terrace sub-levels than hitherto. Our mapping demonstrates that we are no longer limited by the
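
    The automated classification step can be approximated as thresholding slope and roughness surfaces derived from the DEM; the sketch below flags low-slope, low-roughness cells as terrace candidates. The thresholds and the 3 × 3 roughness window are placeholder assumptions, and in the study this step was iterated with qualitative checks rather than used alone.

    ```python
    import numpy as np
    from scipy import ndimage

    def terrace_candidates(dem, cellsize, slope_max_deg=8.0, rough_max=0.5):
        """Flag flat, smooth cells as marine-terrace candidates."""
        gy, gx = np.gradient(dem, cellsize)
        slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
        # roughness as the local standard deviation of elevation (3x3 window)
        mean = ndimage.uniform_filter(dem, size=3)
        mean_sq = ndimage.uniform_filter(dem ** 2, size=3)
        roughness = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
        return (slope_deg < slope_max_deg) & (roughness < rough_max)
    ```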

  17. Mechanical behavior modeling of sand-rubber chips mixtures using discrete element method (DEM)

    NASA Astrophysics Data System (ADS)

    Eidgahee, Danial Rezazadeh; Hosseininia, Ehsan Seyedi

    2013-06-01

    Rubber shreds mixed with sandy soils are widely used for geotechnical purposes because of their controlled compressibility characteristics and light weight. Various studies have examined sand and rubber-chip contents chosen to restrain the compressibility of the mass in structures such as backfills and road embankments. Depending on the rubber content, sand-rubber mixtures with different mechanical properties can be produced. The aim of this paper is to study the effect of adding different rubber fractions on the global engineering properties of the mixtures. The study is performed using the Discrete Element Method (DEM). The simulations showed that adding rubber up to a particular fraction can improve the maximum bearing stress compared with sand-only masses. Taking the difference between sand and rubber stiffness into account, the interpretation of the results can be extended to other mixtures of soft and rigid particles, such as powders or polymers.

  18. TanDEM-X Mission: Overview and Evaluation of intermediate Results

    NASA Astrophysics Data System (ADS)

    Soergel, U.; Jacobsen, K.; Schack, L.

    2013-10-01

    The German Aerospace Center (DLR, Deutsches Zentrum für Luft- und Raumfahrt) currently conducts the bistatic interferometric synthetic aperture radar (SAR) mission TanDEM-X, which shall result in a DEM of global coverage with unprecedented resolution and accuracy according to the DTED level 3 standard. The mission is based on the two SAR satellites TerraSAR-X and TanDEM-X, launched in June 2007 and June 2010, respectively. After the commissioning phase of the TanDEM-X satellite and the orbital adjustment, bistatic image acquisition in close formation began at the end of 2010. The data collection for the mission is scheduled to last about three years, i.e., the greater part of the required data has already been gathered. Based on these data, DLR will conduct several processing steps in order to finally produce a global, seamless DEM of the Earth's landmass that meets the envisaged specifications. Since the entire mission is an endeavor in the framework of a public-private partnership, the private partner, Astrium, will eventually commercialize the DEM product. In this paper, we provide an overview of the data collection and the deliverables that will come along with the TanDEM-X mission. Furthermore, we analyze a DEM derived from early-stage intermediate products of the mission.

  19. On computer simulation of dry particle systems using discrete element method and the development of DEM contact force-displacement models

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    Particle systems are involved in many engineering processes. Particle science and technology deal with the production, characterization, modification, handling, and utilization of a wide variety of particles, in both dry and wet conditions. In investigations of the behavior of dry particle systems, considerations of cost, time, and equipment may prohibit experimental work, and even in the absence of these constraints, successful experiments often yield only limited information. Computer simulation using the discrete element method (DEM) can provide more information and can be performed more quickly and at lower cost than experiments. A DEM simulation requires that the interactive forces between simulated particles be evaluated accurately, since they drive the motion of the particle system. Therefore, to obtain reliable simulation results efficiently, it is necessary to develop simple and accurate particle-particle interactive force-displacement (FD) models. Most existing FD models, however, implement oversimplified versions of contact mechanics and do not correctly account for the plastic deformation near the contact point between particles, thus yielding inaccurate simulation results. We present here the DEM simulation algorithms, the geometric modelling of nonspherical particles, and the development of a set of realistic and consistent particle-particle contact FD models that correctly account for the effects of both elastic and plastic deformation. DEM simulations implemented with the present elasto-plastic FD models can correctly predict the motion behavior of particles. In the development of the present elasto-plastic FD models, we carried out a series of finite element analyses (FEA) of nonlinear elasto-plastic contact problems. The modelling and results of these FEA, including the dynamic FEA of elasto-plastic sphere collisions, are also presented in this dissertation. We implemented the present FD models into DEM
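
    For orientation, the classical elastic Hertz law is the kind of normal-direction force-displacement model that such work starts from before adding plasticity. The sketch below implements only the elastic case; it is not the elasto-plastic model developed in the dissertation.

    ```python
    import numpy as np

    def hertz_normal_force(delta, R1, R2, E1, E2, nu1, nu2):
        """Elastic Hertz normal contact force between two spheres.

        delta: overlap; R: radii; E: Young's moduli; nu: Poisson ratios.
        """
        if delta <= 0.0:
            return 0.0                           # no contact, no force
        R_eff = R1 * R2 / (R1 + R2)              # effective radius
        E_eff = 1.0 / ((1 - nu1 ** 2) / E1 + (1 - nu2 ** 2) / E2)
        return (4.0 / 3.0) * E_eff * np.sqrt(R_eff) * delta ** 1.5
    ```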

  20. Evaluation of DEM-DISC, customized e-advice on health and social support services for informal carers and case managers of people with dementia; a cluster randomized trial.

    PubMed

    Van Mierlo, Lisa D; Meiland, Franka J M; Van de Ven, Peter M; Van Hout, Hein P J; Dröes, Rose-Marie

    2015-08-01

    Few personalized e-interventions are available for informal and professional caregivers of people with dementia. The DEMentia Digital Interactive Social Chart (DEM-DISC) is an ICT tool to support customized disease management in dementia. The aim of this study was to improve and evaluate DEM-DISC, assess its user-friendliness and usefulness, and investigate its future implementation. A cluster randomized controlled trial (RCT) design was used, with measurements at baseline, 6, and 12 months. A total of 73 informal caregivers of people with dementia, supported by 19 randomized case managers, participated in the study. In the intervention group, both carers (n = 41) and case managers (n = 13) could access DEM-DISC for twelve months. The control group, 32 carers and 14 case managers, had no access to DEM-DISC. Semi-structured interviews were conducted with ten stakeholders. Informal caregivers who used DEM-DISC for twelve months reported a greater sense of competence than controls. A subgroup of users who frequently accessed DEM-DISC reported more met needs after six months than controls. Overall, informal caregivers and case managers judged DEM-DISC as easy to learn and user-friendly. This study demonstrates that using DEM-DISC had a positive effect on the sense of competence and the experienced (met) needs of informal carers. This shows the importance of user-friendly ICT solutions that assist carers in finding appropriate care services tailored to their specific situation and needs. For the further implementation of DEM-DISC, methods to keep the information up to date are of great importance.

  1. Evaluation of TanDEM-X elevation data for geomorphological mapping and interpretation in high mountain environments - A case study from SE Tibet, China

    NASA Astrophysics Data System (ADS)

    Pipaud, Isabel; Loibl, David; Lehmkuhl, Frank

    2015-10-01

    Digital elevation models (DEMs) are a prerequisite for many different applications in the field of geomorphology. In this context, the two near-global medium-resolution DEMs originating from the SRTM and ASTER missions are widely used. For detailed geomorphological studies, particularly in high mountain environments, these datasets are, however, known to have substantial disadvantages beyond their posting, i.e., data gaps and miscellaneous artifacts. The upcoming TanDEM-X DEM is a promising candidate to improve this situation through state-of-the-art radar technology, offering a posting of 12 m and lower proneness to errors. In this study, we present a DEM processed from a single TanDEM-X CoSSC scene covering a study area in the extreme relief of the eastern Nyainqêntanglha Range, southeastern Tibet. The potential of the resulting experimental TanDEM-X DEM for geomorphological applications was evaluated by geomorphometric analyses and an assessment of landform recognizability and artifacts in comparison with the ASTER GDEM and the recently released SRTM 1″ DEM. Detailed geomorphological mapping was conducted manually for four selected core study areas, based exclusively on the TanDEM-X DEM and its basic derivatives. The results show that the self-processed TanDEM-X DEM yields a detailed and widely consistent landscape representation. It thus fosters geomorphological analysis by visual and quantitative means, allowing the delineation of landforms down to footprints of 30 m. Even in this premature state, the TanDEM-X elevation data are largely superior to the ASTER and SRTM datasets, primarily owing to their significantly higher resolution and lower susceptibility to artifacts that hamper landform interpretation. Conversely, challenges for interferometric DEM generation were identified, including (i) triangulation facets and missing topographic information resulting from radar layover on steep slopes facing toward the radar sensor, (ii) low

  2. Process evaluation of the implementation of dementia-specific case conferences in nursing homes (FallDem): study protocol for a randomized controlled trial.

    PubMed

    Holle, Daniela; Roes, Martina; Buscher, Ines; Reuther, Sven; Müller, René; Halek, Margareta

    2014-12-11

    Challenging behaviors exhibited by individuals with dementia might result from an unmet need that they cannot communicate directly due to cognitive restrictions. A dementia-specific case conference represents a promising means of analyzing and exploring these unmet needs. The ongoing FallDem study is a stepped-wedge, cluster-randomized trial evaluating the effects of two different types of dementia-specific case conferences on the challenging behaviors of nursing home residents. This study protocol describes the process evaluation conducted alongside the FallDem study. The goal of the process evaluation is to explain potential discrepancies between expected and observed outcomes, and to provide insights into implementation processes and recruitment strategies, as well as the contexts and contextual factors that promote or inhibit the implementation of dementia-specific case conferences. The process evaluation will use a mixed-method design comprising longitudinal elements, in which quantitative and qualitative data will be gathered. Qualitative data will be analyzed using content analysis, documentary analysis, and a documentary method. Quantitative data (standardized questionnaires) will be analyzed using descriptive statistics. Both types of data will complement one another and provide a more comprehensive picture of the different objects under investigation. The process evaluation will allow for a comprehensive understanding of the changing processes and mechanisms underlying the 'black box' of the complex intervention of the FallDem study. These findings will provide practical knowledge regarding issues related to the implementation of dementia-specific case conferences in nursing homes. Current Controlled Trials identifier: ISRCTN20203855, registered on 10th July 2013.

  3. Computational Fluid Dynamics–Discrete Element Method (CFD-DEM) Study of Mass-Transfer Mechanisms in Riser Flow

    PubMed Central

    2017-01-01

    We report a computational fluid dynamics–discrete element method (CFD-DEM) simulation study on the interplay between mass transfer and a heterogeneously catalyzed chemical reaction in cocurrent gas-particle flows as encountered in risers. Slip velocity, axial gas dispersion, gas bypassing, and particle mixing phenomena have been evaluated under riser flow conditions to study the complex system behavior in detail. The most important factors are found to be directly related to particle cluster formation. Low air-to-solids flux ratios lead to more heterogeneous systems, where the cluster formation is more pronounced and mass transfer more strongly influenced. Falling clusters can be partially circumvented by the gas phase, which therefore does not fully interact with the cluster particles, leading to poor gas–solid contact efficiencies. Cluster gas–solid contact efficiencies are quantified at several superficial gas velocities, reaction rates, and dilution factors in order to gain more insight into the influence of clustering phenomena on the performance of riser reactors. PMID:28553011

  4. Shoreline Mapping with Integrated HSI-DEM using Active Contour Method

    NASA Astrophysics Data System (ADS)

    Sukcharoenpong, Anuchit

    Shoreline mapping has been a critical task for federal/state agencies and coastal communities. It supports important applications such as nautical charting, coastal zone management, and legal boundary determination. Attempts to incorporate data from hyperspectral imagery to increase the efficiency and efficacy of shoreline mapping have so far been limited by the complexity of processing such data and by its inferior spatial resolution compared with multispectral imagery or sensors such as LiDAR. As advancements in remote-sensing technologies increase sensor capabilities, the ability to exploit the spectral information carried in hyperspectral images becomes more imperative. This work employs a new approach that extracts shorelines from AVIRIS hyperspectral images in combination with a LiDAR-based DEM using a multiphase active contour segmentation technique. Several techniques, such as the study of object spectra and knowledge-based segmentation for initial contour generation, are employed to achieve sub-pixel accuracy while keeping computational expense low. Introducing a DEM into hyperspectral image segmentation proves to be a useful way to eliminate misclassifications and improve shoreline positional accuracy. Experimental results show that mapping shorelines from hyperspectral imagery and a DEM is a promising approach, and many further applications can be developed to exploit the rich information found in hyperspectral imagery.
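
    As a rough illustration of the ingredients, the sketch below segments a water index derived from two hyperspectral bands with scikit-image's morphological Chan-Vese active contour (a two-phase stand-in for the multiphase formulation used in the work) and then applies the DEM as an elevation constraint. Band indices, the water index, the 2 m threshold, and which segmentation label corresponds to water are all assumptions, and the num_iter parameter name varies across scikit-image versions.

    ```python
    import numpy as np
    from skimage.segmentation import morphological_chan_vese

    def extract_water_mask(hsi_cube, dem, green_band=20, nir_band=40):
        """Segment water from an HSI cube, constrained by a co-registered DEM."""
        green = hsi_cube[..., green_band].astype(float)
        nir = hsi_cube[..., nir_band].astype(float)
        ndwi = (green - nir) / (green + nir + 1e-9)   # water index in [-1, 1]
        seg = morphological_chan_vese(ndwi, num_iter=100,
                                      init_level_set='checkerboard')
        water = seg.astype(bool)       # verify which label is water in practice
        return water & (dem < 2.0)     # DEM constraint: keep near-sea-level cells
    ```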

  5. A Screening Method for Flash Flooding Risk using Instantaneous Unit Hydrographs Derived from High Resolution DEM data

    NASA Astrophysics Data System (ADS)

    Senevirathne, Nalin; Willgoose, Garry

    2015-04-01

    Flash flooding is considered a severe natural hazard and has had significant impacts on people and infrastructure throughout history. Modelling techniques and the understanding of flash flooding continue to improve with the availability of better-quality data such as high-resolution Digital Elevation Models (DEMs). DEMs allow the automated characterization of the influence of geomorphology on the hydrologic response of catchments. They are particularly useful for small ungauged catchments where available hydrologic data (e.g. rainfall, runoff) are sparse and where site-specific studies are rarely done unless some evidence of high risk is available. In this paper, we present new risk indicators, derived directly from instantaneous unit hydrographs (IUHs), which can be used to identify flash flooding risk areas within catchments. The study area includes 35 major river basins covering a 1700 km-long by 50 km-wide coastal strip of Eastern Australia. Standard terrain analysis methods (pit filling, flow direction, local slope, contributing area, flow velocity, and travel time) were used to produce IUHs for every pixel in the study area using a high-resolution (1 arc second) DEM. When computing the IUHs, each pixel was considered the outlet of its own catchment bounded by its contributing area. This allows us to characterise the hydrological response at the finest scale possible for a DEM. Risk indicators related to the rate of storm rise and catchment lag time were derived from the IUHs. Flash flood risk maps were produced at the catchment scale, and they match well with data from the severe flash flooding that occurred around Toowoomba (at the northern end of the coastal strip studied) in January 2011.
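
    Conceptually, once per-cell travel times to a pixel's outlet are known, the IUH is just the normalized histogram of arrival times, and indicators such as time-to-peak and maximum rate of rise fall out of its ordinates. A minimal sketch, with the bin width as an assumption:

    ```python
    import numpy as np

    def unit_hydrograph(travel_times_hr, dt_hr=0.25):
        """IUH ordinates and simple risk indicators from per-cell travel times."""
        t = np.asarray(travel_times_hr, dtype=float)
        bins = np.arange(0.0, t.max() + dt_hr, dt_hr)
        counts, edges = np.histogram(t, bins=bins)
        iuh = counts / (counts.sum() * dt_hr)        # unit-area ordinates [1/hr]
        time_to_peak = edges[np.argmax(iuh)]         # catchment lag indicator
        max_rise = np.max(np.diff(iuh, prepend=0.0)) / dt_hr  # rate-of-rise indicator
        return iuh, time_to_peak, max_rise
    ```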

  6. Analysis of compaction of railway ballast by different maintenance methods using DEM

    NASA Astrophysics Data System (ADS)

    Ferellec, Jean-Francois; Perales, Robert; Nhu, Viet-Hung; Wone, Michel; Saussine, Gilles

    2017-06-01

    As railway traffic continuously increases, ballasted tracks need more efficient maintenance processes. Lines with long welded rails, which are prone to buckling during heat waves, require stabilisation before being fully operational. Stabilisation is performed either naturally, using regular traffic at penalisingly lower speeds, by dynamic stabilisation of sleepers, or by crib compaction. The objective of this paper is to apply the NSCD (non-smooth contact dynamics) approach of DEM to simulate the processes of dynamic stabilisation and crib compaction as they are carried out on site, and to compare their performance in terms of ballast compaction and lateral resistance. The results showed that NSCD is well suited to simulating these maintenance processes and estimating their performance.

  7. DEM generated from InSAR in mountainous terrain and its accuracy analysis

    NASA Astrophysics Data System (ADS)

    Hu, Hongbing; Zhan, Yulan

    2011-02-01

    Digital Elevation Models (DEMs) derived from survey data are accurate but very expensive and time-consuming to produce. In recent years, remote sensing techniques, including Synthetic Aperture Radar Interferometry (InSAR), have been developed as powerful methods to derive high-precision DEMs, especially in mountainous or deeply forested areas. The purpose of this paper is to illustrate the principle of InSAR and present the results of a case study in Gejiu city, Yunnan province, China. The accuracy of the DEM derived from InSAR (InSAR-DEM) is also evaluated by comparing it with a DEM generated from a topographic map at the scale of 1:50,000 (TOP-DEM). The results show that: (1) over the whole selected area, the maximum, minimum, RMSE, and mean of the differences obtained by subtracting the InSAR-DEM from the TOP-DEM are 203 m, -188 m, 26.9 m, and 5.7 m, respectively. (2) The topographic trends represented by the two DEMs are consistent, even though the TOP-DEM is finer than the InSAR-DEM, especially in the valleys. (3) Contour maps with intervals of 100 m and 50 m converted from the InSAR-DEM and TOP-DEM, respectively, show concordant relief trends; contours from the TOP-DEM are smoother than those from the InSAR-DEM, while contours from the InSAR-DEM contain more islands. (4) Coherence has a great influence on the precision of the InSAR-DEM: the precision in low-coherence areas approaches 100 m, while in high-coherence areas it can reach the meter level. (5) The relief trends of six profiles represented by the two DEMs agree, with only tiny differences in local detail. The InSAR-DEM displays spurious relief in relatively flat areas, including water surfaces, which reflects to a certain extent the influence of the flat-earth effect on InSAR.
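
    Findings (1) and (4) suggest a simple evaluation recipe: difference the two DEMs, summarize the statistics, and optionally restrict the comparison to high-coherence pixels. A minimal numpy sketch, with the coherence mask as an optional assumption:

    ```python
    import numpy as np

    def dem_difference_stats(insar_dem, top_dem, coherence=None, coh_min=0.0):
        """Max/min/mean/RMSE of TOP-DEM minus InSAR-DEM differences."""
        diff = top_dem - insar_dem
        mask = np.isfinite(diff)
        if coherence is not None:
            mask &= coherence >= coh_min       # keep only coherent pixels
        d = diff[mask]
        return {"max": d.max(), "min": d.min(),
                "mean": d.mean(), "rmse": np.sqrt(np.mean(d ** 2))}
    ```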

  8. High-resolution DEMs in the study of rainfall- and earthquake-induced landslides: Use of a variable window size method in digital terrain analysis

    NASA Astrophysics Data System (ADS)

    Iwahashi, Junko; Kamiya, Izumi; Yamagishi, Hiromitsu

    2012-06-01

    We undertake digital terrain analyses of rainfall- and earthquake-induced landslides in Japan using high-resolution orthoimagery and Light Detection and Ranging (LiDAR) DEMs. Our aims are twofold: to demonstrate an effective method for dealing with high-resolution DEMs, which are often too detailed for landslide assessments, and to evaluate the topographic differences between rainfall- and earthquake-induced landslides. The study areas comprise the Izumozaki (1961 and 2004 heavy rainfalls), Niihama (2004 heavy rainfalls), Houfu (2009 heavy rainfalls), and Hanokidachi/Kurikoma-dam regions (the 2008 M 7.2 Iwate-Miyagi Nairiku earthquake), together containing 7,106 landslides across the five regions. We use two topographic attributes (the slope gradient and the Laplacian) calculated from the DEMs at varying window sizes. The hit rates for statistical prediction of landslide cells through discriminant analyses are calculated using the two topographic attributes as explanatory variables and the landslide inventory data as the dependent variable. In cases of surface failure, the hit rates diminish when the window size of the topographic attributes is too large or too small, indicating that an optimal scale factor is key to assessing shallow landslides. The representative window sizes are approximately 30 m for shallow landslides; the optimal window size may be directly related to the average landslide size in each region. We also find a stark contrast between rainfall- and earthquake-induced landslides. Rainfall-induced landslides are always most common at a slope gradient of 30°, but the frequency of earthquake-induced landslides increases exponentially with slope gradient. We find that the Laplacian, i.e., surface convexity and concavity, and the slope gradient are both important factors for rainfall-induced landslides, whereas earthquake-induced landslides are influenced mainly by slope steepness.
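
    The variable-window idea can be sketched as computing the two attributes on a DEM pre-smoothed to the chosen window size, so that the window acts as the scale factor; the ~30 m value reported above would then be passed as window_m. The uniform-filter smoothing below is an illustrative choice, not necessarily the authors' exact operator.

    ```python
    import numpy as np
    from scipy import ndimage

    def scaled_attributes(dem, cellsize, window_m):
        """Slope gradient (deg) and Laplacian at a given analysis window size."""
        k = max(int(round(window_m / cellsize)), 1)   # window size in cells
        smoothed = ndimage.uniform_filter(dem, size=k)
        gy, gx = np.gradient(smoothed, cellsize)
        slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
        laplacian = ndimage.laplace(smoothed) / cellsize ** 2
        return slope_deg, laplacian
    ```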

  9. How to bridge the gap between "unresolved" model and "resolved" model in CFD-DEM coupled method for sediment transport?

    NASA Astrophysics Data System (ADS)

    Liu, D.; Fu, X.; Liu, X.

    2016-12-01

    In nature, granular materials are widespread in water bodies. Understanding the fundamentals of solid-liquid two-phase flow, such as turbulent sediment-laden flow, is important for a wide range of applications. A coupled method combining computational fluid dynamics (CFD) and the discrete element method (DEM) is now widely used to model such flows. In this method, when particles are significantly larger than the CFD cells, the fluid field around each particle should be fully resolved; conversely, the "unresolved" model is designed for situations where particles are significantly smaller than the mesh cells, and with it large numbers of particles can be simulated simultaneously. However, there is a gap between these two situations when the DEM particles and the CFD cells are of the same order of magnitude in size. In this work, the most commonly used void fraction models are tested with numerical sedimentation experiments, and the range of applicability of each model is presented. Based on this, a new void fraction model, a modified version of the "tri-linear" model, is proposed. Particular attention is paid to smoothing the void fraction function in order to avoid numerical instability. The results show good agreement with experimental data and analytical solutions for both single-particle and group-particle motion, indicating the great potential of the new void fraction model.

  10. Towards the Optimal Pixel Size of dem for Automatic Mapping of Landslide Areas

    NASA Astrophysics Data System (ADS)

    Pawłuszek, K.; Borkowski, A.; Tarolli, P.

    2017-05-01

    Determining the appropriate spatial resolution of a digital elevation model (DEM) is a key step for effective landslide analysis based on remote sensing data. Several studies have demonstrated that choosing the finest DEM resolution is not always the best solution, and various DEM resolutions can be applicable for diverse landslide applications. This study therefore aims to assess the influence of spatial resolution on automatic landslide mapping. A pixel-based approach using parametric and non-parametric classification methods, namely a feed-forward neural network (FFNN) and maximum likelihood classification (ML), was applied. Additionally, this allowed us to determine the impact of the classification method on the selection of DEM resolution. Landslide-affected areas were mapped based on four DEMs generated at 1 m, 2 m, 5 m, and 10 m spatial resolution from airborne laser scanning (ALS) data. The performance of the landslide mapping was then evaluated by comparison against a landslide inventory map and computation of the confusion matrix. The results suggest that the finest DEM resolution is not always the best fit, although working at 1 m resolution on the micro-topography scale can show different results: the best performance was obtained using the 5 m DEM for FFNN and the 1 m DEM for ML classification.
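
    The evaluation step described above amounts to a cell-by-cell comparison of the predicted landslide mask against the inventory map. A minimal sketch with scikit-learn, where producer's accuracy for the landslide class serves as the hit rate:

    ```python
    import numpy as np
    from sklearn.metrics import confusion_matrix

    def evaluate_mapping(predicted_mask, inventory_mask):
        """Confusion matrix, overall accuracy, and landslide hit rate."""
        y_true = inventory_mask.ravel().astype(int)
        y_pred = predicted_mask.ravel().astype(int)
        cm = confusion_matrix(y_true, y_pred, labels=[0, 1])
        tn, fp, fn, tp = cm.ravel()
        overall = (tp + tn) / cm.sum()
        hit_rate = tp / (tp + fn) if (tp + fn) else float("nan")
        return cm, overall, hit_rate
    ```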

  11. The influence of accuracy, grid size, and interpolation method on the hydrological analysis of LiDAR derived dems: Seneca Nation of Indians, Irving NY

    NASA Astrophysics Data System (ADS)

    Clarkson, Brian W.

    Light Detection and Ranging (LiDAR) derived Digital Elevation Models (DEMs) provide accurate, high-resolution digital surfaces for precise topographic analysis. The following study investigates the accuracy of LiDAR-derived DEMs by calculating the Root Mean Square Error (RMSE) of multiple interpolation methods with grid cells ranging from 0.5 to 10 m. A raster cell with smaller dimensions drastically increases the amount of detail represented in the DEM by increasing the number of elevation values across the study area. Increased horizontal resolution raises the accuracy of the interpolated surfaces and of the contours generated from the digitized landscape. As the raster grid cells decrease in size, the representation of hydrological processes improves significantly compared with coarser resolutions, including the publicly available National Elevation Datasets (NEDs). Utilizing the LiDAR-derived DEM with the lowest RMSE as the 'ground truth', watershed boundaries were delineated for a sub-basin of the Clear Creek Watershed within the territory of the Seneca Nation of Indians in southern Erie County, NY. An investigation of watershed area and boundary location revealed considerable differences when applying different interpolation methods to DEM datasets of different horizontal resolutions. Stream networks coupled with watersheds were used to calculate peak flow values for the 10 m NEDs and the LiDAR-derived DEMs.
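
    The core computation is straightforward to sketch: interpolate the LiDAR points with a chosen method, sample the surface at the GCP locations, and report the RMSE; looping this over methods and cell sizes reproduces the comparison described above. The use of scipy's griddata here is an illustrative choice.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    def interpolation_rmse(points_xyz, gcp_xy, gcp_z, method="linear"):
        """RMSE of an interpolated surface against surveyed GCP elevations.

        points_xyz: (n, 3) LiDAR points; gcp_xy: (m, 2); gcp_z: (m,).
        method: 'nearest', 'linear', or 'cubic'.
        """
        z_hat = griddata(points_xyz[:, :2], points_xyz[:, 2],
                         gcp_xy, method=method)
        ok = np.isfinite(z_hat)          # GCPs outside the convex hull give NaN
        return np.sqrt(np.mean((z_hat[ok] - gcp_z[ok]) ** 2))
    ```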

  12. The Discrete Equation Method (DEM) for Fully Compressible, Two-Phase Flows in Ducts of Spatially Varying Cross-Section

    SciTech Connect

    R. A. Berry; R. Saurel; O. LeMetayer

    2010-11-01

    For the simulation of light water nuclear reactor coolant flows, general two-phase models (valid for all volume fractions) have generally been used which, while allowing for velocity disequilibrium, normally force pressure equilibrium between the phases (see, for example, the numerous models of this type described in H. Städtke, Gasdynamic Aspects of Two-Phase Flow, Wiley-VCH, 2006). These equations are not hyperbolic, their physical wave dynamics are incorrect, and their solution algorithms rely on dubious, truncation-error-induced artificial viscosity to render them numerically well posed over a portion of the computational spectrum. The inherent problems of the traditional approach to multiphase modeling, which begins with an averaged system of (ill-posed) partial differential equations (PDEs) that are then discretized to form a numerical scheme, are avoided by employing a new homogenization method known as the Discrete Equation Method (DEM) (R. Abgrall and R. Saurel, Discrete Equations for Physical and Numerical Compressible Multiphase Mixtures, J. Comp. Phys. 186, 361-396, 2003). This method results in well-posed hyperbolic systems, a property that is important for transient flows. It also allows a clear treatment of non-conservative terms (terms involving interfacial variables and volume fraction gradients), permitting the solution of interface problems without conservation errors, a feature that is important for the direct numerical simulation of two-phase flows. Unlike conventional methods, the averaged system of PDEs for the mixture is not used; the DEM directly obtains a well-posed discrete equation system from the single-phase conservation laws, producing a numerical scheme which accurately computes fluxes for an arbitrary number of phases and solves non-conservative products. The method effectively uses a sequence of single-phase Riemann problem solutions. Phase interactions are accounted for by Riemann solvers at each interface. Non

  13. Evaluation of the performance of the cross-flow air classifier in manufactured sand processing via CFD-DEM simulations

    NASA Astrophysics Data System (ADS)

    Petit, H. A.; Irassar, E. F.; Barbosa, M. R.

    2017-03-01

    Manufactured sands are particulate materials obtained as a by-product of rock crushing. Particle sizes in the sand can be as large as 6 mm and as small as a few microns. The concrete industry has been increasingly using these sands as fine aggregates to replace natural sands. The main shortcoming is the excess of particles smaller than 0.075 mm (dust). This problem has traditionally been solved by a washing process. Air classification is being studied to replace the washing process and avoid the use of water. The complex classification process can only be understood with the aid of CFD-DEM simulations. This paper evaluates the applicability of a cross-flow air classifier to reduce the amount of dust in manufactured sands. Computational fluid dynamics (CFD) and discrete element modelling (DEM) were used for the assessment. Results show that the correct classification setup improves the size distribution of the raw materials. The cross-flow air classification is found to be influenced by the particle size distribution and the turbulence inside the chamber. The classifier can be re-designed to work at low inlet velocities to produce manufactured sand for the concrete industry.

  14. ASTER DEM performance

    USGS Publications Warehouse

    Fujisada, H.; Bailey, G.B.; Kelly, Glen G.; Hara, S.; Abrams, M.J.

    2005-01-01

    The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument onboard the National Aeronautics and Space Administration's Terra spacecraft has an along-track stereoscopic capability, using its near-infrared spectral band to acquire the stereo data. ASTER has two telescopes, one for nadir viewing and another for backward viewing, with a base-to-height ratio of 0.6. The spatial resolution is 15 m in the horizontal plane. Parameters such as the line-of-sight vectors and the pointing axis were adjusted during the initial operation period to generate Level-1 data products with high-quality stereo system performance. The evaluation of the digital elevation model (DEM) data was carried out separately by the Japanese and U.S. science teams using different DEM generation software and reference databases. The vertical accuracy of the DEM data generated from the Level-1A data is 20 m with 95% confidence, without ground control point (GCP) correction, for individual scenes. Geolocation accuracy, which is important for the DEM datasets, is better than 50 m; this appears to be limited by the spacecraft position accuracy. In addition, a slight increase in accuracy is observed when GCPs are used to generate the stereo data.

  15. A hierarchical pyramid method for managing large-scale high-resolution drainage networks extracted from DEM

    NASA Astrophysics Data System (ADS)

    Bai, Rui; Li, Tiejian; Huang, Yuefei; Li, Jiaye; Wang, Guangqian; Yin, Dongqin

    2015-12-01

    The increasing resolution of Digital Elevation Models (DEMs) and the development of drainage network extraction algorithms make it possible to develop high-resolution drainage networks for large river basins. These vector networks contain massive numbers of river reaches with associated geographical features, including topological connections and topographical parameters. These features create challenges for efficient map display and data management, in particular for multi-scale hydrological simulations using multi-resolution river networks. In this paper, a hierarchical pyramid method is proposed which iteratively generates coarsened vector drainage networks from the original one. The method is based on the Horton-Strahler (H-S) order schema. At each coarsening step, the river reaches with the lowest H-S order are pruned and their related sub-basins are merged, while the topological connections and topographical parameters of each coarsened drainage network are inherited from the previous level using formulas presented in this study. The method was applied to the original drainage network of a watershed in the Huangfuchuan River basin, extracted from a 1 m-resolution airborne LiDAR DEM, and to the full Yangtze River basin in China, extracted from the 30 m-resolution ASTER GDEM. In addition, a map-display and parameter-query web service was published for the Mississippi River basin, whose data were extracted from the 30 m-resolution ASTER GDEM. The results presented in this study indicate that the developed method can effectively manage and display massive amounts of drainage network data and can facilitate multi-scale hydrological simulations.
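
    One coarsening step of such a pyramid can be sketched as computing Horton-Strahler orders over the reach tree and pruning all first-order reaches. The upstream-pointer data layout below is an assumption, the inheritance formulas for topographic parameters are omitted, and an iterative traversal would be advisable for very large networks.

    ```python
    def strahler_orders(upstream):
        """upstream maps reach id -> list of reach ids draining into it."""
        order = {}

        def visit(r):
            ups = [visit(c) for c in upstream.get(r, [])]
            if not ups:
                order[r] = 1                           # headwater reach
            else:
                top = max(ups)
                # order increases only when two equal-order reaches meet
                order[r] = top + 1 if ups.count(top) > 1 else top
            return order[r]

        # outlets are reaches that never appear as someone's upstream reach
        all_ups = {c for v in upstream.values() for c in v}
        for outlet in set(upstream) - all_ups:
            visit(outlet)
        return order

    def prune_lowest(upstream, order):
        """Drop all first-order reaches, keeping the remaining topology."""
        keep = {r for r, o in order.items() if o > 1}
        return {r: [c for c in ups if c in keep]
                for r, ups in upstream.items() if r in keep}
    ```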

  16. Evaluation of ASTER and SRTM DEM data for lahar modeling: A case study on lahars from Popocatépetl Volcano, Mexico

    NASA Astrophysics Data System (ADS)

    Huggel, C.; Schneider, D.; Miranda, P. Julio; Delgado Granados, H.; Kääb, A.

    2008-02-01

    Lahars are among the most serious and far-reaching volcanic hazards. In regions where lahars can potentially interact with populated areas and human structures, the assessment of the related hazards is crucial for undertaking appropriate mitigation actions and reducing the associated risks. Modeling of lahars has become an important tool in such assessments, in particular where the geologic record of past events is insufficient. Mass-flow modeling strongly relies on digital terrain data, but the availability of digital elevation models (DEMs) is often limited and thus an obstacle to lahar modeling. Remote-sensing technology has opened new perspectives for generating DEMs. In this study, we evaluate the feasibility of DEMs derived from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Shuttle Radar Topography Mission (SRTM) for lahar modeling on Popocatépetl Volcano, Mexico. Two GIS-based models are used for lahar modeling, LAHARZ and a flow-routing-based debris-flow model (modified single-flow-direction model, MSF), both predicting areas potentially affected by lahars. Results show that both the ASTER and SRTM DEMs are basically suitable for use with LAHARZ and MSF. Flow-path prediction is found to be more reliable with SRTM data, though at a coarser spatial resolution. Errors of the ASTER DEM affecting the prediction of flow paths due to the sensor geometry are associated with deeply incised gorges with north-facing slopes. LAHARZ is more sensitive to errors of the ASTER DEM than the MSF model. Lahar modeling with the ASTER DEM results in a more finely spaced predicted inundation area but does not add any significant information in comparison with the SRTM DEM. Lahars at Popocatépetl are modeled with volumes of 1 × 10^5 to 8 × 10^6 m^3, based on ice-melt scenarios of the glaciers on top of the volcano and data on recent and historical lahar events. As regards recently observed lahars, the travel

  17. Evaluating DEM conditioning techniques, elevation source data, and grid resolution for field-scale hydrological parameter extraction

    NASA Astrophysics Data System (ADS)

    Woodrow, Kathryn; Lindsay, John B.; Berg, Aaron A.

    2016-09-01

    Although digital elevation models (DEMs) prove useful for a number of hydrological applications, they are often the end result of numerous processing steps, each of which contains uncertainty. These uncertainties can greatly influence DEM quality and propagate further into DEM-derived attributes, including derived surface and near-surface drainage patterns. This research examines the impacts of DEM grid resolution, elevation source data, and conditioning techniques on the spatial and statistical distribution of field-scale hydrological attributes for a 12,000 ha watershed of an agricultural area within southwestern Ontario, Canada. Three conditioning techniques were examined: depression filling (DF), depression breaching (DB), and stream burning (SB). The catchments draining to each boundary of 7933 agricultural fields were delineated using the surface drainage patterns modeled from LiDAR data interpolated to 1 m, 5 m, and 10 m resolution DEMs, and from a 10 m resolution photogrammetric DEM. The results showed that variation in DEM grid resolution produced significant differences in the spatial and statistical distributions of contributing areas and the distributions of downslope flowpath length. Degrading the grid resolution of the LiDAR data from 1 m to 10 m resulted in a disagreement in mapped contributing areas of between 29.4% and 37.3% of the study area, depending on the DEM conditioning technique. The disagreements among the field-scale contributing areas mapped from the 10 m LiDAR DEM and the photogrammetric DEM were large, with nearly half of the study area draining to alternate field boundaries. Differences in derived contributing areas and flowpaths among the conditioning techniques increased substantially at finer grid resolutions, with the largest disagreement among mapped contributing areas occurring between the 1 m resolution DB DEM and the SB DEM (37% disagreement) and the DB-DF comparison (36.5% disagreement in mapped
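
    Two of the three conditioning techniques can be illustrated with the richdem package, assuming its FillDepressions and BreachDepressions functions behave as in recent versions (check the installed API); stream burning, which needs a mapped stream network, is omitted here.

    ```python
    import richdem as rd

    dem = rd.LoadGDAL("watershed_dem.tif")                 # hypothetical input file
    filled = rd.FillDepressions(dem, in_place=False)       # DF conditioning
    breached = rd.BreachDepressions(dem, in_place=False)   # DB conditioning
    ```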

  18. Numerical slope stability simulations of chasma walls in Valles Marineris/Mars using a distinct element method (dem).

    NASA Astrophysics Data System (ADS)

    Imre, B.

    2003-04-01

    The 8- to 10-km depths of Valles Marineris (VM) offer excellent views into the upper Martian crust. Layering, fracturing, lithology, stratigraphy, and the content of volatiles have influenced the evolution of the Valles Marineris wall slopes, but these parameters also reflect the development of VM and its wall slopes. The scope of this work is to gain understanding of these parameters by back-simulating the development of the wall slopes. For that purpose, the two-dimensional Particle Flow Code PFC2D has been chosen (ITASCA, version 2.00-103). PFC2D is a distinct element code for the numerical modelling of movements and interactions of assemblies of arbitrarily sized circular particles. Particles may be bonded together to represent a solid material, and their movements are unlimited. This is important because the results of open systems with numerous unknown variables are non-unique and therefore highly path dependent. This DEM allows the simulation of whole development paths of VM walls, which makes confirmation of the model more complete (e.g., Oreskes et al., Science 263, 1994). To reduce the number of unknown variables, a suitable (that is, as simple as possible) field site had to be selected: the northern wall of eastern Candor Chasma. This wall is up to 8 km high and represents a significant outcrop of the upper Martian crust, and it is relatively uncomplicated, well aligned, and of simple morphology. Work on the model is currently at the parameter-study stage; results will be presented as a poster at the EGS meeting.

  19. The evaluation of unmanned aerial system-based photogrammetry and terrestrial laser scanning to generate DEMs of agricultural watersheds

    NASA Astrophysics Data System (ADS)

    Ouédraogo, Mohamar Moussa; Degré, Aurore; Debouche, Charles; Lisein, Jonathan

    2014-06-01

    Agricultural watersheds tend to be places of intensive farming activity that permanently modifies their microtopography. The surface characteristics of the soil vary depending on the crops cultivated in these areas. Agricultural soil microtopography plays an important role in the quantification of runoff and sediment transport because the presence of crops, crop residues, furrows, and ridges may affect the direction of water flow. To better assess such phenomena, 3-D reconstructions of high-resolution agricultural watershed topography are essential. Fine-resolution topographic data collection technologies can be used to discern highly detailed elevation variability in these areas, and knowledge of the strengths and weaknesses of the available technologies may be helpful in choosing among them. This study assesses the suitability of terrestrial laser scanning (TLS) and unmanned aerial system (UAS) photogrammetry for collecting the fine-resolution topographic data required to generate accurate, high-resolution digital elevation models (DEMs) of a small watershed area (12 ha). Because of farming activity, 14 TLS scans (≈25 points/m2) were collected without the high-definition surveying (HDS) targets generally used to mesh adjacent scans. To evaluate the accuracy of the DEMs created from the TLS scan data, 1098 ground control points (GCPs) were surveyed using a real-time kinematic global positioning system (RTK-GPS). Linear regressions were then applied to each DEM to remove vertical errors from the TLS point elevations: errors caused by the non-perpendicularity of the scanner's vertical axis to the local horizontal plane, and errors correlated with the distance to the scanner's position. The scans were then meshed to generate a DEMTLS with a 1 × 1 m spatial resolution. The Agisoft PhotoScan and MicMac software packages were used to process the aerial photographs and generate a DEMPSC

  20. Construction of lunar DEMs based on reflectance modelling

    NASA Astrophysics Data System (ADS)

    Grumpe, Arne; Belkhir, Fethi; Wöhler, Christian

    2014-06-01

    Existing lunar DEMs obtained from laser altimetry or photogrammetric image analysis are characterised by high large-scale accuracy, while their lateral resolution is strongly limited by noise or interpolation artifacts. In contrast, image-based photometric surface reconstruction approaches reveal small-scale surface detail but become inaccurate on large spatial scales. The framework proposed in this study therefore combines photometric image information of high lateral resolution with DEM data of comparably low lateral resolution in order to obtain DEMs of high lateral resolution that are also accurate on large spatial scales. Our first approach combines an extended photoclinometry scheme with a shape-from-shading-based method. A novel variational surface reconstruction method further increases the lateral resolution of the DEM until it reaches that of the underlying images. We employ the Hapke IMSA and AMSA reflectance models with two different formulations of the single-particle scattering function, such that the single-scattering albedo of the surface particles, and optionally the asymmetry parameter of the single-particle scattering function, can be estimated pixel-wise. As our DEM construction methods require co-registered images, an illumination-independent image registration scheme was developed. An evaluation of our framework based on synthetic image data yields an average elevation accuracy of the constructed DEMs of better than 20 m as long as the correct reflectance model is assumed. When comparing our DEMs to LOLA single-track data, absolute elevation accuracies of around 30 m are obtained for test regions covering an elevation range of several thousand metres. The proposed illumination-independent image registration method yields subpixel accuracy even in the presence of 3-D perspective distortions. The pixel-wise reflectance parameters estimated simultaneously with the DEM reflect compositional contrasts between different surface units.

  1. Topographic changes due to the 2008 Mw 7.9 Wenchuan earthquake as revealed by the differential DEM method

    NASA Astrophysics Data System (ADS)

    Ren, Zhikun; Zhang, Zhuqi; Dai, Fuchu; Yin, Jinhui; Zhang, Huiping

    2014-07-01

    Landscape evolution in active orogenic regions is inevitably affected by the repeated strong earthquakes triggered by the corresponding active faults. However, the lack of adequate methods for documenting and monitoring mountain-building processes has resulted in a shortage of quantitative estimates of orogenic and eroded volumes. A strong earthquake and its associated co-seismic landslides represent a sudden pulse in landscape evolution in tectonically active areas. The 2008 Mw 7.9 Wenchuan earthquake dramatically modified the topography of the Longmen Shan region. Based on topographic data acquired before the earthquake and stereo pairs of post-earthquake remote sensing imagery, we derived pre- and post-earthquake DEMs (digital elevation models) of three regions along the Longmen Shan Thrust Belt. By comparing the geomorphic features before and after the earthquake, we find that the Wenchuan earthquake smoothed the steep relief and caused co-seismic uplift of the Longmen Shan region. The area of medium-relief terrain increased, while that of high-relief terrain decreased, indicating that local relief is controlled by repeated strong earthquakes. The changes in slope aspect indicate that the formation and modification of the east- and west-facing slopes are controlled by tectonic events in the Longmen Shan region, which might be associated with the regional stress field, whereas the unchanged aspects of other slopes might be controlled by long-term erosion rather than tectonic events. The topographic changes, landslide volume, and co-seismic uplift indicate that the greatest seismically induced denudation occurred in association with a thrust faulting mechanism and low-angle fault geometry. Our findings reveal that the local relief has been shaped by the localized, seismically induced high rate of denudation within the plateau margins, and that the formation of local relief is also related to tectonic events, especially the events that have occurred on low
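
    The differential DEM (DEM-of-Difference) computation itself is compact: subtract the pre-event surface from the post-event surface, zero out changes below a level of detection, and integrate positive and negative cells into uplift/deposition and denudation volumes. A minimal sketch, with the 2 m detection threshold as an assumption:

    ```python
    import numpy as np

    def dem_of_difference(post_dem, pre_dem, cellsize, lod=2.0):
        """Thresholded elevation change and gain/loss volumes."""
        dod = post_dem - pre_dem
        dod[np.abs(dod) < lod] = 0.0             # below level of detection
        cell_area = cellsize ** 2
        gain = dod[dod > 0].sum() * cell_area    # uplift / deposition volume
        loss = -dod[dod < 0].sum() * cell_area   # denudation volume
        return dod, gain, loss
    ```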

  2. GPU accelerated Discrete Element Method (DEM) molecular dynamics for conservative, faceted particle simulations

    NASA Astrophysics Data System (ADS)

    Spellings, Matthew; Marson, Ryan L.; Anderson, Joshua A.; Glotzer, Sharon C.

    2017-04-01

    Faceted shapes, such as polyhedra, are commonly found in systems of nanoscale, colloidal, and granular particles. Many interesting physical phenomena, like crystal nucleation and growth, vacancy motion, and glassy dynamics are challenging to model in these systems because they require detailed dynamical information at the individual particle level. Within the granular materials community the Discrete Element Method has been used extensively to model systems of anisotropic particles under gravity, with friction. We provide an implementation of this method intended for simulation of hard, faceted nanoparticles, with a conservative Weeks-Chandler-Andersen (WCA) interparticle potential, coupled to a thermodynamic ensemble. This method is a natural extension of classical molecular dynamics and enables rigorous thermodynamic calculations for faceted particles.

  3. An efficient and comprehensive method for drainage network extraction from DEM with billions of pixels using a size-balanced binary search tree

    NASA Astrophysics Data System (ADS)

    Bai, Rui; Li, Tiejian; Huang, Yuefei; Li, Jiaye; Wang, Guangqian

    2015-06-01

    With the increasing resolution of digital elevation models (DEMs), computational efficiency problems arise when extracting the drainage network of a large river basin at the billion-pixel scale. The efficiency of the most time-consuming step, the depression-filling pretreatment, has been improved by the O(N log N)-complexity least-cost path search method, but the complete extraction steps following this method had not previously been proposed and tested. In this paper, an improved O(N log N) algorithm is proposed which introduces a size-balanced binary search tree (BST) to further improve the efficiency of the depression-filling pretreatment. The subsequent extraction steps, including flow direction determination and upslope area accumulation, were also redesigned to benefit from this improvement, yielding an efficient and comprehensive method. The method was tested by extracting the drainage networks of 31 river basins with areas greater than 500,000 km2 from the 30 m-resolution ASTER GDEM and of two sub-basins with areas of approximately 1000 km2 from a 1 m-resolution airborne LiDAR DEM. Complete drainage networks with both vector features and topographic parameters were obtained with time consumption of O(N log N) complexity. The results indicate that the developed method can be used to extract entire drainage networks from DEMs with billions of pixels with high efficiency.
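
    The least-cost depression filling the paper builds on is the priority-flood idea: grow inward from the DEM edges in order of increasing elevation, raising every newly reached cell at least to the level of the cell it was reached from. The sketch below uses a binary heap where the paper substitutes a size-balanced BST; it illustrates the flooding order, not the paper's implementation.

    ```python
    import heapq
    import numpy as np

    def priority_flood(dem):
        """Fill depressions so every cell drains to the DEM edge."""
        filled = dem.astype(float).copy()
        rows, cols = filled.shape
        visited = np.zeros_like(filled, dtype=bool)
        heap = []
        # seed the priority queue with all edge cells
        for r in range(rows):
            for c in range(cols):
                if r in (0, rows - 1) or c in (0, cols - 1):
                    heapq.heappush(heap, (filled[r, c], r, c))
                    visited[r, c] = True
        # flood inward in order of increasing elevation
        while heap:
            z, r, c = heapq.heappop(heap)
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and not visited[nr, nc]:
                        visited[nr, nc] = True
                        filled[nr, nc] = max(filled[nr, nc], z)  # raise pit cells
                        heapq.heappush(heap, (filled[nr, nc], nr, nc))
        return filled
    ```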

  4. Depletion of glutathione in vivo as a method of improving the therapeutic ratio of misonidazole and SR 2508. [BSO; DEM]

    SciTech Connect

    Yu, N.Y.; Brown, J.M.

    1984-08-01

    Depletion of intracellular glutathione (GSH) can enhance misonidazole (MISO) radiosensitizing efficacy both in vivo and in vitro. However, such treatments may also enhance systemic toxicity in animals. The purpose of the present study was to test various ways of depleting GSH levels in a variety of experimental mouse tumors, to measure the resulting improvement in the efficacy of MISO and its less toxic analog SR 2508, and to determine the effect of daily GSH depletion on the toxicity of MISO and SR 2508. GSH levels were measured daily for 5 days in tumors, livers, and brains of mice injected daily with buthionine sulfoximine (BSO), with or without diethylmaleate (DEM). Daily doses of BSO depleted tumor GSH levels to 20 to 40% of controls by 6 hr after each injection, and injection of DEM 6 hr after BSO further enhanced the depletion. Administration of MISO or SR 2508 at the time of maximum GSH depletion enhanced MISO efficacy by factors of 2.6 to 8 for depletion to 8% of controls by BSO + DEM, but no enhancement of SR 2508 was seen in tumors at the 20% GSH levels achieved with BSO alone in the preliminary experiment. The chronic toxicity of MISO was not enhanced by BSO and was enhanced by a factor of up to 2 by BSO + DEM.

  5. The role of method of production and resolution of the DEM on slope-units delineation for landslide susceptibility assessment - Ubaye Valley, French Alps case study

    NASA Astrophysics Data System (ADS)

    Schlögel, Romy; Marchesini, Ivan; Alvioli, Massimiliano; Reichenbach, Paola; Rossi, Mauro; Malet, Jean-Philippe

    2016-04-01

    Landslide susceptibility assessment forms the basis of any hazard mapping, which is one of the essential parts of quantitative risk mapping. For the same study area, different susceptibility maps can be obtained depending on the susceptibility mapping method, the mapping unit, and the scale. In the Ubaye Valley (South French Alps), we investigate the effect of the resolution and the method of production of the DEM used to delineate slope units for landslide susceptibility mapping. Slope-unit delineation was processed using multiple combinations of circular variance and minimum area values, which are the input parameters of a new software tool for terrain partitioning; the method maximizes the homogeneity of aspect direction inside each unit and the inhomogeneity between different units. We computed slope-unit delineations for 5, 10, and 25 m resolution DEMs and investigated the statistical distributions of morphometric variables within the resulting polygons. Then, for each slope-unit partitioning, we calibrated a landslide susceptibility model, considering landslide bodies and scarps as the dependent variable (binary response). This work aims to analyse the role of DEM resolution in slope-unit delineation for landslide susceptibility assessment. The Area Under the Curve of the Receiver Operating Characteristic is used to evaluate the susceptibility model calculations. In addition, we further analysed the performance of the logistic regression model by examining the percentage of significant variables in the statistical analyses. Results show that smaller slope units have a better chance of containing a smaller number of thematic and morphometric variables, allowing for an easier classification. The reliability of the models as a function of the DEM resolution, and of the use of scarp areas versus landslide bodies as the dependent variable, is discussed.
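
    The calibration/validation step can be sketched with scikit-learn: fit a logistic regression on per-slope-unit variables with landslide presence/absence as the binary response, and score it by ROC AUC. The simple random split below is an illustrative assumption.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    def calibrate_susceptibility(X, y, test_fraction=0.3, seed=0):
        """X: (n_units, n_variables) array; y: binary presence/absence."""
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(y))
        n_test = int(len(y) * test_fraction)
        test, train = idx[:n_test], idx[n_test:]
        model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
        auc = roc_auc_score(y[test], model.predict_proba(X[test])[:, 1])
        return model, auc
    ```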

  6. On the investigation of the performances of a DEM-based hydrogeomorphic floodplain identification method in a large urbanized river basin: the Tiber river case study in Italy

    NASA Astrophysics Data System (ADS)

    Nardi, Fernando; Biscarini, Chiara; Di Francesco, Silvia; Manciola, Piergiorgio

    2013-04-01

    consequently identified as those river buffers, draining towards the channel, with an elevation that is less than the maximum flow depth of the corresponding outlet. Keeping in mind that the performance of this hydrogeomorphic model is strictly related to the quality and properties of the input DEM, and that this kind of methodology is not intended to replace standard flood modeling and mapping methods, in this work the performance of the approach is qualitatively evaluated by comparing results with standard flood maps. The Tiber river basin, one of the main river basins in Italy with a drainage area of approximately 17,000 km2, was selected as the case study. This comparison is interesting for understanding the performance of the model in a large and complex domain where the impact of the urbanization matrix is significant. Results of this investigation confirm the potential of such DEM-based floodplain mapping models for providing fast, timely, homogeneous and continuous inundation scenarios to urban planners and decision makers, but also the drawbacks of using such a methodology where humans are significantly and rapidly modifying surface properties.
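
    A toy sketch of the selection rule just described, assuming the drainage linkage between buffer cells and their outlet is already known; a real implementation would derive that linkage from DEM flow directions.

```python
# Flag DEM cells whose height above the channel cell they drain to is
# below that outlet's estimated maximum flow depth. Arrays are synthetic.
import numpy as np

elev = np.array([[12.0, 11.0, 10.5],
                 [11.5, 10.2, 10.0],
                 [11.0, 10.1,  9.8]])
channel_elev = 9.8       # elevation of the outlet channel cell
max_flow_depth = 1.0     # hydrologically scaled flow depth at the outlet

# Cells draining to this outlet (assumed known here) are floodplain
# if their elevation above the channel is less than the flow depth.
height_above_channel = elev - channel_elev
floodplain = height_above_channel < max_flow_depth
print(floodplain)
```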

  7. The Oracle of DEM

    NASA Astrophysics Data System (ADS)

    Gayley, Kenneth

    2013-06-01

    The predictions of the famous Greek oracle of Delphi were just ambiguous enough to seem to convey information, yet the user was only seeing their own thoughts. Are there ways in which X-ray spectral analysis is like that oracle? It is shown, using heuristic, generic response functions to mimic actual spectral inversion, that the widely known ill conditioning, which makes formal inversion impossible in the presence of random noise, also makes a wide variety of different source distributions (DEMs) produce quite similar X-ray continua and resonance-line fluxes. Indeed, the sole robustly inferable attribute for a thermal, optically thin resonance-line spectrum with normal abundances in collisional ionization equilibrium (CIE) is its average temperature. The shape of the DEM distribution, on the other hand, is not well constrained, and may actually depend more on the analysis method, no matter how sophisticated, than on the source plasma. The case is made that X-ray spectra can tell us the average temperature, the metallicity, and the absorbing column, but the main thing they cannot tell us is the main thing they are most often used to infer: the differential emission measure distribution.
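
    A numerical caricature of this ill conditioning, under assumed (made-up) Gaussian response kernels and no real atomic data: two DEMs with the same average temperature but very different widths produce nearly the same set of fluxes.

```python
# Push two deliberately different source distributions through broad,
# generic response kernels and compare the resulting "line fluxes".
import numpy as np

logT = np.linspace(6.0, 8.0, 200)

# Two DEM shapes with the same centroid and total emission measure
dem_narrow = np.exp(-0.5 * ((logT - 7.0) / 0.1) ** 2)
dem_wide = np.exp(-0.5 * ((logT - 7.0) / 0.4) ** 2)
dem_narrow /= dem_narrow.sum()
dem_wide /= dem_wide.sum()

# Generic, very broad response kernels (one row per "line"/band)
centers = np.linspace(6.3, 7.7, 10)
R = np.exp(-0.5 * ((logT[None, :] - centers[:, None]) / 1.0) ** 2)

f_narrow, f_wide = R @ dem_narrow, R @ dem_wide
print(np.max(np.abs(f_narrow - f_wide) / f_narrow))
# small (under ~10%): the two DEMs are nearly indistinguishable in flux
```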

  8. Topographic representation using DEMs and its applications to active tectonics research

    NASA Astrophysics Data System (ADS)

    Oguchi, T.; Lin, Z.; Hayakawa, Y. S.

    2016-12-01

    Identifying topographic deformations due to active tectonics has been a principal issue in tectonic geomorphology. It provides useful information such as whether a fault has been active during the recent past. Traditionally, field observations, conventional surveying, and visual interpretation of topographic maps, aerial photos, and satellite images were the main methods for such geomorphological investigations. However, recent studies have been utilizing digital elevation models (DEMs) to visualize and quantitatively analyze landforms. There are many advantages to the use of DEMs for research in active tectonics. For example, unlike aerial photos and satellite images, DEMs show ground conditions without vegetation and man-made objects such as buildings, permitting direct representation of tectonically deformed landforms. Recent developments and advances in airborne LiDAR also allow the fast creation of DEMs even in vegetated areas such as forested lands. In addition, DEMs enable flexible topographic visualization based on various digital cartographic and computer-graphic techniques, facilitating identification of particular landforms such as active faults. Further, recent progress in morphometric analyses using DEMs can be employed to quantitatively represent topographic characteristics, and objectively evaluate tectonic deformation and the properties of related landforms. This paper presents a review of DEM applications in tectonic geomorphology, with attention to historical development, recent advances, and future perspectives. Examples are taken mainly from Japan, a typical tectonically active country. The broader contributions of DEM-based active tectonics research to other fields, such as fluvial geomorphology and geochronology, will also be discussed.

  9. Operational TanDEM-X DEM calibration and first validation results

    NASA Astrophysics Data System (ADS)

    Gruber, Astrid; Wessel, Birgit; Huber, Martin; Roth, Achim

    2012-09-01

    In June 2010, the German TanDEM-X satellite was launched. Together with its twin satellite TerraSAR-X it flies in a close formation enabling single-pass SAR interferometry. The primary goal of the TanDEM-X mission is the derivation of a global digital elevation model (DEM) with unprecedented global accuracies of 10 m absolute and 2 m relative height. A significant calibration effort is required to achieve this high quality world-wide. Despite intensive instrument calibration and highly accurate orbit and baseline determination, some systematic height errors, such as offsets and tilts on the order of a few meters, remain in the interferometric DEMs and have to be determined and removed during TanDEM-X DEM calibration. The objective of this article is to present an approach for estimating correction parameters for the remaining systematic height errors, applicable to interferometric height models. The approach is based on a least-squares block adjustment using elevations from ICESat GLA 14 data as ground control points and connecting points of adjacent, overlapping DEMs as tie-points. In the first part, its implementation in DLR's ground segment is outlined. In the second part, the approach is applied and validated for two of the first TanDEM-X DEM test sites, using independent reference data, in particular high-resolution reference DEMs and GPS tracks. The results show that the absolute height errors of the TanDEM-X DEM are small in these cases, mostly on the order of 1-2 m. An additional benefit of the proposed block adjustment method is that it improves the relative accuracy of adjacent DEMs.
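
    A toy version of such a block adjustment, reduced to one height offset per tile (the operational system also estimates tilts); the GCP and tie-point heights below are invented.

```python
# Estimate a height offset for each of two overlapping DEM tiles from
# a few ICESat-like ground control heights plus tie points in the overlap.
import numpy as np

# Observed (biased) tile heights at GCP locations, and the GCP heights
h_tile1_at_gcp = np.array([101.8, 152.1])   # tile 1, true bias ~ +2 m
h_gcp1 = np.array([99.9, 150.2])
h_tile2_at_gcp = np.array([120.5])          # tile 2, true bias ~ +0.5 m
h_gcp2 = np.array([120.0])

# Tie points: the same ground location seen in both tiles
h_tile1_at_tie = np.array([130.0, 140.1])
h_tile2_at_tie = np.array([128.4, 138.7])

# Unknowns: x = [offset1, offset2]; model: h_obs - offset = h_true
A, b = [], []
for h_obs, h_ref in zip(h_tile1_at_gcp, h_gcp1):
    A.append([1.0, 0.0]); b.append(h_obs - h_ref)
for h_obs, h_ref in zip(h_tile2_at_gcp, h_gcp2):
    A.append([0.0, 1.0]); b.append(h_obs - h_ref)
# Tie points constrain the *difference* of the two offsets
for h1, h2 in zip(h_tile1_at_tie, h_tile2_at_tie):
    A.append([1.0, -1.0]); b.append(h1 - h2)

offsets, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
print(offsets)  # estimated per-tile height offsets to subtract
```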

  10. DEM and GIS analysis of the stream gradient index to evaluate effects of tectonics: The Normandy intraplate area (NW France)

    NASA Astrophysics Data System (ADS)

    Font, Marianne; Amorese, Daniel; Lagarde, Jean-Louis

    2010-07-01

    Computer-based geomorphometry using a DEM (Digital Elevation Model) allows the analysis of the three-dimensional properties of the landscape. This methodology is particularly useful in an intraplate region like western Europe, where simple visual inspection of the topography cannot resolve the evolutionary trends of landforms. In these domains, the morphology of the topographic surface may be controlled mainly by climate under a low rate of tectonic deformation. Among the geomorphometric parameters, the stream gradient index (SL) has been used to characterize fluvial systems in relation to tectonic movements. This work develops an algorithm to derive and map the SL index using a DEM and GIS, to investigate its spatial variations over a broad area. The algorithm is applied to a zone of weak intraplate deformation: the coastal lowlands of Normandy (France). The obtained spatial distributions of SL point to anomalous zones with high SL values. These zones are adjacent to mapped fault scarps and characterized by changes in flow direction. A Kruskal-Wallis test shows that the bedrock lithology has no impact on the SL value. Therefore, the SL variations can be related mainly to differential uplift due to Quaternary tectonic forcing. Quaternary sea level fluctuations may also be responsible for high SL values in part of the coastal lowland.
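
    The stream gradient index, commonly computed as SL = (dH/dL)·L with L the along-channel distance from the divide to the reach midpoint, can be evaluated along a longitudinal profile; the distances and elevations below are invented.

```python
# SL index along a hypothetical stream profile.
import numpy as np

dist = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0])   # m from divide
elev = np.array([400.0, 380.0, 355.0, 340.0, 300.0])    # m

dH = -np.diff(elev)                    # elevation drop per reach
dL = np.diff(dist)                     # reach length
L_mid = 0.5 * (dist[:-1] + dist[1:])   # distance to reach midpoint
SL = (dH / dL) * L_mid
print(SL)  # anomalously high values may flag tectonic forcing
```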

  11. Entwicklung und Implementierung von Analysemethoden zum Erfassen von Geschwindigkeitsfeldern mit dem PIV Verfahren (Development and Implementation of Analytical Methods for Detecting Velocity Fields Using the PIV Method)

    DTIC Science & Technology

    2016-04-26

    Particle Image Velocimetry: 1064 nm beam dump, 532 nm output, dielectric polarizer, doubling crystal, laser mirrors and plates, output coupler, Nd:YAG rod ... is reset. The user is returned to the point in the program at which the image pairs had been read in. 5.2.3. Evaluation of the data and ...

  12. Influence of friction on sampling disturbance of lunar surface in direct push sampling method based on DEM

    NASA Astrophysics Data System (ADS)

    Gao, Xingwen; Tang, Dewei; Yue, Honghao; Jiang, Shengyuan; Deng, Zongquan

    2017-06-01

    The direct push sampling method is one of the most commonly used sampling methods in lunar regolith exploration. However, the disturbance of in situ bedding information during the sampling process has remained an unresolved problem. In this paper, the discrete element method is used to establish a numerical lunar soil simulant based on the Hertz-Mindlin contact model. The results of a simulated triaxial test show that the macro-mechanical parameters of the simulant accurately reproduce those of most known lunar soil samples. The friction coefficient between the simulant and the wall of the sampling tube is also tested and used as the key variable in the subsequent simulation and study. The disturbance phenomenon is evaluated by the displacement of marked layers, and a swirling structure is observed. The trend of the soil simulant's void ratio and stress distribution with changing friction coefficient is also determined.
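
    The normal branch of the Hertz-Mindlin contact law named above, as a minimal sketch; the grain radii and elastic constants are placeholders, not calibrated lunar-simulant properties.

```python
# Hertzian normal contact force between two spheres in a DEM model.
import math

def hertz_normal_force(delta, r1, r2, e1, e2, nu1, nu2):
    """F_n = (4/3) * E_eff * sqrt(R_eff) * delta^1.5 for overlap delta."""
    r_eff = (r1 * r2) / (r1 + r2)
    e_eff = 1.0 / ((1 - nu1**2) / e1 + (1 - nu2**2) / e2)
    return (4.0 / 3.0) * e_eff * math.sqrt(r_eff) * delta**1.5

# Two 1 mm grains, 70 GPa modulus, overlapping by 1 micron
print(hertz_normal_force(1e-6, 1e-3, 1e-3, 70e9, 70e9, 0.25, 0.25))
```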

  13. Morphometric evaluation of the Afşin-Elbistan lignite basin using kernel density estimation and Getis-Ord's statistics of DEM derived indices, SE Turkey

    NASA Astrophysics Data System (ADS)

    Sarp, Gulcan; Duzgun, Sebnem

    2015-11-01

    A morphometric analysis of the river network, basins and relief using geomorphic indices and geostatistical analyses of a Digital Elevation Model (DEM) is a useful tool for discussing the morphometric evolution of a basin area. In this study, three different indices, the valley floor width to height ratio (Vf), stream gradient (SL), and stream sinuosity, were applied to the Afşin-Elbistan lignite basin to test the imprints of tectonic activity. Perturbations of these indices are usually indicative of differences in the resistance of outcropping lithological units to erosion and of active faulting. To map the clusters of high and low index values, kernel density estimation (K) and the Getis-Ord Gi∗ statistic were applied to the DEM-derived indices. The K method and the Gi∗ statistic, by highlighting hot and cold spots of the SL index, stream sinuosity and Vf index values, helped to identify the relative tectonic activity of the basin area. The results indicated that the estimations by K and Gi∗, including three conceptualizations of spatial relationships (CSR) for hot spots (percent volume contours 50 and 95, categorized as high and low, respectively), yielded similar results in regions of high and low tectonic activity. According to the K and Getis-Ord Gi∗ statistics, the northern, northwestern and southern parts of the basin indicate high tectonic activity. On the other hand, the low-elevation plain in the central part of the basin shows relatively low tectonic activity.
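
    A plain-NumPy version of the Getis-Ord Gi* statistic used above, applied to a 1-D toy series with a self-inclusive nearest-neighbour weight matrix; large positive or negative z-values mark hot and cold spots.

```python
# Getis-Ord Gi*: compare the weighted local sum of values inside each
# neighbourhood with its expectation under spatial randomness.
import numpy as np

def getis_ord_gi_star(x, w):
    """x: values (n,); w: spatial weights (n, n) including self-weight."""
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x**2).mean() - xbar**2)
    wi = w.sum(axis=1)
    s1 = (w**2).sum(axis=1)
    num = w @ x - xbar * wi
    den = s * np.sqrt((n * s1 - wi**2) / (n - 1))
    return num / den

# Toy example: each point's neighbourhood is itself plus one neighbour
# on each side; the middle cluster should show up as a hot spot.
x = np.array([1.0, 1.2, 0.9, 5.0, 5.5, 5.2, 1.1, 0.8])
w = np.eye(len(x)) + np.eye(len(x), k=1) + np.eye(len(x), k=-1)
print(np.round(getis_ord_gi_star(x, w), 2))  # large |z| = hot/cold spot
```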

  14. Advanced Usability Evaluation Methods

    DTIC Science & Technology

    2007-04-01

    tracking in usability evaluation: A practitioner's guide. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The mind's eye: Cognitive and applied... Terence S. Andre, Lt Col, USAF; Margaret Schurig, Human Factors Design Specialist, The Boeing Co.

  15. [Users of regional dementia care networks in Germany: First results of the evaluation study DemNet-D].

    PubMed

    Wolf-Ostermann, Karin; Meyer, Saskia; Schmidt, Annika; Schritz, Anna; Holle, Bernhard; Wübbeler, Markus; Schäfer-Walkmann, Susanne; Gräske, Johannes

    2017-01-01

    In Germany a growing number of community-based support services for people with dementia (PwD) and their caregivers are organized in dementia care networks (DCN), which provide a single point of entry to social facilities and offer personal care and support. The aim of this study was to describe the health, functional and social characteristics of PwDs enrolled in DCNs throughout Germany, because no data were previously available on this aspect. As part of the multi-center, multi-professional 12-month follow-up study DemNet-D, data on functional and psychological health, sociodemographic and dementia-specific factors, and social inclusion were collected in standardized interviews with PwDs living at home. A total of 560 PwDs with an average age of 80 years were enrolled in the study. Approximately 50% of the participants had Alzheimer's dementia and more than 75% exhibited at least one form of challenging behavior. More than half of the participants lived together with a partner or relative. Instrumental activities of daily living (IADLs) were very limited; nevertheless, one in five PwDs had no assigned long-term care-dependency level. The participants reported a relatively low feeling of loneliness and a high feeling of social inclusion, depending on the severity of dementia. This is one of the first studies generating data on PwDs who receive domiciliary care within DCNs in Germany. The results suggest that the regional DCNs make a successful contribution to overcoming the interface problem and can therefore contribute to a more stable care situation and better social integration of PwDs.

  16. Quality Test Various Existing dem in Indonesia Toward 10 Meter National dem

    NASA Astrophysics Data System (ADS)

    Amhar, Fahmi

    2016-06-01

    Indonesia has various DEMs from many sources, with acquisition dates spread over the past two decades. There are DEMs from spaceborne systems (Radarsat, TerraSAR-X, ALOS, ASTER-GDEM, SRTM), airborne systems (IFSAR, Lidar, aerial photos) and also terrestrial ones. The research objective is to test the quality of these DEMs and to determine how to extract the best DEM for a particular area. The method is differential GPS levelling with geodetic GPS equipment at locations verified as unchanged over the past 20 years. The results show that the DEMs from TerraSAR-X and SRTM30 have the best quality (RMSE 3.1 m and 3.5 m, respectively). Based on this research, it was inferred that these results are consistent with the basic expectation that the lower the spatial resolution of a DEM, the less precise the resulting vertical heights.

  17. DEM-based research on the landform features of China

    NASA Astrophysics Data System (ADS)

    Tang, Guoan; Liu, Aili; Li, Fayuan; Zhou, Jieyu

    2006-10-01

    Landforms can be described and identified by parameterization of a digital elevation model (DEM). This paper discusses the large-scale geomorphological characteristics of China based on numerical analysis of terrain parameters and develops a methodology for characterizing landforms from DEMs. The methodology is implemented as a two-step process. First, terrain variables are derived from a 1-km DEM in a given statistical unit, including local relief, surface incision, elevation variance coefficient, roughness, mean slope and mean elevation. Second, each parameter, regarded as a single-band image, is combined into a multi-band image. Then ISODATA unsupervised classification and the Bayesian maximum likelihood supervised classification are applied for landform classification. The resulting landform map is evaluated by means of stratified sampling with respect to an existing map, and the overall classification accuracy is rather high. It is shown that the derived parameters carry sufficient physiographic information and can be used for landform classification. Since the classification method integrates multiple terrain indexes, avoids the limitations of subjective interpretation, and has a low cost, it shows clear promise for the classification of macroscopic relief forms. Furthermore, it contributes to the theory and methodology of DEM-based digital terrain analysis.
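
    A sketch of that two-step workflow on a random stand-in DEM: derive windowed terrain variables, stack them as bands, and classify; scikit-learn's KMeans is used here in place of ISODATA, which it only approximates.

```python
# Step 1: per-cell terrain variables over a moving statistical unit.
# Step 2: stack the variables as "bands" and cluster into landforms.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter, uniform_filter
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
dem = rng.normal(1000.0, 200.0, size=(60, 60))  # stand-in 1-km DEM tile

size = 5  # statistical-unit window (5 x 5 cells)
relief = maximum_filter(dem, size) - minimum_filter(dem, size)
mean_elev = uniform_filter(dem, size)
gy, gx = np.gradient(dem)
slope = uniform_filter(np.hypot(gx, gy), size)   # mean slope proxy

bands = np.stack([relief, mean_elev, slope], axis=-1).reshape(-1, 3)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(bands)
landform_map = labels.reshape(dem.shape)
print(np.bincount(labels))  # cells per landform class
```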

  18. Use of Integrated MASTER Multispectral Imagery and LiDAR DEM for Active Fault Detection and Evaluation

    NASA Astrophysics Data System (ADS)

    Perez, F. G.; Bryant, W. A.; Treiman, J. A.; Real, C. R.; Hook, S.

    2011-12-01

    Displacement caused by surface fault rupture associated with large earthquakes not only disrupts infrastructure and damages natural and built environments, but also constitutes a life safety hazard. The California Geological Survey (CGS) has the authority and responsibility, under the Alquist-Priolo Earthquake Fault Zoning Act, to identify and map active faults in California for the purpose of surface rupture hazard identification and mitigation through regulatory zoning. Mapping and evaluation of active faults is generally accomplished through conventional aerial photo interpretation and field mapping, which rely on recognizing fault-related geomorphic features and the juxtaposition of contrasting rocks, soils, and geologic structure. Faults covered by vegetation or concealed by young alluvium will most likely not be detected by this method. Furthermore, the spatial accuracy of photo-interpreted fault traces is limited by the accuracy, scale, and method of transfer to conventional topographic base maps, which generally lack the spatial accuracy of geolocated imagery. The inherent limitations of conventional active fault mapping are expected to be overcome by using integrated MASTER and LiDAR data. MASTER is a multispectral imaging system with 50 spectral bands ranging from the visible to the thermal region of the electromagnetic spectrum. LiDAR, on the other hand, is a laser-based technology with very high positional accuracy, sub-meter resolution and the capability to filter out vegetation. MASTER and LiDAR are integrated via data transformation/fusion, and the resulting fused imagery is used to interpret active faults through recognition of fault features associated with distinctive properties related to geology, drainage, vegetation, hydrology, thermal response, anthropogenic modification, and topography. The completeness and accuracy of the fault interpretation is gauged by overlaying it on baseline data of previously mapped fault traces. The research study, supported by a NASA grant, evaluated

  19. Performance Assessment of the Final TanDEM-X DEM

    NASA Astrophysics Data System (ADS)

    Boer, Johannes; Gonzalez, Carolina; Wecklich, Christopher; Brautigam, Benjamin; Schulze, Daniel; Bachmann, Markus; Zink, Manfred

    2016-08-01

    The TanDEM-X system is an innovative radar mission comprising two formation-flying satellites, with the primary goal of generating a global Digital Elevation Model (DEM) of unprecedented accuracy. TanDEM-X, being a large single-pass radar interferometer, achieves this accuracy through flexible baseline selection, enabling the acquisition of highly accurate cross-track interferograms that are not impacted by temporal decorrelation or atmospheric disturbances. At least two global coverages (at least four in the case of difficult terrain) are combined into a homogeneous global DEM mosaic consisting of 1° by 1° geocells. With the DEM production for the Earth's continents almost completed, apart from Antarctica, this paper provides a quality summary of the currently available part of the TanDEM-X global DEM with respect to the DEM absolute and relative height accuracy as well as to void density.

  20. Development of parallel DEM for the open source code MFIX

    SciTech Connect

    Gopalakrishnan, Pradeep; Tafti, Danesh

    2013-02-01

    The paper presents the development of a parallel Discrete Element Method (DEM) solver for the open source code, Multiphase Flow with Interphase eXchange (MFIX) based on the domain decomposition method. The performance of the code was evaluated by simulating a bubbling fluidized bed with 2.5 million particles. The DEM solver shows strong scalability up to 256 processors with an efficiency of 81%. Further, to analyze weak scaling, the static height of the fluidized bed was increased to hold 5 and 10 million particles. The results show that global communication cost increases with problem size while the computational cost remains constant. Further, the effects of static bed height on the bubble hydrodynamics and mixing characteristics are analyzed.

  1. Reconstructing Stellar DEMs from X-ray Spectra

    NASA Astrophysics Data System (ADS)

    Kang, H.; van Dyk, D.; Kashyap, V.; Connors, A.

    2004-08-01

    The temperature distribution of emission measure is a powerful tool to characterize and understand the composition and physical structure of stellar coronae. Numerous methods have been proposed in the literature to compute the Differential Emission Measure (DEM) based on line fluxes measured from identifiable lines in high-resolution EUV and X-ray spectra. Here we describe a new and powerful method that we have developed to reconstruct DEMs, which improves significantly on previous algorithms and further allows atomic data errors to be incorporated into the calculations. Some notable features of our algorithm are: the ability to fit either a selected subset of lines with measured fluxes or to perform a global fit to all lines over the full wavelength range of the instrument; full incorporation of line blends; error bars to determine the significance of features seen in the reconstructed DEM; and direct incorporation of prior information such as atomic line sequences, known atomic data errors, systematic effects due to calibration uncertainties, etc. We use highly structured models to account for the mixing of the ion/temperature-specific spectra, the mixing of continuum photons with those from the multitude of spectral lines, the instrumental response, the effective area of the instrument, and background contamination. We introduce the statistical framework of data augmentation (e.g., EM algorithms and MCMC samplers), in which we treat the photon counts in each level of the hierarchical structure as missing data. We implement a multi-scale (wavelet-like) prior distribution to smooth the DEM, which gives us the flexibility to overcome the lack of information, especially with low-count data. In this talk we provide several simulation studies with both high-count and low-count data to evaluate the proposed method. We also provide several DEM reconstruction results for the active star alpha Aur (Capella), and validate the method by comparing our results to previous estimates
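
    To make the inverse problem concrete, here is a deliberately simplified stand-in: recover a DEM from noisy fluxes f = R·d by penalized least squares with a second-difference smoothness term. The paper's actual method uses a multiscale prior within an EM/MCMC framework; this Tikhonov sketch only illustrates why some regularizing prior is needed.

```python
# Regularized inversion of a smooth (ill-conditioned) response matrix.
import numpy as np

rng = np.random.default_rng(1)
logT = np.linspace(6.0, 8.0, 60)
d_true = np.exp(-0.5 * ((logT - 6.8) / 0.2) ** 2)

centers = np.linspace(6.2, 7.8, 25)
R = np.exp(-0.5 * ((logT[None, :] - centers[:, None]) / 0.3) ** 2)
f = R @ d_true + rng.normal(scale=0.05, size=len(centers))

# Tikhonov solution: argmin ||R d - f||^2 + lam * ||D2 d||^2
D2 = np.diff(np.eye(len(logT)), n=2, axis=0)   # second-difference operator
lam = 1.0
lhs = R.T @ R + lam * (D2.T @ D2)
d_hat = np.linalg.solve(lhs, R.T @ f)
print(float(np.abs(d_hat - d_true).mean()))  # smoothed reconstruction error
```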

  2. Constructing a paleo-DEM in an urban area by the example of the city of Aachen, Germany: Methods and previous results

    NASA Astrophysics Data System (ADS)

    Pröschel, Bernhard; Lehmkuhl, Frank

    2017-04-01

    Reconstructing paleo-landscapes in urban areas is always a special challenge, since the research area has often witnessed constant human impact over long time periods. Dense building development is a major difficulty, particularly with regard to access to in-situ soils and archaeological findings. It is therefore necessary to use data from various sources and combine methods from different fields to gain a detailed picture of the former topography. The area occupied by the city of Aachen today looks back on a long history of human influence. Traces of human activity can be dated back to Neolithic times. The first architectural structures and the first road network were built by the Romans about 2000 years ago. From then on, the area of Aachen was more or less continuously inhabited, forming today's city. This long history is represented by archaeological findings throughout the city. Several meters of settlement deposits, covering different eras, are present in many locations. It can therefore be assumed that the modern topography differs significantly from the pre-Roman topography. The main objective of this project is a reconstruction of the paleo-topography of Aachen in order to gain new insights into the spatial preconditions that the first settlers found. Moreover, attention is given to the question of whether and to what extent a paleo-DEM can help to clarify specific open archaeological and historical questions. The main database for the reconstruction consists of the archaeological excavation reports of the past 150 years, provided by municipal and regional archives. After analyzing these written accounts, we linked this information to drill data provided by the Geological Service of North Rhine-Westphalia. Together with additional sources such as geological and hydrological maps, we generated a GIS-based terrain model. The result is a high-resolution terrain model representing the undisturbed pre-Roman topography of the inner city of Aachen without any

  3. Evaluation of the influence of metabolic processes and body composition on cognitive functions: Nutrition and Dementia Project (NutrDem Project).

    PubMed

    Magierski, R; Kłoszewska, I; Sobow, T

    2014-11-01

    The global increase in the prevalence of dementia and its associated comorbidities and consequences has stimulated intensive research focused on a better understanding of the basic mechanisms and of the possibilities to prevent and/or treat cognitive decline or dementia. The etiology of cognitive decline and dementia is very complex and is based upon the interplay of genetic and environmental factors. A growing body of epidemiological evidence has suggested that metabolic syndrome and its components may be important in the development of cognitive decline. Furthermore, an abnormal body mass index in middle age has been considered a predictor for the development of dementia. The Nutrition and Dementia Project (NutrDem Project) was started at the Department of Old Age Psychiatry and Psychotic Disorders in close cooperation with the Department of Medical Psychology. The aim of this study is to determine the effect of dietary patterns, nutritional status, body composition (with evaluation of visceral fat) and basic regulatory mechanisms of metabolism in elderly patients on cognitive functions and the risk of cognitive impairment (mild cognitive impairment and/or dementia).

  4. Using Distinct-Element Method (DEM) to Investigate Tsaoling Landslide Induced by Chi-Chi Earthquake, Taiwan.

    NASA Astrophysics Data System (ADS)

    Tang, C.; Hu, J.; Lin, M.

    2006-12-01

    Large landslides occurred in the mountainous area near the epicenter of the Mw = 7.6 Chi-Chi earthquake of September 21, 1999, in central Taiwan, which caused more than 2,400 casualties and widespread damage. The earthquake triggered the catastrophic Tsaoling landslide, which mobilized about 0.125 km3 of rock and soil that slid across the Chingshui River and created a 5 km long natural dam. One fifth of the landslide mass dropped into the Chingshui River; the rest crossed over it. At least five large landslides in the Tsaoling area were induced by large earthquakes and downpours between 1862 and 1999. Geological investigation shows that the prevailing attitude of the sedimentary formation is about N50W with a dip of 12S. We first used the Newmark method to calculate slope stability, and then the distinct-element method to simulate the Tsaoling landslide (PFC3D and PFC2D discrete element codes). Because of the discrete, particle-based nature of the model, the specification of material properties and boundary conditions is more difficult than in available continuum methods. The user may specify micro-properties that control particle-particle interaction, but has no way to directly prescribe the macro-properties of the model such as Young's modulus (E), unconfined compressive strength (UCS), cohesion (C0), Poisson's ratio (ν), coefficient of friction (μ), porosity, and the initial stress state. As a result, the process of generating an initial model with the appropriate material behavior and initial stress state is one of trial and error, requiring a numerical equivalent of a biaxial rock mechanics test rig to derive the rock's mechanical macro-properties. We conclude that the characteristics of the Tsaoling landslide process are: (1) the rocks were bonded together on sliding, and (2) the frictional coefficient was very small.
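
    A minimal Newmark sliding-block computation of the kind named above, assuming a synthetic acceleration pulse rather than the Chi-Chi record and an invented critical acceleration.

```python
# Permanent displacement accumulates whenever ground acceleration
# exceeds the critical (yield) acceleration of the slope; once moving,
# the block decelerates at the critical value until it stops again.
import numpy as np

dt = 0.01                      # s
t = np.arange(0.0, 10.0, dt)
accel = 3.0 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)  # m/s^2
a_crit = 1.0                   # critical acceleration, m/s^2

vel, disp = 0.0, 0.0
for a in accel:
    if a > a_crit or vel > 0.0:
        vel = max(vel + (a - a_crit) * dt, 0.0)
        disp += vel * dt
print(f"Newmark displacement: {disp:.3f} m")
```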

  5. Siku DEM Simulations of Beaufort Sea-Ice Fracture Pattern.

    NASA Astrophysics Data System (ADS)

    Kulchitsky, A. V.; Hutchings, J. K.; Johnson, J.; Velikhovskiy, G.

    2016-12-01

    Leads are fractures in the ice pack where exposed ocean surface increases heat and moisture fluxes to the atmosphere. Leads are the locations of shear in the pack and during winter control the transport of ice around the Beaufort Gyre. Hence, predicting the direction of lead opening and shear is important in forecasting sea-ice drift and weather. Regional ice pack deformation is related to the fracture patterns and the associated shear zones, so climate models need to simulate these processes to produce realistic sea-ice transport and mass balance. We have developed a new discrete element method (DEM) model of sea ice, Siku, to forecast lead patterns. Siku is the first sea-ice DEM model that takes into account the spherical geometry of the Earth, and it allows simulations ranging from basin scale to meter scale without nesting. We present simulations with 2.5 km resolution in the Chukchi and Beaufort Seas, and 25-100 km across the rest of the Arctic. The DEM has been shown to reproduce the discontinuous dynamics that result in shear patterns in the ice cover. We evaluate these against observed fracture patterns in thermal-band satellite imagery. Simulations with differing ice mechanics produce lead pattern differences that are used to evaluate the physical validity of the proposed physics of ice-ice and ice-coast contact. We present simulations demonstrating a good match to observations and discuss the implications for continuum modeling, where predicted ice transport along the Alaskan coast is known to be too slow.

  6. Failure and frictional sliding envelopes in three-dimensional stress space: Insights from Distinct Element Method (DEM) models and implications for the brittle-ductile transition of rock

    NASA Astrophysics Data System (ADS)

    Schöpfer, Martin; Childs, Conrad; Manzocchi, Tom

    2013-04-01

    Rocks deformed at low confining pressure are brittle, meaning that after peak stress the strength decreases to a residual value determined by frictional sliding. The difference between the peak and residual value is the stress drop. At high confining pressure, however, no stress drop occurs. The transition pressure at which no loss in strength occurs is a possible definition of the brittle-ductile transition. The Distinct Element Method (DEM) is used to illustrate how this type of brittle-ductile transition emerges from a simple model in which rock is idealised as an assemblage of cemented spherical unbreakable grains. These bonded particle models are subjected to loading under constant mean stress and stress ratio conditions using distortional periodic space, which eliminates possible boundary effects arising from the usage of rigid loading platens. Systematic variation of both mean stress and stress ratio allowed determination of the complete three dimensional yield, peak stress and residual strength envelopes. The models suggest that the brittle-ductile transition is a mean stress and stress ratio dependent space curve, which cannot be adequately described by commonly used failure criteria (e.g., Mohr-Coulomb, Drucker-Prager). The model peak strength data exhibit an intermediate principal stress dependency which is, at least qualitatively, similar to that observed for natural rocks deformed under polyaxial laboratory conditions. Comparison of failure envelopes determined for bonded particle models with and without bond shear failure suggests that the non-linear pressure dependence of strength (concave failure envelopes) is, at high mean stress, the result of microscopic shear failure, a result consistent with earlier two-dimensional numerical multiple-crack simulations [D. A. Lockner & T. R. Madden, JGR, Vol. 96, No. B12, 1991]. Our results may have implications for a wide range of geophysical research areas, including the strength of the crust, the seismogenic

  7. Designing Tunnel Support in Jointed Rock Masses Via the DEM

    NASA Astrophysics Data System (ADS)

    Boon, C. W.; Houlsby, G. T.; Utili, S.

    2015-03-01

    A systematic approach of using the distinct element method (DEM) to provide useful insights for tunnel support in moderately jointed rock masses is illustrated. This is preceded by a systematic study of common failure patterns for unsupported openings in a rock mass intersected by three independent sets of joints. The results of our simulations show that a qualitative description of the failure patterns using specific descriptors is unattainable. Then, it is shown that DEM analyses can be employed in the preliminary design phase of tunnel supports to determine the main parameters of a support consisting of rock bolts or one lining or a combination of both. A comprehensive parametric analysis investigating the effect of bolt bonded length, bolt spacing, bolt length, bolt pretension, bolt stiffness and lining thickness on the tunnel convergence is illustrated. The highlight of the proposed approach of preliminary support design is the use of a rock bolt and lining interaction diagram to evaluate the relative effectiveness of rock bolts and lining thickness in the design of the tunnel support. The concept of interaction diagram can be used to assist the engineer in making preliminary design decisions given a target maximum allowable convergence. In addition, DEM simulations were validated against available elastic solutions. To the authors' knowledge, this is the first verification of DEM calculations for supported openings against elastic solutions. The methodologies presented in this article are illustrated through 2-D plane strain analyses for the preliminary design stage. More rigorous analyses incorporating 3-D effects have not been attempted in this article because the longitudinal displacement profile is highly sensitive to the joint orientations with respect to the tunnel axis, and cannot be established accurately in 2-D. The methodologies and concepts discussed in this article, however, have the potential to be extended to 3-D analyses.

  8. Using Economic Methods Evaluatively

    ERIC Educational Resources Information Center

    King, Julian

    2017-01-01

    As evaluators, we are often asked to determine whether policies and programs provide value for the resources invested. Addressing that question can be a quandary, and, in some cases, evaluators question whether cost-benefit analysis is fit for this purpose. With increased interest globally in social enterprise, impact investing, and social impact…

  10. Hydrologic enforcement of lidar DEMs

    USGS Publications Warehouse

    Poppenga, Sandra K.; Worstell, Bruce B.; Danielson, Jeffrey J.; Brock, John C.; Evans, Gayla A.; Heidemann, H. Karl

    2014-01-01

    Hydrologic-enforcement (hydro-enforcement) of light detection and ranging (lidar)-derived digital elevation models (DEMs) modifies the elevations of artificial impediments (such as road fills or railroad grades) to simulate how man-made drainage structures such as culverts or bridges allow continuous downslope flow. Lidar-derived DEMs contain an extremely high level of topographic detail; thus, hydro-enforced lidar-derived DEMs are essential to the U.S. Geological Survey (USGS) for complex modeling of riverine flow. The USGS Coastal and Marine Geology Program (CMGP) is integrating hydro-enforced lidar-derived DEMs (land elevation) and lidar-derived bathymetry (water depth) to enhance storm surge modeling in vulnerable coastal zones.
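
    A minimal illustration of the hydro-enforcement operation described above, assuming the culvert location is already known; production workflows instead detect drainage structures and cut continuous flow paths through them.

```python
# A road embankment blocks downslope flow in the DEM, so the cell
# where a culvert crosses it is lowered (cut) to restore drainage.
import numpy as np

dem = np.array([[5.0, 4.8, 4.6, 4.4],
                [5.1, 7.5, 7.6, 7.4],   # artificially high road fill
                [4.9, 4.5, 4.3, 4.1]])

culvert_cells = [(1, 2)]                # cell(s) where a culvert crosses
for r, c in culvert_cells:
    # Cut the barrier down to the lower of its up- and downslope
    # neighbours so water can pass through the structure.
    dem[r, c] = min(dem[r - 1, c], dem[r + 1, c])

print(dem)
```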

  11. Selection: Evaluation and methods

    USDA-ARS?s Scientific Manuscript database

    Procedures to collect and to analyze data for genetic improvement of dairy cattle are described. Methods of identification and milk recording are presented. Selection traits include production (milk, fat, and protein yields and component percentages), conformation (final score and linear type traits...

  12. Voltammetry Method Evaluation

    SciTech Connect

    Hoyt, N.; Pereira, C.; Willit, J.; Williamson, M.

    2016-07-29

    The purpose of the ANL MPACT Voltammetry project is to evaluate the suitability of previously developed cyclic voltammetry techniques to provide electroanalytical measurements of actinide concentrations in realistic used fuel processing scenarios. The molten salts in these scenarios are very challenging as they include high concentrations of multiple electrochemically active species, thereby creating a variety of complications. Some of the problems that arise therein include issues related to uncompensated resistance, cylindrical diffusion, and alloying of the electrodeposited metals. Improvements to the existing voltammetry technique to account for these issues have been implemented, resulting in good measurements of actinide concentrations across a wide range of adverse conditions.

  13. Satellite-derived Digital Elevation Model (DEM) selection, preparation and correction for hydrodynamic modelling in large, low-gradient and data-sparse catchments

    NASA Astrophysics Data System (ADS)

    Jarihani, Abdollah A.; Callow, John N.; McVicar, Tim R.; Van Niel, Thomas G.; Larsen, Joshua R.

    2015-05-01

    Digital Elevation Models (DEMs) that accurately replicate both landscape form and processes are critical to support modelling of environmental processes. Topographic accuracy, methods of preparation and grid size are all important for hydrodynamic models to efficiently replicate flow processes. In remote and data-scarce regions, high resolution DEMs are often not available and it is therefore necessary to evaluate lower resolution data such as the Shuttle Radar Topography Mission (SRTM) and Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) for use within hydrodynamic models. This paper does this in three ways: (i) assessing point accuracy and geometric co-registration error of the original DEMs; (ii) quantifying the effects of DEM preparation methods (vegetation smoothed and hydrologically-corrected) on hydrodynamic modelling relative accuracy; and (iii) quantifying the effect of the hydrodynamic model grid size (30-2000 m) and the associated relative computational costs (run time) on relative accuracy in model outputs. We initially evaluated the accuracy of the original SRTM (∼30 m) seamless C-band DEM (SRTM DEM) and second generation products from the ASTER (ASTER GDEM) against registered survey marks and altimetry data points from the Ice, Cloud, and land Elevation Satellite (ICESat). The SRTM DEM (RMSE = 3.25 m) had higher accuracy than the ASTER GDEM (RMSE = 7.43 m). Based on these results, the original version of the SRTM DEM and the ASTER GDEM, along with vegetation smoothed and hydrologically corrected versions, were prepared and used to simulate three flood events along a 200 km stretch of the low-gradient Thompson River in arid Australia (using five metrics: peak discharge, peak height, travel time, terminal water storage and flood extent). The hydrologically corrected DEMs performed best across these metrics in simulating floods compared with the vegetation smoothed DEMs and original DEMs. The response of model performance to grid size was non

  14. Separability of soils in a tallgrass prairie using SPOT and DEM data

    NASA Technical Reports Server (NTRS)

    Su, Haiping; Ransom, Michel D.; Yang, Shie-Shien; Kanemasu, Edward T.

    1990-01-01

    An investigation is conducted which uses a canonical transformation technique to reduce the features from SPOT and DEM data and evaluates the statistical separability of several prairie soils using the canonically transformed variables. Both SPOT and DEM data were gathered for a tallgrass prairie near Manhattan, Kansas, and high-resolution SPOT satellite images were integrated with DEM data. Two canonical variables derived from training samples were selected, and it is suggested that the canonically transformed data were superior to the combined SPOT and DEM data. High-resolution SPOT images and DEM data can be used to aid second-order soil surveys in grasslands.

  15. High-resolution DEM generation from multiple remote sensing data sources for improved volcanic hazard assessment - a case study from Nevado del Ruiz, Colombia

    NASA Astrophysics Data System (ADS)

    Deng, Fanghui; Dixon, Timothy H.; Rodgers, Mel; Charbonnier, Sylvain J.; Gallant, Elisabeth A.; Voss, Nicholas; Xie, Surui; Malservisi, Rocco; Ordoñez, Milton; López, Cristian M.

    2017-04-01

    TRI images. It is a low-cost and effective method to generate high-quality DEMs at relatively small spatial scales. More than 2000 photos were combined to create a DEM of the deep valley in the shadow zones. DEMs from the above three remote sensing data sources were merged into a final DEM with 10×10 m resolution. The effect of this improved DEM on hazard assessment can be evaluated using numerical flow models.

  16. The Double Hierarchy Method. A parallel 3D contact method for the interaction of spherical particles with rigid FE boundaries using the DEM

    NASA Astrophysics Data System (ADS)

    Santasusana, Miquel; Irazábal, Joaquín; Oñate, Eugenio; Carbonell, Josep Maria

    2016-07-01

    In this work, we present a new methodology for the treatment of the contact interaction between rigid boundaries and spherical discrete elements (DE). Rigid body parts are present in most large-scale simulations. The surfaces of the rigid parts are commonly meshed with a finite element-like (FE) discretization. Contact detection and calculation between the DEs and the discretized boundaries is not straightforward and has been addressed by different approaches. The algorithm presented in this paper considers the contact of the DEs with the geometric primitives of an FE mesh, i.e. facet, edge or vertex. To do so, the original hierarchical method presented by Horner et al. (J Eng Mech 127(10):1027-1032, 2001) is extended with new insight, leading to a robust, fast and accurate 3D contact algorithm which is fully parallelizable. The implementation of the method is designed to deal with triangles and quadrilaterals; if the boundaries are discretized with other types of geometries, the method can easily be extended to higher-order planar convex polyhedra. A detailed description of the procedure followed to treat a wide range of cases is presented. The developed algorithm is described and validated with several practical examples, and the parallelization capabilities and obtained performance are presented with the study of an industrial application example.
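
    The geometric core of such a facet/edge/vertex test, sketched for a sphere against a single triangle; this is generic closest-point logic under assumed inputs, not the Double Hierarchy algorithm itself, which additionally removes duplicate contacts found by neighbouring primitives.

```python
# Classify the closest feature (facet, edge or vertex) of a triangle
# relative to a sphere centre and test for contact.
import numpy as np

def sphere_triangle_contact(center, radius, a, b, c):
    """Return (is_contact, closest_point) for a sphere vs triangle abc."""
    ab, ac, ap = b - a, c - a, center - a
    # Barycentric coordinates of the projection onto the triangle plane
    d = np.array([[ab @ ab, ab @ ac], [ab @ ac, ac @ ac]])
    rhs = np.array([ab @ ap, ac @ ap])
    u, v = np.linalg.solve(d, rhs)
    if u >= 0 and v >= 0 and u + v <= 1:
        closest = a + u * ab + v * ac            # facet contact
    else:
        # Fall back to the nearest point on each edge (covers vertices)
        candidates = []
        for p, q in ((a, b), (b, c), (c, a)):
            t = np.clip((center - p) @ (q - p) / ((q - p) @ (q - p)), 0, 1)
            candidates.append(p + t * (q - p))   # edge/vertex contact
        closest = min(candidates, key=lambda x: np.linalg.norm(center - x))
    return np.linalg.norm(center - closest) <= radius, closest

hit, point = sphere_triangle_contact(
    np.array([0.2, 0.2, 0.4]), 0.5,
    np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
    np.array([0.0, 1.0, 0.0]))
print(hit, point)
```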

  17. Incorporating DEM uncertainty in coastal inundation mapping.

    PubMed

    Leon, Javier X; Heuvelink, Gerard B M; Phinn, Stuart R

    2014-01-01

    Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial resolution DEMs for mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. 1,000 error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation to a scenario combining a 1 in 100 year storm event over a 1 m SLR was calculated by counting the proportion of times from the 1,000 simulations that a location was inundated. This probabilistic approach can be used in a risk-aversive decision making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% probability exceedance, the inundated area was approximately 11% larger than mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey
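
    A compact version of the probabilistic workflow described above: perturb the DEM with simulated, spatially correlated elevation errors, apply a bathtub threshold to each realisation, and count how often each cell floods. Gaussian-filtered, rescaled white noise stands in here for the paper's regression-kriging and sequential Gaussian simulation.

```python
# Monte Carlo inundation probability vs the deterministic bathtub map.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)
dem = np.linspace(0.0, 3.0, 50).reshape(1, -1).repeat(50, axis=0)
water_level = 1.0 + 0.8   # 1 m SLR plus a storm component (toy values)

n_sim = 1000
counts = np.zeros_like(dem)
for _ in range(n_sim):
    error = gaussian_filter(rng.normal(size=dem.shape), sigma=5)
    error *= 0.3 / error.std()   # spatially correlated, ~0.3 m std
    counts += ((dem + error) < water_level)

prob_inundation = counts / n_sim
print((prob_inundation >= 0.01).sum(), (dem < water_level).sum())
# cells flooded at >=1% probability vs the deterministic bathtub count
```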

  19. Methodologies for watershed modeling with GIS and DEMs for the parameterization of the WEPP model

    NASA Astrophysics Data System (ADS)

    Cochrane, Thomas Arey

    Two methods, called the Hillslope and Flowpath methods, were developed that use geographical information systems (GIS) and digital elevation models (DEMs) to assess water erosion in small watersheds with the Water Erosion Prediction Project (WEPP) model. The Hillslope method is an automated method for the application of WEPP through the extraction of hillslopes and channels from DEMs. Each hillslope is represented as a rectangular area with a representative slope profile that drains to the top or sides of a single channel. The Hillslope method was further divided into the Calcleng and Chanleng methods, which are similar in every way except in how the hillslope lengths are calculated. The Calcleng method calculates a representative hillslope length based on the weighted lengths of all flowpaths in a hillslope as identified from a DEM. The Chanleng method calculates the length of hillslopes adjacent to channels by matching the width of the hillslope to the length of the adjacent channel. The Flowpath method works by applying the WEPP model to all possible flowpaths within a watershed as identified from a DEM. However, this method does not currently have a channel routing component, which limits its use to predicting spatially variable erosion on hillslopes within the watershed or from watersheds whose channels are not in a depositional or erodible mode. These methods were evaluated with six research watersheds from across the U.S.: one from Treynor, Iowa, two from Watkinsville, Georgia, and three from Holly Springs, Mississippi. The effects of using different DEM resolutions on simulations and the ability to accurately predict sediment yield and runoff for different event sizes were studied. Statistical analyses for all methods, resolutions, and event sizes were performed by comparing predicted vs. measured runoff and sediment yield from the watershed outlets on an event-by-event basis. Comparisons to manual applications by expert users and comparisons of

  20. Effect of DEM mesh size on AnnAGNPS simulation and slope correction.

    PubMed

    Wang, Xiaoyan; Lin, Q

    2011-08-01

    The objective of this paper is to study the impact of the mesh size of the digital elevation model (DEM) on terrain attributes within an Annualized AGricultural NonPoint Source pollution (AnnAGNPS) model simulation at the watershed scale, and to provide a correction of slope gradient for low-resolution DEMs. The effect of different grid sizes of DEMs on terrain attributes was examined by comparing eight DEMs (30, 40, 50, 60, 70, 80, 90, and 100 m). The accuracy of the AnnAGNPS simulation of runoff, sediment, and nutrient loads is evaluated. The results are as follows: (1) Runoff does not vary much with decreasing DEM resolution, whereas soil erosion and total nitrogen (TN) load change markedly. There is little effect on the runoff simulation of AnnAGNPS modeling from the amended slope using an adjusted 50 m DEM. (2) A decrease of sediment yield and TN load is observed with an increase of DEM mesh size from 30 to 60 m, and a slight decrease of sediment and TN load for DEM mesh sizes larger than 60 m. There is a similar trend for total phosphorus (TP), but with a smaller range of variation. With the slope correction, the simulated sediment, TN, and TP loads increase, with sediment increasing by up to 1.75 times compared to the model using the unadjusted 50 m DEM. Even so, the amended simulation still differs considerably from the results using the 30 m DEM, and AnnAGNPS is less reliable for sediment loading prediction in a small hilly watershed. (3) DEM resolution has a significant impact on slope gradient. The average, minimum, and maximum slopes from the various DEMs decrease markedly with decreasing DEM precision. For slopes of 0∼15°, slopes from lower-resolution DEMs are generally larger than those from higher-resolution DEMs, but for slopes above 15°, slopes from lower-resolution DEMs are generally smaller. It is therefore necessary to adjust the slope with a fitted equation; a cubic model is used for correction of slope gradient from lower resolution to
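
    A cubic slope-gradient correction of the kind named above can be sketched with an ordinary polynomial fit, assuming paired slope samples from a coarse DEM and a 30 m reference; the numbers below are invented.

```python
# Fit a degree-3 polynomial mapping coarse-DEM slopes to reference
# (30 m) slopes, then apply it to new coarse values.
import numpy as np

slope_100m = np.array([2.0, 5.0, 8.0, 12.0, 16.0, 20.0, 24.0])  # degrees
slope_30m = np.array([3.1, 6.8, 10.5, 14.9, 18.6, 21.4, 23.0])

coeffs = np.polyfit(slope_100m, slope_30m, deg=3)
correct = np.poly1d(coeffs)
print(correct(np.array([4.0, 10.0, 22.0])))  # corrected slope estimates
```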

  1. Stratum corneum evaluation methods: overview.

    PubMed

    Myer, Kaley; Maibach, Howard

    2013-08-01

    The stratum corneum serves as the main barrier of the skin, minimizing water loss and regulating the absorption of substances. Because of its surface location, it is readily available for analysis; consequently, many techniques are amenable to investigating its content and function. Here, we review the methods employed to evaluate the stratum corneum and its function. We reviewed PubMed and Embase search results for 'stratum corneum', 'method', 'methods', 'technique', and 'evaluation' and extracted pertinent articles that discussed ways to examine the stratum corneum and its constituents. Traditional and novel methods vary by accuracy, ease of use, time requirements, cost, invasiveness, and equipment requirements. The methods reviewed all contribute to our current picture of the stratum corneum. Tape stripping continues to be the most widely used, but variations in the use of the corneocytes obtained further contribute to the diversity of evaluation methods.

  2. A Global Corrected SRTM DEM Product Over Vegetated Areas Using LiDAR Data

    NASA Astrophysics Data System (ADS)

    Zhao, X.; Guo, Q.; Su, Y.; Hu, T.

    2016-12-01

    The Shuttle Radar Topography Mission (SRTM) digital elevation model (DEM) is one of the most complete and frequently used global-scale DEM products in various applications. However, previous studies have shown that the SRTM DEM is systematically higher than the actual land surface in vegetated mountain areas. The objective of this study is to propose a procedure to calibrate the SRTM DEM over global vegetated mountain areas. To address this, we first collected airborne LiDAR data covering over 200,000 km2 globally, used as ground truth to analyze the uncertainty of the SRTM DEM. Geoscience Laser Altimeter System (GLAS)/ICESat (Ice, Cloud, and land Elevation Satellite) data were used as complementary data in areas lacking airborne LiDAR coverage. Second, we modelled the SRTM DEM error for each vegetation type using regression methods, with tree height, canopy cover, and terrain slope as explanatory variables. Finally, these regression models were used to estimate the SRTM DEM error in vegetated mountain areas without LiDAR data coverage, and thereby correct the SRTM DEM. Our results show that the new corrected SRTM DEM significantly reduces the systematic bias of the SRTM DEM in vegetated mountain areas.
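
    A one-regression-per-vegetation-type error model in the spirit of the study above: predict the SRTM-minus-LiDAR height error from tree height, canopy cover and slope, then subtract the prediction to correct the DEM. The training samples below are synthetic.

```python
# Regress DEM error on vegetation/terrain predictors and apply it.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 400
tree_height = rng.uniform(5, 40, n)        # m
canopy_cover = rng.uniform(0.1, 1.0, n)    # fraction
slope = rng.uniform(0, 30, n)              # degrees
# Synthetic "truth": error grows with tree height and canopy cover
error = 0.4 * tree_height * canopy_cover + 0.05 * slope + rng.normal(0, 1, n)

X = np.column_stack([tree_height, canopy_cover, slope])
model = LinearRegression().fit(X, error)

# Correct an SRTM height where no LiDAR exists
srtm_height = 1250.0
predicted_error = model.predict([[25.0, 0.8, 12.0]])[0]
print(srtm_height - predicted_error)
```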

  3. Evaluation methods for hospital projects.

    PubMed

    Buelow, Janet R; Zuckweiler, Kathryn M; Rosacker, Kirsten M

    2010-01-01

    The authors report the findings of a survey of hospital managers on the utilization of various project selection and evaluation methodologies. The focus of the analysis was the empirical relationship between the portfolio of project evaluation methods actually utilized for a given project and several measures of perceived project success. The analysis revealed that cost-benefit analysis and top management support were the two project evaluation methods used most often by the hospital managers. The authors' empirical assessment provides evidence that top management support is associated with overall project success.

  4. Radar and Lidar Radar DEM

    NASA Technical Reports Server (NTRS)

    Liskovich, Diana; Simard, Marc

    2011-01-01

    Using radar and lidar data, the aim is to improve the 3D rendering of terrain, including digital elevation models (DEMs) and estimates of vegetation height and biomass, in a variety of forest types and terrains. The 3D mapping of vegetation structure and its analysis are useful for determining the role of forests in climate change (the carbon cycle), in providing habitat, and in supplying socio-economic services. This in turn can support the development of more effective land-use management. The first part of the project was to characterize the Shuttle Radar Topography Mission DEM error with respect to ICESat/GLAS point estimates of elevation. We investigated potential trends with latitude, canopy height, signal-to-noise ratio (SNR), number of LiDAR waveform peaks, and maximum peak width. Scatter plots were produced for each variable and fitted with 1st- and 2nd-degree polynomials. Higher-order trends were visually inspected after filtering with mean and median filters. We also assessed trends in the DEM error variance. Finally, a map was created showing how the DEM error is distributed geographically across the globe.

  5. High-Precision DEM Generation Using Satellite-Borne InSAR Technology

    NASA Astrophysics Data System (ADS)

    Li, Tao; Tang, Xinming; Gao, Xiaoming; Chen, Weinan; Chen, Qianfu; Wu, Danqin

    2016-08-01

    Satellite-borne InSAR is useful for generating DEMs globally, especially since the TanDEM-X interferometer began its mission in 2010. In this paper, we analyze the interferometric geometry in surveying and mapping applications and locate the main error sources, i.e., phase error and baseline error, using parameters extracted from the TanDEM-X interferometer. The phase error is suppressed using multi-look iteration; rich texture as well as high phase accuracy are both maintained with this method. The baseline error is reduced using the long-and-short baseline combination method. Finally, we propose mosaicking the ascending and descending DEMs according to coherence values in order to reduce low-coherence areas. Experiments over flat, hilly and mountainous terrain are conducted to test the feasibility of the proposed methods. The results demonstrate that TanDEM-X may be used for high-precision DEM generation.
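
    For orientation, the textbook single-baseline InSAR relation below shows why phase and baseline are the two error sources singled out; this is standard interferometric geometry, not a formula quoted from the paper:

    ```latex
    % Topographic phase for perpendicular baseline B_\perp, slant range R,
    % look angle \theta and wavelength \lambda; h_a is the height of
    % ambiguity, i.e. the height change per 2\pi phase cycle.
    \phi_{\mathrm{topo}} = \frac{4\pi}{\lambda}\,\frac{B_{\perp}\,h}{R\,\sin\theta},
    \qquad
    h_a = \frac{\lambda\,R\,\sin\theta}{2\,B_{\perp}}
    ```

    A phase error thus maps into a height error of about h_a/2π per radian of phase, which is why phase filtering and baseline refinement are treated as separate corrections.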

  6. Enhanced ASTER DEMs for Decadal Measurements of Glacier Elevation Changes

    NASA Astrophysics Data System (ADS)

    Girod, L.; Nuth, C.; Kääb, A.

    2016-12-01

    Elevation change data are critical to the understanding of a number of geophysical processes, including glaciers through the measurement of their volume change. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) system on board the Terra (EOS AM-1) satellite has been a unique source of systematic stereoscopic images covering the whole globe at 15 m resolution and at a consistent quality for over 15 years. While satellite stereo sensors with significantly improved radiometric and spatial resolution are available today, the potential of ASTER data lies in its long, consistent time series, which is unrivaled, though not fully exploited for change analysis due to a lack of data accuracy and precision. ASTER data are strongly affected by attitude jitter, mainly of approximately 4 km and 30 km wavelength, and improving the generation of ASTER DEMs requires removal of this effect. We developed MMASTER, an improved method for ASTER DEM generation, and implemented it in the open-source photogrammetric library and software suite MicMac. The method relies on the computation of a rational polynomial coefficient (RPC) model and on the detection and correction of cross-track sensor jitter in order to compute DEMs. Our sensor modeling does not require ground control points and thus potentially allows for automatic processing of large data volumes. Compared to ground truth data, we assess an accuracy of ±5 m in DEM differencing when using our processing method, improved from ±30 m when using the AST14DMO DEM product. We demonstrate and discuss this improved ASTER DEM quality for a number of glaciers in Greenland, Alaska, and Svalbard. The quality of our measurements promises to further unlock the underused potential of ASTER DEMs for glacier volume change time series on a global scale. The data produced by our method will thus help to better understand the response of glaciers to climate change and their influence on runoff and sea level.

  7. Incorporation of Rubber Powder as Filler in a New Dry-Hybrid Technology: Rheological and 3D DEM Mastic Performances Evaluation

    PubMed Central

    Vignali, Valeria; Mazzotta, Francesco; Sangiorgi, Cesare; Simone, Andrea; Lantieri, Claudio; Dondi, Giulio

    2016-01-01

    In recent years, the use of crumb rubber as a modifier or additive in asphalt concretes has made it possible to obtain mixtures that combine high performance with the recovery and reuse of discarded tires. To date, the common technologies that permit the reuse of rubber powder are the wet and dry processes. In this paper, a dry-hybrid technology for the production of Stone Mastic Asphalt mixtures is proposed. It allows the rubber powder to be used as filler, replacing part of the limestone filler. The fillers are added and mixed with a high-workability bitumen modified with SBS (styrene-butadiene-styrene) polymer and paraffinic wax. The role of the rubber powder and limestone filler within the bituminous mastic has been investigated through two different approaches: a rheological approach, comprising macro-scale laboratory analysis and a micro-scale DEM simulation, and a performance approach at high temperatures, including Multiple Stress Creep Recovery tests. The results show that the rubber works as a filler and improves the rheological characteristics of the polymer-modified bitumen. In particular, it increases stiffness and elasticity at high temperatures and reduces the complex modulus at low temperatures. PMID:28773965

  9. Study protocol of the multi-site randomised controlled REDALI-DEM trial - The effects of structured Relearning methods on Daily Living task performance of persons with Dementia

    PubMed Central

    2011-01-01

    Background: Evidence from pilot trials suggests that structured learning techniques may have positive effects on the performance of cognitive tasks, movement sequences or skills in patients with Alzheimer's disease. The purpose of this trial is to evaluate whether the usual method of learning by trial and error or the method of errorless learning demonstrates better effects on the performance of two selected daily living tasks six weeks after the intervention in people with mild to moderate dementia. Methods/Design: A seven-centre, single-blind, active-controlled design with 1:1 randomisation to two parallel groups will include 175 persons diagnosed with Alzheimer's disease or mixed-type dementia (MMSE 14-24), living at home, showing at least moderate need for assistance in instrumental activities of daily living, with a primary carer available and informed consent of patient and primary carer. Patients of both study arms will receive 15 one-hour sessions at home from trained interventionists, practising two individually selected daily living tasks. In one group the trial-and-error technique and in the other group the errorless learning method will be applied. The primary outcome is task performance measured with the Task Performance Scale six weeks post treatment. Discussion: The trial results will inform improvements to guidelines for instructing individuals with memory impairments. A user-friendly practice guideline will allow efficient implementation of structured relearning techniques for a wide range of service providers in dementia care. Trial registration: DRKS00003117 PMID:21851594

  10. LNG Safety Assessment Evaluation Methods

    SciTech Connect

    Muna, Alice Baca; LaFleur, Angela Christine

    2015-05-01

    Sandia National Laboratories evaluated published safety assessment methods across a variety of industries including Liquefied Natural Gas (LNG), hydrogen, land and marine transportation, as well as the US Department of Defense (DOD). All the methods were evaluated for their potential applicability to the LNG railroad application. After reviewing the documents included in this report, as well as others not included because of repetition, the Department of Energy (DOE) Hydrogen Safety Plan Checklist is most suitable to be adapted to the LNG railroad application. This report was developed to survey industries related to rail transportation for methodologies and tools that can be used by the FRA to review and evaluate safety assessments submitted by the railroad industry as part of their implementation plans for liquefied or compressed natural gas storage (on-board or tender) and engine fueling delivery systems. The main sections of this report provide an overview of the various methods found during this survey; in most cases, the reference document is quoted directly. The final section provides discussion and a recommendation for the most appropriate methodology to allow efficient and consistent evaluations. The DOE Hydrogen Safety Plan Checklist was then revised to adapt it as a methodology for the Federal Railroad Administration's use in evaluating safety plans submitted by the railroad industry.

  11. Evaluation Using Sequential Trials Methods.

    ERIC Educational Resources Information Center

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)

  12. DEM Particle Fracture Model

    SciTech Connect

    Zhang, Boning; Herbold, Eric B.; Homel, Michael A.; Regueiro, Richard A.

    2015-12-01

    An adaptive particle fracture model for the poly-ellipsoidal Discrete Element Method is developed. A poly-ellipsoidal particle breaks into several sub-poly-ellipsoids according to the Hoek-Brown fracture criterion, based on the continuum stress and the maximum tensile stress in contacts. Weibull theory is also introduced to account for statistics and size effects on particle strength. Finally, a high strain-rate split Hopkinson pressure bar experiment on silica sand is simulated using this newly developed model. Comparisons with experiments show that our particle fracture model captures the mechanical behavior of this experiment very well, both in the stress-strain response and in the particle size redistribution. The effects of density and packing of the samples are also studied in numerical examples.
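
    As an illustration of the Weibull size-effect idea invoked above, the sketch below draws volume-dependent particle strengths by inverting the Weibull distribution; all parameter values are assumptions for demonstration, not the report's calibrated values:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_strength(volume, v0=1.0e-9, sigma0=150.0e6, m=3.0, n=1):
        """Draw particle strengths (Pa) from a Weibull law with volume scaling,
        P(sigma) = 1 - exp(-(volume/v0) * (sigma/sigma0)**m), by inversion.
        All parameter values here are illustrative."""
        u = rng.uniform(size=n)
        return sigma0 * (-np.log(1.0 - u) * v0 / volume) ** (1.0 / m)

    # Larger particles are statistically weaker (the size effect):
    print(sample_strength(1.0e-9, n=3))   # reference volume
    print(sample_strength(8.0e-9, n=3))   # 8x the volume -> lower strengths
    ```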

  13. Impacts of DEM uncertainties on critical source areas identification for non-point source pollution control based on SWAT model

    NASA Astrophysics Data System (ADS)

    Xu, Fei; Dong, Guangxia; Wang, Qingrui; Liu, Lumeng; Yu, Wenwen; Men, Cong; Liu, Ruimin

    2016-09-01

    The impacts of different digital elevation model (DEM) resolutions, sources and resampling techniques on nutrient simulations using the Soil and Water Assessment Tool (SWAT) model have not been well studied. The objective of this study was to evaluate the sensitivity of DEM resolutions (from 30 m to 1000 m), sources (ASTER GDEM2, SRTM and Topo-DEM) and resampling techniques (nearest neighbor, bilinear interpolation, cubic convolution and majority) for the identification of non-point source (NPS) critical source areas (CSAs) based on nutrient loads using the SWAT model. The Xiangxi River, one of the main tributaries of the Three Gorges Reservoir in China, was selected as the study area. The following findings were obtained: (1) Elevation and slope extracted from the DEMs were most sensitive to DEM resolution changes; compared with the results of the 30 m DEM, the 1000 m DEM underestimated elevation and slope by 104 m and 41.57°, respectively. (2) The numbers of subwatersheds and hydrologic response units (HRUs) were considerably influenced by DEM resolution, but the total nitrogen (TN) and total phosphorus (TP) loads of each subwatershed correlated more strongly with DEM source. (3) DEM resolution and source had the largest effects on CSA identification, and TN and TP CSAs responded differently to DEM uncertainties: TN CSAs were more sensitive to resolution changes, exhibiting six distribution patterns across the DEM resolutions, whereas TP CSAs were sensitive to source and resampling-technique changes, exhibiting three distribution patterns across DEM sources and two across resampling techniques. DEM resolution and source are thus the two most sensitive SWAT model DEM parameters to consider when identifying nutrient CSAs.

  14. Processing, validating, and comparing DEMs for geomorphic application on the Puna de Atacama Plateau, northwest Argentina

    NASA Astrophysics Data System (ADS)

    Purinton, Benjamin; Bookhagen, Bodo

    2016-04-01

    This study analyzes multiple topographic datasets derived from various remote-sensing methods for the Pocitos Basin of the central Puna Plateau in northwest Argentina at the border with Chile. Here, the arid climate, clear atmospheric conditions and lack of vegetation provide ideal conditions for remote sensing and Digital Elevation Model (DEM) comparison. We compare the following freely available DEMs: SRTM-X (spatial resolution of ~30 m), SRTM-C v4.1 (90 m), and ASTER GDEM2 (30 m). Additional DEMs for comparison are generated from optical and radar datasets acquired freely (ASTER Level 1B stereo pairs and Sentinel-1A radar), through research agreements (RapidEye Level 1B scenes, ALOS radar, and ENVISAT radar), and through commercial sources (TerraSAR-X / TanDEM-X radar). DEMs from ASTER (spatial resolution of 15 m) and RapidEye (~5-10 m) optical datasets are produced by standard photogrammetric techniques and have been post-processed for validation and alignment purposes. Because RapidEye scenes are captured at a low incidence angle (<20°) and stereo pairs are unavailable, merging and averaging of two to four overlapping scenes are explored for effective DEM generation. Sentinel-1A, TerraSAR-X / TanDEM-X, ALOS, and ENVISAT radar data are processed through interferometry, resulting in DEMs with spatial resolutions ranging from 5 to 30 meters. The SRTM-X dataset serves as a control in the creation of further DEMs, as it is widely used in the geosciences and represents the highest-quality DEM currently available. All DEMs are validated against over 400,000 differential GPS (dGPS) measurements gathered during four field campaigns in 2012 and 2014 to 2016. Of these points, more than 250,000 lie within the Pocitos Basin, with average vertical and horizontal accuracies of 0.95 m and 0.69 m, respectively. Dataset accuracy is judged by the lowest standard deviations of elevation compared with the dGPS data and with the SRTM-X control DEM. Of particular interest in

  15. Method for evaluating material viscoelasticity

    NASA Astrophysics Data System (ADS)

    Fujii, Yusaku; Yamaguchi, Takao

    2004-01-01

    A method for evaluating the viscoelasticity of materials under an oscillating load is proposed. In the method, the material under test is connected to a mass, which generates an oscillating inertial force after being struck manually with a hammer. A pneumatic linear bearing is used to realize linear motion with sufficiently small friction acting on the mass, which forms the moving part of the bearing. The inertial force acting on the mass is determined with high accuracy by measuring the velocity of the mass with an optical interferometer.

  16. Evaluation of morphometric parameters derived from Cartosat-1 DEM using remote sensing and GIS techniques for Budigere Amanikere watershed, Dakshina Pinakini Basin, Karnataka, India

    NASA Astrophysics Data System (ADS)

    Dikpal, Ramesh L.; Renuka Prasad, T. J.; Satish, K.

    2017-07-01

    The quantitative analysis of a drainage system is an important aspect of watershed characterization. Using the watershed as the basic unit in morphometric analysis is the most logical choice because all hydrological and geomorphic processes occur within the watershed. The Budigere Amanikere watershed, a tributary of the Dakshina Pinakini River, has been selected for case illustration. A geoinformatics workflow consisting of ArcGIS 10.3 and the Cartosat-1 Digital Elevation Model (DEM) version 1, with a resolution of 1 arc-second (~32 m), obtained from Bhuvan, is used. Sheet and gully erosion are identified in parts of the study area. Slopes in the watershed indicate moderate to low runoff and negligible soil loss. Third- and fourth-order sub-watershed analysis is carried out. A mean bifurcation ratio (Rb) of 3.6 indicates no dominant influence of geology and structure; a low drainage density (Dd) of 1.12 and low stream frequency (Fs) of 1.17 imply highly infiltrative subsoil material and low runoff; an infiltration number (If) of 1.3 implies high infiltration capacity; a coarse drainage texture (T) of 3.40 shows highly permeable subsoil; a length of overland flow (Lg) of 0.45 indicates very little structural disturbance and low runoff; a constant of channel maintenance (C) of 0.9 indicates high subsoil permeability; and an elongation ratio (Re) of 0.58, circularity ratio (Rc) of 0.75 and form factor (Rf) of 0.26 signify a sub-circular to more elongated basin with high infiltration and low runoff. The hypsometric curves and hypsometric integral values of the watershed and its sub-basins show that the drainage system is attaining a mature stage of geomorphic development, and confirm that infiltration capacity is high and runoff is low in the watershed. Thus, these morphometric analyses can be used as an estimator of the erosion status of watersheds, leading to prioritization for taking up soil
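
    Several of the indices quoted above are simple ratios that can be computed directly; the following sketch shows a few of them with placeholder inputs (not the watershed's actual measurements):

    ```python
    import math

    # Illustrative helpers for a few of the indices quoted above; the numbers
    # fed in below are placeholders, not the watershed's actual data.
    def drainage_density(total_stream_length_km, area_km2):
        return total_stream_length_km / area_km2            # Dd (km/km^2)

    def stream_frequency(n_streams, area_km2):
        return n_streams / area_km2                         # Fs (streams/km^2)

    def infiltration_number(dd, fs):
        return dd * fs                                      # If

    def elongation_ratio(area_km2, basin_length_km):
        # Re: diameter of a circle of equal area divided by basin length
        return 2.0 * math.sqrt(area_km2 / math.pi) / basin_length_km

    dd = drainage_density(210.0, 187.5)
    fs = stream_frequency(220, 187.5)
    print(dd, fs, infiltration_number(dd, fs), elongation_ratio(187.5, 26.5))
    ```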

  17. BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs

    NASA Astrophysics Data System (ADS)

    Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes

    2017-06-01

    Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulations are a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation consisting of tens of millions of particles can be months on large CPU clusters, making the Discrete Element Method (DEM) unfeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open-source GPU-based DEM code BlazeDEM3D-GPU, which can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single GPU or multiple GPUs.
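
    To indicate the per-timestep work that such GPU codes parallelise over millions of contacts, here is a deliberately minimal two-sphere, normal-contact DEM integration with a linear spring-dashpot law; the parameters are illustrative and the scheme is far simpler than BlazeDEM3D-GPU's polyhedral contact resolution:

    ```python
    import numpy as np

    # Two spheres on a line, linear spring-dashpot normal contact.
    # Illustrative parameters only.
    k_n, c_n = 1.0e5, 5.0          # normal stiffness (N/m), damping (N*s/m)
    radius, mass, dt = 0.01, 0.01, 1.0e-6

    x = np.array([0.0, 0.0199])    # centres; gap < 2*radius, so in contact
    v = np.array([1.0, -1.0])      # approaching each other

    for _ in range(2000):
        overlap = 2.0 * radius - (x[1] - x[0])
        f = 0.0
        if overlap > 0.0:
            rel_v = v[0] - v[1]                        # closing speed
            f = max(k_n * overlap + c_n * rel_v, 0.0)  # repulsive only
        a = np.array([-f, f]) / mass                   # equal and opposite
        v += a * dt                                    # semi-implicit Euler
        x += v * dt

    print(x, v)                    # spheres have rebounded and separated
    ```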

  18. Local validation of EU-DEM using Least Squares Collocation

    NASA Astrophysics Data System (ADS)

    Ampatzidis, Dimitrios; Mouratidis, Antonios; Gruber, Christian; Kampouris, Vassilios

    2016-04-01

    In the present study we evaluate the European Digital Elevation Model (EU-DEM) over a limited area covering a few kilometers. We compare EU-DEM derived vertical information against orthometric heights obtained by classical trigonometric leveling for an area located in Northern Greece. We apply several statistical tests, and we initially fit a surface model in order to quantify the existing biases and outliers. Finally, we implement a methodology for predicting orthometric heights, applying Least Squares Collocation to the residuals remaining after the fitted surface is removed. Our results, based on cross-validation points, reveal a local consistency between EU-DEM and official heights that is better than 1.4 meters.
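
    For reference, the generic Least Squares Collocation predictor has the form below (textbook notation; the paper's exact covariance modelling is not specified in the abstract):

    ```latex
    % Generic Least Squares Collocation predictor: \ell are the observed
    % residuals, D their noise covariance, C_{\ell\ell} the signal covariance
    % among the observation points and C_{s\ell} the covariance between
    % prediction and observation points.
    \hat{s} = C_{s\ell}\left(C_{\ell\ell} + D\right)^{-1}\ell
    ```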

  19. Visualising DEM-related flood-map uncertainties using a disparity-distance equation algorithm

    NASA Astrophysics Data System (ADS)

    Brandt, S. Anders; Lim, Nancy J.

    2016-05-01

    The apparent absoluteness of information presented by crisply delineated flood boundaries can lead to misconceptions among planners about the inherent uncertainties in generated flood maps. Even maps based on hydraulic modelling using the highest-resolution digital elevation models (DEMs), and calibrated with the most optimal Manning's roughness (n) coefficients, are susceptible to errors when compared to actual flood boundaries, specifically in flat areas. Therefore, the inaccuracies in inundation extents, brought about by the characteristics of the slope perpendicular to the flow direction of the river, have to be accounted for. Instead of using the typical Monte Carlo simulation and probabilistic methods for uncertainty quantification, an empirically based disparity-distance equation that considers the effects of both DEM resolution and slope was used to create prediction-uncertainty zones around the resulting inundation extents of a one-dimensional (1-D) hydraulic model. The equation was originally derived for the Eskilstuna River, where flood maps based on DEM data of different resolutions were evaluated for the slope-disparity relationship. To assess whether the equation is applicable to another river with different characteristics, modelled inundation extents from the Testebo River were utilised and tested with the equation. Using the cross-sectional locations, water surface elevations, and DEM, uncertainty zones can be produced around the original inundation boundary line for different confidence levels. The results show that (1) the proposed method is useful both for estimating and directly visualising model inaccuracies caused by the combined effects of slope and DEM resolution, and (2) the DEM-related uncertainties alone do not account for the total inaccuracy of the derived flood map. Decision-makers can apply it to already existing flood maps, thereby recapitulating and re-analysing the inundation boundaries and the areas that are uncertain.

  20. The Importance of Precise Digital Elevation Models (DEM) in Modelling Floods

    NASA Astrophysics Data System (ADS)

    Demir, Gokben; Akyurek, Zuhal

    2016-04-01

    Digital Elevation Models (DEMs) are important topographic inputs for the accurate modelling of floodplain hydrodynamics. Floodplains have a key role as natural retarding pools, which attenuate flood waves and suppress flood peaks. GPS, LiDAR and bathymetric surveys are well-known methods for acquiring topographic data. Obtaining topographic data through surveying is not only time-consuming and expensive but sometimes impossible for remote areas. This study aims to present the importance of accurate representation of topography for flood modelling, using flood modelling for Samsun-Terme in the Black Sea region of Turkey. One DEM is obtained from point observations retrieved from 1/5000-scale orthophotos and 1/1000-scale point elevation data from field surveys at cross-sections, with the river banks corrected using the orthophotos and elevation values; this DEM is named the scaled DEM. The other DEM is obtained from bathymetric surveys: 296,538 points and the left/right bank slopes were used to construct a DEM with 1 m spatial resolution, named the base DEM. The two DEMs were compared at 27 cross-sections: the maximum difference at the thalweg of the river bed is 2 m and the minimum difference is 20 cm. The channel conveyance capacity in the base DEM is larger than in the scaled DEM, and the floodplain is modelled in more detail in the base DEM. MIKE21 with a flexible grid is used for two-dimensional shallow-water flow modelling. The models based on the two DEMs were calibrated for a flood event (July 9, 2012), with roughness as the calibration parameter. From comparison of the input hydrograph at the upstream end of the river and the output hydrograph downstream, the attenuation is 91% and 84% for the base DEM and the scaled DEM, respectively. The time lag in the hydrographs, 3 hours, does not differ between the two DEMs. Maximum flood extents differ for the two DEMs.

  1. Generating, Comparing and Exploiting DEMs for Hydrological Applications over the Galapagos Islands

    NASA Astrophysics Data System (ADS)

    D'Ozouville, N.; Benveniste, J.; Deffontaines, B.; Violette, S.; de Marsily, G.; Wegmuller, U.

    Understanding the hydrological cycle of the Galapagos Islands will contribute to more efficient water management in insular basaltic environments with growing anthropogenic pressure and ecosystems to preserve. The lack of essential in-situ data such as topography led to retrieving this information from other sources. We present the generation of a digital elevation model (DEM) from satellite data and its exploitation for Santa Cruz Island. An interferometric DEM was generated from ASAR (ENVISAT) data with Atlantis EarthView, and a radargrammetric DEM exploiting the multiple incidence angle capability of ASAR was generated by Gamma Remote Sensing. SRTM 90 m resolution data (NASA) and a digitised topographic contour DEM (M. Souris, IRD) were used to aid the phase unwrapping and for comparison and validation. By combining the radargrammetric DEM (accurate overall, but with little detail) and the interferometric DEM (unresolved in incoherent areas but highly detailed in coherent areas), it is hoped to achieve a resolution better than the 90 m SRTM data, comparable to the 30 m resolution SRTM data that has been requested from NASA. Drainage networks were extracted and identified on Santa Cruz, and zones of interest for setting up hydrological instruments are defined. The radargrammetric versus interferometric methods of DEM generation in a volcanic insular environment are reviewed in this work. The resolution of the DEM will be a limiting factor in the accuracy of transposition from image to fieldwork. Background hydrological information from the DEM can be used in hydrological modelling.

  2. Wiederbeginn nach dem Zweiten Weltkrieg [A New Beginning after the Second World War]

    NASA Astrophysics Data System (ADS)

    Strecker, Heinrich; Bassenge-Strecker, Rosemarie

    This chapter first outlines the situation of statistics in Germany after the Second World War: the statistical services in the occupation zones had, in part, to be rebuilt, and the teaching of statistics at universities had to be restarted. In this situation, the President of the Bavarian Statistical Office, Karl Wagner, energetically supported by Gerhard Fürst, later President of the Federal Statistical Office, took the initiative to re-establish the German Statistical Society (Deutsche Statistische Gesellschaft, DStatG). The founding meeting in Munich in 1948 became a milestone in the history of the DStatG. The aim was to encourage all statisticians to work together, to raise their qualifications to the international level, and to promote the application of newer statistical methods in practice. There followed 24 years of fruitful work under Karl Wagner (1948-1960) and Gerhard Fürst (1960-1972). The contribution sketches the Statistical Weeks, the activities of the committees, and the publications of this period.

  3. A photogrammetric DEM of Greenland based on 1978-1987 aerial photos: validation and integration with laser altimetry and satellite-derived DEMs

    NASA Astrophysics Data System (ADS)

    Korsgaard, N. J.; Kjaer, K. H.; Nuth, C.; Khan, S. A.

    2014-12-01

    Here we present a DEM of Greenland covering all ice-free terrain and the margins of the Greenland Ice Sheet (GrIS) and of local glaciers and ice caps. The DEM is based on the 3534 photos used in the aero-triangulation, which were recorded by the Danish Geodata Agency (then the Geodetic Institute) in survey campaigns spanning the period 1978-1987. The GrIS is covered tens of kilometers into the interior due to the large footprints of the photos (30 x 30 km) and the control provided by the aero-triangulation. The data are thus ideal for analysing ice-marginal elevation change and for providing control for satellite-derived DEMs. The results of the validation, error assessments and predicted uncertainties are presented. We test the DEM using the IceBridge Airborne Topographic Mapper (ATM) as reference data, evaluate the a posteriori covariance matrix from the aero-triangulation, and co-register DEM blocks of 50 x 50 km to ICESat laser altimetry in order to evaluate their coherency. We complement the aero-photogrammetric DEM with modern laser altimetry and DEMs derived from stereoscopic satellite imagery (AST14DMO) to examine the mass variability of the Northeast Greenland Ice Stream (NEGIS). Our analysis suggests that dynamically induced mass loss started around 2003 and continued throughout 2014.
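
    A minimal sketch of the kind of block-wise check against laser altimetry described above: estimate a robust vertical bias of a DEM block from DEM-minus-ICESat differences. The data and the simple vertical-only correction are illustrative assumptions; full 3D co-registration also solves for horizontal shifts:

    ```python
    import numpy as np

    # Hypothetical check of one 50 x 50 km DEM block against ICESat footprints.
    rng = np.random.default_rng(1)
    icesat_h = rng.uniform(0.0, 1500.0, 5000)               # footprint heights
    dem_h = icesat_h + 1.8 + rng.normal(0.0, 4.0, 5000)     # biased DEM samples

    diff = dem_h - icesat_h
    bias = np.median(diff)                                  # robust to outliers
    nmad = 1.4826 * np.median(np.abs(diff - bias))          # robust spread
    print(f"bias={bias:.2f} m, NMAD={nmad:.2f} m")

    dem_h_coregistered = dem_h - bias                       # vertical shift only
    ```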

  4. TecDEM: A MATLAB based toolbox for tectonic geomorphology, Part 1: Drainage network preprocessing and stream profile analysis

    NASA Astrophysics Data System (ADS)

    Shahzad, Faisal; Gloaguen, Richard

    2011-02-01

    We present TecDEM, a software shell implemented in MATLAB that applies tectonic geomorphology analyses to digital elevation models (DEMs). This first part of the paper series describes drainage partitioning schemes and stream profile analysis. The graphical user interface of TecDEM provides several options: determining flow directions, stream vectorization, watershed delineation, Strahler order labeling, stream profile generation, knickpoint selection, and calculation of concavity, steepness and Hack indices. The knickpoints along selected streams, the stream profile analysis, and the Hack index per stream profile are computed using a semi-automatic method. TecDEM was used to extract and investigate stream profiles in the Kaghan Valley (Northern Pakistan). Our interpretations of the TecDEM results correlate well with previous tectonic evolution models for this region. TecDEM is designed to assist geoscientists in applying complex tectonic geomorphology analyses to global DEM data.
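
    The steepness and concavity indices mentioned above derive from the standard slope-area scaling of channel profiles; in textbook form (not a formula quoted from the paper):

    ```latex
    % Channel slope S declines with upstream drainage area A; regressing
    % \log S on \log A gives the concavity index \theta (slope of the fit)
    % and the steepness index k_s (intercept).
    S = k_s\,A^{-\theta}
    \quad\Longleftrightarrow\quad
    \log S = \log k_s - \theta\,\log A
    ```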

  5. Shuttle radar DEM hydrological correction for erosion modelling in small catchments

    NASA Astrophysics Data System (ADS)

    Jarihani, Ben; Sidle, Roy; Bartley, Rebecca

    2016-04-01

    Digital Elevation Models (DEMs) that accurately replicate both landscape form and processes are critical to support modelling of environmental processes. Catchment- and hillslope-scale runoff and sediment processes (i.e., patterns of overland flow, infiltration, subsurface stormflow and erosion) are all topographically mediated. In remote and data-scarce regions, high resolution DEMs (LiDAR) are often not available, and moderate to coarse resolution DEMs (e.g., SRTM) have difficulty replicating detailed hydrological patterns, especially in relatively flat landscapes. Several surface reconditioning algorithms (e.g., smoothing) and "stream burning" techniques (e.g., AGREE or ANUDEM), in conjunction with representation of the known stream networks, have been used to improve DEM performance in replicating known hydrology. Detailed stream network data are not available at regional and national scales but can be derived at local scales from remotely sensed data. This research explores the implications of using high resolution stream network data derived from Google Earth images for DEM hydrological correction, instead of coarse resolution stream networks derived from topographic maps. The accuracy of the implemented method in producing hydrologically efficient DEMs was assessed by comparing hydrological parameters derived from the modified DEMs and from limited high-resolution airborne LiDAR DEMs. The degree of modification is dominated by the method used and the availability of stream network data. Although stream burning techniques improve DEMs hydrologically, they alter DEM characteristics in ways that may affect catchment boundaries, stream position and length, as well as secondary terrain derivatives (e.g., slope, aspect). Modification of a DEM to better reflect known hydrology can be useful; however, knowledge of the magnitude and spatial pattern of the changes is required before using a DEM for subsequent analyses.
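
    The essence of stream burning can be stated in a few lines; the sketch below is the crudest fixed-drop variant, with AGREE/ANUDEM adding buffer-based reconditioning on top (values are illustrative):

    ```python
    import numpy as np

    def burn_streams(dem, stream_mask, drop=5.0):
        """Crude stream burning: lower DEM cells under a rasterised stream
        network by a fixed drop (m) so that derived flow paths follow it.
        A sketch of the general idea only; AGREE-style methods also
        recondition a buffer zone around the stream."""
        burned = dem.astype(float).copy()
        burned[stream_mask] -= drop
        return burned

    dem = np.array([[10.0, 10.5, 11.0],
                    [10.2, 10.4, 10.9],
                    [10.1, 10.3, 10.8]])
    streams = np.zeros_like(dem, dtype=bool)
    streams[1, :] = True                  # a channel crossing the tile
    print(burn_streams(dem, streams))
    ```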

  6. Artificial terraced field extraction based on high resolution DEMs

    NASA Astrophysics Data System (ADS)

    Na, Jiaming; Yang, Xin; Xiong, Liyang; Tang, Guoan

    2017-04-01

    With the increase of human activities, artificial landforms have become one of the main terrain features, with special geographical and hydrological value. Terraced fields, the most important artificial landscape of the Loess Plateau, play an important role in conserving soil and water. With the development of digital terrain analysis (DTA), there is a current and future need for a robust, repeatable and cost-effective methodology for studying terraced fields. In this paper, a novel method using bidirectional DEM shaded relief is proposed for terraced field identification from high resolution DEMs, taking the Zhifanggou watershed, Shaanxi province, as the study area. Firstly, a 1 m DEM is obtained by low-altitude aerial photogrammetry using an Unmanned Aerial Vehicle (UAV), and a 0.1 m DOM is also obtained as test data. Then, positive and negative terrain segmentation is performed to delimit the terraced field areas. Finally, a bidirectional DEM shaded relief is simulated to extract the ridges of each terraced field stage. The method yields not only polygon features for the terraced field areas but also line features for the terraced field ridges. The accuracy is 89.7% compared with the artificial interpretation result from the DOM, and an additional experiment shows that the method is robust as well as accurate.
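
    Shaded relief itself is a standard computation; a minimal sketch of combining two opposite illumination directions into a "bidirectional" relief image follows. The formula is the common Lambertian hillshade and the parameter choices are assumptions, not the paper's exact configuration:

    ```python
    import numpy as np

    def hillshade(dem, azimuth_deg, altitude_deg, cellsize=1.0):
        """Standard Lambertian shaded relief; two opposite azimuths can be
        combined to highlight terrace risers regardless of facing direction."""
        gy, gx = np.gradient(dem, cellsize)
        slope = np.arctan(np.hypot(gx, gy))
        aspect = np.arctan2(-gx, gy)
        az = np.radians(360.0 - azimuth_deg + 90.0)   # compass -> math angle
        alt = np.radians(altitude_deg)
        shaded = (np.sin(alt) * np.cos(slope) +
                  np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
        return np.clip(shaded, 0.0, 1.0)

    dem = np.random.default_rng(0).random((50, 50)) * 5.0
    # "Bidirectional" relief: average of two opposite illumination directions.
    combined = 0.5 * (hillshade(dem, 315.0, 45.0) + hillshade(dem, 135.0, 45.0))
    ```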

  7. TanDEM-X high resolution DEMs and their applications to flow modeling

    NASA Astrophysics Data System (ADS)

    Wooten, Kelly M.

    Lava flow modeling can be a powerful tool in hazard assessments; however, the ability to produce accurate models is usually limited by a lack of high resolution, up-to-date Digital Elevation Models (DEMs). This is especially obvious in places such as Kilauea Volcano (Hawaii), where active lava flows frequently alter the terrain. In this study, we use a new technique to create high resolution DEMs on Kilauea using synthetic aperture radar (SAR) data from the TanDEM-X (TDX) satellite. We convert raw TDX SAR data into a geocoded DEM using GAMMA software [Werner et al., 2000]. This process can be completed in several hours and permits creation of updated DEMs as soon as new TDX data are available. To test the DEMs, we use the Harris and Rowland [2001] FLOWGO lava flow model combined with the Favalli et al. [2005] DOWNFLOW model to simulate the 3-15 August 2011 eruption on Kilauea's East Rift Zone. Results were compared with simulations using the older, lower resolution 2000 SRTM DEM of Hawaii. Effusion rates used in the model are derived from MODIS thermal infrared satellite imagery. FLOWGO simulations using the TDX DEM produced a single flow line that matched the August 2011 flow almost perfectly, but could not recreate the entire flow field due to the relatively high DEM noise level. The issues with short model flow lengths can be resolved by filtering noise from the DEM. Model simulations using the outdated SRTM DEM produced a flow field that followed a different trajectory to that observed. Numerous lava flows have been emplaced at Kilauea since the creation of the SRTM DEM, leading the model to project flow lines in areas that have since been covered by fresh lava flows. These results show that DEMs can quickly become outdated on active volcanoes, but our new technique offers the potential to produce accurate, updated DEMs for modeling lava flow hazards.

  8. Uncertainty Assessment and Weight Map Generation for Efficient Fusion of Tandem-X and CARTOSAT-1 Dems

    NASA Astrophysics Data System (ADS)

    Bagheri, H.; Schmitt, M.; Zhu, X. X.

    2017-05-01

    Recently, with InSAR data provided by the German TanDEM-X mission, a new global, high-resolution Digital Elevation Model (DEM) has been produced by the German Aerospace Center (DLR) with unprecedented height accuracy. However, due to SAR-inherent sensor specifics, its quality decreases over urban areas, making additional improvement necessary. On the other hand, DEMs derived from optical remote sensing imagery, such as Cartosat-1 data, offer apparently greater resolution in urban areas, making their fusion with TanDEM-X elevation data a promising prospect. The objective of this paper is two-fold: first, the height accuracies of TanDEM-X and Cartosat-1 elevation data over different land types are empirically evaluated in order to analyze the potential of TanDEM-X/Cartosat-1 DEM data fusion. After the quality assessment, urban DEM fusion using weighted averaging is investigated. In this experiment, weight maps derived from the height error maps delivered with the DEM data, as well as more sophisticated weight maps predicted by a procedure based on artificial neural networks (ANNs), are compared. The ANN framework employs several features that describe the height residual behaviour in order to predict the weights used in the subsequent fusion step. The results demonstrate that especially the ANN-based framework is able to improve the quality of the final DEM through data fusion.
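
    The weighted-averaging fusion step has a compact form; below is a minimal inverse-variance version, where the per-pixel sigmas would come from the delivered height error maps or from an ANN prediction (array values are illustrative):

    ```python
    import numpy as np

    def fuse_dems(h_tdx, h_cartosat, sigma_tdx, sigma_cartosat):
        """Inverse-variance weighted averaging of two co-registered DEMs.
        sigma_* are per-pixel height-error estimates; smaller sigma means
        larger weight in the fused result."""
        w1 = 1.0 / sigma_tdx**2
        w2 = 1.0 / sigma_cartosat**2
        return (w1 * h_tdx + w2 * h_cartosat) / (w1 + w2)

    h1 = np.array([[100.0, 101.0], [102.0, 103.0]])   # TanDEM-X heights
    h2 = np.array([[100.6, 100.4], [102.8, 102.2]])   # Cartosat-1 heights
    s1 = np.full_like(h1, 2.0)   # e.g. less reliable over dense buildings
    s2 = np.full_like(h2, 1.0)
    print(fuse_dems(h1, h2, s1, s2))
    ```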

  9. Tracking the Effectiveness of Usability Evaluation Methods.

    DTIC Science & Technology

    2007-11-02

    We present a case study that tracks usability problems predicted with six usability evaluation methods (Claims Analysis, Cognitive Walkthrough, GOMS, Heuristic Evaluation, User Action Notation, and simply reading the specification) through a development process. We assess the methods' predictive

  10. Pragmatism, Evidence, and Mixed Methods Evaluation

    ERIC Educational Resources Information Center

    Hall, Jori N.

    2013-01-01

    Mixed methods evaluation has a long-standing history of enhancing the credibility of evaluation findings. However, using mixed methods in a utilitarian way implicitly emphasizes convenience over engaging with its philosophical underpinnings (Denscombe, 2008). Because of this, some mixed methods evaluators and social science researchers have been…

  11. Evaluation of DNA and RNA extraction methods.

    PubMed

    Edwin Shiaw, C S; Shiran, M S; Cheah, Y K; Tan, G C; Sabariah, A R

    2010-06-01

    This study was done to evaluate various DNA and RNA extractions from archival FFPE tissues. A total of 30 FFPE blocks from the years of 2004 to 2006 were assessed with each modified and adapted method. Extraction protocols evaluated include the modified enzymatic extraction method (Method A), Chelex-100 extraction method (Method B), heat-induced retrieval in alkaline solution extraction method (Methods C and D) and one commercial FFPE DNA Extraction kit (Qiagen, Crawley, UK). For RNA extraction, 2 extraction protocols were evaluated including the enzymatic extraction method (Method 1), and Chelex-100 RNA extraction method (Method 2). Results show that the modified enzymatic extraction method (Method A) is an efficient DNA extraction protocol, while for RNA extraction, the enzymatic method (Method 1) and the Chelex-100 RNA extraction method (Method 2) are equally efficient RNA extraction protocols.

  12. Icesat Validation of Tandem-X I-Dems Over the UK

    NASA Astrophysics Data System (ADS)

    Feng, L.; Muller, J.-P.

    2016-06-01

    From the latest TanDEM-X mission (bistatic X-band interferometric SAR), globally consistent Digital Elevation Models (DEMs) will be available from 2017, but their accuracy has not yet been fully characterised. This paper presents the methods and implementation of statistical procedures for validating the vertical accuracy of TanDEM-X iDEMs at grid-spacings of approximately 12.5 m, 30 m and 90 m, based on processed ICESat data over the UK, in order to assess their potential extrapolation across the globe. Against ICESat GLA14 elevation data, the TanDEM-X iDEM over the UK shows errors of -0.028 ± 3.654 m over England and Wales and 0.316 ± 5.286 m over Scotland at 12.5 m grid-spacing, -0.073 ± 6.575 m at 30 m, and 0.0225 ± 9.251 m at 90 m. Moreover, 90% of all results at the three resolutions of the TanDEM-X iDEM data (expressed as a linear error at 90% confidence level) are below 16.2 m. These validation results also indicate that derivative topographic parameters (slope, aspect and relief) have a strong effect on the vertical accuracy of the TanDEM-X iDEMs: in high-relief, steep terrain, large errors and data voids are frequent and their location is strongly influenced by topography, whilst in low- to medium-relief, low-slope sites, errors are smaller. ICESat-derived elevations are themselves heavily influenced by surface slope within the 70 m footprint, and there are slope-dependent errors in the TanDEM-X iDEMs.
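
    The statistics quoted above (mean ± standard deviation and a linear error at 90% confidence) are straightforward to compute from DEM-minus-reference differences; a sketch with synthetic numbers:

    ```python
    import numpy as np

    def vertical_accuracy(dem_h, ref_h):
        """Mean, standard deviation and linear error at 90% confidence (LE90)
        of DEM-minus-reference elevation differences."""
        d = np.asarray(dem_h) - np.asarray(ref_h)
        le90 = np.percentile(np.abs(d), 90.0)
        return d.mean(), d.std(ddof=1), le90

    rng = np.random.default_rng(7)
    ref = rng.uniform(0.0, 1000.0, 10000)            # reference heights
    dem = ref + rng.normal(-0.03, 3.7, ref.size)     # illustrative error model
    print(vertical_accuracy(dem, ref))
    ```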

  13. Urban DEM generation, analysis and enhancements using TanDEM-X

    NASA Astrophysics Data System (ADS)

    Rossi, Cristian; Gernhardt, Stefan

    2013-11-01

    This paper analyzes the potential of the TanDEM-X mission for the generation of urban Digital Elevation Models (DEMs), exploiting the high resolution of the sensors and the absence of temporal decorrelation. The interferometric chain and the problems encountered in correctly mapping urban areas are analyzed first, taking the operational Integrated TanDEM-X Processor (ITP) algorithms as reference. The main ITP product is called the raw DEM. Whereas the ITP coregistration stage proves robust enough, large improvements in the raw DEM, such as a lower percentage of phase unwrapping errors, can be obtained by using adaptive fringe filters instead of conventional ones in the interferogram generation stage. The shape of the raw DEM in layover areas is also shown and found to be regular for buildings with vertical walls. Generally, in the presence of layover, the raw DEM exhibits a height ramp, resulting in a height underestimation for the affected structure. Examples provided confirm the theoretical background. The focus is on high resolution DEMs produced from spotlight acquisitions. In particular, a raw DEM over Berlin (Germany) with a 2.5 m raster is generated and validated. For this purpose, ITP is modified in its interferogram generation stage by adopting the Intensity Driven Adaptive Neighbourhood (IDAN) algorithm. The height Root Mean Square Error (RMSE) between the raw DEM and a reference is about 8 m for the two classes defining the urban DEM: structures and non-structures. The result can be further improved for the structure class using a DEM generated with Persistent Scatterer Interferometry. A DEM fusion is thus proposed, and a drop of about 20% in the RMSE is reported.

  14. Poisson disk sampling in geodesic metric for DEM simplification

    NASA Astrophysics Data System (ADS)

    Hou, Wenguang; Zhang, Xuming; Li, Xin; Lai, Xudong; Ding, Mingyue

    2013-08-01

    To generate highly compressed digital elevation models (DEMs) with fine details, a method of Poisson disk sampling in the geodesic metric is proposed. The main idea is to pick points uniformly from DEM nodes in the geodesic metric, resulting in terrain-adaptive samples in the Euclidean metric. The method randomly selects a point from the mesh nodes and then judges whether this point can be accepted according to its geodesic distances from the already-sampled points. The whole process is repeated until no more points can be selected. To further adjust the sampling ratios in different areas, weighted geodesic distances, which reflect terrain characteristics, are introduced. In addition to being terrain-adaptive, the resulting sample distributions have good visual quality. The method is simple and easy to implement. Cases are provided to illustrate the feasibility and superiority of the proposed method.
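
    For intuition, here is the brute-force dart-throwing form of Poisson disk sampling in the plain Euclidean metric; the paper's contribution is to replace the Euclidean rejection test with (weighted) geodesic distances measured over the terrain surface:

    ```python
    import numpy as np

    def poisson_disk(width, height, r, n_trials=30000, seed=0):
        """Dart throwing: accept a random point only if it lies at least r
        (Euclidean) from every previously accepted point. Replacing this
        distance with a geodesic one over the DEM surface would make the
        samples terrain-adaptive, as in the method above."""
        rng = np.random.default_rng(seed)
        pts = []
        for _ in range(n_trials):
            p = rng.uniform((0.0, 0.0), (width, height))
            if all((p[0] - q[0])**2 + (p[1] - q[1])**2 >= r * r for q in pts):
                pts.append(p)
        return np.array(pts)

    samples = poisson_disk(100.0, 100.0, r=5.0)
    print(len(samples))   # roughly uniform coverage with minimum spacing r
    ```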

  15. Incorporating Atomic Data Errors in Stellar DEM Reconstruction

    NASA Astrophysics Data System (ADS)

    Kang, Hosung; van Dyk, David A.; Kashyap, Vinay L.; Connors, Alanna

    2005-06-01

    We develop a powerful new method to reconstruct stellar Differential Emission Measures (DEMs); its Bayesian framework allows us to incorporate atomic and calibration errors as prior information. For instance, known errors in line locations, as well as lines missing from the atomic database, can be included directly during fitting. Highly correlated systematic errors in the ion balance may be included as well, arising naturally during Monte Carlo sampling. Our method uses the statistical framework of data augmentation, in which we treat photon counts in each level of a hierarchical structure as missing data. We demonstrate the method by fitting a selected subset of emission lines and continuum in Chandra and EUVE data of Capella to estimate the DEM that best describes the data, and simultaneously determine the element abundances. The Markov chain Monte Carlo based method also naturally produces error estimates on the fit parameters.

  16. Mapping debris-flow hazard in Honolulu using a DEM

    USGS Publications Warehouse

    Ellen, Stephen D.; Mark, Robert K.

    1993-01-01

    A method for mapping hazard posed by debris flows has been developed and applied to an area near Honolulu, Hawaii. The method uses studies of past debris flows to characterize sites of initiation, volume at initiation, and volume-change behavior during flow. Digital simulations of debris flows based on these characteristics are then routed through a digital elevation model (DEM) to estimate degree of hazard over the area.

  17. Improved Fluvial Geomorphic Interpretation Derived From DEM Differencing

    NASA Astrophysics Data System (ADS)

    Wheaton, J. M.; Brasington, J.; Brewer, P. A.; Darby, S.; Pasternack, G. B.; Sear, D.; Vericat, D.; Williams, R.

    2007-12-01

    Technological advances over the past two decades in remotely sensed and ground-based topographic surveying have made the rapid acquisition of topographic data in the fluvial environment possible at spatial resolutions and extents previously unimaginable. Consequently, monitoring geomorphic changes and estimating fluvial sediment budgets by comparing repeat topographic surveys (DEM differencing) has become a tractable, affordable approach for both research purposes and long-term monitoring associated with river restoration. However, meaningful quantitative geomorphic interpretation of repeat topographic surveys has received little attention from either researchers or practitioners. Previous research has shown that quantitative estimates of erosion and deposition from DEM differencing are highly sensitive to DEM uncertainty, with minimum level of detection techniques typically discarding between 40% and 90% of the predicted changes. A series of new methods for segregating reach-scale sediment budgets into their specific process components, while accounting for the influence of DEM uncertainty, were developed and explored to highlight distinctive geomorphic signatures between different styles of change. To illustrate the interpretive power of the techniques in different settings, results are presented from analyses across a range of gravel-bed river types: a) the braided River Feshie, Scotland, UK; b) the formerly gravel-mined, wandering Sulphur Creek, California, USA; c) a heavily regulated reach of the Mokelumne River, California, USA, which has been subjected to over 5 years of spawning habitat rehabilitation; and d) a restored meandering channel and floodplain of the Highland Water, New Forest, UK. Despite fundamentally different process suites between the study sites, the budget segregation technique is in each case able to support more reliable and meaningful geomorphic interpretations of DEM differences.
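
    The minimum level of detection thresholding referred to above is compactly expressed as follows; the sigma values and the 1.96 multiplier (95% confidence) are typical choices, not necessarily those of the cited studies:

    ```python
    import numpy as np

    def dod_with_minlod(dem_new, dem_old, sigma_new, sigma_old, t=1.96):
        """DEM of Difference thresholded at a minimum level of detection:
        changes smaller than t times the combined survey uncertainty are
        discarded (the step that can remove 40-90% of raw predicted change)."""
        dod = dem_new - dem_old
        min_lod = t * np.sqrt(sigma_new**2 + sigma_old**2)
        return np.where(np.abs(dod) >= min_lod, dod, 0.0)

    new = np.array([[2.10, 1.95], [1.40, 1.00]])
    old = np.array([[2.00, 2.00], [1.00, 1.05]])
    dod = dod_with_minlod(new, old, sigma_new=0.05, sigma_old=0.05)
    print(dod)                      # small changes suppressed, real ones kept
    erosion = dod[dod < 0].sum()    # budget segregation starts from such cells
    deposition = dod[dod > 0].sum()
    ```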

  18. Bathymetric survey of water reservoirs in north-eastern Brazil based on TanDEM-X satellite data.

    PubMed

    Zhang, Shuping; Foerster, Saskia; Medeiros, Pedro; de Araújo, José Carlos; Motagh, Mahdi; Waske, Bjoern

    2016-11-15

    Water scarcity in the dry season is a vital problem in dryland regions such as northeastern Brazil. Water supplies in these areas often come from numerous reservoirs of various sizes. However, inventory data for these reservoirs are often limited due to the expense and time required for their acquisition via field surveys, particularly in remote areas. Remote sensing techniques provide a valuable alternative to conventional reservoir bathymetric surveys for water resource management. In this study, single-pass TanDEM-X data acquired in bistatic mode were used to generate digital elevation models (DEMs) in the Madalena catchment, northeastern Brazil. Validation with differential global positioning system (DGPS) data from field measurements indicated an absolute elevation accuracy of approximately 1 m for the TanDEM-X derived DEMs (TDX DEMs). The DEMs derived from TanDEM-X data acquired at low water levels show significant advantages over bathymetric maps derived from field survey, particularly with regard to coverage, evenly distributed measurements and replication of reservoir shape. Furthermore, by mapping the dry reservoir bottoms with TanDEM-X data, TDX DEMs are free of emergent and submerged macrophytes, and independent of water depth (e.g. >10 m), water quality and even weather conditions. The method is thus superior to other existing bathymetric mapping approaches, particularly for inland water bodies. The proposed approach relies on (nearly) dry reservoir conditions at the time of image acquisition and is thus restricted to areas with considerable water-level variations. However, comparisons between the TDX DEM and the bathymetric map derived from field surveys show that the amount of water retained during the dry phase has only marginal impact on the total water volume derived from the TDX DEM. Overall, DEMs generated from bistatic TanDEM-X data acquired in low-water periods constitute a useful and efficient data source for deriving reservoir bathymetry and show

  19. Methods of Evaluating School Innovations.

    ERIC Educational Resources Information Center

    Cooley, William W.

    This evaluative research is concerned with specific educational programs which attempt to adapt instruction to individual differences. Attention is limited to the Frick School, a large urban Pittsburgh school in which the Learning Research and Development Center develops its new educational programs, and to the Follow-Through network where these…

  20. Methods of Writing Instruction Evaluation.

    ERIC Educational Resources Information Center

    Lamb, Bill H.

    The Writing Program Director at Johnson County Community College (Kansas) developed quantitative measures for writing instruction evaluation which can support that institution's growing interest in and support for peer collaboration as a means to improving instructional quality. The first process (Interaction Analysis) has an observer measure…

  1. Methods of Rapid Evaluation, Assessment, and Appraisal

    ERIC Educational Resources Information Center

    McNall, Miles; Foster-Fishman, Pennie G.

    2007-01-01

    A central issue in the use of rapid evaluation and assessment methods (REAM) is achieving a balance between speed and trustworthiness. In this article, the authors review the key differences and common features of this family of methods and present a case example that illustrates how evaluators can use rapid evaluation techniques in their own…

  2. An assessment of TanDEM-X GlobalDEM over rural and urban areas

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Huber, Martin; Rudari, Roberto; Eddy, Andrew; Lucas, Richard

    2014-10-01

    Digital Elevation Models (DEMs) are a key input for the development of risk management systems. The main limitation of currently available DEMs is their low resolution. DEMs such as SRTM 90 m or ASTER are globally available free of charge but offer limited use, for example, to flood modelers in most geographic areas. TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement), the first bistatic SAR mission, can fill this gap. The mission objective is the generation of a consistent global digital elevation model with unprecedented accuracy according to the HRTI-3 (High Resolution Terrain Information) specifications. The mission opens a new era in risk assessment. In the framework of ALTAMIRA INFORMATION research activities, the DIAPASON (Differential Interferometric Automated Process Applied to Survey Of Nature) processing chain has been successfully adapted to TanDEM-X CoSSC (Coregistered Slant Range Single Look Complex) data processing. In this study the capability of CoSSC data for DEM generation is investigated. Within the ongoing FP7 RASOR project (Rapid Analysis and Spatialisation Of Risk), the generated DEMs are compared with the intermediate DEM derived from the TanDEM-X first global coverage. The results are presented and discussed.

  3. Creating High Quality DEMs of Large Scale Fluvial Environments Using Structure-from-Motion

    NASA Astrophysics Data System (ADS)

    Javernick, L. A.; Brasington, J.; Caruso, B. S.; Hicks, M.; Davies, T. R.

    2012-12-01

    During the past decade, advances in survey and sensor technology have generated new opportunities to investigate the structure and dynamics of fluvial systems. Key geomatic technologies include the Global Positioning System (GPS), digital photogrammetry, LiDAR, and terrestrial laser scanning (TLS). Their application has resulted in a profound increase in the dimensionality of topographic surveys - from cross-sections to distributed 3d point clouds and digital elevation models (DEMs). Each of these technologies has been used successfully to derive high quality DEMs of fluvial environments; however, they often require specialized and expensive equipment, such as a TLS or large-format camera, and bespoke platforms such as survey aircraft, making data acquisition prohibitively expensive or highly labour-intensive and thus restricting the extent and frequency of surveys. Recently, advances in computer vision and image analysis have led to the development of a novel photogrammetric approach that is fully automated and suitable for use with simple compact (non-metric) cameras. In this paper, we evaluate this new photogrammetric method, Structure-from-Motion (SfM), and demonstrate how it can be used to generate DEMs of comparable quality to airborne LiDAR, using consumer-grade cameras at low cost. Using the SfM software PhotoScan (version 0.8.5), high quality DEMs were produced for a 1.6 km reach and a 3.3 km reach of the braided Ahuriri River, New Zealand. Photographs used for DEM creation were acquired from a helicopter flying at 600 m and 800 m above ground level using a consumer-grade 10.1-megapixel, non-metric digital camera, resulting in object-space image resolutions of 0.12 m and 0.16 m, respectively. Point clouds for the two study reaches were generated from 147 and 224 photographs, respectively, and were extracted automatically in an arbitrary coordinate system; RTK-GPS located ground control points (GCPs) were used to define a 3d non

  4. Evaluation Methods of The Text Entities

    ERIC Educational Resources Information Center

    Popa, Marius

    2006-01-01

    The paper highlights some evaluation methods to assess the quality characteristics of the text entities. The main concepts used in building and evaluation processes of the text entities are presented. Also, some aggregated metrics for orthogonality measurements are presented. The evaluation process for automatic evaluation of the text entities is…

  5. Genetics | Selection: Evaluation and Methods

    USDA-ARS?s Scientific Manuscript database

    The procedures used for collecting and analyzing data for genetic improvement of dairy cattle are described. Methods of identification and milk recording are presented. Selection traits include production (milk, fat, and protein yields and component percentages), conformation (final score and linear...

  6. Applying the Artificial Neural Network to Predict the Soil Responses in the DEM Simulation

    NASA Astrophysics Data System (ADS)

    Li, Z.; Chow, J. K.; Wang, Y. H.

    2017-06-01

    This paper aims to bridge soil properties and the soil response in discrete element method (DEM) simulations using an artificial neural network (ANN). The network was designed to output the stress-strain-volumetric response from inputs of the soil properties. 31 biaxial shearing tests with varying soil parameters were generated using DEM simulations. Based on these 31 training samples, a three-layer neural network was established. Two extra samples were generated to examine the validity of the network, and the curves predicted by the ANN matched well with those from the DEM simulations. Overall, the ANN was found promising for effectively modelling the soil behaviour.
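
    The mapping described above, from a handful of soil parameters to a discretized stress-strain curve, can be reproduced in outline with any small feed-forward network. The sketch below is a minimal illustration of the idea rather than the authors' code; the parameter choices, network size and data shapes are all assumptions.

    ```python
    # Minimal sketch: train a small feed-forward network to map DEM soil
    # parameters to a discretized stress-strain response. Shapes, parameter
    # ranges and the stand-in response function are hypothetical.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # 31 training samples, e.g. (friction coeff., normalised stiffness, porosity).
    X_train = rng.uniform([0.1, 0.0, 0.15], [0.7, 1.0, 0.30], size=(31, 3))

    # Each target is a response curve sampled at 50 strain levels; a synthetic
    # function stands in here for the DEM simulation output.
    strain = np.linspace(0.0, 0.1, 50)
    y_train = np.array([p[0] * (1.0 - np.exp(-strain / (0.01 + p[2])))
                        for p in X_train])

    # One hidden layer gives a "three-layer" network (input, hidden, output).
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=20000, random_state=0)
    net.fit(X_train, y_train)

    # Predict the full response curve for an unseen parameter set.
    curve = net.predict(np.array([[0.45, 0.5, 0.22]]))  # shape (1, 50)
    print(curve.shape)
    ```

    The point is only that the network maps a low-dimensional parameter vector to a whole response curve, mirroring the 31-sample training / 2-sample validation split described in the abstract; in practice the inputs would be standardised first.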

  7. Hydraulic fracturing - an attempt of DEM simulation

    NASA Astrophysics Data System (ADS)

    Kosmala, Alicja; Foltyn, Natalia; Klejment, Piotr; Dębski, Wojciech

    2017-04-01

    Hydraulic fracturing is a technique widely used in the exploitation of oil, gas and unconventional reservoirs in order to enable the oil/gas to flow more easily and enhance production. It relies on pumping a special fluid into a rock under high pressure, which creates a set of microcracks that enhance the porosity of the reservoir rock. In this research, an attempt to simulate such a hydrofracturing process using the Discrete Element Method is presented. The basic assumption of this approach is that the rock can be represented as an assembly of discrete particles cemented into a rigid sample (Potyondy 2004). The voids among the particles then represent a pore system which can be filled by fracturing fluid, numerically represented by much smaller particles. Following this microscopic point of view and its numerical representation by the DEM, we present preliminary results of a numerical analysis of hydrofracturing phenomena using the ESyS-Particle software. In particular, we consider what happens in the direct vicinity of the border between the rock sample and the fracking particles, how cracks are created and evolve through the breaking of bonds between particles, how acoustic/seismic energy is released, and so on. D.O. Potyondy, P.A. Cundall. A bonded-particle model for rock. International Journal of Rock Mechanics and Mining Sciences, 41 (2004), pp. 1329-1364.
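
    The bonded-particle picture invoked above (Potyondy and Cundall, 2004) reduces to breakable elastic links between neighbouring particles. The sketch below shows the basic bookkeeping only, under assumed names and a purely tensile failure criterion; production codes such as ESyS-Particle also track shear forces, moments and damping.

    ```python
    # Sketch of bonded-particle bookkeeping: each bond carries a normal force
    # until a tensile strength is exceeded, then breaks irreversibly -- the
    # numerical analogue of a microcrack. Illustrative assumptions throughout.
    import numpy as np

    def update_bonds(positions, bonds, rest_length, k_n, tensile_strength):
        """positions: (n, d) array; bonds: set of (i, j) index pairs."""
        broken = set()
        for i, j in bonds:
            dist = np.linalg.norm(positions[j] - positions[i])
            stretch = dist - rest_length           # > 0 means tension
            if k_n * stretch > tensile_strength:   # bond fails in tension
                broken.add((i, j))                 # a microcrack has formed
        return bonds - broken, broken

    positions = np.array([[0.0, 0.0], [1.05, 0.0], [2.30, 0.0]])
    bonds = {(0, 1), (1, 2)}
    bonds, cracks = update_bonds(positions, bonds, rest_length=1.0,
                                 k_n=1e4, tensile_strength=1e3)
    print("surviving:", bonds, "broken:", cracks)
    ```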

  8. The Safeguards Evaluation Method for evaluating vulnerability to insider threats

    SciTech Connect

    Al-Ayat, R.A.; Judd, B.R.; Renis, T.A.

    1986-01-01

    As protection of DOE facilities against outsiders increases to acceptable levels, attention is shifting toward achieving comparable protection against insiders. Since threats and protection measures for insiders are substantially different from those for outsiders, new perspectives and approaches are needed. One such approach is the Safeguards Evaluation Method. This method helps in assessing safeguards vulnerabilities to theft or diversion of special nuclear material (SNM) by insiders. The Safeguards Evaluation Method-Insider Threat is a simple model that can be used by safeguards and security planners to evaluate safeguards and proposed upgrades at their own facilities. A discussion of the Safeguards Evaluation Method is presented in this paper.

  9. A simplified DEM numerical simulation of vibroflotation without backfill

    NASA Astrophysics Data System (ADS)

    Jiang, M. J.; Liu, W. W.; He, J.; Sun, Y.

    2015-09-01

    Vibroflotation is one of the deep vibratory compaction techniques for ground reinforcement. The method densifies the soil and improves its mechanical properties, thus helping to protect people's lives and property from geological disasters. The macro reinforcement mechanisms of the vibroflotation method have been investigated by numerical simulations and by laboratory and in-situ experiments. However, little attention has been paid to its micro-mechanism, which is essential to fully understand the principle of the ground reinforcement. The discrete element method (DEM), based on discrete mechanics, is well suited to solving large-deformation and failure problems. This paper investigated the macro-micro mechanism of vibroflotation without backfill under two conditions, i.e., whether or not ground water was considered, by incorporating an inter-particle rolling resistance model in the DEM simulations. The conclusions are as follows. The DEM simulations incorporating rolling resistance replicate the mechanical response of the soil assemblages well and are in line with practical observations. The void ratio of the granular soil fluctuates up and down in the process of vibroflotation, and finally settles at a lower value. It is more efficient to densify the ground without water than the ground with water.

  10. Evaluation of Rhenium Joining Methods

    NASA Technical Reports Server (NTRS)

    Reed, Brian D.; Morren, Sybil H.

    1995-01-01

    Coupons of rhenium-to-Cl03 flat plate joints, formed by explosive and diffusion bonding, were evaluated in a series of shear tests. Shear testing was conducted on as-received, thermally-cycled (100 cycles, from 21 to 1100 C), and thermally-aged (3 and 6 hrs at 1100 C) joint coupons. Shear tests were also conducted on joint coupons with rhenium and/or Cl03 electron beam welded tabs to simulate the joint's incorporation into a structure. Ultimate shear strength was used as a figure of merit to assess the effects of the thermal treatment and the electron beam welding of tabs on the joint coupons. All of the coupons survived thermal testing intact and without any visible degradation. Two different lots of as-received, explosively-bonded joint coupons had ultimate shear strengths of 281 and 310 MPa and 162 and 223 MPa, respectively. As-received, diffusion-bonded coupons had ultimate shear strengths of 199 and 348 MPa. For the most part, the thermally-treated and rhenium weld tab coupons had shear strengths slightly reduced or within the range of the as-received values. Coupons with Cl03 weld tabs experienced a significant reduction in shear strength. The degradation of strength appeared to be the result of a poor heat sink provided during the electron beam welding. The Cl03 base material could not dissipate heat as effectively as rhenium, leading to the formation of a brittle rhenium-niobium intermetallic.

  11. Improving the TanDEM-X DEM for flood modelling using flood extents from Synthetic Aperture Radar images.

    NASA Astrophysics Data System (ADS)

    Mason, David; Trigg, Mark; Garcia-Pintado, Javier; Cloke, Hannah; Neal, Jeffrey; Bates, Paul

    2015-04-01

    Many floodplains in the developed world have now been imaged with high resolution airborne LiDAR or InSAR, giving accurate DEMs that facilitate reliable flood inundation modelling. This is not always the case for remote rivers in developing countries. However, the accuracy of DEMs produced for modelling studies on such rivers should be enhanced in the near future by the high resolution TanDEM-X World DEM. In a parallel development, increasing use is now being made of flood extents derived from high resolution SAR images for calibrating and validating flood inundation models, and for assimilating observations into them, in order to improve their predictions. The paper discusses an additional use of SAR flood extents: to improve the accuracy of the TanDEM-X DEM in the floodplain covered by the flood extents, thereby permanently improving the DEM for future flood modelling studies in this area. The method is based on the fact that for larger rivers the water elevation changes only slowly along a reach, so that the boundary of the flood extent (the waterline) can be regarded locally as a quasi-contour. As a result, the heights of adjacent pixels along a small section of waterline can be regarded as a sample of heights with a common population mean. The height of the central pixel in the section can be replaced with the average of these heights, leading to a more accurate height estimate. While this results in a reduction in the height errors along a waterline, the waterline is a linear feature in a two-dimensional space. However, improvements to the DEM heights between adjacent pairs of waterlines can also be made, because DEM heights enclosed by the higher waterline of a pair must be no higher than the refined heights along the higher waterline, whereas DEM heights not enclosed by the lower waterline must be no lower than the refined heights along the lower waterline. In addition, DEM heights between the higher and lower waterlines can also be assigned smaller errors because of the
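
    As a toy illustration of the quasi-contour averaging step described above (window size and variable names are assumptions, not the authors' implementation):

    ```python
    # Toy version of the waterline refinement: along a section of waterline
    # pixels, replace each height by the mean of its neighbours, on the grounds
    # that they sample a common water-surface elevation.
    import numpy as np

    def refine_waterline_heights(heights, half_window=3):
        """heights: DEM heights of consecutive pixels along a waterline."""
        refined = heights.astype(float).copy()
        n = len(heights)
        for k in range(n):
            lo, hi = max(0, k - half_window), min(n, k + half_window + 1)
            refined[k] = heights[lo:hi].mean()  # local sample, common mean
        return refined

    waterline = np.array([12.1, 11.8, 12.4, 12.0, 11.7, 12.3, 12.2])
    print(refine_waterline_heights(waterline))
    ```

    The averaging shrinks the random component of the height error roughly by the square root of the window size, which is the sense in which the waterline heights become more accurate estimates.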

  12. Application of TanDEM-X interferometry in volcano monitoring

    NASA Astrophysics Data System (ADS)

    Kubanek, Julia; Westerhaus, Malte; Heck, Bernhard

    2013-04-01

    crater rim signaled the end of magma ascent in June 2011. The bistatic TanDEM-X data gives important information on this explosion as we can observe material changes in the summit area when comparing datasets taken before and after the explosion. Our results indicate that repeated DEMs with great detail and good accuracy are obtainable, enabling a quantitative estimation of finite volume changes in the summit area of the volcano. Additionally, we highlight the importance of employing remote sensing methods to collect data in volcano research.

  13. Evaluating the utility of two gestural discomfort evaluation methods

    PubMed Central

    Son, Minseok; Jung, Jaemoon; Park, Woojin

    2017-01-01

    Evaluating the physical discomfort of designed gestures is important for creating safe and usable gesture-based interaction systems; yet gestural discomfort evaluation has not been extensively studied in HCI, and few evaluation methods are currently available whose utility has been experimentally confirmed. To address this, this study empirically demonstrated the utility of the subjective rating method after a small number of gesture repetitions (a maximum of four) in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. The subjective rating method has been widely used in previous gesture studies, but without empirical evidence of its utility. This study also proposed a gesture discomfort evaluation method based on an existing ergonomics posture evaluation tool, Rapid Upper Limb Assessment (RULA), and demonstrated its utility against the same criterion. RULA is an ergonomics postural analysis tool that quantifies work-related musculoskeletal disorder risks for manual tasks, and was hypothesized to be capable of correctly determining discomfort resulting from prolonged, repetitive gesture use. The two methods were evaluated through comparisons against a baseline method involving discomfort rating after actual prolonged, repetitive gesture use. Correlation analyses indicated that both methods were in good agreement with the baseline. The methods proposed in this study seem useful for predicting discomfort resulting from prolonged, repetitive gesture use, and are expected to help interaction designers create safe and usable gesture-based interaction systems. PMID:28423016

  14. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, model-based applications and project results have been published, based on a variety of approaches, parametrizations and calibrations. Digital Elevation Models (DEMs) come with many different resolution (scale) and quality (accuracy) properties, some of these resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question starts from simply demonstrating the differences in release risk areas and intensities obtained by applying identical models to DEMs with different properties, and then extends this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters, different metrics are established, based on simple value ranges and probabilities as well as fuzzy expressions and fractal metrics. As a specific approach, the work on DEM resolution-dependent 'slope spectra' is considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study, focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  15. Evaluation Measures and Methods: Some Intersections.

    ERIC Educational Resources Information Center

    Elliott, John

    The literature is reviewed for four combinations of evaluation measures and methods: traditional methods with traditional measures (T-Meth/T-Mea), nontraditional methods with traditional measures (N-Meth/T-Mea), traditional methods with nontraditional measures (T-Meth/N-Mea), and nontraditional methods with nontraditional measures (N-Meth/N-Mea).…

  17. Evaluation Methods for Intelligent Tutoring Systems Revisited

    ERIC Educational Resources Information Center

    Greer, Jim; Mark, Mary

    2016-01-01

    The 1993 paper in "IJAIED" on evaluation methods for Intelligent Tutoring Systems (ITS) still holds up well today. Basic evaluation techniques described in that paper remain in use. Approaches such as kappa scores, simulated learners and learning curves are refinements on past evaluation techniques. New approaches have also arisen, in…

  18. Methods of Generating and Evaluating Hypertext.

    ERIC Educational Resources Information Center

    Blustein, James; Staveley, Mark S.

    2001-01-01

    Focuses on methods of generating and evaluating hypertext. Highlights include historical landmarks; nonlinearity; literary hypertext; models of hypertext; manual, automatic, and semi-automatic generation of hypertext; mathematical models for hypertext evaluation, including computing coverage and correlation; human factors in evaluation; and…

  20. Quantitative Methods for Software Selection and Evaluation

    DTIC Science & Technology

    2006-09-01

    Michael S. Bandor, September 2006, Acquisition Support Program. When performing a "buy" analysis and selecting a product as part of a software acquisition strategy, most organizations will consider primarily

  1. [A method of iris image quality evaluation].

    PubMed

    Murat, Hamit; Mao, Dawei; Tong, Qinye

    2006-04-01

    Iris image quality evaluation plays a very important part in computerized iris recognition. An iris image quality evaluation method was introduced in this study to distinguish good images from bad images caused by pupil distortion, blurred boundaries, non-concentric circles, and severe occlusion by eyelids and eyelashes. Tests based on this method gave good results.

  2. DEM Simulation of Rotational Disruption of Rubble-Pile Asteroids

    NASA Astrophysics Data System (ADS)

    Sanchez, Paul; Scheeres, D. J.

    2010-10-01

    We report on our study of rotation-induced disruption of a self-gravitating granular aggregate using a Discrete Element Method (DEM) granular dynamics code, a class of simulation commonly used in the granular mechanics community. Specifically, we simulate the behavior of a simulated asteroid when subjected to an array of rotation rates that cross its disruption limit. The code used to carry out these studies implements a soft-sphere DEM method as applied to granular systems. In addition, a novel algorithm to calculate self-gravitating forces, which makes use of the DEM static grid, has been developed and implemented in the code. By using a DEM code, it is possible to model a poly-disperse aggregate with a specified size-distribution power law, incorporate contact forces such as dry cohesion and friction, and compute internal stresses within the gravitational aggregate. This approach to the modeling of gravitational aggregates is complementary to, and distinctly different from, other approaches reported in the literature. The simulations use both 2D and 3D modeling for analysis. One aim of this work is to understand the basic processes and dynamics of aggregates during the disruption process. We have used these simulations to understand how to form a contact binary that mimics observed asteroid shapes, how to accelerate the rotation rate of the aggregate so that it has enough time to reshape and find a stable configuration, and how to analyze a system with an occasionally changing shape. From a more physical point of view, we have focused on understanding the dynamics of the reshaping process, the evolution of internal stresses during this reshaping, and finding the critical disruption angular velocity. This research was supported by a grant from NASA's PG&G Program: NNX10AJ66G
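
    The two ingredients named above, soft-sphere contacts and mutual gravity, can be sketched in a few lines. This is a naive O(N²) illustration with assumed parameter values, not the grid-accelerated self-gravity algorithm the abstract describes.

    ```python
    # Naive sketch: pairwise self-gravity plus linear soft-sphere repulsion.
    # The real code accelerates gravity with the DEM static grid; this O(N^2)
    # double loop only shows the structure of the force model.
    import numpy as np

    G = 6.674e-11  # gravitational constant

    def forces(pos, radii, masses, k_contact=1e4):
        f = np.zeros_like(pos)
        n = len(pos)
        for i in range(n):
            for j in range(i + 1, n):
                d = pos[j] - pos[i]
                r = np.linalg.norm(d)
                u = d / r
                fg = G * masses[i] * masses[j] / r**2             # mutual gravity
                overlap = radii[i] + radii[j] - r
                fc = k_contact * overlap if overlap > 0 else 0.0  # soft sphere
                f[i] += (fg - fc) * u
                f[j] -= (fg - fc) * u
        return f

    # Arbitrary toy units, three particles on a line.
    pos = np.array([[0.0, 0.0], [1.9, 0.0], [5.0, 0.0]])
    print(forces(pos, np.array([1.0, 1.0, 1.0]), np.array([1e3, 1e3, 1e3])))
    ```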

  3. Glacier Volume Change Estimation Using Time Series of Improved Aster Dems

    NASA Astrophysics Data System (ADS)

    Girod, Luc; Nuth, Christopher; Kääb, Andreas

    2016-06-01

    Volume change data is critical to the understanding of glacier response to climate change. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) system on board the Terra (EOS AM-1) satellite has been a unique source of systematic stereoscopic images covering the whole globe at 15 m resolution and at a consistent quality for over 15 years. While satellite stereo sensors with significantly improved radiometric and spatial resolution are available today, the potential of ASTER data lies in its long, consistent time series, which is unrivaled, though not fully exploited for change analysis due to limits on data accuracy and precision. Here, we developed an improved method for ASTER DEM generation and implemented it in the open source photogrammetric library and software suite MicMac. The method relies on the computation of a rational polynomial coefficient (RPC) model and the detection and correction of cross-track sensor jitter in order to compute DEMs. ASTER data are strongly affected by attitude jitter, mainly of approximately 4 km and 30 km wavelength, and improving the generation of ASTER DEMs requires removal of this effect. Our sensor modeling does not require ground control points and thus potentially allows for the automatic processing of large data volumes. As a proof of concept, we chose a set of glaciers with reference DEMs available to assess the quality of our measurements. We use time series of ASTER scenes from which we extracted DEMs with a ground sampling distance of 15 m. Our method directly measures and accounts for the cross-track component of jitter so that the resulting DEMs are not contaminated by this process. Since the along-track component of jitter has the same direction as the stereo parallaxes, the two cannot be separated, and the extracted elevations are thus contaminated by along-track jitter. Initial tests reveal no clear relation between the cross-track and along-track components so that the latter seems not to be

  4. Influence of the external DEM on PS-InSAR processing and results on Northern Apennine slopes

    NASA Astrophysics Data System (ADS)

    Bayer, B.; Schmidt, D. A.; Simoni, A.

    2014-12-01

    We present an InSAR analysis of slow-moving landslides in the Northern Apennines, Italy, and assess the dependence of the results on the choice of DEM. In recent years, advanced processing techniques for synthetic aperture radar interferometry (InSAR) have been applied to measure slope movements. The persistent scatterers (PS-InSAR) approach is probably the most widely used, and some codes are now available in the public domain. The Stanford Method of Persistent Scatterers (StaMPS) has been successfully used to analyze landslide areas. One problematic step in the processing chain is the choice of an external DEM that is used to model and remove the topographic phase in a series of interferograms in order to obtain the phase contribution caused by surface deformation. The choice is not trivial, because the PS-InSAR results differ significantly in terms of PS identification, positioning, and the resulting deformation signal. We use four different DEMs to process a set of 18 ASAR (Envisat) scenes over a mountain area (~350 km2) of the Northern Apennines of Italy, using StaMPS. Slow-moving landslides control the evolution of the landscape and cover approximately 30% of the territory. Our focus in this presentation is to evaluate the influence of DEM resolution and accuracy by comparing PS-InSAR results. On an areal basis, we perform a statistical analysis of displacement time-series to make the comparison. We also consider two case studies to illustrate the differences in terms of PS identification, number and estimated displacements. It is clearly shown that DEM accuracy positively influences the number of PS, while line-of-sight rates differ from case to case and can result in deformation signals that are difficult to interpret. We also take advantage of statistical tools to analyze the obtained time-series datasets for the whole study area. Results indicate differences in the style and amount of displacement that can be related to the accuracy of the employed DEM.
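
    The sensitivity at issue here is the standard topographic term of the interferometric phase. For a repeat-pass geometry it is commonly written as follows (a textbook InSAR relation, not a formula specific to this study):

    ```latex
    % Topographic phase and height of ambiguity (repeat-pass InSAR):
    \phi_{\mathrm{topo}} = \frac{4\pi}{\lambda}\,\frac{B_{\perp}\,h}{R\,\sin\theta},
    \qquad
    h_a = \frac{\lambda\,R\,\sin\theta}{2\,B_{\perp}},
    % B_perp: perpendicular baseline, h: terrain height, R: slant range,
    % theta: incidence angle, h_a: height change producing one phase cycle.
    ```

    An error dh in the external DEM therefore leaks a spurious phase of (4π/λ)·B⊥·dh/(R·sinθ) into each interferogram, largest on long-baseline pairs; this is why DEM accuracy shows up directly in PS identification and line-of-sight rates.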

  5. Morphological changes at Mt. Etna detected by TanDEM-X

    NASA Astrophysics Data System (ADS)

    Wegmuller, Urs; Bonforte, Alessandro; De Beni, Emanuela; Guglielmino, Francesco; Strozzi, Tazio

    2014-05-01

    We compared the 2012 TanDEM-X model with the 2000 SRTM DEM in order to evaluate the morphological changes that occurred on the volcano during the 12-year time lapse. The pixel size of the SRTM DEM is about 90 m, and we resampled the TanDEM-X model to fit this value. The results show that most of the variations occurred in the Valle del Bove and in the summit crater areas. In order to compare DEMs with the same pixel size, we performed a further comparison with a 5 m ground resolution optical DEM, produced in 2004 and covering only the summit area. The variations in topography have been compared with ground mapping surveys, confirming a good correlation with the spatial extent of the lava flows and of the pyroclastic deposits emplaced on Mt. Etna in the last seven years. The comparison between the two DEMs (2004-2012) allows calculation of the amount of volcanics emitted and clear monitoring of the growth and development of the New South East Crater (NSEC). TanDEM-X is a useful tool for monitoring volcanic areas characterized by quite frequent activity (a paroxysm every 5-10 days), such as Mt. Etna, especially when activity is concentrated in areas that are not easily accessible.

  6. Safeguards Evaluation Method for evaluating vulnerability to insider threats

    SciTech Connect

    Al-Ayat, R.A.; Judd, B.R.; Renis, T.A.

    1986-01-01

    As protection of DOE facilities against outsiders increases to acceptable levels, attention is shifting toward achieving comparable protection against insiders. Since threats and protection measures for insiders are substantially different from those for outsiders, new perspectives and approaches are needed. One such approach is the Safeguards Evaluation Method. This method helps in assessing safeguards vulnerabilities to theft or diversion of special nuclear material (SNM) by insiders. The Safeguards Evaluation Method-Insider Threat is a simple model that can be used by safeguards and security planners to evaluate safeguards and proposed upgrades at their own facilities. The method is used to evaluate the effectiveness of safeguards in both timely detection (in time to prevent theft) and late detection (after-the-fact). The method considers the various types of potential insider adversaries working alone or in collusion with other insiders. The approach can be used for a wide variety of facilities with various quantities and forms of SNM. An Evaluation Workbook provides documentation of the baseline assessment; this simplifies subsequent on-site appraisals. Quantitative evaluation is facilitated by an accompanying computer program. The method significantly increases an evaluation team's on-site analytical capabilities, thereby producing a more thorough and accurate safeguards evaluation.

  7. Accuracy of geolocation and DEM for ASTER

    NASA Astrophysics Data System (ADS)

    Watanabe, Hiroshi

    2005-10-01

    Since its launch in December 1999, ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) has collected more than 1,000,000 scenes of data and generated more than 10,000 DEMs and ortho-rectified images (Level 3A) from them, covering 20% of the Earth's land surface. The relative and absolute accuracy of geolocation and of the DEMs is discussed by comparison with GCPs (Ground Control Points), GIS (Geographic Information System) data and other existing topographic maps. ASTER has shown very high geometric accuracy even when no GCPs are available. Contributing factors to this high accuracy are the stability and knowledge of the spacecraft orbit and attitude, the ASTER sensor geometry, information on the Earth's motion, the algorithm used to calculate the line-of-sight vectors, and so on. Discussion also covers the applicability of the DEM and ortho-rectified image data, based on the accuracy, and further improvements.

  8. Hair Evaluation Methods: Merits and Demerits

    PubMed Central

    Dhurat, Rachita; Saraogi, Punit

    2009-01-01

    Various methods are available for the evaluation (diagnosis and/or quantification) of a patient presenting with hair loss. Hair evaluation methods are grouped into three main categories: non-invasive methods (e.g., questionnaire, daily hair counts, standardized wash test, 60-s hair count, global photographs, dermoscopy, hair weight, contrasting felt examination, phototrichogram, TrichoScan and polarizing and surface electron microscopy), semi-invasive methods (e.g., trichogram and unit area trichogram) and invasive methods (e.g., scalp biopsy). No single method is both 'ideal' and feasible. However, when interpreted with caution, these are valuable tools for patient diagnosis and monitoring. Daily hair counts, the wash test, etc. are good methods for primary evaluation of the patient and for an approximate assessment of the amount of shedding. Some methods, like global photography, form an important part of any hair clinic. Analytical methods like the phototrichogram are usually possible only in the setting of a clinical trial. Many of these methods (like the scalp biopsy) require expertise for both processing and interpretation. We reviewed the available literature in detail in light of the merits and demerits of each method. A plethora of newer methods is being introduced, which are relevant to the cosmetic industry and research. Such methods, as well as metabolic/hormonal evaluation, are not included in this review. PMID:20927232

  9. Development of a 'bare-earth' SRTM DEM product

    NASA Astrophysics Data System (ADS)

    O'Loughlin, Fiachra; Paiva, Rodrigo; Durand, Michael; Alsdorf, Douglas; Bates, Paul

    2015-04-01

    We present the methodology and results from the development of a near-global 'bare-earth' Digital Elevation Model (DEM) derived from the Shuttle Radar Topography Mission (SRTM) data. Digital Elevation Models are the most important input for hydraulic modelling, as DEM quality governs the accuracy of the model outputs. While SRTM is currently the best near-globally [60N to 60S] available DEM, it requires adjustments to reduce the vegetation contamination and make it useful for hydrodynamic modelling over heavily vegetated areas (e.g. tropical wetlands). Unlike previous methods of accounting for vegetation contamination, which concentrated on correcting relatively small areas and usually applied a static adjustment, we account for vegetation contamination globally and apply a spatially varying correction based on information about canopy height and density. In creating the final 'bare-earth' SRTM DEM dataset, we produced three different 'bare-earth' SRTM products. The first applies global parameters, while the second and third products apply parameters that are regionalised based on either climatic zones or vegetation types, respectively. We also tested two different canopy density proxies of different spatial resolution. Using ground elevations obtained from the ICESat GLA14 satellite altimeter, we calculate the residual errors for the raw SRTM and the three 'bare-earth' SRTM products and compare performances. The three 'bare-earth' products all show large improvements over the raw SRTM in vegetated areas, with the overall mean bias reduced by between 75 and 92%, from 4.94 m to 0.40 m. The overall standard deviation is reduced by between 29 and 33%, from 7.12 m to 4.80 m. As expected, improvements are greater in areas with denser vegetation. The final 'bare-earth' SRTM dataset is available at 3 arc-seconds with lower vertical height errors and less noise than the original SRTM product.
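
    A minimal sketch of the kind of spatially varying correction described, assuming a simple linear dependence on canopy height and density (the regionalised parameters of the actual products are not reproduced here):

    ```python
    # Sketch of a 'bare-earth' correction: subtract from each SRTM cell a
    # fraction of the local canopy height scaled by canopy density. The
    # penetration-fraction model (alpha) is an assumption for illustration.
    import numpy as np

    def bare_earth(srtm, canopy_height, canopy_density, alpha=0.8):
        """canopy_density in [0, 1]; alpha sets how far above the ground the
        radar phase centre sits, on average, in full canopy."""
        return srtm - alpha * canopy_density * canopy_height

    srtm = np.array([[105.0, 98.0], [120.0, 110.0]])   # m
    canopy_h = np.array([[30.0, 0.0], [25.0, 10.0]])   # m
    canopy_d = np.array([[0.9, 0.0], [0.5, 0.3]])      # fraction
    print(bare_earth(srtm, canopy_h, canopy_d))
    ```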

  10. An automated, open-source pipeline for mass production of digital elevation models (DEMs) from very-high-resolution commercial stereo satellite imagery

    NASA Astrophysics Data System (ADS)

    Shean, David E.; Alexandrov, Oleg; Moratto, Zachary M.; Smith, Benjamin E.; Joughin, Ian R.; Porter, Claire; Morin, Paul

    2016-06-01

    We adapted the automated, open source NASA Ames Stereo Pipeline (ASP) to generate digital elevation models (DEMs) and orthoimages from very-high-resolution (VHR) commercial imagery of the Earth. These modifications include support for rigorous and rational polynomial coefficient (RPC) sensor models, sensor geometry correction, bundle adjustment, point cloud co-registration, and significant improvements to the ASP code base. We outline a processing workflow for ∼0.5 m ground sample distance (GSD) DigitalGlobe WorldView-1 and WorldView-2 along-track stereo image data, with an overview of ASP capabilities, an evaluation of ASP correlator options, benchmark test results, and two case studies of DEM accuracy. Output DEM products are posted at ∼2 m with direct geolocation accuracy of <5.0 m CE90/LE90. An automated iterative closest-point (ICP) co-registration tool reduces absolute vertical and horizontal error to <0.5 m where appropriate ground-control data are available, with observed standard deviation of ∼0.1-0.5 m for overlapping, co-registered DEMs (n = 14, 17). While ASP can be used to process individual stereo pairs on a local workstation, the methods presented here were developed for large-scale batch processing in a high-performance computing environment. We are leveraging these resources to produce dense time series and regional mosaics for the Earth's polar regions.

  11. DEM analysis of FOXSI-2 microflare using AIA observations

    NASA Astrophysics Data System (ADS)

    Athiray Panchapakesan, Subramania; Glesener, Lindsay; Vievering, Juliana; Camilo Buitrago-Casas, Juan; Christe, Steven; Inglis, Andrew; Krucker, Sam; Musset, Sophie

    2017-08-01

    The second flight of the Focusing Optics X-ray Solar Imager (FOXSI) sounding rocket experiment was successfully completed on 11 December 2014. FOXSI makes direct imaging and spectral observations of the Sun in hard X-rays using grazing-incidence optics modules which focus X-rays onto seven focal-plane detectors kept at a 2 m distance, in the energy range 4 to 20 keV, to study particle acceleration and coronal heating. Significant HXR emission was observed by FOXSI during microflare events of GOES class A0.5 and A2.5 that occurred during the FOXSI-2 flight. Spectral analysis of the FOXSI data for these events indicates the presence of plasma at higher temperatures (>10 MK). We attempt to study the plasma content in the corona at different temperatures, characterized by the differential emission measure (DEM), over the FOXSI-2 observed flare regions using Atmospheric Imaging Assembly (SDO/AIA) data. We utilize AIA observations in different EUV filters that are sensitive to ionized iron lines to determine the DEM by using a regularized inversion method. This poster will show the properties of hot plasma as derived from FOXSI-2 HXR spectra with supporting DEM analysis using AIA observations.
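
    The inversion being described rests on the standard relation between the count rate observed in each AIA channel and the differential emission measure; schematically, with the regularisation written in generic Tikhonov form (standard formulation, not this poster's exact notation):

    ```latex
    % Each AIA channel i measures a temperature integral of the DEM:
    y_i = \int K_i(T)\,\mathrm{DEM}(T)\,\mathrm{d}T,
    % and regularized inversion selects the non-negative DEM minimizing a
    % penalized misfit, with smoothing operator L and parameter lambda:
    \widehat{\mathrm{DEM}} = \arg\min_{\xi \geq 0}
        \,\lVert \mathbf{K}\xi - \mathbf{y} \rVert^2
        + \lambda \lVert \mathbf{L}\xi \rVert^2 .
    ```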

  12. Efficient parallel CFD-DEM simulations using OpenMP

    NASA Astrophysics Data System (ADS)

    Amritkar, Amit; Deb, Surya; Tafti, Danesh

    2014-01-01

    The paper describes parallelization strategies for the Discrete Element Method (DEM) used for simulating dense particulate systems coupled to Computational Fluid Dynamics (CFD). While the field equations of CFD are best parallelized by spatial domain decomposition techniques, the N-body particulate phase is best parallelized over the number of particles. When the two are coupled together, both modes are needed for efficient parallelization. It is shown that under these requirements, OpenMP thread-based parallelization has advantages over MPI processes. Two representative examples, fairly typical of dense fluid-particulate systems, are investigated, including the validation of the DEM-CFD and thermal-DEM implementations against experiments. Fluidized bed calculations are performed on beds with uniform particle loading, parallelized with MPI and OpenMP. It is shown that as the number of processing cores and the number of particles increase, the communication overhead of building ghost particle lists at processor boundaries dominates time to solution, and OpenMP, which does not require this step, is about twice as fast as MPI. In rotary kiln heat transfer calculations, which are characterized by spatially non-uniform particle distributions, the low overhead of switching the parallelization mode in OpenMP eliminates the load imbalances, but introduces increased overheads in fetching non-local data. In spite of this, it is shown that OpenMP is between 50 and 90% faster than MPI.

  13. Research on image scrambling degree evaluation method

    NASA Astrophysics Data System (ADS)

    Bai, Sen; Liao, Xiaofeng; Chen, Jinyu; Liu, Yijun; Wang, Xiao

    2005-12-01

    This paper discusses the evaluation problem of the image scrambling degree (ISD). Inspired by the evaluation of image texture characteristics, three new metrics for objectively assessing the ISD are proposed. The first method utilizes the energy-concentration property of the Walsh transformation (WT) and takes into account the properties that a good ISD measurement method should satisfy. The second method uses the angular second moment (ASM) of the image gray-level co-occurrence matrix (GLCM). The third method combines the entropy of the GLCM with image texture characteristics. Experimental results show that the proposed metrics are effective for assessing the ISD and correlate well with subjective assessment. In terms of computational complexity, the first evaluation method, based on the WT, is markedly superior to the methods based on the ASM and the GLCM entropy in terms of time cost.
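
    For the second metric, the angular second moment of the co-occurrence matrix can be computed directly with standard tooling; a small sketch follows (the distances, angles and test images are assumptions, and the function names require scikit-image 0.19 or later):

    ```python
    # Sketch: GLCM angular second moment as a scrambling proxy. A well
    # scrambled image has a flatter co-occurrence matrix, hence a lower ASM.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    ordered = np.tile(np.arange(0, 256, 32, dtype=np.uint8), (64, 8))  # 64x64 stripes
    scrambled = rng.permuted(ordered.ravel()).reshape(ordered.shape)   # shuffled pixels

    for name, img in [("ordered", ordered), ("scrambled", scrambled)]:
        glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                            symmetric=True, normed=True)
        print(name, "ASM =", float(graycoprops(glcm, "ASM")[0, 0]))
    ```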

  14. The New Global Digital Elevation Model : TanDEM-X DEM and its Final Performance

    NASA Astrophysics Data System (ADS)

    Gonzalez, Carolina; Rizzoli, Paola; Martone, Michele; Wecklich, Christopher; Borla Tridon, Daniela; Bachmann, Markus; Fritz, Thomas; Wessel, Birgit; Krieger, Gerhard; Zink, Manfred

    2017-04-01

    Digital elevation models (DEMs) have become widely used in many scientific and commercial applications, and several local products have been developed in recent years. They provide a representation of the topographic features of the landscape. Their importance is known and valued in every geoscience field, but they also have extensive use in navigation and in other commercial areas. The main goal of the TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurements) mission is the generation of a global DEM, homogeneous in quality, with unprecedented global accuracy and resolution, which was completed in mid-2016. For over four years, the almost identical satellites TerraSAR-X and TanDEM-X acquired single-pass interferometric SAR image pairs, from which it is possible to derive the topographic height by unwrapping the interferometric phase, unaffected by temporal decorrelation. Both satellites have been flying in close formation with a flexible geometric configuration. An optimized acquisition strategy aimed at achieving an absolute vertical accuracy much better than 10 m and a relative vertical accuracy of 2 m and 4 m for flat and steep terrain, respectively, within a horizontal raster of 12 m x 12 m, which varies slightly depending on the geographic latitude. In this paper, we assess the performance of the global TanDEM-X DEM, characterized in terms of relative and absolute vertical accuracy. The coverage statistics are also discussed in comparison to the previous, almost global but lower-resolution, DEM provided by the Shuttle Radar Topography Mission (SRTM). The obtained results confirm the exceptional quality of the global DEM, and the global TanDEM-X DEM is now ready to be distributed to the scientific and commercial community.

  15. ArcGeomorphometry: A toolbox for geomorphometric characterisation of DEMs in the ArcGIS environment

    NASA Astrophysics Data System (ADS)

    Rigol-Sanchez, Juan P.; Stuart, Neil; Pulido-Bosch, Antonio

    2015-12-01

    A software tool is described for the extraction of geomorphometric land surface variables and features from Digital Elevation Models (DEMs). The ArcGeomorphometry Toolbox consists of a series of Python/Numpy processing functions, presented through an easy-to-use graphical menu for the widely used ArcGIS package. Although many GIS provide some operations for analysing DEMs, the methods are often only partially implemented and can be difficult to find and use effectively. Since the results of automated characterisation of landscapes from DEMs are influenced by the extent being considered, the resolution of the source DEM and the size of the kernel (analysis window) used for processing, we have developed a tool that allows GIS users to flexibly apply several multi-scale analysis methods to parameterise and classify a DEM into discrete land surface units. Users can control the threshold values for land surface classifications. The size of the processing kernel can be used to identify land surface features across a range of landscape scales. The pattern of land surface units from each attempt at classification is displayed immediately and can then be processed in the GIS alongside additional data that can assist with a visual assessment and comparison of a series of results. The functionality of the ArcGeomorphometry toolbox is described using an example DEM.
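
    The kernel-size idea generalises outside ArcGIS. As a simple stand-in for the toolbox's multi-scale analysis (illustrative only, not the toolbox's code), local relief can be computed at several window sizes:

    ```python
    # Sketch of multi-scale geomorphometry: local relief (max minus min
    # elevation) over growing kernels; coarse kernels pick out broad landforms.
    import numpy as np
    from scipy import ndimage

    def local_relief(dem, kernel_size):
        mx = ndimage.maximum_filter(dem, size=kernel_size)
        mn = ndimage.minimum_filter(dem, size=kernel_size)
        return mx - mn

    # Synthetic DEM: a regional ramp plus hills.
    dem = np.add.outer(np.linspace(0, 50, 100),
                       10 * np.sin(np.linspace(0, 6, 100)))
    for k in (3, 9, 27):  # analysis windows at three scales
        print(f"kernel {k}: mean relief {local_relief(dem, k).mean():.2f} m")
    ```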

  16. Application of improved extension evaluation method to water quality evaluation

    NASA Astrophysics Data System (ADS)

    Wong, Heung; Hu, Bao Qing

    2014-02-01

    The extension evaluation method (EEM) has been developed and applied to evaluate water quality. There are, however, negative values in the correlative degree (the water quality grades from the EEM) after the calculation. This is not natural, as the correlative degree is essentially an index based on grades (rankings) of water quality produced by different methods, which are positive. To overcome this negativity issue, the interval clustering approach (ICA) was introduced, which is based on the grey clustering approach (GCA) and interval-valued fuzzy sets. However, the computing process and formulas of the ICA are rather complex. This paper provides a novel method, an improved extension evaluation method, that avoids negative values in the correlative degree. To demonstrate our proposed approach, the improved EEM is applied to evaluate the water quality of three different cross-sections of the Fen River, the second major branch river of the Yellow River in China, and of the Han Jiang River, one of the major branch rivers of the Yangtze River in China. The results of the improved evaluation method are basically the same as the official water quality grades. The proposed method also possesses the same merit as the EEM and the ICA: it can be applied to assess water quality when the levels of attributes are defined in terms of intervals in the water quality criteria, whereas existing methods are mostly applicable to data in the form of single numeric values.

  17. Uncertainty of SWAT model at different DEM resolutions in a large mountainous watershed.

    PubMed

    Zhang, Peipei; Liu, Ruimin; Bao, Yimeng; Wang, Jiawei; Yu, Wenwen; Shen, Zhenyao

    2014-04-15

    The objective of this study was to enhance understanding of the sensitivity of the SWAT model to the resolution of Digital Elevation Models (DEMs), based on the analysis of multiple evaluation indicators. The Xiangxi River, a large tributary of the Three Gorges Reservoir in China, was selected as the study area. A range of 17 DEM spatial resolutions, from 30 to 1000 m, was examined, and the annual and monthly model outputs based on each resolution were compared. The following results were obtained: (i) sediment yield was greatly affected by DEM resolution; (ii) the prediction of dissolved oxygen load was significantly affected by DEM resolutions coarser than 500 m; (iii) Total Nitrogen (TN) load was not greatly affected by the DEM resolution; (iv) Nitrate Nitrogen (NO₃-N) and Total Phosphorus (TP) loads were slightly affected by the DEM resolution; and (v) flow and Ammonia Nitrogen (NH₄-N) load were essentially unaffected by the DEM resolution. The flow and dissolved oxygen load decreased more significantly in the dry season than in the wet and normal seasons. Excluding flow and dissolved oxygen, the uncertainties of the other Hydrology/Non-point Source (H/NPS) pollution indicators were greater in the wet season than in the dry and normal seasons. Considering the temporal distribution of uncertainties, the optimal DEM resolution was 30-200 m for flow, 30-100 m for sediment and TP, 30-300 m for dissolved oxygen and NO₃-N, 30-70 m for NH₄-N, and 30-150 m for TN.

  18. Laboratory evaluation of PCBs encapsulation method

    EPA Science Inventory

    Effectiveness and limitations of the encapsulation method for reducing polychlorinated biphenyls (PCBs) concentrations in indoor air and contaminated surface have been evaluated in the laboratory study. Ten coating materials such as epoxy and polyurethane coatings, latex paint, a...

  19. Laboratory evaluation of polychlorinated biphenyls encapsulation methods

    EPA Science Inventory

    Effectiveness and limitations of the encapsulation method for reducing polychlorinated biphenyls (PCBs) concentrations in indoor air and contaminated surface have been evaluated in the laboratory study. Ten coating materials such as epoxy and polyurethane coatings, latex paint, a...

  20. Meteorite. Urmaterie aus dem interplanetaren Raum.

    NASA Astrophysics Data System (ADS)

    Bühler, R. W.

    Contents: 1. Like will-o'-the-wisps from earthly vapours. 2. Stones and iron from outer space. 3. The meteorite finds in Antarctica. 4. Impacts on the Earth. 5. Dust grains and giant boulders. 6. Systematics, mineralogy, petrology, composition. 7. Origins and scientific significance of meteorites. 8. Recognizing and preserving meteorites. 9. Meteorite collections in Europe.

  1. Modelling of Singapore's topographic transformation based on DEMs

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Belle, Iris; Hassler, Uta

    2015-02-01

    Singapore's topography has been heavily transformed by industrialization and urbanization processes. To investigate topographic changes and evaluate soil mass flows, historical topographic maps of 1924 and 2012 were employed, and basic topographic features were vectorized. Digital elevation models (DEMs) for the two years were reconstructed based on vector features. Corresponding slope maps, a surface difference map and a scatter plot of elevation changes were generated and used to quantify and categorize the nature of the topographic transformation. The surface difference map is aggregated into five main categories of changes: (1) areas without significant height changes, (2) lowered-down areas where hill ranges were cut down, (3) raised-up areas where valleys and swamps were filled in, (4) reclaimed areas from the sea, and (5) new water-covered areas. Considering spatial proximity and configurations of different types of changes, topographic transformation can be differentiated as either creating inland flat areas or reclaiming new land from the sea. Typical topographic changes are discussed in the context of Singapore's urbanization processes. The two slope maps and elevation histograms show that generally, the topographic surface of Singapore has become flatter and lower since 1924. More than 89% of height changes have happened within a range of 20 m and 95% have been below 40 m. Because of differences in land surveying and map drawing methods, uncertainties and inaccuracies inherent in the 1924 topographic maps are discussed in detail. In this work, a modified version of a traditional scatter plot is used to present height transformation patterns intuitively. This method of deriving categorical maps of topographical changes from a surface difference map can be used in similar studies to qualitatively interpret transformation. Slope maps and histograms were also used jointly to reveal additional patterns of topographic change.
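
    The aggregation of the surface difference map into categories of change can be expressed as a few thresholding rules; the masks and the ±2 m tolerance below are assumptions for illustration, not the study's values:

    ```python
    # Sketch: classify a surface difference map (2012 minus 1924) into the five
    # categories described, given sea masks for the two epochs.
    import numpy as np

    def classify_change(diff, sea_1924, sea_2012, tol=2.0):
        cat = np.zeros(diff.shape, dtype=np.uint8)
        land_both = ~sea_1924 & ~sea_2012
        cat[land_both & (np.abs(diff) <= tol)] = 1  # no significant change
        cat[land_both & (diff < -tol)] = 2          # hill ranges cut down
        cat[land_both & (diff > tol)] = 3           # valleys/swamps filled in
        cat[sea_1924 & ~sea_2012] = 4               # reclaimed from the sea
        cat[~sea_1924 & sea_2012] = 5               # new water-covered areas
        return cat

    diff = np.array([[0.5, -15.0], [12.0, 3.0]])
    sea24 = np.array([[False, False], [True, False]])
    sea12 = np.array([[False, False], [False, True]])
    print(classify_change(diff, sea24, sea12))
    ```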

  2. Assessment and Evaluation Methods for Access Services

    ERIC Educational Resources Information Center

    Long, Dallas

    2014-01-01

    This article serves as a primer for assessment and evaluation design by describing the range of methods commonly employed in library settings. Quantitative methods, such as counting and benchmarking measures, are useful for investigating the internal operations of an access services department in order to identify workflow inefficiencies or…

  4. Evaluation of Sight, Sound, Symbol Instructional Method.

    ERIC Educational Resources Information Center

    Massarotti, Michael C.; Slaichert, William M.

    The Sight-Sound-Symbol (S-S-S) method of teaching basic reading skills was evaluated with four groups of 16 trainable mentally retarded children. The method involved use of a musical keyboard to teach children to identify numbers, letters, colors, and shapes. Groups either received individual S-S-S instruction for 10 minutes daily, received S-S-S…

  5. Decision Support Method with AHP Based on Evaluation Grid Method

    NASA Astrophysics Data System (ADS)

    Yumoto, Masaki

    In decision support with the Analytic Hierarchy Process (AHP), accuracy tends to fall markedly when alternatives are evaluated against qualitative criteria only. To solve this problem, it is necessary to define the method for setting the criteria clearly. The Evaluation Grid Method can construct the recognition structure, which is the element of the target causality model. Through verification of the hypothesis, the criteria for the AHP can be extracted. This paper proposes how to model a human's recognition structure with the Evaluation Grid Method, and how to support decisions with an AHP that uses the criteria from which the model is constructed. In practical experiments, the proposed method contributed to the creation of objective criteria, and examinees received good decision support.
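
    Once a pairwise comparison matrix has been elicited (for example over criteria surfaced by the Evaluation Grid Method), the AHP weight derivation itself is mechanical. The sketch below shows the standard principal-eigenvector computation; the matrix entries are hypothetical:

    ```python
    # Standard AHP: the priority vector is the principal eigenvector of the
    # reciprocal pairwise comparison matrix; CI measures inconsistency.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],    # hypothetical comparisons among
                  [1/3, 1.0, 2.0],    # three criteria
                  [1/5, 1/2, 1.0]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                  # normalized weights
    ci = (vals.real[k] - len(A)) / (len(A) - 1)   # consistency index
    print("weights:", np.round(w, 3), " CI:", round(ci, 3))
    ```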

  6. Research on evaluation method of CMOS camera

    NASA Astrophysics Data System (ADS)

    Zhang, Shaoqiang; Han, Weiqiang; Cui, Lanfang

    2014-09-01

    In some professional imaging applications, we need to test key parameters of a CMOS camera and evaluate the performance of the device. Aiming at this requirement, this paper proposes a thorough test method to evaluate CMOS cameras. Considering that CMOS cameras exhibit large fixed-pattern noise, the method proposes a per-pixel 'photon transfer curve' technique to measure the gain and the read noise of the camera. The advantage of this approach is that it can effectively remove the error introduced by response nonlinearity. The causes of the photoelectric response nonlinearity of CMOS cameras are then analyzed theoretically, and a formula for the response nonlinearity is derived. Finally, we use the proposed test method to test a CMOS camera of 2560 × 2048 pixels. In addition, we analyze the validity and the feasibility of this method.
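
    The per-pixel photon-transfer idea reduces to fitting the variance-versus-mean line for a pixel over a series of flat-field exposures: under shot-noise statistics var[DN] = mean[DN]/K + σ_read², so the slope gives 1/K (with K the gain in e⁻/DN) and the intercept the read noise squared. A compressed simulation sketch (all numbers assumed):

    ```python
    # Photon transfer sketch for one pixel: fit variance vs. mean signal over
    # a flat-field series; slope -> 1/gain, intercept -> read noise squared.
    import numpy as np

    rng = np.random.default_rng(2)
    K_true, sigma_read = 2.0, 3.0                      # e-/DN, DN
    exposures = [200, 500, 1000, 2000, 4000]           # mean signal, electrons

    means, variances = [], []
    for s in exposures:
        electrons = rng.poisson(s, size=100)           # photon shot noise
        dn = electrons / K_true + rng.normal(0, sigma_read, size=100)
        means.append(dn.mean())
        variances.append(dn.var(ddof=1))

    slope, intercept = np.polyfit(means, variances, 1)
    print("gain ~", 1.0 / slope, "e-/DN; read noise ~", intercept ** 0.5, "DN")
    ```

    Fitting per pixel rather than over the whole frame is what removes the fixed-pattern contribution, since each pixel is compared only with itself.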

  7. Geodetic mass balance of the Patagonian Icefields from SRTM and TanDEM-X DEMs

    NASA Astrophysics Data System (ADS)

    Abdel Jaber, W.; Floricioiu, D.; Rott, H.

    2016-12-01

    The Northern and Southern Patagonian Icefields (NPI and SPI) represent the largest mid-latitude ice masses in the Southern Hemisphere. They are mostly drained by outlet glaciers with fronts calving into freshwater lakes or Pacific fjords. Both icefields were affected by significant downwasting in recent decades, as confirmed by published mass-change trends obtained by means of gravimetric measurements and geodetic methods. Given their unique characteristics and their significant contribution to sea level rise per unit area, they represent a fundamental barometer for climate research. The Shuttle Radar Topography Mission (SRTM) of 2000 provided the most complete and accurate Digital Elevation Model (DEM) at the time, covering the entire globe from 56°S to 60°N. The present TanDEM-X mission shares the same objective, aiming at global coverage with much higher resolution and accuracy. Their combination leads to a unique multitemporal elevation dataset based solely on SAR single-pass bistatic interferometry, with an 11- to 16-year time span: an ideal setup for monitoring long-term, large-scale geophysical phenomena. Using this dataset, detailed and extensive ice elevation change maps were obtained for the 12900 km² SPI for the observation period 2000-2011/2012 and for the 3900 km² NPI for the period 2000-2014. These maps were used to compute the glacier mass balance of the icefields through the geodetic method. Particular emphasis was placed on the estimation of the uncertainty of the geodetic mass balance by quantifying all relevant sources of error. Among these, signal penetration into dry ice and snow can considerably affect radar elevation measurements. For this purpose, the backscattering coefficient of the acquisitions, along with concurrent meteorological data, was analyzed to assess the conditions of the icefield surface. Mass change rates of -3.96±0.14 Gt a-1 and of -13.14±0.42 Gt a-1 (excluding subaqueous loss) were obtained for NPI and SPI
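
    The geodetic method referred to converts the elevation-change map into a mass change rate in the standard way (generic form; the density assumption and the error budget are study-specific):

    ```latex
    % Geodetic mass balance from DEM differencing between epochs t1 and t2:
    \dot{M} = \frac{\rho}{t_2 - t_1} \sum_{k \in \text{glacier}} \Delta h_k\, A_k,
    % Delta h_k: elevation change of cell k, A_k: cell area, rho: volume-to-
    % mass conversion density (a value near 850 kg m^-3 is a common assumption
    % for glacier ice, but the appropriate choice depends on the setting).
    ```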

  8. Further evaluation of traditional icing scaling methods

    NASA Technical Reports Server (NTRS)

    Anderson, David N.

    1996-01-01

    This report provides additional evaluations of two methods to scale icing test conditions; it also describes a hybrid technique for use when scaled conditions are outside the operating envelope of the test facility. The first evaluation is of the Olsen method, which can be used to scale the liquid-water content in icing tests, and the second is of the AEDC (Ruff) method, which is used when the test model is less than full size. Equations for both scaling methods are presented in the paper, and the methods were evaluated by performing icing tests in the NASA Lewis Icing Research Tunnel (IRT). The Olsen method was tested using 53 cm chord NACA 0012 airfoils. Tests covered liquid-water contents which varied by as much as a factor of 1.8. The Olsen method was generally effective in giving scale ice shapes which matched the reference shapes for these tests. The AEDC method was tested with NACA 0012 airfoils with chords from 18 cm to 53 cm. The 53 cm chord airfoils were used in reference tests, and 1/2 and 1/3 scale tests were made at conditions determined by applying the AEDC scaling method. The scale and reference airspeeds were matched in these tests. The AEDC method was found to provide fairly effective scaling for 1/2 size tests, but for 1/3 size models, scaling was generally less effective. In addition to these two scaling methods, a hybrid approach was also tested in which the Olsen method was used to adjust the LWC after size was scaled using the constant Weber number method. This approach was found to be an effective way to test when scaled conditions would otherwise be outside the capability of the test facility.

  9. Energy efficiency assessment methods and tools evaluation

    SciTech Connect

    McMordie, K.L.; Richman, E.E.; Keller, J.M.; Dixon, D.R.

    1994-08-01

    Many different methods of assessing the energy savings potential at federal installations and identifying attractive projects for capital investment have been used by the different federal agencies. These methods range from high-level estimating tools to detailed design tools, both manual and software-assisted. They have different purposes and provide results that are used for different parts of the project identification and implementation process. Seven different assessment methods are evaluated in this study. These methods were selected by the program managers at the DoD Energy Policy Office and the DOE Federal Energy Management Program (FEMP). Each of the methods was applied to similar buildings at Bolling Air Force Base (AFB), unless it was inappropriate or the method was designed to make an installation-wide analysis rather than focusing on particular buildings. Staff at Bolling AFB controlled the collection of data.

  10. Graphical methods for evaluating covering arrays

    SciTech Connect

    Kim, Youngil; Jang, Dae -Heung; Anderson-Cook, Christine M.

    2015-08-10

    Covering arrays relax the condition of orthogonal arrays by requiring only that all combinations of levels be covered, not that the appearance of all combinations of levels be balanced. This allows a much larger number of factors to be considered simultaneously, but at the cost of poorer estimation of the factor effects. To better understand patterns between sets of columns and to evaluate the degree of coverage when comparing and selecting between alternative arrays, we suggest several new graphical methods that show some of the patterns of coverage for different designs. These graphical methods for evaluating covering arrays are illustrated with a few examples.
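
    The coverage notion itself is easy to operationalise. Below is a small sketch that checks strength-2 (pairwise) coverage and tallies how unevenly each level combination appears, which is exactly the imbalance such graphics visualise (the design below is an assumed toy example):

    ```python
    # Sketch: check strength-2 coverage of a two-level design and count the
    # occurrences of each level pair for every pair of columns.
    from itertools import combinations, product
    import numpy as np

    design = np.array([[0, 0, 0],
                       [0, 1, 1],
                       [1, 0, 1],
                       [1, 1, 0]])  # toy two-level array

    levels = 2
    for c1, c2 in combinations(range(design.shape[1]), 2):
        counts = {p: 0 for p in product(range(levels), repeat=2)}
        for row in design:
            counts[(row[c1], row[c2])] += 1
        covered = all(v > 0 for v in counts.values())
        print(f"columns ({c1},{c2}): covered={covered}, "
              f"counts={list(counts.values())}")
    ```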

  11. CFD-DEM simulations of current-induced dune formation and morphological evolution

    NASA Astrophysics Data System (ADS)

    Sun, Rui; Xiao, Heng

    2016-06-01

    Understanding the fundamental mechanisms of sediment transport, particularly those during the formation and evolution of bedforms, is of critical scientific importance and has engineering relevance. Traditional approaches to sediment transport simulation rely heavily on empirical models, which are not able to capture the physics-rich, regime-dependent behaviors of the process. With the increase of available computational resources in the past decade, CFD-DEM (computational fluid dynamics-discrete element method) has emerged as a viable high-fidelity method for the study of sediment transport. However, a comprehensive, quantitative study of the generation and migration of different sediment bed patterns using CFD-DEM is still lacking. In this work, current-induced sediment transport problems in a wide range of regimes are simulated, including 'flat bed in motion', 'small dune', 'vortex dune' and suspended transport. Simulations are performed by using SediFoam, an open-source, massively parallel CFD-DEM solver developed by the authors. This is a general-purpose solver for particle-laden flows tailored to particle transport problems. Validation tests are performed to demonstrate the capability of CFD-DEM in the full range of sediment transport regimes. Comparison of simulation results with experimental and numerical benchmark data demonstrates the merits of the CFD-DEM approach. In addition, the improvements of the present simulations over existing studies using CFD-DEM are presented. The present solver gives more accurate prediction of sediment transport rate by properly accounting for the influence of particle volume fraction on the fluid flow. In summary, this work demonstrates that CFD-DEM is a promising particle-resolving approach for probing the physics of current-induced sediment transport.
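
    A key point above is the volume-fraction correction to the fluid-particle drag. One widely used correction in CFD-DEM is the Di Felice (1994) voidage function, sketched below; the record does not state that SediFoam uses this exact form, so treat the sketch as illustrative only.

```python
import math

def cd_single_sphere(re):
    """Schiller-Naumann drag coefficient for an isolated sphere
    (commonly used for particle Reynolds numbers below ~1000)."""
    return 24.0 / re * (1.0 + 0.15 * re ** 0.687)

def cd_with_voidage(re, voidage):
    """Di Felice (1994) voidage correction: the drag of a sphere in a
    suspension is the isolated-sphere drag times voidage**(-chi).
    Exponent conventions vary with how the slip velocity is defined."""
    chi = 3.7 - 0.65 * math.exp(-0.5 * (1.5 - math.log10(re)) ** 2)
    return cd_single_sphere(re) * voidage ** (-chi)

# Denser beds (lower voidage) feel substantially more drag per particle.
print(cd_with_voidage(re=10.0, voidage=0.9), cd_with_voidage(10.0, 0.5))
```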

  12. Auf der Suche nach dem Unendlichen.

    NASA Astrophysics Data System (ADS)

    Fraser, G.; Lillestøl, E.; Sellevåg, I.

    This book is a German translation, by C. Ascheron and J. Urbahn, of "The search for infinity: solving the mysteries of the universe", published in 1994. The book vividly describes the milestones that humankind, since antiquity, has reached and left behind in the search for the infinite. It contains short biographies of the most important researchers, accessibly written texts, and explanations of the key technical terms.

  13. Veterinary and human vaccine evaluation methods.

    PubMed

    Knight-Jones, T J D; Edmond, K; Gubbins, S; Paton, D J

    2014-06-07

    Despite the universal importance of vaccines, approaches to human and veterinary vaccine evaluation differ markedly. For human vaccines, vaccine efficacy is the proportion of vaccinated individuals protected by the vaccine against a defined outcome under ideal conditions, whereas for veterinary vaccines the term is used for a range of measures of vaccine protection. The evaluation of vaccine effectiveness, vaccine protection assessed under routine programme conditions, is largely limited to human vaccines. Challenge studies under controlled conditions and sero-conversion studies are widely used when evaluating veterinary vaccines, whereas human vaccines are generally evaluated in terms of protection against natural challenge assessed in trials or post-marketing observational studies. Although challenge studies provide a standardized platform on which to compare different vaccines, they do not capture the variation that occurs under field conditions. Field studies of vaccine effectiveness are needed to assess the performance of a vaccination programme. However, if vaccination is performed without central co-ordination, as is often the case for veterinary vaccines, evaluation will be limited. This paper reviews approaches to veterinary vaccine evaluation in comparison to evaluation methods used for human vaccines. Foot-and-mouth disease has been used to illustrate the veterinary approach. Recommendations are made for standardization of terminology and for rigorous evaluation of veterinary vaccines.

  15. Modified method of accommodative facility evaluation

    NASA Astrophysics Data System (ADS)

    Kedzia, Boleslaw; Pieczyrak, Danuta; Tondel, Grazyna; Maples, Willis C.

    1998-10-01

    Background: Accommodative facility testing is a common test performed by optometrists to investigate an individual's skill at focusing on objects at near and at far. The traditional test, however, harbors possible confounding variables, including individual variance in reaction time, visual acuity, verbal skills and oculomotor function. We have designed a test procedure to control these variables. Methods: Children were evaluated with a traditional accommodative facility test; a test which evaluated reaction time and language skill but without accommodative demand (plano lenses); and a test which evaluated reaction time, language skill and accommodative facility (+/- 2.00 D lenses). Results: Speed of reaction was 2.9 sec/cycle for the plano lenses (dominant eye). Speed of reaction with the +/- 2.00 D lenses was 6.6 sec/cycle for the dominant eye, and the monocular speed of accommodation was calculated to average 3.7 sec/cycle. Normative data reported in the literature was calculated to be 5.5 sec/cycle. Discussion: We found that both our method, which controls for confounding variables, and the traditional method reveal similar findings, but that individual subjects would pass one method and fail the other. This is attributed to variation in reaction time and digit-naming skill. Conclusions: Although both methods yield similar results, both should be employed for those who score below the expected finding, to tease out whether the fault falls within the reaction time/language area or whether it is a true accommodative facility dysfunction.

  16. Spaceborne radar interferometry for coastal DEM construction

    USGS Publications Warehouse

    Hong, S.-H.; Lee, C.-W.; Won, J.-S.; Kwoun, Oh-Ig; Lu, Zhiming

    2005-01-01

    Topographic features in coastal regions, including tidal flats, change more significantly than inland terrain and are characterized by extremely low slopes. High-precision DEMs are required to monitor dynamic changes in coastal topography. It is difficult to obtain coherent interferometric SAR pairs, especially over tidal flats, mainly because of varying tidal conditions. Here we focus on i) the coherence of multi-pass ERS SAR interferometric pairs and ii) DEM construction from ERS-ENVISAT pairs. Coherences of multi-pass ERS interferograms were good enough to construct a DEM under favorable tidal conditions. Coherence in sand-dominated areas was generally higher than over muddy surfaces; coarse-grained coastal areas are favorable for multi-pass interferometry. Utilization of ERS-ENVISAT interferometric pairs is attracting growing interest. We carried out an investigation using a cross-interferometric pair with a normal baseline of about 1.3 km, a 30 minute temporal separation and a height sensitivity of about 6 meters. Preliminary results of ERS-ENVISAT interferometry were not successful due to the baseline and unfavorable scattering conditions. © 2005 IEEE.

  17. Sentinel-1 SAR DEM Deployment Mechanisms Recovery

    NASA Astrophysics Data System (ADS)

    Rivera, Laura; Compositzo, Carlos; Arregui, Ibon

    2015-09-01

    The Sentinel-1 mission is encompassed in the Copernicus programme, and each of the satellites carries a C-band Synthetic Aperture Radar (SAR) to provide an all-weather, day-and-night supply of imagery of Earth's surface. This paper describes the development of the Deployment Mechanisms (DEM) of the SAR, which are launched packed in stacks and have to deploy in orbit. SENER has designed, manufactured, integrated and tested 8 deployment mechanisms (DEM): 4 for Sentinel-1A, which were successfully deployed some hours after it was launched in April 2014, and another 4 for Sentinel-1B, which is envisaged to be launched in 2016. Previously, the GAIA satellite was launched in December 2013; its Sunshield, successfully deployed after launch, was equipped, like the DEMs, with two SENER Harmonic Drive Rotary Actuators (HDRAs). Hence, SENER HDRA actuators now have the flight heritage of six units. Each antenna consists of 5 stacks (named A to E panels) that are stored around the satellite and deployed once in orbit, as shown in Fig. 1 (SAR in stowed and deployed configuration).

  18. Simplified methods for evaluating road prism stability

    Treesearch

    William J. Elliot; Mark Ballerini; David Hall

    2003-01-01

    Mass failure is one of the most common failures of low-volume roads in mountainous terrain. Current methods for evaluating stability of these roads require a geotechnical specialist. A stability analysis program, XSTABL, was used to estimate the stability of 3,696 combinations of road geometry, soil, and groundwater conditions. A sensitivity analysis was carried out to...

  19. New evaluation method for ARQ schemes

    NASA Astrophysics Data System (ADS)

    Nakamura, M.; Kodama, T.

    1988-10-01

    Automatic-repeat-request (ARQ) schemes which provide high system reliability with simple error control are widely used in data communication systems. A new evaluation method for ARQ schemes is presented which makes it possible to compare performances for various error-control schemes. Numerical results are reported for error correction, pure ARQ, and hybrid ARQ schemes.

  20. Research and Evaluation Methods in Special Education.

    ERIC Educational Resources Information Center

    Mertens, Donna M.; McLaughlin, John A.

    This text is designed to enable educators to design, conduct, and report research and evaluation in a way that transforms special education by addressing the needs of persons with disabilities as heterogeneous, cultural groups. The book explores ways to adapt those research methods to the special education context by providing a framework for…

  1. In need of combined topography and bathymetry DEM

    NASA Astrophysics Data System (ADS)

    Kisimoto, K.; Hilde, T.

    2003-04-01

    In many geoscience applications, digital elevation models (DEMs) are now commonly used at different scales and greater resolution due to the great advancement in computer technology. Increasing the accuracy/resolution of the model and the coverage of the terrain (global model) has been the goal of users as mapping technology has improved and computers have become faster and cheaper. The ETOPO5 model (5 arc minute spatial resolution, land and seafloor), initially developed in 1988 by Margo Edwards, then at Washington University, St. Louis, MO, was the only global terrain model for a long time, and it is now being replaced by three new topographic and bathymetric DEMs: the ETOPO2 (2 arc minute spatial resolution, land and seafloor), the GTOPO30 land model with a spatial resolution of 30 arc seconds (ca. 1 km at the equator), and the 'GEBCO 1-MINUTE GLOBAL BATHYMETRIC GRID' ocean floor model with a spatial resolution of 1 arc minute (ca. 2 km at the equator). These DEMs are products of projects in which existing and/or new datasets were compiled and reprocessed to meet users' new requirements. These ongoing efforts are valuable, and support should be continued to refine and update these DEMs. On the other hand, a different approach to creating a global bathymetric (seafloor) database exists. A method to estimate seafloor topography from satellite altimetry combined with existing ships' conventional sounding data was devised, and a global seafloor database was created and made public by W. H. Smith and D. T. Sandwell in 1997. The big advantage of this database is the uniformity of coverage, i.e. there is no large area where depths are missing. It has a spatial resolution of 2 arc minutes. Another important effort is the making of regional, not global, seafloor databases with much finer resolutions in many countries. The Japan Hydrographic Department compiled and released a 500 m grid topography database around Japan, J-EGG500, in 1999.

  2. Evaluating Sleep Disturbance: A Review of Methods

    NASA Technical Reports Server (NTRS)

    Smith, Roy M.; Oyung, R.; Gregory, K.; Miller, D.; Rosekind, M.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    There are three general approaches to evaluating sleep disturbance in regard to noise: subjective, behavioral, and physiological. Subjective methods range from standardized questionnaires and scales to self-report measures designed for specific research questions. There are two behavioral methods that provide useful sleep disturbance data. One behavioral method is actigraphy, a motion detector that provides an empirical estimate of sleep quantity and quality. An actigraph, worn on the non-dominant wrist, provides a 24-hr estimate of the rest/activity cycle. The other method involves a behavioral response, either to a specific probe or stimulus or subject-initiated (e.g., indicating wakefulness). The classic gold standard for evaluating sleep disturbance is continuous physiological monitoring of brain, eye, and muscle activity. This allows detailed distinctions of the states and stages of sleep, awakenings, and sleep continuity. Physiological data can be obtained in controlled laboratory settings and in natural environments. Current ambulatory physiological recording equipment allows evaluation in home and work settings. These approaches will be described and the relative strengths and limitations of each method will be discussed.

  4. Volcanic geomorphology using TanDEM-X

    NASA Astrophysics Data System (ADS)

    Poland, Michael; Kubanek, Julia

    2016-04-01

    Topography is perhaps the most fundamental dataset for any volcano, yet is surprisingly difficult to collect, especially during the course of an eruption. For example, photogrammetry and lidar are time-intensive and often expensive, and they cannot be employed when the surface is obscured by clouds. Ground-based surveys can operate in poor weather but have poor spatial resolution and may expose personnel to hazardous conditions. Repeat passes of synthetic aperture radar (SAR) data provide excellent spatial resolution, but topography in areas of surface change (from vegetation swaying in the wind to physical changes in the landscape) between radar passes cannot be imaged. The German Space Agency's TanDEM-X satellite system, however, solves this issue by simultaneously acquiring SAR data of the surface using a pair of orbiting satellites, thereby removing temporal change as a complicating factor in SAR-based topographic mapping. TanDEM-X measurements have demonstrated exceptional value in mapping the topography of volcanic environments in as-yet limited applications. The data provide excellent resolution (down to ~3-m pixel size) and are useful for updating topographic data at volcanoes where surface change has occurred since the most recent topographic dataset was collected. Such data can be used for applications ranging from correcting radar interferograms for topography, to modeling flow pathways in support of hazards mitigation. The most valuable contributions, however, relate to calculating volume changes related to eruptive activity. For example, limited datasets have provided critical measurements of lava dome growth and collapse at volcanoes including Merapi (Indonesia), Colima (Mexico), and Soufriere Hills (Montserrat), and of basaltic lava flow emplacement at Tolbachik (Kamchatka), Etna (Italy), and Kīlauea (Hawai`i). With topographic data spanning an eruption, it is possible to calculate eruption rates - information that might not otherwise be available

  5. Evaluation of Dynamic Methods for Earthwork Assessment

    NASA Astrophysics Data System (ADS)

    Vlček, Jozef; Ďureková, Dominika; Zgútová, Katarína

    2015-05-01

    The rapid development of road construction imposes demands for fast, high-quality methods of earthwork evaluation. Dynamic methods are now adopted in numerous civil engineering sectors. Evaluation of earthwork quality, in particular, can be sped up using dynamic equipment. This paper presents the results of parallel measurements with chosen devices for determining the level of compaction of soils. The measurements were used to develop correlations between values obtained from the various apparatuses. The correlations show that the examined apparatuses are suitable for examining the compaction level of fine-grained soils, taking into account the boundary conditions of the equipment used. The presented methods are quick, results can be obtained immediately after measurement, and they are thus suitable when construction works have to be performed in a short period of time.

  6. Optimizing grid computing configuration and scheduling for geospatial analysis: An example with interpolating DEM

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei

    2011-02-01

    Many geographic analyses are very time-consuming and do not scale well when large datasets are involved. For example, the interpolation of DEMs (digital elevation models) for large geographic areas can become a problem in practical applications, especially for web applications such as terrain visualization, where a fast response is required and computational demands exceed the capacity of a traditional single processing unit conducting serial processing. Therefore, high performance and parallel computing approaches, such as grid computing, were investigated to speed up geographic analysis algorithms such as DEM interpolation. The key for grid computing is to configure an optimized grid computing platform for the geospatial analysis and to optimally schedule the geospatial tasks within the platform; however, little research has focused on this. Using DEM interpolation as an example, we report our systematic research on configuring and scheduling a high performance grid computing platform to improve the performance of geographic analyses, through a systematic study of how the number of cores, processors and grid nodes, different network connections, and concurrent requests impact the speedup of geospatial analyses. Condor, a grid middleware, is used to schedule the DEM interpolation tasks for different grid configurations. A Kansas raster-based DEM is used for a case study, and an inverse distance weighting (IDW) algorithm is used in the interpolation experiments.
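
    For reference, IDW itself is simple to state: each interpolated elevation is a distance-weighted average of the known samples. A minimal NumPy sketch follows (function and parameter names are ours, not from the paper); in a grid setting like the one described, the query grid would be partitioned into chunks and scheduled as separate Condor jobs.

```python
import numpy as np

def idw_interpolate(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighting: each query elevation is a weighted
    average of the known points, with weights 1 / distance**power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)   # eps guards against zero distance
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

# Tiny example: 3 known elevations, 1 query point.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
z = np.array([100.0, 110.0, 120.0])
print(idw_interpolate(pts, z, np.array([[0.5, 0.5]])))
```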

  7. Quality assessment of Digital Elevation Model (DEM) in view of the Altiplano hydrological modeling

    NASA Astrophysics Data System (ADS)

    Satgé, F.; Arsen, A.; Bonnet, M.; Timouk, F.; Calmant, S.; Pilco, R.; Molina, J.; Lavado, W.; Crétaux, J.; HASM

    2013-05-01

    Topography is a crucial data input for hydrological modeling, but in many regions of the world the only way to characterize topography is the use of satellite-based Digital Elevation Models (DEMs). In some regions the quality of these DEMs remains poor and induces modeling errors that may or may not be compensated by tuning model parameters. In such regions, the evaluation of these data uncertainties is an important step in the modeling procedure. In this study, which focuses on the Altiplano region, we present the evaluation of the two freely available DEMs. The Shuttle Radar Topography Mission (SRTM) DEM, a product of the National Aeronautics and Space Administration (NASA), and the Advanced Spaceborne Thermal Emission and Reflection (ASTER) Global Digital Elevation Map (GDEM), provided by the Ministry of Economy, Trade and Industry of Japan (METI) in collaboration with NASA, are both widely used. While the former has a resolution of 3 arc seconds (90 m), the latter has a resolution of 1 arc second (30 m). In order to select the most reliable DEM, we compared the DEM elevations with high-quality control point elevations. Because of its large spatial coverage (tracks spaced 30 km apart with a measurement every 172 m) and its high vertical accuracy, less than 15 cm in good weather conditions, the Geoscience Laser Altimeter System (GLAS) on board NASA's Ice, Cloud and land Elevation Satellite (ICESat) represents the best solution for establishing a high-quality elevation database. After a quality check, more than 150,000 ICESat/GLAS measurements are suitable in terms of accuracy for the Altiplano watershed. This database has been used to evaluate the vertical accuracy of each DEM. Owing to the full spatial coverage, the comparison has been done for all land cover types, altitude ranges and mean slopes.
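
    The vertical-accuracy evaluation described above reduces, in its simplest form, to differencing DEM elevations against the ICESat/GLAS control elevations and summarizing the residuals. A minimal sketch, with purely hypothetical numbers:

```python
import numpy as np

def vertical_accuracy(dem_at_points, icesat_elev):
    """Bias, standard deviation and RMSE of DEM elevations against
    ICESat/GLAS control elevations (same vertical datum assumed)."""
    diff = dem_at_points - icesat_elev
    bias = diff.mean()
    std = diff.std(ddof=1)
    rmse = np.sqrt((diff ** 2).mean())
    return bias, std, rmse

# Hypothetical values for illustration only.
dem = np.array([3812.0, 3820.5, 3950.2, 4102.8])
glas = np.array([3810.2, 3823.1, 3948.0, 4105.5])
print(vertical_accuracy(dem, glas))
```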

  8. Feminist research: definitions, methodology, methods and evaluation.

    PubMed

    Webb, C

    1993-03-01

    The literature relating to feminist research both within and beyond nursing is reviewed in this paper. Feminist research is located within a post-positivist paradigm, and various definitions are considered. The distinctive methodological approach of feminist research is discussed, and interviewing and ethnography are evaluated as suitable methods for use in feminist research. Oakley's (1981) paper on interviewing women is subjected to criticism. A final section examines attempts by three sets of writers to propose evaluation criteria for feminist research. The review concludes that a number of paradoxes and dilemmas in feminist research have yet to be resolved.

  9. Permutation method for evaluating topographic craniofacial correlations.

    PubMed

    Halazonetis, Demetrios J

    2011-03-01

    Correlations between cephalometric measurements are frequently assumed to represent biologic associations. However, a significant portion of such correlations might arise from purely geometric dependencies, when measurements share common landmarks. Analytic calculation of this topographic component is difficult. The purpose of this study was to propose a permutation method for evaluating the topographic component of cephalometric correlations. The method consisted of creating a virtual sample of cephalometric tracings (landmark configurations) from the original biologic sample under investigation. Each novel landmark configuration was constructed by assigning coordinates to the cephalometric points; the coordinates of each point were taken randomly from the original sample, each from a potentially different subject. Correlation analysis was performed separately on both samples and the results compared. Biologic meaning was ascribed only when there was a significant difference in correlation values between the samples. Confidence intervals for assessing statistical significance were calculated by using a randomization approach. The method was tested on a sample of 170 radiographs to evaluate the correlation between cranial base angle (NSBa) and angles SNA and SNB, as well as between ANB angle and the Wits appraisal. No biologic association was found between ANB and Wits, or between NSBa and SNA. The biologic correlation between NSBa and SNB was statistically significant but low (r² = 12%). Topographic associations between cephalometric measurements are ubiquitous and difficult to assess. The proposed method enables evaluation of their relative strength without the need for analytic solutions. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
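
    A minimal sketch of the permutation scheme as described: each virtual tracing takes every landmark's coordinates from a randomly chosen subject, destroying biologic covariation while preserving the geometric dependencies that come from shared landmarks. The measurement functions (e.g., computing NSBa or SNB from a tracing) are left abstract, and all names are ours, not the author's code.

```python
import numpy as np

def virtual_sample(landmarks, rng):
    """One virtual tracing: each landmark's (x, y) is drawn from a
    randomly chosen subject, so only topographic (shared-landmark)
    dependencies survive. landmarks: (n_subjects, n_landmarks, 2)."""
    n_subj, n_lm, _ = landmarks.shape
    donors = rng.integers(0, n_subj, size=n_lm)   # one donor per landmark
    return landmarks[donors, np.arange(n_lm), :]

def topographic_correlations(landmarks, measure_a, measure_b,
                             n_permutations=1000, seed=0):
    """Distribution of correlations between two cephalometric measures
    computed on virtual samples of the same size as the original."""
    rng = np.random.default_rng(seed)
    n_subj = landmarks.shape[0]
    corrs = np.empty(n_permutations)
    for p in range(n_permutations):
        tracings = np.stack([virtual_sample(landmarks, rng)
                             for _ in range(n_subj)])
        a = np.array([measure_a(t) for t in tracings])
        b = np.array([measure_b(t) for t in tracings])
        corrs[p] = np.corrcoef(a, b)[0, 1]
    return corrs  # compare the observed biologic r against this spread
```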

  10. [Tensiomyography as method of evaluating muscles status].

    PubMed

    Markulincić, Branko; Muraja, Sonja

    2007-01-01

    Sports results, as well as the results of rehabilitation treatments, are closely related to a detailed, strictly individualized programme of sports and rehabilitation training. It is vitally important to monitor and evaluate results constantly. Along with already standardized methods of evaluating the neuromuscular system, such as electrodynamometry and isokinetic dynamometry on Cybex, tensiomyography (TMG) has been introduced as a method of assessing muscle status. TMG is a non-invasive, selective, objective method designed to measure activation time, delay time, contraction time, relaxation time and the intensity of muscle contraction under submaximal electrostimulation. The method is based on measuring the enlargement of the muscle belly with a precise, superficially placed electromagnetic sensor. TMG enables the examination of otherwise inaccessible muscles, such as the gluteus maximus, and also the selective evaluation of a single muscle head (for example, m. vastus medialis, m. vastus lateralis and m. rectus femoris of m. quadriceps). The estimation of harmonization between agonistic and antagonistic muscles, synergistic muscles, and the same muscles on the left and right sides of the body is based on the muscles' biomechanical properties, i.e. parameters calculated from the TMG response. Total harmonization (100%) is hardly ever the case; the lowest level sufficient for muscle group functionality is defined as 80% for lateral and 65% for agonistic/synergistic harmonization. Harmonization below these levels either reflects past injuries or muscle adaptation, or indicates increased exposure to injury.

  11. Integration of SAR and DEM data: Geometrical considerations

    NASA Technical Reports Server (NTRS)

    Kropatsch, Walter G.

    1991-01-01

    General principles for integrating data from different sources are derived from the experience of registering SAR images with digital elevation model (DEM) data. The integration consists of establishing geometrical relations between the data sets that allow us to accumulate information from both data sets for any given object point (e.g., elevation, slope, backscatter of ground cover, etc.). Since the geometries of the two data sets are completely different, they cannot be compared on a pixel-by-pixel basis. The presented approach detects instances of higher level features in both data sets independently and performs the matching at the high level. Besides the efficiency of this general strategy, it further allows the integration of additional knowledge sources: world knowledge and sensor characteristics are also useful sources of information. The SAR features layover and shadow can be detected easily in SAR images. An analytical method to find such regions in a DEM additionally needs the parameters of the flight path of the SAR sensor and the range projection model. The generation of the SAR layover and shadow maps is summarized, and new extensions to this method are proposed.

  12. a Near-Global Bare-Earth dem from Srtm

    NASA Astrophysics Data System (ADS)

    Gallant, J. C.; Read, A. M.

    2016-06-01

    The near-global elevation product from NASA's Shuttle Radar Topographic Mission (SRTM) has been widely used since its release in 2005 at 3 arcsecond resolution and the release of the 1 arcsecond version in late 2014 means that the full potential of the SRTM DEM can now be realised. However the routine use of SRTM for analytical purposes such as catchment hydrology, flood inundation, habitat mapping and soil mapping is still seriously impeded by the presence of artefacts in the data, primarily the offsets due to tree cover and the random noise. This paper describes the algorithms being developed to remove those offsets, based on the methods developed to produce the Australian national elevation model from SRTM data. The offsets due to trees are estimated using the GlobeLand30 (National Geomatics Center of China) and Global Forest Change (University of Maryland) products derived from Landsat, along with the ALOS PALSAR radar image data (JAXA) and the global forest canopy height map (NASA). The offsets are estimated using several processes and combined to produce a single continuous tree offset layer that is subtracted from the SRTM data. The DEM products will be made freely available on completion of the first draft product, and the assessment of that product is expected to drive further improvements to the methods.
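
    As a schematic of the general idea (not the actual multi-source algorithm, which blends several offset estimates), a tree-offset layer can be formed from canopy height weighted by cover fraction and subtracted where forest is mapped. All names and the threshold below are assumptions:

```python
import numpy as np

def remove_tree_offsets(srtm, canopy_height, tree_cover, cover_threshold=0.1):
    """Schematic bare-earth correction: where tree cover exceeds a
    threshold, subtract an offset proportional to canopy height
    weighted by cover fraction. The published workflow combines
    several estimates; this only conveys the general idea."""
    offset = np.where(tree_cover > cover_threshold,
                      canopy_height * tree_cover, 0.0)
    return srtm - offset
```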

  13. An evaluation of fracture analysis methods

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1985-01-01

    The results of an experimental and predictive round robin on the application of fracture analysis methods are presented. The objective of the round robin was to verify whether fracture analysis methods currently in use can or cannot predict failure loads on complex structural components containing cracks. Fracture results from tests on a number of compact specimens were used to make the predictions. The accuracy of the prediction methods was evaluated in terms of the variation in the ratio of predicted to experimental failure loads, and the prediction methods are ranked in order of minimum standard error. A range of applicability of the different methods was also considered in assessing their usefulness. For 7075-T651 aluminum alloy, the best methods were: the effective K_R curve; the critical crack-tip opening displacement (CTOD) criterion using a finite element analysis; and the K_R curve with the Dugdale model. For the 2024-T351 aluminum alloy, the best methods included: the two-parameter fracture criterion (TPFC); the CTOD parameter using finite element analysis; the K_R curve with the Dugdale model; the deformation plasticity failure assessment diagram (DPFAD); and the effective K_R curve with a limit load condition. For 304 stainless steel, the best methods were: limit load analysis; the CTOD criterion using finite element analysis; TPFC; and DPFAD. Some sample experimental results are given in an appendix.

  14. Graphical methods for evaluating covering arrays

    DOE PAGES

    Kim, Youngil; Jang, Dae -Heung; Anderson-Cook, Christine M.

    2015-08-10

    Covering arrays relax the condition of orthogonal arrays by only requiring that all combinations of levels be covered, not that the appearance of all combinations of levels be balanced. This allows a much larger number of factors to be considered simultaneously, but at the cost of poorer estimation of the factor effects. To better understand patterns between sets of columns and to evaluate the degree of coverage when comparing and selecting between alternative arrays, we suggest several new graphical methods that show some of the patterns of coverage for different designs. These graphical methods for evaluating covering arrays are illustrated with a few examples.

  15. [Agricultural products handling: methods of feasibility evaluation].

    PubMed

    Scott, G J; Herrera, J E

    1993-06-01

    Post-harvest problems are important constraints to the expansion of food production in many Latin American countries. Besides problems of bulkiness, perishability and seasonal production patterns, the necessity of reducing transportation costs, increasing rural employment, and finding new markets for processed products requires the development of processing technologies. Possible processed products include a vast range of alternatives. Given limited time and resources, it is not always feasible to carry out detailed studies; hence a practical, low-cost methodology is needed to evaluate the available options. This paper presents a series of methods to evaluate different processing possibilities. It describes each method in detail, including a rapid initial assessment, market and consumer research, farm-oriented research, costs and returns analysis and, finally, some marketing and promotion strategies.

  16. Field evaluation of a VOST sampling method

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Fuerst, R.G.; McGaughey, J.F.; Bursey, J.T.; Merrill, R.G.

    1994-12-31

    The VOST (SW-846 Method 0030) specifies the use of Tenax® and a particular petroleum-based charcoal (SKC Lot 104, or its equivalent) that is no longer commercially available. In field evaluation studies of VOST methodology, a replacement petroleum-based charcoal has been used: candidate replacement sorbents for charcoal were studied, and Anasorb® 747, a carbon-based sorbent, was selected for field testing. The sampling train was modified to use only Anasorb® in the back tube and Tenax® in the two front tubes, to avoid the analytical difficulties associated with the analysis of the sequential-bed back tube used in the standard VOST train. The standard (SW-846 Method 0030) and the modified VOST methods were evaluated at a chemical manufacturing facility using a quadruple probe system with quadruple trains. In this field test, known concentrations of the halogenated volatile organic compounds listed in the Clean Air Act Amendments of 1990, Title 3, were introduced into the VOST train and the modified VOST train, using the same certified gas cylinder as the source of test compounds. Statistical tests of the comparability of the methods were performed on a compound-by-compound basis. For most compounds, the VOST and modified VOST methods were found to be statistically equivalent.

  17. [Methods for evaluation of penile erection hardness].

    PubMed

    Yuan, Yi-Ming; Zhou, Su; Zhang, Kai

    2010-07-01

    Penile erection hardness is one of the key factors for successful sexual intercourse, as well as an important index in the diagnosis and treatment of erectile dysfunction (ED). This article gives an overview of the components and impact factors of erection hardness, summarizes some commonly used evaluation methods, including those based on objective indexes, such as Rigiscan, the axial buckling test and color Doppler ultrasonography, and those based on subjective indexes of ED patients, such as the IIEF, the Erectile Function Domain of the IIEF (IIEF-EF) and the Erection Hardness Score (EHS), and discusses the characteristics of these methods.

  18. Shape and Albedo from Shading (SAfS) for Pixel-Level dem Generation from Monocular Images Constrained by Low-Resolution dem

    NASA Astrophysics Data System (ADS)

    Wu, Bo; Chung Liu, Wai; Grumpe, Arne; Wöhler, Christian

    2016-06-01

    Lunar topographic information, e.g., a lunar DEM (Digital Elevation Model), is very important for lunar exploration missions and scientific research. Lunar DEMs are typically generated from photogrammetric image processing or laser altimetry, of which photogrammetric methods require multiple stereo images of an area. DEMs generated by these methods usually rely on various interpolation techniques, leading to interpolation artifacts in the resulting DEM. On the other hand, photometric shape reconstruction, e.g., SfS (Shape from Shading), extensively studied in the field of Computer Vision, has been introduced for pixel-level resolution DEM refinement. SfS methods have the ability to reconstruct pixel-wise terrain details that explain a given image of the terrain. If the terrain and its corresponding pixel-wise albedo are to be estimated simultaneously, this is a SAfS (Shape and Albedo from Shading) problem, which is under-determined without additional information. Previous works show strong statistical regularities in the albedo of natural objects, and this is even more plausible for the lunar surface, whose albedo is less complex than Earth's. In this paper we suggest a method that refines a lower-resolution DEM to pixel-level resolution given a monocular image of the area with a known light source, while simultaneously estimating the corresponding pixel-wise albedo map. We regulate the behaviour of albedo and shape such that the optimized terrain and albedo are likely solutions that explain the corresponding image. The parameters in the approach are optimized through a kernel-based relaxation framework to gain computational advantages. In this research we experimentally employ the Lunar-Lambertian model for reflectance modelling; the framework of the algorithm is expected to be independent of a specific reflectance model. Experiments are carried out using the monocular images from Lunar Reconnaissance Orbiter (LRO

  19. Electromagnetic Imaging Methods for Nondestructive Evaluation Applications

    PubMed Central

    Deng, Yiming; Liu, Xin

    2011-01-01

    Electromagnetic nondestructive tests are important and widely used within the field of nondestructive evaluation (NDE). Recent advances in sensing technology, in hardware and software dedicated to imaging and image processing, and in materials science have greatly expanded the application fields, made system designs more sophisticated, and made the potential of electromagnetic NDE imaging seemingly unlimited. This review provides a comprehensive summary of research on electromagnetic imaging methods for NDE applications, followed by a summary and discussion of future directions. PMID:22247693

  1. Monitoring lava dome changes by means of differential DEMs from TanDEM-X interferometry: Examples from Merapi, Indonesia and Volcán de Colima, Mexico

    NASA Astrophysics Data System (ADS)

    Kubanek, J.; Westerhaus, M.; Heck, B.

    2013-12-01

    derived by TanDEM-X interferometry taken before and after the eruption. Our results reveal that the eruption led to a topographic change of up to 200 m in the summit area of Merapi. We further show the ability of the TanDEM-X data to observe much smaller topographic changes, using Volcán de Colima as a second test site. An explosion at the crater rim signaled the end of magma ascent in June 2011. The bistatic TanDEM-X data give important information on this explosion, as we can observe topographic changes of up to 20 m and less in the summit area when comparing datasets taken before and after the event. We further analyzed datasets from the beginning of 2013, when Colima became active again after a dormant period. Our results indicate that repeated DEMs with great detail and good accuracy are obtainable, enabling a quantitative estimation of volume changes in the summit area of the volcano. As TanDEM-X is an innovative mission, the present study serves as a test of employing data from a new satellite mission in volcano research. An error analysis of the DEMs, to evaluate the volume quantifications, was therefore also conducted.

  2. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2015-03-01

    One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two data sets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, built on a careful identification of different environments and with deforestation features corrected by a new method of increasing pixel values of the DEM (Rennó, 2009); and (2) a set of 18 hydrological-topographic descriptors based on the corrected SRTM DEM. Deforestation features are related to the opening of an 800 km road in the central part of the interfluve and the occupancy of its vicinity. We used topographic profiles from the pristine forest to the deforested feature to evaluate the recovery of the original canopy coverage by minimizing canopy height variation (corrections ranged from 1 to 38 m). The hydrological-topographic description was obtained with the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (above sea level) by the elevation of the nearest hydrologically connected drainage. The validation of the HAND data set was done by in situ hydrological description of 110 km of walking trails, also available in this data set. The new SRTM DEM expands the applicability of SRTM data for landscape modelling; the data sets of hydrological features based on topographic modelling are undoubtedly appropriate for ecological modelling and an important contribution to the environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the
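
    The HAND normalization lends itself to a compact sketch: follow each cell's flow path downstream until the first drainage cell and subtract that cell's elevation. The unoptimized Python below (the D8 code table and names are assumed conventions, not the authors' implementation) conveys the idea; production code would memoize paths and assumes loop-free flow directions.

```python
import numpy as np

# D8 flow-direction codes -> (row, col) steps (assumed convention).
D8 = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
      16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

def hand(dem, flowdir, drainage_mask):
    """Height Above the Nearest Drainage: follow each cell's D8 flow
    path to the first drainage cell and subtract its elevation.
    Cells whose path exits the grid are left as NaN."""
    rows, cols = dem.shape
    out = np.full(dem.shape, np.nan)
    for r in range(rows):
        for c in range(cols):
            i, j = r, c
            while not drainage_mask[i, j]:
                dr, dc = D8[flowdir[i, j]]
                i, j = i + dr, j + dc
                if not (0 <= i < rows and 0 <= j < cols):
                    break
            else:  # reached a drainage cell without leaving the grid
                out[r, c] = dem[r, c] - dem[i, j]
    return out
```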

  3. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    NASA Astrophysics Data System (ADS)

    Mao, Y.; Ye, A.; Xu, J.; Ma, F.; Deng, X.; Miao, C.; Gong, W.; Di, Z.

    2014-07-01

    A high-resolution, high-accuracy drainage network map is a prerequisite for simulating the water cycle in land surface hydrological models. The objective of this study was to develop a new automated drainage network extraction model that can produce a high-precision, continuous drainage network from a high-resolution DEM (Digital Elevation Model). Extracting a drainage network from a high-resolution DEM demands substantial computing resources, and the conventional GIS method often cannot complete the computation on high-resolution DEMs of large basins because the number of grid cells is too great. In order to decrease the computation time, an advanced distributed automated drainage network extraction model (Adam) is proposed in this study. The Adam model has two features: (1) it searches upward from the basin outlet instead of performing sink filling, and (2) it divides sub-basins on a low-resolution DEM and then extracts the drainage network on the sub-basins of the high-resolution DEM (a sketch of the upward search follows below). The case study used elevation data of the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at multiple Scales).
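
    Feature (1) can be sketched compactly: starting at the outlet, collect every cell whose D8 flow direction drains into a cell already in the basin, so no global sink filling is needed. This is our illustration of the idea, not Adam's distributed implementation; the D8 step table is an assumed convention.

```python
from collections import deque

# D8 flow-direction codes -> (row, col) steps (assumed convention).
D8_STEPS = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
            16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

def upstream_basin(flowdir, outlet):
    """Breadth-first search upward from the outlet: a neighbor belongs
    to the basin if its own D8 step lands on a cell already collected."""
    rows, cols = len(flowdir), len(flowdir[0])
    basin, queue = {outlet}, deque([outlet])
    while queue:
        r, c = queue.popleft()
        for code, (dr, dc) in D8_STEPS.items():
            i, j = r - dr, c - dc  # cell that would drain into (r, c)
            if (0 <= i < rows and 0 <= j < cols
                    and (i, j) not in basin and flowdir[i][j] == code):
                basin.add((i, j))
                queue.append((i, j))
    return basin  # all cells draining to the outlet
```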

  4. Assessment methods for the evaluation of vitiligo.

    PubMed

    Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K

    2012-12-01

    There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials.

  5. DEM, tide and velocity over sulzberger ice shelf, West Antarctica

    USGS Publications Warehouse

    Baek, S.; Shum, C.K.; Lee, H.; Yi, Y.; Kwoun, Oh-Ig; Lu, Zhiming; Braun, Andreas

    2005-01-01

    Arctic and Antarctic ice sheets preserve more than 77% of the global fresh water and could raise global sea level by several meters if completely melted. Ocean tides near and under ice shelves shift the grounding line position significantly and are one of the current limitations on studying glacier dynamics and mass balance. The Sulzberger ice shelf is an area of ice mass flux change in West Antarctica and has not yet been well studied. In this study, we use repeat-pass synthetic aperture radar (SAR) interferometry data from the ERS-1 and ERS-2 tandem missions to generate a high-resolution (60 m) Digital Elevation Model (DEM), including tidal deformation detection and the ice stream velocity of the Sulzberger Ice Shelf. Other satellite data, such as laser altimeter measurements with fine footprints (70 m) from NASA's ICESat, are used for validation and analyses. The resulting DEM has an accuracy of −0.57 ± 5.88 m and is demonstrated to be useful for grounding line detection and ice mass balance studies. The deformation observed by InSAR is found to be primarily due to ocean tides and atmospheric pressure. The 2-D ice stream velocities computed agree qualitatively with previous methods on part of the Ice Shelf based on passive remote-sensing data (i.e., LANDSAT). © 2005 IEEE.

  6. Simulation of triaxial response of granular materials by modified DEM

    NASA Astrophysics Data System (ADS)

    Wang, XiaoLiang; Li, JiaChun

    2014-12-01

    A modified discrete element method (DEM) with the rolling effect taken into consideration is developed to examine the macroscopic behavior of granular materials in this study. Dimensional analysis is first performed to establish the relationship between macroscopic mechanical behavior, mesoscale contact parameters at the particle level, and the external loading rate. It is found that only four dimensionless parameters govern the macroscopic mechanical behavior in bulk. A numerical triaxial apparatus was used to study their influence on the mechanical behavior of granular materials. The parametric study indicates that Poisson's ratio varies only with the stiffness ratio, while Young's modulus is proportional to the contact modulus and grows with the stiffness ratio, both of which agree with the micromechanical model. The peak friction angle depends on both the inter-particle friction angle and the rolling resistance. The dilatancy angle relies on the inter-particle friction angle if the rolling stiffness coefficient is sufficiently large. Finally, we recommend a calibration procedure for cohesionless soil, which was then applied to the simulation of Chende sand using a series of triaxial compression tests. The responses of the DEM model are in quantitative agreement with experiments. In addition, the stress-strain response in triaxial extension was also obtained by numerical triaxial extension tests.

  7. DEM modeling of flexible structures against granular material avalanches

    NASA Astrophysics Data System (ADS)

    Lambert, Stéphane; Albaba, Adel; Nicot, François; Chareyre, Bruno

    2016-04-01

    This article presents the numerical modeling of flexible structures intended to contain avalanches of granular, coarse material (e.g., a rock slide or a debris slide). The numerical model is based on a discrete element method (YADE-DEM). The DEM modeling of both the flowing granular material and the flexible structure is detailed before presenting some results. The flowing material consists of a dry polydisperse granular material that accounts for the non-sphericity of real materials. The flexible structure consists of a metallic net hung on main cables connected to the ground via anchors on both sides of the channel, including dissipators. All these components were modeled as flexible beams or wires, with mechanical parameters defined from literature data. The simulation results are presented with the aim of investigating the variability of the structure's response depending on different parameters related to the structure (inclination of the fence, with/without brakes, mesh opening size), but also to the channel (inclination). Results are then compared with existing recommendations in similar fields.

  8. Evaluation of Alternate Surface Passivation Methods (U)

    SciTech Connect

    Clark, E

    2005-05-31

    Stainless steel containers were assembled from parts passivated by four commercial vendors using three passivation methods. The performance of these containers in storing hydrogen isotope mixtures was evaluated by monitoring the composition of an initially 50% H₂ / 50% D₂ gas over time using mass spectroscopy. Commercial passivation by electropolishing appears to result in surfaces that do not catalyze hydrogen isotope exchange. This method of surface passivation shows promise for tritium service and should be studied further and considered for use. On the other hand, nitric acid passivation and citric acid passivation may not result in surfaces that do not catalyze the isotope exchange reaction H₂ + D₂ → 2HD. These methods should not be considered as replacements for the proprietary passivation processes of the two current vendors used at the Savannah River Site Tritium Facility.

  9. Cryptosporidiosis: multiattribute evaluation of six diagnostic methods.

    PubMed Central

    MacPherson, D W; McQueen, R

    1993-01-01

    Six diagnostic methods (Giemsa staining, Ziehl-Neelsen staining, auramine-rhodamine staining, Sheather's sugar flotation, an indirect immunofluorescence procedure, and a modified concentration-sugar flotation method) for the detection of Cryptosporidium spp. in stool specimens were compared on the following attributes: diagnostic yield, cost to perform each test, ease of handling, and ability to process large numbers of specimens for screening purposes by batching. A rank ordering from least desirable to most desirable was then established for each method by using the study attributes. The process of decision analysis with respect to the laboratory diagnosis of cryptosporidiosis is discussed through the application of multiattribute utility theory to the rank ordering of the study criteria. Within a specific health care setting, a diagnostic facility will be able to calculate its own utility scores for our study attributes. Multiattribute evaluation and analysis are potentially powerful tools in the allocation of resources in the laboratory. PMID:8432802
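
    The multiattribute evaluation itself is easy to make concrete: score each method on each attribute, weight the attributes, and rank by total utility. The sketch below uses purely hypothetical scores and weights, not the study's data (which drove the actual rank ordering):

```python
def maut_rank(methods, weights):
    """Multiattribute utility: weighted sum of normalized attribute
    scores (higher = better), then rank methods by total utility."""
    totals = {m: sum(weights[a] * s for a, s in attrs.items())
              for m, attrs in methods.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical 0-1 scores, illustrative only.
methods = {
    "Giemsa":             {"yield": 0.4, "cost": 0.9, "handling": 0.8, "batching": 0.6},
    "Ziehl-Neelsen":      {"yield": 0.6, "cost": 0.8, "handling": 0.7, "batching": 0.6},
    "auramine-rhodamine": {"yield": 0.8, "cost": 0.6, "handling": 0.6, "batching": 0.8},
    "immunofluorescence": {"yield": 0.9, "cost": 0.3, "handling": 0.5, "batching": 0.7},
}
weights = {"yield": 0.4, "cost": 0.3, "handling": 0.2, "batching": 0.1}
print(maut_rank(methods, weights))
```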

  10. Co-seismic landslide topographic analysis based on multi-temporal DEM-A case study of the Wenchuan earthquake.

    PubMed

    Ren, Zhikun; Zhang, Zhuqi; Dai, Fuchu; Yin, Jinhui; Zhang, Huiping

    2013-01-01

    Hillslope instability has been thought to be one of the most important factors in landslide susceptibility. In this study, we apply geomorphic analysis using multi-temporal DEM data and shaking intensity analysis to evaluate the topographic characteristics of the landslide areas. There are many geomorphic analysis methods, such as roughness and slope aspect, which are as useful as slope analysis. The analyses indicate that most of the co-seismic landslides occurred in regions with roughness >1.2, hillslope >30°, and slope aspect between 90° and 270°. The intersection of the regions from the above three methods is more accurate than results derived by applying a single topographic analysis method. The ground motion data indicate that the co-seismic landslides mainly occurred on the hanging wall side of the Longmen Shan Thrust Belt, within the vertical and horizontal peak ground acceleration (PGA) contours of 150 gal and 200 gal, respectively. Comparisons of pre- and post-earthquake DEM data indicate that areas of medium roughness and slope increased while the roughest and steepest regions decreased after the Wenchuan earthquake; slope aspect, however, hardly changed. Our results indicate that co-seismic landslides mainly occurred in specific regions of high roughness on southward-facing, steeply sloping areas under strong ground motion. Co-seismic landslides significantly modified the local topography, especially hillslope and roughness: the roughest relief and steepest slopes were significantly smoothed, while medium relief and slopes became rougher and steeper, respectively.
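
    The three topographic layers can be derived directly from a DEM; a minimal NumPy sketch follows, using the thresholds quoted above. The roughness definition (surface-to-planar area ratio, roughly sec(slope)) and the aspect orientation convention are our assumptions, since the record does not specify them.

```python
import numpy as np

def landslide_prone_mask(dem, cellsize):
    """Slope (degrees), aspect (degrees clockwise from north) and a
    surface-ratio roughness index from a DEM, intersected with the
    abstract's thresholds: roughness > 1.2, slope > 30 deg,
    aspect between 90 and 270 deg (the south-facing half)."""
    dzdy, dzdx = np.gradient(dem.astype(float), cellsize)
    slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    # Aspect of steepest descent; 0 = north, 90 = east (assumed axes).
    aspect = (np.degrees(np.arctan2(dzdx, -dzdy)) + 360.0) % 360.0
    roughness = 1.0 / np.cos(np.radians(slope))  # assumed definition
    return ((roughness > 1.2) & (slope > 30.0)
            & (aspect >= 90.0) & (aspect <= 270.0))
```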

  11. Blaze-DEMGPU: Modular high performance DEM framework for the GPU architecture

    NASA Astrophysics Data System (ADS)

    Govender, Nicolin; Wilke, Daniel N.; Kok, Schalk

    Blaze-DEMGPU is a modular GPU-based discrete element method (DEM) framework that supports polyhedral particles. Its high performance is attributed to the lightweight design and the Single Instruction Multiple Data (SIMD) model that the GPU architecture offers. Blaze-DEMGPU offers suitable algorithms to conduct DEM simulations on the GPU, and these algorithms can be extended and modified. Since a large number of scientific simulations are particle based, many of the algorithms and strategies for GPU implementation present in Blaze-DEMGPU can be applied to other fields. Blaze-DEMGPU will make it easier for new researchers to use high performance GPU computing, as well as stimulate wider GPU research efforts by the DEM community.

  12. GPS-Based Precision Baseline Reconstruction for the TanDEM-X SAR-Formation

    NASA Technical Reports Server (NTRS)

    Montenbruck, O.; vanBarneveld, P. W. L.; Yoon, Y.; Visser, P. N. A. M.

    2007-01-01

    The TanDEM-X formation employs two separate spacecraft to collect interferometric Synthetic Aperture Radar (SAR) measurements over baselines of about 1 km. These will allow the generation of a global Digital Elevation Model (DEM) with a relative vertical accuracy of 2-4 m and a 10 m ground resolution. As part of the ground processing, the separation of the SAR antennas at the time of each data take must be reconstructed with a 1 mm accuracy using measurements from two geodetic-grade GPS receivers. The paper discusses the TanDEM-X mission as well as the methods employed for determining the interferometric baseline with utmost precision. Measurements collected during the close fly-by of the two GRACE satellites serve as a reference case to illustrate the processing concept, expected accuracy, and quality control strategies.

  13. Evaluation method of indoor GPS measurement network

    NASA Astrophysics Data System (ADS)

    Li, Yang; Zhou, Zili; Ma, Liqun; Li, Qian

    2013-01-01

    An indoor GPS measurement network is a spatial coordinate measuring system composed of more than one transmitter. The number and location of the transmitters determine the measurement range and accuracy of the network, so how to evaluate the measurement network correctly is a key issue. By analyzing the error model of a measuring system composed of two transmitters, we identified the main cause of the measurement uncertainty. Through MATLAB simulation, we obtained the effective measurement conditions required to meet a specific measurement uncertainty requirement. Meanwhile, the total uncertainty of the measurement network, which comprises measurement uncertainty, location uncertainty, receiver uncertainty, and other contributions, is analyzed. We propose an evaluation method based on a reference length and optimize the position, posture, and length of the reference artifact so as to meet the evaluation requirements of the entire measurement space. Finally, we simulated the measurement network for aircraft assembly in a measurement space of 20 m × 20 m × 5 m, and the measurement network for car assembly in a measurement space of 5 m × 5 m × 2 m. We evaluated the measurement networks according to the above principles and estimated the uncertainty of each network through the measurement bias of the reference length at different locations.
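
    A reference-length check of this kind is easy to prototype. The sketch below (an illustrative assumption of the workflow, not the authors' MATLAB model) perturbs the two measured endpoints of a bar of known length with Gaussian noise and summarizes the length bias across placements to estimate network uncertainty.

```python
import numpy as np

rng = np.random.default_rng(42)

def length_bias(p0, p1, true_len, sigma=0.05e-3, trials=2000):
    """Monte Carlo estimate of reference-length bias at one placement.

    sigma is an assumed per-coordinate measurement noise (metres)."""
    n0 = p0 + rng.normal(0.0, sigma, size=(trials, 3))
    n1 = p1 + rng.normal(0.0, sigma, size=(trials, 3))
    measured = np.linalg.norm(n1 - n0, axis=1)
    return measured.mean() - true_len, measured.std()

# A 2 m reference bar placed at several locations in a 20 m x 20 m x 5 m space
true_len = 2.0
for origin in ([1.0, 1.0, 1.0], [10.0, 10.0, 2.5], [18.0, 18.0, 4.0]):
    p0 = np.asarray(origin)
    bias, spread = length_bias(p0, p0 + [true_len, 0.0, 0.0], true_len)
    print(f"origin {origin}: bias {bias*1e6:7.2f} um, std {spread*1e6:6.2f} um")
```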

  14. Evaluation of contemporary female sterilization methods.

    PubMed

    Brenner, W E

    1981-09-01

    Different methods of sterilization were evaluated. Laparoscopic techniques were the most satisfactory because they had lower pelvic and incision infection rates and shorter hospitalization and convalescent times than laparotomy, and lower pelvic infection rates than culdoscopy and colpotomy. Via laparoscopy, sterilization by coagulation and cutting, spring-loaded clips, or bands was an effective, safe method. Mechanical problems with the applicator and optics and decreased visibility resulted in more technical failures and difficulties and more misapplications with the clip applicator. Although the total complication rates were similar with all methods, bleeding from the tubes and wound and pelvic infections were more frequent with the silastic-band technique. Long-term complications, such as dysmenorrhea and menometrorrhagia, and especially those resulting in hysterectomy after laparoscopy, are infrequent. Pregnancy rates are low after laparoscopic sterilization with coagulation and silastic bands as compared with the clip. Rates of complications with sterilization combined with abortion or delivery are only slightly higher than after abortion without sterilization, and much less than the combined complications that would be anticipated from abortion and interval sterilization. To make colpotomy, culdoscopy, and minilaparotomy easier and potentially safer, mechanical techniques using the spring-loaded clip and silastic band are being evaluated. Simplified techniques that can be administered via the cervix, such as quinacrine, may be practical in the future.

  15. Evaluation of Scaling Methods for Rotorcraft Icing

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Kreeger, Richard E.

    2010-01-01

    This paper reports results of an experimental study in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the current recommended scaling methods, developed for fixed-wing unprotected-surface icing applications, might apply to representative rotor blades at finite angle of attack. Unlike the fixed-wing case, no single scaling method has been systematically developed and evaluated for rotorcraft icing applications. In the present study, scaling was based on the modified Ruff method, with the scale velocity determined by maintaining constant Weber number. Models were unswept NACA 0012 wing sections. The reference model had a chord of 91.4 cm and the scale model a chord of 35.6 cm. Reference tests were conducted with velocities of 76 and 100 kt (39 and 52 m/s), droplet MVDs of 150 and 195 μm, and stagnation-point freezing fractions of 0.3 and 0.5, at angles of attack of 0° and 5°. It was shown that good ice-shape scaling was achieved for NACA 0012 airfoils at angles of attack up to 5°.
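
    The constant-Weber-number velocity scaling can be made concrete with a worked example. Assuming a Weber number formed on the model chord with unchanged fluid properties (We = ρV²c/σ; this is an assumption, since published Ruff-method variants differ in the choice of length scale and density), matching We between the reference and scale models gives V_scale = V_ref·sqrt(c_ref/c_scale):

```python
import math

def scale_velocity_const_weber(v_ref, c_ref, c_scale):
    """Scale-model velocity preserving We = rho * V**2 * c / sigma.

    With fluid properties unchanged, We_ref = We_scale implies
    V_scale = V_ref * sqrt(c_ref / c_scale)."""
    return v_ref * math.sqrt(c_ref / c_scale)

v_ref = 52.0                       # m/s (the 100 kt reference case)
c_ref, c_scale = 0.914, 0.356      # chords in metres
print(f"scale velocity: {scale_velocity_const_weber(v_ref, c_ref, c_scale):.1f} m/s")
```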

  16. Evaluating conflation methods using uncertainty modeling

    NASA Astrophysics Data System (ADS)

    Doucette, Peter; Dolloff, John; Canavosio-Zuzelski, Roberto; Lenihan, Michael; Motsko, Dennis

    2013-05-01

    The classic problem of computer-assisted conflation involves the matching of individual features (e.g., point, polyline, or polygon vectors) as stored in a geographic information system (GIS), between two different sets (layers) of features. The classical goal of conflation is the transfer of feature metadata (attributes) from one layer to another. The age of free public and open source geospatial feature data has significantly increased the opportunity to conflate such data to create enhanced products. There are currently several spatial conflation tools in the marketplace with varying degrees of automation. An ability to evaluate conflation tool performance quantitatively is of operational value, although manual truthing of matched features is laborious and costly. In this paper, we present a novel methodology that uses spatial uncertainty modeling to simulate realistic feature layers to streamline evaluation of feature matching performance for conflation methods. Performance results are compiled for DCGIS street centerline features.
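
    The simulation idea, perturbing a truth layer with a spatial error model and then scoring a matcher against the known correspondences, can be sketched briefly. The following toy example (my assumption of the workflow, not the authors' system) jitters point features with Gaussian noise and evaluates a nearest-neighbour matcher, with ground truth available by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Truth layer: point features; simulated layer: the same points degraded by
# an assumed Gaussian horizontal error model (sigma in map units)
truth = rng.uniform(0.0, 1000.0, size=(200, 2))
sigma = 5.0
simulated = truth + rng.normal(0.0, sigma, size=truth.shape)

def match(layer_a, layer_b, gate=15.0):
    """Nearest-neighbour matcher with a gating distance; -1 means no match."""
    d = np.linalg.norm(layer_a[:, None, :] - layer_b[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    return np.where(d[np.arange(len(layer_a)), nearest] <= gate, nearest, -1)

assigned = match(truth, simulated)
# The true correspondence is known by construction, so scoring is free
print(f"correctly matched: {(assigned == np.arange(len(truth))).mean():.1%}")
```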

  17. Precise Global DEM Generation by ALOS PRISM

    NASA Astrophysics Data System (ADS)

    Tadono, T.; Ishida, H.; Oda, F.; Naito, S.; Minakawa, K.; Iwamoto, H.

    2014-04-01

    The Japan Aerospace Exploration Agency (JAXA) generated a global digital elevation/surface model (DEM/DSM) and orthorectified imagery (ORI) using the archived data of the Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM) onboard the Advanced Land Observing Satellite (ALOS, nicknamed "Daichi"), which was operated from 2006 to 2011. PRISM consisted of three panchromatic radiometers that acquired along-track stereo images. It had a spatial resolution of 2.5 m in the nadir-looking radiometer and achieved global coverage, making it a suitable candidate for precise global DSM and ORI generation. Over the past 10 years or so, JAXA has calibrated the system-corrected standard products of PRISM in order to improve their absolute accuracies, as well as to validate the high-level products such as the DSM and ORI. In this paper, we give an overview of the global DEM/DSM dataset generation project, including a summary of ALOS and PRISM and the status of the global data archive. Data processing strategies must also be considered, since the processing capabilities for the level 1 standard product and the high-level products must be developed, in terms of both hardware and software, to achieve the project aims. The automatic DSM/ORI processing software and its test processing results are also described.

  18. A framework for global terrain classification using 250-m DEMs to predict geohazards

    NASA Astrophysics Data System (ADS)

    Iwahashi, J.; Matsuoka, M.; Yong, A.

    2016-12-01

    Geomorphology is key for identifying factors that control geohazards induced by landslides, liquefaction, and ground shaking. To systematically identify landforms that affect these hazards, Iwahashi and Pike (2007; IP07) introduced an automated terrain classification scheme using 1-km-scale Shuttle Radar Topography Mission (SRTM) digital elevation models (DEMs). The IP07 classes describe 16 categories of terrain types and were used as a proxy for predicting ground motion amplification (Yong et al., 2012; Seyhan et al., 2014; Stewart et al., 2014; Yong, 2016). These classes, however, were not sufficiently resolved because coarse-scaled SRTM DEMs were the basis for the categories (Yong, 2016). Thus, we develop a new framework consisting of more detailed polygonal global terrain classes to improve estimations of soil-type and material stiffness. We first prepare high resolution 250-m DEMs derived from the 2010 Global Multi-resolution Terrain Elevation Data (GMTED2010). As in IP07, we calculate three geometric signatures (slope, local convexity and surface texture) from the DEMs. We create additional polygons by using the same signatures and multi-resolution segmentation techniques on the GMTED2010. We consider two types of surface texture thresholds in different window sizes (3x3 and 13x13 pixels), in addition to slope and local convexity, to classify pixels within the DEM. Finally, we apply the k-means clustering and thresholding methods to the 250-m DEM and produce more detailed polygonal terrain classes. We compare the new terrain classification maps of Japan and California with geologic, aerial photography, and landslide distribution maps, and visually find good correspondence of key features. To predict ground motion amplification, we apply the Yong (2016) method for estimating VS30. The systematic classification of geomorphology has the potential to provide a better understanding of the susceptibility to geohazards, which is especially vital in populated areas.
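
    The clustering step can be reproduced compactly by stacking the three geometric signatures per pixel and running k-means with 16 clusters, matching the IP07 category count. The sketch below uses scikit-learn on synthetic placeholder rasters; the signature values, the standardization step, and the cluster count are illustrative assumptions rather than the authors' exact configuration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Placeholder rasters for the three geometric signatures
shape = (120, 120)
slope = rng.gamma(2.0, 3.0, size=shape)
convexity = rng.normal(0.0, 1.0, size=shape)
texture = rng.uniform(0.0, 1.0, size=shape)

# One feature vector per pixel, standardized so no signature dominates
features = np.stack([slope, convexity, texture], axis=-1).reshape(-1, 3)
features = (features - features.mean(axis=0)) / features.std(axis=0)

labels = KMeans(n_clusters=16, n_init=10, random_state=0).fit_predict(features)
terrain_classes = labels.reshape(shape)      # 16 categories, as in IP07
print("pixels per class:", np.bincount(labels))
```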

  19. Comparative evaluation of different histoprocessing methods

    PubMed Central

    Singla, Kartesh; Sandhu, Simarpreet Virk; Pal, Rana A. G. K.; Bansal, Himanta; Bhullar, Ramanpreet Kaur; Kaur, Preetinder

    2017-01-01

    Objectives: Tissue processing has for years been carried out by the conventional method, a time-consuming technique that results in a 1-day delay in diagnosis. However, in this era of modernization and managed care, rapid diagnosis is increasingly desirable to meet the needs of clinicians. The objective of the present study was to compare different tissue processing methods and determine their impact on turnaround times by comparing the color intensity, cytoplasmic details, and nuclear details of the tissues processed by three methods. Methods: A total of sixty biopsied tissues were grossed and cut into three equal parts. One part was processed by the conventional method, the second by rapid manual processing, and the third by a microwave-assisted method. The slides obtained after processing were circulated among four observers for evaluation. Sections processed by the three techniques were subjected to statistical analysis by the Kruskal–Wallis test. Cronbach's alpha reliability test was applied to assess the reliability among observers. One-way analysis of variance (ANOVA) was used to compare mean shrinkage before and after processing. Results: All observers were assumed to be reliable, as Cronbach's reliability test was statistically significant. The results were statistically non-significant as observed by the Kruskal–Wallis test. One-way ANOVA revealed a significant value on comparison of the tissue shrinkage from the three techniques. The histological evaluation of the tissues revealed that the nuclear-cytoplasmic contrast was good in tissues processed by microwave, followed by the conventional and rapid manual processing techniques. The color intensity of the tissues processed by microwave was crisper, and there was a good contrast between the hematoxylin- and eosin-stained areas as compared with the manual methods. Conclusion: The overall quality of tissues from all three methods was similar. It was not feasible to distinguish between the three techniques by

  20. Image Inpainting Methods Evaluation and Improvement

    PubMed Central

    Vreja, Raluca

    2014-01-01

    With the growth of digital image processing and film archiving, the need for assisted or unsupervised restoration has driven the development of a series of methods and techniques. Among them, image inpainting is perhaps the most impressive and useful. Alongside approaches based on partial differential equations or texture synthesis, many hybrid techniques have been proposed recently. The need for an analytical comparison, besides the visual one, motivated the studies presented in this paper. Starting with an overview of the domain, an evaluation of five methods was performed using a common benchmark and measuring the PSNR. Conclusions regarding the performance of the investigated algorithms are presented, categorizing them according to the restored image structure. Based on these experiments, we propose an adaptation of Oliveira's and Hadhoud's algorithms, which perform well on images with natural defects. PMID:25136700
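
    Since the benchmark metric here is PSNR, a worked definition may help: for 8-bit images, PSNR = 10·log10(255² / MSE), where MSE is the mean squared error between the original and restored images. A minimal implementation:

```python
import numpy as np

def psnr(original, restored, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two same-shaped 8-bit images."""
    mse = np.mean((original.astype(np.float64) - restored.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")        # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
noisy = np.clip(img + np.random.normal(0.0, 5.0, img.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(img, noisy):.2f} dB")
```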

  1. Method for Preclinical Evaluation of Antiplaque Agents

    PubMed Central

    Tanzer, J. M.; Reid, Y.; Reid, W.

    1972-01-01

    A preclinical method for the evaluation of antibacterial agents for use against dental plaques associated with caries and periodontal disease is proposed. The method is applicable to screening agents and to defining, in vitro, the minimal conditions required for maximal antiplaque effect. As a model of antiplaque agents, chlorhexidine was assessed in vitro against preformed plaques of microorganisms conducive to dental caries and periodontal disease. The agent was bactericidal to plaques of nine strains of Streptococcus mutans and one strain of Actinomyces viscosus when used in a single treatment for 20 min at 2 × 10−1%, in two 2-min treatments on the same day, or in daily 2-min treatments at this same concentration. Using the last of these experimental conditions, we then tested chlorhexidine in vivo by topical application to the maxillary teeth of infected hamsters and found it to be effective in controlling plaques of S. mutans and A. viscosus. PMID:4670477

  2. [Cost evaluation of two induced labor methods].

    PubMed

    Torres Magret, E; Sánchez Batista, R; Ramírez Pellicer, A M; Deulofeu Betancourt, A I

    1997-01-01

    Two induced labor methods, venoclysis with oxytocin and self-stimulation of the nipples, were comparatively evaluated in two groups of pregnant women (80 in total) admitted to the Eastern Gyneco-obstetric Teaching Hospital in Santiago de Cuba during the first semester of 1993. The following variables were recorded using a questionnaire: drug consumption, material expenses, length of stay, and quality. Percentages and the chi-square test were applied to these data, which are presented in tables. Self-stimulation of the nipples proved to be the most economical method as regards savings of expendable material and drugs. Hospital stay and the perinatal outcomes related to the type of labor and newborn status were similar with both methods.

  3. Image inpainting methods evaluation and improvement.

    PubMed

    Vreja, Raluca; Brad, Remus

    2014-01-01

    With the growth of digital image processing and film archiving, the need for assisted or unsupervised restoration has driven the development of a series of methods and techniques. Among them, image inpainting is perhaps the most impressive and useful. Alongside approaches based on partial differential equations or texture synthesis, many hybrid techniques have been proposed recently. The need for an analytical comparison, besides the visual one, motivated the studies presented in this paper. Starting with an overview of the domain, an evaluation of five methods was performed using a common benchmark and measuring the PSNR. Conclusions regarding the performance of the investigated algorithms are presented, categorizing them according to the restored image structure. Based on these experiments, we propose an adaptation of Oliveira's and Hadhoud's algorithms, which perform well on images with natural defects.

  4. Economic methods for multipollutant analysis and evaluation

    SciTech Connect

    Baasel, W.D.

    1985-01-01

    Since 1572, when miners' lung problems were first linked to dust, man's industrial activity has been increasingly accused of causing disease in man and harm to the environment. Since that time, each compound or stream thought to be damaging has been examined independently. If a gas stream caused the problem, the offending compounds were reduced to an acceptable level and the problem was considered solved. What happened to substances after they were removed usually was not fully considered until the finding of an adverse effect required it. Until 1970, one usual way of getting rid of many toxic wastes was to place them in landfills and forget about them. The discovery of sickness caused by substances escaping from the Love Canal landfill caused a total rethinking of that procedure. This and other incidents clearly showed that taking a substance out of one stream discharged to the environment and placing it in another may not be an adequate solution. What must be done is to look at all streams leaving an industrial plant and devise a way to reduce the potentially harmful emissions in those streams to an acceptable level, using methods that are inexpensive. To illustrate conceptually how the environmental assessment approach is a vast improvement over the current methods, an example evaluating effluents from a coal-fired 500 MW power plant is presented. Initially, only one substance in one stream is evaluated: the sulfur oxide leaving in the flue gas.

  5. Methods and Metrics for Evaluating Environmental Dredging ...

    EPA Pesticide Factsheets

    This report documents the objectives, approach, methodologies, results, and interpretation of a collaborative research study conducted by the National Risk Management Research Laboratory (NRMRL) and the National Exposure Research Laboratory (NERL) of the U.S. Environmental Protection Agency's (U.S. EPA's) Office of Research and Development (ORD) and the U.S. EPA's Great Lakes National Program Office (GLNPO). The objectives of the research study were to: 1) evaluate the remedy effectiveness of environmental dredging as applied to contaminated sediments in the Ashtabula River in northeastern Ohio, and 2) monitor the recovery of the surrounding ecosystem. The project was carried out over 6 years, from 2006 through 2011, and consisted of the development and evaluation of methods and approaches to assess river and ecosystem conditions prior to dredging (2006), during dredging (2006 and 2007), and following dredging, both short term (2008) and long term (2009-2011). This project report summarizes and interprets the results of this 6-year study to develop and assess methods for monitoring pollutant fate and transport and ecosystem recovery through the use of biological, chemical, and physical lines of evidence (LOEs), such as: 1) comprehensive sampling of and chemical analysis of contaminants in surface, suspended, and historic sediments; 2) extensive grab and multi-level real time water sampling and analysis of contaminants in the water column; 3) sampling, chemi

  6. An Investigation of Transgressive Deposits in Late Pleistocene Lake Bonneville using GPR and UAV-produced DEMs.

    NASA Astrophysics Data System (ADS)

    Schide, K.; Jewell, P. W.; Oviatt, C. G.; Jol, H. M.

    2015-12-01

    Lake Bonneville was the largest of the Pleistocene pluvial lakes that once filled the Great Basin of the interior western United States. Its two most prominent shorelines, Bonneville and Provo, are well documented, but many of the lake's intermediate shoreline features have yet to be studied. These transgressive barriers and embankments mark short-term changes in the regional water budget and thus represent a proxy for local climate change. The internal and external structures of these features are analyzed using the following methods: ground penetrating radar, 5-meter auto-correlated DEMs, 1-meter DEMs generated from LiDAR, high-accuracy handheld GPS, and 3D imagery collected with an unmanned aerial vehicle. These mapping, surveying, and imaging methods provide a quantitative analysis of regional sediment availability, transportation, and deposition, as well as changes in wave and wind energy. These controls help define climate thresholds and rates of landscape evolution in the Great Basin during the Pleistocene, which are then evaluated in the context of global climate change.

  7. DEM time series of an agricultural watershed

    NASA Astrophysics Data System (ADS)

    Pineux, Nathalie; Lisein, Jonathan; Swerts, Gilles; Degré, Aurore

    2014-05-01

    In agricultural landscapes the soil surface evolves, notably due to erosion and deposition phenomena. Even if most field data come from plot-scale studies, the watershed scale seems more appropriate for understanding these processes. Currently, small unmanned aircraft systems and image processing techniques are improving rapidly, and 3D models can be built from multiple overlapping shots. Where techniques for large areas would be too expensive for a watershed-level study, and techniques for small areas too time-consuming, the unmanned aerial system is a promising solution for quantifying erosion and deposition patterns. The continuing technical improvements in this growing field allow us to obtain very good data quality and very high spatial resolution with high vertical (Z) accuracy. In the centre of Belgium, we equipped an agricultural watershed of 124 ha. For three years (2011-2013), we have been monitoring weather (including rainfall erosivity using a spectropluviograph), discharge at three different locations, sediment in runoff water, and watershed microtopography through unmanned airborne imagery (Gatewing X100). We also collected all available historical data to try to capture the "long-term" changes in watershed morphology during the last decades: old topographic maps, historical soil descriptions, etc. An erosion model (LANDSOIL) is also used to assess the evolution of the relief. Short-term evolution of the surface is now observed through flights at 200 m height. The pictures are taken with a side overlap of 80%. To precisely georeference the DEMs produced, ground control points are placed on the study site and surveyed using a Leica GPS1200 (accuracy of 1 cm for the x and y coordinates and 1.5 cm for the z coordinate). Flights are made each year in December, when the ground surface is as bare as possible. Specific treatments are being developed to counteract vegetation effects, because vegetation is known as a key source of error in DEMs produced by small unmanned aircraft
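
    The erosion and deposition signal in such a DEM time series is usually extracted as a DEM of difference. A minimal sketch (an assumed workflow, not the authors' processing chain) that differences two co-registered rasters and converts the signal to volumes, ignoring changes below an uncertainty threshold:

```python
import numpy as np

def dem_of_difference(dem_t1, dem_t2, cellsize, min_detect=0.03):
    """Erosion and deposition volumes between two co-registered DEMs.

    min_detect is an assumed level of detection (m) below which elevation
    change is treated as noise and zeroed out."""
    dz = dem_t2 - dem_t1
    dz = np.where(np.abs(dz) < min_detect, 0.0, dz)
    cell_area = cellsize ** 2
    deposition = dz[dz > 0].sum() * cell_area      # m^3 gained
    erosion = -dz[dz < 0].sum() * cell_area        # m^3 lost
    return erosion, deposition

t1 = np.random.rand(50, 50)
t2 = t1 + np.random.normal(0.0, 0.02, t1.shape)    # synthetic change
ero, dep = dem_of_difference(t1, t2, cellsize=0.1)
print(f"erosion {ero:.3f} m^3, deposition {dep:.3f} m^3")
```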

  8. Discrete element modelling (DEM) input parameters: understanding their impact on model predictions using statistical analysis

    NASA Astrophysics Data System (ADS)

    Yan, Z.; Wilkinson, S. K.; Stitt, E. H.; Marigo, M.

    2015-09-01

    Selection or calibration of particle property input parameters is one of the key problematic aspects for the implementation of the discrete element method (DEM). In the current study, a parametric multi-level sensitivity method is employed to understand the impact of the DEM input particle properties on the bulk responses for a given simple system: discharge of particles from a flat bottom cylindrical container onto a plate. In this case study, particle properties, such as Young's modulus, friction parameters and coefficient of restitution were systematically changed in order to assess their effect on material repose angles and particle flow rate (FR). It was shown that inter-particle static friction plays a primary role in determining both final angle of repose and FR, followed by the role of inter-particle rolling friction coefficient. The particle restitution coefficient and Young's modulus were found to have insignificant impacts and were strongly cross correlated. The proposed approach provides a systematic method that can be used to show the importance of specific DEM input parameters for a given system and then potentially facilitates their selection or calibration. It is concluded that shortening the process for input parameters selection and calibration can help in the implementation of DEM.
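
    The parametric sweep can be sketched generically: vary one factor across its levels while holding the others at baseline, and rank the factors by the spread they induce in a bulk response. Everything below (the parameter names, levels, and the stand-in response function) is illustrative, not the study's actual DEM model or multi-level procedure.

```python
# Assumed baseline values and sweep levels for each DEM input parameter
params = {
    "sliding_friction": [0.1, 0.3, 0.5],
    "rolling_friction": [0.00, 0.05, 0.10],
    "restitution":      [0.4, 0.6, 0.8],
    "youngs_modulus":   [1e7, 1e8, 1e9],
}
baseline = {name: levels[1] for name, levels in params.items()}

def bulk_response(p):
    """Stand-in for a DEM run returning an angle of repose (deg); a toy
    surrogate model, not a real simulation."""
    return (20.0 + 40.0 * p["sliding_friction"]
            + 15.0 * p["rolling_friction"] + 0.5 * p["restitution"])

# One-at-a-time sweep: spread in the response when one factor varies alone
sensitivity = {}
for name, levels in params.items():
    responses = [bulk_response({**baseline, name: level}) for level in levels]
    sensitivity[name] = max(responses) - min(responses)

for name, spread in sorted(sensitivity.items(), key=lambda kv: -kv[1]):
    print(f"{name:17s} response spread: {spread:5.2f} deg")
```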

  9. [Additional and further qualification after university studies].

    NASA Astrophysics Data System (ADS)

    Domnick, Ivonne

    Once the bachelor's degree is completed, the question of further qualification arises. Besides entering professional life, a master's programme may add further decisive bonus points to a CV. With additional qualifications from outside one's own field, such as business administration or marketing, it is easier for natural scientists to get started in a career. Many employers particularly like to see a doctorate in natural scientists; here one should weigh carefully whether it can be completed within a given time frame. Even after starting a job, the doctorate can, under certain circumstances, still be obtained later. Continuing education alongside work is likewise possible, part-time or by distance learning. In addition, there are many courses from private providers, lasting several weeks or months, in which basic business administration skills can be acquired.

  10. Modified risk evaluation method. Revision 1

    SciTech Connect

    Udell, C.J.; Tilden, J.A.; Toyooka, R.T.

    1993-08-01

    The purpose of this paper is to provide a structured and cost-oriented process to determine risks associated with nuclear material and other security interests. Financial loss is a continuing concern for US Department of Energy contractors. In this paper risk is equated with uncertainty of cost impacts to material assets or human resources. The concept provides a method for assessing the effectiveness of an integrated protection system, which includes operations, safety, emergency preparedness, and safeguards and security. The concept is suitable for application to sabotage evaluations. The protection of assets is based on risk associated with cost impacts to assets and the potential for undesirable events. This will allow managers to establish protection priorities in terms of the cost and the potential for the event, given the current level of protection.

  11. Methods for incomplete Bessel function evaluation

    NASA Astrophysics Data System (ADS)

    Harris, Frank E.; Fripiat, J. G.

    Presented here are detailed methods for evaluating the incomplete Bessel functions arising when Gaussian-type orbitals are used for systems periodic in one spatial dimension. The scheme is designed to yield these incomplete Bessel functions with an absolute accuracy of ±1 × 10⁻¹⁰ for the range of integer orders 0 ≤ n ≤ 12 [a range sufficient for a basis whose members have angular momenta of up to three units (s, p, d, or f atomic functions)]. To reach this accuracy level within acceptable computation times, new rational approximations were developed to compute the special functions involved, namely the exponential integral E₁(x) and the modified Bessel functions K₀(x) and K₁(x), to absolute accuracy ±1 × 10⁻¹⁵.
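
    The authors' rational approximations are not reproduced in the abstract, but the three special functions they target are standard and can be cross-checked against library implementations, for example with SciPy:

```python
from scipy.special import exp1, k0, k1    # E1(x), K0(x), K1(x)

# Library reference values against which a custom rational approximation
# could be cross-checked
for x in (0.5, 1.0, 5.0):
    print(f"x={x:4.1f}  E1={exp1(x):.12e}  K0={k0(x):.12e}  K1={k1(x):.12e}")
```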

  12. Developmental Eye Movement (DEM) Test Norms for Mandarin Chinese-Speaking Chinese Children.

    PubMed

    Xie, Yachun; Shi, Chunmei; Tong, Meiling; Zhang, Min; Li, Tingting; Xu, Yaqin; Guo, Xirong; Hong, Qin; Chi, Xia

    2016-01-01

    The Developmental Eye Movement (DEM) test is commonly used as a clinical visual-verbal ocular motor assessment tool to screen and diagnose reading problems at the onset. No established norm exists for using the DEM test with Mandarin Chinese-speaking Chinese children. This study aims to establish the normative values of the DEM test for the Mandarin Chinese-speaking population in China; it also aims to compare the values with three other published norms for English-, Spanish-, and Cantonese-speaking Chinese children. A random stratified sampling method was used to recruit children from eight kindergartens and eight primary schools in the main urban and suburban areas of Nanjing. A total of 1,425 Mandarin Chinese-speaking children aged 5 to 12 years took the DEM test in Mandarin Chinese. A digital recorder was used to record the process. All of the subjects completed a symptomatology survey, and their DEM scores were determined by a trained tester. The scores were computed using the formula in the DEM manual, except that the "vertical scores" were adjusted by taking the vertical errors into consideration. The results were compared with the three other published norms. In our subjects, a general decrease with age was observed for the four eye movement indexes: vertical score, adjusted horizontal score, ratio, and total error. For both the vertical and adjusted horizontal scores, the Mandarin Chinese-speaking children completed the tests much more quickly than the norms for English- and Spanish-speaking children. However, the same group completed the test slightly more slowly than the norms for Cantonese-speaking children. The differences in the means were significant (P<0.001) in all age groups. For several ages, the scores obtained in this study were significantly different from the reported scores of Cantonese-speaking Chinese children (P<0.005). Compared with English-speaking children, only the vertical score of the 6-year-old group, the vertical-horizontal time

  13. Developmental Eye Movement (DEM) Test Norms for Mandarin Chinese-Speaking Chinese Children

    PubMed Central

    Tong, Meiling; Zhang, Min; Li, Tingting; Xu, Yaqin; Guo, Xirong; Hong, Qin; Chi, Xia

    2016-01-01

    The Developmental Eye Movement (DEM) test is commonly used as a clinical visual-verbal ocular motor assessment tool to screen and diagnose reading problems at the onset. No established norm exists for using the DEM test with Mandarin Chinese-speaking Chinese children. This study aims to establish the normative values of the DEM test for the Mandarin Chinese-speaking population in China; it also aims to compare the values with three other published norms for English-, Spanish-, and Cantonese-speaking Chinese children. A random stratified sampling method was used to recruit children from eight kindergartens and eight primary schools in the main urban and suburban areas of Nanjing. A total of 1,425 Mandarin Chinese-speaking children aged 5 to 12 years took the DEM test in Mandarin Chinese. A digital recorder was used to record the process. All of the subjects completed a symptomatology survey, and their DEM scores were determined by a trained tester. The scores were computed using the formula in the DEM manual, except that the “vertical scores” were adjusted by taking the vertical errors into consideration. The results were compared with the three other published norms. In our subjects, a general decrease with age was observed for the four eye movement indexes: vertical score, adjusted horizontal score, ratio, and total error. For both the vertical and adjusted horizontal scores, the Mandarin Chinese-speaking children completed the tests much more quickly than the norms for English- and Spanish-speaking children. However, the same group completed the test slightly more slowly than the norms for Cantonese-speaking children. The differences in the means were significant (P<0.001) in all age groups. For several ages, the scores obtained in this study were significantly different from the reported scores of Cantonese-speaking Chinese children (P<0.005). Compared with English-speaking children, only the vertical score of the 6-year-old group, the vertical

  14. Defining optimal DEM resolutions and point densities for modelling hydrologically sensitive areas in agricultural catchments dominated by microtopography

    NASA Astrophysics Data System (ADS)

    Thomas, I. A.; Jordan, P.; Shine, O.; Fenton, O.; Mellander, P.-E.; Dunlop, P.; Murphy, P. N. C.

    2017-02-01

    Defining critical source areas (CSAs) of diffuse pollution in agricultural catchments depends upon the accurate delineation of hydrologically sensitive areas (HSAs) at highest risk of generating surface runoff pathways. In topographically complex landscapes, this delineation is constrained by digital elevation model (DEM) resolution and the influence of microtopographic features. To address this, optimal DEM resolutions and point densities for spatially modelling HSAs were investigated, for onward use in delineating CSAs. The surface runoff framework was modelled using the Topographic Wetness Index (TWI) and maps were derived from 0.25 m LiDAR DEMs (40 bare-earth points m-2), resampled 1 m and 2 m LiDAR DEMs, and a radar generated 5 m DEM. Furthermore, the resampled 1 m and 2 m LiDAR DEMs were regenerated with reduced bare-earth point densities (5, 2, 1, 0.5, 0.25 and 0.125 points m-2) to analyse effects on elevation accuracy and important microtopographic features. Results were compared to surface runoff field observations in two 10 km2 agricultural catchments for evaluation. Analysis showed that the accuracy of modelled HSAs using different thresholds (5%, 10% and 15% of the catchment area with the highest TWI values) was much higher using LiDAR data compared to the 5 m DEM (70-100% and 10-84%, respectively). This was attributed to the DEM capturing microtopographic features such as hedgerow banks, roads, tramlines and open agricultural drains, which acted as topographic barriers or channels that diverted runoff away from the hillslope scale flow direction. Furthermore, the identification of 'breakthrough' and 'delivery' points along runoff pathways where runoff and mobilised pollutants could be potentially transported between fields or delivered to the drainage channel network was much higher using LiDAR data compared to the 5 m DEM (75-100% and 0-100%, respectively). Optimal DEM resolutions of 1-2 m were identified for modelling HSAs, which balanced the need
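
    The Topographic Wetness Index used for the HSA maps is defined as TWI = ln(a / tan β), with a the specific upslope contributing area and β the local slope. A minimal sketch, assuming those two rasters are already available (flow accumulation itself requires a routing algorithm such as D8 or D-infinity):

```python
import numpy as np

def twi(specific_catchment_area, slope_deg, eps=1e-6):
    """Topographic Wetness Index: ln(a / tan(beta)).

    eps guards against flat cells; a and slope must be co-registered grids."""
    tan_beta = np.tan(np.radians(slope_deg))
    return np.log((specific_catchment_area + eps) / (tan_beta + eps))

# Placeholder inputs; in practice a comes from a flow-routing algorithm
a = np.random.gamma(2.0, 50.0, size=(100, 100))
slope = np.random.uniform(0.1, 20.0, size=(100, 100))
index = twi(a, slope)
hsa_mask = index >= np.percentile(index, 90)   # e.g. top 10% of TWI values
print("HSA cells:", int(hsa_mask.sum()))
```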

  15. DEM Simulation of Particle Stratification and Segregation in Stockpile Formation

    NASA Astrophysics Data System (ADS)

    Zhang, Dizhe; Zhou, Zongyan; Pinson, David

    2017-06-01

    Granular stockpiles are commonly observed in nature and industry, and their formation has been extensively investigated experimentally and mathematically in the literature. One of the striking features affecting the properties of stockpiles is the internal patterning formed by the stratification and segregation processes. In this work, we conduct a numerical study based on a DEM (discrete element method) model to study the influencing factors and triggering mechanisms of these two phenomena. With the use of a previously developed mixing index, the effects of parameters including size ratio, injection height and mass ratio are investigated. We found that a void-filling mechanism differentiates the motion of particles of different sizes. This mechanism drives the large particles to flow over the pile surface and segregate at the pile bottom, while it pushes small particles to fill the voids between large particles, giving rise to separate layers. Consequently, this difference in motion results in the observed stratification and segregation phenomena.

  16. Indicators and Methods for Evaluating Economic, Ecosystem ...

    EPA Pesticide Factsheets

    The U.S. Human Well-being Index (HWBI) is a composite measure that incorporates economic, environmental, and societal well-being elements through the eight domains of connection to nature, cultural fulfillment, education, health, leisure time, living standards, safety and security, and social cohesion (USEPA 2012a; Smith et al. 2013). Twenty-eight services, represented by a collection of indicators and metrics, have been identified as influencing these domains of human well-being. By taking an inventory of stocks or measuring the results of a service, a relationship function can be derived to understand how changes in the provisioning of that service can influence the HWBI. An extensive review of existing services was performed to identify current services, indicators, and metrics in use. This report describes the indicators and methods we have selected to evaluate the provisioning of economic, ecosystem, and social services related to human well-being, and provides metadata and methods for calculating service provisioning scores for the HWBI modeling framework.

  17. Laboratory evaluation of PCBs encapsulation method ...

    EPA Pesticide Factsheets

    The effectiveness and limitations of the encapsulation method for reducing polychlorinated biphenyl (PCB) concentrations in indoor air and on contaminated surfaces were evaluated in this laboratory study. Ten coating materials, such as epoxy and polyurethane coatings, latex paint, and petroleum-based paint, were tested in small environmental chambers to rank the encapsulants by their resistance to PCB sorption and to estimate the key parameters required by a barrier model. Wipe samples were collected from a PCB-contaminated surface encapsulated with the coating materials to rank the encapsulants by their resistance to PCB migration from the source. A barrier model was used to calculate the PCB concentrations in the sources and the encapsulant layers, and at the exposed surfaces of the encapsulant and in the room air at different times. The performance of the encapsulants was ranked by those concentrations and by PCB percent reductions. Overall, the three epoxy coatings performed better than the other coatings. Both the experimental results and the mathematical modeling showed that selecting proper encapsulants can effectively reduce the PCB concentrations at the exposed surfaces. The encapsulation method is most effective for contaminated surfaces that contain low levels of PCBs. This study addresses these questions through a combination of laboratory testing and mathematical modeling. The results should be useful to mitigation engineers, building owners and managers

  18. Extraction of Hydrological Proximity Measures from DEMs using Parallel Processing

    SciTech Connect

    Tesfa, Teklu K.; Tarboton, David G.; Watson, Daniel W.; Schreuders, Kimberly A.; Baker, Matthew M.; Wallace, Robert M.

    2011-12-01

    Land surface topography is one of the most important terrain properties impacting hydrological, geomorphological, and ecological processes active on a landscape. In our previous efforts to develop a soil depth model based upon topographic and land cover variables, we extracted a set of hydrological proximity measures (HPMs) from a Digital Elevation Model (DEM) as potential explanatory variables for soil depth. These HPMs may also have other, more general modeling applicability in hydrology, geomorphology and ecology, and so are described here from a general perspective. The HPMs we derived are variations of the distance up to ridge points (cells with no incoming flow) and variations of the distance down to stream points (cells with a contributing area greater than a threshold), following the flow path. These HPMs were computed using the D-infinity flow model, which apportions flow between adjacent neighbors based on the direction of steepest downward slope on the eight triangular facets constructed in a 3 x 3 grid cell window using the center cell and each pair of adjacent neighboring grid cells in turn. The D-infinity model typically results in multiple flow paths between two points on the topography, so distances may be computed as the minimum, maximum, or average over the individual flow paths. In addition, each of the HPMs is calculated vertically, horizontally, and along the land surface. Previously, these HPMs were calculated using recursive serial algorithms that suffered from stack overflow problems when used to process large datasets, limiting the size of DEMs that could be analyzed with that method to approximately 7000 x 7000 cells. To overcome this limitation, we developed a message passing interface (MPI) parallel approach for calculating these HPMs. The parallel algorithms spatially partition the input grid into stripes, each of which is assigned to a separate process for computation. Each of those processes then uses a
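
    The stack-overflow limitation of recursive flow-path traversal, mentioned above, is commonly avoided even on a single process by replacing recursion with an explicit path list and memoisation. Below is an illustrative sketch on a simpler D8 flow grid (a single flow path per cell, unlike the multi-path D-infinity case described in the text); it is a generic technique, not the authors' MPI implementation.

```python
import numpy as np

# D8 neighbour offsets and the plan distance to each neighbour (unit cells)
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]
DIST = [2 ** 0.5, 1.0, 2 ** 0.5, 1.0, 1.0, 2 ** 0.5, 1.0, 2 ** 0.5]

def distance_down_to_stream(flow_dir, is_stream):
    """Horizontal distance from every cell down to the nearest stream cell.

    flow_dir[r, c] indexes OFFSETS (the D8 receiver). Assumes every flow
    path terminates on a stream cell. The explicit path list plus
    memoisation replaces recursion, so deep flow paths cannot overflow
    the call stack."""
    dist = np.full(flow_dir.shape, np.nan)
    dist[is_stream] = 0.0
    for r0 in range(flow_dir.shape[0]):
        for c0 in range(flow_dir.shape[1]):
            path = []
            r, c = r0, c0
            while np.isnan(dist[r, c]):       # walk downstream to a known cell
                path.append((r, c))
                dr, dc = OFFSETS[flow_dir[r, c]]
                r, c = r + dr, c + dc
            for rp, cp in reversed(path):     # unwind, accumulating distance
                dr, dc = OFFSETS[flow_dir[rp, cp]]
                dist[rp, cp] = dist[rp + dr, cp + dc] + DIST[flow_dir[rp, cp]]
    return dist

fd = np.full((5, 5), 3)        # index 3 = (0, -1): all cells drain west
stream = np.zeros((5, 5), bool)
stream[:, 0] = True            # first column is the stream
print(distance_down_to_stream(fd, stream))
```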

  19. Evaluation of methods to assess physical activity

    NASA Astrophysics Data System (ADS)

    Leenders, Nicole Y. J. M.

    Epidemiological evidence has accumulated demonstrating that the amount of physical activity-related energy expenditure during a week reduces the incidence of cardiovascular disease, diabetes, obesity, and all-cause mortality. To further understand the amount of daily physical activity and related energy expenditure necessary to maintain or improve functional health status and quality of life, instruments that estimate total (TDEE) and physical activity-related energy expenditure (PAEE) under free-living conditions should be determined to be valid and reliable. Without evaluation of the various methods that estimate TDEE and PAEE against the doubly labeled water (DLW) method in females, there will be significant limitations on assessing the efficacy of physical activity interventions on health status in this population. A triaxial accelerometer (Tritrac-R3D (TT)), a uniaxial activity monitor (Computer Science and Applications Inc. (CSA)), a Yamax Digiwalker-500sp (YX-stepcounter), heart rate responses (HR method), and a 7-d Physical Activity Recall questionnaire (7-d PAR) were compared with the "criterion method" of DLW during a 7-d period in female adults. DLW-TDEE was underestimated on average by 9, 11, and 15% using the 7-d PAR, the HR method, and TT, respectively. The underestimation of DLW-PAEE by the 7-d PAR was 21%, compared with 47% and 67% for TT and the YX-stepcounter. Approximately 56% of the variance in DLW-PAEE·kg⁻¹ is explained by the registration of body movement with accelerometry. A larger proportion of the variance in DLW-PAEE·kg⁻¹ was explained by jointly incorporating information from the vertical and horizontal movement measured with the CSA and Tritrac-R3D (r² = 0.87). Although only a small amount of variance in DLW-PAEE·kg⁻¹ is explained by the number of steps taken per day, because of its low cost and ease of use the Yamax stepcounter is useful in studies promoting daily walking. Thus, studies involving the

  20. Rockslide and Impulse Wave Modelling in the Vajont Reservoir by DEM-CFD Analyses

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Utili, S.; Crosta, G. B.

    2016-06-01

    This paper investigates the generation of hydrodynamic water waves due to rockslides plunging into a water reservoir. Quasi-3D DEM analyses in plane strain by a coupled DEM-CFD code are adopted to simulate the rockslide from its onset to the impact with the still water and the subsequent generation of the wave. The employed numerical tools and upscaling of hydraulic properties allow predicting a physical response in broad agreement with the observations, notwithstanding the assumptions and characteristics of the adopted methods. The results obtained by the coupled DEM-CFD approach are compared with those published in the literature and with those presented by Crosta et al. (Landslide spreading, impulse waves and modelling of the Vajont rockslide. Rock Mechanics, 2014) in a companion paper, obtained through an ALE-FEM method. Analyses performed along two cross sections are representative of the limit conditions of the eastern and western slope sectors. The maximum average rockslide velocity and the water wave velocity reach ca. 22 and 20 m/s, respectively. The maximum computed run-up amounts to ca. 120 and 170 m for the eastern and western lobe cross sections, respectively. These values are reasonably close to those recorded during the event (ca. 130 and 190 m, respectively). The overall study therefore lays out a possible DEM-CFD framework for modelling the generation of the hydrodynamic wave due to the impact of a rapidly moving rockslide or rock-debris avalanche.

  1. Methods for the comparative evaluation of pharmaceuticals.

    PubMed

    Zentner, Annette; Velasco-Garrido, Marcial; Busse, Reinhard

    2005-11-15

    POLITICAL BACKGROUND: As a German novelty, the Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen; IQWiG) was established in 2004 to, among other tasks, evaluate the benefit of pharmaceuticals. In this context it is of importance that patented pharmaceuticals are only excluded from the reference pricing system if they offer a therapeutic improvement. The institute is commissioned by the Federal Joint Committee (Gemeinsamer Bundesausschuss, G-BA) or by the Ministry of Health and Social Security. The German policy objective expressed by the latest health care reform (Gesetz zur Modernisierung der Gesetzlichen Krankenversicherung, GMG) is to base decisions on a scientific assessment of pharmaceuticals in comparison with already available treatments. However, procedures and methods are still to be established. This health technology assessment (HTA) report was commissioned by the German Agency for HTA at the Institute for Medical Documentation and Information (DAHTA@DIMDI). It analysed criteria, procedures, and methods of comparative drug assessment in other EU/OECD countries. The research question was the following: how do national public institutions compare medicines in connection with pharmaceutical regulation, i.e. licensing, reimbursement, and pricing of drugs? Institutions as well as documents concerning comparative drug evaluation (e.g. regulations, guidelines) were identified through internet, systematic literature, and hand searches. Publications were selected according to pre-defined inclusion and exclusion criteria. Documents were analysed in a qualitative manner following an analytic framework that had been developed in advance. Results were summarised narratively and presented in evidence tables. Currently, licensing agencies do not systematically assess a new drug's added value for patients and society. This is why many countries made post-licensing evaluation of pharmaceuticals a

  2. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 10 (Energy), § 963.16, Postclosure suitability evaluation method: (a) DOE will evaluate postclosure suitability using the total system performance assessment method. DOE will conduct a total system...

  3. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 10 (Energy), § 963.16, Postclosure suitability evaluation method: (a) DOE will evaluate postclosure suitability using the total system performance assessment method. DOE will conduct a total system...

  4. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Title 10 (Energy), § 963.16, Postclosure suitability evaluation method: (a) DOE will evaluate postclosure suitability using the total system performance assessment method. DOE will conduct a total system...

  5. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    Title 10 (Energy), § 963.16, Postclosure suitability evaluation method: (a) DOE will evaluate postclosure suitability using the total system performance assessment method. DOE will conduct a total system...

  6. International genomic evaluation methods for dairy cattle

    USDA-ARS?s Scientific Manuscript database

    Background Genomic evaluations are rapidly replacing traditional evaluation systems used for dairy cattle selection. Economies of scale in genomics promote cooperation across country borders. Genomic information can be transferred across countries using simple conversion equations, by modifying mult...

  7. Democratizing Evaluation: Meanings and Methods from Practice.

    ERIC Educational Resources Information Center

    Ryan, Katherine E.; Johnson, Trav D.

    2000-01-01

    Uses the results of an instrumental case study to identify issues connected to evaluation participation and its representation and the role of the internal evaluator in democratic, deliberative evaluation. Identified direct participation and participation by representation, sanctioned or unsanctioned representation, and extrinsic and intrinsic…

  9. Multidisciplinary eHealth Survey Evaluation Methods

    ERIC Educational Resources Information Center

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  10. DEM GPU studies of industrial scale particle simulations for granular flow civil engineering applications

    NASA Astrophysics Data System (ADS)

    Pizette, Patrick; Govender, Nicolin; Wilke, Daniel N.; Abriak, Nor-Edine

    2017-06-01

    The use of the Discrete Element Method (DEM) for industrial civil engineering applications is currently limited by the computational demands when large numbers of particles are considered. The graphics processing unit (GPU), with its highly parallelized hardware architecture, shows potential to enable the solution of civil engineering problems using discrete granular approaches. In this study we demonstrate the practical utility of a validated GPU-enabled DEM modeling environment for simulating industrial-scale granular problems. As an illustration, the flow discharge of storage silos using 8 and 17 million particles is considered. DEM simulations have been performed to investigate the influence of particle size (equivalent size for the 20/40-mesh gravel) and induced shear stress for two hopper shapes. The preliminary results indicate that the shape of the hopper significantly influences the discharge rates for the same material. Specifically, this work shows that GPU-enabled DEM modeling environments can model industrial-scale problems on a single portable computer within a day for 30 seconds of process time.

  11. A simplified DEM-CFD approach for pebble bed reactor simulations

    SciTech Connect

    Li, Y.; Ji, W.

    2012-07-01

    In pebble bed reactors (PBRs), the pebble flow and the coolant flow are coupled with each other through coolant-pebble interactions. Approaches with different fidelities have been proposed to simulate such phenomena. Coupled Discrete Element Method-Computational Fluid Dynamics (DEM-CFD) approaches are widely studied and applied to these problems due to their good balance between efficiency and accuracy. In this work, based on the symmetry of the PBR geometry, a simplified 3D-DEM/2D-CFD approach is proposed to speed up the DEM-CFD simulation without significant loss of accuracy. Pebble flow is simulated by a full 3-D DEM, while the coolant flow field is calculated with a 2-D CFD simulation by averaging variables along the annular direction in the cylindrical geometry. Results show that this simplification can greatly enhance the efficiency for cylindrical cores, which enables the further inclusion of other physics, such as thermal and neutronic effects, in multi-physics simulations of PBRs. (authors)

  12. Extracting DEM from airborne X-band data based on PolInSAR

    NASA Astrophysics Data System (ADS)

    Hou, X. X.; Huang, G. M.; Zhao, Z.

    2015-06-01

    Polarimetric Interferometric Synthetic Aperture Radar (PolInSAR) is a new trend in SAR remote sensing that combines multichannel polarimetric information with interferometric information. It is of great significance for DEM extraction in regions where DEM precision is otherwise low, such as vegetated areas and densely built-up areas. In this paper we describe our experiments with high-resolution X-band fully polarimetric SAR data acquired by a dual-baseline interferometric airborne SAR system over an area of Danling in southern China. The Pauli algorithm is used to generate the dual-polarimetric interferometric data; the Singular Value Decomposition (SVD), Numerical Radius (NR), and Phase Diversity (PD) methods are used to generate the fully polarimetric interferometric data. The polarimetric interferometric information is then used to extract the DEM through pre-filtering, image registration, image resampling, coherence optimization, multilook processing, flat-earth removal, interferogram filtering, phase unwrapping, parameter calibration, height derivation, and geocoding. A processing system named SARPlore has been developed in VC++, led by the Chinese Academy of Surveying and Mapping. Finally, comparing the optimization results with single-polarimetric interferometry, it has been observed that the optimization methods can reduce the interferometric noise and the phase unwrapping residuals, and improve the precision of the DEM. The result of fully polarimetric interferometry is better than that of dual-polarimetric interferometry, and over different terrain the improvement from fully polarimetric interferometry varies in degree.
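
    The coherence optimization step operates on interferometric coherence, which for two co-registered complex SAR channels s1 and s2 is the normalized cross-correlation γ = |⟨s1·s2*⟩| / sqrt(⟨|s1|²⟩⟨|s2|²⟩), estimated over a local window. A short sketch of this standard estimator (not SARPlore code):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, window=5):
    """Sample coherence magnitude between two co-registered complex images."""
    cross = s1 * np.conj(s2)
    num = uniform_filter(cross.real, window) + 1j * uniform_filter(cross.imag, window)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, window)
                  * uniform_filter(np.abs(s2) ** 2, window))
    return np.abs(num) / np.maximum(den, 1e-12)

rng = np.random.default_rng(7)
s1 = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
s2 = 0.8 * s1 + 0.2 * (rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64)))
print(f"mean coherence: {coherence(s1, s2).mean():.3f}")
```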

  13. A quick algorithm of counting flow accumulation matrix for deriving drainage networks from a DEM

    NASA Astrophysics Data System (ADS)

    Wang, Yanping; Liu, Yonghe; Xie, Hongbo; Xiang, ZhongLin

    2011-06-01

    Computerized auto-extraction of drainage networks from Digital Elevation Models (DEMs) has been widely used in hydrological modeling and related studies. Several essential procedures need to be implemented in the eight-directional (D8) watershed delineation method; among them, a problem still to be resolved is the lack of a high-efficiency algorithm for quick and accurate computation of the flow accumulation matrix involved in river network delineation. (The depression-filling problem has already been resolved by the algorithm presented by Olivier Planchon.) This study aimed to develop a simple and quick algorithm for flow accumulation matrix computation. For this purpose, a simple high-efficiency algorithm of time complexity O(n), compared with the commonly used codes of time complexity O(n²) or O(n log n), has been developed. Performance tests of this newly developed algorithm were conducted for DEMs of different sizes, and the results suggest that the algorithm has linear time complexity with increasing DEM size. The computational efficiency of the newly developed algorithm is many times higher than that of commonly used codes; for a DEM of size 1000 × 1000, flow accumulation matrix computation can be completed within only a few seconds, compared with the few minutes needed by commonly used algorithms.
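
    An O(n) flow accumulation pass can be written without recursion by processing cells in topological order: count each cell's incoming D8 neighbours, seed a queue with cells of in-degree zero, and push accumulated area downstream. The sketch below illustrates this idea; it is one standard way to reach linear time, not necessarily the exact algorithm of this paper.

```python
from collections import deque
import numpy as np

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def flow_accumulation(flow_dir):
    """Linear-time D8 flow accumulation via topological (in-degree) ordering.

    flow_dir[r, c] indexes OFFSETS, or -1 for outlet cells."""
    rows, cols = flow_dir.shape
    acc = np.ones((rows, cols), dtype=np.int64)    # each cell contributes itself
    indeg = np.zeros((rows, cols), dtype=np.int32)
    for r in range(rows):
        for c in range(cols):
            if flow_dir[r, c] >= 0:
                dr, dc = OFFSETS[flow_dir[r, c]]
                indeg[r + dr, c + dc] += 1
    queue = deque((r, c) for r in range(rows) for c in range(cols)
                  if indeg[r, c] == 0)
    while queue:                                   # each cell dequeued once: O(n)
        r, c = queue.popleft()
        if flow_dir[r, c] < 0:
            continue
        dr, dc = OFFSETS[flow_dir[r, c]]
        acc[r + dr, c + dc] += acc[r, c]
        indeg[r + dr, c + dc] -= 1
        if indeg[r + dr, c + dc] == 0:
            queue.append((r + dr, c + dc))
    return acc

fd = np.full((4, 4), 4)     # index 4 = (0, 1): every cell drains east
fd[:, -1] = -1              # the last column holds the outlets
print(flow_accumulation(fd))
```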

  14. Lava emplacements at Shiveluch volcano (Kamchatka) from June 2011 to September 2014 observed by TanDEM-X SAR-Interferometry

    NASA Astrophysics Data System (ADS)

    Heck, Alexandra; Kubanek, Julia; Westerhaus, Malte; Gottschämmer, Ellen; Heck, Bernhard; Wenzel, Friedemann

    2016-04-01

    As part of the Ring of Fire, Shiveluch volcano is one of the largest and most active volcanoes on the Kamchatka Peninsula. During the Holocene, only the southern part of the Shiveluch massif was active. Since the last Plinian eruption in 1964, the activity of Shiveluch has been characterized by periods of dome growth and explosive eruptions. The recent active phase began in 1999 and continues to the present day. Due to the special conditions at active volcanoes, such as smoke development, danger of explosions or lava flows, as well as poor weather conditions and inaccessible terrain, it is difficult to observe the interaction between dome growth, dome destruction, and explosive eruptions at regular intervals. Consequently, a reconstruction of the eruption processes is hardly possible, though important for a better understanding of the eruption mechanism as well as for hazard forecasting and risk assessment. A new approach is provided by the bistatic radar data acquired by the TanDEM-X satellite mission. This mission is composed of two nearly identical satellites, TerraSAR-X and TanDEM-X, flying in a close helix formation. On the one hand, the radar signals, with an average wavelength of about 3.1 cm, penetrate clouds and, partially, vegetation and snow. On the other hand, in comparison with conventional InSAR methods, the bistatic radar mode has the advantage that there are no difficulties due to temporal decorrelation. By interferometric evaluation of the simultaneously recorded SAR images, it is possible to calculate high-resolution digital elevation models (DEMs) of Shiveluch volcano and its surroundings. Furthermore, the short repeat interval of 11 days allows time series of DEMs to be generated, from which volumetric changes of the dome and of lava flows, as well as lava effusion rates, can be determined. Here, this method is used at Shiveluch volcano based on data acquired between June 2011 and September 2014. Although Shiveluch has a fissured topography with steep slopes
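
    The volumetric step reduces to differencing co-registered DEMs over the dome or flow outline and scaling by the cell area; a minimal sketch with illustrative names, not the authors' code:

```python
import numpy as np

def volume_change(dem_t0, dem_t1, cell_size, mask=None):
    """Net volume change (m^3) between two co-registered DEMs (elevations
    in m), e.g. successive TanDEM-X DEMs over a lava dome.

    mask : optional boolean array restricting the sum to the dome/flow
           outline; cells outside it are ignored.
    """
    dz = dem_t1 - dem_t0
    if mask is not None:
        dz = np.where(mask, dz, 0.0)
    return np.nansum(dz) * cell_size ** 2

# An effusion rate follows by dividing by the repeat interval
# (11 days for TanDEM-X in this study) expressed in seconds.
```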

  15. San Francisco Bay-Delta bathymetric/topographic digital elevation model (DEM)

    USGS Publications Warehouse

    Fregoso, Theresa; Wang, Rueen-Fang; Ateljevich, Eli; Jaffe, Bruce E.

    2017-01-01

    A high-resolution (10-meter per pixel) digital elevation model (DEM) was created for the Sacramento-San Joaquin Delta using both bathymetry and topography data. This DEM is the result of collaborative efforts of the U.S. Geological Survey (USGS) and the California Department of Water Resources (DWR). The base of the DEM is a 10-m DEM released in 2004 and updated in 2005 (Foxgrover and others, 2005) that used the Environmental Systems Research Institute (ESRI) ArcGIS Topo to Raster module to interpolate grids from single-beam bathymetric surveys, collected by DWR, the Army Corps of Engineers (COE), the National Oceanic and Atmospheric Administration (NOAA), and the USGS, into a continuous surface. The Topo to Raster interpolation method was specifically designed to create hydrologically correct DEMs from point, line, and polygon data (Environmental Systems Research Institute, Inc., 2015). Elevation contour lines were digitized based on the single-beam point data for control of channel morphology during the interpolation process. Checks were performed to ensure that the interpolated surfaces honored the source bathymetry, and additional contours and (or) point data were added as needed to help constrain the data. The original data were collected in the tidal datum Mean Lower Low Water (MLLW) or the National Geodetic Vertical Datum of 1929 (NGVD29). All data were converted to NGVD29. The 2005 USGS DEM was updated by DWR, first by converting the DEM to the current modern datum of North American Vertical Datum of 1988 (NAVD88) and then by following the methodology of the USGS DEM, established for the 2005 DEM (Foxgrover and others, 2005), for adding newly collected single- and multibeam bathymetric data. They then included topographic data from lidar surveys, providing the first DEM that included the land/water interface (Wang and Ateljevich, 2012). The USGS further updated and expanded the DWR DEM with the inclusion of USGS interpolated sections of single beam

  16. Method for evaluation of laboratory craters using crater detection algorithm for digital topography data

    NASA Astrophysics Data System (ADS)

    Salamunićcar, Goran; Vinković, Dejan; Lončarić, Sven; Vučina, Damir; Pehnec, Igor; Vojković, Marin; Gomerčić, Mladen; Hercigonja, Tomislav

    In our previous work the following has been done: (1) a crater detection algorithm (CDA) based on digital elevation models (DEMs) was developed and the GT-115225 catalog assembled [GRS, 48 (5), in press, doi:10.1109/TGRS.2009.2037750]; and (2) results of a comparison between explosion-induced laboratory craters in stone-powder surfaces and GT-115225 were presented using depth/diameter measurements [41st LPSC, Abstract #1428]. The next step achievable with the available technology is to create 3D scans of such laboratory craters, in order to compare different properties with simple Martian craters. In this work, we propose a formal method for evaluating laboratory craters, providing an objective, measurable, and reproducible estimate of the level of similarity achieved between these laboratory craters and real impact craters. In the first step, a section of MOLA data for Mars (or SELENE LALT for the Moon) is replaced with one or several 3D scans of laboratory craters. Once embedded, the CDA can be used to find out whether the laboratory crater is similar enough to real craters to be recognized as a crater by the CDA. The CDA evaluation using the ROC' curve represents how the true detection rate (TDR = TP/(TP+FN) = TP/GT) depends on the false detection rate (FDR = FP/(TP+FP)). Using this curve, it is possible to define the measure of similarity between laboratory and real impact craters as a TDR or FDR value, or as a distance from the bottom-right origin of the ROC' curve. With such an approach, a reproducible (formally described) method for the evaluation of laboratory craters is provided.
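
    The abstract's rates map directly to code; the similarity measure below takes the "bottom-right origin" to be the worst-case corner (FDR, TDR) = (1, 0), which is our reading rather than a definition given in the abstract:

```python
import math

def tdr_fdr(tp, fp, fn):
    """Rates exactly as defined in the abstract:
    TDR = TP/(TP+FN), FDR = FP/(TP+FP)."""
    tdr = tp / (tp + fn)
    fdr = fp / (tp + fp)
    return tdr, fdr

def similarity(tp, fp, fn):
    """Distance of the CDA operating point from the worst-case corner
    (FDR, TDR) = (1, 0); treating that corner as the 'bottom-right
    origin' of the ROC' curve is an assumption."""
    tdr, fdr = tdr_fdr(tp, fp, fn)
    return math.hypot(1.0 - fdr, tdr)
```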

  17. Stream Morphologic Measurements from Airborne Laser Swath Mapping: Comparisons with Field Surveys, Traditional DEMs, and Aerial Photographs

    NASA Astrophysics Data System (ADS)

    Snyder, N. P.; Schultz, L. L.

    2005-12-01

    Precise measurement of stream morphology over entire watersheds is one of the great research opportunities provided by airborne laser swath mapping (ALSM). ALSM surveys allow for rapid quantification of factors, such as channel width and gradient, that control stream hydraulic and ecologic properties. We compare measurements from digital elevation models (DEMs) derived from ALSM data collected by the National Center for Airborne Laser Mapping (NCALM) to field surveys, traditional DEMs (rasterized from topographic maps), and aerial photographs. The field site is in the northern Black Mountains in arid Death Valley National Park (California). The area is unvegetated and therefore excellent for testing DEM analysis methods, because the ALSM data required minimal filtering and the resulting DEM contains relatively few unphysical sinks. Algorithms contained in geographic information systems (GIS) software used to extract stream networks from DEMs yield the best results where streams are steep enough for resolvable pixel-to-pixel elevation change and channel width is on the order of the pixel resolution. This presents a new challenge with ALSM-derived DEMs, because the pixel size (1 m) is often an order of magnitude or more smaller than the channel width. We find that the longitudinal profile of Gower Gulch in the northern Black Mountains (~4 km total length) extracted using the ALSM DEM and a flow accumulation algorithm is 14% longer than that from a traditional 10-m DEM, and 13% longer than a field survey. These differences in length (and therefore gradient) arise because the computed channel path follows small-scale topographic variations within the channel bottom that are not relevant during high flows. However, visual analysis of shaded-relief images created from high-resolution ALSM data is an excellent method for digitizing channel banks and thalweg paths. We used these lines to measure distance, elevation, and width. In Gower Gulch, the algorithm-derived profile is 10% longer than that

  18. Validation of DEMs Derived from High Resolution SAR Data: a Case Study on Barcelona

    NASA Astrophysics Data System (ADS)

    Sefercik, U. G.; Schunert, A.; Soergel, U.; Watanabe, K.

    2012-07-01

    In recent years, Synthetic Aperture Radar (SAR) data have been widely used for scientific applications, and several SAR missions have been realized. The active sensor principle and the signal wavelength in the order of centimeters provide all-day and all-weather capabilities, respectively. The modern German TerraSAR-X (TSX) satellite provides high spatial resolution down to one meter. Based on such data, SAR interferometry may yield high-quality digital surface models (DSMs), which include points located on 3d objects such as vegetation, forest, and elevated man-made structures. By removing these points, a digital elevation model (DEM) representing the bare ground of the Earth is obtained. The primary objective of this paper is the validation of DEMs obtained from TSX SAR data covering the Barcelona area, Spain, in the framework of a scientific project conducted by ISPRS Working Group VII/2 "SAR Interferometry", which aims at the evaluation of DEMs derived from data of modern SAR satellite sensors. Toward this purpose, a DSM was generated with 10 m grid spacing using TSX StripMap mode SAR data and converted to a DEM by filtering. The accuracy results are presented by comparison with a more accurate (10 cm-1 m) digital terrain model (DTM) derived from large-scale photogrammetry. The results showed that the TSX DEM is quite coherent with the topography and that its accuracy is within ±8-10 m. As another application, persistent scatterer interferometry (PSI) was conducted using TSX data and the outcomes were compared with a 3d city model available in Google Earth, which is known to be very precise because it is based on LIDAR data. The results showed that the PSI outcomes are quite coherent with the reference data and that the RMSZ of the differences is around 2.5 m.
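
    The reported bias and ±8-10 m accuracy band come from difference statistics against the reference DTM; a minimal sketch of such a comparison on co-registered grids (a plain pixel-wise comparison, ignoring the filtering and registration steps the paper performs):

```python
import numpy as np

def dem_accuracy(dem, reference):
    """Bias, standard deviation, and RMSE of a DEM against a more
    accurate reference DTM; both arrays must be co-registered and in
    the same vertical units (metres here)."""
    dz = dem - reference
    dz = dz[np.isfinite(dz)]          # drop nodata cells
    bias = dz.mean()
    std = dz.std(ddof=1)
    rmse = np.sqrt((dz ** 2).mean())
    return bias, std, rmse
```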

  19. A DEM-based partition adjustment for the interpolation of annual cumulative temperature in China

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Li, Fei; Fu, Haiyue; Tian, Ying; Hu, Zizhi

    2007-06-01

    The spatial interpolation of meteorological elements has important application value, and interpolation methods for air temperature data have been widely applied at large regional scales. Introducing altitude as a variable in the interpolation models has received increasing attention as a way to improve the interpolation precision of air temperature data. Over a large area, it is difficult to establish the relationship between annual cumulative temperature and altitude from the distribution of meteorological stations alone. By dividing the study area and applying DEM-modified interpolation models within the smaller sub-regions, the spatial interpolation precision of annual cumulative temperature can be effectively improved. The results show that, applied to the partitioned study area, the DEM-modified inverse distance squared method reduces the complexity of spatial data analysis in annual cumulative temperature interpolation. Partition interpolation takes into account factors that affect the interpolation results, such as the uneven spatial distribution of meteorological stations, altitude, and regional differences, and the methods are well suited to interpolation analysis over large regions. Compared with traditional interpolation methods such as kriging and inverse distance weighting, the DEM-modified inverse distance squared method achieves higher interpolation precision for annual cumulative temperature in China.
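
    A minimal sketch of DEM-modified inverse distance squared interpolation, assuming a single linear lapse rate; in the partitioned scheme described above, the elevation relationship would be fitted separately per sub-region. The lapse-rate value and all names are illustrative:

```python
import numpy as np

LAPSE = -0.0065  # assumed lapse rate, degC per metre; fit regionally in practice

def idw2_with_dem(xy_sta, t_sta, z_sta, xy_tgt, z_tgt, eps=1e-12):
    """Inverse-distance-squared interpolation with a DEM-based elevation
    correction: station values are reduced to sea level with a lapse
    rate, interpolated horizontally, then lifted back to the DEM
    elevation of each target point.

    xy_sta : (n, 2) station coordinates   t_sta, z_sta : (n,) values, elevations
    xy_tgt : (m, 2) target coordinates    z_tgt        : (m,) DEM elevations
    """
    t0 = t_sta - LAPSE * z_sta                        # reduce to sea level
    d2 = ((xy_tgt[:, None, :] - xy_sta[None, :, :]) ** 2).sum(-1)
    w = 1.0 / (d2 + eps)                              # inverse distance squared
    t_sl = (w * t0).sum(1) / w.sum(1)
    return t_sl + LAPSE * z_tgt                       # back to terrain height
```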

  20. Computational Evaluation of the Traceback Method

    ERIC Educational Resources Information Center

    Kol, Sheli; Nir, Bracha; Wintner, Shuly

    2014-01-01

    Several models of language acquisition have emerged in recent years that rely on computational algorithms for simulation and evaluation. Computational models are formal and precise, and can thus provide mathematically well-motivated insights into the process of language acquisition. Such models are amenable to robust computational evaluation,…

  2. TanDEM-X DEMs and feature-tracking of Helheim and Kangerdlugssuaq glaciers in south-east Greenland

    NASA Astrophysics Data System (ADS)

    Bevan, Suzanne; Luckman, Adrian; Murray, Tavi

    2013-04-01

    We use sequences of TanDEM-X acquisitions over the 'supersites' Helheim and Kangerdlugssuaq glaciers in south-east Greenland to generate interferometric digital elevation models (DEMs) and to feature-track surface displacement between image acquisitions. The high spatial resolution, day/night operation, and cloud-penetrating capability of the X-band SAR system enabled the production of more than 20 DEMs for each glacier with a spatial resolution of 8 m or better. The DEMs span the period June 2011 to March 2012, at 11-day intervals, with a few breaks. Time-lapse animations of Helheim DEMs reveal the development of troughs in surface elevation close to the front. The troughs propagate down-flow and develop into the rifts from which calving takes place. On both glaciers, regions of high variance in elevation, caused by the transit of crevasses, can be identified. In addition, on Helheim, a 1 km wide band of high variance adjacent to the calving front may be interpreted as the response to tidal forcing of a partially floating tongue. In addition to the DEMs we will also present feature-tracked, high-quality surface velocity fields at a spatial resolution of 2 m, coincident with the DEMs. On Helheim these velocity fields indicate a winter deceleration of less than 10% at a point 4 km behind the calving front.

  3. EarthEnv-DEM90: A nearly-global, void-free, multi-scale smoothed, 90m digital elevation model from fused ASTER and SRTM data

    NASA Astrophysics Data System (ADS)

    Robinson, Natalie; Regetz, James; Guralnick, Robert P.

    2014-01-01

    A variety of DEM products are available to the public at no cost, though all are characterized by trade-offs in spatial coverage, data resolution, and quality. The absence of a high-resolution, high-quality, well-described and vetted, free, global consensus product was the impetus for the creation of the new DEM product described here, 'EarthEnv-DEM90'. This new DEM is a compilation dataset constructed via rigorous techniques by which the ASTER GDEM2 and CGIAR-CSI v4.1 products were fused into a quality-enhanced, consistent grid of elevation estimates spanning ∼91% of the globe. EarthEnv-DEM90 was assembled using methods for seamlessly merging the input datasets, thoroughly filling voids, and smoothing data irregularities (e.g. those caused by DEM noise) out of the approximated surface. The result is a DEM product in which elevational artifacts are strongly mitigated in the data fusion zone, substantial voids are filled in the northernmost regions of the globe, and the entire DEM exhibits reduced terrain noise. As important as the final product is a well-defined methodology, along with new processing techniques and careful attention to final outputs, that extends the value and usability of the work beyond this single product. Finally, we outline EarthEnv-DEM90 acquisition instructions and metadata availability, so that researchers can obtain this high-resolution, high-quality, nearly global new DEM product for the study of wide-ranging global phenomena.

  4. Development of DEM formalism to modeling the dynamic response of brittle solids

    NASA Astrophysics Data System (ADS)

    Grigoriev, Aleksandr S.; Shilko, Eugeny V.; Psakhie, Sergey G.

    2016-11-01

    The paper presents a numerical model of the response of brittle materials to dynamic mechanical loading and an implementation of the model within the discrete element method (DEM), exemplified by the movable cellular automaton (MCA) method. Verification of the model was carried out by numerical modeling of uniaxial compression tests of concrete and sandstone samples at various strain rates. It is shown that the developed model is correct and adequately describes the behavior of brittle materials under dynamic loading.

  5. Reanalysis of the DEMS Nested Case-Control Study of Lung Cancer and Diesel Exhaust: Suitability for Quantitative Risk Assessment

    PubMed Central

    Crump, Kenny S; Van Landingham, Cynthia; Moolgavkar, Suresh H; McClellan, Roger

    2015-01-01

    The International Agency for Research on Cancer (IARC) in 2012 upgraded its hazard characterization of diesel engine exhaust (DEE) to “carcinogenic to humans.” The Diesel Exhaust in Miners Study (DEMS) cohort and nested case-control studies of lung cancer mortality in eight U.S. nonmetal mines were influential in IARC’s determination. We conducted a reanalysis of the DEMS case-control data to evaluate its suitability for quantitative risk assessment (QRA). Our reanalysis used conditional logistic regression and adjusted for cigarette smoking in a manner similar to the original DEMS analysis. However, we included additional estimates of DEE exposure and adjustment for radon exposure. In addition to applying three DEE exposure estimates developed by DEMS, we applied six alternative estimates. Without adjusting for radon, our results were similar to those in the original DEMS analysis: all but one of the nine DEE exposure estimates showed evidence of an association between DEE exposure and lung cancer mortality, with trend slopes differing only by about a factor of two. When exposure to radon was adjusted, the evidence for a DEE effect was greatly diminished, but was still present in some analyses that utilized the three original DEMS DEE exposure estimates. A DEE effect was not observed when the six alternative DEE exposure estimates were utilized and radon was adjusted. No consistent evidence of a DEE effect was found among miners who worked only underground. This article highlights some issues that should be addressed in any use of the DEMS data in developing a QRA for DEE. PMID:25857246

  6. LiDAR DEM for Slope regulations of land development in Taiwan

    NASA Astrophysics Data System (ADS)

    Liu, J.-K.; Yang, M.-S.; Wu, M.-C.; Hsu, W.-C.

    2012-04-01

    Slope gradient is a major parameter for regulating the development of slope-lands in Taiwan. According to official guidelines, only two methods can be adopted, namely the rectangular parcel method and the parcel contouring method. Both are manual methods using conventional analogue maps produced by photogrammetric methods. As the trend of technology favors adopting digital elevation models for automated production of slope maps, and complete coverage of the territory of Taiwan with DEMs in 40 m, 5 m, and 1 m grids has been mostly completed, it is necessary to assess how DEM approaches differ from the official approach, which until now has been recognized as the only legal procedure. Thus, a 1/1000 contour map of sloping land in a suburban area of New Taipei City is selected for this study. The manual approaches are carried out using the contour lines with 2 m intervals. DEM grids of 1 m, 5 m, and 10 m are generated by LiDAR survey. It is shown that the slope maps generated by the Eight Neighbors Unweighted method are comparable to, or even better than, those from the conventional approaches. As the conventional approach is prone to error propagation and uncertainty, the new digital approach should be implemented and enforced in the due process of law.
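
    One plausible reading of the "Eight Neighbors Unweighted" slope operator, assuming equal weights for all eight neighbours of the 3x3 window (Horn's widely used variant would instead double-weight the cardinal neighbours), is sketched below; the exact definition used by the official software is not given in the abstract:

```python
import numpy as np

def slope_8n_unweighted(dem, cell):
    """Slope (degrees) from a DEM using unweighted eight-neighbour
    finite differences: each of the three cells in the left/right
    columns (and top/bottom rows) contributes equally."""
    z = np.pad(dem, 1, mode="edge")
    dzdx = ((z[:-2, 2:] + z[1:-1, 2:] + z[2:, 2:]) -
            (z[:-2, :-2] + z[1:-1, :-2] + z[2:, :-2])) / (6.0 * cell)
    dzdy = ((z[2:, :-2] + z[2:, 1:-1] + z[2:, 2:]) -
            (z[:-2, :-2] + z[:-2, 1:-1] + z[:-2, 2:])) / (6.0 * cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
```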

  7. A comparison between DEM and MPM for the modeling of unsteady flow

    NASA Astrophysics Data System (ADS)

    Gracia, Fabio; Villard, Pascal; Richefeu, Vincent

    2017-06-01

    In order to provide a comprehensive comparison between two numerical methods currently employed in the modeling of rock avalanches, the Discrete Element Method (DEM) [3] and the Material Point Method (MPM) [1] were used to simulate mass propagation along a 45° plane transitioning to a horizontal plane. For the DEM, a 3D code using tetrahedral elements was employed, and the flow was channelized by means of frictionless walls. For the MPM simulations, a 2D code was developed and plane-strain simulations were run. Comparisons were made in terms of run-out distance and energy dissipated. The influence of parameters such as initial sample geometry, basal friction coefficient, and the shape of the blocks composing the sample was studied.

  8. [Epidemiological methods for evaluating screening programmes].

    PubMed

    Olsen, Jørn

    2014-06-09

    The effect of screening programmes must be estimated before the programmes are implemented. Usually, the evaluation includes randomized trials if possible, but even a large randomized trial will have limitations and need not estimate effects properly under routine conditions.

  9. Spotlight COSMO-SkyMed DEM generation and validation

    NASA Astrophysics Data System (ADS)

    Lombardi, N.; Lorusso, R.; Milillo, G.

    2016-10-01

    This paper focuses on the generation of Digital Elevation Models (DEMs) from COSMO-SkyMed Spotlight data. In particular, the peculiarity of Spotlight data, which are affected by Doppler centroid drift, is investigated, using the processing chain included in the Delft Object-oriented Radar Interferometric Software (DORIS [1]). The effect of incorrectly handled Doppler drift is shown: applying the standard interferometric processing, without Doppler drift handling, to Spotlight image pairs results in a loss of interferometric coherence in the interferograms away from the scene center. The standard processing chain has therefore been modified to take into account the Doppler centroid drift affecting Spotlight data, and DEMs of very high resolution and accuracy have been obtained. Several Spotlight image pairs have been processed, and the resulting DEMs are shown and analyzed, demonstrating the high level of detail and the accuracy of the products.

  10. Aquifer water abundance evaluation using a fuzzy- comprehensive weighting method

    NASA Astrophysics Data System (ADS)

    Wei, Z.

    2016-08-01

    Aquifer water abundance evaluation is a highly relevant issue that has been researched for many years. Despite prior research, problems with the conventional evaluation method remain. This paper establishes an aquifer water abundance evaluation method that combines fuzzy evaluation with a comprehensive weighting method, to overcome both the subjectivity and the lack of conformity involved in determining weights by pure data analysis alone. First, this paper introduces the principle of the fuzzy-comprehensive weighting method. Second, the example of well field no. 3 (of a coalfield) is used to illustrate the method's process. The evaluation results show that this method can more suitably meet the real requirements of aquifer water abundance assessment, leading to more precise and accurate evaluations. Ultimately, this paper provides a new method for aquifer water abundance evaluation.
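
    The core of a fuzzy comprehensive evaluation is the product of a weight vector with a membership matrix, B = W · R, followed by the maximum-membership principle. A toy example with purely illustrative numbers (not values from the paper):

```python
import numpy as np

W = np.array([0.35, 0.25, 0.25, 0.15])   # comprehensive weights of 4 indices
R = np.array([[0.1, 0.3, 0.6],           # membership of each index in the
              [0.2, 0.5, 0.3],           # grades weak / medium / strong
              [0.4, 0.4, 0.2],           # (each row sums to 1)
              [0.1, 0.6, 0.3]])

B = W @ R                                 # fuzzy evaluation vector
grade = ["weak", "medium", "strong"][int(B.argmax())]
print(B, "->", grade)                     # maximum-membership principle
```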

  11. Development of high-resolution coastal DEMs: Seamlessly integrating bathymetric and topographic data to support coastal inundation modeling

    NASA Astrophysics Data System (ADS)

    Eakins, B. W.; Taylor, L. A.; Warnken, R. R.; Carignan, K. S.; Sharman, G. F.

    2006-12-01

    The National Geophysical Data Center (NGDC), an office of the National Oceanic and Atmospheric Administration (NOAA), is cooperating with the NOAA Pacific Marine Environmental Laboratory (PMEL), Center for Tsunami Research to develop high-resolution digital elevation models (DEMs) of combined bathymetry and topography. The coastal DEMs will be used as input for the Method of Splitting Tsunami (MOST) model developed by PMEL to simulate tsunami generation, propagation and inundation. The DEMs will also be useful in studies of coastal inundation caused by hurricane storm surge and rainfall flooding, resulting in valuable information for local planners involved in disaster preparedness. We present our methodology for creating the high-resolution coastal DEMs, typically at 1/3 arc-second (10 meters) cell size, from diverse digital datasets collected by numerous methods, in different terrestrial environments, and at various scales and resolutions; one important step is establishing the relationships between various tidal and geodetic vertical datums, which may vary over a gridding region. We also discuss problems encountered and lessons learned, using the Myrtle Beach, South Carolina DEM as an example.

  12. Adaptive smoothing of valleys in DEMs using TIN interpolation from ridgeline elevations: An application to morphotectonic aspect analysis

    NASA Astrophysics Data System (ADS)

    Jordan, Gyozo

    2007-05-01

    This paper presents a smoothing method that eliminates valleys of various Strahler-order drainage lines from a digital elevation model (DEM), thus enabling the recovery of local and regional trends in a terrain. A novel method for automated extraction of a high-density channel network is developed to identify ridgelines, defined as the watershed boundaries of channel segments. A DEM is then calculated by TIN interpolation from the elevations of the digitally extracted ridgelines. This removes first-order watersheds from the DEM. Higher levels of DEM smoothing can be achieved by applying the method to ridgelines of higher-order channels. The advantage of the proposed smoothing method over the traditional moving-kernel, trend, and spectral methods is that it does not require pre-definition of smoothing parameters, such as kernel or trend parameters, and thus follows topography adaptively. Another advantage is that smoothing is controlled by the physical-hydrological properties of the terrain, as opposed to mathematical filters. The level of smoothing depends on ridgeline geometry and density and on the applied user-defined channel order. The method requires digital extraction of a high-density channel and ridgeline network. The advantage of the smoothing method over traditional methods is demonstrated through a case study of the Kali Basin test site in Hungary, where the method is used for aspect generalisation for morphotectonic investigations in a small watershed.

  13. Training Methods and Materials: An Evaluation.

    ERIC Educational Resources Information Center

    Carter, Alex; Parker, Bill

    1993-01-01

    A survey of business, industry, government, medicine, and higher education investigated and ranked methods and materials for teaching factual and visual information; principles, concepts, and rules; procedures; motor skills; and attitudes, opinions, and motivation. Findings show individual exercises to be the most effective method and interactive video to…

  14. Simplified method for video performance evaluation

    NASA Astrophysics Data System (ADS)

    Harshbarger, John H.

    1996-04-01

    Meaningful performance evaluation of video equipment can be complex, requiring specialized equipment whose results must be interpreted by technically trained operators. The alternative has been to attempt evaluation by visual inspection of patterns such as the SMPTE RP-133 Medical Imaging Standard; however, this involves subjective interpretation and does not indicate the point in a system at which degradation has occurred. The video waveform of such a pattern is complex and not suitable for quantitative analysis. The principal factors that influence the quality of a video image on a day-to-day basis are resolution, gray scale, and color, if employed. If these qualities are transmitted and displayed without degradation beyond acceptable limits, suitable performance is assured. Performance evaluation by inspection of the image produced on a video display monitor is subject to interpretation; this is resolved by inserting, at the display, the original 'perfect' electronically generated waveform to serve as a reference. Thus the viewer has a specific visual comparison as the basis for performance evaluation. Another valuable feature of the test-pattern insert is that a test segment can be placed on recorded images. Thus each image recalled from tape playback or from digital storage will carry an integral means of quality assurance.

  15. GENERAL METHODS FOR REMEDIAL PERFORMANCE EVALUATIONS

    EPA Science Inventory

    This document was developed by an EPA-funded project to explain technical considerations and principles necessary to evaluate the performance of ground-water contamination remediations at hazardous waste sites. This is neither a "cookbook", nor an encyclopedia of recommended fi...

  16. A Ranking Method for Evaluating Constructed Responses

    ERIC Educational Resources Information Center

    Attali, Yigal

    2014-01-01

    This article presents a comparative judgment approach for holistically scored constructed response tasks. In this approach, the grader rank orders (rather than rate) the quality of a small set of responses. A prior automated evaluation of responses guides both set formation and scaling of rankings. Sets are formed to have similar prior scores and…

  17. Evaluation of Alternative Methods for Wastewater Disinfection

    DTIC Science & Technology

    1994-09-01

    Sulfur dioxide, sodium metabisulfite, and sodium bisulfite are used for dechlorinating chlorinated effluents, but sulfur dioxide is the favored candidate for... Sodium metabisulfite and sodium bisulfite are safe substitutes for sulfur dioxide and are used in most small facilities. These solid dechlorination materials are... [Decision-flowchart fragment: chlorine-induced toxicity to aquatic life (TRC limits, chlorinated compounds) leads to evaluation of alternate disinfection technologies and dechlorination techniques.]

  20. Multi-Method Evaluation of College Teaching

    ERIC Educational Resources Information Center

    Algozzine, Bob; Beattie, John; Bray, Marty; Flowers, Claudia; Gretes, John; Mohanty, Ganesh; Spooner, Fred

    2010-01-01

    Student evaluation of instruction in college and university courses has been a routine and mandatory part of undergraduate and graduate education for some time. A major shortcoming of the process is that it relies exclusively on the opinions or qualitative judgments of students rather than on assessing the learning or transfer of knowledge that…

  1. Advanced reliability methods for structural evaluation

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.; Wu, Y.-T.

    1985-01-01

    Fast probability integration (FPI) methods, which can yield approximate solutions to such general structural reliability problems as the computation of the probabilities of complicated functions of random variables, are known to require one-tenth the computer time of Monte Carlo methods for a probability level of 0.001; lower probabilities yield even more dramatic differences. A strategy is presented in which a computer routine is run k times with selected perturbed values of the variables to obtain k solutions for a response variable Y. An approximating polynomial is fit to the k 'data' sets, and FPI methods are employed for this explicit form.
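
    A minimal sketch of the described strategy: run the expensive routine k times with perturbed inputs, fit an explicit polynomial to the k responses, then evaluate the failure probability cheaply on the surrogate. Plain Monte Carlo stands in here for the FPI step, and the limit-state function is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def run_structural_code(x):
    """Stand-in for an expensive structural analysis; the hypothetical
    limit state g(x) > 0 means 'safe'."""
    return 3.0 - x[0] ** 2 + 0.5 * x[1]

k, dim = 9, 2
X = rng.normal(size=(k, dim))                       # k perturbed input sets
Y = np.array([run_structural_code(x) for x in X])   # k responses

# quadratic response surface (no cross terms, for brevity), least squares
A = np.column_stack([np.ones(k), X, X ** 2])
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)

# the explicit polynomial is now cheap to evaluate; the paper would apply
# FPI to it, plain Monte Carlo is shown instead to illustrate the idea
S = rng.normal(size=(100_000, dim))
g = coef[0] + S @ coef[1:1 + dim] + (S ** 2) @ coef[1 + dim:]
print(f"estimated P(g < 0) = {np.mean(g < 0.0):.4f}")
```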

  2. A hybrid FEM-DEM approach to the simulation of fluid flow laden with many particles

    NASA Astrophysics Data System (ADS)

    Casagrande, Marcus V. S.; Alves, José L. D.; Silva, Carlos E.; Alves, Fábio T.; Elias, Renato N.; Coutinho, Alvaro L. G. A.

    2017-04-01

    In this work we contribute to the study of particle-laden fluid flows at scales smaller than those of two-fluid models (TFM). The hybrid model is based on a Lagrangian-Eulerian approach: a Lagrangian description is used for the particle system, employing the discrete element method (DEM), while a fixed Eulerian mesh is used for the fluid phase, modeled by the finite element method (FEM). The resulting coupled DEM-FEM model is integrated in time with a subcycling scheme. This scheme is applied to the simulation of a seabed current to analyze which mechanisms lead to the emergence of bedload transport and sediment suspension, and to quantify the effective viscosity of the seabed in comparison with the ideal no-slip wall condition. A simulation of a salt plume falling in a fluid column is also performed, comparing the main characteristics of the system with an experiment.
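
    The subcycling idea can be sketched in a few lines; every object and method name below is an illustrative placeholder rather than the authors' API. The rationale is that the stable DEM time step, set by the contact stiffness, is usually much smaller than the fluid step, so several particle substeps are taken per FEM solve:

```python
def advance_coupled(fluid, particles, dt_fluid, n_sub):
    """One coupled step with DEM subcycling (hypothetical interfaces)."""
    fluid.solve(dt_fluid)                       # FEM step on the Eulerian mesh
    drag = fluid.drag_on_particles(particles)   # fluid -> solid coupling
    dt_dem = dt_fluid / n_sub
    for _ in range(n_sub):                      # n_sub small DEM steps
        particles.step(dt_dem, external_forces=drag)
    fluid.set_solid_feedback(particles)         # solid -> fluid coupling
```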

  3. Coupling Soil Water Movement and Discrete Element Method for Evaluating the Effects of Shrinkage Cracking on Soil Hydraulic Properties

    NASA Astrophysics Data System (ADS)

    Jabakhanji, R.

    2010-12-01

    Due to the heterogeneous nature of the soil medium and the dynamic relationship between structure, function, and water movement, soil-water movement phenomena are complex systems whose emergent behavior varies across spatiotemporal scales of observation. Understanding how information is transferred from one scale to another is essential to produce accurate hydrologic and transport models and for proper scaling and integration of processes, constitutive models, and parameters from various measurement scales. We propose a modeling approach that couples soil water movement with the mechanical deformations it induces. The aim is to capture the formation of shrinkage and/or swelling cracks, and to track them, in order to evaluate their effect on the hydraulic properties of the soil observed at the field scale compared with the properties determined at the laboratory scale. This approach is based on the Pedostructure soil-water model proposed by Braudeau et al. and on a discrete representation of the soil medium. The latter is shared by a discrete element method (DEM) mechanical model and a water movement model represented as a network of reservoirs and connecting pipes. Moisture flows from one reservoir to another depending on the potential difference between the reservoirs and the conductivity of the connecting pipe. The water potentials and conductivities, as well as the volume of each reservoir, are then updated according to the Pedostructure model. The volumetric strains induced by this water movement feed into the mechanical DEM model, and the forces between the reservoirs are calculated. If a contact force reaches failure, the contact is severed and moisture exchange through this pipe stops, in turn altering the path of the subsequent mechanical steps. We will present preliminary results showing promising agreement between the discrete water movement model and existing experimental data in determining the evolution of the soil moisture profile.
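
    A minimal sketch of one explicit step of the reservoir-and-pipe water movement network, assuming symmetric pipe conductivities; the Pedostructure update of potentials and conductivities, and the coupled DEM step, are not shown. All names are illustrative:

```python
import numpy as np

def step_network(h, C, volumes, dt):
    """One explicit step of a reservoir-and-pipe water network.

    h       : (n,) water potentials of the reservoirs
    C       : (n, n) symmetric pipe conductivities; a zero entry means
              no pipe (or a pipe severed by a crack in the DEM step)
    volumes : (n,) reservoir water volumes
    Flux from reservoir j into i is C[i, j] * (h[j] - h[i]); symmetry of
    C makes the scheme mass-conserving.
    """
    flux_in = C * (h[None, :] - h[:, None])   # pairwise fluxes
    dV = flux_in.sum(axis=1) * dt             # net inflow per reservoir
    return volumes + dV
```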

  4. Evaluation of temperament scoring methods for beef cattle

    USDA-ARS?s Scientific Manuscript database

    The objective of this study was to evaluate methods of temperament scoring. Crossbred (n=228) calves were evaluated for temperament by an individual evaluator at weaning by two methods of scoring: 1) pen score (1 to 5 scale, with higher scores indicating increasing degree of nervousness, aggressiven...

  5. Land management planning: a method of evaluating alternatives

    Treesearch

    Andres Weintraub; Richard Adams; Linda Yellin

    1982-01-01

    A method is described for developing and evaluating alternatives in land management planning. A structured set of 15 steps provides a framework for such an evaluation when multiple objectives and uncertainty must be considered in the planning process. The method is consistent with other processes used in organizational evaluation, and allows for the interaction of...

  6. Automatic Delineation of Sea-Cliff Limits Using Lidar-Derived High-Resolution DEMs in Southern California

    NASA Astrophysics Data System (ADS)

    Palaseanu, M.; Danielson, J.; Foxgrover, A. C.; Barnard, P.; Thatcher, C.; Brock, J. C.

    2014-12-01

    Sea-cliff erosion is a serious hazard with implications for coastal management, and is often estimated using successive hand-digitized cliff tops or bases (toes) to assess cliff retreat. Traditionally the recession of the cliff top or cliff base is obtained from aerial photographs, topographic maps, or in situ surveys. Irrespective of how or what is measured to characterize cliff erosion, the positions of the cliff top and cliff base are important. Habitually, the cliff top and base are hand digitized, even when using high-resolution lidar-derived DEMs. Even when efforts are made to standardize and eliminate as much digitizing subjectivity as possible, the delineation of cliffs is time consuming and depends on the analyst's interpretation. We propose an automatic procedure to delineate the cliff top and base from high-resolution bare-earth DEMs. The method is based on bare-earth high-resolution DEMs, generalized coastal shorelines, and approximate measurements of the distance between the shoreline and the cliff top. The method generates orthogonal transects and profiles with a minimum spacing equal to the DEM resolution and extracts, for each profile, xyz coordinates of the cliff top and toe, as well as the second major positive and negative inflections (second top and toe) along the profile. The differences between the automated and digitized top and toe, respectively, are smaller than the DEM error margin for over 82% of the top points and 86% of the toe points along a stretch of coast in Del Mar, CA. The larger errors were due either to failure to remove all vegetation from the bare-earth DEM or to errors of interpretation during hand digitizing. The automatic method was further applied between Point Conception and Los Angeles Harbor, CA. This automatic method is repeatable, takes full advantage of the bare-earth high resolution, and is more efficient.
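
    One common way to automate the top/toe pick on each transect profile is to take extrema of profile curvature; the sketch below illustrates that heuristic and is not necessarily the authors' exact rule:

```python
import numpy as np

def cliff_top_toe(dist, elev):
    """Locate candidate cliff top and toe along a shore-normal profile
    as the points of extreme curvature (second derivative of elevation
    with respect to along-profile distance). Arrays are 1-D, ordered
    seaward to landward; a common heuristic, not a definitive rule."""
    d2z = np.gradient(np.gradient(elev, dist), dist)
    toe = int(np.argmax(d2z))   # strongest concave-up bend (cliff base)
    top = int(np.argmin(d2z))   # strongest convex bend (cliff top)
    return top, toe
```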

  7. Data Collection Methods for Evaluating Museum Programs and Exhibitions

    ERIC Educational Resources Information Center

    Nelson, Amy Crack; Cohn, Sarah

    2015-01-01

    Museums often evaluate various aspects of their audiences' experiences, be it what they learn from a program or how they react to an exhibition. Each museum program or exhibition has its own set of goals, which can drive what an evaluator studies and how an evaluation evolves. When designing an evaluation, data collection methods are purposefully…

  9. Finding the service you need: human centered design of a Digital Interactive Social Chart in DEMentia care (DEM-DISC).

    PubMed

    van der Roest, H G; Meiland, F J M; Haaker, T; Reitsma, E; Wils, H; Jonker, C; Dröes, R M

    2008-01-01

    Community-dwelling people with dementia and their informal carers experience many problems. In the course of the disease process, people with dementia become more dependent on others, and professional help is often necessary. Many informal carers and people with dementia experience unmet needs with regard to information on the disease and on the available care and welfare offer; therefore they tend not to utilize the broad spectrum of available care and welfare services. This can have very negative consequences, such as unsafe situations, social isolation of the person with dementia, and overburdening of informal carers, with a consequent increased risk of illness for them. The development of a DEMentia-specific Digital Interactive Social Chart (DEM-DISC) may counteract these problems. DEM-DISC is a demand-oriented website for people with dementia and their carers that is easily accessible and provides users with customized information on healthcare and welfare services. DEM-DISC was developed according to human-centered design principles; this means that people with dementia, informal carers, and healthcare professionals were involved throughout the development process. This paper describes the development of DEM-DISC from four perspectives: a domain-specific content perspective, an ICT perspective, a user perspective, and an organizational perspective. The aims and the most important results from each perspective are discussed. It is concluded that human-centered design was a valuable method for the development of DEM-DISC.

  10. Comparison of elevation derived from insar data with dem from topography map in Son Dong, Bac Giang, Viet Nam

    NASA Astrophysics Data System (ADS)

    Nguyen, Duy

    2012-07-01

    Digital Elevation Models (DEMs) are used in many applications in the earth sciences, such as topographic mapping, environmental modeling, rainfall-runoff studies, landslide hazard zonation, and seismic source modeling. During the last years, a multitude of scientific applications of Synthetic Aperture Radar Interferometry (InSAR) techniques has evolved. It has been shown that InSAR is an established technique for generating high-quality DEMs from spaceborne and airborne data, and that it has advantages over other methods for the generation of large-area DEMs. However, the processing of InSAR data is still a challenging task. This paper describes the InSAR operational steps and processing chain for DEM generation from Single Look Complex (SLC) SAR data and compares a satellite SAR estimate of surface elevation with a DEM from a topographic map. The operational steps are performed in three major stages: data search, data processing, and product validation. The data processing stage is further divided into five steps: data pre-processing, co-registration, interferogram generation, phase unwrapping, and geocoding. The processing steps have been tested with ERS-1/2 data using the Delft Object-oriented Interferometric (DORIS) InSAR processing software. Results of applying the described processing steps to a real data set are presented.

  11. Animal Methods for Evaluating Forage Quality

    USDA-ARS?s Scientific Manuscript database

    Numerous methods are available that employ animals in the assessment of forage quality. Some of these procedures provide information needed to address very specific goals (e.g., monitoring protein adequacy), some serve as useful contributors to the efforts to accurately predict nutritive value, wher...

  12. Empirical methods in the evaluation of estimators

    Treesearch

    Gerald S. Walton; C.J. DeMars; C.J. DeMars

    1973-01-01

    The authors discuss the problem of selecting estimators of density and survival by making use of data on a forest-defoliating larva, the spruce budworm. Various estimators are compared. The results show that, among the estimators considered, ratio-type estimators are superior in terms of bias and variance. The methods used in making comparisons, particularly simulation...

  13. Evaluation of Electrochemical Methods for Electrolyte Characterization

    NASA Technical Reports Server (NTRS)

    Heidersbach, Robert H.

    2001-01-01

    This report documents summer research efforts in an attempt to develop an electrochemical method of characterizing electrolytes. The ultimate objective of the characterization would be to determine the composition and corrosivity of Martian soil. Results are presented using potentiodynamic scans, Tafel extrapolations, and resistivity tests in a variety of water-based electrolytes.

  14. Test methods for evaluating reformulated fuels

    SciTech Connect

    Croudace, M.C.

    1994-12-31

    The US Environmental Protection Agency (EPA) introduced regulations in the 1989 Clean Air Act Amendment governing the reformulation of gasoline and diesel fuels to improve air quality. These statutes drove the need for a fast and accurate method for analyzing product composition, especially aromatic and oxygenate content. The current method, gas chromatography, is slow, expensive, not portable, and requires a trained chemist to perform the analysis. The new mid-infrared spectroscopic method uses light to identify and quantify the different components in fuels. Each individual fuel component absorbs a specific wavelength of light depending on the molecule's unique chemical structure. The quantity of light absorbed is proportional to the concentration of that fuel component in the mixture. The mid-infrared instrument has significant advantages; it is easy to use, rugged, portable, fully automated, and cost effective. It can be used to measure multiple oxygenate or aromatic components in unknown fuel mixtures. Regulatory agencies have begun using this method in field compliance testing; petroleum refiners and marketers use it to monitor compliance, product quality, and blending accuracy.

  15. Stress analysis during slope failure from DEM simulations

    NASA Astrophysics Data System (ADS)

    Katz, O.; Morgan, J. K.

    2012-04-01

    We used Discrete Element Method (DEM) simulations to study the initiation and evolution of landsliding, with a focus on the development and propagation of the sliding plane and on the effects of material strength on the behavior of the slope material during landsliding. Our simulated slopes were constructed of homogeneous materials, settled under gravity, bonded, and excavated to produce 70° slopes of 1050 m in height. Nine simulations were carried out, each using a different value of cohesion, ranging from 0.7 to 4.2 MPa (quantified through DEM direct-shear simulations on representative materials). In each of our simulations, failure initiated at the foot of the slope, accompanied by disintegration of the slope material. Failure then propagated upward to the slope crest with further material disintegration. A discrete detachment surface formed below the disintegrated material. Downslope movement of the failed material (i.e. landsliding) occurred only after the failure plane intersected the upper slope face. By the end of landsliding, the disintegrated slope material formed a talus-like deposit at the foot of the slope. The value of initial material cohesion influenced the nature of the landslide deposit and its dimensions. Higher material strengths produced smaller landslides, as well as discrete landslide blocks that originated from the shallow slopes and became entrained within the finer talus. Stress analysis of the slope failure process clarifies how failure initiates and landsliding evolves, and further constrains the limiting failure criteria that define each simulated material. The local proximity to failure throughout the slope can be tracked during the simulation, revealing that high failure potential (high shear stress relative to mean stress) exists at the toe of the slope immediately following excavation. As material disintegrates near the toe of the slope, high tensile stresses develop in the overlying mass, causing the break
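
    The abstract quantifies failure potential as shear stress relative to mean stress; a closely related, standard Mohr-Coulomb proximity-to-failure ratio, shown here only as an illustration of the kind of measure used (not necessarily the authors' exact definition), is:

$$ \mathrm{PF} = \frac{\tau}{c + \sigma_{n} \tan\varphi} $$

    where $\tau$ is the acting shear stress on a plane, $\sigma_{n}$ the normal stress, $c$ the cohesion, and $\varphi$ the friction angle; values of PF approaching 1 indicate incipient failure.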

  16. Performance Evaluation Methods for Assistive Robotic Technology

    NASA Astrophysics Data System (ADS)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  17. Development of an unresolved CFD-DEM model for the flow of viscous suspensions and its application to solid-liquid mixing

    NASA Astrophysics Data System (ADS)

    Blais, Bruno; Lassaigne, Manon; Goniva, Christoph; Fradette, Louis; Bertrand, François

    2016-08-01

    Although viscous solid-liquid mixing plays a key role in the industry, the vast majority of the literature on the mixing of suspensions is centered around the turbulent regime of operation. However, the laminar and transitional regimes face considerable challenges. In particular, it is important to know the minimum impeller speed (Njs) that guarantees the suspension of all particles. In addition, local information on the flow patterns is necessary to evaluate the quality of mixing and identify the presence of dead zones. Multiphase computational fluid dynamics (CFD) is a powerful tool that can be used to gain insight into local and macroscopic properties of mixing processes. Among the variety of numerical models available in the literature, which are reviewed in this work, unresolved CFD-DEM, which combines CFD for the fluid phase with the discrete element method (DEM) for the solid particles, is an interesting approach due to its accurate prediction of the granular dynamics and its capability to simulate large numbers of particles. In this work, the unresolved CFD-DEM method is extended to viscous solid-liquid flows. Different solid-liquid momentum coupling strategies, along with their stability criteria, are investigated and their accuracies are compared. Furthermore, it is shown that an additional sub-grid viscosity model is necessary to ensure the correct rheology of the suspensions. The proposed model is used to study solid-liquid mixing in a stirred tank equipped with a pitched blade turbine. It is validated qualitatively by comparing the particle distribution against experimental observations, and quantitatively by comparing the fraction of suspended solids with results obtained via the pressure gauge technique.

  18. The Study on Educational Technology Abilities Evaluation Method

    NASA Astrophysics Data System (ADS)

    Jing, Duan

    Traditional evaluation methods often fail to measure what a test is actually intended to measure, so the test results cannot serve as a sound basis for evaluation, and the weight naturally given to them is of questionable value. The system described here makes full use of educational technology and is grounded in educational and psychological theory; it is based on the objects and tools of evaluation, takes the evaluation of the educational technology abilities of primary and secondary school teachers as its goal, and uses a variety of evaluation methods to establish, from various angles, an informal evaluation system.

  19. Evaluation of toothbrush disinfection via different methods.

    PubMed

    Basman, Adil; Peker, Ilkay; Akca, Gulcin; Alkurt, Meryem Toraman; Sarikir, Cigdem; Celik, Irem

    2016-01-01

    The aim of this study was to compare the efficacy of using a dishwasher or different chemical agents, including 0.12% chlorhexidine gluconate, 2% sodium hypochlorite (NaOCl), a mouthrinse containing essential oils and alcohol, and 50% white vinegar, for toothbrush disinfection. Sixty volunteers were divided into five experimental groups and one control group (n = 10). Participants brushed their teeth using toothbrushes with standard bristles, and they disinfected the toothbrushes according to instructed methods. Bacterial contamination of the toothbrushes was compared between the experimental groups and the control group. Data were analyzed by Kruskal-Wallis and Duncan's multiple range tests, with 95% confidence intervals for multiple comparisons. Bacterial contamination of toothbrushes from individuals in the experimental groups differed from those in the control group (p < 0.05). The most effective method for elimination of all tested bacterial species was 50% white vinegar, followed in order by 2% NaOCl, mouthrinse containing essential oils and alcohol, 0.12% chlorhexidine gluconate, dishwasher use, and tap water (control). The results of this study show that the most effective method for disinfecting toothbrushes was submersion in 50% white vinegar, which is cost-effective, easy to access, and appropriate for household use.

  20. [Evaluation of Wits appraisal with superimposition method].

    PubMed

    Xu, T; Ahn, J; Baumrind, S

    1999-07-01

    To compare the conventional Wits appraisal with a superimposition-based Wits appraisal in evaluating the change in sagittal jaw relationship between pre- and post-orthodontic treatment. The sample consists of pre- and post-treatment lateral head films of 48 cases. Computerized digitizing was used to obtain the cephalometric landmarks and to measure the conventional Wits value, the superimposed Wits value, and the ANB angle. Correlation analysis among these three measures was performed with the SAS statistical package. The change in ANB angle correlates more highly with the change in the superimposed Wits than with that of the conventional Wits; the r-value is as high as 0.849 (P < 0.001). The superimposed Wits appraisal reflects the change in sagittal jaw relationship more objectively than the conventional one.

  1. Experimental dem Extraction from Aster Stereo Pairs and 3d Registration Based on Icesat Laser Altimetry Data in Upstream Area of Lambert Glacier, Antarctica

    NASA Astrophysics Data System (ADS)

    Hai, G.; Xie, H.; Chen, J.; Chen, L.; Li, R.; Tong, X.

    2017-09-01

    DEM extraction from ASTER stereo pairs and three-dimensional registration with reference to ICESat laser altimetry data are carried out in the upstream area of the Lambert Glacier, East Antarctica. Since the study area is located in the interior of East Antarctica, where little image texture exists, registration between the DEM and the ICESat data is performed. First, the ASTER DEM generation is based on the rational function model (RFM), and the procedure includes: a) rational polynomial coefficient (RPC) computation from ASTER metadata; b) de-noising and destriping of the L1A image product; c) local histogram equalization and matching; d) manual collection of tie points and bundle adjustment; and e) coarse-to-fine hierarchical matching over five levels and grid matching. The matching results are filtered semi-automatically, and the DEM is then interpolated with a spline method from ground points converted from the matched points. Second, the generated ASTER DEM is registered to the ICESat data in three-dimensional space by a least-squares rigid transformation computed with singular value decomposition (SVD). The process consists of: a) selection of corresponding terrain feature points from the ICESat and DEM profiles; and b) rigid transformation of the generated ASTER DEM using the selected feature correspondences, based on the least-squares technique. The registration shows good results: the elevation difference between the DEM and the ICESat data is low, with a mean value of less than 2 meters and a standard deviation of around 7 meters. This DEM is generated and specifically registered in a typical Antarctic region without obvious ground rock control points, and it serves as true-terrain input for further radar altimetry simulation.
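
    The least-squares rigid transformation via SVD mentioned in the abstract is the classic Kabsch procedure; a minimal numpy sketch (array names are illustrative, not from the paper):

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (R, t) aligning point set P to Q via
    SVD (Kabsch): minimises sum ||R @ P_i + t - Q_i||^2.

    P, Q : (n, 3) arrays of corresponding feature points, e.g. terrain
           features picked from the DEM and from ICESat profiles.
    """
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)               # 3x3 cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                      # proper rotation (det = +1)
    t = cQ - R @ cP
    return R, t

# usage: R, t = rigid_fit(dem_points, icesat_points)
#        dem_registered = dem_points @ R.T + t
```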

  2. A global vegetation corrected SRTM DEM for use in hazard modelling

    NASA Astrophysics Data System (ADS)

    Bates, P. D.; O'Loughlin, F.; Neal, J. C.; Durand, M. T.; Alsdorf, D. E.; Paiva, R. C. D.

    2015-12-01

    We present the methodology and results from the development of a near-global 'bare-earth' Digital Elevation Model (DEM) derived from Shuttle Radar Topography Mission (SRTM) data. Digital Elevation Models are the most important input for hazard modelling, as DEM quality governs the accuracy of the model outputs. While SRTM is currently the best near-globally [60N to 60S] available DEM, it requires adjustments to reduce vegetation contamination and make it useful for hazard modelling over heavily vegetated areas (e.g. tropical wetlands). Unlike previous methods of accounting for vegetation contamination, which concentrated on correcting relatively small areas and usually applied a static adjustment, we account for vegetation contamination globally and apply a spatially varying correction based on information about canopy height and density. Our new 'bare-earth' SRTM DEM combines multiple remote sensing datasets, including ICESat GLA14 ground elevations, the vegetation continuous field dataset as a proxy for the penetration depth of SRTM, and a global vegetation height map, to remove the vegetation artefacts present in the original SRTM DEM. In creating the final 'bare-earth' SRTM DEM dataset, we produced three different 'bare-earth' SRTM products. The first applies global parameters, while the second and third apply parameters regionalised by climatic zone or vegetation type, respectively. We also tested two canopy density proxies of different spatial resolution. Using ground elevations from the ICESat GLA14 satellite altimeter, we calculate the residual errors for the raw SRTM and the three 'bare-earth' SRTM products and compare their performance. The three 'bare-earth' products all show large improvements over the raw SRTM in vegetated areas, with the overall mean bias reduced by between 75 and 92%, from 4.94 m to as low as 0.40 m, and the overall standard deviation reduced by between 29 and 33%, from 7.12 m to as low as 4.80 m.
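
    A minimal sketch of a spatially varying correction of this kind, assuming co-registered rasters of SRTM elevation, canopy height, and vegetation continuous field (VCF) density; the scaling constant is illustrative, not the paper's calibrated parameter set.

      import numpy as np

      def bare_earth(srtm, canopy_height, vcf, k=0.6):
          """Remove a vegetation artefact that grows with canopy height and
          density: denser canopy -> shallower SRTM penetration."""
          penetration = 1.0 - k * (vcf / 100.0)   # fraction of canopy penetrated
          artefact = canopy_height * (1.0 - penetration)
          return srtm - artefact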

  3. Dem Retrieval And Ground Motion Monitoring In China

    NASA Astrophysics Data System (ADS)

    Gatti, Guido; Perissin, Daniele; Wang, Teng; Rocca, Fabio

    2010-10-01

    This paper considers topographic measurement and analysis based on multi-baseline Synthetic Aperture Radar data. In 2009, the ongoing work focused on taking advantage of Permanent Scatterers (PS) Interferometry to estimate terrain elevation and ground motion in non-urban contexts. An adapted version of the method, the Quasi-PS (QPS) technique, has been used in order to exploit distributed-target information. One of the analyzed datasets covers the mountainous area around Zhangbei, Hebei Province, from which a geocoded Digital Elevation Model (DEM) has been retrieved. Regarding ground motion monitoring, our attention focused on two different areas. The first is a small area near the Three Gorges Dam, in which ground deformations have been identified and measured. The second is the western part of the municipality of Shanghai, centered on a straight railway line. The subsidence in that zone has been measured, and the interferometric coherence of the railway has been studied under the hypothesis of spatial and temporal stability of this kind of target.

  4. DEM Simulation of Particle Clogging in Fiber Filtration

    NASA Astrophysics Data System (ADS)

    Tao, Ran; Yang, Mengmeng; Li, Shuiqing

    2015-11-01

    The formation of porous particle deposits plays a crucial role in determining the efficiency of the filtration process. In this work, an adhesive discrete element method (DEM), in combination with CFD, is developed to dynamically describe these porous deposit structures and the changed flow field between two parallel fibers under periodic boundary conditions. For the first time, it is clarified that the structures of clogged particles depend on both the adhesion parameter (defined as the ratio of interparticle adhesion to particle inertia) and the Stokes number (an index of impaction efficiency). The relationship between the pressure-drop gradient and the coordination number over the filtration time is explored, which can be used to quantitatively classify the different filtration regimes, i.e., the clean filter stage, the clogging stage, and the cake filtration stage. Finally, we investigate the influence of the fiber separation distance on the particle clogging behavior, which significantly affects the collecting efficiency of the fibers. The results suggest that changing the arrangement of fibers can improve filter performance. This work has been funded by the National Key Basic Research and Development Program (2013CB228506).
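
    The two dimensionless groups named above can be computed as follows; these are common textbook forms (the authors' exact definitions are not reproduced in the abstract) and all material values are assumed.

      rho_p = 2000.0   # particle density, kg/m^3 (assumed)
      d_p   = 2e-6     # particle diameter, m (assumed)
      U     = 0.5      # gas velocity, m/s (assumed)
      mu    = 1.8e-5   # air dynamic viscosity, Pa*s
      d_f   = 20e-6    # fibre diameter, m (assumed)
      gamma = 0.02     # surface energy, J/m^2 (assumed)

      stokes = rho_p * d_p**2 * U / (18 * mu * d_f)   # impaction efficiency index
      adhesion = gamma / (rho_p * U**2 * d_p)         # adhesion vs. particle inertia
      print(f"St = {stokes:.3f}, Ad = {adhesion:.2f}")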

  5. Micromechanics of non-active clays in saturated state and DEM modelling

    NASA Astrophysics Data System (ADS)

    Pagano, Arianna Gea; Tarantino, Alessandro; Pedrotti, Matteo; Magnanimo, Vanessa; Windows-Yule, Kit; Weinhart, Thomas

    2017-06-01

    The paper presents a conceptual micromechanical model for the 1-D compression behaviour of non-active clays in the saturated state. An experimental investigation was carried out on kaolin clay samples saturated with fluids of different pH and dielectric permittivity, and the effect of pore fluid characteristics on the one-dimensional compressibility of kaolin was investigated. A three-dimensional Discrete Element Method (DEM) model was implemented in order to simulate the response of saturated kaolin observed during the experiments. A complex contact model was introduced that considers both the mechanical and the physico-chemical microscopic interactions between clay particles. As a preliminary step in the DEM study, a simple analysis in the elastic regime was performed with spherical particles only.

  6. Coupled DEM-CFD Investigation of Granular Transport in a Fluid Channel

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Dai, F.; Xu, N. W.

    2015-09-01

    This paper presents three-dimensional numerical investigations of granular transport in fluids, analysed by the Discrete Element Method (DEM) coupled with Computational Fluid Dynamics (CFD). Employing this model, the relationship between flow velocity and granular depositional morphology has been clarified: the larger the flow velocity, the farther the grains are transported. In this process, the segregation of solid grains is clearly identified. The research reveals that coarse grains normally accumulate near the grain source region, while fine grains can be transported to the flow front. Regardless of the flow velocities used in these simulations, the intensity of grain segregation remains almost unchanged. The results obtained from the coupled DEM-CFD simulations can reasonably explain grain transport processes that occur in natural environments, such as river scouring, the evolution of river/ocean floors, deserts, and submarine landslides.

  7. Simulation of Roasting Metallurgical Concentrates in Fluidized Bed Using CFD-DEM

    NASA Astrophysics Data System (ADS)

    Beloglazov, I. I.; Kuskova, Y. V.

    2017-07-01

    In this study, we utilized multiphase computational fluid dynamics (CFD) coupled with the discrete element method (DEM). The effect of the kinetic parameters on the roasting process in a fluidized bed was investigated. Our results indicate that it is possible to numerically integrate the coupled CFD-DEM system without significantly increasing computational overhead. It is also clear, however, that reactor operating conditions, reaction kinetics, and multiphase flow dynamics have major impacts on the roasting products exiting the reactor. We find that, with the same pre-exponential factors and mean activation energies, including distributed activation energies in the kinetics can shift the predicted average state of the exiting gas-solid phase and its statistical distribution, compared to single-valued activation-energy kinetics. These findings imply that accurate resolution of the reaction activation energy distributions will be important for optimizing roasting processes.
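
    The kinetic point can be illustrated with a small Monte Carlo sketch: with the same pre-exponential factor and mean activation energy, a normally distributed activation energy shifts the mean Arrhenius rate relative to single-valued kinetics (all values hypothetical).

      import numpy as np

      R, T = 8.314, 900.0                    # gas constant; roast temperature, K
      A, E_mean, E_sd = 1e7, 150e3, 15e3     # 1/s and J/mol, assumed

      rng = np.random.default_rng(1)
      E = rng.normal(E_mean, E_sd, 100_000)
      k_dist = (A * np.exp(-E / (R * T))).mean()   # distributed-E mean rate
      k_single = A * np.exp(-E_mean / (R * T))     # single-valued kinetics
      print(f"k_dist / k_single = {k_dist / k_single:.1f}")   # > 1: shifted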

  8. Obstetric skills drills: evaluation of teaching methods.

    PubMed

    Birch, L; Jones, N; Doyle, P M; Green, P; McLaughlin, A; Champney, C; Williams, D; Gibbon, K; Taylor, K

    2007-11-01

    To determine the most effective method of delivering training to staff on the management of an obstetric emergency. The research was conducted in a District General Hospital in the UK delivering approximately 3500 women per year. Thirty-six staff, comprising junior and senior medical and midwifery staff, were included as research subjects and assigned to one of six multi-professional teams, giving six teams of six members each. Three teaching methods were employed: lecture-based teaching (LBT), simulation-based teaching (SBT), or a combination of the two (LAS). Each team was randomly allocated to undertake a full day of training in the management of post-partum haemorrhage using one of these three teaching methods. Team knowledge and performance were assessed before training, after training, and three months later. In addition, qualitative semi-structured interviews were carried out with 50% of the original cohort one year after the training, to explore anxiety, confidence, communication, knowledge retention, enjoyment, and transferable skills. All teams improved in their performance and knowledge. The teams taught using simulation only (SBT) were the only group to demonstrate sustained improvement in clinical management of the case, confidence, communication skills, and knowledge; however, the study did not have enough power to reach statistical significance. The SBT group reported transferable skills and less anxiety in subsequent emergencies. Both the SBT and LAS groups reported improved multidisciplinary communication, and although tiring, SBT was enjoyed the most. Obstetrics is a high-risk speciality in which emergencies are, to some extent, inevitable. Training staff to manage these emergencies is a fundamental principle of risk management. Traditional risk management strategies based on incident reporting and event analysis are reactive and not always effective

  9. Organic ion exchange resin separation methods evaluation

    SciTech Connect

    Witwer, K.S.

    1998-05-27

    This document describes testing to find effective methods to separate Organic Ion Exchange Resin (OIER) from a sludge simulant. This task supports a comprehensive strategy for the treatment and processing of K-Basin sludge. The simulant resembles sludge that has accumulated in the 105KE and 105KW Basins in the 100K Area of the Hanford Site. The sludge is an accumulation of fuel element corrosion products, organic and inorganic ion exchange materials, canister gasket materials, iron and aluminum corrosion products, sand, dirt, and minor amounts of other organic matter.

  10. An entropy-based objective evaluation method for image segmentation

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Fritts, Jason E.; Goldman, Sally A.

    2003-12-01

    Accurate image segmentation is important for many image, video and computer vision applications. Over the last few decades, many image segmentation methods have been proposed. However, the results of these segmentation methods are usually evaluated only visually, qualitatively, or indirectly by the effectiveness of the segmentation on the subsequent processing steps. Such methods are either subjective or tied to particular applications. They do not judge the performance of a segmentation method objectively, and cannot be used as a means to compare the performance of different segmentation techniques. A few quantitative evaluation methods have been proposed, but these early methods have been based entirely on empirical analysis and have no theoretical grounding. In this paper, we propose a novel objective segmentation evaluation method based on information theory. The new method uses entropy as the basis for measuring the uniformity of pixel characteristics (luminance is used in this paper) within a segmentation region. The evaluation method provides a relative quality score that can be used to compare different segmentations of the same image. This method can be used to compare both various parameterizations of one particular segmentation method as well as fundamentally different segmentation techniques. The results from this preliminary study indicate that the proposed evaluation method is superior to the prior quantitative segmentation evaluation techniques, and identify areas for future research in objective segmentation evaluation.
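
    A minimal sketch of an entropy-of-luminance score in the spirit described, assuming 8-bit luminance values and a label image; the paper's full criterion, which also guards against trivial over-segmentation, is not reproduced here.

      import numpy as np

      def region_entropy(luminance, labels):
          """Area-weighted mean Shannon entropy of luminance per region."""
          total = labels.size
          score = 0.0
          for lab in np.unique(labels):
              vals = luminance[labels == lab]
              counts = np.bincount(vals.astype(np.int64), minlength=256)
              p = counts[counts > 0] / vals.size
              score += (vals.size / total) * -(p * np.log2(p)).sum()
          return score   # lower score = more uniform regions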

  11. The Use of DEM to Capture the Dynamics of the Flow of Solid Pellets in a Single Screw Extruder

    NASA Astrophysics Data System (ADS)

    Hong, He; Covas, J. A.; Gaspar-Cunha, A.

    2007-05-01

    Despite numerical developments in the modeling of polymer plasticating single-screw extrusion, the initial stages of solids conveying are still treated unsatisfactorily, with a simple plug-flow condition being assumed. It is well known that this produces poor predictions of relevant process parameters, e.g., output. This work reports on an attempt to model the process using the Discrete Element Method (DEM), with the aim of unveiling the dynamics of the process. Using DEM, each pellet is treated as a separate unit, so predictions of flow patterns, velocity fields, and degree of filling become possible. We present the algorithm and a few preliminary results.

  12. A new method to evaluate energy technologies

    SciTech Connect

    Shibata, Y.; Clark, D.J.

    1985-04-01

    As the world's oil reserves are rapidly depleted and numerous alternative energy technologies are proposed, a perplexing and urgent question confronts us: how can we assess these technologies in order to choose those which will deliver the most desirable consequences? This task, the evaluation of technology, is a form of policy study whose intent is to examine the broadest societal implications (technological, economic, environmental, legal, social, emotional, etc.) related to the development and deployment of existing or emerging technology. In a strategic planning process, one of the principal tools contributing to effective leadership is a carefully designed framework for guiding the discussion. The five-step approach and the computer-based discussion support system described in this article are a starting point for such a framework. The leader can advance group thinking by offering interpretive summaries using the computer, and can lead the group from one step to the next by giving transitional statements based on the five-step approach. The constant challenge for the leader is to maintain the balance between freedom and control which makes for progress yet does not stifle creative thinking.

  13. Evaluation of an extensive speckle measurement method

    NASA Astrophysics Data System (ADS)

    Roelandt, Stijn; Meuret, Youri; Craggs, Gordon; Verschaffelt, Guy; Janssens, Peter; Thienpont, Hugo

    2012-06-01

    The introduction of lasers for projection applications is hampered by the emergence of speckle. In order to evaluate the speckle-degraded image quality, it is important to devise an objective way to measure the amount of speckle. Mathematically, speckle can be described by its speckle contrast value C, which is given by the ratio between the standard deviation of the intensity fluctuations and the mean intensity. Because the measured speckle contrast strongly depends on the parameters of the measurement setup, in this paper we propose a standardized procedure to measure the amount of speckle in laser-based projection systems. To obtain such a procedure, the influence of the relevant measurement set-up parameters is investigated. The resulting measurement procedure consists of a single digital image sensor in combination with a camera lens. The parameters of the camera lens are chosen such that the measured speckle contrast values correspond with the subjective speckle perception of a human observer, independent of the projector's speckle reduction mechanism(s). Finally, the speckle measurement procedure was performed with different cameras and the results were compared.
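
    The definition above translates directly into code; a synthetic negative-exponential frame stands in for a measured camera image and yields C close to 1, the value expected for fully developed speckle.

      import numpy as np

      def speckle_contrast(frame):
          """C = standard deviation / mean of the measured intensity."""
          frame = frame.astype(np.float64)
          return frame.std() / frame.mean()

      rng = np.random.default_rng(0)
      fully_developed = rng.exponential(scale=100.0, size=(480, 640))
      print(f"C = {speckle_contrast(fully_developed):.2f}")   # ~1.0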

  14. Household batteries: Evaluation of collection methods

    SciTech Connect

    Seeberger, D.A.

    1992-12-31

    While it is difficult to prove that a specific material is causing contamination in a landfill, tests have been conducted at waste-to-energy facilities that indicate that household batteries contribute significant amounts of heavy metals to both air emissions and ash residue. Hennepin County, MN, used a dual approach for developing and implementing a special household battery collection. Alternative collection methods were examined; test collections were conducted. The second phase examined operating and disposal policy issues. This report describes the results of the grant project, moving from a broad examination of the construction and content of batteries, to a description of the pilot collection programs, and ending with a discussion of variables affecting the cost and operation of a comprehensive battery collection program. Three out-of-state companies (PA, NY) were found that accept spent batteries; difficulties in reclaiming household batteries are discussed.

  16. Explosive materials equivalency, test methods and evaluation

    NASA Technical Reports Server (NTRS)

    Koger, D. M.; Mcintyre, F. L.

    1980-01-01

    Attention is given to concepts of explosive equivalency of energetic materials based on specific airblast parameters. A description is provided of a wide bandwidth high accuracy instrumentation system which has been used extensively in obtaining pressure time profiles of energetic materials. The object of the considered test method is to determine the maximum output from the detonation of explosive materials in terms of airblast overpressure and positive impulse. The measured pressure and impulse values are compared with known characteristics of hemispherical TNT data to determine the equivalency of the test material in relation to TNT. An investigation shows that meaningful comparisons between various explosives and a standard reference material such as TNT should be based upon the same parameters. The tests should be conducted under the same conditions.

  17. A 'Drift' algorithm for integrating vector polyline and DEM based on the spherical DQG

    NASA Astrophysics Data System (ADS)

    Wang, Jiaojiao; Wang, Lei; Cao, Wenmin; Zhao, Xuesheng

    2014-03-01

    The efficient integration of vector and DEM data on a global scale is one of the important issues in the Digital Earth community. Among the existing methods, the geometry-based approach maintains the characteristics of vector data needed for query and analysis. However, its complexity, which entails a large amount of interpolation computation, greatly limits its application to multi-source spatial data integration on a global scale. To overcome this serious deficiency, a novel 'drift' algorithm is developed based on the spherical Degenerate Quadtree Grid (DQG), on which the global DEM data are represented. The main principle of this algorithm is that a vector node in a DQG cell can be moved to a cell corner-point without changing the visualization effect, provided the cell is smaller than or equal to one screen pixel. A detailed algorithm and the multi-scale operation steps are also presented. By the 'drift' algorithm, vector polylines and DEM grids are integrated seamlessly, avoiding a large amount of interpolation computation. Based on this approach, we have developed a computer program in VC++ on top of the OpenGL 3D API. In the experiment, the USGS GTOPO30 DEM and 1:1,000,000 DCW road data sets for the China area were selected. Tests have shown that the time consumption of the 'drift' algorithm is only about 25% of that of traditional ones; moreover, the mean error of the drift operation on vector nodes can be controlled within about half a DQG cell. Conclusions and future work are also given.
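
    A minimal sketch of the drift rule, simplified to a regular lat/lon grid standing in for the spherical DQG; names and thresholds are illustrative.

      def drift(lon, lat, cell_size, cell_pixels):
          """Snap a vector node to the nearest cell corner when the cell
          projects to at most one screen pixel (error <= ~half a cell)."""
          if cell_pixels <= 1.0:
              lon = round(lon / cell_size) * cell_size
              lat = round(lat / cell_size) * cell_size
          return lon, lat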

  18. Evaluation criteria and test methods for electrochromic windows

    SciTech Connect

    Czanderna, A.W. ); Lampert, C.M. )

    1990-07-01

    This report summarizes the test methods used for evaluating electrochromic (EC) windows, reviews what is known about degradation of their performance, and recommends methods and procedures for advancing EC windows toward buildings applications. 77 refs., 13 figs., 6 tabs.

  19. FIELD VALIDATION OF SEDIMENT TOXICITY IDENTIFICATION AND EVALUATION METHODS

    EPA Science Inventory

    Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both porewaters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question of whethe...

  20. Aster Global dem Version 3, and New Aster Water Body Dataset

    NASA Astrophysics Data System (ADS)

    Abrams, M.

    2016-06-01

    In 2016, the US/Japan ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) project released Version 3 of the Global DEM (GDEM). This 30 m DEM covers the earth's surface from 82N to 82S, and improves on two earlier versions by correcting some artefacts and filling in areas of missing DEMs through the acquisition of additional data. The GDEM was produced by stereocorrelation of 2 million ASTER scenes, processed on a pixel-by-pixel basis: cloud screening, stacking data from overlapping scenes, removing outlier values, and averaging elevation values. As previously, the GDEM is packaged in ~23,000 1 x 1 degree tiles. Each tile has a DEM file and a NUM file reporting the number of scenes used for each pixel and identifying the source of fill-in data (where persistent clouds prevented computation of an elevation value). An additional dataset was concurrently produced and released: the ASTER Water Body Dataset (AWBD). This is a 30 m raster product which encodes every pixel as lake, river, or ocean, thus providing a global inland and shoreline water body mask. Water was identified through spectral analysis algorithms and manual editing. This product was evaluated against the Shuttle Water Body Dataset (SWBD) and the Landsat-based Global Inland Water (GIW) product. The SWBD only covers the earth between about 60 degrees north and south, so it is not a global product, and the GIW only delineates inland water bodies, without addressing ocean coastlines. All products are at 30 m postings.
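
    The per-pixel compositing could be sketched as below, with NaN marking cloud-screened samples; the outlier threshold is illustrative, not the production setting.

      import numpy as np

      def composite(stack):
          """stack: (n_scenes, rows, cols) array of scene DEMs, NaN = cloud."""
          med = np.nanmedian(stack, axis=0)
          mad = np.nanmedian(np.abs(stack - med), axis=0)
          cleaned = np.where(np.abs(stack - med) > 3.0 * (mad + 1e-6),
                             np.nan, stack)              # reject outliers
          dem = np.nanmean(cleaned, axis=0)              # averaged elevation
          num = np.sum(~np.isnan(cleaned), axis=0)       # scenes per pixel (NUM)
          return dem, num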

  1. Systems and methods for circuit lifetime evaluation

    NASA Technical Reports Server (NTRS)

    Heaps, Timothy L. (Inventor); Sheldon, Douglas J. (Inventor); Bowerman, Paul N. (Inventor); Everline, Chester J. (Inventor); Shalom, Eddy (Inventor); Rasmussen, Robert D. (Inventor)

    2013-01-01

    Systems and methods for estimating the lifetime of an electrical system in accordance with embodiments of the invention are disclosed. One embodiment includes iteratively performing Worst Case Analysis (WCA) on a system design with respect to different system lifetimes, using a computer, to determine the lifetime at which the worst-case performance indicates the system will pass with zero margin or fail within a predetermined margin of error, given the environment experienced by the system during its lifetime. Performing WCA with respect to a specific system lifetime includes: identifying subcircuits within the system; performing Extreme Value Analysis (EVA) on each subcircuit to determine whether it fails EVA for that lifetime; when a subcircuit passes EVA, concluding that it does not fail WCA for the specified lifetime; when a subcircuit fails EVA, performing at least one additional WCA process that provides a tighter bound than EVA to determine whether the subcircuit fails WCA; determining that the system passes WCA for the specific lifetime when all subcircuits pass; and determining that the system fails WCA when at least one subcircuit fails.
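
    A hedged sketch of the iteration described, with the EVA and tighter-bound WCA analyses left as placeholder callables; this illustrates the logic only, not the patented implementation.

      def system_passes_wca(subcircuits, lifetime, eva, full_wca):
          for sc in subcircuits:
              if eva(sc, lifetime):            # EVA pass implies WCA pass
                  continue
              if not full_wca(sc, lifetime):   # tighter bound only when needed
                  return False
          return True

      def estimated_lifetime(subcircuits, lifetimes, eva, full_wca):
          passed = None
          for lt in sorted(lifetimes):         # iterate candidate lifetimes
              if system_passes_wca(subcircuits, lt, eva, full_wca):
                  passed = lt
              else:
                  break
          return passed                        # longest lifetime that passes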

  2. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of a wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the quite unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete, mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm which allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams.
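
    For the Fourier-based integration of the partial derivatives, the Frankot-Chellappa scheme is one standard choice (the paper's exact algorithm may differ); a sketch assuming estimated slope maps p and q:

      import numpy as np

      def integrate_gradients(p, q):
          """Least-squares surface from gradients via FFT (Frankot-Chellappa)."""
          rows, cols = p.shape
          u = np.fft.fftfreq(cols) * 2 * np.pi
          v = np.fft.fftfreq(rows) * 2 * np.pi
          U, V = np.meshgrid(u, v)
          denom = U**2 + V**2
          denom[0, 0] = 1.0                    # avoid divide-by-zero at DC
          Z = (-1j * U * np.fft.fft2(p) - 1j * V * np.fft.fft2(q)) / denom
          Z[0, 0] = 0.0                        # mean height is unconstrained
          return np.real(np.fft.ifft2(Z))      # reconstructed surface profile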

  3. Iodine absorption cells quality evaluation methods

    NASA Astrophysics Data System (ADS)

    Hrabina, Jan; Zucco, Massimo; Holá, Miroslava; Šarbort, Martin; Acef, Ouali; Du-Burck, Frédéric; Lazar, Josef; Číp, Ondřej

    2016-12-01

    Absorption cells are a unique tool for laser frequency stabilization. They serve as irreplaceable optical frequency references in the realization of highly stable laser standards and laser sources for different branches of optical measurement, including the most precise frequency and dimensional measurement systems. One of the most often used absorption media covering the visible and near-IR spectral range is molecular iodine. It offers a rich atlas of very strong and narrow spectral transitions which allow the realization of laser systems with ultimate frequency stabilities at or below the 10^-14 level. One of the most often discussed disadvantages of iodine cells is iodine's corrosivity and sensitivity to the presence of foreign substances. Impurities react with the absorption medium and cause spectral shifts of the absorption spectra, spectral broadening of the transitions, and a decrease in the achievable signal-to-noise ratio of the detected spectra. All of these unwanted effects directly influence the frequency stability of the realized laser standard, and for this reason the quality of iodine cells must be precisely controlled. We present a comparison of the traditionally used method of laser-induced fluorescence (LIF) with a novel technique based on measuring the linewidths of hyperfine transitions. The results summarize the advantages and drawbacks of these techniques and give recommendations for their practical use.

  4. Comparing ArcticDEM against LiDAR in Alaska: Tests of uncertainty in elevation and hydrologic delineation

    NASA Astrophysics Data System (ADS)

    Crosby, B. T.

    2016-12-01

    The ArcticDEM, created by the Polar Geospatial Center using stereo, high-resolution satellite imagery, promises to transform our capacity to execute geospatial analyses in northern latitudes. The two meter ArcticDEM replaces the 60 meter posting USGS NED, offering a 30x increase in resolution that enables change detection and quantification of cold-region morphometry not previously possible. ArcticDEM elevations are constrained by an Arctic-wide ground control dataset, but the product has not yet been compared against continuous raster elevation data. In order to provide a raster-based uncertainty analysis that considers offsets in both the vertical and horizontal dimensions, we compare ArcticDEM against two different one meter resolution LiDAR products. These products, created by the State of Alaska for the evaluation of roadway and utility corridors, span gradients in relief, geomorphic process, and climate. Small but systematic biases exist for higher-relief topography that may be related to the broad range of oblique views present in a single swath of satellite data. Though ArcticDEM provides a sea change in our ability to resolve arctic landforms, it is unclear whether the product can effectively delineate channel form. Stereo imagery only enables the creation of a digital surface model (DSM) that includes the forms of trees and shrubs. These vegetation elements, the presence/absence of ice at the time of image acquisition, and other factors complicate the efficacy of flow routing models and our ability to extract channel characteristics from elevation datasets. This presentation articulates the opportunities and limitations presented by the ArcticDEM product and by others, like Structure from Motion, that rely on imagery for the generation of high-resolution topography.

  5. Selection and Evaluation of Alternative Teaching Methods in Higher Education.

    ERIC Educational Resources Information Center

    Osterman, Dean N.

    College teachers are seeking alternatives to the conventional lecture as a means of teaching students. This paper presents five alternative teaching methods and their advantages and disadvantages. It describes a program for instructional method selection design and includes an evaluation matrix for the five methods. The methods examined are the…

  6. The effects of wavelet compression on Digital Elevation Models (DEMs)

    USGS Publications Warehouse

    Oimoen, M.J.

    2004-01-01

    This paper investigates the effects of lossy compression on floating-point digital elevation models using the discrete wavelet transform. The compression of elevation data poses a different set of problems and concerns than does the compression of images. Most notably, the usefulness of DEMs depends largely on the quality of their derivatives, such as slope and aspect. Three areas extracted from the U.S. Geological Survey's National Elevation Dataset were transformed to the wavelet domain using the third-order filters of the Daubechies family (DAUB6), and were made sparse by setting the smallest 95 percent of the wavelet coefficients to zero. The resulting raster is compressible to a corresponding degree. The effects of the nulled coefficients on the reconstructed DEM are noted as residuals in elevation, derived slope and aspect, and the delineation of drainage basins and streamlines. A simple masking technique is also presented that maintains the integrity and flatness of water bodies in the reconstructed DEM.
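
    The experiment can be reproduced in outline with PyWavelets, where the 6-tap Daubechies filter (DAUB6) corresponds to 'db3'; the decomposition level is an assumption.

      import numpy as np
      import pywt

      def sparsify_dem(dem, keep=0.05, wavelet='db3', level=4):
          coeffs = pywt.wavedec2(dem, wavelet, level=level)
          arr, slices = pywt.coeffs_to_array(coeffs)
          thresh = np.quantile(np.abs(arr), 1.0 - keep)  # magnitude cut-off
          arr[np.abs(arr) < thresh] = 0.0                # null smallest 95%
          coeffs = pywt.array_to_coeffs(arr, slices, output_format='wavedec2')
          recon = pywt.waverec2(coeffs, wavelet)
          return recon[:dem.shape[0], :dem.shape[1]]     # crop any padding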

  7. Mixed Methods and Credibility of Evidence in Evaluation

    ERIC Educational Resources Information Center

    Mertens, Donna M.; Hesse-Biber, Sharlene

    2013-01-01

    We argue for a view of credible evidence that is multidimensional in philosophical and methodological terms. We advocate for the importance of deepening the meaning of credible evaluation practice and findings by bringing multiple philosophical and theoretical lenses to the evaluation process as a basis for the use of mixed methods in evaluation,…

  8. 2D DEM model of sand transport with wind interaction

    NASA Astrophysics Data System (ADS)

    Oger, L.; Valance, A.

    2013-06-01

    The advance of dunes in the desert is a threat to the life of local people: dunes invade houses and agricultural land and disturb traffic on the roads. It is therefore very important to understand the mechanism of sand transport in order to fight desertification. Saltation, in which sand grains are propelled by the wind along the surface in short hops, is the primary mode of blown sand movement [1]. The saltating grains are very energetic and, when they impact a sand surface, they rebound and consequently eject other particles from the sand bed. The ejected grains, called reptating grains, contribute to the augmentation of the sand flux, and some of them can be promoted to saltation. We use a mechanical model based on the Discrete Element Method to study successive collisions of incident energetic beads with a granular packing in the context of aeolian saltation transport. We investigate the collision process for the case where the incident bead and those in the packing have identical mechanical properties, and analyze the consecutive collision processes generated by the transport of the saltating disks in a wind whose profile is obtained from the mutual interaction between the air flow and the grain flow. We used a molecular dynamics method known as DEM (soft Discrete Element Method) with an initial static packing of 20,000 2D particles. The dilation of the upper surface due to the consecutive collisions is responsible for maintaining the flow at a given energy input from the wind.

  9. Prevalence of Evaluation Method Courses in Education Leader Doctoral Preparation

    ERIC Educational Resources Information Center

    Shepperson, Tara L.

    2013-01-01

    This exploratory study investigated the prevalence of single evaluation methods courses in doctoral education leadership programs. Analysis of websites of 132 leading U.S. university programs found 62 evaluation methods courses in 54 programs. Content analysis of 49 course catalog descriptions resulted in five categories: survey, planning and…

  10. EPA METHODS FOR EVALUATING WETLAND CONDITION, WETLANDS CLASSIFICATION

    EPA Science Inventory

    In 1999, the U.S. Environmental Protection Agency (EPA) began work on this series of reports entitled Methods for Evaluating Wetland Condition. The purpose of these reports is to help States and Tribes develop methods to evaluate 1) the overall ecological condition of wetlands us...

  11. Using Developmental Evaluation Methods with Communities of Practice

    ERIC Educational Resources Information Center

    van Winkelen, Christine

    2016-01-01

    Purpose: This paper aims to explore the use of developmental evaluation methods with community of practice programmes experiencing change or transition to better understand how to target support resources. Design/methodology/approach: The practical use of a number of developmental evaluation methods was explored in three organizations over a…

  12. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has made the concepts obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods of qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods suited to the situation and prior conditions of each study is an important approach for researchers.

  13. DEM Modelling of Granule Rearrangement and Fracture Behaviours During a Closed-Die Compaction.

    PubMed

    Furukawa, Ryoichi; Kadota, Kazunori; Noguchi, Tetsuro; Shimosaka, Atsuko; Shirakawa, Yoshiyuki

    2017-08-01

    The closed-die compaction behaviour of D-mannitol granules has been simulated by the discrete element method (DEM) to investigate the granule rearrangement and fracture behaviour during compaction, which affects the compactibility of the tablet. The D-mannitol granules produced in a fluidized bed were modelled as agglomerates of primary particles connected by linear spring bonds. The validity of the model granule used in the DEM simulation was demonstrated by comparison with the experimental results of a uniaxial compression test: during uniaxial compression, the numerical force-displacement curve corresponded reasonably well to the experimental data. The closed-die compaction of the modelled granules was carried out to investigate the rearrangement and fracture behaviours of the granules at different upper platen velocities. The forces during closed-die compaction calculated by DEM fluctuated in the low-pressure region due to the rearrangement of granules. A Heckel analysis showed that the force fluctuation occurred in the initial bending region of the Heckel plot, which represents granule rearrangement and fracture. Furthermore, the upper platen velocity affected the trend of the compaction forces, which can lead to compaction failure due to capping. These results could contribute to designing appropriate granules for closed-die compaction.
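
    The Heckel analysis mentioned fits ln(1/(1-D)) against compaction pressure P, where D is the relative density; a sketch with hypothetical data:

      import numpy as np

      P = np.array([20.0, 40.0, 60.0, 80.0, 100.0])   # pressure, MPa (assumed)
      D = np.array([0.62, 0.70, 0.76, 0.81, 0.85])    # relative density (assumed)

      heckel = np.log(1.0 / (1.0 - D))
      K, A = np.polyfit(P, heckel, 1)                 # slope K, intercept A
      print(f"K = {K:.4f} 1/MPa, yield pressure 1/K = {1.0 / K:.0f} MPa")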

  14. Parametrisation of a DEM model for railway ballast under different load cases.

    PubMed

    Suhr, Bettina; Six, Klaus

    2017-01-01

    The prediction quality of discrete element method (DEM) models for railway ballast can be expected to depend on three points: the geometry representation of the single particles, the contact models used, and the parametrisation from principal experiments. This work aims at a balanced approach, in which none of these points is addressed in excessive depth. In a first step, a simple geometry representation is chosen and the simplified Hertz-Mindlin contact model is used. When experimental data from cyclic compression tests and monotonic direct shear tests are considered, the model can be parametrised to fit either one of the two tests, but not both with the same set of parameters. Similar problems are found in the literature for monotonic and cyclic triaxial tests of railway ballast. In this work, the comparison between experiment and simulation is conducted using the entire data of each test, e.g. the shear force over shear path curve from the direct shear test. In addition to a visual comparison of the results, quantitative errors based on sums of squares are defined. To improve the fit of the DEM model to both types of experiment, an extension of the Hertz-Mindlin contact law is used, which introduces additional physical effects (e.g. breakage of edges or yielding). This model introduces two extra material parameters and is successfully parametrised. Using only one set of parameters, the results of the DEM simulation are in good accordance with both the experimental cyclic compression test and the monotonic direct shear test.

  15. Detailed geomorphological mapping from high resolution DEM data (LiDAR, TanDEM-X): two case studies from Germany and SE Tibet

    NASA Astrophysics Data System (ADS)

    Loibl, D.

    2012-04-01

    Two major obstacles hamper the production of high-resolution geomorphological maps: the complexity of the subject to be depicted, and the enormous effort needed to obtain data through field work. The first factor has prevented the establishment of a generally accepted map legend; the second has hampered efforts to collect comprehensive sets of geomorphological data. This left geomorphologists producing applied maps, focusing on very few layers of information and often not adhering to any of the numerous standards proposed in the second half of the 20th century. Technological progress in recent years, especially in the fields of digital elevation models, GIS environments, and computational hardware, now offers promising opportunities to overcome these obstacles and to produce detailed geomorphological maps even for remote or inhospitable regions. The feasibility of detailed geomorphological mapping from two new sets of digital elevation data, the 1 m LiDAR DTM provided by Germany's State Surveying Authority and the upcoming TanDEM-X DEM, has been evaluated in two case studies: a low mountain range in Germany and a high mountain range in SE Tibet. The results indicate that most information layers of classical geomorphological maps (e.g. the German GMK) can be extracted from these data at appropriate scales, but that significant differences occur in the quality and certainty of key contents. Generally, an enhancement of the geomorphographic, especially the geomorphometric, content and a weakening of the geomorphogenetic content was observed. From these findings, theoretical, methodological, and cartographic remarks on detailed geomorphological mapping from DEM data in GIS environments were derived. As GIS environments decouple data and design and enable the geomorphologist to choose information layer combinations freely to fit research topics, a general-purpose legend becomes obsolete. Yet, a unified data structure is demanded to

  16. Conceptual evaluation of population health surveillance programs: method and example.

    PubMed

    El Allaki, Farouk; Bigras-Poulin, Michel; Ravel, André

    2013-03-01

    Veterinary and public health surveillance programs can be evaluated to assess and improve the planning, implementation and effectiveness of these programs. Guidelines, protocols and methods have been developed for such evaluation. In general, they focus on a limited set of attributes (e.g., sensitivity and simplicity), that are assessed quantitatively whenever possible, otherwise qualitatively. Despite efforts at standardization, replication by different evaluators is difficult, making evaluation outcomes open to interpretation. This ultimately limits the usefulness of surveillance evaluations. At the same time, the growing demand to prove freedom from disease or pathogen, and the Sanitary and Phytosanitary Agreement and the International Health Regulations require stronger surveillance programs. We developed a method for evaluating veterinary and public health surveillance programs that is detailed, structured, transparent and based on surveillance concepts that are part of all types of surveillance programs. The proposed conceptual evaluation method comprises four steps: (1) text analysis, (2) extraction of the surveillance conceptual model, (3) comparison of the extracted surveillance conceptual model to a theoretical standard, and (4) validation interview with a surveillance program designer. This conceptual evaluation method was applied in 2005 to C-EnterNet, a new Canadian zoonotic disease surveillance program that encompasses laboratory based surveillance of enteric diseases in humans and active surveillance of the pathogens in food, water, and livestock. The theoretical standard used for evaluating C-EnterNet was a relevant existing structure called the "Population Health Surveillance Theory". Five out of 152 surveillance concepts were absent in the design of C-EnterNet. However, all of the surveillance concept relationships found in C-EnterNet were valid. The proposed method can be used to improve the design and documentation of surveillance programs. It

  17. CapDEM Exercise Gamma: Results and Discussion

    DTIC Science & Technology

    2011-06-01

    …internal team' using CapDEM towards the reality of external groups using the CapDEM approach to address their own problems by themselves. The results… enable and support different internal and external configurations of the classified CEE requires further study, including both technical and security… …as a fully independent test and evaluation, the Exercise placed less emphasis on an 'internal team' using the CapDEM (DIGCap) approach, and instead it…

  18. Discrete Element Modeling (DEM) of Triboelectrically Charged Particles: Revised Experiments

    NASA Technical Reports Server (NTRS)

    Hogue, Michael D.; Calle, Carlos I.; Curry, D. R.; Weitzman, P. S.

    2008-01-01

    In a previous work, the addition of basic screened Coulombic electrostatic forces to existing commercial discrete element modeling (DEM) software was reported. Triboelectric experiments were performed to charge glass spheres rolling on inclined planes of various materials. Charge generation constants and the Q/m ratios for the test materials were calculated from the experimental data and compared with the simulation output of the DEM software. In this paper, we discuss new values of the charge generation constants calculated from improved experimental procedures and data. Planned work to include dielectrophoretic forces, van der Waals forces, and advanced mechanical forces in the software is also discussed.
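
    A screened Coulombic pair force is often taken in the Yukawa form; the sketch below uses that common choice (the commercial package's exact formulation is not given in the abstract), with assumed charges and screening length.

      import math

      K_E = 8.9875517923e9   # Coulomb constant, N*m^2/C^2

      def screened_coulomb_force(q1, q2, r, screening_length):
          """|F| from U(r) = K_E*q1*q2*exp(-r/L)/r, i.e. F = -dU/dr."""
          s = math.exp(-r / screening_length)
          return K_E * q1 * q2 * s * (1.0 / r**2 + 1.0 / (screening_length * r))

      # two glass spheres with picocoulomb-scale charges, 1 mm apart (assumed)
      print(screened_coulomb_force(2e-12, -1.5e-12, 1e-3, 5e-3))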

  19. A hybrid method for evaluating enterprise architecture implementation.

    PubMed

    Nikpay, Fatemeh; Ahmad, Rodina; Yin Kia, Chiam

    2017-02-01

    Enterprise Architecture (EA) implementation evaluation provides a set of methods and practices for evaluating the EA implementation artefacts within an EA implementation project. Existing EA evaluation models have insufficient practices in terms of considering all EA functions and processes, using structured methods in developing the EA implementation, employing mature practices, and using appropriate metrics to achieve a proper evaluation. The aim of this research is to develop a hybrid evaluation method that supports achieving the objectives of EA implementation. To attain this aim, the first step was to identify EA implementation evaluation practices; to this end, a Systematic Literature Review (SLR) was conducted. Second, the proposed hybrid method was developed on the foundation of information extracted from the SLR, semi-structured interviews with EA practitioners, program theory evaluation, and Information Systems (IS) evaluation. Finally, the proposed method was validated by means of a case study and expert reviews. This research provides a suitable foundation for researchers who wish to extend and continue this research topic with further analysis and exploration, and for practitioners who would like to employ an effective and lightweight evaluation method for EA projects.

  20. How to Reach Evidence-Based Usability Evaluation Methods.

    PubMed

    Marcilly, Romaric; Peute, Linda

    2017-01-01

    This paper discusses how and why to build evidence-based knowledge on usability evaluation methods. At each step of building evidence, the requisites and difficulties of achieving it are highlighted. Specifically, the paper presents how usability evaluation studies should be designed to allow evidence to be capitalized. Reciprocally, it presents how evidence-based usability knowledge will help improve usability practice. Finally, it underlines that evaluation and evidence participate in a virtuous circle that will help improve both scientific knowledge and evaluation practice.

  1. Contemporary ice-elevation changes on central Chilean glaciers using SRTM1 and high-resolution DEMs

    NASA Astrophysics Data System (ADS)

    Vivero, Sebastian; MacDonell, Shelley

    2016-04-01

    Glaciers located in central Chile have undergone significant retreat in recent decades. Whilst studies have evaluated the area loss of several glaciers, there are no detailed studies of volume losses. This lack of information not only restricts estimates of current and future contributions to sea level rise, but has also limited the evaluation of freshwater resource availability in the region. Recently, the Chilean Water Directorate has supported the collection of field and remotely sensed data in the region, which has enabled glacier changes to be evaluated in greater detail. This study compares high-resolution laser scanning DEMs acquired by the Chilean Water Directorate in April 2015 with the recently released SRTM 1 arc-second DEM (~30 m) acquired in February 2000 to calculate geodetic mass balance changes for three glaciers in a catchment in central Chile over a 15-year period. Detailed analysis of the SRTM and laser scanning DEMs, together with the glacier outlines, enables the quantification of elevation and volume changes. Glacier outlines from February 2000 were obtained by multispectral analysis of a Landsat TM image, whereas outlines from April 2015 were digitised from high-resolution glacier orthophotomosaics. Additionally, we accounted for radar penetration into snow and/or ice by evaluating elevation differences between the SRTM C- and X-bands, as well as for mis-registration between the SRTM DEM and the high-resolution DEMs. Over the period, all glaciers show similar ice wastage, on the order of 0.03 km3, for both the debris-covered and non-covered glaciers. However, whilst on the non-covered glaciers mass loss is largely related to elevation and the addition of surface sediment, on the debris-covered glacier losses are related to the development of thermokarst features. By analysing the DEMs in conjunction with Landsat images, we have detected changes in the sediment cover of the non-covered glaciers, which is likely to change the behaviour of the surface mass
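
    The geodetic calculation implied here differences the two DEMs over the glacier mask and converts the result to volume and water equivalent; the density below is a commonly assumed conversion value, not one reported by this study.

      import numpy as np

      def geodetic_balance(dem_new, dem_old, glacier_mask, cell_area, years,
                           density=850.0):              # kg/m^3, assumed
          dh = np.where(glacier_mask, dem_new - dem_old, np.nan)
          dV = np.nansum(dh) * cell_area                # volume change, m^3
          dM = dV * density / 1000.0                    # m^3 water equivalent
          area = glacier_mask.sum() * cell_area
          return dV, dM / area / years                  # m w.e. per year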

  2. Application of Bistatic TanDEM-X Interferometry to Measure Lava Flow Volume and Lava Extrusion Rates During the 2012-13 Tolbachik, Kamchatka Fissure Eruption

    NASA Astrophysics Data System (ADS)

    Kubanek, J.; Westerhaus, M.; Heck, B.

    2015-12-01

    Aerial imaging methods are a well-proven means of mapping lava flows during eruptions and can serve as a basis for assessing eruption dynamics and determining the affected area. However, clouds and smoke often prevent optical systems like the Earth Observation Advanced Land Imager (EO-1 ALI, operated by NASA) from mapping lava flows properly, which limits their reliability. Furthermore, the amount of lava extruded during an eruption cannot be determined from optical images, even though it can contribute significantly to assessing the accompanying hazard and risk. One way to monitor active lava flows is to quantify topographic changes over time using up-to-date high-resolution digital elevation models (DEMs). Whereas photogrammetric methods still fail when clouds and fumes obstruct the view, recent radar satellite missions can generate high-resolution DEMs at any time. The bistatic TanDEM-X (TerraSAR-X Add-on for Digital Elevation Measurements) satellite mission enables, for the first time, the repeated generation of high-resolution DEMs from synthetic aperture radar data at reasonable cost. The mission consists of the two nearly identical satellites TerraSAR-X and TanDEM-X, which form a large synthetic aperture radar interferometer with adaptable across- and along-track baselines, aiming to generate topographic information globally. In the present study, we apply TanDEM-X data to study the lava flows emplaced during the 2012-13 Tolbachik, Kamchatka fissure eruption. The eruption comprised very fluid lava flows that effused along a northeast-southwest trending fissure. We used about fifteen bistatic data pairs to generate DEMs prior to, during, and after the eruption. Differencing the DEMs enables mapping of the lava flow field at different times, which allows the extruded volume to be measured and the changes in lava extrusion rate over time to be derived.

  3. Precise baseline determination for the TanDEM-X mission

    NASA Astrophysics Data System (ADS)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive to generate a global, precise Digital Elevation Model (DEM) by means of bi-static SAR, in a close formation of the TerraSAR-X satellite, launched on June 15, 2007, and the TanDEM-X satellite, to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called the Integrated GPS Occultation Receiver (IGOR), and a laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives, as millimeter-level determination of the baseline, or distance between the two spacecraft, is needed to derive meter-level accurate DEMs. Within the TanDEM-X ground segment, GFZ is responsible for the operational provision of precise baselines. For this, GFZ uses two software chains: its Earth Parameter and Orbit System (EPOS) software and, for backup purposes and quality control, the BERNESE software. In a concerted effort, the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. Taking GRACE as an example, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the inter-satellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and may therefore be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy, and by merging the independent GFZ and DLR baselines the accuracy can be increased further. The K-band validation, however, covers solely the along-track component, as the K-band data measure just the distance between the two GRACE satellites. In addition, they exhibit an unknown bias which must be modelled in the comparison, so the

  4. Fusion of photogrammetric and photoclinometric information for high-resolution DEMs from Mars in-orbit imagery

    NASA Astrophysics Data System (ADS)

    Jiang, Cheng; Douté, Sylvain; Luo, Bin; Zhang, Liangpei

    2017-08-01

    High-resolution Digital Elevation Models (DEMs) of the Martian surface are instrumental for studying the red planet. The Mars Orbiter Laser Altimeter (MOLA) instrument onboard the Mars Global Surveyor provided a global DEM of high vertical resolution but limited spatial resolution, which is not enough for characterizing small geological objects, normalizing illumination conditions across in-orbit optical images, modeling local meteorology, and other applications. This article addresses the problem of producing DEMs for regions of interest on Mars, typically ≈1000 km2 in area, from available in-orbit imagery, while ensuring a ≈10 m vertical accuracy and a spatial accuracy comparable to that of the imagery. A method is proposed that combines photogrammetric and photoclinometric approaches in order to retain their mutual advantages. In experiments using Mars Reconnaissance Orbiter Context Camera (CTX) images, the proposed method is indeed able to produce DEMs satisfying these requirements, with fewer artifacts, better surface continuity, and sharper details than the photogrammetric method used alone.

  5. DEM analysis for AIA/SDO EUV channels using a probabilistic approach to the spectral inverse problem

    NASA Astrophysics Data System (ADS)

    Goryaev, Farid; Parenti, Susanna; Hochedez, Jean-François; Urnov, Alexander

    The Atmospheric Imaging Assembly (AIA) for the Solar Dynamics Observatory (SDO) mission is designed to observe the Sun from the photosphere to the flaring corona. These data are expected to improve our understanding of processes in the solar atmosphere. Differential emission measure (DEM) analysis is one of the main methods for deriving information about optically thin coronal plasma characteristics from EUV and SXR emission. In this work we analyze AIA/SDO EUV channels to estimate their ability to reconstruct DEM(T) distributions. We use an iterative method (called the Bayesian iterative method, BIM) within the framework of a probabilistic approach to the spectral inverse problem for determining the thermal structures of the emitting plasma sources (Goryaev et al., submitted to A&A). The BIM is an iterative procedure based on Bayes' theorem and used for the reconstruction of DEM profiles. Using the BIM algorithm we performed various numerical tests and model simulations demonstrating the abilities of our inversion approach for DEM analysis with AIA/SDO EUV channels.
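
    The published BIM details are in Goryaev et al.; as a hedged illustration of the same family of Bayes-theorem-based multiplicative updates (a Richardson-Lucy-type scheme, not necessarily the BIM itself), a DEM(T) estimate can be iterated from channel intensities and temperature response kernels as follows (all names and the flat first guess are this sketch's assumptions):

      import numpy as np

      def iterative_dem(intensities, response, n_iter=200):
          # intensities: (n_channels,) observed channel signals
          # response:    (n_channels, n_temps) temperature response kernels
          dem = np.ones(response.shape[1])            # flat first guess
          for _ in range(n_iter):
              predicted = response @ dem              # forward model
              ratio = intensities / np.maximum(predicted, 1e-30)
              dem *= (response.T @ ratio) / np.maximum(response.sum(axis=0), 1e-30)
          return dem                                  # non-negative by construction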

  6. High-quality seamless DEM generation blending SRTM-1, ASTER GDEM v2 and ICESat/GLAS observations

    NASA Astrophysics Data System (ADS)

    Yue, Linwei; Shen, Huanfeng; Zhang, Liangpei; Zheng, Xianwei; Zhang, Fan; Yuan, Qiangqiang

    2017-01-01

    The absence of a high-quality seamless global digital elevation model (DEM) dataset has been a challenge for Earth-related research fields. Recently, the 1-arc-second Shuttle Radar Topography Mission (SRTM-1) data have been released globally, covering over 80% of the Earth's land surface (60°N-56°S). However, voids and anomalies still exist in some tiles, which has prevented the SRTM-1 dataset from being used directly without further processing. In this paper, we propose a method to generate a seamless DEM dataset blending SRTM-1, ASTER GDEM v2, and ICESat laser altimetry data. The ASTER GDEM v2 data are used as the elevation source for the SRTM void filling. To obtain a reliable filling source, ICESat GLAS points are incorporated to enhance the accuracy of the ASTER data within the void regions, using an artificial neural network (ANN) model. After correction, the voids in the SRTM-1 data are filled with the corrected ASTER GDEM values. The triangular-irregular-network-based delta surface fill (DSF) method is then employed to eliminate the vertical bias between them. Finally, an adaptive outlier filter is applied to all the data tiles. The final result is a seamless global DEM dataset. ICESat points collected from 2003 to 2009 were used to validate the effectiveness of the proposed method, and to assess the vertical accuracy of the global DEM products in China. Furthermore, channel networks in the Yangtze River Basin were also extracted for data assessment.
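
    A simplified sketch of the delta-surface idea (a nearest-neighbour delta stands in for the paper's TIN interpolation; the array names and void-mask convention are this sketch's assumptions):

      import numpy as np
      from scipy.ndimage import distance_transform_edt

      def delta_surface_fill(srtm, aster, void_mask):
          # Delta (SRTM - ASTER) is known outside the voids; inside, each void
          # cell borrows the delta of its nearest valid neighbour, shifting the
          # ASTER fill onto the SRTM datum without seams.
          delta = srtm - aster
          _, inds = distance_transform_edt(void_mask, return_indices=True)
          iy, ix = inds
          filled = srtm.copy()
          filled[void_mask] = aster[void_mask] + delta[iy, ix][void_mask]
          return filled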

  7. An interdisciplinary heuristic evaluation method for universal building design.

    PubMed

    Afacan, Yasemin; Erbug, Cigdem

    2009-07-01

    This study highlights how heuristic evaluation, as a usability evaluation method, can feed into current building design practice to conform to universal design principles. It provides a definition of universal usability that is applicable to an architectural design context. It takes the seven universal design principles as a set of heuristics and applies an iterative sequence of heuristic evaluation in a shopping mall, aiming to achieve a cost-effective evaluation process. The evaluation was composed of three consecutive sessions. First, five evaluators from different professions were interviewed regarding the construction drawings in terms of universal design principles. Then, each evaluator was asked to perform the predefined task scenarios. In subsequent interviews, the evaluators were asked to re-analyze the construction drawings. The results showed that heuristic evaluation could successfully integrate universal usability into current building design practice in two ways: (i) it promoted an iterative evaluation process combining multiple sessions, rather than relying on one evaluator and one evaluation session to find the maximum number of usability problems, and (ii) it highlighted the necessity of an interdisciplinary ad hoc committee regarding the heuristic abilities of each profession. A multi-session, interdisciplinary heuristic evaluation method can save both project budget and time, while ensuring a reduced error rate for the universal usage of built environments.

  8. Evaluating Methods for Evaluating Instruction: The Case of Higher Education. NBER Working Paper No. 12844

    ERIC Educational Resources Information Center

    Weinberg, Bruce A.; Fleisher, Belton M.; Hashimoto, Masanori

    2007-01-01

    This paper studies methods for evaluating instruction in higher education. We explore student evaluations of instruction and a variety of alternatives. We develop a simple model to illustrate the biases inherent in student evaluations. Measuring learning using grades in future courses, we show that student evaluations are positively related to…

  9. Coupling photogrammetric data with DFN-DEM model for rock slope hazard assessment

    NASA Astrophysics Data System (ADS)

    Donze, Frederic; Scholtes, Luc; Bonilla-Sierra, Viviana; Elmouttie, Marc

    2013-04-01

    fracture persistence in order to enhance the possible contribution of rock bridges to the development of the failure surface. It is believed that the proposed methodology can bring valuable complementary information for rock slope stability analysis in the presence of complex fracture systems, for which a classical "Factor of Safety" is difficult to express. References: • Harthong, B., Scholtès, L. & Donzé, F.V., Strength characterization of rock masses, using a coupled DEM-DFN model, Geophysical Journal International, doi:10.1111/j.1365-246X.2012.05642.x, 2012. • Kozicki, J. & Donzé, F.V., YADE-OPEN DEM: an open-source software using a discrete element method to simulate granular material, Engineering Computations, 26(7):786-805, 2009. • Kozicki, J. & Donzé, F.V., A new open-source software developed for numerical simulations using discrete modeling methods, Comp. Meth. in Appl. Mech. and Eng., 197:4429-4443, 2008. • Poropat, G.V., New methods for mapping the structure of rock masses, in Proceedings, Explo 2001, Hunter Valley, New South Wales, 28-31 October 2001, pp. 253-260, 2001. • Scholtès, L. & Donzé, F.V., Modelling progressive failure in fractured rock masses using a 3D discrete element method, International Journal of Rock Mechanics and Mining Sciences, 52:18-30, 2012a. • Scholtès, L. & Donzé, F.V., DEM model for soft and hard rocks: role of grain interlocking on strength, J. Mech. Phys. Solids, doi:10.1016/j.jmps.2012.10.005, 2012b. • Sirovision, Commonwealth Scientific and Industrial Research Organisation (CSIRO), Siro3D Sirovision 3D Imaging Mapping System Manual, Version 4.1, 2010.

  10. A Preliminary Rubric Design to Evaluate Mixed Methods Research

    ERIC Educational Resources Information Center

    Burrows, Timothy J.

    2013-01-01

    With the increase in frequency of the use of mixed methods, both in research publications and in externally funded grants there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…

  12. Evaluating marginal likelihood with thermodynamic integration method and comparison with several other numerical methods

    SciTech Connect

    Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; Beerli, Peter; Zeng, Xiankui; Lu, Dan; Tao, Yuezan

    2016-02-05

    Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood, or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, thermodynamic integration, which has not previously been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through path sampling, which conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing it with two variants of the Laplace approximation method and three MC methods, including the nested sampling method recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. It is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. In conclusion, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
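
    The core of the thermodynamic integration estimate is compact enough to sketch; the MCMC sampler that draws from each power posterior is assumed and not shown, and the power-law beta schedule is a common choice rather than necessarily the study's:

      import numpy as np

      def log_marginal_likelihood(loglike_per_beta, betas):
          # ln p(D) = integral over beta in [0, 1] of E_beta[ln L(theta)],
          # where the expectation is over the power posterior
          # p(theta | D, beta) proportional to L(theta)**beta * p(theta).
          means = np.array([np.mean(ll) for ll in loglike_per_beta])
          return np.trapz(means, betas)

      # Schedule concentrating temperatures near the prior (beta = 0)
      betas = np.linspace(0.0, 1.0, 32) ** 5
      # loglike_per_beta[i] holds ln L values of MCMC samples drawn at betas[i]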

  13. An online credit evaluation method based on AHP and SPA

    NASA Astrophysics Data System (ADS)

    Xu, Yingtao; Zhang, Ying

    2009-07-01

    Online credit evaluation is the foundation for the establishment of trust and for the management of risk between buyers and sellers in e-commerce. In this paper, a new credit evaluation method based on the analytic hierarchy process (AHP) and set pair analysis (SPA) is presented to determine the credibility of electronic commerce participants. It addresses some of the drawbacks of classical credit evaluation methods and broadens the scope of current approaches. Both qualitative and quantitative indicators are considered in the proposed method, and an overall credit score is then derived from an optimal perspective. Finally, a case analysis of China Garment Network is provided for illustrative purposes.
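
    The AHP half of such a scheme is easily sketched: indicator weights come from the principal eigenvector of a pairwise-comparison matrix (the SPA combination step is omitted; the matrix values are invented for illustration):

      import numpy as np

      def ahp_weights(pairwise):
          # Priority vector = principal eigenvector, normalised to sum to 1.
          eigvals, eigvecs = np.linalg.eig(pairwise)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          return w / w.sum()

      # Three indicators; entry (i, j) says how much more important i is than j
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      weights = ahp_weights(A)   # approximately [0.65, 0.23, 0.12]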

  14. The topographic grain concept in DEM-based geomorphometric mapping

    NASA Astrophysics Data System (ADS)

    Józsa, Edina

    2016-04-01

    A common drawback of geomorphological analyses based on digital elevation datasets is the definition of the search window size for the derivation of morphometric variables. The fixed-size neighbourhood determines the scale of the analysis and mapping, which can lead to the generalization of smaller surface details or the elimination of larger landform elements. The methods of DEM-based geomorphometric mapping are constantly developing in the direction of multi-scale landform delineation, but the optimal threshold for the search window size is still a limiting factor. A possible way to determine a suitable value for this parameter is to consider the topographic grain principle (Wood, W. F. - Snell, J. B. 1960, Pike, R. J. et al. 1989). The calculation is implemented as a bash shell script for GRASS GIS to determine the optimal threshold for the r.geomorphon module. The approach relies on the potential of the topographic grain to detect the characteristic local ridgeline-to-channel spacing. By calculating relative relief values with nested neighbourhood matrices, it is possible to define a break-point where the increase rate of local relief encountered by the sample drops significantly. The geomorphons approach (Jasiewicz, J. - Stepinski, T. F. 2013) is a cell-based DEM classification method for the identification of landform elements at a broad range of scales using a line-of-sight technique. Landforms larger than the maximum lookup distance are broken down into smaller elements, so the threshold needs to be set to a relatively large value. On the other hand, the computational requirements and the size of the study sites determine the upper limit for this value. The aim was therefore to create a tool that helps determine the optimal parameter for the r.geomorphon tool. As a result, it is possible to produce more objective and consistent maps while achieving the full efficiency of this mapping technique. For the thorough analysis on the
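
    A NumPy/SciPy stand-in for the nested-neighbourhood computation behind the topographic grain (the break-point rule used here, the sharpest drop in the growth rate of mean relief, is one simple heuristic, not the GRASS script itself):

      import numpy as np
      from scipy.ndimage import maximum_filter, minimum_filter

      def topographic_grain(dem, max_radius=50):
          # Mean local relief for growing square windows (nested neighbourhoods).
          radii = np.arange(1, max_radius + 1)
          relief = np.array([
              (maximum_filter(dem, 2 * r + 1) - minimum_filter(dem, 2 * r + 1)).mean()
              for r in radii
          ])
          gain = np.gradient(relief, radii)                   # growth rate
          grain = radii[np.argmin(np.gradient(gain, radii))]  # sharpest slowdown
          return radii, relief, grain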

  15. Entwicklungsperspektiven von Social Software und dem Web 2.0

    NASA Astrophysics Data System (ADS)

    Raabe, Alexander

    The article first deals with the current and future use of social software in companies. Following the great success of social software on the web, many companies are beginning to develop their own social software initiatives. The article outlines the currently perceived application areas of social software within companies, discusses success factors for its introduction, and presents possible paths for the future. After this discussion of the special case of social software in companies, the global trends and future perspectives of Web 2.0 are presented in their technical, economic, and social dimensions. As the main trends discussed make clear, the mass of digitally available information on the web will continue to grow steadily. This raises the question of how the quality of information search and knowledge generation can be improved in the future. The use of semantic technologies on the web offers a revolutionary way to filter information and to design intelligent, in a sense "understanding", applications. On the way to an intelligent web, the Semantic Web and social software will converge: applications such as semantic wikis, semantic weblogs, lightweight Semantic Web languages such as Microformats, and commercial offerings such as Freebase by Metaweb will be the first signs of a third generation of the web.

  16. Spatial Characterization of Landscapes through Multifractal Analysis of DEM

    PubMed Central

    Aguado, P. L.; Del Monte, J. P.; Moratiel, R.; Tarquis, A. M.

    2014-01-01

    Landscape evolution is driven by abiotic, biotic, and anthropic factors. The interactions among these factors and their influence at different scales create a complex dynamic. Landscapes have been shown to exhibit numerous scaling laws, from Horton's laws to more sophisticated scalings of heights in topography and river network topology. Such scaling and multiscaling analyses have the potential to characterise the landscape in terms of the statistical signature of the selected measure. The study zone is a matrix obtained from a digital elevation model (DEM) (10 m × 10 m grid, 1 m height resolution) corresponding to a region, known as "Monte El Pardo", that is homogeneous with respect to soil characteristics and climatology, although the water level of a reservoir and the topography play a major role in its organization and evolution. We investigated whether multifractal analysis of a DEM reveals common features that can be used to uncover the underlying patterns and information associated with the landscape represented by the DEM, and we studied the influence of the reservoir's water level on the analysis. The results show that the multifractal approach applied to mean absolute gradient data is a useful tool for analysing the topography represented by the DEM. PMID:25177728
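
    A box-counting sketch of the generalized-dimension estimate used in such analyses, applied to any non-negative 2-D measure such as the mean absolute gradient of the DEM (the normalisation and log-log fit follow the standard definition, not necessarily the paper's exact implementation):

      import numpy as np

      def generalized_dimensions(measure, qs, box_sizes):
          # D(q) from log-log slopes of the partition function over box sizes.
          mu = measure / measure.sum()
          dq = []
          for q in qs:
              ys, xs = [], []
              for s in box_sizes:
                  h = (mu.shape[0] // s) * s
                  w = (mu.shape[1] // s) * s
                  p = mu[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))
                  p = p[p > 0]
                  if np.isclose(q, 1.0):              # D1 needs the limit form
                      ys.append(np.sum(p * np.log(p)))
                  else:
                      ys.append(np.log(np.sum(p ** q)) / (q - 1))
                  xs.append(np.log(s))
              dq.append(np.polyfit(xs, ys, 1)[0])     # slope = D(q)
          return np.array(dq)

      # e.g. qs = np.linspace(-5, 5, 11), box_sizes = [2, 4, 8, 16, 32]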

  17. Evaluation of two gas-dilution methods for instrument calibration

    NASA Technical Reports Server (NTRS)

    Evans, A., Jr.

    1977-01-01

    Two gas dilution methods were evaluated for use in the calibration of analytical instruments used in air pollution studies. A dual isotope fluorescence carbon monoxide analyzer was used as the transfer standard. The methods are not new but some modifications are described. The rotary injection gas dilution method was found to be more accurate than the closed loop method. Results by the two methods differed by 5 percent. This could not be accounted for by the random errors in the measurements. The methods avoid the problems associated with pressurized cylinders. Both methods have merit and have found a place in instrument calibration work.

  18. Entrepreneur environment management behavior evaluation method derived from environmental economy.

    PubMed

    Zhang, Lili; Hou, Xilin; Xi, Fengru

    2013-12-01

    An evaluation system can encourage and guide entrepreneurs and impel them to perform well in environmental management. An evaluation method based on advantage structure is established and used to analyze entrepreneurs' environmental management behavior in China. An evaluation index system for entrepreneurs' environmental management behavior is constructed on the basis of empirical research. The evaluation method is formulated from the viewpoint of objective programming theory: the minimized objective function is taken as the comprehensive evaluation result, and disadvantage structure patterns are identified to alert the entrepreneurs concerned. Application research shows that the overall environmental management behavior of Chinese entrepreneurs is good; environmental strategic behavior ranks best, environmental management behavior second, and cultural behavior last. The application results show the efficiency and feasibility of this method. Copyright © 2013 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.

  19. ASTM test methods for composite characterization and evaluation

    NASA Technical Reports Server (NTRS)

    Masters, John E.

    1994-01-01

    A discussion of the American Society for Testing and Materials is given. Under the topic of composite materials characterization and evaluation, general industry practice and test methods for textile composites are presented.

  20. EVALUATION OF ANALYTICAL METHODS FOR DETERMINING PESTICIDES IN BABY FOOD

    EPA Science Inventory

    Three extraction methods and two detection techniques for determining pesticides in baby food were evaluated. The extraction techniques examined were supercritical fluid extraction (SFE), enhanced solvent extraction (ESE), and solid phase extraction (SPE). The detection techni...

  2. [Validation and regulatory acceptance of alternative methods for toxicity evaluation].

    PubMed

    Ohno, Yasuo

    2004-01-01

    For the regulatory acceptance of alternative methods (AMs) to animal toxicity tests, their reproducibility and relevance should be determined by intra- and inter-laboratory validation. Appropriate procedures for the validation and regulatory acceptance of AMs were recommended by the OECD in 1996. According to those principles, several in vitro methods, such as skin corrosivity tests and phototoxicity tests, were evaluated and accepted by ECVAM (European Centre for the Validation of Alternative Methods), ICCVAM (the Interagency Coordinating Committee on the Validation of Alternative Methods), and the OECD. Because of the difficulties in conducting inter-laboratory validation and the relatively short period remaining until the EU's ban on animal experiments for the safety evaluation of cosmetics, ECVAM and ICCVAM have recently started cooperating in the validation and evaluation of AMs. It is also necessary to establish JaCVAM (Japanese Center for the Validation of Alternative Methods) to contribute to this effort and to evaluate new toxicity tests originating in Japan.

  3. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Phase 1

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley Multidisciplinary Design Optimization (MDO) method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. This report documents all computational experiments conducted in Phase I of the study. This report is a companion to the paper titled Initial Results of an MDO Method Evaluation Study by N. M. Alexandrov and S. Kodiyalam (AIAA-98-4884).

  4. Current methods for technology evaluation: primary data collection and synthetic methods

    NASA Astrophysics Data System (ADS)

    Goodman, Clifford

    1995-10-01

    Increased scrutiny of health care technologies is accompanied by greater attention to the quality of the information upon which technology policy decisions are made. In particular, there is greater understanding of, and demand for, technology evaluation methods that embody greater scientific rigor. Methods for evaluating health care technologies can be divided roughly into two main groups: primary data collection methods and synthetic or integrative methods. Improved understanding by analysts and policy makers of the relative strengths and weaknesses of these methods is improving the design and conduct of technology evaluations, as well as the interpretation of study findings for use in health care decision making.

  5. Methods for evaluating the DOE Appropriate-Technology Program: A review and compilation of evaluation methods

    NASA Astrophysics Data System (ADS)

    Lucarelli, B.

    1982-07-01

    Procedures for evaluating the energy impact of renewable energy resource projects funded by the Appropriate Technology (AT) program are described. Included are a discussion of the various evaluation approaches used by LBL over the past two years, definitions of key concepts such as direct and indirect energy impact and cost-effectiveness, and recommendations for a simplified evaluation approach for the future. Procedures are given for evaluating the direct energy impact of six renewable energy resource systems: (1) wind electric; (2) hydroelectric; (3) anaerobic digestion; (4) active solar water and space heating; (5) passive solar; and (6) geothermal space and water heating, as well as weatherization. Economic concepts and a simplified approach for computing the cost-effectiveness of small energy projects are also defined.

  6. Comparative study of heuristic evaluation and usability testing methods.

    PubMed

    Thyvalikakath, Thankam Paul; Monaco, Valerie; Thambuganipalle, Himabindu; Schleyer, Titus

    2009-01-01

    Usability methods, such as heuristic evaluation, cognitive walk-throughs and user testing, are increasingly used to evaluate and improve the design of clinical software applications. There is still some uncertainty, however, as to how those methods can be used to support the development process and evaluation in the most meaningful manner. In this study, we compared the results of a heuristic evaluation with those of formal user tests in order to determine which usability problems were detected by both methods. We conducted heuristic evaluation and usability testing on four major commercial dental computer-based patient records (CPRs), which together cover 80% of the market for chairside computer systems among general dentists. Both methods yielded strong evidence that the dental CPRs have significant usability problems. An average of 50% of empirically-determined usability problems were identified by the preceding heuristic evaluation. Some statements of heuristic violations were specific enough to precisely identify the actual usability problem that study participants encountered. Other violations were less specific, but still manifested themselves in usability problems and poor task outcomes. In this study, heuristic evaluation identified a significant portion of problems found during usability testing. While we make no assumptions about the generalizability of the results to other domains and software systems, heuristic evaluation may, under certain circumstances, be a useful tool to determine design problems early in the development cycle.

  7. Evaluation of methods of temperament scoring for beef cattle

    USDA-ARS?s Scientific Manuscript database

    Temperament can negatively affect various production traits, including live weight, ADG, DMI, conception rates and carcass weight. The objective of this research study was to evaluate temperament scoring methods in beef cattle. Crossbred (n = 228) calves were evaluated for temperament at weaning by ...

  8. Evaluating Multiple Prevention Programs: Methods, Results, and Lessons Learned

    ERIC Educational Resources Information Center

    Adler-Baeder, Francesca; Kerpelman, Jennifer; Griffin, Melody M.; Schramm, David G.

    2010-01-01

    Extension faculty and agents/educators are increasingly collaborating with local and state agencies to provide and evaluate multiple, distinct programs, yet there is limited information about measuring outcomes and combining results across similar program types. This article explicates the methods and outcomes of a state-level evaluation of…

  9. Optimization Evaluation of Geometric Error Based on Correctional Simplex Method

    NASA Astrophysics Data System (ADS)

    Zheng, P.; Zhang, L. N.; Zheng, H. D.; Chen, M. Y.

    2006-10-01

    This paper investigates the theory of geometric error evaluation and its application. On the basis of the geometric model of error evaluation, the features of geometric error enclosure evaluation are analyzed, and linear programming models are formulated for minimum-zone, maximum-inscribed and minimum-circumscribed association. Taking the minimum-condition criterion and the theory of minimizing the extremal difference function as the rules of geometric error evaluation, a correctional simplex method for the direct solution of the programming model is proposed, and its procedure is given. Furthermore, the method is verified with an example of cylindricity error evaluation, comparing the experimental results with those obtained from other common methods. The method is also applied to other geometric error evaluations in practice. Theoretical analysis and experimental results indicate that the proposed correctional simplex method provides good accuracy in geometric error evaluation, with high efficiency and stability as well as good universality and practicality.
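
    The abstract's cylindricity case needs cylindrical coordinates, but the structure of the minimum-zone linear program is the same as in this straightness sketch (scipy's HiGHS solver stands in for the paper's correctional simplex; names are illustrative):

      import numpy as np
      from scipy.optimize import linprog

      def min_zone_straightness(x, y):
          # Find the line a + b*x whose maximum deviation t from the points
          # is minimal: minimise t subject to |y_i - a - b*x_i| <= t.
          n = len(x)
          ones = np.ones((n, 1))
          A_ub = np.vstack([
              np.hstack([-ones, -x[:, None], -ones]),   #  y_i - a - b*x_i <= t
              np.hstack([ ones,  x[:, None], -ones]),   # -(y_i - a - b*x_i) <= t
          ])
          b_ub = np.concatenate([-y, y])
          c = np.array([0.0, 0.0, 1.0])                 # objective: t only
          res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                        bounds=[(None, None), (None, None), (0, None)])
          a, b, t = res.x
          return a, b, 2 * t                            # minimum zone width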

  10. A formative multi-method approach to evaluating training.

    PubMed

    Hayes, Holly; Scott, Victoria; Abraczinskas, Michelle; Scaccia, Jonathan; Stout, Soma; Wandersman, Abraham

    2016-10-01

    This article describes how we used a formative multi-method evaluation approach to gather real-time information about the processes of a complex, multi-day training with 24 community coalitions in the United States. The evaluation team used seven distinct, evaluation strategies to obtain evaluation data from the first Community Health Improvement Leadership Academy (CHILA) within a three-prong framework (inquiry, observation, and reflection). These methods included: comprehensive survey, rapid feedback form, learning wall, observational form, team debrief, social network analysis and critical moments reflection. The seven distinct methods allowed for both real time quality improvement during the CHILA and long term planning for the next CHILA. The methods also gave a comprehensive picture of the CHILA, which when synthesized allowed the evaluation team to assess the effectiveness of a training designed to tap into natural community strengths and accelerate health improvement. We hope that these formative evaluation methods can continue to be refined and used by others to evaluate training.

  11. Comparison of induction motor field efficiency evaluation methods

    SciTech Connect

    Hsu, J.S.; Kueck, J.D.; Olszewski, M.; Casada, D.A.; Otaduy, P.J.; Tolbert, L.M.

    1996-10-01

    Unlike laboratory testing of motor efficiency, certain methods given in IEEE Std 112 cannot be used to determine motor efficiency in the field. For example, it is difficult to load a motor in the field with a dynamometer when the motor is already coupled to driven equipment. Motor efficiency field evaluation faces a different environment from that for which IEEE Std 112 was chiefly written. A field evaluation method consists of one or several basic methods, grouped according to their physical nature. Their intrusiveness and accuracy are also discussed. This study is useful for field engineers seeking to select or establish a proper efficiency evaluation method by understanding the theories and error sources of the methods.

  12. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    PubMed

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  13. High mobility of large mass movements: a study by means of FEM/DEM simulations

    NASA Astrophysics Data System (ADS)

    Manzella, I.; Lisjak, A.; Grasselli, G.

    2013-12-01

    Large mass movements, such as rock avalanches and large volcanic debris avalanches, are characterized by extremely long runout, which cannot be modelled using a normal sliding friction law. For this reason, several theories derived from field observations, physical principles and laboratory experiments have been proposed to explain their high mobility. In order to investigate some of the processes invoked by these theories in more depth, simulations have been run with a new numerical tool called Y-GUI, based on the combined Finite Element-Discrete Element Method (FEM/DEM). The FEM/DEM method is a numerical technique developed by Munjiza et al. (1995) in which Discrete Element Method (DEM) algorithms are used to model the interaction between different solids, while Finite Element Method (FEM) principles are used to analyze their deformability, also making it possible to explicitly simulate the sudden loss of material cohesion (i.e., brittle failure). In particular, numerical tests inspired by the small-scale experiments of Manzella and Labiouse (2013) have been run. They consist of rectangular blocks released on a slope; each block is a rectangular discrete element made of a mesh of finite elements that is allowed to fragment. These simulations have highlighted the influence on propagation of block packing, i.e. whether the elements are piled into an ordered geometrical structure before failure or chaotically arranged as a loose material, and of topography, i.e. whether the slope break is smooth and regular or not. In addition, the effect of fracturing, i.e. fragmentation, on the total runout has been studied and highlighted.

  14. Approach to evaluating leak detection methods in underground storage tanks

    NASA Astrophysics Data System (ADS)

    Starr, J.; Broscious, J.; Niaki, S.

    1986-10-01

    The detection and evaluation of leaks in underground storage tanks require a detailed knowledge of conditions both within the tank and in the nearby surroundings. The test apparatus, as constructed, enables data regarding these environmental conditions to be readily obtained and incorporated in a carefully structured test program that minimizes the amount of costly full-scale testing that would otherwise be required to evaluate volumetric leak detection methods for underground storage tanks. In addition, sufficient flexibility has been designed into the apparatus to enable additional evaluations of non-volumetric test methods to be conducted, and different types of tanks and products to be tested in a cost-effective manner.

  15. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    SciTech Connect

    Chady, T.

    2004-02-26

    In this paper, the magnetic leakage flux and eddy current methods were used to evaluate changes in material properties caused by stress. Seven samples made of ferromagnetic material with different levels of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with a GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between the results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and utilized to evaluate the level of the applied stress. A strong correlation between the amount of applied stress and the maximum amplitude of the derivative was confirmed.

  16. A Mixed Methods Approach to Understanding School Counseling Program Evaluation: High School Counselors' Methods and Perceptions

    ERIC Educational Resources Information Center

    Aucoin, Jennifer Mangrum

    2013-01-01

    The purpose of this mixed methods concurrent triangulation study was to examine the program evaluation practices of high school counselors. A total of 294 high school counselors in Texas were assessed using a mixed methods concurrent triangulation design. A researcher-developed survey, the School Counseling Program Evaluation Questionnaire…

  18. Coastal Digital Elevation Models (DEMs) for tsunami hazard assessment on the French coasts

    NASA Astrophysics Data System (ADS)

    Maspataud, Aurélie; Biscara, Laurie; Hébert, Hélène; Schmitt, Thierry; Créach, Ronan

    2015-04-01

    bathymetric and topographic data, …) were gathered. Consequently, datasets were first assessed internally for both quality and accuracy, and then externally against one another to ensure consistency and gradual topographic/bathymetric transitions along dataset limits. The heterogeneous ages of the input data also stress the importance of taking into account the temporal variability of bathymetric features, especially in active areas (sandbanks, estuaries, channels). Locally, gaps between marine (hydrographic surveys) and terrestrial (topographic LIDAR) data have required the introduction of new methods and tools to solve interpolation problems. Through these activities, the goal is to improve the production line: to enhance the tools and procedures used for the processing, validation and qualification of bathymetric data, the data collection work, and the automation of the processing and integration chain, in order to produce improved bathymetric and topographic DEMs from the merged data. This work is supported by a French ANR program in the frame of "Investissements d'Avenir", under grant ANR-11-RSNR-00023-01.

  19. Ice dynamics of Himalayan glaciers (Himachal Pradesh, India) using TerraSAR-X/TanDEM-X data.

    NASA Astrophysics Data System (ADS)

    Vijay, Saurabh; Braun, Matthias

    2015-04-01

    Mountain glaciers are natural indicators of climate change. The Himalaya is part of a widely spread mountain range holding the second-largest ice mass after the polar regions. The Himalayan glaciers studied here are located in Himachal Pradesh and other territories of India. Precipitation in the region is influenced by both the Indian summer monsoon and the mid-latitude winter westerlies. Glacier discharge feeds the river basins and provides fresh water for the various infrastructural needs of urbanization in the state. The study aims to estimate ice thickness and volume change over the decade 2000-2011 and annually during 2011-2014. For this, TanDEM-X DEMs are subtracted from the SRTM C/X-band DEM of 2000. In addition, ice flow dynamics are quantified from the constellation of TerraSAR-X/TanDEM-X data using the SAR offset-tracking method. Preliminary investigations reveal that the 2011 terminus velocities of Bada Shigri (G077683E32169N), the biggest glacier of the state, Chhota Shigri (G077513E32227N), a benchmark glacier, and another glacier (G077547E32162N) were < 2 cm/day. Upstream velocities of the glaciers increase linearly and are influenced by glacier tributaries.
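
    The geodetic core of the decadal estimate is plain DEM differencing; a sketch (co-registration, void masking and radar-penetration corrections are assumed to have been handled beforehand):

      import numpy as np

      def elevation_and_volume_change(dem_new, dem_old, cell_area_m2):
          # Mean elevation change and total volume change over valid cells.
          dh = dem_new - dem_old
          valid = np.isfinite(dh)
          return dh[valid].mean(), dh[valid].sum() * cell_area_m2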

  20. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  1. Infrared image quality evaluation method without reference image

    NASA Astrophysics Data System (ADS)

    Yue, Song; Ren, Tingting; Wang, Chengsheng; Lei, Bo; Zhang, Zhijie

    2013-09-01

    Since infrared image quality depends on many factors, such as the optical performance and electrical noise of the thermal imager, image quality evaluation is an important issue that can benefit both subsequent image processing and the improvement of thermal imager capability. There are two ways to evaluate infrared image quality, with or without a reference image. For real-time thermal imagery, the no-reference method is preferred because it is difficult to obtain a standard image. Although various evaluation methods exist, there is no general metric for image quality evaluation. This paper introduces a novel method to evaluate infrared images without a reference image, from five aspects: noise, clarity, information volume and levels, information in the frequency domain, and the capability of automatic target recognition. Generally, the basic image quality is obtained from the first four aspects, and the quality of the target is acquired from the last aspect. The proposed method is tested on several infrared images captured by different thermal imagers. The indicators are calculated and compared with human visual assessments. The evaluation shows that this method successfully describes the characteristics of infrared images, and its results are consistent with the human visual system.
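
    Three of the five aspects lend themselves to compact estimators; a sketch under the assumption of a greyscale image array (the weighting that merges aspect scores into a single quality value is not specified here):

      import numpy as np
      from scipy.ndimage import laplace, median_filter

      def nr_scores(img):
          img = img.astype(float)
          # Noise: residual energy after a small median filter.
          noise = np.std(img - median_filter(img, size=3))
          # Clarity: variance of the Laplacian (edge sharpness).
          clarity = np.var(laplace(img))
          # Information volume: grey-level entropy in bits.
          counts, _ = np.histogram(img, bins=256)
          p = counts[counts > 0] / counts.sum()
          entropy = -np.sum(p * np.log2(p))
          return {"noise": noise, "clarity": clarity, "entropy": entropy}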

  2. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is a measure of DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives: the design rationale structure scale, association knowledge and reasoning ability, the degree of design justification support, and the degree of knowledge representation conciseness. A comprehensive value of DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge networks and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that provides objective metrics and a selection basis for DR knowledge reuse during the product design process. In addition, the method is shown to give more effective guidance and support for the application and management of DR knowledge.

  3. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  4. Methods for the evaluation of alternative disaster warning systems

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.

    1977-01-01

    For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.

  5. Subjective and Objective Methods of Evaluating Social Programs.

    ERIC Educational Resources Information Center

    Alemi, Farrokh

    1987-01-01

    Trade-offs are implicit in choosing a subjective or objective method for evaluating social programs. The differences between Bayesian and traditional statistics, decision and cost-benefit analysis, and anthropological and traditional case systems illustrate trade-offs in choosing methods because of limited resources. (SLD)

  6. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    ERIC Educational Resources Information Center

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  7. 10 CFR 963.13 - Preclosure suitability evaluation method.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Preclosure suitability evaluation method. 963.13 Section 963.13 Energy DEPARTMENT OF ENERGY YUCCA MOUNTAIN SITE SUITABILITY GUIDELINES Site Suitability... geologic repository at the Yucca Mountain site using the method described in paragraph (b) of this section...

  8. 10 CFR 963.13 - Preclosure suitability evaluation method.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Preclosure suitability evaluation method. 963.13 Section 963.13 Energy DEPARTMENT OF ENERGY YUCCA MOUNTAIN SITE SUITABILITY GUIDELINES Site Suitability... geologic repository at the Yucca Mountain site using the method described in paragraph (b) of this section...

  9. 10 CFR 963.13 - Preclosure suitability evaluation method.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Preclosure suitability evaluation method. 963.13 Section 963.13 Energy DEPARTMENT OF ENERGY YUCCA MOUNTAIN SITE SUITABILITY GUIDELINES Site Suitability... geologic repository at the Yucca Mountain site using the method described in paragraph (b) of this section...

  10. 10 CFR 963.13 - Preclosure suitability evaluation method.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Preclosure suitability evaluation method. 963.13 Section 963.13 Energy DEPARTMENT OF ENERGY YUCCA MOUNTAIN SITE SUITABILITY GUIDELINES Site Suitability... geologic repository at the Yucca Mountain site using the method described in paragraph (b) of this section...

  11. 10 CFR 963.13 - Preclosure suitability evaluation method.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Preclosure suitability evaluation method. 963.13 Section 963.13 Energy DEPARTMENT OF ENERGY YUCCA MOUNTAIN SITE SUITABILITY GUIDELINES Site Suitability... geologic repository at the Yucca Mountain site using the method described in paragraph (b) of this section...

  12. Study on evaluation methods for Rayleigh wave dispersion characteristic

    USGS Publications Warehouse

    Shi, L.; Tao, X.; Kayen, R.; Shi, H.; Yan, S.

    2005-01-01

    The evaluation of Rayleigh wave dispersion characteristics is the key step in detecting S-wave velocity structure. By directly comparing dispersion curves with those from the spectral analysis of surface waves (SASW) method, rather than comparing S-wave velocity structures, the validity and precision of the microtremor-array method (MAM) can be evaluated more objectively. Results from the China-US joint surface wave investigation at 26 sites in Tangshan, China, show that the MAM has the same precision as the SASW method at 83% of the 26 sites. The MAM is valid for testing Rayleigh wave dispersion characteristics and has great application potential for detecting site S-wave velocity structure.

  13. A new dataset evaluation method based on category overlap.

    PubMed

    Oh, Sejong

    2011-02-01

    The quality of a dataset has a profound effect on classification accuracy, and there is a clear need for a method to evaluate this quality. In this paper, we propose a new dataset evaluation method using the R-value measure. The proposed method is based on the ratio of overlapping areas among categories in a dataset. A high R-value for a dataset indicates that the dataset contains wide overlapping areas among its categories, and classification accuracy on the dataset is likely to be low. The R-value measure can be used to understand the characteristics of a dataset, to guide the feature selection process, and to inform the proper design of new classifiers.
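
    A sketch of the overlap idea with a k-nearest-neighbour criterion (the parameters k and theta are this sketch's choices; see the paper for the exact R-value definition):

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      def r_value(X, y, k=7, theta=3):
          # A sample lies in an overlap area when more than `theta` of its
          # k nearest neighbours carry a different class label.
          idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)[1]
          others = (y[idx[:, 1:]] != y[:, None]).sum(axis=1)  # skip self
          return float(np.mean(others > theta))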

  14. DOE methods for evaluating environmental and waste management samples.

    SciTech Connect

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  15. Simulation of a tablet coating process at different scales using DEM.

    PubMed

    Boehling, P; Toschkoff, G; Just, S; Knop, K; Kleinebudde, P; Funke, A; Rehbaum, H; Rajniak, P; Khinast, J G

    2016-10-10

    Spray coating of tablets is an important unit operation in the pharmaceutical industry and is mainly used for modified release, enteric protection, better appearance and brand recognition. It can also be used to apply an additional active pharmaceutical ingredient to the tablet core. Scale-up of such a process is an important step in commercialization. However, scale-up is not trivial, and frequently the required coating quality cannot be reached at manufacturing scale. Thus, we propose a method where laboratory experiments are carried out, yet scale-up is done via computational methods, i.e., by extrapolating results to larger scales. In recent years, the Discrete Element Method (DEM) has been widely used to simulate tablet behavior in laboratory-scale drum coaters. Due to increasing computational power and more sophisticated DEM algorithms, it has become possible to simulate millions of particles on regular PCs and to model industrial-scale tablet coating devices. In this work, simulations were performed at the laboratory, pilot and industrial scales, and DEM was used to study how different scale-up rules influence the bed behavior at larger scales. The material parameters of the tablets were measured in the laboratory, and a glued-sphere approach was applied to model the tablet shape. The results include a vast amount of qualitative and quantitative data at the different scales. In conclusion, the evolution of the inter-tablet coating variation for the different scales and process parameters is presented. The results suggest that keeping the Froude number constant during scale-up leads to faster processes, as the cycle time is shorter and the spray residence time more uniform, compared with keeping the circumferential velocity constant. Copyright © 2016 Elsevier B.V. All rights reserved.
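
    The two scale-up rules compared in the study differ only in how drum speed falls with drum radius; a sketch (the example radii and speed are invented for illustration):

      import numpy as np

      def rpm_constant_froude(rpm_ref, r_ref, r_new):
          # Fr = omega**2 * R / g constant  =>  omega scales as 1 / sqrt(R).
          return rpm_ref * np.sqrt(r_ref / r_new)

      def rpm_constant_tip_speed(rpm_ref, r_ref, r_new):
          # Circumferential velocity omega * R constant  =>  omega ~ 1 / R.
          return rpm_ref * r_ref / r_new

      # Lab drum (0.15 m radius, 20 rpm) scaled to a 0.6 m radius drum:
      print(rpm_constant_froude(20, 0.15, 0.6))     # 10.0 rpm
      print(rpm_constant_tip_speed(20, 0.15, 0.6))  #  5.0 rpm

    The constant-Froude drum turns twice as fast as the constant-tip-speed one in this example, which is consistent with the shorter cycle times reported above.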

  16. A numerical analysis method for evaluating rod lenses using the Monte Carlo method.

    PubMed

    Yoshida, Shuhei; Horiuchi, Shuma; Ushiyama, Zenta; Yamamoto, Manabu

    2010-12-20

    We propose a numerical analysis method for evaluating GRIN rod lenses using the Monte Carlo method. Evaluations of the modulation transfer function (MTF) of a GRIN lens using this method closely match actual measurements. Experimentally, the MTF is measured using a square-wave chart and is then calculated based on the distribution of output intensity on the chart. In contrast, the conventional computational method evaluates the MTF based on a spot diagram produced by an incident point light source, and its results differ greatly from those of experiments. We therefore developed an evaluation method that emulates the experimental system, based on the Monte Carlo method, and verified that it matches the experimental results more closely than the conventional method.

  17. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective rather than objective) to decide which alternative is best for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called the 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
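
    A minimal sketch of a hierarchical weighted average (the criteria tree, weights and scores are invented for illustration):

      def hwa(node, scores):
          # A node is either a leaf criterion (looked up in `scores`) or a
          # list of (weight, subnode) pairs whose weights sum to 1.
          if isinstance(node, str):
              return scores[node]
          return sum(w * hwa(child, scores) for w, child in node)

      # Toy criteria tree for comparing design alternatives
      tree = [(0.5, [(0.7, "throughput"), (0.3, "latency")]),
              (0.3, "cost"),
              (0.2, "maintainability")]
      alt_a = {"throughput": 8, "latency": 6, "cost": 7, "maintainability": 5}
      print(hwa(tree, alt_a))   # 6.8, the overall score for alternative A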

  18. System and method for evaluating a wire conductor

    DOEpatents

    Panozzo, Edward; Parish, Harold

    2013-10-22

    A method of evaluating an electrically conductive wire segment having an insulated intermediate portion and non-insulated ends includes passing the insulated portion of the wire segment through an electrically conductive brush. According to the method, an electrical potential is established on the brush by a power source. The method also includes determining a value of electrical current that is conducted through the wire segment by the brush when the potential is established on the brush. The method additionally includes comparing the value of electrical current conducted through the wire segment with a predetermined current value to thereby evaluate the wire segment. A system for evaluating an electrically conductive wire segment is also disclosed.

  19. Jagd nach dem O'Conell-Effekt

    NASA Astrophysics Data System (ADS)

    Reichmann, Norbert

    2013-03-01

    In the present paper, I focus on the O'Connell effect of the W UMa variable V502 Cyg, with the main aim of showing it in the light curve. 166 observations were collected in the V and B bands (100 and 66 measurements, respectively) from my private observatory in Kästenberg, Austria, Ossiacher Tauern, at an elevation of 890 m. All data were acquired with an Apo 130/1200 and an Apogee Alta U16M CCD camera. Photometric colour-band and narrowband data were collected simultaneously and evaluated. I have termed the combination of photometric data with deep-sky imaging data "pretty-picture-photometry". This combination of photometric measurements with colour and narrowband data is presented here for V502 Cyg in its surrounding deep-sky field. Norbert Reichmann is a member of the BAV.

  20. The emergence of mixing methods in the field of evaluation.

    PubMed

    Greene, Jennifer C

    2015-06-01

    When and how did the contemporary practice of mixing methods in social inquiry get started? What events transpired to catalyze the explosive conceptual development and practical adoption of mixed methods social inquiry over recent decades? How has this development progressed? What "next steps" would be most constructive? These questions are engaged in this personally narrative account of the beginnings of the contemporary mixed methods phenomenon in the field of evaluation from the perspective of a methodologist who was there.

  1. Investigation of micro-structural phenomena at aggregate level in concretes using DEM

    NASA Astrophysics Data System (ADS)

    Nitka, Michał; Tejchman, Jacek

    2017-06-01

    This paper presents numerical analyses of concrete beams under three-point bending. The discrete element method (DEM) was used to calculate fracture at the aggregate level. Concrete was described as a four-phase material composed of aggregate, cement matrix, interfacial transition zones (ITZs) and macro-voids. The beam micro-structure was taken directly from our experiments using X-ray micro-tomography. Simulations were carried out with real aggregates modelled as sphere clusters. Numerical results were compared with laboratory outcomes. Special attention was paid to fracture propagation and some micro-structural phenomena at the aggregate level.

  2. Evaluating marginal likelihood with thermodynamic integration method and comparison with several other numerical methods

    DOE PAGES

    Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...

    2016-02-05

    Evaluating the marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood, or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, thermodynamic integration, which has not previously been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through path sampling, which conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing it with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that was recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. It is also applied to a synthetic case of groundwater modeling with four alternative models, and the application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. In summary, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
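    As a rough illustration of the path-sampling idea described above (a sketch, not the authors' implementation), the following estimates the log marginal likelihood of a one-parameter Gaussian toy model, for which the exact answer is known, by averaging the log-likelihood under power posteriors p(theta)L(theta)^beta and integrating over beta with the trapezoid rule. The beta ladder, step size, and chain length are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: theta ~ N(0, 1) prior, one observation y | theta ~ N(theta, 1),
# so the exact log marginal likelihood is log N(y; 0, 2).
y = 1.5

def log_like(th):
    return -0.5 * np.log(2 * np.pi) - 0.5 * (y - th) ** 2

def log_prior(th):
    return -0.5 * np.log(2 * np.pi) - 0.5 * th ** 2

def power_posterior_mean_loglike(beta, n=6000, burn=1000, step=1.5):
    """Metropolis chain targeting p(theta) * L(theta)**beta;
    returns the chain average of log L(theta)."""
    th = 0.0
    lp = log_prior(th) + beta * log_like(th)
    trace = np.empty(n)
    for i in range(n):
        prop = th + step * rng.normal()
        lp_prop = log_prior(prop) + beta * log_like(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            th, lp = prop, lp_prop
        trace[i] = log_like(th)
    return trace[burn:].mean()

# Temperature ladder; the power 5 packs rungs near beta = 0,
# where the integrand changes fastest.
betas = np.linspace(0.0, 1.0, 21) ** 5
means = [power_posterior_mean_loglike(b) for b in betas]

log_Z = np.trapz(means, betas)                # thermodynamic integration
exact = -0.5 * np.log(2 * np.pi * 2.0) - y ** 2 / 4.0
print(f"TI estimate: {log_Z:.3f}   exact: {exact:.3f}")
```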

  3. New method for evaluation of tongue-coating status.

    PubMed

    Shimizu, T; Ueda, T; Sakurai, K

    2007-06-01

    The purpose of this study was to determine the viability of the Tongue Coating Index, a new method for evaluating tongue-coating status. To determine the reliability and reproducibility of our new evaluation criteria (score 0: tongue coating not visible; score 1: tongue coating thin, papillae of tongue visible; score 2: tongue coating very thick, papillae of tongue not visible), 10 observers evaluated 20 photographs of tongues. Each tongue surface was divided into nine sections. Observers evaluated each section according to our new criteria, and each score for tongue-coating status was recorded in the pertinent section of the Tongue Coating Record form. They repeated the same evaluation 2 weeks after the first evaluation. The relationship between the scores obtained and the number of oral microorganisms was investigated in 50 edentulous patients. Tongue coating was collected from the tongue surface after evaluation of tongue-coating status, and the total number of anaerobic bacteria and the number of Candida species were counted from the specimens collected. Interobserver and intraobserver agreement were 0.66 and 0.80 by Cohen's kappa, respectively. No significant difference was observed in the number of Candida species among the three scores. The number of total anaerobic bacteria, however, differed significantly among the scores (P < 0.05). We therefore conclude that our method for evaluating tongue-coating status offers new criteria that are superior in reliability and reproducibility, and that also reflect the total number of anaerobic bacteria present on the dorsum of the tongue.
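    For reference, the agreement figures quoted above are Cohen's kappa values, which can be computed directly from two raters' scores. A minimal sketch with made-up section scores on the 0/1/2 coating scale:

```python
import numpy as np

def cohens_kappa(r1, r2, levels=(0, 1, 2)):
    """Cohen's kappa for two raters scoring the same items."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                                   # observed agreement
    pe = sum(np.mean(r1 == k) * np.mean(r2 == k) for k in levels)  # chance agreement
    return (po - pe) / (1 - pe)

# Two raters scoring nine tongue sections (illustrative data only)
a = [0, 1, 1, 2, 2, 1, 0, 1, 2]
b = [0, 1, 2, 2, 2, 1, 0, 1, 1]
print(round(cohens_kappa(a, b), 2))
```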

  4. The Shuttle Radar Topography Mission: A Global DEM

    NASA Technical Reports Server (NTRS)

    Farr, Tom G.; Kobrick, Mike

    2000-01-01

    Digital topographic data are critical for a variety of civilian, commercial, and military applications. Scientists use Digital Elevation Models (DEMs) to map drainage patterns and ecosystems, and to monitor land surface changes over time. The mountain-building effects of tectonics and the climatic effects of erosion can also be modeled with DEMs. The data's military applications include mission planning and rehearsal, modeling, and simulation. Commercial applications include determining locations for cellular phone towers, enhanced ground proximity warning systems for aircraft, and improved maps for backpackers. The Shuttle Radar Topography Mission (SRTM) (Fig. 1) is a cooperative project between NASA and the National Imagery and Mapping Agency (NIMA) of the U.S. Department of Defense. The mission is designed to use a single-pass radar interferometer to produce a digital elevation model of the Earth's land surface between about 60 degrees north and south latitude. The DEM will have 30 m pixel spacing and about 15 m vertical errors.

  5. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2014-07-01

    One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two datasets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, which underwent careful identification of different environments and whose deforestation features were corrected by a new method of increasing pixel values of the DEM; and (2) a set of eighteen hydrological-topographic descriptors based on the corrected SRTM DEM. The hydrological-topographic description was generated by the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (a.s.l.) by the elevation of the nearest hydrologically connected drainage. The validation of the HAND dataset was done by in situ hydrological description of 110 km of walking trails, also available in this dataset. The new SRTM DEM expands the applicability of SRTM data for landscape modelling, and the dataset of hydrological features based on topographic modelling is well suited for ecological modelling and an important contribution to the environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the polygons selected for deforestation correction are available at http://ppbio.inpa.gov.br/knb/metacat/naman.317.3/ppbio; the set of hydrological-topographic descriptors is available at
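    The HAND normalization described above can be sketched compactly: each cell's elevation is reduced by the elevation of the first drainage cell reached by walking its flow path downslope. A toy Python version, assuming flow directions are already given as a cell-to-downstream-cell map (real implementations derive this from the DEM with D8 routing; the grid and mask below are made up):

```python
import numpy as np

def hand(dem, flow_to, drainage):
    """Height Above the Nearest Drainage (simplified sketch).

    dem      : 2-D array of elevations
    flow_to  : dict mapping a cell (i, j) to its downstream neighbour, None at outlets
    drainage : boolean 2-D array marking hydrologically connected drainage cells
    """
    out = np.full(dem.shape, np.nan)
    for cell in np.ndindex(dem.shape):
        c = cell
        while c is not None and not drainage[c]:
            c = flow_to.get(c)          # walk downslope along the flow path
        if c is not None:               # normalise by the drainage elevation reached
            out[cell] = dem[cell] - dem[c]
    return out

# 2x3 toy grid: the left column is a stream, everything drains westward
dem = np.array([[5.0, 8.0, 12.0],
                [4.0, 7.0, 11.0]])
drainage = np.array([[True, False, False],
                     [True, False, False]])
flow_to = {(i, j): (i, j - 1) for i in range(2) for j in range(1, 3)}
print(hand(dem, flow_to, drainage))     # [[0. 3. 7.] [0. 3. 7.]]
```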

  6. [An Introduction to Methods for Evaluating Health Care Technology].

    PubMed

    Lee, Ting-Ting

    2015-06-01

    The rapid and continual advance of healthcare technology makes ensuring that this technology is used effectively to achieve its original goals a critical issue. This paper presents three methods that may be applied by healthcare professionals in the evaluation of healthcare technology. These methods include: the perception/experiences of users, user work-pattern changes, and chart review or data mining. The first method includes two categories: using interviews to explore the user experience and using theory-based questionnaire surveys. The second method applies work sampling to observe the work pattern changes of users. The last method conducts chart reviews or data mining to analyze the designated variables. In conclusion, while evaluative feedback may be used to improve the design and development of healthcare technology applications, the informatics competency and informatics literacy of users may be further explored in future research.

  7. Using analytic network process for evaluating mobile text entry methods.

    PubMed

    Ocampo, Lanndon A; Seva, Rosemary R

    2016-01-01

    This paper highlights a preference evaluation methodology for text entry methods on a touch-keyboard smartphone using the analytic network process (ANP). Evaluations of text entry methods in the literature mainly consider speed and accuracy. This study presents an alternative means of selecting a text entry method that considers user preference. A case study was carried out with a group of experts who were asked to develop a selection decision model for five text entry methods. The decision problem is flexible enough to reflect the interdependencies of decision elements that are necessary to describe real-life conditions. Results showed that the QWERTY method is preferred over the other text entry methods, and that the arrangement of keys is the most important criterion in characterizing a sound method. Sensitivity analysis, using simulation of normally distributed random numbers under fairly large perturbation, showed the foregoing results to be reliable enough to reflect robust judgment. The main contribution of this paper is the introduction of a multi-criteria decision approach to the preference evaluation of text entry methods.

  8. Getting to the Bottom Line: A Method for Synthesizing Findings within Mixed-Method Program Evaluations.

    ERIC Educational Resources Information Center

    McConney, Andrew; Rudd, Andy; Ayres, Robert

    2002-01-01

    Proposes a method for synthesizing findings within mixed-method program evaluations. The proposed method uses a set of criteria and analytic techniques to assess the worth of each data source or type and to establish what each says about program effect. Once data are on a common scale, simple mathematics allows synthesis across data sources or…

  9. User Experience Evaluation Methods in Product Development (UXEM'09)

    NASA Astrophysics Data System (ADS)

    Roto, Virpi; Väänänen-Vainio-Mattila, Kaisa; Law, Effie; Vermeeren, Arnold

    High quality user experience (UX) has become a central competitive factor of product development in mature consumer markets [1]. Although the term UX originated from industry and is a widely used term also in academia, the tools for managing UX in product development are still inadequate. A prerequisite for designing delightful UX in an industrial setting is to understand both the requirements tied to the pragmatic level of functionality and interaction and the requirements pertaining to the hedonic level of personal human needs, which motivate product use [2]. Understanding these requirements helps managers set UX targets for product development. The next phase in a good user-centered design process is to iteratively design and evaluate prototypes [3]. Evaluation is critical for systematically improving UX. In many approaches to UX, evaluation basically needs to be postponed until the product is fully or at least almost fully functional. However, in an industrial setting, it is very expensive to find the UX failures only at this phase of product development. Thus, product development managers and developers have a strong need to conduct UX evaluation as early as possible, well before all the parts affecting the holistic experience are available. Different types of products require evaluation on different granularity and maturity levels of a prototype. For example, due to its multi-user characteristic, a community service or an enterprise resource planning system requires a broader scope of UX evaluation than a microwave oven or a word processor that is meant for a single user at a time. Before systematic UX evaluation can be taken into practice, practical, lightweight UX evaluation methods suitable for different types of products and different phases of product readiness are needed. A considerable amount of UX research is still about the conceptual frameworks and models for user experience [4]. Besides, applying existing usability evaluation methods (UEMs) without

  10. On urban road traffic state evaluation index system and method

    NASA Astrophysics Data System (ADS)

    Su, Fei; Dong, Honghui; Jia, Limin; Sun, Xuan

    2017-01-01

    Traffic state evaluation is a basic and critical task in research on road traffic congestion. It can provide basic data support for improvement measures and information release in traffic management and service. The aim of this research is to obtain a comprehensive value that describes the traffic state accurately, based on the evaluation index system. In this paper, the evaluation is carried out using the fuzzy c-means (FCM) algorithm and a fuzzy entropy weight method. In this framework, traffic flow is classified into six different states to determine the fuzzy range of the indices using improved FCM clustering analysis. The fuzzy entropy weight method is then used to compute traffic-state evaluation results at the section, road, and road-network levels. Experiments based on traffic information from a subset of Beijing's road network show that the evaluation findings are in accordance with the actual situation and with people's perception of the traffic state.
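    The entropy-weight step can be illustrated independently of the paper's exact formulation: indices whose values vary more across samples carry more information and receive larger weights. A minimal sketch with hypothetical, pre-normalized section-level indices (names and numbers are assumptions):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (n_samples, n_indices) matrix of
    positive, benefit-oriented, pre-normalised evaluation indices."""
    P = X / X.sum(axis=0)                                  # share per sample
    n = X.shape[0]
    e = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(n)   # entropy per index
    d = 1.0 - e                                            # diversification
    return d / d.sum()                                     # normalised weights

# Hypothetical indices for three road sections: speed ratio, saturation, queue score
X = np.array([[0.9, 0.3, 0.2],
              [0.5, 0.7, 0.6],
              [0.2, 0.9, 0.9]])
w = entropy_weights(X)
scores = X @ w                    # composite traffic-state score per section
print(w.round(3), scores.round(3))
```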

  11. BOREAS Regional DEM in Raster Format and AEAC Projection

    NASA Technical Reports Server (NTRS)

    Knapp, David; Verdin, Kristine; Hall, Forrest G. (Editor)

    2000-01-01

    This data set is based on the GTOPO30 Digital Elevation Model (DEM) produced by the United States Geological Survey EROS Data Center (USGS EDC). The BOReal Ecosystem-Atmosphere Study (BOREAS) region (1000 km x 1000 km) was extracted from the GTOPO30 data and reprojected by BOREAS staff into the Albers Equal-Area Conic (AEAC) projection. The pixel size of these data is 1 km. The data are stored in binary image-format files.

  12. [Evaluation of the 360-degree assessment method in a hospital].

    PubMed

    Møller, Lars Bo Krag; Ejlskov, Morten Wolff

    2008-09-15

    The present study examines the acceptability of the 360-degree assessment method as a means for evaluating the management and leadership competencies of the clinical staff of a university hospital. Twenty-eight consultants and registered nurses underwent evaluation. One group had debriefing with management consultants, the other with the head of the clinical department. Two months later, the applicability of the method was assessed. The strengths and weaknesses of the leaders were exposed, and areas for improvement were made visible, and acceptance of the method was widespread. Anonymity was required. The group coached by management consultants tended to benefit the most from the evaluation. Using a web-based solution to collect the data was unproblematic.

  13. Development of characteristic evaluation method on FR cycle system

    SciTech Connect

    Shinoda, Y.; Shiotani, H.; Hirao, K.

    2002-07-01

    The present report explains some results of the characteristic evaluation work on various FR cycle system concepts in the first phase of JNC's 'Feasibility Study on Commercialized Fast Reactor Cycle System' (from 1999 to March 2001). The evaluation method is developed for six criteria: economics, effective utilization of uranium resources, reduction of environmental impact, safety, proliferation resistance, and technological feasibility. (authors)

  14. Fractal Image Informatics: from SEM to DEM

    NASA Astrophysics Data System (ADS)

    Oleschko, K.; Parrot, J.-F.; Korvin, G.; Esteves, M.; Vauclin, M.; Torres-Argüelles, V.; Salado, C. Gaona; Cherkasov, S.

    2008-05-01

    In this paper, we introduce a new branch of Fractal Geometry: Fractal Image Informatics, devoted to the systematic and standardized fractal analysis of images of natural systems. The methods of this discipline are based on the properties of multiscale images of self-affine fractal surfaces. As proved in the paper, the image inherits the scaling and lacunarity of the surface and of its reflectance distribution [Korvin, 2005]. We claim that the fractal analysis of these images must be done without any smoothing, thresholding or binarization. Two new tools of Fractal Image Informatics, firmagram analysis (FA) and generalized lacunarity (GL), are presented and discussed in detail. These techniques are applicable to any kind of image or to any observed positive-valued physical field, and can be used to correlate images. It will be shown, by a modified Grassberger-Hentschel-Procaccia approach [Phys. Lett. 97A, 227 (1983); Physica 8D, 435 (1983)], that GL obeys the same scaling law as the Allain-Cloitre lacunarity [Phys. Rev. A 44, 3552 (1991)] but is free of the problems associated with gliding boxes. Several applications are shown from soil physics, surface science, and other fields.

  15. DEM Generation with WorldView-2 Images

    NASA Astrophysics Data System (ADS)

    Büyüksalih, G.; Baz, I.; Alkan, M.; Jacobsen, K.

    2012-07-01

    For planning purposes, a 42 km stretch of the Black Sea coastline, starting at the Bosporus and running westward with a width of approximately 5 km, was imaged by WorldView-2. Three stereo scenes were oriented first by a 3D affine transformation and later by a bias-corrected RPC solution. The results are nearly the same, but they are limited by the identification of the control points in the images. Nevertheless, after blunder elimination by data snooping, root mean square discrepancies below 1 pixel were reached. The root mean square discrepancy of the control point heights ranged from 0.5 m to 1.3 m, with base-to-height ratios between 1:1.26 and 1:1.80. Digital Surface Models (DSMs) with 4 m spacing were generated by least squares matching with region growing, supported by image pyramids. A high percentage of the mountainous area is covered by forest, requiring the approximation based on image pyramids; in the forest areas, approximation by region growing alone leads to larger gaps in the DSM. Owing to the good image quality of WorldView-2, the correlation coefficients reached by least squares matching are high, and even in most forest areas a satisfactory density of accepted points was reached. Two stereo models have an overlapping area of 1.6 km by 6.7 km, allowing an accuracy evaluation. Small but nevertheless significant differences in scene orientation were eliminated by a least squares shift of the two overlapping height models onto each other. The root mean square difference of the two independent DSMs is 1.06 m or, as a function of terrain inclination, 0.74 m + 0.55 m × tan(slope). The terrain inclination is 7° on average, with 12% exceeding 17°. The frequency distribution of height discrepancies is not far from a normal distribution but, as usual, large discrepancies occur more often than a normal distribution would predict. This can also be seen in the normalized median absolute deviation (NMAD), related to the 68% probability level, of 0.83 m.
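    The two accuracy statistics quoted above are easy to reproduce on height-difference data: the NMAD as a robust stand-in for the standard deviation, and a slope-dependent error model RMSZ = a + b·tan(slope) fitted to slope-binned RMS values. A sketch on synthetic data (binning and sample sizes are arbitrary choices):

```python
import numpy as np

def nmad(dh):
    """Normalized median absolute deviation of height differences:
    a robust counterpart of the standard deviation."""
    dh = np.asarray(dh, float)
    return 1.4826 * np.median(np.abs(dh - np.median(dh)))

def fit_slope_model(dh, slope_deg, bins=10):
    """Fit RMSZ(slope) = a + b * tan(slope) to slope-binned RMS values."""
    t = np.tan(np.radians(slope_deg))
    edges = np.linspace(t.min(), t.max(), bins + 1)
    mids, rms = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (t >= lo) & (t < hi)
        if m.sum() > 5:
            mids.append(0.5 * (lo + hi))
            rms.append(np.sqrt(np.mean(dh[m] ** 2)))
    A = np.column_stack([np.ones(len(mids)), mids])
    (a, b), *_ = np.linalg.lstsq(A, np.array(rms), rcond=None)
    return a, b

# Synthetic check: differences drawn with the paper's a = 0.74 m, b = 0.55 m
rng = np.random.default_rng(1)
slope = rng.uniform(0, 25, 5000)
dh = rng.normal(0, 0.74 + 0.55 * np.tan(np.radians(slope)))
print(f"NMAD = {nmad(dh):.2f} m")
a, b = fit_slope_model(dh, slope)
print(f"fitted a = {a:.2f} m, b = {b:.2f} m")
```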

  16. Study on Turbulent Modeling in Gas Entrainment Evaluation Method

    NASA Astrophysics Data System (ADS)

    Ito, Kei; Ohshima, Hiroyuki; Nakamine, Yoshiaki; Imai, Yasutomo

    Suppression of the gas entrainment (GE) phenomena caused by free surface vortices is very important for establishing an economically superior design of the Japanese sodium-cooled fast reactor (JSFR). However, due to the non-linearity and/or locality of the GE phenomena, it is not easy to evaluate their occurrence accurately. In other words, the onset condition of the GE phenomena in the JSFR is not easily predicted based on scaled-model and/or partial-model experiments. Therefore, the authors are developing a CFD-based evaluation method in which the non-linearity and locality of the GE phenomena can be considered. In the evaluation method, macroscopic vortex parameters, e.g. circulation, are determined by three-dimensional CFD, and GE-related parameters, e.g. gas core (GC) length, are then calculated using the Burgers vortex model. This procedure is an efficient way to evaluate the GE phenomena in the JSFR. However, it is well known that the Burgers vortex model tends to overestimate the GC length due to the lack of consideration of some physical mechanisms. Therefore, in this study, the authors develop a turbulent vortex model to evaluate the GE phenomena more accurately. The improved GE evaluation method with the turbulent viscosity model is then validated by analyzing the GC lengths observed in a simple experiment. The evaluation results show that the GC lengths analyzed by the improved method are shorter than those of the original method and give better agreement with the experimental data.

  17. Interpolation and elevation errors: the impact of the DEM resolution

    NASA Astrophysics Data System (ADS)

    Achilleos, Georgios A.

    2015-06-01

    Digital Elevation Models (DEMs) are developing and evolving at a fast pace, given the progress of computer science and technology. This development, though, is not accompanied by an advancement of knowledge about the quality of the models and their inherent inaccuracy. On most occasions the user is not aware of this quality, and is thus unaware of the corresponding uncertainty in derived products. Extensive research has been conducted - and still is - in this direction. The research presented in this paper analyzes the behavior of the elevation errors recorded in a DEM. This behavior is caused by altering the DEM resolution when the interpolation algorithm is applied. Contour lines from a topographical map are used as input data. Elevation errors are calculated at the positions of the initial input data and wherever the elevation is known. The recorded elevation errors are then analyzed in order to reach conclusions about their distribution and the way in which they occur.

  18. Evaluation of AMOEBA: a spectral-spatial classification method

    USGS Publications Warehouse

    Jenson, Susan K.; Loveland, Thomas R.; Bryant, J.

    1982-01-01

    Multispectral remotely sensed images have been treated as arbitrary multivariate spectral data for purposes of clustering and classifying. However, the spatial properties of image data can also be exploited. AMOEBA is a clustering and classification method that is based on a spatially derived model for image data. In an evaluation test, Landsat data were classified with both AMOEBA and a widely used spectral classifier. The test showed that irrigated crop types can be classified as accurately with the AMOEBA method as with the generally used spectral method ISOCLS; the AMOEBA method, however, requires less computer time.

  19. Method of Best Representation for Averages in Data Evaluation

    SciTech Connect

    Birch, M.; Singh, B.

    2014-06-15

    A new method for averaging data for which incomplete information is available is presented. For example, this method would be applicable during data evaluation where only the final outcomes of the experiments and the associated uncertainties are known. This method is based on using the measurements to construct a mean probability density for the data set. This “expected value method” (EVM) is designed to treat asymmetric uncertainties and has distinct advantages over other methods of averaging, including giving a more realistic uncertainty, being robust to outliers and consistent under various representations of the same quantity.

  20. Methods for Evaluating Text Extraction Toolkits: An Exploratory Investigation

    DTIC Science & Technology

    2015-01-22

    MITRE Technical Report MTR140443R2: Methods for Evaluating Text Extraction Toolkits: An Exploratory Investigation. Report date: January 2015. Approved for public release; distribution unlimited. Abstract: Text extraction...

  1. Performance evaluation of BPM system in SSRF using PCA method

    NASA Astrophysics Data System (ADS)

    Chen, Zhi-Chu; Leng, Yong-Bin; Yan, Ying-Bing; Yuan, Ren-Xian; Lai, Long-Wei

    2014-07-01

    The beam position monitor (BPM) system is of the utmost importance in a light source, and its capability depends on the resolution of the system. The traditional approach of taking the standard deviation of the raw data merely gives an upper limit on the resolution. Principal component analysis (PCA) has been introduced into accelerator physics and can be used to separate out the actual beam signals: beam-related information is extracted and removed before the BPM performance is evaluated. A series of studies was carried out at the Shanghai Synchrotron Radiation Facility (SSRF), and PCA proved to be an effective and robust method for evaluating the performance of our BPM system.
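    A common way to realize this idea (a sketch under assumed numbers, not necessarily the SSRF implementation) is to run an SVD on the turn-by-turn position matrix, strip the few leading components that carry correlated beam motion, and take the residual scatter as the noise-limited resolution:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic turn-by-turn data: n_turns x n_bpms, correlated beam modes + noise
n_turns, n_bpms, noise = 4096, 40, 0.5e-3          # noise level in mm (assumed)
phase = 2 * np.pi * 0.31 * np.arange(n_turns)      # betatron-like oscillation
modes = (np.outer(np.sin(phase), rng.normal(1, 0.2, n_bpms))
         + np.outer(np.cos(phase), rng.normal(1, 0.2, n_bpms)))
data = 0.02 * modes + noise * rng.normal(size=(n_turns, n_bpms))

X = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 4                                        # beam-motion modes to strip
residual = X - (U[:, :k] * s[:k]) @ Vt[:k]   # beam signal removed

print("raw std   :", X.std(axis=0).mean())         # naive upper limit
print("PCA resid :", residual.std(axis=0).mean())  # closer to true BPM noise
```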

  2. Evaluation methods for association rules in spatial knowledge base

    NASA Astrophysics Data System (ADS)

    Niu, X.; Ji, X.

    2014-04-01

    The association rule is an important model in data mining. It describes relationships between predicates in transactions and makes the knowledge hidden in data more specific and clear. With the development and application of remote sensing technology and automatic data collection tools in recent decades, tremendous amounts of spatial and non-spatial data have been collected and stored in large spatial databases, so mining association rules from spatial databases has become a significant research area with extensive applications. How to find effective, reliable, and interesting association rules in this vast body of information, to help people analyze and make decisions, has become a significant issue. Evaluation methods measure spatial association rules against evaluation criteria. On the basis of an analysis of the existing evaluation criteria, this paper improves the novelty evaluation method, builds a spatial knowledge base, and proposes a new evaluation process based on the support-confidence evaluation system. Finally, the feasibility of the new evaluation process is validated by an experiment with real-world geographical spatial data.
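    The support-confidence evaluation system mentioned above reduces to a few counting formulas; a minimal sketch on made-up spatial-predicate transactions (lift is included as one common interestingness measure beyond the paper's criteria):

```python
transactions = [
    {"river", "floodplain", "clay"},
    {"river", "floodplain"},
    {"hill", "sand"},
    {"river", "clay"},
    {"hill", "floodplain", "sand"},
]

def support(itemset):
    """Fraction of transactions containing every predicate in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(lhs, rhs):
    """Conditional frequency of rhs given lhs."""
    return support(lhs | rhs) / support(lhs)

def lift(lhs, rhs):
    """Confidence normalised by the baseline frequency of rhs."""
    return confidence(lhs, rhs) / support(rhs)

lhs, rhs = {"river"}, {"floodplain"}
print(f"support={support(lhs | rhs):.2f}, "
      f"confidence={confidence(lhs, rhs):.2f}, lift={lift(lhs, rhs):.2f}")
```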

  3. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency can offer guidance for the optimal control of urban traffic. Based on an introduction to, and mathematical analysis of, three quantitative evaluation methods for transportation network efficiency, this paper compares the information they measure, including network structure, traffic demand, travel choice behavior, and other factors that affect network efficiency. The applicability of the various evaluation methods is discussed accordingly. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand, and user route choice behavior on network efficiency. In addition, the network efficiency measured by this method and Braess's paradox can be used to explain each other, indicating a better evaluation of the real operating condition of a transportation network. The analysis of the network efficiency calculated by the Q-H method also shows that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, one can identify both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure that yields the largest value of transportation network efficiency. PMID:28399165

  4. Students' rating as one of the methods for instruction evaluation.

    PubMed

    Mares, J

    1990-01-01

    This study summarizes worldwide knowledge in this field. The information may be useful for university teachers, academic officials, and people dealing with measuring the quality of higher education. The first part of the study is general. The terms educational effectiveness, instructional effectiveness, and pedagogical evaluation are characterized, and the differences between summative and formative evaluation are discussed. The three most common approaches to evaluating the quality of higher education are analyzed (investigation of university prestige, study of objective indicators, and correlation studies). It is emphasized that instructional quality is a relative term: multifaceted, containing subjective elements, and complexly conditioned. One possible model for evaluating university instruction is presented and its eight variables are specified. A general strategy for such evaluation is described, and attention is drawn to the fact that every assessment rests on a rough or explicit concept of the best instruction under given conditions. Basic rules for instruction evaluation are stipulated, as well as important principles that should be respected by anyone who wants to choose the most suitable method of evaluating university instruction. The second part of the study is specific and focuses on a single method: evaluation using rating scales. Students are said to be the most frequent evaluators of the quality of university instruction. Objections to rating scales and negative experience with their non-professional application are presented. The construction of rating scales is described, i.e., planning the content and purpose of the scale, preparing a blueprint, and verifying it in practice and interpreting it statistically. Practical instructions, including the appropriate moment of administration, the necessary number of raters, and statistical analysis of the significance of the results obtained, are explained for those who want to use the scale in routine practice. Information

  5. DATA SYNTHESIS AND METHOD EVALUATION FOR BRAIN IMAGING GENETICS.

    PubMed

    Sheng, Jinhua; Kim, Sungeun; Yan, Jingwen; Moore, Jason; Saykin, Andrew; Shen, Li

    2014-05-01

    Brain imaging genetics is an emergent research field in which the association between genetic variations such as single nucleotide polymorphisms (SNPs) and neuroimaging quantitative traits (QTs) is evaluated. Sparse canonical correlation analysis (SCCA) is a bi-multivariate analysis method that has the potential to reveal complex multi-SNP-multi-QT associations. We present initial efforts on evaluating a few SCCA methods for brain imaging genetics. This includes a data synthesis method to create realistic imaging genetics data with known SNP-QT associations, application of three SCCA algorithms to the synthetic data, and a comparative study of their performance. Our empirical results suggest that approximating the covariance structure using an identity or diagonal matrix, an approach used in these SCCA algorithms, could limit the capability of SCCA to identify the underlying imaging genetics associations. An interesting future direction is to develop enhanced SCCA methods that effectively take into account the covariance structures in imaging genetics data.

  6. Evaluation of read count based RNAseq analysis methods.

    PubMed

    Guo, Yan; Li, Chung-I; Ye, Fei; Shyr, Yu

    2013-01-01

    RNAseq technology is replacing microarray technology as the tool of choice for gene expression profiling. While RNAseq provides much richer data than microarray, analysis of RNAseq data has been much more challenging. To date, there has been no consensus on the best approach for conducting robust RNAseq analysis. In this study, we designed a thorough experiment to evaluate six read count-based RNAseq analysis methods (DESeq, DEGseq, edgeR, NBPSeq, TSPM and baySeq) using both real and simulated data. We found that the six methods produce similar fold changes and reasonable overlap of differentially expressed genes based on p-values. However, all six methods suffer from over-sensitivity. Based on an evaluation of runtime using real data and of the area under the receiver operating characteristic curve (AUC-ROC) using simulated data, we found that edgeR achieves a better balance between speed and accuracy than the other methods.

  7. DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  8. Tropical-Forest Biomass Dynamics from X-Band TanDEM-X Data

    NASA Astrophysics Data System (ADS)

    Treuhaft, R. N.; Neumann, M.; Keller, M. M.; Goncalves, F. G.; Santos, J. R.

    2015-12-01

    The measurement of the change in above ground biomass (AGB) is key for understanding the carbon sink/source nature of tropical forests. Interferometric X-band radar from the only orbiting interferometer, TanDEM-X, shows sensitivity to standing biomass up to at least 300 Mg/ha. This sensitivity may be due in part to the propagation of the shorter X-band wavelength (0.031 m) through holes in the canopy. This talk focuses on estimating the change in AGB over time. Interferometric baselines from TanDEM-X have been obtained in Tapajós National Forest in the Brazilian Amazon over a 4-year period, from 2011 to 2015. Lidar measurements were also acquired during this period. Field measurements of height, height-to-base-of-crown, species, diameter, and position were acquired in 2010, 2013, and 2015. We show interferometric phase height changes, and suggest how these phase height changes are related to biomass change. First we show height changes between baselines separated by one month, over which we expect no change in AGB, to evaluate precision. We find an RMS of <2 m for ~85 stands in the phase height over one month, corresponding to about a 10% measurement of change, which suggests we can detect about a 17 Mg/ha change in AGB at Tapajos. In contrast, interferometric height changes over the period 2011 to 2014 have larger RMS scatters of > 3 m, due to actual change. Most stands show changes in interferometric phase height consistent with regrowth (~10 Mg/ha/yr), and several stands show abrupt, large changes in phase height (>10 m) due to logging and natural disturbance. At the end of 2015, we will acquire more TanDEM-X data over Tapajos, including an area subjected to selective logging. We are doing "before" (March 2015) and "after" (October 2015) fieldwork to be able to understand the signature of change due to selective logging in TanDEM-X interferometric data.

  9. Temporal monitoring of Bardarbunga volcanic activity with TanDEM-X

    NASA Astrophysics Data System (ADS)

    Rossi, C.; Minet, C.; Fritz, T.; Eineder, M.; Erten, E.

    2015-12-01

    On August 29, 2014, volcanic activity started in the lava field of Holuhraun, north-east of the Bardarbunga caldera in Iceland. The activity was declared over on February 27, 2015, thus lasting about six months. During these months the magma chamber below the caldera slowly emptied, causing the rare event of a caldera collapse. In this scenario, TanDEM-X remote sensing data are of particular interest. By producing high-resolution, accurate elevation models of the caldera, it is possible to evaluate volume losses and topographic changes that increase our knowledge of the dynamics of the volcanic activity. Five TanDEM-X InSAR acquisitions were commanded between August 01, 2014 and November 08, 2014: two before the eruption and three afterwards. To fully cover the volcanic activity, the lava flow area north-west of the caldera was also monitored, and a couple of acquisitions were employed to reveal the subglacial graben structure and the lava path. In this context, the expected elevation accuracy is studied on two levels. Absolute height accuracy is analyzed by inspecting the signal propagation at X-band in the imaged medium. Relative height accuracy is analyzed by investigating the InSAR system parameters and the local geomorphology. It is shown that the system is highly accurate, with mean height errors below one meter; moreover, neither InSAR processing issues, e.g. phase unwrapping errors, nor complex DEM calibration aspects pose problems. The caldera is imaged in its entirety, and new cauldron formations and, more generally, the complete restructuring of the glacial volcanic system are well represented. An impressive caldera volume loss of about 1 billion cubic meters is measured in about two months. The dyke propagation from the Bardarbunga cauldron to the Holuhraun lava field is also revealed, and a graben structure with a width of up to 1 km and a subsidence of a few meters is derived.

  10. Analysis and Validation of Grid DEM Generation Based on Gaussian Markov Random Field

    NASA Astrophysics Data System (ADS)

    Aguilar, F. J.; Aguilar, M. A.; Blanco, J. L.; Nemmaoui, A.; García Lorca, A. M.

    2016-06-01

    Digital Elevation Models (DEMs) are considered among the most relevant geospatial data for carrying out land-cover and land-use classification. This work deals with the application of a mathematical framework based on a Gaussian Markov Random Field (GMRF) to interpolate grid DEMs from scattered elevation data. The performance of the GMRF interpolation model was tested on a set of LiDAR data (0.87 points/m2) provided by the Spanish Government (PNOA Programme) over a complex working area mainly covered by greenhouses in Almería, Spain. The original LiDAR data were decimated by randomly removing different fractions of the original points (from 10% up to 99% of points removed). In every case, the remaining scattered points were used to obtain a 1 m grid spacing GMRF-interpolated Digital Surface Model (DSM) whose accuracy was assessed by means of a set of previously extracted checkpoints. The GMRF accuracy results were compared with those provided by the widely known Triangulation with Linear Interpolation (TLI). Finally, the GMRF method was applied to a real-world case consisting of filling the gaps in a LiDAR-derived DSM after manually filtering out non-ground points to obtain a Digital Terrain Model (DTM). Regarding accuracy, both GMRF and TLI produced visually pleasing and similar results in terms of vertical accuracy. As an added bonus, the GMRF mathematical framework makes it possible both to retrieve the estimated uncertainty for every interpolated elevation point (the DEM uncertainty) and to include break lines or terrain discontinuities between adjacent cells to produce higher-quality DTMs.
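    The GMRF interpolation idea can be sketched as a sparse linear solve: with a graph-Laplacian precision Q as the prior and scattered observations y picked out by a selection matrix A, the posterior mean solves (AᵀA + λQ)x = Aᵀy. A toy Python version with a thin-membrane-style prior (not the authors' exact model; λ, grid size, and the synthetic surface are assumptions):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def lap1(m):
    """1-D graph Laplacian of a chain of m nodes."""
    d = np.full(m, 2.0)
    d[0] = d[-1] = 1.0
    return sp.diags([d, -np.ones(m - 1), -np.ones(m - 1)], [0, -1, 1])

def gmrf_interpolate(shape, obs_idx, obs_val, lam=0.1):
    """Posterior mean of a grid DEM under a Laplacian GMRF prior:
    solves (A'A + lam * Q) x = A'y for scattered observations y."""
    rows, cols = shape
    n = rows * cols
    A = sp.csr_matrix((np.ones(len(obs_idx)),
                       (np.arange(len(obs_idx)), obs_idx)),
                      shape=(len(obs_idx), n))
    Q = sp.kron(lap1(rows), sp.eye(cols)) + sp.kron(sp.eye(rows), lap1(cols))
    x = spsolve((A.T @ A + lam * Q).tocsc(), A.T @ obs_val)
    return x.reshape(shape)

# 30 x 30 toy grid with 10% of cells observed from a smooth synthetic surface
rng = np.random.default_rng(3)
r = c = 30
yy, xx = np.mgrid[0:r, 0:c]
truth = np.sin(xx / 6.0) + 0.05 * yy
idx = rng.choice(r * c, size=90, replace=False)
dem = gmrf_interpolate((r, c), idx, truth.ravel()[idx])
print(f"RMSE vs. truth: {np.sqrt(np.mean((dem - truth) ** 2)):.3f}")
```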

  11. Survey research methods in evaluation and case-control studies.

    PubMed

    Kalton, Graham; Piesse, Andrea

    2007-04-15

    Survey research methods are widely used in two types of analytic studies: evaluation studies that measure the effects of interventions; and population-based case-control studies that investigate the effects of various risk factors on the presence of disease. This paper provides a broad overview of some design and analysis issues related to such studies, illustrated with examples. The lack of random assignment to treatment and control groups in many evaluation studies makes controlling for confounders critically important. Confounder control can be achieved by matching in the design and by various alternative methods in the analysis. One popular analytic method of controlling for confounders is propensity scoring, which bears a close resemblance to survey weighting. The use of population-based controls has become common in case-control studies. For reasons of cost, population-based controls are often identified by telephone surveys using random digit dialling (RDD) sampling methods. However, RDD surveys are now experiencing serious problems with response rates. A recent alternative approach is to select controls from frames such as driver license lists that contain valuable demographic information for use in matching. Methods of analysis developed in the survey sampling literature are applicable, at least to some degree, in the analyses of evaluation and population-based case-control studies. In particular, the effects of complex sample designs can be taken into account using survey sampling variance estimation methods. Several survey analysis software packages are available for carrying out the computations.

  12. Force Evaluation in the Lattice Boltzmann Method Involving Curved Geometry

    NASA Technical Reports Server (NTRS)

    Mei, Renwei; Yu, Dazhi; Shyy, Wei; Luo, Li-Shi; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The present work investigates two approaches for force evaluation in the lattice Boltzmann equation: the momentum-exchange method and the stress-integration method on the surface of a body. The boundary condition for the particle distribution functions on curved geometries is handled with second order accuracy based on our recent works. The stress-integration method is computationally laborious for two-dimensional flows and in general difficult to implement for three-dimensional flows, while the momentum-exchange method is reliable, accurate, and easy to implement for both two-dimensional and three-dimensional flows. Several test cases are selected to evaluate the present methods, including: (i) two-dimensional pressure-driven channel flow; (ii) two-dimensional uniform flow past a column of cylinders; (iii) two-dimensional flow past a cylinder asymmetrically placed in a channel (with vortex shedding); (iv) three-dimensional pressure-driven flow in a circular pipe; and (v) three-dimensional flow past a sphere. The drag evaluated by using the momentum-exchange method agrees well with the exact or other published results.
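    For a resting wall treated with simple bounce-back, the momentum-exchange evaluation reduces to summing 2·f_i·c_i over all fluid-to-solid boundary links, since the reflected population equals the incoming one. A D2Q9 sketch of that bookkeeping (the distributions below are random placeholders, not a full LBM simulation, and this is not necessarily the paper's exact formulation for moving or curved boundaries):

```python
import numpy as np

# D2Q9 lattice velocities
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def momentum_exchange_force(f_post, solid):
    """Force on a resting solid from post-collision distributions f_post
    (shape (9, nx, ny)): each fluid-to-solid link in direction i
    contributes 2 * f_i * c_i under simple bounce-back."""
    F = np.zeros(2)
    fluid = ~solid
    for i in range(1, 9):
        # fluid nodes whose i-th neighbour is solid => boundary links
        neighbour_solid = np.roll(solid, shift=tuple(-c[i]), axis=(0, 1))
        links = fluid & neighbour_solid
        F += 2.0 * f_post[i][links].sum() * c[i]
    return F

# Placeholder demo: random distributions around a circular obstacle
rng = np.random.default_rng(4)
nx = ny = 64
x, y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
solid = (x - 32) ** 2 + (y - 32) ** 2 < 8 ** 2
f_post = rng.uniform(0.01, 0.1, (9, nx, ny))
print(momentum_exchange_force(f_post, solid))
```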

  13. Visualization of vasculature with convolution surfaces: method, validation and evaluation.

    PubMed

    Oeltze, Steffen; Preim, Bernhard

    2005-04-01

    We present a method for visualizing vasculature based on clinical computed tomography or magnetic resonance data. The vessel skeleton as well as the diameter information per voxel serve as input. Our method adheres to these data, while producing smooth transitions at branchings and closed, rounded ends by means of convolution surfaces. We examine the filter design with respect to irritating bulges, unwanted blending and the correct visualization of the vessel diameter. The method has been applied to a large variety of anatomic trees. We discuss the validation of the method by means of a comparison to other visualization methods. Surface distance measures are carried out to perform a quantitative validation. Furthermore, we present the evaluation of the method which has been accomplished on the basis of a survey by 11 radiologists and surgeons.

  14. [Methods of dosimetry in evaluation of electromagnetic fields' biological action].

    PubMed

    Rubtsova, N B; Perov, S Iu

    2012-01-01

    Theoretical and experimental dosimetry can be used for adequate evaluation of the effects of radiofrequency electromagnetic fields. In view of the harsh electromagnetic environment in aircraft, pilots' safety is of particular concern. The dosimetric evaluation is based on the quantitative characteristics of the EMF interaction with biological objects, depending on the EM energy absorbed per unit of tissue volume or mass, calculated as the specific absorption rate (SAR) and measured in W/kg. Theoretical dosimetry employs a number of computational methods to determine the EM energy, including the augmented boundary-condition method, the iterative augmented boundary-condition method, the method of moments, the generalized multipole method, the finite-element method, the finite-difference time-domain method, and hybrid methods combining several solution schemes to model navigation, radiolocation, and human systems. Because of the difficulties of experimental SAR estimation, theoretical dosimetry is regarded as the first step in the analysis of in-aircraft exposure conditions and possible bio-effects.

  15. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    PubMed Central

    2010-01-01

    Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  16. DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.

    ERIC Educational Resources Information Center

    KUSEWITT, J.B.

    THE PURPOSE OF THIS STUDY WAS TO DEVELOP A METHOD FOR DETERMINING OBJECTIVE MEASURES OF TRAINER AIRCRAFT EFFECTIVENESS TO EVALUATE PROGRAM ALTERNATIVES FOR TRAINING PILOTS FOR FLEET FIGHTER AND ATTACK-TYPE AIRCRAFT. THE TRAINING SYLLABUS WAS BASED ON AVERAGE STUDENT ABILITY. THE BASIC PROBLEM WAS TO ESTABLISH QUANTITATIVE TIME-DIFFICULTY…

  17. Administrator Evaluation: Concepts, Methods, Cases in Higher Education.

    ERIC Educational Resources Information Center

    Farmer, Charles H.

    Designed for faculty and administration in higher education, the book describes concepts, methods, and case studies in the field of administrative assessment. The first section explores issues and perspectives in three chapters authored by Charles H. Farmer: "Why Evaluate Administrators?", "How Can Administrators be…

  18. Evaluation of methods for nondestructive testing of brazed joints

    NASA Technical Reports Server (NTRS)

    Kanno, A.

    1968-01-01

    Evaluation of nondestructive methods for testing brazed joints reveals that ultrasonic testing is effective in detecting nonbonds in diffusion-bonded samples. Radiography provides excellent resolution of void or inclusion defects, and the neutron radiographic technique shows particular advantage for brazing materials containing cadmium.

  19. Endoscopic Evaluation of Adenoids: Reproducibility Analysis of Current Methods

    PubMed Central

    Hermann, Juliana Sato; Sallum, Ana Carolina; Pignatari, Shirley Shizue Nagata

    2013-01-01

    Objectives To investigate intra- and interexaminer reproducibility of the usual adenoid hypertrophy assessment methods, according to nasofiberendoscopic examination. Methods Forty children of both sexes, ages ranging between 4 and 14 years, presenting with nasal obstruction and oral breathing suspected to be caused by adenoid hypertrophy, were enrolled in this study. Patients were evaluated by nasofiberendoscopy, and the records were referred to and evaluated by two experienced otolaryngologists. The examiners analysed the records according to different evaluation methods, i.e., estimated and measured percentage of choanal occlusion, as well as subjective and objective classificatory systems of adenoid hypertrophy. Results The data disclosed excellent intraexaminer reproducibility for both estimated and measured choanal occlusion. Interexaminer analysis revealed lower reproducibility for estimated than for measured choanal occlusion. Measured choanal occlusion also showed less agreement between evaluations made through the right and left sides of the nasal cavity. Conversely, intra- and interexaminer reliability analysis revealed higher agreement for the subjective than for the objective classificatory system, and the subjective method also demonstrated higher agreement than the objective system when opposite sides were compared. Conclusion Our results suggest that the measured percentage of choanal occlusion is superior to the estimated percentage, particularly if employed bilaterally, diminishing the disagreement between sides. When adenoid categorization is used instead, the authors recommend the subjective rather than the objective classificatory system of adenoid hypertrophy. PMID:23526477

  20. METHODS FOR EVALUATING THE SUSTAINABILITY OF GREEN PROCESSES

    EPA Science Inventory

    Methods for Evaluating the Sustainability of Green Processes

    By Raymond L. Smith and Michael A. Gonzalez
    U.S. Environmental Protection Agency
    Office of Research and Development
    26 W. Martin Luther King Dr.
    Cincinnati, OH 45268 USA

    Theme: New Challenges...

  2. An Evaluation of a New Method of IRT Scaling

    ERIC Educational Resources Information Center

    Ragland, Shelley

    2010-01-01

    In order to be able to fairly compare scores derived from different forms of the same test within the Item Response Theory framework, all individual item parameters must be on the same scale. A new approach, the RPA method, which is based on transformations of predicted score distributions, was evaluated here and was shown to produce results…

  3. AN EVALUATION OF THE PHONOVISUAL METHOD, GRADES 1-3.

    ERIC Educational Resources Information Center

    Pasadena City Unified School District, CA.

    THE ACHIEVEMENT TEST PERFORMANCES OF TWO GROUPS OF CHILDREN FOR GRADES 1, 2, AND 3 IN TWO PASADENA, CALIFORNIA, SCHOOLS WERE COMPARED TO EVALUATE THE EFFECTIVENESS OF A 3-YEAR EXPERIMENTAL PROGRAM USING THE PHONOVISUAL METHOD OF READING INSTRUCTION. PUPILS WERE MATCHED ON SEX, IQ, AND CHRONOLOGICAL AGE. DIFFERENCES OBSERVED BETWEEN THE MEAN SCORES…

  4. Evaluation of Alternative Difference-in-Differences Methods

    ERIC Educational Resources Information Center

    Yu, Bing

    2013-01-01

    Difference-in-differences (DID) strategies are particularly useful for evaluating policy effects in natural experiments in which, for example, a policy affects some schools and students but not others. However, the standard DID method may produce biased estimation of the policy effect if the confounding effect of concurrent events varies by…

  5. EVALUATION OF TWO METHODS FOR PREDICTION OF BIOACCUMULATION FACTORS

    EPA Science Inventory

    Two methods for deriving bioaccumulation factors (BAFs) used by the U.S. Environmental Protection Agency (EPA) in development of water quality criteria were evaluated using polychlorinated biphenyls (PCB) data from the Hudson River and Green Bay ecosystems. Greater than 90% of th...

  6. Bayesian Monte Carlo Method for Nuclear Data Evaluation

    SciTech Connect

    Koning, A.J.

    2015-01-15

    A Bayesian Monte Carlo method is outlined which allows a systematic evaluation of nuclear reactions using TALYS. The result will be either an EXFOR-weighted covariance matrix or a collection of random files, each accompanied by an experiment-based weight.

  7. Methods of Evaluating Child Welfare in Indian Country: An Illustration

    ERIC Educational Resources Information Center

    Fox, Kathleen; Cross, Terry L.; John, Laura; Carter, Patricia; Pavkov, Thomas; Wang, Ching-Tung; Diaz, Javier

    2011-01-01

    The poor quality and quantity of data collected in tribal communities today reflects a lack of true community participation and commitment. This is especially problematic for evaluation studies, in which the needs and desires of the community should be the central focus. This challenge can be met by emphasizing indigenous methods and voice. The…

  8. 10 CFR 963.16 - Postclosure suitability evaluation method.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 963.16 Energy DEPARTMENT OF ENERGY YUCCA MOUNTAIN SITE SUITABILITY GUIDELINES Site Suitability... assessment to evaluate the ability of the Yucca Mountain disposal system to limit radiological doses and... the performance of the Yucca Mountain disposal system using the method described in paragraph (b) of...

  9. Program Evaluation of the Sustainability of Teaching Methods

    ERIC Educational Resources Information Center

    Bray, Cathy

    2008-01-01

    This paper suggests a particular question that higher education researchers might ask: "Do educational programs use teaching methods that are environmentally, socially and economically sustainable?" It further proposes that program evaluation research (PER) can be used to answer the question. Consideration is given to: a) program…

  10. The Diffusion of Evaluation Methods among Public Relations Practitioners.

    ERIC Educational Resources Information Center

    Dozier, David M.

    A study explored the relationships between public relations practitioners' organizational roles and the type of evaluation methods they used on the job. Based on factor analysis of role data obtained from an earlier study, four organizational roles were defined and ranked: communication manager, media relations specialist, communication liaison,…

  12. A SIMPLE METHOD FOR EVALUATING DATA FROM AN INTERLABORATORY STUDY

    EPA Science Inventory

    Large-scale laboratory-and method-performance studies involving more than about 30 laboratories may be evaluated by calculating the HORRAT ratio for each test sample (HORRAT=[experimentally found among-laboratories relative standard deviation] divided by [relative standard deviat...
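    The bracketed definition is cut off in the record; conventionally the denominator is the among-laboratories RSD predicted by the Horwitz equation at the test sample's concentration. A minimal sketch of that conventional calculation (the example numbers are made up):

```python
import math

def horwitz_prsd(c):
    """Predicted among-laboratories RSD (%) from the Horwitz curve;
    c is the analyte concentration as a mass fraction (1 ppm = 1e-6)."""
    return 2 ** (1 - 0.5 * math.log10(c))

def horrat(found_rsd_percent, c):
    """HORRAT = experimentally found among-laboratories RSD / predicted RSD."""
    return found_rsd_percent / horwitz_prsd(c)

# A test sample at 1 ppm with an observed among-lab RSD of 22%
print(f"HORRAT = {horrat(22.0, 1e-6):.2f}")   # ~1 indicates typical performance
```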

  13. Statistical methods for evaluating the attainment of cleanup standards

    SciTech Connect

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.
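
    As an illustration of the kind of attainment test these volumes describe, the sketch below compares remediated-site samples against reference-area samples with a one-sided Wilcoxon rank-sum test; the data and decision logic are illustrative only, not the EPA procedure itself.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        site = rng.lognormal(mean=0.1, sigma=0.5, size=30)       # remediated site
        reference = rng.lognormal(mean=0.0, sigma=0.5, size=30)  # reference area

        # One-sided Wilcoxon rank-sum: is the site distribution shifted
        # above the reference? A small p-value suggests the site does
        # NOT attain the reference-based standard.
        stat, p = stats.mannwhitneyu(site, reference, alternative="greater")
        print(f"U = {stat:.1f}, p = {p:.3f}")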

  14. The research on user behavior evaluation method for network state

    NASA Astrophysics Data System (ADS)

    Zhang, Chengyuan; Xu, Haishui

    2017-08-01

    Starting from the correlation between user behavior and network running state, this paper proposes a method for evaluating user behavior based on network state. Drawing on analysis and evaluation methods from other fields of study, we introduce the theory and tools of data mining. Using the network status information provided by the trusted network view, the user behavior data and the network state data are analysed. Finally, we construct user behavior evaluation indices and weights; on this basis, the degree to which the specific behavior of different users influences changes in the network running state can be quantified accurately, providing a basis for user behavior control decisions.
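
    A minimal sketch of the final aggregation step described above, assuming hypothetical normalized behavior indicators and data-mining-derived weights (all names and values are illustrative):

        import numpy as np

        # Hypothetical indicators for one user, each normalized to [0, 1],
        # and weights (e.g., derived from trusted-network-view logs).
        indicators = np.array([0.8, 0.4, 0.6])  # e.g., traffic anomaly,
                                                # login pattern, port usage
        weights = np.array([0.5, 0.2, 0.3])     # must sum to 1

        score = float(weights @ indicators)     # weighted evaluation index
        print(f"behavior impact score: {score:.2f}")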

  15. Sensitivity evaluation of dynamic speckle activity measurements using clustering methods.

    PubMed

    Etchepareborda, Pablo; Federico, Alejandro; Kaufmann, Guillermo H

    2010-07-01

    We evaluate and compare the use of competitive neural networks, self-organizing maps, the expectation-maximization algorithm, K-means, and fuzzy C-means techniques as partitional clustering methods, when the sensitivity of the activity measurement of dynamic speckle images needs to be improved. The temporal history of the acquired intensity generated by each pixel is analyzed in a wavelet decomposition framework, and it is shown that the mean energy of its corresponding wavelet coefficients provides a suitable feature space for clustering purposes. The sensitivity obtained by using the evaluated clustering techniques is also compared with the well-known methods of Konishi-Fujii, weighted generalized differences, and wavelet entropy. The performance of the partitional clustering approach is evaluated using simulated dynamic speckle patterns and also experimental data.
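
    The feature-extraction and clustering pipeline can be sketched as follows, assuming PyWavelets and scikit-learn and a mock frame stack in place of real speckle data; the wavelet, decomposition level, and cluster count are illustrative choices, not the paper's exact settings.

        import numpy as np
        import pywt
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        # Mock stack of dynamic speckle frames: (n_frames, height, width).
        frames = rng.normal(size=(256, 32, 32))

        def wavelet_energy_features(signal, wavelet="db4", level=4):
            """Mean energy of the detail coefficients at each level."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            return np.array([np.mean(c ** 2) for c in coeffs[1:]])

        # One feature vector per pixel from its temporal intensity history.
        n_frames, h, w = frames.shape
        features = np.array([
            wavelet_energy_features(frames[:, i, j])
            for i in range(h) for j in range(w)
        ])

        # Partition pixels into low- and high-activity clusters.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        activity_map = labels.reshape(h, w)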

  16. A Safety Index and Method for Flightdeck Evaluation

    NASA Technical Reports Server (NTRS)

    Latorella, Kara A.

    2000-01-01

    If our goal is to improve safety through machine, interface, and training design, then we must define a metric of flightdeck safety that is usable in the design process. Current measures associated with our notions of "good" pilot performance and ultimate safety of flightdeck performance fail to provide an adequate index of safe flightdeck performance for design evaluation purposes. The goal of this research effort is to devise a safety index and method that allows us to evaluate flightdeck performance holistically and in a naturalistic experiment. This paper uses Reason's model of accident causation (1990) as a basis for measuring safety, and proposes a relational database system and method for 1) defining a safety index of flightdeck performance, and 2) evaluating the "safety" afforded by flightdeck performance for the purpose of design iteration. Methodological considerations, limitations, and benefits are discussed as well as extensions to this work.

  17. Participatory Training Evaluation Method (PATEM) as a Collaborative Evaluation Capacity Building Strategy

    ERIC Educational Resources Information Center

    Kuzmin, Alexey

    2012-01-01

    This article describes the Participatory Training Evaluation Method (PATEM) of measuring participants' reaction to training. PATEM provides rich information; allows evaluators to document evaluation findings; becomes an organic part of the training that helps participants process their experience individually and as a group; makes sense to participants; is an…

  18. Occupational needs and evaluation methods for cold protective clothing.

    PubMed

    Anttonen, H

    1993-01-01

    The aim of the study was to evaluate the needs for and properties of occupational cold protective clothing with different methods, and the risks related to work in cold conditions, from the point of view of occupational hygiene and clothing physiology. The thermal insulation of textile materials and clothing was investigated with equipment, methods and parameters developed especially for cold and windy conditions, in both dynamic and steady states. Simulations and calculations were also performed and compared with the measurements. Cold exposure was assessed in working life from the occupational hygiene standpoint to evaluate the risk of cooling and frostbite and the utility ranges of clothing. The purpose-built sweating hot plate and the cylinder in the wind tunnel could be regarded as adequate for the evaluation of winter clothing, with good precision, stability and repeatability. The measured total thermal resistance was mainly dependent on, and the operative thermal resistance independent of, temperature. The operative thermal resistance was also very sensitive to errors in measurement procedures. The heat flow usually evaluated by thermal and water vapour resistance could be substituted by total thermal resistance. Both the measurements and theory showed that, in addition to air permeability, the ambient temperature, air gaps, contact layers and thickness of clothing were important parameters. Increasing wind (1...8 m/s) decreased the total thermal resistance and mass transfer by up to 60%, depending on conditions. The comparison of calculation models with material measurements demonstrated the value of the simulation models; the differences between methods were mainly due to changes in water vapour resistance in the cold. The heat flux method was exact enough for evaluating the insulation of clothing in the field, but under sweating conditions condensation and evaporation must be taken into consideration. In the case of heat debt in
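
    As a small worked example of the quantities involved, the total thermal resistance from a hot-plate measurement is R_t = (T_surface - T_ambient) / q, often reported in clo units (1 clo = 0.155 m^2*K/W); the numbers below are illustrative.

        def total_thermal_resistance(t_surface_c, t_ambient_c, heat_flux_w_m2):
            """Total thermal resistance R_t (m^2*K/W) from a hot-plate
            measurement: R_t = (T_surface - T_ambient) / q."""
            return (t_surface_c - t_ambient_c) / heat_flux_w_m2

        # Example: 35 degC plate, -10 degC ambient, 100 W/m^2 heat flux.
        r_t = total_thermal_resistance(35.0, -10.0, 100.0)
        print(f"R_t = {r_t:.3f} m^2*K/W = {r_t / 0.155:.2f} clo")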

  19. Holistic Evaluation of Lightweight Operating Systems using the PERCU Method

    SciTech Connect

    Kramer, William T.C.; He, Yun; Carter, Jonathan; Glenski, Joseph; Rippe, Lynn; Cardo, Nicholas

    2008-05-01

    The scale of Leadership Class Systems presents unique challenges to the features and performance of operating system services. This paper reports results of comprehensive evaluations of two Light Weight Operating Systems (LWOS), Cray's Catamount Virtual Node (CVN) and Linux Environment (CLE) operating systems, on the exact same large-scale hardware. The evaluation was carried out over a 5-month period on NERSC's 19,480 core Cray XT-4, Franklin, using a comprehensive evaluation method that spans Performance, Effectiveness, Reliability, Consistency and Usability criteria for all major subsystems and features. The paper presents the results of the comparison between CVN and CLE, evaluates their relative strengths, and reports observations regarding the world's largest Cray XT-4 as well.

  20. Improvement of dem Generation from Aster Images Using Satellite Jitter Estimation and Open Source Implementation

    NASA Astrophysics Data System (ADS)

    Girod, L.; Nuth, C.; Kääb, A.

    2015-12-01

    The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) system aboard the Terra (EOS AM-1) satellite has been a source of stereoscopic images covering the whole globe at 15 m resolution and consistent quality for over 15 years. The potential of these data for geomorphological analysis and change detection in three dimensions is unrivaled and needs to be exploited. However, the DEMs and ortho-images currently delivered by NASA (ASTER DMO products) are often of insufficient quality for a number of applications such as mountain glacier mass balance. For this study, the use of Ground Control Points (GCPs) or of other ground truth was rejected due to the global "big data" type of processing that we hope to perform on the ASTER archive. We have therefore developed a tool to compute Rational Polynomial Coefficient (RPC) models from the ASTER metadata and a method improving the quality of the matching by identifying and correcting jitter-induced cross-track parallax errors. Our method outputs more accurate DEMs with fewer unmatched areas and reduced overall noise. The algorithms were implemented in the open source photogrammetric library and software suite MicMac.

  1. Potentials of TanDEM-X Interferometric Data for Global Forest/Non-Forest Classification

    NASA Astrophysics Data System (ADS)

    Martone, Michele; Rizzoli, Paola; Brautigam, Benjamin; Krieger, Gerhard

    2016-08-01

    This paper presents a method to generate forest/non-forest maps from TanDEM-X interferometric SAR data. Among the several contributions which may affect the quality of interferometric products, the coherence loss caused by volume scattering is the one predominantly affected by the presence of vegetation, and it is therefore exploited here as the main indicator for forest classification. Due to the strong dependency of the considered InSAR quantity on the acquisition geometry, namely the incidence angle and the interferometric baseline, a multi-fuzzy clustering classification approach is used. Examples are provided which show the potential of the proposed method. Further, additional features such as urban settlements, water, and critical areas affected by geometric distortions (e.g. shadow and layover) need to be extracted, and possible approaches are presented as well. Very promising results are shown, which demonstrate the potential of TanDEM-X bistatic data not only for forest identification but, more generally, for the generation of a global land classification map as a next step.

  2. Evaluation of VOC emission measurement methods for paint spray booths.

    PubMed

    Eklund, B M; Nelson, T P

    1995-03-01

    Interest in regulations to control solvent emissions from automotive painting systems is increasing, especially in ozone nonattainment areas. Therefore, an accurate measurement method for VOC emissions from paint spray booths used in the automotive industry is needed to ascertain the efficiency of the spray booth capture and the total emissions. This paper presents the results of a laboratory study evaluating potential VOC sampling and analytical methods used in estimating paint spray booth emissions, and discusses these results relative to other published data. Eight test methods were selected for evaluation. The accuracy of each sampling and analytical method was determined using test atmospheres of known concentration and composition that closely matched the actual exhaust air from paint spray booths. The solvent mixture to generate the test atmospheres contained a large proportion of polar, oxygenated hydrocarbons such as ketones and alcohols. A series of identical tests was performed for each sampling/analytical method with each test atmosphere to assess the precision of the methods. The study identified significant differences among the test methods in terms of accuracy, precision, cost, and complexity.

  3. Effect of DEM Source and Resolution on Extracting River Network and Watershed within Multi-Lake Area in Tibet

    NASA Astrophysics Data System (ADS)

    Li, Yang; Li, Gang; Lin, Hui

    2014-11-01

    A DEM defines drainage structures and basins by supporting overland flow simulation. Two mature DEM sources are the SRTM DEM (Shuttle Radar Topography Mission) and the ASTER GDEM (Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model). The accuracy of hydrological characteristics derived from a DEM decreases from high to coarse resolution and differs between data sources.
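
    A minimal sketch of the overland-flow step underlying such extraction is the classic D8 rule, in which each cell drains to its steepest-descent neighbor (illustrative implementation, not the authors' code):

        import numpy as np

        def d8_flow_direction(dem, cellsize=1.0):
            """For each interior cell, index (0-7) of the steepest-descent
            neighbor among the 8 adjacent cells, or -1 for pits."""
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                       (0, 1), (1, -1), (1, 0), (1, 1)]
            rows, cols = dem.shape
            direction = np.full((rows, cols), -1, dtype=int)
            for r in range(1, rows - 1):
                for c in range(1, cols - 1):
                    drops = []
                    for dr, dc in offsets:
                        dist = cellsize * (2 ** 0.5 if dr and dc else 1.0)
                        drops.append((dem[r, c] - dem[r + dr, c + dc]) / dist)
                    k_best = int(np.argmax(drops))
                    if drops[k_best] > 0:
                        direction[r, c] = k_best
            return direction

        dem = np.array([[5, 5, 5], [5, 4, 2], [5, 3, 1]], dtype=float)
        print(d8_flow_direction(dem))  # center drains toward the lowest corner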

  4. Evaluation of Low-Tech Indoor Remediation Methods ...

    EPA Pesticide Factsheets

    This study identified, collected, evaluated, and summarized available articles, reports, guidance documents, and other pertinent information related to common housekeeping activities within the United States. This resulted in a summary compendium including relevant information about multiple low-tech cleaning methods from the literature search results. Through discussion and prioritization, an EPA project team, made up of several EPA scientists and emergency responders, focused the information into a list of 14 housekeeping activities for decontamination evaluation testing. These types of activities are collectively referred to as “low-tech” remediation methods because of the comparatively simple tools, equipment, and operations involved. Similarly, eight common household surfaces were chosen that were contaminated using three different contamination conditions. Thirty-three combinations of methods and surfaces were chosen for testing under the three contamination conditions for a total of 99 tests.

  5. A Rapid Usability Evaluation (RUE) Method for Health Information Technology.

    PubMed

    Russ, Alissa L; Baker, Darrell A; Fahner, W Jeffrey; Milligan, Bryce S; Cox, Leeann; Hagg, Heather K; Saleem, Jason J

    2010-11-13

    Usability testing can help generate design ideas to enhance the quality and safety of health information technology. Despite these potential benefits, few healthcare organizations conduct systematic usability testing prior to software implementation. We used a Rapid Usability Evaluation (RUE) method to apply usability testing to software development at a major VA Medical Center. We describe the development of the RUE method, provide two examples of how it was successfully applied, and discuss key insights gained from this work. Clinical informaticists with limited usability training were able to apply RUE to improve software evaluation and elected to continue to use this technique. RUE methods are relatively simple, do not require advanced training or usability software, and should be easy to adopt. Other healthcare organizations may be able to implement RUE to improve software effectiveness, efficiency, and safety.

  6. Evaluating maximum likelihood estimation methods to determine the hurst coefficients

    NASA Astrophysics Data System (ADS)

    Kendziorski, C. M.; Bassingthwaighte, J. B.; Tonellato, P. J.

    1999-12-01

    A maximum likelihood estimation method implemented in S-PLUS (S-MLE) to estimate the Hurst coefficient (H) is evaluated. The Hurst coefficient, with 0.5 < H < 1, characterizes long-memory time series by quantifying the rate of decay of the autocorrelation function. S-MLE was developed to estimate H for fractionally differenced (fd) processes. However, in practice it is difficult to distinguish between fd processes and fractional Gaussian noise (fGn) processes. Thus, the method is evaluated for estimating H for both fd and fGn processes. S-MLE gave biased results of H for fGn processes of any length and for fd processes of lengths less than 2^10. A modified method is proposed to correct for this bias. It gives reliable estimates of H for both fd and fGn processes of length greater than or equal to 2^11.
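
    Since S-MLE is specific to S-PLUS, the sketch below illustrates an alternative, simpler estimator of H, the aggregated-variance method, which for fGn exploits Var(X^(m)) ~ m^(2H-2); it is a sanity-check sketch, not the evaluated S-MLE method.

        import numpy as np

        def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
            """Aggregated-variance estimate of H: a log-log regression of
            block-mean variance on block size m has slope 2H - 2."""
            x = np.asarray(x, dtype=float)
            log_m, log_v = [], []
            for m in block_sizes:
                n_blocks = len(x) // m
                means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
                log_m.append(np.log(m))
                log_v.append(np.log(means.var()))
            slope = np.polyfit(log_m, log_v, 1)[0]
            return 1.0 + slope / 2.0

        # White noise (H = 0.5) as a sanity check.
        rng = np.random.default_rng(3)
        print(hurst_aggvar(rng.normal(size=2**14)))  # ~0.5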

  7. Accurate evaluation and analysis of functional genomics data and methods

    PubMed Central

    Greene, Casey S.; Troyanskaya, Olga G.

    2016-01-01

    The development of technology capable of inexpensively performing large-scale measurements of biological systems has generated a wealth of data. Integrative analysis of these data holds the promise of uncovering gene function, regulation, and, in the longer run, understanding complex disease. However, their analysis has proved very challenging, as it is difficult to quickly and effectively assess the relevance and accuracy of these data for individual biological questions. Here, we identify biases that present challenges for the assessment of functional genomics data and methods. We then discuss evaluation methods that, taken together, begin to address these issues. We also argue that the funding of systematic data-driven experiments and of high-quality curation efforts will further improve evaluation metrics so that they more-accurately assess functional genomics data and methods. Such metrics will allow researchers in the field of functional genomics to continue to answer important biological questions in a data-driven manner. PMID:22268703

  8. Tectonic development of the Northwest Bonaparte Basin, Australia by using Digital Elevation Model (DEM)

    NASA Astrophysics Data System (ADS)

    Wahid, Ali; Salim, Ahmed Mohamed Ahmed; Ragab Gaafar, Gamal; Yusoff, AP Wan Ismail Wan

    2016-02-01

    The Bonaparte Basin, which is mostly offshore, is situated on Australia's NW continental margin and covers an area of approximately 270,000 km2. With a number of Paleozoic and Mesozoic sub-basins and platform areas, the basin is structurally complex. This research established geologic and geomorphologic studies using a Digital Elevation Model (DEM) as an alternative approach to morphostructural analysis for unravelling these geological complexities. Although DEMs have been in use since the 1990s, they have still not become a common tool for mapping studies. The work comprised regional structural analysis integrating elevation data, satellite imagery, openly available topographic images, and internal geological maps with interpreted seismic data. The structural maps of the study area were georeferenced and overlaid onto SRTM data and satellite images for combined interpretation, yielding a digital elevation model of the study area. The adopted methodology evaluates and redefines the geodynamic processes involved in the formation of the Bonaparte Basin, with the main objective of establishing the geological history using the digital elevation model. The work incorporates the tectonic events that occurred at different geological times into a digital elevation model. The integrated tectonic analysis benefitted substantially from combining the different digital data sets into a common digital database, while visualization software facilitated the overlay and combined interpretation of the data sets, helping to reveal hidden information not otherwise obvious or accessible for regional analysis.

  9. A spheropolygonal-based DEM study into breakage under repetitive compression

    NASA Astrophysics Data System (ADS)

    Miao, Guien; Alonso-Marroquin, Fernando; Airey, David

    2017-06-01

    Experimental breakage studies have often focused on comparing grading and particle shape data from the beginning and end of a test, but one major advantage of DEM simulations is that, although the data are still discrete, more information on intermediate stages is available. This paper describes a repetitive compression test using a 2D aggregate-based DEM model comprised of spheropolygonal particles (formed by the Minkowski sum of a circle and a polygon, viz. sweeping a circle around the edges of the polygon) that are connected by beams and compares the behaviour with experimental data on the breakage of Barrys Beach carbonate sand. The one-dimensional repetitive compression test was performed on 20 particles—each consisting of over 100 sub-particles—which were generated from the outlines of particles of Barrys Beach carbonate sand. Particle breakage was described through the breakage of beams (particle bonds), allowing the evaluation of changes in the compressibility and grading. It was noted that the simulation compared well with the experimental behaviour of Barrys Beach carbonate sand.
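
    The Minkowski-sum geometry has a convenient closed form for convex outlines: the spheropolygon area is the polygon area plus perimeter*r plus pi*r^2. A small worked sketch (illustrative geometry only, not the authors' DEM code):

        import math

        def spheropolygon_area(vertices, r):
            """Area of the Minkowski sum of a convex polygon and a disc
            of radius r: A = A_polygon + perimeter * r + pi * r^2."""
            n = len(vertices)
            a2 = sum(vertices[i][0] * vertices[(i + 1) % n][1]
                     - vertices[(i + 1) % n][0] * vertices[i][1]
                     for i in range(n))
            area = abs(a2) / 2.0  # shoelace formula
            perim = sum(math.dist(vertices[i], vertices[(i + 1) % n])
                        for i in range(n))
            return area + perim * r + math.pi * r ** 2

        # Unit square swept by a circle of radius 0.1.
        print(spheropolygon_area([(0, 0), (1, 0), (1, 1), (0, 1)], 0.1))  # ~1.43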

  10. Glacial Surface Topography and its Changes in the Western Qilian Mountains Derived from TanDEM-X Bi-Static InSAR

    NASA Astrophysics Data System (ADS)

    Sun, Yafei; Jiang, Liming; Liu, Lin; Wang, Hansheng; Hsu, Houtse; Shen, Qiang

    2016-08-01

    High-resolution, high-precision glacier surface topography is one of the most important fundamental datasets for research on the glacial dynamics of mountain glaciers. It is noteworthy that the TanDEM-X mission, launched in 2010 by the German Aerospace Center (DLR), opens a new era in single-pass satellite SAR remote sensing [1]. The TanDEM-X (TDX) mission employs a bi-static interferometric configuration of two identical satellites, TerraSAR-X (TSX) and TDX, flying in closely controlled formation; its primary objective is to generate a global, highly accurate, and homogeneous DEM meeting the HRTI-3 accuracy standard [1]. In this study, we aim to quantitatively evaluate the potential of TDX bi-static SAR data for measuring glacier surface topography and elevation changes over mountain regions.

  11. Comparative evaluation of patellar height methods in the Brazilian population

    PubMed Central

    Behrendt, Christian; Zaluski, Alexandre; e Albuquerque, Rodrigo Pires; de Sousa, Eduardo Branco; Cavanellas, Naasson

    2015-01-01

    Objective The methods most used for patellar height measurement were compared with the plateau–patella angle method. Methods A cross-sectional study was conducted, in which lateral-view radiographs of the knee were evaluated using the three methods already established in the literature: Insall–Salvati (IS), Blackburne–Peel (BP) and Caton–Deschamps (CD). These were compared with the plateau–patella angle method. One hundred and ninety-six randomly selected patients were included in the sample. Results The data were initially evaluated using the chi-square test. This analysis was deemed to be positive with p < 0.0001. We compared the traditional methods with the plateau–patella angle measurement, using Fisher's exact test. In comparing the IS index with the plateau–patella angle, we did not find any statistically significant differences in relation to the proportion of altered cases between the two groups. The traditional methods were compared with the plateau–patella angle with regard to the proportions of cases of high and low patella, by means of Fisher's exact test. This analysis showed that the plateau–patella angle identified fewer cases of high patella than did the IS, BP and CD methods, but more cases of low patella. In comparing pairs, we found that the IS and CD indices were capable of identifying more cases of high patella than was the plateau–patella angle. In relation to the cases of low patella, the plateau–patella angle was capable of identifying more cases than were the other three methods. Conclusions The plateau–patella angle found more patients with low patella than did the classical methods and showed results that diverged from those of the other indices studied. PMID:26962492
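
    For context, the Insall-Salvati index is the ratio of patellar tendon length to patellar length on a lateral radiograph. The sketch below applies the commonly cited cutoffs of 1.2 (alta) and 0.8 (baja); the cutoffs and values are illustrative, not taken from this study.

        def insall_salvati(tendon_length_mm, patella_length_mm,
                           alta_cutoff=1.2, baja_cutoff=0.8):
            """Insall-Salvati index: patellar tendon length / patellar
            length, measured on a lateral knee radiograph."""
            ratio = tendon_length_mm / patella_length_mm
            if ratio > alta_cutoff:
                return ratio, "patella alta"
            if ratio < baja_cutoff:
                return ratio, "patella baja"
            return ratio, "normal height"

        print(insall_salvati(52.0, 40.0))  # (1.3, 'patella alta')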

  12. An IMU Evaluation Method Using a Signal Grafting Scheme

    PubMed Central

    Niu, Xiaoji; Wang, Qiang; Li, You; Zhang, Quan; Jiang, Peng

    2016-01-01

    As various inertial measurement units (IMUs) from different manufacturers appear every year, it is not affordable to evaluate every IMU through tests. Therefore, this paper presents an IMU evaluation method by grafting data from the tested IMU to the reference data from a higher-grade IMU. The signal grafting (SG) method has several benefits: (a) only one set of field tests with a higher-grade IMU is needed, and can be used to evaluate numerous IMUs. Thus, SG is effective and economic because all data from the tested IMU is collected in the lab; (b) it is a general approach to compare navigation performances of various IMUs by using the same reference data; and, finally, (c) through SG, one can first evaluate an IMU in the lab, and then decide whether to further test it. Moreover, this paper verified the validity of SG to both medium- and low-grade IMUs, and presents and compared two SG strategies, i.e., the basic-error strategy and the full-error strategy. SG provided results similar to field tests, with a difference of under 5% and 19.4%–26.7% for tested tactical-grade and MEMS IMUs. Meanwhile, it was found that dynamic IMU errors were essential to guarantee the effect of the SG method. PMID:27294932
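
    The basic-error strategy can be sketched as follows: lab-characterized errors of the tested IMU (here just a constant bias and white noise, with illustrative values) are grafted onto reference-IMU field data to emulate a field test of the tested unit.

        import numpy as np

        rng = np.random.default_rng(4)
        fs = 100.0                          # sample rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        gyro_ref = 0.01 * np.sin(0.2 * t)   # reference-IMU gyro signal (rad/s)

        # Lab-characterized errors of the tested IMU (illustrative values):
        bias = 2e-3                         # constant bias (rad/s)
        noise_density = 5e-4                # white-noise density (illustrative)
        white_noise = noise_density * np.sqrt(fs) * rng.normal(size=t.size)

        # Graft the tested IMU's lab errors onto the higher-grade
        # reference data to emulate a field test of the tested IMU.
        gyro_grafted = gyro_ref + bias + white_noise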

  13. [A Standing Balance Evaluation Method Based on Largest Lyapunov Exponent].

    PubMed

    Liu, Kun; Wang, Hongrui; Xiao, Jinzhuang; Zhao, Qing

    2015-12-01

    In order to evaluate the ability of human standing balance scientifically, we in this study proposed a new evaluation method based on chaos nonlinear analysis theory. In this method, a sinusoidal acceleration stimulus in the forward/backward direction was applied under the subjects' feet, supplied by a motion platform. In addition, three acceleration sensors, fixed to the shoulder, hip and knee of each subject, were applied to capture the balance adjustment dynamic data. Through reconstructing the system phase space, we calculated the largest Lyapunov exponent (LLE) of the dynamic data of the subjects' different segments, and then used the sum of the squares of the differences between the LLEs (SSDLLE) as the balance capability evaluation index. Finally, 20 subjects' indexes were calculated and compared with the evaluation results of existing methods. The results showed that the SSDLLE was more in line with the subjects' performance during the experiment, and that it could measure the body's balance ability to some extent. Moreover, the results also illustrated that balance level is determined by the coordination ability of the various joints, and that there may be more than one balance control strategy in the process of maintaining balance.
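
    The index itself is simple to state: the sum of squared pairwise differences between the segment LLEs. A minimal sketch with hypothetical LLE values (the LLE estimation itself, e.g. by Rosenstein's method, is omitted):

        from itertools import combinations

        def ssdlle(lles):
            """Sum of squared pairwise differences between the largest
            Lyapunov exponents of the body segments (e.g., shoulder,
            hip, knee)."""
            return sum((a - b) ** 2 for a, b in combinations(lles, 2))

        # Hypothetical LLEs estimated from the shoulder, hip and knee
        # accelerometer records of one subject; smaller values indicate
        # more similar (better coordinated) joint dynamics.
        print(ssdlle([0.85, 0.62, 0.71]))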

  14. [Methods of evaluating labor progress in contemporary obstetrics].

    PubMed

    Głuszak, Michał; Fracki, Stanisław; Wielgoś, Mirosław; Wegrzyn, Piotr

    2013-08-01

    Assessment of progress in labor is one of the foremost problems in obstetrics. Obstructed labor increases danger to maternal and fetal life and health, and may be caused by birth canal pathologies, as well as inefficient uterine contractions or failure of cervical dilation. Such obstructions require the use of vacuum extraction, forceps, or a Caesarean section. Operative delivery should be performed only when specifically indicated; conversely, postponing an operative delivery when the procedure is necessary is detrimental to the neonatal outcome. Therefore, it is advisable to make the decision on the basis of objective, measurable parameters. Methods of evaluating the risk of labor disorders have evolved over the years. Currently, ultrasonography is used for fetal biometric measurements and weight estimation, which helps to evaluate the risk of labor disorders; this method, however, is limited by a relatively large measurement error. At present, vaginal examination is still the primary method of evaluating labor progress, although the technique is known to be operator-dependent and poorly reproducible. Recent publications suggest that intrapartum translabial ultrasonography is more accurate and allows for an objective assessment of labor progress. Recent studies have evaluated fetal head engagement based on the following parameters: the angle between the pubic symphysis and the fetal head, the distance between the presenting point and the interspinous line, and the fetal head direction in the birth canal. Each of the described parameters allowed for an objective assessment of head engagement, but no advantage of any particular parameter has been demonstrated so far.

  15. An IMU Evaluation Method Using a Signal Grafting Scheme.

    PubMed

    Niu, Xiaoji; Wang, Qiang; Li, You; Zhang, Quan; Jiang, Peng

    2016-06-10

    As various inertial measurement units (IMUs) from different manufacturers appear every year, it is not affordable to evaluate every IMU through tests. Therefore, this paper presents an IMU evaluation method by grafting data from the tested IMU to the reference data from a higher-grade IMU. The signal grafting (SG) method has several benefits: (a) only one set of field tests with a higher-grade IMU is needed, and can be used to evaluate numerous IMUs. Thus, SG is effective and economic because all data from the tested IMU is collected in the lab; (b) it is a general approach to compare navigation performances of various IMUs by using the same reference data; and, finally, (c) through SG, one can first evaluate an IMU in the lab, and then decide whether to further test it. Moreover, this paper verified the validity of SG to both medium- and low-grade IMUs, and presents and compared two SG strategies, i.e., the basic-error strategy and the full-error strategy. SG provided results similar to field tests, with a difference of under 5% and 19.4%-26.7% for tested tactical-grade and MEMS IMUs. Meanwhile, it was found that dynamic IMU errors were essential to guarantee the effect of the SG method.

  16. Development of evaluation method for software hazard identification techniques

    SciTech Connect

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-07-01

    This research evaluated currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis, and then determined evaluation indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change; however, the insight into these techniques is much more important than the numbers obtained by the evaluation. (authors)

  17. A new method to evaluate human-robot system performance.

    PubMed

    Rodriguez, G; Weisbin, C R

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  18. A new method to evaluate human-robot system performance

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  19. Operator performance evaluation using multi criteria decision making methods

    NASA Astrophysics Data System (ADS)

    Rani, Ruzanita Mat; Ismail, Wan Rosmanira; Razali, Siti Fatihah

    2014-06-01

    Operator performance evaluation is a very important activity in labor-intensive manufacturing industry because a company's productivity depends on the performance of its operators. The aims of operator performance evaluation are to give operators feedback on their performance, to increase the company's productivity, and to identify the strengths and weaknesses of each operator. In this paper, six multi-criteria decision making methods, namely the Analytical Hierarchy Process (AHP), fuzzy AHP (FAHP), ELECTRE, PROMETHEE II, the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR), are used to evaluate and rank the operators. The evaluation is based on six main criteria: competency, experience and skill, teamwork and time punctuality, personal characteristics, capability, and outcome. The study was conducted at an SME food manufacturing company in Selangor. It was found that AHP and FAHP identified the "outcome" criterion as the most important, and the results showed that the same operator was ranked first by all six methods.
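
    Of the six methods, TOPSIS is the most compact to illustrate. Below is a minimal sketch with hypothetical operator scores and weights; the criteria, values, and benefit flags are illustrative, not the study's data.

        import numpy as np

        def topsis(matrix, weights, benefit):
            """Rank alternatives (rows) over criteria (columns) with TOPSIS."""
            m = np.asarray(matrix, dtype=float)
            # 1. Vector-normalize each criterion column and apply weights.
            v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, float)
            # 2. Ideal and anti-ideal solutions (flipped for cost criteria).
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            # 3. Closeness: distance to anti-ideal over total distance.
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)

        # Three operators scored on competency, teamwork, outcome (all benefit).
        scores = topsis([[7, 8, 6], [9, 6, 8], [6, 7, 9]],
                        weights=[0.3, 0.2, 0.5], benefit=[True, True, True])
        print(scores.argsort()[::-1] + 1)  # operator ranking, best first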
