Sample records for proposed method covers

  1. Examination of the semi-automatic calculation technique of vegetation cover rate by digital camera images.

    NASA Astrophysics Data System (ADS)

    Takemine, S.; Rikimaru, A.; Takahashi, K.

    Rice is one of the staple foods of the world. High-quality rice production requires periodically collecting rice growth data to control the growth of the crop. The height of the plant, the number of stems, and the color of the leaves are well-known parameters that indicate rice growth. A rice growth diagnosis method based on these parameters is used operationally in Japan, although collecting them by field survey takes a lot of labor and time. Recently, a labor-saving method for rice growth diagnosis was proposed that is based on the vegetation cover rate of rice. The vegetation cover rate is calculated by discriminating rice plant areas in a digital camera image photographed in the nadir direction. Discrimination of rice plant areas in the image was done by automatic binarization processing. However, with a calculation method that depends on automatic binarization alone, the estimated vegetation cover rate can decrease even as the rice grows. In this paper, a calculation method of vegetation cover rate is proposed that is based on the automatic binarization process and refers to growth hysteresis information. For several images obtained by field survey during the rice growing season, the vegetation cover rate was calculated by the conventional automatic binarization processing and by the proposed method, and the vegetation cover rate of both methods was compared with a reference value obtained by visual interpretation. The comparison showed that the proposed method increased the accuracy of discriminating rice plant areas.
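
    For illustration, a minimal Python sketch of the cover-rate pipeline described above: automatic binarization of a nadir RGB photograph (here via an excess-green index and Otsu's threshold), plus a one-line growth-hysteresis guard that keeps the estimated rate from falling as the season progresses. The index, threshold choice, and function names are illustrative assumptions, not the authors' implementation.

      import numpy as np

      def otsu_threshold(values, bins=256):
          # Otsu's method: pick the threshold maximizing between-class variance.
          hist, edges = np.histogram(values, bins=bins)
          p = hist.astype(float) / hist.sum()
          centers = 0.5 * (edges[:-1] + edges[1:])
          w0 = np.cumsum(p)                  # class-0 probability
          m = np.cumsum(p * centers)         # cumulative mean
          with np.errstate(divide="ignore", invalid="ignore"):
              var_b = (m[-1] * w0 - m) ** 2 / (w0 * (1.0 - w0))
          return centers[np.nanargmax(var_b)]

      def cover_rate(rgb, previous_rate=0.0):
          # rgb: (H, W, 3) float image taken in the nadir direction.
          r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          exg = 2.0 * g - r - b              # excess-green index highlights plants
          plant = exg > otsu_threshold(exg)  # automatic binarization
          rate = plant.mean()
          # Growth-hysteresis guard (simplified): the cover rate should not
          # drop against the growth of rice within one season.
          return max(rate, previous_rate)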

  2. 77 FR 7080 - Changes To Implement Transitional Program for Covered Business Method Patents

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... 0651-AC73 Changes To Implement Transitional Program for Covered Business Method Patents AGENCY: United... covered business method patents to be conducted before the Patent Trial and Appeal Board (Board). These..., Covered Business Method Patent Review Proposed Rules.'' Comments may also be sent by electronic mail...

  3. 77 FR 7095 - Transitional Program for Covered Business Method Patents-Definition of Technological Invention

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... 0651-AC75 Transitional Program for Covered Business Method Patents-- Definition of Technological... proceeding for covered business method patents. The provision of the Leahy-Smith America Invents Act will... to the attention of ``Lead Judge Michael Tierney, Covered Business Method Patent Review Proposed...

  4. General form of a cooperative gradual maximal covering location problem

    NASA Astrophysics Data System (ADS)

    Bagherinejad, Jafar; Bashiri, Mahdi; Nikzad, Hamideh

    2018-07-01

    Cooperative and gradual covering are two new methods for developing covering location models. In this paper, a cooperative maximal covering location-allocation model is developed (CMCLAP). In addition, both cooperative and gradual covering concepts are applied to the maximal covering location simultaneously (CGMCLP). Then, we develop an integrated form of a cooperative gradual maximal covering location problem, which is called a general CGMCLP. By setting the model parameters, the proposed general model can easily be transformed into other existing models, facilitating general comparisons. The proposed models are developed without allocation for physical signals and with allocation for non-physical signals in discrete location space. Comparison of the previously introduced gradual maximal covering location problem (GMCLP) and cooperative maximal covering location problem (CMCLP) models with our proposed CGMCLP model in similar data sets shows that the proposed model can cover more demands and acts more efficiently. Sensitivity analyses are performed to show the effect of related parameters and the model's validity. Simulated annealing (SA) and a tabu search (TS) are proposed as solution algorithms for the developed models for large-sized instances. The results show that the proposed algorithms are efficient solution approaches, considering solution quality and running time.

  5. Mapping Urban Tree Canopy Cover Using Fused Airborne LIDAR and Satellite Imagery Data

    NASA Astrophysics Data System (ADS)

    Parmehr, Ebadat G.; Amati, Marco; Fraser, Clive S.

    2016-06-01

    Urban green spaces, particularly urban trees, play a key role in enhancing the liveability of cities. The availability of accurate and up-to-date maps of tree canopy cover is important for sustainable development of urban green spaces. LiDAR point clouds are widely used for the mapping of buildings and trees, and several LiDAR point cloud classification techniques have been proposed for automatic mapping. However, the effectiveness of point cloud classification techniques for automated tree extraction from LiDAR data can be impacted to the point of failure by the complexity of tree canopy shapes in urban areas. Multispectral imagery, which provides complementary information to LiDAR data, can improve point cloud classification quality. This paper proposes a reliable method for the extraction of tree canopy cover from fused LiDAR point cloud and multispectral satellite imagery data. The proposed method initially associates each LiDAR point with spectral information from the co-registered satellite imagery data. It calculates the normalised difference vegetation index (NDVI) value for each LiDAR point and corrects tree points which have been misclassified as buildings. Then, region growing of tree points, taking the NDVI value into account, is applied. Finally, the LiDAR points classified as tree points are utilised to generate a canopy cover map. The performance of the proposed tree canopy cover mapping method is experimentally evaluated on a data set of airborne LiDAR and WorldView 2 imagery covering a suburb in Melbourne, Australia.
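
    A hedged sketch of the NDVI correction step described above, assuming per-point NIR and red reflectance have already been sampled from the co-registered WorldView-2 image; the class codes and the 0.3 NDVI cut-off are illustrative assumptions, not the paper's settings.

      import numpy as np

      GROUND, BUILDING, TREE = 0, 1, 2       # hypothetical class codes

      def ndvi(nir, red, eps=1e-9):
          return (nir - red) / (nir + red + eps)

      def correct_tree_points(labels, nir, red, ndvi_min=0.3):
          # Flip LiDAR points misclassified as buildings back to trees
          # when their spectral signature looks vegetated.
          v = ndvi(nir, red)
          out = labels.copy()
          out[(labels == BUILDING) & (v > ndvi_min)] = TREE
          return out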

  6. Optic-null space medium for cover-up cloaking without any negative refraction index materials

    PubMed Central

    Sun, Fei; He, Sailing

    2016-01-01

    With the help of optic-null medium, we propose a new way to achieve invisibility by covering up the scattering without using any negative refraction index materials. Compared with previous methods to achieve invisibility, the function of our cloak is to cover up the scattering of the objects to be concealed by a background object of strong scattering. The concealed object can receive information from the outside world without being detected. Numerical simulations verify the performance of our cloak. The proposed method will be a great addition to existing invisibility technology. PMID:27383833

  7. Optic-null space medium for cover-up cloaking without any negative refraction index materials.

    PubMed

    Sun, Fei; He, Sailing

    2016-07-07

    With the help of optic-null medium, we propose a new way to achieve invisibility by covering up the scattering without using any negative refraction index materials. Compared with previous methods to achieve invisibility, the function of our cloak is to cover up the scattering of the objects to be concealed by a background object of strong scattering. The concealed object can receive information from the outside world without being detected. Numerical simulations verify the performance of our cloak. The proposed method will be a great addition to existing invisibility technology.

  8. Competitive code-based fast palmprint identification using a set of cover trees

    NASA Astrophysics Data System (ADS)

    Yue, Feng; Zuo, Wangmeng; Zhang, David; Wang, Kuanquan

    2009-06-01

    A palmprint identification system recognizes a query palmprint image by searching for its nearest neighbor from among all the templates in a database. When applied on a large-scale identification system, it is often necessary to speed up the nearest-neighbor searching process. We use competitive code, which has very fast feature extraction and matching speed, for palmprint identification. To speed up the identification process, we extend the cover tree method and propose to use a set of cover trees to facilitate the fast and accurate nearest-neighbor searching. We can use the cover tree method because, as we show, the angular distance used in competitive code can be decomposed into a set of metrics. Using the Hong Kong PolyU palmprint database (version 2) and a large-scale palmprint database, our experimental results show that the proposed method searches for nearest neighbors faster than brute force searching.

  9. Reversible integer wavelet transform for blind image hiding method

    PubMed Central

    Bibi, Nargis; Mahmood, Zahid; Akram, Tallha; Naqvi, Syed Rameez

    2017-01-01

    In this article, a blind reversible data-hiding methodology that embeds secret data into a cover image is proposed. The key advantage of this research work is to resolve the privacy and secrecy issues raised during data transmission over the internet. Firstly, data are decomposed into sub-bands using integer wavelets. For decomposition, the Fresnelet transform is utilized, which encrypts the secret data by choosing a unique key parameter to construct a dummy pattern. The dummy pattern is then embedded into an approximation sub-band of the cover image. The proposed method achieves high capacity and strong imperceptibility of the embedded secret data. With the utilization of a family of integer wavelets, the proposed approach becomes more efficient for the hiding and retrieval process. It retrieves the hidden secret data from the embedded data blindly, without requiring the original cover image. PMID:28498855
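
    The reversibility hinges on integer-to-integer wavelets. Below is a minimal sketch of the 1-D integer Haar (S-) transform via lifting, which reconstructs the cover samples exactly; the Fresnelet encryption and the actual embedding rule are beyond this sketch.

      import numpy as np

      def haar_int_forward(x):
          # Integer Haar via lifting: exactly invertible on integers.
          x = np.asarray(x, dtype=np.int64)
          a, b = x[0::2], x[1::2]        # even / odd samples
          d = b - a                      # detail coefficients
          s = a + d // 2                 # approximation = floor((a + b) / 2)
          return s, d

      def haar_int_inverse(s, d):
          a = s - d // 2
          b = a + d
          x = np.empty(a.size + b.size, dtype=np.int64)
          x[0::2], x[1::2] = a, b
          return x

      x = np.array([3, 5, 4, 3, 250, 251])
      assert np.array_equal(x, haar_int_inverse(*haar_int_forward(x)))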

  10. Proposed hybrid-classifier ensemble algorithm to map snow cover area

    NASA Astrophysics Data System (ADS)

    Nijhawan, Rahul; Raman, Balasubramanian; Das, Josodhir

    2018-01-01

    A metaclassification ensemble approach is known to improve the prediction performance of snow-covered area. The methodology adopted in this case is based on a neural network along with state-of-the-art machine learning algorithms — support vector machine, artificial neural networks, spectral angle mapper, and K-means clustering — and a snow index: the normalized difference snow index. An AdaBoost ensemble algorithm based on decision trees for snow-cover mapping is also proposed. According to the available literature, these methods have rarely been used for snow-cover mapping. Employing the above techniques, a study was conducted for the Raktavarn and Chaturangi Bamak glaciers, Uttarakhand, Himalaya, using a multispectral Landsat 7 ETM+ (Enhanced Thematic Mapper Plus) image. The study also compares the results with those obtained from statistical combination methods (majority rule and belief functions) and with the accuracies of the individual classifiers. Accuracy assessment is performed by computing the quantity and allocation disagreement, analyzing statistical measures (accuracy, precision, specificity, AUC, and sensitivity) and receiver operating characteristic curves. A total of 225 combinations of parameters for the individual classifiers were trained and tested on the dataset, and the results were compared with the proposed approach. The proposed methodology produced the highest classification accuracy (95.21%), followed closely (94.01%) by the proposed AdaBoost ensemble algorithm. From these observations, it was concluded that the ensemble of classifiers produced better results than the individual classifiers.
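
    As one concrete ingredient, the normalized difference snow index (NDSI) used as a feature above can be computed per pixel from green and shortwave-infrared reflectance; a sketch follows, with the common 0.4 snow cut-off as an assumed, not paper-specific, value.

      import numpy as np

      def ndsi(green, swir, eps=1e-9):
          # NDSI = (green - SWIR) / (green + SWIR); snow is bright in the
          # green band and dark in SWIR, so values near 1 suggest snow.
          return (green - swir) / (green + swir + eps)

      green = np.array([0.6, 0.2])
      swir = np.array([0.1, 0.3])
      print(ndsi(green, swir) > 0.4)     # -> [ True False ]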

  11. Aggregation of Sentinel-2 time series classifications as a solution for multitemporal analysis

    NASA Astrophysics Data System (ADS)

    Lewiński, Stanislaw; Nowakowski, Artur; Malinowski, Radek; Rybicki, Marcin; Kukawska, Ewa; Krupiński, Michał

    2017-10-01

    The general aim of this work was to elaborate an efficient and reliable aggregation method that could be used for creating a land cover map at a global scale from multitemporal satellite imagery. The study described in this paper presents methods for combining the results of land cover/land use classifications performed on single-date Sentinel-2 images acquired at different time periods. For that purpose, different aggregation methods were proposed and tested on study sites spread over different continents. The initial classifications were performed with a Random Forest classifier on individual Sentinel-2 images from a time series. In the following step, the resulting land cover maps were aggregated pixel by pixel using three different combinations of information on the number of occurrences of a certain land cover class within a time series and the posterior probability of particular classes resulting from the Random Forest classification. Two of the proposed methods are shown to be superior and in most cases were able to reach or outperform the accuracy of the best individual classifications of single-date images. Moreover, the aggregation results are very stable when used on data with varying cloudiness. They also considerably reduce the number of cloudy pixels in the resulting land cover map, which is a significant advantage for mapping areas with frequent cloud coverage.
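
    A minimal sketch of one plausible combination rule in this family: for every pixel, sum the Random Forest posterior of the winning class across the time series and keep the class with the largest total, which folds together occurrence counts and posterior probabilities. The array shapes and the rule itself are illustrative assumptions, not the paper's exact formulas.

      import numpy as np

      def aggregate(label_stack, prob_stack, n_classes):
          # label_stack: (T, H, W) class maps from T single-date classifications
          # prob_stack:  (T, H, W) posterior of the winning class per pixel
          score = np.zeros((n_classes,) + label_stack.shape[1:])
          for c in range(n_classes):
              score[c] = np.where(label_stack == c, prob_stack, 0.0).sum(axis=0)
          return score.argmax(axis=0)    # (H, W) aggregated land cover map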

  12. Snow depth and snow cover retrieval from FengYun3B microwave radiation imagery based on a snow passive microwave unmixing method in Northeast China

    NASA Astrophysics Data System (ADS)

    Gu, Lingjia; Ren, Ruizhi; Zhao, Kai; Li, Xiaofeng

    2014-01-01

    The precision of snow parameter retrieval is unsatisfactory for current practical demands. The primary reason is the problem of mixed pixels caused by the low spatial resolution of satellite passive microwave data. A snow passive microwave unmixing method is proposed in this paper, based on land cover type data and the antenna gain function of passive microwaves. The land cover of Northeast China is partitioned into grass, farmland, bare soil, forest, and water body types. The component brightness temperatures (CBT), namely the unmixed data, with 1 km resolution are obtained using the proposed unmixing method. The snow depth determined by the CBT and three snow depth retrieval algorithms is validated through field measurements taken in forest and farmland areas of Northeast China in January 2012 and 2013. The results show that the overall retrieval precision of snow depth is improved by 17% in farmland areas and 10% in forest areas when using the CBT in comparison with the mixed pixels. The snow cover results based on the CBT are compared with existing MODIS snow cover products. The results demonstrate that more snow cover information can be obtained, with up to 86% accuracy.

  13. A Hierarchical Object-oriented Urban Land Cover Classification Using WorldView-2 Imagery and Airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Wu, M. F.; Sun, Z. C.; Yang, B.; Yu, S. S.

    2016-11-01

    In order to reduce the “salt and pepper” effect in pixel-based urban land cover classification and expand the application of multi-source data fusion in the field of urban remote sensing, WorldView-2 imagery and airborne Light Detection and Ranging (LiDAR) data were used to improve the classification of urban land cover. An object-oriented hierarchical classification approach is proposed in our study. The proposed method consists of two hierarchies. (1) In the first hierarchy, the LiDAR Normalized Digital Surface Model (nDSM) image was segmented into objects, and NDVI, Coastal Blue and nDSM thresholds were set for extracting building objects. (2) In the second hierarchy, after removing building objects, WorldView-2 fused imagery was obtained by Haze-ratio-based (HR) fusion and segmented, and an SVM classifier was applied to generate road/parking lot, vegetation and bare soil objects. (3) Trees and grasslands were then split based on an nDSM threshold (2.4 meters). The results showed that, compared with pixel-based and non-hierarchical object-oriented approaches, the proposed method provided better urban land cover classification, with the overall accuracy (OA) and overall kappa (OK) improving to 92.75% and 0.90, respectively. Furthermore, the proposed method reduced the “salt and pepper” effect of pixel-based classification, improved the extraction accuracy of buildings based on LiDAR nDSM image segmentation, and reduced the confusion between trees and grasslands through the nDSM threshold.

  14. Estimation of vegetation cover at subpixel resolution using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Jasinski, Michael F.; Eagleson, Peter S.

    1986-01-01

    The present report summarizes the various approaches relevant to estimating canopy cover at subpixel resolution. The approaches are based on physical models of radiative transfer in non-homogeneous canopies and on empirical methods. The effects of vegetation shadows and topography are examined. Simple versions of the model are tested, using the Taos, New Mexico Study Area database. Emphasis has been placed on using relatively simple models requiring only one or two bands. Although most methods require some degree of ground truth, a two-band method is investigated whereby the percent cover can be estimated without ground truth by examining the limits of the data space. Future work is proposed which will incorporate additional surface parameters into the canopy cover algorithm, such as topography, leaf area, or shadows. The method involves deriving a probability density function for the percent canopy cover based on the joint probability density function of the observed radiances.
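
    A hedged sketch of a two-band, ground-truth-free percent-cover estimate in the spirit described above: endmembers are taken from the limits of the red/NIR data space and each pixel is projected onto the soil-canopy line of a linear mixing model. The endmember rule is a simplification of the report's probability-density formulation, not its actual algorithm.

      import numpy as np

      def percent_cover(red, nir):
          pts = np.stack([red.ravel(), nir.ravel()], axis=1)
          soil = pts[np.argmax(pts[:, 0] - pts[:, 1])]  # bright-red, dark-NIR corner
          veg = pts[np.argmax(pts[:, 1] - pts[:, 0])]   # dark-red, bright-NIR corner
          axis = veg - soil
          # Fractional position of each pixel along the soil-to-canopy line.
          f = (pts - soil) @ axis / (axis @ axis)
          return np.clip(f, 0.0, 1.0).reshape(red.shape)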

  15. A higher order conditional random field model for simultaneous classification of land cover and land use

    NASA Astrophysics Data System (ADS)

    Albert, Lena; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    We propose a new approach for the simultaneous classification of land cover and land use considering spatial as well as semantic context. We apply a Conditional Random Fields (CRF) consisting of a land cover and a land use layer. In the land cover layer of the CRF, the nodes represent super-pixels; in the land use layer, the nodes correspond to objects from a geospatial database. Intra-layer edges of the CRF model spatial dependencies between neighbouring image sites. All spatially overlapping sites in both layers are connected by inter-layer edges, which leads to higher order cliques modelling the semantic relation between all land cover and land use sites in the clique. A generic formulation of the higher order potential is proposed. In order to enable efficient inference in the two-layer higher order CRF, we propose an iterative inference procedure in which the two classification tasks mutually influence each other. We integrate contextual relations between land cover and land use in the classification process by using contextual features describing the complex dependencies of all nodes in a higher order clique. These features are incorporated in a discriminative classifier, which approximates the higher order potentials during the inference procedure. The approach is designed for input data based on aerial images. Experiments are carried out on two test sites to evaluate the performance of the proposed method. The experiments show that the classification results are improved compared to the results of a non-contextual classifier. For land cover classification, the result is much more homogeneous and the delineation of land cover segments is improved. For the land use classification, an improvement is mainly achieved for land use objects showing non-typical characteristics or similarities to other land use classes. Furthermore, we have shown that the size of the super-pixels has an influence on the level of detail of the classification result, but also on the degree of smoothing induced by the segmentation method, which is especially beneficial for land cover classes covering large, homogeneous areas.

  16. Sunglass detection method for automation of video surveillance system

    NASA Astrophysics Data System (ADS)

    Sikandar, Tasriva; Samsudin, Wan Nur Azhani W.; Hawari Ghazali, Kamarul; Mohd, Izzeldin I.; Fazle Rabbi, Mohammad

    2018-04-01

    Wearing sunglasses to hide the face from surveillance cameras is a common activity in criminal incidents. Therefore, sunglass detection from surveillance video has become a demanding issue in the automation of security systems. In this paper, we propose an image processing method to detect sunglasses in surveillance images. Specifically, a unique feature using facial height and width has been employed to identify the covered region of the face. The presence of an area covered by sunglasses is evaluated using the facial height-width ratio, and a threshold on the covered-area percentage is used to classify a glass-wearing face. Two different types of glasses have been considered, i.e. eyeglasses and sunglasses. The results of this study demonstrate that the proposed method is able to detect sunglasses in two different illumination conditions: room illumination and the presence of sunlight. In addition, due to the multi-level checking in the facial region, this method has 100% accuracy in detecting sunglasses. However, in an exceptional case where fabric surrounding the face has a similar color to skin, the correct detection rate was found to be 93.33% for eyeglasses.
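
    A minimal sketch of the covered-area decision described above, assuming a cropped grayscale face and treating very dark pixels in the eye region as "covered"; both thresholds are illustrative stand-ins for the paper's derived values.

      import numpy as np

      def covered_ratio(gray_face, dark_max=60):
          # Fraction of the upper half of the face (eye region) that is
          # occluded by dark pixels.
          upper = gray_face[: gray_face.shape[0] // 2]
          return float((upper < dark_max).mean())

      def wears_sunglasses(gray_face, ratio_threshold=0.25):
          return covered_ratio(gray_face) > ratio_threshold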

  17. Comparison of power curve monitoring methods

    NASA Astrophysics Data System (ADS)

    Cambron, Philippe; Masson, Christian; Tahan, Antoine; Torres, David; Pelletier, Francis

    2017-11-01

    Performance monitoring is an important aspect of operating wind farms. This can be done through power curve monitoring (PCM) of wind turbines (WT). In past years, important work has been conducted on PCM and various methodologies have been proposed, each one with interesting results. However, it is difficult to compare these methods because they have been developed on their respective data sets. The objective of the present work is to compare some of the proposed PCM methods using common data sets. The metric used to compare the PCM methods is the time needed to detect a change in the power curve. Two power curve models are covered to establish the effect the model type has on the monitoring outcomes, and each model was tested with two control charts. Other methodologies and metrics proposed in the literature for power curve monitoring, such as areas under the power curve and the use of statistical copulas, have also been covered. Results demonstrate that model-based PCM methods are more reliable at detecting a performance change than the other methodologies and that the effectiveness of the control chart depends on the type of shift observed.
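
    For illustration, a sketch of one standard control chart that could be run on power-curve residuals (measured minus modelled power): an EWMA chart with textbook settings. The smoothing constant, limit width, and residual definition are assumptions rather than the paper's tuning.

      import numpy as np

      def ewma_alarm(residuals, lam=0.2, L=3.0):
          # Returns the first sample index at which the EWMA chart signals
          # a shift in the power curve, or None if it never signals.
          r = np.asarray(residuals, dtype=float)
          sigma = r.std(ddof=1)
          limit = L * sigma * np.sqrt(lam / (2.0 - lam))  # asymptotic limit
          z = 0.0
          for i, x in enumerate(r):
              z = lam * x + (1.0 - lam) * z
              if abs(z) > limit:
                  return i
          return None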

  18. Quality Assurance in Higher Education: Proposals for Consultation.

    ERIC Educational Resources Information Center

    Higher Education Funding Council for England, Bristol.

    This document sets out for consultation proposals for a revised method for quality assurance of teaching and learning in higher education. The proposals cover: (1) the objectives and principles of quality assurance; (2) an approach to quality assurance based on external audit principles; (3) the collection and publication of information; (4)…

  19. Target deception jamming method against spaceborne synthetic aperture radar using electromagnetic scattering

    NASA Astrophysics Data System (ADS)

    Sun, Qingyang; Shu, Ting; Tang, Bin; Yu, Wenxian

    2018-01-01

    A method is proposed to perform target deception jamming against spaceborne synthetic aperture radar. Compared with traditional jamming methods that use deception templates to cover the target or region of interest, the proposed method aims to generate a verisimilar deceptive target in various attitudes with high fidelity using electromagnetic (EM) scattering. Based on the geometrical model for target deception jamming, the EM scattering data of the deceptive target are first simulated with an EM calculation software package. Then, the proposed jamming frequency response (JFR) is calculated offline by further processing. Finally, the deception jamming is achieved in real time by a multiplication between the proposed JFR and the spectrum of the intercepted radar signals. A practical implementation is presented, and the simulation results prove the validity of the proposed method.
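
    The real-time step reduces to frequency-domain filtering: multiply the spectrum of the intercepted pulse by the precomputed JFR and transform back. A minimal sketch, with complex baseband samples assumed:

      import numpy as np

      def apply_jamming(intercepted, jfr):
          # intercepted: complex baseband samples of the intercepted pulse
          # jfr: precomputed jamming frequency response, same length
          return np.fft.ifft(np.fft.fft(intercepted) * jfr)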

  20. Cloud Detection of Optical Satellite Images Using Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Lee, Kuan-Yi; Lin, Chao-Hung

    2016-06-01

    Cloud cover is generally present in optical remote-sensing images, which limits the usage of acquired images and increases the difficulty of data analysis in tasks such as image compositing, correction of atmospheric effects, calculation of vegetation indices, land cover classification, and land cover change detection. In previous studies, thresholding has been a common and useful method for cloud detection. However, a selected threshold is usually suitable only for certain cases or local study areas, and it may fail in other cases; in other words, thresholding-based methods are data-sensitive. Besides, there are many exceptions to control and the environment changes dynamically, so using the same threshold value on various data is not effective. In this study, a threshold-free method based on Support Vector Machine (SVM) is proposed, which can avoid the abovementioned problems: a statistical model is adopted to detect clouds instead of a subjective thresholding-based method, which is the main idea of this study. The features used in a classifier are the key to a successful classification. The Automatic Cloud Cover Assessment (ACCA) algorithm, which is based on the physical characteristics of clouds, is used to distinguish clouds from other objects; similarly, the Fmask algorithm (Zhu et al., 2012) uses many thresholds and criteria to screen clouds, cloud shadows, and snow. Therefore, the feature extraction is based on the ACCA algorithm and Fmask. Spatial and temporal information is also important for satellite images; consequently, a co-occurrence matrix and the temporal variance with uniformity of the major principal axis are used in the proposed method. We aim to classify images into three groups: cloud, non-cloud, and others. In the experiments, images acquired by the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) and images containing landscapes of agriculture, snow areas, and islands are tested. Experimental results demonstrate that the detection accuracy of the proposed method is better than that of related methods.
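
    A hedged sketch of the classification stage, assuming per-pixel feature vectors (ACCA/Fmask-style spectral tests, texture, temporal variance) have already been extracted; the random arrays below are placeholders for those features and labels.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      X_train = np.random.rand(200, 6)        # placeholder feature vectors
      y_train = np.random.randint(0, 3, 200)  # 0 = cloud, 1 = non-cloud, 2 = other

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
      clf.fit(X_train, y_train)
      labels = clf.predict(np.random.rand(50, 6))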

  1. Optimized extreme learning machine for urban land cover classification using hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Su, Hongjun; Tian, Shufang; Cai, Yue; Sheng, Yehua; Chen, Chen; Najafian, Maryam

    2017-12-01

    This work presents a new urban land cover classification framework using the firefly algorithm (FA) optimized extreme learning machine (ELM). FA is adopted to optimize the regularization coefficient C and Gaussian kernel σ for kernel ELM. Additionally, effectiveness of spectral features derived from an FA-based band selection algorithm is studied for the proposed classification task. Three sets of hyperspectral databases were recorded using different sensors, namely HYDICE, HyMap, and AVIRIS. Our study shows that the proposed method outperforms traditional classification algorithms such as SVM and reduces computational cost significantly.

  2. DEVELOPMENT AND APPLICATION OF METHODS TO ASSESS HUMAN EXPOSURE TO PESTICIDES

    EPA Science Inventory

    Note: this task is scheduled to end September 2003. Two tasks will take its place: method development for emerging pesticides, including chiral chemistry applications, and in-house laboratory operations. Field sampling methods are covered under a new task proposed this year.

  3. Camouflaging in Digital Image for Secure Communication

    NASA Astrophysics Data System (ADS)

    Jindal, B.; Singh, A. P.

    2013-06-01

    The present paper reports on a new type of camouflaging in digital images for hiding crypto-data using moderate bit alteration in the pixel. In the proposed method, cryptography is combined with steganography to provide two-layer security for the hidden data. The novelty of the algorithm proposed in the present work lies in the fact that information about the hidden bit is reflected by a parity condition in one part of the image pixel. The remaining part of the image pixel is used to perform local pixel adjustment to improve the visual perception of the cover image. In order to examine the effectiveness of the proposed method, image quality measures are computed. In addition, security analysis is also carried out by comparing the histograms of the cover and stego images. This scheme provides higher security as well as robustness to intentional and unintentional attacks.

  4. Social Science: Course Proposal.

    ERIC Educational Resources Information Center

    Cook, Charles Gene

    A proposal is presented for a Community College of Philadelphia course surveying basic social science skills and information, including scientific method, map usage, evolution, native peoples, social groups, and U.S. Government. Following a standard cover form, a statement of purpose for the course indicates that it is designed to provide…

  5. GIS based optimal impervious surface map generation using various spatial data for urban nonpoint source management.

    PubMed

    Lee, Cholyoung; Kim, Kyehyun; Lee, Hyuk

    2018-01-15

    Impervious surfaces are mainly artificial structures such as rooftops, roads, and parking lots that are covered by impenetrable materials. These surfaces are becoming the major causes of nonpoint source (NPS) pollution in urban areas. The rapid progress of urban development is increasing the total amount of impervious surfaces and NPS pollution. Therefore, many cities worldwide have adopted a stormwater utility fee (SUF) that generates funds needed to manage NPS pollution. The amount of SUF is estimated based on the impervious ratio, which is calculated by dividing the total impervious surface area by the net area of an individual land parcel. Hence, in order to identify the exact impervious ratio, large-scale impervious surface maps (ISMs) are necessary. This study proposes and assesses various methods for generating large-scale ISMs for urban areas by using existing GIS data. Bupyeong-gu, a district in the city of Incheon, South Korea, was selected as the study area. Spatial data that were freely offered by national/local governments in S. Korea were collected. First, three types of ISMs were generated by using the land-cover map, digital topographic map, and orthophotographs, to validate three methods that had been proposed conceptually by Korea Environment Corporation. Then, to generate an ISM of higher accuracy, an integration method using all data was proposed. Error matrices were made and Kappa statistics were calculated to evaluate the accuracy. Overlay analyses were performed to examine the distribution of misclassified areas. From the results, the integration method delivered the highest accuracy (Kappa statistic of 0.99) compared to the three methods that use a single type of spatial data. However, a longer production time and higher cost were limiting factors. Among the three methods using a single type of data, the land-cover map showed the highest accuracy with a Kappa statistic of 0.91. Thus, it was judged that the mapping method using the land-cover map is more appropriate than the others. In conclusion, it is desirable to apply the integration method when generating the ISM with the highest accuracy. However, if time and cost are constrained, it would be effective to primarily use the land-cover map.

  6. Testing Multivariate Adaptive Regression Splines (MARS) as a Method of Land Cover Classification of TERRA-ASTER Satellite Images.

    PubMed

    Quirós, Elia; Felicísimo, Angel M; Cuartero, Aurora

    2009-01-01

    This work proposes a new method to classify multi-spectral satellite images based on multivariate adaptive regression splines (MARS) and compares this classification system with the more common parallelepiped and maximum likelihood (ML) methods. We apply the classification methods to the land cover classification of a test zone located in southwestern Spain. The basis of the MARS method and its associated procedures are explained in detail, and the area under the ROC curve (AUC) is compared for the three methods. The results show that the MARS method provides better results than the parallelepiped method in all cases, and it provides better results than the maximum likelihood method in 13 cases out of 17. These results demonstrate that the MARS method can be used in isolation or in combination with other methods to improve the accuracy of soil cover classification. The improvement is statistically significant according to the Wilcoxon signed rank test.

  7. Communicative Competence of the Fourth Year Students: Basis for Proposed English Language Program

    ERIC Educational Resources Information Center

    Tuan, Vu Van

    2017-01-01

    This study on the level of communicative competence, covering linguistic/grammatical and discourse competence, aimed at constructing a proposed English language program for 5 key universities in Vietnam. The descriptive method was employed with comparative techniques and correlational analysis. The researcher treated the surveyed data…

  8. Iliotibial band friction syndrome

    PubMed Central

    2010-01-01

    Published articles on iliotibial band friction syndrome have been reviewed. These articles cover the epidemiology, etiology, anatomy, pathology, prevention, and treatment of the condition. This article describes (1) the various etiological models that have been proposed to explain iliotibial band friction syndrome; (2) some of the imaging methods, research studies, and clinical experiences that support or call into question these various models; (3) commonly proposed treatment methods for iliotibial band friction syndrome; and (4) the rationale behind these methods and the clinical outcome studies that support their efficacy. PMID:21063495

  9. A novel method for multifactorial bio-chemical experiments design based on combinational design theory.

    PubMed

    Wang, Xun; Sun, Beibei; Liu, Boyang; Fu, Yaping; Zheng, Pan

    2017-01-01

    Experimental design focuses on describing or explaining the multifactorial interactions that are hypothesized to reflect the variation. The design introduces conditions that may directly affect the variation, where particular conditions are purposely selected for observation. Combinatorial design theory deals with the existence, construction and properties of systems of finite sets whose arrangements satisfy generalized concepts of balance and/or symmetry. In this work, borrowing the concept of "balance" in combinatorial design theory, a novel method for multifactorial bio-chemical experiment design is proposed, where balanced templates in combinatorial design are used to select the conditions for observation. Balanced experimental data that cover all the influencing factors of the experiments can be obtained for further processing, such as a training set for machine learning models. Finally, a software tool based on the proposed method is developed for designing experiments that cover the influencing factors a certain number of times.

  10. Improved Frame Mode Selection for AMR-WB+ Based on Decision Tree

    NASA Astrophysics Data System (ADS)

    Kim, Jong Kyu; Kim, Nam Soo

    In this letter, we propose a coding mode selection method for the AMR-WB+ audio coder based on a decision tree. In order to reduce computation while maintaining good performance, a decision tree classifier is adopted with the closed-loop mode selection results as the target classification labels. The size of the decision tree is controlled by pruning, so the proposed method does not increase the memory requirement significantly. Through an evaluation test on a database covering both speech and music materials, the proposed method is found to achieve much better mode selection accuracy than the open-loop mode selection module in AMR-WB+.

  11. Enhancement of spectral quality of archival aerial photographs using satellite imagery for detection of land cover

    NASA Astrophysics Data System (ADS)

    Siok, Katarzyna; Jenerowicz, Agnieszka; Woroszkiewicz, Małgorzata

    2017-07-01

    Archival aerial photographs are often the only reliable source of information about an area. However, they are single-band data that do not allow unambiguous detection of particular forms of land cover. Thus, the authors of this article seek to develop a method of coloring panchromatic aerial photographs that increases the spectral information of such images. The study used data integration algorithms based on pansharpening, implemented in commonly used remote sensing programs: ERDAS, ENVI, and PCI. Aerial photos and Landsat multispectral data recorded in 1987 and 2016 were chosen. This study proposes the use of modified intensity-hue-saturation and Brovey methods. These methods enabled the addition of red-green-blue (RGB) components to monochrome images, thus enhancing their interpretability and spectral quality. The limitations of the proposed method relate to the availability of RGB satellite imagery, the accuracy of mutual orientation of the aerial and satellite data, and the imperfections of archival aerial photographs. Therefore, it should be expected that the results of coloring will not be perfect compared to the results of fusing recent data with a similar ground sampling resolution, but they will still allow a more accurate and efficient classification of land cover registered on archival aerial photographs.
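
    For reference, the Brovey transform named above is a one-line ratio fusion; a minimal sketch, assuming the multispectral bands have already been resampled and co-registered to the panchromatic (here, scanned aerial photo) grid:

      import numpy as np

      def brovey(ms, pan, eps=1e-9):
          # ms:  (3, H, W) RGB resampled to the panchromatic grid
          # pan: (H, W) panchromatic image (e.g., the archival photo)
          intensity = ms.sum(axis=0) + eps
          return ms * (pan / intensity)[None, :, :]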

  12. Numerical modelling of methane oxidation efficiency and coupled water-gas-heat reactive transfer in a sloping landfill cover.

    PubMed

    Feng, S; Ng, C W W; Leung, A K; Liu, H W

    2017-10-01

    Microbial aerobic methane oxidation in an unsaturated landfill cover involves coupled water, gas and heat reactive transfer. The coupled process is complex and its influence on methane oxidation efficiency is not clear, especially in steep covers where spatial variations of water, gas and heat are significant. In this study, two-dimensional finite element numerical simulations were carried out to evaluate the performance of an unsaturated sloping cover. The numerical model was calibrated using a set of flume model test data and was subsequently used for a parametric study. A new method that considers transient changes of methane concentration in the estimation of methane oxidation efficiency was proposed and compared against existing methods. It was found that a steeper cover had a lower oxidation efficiency due to enhanced downslope water flow, during which desaturation of the soil promoted gas transport and hence landfill gas emission. This effect was magnified as the cover angle and the landfill gas generation rate at the bottom of the cover increased. Assuming a steady-state methane concentration in a cover would result in a non-conservative overestimation of oxidation efficiency, especially when a steep cover is subjected to rainfall infiltration. By considering the transient methane concentration, the newly modified method can give a more accurate oxidation efficiency.

  13. Best Hiding Capacity Scheme for Variable Length Messages Using Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Bajaj, Ruchika; Bedi, Punam; Pal, S. K.

    Steganography is an art of hiding information in such a way that prevents the detection of hidden messages. Besides the security of the data, the quantity of data that can be hidden in a single cover medium is also very important. We present a secure data hiding scheme with high embedding capacity for messages of variable length based on Particle Swarm Optimization. This technique gives the best pixel positions in the cover image, which can be used to hide the secret data. In the proposed scheme, k bits of the secret message are substituted into the k least significant bits of an image pixel, where k varies from 1 to 4 depending on the message length. The proposed scheme is tested and the results compared with simple LSB substitution and uniform 4-bit LSB hiding (with PSO) for the test images Nature, Baboon, Lena and Kitty. The experimental study confirms that the proposed method achieves high data hiding capacity, maintains imperceptibility and minimizes the distortion between the cover image and the obtained stego image.
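
    The per-pixel operation is plain k-bit LSB substitution; a minimal sketch follows. The PSO search that selects which pixels to use is omitted, and the helper names are illustrative:

      def embed_k_lsb(pixel, bits, k):
          # Replace the k least-significant bits of an 8-bit pixel with the
          # integer `bits` (0 <= bits < 2**k); k varies from 1 to 4.
          return (pixel & ~((1 << k) - 1)) | bits

      def extract_k_lsb(pixel, k):
          return pixel & ((1 << k) - 1)

      stego = embed_k_lsb(0b10110110, 0b011, k=3)
      assert extract_k_lsb(stego, 3) == 0b011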

  14. SIZE DISTRIBUTION OF SEA-SALT EMISSIONS AS A FUNCTION OF RELATIVE HUMIDITY

    EPA Science Inventory

    This note presents a straightforward method to correct sea-salt-emission particle-size distributions according to local relative humidity. The proposed method covers a wide range of relative humidity (0.45 to 0.99) and its derivation incorporates recent laboratory results on sea-...

  15. A technology mapping based on graph of excitations and outputs for finite state machines

    NASA Astrophysics Data System (ADS)

    Kania, Dariusz; Kulisz, Józef

    2017-11-01

    A new, efficient technology mapping method of FSMs, dedicated for PAL-based PLDs is proposed. The essence of the method consists in searching for the minimal set of PAL-based logic blocks that cover a set of multiple-output implicants describing the transition and output functions of an FSM. The method is based on a new concept of graph: the Graph of Excitations and Outputs. The proposed algorithm was tested using the FSM benchmarks. The obtained results were compared with the classical technology mapping of FSM.

  16. Image steganography based on 2k correction and coherent bit length

    NASA Astrophysics Data System (ADS)

    Sun, Shuliang; Guo, Yongning

    2014-10-01

    In this paper, a novel algorithm is proposed. Firstly, the edges of the cover image are detected with the Canny operator and secret data are embedded in edge pixels. A sorting method is used to randomize the edge pixels in order to enhance security. The coherent bit length L is determined by the relevant edge pixels. Finally, the method of 2k correction is applied to achieve better imperceptibility in the stego image. Experiments show that the proposed method outperforms LSB-3 and Jae-Gil Yu's method in PSNR and capacity.

  17. Global Characterization and Monitoring of Forest Cover Using Landsat Data: Opportunities and Challenges

    NASA Technical Reports Server (NTRS)

    Townshend, John R.; Masek, Jeffrey G.; Huang, ChengQuan; Vermote, Eric F.; Gao, Feng; Channan, Saurabh; Sexton, Joseph O.; Feng, Min; Narasimhan, Ramghuram; Kim, Dohyung; et al.

    2012-01-01

    The compilation of global Landsat data-sets and the ever-lowering costs of computing now make it feasible to monitor the Earth's land cover at Landsat resolutions of 30 m. In this article, we describe methods to create global products of forest cover and cover change at Landsat resolutions. Nevertheless, there are many challenges in ensuring the creation of high-quality products, and we propose various ways in which the challenges can be overcome. Among the challenges are the need for atmospheric correction, incorrect calibration coefficients in some of the data-sets, the different phenologies between compilations, the need for terrain correction, the lack of consistent reference data for training and accuracy assessment, and the need for highly automated characterization and change detection. We propose and evaluate the creation and use of surface reflectance products, improved selection of scenes to reduce phenological differences, terrain illumination correction, automated training selection, and the use of information extraction procedures robust to errors in training data, along with several other measures. At several stages we use Moderate Resolution Imaging Spectroradiometer (MODIS) data and products to assist our analysis. A global working prototype product of forest cover and forest cover change is included.

  18. An information dimension of weighted complex networks

    NASA Astrophysics Data System (ADS)

    Wen, Tao; Jiang, Wen

    2018-07-01

    Fractality and self-similarity are important properties of complex networks, and the information dimension is a useful measure for revealing them. In this paper, an information dimension is proposed for weighted complex networks. Based on the box-covering algorithm for weighted complex networks (BCANw), the proposed method can deal with the weighted complex networks that appear frequently in the real world, and it captures the influence of the number of nodes in each box on the information dimension. To show the wide scope of the information dimension, some applications are illustrated, indicating that the proposed method is effective and feasible.

  19. Geometrical force constraint method for vessel and x-ray angiogram simulation.

    PubMed

    Song, Shuang; Yang, Jian; Fan, Jingfan; Cong, Weijian; Ai, Danni; Zhao, Yitian; Wang, Yongtian

    2016-01-01

    This study proposes a novel geometrical force constraint method for 3-D vasculature modeling and angiographic image simulation. For this method, space filling force, gravitational force, and topological preserving force are proposed and combined for the optimization of the topology of the vascular structure. The surface covering force and surface adhesion force are constructed to drive the growth of the vasculature on any surface. According to the combination effects of the topological and surface adhering forces, a realistic vasculature can be effectively simulated on any surface. The image projection of the generated 3-D vascular structures is simulated according to the perspective projection and energy attenuation principles of X-rays. Finally, the simulated projection vasculature is fused with a predefined angiographic mask image to generate a realistic angiogram. The proposed method is evaluated on a CT image and three generally utilized surfaces. The results fully demonstrate the effectiveness and robustness of the proposed method.

  20. An Iterative Inference Procedure Applying Conditional Random Fields for Simultaneous Classification of Land Cover and Land Use

    NASA Astrophysics Data System (ADS)

    Albert, L.; Rottensteiner, F.; Heipke, C.

    2015-08-01

    Land cover and land use exhibit strong contextual dependencies. We propose a novel approach for the simultaneous classification of land cover and land use, where semantic and spatial context is considered. The image sites for land cover and land use classification form a hierarchy consisting of two layers: a land cover layer and a land use layer. We apply Conditional Random Fields (CRF) at both layers. The layers differ with respect to the image entities corresponding to the nodes, the employed features and the classes to be distinguished. In the land cover layer, the nodes represent super-pixels; in the land use layer, the nodes correspond to objects from a geospatial database. Both CRFs model spatial dependencies between neighbouring image sites. The complex semantic relations between land cover and land use are integrated in the classification process by using contextual features. We propose a new iterative inference procedure for the simultaneous classification of land cover and land use, in which the two classification tasks mutually influence each other. This helps to improve the classification accuracy for certain classes. The main idea of this approach is that semantic context helps to refine the class predictions, which, in turn, leads to more expressive context information. Thus, potentially wrong decisions can be reversed at later stages. The approach is designed for input data based on aerial images. Experiments are carried out on a test site to evaluate the performance of the proposed method. We show the effectiveness of the iterative inference procedure and demonstrate that a smaller size of the super-pixels has a positive influence on the classification result.

  1. A 0.4-2.3 GHz broadband power amplifier extended continuous class-F design technology

    NASA Astrophysics Data System (ADS)

    Chen, Peng; He, Songbai

    2015-08-01

    A 0.4-2.3 GHz broadband power amplifier (PA) extended continuous class-F design technology is proposed in this paper. A traditional continuous class-F PA achieves high efficiency only over one octave of bandwidth. With the continuing development of wireless communication, PAs are required to cover the mainstream communication standards' working frequencies from 0.4 GHz to 2.2 GHz. In order to achieve this objective, the bandwidths of class-F and continuous class-F PAs are analysed and discussed by means of Fourier series. Also, two criteria, which reduce the continuous class-F PA's implementation complexity, are presented and explained by investigating the overlapping area of the transistor's current and voltage waveforms. The proposed PA design technology is based on the continuous class-F design method and divides the bandwidth into two parts: the first part covers the bandwidth from 1.3 GHz to 2.3 GHz, where the impedances are designed by the continuous class-F method; the other part covers the bandwidth from 0.4 GHz to 1.3 GHz, where the impedance that keeps the PA in high efficiency over this bandwidth is selected and controlled. An improved particle swarm optimisation is employed for realising the multiple impedances of the output and input networks. A PA based on a commercial 10 W GaN high electron mobility transistor is designed and fabricated to verify the proposed design method. The simulation and measurement results show that the proposed PA delivers 40-76% power-added efficiency and more than 11 dB power gain with more than 40 dBm output power over the bandwidth from 0.4-2.3 GHz.

  2. Bank Erosion Vulnerability Zonation (BEVZ) -A Proposed Method of Preparing Bank Erosion Zonation and Its Application on the River Haora, Tripura, India

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Shreya; de, Sunil Kumar

    2014-05-01

    In the present paper an attempt has been made to propose an RS-GIS-based method of erosion vulnerability zonation for an entire river, based on simple techniques that require very little field investigation. The method consists of 8 parameters: rainfall erosivity, lithological factor, bank slope, meander index, river gradient, soil erosivity, vegetation cover and anthropogenic impact. Meteorological data, GSI maps, LISS III (30 m resolution), SRTM DEM (56 m resolution) and Google Images have been used to determine rainfall erosivity, lithological factor, bank slope, meander index, river gradient, vegetation cover and anthropogenic impact; the soil map of the NBSSLP, India has been used for assessing the soil erosivity index. By integrating the individual values of those parameters (the first two remain constant for this particular study area), a bank erosion vulnerability zonation map of the River Haora, Tripura, India (23°37'-23°53'N and 91°15'-91°37'E) has been prepared. The values have been compared with the existing BEHI-NBS method at 60 spots and also with field data from 30 cross sections (covering the 60 spots) taken along the 51 km stretch of the river in Indian Territory, and it was found that the estimated values match the existing method as well as the field data. The whole stretch has been divided into 5 hazard zones, i.e. Very High, High, Moderate, Low and Very Low, covering 5.66 km, 16.81 km, 40.82 km, 29.67 km and 9.04 km respectively. KEY WORDS: Bank erosion, Bank Erosion Hazard Index (BEHI), Near Bank Stress (NBS), Erosivity, Bank Erosion Vulnerability Zonation.

  3. A mutual information-Dempster-Shafer based decision ensemble system for land cover classification of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Pahlavani, Parham; Bigdeli, Behnaz

    2017-12-01

    Hyperspectral images contain extremely rich spectral information that offers great potential to discriminate between various land cover classes. However, these images are usually composed of tens or hundreds of spectrally close bands, which results in high redundancy and a great amount of computation time in hyperspectral classification. Furthermore, in the presence of mixed-coverage pixels, crisp classifiers produce errors of omission and commission. This paper presents a mutual information-Dempster-Shafer system through an ensemble classification approach for the classification of hyperspectral data. First, mutual information is applied to split the data into a few independent partitions to overcome the high dimensionality. Then, a fuzzy maximum likelihood classifier classifies each band subset. Finally, Dempster-Shafer theory is applied to fuse the results of the fuzzy classifiers. In order to assess the proposed method, a crisp ensemble system, based on a support vector machine as the crisp classifier and weighted majority voting as the crisp fusion method, is applied to the hyperspectral data. Furthermore, a dimension reduction system is utilized to assess the effectiveness of the mutual information band splitting of the proposed method. The proposed methodology provides interesting conclusions on the effectiveness and potential of mutual information-Dempster-Shafer based classification of hyperspectral data.
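
    For the fusion step, Dempster's rule of combination is the standard operation; a minimal sketch over mass functions represented as dicts from frozensets of class labels to masses (the class names are illustrative):

      from itertools import product

      def dempster(m1, m2):
          # Combine two mass functions by Dempster's rule, normalizing
          # away the mass assigned to conflicting (disjoint) hypotheses.
          fused, conflict = {}, 0.0
          for (a, x), (b, y) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  fused[inter] = fused.get(inter, 0.0) + x * y
              else:
                  conflict += x * y
          if conflict >= 1.0:
              raise ValueError("total conflict; sources cannot be combined")
          return {k: v / (1.0 - conflict) for k, v in fused.items()}

      m1 = {frozenset({"road"}): 0.7, frozenset({"road", "soil"}): 0.3}
      m2 = {frozenset({"soil"}): 0.4, frozenset({"road", "soil"}): 0.6}
      print(dempster(m1, m2))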

  4. Improved LSB matching steganography with histogram characters reserved

    NASA Astrophysics Data System (ADS)

    Chen, Zhihong; Liu, Wenyao

    2008-03-01

    This letter builds on research into the LSB (least significant bit, i.e. the last bit of a binary pixel value) matching steganographic method and on steganalytic methods that target the histograms of cover images, and proposes a modification to LSB matching. In LSB matching, if the LSB of the next cover pixel matches the next bit of secret data, nothing is done; otherwise, one is added to or subtracted from the cover pixel value at random. In our improved method, a steganographic information table is defined that records the changes introduced by the embedded secret bits. Based on the table, the next pixel with the same value is dynamically judged to add or subtract one so as to minimize the change to the cover image's histogram. The modified method therefore allows embedding the same payload as LSB matching but with improved steganographic security and less vulnerability to attacks. The experimental results of the new method show that the histograms maintain their attributes, such as peak values and alternating trends, to an acceptable degree and perform better than LSB matching in terms of histogram distortion and resistance against existing steganalysis.
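
    A hedged sketch of the table-driven idea: keep a running tally of the net change each pixel value's histogram bin has accumulated and steer the ±1 step toward the neighbouring bin with the larger deficit. This is a simplified reading of the letter's rule, not its exact bookkeeping:

      def embed_lsb_matching(pixels, bits, tally):
          # pixels: mutable list of 8-bit values; bits: iterable of 0/1
          # tally: dict value -> net histogram change so far (the
          # 'steganographic information table', simplified)
          for i, bit in enumerate(bits):
              p = pixels[i]
              if p & 1 == bit:
                  continue                        # LSB already matches
              up, down = tally.get(p + 1, 0), tally.get(p - 1, 0)
              step = 1 if up <= down else -1      # refill the emptier bin
              if p == 0:
                  step = 1                        # stay within [0, 255]
              elif p == 255:
                  step = -1
              tally[p] = tally.get(p, 0) - 1
              tally[p + step] = tally.get(p + step, 0) + 1
              pixels[i] = p + step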

  5. Memory-optimized shift operator alternating direction implicit finite difference time domain method for plasma

    NASA Astrophysics Data System (ADS)

    Song, Wanjun; Zhang, Hou

    2017-11-01

    By introducing the alternating direction implicit (ADI) technique and a memory-optimized algorithm into the shift operator (SO) finite difference time domain (FDTD) method, a memory-optimized SO-ADI FDTD method for nonmagnetized collisional plasma is proposed and the corresponding formulae for programming are deduced. To further improve computational efficiency, an iterative method rather than Gaussian elimination is employed to solve the equation set in the derivation of the formulae. Complicated transformations and convolutions are avoided in the proposed method compared with the Z-transform (ZT) ADI FDTD method and the piecewise linear JE recursive convolution (PLJERC) ADI FDTD method. The numerical dispersion of the SO-ADI FDTD method with different plasma frequencies and electron collision frequencies is analyzed and an appropriate ratio of grid size to the minimum wavelength is given. The accuracy of the proposed method is validated by a reflection coefficient test on a nonmagnetized collisional plasma sheet. The test results show that the proposed method is advantageous for improving computational efficiency and saving computer memory. The reflection coefficient of a perfect electric conductor (PEC) sheet covered by multilayer plasma and the RCS of objects coated by plasma are calculated by the proposed method and the simulation results are analyzed.

  6. Design and Installation of a Disposal Cell Cover Field Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, C.H.; Waugh, W.J.; Albright, W.H.

    2011-02-27

    The U.S. Department of Energy's Office of Legacy Management (LM) initiated a cover assessment project in September 2007 to evaluate an inexpensive approach to enhancing the hydrological performance of final covers for disposal cells. The objective is to accelerate and enhance the natural processes that are transforming existing conventional covers, which rely on low-conductivity earthen barriers, into water balance covers that store water in soil and release it as soil evaporation and plant transpiration. A low-conductivity cover could be modified by deliberately blending the upper layers of the cover profile and planting native shrubs. A test facility was constructed at the Grand Junction, Colorado, Disposal Site to evaluate the proposed methodology. The test cover was constructed in two identical sections, each including a large drainage lysimeter. The test cover was constructed with the same design and using the same materials as the existing disposal cell in order to allow a direct comparison of performance. One test section will be renovated using the proposed method; the other is a control. LM is using the lysimeters to evaluate the effectiveness of the renovation treatment by monitoring hydrologic conditions within the cover profile as well as all water entering and leaving the system. This paper describes the historical experience of final covers employing earthen barrier layers, the design and operation of the lysimeter test facility, testing conducted to characterize the as-built engineering and edaphic properties of the lysimeter soils, the calibration of instruments installed at the test facility, and monitoring data collected since the lysimeters were constructed.

  7. Using movies in family medicine teaching: A reference to EURACT Educational Agenda

    PubMed Central

    Švab, Igor

    2017-01-01

    Abstract Introduction Cinemeducation is a teaching method in which popular movies or movie clips are used. We aimed to determine whether family physicians' competencies as listed in the Educational Agenda produced by the European Academy of Teachers in General Practice/Family Medicine (EURACT) can be found in movies, and to propose a template for teaching with these movies. Methods A group of family medicine teachers provided a list of movies that they would use in cinemeducation. The movies were categorised according to the key family medicine competencies, thus creating a framework of competencies covered by different movies. These key competencies are Primary care management, Person-centred care, Specific problem-solving skills, Comprehensive approach, Community orientation, and Holistic approach. Results The list consisted of 17 movies. Nine covered primary care management. Person-centred care was covered in 13 movies. Eight movies covered specific problem-solving skills. Comprehensive approach was covered in five movies. Five movies covered community orientation. Holistic approach was covered in five movies. Conclusions All key family medicine competencies listed in the Educational Agenda can be taught using movies. Our results can serve as a template for teachers on how to use any appropriate movies in family medicine education. PMID:28289469

  8. Robust image watermarking using DWT and SVD for copyright protection

    NASA Astrophysics Data System (ADS)

    Harjito, Bambang; Suryani, Esti

    2017-02-01

    The objective of this paper is to propose a robust watermarking scheme combining the Discrete Wavelet Transform (DWT) and Singular Value Decomposition (SVD). The RGB image serves as the cover medium, and the watermark image is converted to grayscale. Both are then transformed using the DWT so that they can be split into several sub-bands, namely LL2, LH2, and HL2. The watermark image is embedded into the cover medium in sub-band LL2. This scheme aims to obtain a higher robustness level than the previous method, which performs SVD matrix factorization of the image for copyright protection. The experimental results show that the proposed method is robust against several image processing attacks such as Gaussian, Poisson, and salt-and-pepper noise, with average Normalized Correlation (NC) values of 0.574863, 0.889784, and 0.889782, respectively. The watermark image can be detected and extracted.
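
    A minimal sketch of the embedding path, assuming the PyWavelets package, a single-channel cover, and an additive rule on the LL2 singular values; the record also transforms the watermark with the DWT, which this simplified sketch omits, and the strength alpha and Haar basis are our choices:

```python
import numpy as np
import pywt

def embed_watermark(cover_gray, wm_gray, alpha=0.05, wavelet="haar"):
    """Embed wm_gray into the LL2 sub-band of cover_gray via SVD."""
    # Two-level 2-D DWT: coeffs[0] is LL2, coeffs[1] is (LH2, HL2, HH2), ...
    coeffs = pywt.wavedec2(cover_gray.astype(float), wavelet, level=2)
    ll2 = coeffs[0]
    u, s, vt = np.linalg.svd(ll2, full_matrices=False)
    # Watermark flattened and resized to the singular-value vector length
    w = np.resize(wm_gray.astype(float).ravel(), s.shape)
    s_marked = s + alpha * w          # additive embedding in singular values
    coeffs[0] = u @ np.diag(s_marked) @ vt
    return pywt.waverec2(coeffs, wavelet)
```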

  9. Selection of test paths for solder joint intermittent connection faults under DC stimulus

    NASA Astrophysics Data System (ADS)

    Huakang, Li; Kehong, Lv; Jing, Qiu; Guanjun, Liu; Bailiang, Chen

    2018-06-01

    The selection of test paths for solder joint intermittent connection faults under direct-current stimulus is examined in this paper. According to the physical structure of the circuit, a network model is established first. A network node is used to represent a test node, and a path edge is associated with the number of intermittent connection faults along the path. Then, selection criteria for the test paths based on a node degree index are proposed, so that the solder joint intermittent connection faults are covered using fewer test paths. Finally, three circuits are selected to verify the method. To test whether an intermittent fault is covered by the test paths, the intermittent fault is simulated by a switch. The results show that the proposed method can detect solder joint intermittent connection faults using fewer test paths. Additionally, the number of detection steps is greatly reduced without compromising fault coverage.

  10. Pulse electrodeposition of CoFe thin films covered with layered double hydroxides as a fast route to prepare enhanced catalysts for oxygen evolution reaction

    NASA Astrophysics Data System (ADS)

    Sakita, Alan M. P.; Noce, Rodrigo Della; Vallés, Elisa; Benedetti, Assis V.

    2018-03-01

    A novel, ultra-fast, one-step method for obtaining an effective catalyst for the oxygen evolution reaction (OER) is proposed. The procedure consists of the direct electrodeposition, in a nitrate-free bath, of CoFe alloy films covered with layered double hydroxides (LDH), in potentiostatic mode, under a continuous or pulsed regime. The catalyst is formed directly on glassy carbon substrates. The best-prepared catalyst material reveals a mixed morphology of granular and dendritic CoFe alloy covered with a sponge of CoFe-LDH containing a Cl interlayer. An overpotential of η(10 mA) = 286 mV, with a Tafel slope of 48 mV dec-1, is obtained for the OER, which demonstrates the enhanced properties of the catalyst. These improved results demonstrate the competitiveness and efficacy of our proposal for the production of OER catalysts.

  11. A scale-invariant change detection method for land use/cover change research

    NASA Astrophysics Data System (ADS)

    Xing, Jin; Sieber, Renee; Caelli, Terrence

    2018-07-01

    Land Use/Cover Change (LUCC) detection relies increasingly on comparing remote sensing images with different spatial and spectral scales. Building on scale-invariant image analysis algorithms from computer vision, we propose a scale-invariant LUCC detection method to identify changes from scale-heterogeneous images. This method is composed of an entropy-based spatial decomposition; two scale-invariant feature extraction methods, the Maximally Stable Extremal Region (MSER) and Scale-Invariant Feature Transform (SIFT) algorithms; a spatial regression voting method to integrate the MSER and SIFT results; a Markov Random Field-based smoothing method; and a support vector machine classification method to assign LUCC labels. We test the scale invariance of our new method with a LUCC case study in Montreal, Canada, 2005-2012. We found that the scale-invariant LUCC detection method provides accuracy similar to the resampling-based approach while avoiding the LUCC distortion incurred by resampling.
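
    A minimal sketch of the two scale-invariant feature extractors on one image tile, assuming OpenCV 4.4 or later (where SIFT lives in the main module); the entropy-based decomposition, regression voting, MRF smoothing, and SVM labeling stages are omitted:

```python
import cv2

def extract_scale_invariant_features(gray_tile):
    """Run MSER and SIFT on a single-band uint8 image tile."""
    # Maximally Stable Extremal Regions: blob-like regions that stay
    # stable across a range of intensity thresholds
    mser = cv2.MSER_create()
    regions, bboxes = mser.detectRegions(gray_tile)

    # SIFT keypoints and 128-D descriptors, invariant to scale/rotation
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray_tile, None)
    return regions, keypoints, descriptors
```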

  12. Influence of die geometry and material selection on the behavior of protective die covers in closed-die forging

    NASA Astrophysics Data System (ADS)

    Yu, Yingyan; Rosenstock, Dirk; Wolfgarten, Martin; Hirt, Gerhard

    2016-10-01

    Given that tooling costs account for up to 30% of the total cost of the final forged part, tool life has always been a main research topic in closed-die forging [1]. To improve the wear resistance of forging dies, many methods such as nitriding and the deposition of ceramic layers have been used. However, all of these methods lose their effect after a certain time, after which tool repair or exchange is needed, requiring additional time and cost. A new method, which applies an inexpensive and exchangeable sheet metal cover to the forging die to protect it from abrasive wear, was first proposed in [2]. According to the first investigation, the die cover is effective in decreasing thermal and mechanical loads, but there are still several challenges to overcome in this concept, such as wrinkling and thinning of the die cover. Therefore, an experimental study using different geometries and die cover materials is presented in this work. The results indicate the existence of feasible application cases for this concept, since conditions are found under which a die cover made of 22MnB5 still keeps its original shape even after 7 forging cycles.

  13. Application of modified Martinez-Silva algorithm in determination of net cover

    NASA Astrophysics Data System (ADS)

    Stefanowicz, Łukasz; Grobelna, Iwona

    2016-12-01

    In this article we present modifications of the Martinez-Silva algorithm, which allows for the determination of the place invariants (p-invariants) of a Petri net. Their generation time is important in the parallel decomposition of discrete systems described by Petri nets. The decomposition process is essential from the point of view of discrete system design, as it allows for the separation of smaller sequential parts. The proposed modifications of the Martinez-Silva method concern the net cover by p-invariants and focus on two important issues: cyclic reduction of the invariant matrix and cyclic checking of the net cover.

  14. A patch-based convolutional neural network for remote sensing image classification.

    PubMed

    Sharma, Atharva; Liu, Xiuwen; Yang, Xiaojun; Shi, Di

    2017-11-01

    Availability of accurate land cover information over large areas is essential to global environmental sustainability; digital classification using medium-resolution remote sensing data would provide an effective method to generate the required land cover information. However, the low accuracy of existing per-pixel classification methods for medium-resolution data is a fundamental limiting factor. While convolutional neural networks (CNNs) with deep layers have achieved unprecedented improvements in object recognition applications that rely on fine image structures, they cannot be applied directly to medium-resolution data due to the lack of such fine structures. In this paper, considering the spatial relation of a pixel to its neighborhood, we propose a new deep patch-based CNN system tailored to medium-resolution remote sensing data. The system is designed by incorporating distinctive characteristics of medium-resolution data; in particular, the system computes patch-based samples from multidimensional top-of-atmosphere reflectance data. On a test site from the Florida Everglades area (with a size of 771 square kilometers), the proposed system outperformed a pixel-based neural network, a pixel-based CNN, and a patch-based neural network by 24.36%, 24.23%, and 11.52%, respectively, in overall classification accuracy. By combining the proposed deep CNN and the huge collection of medium-resolution remote sensing data, we believe that much more accurate land cover datasets can be produced over large areas. Copyright © 2017 Elsevier Ltd. All rights reserved.
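
    A minimal sketch of the patch-based sampling step, assuming reflectance stored as a (bands, H, W) array, integer labels with 0 meaning unlabeled, and a hypothetical patch size of 5 (the abstract does not state the actual patch size):

```python
import numpy as np

def extract_patches(reflectance, labels, patch=5):
    """Build (N, bands, patch, patch) samples centered on labeled pixels.

    reflectance: float array (bands, H, W) of top-of-atmosphere reflectance.
    labels: int array (H, W); 0 marks unlabeled pixels.
    """
    half = patch // 2
    samples, ys = [], []
    bands, h, w = reflectance.shape
    for i in range(half, h - half):
        for j in range(half, w - half):
            if labels[i, j] == 0:
                continue
            samples.append(reflectance[:, i - half:i + half + 1,
                                          j - half:j + half + 1])
            ys.append(labels[i, j])
    return np.stack(samples), np.array(ys)
```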

  15. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the variety of body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination change, and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on RGB-D superpixels to reduce the noise of the depth data. Since it is hard to discriminate all 360° of orientation using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively exploit the complementary nature of both static and motion cues. In order to verify our proposed method, we built an RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method.

  16. Unsupervised universal steganalyzer for high-dimensional steganalytic features

    NASA Astrophysics Data System (ADS)

    Hou, Xiaodan; Zhang, Tao

    2016-11-01

    Research into developing steganalytic features has been highly successful. These features are extremely powerful when applied to supervised binary classification problems. However, they are incompatible with unsupervised universal steganalysis because an unsupervised method cannot distinguish embedding distortion from the varying levels of noise caused by cover variation. This study attempts to alleviate the problem by introducing similarity retrieval of image statistical properties (SRISP), with the specific aim of mitigating the effect of cover variation on the existing steganalytic features. First, cover images with statistical properties similar to those of a given test image are retrieved from a cover database to establish an aided sample set. Then, unsupervised outlier detection is performed on a test set composed of the given test image and its aided sample set to determine the type (cover or stego) of the given test image. Our proposed framework, called SRISP-aided unsupervised outlier detection, requires no training. Thus, it does not suffer from model mismatch problems. Compared with prior unsupervised outlier detectors that do not consider SRISP, the proposed framework not only retains universality but also exhibits superior performance when applied to high-dimensional steganalytic features.

  17. Study on Building Extraction from High-Resolution Images Using MBI

    NASA Astrophysics Data System (ADS)

    Ding, Z.; Wang, X. Q.; Li, Y. L.; Zhang, S. S.

    2018-04-01

    Building extraction from high resolution remote sensing images is a hot research topic in the field of photogrammetry and remote sensing. However, the diversity and complexity of buildings mean that building extraction methods still face challenges in terms of accuracy, efficiency, and so on. In this study, a new building extraction framework based on the morphological building index (MBI) and combined with image segmentation techniques and spectral, shadow, and shape constraints is proposed. In order to verify the proposed method, WorldView-2, GF-2, and GF-1 remote sensing images covering Xiamen Software Park were used for building extraction experiments. Experimental results indicate that the proposed method improves on the original MBI significantly, with a correctness rate of over 86%. Furthermore, the proposed framework reduces false alarms by 42% on average compared to the performance of the original MBI.

  18. Fixtureless nonrigid part inspection using depth cameras

    NASA Astrophysics Data System (ADS)

    Xiong, Hanwei; Xu, Jun; Xu, Chenxi; Pan, Ming

    2016-10-01

    In the automobile industry, flexible thin-shell parts are used to cover the car body. Such parts can have a different shape in the free state than the design model due to dimensional variation, gravity loads, and residual strains, so special inspection fixtures are generally indispensable for geometric inspection. Recently, some researchers have proposed fixtureless nonrigid inspection methods using intrinsic geometry or virtual spring-mass systems, based on assumptions about the deformation between the free-state shape and the nominal CAD shape. In this paper, we propose a new fixtureless method to inspect flexible parts with a depth camera, which is efficient and has low computational complexity. Unlike traditional methods, we gather two point cloud sets of the manufactured part in two different states, and establish correspondences between the two sets and between one of them and the CAD model. The manufacturing defects can be derived from these correspondences. The finite element method (FEM) is not required in our approach. An experimental evaluation of the proposed method is presented.

  19. Interior region-of-interest reconstruction using a small, nearly piecewise constant subregion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taguchi, Katsuyuki; Xu Jingyan; Srivastava, Somesh

    2011-03-15

    Purpose: To develop a method to reconstruct an interior region-of-interest (ROI) image with sufficient accuracy that uses differentiated backprojection (DBP) projection onto convex sets (POCS) [H. Kudo et al., ''Tiny a priori knowledge solves the interior problem in computed tomography'', Phys. Med. Biol. 53, 2207-2231 (2008)] and the tiny knowledge that there exists a nearly piecewise constant subregion. Methods: The proposed method first employs filtered backprojection to reconstruct an image on which a tiny region P with a small variation in pixel values is identified inside the ROI. Total variation minimization [H. Yu and G. Wang, ''Compressed sensing based interior tomography'', Phys. Med. Biol. 54, 2791-2805 (2009); W. Han et al., ''A general total variation minimization theorem for compressed sensing based interior tomography'', Int. J. Biomed. Imaging 2009, Article 125871 (2009)] is then employed to obtain pixel values in the subregion P, which serve as a priori knowledge in the next step. Finally, DBP-POCS is performed to reconstruct f(x,y) inside the ROI. Clinical data and the reconstructed image obtained by an x-ray computed tomography system (SOMATOM Definition; Siemens Healthcare) were used to validate the proposed method. The detector covers an object with a diameter of approximately 500 mm. The projection data were truncated either moderately, to limit the detector coverage to a diameter of 350 mm of the object, or severely, to cover a diameter of 199 mm. Images were reconstructed using the proposed method. Results: The proposed method provided ROI images with correct pixel values in all areas except near the edge of the ROI. The coefficient of variation, i.e., the root mean square error divided by the mean pixel value, was less than 2.0% or 4.5% in the moderate and severe truncation cases, respectively, except near the boundary of the ROI. Conclusions: The proposed method allows for reconstructing interior ROI images with sufficient accuracy given the tiny knowledge that there exists a nearly piecewise constant subregion.

  20. A novel weighted-direction color interpolation

    NASA Astrophysics Data System (ADS)

    Tao, Jin-you; Yang, Jianfeng; Xue, Bin; Liang, Xiaofen; Qi, Yong-hong; Wang, Feng

    2013-08-01

    A digital camera captures images by covering the sensor surface with a color filter array (CFA), obtaining only one color sample at each pixel location. Demosaicking is the process of estimating the missing color components of each pixel to obtain a full-resolution image. In this paper, a new algorithm based on edge adaptivity and different weighting factors is proposed. Our method can effectively suppress undesirable artifacts. Experimental results on Kodak images show that the proposed algorithm obtains higher-quality images compared to other methods in both numerical and visual terms.

  1. 75 FR 46958 - Proposed Fair Market Rents for the Housing Choice Voucher Program and Moderate Rehabilitation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ... program staff. Questions on how to conduct FMR surveys or concerning further methodological explanations... insufficient sample sizes. The areas covered by this estimation method had less than the HUD standard of 200...-bedroom FMR for that area's CBSA as calculated using methods employed for past metropolitan area FMR...

  2. Optimized hyperspectral band selection using hybrid genetic algorithm and gravitational search algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    The serious information redundancy in hyperspectral images (HIs) does not contribute to data analysis accuracy; instead, it requires expensive computational resources. Consequently, to identify the most useful and valuable information in HIs and thereby improve the accuracy of data analysis, this paper proposes a novel hyperspectral band selection method using a hybrid genetic algorithm and gravitational search algorithm (GA-GSA). In the proposed method, the GA-GSA is first mapped to the binary space. Then, the accuracy of a support vector machine (SVM) classifier and the number of selected spectral bands are utilized to measure the discriminative capability of the band subset. Finally, the band subset with the smallest number of spectral bands that covers the most useful and valuable information is obtained. To verify the effectiveness of the proposed method, studies conducted on an AVIRIS image against two recently proposed state-of-the-art GSA variants are presented. The experimental results reveal the superiority of the proposed method and indicate that it can considerably reduce data storage costs and efficiently identify band subsets with stable and high classification precision.

  3. Stylistic Reformulation: Theoretical Premises and Practical Applications.

    ERIC Educational Resources Information Center

    Schultz, Jean Marie

    1994-01-01

    Various aspects of writing style are discussed to propose concrete methods for improving students' performance. Topics covered include the relationship between syntactic and cognitive complexity and classroom techniques and the reformulation technique as applied to student writing samples. (Contains 20 references.) (LB)

  4. Monitoring snow cover variability (2000-2014) in the Hengduan Mountains based on cloud-removed MODIS products with an adaptive spatio-temporal weighted method

    NASA Astrophysics Data System (ADS)

    Li, Xinghua; Fu, Wenxuan; Shen, Huanfeng; Huang, Chunlin; Zhang, Liangpei

    2017-08-01

    Monitoring the variability of snow cover is necessary and meaningful because snow cover is closely connected with climatic and ecological change. In this work, 500 m resolution MODIS daily snow cover products from 2000 to 2014 were adopted to analyze snow cover status in the Hengduan Mountains. In order to address the spatial discontinuity caused by clouds in the products, we propose an adaptive spatio-temporal weighted method (ASTWM), which builds on the initial result of a Terra and Aqua combination. This novel method simultaneously considers the temporal and spatial correlations of the snow cover. Simulated experiments indicate that ASTWM removes clouds completely, with a robust overall accuracy (OA) above 93% under different cloud fractions. The spatio-temporal variability of snow cover in the Hengduan Mountains was investigated with two indices: snow cover days (SCD) and snow fraction. The results reveal that the annual SCD gradually increases and the coefficient of variation (CV) decreases with elevation. The pixel-wise trends of SCD first rise and then drop in most areas. Moreover, intense intra-annual variability of the snow fraction occurs from October to March, during which time there is abundant snow cover. The inter-annual variability, which mainly occurs in high-elevation areas, shows an increasing trend before 2004/2005 and a decreasing trend after 2004/2005. In addition, the snow fraction responds to the two climate factors of air temperature and precipitation: for the intra-annual variability, when air temperature and precipitation decrease, the snow cover increases. Precipitation plays a more important role in the inter-annual variability of snow cover than temperature.

  5. [A method of temperature measurement for hot forging with surface oxide based on infrared spectroscopy].

    PubMed

    Zhang, Yu-cun; Qi, Yan-de; Fu, Xian-bin

    2012-05-01

    High temperature large forgings are covered with a thick oxide layer during forging, which leads to large measurement errors. In this paper, a method of measuring temperature based on infrared spectroscopy is presented that can effectively eliminate the influence of the surface oxide on the temperature measurement. The method measures the surface temperature and emissivity of the oxide directly from the infrared spectrum radiated by the surface oxide of the forging, and then derives the real temperature of the hot forging beneath the oxide using the heat exchange equation. In order to suppress the interference spectra included in the received infrared radiation, a three-interference-filter system was proposed, and a group of optimal gap parameter values was obtained using spectral simulation, improving the precision of the temperature measurement. The experimental results show that the method can accurately measure the surface temperature of high temperature forgings covered with oxide; it meets the requirements of measurement accuracy, and the temperature measurement method is feasible.

  6. Snow Cover Mapping and Ice Avalanche Monitoring from the Satellite Data of the Sentinels

    NASA Astrophysics Data System (ADS)

    Wang, S.; Yang, B.; Zhou, Y.; Wang, F.; Zhang, R.; Zhao, Q.

    2018-04-01

    In order to monitor ice avalanches efficiently under disaster emergency conditions, a snow cover mapping method based on satellite data from the Sentinels is proposed, in which the coherence and backscattering coefficient images of Synthetic Aperture Radar (SAR) data (Sentinel-1) are combined with the atmospheric correction result of multispectral data (Sentinel-2). The coherence image of the Sentinel-1 data is segmented by a certain threshold to map snow cover, with the water bodies extracted from the backscattering coefficient image and removed from the coherence segmentation result. A snow confidence map from Sentinel-2 was used to map the snow cover where the confidence values were relatively high. The method makes full use of the SAR and multispectral images acquired under emergency conditions, and exploits the application potential of Sentinel data in the field of snow cover mapping. The monitoring frequency can be ensured because areas obscured by thick clouds are remedied in the monitoring results. The Kappa coefficient of the monitoring results is 0.946, and the data processing time is less than 2 h, which meets the requirements of disaster emergency monitoring.

  7. An information hiding method based on LSB and tent chaotic map

    NASA Astrophysics Data System (ADS)

    Song, Jianhua; Ding, Qun

    2011-06-01

    In order to protect information security more effectively, a novel information hiding method based on LSB steganography and the tent chaotic map is proposed: first, the secret message is encrypted with the tent chaotic map, and then LSB steganography is used to embed the encrypted message in the cover image. Compared to traditional image information hiding methods, the simulation results indicate that the method greatly improves imperceptibility and security, and achieves good results.
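
    A minimal sketch under stated assumptions: the classic tent map x_{k+1} = mu * min(x_k, 1 - x_k), thresholded at 0.5 to produce a keystream, XOR encryption, and plain LSB replacement; the paper's exact map parameterization and embedding order may differ:

```python
import numpy as np

def tent_keystream(n_bits, x0=0.37, mu=1.99):
    """Generate n_bits pseudo-random bits from the tent map orbit."""
    x, bits = x0, []
    for _ in range(n_bits):
        x = mu * min(x, 1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return np.array(bits, dtype=np.uint8)

def hide(cover, message_bits, key=0.37):
    """Encrypt message bits with the tent keystream, then LSB-replace."""
    bits = np.asarray(message_bits, dtype=np.uint8)
    enc = bits ^ tent_keystream(len(bits), x0=key)   # chaotic encryption
    stego = cover.ravel().copy()
    stego[:len(enc)] = (stego[:len(enc)] & 0xFE) | enc  # overwrite LSBs
    return stego.reshape(cover.shape)
```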

  8. On Max-Plus Algebra and Its Application on Image Steganography

    PubMed Central

    Santoso, Kiswara Agung

    2018-01-01

    We propose a new steganography method to hide an image inside another image using matrix multiplication operations in max-plus algebra. This is especially interesting because the matrices used in conventional encoding or information disguise generally have inverses, whereas matrix multiplication in max-plus algebra does not. An advantage of this method is that the image that can be hidden in the cover image is larger than with the previous method. The proposed method has been tested on many secret images, and the results are satisfactory, showing a high level of strength and security; the method can be used on various operating systems. PMID:29887761
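
    A minimal sketch of the underlying max-plus product, the non-invertible operation that replaces ordinary matrix multiplication in this scheme (the full hiding and extraction protocol is omitted):

```python
import numpy as np

def maxplus_matmul(a, b):
    """(A ⊗ B)[i, j] = max_k (A[i, k] + B[k, j])."""
    n, m = a.shape
    m2, p = b.shape
    assert m == m2, "inner dimensions must agree"
    out = np.empty((n, p))
    for i in range(n):
        for j in range(p):
            out[i, j] = np.max(a[i, :] + b[:, j])
    return out

A = np.array([[1.0, 5.0], [3.0, 2.0]])
B = np.array([[0.0, 4.0], [2.0, 1.0]])
print(maxplus_matmul(A, B))  # entry (0,0) = max(1+0, 5+2) = 7
```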

  9. On Max-Plus Algebra and Its Application on Image Steganography.

    PubMed

    Santoso, Kiswara Agung; Fatmawati; Suprajitno, Herry

    2018-01-01

    We propose a new steganography method to hide an image inside another image using matrix multiplication operations in max-plus algebra. This is especially interesting because the matrices used in conventional encoding or information disguise generally have inverses, whereas matrix multiplication in max-plus algebra does not. An advantage of this method is that the image that can be hidden in the cover image is larger than with the previous method. The proposed method has been tested on many secret images, and the results are satisfactory, showing a high level of strength and security; the method can be used on various operating systems.

  10. A Selfish Constraint Satisfaction Genetic Algorithms for Planning a Long-Distance Transportation Network

    NASA Astrophysics Data System (ADS)

    Onoyama, Takashi; Maekawa, Takuya; Kubota, Sen; Tsuruta, Setuso; Komoda, Norihisa

    To build a cooperative logistics network covering multiple enterprises, a planning method that can build a long-distance transportation network is required. Many strict constraints are imposed on this type of problem. To solve these strictly constrained problems, a selfish constraint satisfaction genetic algorithm (GA) is proposed. In this GA, each gene of an individual satisfies only its own constraint selfishly, disregarding the constraints of the other genes in the same individual. Moreover, a constraint pre-checking method is applied to improve the convergence speed of the GA. The experimental results show that the proposed method can obtain an accurate solution in a practical response time.

  11. Land cover classification of Landsat 8 satellite data based on Fuzzy Logic approach

    NASA Astrophysics Data System (ADS)

    Taufik, Afirah; Sakinah Syed Ahmad, Sharifah

    2016-06-01

    The aim of this paper is to propose a method to classify the land cover of a satellite image based on a fuzzy rule-based system approach. The study uses bands of Landsat 8 and other indices, such as the Normalized Difference Water Index (NDWI), Normalized Difference Built-up Index (NDBI), and Normalized Difference Vegetation Index (NDVI), as inputs for the fuzzy inference system. The three selected indices represent our three main classes: water, built-up land, and vegetation. The combination of the original multispectral bands and the selected indices provides more information about the image. The selection of the fuzzy membership parameters is performed using a supervised method known as ANFIS (adaptive neuro-fuzzy inference system) training. The fuzzy system is tested on the classification of a land cover image covering the Klang Valley area. The results show that the fuzzy system approach is effective and can be explored and implemented for other areas of Landsat data.
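
    A minimal sketch of the index inputs, using the standard Landsat 8 OLI band assignments (green = B3, red = B4, NIR = B5, SWIR1 = B6); the ANFIS-trained memberships and the rule base are omitted:

```python
import numpy as np

def landsat8_indices(green, red, nir, swir1, eps=1e-9):
    """Compute NDVI, NDWI (McFeeters), and NDBI from Landsat 8 band arrays."""
    ndvi = (nir - red) / (nir + red + eps)       # vegetation
    ndwi = (green - nir) / (green + nir + eps)   # open water
    ndbi = (swir1 - nir) / (swir1 + nir + eps)   # built-up land
    return ndvi, ndwi, ndbi
```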

  12. Limited distortion in LSB steganography

    NASA Astrophysics Data System (ADS)

    Kim, Younhee; Duric, Zoran; Richards, Dana

    2006-02-01

    It is well known that all information hiding methods that modify the least significant bits introduce distortions into the cover objects. Those distortions have been utilized by steganalysis algorithms to detect that the objects have been modified. It has been proposed that only coefficients whose modification does not introduce large distortions should be used for embedding. In this paper we propose an efficient algorithm for information hiding in the LSBs of JPEG coefficients. Our algorithm uses parity coding to choose the coefficients whose modifications introduce minimal additional distortion. We derive the expected value of the additional distortion as a function of the message length and the probability distribution of the JPEG quantization errors of cover images. Our experiments show close agreement between the theoretical prediction and the actual additional distortion.
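
    A minimal sketch of the parity-coding idea, assuming groups of k coefficients whose LSB parity carries one message bit; the cost rule used to pick which coefficient to change is a simplistic stand-in for the paper's distortion-minimizing choice:

```python
import numpy as np

def embed_parity(coeffs, bits, k=3):
    """Embed one bit per group of k integer coefficients via LSB parity."""
    c = np.asarray(coeffs).copy()
    for g, bit in enumerate(bits):
        lo, hi = g * k, (g + 1) * k
        group = c[lo:hi]
        if (np.sum(np.abs(group) & 1) & 1) == bit:
            continue                  # group parity already carries the bit
        # Change the coefficient whose +/-1 modification is assumed cheapest;
        # here the largest-magnitude coefficient serves as a crude cost proxy.
        j = lo + int(np.argmax(np.abs(group)))
        c[j] += 1 if c[j] >= 0 else -1   # flips the LSB, flipping the parity
    return c
```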

  13. Flood Extent Delineation by Thresholding Sentinel-1 SAR Imagery Based on Ancillary Land Cover Information

    NASA Astrophysics Data System (ADS)

    Liang, J.; Liu, D.

    2017-12-01

    Emergency responses to floods require timely information on water extent, which can be produced by satellite-based remote sensing. As SAR images can be acquired under adverse illumination and weather conditions, they are particularly suitable for delineating water extent during a flood event. Thresholding SAR imagery is one of the most widely used approaches to delineate water extent. However, most studies apply only one threshold to separate water and dry land, without considering the complexity and variability of the different dry land surface types in an image. This paper proposes a new thresholding method for SAR imagery to delineate water from different land cover types. A probability distribution of SAR backscatter intensity is fitted for each land cover type, including water, before a flood event, and the intersection between two distributions is taken as the threshold to classify the two. To extract water, a set of thresholds is applied to several pairs of land cover types, such as water and urban or water and forest, and the resulting subsets are merged to form the water extent for the SAR image acquired during or after the flooding. Experiments show that this land cover based thresholding approach outperformed traditional single thresholding by about 5% to 15%. The method has great application potential given the broad acceptance of thresholding-based methods and the availability of land cover data, especially for heterogeneous regions.
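
    A minimal sketch of the per-pair threshold, assuming Gaussian fits to the backscatter intensity (in dB) of water and one dry land class; equating the two log-densities yields a quadratic whose root between the class means is the threshold (the example means and standard deviations are invented):

```python
import numpy as np

def gaussian_intersection(mu1, s1, mu2, s2):
    """Return the intersection point(s) of two Gaussian pdfs.

    Equating log N(x; mu1, s1) = log N(x; mu2, s2) gives the quadratic
    a*x**2 + b*x + c = 0 solved below.
    """
    a = 1.0 / (2 * s1**2) - 1.0 / (2 * s2**2)
    b = mu2 / s2**2 - mu1 / s1**2
    c = mu1**2 / (2 * s1**2) - mu2**2 / (2 * s2**2) + np.log(s1 / s2)
    if abs(a) < 1e-12:            # equal variances: single crossing point
        return np.array([-c / b])
    return np.roots([a, b, c])

# e.g. water (low backscatter) vs. forest, Sentinel-1 VV in dB:
roots = gaussian_intersection(-22.0, 1.5, -11.0, 2.5)
threshold = roots[(roots > -22.0) & (roots < -11.0)][0]  # about -17.7 dB
```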

  14. Enhanced Deforestation Mapping in North Korea using Spatial-temporal Image Fusion Method and Phenology-based Index

    NASA Astrophysics Data System (ADS)

    Jin, Y.; Lee, D.

    2017-12-01

    North Korea (the Democratic People's Republic of Korea, DPRK) is known to have some of the most degraded forest in the world. The forest landscape in North Korea is complex and heterogeneous; the major vegetation cover types in the forest are hillside farms, unstocked forest, natural forest, and plateau vegetation. Better classification of these types at high spatial resolution in deforested areas could provide essential information for decisions about forest management priorities and the restoration of deforested areas. For mapping heterogeneous vegetation covers, phenology-based indices help to overcome the confusion of reflectance values that occurs when using single-season images. Coarse spatial resolution images can be acquired with a high repetition rate, which is useful for analyzing phenological characteristics, but they may not capture the spatial detail of the land cover mosaic of the region of interest. Previous spatial-temporal fusion methods either captured only temporal change, or addressed both temporal and spatial change but with low accuracy in heterogeneous landscapes and small patches. In this study, a new spatial-temporal image fusion method focused on heterogeneous landscapes is proposed to produce images at both fine spatial and fine temporal resolution. We classified the pixels into three types according to the differences between the base image and the target image. In the first type, only the reflectance changes, due to phenology; these pixels supply reflectance, shape, and texture information. In the second type, both the reflectance and the spectrum change in some bands due to phenology, as for rice paddies or farmland; these pixels supply only shape and texture information. In the third type, the reflectance and spectrum change because the land cover type itself changes; these pixels provide no information, since how the land cover changed in the target image cannot be known. A different prediction method was applied to each type of pixel. Results show that both STARFM and FSDAF predicted the second type of pixels and small patches with low accuracy, whereas classification using the spatial-temporal image fusion method proposed in this study achieved an overall accuracy of 89.38%, with a corresponding kappa coefficient of 0.87.

  15. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method

    PubMed Central

    Zhang, Tingting; Kou, S. C.

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure. PMID:21258615
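
    A minimal sketch of the kernel intensity estimate at the heart of such methods, with a Gaussian kernel and a fixed bandwidth h; the article's own regression-based bandwidth selection is not reproduced here:

```python
import numpy as np

def kernel_intensity(arrival_times, grid, h):
    """Kernel estimate of a Cox process intensity:
    lambda_hat(t) = (1/h) * sum_i K((t - t_i) / h), with Gaussian K."""
    t = np.asarray(grid)[:, None]            # (G, 1) evaluation points
    ti = np.asarray(arrival_times)[None, :]  # (1, N) event times
    u = (t - ti) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return k.sum(axis=1) / h

# synthetic photon arrivals (seconds) and an evaluation grid:
arrivals = np.random.exponential(0.01, size=500).cumsum()
grid = np.linspace(0, arrivals[-1], 200)
lam = kernel_intensity(arrivals, grid, h=0.5)
```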

  16. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method.

    PubMed

    Zhang, Tingting; Kou, S C

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure.

  17. Maritime Search and Rescue via Multiple Coordinated UAS

    DTIC Science & Technology

    2017-06-12

    performed by a set of UAS. Our investigation covers the detection of multiple mobile objects by a heterogeneous collection of UAS. Three methods (two... account for contingencies such as airspace deconfliction. Results are produced using simulation to verify the capability of the proposed method and to... compare the various partitioning methods. Results from this simulation show that great gains in search efficiency can be made when the search space is

  18. Estimation of land-surface evaporation at four forest sites across Japan with the new nonlinear complementary method.

    PubMed

    Ai, Zhipin; Wang, Qinxue; Yang, Yonghui; Manevski, Kiril; Zhao, Xin; Eer, Deni

    2017-12-19

    Evaporation from land surfaces is a critical component of the Earth's water cycle and of water management strategies. The complementary method originally proposed by Bouchet, which describes a linear relation between actual evaporation (E), potential evaporation (Epo), and apparent potential evaporation (Epa) based on routinely measured weather data, is one of various methods for evaporation calculation. This study evaluated the reformulated version of the original method, as proposed by Brutsaert, for forest land cover in Japan. The new complementary method is nonlinear and based on boundary conditions with strictly physical considerations. The only unknown parameter (α_e) was determined, for the first time, for various forest covers located from north to south across Japan. The values of α_e ranged from 0.94 to 1.10, with a mean value of 1.01. Furthermore, the evaporation calculated with the new method showed a good fit with the eddy-covariance measured values, with a determination coefficient of 0.78 and a mean bias of 4%. The evaluation results revealed that the new nonlinear complementary relation performs better than the original linear relation in describing the relationship between E/Epa and Epo/Epa, and also in depicting the asymmetric variation between Epa/Epo and E/Epo.
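
    For reference, the nonlinear relation evaluated here is commonly written in the scaled form below (our rendering of Brutsaert's 2015 polynomial formulation; the record itself does not print the equation), with y = E/Epa and x = Epo/Epa:

$$ y = (2 - x)\,x^{2} = 2x^{2} - x^{3} $$

    This form satisfies the physical boundary conditions y = 0 with dy/dx = 0 under fully arid conditions (x -> 0), and y = 1 with dy/dx = 1 under fully wet conditions (x = 1); in this formulation, α_e is usually the Priestley-Taylor-type coefficient entering the definition of Epo.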

  19. Variances in solar collector performance predictions due to different methods of evaluating wind heat transfer coefficients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, J.W.; Charmchi, M.

    1980-11-01

    The performance of several solar collector configurations has been predicted using both inappropriate and appropriate relations to evaluate the wind-related heat transfer coefficient. The configurations analyzed are one or two covers with a selectively absorbing surface coating, and one or two covers with a nonselectively absorbing surface coating; all collectors are of the basic liquid-heating type. It is shown that the optimum results are obtained by using a global correlation equation proposed by Sparrow et al. (1979).

  20. a Framework of Change Detection Based on Combined Morphological Features and Multi-Index Classification

    NASA Astrophysics Data System (ADS)

    Li, S.; Zhang, S.; Yang, D.

    2017-09-01

    Remote sensing images are particularly well suited for the analysis of land cover change. In this paper, we present a new framework for the detection of changing land cover using satellite imagery. Morphological features and a multi-index approach are used to extract typical objects from the imagery, including vegetation, water, bare land, buildings, and roads. Our method, based on connected domains, differs from traditional methods: image segmentation is used to extract morphological features, the enhanced vegetation index (EVI) and the normalized difference water index (NDWI) are used to extract vegetation and water, and a fragmentation index is used to correct the water extraction results. HSV transformation and threshold segmentation are used to extract shadows and remove their effects on the extraction results. Change detection is then performed on these results. One advantage of the proposed framework is that semantic information is extracted automatically using low-level morphological features and indexes. Another advantage is that the proposed method detects specific types of change without any training samples. A test on ZY-3 images demonstrates that our framework has a promising capability to detect change.

  1. Soft material-based microculture system having air permeable cover sheet for the protoplast culture of Nicotiana tabacum.

    PubMed

    Ju, Jong Il; Ko, Jung-Moon; Kim, So Hyeon; Baek, Ju Yeoul; Cha, Hyeon-Cheol; Lee, Sang Hoon

    2006-08-01

    In plant cell culture, the delivery of nutrients and gas (mainly oxygen) to the cells is the most important factor for viability. In this paper, we propose a polydimethylsiloxane (PDMS)-based microculture system designed to have good aeration. PDMS is known to have excellent air permeability, and through experiments we investigated the relation between the degree of air delivery and the thickness of the PDMS sheet covering the culture chamber. We determined the proper thickness of the cover sheet, and cultured protoplasts of Nicotiana tabacum in a culture chamber covered with a PDMS sheet with a thickness of 400 µm. The cells divided successfully and lived well inside the culture chamber for 10 days. In addition, protoplasts were cultured inside culture chambers covered with a cover glass and with the PDMS sheet, respectively, and microcolonies formed well inside the PDMS-covered chamber after 10 days.

  2. Comparison of Methods and Interdisciplinary Possibilities. The Case of Literature Reviews in Social Work and in Nursing Sciences

    ERIC Educational Resources Information Center

    Couturier, Yves; Dumas-Laverdiere, Christian

    2006-01-01

    Reflections on interdisciplinarity cover several dimensions. One of them concerns the nature of what occurs between two disciplines. Does interdisciplinarity relate to an intention, to a metatheory, to the object, or to a method? It is this last space that we propose to study, supported by Resweber's (2000) proposition, putting the…

  3. A Novel Algorithm Combining Finite State Method and Genetic Algorithm for Solving Crude Oil Scheduling Problem

    PubMed Central

    Duan, Qian-Qian; Yang, Gen-Ke; Pan, Chang-Chun

    2014-01-01

    A hybrid optimization algorithm combining the finite state method (FSM) and a genetic algorithm (GA) is proposed to solve the crude oil scheduling problem. The FSM and GA are combined to take advantage of each method and to compensate for the deficiencies of the individual methods. In the proposed algorithm, the finite state method makes up for the weakness of the GA, namely its poor local search ability. The heuristic returned by the FSM can guide the GA toward good solutions; the idea behind this is that promising substructures or partial solutions can be generated using the FSM. Furthermore, the FSM can guarantee that the entire solution space is uniformly covered. Therefore, the combination of the two algorithms has better global performance than the existing GA or FSM operated individually. Finally, a real-life crude oil scheduling problem from the literature is used for the simulation. The experimental results validate that the proposed method outperforms the state-of-the-art GA method. PMID:24772031

  4. Soil erosion assessment on hillslope of GCE using RUSLE model

    NASA Astrophysics Data System (ADS)

    Islam, Md. Rabiul; Jaafar, Wan Zurina Wan; Hin, Lai Sai; Osman, Normaniza; Din, Moktar Aziz Mohd; Zuki, Fathiah Mohamed; Srivastava, Prashant; Islam, Tanvir; Adham, Md. Ibrahim

    2018-06-01

    A new method for obtaining the C factor (i.e., the vegetation cover and management factor) of the RUSLE model is proposed. The method focuses on deriving the C factor from the vegetation density to obtain a more reliable erosion prediction. Soil erosion on the hillslopes along the highway is one of the major problems in Malaysia, which is exposed to a relatively high amount of annual rainfall due to two different monsoon seasons. As vegetation cover is one of the important factors in the RUSLE model, a new method that accounts for vegetation density is proposed in this study. A hillslope near the Guthrie Corridor Expressway (GCE), Malaysia, is chosen as the experimental site, where eight square plots of size 8 × 8 m and 5 × 5 m are set up. The vegetation density on these plots is measured by analyzing the captured images, and the C factor is then linked to the measured vegetation density using several established formulas. Finally, the erosion prediction is computed with the RUSLE model on a Geographical Information System (GIS) platform. The results show that the C value obtained by the proposed method varies from 0.0162 to 0.125, which is lower than the C value from the Malaysian soil erosion guideline, i.e., 0.8. Meanwhile, the predicted erosion computed from the proposed C value is between 0.410 and 3.925 t ha^{-1} yr^{-1}, compared to the range of 9.367 to 34.496 t ha^{-1} yr^{-1} based on the C value of 0.8. It can be concluded that the proposed method of obtaining a reasonable C value is acceptable, as the computed predicted erosion is classified as a very low erosion zone, i.e., less than 10 t ha^{-1} yr^{-1}, whereas the predicted erosion based on the guideline classifies the study area as a low erosion zone, i.e., between 10 and 50 t ha^{-1} yr^{-1}.
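
    For context, the RUSLE estimates the average annual soil loss A (in t ha^{-1} yr^{-1}) as the product of six factors, of which C is the cover and management factor derived here:

$$ A = R \cdot K \cdot L \cdot S \cdot C \cdot P $$

    where R is the rainfall erosivity factor, K the soil erodibility factor, L and S the slope length and steepness factors, C the cover and management factor, and P the support practice factor.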

  5. Auto-calibration of GF-1 WFV images using flat terrain

    NASA Astrophysics Data System (ADS)

    Zhang, Guo; Xu, Kai; Huang, Wenchao

    2017-12-01

    Four wide field view (WFV) cameras with 16-m multispectral medium resolution and a combined swath of 800 km are onboard the Gaofen-1 (GF-1) satellite; they increase the revisit frequency to less than 4 days and enable large-scale land monitoring. The detection and elimination of WFV camera distortions is key for subsequent applications. Due to the wide swath of WFV images, geometric calibration using either conventional methods based on a ground control field (GCF) or GCF-independent methods is problematic. This is predominantly because the current GCFs in China fail to cover a whole WFV image, and most GCF-independent methods are designed for close-range photogrammetry or computer vision. This study proposes an auto-calibration method that uses flat terrain to detect nonlinear distortions in GF-1 WFV images. First, a classic geometric calibration model is built for the GF-1 WFV camera, and at least two images with an overlap area covering flat terrain are collected; then the elevation residuals between the real elevation and that calculated by forward intersection are used to solve for the nonlinear distortion parameters of the WFV images. Experiments demonstrate that the orientation accuracy of the proposed method, evaluated by GCF CPs, is within 0.6 pixel, and the residual errors manifest as random errors. Validation using Google Earth CPs further proves the effectiveness of the auto-calibration, and the whole scene is undistorted compared to not using the calibration parameters. The orientation accuracy of the proposed method and the GCF method is compared; the maximum difference is approximately 0.3 pixel, and the factors behind this discrepancy are analyzed. Generally, this method can effectively compensate for distortions in the GF-1 WFV camera.

  6. Residential roof condition assessment system using deep learning

    NASA Astrophysics Data System (ADS)

    Wang, Fan; Kerekes, John P.; Xu, Zhuoyi; Wang, Yandong

    2018-01-01

    The emergence of high resolution (HR) and ultra high resolution (UHR) airborne remote sensing imagery is enabling humans to move beyond traditional land cover analysis applications to the detailed characterization of surface objects. A residential roof condition assessment method using techniques from deep learning is presented. The proposed method operates on individual roofs and divides the task into two stages: (1) roof segmentation, followed by (2) condition classification of the segmented roof regions. As the first step in this process, a self-tuning method is proposed to segment the images into small homogeneous areas. The segmentation is initialized with simple linear iterative clustering, followed by deep-learned feature extraction and region merging, with the optimal result selected by an unsupervised index, Q. After the segmentation, a pretrained residual network is fine-tuned on the augmented roof segments using a proposed k-pixel extension technique for classification. The effectiveness of the proposed algorithm was demonstrated on both HR and UHR imagery collected by EagleView over different study sites. The proposed algorithm has yielded promising results and has outperformed traditional machine learning methods that use hand-crafted features.

  7. Feasibility of using LANDSAT images of vegetation cover to estimate effective hydraulic properties of soils

    NASA Technical Reports Server (NTRS)

    Eagleson, P. S.

    1985-01-01

    Research activities conducted from February 1, 1985 to July 31, 1985 and preliminary conclusions regarding the research objectives are summarized. The objective is to determine the feasibility of using LANDSAT data to estimate the effective hydraulic properties of soils. The general approach is to apply the climatic-climax hypothesis (Eagleson, 1982) to natural water-limited vegetation systems using canopy cover estimated from LANDSAT data. Natural water-limited systems typically consist of inhomogeneous vegetation canopies interspersed with bare soils. The ground resolution associated with one pixel of LANDSAT MSS (or TM) data is generally greater than the scale of the plant canopy or canopy clusters. Thus, a method for resolving percent canopy cover at a subpixel level must be established before the Eagleson hypothesis can be tested. Two formulations are proposed which extend existing methods of analyzing mixed pixels to naturally vegetated landscapes. The first method involves the use of the normalized vegetation index. The second approach is a physical model based on radiative transfer principles. Both methods are to be analyzed for their feasibility on selected sites.
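
    A minimal sketch of the first formulation, assuming the common two-endmember linear mixture in which a pixel's vegetation index is a cover-weighted average of bare soil and full canopy endmembers; the endmember values below are illustrative, not from the report:

```python
import numpy as np

def fractional_cover(ndvi, ndvi_soil=0.15, ndvi_veg=0.80):
    """Subpixel percent canopy cover from a two-endmember NDVI mixture:
    NDVI = fc * NDVI_veg + (1 - fc) * NDVI_soil  =>  solve for fc."""
    fc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return np.clip(fc, 0.0, 1.0)

pixel_ndvi = np.array([0.20, 0.45, 0.75])
print(fractional_cover(pixel_ndvi))  # -> approx [0.08, 0.46, 0.92]
```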

  8. Application of Deep Learning in GLOBELAND30-2010 Product Refinement

    NASA Astrophysics Data System (ADS)

    Liu, T.; Chen, X.

    2018-04-01

    GlobeLand30, one of the best global land cover (GLC) products at 30-m resolution, has been widely used in many research fields. Due to the significant spectral confusion among different land cover types and the limited textural information of Landsat data, the overall accuracy of GlobeLand30 is about 80%. Although such accuracy is much higher than that of most other global land cover products, it cannot satisfy various applications. There is still a great need for an effective method to improve the quality of GlobeLand30. The explosive growth of high-resolution satellite imagery and the remarkable performance of deep learning on image classification provide a new opportunity to refine GlobeLand30. However, the performance of deep learning depends on the quality and quantity of the training samples as well as the model training strategy. Therefore, this paper 1) proposes an automatic training sample generation method via Google Earth to build a large training sample set; and 2) explores the best training strategy for land cover classification using GoogleNet (Inception V3), one of the most widely used deep learning networks. The results show that fine-tuning from the first layer of Inception V3 using the rough large sample set is the best strategy. The retrained network was then applied to a selected area of Xi'an city as a case study of GlobeLand30 refinement. The experimental results indicate that the proposed approach, combining deep learning and Google Earth imagery, is a promising solution for further improving the accuracy of GlobeLand30.

  9. Remote sensing monitoring of land restoration interventions in semi-arid environments with a before-after control-impact statistical design

    NASA Astrophysics Data System (ADS)

    Meroni, Michele; Schucknecht, Anne; Fasbender, Dominique; Rembold, Felix; Fava, Francesco; Mauclaire, Margaux; Goffner, Deborah; Di Lucchio, Luisa M.; Leonardi, Ugo

    2017-07-01

    Restoration interventions to combat land degradation are carried out in arid and semi-arid areas to improve vegetation cover and land productivity. Evaluating the success of an intervention over time is challenging due to various constraints (e.g. difficult-to-access areas, lack of long-term records) and the lack of standardised and affordable methodologies. We propose a semi-automatic methodology that uses remote sensing data to provide a rapid, standardised and objective assessment of the biophysical impact, in terms of vegetation cover, of restoration interventions. The Normalised Difference Vegetation Index (NDVI) is used as a proxy for vegetation cover. Recognising that changes in vegetation cover are naturally due to environmental factors such as seasonality and inter-annual climate variability, conclusions about the success of the intervention cannot be drawn by focussing on the intervention area only. We therefore use a comparative method that analyses the temporal variations (before and after the intervention) of the NDVI of the intervention area with respect to multiple control sites that are automatically and randomly selected from a set of candidates that are similar to the intervention area. Similarity is defined in terms of class composition as derived from an ISODATA classification of the imagery before the intervention. The method provides an estimate of the magnitude and significance of the difference in greenness change between the intervention area and control areas. As a case study, the methodology is applied to 15 restoration interventions carried out in Senegal. The impact of the interventions is analysed using 250-m MODIS and 30-m Landsat data. Results show that a significant improvement in vegetation cover was detectable only in one third of the analysed interventions, which is consistent with independent qualitative assessments based on field observations and visual analysis of high resolution imagery. Rural development agencies may potentially use the proposed method for a first screening of restoration interventions.

  10. A modified temporal criterion to meta-optimize the extended Kalman filter for land cover classification of remotely sensed time series

    NASA Astrophysics Data System (ADS)

    Salmon, B. P.; Kleynhans, W.; Olivier, J. C.; van den Bergh, F.; Wessels, K. J.

    2018-05-01

    Humans are transforming land cover at an ever-increasing rate. Accurate geographical maps of land cover, especially of rural and urban settlements, are essential to planning sustainable development. Time series extracted from MODerate resolution Imaging Spectroradiometer (MODIS) land surface reflectance products have been used to differentiate land cover classes by analyzing the seasonal patterns in reflectance values. Properly fitting a parametric model to these time series usually requires several adjustments to the regression method; to reduce the workload, the regression parameters are usually set globally for a geographical area. In this work we modify a meta-optimization approach so that the regression method's parameters are set on a per-time-series basis. The standard deviation of the model parameters and the magnitude of the residuals are used as the scoring function. We successfully fitted a triply modulated model to the seasonal patterns of our study area using a non-linear extended Kalman filter (EKF). The approach uses temporal information, which significantly reduces the processing time and storage requirements for each time series, and it derives reliability metrics for each time series individually. The features extracted using the proposed method are classified with a support vector machine, and the performance of the method is compared to that of the original approach on our ground truth data.
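    For concreteness, the triply modulated model referred to above lets the mean, amplitude, and phase of an annual cosine vary, and the meta-optimization scores a candidate setting by parameter stability plus residual size. The sketch below states both under assumed shapes; it is illustrative, not the authors' code.

    ```python
    import numpy as np

    def triply_modulated(t, mu, alpha, phi, f=1.0 / 365.25):
        """Seasonal reflectance model: mean mu, amplitude alpha, phase phi."""
        return mu + alpha * np.cos(2.0 * np.pi * f * t + phi)

    def score(params_per_step, residuals):
        """Scoring function: std of the EKF-estimated parameters plus the
        mean residual magnitude. params_per_step: (n_steps, 3) array of
        (mu, alpha, phi) estimates."""
        return params_per_step.std(axis=0).sum() + np.abs(residuals).mean()
    ```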

  11. High dimensional land cover inference using remotely sensed modis data

    NASA Astrophysics Data System (ADS)

    Glanz, Hunter S.

    Image segmentation persists as a major statistical problem, with the volume and complexity of data expanding alongside new technologies. Land cover classification, one of the most studied problems in Remote Sensing, provides an important example of image segmentation whose needs transcend the choice of a particular classification method. That is, the challenges associated with land cover classification pervade the analysis process from data pre-processing to estimation of a final land cover map. Many of the same challenges also plague the task of land cover change detection. Multispectral, multitemporal data with inherent spatial relationships have hardly received adequate treatment due to the large size of the data and the presence of missing values. In this work we propose a novel, concerted application of methods which provide a unified way to estimate model parameters, impute missing data, reduce dimensionality, classify land cover, and detect land cover changes. This comprehensive analysis adopts a Bayesian approach which incorporates prior knowledge to improve the interpretability, efficiency, and versatility of land cover classification and change detection. We explore a parsimonious, parametric model that allows for a natural application of principal components analysis to isolate important spectral characteristics while preserving temporal information. Moreover, it allows us to impute missing data and estimate parameters via expectation-maximization (EM). A significant byproduct of our framework includes a suite of training data assessment tools. To classify land cover, we employ a spanning tree approximation to a lattice Potts prior to incorporate spatial relationships in a judicious way and more efficiently access the posterior distribution of pixel labels. We then achieve exact inference of the labels via the centroid estimator. To detect land cover changes, we develop a new EM algorithm based on the same parametric model. We perform simulation studies to validate our models and methods, and conduct an extensive continental scale case study using MODIS data. The results show that we successfully classify land cover and recover the spatial patterns present in large scale data. Application of our change point method to an area in the Amazon successfully identifies the progression of deforestation through portions of the region.

  12. K-Band Substrate Integrated Waveguide (SIW) Coupler

    NASA Astrophysics Data System (ADS)

    Khalid, N.; Ibrahim, S. Z.; Hoon, W. F.

    2018-03-01

    This paper presents a coupler designed on a Rogers RO4003 substrate. The four-port network coupler operates over 18-26 GHz and is designed using the substrate integrated waveguide (SIW) method. SIW structures are high-performance broadband interconnects with excellent immunity to electromagnetic interference, suitable for microwave and millimetre-wave electronics as well as wideband systems. The coupler designs are investigated using the CST Microwave Studio simulation tool. The proposed couplers cover the target frequency range and provide good scattering-parameter (S-parameter) performance, making this technology well suited to millimetre-wave and microwave applications. Designs and results are presented and discussed in this paper. Overall, the simulated response of the proposed coupler covers 18 to 26 GHz, a percentage bandwidth of 36.36%.
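    The quoted figure follows from the standard fractional-bandwidth formula evaluated at the arithmetic band centre:

    \[
    \mathrm{BW}\,\% \;=\; \frac{f_h - f_l}{f_c}\times 100 \;=\; \frac{26 - 18}{(26 + 18)/2}\times 100 \;\approx\; 36.36\%.
    \]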

  13. Compulsory Birth Control and Fertility Measures in India.

    ERIC Educational Resources Information Center

    Halli, S. S.

    1983-01-01

    Discussion of possible applications of the microsimulation approach to the analysis of population policy proposes a compulsory sterilization policy for all of India. Topics covered include India's population problem, methods for generating a distribution of couples to be sterilized, model validation, data utilized, data analysis, program limitations,…

  14. Text Summarization Model based on Facility Location Problem

    NASA Astrophysics Data System (ADS)

    Takamura, Hiroya; Okumura, Manabu

    We propose a novel multi-document generic summarization model based on the budgeted median problem, which is a facility location problem. The summarization method based on our model is extractive: it selects sentences from the given document cluster to form a summary. Each sentence in the document cluster is assigned to one of the selected sentences, with the former sentence represented by the latter. Our method selects sentences that yield a good sentence assignment and hence a summary that covers the whole content of the document cluster. An advantage of this method is that it can incorporate asymmetric relations between sentences, such as textual entailment. Through experiments, we show that the proposed method yields good summaries on the DUC'04 dataset.
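    A budgeted-median-style selection can be sketched as a greedy loop: repeatedly add the sentence that most improves how well every sentence is represented by its best selected representative, subject to a length budget. The similarity matrix and greedy rule below are illustrative assumptions; the paper solves the underlying facility location problem, not necessarily by this greedy method.

    ```python
    import numpy as np

    def summarize(sim, lengths, budget):
        """sim: (n, n) sentence-pair similarities; lengths: words per sentence."""
        n = sim.shape[0]
        selected, used = [], 0

        def coverage(S):
            # each sentence is represented by its most similar selected sentence
            return sim[:, S].max(axis=1).sum() if S else 0.0

        while True:
            base = coverage(selected)
            gains = {j: coverage(selected + [j]) - base
                     for j in range(n)
                     if j not in selected and used + lengths[j] <= budget}
            if not gains or max(gains.values()) <= 0:
                return selected
            j = max(gains, key=gains.get)
            selected.append(j)
            used += lengths[j]
    ```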

  15. A Particle Batch Smoother Approach to Snow Water Equivalent Estimation

    NASA Technical Reports Server (NTRS)

    Margulis, Steven A.; Girotto, Manuela; Cortes, Gonzalo; Durand, Michael

    2015-01-01

    This paper presents a newly proposed data assimilation method for historical snow water equivalent (SWE) estimation using remotely sensed fractional snow-covered area (fSCA). The newly proposed approach consists of a particle batch smoother (PBS), which is compared to a previously applied Kalman-based ensemble batch smoother (EnBS) approach. The methods were applied over the 27-yr Landsat 5 record at snow pillow and snow course in situ verification sites in the American River basin in the Sierra Nevada (United States). This basin is more densely vegetated, and thus more challenging for SWE estimation, than previous applications of the EnBS. Both data assimilation methods provided significant improvement over the prior (modeling only) estimates, with both able to significantly reduce prior SWE biases. The prior RMSE values at the snow pillow and snow course sites were reduced by 68%-82% and 60%-68%, respectively, when applying the data assimilation methods. This result is encouraging for a basin like the American, where the moderate to high forest cover necessarily obscures more of the snow-covered ground surface than in previously examined, less-vegetated basins. The PBS generally outperformed the EnBS: for snow pillows the PBS RMSE was approximately 54% of that seen in the EnBS, while for snow courses the PBS RMSE was approximately 79% of the EnBS. Sensitivity tests show relative insensitivity of both the PBS and EnBS results to ensemble size and fSCA measurement error, but a higher sensitivity of the EnBS to the mean prior precipitation input, especially where significant prior biases exist.
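    The PBS update itself is compact: each prior ensemble member is reweighted by the likelihood of the whole batch of fSCA observations, and the posterior SWE is the weighted ensemble. The Gaussian error model and array shapes below are assumptions for illustration.

    ```python
    import numpy as np

    def pbs_weights(pred_fsca, obs_fsca, obs_sigma):
        """pred_fsca: (n_particles, n_obs) modeled fSCA; obs_fsca: (n_obs,)."""
        resid = pred_fsca - obs_fsca
        loglik = -0.5 * np.sum((resid / obs_sigma) ** 2, axis=1)
        w = np.exp(loglik - loglik.max())      # subtract max for stability
        return w / w.sum()

    def posterior_swe(swe_particles, weights):
        """Weighted posterior mean of (n_particles, n_times) SWE trajectories."""
        return np.sum(swe_particles * weights[:, None], axis=0)
    ```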

  16. A New Automatic Method of Urban Areas Mapping in East Asia from LANDSAT Data

    NASA Astrophysics Data System (ADS)

    XU, R.; Jia, G.

    2012-12-01

    Cities, as places where human activities are concentrated, account for a small percentage of global land cover but are frequently cited as the chief causes of, and solutions to, climate, biogeochemistry, and hydrology processes at local, regional, and global scales. Accompanying uncontrolled economic growth, urban sprawl has been attributed to the accelerating integration of East Asia into the world economy and has involved dramatic changes in urban form and land use. To understand the impact of urban extent on biogeophysical processes, reliable mapping of built-up areas is particularly essential in East Asian cities, which are characterized by smaller patches, greater fragmentation, and a lower fraction of the urban landscape without natural cover than in the West. Segmentation of urban land from other land-cover types using remote sensing imagery can be done by standard classification processes as well as by a logic rule calculated from spectral indices and their derivations. Efforts to establish such a logic rule that requires no threshold for automatic mapping are highly worthwhile. Existing automatic methods are reviewed, and then a proposed approach is introduced, including the calculation of a new index and an improved logic rule. Existing automatic methods and the proposed approach are then compared in a common context. Afterwards, the proposed approach is tested separately in large, medium, and small cities in East Asia selected from different LANDSAT images. The results are promising, as the approach can efficiently segment urban areas, even in the more complex Eastern cities. Key words: Urban extraction; Automatic Method; Logic Rule; LANDSAT images; East Asia. (Figure: the proposed approach applied to extraction of urban built-up areas in Guangzhou, China.)
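    As a hedged illustration of a threshold-free logic rule of this kind (the paper's actual index and rule are not reproduced here), a built-up mask can be obtained by comparing two spectral indices directly instead of thresholding either one:

    ```python
    import numpy as np

    def built_up_mask(nir, swir, red):
        """Illustrative threshold-free rule; NDBI/NDVI choice is an assumption."""
        ndbi = (swir - nir) / (swir + nir + 1e-9)   # built-up index
        ndvi = (nir - red) / (nir + red + 1e-9)     # vegetation index
        return ndbi > ndvi   # built-up where the built-up signal dominates
    ```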

  17. Land use and land cover classification for rural residential areas in China using soft-probability cascading of multifeatures

    NASA Astrophysics Data System (ADS)

    Zhang, Bin; Liu, Yueyan; Zhang, Zuyu; Shen, Yonglin

    2017-10-01

    A multifeature soft-probability cascading scheme is proposed to solve the problem of land use and land cover (LULC) classification with high-spatial-resolution images for mapping rural residential areas in China. The proposed method builds midlevel LULC features. Local features are frequently used as the low-level descriptors in midlevel feature learning, whereas spectral and textural features, which are very effective low-level features, are neglected. Moreover, the dictionary for sparse coding is acquired in an unsupervised way, which reduces the discriminative power of the midlevel features. We therefore propose to learn supervised features based on sparse coding, a support vector machine (SVM) classifier, and a conditional random field (CRF) model, so as to exploit the different effective low-level features and improve the discriminability of the midlevel feature descriptors. First, three kinds of typical low-level features, namely dense scale-invariant feature transform, gray-level co-occurrence matrix, and spectral features, are extracted separately. Second, combined with sparse coding and the SVM classifier, the probabilities of the different LULC classes are inferred to build supervised feature descriptors. Finally, the CRF model, consisting of a unary potential and a pairwise potential, is employed to construct the LULC classification map. Experimental results show that the proposed classification scheme achieves impressive performance, with a total accuracy of about 87%.

  18. Interior micro-CT with an offset detector

    PubMed Central

    Sharma, Kriti Sen; Gong, Hao; Ghasemalizadeh, Omid; Yu, Hengyong; Wang, Ge; Cao, Guohua

    2014-01-01

    Purpose: The size of the field-of-view (FOV) of a microcomputed tomography (micro-CT) system can be increased by offsetting the detector. The increased FOV is beneficial in many applications. All prior investigations, however, have focused on the case in which the increased FOV after offset-detector acquisition fully covers the transaxial extent of the object. Here, the authors studied a new problem where the FOV of a micro-CT system, although increased after offset-detector acquisition, still covers only an interior region-of-interest (ROI) within the object. Methods: An interior-ROI-oriented micro-CT scan with an offset detector poses a difficult reconstruction problem, caused by both detector offset and projection truncation. Using projection completion techniques, the authors first extended three previous reconstruction methods from offset-detector micro-CT to offset-detector interior micro-CT. The authors then proposed a novel method which combines two of the extended methods using a frequency split technique. The authors tested the four methods with phantom simulations at 9.4%, 18.8%, 28.2%, and 37.6% detector offset. The authors also applied these methods to physical phantom datasets acquired at the same amounts of detector offset from a customized micro-CT system. Results: When the detector offset was small, all reconstruction methods showed good image quality. At large detector offset, the three extended methods gave either visible shading artifacts or high deviation of pixel values, while the authors’ proposed method demonstrated no visible artifacts and minimal deviation of pixel values in both the numerical simulations and physical experiments. Conclusions: For interior micro-CT with an offset detector, the three extended reconstruction methods perform well at a small detector offset but show strong artifacts at a large detector offset. When the detector offset is large, the authors’ proposed reconstruction method outperforms the three extended methods by suppressing artifacts and maintaining pixel values. PMID:24877826
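    The frequency split idea can be sketched as combining the low-frequency band of one reconstruction with the high-frequency band of another; using a Gaussian filter as the band splitter is an assumption for illustration, not the authors' exact filter design.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def frequency_split_combine(recon_a, recon_b, sigma=3.0):
        """Low frequencies from recon_a, high frequencies from recon_b."""
        low = gaussian_filter(recon_a, sigma)
        high = recon_b - gaussian_filter(recon_b, sigma)
        return low + high
    ```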

  19. Doing Developmental Research: A Practical Guide

    ERIC Educational Resources Information Center

    Striano, Tricia

    2016-01-01

    Addressing practical issues rarely covered in methods texts, this user-friendly, jargon-free book helps students and beginning researchers plan infant and child development studies and get them done. The author provides step-by-step guidance for getting involved in a developmental laboratory and crafting effective research questions and proposals.…

  20. Beamspace dual signal space projection (bDSSP): a method for selective detection of deep sources in MEG measurements.

    PubMed

    Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K; Cai, Chang; Nagarajan, Srikantan S

    2018-06-01

    Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.

  1. Beamspace dual signal space projection (bDSSP): a method for selective detection of deep sources in MEG measurements

    NASA Astrophysics Data System (ADS)

    Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K.; Cai, Chang; Nagarajan, Srikantan S.

    2018-06-01

    Objective. Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. Approach. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Main results. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. Significance. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.
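    The final suppression step described above can be written in one line of linear algebra: project the row space of the data matrix onto the orthogonal complement of the superficial-source signal subspace. Array shapes are assumptions for illustration.

    ```python
    import numpy as np

    def suppress_superficial(B, Us):
        """B: (n_channels, n_times) data matrix; Us: (n_times, k) orthonormal
        basis of the estimated time-domain signal subspace of the superficial
        sources."""
        P = np.eye(Us.shape[0]) - Us @ Us.T   # projector onto the complement
        return B @ P                          # deep-source-dominated data
    ```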

  2. Strategy to increase Barangan Banana production in Kabupaten Deli Serdang

    NASA Astrophysics Data System (ADS)

    Adhany, I.; Chalil, D.; Ginting, R.

    2018-02-01

    This study was conducted to analyze the internal and external factors in increasing Barangan Banana production in Kabupaten Deli Serdang. Samples were determined by the snowball sampling technique and purposive sampling method. Using the SWOT analysis method, this study found 6 internal strategic factors and 9 external strategic factors. Among these, support for production facilities is the most important internal strategic factor, while demand for Barangan Banana is the most important external strategic factor. Based on the importance and existing condition of these strategic factors, four groups of strategies are discussed and proposed to increase Barangan Banana productivity in Kabupaten Deli Serdang: using support for production facilities and realizing supporting facilities with farming experience (strength-opportunity, SO); organizing mentoring to meet the demand for Barangan Banana (weakness-opportunity, WO); making use of funding support and subsidies to widen the land, using tissue culture seeds, and using facilities and infrastructure (strength-threat, ST); and increasing funding support to widen the land, with tissue culture seeds and facilities and infrastructure (weakness-threat, WT).

  3. Fusion of sensor geometry into additive strain fields measured with sensing skin

    NASA Astrophysics Data System (ADS)

    Downey, Austin; Sadoughi, Mohammadkazem; Laflamme, Simon; Hu, Chao

    2018-07-01

    Recently, numerous studies have been conducted on flexible skin-like membranes for the cost-effective monitoring of large-scale structures. The authors have proposed a large-area electronics consisting of a soft elastomeric capacitor (SEC) that transduces a structure’s strain into a measurable change in capacitance. Arranged in a network configuration, SECs deployed onto the surface of a structure can be used to reconstruct strain maps. Several regression methods have recently been developed for reconstructing such maps, but all of these algorithms assume that each SEC-measured strain is located at the sensor's geometric center. This assumption may not be realistic, since an SEC measures the average strain over the whole area covered by the sensor. One solution is to reduce the size of each SEC, but this would increase the number of sensors needed to cover a large-scale structure, and therefore the required power and data acquisition capabilities. Instead, this study proposes an algorithm that accounts for the sensor’s strain-averaging behavior by adjusting the strain measurements and constructing a full-field strain map using the kriging interpolation method. The proposed algorithm fuses the geometry of the SEC sensor into the strain map reconstruction, adaptively adjusting the kriging-estimated average strain over the area monitored by each sensor to match its signal. Results show that by considering the sensor geometry, in addition to the sensor signal and location, the proposed strain map adjustment algorithm produces more accurate full-field strain maps than the traditional spatial interpolation method that considers only signal and location.
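    A minimal sketch of the adjustment idea, with Gaussian-process regression standing in for kriging: nudge each sensor's point value until the interpolated strain averaged over that sensor's footprint matches the measured (area-averaged) strain. Function names, footprint sampling, and the kernel are illustrative assumptions, not the paper's algorithm.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def adjusted_strain_map(centers, measured, footprints, grid, iters=10):
        """centers: (n, 2) sensor centers; measured: (n,) area-averaged strain;
        footprints: list of (m, 2) sample points covering each sensor's area;
        grid: (g, 2) locations at which to reconstruct the strain map."""
        values = measured.astype(float).copy()
        for _ in range(iters):
            gp = GaussianProcessRegressor(kernel=RBF(0.1)).fit(centers, values)
            for i, fp in enumerate(footprints):
                avg = gp.predict(fp).mean()        # interpolated footprint mean
                values[i] += measured[i] - avg     # match the measured average
        gp = GaussianProcessRegressor(kernel=RBF(0.1)).fit(centers, values)
        return gp.predict(grid)
    ```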

  4. A secure steganography for privacy protection in healthcare system.

    PubMed

    Liu, Jing; Tang, Guangming; Sun, Yifeng

    2013-04-01

    Private data in healthcare systems require confidentiality protection during transmission. Steganography is the art of concealing data in a cover medium to convey messages confidentially. In this paper, we propose a steganographic method which provides private data in medical systems with very secure protection. In our method, a cover image is first mapped into a 1D pixel sequence by a Hilbert filling curve and then divided into non-overlapping embedding units of three consecutive pixels. We use the adaptive pixel pair matching (APPM) method to embed digits in the pixel value differences (PVD) of the three pixels, with the base of the embedded digits dependent on the differences among the three pixels. By solving an optimization problem, the minimal distortion of the pixel ternaries caused by data embedding can be obtained. The experimental results show our method is more suitable for privacy protection in healthcare systems than prior steganographic work.

  5. A new JPEG-based steganographic algorithm for mobile devices

    NASA Astrophysics Data System (ADS)

    Agaian, Sos S.; Cherukuri, Ravindranath C.; Schneider, Erik C.; White, Gregory B.

    2006-05-01

    Currently, cellular phones constitute a significant portion of the global telecommunications market. Modern cellular phones offer sophisticated features such as Internet access, on-board cameras, and expandable memory, which provide these devices with excellent multimedia capabilities. Because of the high volume of cellular traffic, as well as the ability of these devices to transmit nearly all forms of data, the need for an increased level of security in wireless communications is a growing concern. Steganography could provide a solution to this important problem. In this article, we present a new algorithm for JPEG-compressed images which is applicable to mobile platforms. This algorithm embeds sensitive information into quantized discrete cosine transform coefficients obtained from the cover JPEG. These coefficients are rearranged based on certain statistical properties and the inherent processing and memory constraints of mobile devices. Based on the energy variation and block characteristics of the cover image, the sensitive data is hidden using a switching embedding technique proposed in this article. The proposed system offers high capacity while withstanding visual and statistical attacks. Based on simulation results, the proposed method demonstrates improved retention of first-order statistics compared to existing JPEG-based steganographic algorithms, while maintaining a capacity comparable to F5 for certain cover images.

  6. Crack image segmentation based on improved DBC method

    NASA Astrophysics Data System (ADS)

    Cao, Ting; Yang, Nan; Wang, Fengping; Gao, Ting; Wang, Weixing

    2017-11-01

    With the development of computer vision technology, crack detection based on digital image segmentation has attracted global attention among researchers and transportation ministries. Since cracks exhibit random shapes and complex texture, reliable crack detection remains a challenge. Therefore, a novel crack image segmentation method based on fractal DBC (differential box counting) is introduced in this paper. The proposed method estimates a fractal feature for every pixel from neighborhood information, considering the contributions from all possible directions in the related block. The block moves one pixel at a time so that it covers all pixels in the crack image. Unlike the classic DBC method, which only describes a fractal feature for the related region, this method achieves crack image segmentation according to the fractal feature of each pixel. Experiments show the proposed method achieves satisfactory results in crack detection.
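    The classic DBC computation that the per-pixel variant slides across the image can be sketched as follows (standard Sarkar-Chaudhuri-style box counting; grid size and grey-level scaling are the usual choices, assumed here):

    ```python
    import numpy as np

    def dbc_box_count(block, s):
        """Boxes of side s needed to cover the intensity surface of an
        (M, M) block with grey levels 0..255."""
        M = block.shape[0]
        h = 256.0 * s / M                    # box height in grey levels
        n = 0
        for i in range(0, M, s):
            for j in range(0, M, s):
                cell = block[i:i + s, j:j + s]
                n += int(cell.max() // h) - int(cell.min() // h) + 1
        return n

    # The fractal dimension is the slope of log N(s) versus log(1/s) over
    # several box sizes s; the paper evaluates this per pixel by moving
    # the block one pixel at a time.
    ```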

  7. Land cover's refined classification based on multi source of remote sensing information fusion: a case study of national geographic conditions census in China

    NASA Astrophysics Data System (ADS)

    Cheng, Tao; Zhang, Jialong; Zheng, Xinyan; Yuan, Rujin

    2018-03-01

    The project of The First National Geographic Conditions Census, developed by the Chinese government, defined the data acquisition content and indexes and built a corresponding classification system based mainly on the natural properties of materials. However, a unified standard for the land cover classification system has not been formed, and the product always needs converting to meet actual needs. Therefore, this paper proposes a refined classification method based on the fusion of multi-source remote sensing information. Taking the third-level classes of forest land and grassland as examples, it collects thematic data from the Vegetation Map of China (1:1,000,000) and develops the refined classification using a raster spatial analysis model. A study area is selected, and the refined classification is performed with the proposed method. The results show that land cover within the study area is divided principally among 20 classes, from subtropical broad-leaved forest (31131) to the grass-forb community type of low-coverage grassland (41192). Moreover, after 30 years, the climatic factors, developmental rhythm characteristics and vegetation ecological-geographical characteristics of the study area have not changed fundamentally; only some of the original vegetation types have changed in spatial distribution range or land cover type. The research shows that refined classification of the third-level classes of forest land and grassland can make the results reflect both the natural attributes of the original classes and plant community ecology characteristics, meeting the needs of some industry applications and having practical significance for promoting the product of The First National Geographic Conditions Census.

  8. [Application of optical flow dynamic texture in land use/cover change detection].

    PubMed

    Yan, Li; Gong, Yi-Long; Zhang, Yi; Duan, Wei

    2014-11-01

    In the present study, a novel change detection approach for high resolution remote sensing images is proposed based on the optical flow dynamic texture (OFDT), which derives land use and land cover change information automatically with a dynamic description of ground-object changes. The paper describes the gradual change process of ground objects from first principles using optical flow theory, which breaks the sudden-change assumption made by past remote sensing change detection methods. As the steps of this method are simple, it can be integrated into systems and software, such as land resource management and urban planning software, that need to detect ground-object changes. The method takes into account temporal-dimension features between remote sensing images, which provide a richer set of information for change detection and improve on the many methods that depend mainly on spatial-dimension information. In this article, the optical flow dynamic texture is the basic representation of change, and it is used, combined with spectral information, in support vector machine post-classification change detection on high resolution remote sensing images. The temporal-dimension texture considered here involves a smaller amount of data than most spatial-dimension textures, and the highly automated texture computation has only one parameter to set, relaxing the present reliance on onerous manual evaluation. The effectiveness of the proposed approach is evaluated with 2011 and 2012 QuickBird datasets covering Duerbert Mongolian Autonomous County of Daqing City, China. The effects of different optical flow smoothness coefficients and their impact on the description of ground-object changes are then analyzed in depth. The experimental result is satisfactory, with an 87.29% overall accuracy and a 0.8507 kappa index, and the method achieves better performance than post-classification change detection using spectral information only.
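    As a hedged illustration of turning two co-registered acquisitions into a temporal texture band, the dense optical flow magnitude can be computed per pixel; Farneback flow is used below as an accessible stand-in for the paper's optical flow computation.

    ```python
    import cv2
    import numpy as np

    def flow_magnitude(img_t1, img_t2):
        """img_t1, img_t2: co-registered 8-bit grayscale images of two dates."""
        flow = cv2.calcOpticalFlowFarneback(
            img_t1, img_t2, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        return np.linalg.norm(flow, axis=2)   # per-pixel motion magnitude

    # The magnitude band can be stacked with the spectral bands as an extra
    # feature for SVM post-classification change detection.
    ```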

  9. Development of a Mobile Robot with Wavy Movement by Rotating Bars

    NASA Astrophysics Data System (ADS)

    Kitagawa, Ato; Zhang, Liang; Eguchi, Takashi; Tsukagoshi, Hideyuki

    A mobile robot with a new type of movement, called wavy movement, is proposed in this paper. Wavy movement can be readily realized by many bars or crosses rotating at the same speed, and the robot, with its simple structure and easy control method, is able to ascend and descend stairs by covering the corners of the stairs within separate wave shapes between touching points. The principle of wavy movement, the mechanism, and experimental results for the proposed robot are discussed.

  10. The mechanics of gyroscope ball bearings

    NASA Astrophysics Data System (ADS)

    Zhuravlev, V. F.; Balmont, V. B.

    Various aspects of the mechanics of gyroscopes are examined, with emphasis on the elastic properties of the radial thrust ball bearings of the main axle and of the radial ball bearings of the gimbal suspension, covers, and flanges. Particular attention is given to the statics, kinematics, and dynamics of imperfect bearings. A stiffness model convenient for engineering calculations is developed. A gyroscope vibration theory is proposed, and methods for reducing and preventing vibration are analyzed. The validity of the models proposed here is supported by experimental data.

  11. Multi-Pixel Simultaneous Classification of PolSAR Image Using Convolutional Neural Networks

    PubMed Central

    Xu, Xin; Gui, Rong; Pu, Fangling

    2018-01-01

    Convolutional neural networks (CNN) have achieved great success in the optical image processing field. Because of the excellent performance of CNN, more and more CNN-based methods are applied to polarimetric synthetic aperture radar (PolSAR) image classification. Most CNN-based PolSAR image classification methods can only classify one pixel at a time. Because all the pixels of a PolSAR image are classified independently, the inherent interrelation of different land covers is ignored. We use a fixed-feature-size CNN (FFS-CNN) to classify all pixels in a patch simultaneously. The proposed method has several advantages. First, FFS-CNN can classify all the pixels in a small patch simultaneously, so when classifying a whole PolSAR image it is faster than common CNNs. Second, FFS-CNN is trained to learn the interrelation of different land covers in a patch, so it can use this interrelation to improve the classification results. FFS-CNN is evaluated on a Chinese Gaofen-3 PolSAR image and two other real PolSAR images. Experimental results show that FFS-CNN is comparable with the state-of-the-art PolSAR image classification methods. PMID:29510499

  12. Multi-Pixel Simultaneous Classification of PolSAR Image Using Convolutional Neural Networks.

    PubMed

    Wang, Lei; Xu, Xin; Dong, Hao; Gui, Rong; Pu, Fangling

    2018-03-03

    Convolutional neural networks (CNN) have achieved great success in the optical image processing field. Because of the excellent performance of CNN, more and more CNN-based methods are applied to polarimetric synthetic aperture radar (PolSAR) image classification. Most CNN-based PolSAR image classification methods can only classify one pixel at a time. Because all the pixels of a PolSAR image are classified independently, the inherent interrelation of different land covers is ignored. We use a fixed-feature-size CNN (FFS-CNN) to classify all pixels in a patch simultaneously. The proposed method has several advantages. First, FFS-CNN can classify all the pixels in a small patch simultaneously, so when classifying a whole PolSAR image it is faster than common CNNs. Second, FFS-CNN is trained to learn the interrelation of different land covers in a patch, so it can use this interrelation to improve the classification results. FFS-CNN is evaluated on a Chinese Gaofen-3 PolSAR image and two other real PolSAR images. Experimental results show that FFS-CNN is comparable with the state-of-the-art PolSAR image classification methods.
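    The fixed-feature-size idea can be sketched with a tiny fully convolutional network whose single forward pass emits a label for every pixel of a patch; the architecture, patch size, and channel count below are illustrative assumptions, not the paper's network.

    ```python
    import tensorflow as tf

    PATCH, CHANNELS, CLASSES = 15, 9, 5   # hypothetical sizes

    inputs = tf.keras.Input(shape=(PATCH, PATCH, CHANNELS))   # PolSAR features
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    outputs = tf.keras.layers.Conv2D(CLASSES, 1, activation="softmax")(x)

    # One forward pass yields a (PATCH, PATCH, CLASSES) probability map,
    # classifying all pixels of the patch simultaneously.
    model = tf.keras.Model(inputs, outputs)
    ```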

  13. 3D photo mosaicing of Tagiri shallow vent field by an autonomous underwater vehicle (3rd report) - Mosaicing method based on navigation data and visual features -

    NASA Astrophysics Data System (ADS)

    Maki, Toshihiro; Ura, Tamaki; Singh, Hanumant; Sakamaki, Takashi

    Large-area seafloor imaging will bring significant benefits to various fields such as academic research, resource survey, marine development, security, and search-and-rescue. The authors have proposed a navigation method for an autonomous underwater vehicle for seafloor imaging, and verified its performance by mapping tubeworm colonies over an area of 3,000 square meters using the AUV Tri-Dog 1 at the Tagiri vent field, Kagoshima Bay, Japan (Maki et al., 2008, 2009). This paper proposes a post-processing method to build a natural photo mosaic from a number of pictures taken by an underwater platform. The method first removes lens distortion and non-uniformities of color and lighting from each image, and then performs ortho-rectification based on the camera pose and seafloor geometry estimated from navigation data. The image alignment is based on both navigation data and visual characteristics, implemented as an extension of the image-based method (Pizarro et al., 2003). Using the two types of information realizes an image alignment that is consistent both globally and locally, and makes the method applicable to data sets with few visual cues. The method was evaluated using a data set obtained by the AUV Tri-Dog 1 at the vent field in September 2009. A seamless, uniformly illuminated photo mosaic covering an area of around 500 square meters was created from 391 pictures, capturing unique features of the field such as bacteria mats and tubeworm colonies.

  14. Mapping Urban Environmental Noise Using Smartphones.

    PubMed

    Zuo, Jinbo; Xia, Hao; Liu, Shuo; Qiao, Yanyou

    2016-10-13

    Noise mapping is an effective method of visualizing and assessing noise pollution. In this paper, a noise-mapping method based on smartphones to effectively and easily measure environmental noise is proposed. By using this method, a noise map of an entire area can be created from limited measurement data. To achieve measurements with adequate precision, a set of methods was designed to calibrate the smartphones. Measuring noise with mobile phones differs from traditional static observations, as the users may be moving at any time; a method of attaching an additional microphone with a windscreen is therefore proposed to reduce the wind effect. Since covering an entire area with measurements is impossible, an interpolation method is needed to achieve full coverage. To reduce the influence of spatial heterogeneity and improve the precision of noise mapping, a region-based noise-mapping method is proposed in this paper: it is based on the distribution of noise in different region types tagged by volunteers, which are interpolated and combined to create a noise map. To validate the method, the interpolation results were compared against those of the ordinary kriging method. The results show that our method reflects the local distribution of noise more accurately and has better interpolation precision. We believe that the proposed noise-mapping method is a feasible and low-cost noise-mapping solution.

  15. Mapping Urban Environmental Noise Using Smartphones

    PubMed Central

    Zuo, Jinbo; Xia, Hao; Liu, Shuo; Qiao, Yanyou

    2016-01-01

    Noise mapping is an effective method of visualizing and assessing noise pollution. In this paper, a noise-mapping method based on smartphones to effectively and easily measure environmental noise is proposed. By using this method, a noise map of an entire area can be created from limited measurement data. To achieve measurements with adequate precision, a set of methods was designed to calibrate the smartphones. Measuring noise with mobile phones differs from traditional static observations, as the users may be moving at any time; a method of attaching an additional microphone with a windscreen is therefore proposed to reduce the wind effect. Since covering an entire area with measurements is impossible, an interpolation method is needed to achieve full coverage. To reduce the influence of spatial heterogeneity and improve the precision of noise mapping, a region-based noise-mapping method is proposed in this paper: it is based on the distribution of noise in different region types tagged by volunteers, which are interpolated and combined to create a noise map. To validate the method, the interpolation results were compared against those of the ordinary kriging method. The results show that our method reflects the local distribution of noise more accurately and has better interpolation precision. We believe that the proposed noise-mapping method is a feasible and low-cost noise-mapping solution. PMID:27754359

  16. The High Citadel: The Influence of Harvard Law School.

    ERIC Educational Resources Information Center

    Seligman, Joel

    The history of Harvard Law School, a modern critique, and a proposed new model for American legal education are covered in this book by a Harvard Law graduate. Harvard Law School is called the "high citadel" of American legal education. Its admissions procedures, faculty selection, curriculum, teaching methods, and placement practices…

  17. Pricing the Services of Scientific Cores. Part I: Charging Subsidized and Unsubsidized Users.

    ERIC Educational Resources Information Center

    Fife, Jerry; Forrester, Robert

    2002-01-01

    Explaining that scientific cores at research institutions support shared resources and facilities, discusses devising a method of charging users for core services and controlling and managing the rates. Proposes the concept of program-based management to cover sources of core support that are funding similar work. (EV)

  18. Image Classification Using Biomimetic Pattern Recognition with Convolutional Neural Networks Features

    PubMed Central

    Huo, Guanying

    2017-01-01

    As a typical deep-learning model, Convolutional Neural Networks (CNNs) can be exploited to automatically extract features from images using the hierarchical structure inspired by mammalian visual system. For image classification tasks, traditional CNN models employ the softmax function for classification. However, owing to the limited capacity of the softmax function, there are some shortcomings of traditional CNN models in image classification. To deal with this problem, a new method combining Biomimetic Pattern Recognition (BPR) with CNNs is proposed for image classification. BPR performs class recognition by a union of geometrical cover sets in a high-dimensional feature space and therefore can overcome some disadvantages of traditional pattern recognition. The proposed method is evaluated on three famous image classification benchmarks, that is, MNIST, AR, and CIFAR-10. The classification accuracies of the proposed method for the three datasets are 99.01%, 98.40%, and 87.11%, respectively, which are much higher in comparison with the other four methods in most cases. PMID:28316614

  19. Mapping shorelines to subpixel accuracy using Landsat imagery

    NASA Astrophysics Data System (ADS)

    Abileah, Ron; Vignudelli, Stefano; Scozzari, Andrea

    2013-04-01

    A promising method to accurately map the shorelines of oceans, lakes, reservoirs, and rivers is proposed and verified in this work. The method is applied to multispectral satellite imagery in two stages. The first stage is a classification of each image pixel into land/water categories using the conventional 'dark pixel' method. The approach presented here makes use of a single shortwave infrared (SWIR) image band, if available. It is well known that SWIR has the least water-leaving radiance and relatively little sensitivity to water pollutants and suspended sediments; it is generally the darkest band over water and the most reliable single band for land-water discrimination. The boundary of the water cover map determined in stage 1 underestimates the water cover and can miss the true shoreline by up to one pixel. A more accurate shoreline is obtained by connecting the center points of pixels with an exactly 50-50 mix of water and land; stage 2 finds these 50-50 mix points. In the proposed method, the image data are interpolated and up-sampled to ten times the original resolution. The local gradient in radiance is used to find the direction to the shore, and the interpolated pixel closest to a 50-50 mix is searched for along that path. Landsat images with 30-m resolution, processed by this method, may thus provide a shoreline accurate to 3 m. Compared to similar approaches in the literature, the proposed method discriminates sub-pixels crossed by the shoreline using a criterion based on the absolute value of radiance rather than its gradient. Preliminary experimentation with the algorithm shows that 10-m accuracy is easily achieved and is often better than 5 m. The proposed method can be used to study long-term shoreline changes by exploiting the 30 years of archived worldwide Landsat imagery, which is free and easily accessible for downloading. Applications that exploit the Landsat dataset and the new method are discussed in the companion poster: "Case-studies of potential applications for highly resolved shorelines."
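    A simplified version of stage 2 can be sketched as: up-sample the SWIR band ten-fold, then flag interpolated pixels whose radiance is close to the 50-50 mix of representative water and land radiances (the full method additionally walks along the local gradient direction toward the shore). The helper below is an illustration under those assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import zoom

    def half_mix_points(swir, water_val, land_val, upsample=10, tol=0.02):
        """Return sub-pixel coordinates near the 50-50 water/land radiance."""
        fine = zoom(swir.astype(float), upsample, order=3)  # 10x up-sampling
        half = 0.5 * (water_val + land_val)                 # 50-50 mix value
        near = np.abs(fine - half) < tol * (land_val - water_val)
        return np.argwhere(near) / upsample   # back to original pixel units
    ```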

  20. Multitemporal Snow Cover Mapping in Mountainous Terrain for Landsat Climate Data Record Development

    NASA Technical Reports Server (NTRS)

    Crawford, Christopher J.; Manson, Steven M.; Bauer, Marvin E.; Hall, Dorothy K.

    2013-01-01

    A multitemporal method to map snow cover in mountainous terrain is proposed to guide Landsat climate data record (CDR) development. The Landsat image archive, including MSS, TM, and ETM+ imagery, was used to construct a prototype Landsat snow cover CDR for the interior northwestern United States. Landsat snow cover CDRs are designed to capture snow-covered area (SCA) variability at discrete bi-monthly intervals that correspond to ground-based snow telemetry (SNOTEL) snow-water-equivalent (SWE) measurements. The June 1 bi-monthly interval was selected for initial CDR development, based on the timing of peak snowmelt in this mountainous region. Fifty-four Landsat images from 1975 to 2011 were preprocessed, including image registration, top-of-atmosphere (TOA) reflectance conversion, cloud and shadow masking, and topographic normalization. Snow-covered pixels were retrieved using the normalized difference snow index (NDSI) and unsupervised classification, and pixels having greater (less) than 50% snow cover were classified as presence (absence). A normalized SCA equation was derived to independently estimate SCA given missing image coverage and cloud-shadow contamination. Relative frequency maps of missing pixels were assembled to assess whether systematic biases were embedded within this Landsat CDR. Our results suggest that it is possible to confidently estimate historical bi-monthly SCA from partially cloudy Landsat images. This multitemporal method is intended to guide Landsat CDR development for freshwater-scarce regions of the western US to monitor climate-driven changes in mountain snowpack extent.
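    The per-pixel snow test can be sketched with the NDSI; the 0.4 cutoff below is a common operational value and an assumption here, since the paper pairs the NDSI with unsupervised classification and a 50% snow cover rule.

    ```python
    import numpy as np

    def snow_presence(green, swir, ndsi_thresh=0.4):
        """NDSI from green and SWIR reflectance; True marks snow presence."""
        ndsi = (green - swir) / (green + swir + 1e-9)
        return ndsi > ndsi_thresh
    ```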

  1. Analysis of economic values of land use and land cover changes in crisis territories by satellite data: models of socio-economy and population dynamics in war

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Yuschenko, Maxim; Movchan, Dmytro; Kopachevsky, Ivan

    2017-10-01

    The problem of harnessing remote sensing data for decision making in conflict territories is considered. An approach for analyzing socio-economic and demographic parameters with a limited set of data and deep uncertainty is described, and a number of interlinked techniques to estimate population and economy in crisis territories are proposed: a stochastic method for assessing population dynamics from multi-source remote sensing data, and an adaptive Markov chain based method for studying land-use changes from satellite data. The proposed approach is applied to the analysis of the socio-economic situation in the Donbas (East Ukraine) conflict territory in 2014-2015. Land-use and land-cover patterns for different periods were analyzed using Landsat and MODIS data. The land-use classification scheme includes the following categories: (1) urban or built-up land, (2) barren land, (3) cropland, (4) horticulture farms, (5) livestock farms, (6) forest, and (7) water. It was demonstrated that no drastic changes in the land-use structure of the study area occurred during 2014-2015; a heterogeneously distributed decrease of horticulture farms (4-6%), livestock farms (5-6%), and croplands (3-4%), and an increase of barren land (6-7%), were observed. A way to analyze land-cover productivity variations from satellite data is proposed, based on the analysis of time series of NDVI and NDWI distributions; drastic changes in crop area and productivity were detected. A set of indirect indicators, such as night light intensity, is also considered. Using the proposed approach and the data described, the local and regional GDP, the local population, and its dynamics are estimated.

  2. Modeling a color-rendering operator for high dynamic range images using a cone-response function

    NASA Astrophysics Data System (ADS)

    Choi, Ho-Hyoung; Kim, Gi-Seok; Yun, Byoung-Ju

    2015-09-01

    Tone-mapping operators are the typical algorithms designed to reproduce visibility and the overall impression of brightness, contrast, and color of high dynamic range (HDR) images on low dynamic range (LDR) display devices. Although several new tone-mapping operators have been proposed in recent years, their results have not matched those of psychophysical experiments based on the human visual system. A color-rendering model that combines tone-mapping and cone-response functions in the XYZ tristimulus color space is presented. In the proposed method, the tone-mapping operator reproduces visibility and the overall impression of brightness, contrast, and color when HDR images are mapped onto relatively low dynamic range devices. The tone-mapped image is obtained using chromatic and achromatic colors to avoid the well-known color distortions of conventional methods. The resulting image is then processed with a cone-response function in which emphasis is placed on human visual perception (HVP). The proposed method covers the mismatch between the actual scene and the rendered image based on HVP. The experimental results show that the proposed method yields improved color-rendering performance compared to conventional methods.

  3. Using polarimetry to retrieve the cloud coverage of Earth-like exoplanets

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Stam, D. M.

    2017-11-01

    Context. Clouds have already been detected in exoplanetary atmospheres. They play crucial roles in a planet's atmosphere and climate and can also create ambiguities in the determination of atmospheric parameters such as trace gas mixing ratios. Knowledge of cloud properties is required when assessing the habitability of a planet. Aims: We aim to show that various types of cloud cover such as polar cusps, subsolar clouds, and patchy clouds on Earth-like exoplanets can be distinguished from each other using the polarization and flux of light that is reflected by the planet. Methods: We have computed the flux and polarization of reflected starlight for different types of (liquid water) cloud covers on Earth-like model planets using the adding-doubling method, that fully includes multiple scattering and polarization. Variations in cloud-top altitudes and planet-wide cloud cover percentages were taken into account. Results: We find that the different types of cloud cover (polar cusps, subsolar clouds, and patchy clouds) can be distinguished from each other and that the percentage of cloud cover can be estimated within 10%. Conclusions: Using our proposed observational strategy, one should be able to determine basic orbital parameters of a planet such as orbital inclination and estimate cloud coverage with reduced ambiguities from the planet's polarization signals along its orbit.

  4. Estimating snow depth of alpine snowpack via airborne multifrequency passive microwave radiance observations: Colorado, USA

    NASA Astrophysics Data System (ADS)

    Kim, R. S.; Durand, M. T.; Li, D.; Baldo, E.; Margulis, S. A.; Dumont, M.; Morin, S.

    2017-12-01

    This paper presents a newly proposed snow depth retrieval approach for mountainous deep snow using airborne multifrequency passive microwave (PM) radiance observations. In contrast to previous snow depth estimation based on assimilating satellite PM radiances, the proposed method uses observations from a single flight and employs snow hydrologic models. This is promising because satellite-based retrieval methods have difficulty estimating snow depth due to their coarse resolution and computational cost. The approach consists of a particle filter using combinations of multiple PM frequencies together with a multi-layer snow physical model (Crocus) to resolve melt-refreeze crusts. The method was applied over the NASA Cold Land Processes Experiment (CLPX) area in Colorado during 2002 and 2003. Results showed a significant improvement over the prior snow depth estimates and the capability to reduce the prior snow depth biases. When applying the snow depth retrieval algorithm with a combination of four PM frequencies (10.7, 18.7, 37.0, and 89.0 GHz), the RMSE values were reduced by 48% at the snow depth transect sites where forest density was less than 5%, despite the deep snow conditions. The method showed sensitivity to the combination of frequencies, the model stratigraphy (the number of layers in the snow physical model), and the estimation method (particle filter versus Kalman filter). The prior RMSE values at the forest-covered areas were reduced by 37-42% even in the presence of forest cover.

  5. Using movies in family medicine teaching: A reference to EURACT Educational Agenda.

    PubMed

    Klemenc Ketiš, Zalika; Švab, Igor

    2017-06-01

    Cinemeducation is a teaching method in which popular movies or movie clips are used. We aimed to determine whether family physicians' competencies as listed in the Educational Agenda produced by the European Academy of Teachers in General Practice/Family Medicine (EURACT) can be found in movies, and to propose a template for teaching with these movies. A group of family medicine teachers provided a list of movies that they would use in cinemeducation. The movies were categorised according to the key family medicine competencies, thus creating a framework of competencies covered by different movies. These key competencies are primary care management, person-centred care, specific problem-solving skills, comprehensive approach, community orientation, and holistic approach. The list consisted of 17 movies. Nine covered primary care management; person-centred care was covered in 13 movies; eight movies covered specific problem-solving skills; comprehensive approach was covered in five movies; five movies covered community orientation; and holistic approach was covered in five movies. All key family medicine competencies listed in the Educational Agenda can be taught using movies. Our results can serve as a template for teachers on how to use appropriate movies in family medicine education.

  6. SymPS: BRDF Symmetry Guided Photometric Stereo for Shape and Light Source Estimation.

    PubMed

    Lu, Feng; Chen, Xiaowu; Sato, Imari; Sato, Yoichi

    2018-01-01

    We propose uncalibrated photometric stereo methods that address the problem due to unknown isotropic reflectance. At the core of our methods is the notion of "constrained half-vector symmetry" for general isotropic BRDFs. We show that such symmetry can be observed in various real-world materials, and it leads to new techniques for shape and light source estimation. Based on the 1D and 2D representations of the symmetry, we propose two methods for surface normal estimation; one focuses on accurate elevation angle recovery for surface normals when the light sources only cover the visible hemisphere, and the other for comprehensive surface normal optimization in the case that the light sources are also non-uniformly distributed. The proposed robust light source estimation method also plays an essential role to let our methods work in an uncalibrated manner with good accuracy. Quantitative evaluations are conducted with both synthetic and real-world scenes, which produce the state-of-the-art accuracy for all of the non-Lambertian materials in MERL database and the real-world datasets.

  7. Orthogonal Array Testing for Transmit Precoding based Codebooks in Space Shift Keying Systems

    NASA Astrophysics Data System (ADS)

    Al-Ansi, Mohammed; Alwee Aljunid, Syed; Sourour, Essam; Mat Safar, Anuar; Rashidi, C. B. M.

    2018-03-01

    In Space Shift Keying (SSK) systems, transmit precoding based codebook approaches have been proposed to improve performance over limited-feedback channels. The receiver performs an exhaustive search in a predefined Full-Combination (FC) codebook to select the optimal codeword that maximizes the Minimum Euclidean Distance (MED) between the received constellations. This research aims to reduce the codebook size so as to minimize the selection time and the number of feedback bits. We therefore propose to construct the codebooks based on Orthogonal Array Testing (OAT) methods, owing to their powerful inherent properties. These methods yield a short codebook whose codewords are sufficient to cover almost all the possible effects included in the FC codebook. Numerical results show the effectiveness of the proposed OAT codebooks in terms of system performance and complexity.

  8. Sol–gel method as a way of carbonyl iron powder surface modification for interaction improvement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Małecki, P., E-mail: pawel.malecki@pwr.edu.pl; Kolman, K.; Pigłowski, J.

    2015-03-15

    This article presents a method for modifying the surface of carbonyl iron particles (CIP; d50 = 4–9 µm) with silica coatings obtained using the sol–gel method. Reaction parameters were determined so as to obtain dry magnetic powder with homogeneous silica coatings without further processing and without any by-product in the solid or liquid phase. This approach is new among the commonly used methods of silica coating of iron particles: to date, no attempt has been made to cover a carbonyl iron surface with silica in a waste-free manner. In the present work two different silica core/shell structures were made by the sol–gel process, based on different silica precursors: tetraethoxysilane (TEOS) and tetramethoxysilane (TMOS). The dependence between the synthesis procedure and the thickness of the silica shell covering the carbonyl iron particles is described. The surface morphology of the modified magnetic particles and the coating thickness were characterized using scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The physicochemical structure of the obtained materials was determined by energy-dispersive X-ray spectroscopy (EDS) and infrared (IR) spectroscopy. The surface composition was analyzed using X-ray photoelectron spectroscopy (XPS). Additionally, the particle size distribution was measured using light microscopy. A new, efficient process of covering micro-sized CIP with a nanometric silica layer was demonstrated, and the results of the performed analyses confirm the effectiveness of the presented method. - Highlights: • Properly covering CIP with a sol–gel silica layer avoids agglomeration. • A new solid-waste-free method of CIP coating is proposed. • The properties of the modified CIP were examined in dependence on the washing process. • Coatings on CIP particles do not change the magnetic properties of the particles.

  9. Quality Evaluation of Land-Cover Classification Using Convolutional Neural Network

    NASA Astrophysics Data System (ADS)

    Dang, Y.; Zhang, J.; Zhao, Y.; Luo, F.; Ma, W.; Yu, F.

    2018-04-01

    Land-cover classification is one of the most important products of earth observation. It focuses mainly on profiling the physical characteristics of the land surface with temporal and distribution attributes, and contains information on both natural and man-made coverage elements, such as vegetation, soil, glaciers, rivers, lakes, marsh wetlands and various man-made structures. In recent years, the amount of high-resolution remote sensing data has increased sharply. Accordingly, the volume of land-cover classification products has grown, along with the need to evaluate such frequently updated products, which is a big challenge. Conventionally, the automatic quality evaluation of land-cover classification is performed with pixel-based classification algorithms, which makes the task much trickier and consequently hard to keep pace with the required updating frequency. In this paper, we propose a novel approach for evaluating land-cover classification quality using a scene classification method based on a Convolutional Neural Network (CNN) model. By learning from remote sensing data, the randomly initialized kernels that serve as filter matrices evolve into operators with functions similar to hand-crafted operators, such as the Sobel or Canny operator, while other kernels learned by the CNN model are much more complex and cannot be interpreted as existing filters. A method with a CNN at its core serves quality-evaluation tasks well, since it produces outputs that directly represent the image's membership grade for each class. An automatic quality evaluation approach for land-cover DLG-DOM coupling data (DLG for Digital Line Graphic, DOM for Digital Orthophoto Map) is introduced in this paper. The robustness of the CNN model for image evaluation motivated the idea of an automatic quality evaluation approach for land-cover classification. Based on this experiment, quality evaluation of DLG-DOM coupling land-cover classification, or of other kinds of labelled remote sensing data, can be studied further.
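
    As a rough sketch of the kind of model described, the following minimal PyTorch scene classifier turns an image patch into per-class membership grades via a softmax; the architecture, patch size and class count are illustrative assumptions, not the authors' network.

```python
# Minimal sketch of a CNN scene classifier whose soft outputs can serve as
# membership grades for quality evaluation. Architecture, patch size and
# class count are illustrative assumptions.
import torch
import torch.nn as nn

class SceneCNN(nn.Module):
    def __init__(self, n_classes=6):        # e.g. vegetation, water, built-up...
        super().__init__()
        self.features = nn.Sequential(       # learned kernels act as filters
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):                    # x: (B, 3, 64, 64) image patches
        z = self.features(x).flatten(1)
        return self.classifier(z)

model = SceneCNN()
patch = torch.randn(1, 3, 64, 64)            # stand-in for a remote sensing patch
membership = torch.softmax(model(patch), dim=1)  # membership grade per class
print(membership)
```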

  10. Gamification for Non-Majors Mathematics: An Innovative Assignment Model

    ERIC Educational Resources Information Center

    Leong, Siow Hoo; Tang, Howe Eng

    2017-01-01

    The most important ingredient of the pedagogy for teaching non-majors is securing their engagement. This paper proposes the use of gamification to engage non-majors. An innovative game termed Cover the Hungarian's Zeros is designed to tackle a common weakness of non-majors in mathematics: solving the assignment problem using the Hungarian Method.…
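
    For readers unfamiliar with the underlying problem, the snippet below shows the assignment problem the game targets, solved with SciPy's Hungarian-method implementation on a made-up cost matrix; in the classroom game, students instead reduce the matrix and cover its zeros with a minimum number of lines by hand.

```python
# Sketch: the assignment problem targeted by the game, solved with SciPy's
# Hungarian-method implementation. The cost matrix is a made-up example.
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
rows, cols = linear_sum_assignment(cost)     # optimal worker -> task pairing
print(list(zip(rows, cols)), "total cost:", cost[rows, cols].sum())
```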

  11. Mapping Nearshore Seagrass and Colonized Hard Bottom Spatial Distribution and Percent Biological Cover in Florida, USA Using Object Based Image Analysis of WorldView-2 Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Baumstark, R. D.; Duffey, R.; Pu, R.

    2016-12-01

    The offshore extent of seagrass habitat along the West Florida (USA) coast represents an important corridor for inshore-offshore migration of economically important fish and shellfish. Surviving at the fringe of their light requirements, offshore seagrass beds are sensitive to changes in water clarity. Beyond and intermingled with the offshore seagrass areas are large swaths of colonized hard bottom. These offshore habitats of the West Florida coast have lacked the mapping efforts needed for status and trends monitoring. The objective of this study was to propose an object-based classification method for mapping offshore habitats and to compare the results to traditional photo-interpreted maps. Benthic maps depicting the spatial distribution and percent biological cover were created from WorldView-2 satellite imagery using an Object-Based Image Analysis (OBIA) method and a visual photo-interpretation method. A logistic regression analysis identified depth and distance from shore as significant parameters for discriminating spectrally similar seagrass and colonized hard bottom features. Seagrass, colonized hard bottom and unconsolidated sediment (sand) were mapped with 78% overall accuracy using the OBIA method, compared to 71% overall accuracy using the photo-interpretation method. This study presents an alternative for mapping deeper, offshore habitats capable of producing maps of higher thematic (percent biological cover) and spatial resolution than those created with the traditional photo-interpretation method.
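
    A minimal sketch of the logistic-regression step described above, with fabricated depth and distance-from-shore values standing in for the study's training data:

```python
# Sketch of using depth and distance-from-shore to separate spectrally similar
# seagrass and colonized hard bottom. Training values are fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression

# features: [depth_m, distance_from_shore_km]; label: 1 = seagrass, 0 = hard bottom
X = np.array([[3.0, 1.0], [4.5, 2.0], [5.0, 3.0],
              [9.0, 8.0], [11.0, 9.5], [12.0, 11.0]])
y = np.array([1, 1, 1, 0, 0, 0])

clf = LogisticRegression().fit(X, y)
print(clf.predict([[6.0, 4.0]]))         # class for an unlabelled image object
print(clf.predict_proba([[6.0, 4.0]]))   # membership probabilities
```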

  12. Two fast and accurate heuristic RBF learning rules for data classification.

    PubMed

    Rouhani, Modjtaba; Javan, Dawood S

    2016-03-01

    This paper presents new Radial Basis Function (RBF) learning methods for classification problems. The proposed methods use heuristics to determine the spreads, the centers and the number of hidden neurons of the network in such a way that higher efficiency is achieved with fewer neurons, while the learning algorithm remains fast and simple. To keep the network size limited, neurons are added to the network recursively until a termination condition is met. Each neuron covers some of the training data. The termination condition is to cover all training data or to reach the maximum number of neurons. In each step, the center and spread of the new neuron are selected to maximize its coverage. Maximizing the coverage of the neurons leads to a network with fewer neurons, and thus lower VC dimension and better generalization. Using a power exponential distribution function as the activation function of the hidden neurons, and in light of the new learning approaches, it is proved that all data become linearly separable in the space of hidden layer outputs, which implies that there exist linear output layer weights with zero training error. The proposed methods are applied to some well-known datasets, and the simulation results, compared with SVM and some other leading RBF learning methods, show their satisfactory and comparable performance. Copyright © 2015 Elsevier Ltd. All rights reserved.
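
    The following greedy-coverage sketch captures the spirit of the heuristic (each new neuron is centered where it covers the most still-uncovered points of one class); the fixed spread and candidate-center choice are simplifications, not the paper's exact rules.

```python
# Greedy-coverage sketch in the spirit of the heuristic above: each new RBF
# neuron is centered where it covers the most still-uncovered training points
# of a single class. A fixed spread is an assumption for brevity.
import numpy as np

def greedy_rbf_centers(X, y, spread=1.0, max_neurons=10):
    covered = np.zeros(len(X), dtype=bool)
    neurons = []
    while not covered.all() and len(neurons) < max_neurons:
        best_i, best_cov = None, None
        for i in range(len(X)):              # candidate center = a data point
            same = (y == y[i]) & ~covered
            cov = same & (np.linalg.norm(X - X[i], axis=1) <= spread)
            if best_cov is None or cov.sum() > best_cov.sum():
                best_i, best_cov = i, cov
        neurons.append((X[best_i], spread, y[best_i]))
        covered |= best_cov
    return neurons

X = np.array([[0, 0], [0.5, 0.2], [0.4, -0.3], [5, 5], [5.3, 4.8]])
y = np.array([0, 0, 0, 1, 1])
print(len(greedy_rbf_centers(X, y)))         # two neurons cover both clusters
```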

  13. Glue detection based on teaching points constraint and tracking model of pixel convolution

    NASA Astrophysics Data System (ADS)

    Geng, Lei; Ma, Xiao; Xiao, Zhitao; Wang, Wen

    2018-01-01

    On-line glue detection based on machine vision is significant for rust protection and strengthening in car production. Shadow stripes caused by reflected light and the unevenness of the inside front cover of the car reduce the accuracy of glue detection. In this paper, we propose an effective algorithm to distinguish the edges of the glue from the shadow stripes. Teaching points are utilized to calculate the slope between each two adjacent points. Then a tracking model based on pixel convolution along the motion direction is designed to segment several local rectangular regions, whose height is given by a distance parameter. Pixel convolution along the motion direction is used to extract the edges of the glue in each local rectangular region. A dataset with varying illumination and stripes of differing shape complexity, comprising 500 thousand images captured from the camera of the glue gun machine, is used to evaluate the proposed method. Experimental results demonstrate that the proposed method detects the edges of the glue accurately, and that the shadow stripes are distinguished and removed effectively. Our method achieves 99.9% accuracy on the image dataset.

  14. A Hybrid Cellular Genetic Algorithm for Multi-objective Crew Scheduling Problem

    NASA Astrophysics Data System (ADS)

    Jolai, Fariborz; Assadipour, Ghazal

    Crew scheduling is one of the important problems of the airline industry. The problem is to assign crew members to a set of flights such that all the flights are covered. In a robust schedule the assignment should be such that the total cost, delays, and unbalanced utilization are minimized. As the problem is NP-hard and the objectives are in conflict with each other, a multi-objective meta-heuristic called CellDE, which is a hybrid cellular genetic algorithm, is implemented as the optimization method. The proposed algorithm provides the decision maker with a set of non-dominated or Pareto-optimal solutions, and enables them to choose the best one according to their preferences. A set of problems of different sizes is generated and solved using the proposed algorithm. To evaluate the performance of the proposed algorithm, three metrics are suggested, and the diversity and the convergence of the achieved Pareto front are appraised. Finally a comparison is made between CellDE and PAES, another meta-heuristic algorithm. The results show the superiority of CellDE.
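
    A small sketch of the Pareto-optimality notion CellDE relies on: filtering candidate schedules down to the non-dominated front, with made-up (cost, delay, imbalance) minimization objectives.

```python
# Sketch of Pareto dominance and non-dominated filtering for minimization
# objectives. The schedule objective values are invented for illustration.
import numpy as np

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points
                       if not np.array_equal(q, p))]

schedules = np.array([[10, 3, 0.2], [8, 5, 0.3], [12, 2, 0.1], [11, 4, 0.4]])
print(pareto_front(schedules))   # [11, 4, 0.4] is dominated by [10, 3, 0.2]
```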

  15. LSB-Based Steganography Using Reflected Gray Code

    NASA Astrophysics Data System (ADS)

    Chen, Chang-Chu; Chang, Chin-Chen

    Steganography aims to hide secret data in an innocuous cover-medium for transmission, so that an attacker cannot easily recognize the presence of the secret data. Even if the stego-medium is captured by an eavesdropper, the slight distortion is hard to detect. LSB-based data hiding is one of the steganographic methods, used to embed secret data into the least significant bits of the pixel values of a cover image. In this paper, we propose an LSB-based scheme using reflected Gray code, which is applied to determine the embedded bit from the secret information. Following the transforming rule, the LSBs of the stego-image are not always equal to the secret bits, and experiments show that the differences reach almost 50%. According to the mathematical deduction and experimental results, the proposed scheme has the same image quality and payload as the simple LSB substitution scheme. In fact, our proposed data hiding scheme in the case of the G1 (one-bit Gray code) system is equivalent to the simple LSB substitution scheme.
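
    A hedged sketch of the idea: the hidden bit is read from the least significant bit of the pixel's reflected Gray code rather than the raw LSB, so a one-level pixel adjustment embeds the bit. The adjustment rule below is our simplification, not necessarily the authors' exact transforming rule.

```python
# Gray-code-guided LSB sketch: the embedded bit is the LSB of the pixel's
# reflected Gray code (bit0 XOR bit1 of the pixel value). Flipping only bit 0
# always flips that Gray LSB, so a one-level change suffices (our simplification).
def gray(v):
    return v ^ (v >> 1)

def embed_bit(pixel, bit):
    if gray(pixel) & 1 == bit:
        return pixel                 # pixel already encodes the secret bit
    return pixel + 1 if pixel % 2 == 0 else pixel - 1   # flip bit 0 only

def extract_bit(pixel):
    return gray(pixel) & 1

cover = [100, 101, 102, 103]
secret = [1, 0, 1, 1]
stego = [embed_bit(p, b) for p, b in zip(cover, secret)]
assert [extract_bit(p) for p in stego] == secret
print(cover, "->", stego)  # raw LSBs of stego need not equal the secret bits
```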

  16. Snow Water Equivalent estimation based on satellite observation

    NASA Astrophysics Data System (ADS)

    Macchiavello, G.; Pesce, F.; Boni, G.; Gabellani, S.

    2009-09-01

    The availability of remotely sensed images and their analysis is a powerful tool for monitoring the extent and type of snow cover over territory where in situ measurements are often difficult. Information on snow is fundamental for monitoring and forecasting the available water, above all in mid-latitude regions such as the Mediterranean, where snowmelt may cause floods. The requirements of hydrological models and the daily acquisitions of MODIS (Moderate Resolution Imaging Spectroradiometer) led, in previous research activities, to the development of a method to automatically map snow cover from multi-spectral images. However, the major hydrological parameter related to the snow pack is the Snow Water Equivalent (SWE), which represents a direct measure of the water stored in the basin. This work therefore focused on the daily estimation of SWE from MODIS images. Because of the complexity of this aim, no approach based only on optical data is found in the literature, since the spectral range of MODIS data does not allow a direct relation between spectral information and SWE to be extracted. A new method, respectful of the physics of snow, was therefore defined and developed. Recalling that snow water equivalent is the product of three factors, snow density, snow depth and snow-covered area, the proposed approach works separately on each of these physical quantities. Since snow density is a function of snow age, a new method to evaluate snow age was devised: a module simulating snow age from albedo information was developed, which runs an age counter, updated by new-snow information, from the zero-accumulation status to the end of the melting season. The height of the snow pack is retrieved by adopting a relation between vegetation and snow depth distributions, computing the snow height distribution from the relation between snow cover fraction and forest canopy density. Finally, the SWE is calculated for the snow-covered areas, detected by means of a previously developed decision tree classifier able to classify snow cover by self-selecting rules in a statistically optimal way. The advantages introduced by this work are many. Firstly, by applying a method suited to the data features, it is possible to automatically obtain snow cover descriptions with high frequency. Moreover, the modularity of the proposed approach allows the estimation of the three factors to be improved independently. A limitation lies in the cloud problem, which affects results by obscuring the observed territory; this is bounded by fusing temporal and spatial information. Furthermore, the spatial resolution of the data, satisfactory for the scale of hydrological models, mismatches the available in situ point information, causing difficulties for method validation and calibration. Nevertheless this workflow is computationally cost-effective, robust to the radiometric noise of the original data, and provides spatially extended and frequent information.
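
    The modular composition can be summarized in a few lines: SWE is the product of snow density, snow depth and snow-covered fraction, each supplied by its own module. The per-pixel values below are placeholders, not MODIS retrievals.

```python
# Sketch of the modular SWE composition: SWE = density x depth x cover fraction,
# each factor estimated by its own module. Values are invented placeholders.
import numpy as np

density = np.array([[250., 300.], [280., 320.]])   # kg/m^3, from snow-age module
depth = np.array([[0.8, 1.2], [0.5, 1.0]])         # m, from canopy-density relation
cover_frac = np.array([[1.0, 0.6], [0.0, 1.0]])    # snow-covered fraction, classifier

rho_water = 1000.0                                  # kg/m^3
swe_mm = density * depth * cover_frac / rho_water * 1000.0  # mm water equivalent
print(swe_mm)
```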

  17. 78 FR 48821 - Energy Conservation Program for Consumer Products and Certain Commercial and Industrial Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-12

    ... Commercial and Industrial Equipment: Proposed Determination of Computer Servers as a Covered Consumer Product... comments on the proposed determination that computer servers (servers) qualify as a covered product. DATES: The comment period for the proposed determination relating to servers published on July 12, 2013 (78...

  18. Image steganalysis using Artificial Bee Colony algorithm

    NASA Astrophysics Data System (ADS)

    Sajedi, Hedieh

    2017-09-01

    Steganography is the science of secure communication where the presence of the communication cannot be detected, while steganalysis is the art of discovering the existence of the secret communication. Processing a huge amount of information usually takes extensive execution time and computational resources. As a result, a preprocessing phase is needed, which can moderate the execution time and computational resources. In this paper, we propose a new feature-based blind steganalysis method for detecting stego images among cover (clean) images in JPEG format. In this regard, we present a feature selection technique based on an improved Artificial Bee Colony (ABC) algorithm. The ABC algorithm is inspired by honeybees' social behaviour in their search for perfect food sources. In the proposed method, the classifier performance and the dimension of the selected feature vector are evaluated using wrapper-based methods. The experiments are performed using two large datasets of JPEG images. Experimental results demonstrate the effectiveness of the proposed steganalysis technique compared to other existing techniques.

  19. Study on Finite Element Model Updating in Highway Bridge Static Loading Test Using Spatially-Distributed Optical Fiber Sensors

    PubMed Central

    Wu, Bitao; Lu, Huaxi; Chen, Bo; Gao, Zhicheng

    2017-01-01

    A finite element model updating method that combines dynamic and static long-gauge strain responses is proposed for highway bridge static loading tests. For this method, an objective function consisting of static long-gauge strains and the first-order macro-strain modal parameter (frequency) is established, wherein the local bending stiffness, density and boundary conditions of the structure are selected as the design variables. The relationship between the macro-strain and the local element stiffness was studied first. It is revealed that the macro-strain is inversely proportional to the local stiffness covered by the long-gauge strain sensor. This corresponding relation is important for the modification of the local stiffness based on the macro-strain. The local and global parameters can be updated simultaneously. A series of numerical simulations and experiments was then conducted to verify the effectiveness of the proposed method. The results show that the static deformation, macro-strain and macro-strain modes can be predicted well using the proposed updating model. PMID:28753912
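
    A minimal sketch of such a combined dynamic-static objective, with toy stand-ins for the FE model (macro-strain inversely proportional to a local stiffness factor, frequency growing with its square root), minimized with SciPy:

```python
# Sketch of a combined dynamic-static updating objective. The model functions
# are toy stand-ins for an actual FE solver; values are invented.
import numpy as np
from scipy.optimize import minimize

measured_strain = np.array([120e-6, 95e-6, 60e-6])  # static long-gauge strains
measured_freq = 3.8                                   # Hz, first modal frequency

def predicted_strain(k):   # macro-strain is inversely proportional to stiffness
    return np.array([130e-6, 100e-6, 65e-6]) / k

def predicted_freq(k):     # frequency grows with the square root of stiffness
    return 3.6 * np.sqrt(k)

def objective(x):
    k = x[0]
    r_static = (predicted_strain(k) - measured_strain) / measured_strain
    r_modal = (predicted_freq(k) - measured_freq) / measured_freq
    return np.sum(r_static**2) + r_modal**2

res = minimize(objective, x0=[1.0], bounds=[(0.5, 2.0)])
print("updated stiffness factor:", res.x[0])
```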

  20. Study on Finite Element Model Updating in Highway Bridge Static Loading Test Using Spatially-Distributed Optical Fiber Sensors.

    PubMed

    Wu, Bitao; Lu, Huaxi; Chen, Bo; Gao, Zhicheng

    2017-07-19

    A finite element model updating method that combines dynamic and static long-gauge strain responses is proposed for highway bridge static loading tests. For this method, an objective function consisting of static long-gauge strains and the first-order macro-strain modal parameter (frequency) is established, wherein the local bending stiffness, density and boundary conditions of the structure are selected as the design variables. The relationship between the macro-strain and the local element stiffness was studied first. It is revealed that the macro-strain is inversely proportional to the local stiffness covered by the long-gauge strain sensor. This corresponding relation is important for the modification of the local stiffness based on the macro-strain. The local and global parameters can be updated simultaneously. A series of numerical simulations and experiments was then conducted to verify the effectiveness of the proposed method. The results show that the static deformation, macro-strain and macro-strain modes can be predicted well using the proposed updating model.

  1. Space-ecology set covering problem for modeling Daiyun Mountain Reserve, China

    NASA Astrophysics Data System (ADS)

    Lin, Chih-Wei; Liu, Jinfu; Huang, Jiahang; Zhang, Huiguang; Lan, Siren; Hong, Wei; Li, Wenzhou

    2018-02-01

    Site selection is an important issue in designing nature reserves and has been studied over the years. However, achieving a well-balanced relationship between preservation of biodiversity and site selection is still challenging. Unlike existing methods, we consider three critical components, spatial continuity, spatial compactness and ecological information, to address the problem of designing the reserve. In this paper, we propose a new mathematical model of the set covering problem, called the Space-ecology Set Covering Problem (SeSCP), for designing a reserve network. First, we generate the ecological information by forest resource investigation. Then, we split the landscape into elementary cells and calculate the ecological score of each cell. Next, we associate the ecological information with the spatial properties to select a set of cells to form a nature reserve, improving the ability to protect biodiversity. Two spatial constraints, continuity and compactness, are imposed in SeSCP: continuity ensures that any selected site is connected to adjacent sites, while compactness minimizes the perimeter of the selected sites. In computational experiments, we take Daiyun Mountain as a study area to demonstrate the feasibility and effectiveness of the proposed model.
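
    For orientation, the sketch below shows the plain set-covering core that SeSCP builds on, greedily selecting cells until all ecological features are covered; the incidence data are invented, and SeSCP's spatial continuity and compactness constraints are deliberately ignored here.

```python
# Greedy set-cover sketch: select cells until all ecological features are
# covered. Cell-to-feature incidence data are invented for illustration.
def greedy_set_cover(universe, cells):
    uncovered, chosen = set(universe), []
    while uncovered:
        best = max(cells, key=lambda c: len(cells[c] & uncovered))
        chosen.append(best)
        uncovered -= cells[best]
    return chosen

features = {"sp1", "sp2", "sp3", "sp4"}              # ecological features
cells = {"A": {"sp1", "sp2"}, "B": {"sp2", "sp3"},
         "C": {"sp3", "sp4"}, "D": {"sp4"}}
print(greedy_set_cover(features, cells))             # e.g. ['A', 'C']
# SeSCP additionally constrains the chosen cells to be spatially continuous
# and compact, which this greedy sketch ignores.
```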

  2. Therapy Decision Support Based on Recommender System Methods

    PubMed Central

    Gräßer, Felix; Beckert, Stefanie; Küster, Denise; Schmitt, Jochen; Abraham, Susanne; Malberg, Hagen

    2017-01-01

    We present a system for data-driven therapy decision support based on techniques from the field of recommender systems. Two methods for therapy recommendation, namely a Collaborative Recommender and a Demographic-based Recommender, are proposed. Both algorithms aim to predict the individual response to different therapy options using diverse patient data, and recommend the therapy that is assumed to provide the best outcome for a specific patient and time, that is, consultation. The proposed methods are evaluated using a clinical database of patients suffering from the autoimmune skin disease psoriasis. The Collaborative Recommender proves to generate both better outcome predictions and higher recommendation quality. However, due to sparsity in the data, this approach cannot provide recommendations for the entire database. In contrast, the Demographic-based Recommender performs worse on average but covers more consultations. Consequently, both methods profit from being combined into an overall recommender system. PMID:29065657
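
    A minimal sketch of the Collaborative Recommender idea: predict a patient's response to each therapy from the outcomes of the most similar past patients, then recommend the best-predicted therapy. The patient features, outcomes and similarity measure are toy assumptions.

```python
# Nearest-neighbor collaborative recommendation sketch with invented data:
# features are [age, sex, severity]; outcomes are past response scores.
import numpy as np

patients = np.array([[55, 1, 7.0], [60, 1, 6.5], [30, 0, 4.0]])
outcomes = {"therapy_A": [0.8, 0.7, 0.2], "therapy_B": [0.3, 0.4, 0.9]}

def recommend(new_patient, k=2):
    d = np.linalg.norm(patients - new_patient, axis=1)
    nearest = np.argsort(d)[:k]                  # k most similar patients
    scores = {t: np.mean([o[i] for i in nearest]) for t, o in outcomes.items()}
    return max(scores, key=scores.get), scores

print(recommend(np.array([58, 1, 6.8])))         # -> therapy_A here
```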

  3. Automatic Centerline Extraction of Covered Roads by Surrounding Objects from High Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Kamangir, H.; Momeni, M.; Satari, M.

    2017-09-01

    This paper presents an automatic method for extracting road centerline networks from high and very high resolution satellite images. The paper addresses the automated extraction of roads covered by multiple natural and artificial objects such as trees, vehicles, and the shadows of buildings or trees. For precise road extraction, the method implements three stages: classification of images based on the maximum likelihood algorithm to categorize images into the classes of interest; modification of the classified images by connected component analysis and morphological operators to extract the pixels of desired objects while removing undesirable pixels of each class; and finally line extraction based on the RANSAC algorithm. To evaluate the performance of the proposed method, the generated results are compared with a ground truth road map as a reference. The evaluation of the proposed method on representative test images shows completeness values ranging between 77% and 93%.
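
    The final stage can be illustrated in a few lines: fitting a centerline through candidate road pixels while ignoring outliers left by occlusions, using scikit-learn's RANSAC regressor on synthetic points.

```python
# RANSAC line-fitting sketch: road-pixel coordinates lie along a line, with
# occlusion artifacts as outliers. All points are synthetic.
import numpy as np
from sklearn.linear_model import RANSACRegressor

rng = np.random.default_rng(1)
x = np.linspace(0, 100, 60)
y = 0.5 * x + 10 + rng.normal(0, 0.5, x.size)   # road pixels along a line
y[::10] += 30                                    # occlusion artifacts (outliers)

model = RANSACRegressor().fit(x.reshape(-1, 1), y)
slope = model.estimator_.coef_[0]
print("centerline slope ~", round(slope, 2),
      "| inliers:", model.inlier_mask_.sum())
```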

  4. Hiding Techniques for Dynamic Encryption Text based on Corner Point

    NASA Astrophysics Data System (ADS)

    Abdullatif, Firas A.; Abdullatif, Alaa A.; al-Saffar, Amna

    2018-05-01

    A hiding technique for dynamic text encryption using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover image points, which serve as the first phase of encryption. The Harris corner point algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process uses the LSBs of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results have demonstrated that the proposed scheme has good embedding quality, error-free text recovery, and a high PSNR value.

  5. Adaptive classifier for steel strip surface defects

    NASA Astrophysics Data System (ADS)

    Jiang, Mingming; Li, Guangyao; Xie, Li; Xiao, Mang; Yi, Li

    2017-01-01

    Surface defect detection systems have been receiving increased attention for their precision, speed and low cost. One of the biggest challenges is reacting to accuracy deterioration over time caused by aging equipment and changing processes. These variables make only a tiny change to the real-world model but have a big impact on the classification result. In this paper, we propose a new adaptive classifier with a Bayes kernel (BYEC) which updates the model with small samples so that it adapts to accuracy deterioration. Firstly, abundant features are introduced to cover a large amount of information about the defects. Secondly, we construct a series of SVMs on random subspaces of the features. Then, a Bayes classifier is trained as an evolutionary kernel to fuse the results from the base SVMs. Finally, we propose a method to update the Bayes evolutionary kernel. The proposed algorithm is experimentally compared with different algorithms; the results demonstrate that the proposed method can be updated with small samples and fits the changed model well. Robustness, a low requirement for samples, and adaptivity are demonstrated in the experiments.
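
    A rough sketch of the random-subspace ensemble described above, with a naive Bayes classifier standing in for the paper's Bayes evolutionary kernel and synthetic data; updating with a small new sample would refit only the fusion layer (e.g. via partial_fit).

```python
# Random-subspace SVM ensemble fused by a Bayes-based layer (a stand-in for
# the paper's evolutionary kernel). Data and subspace sizes are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

subspaces = [rng.choice(20, size=8, replace=False) for _ in range(5)]
svms = [SVC(probability=True).fit(X[:, s], y) for s in subspaces]

def base_outputs(X):
    # each base SVM contributes its positive-class probability
    return np.column_stack([m.predict_proba(X[:, s])[:, 1]
                            for m, s in zip(svms, subspaces)])

fuser = GaussianNB().fit(base_outputs(X), y)   # Bayes-based fusion layer
print("train accuracy:", fuser.score(base_outputs(X), y))
# Adapting to drift: refit only the fusion layer on a small new sample,
# e.g. fuser.partial_fit(base_outputs(X_new), y_new).
```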

  6. A System Approach to Navy Medical Education and Training. Appendix 36. Competency Curriculum for Operating Room Assistant and Operating Room Technician.

    DTIC Science & Technology

    1974-08-31

    These methods and curriculum materials constituted a third (instructional) sub-system. Thus, as originally proposed, a system capability has been...With NODAL and its associated indexing techniques, it is possible to assemble modified or completely different inventories than those used in this research...Curriculum topics include: covering all hair as a source of infection; the method by which synthetic material causes static electricity; the danger of static electricity in the O.R. suite.

  7. 76 FR 54734 - Proposed Information Collection; Comment Request; Application for Export Trade; Certificate of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... private suits, by limiting liability in private actions to actual damages when the challenged activities are covered by an Export Trade Certificate of Review. II. Method of Collection The form is sent by request to U.S. firms. III. Data OMB Control Number: 0625-0125. Form Number(s): ITA-4093P. Type of Review...

  8. A method for optical imaging and monitoring of the excretion of fluorescent nanocomposites from the body using artificial neural networks.

    PubMed

    Sarmanova, Olga E; Burikov, Sergey A; Dolenko, Sergey A; Isaev, Igor V; Laptinskiy, Kirill A; Prabhakar, Neeraj; Karaman, Didem Şen; Rosenholm, Jessica M; Shenderova, Olga A; Dolenko, Tatiana A

    2018-04-12

    In this study, a new approach to the optical imaging of fluorescent nanoparticles in a biological medium using artificial neural networks is proposed. The studies were carried out using newly synthesized nanocomposites: nanometer-scale graphene oxides (nGO) covered by a poly(ethylene imine)-poly(ethylene glycol) copolymer and by folic acid. We present an example of a successful solution to the problem of monitoring the removal of nGO-based nanocomposites and their components with urine, using fluorescence spectroscopy and artificial neural networks. The proposed method is, however, applicable to the optical imaging of any fluorescent nanoparticles used as theranostic agents in biological tissue. Copyright © 2018. Published by Elsevier Inc.

  9. Dataset of surface plasmon resonance based on photonic crystal fiber for chemical sensing applications.

    PubMed

    Khalek, Md Abdul; Chakma, Sujan; Paul, Bikash Kumar; Ahmed, Kawsar

    2018-08-01

    In this research work a perfectly circular-lattice Photonic Crystal Fiber (PCF) based surface plasmon resonance (SPR) sensor is proposed. The investigation was carried out using the finite element method (FEM) in the commercially available software package COMSOL Multiphysics version 4.2. The investigation covers a wide optical spectrum ranging from 0.48 µm to 1.10 µm. Using the wavelength interrogation method the proposed model exhibits a maximum sensitivity of 9000 nm/RIU (Refractive Index Unit), and using the amplitude interrogation method it obtains a maximum sensitivity of 318 RIU^-1. Moreover, the maximum sensor resolution is 1.11×10^-5 in the sensing range between 1.34 and 1.37. The suggested sensor model may have great impact in biological areas such as bio-imaging.

  10. Indonesian name matching using machine learning supervised approach

    NASA Astrophysics Data System (ADS)

    Alifikri, Mohamad; Arif Bijaksana, Moch.

    2018-03-01

    Most existing name matching methods were developed for the English language and so cover the characteristics of that language. To date, no specific method has been designed and implemented for Indonesian names. The purpose of this thesis is to develop an Indonesian name matching dataset, as a contribution to academic research, and to propose a suitable feature set utilizing a combination of the context of the name strings and their permute-Winkler scores. Machine learning classification algorithms are taken as the method for performing name matching. Based on the experiments, using a tuned Random Forest algorithm and the proposed features, matching performance improves by approximately 1.7%, and up to 70% of the misclassifications of state-of-the-art methods are eliminated. This improved performance makes the matching system more effective and reduces the risk of misclassified matches.

  11. 77 FR 6879 - Rules of Practice for Trials Before the Patent Trial and Appeal Board and Judicial Review of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ...The United States Patent and Trademark Office (Office or USPTO) proposes new rules of practice to implement the provisions of the Leahy-Smith America Invents Act that provide for trials before the Patent Trial and Appeal Board (Board). The proposed rules would provide a consolidated set of rules relating to Board trial practice for inter partes review, post-grant review, the transitional program for covered business method patents, and derivation proceedings. The proposed rules would also provide a consolidated set of rules to implement the provisions of the Leahy-Smith America Invents Act related to seeking judicial review of Board decisions.

  12. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
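
    As one concrete instance covered by such a principle, the sketch below integrates a Langevin thermostat for a harmonic oscillator with an Euler-Maruyama step and checks that the kinetic temperature approaches kT; the parameters are illustrative and the scheme is a generic textbook one, not the authors' construction.

```python
# Langevin thermostat sketch for a harmonic oscillator (U = q^2/2), sampling
# the canonical ensemble via Euler-Maruyama. Parameters are illustrative.
import numpy as np

kT, gamma, m, dt = 1.0, 1.0, 1.0, 1e-3
rng = np.random.default_rng(0)
q, p = 1.0, 0.0

samples = []
for _ in range(200_000):
    f = -q                                   # harmonic force
    p += (f - gamma * p) * dt + np.sqrt(2 * gamma * m * kT * dt) * rng.standard_normal()
    q += p / m * dt
    samples.append(p)

# stationary distribution gives <p^2>/m ~ kT (up to O(dt) integrator bias)
print("kinetic temperature ~", np.mean(np.square(samples)) / m)
```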

  13. Simultaneous measurement of refractive index and thickness by combining low-coherence interferometry and confocal optics.

    PubMed

    Kim, Seokhan; Na, Jihoon; Kim, Myoung Jin; Lee, Byeong Ha

    2008-04-14

    We propose and demonstrate novel methods that enable simultaneous measurement of the phase index, the group index, and the geometrical thickness of an optically transparent object by combining a low-coherence interferometer and confocal optics. The low-coherence interferometer gives information relating the group index to the thickness, while the confocal optics provides access to the phase index related to the thickness of the sample. To relate these, two novel methods were devised. In the first method, the dispersion-induced broadening of the low-coherence envelope signal was utilized, and in the second method the frequency derivative of the phase index was obtained directly by taking confocal measurements at several wavelengths. Measurements were made on eight different samples: B270, CaF2, two of BK7, two of fused silica, a cover glass, and cigarette cover film. The average measurement errors of the first and second methods were 0.123% and 0.061% in geometrical thickness, 0.133% and 0.066% in phase index, and 0.106% and 0.057% in group index, respectively.

  14. Erosion estimation of guide vane end clearance in hydraulic turbines with sediment water flow

    NASA Astrophysics Data System (ADS)

    Han, Wei; Kang, Jingbo; Wang, Jie; Peng, Guoyi; Li, Lianyuan; Su, Min

    2018-04-01

    The end surfaces of the guide vane and head cover are among the parts of high-head hydraulic turbines most seriously affected by sediment erosion. In order to investigate the relationship between the erosion depth of the wall surface and the characteristic erosion parameter, an estimation method comprising a simplified flow model and a modified erosion calculation function is proposed in this paper. The flow between the end surfaces of the guide vane and head cover is simplified as a clearance flow around a circular cylinder with a backward-facing step. The erosion characteristic parameter c_s·w_s^3 is calculated with the mixture model for multiphase flow and the renormalization group (RNG) k-ε turbulence model under actual working conditions, based on which the erosion depths of the guide vane and head cover end surfaces are estimated with a modification of the erosion coefficient K. The estimation results agree well with the actual situation. It is shown that the estimation method is reasonable for erosion prediction of the guide vane and can provide a significant reference for determining the optimal maintenance cycle of hydraulic turbines in the future.

  15. 7 CFR 3402.11 - Proposal cover page.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Proposal cover page. 3402.11 Section 3402.11 Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE FOOD AND AGRICULTURAL SCIENCES NATIONAL NEEDS...

  16. Multi-source remotely sensed data fusion for improving land cover classification

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Huang, Bo; Xu, Bing

    2017-02-01

    Although many advances have been made in past decades, land cover classification of fine-resolution remotely sensed (RS) data integrating multiple temporal, angular, and spectral features remains limited, and the contribution of different RS features to land cover classification accuracy remains uncertain. We proposed to improve land cover classification accuracy by integrating multi-source RS features through data fusion. We further investigated the effect of different RS features on classification performance. The results of fusing Landsat-8 Operational Land Imager (OLI) data with Moderate Resolution Imaging Spectroradiometer (MODIS), China Environment 1A series (HJ-1A), and Advanced Spaceborne Thermal Emission and Reflection (ASTER) digital elevation model (DEM) data showed that the fused data integrating temporal, spectral, angular, and topographic features achieved better land cover classification accuracy than the original RS data. Compared with the topographic feature, the temporal and angular features extracted from the fused data played more important roles in classification performance, especially those temporal features containing abundant vegetation growth information, which markedly increased the overall classification accuracy. In addition, the multispectral and hyperspectral fusion successfully discriminated detailed forest types. Our study provides a straightforward strategy for hierarchical land cover classification by making full use of available RS data. All of these methods and findings could be useful for land cover classification at both regional and global scales.

  17. An Augmented Reality Endoscope System for Ureter Position Detection.

    PubMed

    Yu, Feng; Song, Enmin; Liu, Hong; Li, Yunlong; Zhu, Jun; Hung, Chih-Cheng

    2018-06-25

    Iatrogenic injury of the ureter during clinical operations may cause serious complications and kidney damage. To avoid such medical accidents, it is necessary to provide ureter position information to the doctor. For this purpose, a ureter position detection and display system with augmented reality is proposed to detect the ureter where it is covered by human tissue. There are two key issues to consider in this new system: how to detect the covered ureter, which cannot be captured by the electronic endoscope, and how to display the ureter position with stable, high-quality images. Moreover, any processing delay in the system would disturb the surgery. An aided-hardware detection method and target detection algorithms are proposed in this system. To mark the ureter position, a surface-lighting plastic optical fiber (POF) with encoded light-emitting diode (LED) light is used to indicate the ureter position. A monochrome channel filtering algorithm (MCFA) is proposed to locate the ureter region more precisely. The ureter position is extracted using the proposed automatic region growing algorithm (ARGA), which utilizes the statistical information of the monochrome channel to select the growing seed point. In addition, according to the pulse signal of the encoded light, recognition of bright and dark frames based on the aided hardware (BDAH) is proposed to expedite the processing speed. Experimental results demonstrate that the proposed endoscope system can identify 92.04% of the ureter region on average.

  18. Spectral unmixing of urban land cover using a generic library approach

    NASA Astrophysics Data System (ADS)

    Degerickx, Jeroen; Lordache, Marian-Daniel; Okujeni, Akpona; Hermy, Martin; van der Linden, Sebastian; Somers, Ben

    2016-10-01

    Remote sensing based land cover classification in urban areas generally requires the use of subpixel classification algorithms to take into account the high spatial heterogeneity. These spectral unmixing techniques often rely on spectral libraries, i.e. collections of pure material spectra (endmembers, EM), which ideally cover the large EM variability typically present in urban scenes. Despite the advent of several (semi-) automated EM detection algorithms, the collection of such image-specific libraries remains a tedious and time-consuming task. As an alternative, we suggest the use of a generic urban EM library, containing material spectra under varying conditions, acquired from different locations and sensors. This approach requires an efficient EM selection technique, capable of selecting only those spectra relevant for a specific image. In this paper, we evaluate and compare the potential of different existing library pruning algorithms (Iterative Endmember Selection and MUSIC) using simulated hyperspectral (APEX) data of the Brussels metropolitan area. In addition, we develop a new hybrid EM selection method which is shown to be highly efficient in dealing with both image-specific and generic libraries, subsequently yielding more robust land cover classification results compared to existing methods. Future research will include further optimization of the proposed algorithm and additional tests on both simulated and real hyperspectral data.
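
    Once a pruned library is available, the unmixing itself can be sketched in a few lines: each pixel spectrum is decomposed into non-negative abundances of the selected endmembers. The 5-band spectra below are fabricated.

```python
# Linear spectral unmixing sketch with a (pruned) endmember library, using
# non-negative least squares. Spectra are fabricated 5-band examples.
import numpy as np
from scipy.optimize import nnls

E = np.array([[0.10, 0.40, 0.30],    # bands x endmembers
              [0.12, 0.45, 0.28],    # (e.g. roof, vegetation, pavement)
              [0.15, 0.20, 0.26],
              [0.20, 0.60, 0.25],
              [0.22, 0.55, 0.24]])
pixel = 0.6 * E[:, 0] + 0.4 * E[:, 1]   # mixed pixel: 60% roof, 40% vegetation

abundances, resid = nnls(E, pixel)
print(abundances / abundances.sum())     # ~ [0.6, 0.4, 0.0]
```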

  19. A calibration method based on virtual large planar target for cameras with large FOV

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Han, Yangyang; Nie, Hong; Ou, Qiaofeng; Xiong, Bangshu

    2018-02-01

    In order to obtain high precision in camera calibration, the target should be large enough to cover the whole field of view (FOV). For cameras with a large FOV, using a small target seriously reduces the precision of calibration. However, using a large target brings many difficulties in fabricating, carrying and deploying it. To solve this problem, a calibration method based on a virtual large planar target (VLPT), which is virtually constructed from multiple small targets (STs), is proposed for cameras with a large FOV. In the VLPT-based calibration method, first, the positions and directions of the STs are changed several times to obtain a number of calibration images. Secondly, the VLPT of each calibration image is created by finding the virtual points corresponding to the feature points of the STs. Finally, the intrinsic and extrinsic parameters of the camera are calculated using the VLPTs. Experimental results show that the proposed method can not only achieve calibration precision similar to that of methods employing a large target, but also has good stability over the whole measurement area. Thus, the difficulties of accurately calibrating cameras with a large FOV can be effectively tackled by the proposed method, which also offers good operability.
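
    The sketch below illustrates the general workflow such a method feeds into: planar target points observed in several poses are passed to a standard calibration routine (here OpenCV's calibrateCamera on synthetic projections); the VLPT construction step itself is assumed done and is not reproduced.

```python
# Standard planar-target calibration sketch on synthetic data. The merging of
# small targets into a VLPT is assumed already performed (not shown).
import numpy as np
import cv2

K_true = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
dist_true = np.zeros(5)
grid = np.array([[x, y, 0.] for y in range(5) for x in range(7)], np.float32)

obj_pts, img_pts = [], []
for rx, tz in [(0.1, 8.), (-0.2, 9.), (0.3, 10.)]:   # three target poses
    rvec = np.array([rx, 0.05, 0.])
    tvec = np.array([-3., -2., tz])
    proj, _ = cv2.projectPoints(grid, rvec, tvec, K_true, dist_true)
    obj_pts.append(grid)
    img_pts.append(proj.astype(np.float32))

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, (640, 480), None, None)
print("reprojection RMS:", rms)
print(K.round(1))      # recovered intrinsics should approximate K_true
```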

  20. Two-Wavelength Multi-Gigahertz Frequency Comb-Based Interferometry for Full-Field Profilometry

    NASA Astrophysics Data System (ADS)

    Choi, Samuel; Kashiwagi, Ken; Kojima, Shuto; Kasuya, Yosuke; Kurokawa, Takashi

    2013-10-01

    A multi-gigahertz frequency comb-based interferometer exhibits only the interference amplitude peak, without phase fringes, which enables a rapid axial scan for full-field profilometry and tomography. Despite these substantial technical advantages, there remains the problem that interference intensity undulations occur depending on the interference phase. To avoid this problem, we propose a compensation technique for the interference signals using two frequency combs with slightly different center wavelengths. Compensated full-field surface profile measurements of a cover glass and onion skin were demonstrated experimentally to verify the advantages of the proposed method.

  1. The use of low density high accuracy (LDHA) data for correction of high density low accuracy (HDLA) point cloud

    NASA Astrophysics Data System (ADS)

    Rak, Michal Bartosz; Wozniak, Adam; Mayer, J. R. R.

    2016-06-01

    Coordinate measuring techniques rely on computer processing of coordinate values of points gathered from physical surfaces using contact or non-contact methods. Contact measurements are characterized by low density and high accuracy. On the other hand, optical methods gather high-density data of the whole object in a short time, but with accuracy at least one order of magnitude lower than that of contact measurements. Thus the drawback of contact methods is the low density of data, while for non-contact methods it is low accuracy. In this paper a method is presented for the fusion of data from two measurements of fundamentally different nature, high density low accuracy (HDLA) and low density high accuracy (LDHA), to overcome the limitations of both measuring methods. In the proposed method the concept of virtual markers is used to find a representation of pairs of corresponding characteristic points in both sets of data. In each pair the coordinates of the point from the contact measurement are treated as a reference for the corresponding point from the non-contact measurement. A transformation that maps the characteristic points from the optical measurement onto their matches from the contact measurement is determined and applied to the whole point cloud. The efficiency of the proposed algorithm was evaluated by comparison with data from a coordinate measuring machine (CMM). Three surfaces were used for this evaluation: a plane, a turbine blade and an engine cover. For the planar surface the achieved improvement was around 200 μm. Similar results were obtained for the turbine blade, but for the engine cover the improvement was smaller. For both freeform surfaces the improvement was higher for raw data than for data after creation of a triangle mesh.
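
    Assuming the virtual-marker matching step is already done, the correction can be sketched as estimating the rigid transform that maps the optical (HDLA) characteristic points onto their contact-measured (LDHA) counterparts and applying it to the whole cloud, e.g. with the Kabsch algorithm:

```python
# Kabsch rigid-alignment sketch on synthetic data: find R, t mapping the
# misaligned HDLA points onto the LDHA references, then apply to the cloud.
import numpy as np

def kabsch(P, Q):
    """Rotation R and translation t minimizing || (R @ P.T).T + t - Q ||."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    return R, cQ - R @ cP

ldha = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])  # references
theta = 0.02
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta), np.cos(theta), 0], [0, 0, 1.]])
hdla = ldha @ Rz.T + np.array([0.05, -0.03, 0.01])              # misaligned

R, t = kabsch(hdla, ldha)
corrected = hdla @ R.T + t
print(np.abs(corrected - ldha).max())    # residual ~ 0 after correction
```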

  2. Ontology-Based Method for Fault Diagnosis of Loaders.

    PubMed

    Xu, Feixiang; Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-02-28

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integration, sharing and reuse of fault diagnosis knowledge for loaders; (2) combined with the ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) to cover the shortcomings of the CBR method due to the lack of relevant cases, ontology-based RBR (rule-based reasoning) is put forward by building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through the analysis of a case study.

  3. High-efficiency power transfer for silicon-based photonic devices

    NASA Astrophysics Data System (ADS)

    Son, Gyeongho; Yu, Kyoungsik

    2018-02-01

    We demonstrate an efficient coupling of guided light of 1550 nm from a standard single-mode optical fiber to a silicon waveguide using the finite-difference time-domain method and propose a fabrication method of tapered optical fibers for efficient power transfer to silicon-based photonic integrated circuits. Adiabatically-varying fiber core diameters with a small tapering angle can be obtained using the tube etching method with hydrofluoric acid and standard single-mode fibers covered by plastic jackets. The optical power transmission of the fundamental HE11 and TE-like modes between the fiber tapers and the inversely-tapered silicon waveguides was calculated with the finite-difference time-domain method to be more than 99% at a wavelength of 1550 nm. The proposed method for adiabatic fiber tapering can be applied in quantum optics, silicon-based photonic integrated circuits, and nanophotonics. Furthermore, efficient coupling within the telecommunication C-band is a promising approach for quantum networks in the future.

  4. Ontology-Based Method for Fault Diagnosis of Loaders

    PubMed Central

    Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-01-01

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integration, sharing and reuse of fault diagnosis knowledge for loaders; (2) combined with the ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) to cover the shortcomings of the CBR method due to the lack of relevant cases, ontology-based RBR (rule-based reasoning) is put forward by building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through the analysis of a case study. PMID:29495646

  5. Simplex-stochastic collocation method with improved scalability

    NASA Astrophysics Data System (ADS)

    Edeling, W. N.; Dwight, R. P.; Cinnella, P.

    2016-04-01

    The Simplex-Stochastic Collocation (SSC) method is a robust tool used to propagate uncertain input distributions through a computer code. However, it becomes prohibitively expensive for problems with dimensions higher than 5. The main purpose of this paper is to identify bottlenecks, and to improve upon this bad scalability. In order to do so, we propose an alternative interpolation stencil technique based upon the Set-Covering problem, and we integrate the SSC method in the High-Dimensional Model-Reduction framework. In addition, we address the issue of ill-conditioned sample matrices, and we present an analytical map to facilitate uniformly-distributed simplex sampling.

  6. A Cross-Lingual Similarity Measure for Detecting Biomedical Term Translations

    PubMed Central

    Bollegala, Danushka; Kontonatsios, Georgios; Ananiadou, Sophia

    2015-01-01

    Bilingual dictionaries for technical terms such as biomedical terms are an important resource for machine translation systems as well as for humans who would like to understand a concept described in a foreign language. Often a biomedical term is first proposed in English and later it is manually translated to other languages. Despite the fact that there are large monolingual lexicons of biomedical terms, only a fraction of those term lexicons are translated to other languages. Manually compiling large-scale bilingual dictionaries for technical domains is a challenging task because it is difficult to find a sufficiently large number of bilingual experts. We propose a cross-lingual similarity measure for detecting most similar translation candidates for a biomedical term specified in one language (source) from another language (target). Specifically, a biomedical term in a language is represented using two types of features: (a) intrinsic features that consist of character n-grams extracted from the term under consideration, and (b) extrinsic features that consist of unigrams and bigrams extracted from the contextual windows surrounding the term under consideration. We propose a cross-lingual similarity measure using each of those feature types. First, to reduce the dimensionality of the feature space in each language, we propose prototype vector projection (PVP)—a non-negative lower-dimensional vector projection method. Second, we propose a method to learn a mapping between the feature spaces in the source and target language using partial least squares regression (PLSR). The proposed method requires only a small number of training instances to learn a cross-lingual similarity measure. The proposed PVP method outperforms popular dimensionality reduction methods such as the singular value decomposition (SVD) and non-negative matrix factorization (NMF) in a nearest neighbor prediction task. Moreover, our experimental results covering several language pairs such as English–French, English–Spanish, English–Greek, and English–Japanese show that the proposed method outperforms several other feature projection methods in biomedical term translation prediction tasks. PMID:26030738
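
    A compact sketch of the PLSR mapping step: learn a linear map from source-language term features to target-language term features, then rank translation candidates by similarity. The random feature vectors below stand in for the character n-gram and context features described above.

```python
# PLSR cross-lingual mapping sketch: project a source term's features into the
# target feature space, then rank candidates by cosine similarity.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X_src = rng.standard_normal((200, 50))          # source-language term features
W = rng.standard_normal((50, 40))
X_tgt = X_src @ W + 0.1 * rng.standard_normal((200, 40))  # aligned target features

pls = PLSRegression(n_components=10).fit(X_src, X_tgt)

query = X_src[:1]                                # a source term
mapped = pls.predict(query)                      # projected into target space
sims = (X_tgt @ mapped.T).ravel() / (
    np.linalg.norm(X_tgt, axis=1) * np.linalg.norm(mapped))
print("top candidate index:", int(np.argmax(sims)))  # ideally 0, its own pair
```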

  7. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Ming; Yu, Hengyong, E-mail: hengyong-yu@ieee.org

    2015-10-15

    Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++ and linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended to horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphics processing units.

  8. [Land use and land cover change (LUCC) and landscape service: Evaluation, mapping and modeling].

    PubMed

    Song, Zhang-jian; Cao, Yu; Tan, Yong-zhong; Chen, Xiao-dong; Chen, Xian-peng

    2015-05-01

    Studies of ecosystem services at the landscape scale have received increasing attention from researchers all over the world. Compared with the ecosystem scale, the landscape scale should be more suitable for exploring the influence of human activities on land use and land cover change (LUCC), and for interpreting the mechanisms and processes of sustainable landscape dynamics. Based on a comprehensive and systematic analysis of research on landscape services, this paper first discusses the basic concepts and classification of landscape services. Then, methods for the evaluation, mapping and modeling of landscape services are analyzed and summarized. Finally, future trends for research on landscape services are proposed. It is suggested that further exploring the connotation and classification system of landscape services, improving the methods and quantitative indicators for the evaluation, mapping and modelling of landscape services, carrying out long-term integrated research on landscape pattern-process-service-scale relationships, and enhancing the application of theories and methods from landscape economics and landscape ecology are very important fields for future research on landscape services.

  9. 25 CFR 900.26 - What happens if the Secretary declines a part of a proposal on the ground that the proposal...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., function, service or activity that is beyond the scope of programs covered under section 102(a) of the Act, or proposes a level of funding that is in excess of the applicable level determined under section 106... administer a program, function, service or activity that is beyond the scope of programs covered under...

  10. 25 CFR 900.26 - What happens if the Secretary declines a part of a proposal on the ground that the proposal...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., function, service or activity that is beyond the scope of programs covered under section 102(a) of the Act, or proposes a level of funding that is in excess of the applicable level determined under section 106... administer a program, function, service or activity that is beyond the scope of programs covered under...

  11. Digital cover photography for estimating leaf area index (LAI) in apple trees using a variable light extinction coefficient.

    PubMed

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-28

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. The leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAI(D)), which was compared with the LAI estimated by the proposed digital photography method (LAI(M)). Results showed that the LAI(M) was able to estimate LAI(D) with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (f(f)) derived from the images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results showed that using a proxy k value estimated from Ic helped to increase the accuracy of LAI estimates from digital cover images for apple trees with different canopy sizes under field conditions.
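
    The core computation can be sketched as a gap-fraction inversion of Beer-Lambert type, LAI = -ln(1 - ff)/k; the binarization threshold and the ff-dependent k below are simplified, hypothetical stand-ins for the paper's calibrated functions.

```python
# Gap-fraction LAI sketch: foliage cover fraction from a binarized nadir
# photo, then LAI = -ln(1 - ff) / k. Threshold and the ff-dependent k are
# hypothetical simplifications, not the paper's calibrated values.
import numpy as np

img = np.random.default_rng(0).uniform(0, 255, (100, 100))  # stand-in photo band
canopy = img < 110                  # "foliage" pixels after binarization
ff = canopy.mean()                  # fraction of foliage cover

k_const = 0.68                      # constant extinction coefficient
lai_const = -np.log(1.0 - ff) / k_const

k_var = 0.5 + 0.4 * ff              # hypothetical ff-dependent coefficient
lai_var = -np.log(1.0 - ff) / k_var
print(round(ff, 2), round(lai_const, 2), round(lai_var, 2))
```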

  12. Digital Cover Photography for Estimating Leaf Area Index (LAI) in Apple Trees Using a Variable Light Extinction Coefficient

    PubMed Central

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-01

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. The leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAID), which was compared with the LAI estimated by the proposed digital photography method (LAIM). Results showed that LAIM was able to estimate LAID with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (ff) derived from the images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results show that using a proxy k value estimated from Ic helped to increase the accuracy of LAI estimates from digital cover images for apple trees with different canopy sizes under field conditions. PMID:25635411

  13. Accelerated lattice Boltzmann model for colloidal suspensions rheology and interface morphology

    NASA Astrophysics Data System (ADS)

    Farhat, Hassan

    Colloids are ubiquitous in the food, medical, cosmetic, polymer, water purification and pharmaceutical industries. The thermal, mechanical and storage properties of colloids are highly dependent on their interface morphology and rheological behavior. Numerical methods provide a cheap and reliable virtual laboratory for the study of colloids; however, efficiency is a major concern when numerical methods are used for practical applications. This work introduces the main building blocks of an improved lattice Boltzmann-based numerical tool designed for the study of colloidal rheology and interface morphology. The efficiency of the proposed model is enhanced by using the recently developed and validated migrating multi-block algorithms for the lattice Boltzmann method (LBM). The migrating multi-block was used to simulate single-component, multi-component, multiphase and single-component multiphase flows, and the results were validated against experimental, numerical and analytical solutions. Contamination of the fluid-fluid interface influences colloid morphology; this issue was addressed by the introduction of a hybrid LBM for surfactant-covered droplets. The module was used to simulate surfactant-covered droplet deformation under shear and uniaxial extensional flows, respectively, and under buoyancy, with validation against experimental and theoretical results. Colloids are non-Newtonian fluids that exhibit rich rheological behavior; the suppression-of-coalescence module is the part of the proposed model that facilitates the study of colloid rheology. The model results for the relative viscosity were in agreement with some theoretical results. Biological suspensions such as blood are macro-colloids by nature. The study of blood flow in the microvasculature was approached heuristically by treating the red blood cells as surfactant-covered droplets. The effects of interfacial tension on the flow velocity and on droplet exclusion from the walls in parabolic flows were in qualitative agreement with some experimental and numerical results, and the Fahraeus and Fahraeus-Lindqvist effects were reproduced. The proposed LBM model provides a flexible numerical platform consisting of various modules that can be used separately or in combination to study a variety of flow deformation problems in colloids and biological suspensions.

  14. Improving LUC estimation accuracy with multiple classification system for studying impact of urbanization on watershed flood

    NASA Astrophysics Data System (ADS)

    Dou, P.

    2017-12-01

    Guangzhou has experienced a period of rapid urbanization, described as "small change in three years and big change in five years," since the reform of China, resulting in significant land use/cover change (LUC). To overcome the accuracy limitations of a single classifier for remote sensing image classification, a multiple classifier system (MCS) is proposed to improve the quality of remote sensing image classification. The new method combines the advantages of different learning algorithms and achieves higher accuracy (88.12%) than any single classifier. With the proposed MCS, land use/cover (LUC) maps were obtained from Landsat images from 1987 to 2015, and these maps were used for three watersheds (Shijing River, Chebei Stream, and Shahe Stream) to estimate the impact of urbanization on watershed flooding. The results show that with the high-accuracy LUC maps, the uncertainty in flood simulations is reduced effectively (by 15.5%, 17.3% and 19.8% for the Shijing River, Chebei Stream, and Shahe Stream, respectively).
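
    The abstract does not name the member classifiers, so the following sketch illustrates the MCS idea with an assumed trio of scikit-learn models combined by soft voting:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Stand-in for per-pixel spectral features and land cover labels.
X, y = make_classification(n_samples=600, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Multiple classifier system: average the class probabilities of
# heterogeneous learners (soft voting) instead of trusting one model.
mcs = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("knn", KNeighborsClassifier()),
                ("svm", SVC(probability=True, random_state=0))],
    voting="soft")
mcs.fit(Xtr, ytr)
print("MCS accuracy:", accuracy_score(yte, mcs.predict(Xte)))
```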

  15. Corrosion Resistance of a Cast-Iron Material Coated With a Ceramic Layer Using Thermal Spray Method

    NASA Astrophysics Data System (ADS)

    Florea, C. D.; Bejinariu, C.; Munteanu, C.; Istrate, B.; Toma, S. L.; Alexandru, A.; Cimpoesu, R.

    2018-06-01

    Cast iron 250, used in brake systems, shows many signs of corrosion after a typical period of use, depending on the environmental conditions in which it operates. In order to improve its corrosion resistance, we propose covering the active part of the material with a ceramic material. The deposition process is an industrial system based on thermal spraying that can cover large surfaces in a short time. In this article we analyze the influence of a ceramic layer (40-50 µm) on the corrosion resistance of FC250 cast iron. The results were analyzed using scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDS), and linear and cyclic potentiometry.

  16. Ex-ante assessment of the safety effects of intelligent transport systems.

    PubMed

    Kulmala, Risto

    2010-07-01

    There is a need to develop a comprehensive framework for the safety assessment of Intelligent Transport Systems (ITS). This framework should: (1) cover all three dimensions of road safety: exposure, crash risk and consequence; (2) cover, in addition to the engineering effect, the effects due to behavioural adaptation; and (3) be compatible with other aspects of state-of-the-art road safety theories. A framework based on nine ITS safety mechanisms is proposed and discussed with regard to the requirements set for the framework. To illustrate the application of the framework in practice, the paper presents a method based on the framework and the results from applying that method to twelve intelligent vehicle systems in Europe. The framework is also compared to two recent frameworks applied in the safety assessment of intelligent vehicle safety systems. Copyright 2010 Elsevier Ltd. All rights reserved.

  17. Steganography in arrhythmic electrocardiogram signal.

    PubMed

    Edward Jero, S; Ramu, Palaniappan; Ramakrishnan, S

    2015-08-01

    Security and privacy of patient data are vital requirements during the exchange/storage of medical information over a communication network. Steganography methods hide patient data in a cover signal to prevent unauthenticated access during data transfer. This study evaluates the performance of ECG steganography for the secured transmission of patient data, where an abnormal ECG signal is used as the cover signal. The novelty of this work is to hide patient data in a two-dimensional matrix of an abnormal ECG signal using a Discrete Wavelet Transform and Singular Value Decomposition based steganography method. A 2D ECG is constructed according to the Tompkins QRS detection algorithm, and the missed R peaks are computed using the RR interval during 2D conversion. The abnormal ECG signals are obtained from the MIT-BIH arrhythmia database. Metrics such as Peak Signal to Noise Ratio, Percentage Residual Difference, Kullback-Leibler distance and Bit Error Rate are used to evaluate the performance of the proposed approach.
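
    A compressed sketch of the DWT-SVD embedding step (assuming PyWavelets is available); the R-peak-aligned 2D construction, the extraction side, and the evaluation metrics are omitted, and the scaling factor alpha is an arbitrary choice here:

```python
import numpy as np
import pywt

def embed(cover2d, payload, alpha=0.01):
    """DWT-SVD embedding: hide a payload vector in the singular values
    of the cover's approximation subband, then invert the transform."""
    cA, (cH, cV, cD) = pywt.dwt2(cover2d, "haar")
    U, S, Vt = np.linalg.svd(cA, full_matrices=False)
    S_marked = S + alpha * payload          # additive rule on singular values
    cA_marked = (U * S_marked) @ Vt         # rebuild the marked subband
    return pywt.idwt2((cA_marked, (cH, cV, cD)), "haar")

rng = np.random.default_rng(0)
ecg2d = rng.normal(size=(64, 64))           # stand-in for the 2-D ECG matrix
bits = rng.integers(0, 2, size=32).astype(float)  # payload, one value per S
stego = embed(ecg2d, bits)
print("RMS distortion:", float(np.sqrt(np.mean((stego - ecg2d) ** 2))))
```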

  18. Analysis of resonance response performance of C-band antenna using parasitic element.

    PubMed

    Zaman, M R; Islam, M T; Misran, N; Mandeep, J S

    2014-01-01

    An analysis of the resonance response improvement of a planar C-band (4-8 GHz) antenna using a parasitic element method is presented. The parasitic-element-based method is validated for changes in the active and parasitic antenna elements. A novel dual-band antenna for C-band applications covering 5.7 GHz and 7.6 GHz is designed and fabricated. The antenna is composed of a circular parasitic element with unequal microstrip lines on both sides and a rectangular partial ground plane. A fractional bandwidth of 13.5% has been achieved from 5.5 GHz to 6.3 GHz (WLAN band) for the lower band. The upper band covers 7.1 GHz to 8 GHz with a fractional bandwidth of 12%. A gain of 6.4 dBi is achieved at the lower frequency and 4 dBi at the upper frequency. The VSWR of the antenna is less than 2 at the resonance frequencies.

  19. A Prescription for Drug Formulary Evaluation: An Application of Price Indexes

    PubMed Central

    Glazer, Jacob; Huskamp, Haiden A.; McGuire, Thomas G.

    2012-01-01

    Existing economic approaches to the design and evaluation of health insurance do not readily apply to coverage decisions in the multi-tiered drug formularies characterizing drug coverage in private health insurance and Medicare. This paper proposes a method for evaluating a change in the value of a formulary to covered members based on the economic theory of price indexes. A formulary is cast as a set of demand-side prices, and our measure approximates the compensation (positive or negative) that would need to be paid to consumers to accept the new set of prices. The measure also incorporates any effect of the formulary change on plan drug acquisition costs and “offset effects” on non-drug services covered by the plan. Data needed to calculate formulary value are known or can be forecast by a health plan. We illustrate the method with data from a move from a two- to a three-tier formulary. PMID:23372543

  20. Least square regularized regression in sum space.

    PubMed

    Xu, Yong-Li; Chen, Di-Rong; Li, Han-Xiong; Liu, Lu

    2013-04-01

    This paper proposes a least square regularized regression algorithm in the sum space of reproducing kernel Hilbert spaces (RKHSs) for nonflat function approximation, and obtains the solution of the algorithm by solving a system of linear equations. This algorithm can approximate the low- and high-frequency components of the target function with large- and small-scale kernels, respectively. The convergence and learning rate are analyzed. We measure the complexity of the sum space by its covering number and demonstrate that the covering number can be bounded by the product of the covering numbers of the basic RKHSs. For a sum space of RKHSs with Gaussian kernels, by choosing appropriate parameters we trade off the sample error and the regularization error and obtain a polynomial learning rate, which is better than that in any single RKHS. The utility of this method is illustrated with two simulated data sets and five real-life databases.
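
    A minimal sketch of sum-space regularized least squares, using the standard identification of the sum space's reproducing kernel with the sum of the member kernels so that one linear solve yields the coefficients; the kernel scales and regularization weight are illustrative, not the paper's settings:

```python
import numpy as np

def gauss_kernel(X, Y, sigma):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_sum_space(X, y, sigmas=(5.0, 0.3), lam=1e-2):
    """Regularized least squares in the sum of two Gaussian RKHSs: the
    large scale captures the low-frequency trend, the small scale the
    high-frequency wiggle; one linear system gives the coefficients."""
    K = sum(gauss_kernel(X, X, s) for s in sigmas)   # kernel of the sum space
    return np.linalg.solve(K + lam * len(y) * np.eye(len(y)), y)

def predict(Xnew, X, c, sigmas=(5.0, 0.3)):
    return sum(gauss_kernel(Xnew, X, s) for s in sigmas) @ c

# Nonflat target: a slow sine plus a fast, smaller-amplitude sine.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 10, 120))[:, None]
y = np.sin(X[:, 0]) + 0.5 * np.sin(8 * X[:, 0]) + 0.05 * rng.normal(size=120)
c = fit_sum_space(X, y)
print(np.round(predict(X[:3], X, c), 3))   # fitted values near y[:3]
```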

  1. Preheating Water In The Covers Of Solar Water Heaters

    NASA Technical Reports Server (NTRS)

    Bhandari, Pradeep

    1995-01-01

    Solar water heaters that include glass covers over absorber plates would be redesigned to increase efficiency, according to a proposal. The redesign includes modification of the single-layer glass cover into a double-layer glass cover and the addition of plumbing so that cool water to be heated flows between the layers of the cover before entering the absorber plate.

  2. Three-dimensional illusion thermal device for location camouflage.

    PubMed

    Wang, Jing; Bi, Yanqiang; Hou, Quanwen

    2017-08-08

    Thermal metamaterials, proposed in recent years, provide a new method to manipulate the energy flux in heat transfer and have resulted in many novel thermal devices. In this paper, an illusion thermal device for location camouflage in the 3-dimensional heat conduction regime is proposed based on transformation thermodynamics. The heat source covered by the device produces a fake signal outside the device, which makes the source appear to be at another position away from its real position. The parameters required by the device are deduced, and the method is validated by simulations. A possible scheme to obtain the thermal conductivities required in the device by composing natural materials is supplied, and the influence of some practical fabrication issues on the camouflage effect is also discussed.

  3. Retrieval method of aerosol extinction coefficient profile by an integral lidar system and case study

    NASA Astrophysics Data System (ADS)

    Shan, Huihui; Zhang, Hui; Liu, Junjian; Wang, Shenhao; Ma, Xiaomin; Zhang, Lianqing; Liu, Dong; Xie, Chenbo; Tao, Zongming

    2018-02-01

    The aerosol extinction coefficient profile is an essential parameter for atmospheric radiation models, but it is difficult to retrieve the full aerosol extinction profile from the ground to the tropopause precisely with backscattering lidar, especially near the ground. A combined measurement using side-scattering, backscattering and Raman-scattering lidar is proposed to retrieve the aerosol extinction coefficient profile from the surface to the tropopause, covering a dynamic range of five orders of magnitude. The side-scattering technique solves the dead-zone and overlap problems of traditional lidar in the near range. Using the Raman scattering, the aerosol lidar ratio (extinction-to-backscatter ratio) can be obtained. The case studies in this paper show that the proposed method is reasonable and feasible.

  4. A Data Stream Model For Runoff Simulation In A Changing Environment

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Shao, J.; Zhang, H.; Wang, G.

    2017-12-01

    Runoff simulation is of great significance for water engineering design, water disaster control, and water resources planning and management in a catchment or region. A large number of methods, including concept-based process-driven models and statistic-based data-driven models, have been proposed and widely used worldwide during the past decades. Most existing models assume that the relationship between runoff and its impacting factors is stationary. However, in a changing environment (e.g., climate change, human disturbance), their relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. Specifically, the proposed model works in three steps: learning a rule set, expanding rules, and simulating. The first step initializes a rule set. When a new observation arrives, the model checks which rule covers it and then uses that rule for simulation. Meanwhile, the Page-Hinckley (PH) change detection test monitors the online simulation error of each rule; if a change is detected, the corresponding rule is removed from the rule set. In the second step, any rule that covers more than a given number of instances is expanded. In the third step, a simulation model for each leaf node is learnt with a perceptron without an activation function and is updated as each newly incoming observation is added. Taking the Fuxi River catchment as a case study, we applied the model to simulate monthly runoff in the catchment. Results show that an abrupt change is detected in 1997 using the Page-Hinckley change detection test, which is consistent with the historical record of flooding. In addition, the model achieves good simulation results with an RMSE of 13.326 and outperforms many established methods. The findings demonstrate that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
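
    A minimal sketch of the Page-Hinckley test used here to retire rules, in its one-sided form for detecting an increase in simulation error; the tolerance delta and threshold lam are illustrative settings, not those of the study:

```python
def page_hinckley(errors, delta=0.005, lam=5.0):
    """Flag the first time step at which the cumulative deviation of the
    error above its running mean exceeds the threshold lam."""
    mean, cum, cum_min = 0.0, 0.0, 0.0
    for t, e in enumerate(errors, start=1):
        mean += (e - mean) / t          # incremental mean of the error
        cum += e - mean - delta         # cumulated deviation from the mean
        cum_min = min(cum_min, cum)
        if cum - cum_min > lam:         # PH statistic crosses the threshold
            return t
    return None                         # no change detected

errors = [0.1] * 60 + [1.5] * 20        # error level jumps at step 61
print(page_hinckley(errors))            # detects the change shortly after 60
```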

  5. Estimating time-varying drug adherence using electronic records: extending the proportion of days covered (PDC) method.

    PubMed

    Bijlsma, Maarten J; Janssen, Fanny; Hak, Eelko

    2016-03-01

    Accurate measurement of drug adherence is essential for valid risk-benefit assessments of pharmacologic interventions. To date, measures of drug adherence have almost exclusively been applied over a fixed time interval and without considering changes over time. However, patients with irregular dosing behaviour commonly have a different prognosis than patients with stable dosing behaviour. We propose a method, based on the proportion of days covered (PDC) method, to measure time-varying drug adherence and drug dosage using electronic records. We compare a time-fixed PDC method with the time-varying PDC method through detailed examples and through summary statistics of 100 randomly selected patients on statin therapy. We demonstrate that the time-varying PDC method better distinguishes an irregularly dosing patient from a stably dosing patient, and we show how the time-fixed method can result in a biased estimate of drug adherence. Furthermore, the time-varying PDC method may be better suited to reducing certain types of confounding and misclassification of exposure. The time-varying PDC method may improve longitudinal and time-to-event studies that associate adherence with a clinical outcome, as well as (intervention) studies that seek to describe changes in adherence over time. Copyright © 2015 John Wiley & Sons, Ltd.
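
    A toy sketch contrasting a time-fixed PDC with a sliding-window (time-varying) PDC; the window length and the synthetic dispensing record are illustrative:

```python
import numpy as np

def pdc_windowed(covered_days, n_days, window=90):
    """PDC over a sliding window: an adherence curve rather than one number."""
    covered = np.zeros(n_days)
    covered[list(covered_days)] = 1.0
    return np.convolve(covered, np.ones(window) / window, mode="valid")

# Refills cover days 0-179, then a 60-day gap, then days 240-359.
days = set(range(0, 180)) | set(range(240, 360))
pdc_fixed = len(days) / 360.0            # time-fixed PDC: one summary value
pdc_curve = pdc_windowed(days, n_days=360)
print(round(pdc_fixed, 2))               # 0.83, hides the gap entirely
print(round(float(pdc_curve[0]), 2),     # 1.0 while dosing is stable...
      round(float(pdc_curve[180]), 2))   # ...0.33 around the gap
```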

  6. [A proposal for a new definition of excess mortality associated with influenza-epidemics and its estimation].

    PubMed

    Takahashi, M; Tango, T

    2001-05-01

    As methods for estimating excess mortality associated with influenza epidemics, the Serfling cyclical regression model and the Kawai and Fukutomi model with seasonal indices have been proposed. Excess mortality under the old definition (i.e., the number of deaths actually recorded in excess of the number expected on the basis of past seasonal experience) includes the random error for the portion of variation regarded as due to chance, and it disregards the range of random variation of mortality with the season. In this paper, we propose a new definition of excess mortality associated with influenza epidemics and a new estimation method that address these questions within the Kawai and Fukutomi framework. The new definition and estimation method were derived as follows. Factors producing variation in mortality in months with influenza epidemics may be divided into two groups: (1) influenza itself, and (2) others (practically, random variation). The range of variation of mortality due to the latter (the normal range) can be estimated from months without influenza epidemics, and excess mortality is defined as deaths above this normal range. Because the new method accounts for the variation in mortality in months without influenza epidemics, it provides reasonable estimates of excess mortality by separating out the portion due to random variation. A further characteristic is that the proposed estimate can serve as a criterion for a test of statistical significance.
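
    A heavily simplified sketch of the definition, with the normal range reduced to a mean + z·sd band over non-epidemic months; the actual method derives the band from the Kawai and Fukutomi seasonal model, so the numbers here are purely illustrative:

```python
import numpy as np

def excess_mortality(obs, baseline_samples, z=1.96):
    """Excess deaths under the proposed definition: deaths above the upper
    limit of the 'normal range' estimated from non-epidemic months."""
    mu = np.mean(baseline_samples)
    sd = np.std(baseline_samples, ddof=1)
    upper = mu + z * sd                  # upper bound of random variation
    return max(0.0, obs - upper), upper

# Toy death counts for the same month in past non-epidemic years.
baseline = [980, 1010, 995, 1005, 990]
excess, upper = excess_mortality(obs=1150, baseline_samples=baseline)
print(round(upper, 1), round(excess, 1))
```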

  7. 22 CFR 513.220 - Continuation of covered transactions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and participants shall not renew or extend covered transactions (other than no-cost time extensions... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Continuation of covered transactions. 513.220... Continuation of covered transactions. (a) Notwithstanding the debarment, suspension, proposed debarment under...

  8. Problems and criteria of quality improvement in end face mechanical seal rings through technological methods

    NASA Astrophysics Data System (ADS)

    Tarelnik, V.; Belous, A.; Antoszewski, B.; Zukov, A.

    2017-08-01

    This paper presents recommendations for the selection of materials for mechanical seal rings, together with the basic production and operating requirements. A system for the directional selection of technologies that ensure the required quality of the working surfaces of mechanical seal rings covers their entire life cycle. A mathematical frictional model is proposed as an instrument for calculating the linear and gravimetric wear of mechanical seal rings; it helps to improve the selection criteria and to identify the most rational strengthening method.

  9. Artificial testing targets with controllable blur for adaptive optics microscopes

    NASA Astrophysics Data System (ADS)

    Hattori, Masayuki; Tamada, Yosuke; Murata, Takashi; Oya, Shin; Hasebe, Mitsuyasu; Hayano, Yutaka; Kamei, Yasuhiro

    2017-08-01

    This letter proposes a method of configuring a testing target to evaluate the performance of adaptive optics microscopes. In this method, a testing slide with fluorescent beads is used to simultaneously determine the point spread function and the field of view. The point spread function is reproduced to simulate actual biological samples by etching a microstructure on the cover glass. The fabrication process is simplified to facilitate onsite preparation. The artificial tissue consists of solid materials and silicone oil and is stable for use in repeated experiments.

  10. A new collage steganographic algorithm using cartoon design

    NASA Astrophysics Data System (ADS)

    Yi, Shuang; Zhou, Yicong; Pun, Chi-Man; Chen, C. L. Philip

    2014-02-01

    Existing collage steganographic methods suffer from a low payload of embedded messages. To improve the payload while providing a high level of security protection for messages, this paper introduces a new collage steganographic algorithm using cartoon design. It embeds messages into the least significant bits (LSBs) of color cartoon objects, applies a different permutation to each object, and adds the objects to a cartoon cover image to obtain the stego image. Computer simulations and comparisons demonstrate that the proposed algorithm achieves a significantly higher message-embedding capacity than existing collage steganographic methods.
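
    A minimal sketch of the LSB step on a single cartoon object; the per-object permutations and the collage composition onto the cover image are omitted:

```python
import numpy as np

def embed_lsb(obj_pixels, bits):
    """Write message bits into the least significant bit of each 8-bit
    color value of a cartoon object (flattened channel order)."""
    flat = obj_pixels.ravel().copy()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return flat.reshape(obj_pixels.shape)

def extract_lsb(obj_pixels, n_bits):
    return obj_pixels.ravel()[:n_bits] & 1

rng = np.random.default_rng(1)
obj = rng.integers(0, 256, size=(16, 16, 3), dtype=np.uint8)  # toy RGB object
msg = rng.integers(0, 2, size=64, dtype=np.uint8)
stego_obj = embed_lsb(obj, msg)
assert np.array_equal(extract_lsb(stego_obj, 64), msg)        # round-trips
```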

  11. The Approximability of Partial Vertex Covers in Trees.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mkrtchyan, Vahan; Parekh, Ojas D.; Segev, Danny

    Motivated by applications in risk management of computational systems, we focus our attention on a special case of the partial vertex cover problem, where the underlying graph is assumed to be a tree. Here, we consider four possible versions of this setting, depending on whether vertices and edges are weighted or not. Two of these versions, where edges are assumed to be unweighted, are known to be polynomial-time solvable (Gandhi, Khuller, and Srinivasan, 2004). However, the computational complexity of this problem with weighted edges, and possibly with weighted vertices, had not been determined. The main contribution of this paper is to resolve these questions by fully characterizing which variants of partial vertex cover remain intractable in trees and which can be efficiently solved. In particular, we propose a pseudo-polynomial DP-based algorithm for the most general case, with weights on both edges and vertices, which is proven to be NP-hard. This algorithm provides a polynomial-time solution method when weights are limited to edges and, combined with additional scaling ideas, leads to an FPTAS for the general case. A secondary contribution of this work is a novel way of using centroid decompositions in trees, which could be useful in other settings as well.

  12. Generating local scale land use/cover change scenarios: case studies of high-risk mountain areas

    NASA Astrophysics Data System (ADS)

    Malek, Žiga; Glade, Thomas; Boerboom, Luc

    2014-05-01

    The relationship between land use/cover changes and their consequences for human well-being is well acknowledged and has led to greater interest among both researchers and decision makers in the driving forces and consequences of such changes. For example, removal of natural vegetation cover, or urban expansion resulting in new elements at risk, can increase hydro-meteorological risk. This is why it is necessary to study how land use/cover could evolve in the future. Emphasis should especially be given to areas experiencing, or expecting, high rates of socio-economic change. A suitable approach to address these changes is scenario development: it allows the exploration of possible futures and the corresponding environmental consequences, and it aids decision-making by enabling the analysis of possible options. Scenarios provide a creative methodology to depict possible futures resulting from existing decisions, based on different assumptions about future socio-economic development. They have been used in various disciplines and at various scales, for example for flood risk and soil erosion. Several studies have simulated future scenarios of land use/cover change with very high success rates; however, these approaches are usually tailor-made for specific case study areas and fitted to the available data. This study presents a multi-step scenario generation framework that is transferable to other local-scale case study areas and takes into account the case-study-specific consequences of land use/cover changes. Through the use of experts' and decision-makers' knowledge, we aimed to develop a framework with the following characteristics: (1) it enables the development of plausible scenarios; (2) it can overcome data inaccessibility; (3) it can address intangible and external driving forces of land use/cover change; and (4) it is transferable to other local-scale case study areas with different land use/cover change processes and consequences. To achieve this, a set of different methods is applied, including qualitative methods, such as interviews, group discussions and fuzzy cognitive mapping, to identify land use/cover change processes, their driving forces and possible consequences and to generate the final scenarios; and geospatial methods, such as GIS, geostatistics and environmental modeling in an environment for geoprocessing objects (Dinamica EGO), for the spatial allocation of these scenarios. The methods were applied in the Italian Alps and the Romanian Carpathians. Both are mountainous areas, but they differ in terms of past and likely future socio-economic development and, therefore, the consequent land use/cover changes. Whereas we focused on urban expansion due to tourism development in the Alps, we examined possible deforestation trajectories in the Carpathians. In both areas, the most significant driving forces identified were either not covered by accessible data or were characterized as intangible. With the proposed framework we were able to generate future scenarios despite these shortcomings, while enabling the transferability of the method.

  13. Remote sensing, hydrological modeling and in situ observations in snow cover research: A review

    NASA Astrophysics Data System (ADS)

    Dong, Chunyu

    2018-06-01

    Snow is an important component of the hydrological cycle. As a major part of the cryosphere, snow cover also represents a valuable terrestrial water resource. In the context of climate change, the dynamics of snow cover play a crucial role in rebalancing the global energy and water budgets. Remote sensing, hydrological modeling and in situ observations are three techniques frequently utilized for snow cover investigations. However, the uncertainties caused by systematic errors, scale gaps, and complicated snow physics, among other factors, limit the usability of these three approaches in snow studies. In this paper, an overview of the advantages, limitations and recent progress of the three methods is presented, and more effective ways to estimate snow cover properties are evaluated. The possibility of improving remotely sensed snow information using ground-based observations is discussed. As a rapidly growing source of volunteered geographic information (VGI), web-based geotagged photos have great potential to provide ground truth data for remotely sensed products and hydrological models and thus contribute to procedures for cloud removal, correction, validation, forcing and assimilation. Finally, this review proposes a synergistic framework for the future of snow cover research. This framework highlights the cross-scale integration of in situ and remotely sensed snow measurements and the assimilation of improved remote sensing data into hydrological models.

  14. Object-based land-cover classification for metropolitan Phoenix, Arizona, using aerial photography

    NASA Astrophysics Data System (ADS)

    Li, Xiaoxiao; Myint, Soe W.; Zhang, Yujia; Galletti, Chritopher; Zhang, Xiaoxiang; Turner, Billie L.

    2014-12-01

    Detailed land-cover mapping is essential for a range of research issues addressed by the sustainability and land system sciences and planning. This study uses an object-based approach to create a 1 m land-cover classification map of the expansive Phoenix metropolitan area through the use of high spatial resolution aerial photography from National Agricultural Imagery Program. It employs an expert knowledge decision rule set and incorporates the cadastral GIS vector layer as auxiliary data. The classification rule was established on a hierarchical image object network, and the properties of parcels in the vector layer were used to establish land cover types. Image segmentations were initially utilized to separate the aerial photos into parcel sized objects, and were further used for detailed land type identification within the parcels. Characteristics of image objects from contextual and geometrical aspects were used in the decision rule set to reduce the spectral limitation of the four-band aerial photography. Classification results include 12 land-cover classes and subclasses that may be assessed from the sub-parcel to the landscape scales, facilitating examination of scale dynamics. The proposed object-based classification method provides robust results, uses minimal and readily available ancillary data, and reduces computational time.

  15. 77 FR 48733 - Transitional Program for Covered Business Method Patents-Definitions of Covered Business Method...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Office 37 CFR Part 42 Transitional Program for Covered Business Method Patents--Definitions of Covered Business Method Patent and Technological Invention; Final Rule. Federal Register / Vol. 77, No. 157... Business Method Patents-- Definitions of Covered Business Method Patent and Technological Invention AGENCY...

  16. 75 FR 54232 - Proposed Collection; Comment Request for Report of Covered Pharmaceutical Manufacturers and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... Report of Covered Pharmaceutical Manufacturers and Importers (Form-8947) AGENCY: Internal Revenue Service...)). Currently, the IRS is soliciting comments concerning Form-8947, Report of Covered Pharmaceutical...: Title: Report of Covered Pharmaceutical Manufacturers and Importers. OMB Number: 1545-XXXX. Form Number...

  17. Impacts of heterogeneous organic matter on phenanthrene sorption--Different soil and sediment samples

    USGS Publications Warehouse

    Karapanagioti, Hrissi K.; Childs, Jeffrey; Sabatini, David A.

    2001-01-01

    Organic petrography has been proposed as a tool for characterizing the heterogeneous organic matter present in soil and sediment samples. A new simplified method is proposed as a quantitative means of interpreting the observed sorption behavior of phenanthrene on different soils and sediments based on their organic petrographic characterization. The method is tested under single-solute conditions at a phenanthrene concentration of 1 μg/L. Since the opaque organic matter fraction dominates the sorption process, we propose that quantifying this fraction allows one to interpret the organic-content-normalized sorption distribution coefficient (Koc) values for a sample. While this method was developed and tested for various samples within the same aquifer, in the current study it is validated for soil and sediment samples from different sites that cover a wide range of organic matter origin, age, and organic content. All 10 soil and sediment samples studied had log Koc values for the opaque particles between 5.6 and 6.8. This range of Koc values illustrates the heterogeneity of opaque particles between sites and geological formations, and thus the need to characterize the opaque fraction of materials on a site-by-site basis.

  18. On Quantile Regression in Reproducing Kernel Hilbert Spaces with Data Sparsity Constraint

    PubMed Central

    Zhang, Chong; Liu, Yufeng; Wu, Yichao

    2015-01-01

    For spline regressions, it is well known that the choice of knots is crucial for the performance of the estimator. As a general learning framework covering the smoothing splines, learning in a Reproducing Kernel Hilbert Space (RKHS) has a similar issue. However, the selection of training data points for kernel functions in the RKHS representation has not been carefully studied in the literature. In this paper we study quantile regression as an example of learning in an RKHS. In this case, the regular squared norm penalty does not perform training data selection. We propose a data sparsity constraint that imposes thresholding on the kernel function coefficients to achieve a sparse kernel function representation. We demonstrate that the proposed data sparsity method can have competitive prediction performance in certain situations, and performance comparable to that of the traditional squared norm penalty in other cases. Therefore, the data sparsity method can serve as a competitive alternative to the squared norm penalty method. Some theoretical properties of our proposed method using the data sparsity constraint are obtained. Both simulated and real data sets are used to demonstrate the usefulness of our data sparsity constraint. PMID:27134575

  19. Helical cone beam CT with an asymmetrical detector.

    PubMed

    Zamyatin, Alexander A; Taguchi, Katsuyuki; Silver, Michael D

    2005-10-01

    If a multislice or other area detector is shifted to one side to cover a larger field of view, then the data are truncated on one side. We propose a method to restore the missing data in helical cone-beam acquisitions that uses measured data on the longer side of the asymmetric detector array. The method is based on the idea of complementary rays, which is well known in fan beam geometry; in this paper we extend this concept to the cone-beam case. Different cases of complementary data coverage and dependence on the helical pitch are considered. The proposed method is used in our prototype 16-row CT scanner with an asymmetric detector and a 700 mm field of view. For evaluation we used scanned body phantom data and computer-simulated data. To simulate asymmetric truncation, the full, symmetric datasets were truncated by dropping either 22.5% or 45% from one side of the detector. Reconstructed images from the prototype scanner with the asymmetrical detector show excellent image quality in the extended field of view. The proposed method allows flexible helical pitch selection and can be used with overscan, short-scan, and super-short-scan reconstructions.
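
    The fan-beam complementarity that the method extends can be written in two lines; note that in a helical scan the complementary ray is acquired at a different table position, and the paper's handling of that z-offset is not reproduced here:

```python
import numpy as np

def complementary_ray(beta, gamma):
    """Fan-beam complementary ray: the line sampled at source angle beta
    and fan angle gamma is sampled again at (beta + pi + 2*gamma, -gamma)."""
    return (beta + np.pi + 2.0 * gamma) % (2.0 * np.pi), -gamma

# A ray truncated on the short side of the asymmetric detector can be
# restored from its complement measured on the longer side.
beta2, gamma2 = complementary_ray(beta=0.3, gamma=0.1)
print(round(beta2, 4), gamma2)   # 3.6416 -0.1
```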

  20. Tropical land use land cover mapping in Pará (Brazil) using discriminative Markov random fields and multi-temporal TerraSAR-X data

    NASA Astrophysics Data System (ADS)

    Hagensieker, Ron; Roscher, Ribana; Rosentreter, Johannes; Jakimow, Benjamin; Waske, Björn

    2017-12-01

    Remote sensing satellite data offer the unique possibility to map land use land cover transformations by providing spatially explicit information. However, detection of short-term processes and land use patterns of high spatial-temporal variability is a challenging task. We present a novel framework using multi-temporal TerraSAR-X data and machine learning techniques, namely discriminative Markov random fields with spatio-temporal priors, and import vector machines, in order to advance the mapping of land cover characterized by short-term changes. Our study region covers a current deforestation frontier in the Brazilian state Pará with land cover dominated by primary forests, different types of pasture land and secondary vegetation, and land use dominated by short-term processes such as slash-and-burn activities. The data set comprises multi-temporal TerraSAR-X imagery acquired over the course of the 2014 dry season, as well as optical data (RapidEye, Landsat) for reference. Results show that land use land cover is reliably mapped, resulting in spatially adjusted overall accuracies of up to 79% in a five class setting, yet limitations for the differentiation of different pasture types remain. The proposed method is applicable on multi-temporal data sets, and constitutes a feasible approach to map land use land cover in regions that are affected by high-frequent temporal changes.

  1. A Prognostic Methodology for Precipitation Phase Detection using GPM Microwave Observations —With Focus on Snow Cover

    NASA Astrophysics Data System (ADS)

    Takbiri, Z.; Ebtehaj, A.; Foufoula-Georgiou, E.; Kirstetter, P.

    2017-12-01

    Improving satellite retrieval of precipitation requires an increased understanding of its passive microwave signature over different land surfaces. Passive microwave signals over snow-covered surfaces are notoriously difficult to interpret because they record both emission from the land below and absorption/scattering from the liquid/ice crystals. Using data from the Global Precipitation Measurement (GPM) core satellite, we demonstrate that the microwave brightness temperatures of rain and snowfall shift from a scattering to an emission regime from summer to winter, due to expansion of the less emissive snow cover underneath. We present evidence that the combination of low-frequency (10-19 GHz) and high-frequency (89-166 GHz) channels provides the maximum amount of information for snowfall detection. The study also examines a prognostic nearest-neighbor matching method for the detection of precipitation and its phase from passive microwave observations using GPM data. The nearest-neighbor search uses a weighted Euclidean distance metric over an a priori database populated with coincident GPM radiometer and radar data as well as ancillary snow cover fraction. The results demonstrate the prognostic capabilities of the proposed method in detecting terrestrial snowfall. At the global scale, the average probability of hit reaches 0.80 while the probability of false alarm remains below 0.10. Surprisingly, the results show that snow cover may help to better detect precipitation, as the detection rate of terrestrial precipitation increases from 0.75 (no snow cover) to 0.84 (snow-covered surfaces). For solid precipitation, this increase in detection rate is larger than for its liquid counterpart by almost 8%. The main reasons are found to be the multi-frequency capability of the nearest-neighbor matching, which can properly isolate the atmospheric signal from the background emission, and the fact that precipitation can exhibit an emission-like (warmer than the surface) signature over fresh snow cover.
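
    A toy sketch of the weighted-Euclidean nearest-neighbor detector; in the real method the database holds coincident GPM radiometer and radar samples plus snow-cover fraction, whereas the database, labels, and channel weights below are synthetic stand-ins:

```python
import numpy as np

def nn_detect(tb_obs, db_tb, db_label, weights, k=15):
    """Prognostic nearest-neighbor detection: match an observed vector of
    brightness temperatures against an a priori database with a weighted
    Euclidean distance, then take a majority vote among the k closest."""
    d2 = ((db_tb - tb_obs) ** 2 * weights).sum(axis=1)
    nearest = np.argsort(d2)[:k]
    return db_label[nearest].mean() > 0.5     # True: precipitation detected

rng = np.random.default_rng(0)
db_tb = rng.normal(260, 15, size=(5000, 5))   # toy 5-channel Tb database
db_label = (db_tb[:, -1] < 250).astype(int)   # toy 'precipitating' flag
w = np.array([1.0, 1.0, 0.5, 2.0, 2.0])       # assumed channel weights
print(nn_detect(db_tb[0], db_tb, db_label, w))
```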

  2. Shear Lag in Box Beams Methods of Analysis and Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul; Chiarito, Patrick T

    1942-01-01

    The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

  3. Benthic Habitat Mapping by Combining Lyzenga’s Optical Model and Relative Water Depth Model in Lintea Island, Southeast Sulawesi

    NASA Astrophysics Data System (ADS)

    Hafizt, M.; Manessa, M. D. M.; Adi, N. S.; Prayudha, B.

    2017-12-01

    Benthic habitat mapping using satellite data is a challenging task for practitioners and academics, as benthic objects are covered by a light-attenuating water column that obscures object discrimination. One common method to reduce this water-column effect is to use a depth-invariant index (DII) image. However, applying the correction in shallow coastal areas is challenging, as a dark object such as seagrass can have a very low pixel value, preventing its reliable identification and classification. This limitation can be overcome by applying the classification process separately to areas at different water depth levels. The water depth level can be extracted from satellite imagery using a Relative Water Depth Index (RWDI). This study proposes a new approach to improve mapping accuracy, particularly for dark benthic objects, by combining the DII of Lyzenga's water-column correction method with the RWDI of Stumpf's method. The research was conducted on Lintea Island, which has highly varied benthic cover, using Sentinel-2A imagery. To assess the effectiveness of the proposed new approach for benthic habitat mapping, two different classification procedures were implemented. The first is the commonly applied method for benthic habitat mapping, in which the DII image is used as input data for classifying the entire coastal area regardless of depth variation. The second is the proposed new approach, which begins by separating the study area into shallow and deep waters using the RWDI image. The shallow area is then classified using the sunglint-corrected image as input data, and the deep area is classified using the DII image as input data. The final classification maps of these two areas are merged into a single benthic habitat map. A confusion matrix was then applied to evaluate the mapping accuracy of the final map. The results show that the new proposed mapping approach can map all benthic objects across all depth ranges and achieves better accuracy than the classification map produced using the DII alone.
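
    A minimal sketch of the depth-invariant index that the DII step relies on, with the attenuation ratio ki/kj estimated from pixels of a single bottom type at varying depth (Lyzenga's standard recipe); the synthetic radiances merely verify that the depth effect cancels:

```python
import numpy as np

def depth_invariant_index(Li, Lj, Lsi, Lsj):
    """Lyzenga's depth-invariant index for a band pair (i, j):
    DII = X_i - (ki/kj) * X_j with X = ln(L - L_deep), the attenuation
    ratio ki/kj estimated from single-bottom-type calibration pixels."""
    Xi, Xj = np.log(Li - Lsi), np.log(Lj - Lsj)
    a = (np.var(Xi) - np.var(Xj)) / (2.0 * np.cov(Xi, Xj, bias=True)[0, 1])
    k_ratio = a + np.sqrt(a * a + 1.0)
    return Xi - k_ratio * Xj

# Synthetic radiances over uniform sand at varying depth: the index
# should be (nearly) constant once the depth effect is removed.
rng = np.random.default_rng(0)
depth = rng.uniform(1.0, 10.0, 200)
Li = 20.0 + 80.0 * np.exp(-2.0 * 0.10 * depth)   # band i, k_i = 0.10
Lj = 15.0 + 90.0 * np.exp(-2.0 * 0.05 * depth)   # band j, k_j = 0.05
dii = depth_invariant_index(Li, Lj, Lsi=20.0, Lsj=15.0)
print(round(float(dii.std()), 6))                # ~0: depth-invariant
```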

  4. Improving snow fraction spatio-temporal continuity using a combination of MODIS and Fengyun-2 satellites over China

    NASA Astrophysics Data System (ADS)

    Jiang, L.; Wang, G.

    2017-12-01

    Snow cover is one of the key elements in investigations of weather, climate change, water resources, and snow hazards. Satellite observations from optical sensors provide the ability to map snow cover by discriminating snow from other surface features and from cloud. MODIS provides maximum snow cover extent using 8-day composites in order to reduce the impact of cloud obscuration. However, snow cover maps are often required at temporal scales of less than one day, especially in the case of disasters. Geostationary satellites provide measurements at much higher temporal resolution (typically every 15 minutes to one hour), which has great potential to reduce the cloud cover problem and to observe the ground surface for identifying snow. The method proposed in this work addresses how to exploit the advantages of polar-orbiting and geostationary optical sensors to accurately map snow cover without data gaps due to cloud. The FY-2 geostationary satellites provide high temporal resolution observations but lack some spectral bands essential for snow cover monitoring, such as the 1.6 μm band. Based on our recent work (Wang et al., 2017), we improved FY-2/VISSR fractional snow cover estimation with a linear spectral unmixing analysis method. The linear approach is applied to the reflectance observed in each hourly FY-2 image to calculate the pixel-wise snow cover fraction. The composition of daily fractional snow cover uses the sun zenith angle, with the snow fraction observed under the lowest sun zenith angle considered the most confident result. The FY-2/VISSR fractional snow cover map has less cloud because multi-temporal snow maps are composited within a single day. In order to obtain an accurate, cloud-reduced fractional snow cover map, the MODIS and FY-2/VISSR daily snow fraction maps are blended together. With the combination of FY-2E/VISSR and MODIS, some cloud still remains in the daily snow fraction map. The combined snow fraction map is therefore temporally reconstructed using the MATLAB Piecewise Cubic Hermite Interpolating Polynomial (PCHIP) function to derive a complete daily cloud-free snow cover map under all sky conditions.
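
    The final gap-filling step is shape-preserving interpolation through the cloud-free days; the study used MATLAB's PCHIP function, and SciPy's equivalent serves for a sketch:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def fill_cloud_gaps(days, snow_frac):
    """Fit a shape-preserving PCHIP curve through the cloud-free
    observations and evaluate it on every day, clipped to [0, 1]."""
    clear = ~np.isnan(snow_frac)                 # NaN marks cloudy days
    f = PchipInterpolator(days[clear], snow_frac[clear])
    return np.clip(f(days), 0.0, 1.0)

days = np.arange(20.0)
sf = np.array([1.0, 1.0, np.nan, np.nan, 0.9, 0.8, np.nan, 0.6, 0.5, np.nan,
               np.nan, 0.3, 0.2, np.nan, 0.1, 0.0, np.nan, 0.0, 0.0, 0.0])
print(np.round(fill_cloud_gaps(days, sf), 2))    # melt-out curve, no gaps
```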

  5. An in-mold packaging process for plastic fluidic devices.

    PubMed

    Yoo, Y E; Lee, K H; Je, T J; Choi, D S; Kim, S K

    2011-01-01

    Micro- or nanofluidic devices contain many channel shapes to deliver chemical solutions, body fluids or other fluids. The channels in these devices must be covered to prevent the fluids from overflowing or leaking. A typical method of fabricating an enclosed channel is to bond or weld a cover plate to a channel plate. This solid-to-solid bonding process, however, takes a considerable amount of time in mass production. In this study, a new process for molding a cover layer that can enclose open micro- or nanochannels without solid-to-solid bonding is proposed and its feasibility is evaluated. First, based on the design of a model microchannel, a brass microchannel master core was machined and a plastic microchannel platform was injection-molded. Using this molded platform, a series of experiments was performed on four process and mold design parameters. Feasible conditions were successfully found for enclosing the channels without filling the microchannels during injection molding of a cover layer over the plastic microchannel platform. In addition, the bond strength and seal performance were evaluated in comparison with those achieved by conventional bonding or welding processes.

  6. Linguistic steganography on Twitter: hierarchical language modeling with manual interaction

    NASA Astrophysics Data System (ADS)

    Wilson, Alex; Blunsom, Phil; Ker, Andrew D.

    2014-02-01

    This work proposes a natural language stegosystem for Twitter, modifying tweets as they are written to hide 4 bits of payload per tweet, which is a greater payload than previous systems have achieved. The system, CoverTweet, includes novel components, as well as some already developed in the literature. We believe that the task of transforming covers during embedding is equivalent to unilingual machine translation (paraphrasing), and we use this equivalence to define a distortion measure based on statistical machine translation methods. The system incorporates this measure of distortion to rank possible tweet paraphrases, using a hierarchical language model; we use human interaction as a second distortion measure to pick the best. The hierarchical language model is designed to model the specific language of the covers, which in this setting is the language of the Twitter user who is embedding. This is a change from previous work, where general-purpose language models have been used. We evaluate our system by testing the output against human judges, and show that humans are unable to distinguish stego tweets from cover tweets any better than random guessing.

  7. Ship detection using STFT sea background statistical modeling for large-scale oceansat remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan

    2018-03-01

    Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuation can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the paper divides the large-scale image into small sub-blocks, and a 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and a clear difference in characteristics between sea background and non-sea background is found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on this 2-D STFT spectrum modeling is proposed. The experimental results show that the proposed algorithm can detect ship targets with a high recall rate and a low missing rate.

  8. Data-Driven Engineering of Social Dynamics: Pattern Matching and Profit Maximization

    PubMed Central

    Peng, Huan-Kai; Lee, Hao-Chih; Pan, Jia-Yu; Marculescu, Radu

    2016-01-01

    In this paper, we define a new problem related to social media, namely, the data-driven engineering of social dynamics. More precisely, given a set of observations from the past, we aim at finding the best short-term intervention that can lead to predefined long-term outcomes. Toward this end, we propose a general formulation that covers two useful engineering tasks as special cases, namely, pattern matching and profit maximization. By incorporating a deep learning model, we derive a solution using convex relaxation and quadratic-programming transformation. Moreover, we propose a data-driven evaluation method in place of the expensive field experiments. Using a Twitter dataset, we demonstrate the effectiveness of our dynamics engineering approach for both pattern matching and profit maximization, and study the multifaceted interplay among several important factors of dynamics engineering, such as solution validity, pattern-matching accuracy, and intervention cost. Finally, the method we propose is general enough to work with multi-dimensional time series, so it can potentially be used in many other applications. PMID:26771830

  9. Data-Driven Engineering of Social Dynamics: Pattern Matching and Profit Maximization.

    PubMed

    Peng, Huan-Kai; Lee, Hao-Chih; Pan, Jia-Yu; Marculescu, Radu

    2016-01-01

    In this paper, we define a new problem related to social media, namely, the data-driven engineering of social dynamics. More precisely, given a set of observations from the past, we aim at finding the best short-term intervention that can lead to predefined long-term outcomes. Toward this end, we propose a general formulation that covers two useful engineering tasks as special cases, namely, pattern matching and profit maximization. By incorporating a deep learning model, we derive a solution using convex relaxation and quadratic-programming transformation. Moreover, we propose a data-driven evaluation method in place of the expensive field experiments. Using a Twitter dataset, we demonstrate the effectiveness of our dynamics engineering approach for both pattern matching and profit maximization, and study the multifaceted interplay among several important factors of dynamics engineering, such as solution validity, pattern-matching accuracy, and intervention cost. Finally, the method we propose is general enough to work with multi-dimensional time series, so it can potentially be used in many other applications.

  10. Raster Vs. Point Cloud LiDAR Data Classification

    NASA Astrophysics Data System (ADS)

    El-Ashmawy, N.; Shaker, A.

    2014-09-01

    Airborne laser scanning systems with light detection and ranging (LiDAR) technology are among the fastest and most accurate 3D point data acquisition techniques. Generating accurate digital terrain and/or surface models (DTM/DSM) is the main application of LiDAR range data. Recently, LiDAR range and intensity data have been used for land cover classification applications. Range and intensity (the strength of the backscattered signals measured by the LiDAR system) are affected by the flying height, the ground elevation, the scanning angle and the physical characteristics of the object surfaces. These effects may lead to an uneven distribution of the point cloud or to gaps that can affect the classification process. Researchers have investigated the conversion of LiDAR range point data to raster images for terrain modelling. Interpolation techniques have been used to achieve the best representation of surfaces and to fill the gaps between the LiDAR footprints. Interpolation methods have also been investigated for generating LiDAR range and intensity image data for land cover classification applications. In this paper, a different approach is followed for classifying LiDAR data (range and intensity) for land cover mapping. The methodology relies on classifying the point cloud data based on range and intensity and then converting the classified points into a raster image. Gaps in the data are filled based on the classes of the nearest neighbours. Land cover maps are produced using two approaches: (a) the conventional raster image data based on point interpolation; and (b) the proposed point data classification. A study area covering an urban district in Burnaby, British Columbia, Canada, is selected to compare the results of the two approaches. Five land cover classes can be distinguished in that area: buildings, roads and parking areas, trees, low vegetation (grass), and bare soil. The results show that an improvement of around 10% in the classification results can be achieved by using the proposed approach.

  11. A study of high density bit transition requirements versus the effects on BCH error correcting coding

    NASA Technical Reports Server (NTRS)

    Ingels, F.; Schoggen, W. O.

    1981-01-01

    Several methods for increasing bit transition densities in a data stream are summarized, discussed in detail, and compared against constraints imposed by the 2 MHz data link of the space shuttle high rate multiplexer unit. These methods include the use of alternate pulse code modulation waveforms, data stream modification by insertion, alternate bit inversion, differential encoding, error encoding, and the use of bit scramblers. The pseudo-random cover sequence generator was chosen for application to the 2 MHz data link of the space shuttle high rate multiplexer unit. This method is fully analyzed and a design implementation is proposed.
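
    A toy sketch of the chosen approach: XOR the data stream with a maximal-length LFSR cover sequence to force bit transitions; the 7-bit polynomial here is an arbitrary primitive choice, not the generator analyzed in the study:

```python
def lfsr_cover_sequence(n_bits, taps=(7, 6), state=0x7F, width=7):
    """Pseudo-random cover sequence from a linear feedback shift register
    (x^7 + x^6 + 1, a primitive polynomial, gives a maximal-length run)."""
    out = []
    for _ in range(n_bits):
        out.append(state & 1)                    # emit the low bit
        fb = 0
        for t in taps:                           # XOR the tapped stages
            fb ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (fb << (width - 1))
    return out

data = [0] * 16                                  # worst case: no transitions
cover = lfsr_cover_sequence(len(data))
scrambled = [d ^ c for d, c in zip(data, cover)] # receiver XORs again to undo
print(sum(a != b for a, b in zip(scrambled, scrambled[1:])), "transitions")
```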

  12. Improving spectral resolution in spatial encoding dimension of single-scan nuclear magnetic resonance 2D spin echo correlated spectroscopy

    NASA Astrophysics Data System (ADS)

    Lin, Liangjie; Wei, Zhiliang; Yang, Jian; Lin, Yanqin; Chen, Zhong

    2014-11-01

    The spatial encoding technique can be used to accelerate the acquisition of multi-dimensional nuclear magnetic resonance spectra. However, with this technique, we have to make trade-offs between the spectral width and the resolution in the spatial encoding dimension (F1 dimension), resulting in the difficulty of covering large spectral widths while preserving acceptable resolutions for spatial encoding spectra. In this study, a selective shifting method is proposed to overcome the aforementioned drawback. This method is capable of narrowing spectral widths and improving spectral resolutions in spatial encoding dimensions by selectively shifting certain peaks in spectra of the ultrafast version of spin echo correlated spectroscopy (UFSECSY). This method can also serve as a powerful tool to obtain high-resolution correlated spectra in inhomogeneous magnetic fields for its resistance to any inhomogeneity in the F1 dimension inherited from UFSECSY. Theoretical derivations and experiments have been carried out to demonstrate performances of the proposed method. Results show that the spectral width in spatial encoding dimension can be reduced by shortening distances between cross peaks and axial peaks with the proposed method and the expected resolution improvement can be achieved. Finally, the shifting-absent spectrum can be recovered readily by post-processing.

  13. Estimating fractional vegetation cover and the vegetation index of bare soil and highly dense vegetation with a physically based method

    NASA Astrophysics Data System (ADS)

    Song, Wanjuan; Mu, Xihan; Ruan, Gaiyan; Gao, Zhan; Li, Linyuan; Yan, Guangjian

    2017-06-01

    The normalized difference vegetation index (NDVI) of highly dense vegetation (NDVIv) and of bare soil (NDVIs), identified as the key parameters for fractional vegetation cover (FVC) estimation, are usually obtained with empirical statistical methods. However, it is often difficult to obtain reasonable values of NDVIv and NDVIs at a coarse resolution (e.g., 1 km), or in arid, semiarid, and evergreen areas. The uncertainty of estimated NDVIs and NDVIv can cause substantial errors in FVC estimation when a simple linear mixture model is used. To address this problem, this paper proposes a physically based method. The leaf area index (LAI) and directional NDVI are introduced into a gap fraction model and a linear mixture model for FVC estimation to calculate NDVIv and NDVIs. The model incorporates the Moderate Resolution Imaging Spectroradiometer (MODIS) Bidirectional Reflectance Distribution Function (BRDF) model parameters product (MCD43B1) and the LAI product, which are convenient to acquire. Two types of evaluation experiments are designed: 1) with data simulated by a canopy radiative transfer model, and 2) with satellite observations. The root-mean-square deviation (RMSD) for the simulated data is less than 0.117, depending on the type of noise added to the data. In the real data experiment, the RMSD is 0.127 for cropland, 0.075 for grassland, and 0.107 for forest. The experimental areas respectively lack fully vegetated and non-vegetated pixels at 1 km resolution. Consequently, a relatively large uncertainty is found when using the statistical methods, with RMSD ranging from 0.110 to 0.363 on the real data. The proposed method is convenient for producing NDVIv and NDVIs maps for FVC estimation at regional and global scales.
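
    Once NDVIv and NDVIs are available, the linear mixture step reduces to a one-line computation; a minimal sketch, with the endmember values assumed as inputs (e.g. from the paper's gap-fraction derivation):

        import numpy as np

        def fvc_linear_mixture(ndvi, ndvi_s, ndvi_v):
            """FVC from the linear mixture model: pixel NDVI is treated as a
            weighted average of the bare-soil (ndvi_s) and fully vegetated
            (ndvi_v) endmembers, so the vegetated fraction is the weight."""
            fvc = (ndvi - ndvi_s) / (ndvi_v - ndvi_s)
            return np.clip(fvc, 0.0, 1.0)   # physical bounds for a fraction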

  14. Using Tensor Completion Method to Achieving Better Coverage of Traffic State Estimation from Sparse Floating Car Data

    PubMed Central

    Ran, Bin; Song, Li; Cheng, Yang; Tan, Huachun

    2016-01-01

    Traffic state estimation from the floating car system is a challenging problem. The low penetration rate and random distribution mean that the available floating car samples usually cover only part of the space and time points of the road network. To obtain a wide range of traffic states from the floating car system, many methods have been proposed to estimate the traffic state for the uncovered links. However, these methods cannot provide the traffic state of the entire road network. In this paper, traffic state estimation is transformed into a missing data imputation problem, and a tensor completion framework is proposed to estimate the missing traffic states. A tensor is constructed to model the traffic state, in which observed entries are derived directly from the floating car system and unobserved traffic states are modeled as missing entries of the constructed tensor. The constructed traffic state tensor can represent the spatial and temporal correlations of traffic data and encode the multi-way properties of the traffic state. The advantage of the proposed approach is that it can fully mine and utilize the multi-dimensional inherent correlations of the traffic state. We tested the proposed approach on a well-calibrated simulation network. Experimental results demonstrated that the proposed approach yields reliable traffic state estimation from very sparse floating car data, particularly when the floating car penetration rate is below 1%. PMID:27448326

  15. Using Tensor Completion Method to Achieving Better Coverage of Traffic State Estimation from Sparse Floating Car Data.

    PubMed

    Ran, Bin; Song, Li; Zhang, Jian; Cheng, Yang; Tan, Huachun

    2016-01-01

    Traffic state estimation from the floating car system is a challenging problem. The low penetration rate and random distribution mean that the available floating car samples usually cover only part of the space and time points of the road network. To obtain a wide range of traffic states from the floating car system, many methods have been proposed to estimate the traffic state for the uncovered links. However, these methods cannot provide the traffic state of the entire road network. In this paper, traffic state estimation is transformed into a missing data imputation problem, and a tensor completion framework is proposed to estimate the missing traffic states. A tensor is constructed to model the traffic state, in which observed entries are derived directly from the floating car system and unobserved traffic states are modeled as missing entries of the constructed tensor. The constructed traffic state tensor can represent the spatial and temporal correlations of traffic data and encode the multi-way properties of the traffic state. The advantage of the proposed approach is that it can fully mine and utilize the multi-dimensional inherent correlations of the traffic state. We tested the proposed approach on a well-calibrated simulation network. Experimental results demonstrated that the proposed approach yields reliable traffic state estimation from very sparse floating car data, particularly when the floating car penetration rate is below 1%.
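
    The completion idea is easiest to see in its matrix form; the paper uses a full tensor completion framework, but the same principle (alternate a low-rank approximation with re-imposing the observed floating-car entries) appears in this hedged SoftImpute-style sketch:

        import numpy as np

        def soft_impute(X, mask, lam=1.0, n_iter=100):
            """Low-rank completion by iterative singular-value soft-thresholding.
            X: observed matrix (arbitrary values where unobserved);
            mask: boolean array, True where an entry was observed."""
            Z = np.where(mask, X, 0.0)
            for _ in range(n_iter):
                U, s, Vt = np.linalg.svd(Z, full_matrices=False)
                Z_low = U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt
                Z = np.where(mask, X, Z_low)   # keep observed, impute missing
            return Z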

  16. Hybrid modeling method for a DEP based particle manipulation.

    PubMed

    Miled, Mohamed Amine; Gagne, Antoine; Sawan, Mohamad

    2013-01-30

    In this paper, a new modeling approach for dielectrophoresis (DEP) based particle manipulation is presented. The proposed method fills in missing links in finite element modeling between the multiphysics simulation and the biological behavior. This technique is among the first steps toward developing a more complex platform covering several types of manipulation, such as magnetophoresis and optical methods. The modeling approach is based on a hybrid interface using both ANSYS and MATLAB to link the propagation of the electric field in the micro-channel to the particle motion. ANSYS is used to simulate the electrical propagation, while MATLAB interprets the results to calculate cell displacement and sends the new information to ANSYS for another iteration. The beta version of the proposed technique takes into account the particle's shape, weight, and electrical properties. The first results obtained are consistent with experimental results.

  17. Hybrid Modeling Method for a DEP Based Particle Manipulation

    PubMed Central

    Miled, Mohamed Amine; Gagne, Antoine; Sawan, Mohamad

    2013-01-01

    In this paper, a new modeling approach for dielectrophoresis (DEP) based particle manipulation is presented. The proposed method fills in missing links in finite element modeling between the multiphysics simulation and the biological behavior. This technique is among the first steps toward developing a more complex platform covering several types of manipulation, such as magnetophoresis and optical methods. The modeling approach is based on a hybrid interface using both ANSYS and MATLAB to link the propagation of the electric field in the micro-channel to the particle motion. ANSYS is used to simulate the electrical propagation, while MATLAB interprets the results to calculate cell displacement and sends the new information to ANSYS for another iteration. The beta version of the proposed technique takes into account the particle's shape, weight, and electrical properties. The first results obtained are consistent with experimental results. PMID:23364197
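
    A toy stand-in for the ANSYS/MATLAB co-simulation loop is sketched below; the analytic field gradient, Clausius-Mossotti factor, and time step are assumed placeholder values, since in the paper the field is re-computed by ANSYS at each iteration:

        import numpy as np

        eps_m = 78.0 * 8.854e-12   # medium permittivity (water-like), F/m
        radius = 5e-6              # particle radius, m
        re_cm = 0.5                # assumed Re(Clausius-Mossotti), pDEP
        eta = 1e-3                 # medium viscosity, Pa*s

        def grad_E2(x):
            """Assumed analytic gradient of |E|^2 (field maximum at x = 0);
            in the hybrid interface this comes from the ANSYS field solution."""
            return -2e15 * x

        x, dt = 50e-6, 1e-3        # initial position (m), time step (s)
        for _ in range(1000):
            f_dep = 2 * np.pi * eps_m * radius**3 * re_cm * grad_E2(x)
            v = f_dep / (6 * np.pi * eta * radius)   # Stokes drag velocity
            x += v * dt                              # forward Euler update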

  18. Land cover mapping at sub-pixel scales

    NASA Astrophysics Data System (ADS)

    Makido, Yasuyo Kato

    One of the biggest drawbacks of land cover mapping from remotely sensed images relates to spatial resolution, which determines the level of spatial detail depicted in an image. Fine spatial resolution images from satellite sensors such as IKONOS and QuickBird are now available. However, these images are not suitable for large-area studies, since a single image covers only a small area and is therefore costly to use at scale. Much research has focused on extracting land cover types at the sub-pixel scale, but little research has been conducted on the spatial allocation of land cover types within a pixel. This study is devoted to the development of new algorithms for predicting land cover distribution from remote sensing imagery at the sub-pixel level. The "pixel-swapping" optimization algorithm, proposed by Atkinson for predicting sub-pixel land cover distribution, is investigated in this study. Two limitations of this method, the arbitrary spatial range value and the arbitrary exponential model of spatial autocorrelation, are assessed. Various weighting functions, as alternatives to the exponential model, are evaluated in order to derive the optimum weighting function. Two different simulation models were employed to develop spatially autocorrelated binary class maps. In all tested models (Gaussian, exponential, and IDW), the pixel-swapping method improved classification accuracy compared with the initial random allocation of sub-pixels. However, the results suggested that equal weights could be used to increase accuracy and sub-pixel spatial autocorrelation instead of these more complex models of spatial structure. New algorithms for modeling the spatial distribution of multiple land cover classes at sub-pixel scales are developed and evaluated. Three methods are examined: sequential categorical swapping, simultaneous categorical swapping, and simulated annealing. These three methods are applied to classified Landsat ETM+ data resampled to 210 meters. The results suggest that the simultaneous method is the optimum method in terms of accuracy and computation time. The case study employs remote sensing imagery at two sites: tropical forest in Brazil and a temperate mixed land-cover mosaic in East China. Sub-areas of both sites are used to examine how the characteristics of the landscape affect the performance of the optimum technique. Three measurements, Moran's I, mean patch size (MPS), and patch size standard deviation (STDEV), are used to characterize the landscape. All results suggest that this technique can increase classification accuracy more than traditional hard classification. The methods developed in this study can benefit researchers who employ coarse remote sensing imagery but are interested in detailed landscape information. In many cases, a satellite sensor that provides large spatial coverage has insufficient spatial detail to identify landscape patterns. Application of the super-resolution technique described in this dissertation could potentially solve this problem by providing detailed land cover predictions from coarse-resolution satellite sensor imagery.
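
    A minimal sketch of single-class pixel swapping with the equal neighbour weights the study found competitive (window radius and iteration count are assumptions, and np.roll gives wrap-around edges, which is acceptable only for illustration):

        import numpy as np

        def pixel_swap(binary, n_iter=200, radius=3):
            """Swap the least attractive '1' with the most attractive '0';
            each swap preserves the class proportion of the coarse pixel."""
            grid = binary.copy()
            for _ in range(n_iter):
                attract = np.zeros(grid.shape, dtype=float)
                for dr in range(-radius, radius + 1):
                    for dc in range(-radius, radius + 1):
                        if dr or dc:    # equal weight for every neighbour
                            attract += np.roll(np.roll(grid, dr, 0), dc, 1)
                ones = np.argwhere(grid == 1)
                zeros = np.argwhere(grid == 0)
                worst_one = ones[np.argmin(attract[tuple(ones.T)])]
                best_zero = zeros[np.argmax(attract[tuple(zeros.T)])]
                if attract[tuple(best_zero)] <= attract[tuple(worst_one)]:
                    break               # no swap raises autocorrelation further
                grid[tuple(worst_one)] = 0
                grid[tuple(best_zero)] = 1
            return grid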

  19. 78 FR 79649 - Energy Conservation Program: Proposed Determination of Set-Top Boxes and Network Equipment as a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-31

    ... Conservation Program: Proposed Determination of Set-Top Boxes and Network Equipment as a Covered Consumer... published June 15, 2011 that set-top boxes (STBs) and network equipment qualify as a covered product under... action in light of a consensus agreement entered by a broadly representative group that DOE believes has...

  20. Ten Years of War: A Characterization of Craniomaxillofacial Injuries Incurred During Operations Enduring Freedom and Iraqi Freedom

    DTIC Science & Technology

    2012-01-01

    been proposed. A comprehensive characterization of the injury pattern sustained during this 10-year period to the craniomaxillofacial region is needed...development of protective equipment in the future. METHODS: The Joint Theater Trauma Registry was queried from October 19, 2001, to March 27, 2011, covering...injury involved explosive devices, followed by ballistic trauma. CONCLUSION: Modern combat, characterized by blast injuries, results in higher than

  1. [A research in speech endpoint detection based on boxes-coupling generalization dimension].

    PubMed

    Wang, Zimei; Yang, Cuirong; Wu, Wei; Fan, Yingle

    2008-06-01

    In this paper, a new method for calculating the generalized dimension, based on the box-coupling principle, is proposed to overcome edge effects and to improve speech endpoint detection based on the original generalized-dimension calculation. The new method was applied to speech endpoint detection. First, the length of the overlapping border was determined, and by calculating the generalized dimension while covering the speech signal with overlapped boxes, three-dimensional feature vectors comprising the box dimension, the information dimension, and the correlation dimension were obtained. Second, in light of the relation between feature distance and similarity degree, feature extraction was conducted using a common distance measure. Finally, a bi-threshold method was used to classify the speech signals. The experimental results indicated that, compared with the original generalized dimension (OGD) and spectral entropy (SE) algorithms, the proposed method is more robust and effective for detecting speech signals containing different kinds of noise at different signal-to-noise ratios (SNRs), especially at low SNR.
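
    A plain (non-overlapping) box-counting sketch conveys the underlying dimension estimate; the paper's box-coupling variant additionally overlaps adjacent boxes and computes the information and correlation dimensions as well:

        import numpy as np

        def box_dimension(signal, scales=(4, 8, 16, 32, 64)):
            """Box-counting dimension of a 1-D signal's graph, normalized to
            the unit square; assumes len(signal) >= max(scales)."""
            x = (signal - signal.min()) / (np.ptp(signal) + 1e-12)
            N = len(x)
            counts = []
            for k in scales:            # k columns of boxes of side 1/k
                eps, w = 1.0 / k, N // k
                n = 0
                for i in range(k):
                    seg = x[i * w:(i + 1) * w]
                    n += int(np.floor(seg.max() / eps)
                             - np.floor(seg.min() / eps)) + 1
                counts.append(n)
            # dimension = slope of log N(eps) against log(1/eps) = log k
            return np.polyfit(np.log(scales), np.log(counts), 1)[0]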

  2. Spreading to localized targets in complex networks

    NASA Astrophysics Data System (ADS)

    Sun, Ye; Ma, Long; Zeng, An; Wang, Wen-Xu

    2016-12-01

    As an important type of dynamics on complex networks, spreading is widely used to model many real processes such as epidemic contagion and information propagation. One of the most significant research questions in spreading is how to rank the spreading ability of nodes in the network. To this end, substantial effort has been made and a variety of effective methods have been proposed. These methods usually define the spreading ability of a node as the final number of infected nodes when the spreading is initialized from that node. However, in many real cases such as advertising and news propagation, the spreading only aims to cover a specific group of nodes. Therefore, it is necessary to study the spreading ability of nodes towards localized targets in complex networks. In this paper, we propose a reversed local path algorithm for this problem. Simulation results show that our method outperforms existing methods in identifying the influential nodes with respect to these localized targets. Moreover, the influential spreaders identified by our method can effectively avoid infecting non-target nodes in the spreading process.
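
    The evaluation protocol implied above can be sketched as a Monte-Carlo SIR simulation that counts only target-node infections; this illustrates the scoring of spreaders, not the reversed local path algorithm itself:

        import random

        def sir_target_coverage(adj, seed, targets, beta=0.1, runs=200):
            """Average number of nodes in `targets` (a set) finally infected
            by an SIR process started at `seed`; adj maps node -> neighbours.
            Infected nodes recover after one step (recovery rate 1)."""
            total = 0
            for _ in range(runs):
                infected, active = {seed}, [seed]
                while active:
                    nxt = []
                    for u in active:
                        for v in adj[u]:
                            if v not in infected and random.random() < beta:
                                infected.add(v)
                                nxt.append(v)
                    active = nxt
                total += len(infected & targets)
            return total / runs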

  3. Prediction of protein-protein interactions from amino acid sequences with ensemble extreme learning machines and principal component analysis.

    PubMed

    You, Zhu-Hong; Lei, Ying-Ke; Zhu, Lin; Xia, Junfeng; Wang, Bing

    2013-01-01

    Protein-protein interactions (PPIs) play crucial roles in the execution of various cellular processes and form the basis of biological mechanisms. Although a large amount of PPI data for different species has been generated by high-throughput experimental techniques, the PPI pairs obtained with experimental methods cover only a fraction of the complete PPI networks, and the experimental methods for identifying PPIs are both time-consuming and expensive. Hence, it is urgent and challenging to develop automated computational methods to efficiently and accurately predict PPIs. We present here a novel hierarchical PCA-EELM (principal component analysis-ensemble extreme learning machine) model to predict protein-protein interactions using only the information of protein sequences. In the proposed method, 11188 protein pairs retrieved from the DIP database were encoded into feature vectors using four kinds of protein sequence information. Focusing on dimension reduction, an effective feature extraction method, PCA, was then employed to construct the most discriminative new feature set. Finally, multiple extreme learning machines were trained and aggregated into a consensus classifier by majority voting. The ensembling of extreme learning machines removes the dependence of the results on the initial random weights and improves the prediction performance. When applied to the PPI data of Saccharomyces cerevisiae, the proposed method achieved 87.00% prediction accuracy with 86.15% sensitivity at a precision of 87.59%. Extensive experiments were performed to compare our method with a state-of-the-art technique, the support vector machine (SVM). Experimental results demonstrate that the proposed PCA-EELM outperforms the SVM method under 5-fold cross-validation. Moreover, PCA-EELM runs faster than the PCA-SVM based method. Consequently, the proposed approach can be considered a promising and powerful tool for predicting PPIs with excellent performance and less time.
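
    A compact sketch of the PCA-plus-ensemble-ELM pipeline (hidden-layer size, component count, and ensemble size are assumptions; binary labels 0/1 stand in for non-interacting/interacting pairs):

        import numpy as np
        from sklearn.decomposition import PCA

        class ELM:
            """Minimal extreme learning machine: random hidden layer,
            least-squares output weights via the pseudo-inverse."""
            def __init__(self, n_hidden=200, rng=None):
                self.n_hidden = n_hidden
                self.rng = np.random.default_rng(rng)
            def fit(self, X, y):
                self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
                self.b = self.rng.normal(size=self.n_hidden)
                H = np.tanh(X @ self.W + self.b)
                self.beta = np.linalg.pinv(H) @ np.eye(2)[y]   # one-hot targets
                return self
            def predict(self, X):
                return (np.tanh(X @ self.W + self.b) @ self.beta).argmax(axis=1)

        def pca_eelm_predict(X_train, y_train, X_test, n_comp=50, n_models=9):
            pca = PCA(n_components=n_comp).fit(X_train)
            Xtr, Xte = pca.transform(X_train), pca.transform(X_test)
            votes = np.stack([ELM(rng=i).fit(Xtr, y_train).predict(Xte)
                              for i in range(n_models)])
            return (votes.mean(axis=0) > 0.5).astype(int)   # majority vote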

  4. A new method to obtain ground control points based on SRTM data

    NASA Astrophysics Data System (ADS)

    Wang, Pu; An, Wei; Deng, Xin-pu; Zhang, Xi

    2013-09-01

    GCPs are widely used in remote sensing image registration and geometric correction. Normally, DRG and DOM products are the major data sources from which GCPs are extracted, but high-accuracy DRG and DOM products are usually costly to obtain, and some free products come without any accuracy guarantee. In order to balance cost and accuracy, this paper proposes a method for extracting GCPs from SRTM data. The method consists of manual assistance, binarization, data resampling, and reshaping. Manual assistance is used to identify which parts of the SRTM data can serve as GCPs, such as islands or sharp coastlines. A binarization algorithm then extracts the shape information of the region while excluding all other information. The binary data are resampled to the resolution required by the specific application. Finally, the data are reshaped according to the satellite imaging type to obtain usable GCPs. The proposed method has three advantages. First, it is easy to implement: unlike DRG or DOM data, which can be expensive, SRTM data are freely accessible without restrictions. Second, SRTM data have a producer-stated accuracy of about 90 m, so the GCPs derived from them are also of high quality. Third, since SRTM data cover nearly all the land surface of the Earth between latitudes -60° and +60°, the GCPs produced by the method can cover most important regions of the world. The method is suitable for meteorological satellite imagery and similar applications with relatively low accuracy requirements. Extensive simulation tests show the method to be convenient and effective.
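
    The binarization-and-resampling core of the method can be sketched as follows; the land/sea threshold at 0 m elevation and the bilinear resampling are assumptions:

        import numpy as np
        from scipy.ndimage import zoom

        def srtm_shape_mask(srtm_tile, target_shape):
            """Binarize an SRTM elevation tile into a land/sea mask and
            resample it to the resolution the matching sensor requires;
            distinctive shapes in the mask (islands, sharp coastlines)
            then serve as GCP templates."""
            land = (srtm_tile > 0).astype(np.float32)   # crude land/sea split
            factors = (target_shape[0] / land.shape[0],
                       target_shape[1] / land.shape[1])
            return (zoom(land, factors, order=1) > 0.5).astype(np.uint8)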

  5. Throughput Measurement of a Dual-Band MIMO Rectangular Dielectric Resonator Antenna for LTE Applications

    PubMed Central

    Nasir, Jamal; Jamaluddin, Mohd. Haizal; Ahmad Khan, Aftab; Kamarudin, Muhammad Ramlee; Leow, Chee Yen; Owais, Owais

    2017-01-01

    An L-shaped dual-band multiple-input multiple-output (MIMO) rectangular dielectric resonator antenna (RDRA) for long term evolution (LTE) applications is proposed. The presented antenna can transmit and receive information independently using the fundamental TE111 and higher order TE121 modes of the DRA. The TE111 degenerate mode covers LTE bands 2 (1.85–1.99 GHz), 3 (1.71–1.88 GHz), and 9 (1.7499–1.7849 GHz) at fr = 1.8 GHz, whereas TE121 covers LTE band 7 (2.5–2.69 GHz) at fr = 2.6 GHz. An efficient design method has been used to reduce mutual coupling between ports by changing the effective permittivity of the DRA through a cylindrical air-gap introduced at an optimal position in the dielectric resonator. This air-gap, along with matching strips at the corners of the dielectric resonator, keeps the isolation above 17 dB in both bands. The diversity performance has been evaluated by calculating the envelope correlation coefficient, diversity gain, and mean effective gain of the proposed design. MIMO performance has been evaluated by measuring the throughput of the proposed MIMO antenna. Experimental results successfully validate the presented design methodology. PMID:28098807

  6. Throughput Measurement of a Dual-Band MIMO Rectangular Dielectric Resonator Antenna for LTE Applications.

    PubMed

    Nasir, Jamal; Jamaluddin, Mohd Haizal; Ahmad Khan, Aftab; Kamarudin, Muhammad Ramlee; Yen, Bruce Leow Chee; Owais, Owais

    2017-01-13

    An L-shaped dual-band multiple-input multiple-output (MIMO) rectangular dielectric resonator antenna (RDRA) for long term evolution (LTE) applications is proposed. The presented antenna can transmit and receive information independently using the fundamental TE111 and higher order TE121 modes of the DRA. The TE111 degenerate mode covers LTE bands 2 (1.85-1.99 GHz), 3 (1.71-1.88 GHz), and 9 (1.7499-1.7849 GHz) at fr = 1.8 GHz, whereas TE121 covers LTE band 7 (2.5-2.69 GHz) at fr = 2.6 GHz. An efficient design method has been used to reduce mutual coupling between ports by changing the effective permittivity of the DRA through a cylindrical air-gap introduced at an optimal position in the dielectric resonator. This air-gap, along with matching strips at the corners of the dielectric resonator, keeps the isolation above 17 dB in both bands. The diversity performance has been evaluated by calculating the envelope correlation coefficient, diversity gain, and mean effective gain of the proposed design. MIMO performance has been evaluated by measuring the throughput of the proposed MIMO antenna. Experimental results successfully validate the presented design methodology.
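
    The envelope correlation coefficient reported above can be computed from S-parameters under the common lossless-antenna approximation (Blanch et al.); a short sketch with illustrative numbers, not the paper's measurements:

        import numpy as np

        def envelope_correlation(s11, s21, s12, s22):
            """ECC of a 2-port MIMO antenna from complex S-parameters,
            assuming a lossless antenna."""
            num = abs(np.conj(s11) * s12 + np.conj(s21) * s22) ** 2
            den = ((1 - abs(s11) ** 2 - abs(s21) ** 2)
                   * (1 - abs(s22) ** 2 - abs(s12) ** 2))
            return num / den

        # e.g. roughly -10 dB matching and 17 dB isolation:
        print(envelope_correlation(0.32, 0.14, 0.14, 0.32))   # ~0.01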

  7. Internet addiction assessment tools: dimensional structure and methodological status.

    PubMed

    Lortie, Catherine L; Guitton, Matthieu J

    2013-07-01

    Excessive internet use is becoming a concern, and some have proposed that it may involve addiction. We evaluated the dimensions assessed by, and the psychometric properties of, a range of questionnaires purporting to assess internet addiction. Fourteen questionnaires purporting to assess internet addiction among adolescents and adults, published between January 1993 and October 2011, were identified. Their reported dimensional structure, construct, discriminant and convergent validity, and reliability were assessed, as well as the methods used to derive these. The methods used to evaluate internet addiction questionnaires varied considerably. Three dimensions of addiction predominated: compulsive use (79%), negative outcomes (86%), and salience (71%). Less common were escapism (21%), withdrawal symptoms (36%), and other dimensions. Measures of validity and reliability were found to be within normally acceptable limits. There is a broad convergence among questionnaires purporting to assess internet addiction, suggesting that compulsive use, negative outcomes, and salience should be covered, and the questionnaires show adequate psychometric properties. However, the methods used to evaluate the questionnaires vary widely, and possible factors contributing to excessive use, such as social motivation, do not appear to be covered. © 2013 Society for the Study of Addiction.

  8. Sensing Urban Land-Use Patterns by Integrating Google Tensorflow and Scene-Classification Models

    NASA Astrophysics Data System (ADS)

    Yao, Y.; Liang, H.; Li, X.; Zhang, J.; He, J.

    2017-09-01

    With the rapid progress of China's urbanization, research on the automatic detection of land-use patterns in Chinese cities is of substantial importance. Deep learning is an effective method for extracting image features. To take advantage of deep learning in detecting urban land-use patterns, we applied a transfer-learning-based approach to extract and classify features from remote-sensing images. Using the Google TensorFlow framework, a powerful convolutional neural network (CNN) was built. First, the transferred model was pre-trained on ImageNet, one of the largest object-image data sets, to fully develop its ability to generate feature vectors for standard remote-sensing land-cover data sets (UC Merced and WHU-SIRI). Then, a random-forest-based classifier was constructed and trained on these generated vectors to classify the actual urban land-use pattern at the scale of traffic analysis zones (TAZs). To avoid the multi-scale effect of remote-sensing imagery, a large random patch (LRP) method was used. The proposed method efficiently obtained acceptable accuracy (OA = 0.794, Kappa = 0.737) for the study area. In addition, the results show that the proposed method can effectively overcome the multi-scale effect that occurs in urban land-use classification at the irregular land-parcel level. The proposed method can help planners monitor dynamic urban land use and evaluate the impact of urban-planning schemes.
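
    The two-stage pipeline (pre-trained CNN as feature extractor, random forest as classifier) can be sketched as below; MobileNetV2 is used only for illustration of transfer learning, and X_patches / y_landuse are assumed placeholder variables:

        import tensorflow as tf
        from sklearn.ensemble import RandomForestClassifier

        # ImageNet-pre-trained backbone used purely as a feature extractor.
        backbone = tf.keras.applications.MobileNetV2(
            include_top=False, weights="imagenet", pooling="avg",
            input_shape=(224, 224, 3))

        def deep_features(images):
            """images: float array (N, 224, 224, 3); returns (N, 1280) vectors."""
            x = tf.keras.applications.mobilenet_v2.preprocess_input(images)
            return backbone.predict(x, verbose=0)

        # Random forest trained on the CNN feature vectors, mirroring the
        # feature-vector + random-forest stage of the paper:
        # clf = RandomForestClassifier(n_estimators=500).fit(
        #     deep_features(X_patches), y_landuse)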

  9. Noise source separation of diesel engine by combining binaural sound localization method and blind source separation method

    NASA Astrophysics Data System (ADS)

    Yao, Jiachi; Xiang, Yang; Qian, Sichong; Li, Shengyang; Wu, Shaowei

    2017-11-01

    In order to separate and identify the combustion noise and the piston slap noise of a diesel engine, a noise source separation and identification method that combines a binaural sound localization method and a blind source separation method is proposed. Because a diesel engine has many complex noise sources, a lead covering method was applied during the noise and vibration test to isolate interference noise from cylinders No. 1-5; only the No. 6 cylinder parts were left bare. Two microphones simulating the human ears were used to measure the radiated noise signals 1 m away from the diesel engine. First, the binaural sound localization method is adopted to separate noise sources located in different places. Then, for noise sources in the same place, the blind source separation method is used to further separate and identify them. Finally, a coherence function method, continuous wavelet time-frequency analysis, and prior knowledge of the diesel engine are combined to further verify the separation results. The results show that the proposed method can effectively separate and identify the combustion noise and the piston slap noise of a diesel engine, whose frequencies are concentrated at 4350 Hz and 1988 Hz, respectively. Compared with the blind source separation method alone, the proposed method has superior separation and identification performance, and the separation results contain fewer interference components from other noise sources.
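
    Taken in isolation, the blind-source-separation stage can be sketched with FastICA on the two microphone channels (an instantaneous-mixture simplification; the paper further combines localization and wavelet analysis to label the separated sources; scikit-learn >= 1.1 assumed):

        import numpy as np
        from sklearn.decomposition import FastICA

        def separate_two_mics(mic_left, mic_right):
            """Estimate two independent sources from the two 'binaural'
            channels; returns the separated source signals."""
            X = np.column_stack([mic_left, mic_right])   # (n_samples, 2)
            ica = FastICA(n_components=2, whiten="unit-variance",
                          random_state=0)
            S = ica.fit_transform(X)
            return S[:, 0], S[:, 1]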

  10. A method for classification of multisource data using interval-valued probabilities and its application to HIRIS data

    NASA Technical Reports Server (NTRS)

    Kim, H.; Swain, P. H.

    1991-01-01

    A method for classifying multisource data in remote sensing is presented. The proposed method considers each data source as an information source providing a body of evidence, represents the statistical evidence by interval-valued probabilities, and uses Dempster's rule to integrate information from multiple data sources. The method is applied to the problem of ground-cover classification of multispectral data combined with digital terrain data such as elevation, slope, and aspect. It is then applied to simulated 201-band High Resolution Imaging Spectrometer (HIRIS) data by dividing the dimensionally huge data source into smaller, more manageable pieces based on global statistical correlation information. It produces higher classification accuracy than the maximum likelihood (ML) classification method when the Hughes phenomenon is apparent.
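
    Dempster's rule itself is compact; the sketch below shows the standard point-valued combination of two bodies of evidence (the paper generalizes this to interval-valued probabilities, and the example masses are invented):

        from itertools import product

        def dempster_combine(m1, m2):
            """Combine two mass functions given as dicts mapping frozensets
            of class labels to belief mass; assumes the two bodies of
            evidence are not totally conflicting (k > 0)."""
            combined, conflict = {}, 0.0
            for (a, x), (b, y) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + x * y
                else:
                    conflict += x * y      # mass lost to empty intersections
            k = 1.0 - conflict             # normalization constant
            return {s: v / k for s, v in combined.items()}

        m_spectral = {frozenset({"forest"}): 0.7,
                      frozenset({"forest", "water"}): 0.3}
        m_terrain = {frozenset({"forest"}): 0.4,
                     frozenset({"water"}): 0.4,
                     frozenset({"forest", "water"}): 0.2}
        print(dempster_combine(m_spectral, m_terrain))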

  11. EVALUATING ECOREGIONS FOR SAMPLING AND MAPPING LAND-COVER PATTERNS

    EPA Science Inventory

    Ecoregional stratification has been proposed for sampling and mapping land- cover composition and pattern over time. Using a wall-to-wall land-cover map of the United States, we evaluated geographic scales of variance for 17 landscape pattern indices, and compared stratification ...

  12. Using remote sensing data to predict road fill areas and areas affected by fill erosion with planned forest road construction: a case study in Kastamonu Regional Forest Directorate (Turkey).

    PubMed

    Aricak, Burak

    2015-07-01

    Forest roads are essential for transport in managed forests, yet road construction causes environmental disturbance, both in the surface area the road covers and in erosion and downslope deposition of road fill material. The factors affecting the deposition distance of eroded road fill are the slope gradient and the density of plant cover. Thus, it is important to take these factors into consideration during road planning to minimize their disturbance. The aim of this study was to use remote sensing and field surveying to predict the locations that would be affected by downslope deposition of eroding road fill and to compile the data into a geographic information system (GIS) database. The construction of 99,500 m of forest roads is proposed for the Kastamonu Regional Forest Directorate in Turkey. Using GeoEye satellite images and a digital elevation model (DEM) for the region, the location and extent of downslope deposition of road fill were determined for the roads as planned. It was found that if the proposed roads were constructed by excavators, the fill material would cover 910,621 m² and the affected surface area would be 1,302,740 m². Application of the method used here can minimize the adverse effects of forest roads.

  13. A proposed periodic national inventory of land use land cover change

    Treesearch

    Hans T. Schreuder; Paul W. Snook; Raymond L. Czaplewski; Glenn P. Catts

    1986-01-01

    Three alternatives using digital thematic mapper (TM), analog TM, and a combination of either digital or analog TM data with low altitude photography are discussed for level I and level II land use/land cover classes for a proposed national inventory. Digital TM data should prove satisfactory for estimating acreage in level I classes, although estimates of precision...

  14. Evaluation of forest cover estimates for Haiti using supervised classification of Landsat data

    NASA Astrophysics Data System (ADS)

    Churches, Christopher E.; Wampler, Peter J.; Sun, Wanxiao; Smith, Andrew J.

    2014-08-01

    This study uses 2010-2011 Landsat Thematic Mapper (TM) imagery to estimate the total forested area in Haiti. The thematic map was generated using radiometric normalization of digital numbers by a modified normalization method utilizing pseudo-invariant polygons (PIPs), followed by supervised classification of the mosaicked image using the Food and Agriculture Organization (FAO) of the United Nations Land Cover Classification System. Classification results were compared to other sources of land-cover data produced for similar years, with an emphasis on the statistics presented by the FAO. Three global land cover datasets (GLC2000, GlobCover 2009, and MODIS MCD12Q1) and a national-scale dataset (a land cover analysis by the Haitian National Centre for Geospatial Information (CNIGS)) were reclassified and compared. According to our classification, approximately 32.3% of Haiti's total land area was tree covered in 2010-2011. This result was confirmed using an error-adjusted area estimator, which predicted a tree-covered area of 32.4%. Standardization to the FAO's forest cover class definition reduces the tree cover of our supervised classification to 29.4%. This result is greater than the reported FAO value of 4% and the value for the recoded GLC2000 dataset of 7.0%, but is comparable to the values for the three other recoded datasets: MCD12Q1 (21.1%), GlobCover 2009 (26.9%), and CNIGS (19.5%). We propose that at coarse resolutions, the segmented and patchy nature of Haiti's forests results in a systematic underestimation of the extent of forest cover. The best explanation for the significant difference between our results, the FAO statistics, and the compared datasets appears to be the accuracy of the data sources and the resolution of the imagery used for the land cover analyses. Analysis of the recoded global datasets and the results from this study suggest a strong linear relationship (R² = 0.996 for tree cover) between spatial resolution and land cover estimates.

  15. Unsupervised change detection in a particular vegetation land cover type using spectral angle mapper

    NASA Astrophysics Data System (ADS)

    Renza, Diego; Martinez, Estibaliz; Molina, Iñigo; Ballesteros L., Dora M.

    2017-04-01

    This paper presents a new unsupervised change detection methodology for multispectral images applied to specific land covers. The proposed method compares each image against a reference spectrum, obtained from the spectral signature of the type of coverage to be detected. The method has been tested using multispectral SPOT5 images of the Community of Madrid (Spain) and multispectral Quickbird images of an area of Indonesia impacted by the December 26, 2004 tsunami; the tests focused on the detection of changes in vegetation. The image comparison is obtained by applying the spectral angle mapper (SAM) between the reference spectrum and each multitemporal image. A threshold is then applied to produce a single change image corresponding to the vegetation zones. The results for the multitemporal images are combined through an exclusive-or (XOR) operation that selects vegetation zones that have changed over time. Finally, the derived results were compared against a supervised method based on classification with the support vector machine (SVM). Furthermore, NDVI differencing and the basic spectral angle mapper were selected as unsupervised methods for comparison purposes. The main novelty of the method is the detection of changes in a specific land cover type (vegetation); therefore, the best comparison scenario is against methods that also aim to detect changes in a specific land cover type. This is the main reason for selecting the NDVI-based method and the post-classification method (SVM implemented in a standard software tool). To evaluate the improvement gained by using a reference spectrum vector, the results are also compared with the basic SAM method. For the SPOT5 image, the overall accuracy was 99.36% and the κ index was 90.11%; for the Quickbird image, the overall accuracy was 97.5% and the κ index was 82.16%. The precision of the method is comparable to that of a supervised method, with few false positives and false negatives, a high overall accuracy, and a high kappa index, while the execution times are comparable to those of unsupervised methods of low computational load.
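
    The SAM-threshold-XOR chain described above translates directly into code; the 0.25 rad threshold below is an assumed illustrative value, not the paper's:

        import numpy as np

        def sam_angle(image, ref_spectrum):
            """Spectral angle (radians) between each pixel of an (H, W, B)
            image and a reference spectrum of length B."""
            dot = np.tensordot(image, ref_spectrum, axes=([2], [0]))
            norms = (np.linalg.norm(image, axis=2)
                     * np.linalg.norm(ref_spectrum))
            return np.arccos(np.clip(dot / (norms + 1e-12), -1.0, 1.0))

        def vegetation_change(img_t1, img_t2, veg_spectrum, threshold=0.25):
            """XOR of the two thresholded SAM maps keeps pixels that are
            vegetation in exactly one date, i.e. vegetation gain or loss."""
            veg1 = sam_angle(img_t1, veg_spectrum) < threshold
            veg2 = sam_angle(img_t2, veg_spectrum) < threshold
            return veg1 ^ veg2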

  16. Optimum design of a novel pounding tuned mass damper under harmonic excitation

    NASA Astrophysics Data System (ADS)

    Wang, Wenxi; Hua, Xugang; Wang, Xiuyong; Chen, Zhengqing; Song, Gangbing

    2017-05-01

    In this paper, a novel pounding tuned mass damper (PTMD) utilizing pounding damping is proposed to reduce structural vibration by increasing the damping ratio of a lightly damped structure. A pounding boundary covered with viscoelastic material is fixed right next to the tuned mass when the spring-mass system is in the equilibrium position. The dynamic properties of the proposed PTMD, including the natural frequency and the equivalent damping ratio, are derived theoretically. Moreover, a numerical simulation method using an impact force model to study the PTMD is proposed and validated by pounding experiments. To minimize the maximum dynamic magnification factor under harmonic excitation, an optimum design of the PTMD is developed. Finally, the optimal PTMD is implemented to control a lightly damped frame structure. A comparison of experimental and simulated results reveals that the proposed impact force model can accurately model the pounding force. Furthermore, the proposed PTMD is effective in controlling vibration over a wide frequency range, as demonstrated experimentally.

  17. Use of the 37-38 GHz and 40-40.5 GHz Ka-bands for Deep Space Communications

    NASA Technical Reports Server (NTRS)

    Morabito, David; Hastrup, Rolf

    2004-01-01

    This paper covers a wide variety of issues associated with the implementation and use of these frequency bands for deep space communications. Performance issues, such as ground station pointing stability, ground antenna gain, antenna pattern, and propagation effects due to the atmosphere, charged particles, and space loss at 37 GHz, are addressed in comparison to the 32 GHz Ka-band deep space allocation. Issues with the use of, and competition for, this spectrum are also covered. The state of the hardware developed (or proposed) for operating in this frequency band is covered from the standpoint of the prospects for achieving the higher data rates that could be accommodated in the available bandwidth. Hardware areas explored include modulators, digital-to-analog converters, filters, power amplifiers, receivers, and antennas. The potential users of the frequency band are explored, as well as their anticipated methods for achieving the potential high data rates and the implications of the competition for bandwidth.

  18. Attosecond time-resolved streaked photoemission from Mg-covered W(110) surfaces

    NASA Astrophysics Data System (ADS)

    Liao, Qing; Thumm, Uwe

    2015-05-01

    We formulate a quantum-mechanical model for infrared-streaked photoelectron emission by an ultrashort extreme ultraviolet pulse from adsorbate-covered metal surfaces. Applying this numerical model to ultrathin Mg adsorbates on W(110) substrates, we analyze streaked photoelectron spectra and attosecond streaking time delays for photoemission from the Mg/W(110) conduction band and Mg(2p) and W(4f) core levels. Based on this analysis, we propose the use of attosecond streaking spectroscopy on adsorbate-covered surfaces with variable adsorbate thickness as a method for investigating (a) electron transport in condensed-matter systems and (b) metal-adsorbate-interface properties at subatomic length and time scales. Our calculated streaked photoemission spectra and time delays agree with recently obtained experimental data. Supported by the Chemical Sciences, Geosciences, and Biosciences Division, Office of Basic Energy Sciences, Office of Science, U.S. Department of Energy under Grant No. DE-FG02-86ER13491 and NSF Grant PHY-1068752.

  19. Incidence of human papillomavirus contamination of transvaginal probes in Japan and possible contamination prevention strategy.

    PubMed

    Kuwata, Tomoyuki; Takahashi, Hironori; Koibuchi, Harumi; Ichizuka, Kiyotake; Natori, Michiya; Matsubara, Shigeki

    2016-10-01

    To clarify the present status of human papillomavirus (HPV) contamination of transvaginal probes in Japan and propose a preventive method. This study was performed at three institutes: a tertiary center, secondary hospital, and primary facility. To identify contamination rates, probes were disinfected and covered with probe covers and condoms; the cover was changed for each patient. The probes were tested for HPV, and those with HPV detected were analyzed to identify the type of HPV. Next, nurses put on new gloves before covering the probe for each patient, and the probes were similarly tested for HPV. A total of 120 probes were tested, and HPV was detected from a total of five probes, a contamination rate of 4.2 % (5/120). HPV was detected in all three institutes. Importantly, high-risk HPV, i.e., HPV-52, 56, and 59, was detected. After the "glove change strategy" was implemented, HPV was not detected on any of 150 probes tested at any of the three institutions. In Japan, the HPV contamination rate of vaginal probes in routine practice was 4.2 %. There was no HPV contamination of probes after changing the gloves for cover exchange for each patient. This strategy may prevent HPV probe contamination.

  20. Flight State Identification of a Self-Sensing Wing via an Improved Feature Selection Method and Machine Learning Approaches.

    PubMed

    Chen, Xi; Kopsaftopoulos, Fotis; Wu, Qi; Ren, He; Chang, Fu-Kuo

    2018-04-29

    In this work, a data-driven approach for identifying the flight state of a self-sensing wing structure with an embedded multi-functional sensing network is proposed. The flight state is characterized by the structural vibration signals recorded from a series of wind tunnel experiments under varying angles of attack and airspeeds. A large feature pool is created by extracting potential features from the signals, covering the time domain, the frequency domain, and the information domain. Special emphasis is given to feature selection, for which a novel filter method is developed based on the combination of a modified distance evaluation algorithm and a variance inflation factor. Machine learning algorithms are then employed to establish the mapping relationship from the feature space to the practical state space. Results from two case studies demonstrate the high identification accuracy and the effectiveness of the model complexity reduction achieved via the proposed method, thus providing new perspectives on self-awareness for the next generation of intelligent air vehicles.
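
    The variance-inflation-factor half of the filter can be sketched as a greedy collinearity pruner; the VIF cutoff of 10 is a common rule of thumb assumed here, and the paper pairs this with a modified distance evaluation ranking:

        import numpy as np

        def variance_inflation_factors(X):
            """VIF of each column of X (n_samples, n_features): for
            standardized features, VIF_j is the j-th diagonal entry of the
            inverse correlation matrix."""
            Xc = (X - X.mean(axis=0)) / X.std(axis=0)
            return np.diag(np.linalg.inv(np.corrcoef(Xc, rowvar=False)))

        def drop_collinear(X, names, vif_max=10.0):
            keep = list(range(X.shape[1]))
            while len(keep) > 1:
                vifs = variance_inflation_factors(X[:, keep])
                worst = int(np.argmax(vifs))
                if vifs[worst] <= vif_max:
                    break
                keep.pop(worst)    # drop the most inflated feature first
            return [names[i] for i in keep]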

  1. Beam hardening correction for interior tomography based on exponential formed model and radon inversion transform

    NASA Astrophysics Data System (ADS)

    Chen, Siyu; Zhang, Hanming; Li, Lei; Xi, Xiaoqi; Han, Yu; Yan, Bin

    2016-10-01

    X-ray computed tomography (CT) has been extensively applied in industrial non-destructive testing (NDT). However, in practical applications, the polychromaticity of the X-ray beam often causes beam hardening problems in image reconstruction. Beam hardening artifacts, which manifest as cupping, streaks, and flares, not only degrade image quality but also disturb subsequent analyses. Conventional CT scanning requires the scanned object to be completely covered by the field of view (FOV), and state-of-the-art beam hardening correction methods consider only this ideal scanning configuration, so they often suffer from projection truncation in interior tomography. Aiming at this problem, this paper proposes a beam hardening correction method based on an exponential formed model and the Radon inversion transform for interior tomography. Experimental results show that, compared with conventional correction algorithms, the proposed approach achieves excellent performance in both beam hardening artifact reduction and truncation artifact suppression. The presented method is therefore of both theoretical and practical significance for artifact correction in industrial CT.

  2. Measuring land-use and land-cover change using the U.S. department of agriculture's cropland data layer: Cautions and recommendations

    NASA Astrophysics Data System (ADS)

    Lark, Tyler J.; Mueller, Richard M.; Johnson, David M.; Gibbs, Holly K.

    2017-10-01

    Monitoring agricultural land is important for understanding and managing food production, environmental conservation efforts, and climate change. The United States Department of Agriculture's Cropland Data Layer (CDL), an annual satellite imagery-derived land cover map, has been increasingly used for this application since complete coverage of the conterminous United States became available in 2008. However, the CDL is designed and produced with the intent of mapping annual land cover rather than tracking changes over time, and as a result certain precautions are needed in multi-year change analyses to minimize error and misapplication. We highlight scenarios that require special considerations, suggest solutions to key challenges, and propose a set of recommended good practices and general guidelines for CDL-based land change estimation. We also characterize a problematic issue of crop area underestimation bias within the CDL that needs to be accounted for and corrected when calculating changes to crop and cropland areas. When used appropriately and in conjunction with related information, the CDL is a valuable and effective tool for detecting diverse trends in agriculture. By explicitly discussing the methods and techniques for post-classification measurement of land-cover and land-use change using the CDL, we aim to further stimulate the discourse and continued development of suitable methodologies. Recommendations generated here are intended specifically for the CDL but may be broadly applicable to additional remotely-sensed land cover datasets including the National Land Cover Database (NLCD), Moderate Resolution Imaging Spectroradiometer (MODIS)-based land cover products, and other regional, national, and global land cover classification maps.

  3. Mapping Secondary Forest Succession on Abandoned Agricultural Land in the Polish Carpathians

    NASA Astrophysics Data System (ADS)

    Kolecka, N.; Kozak, J.; Kaim, D.; Dobosz, M.; Ginzler, Ch.; Psomas, A.

    2016-06-01

    Land abandonment and secondary forest succession have played a significant role in land cover changes and forest cover increase in mountain areas in Europe over the past several decades. Land abandonment can be easily observed in the field over small areas, but it is difficult to map over large areas, e.g., with remote sensing, due to its subtle and spatially dispersed character. Our previous paper presented how LiDAR (Light Detection and Ranging) and topographic data were used to detect secondary forest succession on abandoned land in one commune located in the Polish Carpathians by means of object-based image analysis (OBIA) and GIS (Kolecka et al., 2015). This paper shows how the method can be applied to efficiently map secondary forest succession over the entire Polish Carpathians, incorporating a spatial sampling strategy supported by various ancillary data. Here we discuss the spatial sampling methods, their limitations, and the results in the context of future secondary forest succession modelling.

  4. Cloud Detection by Fusing Multi-Scale Convolutional Features

    NASA Astrophysics Data System (ADS)

    Li, Zhiwei; Shen, Huanfeng; Wei, Yancong; Cheng, Qing; Yuan, Qiangqiang

    2018-04-01

    Cloud detection is an important pre-processing step for the accurate application of optical satellite imagery. Recent studies indicate that deep learning achieves the best performance in image segmentation tasks. Aiming at boosting the accuracy of cloud detection for multispectral imagery, especially imagery that contains only visible and near-infrared bands, in this paper we propose a deep-learning-based cloud detection method termed MSCN (multi-scale cloud net), which segments clouds by fusing multi-scale convolutional features. MSCN was trained on a global cloud cover validation collection and tested on more than ten types of optical images with different resolutions. Experimental results show that MSCN has clear advantages over the traditional multi-feature combined cloud detection method in accuracy, especially in snow and other areas covered by bright non-cloud objects. Moreover, MSCN produced more detailed cloud masks than the compared deep cloud detection convolutional network. The effectiveness of MSCN makes it promising for practical application to multiple kinds of optical imagery.

  5. Analysis of Resonance Response Performance of C-Band Antenna Using Parasitic Element

    PubMed Central

    Islam, M. T.; Misran, N.; Mandeep, J. S.

    2014-01-01

    An analysis of the resonance response improvement of a planar C-band (4–8 GHz) antenna using a parasitic element method is presented. The parasitic element based method is validated for changes in the active and parasitic antenna elements. A novel dual-band antenna for C-band applications covering 5.7 GHz and 7.6 GHz is designed and fabricated. The antenna is composed of a circular parasitic element with unequal microstrip lines on both sides and a rectangular partial ground plane. A fractional bandwidth of 13.5% has been achieved from 5.5 GHz to 6.3 GHz (WLAN band) for the lower band. The upper band covers 7.1 GHz to 8 GHz with a fractional bandwidth of 12%. A gain of 6.4 dBi is achieved at the lower frequency and 4 dBi at the upper frequency. The VSWR of the antenna is less than 2 at the resonance frequencies. PMID:24895643

  6. Simultaneous liquid chromatography/mass spectrometry determination of both polar and "multiresidue" pesticides in food using parallel hydrophilic interaction/reversed-phase liquid chromatography and a hybrid sample preparation approach.

    PubMed

    Robles-Molina, José; Gilbert-López, Bienvenida; García-Reyes, Juan F; Molina-Díaz, Antonio

    2017-09-29

    Pesticide testing of foodstuffs is usually accomplished with generic wide-scope multi-residue methods based on liquid chromatography tandem mass spectrometry (LC-MS/MS). However, this approach does not cover some special pesticides, the so-called "single-residue method" compounds, which are hardly compatible with standard reversed-phase (RP) separations due to their specific properties. In this article, we propose a comprehensive strategy for the integration of single-residue method compounds and standard multiresidue pesticides within a single run. It is based on the use of a parallel LC column assembly with two different LC gradients performing orthogonal hydrophilic interaction chromatography (HILIC) and reversed-phase (RPLC) chromatography within one analytical run. Two sample aliquots were simultaneously injected, one on each column, using different gradients, with the eluents merged post-column prior to mass spectrometry detection. The approach was tested with 41 multiclass pesticides covering a wide range of physicochemical properties across several orders of magnitude of log Kow (from -4 to +5.5). With this assembly, distinct separation from the void was attained for all the pesticides studied, keeping performance similar to standard single-column approaches in terms of sensitivity, peak area reproducibility (<6% RSD in most cases), and retention time stability (better than ±0.1 min). The application of the proposed approach using parallel HILIC/RPLC and RPLC/aqueous normal phase (Obelisc) columns was assessed in leek using LC-MS/MS. For this purpose, a hybrid QuEChERS (quick, easy, cheap, effective, rugged and safe)/QuPPe (quick method for polar pesticides) method was evaluated, based on solvent extraction with MeOH and acetonitrile followed by dispersive solid-phase extraction, delivering appropriate recoveries for most of the pesticides in the study within the log Kow range from -4 to +5.5. The proposed strategy may be extended to other fields, such as sports drug testing or environmental analysis, where the same variety of analytes featuring poor retention within a single chromatographic separation occurs. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Peripleural lung disease detection based on multi-slice CT images

    NASA Astrophysics Data System (ADS)

    Matsuhiro, M.; Suzuki, H.; Kawata, Y.; Niki, N.; Nakano, Y.; Ohmatsu, H.; Kusumoto, M.; Tsuchida, T.; Eguchi, K.; Kaneko, M.

    2015-03-01

    With the development of multi-slice CT technology, obtaining accurate 3D images of the lung field in a short time has become possible. To support this, many image processing methods need to be developed. Detecting peripleural lung disease is difficult because it lies outside the lung region, as lung extraction is often performed by threshold processing. The proposed method uses the thoracic inner region, extracted from the inner cavity of the bones as well as the air region, and covers peripleural lung disease cases such as lung nodules, calcification, pleural effusion, and pleural plaques. We applied this method to 50 cases, including 39 peripleural lung disease cases. The method detected all 39 peripleural lung diseases with 2.9 false positives per case.

  8. A Bayesian Scoring Technique for Mining Predictive and Non-Spurious Rules

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Hauskrecht, Milos

    2015-01-01

    Rule mining is an important class of data mining methods for discovering interesting patterns in data. The success of a rule mining method heavily depends on the evaluation function that is used to assess the quality of the rules. In this work, we propose a new rule evaluation score - the Predictive and Non-Spurious Rules (PNSR) score. This score relies on Bayesian inference to evaluate the quality of the rules and considers the structure of the rules to filter out spurious rules. We present an efficient algorithm for finding rules with high PNSR scores. The experiments demonstrate that our method is able to cover and explain the data with a much smaller rule set than existing methods. PMID:25938136

  9. A Bayesian Scoring Technique for Mining Predictive and Non-Spurious Rules.

    PubMed

    Batal, Iyad; Cooper, Gregory; Hauskrecht, Milos

    Rule mining is an important class of data mining methods for discovering interesting patterns in data. The success of a rule mining method heavily depends on the evaluation function that is used to assess the quality of the rules. In this work, we propose a new rule evaluation score - the Predictive and Non-Spurious Rules (PNSR) score. This score relies on Bayesian inference to evaluate the quality of the rules and considers the structure of the rules to filter out spurious rules. We present an efficient algorithm for finding rules with high PNSR scores. The experiments demonstrate that our method is able to cover and explain the data with a much smaller rule set than existing methods.

  10. A fast recognition method of warhead target in boost phase using kinematic features

    NASA Astrophysics Data System (ADS)

    Chen, Jian; Xu, Shiyou; Tian, Biao; Wu, Jianhua; Chen, Zengping

    2015-12-01

    The number of radar targets increases from one to several when a ballistic missile separates its lower-stage rocket or casts off covers and other components. It is vital for radar tracking to identify the warhead target quickly among these multiple targets. A fast recognition method for the warhead target is proposed to solve this problem using kinematic features, a fuzzy comprehensive evaluation method, and an information fusion method. To weaken the influence of radar measurement noise, an extended Kalman filter with a constant jerk model (CJEKF) is applied to obtain more accurate target motion information. Simulations show the validity of the algorithm and the effect of radar measurement precision on the algorithm's performance.
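
    For reference, the constant-jerk motion model that distinguishes the CJEKF from a constant-acceleration filter reduces, per axis, to the following state-transition matrix (a standard kinematic result, shown as a sketch):

        import numpy as np

        def constant_jerk_F(dt):
            """Transition matrix for state [position, velocity,
            acceleration, jerk] over a time step dt."""
            return np.array([[1.0, dt, dt**2 / 2, dt**3 / 6],
                             [0.0, 1.0, dt, dt**2 / 2],
                             [0.0, 0.0, 1.0, dt],
                             [0.0, 0.0, 0.0, 1.0]])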

  11. Pixel-Wise-Inter/Intra-Channel Color and Luminance Uniformity Corrections for Multi-Channel Projection Displays

    DTIC Science & Technology

    2016-08-11

    Inter- and intra-channel color and luminance are generally non-uniform in multi-channel...projection display systems. Several methods have been proposed to correct for both inter- and intra-channel color and luminance variation in multi-channel

  12. Merge Fuzzy Visual Servoing and GPS-Based Planning to Obtain a Proper Navigation Behavior for a Small Crop-Inspection Robot.

    PubMed

    Bengochea-Guevara, José M; Conesa-Muñoz, Jesus; Andújar, Dionisio; Ribeiro, Angela

    2016-02-24

    The concept of precision agriculture, which proposes farming management adapted to crop variability, has emerged in recent years. To effectively implement precision agriculture, data must be gathered from the field in an automated manner at minimal cost. In this study, a small autonomous field inspection vehicle was developed to minimise the impact of the scouting on the crop and soil compaction. The proposed approach integrates a camera with a GPS receiver to obtain a set of basic behaviours required of an autonomous mobile robot to inspect a crop field with full coverage. A path planner considered the field contour and the crop type to determine the best inspection route. An image-processing method capable of extracting the central crop row under uncontrolled lighting conditions in real time from images acquired with a reflex camera positioned on the front of the robot was developed. Two fuzzy controllers were also designed and developed to achieve vision-guided navigation. A method for detecting the end of a crop row using camera-acquired images was developed. In addition, manoeuvres necessary for the robot to change rows were established. These manoeuvres enabled the robot to autonomously cover the entire crop by following a previously established plan and without stepping on the crop row, which is an essential behaviour for covering crops such as maize without damaging them.

  14. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation.

    PubMed

    Chen, Ming; Yu, Hengyong

    2015-10-01

    This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++ and linked via a MEX interface. A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphics processing units.

  15. Capacitors Would Help Protect Against Hypervelocity Impacts

    NASA Technical Reports Server (NTRS)

    Edwards, David; Hubbs, Whitney; Hovater, Mary

    2007-01-01

    A proposal investigates alternatives to the present bumper method of protecting spacecraft against impacts of meteoroids and orbital debris. The proposed method is based on a British high-voltage-capacitance technique for protecting armored vehicles against shaped-charge warheads. A shield, according to the proposal, would include a bare metal outer layer separated by a gap from an inner metal layer covered with an electrically insulating material. The metal layers would constitute the electrodes of a capacitor, and a bias potential would be applied between them. A particle impinging at hypervelocity on the outer metal layer would break apart into a debris cloud that would penetrate the electrical insulation on the inner metal layer. The cloud would form a path along which electric current could flow between the metal layers, thereby causing the capacitor to discharge. With proper design, the discharge current would be large enough to vaporize the particles in the debris cloud and prevent penetration of the spacecraft. The shield design can be mass-optimized to be competitive with existing bumper designs. Parametric studies were proposed to determine the optimum correlation between bias voltage, impacting-particle velocity, gap spacing, and insulating material required to prevent spacecraft penetration.
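
    For intuition, the energy available from such a discharge follows the standard capacitor energy formula E = CV^2/2; the parameter values below are hypothetical, not from the proposal:

        # Back-of-envelope sketch of the discharge energy available to vaporize
        # the debris cloud; 1 uF at a 5 kV bias is a hypothetical design point.
        def discharge_energy_joules(capacitance_farads, bias_volts):
            return 0.5 * capacitance_farads * bias_volts**2

        print(discharge_energy_joules(1e-6, 5000.0))  # 12.5 J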

  16. An efficient intensity-based ready-to-use X-ray image stitcher.

    PubMed

    Wang, Junchen; Zhang, Xiaohui; Sun, Zhen; Yuan, Fuzhen

    2018-06-14

    The limited field of view of the X-ray image intensifier makes it difficult to cover a large target area with a single X-ray image. X-ray image stitching techniques have been proposed to produce a panoramic X-ray image. This paper presents an efficient intensity-based X-ray image stitcher, which does not rely on accurate C-arm motion control or auxiliary devices and hence is ready to use in the clinic. The stitcher consumes sequentially captured X-ray images with overlap areas and automatically produces a panoramic image. The gradient information for optimization of image alignment is obtained using a back-propagation scheme, so it is convenient to adopt various image warping models. The proposed stitcher has the following advantages over existing methods: (1) no additional hardware modification or auxiliary markers are needed; (2) it is more robust than feature-based approaches; (3) arbitrary warping models and shapes of the region of interest are supported; (4) seamless stitching is achieved using multi-band blending. Experiments have been performed to confirm the effectiveness of the proposed method. The proposed X-ray image stitcher is efficient, accurate and ready to use in the clinic. Copyright © 2018 John Wiley & Sons, Ltd.
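
    A minimal sketch of one intensity-based alignment step, assuming a pure-translation warp and using phase correlation from scikit-image; the stitcher described above supports arbitrary warping models optimized via back-propagated gradients and multi-band blending, none of which is shown:

        import numpy as np
        from skimage.registration import phase_cross_correlation

        ref = np.random.rand(256, 256)            # stand-ins for overlapping X-ray tiles
        mov = np.roll(ref, (5, -3), axis=(0, 1))  # second tile, shifted

        shift, error, _ = phase_cross_correlation(ref, mov)
        print(shift)  # ~[-5., 3.]: translation aligning mov onto ref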

  17. Simulation and experimental results of optical and thermal modeling of gold nanoshells.

    PubMed

    Ghazanfari, Lida; Khosroshahi, Mohammad E

    2014-09-01

    This paper proposes a generalized method for optical and thermal modeling of synthesized magneto-optical nanoshells (MNSs) for biomedical applications. Superparamagnetic magnetite nanoparticles with a diameter of 9.5 ± 1.4 nm are fabricated using the co-precipitation method and subsequently covered by a thin layer of gold to obtain 15.8 ± 3.5 nm MNSs. In this paper, simulations and detailed analysis are carried out for different nanoshell geometries to achieve maximum heating power. Structural, magnetic and optical properties of the MNSs are assessed using a vibrating sample magnetometer (VSM), X-ray diffraction (XRD), UV-VIS spectrophotometry, dynamic light scattering (DLS), and transmission electron microscopy (TEM). The magnetic saturation of the synthesized magnetite nanoparticles is reduced from 46.94 to 11.98 emu/g after coating with gold. The performance of the proposed optical-thermal modeling technique is verified by simulation and experimental results. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Observer synthesis for a class of Takagi-Sugeno descriptor system with unmeasurable premise variable. Application to fault diagnosis

    NASA Astrophysics Data System (ADS)

    López-Estrada, F. R.; Astorga-Zaragoza, C. M.; Theilliol, D.; Ponsart, J. C.; Valencia-Palomo, G.; Torres, L.

    2017-12-01

    This paper proposes a methodology to design a Takagi-Sugeno (TS) descriptor observer for a class of TS descriptor systems. Unlike the popular approach that considers measurable premise variables, this paper considers premise variables depending on unmeasurable vectors, e.g. the system states. This consideration covers a large class of nonlinear systems and represents a real challenge for observer synthesis. Sufficient conditions to guarantee robustness against the unmeasurable premise variables and asymptotic convergence of the TS descriptor observer are obtained based on the H∞ approach together with the Lyapunov method. As a result, the design conditions are given in terms of linear matrix inequalities (LMIs). In addition, sensor fault detection and isolation are performed by means of a generalised observer bank. Two numerical experiments, an electrical circuit and a rolling disc system, are presented in order to illustrate the effectiveness of the proposed method.
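
    The design conditions are LMIs solved by semidefinite programming; as a simpler runnable stand-in, the classical Lyapunov test below certifies stability of a hypothetical system matrix, which is the core idea underlying such conditions:

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        # Solve A.T @ P + P @ A = -Q and check P > 0; the paper's observer
        # LMIs are considerably more involved than this stand-in.
        A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # hypothetical system matrix
        P = solve_continuous_lyapunov(A.T, -np.eye(2))

        print(np.linalg.eigvalsh(P))  # all positive => asymptotically stable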

  19. Optical hiding with visual cryptography

    NASA Astrophysics Data System (ADS)

    Shi, Yishi; Yang, Xiubo

    2017-11-01

    We propose an optical hiding method based on visual cryptography. In the hiding process, we convert the secret information into a set of fabricated phase-keys, which are completely independent of each other, proof against intensity detection, and covered by images, leading to high security. During the extraction process, the covered phase-keys are illuminated with laser beams and then incoherently superimposed to extract the hidden information directly by human vision, without complicated optical implementations or any additional computation, making extraction convenient. Also, the phase-keys are manufactured as diffractive optical elements that are robust to attacks such as blocking and phase noise. Optical experiments verify that high security, easy extraction and strong robustness are all attainable in the proposed visual-cryptography-based optical hiding.

  20. Evaluation of SLAR and thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques. [south carolina

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1979-01-01

    A literature review on radar and spectral band information was conducted, and an NC-130 mission was flown carrying the NS001 scanner system, which basically corresponds to the channel configuration of the proposed thematic mapper. Aerial photography and other reference data were obtained for the study site, an area of approximately 290 sq miles in north central South Carolina. A cover type map was prepared, and methods were devised for reformatting and geometrically correcting MSS CRT data. Arrangements were made to obtain LANDSAT data for dates approximating the NC-130 mission. Because of the waveband employed to obtain SEASAT radar data, it was decided to determine if X-band (2.40 cm to 3.75 cm wavelength) imagery is available.

  1. Monitoring forest dynamics with multi-scale and time series imagery.

    PubMed

    Huang, Chunbo; Zhou, Zhixiang; Wang, Di; Dian, Yuanyong

    2016-05-01

    To monitor forest dynamics and evaluate the ecosystem services of forests effectively, timely acquisition of spatial and quantitative information on forestland is necessary. Here, a new method is proposed for mapping forest cover changes by combining multi-scale satellite remote-sensing imagery with time series data. Using time series Normalized Difference Vegetation Index products derived from Moderate Resolution Imaging Spectroradiometer images (MODIS-NDVI) and Landsat Thematic Mapper/Enhanced Thematic Mapper Plus (TM/ETM+) images as data sources, a hierarchical stepwise analysis from coarse scale to fine scale was developed for detecting the forest change area. At the coarse scale, MODIS-NDVI data with 1-km resolution were used to detect changes in land cover types, and a land cover change map was constructed using NDVI values from vegetation growing seasons. At the fine scale, based on the results at the coarse scale, Landsat TM/ETM+ data with 30-m resolution were used to precisely detect the forest change location and trend by analyzing time series forest vegetation indices (IFZ). The method was tested using data for Hubei Province, China. The MODIS-NDVI data from 2001 to 2012 were used to detect the land cover changes, and the overall accuracy was 94.02 % at the coarse scale. At the fine scale, the available TM/ETM+ images from vegetation growing seasons between 2001 and 2012 were used to locate and verify forest changes in the Three Gorges Reservoir Area, and the overall accuracy was 94.53 %. The accuracy of the two-layer hierarchical monitoring results indicated that the multi-scale monitoring method is feasible and reliable.
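
    A minimal sketch of the coarse-scale screening idea, assuming that a sharp drop in growing-season NDVI between two dates flags candidate change pixels for fine-scale Landsat analysis; the arrays and threshold are hypothetical stand-ins, not the paper's calibration:

        import numpy as np

        ndvi_t0 = np.random.rand(100, 100)            # stand-in MODIS-NDVI composites
        ndvi_t1 = ndvi_t0 - np.random.rand(100, 100) * 0.3

        candidate_change = (ndvi_t0 - ndvi_t1) > 0.2  # coarse-scale screening
        print(candidate_change.sum(), "pixels flagged for fine-scale analysis")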

  2. Estimation of the snow water equivalent in a subarctic environment of Quebec by passive microwave remote sensing

    NASA Astrophysics Data System (ADS)

    Vachon, Francois

    The snow cover (extent, depth and water equivalent) is an important factor in assessing the water balance of a territory. In a context of deregulation of the electricity market, better knowledge of the quantity of water resulting from snowmelt that will be available for hydroelectric power generation has become a major challenge for the managers of Hydro-Quebec's generating facilities. In fact, the snow on the ground represents nearly one third of Hydro-Quebec's annual energy reserve, and the proportion is even higher for northern watersheds. Snow cover knowledge would therefore help optimize the management of energy stocks. The issue is especially important when one considers that better management of water resources can lead to substantial economic benefits. The Research Institute of Hydro-Quebec (IREQ), our research partner, is currently attempting to optimize the streamflow forecasts made by its hydrological models by improving the quality of the inputs. These include a parameter known as the snow water equivalent (SWE), which characterizes the properties of the snow cover. At present, SWE data are obtained from in situ measurements, which are both sporadic and scattered, and do not allow the temporal and spatial variability of SWE to be characterized adequately for the needs of hydrological models. This research project proposes to provide the Quebec utility's hydrological models with distributed SWE information about its northern watersheds. The targeted accuracy is 15% for the proposed period of analysis covering the winter months of January, February and March of 2001 to 2006. The methodology is based on the HUT snow emission model and uses passive microwave remote sensing data acquired by the SSM/I sensor. Monitoring of the temporal and spatial variations in SWE is done by inversion of the model and benefits from the assimilation of in situ data to characterize the state of the snow cover during the season. Experimental results show that the assimilation technique of in situ data (density and depth) can reproduce the temporal variations in SWE with an RMSE of 15.9% (R2=0.76). The analysis of land cover within the SSM/I pixels can reduce this error to 14.6% (R2=0.66) for SWE values below 300 mm. Moreover, the results show that the fluctuations of SWE values are driven by changes in snow depth. Indeed, the use of a constant value for the density of snow is feasible and makes it possible to get as good if not better results. These results will allow IREQ to assess the suitability of using snow cover information provided by the remote sensing data in its forecasting models. This improvement in SWE characterization will meet the needs of IREQ for its work on optimizing the quality of hydrological simulations. The originality and relevance of this work rest primarily on the type of method used to quantify SWE and the site where it is applied. The proposed method focuses on the inversion of the HUT model from passive remote sensing data and assimilates in situ data. Moreover, this approach allows high SWE values (> 300 mm) to be quantified, which was impossible with previous methods. These high SWE values are encountered in areas with large amounts of snow such as northern Quebec. Keywords: remote sensing, microwave, snow water equivalent (SWE), model, retrieval, data assimilation, SWE monitoring, spatialization. Complete reference: Vachon, F. (2009) Snow water equivalent retrieval in a subarctic environment of Quebec using passive microwave remote sensing. Ph.D. Thesis, Sherbrooke University, Sherbrooke, 211 p.

  3. 78 FR 16263 - Agency Information Collection Activities; Proposed Information Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-14

    ... collection titled, ``Annual Stress Test Reporting Template and Documentation for Covered Banks with [email protected] . Include ``Annual Stress Test Reporting Template and Documentation for Covered Banks with...: Annual Stress Test Reporting Template and Documentation for Covered Banks With Total Consolidated Assets...

  4. Wide coverage biomedical event extraction using multiple partially overlapping corpora

    PubMed Central

    2013-01-01

    Background Biomedical events are key to understanding physiological processes and disease, and wide coverage extraction is required for comprehensive automatic analysis of statements describing biomedical systems in the literature. In turn, the training and evaluation of extraction methods requires manually annotated corpora. However, as manual annotation is time-consuming and expensive, any single event-annotated corpus can only cover a limited number of semantic types. Although combined use of several such corpora could potentially allow an extraction system to achieve broad semantic coverage, there has been little research into learning from multiple corpora with partially overlapping semantic annotation scopes. Results We propose a method for learning from multiple corpora with partial semantic annotation overlap, and implement this method to improve our existing event extraction system, EventMine. An evaluation using seven event annotated corpora, including 65 event types in total, shows that learning from overlapping corpora can produce a single, corpus-independent, wide coverage extraction system that outperforms systems trained on single corpora and exceeds previously reported results on two established event extraction tasks from the BioNLP Shared Task 2011. Conclusions The proposed method allows the training of a wide-coverage, state-of-the-art event extraction system from multiple corpora with partial semantic annotation overlap. The resulting single model makes broad-coverage extraction straightforward in practice by removing the need to either select a subset of compatible corpora or semantic types, or to merge results from several models trained on different individual corpora. Multi-corpus learning also allows annotation efforts to focus on covering additional semantic types, rather than aiming for exhaustive coverage in any single annotation effort, or extending the coverage of semantic types annotated in existing corpora. PMID:23731785

  5. Unsupervised Wishart Classification of Wetlands in Newfoundland, Canada Using PolSAR Data Based on Fisher Linear Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Mohammadimanesh, F.; Salehi, B.; Mahdianpari, M.; Homayouni, S.

    2016-06-01

    Polarimetric Synthetic Aperture Radar (PolSAR) imagery is a complex multi-dimensional dataset and an important source of information for various natural resources and environmental classification and monitoring applications. PolSAR imagery produces valuable information by observing scattering mechanisms from different natural and man-made objects. Land cover mapping using PolSAR data classification is one of the most important applications of SAR remote sensing earth observation, and it has gained increasing attention in recent years. However, one of the most challenging aspects of classification is selecting features with maximum discrimination capability. To address this challenge, a statistical approach based on Fisher Linear Discriminant Analysis (FLDA) and the incorporation of the physical interpretation of PolSAR data into classification is proposed in this paper. After pre-processing of the PolSAR data, including speckle reduction, the H/α classification is used to classify the basic scattering mechanisms. Then, a new method for feature weighting, based on the fusion of FLDA and physical interpretation, is implemented. This method proves to increase the classification accuracy as well as the between-class discrimination in the final Wishart classification. The proposed method was applied to a full polarimetric C-band RADARSAT-2 data set from the Avalon area, Newfoundland and Labrador, Canada. This imagery was acquired in June 2015 and covers various types of wetlands, including bogs, fens, marshes and shallow water. The results were compared with the standard Wishart classification, and an improvement of about 20% was achieved in the overall accuracy. This method provides an opportunity for operational wetland classification in northern latitudes with high accuracy using only SAR polarimetric data.
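
    A minimal sketch of Fisher LDA used to rank feature discrimination, loosely in the spirit of the FLDA-based weighting described above; the features and labels are synthetic placeholders, not PolSAR data:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        X = np.random.rand(200, 8)        # 8 hypothetical PolSAR features
        y = np.random.randint(0, 4, 200)  # 4 hypothetical wetland classes

        lda = LinearDiscriminantAnalysis().fit(X, y)
        weights = np.abs(lda.coef_).mean(axis=0)  # crude per-feature weight
        print(weights / weights.sum())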

  6. Probabilistic Risk Assessment of a Turbine Disk

    NASA Astrophysics Data System (ADS)

    Carter, Jace A.; Thomas, Michael; Goswami, Tarun; Fecke, Ted

    Current Federal Aviation Administration (FAA) rotor design certification practice assesses risk using a probabilistic framework focused on only the life-limiting defect location of a component. This method generates conservative approximations of the operational risk. The first section of this article covers a discretization method, which allows for a transition from this relative risk to an absolute risk where the component is discretized into regions called zones. General guidelines were established for the zone-refinement process based on the stress gradient topology in order to reach risk convergence. The second section covers a risk assessment method for predicting the total fatigue life due to fatigue-induced damage. The total fatigue life incorporates a dual-mechanism approach including the crack initiation life and the propagation life while simultaneously determining the associated initial flaw sizes. A microstructure-based model was employed to address uncertainties in material response and relate crack initiation life with crack size, while propagation life was characterized by large-crack growth laws. The two proposed methods were applied to a representative Inconel 718 turbine disk. The zone-based method reduces the conservatism of the baseline approach, while showing the effects of feature-based inspection on the risk assessment. In the fatigue damage assessment, the predicted initial crack distribution was found to be the most sensitive probabilistic parameter and can be used to establish enhanced inspection planning.

  7. A New Quantum Watermarking Based on Quantum Wavelet Transforms

    NASA Astrophysics Data System (ADS)

    Heidari, Shahrokh; Naseri, Mosayeb; Gheibi, Reza; Baghfalaki, Masoud; Rasoul Pourarian, Mohammad; Farouk, Ahmed

    2017-06-01

    Quantum watermarking is a technique to embed specific information, usually the owner's identification, into quantum cover data, typically for copyright protection purposes. In this paper, a new scheme for quantum watermarking based on quantum wavelet transforms is proposed, which includes scrambling, embedding and extracting procedures. The invisibility and robustness of the proposed watermarking method are confirmed by simulation. The invisibility of the scheme is examined by the peak-signal-to-noise ratio (PSNR) and the histogram calculation. Furthermore, the robustness of the scheme is analyzed by the Bit Error Rate (BER) and the Two-Dimensional Correlation (Corr 2-D) calculation. The simulation results indicate that the proposed watermarking scheme achieves not only acceptable visual quality but also good resistance against different types of attack. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, Iran.
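
    The two metrics named above are standard and easy to state; a classical-image sketch (not a quantum-circuit implementation) follows:

        import numpy as np

        def psnr(cover, marked, peak=255.0):
            """Peak signal-to-noise ratio between cover and watermarked images."""
            mse = np.mean((cover.astype(float) - marked.astype(float)) ** 2)
            return 10 * np.log10(peak**2 / mse)

        def bit_error_rate(bits_in, bits_out):
            """Fraction of watermark bits flipped by an attack."""
            return np.mean(bits_in != bits_out)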

  8. Arctic lead detection using a waveform mixture algorithm from CryoSat-2 data

    NASA Astrophysics Data System (ADS)

    Lee, Sanggyun; Kim, Hyun-cheol; Im, Jungho

    2018-05-01

    We propose a waveform mixture algorithm to detect leads from CryoSat-2 data, which is novel and different from the existing threshold-based lead detection methods. The waveform mixture algorithm adopts the concept of spectral mixture analysis, which is widely used in the field of hyperspectral image analysis. This lead detection method was evaluated with high-resolution (250 m) MODIS images and showed comparable and promising performance in detecting leads when compared to the previous methods. The robustness of the proposed approach also lies in the fact that it does not require the rescaling of parameters (i.e., stack standard deviation, stack skewness, stack kurtosis, pulse peakiness, and backscatter σ0), as it directly uses L1B waveform data, unlike the existing threshold-based methods. Monthly lead fraction maps were produced by the waveform mixture algorithm, which show the interannual variability of recent sea ice cover during 2011-2016, excluding the summer season (i.e., June to September). We also compared the lead fraction maps to other lead fraction maps generated from previously published data sets, resulting in similar spatiotemporal patterns.
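
    A minimal sketch of the spectral-mixture idea applied to waveforms: an observed waveform is expressed as a non-negative combination of reference "endmember" waveforms. The synthetic endmembers below merely stand in for lead-like (specular) and ice-like (diffuse) returns:

        import numpy as np
        from scipy.optimize import nnls

        t = np.linspace(0, 4, 128)
        lead = np.exp(-t)                # peaky, specular stand-in
        ice = np.exp(-((t - 1.0) ** 2))  # diffuse stand-in
        observed = 0.7 * lead + 0.3 * ice

        abundances, residual = nnls(np.column_stack([lead, ice]), observed)
        print(abundances)  # ~[0.7, 0.3]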

  9. Object Manifold Alignment for Multi-Temporal High Resolution Remote Sensing Images Classification

    NASA Astrophysics Data System (ADS)

    Gao, G.; Zhang, M.; Gu, Y.

    2017-05-01

    Classification of multi-temporal remote sensing images is very useful for monitoring land cover changes. Traditional approaches in this field mainly face limited labelled samples and spectral drift of image information. With spatial resolution improvement, the "pepper and salt" effect appears, and classification results are affected when pixelwise classification algorithms, which ignore the spatial relationship among pixels, are applied to high-resolution satellite images. For classifying multi-temporal high resolution images under limited labelled samples, spectral drift and the "pepper and salt" problem, an object-based manifold alignment method is proposed. Firstly, the multi-temporal multispectral images are cut into superpixels by simple linear iterative clustering (SLIC). Secondly, features obtained from each superpixel are formed into a vector. Thirdly, a majority-voting manifold alignment method aimed at the high-resolution problem is proposed, mapping the vector data to an alignment space. Finally, all data in the alignment space are classified using the KNN method. Multi-temporal images from different areas or the same area are both considered in this paper. In the experiments, 2 groups of multi-temporal HR images collected by the Chinese GF1 and GF2 satellites are used for performance evaluation. Experimental results indicate that the proposed method not only significantly outperforms traditional domain adaptation methods in classification accuracy, but also effectively overcomes the "pepper and salt" problem.
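
    A minimal sketch of two of the building blocks named above, SLIC superpixels and KNN classification, using scikit-image and scikit-learn; the manifold alignment step itself is omitted and the image is synthetic:

        import numpy as np
        from skimage.segmentation import slic
        from sklearn.neighbors import KNeighborsClassifier

        image = np.random.rand(64, 64, 3)
        segments = slic(image, n_segments=50, compactness=10.0)

        # one hypothetical mean-color feature vector per superpixel
        feats = np.array([image[segments == s].mean(axis=0)
                          for s in np.unique(segments)])
        labels = np.random.randint(0, 3, len(feats))  # stand-in training labels
        clf = KNeighborsClassifier(n_neighbors=3).fit(feats, labels)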

  10. An improved discriminative filter bank selection approach for motor imagery EEG signal classification using mutual information.

    PubMed

    Kumar, Shiu; Sharma, Alok; Tsunoda, Tatsuhiko

    2017-12-28

    Common spatial pattern (CSP) has been an effective technique for feature extraction in electroencephalography (EEG) based brain computer interfaces (BCIs). However, motor imagery EEG signal feature extraction using CSP generally depends to a great extent on the selection of the frequency bands. In this study, we propose a mutual information based frequency band selection approach. The idea of the proposed method is to utilize the information from all the available channels for effectively selecting the most discriminative filter banks. CSP features are extracted from multiple overlapping sub-bands. An additional sub-band has been introduced that covers the wide frequency band (7-30 Hz), and two different types of features are extracted using CSP and common spatio-spectral pattern techniques, respectively. Mutual information is then computed from the extracted features of each of these bands, and the top filter banks are selected for further processing. Linear discriminant analysis is applied to the features extracted from each of the filter banks. The scores are fused together, and classification is done using a support vector machine. The proposed method is evaluated using BCI Competition III dataset IVa, BCI Competition IV dataset I and BCI Competition IV dataset IIb, and it outperformed all other competing methods, achieving the lowest misclassification rate and the highest kappa coefficient on all three datasets. By introducing a wide sub-band and using mutual information for selecting the most discriminative sub-bands, the proposed method shows improvement in motor imagery EEG signal classification.
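
    A minimal sketch of ranking sub-bands by mutual information with the class label; the per-band CSP feature extraction is assumed already done and is not shown:

        import numpy as np
        from sklearn.feature_selection import mutual_info_classif

        n_trials, n_bands, n_csp = 120, 9, 4
        band_feats = np.random.rand(n_trials, n_bands, n_csp)  # hypothetical CSP features
        y = np.random.randint(0, 2, n_trials)

        mi_per_band = np.array([mutual_info_classif(band_feats[:, b, :], y).sum()
                                for b in range(n_bands)])
        top_bands = np.argsort(mi_per_band)[::-1][:4]  # keep the 4 best sub-bands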

  11. Wetland fire scar monitoring and analysis using archival Landsat data for the Everglades

    USGS Publications Warehouse

    Jones, John W.; Hall, Annette E.; Foster, Ann M.; Smith, Thomas J.

    2013-01-01

    The ability to document the frequency, extent, and severity of fires in wetlands, as well as the dynamics of post-fire wetland land cover, informs fire and wetland science, resource management, and ecosystem protection. Available information on Everglades burn history has been based on field data collection methods that evolved through time and differ by land management unit. Our objectives were to (1) design and test broadly applicable and repeatable metrics of not only fire scar delineation but also post-fire land cover dynamics through exhaustive use of the Landsat satellite data archives, and then (2) explore how those metrics relate to various hydrologic and anthropogenic factors that may influence post-fire land cover dynamics. Visual interpretation of every Landsat scene collected over the study region during the study time frame produced a new, detailed database of burn scars greater than 1.6 ha in size in the Water Conservation Areas and post-fire land cover dynamics for Everglades National Park fires greater than 1.6 ha in area. Median burn areas were compared across several landscape units of the Greater Everglades and found to differ as a function of administrative unit and fire history. Some burned areas transitioned to open water, exhibiting water depths and dynamics that support transition mechanisms proposed in the literature. Classification tree techniques showed that time to green-up and return to pre-burn character were largely explained by fire management practices and hydrology. Broadly applicable as they use data from the global, nearly 30-year-old Landsat archive, these methods for documenting wetland burn extent and post-fire land cover change enable cost-effective collection of new data on wetland fire ecology and independent assessment of fire management practice effectiveness.

  12. An automatic approach for rice mapping in temperate region using time series of MODIS imagery: first results for Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Boschetti, M.; Nelson, A.; Manfrom, G.; Brivio, P. A.

    2012-04-01

    Timely and accurate information on crop typology and status is required to support suitable action to better manage agricultural production and reduce food insecurity. More specifically, regional crop masking and phenological information are important inputs for spatialized crop growth models in yield forecasting systems. Digital cartographic data available at the global/regional scale, such as GLC2000, GLOBCOVER or the MODIS land cover products (MOD12), are often not adequate for this crop modeling application. For this reason, there is a need to develop and test methods that can provide such information for specific crops using automated classification techniques. In this framework we focused our analysis on the detection of rice cultivation areas due to the importance of this crop. Rice is a staple food for half of the world's population (FAO 2004). Over 90% of the world's rice is produced and consumed in Asia, and the region is home to 70% of the world's poor, most of whom depend on rice for their livelihoods and/or food security. Several initiatives are being promoted at the international level to provide maps of rice cultivated areas in South and South East Asia using different approaches available in the literature for rice mapping in tropical regions. We contribute to these efforts by proposing an automatic method to detect rice cultivated areas in temperate regions exploiting MODIS 8-day composites of surface reflectance at 500 m spatial resolution (MOD09A1 product). Temperate rice is cultivated worldwide in more than 20 countries, covering around 16M ha for a total production of about 65M tons of paddy per year. The proposed method is based on a common approach available in the literature that first identifies flood conditions that can be related to rice agronomic practice and then checks for vegetation growth. The method presents innovative aspects related both to flood detection, exploiting Short Wave Infrared spectral information, and to crop growth monitoring, analyzing the seasonal trend of a vegetation index. Tests conducted in a European Mediterranean environment demonstrated that our approach is able to provide an accurate rice map (User Accuracy > 80%) when compared to the available Corine Land Cover land use map (1:100.000 scale, MMU 25 ha). Map accuracy in terms of omission and commission errors has been analyzed in northern Italy, where about 60% of total European rice is produced. For this study area, thematic cartography at 1:10.000 scale allowed us to analyze the types of commission errors and evaluate the extent of omission errors in relation to low resolution bias and/or algorithm performance. The Pareto boundary method has been used to assess the accuracy of the method with respect to the maximum achievable accuracy with medium resolution MODIS data. Results demonstrate that the proposed approach outperforms methods developed for tropical and sub-tropical environments.

  13. Cellulosic Biofuel Production with Winter Cover Crops: Yield and Nitrogen Implications

    USDA-ARS?s Scientific Manuscript database

    Interest in renewable energy sources derived from plant biomass is increasing. Growing cover crops after harvest of the primary crop has been proposed as a solution to producing cellulosic biomass on existing crop-producing land without reducing food-harvest potential. Growing cover crops is a recom...

  14. Mining method selection by integrated AHP and PROMETHEE method.

    PubMed

    Bogdanovic, Dejan; Nikolic, Djordje; Ilic, Ivana

    2012-03-01

    Selecting the best mining method among many alternatives is a multicriteria decision making problem. The aim of this paper is to demonstrate the implementation of an integrated approach that employs AHP and PROMETHEE together for selecting the most suitable mining method for the "Coka Marin" underground mine in Serbia. The related problem includes five possible mining methods and eleven criteria to evaluate them. The criteria are carefully chosen to cover the most important parameters that impact mining method selection, such as geological and geotechnical properties, economic parameters and geographical factors. The AHP is used to analyze the structure of the mining method selection problem and to determine the weights of the criteria, and the PROMETHEE method is used to obtain the final ranking and to perform a sensitivity analysis by changing the weights. The results have shown that the proposed integrated method can be successfully used in solving mining engineering problems.
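
    A minimal sketch of the AHP weighting step: criterion weights are taken as the normalized principal eigenvector of a pairwise comparison matrix. Three criteria are shown for brevity (the paper uses eleven), and the comparison values are hypothetical:

        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])  # hypothetical pairwise comparisons

        eigvals, eigvecs = np.linalg.eig(A)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        print(w / w.sum())  # relative importance of the three criteria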

  15. Dating human skeletal remains using 90Sr and 210Pb: case studies.

    PubMed

    Schrag, Bettina; Uldin, Tanya; Mangin, Patrice; Bochud, François; Froidevaux, Pascal

    2014-01-01

    In legal medicine, the post mortem interval (PMI) of interest covers the last 50 years. When only human skeletal remains are found, determining the PMI currently relies mostly on the experience of the forensic anthropologist, with few techniques available to help. Recently, several radiometric methods have been proposed to reveal PMI. For instance, (14)C and (90)Sr bomb pulse dating cover the last 60 years and give reliable PMIs when teeth or bones are available. (232)Th series dating has also been proposed but requires a large amount of bone. In addition, (210)Pb dating is promising but is subject to diagenesis and individual habits, such as smoking, that must be handled carefully. Here we determine PMI in 29 cases of forensic interest using the (90)Sr bomb pulse. In 12 cases, (210)Pb dating was added to narrow the PMI interval. In addition, anthropological investigations were carried out on 15 cases to confront anthropological expertise with the radiometric method. Results show that 10 of the 29 cases can be discarded as having no forensic interest (PMI>50 years) based only on (90)Sr bomb pulse dating. For 10 other cases, the additional (210)Pb dating restricts the PMI uncertainty to a few years. In 15 cases, anthropological investigations corroborate the radiometric PMI. This study also shows that diagenesis and inter-individual differences in radionuclide uptake represent the main sources of uncertainty in PMI determination using radiometric methods. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  16. Forest Stand Segmentation Using Airborne LIDAR Data and Very High Resolution Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet, Valérie; Hervieu, Alexandre

    2016-06-01

    Forest stands are the basic units for forest inventory and mapping. Stands are large forested areas (e.g., ≥ 2 ha) of homogeneous tree species composition. The accurate delineation of forest stands is usually performed by visual analysis of human operators on very high resolution (VHR) optical images. This work is highly time-consuming and should be automated for scalability purposes. In this paper, a method based on the fusion of airborne laser scanning (lidar) data and very high resolution multispectral imagery for automatic forest stand delineation and forest land-cover database updating is proposed. The multispectral images give access to the tree species, whereas the 3D lidar point clouds provide geometric information on the trees. Therefore, multi-modal features are computed at both pixel and object levels. The objects are individual trees extracted from the lidar data. A supervised classification is performed at the object level on the computed features in order to coarsely discriminate the existing tree species in the area of interest. The analysis at tree level is particularly relevant since it significantly improves the tree species classification. A probability map is generated through the tree species classification and combined with the pixel-based feature map in an energy framework. The proposed energy is then minimized using a standard graph-cut method (namely QPBO with α-expansion) in order to produce a segmentation map with a controlled level of detail. Comparison with an existing forest land cover database shows that our method provides satisfactory results both in terms of stand labelling and delineation (matching rates between 94% and 99%).

  17. Triple ionization chamber method for clinical dose monitoring with a Be-covered Li BNCT field.

    PubMed

    Nguyen, Thanh Tat; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Nguyen, Chien Cong; Endo, Satoru

    2016-11-01

    Fast neutron, gamma-ray, and boron doses have different relative biological effectiveness (RBE). In boron neutron capture therapy (BNCT), the clinical dose is the total of these dose components multiplied by their RBE. Clinical dose monitoring is necessary for quality assurance of the irradiation profile; therefore, the fast neutron, gamma-ray, and boron doses should be separately monitored. To estimate these doses separately, and to monitor the boron dose without monitoring the thermal neutron fluence, the authors propose a triple ionization chamber method using graphite-walled carbon dioxide gas (C-CO2), tissue-equivalent plastic-walled tissue-equivalent gas (TE-TE), and boron-loaded tissue-equivalent plastic-walled tissue-equivalent gas [TE(B)-TE] chambers. To use this method for dose monitoring in a neutron and gamma-ray field moderated by D2O from a Be-covered Li target (Be-covered Li BNCT field), the relative sensitivities of these ionization chambers are required. The relative sensitivities of the TE-TE, C-CO2, and TE(B)-TE chambers to fast neutron, gamma-ray, and boron doses are calculated with the particle and heavy-ion transport code system (PHITS). The relative sensitivity of the TE(B)-TE chamber is calculated with the same method as for the TE-TE and C-CO2 chambers in the paired chamber method. In the Be-covered Li BNCT field, the relative sensitivities of the ionization chambers to fast neutron, gamma-ray, and boron doses are calculated from the kerma ratios, mass attenuation coefficient tissue-to-wall ratios, and W-values. The Be-covered Li BNCT field consists of neutrons and gamma-rays emitted from a Be-covered Li target, and this resultant field is simulated by using PHITS with the ENDF-VII cross section library. The kerma ratios and mass attenuation coefficient tissue-to-wall ratios are determined from the energy spectra of neutrons and gamma-rays in the Be-covered Li BNCT field. The W-value is calculated from recoil charged particle spectra produced by the collision of neutrons and gamma-rays with the wall and gas materials of the ionization chambers in the gas cavities of the TE-TE, C-CO2, and TE(B)-TE chambers (10B concentrations of 10, 50, and 100 ppm in the TE wall). The calculated relative sensitivity of the C-CO2 chamber to the fast neutron dose in the Be-covered Li BNCT field is 0.029, and those of the TE-TE and TE(B)-TE chambers are both equal to 0.965. The relative sensitivities of the C-CO2, TE-TE, and TE(B)-TE chambers to the gamma-ray dose in the Be-covered Li BNCT field are all 1 within the 1% calculation uncertainty. The relative sensitivities of TE(B)-TE to the boron dose with concentrations of 10, 50, and 100 ppm 10B are calculated to be 0.865 times the ratio of the in-tumor to in-chamber-wall boron concentration. The fast neutron, gamma-ray, and boron doses of a tumor in air can be separately monitored by the triple ionization chamber method in the Be-covered Li BNCT field. The results show that these doses can be easily converted to the clinical dose with the depth correction factor in the body and the RBE.

  18. 77 FR 73500 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change Relating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-10

    ... Trading of Shares of the Horizons S&P 500 Covered Call ETF, Horizons S&P Financial Select Sector Covered Call ETF, and Horizons S&P Energy Select Sector Covered Call ETF Under NYSE Arca Equities Rule 5.2(j)(3... and trade shares (``Shares'') of the Horizons S&P 500 Covered Call ETF, Horizons S&P Financial Select...

  19. Cover/Frequency (CF)

    Treesearch

    John F. Caratti

    2006-01-01

    The FIREMON Cover/Frequency (CF) method is used to assess changes in plant species cover and frequency for a macroplot. This method uses multiple quadrats to sample within-plot variation and quantify statistically valid changes in plant species cover, height, and frequency over time. Because it is difficult to estimate cover in quadrats for larger plants, this method...

  20. Sighting frequency and relative abundance of bottlenose dolphins (Tursiops truncatus) along the northeast coast of Margarita Island and Los Frailes Archipelago, Venezuela.

    PubMed

    Oviedo, Lenin; Silva, Noemi

    2005-01-01

    The study of local cetaceans in Venezuela has a very recent history, and few efforts have been made to assess coastal populations based on field research. The occurrence of whales and dolphins along the northeast coast of Venezuela has been documented through sightings and stranding records. Given the underwater topographical features and the influence of upwelling processes, this area is considered a very productive coastal ecosystem. Our objective was to establish the sighting frequency and relative abundance of bottlenose dolphins in the area. Sighting records were gathered on bottlenose dolphins and other cetacean species occurring along the northeast coast of Margarita Island and Los Frailes Archipelago through direct observation during land-based (6 surveys, 48 hours of observation) and boat-based surveys (24 surveys, 121 hours of observation, 1295 km covered). A sighting frequency was calculated using two methodologies and then compared, considering: 1) a mean effective observation time (4.27 hours), and 2) the distance covered with cetacean sightings (1108 km). A third method is proposed, relating a mean effective distance covered with cetacean sightings and expressed as a percentage. The abundance index was calculated using the mean effective observation time. The sighting frequency of Tursiops truncatus in the study area was 3-4 sightings per 4.27-hour observation day, or per 185 kilometers covered. The relative abundance was calculated as 35 dolphins in the study area, so a total population of fewer than 60 dolphins could inhabit the proposed range. Tursiops truncatus is the dominant species on the northeast coast of Margarita Island and Los Frailes Archipelago, with 70% of all sightings, so this locality could be considered the distribution range of a possible local population of bottlenose dolphins.

  1. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    Manufacturing process environments require reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine. The reliable results produced by OEE can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; however, the how factor, namely the implementation of OEE in a manufacturing process environment, has received little attention. Thus, this paper presents a practical framework to implement OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and later improve it.
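
    OEE itself is a simple product of three ratios, Availability x Performance x Quality; a minimal sketch with hypothetical shift data (not from the paper):

        def oee(run_time, planned_time, ideal_cycle_time, total_count, good_count):
            availability = run_time / planned_time
            performance = (ideal_cycle_time * total_count) / run_time
            quality = good_count / total_count
            return availability * performance * quality

        print(oee(run_time=420, planned_time=480, ideal_cycle_time=0.5,
                  total_count=700, good_count=665))  # ~0.69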

  2. A unified in vitro evaluation for apatite-forming ability of bioactive glasses and their variants.

    PubMed

    Maçon, Anthony L B; Kim, Taek B; Valliant, Esther M; Goetschius, Kathryn; Brow, Richard K; Day, Delbert E; Hoppe, Alexander; Boccaccini, Aldo R; Kim, Ill Yong; Ohtsuki, Chikara; Kokubo, Tadashi; Osaka, Akiyoshi; Vallet-Regí, Maria; Arcos, Daniel; Fraile, Leandro; Salinas, Antonio J; Teixeira, Alexandra V; Vueva, Yuliya; Almeida, Rui M; Miola, Marta; Vitale-Brovarone, Chiara; Verné, Enrica; Höland, Wolfram; Jones, Julian R

    2015-02-01

    The aim of this study was to propose and validate a new unified method for testing the dissolution rates of bioactive glasses and their variants, and the formation of a calcium phosphate layer on their surface, which is an indicator of bioactivity. At present, comparison in the literature is difficult as many groups use different testing protocols. An ISO standard covers the use of simulated body fluid on standard-shaped materials, but it does not take into account that bioactive glasses can have very different specific surface areas, as is the case for glass powders. Validation of the proposed modified test was through round robin testing and comparison to the ISO standard where appropriate. The proposed test uses a fixed mass-per-solution-volume ratio and an agitated solution. The round robin study showed differences in hydroxyapatite nucleation on glasses of different compositions and between glasses of the same composition but different particle sizes. The results were reproducible between research facilities. Researchers should use this method when testing new glasses, or their variants, to enable comparison within the literature in the future.

  3. Engraving Print Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoelck, Daniel; Barbe, Joaquim

    2008-04-15

    A print is a mark, or drawing, made in or upon a plate, stone, woodblock or other material, which is covered with ink and then pressed, usually onto paper, reproducing the image on the paper. Engraving prints are usually images composed of groups of binary lines, especially those made with relief and intaglio techniques. By varying the number and orientation of the lines, the drawing of the engraving print is formed. For this reason we propose an application based on image processing methods to classify engraving prints.

  4. Entropy Viscosity and L1-based Approximations of PDEs: Exploiting Sparsity

    DTIC Science & Technology

    2015-10-23

    Report AFRL-AFOSR-VA-TR-2015-0337: Entropy Viscosity and L1-based Approximations of PDEs: Exploiting Sparsity. Jean-Luc Guermond, Texas A&M University. Final report dated 09-05-2015, covering 01-07-2012 to 30-06-2015. Abstract (fragment): ...conservation equations can be stabilized by using the so-called entropy viscosity method, and we proposed to investigate this new technique...

  5. Modelling of capillary-driven flow for closed paper-based microfluidic channels

    NASA Astrophysics Data System (ADS)

    Songok, Joel; Toivakka, Martti

    2017-06-01

    Paper-based microfluidics is an emerging field focused on creating inexpensive devices with simple fabrication methods for applications in various fields, including healthcare, environmental monitoring and veterinary medicine. Understanding the flow of liquid is important for achieving consistent operation of the devices. This paper proposes capillary models to predict flow in paper-based microfluidic channels that include a flow-accelerating hydrophobic top cover. The models, which consider both non-absorbing and absorbing substrates, are in good agreement with the experimental results.
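
    For reference, the classical Lucas-Washburn relation gives capillary penetration length versus time in an open capillary; the paper's closed-channel model with a hydrophobic cover adds effects not captured by this sketch, and the parameter values are hypothetical:

        import numpy as np

        def washburn_length(t, gamma=0.072, r=50e-6, theta_deg=30.0, mu=1e-3):
            """L(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu)), SI units."""
            return np.sqrt(gamma * r * np.cos(np.radians(theta_deg)) * t / (2 * mu))

        print(washburn_length(np.array([1.0, 10.0])))  # metres after 1 s and 10 s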

  6. Acquisition of Expert/Non-Expert Vocabulary from Reformulations.

    PubMed

    Antoine, Edwige; Grabar, Natalia

    2017-01-01

    Technical medical terms are difficult for non-experts to understand correctly. A vocabulary associating technical terms with layman expressions can help increase the readability and understanding of technical texts. The purpose of our work is to build this kind of vocabulary. We propose to exploit the notion of reformulation following two methods: extraction of abbreviations and of reformulations with specific markers. The segments associated by these methods are aligned with medical terminologies. Our results cover over 9,000 medical terms, with extraction precision between 0.24 and 0.98. The results are analyzed and compared with existing work.

  7. All-dielectric three-dimensional broadband Eaton lens with large refractive index range

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yin, Ming; Yong Tian, Xiao, E-mail: leoxyt@mail.xjtu.edu.cn; Ling Wu, Ling

    2014-03-03

    We propose a method to realize three-dimensional (3D) gradient index (GRIN) devices requiring a large refractive index (RI) range with broadband performance. By combining a non-resonant GRIN woodpile photonic crystal structure in the metamaterial regime with a compound liquid medium, a wide RI range (1–6.32) was achieved flexibly. As a proof-of-principle for the low-loss and non-dispersive method, a 3D Eaton lens was designed and fabricated based on a 3D printing process. Full-wave simulation and experiment validated its omnidirectional wave-bending effects over a broad bandwidth covering the Ku band (12 GHz–18 GHz).

  8. Generalizing the flash technique in the front-face configuration to measure the thermal diffusivity of semitransparent solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pech-May, Nelson Wilbur; Department of Applied Physics, CINVESTAV Unidad Mérida, carretera Antigua a Progreso km6, A.P. 73 Cordemex, Mérida Yucatán 97310, México; Mendioroz, Arantza

    2014-10-15

    In this work, we have extended the front-face flash method to retrieve simultaneously the thermal diffusivity and the optical absorption coefficient of semitransparent plates. A complete theoretical model that allows calculating the front surface temperature rise of the sample has been developed. It takes into consideration additional effects, such as multiple reflections of the heating light beam inside the sample, heat losses by convection and radiation, transparency of the sample to infrared wavelengths, and heating pulse duration. Measurements performed on calibrated solids, covering a wide range of absorption coefficients (from transparent to opaque) and thermal diffusivities, validate the proposed method.

  9. 78 FR 43887 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... manufacturers and applicable group purchasing organizations (GPOs) of covered drugs, devices, biologicals, or..., applicable manufacturers of covered drugs, devices, biologicals, and medical supplies are required to submit...

  10. A Spiral-Based Downscaling Method for Generating 30 m Time Series Image Data

    NASA Astrophysics Data System (ADS)

    Liu, B.; Chen, J.; Xing, H.; Wu, H.; Zhang, J.

    2017-09-01

    The spatial detail and updating frequency of land cover data are important factors influencing land surface dynamic monitoring applications at high spatial resolution. However, the fragmented patches and seasonal variability of some land cover types (e.g., small crop fields, wetlands) make the generation of land cover data labor-intensive and difficult. Utilizing high spatial resolution multi-temporal image data is a possible solution. Unfortunately, the spatial and temporal resolution of available remote sensing data such as the Landsat or MODIS datasets can hardly satisfy the minimum mapping unit and frequency of current land cover mapping/updating at the same time. The generation of high resolution time series may be a compromise to cover the shortage in the land cover updating process. One popular way is to downscale multi-temporal MODIS data with other high spatial resolution auxiliary data such as Landsat. But the usual manner of downscaling pixels based on a window may lead to an underdetermined problem in heterogeneous areas, resulting in uncertainty for some high spatial resolution pixels. Therefore, the downscaled multi-temporal data can hardly reach the high spatial resolution of Landsat data. A spiral-based method is introduced to downscale low spatial and high temporal resolution image data to high spatial and high temporal resolution image data. By searching for similar pixels in the adjacent region along a spiral, the pixel set is built up pixel by pixel. The underdetermined problem is prevented to a large extent when solving the linear system with the constructed pixel set. With the help of ordinary least squares, the method inverts the endmember values of the linear system. The high spatial resolution image is reconstructed on the basis of the high spatial resolution class map and the endmember values, band by band. Then, the high spatial resolution time series is formed from these high spatial resolution images, image by image. A simulated experiment and a remote sensing image downscaling experiment were conducted. In the simulated experiment, the 30-meter class map dataset GlobeLand30 was adopted to investigate the effect of avoiding the underdetermined problem in the downscaling procedure, and a comparison between the spiral and the window was conducted. Further, MODIS NDVI and Landsat image data were adopted to generate 30 m time series NDVI in the remote sensing image downscaling experiment. The simulated experiment results showed that the proposed method performs robustly in downscaling pixels in heterogeneous regions and indicated that it is superior to the traditional window-based methods. The high resolution time series generated may benefit the mapping and updating of land cover data.
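
    A minimal sketch of the per-class endmember inversion by ordinary least squares, assuming each coarse pixel mixes the unknown fine-scale class endmembers in proportion to its class fractions; the fractions and values below are synthetic:

        import numpy as np

        fractions = np.array([[0.7, 0.3],       # class fractions inside each
                              [0.4, 0.6],       # coarse pixel of the spiral-
                              [0.2, 0.8],       # built pixel set (hypothetical)
                              [0.9, 0.1]])
        true_endmembers = np.array([0.8, 0.2])  # per-class reflectance
        coarse_values = fractions @ true_endmembers

        est, *_ = np.linalg.lstsq(fractions, coarse_values, rcond=None)
        print(est)  # ~[0.8, 0.2]: recovered endmember values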

  11. Floor Sensing System Using Laser Reflectivity for Localizing Everyday Objects and Robot

    PubMed Central

    Pyo, Yoonseok; Hasegawa, Tsutomu; Tsuji, Tokuo; Kurazume, Ryo; Morooka, Ken'ichi

    2014-01-01

    This paper describes a new method for measuring the positions of everyday objects and a robot on the floor using the distance and reflectance acquired by a laser range finder (LRF). The information obtained by this method is important for a service robot working in a human daily-life environment. Our method uses only one LRF together with a mirror installed on the wall. Moreover, since the sensing area is limited to an LRF scanning plane parallel to the floor and just a few centimeters above it, the scanning covers the whole room with minimal invasion of a resident's privacy, and the occlusion problem is mitigated by the mirror. We use the reflection intensity and position information obtained from the target surface. Although reflection values alone cannot identify all objects, unknown objects become easier to identify once easily identifiable objects have been eliminated by their reflectance. In addition, we propose a method for measuring the robot's pose using a tag carrying an encoded reflection pattern that is optically identified by the LRF. Our experimental results validate the effectiveness of the proposed method. PMID:24763253

  12. 76 FR 13981 - Proposed Information Collection; Comment Request; 2012 Economic Census Covering the Construction...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... essential information for government, business and the general public. The 2012 Economic Census covering the... Economic Census Covering the Construction Sector AGENCY: U.S. Census Bureau. ACTION: Notice. SUMMARY: The... provider of timely, relevant and quality data about the people and economy of the United States. Economic...

  13. 76 FR 13978 - Proposed Information Collection; Comment Request; 2012 Economic Census Covering the Manufacturing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... essential information for government, business and the general public. The 2012 Economic Census covering the... Economic Census Covering the Manufacturing Sector AGENCY: U.S. Census Bureau. ACTION: Notice. SUMMARY: The... provider of timely, relevant and quality data about the people and economy of the United States. Economic...

  14. Evaluating ecoregions for sampling and mapping land-cover patterns

    Treesearch

    Kurt H. Riitters; James D. Wickham; Timothy G. Wade

    2006-01-01

    Ecoregional stratification has been proposed for sampling and mapping land-cover composition and pattern over time. Using a wall-to-wall land-cover map of the United States, we evaluated geographic scales of variance for nine landscape-level and eight forest pattern indices, and compared stratification by ecoregions, administrative units, and watersheds. Ecoregions...

  15. A practical approach for linearity assessment of calibration curves under the International Union of Pure and Applied Chemistry (IUPAC) guidelines for an in-house validation of method of analysis.

    PubMed

    Sanagi, M Marsin; Nasir, Zalilah; Ling, Susie Lu; Hermawan, Dadan; Ibrahim, Wan Aini Wan; Naim, Ahmedy Abu

    2010-01-01

    Linearity assessment as required in method validation has always been subject to different interpretations and definitions by various guidelines and protocols. However, there are very limited applicable implementation procedures that can be followed by a laboratory chemist in assessing linearity. Thus, this work proposes a simple method for linearity assessment in method validation by a regression analysis that covers experimental design, estimation of the parameters, outlier treatment, and evaluation of the assumptions according to the International Union of Pure and Applied Chemistry guidelines. The suitability of this procedure was demonstrated by its application to an in-house validation for the determination of plasticizers in plastic food packaging by GC.
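
    As an illustration of the kind of regression-based workflow the abstract outlines, the sketch below covers parameter estimation, a simple outlier screen on studentized residuals, and a replicate-based lack-of-fit F-test for the linearity assumption. The data and the |t| > 3 screening rule are generic textbook choices, not necessarily those of the paper.

        import numpy as np
        from scipy import stats

        # Hypothetical calibration data: triplicate responses at five levels.
        x = np.repeat([0.5, 1.0, 2.0, 4.0, 8.0], 3)
        y = 2.1 * x + 0.05 + np.random.default_rng(1).normal(0, 0.05, x.size)

        # 1) Parameter estimation by ordinary least squares.
        res = stats.linregress(x, y)
        print(f"slope={res.slope:.4f}  intercept={res.intercept:.4f}  r={res.rvalue:.5f}")

        # 2) Outlier screening on studentized residuals (|t| > 3 flagged).
        resid = y - (res.intercept + res.slope * x)
        stud = resid / resid.std(ddof=2)
        print("suspected outliers at points:", np.where(np.abs(stud) > 3)[0])

        # 3) Lack-of-fit F-test using the replicates: compares the scatter of
        #    the level means about the line with the pure (within-replicate)
        #    error.
        groups = [y[x == level] for level in np.unique(x)]
        pure_err = sum(((g - g.mean()) ** 2).sum() for g in groups)
        lof = (resid ** 2).sum() - pure_err
        m, n = len(groups), x.size
        F = (lof / (m - 2)) / (pure_err / (n - m))
        p = stats.f.sf(F, m - 2, n - m)
        print(f"lack-of-fit F={F:.2f}, p={p:.3f}  (p > 0.05 supports linearity)")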

  16. Efficient method of image edge detection based on FSVM

    NASA Astrophysics Data System (ADS)

    Cai, Aiping; Xiong, Xiaomei

    2013-07-01

    For efficient detection of object cover edges in digital images, this paper studies traditional methods and an algorithm based on SVM. Analysis shows that the Canny edge detection algorithm suffers from pseudo-edges and poor noise robustness. To provide a reliable edge extraction method, a new detection algorithm based on FSVM is proposed. It contains several steps: first, the classifier is trained on labeled samples, with a different membership function assigned to different samples. Then, a new training sample set is formed by increasing the penalty on misclassified sub-samples, and the new FSVM classification model is trained and tested on it. Finally, the edges of the object image are extracted using the model. Experimental results show that good edge detection images are obtained, and added-noise experiments show that the method is robust to noise.
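
    A fuzzy SVM is not available off the shelf in scikit-learn, but per-sample weights emulate the membership idea, and doubling the weight of misclassified samples stands in for the abstract's increased penalty on wrong sub-samples. A sketch of one plausible reading of the training loop, on hypothetical per-pixel features:

        import numpy as np
        from sklearn.svm import SVC

        # Hypothetical training set: per-pixel feature vectors (e.g. gradient
        # magnitude, direction coherence, neighbourhood variance) and labels.
        rng = np.random.default_rng(2)
        X = rng.normal(size=(500, 3))
        y = (X[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(int)

        def memberships(X, y):
            """Fuzzy memberships: samples far from their class centroid get a
            lower weight, a common way to emulate an FSVM with a standard
            SVM solver."""
            w = np.empty(len(y))
            for label in np.unique(y):
                idx = y == label
                d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
                w[idx] = 1.0 - d / (d.max() + 1e-9)      # weights in (0, 1]
            return w

        w = memberships(X, y)
        clf = SVC(kernel="rbf", C=10.0).fit(X, y, sample_weight=w)

        # Second pass, as in the abstract: raise the penalty on misclassified
        # samples and retrain.
        w[clf.predict(X) != y] *= 2.0
        clf = SVC(kernel="rbf", C=10.0).fit(X, y, sample_weight=w)
        print("training accuracy:", (clf.predict(X) == y).mean())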

  17. A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery.

    PubMed

    Huang, Huasheng; Deng, Jizhong; Lan, Yubin; Yang, Aqing; Deng, Xiaoling; Zhang, Lei

    2018-01-01

    Appropriate Site Specific Weed Management (SSWM) is crucial to ensure crop yields. Within SSWM of large-scale areas, remote sensing is a key technology for providing accurate weed distribution information. Compared with satellite and piloted-aircraft remote sensing, an unmanned aerial vehicle (UAV) is capable of capturing high spatial resolution imagery, which provides more detailed information for weed mapping. The objective of this paper is to generate an accurate weed cover map based on UAV imagery. The UAV RGB imagery was collected in October 2017 over a rice field located in South China. A Fully Convolutional Network (FCN) method was proposed for weed mapping of the collected imagery. Transfer learning was used to improve generalization capability, and skip architecture was applied to increase prediction accuracy. The performance of the FCN architecture was then compared with a Patch_based CNN algorithm and a Pixel_based CNN method. Experimental results showed that our FCN method outperformed the others, both in terms of accuracy and efficiency. The overall accuracy of the FCN approach was up to 0.935 and the accuracy for weed recognition was 0.883, which means that this algorithm is capable of generating accurate weed cover maps for the evaluated UAV imagery.

  18. Surface-induced brightness temperature variations and their effects on detecting thin cirrus clouds using IR emission channels in the 8-12 micrometer region

    NASA Technical Reports Server (NTRS)

    Gao, Bo-Cai; Wiscombe, W. J.

    1993-01-01

    A method for detecting cirrus clouds in terms of brightness temperature differences between narrow bands at 8, 11, and 12 mu m has been proposed by Ackerman et al. (1990). In this method, the variation of emissivity with wavelength for different surface targets was not taken into consideration. Based on state-of-the-art laboratory measurements of reflectance spectra of terrestrial materials by Salisbury and D'Aria (1992), we have found that the brightness temperature differences between the 8 and 11 mu m bands for soils, rocks and minerals, and dry vegetation can vary between approximately -8 K and +8 K due solely to surface emissivity variations. We conclude that although the method of Ackerman et al. is useful for detecting cirrus clouds over areas covered by green vegetation, water, and ice, it is less effective for detecting cirrus clouds over areas covered by bare soils, rocks and minerals, and dry vegetation. In addition, we recommend that in future the variation of surface emissivity with wavelength should be taken into account in algorithms for retrieving surface temperatures and low-level atmospheric temperature and water vapor profiles.

  19. First tier modeling of consumer dermal exposure to substances in consumer articles under REACH: a quantitative evaluation of the ECETOC TRA for consumers tool.

    PubMed

    Delmaar, J E; Bokkers, B G H; ter Burg, W; van Engelen, J G M

    2013-02-01

    The demonstration of safe use of chemicals in consumer products, as required under REACH, is proposed to follow a tiered process. In the first tier, simple conservative methods and assumptions should be made to quickly verify whether risks are expected for a particular use. The ECETOC TRA Consumer Exposure Tool was developed to assist in first tier risk assessments for substances in consumer products. The ECETOC TRA is not a prioritization tool, but is meant as a first screening. Therefore, the exposure assessment needs to cover all products/articles in a specific category. For the assessment of dermal exposure to substances in articles, the ECETOC TRA uses the concept of a 'contact layer', a hypothetical layer that limits exposure to a substance contained in the product. For each product/article category, the ECETOC TRA proposes default values for the thickness of this contact layer. As relevant experimental exposure data are currently lacking, the default values are based on expert judgment alone. In this paper it is verified whether this concept meets the requirement of being a conservative exposure evaluation method. This is done by confronting the expert-judgment-based predictions of the ECETOC TRA with a mechanistic emission model based on the well-established theory of diffusion of substances in materials. Diffusion models have been applied and tested in many emission-modeling applications, and experimentally determined input data for a number of material and substance combinations are available. The estimated emissions provide information on the range of emissions that could occur in reality. First tier tools such as the ECETOC TRA are required to cover all products/articles in a category and to provide estimates that are at least as high as expected on the basis of current scientific knowledge. Since this was not the case, it is concluded that the ECETOC TRA does not provide a properly conservative estimation method for dermal exposure to articles. An alternative method was proposed. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. A method of initial orbit determination from three or more observations on a short arc. (Russian Title: Метод определения первоначальной орбиты по трем и более наблюдениям на короткой дуге)

    NASA Astrophysics Data System (ADS)

    Shefer, V. A.

    2010-12-01

    A new method is suggested for computing the initial orbit of a small celestial body from three or more pairs of its angular measurements at three times. The method is based on the approach that we previously developed for constructing an intermediate orbit from a minimal number of observations. This intermediate orbit allows for most of the perturbations in the motion of the body under study. The proposed method uses Herget's algorithmic scheme, which makes it possible to involve additional observations as well. The methodical error of orbit computation by the proposed method is two orders of magnitude smaller than the corresponding error of Herget's approach based on the construction of an unperturbed Keplerian orbit. The new method is especially efficient when applied to high-accuracy observational data covering short orbital arcs.

  1. A new method of preliminary orbit determination from three or more observations on a short arc. (Russian Title: Новый метод определения предварительной орбиты по трем и более наблюдениям на короткой дуге)

    NASA Astrophysics Data System (ADS)

    Shefer, V. A.

    2011-07-01

    A new method is suggested for finding the preliminary orbit of a small celestial body from three or more pairs of its angular measurements at three times. The method is based on the approach that we previously developed for constructing an intermediate orbit from a minimal number of observations. This intermediate orbit allows for most of the perturbations in the motion of the body under study. The proposed method uses Herget's algorithmic scheme, which makes it possible to involve additional observations as well. The methodical error of orbit computation by the proposed method is two orders of magnitude smaller than the corresponding error of the commonly used approach based on the construction of an unperturbed Keplerian orbit. The new method is especially efficient when applied to high-accuracy observational data covering short orbital arcs.

  2. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping.

    PubMed

    Sadeghi-Tehran, Pouria; Virlet, Nicolas; Sabermanesh, Kasra; Hawkesford, Malcolm J

    2017-01-01

    Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when they are applied to images acquired in dynamic field environments. In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions against the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. The method described is capable of coping with the environmental challenges faced in field conditions, with a high level of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate for processing time series of phenotypic information acquired in the field throughout the crop growth. Moreover, the introduced method has the advantage that it is not limited to growth measurements but can be applied to other tasks such as identifying weeds, diseases, stress, etc.

  3. Asymmetric color image encryption based on singular value decomposition

    NASA Astrophysics Data System (ADS)

    Yao, Lili; Yuan, Caojin; Qiang, Junjie; Feng, Shaotong; Nie, Shouping

    2017-02-01

    A novel asymmetric color image encryption approach using singular value decomposition (SVD) is proposed. The original color image is encrypted into a ciphertext presented as an indexed image. The red, green and blue components of the color image are encoded into a complex function which is then separated into U, S and V parts by SVD. The data matrix of the ciphertext is obtained by multiplying the orthogonal matrices U and V while implementing phase truncation. The diagonal entries of the three diagonal matrices from the SVD are extracted and scrambled together to construct the colormap of the ciphertext. Thus, the encrypted indexed image occupies less space than the original image. For decryption, the original color image cannot be recovered without the private keys, which are obtained from the phase truncation and the orthogonality of V. Computer simulations are presented to evaluate the performance of the proposed algorithm, and we also analyze the security of the proposed system.

  4. Wideband Motion Control by Position and Acceleration Input Based Disturbance Observer

    NASA Astrophysics Data System (ADS)

    Irie, Kouhei; Katsura, Seiichiro; Ohishi, Kiyoshi

    The disturbance observer can observe and suppress the disturbance torque within its bandwidth. Motion systems are becoming widespread in society and are required to have the ability to make contact with unknown environments. Such haptic motion requires a much wider bandwidth. However, since the conventional disturbance observer obtains the acceleration response from the second-order derivative of the position response, its bandwidth is limited by derivative noise. This paper proposes a novel structure for a disturbance observer. The proposed disturbance observer uses an acceleration sensor to enlarge the bandwidth. Generally, the bandwidth of an acceleration sensor spans from 1 Hz to more than 1 kHz. To cover the DC range, the conventional position-sensor-based disturbance observer is integrated. Thus, the performance of the proposed Position and Acceleration input based Disturbance Observer (PADO) is superior to the conventional one. The PADO is applied to position control (infinite stiffness) and force control (zero stiffness). The numerical and experimental results show the viability of the proposed method.
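
    The core idea, a DC-accurate position-derived estimate fused with a wide-band accelerometer, can be illustrated with a first-order complementary filter pair. This is a sketch of the fusion principle rather than the authors' observer structure; the sampling rate and crossover frequency are assumptions.

        import numpy as np
        from scipy.signal import butter, lfilter

        fs = 10_000.0                  # sampling rate (Hz), hypothetical
        f_c = 1.0                      # crossover (Hz): below it we trust the
                                       # position sensor, above it the
                                       # accelerometer (whose band starts ~1 Hz)

        b_lp, a_lp = butter(1, f_c / (fs / 2), btype="low")
        b_hp, a_hp = butter(1, f_c / (fs / 2), btype="high")

        def fused_acceleration(position, accel_meas, dt=1.0 / fs):
            # Acceleration reconstructed from position (second difference):
            # accurate at DC but noisy at high frequency, which is exactly why
            # the conventional disturbance observer bandwidth is limited.
            acc_from_pos = np.gradient(np.gradient(position, dt), dt)
            # Complementary fusion: position-based estimate below f_c,
            # accelerometer above f_c (the analog prototypes sum to unity).
            return (lfilter(b_lp, a_lp, acc_from_pos)
                    + lfilter(b_hp, a_hp, accel_meas))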

  5. On the use of the energy probability distribution zeros in the study of phase transitions

    NASA Astrophysics Data System (ADS)

    Mól, L. A. S.; Rodrigues, R. G. M.; Stancioli, R. A.; Rocha, J. C. S.; Costa, B. V.

    2018-04-01

    This contribution is devoted to covering some technical aspects of the use of the recently proposed energy probability distribution zeros in the study of phase transitions. This method is based on partial knowledge of the partition function zeros and has been shown to be extremely efficient at precisely locating phase transition temperatures. It relies on an iterative procedure by which the transition temperature can be approached at will. The iterative method is detailed, and some convergence issues that have been observed in its application to the 2D Ising model and to an artificial spin ice model are shown, together with ways to circumvent them.
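
    A schematic sketch of the zeros method as summarized above, assuming the energy probability distribution is histogrammed on a uniform energy grid, turned into a polynomial whose dominant zero is tracked, and reweighted between iterations; the update rule and stopping details are illustrative, not the authors' exact algorithm.

        import numpy as np

        def dominant_zero(hist):
            """Zeros of the EPD polynomial P(x) = sum_j h_j x**j, where h_j is
            the energy histogram on a uniform grid E_j = E_0 + j*dE and
            x = exp(-dE * dbeta). The zero closest to x = 1 carries the
            transition information."""
            zeros = np.roots(hist[::-1].astype(float))  # highest power first
            return zeros[np.argmin(np.abs(zeros - 1.0))]

        def refine_beta(sample_energies, beta0, dE, n_iter=10):
            """Iteratively locate the pseudo-transition inverse temperature:
            reweight the histogram measured at beta0 to the current beta
            estimate, find the dominant zero, convert it back to a temperature
            shift, and repeat. (Energies are assumed shifted so that the
            reweighting stays numerically tame.)"""
            E = np.asarray(sample_energies)
            bins = np.arange(E.min(), E.max() + dE, dE)
            beta = beta0
            for _ in range(n_iter):
                w = np.exp(-(beta - beta0) * E)          # histogram reweighting
                hist, _ = np.histogram(E, bins=bins, weights=w)
                x_c = dominant_zero(hist)
                beta += -np.log(np.abs(x_c)) / dE        # invert x = exp(-dE*dbeta)
                if np.isclose(np.abs(x_c), 1.0, atol=1e-6):
                    break                                 # zero pinched the real axis
            return beta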

  6. Innovative research methods for studying treatments for rare diseases: methodological review.

    PubMed

    Gagne, Joshua J; Thompson, Lauren; O'Keefe, Kelly; Kesselheim, Aaron S

    2014-11-24

    To examine methods for generating evidence on health outcomes in patients with rare diseases. Methodological review of existing literature. PubMed, Embase, and Academic Search Premier were searched for articles describing innovative approaches to randomized trial design and analysis methods and methods for conducting observational research in patients with rare diseases. We assessed information related to the proposed methods, the specific rare disease being studied, and outcomes from the application of the methods. We summarize methods with respect to their advantages in studying health outcomes in rare diseases and provide examples of their application. We identified 46 articles that proposed or described methods for studying patient health outcomes in rare diseases. Articles covered a wide range of rare diseases and most (72%) were published in 2008 or later. We identified 16 research strategies for studying rare diseases. Innovative clinical trial methods minimize sample size requirements (n=4) and maximize the proportion of patients who receive active treatment (n=2), strategies crucial to studying small populations of patients with limited treatment choices. No studies describing unique methods for conducting observational studies in patients with rare diseases were identified. Though numerous studies apply unique clinical trial designs and considerations to assess patient health outcomes in rare diseases, less attention has been paid to innovative methods for studying rare diseases using observational data. © Gagne et al 2014.

  7. Performance analysis of mineral mapping method to delineate mineralization zones under tropical region

    NASA Astrophysics Data System (ADS)

    Wakila, M. H.; Saepuloh, A.; Heriawan, M. N.; Susanto, A.

    2016-09-01

    Geothermal exploration and production are currently being intensively conducted in certain areas of Indonesia, such as the Wayang Windu Geothermal Field (WWGF) in West Java. The WWGF covers a wide area of about 40 km2, and an accurate method to map the distribution of heterogeneous minerals is necessary for areas this large. Mineral mapping is an important method in geothermal exploration for determining the distribution of minerals that indicate the surface manifestations of a geothermal system. This study aims to determine the most precise and accurate method for mineral mapping in a geothermal field. Field measurements were performed to assess the accuracy of three proposed methods: 1) Minimum Noise Fraction (MNF), a linear transformation used to remove the correlation among spectral bands and to reduce noise in the data; 2) Pixel Purity Index (PPI), a method designed to find the most spectrally extreme pixels and their characteristics due to endmember mixing; 3) Spectral Angle Mapper (SAM), an image classification technique that measures the spectral similarity between an unknown object and a reference spectrum in n dimensions. The output of these methods is the occurrence and distribution of minerals. The performance of each mapping method was analyzed against ground truth data. Among the three proposed methods, the SAM classification is the most appropriate and accurate for mapping the spatial distribution of alteration minerals.
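
    Of the three methods, SAM is the simplest to state precisely: each pixel is assigned to the reference spectrum subtending the smallest spectral angle. A self-contained sketch (the image cube and the mineral spectral library are placeholders):

        import numpy as np

        def spectral_angle_mapper(image, references):
            """Classify each pixel of an (H, W, B) image cube by the smallest
            spectral angle to a set of (K, B) reference spectra."""
            pix = image.reshape(-1, image.shape[-1]).astype(float)
            pix /= np.linalg.norm(pix, axis=1, keepdims=True) + 1e-12
            ref = references / (np.linalg.norm(references, axis=1,
                                               keepdims=True) + 1e-12)
            # Angle between each pixel and each reference spectrum: (H*W, K).
            angles = np.arccos(np.clip(pix @ ref.T, -1.0, 1.0))
            return angles.argmin(axis=1).reshape(image.shape[:2])

        # Usage: class_map = spectral_angle_mapper(cube, mineral_library)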

  8. Towards Online Multiresolution Community Detection in Large-Scale Networks

    PubMed Central

    Huang, Jianbin; Sun, Heli; Liu, Yaguang; Song, Qinbao; Weninger, Tim

    2011-01-01

    The investigation of community structure in networks has aroused great interest in multiple disciplines. One of the challenges is to find local communities from a starting vertex in a network without global information about the entire network. Many existing methods are accurate only under a priori assumptions about network properties and rely on predefined parameters. In this paper, we introduce a new quality function for local communities and present a fast local expansion algorithm for uncovering communities in large-scale networks. The proposed algorithm can detect multiresolution communities from a source vertex, or communities covering the whole network. Experimental results show that the proposed algorithm is efficient and well-behaved in both real-world and synthetic networks. PMID:21887325

  9. A new real-time method for investigation of affinity properties and binding kinetics of magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Orlov, Alexey V.; Nikitin, Maxim P.; Bragina, Vera A.; Znoyko, Sergey L.; Zaikina, Marina N.; Ksenevich, Tatiana I.; Gorshkov, Boris G.; Nikitin, Petr I.

    2015-04-01

    A method for the quantitative investigation of the affinity constants of receptors immobilized on magnetic nanoparticles (MP) is developed based on spectral correlation interferometry (SCI). SCI records, with picometer resolution, the thickness changes of a layer of molecules or nanoparticles caused by a biochemical reaction on a cover slip, averaged over the sensing area. The method is compatible with other types of sensing surfaces employed in biosensing. The measured kinetic association constants of magnetic nanoparticles are four orders of magnitude higher than those of molecular antibody-antigen association. The developed method also enables highly sensitive detection of antigens over a wide dynamic range. A limit of detection of 92 pg/ml has been demonstrated for prostate-specific antigen (PSA) with 50-nm MP employed as labels, which provide a three-order-of-magnitude amplification of the SCI signals. The calibration curve features a high sensitivity (slope) of a 3-fold signal rise per 10-fold increase of PSA concentration within a 4-order dynamic range, an attractive compromise between precise quantitation and high sensitivity in immunoassay. The proposed biosensing technique uses inexpensive disposable sensor chips made of cover slips and represents an economically sound alternative to traditional immunoassays for disease diagnostics, detection of pathogens in food, and environmental monitoring.

  10. A novel edge based embedding in medical images based on unique key generated using sudoku puzzle design.

    PubMed

    Santhi, B; Dheeptha, B

    2016-01-01

    The field of telemedicine has gained immense momentum, owing to the need to transmit patients' information securely. This paper puts forth a unique method for embedding data in medical images based on edge-based embedding and XOR coding. The algorithm proposes a novel key generation technique that utilizes the design of a sudoku puzzle to enhance the security of the transmitted message. Only the edge blocks of the cover image are utilized to embed the payloads. The least significant bits of the pixel values are changed by XOR coding depending on the data to be embedded and the generated key. Hence the distortion in the stego image is minimized and the information is retrieved accurately. Data are embedded in the RGB planes of the cover image, thus increasing its embedding capacity. Several measures, including peak signal-to-noise ratio (PSNR), mean square error (MSE), universal image quality index (UIQI) and correlation coefficient (R), are used to analyze the quality of the stego image. It is evident from the results that the proposed technique outperforms former methodologies.
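
    A toy sketch of the embedding rule described above: the LSB of an edge-block pixel is set so that its XOR with the key bit equals the payload bit. The sudoku-based key generation itself is not reproduced here; key_bits simply stands in for its output, and all the values are hypothetical.

        import numpy as np

        def embed_bit(pixel_value, data_bit, key_bit):
            """Set the LSB so that LSB XOR key_bit == data_bit; the receiver,
            holding the same sudoku-derived key bit, recovers the data bit as
            LSB(stego) XOR key_bit."""
            target_lsb = data_bit ^ key_bit
            return (pixel_value & 0xFE) | target_lsb

        def extract_bit(stego_value, key_bit):
            return (stego_value & 1) ^ key_bit

        # Round trip on one channel of an "edge block":
        block = np.array([[120, 121], [119, 200]], dtype=np.uint8)
        key_bits = np.array([[1, 0], [1, 1]], dtype=np.uint8)  # stand-in key
        payload = np.array([[0, 1], [1, 0]], dtype=np.uint8)
        stego = embed_bit(block, payload, key_bits)
        assert np.array_equal(extract_bit(stego, key_bits), payload)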

  11. A Prototype of Reflection Pulse Oximeter Designed for Mobile Healthcare.

    PubMed

    Lu, Zhiyuan; Chen, Xiang; Dong, Zhongfei; Zhao, Zhangyan; Zhang, Xu

    2016-09-01

    This paper introduces a pulse oximeter prototype designed for mobile healthcare. In this prototype, a reflection pulse oximeter is embedded into the back cover of a smart handheld device to offer convenient measurement of both heart rate (HR) and SpO2 (estimation of arterial oxygen saturation) for home or mobile applications. Novel and miniaturized circuit modules including a chopper network and a filtering amplifier were designed to overcome the influence of ambient light and interferences caused by embedding the sensor into a flat cover. A method based on adaptive trough detection for improved HR and SpO2 estimation is proposed with appropriate simplification for its implementation on mobile devices. A fast and effective photoplethysmogram validation scheme is also proposed. Clinical experiments have been carried out to calibrate and test our oximeter. Our prototype oximeter can achieve performance comparable to a clinical oximeter, with no significant difference revealed by paired t-tests (p = 0.182 for SpO2 measurement and p = 0.496 for HR measurement). The design of this pulse oximeter will facilitate fast and convenient measurement of SpO2 for mobile healthcare.

  12. A cloud cover model based on satellite data

    NASA Technical Reports Server (NTRS)

    Somerville, P. N.; Bean, S. J.

    1980-01-01

    A model for worldwide cloud cover using a satellite data set containing infrared radiation measurements is proposed. The satellite data set containing day IR, night IR and incoming and absorbed solar radiation measurements on a 2.5 degree latitude-longitude grid covering a 45 month period was converted to estimates of cloud cover. The global area was then classified into homogeneous cloud cover regions for each of the four seasons. It is noted that the developed maps can be of use to the practicing climatologist who can obtain a considerable amount of cloud cover information without recourse to large volumes of data.

  13. Multi-Scale Compositionality: Identifying the Compositional Structures of Social Dynamics Using Deep Learning

    PubMed Central

    Peng, Huan-Kai; Marculescu, Radu

    2015-01-01

    Objective Social media exhibit rich yet distinct temporal dynamics which cover a wide range of different scales. In order to study this complex dynamics, two fundamental questions revolve around (1) the signatures of social dynamics at different time scales, and (2) the way in which these signatures interact and form higher-level meanings. Method In this paper, we propose the Recursive Convolutional Bayesian Model (RCBM) to address both of these fundamental questions. The key idea behind our approach consists of constructing a deep-learning framework using specialized convolution operators that are designed to exploit the inherent heterogeneity of social dynamics. RCBM’s runtime and convergence properties are guaranteed by formal analyses. Results Experimental results show that the proposed method outperforms the state-of-the-art approaches both in terms of solution quality and computational efficiency. Indeed, by applying the proposed method on two social network datasets, Twitter and Yelp, we are able to identify the compositional structures that can accurately characterize the complex social dynamics from these two social media. We further show that identifying these patterns can enable new applications such as anomaly detection and improved social dynamics forecasting. Finally, our analysis offers new insights on understanding and engineering social media dynamics, with direct applications to opinion spreading and online content promotion. PMID:25830775

  14. Segmentation of Polarimetric SAR Images Using Wavelet Transformation and Texture Features

    NASA Astrophysics Data System (ADS)

    Rezaeian, A.; Homayouni, S.; Safari, A.

    2015-12-01

    Polarimetric Synthetic Aperture Radar (PolSAR) sensors collect useful observations of the earth's surface and its phenomena for various remote sensing applications, such as land cover mapping, change detection and target detection. These data can be acquired without the limitations of weather conditions, sun illumination and dust particles. As a result, SAR images, and in particular PolSAR images, are powerful tools for various environmental applications. Unlike optical images, SAR images suffer from unavoidable speckle, which makes their segmentation difficult. In this paper, we use the wavelet transformation for the segmentation of PolSAR images. The proposed method is based on the multi-resolution analysis of texture features derived from the wavelet transformation, using both gray-level and texture information. First, we produce coherency or covariance matrices and then generate the span image from them. The next step extracts texture features from the sub-bands generated by the discrete wavelet transform (DWT). Finally, the PolSAR image is segmented using clustering methods such as fuzzy c-means (FCM) and k-means. We have applied the proposed methodology to full polarimetric SAR images acquired by the Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) L-band system in July 2012 over an agricultural area in Winnipeg, Canada.
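
    A minimal sketch of the pipeline, with a one-level Haar DWT, local sub-band energy as the texture measure, and k-means in place of FCM; the window size, the log-compression of the span image and the feature normalization are assumptions rather than the authors' choices.

        import numpy as np
        import pywt
        from scipy.ndimage import uniform_filter
        from sklearn.cluster import KMeans

        def wavelet_texture_features(span, window=7):
            """Stack the span image with local-energy texture measures computed
            on its one-level Haar DWT sub-bands (a simple stand-in for the
            paper's multi-resolution texture set)."""
            cA, (cH, cV, cD) = pywt.dwt2(span, "haar")
            feats = [span]
            for band in (cH, cV, cD):
                energy = uniform_filter(band ** 2, size=window)   # local energy
                up = np.kron(energy, np.ones((2, 2)))             # back to full size
                feats.append(up[: span.shape[0], : span.shape[1]])
            return np.stack(feats, axis=-1)

        def segment(span, n_classes=4):
            F = wavelet_texture_features(np.log1p(span))  # log to tame speckle
            X = F.reshape(-1, F.shape[-1])
            X = (X - X.mean(0)) / (X.std(0) + 1e-12)
            labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(X)
            return labels.reshape(span.shape)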

  15. Filter bank canonical correlation analysis for implementing a high-speed SSVEP-based brain-computer interface

    NASA Astrophysics Data System (ADS)

    Chen, Xiaogang; Wang, Yijun; Gao, Shangkai; Jung, Tzyy-Ping; Gao, Xiaorong

    2015-08-01

    Objective. Recently, canonical correlation analysis (CCA) has been widely used in steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) due to its high efficiency, robustness, and simple implementation. However, a method with which to make use of harmonic SSVEP components to enhance the CCA-based frequency detection has not been well established. Approach. This study proposed a filter bank canonical correlation analysis (FBCCA) method to incorporate fundamental and harmonic frequency components to improve the detection of SSVEPs. A 40-target BCI speller based on frequency coding (frequency range: 8-15.8 Hz, frequency interval: 0.2 Hz) was used for performance evaluation. To optimize the filter bank design, three methods (M1: sub-bands with equally spaced bandwidths; M2: sub-bands corresponding to individual harmonic frequency bands; M3: sub-bands covering multiple harmonic frequency bands) were proposed for comparison. Classification accuracy and information transfer rate (ITR) of the three FBCCA methods and the standard CCA method were estimated using an offline dataset from 12 subjects. Furthermore, an online BCI speller adopting the optimal FBCCA method was tested with a group of 10 subjects. Main results. The FBCCA methods significantly outperformed the standard CCA method. The method M3 achieved the highest classification performance. At a spelling rate of ˜33.3 characters/min, the online BCI speller obtained an average ITR of 151.18 ± 20.34 bits min-1. Significance. By incorporating the fundamental and harmonic SSVEP components in target identification, the proposed FBCCA method significantly improves the performance of the SSVEP-based BCI, and thereby facilitates its practical applications such as high-speed spelling.
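
    A compact sketch of an FBCCA scorer in the spirit of method M3, with each sub-band spanning multiple harmonic bands and the commonly used weights w_m = m^(-a) + b; the sampling rate, filter order and band edges are assumptions. The predicted target is the stimulus frequency maximizing this score.

        import numpy as np
        from scipy.signal import cheby1, filtfilt
        from sklearn.cross_decomposition import CCA

        fs = 250.0                               # EEG sampling rate (Hz), assumed

        def reference_signals(freq, n_samples, n_harmonics=5):
            t = np.arange(n_samples) / fs
            Y = [f(2 * np.pi * freq * h * t)
                 for h in range(1, n_harmonics + 1) for f in (np.sin, np.cos)]
            return np.column_stack(Y)

        def fbcca_score(eeg, freq, n_bands=5, a=1.25, b=0.25):
            """eeg: (n_samples, n_channels). Sub-band m covers [8*m, 88] Hz so
            that each band spans multiple harmonic bands; sub-band CCA
            correlations are combined with the weights w_m = m**(-a) + b."""
            Y = reference_signals(freq, eeg.shape[0])
            score = 0.0
            for m in range(1, n_bands + 1):
                lo, hi = 8.0 * m, 88.0
                bb, ab = cheby1(4, 1, [lo / (fs / 2), hi / (fs / 2)],
                                btype="bandpass")
                Xm = filtfilt(bb, ab, eeg, axis=0)
                cca = CCA(n_components=1).fit(Xm, Y)
                u, v = cca.transform(Xm, Y)
                rho = np.corrcoef(u[:, 0], v[:, 0])[0, 1]
                score += (m ** -a + b) * rho ** 2
            return score

        # Target = argmax over stimulus frequencies f of fbcca_score(eeg, f).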

  16. 7 CFR 3406.18 - Content of a research proposal.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 15 2011-01-01 2011-01-01 false Content of a research proposal. 3406.18 Section 3406... AND AGRICULTURE 1890 INSTITUTION CAPACITY BUILDING GRANTS PROGRAM Preparation of a Research Proposal § 3406.18 Content of a research proposal. (a) Proposal cover page. (1) Form CSREES-712, “Higher Education...

  17. 7 CFR 3406.18 - Content of a research proposal.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 15 2012-01-01 2012-01-01 false Content of a research proposal. 3406.18 Section 3406... AND AGRICULTURE 1890 INSTITUTION CAPACITY BUILDING GRANTS PROGRAM Preparation of a Research Proposal § 3406.18 Content of a research proposal. (a) Proposal cover page. (1) Form NIFA-712, “Higher Education...

  18. 7 CFR 3406.18 - Content of a research proposal.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 15 2014-01-01 2014-01-01 false Content of a research proposal. 3406.18 Section 3406... AND AGRICULTURE 1890 INSTITUTION CAPACITY BUILDING GRANTS PROGRAM Preparation of a Research Proposal § 3406.18 Content of a research proposal. (a) Proposal cover page. (1) Form NIFA-712, “Higher Education...

  19. 7 CFR 3406.18 - Content of a research proposal.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 15 2013-01-01 2013-01-01 false Content of a research proposal. 3406.18 Section 3406... AND AGRICULTURE 1890 INSTITUTION CAPACITY BUILDING GRANTS PROGRAM Preparation of a Research Proposal § 3406.18 Content of a research proposal. (a) Proposal cover page. (1) Form NIFA-712, “Higher Education...

  20. Using synthetic polymers to reduce soil erosion after forest fires in Mediterranean soils

    NASA Astrophysics Data System (ADS)

    Lado, Marcos; Ben-Hur, Meni; Inbar, Assaf

    2010-05-01

    Forest fires are a major environmental problem in the Mediterranean region because they result in a loss of vegetation cover, changes in biodiversity, increases in greenhouse gas emissions and a potential increase in runoff and soil erosion. The large increases in runoff and sediment yield after high-severity fires have been attributed to several factors, among them: an increase in soil water repellency; soil sealing by detached particles and by ash particles; and the loss of a surface cover. The presence of a surface cover increases infiltration and decreases runoff and erosion by several mechanisms, which include rainfall interception, plant evapotranspiration, preservation of soil structure through increased soil organic matter, and increased surface roughness. The loss of vegetation cover as a result of fire leaves the soil surface exposed to the direct impact of raindrops, and therefore the susceptibility of the soil to runoff generation and soil loss increases. In this work, we propose a new method to protect soils against post-fire erosion based on the application of synthetic polymers to the soil. Laboratory rainfall simulations and field runoff plots were used to analyze the suitability of applying synthetic polymers to reduce soil erosion and stabilize soil structure in Mediterranean soils. The combination of these two processes will potentially favor a faster recovery of the vegetation structure. This method has been successfully applied in arable land; however, it has not been tested in burnt forests. The outcome of this study may provide important managerial tools for forest management following fires.

  1. Prediction of protein-protein interactions from amino acid sequences with ensemble extreme learning machines and principal component analysis

    PubMed Central

    2013-01-01

    Background Protein-protein interactions (PPIs) play crucial roles in the execution of various cellular processes and form the basis of biological mechanisms. Although a large amount of PPI data for different species has been generated by high-throughput experimental techniques, the PPI pairs obtained with experimental methods cover only a fraction of the complete PPI networks; furthermore, the experimental methods for identifying PPIs are both time-consuming and expensive. Hence, it is urgent and challenging to develop automated computational methods to efficiently and accurately predict PPIs. Results We present here a novel hierarchical PCA-EELM (principal component analysis-ensemble extreme learning machine) model to predict protein-protein interactions using only the information of protein sequences. In the proposed method, 11188 protein pairs retrieved from the DIP database were encoded into feature vectors by using four kinds of protein sequence information. Focusing on dimension reduction, an effective feature extraction method, PCA, was then employed to construct the most discriminative new feature set. Finally, multiple extreme learning machines were trained and aggregated into a consensus classifier by majority voting. The ensembling of extreme learning machines removes the dependence of the results on the initial random weights and improves the prediction performance. Conclusions When applied to the PPI data of Saccharomyces cerevisiae, the proposed method achieved 87.00% prediction accuracy with 86.15% sensitivity at a precision of 87.59%. Extensive experiments were performed to compare our method with the state-of-the-art Support Vector Machine (SVM) technique. Experimental results demonstrate that the proposed PCA-EELM outperforms the SVM method under 5-fold cross-validation. Besides, PCA-EELM runs faster than the PCA-SVM based method. Consequently, the proposed approach can be considered a new, promising and powerful tool for predicting PPIs with excellent performance in less time. PMID:23815620
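
    An extreme learning machine is simple enough to sketch directly: a random hidden layer followed by a least-squares readout, here ensembled by majority vote after PCA. This illustrates the architecture named in the abstract, not the authors' implementation; the hidden-layer size, component count and ensemble size are assumptions.

        import numpy as np
        from sklearn.decomposition import PCA

        class ELM:
            """Minimal extreme learning machine: random hidden layer plus a
            least-squares readout (a sketch, not the authors' code)."""
            def __init__(self, n_hidden=200, rng=None):
                self.n_hidden = n_hidden
                self.rng = np.random.default_rng(rng)
            def fit(self, X, y):
                self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
                self.b = self.rng.normal(size=self.n_hidden)
                H = np.tanh(X @ self.W + self.b)
                self.beta = np.linalg.pinv(H) @ y    # Moore-Penrose solution
                return self
            def predict(self, X):
                return (np.tanh(X @ self.W + self.b) @ self.beta > 0.5).astype(int)

        def pca_eelm(X_train, y_train, X_test, n_components=100, n_models=9):
            """PCA for the discriminative feature set, then a majority vote over
            independently initialized ELMs, which removes the dependence on the
            random hidden weights."""
            pca = PCA(n_components=min(n_components, X_train.shape[1])).fit(X_train)
            Xtr, Xte = pca.transform(X_train), pca.transform(X_test)
            votes = np.stack([ELM(rng=k).fit(Xtr, y_train).predict(Xte)
                              for k in range(n_models)])
            return (votes.mean(axis=0) > 0.5).astype(int)   # majority vote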

  2. A matched filter approach for blind joint detection of galaxy clusters in X-ray and SZ surveys

    NASA Astrophysics Data System (ADS)

    Tarrío, P.; Melin, J.-B.; Arnaud, M.

    2018-06-01

    The combination of X-ray and Sunyaev-Zeldovich (SZ) observations can potentially improve the cluster detection efficiency, when compared to using only one of these probes, since both probe the same medium, the hot ionized gas of the intra-cluster medium. We present a method based on matched multifrequency filters (MMF) for detecting galaxy clusters from SZ and X-ray surveys. This method builds on a previously proposed joint X-ray-SZ extraction method and allows the blind detection of clusters, that is, finding new clusters without knowing their position, size, or redshift, by searching on SZ and X-ray maps simultaneously. The proposed method is tested using data from the ROSAT all-sky survey and from the Planck survey. The evaluation is done by comparison with existing cluster catalogues in the area of the sky covered by the deep SPT survey. Thanks to the addition of the X-ray information, the joint detection method is able to achieve simultaneously better purity, better detection efficiency, and better position accuracy than its predecessor Planck MMF, which is based on SZ maps alone. For a purity of 85%, the X-ray-SZ method detects 141 confirmed clusters in the SPT region; to detect the same number of confirmed clusters with Planck MMF, we would need to decrease its purity to 70%. We provide a catalogue of 225 sources selected by the proposed method in the SPT footprint, with masses ranging between 0.7 and 14.5 × 10^14 M⊙ and redshifts between 0.01 and 1.2.

  3. A cluster merging method for time series microarray with production values.

    PubMed

    Chira, Camelia; Sedano, Javier; Camara, Monica; Prieto, Carlos; Villar, Jose R; Corchado, Emilio

    2014-09-01

    A challenging task in time-course microarray data analysis is to cluster genes meaningfully by combining the information provided by multiple replicates covering the same key time points. This paper proposes a novel cluster merging method to accomplish this goal, obtaining groups of highly correlated genes. The main idea behind the proposed method is to generate a clustering starting from groups created on the basis of the individual temporal series (representing different biological replicates measured at the same time points) and to merge them by taking into account the frequency with which two genes are assembled together in each clustering. The gene groups at the level of individual time series are generated using several shape-based clustering methods. This study focuses on a real-world time series microarray task with the aim of finding co-expressed genes related to the production and growth of a certain bacterium. The shape-based clustering methods used at the level of individual time series rely on identifying similar gene expression patterns over time which, in some models, are further matched to the pattern of production/growth. The proposed cluster merging method is able to produce meaningful gene groups which can be naturally ranked by the level of agreement on the clustering among individual time series. The list of clusters and genes is further sorted based on the information correlation coefficient and new problem-specific relevance measures. Computational experiments and the results of the cluster merging method are analyzed from a biological perspective and compared with the clustering generated from the mean values of the time series using the same shape-based algorithm.

  4. Optimization of Self-Directed Target Coverage in Wireless Multimedia Sensor Network

    PubMed Central

    Yang, Yang; Wang, Yufei; Pi, Dechang; Wang, Ruchuan

    2014-01-01

    Video and image sensors in wireless multimedia sensor networks (WMSNs) have a directed view and a limited sensing angle, so methods that solve the target coverage problem for traditional sensor networks, which assume a circular sensing model, are not suitable for WMSNs. Based on the proposed FoV (field of view) sensing model and FoV disk model, how well a multimedia sensor is expected to cover a target is defined by the deflection angle between the target and the sensor's current orientation and by the distance between the target and the sensor. Target coverage optimization algorithms based on this expected coverage value are then presented for the single-sensor single-target, multi-sensor single-target, and single-sensor multi-target problems respectively. For the multi-sensor multi-target problem, which is NP-complete, the candidate orientations are those to which a sensor can rotate to cover each target falling in its FoV disk; a genetic algorithm applied to these candidates then yields an approximated minimum subset of sensors that covers all the targets in the network. Simulation results show the algorithm's performance and the effect of the number of targets on the resulting subset. PMID:25136667
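
    A sketch of how an expected coverage value could be scored from the two quantities the abstract names, the deflection angle and the sensor-target distance; the multiplicative weighting, the half-angle and the range are illustrative assumptions, not the paper's exact definitions.

        import numpy as np

        def expected_coverage(sensor_xy, orientation, target_xy,
                              fov_half_angle=np.radians(30), sensing_range=50.0):
            """Score how well a directional sensor at sensor_xy, pointing along
            'orientation' (rad), is expected to cover target_xy."""
            d = np.asarray(target_xy, float) - np.asarray(sensor_xy, float)
            dist = np.hypot(*d)
            if dist > sensing_range:
                return 0.0                       # outside the FoV disk
            bearing = np.arctan2(d[1], d[0])
            deflection = np.abs((bearing - orientation + np.pi)
                                % (2 * np.pi) - np.pi)
            if deflection <= fov_half_angle:
                angle_score = 1.0                # already inside the FoV
            else:                                # degrade with extra rotation
                angle_score = max(0.0, 1.0 - (deflection - fov_half_angle) / np.pi)
            dist_score = 1.0 - dist / sensing_range
            return angle_score * dist_score

        # Rotating a sensor to the candidate orientation that maximizes the sum
        # of expected_coverage over targets in its FoV disk mirrors the
        # single-sensor multi-target step described above.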

  5. Methane fluxes during the cold season: distribution and mass transfer in the snow cover of bogs

    NASA Astrophysics Data System (ADS)

    Smagin, A. V.; Shnyrev, N. A.

    2015-08-01

    The fluxes and profile distribution of methane in the snow cover have been studied over different landscape elements of an oligotrophic West-Siberian bog (Mukhrino Research Station, Khanty-Mansiisk autonomous district) during a cold season. Simple models have been proposed for the description of methane distribution in the inert snow layer, combining transport of the gas with a source of constant intensity at the soil surface. The formation rates of stationary methane profiles in the snow cover have been estimated (characteristic time of 24 h). Theoretical equations have been derived for calculating small emission fluxes from bogs to the atmosphere on the basis of the stationary profile distribution parameters, the snow porosity, and the effective methane diffusion coefficient in the snow layer. The calculated values of methane emission significantly exceeded, by factors of 2-3 up to several tens, the values measured under field conditions by the closed chamber method (0.008-0.25 mg C/(m2 h)), which indicates that the contribution of the cold period to the annual emission cycle of bog methane may be underestimated.

  6. Near-field radiative heat transfer between graphene-covered hyperbolic metamaterials

    NASA Astrophysics Data System (ADS)

    Hong, Xiao-Juan; Li, Jian-Wen; Wang, Tong-Biao; Zhang, De-Jian; Liu, Wen-Xing; Liao, Qing-Hua; Yu, Tian-Bao; Liu, Nian-Hua

    2018-04-01

    We propose the use of graphene-covered silicon carbide (SiC) nanowire arrays (NWAs) for theoretical studies of near-field radiative heat transfer. The SiC NWAs exhibit a hyperbolic characteristic at an appropriately selected filling-volume fraction. The surface plasmon supported by graphene and the hyperbolic modes supported by SiC NWAs significantly affect radiative heat transfer. The heat-transfer coefficient (HTC) between the proposed structures is larger than that between SiC NWAs. We also find that the chemical potential of graphene plays an important role in modulating the HTC. The tunability of chemical potential through gate voltage enables flexible control of heat transfer using the graphene-covered SiC NWAs.

  7. Effective scheme of photolysis of GFP in live cell as revealed with confocal fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Glazachev, Yu I.; Orlova, D. Y.; Řezníčková, P.; Bártová, E.

    2018-05-01

    We propose an effective kinetics scheme for the photolysis of green fluorescent protein (GFP) observed in live cells with a commercial confocal fluorescence microscope. We investigated the photolysis of a GFP-tagged heterochromatin protein, HP1β-GFP, in live nuclei with the pulse position modulation approach, which has several advantages over the classical pump-and-probe method. At the basis of the proposed scheme lies a process of photoswitching from the native fluorescence state to an intermediate fluorescence state, which has a lower fluorescence yield and recovers back to the native state in the dark. This kinetics scheme includes four effective parameters (the photoswitching rate constant, the reverse switching rate constant, the photodegradation rate constant, and the relative brightness of the intermediate state) and covers the time scale from tens of milliseconds to minutes of the experimental fluorescence kinetics. Additionally, the applicability of the scheme was demonstrated for continuous irradiation and for the classical pump-and-probe approach using numerical calculations and analytical solutions. An interesting finding of the experimental data analysis was that the overall photodegradation of GFP proceeds predominantly from the intermediate state and shows an approximately second-order dependence on irradiation power. As a practical example, the proposed scheme elucidates artifacts of the fluorescence recovery after photobleaching method and suggests ways to diminish them.

  8. Effective scheme of photolysis of GFP in live cell as revealed with confocal fluorescence microscopy.

    PubMed

    Glazachev, Yu I; Orlova, D Y; Řezníčková, P; Bártová, E

    2018-03-23

    We propose an effective kinetics scheme for the photolysis of green fluorescent protein (GFP) observed in live cells with a commercial confocal fluorescence microscope. We investigated the photolysis of a GFP-tagged heterochromatin protein, HP1β-GFP, in live nuclei with the pulse position modulation approach, which has several advantages over the classical pump-and-probe method. At the basis of the proposed scheme lies a process of photoswitching from the native fluorescence state to an intermediate fluorescence state, which has a lower fluorescence yield and recovers back to the native state in the dark. This kinetics scheme includes four effective parameters (the photoswitching rate constant, the reverse switching rate constant, the photodegradation rate constant, and the relative brightness of the intermediate state) and covers the time scale from tens of milliseconds to minutes of the experimental fluorescence kinetics. Additionally, the applicability of the scheme was demonstrated for continuous irradiation and for the classical pump-and-probe approach using numerical calculations and analytical solutions. An interesting finding of the experimental data analysis was that the overall photodegradation of GFP proceeds predominantly from the intermediate state and shows an approximately second-order dependence on irradiation power. As a practical example, the proposed scheme elucidates artifacts of the fluorescence recovery after photobleaching method and suggests ways to diminish them.

  9. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

    Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, subjective choices, such as the camera's relative exposure value (REV) and the threshold in the histogram, have hindered the computation of accurate gap fraction. Here we propose a novel method that enables accurate measurement of gap fraction during daytime under various sky conditions by DCP. The method computes gap fraction from a single unsaturated raw DCP image, corrected for scattering effects by canopies, together with a sky image reconstructed from the raw-format image. To test the sensitivity of the derived gap fraction to diverse REVs, solar zenith angles and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies at REVs from 0 to -5. The novel method showed little variation of gap fraction across REVs in both dense and sparse canopies over a diverse range of solar zenith angles. A perforated-panel experiment, used to test the accuracy of the estimated gap fraction, confirmed that the method yields accurate and consistent gap fractions across different hole sizes, gap fractions and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful for monitoring LAI precisely and for validating satellite remote sensing LAI products efficiently.

  10. Fault classification method for the driving safety of electrified vehicles

    NASA Astrophysics Data System (ADS)

    Wanner, Daniel; Drugge, Lars; Stensson Trigell, Annika

    2014-05-01

    A fault classification method is proposed which has been applied to an electric vehicle. Potential faults in the different subsystems that can affect the vehicle directional stability were collected in a failure mode and effect analysis. Similar driveline faults were grouped together if they resembled each other with respect to their influence on the vehicle dynamic behaviour. The faults were physically modelled in a simulation environment before they were induced in a detailed vehicle model under normal driving conditions. A special focus was placed on faults in the driveline of electric vehicles employing in-wheel motors of the permanent magnet type. Several failures caused by mechanical and other faults were analysed as well. The fault classification method consists of a controllability ranking developed according to the functional safety standard ISO 26262. The controllability of a fault was determined with three parameters covering the influence of the longitudinal, lateral and yaw motion of the vehicle. The simulation results were analysed and the faults were classified according to their controllability using the proposed method. It was shown that the controllability decreased specifically with increasing lateral acceleration and increasing speed. The results for the electric driveline faults show that this trend cannot be generalised for all the faults, as the controllability deteriorated for some faults during manoeuvres with low lateral acceleration and low speed. The proposed method is generic and can be applied to various other types of road vehicles and faults.

  11. Robust distortion correction of endoscope

    NASA Astrophysics Data System (ADS)

    Li, Wenjing; Nie, Sixiang; Soto-Thompson, Marcelo; Chen, Chao-I.; A-Rahim, Yousif I.

    2008-03-01

    Endoscopic images suffer from a fundamental spatial distortion due to the wide-angle design of the endoscope lens. This barrel-type distortion is an obstacle for subsequent Computer Aided Diagnosis (CAD) algorithms and should be corrected. Various methods and research models for barrel-type distortion correction have been proposed and studied. For industrial applications, a stable, robust method with high accuracy is required to calibrate the different types of endoscopes in an easy-to-use way. The correction area must be large enough to cover all the regions that physicians need to see. In this paper, we present our endoscope distortion correction procedure, which includes data acquisition, distortion center estimation, distortion coefficient calculation, and look-up table (LUT) generation. We investigate different polynomial models for modeling the distortion and propose a new one that provides correction results with better visual quality. The method has been verified with four types of colonoscopes. The correction procedure is currently being applied to human subject data, and the coefficients are being utilized in a subsequent 3D reconstruction project of the colon.
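
    A sketch of the LUT-generation step for a polynomial radial model, assuming the usual barrel form r_d = r_u * (1 + k1*r_u^2 + k2*r_u^4 + ...); the distortion center and the coefficients k would come from the calibration steps described above, and the function names are illustrative.

        import numpy as np

        def build_undistort_lut(shape, center, k, scale=1.0):
            """Backward-mapping LUT: for every pixel of the corrected image,
            store the source coordinates in the distorted frame. 'center' is
            the estimated distortion center; 'k' holds the fitted radial
            coefficients (k1, k2, ...)."""
            H, W = shape
            cy, cx = center
            yy, xx = np.mgrid[0:H, 0:W].astype(float)
            ru = np.hypot((xx - cx) / scale, (yy - cy) / scale)
            factor = np.ones_like(ru)
            for i, ki in enumerate(k, start=1):
                factor += ki * ru ** (2 * i)     # 1 + k1*ru^2 + k2*ru^4 + ...
            src_y = (yy - cy) * factor + cy
            src_x = (xx - cx) * factor + cx
            return src_y, src_x

        # Resampling the distorted frame at (src_y, src_x), e.g. with
        # scipy.ndimage.map_coordinates, yields the corrected image; the LUT is
        # computed once per endoscope, which keeps the run-time cost low.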

  12. Flight State Identification of a Self-Sensing Wing via an Improved Feature Selection Method and Machine Learning Approaches

    PubMed Central

    Chen, Xi; Wu, Qi; Ren, He; Chang, Fu-Kuo

    2018-01-01

    In this work, a data-driven approach for identifying the flight state of a self-sensing wing structure with an embedded multi-functional sensing network is proposed. The flight state is characterized by the structural vibration signals recorded from a series of wind tunnel experiments under varying angles of attack and airspeeds. A large feature pool is created by extracting potential features from the signals covering the time domain, the frequency domain as well as the information domain. Special emphasis is given to feature selection in which a novel filter method is developed based on the combination of a modified distance evaluation algorithm and a variance inflation factor. Machine learning algorithms are then employed to establish the mapping relationship from the feature space to the practical state space. Results from two case studies demonstrate the high identification accuracy and the effectiveness of the model complexity reduction via the proposed method, thus providing new perspectives of self-awareness towards the next generation of intelligent air vehicles. PMID:29710832

  13. Zepto-molar electrochemical detection of Brucella genome based on gold nanoribbons covered by gold nanoblooms

    NASA Astrophysics Data System (ADS)

    Rahi, Amid; Sattarahmady, Naghmeh; Heli, Hossein

    2015-12-01

    Gold nanoribbons covered by gold nanoblooms were sonoelectrodeposited on a polycrystalline gold surface at -1800 mV (vs. AgCl) with the assistance of ultrasound and the co-occurrence of the hydrogen evolution reaction. The nanostructure was utilized as a transducer to immobilize a Brucella-specific probe and to fabricate a genosensor, and the immobilization and hybridization processes were followed by electrochemical methods, using methylene blue as a redox marker. The proposed method was assayed for detection of the complementary sequence, sequences with base mismatches (one-, two- and three-base mismatches), and a non-complementary sequence. The fabricated genosensor was evaluated for the assay of the bacteria in cultured and human samples without polymerase chain reaction (PCR). The genosensor could detect the complementary sequence with a calibration sensitivity of 0.40 μA dm3 mol-1, a linear concentration range of 10 zmol dm-3 to 10 pmol dm-3, and a detection limit of 1.71 zmol dm-3.

  14. Potential value of systematic reviews of qualitative evidence in informing user-centered health and social care: findings from a descriptive overview.

    PubMed

    Dalton, Jane; Booth, Andrew; Noyes, Jane; Sowden, Amanda J

    2017-08-01

    Systematic reviews of quantitative evidence are well established in health and social care. Systematic reviews of qualitative evidence are increasingly available, but their volume, the topics covered, the methods used, and their reporting quality are largely unknown. We provide a descriptive overview of systematic reviews of qualitative evidence assessing health and social care interventions included on the Database of Abstracts of Reviews of Effects (DARE). We searched DARE for reviews published between January 1, 2009, and December 31, 2014. We extracted data on review content and methods, summarized narratively, and explored patterns over time. We identified 145 systematic reviews conducted worldwide (64 in the UK). Interventions varied but largely covered treatment or service delivery in community and hospital settings. There were no discernible patterns over time. Critical appraisal of primary studies was conducted routinely, but most reviews were poorly reported. Potential exists to use systematic reviews of qualitative evidence to drive forward user-centered health and social care. We identify where more research is needed and propose ways to improve review methodology and reporting.

  15. A Graph Theory Practice on Transformed Image: A Random Image Steganography

    PubMed Central

    Thanikaiselvan, V.; Arulmozhivarman, P.; Subashanthini, S.; Amirtharajan, Rengarajan

    2013-01-01

    The modern information age is enriched with advanced network communication but at the same time encounters endless security issues when dealing with secret and/or private information. The storage and transmission of secret information have become highly essential and have led to a deluge of research in this field. In this paper, an effort has been made to combine a graceful graph with the integer wavelet transform (IWT) to implement random image steganography for secure communication. The implementation begins with the conversion of the cover image into wavelet coefficients through the IWT and is followed by embedding the secret image in randomly selected coefficients through graph theory. Finally, the stego image is obtained by applying the inverse IWT. This method provides a maximum peak signal-to-noise ratio (PSNR) of 44 dB for 266646 bits. Thus, the proposed method gives high imperceptibility through a high PSNR value, high embedding capacity in the cover image due to the adaptive embedding scheme, and high robustness against blind attacks through the graph-theoretic random selection of coefficients. PMID:24453857
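
    The abstract does not reproduce the transform or the embedding rule; the sketch below uses the integer Haar (S-) transform as a representative IWT and a seeded random walk as a stand-in for the graceful-graph ordering of coefficients, both assumptions:

      import numpy as np

      def haar_iwt_rows(img):
          # One lifting step along rows: integer-valued and exactly invertible.
          a = img[:, 0::2].astype(np.int64)
          b = img[:, 1::2].astype(np.int64)
          d = a - b                 # detail coefficients
          s = b + (d >> 1)          # approximation coefficients
          return s, d

      def haar_iwt_rows_inverse(s, d):
          b = s - (d >> 1)
          a = b + d
          out = np.empty((s.shape[0], 2 * s.shape[1]), dtype=np.int64)
          out[:, 0::2], out[:, 1::2] = a, b
          return out

      def embed_bits(detail, bits, seed=42):
          # Hide one bit in the LSB of each selected detail coefficient; the
          # seeded random choice replaces the paper's graph-theoretic walk.
          flat = detail.ravel().copy()
          rng = np.random.default_rng(seed)
          idx = rng.choice(flat.size, size=len(bits), replace=False)
          flat[idx] = (flat[idx] & ~1) | np.asarray(bits, dtype=np.int64)
          return flat.reshape(detail.shape)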

  16. Wheelchair pushrim kinetics measurement: A method to cancel inaccuracies due to pushrim weight and wheel camber.

    PubMed

    Chénier, Félix; Aissaoui, Rachid; Gauthier, Cindy; Gagnon, Dany H

    2017-02-01

    The commercially available SmartWheel™ is widely used in research and increasingly used in clinical practice to measure the forces and moments applied to the wheelchair pushrims by the user. However, in some situations (i.e., cambered wheels or increased pushrim weight), the recorded kinetics may include dynamic offsets that affect the accuracy of the measurements. In this work, an automatic method to identify and cancel these offsets is proposed and tested. First, the method was tested on an experimental bench with different cambers and pushrim weights. Then, the method was generalized to wheelchair propulsion. Nine experienced wheelchair users propelled their own wheelchairs instrumented with two SmartWheels with anti-slip pushrim covers. The dynamic offsets were correctly identified from the propulsion acquisition alone, without needing a separate baseline acquisition. A kinetic analysis was performed with and without dynamic offset cancellation using the proposed method. The most altered kinetic variables during propulsion were the vertical and total forces, with errors of up to 9 N (p<0.001, large effect size of 5). The method is simple to implement, fully automatic, and requires no further acquisitions; therefore, we advise using it systematically to enhance the accuracy of existing and future kinetic measurements.

  17. Comparison of rangeland vegetation sampling techniques in the Central Grasslands

    USGS Publications Warehouse

    Stohlgren, T.J.; Bull, K.A.; Otsuki, Yuka

    1998-01-01

    Maintaining native plant diversity, detecting exotic species, and monitoring rare species are becoming important objectives in rangeland conservation. Four rangeland vegetation sampling techniques were compared to see how well they captured local plant diversity. The methods tested included the commonly used Parker transects, Daubenmire transects as modified by the USDA Forest Service, a new transect and 'large quadrat' design proposed by the USDA Agricultural Research Service, and the Modified-Whittaker multi-scale vegetation plot. The 4 methods were superimposed in shortgrass steppe, mixed grass prairie, northern mixed prairie, and tallgrass prairie in the Central Grasslands of the United States, with 4 replicates in each prairie type. Analysis of variance tests showed significant method effects and prairie type effects, but no significant method x type interactions, for total species richness, the number of native species, the number of species with less than 1% cover, and the time required for sampling. The methods behaved similarly in each prairie type under a wide variety of grazing regimens. The Parker, large quadrat, and Daubenmire transects significantly underestimated the total species richness and the number of native species in each prairie type, and the number of species with less than 1% cover in all but the tallgrass prairie type. The transect techniques also consistently missed half the exotic species, including noxious weeds, in each prairie type. The Modified-Whittaker method, which included an exhaustive search for plant species in a 20 x 50 m plot, served as the baseline for species richness comparisons. For all prairie types, the Modified-Whittaker plot captured an average of 42 (± 2.4; 1 S.E.) plant species per site, compared to 15.9 (± 1.3), 18.9 (± 1.2), and 22.8 (± 1.6) plant species per site using the Parker, large quadrat, and Daubenmire transect methods, respectively. The 4 methods captured most of the dominant species at each site and thus produced similar results for total foliar cover and soil cover. The detection and measurement of exotic plant species were greatly enhanced by using ten 1 m2 subplots in a multi-scale sampling design and searching a larger area (1,000 m2) at each site. Even with 4 replicate sites, the transect methods usually captured, and thus would monitor, only 36 to 66% of the plant species at each site. To evaluate the status and trends of common, rare, and exotic plant species at local, regional, and national scales, innovative multi-scale methods must replace the commonly used transect methods of the past.

  18. MODIS Snow Cover Recovery Using Variational Interpolation

    NASA Astrophysics Data System (ADS)

    Tran, H.; Nguyen, P.; Hsu, K. L.; Sorooshian, S.

    2017-12-01

    Cloud obscuration is one of the major problems that limit the usage of satellite images in general, and of NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) global Snow-Covered Area (SCA) products in particular. Among the approaches to resolve the problem, the Variational Interpolation (VI) algorithm proposed by Xia et al. (2012) obtains cloud-free dynamic SCA images from MODIS. This method is automatic and robust; however, computational inefficiency is a main drawback that limits applying the method at larger spatial and temporal scales. To overcome this difficulty, this study introduces an improved version of the original VI. The modified VI algorithm integrates the MINimum RESidual (MINRES) iteration (Paige and Saunders, 1975) to prevent the system from breaking down when applied to much broader scales. An experiment demonstrated the crash-proof ability of the new algorithm in comparison with the original VI method, an ability obtained by maintaining the distribution of the weight set after solving the linear system. The new VI algorithm was then applied to the whole Contiguous United States (CONUS) over four winter months of 2016 and 2017 and validated using the snow station network (SNOTEL). The resulting cloud-free images capture the dynamic changes of snow with high accuracy, in contrast with the raw MODIS snow cover maps. Lastly, the algorithm was applied to create a cloud-free image dataset from March 10, 2000 to February 28, 2017, which provides an overview of snow trends over the CONUS for nearly two decades. Acknowledgments: We would like to acknowledge NASA, the NOAA Office of Hydrologic Development (OHD) National Weather Service (NWS), the Cooperative Institute for Climate and Satellites (CICS), the Army Research Office (ARO), ICIWaRM, and UNESCO for supporting this research.
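
    As an illustration of the MINRES ingredient, a minimal SciPy sketch on a synthetic symmetric system standing in for the variational interpolation equations (the matrix here is a toy, not the paper's operator):

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import minres

      n = 2000
      # Toy symmetric positive-definite tridiagonal system A w = b; in the
      # VI setting A would collect the smoothness/interpolation constraints
      # and b the cloud-free snow observations.
      A = diags([-np.ones(n - 1), 4.0 * np.ones(n), -np.ones(n - 1)],
                offsets=[-1, 0, 1], format='csr')
      b = np.random.default_rng(0).random(n)

      w, info = minres(A, b)   # info == 0 signals convergence
      assert info == 0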

  19. Model-based registration for assessment of spinal deformities in idiopathic scoliosis

    NASA Astrophysics Data System (ADS)

    Forsberg, Daniel; Lundström, Claes; Andersson, Mats; Knutsson, Hans

    2014-01-01

    Detailed analysis of spinal deformity is important within orthopaedic healthcare, in particular for assessment of idiopathic scoliosis. This paper addresses this challenge by proposing an image analysis method, capable of providing a full three-dimensional spine characterization. The proposed method is based on the registration of a highly detailed spine model to image data from computed tomography. The registration process provides an accurate segmentation of each individual vertebra and the ability to derive various measures describing the spinal deformity. The derived measures are estimated from landmarks attached to the spine model and transferred to the patient data according to the registration result. Evaluation of the method provides an average point-to-surface error of 0.9 mm ± 0.9 (comparing segmentations), and an average target registration error of 2.3 mm ± 1.7 (comparing landmarks). Comparing automatic and manual measurements of axial vertebral rotation provides a mean absolute difference of 2.5° ± 1.8, which is on a par with other computerized methods for assessing axial vertebral rotation. A significant advantage of our method, compared to other computerized methods for rotational measurements, is that it does not rely on vertebral symmetry for computing the rotational measures. The proposed method is fully automatic and computationally efficient, only requiring three to four minutes to process an entire image volume covering vertebrae L5 to T1. Given the use of landmarks, the method can be readily adapted to estimate other measures describing a spinal deformity by changing the set of employed landmarks. In addition, the method has the potential to be utilized for accurate segmentations of the vertebrae in routine computed tomography examinations, given the relatively low point-to-surface error.

  20. Evaluation of the robustness of the preprocessing technique improving reversible compressibility of CT images: Tested on various CT examinations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Chang Ho; Kim, Bohyoung; Gu, Bon Seung

    2013-10-15

    Purpose: To modify the previously proposed preprocessing technique improving the compressibility of computed tomography (CT) images so that it covers the diversity of three-dimensional configurations of different body parts, and to evaluate the robustness of the technique in terms of segmentation correctness and increase in reversible compression ratio (CR) for various CT examinations. Methods: This study had institutional review board approval with waiver of informed patient consent. A preprocessing technique was previously proposed to improve the compressibility of CT images by replacing pixel values outside the body region with a constant value, thereby maximizing data redundancy. Since the technique was developed aiming at only chest CT images, the authors modified the segmentation method to cover the diversity of three-dimensional configurations of different body parts. The modified version was evaluated as follows. In 368 randomly selected CT examinations (352 787 images), each image was preprocessed using the modified preprocessing technique. Radiologists visually confirmed whether the segmented region covers the body region or not. The images with and without the preprocessing were reversibly compressed using Joint Photographic Experts Group (JPEG), JPEG2000 two-dimensional (2D), and JPEG2000 three-dimensional (3D) compressions. The percentage increase in CR per examination (CR_I) was measured. Results: The rate of correct segmentation was 100.0% (95% CI: 99.9%, 100.0%) for all the examinations. The medians of CR_I were 26.1% (95% CI: 24.9%, 27.1%), 40.2% (38.5%, 41.1%), and 34.5% (32.7%, 36.2%) in JPEG, JPEG2000 2D, and JPEG2000 3D, respectively. Conclusions: In various CT examinations, the modified preprocessing technique can increase the CR by 25% or more without concern about degradation of diagnostic information.
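
    A minimal sketch of the core preprocessing idea (constant fill outside a segmented body region); the threshold and fill values are illustrative, not the paper's tuned parameters:

      import numpy as np
      from scipy import ndimage

      def preprocess_ct(slice_hu, air_threshold=-500, fill_value=-1000):
          # Crude air/body split in Hounsfield units, hole filling so the
          # lungs stay inside the body mask, largest-component selection.
          body = ndimage.binary_fill_holes(slice_hu > air_threshold)
          labels, n = ndimage.label(body)
          if n > 1:
              sizes = ndimage.sum(body, labels, range(1, n + 1))
              body = labels == (1 + int(np.argmax(sizes)))
          out = slice_hu.copy()
          out[~body] = fill_value   # constant background maximizes redundancy
          return out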

  1. Classification of Urban Feature from Unmanned Aerial Vehicle Images Using Gasvm Integration and Multi-Scale Segmentation

    NASA Astrophysics Data System (ADS)

    Modiri, M.; Salehabadi, A.; Mohebbi, M.; Hashemi, A. M.; Masumi, M.

    2015-12-01

    The use of UAVs in photogrammetry to obtain cover images and achieve the main objectives of photogrammetric mapping has boomed in recent years. Images taken over the REGGIOLO region in the province of Reggio-Emilia, Italy, by a UAV with a non-metric Canon Ixus camera at an average flight height of 139.42 m were used to classify urban features. Using the SURE software and the cover images of the study area, a dense point cloud, a DSM, and an orthophoto with a spatial resolution of 10 cm were produced. The DTM of the area was derived using an adaptive TIN filtering algorithm. The nDSM was computed as the difference between the DSM and DTM and added as a separate feature to the image stack. For feature extraction, the co-occurrence matrix features mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, and correlation were computed for each RGB band of the orthophoto. The classes used for the urban classification include buildings, trees and tall vegetation, grass and short vegetation, paved roads, and impervious surfaces; the impervious surfaces class comprises pavement, cement, cars, and roofs. Pixel-based classification and the selection of optimal classification features were carried out with GASVM. To achieve classification results with higher accuracy, spectral, textural, and conceptual shape features of the orthophoto were combined, and a multi-scale segmentation method was used. The results of the proposed urban feature classification suggest the suitability of this method for classifying urban scenes from UAV images. The overall accuracy and kappa coefficient of the method proposed in this study were 93.47% and 91.84%, respectively.

  2. Estimating Global Impervious Surface based on Social-economic Data and Satellite Observations

    NASA Astrophysics Data System (ADS)

    Zeng, Z.; Zhang, K.; Xue, X.; Hong, Y.

    2016-12-01

    Impervious surface areas around the globe are expanding and significantly altering the surface energy balance, the hydrological cycle, and ecosystem services. Many studies have underlined the importance of impervious surface, ranging from hydrological modeling to contaminant transport monitoring and urban development estimation. Therefore, accurate estimation of the global impervious surface is important for both the physical and social sciences. Given the limited coverage of high-spatial-resolution imagery and ground surveys, using satellite remote sensing and geospatial data to estimate global impervious areas is a practical approach. Based on previous work on area-weighted imperviousness for the north branch of the Chicago River provided by HDR, this study developed a method to determine the percentage of impervious surface using the latest global land cover categories from multi-source satellite observations, population density, and gross domestic product (GDP) data. Percent impervious surface at 30-meter resolution was mapped. We found that 1.33% of the CONUS (105,814 km2) and 0.475% of the global land surface (640,370 km2) are impervious surfaces. To test the utility and practicality of the proposed method, the National Land Cover Database (NLCD) 2011 percent developed imperviousness for the conterminous United States was used to evaluate our results. The average difference between the imperviousness derived by our method and the NLCD data across the CONUS is 1.14%, and the differences between our results and the NLCD data are within ±1% over 81.63% of the CONUS. The distribution of the global impervious surface map indicates that impervious surfaces are primarily concentrated in China, India, Japan, the USA, and Europe, which are highly populated and/or developed. This study proposes a straightforward way of mapping global imperviousness, which can provide useful information for hydrologic modeling and other applications.

  3. Assimilating MODIS-based albedo and snow cover fraction into the Common Land Model to improve snow depth simulation with direct insertion and deterministic ensemble Kalman filter methods

    NASA Astrophysics Data System (ADS)

    Xu, Jianhui; Shu, Hong

    2014-09-01

    This study assesses the analysis performance of assimilating the Moderate Resolution Imaging Spectroradiometer (MODIS)-based albedo and snow cover fraction (SCF) separately or jointly into the physically based Common Land Model (CoLM). A direct insertion method (DI) is proposed to assimilate the black and white-sky albedos into the CoLM. The MODIS-based albedo is calculated with the MODIS bidirectional reflectance distribution function (BRDF) model parameters product (MCD43B1) and the solar zenith angle as estimated in the CoLM for each time step. Meanwhile, the MODIS SCF (MOD10A1) is assimilated into the CoLM using the deterministic ensemble Kalman filter (DEnKF) method. A new DEnKF-albedo assimilation scheme for integrating the DI and DEnKF assimilation schemes is proposed. Our assimilation results are validated against in situ snow depth observations from November 2008 to March 2009 at five sites in the Altay region of China. The experimental results show that all three data assimilation schemes can improve snow depth simulations. But overall, the DEnKF-albedo assimilation shows the best analysis performance as it significantly reduces the bias and root-mean-square error (RMSE) during the snow accumulation and ablation periods at all sites except for the Fuyun site. The SCF assimilation via DEnKF produces better results than the albedo assimilation via DI, implying that the albedo assimilation that indirectly updates the snow depth state variable is less efficient than the direct SCF assimilation. For the Fuyun site, the DEnKF-albedo scheme tends to overestimate the snow depth accumulation with the maximum bias and RMSE values because of the large positive innovation (observation minus forecast).
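
    For reference, a compact sketch of a deterministic EnKF analysis step in the Sakov-Oke style (full gain on the mean, half gain on the anomalies); the paper's DEnKF configuration and observation operator may differ:

      import numpy as np

      def denkf_update(E, y, H, R):
          # E: (n_state, n_ens) forecast ensemble; y: observation vector;
          # H: linear observation operator; R: observation error covariance.
          n_ens = E.shape[1]
          x_mean = E.mean(axis=1, keepdims=True)
          A = E - x_mean                      # forecast anomalies
          HA = H @ A
          P_hh = HA @ HA.T / (n_ens - 1) + R
          K = (A @ HA.T / (n_ens - 1)) @ np.linalg.inv(P_hh)
          x_mean = x_mean + K @ (y.reshape(-1, 1) - H @ x_mean)
          A = A - 0.5 * K @ HA                # half-gain anomaly update
          return x_mean + A                   # analysis ensemble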

  4. Voltage stability analysis in the new deregulated environment

    NASA Astrophysics Data System (ADS)

    Zhu, Tong

    Nowadays, a significant portion of the power industry is under deregulation. Under this new circumstance, network security analysis is more critical and more difficult. One of the most important issues in network security analysis is voltage stability analysis. Due to the expected higher utilization of equipment induced by competition in a power market that covers bigger power systems, this issue has become increasingly acute after deregulation. In this dissertation, selected topics of voltage stability analysis are covered. In the first part, after a brief review of the general concepts of continuation power flow (CPF), investigations of various matrix analysis techniques to improve the speed of CPF calculation for large systems are reported. Based on these improvements, a new CPF algorithm is proposed. This new method is then tested on an inter-area transaction in a large interconnected power system. In the second part, the Arnoldi algorithm, the method of choice for finding a few minimum singular values of a large sparse matrix, is introduced into modal analysis for the first time. This new modal analysis is applied to the estimation of the point of voltage collapse and to contingency evaluation in voltage security assessment; simulations show that the new method is very efficient. In the third part, after transient voltage stability component models are investigated systematically, a novel system model for transient voltage stability analysis, a logical-algebraic-differential-difference equation (LADDE), is offered. As an example, the TCSC (thyristor-controlled series capacitor) is addressed as a transient voltage stabilizing controller. After a TCSC transient voltage stability model is outlined, a new TCSC controller is proposed to enhance both fault-related and load-increase-related transient voltage stability. Its effectiveness is demonstrated by simulation.
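
    As an illustration of extracting a few smallest singular values from a large sparse matrix, a SciPy sketch on a synthetic stand-in for a power-flow Jacobian (the ARPACK-based svds uses Arnoldi/Lanczos iterations under the hood; which='SM' can be slow on hard problems):

      import numpy as np
      from scipy.sparse import eye, random as sprandom
      from scipy.sparse.linalg import svds

      # Synthetic well-conditioned sparse matrix standing in for a Jacobian;
      # near voltage collapse the smallest singular values approach zero.
      J = sprandom(3000, 3000, density=1e-3, random_state=1) + 10 * eye(3000)
      sigma = svds(J.tocsc(), k=3, which='SM', return_singular_vectors=False)
      print(np.sort(sigma))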

  5. [Contrast-enhanced ultrasound for the characterization of incidental liver lesions - an economical evaluation in comparison with multi-phase computed tomography].

    PubMed

    Giesel, F L; Delorme, S; Sibbel, R; Kauczor, H-U; Krix, M

    2009-06-01

    The aim of the study was to conduct a cost-minimization analysis of contrast-enhanced ultrasound (CEUS) compared to multi-phase computed tomography (M-CT) as the diagnostic standard for diagnosing incidental liver lesions. Different scenarios of a cost-covering implementation of CEUS in the ambulant sector of the German statutory health insurance system were compared to the current cost situation. The absolute savings potential was estimated using different approaches for calculating the incidence of liver lesions that require further characterization. CEUS was the more cost-effective method in all scenarios in which CEUS examinations were performed at specialized centers (122.18-186.53 euro) compared to M-CT (223.19 euro). With about 40 000 relevant liver lesions per year, systematic implementation of CEUS would result in cost savings of about 4 million euro per year. However, the scenario of a cost-covering CEUS examination for all physicians who perform liver ultrasound would be the most cost-intensive approach (e.g., 407.87 euro at an average ultrasound machine utilization of 25% and a CEUS ratio of 5%). A cost-covering implementation of the CEUS method can result in cost savings for the German healthcare system; a centralized approach, as proposed by the DEGUM, should be targeted.

  6. EsPRit: ethics committee proposals for Long Term Medical Data Registries in rapidly evolving research fields - a future-proof best practice approach.

    PubMed

    Oberbichler, S; Hackl, W O; Hörbst, A

    2017-10-18

    Long-term data collection is a challenging task in the domain of medical research. Many effects in medicine require long periods of time to become traceable, e.g., the development of secondary malignancies following radiotherapeutic treatment of a primary disease. Nevertheless, long-term studies often suffer from an initial lack of available information, which prevents a standardized approach to their approval by the ethics committee. This is due to several factors, such as the lack of existing case report forms or an explorative research approach in which data elements may change over time. In connection with current medical research and the ongoing digitalization of medicine, Long Term Medical Data Registries (MDR-LT) have become an important means of collecting and analyzing study data. As with any clinical study, ethical aspects must be taken into account when setting up such registries. This work addresses the problem of creating a valid, high-quality ethics committee proposal for medical registries by suggesting groups of tasks (building blocks), information sources and appropriate methods for collecting and analyzing the information, as well as a process model for compiling an ethics committee proposal (EsPRit). To derive the building blocks and associated methods, software and requirements engineering approaches were utilized. Furthermore, a process-oriented approach was chosen, as the information required in the creation of an ethics committee proposal remains unknown at the beginning of planning an MDR-LT; here, the needed steps were derived from medical product certification, which itself follows a process-oriented approach rather than merely focusing on content. For validation and inspection of applicability, a proposal was created using the proposed building blocks. The proposed best practice was tested and refined within SEMPER (Secondary Malignoma - Prospective Evaluation of the Radiotherapeutics dose distribution as the cause for induction) as a case study. The proposed building blocks cover the topics of "Context Analysis", "Requirements Analysis", "Requirements Validation", "Electronic Case Report Form (eCRF) Design" and "Overall Concept Creation", with appropriate methods attached to each topic. The goals of each block can be met by applying those methods. The proposed methods are proven methods applied in, e.g., existing Medical Data Registry projects, as well as in software or requirements engineering. Several building blocks and attached methods could be identified for the creation of a generic ethics committee proposal. Hence, an ethics committee can make informed decisions on the suggested study via these blocks and the suggested methods, such as "Defining Clinical Questions" within the Context Analysis. The study creators have to confirm that they adhere to the proposed procedure within the ethics proposal statement. Existing Medical Data Registry projects can additionally be compared against EsPRit for conformity with the proposed procedure, which allows the identification of gaps that can lead to amendments requested by the ethics committee.

  7. Yellow River Icicle Hazard Dynamic Monitoring Using UAV Aerial Remote Sensing Technology

    NASA Astrophysics Data System (ADS)

    Wang, H. B.; Wang, G. H.; Tang, X. M.; Li, C. H.

    2014-02-01

    Monitoring the response of the Yellow River icicle hazard to change requires accurate and repeatable topographic surveys. A new method based on unmanned aerial vehicle (UAV) aerial remote sensing technology is proposed for real-time data processing in Yellow River icicle hazard dynamic monitoring. The monitoring area is located in the Yellow River ice intensive care area in southern BaoTou, Inner Mongolia autonomous region, and the monitoring period ran from 20 February to 30 March 2013. Using the proposed video data processing method, the automatic extraction of 1832 video key frames covering an area of 7.8 km2 took 34.786 seconds; stitching and correction took 122.34 seconds, and the accuracy was better than 0.5 m. Through comparison of the precisely processed stitched image sequences, the method determines changes in the Yellow River ice and accurately locates the ice bar, improving on the traditional visual method by more than 100 times. The results provide accurate decision-support information for the Yellow River ice prevention headquarters. Finally, the effect of the dam break was repeatedly monitored, and the ice break was measured with five-meter accuracy through accurate monitoring and evaluation analysis.

  8. A novel fuzzy logic-based image steganography method to ensure medical data security.

    PubMed

    Karakış, R; Güler, I; Çapraz, I; Bilir, E

    2015-12-01

    This study aims to secure medical data by combining them into one file format using steganographic methods. The electroencephalogram (EEG) is selected as the hidden data, and magnetic resonance (MR) images are used as the cover images. In addition to the EEG, the message is composed of the doctor's comments and patient information from the file header of the images. Two new image steganography methods based on fuzzy logic and similarity are proposed to select the non-sequential least significant bits (LSBs) of image pixels. The similarity values of the gray levels in the pixels are used to hide the message. The message is secured against attacks by using lossless compression and symmetric encryption algorithms. The quality of the stego images is measured by the mean square error (MSE), peak signal-to-noise ratio (PSNR), structural similarity measure (SSIM), universal quality index (UQI), and correlation coefficient (R). According to the obtained results, the proposed method ensures the confidentiality of the patient information and increases the data repository and transmission capacity of both the MR images and EEG signals.
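
    Of the listed quality metrics, MSE and PSNR are simple enough to sketch directly (8-bit images assumed):

      import numpy as np

      def mse_psnr(cover, stego, peak=255.0):
          # MSE over all pixels; PSNR = 10 * log10(peak^2 / MSE).
          diff = cover.astype(np.float64) - stego.astype(np.float64)
          mse = np.mean(diff ** 2)
          psnr = np.inf if mse == 0 else 10.0 * np.log10(peak * peak / mse)
          return mse, psnr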

  9. Economic tour package model using heuristic

    NASA Astrophysics Data System (ADS)

    Rahman, Syariza Abdul; Benjamin, Aida Mauziah; Bakar, Engku Muhammad Nazri Engku Abu

    2014-07-01

    A tour package is a prearranged tour that includes products and services such as food, activities, accommodation, and transportation, which are sold at a single price. Since competitiveness within the tourism industry is very high, many tour agents try to provide attractive tour packages in order to meet tourist satisfaction as much as possible. Some of the criteria considered by tourists are the number of places to be visited and the cost of the tour package. Previous studies indicate that tourists tend to choose economical tour packages and aim to visit as many places as they can. Thus, this study proposes a tour package model using a heuristic approach. The aim is to find economical tour packages and, at the same time, to propose as many places as possible to be visited by the tourist in a given geographical area, particularly on Langkawi Island. The proposed model considers only one starting point, where the tour starts and ends at an identified hotel. This study covers the 31 most attractive places on Langkawi Island from various categories of tourist attractions; in addition, periods for lunch and dinner are allocated in the proposed itineraries, covering 11 popular restaurants around Langkawi Island. In developing the itinerary, the proposed heuristic approach considers a time window for each site (hotel/restaurant/place) so that it represents a real-world implementation, as the sketch below illustrates. We present three itineraries with different time constraints (1-day, 2-day and 3-day tour packages). The aim of the economic model is to minimize the tour package cost as much as possible by considering the entrance fee of each visited place. We compare the proposed model with the uneconomic model from our previous study, which places no limit on cost and aims to maximize the number of places visited. Comparison between the uneconomic and economic itineraries shows that the proposed model successfully achieves the objective of minimizing the tour cost while covering the maximum number of places to be visited.
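
    A toy greedy time-window heuristic (not the authors' algorithm; all data structures hypothetical) to make the itinerary-building idea concrete:

      from dataclasses import dataclass

      @dataclass
      class Site:
          name: str
          fee: float         # entrance fee
          visit_time: float  # hours spent at the site
          open_t: float      # time window start, hours from tour start
          close_t: float     # time window end

      def greedy_itinerary(sites, travel, budget_hours, start=0):
          # Repeatedly append the cheapest site that can still be reached and
          # visited inside both its time window and the tour budget;
          # travel[i][j] is the travel time in hours between sites i and j.
          route, cost, t, cur = [], 0.0, 0.0, start
          remaining = set(range(len(sites))) - {start}
          while remaining:
              best = None
              for j in sorted(remaining, key=lambda j: sites[j].fee):
                  arrive = max(t + travel[cur][j], sites[j].open_t)
                  if arrive + sites[j].visit_time <= min(sites[j].close_t,
                                                         budget_hours):
                      best, best_arrive = j, arrive
                      break
              if best is None:
                  break
              t = best_arrive + sites[best].visit_time
              cost += sites[best].fee
              route.append(sites[best].name)
              remaining.remove(best)
              cur = best
          return route, cost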

  10. Classification of Large-Scale Remote Sensing Images for Automatic Identification of Health Hazards: Smoke Detection Using an Autologistic Regression Classifier.

    PubMed

    Wolters, Mark A; Dean, C B

    2017-01-01

    Remote sensing images from Earth-orbiting satellites are a potentially rich data source for monitoring and cataloguing atmospheric health hazards that cover large geographic regions. A method is proposed for classifying such images into hazard and nonhazard regions using the autologistic regression model, which may be viewed as a spatial extension of logistic regression. The method includes a novel and simple approach to parameter estimation that makes it well suited to handling the large and high-dimensional datasets arising from satellite-borne instruments. The methodology is demonstrated on both simulated images and a real application to the identification of forest fire smoke.
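
    The pseudolikelihood view of the autologistic model reduces to ordinary logistic regression once each pixel's covariates are augmented with a neighbour term; a minimal sketch (not the authors' estimator, which uses a different parameter estimation approach):

      import numpy as np
      from scipy.ndimage import convolve

      def autologistic_design(labels, covariates):
          # Append the sum of the 4-neighbour labels as an extra covariate;
          # fitting a logistic regression on (X, y) then approximates the
          # autologistic fit via maximum pseudolikelihood.
          kernel = np.array([[0, 1, 0],
                             [1, 0, 1],
                             [0, 1, 0]])
          neigh = convolve(labels.astype(float), kernel, mode='constant')
          X = np.column_stack([c.ravel() for c in covariates] + [neigh.ravel()])
          y = labels.ravel()
          return X, y   # e.g. sklearn.linear_model.LogisticRegression().fit(X, y)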

  11. MRAC Control with Prior Model Knowledge for Asymmetric Damaged Aircraft

    PubMed Central

    Zhang, Jing

    2015-01-01

    This paper develops a novel state-tracking multivariable model reference adaptive control (MRAC) technique utilizing prior knowledge of plant models to recover control performance of an asymmetric structural damaged aircraft. A modification of linear model representation is given. With prior knowledge on structural damage, a polytope linear parameter varying (LPV) model is derived to cover all concerned damage conditions. An MRAC method is developed for the polytope model, of which the stability and asymptotic error convergence are theoretically proved. The proposed technique reduces the number of parameters to be adapted and thus decreases computational cost and requires less input information. The method is validated by simulations on NASA generic transport model (GTM) with damage. PMID:26180839
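
    A first-order, textbook state-tracking MRAC sketch (generic gradient-type adaptive laws, not the paper's polytope-LPV design) to make the adaptation mechanism concrete:

      import numpy as np

      def simulate_mrac(a=-1.0, b=2.0, a_m=-4.0, b_m=4.0, gamma=5.0,
                        dt=1e-3, steps=20000):
          # Plant dx = a*x + b*u, reference model dx_m = a_m*x_m + b_m*r,
          # control u = th1*x + th2*r, adaptation driven by e = x - x_m.
          x = x_m = th1 = th2 = 0.0
          sgn_b = np.sign(b)               # assumed prior knowledge
          for k in range(steps):
              r = 1.0 if (k * dt) % 4 < 2 else -1.0   # square-wave reference
              u = th1 * x + th2 * r
              e = x - x_m
              x += dt * (a * x + b * u)
              x_m += dt * (a_m * x_m + b_m * r)
              th1 -= dt * gamma * e * x * sgn_b
              th2 -= dt * gamma * e * r * sgn_b
          return th1, th2

      # Ideal gains: th1* = (a_m - a)/b = -1.5 and th2* = b_m/b = 2.0;
      # the adapted values should approach them as tracking improves.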

  12. Dynamic Fuzzy Model Development for a Drum-type Boiler-turbine Plant Through GK Clustering

    NASA Astrophysics Data System (ADS)

    Habbi, Ahcène; Zelmat, Mimoun

    2008-10-01

    This paper discusses a TS fuzzy model identification method for an industrial drum-type boiler plant using the GK fuzzy clustering approach. The fuzzy model is constructed from a set of input-output data that covers a wide operating range of the physical plant. The reference data are generated using a complex first-principles-based mathematical model that describes the key dynamical properties of the boiler-turbine dynamics. The proposed fuzzy model is derived by means of the fuzzy clustering method with particular attention to structure flexibility and model interpretability issues. This may provide the basis for a new way to design model-based control and diagnosis mechanisms for the complex nonlinear plant.

  13. Minimum depth of soil cover above long-span soil-steel railway bridges

    NASA Astrophysics Data System (ADS)

    Esmaeili, Morteza; Zakeri, Jabbar Ali; Abdulrazagh, Parisa Haji

    2013-12-01

    Recently, soil-steel bridges have become more commonly used as railway-highway crossings because of their economical advantages and short construction period compared with traditional bridges. The formulas currently given by existing codes for determining the minimum depth of cover are typically based on vehicle loads and non-stiffened panels, taking into consideration the geometrical shape of the metal structure to avoid failure of the soil cover above a soil-steel bridge. The effects of spans larger than 8 m, or of more stiffened panels under railway loads that must maintain a safe railway track, have not been accounted for in the minimum cover formulas and are the subject of this paper. For this study, two-dimensional finite element (FE) analyses of four low-profile arches and four box culverts with spans larger than 8 m were performed to develop new patterns for the minimum depth of soil cover by considering the serviceability criterion of the railway track. Using the least-squares method, new formulas were then developed for low-profile arches and box culverts and were compared with the Canadian Highway Bridge Design Code formulas. Finally, a series of three-dimensional (3D) FE analyses was carried out to check the out-of-plane buckling in the steel plates due to the 3D pattern of train loads. The results show that out-of-plane bending does not control the buckling behavior of the steel plates, so the proposed equations for minimum depth of cover can be appropriately used for practical purposes.

  14. Effect of using different cover image quality to obtain robust selective embedding in steganography

    NASA Astrophysics Data System (ADS)

    Abdullah, Karwan Asaad; Al-Jawad, Naseer; Abdulla, Alan Anwer

    2014-05-01

    One of the common types of steganography is to conceal an image as a secret message in another image, normally called a cover image; the resulting image is called a stego image. The aim of this paper is to investigate the effect of using cover images of different quality and to analyse the use of different bit-planes in terms of robustness against well-known active attacks such as gamma correction, statistical filters, and linear spatial filters. The secret messages are embedded in a higher bit-plane, i.e., other than the least significant bit (LSB), in order to resist active attacks. The embedding process is performed in three major steps: first, the embedding algorithm selectively identifies useful areas (blocks) for embedding based on their lighting condition; second, the most useful blocks for embedding are nominated based on their entropy and average; third, the right bit-plane for embedding is selected. This kind of block selection makes the embedding process scatter the secret message(s) randomly around the cover image. Different tests have been performed for selecting a proper block size, which is related to the nature of the cover image used. Our proposed method suggests a suitable embedding bit-plane as well as the right blocks for the embedding. Experimental results demonstrate that the quality of the cover image has an effect when the stego image is attacked by different active attacks. Although the secret messages are embedded in a higher bit-plane, they cannot be recognised visually within the stego image.

  15. Current situation on regulations for mycotoxins. Overview of tolerances and status of standard methods of sampling and analysis.

    PubMed

    Van Egmond, H P

    1989-01-01

    A worldwide enquiry was undertaken in 1986-1987 to obtain up-to-date information about mycotoxin legislation in as many countries of the world as possible. Together with some additional data collected in 1981, information is now available about planned, proposed, or existing legislation, or its absence, in 66 countries. Details are given about tolerances, legal bases, responsible authorities, prescribed methods of sampling and analysis, and the disposition of commodities containing inadmissible amounts of mycotoxins. The information concerns aflatoxins in foodstuffs, aflatoxin M1 in dairy products, aflatoxins in animal feedstuffs, and other mycotoxins in food- and feedstuffs. In comparison with the situation in 1981, limits and regulations for mycotoxins had expanded by 1987, with more countries having legislation (proposed or passed) on the subject, more products covered, and more mycotoxins covered by this legislation. The differences between tolerances in the various countries are sometimes quite large, which makes harmonization of mycotoxin regulations highly desirable.

  16. Extended wave-packet model to calculate energy-loss moments of protons in matter

    NASA Astrophysics Data System (ADS)

    Archubi, C. D.; Arista, N. R.

    2017-12-01

    In this work we introduce modifications to the wave-packet method proposed by Kaneko for calculating the energy-loss moments of a projectile traversing a target, which is represented in terms of Gaussian functions for the momentum distributions of electrons in the atomic shells. These modifications are introduced using the Levine and Louie technique to take into account the energy gaps corresponding to the different atomic levels of the target. We use the extended wave-packet model to evaluate the stopping power, the energy straggling, the inverse mean free path, and the ionization cross sections for protons in several targets, obtaining good agreement for all these quantities over an extensive energy range that covers the low-, intermediate-, and high-energy regions. The extended wave-packet model proposed here provides a very straightforward method for calculating all the significant terms of the inelastic interaction of light ions with any element of the periodic table.

  17. Two-stage Keypoint Detection Scheme for Region Duplication Forgery Detection in Digital Images.

    PubMed

    Emam, Mahmoud; Han, Qi; Zhang, Hongli

    2018-01-01

    In digital image forensics, copy-move or region duplication forgery detection has recently become a vital research topic. Most of the existing keypoint-based forgery detection methods fail to detect forgery in smooth regions and are sensitive to geometric changes. To solve these problems and detect keypoints that cover all regions, we propose a two-stage keypoint detection scheme. First, we employ the scale-invariant feature operator to detect spatially distributed keypoints in the textured regions. Second, keypoints in the missed regions are detected using the Harris corner detector with non-maximal suppression to distribute the detected keypoints evenly. To improve the matching performance, local feature points are described using the Multi-support Region Order-based Gradient Histogram descriptor. A comprehensive performance evaluation is carried out based on precision-recall rates and a commonly tested dataset. The results demonstrate that the proposed scheme has better detection performance and robustness against some geometric transformation attacks compared with state-of-the-art methods.
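
    The second stage (Harris corners with non-maximal suppression) is standard enough to sketch; thresholds are illustrative, and OpenCV and SciPy are assumed available:

      import numpy as np
      import cv2
      from scipy.ndimage import maximum_filter

      def harris_keypoints(gray, window=11, thresh_rel=0.01):
          # Keep a Harris response only where it is the maximum of its local
          # window, which spreads keypoints into otherwise smooth regions.
          r = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
          local_max = r == maximum_filter(r, size=window)
          strong = r > thresh_rel * r.max()
          ys, xs = np.nonzero(local_max & strong)
          return list(zip(xs, ys))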

  18. Universal sensor based on the spectroscopy of glow discharge for the detection of traces of atoms or molecules in air

    NASA Astrophysics Data System (ADS)

    Atutov, S. N.; Galeyev, A. E.; Plekhanov, A. I.; Yakovlev, A. V.

    2018-03-01

    A sensitive and versatile sensor for the detection of traces of atoms or molecules in air based on the emission spectroscopy of glow discharge in air has been developed and studied. The advantages of this sensor compared to other well-known methods are that it renders the use of ultrahigh vacuum or cryogenic temperatures superfluous. The sensor is insensitive to the presence of water vapor (for example, in exhaled air) because of the absence of strong water lines in the visible spectral range. It has a high spectral selectivity limited only by Doppler broadening of the emission lines. The high selectivity of the sensor combined with a wide spectral range allows the detection of many toxic impurities, which can be present in air. Moreover, the spectral range used covers almost all biomarkers in exhaled air, making the proposed sensor extremely interesting for medical applications. To our knowledge, the proposed method is the first based on a glow discharge in air.

  19. Identification of active sources inside cavities using the equivalent source method-based free-field recovery technique

    NASA Astrophysics Data System (ADS)

    Bi, Chuan-Xing; Hu, Ding-Yu; Zhang, Yong-Bin; Jing, Wen-Qian

    2015-06-01

    In previous studies, an equivalent source method (ESM)-based technique for recovering the free sound field in a noisy environment has been successfully applied to exterior problems. In order to evaluate its performance when applied to a more general noisy environment, that technique is used to identify active sources inside cavities where the sound field is composed of the field radiated by active sources and that reflected by walls. A patch approach with two semi-closed surfaces covering the target active sources is presented to perform the measurements, and the field that would be radiated by these target active sources into free space is extracted from the mixed field by using the proposed technique, which will be further used as the input of nearfield acoustic holography for source identification. Simulation and experimental results validate the effectiveness of the proposed technique for source identification in cavities, and show the feasibility of performing the measurements with a double layer planar array.
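
    At its core the ESM solves a linear inverse problem for the equivalent source strengths; a minimal Tikhonov-regularized sketch (matrix names hypothetical, not the paper's patch formulation):

      import numpy as np

      def esm_source_strengths(G, p, reg=1e-3):
          # Solve p = G q in the regularized least-squares sense, where G is
          # the transfer matrix from the assumed equivalent sources to the
          # measurement points and p the measured complex pressures.
          GhG = G.conj().T @ G
          q = np.linalg.solve(GhG + reg * np.eye(G.shape[1]), G.conj().T @ p)
          return q   # free field on any evaluation grid: G_field @ q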

  20. Prediction of residual shear strength of corroded reinforced concrete beams

    NASA Astrophysics Data System (ADS)

    Imam, Ashhad; Azad, Abul Kalam

    2016-09-01

    With the aim of providing experimental data on the shear capacity and behavior of corroded reinforced concrete beams that may help in the development of strength prediction models, the test results of 13 corroded and four uncorroded beams are presented. Corrosion damage was induced through accelerated corrosion by impressed current. The test results show that the loss of shear strength of the beams is mostly attributable to two important damage factors, namely the reduction in stirrup area due to corrosion and the corrosion-induced cracking of the concrete cover to the stirrups. Based on the test data, a method is proposed to predict the residual shear strength of corroded reinforced concrete beams in which the residual shear strength is first calculated using the corrosion-reduced steel area alone and then reduced by a proposed reduction factor, which collectively represents all other applicable corrosion damage factors. The method seems to yield results that are in reasonable agreement with the available test data.
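
    The two-step prediction idea can be written down directly; the truss-type stirrup term and all symbols below are generic textbook quantities, not the authors' calibrated model:

      def residual_shear_strength(v_conc, a_sv_corroded, f_yv, d, s, k_damage):
          # Step 1: shear capacity with the corrosion-reduced stirrup area
          # (concrete term v_conc plus a simple stirrup contribution).
          # Step 2: scale by a reduction factor k_damage <= 1 lumping the
          # remaining damage, e.g. corrosion-induced cover cracking.
          v_stirrups = a_sv_corroded * f_yv * d / s
          return k_damage * (v_conc + v_stirrups)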

  1. Estimation of Regional-Scale Actual Evapotranspiration in Okayama prefecture in Japan using Complementary Relationship

    NASA Astrophysics Data System (ADS)

    Moroizumi, T.; Yamamoto, M.; Miura, T.

    2008-12-01

    It is important to accurately estimate the water balance in a watershed when proposing the reuse of water resources and a proper settlement of water utilization. Evapotranspiration (ET) is an important component of the water balance; therefore, the actual ET needs to be estimated accurately. The objective of this study is to estimate monthly actual ET in the Yoshii, Asahi, and Takahashi River watersheds in Okayama prefecture from 1999 to 2000. The monthly actual ET was calculated by the Morton method and a modified Brutsaert and Stricker (B&S) method, using Automated Meteorological Data Acquisition System (AMeDAS) data in the basins. The actual ET was estimated using land covers classified into 11 categories; the land covers include the effects of albedo. The actual ET was related to the elevation at each AMeDAS station. Using this relationship, the actual ET on a 1 or 5 km grid-interval mesh in the basin was calculated, and finally the distribution of actual ET was mapped. The monthly ET estimated by the modified B&S method was smaller than that by the Morton method, which showed the same tendency as the Penman potential ET (PET). The annual values of Morton's ET, the modified B&S's ET, and PET were estimated as 796, 645, and 800 mm, respectively. The ET by the modified B&S method was larger in hilly and mountainous areas than in settlement or city areas. In general, this is a reasonable result because city or settlement areas are covered with concrete and asphalt, which limits ET.

  2. A novel method to detect shadows on multispectral images

    NASA Astrophysics Data System (ADS)

    Dağlayan Sevim, Hazan; Yardımcı Çetin, Yasemin; Özışık Başkurt, Didem

    2016-10-01

    Shadowing occurs when the direct light coming from a light source is obstructed by tall man-made structures, mountains, or clouds. Since shadow regions are illuminated only by scattered light, the true spectral properties of the objects are not observed in such regions. Therefore, many object classification and change detection problems utilize shadow detection as a preprocessing step. Besides, shadows are useful for obtaining 3D information about objects, such as estimating the height of buildings. With the pervasiveness of remote sensing images, shadow detection is ever more important. This study aims to develop a shadow detection method for multispectral images based on a transformation of the C1C2C3 color space and the contribution of NIR bands. The proposed method is tested on WorldView-2 images covering Ankara, Turkey, acquired at different times. The new index is applied to these 8-band multispectral images with two NIR bands, and the method is compared with methods from the literature.
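
    The C3 component of the C1C2C3 space is arctan(B / max(R, G)); a hedged sketch of a candidate shadow mask combining it with an NIR darkness test (thresholds illustrative, reflectances in [0, 1] assumed):

      import numpy as np

      def shadow_mask(r, g, b, nir, c3_thresh=0.9, nir_thresh=0.15):
          # arctan2 is the numerically stable form of arctan(B / max(R, G));
          # shadows tend to show high C3 and low NIR response.
          c3 = np.arctan2(b, np.maximum(r, g))
          return (c3 > c3_thresh) & (nir < nir_thresh)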

  3. Monitoring Farmland Loss Caused by Urbanization in Beijing from Modis Time Series Using Hierarchical Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Yuan, Y.; Meng, Y.; Chen, Y. X.; Jiang, C.; Yue, A. Z.

    2018-04-01

    In this study, we proposed a method to map urban encroachment onto farmland using satellite image time series (SITS) based on the hierarchical hidden Markov model (HHMM). In this method, the farmland change process is decomposed into three hierarchical levels, i.e., the land cover level, the vegetation phenology level, and the SITS level. Then a three-level HHMM is constructed to model the multi-level semantic structure of farmland change process. Once the HHMM is established, a change from farmland to built-up could be detected by inferring the underlying state sequence that is most likely to generate the input time series. The performance of the method is evaluated on MODIS time series in Beijing. Results on both simulated and real datasets demonstrate that our method improves the change detection accuracy compared with the HMM-based method.
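
    Inference over the hidden states is the core step; a flat-HMM Viterbi sketch (the paper's three-level HHMM factorizes this further) showing how the most likely state path, and hence a farmland-to-built-up conversion date, is read off:

      import numpy as np

      def viterbi(log_pi, log_A, log_B, obs):
          # log_pi: initial log-probs (S,); log_A: transition log-probs (S, S);
          # log_B: emission log-probs (S, n_symbols); obs: observation indices.
          T, S = len(obs), len(log_pi)
          delta = np.full((T, S), -np.inf)
          back = np.zeros((T, S), dtype=int)
          delta[0] = log_pi + log_B[:, obs[0]]
          for t in range(1, T):
              scores = delta[t - 1][:, None] + log_A   # (from_state, to_state)
              back[t] = scores.argmax(axis=0)
              delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
          path = [int(delta[-1].argmax())]
          for t in range(T - 1, 0, -1):
              path.append(int(back[t, path[-1]]))
          return path[::-1]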

  4. Evaluation of the deflected mode of the monolithic span pieces and preassembled slabs combined action

    NASA Astrophysics Data System (ADS)

    Roshchina, Svetlana; Ezzi, Hisham; Shishov, Ivan; Lukin, Mikhail; Sergeev, Michael

    2017-10-01

    In single-story industrial buildings, the cost of the roof covering comprises 40-55% of the total cost of the building. Therefore, the research, development and application of new structural forms of reinforced concrete rafter structures that reduce material consumption and the sub-assembly weight of structures are the main tasks in improving the existing generic solutions. The article suggests a method for estimating the relieving effect in the rafter structure resulting from the combined deformation of the roof slabs with the end arrises. A calculation and experimental method for determining the stress and strain state of the rafter structure upper belt and the roof slabs with regard to their rigid connection is proposed. A model of a highly effective roof structure is proposed that provides a significant reduction in the construction height of the roofing and in the cubic content of the building, while allowing the end arrises and part of the slab flanges to be engaged with the help of monolithic concrete. The proposed prefabricated monolithic concrete rafter structure and its rigid connection with ribbed slabs reduce the consumption of prestressed slab reinforcement by 50%.

  5. An algorithm for encryption of secret images into meaningful images

    NASA Astrophysics Data System (ADS)

    Kanso, A.; Ghebleh, M.

    2017-03-01

    Image encryption algorithms typically transform a plain image into a noise-like cipher image, whose appearance is an indication of encrypted content. Bao and Zhou [Image encryption: Generating visually meaningful encrypted images, Information Sciences 324, 2015] propose encrypting the plain image into a visually meaningful cover image. This improves security by masking existence of encrypted content. Following their approach, we propose a lossless visually meaningful image encryption scheme which improves Bao and Zhou's algorithm by making the encrypted content, i.e. distortions to the cover image, more difficult to detect. Empirical results are presented to show high quality of the resulting images and high security of the proposed algorithm. Competence of the proposed scheme is further demonstrated by means of comparison with Bao and Zhou's scheme.

  6. Directional Sensitivity in Light-Mass Dark Matter Searches with Single-Electron-Resolution Ionization Detectors

    NASA Astrophysics Data System (ADS)

    Kadribasic, Fedja; Mirabolfathi, Nader; Nordlund, Kai; Sand, Andrea E.; Holmström, Eero; Djurabekova, Flyura

    2018-03-01

    We propose a method using solid state detectors with directional sensitivity to dark matter interactions to detect low-mass weakly interacting massive particles (WIMPs) originating from galactic sources. In spite of a large body of literature on high-mass WIMP detectors with directional sensitivity, no available technique exists to cover WIMPs in the mass range below 1 GeV/c2. We argue that single-electron-resolution semiconductor detectors allow for directional sensitivity once properly calibrated. We examine the response of commonly used semiconductor materials to these low-mass WIMP interactions.

  7. Reporting inquiry in simulation.

    PubMed

    Kardong-Edgren, Suzie; Gaba, David; Dieckmann, Peter; Cook, David A

    2011-08-01

    The term "inquiry" covers the large spectrum of what people are currently doing in the nascent field of simulation. This monograph proposes appropriate means of dissemination for the many different levels of inquiry that may arise from the Summit or other sources of inspiration. We discuss various methods of inquiry and where they might fit in the hierarchy of reporting and dissemination. We provide guidance for deciding whether an inquiry has reached the level of development required for publication in a peer-reviewed journal and conclude with a discussion of what most journals view as inquiry acceptable for publication.

  8. Optimizing the discovery organization for innovation.

    PubMed

    Sams-Dodd, Frank

    2005-08-01

    Strategic management is the process of adapting organizational structure and management principles to fit the strategic goal of the business unit. The pharmaceutical industry has generally been expert at optimizing its organizations for drug development, but has rarely implemented different structures for the early discovery process, where the objective is innovation and the transformation of innovation into drug projects. Here, a set of strategic management methods is proposed, covering team composition, organizational structure, management principles and portfolio management, which are designed to increase the level of innovation in the early drug discovery process.

  9. Numerical analysis of the shifting slabs applied in a wireless power transfer system to enhance magnetic coupling

    NASA Astrophysics Data System (ADS)

    Dong, Yayun; Yang, Xijun; Jin, Nan; Li, Wenwen; Yao, Chen; Tang, Houjun

    2017-05-01

    Shifting medium is a kind of metamaterial which can optically shift a space or an object a certain distance away from its original position. Based on the shifting medium, we propose a concise pair of shifting slabs covering the transmitting or receiving coil in a two-coil wireless power transfer system to decrease the equivalent distance between the coils. The electromagnetic parameters of the shifting slabs are calculated by transformation optics. Numerical simulations validate that the shifting slabs can approximately shift the electromagnetic fields generated by the covered coil; thus, the magnetic coupling and the efficiency of the system are enhanced while the physical transmission distance remains unchanged. We also verify the advantages of the shifting slabs over the magnetic superlens. Finally, we provide two methods to fabricate shifting slabs based on split-ring resonators.

  10. A Novel Quantum Image Steganography Scheme Based on LSB

    NASA Astrophysics Data System (ADS)

    Zhou, Ri-Gui; Luo, Jia; Liu, XingAo; Zhu, Changming; Wei, Lai; Zhang, Xiafen

    2018-06-01

    Based on the NEQR representation of quantum images and the least significant bit (LSB) scheme, a novel quantum image steganography scheme is proposed. The sizes of the cover image and the original information image are assumed to be 4n × 4n and n × n, respectively. Firstly, the bit-plane scrambling method is used to scramble the original information image. Then the scrambled information image is expanded to the same size as the cover image by using a key known only to the operator. The expanded image is scrambled into a meaningless image with Arnold scrambling. The embedding and extracting procedures are carried out with keys K1 and K2, which are under the control of the operator. For validation of the presented scheme, the peak signal-to-noise ratio (PSNR), the capacity, the security of the images, and the circuit complexity are analyzed.
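
    Arnold scrambling, one building block named in the abstract, is compact enough to sketch classically (the quantum-circuit version in the paper operates on NEQR states instead):

      import numpy as np

      def arnold_scramble(img, iterations=1):
          # Arnold cat map (x, y) -> (x + y, x + 2y) mod n on a square image;
          # the map is a bijection, so iterating its inverse restores the image.
          n = img.shape[0]
          assert img.shape[0] == img.shape[1], "square image required"
          x, y = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
          out = img
          for _ in range(iterations):
              scrambled = np.empty_like(out)
              scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
              out = scrambled
          return out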

  11. Using Space Lidar Observations to Decompose Longwave Cloud Radiative Effect Variations Over the Last Decade

    NASA Astrophysics Data System (ADS)

    Vaillant de Guélis, Thibault; Chepfer, Hélène; Noel, Vincent; Guzman, Rodrigo; Winker, David M.; Plougonven, Riwal

    2017-12-01

    Measurements of the longwave cloud radiative effect (LWCRE) at the top of the atmosphere assess the contribution of clouds to Earth's warming but do not quantify the cloud property variations that are responsible for the LWCRE variations. The CALIPSO space lidar directly observes the detailed profiles of cloud, cloud opacity, and cloud cover. Here we use these observations to quantify the influence of cloud properties on the variations of the LWCRE observed between 2008 and 2015 in the tropics and at global scale. At global scale, the method proposed here gives good results except over the Southern Ocean. We find that the global LWCRE variations observed over ocean are mostly due to variations in the opaque cloud properties (82%); transparent cloud columns contributed 18%. Variation of opaque cloud cover is the first contributor to the LWCRE evolution (58%); opaque cloud temperature is the second contributor (28%).

  12. Novel Remanufacturing Process of Recycled Polytetrafluoroethylene(PTFE)/GF Laminate

    NASA Astrophysics Data System (ADS)

    Xi, Z.; Ghita, O. R.; Johnston, P.; Evans, K. E.

    2011-01-01

    Currently, PTFE/GF laminate and PTFE PCB manufacturers are under considerable pressure to address recycling issues due to the Waste Electrical and Electronic Equipment (WEEE) Directive, the shortage of landfill capacity, and the cost of disposal. This study proposes a novel manufacturing method for reusing mechanically ground PTFE/glass fibre (GF) laminate and producing the first reconstituted PTFE/GF laminate. The reconstituted PTFE/GF laminate proposed here consists of a layer of recycled sub-sheet and additional layers of PTFE and PTFE-coated glass cloth, covered by copper foils. The reconstituted PTFE/GF laminate showed good dielectric properties. Therefore, there is potential to use mechanically ground PTFE/GF laminate powder to produce reconstituted PTFE/GF laminate for use in high-frequency PCB applications.

  13. A prospective health impact assessment of the international astronomy and space exploration centre.

    PubMed

    Winters, L Y

    2001-06-01

    Assess the potential health impacts of the proposed International Astronomy and Space Exploration Centre on the population of New Wallasey, and contribute to the piloting of health impact assessment methods. Prospective health impact assessment involving brainstorming sessions and individual interviews with key informants and a literature review. New Wallasey Single Regeneration Budget 4 area. Key stakeholders, including local residents' groups, selected through purposeful snowball sampling. Recommendations are made that cover issues around transport and traffic, civic design, security, public safety, employment, and training. Health impact assessment is a useful pragmatic tool for facilitating wide consultation, in particular for engaging the local population in the early planning stages of a proposed development and for highlighting changes to maximise the positive health influences on affected communities.

  14. An impatient evolutionary algorithm with probabilistic tabu search for unified solution of some NP-hard problems in graph and set theory via clique finding.

    PubMed

    Guturu, Parthasarathy; Dantu, Ram

    2008-06-01

    Many graph- and set-theoretic problems, because of their tremendous application potential and theoretical appeal, have been well investigated by researchers in complexity theory and were found to be NP-hard. Since the combinatorial complexity of these problems does not permit exhaustive searches for optimal solutions, only near-optimal solutions can be explored using either various problem-specific heuristic strategies or metaheuristic global-optimization methods, such as simulated annealing, genetic algorithms, etc. In this paper, we propose a unified evolutionary algorithm (EA) for the problems of maximum clique finding, maximum independent set, minimum vertex cover, subgraph and double subgraph isomorphism, set packing, set partitioning, and set cover. In the proposed approach, we first map these problems onto the maximum clique-finding problem (MCP), which is later solved using an evolutionary strategy. The proposed impatient EA with probabilistic tabu search (IEA-PTS) for the MCP integrates the best features of earlier successful approaches with a number of new heuristics that we developed to yield a performance that advances the state of the art in EAs for the exploration of maximum cliques in a graph. Results of experimentation with the 37 DIMACS benchmark graphs and comparative analyses with six state-of-the-art algorithms, including two from the smaller EA community and four from the larger metaheuristics community, indicate that the IEA-PTS outperforms the EAs with respect to a Pareto-lexicographic ranking criterion and offers competitive performance on some graph instances when individually compared to the other heuristic algorithms. It has also successfully set a new benchmark on one graph instance. On another benchmark suite called Benchmarks with Hidden Optimal Solutions, IEA-PTS ranks second, after a very recent algorithm called COVER, among its peers that have experimented with this suite.
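
    The unifying mapping onto the MCP can be made concrete for two of the listed problems. The sketch below, assuming the networkx library and exhaustive clique enumeration (practical only for tiny graphs, unlike the authors' IEA-PTS solver), recovers a maximum independent set as a maximum clique of the complement graph and a minimum vertex cover as its set-theoretic complement.

      import networkx as nx

      def independent_set_and_cover(G):
          H = nx.complement(G)
          # A clique in the complement of G is an independent set in G.
          max_clique = max(nx.find_cliques(H), key=len)
          independent_set = set(max_clique)
          vertex_cover = set(G.nodes) - independent_set  # complement of a max IS
          return independent_set, vertex_cover

      iset, cover = independent_set_and_cover(nx.cycle_graph(5))
      print(len(iset), len(cover))  # 2 and 3 for the 5-cycle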

  15. 46 CFR 164.006-5 - Procedure for approval.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the deck covering. (2) The range of thicknesses in which it is proposed to lay the deck covering... (c). (2) Sufficient bulk material (unmixed) to lay a sample one inch thick on an area of 12″×27″. If...

  16. Research on the shortwave infrared hyperspectral imaging technology based on Integrated Stepwise filter

    NASA Astrophysics Data System (ADS)

    Wei, Liqing; Xiao, Xizhong; Wang, Yueming; Zhuang, Xiaoqiong; Wang, Jianyu

    2017-11-01

    Space-borne hyperspectral imagery is an important tool for earth sciences and industrial applications. Higher spatial and spectral resolutions have been sought persistently, although they come at the cost of greater power consumption, volume, and weight in a space-borne spectral imager design. For miniaturization of the hyperspectral imager and optimization of spectral splitting methods, several methods are compared in this paper. A spectral time delay integration (TDI) method with a high-transmittance Integrated Stepwise Filter (ISF) is proposed. With this method, an ISF imaging spectrometer with TDI can achieve higher system sensitivity than a traditional prism/grating imaging spectrometer. In addition, the ISF imaging spectrometer performs well in suppressing the infrared background radiation produced by the instrument. A compact shortwave infrared (SWIR) hyperspectral imager prototype based on HgCdTe, covering the spectral range of 2.0-2.5 μm with 6 TDI stages, was designed and integrated. To investigate the performance of the ISF spectrometer, a method to derive the optimal blocking band curve of the ISF is introduced, along with known error characteristics. To assess the spectral performance of the ISF system, a new spectral calibration based on blackbody radiation with temperature scanning is proposed. The results of the imaging experiment showed the merits of the ISF. The ISF has great application prospects in the field of high-sensitivity, high-resolution space-borne hyperspectral imagery.

  17. Segmentation schema for enhancing land cover identification: A case study using Sentinel 2 data

    NASA Astrophysics Data System (ADS)

    Mongus, Domen; Žalik, Borut

    2018-04-01

    Land monitoring is increasingly performed using high and medium resolution optical satellites, such as Sentinel-2. However, optical data is inevitably subject to the variable operational conditions under which it was acquired. Overlapping of features caused by shadows, soft transitions between shadowed and non-shadowed regions, and temporal variability of the observed land-cover types require radiometric corrections. This study examines a new approach to enhancing the accuracy of land cover identification that resolves this problem. The proposed method constructs an ensemble-type classification model with weak classifiers tuned to the particular operational conditions under which the data was acquired. Iterative segmentation over the learning set is applied for this purpose, where the feature space is partitioned according to the likelihood of misclassifications introduced by the classification model. As these are a consequence of overlapping features, such partitioning avoids the need for radiometric corrections of the data and divides land cover types implicitly into subclasses. As a result, improved performance of all tested classification approaches was measured during the validation conducted on Sentinel-2 data. The highest accuracies in terms of F1-scores were achieved using the Naive Bayes classifier as the weak classifier, while supplementing original spectral signatures with the normalised difference vegetation index and texture analysis features, namely, average intensity, contrast, homogeneity, and dissimilarity. In total, an F1-score of nearly 95% was achieved in this way, with F1-scores of each particular land cover type reaching above 90%.
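
    One of the supplementary features named above has a one-line definition. The sketch below, a minimal NumPy version assuming red (band 4) and near-infrared (band 8) reflectance rasters already loaded as float arrays, computes the normalised difference vegetation index; the texture features and the ensemble itself are not reproduced.

      import numpy as np

      def ndvi(red, nir):
          denom = nir + red
          out = np.zeros_like(denom, dtype=float)
          # Guard against zero reflectance sums to avoid 0/0.
          np.divide(nir - red, denom, out=out, where=denom != 0)
          return out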

  18. 46 CFR 148.21 - Necessary information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... material not covered by paragraph (b) of this section, including— (1) Self-heating; (2) Depletion of oxygen... proposed transportation operation, including— (1) The type of vessel proposed for water movements; (2) The...

  19. Updating the 2001 National Land Cover Database land cover classification to 2006 by using Landsat imagery change detection methods

    USGS Publications Warehouse

    Xian, George; Homer, Collin G.; Fry, Joyce

    2009-01-01

    The recent release of the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001, which represents the nation's land cover status based on a nominal date of 2001, is widely used as a baseline for national land cover conditions. To enable the updating of this land cover information in a consistent and continuous manner, a prototype method was developed to update land cover by an individual Landsat path and row. This method updates NLCD 2001 to a nominal date of 2006 by using both Landsat imagery and data from NLCD 2001 as the baseline. Pairs of Landsat scenes in the same season in 2001 and 2006 were acquired according to satellite paths and rows and normalized to allow calculation of change vectors between the two dates. Conservative thresholds based on Anderson Level I land cover classes were used to segregate the change vectors and determine areas of change and no-change. Once change areas had been identified, land cover classifications at the full NLCD resolution for 2006 areas of change were completed by sampling from NLCD 2001 in unchanged areas. Methods were developed and tested across five Landsat path/row study sites that contain several metropolitan areas including Seattle, Washington; San Diego, California; Sioux Falls, South Dakota; Jackson, Mississippi; and Manchester, New Hampshire. Results from the five study areas show that the vast majority of land cover change was captured and updated with overall land cover classification accuracies of 78.32%, 87.5%, 88.57%, 78.36%, and 83.33% for these areas. The method optimizes mapping efficiency and has the potential to provide users a flexible method to generate updated land cover at national and regional scales by using NLCD 2001 as the baseline.
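
    The core change-detection step lends itself to a short sketch. The version below is a simplification under stated assumptions: a single global threshold stands in for the conservative per-class Anderson Level I thresholds, and the scenes are assumed to be radiometrically normalised (bands, rows, cols) arrays.

      import numpy as np

      def change_mask(scene_2001, scene_2006, threshold):
          # Per-pixel change vector magnitude across all spectral bands.
          magnitude = np.sqrt(((scene_2006 - scene_2001) ** 2).sum(axis=0))
          return magnitude > threshold  # True where land cover likely changed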

  20. The ECHI project: health indicators for the European Community.

    PubMed

    Kramers, Pieter G N

    2003-09-01

    Within the EU Health Monitoring Programme (HMP), the ECHI project has proposed a comprehensive list of 'European Community Health Indicators'. In the design of the indicator set, a set of explicit criteria was applied. These included: i) be comprehensive and coherent, i.e. cover all domains of the public health field; ii) take account of earlier work, especially that by WHO-Europe, OECD and Eurostat; and iii) cover the priority areas that Member States and Community health policies currently pursue. Flexibility is an important characteristic of the present proposal. In ECHI, this has been emphasized by the definition of 'user-windows'. These are subsets of the overall indicator list, each of which should reflect a specific user's requirement or interest. The proposed indicators are, in most cases, defined as generic indicators, i.e. their actual operational definitions have not yet been attempted. This work has been, and is being, carried out in large part by other projects financed under the HMP, which cover specific areas of public health or areas of data collection. Apart from indicators covered by regularly available data, indicators (or issues) have been proposed for which data are currently difficult to collect but which are needed from a policy point of view. All this points to the fact that establishing an indicator list that is actually used by Member States is a continuously developing process. This process is now continued by the first strand of the new EU Public Health Action Programme.

  1. Ecological security pattern construction based on ecological protection redlines in China

    NASA Astrophysics Data System (ADS)

    Zou, Changxin

    2017-04-01

    China is facing huge environmental problems with its current rapid rate of urbanization and industrialization, causing biodiversity loss and ecosystem service degradation on a major scale. Against this background, three earlier policies (the nature reserve policy, the afforestation policy, and the zoning policy) have been implemented in China. These all play important roles in protecting natural ecosystems, although they can sometimes cause new problems and lack rigorous targets for environmental outcomes. To overcome current management conflicts, China has proposed a new "ecological protection redlines" policy (EPR). EPR can be defined as the ecological baseline area needed to provide ecosystem services to guarantee and maintain ecological safety. This study analyzed the scope, objectives and technical methods of delineating EPR in China, and put forward a proposed scheme for an ecological security pattern based on EPR. We constructed three kinds of redlines in China: key ecological function area redlines, ecologically sensitive or fragile area redlines, and forbidden development area redlines. For the key ecological function area redlines, a total of 38 water conservation functional zones have been designated, covering a total area of 3.23 million km2; 14 soil conservation zones have been designated, covering a total area of 881,700 km2; and wind-prevention and sand-fixation zones across the country cover a total area of about 1.73 million km2, accounting for 57.13% of the total land area of the whole country. With respect to the ecologically vulnerable redlines, 18 ecologically vulnerable zones have been designated across the country, covering 2.19 million km2 and accounting for 22.86% of the total land area of the whole country. Forbidden development area redlines cover a total area of 3.29 million km2, accounting for 34.3% of the total land area of the whole country. We also suggest forming a complete ecological security pattern, including patterns of protecting ecological function, residential environment safety, and biodiversity maintenance. Further emphasis should be put on supporting management and control measures in order to promote ecological protection in China.

  2. Application of high-order numerical schemes and Newton-Krylov method to two-phase drift-flux model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    This study concerns the application and solver robustness of the Newton-Krylov method in solving two-phase flow drift-flux model problems using high-order numerical schemes. In our previous studies, the Newton-Krylov method has been proven a promising solver for two-phase flow drift-flux model problems. However, these studies were limited to first-order numerical schemes only. Moreover, the previous approach to treating the drift-flux closure correlations was later revealed to cause deteriorated solver convergence performance when the mesh was highly refined, and also when higher-order numerical schemes were employed. In this study, a second-order spatial discretization scheme that had been tested with the two-fluid two-phase flow model was extended to solve drift-flux model problems. In order to improve solver robustness, and therefore efficiency, a new approach was proposed that treats the mean drift velocity of the gas phase as a primary nonlinear variable of the equation system. With this new approach, significant improvement in solver robustness was achieved. With highly refined mesh, the proposed treatment along with the Newton-Krylov solver was extensively tested with two-phase flow problems that cover a wide range of thermal-hydraulics conditions. Satisfactory convergence performance was observed for all test cases. Numerical verification was then performed in the form of mesh convergence studies, from which the expected orders of accuracy were obtained for both the first-order and the second-order spatial discretization schemes. Finally, the drift-flux model, along with the numerical methods presented, was validated with three sets of flow boiling experiments that cover different flow channel geometries (round tube, rectangular tube, and rod bundle) and a wide range of test conditions (pressure, mass flux, wall heat flux, inlet subcooling and outlet void fraction).
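
    The drift-flux equations are not reproduced here, but the Jacobian-free solver class the study relies on can be demonstrated on a toy problem. The sketch below applies SciPy's newton_krylov to a small one-dimensional nonlinear residual; the residual, grid, and tolerance are illustrative assumptions only.

      import numpy as np
      from scipy.optimize import newton_krylov

      def residual(u):
          # Toy residual: discrete diffusion plus a weak quadratic source term.
          r = np.zeros_like(u)
          r[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:] + 0.1 * u[1:-1] ** 2
          r[0] = u[0] - 1.0   # Dirichlet boundary conditions
          r[-1] = u[-1]
          return r

      u0 = np.linspace(1.0, 0.0, 50)          # initial guess
      solution = newton_krylov(residual, u0, f_tol=1e-10)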

  3. Automatic Road Gap Detection Using Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.

    2011-09-01

    Automatic feature extraction from aerial and satellite images is a high-level data processing task which is still one of the most important research topics of the field. In this area, most research is focused on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are some of the mature methods in this field. Although most research is focused on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, aimed at refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of the following main steps: 1) Short gap coverage: in this step, a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: in this step, the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system. For this purpose, a knowledge base consisting of expert rules is designed and fired on gap candidates from the road detection results. 3) Long gap coverage: in this stage, detected long gaps are compensated by two strategies, linear and polynomial; shorter gaps are filled by line fitting while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with truly compensated ones produced by a human expert. The complete evaluation of the obtained results with their technical discussion is the material of the full paper.

  4. Application of high-order numerical schemes and Newton-Krylov method to two-phase drift-flux model

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2017-08-07

    This study concerns the application and solver robustness of the Newton-Krylov method in solving two-phase flow drift-flux model problems using high-order numerical schemes. In our previous studies, the Newton-Krylov method has been proven a promising solver for two-phase flow drift-flux model problems. However, these studies were limited to first-order numerical schemes only. Moreover, the previous approach to treating the drift-flux closure correlations was later revealed to cause deteriorated solver convergence performance when the mesh was highly refined, and also when higher-order numerical schemes were employed. In this study, a second-order spatial discretization scheme that had been tested with the two-fluid two-phase flow model was extended to solve drift-flux model problems. In order to improve solver robustness, and therefore efficiency, a new approach was proposed that treats the mean drift velocity of the gas phase as a primary nonlinear variable of the equation system. With this new approach, significant improvement in solver robustness was achieved. With highly refined mesh, the proposed treatment along with the Newton-Krylov solver was extensively tested with two-phase flow problems that cover a wide range of thermal-hydraulics conditions. Satisfactory convergence performance was observed for all test cases. Numerical verification was then performed in the form of mesh convergence studies, from which the expected orders of accuracy were obtained for both the first-order and the second-order spatial discretization schemes. Finally, the drift-flux model, along with the numerical methods presented, was validated with three sets of flow boiling experiments that cover different flow channel geometries (round tube, rectangular tube, and rod bundle) and a wide range of test conditions (pressure, mass flux, wall heat flux, inlet subcooling and outlet void fraction).

  5. Enhancing the performance of regional land cover mapping

    NASA Astrophysics Data System (ADS)

    Wu, Weicheng; Zucca, Claudio; Karam, Fadi; Liu, Guangping

    2016-10-01

    Different pixel-based, object-based and subpixel-based methods such as time-series analysis, decision trees, and different supervised approaches have been proposed to conduct land use/cover classification. However, despite their proven advantages in small dataset tests, their performance is variable and less satisfactory when dealing with large datasets, particularly for regional-scale mapping with high resolution data, due to the complexity and diversity of landscapes and land cover patterns and the unacceptably long processing time. The objective of this paper is to demonstrate the comparatively highest performance of an operational approach based on the integration of multisource information, ensuring high mapping accuracy in large areas with acceptable processing time. The information used includes phenologically contrasted multiseasonal and multispectral bands, vegetation index, land surface temperature, and topographic features. The performance of different conventional and machine learning classifiers, namely Mahalanobis Distance (MD), Maximum Likelihood (ML), Artificial Neural Networks (ANNs), Support Vector Machines (SVMs) and Random Forests (RFs), was compared using the same datasets in the same IDL (Interactive Data Language) environment. An Eastern Mediterranean area with complex landscape and steep climate gradients was selected to test and develop the operational approach. The results showed that the SVM and RF classifiers produced the most accurate mapping at local scale (up to 96.85% in overall accuracy) but were very time-consuming in whole-scene classification (more than five days per scene), whereas ML fulfilled the task rapidly (about 10 min per scene) with satisfying accuracy (94.2-96.4%). Thus, the approach composed of the integration of seasonally contrasted multisource data and sampling at subclass level, followed by an ML classification, is a suitable candidate to become an operational and effective regional land cover mapping method.
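
    The comparison workflow can be sketched compactly. The version below is an assumption-laden stand-in: scikit-learn replaces the authors' IDL environment, and X and y denote a feature matrix (multiseasonal bands, vegetation index, land surface temperature, topography) and label vector prepared elsewhere.

      from sklearn.ensemble import RandomForestClassifier
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      CLASSIFIERS = {
          "RF": RandomForestClassifier(n_estimators=200, random_state=0),
          "SVM": SVC(kernel="rbf"),
      }

      def compare(X, y):
          # Mean 5-fold cross-validated accuracy per classifier.
          return {name: cross_val_score(clf, X, y, cv=5).mean()
                  for name, clf in CLASSIFIERS.items()}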

  6. Exome sequencing of a multigenerational human pedigree.

    PubMed

    Hedges, Dale J; Burges, Dan; Powell, Eric; Almonte, Cherylyn; Huang, Jia; Young, Stuart; Boese, Benjamin; Schmidt, Mike; Pericak-Vance, Margaret A; Martin, Eden; Zhang, Xinmin; Harkins, Timothy T; Züchner, Stephan

    2009-12-14

    Over the next few years, the efficient use of next-generation sequencing (NGS) in human genetics research will depend heavily upon effective mechanisms for the selective enrichment of genomic regions of interest. Recently, comprehensive exome capture arrays have become available for targeting approximately 33 Mb or approximately 180,000 coding exons across the human genome. Selective genomic enrichment of the human exome offers an attractive option for new experimental designs aiming to quickly identify potential disease-associated genetic variants, especially in family-based studies. We have evaluated a 2.1 M feature human exome capture array on eight individuals from a three-generation family pedigree. We were able to cover up to 98% of the targeted bases at a long-read sequence read depth of ≥3, 86% at a read depth of ≥10, and over 50% of all targets were covered with ≥20 reads. We identified up to 14,284 SNPs and small indels per individual exome, with up to 1,679 of these representing putative novel polymorphisms. Applying the conservative genotype calling approach HCDiff, the average rate of detection of a variant allele based on Illumina 1 M BeadChip genotypes was 95.2% at ≥10x sequence depth. Further, we propose an advantageous genotype calling strategy for low-covered targets that empirically determines cut-off thresholds at a given coverage depth based on existing genotype data. Application of this method was able to detect >99% of SNPs covered ≥8x. Our results offer guidance for "real-world" applications in human genetics and provide further evidence that microarray-based exome capture is an efficient and reliable method to enrich for chromosomal regions of interest in next-generation sequencing experiments.

  7. Multiple grid arrangement improves ligand docking with unknown binding sites: Application to the inverse docking problem.

    PubMed

    Ban, Tomohiro; Ohue, Masahito; Akiyama, Yutaka

    2018-04-01

    The identification of comprehensive drug-target interactions is important in drug discovery. Although numerous computational methods have been developed over the years, a gold standard technique has not been established. Computational ligand docking and structure-based drug design allow researchers to predict the binding affinity between a compound and a target protein, and thus, they are often used to virtually screen compound libraries. In addition, docking techniques have also been applied to the virtual screening of target proteins (inverse docking) to predict target proteins of a drug candidate. Nevertheless, a more accurate docking method is currently required. In this study, we proposed a method in which a predicted ligand-binding site is covered by multiple grids, termed multiple grid arrangement. Notably, multiple grid arrangement facilitates the conformational search for a grid-based ligand docking software and can be applied to the state-of-the-art commercial docking software Glide (Schrödinger, LLC). We validated the proposed method by re-docking with the Astex diverse benchmark dataset and blind binding site situations, which improved the correct prediction rate of the top scoring docking pose from 27.1% to 34.1%; however, only a slight improvement in target prediction accuracy was observed with inverse docking scenarios. These findings highlight the limitations and challenges of current scoring functions and the need for more accurate docking methods. The proposed multiple grid arrangement method was implemented in Glide by modifying a cross-docking script for Glide, xglide.py. The script of our method is freely available online at http://www.bi.cs.titech.ac.jp/mga_glide/. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Improved Simplified Methods for Effective Seismic Analysis and Design of Isolated and Damped Bridges in Western and Eastern North America

    NASA Astrophysics Data System (ADS)

    Koval, Viacheslav

    The seismic design provisions of the CSA-S6 Canadian Highway Bridge Design Code and the AASHTO LRFD Seismic Bridge Design Specifications have been developed primarily based on historical earthquake events that have occurred along the west coast of North America. For the design of seismic isolation systems, these codes include simplified analysis and design methods. The appropriateness and range of application of these methods are investigated through extensive parametric nonlinear time history analyses in this thesis. It was found that there is a need to adjust existing design guidelines to better capture the expected nonlinear response of isolated bridges. For isolated bridges located in eastern North America, new damping coefficients are proposed. The applicability limits of the code-based simplified methods have been redefined to ensure that the modified method will lead to conservative results and that a wider range of seismically isolated bridges can be covered by this method. The possibility of further improving current simplified code methods was also examined. By transforming the quantity of allocated energy into a displacement contribution, an idealized analytical solution is proposed as a new simplified design method. This method realistically reflects the effects of ground-motion and system design parameters, including the effects of a drifted oscillation center. The proposed method is therefore more appropriate than the existing simplified methods and is applicable to isolation systems exhibiting a wider range of properties. A multi-level-hazard performance matrix has been adopted by different seismic provisions worldwide and will be incorporated into the new edition of the Canadian CSA-S6-14 Bridge Design code. However, the combined effect and optimal use of isolation and supplemental damping devices in bridges have not yet been fully exploited to achieve enhanced performance under different levels of seismic hazard. A novel Dual-Level Seismic Protection (DLSP) concept is proposed and developed in this thesis which permits achieving optimum seismic performance with combined isolation and supplemental damping devices in bridges. This concept is shown to represent an attractive design approach both for the upgrade of existing seismically deficient bridges and for the design of new isolated bridges.

  9. An approach to compute the C factor for universal soil loss equation using EOS-MODIS vegetation index (VI)

    NASA Astrophysics Data System (ADS)

    Li, Hui; He, Huizhong; Chen, Xiaoling; Zhang, Lihua

    2008-12-01

    The C factor, known as the cover and management factor in the USLE, is one of the most important factors since it represents the combined effects of plant, soil cover and management on erosion; it is also among the variables most easily changed by human activity, being itself time-variant and uncertain in nature. It is therefore vital to compute the C factor properly in order to model erosion effectively. In this paper we present a new method for calculating the C value using a Vegetation Index (VI) derived from multi-temporal MODIS imagery, which can estimate the C factor in a more scientific way. Based on the theory that the C factor is strongly correlated with VI, the average annual C value is estimated by adding the VI values of three growth phases within a year with different weights. The Modified Fournier Index (MFI) is employed to determine the weight of each growth phase, since vegetation growth and agricultural activities are significantly influenced by precipitation. The C values generated by the proposed method were compared with those of another method, and the comparison showed that the results of our method are highly correlated with the others. This study helps extract the C value from satellite data in a scientific and efficient way, which in turn could be used to facilitate the prediction of erosion.
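
    The weighting step reduces to a few lines. The sketch below assumes three per-growth-phase VI composites and MFI-derived weights computed elsewhere; the paper's exact VI-to-C relationship is not reproduced.

      import numpy as np

      def annual_c_factor(vi_phases, mfi_weights):
          # Weighted sum of per-growth-phase VI rasters; weights normalised to 1.
          w = np.asarray(mfi_weights, dtype=float)
          w = w / w.sum()
          return sum(wi * vi for wi, vi in zip(w, vi_phases))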

  10. Dual-resolution image reconstruction for region-of-interest CT scan

    NASA Astrophysics Data System (ADS)

    Jin, S. O.; Shin, K. Y.; Yoo, S. K.; Kim, J. G.; Kim, K. H.; Huh, Y.; Lee, S. Y.; Kwon, O.-K.

    2014-07-01

    In an ordinary CT scan, the so-called full field-of-view (FFOV) scan, in which the x-ray beam span covers the whole section of the body, a large number of projections is necessary to reconstruct high-resolution images. However, excessive x-ray dose is a great concern in FFOV scans. Region-of-interest (ROI) scanning is a method to visualize the ROI in high resolution while reducing the x-ray dose. However, ROI scans suffer from bright-band artifacts which may hamper CT-number accuracy. In this study, we propose an image reconstruction method to eliminate the band artifacts in the ROI scan. In addition to the ROI scan with a high sampling rate in the view direction, we acquire FFOV projection data at a much lower sampling rate. Then, we reconstruct images in the compressed sensing (CS) framework with dual resolutions, that is, high resolution in the ROI and low resolution outside the ROI. For the dual-resolution image reconstruction, we implemented a dual-CS reconstruction algorithm in which data fidelity and total variation (TV) terms were enforced twice in the framework of adaptive steepest descent projection onto convex sets (ASD-POCS). The proposed method remarkably reduced the bright-band artifacts around the ROI boundary, and it also effectively suppressed the streak artifacts over the entire image. We expect the proposed method can be widely used for dual-resolution imaging, reducing radiation dose, artifacts and scan time.

  11. Multi-ray medical ultrasound simulation without explicit speckle modelling.

    PubMed

    Tuzer, Mert; Yazıcı, Abdulkadir; Türkay, Rüştü; Boyman, Michael; Acar, Burak

    2018-05-04

    To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as the input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. The proposed method uses a novel multi-ray image formation approach with a virtual phased-array transducer probe. A domain model is built from the input MR images. Multiple virtual acoustic rays emerge from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusion of the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to transducers are taken into account. A preliminary implementation using GPUs is presented. Preliminary results show that the multi-ray approach is capable of automatically generating viewpoint-dependent realistic US images with an inherent Rician-distributed speckle pattern. The proposed simulator can reproduce shadowing artefacts and demonstrates a frequency dependence apt for practical training purposes. We have also presented preliminary results towards the utilization of the method for real-time simulations. The proposed method offers a low-cost near-real-time wave-like simulation of realistic US images from input MR data. It can further be improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or manual embedding of virtual pathologies for training purposes.

  12. Improving the iterative Linear Interaction Energy approach using automated recognition of configurational transitions.

    PubMed

    Vosmeer, C Ruben; Kooi, Derk P; Capoferri, Luigi; Terpstra, Margreet M; Vermeulen, Nico P E; Geerke, Daan P

    2016-01-01

    Recently an iterative method was proposed to enhance the accuracy and efficiency of ligand-protein binding affinity prediction through linear interaction energy (LIE) theory. For ligand binding to flexible Cytochrome P450s (CYPs), this method was shown to decrease the root-mean-square error and standard deviation of error prediction by combining interaction energies of simulations starting from different conformations. Thereby, different parts of protein-ligand conformational space are sampled in parallel simulations. The iterative LIE framework relies on the assumption that separate simulations explore different local parts of phase space, and do not show transitions to other parts of configurational space that are already covered in parallel simulations. In this work, a method is proposed to (automatically) detect such transitions during the simulations that are performed to construct LIE models and to predict binding affinities. Using noise-canceling techniques and splines to fit time series of the raw data for the interaction energies, transitions during simulation between different parts of phase space are identified. Boolean selection criteria are then applied to determine which parts of the interaction energy trajectories are to be used as input for the LIE calculations. Here we show that this filtering approach benefits the predictive quality of our previous CYP 2D6-aryloxypropanolamine LIE model. In addition, an analysis is performed of the gain in computational efficiency that can be obtained from monitoring simulations using the proposed filtering method and by prematurely terminating simulations accordingly.
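
    The monitoring idea can be sketched as follows, with a smoothing spline standing in for the noise-canceling step and a simple jump test standing in for the authors' Boolean selection criteria; the threshold is an illustrative assumption.

      import numpy as np
      from scipy.interpolate import UnivariateSpline

      def flag_transitions(t, energy, jump_threshold):
          # Smooth the raw interaction-energy time series, then flag large
          # jumps in the smoothed signal as candidate phase-space transitions.
          smooth = UnivariateSpline(t, energy, s=len(t))(t)
          jumps = np.abs(np.diff(smooth))
          return np.where(jumps > jump_threshold)[0]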

  13. 75 FR 18776 - Regulated Navigation Area; Galveston Channel, TX

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... that the proposed regulated navigation area covers a small area and vessels are allowed to travel...-AA11 Regulated Navigation Area; Galveston Channel, TX AGENCY: Coast Guard, DHS. ACTION: Notice of proposed rulemaking. SUMMARY: The Coast Guard proposes to establish a regulated navigation area across the...

  14. Analysis of spatial distribution of land cover maps accuracy

    NASA Astrophysics Data System (ADS)

    Khatami, R.; Mountrakis, G.; Stehman, S. V.

    2017-12-01

    Land cover maps have become one of the most important products of remote sensing science. However, classification errors exist in any classified map and affect the reliability of subsequent map usage. Moreover, classification accuracy often varies over different regions of a classified map. These variations of accuracy affect the reliability of subsequent analyses of different regions based on the classified maps. The traditional approach to map accuracy assessment based on an error matrix does not capture the spatial variation in classification accuracy. Here, per-pixel accuracy prediction methods are proposed based on interpolating accuracy values from a test sample to produce wall-to-wall accuracy maps. Different accuracy prediction methods were developed based on four factors: predictive domain (spatial versus spectral), interpolation function (constant, linear, Gaussian, and logistic), incorporation of class information (interpolating each class separately versus grouping them together), and sample size. This research is the first to incorporate the spectral domain as an explanatory feature space for interpolating classification accuracy. Performance of the prediction methods was evaluated using 26 test blocks, with 10 km × 10 km dimensions, dispersed throughout the United States. The performance of the predictions was evaluated using the area under the curve (AUC) of the receiver operating characteristic. Relative to existing accuracy prediction methods, our proposed methods resulted in improvements in AUC of 0.15 or greater. Evaluation of the four factors comprising the accuracy prediction methods demonstrated that: i) interpolations should be done separately for each class instead of grouping all classes together; ii) if an all-classes approach is used, the spectral domain will result in substantially greater AUC than the spatial domain; iii) for the smaller sample size and per-class predictions, the spectral and spatial domains yielded similar AUC; iv) for the larger sample size (i.e., very dense spatial sample) and per-class predictions, the spatial domain yielded larger AUC; v) increasing the sample size improved accuracy predictions, with a greater benefit accruing to the spatial domain; and vi) the function used for interpolation had the smallest effect on AUC.
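
    One of the interpolation functions named above (Gaussian) admits a short sketch. The version below is a hedged simplification: it predicts per-pixel accuracy as a Gaussian-kernel weighted average of the binary correct/incorrect indicators at the test-sample locations, with the bandwidth as a free assumed parameter.

      import numpy as np

      def predict_accuracy(sample_xy, sample_correct, grid_xy, bandwidth):
          # sample_xy: (m, 2) sample coordinates; sample_correct: (m,) 0/1;
          # grid_xy: (k, 2) prediction locations. Returns values in [0, 1].
          d2 = ((grid_xy[:, None, :] - sample_xy[None, :, :]) ** 2).sum(-1)
          w = np.exp(-d2 / (2.0 * bandwidth ** 2))
          return (w * sample_correct).sum(axis=1) / w.sum(axis=1)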

  15. Interior tomography in microscopic CT with image reconstruction constrained by full field of view scan at low spatial resolution

    NASA Astrophysics Data System (ADS)

    Luo, Shouhua; Shen, Tao; Sun, Yi; Li, Jing; Li, Guang; Tang, Xiangyang

    2018-04-01

    In high resolution (microscopic) CT applications, the scan field of view should cover the entire specimen or sample to allow complete data acquisition and image reconstruction. However, truncation may occur in projection data and result in artifacts in reconstructed images. In this study, we propose a low resolution image constrained reconstruction algorithm (LRICR) for interior tomography in microscopic CT at high resolution. In general, multi-resolution acquisition based methods can be employed to solve the data truncation problem if the projection data acquired at low resolution are used to fill in the truncated projection data acquired at high resolution. However, most existing methods place quite strict restrictions on the data acquisition geometry, which greatly limits their utility in practice. In the proposed LRICR algorithm, full and partial data acquisitions (scans) at low and high resolutions, respectively, are carried out. Using the image reconstructed from sparse projection data acquired at low resolution as the prior, a microscopic image at high resolution is reconstructed from the truncated projection data acquired at high resolution. Two synthesized digital phantoms, a raw bamboo culm, and a specimen of mouse femur were used to evaluate and verify the performance of the proposed LRICR algorithm. Compared with the conventional TV minimization based algorithm and the multi-resolution scout-reconstruction algorithm, the proposed LRICR algorithm shows significant improvement in the reduction of artifacts caused by data truncation, providing a practical solution for high quality and reliable interior tomography in microscopic CT applications. The proposed LRICR algorithm outperforms both the multi-resolution scout-reconstruction method and TV minimization based reconstruction for interior tomography in microscopic CT.

  16. Development and validation of an integrated DNA walking strategy to detect GMO expressing cry genes.

    PubMed

    Fraiture, Marie-Alice; Vandamme, Julie; Herman, Philippe; Roosens, Nancy H C

    2018-06-27

    Recently, an integrated DNA walking strategy was proposed to prove the presence of GMOs via the characterisation of sequences of interest, including their transgene flanking regions and the unnatural associations of elements in their transgenic cassettes. To this end, the p35S, tNOS and t35S pCAMBIA elements were selected as key targets, allowing coverage of most GMOs, whether EU-authorized or not. In the present study, a bidirectional DNA walking method anchored on the CryAb/c genes is proposed with the aim of covering additional GMOs and additional sequences of interest. The performance of the proposed bidirectional DNA walking method anchored on the CryAb/c genes was first evaluated for feasibility using several GM events possessing these CryAb/c genes. Afterwards, its sensitivity was investigated at low target concentrations (as low as 20 HGE). In addition, to illustrate its applicability, the entire workflow was tested on a sample mimicking the food/feed matrices analysed in routine GMO analysis. Given the successful assessment of its performance, the present bidirectional DNA walking method anchored on the CryAb/c genes can easily be implemented in routine GMO analysis by enforcement laboratories and completes the overall DNA walking strategy by targeting an additional transgenic element frequently found in GMOs.

  17. Quantitative determination of salbutamol sulfate impurities using achiral supercritical fluid chromatography.

    PubMed

    Dispas, Amandine; Desfontaine, Vincent; Andri, Bertyl; Lebrun, Pierre; Kotoni, Dorina; Clarke, Adrian; Guillarme, Davy; Hubert, Philippe

    2017-02-05

    In recent years, supercritical fluid chromatography (SFC) has been widely acknowledged as a distinctive and high-performing technique in the field of separation sciences. Recent studies have highlighted the value of SFC for the quality control of pharmaceuticals, especially for the determination of the active pharmaceutical ingredient (API). Nevertheless, quality control also requires the determination of impurities. The objectives of the present work were (i) to demonstrate the interest of SFC as a reference technique for the determination of impurities in salbutamol sulfate API and (ii) to propose an alternative to a reference HPLC method from the European Pharmacopeia (EP) involving an ion-pairing reagent. Firstly, a screening was carried out to select the most adequate and selective stationary phase. Secondly, in the context of a robust optimization strategy, the method was developed using design space methodology. The separation of salbutamol sulfate and related impurities was achieved in 7 min, which is seven times faster than the LC-UV method proposed by the European Pharmacopeia (total run time of 50 min). Finally, full validation using the accuracy profile approach was successfully achieved for the determination of impurities B, D, F and G in salbutamol sulfate raw material. The validated dosing range covered 50 to 150% of the targeted concentration (corresponding to a 0.3% concentration level), and LODs close to 0.5 μg/mL were estimated. The SFC method proposed in this study could be presented as a suitable fast alternative to the EP LC method for the quantitative determination of salbutamol impurities. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Phytoremediation: Risk or benefit?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beath, J.M.; Allen, B.J.

    1999-07-01

    The proposed use of phytoremediation at an increasing variety of contaminated sites has resulted in concerns by regulating agencies that a successful removal of constituents from contaminated sludge or soil by plants might result in the unwanted transfer of these constituents to the food chain. As part of the basis for a corrective measures study for a potential remedy, this pathway may need to be evaluated. Different constituents of concern result in different transport issues. For volatile compounds, the evolution of gases from plants as part of evapotranspiration may be an issue. This paper discusses the risks associated with polycyclic aromatic hydrocarbons (PAHs) that are frequently present at hazardous waste surface impoundments for which phytoremediation may have attractive cost advantages over conventional closure methods. Central to an analysis of potential uptake effects is an evaluation of constituent transport, exposure pathway and toxicity. Methods by which each of these can be estimated are presented. Regulatory frameworks under which these evaluations may be performed at the state level are still evolving; in fact, Texas issued new proposed regulatory language pertaining to ecological risk as this paper was going to print. The attractiveness of phytoremediation in a RCRA setting is greater if a phytoremediation-based cover can be substituted for a traditional RCRA landfill cap. At the federal level some flexibility has now been provided, but it must be adopted by RCRA-delegated states to be useful. Alternatively, a demonstration that the phytoremediation-based cover somehow meets the RCRA closure design criteria for caps must be made. Work to make this kind of demonstration compelling is underway under the oversight of EPA.

  19. Enhanced encrypted reversible data hiding algorithm with minimum distortion through homomorphic encryption

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Rupali

    2018-03-01

    Reversible data hiding embeds a secret message in a cover image in such a manner that, upon extraction of the secret message, both the cover image and the secret message are recovered without error. The goal of most reversible data hiding algorithms is to improve the embedding rate and enhance the visual quality of the stego image. An improved encrypted-domain reversible data hiding algorithm that embeds two binary bits in each gray pixel of the original cover image with minimum distortion of stego-pixels is employed in this paper. Highlights of the proposed algorithm are minimum distortion of pixel values, elimination of the underflow and overflow problems, and equivalence of the stego image and cover image with a PSNR of ∞ (for the Lena, Goldhill, and Barbara images). The experimental outcomes reveal that, in terms of average PSNR and embedding rate, for natural images the proposed algorithm performs better than other conventional ones.
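
    The PSNR quoted above is the standard fidelity metric; a PSNR of ∞ corresponds to zero mean-squared error, i.e., a stego image identical to the cover. A minimal sketch:

      import numpy as np

      def psnr(cover, stego, peak=255.0):
          mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
          # Identical images give MSE = 0 and hence infinite PSNR.
          return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)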

  20. Revisit submergence of ice blocks in front of ice cover—an experimental study

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Wu, Yi-fan; Sui, Jueyi

    2018-04-01

    The present paper studies the stability of ice blocks in front of an ice cover based on experiments carried out in the laboratory using four types of ice blocks with different dimensions. The forces acting on the ice blocks in front of the ice cover are analyzed. The critical criteria for the entrainment of ice blocks in front of the ice cover are established by considering the drag force caused by the flowing water, the collision force, and the hydraulic pressure force. A formula for determining whether or not an ice block will be entrained under the ice cover is derived. All three dimensions of the ice block are considered in the proposed formula. The velocities calculated using the developed formula are compared with those calculated by formulas proposed by other researchers, as well as with the measured flow velocities for the entrainment of ice blocks in the laboratory. The fitted values obtained by using the derived formula agree well with the experimental results.

  1. [Comparison of molluscicidal effects of two snail control methods with plastic film covering in hilly regions].

    PubMed

    Zhou, Yun; Zhang, Biao; Wang, Zhi-Mei; Zhao, Jia-Huei; Mao, Shu; Xie, De-Bing; Mei, Zhi-Zhong; Zhang, Jun; Hong, Qing-Biao; Wang, Wei; Sun, Le-Ping

    2013-12-01

    To evaluate and compare the molluscicidal effects of colorless and black plastic film covering methods against Oncomelania hupensis snails in hilly regions. A hilly setting with high snail density was selected as the study area, and three groups, namely the colorless plastic film covering method, the black plastic film covering method, and a control, were designed. Snail surveys were conducted in each group 1, 3, 7, 15, and 30 days after plastic film covering, and the mortality of snails and the reduction of snail density were investigated. The air temperature, the soil surface temperature in the control group, and the soil surface temperature and the temperatures 5 cm and 15 cm under the soil within the film were recorded. The mortality rates of snails were 36.84%, 78.94%, 95.92%, 100.00% and 99.45% at 1, 3, 7, 15 and 30 days following colorless plastic film covering, respectively, and the snail density after 30 days of covering was reduced by 99.36% compared to that before covering, while the mortality rates of snails were 10.08%, 8.94%, 6.11%, 26.15% and 49.32% at 1, 3, 7, 15 and 30 days following black plastic film covering, respectively, and the snail density after 30 days of covering was reduced by 58.10% compared to that before covering. There were significant differences in the 1-, 3-, 7-, 15- and 30-day snail mortality rates between the colorless and black film covering groups (all P values < 0.01), and a significant difference was detected in the snail density between the two groups 30 days after film covering (P < 0.001). In addition, the speed, amplitude and duration of the rise in the soil surface temperature within the colorless film were all greater than those within the black film. The short-term molluscicidal effect of the colorless plastic film covering method is significantly superior to that of the black plastic film covering method in summer in hilly regions.

  2. Extracting Information about the Rotator Cuff from Magnetic Resonance Images Using Deterministic and Random Techniques

    PubMed Central

    De Los Ríos, F. A.; Paluszny, M.

    2015-01-01

    We consider some methods to extract information about the rotator cuff based on magnetic resonance images; the study aims to define an alternative method of display that might facilitate the detection of partial tears in the supraspinatus tendon. Specifically, we use families of ellipsoidal triangular patches to cover the humerus head near the affected area. These patches are textured and displayed with the information of the magnetic resonance images using the trilinear interpolation technique. For the generation of points to texture each patch, we propose a new method that guarantees the uniform distribution of its points using a random statistical method. Its computational cost, defined as the average computing time to generate a fixed number of points, is significantly lower than that of deterministic and other standard statistical techniques. PMID:25650281
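
    The authors' point-generation method on ellipsoidal patches is not reproduced here, but the standard square-root warping that yields uniformly distributed random points on a planar triangle illustrates the kind of uniform sampling the texturing step requires.

      import numpy as np

      def sample_triangle(a, b, c, n, rng=None):
          # a, b, c: vertex coordinate arrays; returns n uniform points.
          rng = np.random.default_rng() if rng is None else rng
          r1 = np.sqrt(rng.random(n))[:, None]   # sqrt warp keeps density uniform
          r2 = rng.random(n)[:, None]
          return (1 - r1) * a + r1 * (1 - r2) * b + r1 * r2 * c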

  3. A Meta-heuristic Approach for Variants of VRP in Terms of Generalized Saving Method

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki

    Global logistic design is becoming of keen interest for providing the essential infrastructure associated with modern societal provision. As examples, we can designate green and/or robust logistics in transportation systems, smart grids in electricity utilization systems, qualified service in delivery systems, and so on. As a key technology for such deployments, we have engaged in the practical vehicle routing problem on the basis of the conventional saving method. This paper extends that idea and gives a general framework available for various real-world applications. It can cover not only delivery problems but also two kinds of pick-up problems, i.e., straight and drop-by routings. Moreover, the multi-depot problem is considered by a hybrid approach with a graph algorithm, and its solution method is realized in a hierarchical manner. Numerical experiments were carried out to validate the effectiveness of the proposed method.
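
    The conventional saving that the generalized framework extends is the classic Clarke-Wright quantity: the distance saved by serving customers i and j on one route rather than two separate depot round trips. A minimal sketch, assuming a symmetric distance matrix with the depot at index 0:

      def savings(dist):
          # dist: (n+1) x (n+1) matrix; returns {(i, j): s_ij} sorted descending.
          n = len(dist) - 1
          s = {(i, j): dist[0][i] + dist[0][j] - dist[i][j]
               for i in range(1, n + 1) for j in range(i + 1, n + 1)}
          return dict(sorted(s.items(), key=lambda kv: kv[1], reverse=True))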

  4. Estimating the signal-to-noise ratio of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Curran, Paul J.; Dungan, Jennifer L.

    1988-01-01

    To make the best use of narrowband airborne visible/infrared imaging spectrometer (AVIRIS) data, an investigator needs to know the ratio of signal to random variability or noise (signal-to-noise ratio or SNR). The signal is land cover dependent and varies with both wavelength and atmospheric absorption; random noise comprises sensor noise and intrapixel variability (i.e., variability within a pixel). The three existing methods for estimating the SNR are inadequate, since typical laboratory methods inflate while dark current and image methods deflate the SNR. A new procedure is proposed called the geostatistical method. It is based on the removal of periodic noise by notch filtering in the frequency domain and the isolation of sensor noise and intrapixel variability using the semi-variogram. This procedure was applied easily and successfully to five sets of AVIRIS data from the 1987 flying season and could be applied to remotely sensed data from broadband sensors.
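
    The geostatistical step rests on the empirical semivariogram, whose extrapolated value at lag zero (the nugget) estimates the noise variance. The sketch below computes the one-dimensional empirical semivariogram of an image row; the notch filtering and nugget extrapolation are not reproduced.

      import numpy as np

      def semivariogram(row, max_lag):
          # gamma(h) = 0.5 * E[(z(x+h) - z(x))^2] for lags h = 1..max_lag.
          return [0.5 * np.mean((row[h:] - row[:-h]) ** 2)
                  for h in range(1, max_lag + 1)]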

  5. Regularization of the double period method for experimental data processing

    NASA Astrophysics Data System (ADS)

    Belov, A. A.; Kalitkin, N. N.

    2017-11-01

    In physical and technical applications, an important task is to process experimental curves measured with large errors. Such problems are solved by applying regularization methods, in which success depends on the mathematician's intuition. We propose an approximation based on the double period method developed for smooth nonperiodic functions. Tikhonov's stabilizer with a squared second derivative is used for regularization. As a result, the spurious oscillations are suppressed and the shape of an experimental curve is accurately represented. This approach offers a universal strategy for solving a broad class of problems. The method is illustrated by approximating cross sections of nuclear reactions important for controlled thermonuclear fusion. Tables recommended as reference data are obtained. These results are used to calculate the reaction rates, which are approximated in a way convenient for gasdynamic codes. These approximations are superior to previously known formulas in the covered temperature range and accuracy.
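
    The stabilizer named above has a direct discrete form: minimize ||f - y||^2 + lambda ||D2 f||^2, where D2 is the discrete second-derivative operator, which reduces to one linear solve. A minimal sketch, with the regularization weight as an assumed free parameter:

      import numpy as np

      def tikhonov_smooth(y, lam):
          n = len(y)
          D2 = np.zeros((n - 2, n))
          for i in range(n - 2):
              D2[i, i:i + 3] = [1.0, -2.0, 1.0]   # discrete second derivative
          # Normal equations of ||f - y||^2 + lam * ||D2 f||^2.
          return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)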

  6. I-SonReb: an improved NDT method to evaluate the in situ strength of carbonated concrete

    NASA Astrophysics Data System (ADS)

    Breccolotti, Marco; Bonfigli, Massimo F.

    2015-10-01

    Concrete strength evaluated in situ by means of the conventional SonReb method can be highly overestimated in the presence of carbonation, which is responsible for the physical and chemical alteration of the outer layer of concrete. As most existing concrete structures are subject to carbonation, overcoming this problem is of high importance. In this paper, an Improved SonReb method (I-SonReb) for carbonated concretes is proposed. It relies on the definition of a correction coefficient for the measured Rebound index as a function of the carbonated concrete cover thickness, an additional parameter to be measured during in situ testing campaigns. The usefulness of the method has been validated by showing the improvement in the accuracy of concrete strength estimation on two sets of NDT experimental data collected from investigations on real structures.

  7. Proposed Unit Level Ozone Season NOx Allowance Allocations to Existing Units in Six States: Supplemental Proposed Rule TSD

    EPA Pesticide Factsheets

    This Technical Support Document (TSD) presents the proposed unit-level allocations based on the existing-unit portion of each state’s ozone season NOx emission budget to covered existing units in Iowa, Kansas, Michigan, Missouri, Oklahoma, and Wisconsin.

  8. The Effect of the Paideia Proposal on Dance.

    ERIC Educational Resources Information Center

    McCutcheon, Gene

    1984-01-01

    Notes recent growth of dance in America and lack of dance education in public schools. Comments on effects Paideia Proposal adoption would have on dance in schools. Discusses proposal effects on teacher education, particularly concerning dance. Covers citizen interest in dance and the future of the dance profession. (MH)

  9. 7 CFR 3405.11 - Content of a proposal.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION...) The title of the project shown on the “Higher Education Proposal Cover Page” must be brief (80... institution, the targeted need area(s), and the title of the proposal must be identified exactly as shown on...

  10. Optimal subinterval selection approach for power system transient stability simulation

    DOE PAGES

    Kim, Soobae; Overbye, Thomas J.

    2015-10-21

    Power system transient stability analysis requires an appropriate integration time step to avoid numerical instability as well as to reduce computational demands. For fast system dynamics, which vary more rapidly than what the time step covers, a fraction of the time step, called a subinterval, is used. However, the optimal value of this subinterval is not easily determined because an analysis of the system dynamics might be required. This selection is usually made from engineering experience, and perhaps trial and error. This paper proposes an optimal subinterval selection approach for power system transient stability analysis, which is based on modal analysis using a single machine infinite bus (SMIB) system. Fast system dynamics are identified with the modal analysis, and the SMIB system is used focusing on fast local modes. An appropriate subinterval time step from the proposed approach can reduce computational burden and achieve accurate simulation responses as well. As a result, the performance of the proposed method is demonstrated with the GSO 37-bus system.
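
    As a hedged illustration of why fast local modes set the subinterval, the linearized SMIB swing equation gives the mode frequency in closed form. The parameter values and the 20-steps-per-period rule below are assumptions for illustration, not the paper's selection criterion.

```python
import numpy as np

def smib_swing_frequency(H, Pmax, delta0_deg, f_sys=60.0):
    """Linearized swing-mode frequency (Hz) of a single machine infinite bus:
    (2H / omega_s) * ddelta'' = -Pmax * cos(delta0) * ddelta (per unit)."""
    omega_s = 2 * np.pi * f_sys
    Ps = Pmax * np.cos(np.radians(delta0_deg))   # synchronizing power coeff.
    omega_n = np.sqrt(omega_s * Ps / (2 * H))
    return omega_n / (2 * np.pi)

f_mode = smib_swing_frequency(H=3.0, Pmax=2.0, delta0_deg=30.0)
subinterval = 1.0 / (f_mode * 20)   # e.g., ~20 integration steps per period
```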

  11. Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera

    NASA Astrophysics Data System (ADS)

    Dziri, Aziz; Duranton, Marc; Chapuis, Roland

    2016-07-01

    Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.

  12. How Spatial Heterogeneity of Cover Affects Patterns of Shrub Encroachment into Mesic Grasslands

    PubMed Central

    Montané, Francesc; Casals, Pere; Dale, Mark R. T.

    2011-01-01

    We used a multi-method approach to analyze the spatial patterns of shrubs and cover types (plant species, litter or bare soil) in grassland-shrubland ecotones. This approach allows us to assess how fine-scale spatial heterogeneity of cover types affects the patterns of Cytisus balansae shrub encroachment into mesic mountain grasslands (Catalan Pyrenees, Spain). Spatial patterns and the spatial associations between juvenile shrubs and different cover types were assessed in mesic grasslands dominated by species with different palatabilities (palatable grass Festuca nigrescens and unpalatable grass Festuca eskia). A new index, called RISES (“Relative Index of Shrub Encroachment Susceptibility”), was proposed to calculate the chances of shrub encroachment into a given grassland, combining the magnitude of the spatial associations and the surface area of each cover type. Overall, juveniles showed positive associations with palatable F. nigrescens and negative associations with unpalatable F. eskia, although these associations shifted with shrub development stage. In F. eskia grasslands, bare soil showed a low scale of pattern and positive associations with juveniles. Although the highest RISES values were found in F. nigrescens plots, the number of juvenile Cytisus was similar in both types of grasslands. However, F. nigrescens grasslands showed the greatest number of juveniles in the early development stage (i.e., height<10 cm), whereas F. eskia grasslands showed the greatest number of juveniles in late development stages (i.e., height>30 cm). We concluded that in F. eskia grasslands, where establishment may be constrained by the dominant cover type, the low scale of pattern of bare soil may result in higher chances of shrub establishment and survival. In contrast, although grasslands dominated by the palatable F. nigrescens may be more susceptible to shrub establishment, current grazing rates may reduce juvenile survival. PMID:22174858

  13. A decadal observation of vegetation dynamics using multi-resolution satellite images

    NASA Astrophysics Data System (ADS)

    Chiang, Yang-Sheng; Chen, Kun-Shan; Chu, Chang-Jen

    2012-10-01

    Vegetation cover not only affects the habitability of the earth, but also provides a potential terrestrial mechanism for the mitigation of greenhouse gases. This study aims at quantifying such green resources by incorporating multi-resolution satellite images from different platforms, including Formosat-2 (RSI), SPOT (HRV/HRG), and Terra (MODIS), to investigate vegetation fractional cover (VFC) and its inter-/intra-annual variation in Taiwan. Given the different sensor capabilities in terms of spatial coverage and resolution, fusion of NDVIs at different scales was used to determine the fraction of vegetation cover. Field campaigns were conducted on a monthly basis for 6 years to calibrate the critical NDVI threshold for the presence of vegetation cover, with test sites covering the IPCC-defined land cover types of Taiwan. Based on the proposed method, we analyzed spatio-temporal changes of VFC for the entire Taiwan Island. A bimodal sequence of VFC was observed for intra-annual variation based on MODIS data, with a variation of around 5% and two peaks in spring and autumn marking the principal dual-cropping agriculture pattern in southwestern Taiwan. Compared to this anthropogenic variation, the inter-annual VFC (Aug.-Oct.) derived from HRV/HRG/RSI reveals moderate variations (3%) whose oscillations were strongly linked with the regional climate pattern and major disturbances resulting from extreme weather events. Two distinct cycles (2002-2005 and 2005-2009) were identified in the decadal observations, with VFC peaks at 87.60% and 88.12% in 2003 and 2006, respectively. This time-series mapping of VFC can be used to examine vegetation dynamics and the responses associated with short-term and long-term anthropogenic/natural events.
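
    The calibrated NDVI threshold itself is not given in the record. A minimal sketch of two usual ways of turning NDVI into fractional vegetation cover follows; the 0.3 threshold and the dimidiate endmember values are placeholders, not the study's calibrated numbers.

```python
import numpy as np

def vegetation_fraction(ndvi, ndvi_soil, ndvi_veg):
    """Dimidiate pixel model: each pixel is a linear mix of bare soil and
    fully vegetated endmembers."""
    vfc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return np.clip(vfc, 0.0, 1.0)

# Alternatively, with a calibrated presence threshold, regional VFC is the
# share of pixels whose NDVI exceeds the threshold:
ndvi = np.random.default_rng(2).uniform(-0.1, 0.9, (512, 512))  # stand-in
vfc_per_pixel = vegetation_fraction(ndvi, ndvi_soil=0.05, ndvi_veg=0.85)
vfc_region = float(np.mean(ndvi > 0.3))   # 0.3 is a hypothetical threshold
```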

  14. A Modified Relative Spectral Mixture Analysis to Extract the Fractions of Major Land Cover Components

    NASA Astrophysics Data System (ADS)

    Jia, S.

    2015-12-01

    As an effective method of extracting land cover fractions based on spectral endmembers, spectral mixture analysis (SMA) has been applied using remotely sensed imagery in different spatial, temporal, and spectral resolutions. A number of studies focusing on arid/semiarid ecosystems have used SMA to obtain the land cover fractions of green vegetation (GV), non-photosynthetic vegetation (NPV)/litter, and bare soil (BS) using MODIS reflectance products to understand ecosystem phenology, track vegetation dynamics, and evaluate the impact of major disturbances. However, several challenges remain in the application of SMA in studying ecosystem phenology, including obtaining high quality endmembers and increasing computational efficiency when considering long time series that cover a broad spatial extent. Okin (2007) proposes a variation of SMA, named relative spectral mixture analysis (RSMA), to address the latter challenge by calculating the relative change of the fractions of GV, NPV/litter, and BS compared with a baseline date. This approach assumes that the baseline image contains the spectral information of the bare soil that can be used as an endmember for spectral mixture analysis, though it is mixed with the spectral reflectance of other non-soil land cover types. Using the baseline image, one can obtain the change of fractions of GV, NPV/litter, BS, and snow compared with the baseline image. However, RSMA results depend on the selection of the baseline date and the fractional components during this date. In this study, we modified the strategy of implementing RSMA by introducing a step of obtaining a soil map as the baseline image using multiple-endmember SMA (MESMA) before applying RSMA. The fractions of land cover components from this modified RSMA are also validated using field observations from two study areas in the semiarid savanna and grassland of Queensland, Australia.
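
    RSMA differs from standard SMA in working against a baseline image; as background, a minimal fully constrained unmixing sketch using nonnegative least squares is shown below. The three-band endmember spectra are hypothetical, and the sum-to-one constraint is enforced softly by an appended weighted row.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(pixel, endmembers):
    """Solve pixel ~= E @ f for nonnegative fractions f, with sum-to-one
    enforced softly by appending a heavily weighted row of ones."""
    E = np.vstack([endmembers.T, 100.0 * np.ones(endmembers.shape[0])])
    y = np.append(pixel, 100.0)
    f, _ = nnls(E, y)
    return f

# Hypothetical 3-band endmember spectra for GV, NPV/litter and bare soil
E = np.array([[0.05, 0.45, 0.30],    # GV
              [0.20, 0.30, 0.40],    # NPV/litter
              [0.25, 0.30, 0.35]])   # bare soil
fractions = unmix(np.array([0.15, 0.35, 0.35]), E)
```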

  15. Strong contributions of local background climate to the cooling effect of urban green vegetation.

    PubMed

    Yu, Zhaowu; Xu, Shaobin; Zhang, Yuhan; Jørgensen, Gertrud; Vejre, Henrik

    2018-05-01

    Utilization of urban green vegetation (UGV) has been recognized as a promising option to mitigate the urban heat island (UHI) effect. However, we still lack understanding of the contributions of local background climate to the cooling effect of UGV. Here we proposed and employed a cooling effect framework and selected eight typical cities located in the Temperate Monsoon Climate (TMC) and Mediterranean Climate (MC) zones to demonstrate that local climate conditions largely affect the cooling effect of UGV. Specifically, we found that increasing (artificial) rainfall and irrigation contribute to improving the cooling intensity of grassland in both climates, particularly in hot-dry environments. High relative humidity would restrict the cooling effect of UGV. Increasing wind speed would significantly enhance the cooling effect of tree-covered UGVs while weakening that of grass-covered UGVs in MC cities. We also identified that, in order to achieve the most effective cooling with the smallest tree-covered UGV, the area of trees in cities of both climate zones should generally be planned at around 0.5 ha. The method and results enhance understanding of the cooling effect of UGVs at larger (climate) scales and provide important insights for UGV planning and management.

  16. Incomplete Sparse Approximate Inverses for Parallel Preconditioning

    DOE PAGES

    Anzt, Hartwig; Huckle, Thomas K.; Bräckle, Jürgen; ...

    2017-10-28

    In this study, we propose a new preconditioning method that can be seen as a generalization of block-Jacobi methods, or as a simplification of the sparse approximate inverse (SAI) preconditioners. The “Incomplete Sparse Approximate Inverses” (ISAI) is particularly efficient in the solution of sparse triangular linear systems of equations. Those arise, for example, in the context of incomplete factorization preconditioning. ISAI preconditioners can be generated via an algorithm providing fine-grained parallelism, which makes them attractive for hardware with a high concurrency level. Finally, in a study covering a large number of matrices, we identify the ISAI preconditioner as an attractive alternative to exact triangular solves in the context of incomplete factorization preconditioning.
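
    ISAI generalizes block-Jacobi by allowing an arbitrary sparsity pattern for the approximate inverse. The sketch below shows only the block-Jacobi special case that the paper starts from; the block size and test matrix are assumptions. Each block inversion is independent, which is what makes the approach parallel-friendly.

```python
import numpy as np
import scipy.sparse as sp

def block_jacobi_inverse(A, block_size):
    """Explicit block-Jacobi approximate inverse: invert each diagonal block
    independently (embarrassingly parallel across blocks)."""
    n = A.shape[0]
    M = sp.lil_matrix((n, n))
    for start in range(0, n, block_size):
        end = min(start + block_size, n)
        block = A[start:end, start:end].toarray()
        M[start:end, start:end] = np.linalg.inv(block)
    return M.tocsr()

A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(12, 12), format="csr")
M = block_jacobi_inverse(A, block_size=4)   # apply as x -> M @ x
```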

  17. Aeromagnetic Survey in Afghanistan: A Website for Distribution of Data

    USGS Publications Warehouse

    Abraham, Jared D.; Anderson, Eric D.; Drenth, Benjamin J.; Finn, Carol A.; Kucks, Robert P.; Lindsay, Charles R.; Phillips, Jeffrey D.; Sweeney, Ronald E.

    2007-01-01

    Afghanistan's geologic setting indicates significant natural resource potential. While important mineral deposits and petroleum resources have been identified, much of the country's potential remains unknown. Airborne geophysical surveys are a well-accepted and cost-effective method for obtaining information on the geological setting of an area without the need to be physically located on the ground. Due to the security situation and the large areas of the country that have not been covered by geophysical exploration methods, a regional airborne geophysical survey was proposed. Acting upon the request of the Islamic Republic of Afghanistan Ministry of Mines, the U.S. Geological Survey contracted with the Naval Research Laboratory to jointly conduct an airborne geophysical and remote sensing survey of Afghanistan.

  18. Built-up index methods and their applications for urban extraction from Sentinel 2A satellite data: discussion.

    PubMed

    Valdiviezo-N, Juan C; Téllez-Quiñones, Alejandro; Salazar-Garibay, Adan; López-Caloca, Alejandra A

    2018-01-01

    Several built-up indices have been proposed in the literature in order to extract the urban sprawl from satellite data. Given their relative simplicity and easy implementation, such methods have been widely adopted for urban growth monitoring. Previous research has shown that built-up indices are sensitive to different factors related to image resolution, seasonality, and study area location. Also, most of them confuse urban surfaces with bare soil and barren land covers. By gathering the existing built-up indices, the aim of this paper is to discuss some of their advantages, difficulties, and limitations. In order to illustrate our study, we provide some application examples using Sentinel 2A data.
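
    As one concrete example of the indices the paper surveys, the Normalized Difference Built-up Index (NDBI) exploits the higher SWIR-than-NIR reflectance of built-up surfaces. The Sentinel-2 band pairing and the zero threshold below are common choices, not prescriptions from the paper, and the arrays are synthetic stand-ins.

```python
import numpy as np

def ndbi(swir, nir):
    """Normalized Difference Built-up Index: built-up surfaces tend to
    reflect more in SWIR than NIR, so NDBI > 0 is a common (if soil-prone)
    urban cue."""
    return (swir - nir) / (swir + nir + 1e-12)

# With Sentinel-2, SWIR is band 11 and NIR is band 8 (resampled to 20 m)
rng = np.random.default_rng(3)
swir = rng.uniform(0.1, 0.4, (256, 256))
nir = rng.uniform(0.1, 0.4, (256, 256))
built_up_mask = ndbi(swir, nir) > 0.0   # threshold choice varies by scene
```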

  19. Evaluating digital libraries in the health sector. Part 2: measuring impacts and outcomes.

    PubMed

    Cullen, Rowena

    2004-03-01

    This is the second part of a two-part paper which explores methods that can be used to evaluate digital libraries in the health sector. Part 1 focuses on approaches to evaluation that have been proposed for mainstream digital information services. This paper investigates evaluative models developed for some innovative digital library projects, and some major national and international electronic health information projects. The value of ethnographic methods to provide qualitative data to explore outcomes, adding to quantitative approaches based on inputs and outputs is discussed. The paper concludes that new 'post-positivist' models of evaluation are needed to cover all the dimensions of the digital library in the health sector, and some ways of doing this are outlined.

  20. Using HST: From proposal to science

    NASA Technical Reports Server (NTRS)

    Shames, P.

    1991-01-01

    The following subject areas are covered: a short history; uses of the STScI network (general communication, science collaboration, functional activities, internal data management, and external data access); proposal/observation handling; DMF access; and future uses and requirements.

  1. NASA/ARC proposed training in intelligent control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1990-01-01

    Viewgraphs on NASA Ames Research Center's proposed training in intelligent control are presented. Topics covered include: fuzzy logic control; neural networks in control; artificial intelligence in control; hybrid approaches; hands-on experience; and fuzzy controllers.

  2. Text String Detection from Natural Scenes by Structure-based Partition and Grouping

    PubMed Central

    Yi, Chucai; Tian, YingLi

    2012-01-01

    Text information in natural scene images serves as important clues for many image-based applications such as scene understanding, content-based image retrieval, assistive navigation, and automatic geocoding. However, locating text from complex background with multiple colors is a challenging task. In this paper, we explore a new framework to detect text strings with arbitrary orientations in complex natural scene images. Our proposed framework of text string detection consists of two steps: 1) Image partition to find text character candidates based on local gradient features and color uniformity of character components. 2) Character candidate grouping to detect text strings based on joint structural features of text characters in each text string such as character size differences, distances between neighboring characters, and character alignment. By assuming that a text string has at least three characters, we propose two algorithms of text string detection: 1) adjacent character grouping method, and 2) text line grouping method. The adjacent character grouping method calculates the sibling groups of each character candidate as string segments and then merges the intersecting sibling groups into text string. The text line grouping method performs Hough transform to fit text line among the centroids of text candidates. Each fitted text line describes the orientation of a potential text string. The detected text string is presented by a rectangle region covering all characters whose centroids are cascaded in its text line. To improve efficiency and accuracy, our algorithms are carried out in multi-scales. The proposed methods outperform the state-of-the-art results on the public Robust Reading Dataset which contains text only in horizontal orientation. Furthermore, the effectiveness of our methods to detect text strings with arbitrary orientations is evaluated on the Oriented Scene Text Dataset collected by ourselves containing text strings in non-horizontal orientations. PMID:21411405
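
    The text line grouping step lends itself to a compact sketch: a point-wise Hough transform over character centroids, with the at-least-three-characters rule as the vote threshold. The bin sizes and toy centroids are assumptions; the authors' implementation may differ in detail.

```python
import numpy as np

def hough_group_centroids(centroids, n_theta=180, rho_res=5.0, min_votes=3):
    """Group character centroids into candidate text lines: each centroid
    votes for all (rho, theta) lines through it; cells collecting at least
    min_votes centroids (>= 3 characters) become text-line hypotheses."""
    pts = np.asarray(centroids, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho = pts[:, 0:1] * np.cos(thetas) + pts[:, 1:2] * np.sin(thetas)
    rho_idx = np.round(rho / rho_res).astype(int)
    lines = {}
    for p, row in enumerate(rho_idx):
        for t, r in enumerate(row):
            lines.setdefault((r, t), []).append(p)
    return [(r * rho_res, thetas[t], members)
            for (r, t), members in lines.items() if len(members) >= min_votes]

# Three near-collinear centroids form a line; the fourth is an outlier
groups = hough_group_centroids([(10, 10), (30, 12), (50, 14), (90, 80)])
```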

  3. Text string detection from natural scenes by structure-based partition and grouping.

    PubMed

    Yi, Chucai; Tian, YingLi

    2011-09-01

    Text information in natural scene images serves as important clues for many image-based applications such as scene understanding, content-based image retrieval, assistive navigation, and automatic geocoding. However, locating text from a complex background with multiple colors is a challenging task. In this paper, we explore a new framework to detect text strings with arbitrary orientations in complex natural scene images. Our proposed framework of text string detection consists of two steps: 1) image partition to find text character candidates based on local gradient features and color uniformity of character components and 2) character candidate grouping to detect text strings based on joint structural features of text characters in each text string such as character size differences, distances between neighboring characters, and character alignment. By assuming that a text string has at least three characters, we propose two algorithms of text string detection: 1) adjacent character grouping method and 2) text line grouping method. The adjacent character grouping method calculates the sibling groups of each character candidate as string segments and then merges the intersecting sibling groups into text string. The text line grouping method performs Hough transform to fit text line among the centroids of text candidates. Each fitted text line describes the orientation of a potential text string. The detected text string is presented by a rectangle region covering all characters whose centroids are cascaded in its text line. To improve efficiency and accuracy, our algorithms are carried out in multi-scales. The proposed methods outperform the state-of-the-art results on the public Robust Reading Dataset, which contains text only in horizontal orientation. Furthermore, the effectiveness of our methods to detect text strings with arbitrary orientations is evaluated on the Oriented Scene Text Dataset collected by ourselves containing text strings in nonhorizontal orientations.

  4. 78 FR 63470 - Agency Information Collection Activities: Proposed Information Collection; Submission for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-24

    ... information collection titled, ``Annual Stress Test Reporting Template and Documentation for Covered Banks.... Email: [email protected] . Include ``Annual Stress Test Reporting Template and Documentation'' on the... information collection: Annual Stress Test Reporting Template and Documentation for Covered Banks With Total...

  5. 77 FR 66663 - Agency Information Collection Activities: Proposed Information Collection; Submission for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... titled, ``Company-Run Annual Stress Test Reporting Template and Documentation for Covered Institutions... review and clearance. Company-Run Annual Stress Test Reporting Template and Documentation for Covered... savings associations, to conduct annual stress tests \\2\\ and requires the primary financial regulatory...

  6. Easy and Fast Reconstruction of a 3D Avatar with an RGB-D Sensor.

    PubMed

    Mao, Aihua; Zhang, Hong; Liu, Yuxin; Zheng, Yinglong; Li, Guiqing; Han, Guoqiang

    2017-05-12

    This paper proposes a new easy and fast 3D avatar reconstruction method using an RGB-D sensor. Users can easily implement human body scanning and modeling just with a personal computer and a single RGB-D sensor such as a Microsoft Kinect within a small workspace in their home or office. To make the reconstruction of 3D avatars easy and fast, a new data capture strategy is proposed for efficient human body scanning, which captures only 18 frames from six views with a close scanning distance to fully cover the body; meanwhile, efficient alignment algorithms are presented to locally align the data frames in the single view and then globally align them in multi-views based on pairwise correspondence. In this method, we do not adopt shape priors or subdivision tools to synthesize the model, which helps to reduce modeling complexity. Experimental results indicate that this method can obtain accurate reconstructed 3D avatar models, and the running performance is faster than that of similar work. This research offers a useful tool for the manufacturers to quickly and economically create 3D avatars for products design, entertainment and online shopping.
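
    The record does not publish the alignment algorithms themselves; the standard correspondence-based rigid registration step that such pipelines typically build on (Kabsch/SVD) is sketched below under that assumption, with synthetic correspondences.

```python
import numpy as np

def kabsch_align(src, dst):
    """Best rigid transform (R, t) mapping src onto dst, where the two point
    sets are already in pairwise correspondence (least squares, via SVD)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

src = np.random.default_rng(5).normal(size=(18, 3))
angle = np.radians(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([0.1, 0.2, 0.3])
R, t = kabsch_align(src, dst)   # recovers R_true and the translation
```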

  7. Inference from Samples of DNA Sequences Using a Two-Locus Model

    PubMed Central

    Griffiths, Robert C.

    2011-01-01

    Performing inference on contemporary samples of DNA sequence data is an important and challenging task. Computationally intensive methods such as importance sampling (IS) are attractive because they make full use of the available data, but in the presence of recombination the large state space of genealogies can be prohibitive. In this article, we make progress by developing an efficient IS proposal distribution for a two-locus model of sequence data. We show that the proposal developed here leads to much greater efficiency, outperforming existing IS methods that could be adapted to this model. Among several possible applications, the algorithm can be used to find maximum likelihood estimates for mutation and crossover rates, and to perform ancestral inference. We illustrate the method on previously reported sequence data covering two loci either side of the well-studied TAP2 recombination hotspot. The two loci are themselves largely non-recombining, so we obtain a gene tree at each locus and are able to infer in detail the effect of the hotspot on their joint ancestry. We summarize this joint ancestry by introducing the gene graph, a summary of the well-known ancestral recombination graph. PMID:21210733
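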

  8. Study of the motion artefacts of skin-mounted inertial sensors under different attachment conditions.

    PubMed

    Forner-Cordero, A; Mateu-Arce, M; Forner-Cordero, I; Alcántara, E; Moreno, J C; Pons, J L

    2008-04-01

    A common problem shared by accelerometers, inertial sensors and any motion measurement method based on skin-mounted sensors is the movement of the soft tissues covering the bones. The aim of this work is to propose a method for the validation of the attachment of skin-mounted sensors. A second-order (mass-spring-damper) model was proposed to characterize the behaviour of the soft tissue between the bone and the sensor. Three sets of experiments were performed. In the first one, different procedures to excite the system were evaluated in order to select an adequate excitation stimulus. In the second one, the selected stimulus was applied under varying attachment conditions, while the third experiment was used to test the model. The heel drop was chosen as the excitation method because it showed lower variability and could discriminate between different attachment conditions. In agreement with the model, the natural frequency of the system tended to increase with decreasing accelerometer mass. An important result is the development of a standard procedure to test the bandwidth of skin-mounted inertial sensors, such as accelerometers mounted on the skin or markers heavier than a few grams.
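
    The second-order model quoted above has closed-form modal quantities; a minimal sketch follows, with hypothetical mass, stiffness, and damping values. Halving the sensor mass raises the natural frequency by a factor of sqrt(2), consistent with the reported trend.

```python
import numpy as np

def sensor_mount_modes(m, k, c):
    """Natural and damped frequencies (Hz) plus damping ratio of the
    second-order skin-mount model m*x'' + c*x' + k*x = F."""
    omega_n = np.sqrt(k / m)
    zeta = c / (2.0 * np.sqrt(k * m))
    omega_d = omega_n * np.sqrt(max(1.0 - zeta**2, 0.0))
    return omega_n / (2 * np.pi), omega_d / (2 * np.pi), zeta

# Hypothetical values: 10 g sensor on soft tissue
f_n, f_d, zeta = sensor_mount_modes(m=0.01, k=4000.0, c=2.0)
```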

  9. Salient object detection based on discriminative boundary and multiple cues integration

    NASA Astrophysics Data System (ADS)

    Jiang, Qingzhu; Wu, Zemin; Tian, Chang; Liu, Tao; Zeng, Mingyong; Hu, Lei

    2016-01-01

    In recent years, many saliency models have achieved good performance by taking the image boundary as the background prior. However, if all boundaries of an image are equally and artificially selected as background, misjudgment may happen when the object touches the boundary. We propose an algorithm called weighted contrast optimization based on discriminative boundary (wCODB). First, a background estimation model is reliably constructed through discriminating each boundary via the Hausdorff distance. Second, the background-only weighted contrast is improved by fore-background weighted contrast, which is optimized through a weight-adjustable optimization framework. Then, to objectively estimate the quality of a saliency map, a simple but effective metric called the spatial distribution of saliency map and mean saliency in covered window ratio (MSR) is designed. Finally, in order to further promote the detection result using MSR as the weight, we propose a saliency fusion framework to integrate three other cues (uniqueness, distribution, and coherence) from three representative methods into our wCODB model. Extensive experiments on six public datasets demonstrate that our wCODB performs favorably against most boundary-based methods, and the integrated result outperforms all state-of-the-art methods.

  10. Multifractal analysis of mobile social networks

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Zhang, Zifeng; Deng, Yufan

    2017-09-01

    As Wireless Fidelity (Wi-Fi)-enabled handheld devices have become widely used, mobile social networks (MSNs) have been attracting extensive attention. Fractal approaches have also been widely applied to characterize natural networks, as useful tools to depict their spatial distribution and scaling properties. Moreover, when the complexity of the spatial distribution of MSNs cannot be properly characterized by a single fractal dimension, multifractal analysis is required. For further research, we introduced a multifractal analysis method based on a box-covering algorithm to describe the structure of MSNs. Using this method, we find that the networks are multifractal at different time intervals. The simulation results demonstrate that the proposed method is efficient for analyzing the multifractal characteristics of MSNs, providing a distribution of singularities that adequately describes both the heterogeneity of fractal patterns and the statistics of measurements across spatial scales in MSNs.
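
    The record does not specify which box-covering variant was used; a common greedy covering on hop distance is sketched below, which yields the box counts N_B(r) that (multi)fractal scaling fits are based on. The partition function over box masses used for the full multifractal spectrum is not shown.

```python
import networkx as nx

def greedy_box_covering(G, box_radius):
    """Greedy box-covering: repeatedly seed a box at an uncovered node and
    absorb all uncovered nodes within box_radius hops."""
    uncovered = set(G.nodes)
    boxes = []
    while uncovered:
        seed = next(iter(uncovered))
        reachable = nx.single_source_shortest_path_length(
            G, seed, cutoff=box_radius)
        box = {v for v in reachable if v in uncovered}
        boxes.append(box)
        uncovered -= box
    return boxes

G = nx.barabasi_albert_graph(200, 2, seed=0)   # stand-in network
for r in (1, 2, 3):
    print(r, len(greedy_box_covering(G, r)))   # N_B(r) for the scaling fit
```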

  11. National-scale cropland mapping based on spectral-temporal features and outdated land cover information.

    PubMed

    Waldner, François; Hansen, Matthew C; Potapov, Peter V; Löw, Fabian; Newby, Terence; Ferreira, Stefanus; Defourny, Pierre

    2017-01-01

    The lack of sufficient ground truth data has always constrained supervised learning, thereby hindering the generation of up-to-date satellite-derived thematic maps. This is all the more true for those applications requiring frequent updates over large areas such as cropland mapping. Therefore, we present a method enabling the automated production of spatially consistent cropland maps at the national scale, based on spectral-temporal features and outdated land cover information. Following an unsupervised approach, this method extracts reliable calibration pixels based on their labels in the outdated map and their spectral signatures. To ensure spatial consistency and coherence in the map, we first propose to generate seamless input images by normalizing the time series and deriving spectral-temporal features that target salient cropland characteristics. Second, we reduce the spatial variability of the class signatures by stratifying the country and by classifying each stratum independently. Finally, we remove speckle with a weighted majority filter accounting for per-pixel classification confidence. Capitalizing on a wall-to-wall validation data set, the method was tested in South Africa using a 16-year old land cover map and multi-sensor Landsat time series. The overall accuracy of the resulting cropland map reached 92%. A spatially explicit validation revealed large variations across the country and suggests that intensive grain-growing areas were better characterized than smallholder farming systems. Informative features in the classification process vary from one stratum to another but features targeting the minimum of vegetation as well as short-wave infrared features were consistently important throughout the country. Overall, the approach showed potential for routinely delivering consistent cropland maps over large areas as required for operational crop monitoring.
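
    The speckle-removal step described above can be sketched compactly: a majority filter in which each neighbor votes with its per-pixel classification confidence. The neighborhood size and tie handling below are assumptions; the paper's exact weighting may differ.

```python
import numpy as np

def weighted_majority_filter(labels, confidence, radius=1):
    """Relabel each pixel with the class holding the largest summed
    confidence in its (2*radius+1)^2 neighborhood."""
    h, w = labels.shape
    pad_l = np.pad(labels, radius, mode="edge")
    pad_c = np.pad(confidence, radius, mode="edge")
    out = np.empty_like(labels)
    classes = np.unique(labels)
    for i in range(h):
        for j in range(w):
            win_l = pad_l[i:i + 2*radius + 1, j:j + 2*radius + 1]
            win_c = pad_c[i:i + 2*radius + 1, j:j + 2*radius + 1]
            votes = [win_c[win_l == cls].sum() for cls in classes]
            out[i, j] = classes[int(np.argmax(votes))]
    return out

rng = np.random.default_rng(6)
labels = rng.integers(0, 2, (32, 32))           # stand-in class map
conf = rng.uniform(0.5, 1.0, (32, 32))          # stand-in confidences
clean = weighted_majority_filter(labels, conf)
```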

  12. Generalized interpretation scheme for arbitrary HR InSAR image pairs

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten

    2013-10-01

    Land cover classification of remote sensing imagery is an important topic of research. For example, different applications require precise and fast information about the land cover of the imaged scenery (e.g., disaster management and change detection). Focusing on high resolution (HR) spaceborne remote sensing imagery, the user has the choice between passive and active sensor systems. Passive systems, such as multispectral sensors, have the disadvantage of being dependent on weather influences (fog, dust, clouds, etc.) and time of day, since they work in the visible part of the electromagnetic spectrum. Here, active systems like Synthetic Aperture Radar (SAR) provide improved capabilities. As an interactive method for analyzing HR InSAR image pairs, the CovAmCoh method was introduced in former studies. CovAmCoh represents the joint analysis of locality (coefficient of variation - Cov), backscatter (amplitude - Am) and temporal stability (coherence - Coh). It delivers information on the physical backscatter characteristics of imaged scene objects or structures and provides the opportunity to detect different classes of land cover (e.g., urban, rural, infrastructure and activity areas). As an example, railway tracks are easily distinguishable from other infrastructure due to their characteristic bluish coloring caused by the gravel between the sleepers. In consequence, imaged objects or structures have a characteristic appearance in CovAmCoh images, which allows the development of classification rules. In this paper, a generalized interpretation scheme for arbitrary InSAR image pairs using the CovAmCoh method is proposed. The scheme is based on analyzing the information content of typical CovAmCoh imagery using semisupervised k-means clustering. It is shown that eight classes model the main local information content of CovAmCoh images sufficiently and can be used as the basis for a classification scheme.
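
    A minimal sketch of the three CovAmCoh channels for a co-registered complex SAR image pair is given below; the window size, scaling, and RGB assignment conventions of the original studies are not reproduced, so this is an assumed reading of the channel definitions only.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def cov_am_coh(s1, s2, win=5):
    """Per-pixel coefficient of variation, mean amplitude, and interferometric
    coherence for a co-registered complex SAR image pair s1, s2."""
    amp = 0.5 * (np.abs(s1) + np.abs(s2))
    mean = uniform_filter(amp, win)
    var = uniform_filter(amp**2, win) - mean**2
    cov = np.sqrt(np.maximum(var, 0.0)) / np.maximum(mean, 1e-12)
    prod = s1 * np.conj(s2)
    num = uniform_filter(prod.real, win) + 1j * uniform_filter(prod.imag, win)
    den = np.sqrt(uniform_filter(np.abs(s1)**2, win)
                  * uniform_filter(np.abs(s2)**2, win))
    coh = np.abs(num) / np.maximum(den, 1e-12)
    return cov, mean, coh   # typically mapped to R, G, B for interpretation
```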

  13. National-scale cropland mapping based on spectral-temporal features and outdated land cover information

    PubMed Central

    Hansen, Matthew C.; Potapov, Peter V.; Löw, Fabian; Newby, Terence; Ferreira, Stefanus; Defourny, Pierre

    2017-01-01

    The lack of sufficient ground truth data has always constrained supervised learning, thereby hindering the generation of up-to-date satellite-derived thematic maps. This is all the more true for those applications requiring frequent updates over large areas such as cropland mapping. Therefore, we present a method enabling the automated production of spatially consistent cropland maps at the national scale, based on spectral-temporal features and outdated land cover information. Following an unsupervised approach, this method extracts reliable calibration pixels based on their labels in the outdated map and their spectral signatures. To ensure spatial consistency and coherence in the map, we first propose to generate seamless input images by normalizing the time series and deriving spectral-temporal features that target salient cropland characteristics. Second, we reduce the spatial variability of the class signatures by stratifying the country and by classifying each stratum independently. Finally, we remove speckle with a weighted majority filter accounting for per-pixel classification confidence. Capitalizing on a wall-to-wall validation data set, the method was tested in South Africa using a 16-year old land cover map and multi-sensor Landsat time series. The overall accuracy of the resulting cropland map reached 92%. A spatially explicit validation revealed large variations across the country and suggests that intensive grain-growing areas were better characterized than smallholder farming systems. Informative features in the classification process vary from one stratum to another but features targeting the minimum of vegetation as well as short-wave infrared features were consistently important throughout the country. Overall, the approach showed potential for routinely delivering consistent cropland maps over large areas as required for operational crop monitoring. PMID:28817618

  14. Proposal for Testing and Validation of Vacuum Ultra-Violet Atomic Laser-Induced Fluorescence as a Method to Analyze Carbon Grid Erosion in Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Stevens, Richard

    2003-01-01

    Previous investigation under award NAG3-2510 sought to determine the best method of LIF to determine the carbon density in a thruster plume. Initial reports from other groups were ambiguous as to the number of carbon clusters that might be present in the plume of a thruster. Carbon clusters would certainly affect the ability to LIF; if they were the dominant species, then perhaps the LIF method should target clusters. The results of quadrupole mass spectroscopy on sputtered carbon determined that minimal numbers of clusters were sputtered from graphite under impact from keV Krypton. There were some investigations in the keV range by other groups that hinted at clusters, but at the time the proposal was presented to NASA, there was no data from low-energy sputtering available. Thus, the proposal sought to develop a method to characterize only the population of atoms sputtered from a graphite target in a test cell. Most of the groundwork had been established by the previous two years of investigation. The proposal covering 2003 sought to develop an anti-Stokes Raman shifting cell to generate VUV light and test this cell on two different laser systems, ArF and YAG-pumped dye. The second goal was to measure the lowest detectable amounts of carbon atoms by 156.1 nm and 165.7 nm LIF. If equipment was functioning properly, it was expected that these goals would be met easily during the timeframe of the proposal, and that is the reason only modest funding was requested. The PI was only funded at half-time by Glenn during the summer months. All other work time was paid for by Whitworth College. The college also funded a student, Charles Shawley, who worked on the project during the spring.

  15. Estimates of the effective compressive strength

    NASA Astrophysics Data System (ADS)

    Goldstein, R. V.; Osipenko, N. M.

    2017-07-01

    One problem encountered in determining the effective mechanical properties of large-scale objects, which requires calculating their strength in processes of mechanical interaction with other objects, is the possible variability of their local properties, including variability due to the action of external physical factors. Such problems include determining the effective strength of bodies one of whose dimensions (the thickness) is significantly less than the others and whose properties and/or composition can vary through the thickness. A method for estimating the effective strength of such bodies is proposed and illustrated with the example of ice cover strength under longitudinal compression, with regard to a partial loss of the ice bearing capacity during deformation. The role of failure localization processes is shown. It is demonstrated that the proposed approach can be used in other problems of fracture mechanics.

  16. Optimized, Fast-Throughput UHPLC-DAD Based Method for Carotenoid Quantification in Spinach, Serum, Chylomicrons, and Feces.

    PubMed

    Eriksen, Jane N; Madsen, Pia L; Dragsted, Lars O; Arrigoni, Eva

    2017-02-01

    An improved UHPLC-DAD-based method was developed and validated for quantification of major carotenoids present in spinach, serum, chylomicrons, and feces. Separation was achieved with gradient elution within 12.5 min for six dietary carotenoids and the internal standard, echinenone. The proposed method provides, for all standard components, resolution > 1.1, linearity covering the target range (R > 0.99), LOQ < 0.035 mg/L, and intraday and interday RSDs < 2 and 10%, respectively. Suitability of the method was tested on biological matrices. Method precision (RSD%) for carotenoid quantification in serum, chylomicrons, and feces was below 10% for intra- and interday analysis, except for lycopene. Method accuracy was consistent with mean recoveries ranging from 78.8 to 96.9% and from 57.2 to 96.9% for all carotenoids, except for lycopene, in serum and feces, respectively. Additionally, an interlaboratory validation study on spinach at two institutions showed no significant differences in lutein or β-carotene content, when evaluated on four occasions.

  17. A finite element method with overlapping meshes for free-boundary axisymmetric plasma equilibria in realistic geometries

    NASA Astrophysics Data System (ADS)

    Heumann, Holger; Rapetti, Francesca

    2017-04-01

    Existing finite element implementations for the computation of free-boundary axisymmetric plasma equilibria approximate the unknown poloidal flux function by standard lowest order continuous finite elements with discontinuous gradients. As a consequence, the location of critical points of the poloidal flux, that are of paramount importance in tokamak engineering, is constrained to nodes of the mesh leading to undesired jumps in transient problems. Moreover, recent numerical results for the self-consistent coupling of equilibrium with resistive diffusion and transport suggest the necessity of higher regularity when approximating the flux map. In this work we propose a mortar element method that employs two overlapping meshes. One mesh with Cartesian quadrilaterals covers the vacuum chamber domain accessible by the plasma and one mesh with triangles discretizes the region outside. The two meshes overlap in a narrow region. This approach gives the flexibility to achieve easily and at low cost higher order regularity for the approximation of the flux function in the domain covered by the plasma, while preserving accurate meshing of the geometric details outside this region. The continuity of the numerical solution in the region of overlap is weakly enforced by a mortar-like mapping.

  18. Dispersion of single-wall carbon nanotubes with supramolecular Congo red - properties of the complexes and mechanism of the interaction.

    PubMed

    Jagusiak, Anna; Piekarska, Barbara; Pańczyk, Tomasz; Jemioła-Rzemińska, Małgorzata; Bielańska, Elżbieta; Stopa, Barbara; Zemanek, Grzegorz; Rybarska, Janina; Roterman, Irena; Konieczny, Leszek

    2017-01-01

    A method for the dispersion of single-wall carbon nanotubes (SWNTs) in aqueous media using Congo red (CR) is proposed. Nanotubes covered with CR constitute a high-capacity system that provides the possibility of binding and targeted delivery of different drugs, which can intercalate into the supramolecular, ribbon-like CR structure. The study revealed the presence of strong interactions between CR and the surface of SWNTs. The aim of the study was to explain the mechanism of this interaction. The interaction of CR and carbon nanotubes was studied using spectral analysis of the SWNT-CR complex, dynamic light scattering (DLS), differential scanning calorimetry (DSC) and microscopic methods: atomic force microscopy (AFM), transmission (TEM), scanning (SEM) and optical microscopy. The results indicate that the binding of supramolecular CR structures to the surface of the nanotubes is based on "face to face" stacking. CR molecules attached directly to the surface of the nanotubes can bind further, parallel-oriented molecules and form supramolecular and protruding structures. This explains the high CR binding capacity of carbon nanotubes. The presented system, containing SWNTs covered with CR, offers a wide range of biomedical applications.

  19. Efficient Wideband Numerical Simulations for Nanostructures Employing a Drude-Critical Points (DCP) Dispersive Model.

    PubMed

    Ren, Qiang; Nagar, Jogender; Kang, Lei; Bian, Yusheng; Werner, Ping; Werner, Douglas H

    2017-05-18

    A highly efficient numerical approach for simulating the wideband optical response of nano-architectures comprised of Drude-Critical Points (DCP) media (e.g., gold and silver) is proposed and validated through comparing with commercial computational software. The kernel of this algorithm is the subdomain level discontinuous Galerkin time domain (DGTD) method, which can be viewed as a hybrid of the spectral-element time-domain method (SETD) and the finite-element time-domain (FETD) method. An hp-refinement technique is applied to decrease the Degrees-of-Freedom (DoFs) and computational requirements. The collocated E-J scheme facilitates solving the auxiliary equations by converting the inversions of matrices to simpler vector manipulations. A new hybrid time stepping approach, which couples the Runge-Kutta and Newmark methods, is proposed to solve the temporal auxiliary differential equations (ADEs) with a high degree of efficiency. The advantages of this new approach, in terms of computational resource overhead and accuracy, are validated through comparison with well-known commercial software for three diverse cases, which cover both near-field and far-field properties with plane wave and lumped port sources. The presented work provides the missing link between DCP dispersive models and FETD and/or SETD based algorithms. It is a competitive candidate for numerically studying the wideband plasmonic properties of DCP media.

  20. Automated choroidal segmentation method in human eye with 1050nm optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, Cindy; Wang, Ruikang K.

    2014-02-01

    Choroidal thickness (ChT), defined as the distance between the retinal pigment epithelium (RPE) and the choroid-sclera interface (CSI), is highly correlated with various ocular disorders like high myopia, diabetic retinopathy, and central serous chorioretinopathy. Long wavelength Optical Coherence Tomography (OCT) has the ability to penetrate deep to the CSI, making the measurement of the ChT possible. The ability to accurately segment the CSI and RPE is important in extracting clinical information. However, automated CSI segmentation is challenging due to the weak boundary in the lower choroid and inconsistent texture with varied blood vessels. We propose a K-means clustering based automated algorithm, which is effective in segmenting the CSI and RPE. The performance of the method was evaluated using 531 frames from 4 normal subjects. The RPE and CSI segmentation time was about 0.3 seconds per frame, and the average time was around 0.5 seconds per frame with correction among frames, which is faster than reported algorithms. The results from the proposed method are consistent with the manual segmentation results. Further investigation includes the optimization of the algorithm to cover more OCT images captured from patients and the increase of the processing speed and robustness of the segmentation method.
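
    The paper's full pipeline (feature set, boundary extraction, inter-frame correction) is richer than can be shown here; a minimal K-means clustering sketch on intensity-plus-depth features indicates the flavor of the approach, with a synthetic stand-in B-scan.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_segment(bscan, k=3):
    """Cluster pixels of one OCT B-scan by intensity and normalized depth,
    so that cluster boundaries can serve as candidate RPE/CSI interfaces."""
    h, w = bscan.shape
    rows, cols = np.mgrid[0:h, 0:w]
    feats = np.column_stack([bscan.ravel(), rows.ravel() / h])
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(feats)
    return labels.reshape(h, w)

bscan = np.random.default_rng(7).uniform(0, 1, (128, 256))  # stand-in B-scan
seg = kmeans_segment(bscan)
```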

  1. Exploring geo-tagged photos for land cover validation with deep learning

    NASA Astrophysics Data System (ADS)

    Xing, Hanfa; Meng, Yuan; Wang, Zixuan; Fan, Kaixuan; Hou, Dongyang

    2018-07-01

    Land cover validation plays an important role in the process of generating and distributing land cover thematic maps, and is usually implemented through costly sample interpretation of remotely sensed images or field survey. With the increasing availability of geo-tagged landscape photos, automatic photo recognition methodologies, e.g., deep learning, can be effectively utilised for land cover applications. However, they have hardly been utilised in validation processes, as challenges remain in sample selection and classification for highly heterogeneous photos. This study proposed an approach to employ geo-tagged photos for land cover validation by using deep learning technology. The approach first identified photos automatically based on the VGG-16 network. Then, samples for validation were selected and further classified by considering photo distribution and classification probabilities. The implementations were conducted for the validation of the GlobeLand30 land cover product in a heterogeneous area, western California. Experimental results showed promise for land cover validation, given that GlobeLand30 showed an overall accuracy of 83.80% with classified samples, which was close to the validation result of 80.45% based on visual interpretation. Additionally, the performances of deep learning based on ResNet-50 and AlexNet were also quantified, revealing no substantial differences in the final validation results. The proposed approach ensures geo-tagged photo quality, and supports the sample classification strategy by considering photo distribution, with an accuracy improvement from 72.07% to 79.33% compared with solely considering the single nearest photo. Consequently, the presented approach demonstrates the feasibility of deep learning technology for land cover information identification from geo-tagged photos, and has great potential to support and improve the efficiency of land cover validation.
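
    A minimal sketch of the photo identification step follows, using an ImageNet-pretrained VGG-16 from torchvision. The study fine-tunes the network for land cover classes and adds distribution-aware sample selection, neither of which is shown; "photo.jpg" is a hypothetical input file.

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Load an ImageNet-pretrained VGG-16 and classify one geo-tagged photo.
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).eval()
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = preprocess(Image.open("photo.jpg")).unsqueeze(0)  # hypothetical file
with torch.no_grad():
    probs = torch.softmax(model(img), dim=1)[0]
top = torch.topk(probs, 5)   # top-5 class probabilities for the photo
```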

  2. A Framework for Land Cover Classification Using Discrete Return LiDAR Data: Adopting Pseudo-Waveform and Hierarchical Segmentation

    NASA Technical Reports Server (NTRS)

    Jung, Jinha; Pasolli, Edoardo; Prasad, Saurabh; Tilton, James C.; Crawford, Melba M.

    2014-01-01

    Acquiring current, accurate land-use information is critical for monitoring and understanding the impact of anthropogenic activities on natural environments. Remote sensing technologies are of increasing importance because of their capability to acquire information for large areas in a timely manner, enabling decision makers to be more effective in complex environments. Although optical imagery has been demonstrated to be successful for land cover classification, active sensors, such as light detection and ranging (LiDAR), have distinct capabilities that can be exploited to improve classification results. However, the utilization of LiDAR data for land cover classification has not been fully exploited. Moreover, spatial-spectral classification has recently gained significant attention since classification accuracy can be improved by extracting additional information from the neighboring pixels. Although spatial information has been widely used for spectral data, less attention has been given to LiDAR data. In this work, a new framework for land cover classification using discrete return LiDAR data is proposed. Pseudo-waveforms are generated from the LiDAR data and processed by hierarchical segmentation. Spatial features are extracted in a region-based way using a new unsupervised strategy for multiple pruning of the segmentation hierarchy. The proposed framework is validated experimentally on a real dataset acquired in an urban area. Better classification results are exhibited by the proposed framework compared to the cases in which basic LiDAR products such as the digital surface model and intensity image are used. Moreover, the proposed region-based feature extraction strategy results in improved classification accuracies in comparison with a more traditional window-based approach.
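
    A minimal sketch of pseudo-waveform generation follows, assuming intensity-weighted height binning of the discrete returns within one grid cell; the paper's exact binning and normalization are not specified in the record, so the parameters below are placeholders.

```python
import numpy as np

def pseudo_waveform(heights, intensities, z_min, z_max, n_bins=32):
    """Convert the discrete LiDAR returns inside one grid cell into a
    pseudo-waveform: intensity summed into regular height bins."""
    bins = np.linspace(z_min, z_max, n_bins + 1)
    wf, _ = np.histogram(heights, bins=bins, weights=intensities)
    return wf / max(wf.sum(), 1e-12)   # normalize to unit energy

# Returns in one cell: ground hits near 0 m, canopy hits near 12 m
heights = np.array([0.1, 0.3, 11.8, 12.2, 12.5])
intensities = np.array([90.0, 85.0, 40.0, 42.0, 38.0])
wf = pseudo_waveform(heights, intensities, z_min=0.0, z_max=20.0)
```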

  3. 75 FR 14582 - Notice of Proposed Information Collection Requests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-26

    ... comprehensive literacy plan. This request includes information collection activities covered under the Paperwork... DEPARTMENT OF EDUCATION Notice of Proposed Information Collection Requests AGENCY: Department of Education. SUMMARY: The Acting Director, Information Collection Clearance Division, Regulatory Information...

  4. [Patient-Proposed Health Services].

    PubMed

    Fujiwara, Yasuhiro

    2016-06-01

    The Patient-Proposed Health Services (PPHS) was launched in April 2016. PPHS was proposed by the Council for Regulatory Reform, which was established in January 2013 under the Second Abe Administration. After discussion within the council, PPHS was published in the Japan Revitalization Strategy (2014 revised edition), which was endorsed by the Cabinet on June 24, 2014. PPHS was proposed therein as a new mechanism within the mixed billing system to apply for a combination of treatment not covered by the public health insurance with treatment covered by the insurance. Subsequently, PPHS was submitted for Diet deliberations in April and May 2015 and inserted into article 63 of the Health Insurance Act in accordance with "a law for making partial amendments to the National Health Insurance Act, etc., in order to create a sustainable medical insurance system", which was promulgated on May 29, 2015. In this paper, I review the background of the birth of PPHS and provide an overview.

  5. The qualitative research proposal.

    PubMed

    Klopper, H

    2008-12-01

    Qualitative research in the health sciences has had to overcome many prejudices and a number of misunderstandings, but today qualitative research is as acceptable as quantitative research designs and is widely funded and published. Writing the proposal of a qualitative study, however, can be a challenging feat, due to the emergent nature of the qualitative research design and the description of the methodology as a process. Even today, many sub-standard proposals are still seen at post-graduate evaluation committees and among applications submitted for funding. This problem has led the researcher to develop a framework to guide the qualitative researcher in writing the proposal of a qualitative study, based on the following research questions: (i) What is the process of writing a qualitative research proposal? and (ii) What do the structure and layout of a qualitative proposal look like? The purpose of this article is to discuss the process of writing the qualitative research proposal, as well as to describe the structure and layout of a qualitative research proposal. The process of writing a qualitative research proposal is discussed with regard to the most important questions that need to be answered in the proposal, with consideration of the guidelines of being practical, being persuasive, making broader links, aiming for crystal clarity and planning before you write. The structure of the qualitative research proposal is discussed with regard to the key sections of the proposal, namely the cover page, abstract, introduction, review of the literature, research problem and research questions, research purpose and objectives, research paradigm, research design, research method, ethical considerations, dissemination plan, budget and appendices.

  6. A modified NSGA-II solution for a new multi-objective hub maximal covering problem under uncertain shipments

    NASA Astrophysics Data System (ADS)

    Ebrahimi Zade, Amir; Sadegheih, Ahmad; Lotfi, Mohammad Mehdi

    2014-07-01

    Hubs are centers for the collection, rearrangement, and redistribution of commodities in transportation networks. In this paper, non-linear multi-objective formulations for single and multiple allocation hub maximal covering problems, as well as their linearized versions, are proposed. The formulations substantially mitigate the complexity of the existing models due to the smaller number of constraints and variables. Also, uncertain shipments are studied in the context of hub maximal covering problems. In many real-world applications, any link on the path from origin to destination may fail to work due to disruption. Therefore, in the proposed bi-objective model, maximizing the safety of the weakest path in the network is considered as the second objective together with the traditional maximum coverage goal. Furthermore, to solve the bi-objective model, a modified version of NSGA-II with a new dynamic immigration operator is developed, in which the number of immigrants depends on the results of the other two common NSGA-II operators, i.e., mutation and crossover. Besides validating the proposed models, computational results confirm the better performance of the modified NSGA-II versus the traditional one.
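
    The modified NSGA-II itself (with the dynamic immigration operator) is not reproduced here; the sketch below shows the fast non-dominated sorting at the core of any NSGA-II, applied to toy coverage/safety objectives recast as minimizations.

```python
import numpy as np

def non_dominated_sort(F):
    """Sort objective vectors (rows of F, all to be minimized) into Pareto
    fronts, the core ranking step of NSGA-II."""
    n = F.shape[0]
    dominated_by = [set() for _ in range(n)]
    dom_count = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                dominated_by[i].add(j)   # i dominates j
                dom_count[j] += 1
    fronts, current = [], [i for i in range(n) if dom_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

# Two objectives: negated coverage and negated weakest-path safety
F = np.array([[-0.9, -0.4], [-0.8, -0.6], [-0.7, -0.5], [-0.9, -0.3]])
print(non_dominated_sort(F))   # front 0: solutions 0 and 1
```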

  7. 76 FR 13980 - Proposed Information Collection; Comment Request; 2012 Economic Census Covering the Information...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... essential information for government, business, and the general public. Economic data are the Census Bureau... Economic Census Covering the Information, etc. AGENCY: U.S. Census Bureau, Commerce. ACTION: Notice... ). SUPPLEMENTARY INFORMATION: I. Abstract The economic census, conducted under authority of Title 13, United States...

  8. 75 FR 65392 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-22

    ... consumer financial information from a Covered Person, the Covered Person must provide a notice to each... require only approximately 2,150 to provide consumers with notice and an opt-out opportunity. The... opportunities to consumers, and would incur an average first-year burden of 18 hours in doing so, for a total...

  9. 33 CFR 239.4 - Policy.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... response to alternative plans. Thus, covered flood control channels may be proposed if they are desired by... RESOURCES POLICIES AND AUTHORITIES: FEDERAL PARTICIPATION IN COVERED FLOOD CONTROL CHANNELS § 239.4 Policy.... Selection of the plan which best serves the public interest is based upon the ability of the plan to meet...

  10. 77 FR 12754 - Contractor Legal Management Requirements; Acquisition Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... DEPARTMENT OF ENERGY 10 CFR Part 719 48 Parts 931, 952 and 970 RIN 1990-AA37 Contractor Legal... rulemaking (NOPR) to revise existing regulations covering contractor legal management requirements and make... relating to the DOE notice of proposed rulemaking to revise existing regulations covering contractor legal...

  11. 76 FR 31582 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-01

    ... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection....S. Census Bureau. Title: 2012 Economic Census Covering the Retail Trade and Accommodation and Food.... Needs and Uses: The 2012 Economic Census Covering the Retail Trade and Accommodation and Food Services...

  12. Fiber waveguide sensors for intelligent materials

    NASA Technical Reports Server (NTRS)

    Flax, A. R.; Claus, R. O.

    1988-01-01

    This report, an addendum to the six month report submitted to NASA Langley Research Center in December 1987, covers research performed by the Fiber and Electro-Optics Research Center (FEORC) at Virginia Tech for the NASA Langley Research Center, Grant NAG1-780, for the period from December 1987 to June 1988. This final report discusses the research performed in the following four areas as described in the proposal: Fabrication of Sensor Fibers Optimized for Embedding in Advanced Composites; Fabrication of Sensor Fiber with In-Line Splices and Evaluation via OTR methods; Modal Domain Optical Fiber Sensor Analysis; and Acoustic Fiber Waveguide Implementation.

  13. Extreme learning machine for ranking: generalization analysis and applications.

    PubMed

    Chen, Hong; Peng, Jiangtao; Zhou, Yicong; Li, Luoqing; Pan, Zhibin

    2014-05-01

    The extreme learning machine (ELM) has attracted increasing attention recently with its successful applications in classification and regression. In this paper, we investigate the generalization performance of ELM-based ranking. A new regularized ranking algorithm is proposed based on combinations of activation functions in ELM. The generalization analysis is established for the ELM-based ranking (ELMRank) in terms of the covering numbers of the hypothesis space. Empirical results on benchmark datasets show the competitive performance of ELMRank compared with state-of-the-art ranking methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
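
    For reference, the machinery ELMRank builds on (a random hidden layer plus a ridge-regularized output solve) can be sketched in a few lines of Python; the paper's pairwise ranking loss and its combinations of activation functions are not reproduced, so this is only the basic regularized ELM under assumed settings.

    import numpy as np

    rng = np.random.default_rng(0)

    def elm_fit(X, y, n_hidden=100, reg=1e-2):
        W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights, kept fixed
        b = rng.normal(size=n_hidden)                # random biases, kept fixed
        H = np.tanh(X @ W + b)                       # hidden-layer activations
        # ridge solution for the output weights: beta = (H'H + reg*I)^(-1) H'y
        beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    # toy usage: learn a noisy linear ranking score
    X = rng.normal(size=(200, 5))
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=200)
    W, b, beta = elm_fit(X, y)
    print(np.corrcoef(elm_predict(X, W, b, beta), y)[0, 1])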

  14. Twofold orthogonal weavings on cuboids

    PubMed Central

    Kovács, Flórián

    2016-01-01

    Some closed polyhedral surfaces can be completely covered by two-way, twofold (rectangular) weaving of strands of constant width. In this paper, a construction for producing all possible geometries of such weavable cuboids is proposed: a theorem on spherical octahedra, on which all further theory is based, is proven first. The construction method of weavable cuboids itself relies on successive truncations of an initial tetrahedron and is also extended to cases of degenerate (unbounded) polyhedra. Arguments are mainly based on the plane geometry of the development of the respective polyhedra, in connection with some three-dimensional projective properties of the same. PMID:27118910

  15. Optimization of a dual mode Rowland mount spectrometer used in the 120-950 nm wavelength range

    NASA Astrophysics Data System (ADS)

    McDowell, M. W.; Bouwer, H. K.

    In a recent article, several configurations were described whereby a Rowland mount spectrometer could be modified to cover a wavelength range of 120-950 nm. In one of these configurations, a large additional image aberration is introduced which severely limits the spectral resolving power. In the present article, the theoretical imaging properties of this configuration are considered and a simple method is proposed to reduce this aberration. The optimized system possesses an image quality similar to the conventional Rowland mount, with the image surface slightly displaced from the Rowland circle but concentric to it.

  16. Composite structural materials

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1979-01-01

    A multifaceted program is described in which aeronautical, mechanical, and materials engineers interact to develop composite aircraft structures. Topics covered include: (1) the design of an advanced composite elevator and a proposed spar and rib assembly; (2) optimizing fiber orientation in the vicinity of heavily loaded joints; (3) failure mechanisms and delamination; (4) the construction of an ultralight sailplane; (5) computer-aided design: finite element analysis programs, preprocessor development, and an array preprocessor for SPAR; (6) advanced analysis methods for composite structures; (7) ultrasonic nondestructive testing; (8) physical properties of epoxy resins and composites; (9) fatigue in composite materials; and (10) transverse thermal expansion of carbon/epoxy composites.

  17. Extending the depth of field with chromatic aberration for dual-wavelength iris imaging.

    PubMed

    Fitzgerald, Niamh M; Dainty, Christopher; Goncharov, Alexander V

    2017-12-11

    We propose a method of extending the depth of field to twice that achievable by conventional lenses for the purpose of a low-cost iris recognition front-facing camera in mobile phones. By introducing intrinsic primary chromatic aberration in the lens, the depth of field is doubled by means of dual-wavelength illumination. The lens parameters (radius of curvature, optical power) can be found analytically by using paraxial raytracing. The effective range of distances covered increases with the dispersion of the chosen glass and with the distance to the near object point.
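
    A toy paraxial illustration of the mechanism (not the authors' design): a thin singlet evaluated at two wavelengths yields two focal planes whose axial separation grows with the glass dispersion. The radii and refractive indices below are assumed values chosen only for illustration.

    def thin_lens_focal_length(n, r1, r2):
        # Lensmaker's equation for a thin lens in air: 1/f = (n - 1)(1/R1 - 1/R2)
        return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

    r1, r2 = 20.0, -20.0           # radii of curvature in mm (assumed)
    n_blue, n_red = 1.530, 1.515   # assumed indices at the two illumination bands

    f_blue = thin_lens_focal_length(n_blue, r1, r2)
    f_red = thin_lens_focal_length(n_red, r1, r2)
    print(f"f(blue) = {f_blue:.2f} mm, f(red) = {f_red:.2f} mm")
    print(f"axial chromatic focal shift = {f_red - f_blue:.3f} mm")
    # The more dispersive the glass (the larger n_blue - n_red), the larger the
    # shift, and hence the larger the combined in-focus range, as noted above.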

  18. 78 FR 27184 - Notice of Reopening of Public Comment Period-Proposed Directives for Forest Service Land...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-09

    ... Period--Proposed Directives for Forest Service Land Management Planning AGENCY: Forest Service, USDA... comment period for the proposed directive regarding land management planning for an additional 15 days... identify your comments by including ``RIN 0596-AD06'' or ``planning directives'' on the cover sheet or the...

  19. 42 CFR 52h.9 - What matters must be reviewed for unsolicited contract proposals?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES GRANTS SCIENTIFIC PEER REVIEW OF RESEARCH GRANT APPLICATIONS AND RESEARCH AND DEVELOPMENT CONTRACT... covered by this part unless the proposal has been reviewed by a peer review group in accordance with the... proposal. (b) Except to the extent otherwise provided by law, peer review group recommendations are...

  20. Periodontics II: Course Proposal.

    ERIC Educational Resources Information Center

    Dordick, Bruce

    A proposal is presented for Periodontics II, a course offered at the Community College of Philadelphia to give the dental hygiene/assisting student an understanding of the disease states of the periodontium and their treatment. A standardized course proposal cover form is given, followed by a statement of purpose for the course, a list of major…

  1. Sensitivity of the snowmelt runoff model to underestimates of remotely sensed snow covered area

    USDA-ARS?s Scientific Manuscript database

    Three methods for estimating snow covered area (SCA) from Terra MODIS data were used to derive conventional depletion curves for input to the Snowmelt Runoff Model (SRM). We compared the MOD10 binary and fractional snow cover products and a method for estimating sub-pixel snow cover using spectral m...

  2. New method to assess the water vapour permeance of wound coverings.

    PubMed

    Jonkman, M F; Molenaar, I; Nieuwenhuis, P; Bruin, P; Pennings, A J

    1988-05-01

    A new method for assessing the permeability to water vapour of wound coverings is presented, using the evaporimeter developed by Nilsson. This new method combines the water vapour transmission rate (WVTR) and the vapour pressure difference across a wound covering in one absolute measure: the water vapour permeance (WVP). The WVP of a wound covering is the steady flow (g) of water vapour per unit (m²) area of surface in unit (h) time induced by unit (kPa) vapour pressure difference, expressed in g·m⁻²·h⁻¹·kPa⁻¹. Since the WVP of a wound covering is a more accurate measure of permeability than the WVTR, it facilitates prediction of the water exchange of a wound covering in clinical situations.
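
    The definition translates directly into a one-line computation; the numbers in the usage example are illustrative, not measurements from the paper.

    def water_vapour_permeance(wvtr, dp_kpa):
        # WVP (g·m⁻²·h⁻¹·kPa⁻¹) = WVTR (g·m⁻²·h⁻¹) / vapour pressure
        # difference across the covering (kPa).
        return wvtr / dp_kpa

    print(water_vapour_permeance(25.0, 3.1))  # e.g. a WVTR of 25 g/m²/h across 3.1 kPa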

  3. A scale self-adapting segmentation approach and knowledge transfer for automatically updating land use/cover change databases using high spatial resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Zhihua; Yang, Xiaomei; Lu, Chen; Yang, Fengshuo

    2018-07-01

    Automatic updating of land use/cover change (LUCC) databases using high spatial resolution images (HSRI) is important for environmental monitoring and policy making, especially for coastal areas that connect the land and coast and that tend to change frequently. Many object-based change detection methods have been proposed, especially those combining historical LUCC with HSRI. However, the scale parameter(s) used to segment the temporal image series, which directly determine the average object size, are hard to choose without expert intervention. Likewise, the samples transferred from historical LUCC need expert intervention to avoid insufficient or wrong samples. To address scale selection, a Scale Self-Adapting Segmentation (SSAS) approach is proposed, based on exponential sampling of the scale parameter and location of the local maximum of a weighted local variance, for segmenting images constrained by LUCC when detecting changes. To address sample transfer, Knowledge Transfer (KT) is also proposed, in which a classifier trained on historical images with LUCC is applied to classify the updated images. Comparison experiments were conducted in a coastal area of Zhujiang, China, using SPOT 5 images acquired in 2005 and 2010. The results reveal that (1) SSAS can segment images more effectively without expert intervention, and (2) KT can likewise reach the maximum sample-transfer accuracy without expert intervention. The SSAS + KT strategy is a good choice when the historical image matches the LUCC database and the historical and updated images come from the same source.
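
    A hedged sketch of the scale-selection idea in SSAS: sample the scale parameter on an exponential grid, score each candidate segmentation with an area-weighted local variance, and keep the scale at the local maximum of that score. The segmentation routine is passed in as a callable, and the exact scoring and stopping criteria are assumptions rather than the authors' implementation.

    import numpy as np

    def weighted_local_variance(image, labels):
        score, total = 0.0, labels.size
        for lab in np.unique(labels):
            pixels = image[labels == lab]
            score += (pixels.size / total) * pixels.var()  # area-weighted variance
        return score

    def select_scale(image, segment, s_min=10.0, s_max=1000.0, factor=1.5):
        scales, s = [], s_min
        while s <= s_max:
            scales.append(s)      # exponential sampling of the scale parameter
            s *= factor
        scores = [weighted_local_variance(image, segment(image, sc)) for sc in scales]
        for i in range(1, len(scores) - 1):
            if scores[i] > scores[i - 1] and scores[i] > scores[i + 1]:
                return scales[i]  # first local maximum of the score
        return scales[int(np.argmax(scores))]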

  4. Multi-period project portfolio selection under risk considerations and stochastic income

    NASA Astrophysics Data System (ADS)

    Tofighian, Ali Asghar; Moezzi, Hamid; Khakzar Barfuei, Morteza; Shafiee, Mahmood

    2018-02-01

    This paper deals with the multi-period project portfolio selection problem. In this problem, the available budget is invested in the best portfolio of projects in each period such that the net profit is maximized. We also consider more realistic assumptions to cover a wider range of applications than those reported in previous studies. A novel mathematical model is presented to solve the problem, considering risks, stochastic incomes, and the possibility of investing extra budget in each time period. Due to the complexity of the problem, an effective meta-heuristic method hybridized with a local search procedure is presented to solve the problem. The algorithm is based on the genetic algorithm (GA), a prominent method for this type of problem. The GA is enhanced by a new solution representation and well-selected operators, and is hybridized with a local search mechanism to obtain better solutions in a shorter time. The performance of the proposed algorithm is then compared with well-known algorithms, such as the basic genetic algorithm (GA), particle swarm optimization (PSO), and the electromagnetism-like algorithm (EM-like), by means of some prominent indicators. The computational results show the superiority of the proposed algorithm in terms of accuracy, robustness and computation time. Finally, the proposed algorithm is combined with PSO to considerably improve the computing time.
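
    The hybridization pattern described above can be sketched as a plain GA loop with a local search pass applied to each offspring. Representation, operators and the budget/risk model are problem-specific, so they are passed in as callables; this is an illustrative skeleton, not the authors' algorithm.

    import random

    def hybrid_ga(init, fitness, crossover, mutate, local_search,
                  pop_size=50, generations=200):
        pop = [init() for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)      # maximize net profit
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                child = mutate(crossover(a, b))
                children.append(local_search(child, fitness))  # hybrid step
            pop = parents + children
        return max(pop, key=fitness)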

  5. Multi-scale compositionality: identifying the compositional structures of social dynamics using deep learning.

    PubMed

    Peng, Huan-Kai; Marculescu, Radu

    2015-01-01

    Social media exhibit rich yet distinct temporal dynamics covering a wide range of time scales. In order to study these complex dynamics, two fundamental questions revolve around (1) the signatures of social dynamics at different time scales, and (2) the way in which these signatures interact and form higher-level meanings. In this paper, we propose the Recursive Convolutional Bayesian Model (RCBM) to address both of these fundamental questions. The key idea behind our approach consists of constructing a deep-learning framework using specialized convolution operators that are designed to exploit the inherent heterogeneity of social dynamics. RCBM's runtime and convergence properties are guaranteed by formal analyses. Experimental results show that the proposed method outperforms state-of-the-art approaches both in terms of solution quality and computational efficiency. Indeed, by applying the proposed method to two social network datasets, Twitter and Yelp, we are able to identify compositional structures that can accurately characterize the complex social dynamics of these two social media. We further show that identifying these patterns can enable new applications such as anomaly detection and improved social dynamics forecasting. Finally, our analysis offers new insights on understanding and engineering social media dynamics, with direct applications to opinion spreading and online content promotion.

  6. Critical object recognition in millimeter-wave images with robustness to rotation and scale.

    PubMed

    Mohammadzade, Hoda; Ghojogh, Benyamin; Faezi, Sina; Shabany, Mahdi

    2017-06-01

    Locating critical objects is crucial in various security applications and industries. For example, in security applications such as airports, these objects might be hidden or covered under shields or secret sheaths. Millimeter-wave images can be utilized to discover and recognize critical objects in such hidden cases without any health risk, owing to their non-ionizing nature. However, millimeter-wave images usually have waves in and around the detected objects, making object recognition difficult. Thus, regular image processing and classification methods cannot be used for these images, and additional pre-processing and classification methods must be introduced. This paper proposes a novel pre-processing method for canceling rotation and scale using principal component analysis. In addition, a two-layer classification method is introduced and utilized for recognition. Moreover, a large dataset of millimeter-wave images is collected and created for experiments. Experimental results show that a typical classification method such as support vector machines can recognize 45.5% of one type of critical object at a 34.2% false alarm rate (FAR), which is drastically poor recognition. The same method within the proposed recognition framework achieves a 92.9% recognition rate at 0.43% FAR, which indicates a highly significant improvement. The significant contribution of this work is to introduce a new method for analyzing millimeter-wave images based on machine vision and learning approaches, which is not yet widely noted in the field of millimeter-wave image analysis.
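
    The rotation/scale cancellation can be illustrated with a minimal geometric stand-in: project the coordinates of the detected object pixels onto their principal axes and rescale them to unit spread. The paper's actual image-based pre-processing is more involved; the binary-mask input here is an assumption.

    import numpy as np

    def normalize_pose(mask):
        ys, xs = np.nonzero(mask)                  # object pixel coordinates
        pts = np.column_stack([xs, ys]).astype(float)
        pts -= pts.mean(axis=0)                    # translation invariance
        eigvals, eigvecs = np.linalg.eigh(np.cov(pts.T))
        rotated = pts @ eigvecs                    # align to principal axes
        return rotated / np.sqrt(eigvals.max())    # cancel scale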

  7. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

    The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448
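
    Of the three steps, the final parametric bootstrap is the easiest to illustrate: draw trial-level treatment effects from the fitted bivariate normal and recompute the surrogacy metric each time. The mean vector and covariance matrix below are placeholders, not estimates from any real trial data.

    import numpy as np

    rng = np.random.default_rng(1)
    mu = np.array([0.4, 0.6])                       # fitted mean effects (placeholder)
    Sigma = np.array([[0.04, 0.03],
                      [0.03, 0.05]])                # fitted covariance (placeholder)

    def surrogacy_metric(effects):
        # correlation of treatment effects on surrogate vs clinical end point
        return np.corrcoef(effects[:, 0], effects[:, 1])[0, 1]

    stats = [surrogacy_metric(rng.multivariate_normal(mu, Sigma, size=30))
             for _ in range(2000)]
    lo, hi = np.percentile(stats, [2.5, 97.5])
    print(f"95% bootstrap CI for the effect correlation: ({lo:.2f}, {hi:.2f})")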

  8. Models based on ultraviolet spectroscopy, polyphenols, oligosaccharides and polysaccharides for prediction of wine astringency.

    PubMed

    Boulet, Jean-Claude; Trarieux, Corinne; Souquet, Jean-Marc; Ducasse, Maris-Agnés; Caillé, Soline; Samson, Alain; Williams, Pascale; Doco, Thierry; Cheynier, Véronique

    2016-01-01

    Astringency elicited by tannins is usually assessed by tasting. Alternative methods involving tannin precipitation have been proposed, but they remain time-consuming. Our goal was to propose a faster method and investigate the links between wine composition and astringency. Red wines covering a wide range of astringency intensities, assessed by sensory analysis, were selected. Prediction models based on multiple linear regression (MLR) were built using UV spectrophotometry (190-400 nm) and chemical analysis (enological analysis, polyphenols, oligosaccharides and polysaccharides). Astringency intensity was strongly correlated (R² = 0.825) with tannin precipitation by bovine serum albumin (BSA). Wine absorbance at 230 nm (A230) proved more suitable for astringency prediction (R² = 0.705) than A280 (R² = 0.56) or tannin concentration estimated by phloroglucinolysis (R² = 0.59). Three-variable models built with A230, oligosaccharides and polysaccharides presented high R² and low cross-validation errors. These models confirmed that polysaccharides decrease astringency perception and indicated a positive relationship between oligosaccharides and astringency. Copyright © 2015 Elsevier Ltd. All rights reserved.
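
    The three-variable model is an ordinary least-squares fit of the form astringency ~ A230 + oligosaccharides + polysaccharides. The sketch below reproduces that form on synthetic stand-in data; the coefficients are invented to mimic the reported directions of effect, not the paper's fitted values.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 30
    a230 = rng.uniform(5, 50, n)
    oligo = rng.uniform(0.1, 2.0, n)   # oligosaccharides, illustrative units
    poly = rng.uniform(0.2, 1.5, n)    # polysaccharides, illustrative units
    astringency = 0.1 * a230 + 1.2 * oligo - 1.5 * poly + rng.normal(0, 0.3, n)

    X = np.column_stack([np.ones(n), a230, oligo, poly])
    coef, *_ = np.linalg.lstsq(X, astringency, rcond=None)
    pred = X @ coef
    r2 = 1 - ((astringency - pred) ** 2).sum() / ((astringency - astringency.mean()) ** 2).sum()
    print(f"coefficients: {coef.round(3)}, R^2 = {r2:.3f}")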

  9. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations

    PubMed Central

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J. Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE estimate of the Jet Propulsion Laboratory (PT-JPL). Twenty-two eddy covariance (EC) sites with LE observations were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts are needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types. PMID:27472383
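
    A hedged sketch of the fusion idea: stack the two LE series, extract the leading empirical orthogonal function by singular value decomposition, and take its reconstruction as the merged estimate. The paper's modification of the EOF analysis is not reproduced, and the data below are synthetic.

    import numpy as np

    def eof_merge(le_a, le_b):
        # le_a, le_b: (time,) arrays from the two LE algorithms
        X = np.column_stack([le_a, le_b])
        mean = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
        leading = np.outer(U[:, 0] * s[0], Vt[0])  # rank-1 EOF reconstruction
        return (leading + mean).mean(axis=1)       # merged LE series

    rng = np.random.default_rng(3)
    t = np.arange(365)
    truth = 100 + 30 * np.sin(2 * np.pi * t / 365)  # synthetic seasonal LE
    mod16 = truth + rng.normal(0, 10, 365)
    ptjpl = truth + rng.normal(0, 10, 365)
    merged = eof_merge(mod16, ptjpl)
    print(f"RMSE of merged series: {np.sqrt(((merged - truth) ** 2).mean()):.2f}")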

  10. Applications of flight control system methods to an advanced combat rotorcraft

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Fletcher, Jay W.; Morris, Patrick M.; Tucker, George T.

    1989-01-01

    Advanced flight control system design, analysis, and testing methodologies developed at the Ames Research Center are applied in an analytical and flight test evaluation of the Advanced Digital Optical Control System (ADOCS) demonstrator. The primary objectives are to describe the knowledge gained about the implications of digital flight control system design for rotorcraft, and to illustrate the analysis of the resulting handling qualities in the context of the proposed new handling-qualities specification for rotorcraft. Topics covered in depth are digital flight control design and analysis methods, flight testing techniques, ADOCS handling-qualities evaluation results, and correlation of flight test results with analytical models and the proposed handling-qualities specification. The evaluation of the ADOCS demonstrator indicates desirable response characteristics based on equivalent damping and frequency, but undesirably large effective time delays (exceeding 240 msec in all axes). Piloted handling qualities are found to be desirable or adequate for all low, medium, and high pilot-gain tasks, but are inadequate for ultra-high-gain tasks such as slope and running landings.

  11. Convolutional neural networks based on augmented training samples for synthetic aperture radar target recognition

    NASA Astrophysics Data System (ADS)

    Yan, Yue

    2018-03-01

    A synthetic aperture radar (SAR) automatic target recognition (ATR) method based on convolutional neural networks (CNN) trained on augmented training samples is proposed. To enhance the robustness of the CNN to various extended operating conditions (EOCs), the original training images are used to generate noisy samples at different signal-to-noise ratios (SNRs), multiresolution representations, and partially occluded images. Then, the generated images together with the original ones are used to train a designed CNN for target recognition. The augmented training samples correspondingly improve the robustness of the trained CNN to the covered EOCs, i.e., noise corruption, resolution variance, and partial occlusion. Moreover, the significantly larger training set effectively enhances the representation capability for other conditions, e.g., the standard operating condition (SOC), as well as the stability of the network. Therefore, better performance can be achieved by the proposed method for SAR ATR. For experimental evaluation, extensive experiments are conducted on the Moving and Stationary Target Acquisition and Recognition dataset under SOC and several typical EOCs.
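
    One of the three augmentations is easy to make concrete: corrupting a training chip with additive white Gaussian noise at a prescribed SNR. This is a plausible reading of the noise augmentation rather than the authors' exact generator; the multiresolution and occlusion augmentations would follow the same pattern.

    import numpy as np

    def add_noise_at_snr(image, snr_db, rng=None):
        # Return a copy of `image` corrupted by white Gaussian noise whose
        # power is set so the result has the requested SNR in dB.
        rng = np.random.default_rng() if rng is None else rng
        signal_power = np.mean(image.astype(float) ** 2)
        noise_power = signal_power / (10.0 ** (snr_db / 10.0))
        return image + rng.normal(0.0, np.sqrt(noise_power), image.shape)

    chip = np.random.rand(128, 128)   # stand-in for a SAR training chip
    augmented = [add_noise_at_snr(chip, snr) for snr in (0, 5, 10)]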

  12. Detection of Abnormal Operation Noise Using CHLAC of Sound Spectrograph and Continuous DP Matching

    NASA Astrophysics Data System (ADS)

    Hattori, Koosuke; Ohmi, Taishi; Taguchi, Ryo; Umezaki, Taizo

    It is common practice for industrial products to be tested by individual inspectors. When a product involves sound factors, each inspector evaluates the test product to distinguish a strange engine noise from the natural sound. However, it is hard to keep the evaluation criteria consistent because of the personal equation, that is, the inherent bias of every individual, and because an inspector's physical and mental condition can negatively affect his/her judgement. Ideally, the criteria would not be affected by the person, place, or circumstances, so that product quality is assessed uniformly. In this paper, we propose a noise detection method based on the Cubic Higher-order Local Auto-Correlation (CHLAC) scheme applied to the sound spectrograph, together with continuous DP matching based on cepstrum analysis. These techniques have been used in practice for detecting abnormal human movements in monitored video clips and for identifying individual persons by voice. The study results show that the proposed method is highly effective.

  13. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations.

    PubMed

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE estimate of the Jet Propulsion Laboratory (PT-JPL). Twenty-two eddy covariance (EC) sites with LE observations were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts are needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types.

  14. Automatic Mosaicking of Satellite Imagery Considering the Clouds

    NASA Astrophysics Data System (ADS)

    Kang, Yifei; Pan, Li; Chen, Qi; Zhang, Tong; Zhang, Shasha; Liu, Zhang

    2016-06-01

    With the rapid development of high-resolution Earth observation technology, satellite imagery is widely used in the fields of resource investigation, environmental protection, and agricultural research. Image mosaicking is an important part of satellite imagery production. However, the presence of clouds causes problems for automatic image mosaicking, mainly in two respects: (1) image blurring may be introduced during the process of image dodging, and (2) automatically generated seamlines may pass through cloudy areas. To address these problems, an automatic mosaicking method for cloudy satellite imagery is proposed in this paper. Firstly, modified Otsu thresholding and morphological processing are employed to extract cloudy areas and obtain the percentage of cloud cover. Then, the cloud detection results are used to optimize the dodging and mosaicking processes. Thus, the mosaic image is composed of clear-sky areas instead of cloudy areas wherever possible, and the clear-sky areas remain sharp and distortion-free. Chinese GF-1 wide-field-of-view orthoimages are employed as experimental data. The performance of the proposed approach is evaluated in four aspects: the effect of cloud detection, the sharpness of clear-sky areas, the rationality of seamlines, and efficiency. The evaluation results demonstrate that the mosaic image obtained by our method has fewer clouds, better internal color consistency and better visual clarity than that obtained by the traditional method. The time consumed by the proposed method for 17 scenes of GF-1 orthoimages is within 4 hours on a desktop computer. This efficiency can meet general production requirements for massive satellite imagery.
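
    The cloud-extraction step can be sketched with standard tools: Otsu thresholding on a brightness band, a morphological opening to drop small bright speckles, then the cloud-cover percentage. The paper's modification of Otsu's method is not reproduced; this is the plain pipeline on stand-in data.

    import numpy as np
    from skimage.filters import threshold_otsu
    from skimage.morphology import binary_opening, disk

    def cloud_mask(band):
        mask = band > threshold_otsu(band)    # clouds assumed to be the bright class
        return binary_opening(mask, disk(3))  # remove small false positives

    band = np.random.rand(512, 512)           # stand-in for an orthoimage band
    mask = cloud_mask(band)
    print(f"cloud cover: {100.0 * mask.mean():.1f}%")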

  15. Discovering significant evolution patterns from satellite image time series.

    PubMed

    Petitjean, François; Masseglia, Florent; Gançarski, Pierre; Forestier, Germain

    2011-12-01

    Satellite Image Time Series (SITS) provide us with precious information on land cover evolution. By studying these series of images we can both understand the changes of specific areas and discover global phenomena that spread over larger areas. Changes that occur throughout the sensing time can spread over very long periods and may have different start and end times depending on the location, which complicates the mining and analysis of series of images. This work focuses on frequent sequential pattern mining (FSPM) methods, a family that fits the above-mentioned issues: it consists of finding the most frequent evolution behaviors, and is able to extract long-term changes as well as short-term ones, whenever the change may start and end. However, applying FSPM methods to SITS implies confronting two main challenges, related to the characteristics of SITS and the domain's constraints. First, satellite images associate multiple measures with a single pixel (the radiometric levels of different wavelengths corresponding to infra-red, red, etc.), which makes the search space multi-dimensional and thus requires specific mining algorithms. Furthermore, the non-evolving regions, which are the vast majority and overwhelm the evolving ones, challenge the discovery of these patterns. We propose a SITS mining framework that enables discovery of these patterns despite these constraints and characteristics. Our proposal is inspired by FSPM and provides a relevant visualization principle. Experiments carried out on 35 images sensed over 20 years show that the proposed approach makes it possible to extract relevant evolution behaviors.

  16. Comparison of two Classification methods (MLC and SVM) to extract land use and land cover in Johor Malaysia

    NASA Astrophysics Data System (ADS)

    Rokni Deilmai, B.; Ahmad, B. Bin; Zabihi, H.

    2014-06-01

    Mapping is essential for the analysis of land use and land cover, which influence many environmental processes and properties. When creating land cover maps, it is important to minimize error, since errors will propagate into later analyses based on these maps. The reliability of land cover maps derived from remotely sensed data depends on an accurate classification. In this study, we analyzed multispectral data using two different classifiers: the Maximum Likelihood Classifier (MLC) and the Support Vector Machine (SVM). To pursue this aim, Landsat Thematic Mapper data and identical field-based training sample datasets for Johor, Malaysia were used with each classification method, yielding five land cover classes: forest, oil palm, urban area, water, and rubber. The classification results indicate that SVM was more accurate than MLC. With its demonstrated capability to produce reliable results, the SVM method should be especially useful for land cover classification.
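
    A compact re-enactment of the comparison on simulated pixels: quadratic discriminant analysis with equal priors is equivalent to the Gaussian maximum likelihood classifier, and an RBF-kernel SVC stands in for the SVM. Band values and class structure are synthetic, so the accuracies only illustrate the workflow, not the paper's result.

    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)
    n_classes = 5   # forest, oil palm, urban area, water, rubber
    X = np.vstack([rng.normal(loc=c, scale=1.2, size=(200, 6)) for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), 200)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

    mlc = QuadraticDiscriminantAnalysis(priors=[1 / n_classes] * n_classes).fit(Xtr, ytr)
    svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(Xtr, ytr)
    print(f"MLC accuracy: {mlc.score(Xte, yte):.3f}")
    print(f"SVM accuracy: {svm.score(Xte, yte):.3f}")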

  17. Automatic design of magazine covers

    NASA Astrophysics Data System (ADS)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  18. The Model Experiments and Finite Element Analysis on Deformation and Failure by Excavation of Grounds in Foregoing-roof Method

    NASA Astrophysics Data System (ADS)

    Sotokoba, Yasumasa; Okajima, Kenji; Iida, Toshiaki; Tanaka, Tadatsugu

    We propose the trenchless box culvert construction method to construct box culverts beneath shallow covering soil layers while keeping roads or tracks open. When using this construction method, it is necessary to clarify the deformation and shear failure caused by excavation of the ground. In order to investigate the soil behavior, model experiments and elasto-plastic finite element analysis were performed. In the model experiments, it was shown that the shear failure developed from the end of the roof to the toe of the boundary surface. In the finite element analysis, a shear band effect was introduced. Comparing the observed shear bands in the model experiments with the computed maximum shear strain contours, it was found that the observed direction of the shear band could be simulated reasonably by the finite element analysis. We may say that the finite element method used in this study is a useful tool for this construction method.

  19. Development, validation and determination of multiclass pesticide residues in cocoa beans using gas chromatography and liquid chromatography tandem mass spectrometry.

    PubMed

    Zainudin, Badrul Hisyam; Salleh, Salsazali; Mohamed, Rahmat; Yap, Ken Choy; Muhamad, Halimah

    2015-04-01

    An efficient and rapid method for the analysis of pesticide residues in cocoa beans using gas and liquid chromatography-tandem mass spectrometry was developed, validated and applied to imported and domestic cocoa bean samples collected over 2 years from smallholders and Malaysian ports. The method was based on solvent extraction and covers 26 pesticides (insecticides, fungicides, and herbicides) of different chemical classes. The recoveries for all pesticides at 10 and 50 μg/kg were in the range of 70-120%, with relative standard deviations of less than 20%. Good selectivity and sensitivity were obtained, with a method limit of quantification of 10 μg/kg. The expanded measurement uncertainties were in the range of 4-25%. Finally, the proposed method was successfully applied to the routine analysis of pesticide residues in cocoa beans via a monitoring study, in which 10% of the samples were found positive for chlorpyrifos, ametryn and metalaxyl. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Improvement of a method for positioning of pithead by considering motion of the surface water

    NASA Astrophysics Data System (ADS)

    Yi, H.; Lee, D. K.

    2016-12-01

    Underground mining has weaknesses compared with open-pit mining in terms of efficiency, economy and working environment. However, the method is applied for the development of deep orebodies. A development plan is established once the economic valuation and technical analysis of the deposit, based on mineral resource exploration, are complete. Development, one of the steps of the mining process, opens a passage from the ground surface to the orebody. The planning covers details such as pithead positioning, mining method selection, and shaft design. Among these, pithead positioning is carried out by considering infrastructure, watershed, geology, and economy. In this study, we propose a method that considers the motion of surface water in order to improve existing pithead positioning techniques. The method analyses the terrain around the mine and derives surface-water flow information; the drainage treatment cost for each candidate pithead location is then estimated. This study covers the concept and design of the scheme.

  1. Simulation of two-phase flow in horizontal fracture networks with numerical manifold method

    NASA Astrophysics Data System (ADS)

    Ma, G. W.; Wang, H. D.; Fan, L. F.; Wang, B.

    2017-10-01

    The paper presents simulation of two-phase flow in discrete fracture networks with the numerical manifold method (NMM). Each phase of the fluids is considered to be confined within the assumed discrete interfaces in the present method. The homogeneous model is modified to approximate the mixed fluids. A new mathematical cover formation for fracture intersections is proposed to satisfy mass conservation. NMM simulations of two-phase flow in a single fracture, an intersection, and a fracture network are illustrated graphically and validated against the analytical method or the finite element method. Results show that the motion status of the discrete interface depends significantly on the ratio of the mobilities of the two fluids rather than on the value of the mobility itself. The variation of fluid velocity in each fracture segment and the driven fluid content are also influenced by the mobility ratio. The advantages of NMM in the simulation of two-phase flow in a fracture network are demonstrated in the present study, and the method can be further developed for practical engineering applications.

  2. Virtual substrate method for nanomaterials characterization

    PubMed Central

    Da, Bo; Liu, Jiangwei; Yamamoto, Mahito; Ueda, Yoshihiro; Watanabe, Kazuyuki; Cuong, Nguyen Thanh; Li, Songlin; Tsukagoshi, Kazuhito; Yoshikawa, Hideki; Iwai, Hideo; Tanuma, Shigeo; Guo, Hongxuan; Gao, Zhaoshun; Sun, Xia; Ding, Zejun

    2017-01-01

    Characterization techniques available for bulk or thin-film solid-state materials have been extended to substrate-supported nanomaterials, but generally non-quantitatively. This is because the nanomaterial signals are inevitably buried in the signals from the underlying substrate in common reflection-configuration techniques. Here, we propose a virtual substrate method, inspired by the four-point probe technique for resistance measurement as well as the chop-nod method in infrared astronomy, to characterize nanomaterials without the influence of underlying substrate signals from four interrelated measurements. By implementing this method in secondary electron (SE) microscopy, a SE spectrum (white electrons) associated with the reflectivity difference between two different substrates can be tracked and controlled. The SE spectrum is used to quantitatively investigate the covering nanomaterial based on subtle changes in the transmission of the nanomaterial with high efficiency rivalling that of conventional core-level electrons. The virtual substrate method represents a benchmark for surface analysis to provide ‘free-standing' information about supported nanomaterials. PMID:28548114

  3. Quality optimized medical image information hiding algorithm that employs edge detection and data coding.

    PubMed

    Al-Dmour, Hayat; Al-Ani, Ahmed

    2016-04-01

    The present work has the goal of developing a secure medical imaging information system based on a combined steganography and cryptography technique. It attempts to securely embed a patient's confidential information into his/her medical images. The proposed information security scheme conceals coded Electronic Patient Records (EPRs) in medical images in order to protect the EPRs' confidentiality without affecting the image quality, and particularly the Region of Interest (ROI), which is essential for diagnosis. The secret EPR data is converted into ciphertext using a private symmetric encryption method. Since the Human Visual System (HVS) is less sensitive to alterations in sharp regions than in uniform regions, a simple edge detection method has been introduced to identify edge pixels and embed in them, which leads to improved stego image quality. In order to increase the embedding capacity, the algorithm embeds a variable number of bits (up to 3) in edge pixels, based on the strength of the edges. Moreover, to increase efficiency, two message coding mechanisms have been utilized to enhance the ±1 steganography. The first one, based on Hamming codes, is simple and fast, while the other, known as the Syndrome Trellis Code (STC), is more sophisticated, as it attempts to find a stego image that is close to the cover image by minimizing the embedding impact. To refrain from introducing any modifications to the ROI, the proposed algorithm embeds the secret data bits only into the Region of Non Interest (RONI). The experimental results demonstrate that the proposed method can embed a large amount of secret data without leaving noticeable distortion in the output image, and its effectiveness is also proven using one of the efficient steganalysis techniques. The proposed medical imaging information system proved capable of concealing EPR data and producing imperceptible stego images with minimal embedding distortion compared with other existing methods. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
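
    A hedged sketch of the embedding rule: stronger edges carry more secret bits, and only RONI pixels are touched. Edge strength here is a plain gradient magnitude and the bits are written by straight LSB replacement; the paper's edge detector, ±1 coding, Hamming/STC layers and thresholds are not reproduced, so every threshold and name below is an assumption.

    import numpy as np

    def edge_strength(img):
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gx, gy)

    def embed(img, bits, roni_mask, thresholds=(5.0, 15.0, 30.0)):
        # `bits` is a string of '0'/'1'; embedding is restricted to RONI pixels.
        out, strength, k = img.copy(), edge_strength(img), 0
        for r, c in zip(*np.nonzero(roni_mask)):
            n = sum(strength[r, c] > t for t in thresholds)  # 0-3 bits per pixel
            if n == 0:
                continue                  # uniform region: leave untouched
            if k >= len(bits):
                break
            chunk = bits[k:k + n].ljust(n, "0")
            out[r, c] = (int(out[r, c]) & ~((1 << n) - 1)) | int(chunk, 2)
            k += n
        return out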

  4. A Landsat Time-Series Stacks Model for Detection of Cropland Change

    NASA Astrophysics Data System (ADS)

    Chen, J.; Chen, J.; Zhang, J.

    2017-09-01

    Global, timely, accurate and cost-effective cropland monitoring with a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential for describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are severely influenced by seasonal differences, and are therefore likely to report pseudo changes. Here, we introduce and test the Landsat time-series stacks model (LTSM), an improvement of the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo changes caused by phenology driven by seasonal patterns. The main idea is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM then defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that the LTSM method correctly detected the "true changes" without overestimating the "false" ones, while CVA flagged "true change" pixels together with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37%, with a kappa coefficient of 0.676.
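
    The per-pixel, per-band model lends itself to a short sketch: a two-term (annual plus semi-annual) harmonic fitted by least squares to one year of observations, with change flagged where observations depart from the prediction. The threshold and the iterative re-fit are simplified assumptions here.

    import numpy as np

    def harmonic_design(doy):
        # design matrix for a two-term harmonic model over day-of-year
        w = 2 * np.pi * doy / 365.25
        return np.column_stack([np.ones_like(w),
                                np.cos(w), np.sin(w),          # annual term
                                np.cos(2 * w), np.sin(2 * w)]) # semi-annual term

    def fit_predict(doy, values):
        A = harmonic_design(doy)
        coef, *_ = np.linalg.lstsq(A, values, rcond=None)
        return A @ coef

    doy = np.arange(8, 366, 16.0)   # a Landsat-like 16-day revisit series
    obs = 0.3 + 0.1 * np.sin(2 * np.pi * doy / 365.25) + np.random.normal(0, 0.01, doy.size)
    residuals = obs - fit_predict(doy, obs)
    changed = np.abs(residuals) > 3 * residuals.std()  # simple change flag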

  5. Hybrid Geometric Calibration Method for Multi-Platform Spaceborne SAR Image with Sparse Gcps

    NASA Astrophysics Data System (ADS)

    Lv, G.; Tang, X.; Ai, B.; Li, T.; Chen, Q.

    2018-04-01

    Geometric calibration is able to provide high-accuracy geometric coordinates for spaceborne SAR imagery by refining the geometric parameters of the Range-Doppler model with ground control points (GCPs). However, it is very difficult to obtain GCPs covering large-scale areas, especially in mountainous regions. In addition, the traditional calibration method is only used for single-platform SAR images and cannot support hybrid geometric calibration of multi-platform images. To solve these problems, a hybrid geometric calibration method for multi-platform spaceborne SAR images with sparse GCPs is proposed in this paper. First, we calibrate the master image, which contains the GCPs. Secondly, a point tracking algorithm is used to obtain tie points (TPs) between the master and slave images. Finally, we calibrate the slave images using the TPs as GCPs. We take the Beijing-Tianjin-Hebei region as an example, studying the hybrid geometric calibration method on 3 TerraSAR-X images, 3 TanDEM-X images and 5 GF-3 images covering more than 235 kilometers in the north-south direction. Geometric calibration of all images is completed using only 5 GCPs. GPS data extracted from a GNSS receiver are used to assess the planimetric accuracy after calibration. The results show that, after geometric calibration with sparse GCPs, the geometric positioning accuracy is 3 m for TSX/TDX images and 7.5 m for GF-3 images.

  6. Effect of an Adapted "Cover Write" Method to Word-Naming and Spelling to Students with Developmental Disabilities in Turkey

    ERIC Educational Resources Information Center

    Erbas, Dilek; Turan, Yasemin; Ozen, Arzu; Halle, James W.

    2006-01-01

    The purpose of the present study was to assess the effectiveness of the "cover write" method of teaching word-naming and spelling to two Turkish students with developmental disabilities. A multiple-probe design across three, 5-word sets was employed to assess the effectiveness of the intervention. The "cover write" method was…

  7. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrator on a case-by-case basis. (c) Electric program. Applications for financial assistance for certain... items covered by § 1794.22(a)(8). All new associated facilities and related electric power lines shall... covered by § 1794.22(a)(8). All new associated facilities and related electric power lines shall be...

  8. 75 FR 31481 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... consumer financial information from a Covered Person, the Covered Person must provide a notice to each... require only approximately 2,147 of them to provide consumers with notice and an opt-out opportunity. The... opportunities to consumers, and would incur an average first-year burden of 18 hours in doing so, for a total...

  9. 78 FR 62506 - TRICARE; Coverage of Care Related to Non-Covered Initial Surgery or Treatment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ... Supplemental Health Care Program waiver. This proposed rule is necessary to protect TRICARE beneficiaries from...-covered surgery or treatment was necessary to assure adequate availability of health care to the Active... Regulatory Actions Under the TRICARE private sector health care program, certain conditions and treatments...

  10. Improving urban land use and land cover classification from high-spatial-resolution hyperspectral imagery using contextual information

    USDA-ARS?s Scientific Manuscript database

    In this paper, we propose approaches to improve the pixel-based support vector machine (SVM) classification for urban land use and land cover (LULC) mapping from airborne hyperspectral imagery with high spatial resolution. Class spatial neighborhood relationship is used to correct the misclassified ...

  11. 63 FR 47026 - Proposed Vaccine Information Materials for Hepatitis B, Haemophilus influenzae type b (Hib...

    Federal Register 2010, 2011, 2012, 2013, 2014

    1998-09-03

    ... United States. Meningitis is an infection of the brain and spinal cord coverings which can lead to..., meningitis (infection of the brain and spinal cord covering), painful swelling of the testicles, and, rarely... Vaccine Information Materials for Hepatitis B, Haemophilus influenzae type b (Hib), Varicella (Chickenpox...

  12. 75 FR 48712 - Proposed Vaccine Information Materials for Influenza Vaccine

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-11

    ... the parent or legal representative in the case of a child) receiving vaccines covered under the... who intends to administer one of these covered vaccines is required to provide copies of the relevant.... In such cases, the only revision to the influenza VIS is the notation of the flu season for which the...

  13. 76 FR 14647 - Proposed Information Collection; Comment Request; 2012 Economic Census Covering the Mining Sector

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-17

    ... Economic Census Covering the Mining Sector AGENCY: U.S. Census Bureau. ACTION: Notice. SUMMARY: The... provider of timely, relevant and quality data about the people and economy of the United States. Economic data are the Census Bureau's primary program commitment during non-decennial census years. The economic...

  14. 75 FR 48706 - Proposed Vaccine Information Materials for Rotavirus Vaccine

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-11

    ... the parent or legal representative in the case of a child) receiving vaccines covered under the... who intends to administer one of these covered vaccines is required to provide copies of the relevant... accompanied by vomiting and fever. Rotavirus is not the only cause of severe diarrhea, but it is one of the...

  15. 78 FR 48821 - Energy Conservation Program for Consumer Products and Certain Commercial and Industrial Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-12

    ... Determination of Computers as a Covered Consumer Product AGENCY: Office of Energy Efficiency and Renewable... computers qualify as a covered product. DATES: The comment period for the proposed determination relating to computers, published on July 12, 2013 (78 FR 41873), is extended. Comments are due September 12, 2013...

  16. Regulatory assessment with regulatory flexibility analysis : draft regulatory evaluation - Notice of Proposed Rulemaking -- Pipeline Safety : safety standards for increasing the maximum allowable operating pressure for natural gas transmission pipelines.

    DOT National Transportation Integrated Search

    2008-02-01

    The Pipeline and Hazardous Materials Safety Administration (PHMSA) is proposing changes to the Federal pipeline safety regulations in 49 CFR Part 192, which cover the transportation of natural gas by pipeline. Specifically, PHMSA proposes allowing na...

  17. Regulatory assessment with regulatory flexibility analysis and paperwork reduction act analysis : draft regulatory evaluation : Notice of Proposed Rulemaking -- Pipeline Safety : Polyamide-11 (PA-11) plastic pipe design pressures

    DOT National Transportation Integrated Search

    2007-06-01

    The Pipeline and Hazardous Materials Safety Administration (PHMSA) is proposing changes to the Federal pipeline safety regulations in 49 CFR Part 192, which cover the transportation of natural gas by pipeline. Specifically, PHMSA is proposing to chan...

  18. Evolution of Martian polar landscapes - Interplay of long-term variations in perennial ice cover and dust storm intensity

    NASA Technical Reports Server (NTRS)

    Cutts, J. A.; Blasius, K. R.; Roberts, W. J.

    1979-01-01

    The discovery of a new type of Martian polar terrain, called undulating plain, is reported and the evolution of the plains and other areas of the Martian polar region is discussed in terms of the trapping of dust by the perennial ice cover. High-resolution Viking Orbiter 2 observations of the north polar terrain reveal perennially ice-covered surfaces with low relief, wavelike, regularly spaced, parallel ridges and troughs (undulating plains) occupying areas of the polar terrain previously thought to be flat, and associated with troughs of considerable local relief which exhibit at least partial annual melting. It is proposed that the wavelike topography of the undulating plains originates from long-term periodic variations in cyclical dust precipitation at the margin of a growing or receding perennial polar cap in response to changes in insolation. The troughs are proposed to originate from areas of steep slope in the undulating terrain which have lost their perennial ice cover and have become incapable of trapping dust. The polar landscape thus appears to record the migrations, expansions and contractions of the Martian polar cap.

  19. Solutions of interval type-2 fuzzy polynomials using a new ranking method

    NASA Astrophysics Data System (ADS)

    Rahman, Nurhakimah Ab.; Abdullah, Lazim; Ghani, Ahmad Termimi Ab.; Ahmad, Noor'Ani

    2015-10-01

    A few years ago, a ranking method was introduced for fuzzy polynomial equations. The idea of the ranking method is to find the actual roots of fuzzy polynomials (if they exist). Fuzzy polynomials are transformed into systems of crisp polynomials using a ranking method based on three parameters, namely Value, Ambiguity and Fuzziness. However, it was found that solutions based on these three parameters are quite inefficient at producing answers. Therefore, in this study a new ranking method has been developed with the aim of overcoming this inherent weakness. The new ranking method, which has four parameters, is then applied to interval type-2 fuzzy polynomials, covering the interval type-2 fuzzy polynomial equation, dual fuzzy polynomial equations and systems of fuzzy polynomials. The efficiency of the new ranking method is then considered numerically for triangular fuzzy numbers and trapezoidal fuzzy numbers. Finally, the approximate solutions produced in the numerical examples indicate that the new ranking method successfully produces actual roots for interval type-2 fuzzy polynomials.

  20. Indexing NASA programs for technology transfer methods development and feasibility

    NASA Technical Reports Server (NTRS)

    Clingman, W. H.

    1972-01-01

    This project was undertaken to evaluate the application of a previously developed indexing methodology to ongoing NASA programs. These programs are comprehended by the NASA Program Approval Documents (PADs). Each PAD contains a technical plan for the area it covers. It was proposed that these could be used to generate an index to the complete NASA program. To test this hypothesis, two PADs were selected by the NASA Technology Utilization Office for trial indexing. Twenty-five individuals indexed the two PADs using NASA Thesaurus terms. The results demonstrated the feasibility of indexing ongoing NASA programs using PADs as the source of information. The same indexing methodology could be applied to other documents containing a brief description of the technical plan. Results of this project showed that over 85% of the concepts in the technology would be covered by the indexing, and over 85% of the descriptors chosen would be accurate. This completeness and accuracy of the indexing is considered satisfactory for application in technology transfer.
