Sample records for sampling-based maps

  1. Mapping of bird distributions from point count surveys

    USGS Publications Warehouse

    Sauer, J.R.; Pendleton, G.W.; Orsillo, Sandra; Ralph, C.J.; Sauer, J.R.; Droege, S.

    1995-01-01

    Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes in proportion counted as a function of observer or habitat differences. Large-scale surveys also generally suffer from regional and temporal variation in sampling intensity. A simulated surface is used to demonstrate sampling principles for maps.

  2. Herd-level prevalence of Map infection in dairy herds of southern Chile determined by culture of environmental fecal samples and bulk-tank milk qPCR.

    PubMed

    Kruze, J; Monti, G; Schulze, F; Mella, A; Leiva, S

    2013-09-01

    Paratuberculosis, an infectious disease of domestic and wild ruminants caused by Mycobacterium avium subsp. paratuberculosis (Map), is an economically important disease in dairy herds worldwide. In Chile the disease has been reported in domestic and wildlife animals. However, accurate and updated estimates of the herd prevalence in cattle at the national or regional level are not available. The objectives of this study were to determine the herd-level prevalence of dairy herds with Map-infected animals in Southern Chile, based on two diagnostic tests: culture of environmental fecal samples and bulk-tank milk qPCR. Two composite environmental fecal samples and one bulk-tank milk sample were collected during September 2010 and September 2011 from 150 dairy farms in Southern Chile. Isolation of Map from environmental fecal samples was done by culture of decontaminated samples on a commercial Herrold's Egg Yolk Medium (HEYM) with and without mycobactin J. Suspicious colonies were confirmed to be Map by conventional IS900 PCR. Map detection in bulk-tank milk samples was done by a real-time IS900 PCR assay. PCR-confirmed Map was isolated from 58 (19.3%) of 300 environmental fecal samples. Holding pens and manure storage lagoons were the two most frequently Map-positive sites, representing 35% and 33% of total positive samples, respectively. However, parlor exits and cow alleyways were the two sites with the highest proportion of positive samples (40% and 32%, respectively). Herd prevalence based on environmental fecal culture was 27% (true prevalence 44%) compared to 49% (true prevalence 87%) based on bulk-tank milk real-time IS900 PCR. In both cases herd prevalence was higher in large herds (>200 cows). These results confirm that Map infection is widespread in dairy herds in Southern Chile, with a rough herd-level prevalence of 28-100% depending on the herd size, and that IS900 PCR on bulk-tank milk samples is more sensitive than environmental fecal culture for detecting Map-infected dairy herds. Copyright © 2013 Elsevier B.V. All rights reserved.
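
    The gap between the apparent herd prevalences (27% and 49%) and the reported true prevalences (44% and 87%) implies an adjustment for imperfect test sensitivity and specificity. The abstract does not state which adjustment was used; the sketch below shows the Rogan-Gladen estimator, a common choice for this kind of correction, with placeholder sensitivity and specificity values that are not taken from the study.

    ```python
    # Rogan-Gladen adjustment: apparent prevalence -> true prevalence,
    # given the test's sensitivity (Se) and specificity (Sp).
    def rogan_gladen(apparent, se, sp):
        tp = (apparent + sp - 1.0) / (se + sp - 1.0)
        return min(max(tp, 0.0), 1.0)   # clip to [0, 1]

    # Illustrative test characteristics only (NOT the values used in the paper).
    se_culture, sp_culture = 0.65, 1.00
    print(rogan_gladen(0.27, se_culture, sp_culture))   # apparent herd prevalence of 27%
    ```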

  3. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike polynomials and a power spectral density model, such re-sampling does not introduce the aliasing and interpolation errors produced by the conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering methods. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulting surface height map is continuous or smoothly varying. So far, the preferred method used for re-sampling a surface map is two-dimensional interpolation. The main problem with this method is that the same pixel can take different values depending on the interpolation scheme used, such as the "nearest," "linear," "cubic," and "spline" options in Matlab. The conventional, FFT-based spatial filtering method used to eliminate the surface measurement noise or measurement errors can also suffer from aliasing effects. During re-sampling of a surface map, this software preserves the low spatial-frequency characteristics of a given surface map through the use of Zernike-polynomial fit coefficients, and maintains mid- and high-spatial-frequency characteristics of the given surface map by the use of a PSD model derived from the two-dimensional PSD data of the mid- and high-spatial-frequency components of the original surface map. Because this new method creates the new surface map in the desired sampling format from analytical expressions only, it does not encounter any aliasing effects and does not cause any discontinuity in the resultant surface map.
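
    A rough sketch of the low-spatial-frequency half of the approach described above: fit a handful of low-order Zernike terms to the measured map by least squares and evaluate the analytical fit on the new grid. The mid- and high-spatial-frequency PSD-model synthesis is omitted, the number of Zernike terms is arbitrary, and the data are synthetic, so this illustrates the idea rather than the software described in the record.

    ```python
    import numpy as np

    # First few Zernike terms (piston, tilts, defocus, astigmatism) on the unit disk.
    def zernike_basis(rho, theta):
        return np.stack([
            np.ones_like(rho),           # piston
            rho * np.cos(theta),         # tilt x
            rho * np.sin(theta),         # tilt y
            2 * rho**2 - 1,              # defocus
            rho**2 * np.cos(2 * theta),  # astigmatism 0 deg
            rho**2 * np.sin(2 * theta),  # astigmatism 45 deg
        ], axis=-1)

    def unit_disk(n):
        y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
        return np.hypot(x, y), np.arctan2(y, x)

    def resample_low_order(surface, n_out):
        """Least-squares Zernike fit on the input grid, evaluated on an n_out x n_out grid."""
        rho, theta = unit_disk(surface.shape[0])
        mask = rho <= 1.0
        coeffs, *_ = np.linalg.lstsq(zernike_basis(rho[mask], theta[mask]),
                                     surface[mask], rcond=None)
        rho2, theta2 = unit_disk(n_out)
        mask2 = rho2 <= 1.0
        out = np.zeros((n_out, n_out))
        out[mask2] = zernike_basis(rho2[mask2], theta2[mask2]) @ coeffs
        return out

    # Example: down-sample a noisy synthetic 256x256 map to 64x64 via the analytical fit.
    rho, _ = unit_disk(256)
    measured = np.where(rho <= 1.0, 2 * rho**2 - 1 + 0.05 * np.random.randn(256, 256), 0.0)
    print(resample_low_order(measured, 64).shape)
    ```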

  4. TumorMap: Exploring the Molecular Similarities of Cancer Samples in an Interactive Portal.

    PubMed

    Newton, Yulia; Novak, Adam M; Swatloski, Teresa; McColl, Duncan C; Chopra, Sahil; Graim, Kiley; Weinstein, Alana S; Baertsch, Robert; Salama, Sofie R; Ellrott, Kyle; Chopra, Manu; Goldstein, Theodore C; Haussler, David; Morozova, Olena; Stuart, Joshua M

    2017-11-01

    Vast amounts of molecular data are being collected on tumor samples, which provide unique opportunities for discovering trends within and between cancer subtypes. Such cross-cancer analyses require computational methods that enable intuitive and interactive browsing of thousands of samples based on their molecular similarity. We created a portal called TumorMap to assist in exploration and statistical interrogation of high-dimensional complex "omics" data in an interactive and easily interpretable way. In the TumorMap, samples are arranged on a hexagonal grid based on their similarity to one another in the original genomic space and are rendered with Google's Map technology. While the important feature of this public portal is the ability for the users to build maps from their own data, we pre-built genomic maps from several previously published projects. We demonstrate the utility of this portal by presenting results obtained from The Cancer Genome Atlas project data. Cancer Res; 77(21); e111-4. ©2017 American Association for Cancer Research.

  5. TumorMap: Exploring the Molecular Similarities of Cancer Samples in an Interactive Portal

    PubMed Central

    Newton, Yulia; Novak, Adam M.; Swatloski, Teresa; McColl, Duncan C.; Chopra, Sahil; Graim, Kiley; Weinstein, Alana S.; Baertsch, Robert; Salama, Sofie R.; Ellrott, Kyle; Chopra, Manu; Goldstein, Theodore C.; Haussler, David; Morozova, Olena; Stuart, Joshua M.

    2017-01-01

    Vast amounts of molecular data are being collected on tumor samples, which provide unique opportunities for discovering trends within and between cancer subtypes. Such cross-cancer analyses require computational methods that enable intuitive and interactive browsing of thousands of samples based on their molecular similarity. We created a portal called TumorMap to assist in exploration and statistical interrogation of high-dimensional complex “omics” data in an interactive and easily interpretable way. In the TumorMap, samples are arranged on a hexagonal grid based on their similarity to one another in the original genomic space and are rendered with Google’s Map technology. While the important feature of this public portal is the ability for the users to build maps from their own data, we pre-built genomic maps from several previously published projects. We demonstrate the utility of this portal by presenting results obtained from The Cancer Genome Atlas project data. PMID:29092953

  6. Viable Mycobacterium avium ssp. paratuberculosis isolated from calf milk replacer.

    PubMed

    Grant, Irene R; Foddai, Antonio C G; Tarrant, James C; Kunkel, Brenna; Hartmann, Faye A; McGuirk, Sheila; Hansen, Chungyi; Talaat, Adel M; Collins, Michael T

    2017-12-01

    When advising farmers on how to control Johne's disease in an infected herd, one of the main recommendations is to avoid feeding waste milk to calves and instead feed calf milk replacer (CMR). This advice is based on the assumption that CMR is free of viable Mycobacterium avium ssp. paratuberculosis (MAP) cells, an assumption that has not previously been challenged. We tested commercial CMR products (n = 83) obtained from dairy farms around the United States by the peptide-mediated magnetic separation (PMS)-phage assay, PMS followed by liquid culture (PMS-culture), and direct IS900 quantitative PCR (qPCR). Conventional microbiological analyses for total mesophilic bacterial counts, coliforms, Salmonella, coagulase-negative staphylococci, streptococci, nonhemolytic Corynebacterium spp., and Bacillus spp. were also performed to assess the overall microbiological quality of the CMR. Twenty-six (31.3%) of the 83 CMR samples showed evidence of the presence of MAP. Seventeen (20.5%) tested positive for viable MAP by the PMS-phage assay, with plaque counts ranging from 6 to 1,212 pfu/50 mL of reconstituted CMR (average 248.5 pfu/50 mL). Twelve (14.5%) CMR samples tested positive for viable MAP by PMS-culture; isolates from all 12 of these samples were subsequently confirmed by whole-genome sequencing to be different cattle strains of MAP. Seven (8.4%) CMR samples tested positive for MAP DNA by IS900 qPCR. Four CMR samples tested positive by both PMS-based tests and 5 CMR samples tested positive by IS900 qPCR plus one or other of the PMS-based tests, but only one CMR sample tested positive by all 3 MAP detection tests applied. All conventional microbiology results were within current standards for whole milk powders. A significant association existed between higher total bacterial counts and presence of viable MAP indicated by either of the PMS-based assays. This represents the first published report of the isolation of viable MAP from CMR. Our findings raise concerns about the potential ability of MAP to survive manufacture of dried milk-based products. The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/).

  7. Regional geochemical maps of the Tonopah 1 degree by 2 degrees Quadrangle, Nevada, based on samples of stream sediment and nonmagnetic heavy-mineral concentrate

    USGS Publications Warehouse

    Nash, J.T.; Siems, D.F.

    1988-01-01

    The geochemical maps in this report are based on analytical results reported by Fairfield and others (1985), Hill and others (1986), and Siems and others (1986). These reports also describe the sample preparation and analytical methods and provide information on the location of the sample sites.

  8. Using ancestry matching to combine family-based and unrelated samples for genome-wide association studies

    PubMed Central

    Crossett, Andrew; Kent, Brian P.; Klei, Lambertus; Ringquist, Steven; Trucco, Massimo; Roeder, Kathryn; Devlin, Bernie

    2015-01-01

    We propose a method to analyze family-based samples together with unrelated cases and controls. The method builds on the idea of matched case–control analysis using conditional logistic regression (CLR). For each trio within the family, a case (the proband) and matched pseudo-controls are constructed, based upon the transmitted and untransmitted alleles. Unrelated controls, matched by genetic ancestry, supplement the sample of pseudo-controls; likewise unrelated cases are also paired with genetically matched controls. Within each matched stratum, the case genotype is contrasted with control and pseudo-control genotypes via CLR, using a method we call matched-CLR (mCLR). Eigenanalysis of numerous SNP genotypes provides a tool for mapping genetic ancestry. The result of such an analysis can be thought of as a multidimensional map, or eigenmap, in which the relative genetic similarities and differences amongst individuals are encoded in the map. Once constructed, new individuals can be projected onto the ancestry map based on their genotypes. Successful differentiation of individuals of distinct ancestry depends on having a diverse, yet representative sample from which to construct the ancestry map. Once samples are well-matched, mCLR yields comparable power to competing methods while ensuring excellent control over Type I error. PMID:20862653

  9. Uncertainty in the profitability of fertilizer management based on various sampling designs.

    NASA Astrophysics Data System (ADS)

    Muhammed, Shibu; Marchant, Ben; Webster, Richard; Milne, Alice; Dailey, Gordon; Whitmore, Andrew

    2016-04-01

    Many farmers sample their soil to measure the concentrations of plant nutrients, including phosphorus (P), so as to decide how much fertilizer to apply. Now that fertilizer can be applied at variable rates, farmers want to know whether maps of nutrient concentration made from grid samples or from field subdivisions (zones within their fields) are merited: do such maps lead to greater profit than would a single measurement on a bulked sample for each field when all costs are taken into account? We have examined the merits of grid-based and zone-based sampling strategies over single field-based averages using continuous spatial data on wheat yields at harvest in six fields in southern England and simulated concentrations of P in the soil. Features of the spatial variation in the yields provide predictions about which sampling scheme is likely to be most cost effective, but there is uncertainty associated with these predictions that must be communicated to farmers. Where variograms of the yield have large variances and long effective ranges, grid-sampling and mapping nutrients are likely to be cost-effective. Where effective ranges are short, sampling must be dense to reveal the spatial variation and may be expensive. In these circumstances variable-rate application of fertilizer is likely to be impracticable and almost certainly not cost-effective. We have explored several methods for communicating these results and found that the most effective method was using probability maps that show the likelihood of grid-based and zone-based sampling being more profitable than a field-based estimate.
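
    The decision rule described above hinges on the sill (variance) and effective range of the yield variogram. The following is a minimal sketch, on synthetic data, of how an isotropic empirical semivariogram can be computed so that those two quantities can be read off; the binning choices are arbitrary and no variogram model fitting is shown.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    def empirical_variogram(coords, values, n_bins=15, max_lag=None):
        """Isotropic empirical semivariogram: mean semivariance per distance bin."""
        d = pdist(coords)                                        # pairwise distances
        g = 0.5 * pdist(values[:, None], metric="sqeuclidean")  # pairwise semivariances
        max_lag = max_lag or d.max() / 2
        bins = np.linspace(0, max_lag, n_bins + 1)
        idx = np.digitize(d, bins) - 1
        lags = [d[idx == k].mean() for k in range(n_bins) if (idx == k).any()]
        gamma = [g[idx == k].mean() for k in range(n_bins) if (idx == k).any()]
        return np.array(lags), np.array(gamma)

    # Synthetic yield points on a 1 km x 1 km field (coordinates in metres).
    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 1000, size=(500, 2))
    values = np.sin(coords[:, 0] / 200.0) + 0.3 * rng.standard_normal(500)
    lags, gamma = empirical_variogram(coords, values)
    print(np.column_stack([lags, gamma]))   # semivariance rising towards the sill
    ```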

  10. Mapping cell populations in flow cytometry data for cross-sample comparison using the Friedman-Rafsky test statistic as a distance measure.

    PubMed

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu; Scheuermann, Richard H

    2016-01-01

    Flow cytometry (FCM) is a fluorescence-based single-cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap-FR, a novel method for cell population mapping across FCM samples. FlowMap-FR is based on the Friedman-Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap-FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap-FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap-FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap-FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap-FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback-Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL-distance in distinguishing equivalent from nonequivalent cell populations. FlowMap-FR was also employed as a distance metric to match cell populations delineated by manual gating across 30 FCM samples from a benchmark FlowCAP data set. An F-measure of 0.88 was obtained, indicating high precision and recall of the FR-based population matching results. FlowMap-FR has been implemented as a standalone R/Bioconductor package so that it can be easily incorporated into current FCM data analytical workflows. © The Authors. Published by Wiley Periodicals, Inc. on behalf of ISAC.
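
    As a rough illustration of the Friedman-Rafsky idea described above, the sketch below pools two samples, builds a Euclidean minimum spanning tree, and counts the edges that join points from different samples; few cross-sample edges indicate differing distributions. The normalization and permutation test used to turn this count into a significance measure are omitted, and the data are synthetic rather than real flow cytometry events.

    ```python
    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree
    from scipy.spatial.distance import pdist, squareform

    def fr_cross_edges(x, y):
        """Count MST edges that connect the two pooled samples (Friedman-Rafsky style)."""
        pooled = np.vstack([x, y])
        labels = np.r_[np.zeros(len(x), dtype=int), np.ones(len(y), dtype=int)]
        mst = minimum_spanning_tree(squareform(pdist(pooled))).tocoo()
        cross = int(np.sum(labels[mst.row] != labels[mst.col]))
        return cross, mst.nnz

    rng = np.random.default_rng(1)
    a = rng.normal(0.0, 1.0, size=(200, 3))   # stand-in for one cell population's markers
    b = rng.normal(0.2, 1.0, size=(200, 3))   # a slightly shifted population
    cross, total = fr_cross_edges(a, b)
    print(f"{cross} of {total} MST edges link the two samples")
    ```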

  11. Mapping of Bird Distributions from Point Count Surveys

    Treesearch

    John R. Sauer; Grey W. Pendleton; Sandra Orsillo

    1995-01-01

    Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes...

  12. Reliability of confidence intervals calculated by bootstrap and classical methods using the FIA 1-ha plot design

    Treesearch

    H. T. Schreuder; M. S. Williams

    2000-01-01

    In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...

  13. Application of IS1311 locus 2 PCR-REA assay for the specific detection of 'Bison type' Mycobacterium avium subspecies paratuberculosis isolates of Indian origin.

    PubMed

    Singh, Ajay Vir; Chauhan, Devendra Singh; Singh, Abhinendra; Singh, Pravin Kumar; Sohal, Jagdip Singh; Singh, Shoor Vir

    2015-01-01

    Of the three major genotypes of Mycobacterium avium subspecies paratuberculosis (MAP), 'Bison type' is the most prevalent genotype in the domestic livestock species of India, and has also been recovered from patients suffering from Crohn's disease. Recently, a new assay based on IS1311 locus 2 PCR-restriction endonuclease analysis (REA) was designed to distinguish between 'Indian Bison type' and non-Indian genotypes. The present study investigated the discriminatory potential of this new assay by screening a panel of MAP isolates of diverse genotypes from different geographical regions. A total of 53 mycobacterial isolates (41 MAP and 12 mycobacteria other than MAP), three MAP genomic DNA samples and 36 MAP-positive faecal DNA samples from different livestock species (cattle, buffaloes, goats, sheep and bison) and geographical regions (India, Canada, USA, Spain and Portugal) were included in the study. The extracted DNA samples (n=92) were analyzed for the presence of MAP-specific sequences (IS900, ISMav2 and HspX) using PCR. DNA samples were further subjected to genotype differentiation using the IS1311 PCR-REA and IS1311 L2 PCR-REA methods. All the DNA samples (except DNA from non-MAP mycobacterial isolates) were positive in all three MAP-specific sequence-based PCRs. IS1311 PCR-REA showed that the MAP DNA samples of Indian origin belonged to 'Bison type', whereas of the 19 non-Indian MAP DNA samples, 2, 15 and 2 were genotyped as 'Bison type', 'Cattle type' and 'Sheep type', respectively. The IS1311 L2 PCR-REA method showed restriction profiles of the 'Bison type' genotype that differed from those of the non-Indian DNA samples. The method successfully discriminated 'Indian Bison type' from the non-Indian genotypes and shows potential as a future epidemiological and genotyping tool for MAP isolates.

  14. A Radio-Map Automatic Construction Algorithm Based on Crowdsourcing

    PubMed Central

    Yu, Ning; Xiao, Chenxian; Wu, Yinfeng; Feng, Renjian

    2016-01-01

    Traditional radio-map-based localization methods need to sample a large number of location fingerprints offline, which requires huge amount of human and material resources. To solve the high sampling cost problem, an automatic radio-map construction algorithm based on crowdsourcing is proposed. The algorithm employs the crowd-sourced information provided by a large number of users when they are walking in the buildings as the source of location fingerprint data. Through the variation characteristics of users’ smartphone sensors, the indoor anchors (doors) are identified and their locations are regarded as reference positions of the whole radio-map. The AP-Cluster method is used to cluster the crowdsourced fingerprints to acquire the representative fingerprints. According to the reference positions and the similarity between fingerprints, the representative fingerprints are linked to their corresponding physical locations and the radio-map is generated. Experimental results demonstrate that the proposed algorithm reduces the cost of fingerprint sampling and radio-map construction and guarantees the localization accuracy. The proposed method does not require users’ explicit participation, which effectively solves the resource-consumption problem when a location fingerprint database is established. PMID:27070623
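
    The "AP-Cluster" step above groups crowdsourced fingerprints and keeps one representative fingerprint per cluster. The sketch below assumes this corresponds to affinity-propagation clustering of RSSI vectors (an assumption, since the record does not spell the algorithm out) and uses synthetic fingerprints; linking each exemplar to a door/anchor position is application-specific and only noted in a comment.

    ```python
    import numpy as np
    from sklearn.cluster import AffinityPropagation

    rng = np.random.default_rng(42)

    # Synthetic crowdsourced fingerprints: RSSI (dBm) from 6 access points,
    # scattered around three underlying locations.
    centres = rng.uniform(-90, -40, size=(3, 6))
    fingerprints = np.vstack([c + rng.normal(0, 3, size=(150, 6)) for c in centres])

    ap = AffinityPropagation().fit(fingerprints)
    representatives = ap.cluster_centers_      # exemplar fingerprints, one per cluster
    print("representative fingerprints found:", len(representatives))

    # In the paper's pipeline each representative fingerprint would next be tied to a
    # reference position (a detected door/anchor) by fingerprint similarity; that step
    # depends on the anchor-detection output and is not sketched here.
    ```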

  15. Three-dimensional dominant frequency mapping using autoregressive spectral analysis of atrial electrograms of patients in persistent atrial fibrillation.

    PubMed

    Salinet, João L; Masca, Nicholas; Stafford, Peter J; Ng, G André; Schlindwein, Fernando S

    2016-03-08

    Areas with high frequency activity within the atrium are thought to be 'drivers' of the rhythm in patients with atrial fibrillation (AF), and ablation of these areas seems to be an effective therapy for eliminating the DF gradient and restoring sinus rhythm. Clinical groups have applied the traditional FFT-based approach to generate three-dimensional dominant frequency (3D DF) maps during electrophysiology (EP) procedures, but the literature is limited on alternative spectral estimation techniques that can offer better frequency resolution than FFT-based spectral estimation. Autoregressive (AR) model-based spectral estimation techniques, with emphasis on selection of an appropriate sampling rate and AR model order, were implemented to generate high-density 3D DF maps of atrial electrograms (AEGs) in persistent atrial fibrillation (persAF). For each patient, 2048 simultaneous AEGs were recorded for 20.478 s-long segments in the left atrium (LA) and exported for analysis, together with their anatomical locations. After the DFs were identified using AR-based spectral estimation, they were colour coded to produce sequential 3D DF maps. These maps were systematically compared with maps found using the Fourier-based approach. 3D DF maps can be obtained using AR-based spectral estimation after AEG downsampling (DS) and the resulting maps are very similar to those obtained using FFT-based spectral estimation (mean 90.23 %). There were no significant differences between AR techniques (p = 0.62). The processing time for the AR-based approach was considerably shorter (from 5.44 to 5.05 s) when lower sampling frequencies and model order values were used. Higher levels of DS presented higher rates of DF agreement (sampling frequency of 37.5 Hz). We have demonstrated the feasibility of using AR spectral estimation methods for producing 3D DF maps and characterised their differences from the maps produced using the FFT technique, offering an alternative approach for 3D DF computation in human persAF studies.
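
    A minimal sketch of the per-electrogram computation implied above: downsampled signal in, AR spectrum out, dominant frequency read off the peak. The AR coefficients here are fitted with the Yule-Walker equations on a synthetic signal; the study compared several AR fitting methods and used its own sampling-rate and model-order choices, so the settings below are placeholders.

    ```python
    import numpy as np
    from scipy.linalg import solve_toeplitz

    def ar_yule_walker(x, order):
        """AR coefficients a_1..a_p and innovation variance via the Yule-Walker equations."""
        x = x - x.mean()
        r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)   # biased autocorrelation
        a = solve_toeplitz(r[:order], r[1:order + 1])
        return a, r[0] - a @ r[1:order + 1]

    def ar_dominant_frequency(x, fs, order=12, band=(4.0, 12.0)):
        """Peak of the AR spectrum S(f) = sigma2 / |1 - sum_k a_k e^{-2j*pi*f*k/fs}|^2 within band."""
        a, sigma2 = ar_yule_walker(x, order)
        freqs = np.linspace(0.0, fs / 2.0, 2048)
        z = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(1, order + 1)))
        psd = sigma2 / np.abs(1.0 - z @ a) ** 2
        sel = (freqs >= band[0]) & (freqs <= band[1])
        return freqs[sel][np.argmax(psd[sel])]

    # Toy electrogram: 7 Hz activity plus noise, downsampled to 37.5 Hz as in the abstract.
    fs = 37.5
    t = np.arange(0.0, 20.478, 1.0 / fs)
    aeg = np.sin(2 * np.pi * 7.0 * t) + 0.5 * np.random.randn(t.size)
    print(ar_dominant_frequency(aeg, fs))   # should be close to 7 Hz
    ```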

  16. National-scale crop type mapping and area estimation using multi-resolution remote sensing and field survey

    NASA Astrophysics Data System (ADS)

    Song, X. P.; Potapov, P.; Adusei, B.; King, L.; Khan, A.; Krylov, A.; Di Bella, C. M.; Pickens, A. H.; Stehman, S. V.; Hansen, M.

    2016-12-01

    Reliable and timely information on agricultural production is essential for ensuring world food security. Freely available medium-resolution satellite data (e.g. Landsat, Sentinel) offer the possibility of improved global agriculture monitoring. Here we develop and test a method for estimating in-season crop acreage using a probability sample of field visits and producing wall-to-wall crop type maps at national scales. The method is first illustrated for soybean cultivated area in the US for 2015. A stratified, two-stage cluster sampling design was used to collect field data to estimate national soybean area. The field-based estimate employed historical soybean extent maps from the U.S. Department of Agriculture (USDA) Cropland Data Layer to delineate and stratify U.S. soybean growing regions. The estimated 2015 U.S. soybean cultivated area based on the field sample was 341,000 km2 with a standard error of 23,000 km2. This result is 1.0% lower than USDA's 2015 June survey estimate and 1.9% higher than USDA's 2016 January estimate. Our area estimate was derived in early September, about 2 months ahead of harvest. To map soybean cover, the Landsat image archive for the year 2015 growing season was processed using an active learning approach. Overall accuracy of the soybean map was 84%. The field-based sample estimated area was then used to calibrate the map such that the soybean acreage of the map derived through pixel counting matched the sample-based area estimate. The strength of the sample-based area estimation lies in the stratified design that takes advantage of the spatially explicit cropland layers to construct the strata. The success of the mapping was built upon an automated system which transforms Landsat images into standardized time-series metrics. The developed method produces reliable and timely information on soybean area in a cost-effective way and could be implemented in an operational mode. The approach has also been applied for other crops in other regions, such as winter wheat in Pakistan, soybean in Argentina and soybean in the entire South America. Similar levels of accuracy and timeliness were achieved as in the US.
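
    The calibration step mentioned above (scaling the pixel-counted soybean area so it matches the field-sample estimate) is, in its simplest form, a ratio adjustment. The sketch below illustrates that step only; the pixel count is a made-up placeholder and the paper's calibration may have been applied per stratum rather than as a single factor.

    ```python
    # Ratio calibration of a pixel-counted class area to a design-based area estimate.
    pixel_area_km2 = 0.03 * 0.03          # one 30 m Landsat pixel
    soybean_pixels = 392_000_000          # placeholder count of soybean pixels in the map
    map_area_km2 = soybean_pixels * pixel_area_km2

    sample_estimate_km2 = 341_000         # field-sample-based estimate from the abstract
    calibration_factor = sample_estimate_km2 / map_area_km2
    print(f"map area {map_area_km2:,.0f} km2, calibration factor {calibration_factor:.3f}")
    # Mapped areas (per stratum or overall) would be scaled by this factor so the
    # calibrated map total equals the sample-based estimate.
    ```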

  17. Analysis of spatial distribution of land cover maps accuracy

    NASA Astrophysics Data System (ADS)

    Khatami, R.; Mountrakis, G.; Stehman, S. V.

    2017-12-01

    Land cover maps have become one of the most important products of remote sensing science. However, classification errors will exist in any classified map and affect the reliability of subsequent map usage. Moreover, classification accuracy often varies over different regions of a classified map. These variations of accuracy will affect the reliability of subsequent analyses of different regions based on the classified maps. The traditional approach of map accuracy assessment based on an error matrix does not capture the spatial variation in classification accuracy. Here, per-pixel accuracy prediction methods are proposed based on interpolating accuracy values from a test sample to produce wall-to-wall accuracy maps. Different accuracy prediction methods were developed based on four factors: predictive domain (spatial versus spectral), interpolation function (constant, linear, Gaussian, and logistic), incorporation of class information (interpolating each class separately versus grouping them together), and sample size. This is the first study to use the spectral domain as the explanatory feature space for interpolating classification accuracy. Performance of the prediction methods was evaluated using 26 test blocks, each 10 km × 10 km, dispersed throughout the United States, with the area under the curve (AUC) of the receiver operating characteristic as the performance measure. Relative to existing accuracy prediction methods, our proposed methods resulted in improvements of AUC of 0.15 or greater. Evaluation of the four factors comprising the accuracy prediction methods demonstrated that: i) interpolations should be done separately for each class instead of grouping all classes together; ii) if an all-classes approach is used, the spectral domain will result in substantially greater AUC than the spatial domain; iii) for the smaller sample size and per-class predictions, the spectral and spatial domains yielded similar AUC; iv) for the larger sample size (i.e., very dense spatial sample) and per-class predictions, the spatial domain yielded larger AUC; v) increasing the sample size improved accuracy predictions with a greater benefit accruing to the spatial domain; and vi) the function used for interpolation had the smallest effect on AUC.
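
    A minimal sketch of one configuration described above: per-sample correctness (1 = correctly classified, 0 = misclassified) interpolated over the spatial domain with a Gaussian kernel to give a wall-to-wall accuracy prediction, scored with AUC. The kernel bandwidth, the synthetic accuracy pattern, and the single-class treatment are all placeholder choices, not the study's settings.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def gaussian_accuracy_surface(sample_xy, sample_correct, query_xy, bandwidth=2000.0):
        """Predict accuracy at query locations as a kernel-weighted mean of sample correctness."""
        d2 = ((query_xy[:, None, :] - sample_xy[None, :, :]) ** 2).sum(-1)
        w = np.exp(-0.5 * d2 / bandwidth ** 2)
        return (w * sample_correct).sum(axis=1) / w.sum(axis=1)

    rng = np.random.default_rng(3)
    sample_xy = rng.uniform(0, 10_000, size=(300, 2))            # reference sample locations (m)
    p_correct = 0.60 + 0.35 * (sample_xy[:, 0] > 5_000)          # accuracy varies across space
    sample_correct = (rng.random(300) < p_correct).astype(float)

    test_xy = rng.uniform(0, 10_000, size=(200, 2))
    test_correct = rng.random(200) < (0.60 + 0.35 * (test_xy[:, 0] > 5_000))

    pred = gaussian_accuracy_surface(sample_xy, sample_correct, test_xy)
    print("AUC:", roc_auc_score(test_correct, pred))
    ```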

  18. A review of accuracy assessment for object-based image analysis: From per-pixel to per-polygon approaches

    NASA Astrophysics Data System (ADS)

    Ye, Su; Pontius, Robert Gilmore; Rakshit, Rahul

    2018-07-01

    Object-based image analysis (OBIA) has gained widespread popularity for creating maps from remotely sensed data. Researchers routinely claim that OBIA procedures outperform pixel-based procedures; however, it is not immediately obvious how to evaluate the degree to which an OBIA map compares to reference information in a manner that accounts for the fact that the OBIA map consists of objects that vary in size and shape. Our study reviews 209 journal articles concerning OBIA published between 2003 and 2017. We focus on the three stages of accuracy assessment: (1) sampling design, (2) response design and (3) accuracy analysis. First, we report the literature's overall characteristics concerning OBIA accuracy assessment. Simple random sampling was the most used method among probability sampling strategies, slightly more than stratified sampling. Office-interpreted remotely sensed data were the dominant reference source. The literature reported accuracies ranging from 42% to 96%, with an average of 85%. A third of the articles failed to give sufficient information concerning accuracy methodology such as sampling scheme and sample size. We found few studies that focused specifically on the accuracy of the segmentation. Second, we identify a recent increase in OBIA articles using per-polygon approaches compared to per-pixel approaches for accuracy assessment. We clarify the respective impacts of the per-pixel and per-polygon approaches on sampling, response design and accuracy analysis. Our review defines the technical and methodological needs in the current per-polygon approaches, such as polygon-based sampling, analysis of mixed polygons, matching of mapped with reference polygons and assessment of segmentation accuracy. Our review summarizes and discusses the current issues in object-based accuracy assessment to provide guidance for improved accuracy assessments for OBIA.

  19. Synchrotron-based FTIR microspectroscopy for the mapping of photo-oxidation and additives in acrylonitrile-butadiene-styrene model samples and historical objects.

    PubMed

    Saviello, Daniela; Pouyet, Emeline; Toniolo, Lucia; Cotte, Marine; Nevin, Austin

    2014-09-16

    Synchrotron-based Fourier transform infrared micro-spectroscopy (SR-μFTIR) was used to map photo-oxidative degradation of acrylonitrile-butadiene-styrene (ABS) and to investigate the presence and the migration of additives in historical samples from important Italian design objects. High resolution (3×3 μm(2)) molecular maps were obtained by FTIR microspectroscopy in transmission mode, using a new method for the preparation of polymer thin sections. The depth of photo-oxidation in samples was evaluated and accompanied by the formation of ketones, aldehydes, esters, and unsaturated carbonyl compounds. This study demonstrates selective surface oxidation and a probable passivation of material against further degradation. In polymer fragments from design objects made of ABS from the 1960s, UV-stabilizers were detected and mapped, and microscopic inclusions of proteinaceous material were identified and mapped for the first time. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Summary geochemical maps for samples of rock, stream sediment, and nonmagnetic heavy-mineral concentrate, Sweetwater Roadless Area, Mono County, California and Lyon and Douglas Counties, Nevada

    USGS Publications Warehouse

    Chaffee, Maurice A.

    1986-01-01

    Map A shows the locations of all sites where rock samples were collected for this report and the distributions of anomalous concentrations for 12 elements in the 127 rock samples collected. In a similar manner, map B shows the collection sites for 59 samples of minus-60-mesh stream sediment, and 59 samples of nonmagnetic heavy-mineral concentrate derived from stream sediment and also shows the distributions of anomalous concentrations for 13 elements in the stream-sediment samples and 17 elements in the concentrate samples. Map C shows outlines of those drainage basins containing samples of stream sediment and concentrate with anomalous element concentrations and also shows weighted values for each outlined basin based on the number of elements with anomalous concentrations in each stream-sediment and concentrate sample and on the degree to which these concentrations are anomalous in each sample.

  1. Novel microbial diversity retrieved by autonomous robotic exploration of the world's deepest vertical phreatic sinkhole.

    PubMed

    Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition comparative analysis of 16S rRNA gene sequences from the samples revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  2. Application of laboratory reflectance spectroscopy to target and map expansive soils: example of the western Loiret, France

    NASA Astrophysics Data System (ADS)

    Hohmann, Audrey; Dufréchou, Grégory; Grandjean, Gilles; Bourguignon, Anne

    2014-05-01

    Swelling soils contain clay minerals that change volume with water content and cause extensive and expensive damage to infrastructure. Based on the spatial distribution of infrastructure damage and existing geological maps, the Bureau de Recherches Géologiques et Minières (BRGM, i.e. the French Geological Survey) published in 2010 a 1:50 000 swelling hazard map of France, indexing the territory as low, moderate, or high swelling risk. This study aims to use SWIR (1100-2500 nm) reflectance spectra of soils acquired under controlled laboratory conditions to estimate the swelling potential of soils and improve the swelling risk map of France. 332 samples were collected to the west of Orléans (France) in various geological formations and swelling risk areas. Comparisons of the swelling potential of soil samples with the swelling risk areas of the map show several inconsistent associations that confirm the need to redraw the current swelling risk map of France. New swelling risk maps of the sampling area were produced from the soil samples using three interpolation methods. Maps produced using kriging and natural neighbour interpolation did not reveal discrete lithological units, introduced unsupported swelling risk zones, and did not appear useful for refining the swelling risk map of France. Voronoi polygons were also used to produce a map in which the swelling potential estimated from each sample was extrapolated to a polygon, so that every polygon was supported by field information. Of the methods tested here, Voronoi polygons thus appear to be the most suitable for producing expansive-soil maps. However, polygon size is highly dependent on the sample spacing, and a sample may not be representative of its entire polygon. More samples are therefore needed to provide a reliable map at the scale of the sampling area. Soils were also sampled along two sections with sampling intervals of ca. 260 m and ca. 50 m. A sampling interval of 50 m appears better suited to mapping the smallest lithological units. The presence of several nearby samples indicating the same swelling potential is a good indication of a zone with constant swelling potential. The combination of the Voronoi method and a sampling interval of ca. 50 m appears suitable for producing local swelling potential maps in areas where doubt remains or where infrastructure damage attributed to expansive soils is known.

  3. Calculation of upper confidence bounds on not-sampled vegetation types using a systematic grid sample: An application to map unit definition for existing vegetation maps

    Treesearch

    Paul L. Patterson; Mark Finco

    2009-01-01

    This paper explores the information FIA data can produce regarding forest types that were not sampled and develops the equations necessary to define the upper confidence bounds on not-sampled forest types. The problem is reduced to a Bernoulli variable. This simplification allows the upper confidence bounds to be calculated based on Cochran (1977). Examples are...
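
    Because the problem above is reduced to a Bernoulli variable, the corresponding zero-detection bound has a simple closed form: if a vegetation type was observed in none of n sampled plots (treated as independent Bernoulli trials), the exact upper 100(1-alpha)% confidence bound on its proportion solves (1-p)^n = alpha. The sketch below shows that textbook bound; the paper's Cochran-based development and its handling of the systematic FIA design may differ in detail.

    ```python
    # Upper confidence bound on a proportion when a type is seen in 0 of n plots:
    # solve (1 - p)**n = alpha for p.
    def zero_detection_upper_bound(n, alpha=0.05):
        return 1.0 - alpha ** (1.0 / n)

    print(zero_detection_upper_bound(n=100))   # ~0.030: at most about 3% of area, 95% confidence
    ```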

  4. Comparing the efficiency of digital and conventional soil mapping to predict soil types in a semi-arid region in Iran

    NASA Astrophysics Data System (ADS)

    Zeraatpisheh, Mojtaba; Ayoubi, Shamsollah; Jafari, Azam; Finke, Peter

    2017-05-01

    The efficiency of different digital and conventional soil mapping approaches to produce categorical maps of soil types is determined by cost, sample size, accuracy and the selected taxonomic level. The efficiency of digital and conventional soil mapping approaches was examined in the semi-arid region of Borujen, central Iran. This research aimed to (i) compare two digital soil mapping approaches, multinomial logistic regression and random forest, with the conventional soil mapping approach at four soil taxonomic levels (order, suborder, great group and subgroup levels), (ii) validate the predicted soil maps against the same validation data set to determine the best method for producing the soil maps, and (iii) select the best soil taxonomic level by different approaches at three sample sizes (100, 80, and 60 point observations), in two scenarios with and without a geomorphology map as a spatial covariate. In most predicted maps, using both digital soil mapping approaches, the best results were obtained using the combination of terrain attributes and the geomorphology map, although differences between the scenarios with and without the geomorphology map were not significant. Employing the geomorphology map increased map purity and the Kappa index, and led to a decrease in the 'noisiness' of soil maps. Multinomial logistic regression had better performance at higher taxonomic levels (order and suborder levels); however, random forest showed better performance at lower taxonomic levels (great group and subgroup levels). Multinomial logistic regression was less sensitive than random forest to a decrease in the number of training observations. The conventional soil mapping method produced a map with a larger minimum polygon size because of the traditional cartographic criteria used to make the 1:100,000 geological map (on which the conventional soil map was largely based). Likewise, the conventional soil map also had a larger average polygon size, resulting in a lower level of detail. Multinomial logistic regression at the order level (map purity of 0.80), random forest at the suborder (map purity of 0.72) and great group level (map purity of 0.60), and conventional soil mapping at the subgroup level (map purity of 0.48) produced the most accurate maps in the study area. The multinomial logistic regression method was identified as the most effective approach based on a combined index of map purity, map information content, and map production cost. The combined index also showed that a smaller sample size led to a preference for the order level, while a larger sample size led to a preference for the great group level.

  5. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    ERIC Educational Resources Information Center

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  6. Mapping permafrost in the boreal forest with Thematic Mapper satellite data

    NASA Technical Reports Server (NTRS)

    Morrissey, L. A.; Strong, L. L.; Card, D. H.

    1986-01-01

    A geographic data base incorporating Landsat TM data was used to develop and evaluate logistic discriminant functions for predicting the distribution of permafrost in a boreal forest watershed. The data base included both satellite-derived information and ancillary map data. Five permafrost classifications were developed from a stratified random sample of the data base and evaluated by comparison with a photo-interpreted permafrost map using contingency table analysis and soil temperatures recorded at sites within the watershed. A classification using a TM thermal band and a TM-derived vegetation map as independent variables yielded the highest mapping accuracy for all permafrost categories.

  7. High-resolution Mapping of Forest Carbon Stocks in the Colombian Amazon

    NASA Astrophysics Data System (ADS)

    Asner, G. P.; Clark, J. K.; Mascaro, J.; Galindo García, G. A.; Chadwick, K. D.; Navarrete Encinales, D. A.; Paez-Acosta, G.; Cabrera Montenegro, E.; Kennedy-Bowdoin, T.; Duque, Á.; Balaji, A.; von Hildebrand, P.; Maatoug, L.; Bernal, J. F. Phillips; Knapp, D. E.; García Dávila, M. C.; Jacobson, J.; Ordóñez, M. F.

    2012-03-01

    High-resolution mapping of tropical forest carbon stocks can assist forest management and improve implementation of large-scale carbon retention and enhancement programs. Previous high-resolution approaches have relied on field plot and/or Light Detection and Ranging (LiDAR) samples of aboveground carbon density, which are typically upscaled to larger geographic areas using stratification maps. Such efforts often rely on detailed vegetation maps to stratify the region for sampling, but existing tropical forest maps are often too coarse and field plots too sparse for high resolution carbon assessments. We developed a top-down approach for high-resolution carbon mapping in a 16.5 million ha region (>40 %) of the Colombian Amazon - a remote landscape seldom documented. We report on three advances for large-scale carbon mapping: (i) employing a universal approach to airborne LiDAR-calibration with limited field data; (ii) quantifying environmental controls over carbon densities; and (iii) developing stratification- and regression-based approaches for scaling up to regions outside of LiDAR coverage. We found that carbon stocks are predicted by a combination of satellite-derived elevation, fractional canopy cover and terrain ruggedness, allowing upscaling of the LiDAR samples to the full 16.5 million ha region. LiDAR-derived carbon mapping samples had 14.6 % uncertainty at 1 ha resolution, and regional maps based on stratification and regression approaches had 25.6 % and 29.6 % uncertainty, respectively, in any given hectare. High-resolution approaches with reported local-scale uncertainties will provide the most confidence for monitoring changes in tropical forest carbon stocks. Improved confidence will allow resource managers and decision-makers to more rapidly and effectively implement actions that better conserve and utilize forests in tropical regions.

  8. Calculation of upper confidence bounds on proportion of area containing not-sampled vegetation types: An application to map unit definition for existing vegetation maps

    Treesearch

    Paul L. Patterson; Mark Finco

    2011-01-01

    This paper explores the information forest inventory data can produce regarding forest types that were not sampled and develops the equations necessary to define the upper confidence bounds on not-sampled forest types. The problem is reduced to a Bernoulli variable. This simplification allows the upper confidence bounds to be calculated based on Cochran (1977)....

  9. Rapid mapping of schistosomiasis and other neglected tropical diseases in the context of integrated control programmes in Africa

    PubMed Central

    BROOKER, S.; KABATEREINE, N. B.; GYAPONG, J. O.; STOTHARD, J. R.; UTZINGER, J.

    2009-01-01

    There is growing interest and commitment to the control of schistosomiasis and other so-called neglected tropical diseases (NTDs). Resources for control are inevitably limited, necessitating assessment methods that can rapidly and accurately identify and map high-risk communities so that interventions can be targeted in a spatially-explicit and cost-effective manner. Here, we review progress made with (i) mapping schistosomiasis across Africa using available epidemiological data and more recently, climate-based risk prediction; (ii) the development and use of morbidity questionnaires for rapid identification of high-risk communities of urinary schistosomiasis; and (iii) innovative sampling-based approaches for intestinal schistosomiasis, using the lot quality assurance sampling technique. Experiences are also presented for the rapid mapping of other NTDs, including onchocerciasis, loiasis and lymphatic filariasis. Future directions for an integrated rapid mapping approach targeting multiple NTDs simultaneously are outlined, including potential challenges in developing an integrated survey tool. The lessons from the mapping of human helminth infections may also be relevant for the rapid mapping of malaria as its control efforts are intensified. PMID:19450373

  10. Case-based fracture image retrieval.

    PubMed

    Zhou, Xin; Stern, Richard; Müller, Henning

    2012-05-01

    Case-based fracture image retrieval can assist surgeons in decisions regarding new cases by supplying visually similar past cases. This tool may guide fracture fixation and management through comparison of long-term outcomes in similar cases. A fracture image database collected over 10 years at the orthopedic service of the University Hospitals of Geneva was used. This database contains 2,690 fracture cases associated with 43 classes (based on the AO/OTA classification). A case-based retrieval engine was developed and evaluated using retrieval precision as a performance metric. Only cases in the same class as the query case are considered as relevant. The scale-invariant feature transform (SIFT) is used for image analysis. Performance evaluation was computed in terms of mean average precision (MAP) and early precision (P10, P30). Retrieval results produced with the GNU image finding tool (GIFT) were used as a baseline. Two sampling strategies were evaluated. One used a dense 40 × 40 pixel grid sampling, and the second one used the standard SIFT features. Based on dense pixel grid sampling, three unsupervised feature selection strategies were introduced to further improve retrieval performance. With dense pixel grid sampling, the image is divided into 1,600 (40 × 40) square blocks. The goal is to emphasize the salient regions (blocks) and ignore irrelevant regions. Regions are considered as important when a high variance of the visual features is found. The first strategy is to calculate the variance of all descriptors on the global database. The second strategy is to calculate the variance of all descriptors for each case. A third strategy is to perform a thumbnail image clustering in a first step and then to calculate the variance for each cluster. Finally, a fusion between a SIFT-based system and GIFT is performed. A first comparison on the selection of sampling strategies using SIFT features shows that dense sampling using a pixel grid (MAP = 0.18) outperformed the SIFT detector-based sampling approach (MAP = 0.10). In a second step, three unsupervised feature selection strategies were evaluated. A grid parameter search is applied to optimize parameters for feature selection and clustering. Results show that using half of the regions (700 or 800) obtains the best performance for all three strategies. Increasing the number of clusters in clustering can also improve the retrieval performance. The SIFT descriptor variance in each case gave the best indication of saliency for the regions (MAP = 0.23), better than the other two strategies (MAP = 0.20 and 0.21). Combining GIFT (MAP = 0.23) and the best SIFT strategy (MAP = 0.23) produced significantly better results (MAP = 0.27) than each system alone. A case-based fracture retrieval engine was developed and is available for online demonstration. SIFT is used to extract local features, and three feature selection strategies were introduced and evaluated. A baseline using the GIFT system was used to evaluate the salient point-based approaches. Without supervised learning, SIFT-based systems with optimized parameters slightly outperformed the GIFT system. A fusion of the two approaches shows that the information contained in the two approaches is complementary. Supervised learning on the feature space is foreseen as the next step of this study.
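
    The second feature-selection strategy above (per-case variance of the descriptors in each grid block, keeping roughly half of the 1,600 blocks) can be sketched as below. The descriptors are random placeholders rather than real SIFT vectors, and the variance is taken over each block's descriptor components, which is one plausible reading of the strategy; the abstract does not give the exact definition.

    ```python
    import numpy as np

    def select_salient_blocks(block_descriptors, keep=800):
        """block_descriptors: (n_blocks, descriptor_dim) for one case.
        Return indices of the blocks whose descriptors vary most."""
        variance = block_descriptors.var(axis=1)
        return np.argsort(variance)[::-1][:keep]

    rng = np.random.default_rng(7)
    descriptors = rng.random((1600, 128))    # placeholder for a 40 x 40 grid of SIFT vectors
    salient = select_salient_blocks(descriptors, keep=800)
    print(salient.shape)                     # (800,) block indices kept for retrieval
    ```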

  11. Mapping seabed sediments: Comparison of manual, geostatistical, object-based image analysis and machine learning approaches

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Green, Sophie L.; Stephens, David; Lark, R. Murray; Stewart, Heather A.; Dove, Dayton

    2014-08-01

    Marine spatial planning and conservation need underpinning with sufficiently detailed and accurate seabed substrate and habitat maps. Although multibeam echosounders enable us to map the seabed with high resolution and spatial accuracy, there is still a lack of fit-for-purpose seabed maps. This is due to the high costs involved in carrying out systematic seabed mapping programmes and the fact that the development of validated, repeatable, quantitative and objective methods of swath acoustic data interpretation is still in its infancy. We compared a wide spectrum of approaches including manual interpretation, geostatistics, object-based image analysis and machine-learning to gain further insights into the accuracy and comparability of acoustic data interpretation approaches based on multibeam echosounder data (bathymetry, backscatter and derivatives) and seabed samples with the aim to derive seabed substrate maps. Sample data were split into a training and validation data set to allow us to carry out an accuracy assessment. Overall thematic classification accuracy ranged from 67% to 76% and Cohen's kappa varied between 0.34 and 0.52. However, these differences were not statistically significant at the 5% level. Misclassifications were mainly associated with uncommon classes, which were rarely sampled. Map outputs were between 68% and 87% identical. To improve classification accuracy in seabed mapping, we suggest that more studies on the effects of factors affecting the classification performance as well as comparative studies testing the performance of different approaches need to be carried out with a view to developing guidelines for selecting an appropriate method for a given dataset. In the meantime, classification accuracy might be improved by combining different techniques to hybrid approaches and multi-method ensembles.
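
    The accuracy figures quoted above (overall thematic accuracy and Cohen's kappa) are computed by cross-tabulating predicted against observed substrate classes at the validation sample sites. The sketch below shows that computation on a handful of made-up labels; the class names and values are illustrative only.

    ```python
    from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

    # Placeholder validation data: observed vs. predicted seabed substrate classes.
    observed  = ["sand", "mud", "sand", "rock", "mud", "sand", "mixed", "sand", "mud", "rock"]
    predicted = ["sand", "mud", "mud",  "rock", "mud", "sand", "sand",  "sand", "mud", "mud"]

    print("overall accuracy:", accuracy_score(observed, predicted))
    print("Cohen's kappa:   ", cohen_kappa_score(observed, predicted))
    print(confusion_matrix(observed, predicted, labels=["sand", "mud", "rock", "mixed"]))
    ```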

  12. Mapping stand-age distribution of Russian forests from satellite data

    NASA Astrophysics Data System (ADS)

    Chen, D.; Loboda, T. V.; Hall, A.; Channan, S.; Weber, C. Y.

    2013-12-01

    The Russian boreal forest is a critical component of the global boreal biome, as approximately two thirds of the boreal forest is located in Russia. Numerous studies have shown that wildfire and logging have led to extensive modifications of forest cover in the region since 2000. Forest disturbance and subsequent regrowth influence carbon and energy budgets and, in turn, affect climate. Several global and regional satellite-based data products have been developed from coarse (>100 m) and moderate (10-100 m) resolution imagery to monitor forest cover change over the past decade, but the record of forest cover change pre-dating 2000 is very fragmented. Although some information on past disturbances can be obtained from stacks of Landsat images, stacks with a sufficient number of images are extremely limited in quantity and location, especially in Eastern Siberia. This paper describes a modified method, built upon previous work, to hindcast the disturbance history and map the stand-age distribution of the Russian boreal forest. Utilizing data from both Landsat and the Moderate Resolution Imaging Spectroradiometer (MODIS), a wall-to-wall map indicating the estimated age of forest in the Russian boreal forest is created. Our previous work has shown that disturbances can be mapped successfully up to 30 years in the past, as the spectral signature of regrowing forests is statistically significantly different from that of mature forests. The presented algorithm ingests 55 multi-temporal stacks of Landsat imagery available over Russian forest before 2001 and processes them through a standardized and semi-automated approach to extract training and validation data samples. Landsat data, dating back to 1984, are used to generate maps of forest disturbance using temporal shifts in the Disturbance Index through the multi-temporal stack of imagery in selected locations. These maps are then used as reference data to train a decision tree classifier on 50 MODIS-based indices. The resultant map provides an estimate of forest age based on the regrowth curves observed from Landsat imagery. The accuracy of the resultant map is assessed against three datasets: 1) a subset of the disturbance maps developed within the algorithm, 2) independent disturbance maps created by the Northern Eurasia Land Dynamics Analysis (NELDA) project, and 3) field-based stand-age distributions from forestry inventory units. The current version of the product is a considerable improvement on the previous version, which used Landsat data samples at a set of randomly selected locations, resulting in a strong bias of the training samples towards Landsat-rich regions (e.g. European Russia) while regions such as Siberia were under-sampled. To improve accuracy, the current method significantly increases the number of Landsat training samples compared to the previous work. Aside from the previously used data, the current method uses all available Landsat data for the under-sampled regions in order to increase the representativeness of the total sample. The final accuracy assessment is still ongoing; however, initial results suggest an overall accuracy expressed as Kappa > 0.8. We plan to release both the training data and the final disturbance map of the Russian boreal forest to the public after the validation is completed.

  13. Large-scale mapping and predictive modeling of submerged aquatic vegetation in a shallow eutrophic lake.

    PubMed

    Havens, Karl E; Harwell, Matthew C; Brady, Mark A; Sharfstein, Bruce; East, Therese L; Rodusky, Andrew J; Anson, Daniel; Maki, Ryan P

    2002-04-09

    A spatially intensive sampling program was developed for mapping the submerged aquatic vegetation (SAV) over an area of approximately 20,000 ha in a large, shallow lake in Florida, U.S. The sampling program integrates Geographic Information System (GIS) technology with traditional field sampling of SAV and has the capability of producing robust vegetation maps under a wide range of conditions, including high turbidity, variable depth (0 to 2 m), and variable sediment types. Based on sampling carried out in August-September 2000, we measured 1,050 to 4,300 ha of vascular SAV species and approximately 14,000 ha of the macroalga Chara spp. The results were similar to those reported in the early 1990s, when the last large-scale SAV sampling occurred. Occurrence of Chara was strongly associated with peat sediments, and maximal depths of occurrence varied between sediment types (mud, sand, rock, and peat). A simple model of Chara occurrence, based only on water depth, had an accuracy of 55%. It predicted occurrence of Chara over large areas where the plant actually was not found. A model based on sediment type and depth had an accuracy of 75% and produced a spatial map very similar to that based on observations. While this approach needs to be validated with independent data in order to test its general utility, we believe it may have application elsewhere. The simple modeling approach could serve as a coarse-scale tool for evaluating effects of water level management on Chara populations.

  14. 'Nano-immuno test' for the detection of live Mycobacterium avium subspecies paratuberculosis bacilli in the milk samples using magnetic nano-particles and chromogen.

    PubMed

    Singh, Manju; Singh, Shoor Vir; Gupta, Saurabh; Chaubey, Kundan Kumar; Stephan, Bjorn John; Sohal, Jagdip Singh; Dutta, Manali

    2018-04-26

    Early rapid detection of Mycobacterium avium subspecies paratuberculosis (MAP) bacilli in milk samples is a major challenge, since the traditional culture method is time consuming and laboratory dependent. We report a simple, sensitive and specific nano-technology based 'nano-immuno test' capable of detecting viable MAP bacilli in milk samples within 10 h. Viable MAP bacilli were captured by MAP-specific antibody-conjugated magnetic nano-particles, using resazurin dye as chromogen. The test was optimized using true culture positive (10 bovine and 12 goat) and true culture negative (16 bovine and 25 goat) raw milk samples. Domestic livestock species in India are endemically infected with MAP. After successful optimization, the sensitivity and specificity of the 'nano-immuno test' in goats with respect to milk culture were 91.7% and 96.0%, respectively, and 90.0% (sensitivity) and 92.6% (specificity) with respect to IS900 PCR. In bovine milk samples, the sensitivity and specificity of the 'nano-immuno test' with respect to milk culture were 90.0% and 93.7%, respectively; with respect to IS900 PCR, they were 88.9% and 94.1%. The test was validated with field raw milk samples (258 goat and 138 bovine) collected from domestic livestock species to detect live/viable MAP bacilli. Of 138 bovine raw milk samples screened by six diagnostic tests, 81 (58.7%) were positive for MAP infection in one or more diagnostic tests. Of these 81 (58.7%) positive bovine raw milk samples, only 24 (17.4%) were positive for the presence of viable MAP bacilli. Of 258 goat raw milk samples screened by six diagnostic tests, 141 (54.6%) were positive for MAP infection in one or more tests. Of these 141 (54.6%) positive goat raw milk samples, only 48 (34.0%) were positive for live MAP bacilli. The simplicity and efficiency of this novel 'nano-immuno test' make it suitable for wide-scale screening of milk samples in the field. Standardization, validation and re-usability of the functionalized nano-particles and of the test were successfully demonstrated with field samples. The test is highly specific, simple to perform, easy to read with the naked eye, and does not require laboratory support. It has the potential to be used as a screening test to estimate the bio-load of MAP in milk samples at the national level.

  15. Mobile robot motion estimation using Hough transform

    NASA Astrophysics Data System (ADS)

    Aldoshkin, D. N.; Yamskikh, T. N.; Tsarev, R. Yu

    2018-05-01

    This paper proposes an algorithm for estimating mobile robot motion. The geometry of the surrounding space is described with range scans (samples of distance measurements) taken by the mobile robot's range sensors. A similar sample of the space geometry from any preceding moment of time, or the environment map, can be used as a reference. The suggested algorithm is invariant to isotropic scaling of the samples or the map, which allows using samples measured in different units and maps made at different scales. The algorithm is based on the Hough transform: it maps from measurement space to a straight-line parameter space. In the straight-line parameter space, the problems of estimating rotation, scaling and translation are solved separately, breaking down the problem of estimating mobile robot localization into three smaller independent problems. A specific feature of the presented algorithm is its robustness to noise and outliers, inherited from the Hough transform. A prototype of the mobile robot orientation system is described.
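
    The abstract does not give the algorithm's details, but the core idea — that rotation can be estimated on its own once range scans are mapped into straight-line parameter space — can be sketched as follows. The sketch is an illustrative stand-in, not the authors' implementation: `hough_orientation_signature`, the bin counts and the synthetic two-wall scan are assumptions introduced here.

```python
import numpy as np

def hough_orientation_signature(points, n_theta=180, n_rho=128):
    """Accumulate 2-D points into a (theta, rho) Hough space and return, for
    each theta bin, the strength of the strongest line at that orientation."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    r_max = np.linalg.norm(points, axis=1).max()
    rho_edges = np.linspace(-r_max, r_max, n_rho + 1)
    acc = np.zeros((n_theta, n_rho))
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in points:
        rho = x * cos_t + y * sin_t                       # one rho per theta bin
        idx = np.clip(np.digitize(rho, rho_edges) - 1, 0, n_rho - 1)
        acc[np.arange(n_theta), idx] += 1
    return acc.max(axis=1)                                # peak strength per orientation

def estimate_rotation(scan_a, scan_b, n_theta=180):
    """Estimate the rotation from scan_a to scan_b by circularly cross-correlating
    their Hough orientation signatures; translation does not affect the result
    because only the theta axis is compared (angles are modulo pi)."""
    sig_a = hough_orientation_signature(scan_a, n_theta)
    sig_b = hough_orientation_signature(scan_b, n_theta)
    scores = [np.dot(np.roll(sig_a, k), sig_b) for k in range(n_theta)]
    return np.argmax(scores) * np.pi / n_theta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # synthetic scan: two walls of different lengths meeting at a corner
    wall1 = np.c_[np.linspace(0, 8, 150), np.zeros(150)]
    wall2 = np.c_[np.zeros(60), np.linspace(0, 3, 60)]
    scan = np.vstack([wall1, wall2]) + rng.normal(0, 0.02, (210, 2))
    angle = np.deg2rad(25)
    R = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])
    rotated = scan @ R.T + np.array([1.0, -0.5])          # rotation plus translation
    print("estimated rotation (deg): %.1f" % np.rad2deg(estimate_rotation(scan, rotated)))
```

    Here the two walls are given different lengths so that the orientation signature is not ambiguous under a 90-degree shift; the rotation is recovered to within the 1-degree bin width of the accumulator, independently of the added translation.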

  16. Landslide Inventory Mapping from Bitemporal 10 m SENTINEL-2 Images Using Change Detection Based Markov Random Field

    NASA Astrophysics Data System (ADS)

    Qin, Y.; Lu, P.; Li, Z.

    2018-04-01

    Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, such methods are labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability to different study areas and data, and there is considerable room for improvement in accuracy and degree of automation. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: 1) change detection-based multi-thresholding for training sample generation and 2) MRF for landslide inventory mapping. Compared with previous methods, the proposed method has three advantages: 1) it combines multiple image differencing techniques with a multi-threshold method to generate reliable training samples; 2) it takes the spectral characteristics of landslides into account; and 3) it is highly automatic, with little parameter tuning. The proposed method was applied to regional landslide mapping from 10 m Sentinel-2 images in Western China. Results corroborated the effectiveness and applicability of the proposed method, especially its capability for rapid landslide mapping. Some directions for future research are offered. To our knowledge, this study is the first attempt to map landslides from free, medium-resolution satellite (i.e., Sentinel-2) images in China.
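
    The first step of the CDMRF workflow — image differencing with multiple thresholds to pick confident landslide and non-landslide training pixels — can be illustrated with a minimal sketch. The NDVI difference, the thresholds in units of the difference-image standard deviation, and the synthetic scar below are illustrative assumptions, not the bands or thresholds used in the paper.

```python
import numpy as np

def training_samples_from_change(ndvi_pre, ndvi_post, k_change=1.5, k_stable=0.5):
    """Label confident training pixels from an NDVI difference image.

    Landslides typically strip vegetation, so a strong NDVI decrease marks
    candidate landslide pixels; near-zero change marks candidate non-landslide
    pixels; everything else is left unlabeled for the subsequent MRF step.
    k_change and k_stable are illustrative choices, not values from the paper.
    """
    diff = ndvi_post - ndvi_pre
    mu, sigma = diff.mean(), diff.std()
    labels = np.full(diff.shape, -1, dtype=np.int8)       # -1 = unlabeled
    labels[diff < mu - k_change * sigma] = 1              # likely landslide
    labels[np.abs(diff - mu) < k_stable * sigma] = 0      # likely unchanged
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ndvi_pre = rng.uniform(0.5, 0.8, (200, 200))          # vegetated hillslope
    ndvi_post = ndvi_pre + rng.normal(0, 0.02, (200, 200))
    ndvi_post[60:90, 40:120] -= 0.4                       # simulated landslide scar
    labels = training_samples_from_change(ndvi_pre, ndvi_post)
    print("landslide seeds:", int((labels == 1).sum()),
          "stable seeds:", int((labels == 0).sum()),
          "unlabeled:", int((labels == -1).sum()))
```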

  17. The impact of sample size on the reproducibility of voxel-based lesion-deficit mappings.

    PubMed

    Lorca-Puls, Diego L; Gajardo-Vidal, Andrea; White, Jitrachote; Seghier, Mohamed L; Leff, Alexander P; Green, David W; Crinion, Jenny T; Ludersdorfer, Philipp; Hope, Thomas M H; Bowman, Howard; Price, Cathy J

    2018-07-01

    This study investigated how sample size affects the reproducibility of findings from univariate voxel-based lesion-deficit analyses (e.g., voxel-based lesion-symptom mapping and voxel-based morphometry). Our effect of interest was the strength of the mapping between brain damage and speech articulation difficulties, as measured in terms of the proportion of variance explained. First, we identified a region of interest by searching on a voxel-by-voxel basis for brain areas where greater lesion load was associated with poorer speech articulation, using a large sample of 360 right-handed English-speaking stroke survivors. We then randomly drew thousands of bootstrap samples from this data set that included either 30, 60, 90, 120, 180, or 360 patients. For each resample, we recorded effect size estimates and p values after conducting exactly the same lesion-deficit analysis within the previously identified region of interest and holding all procedures constant. The results show (1) how often small effect sizes in a heterogeneous population fail to be detected; (2) how effect size and its statistical significance vary with sample size; (3) how low-powered studies (due to small sample sizes) can greatly over-estimate as well as under-estimate effect sizes; and (4) how large sample sizes (N ≥ 90) can yield highly significant p values even when effect sizes are so small that they become trivial in practical terms. The implications of these findings for interpreting the results from univariate voxel-based lesion-deficit analyses are discussed. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
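
    The resampling logic described above — drawing bootstrap samples of several sizes and recording effect size and significance for each — is easy to reproduce in outline. The sketch below uses synthetic lesion-load and deficit scores and a plain Pearson correlation as the "lesion-deficit analysis"; it is not the authors' pipeline, only a demonstration of how effect-size estimates and the chance of p < 0.05 change with N.

```python
import numpy as np
from scipy import stats

def bootstrap_effect_sizes(lesion_load, deficit, sizes=(30, 60, 120, 360),
                           n_boot=2000, seed=0):
    """For each sample size, repeatedly resample patients with replacement and
    record the proportion of variance explained (r^2) and the p-value of the
    lesion-load/deficit correlation. Purely illustrative of the resampling logic."""
    rng = np.random.default_rng(seed)
    n = len(lesion_load)
    results = {}
    for size in sizes:
        r2, pvals = [], []
        for _ in range(n_boot):
            idx = rng.integers(0, n, size)                # bootstrap resample
            r, p = stats.pearsonr(lesion_load[idx], deficit[idx])
            r2.append(r * r)
            pvals.append(p)
        r2, pvals = np.array(r2), np.array(pvals)
        results[size] = (r2.mean(), r2.std(), (pvals < 0.05).mean())
    return results

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    n_pop = 360
    lesion_load = rng.uniform(0, 1, n_pop)
    deficit = 0.3 * lesion_load + rng.normal(0, 1, n_pop)  # small true effect
    for size, (m, s, power) in bootstrap_effect_sizes(lesion_load, deficit).items():
        print(f"N={size:3d}  mean r^2={m:.3f} +/- {s:.3f}  P(p<0.05)={power:.2f}")
```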

  18. Elemental mapping of large samples by external ion beam analysis with sub-millimeter resolution and its applications

    NASA Astrophysics Data System (ADS)

    Silva, T. F.; Rodrigues, C. L.; Added, N.; Rizzutto, M. A.; Tabacniks, M. H.; Mangiarotti, A.; Curado, J. F.; Aguirre, F. R.; Aguero, N. F.; Allegro, P. R. P.; Campos, P. H. O. V.; Restrepo, J. M.; Trindade, G. F.; Antonio, M. R.; Assis, R. F.; Leite, A. R.

    2018-05-01

    The elemental mapping of large areas using ion beam techniques is a desired capability for several scientific communities, involved in topics ranging from geoscience to cultural heritage. Usually, the constraints for large-area mapping are not met by the micro- and nano-probe setups implemented all over the world. A novel setup for mapping large sized samples with an external beam was recently built at the University of São Paulo, employing a broad MeV-proton probe with sub-millimeter dimension coupled to a high-precision, large-range XYZ robotic stage (60 cm range on all axes and 5 μm precision ensured by optical sensors). An important issue in large-area mapping is how to deal with irregularities of the sample's surface, which may introduce artifacts in the images due to variation of the measuring conditions. In our setup, we implemented an automatic system based on machine vision to correct the position of the sample and compensate for its surface irregularities. As an additional benefit, a 3D digital reconstruction of the scanned surface can also be obtained. Using this new and unique setup, we have produced large-area elemental maps of ceramics, stones, fossils, and other sorts of samples.

  19. A two-dimensional matrix image based feature extraction method for classification of sEMG: A comparative analysis based on SVM, KNN and RBF-NN.

    PubMed

    Wen, Tingxi; Zhang, Zhongnan; Qiu, Ming; Zeng, Ming; Luo, Weizhen

    2017-01-01

    The computer mouse is an important human-computer interaction device, but patients with physical finger disabilities are unable to operate it. Surface EMG (sEMG) can be monitored by electrodes on the skin surface and is a reflection of neuromuscular activity. Therefore, we can control auxiliary limb equipment by classifying sEMG in order to help physically disabled patients operate the mouse. The aim was to develop a new method to extract sEMG generated by finger motion and apply novel features to classify sEMG. A window-based data acquisition method was presented to extract signal samples from sEMG electrodes. Afterwards, a two-dimensional matrix image based feature extraction method, which differs from the classical methods based on the time or frequency domain, was employed to transform signal samples into feature maps used for classification. In the experiments, sEMG data samples produced by the index and middle fingers at the click of a mouse button were acquired separately. Then, characteristics of the samples were analyzed to generate a feature map for each sample. Finally, machine learning classification algorithms (SVM, KNN, RBF-NN) were employed to classify these feature maps on a GPU. The study demonstrated that all classifiers can identify and classify sEMG samples effectively. In particular, the accuracy of the SVM classifier reached up to 100%. The signal separation method is a convenient, efficient and quick method, which can effectively extract the sEMG samples produced by fingers. In addition, unlike the classical methods, the new method extracts features by appropriately enlarging the energy of the sample signals. The classical machine learning classifiers all performed well using these features.
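
    A minimal sketch of the window-based acquisition and classification flow is given below. The synthetic two-class signal, the 256-sample window reshaped to a 16 x 16 matrix, and the per-row energy feature are assumptions standing in for the paper's two-dimensional matrix image features; only the overall pipeline (windowing, feature map, SVM) follows the abstract.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def window_samples(signal, labels, win=256, step=128):
    """Slice a 1-D sEMG stream into fixed-length windows; each window is
    reshaped into a 2-D matrix (here 16x16) that plays the role of a feature map."""
    X, y = [], []
    for start in range(0, len(signal) - win + 1, step):
        X.append(signal[start:start + win].reshape(16, 16))
        y.append(labels[start + win // 2])                # label at window centre
    return np.array(X), np.array(y)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # synthetic two-class "sEMG": bursts of different amplitude
    n = 64 * 256
    labels = np.repeat(rng.integers(0, 2, n // 256), 256)
    noise = rng.normal(0, 1, n)
    signal = noise * np.where(labels == 1, 2.5, 1.0)      # class 1 = stronger bursts
    X, y = window_samples(signal, labels)
    X_feat = np.abs(X).mean(axis=2)                       # crude per-row energy feature
    X_train, X_test, y_train, y_test = train_test_split(
        X_feat, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))
```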

  20. Geological sampling data and benthic biota classification: Buzzards Bay and Vineyard Sound, Massachusetts

    USGS Publications Warehouse

    Ackerman, Seth D.; Pappal, Adrienne L.; Huntley, Emily C.; Blackwood, Dann S.; Schwab, William C.

    2015-01-01

    Sea-floor sample collection is an important component of a statewide cooperative mapping effort between the U.S. Geological Survey (USGS) and the Massachusetts Office of Coastal Zone Management (CZM). Sediment grab samples, bottom photographs, and video transects were collected within Vineyard Sound and Buzzards Bay in 2010 aboard the research vessel Connecticut. This report contains sample data and related information, including analyses of surficial-sediment grab samples, locations and images of sea-floor photography, survey lines along which sea-floor video was collected, and a classification of benthic biota observed in sea-floor photographs, based on the Coastal and Marine Ecological Classification Standard (CMECS). These sample data and analyses are used to verify interpretations of geophysical data and are an essential part of geologic maps of the sea floor. These data also provide a valuable inventory of benthic habitat and resources. Geographic information system (GIS) data, maps, and interpretations, produced through the USGS and CZM mapping cooperative, are intended to aid efforts to manage coastal and marine resources and to provide baseline information for research focused on coastal evolution and environmental change.

  1. Quantile rank maps: a new tool for understanding individual brain development.

    PubMed

    Chen, Huaihou; Kelly, Clare; Castellanos, F Xavier; He, Ye; Zuo, Xi-Nian; Reiss, Philip T

    2015-05-01

    We propose a novel method for neurodevelopmental brain mapping that displays how an individual's values for a quantity of interest compare with age-specific norms. By estimating smoothly age-varying distributions at a set of brain regions of interest, we derive age-dependent region-wise quantile ranks for a given individual, which can be presented in the form of a brain map. Such quantile rank maps could potentially be used for clinical screening. Bootstrap-based confidence intervals are proposed for the quantile rank estimates. We also propose a recalibrated Kolmogorov-Smirnov test for detecting group differences in the age-varying distribution. This test is shown to be more robust to model misspecification than a linear regression-based test. The proposed methods are applied to brain imaging data from the Nathan Kline Institute Rockland Sample and from the Autism Brain Imaging Data Exchange (ABIDE) sample. Copyright © 2015 Elsevier Inc. All rights reserved.
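
    The idea of region-wise, age-dependent quantile ranks can be illustrated with a crude empirical stand-in for the smoothly age-varying distributions described above: rank the individual's value against peers of similar age in each region. The age window, the synthetic thickness-like measures and the region count below are assumptions for illustration only, not the authors' estimation method.

```python
import numpy as np

def quantile_rank_map(values, ages, subject_values, subject_age, age_window=2.0):
    """Empirical, window-based stand-in for smoothly age-varying distributions:
    for each region, the subject's value is ranked against peers within
    +/- age_window years. values: (n_subjects, n_regions); subject_values: (n_regions,)."""
    peers = np.abs(ages - subject_age) <= age_window
    peer_vals = values[peers]                             # (n_peers, n_regions)
    # proportion of peers with a value <= the subject's value, per region
    return (peer_vals <= subject_values).mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    n_subjects, n_regions = 500, 6
    ages = rng.uniform(6, 20, n_subjects)
    # regional measures whose mean drifts with age (cortical-thickness-like)
    values = 2.5 - 0.05 * ages[:, None] + rng.normal(0, 0.2, (n_subjects, n_regions))
    subject_age = 12.0
    subject_values = 2.5 - 0.05 * subject_age + np.array([0.0, 0.1, -0.1, 0.3, -0.3, 0.0])
    ranks = quantile_rank_map(values, ages, subject_values, subject_age)
    print("region-wise quantile ranks:", np.round(ranks, 2))
```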

  2. Comparison of non-landslide sampling strategies to counteract inventory-based biases within national-scale statistical landslide susceptibility models

    NASA Astrophysics Data System (ADS)

    Lima, Pedro; Steger, Stefan; Glade, Thomas

    2017-04-01

    Landslides can represent a significant threat to people and infrastructure in hilly and mountainous landscapes worldwide. Understanding and predicting these geomorphic processes is crucial to avoid economic losses or even casualties among people and damage to their properties. Statistically based landslide susceptibility models are well known to be highly reliant on the quality, representativeness and availability of input data. In this context, several studies indicate that the landslide inventory represents the most important input data. However, each landslide mapping technique or data collection has its drawbacks. Consequently, biased landslide inventories are commonly introduced into statistical models, especially at regional or even national scale. It remains for the researcher to be aware of potential limitations and to design strategies that avoid or reduce the propagation of input data errors and bias influences into the modelling outcomes. Previous studies have shown that such erroneous landslide inventories may lead to unrealistic landslide susceptibility maps. We assume that one possibility to tackle systematic landslide inventory-based biases might be to concentrate on sampling strategies that focus on the distribution of non-landslide locations. For this purpose, we test an approach for the Austrian territory that concentrates on a modified non-landslide sampling strategy instead of the traditionally applied random sampling. It is expected that the way non-landslide locations are represented (e.g., equally over the area or only within those areas where mapping campaigns have been conducted) is important to reduce a potential over- or underestimation of landslide susceptibility within specific areas caused by bias, since presumably every landslide inventory is systematically incomplete, especially in those areas where no mapping campaign was previously conducted. This also applies to the inventory currently available for the Austrian territory, composed of 14,519 shallow landslides. Within this study, we introduce the following explanatory variables to test the effect of different non-landslide strategies: lithological units, grouped by their geotechnical properties, and topographic parameters such as aspect, elevation, slope gradient and topographic position. Landslide susceptibility maps will be derived by applying logistic regression, while systematic comparisons will be carried out between models created with different non-landslide sampling strategies. Models generated by conventional random sampling are presented against models based on stratified and clustered sampling strategies. The modelling results will be compared in terms of their prediction performance, measured by the AUROC (Area Under the Receiver Operating Characteristic Curve) obtained by means of k-fold cross-validation, and in terms of the spatial pattern of the maps. The outcomes of this study are intended to contribute to the understanding of how landslide-inventory based biases may be counteracted.
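
    A toy version of the comparison described above — landslide presences recorded only inside mapped areas, non-landslide cells drawn either from the whole territory or only from mapped areas, and models compared by k-fold AUROC — might look like the following. The single slope predictor, the synthetic inventory and the sample sizes are illustrative assumptions, not the Austrian data or the study's variable set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def fit_susceptibility(X_pos, X_neg, k=5):
    """Fit a logistic-regression susceptibility model on landslide (1) and
    non-landslide (0) samples and return the mean k-fold AUROC."""
    X = np.vstack([X_pos, X_neg])
    y = np.r_[np.ones(len(X_pos)), np.zeros(len(X_neg))]
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X, y, cv=k, scoring="roc_auc").mean()

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    n_cells = 20000
    slope = rng.uniform(0, 45, n_cells)                   # degrees
    mapped = rng.uniform(0, 1, n_cells) < 0.3             # cells covered by mapping campaigns
    # true susceptibility rises with slope, but landslides are only *recorded* in mapped cells
    p_true = 1 / (1 + np.exp(-(slope - 25) / 4))
    recorded = (rng.uniform(0, 1, n_cells) < p_true) & mapped
    X = slope[:, None]
    X_pos = X[recorded]

    # strategy A: non-landslide cells drawn from the whole territory
    idx_a = rng.choice(np.flatnonzero(~recorded), size=len(X_pos), replace=False)
    # strategy B: non-landslide cells drawn only from mapped areas
    idx_b = rng.choice(np.flatnonzero(mapped & ~recorded), size=len(X_pos), replace=False)

    print("AUROC, random non-landslides:      %.3f" % fit_susceptibility(X_pos, X[idx_a]))
    print("AUROC, mapped-area non-landslides: %.3f" % fit_susceptibility(X_pos, X[idx_b]))
```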

  3. Kriging - a challenge in geochemical mapping

    NASA Astrophysics Data System (ADS)

    Stojdl, Jiri; Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan; Vachova, Tatina; Hosek, Michal

    2017-04-01

    Geochemists can easily provide datasets for contamination mapping thanks to recent advances in geographical information systems (GIS) and portable chemical-analytical instrumentation. Kriging is commonly used to visualise the results of such mapping. This is understandable, as kriging is a well-established method of spatial interpolation. It was created in the 1950s for geochemical data processing, to estimate the most likely distribution of gold based on samples from a few boreholes. However, kriging is based on the assumption of a continuous spatial distribution of numeric data, which is not realistic in environmental geochemistry. The use of kriging is correct when the data density is sufficient with respect to the heterogeneity of the spatial distribution of the geochemical parameters. However, if anomalous geochemical values are concentrated in hotspots whose boundaries are not sampled densely enough, kriging can produce misleading maps in which the real contours of hotspots are blurred by data smoothing and individual (isolated) but relevant anomalous values are levelled out. Data smoothing can thus result in underestimation of geochemical extremes, which may in fact be of the greatest importance in mapping projects. In our study we characterised hotspots of contamination by uranium and zinc in the floodplain of the Ploučnice River. The first objective of our study was to compare three methods of sampling as the basis for pollution maps: random (based on stochastic generation of sampling points), systematic (square grid) and judgemental sampling (based on judgement stemming from principles of fluvial deposition). The first problem encountered in producing the maps was reducing the smoothing effect of kriging by using an appropriate function for the empirical semivariogram and setting the variation at microscales smaller than the sampling distances (the "nugget" parameter of the semivariogram) to a minimum. Exact interpolators such as Inverse Distance Weighting (IDW) or Radial Basis Functions (RBF) provide better solutions in this respect. The second problem was the heterogeneous structure of the floodplain: it consists of distinct sedimentary bodies (e.g., natural levees, meander scars, point bars), which have been formed by different processes (erosion or deposition during flooding, channel shifts by meandering, channel abandonment). Interpolating across these sedimentary bodies therefore makes little sense. A solution is to identify the boundaries between sedimentary bodies and to interpolate the data with this additional information using exact interpolators with barriers (IDW, RBF or stratified kriging) or regression kriging. Those boundaries can be identified using, e.g., a digital elevation model (DEM), dipole electromagnetic profiling (DEMP), gamma spectrometry, or the expertise of a geomorphologist.
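
    Since the abstract contrasts smoothing kriging with exact interpolators such as IDW, a minimal IDW implementation is sketched below. The synthetic uranium-like hotspot, the power parameter and the grid are assumptions used only to show that an exact interpolator preserves observed extremes at the sample locations.

```python
import numpy as np

def idw(xy_obs, z_obs, xy_grid, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: an exact interpolator that honours observed
    values at sample points and does not level out local extremes the way an
    over-smoothed kriging model can."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)  # (n_grid, n_obs)
    w = 1.0 / np.maximum(d, eps) ** power
    z = (w * z_obs).sum(axis=1) / w.sum(axis=1)
    # exactly reproduce observations where a grid node coincides with a sample
    hit = d.min(axis=1) < eps
    z[hit] = z_obs[d[hit].argmin(axis=1)]
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    xy_obs = rng.uniform(0, 100, (80, 2))                 # sampling points on a floodplain
    hotspot = np.array([30.0, 60.0])
    z_obs = 50 + 400 * np.exp(-np.sum((xy_obs - hotspot) ** 2, axis=1) / 50.0)  # U-like hotspot
    gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
    grid = np.c_[gx.ravel(), gy.ravel()]
    z_grid = idw(xy_obs, z_obs, grid)
    print("max observed: %.1f   max interpolated: %.1f" % (z_obs.max(), z_grid.max()))
```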

  4. The Use of Handheld X-Ray Fluorescence (XRF) Technology in Unraveling the Eruptive History of the San Francisco Volcanic Field, Arizona

    NASA Technical Reports Server (NTRS)

    Young, Kelsey E.; Evans, C. A.; Hodges, K. V.

    2012-01-01

    While traditional geologic mapping includes the examination of structural relationships between rock units in the field, more advanced technology now enables us to simultaneously collect and combine analytical datasets with field observations. Information about tectonomagmatic processes can be gleaned from these combined data products. Historically, construction of multi-layered field maps that include sample data has been accomplished serially (first map and collect samples, analyze samples, combine data, and finally, readjust maps and conclusions about geologic history based on the combined data sets). New instruments that can be used in the field, such as a handheld X-ray fluorescence (XRF) unit, are now available. Targeted use of such instruments enables geologists to collect preliminary geochemical data while in the field so that they can optimize the scientific data return from each field traverse. Our study tests the application of this technology and projects the benefits gained by real-time geochemical data in the field. The integrated data set produces a richer geologic map and gives field geologists a stronger contextual picture when collecting field observations and samples for future laboratory work. Real-time geochemical data on samples also provide valuable insight regarding sampling decisions by the field geologist.

  5. High-resolution mapping of forest carbon stocks in the Colombian Amazon

    NASA Astrophysics Data System (ADS)

    Asner, G. P.; Clark, J. K.; Mascaro, J.; Galindo García, G. A.; Chadwick, K. D.; Navarrete Encinales, D. A.; Paez-Acosta, G.; Cabrera Montenegro, E.; Kennedy-Bowdoin, T.; Duque, Á.; Balaji, A.; von Hildebrand, P.; Maatoug, L.; Bernal, J. F. Phillips; Yepes Quintero, A. P.; Knapp, D. E.; García Dávila, M. C.; Jacobson, J.; Ordóñez, M. F.

    2012-07-01

    High-resolution mapping of tropical forest carbon stocks can assist forest management and improve implementation of large-scale carbon retention and enhancement programs. Previous high-resolution approaches have relied on field plot and/or light detection and ranging (LiDAR) samples of aboveground carbon density, which are typically upscaled to larger geographic areas using stratification maps. Such efforts often rely on detailed vegetation maps to stratify the region for sampling, but existing tropical forest maps are often too coarse and field plots too sparse for high-resolution carbon assessments. We developed a top-down approach for high-resolution carbon mapping in a 16.5 million ha region (> 40%) of the Colombian Amazon - a remote landscape seldom documented. We report on three advances for large-scale carbon mapping: (i) employing a universal approach to airborne LiDAR-calibration with limited field data; (ii) quantifying environmental controls over carbon densities; and (iii) developing stratification- and regression-based approaches for scaling up to regions outside of LiDAR coverage. We found that carbon stocks are predicted by a combination of satellite-derived elevation, fractional canopy cover and terrain ruggedness, allowing upscaling of the LiDAR samples to the full 16.5 million ha region. LiDAR-derived carbon maps have 14% uncertainty at 1 ha resolution, and the regional map based on stratification has 28% uncertainty in any given hectare. High-resolution approaches with quantifiable pixel-scale uncertainties will provide the most confidence for monitoring changes in tropical forest carbon stocks. Improved confidence will allow resource managers and decision makers to more rapidly and effectively implement actions that better conserve and utilize forests in tropical regions.

  6. Empirically Guided Coordination of Multiple Evidence-Based Treatments: An Illustration of Relevance Mapping in Children's Mental Health Services

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.

    2011-01-01

    Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…

  7. Multilayered nonuniform sampling for three-dimensional scene representation

    NASA Astrophysics Data System (ADS)

    Lin, Huei-Yung; Xiao, Yu-Hua; Chen, Bo-Ren

    2015-09-01

    The representation of a three-dimensional (3-D) scene is essential in multiview imaging technologies. We present a unified geometry and texture representation based on global resampling of the scene. A layered data map representation with a distance-dependent nonuniform sampling strategy is proposed. It is capable of increasing the details of the 3-D structure locally and is compact in size. The 3-D point cloud obtained from the multilayered data map is used for view rendering. For any given viewpoint, image synthesis with different levels of detail is carried out using the quadtree-based nonuniformly sampled 3-D data points. Experimental results are presented using the 3-D models of reconstructed real objects.

  8. Mapping South San Francisco Bay's seabed diversity for use in wetland restoration planning

    USGS Publications Warehouse

    Fregoso, Theresa A.; Jaffe, B.; Rathwell, G.; Collins, W.; Rhynas, K.; Tomlin, V.; Sullivan, S.

    2006-01-01

    Data for an acoustic seabed classification were collected as part of a California Coastal Conservancy funded bathymetric survey of South Bay in early 2005. A QTC VIEW seabed classification system recorded echoes from a single beam 50 kHz echosounder. Approximately 450,000 seabed classification records were generated from an area of about 30 sq. miles. Ten distinct acoustic classes were identified through an unsupervised classification system using principal component and cluster analyses. One hundred and sixty-one grab samples and forty-five benthic community composition data samples, collected in the study area shortly before and after the seabed classification survey, further refined the ten classes into groups based on grain size. A preliminary map of surficial grain size of South Bay was developed from the combination of the seabed classification and the grab and benthic samples. The initial seabed classification map, the grain size map, and the locations of sediment samples will be displayed along with the methods of acoustic seabed classification.

  9. Impact of population structure, effective bottleneck time, and allele frequency on linkage disequilibrium maps

    PubMed Central

    Zhang, Weihua; Collins, Andrew; Gibson, Jane; Tapper, William J.; Hunt, Sarah; Deloukas, Panos; Bentley, David R.; Morton, Newton E.

    2004-01-01

    Genetic maps in linkage disequilibrium (LD) units play the same role for association mapping as maps in centimorgans provide at much lower resolution for linkage mapping. Association mapping of genes determining disease susceptibility and other phenotypes is based on the theory of LD, here applied to relations with three phenomena. To test the theory, markers at high density along a 10-Mb continuous segment of chromosome 20q were studied in African-American, Asian, and Caucasian samples. Population structure, whether created by pooling samples from divergent populations or by the mating pattern in a mixed population, is accurately bioassayed from genotype frequencies. The effective bottleneck time for Eurasians is substantially less than for migration out of Africa, reflecting later bottlenecks. The classical dependence of allele frequency on mutation age does not hold for the generally shorter time span of inbreeding and LD. Limitation of the classical theory to mutation age justifies the assumption of constant time in a LD map, except for alleles that were rare at the effective bottleneck time or have arisen since. This assumption is derived from the Malecot model and verified in all samples. Tested measures of relative efficiency, support intervals, and localization error determine the operating characteristics of LD maps that are applicable to every sexually reproducing species, with implications for association mapping, high-resolution linkage maps, evolutionary inference, and identification of recombinogenic sequences. PMID:15604137

  10. Impact of population structure, effective bottleneck time, and allele frequency on linkage disequilibrium maps.

    PubMed

    Zhang, Weihua; Collins, Andrew; Gibson, Jane; Tapper, William J; Hunt, Sarah; Deloukas, Panos; Bentley, David R; Morton, Newton E

    2004-12-28

    Genetic maps in linkage disequilibrium (LD) units play the same role for association mapping as maps in centimorgans provide at much lower resolution for linkage mapping. Association mapping of genes determining disease susceptibility and other phenotypes is based on the theory of LD, here applied to relations with three phenomena. To test the theory, markers at high density along a 10-Mb continuous segment of chromosome 20q were studied in African-American, Asian, and Caucasian samples. Population structure, whether created by pooling samples from divergent populations or by the mating pattern in a mixed population, is accurately bioassayed from genotype frequencies. The effective bottleneck time for Eurasians is substantially less than for migration out of Africa, reflecting later bottlenecks. The classical dependence of allele frequency on mutation age does not hold for the generally shorter time span of inbreeding and LD. Limitation of the classical theory to mutation age justifies the assumption of constant time in a LD map, except for alleles that were rare at the effective bottleneck time or have arisen since. This assumption is derived from the Malecot model and verified in all samples. Tested measures of relative efficiency, support intervals, and localization error determine the operating characteristics of LD maps that are applicable to every sexually reproducing species, with implications for association mapping, high-resolution linkage maps, evolutionary inference, and identification of recombinogenic sequences.

  11. Electrostatic frequency maps for amide-I mode of β-peptide: Comparison of molecular mechanics force field and DFT calculations

    NASA Astrophysics Data System (ADS)

    Cai, Kaicong; Zheng, Xuan; Du, Fenfen

    2017-08-01

    The spectroscopy of amide-I vibrations has been widely utilized for the understanding of the dynamical structure of polypeptides. For the modeling of amide-I spectra, two frequency maps were built for a β-peptide analogue (N-ethylpropionamide, NEPA) in a number of solvents within different schemes (molecular mechanics force field based, GM map; DFT calculation based, GD map), respectively. The electrostatic potentials on the amide unit originating from solvents and the peptide backbone were correlated to the amide-I frequency shift from gas phase to solution phase during map parameterization. The GM map is easier to construct with negligible computational cost, since the frequency calculations for the samples are purely based on the force field, while the GD map utilizes sophisticated DFT calculations on representative solute-solvent clusters and brings insight into the electronic structures of solvated NEPA and its chemical environments. The results show that the map-predicted amide-I frequencies are sensitive to the solvation environment and exhibit specific characteristics depending on the map protocol, and the obtained vibrational parameters are in satisfactory agreement with experimental amide-I spectra of NEPA in solution phase. Although maps based on different theoretical schemes have their advantages and disadvantages, the present maps show their potential for interpreting the amide-I spectra of β-peptides.

  12. Reduced electron exposure for energy-dispersive spectroscopy using dynamic sampling

    DOE PAGES

    Zhang, Yan; Godaliyadda, G. M. Dilshan; Ferrier, Nicola; ...

    2017-10-23

    Analytical electron microscopy and spectroscopy of biological specimens, polymers, and other beam-sensitive materials has been a challenging area due to irradiation damage. There is a pressing need to develop novel imaging and spectroscopic imaging methods that will minimize such sample damage as well as reduce the data acquisition time. The latter is useful for high-throughput analysis of materials structure and chemistry. In this work, we present a novel machine learning based method for dynamic sparse sampling of EDS data using a scanning electron microscope. Our method, based on the supervised learning approach for dynamic sampling and neural network based classification of EDS data, allows a dramatic reduction in the total sampling of up to 90%, while maintaining the fidelity of the reconstructed elemental maps and spectroscopic data. In conclusion, we believe this approach will enable imaging and elemental mapping of materials that would otherwise be inaccessible to these analysis techniques.

  13. Kalman/Map filtering-aided fast normalized cross correlation-based Wi-Fi fingerprinting location sensing.

    PubMed

    Sun, Yongliang; Xu, Yubin; Li, Cheng; Ma, Lin

    2013-11-13

    A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results.
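
    A stripped-down fingerprinting step based on plain normalized cross-correlation (not the paper's FNCC, and without the Kalman/map filter) is sketched below to show how on-line RSS samples are scored against stored reference-point fingerprints. The log-distance path-loss model, the grid geometry and the noise level are assumptions made for the demo.

```python
import numpy as np

def ncc(a, b, eps=1e-9):
    """Basic normalized cross-correlation between two RSS vectors."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

def locate(online_rss, fingerprints, positions):
    """Score every reference point by the NCC between its stored mean RSS
    fingerprint and the mean of the on-line RSS samples, and return the
    position of the best-scoring reference point."""
    query = online_rss.mean(axis=0)                       # average the on-line samples
    scores = np.array([ncc(query, fp) for fp in fingerprints])
    return positions[scores.argmax()], scores

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n_rp, n_ap = 25, 6
    positions = np.array([(i % 5 * 3.0, i // 5 * 3.0) for i in range(n_rp)])  # 5x5 grid, 3 m spacing
    ap_xy = rng.uniform(0, 12, (n_ap, 2))
    def rss_at(p):                                        # simple log-distance path-loss model
        d = np.linalg.norm(ap_xy - p, axis=1) + 0.5
        return -40 - 25 * np.log10(d)
    fingerprints = np.array([rss_at(p) for p in positions])
    true_pos = positions[13]
    online_rss = rss_at(true_pos) + rng.normal(0, 2, (8, n_ap))  # 8 noisy on-line samples
    est, _ = locate(online_rss, fingerprints, positions)
    print("true:", true_pos, " estimated:", est)
```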

  14. Kalman/Map Filtering-Aided Fast Normalized Cross Correlation-Based Wi-Fi Fingerprinting Location Sensing

    PubMed Central

    Sun, Yongliang; Xu, Yubin; Li, Cheng; Ma, Lin

    2013-01-01

    A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results. PMID:24233027

  15. Lunar Silicon Abundance determined by Kaguya Gamma-ray Spectrometer and Chandrayaan-1 Moon Mineralogy Mapper

    NASA Astrophysics Data System (ADS)

    Kim, Kyeong; Berezhnoy, Alexey; Wöhler, Christian; Grumpe, Arne; Rodriguez, Alexis; Hasebe, Nobuyuki; Van Gasselt, Stephan

    2016-07-01

    Using Kaguya GRS data, we investigated the Si distribution on the Moon, based on a study of the 4934 keV Si gamma-ray peak caused by the interaction between thermal neutrons and lunar Si-28 atoms. A Si peak analysis on a grid of 10 degrees in longitude and latitude was accomplished with the IRAP Aquarius program, followed by corrections for altitude and thermal neutron density. A spectral-parameter-based regression model of the Si distribution was built for latitudes between 60°S and 60°N based on the continuum slopes, band depths, widths and minimum wavelengths of the absorption bands near 1 μm and 2 μm. Based on these regression models, a nearly global cpm (counts per minute) map of Si with a resolution of 20 pixels per degree was constructed. The construction of a nearly global map of lunar Si abundances has been achieved by combining regression-based analysis of KGRS cpm data with M³ spectral reflectance data, calibrated with respect to returned-sample-based wt% values. The Si abundances estimated with our method systematically exceed those of the LP GRS Si data set but are consistent with typical Si abundances of lunar basalt samples (in the maria) and feldspathic mineral samples (in the highlands). Our Si map shows that Si abundance values on the Moon are typically between 17 and 28 wt%. The obtained Si map will provide an important contribution to understanding both the distribution of minerals and the evolution of the lunar surface since its formation.

  16. Uncertainty estimation for map-based analyses

    Treesearch

    Ronald E. McRoberts; Mark A. Hatfield; Susan J. Crocker

    2010-01-01

    Traditionally, natural resource managers have asked the question, “How much?” and have received sample-based estimates of resource totals or means. Increasingly, however, the same managers are now asking the additional question, “Where?” and are expecting spatially explicit answers in the form of maps. Recent development of natural resource databases, access to...

  17. Korean coastal water depth/sediment and land cover mapping (1:25,000) by computer analysis of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Park, K. Y.; Miller, L. D.

    1978-01-01

    Computer analysis was applied to single-date LANDSAT MSS imagery of a sample coastal area near Seoul, Korea, equivalent in extent to a 1:50,000 topographic map. Supervised image processing yielded a test classification map from this sample image containing 12 classes: 5 water depth/sediment classes, 2 shoreline/tidal classes, and 5 coastal land cover classes at a scale of 1:25,000 and with a training set accuracy of 76%. Unsupervised image classification was applied to a subportion of the site and produced classification maps with spatially comparable results. The results of this test indicated that it is feasible to produce such quantitative maps for detailed study of dynamic coastal processes, given a LANDSAT image data base at sufficiently frequent time intervals.

  18. Cluster categorization of urban roads to optimize their noise monitoring.

    PubMed

    Zambon, G; Benocci, R; Brambilla, G

    2016-01-01

    Road traffic in urban areas is recognized to be associated with urban mobility and public health, and it is often the main source of noise pollution. Lately, noise maps have been considered a powerful tool for estimating population exposure to environmental noise, but they need to be validated with measured noise data. The project Dynamic Acoustic Mapping (DYNAMAP), co-funded in the framework of the LIFE 2013 program, aims to develop a statistically based method to optimize the choice and number of monitoring sites and to automate the noise map update using data retrieved from a low-cost monitoring network. Indeed, the first objective should improve on spatial sampling based on the legislative road classification, as that classification mainly reflects the geometrical characteristics of the road rather than its noise emission. The present paper describes the statistical approach of the methodology under development and the results of its preliminary application to a limited sample of roads in the city of Milan. The resulting categorization of roads, based on clustering the 24-h hourly LAeq,h values, looks promising for optimizing the spatial sampling of noise monitoring toward a description of the noise pollution due to complex urban road networks that is more efficient than one based on the legislative road classification.
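
    The road categorization described above rests on clustering 24-h hourly noise profiles; a minimal sketch with k-means on mean-centred synthetic LAeq,h profiles is shown below. The two synthetic road types, the dB levels and the choice of two clusters are illustrative assumptions, not DYNAMAP's actual data or cluster count.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_roads(hourly_laeq, n_clusters=2, seed=0):
    """Cluster roads by the *shape* of their 24-h hourly LAeq profile: each
    profile is centred on its own daily mean so that clusters reflect temporal
    behaviour rather than absolute loudness."""
    profiles = hourly_laeq - hourly_laeq.mean(axis=1, keepdims=True)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(profiles)
    return km.labels_, km.cluster_centers_

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    hours = np.arange(24)
    # synthetic road types: arterial roads (busy all day, small daily swing) vs
    # local roads (pronounced day/night difference); levels in dB(A), illustrative
    arterial = 68 + 2 * np.sin((hours - 15) / 24 * 2 * np.pi)
    local = 58 + 8 * np.sin((hours - 15) / 24 * 2 * np.pi)
    roads = np.vstack([arterial + rng.normal(0, 1, (40, 24)),
                       local + rng.normal(0, 1, (40, 24))])
    labels, centers = cluster_roads(roads)
    print("cluster sizes:", np.bincount(labels))
```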

  19. A note on the efficiencies of sampling strategies in two-stage Bayesian regional fine mapping of a quantitative trait.

    PubMed

    Chen, Zhijian; Craiu, Radu V; Bull, Shelley B

    2014-11-01

    In focused studies designed to follow up associations detected in a genome-wide association study (GWAS), investigators can proceed to fine-map a genomic region by targeted sequencing or dense genotyping of all variants in the region, aiming to identify a functional sequence variant. For the analysis of a quantitative trait, we consider a Bayesian approach to fine-mapping study design that incorporates stratification according to a promising GWAS tag SNP in the same region. Improved cost-efficiency can be achieved when the fine-mapping phase incorporates a two-stage design, with identification of a smaller set of more promising variants in a subsample taken in stage 1, followed by their evaluation in an independent stage 2 subsample. To avoid the potential negative impact of genetic model misspecification on inference we incorporate genetic model selection based on posterior probabilities for each competing model. Our simulation study shows that, compared to simple random sampling that ignores genetic information from GWAS, tag-SNP-based stratified sample allocation methods reduce the number of variants continuing to stage 2 and are more likely to promote the functional sequence variant into confirmation studies. © 2014 WILEY PERIODICALS, INC.

  20. Decoding 2D-PAGE complex maps: relevance to proteomics.

    PubMed

    Pietrogrande, Maria Chiara; Marchetti, Nicola; Dondi, Francesco; Righetti, Pier Giorgio

    2006-03-20

    This review describes two mathematical approaches useful for decoding the complex signal of 2D-PAGE maps of protein mixtures. These methods are helpful for interpreting the large amount of data in each 2D-PAGE map by extracting the analytical information hidden therein by spot overlapping. Here the basic theory and its application to 2D-PAGE maps are reviewed: the means for extracting information from the experimental data and their relevance to proteomics are discussed. One method is based on the quantitative theory of the statistical model of peak overlapping (SMO), using the experimental spot data (intensity and spatial coordinates). The second method is based on the study of the 2D autocovariance function (2D-ACVF) computed on the experimental digitised map. They are two independent methods that extract equal and complementary information from the 2D-PAGE map. Both methods make it possible to obtain fundamental information on the sample complexity and the separation performance and to single out ordered patterns present in the spot positions: the availability of two independent procedures to compute the same separation parameters is a powerful tool for estimating the reliability of the obtained results. The SMO procedure is a unique tool for quantitatively estimating the degree of spot overlapping present in the map, while the 2D-ACVF method is particularly powerful in singling out the presence of order in the spot positions from the complexity of the whole 2D map, i.e., spot trains. The procedures were validated by extensive numerical computation on computer-generated maps describing experimental 2D-PAGE gels of protein mixtures. Their applicability to real samples was tested on reference maps obtained from literature sources. The review describes the most relevant information for proteomics: sample complexity, separation performance, overlapping extent, and identification of spot trains related to post-translational modifications (PTMs).
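
    The 2D-ACVF part of the review lends itself to a short sketch: the autocovariance of a digitised map can be computed via the FFT, and an ordered spot train shows up as periodic secondary peaks along the train direction. The synthetic Gaussian spots and their 12-pixel spacing below are assumptions for illustration, not data from the review.

```python
import numpy as np

def acvf_2d(image):
    """2-D autocovariance function of a digitised map, computed via the FFT.
    Periodic peaks along an axis of the ACVF reveal ordered spot trains."""
    x = image - image.mean()
    f = np.fft.fft2(x)
    acvf = np.fft.ifft2(f * np.conj(f)).real / x.size
    return np.fft.fftshift(acvf)                          # zero lag at the centre

if __name__ == "__main__":
    # synthetic 2D-PAGE-like map: a horizontal train of equally spaced spots
    ny, nx = 128, 128
    yy, xx = np.mgrid[0:ny, 0:nx]
    image = np.zeros((ny, nx))
    for cx in range(20, 120, 12):                         # spot train, 12-pixel spacing
        image += np.exp(-(((xx - cx) ** 2) + (yy - 64) ** 2) / 8.0)
    acvf = acvf_2d(image)
    row = acvf[ny // 2]                                   # ACVF along the train direction
    print("lag of first off-centre peak:",
          int(np.argmax(row[nx // 2 + 3: nx // 2 + 20])) + 3)
```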

  1. EnGeoMAP - geological applications within the EnMAP hyperspectral satellite science program

    NASA Astrophysics Data System (ADS)

    Boesche, N. K.; Mielke, C.; Rogass, C.; Guanter, L.

    2016-12-01

    Hyperspectral investigations from the near field to space contribute substantially to geological exploration and to the monitoring of raw material and mineral deposit mining. Due to their spectral characteristics, large mineral occurrences and mine sites can be identified from space, and the spatial distribution of distinct proxy minerals can be mapped. In the frame of the EnMAP hyperspectral satellite science program, a mineral and elemental mapping tool was developed - EnGeoMAP. It contains a basic mineral mapping approach and a rare earth element mapping approach. This study shows the performance of EnGeoMAP based on simulated EnMAP data of the rare earth element bearing Mountain Pass Carbonatite Complex, USA, and of the Rodalquilar and Lomilla Calderas, Spain, which host economically relevant gold-silver, lead-zinc-silver-gold and alunite deposits. The Mountain Pass image data were simulated on the basis of AVIRIS Next Generation images, while the Rodalquilar data are based on HyMap images. The EnGeoMAP - Base approach was applied to both images, while the Mountain Pass image data were additionally analysed using the EnGeoMAP - REE software tool. The results are mineral and elemental maps that serve as proxies for the regional lithology and deposit types. The validation of the maps is based on chemical analyses of field samples. Current airborne sensors meet the spatial and spectral requirements for detailed mineral mapping, and future hyperspectral spaceborne missions will additionally provide large coverage. For those hyperspectral missions, EnGeoMAP is a rapid data analysis tool provided to spectral geologists working in mineral exploration.

  2. Is a Picture Worth a Thousand Words? Using Mind Maps to Facilitate Participant Recall in Qualitative Research

    ERIC Educational Resources Information Center

    Wheeldon, Johannes

    2011-01-01

    Mind maps may provide a new means to gather unsolicited data through qualitative research designs. In this paper, I explore the utility of mind maps through a project designed to uncover the experiences of Latvians involved in a legal technical assistance project. Based on a sample of 19 respondents, the depth and detail of the responses between…

  3. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and the facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions, which can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach with a data feedback mechanism based on the sampling history. In this approach, after a short burn-in period in which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples, to increase the acceptance rate and efficiency of the conditioning. This conditioning process can be viewed as an optimization approach in which each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
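
    The feedback loop described above (burn-in with unconditional proposals, acceptance/rejection against the flow data, and a facies probability map updated from the accepted ensemble) is sketched schematically below. The 20 x 20 binary field, the "flow simulator" that simply reads facies at well cells, and the acceptance tolerance are stand-in assumptions; a real workflow would use an MPS simulator and a numerical flow model.

```python
import numpy as np

def adaptive_conditioning(observed, simulate_flow, n_iter=500, burn_in=50, tol=1.0, seed=0):
    """Schematic feedback loop: propose facies fields guided by an evolving
    probability map, keep those whose simulated data match the observations,
    and update the map from the ensemble of accepted fields."""
    rng = np.random.default_rng(seed)
    shape = (20, 20)
    prob_map = np.full(shape, 0.5)                        # start uninformed
    accepted = []
    for it in range(n_iter):
        guide = 0.5 * np.ones(shape) if it < burn_in else prob_map
        field = (rng.uniform(size=shape) < guide).astype(float)   # 1 = channel facies
        mismatch = np.abs(simulate_flow(field) - observed).sum()
        if mismatch < tol:
            accepted.append(field)
            prob_map = np.mean(accepted, axis=0)          # facies probability map
    return prob_map, len(accepted)

if __name__ == "__main__":
    truth = np.zeros((20, 20))
    truth[8:12, :] = 1                                    # a horizontal channel
    wells = [(10, 3), (10, 16), (2, 5), (17, 12)]         # "flow data" = facies at well cells
    def simulate_flow(field):                             # stand-in for a flow simulator
        return np.array([field[r, c] for r, c in wells])
    observed = simulate_flow(truth)
    prob_map, n_acc = adaptive_conditioning(observed, simulate_flow, tol=0.5)
    print("accepted samples:", n_acc)
    print("P(channel) at in-channel wells:", np.round(prob_map[10, [3, 16]], 2),
          " at off-channel wells:", np.round(prob_map[[2, 17], [5, 12]], 2))
```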

  4. Target-specific digital soil mapping supporting terroir mapping in Tokaj Wine Region, Hungary

    NASA Astrophysics Data System (ADS)

    Takács, Katalin; Szabó, József; Laborczi, Annamária; Szatmári, Gábor; László, Péter; Koós, Sándor; Bakacsi, Zsófia; Pásztor, László

    2016-04-01

    Tokaj Wine Region - located in Northeast Hungary, at Hegyalja, in the Tokaj Mountains - is a historical region for making botrytized dessert wine. Very recently, sustainable quality wine production in the region has been targeted, which requires a detailed, terroir-based characterization of the viticultural land and a survey of the state of the vineyards. A terroir is a homogeneous area defined by both environmental and cultural factors that influence grape and wine quality. Soil plays a dominant role in determining viticultural potential and terroir delineation. According to viticultural experts, the most relevant soil properties are drainage, water holding capacity, soil depth and pH. Not all of these soil characteristics can be measured directly; therefore, a synthesis of the observed soil properties is needed to satisfy the requirements of terroir mapping. The sampling strategy was designed to be representative of the combinations of basic environmental parameters (slope, aspect and geology) which determine the main soil properties of the vineyards. The field survey was carried out in two steps. First, soil samples were collected from 200 sites to obtain a general view of the pedology of the area. In the second stage a further 650 samples were collected, with the sampling strategy designed using a spatial annealing technique, taking into consideration the results of the preliminary survey and the local characteristics of the vineyards. The data collection covered soil type, soil depth, parent material, rate of erosion, organic matter content and further physical and chemical soil properties which support the inference of the required soil parameters. In the framework of the recent project, 33 primary and secondary soil property, soil class and soil function maps were compiled. A subset of the resulting maps supports the Hungarian standard viticultural potential assessment, while the majority of the maps are intended to be applied to terroir delineation. The spatial extension was performed by two different methods which are widely applied in digital soil mapping: regression kriging was used for creating continuous soil property maps, while category-type soil maps were compiled by the classification tree method. Accuracy assessment was also provided for all of the soil map products. Our poster will present a summary of the project workflow - the design of the sampling strategy, the field survey, and the digital soil mapping process - and some examples of the resulting soil property maps, indicating their applicability to terroir delineation. Acknowledgement: The authors are grateful to the Tokaj Kereskedöház Ltd., which has been supporting the project for the survey of the state of vineyards. Digital soil mapping was partly supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).

  5. An evaluation of sampling and full enumeration strategies for Fisher Jenks classification in big data settings

    USGS Publications Warehouse

    Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.

    2017-01-01

    Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. The impacts of spatial autocorrelation, number of desired classes, and form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between improved speed of the sampling approaches and loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
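
    The sampling strategy evaluated above can be illustrated by running an exact Fisher-Jenks dynamic program on a random sample of a large attribute and then scoring the resulting breaks on the full data with a goodness-of-variance-fit measure. The mixture data, the sample size of 500 and the use of GVF as the accuracy score are assumptions for the demo, not the paper's simulation design.

```python
import numpy as np

def fisher_jenks_breaks(values, k):
    """Exact Fisher-Jenks optimal 1-D classification via dynamic programming
    (O(k n^2)), practical only for modest n -- hence the appeal of sampling."""
    d = np.sort(np.asarray(values, dtype=float))
    n = len(d)
    csum, csum2 = np.r_[0.0, d.cumsum()], np.r_[0.0, (d ** 2).cumsum()]
    def ssd(i, j):                                        # sum of squared deviations of d[i..j]
        s, s2, m = csum[j + 1] - csum[i], csum2[j + 1] - csum2[i], j - i + 1
        return s2 - s * s / m
    cost = np.full((k, n), np.inf)
    cut = np.zeros((k, n), dtype=int)
    for j in range(n):
        cost[0, j] = ssd(0, j)
    for c in range(1, k):
        for j in range(c, n):
            for i in range(c, j + 1):                     # i = start of the last class
                val = cost[c - 1, i - 1] + ssd(i, j)
                if val < cost[c, j]:
                    cost[c, j], cut[c, j] = val, i
    breaks, j = [], n - 1
    for c in range(k - 1, 0, -1):                         # backtrack the optimal cuts
        i = cut[c, j]
        breaks.append(d[i - 1])                           # upper bound of the class below
        j = i - 1
    return sorted(breaks)

def gvf(values, breaks):
    """Goodness of variance fit of a set of class breaks on the *full* data."""
    values = np.asarray(values, dtype=float)
    classes = np.digitize(values, breaks)
    sdam = ((values - values.mean()) ** 2).sum()
    sdcm = sum(((values[classes == c] - values[classes == c].mean()) ** 2).sum()
               for c in np.unique(classes))
    return 1.0 - sdcm / sdam

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    attribute = np.concatenate([rng.normal(10, 2, 40000),  # large choropleth attribute
                                rng.normal(30, 4, 40000),
                                rng.normal(70, 6, 20000)])
    sample = rng.choice(attribute, size=500, replace=False)
    breaks = fisher_jenks_breaks(sample, k=5)             # classify the sample only
    print("breaks from sample:", np.round(breaks, 1))
    print("GVF on full data:   %.4f" % gvf(attribute, breaks))
```

    Running the exact classifier on a 500-value sample keeps the quadratic dynamic program tractable, while the GVF score on the full attribute measures how much accuracy the sampling trade-off actually costs.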

  6. Electrostatic frequency maps for amide-I mode of β-peptide: Comparison of molecular mechanics force field and DFT calculations.

    PubMed

    Cai, Kaicong; Zheng, Xuan; Du, Fenfen

    2017-08-05

    The spectroscopy of amide-I vibrations has been widely utilized for understanding the dynamical structure of polypeptides. For the modeling of amide-I spectra, two frequency maps were built for a β-peptide analogue (N-ethylpropionamide, NEPA) in a number of solvents using different schemes: a molecular mechanics force field based map (GM map) and a DFT calculation based map (GD map). During map parameterization, the electrostatic potentials on the amide unit originating from the solvents and the peptide backbone were correlated with the amide-I frequency shift from the gas phase to the solution phase. The GM map is easier to construct, with negligible computational cost, since the frequency calculations for the samples are based purely on the force field, while the GD map utilizes sophisticated DFT calculations on representative solute-solvent clusters and brings insight into the electronic structures of solvated NEPA and its chemical environments. The results show that the amide-I frequencies predicted by the maps are sensitive to the solvation environment and exhibit characteristics specific to each map protocol, and the obtained vibrational parameters are in satisfactory agreement with experimental amide-I spectra of NEPA in the solution phase. Although maps based on different theoretical schemes have their own advantages and disadvantages, both of the present maps show potential for interpreting the amide-I spectra of β-peptides. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Estimating uncertainty in map intersections

    Treesearch

    Ronald E. McRoberts; Mark A. Hatfield; Susan J. Crocker

    2009-01-01

    Traditionally, natural resource managers have asked the question "How much?" and have received sample-based estimates of resource totals or means. Increasingly, however, the same managers are now asking the additional question "Where?" and are expecting spatially explicit answers in the form of maps. Recent development of natural resource databases...

  8. Average variograms to guide soil sampling

    NASA Astrophysics Data System (ADS)

    Kerry, R.; Oliver, M. A.

    2004-10-01

    To manage land in a site-specific way for agriculture requires detailed maps of the variation in the soil properties of interest. To predict accurately for mapping, the interval at which the soil is sampled should relate to the scale of spatial variation. A variogram can be used to guide sampling in two ways. A sampling interval of less than half the range of spatial dependence can be used, or the variogram can be used with the kriging equations to determine an optimal sampling interval to achieve a given tolerable error. A variogram might not be available for the site, but if variograms of several soil properties were available on a similar parent material and/or in particular topographic positions, an average variogram could be calculated from these. Averages of the variogram ranges and standardized average variograms from four different parent materials in southern England were used to suggest suitable sampling intervals for future surveys in similar pedological settings based on half the variogram range. The standardized average variograms were also used to determine optimal sampling intervals using the kriging equations. Similar sampling intervals were suggested by each method, and the maps of predictions based on data at different grid spacings were evaluated for the different parent materials. Variograms of loss on ignition (LOI) taken from the literature for other sites in southern England with similar parent materials had ranges close to the average for a given parent material, showing the possible wider application of such averages to guide sampling.
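
    As a minimal sketch of the half-range rule, the code below computes an isotropic experimental semivariogram for a synthetic transect, fits a spherical model with scipy, and reports half the fitted range as a suggested sampling interval. The lag bins, the spherical model choice and the synthetic data are assumptions for illustration; in the approach described above, ranges fitted for several properties on the same parent material would be averaged before applying the rule.

```python
import numpy as np
from scipy.optimize import curve_fit

def experimental_variogram(coords, values, lag_edges):
    """Isotropic experimental semivariogram for a 1-D transect."""
    d = np.abs(coords[:, None] - coords[None, :])         # pairwise separations
    g = 0.5 * (values[:, None] - values[None, :]) ** 2    # pairwise semivariances
    centers, gammas = [], []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (d > lo) & (d <= hi)
        if mask.any():
            centers.append(0.5 * (lo + hi))
            gammas.append(g[mask].mean())
    return np.array(centers), np.array(gammas)

def spherical(h, nugget, sill, a):
    """Spherical variogram model with range parameter a."""
    h = np.asarray(h, dtype=float)
    inside = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, inside, nugget + sill)

# Synthetic soil property sampled along a 500 m transect (placeholder data).
rng = np.random.default_rng(1)
coords = np.sort(rng.uniform(0, 500, 200))
knots_x = np.arange(0, 501, 100.0)
knots_v = rng.normal(size=knots_x.size)
values = np.interp(coords, knots_x, knots_v) + 0.2 * rng.normal(size=coords.size)

lag_edges = np.arange(0, 260, 20)
h, gamma = experimental_variogram(coords, values, lag_edges)
(nugget, sill, a), _ = curve_fit(spherical, h, gamma,
                                 p0=[0.05, np.var(values), 100.0],
                                 bounds=([0.0, 0.0, 1.0], np.inf))
# Half-range rule: sample at no more than half the fitted range of spatial dependence.
print(f"fitted range ~ {a:.0f} m -> suggested sampling interval ~ {a / 2:.0f} m")
```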

  9. EMMMA: A web-based system for environmental mercury mapping, modeling, and analysis

    USGS Publications Warehouse

    Hearn, Paul P.; Wente, Stephen P.; Donato, David I.; Aguinaldo, John J.

    2006-01-01

    tissue, atmospheric emissions and deposition, stream sediments, soils, and coal) and mercury-related data (mine locations); 2) Interactively view and access predictions of the National Descriptive Model of Mercury in Fish (NDMMF) at 4,976 sites and 6,829 sampling events (events are unique combinations of site and sampling date) across the United States; and 3) Use interactive mapping and graphing capabilities to visualize spatial and temporal trends and study relationships between mercury and other variables.

  10. Validation and application of Acoustic Mapping Velocimetry

    NASA Astrophysics Data System (ADS)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive approaches: acoustic and image-based. The bedform mapping is conducted with acoustic surveys, while the velocity of the bedforms is estimated with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport is estimated using the Exner equation. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings, and acceptable agreement was found. As a first field implementation of the AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps resulting from repeated multibeam echo sounder (MBES) surveys served as input data. Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with those obtained from ISSDOTv2, another non-intrusive technique developed by the US Army Corps of Engineers (direct samplings were not available). The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
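
    The Exner-based step reduces to a dune-tracking relation once bedform height and streamwise migration celerity are known. A common form is q_b = (1 - p) * alpha * H * V_c, with porosity p and a shape factor alpha of about 0.5 for roughly triangular bedforms; the sketch below uses that generic relation with invented numbers, since the paper's exact coefficients are not reproduced here.

```python
import numpy as np

def bedload_from_bedforms(height_m, celerity_m_per_s, porosity=0.4, shape_factor=0.5):
    """Dune-tracking (Exner-based) bedload transport rate per unit width.

    q_b = (1 - porosity) * shape_factor * H * V_c  [m^2/s of solid volume], where H is
    bedform height and V_c the streamwise migration celerity; shape_factor ~ 0.5
    assumes roughly triangular bedforms. All values here are illustrative.
    """
    return (1.0 - porosity) * shape_factor * np.asarray(height_m) * np.asarray(celerity_m_per_s)

# Heights and streamwise celerities extracted along one longitudinal section (made up).
H = np.array([0.12, 0.18, 0.15])           # m
Vc = np.array([2.0e-4, 3.5e-4, 2.8e-4])    # m/s
qb = bedload_from_bedforms(H, Vc)
print("per-location q_b:", qb, "m^2/s; section mean:", qb.mean())
```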

  11. High-resolution Antibody Array Analysis of Childhood Acute Leukemia Cells*

    PubMed Central

    Kanderova, Veronika; Kuzilkova, Daniela; Stuchly, Jan; Vaskova, Martina; Brdicka, Tomas; Fiser, Karel; Hrusak, Ondrej; Lund-Johansen, Fridtjof

    2016-01-01

    Acute leukemia is a disease pathologically manifested at both genomic and proteomic levels. Molecular genetic technologies are currently widely used in clinical research. In contrast, sensitive and high-throughput proteomic techniques for performing protein analyses in patient samples are still lacking. Here, we used a technology based on size exclusion chromatography followed by immunoprecipitation of target proteins with an antibody bead array (Size Exclusion Chromatography-Microsphere-based Affinity Proteomics, SEC-MAP) to detect hundreds of proteins from a single sample. In addition, we developed semi-automatic bioinformatics tools to adapt this technology for high-content proteomic screening of pediatric acute leukemia patients. To confirm the utility of SEC-MAP in leukemia immunophenotyping, we tested 31 leukemia diagnostic markers in parallel by SEC-MAP and flow cytometry. We identified 28 antibodies suitable for both techniques. Eighteen of them provided excellent quantitative correlation between SEC-MAP and flow cytometry (p < 0.05). Next, SEC-MAP was applied to examine 57 diagnostic samples from patients with acute leukemia. In this assay, we used 632 different antibodies and detected 501 targets. Of those, 47 targets were differentially expressed between at least two of the three acute leukemia subgroups. The CD markers correlated with immunophenotypic categories as expected. From non-CD markers, we found DBN1, PAX5, or PTK2 overexpressed in B-cell precursor acute lymphoblastic leukemias, LAT, SH2D1A, or STAT5A overexpressed in T-cell acute lymphoblastic leukemias, and HCK, GLUD1, or SYK overexpressed in acute myeloid leukemias. In addition, OPAL1 overexpression corresponded to ETV6-RUNX1 chromosomal translocation. In summary, we demonstrated that SEC-MAP technology is a powerful tool for detecting hundreds of proteins in clinical samples obtained from pediatric acute leukemia patients. It provides information about protein size and reveals differences in protein expression between particular leukemia subgroups. Forty-seven of the SEC-MAP-identified targets were validated by other conventional methods in this study. PMID:26785729

  12. Plant-based plume-scale mapping of tritium contamination in desert soils

    USGS Publications Warehouse

    Andraski, Brian J.; Stonestrom, David A.; Michel, R.L.; Halford, K.J.; Radyk, J.C.

    2005-01-01

    Plant-based techniques were tested for field-scale evaluation of tritium contamination adjacent to a low-level radioactive waste (LLRW) facility in the Amargosa Desert, Nevada. Objectives were to (i) characterize and map the spatial variability of tritium in plant water, (ii) develop empirical relations to predict and map subsurface contamination from plant-water concentrations, and (iii) gain insight into tritium migration pathways and processes. Plant sampling [creosote bush, Larrea tridentata (Sessé & Moc. ex DC.) Coville] required one-fifth the time of soil water vapor sampling. Plant concentrations were spatially correlated to a separation distance of 380 m; measurement uncertainty accounted for <0.1% of the total variability in the data. Regression equations based on plant tritium explained 96 and 90% of the variation in root-zone and sub-root-zone soil water vapor concentrations, respectively. The equations were combined with kriged plant-water concentrations to map subsurface contamination. Mapping showed preferential lateral movement of tritium through a dry, coarse-textured layer beneath the root zone, with concurrent upward movement through the root zone. Analysis of subsurface fluxes along a transect perpendicular to the LLRW facility showed that upward diffusive-vapor transport dominates other transport modes beneath native vegetation. Downward advective-liquid transport dominates at one endpoint of the transect, beneath a devegetated road immediately adjacent to the facility. To our knowledge, this study is the first to document large-scale subsurface vapor-phase tritium migration from a LLRW facility. Plant-based methods provide a noninvasive, cost-effective approach to mapping subsurface tritium migration in desert areas.

  13. Preliminary investigation of submerged aquatic vegetation mapping using hyperspectral remote sensing.

    PubMed

    William, David J; Rybicki, Nancy B; Lombana, Alfonso V; O'Brien, Tim M; Gomez, Richard B

    2003-01-01

    The use of airborne hyperspectral remote sensing imagery for automated mapping of submerged aquatic vegetation (SAV) in the tidal Potomac River was investigated for near-real-time resource assessment and monitoring. Airborne hyperspectral imagery and field spectrometer measurements were obtained in October of 2000. A spectral library database containing selected ground-based and airborne sensor spectra was developed for use in image processing. The spectral library is used to automate the processing of hyperspectral imagery for potential real-time material identification and mapping. Field-based spectra were compared to the airborne imagery using the database to identify and map two species of SAV (Myriophyllum spicatum and Vallisneria americana). Overall accuracy of the vegetation maps derived from hyperspectral imagery was determined by comparison to a product that combined aerial photography and field-based sampling at the end of the SAV growing season. The algorithms and databases developed in this study will be useful with the current and forthcoming space-based hyperspectral remote sensing systems.

  14. Estimating Accuracy of Land-Cover Composition From Two-Stage Clustering Sampling

    EPA Science Inventory

    Land-cover maps are often used to compute land-cover composition (i.e., the proportion or percent of area covered by each class), for each unit in a spatial partition of the region mapped. We derive design-based estimators of mean deviation (MD), mean absolute deviation (MAD), ...

  15. Interpretation of fingerprint image quality features extracted by self-organizing maps

    NASA Astrophysics Data System (ADS)

    Danov, Ivan; Olsen, Martin A.; Busch, Christoph

    2014-05-01

    Accurate prediction of fingerprint quality is of significant importance to any fingerprint-based biometric system. Ensuring high-quality samples for both probe and reference can substantially improve the system's performance by lowering false non-matches, thus allowing finer adjustment of the decision threshold of the biometric system. Furthermore, the increasing usage of biometrics in mobile contexts demands the development of lightweight methods for operational environments. A novel two-tier computationally efficient approach was recently proposed based on modelling block-wise fingerprint image data using a Self-Organizing Map (SOM) to extract specific ridge pattern features, which are then used as an input to a Random Forests (RF) classifier trained to predict the quality score of a propagated sample. This paper conducts an investigative comparative analysis on a publicly available dataset for the improvement of the two-tier approach by additionally proposing three feature interpretation methods, based respectively on SOM, Generative Topographic Mapping and RF. The analysis shows that two of the proposed methods produce promising results on the given dataset.

  16. A tutorial in displaying mass spectrometry-based proteomic data using heat maps.

    PubMed

    Key, Melissa

    2012-01-01

    Data visualization plays a critical role in interpreting experimental results of proteomic experiments. Heat maps are particularly useful for this task, as they allow us to find quantitative patterns across proteins and biological samples simultaneously. The quality of a heat map can be vastly improved by understanding the options available to display and organize the data in the heat map. This tutorial illustrates how to optimize heat maps for proteomics data by incorporating known characteristics of the data into the image. First, the concepts used to guide the creation of heat maps are demonstrated. Then, these concepts are applied to two types of analysis: visualizing spectral features across biological samples, and presenting the results of tests of statistical significance. For all examples we provide details of computer code in the open-source statistical programming language R, which can be used by biologists and clinicians with little statistical background. Heat maps are a useful tool for presenting quantitative proteomic data organized in a matrix format. Understanding and optimizing the parameters used to create the heat map can vastly improve both the appearance and the interpretation of heat map data.
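
    The tutorial's examples are in R; for consistency with the other sketches in this collection, the snippet below shows the same two basic ideas - log-transforming intensities and reordering rows by hierarchical clustering before plotting - in Python with matplotlib and scipy. The synthetic protein-by-sample matrix and the output file name are placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, leaves_list

rng = np.random.default_rng(5)

# Synthetic proteins-by-samples intensity matrix (placeholder for spectral features).
n_proteins, n_samples = 40, 12
data = rng.lognormal(mean=8.0, sigma=1.0, size=(n_proteins, n_samples))
data[:10, :6] *= 4.0                        # one block of "up-regulated" proteins

# Two simple improvements: log-transform the intensities and reorder the rows by
# hierarchical clustering so related proteins sit next to each other.
log_data = np.log2(data)
row_order = leaves_list(linkage(log_data, method="average"))

fig, ax = plt.subplots(figsize=(6, 8))
im = ax.imshow(log_data[row_order], aspect="auto", cmap="viridis")
ax.set_xlabel("biological sample")
ax.set_ylabel("protein (clustered order)")
fig.colorbar(im, ax=ax, label="log2 intensity")
fig.savefig("heatmap.png", dpi=150)
```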

  17. Automated semantic indexing of figure captions to improve radiology image retrieval.

    PubMed

    Kahn, Charles E; Rubin, Daniel L

    2009-01-01

    We explored automated concept-based indexing of unstructured figure captions to improve retrieval of images from radiology journals. The MetaMap Transfer program (MMTx) was used to map the text of 84,846 figure captions from 9,004 peer-reviewed, English-language articles to concepts in three controlled vocabularies from the UMLS Metathesaurus, version 2006AA. Sampling procedures were used to estimate the standard information-retrieval metrics of precision and recall, and to evaluate the degree to which concept-based retrieval improved image retrieval. Precision was estimated based on a sample of 250 concepts. Recall was estimated based on a sample of 40 concepts. The authors measured the impact of concept-based retrieval to improve upon keyword-based retrieval in a random sample of 10,000 search queries issued by users of a radiology image search engine. Estimated precision was 0.897 (95% confidence interval, 0.857-0.937). Estimated recall was 0.930 (95% confidence interval, 0.838-1.000). In 5,535 of 10,000 search queries (55%), concept-based retrieval found results not identified by simple keyword matching; in 2,086 searches (21%), more than 75% of the results were found by concept-based search alone. Concept-based indexing of radiology journal figure captions achieved very high precision and recall, and significantly improved image retrieval.
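
    The precision and recall figures above are sample-based estimates with 95% confidence intervals. A minimal sketch of that kind of calculation using a normal-approximation (Wald) interval is shown below; the success counts are placeholders, not the study's actual tallies.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Placeholder counts: judged-correct outcomes in each evaluation sample.
precision = proportion_ci(successes=224, n=250)
recall = proportion_ci(successes=37, n=40)
print("precision %.3f (95%% CI %.3f-%.3f)" % precision)
print("recall    %.3f (95%% CI %.3f-%.3f)" % recall)
```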

  18. Heterogeneity of shale documented by micro-FTIR and image analysis.

    PubMed

    Chen, Yanyan; Mastalerz, Maria; Schimmelmann, Arndt

    2014-12-01

    In this study, four New Albany Shale Devonian and Mississippian samples, with vitrinite reflectance [Ro] values ranging from 0.55% to 1.41%, were analyzed by micro-FTIR mapping of chemical and mineralogical properties. One additional postmature shale sample from the Haynesville Shale (Kimmeridgian, Ro = 3.0%) was included to test the limitation of the method for more mature substrates. Relative abundances of organic matter and mineral groups (carbonates, quartz and clays) were mapped across selected microscale regions based on characteristic infrared peaks and demonstrated to be consistent with corresponding bulk compositional percentages. Mapped distributions of organic matter provide information on the organic matter abundance and the connectivity of organic matter within the overall shale matrix. The pervasive distribution of organic matter mapped in the New Albany Shale sample MM4 is in agreement with this shale's high total organic carbon abundance relative to other samples. Mapped interconnectivity of organic matter domains in New Albany Shale samples is excellent in two early mature shale samples having Ro values from 0.55% to 0.65%, then dramatically decreases in a late mature sample having an intermediate Ro of 1.15% and finally increases again in the postmature sample, which has a Ro of 1.41%. Swanson permeabilities, derived from independent mercury intrusion capillary pressure porosimetry measurements, follow the same trend among the four New Albany Shale samples, suggesting that micro-FTIR, in combination with complementary porosimetric techniques, strengthens our understanding of porosity networks. In addition, image processing and analysis software (e.g. ImageJ) has the capability to quantify organic matter and total organic carbon - valuable parameters for highly mature rocks, because they cannot be analyzed by micro-FTIR owing to the weakness of the aliphatic carbon-hydrogen signal. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.

  19. Incorporating Aptamers in the Multiple Analyte Profiling Assays (xMAP): Detection of C-Reactive Protein.

    PubMed

    Bernard, Elyse D; Nguyen, Kathy C; DeRosa, Maria C; Tayabali, Azam F; Aranda-Rodriguez, Rocio

    2017-01-01

    Aptamers are short oligonucleotide sequences used in detection systems because of their high affinity binding to a variety of macromolecules. With the introduction of aptamers over 25 years ago came the exploration of their use in many different applications as a substitute for antibodies. Aptamers have several advantages; they are easy to synthesize, can bind to analytes for which it is difficult to obtain antibodies, and in some cases bind better than antibodies. As such, aptamer applications have significantly expanded as an adjunct to a variety of different immunoassay designs. The Multiple-Analyte Profiling (xMAP) technology developed by Luminex Corporation commonly uses antibodies for the detection of analytes in small sample volumes through the use of fluorescently coded microbeads. This technology permits the simultaneous detection of multiple analytes in each sample tested and hence could be applied in many research fields. Although little work has been performed adapting this technology for use with aptamers, optimizing aptamer-based xMAP assays would dramatically increase the versatility of analyte detection. We report herein on the development of an xMAP bead-based aptamer/antibody sandwich assay for a biomarker of inflammation (C-reactive protein or CRP). Protocols for the coupling of aptamers to xMAP beads, validation of coupling, and for an aptamer/antibody sandwich-type assay for CRP are detailed. The optimized conditions, protocols and findings described in this research could serve as a starting point for the development of new aptamer-based xMAP assays.

  20. Evaluation of Techniques Used to Estimate Cortical Feature Maps

    PubMed Central

    Katta, Nalin; Chen, Thomas L.; Watkins, Paul V.; Barbour, Dennis L.

    2011-01-01

    Functional properties of neurons are often distributed nonrandomly within a cortical area and form topographic maps that reveal insights into neuronal organization and interconnection. Some functional maps, such as in visual cortex, are fairly straightforward to discern with a variety of techniques, while other maps, such as in auditory cortex, have resisted easy characterization. In order to determine appropriate protocols for establishing accurate functional maps in auditory cortex, artificial topographic maps were probed under various conditions, and the accuracy of estimates formed from the actual maps was quantified. Under these conditions, low-complexity maps such as sound frequency can be estimated accurately with as few as 25 total samples (e.g., electrode penetrations or imaging pixels) if neural responses are averaged together. More samples are required to achieve the highest estimation accuracy for higher complexity maps, and averaging improves map estimate accuracy even more than increasing sampling density. Undersampling without averaging can result in misleading map estimates, while undersampling with averaging can lead to the false conclusion of no map when one actually exists. Uniform sample spacing only slightly improves map estimation over nonuniform sample spacing typical of serial electrode penetrations. Tessellation plots commonly used to visualize maps estimated using nonuniform sampling are always inferior to linearly interpolated estimates, although differences are slight at higher sampling densities. Within primary auditory cortex, then, multiunit sampling with at least 100 samples would likely result in reasonable feature map estimates for all but the highest complexity maps and the highest variability that might be expected. PMID:21889537
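
    The comparison of tessellation plots with linearly interpolated estimates can be reproduced in miniature with scipy's griddata: nearest-neighbour interpolation mimics a tessellation plot, while linear interpolation gives the smoother estimate the authors found superior. The synthetic feature map, the number of sites and the averaging of repeated noisy samples below are illustrative assumptions, not the study's simulation.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(2)

# Synthetic low-complexity feature map (e.g., a smooth tonotopic gradient) on a unit square.
gx, gy = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
true_map = np.sin(2 * np.pi * gx) + gy

# Nonuniform sampling (e.g., serial electrode penetrations) with noisy responses;
# averaging repeated samples at each site reduces the noise in the estimate.
n_sites, n_repeats = 100, 5
sites = rng.uniform(0, 1, size=(n_sites, 2))
site_truth = np.sin(2 * np.pi * sites[:, 0]) + sites[:, 1]
responses = (site_truth[:, None] + 0.5 * rng.normal(size=(n_sites, n_repeats))).mean(axis=1)

# Two reconstructions from the same samples.
nearest = griddata(sites, responses, (gx, gy), method="nearest")   # tessellation-style
linear = griddata(sites, responses, (gx, gy), method="linear")     # linear interpolation

for name, est in [("nearest", nearest), ("linear", linear)]:
    mse = np.nanmean((est - true_map) ** 2)   # linear leaves NaN outside the convex hull
    print(f"{name:7s} mean squared error: {mse:.3f}")
```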

  1. Assessment and mapping of water pollution indices in Zone-III of Municipal Corporation of Hyderabad using remote sensing and geographic information system.

    PubMed

    Asadi, S S; Vuppala, Padmaja; Reddy, M Anji

    2005-01-01

    A preliminary survey of the area under Zone-III of the Municipal Corporation of Hyderabad (MCH) was undertaken to assess the ground water quality, demonstrate its spatial distribution and correlate it with land use patterns using advanced techniques of remote sensing and geographical information systems (GIS). Twenty-seven ground water samples were collected and chemically analyzed to form the attribute database. A water quality index was calculated from the measured parameters, based on which the study area was classified into five groups with respect to the suitability of the water for drinking purposes. Thematic maps, viz. base map, road network, drainage and land use/land cover, were prepared from IRS-1D PAN + LISS III merged satellite imagery, forming the spatial database. The attribute database was integrated with the spatial sampling location map in Arc/Info, and maps showing the spatial distribution of water quality parameters were prepared in ArcView. Results indicated that high concentrations of total dissolved solids (TDS), nitrates, fluorides and total hardness were observed in a few industrial and densely populated areas, indicating deteriorated water quality, while the other areas exhibited moderate to good water quality.
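
    The abstract does not state which index formulation was used; the sketch below shows the commonly used weighted arithmetic water quality index, where each parameter gets a weight inversely proportional to its permissible limit and a quality rating relative to that limit. The standards, ideal values and measured concentrations are illustrative placeholders, not the study's data.

```python
# Weighted arithmetic water quality index (an assumption about the index used,
# since the abstract does not give the formula). All values are placeholders.
standards = {   # permissible limits S_i (mg/L except pH)
    "pH": 8.5, "TDS": 500.0, "nitrate": 45.0, "fluoride": 1.0, "total_hardness": 300.0,
}
ideal = {"pH": 7.0, "TDS": 0.0, "nitrate": 0.0, "fluoride": 0.0, "total_hardness": 0.0}
measured = {"pH": 7.8, "TDS": 720.0, "nitrate": 52.0, "fluoride": 1.3, "total_hardness": 340.0}

k = 1.0 / sum(1.0 / s for s in standards.values())    # proportionality constant
weights = {p: k / s for p, s in standards.items()}     # w_i inversely proportional to S_i

def quality_rating(p):
    """q_i = 100 * (C_i - C_ideal) / (S_i - C_ideal)."""
    return 100.0 * (measured[p] - ideal[p]) / (standards[p] - ideal[p])

wqi = sum(weights[p] * quality_rating(p) for p in standards) / sum(weights.values())
print(f"WQI = {wqi:.1f}")   # one common convention: <50 excellent, 50-100 good, >100 poor
```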

  2. Evaluation of PMS-PCR technology for detection of Mycobacterium avium subsp. paratuberculosis directly from bovine fecal specimens.

    PubMed

    Salgado, M; Steuer, P; Troncoso, E; Collins, M T

    2013-12-27

    Mycobacterium avium subsp. paratuberculosis (MAP) causes paratuberculosis, or Johne's disease, in animals. Diagnosis of MAP infection is challenging because of the pathogen's fastidious in vitro growth requirements and low-level intermittent shedding in feces during the preclinical phase of the infection. Detection of these "low-shedders" is important for effective control of paratuberculosis as these animals serve as sources of infection for susceptible calves. Magnetic separation technology, used in combination with culture or molecular methods for the isolation and detection of pathogenic bacteria, enhances the analytical sensitivity and specificity of detection methods. The aim of the present study was to evaluate peptide-mediated magnetic separation (PMS) capture technology coupled with IS900 PCR using the Roche real-time PCR system (PMS-PCR), in comparison with fecal culture using the BACTEC-MGIT 960 system, for detection of MAP in bovine fecal samples. Among the 351 fecal samples, 74.9% (263/351) were PMS-PCR positive, while only 12.3% (43/351) were MGIT culture-positive (p=0.0001). All 43 MGIT culture-positive samples were also positive by PMS-PCR. Mean PMS-PCR crossing-point (Cp) values for the 13 fecal samples with the highest number of MAP based on time to detection (26.3) were significantly lower than those for the 17 fecal samples with <100 MAP per 2 g of feces (30.06) (p<0.05). PMS-PCR technology provided results in a shorter time and yielded a higher number of positive results than MGIT culture. Earlier and faster detection of animals shedding MAP by PMS-PCR should significantly strengthen control efforts for MAP-infected cattle herds by helping to limit infection transmission at earlier stages of the infection. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Object-Based Retro-Classification Of Agricultural Land Use: A Case Study Of Irrigated Croplands

    NASA Astrophysics Data System (ADS)

    Dubovyk, Olena; Conrad, Christopher; Khamzina, Asia; Menz, Gunter

    2013-12-01

    Availability of historical crop maps is necessary for the assessment of land management practices and their effectiveness, as well as for monitoring the environmental impacts of land uses. The lack of accurate current and past land-use information forestalls assessment of the changes that have occurred and their consequences and thus complicates knowledge-driven agrarian policy development. At the same time, the lack of sampling datasets for past years often restricts mapping of historical land use. We proposed a methodology for a retro-assessment of several crops based on multitemporal Landsat 5 TM imagery and a limited sampling dataset. The overall accuracy of the retro-map was 81%, while accuracies for specific crop classes varied from 60% to 93%. If further elaborated, the developed method could be a useful tool for the generation of historical data on agricultural land use.

  4. Statistical strategy for inventorying and monitoring the ecosystem resources of the Mexican States of Jalisco and Colima at multiple scales and resolution levels

    Treesearch

    H. T. Schreuder; M. S. Williams; C. Aguirre-Bravo; P. L. Patterson

    2003-01-01

    The sampling strategy is presented for the initial phase of the natural resources pilot project in the Mexican States of Jalisco and Colima. The sampling design used is ground-based cluster sampling with poststratification based on Landsat Thematic Mapper imagery. The data collected will serve as a basis for additional data collection, mapping, and spatial modeling...

  5. X-ray fluorescence microscopy artefacts in elemental maps of topologically complex samples: Analytical observations, simulation and a map correction method

    NASA Astrophysics Data System (ADS)

    Billè, Fulvio; Kourousias, George; Luchinat, Enrico; Kiskinova, Maya; Gianoncelli, Alessandra

    2016-08-01

    XRF spectroscopy is among the most widely used non-destructive techniques for elemental analysis. Despite the known angular dependence of X-ray fluorescence (XRF), topological artefacts remain an unresolved issue when using X-ray micro- or nano-probes. In this work we investigate the origin of the artefacts in XRF imaging of topologically complex samples, an unresolved problem in studies of organic matter due to the limited travel distances of low-energy XRF emission from the light elements. In particular, we mapped Human Embryonic Kidney (HEK293T) cells. The exemplary results with biological samples, obtained with a soft X-ray scanning microscope installed at a synchrotron facility, were used for testing a mathematical model based on detector response simulations and for proposing an artefact correction method based on directional derivatives. Despite the peculiar and specific application, the methodology can be easily extended to hard X-rays and to set-ups with multi-array detector systems when the dimensions of surface reliefs are on the order of the probing beam size.

  6. A new method for mapping multidimensional data to lower dimensions

    NASA Technical Reports Server (NTRS)

    Gowda, K. C.

    1983-01-01

    A multispectral mapping method is proposed which is based on the new concept of BEND (Bidimensional Effective Normalised Difference). The method, which involves taking one sample point at a time and finding the interrelationships between its features, is found to be very economical in terms of storage and processing time. It has good dimensionality reduction and clustering properties, and is highly suitable for computer analysis of large amounts of data. The transformed values obtained by this procedure are suitable either for a planar 2-space mapping of geological sample points or for making grayscale and color images of geo-terrains. A few examples are given to demonstrate the efficacy of the proposed procedure.

  7. Tomographic active optical trapping of arbitrarily shaped objects by exploiting 3D refractive index maps

    NASA Astrophysics Data System (ADS)

    Kim, Kyoohyun; Park, Yongkeun

    2017-05-01

    Optical trapping can manipulate the three-dimensional (3D) motion of spherical particles based on the simple prediction of optical forces and the responding motion of samples. However, controlling the 3D behaviour of non-spherical particles with arbitrary orientations is extremely challenging, due to experimental difficulties and extensive computations. Here, we achieve the real-time optical control of arbitrarily shaped particles by combining the wavefront shaping of a trapping beam and measurements of the 3D refractive index distribution of samples. Engineering the 3D light field distribution of a trapping beam based on the measured 3D refractive index map of samples generates a light mould, which can manipulate colloidal and biological samples with arbitrary orientations and/or shapes. The present method provides stable control of the orientation and assembly of arbitrarily shaped particles without knowing a priori information about the sample geometry. The proposed method can be directly applied in biophotonics and soft matter physics.

  8. Urban Land Cover Mapping Accuracy Assessment - A Cost-benefit Analysis Approach

    NASA Astrophysics Data System (ADS)

    Xiao, T.

    2012-12-01

    One of the most important components in urban land cover mapping is mapping accuracy assessment. Many statistical models have been developed to help design sampling schemes based on both accuracy and confidence levels. It is intuitive that an increased number of samples increases the accuracy as well as the cost of an assessment. Understanding cost and sample size is crucial for implementing efficient and effective field data collection. Few studies have included a cost calculation component as part of the assessment. In this study, a cost-benefit sampling analysis model was created by combining sample size design and sampling cost calculation. The sampling cost included transportation cost, field data collection cost, and laboratory data analysis cost. Simple Random Sampling (SRS) and Modified Systematic Sampling (MSS) methods were used to design sample locations and to extract land cover data in ArcGIS. High-resolution land cover data layers of Denver, CO and Sacramento, CA, street networks, and parcel GIS data layers were used in this study to test and verify the model. The relationship between cost and accuracy was used to determine the effectiveness of each sampling method. The results of this study can be applied to other environmental studies that require spatial sampling.
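
    A minimal sketch of the cost-benefit idea, assuming the textbook binomial sample-size formula for accuracy assessment and a simple additive per-sample cost model; the dollar figures, expected accuracy and confidence level are illustrative assumptions rather than the study's parameters.

```python
import math

def sample_size(expected_accuracy=0.85, half_width=0.05, z=1.96):
    """Samples needed for a binomial accuracy estimate with the given CI half-width."""
    p = expected_accuracy
    return math.ceil(z * z * p * (1.0 - p) / (half_width * half_width))

def total_cost(n, transport_per_site=12.0, field_per_site=8.0, lab_per_site=15.0):
    """Simple additive per-sample cost model (illustrative dollar figures)."""
    return n * (transport_per_site + field_per_site + lab_per_site)

# Cost-benefit view: tighter accuracy targets require more samples and more money.
for hw in (0.10, 0.05, 0.025):
    n = sample_size(half_width=hw)
    print(f"CI half-width {hw:.3f} -> n = {n:4d}, cost ~ ${total_cost(n):,.0f}")
```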

  9. Sample sizes and model comparison metrics for species distribution models

    Treesearch

    B.B. Hanberry; H.S. He; D.C. Dey

    2012-01-01

    Species distribution models use small samples to produce continuous distribution maps. The question of how small a sample can be to produce an accurate model generally has been answered based on comparisons to maximum sample sizes of 200 observations or fewer. In addition, model comparisons often are made with the kappa statistic, which has become controversial....

  10. Spatiotemporal Built-up Land Density Mapping Using Various Spectral Indices in Landsat-7 ETM+ and Landsat-8 OLI/TIRS (Case Study: Surakarta City)

    NASA Astrophysics Data System (ADS)

    Risky, Yanuar S.; Aulia, Yogi H.; Widayani, Prima

    2017-12-01

    Variations of spectral indices support rapid and accurate extraction of information such as built-up density. However, an exact determination of which spectral bands are best for built-up density extraction is lacking. This study explains and compares the capabilities of five spectral index variations for spatiotemporal built-up density mapping using Landsat-7 ETM+ and Landsat-8 OLI/TIRS in Surakarta City in 2002 and 2015. The spectral index variations used are three mid-infrared (MIR) based indices, namely the Normalized Difference Built-up Index (NDBI), the Urban Index (UI) and the Built-up index, and two visible-based indices, VrNIR-BI (visible red) and VgNIR-BI (visible green). Linear regression between ground density values sampled from Google Earth imagery in 2002 and 2015 and the spectral indices was used to determine built-up land density. The ground values comprised 27 samples for the model and 7 samples for the accuracy test. The classification of built-up density mapping is divided into 9 classes: unclassified, 0-12.5%, 12.5-25%, 25-37.5%, 37.5-50%, 50-62.5%, 62.5-75%, 75-87.5% and 87.5-100%. Accuracy of built-up land density mapping in 2002 and 2015 was, respectively, 81.823% and 73.235% using VrNIR-BI, 78.934% and 69.028% using VgNIR-BI, 34.870% and 74.365% using NDBI, 43.273% and 64.398% using UI, and 59.755% and 72.664% using Built-up. Based on all spectral indices, the built-up land density of Surakarta City increased between 2002 and 2015. VgNIR-BI shows better capability for built-up land density mapping with Landsat-7 ETM+ and Landsat-8 OLI/TIRS.
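
    For reference, a small sketch of how such indices and the regression step can be computed. The normalized-difference forms below (NDBI from SWIR and NIR, VrNIR-BI from red and NIR, VgNIR-BI from green and NIR) follow the usual formulations; the synthetic reflectance tiles, the 27 sample pixels and the ordinary least squares fit are illustrative assumptions rather than the study's data or exact workflow.

```python
import numpy as np

def normalized_difference(a, b):
    """Generic normalized-difference index (a - b) / (a + b), guarded against a + b = 0."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.where((a + b) != 0, (a - b) / (a + b), np.nan)

# Illustrative surface-reflectance tiles standing in for Landsat-8 OLI bands.
rng = np.random.default_rng(3)
green, red, nir, swir1 = rng.uniform(0.02, 0.45, size=(4, 50, 50))

ndbi = normalized_difference(swir1, nir)       # Normalized Difference Built-up Index
vrnir_bi = normalized_difference(red, nir)     # visible red / NIR built-up index
vgnir_bi = normalized_difference(green, nir)   # visible green / NIR built-up index

# Density model: least-squares regression between index values at ground-truth pixels
# (density fractions interpreted from very high resolution imagery) and the index.
rows, cols = rng.integers(0, 50, 27), rng.integers(0, 50, 27)
density_truth = rng.uniform(0.0, 1.0, 27)                 # placeholder ground values
slope, intercept = np.polyfit(vgnir_bi[rows, cols], density_truth, 1)
density_map = np.clip(slope * vgnir_bi + intercept, 0.0, 1.0)
print("mean predicted built-up density:", float(density_map.mean()))
```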

  11. A biosensor assay for the detection of Mycobacterium avium subsp. paratuberculosis in fecal samples

    PubMed Central

    Kumanan, Vijayarani; Nugen, Sam R.; Baeumner, Antje J.

    2009-01-01

    A simple, membrane-strip-based lateral-flow (LF) biosensor assay and a high-throughput microtiter plate assay have been combined with a reverse transcriptase polymerase chain reaction (RT-PCR) for the detection of a small number (ten) of viable Mycobacterium (M.) avium subsp. paratuberculosis (MAP) cells in fecal samples. The assays are based on the identification of the RNA of the IS900 element of MAP. For the assay, RNA was extracted from fecal samples spiked with a known quantity (10^1 to 10^6) of MAP cells, amplified using RT-PCR, and identified by the LF biosensor and the microtiter plate assay. While the LF biosensor assay requires only 30 min of assay time, the overall process took 10 h for the detection of 10 viable cells. The assays are based on an oligonucleotide sandwich hybridization assay format and use either a membrane flow through system with an immobilized DNA probe that hybridizes with the target sequence or a microtiter plate well. Signal amplification is provided when the target sequence hybridizes to a second DNA probe that has been coupled to liposomes encapsulating the dye, sulforhodamine B. The dye in the liposomes provides a signal that can be read visually, quantified with a hand-held reflectometer, or with a fluorescence reader. Specificity analysis of the assays revealed no cross reactivity with other mycobacteria, such as M. avium complex, M. ulcerans, M. marinum, M. kansasii, M. abscessus, M. asiaticum, M. phlei, M. fortuitum, M. scrofulaceum, M. intracellulare, M. smegmatis, and M. bovis. The overall assay for the detection of live MAP organisms is comparatively inexpensive and quick, especially in comparison to standard MAP detection using a culture method requiring 6-8 weeks of incubation time, and is significantly less expensive than real-time PCR. PMID:19255522

  12. Occurrence and quality of ground water in southwestern King County, Washington

    USGS Publications Warehouse

    Woodward, D.G.; Packard, F.A.; Dion, N.P.; Sumioka, S.S.

    1995-01-01

    The 250-square mile study area in southwestern King County, Washington is underlain by sediments as much as 2,200 feet thick, deposited during at least four continental glacial/interglacial periods. Published surficial geologic maps and drillers' lithologic logs from about 700 field-located wells were used to prepare 28 geologic sections; these sections were used to delineate 9 hydrogeologic units--5 aquifers, 3 confining beds, and a basal, undifferentiated unit. Two aquifers in these sediments occur at the land surface. Maps depicting the configuration of the tops of three buried aquifers show the extent and the geometry of those aquifers. Maps showing the thickness of two of the three buried aquifers also were prepared. Potentiometric-surface maps for the major aquifers are based on water levels measured in about 400 wells during April 1987. Hydraulic characteristics of the major aquifers are mapped using more than 1,100 specific-capacity calculations and about 240 hydraulic-conductivity determinations from selected wells. Estimates of the average annual recharge to the ground-water system from precipitation for the entire study area were based on relations determined from modeling selected basins. Discharges from the ground-water system were based on estimates of springflow and diffuse seepage from the bluffs surrounding the uplands, and on the quantity of water withdrawn from high-capacity wells. A total of 242 water samples was collected from 217 wells during two mass samplings and analyzed for the presence of common constituents. Samples also were collected and analyzed for heavy metals, boron, detergents, and volatile organic compounds. These analyses indicated there was no widespread degradation of ground-water quality in southwestern King County.

  13. Surface characterization based on optical phase shifting interferometry

    DOEpatents

    Mello, Michael; Rosakis, Ares J. [Altadena, CA]

    2011-08-02

    Apparatus, techniques and systems for implementing an optical interferometer to measure surfaces, including mapping of instantaneous curvature or in-plane and out-of-plane displacement field gradients of a sample surface based on obtaining and processing four optical interferograms from a common optical reflected beam from the sample surface that are relatively separated in phase by π/2.
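
    Four interferograms with π/2 relative phase steps admit the standard four-step phase-shifting relation phi = atan2(I4 - I2, I1 - I3); gradients of the unwrapped phase then give slope- and curvature-type quantities. The sketch below demonstrates that textbook relation on a synthetic phase map; it is not the patented processing chain itself.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four interferograms with pi/2 relative phase shifts.

    Standard four-step formula: phi = atan2(I4 - I2, I1 - I3).
    """
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic demonstration: a known smooth phase map and its four shifted frames.
y, x = np.mgrid[0:128, 0:128] / 128.0
true_phase = 4.0 * np.pi * (x ** 2 + 0.5 * y)             # illustrative surface-slope phase
frames = [1.0 + 0.8 * np.cos(true_phase + k * np.pi / 2) for k in range(4)]

phi = four_step_phase(*frames)                            # wrapped to (-pi, pi]
unwrapped = np.unwrap(np.unwrap(phi, axis=0), axis=1)     # simple unwrap, fine for smooth data
grad_y, grad_x = np.gradient(unwrapped)                   # in-plane phase gradients
print(float(np.abs(grad_x).mean()), float(np.abs(grad_y).mean()))
```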

  14. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large area X-ray element maps obtained by an energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used not only to accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, to enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.

  15. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    PubMed Central

    2011-01-01

    Background Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria as well as to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format) and they typically accept only gene lists as input. Results TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies if segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps, showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. Biologically relevant chromosomal segments and gene clusters with differential expression during the differentiation toward megakaryocyte were identified. Conclusions TRAM is designed to create, and statistically analyze, quantitative transcriptome maps, based on gene expression data from multiple sources. The release includes FileMaker Pro database management runtime application and it is freely available at http://apollo11.isto.unibo.it/software/, along with preconfigured implementations for mapping of human, mouse and zebrafish transcriptomes. PMID:21333005
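
    TRAM's scaled-quantile variant is not reproduced here, but the baseline it extends, plain quantile normalization across samples, is easy to sketch: rank each sample's values and replace every rank with the mean of the values holding that rank across all samples. The small expression matrix below is a made-up example.

```python
import numpy as np

def quantile_normalize(matrix):
    """Plain quantile normalization of a genes-by-samples expression matrix.

    Each column is ranked, and every rank is replaced by the mean of the values
    holding that rank across all columns, so all samples share one distribution.
    (TRAM's 'scaled quantile' variant for platforms with different gene counts is
    not reproduced here.)
    """
    m = np.asarray(matrix, dtype=float)
    order = np.argsort(m, axis=0)                  # sort order within each sample
    ranks = np.argsort(order, axis=0)              # rank of each entry in its column
    mean_sorted = np.sort(m, axis=0).mean(axis=1)  # reference distribution
    return mean_sorted[ranks]

expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
print(quantile_normalize(expr))
```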

  16. Computerized mappings of the cerebral cortex: a multiresolution flattening method and a surface-based coordinate system

    NASA Technical Reports Server (NTRS)

    Drury, H. A.; Van Essen, D. C.; Anderson, C. H.; Lee, C. W.; Coogan, T. A.; Lewis, J. W.

    1996-01-01

    We present a new method for generating two-dimensional maps of the cerebral cortex. Our computerized, two-stage flattening method takes as its input any well-defined representation of a surface within the three-dimensional cortex. The first stage rapidly converts this surface to a topologically correct two-dimensional map, without regard for the amount of distortion introduced. The second stage reduces distortions using a multiresolution strategy that makes gross shape changes on a coarsely sampled map and further shape refinements on progressively finer resolution maps. We demonstrate the utility of this approach by creating flat maps of the entire cerebral cortex in the macaque monkey and by displaying various types of experimental data on such maps. We also introduce a surface-based coordinate system that has advantages over conventional stereotaxic coordinates and is relevant to studies of cortical organization in humans as well as non-human primates. Together, these methods provide an improved basis for quantitative studies of individual variability in cortical organization.

  17. A new high resolution permafrost map of Iceland from Earth Observation data

    NASA Astrophysics Data System (ADS)

    Barnie, Talfan; Conway, Susan; Balme, Matt; Graham, Alastair

    2017-04-01

    High resolution maps of permafrost are required for ongoing monitoring of environmental change and the resulting hazards to ecosystems, people and infrastructure. However, permafrost maps are difficult to construct - direct observations require maintaining networks of sensors and boreholes in harsh environments and are thus limited in extent in space and time, and indirect observations require models or assumptions relating the measurements (e.g. weather station air temperature, basal snow temperature) to ground temperature. Operationally produced Land Surface Temperature maps from Earth Observation data can be used to make spatially contiguous estimates of mean annual skin temperature, which has been used as a proxy for the presence of permafrost. However, these maps are subject to biases due to (i) selective sampling during the day due to limited satellite overpass times, (ii) selective sampling over the year due to seasonally varying cloud cover, (iii) selective sampling of LST only during clear-sky conditions, (iv) errors in cloud masking, (v) errors in temperature emissivity separation, and (vi) smoothing over spatial variability. In this study we attempt to compensate for some of these problems using a Bayesian modelling approach and high-resolution topography-based downscaling.

  18. Synchrotron X-ray fluorescence spectroscopy of salts in natural sea ice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obbard, Rachel W.; Lieb-Lappen, Ross M.; Nordick, Katherine V.

    We describe the use of synchrotron-based X-ray fluorescence spectroscopy to examine the microstructural location of specific elements, primarily salts, in sea ice. This work was part of an investigation of the location of bromine in the sea ice-snowpack-blowing snow system, where it plays a part in the heterogeneous chemistry that contributes to tropospheric ozone depletion episodes. We analyzed samples at beamline 13-ID-E of the Advanced Photon Source at Argonne National Laboratory. Using an 18 keV incident energy beam, we produced elemental maps of salts for sea ice samples from the Ross Sea, Antarctica. The distribution of salts in sea ice depends on ice type. In our columnar ice samples, Br was located in parallel lines spaced roughly 0.5 mm apart, corresponding to the spacing of lamellae in the skeletal region during initial ice growth. The maps revealed concentrations of Br in linear features in samples from all but the topmost and bottommost depths. For those samples, the maps revealed rounded features. Calibration of the Br elemental maps showed bulk concentrations to be 5–10 g/m³, with concentrations ten times larger in the linear features. Through comparison with horizontal thin sections, we could verify that these linear features were brine sheets or layers.

  19. Synchrotron X-ray fluorescence spectroscopy of salts in natural sea ice

    DOE PAGES

    Obbard, Rachel W.; Lieb-Lappen, Ross M.; Nordick, Katherine V.; ...

    2016-10-23

    We describe the use of synchrotron-based X-ray fluorescence spectroscopy to examine the microstructural location of specific elements, primarily salts, in sea ice. This work was part of an investigation of the location of bromine in the sea ice-snowpack-blowing snow system, where it plays a part in the heterogeneous chemistry that contributes to tropospheric ozone depletion episodes. We analyzed samples at beamline 13-ID-E of the Advanced Photon Source at Argonne National Laboratory. Using an 18 keV incident energy beam, we produced elemental maps of salts for sea ice samples from the Ross Sea, Antarctica. The distribution of salts in sea ice depends on ice type. In our columnar ice samples, Br was located in parallel lines spaced roughly 0.5 mm apart, corresponding to the spacing of lamellae in the skeletal region during initial ice growth. The maps revealed concentrations of Br in linear features in samples from all but the topmost and bottommost depths. For those samples, the maps revealed rounded features. Calibration of the Br elemental maps showed bulk concentrations to be 5–10 g/m³, with concentrations ten times larger in the linear features. Through comparison with horizontal thin sections, we could verify that these linear features were brine sheets or layers.

  20. Identification of sero-reactive antigens for the early diagnosis of Johne’s disease in cattle

    PubMed Central

    Randall, Arlo; Grohn, Yrjo T.; Katani, Robab; Schilling, Megan; Radzio-Basu, Jessica

    2017-01-01

    Mycobacterium avium subsp. paratuberculosis (MAP) is the causative agent of Johne’s disease (JD), a chronic intestinal inflammatory disease of cattle and other ruminants. JD has a high herd prevalence and causes serious animal health problems and significant economic loss in domesticated ruminants throughout the world. Since serological detection of MAP infected animals during the early stages of infection remains challenging due to the low sensitivity of extant assays, we screened 180 well-characterized serum samples using a whole proteome microarray from Mycobacterium tuberculosis (MTB), a close relative of MAP. Based on extensive testing of serum and milk samples, fecal culture and qPCR for direct detection of MAP, the samples were previously assigned to one of 4 groups: negative low exposure (n = 30, NL); negative high exposure (n = 30, NH); fecal positive, ELISA negative (n = 60, F+E-); and fecal positive, ELISA positive (n = 60, F+E+). Of the 740 reactive proteins, several antigens were serologically recognized early but not late in infection, suggesting a complex and dynamic evolution of the MAP humoral immune response during disease progression. Ordinal logistic regression models identified a subset of 47 candidate proteins with significantly different normalized intensity values (p<0.05), including 12 in the NH and 23 in F+E- groups, suggesting potential utility for the early detection of MAP infected animals. Next, the diagnostic utility of four MAP orthologs (MAP1569, MAP2942c, MAP2609, and MAP1272c) was assessed, revealing moderate to high diagnostic sensitivities (range 48.3% to 76.7%) and specificities (range 96.7% to 100%), with a combined 88.3% sensitivity and 96.7% specificity. Taken together, the results of our analyses have identified several candidate MAP proteins of potential utility for the early detection of MAP infection, as well as individual MAP proteins that may serve as the foundation for the next generation of well-defined serological diagnosis of JD in cattle. PMID:28863177

  1. Inventory and mapping of flood inundation using interactive digital image analysis techniques

    USGS Publications Warehouse

    Rohde, Wayne G.; Nelson, Charles A.; Taranik, J.V.

    1979-01-01

    LANDSAT digital data and color infra-red photographs were used in a multiphase sampling scheme to estimate the area of agricultural land affected by a flood. The LANDSAT data were classified with a maximum likelihood algorithm. Stratification of the LANDSAT data, prior to classification, greatly reduced misclassification errors. The classification results were used to prepare a map overlay showing the areal extent of flooding. These data also provided statistics required to estimate sample size in a two-phase sampling scheme, and provided quick, accurate estimates of areas flooded for the first phase. The measurements made in the second phase, based on ground data and photo-interpretation, were used with two-phase sampling statistics to estimate the area of agricultural land affected by flooding. These results show that LANDSAT digital data can be used to prepare map overlays showing the extent of flooding on agricultural land and, with two-phase sampling procedures, can provide acreage estimates with sampling errors of about 5 percent. This procedure provides a technique for rapidly assessing the areal extent of flood conditions on agricultural land and would provide a basis for designing a sampling framework to estimate the impact of flooding on crop production.
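
    In two-phase (double) sampling, the inexpensive phase-1 variable (here, the classified flooded fraction) is observed on a large sample and the accurate phase-2 variable (photo/ground interpretation) on a subsample; a regression estimator then combines the two. The sketch below uses the standard large-sample form with synthetic numbers, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Phase 1: large sample of land units with the LANDSAT-classified flooded fraction x.
n1 = 2000
x1 = rng.beta(2, 5, n1)

# Phase 2: small random subsample where the flooded fraction y is interpreted from
# photos/ground data (modeled here as the classification plus a small error).
n2 = 100
idx = rng.choice(n1, n2, replace=False)
x2 = x1[idx]
y2 = np.clip(x2 + rng.normal(0, 0.05, n2), 0, 1)

# Double-sampling regression estimator of the mean flooded fraction:
#   y_reg = ybar2 + b * (xbar1 - xbar2), with b from regressing y on x in phase 2.
b = np.polyfit(x2, y2, 1)[0]
y_reg = y2.mean() + b * (x1.mean() - x2.mean())

# Large-sample variance approximation (Cochran-style), ignoring finite population corrections:
#   V ~ S_e^2 * (1/n2 - 1/n1) + S_y^2 / n1
resid = y2 - (y2.mean() + b * (x2 - x2.mean()))
var = resid.var(ddof=1) * (1.0 / n2 - 1.0 / n1) + y2.var(ddof=1) / n1
print(f"estimated mean flooded fraction = {y_reg:.3f} (SE ~ {np.sqrt(var):.3f})")
```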

  2. Low Altitude AVIRIS Data for Mapping Landform Types on West Ship Island, Mississippi

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Otvos, Ervin; Giardino, Marco

    2002-01-01

    A chain of barrier islands provides protection against hurricanes and severe storms along the south and southeastern shores of the United States. Barrier island landform types can be spectrally similar and as small as a few meters across, making highly detailed maps difficult to produce. To determine whether high-resolution airborne hyperspectral imagery could provide detailed maps of barrier island landform types, we used low-altitude hyperspectral and multispectral imagery to map surface environments of West Ship Island, Mississippi. We employed 3.4-meter AVIRIS hyperspectral imagery acquired in July 1999 and 0.5-meter ADAR multispectral data acquired in November 1997. The data were co-registered to digital ortho aerial imagery, and the AVIRIS data was scaled to ground reflectance using ATREM software. Unsupervised classification of AVIRIS and ADAR data proceeded using ISODATA clustering techniques. The resulting landform maps were field-checked and compared to aerial photography and digital elevation maps. Preliminary analyses indicated that the AVIRIS classification mapped more landform types, while the ADAR-based map enabled smaller patches to be identified. Used together, these maps provided a means to assess landform distributions of West Ship Island before and after Hurricane Georges. Classification accuracy is being addressed through photo-interpretation and field surveys of sample areas selected with stratified random sampling.

  3. Low Altitude AVIRIS Data for Mapping Landform Types on West Ship Island, Mississippi

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Otvos, Ervin; Giardino, Marco

    2003-01-01

    A chain of barrier islands provides protection against hurricanes and severe storms along the southern and southeastern shores of the United States. Barrier island landform types can be spectrally similar and as small as a few meters across, making highly detailed maps difficult to produce. To determine whether high-resolution airborne hyperspectral imagery could provide detailed maps of barrier island landform types, we used low-altitude hyperspectral and multispectral imagery to map surface environments of West Ship Island, Mississippi. We employed 3.4 meter AVIRIS hyperspectral imagery acquired in July 1999 and 0.5 meter ADAR multispectral data acquired in November 1997. The data were co-registered to digital ortho aerial imagery, and the AVIRIS data was scaled to ground reflectance using ATREM software. Unsupervised classification of AVIRIS and ADAR data proceeded using ISODATA clustering techniques. The resulting landform maps were field-checked and compared to aerial photography and digital elevation maps. Preliminary analyses indicated that the AVIRIS classification mapped more landform types, while the ADAR-based map enabled smaller patches to be identified. Used together, these maps provided a means to assess landform distributions of West Ship Island before and after Hurricane Georges. Classification accuracy is being assessed through photo-interpretation and field surveys of sample areas selected with stratified random sampling.

  4. The surface water register: an empirically improved sample frame for monitoring the rivers and streams of Kansas

    EPA Science Inventory

    State-wide monitoring based on probability survey designs requires a spatially explicit representation of all streams and rivers of interest within a state, i.e., a sample frame. The sample frame should be the best available map representation of the resource. Many stream progr...

  5. Combined chitosan-thyme treatments with modified atmosphere packaging on a ready-to-cook poultry product.

    PubMed

    Giatrakou, V; Ntzimani, A; Savvaidis, I N

    2010-04-01

    In the present study, natural antimicrobials chitosan and thyme, and their combination, were evaluated for their effect on the shelf life of a ready-to-cook (RTC) chicken-pepper kebab (skewer) stored under modified atmosphere packaging (MAP) conditions at 4 +/- 0.5 degrees C for 14 days. The following treatments were examined: control samples stored under aerobic packaging (A), samples stored under MAP (M), samples treated with 1.5% chitosan (vol/wt) and stored under MAP (M-CH), samples treated with 0.2% thyme essential oil (vol/wt) (M-T), and samples treated with 1.5% chitosan (vol/wt) and 0.2% thyme essential oil (vol/wt) and stored under MAP (M-CH-T). Treatment M-CH-T significantly affected aerobic plate counts and counts of lactic acid bacteria, Pseudomonas spp., Brochothrix thermosphacta, Enterobacteriaceae, and yeasts and molds during the entire storage period. Similarly, lipid oxidation of the RTC product was retarded (M-CH-T treatment) during storage, whereas redness was maintained in M-T, M-CH, and M-CH-T samples. Based primarily on sensory data (taste attribute), M-CH and M-T treatments extended RTC product shelf life by 6 days, whereas M-CH-T treatment resulted in a product with a shelf life of 14 days that maintained acceptable sensory characteristics (shelf life of the control was 6 days).

  6. Automated Semantic Indexing of Figure Captions to Improve Radiology Image Retrieval

    PubMed Central

    Kahn, Charles E.; Rubin, Daniel L.

    2009-01-01

    Objective We explored automated concept-based indexing of unstructured figure captions to improve retrieval of images from radiology journals. Design The MetaMap Transfer program (MMTx) was used to map the text of 84,846 figure captions from 9,004 peer-reviewed, English-language articles to concepts in three controlled vocabularies from the UMLS Metathesaurus, version 2006AA. Sampling procedures were used to estimate the standard information-retrieval metrics of precision and recall, and to evaluate the degree to which concept-based retrieval improved image retrieval. Measurements Precision was estimated based on a sample of 250 concepts. Recall was estimated based on a sample of 40 concepts. The authors measured the impact of concept-based retrieval to improve upon keyword-based retrieval in a random sample of 10,000 search queries issued by users of a radiology image search engine. Results Estimated precision was 0.897 (95% confidence interval, 0.857–0.937). Estimated recall was 0.930 (95% confidence interval, 0.838–1.000). In 5,535 of 10,000 search queries (55%), concept-based retrieval found results not identified by simple keyword matching; in 2,086 searches (21%), more than 75% of the results were found by concept-based search alone. Conclusion Concept-based indexing of radiology journal figure captions achieved very high precision and recall, and significantly improved image retrieval. PMID:19261938
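
    The precision and recall figures above are sample proportions with normal-approximation confidence intervals. The sketch below shows the calculation; the 224-of-250 count is hypothetical, chosen only because it gives a precision near the reported 0.897 (the authors' exact counts and interval method are not stated in the abstract):

    import math

    def proportion_ci(successes, n, z=1.96):
        """Point estimate and 95% normal-approximation (Wald) CI for a proportion."""
        p = successes / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)

    # Hypothetical: 224 of 250 sampled concepts judged correctly mapped (~0.896)
    print(proportion_ci(224, 250))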

  7. Gene expression pattern recognition algorithm inferences to classify samples exposed to chemical agents

    NASA Astrophysics Data System (ADS)

    Bushel, Pierre R.; Bennett, Lee; Hamadeh, Hisham; Green, James; Ableson, Alan; Misener, Steve; Paules, Richard; Afshari, Cynthia

    2002-06-01

    We present an analysis of pattern recognition procedures used to predict the classes of samples exposed to pharmacologic agents by comparing gene expression patterns from samples treated with two classes of compounds. Rat liver mRNA samples following exposure for 24 hours with phenobarbital or peroxisome proliferators were analyzed using a 1700 rat cDNA microarray platform. Sets of genes that were consistently differentially expressed in the rat liver samples following treatment were stored in the MicroArray Project System (MAPS) database. MAPS identified 238 genes in common that possessed a low probability (P < 0.01) of being randomly detected as differentially expressed at the 95% confidence level. Hierarchical cluster analysis on the 238 genes clustered specific gene expression profiles that separated samples based on exposure to a particular class of compound.

  8. Assessing map accuracy in a remotely sensed, ecoregion-scale cover map

    USGS Publications Warehouse

    Edwards, T.C.; Moisen, Gretchen G.; Cutler, D.R.

    1998-01-01

    Landscape- and ecoregion-based conservation efforts increasingly use a spatial component to organize data for analysis and interpretation. A challenge particular to remotely sensed cover maps generated from these efforts is how best to assess the accuracy of the cover maps, especially when they can exceed 1000s of km2 in size. Here we develop and describe a methodological approach for assessing the accuracy of large-area cover maps, using as a test case the 21.9 million ha cover map developed for Utah Gap Analysis. As part of our design process, we first reviewed the effect of intracluster correlation and a simple cost function on the relative efficiency of cluster sample designs to simple random designs. Our design ultimately combined clustered and subsampled field data stratified by ecological modeling unit and accessibility (hereafter a mixed design). We next outline estimation formulas for simple map accuracy measures under our mixed design and report results for eight major cover types and the three ecoregions mapped as part of the Utah Gap Analysis. Overall accuracy of the map was 83.2% (SE=1.4). Within ecoregions, accuracy ranged from 78.9% to 85.0%. Accuracy by cover type varied, ranging from a low of 50.4% for barren to a high of 90.6% for man-modified. In addition, we examined gains in efficiency of our mixed design compared with a simple random sample approach. In regard to precision, our mixed design was more precise than a simple random design, given fixed sample costs. We close with a discussion of the logistical constraints facing attempts to assess the accuracy of large-area, remotely sensed cover maps.
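
    The basic accuracy estimates reported above come from design-based formulas. As a simplified sketch, the textbook stratified-random estimator of overall accuracy and its standard error looks like the following; the paper's mixed design adds cluster-level adjustments that are not reproduced here, and the numbers in the example are hypothetical:

    import numpy as np

    def stratified_overall_accuracy(weights, correct, n):
        """Overall map accuracy and SE under stratified random sampling.

        weights : stratum area proportions W_h (summing to 1)
        correct : correctly classified reference sites per stratum
        n       : reference sites sampled per stratum
        """
        weights, correct, n = map(np.asarray, (weights, correct, n))
        p_h = correct / n
        p = np.sum(weights * p_h)                                # weighted overall accuracy
        var = np.sum(weights**2 * p_h * (1 - p_h) / (n - 1))     # stratified variance
        return p, np.sqrt(var)

    # hypothetical three-stratum example
    print(stratified_overall_accuracy([0.5, 0.3, 0.2], [170, 95, 60], [200, 120, 80]))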

  9. Easy and accurate reconstruction of whole HIV genomes from short-read sequence data with shiver.

    PubMed

    Wymant, Chris; Blanquart, François; Golubchik, Tanya; Gall, Astrid; Bakker, Margreet; Bezemer, Daniela; Croucher, Nicholas J; Hall, Matthew; Hillebregt, Mariska; Ong, Swee Hoe; Ratmann, Oliver; Albert, Jan; Bannert, Norbert; Fellay, Jacques; Fransen, Katrien; Gourlay, Annabelle; Grabowski, M Kate; Gunsenheimer-Bartmeyer, Barbara; Günthard, Huldrych F; Kivelä, Pia; Kouyos, Roger; Laeyendecker, Oliver; Liitsola, Kirsi; Meyer, Laurence; Porter, Kholoud; Ristola, Matti; van Sighem, Ard; Berkhout, Ben; Cornelissen, Marion; Kellam, Paul; Reiss, Peter; Fraser, Christophe

    2018-01-01

    Studying the evolution of viruses and their molecular epidemiology relies on accurate viral sequence data, so that small differences between similar viruses can be meaningfully interpreted. Despite its higher throughput and more detailed minority variant data, next-generation sequencing has yet to be widely adopted for HIV. The difficulty of accurately reconstructing the consensus sequence of a quasispecies from reads (short fragments of DNA) in the presence of large between- and within-host diversity, including frequent indels, may have presented a barrier. In particular, mapping (aligning) reads to a reference sequence leads to biased loss of information; this bias can distort epidemiological and evolutionary conclusions. De novo assembly avoids this bias by aligning the reads to themselves, producing a set of sequences called contigs. However, contigs provide only a partial summary of the reads; misassembly may result in their having an incorrect structure; and no information is available at parts of the genome where contigs could not be assembled. To address these problems we developed the tool shiver to pre-process reads for quality and contamination, then map them to a reference tailored to the sample using corrected contigs supplemented with the user's choice of existing reference sequences. Run with two commands per sample, it can easily be used for large heterogeneous data sets. We used shiver to reconstruct the consensus sequence and minority variant information from paired-end short-read whole-genome data produced with the Illumina platform, for sixty-five existing publicly available samples and fifty new samples. We show the systematic superiority of mapping to shiver's constructed reference compared with mapping the same reads to the closest of 3,249 real references: median values of 13 bases called differently and more accurately, 0 bases called differently and less accurately, and 205 bases of missing sequence recovered. We also successfully applied shiver to whole-genome samples of Hepatitis C Virus and Respiratory Syncytial Virus. shiver is publicly available from https://github.com/ChrisHIV/shiver.

  10. Elemental atmospheric pollution assessment via moss-based measurements in Portland, Oregon

    Treesearch

    Demetrios Gatziolis; Sarah Jovan; Geoffrey Donovan; Michael Amacher; Vicente Monleon

    2016-01-01

    Mosses accumulate pollutants from the atmosphere and can serve as an inexpensive screening tool for mapping air quality and guiding the placement of monitoring instruments. We measured 22 elements using 346 moss samples collected across Portland, Oregon, in December 2013. Our objectives were to develop citywide maps showing concentrations of each element in moss and...

  11. Traumatizing Aspects of Providing Counselling in Community Agencies to Survivors of Sexual Violence: A Concept Map

    ERIC Educational Resources Information Center

    Kadambi, Michaela A.; Truscott, Derek

    2008-01-01

    Concept mapping (a combined qualitative/quantitative approach) was used to clarify and understand 72 Canadian professionals' experience of what they found to be traumatizing about their work with sexual violence survivors in community settings. A sample of 30 professionals providing community-based treatment to survivors of sexual violence sorted…

  12. AWPA biodeterioration hazard map revisited

    Treesearch

    Grant T. Kirker; Amy B. Bishell; William J. Hickey

    2017-01-01

    The fungal decay hazard map used by the American Wood Protection Association (AWPA) currently describes regional decay hazards in ground contact for North America and is based on condition assessments of utility poles from the 1970’s. Current work at the USDA Forest Service, Forest Products Laboratory is underway to analyze soil and wood samples from several National...

  13. A Critical Mapping of Practice-Based Research as Evidenced by Swedish Architectural Theses

    ERIC Educational Resources Information Center

    Buchler, Daniela; Biggs, Michael A. R.; Stahl, Lars-Henrik

    2011-01-01

    This article presents an investigation that was funded by the Swedish Institute into the role of creative practice in architectural research as evidenced in Swedish doctoral theses. The sample was mapped and analysed in terms of clusters of interest, approaches, cultures of knowledge and uses of creative practice. This allowed the identification…

  14. Crisp clustering of airborne geophysical data from the Alto Ligonha pegmatite field, northeastern Mozambique, to predict zones of increased rare earth element potential

    NASA Astrophysics Data System (ADS)

    Eberle, Detlef G.; Daudi, Elias X. F.; Muiuane, Elônio A.; Nyabeze, Peter; Pontavida, Alfredo M.

    2012-01-01

    The National Geology Directorate of Mozambique (DNG) and Maputo-based Eduardo-Mondlane University (UEM) entered a joint venture with the South African Council for Geoscience (CGS) to conduct a case study over the meso-Proterozoic Alto Ligonha pegmatite field in the Zambézia Province of northeastern Mozambique to support the local exploration and mining sectors. Rare-metal minerals, i.e. tantalum and niobium, as well as rare-earth minerals have been mined in the Alto Ligonha pegmatite field for decades, but due to the civil war (1977-1992) production nearly ceased. The Government now strives to promote mining in the region as a contribution to poverty alleviation. This study was undertaken to facilitate the extraction of geological information from the high resolution airborne magnetic and radiometric data sets recently acquired through a World Bank funded survey and mapping project. The aim was to generate a value-added map from the airborne geophysical data that is easier for the exploration and mining industries to read and use than raw airborne geophysical grid data or maps. As a first step towards clustering, thorium (Th) and potassium (K) concentrations were determined from the airborne geophysical data as well as apparent magnetic susceptibility and first vertical magnetic gradient data. These four datasets were projected onto a 100 m spaced regular grid to assemble 850,000 four-element (multivariate) sample vectors over the study area. Classification of the sample vectors using crisp clustering based upon the Euclidean distance between sample and class centre provided a (pseudo-) geology map or value-added map, respectively, displaying the spatial distribution of six different classes in the study area. To assess the quality of sample allocation, the degree of membership of each sample vector was determined using a posteriori discriminant analysis. Geophysical ground truth control was essential to allocate geology/geophysical attributes to the six classes. The highest probability of encountering pegmatite bodies is in close vicinity to (magnetic) amphibole schist occurring in areas where potassium depletion, an indication of metasomatic processes, is evident from the airborne radiometric data. Clustering has proven to be a fast and effective method to compile value-added maps from multivariate geophysical datasets. Experience gained in the Alto Ligonha pegmatite field encourages adoption of this methodology for mapping other parts of the Mozambique Fold Belt.
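
    The core classification step, assigning each gridded four-channel sample vector to its nearest class centre by Euclidean distance, is straightforward to sketch. The channels would normally be standardized first, since Th, K and the magnetic quantities have different units; the posterior membership analysis mentioned above is not reproduced here:

    import numpy as np

    def crisp_classify(samples, centres):
        """Assign multivariate sample vectors to their nearest class centre.

        samples : (n, 4) array of, e.g., Th, K, apparent susceptibility and
                  first vertical gradient values on the 100 m grid, ideally
                  standardized to zero mean and unit variance per channel
        centres : (k, 4) array of class centres from a prior clustering step
        Returns hard class labels and the distance to the winning centre.
        """
        d = np.linalg.norm(samples[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        return labels, d[np.arange(len(samples)), labels]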

  15. Comparison of milk culture, direct and nested polymerase chain reaction (PCR) with fecal culture based on samples from dairy herds infected with Mycobacterium avium subsp. paratuberculosis

    PubMed Central

    Gao, Anli; Odumeru, Joseph; Raymond, Melinda; Hendrick, Steven; Duffield, Todd; Mutharia, Lucy

    2009-01-01

    Mycobacterium avium subsp. paratuberculosis (MAP) is the etiologic agent of Johne’s disease in cattle and other farm ruminants, and is also a suspected pathogen of Crohn’s disease in humans. Development of diagnostic methods for MAP infection has been a challenge over the last few decades. The objective of this study was to investigate the relationship between different methods for detection of MAP in milk and fecal samples. A total of 134 milk samples and 110 feces samples were collected from 146 individual cows in 14 MAP-infected herds in southwestern Ontario. Culture, IS900 polymerase chain reaction (PCR) and nested PCR methods were used for detecting MAP in milk; results were compared with those of fecal culture. A significant relationship was found between milk culture, direct PCR, and nested PCR (P < 0.05). The fecal culture results were not related to any of the 3 assay methods used for the milk samples (P > 0.10). Although fecal culture showed a higher sensitivity than the milk culture method, the difference was not significant (P = 0.2473). The number of MAP colony-forming units (CFU) isolated by culture from fecal samples was, on average, higher than that isolated from milk samples (P = 0.0083). There was no significant correlation between the number of CFU cultured from milk and from feces (Pearson correlation coefficient = 0.1957, N = 63, P = 0.1243). The animals with high numbers of CFU in milk culture may not be detected by fecal culture at all, and vice versa. A significant proportion (29% to 41%) of the positive animals would be missed if only 1 culture method, instead of both milk and feces, were to be used for diagnosis. This suggests that the shedding of MAP in feces and milk is not synchronized. Most of the infected cows were low-level shedders. The proportion of low-level shedders may even be underestimated because MAP is killed during decontamination, thus reducing the chance of detection. Therefore, to identify suspected Johne’s-infected animals using the tests in this study, both milk and feces samples should be collected in duplicate to enhance the diagnostic rate. The high MAP kill rate identified in the culture methods during decontamination may be compensated for by using the nested PCR method, which had a higher sensitivity than the IS900 PCR method used. PMID:19337397

  16. Quantifying Mesoscale Neuroanatomy Using X-Ray Microtomography

    PubMed Central

    Gray Roncal, William; Prasad, Judy A.; Fernandes, Hugo L.; Gürsoy, Doga; De Andrade, Vincent; Fezzaa, Kamel; Xiao, Xianghui; Vogelstein, Joshua T.; Jacobsen, Chris; Körding, Konrad P.

    2017-01-01

    Methods for resolving the three-dimensional (3D) microstructure of the brain typically start by thinly slicing and staining the brain, followed by imaging numerous individual sections with visible light photons or electrons. In contrast, X-rays can be used to image thick samples, providing a rapid approach for producing large 3D brain maps without sectioning. Here we demonstrate the use of synchrotron X-ray microtomography (µCT) for producing mesoscale (∼1 µm3 resolution) brain maps from millimeter-scale volumes of mouse brain. We introduce a pipeline for µCT-based brain mapping that develops and integrates methods for sample preparation, imaging, and automated segmentation of cells, blood vessels, and myelinated axons, in addition to statistical analyses of these brain structures. Our results demonstrate that X-ray tomography achieves rapid quantification of large brain volumes, complementing other brain mapping and connectomics efforts. PMID:29085899

  17. View generation for 3D-TV using image reconstruction from irregularly spaced samples

    NASA Astrophysics Data System (ADS)

    Vázquez, Carlos

    2007-02-01

    Three-dimensional television (3D-TV) will become the next big step in the development of advanced TV systems. One of the major challenges for the deployment of 3D-TV systems is the diversity of display technologies and the high cost of capturing multi-view content. Depth image-based rendering (DIBR) has been identified as a key technology for the generation of new views for stereoscopic and multi-view displays from a small number of views captured and transmitted. We propose a disparity compensation method for DIBR that does not require spatial interpolation of the disparity map. We use a forward-mapping disparity compensation with real precision. The proposed method deals with the irregularly sampled image resulting from this disparity compensation process by applying a re-sampling algorithm based on a bi-cubic spline function space that produces smooth images. The fact that no approximation is made on the position of the samples implies that geometrical distortions in the final images due to approximations in sample positions are minimized. We also address the occlusion problem: our algorithm detects the occluded regions in the newly generated images and uses simple depth-aware inpainting techniques to fill the gaps created by newly exposed areas. We tested the proposed method in the context of generation of views needed for viewing on SynthaGram(TM) auto-stereoscopic displays. We used as input either a 2D image plus a depth map or a stereoscopic pair with the associated disparity map. Our results show that this technique provides high quality images to be viewed on different display technologies such as stereoscopic viewing with shutter glasses (two views) and lenticular auto-stereoscopic displays (nine views).
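
    The resampling of irregularly placed, forward-mapped samples onto the target pixel grid can be sketched with SciPy's scattered-data interpolation. Here 'cubic' is a piecewise-cubic Clough-Tocher interpolant standing in for the bi-cubic spline function space described above, and the occlusion detection and inpainting steps are left out:

    import numpy as np
    from scipy.interpolate import griddata

    def resample_to_grid(sample_xy, sample_values, width, height):
        """Resample irregularly placed samples onto a regular (height x width) grid.

        sample_xy     : (N, 2) array of real-valued (x, y) positions produced by
                        forward disparity compensation
        sample_values : (N,) array of intensities at those positions
        Holes (e.g. newly exposed areas) come back as NaN and would be filled by
        a depth-aware inpainting step.
        """
        gx, gy = np.meshgrid(np.arange(width), np.arange(height))
        return griddata(sample_xy, sample_values, (gx, gy), method='cubic')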

  18. [Characteristic study on village landscape patterns in Sichuan Basin hilly region based on high resolution IKONOS remote sensing].

    PubMed

    Li, Shoucheng; Liu, Wenquan; Cheng, Xu; Ellis, Erle C

    2005-10-01

    For landscape planning in agro-ecosystem management, landscape stratification can provide a detailed understanding of the landscape ecosystem at very fine scales. For this purpose, the village landscapes in the densely populated Jintang and Jianyang Counties of the Sichuan Basin hilly region were mapped from high resolution (1 m) IKONOS satellite imagery using a standardized 4-level ecological landscape classification and mapping system in a regionally representative sample of five 500 x 500 m2 landscape quadrats (sample plots). Based on these maps, the spatial patterns were analyzed with landscape indicators, which demonstrated a large variety of landscape types or ecotopes across the village landscape of this region, with diversity indices ranging from 1.08 to 2.26 at different levels of the landscape classification system. The richness indices ranged from 42.2% to 58.6%, except for land cover, which was 85%. About 12.5% of the ecotopes were distributed in the same way in each landscape sample, and the remaining 87.5% were distributed differently. The landscape fragmentation indices varied from 2.93 to 4.27 across sample plots, and from 2.86 to 5.63 across classification levels. Population density and road and hamlet areas had strong linear correlations with some landscape indicators; in particular, the correlation coefficients of hamlet area with fractal indices and fragmentation dimensions were 0.957* and 0.991**, respectively. The differences in most landscape pattern indices across sample plots and landscape classes were statistically significant, indicating that cross-scale mapping and classification of village landscapes can provide more detailed information on landscape patterns than a single level of classification.

  19. Glow discharge sources for atomic and molecular analyses

    NASA Astrophysics Data System (ADS)

    Storey, Andrew Patrick

    Two types of glow discharges were used and characterized for chemical analysis. The flowing atmospheric pressure afterglow (FAPA) source, based on a helium glow discharge (GD), was utilized to analyze samples with molecular mass spectrometry. A second GD, operated at reduced pressure in argon, was employed to map the elemental composition of a solid surface with novel optical detection systems, enabling new applications and perspectives for GD emission spectrometry. Like many plasma-based ambient desorption-ionization sources being used around the world, the FAPA requires a supply of helium to operate effectively. With increased pressures on global helium supply and pricing, the use of an interrupted stream of helium for analysis was explored for vapor and solid samples. In addition to the mass spectra generated by the FAPA source, schlieren imaging and infrared thermography were employed to map the behavior of the source and its surroundings under the altered conditions. Additionally, a new annular microplasma variation of the FAPA source was developed and characterized. A spectroscopic imaging system that utilized an adjustable-tilt interference filter was used to map the elemental composition of a sample surface by glow discharge emission spectroscopy. This apparatus was compared to other GD imaging techniques for mapping elemental surface composition. The wide bandpass filter resulted in significant spectral interferences that could be partially overcome with chemometric data processing. Because time-resolved GD emission spectroscopy can provide fine depth-profiling measurements, a natural extension of GD imaging would be its application to three-dimensional characterization of samples. However, the simultaneous cathodic sputtering that occurs across the sample results in a sampling process that is not completely predictable. These issues are frequently encountered when laterally varied samples are explored with glow discharge imaging techniques. These insights are described with respect to their consequences for both imaging and conventional GD spectroscopic techniques.

  20. Analogous on-axis interference topographic phase microscopy (AOITPM).

    PubMed

    Xiu, P; Liu, Q; Zhou, X; Xu, Y; Kuang, C; Liu, X

    2018-05-01

    The refractive index (RI) of a sample as an endogenous contrast agent plays an important role in transparent live cell imaging. In tomographic phase microscopy (TPM), 3D quantitative RI maps can be reconstructed based on the measured projections of the RI in multiple directions. The resolution of the RI maps not only depends on the numerical aperture of the employed objective lens, but also is determined by the accuracy of the quantitative phase of the sample measured at multiple scanning illumination angles. This paper reports an analogous on-axis interference TPM, where the interference angle between the sample and reference beams is kept constant for projections in multiple directions to improve the accuracy of the phase maps and the resolution of RI tomograms. The system has been validated with both silica beads and red blood cells. Compared with conventional TPM, the proposed system acquires quantitative RI maps with higher resolution (420 nm @λ = 633 nm) and signal-to-noise ratio that can be beneficial for live cell imaging in biomedical applications. © 2018 The Authors Journal of Microscopy © 2018 Royal Microscopical Society.

  1. Glimpse: Sparsity based weak lensing mass-mapping tool

    NASA Astrophysics Data System (ADS)

    Lanusse, F.; Starck, J.-L.; Leonard, A.; Pires, S.

    2018-02-01

    Glimpse, also known as Glimpse2D, is a weak lensing mass-mapping tool that relies on a robust sparsity-based regularization scheme to recover high resolution convergence from either gravitational shear alone or from a combination of shear and flexion. Including flexion allows the supplementation of the shear on small scales in order to increase the sensitivity to substructures and the overall resolution of the convergence map. To preserve all available small scale information, Glimpse avoids any binning of the irregularly sampled input shear and flexion fields and treats the mass-mapping problem as a general ill-posed inverse problem, regularized using a multi-scale wavelet sparsity prior. The resulting algorithm incorporates redshift, reduced shear, and reduced flexion measurements for individual galaxies and is made highly efficient by the use of fast Fourier estimators.
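
    The sparsity-regularized inverse problem that Glimpse solves can be illustrated with plain iterative soft-thresholding (ISTA) for a generic linear operator. This is only a schematic of the idea; Glimpse's actual algorithm uses multi-scale wavelet sparsity and dedicated shear/flexion operators built on fast Fourier estimators:

    import numpy as np

    def ista(A, y, lam, step, n_iter=200):
        """Solve min_x 0.5*||A x - y||^2 + lam*||x||_1 by iterative soft-thresholding.

        A    : (m, n) forward operator (a generic matrix here)
        y    : (m,) observed data (e.g. binned shear in a toy setup)
        step : gradient step size; should satisfy step <= 1 / ||A||_2^2
        """
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)                                  # data-fidelity gradient
            z = x - step * grad
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
        return x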

  2. Scanning electron microscopy coupled with energy-dispersive X-ray spectrometry for quick detection of sulfur-oxidizing bacteria in environmental water samples

    NASA Astrophysics Data System (ADS)

    Sun, Chengjun; Jiang, Fenghua; Gao, Wei; Li, Xiaoyun; Yu, Yanzhen; Yin, Xiaofei; Wang, Yong; Ding, Haibing

    2017-01-01

    Detection of sulfur-oxidizing bacteria has largely been dependent on targeted gene sequencing technology or traditional cell cultivation, which usually takes from days to months to carry out. This clearly does not meet the requirements of analysis for time-sensitive samples and/or complicated environmental samples. Since energy-dispersive X-ray spectrometry (EDS) can be used to simultaneously detect multiple elements in a sample, including sulfur, with minimal sample treatment, this technology was applied to detect sulfur-oxidizing bacteria using their high sulfur content within the cell. This article describes the application of scanning electron microscopy imaging coupled with EDS mapping for quick detection of sulfur oxidizers in contaminated environmental water samples, with minimal sample handling. Scanning electron microscopy imaging revealed the existence of dense granules within the bacterial cells, while EDS identified large amounts of sulfur within them. EDS mapping localized the sulfur to these granules. Subsequent 16S rRNA gene sequencing showed that the bacteria detected in our samples belonged to the genus Chromatium, which are sulfur oxidizers. Thus, EDS mapping made it possible to identify sulfur oxidizers in environmental samples based on localized sulfur within their cells, within a short time (within 24 h of sampling). This technique has wide ranging applications for detection of sulfur bacteria in environmental water samples.

  3. Correlation between microbial flora, sensory changes and biogenic amines formation in fresh chicken meat stored aerobically or under modified atmosphere packaging at 4 degrees C: possible role of biogenic amines as spoilage indicators.

    PubMed

    Balamatsia, C C; Paleologos, E K; Kontominas, M G; Savvaidis, I N

    2006-01-01

    This study evaluated the formation of biogenic amines (BAs) in breast chicken meat during storage under aerobic and modified atmosphere packaging (MAP) conditions at 4 degrees C, the correlation of microbial and sensory changes in chicken meat with formation of BAs, and the possible role of BAs as indicators of poultry meat spoilage. Poultry breast fillets were stored aerobically or under MAP (30% CO(2), 70% N(2)) at 4 degrees C for up to 17 days. Quality evaluation was carried out using microbiological, chemical and sensory analyses. Total viable counts, Pseudomonads, and Enterobacteriaceae were in general higher for chicken samples packaged in air, whereas lactic acid bacteria (LAB) and Enterobacteriaceae were among the dominant species for samples under MAP. Levels of putrescine and cadaverine increased linearly with storage time and were higher in aerobically stored chicken samples. Spermine and spermidine levels were also detected in both aerobically and MAP stored chicken meat. Levels of tyramine in chicken samples stored aerobically or under MAP were low (< 10 mg kg(-1)), whereas the formation of histamine was only observed after day 11 of storage when Enterobacteriaceae had reached a population of ca. 10(7) CFU g(-1). Based on sensory and microbiological analyses and also taking into account a biogenic amines index (BAI, sum of putrescine, cadaverine and tyramine), BAI values between 96 and 101 mg kg(-1) may be proposed as a quality index of MAP and aerobically-packaged fresh chicken meat. Spermine and spermidine decreased steadily throughout the entire storage period of chicken meat under aerobic and MAP packaging, and thus these two amines cannot be used as indicators of fresh chicken meat quality.

  4. Fast and Accurate Construction of Ultra-Dense Consensus Genetic Maps Using Evolution Strategy Optimization

    PubMed Central

    Mester, David; Ronin, Yefim; Schnable, Patrick; Aluru, Srinivas; Korol, Abraham

    2015-01-01

    Our aim was to develop a fast and accurate algorithm for constructing consensus genetic maps for chip-based SNP genotyping data with a high proportion of shared markers between mapping populations. Chip-based genotyping of SNP markers allows the production of high-density genetic maps with a relatively standardized set of marker loci for different mapping populations. The availability of a standard high-throughput mapping platform simplifies consensus analysis by ignoring unique markers at the stage of consensus mapping, thereby reducing the mathematical complexity of the problem and in turn allowing larger mapping data sets to be analyzed using global optimization criteria instead of local ones. Our three-phase analytical scheme includes automatic selection of ~100-300 of the most informative (resolvable by recombination) markers per linkage group, building a stable skeletal marker order for each data set and its verification using jackknife re-sampling, and consensus mapping analysis based on a global optimization criterion. A novel Evolution Strategy optimization algorithm with a global optimization criterion presented in this paper is able to generate high quality, ultra-dense consensus maps, with many thousands of markers per genome. This algorithm utilizes "potentially good orders" in the initial solution and in the new mutation procedures that generate trial solutions, enabling a consensus order to be obtained in reasonable time. The developed algorithm, tested on a wide range of simulated data and real world data (Arabidopsis), outperformed two tested state-of-the-art algorithms in mapping accuracy and computation time. PMID:25867943

  5. Probabilistic mapping of descriptive health status responses onto health state utilities using Bayesian networks: an empirical analysis converting SF-12 into EQ-5D utility index in a national US sample.

    PubMed

    Le, Quang A; Doctor, Jason N

    2011-05-01

    As quality-adjusted life years have become the standard metric in health economic evaluations, mapping health-profile or disease-specific measures onto preference-based measures to obtain quality-adjusted life years has become a solution when health utilities are not directly available. However, current mapping methods are limited due to their predictive validity, reliability, and/or other methodological issues. We employ probability theory together with a graphical model, called a Bayesian network, to convert health-profile measures into preference-based measures and to compare the results to those estimated with current mapping methods. A sample of 19,678 adults who completed both the 12-item Short Form Health Survey (SF-12v2) and EuroQoL 5D (EQ-5D) questionnaires from the 2003 Medical Expenditure Panel Survey was split into training and validation sets. Bayesian networks were constructed to explore the probabilistic relationships between each EQ-5D domain and 12 items of the SF-12v2. The EQ-5D utility scores were estimated on the basis of the predicted probability of each response level of the 5 EQ-5D domains obtained from the Bayesian inference process using the following methods: Monte Carlo simulation, expected utility, and most-likely probability. Results were then compared with current mapping methods including multinomial logistic regression, ordinary least squares, and censored least absolute deviations. The Bayesian networks consistently outperformed other mapping models in the overall sample (mean absolute error=0.077, mean square error=0.013, and R overall=0.802), in different age groups, number of chronic conditions, and ranges of the EQ-5D index. Bayesian networks provide a new robust and natural approach to map health status responses into health utility measures for health economic evaluations.
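
    Once the Bayesian network has produced response-level probabilities for the five EQ-5D domains, the Monte Carlo and expected-utility scoring steps reduce to averaging a value-set tariff over those probabilities. The sketch below samples the domains independently from their marginal probabilities, which ignores any dependence the network captures, and leaves the actual value-set function unspecified, so treat it only as an illustration of the scoring step:

    import numpy as np

    def monte_carlo_eq5d_utility(domain_probs, utility_fn, n_draws=10000, seed=0):
        """Estimate an expected EQ-5D utility from per-domain response probabilities.

        domain_probs : list of 5 arrays, each holding predicted probabilities of
                       the response levels of one EQ-5D domain
        utility_fn   : maps a 5-tuple of 1-based response levels to a utility
                       value from a chosen value set (not provided here)
        """
        rng = np.random.default_rng(seed)
        draws = np.column_stack([
            rng.choice(len(p), size=n_draws, p=np.asarray(p) / np.sum(p)) + 1
            for p in domain_probs
        ])
        return float(np.mean([utility_fn(tuple(levels)) for levels in draws]))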

  6. EnviroAtlas -Durham, NC- One Meter Resolution Urban Area Land Cover Map (2010)

    EPA Pesticide Factsheets

    The EnviroAtlas Durham, NC land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from July 2010 at 1 m spatial resolution. Five land cover classes were mapped: impervious surface, soil and barren, grass and herbaceous, trees and forest, and water. An accuracy assessment using a stratified random sampling of 500 samples yielded an overall accuracy of 83 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Durham, and includes the cities of Durham, Chapel Hill, Carrboro and Hillsborough, NC. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  7. Use of LANDSAT imagery for wildlife habitat mapping in northeast and eastcentral Alaska

    NASA Technical Reports Server (NTRS)

    Lent, P. C. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. There is strong indication that spatially rare feature classes may be missed in clustering classifications based on 2% random sampling. Therefore, it seems advisable to augment random sampling for cluster analysis with directed sampling of any spatially rare features which are relevant to the analysis.

  8. The Improved Locating Algorithm of Particle Filter Based on ROS Robot

    NASA Astrophysics Data System (ADS)

    Fang, Xun; Fu, Xiaoyang; Sun, Ming

    2018-03-01

    This paper analyzes the basic theory and primary algorithms of real-time localization and SLAM technology for ROS-based robots. It proposes an improved particle filter localization algorithm that effectively reduces the time needed to match laser radar scans against the map; the addition of ultra-wideband ranging further improves the overall efficiency of the FastSLAM algorithm, which no longer needs to search the global map. Meanwhile, re-sampling is reduced by about five-sixths, which directly removes the corresponding matching cost from the robot's algorithm.
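
    The re-sampling step being reduced here is the usual particle-filter bottleneck. For reference, a common low-variance (systematic) re-sampling scheme is sketched below; this is generic particle-filter machinery, not the paper's specific UWB-assisted FastSLAM modification:

    import numpy as np

    def systematic_resample(weights, rng=None):
        """Low-variance (systematic) resampling of particle indices.

        Returns one index per particle: particles with large weights are
        duplicated and particles with small weights are dropped.
        """
        rng = rng or np.random.default_rng()
        n = len(weights)
        positions = (rng.random() + np.arange(n)) / n            # n evenly spaced pointers
        cumulative = np.cumsum(np.asarray(weights) / np.sum(weights))
        cumulative[-1] = 1.0                                     # guard against round-off
        return np.searchsorted(cumulative, positions)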

  9. Statistical mapping of count survey data

    USGS Publications Warehouse

    Royle, J. Andrew; Link, W.A.; Sauer, J.R.; Scott, J. Michael; Heglund, Patricia J.; Morrison, Michael L.; Haufler, Jonathan B.; Wall, William A.

    2002-01-01

    We apply a Poisson mixed model to the problem of mapping (or predicting) bird relative abundance from counts collected from the North American Breeding Bird Survey (BBS). The model expresses the logarithm of the Poisson mean as a sum of a fixed term (which may depend on habitat variables) and a random effect which accounts for remaining unexplained variation. The random effect is assumed to be spatially correlated, thus providing a more general model than the traditional Poisson regression approach. Consequently, the model is capable of improved prediction when data are autocorrelated. Moreover, formulation of the mapping problem in terms of a statistical model facilitates a wide variety of inference problems which are cumbersome or even impossible using standard methods of mapping. For example, assessment of prediction uncertainty, including the formal comparison of predictions at different locations, or through time, using the model-based prediction variance is straightforward under the Poisson model (not so with many nominally model-free methods). Also, ecologists may generally be interested in quantifying the response of a species to particular habitat covariates or other landscape attributes. Proper accounting for the uncertainty in these estimated effects is crucially dependent on specification of a meaningful statistical model. Finally, the model may be used to aid in sampling design, by modifying the existing sampling plan in a manner which minimizes some variance-based criterion. Model fitting under this model is carried out using a simulation technique known as Markov Chain Monte Carlo. Application of the model is illustrated using Mourning Dove (Zenaida macroura) counts from Pennsylvania BBS routes. We produce both a model-based map depicting relative abundance, and the corresponding map of prediction uncertainty. We briefly address the issue of spatial sampling design under this model. Finally, we close with some discussion of mapping in relation to habitat structure. Although our models were fit in the absence of habitat information, the resulting predictions show a strong inverse relation with a map of forest cover in the state, as expected. Consequently, the results suggest that the correlated random effect in the model is broadly representing ecological variation, and that BBS data may be generally useful for studying bird-habitat relationships, even in the presence of observer errors and other widely recognized deficiencies of the BBS.

  10. Comparison of Stem Map Developed from Crown Geometry Allometry Linked Census Data to Airborne and Terrestrial Lidar at Harvard Forest, MA

    NASA Astrophysics Data System (ADS)

    Sullivan, F.; Palace, M. W.; Ducey, M. J.; David, O.; Cook, B. D.; Lepine, L. C.

    2014-12-01

    Harvard Forest in Petersham, MA, USA is the location of one of the temperate forest plots established by the Center for Tropical Forest Science (CTFS) as a joint effort with Harvard Forest and the Smithsonian Institute's Forest Global Earth Observatory (ForestGEO) to characterize ecosystem processes and forest dynamics. Census of a 35 ha plot on Prospect Hill was completed during the winter of 2014 by researchers at Harvard Forest. Census data were collected according to CTFS protocol; measured variables included species, stem diameter, and relative X-Y locations. Airborne lidar data were collected over the censused plot using the high spatial resolution Goddard LiDAR, Hyperspectral, and Thermal sensor package (G-LiHT) during June 2012. As part of a separate study, 39 variable radius plots (VRPs) were randomly located and sampled within and throughout the Prospect Hill CTFS/ForestGEO plot during September and October 2013. On VRPs, biometric properties of trees were sampled, including species, stem diameter, total height, crown base height, crown radii, and relative location to plot centers using a 20 Basal Area Factor prism. In addition, a terrestrial-based lidar scanner was used to collect one lidar scan at plot center for 38 of the 39 VRPs. Leveraging allometric equations of crown geometry and tree height developed from 374 trees and 16 different species sampled on 39 VRPs, a 3-dimensional stem map will be created using the Harvard Forest ForestGEO Prospect Hill census. Vertical and horizontal structure of 3d field-based stem maps will be compared to terrestrial and airborne lidar scan data. Furthermore, to assess the quality of allometric equations, a 2d canopy height raster of the field-based stem map will be compared to a G-LiHT derived canopy height model for the 35 ha census plot. Our automated crown delineation methods will be applied to the 2d representation of the census stem map and the G-LiHT canopy height model. For future work related to this study, high quality field-based stem maps with species and crown geometry information will allow for better comparisons and interpretations of individual tree spectra from the G-LiHT hyperspectral sensor as estimated by automated crown delineation of the G-LiHT lidar canopy height model.

  11. Geologic Map of the Carlton Quadrangle, Yamhill County, Oregon

    USGS Publications Warehouse

    Wheeler, Karen L.; Wells, Ray E.; Minervini, Joseph M.; Block, Jessica L.

    2009-01-01

    The Carlton, Oregon, 7.5-minute quadrangle is located in northwestern Oregon, about 35 miles (57 km) southwest of Portland. It encompasses the towns of Yamhill and Carlton in the northwestern Willamette Valley and extends into the eastern flank of the Oregon Coast Range. The Carlton quadrangle is one of several dozen quadrangles being mapped by the U.S. Geological Survey (USGS) and the Oregon Department of Geology and Mineral Industries (DOGAMI) to provide a framework for earthquake-hazard assessments in the greater Portland, Oregon, metropolitan area. The focus of USGS mapping is on the structural setting of the northern Willamette Valley and its relation to the Coast Range uplift. Mapping was done in collaboration with soil scientists from the National Resource Conservation Service, and the distribution of geologic units is refined over earlier regional mapping (Schlicker and Deacon, 1967). Geologic mapping was done on 7.5-minute topographic base maps and digitized in ArcGIS to produce ArcGIS geodatabases and PDFs of the map and text. The geologic contacts are based on numerous observations and samples collected in 2002 and 2003, National Resource Conservation Service soils maps, and interpretations of 7.5-minute topography. The map was completed before new, high-resolution laser terrain mapping was flown for parts of the northern Willamette Valley in 2008.

  12. Hippocampus Segmentation Based on Local Linear Mapping

    PubMed Central

    Pang, Shumao; Jiang, Jun; Lu, Zhentai; Li, Xueli; Yang, Wei; Huang, Meiyan; Zhang, Yu; Feng, Yanqiu; Huang, Wenhua; Feng, Qianjin

    2017-01-01

    We propose local linear mapping (LLM), a novel fusion framework for distance field (DF) to perform automatic hippocampus segmentation. A k-means cluster method is proposed for constructing magnetic resonance (MR) and DF dictionaries. In LLM, we assume that the MR and DF samples are located on two nonlinear manifolds and the mapping from the MR manifold to the DF manifold is differentiable and locally linear. We combine the MR dictionary using local linear representation to present the test sample, and combine the DF dictionary using the corresponding coefficients derived from the local linear representation procedure to predict the DF of the test sample. We then merge the overlapping predicted DF patches to obtain the DF value of each point in the test image via a confidence-based weighted average method. This approach enabled us to estimate the label of the test image according to the predicted DF. The proposed method was evaluated on brain images of 35 subjects obtained from the SATA dataset. Results indicate the effectiveness of the proposed method, which yields mean Dice similarity coefficients of 0.8697, 0.8770 and 0.8734 for the left, right and bi-lateral hippocampus, respectively. PMID:28368016
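
    The prediction step of LLM can be read as a k-nearest-neighbour, ridge-regularized linear coding of the test MR patch over the MR dictionary, with the same coefficients applied to the paired DF dictionary. The sketch below is a simplified rendering of that idea; the dictionary arrays are assumed to come from the k-means step, and the paper's exact coding and patch-merging schemes are not reproduced:

    import numpy as np

    def llm_predict_df(mr_patch, mr_dict, df_dict, k=20, reg=1e-3):
        """Predict a distance-field (DF) patch from an MR patch.

        mr_dict, df_dict : paired dictionaries (n_atoms x patch_dim)
        The test MR patch is written as a ridge-regularized combination of its
        k nearest MR atoms; the same coefficients combine the paired DF atoms.
        """
        d = np.linalg.norm(mr_dict - mr_patch, axis=1)      # distance to each MR atom
        idx = np.argsort(d)[:k]                             # k nearest atoms
        A = mr_dict[idx].T                                  # patch_dim x k
        w = np.linalg.solve(A.T @ A + reg * np.eye(k), A.T @ mr_patch)
        return df_dict[idx].T @ w                           # predicted DF patch

    The overlapping DF patch predictions are then merged with confidence weights and thresholded to obtain the final hippocampus label, as described in the abstract.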

  13. Hippocampus Segmentation Based on Local Linear Mapping.

    PubMed

    Pang, Shumao; Jiang, Jun; Lu, Zhentai; Li, Xueli; Yang, Wei; Huang, Meiyan; Zhang, Yu; Feng, Yanqiu; Huang, Wenhua; Feng, Qianjin

    2017-04-03

    We propose local linear mapping (LLM), a novel fusion framework for distance field (DF) to perform automatic hippocampus segmentation. A k-means cluster method is proposed for constructing magnetic resonance (MR) and DF dictionaries. In LLM, we assume that the MR and DF samples are located on two nonlinear manifolds and the mapping from the MR manifold to the DF manifold is differentiable and locally linear. We combine the MR dictionary using local linear representation to present the test sample, and combine the DF dictionary using the corresponding coefficients derived from the local linear representation procedure to predict the DF of the test sample. We then merge the overlapping predicted DF patches to obtain the DF value of each point in the test image via a confidence-based weighted average method. This approach enabled us to estimate the label of the test image according to the predicted DF. The proposed method was evaluated on brain images of 35 subjects obtained from the SATA dataset. Results indicate the effectiveness of the proposed method, which yields mean Dice similarity coefficients of 0.8697, 0.8770 and 0.8734 for the left, right and bi-lateral hippocampus, respectively.

  14. Hippocampus Segmentation Based on Local Linear Mapping

    NASA Astrophysics Data System (ADS)

    Pang, Shumao; Jiang, Jun; Lu, Zhentai; Li, Xueli; Yang, Wei; Huang, Meiyan; Zhang, Yu; Feng, Yanqiu; Huang, Wenhua; Feng, Qianjin

    2017-04-01

    We propose local linear mapping (LLM), a novel fusion framework for distance field (DF) to perform automatic hippocampus segmentation. A k-means cluster method is proposed for constructing magnetic resonance (MR) and DF dictionaries. In LLM, we assume that the MR and DF samples are located on two nonlinear manifolds and the mapping from the MR manifold to the DF manifold is differentiable and locally linear. We combine the MR dictionary using local linear representation to present the test sample, and combine the DF dictionary using the corresponding coefficients derived from the local linear representation procedure to predict the DF of the test sample. We then merge the overlapping predicted DF patches to obtain the DF value of each point in the test image via a confidence-based weighted average method. This approach enabled us to estimate the label of the test image according to the predicted DF. The proposed method was evaluated on brain images of 35 subjects obtained from the SATA dataset. Results indicate the effectiveness of the proposed method, which yields mean Dice similarity coefficients of 0.8697, 0.8770 and 0.8734 for the left, right and bi-lateral hippocampus, respectively.

  15. Shelf life extension of lamb meat using thyme or oregano essential oils and modified atmosphere packaging.

    PubMed

    Karabagias, I; Badeka, A; Kontominas, M G

    2011-05-01

    The effect of thyme (TEO) and oregano (OEO) essential oils as well as modified atmosphere packaging (MAP) in extending the shelf life of fresh lamb meat stored at 4 °C was investigated. In a preliminary experiment TEO and OEO were used at concentrations 0.1 and 0.3% v/w while MAP tested included MAP1 (60% CO(2)/40% N(2)) and MAP2 (80% CO(2)/20% N(2)). Microbiological, physicochemical and sensory properties of lamb meat were monitored over a 20 day period. Sensory analysis showed that at the higher concentration both essential oils gave a strong objectionable odour and taste and were not further used. Of the two essential oils TEO was more effective as was MAP2 over MAP1 for lamb meat preservation. In a second experiment the combined effect of TEO (0.1%) and MAP2 (80/20) on shelf life extension of lamb meat was evaluated over a 25 day storage period. Microbial populations were reduced up to 2.8 log cfu/g on day 9 of storage with the most pronounced effect being achieved by the combination MAP2 plus TEO (0.1%). TBA values varied for all treatments and remained lower than 4 mg MDA/kg throughout storage. pH values varied between 6.4 and 6.0 during storage. Color parameters (L and b) increased with storage time while parameter (a) remained unaffected. Based primarily on sensory analysis (odour) but also on microbiological data, shelf life of lamb meat was 7 days for air packaged samples, 9-10 days for samples containing 0.1% of TEO and 21-22 days for MAP packaged samples containing 0.1% TEO. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.

  16. MAPPING THE GALAXY COLOR–REDSHIFT RELATION: OPTIMAL PHOTOMETRIC REDSHIFT CALIBRATION STRATEGIES FOR COSMOLOGY SURVEYS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, Daniel; Steinhardt, Charles; Faisst, Andreas

    2015-11-01

    Calibrating the photometric redshifts of ≳10^9 galaxies for upcoming weak lensing cosmology experiments is a major challenge for the astrophysics community. The path to obtaining the required spectroscopic redshifts for training and calibration is daunting, given the anticipated depths of the surveys and the difficulty in obtaining secure redshifts for some faint galaxy populations. Here we present an analysis of the problem based on the self-organizing map, a method of mapping the distribution of data in a high-dimensional space and projecting it onto a lower-dimensional representation. We apply this method to existing photometric data from the COSMOS survey selected to approximate the anticipated Euclid weak lensing sample, enabling us to robustly map the empirical distribution of galaxies in the multidimensional color space defined by the expected Euclid filters. Mapping this multicolor distribution lets us determine where—in galaxy color space—redshifts from current spectroscopic surveys exist and where they are systematically missing. Crucially, the method lets us determine whether a spectroscopic training sample is representative of the full photometric space occupied by the galaxies in a survey. We explore optimal sampling techniques and estimate the additional spectroscopy needed to map out the color–redshift relation, finding that sampling the galaxy distribution in color space in a systematic way can efficiently meet the calibration requirements. While the analysis presented here focuses on the Euclid survey, similar analysis can be applied to other surveys facing the same calibration challenge, such as DES, LSST, and WFIRST.
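
    The self-organizing map itself is straightforward to implement. The sketch below trains a small SOM on galaxy colour vectors with NumPy; it is a generic SOM with a Gaussian neighbourhood and linearly decaying parameters, not the specific configuration used for the COSMOS/Euclid analysis. After training, each galaxy is assigned to its best-matching cell, and cells containing no spectroscopic redshifts flag the regions of colour space that still need calibration spectra:

    import numpy as np

    def train_som(colors, grid_w=20, grid_h=20, n_iter=5000, seed=0):
        """Train a small SOM on galaxy colour vectors.

        colors : (n_galaxies, n_colours) array, e.g. differences of the
                 survey's photometric magnitudes
        Returns an (grid_h*grid_w, n_colours) array of cell weights.
        """
        rng = np.random.default_rng(seed)
        n_cells = grid_w * grid_h
        weights = colors[rng.choice(len(colors), size=n_cells)].astype(float)
        # 2-D grid coordinates of each cell, used by the neighbourhood function
        cells = np.stack(np.meshgrid(np.arange(grid_w), np.arange(grid_h)), axis=-1).reshape(-1, 2)
        sigma0, lr0 = max(grid_w, grid_h) / 2.0, 0.5
        for t in range(n_iter):
            frac = t / n_iter
            sigma = sigma0 * (1 - frac) + 1.0 * frac      # neighbourhood radius decays to ~1 cell
            lr = lr0 * (1 - frac) + 0.01 * frac           # learning rate decays to ~0.01
            x = colors[rng.integers(len(colors))]
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best-matching unit
            dist2 = np.sum((cells - cells[bmu])**2, axis=1)        # grid distance to the BMU
            h = lr * np.exp(-dist2 / (2 * sigma**2))               # Gaussian neighbourhood
            weights += h[:, None] * (x - weights)
        return weights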

  17. High resolution mapping and classification of oyster habitats in nearshore Louisiana using sidescan sonar

    USGS Publications Warehouse

    Allen, Y.C.; Wilson, C.A.; Roberts, H.H.; Supan, J.

    2005-01-01

    Sidescan sonar holds great promise as a tool to quantitatively depict the distribution and extent of benthic habitats in Louisiana's turbid estuaries. In this study, we describe an effective protocol for acoustic sampling in this environment. We also compared three methods of classification in detail: mean-based thresholding, supervised, and unsupervised techniques to classify sidescan imagery into categories of mud and shell. Classification results were compared to ground truth results using quadrat and dredge sampling. Supervised classification gave the best overall result (kappa = 75%) when compared to quadrat results. Classification accuracy was less robust when compared to all dredge samples (kappa = 21-56%), but increased greatly (90-100%) when only dredge samples taken from acoustically homogeneous areas were considered. Sidescan sonar, when combined with ground truth sampling at an appropriate scale, can be effectively used to establish an accurate substrate base map for both research applications and shellfish management. The sidescan imagery presented here also provides, for the first time, a detailed presentation of oyster habitat patchiness and scale in a productive oyster growing area.

  18. The Eastern Gas Shales Project (EGSP) Data System: A case study in data base design, development, and application

    USGS Publications Warehouse

    Dyman, T.S.; Wilcox, L.A.

    1983-01-01

    The U.S. Geological Survey and Petroleum Information Corporation in Denver, Colorado, developed the Eastern Gas Shale Project (EGSP) Data System for the U.S. Department of Energy, Morgantown, West Virginia. Geological, geochemical, geophysical, and engineering data from Devonian shale samples from more than 5800 wells and outcrops in the Appalachian basin were edited and converted to a Petroleum Information Corporation data base. Well and sample data may be retrieved from this data system to produce (1) production-test summaries by formation and well location; (2) contoured isopach, structure, and trend-surface maps of Devonian shale units; (3) sample summary reports for samples by location, well, contractor, and sample number; (4) cross sections displaying digitized log traces, geochemical, and lithologic data by depth for wells; and (5) frequency distributions and bivariate plots. Although part of the EGSP Data System is proprietary, and distribution of complete well histories is prohibited by contract, maps and aggregated well-data listings are being made available to the public through published reports. © 1983 Plenum Publishing Corporation.

  19. Evaluation of sampling methods to quantify abundance of hardwoods and snags within conifer-dominated riparian zones

    Treesearch

    Theresa Marquardt; Hailemariam Temesgen; Paul D. Anderson; Bianca Eskelson

    2012-01-01

    Six sampling alternatives were examined for their ability to quantify selected attributes of snags and hardwoods in conifer-dominated riparian areas of managed headwater forests in western Oregon. Each alternative was simulated 500 times at eight headwater forest locations based on a 0.52-ha square stem map. The alternatives were evaluated based on how well they...

  20. Algorithmic Approaches for Place Recognition in Featureless, Walled Environments

    DTIC Science & Technology

    2015-01-01

    The discipline of simultaneous localization and mapping (SLAM) has been studied intensively over the past several years. Many technical approaches

  1. Accelerated Brain DCE-MRI Using Iterative Reconstruction With Total Generalized Variation Penalty for Quantitative Pharmacokinetic Analysis: A Feasibility Study.

    PubMed

    Wang, Chunhao; Yin, Fang-Fang; Kirkpatrick, John P; Chang, Zheng

    2017-08-01

    To investigate the feasibility of using undersampled k-space data and an iterative image reconstruction method with total generalized variation penalty in the quantitative pharmacokinetic analysis for clinical brain dynamic contrast-enhanced magnetic resonance imaging. Eight brain dynamic contrast-enhanced magnetic resonance imaging scans were retrospectively studied. Two k-space sparse sampling strategies were designed to achieve a simulated image acquisition acceleration factor of 4. They are (1) a golden ratio-optimized 32-ray radial sampling profile and (2) a Cartesian-based random sampling profile with spatiotemporal-regularized sampling density constraints. The undersampled data were reconstructed to yield images using the investigated reconstruction technique. In quantitative pharmacokinetic analysis on a voxel-by-voxel basis, the rate constant Ktrans from the extended Tofts model and the blood flow FB and blood volume VB from the 2-compartment exchange model were analyzed. Finally, the quantitative pharmacokinetic parameters calculated from the undersampled data were compared with the corresponding values calculated from the fully sampled data. To quantify each parameter's accuracy calculated using the undersampled data, error in volume mean, total relative error, and cross-correlation were calculated. The pharmacokinetic parameter maps generated from the undersampled data appeared comparable to the ones generated from the original full sampling data. Within the region of interest, most derived error in volume mean values were about 5% or lower, and the average error in volume mean of all parameter maps generated through either sampling strategy was about 3.54%. The average total relative error value of all parameter maps in the region of interest was about 0.115, and the average cross-correlation of all parameter maps in the region of interest was about 0.962. All investigated pharmacokinetic parameters showed no significant differences between the results from the original data and the reduced sampling data. With sparsely sampled k-space data simulating acquisition accelerated by a factor of 4, the investigated total generalized variation-based iterative image reconstruction method can accurately estimate the dynamic contrast-enhanced magnetic resonance imaging pharmacokinetic parameters for reliable clinical application.
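
    The three agreement metrics named above (error in volume mean, total relative error, cross-correlation) are straightforward to compute once both parameter maps are available. A minimal sketch on synthetic Ktrans maps; the exact formulas used by the authors are assumed rather than quoted:

```python
import numpy as np

def map_agreement(full, under):
    """Compare a parameter map reconstructed from undersampled data against the
    fully sampled reference (both as arrays over an ROI). Returns the three
    summary metrics named in the abstract; formulas are assumptions here."""
    evm = abs(under.mean() - full.mean()) / full.mean()        # error in volume mean
    tre = np.linalg.norm(under - full) / np.linalg.norm(full)  # total relative error
    cc = np.corrcoef(full.ravel(), under.ravel())[0, 1]        # cross-correlation
    return evm, tre, cc

rng = np.random.default_rng(0)
ktrans_full = rng.uniform(0.05, 0.3, size=(64, 64))            # synthetic Ktrans map
ktrans_under = ktrans_full + rng.normal(0, 0.01, (64, 64))     # simulated reconstruction error
print(map_agreement(ktrans_full, ktrans_under))
```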

  2. Landsat-facilitated vegetation classification of the Kenai National Wildlife Refuge and adjacent areas, Alaska

    USGS Publications Warehouse

    Talbot, S. S.; Shasby, M.B.; Bailey, T.N.

    1985-01-01

    A Landsat-based vegetation map was prepared for Kenai National Wildlife Refuge and adjacent lands, 2 million and 2.5 million acres respectively. The refuge lies within the middle boreal subzone of south central Alaska. Seven major classes and sixteen subclasses were recognized: forest (closed needleleaf, needleleaf woodland, mixed); deciduous scrub (lowland and montane, subalpine); dwarf scrub (dwarf shrub tundra, lichen tundra, dwarf shrub and lichen tundra, dwarf shrub peatland, string bog/wetlands); herbaceous (graminoid meadows and marshes); scarcely vegetated areas; water (clear, moderately turbid, highly turbid); and glaciers. The methodology employed a cluster-block technique. Sample areas were described based on a combination of helicopter-ground survey, aerial photo interpretation, and digital Landsat data. Major steps in the Landsat analysis involved: preprocessing (geometric correction), spectral class labeling of sample areas, derivation of statistical parameters for spectral classes, preliminary classification of the entire study area using a maximum-likelihood algorithm, and final classification using ancillary information such as digital elevation data. The vegetation map (scale 1:250,000) was a pioneering effort since there were no intermediate-scale maps of the area. Representative of distinctive regional patterns, the map was suitable for use in comprehensive conservation planning and wildlife management.
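
    The maximum-likelihood step described above assigns each pixel to the spectral class whose Gaussian model gives it the highest likelihood. A minimal sketch with made-up four-band training statistics (not the study's classes or data):

```python
import numpy as np

def train_ml_classifier(samples_by_class):
    """Fit a mean vector and covariance matrix per spectral class from training pixels."""
    return {c: (x.mean(axis=0), np.cov(x, rowvar=False)) for c, x in samples_by_class.items()}

def classify(pixels, params):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    scores = []
    for c, (mu, cov) in params.items():
        diff = pixels - mu
        inv = np.linalg.inv(cov)
        logdet = np.linalg.slogdet(cov)[1]
        ll = -0.5 * (np.einsum('ij,jk,ik->i', diff, inv, diff) + logdet)
        scores.append(ll)
    classes = list(params.keys())
    return np.array(classes)[np.argmax(np.vstack(scores), axis=0)]

rng = np.random.default_rng(1)
training = {"forest": rng.normal([30, 60, 20, 90], 5, (50, 4)),   # hypothetical band means
            "water":  rng.normal([10, 15, 8, 5],   3, (50, 4))}
params = train_ml_classifier(training)
print(classify(rng.normal([30, 60, 20, 90], 5, (5, 4)), params))
```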

  3. Minimization for conditional simulation: Relationship to optimal transport

    NASA Astrophysics Data System (ADS)

    Oliver, Dean S.

    2014-05-01

    In this paper, we consider the problem of generating independent samples from a conditional distribution when independent samples from the prior distribution are available. Although there are exact methods for sampling from the posterior (e.g. Markov chain Monte Carlo or acceptance/rejection), these methods tend to be computationally demanding when evaluation of the likelihood function is expensive, as it is for most geoscience applications. As an alternative, in this paper we discuss deterministic mappings of variables distributed according to the prior to variables distributed according to the posterior. Although any deterministic mappings might be equally useful, we will focus our discussion on a class of algorithms that obtain implicit mappings by minimization of a cost function that includes measures of data mismatch and model variable mismatch. Algorithms of this type include quasi-linear estimation, randomized maximum likelihood, perturbed observation ensemble Kalman filter, and ensemble of perturbed analyses (4D-Var). When the prior pdf is Gaussian and the observation operators are linear, we show that these minimization-based simulation methods solve an optimal transport problem with a nonstandard cost function. When the observation operators are nonlinear, however, the mapping of variables from the prior to the posterior obtained from those methods is only approximate. Errors arise from neglect of the Jacobian determinant of the transformation and from the possibility of discontinuous mappings.
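
    For the linear-Gaussian case discussed above, a minimization-based sample can be produced by randomized maximum likelihood: perturb the observations, draw a prior sample, and minimize the combined misfit. A toy sketch with an assumed 5x3 linear operator, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
# Linear-Gaussian toy problem: d = G m + noise, zero-mean Gaussian prior on m
G = rng.normal(size=(5, 3))
m_true = rng.normal(size=3)
d_obs = G @ m_true + np.sqrt(0.1) * rng.normal(size=5)   # data-error variance 0.1

def rml_sample():
    """One randomized-maximum-likelihood sample: minimize the perturbed cost
    J(m) = ||d' - G m||^2 / 0.1 + ||m - m'||^2, which for linear G is a
    least-squares problem mapping a prior draw m' to a posterior draw."""
    m_prior = rng.normal(size=3)                          # draw from the prior N(0, I)
    d_pert = d_obs + np.sqrt(0.1) * rng.normal(size=5)    # perturbed observations
    wd, wm = 1 / np.sqrt(0.1), 1.0                        # whitening weights (diagonal covariances)
    A = np.vstack([wd * G, wm * np.eye(3)])
    b = np.concatenate([wd * d_pert, wm * m_prior])
    return np.linalg.lstsq(A, b, rcond=None)[0]

samples = np.array([rml_sample() for _ in range(200)])
print("posterior mean estimate:", samples.mean(axis=0))
```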

  4. System of Mueller-Jones matrix polarizing mapping of blood plasma films in breast pathology

    NASA Astrophysics Data System (ADS)

    Zabolotna, Natalia I.; Radchenko, Kostiantyn O.; Tarnovskiy, Mykola H.

    2017-08-01

    A combined method of Mueller-Jones matrix mapping and blood plasma film analysis, based on the system proposed in this paper, is described. From the obtained data on the structure and state of blood plasma samples, diagnostic conclusions can be made about the state of breast cancer patients ("normal" or "pathology"). Statistical and correlation moments are then obtained for each coordinate distribution by statistical analysis; these indicators serve as diagnostic criteria. The final step is to compare the results and choose the most effective diagnostic indicators. The paper presents the results of Mueller-Jones matrix mapping of optically thin (attenuation coefficient τ ≤ 0.1) blood plasma layers.

  5. Bridging scale gaps between regional maps of forest aboveground biomass and field sampling plots using TanDEM-X data

    NASA Astrophysics Data System (ADS)

    Ni, W.; Zhang, Z.; Sun, G.

    2017-12-01

    Several large-scale maps of forest AGB have been released [1] [2] [3]. However, these existing global or regional datasets were only approximations based on combining land cover type and representative values instead of measurements of actual forest aboveground biomass or forest heights [4]. Rodríguez-Veiga et al. [5] reported obvious discrepancies between existing forest biomass stock maps and in-situ observations in Mexico. One of the biggest challenges to the credibility of these maps comes from the scale gap between the size of field sampling plots used to develop (or validate) estimation models and the pixel size of these maps, and from the scarcity of field sampling plots large enough to verify these products [6]. It is time-consuming and labor-intensive to collect a sufficient number of field samples at plot sizes matching the resolution of regional maps. Smaller field sampling plots cannot fully represent the spatial heterogeneity of forest stands, as shown in Figure 1. Forest AGB is directly determined by forest heights, the diameter at breast height (DBH) of each tree, forest density and tree species. What is measured in field sampling are the geometrical characteristics of forest stands, including the DBH, tree heights and forest densities. LiDAR data are considered the best dataset for the estimation of forest AGB, mainly because LiDAR can directly capture geometrical features of forest stands through its range detection capabilities. A remotely sensed dataset that is capable of directly measuring forest spatial structure may therefore serve as a ladder to bridge the scale gap between the pixel size of regional forest AGB maps and field sampling plots. Several studies report that TanDEM-X data can be used to characterize forest spatial structure [7, 8]. In this study, a forest AGB map of northeast China was produced using ALOS/PALSAR data, taking TanDEM-X data as a bridge. The TanDEM-X InSAR data used in this study and the resulting forest AGB map are shown in Figure 2. Technical details and further analysis will be given in the final report. Acknowledgment: This work was supported in part by the National Basic Research Program of China (Grant No. 2013CB733401, 2013CB733404), and in part by the National Natural Science Foundation of China (Grant Nos. 41471311, 41371357, 41301395).

  6. Geologic map of the Zarkashan-Anguri copper and gold deposits, Ghazni Province, Afghanistan, modified from the 1968 original map compilation of E.P. Meshcheryakov and V.P. Sayapin

    USGS Publications Warehouse

    Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geological map of the area of Zarkashan-Anguri gold deposits, scale 1:50,000, which was compiled by E.P. Meshcheryakov and V.P. Sayapin in 1968. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original document and related reports and also visited the field area in April 2010. This modified map, which includes a cross section, illustrates the geologic setting of the Zarkashan-Anguri copper and gold deposits. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross section and includes modifications based on our examination of that and other documents, and based on observations made and sampling undertaken during our field visit. (Refer to the Introduction and the References in the Map PDF for an explanation of our methodology and for complete citations of the original map and related reports.) Elevations on the cross section are derived from the original Soviet topography and may not match the newer topography used on the current map.

  7. Spectral Unmixing Based Construction of Lunar Mineral Abundance Maps

    NASA Astrophysics Data System (ADS)

    Bernhardt, V.; Grumpe, A.; Wöhler, C.

    2017-07-01

    In this study we apply a nonlinear spectral unmixing algorithm to a nearly global lunar spectral reflectance mosaic derived from hyper-spectral image data acquired by the Moon Mineralogy Mapper (M3) instrument. Corrections for topographic effects and for thermal emission were performed. A set of 19 laboratory-based reflectance spectra of lunar samples published by the Lunar Soil Characterization Consortium (LSCC) were used as a catalog of potential endmember spectra. For a given spectrum, the multi-population population-based incremental learning (MPBIL) algorithm was used to determine the subset of endmembers actually contained in it. However, as the MPBIL algorithm is computationally expensive, it cannot be applied to all pixels of the reflectance mosaic. Hence, the reflectance mosaic was clustered into a set of 64 prototype spectra, and the MPBIL algorithm was applied to each prototype spectrum. Each pixel of the mosaic was assigned to the most similar prototype, and the set of endmembers previously determined for that prototype was used for pixel-wise nonlinear spectral unmixing using the Hapke model, implemented as linear unmixing of the single-scattering albedo spectrum. This procedure yields maps of the fractional abundances of the 19 endmembers. Based on the known modal abundances of a variety of mineral species in the LSCC samples, a conversion from endmember abundances to mineral abundances was performed. We present maps of the fractional abundances of plagioclase, pyroxene and olivine and compare our results with previously published lunar mineral abundance maps.
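
    The pixel-wise step described above reduces, in the single-scattering-albedo domain, to a constrained linear unmixing problem. A minimal sketch with synthetic endmember albedo spectra (the real catalog contains 19 LSCC endmembers, and the full procedure includes the MPBIL endmember selection, which is omitted here):

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical single-scattering albedo spectra of three endmembers (rows: bands)
rng = np.random.default_rng(3)
endmembers = np.abs(rng.normal(0.5, 0.1, size=(40, 3)))
true_frac = np.array([0.6, 0.3, 0.1])
pixel_albedo = endmembers @ true_frac + rng.normal(0, 0.002, 40)

frac, _ = nnls(endmembers, pixel_albedo)   # nonnegative least-squares abundances
frac /= frac.sum()                         # optional sum-to-one normalization
print(np.round(frac, 3))
```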

  8. Forest Resource Information System (FRIS)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The technological and economical feasibility of using multispectral digital image data as acquired from the LANDSAT satellites in an ongoing operational forest information system was evaluated. Computer compatible multispectral scanner data secured from the LANDSAT satellites were demonstrated to be a significant contributor to ongoing information systems by providing the added dimensions of synoptic and repeat coverage of the Earth's surface. Major forest cover types (conifer, deciduous, mixed conifer-deciduous, and non-forest) were classified well within the bounds of the statistical accuracy of the ground sample. Further, when overlaid with existing maps, the acreage of cover type retains a high level of positional integrity. Maps were digitized by a graphics design system, then overlaid and registered onto LANDSAT imagery such that the map data with associated attributes were displayed on the image. Once classified, the analysis results were converted back to map form as a cover type of information. Existing tabular information as represented by inventory is registered geographically to the map base through a vendor provided data management system. The notion of a geographical reference base (map) providing the framework to which imagery and tabular data bases are registered, and where each of the three functions of imagery, maps and inventory can be accessed singly or in combination, is the very essence of the forest resource information system design.

  9. Automatic Polyp Detection via A Novel Unified Bottom-up and Top-down Saliency Approach.

    PubMed

    Yuan, Yixuan; Li, Dengwang; Meng, Max Q-H

    2017-07-31

    In this paper, we propose a novel automatic computer-aided method to detect polyps in colonoscopy videos. To find the perceptually and semantically meaningful salient polyp regions, we first segment images into multilevel superpixels. Each level corresponds to a different superpixel size. Rather than adopting hand-designed features to describe these superpixels in images, we employ a sparse autoencoder (SAE) to learn discriminative features in an unsupervised way. Then a novel unified bottom-up and top-down saliency method is proposed to detect polyps. In the first stage, we propose a weak bottom-up (WBU) saliency map by fusing the contrast based saliency and object-center based saliency together. The contrast based saliency map highlights image parts that show different appearances compared with surrounding areas, while the object-center based saliency map emphasizes the center of the salient object. In the second stage, a strong classifier with Multiple Kernel Boosting (MKB) is learned to calculate the strong top-down (STD) saliency map based on samples directly from the obtained multi-level WBU saliency maps. We finally integrate these two stage saliency maps from all levels together to highlight polyps. Experimental results achieve 0.818 recall for saliency calculation, validating the effectiveness of our method. Extensive experiments on public polyp datasets demonstrate that the proposed saliency algorithm performs favorably against state-of-the-art saliency methods in detecting polyps.
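
    As a rough illustration of the weak bottom-up fusion described above, the sketch below combines a crude contrast prior with an object-center prior. The actual paper works at the superpixel level with learned SAE features, so everything here is a simplified stand-in:

```python
import numpy as np

def contrast_saliency(img):
    """Very rough contrast prior: deviation of each pixel from the global mean."""
    return np.abs(img - img.mean())

def center_saliency(shape, center):
    """Object-center prior: Gaussian falloff around an estimated object center."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    d2 = (yy - center[0])**2 + (xx - center[1])**2
    return np.exp(-d2 / (2 * (0.25 * max(shape))**2))

def weak_bottom_up(img, center):
    c = contrast_saliency(img)
    o = center_saliency(img.shape, center)
    fused = (c / c.max()) * o          # multiplicative fusion (one plausible choice)
    return fused / fused.max()

img = np.random.default_rng(4).random((120, 160))   # stand-in grayscale frame
print(weak_bottom_up(img, center=(60, 80)).shape)
```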

  10. A grid matrix-based Raman spectroscopic method to characterize different cell milieu in biopsied axillary sentinel lymph nodes of breast cancer patients.

    PubMed

    Som, Dipasree; Tak, Megha; Setia, Mohit; Patil, Asawari; Sengupta, Amit; Chilakapati, C Murali Krishna; Srivastava, Anurag; Parmar, Vani; Nair, Nita; Sarin, Rajiv; Badwe, R

    2016-01-01

    Raman spectroscopy, which is based upon inelastic scattering of photons, has the potential to emerge as a noninvasive bedside in vivo or ex vivo molecular diagnostic tool. There is a need to improve the sensitivity and predictability of Raman spectroscopy. We developed a grid matrix-based tissue mapping protocol to acquire cell-specific spectra that also involved digital microscopy for localizing malignant and lymphocytic cells in sentinel lymph node biopsy samples. Biosignals acquired from specific cellular milieus were subjected to advanced supervised analytical methods, i.e., cross-correlation and peak-to-peak ratio, in addition to PCA and PC-LDA. We observed decreased spectral intensity as well as shifts in the spectral peaks of the amide and lipid bands in the completely metastatic (cancer cell) lymph nodes with high cellular density. A spectral library of normal lymphocytes and metastatic cancer cells created using this cell-specific mapping technique can be utilized to develop an automated smart diagnostic tool for bench-side screening of sampled lymph nodes, supported by ongoing global research in developing better technology and signal and big data processing algorithms.

  11. Experimental philosophy leading to a small scale digital data base of the conterminous United States for designing experiments with remotely sensed data

    NASA Technical Reports Server (NTRS)

    Labovitz, M. L.; Masuoka, E. J.; Broderick, P. W.; Garman, T. R.; Ludwig, R. W.; Beltran, G. N.; Heyman, P. J.; Hooker, L. K.

    1983-01-01

    Research using satellite remotely sensed data, even within any single scientific discipline, often lacked a unifying principle or strategy with which to plan or integrate studies conducted over an area so large that exhaustive examination is infeasible, e.g., the U.S.A. However, such a series of studies would seem to be at the heart of what makes satellite remote sensing unique, that is the ability to select for study from among remotely sensed data sets distributed widely over the U.S., over time, where the resources do not exist to examine all of them. Using this philosophical underpinning and the concept of a unifying principle, an operational procedure for developing a sampling strategy and formal testable hypotheses was constructed. The procedure is applicable across disciplines, when the investigator restates the research question in symbolic form, i.e., quantifies it. The procedure is set within the statistical framework of general linear models. The dependent variable is any arbitrary function of remotely sensed data and the independent variables are values or levels of factors which represent regional climatic conditions and/or properties of the Earth's surface. These factors are operationally defined as maps from the U.S. National Atlas (U.S.G.S., 1970). Eighty-five maps from the National Atlas, representing climatic and surface attributes, were automated by point counting at an effective resolution of one observation every 17.6 km (11 miles) yielding 22,505 observations per map. The maps were registered to one another in a two step procedure producing a coarse, then fine scale registration. After registration, the maps were iteratively checked for errors using manual and automated procedures. The error free maps were annotated with identification and legend information and then stored as card images, one map to a file. A sampling design will be accomplished through a regionalization analysis of the National Atlas data base (presently being conducted). From this analysis a map of homogeneous regions of the U.S.A. will be created and samples (LANDSAT scenes) assigned by region.

  12. Mycobacterium avium subspecies paratuberculosis in bioaerosols after depopulation and cleaning of two cattle barns.

    PubMed

    Eisenberg, S; Nielen, M; Hoeboer, J; Bouman, M; Heederik, D; Koets, A

    2011-06-04

    Settled dust samples were collected on a commercial dairy farm in the Netherlands with a high prevalence of Mycobacterium avium subspecies paratuberculosis (MAP) (barn A) and on a Dutch experimental cattle farm (barn B) stocked with cattle confirmed to be MAP shedders. Barns were sampled while animals were present, after both barns were destocked and cleaned by cold high-pressure cleaning, and after being kept empty for two weeks (barn A) or after additional disinfection (barn B). MAP DNA was detected by IS900 real-time PCR and viable MAP were detected by liquid culture. MAP DNA was detected in 78 per cent of samples from barn A and 86 per cent of samples from barn B collected while animals were still present. Viable MAP was detected in six of nine samples from barn A and in three of seven samples from barn B. After cold high-pressure cleaning, viable MAP could be detected in only two samples from each barn. After leaving barn A empty for two weeks, and following additional disinfection of barn B, no viable MAP could be detected in any settled dust sample.

  13. Accurate Mobile Urban Mapping via Digital Map-Based SLAM †

    PubMed Central

    Roh, Hyunchul; Jeong, Jinyong; Cho, Younggun; Kim, Ayoung

    2016-01-01

    This paper presents accurate urban map generation using digital map-based Simultaneous Localization and Mapping (SLAM). Throughout this work, our main objective is generating a 3D and lane map aiming for sub-meter accuracy. In conventional mapping approaches, achieving extremely high accuracy was performed by either (i) exploiting costly airborne sensors or (ii) surveying with a static mapping system in a stationary platform. Mobile scanning systems recently have gathered popularity but are mostly limited by the availability of the Global Positioning System (GPS). We focus on the fact that the availability of GPS and urban structures are both sporadic but complementary. By modeling both GPS and digital map data as measurements and integrating them with other sensor measurements, we leverage SLAM for an accurate mobile mapping system. Our proposed algorithm generates an efficient graph SLAM and achieves a framework running in real-time and targeting sub-meter accuracy with a mobile platform. Integrated with the SLAM framework, we implement a motion-adaptive model for the Inverse Perspective Mapping (IPM). Using motion estimation derived from SLAM, the experimental results show that the proposed approaches provide stable bird’s-eye view images, even with significant motion during the drive. Our real-time map generation framework is validated via a long-distance urban test and evaluated at randomly sampled points using Real-Time Kinematic (RTK)-GPS. PMID:27548175
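
    The core idea of treating GPS fixes and digital-map matches as sporadic absolute measurements inside a graph can be illustrated in one dimension. The sketch below is only a toy least-squares pose graph with assumed weights, not the authors' SLAM system:

```python
import numpy as np

# Toy 1-D pose graph: fuse odometry between consecutive poses with sparse
# absolute fixes (GPS or digital-map matches) in one least-squares problem.
n = 50
odometry = np.full(n - 1, 1.0)                 # measured displacement between poses
gps = {0: 0.0, 20: 20.4, 49: 48.7}             # sparse absolute position fixes

rows, b, w_odo, w_gps = [], [], 1.0, 2.0       # weights are illustrative assumptions
for i, d in enumerate(odometry):               # odometry factors: x[i+1] - x[i] = d
    r = np.zeros(n); r[i + 1], r[i] = w_odo, -w_odo
    rows.append(r); b.append(w_odo * d)
for i, z in gps.items():                       # absolute factors: x[i] = z
    r = np.zeros(n); r[i] = w_gps
    rows.append(r); b.append(w_gps * z)

x = np.linalg.lstsq(np.vstack(rows), np.array(b), rcond=None)[0]
print("estimated poses 0, 20, 49:", x[[0, 20, 49]].round(2))
```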

  14. Comparison of prevalence estimation of Mycobacterium avium subsp. paratuberculosis infection by sampling slaughtered cattle with macroscopic lesions vs. systematic sampling.

    PubMed

    Elze, J; Liebler-Tenorio, E; Ziller, M; Köhler, H

    2013-07-01

    The objective of this study was to identify the most reliable approach for prevalence estimation of Mycobacterium avium ssp. paratuberculosis (MAP) infection in clinically healthy slaughtered cattle. Sampling of macroscopically suspect tissue was compared to systematic sampling. Specimens of ileum, jejunum, mesenteric and caecal lymph nodes were examined for MAP infection using bacterial microscopy, culture, histopathology and immunohistochemistry. MAP was found most frequently in caecal lymph nodes, but sampling more tissues optimized the detection rate. Examination by culture was most efficient while combination with histopathology increased the detection rate slightly. MAP was detected in 49/50 animals with macroscopic lesions representing 1.35% of the slaughtered cattle examined. Of 150 systematically sampled macroscopically non-suspect cows, 28.7% were infected with MAP. This indicates that the majority of MAP-positive cattle are slaughtered without evidence of macroscopic lesions and before clinical signs occur. For reliable prevalence estimation of MAP infection in slaughtered cattle, systematic random sampling is essential.

  15. Applying multibeam sonar and mathematical modeling for mapping seabed substrate and biota of offshore shallows

    NASA Astrophysics Data System (ADS)

    Herkül, Kristjan; Peterson, Anneliis; Paekivi, Sander

    2017-06-01

    Both basic science and marine spatial planning are in a need of high resolution spatially continuous data on seabed habitats and biota. As conventional point-wise sampling is unable to cover large spatial extents in high detail, it must be supplemented with remote sensing and modeling in order to fulfill the scientific and management needs. The combined use of in situ sampling, sonar scanning, and mathematical modeling is becoming the main method for mapping both abiotic and biotic seabed features. Further development and testing of the methods in varying locations and environmental settings is essential for moving towards unified and generally accepted methodology. To fill the relevant research gap in the Baltic Sea, we used multibeam sonar and mathematical modeling methods - generalized additive models (GAM) and random forest (RF) - together with underwater video to map seabed substrate and epibenthos of offshore shallows. In addition to testing the general applicability of the proposed complex of techniques, the predictive power of different sonar-based variables and modeling algorithms were tested. Mean depth, followed by mean backscatter, were the most influential variables in most of the models. Generally, mean values of sonar-based variables had higher predictive power than their standard deviations. The predictive accuracy of RF was higher than that of GAM. To conclude, we found the method to be feasible and with predictive accuracy similar to previous studies of sonar-based mapping.
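
    One of the two modeling algorithms compared above, random forest, can be prototyped directly from sonar-derived predictors at video ground-truth stations. A sketch on synthetic data with assumed predictor names, not the study's dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 300
# Hypothetical sonar-derived predictors per video ground-truth station
X = np.column_stack([rng.uniform(5, 25, n),       # mean depth (m)
                     rng.normal(-20, 4, n),       # mean backscatter (dB)
                     rng.gamma(2.0, 1.0, n)])     # backscatter standard deviation
y = (X[:, 1] + rng.normal(0, 2, n) > -20).astype(int)   # synthetic substrate class

rf = RandomForestClassifier(n_estimators=300, random_state=0)
print("CV accuracy:", cross_val_score(rf, X, y, cv=5).mean().round(3))
```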

  16. xMAP Technology: Applications in Detection of Pathogens

    PubMed Central

    Reslova, Nikol; Michna, Veronika; Kasny, Martin; Mikel, Pavel; Kralik, Petr

    2017-01-01

    xMAP technology is applicable for high-throughput, multiplex and simultaneous detection of different analytes within a single complex sample. xMAP multiplex assays are currently available in various nucleic acid and immunoassay formats, enabling simultaneous detection and typing of pathogenic viruses, bacteria, parasites and fungi and also antigen or antibody interception. As an open architecture platform, the xMAP technology is beneficial to end users and therefore it is used in various pharmaceutical, clinical and research laboratories. The main aim of this review is to summarize the latest findings and applications in the field of pathogen detection using microsphere-based multiplex assays. PMID:28179899

  17. Landsat for practical forest type mapping - A test case

    NASA Technical Reports Server (NTRS)

    Bryant, E.; Dodge, A. G., Jr.; Warren, S. D.

    1980-01-01

    Computer classified Landsat maps are compared with a recent conventional inventory of forest lands in northern Maine. Over the 196,000 hectare area mapped, estimates of the areas of softwood, mixed wood and hardwood forest obtained by a supervised classification of the Landsat data and a standard inventory based on aerial photointerpretation, probability proportional to prediction, field sampling and a standard forest measurement program are found to agree to within 5%. The cost of the Landsat maps is estimated to be $0.065/hectare. It is concluded that satellite techniques are worth developing for forest inventories, although they are not yet refined enough to be incorporated into current practical inventories.

  18. Detection of Mycobacterium avium subspecies paratuberculosis in tie-stall dairy herds using a standardized environmental sampling technique and targeted pooled samples.

    PubMed

    Arango-Sabogal, Juan C; Côté, Geneviève; Paré, Julie; Labrecque, Olivia; Roy, Jean-Philippe; Buczinski, Sébastien; Doré, Elizabeth; Fairbrother, Julie H; Bissonnette, Nathalie; Wellemans, Vincent; Fecteau, Gilles

    2016-07-01

    Mycobacterium avium ssp. paratuberculosis (MAP) is the etiologic agent of Johne's disease, a chronic contagious enteritis of ruminants that causes major economic losses. Several studies, most involving large free-stall herds, have found environmental sampling to be a suitable method for detecting MAP-infected herds. In eastern Canada, where small tie-stall herds are predominant, certain conditions and management practices may influence the survival and transmission of MAP and recovery (isolation). Our objective was to estimate the performance of a standardized environmental and targeted pooled sampling technique for the detection of MAP-infected tie-stall dairy herds. Twenty-four farms (19 MAP-infected and 5 non-infected) were enrolled, but only 20 were visited twice in the same year, to collect 7 environmental samples and 2 pooled samples (sick cows and cows with poor body condition). Concurrent individual sampling of all adult cows in the herds was also carried out. Isolation of MAP was achieved using the MGIT Para TB culture media and the BACTEC 960 detection system. Overall, MAP was isolated in 7% of the environmental cultures. The sensitivity of the environmental culture was 44% [95% confidence interval (CI): 20% to 70%] when combining results from 2 different herd visits and 32% (95% CI: 13% to 57%) when results from only 1 random herd visit were used. The best sampling strategy was to combine samples from the manure pit, gutter, sick cows, and cows with poor body condition. The standardized environmental sampling technique and the targeted pooled samples presented in this study represent an alternative sampling strategy to costly individual cultures for detecting MAP-infected tie-stall dairies. Repeated samplings may improve the detection of MAP-infected herds.

  19. Rapid, all-optical crystal orientation imaging of two-dimensional transition metal dichalcogenide monolayers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David, Sabrina N.; Zhai, Yao; van der Zande, Arend M.

    Two-dimensional (2D) atomic materials such as graphene and transition metal dichalcogenides (TMDCs) have attracted significant research and industrial interest for their electronic, optical, mechanical, and thermal properties. While large-area crystal growth techniques such as chemical vapor deposition have been demonstrated, the presence of grain boundaries and orientation of grains arising in such growths substantially affect the physical properties of the materials. There is currently no scalable characterization method for determining these boundaries and orientations over a large sample area. We here present a second-harmonic generation based microscopy technique for rapidly mapping grain orientations and boundaries of 2D TMDCs. We experimentally demonstrate the capability to map large samples to an angular resolution of ±1° with minimal sample preparation and without involved analysis. A direct comparison of the all-optical grain orientation maps against results obtained by diffraction-filtered dark-field transmission electron microscopy plus selected-area electron diffraction on identical TMDC samples is provided. This rapid and accurate tool should enable large-area characterization of TMDC samples for expedited studies of grain boundary effects and the efficient characterization of industrial-scale production techniques.
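
    Polarization-resolved SHG from a three-fold symmetric monolayer is commonly fitted with a cos^2(3(theta - theta0)) dependence to recover the armchair orientation; the sketch below assumes that common parametrization rather than the authors' exact fitting model:

```python
import numpy as np
from scipy.optimize import curve_fit

def shg_parallel(theta, amplitude, theta0):
    """Polarization-resolved SHG intensity for a three-fold symmetric monolayer;
    the crystal orientation enters through theta0 (an assumed parametrization)."""
    return amplitude * np.cos(3 * (theta - theta0))**2

rng = np.random.default_rng(9)
theta = np.deg2rad(np.arange(0, 360, 5))
data = shg_parallel(theta, 1.0, np.deg2rad(12)) + 0.02 * rng.normal(size=theta.size)
popt, _ = curve_fit(shg_parallel, theta, data, p0=[1.0, 0.0])
print("fitted orientation (deg):", round(np.rad2deg(popt[1]) % 60, 1))  # 60-degree degeneracy
```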

  20. Preliminary surficial geologic map database of the Amboy 30 x 60 minute quadrangle, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2006-01-01

    The surficial geologic map database of the Amboy 30x60 minute quadrangle presents characteristics of surficial materials for an area approximately 5,000 km2 in the eastern Mojave Desert of California. This map consists of new surficial mapping conducted between 2000 and 2005, as well as compilations of previous surficial mapping. Surficial geology units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects occurring post-deposition, and, where appropriate, the lithologic nature of the material. The physical properties recorded in the database focus on those that drive hydrologic, biologic, and physical processes such as particle size distribution (PSD) and bulk density. This version of the database is distributed with point data representing locations of samples for both laboratory determined physical properties and semi-quantitative field-based information. Future publications will include the field and laboratory data as well as maps of distributed physical properties across the landscape tied to physical process models where appropriate. The database is distributed in three parts: documentation, spatial map-based data, and printable map graphics of the database. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data, a database 'readme' file, which describes the database contents, and FGDC metadata for the spatial map information. Spatial data are distributed as Arc/Info coverage in ESRI interchange (e00) format, or as tabular data in the form of DBF3-file (.DBF) file formats. Map graphics files are distributed as Postscript and Adobe Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.

  1. EnviroAtlas -Portland, ME- One Meter Resolution Urban Land Cover (2010)

    EPA Pesticide Factsheets

    The EnviroAtlas Portland, ME land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Eight land cover classes were mapped: water, impervious surfaces, soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a stratified random sampling of 600 samples yielded an overall accuracy of 87.5 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Portland. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
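
    The accuracy figure quoted above comes from comparing mapped classes with reference labels at the assessment samples. A minimal sketch of an overall/user's/producer's accuracy calculation on simulated labels (the class list follows the description above; the labels themselves are synthetic):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

classes = ["water", "impervious", "soil/barren", "trees", "grass",
           "agriculture", "woody wetland", "emergent wetland"]
rng = np.random.default_rng(6)
reference = rng.integers(0, len(classes), 600)          # reference labels (simulated)
mapped = np.where(rng.random(600) < 0.875, reference,   # ~87.5% agreement, for illustration
                  rng.integers(0, len(classes), 600))

cm = confusion_matrix(reference, mapped)                # rows: reference, columns: mapped
overall = np.trace(cm) / cm.sum()
users = np.diag(cm) / cm.sum(axis=0)                    # user's accuracy (commission)
producers = np.diag(cm) / cm.sum(axis=1)                # producer's accuracy (omission)
print(f"overall accuracy = {overall:.3f}")
```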

  2. EnviroAtlas -Milwaukee, WI- One Meter Resolution Urban Land Cover Data (2010)

    EPA Pesticide Factsheets

    The EnviroAtlas Milwaukee, WI land cover data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Nine land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 85.39 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Milwaukee. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  3. EnviroAtlas -- Woodbine, IA -- One Meter Resolution Urban Land Cover Data (2011)

    EPA Pesticide Factsheets

    The EnviroAtlas Woodbine, IA land cover (LC) data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2011 at 1 m spatial resolution. Six land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, and agriculture. An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 87.03 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Woodbine. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  4. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.
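
    The across-model density uncertainty described above propagates directly into SWE when multiplied by Lidar depth. A toy sketch with three stand-in density models (not the SUMMA configurations used in the study):

```python
import numpy as np

rng = np.random.default_rng(8)
depth = rng.uniform(0.5, 2.5, size=(100, 100))    # Lidar snow depth (m), synthetic

# Densities (kg m^-3) predicted by three hypothetical density models
densities = np.stack([np.full_like(depth, 280.0),            # constant climatology
                      250.0 + 40.0 * depth,                   # depth-dependent model
                      rng.normal(300.0, 15.0, depth.shape)])  # stochastic model

swe = depth[None] * densities / 1000.0            # SWE in metres of water equivalent
swe_spread = swe.max(axis=0) - swe.min(axis=0)    # per-pixel across-model uncertainty
print("median SWE spread (m):", round(float(np.median(swe_spread)), 3))
```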

  5. Combining mapped and statistical data in forest ecological inventory and monitoring - supplementing an existing system

    Treesearch

    H. T. Schreuder; R. Czaplewski; R. G. Bailey

    1999-01-01

    A forest ecological inventory and monitoring system combining information derived from maps and samples is proposed, based on ecosystem regions (Bailey, 1994). The system extends the design of the USDA Forest Service Region 6 Inventory and Monitoring System (R6IMS) in the Pacific Northwest of the United States. The key uses of the information are briefly discussed and...

  6. Implications of alternative field-sampling designs on Landsat-based mapping of stand age and carbon stocks in Oregon forests

    Treesearch

    Maureen V. Duane; Warren B. Cohen; John L. Campbell; Tara Hudiburg; David P. Turner; Dale Weyermann

    2010-01-01

    Empirical models relating forest attributes to remotely sensed metrics are widespread in the literature and underpin many of our efforts to map forest structure across complex landscapes. In this study we compared empirical models relating Landsat reflectance to forest age across Oregon using two alternate sets of ground data: one from a large (n ~ 1500) systematic...

  7. Changes in hydrogen isotope ratios in sequential plumage stages: an implication for the creation of isotope-base maps for tracking migratory birds.

    PubMed

    Duxbury, J M; Holroyd, G L; Muehlenbachs, K

    2003-09-01

    Accurate reference maps are important in the use of stable isotopes to track the movements of migratory birds. Reference maps created by the analysis of samples collected from young at the nest site are more accurate than simply referring to naturally occurring patterns of hydrogen isotope ratios created by precipitation cycles. Ratios of hydrogen isotopes in the nutrients incorporated early in the development of young birds can be derived from endogenous, maternal sources. Base maps should be created from the analysis of tissue samples from hatchlings after the isotopic signature of local, exogenous nutrients becomes dominant. Migratory species such as Peregrine Falcons are known to use endogenous sources in the creation of their eggs; therefore, knowledge of what plumage stage best represents the local hydrogen ratios would assist in the planning of nest visits. We conducted diet manipulation experiments involving Japanese Quail and Peregrine Falcons to determine the plumage stage when hydrogen isotope ratios were indicative of a switch in their food source. The natal down of both the quail and falcons reflected the diet of breeding adult females. The hydrogen isotope ratios of a new food source were dominant in the juvenile down of the young falcons, although a further shift was detected in the final juvenile plumage. The juvenile plumage is grown during weeks 3-4 after hatch in Peregrine Falcons. Nest visits for the purpose of collecting feathers for isotope-base-map creation should be made around 4 weeks after the presumed hatch of the young falcons.

  8. Spatial averaging errors in creating hemispherical reflectance (albedo) maps from directional reflectance data

    NASA Technical Reports Server (NTRS)

    Kimes, D. S.; Kerber, A. G.; Sellers, P. J.

    1993-01-01

    Spatial averaging errors that may occur when creating hemispherical reflectance (albedo) maps for different cover types from directional reflectance data, using a direct nadir technique to estimate the hemispherical reflectance, are assessed by comparing the results with those obtained with a knowledge-based system called VEG (Kimes et al., 1991, 1992). It was found that the hemispherical reflectance errors obtained using VEG are much smaller than those from the direct nadir technique, depending on conditions. Suggestions are made concerning sampling and averaging strategies for creating hemispherical reflectance maps for photosynthetic, carbon cycle, and climate change studies.

  9. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. Part 6: A low-cost method for land use mapping using simple visual techniques of interpretation. [Spain

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. It was found that color composite transparencies and monocular magnification provided the best base for land use interpretation. New methods for determining optimum sample sizes and analyzing interpretation accuracy levels were developed. All stages of the methodology were assessed, in the operational sense, during the production of a 1:250,000 rural land use map of Murcia Province, Southeast Spain.

  10. Development and validation of a triplex real-time PCR for rapid detection and specific identification of M. avium sub sp. paratuberculosis in faecal samples.

    PubMed

    Irenge, Léonid M; Walravens, Karl; Govaerts, Marc; Godfroid, Jacques; Rosseels, Valérie; Huygen, Kris; Gala, Jean-Luc

    2009-04-14

    A triplex real-time PCR (TRT-PCR) assay was developed to ensure a rapid and reliable detection of Mycobacterium avium subsp. paratuberculosis (Map) in faecal samples and to allow routine detection of Map in farmed livestock and wildlife species. The TRT-PCR assay was designed using IS900, ISMAP02 and f57 molecular targets. Specificity of TRT-PCR was first confirmed on a panel of control mycobacterial Map and non-Map strains and on faecal samples from Map-negative cows (n=35) and from Map-positive cows (n=20). The TRT-PCR assay was compared to direct examination after Ziehl-Neelsen (ZN) staining and to culture on 197 faecal samples collected serially from five calves experimentally exposed to Map over a 3-year period during the sub-clinical phase of the disease. The data showed a good agreement between culture and TRT-PCR (kappa score=0.63), with a TRT-PCR limit of detection of 2.5 × 10^2 microorganisms/g of faeces spiked with Map. ZN agreement with TRT-PCR was not good (kappa=0.02). Sequence analysis of IS900 amplicons from three single IS900 positive samples confirmed the true Map positivity of the samples. Highly specific IS900 amplification suggests therefore that each single IS900 positive sample from experimentally exposed animals was a true Map-positive specimen. In this controlled experimental setting, the TRT-PCR was rapid, specific and displayed a very high sensitivity for Map detection in faecal samples compared to conventional methods.

  11. The performance of approximations of farm contiguity compared to contiguity defined using detailed geographical information in two sample areas in Scotland: implications for foot-and-mouth disease modelling.

    PubMed

    Flood, Jessica S; Porphyre, Thibaud; Tildesley, Michael J; Woolhouse, Mark E J

    2013-10-08

    When modelling infectious diseases, accurately capturing the pattern of dissemination through space is key to providing optimal recommendations for control. Mathematical models of disease spread in livestock, such as for foot-and-mouth disease (FMD), have done this by incorporating a transmission kernel which describes the decay in transmission rate with increasing Euclidean distance from an infected premises (IP). However, this assumes a homogenous landscape, and is based on the distance between point locations of farms. Indeed, underlying the spatial pattern of spread are the contact networks involved in transmission. Accordingly, area-weighted tessellation around farm point locations has been used to approximate field-contiguity and simulate the effect of contiguous premises (CP) culling for FMD. Here, geographic data were used to determine contiguity based on distance between premises' fields and presence of landscape features for two sample areas in Scotland. Sensitivity, positive predictive value, and the True Skill Statistic (TSS) were calculated to determine how point distance measures and area-weighted tessellation compared to the 'gold standard' of the map-based measures in identifying CPs. In addition, the mean degree and density of the different contact networks were calculated. Utilising point distances <1 km and <5 km as a measure for contiguity resulted in poor discrimination between map-based CPs/non-CPs (TSS 0.279-0.344 and 0.385-0.400, respectively). Point distance <1 km missed a high proportion of map-based CPs; <5 km point distance picked up a high proportion of map-based non-CPs as CPs. Area-weighted tessellation performed best, with reasonable discrimination between map-based CPs/non-CPs (TSS 0.617-0.737) and comparable mean degree and density. Landscape features altered network properties considerably when taken into account. The farming landscape is not homogeneous. Basing contiguity on geographic locations of field boundaries and including landscape features known to affect transmission into FMD models are likely to improve individual farm-level accuracy of spatial predictions in the event of future outbreaks. If a substantial proportion of FMD transmission events are by contiguous spread, and CPs should be assigned an elevated relative transmission rate, the shape of the kernel could be significantly altered since ability to discriminate between map-based CPs and non-CPs is different over different Euclidean distances.
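
    Sensitivity, positive predictive value and the True Skill Statistic used above follow directly from a 2x2 contingency table of approximate versus map-based contiguity. A minimal sketch with illustrative counts (not the study's numbers):

```python
def contingency_stats(tp, fp, fn, tn):
    """Sensitivity, positive predictive value and True Skill Statistic for a
    2x2 comparison of an approximate contiguity measure against map-based CPs."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    tss = sensitivity + specificity - 1
    return sensitivity, ppv, tss

# Illustrative counts only (not taken from the study)
print(contingency_stats(tp=120, fp=260, fn=40, tn=900))
```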

  12. Does the mind map learning strategy facilitate information retrieval and critical thinking in medical students?

    PubMed

    D'Antoni, Anthony V; Zipp, Genevieve Pinto; Olson, Valerie G; Cahill, Terrence F

    2010-09-16

    A learning strategy underutilized in medical education is mind mapping. Mind maps are multi-sensory tools that may help medical students organize, integrate, and retain information. Recent work suggests that using mind mapping as a note-taking strategy facilitates critical thinking. The purpose of this study was to investigate whether a relationship existed between mind mapping and critical thinking, as measured by the Health Sciences Reasoning Test (HSRT), and whether a relationship existed between mind mapping and recall of domain-based information. In this quasi-experimental study, 131 first-year medical students were randomly assigned to a standard note-taking (SNT) group or mind map (MM) group during orientation. Subjects were given a demographic survey and pre-HSRT. They were then given an unfamiliar text passage, a pre-quiz based upon the passage, and a 30-minute break, during which time subjects in the MM group were given a presentation on mind mapping. After the break, subjects were given the same passage and wrote notes based on their group (SNT or MM) assignment. A post-quiz based upon the passage was administered, followed by a post-HSRT. Differences in mean pre- and post-quiz scores between groups were analyzed using independent samples t-tests, whereas differences in mean pre- and post-HSRT total scores and subscores between groups were analyzed using ANOVA. Mind map depth was assessed using the Mind Map Assessment Rubric (MMAR). There were no significant differences in mean scores on both the pre- and post-quizzes between note-taking groups. And, no significant differences were found between pre- and post-HSRT mean total scores and subscores. Although mind mapping was not found to increase short-term recall of domain-based information or critical thinking compared to SNT, a brief introduction to mind mapping allowed novice MM subjects to perform similarly to SNT subjects. This demonstrates that medical students using mind maps can successfully retrieve information in the short term, and does not put them at a disadvantage compared to SNT students. Future studies should explore longitudinal effects of mind-map proficiency training on both short- and long-term information retrieval and critical thinking.

  13. Dopant mapping in thin FIB prepared silicon samples by Off-Axis Electron Holography.

    PubMed

    Pantzer, Adi; Vakahy, Atsmon; Eliyahou, Zohar; Levi, George; Horvitz, Dror; Kohn, Amit

    2014-03-01

    Modern semiconductor devices function due to accurate dopant distribution. Off-Axis Electron Holography (OAEH) in the transmission electron microscope (TEM) can map quantitatively the electrostatic potential in semiconductors with high spatial resolution. For the microelectronics industry, ongoing reduction of device dimensions, 3D device geometry, and failure analysis of specific devices require preparation of thin TEM samples, under 70 nm thick, by focused ion beam (FIB). Such thicknesses, which are considerably thinner than the values reported to date in the literature, are challenging due to FIB induced damage and surface depletion effects. Here, we report on preparation of TEM samples of silicon PN junctions in the FIB completed by low-energy (5 keV) ion milling, which reduced amorphization of the silicon to a thickness of 10 nm. Additional perpendicular FIB sectioning enabled a direct measurement of the TEM sample thickness in order to determine accurately the crystalline thickness of the sample. Consequently, we find that the low-energy milling also resulted in a negligible thickness of electrically inactive regions, approximately 4 nm thick. The influence of TEM sample thickness, FIB induced damage and doping concentrations on the accuracy of the OAEH measurements were examined by comparison to secondary ion mass spectrometry measurements as well as to 1D and 3D simulations of the electrostatic potentials. We conclude that for TEM samples down to 100 nm thick, OAEH measurements of Si-based PN junctions, for the doping levels examined here, resulted in quantitative mapping of potential variations, within ~0.1 V. For thinner TEM samples, down to 20 nm thick, mapping of potential variations is qualitative, due to a reduced accuracy of ~0.3 V. This article is dedicated to the memory of Zohar Eliyahou. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Quantitative x-ray phase-contrast imaging using a single grating of comparable pitch to sample feature size.

    PubMed

    Morgan, Kaye S; Paganin, David M; Siu, Karen K W

    2011-01-01

    The ability to quantitatively retrieve transverse phase maps during imaging by using coherent x rays often requires a precise grating or analyzer-crystal-based setup. Imaging of live animals presents further challenges when these methods require multiple exposures for image reconstruction. We present a simple method of single-exposure, single-grating quantitative phase contrast for a regime in which the grating period is much greater than the effective pixel size. A grating is used to create a high-visibility reference pattern incident on the sample, which is distorted according to the complex refractive index and thickness of the sample. The resolution, along a line parallel to the grating, is not restricted by the grating spacing, and the detector resolution becomes the primary determinant of the spatial resolution. We present a method of analysis that maps the displacement of interrogation windows in order to retrieve a quantitative phase map. Application of this analysis to the imaging of known phantoms shows excellent correspondence.
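
    The analysis described above tracks how interrogation windows of the grating pattern are displaced between the reference and sample images. A one-dimensional, integer-pixel sketch of that window-matching idea (real implementations use 2D windows and sub-pixel interpolation), with all signals synthetic:

```python
import numpy as np

def window_shift(ref, sample, start, width, search=10):
    """Estimate the local transverse shift of the grating pattern in one
    interrogation window by maximizing the cross-correlation with the
    reference pattern (integer-pixel precision only)."""
    w = ref[start:start + width]
    best, best_score = 0, -np.inf
    for s in range(-search, search + 1):
        seg = sample[start + s:start + s + width]
        score = np.dot(w - w.mean(), seg - seg.mean())
        if score > best_score:
            best, best_score = s, score
    return best  # displacement in pixels; proportional to the local phase gradient

x = np.arange(2000)
reference = 1 + np.cos(2 * np.pi * x / 40)          # grating intensity pattern
distorted = 1 + np.cos(2 * np.pi * (x - 3) / 40)    # uniform 3-pixel shift from the sample
print(window_shift(reference, distorted, start=500, width=120))
```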

  15. Raman spectroscopic analysis of real samples: Brazilian bauxite mineralogy

    NASA Astrophysics Data System (ADS)

    Faulstich, Fabiano Richard Leite; Castro, Harlem V.; de Oliveira, Luiz Fernando Cappa; Neumann, Reiner

    2011-10-01

    In this investigation, Raman spectroscopy with 1064 and 632.8 nm excitation was used to investigate real mineral samples of bauxite ore from mines of Northern Brazil, together with Raman mapping and X-ray diffraction. The obtained results show clearly that micro-Raman spectroscopy is a powerful tool for the identification of all the minerals usually found in bauxites: gibbsite, kaolinite, goethite, hematite, anatase and quartz. Bulk samples can also be analysed, and FT-Raman is more adequate due to its better signal-to-noise ratio and representativity, although it is not efficient for kaolinite. The identification of fingerprint vibrations for all the minerals allows the acquisition of Raman-based chemical maps, potentially powerful tools for process mineralogy applied to bauxite ores.

  16. The Sloan Digital Sky Survey Reverberation Mapping Project: Hα and Hβ Reverberation Measurements from First-year Spectroscopy and Photometry

    NASA Astrophysics Data System (ADS)

    Grier, C. J.; Trump, J. R.; Shen, Yue; Horne, Keith; Kinemuchi, Karen; McGreer, Ian D.; Starkey, D. A.; Brandt, W. N.; Hall, P. B.; Kochanek, C. S.; Chen, Yuguang; Denney, K. D.; Greene, Jenny E.; Ho, L. C.; Homayouni, Y.; I-Hsiu Li, Jennifer; Pei, Liuyi; Peterson, B. M.; Petitjean, P.; Schneider, D. P.; Sun, Mouyuan; AlSayyad, Yusura; Bizyaev, Dmitry; Brinkmann, Jonathan; Brownstein, Joel R.; Bundy, Kevin; Dawson, K. S.; Eftekharzadeh, Sarah; Fernandez-Trincado, J. G.; Gao, Yang; Hutchinson, Timothy A.; Jia, Siyao; Jiang, Linhua; Oravetz, Daniel; Pan, Kaike; Paris, Isabelle; Ponder, Kara A.; Peters, Christina; Rogerson, Jesse; Simmons, Audrey; Smith, Robyn; Wang, Ran

    2017-12-01

    We present reverberation mapping results from the first year of combined spectroscopic and photometric observations of the Sloan Digital Sky Survey Reverberation Mapping Project. We successfully recover reverberation time delays between the g+i band emission and the broad Hβ emission line for a total of 44 quasars, and for the broad Hα emission line in 18 quasars. Time delays are computed using the JAVELIN and CREAM software and the traditional interpolated cross-correlation function (ICCF): using well-defined criteria, we report measurements of 32 Hβ and 13 Hα lags with JAVELIN, 42 Hβ and 17 Hα lags with CREAM, and 16 Hβ and 8 Hα lags with the ICCF. Lag values are generally consistent among the three methods, though we typically measure smaller uncertainties with JAVELIN and CREAM than with the ICCF, given the more physically motivated light curve interpolation and more robust statistical modeling of the former two methods. The median redshift of our Hβ-detected sample of quasars is 0.53, significantly higher than that of the previous reverberation mapping sample. We find that in most objects, the time delay of the Hα emission is consistent with or slightly longer than that of Hβ. We measure black hole masses using our measured time delays and line widths for these quasars. These black hole mass measurements are mostly consistent with expectations based on the local M_BH-σ_* relationship, and are also consistent with single-epoch black hole mass measurements. This work increases the current sample size of reverberation-mapped active galaxies by about two-thirds and represents the first large sample of reverberation mapping observations beyond the local universe (z < 0.3).
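
    As a rough illustration of the ICCF lag estimate mentioned above, the sketch below cross-correlates linearly interpolated light curves over a grid of trial lags and reports the peak and centroid lags. It is a simplified, one-directional version with synthetic data, not the SDSS-RM pipeline (which also averages both interpolation directions and derives uncertainties by resampling).

```python
import numpy as np

def iccf_lag(t_cont, f_cont, t_line, f_line, lags):
    """Simplified ICCF: for each trial lag, shift the line light curve,
    interpolate the continuum onto the shifted times, and compute Pearson r."""
    r = np.empty(lags.size)
    for i, tau in enumerate(lags):
        t_shift = t_line - tau                      # line responds after continuum
        ok = (t_shift >= t_cont.min()) & (t_shift <= t_cont.max())
        c = np.interp(t_shift[ok], t_cont, f_cont)  # linear interpolation
        r[i] = np.corrcoef(c, f_line[ok])[0, 1]
    peak = lags[np.argmax(r)]
    high = r >= 0.8 * r.max()                       # centroid above 80% of peak
    centroid = np.sum(lags[high] * r[high]) / np.sum(r[high])
    return peak, centroid, r

# Usage with synthetic data: a line light curve lagging the continuum by 20 days.
t = np.linspace(0.0, 180.0, 90)
cont = np.sin(t / 15.0) + 0.05 * np.random.randn(t.size)
line = np.interp(t - 20.0, t, np.sin(t / 15.0)) + 0.05 * np.random.randn(t.size)
peak_lag, cen_lag, ccf = iccf_lag(t, cont, t, line, np.arange(0.0, 60.0, 1.0))
print(peak_lag, cen_lag)
```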

  17. A comparative study evaluating the efficacy of IS_MAP04 with IS900 and IS_MAP02 as a new diagnostic target for the detection of Mycobacterium avium subspecies paratuberculosis from bovine faeces.

    PubMed

    de Kruijf, Marcel; Govender, Rodney; Yearsley, Dermot; Coffey, Aidan; O'Mahony, Jim

    2017-05-01

    The aim of this study was to investigate the efficacy of IS_MAP04 as a potential new diagnostic quantitative PCR (qPCR) target for the detection of Mycobacterium avium subspecies paratuberculosis from bovine faeces. IS_MAP04 primers were designed and tested negative against non-MAP strains. The detection limit of IS_MAP04 qPCR was evaluated on different MAP K-10 DNA concentrations and on faecal samples spiked with different MAP K-10 cell dilutions. A collection of 106 faecal samples was analysed and the efficacy of IS_MAP04 was statistically compared with IS900 and IS_MAP02. The detection limits observed for IS_MAP04 and IS900 on MAP DNA were 34 fg and 3.4 fg, respectively. The detection limit of MAP from inoculated faecal samples was 10^2 CFU/g for both IS_MAP04 and IS900 targets and a detection limit of 10^2 CFU/g was also achieved with a TaqMan qPCR targeting IS_MAP04. The efficacy of IS_MAP04 to detect positive MAP faecal samples was 83.0% compared to 85.8% and 83.9% for IS900 and IS_MAP02, respectively. Strong kappa agreements were observed between IS_MAP04 and IS900 (κ=0.892) and between IS_MAP04 and IS_MAP02 (κ=0.897). As a new molecular target, IS_MAP04 showed that the detection limit was comparable to IS900 to detect MAP from inoculated faecal material. The MAP detection efficacy of IS_MAP04 from naturally infected faecal samples proved to be relatively comparable to IS_MAP02, but yielded efficacy results slightly less than IS900. Moreover, IS_MAP04 could be of significant value when used in duplex or multiplex qPCR assays. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Curiosity First 16 Rock or Soil Sampling Sites on Mars

    NASA Image and Video Library

    2016-10-03

    This graphic maps locations of the sites where NASA's Curiosity Mars rover collected its first 18 rock or soil samples for analysis by laboratory instruments inside the vehicle. It also presents images of the drilled holes where 14 rock-powder samples were acquired. Curiosity scooped two soil samples at each of the other two sites: Rocknest and Gobabeb. The diameter of each drill hole is about 0.6 inch (1.6 centimeters), slightly smaller than a U.S. dime. The images used here are raw color, as recorded by the rover's Mars Hand Lens Imager (MAHLI) camera. Notice the differences in color of the material at different drilling sites. For the map, north is toward the upper left corner. The scale bar represents 2 kilometers (1.2 miles). The base map is from the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. The latest sample site included is "Quela," where Curiosity drilled into bedrock of the Murray formation on Sept. 18, 2016, during the 1,464th Martian day, or sol, of the mission. Curiosity landed in August 2012 on the plain (named Aeolis Palus) near Mount Sharp (or Aeolis Mons). More drilling samples collected by MSL are available at http://photojournal.jpl.nasa.gov/catalog/PIA20845

  19. Integration of imagery and cartographic data through a common map base

    NASA Technical Reports Server (NTRS)

    Clark, J.

    1983-01-01

    Several disparate data types are integrated by using control points as the basis for spatially registering the data to a map base. The data are reprojected to match the coordinates of the reference UTM (Universal Transverse Mercator) map projection, as expressed in lines and samples. Control point selection is the most critical aspect of integrating the Thematic Mapper Simulator MSS imagery with the cartographic data. It is noted that control points chosen from the imagery are subject to error from mislocated points, either points that did not correlate well to the reference map or minor pixel offsets because of interactive cursoring errors. Errors are also introduced in map control points when points are improperly located and digitized, leading to inaccurate latitude and longitude coordinates. Nonsystematic aircraft platform variations, such as yaw, pitch, and roll, affect the spatial fidelity of the imagery in comparison with the quadrangles. Features in adjacent flight paths do not always correspond properly owing to the systematic panorama effect and alteration of flightline direction, as well as platform variations.

  20. Sampling Of SAR Imagery For Wind Resource Assessment

    NASA Astrophysics Data System (ADS)

    Badger, Merete; Badger, Jake; Hasager, Charlotte; Nielsen, Morten

    2010-04-01

    Wind resources over the sea can be assessed from a series of wind fields retrieved from Envisat ASAR imagery or other SAR data. Previous wind resource maps have been produced through random sampling of 70 or more satellite scenes over a given area of interest followed by fitting of a Weibull function to the data. Here we introduce a more advanced sampling strategy based on the wind class methodology that is normally applied in Risø DTU’s numerical modeling of wind resources. The aim is to obtain a more representative data set using fewer satellite SAR scenes. The new sampling strategy has been applied within a wind and solar resource assessment study for the United Arab Emirates (UAE) and also for wind resource mapping over a domain in the North Sea, as part of the EU-NORSEWInD project (2008-2012).
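
    For context, fitting a Weibull distribution to a set of SAR-retrieved wind speeds at one grid point might look like the sketch below; the sample size (70 scenes), the synthetic speeds, and the derived quantities are illustrative only and do not reproduce the wind-class weighting introduced in the paper.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

# Synthetic stand-in for wind speeds retrieved from ~70 SAR scenes at one
# grid cell (values are illustrative only).
wind_speeds = stats.weibull_min.rvs(c=2.1, scale=8.5, size=70, random_state=0)

# Two-parameter Weibull fit (location fixed at zero, as is usual for wind).
shape, loc, scale = stats.weibull_min.fit(wind_speeds, floc=0)

# Mean wind speed and mean power density follow from the fitted parameters.
mean_speed = scale * gamma(1.0 + 1.0 / shape)
air_density = 1.225  # kg/m^3
power_density = 0.5 * air_density * scale**3 * gamma(1.0 + 3.0 / shape)  # W/m^2
print(f"shape={shape:.2f}, scale={scale:.2f} m/s, "
      f"mean={mean_speed:.2f} m/s, power={power_density:.0f} W/m^2")
```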

  1. Procedures for adjusting regional regression models of urban-runoff quality using local data

    USGS Publications Warehouse

    Hoos, A.B.; Sisolak, J.K.

    1993-01-01

    Statistical operations termed model-adjustment procedures (MAPs) can be used to incorporate local data into existing regression models to improve the prediction of urban-runoff quality. Each MAP is a form of regression analysis in which the local data base is used as a calibration data set. Regression coefficients are determined from the local data base, and the resulting 'adjusted' regression models can then be used to predict storm-runoff quality at unmonitored sites. The response variable in the regression analyses is the observed load or mean concentration of a constituent in storm runoff for a single storm. The set of explanatory variables used in the regression analyses is different for each MAP, but always includes the predicted value of load or mean concentration from a regional regression model. The four MAPs examined in this study were: single-factor regression against the regional model prediction, P (termed MAP-1F-P); regression against P (termed MAP-R-P); regression against P and additional local variables (termed MAP-R-P+nV); and a weighted combination of P and a local-regression prediction (termed MAP-W). The procedures were tested by means of split-sample analysis, using data from three cities included in the Nationwide Urban Runoff Program: Denver, Colorado; Bellevue, Washington; and Knoxville, Tennessee. The MAP that provided the greatest predictive accuracy for the verification data set differed among the three test data bases and among model types (MAP-W for Denver and Knoxville, MAP-1F-P and MAP-R-P for Bellevue load models, and MAP-R-P+nV for Bellevue concentration models) and, in many cases, was not clearly indicated by the values of standard error of estimate for the calibration data set. A scheme to guide MAP selection, based on exploratory data analysis of the calibration data set, is presented and tested. The MAPs were tested for sensitivity to the size of a calibration data set. As expected, predictive accuracy of all MAPs for the verification data set decreased as the calibration data-set size decreased, but predictive accuracy was not as sensitive for the MAPs as it was for the local regression models.
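
    A minimal sketch of the single-factor adjustment idea described above: locally observed loads are regressed on the regional-model predictions, and the fitted coefficients then adjust predictions at unmonitored local sites. The log-linear form and all numbers are assumptions for illustration, not the study's calibration data or exact procedure.

```python
import numpy as np

# Hypothetical calibration data for one constituent: regional-model
# predictions P and locally observed storm loads L (illustrative values).
P = np.array([12.0, 8.5, 20.1, 15.3, 5.2, 9.8, 18.4, 7.1])   # predicted load
L = np.array([10.2, 9.1, 24.5, 13.8, 4.0, 11.2, 21.0, 6.3])  # observed load

# Single-factor adjustment: fit log L = b0 + b1 * log P, assuming the usual
# log-linear form of runoff-quality regressions.
b1, b0 = np.polyfit(np.log(P), np.log(L), 1)

def adjusted_prediction(p_regional):
    """Apply the locally calibrated adjustment to a regional prediction."""
    return np.exp(b0 + b1 * np.log(p_regional))

print(adjusted_prediction(14.0))
```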

  2. Active machine learning for rapid landslide inventory mapping with VHR satellite images (Invited)

    NASA Astrophysics Data System (ADS)

    Stumpf, A.; Lachiche, N.; Malet, J.; Kerle, N.; Puissant, A.

    2013-12-01

    VHR satellite images have become a primary source for landslide inventory mapping after major triggering events such as earthquakes and heavy rainfalls. Visual image interpretation is still the prevailing standard method for operational purposes but is time-consuming and not well suited to fully exploit the increasingly better supply of remote sensing data. Recent studies have addressed the development of more automated image analysis workflows for landslide inventory mapping. In particular, object-oriented approaches that account for spatial and textural image information have been demonstrated to be more adequate than pixel-based classification, but manually elaborated rule-based classifiers are difficult to adapt under changing scene characteristics. Machine learning algorithms allow learning classification rules for complex image patterns from labelled examples and can be adapted straightforwardly with available training data. In order to reduce the amount of costly training data, active learning (AL) has evolved as a key concept to guide the sampling for many applications. The underlying idea of AL is to initialize a machine learning model with a small training set, and to subsequently exploit the model state and data structure to iteratively select the most valuable samples that should be labelled by the user. With relatively few queries and labelled samples, an AL strategy yields higher accuracies than an equivalent classifier trained with many randomly selected samples. This study addressed the development of an AL method for landslide mapping from VHR remote sensing images with special consideration of the spatial distribution of the samples. Our approach [1] is based on the Random Forest algorithm and considers the classifier uncertainty as well as the variance of potential sampling regions to guide the user towards the most valuable sampling areas. The algorithm explicitly searches for compact regions and thereby avoids the spatially disperse sampling pattern inherent to most other AL methods. The accuracy, the sampling time and the computational runtime of the algorithm were evaluated on multiple satellite images capturing recent large-scale landslide events. Sampling between 1% and 4% of the study areas, accuracies between 74% and 80% were achieved, whereas standard sampling schemes yielded only accuracies between 28% and 50% with equal sampling costs. Compared to commonly used point-wise AL algorithms, the proposed approach significantly reduces the number of iterations and hence the computational runtime. Since the user can focus on relatively few compact areas (rather than on hundreds of distributed points), the overall labeling time is reduced by more than 50% compared to point-wise queries. An experimental evaluation of multiple expert mappings demonstrated strong relationships between the uncertainties of the experts and the machine learning model. It revealed that the achieved accuracies are within the range of the inter-expert disagreement and that it will be indispensable to consider ground truth uncertainties to truly achieve further enhancements in the future. The proposed method is generally applicable to a wide range of optical satellite images and landslide types. [1] A. Stumpf, N. Lachiche, J.-P. Malet, N. Kerle, and A. Puissant, Active learning in the spatial domain for remote sensing image classification, IEEE Transactions on Geoscience and Remote Sensing, 2013, DOI 10.1109/TGRS.2013.2262052.
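
    The core AL loop described above (uncertainty sampling with a Random Forest) can be sketched as follows; the region-based, spatially compact batch selection that distinguishes the proposed method is omitted, and the toy dataset, batch size, and iteration count are placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for image-object features and landslide / non-landslide labels.
X, y = make_classification(n_samples=2000, n_features=12, random_state=0)
rng = np.random.default_rng(0)

labeled = list(rng.choice(len(X), size=20, replace=False))   # small seed set
pool = [i for i in range(len(X)) if i not in set(labeled)]

for it in range(10):                                # a few AL iterations
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])
    # Uncertainty = 1 - max class probability; query the most uncertain objects.
    uncertainty = 1.0 - proba.max(axis=1)
    query = [pool[j] for j in np.argsort(uncertainty)[-20:]]
    labeled.extend(query)                           # the "oracle" labels them
    pool = [i for i in pool if i not in set(query)]

print("final training size:", len(labeled))
```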

  3. Generation of 2D Land Cover Maps for Urban Areas Using Decision Tree Classification

    NASA Astrophysics Data System (ADS)

    Höhle, J.

    2014-09-01

    A 2D land cover map can automatically and efficiently be generated from high-resolution multispectral aerial images. First, a digital surface model is produced and each cell of the elevation model is then supplemented with attributes. A decision tree classification is applied to extract map objects like buildings, roads, grassland, trees, hedges, and walls from such an "intelligent" point cloud. The decision tree is derived from training areas whose borders are digitized on top of a false-colour orthoimage. The produced 2D land cover map with six classes is then refined by using image analysis techniques. The proposed methodology is described step by step. The classification, assessment, and refinement are carried out by the open source software "R"; the generation of the dense and accurate digital surface model by the "Match-T DSM" program of the Trimble Company. A practical example of a 2D land cover map generation is carried out. Images of a multispectral medium-format aerial camera covering an urban area in Switzerland are used. The assessment of the produced land cover map is based on class-wise stratified sampling where reference values of samples are determined by means of stereo-observations of false-colour stereopairs. The stratified statistical assessment of the produced land cover map with six classes and based on 91 points per class reveals a high thematic accuracy for classes "building" (99 %, 95 % CI: 95 %-100 %) and "road and parking lot" (90 %, 95 % CI: 83 %-95 %). Some other accuracy measures (overall accuracy, kappa value) and their 95 % confidence intervals are derived as well. The proposed methodology has a high potential for automation and fast processing and may be applied to other scenes and sensors.
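
    A small sketch of the class-wise accuracy assessment with 95% confidence intervals, assuming 91 stratified reference points per class; the per-class counts of correctly classified samples are hypothetical, and a Wilson score interval stands in for whichever interval estimator the authors used.

```python
import numpy as np
from scipy.stats import norm

def class_accuracy_ci(correct, n, conf=0.95):
    """Per-class thematic accuracy with a Wilson score confidence interval."""
    z = norm.ppf(0.5 + conf / 2)
    p = correct / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * np.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, centre - half, centre + half

# Hypothetical reference counts: 91 stratified samples per class.
classes = {"building": 90, "road/parking": 82, "grass": 75,
           "tree": 85, "hedge/wall": 70, "water": 88}
for name, correct in classes.items():
    p, lo, hi = class_accuracy_ci(correct, 91)
    print(f"{name:12s} {p:5.1%}  95% CI [{lo:5.1%}, {hi:5.1%}]")
```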

  4. GIS-based support vector machine modeling of earthquake-triggered landslide susceptibility in the Jianjiang River watershed, China

    NASA Astrophysics Data System (ADS)

    Xu, Chong; Dai, Fuchu; Xu, Xiwei; Lee, Yuan Hsi

    2012-04-01

    Support vector machine (SVM) modeling is based on statistical learning theory. It involves a training phase with associated input and target output values. In recent years, the method has become increasingly popular. The main purpose of this study is to evaluate the mapping power of SVM modeling in earthquake-triggered landslide-susceptibility mapping for a section of the Jianjiang River watershed using Geographic Information System (GIS) software. The river was affected by the Wenchuan earthquake of May 12, 2008. Visual interpretation of colored aerial photographs of 1-m resolution and extensive field surveys provided a detailed landslide inventory map containing 3147 landslides related to the 2008 Wenchuan earthquake. Elevation, slope angle, slope aspect, distance from seismogenic faults, distance from drainages, and lithology were used as the controlling parameters. For modeling, three groups of positive and negative training samples were used in concert with four different kernel functions. Positive training samples include the centroids of 500 large landslides, those of all 3147 landslides, and 5000 randomly selected points in landslide polygons. Negative training samples include 500, 3147, and 5000 randomly selected points on slopes that remained stable during the Wenchuan earthquake. The four kernel functions are linear, polynomial, radial basis, and sigmoid. In total, 12 cases of landslide susceptibility were mapped. Comparative analyses of landslide-susceptibility probability and area relation curves show that both the polynomial and radial basis functions suitably classified the input data as either landslide positive or negative, though the radial basis function was more successful. The 12 generated landslide-susceptibility maps were compared with known landslide centroid locations and landslide polygons to verify the success rate and predictive accuracy of each model. The 12 results were further validated using area-under-curve analysis. Group 3, with 5000 randomly selected points on the landslide polygons and 5000 randomly selected points along stable slopes, gave the best results with a success rate of 79.20% and predictive accuracy of 79.13% under the radial basis function. Of all the results, the sigmoid kernel function was the least skillful when used in concert with the centroid data of all 3147 landslides as positive training samples, and the negative training samples of 3147 randomly selected points in regions of stable slope (success rate = 54.95%; predictive accuracy = 61.85%). This paper also provides suggestions and reference data for selecting appropriate training samples and kernel function types for earthquake-triggered landslide-susceptibility mapping using SVM modeling. Predictive landslide-susceptibility maps could be useful in hazard mitigation by helping planners understand the probability of landslides in different regions.
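
    The kernel comparison at the heart of this workflow reduces, in code, to training one SVM per kernel on labelled positive and negative samples and scoring each; the toy two-feature dataset below stands in for the real terrain predictors and GIS-based sampling, so it is only a sketch of the idea.

```python
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

# Toy stand-in for landslide (1) / stable-slope (0) training points; real
# inputs would be elevation, slope angle, aspect, distance to faults, etc.
X, y = make_moons(n_samples=2000, noise=0.3, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, probability=True, random_state=0).fit(Xtr, ytr)
    # predict_proba would give a susceptibility value in [0, 1] per grid cell.
    print(f"{kernel:8s} test accuracy = {clf.score(Xte, yte):.3f}")
```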

  5. Application of an imputation method for geospatial inventory of forest structural attributes across multiple spatial scales in the Lake States, U.S.A

    NASA Astrophysics Data System (ADS)

    Deo, Ram K.

    Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and as a consequence spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest based k Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service was integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for a part of the Lake States and species-specific site index maps for the entire Lake States. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data was integrated with the field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way. The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.

  6. Easy and accurate reconstruction of whole HIV genomes from short-read sequence data with shiver

    PubMed Central

    Blanquart, François; Golubchik, Tanya; Gall, Astrid; Bakker, Margreet; Bezemer, Daniela; Croucher, Nicholas J; Hall, Matthew; Hillebregt, Mariska; Ratmann, Oliver; Albert, Jan; Bannert, Norbert; Fellay, Jacques; Fransen, Katrien; Gourlay, Annabelle; Grabowski, M Kate; Gunsenheimer-Bartmeyer, Barbara; Günthard, Huldrych F; Kivelä, Pia; Kouyos, Roger; Laeyendecker, Oliver; Liitsola, Kirsi; Meyer, Laurence; Porter, Kholoud; Ristola, Matti; van Sighem, Ard; Cornelissen, Marion; Kellam, Paul; Reiss, Peter

    2018-01-01

    Studying the evolution of viruses and their molecular epidemiology relies on accurate viral sequence data, so that small differences between similar viruses can be meaningfully interpreted. Despite its higher throughput and more detailed minority variant data, next-generation sequencing has yet to be widely adopted for HIV. The difficulty of accurately reconstructing the consensus sequence of a quasispecies from reads (short fragments of DNA) in the presence of large between- and within-host diversity, including frequent indels, may have presented a barrier. In particular, mapping (aligning) reads to a reference sequence leads to biased loss of information; this bias can distort epidemiological and evolutionary conclusions. De novo assembly avoids this bias by aligning the reads to themselves, producing a set of sequences called contigs. However, contigs provide only a partial summary of the reads, misassembly may result in their having an incorrect structure, and no information is available at parts of the genome where contigs could not be assembled. To address these problems we developed the tool shiver to pre-process reads for quality and contamination, then map them to a reference tailored to the sample using corrected contigs supplemented with the user’s choice of existing reference sequences. Run with two commands per sample, it can easily be used for large heterogeneous data sets. We used shiver to reconstruct the consensus sequence and minority variant information from paired-end short-read whole-genome data produced with the Illumina platform, for sixty-five existing publicly available samples and fifty new samples. We show the systematic superiority of mapping to shiver’s constructed reference compared with mapping the same reads to the closest of 3,249 real references: median values of 13 bases called differently and more accurately, 0 bases called differently and less accurately, and 205 bases of missing sequence recovered. We also successfully applied shiver to whole-genome samples of Hepatitis C Virus and Respiratory Syncytial Virus. shiver is publicly available from https://github.com/ChrisHIV/shiver. PMID:29876136

  7. Assessing Volunteered Geographic Information (VGI) Quality Based on Contributors' Mapping Behaviours

    NASA Astrophysics Data System (ADS)

    Bégin, D.; Devillers, R.; Roche, S.

    2013-05-01

    VGI changed the mapping landscape by allowing people who are not professional cartographers to contribute to large mapping projects, resulting at the same time in concerns about the quality of the data produced. While a number of early VGI studies used conventional methods to assess data quality, such approaches are not always well adapted to VGI. Since VGI is user-generated content, we posit that features and places mapped by contributors largely reflect contributors' personal interests. This paper proposes studying contributors' mapping processes to understand the characteristics and quality of the data produced. We argue that contributors' behaviour when mapping reflects contributors' motivation and individual preferences in selecting mapped features and delineating mapped areas. Such knowledge of contributors' behaviour could allow for the derivation of information about the quality of VGI datasets. This approach was tested using a sample area from OpenStreetMap, leading to a better understanding of data completeness for contributors' preferred features.

  8. EnviroAtlas -Durham, NC- One Meter Resolution Urban Area Land Cover Map (2010) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas ). The EnviroAtlas Durham, NC land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from July 2010 at 1 m spatial resolution. Five land cover classes were mapped: impervious surface, soil and barren, grass and herbaceous, trees and forest, and water. An accuracy assessment using a stratified random sampling of 500 samples yielded an overall accuracy of 83 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Durham, and includes the cities of Durham, Chapel Hill, Carrboro and Hillsborough, NC. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas ) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets ).

  9. Constrained map-based inventory estimation

    Treesearch

    Paul C. Van Deusen; Francis A. Roesch

    2007-01-01

    A region can conceptually be tessellated into polygons at different scales or resolutions. Likewise, samples can be taken from the region to determine the value of a polygon variable for each scale. Sampled polygons can be used to estimate values for other polygons at the same scale. However, estimates should be compatible across the different scales. Estimates are...

  10. A Genetic Linkage Map of Longleaf Pine (Pinus palustris Mill.) Based on Random Amplified Polymorphic DNAs

    Treesearch

    C.D. Nelson; Thomas L. Kubisiak; M. Stine; W.L. Nance

    1994-01-01

    Eight megagametophyte DNA samples from a single longleaf pine (Pinus palustris Mill.) tree were used to screen 576 oligonucleotide primers for random amplified polymorphic DNA (RAPD) fragments. Primers amplifying repeatable polymorphic fragments were further characterized within a sample of 72 megagametophytes from the same tree. Fragments...

  11. Theory of Single-Impact Atomic Force Spectroscopy in liquids with material contrast.

    PubMed

    López-Guerra, Enrique A; Banfi, Francesco; Solares, Santiago D; Ferrini, Gabriele

    2018-05-14

    Scanning probe microscopy has enabled nanoscale mapping of mechanical properties in important technological materials, such as tissues, biomaterials, polymers, and nanointerfaces of composite materials, to name only a few. To improve and widen the measurement of nanoscale mechanical properties, a number of methods have been proposed to overcome the widely used force-displacement mode, which is inherently slow and limited to a quasi-static regime, mainly using multiple sinusoidal excitations of the sample base or of the cantilever. Here, a different approach is put forward. It exploits the unique capabilities of wavelet transform analysis to harness the information encoded in a short-duration spectroscopy experiment. It is based on an impulsive excitation of the cantilever and a single impact of the tip with the sample. It performs well in highly damped environments, which are often seen as problematic in other standard dynamic methods. Our results are very promising in terms of viscoelastic property discrimination. Their potential is oriented toward (but not limited to) samples that demand imaging in native liquid environments and also to highly vulnerable samples whose compositional mapping cannot be obtained through standard tapping imaging techniques.

  12. Metagenomes of the Picoalga Bathycoccus from the Chile Coastal Upwelling

    PubMed Central

    Vaulot, Daniel; Lepère, Cécile; Toulza, Eve; De la Iglesia, Rodrigo; Poulain, Julie; Gaboyer, Frédéric; Moreau, Hervé; Vandepoele, Klaas; Ulloa, Osvaldo; Gavory, Frederick; Piganeau, Gwenael

    2012-01-01

    Among small photosynthetic eukaryotes that play a key role in oceanic food webs, picoplanktonic Mamiellophyceae such as Bathycoccus, Micromonas, and Ostreococcus are particularly important in coastal regions. By using a combination of cell sorting by flow cytometry, whole genome amplification (WGA), and 454 pyrosequencing, we obtained metagenomic data for two natural picophytoplankton populations from the coastal upwelling waters off central Chile. About 60% of the reads of each sample could be mapped to the genome of Bathycoccus strain from the Mediterranean Sea (RCC1105), representing a total of 9 Mbp (sample T142) and 13 Mbp (sample T149) of non-redundant Bathycoccus genome sequences. WGA did not amplify all regions uniformly, resulting in unequal coverage along a given chromosome and between chromosomes. The identity at the DNA level between the metagenomes and the cultured genome was very high (96.3% identical bases for the three larger chromosomes over a 360 kbp alignment). At least two to three different genotypes seemed to be present in each natural sample based on read mapping to Bathycoccus RCC1105 genome. PMID:22745802

  13. Magnetic mapping of iron in rodent spleen

    PubMed Central

    Blissett, Angela R.; Ollander, Brooke; Penn, Brittany; McTigue, Dana M.; Agarwal, Gunjan

    2016-01-01

    Evaluation of iron distribution and density in biological tissues is important to understand the pathogenesis of a variety of diseases and the fate of exogenously administered iron-based carriers and contrast agents. Iron distribution in tissues is typically characterized via histochemical (Perl’s) stains or immunohistochemistry for ferritin, the major iron storage protein. A more accurate mapping of iron can be achieved via ultrastructural transmission electron microscopy (TEM) based techniques, which involve stringent sample preparation conditions. In this study, we elucidate the capability of magnetic force microscopy (MFM) as a label-free technique to map iron at the nanoscale level in rodent spleen tissue. We complemented and compared our MFM results with those obtained using Perl’s staining and TEM. Our results show how MFM mapping corresponded to sizes of iron-rich lysosomes at a resolution comparable to that of TEM. In addition MFM is compatible with tissue sections commonly prepared for routine histology. PMID:27890658

  14. The art and science of weed mapping

    USGS Publications Warehouse

    Barnett, David T.; Stohlgren, Thomas J.; Jarnevich, Catherine S.; Chong, Geneva W.; Ericson, Jenny A.; Davern, Tracy R.; Simonson, Sara E.

    2007-01-01

    Land managers need cost-effective and informative tools for non-native plant species management. Many local, state, and federal agencies adopted mapping systems designed to collect comparable data for the early detection and monitoring of non-native species. We compared mapping information to statistically rigorous, plot-based methods to better understand the benefits and compatibility of the two techniques. Mapping non-native species locations provided a species list, associated species distributions, and infested area for subjectively selected survey sites. The value of this information may be compromised by crude estimates of cover and incomplete or biased estimations of species distributions. Incorporating plot-based assessments guided by a stratified-random sample design provided a less biased description of non-native species distributions and increased the comparability of data over time and across regions for the inventory, monitoring, and management of non-native and native plant species.

  15. Automated mineralogy based on micro-energy-dispersive X-ray fluorescence microscopy (µ-EDXRF) applied to plutonic rock thin sections in comparison to a mineral liberation analyzer

    NASA Astrophysics Data System (ADS)

    Nikonow, Wilhelm; Rammlmair, Dieter

    2017-10-01

    Recent developments in the application of micro-energy-dispersive X-ray fluorescence spectrometry mapping (µ-EDXRF) have opened up new opportunities for fast geoscientific analyses. Acquiring spatially resolved spectral and chemical information non-destructively for large samples of up to 20 cm length provides valuable information for geoscientific interpretation. Using supervised classification of the spectral information, mineral distribution maps can be obtained. In this work, thin sections of plutonic rocks are analyzed by µ-EDXRF and classified using the supervised classification algorithm spectral angle mapper (SAM). Based on the mineral distribution maps, it is possible to obtain quantitative mineral information, i.e., to calculate the modal mineralogy, search and locate minerals of interest, and perform image analysis. The results are compared to automated mineralogy obtained from the mineral liberation analyzer (MLA) of a scanning electron microscope (SEM) and show good accordance, revealing variation resulting mostly from the limit of spatial resolution of the µ-EDXRF instrument. Taking into account the little time needed for sample preparation and measurement, this method seems suitable for fast sample overviews with valuable chemical, mineralogical and textural information. Additionally, it enables the researcher to make better and more targeted decisions for subsequent analyses.
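
    A compact sketch of the spectral angle mapper (SAM) classification step used here: each pixel spectrum is assigned to the reference (endmember) spectrum with which it forms the smallest angle. The cube dimensions and endmember spectra below are random placeholders, not µ-EDXRF data.

```python
import numpy as np

def spectral_angle_mapper(cube, endmembers):
    """Classify each pixel spectrum by its minimum spectral angle.

    cube:       (rows, cols, bands) array of spectra
    endmembers: (n_classes, bands) reference spectra, one per mineral
    returns:    (rows, cols) array of class indices
    """
    pix = cube.reshape(-1, cube.shape[-1]).astype(float)
    pix /= np.linalg.norm(pix, axis=1, keepdims=True)
    ref = endmembers / np.linalg.norm(endmembers, axis=1, keepdims=True)
    cosines = np.clip(pix @ ref.T, -1.0, 1.0)
    angles = np.arccos(cosines)                 # spectral angle per class
    return np.argmin(angles, axis=1).reshape(cube.shape[:2])

# Minimal usage with random numbers standing in for measured spectra.
cube = np.random.rand(50, 60, 128)
endmembers = np.random.rand(6, 128)             # e.g. six minerals of interest
class_map = spectral_angle_mapper(cube, endmembers)
```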

  16. Geologic Interpretation of Data Sets Collected by Planetary Analog Geology Traverses and by Standard Geologic Field Mapping. Part 1; A Comparison Study

    NASA Technical Reports Server (NTRS)

    Eppler, Dean B.; Bleacher, Jacob F.; Evans, Cynthia A.; Feng, Wanda; Gruener, John; Hurwitz, Debra M.; Skinner, J. A., Jr.; Whitson, Peggy; Janoiko, Barbara

    2013-01-01

    Geologic maps integrate the distributions, contacts, and compositions of rock and sediment bodies as a means to interpret local to regional formative histories. Applying terrestrial mapping techniques to other planets is challenging because data is collected primarily by orbiting instruments, with infrequent, spatially limited in situ human and robotic exploration. Although geologic maps developed using remote data sets and limited "Apollo-style" field access likely contain inaccuracies, the magnitude, type, and occurrence of these are only marginally understood. This project evaluates the interpretative and cartographic accuracy of both field- and remote-based mapping approaches by comparing two 1:24,000 scale geologic maps of the San Francisco Volcanic Field (SFVF), north-central Arizona. The first map is based on traditional field mapping techniques, while the second is based on remote data sets, augmented with limited field observations collected during NASA Desert Research & Technology Studies (RATS) 2010 exercises. The RATS mission used Apollo-style methods not only for pre-mission traverse planning but also to conduct geologic sampling as part of science operation tests. Cross-comparison demonstrates that the Apollo-style map identifies many of the same rock units and determines a similar broad history as the field-based map. However, field mapping techniques allow markedly improved discrimination of map units, particularly unconsolidated surficial deposits, and recognize a more complex eruptive history than was possible using Apollo-style data. Further, the distribution of unconsolidated surface units was more obvious in the remote sensing data to the field team after conducting the fieldwork. The study raises questions about the most effective approach to balancing mission costs with the rate of knowledge capture, suggesting that there is an inflection point in the "knowledge capture curve" beyond which additional resource investment yields progressively smaller gains in geologic knowledge.

  17. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    NASA Astrophysics Data System (ADS)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose a novel two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first-rank sampling plan is to inspect the lot consisting of map sheets, and the second is to inspect the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, which covers two lot size cases. The first case is for a small lot size with nonconformities being modeled by a hypergeometric distribution function, and the second is for a larger lot size with nonconformities being modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
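
    For the large-lot (Poisson) case, the optimization over sample size and acceptance number can be sketched as a search for the smallest plan meeting producer's and consumer's risk constraints; the AQL, limiting quality, and risk levels below are illustrative, and the small-lot hypergeometric case would follow the same pattern with scipy.stats.hypergeom. This is not the TRASP formulation itself, only the underlying single-plan search.

```python
from scipy import stats

def poisson_plan(aql, ltpd, alpha=0.05, beta=0.10, n_max=2000):
    """Smallest single-sampling plan (n, c) under a Poisson defect model:
    accept a lot at quality aql with prob >= 1-alpha, and a lot at the
    limiting quality ltpd with prob <= beta."""
    for n in range(1, n_max + 1):
        for c in range(0, n + 1):
            pa_good = stats.poisson.cdf(c, n * aql)   # P(accept | p = aql)
            pa_bad = stats.poisson.cdf(c, n * ltpd)   # P(accept | p = ltpd)
            if pa_good >= 1 - alpha and pa_bad <= beta:
                return n, c
    return None

# e.g. AQL = 1% nonconforming features, limiting quality = 5%
print(poisson_plan(0.01, 0.05))
```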

  18. Estimation of landslide-triggering factors using clay minerals, ASTER satellite image and GIS in the Busan area, southeastern Korea

    NASA Astrophysics Data System (ADS)

    Jeong, G. C.; Kim, M. G.; Choi, J. J.; Ryu, J. O.; Nho, J. G.; Choo, C. O.

    2016-12-01

    This study aims at estimating landslide-inducing factors such as extreme rainfall, slope, and geological factors in Busan city, southeastern Korea, using clay mineralogy, DM analysis and DB construction in order to develop landslide evaluation standards suitable for the country. GIS-based data collected from the study area include geological maps, topological maps, soil maps, forest maps and others in the DB construction. Data extraction and processing for landslide-inducing factors consist of expandable clay minerals identified using XRD, along with XRF and weathering sensitivity analysis and fundamental soil analysis on 38 bulk samples composed of weathered rocks and soils. Finally, landslide susceptibility maps were constructed using ArcGIS, together with ASTER satellite images for identifying clay minerals over regional areas, helpful for saving time and money. In Mt. Cheonma, 16 samples are composed of quartz, albite, illite, vermiculite, and kaolinite, with little difference in mineralogy. In Mt. Hwangryeong and Mt. Geumryeun, 12 samples consist of quartz, albite, illite, vermiculite, kaolinite and hornblende, with little difference in mineralogy. In Mt. Songhak, 10 samples are composed of quartz, illite, vermiculite, and kaolinite. Quartz, albite and illite are abundant in most samples, regardless of the sites studied. The IDW interpolation method was applied to the Busan area. The spatial grid resolution is 5 m × 5 m. In particular, illite content was used as the most effective landslide-inducing factor in the IDW interpolation and ASTER satellite images. In conclusion, susceptibility maps constructed using 16 layers, including illite content and weathering sensitivity, are in good accordance with the sites where landslides actually took place, showing that areas of high susceptibility are closely related to high landslide frequencies. This research was supported by the Public Welfare & Safety Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (grant number 2012M3A2A1050976)

  19. Using known map category marginal frequencies to improve estimates of thematic map accuracy

    NASA Technical Reports Server (NTRS)

    Card, D. H.

    1982-01-01

    By means of two simple sampling plans suggested in the accuracy-assessment literature, it is shown how one can use knowledge of map-category relative sizes to improve estimates of various probabilities. The fact that maximum likelihood estimates of cell probabilities for the simple random sampling and map category-stratified sampling were identical has permitted a unified treatment of the contingency-table analysis. A rigorous analysis of the effect of sampling independently within map categories is made possible by results for the stratified case. It is noted that such matters as optimal sample size selection for the achievement of a desired level of precision in various estimators are irrelevant, since the estimators derived are valid irrespective of how sample sizes are chosen.
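
    A sketch of how known map-category marginal proportions enter such estimates: cell probabilities from a within-category sample are weighted by the known category sizes, and overall, user's, and producer's accuracies follow. The confusion matrix and proportions below are hypothetical, and this is only the core weighting idea rather than the paper's full derivation.

```python
import numpy as np

# Hypothetical confusion matrix from samples taken within map categories:
# rows = map class, columns = reference (true) class.
n = np.array([[46,  3,  1],
              [ 4, 38,  3],
              [ 2,  5, 43]], dtype=float)

# Known relative sizes (marginal frequencies) of the map categories,
# e.g. obtained by counting the pixels of each class on the map.
pi = np.array([0.55, 0.30, 0.15])

# Cell probabilities weighted by the known map-category proportions.
p = pi[:, None] * n / n.sum(axis=1, keepdims=True)

overall_accuracy = np.trace(p)                    # P(map class == true class)
users_accuracy = np.diag(p) / p.sum(axis=1)       # per map class
producers_accuracy = np.diag(p) / p.sum(axis=0)   # per reference class
print(overall_accuracy, users_accuracy, producers_accuracy)
```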

  20. Improving estimates of genetic maps: a meta-analysis-based approach.

    PubMed

    Stewart, William C L

    2007-07-01

    Inaccurate genetic (or linkage) maps can reduce the power to detect linkage, increase type I error, and distort haplotype and relationship inference. To improve the accuracy of existing maps, I propose a meta-analysis-based method that combines independent map estimates into a single estimate of the linkage map. The method uses the variance of each independent map estimate to combine them efficiently, whether the map estimates use the same set of markers or not. As compared with a joint analysis of the pooled genotype data, the proposed method is attractive for three reasons: (1) it has comparable efficiency to the maximum likelihood map estimate when the pooled data are homogeneous; (2) relative to existing map estimation methods, it can have increased efficiency when the pooled data are heterogeneous; and (3) it avoids the practical difficulties of pooling human subjects data. On the basis of simulated data modeled after two real data sets, the proposed method can reduce the sampling variation of linkage maps commonly used in whole-genome linkage scans. Furthermore, when the independent map estimates are also maximum likelihood estimates, the proposed method performs as well as or better than when they are estimated by the program CRIMAP. Since variance estimates of maps may not always be available, I demonstrate the feasibility of three different variance estimators. Overall, the method should prove useful to investigators who need map positions for markers not contained in publicly available maps, and to those who wish to minimize the negative effects of inaccurate maps. Copyright 2007 Wiley-Liss, Inc.
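
    The combination step is essentially inverse-variance weighting of independent position estimates, as in the generic fixed-effect meta-analysis sketched below; the paper's handling of maps with non-overlapping marker sets and its variance estimators are not reproduced, and the numbers are hypothetical.

```python
import numpy as np

def combine_map_estimates(positions, variances):
    """Inverse-variance weighted combination of independent estimates of the
    same marker positions (in cM), with the variance of the combined estimate.

    positions: (k, m) array, k independent maps, m shared markers
    variances: (k, m) array of the corresponding variances
    """
    w = 1.0 / variances
    combined = np.sum(w * positions, axis=0) / np.sum(w, axis=0)
    combined_var = 1.0 / np.sum(w, axis=0)
    return combined, combined_var

# Two hypothetical map estimates of three markers (cM) and their variances
# (the first marker anchors the map at 0 cM with negligible variance).
pos = np.array([[0.0, 12.4, 30.1],
                [0.0, 11.8, 31.0]])
var = np.array([[1e-6, 0.9, 1.6],
                [1e-6, 0.4, 1.2]])
print(combine_map_estimates(pos, var))
```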

  1. Spatial Mapping of Organic Carbon in Returned Samples from Mars

    NASA Astrophysics Data System (ADS)

    Siljeström, S.; Fornaro, T.; Greenwalt, D.; Steele, A.

    2018-04-01

    Spatially mapping organic material to the minerals present in a sample will be essential for understanding the origin of any organics in samples returned from Mars. It will be shown how ToF-SIMS may be used to map organics in such samples.

  2. Calculation of three-dimensional, inviscid, supersonic, steady flows

    NASA Technical Reports Server (NTRS)

    Moretti, G.

    1981-01-01

    A detailed description of a computational program for the evaluation of three-dimensional supersonic, inviscid, steady flow past airplanes is presented. Emphasis was put on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and described. Results of computations based on sample geometries, together with discussion, are also presented.

  3. Limitations to mapping habitat-use areas in changing landscapes using the Mahalanobis distance statistic

    USGS Publications Warehouse

    Knick, Steven T.; Rotenberry, J.T.

    1998-01-01

    We tested the potential of a GIS mapping technique, using a resource selection model developed for black-tailed jackrabbits (Lepus californicus) and based on the Mahalanobis distance statistic, to track changes in shrubsteppe habitats in southwestern Idaho. If successful, the technique could be used to predict animal use areas, or those undergoing change, in different regions from the same selection function and variables without additional sampling. We determined the multivariate mean vector of 7 GIS variables that described habitats used by jackrabbits. We then ranked the similarity of all cells in the GIS coverage from their Mahalanobis distance to the mean habitat vector. The resulting map accurately depicted areas where we sighted jackrabbits on verification surveys. We then simulated an increase in shrublands (which are important habitats). Contrary to expectation, the new configurations were classified as lower similarity relative to the original mean habitat vector. Because the selection function is based on a unimodal mean, any deviation, even if biologically positive, creates larger Mahalanobis distances and lower similarity values. We recommend the Mahalanobis distance technique for mapping animal use areas when animals are distributed optimally, the landscape is well-sampled to determine the mean habitat vector, and distributions of the habitat variables do not change.
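
    The mapping step described above amounts to computing, for every GIS cell, the Mahalanobis distance of its habitat vector from the mean vector of cells where the animal was observed, as in this sketch; the seven variables and the toy data are placeholders for the study's GIS layers.

```python
import numpy as np

def mahalanobis_similarity(cells, used_cells):
    """Rank GIS cells by squared Mahalanobis distance to the mean habitat vector.

    cells:      (n_cells, n_vars) habitat variables for every map cell
    used_cells: (n_used, n_vars) variables for cells with animal observations
    returns:    squared distance per cell (smaller = more similar)
    """
    mu = used_cells.mean(axis=0)                    # mean habitat vector
    cov_inv = np.linalg.inv(np.cov(used_cells, rowvar=False))
    d = cells - mu
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)  # (x-mu)' S^-1 (x-mu)

# Toy example: 7 habitat variables, 10,000 map cells, 300 use locations.
rng = np.random.default_rng(1)
cells = rng.normal(size=(10_000, 7))
used = rng.normal(loc=0.5, size=(300, 7))
d2 = mahalanobis_similarity(cells, used)
similarity_rank = d2.argsort()                      # most similar cells first
```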

  4. Selecting the optimum plot size for a California design-based stream and wetland mapping program.

    PubMed

    Lackey, Leila G; Stein, Eric D

    2014-04-01

    Accurate estimates of the extent and distribution of wetlands and streams are the foundation of wetland monitoring, management, restoration, and regulatory programs. Traditionally, these estimates have relied on comprehensive mapping. However, this approach is prohibitively resource-intensive over large areas, making it both impractical and statistically unreliable. Probabilistic (design-based) approaches to evaluating status and trends provide a more cost-effective alternative because, compared with comprehensive mapping, overall extent is inferred from mapping a statistically representative, randomly selected subset of the target area. In this type of design, the size of sample plots has a significant impact on program costs and on statistical precision and accuracy; however, no consensus exists on the appropriate plot size for remote monitoring of stream and wetland extent. This study utilized simulated sampling to assess the performance of four plot sizes (1, 4, 9, and 16 km²) for three geographic regions of California. Simulation results showed smaller plot sizes (1 and 4 km²) were most efficient for achieving desired levels of statistical accuracy and precision. However, larger plot sizes were more likely to contain rare and spatially limited wetland subtypes. Balancing these considerations led to selection of 4 km² for the California status and trends program.

  5. Mapping local anisotropy axis for scattering media using backscattering Mueller matrix imaging

    NASA Astrophysics Data System (ADS)

    He, Honghui; Sun, Minghao; Zeng, Nan; Du, E.; Guo, Yihong; He, Yonghong; Ma, Hui

    2014-03-01

    Mueller matrix imaging techniques can be used to detect the micro-structure variations of superficial biological tissues, including the sizes and shapes of cells, the structures in cells, and the densities of the organelles. Many tissues contain anisotropic fibrous micro-structures, such as collagen fibers, elastin fibers, and muscle fibers. Changes of these fibrous structures are potentially good indicators for some pathological variations. In this paper, we propose a quantitative analysis technique based on the Mueller matrix for mapping the local anisotropy axis of scattering media. By conducting both experiments on a silk sample and Monte Carlo simulations based on the sphere-cylinder scattering model (SCSM), we extract anisotropy axis parameters from different backscattering Mueller matrix elements. Moreover, we test the possible applications of these parameters for biological tissues. The preliminary experimental results on human cancerous samples show that these parameters are capable of mapping the local axis of fibers. Since many pathological changes, including early stage cancers, affect the well-aligned fibrous structures of tissues, the experimental results indicate that these parameters can be used as potential tools in clinical applications for biomedical diagnosis purposes.

  6. High-resolution geologic mapping of the inner continental shelf: Nahant to Gloucester, Massachusetts

    USGS Publications Warehouse

    Barnhardt, Walter A.; Andrews, Brian D.; Butman, Bradford

    2006-01-01

    This report presents high-resolution maps of the seafloor offshore of Massachusetts, from Nahant to Gloucester. Approximately 134 km² of the inner shelf were mapped with a focus on the nearshore region in water depths less than 40 m (fig. 1.1). The maps were prepared as part of a cooperative mapping program between the U.S. Geological Survey (USGS) and the Massachusetts Office of Coastal Zone Management (CZM). They are based on marine geophysical data, sediment sampling, and bottom photography obtained on two research cruises carried out in 2003 and 2004. The primary objective of this program is to develop a suite of seafloor maps that provide geologic information for management of coastal and marine resources. Accurate maps of seafloor geology are important first steps toward protecting fish habitat, delineating marine reserves, and assessing environmental changes due to natural or human impacts. The maps also provide a geologic framework for scientific research, industry, and the public. The organization of this report is outlined in the navigation bar along the left-hand margin of the page. This is section 1, the introduction. Section 2 briefly describes the mapping products contained in this report and has links to large-format map sheets that can be viewed online or downloaded. Section 3 is a description of the data collection, processing, and analysis procedures used to create the map products. Section 4 examines the geologic framework and late Quaternary evolution of the region, and presents two different strategies for mapping the complex seafloor. This report also contains four appendices that include GIS layers of all data collected in this study, and copies of the sample and photographic data used to validate the interpretations.

  7. A single-image method for x-ray refractive index CT.

    PubMed

    Mittone, A; Gasilov, S; Brun, E; Bravin, A; Coan, P

    2015-05-07

    X-ray refraction-based computed tomography imaging is a well-established method for nondestructive investigations of various objects. In order to perform the 3D reconstruction of the index of refraction, two or more raw computed tomography phase-contrast images are usually acquired and combined to retrieve the refraction map (i.e. differential phase) signal within the sample. We suggest an approximate method to extract the refraction signal, which uses a single raw phase-contrast image. This method, here applied to analyzer-based phase-contrast imaging, is employed to retrieve the index of refraction map of a biological sample. The achieved accuracy in distinguishing the different tissues is comparable with that of the non-approximated approach. The suggested procedure can be used for precise refraction computed tomography with the advantage of a reduction of at least a factor of two in both the acquisition time and the dose delivered to the sample with respect to any of the other algorithms in the literature.

  8. A new surveillance and response tool: risk map of infected Oncomelania hupensis detected by Loop-mediated isothermal amplification (LAMP) from pooled samples.

    PubMed

    Tong, Qun-Bo; Chen, Rui; Zhang, Yi; Yang, Guo-Jing; Kumagai, Takashi; Furushima-Shimogawara, Rieko; Lou, Di; Yang, Kun; Wen, Li-Yong; Lu, Shao-Hong; Ohta, Nobuo; Zhou, Xiao-Nong

    2015-01-01

    Although schistosomiasis remains a serious health problem worldwide, significant achievements in schistosomiasis control have been made in the People's Republic of China. The disease has been eliminated in five out of 12 endemic provinces, and the prevalence in remaining endemic areas is very low and is heading toward elimination. A rapid and sensitive method for monitoring the distribution of infected Oncomelania hupensis is urgently required. We applied a loop-mediated isothermal amplification (LAMP) assay targeting 28S rDNA for the rapid and effective detection of Schistosoma japonicum DNA in infected and prepatent infected O. hupensis snails. The detection limit of the LAMP method was 100 fg of S. japonicum genomic DNA. To promote the application of the approach in the field, the LAMP assay was used to detect infection in pooled samples of field-collected snails. In the pooled sample detection, snails were collected from 28 endemic areas, and 50 snails from each area were pooled based on the maximum pool size estimation, crushed together, and DNA was extracted from each pooled sample as template for the LAMP assay. Based on the formula for detection from pooled samples, the proportion of positive pooled samples and the positive proportion of O. hupensis detected by LAMP of Xima village reached 66.67% and 1.33%, while those of Heini, Hongjia, Yangjiang and Huangshan villages were 33.33% and 0.67%, and those of Tuanzhou and Suliao villages were 16.67% and 0.33%, respectively. The remaining 21 monitoring field sites gave negative results. A risk map for the transmission of schistosomiasis was constructed using ArcMap, based on the positive proportion of O. hupensis infected with S. japonicum, as detected by the LAMP assay, which will form a guide for surveillance and response strategies in high risk areas. Copyright © 2014 Elsevier B.V. All rights reserved.
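
    The abstract does not spell out its pooled-sample formula, so the sketch below shows two common estimators for converting pool-level positivity into an individual-level proportion (a minimum infection rate and a maximum-likelihood estimate under perfect test accuracy); the example counts are chosen only to be of the same order as the village results and are illustrative.

```python
def pooled_prevalence(x_positive_pools, k_pools, pool_size):
    """Two common individual-level prevalence estimates from pooled testing."""
    pool_rate = x_positive_pools / k_pools
    # Minimum infection rate: assumes exactly one infected individual
    # per positive pool (a lower bound).
    mir = x_positive_pools / (k_pools * pool_size)
    # Maximum-likelihood estimate assuming perfect sensitivity and specificity.
    mle = 1.0 - (1.0 - pool_rate) ** (1.0 / pool_size)
    return pool_rate, mir, mle

# e.g. 2 positive pools out of 3 pools of 50 snails each
print(pooled_prevalence(2, 3, 50))   # ~ (0.667, 0.013, 0.022)
```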

  9. Distortion correction of echo planar images applying the concept of finite rate of innovation to point spread function mapping (FRIP).

    PubMed

    Nunes, Rita G; Hajnal, Joseph V

    2018-06-01

    Point spread function (PSF) mapping enables estimating the displacement fields required for distortion correction of echo planar images. Recently, a highly accelerated approach was introduced for estimating displacements from the phase slope of under-sampled PSF mapping data. Sampling schemes with varying spacing were proposed requiring stepwise phase unwrapping. To avoid unwrapping errors, an alternative approach applying the concept of finite rate of innovation to PSF mapping (FRIP) is introduced, using a pattern search strategy to locate the PSF peak, and the two methods are compared. Fully sampled PSF data was acquired in six subjects at 3.0 T, and distortion maps were estimated after retrospective under-sampling. The two methods were compared for both previously published and newly optimized sampling patterns. Prospectively under-sampled data were also acquired. Shift maps were estimated and deviations relative to the fully sampled reference map were calculated. The best performance was achieved when using FRIP with a previously proposed sampling scheme. The two methods were comparable for the remaining schemes. The displacement field errors tended to be lower as the number of samples or their spacing increased. A robust method for estimating the position of the PSF peak has been introduced.

  10. Estimating accuracy of land-cover composition from two-stage cluster sampling

    USGS Publications Warehouse

    Stehman, S.V.; Wickham, J.D.; Fattorini, L.; Wade, T.D.; Baffetta, F.; Smith, J.H.

    2009-01-01

    Land-cover maps are often used to compute land-cover composition (i.e., the proportion or percent of area covered by each class), for each unit in a spatial partition of the region mapped. We derive design-based estimators of mean deviation (MD), mean absolute deviation (MAD), root mean square error (RMSE), and correlation (CORR) to quantify accuracy of land-cover composition for a general two-stage cluster sampling design, and for the special case of simple random sampling without replacement (SRSWOR) at each stage. The bias of the estimators for the two-stage SRSWOR design is evaluated via a simulation study. The estimators of RMSE and CORR have small bias except when sample size is small and the land-cover class is rare. The estimator of MAD is biased for both rare and common land-cover classes except when sample size is large. A general recommendation is that rare land-cover classes require large sample sizes to ensure that the accuracy estimators have small bias. © 2009 Elsevier Inc.
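
    As a minimal sketch of the four accuracy measures (plug-in estimates for the equal-weight SRSWOR case, not the full design-based two-stage estimators derived in the paper), the following assumes map-derived and reference compositions are available for each sampled unit; the variable names are illustrative.

```python
import numpy as np

# Plug-in MD, MAD, RMSE and CORR for land-cover composition accuracy,
# given map-derived and reference class proportions for the sampled units.
def composition_accuracy(p_map, p_ref):
    p_map, p_ref = np.asarray(p_map, float), np.asarray(p_ref, float)
    d = p_map - p_ref
    return {
        "MD": d.mean(),                    # mean deviation (bias)
        "MAD": np.abs(d).mean(),           # mean absolute deviation
        "RMSE": np.sqrt((d ** 2).mean()),  # root mean square error
        "CORR": np.corrcoef(p_map, p_ref)[0, 1],
    }

print(composition_accuracy([0.12, 0.30, 0.05], [0.10, 0.34, 0.02]))
```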

  11. Bayesian geostatistics in health cartography: the perspective of malaria.

    PubMed

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.
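
    The conversion from a sample of posterior maps to a prediction of a regional average can be sketched in a few lines: each posterior map sample yields one value of the regional average, and the spread of those values gives the predictive precision. The array shapes and the Beta-distributed demo maps below are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch: `map_samples` holds posterior samples of a prevalence
# surface, shape (n_samples, ny, nx); `region_mask` marks the region of
# interest. One regional average is computed per posterior map sample.
def regional_average_prediction(map_samples, region_mask):
    averages = map_samples[:, region_mask].mean(axis=1)
    return {
        "mean": averages.mean(),
        "95% interval": np.percentile(averages, [2.5, 97.5]),
    }

rng = np.random.default_rng(0)
demo_maps = rng.beta(2, 8, size=(500, 20, 20))   # fake posterior map samples
mask = np.zeros((20, 20), bool)
mask[5:15, 5:15] = True
print(regional_average_prediction(demo_maps, mask))
```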

  12. Bayesian geostatistics in health cartography: the perspective of malaria

    PubMed Central

    Patil, Anand P.; Gething, Peter W.; Piel, Frédéric B.; Hay, Simon I.

    2011-01-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision. PMID:21420361

  13. Geochemical landscapes of the conterminous United States; new map presentations for 22 elements

    USGS Publications Warehouse

    Gustavsson, N.; Bolviken, B.; Smith, D.B.; Severson, R.C.

    2001-01-01

    Geochemical maps of the conterminous United States have been prepared for seven major elements (Al, Ca, Fe, K, Mg, Na, and Ti) and 15 trace elements (As, Ba, Cr, Cu, Hg, Li, Mn, Ni, Pb, Se, Sr, V, Y, Zn, and Zr). The maps are based on an ultra low-density geochemical survey consisting of 1,323 samples of soils and other surficial materials collected between approximately 1960 and 1975. The data were published by Boerngen and Shacklette (1981) and black-and-white point-symbol geochemical maps were published by Shacklette and Boerngen (1984). The data have been reprocessed using weighted-median and bootstrap procedures for interpolation and smoothing.

  14. Mapping risk of Nipah virus transmission across Asia and across Bangladesh.

    PubMed

    Peterson, A Townsend

    2015-03-01

    Nipah virus is a highly pathogenic but poorly known paramyxovirus from South and Southeast Asia. In spite of the risks that it poses to human health, the geography and ecology of its occurrence remain little understood: the virus is known chiefly from Bangladesh and peninsular Malaysia, with little documented in between. In this contribution, I use documented occurrences of the virus to develop ecological niche-based maps summarizing its likely broader occurrence. Although rangewide maps with significant predictive ability could not be developed, reflecting the minimal sample sizes available, maps within Bangladesh were quite successful in identifying areas in which the virus is predictably present and likely transmitted. © 2013 APJPH.

  15. BowMapCL: Burrows-Wheeler Mapping on Multiple Heterogeneous Accelerators.

    PubMed

    Nogueira, David; Tomas, Pedro; Roma, Nuno

    2016-01-01

    The computational demand of exact-search procedures has driven the adoption of parallel processing accelerators to reduce the execution time of many applications. However, this often imposes strict restrictions on problem size and implementation effort, mainly due to the accelerators' possibly distinct architectures. To circumvent this limitation, a new exact-search alignment tool (BowMapCL) based on the Burrows-Wheeler Transform and FM-Index is presented. In contrast to other alternatives, BowMapCL is based on a unified implementation using OpenCL, allowing the exploitation of multiple and possibly different devices (e.g., NVIDIA, AMD/ATI, and Intel GPUs/APUs). Furthermore, to efficiently exploit such heterogeneous architectures, BowMapCL incorporates several techniques to improve its performance and scalability, including multiple buffering, work-queue task distribution, and dynamic load-balancing, together with index partitioning, bit-encoding, and sampling. When compared with state-of-the-art tools, the results show that BowMapCL (using a single GPU) is 2× to 7.5× faster than mainstream multi-threaded CPU BWT-based aligners, like Bowtie, BWA, and SOAP2, and up to 4× faster than the best-performing state-of-the-art GPU implementations (namely, SOAP3 and HPG-BWT). When multiple and completely distinct devices are used, BowMapCL efficiently scales the delivered throughput, ensuring a convenient load balance across the devices.
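
    The core operation such BWT/FM-index aligners accelerate is backward search. The following is a textbook CPU sketch of that operation in plain Python, included only to illustrate the technique; it is not BowMapCL's OpenCL implementation and omits index partitioning, bit-encoding and sampling.

```python
# Textbook FM-index backward search: count exact occurrences of a pattern
# in a text using the BWT, the C table and prefix occurrence counts.
def bwt(text):
    text += "$"
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(rot[-1] for rot in rotations)

def fm_count(text, pattern):
    L = bwt(text)
    alphabet = sorted(set(L))
    # C[c]: number of characters in L strictly smaller than c
    C = {c: sum(L.count(d) for d in alphabet if d < c) for c in alphabet}
    # occ[c][i]: occurrences of c in L[:i]
    occ = {c: [0] for c in alphabet}
    for ch in L:
        for c in alphabet:
            occ[c].append(occ[c][-1] + (ch == c))
    lo, hi = 0, len(L)                  # current suffix-array interval
    for c in reversed(pattern):         # backward search, last symbol first
        if c not in C:
            return 0
        lo = C[c] + occ[c][lo]
        hi = C[c] + occ[c][hi]
        if lo >= hi:
            return 0
    return hi - lo                      # number of exact matches

print(fm_count("ACGTACGTACG", "ACG"))   # -> 3
```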

  16. Automated Plantation Mapping in Indonesia Using Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Karpatne, A.; Jia, X.; Khandelwal, A.; Kumar, V.

    2017-12-01

    Plantation mapping is critical for understanding and addressing deforestation, a key driver of climate change and ecosystem degradation. Unfortunately, most plantation maps are limited to small areas for specific years because they rely on visual inspection of imagery. In this work, we propose a data-driven approach that automatically generates yearly plantation maps for large regions using MODIS multi-spectral data. While traditional machine learning algorithms face manifold challenges in this task, e.g., imperfect training labels, spatio-temporal data heterogeneity, noisy and high-dimensional data, and lack of evaluation data, we introduce a novel deep learning-based framework that combines existing imperfect plantation products as training labels and models the spatio-temporal relationships of land covers. We also explore post-processing steps based on a Hidden Markov Model that further improve the detection accuracy. We then conduct an extensive evaluation of the generated plantation maps. Specifically, by randomly sampling and comparing with high-resolution Digital Globe imagery, we demonstrate that the generated plantation maps achieve both high precision and high recall. When compared with existing plantation mapping products, our detection avoids both false positives and false negatives. Finally, we utilize the generated plantation maps to analyze the relationship between forest fires and the growth of plantations, which assists in better understanding the causes of deforestation in Indonesia.
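
    A generic illustration of HMM-style temporal post-processing is sketched below: a yearly label sequence for one pixel is smoothed by Viterbi decoding under a "sticky" transition matrix, with the classifier's per-year confidences used as emissions. The transition probabilities and two-state setup are assumptions for illustration, not the paper's exact post-processing.

```python
import numpy as np

# Viterbi smoothing of a yearly label sequence with states
# {0: non-plantation, 1: plantation}; emission_probs has shape (years, states).
def viterbi_smooth(emission_probs, stay_prob=0.9):
    T, K = emission_probs.shape
    trans = np.full((K, K), (1 - stay_prob) / (K - 1))
    np.fill_diagonal(trans, stay_prob)
    log_e, log_t = np.log(emission_probs + 1e-12), np.log(trans)
    delta = np.zeros((T, K))
    psi = np.zeros((T, K), int)
    delta[0] = np.log(1.0 / K) + log_e[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_t    # entry [i, j]: from i to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_e[t]
    path = np.zeros(T, int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path

# yearly [P(non-plantation), P(plantation)] from a per-pixel classifier
probs = np.array([[0.8, 0.2], [0.6, 0.4], [0.3, 0.7], [0.9, 0.1], [0.2, 0.8]])
print(viterbi_smooth(probs))   # smoothed yearly labels (0/1 per year)
```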

  17. Analysis of ASTER data for mapping bauxite rich pockets within high altitude lateritic bauxite, Jharkhand, India

    NASA Astrophysics Data System (ADS)

    Guha, Arindam; Singh, Vivek Kr.; Parveen, Reshma; Kumar, K. Vinod; Jeyaseelan, A. T.; Dhanamjaya Rao, E. N.

    2013-04-01

    Bauxite deposits of Jharkhand, India, result from the lateritization process and are therefore often associated with laterites. In the present study, an ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) image is processed to delineate bauxite-rich pockets within the laterites. In this regard, spectral signatures of lateritic bauxite samples are analyzed in the laboratory with reference to the spectral features of gibbsite (the main mineral constituent of bauxite) and goethite (the main mineral constituent of laterite) in the VNIR-SWIR (visible-near infrared and shortwave infrared) electromagnetic domain. The analysis of the spectral signatures of the lateritic bauxite samples helps in understanding the differences between the spectral features of bauxites and laterites. Based on these differences, ASTER-based relative band depth and simple ratio images are derived for spatial mapping of the bauxites developed within the lateritic province. In order to integrate the complementary information of the different index images, an index-based principal component (IPC) image is derived to incorporate the correlative information of these indices and delineate bauxite-rich pockets. The occurrences of bauxite-rich pockets derived from the density-sliced IPC image are further delimited by topographic controls, as it has been observed that the major bauxite occurrences of the area are controlled by slope and altitude. In addition, the IPC image is draped over the digital elevation model (DEM) to illustrate how bauxite-rich pockets are distributed with reference to the topographic variability of the terrain. Bauxite-rich pockets delineated in the IPC image are also validated against known mine occurrences and the existing geological map of the bauxite. They are also conceptually validated based on the spectral similarity of the bauxite pixels delineated in the IPC image with the ASTER-convolved laboratory spectra of bauxite samples.
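
    The two index images mentioned above can be illustrated with a minimal numpy sketch: a relative band depth (RBD) contrasts an absorption band with the average of its two shoulder bands, and a simple ratio divides two bands. The band choices and reflectance values below are illustrative, not necessarily those used in the paper.

```python
import numpy as np

# Relative band depth and simple ratio images from (fake) ASTER SWIR bands.
def relative_band_depth(shoulder_left, absorption, shoulder_right):
    return (shoulder_left + shoulder_right) / (2.0 * absorption)

def simple_ratio(band_a, band_b):
    return band_a / band_b

rng = np.random.default_rng(1)
b4, b5, b6 = rng.uniform(0.1, 0.5, (3, 100, 100))  # stand-in reflectance rasters
rbd = relative_band_depth(b4, b5, b6)              # highlights an absorption feature (illustrative bands)
ratio = simple_ratio(b4, b6)
```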

  18. ACCELERATING MR PARAMETER MAPPING USING SPARSITY-PROMOTING REGULARIZATION IN PARAMETRIC DIMENSION

    PubMed Central

    Velikina, Julia V.; Alexander, Andrew L.; Samsonov, Alexey

    2013-01-01

    MR parameter mapping requires sampling along an additional (parametric) dimension, which often limits its clinical appeal due to a several-fold increase in scan times compared to conventional anatomic imaging. Data undersampling combined with parallel imaging is an attractive way to reduce scan time in such applications. However, inherent SNR penalties of parallel MRI due to noise amplification often limit its utility even at moderate acceleration factors, requiring regularization by prior knowledge. In this work, we propose a novel regularization strategy, which utilizes smoothness of signal evolution in the parametric dimension within a compressed sensing framework (p-CS) to provide accurate and precise estimation of parametric maps from undersampled data. The performance of the method was demonstrated with variable flip angle T1 mapping and compared favorably to two representative reconstruction approaches, image space-based total variation regularization and an analytical model-based reconstruction. The proposed p-CS regularization was found to provide efficient suppression of noise amplification and preservation of parameter mapping accuracy without explicit utilization of analytical signal models. The developed method may facilitate acceleration of quantitative MRI techniques that are not amenable to model-based reconstruction because of complex signal models or when signal deviations from the expected analytical model exist. PMID:23213053

  19. Spatial analysis of plutonium-239 + 240 and Americium-241 in soils around Rocky Flats, Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Litaor, M.I.

    1995-05-01

    Plutonium and americium contamination of soils around Rocky Flats, Colorado resulted from past outdoor storage practices. Four previous studies produced four different Pu isopleth maps. Spatial estimation techniques were not used in the construction of these maps, which were also based on an extremely small number of soil samples. The purpose of this study was to elucidate the magnitude of Pu-239 + 240 and Am-241 dispersion in the soil environment east of Rocky Flats using robust spatial estimation techniques. Soils were sampled from 118 plots of 1.01 and 4.05 ha by compositing 25 evenly spaced samples in each plot from the top 0.64 cm. Plutonium-239 + 240 activity ranged from 1.85 to 53 560 Bq/kg with a mean of 1924 Bq/kg and a standard deviation of 6327 Bq/kg. Americium-241 activity ranged from 0.18 to 9990 Bq/kg with a mean of 321 Bq/kg and a standard deviation of 1143 Bq/kg. Geostatistical techniques were used to model the spatial dependency and construct isopleth maps showing the Pu-239 + 240 and Am-241 distribution. The isopleth configuration was consistent with the hypothesis that the dominant dispersal mechanism of Pu-239 + 240 was wind dispersion from west to east. The Pu-239 + 240 isopleth map produced in this study differed significantly in the direction and distance of dispersal from the previously published maps. This isopleth map, as well as the Am-241 map, should be used as the primary data for future risk assessment associated with public exposure to Pu-239 + 240 and Am-241. 37 refs., 7 figs., 2 tabs.
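
    The first step in such a geostatistical workflow, modelling spatial dependency, usually starts from an empirical semivariogram. The sketch below is a plain isotropic estimator on synthetic data (118 plots, matching the study's sample size); it is not the robust estimator used by the author.

```python
import numpy as np

# Isotropic empirical semivariogram from sample coordinates `xy` (n x 2)
# and activities `z` (n,): the usual first step before fitting a variogram
# model and kriging an isopleth map.
def empirical_semivariogram(xy, z, n_bins=12):
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    i, j = np.triu_indices(len(z), k=1)
    h = np.hypot(*(xy[i] - xy[j]).T)           # pair separation distances
    sq = 0.5 * (z[i] - z[j]) ** 2              # semivariance per pair
    edges = np.linspace(0, h.max(), n_bins + 1)
    which = np.digitize(h, edges[1:-1])
    gamma = np.array([sq[which == b].mean() if np.any(which == b) else np.nan
                      for b in range(n_bins)])
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, gamma

rng = np.random.default_rng(2)
coords = rng.uniform(0, 1000, (118, 2))        # 118 plots (synthetic coordinates, m)
activity = rng.lognormal(5, 1.5, 118)          # skewed activities (Bq/kg)
lags, gamma = empirical_semivariogram(coords, activity)
```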

  20. Kite aerial photography for low-cost, ultra-high spatial resolution multi-spectral mapping of intertidal landscapes.

    PubMed

    Bryson, Mitch; Johnson-Roberson, Matthew; Murphy, Richard J; Bongiorno, Daniel

    2013-01-01

    Intertidal ecosystems have primarily been studied using field-based sampling; remote sensing offers the ability to collect data over large areas in a snapshot of time that could complement field-based sampling methods by extrapolating them into the wider spatial and temporal context. Conventional remote sensing tools (such as satellite and aircraft imaging) provide data at limited spatial and temporal resolutions and relatively high costs for small-scale environmental science and ecologically-focussed studies. In this paper, we describe a low-cost, kite-based imaging system and photogrammetric/mapping procedure that was developed for constructing high-resolution, three-dimensional, multi-spectral terrain models of intertidal rocky shores. The processing procedure uses automatic image feature detection and matching, structure-from-motion and photo-textured terrain surface reconstruction algorithms that require minimal human input and only a small number of ground control points and allow the use of cheap, consumer-grade digital cameras. The resulting maps combine imagery at visible and near-infrared wavelengths and topographic information at sub-centimeter resolutions over an intertidal shoreline 200 m long, thus enabling spatial properties of the intertidal environment to be determined across a hierarchy of spatial scales. Results of the system are presented for an intertidal rocky shore at Jervis Bay, New South Wales, Australia. Potential uses of this technique include mapping of plant (micro- and macro-algae) and animal (e.g. gastropods) assemblages at multiple spatial and temporal scales.

  1. Kite Aerial Photography for Low-Cost, Ultra-high Spatial Resolution Multi-Spectral Mapping of Intertidal Landscapes

    PubMed Central

    Bryson, Mitch; Johnson-Roberson, Matthew; Murphy, Richard J.; Bongiorno, Daniel

    2013-01-01

    Intertidal ecosystems have primarily been studied using field-based sampling; remote sensing offers the ability to collect data over large areas in a snapshot of time that could complement field-based sampling methods by extrapolating them into the wider spatial and temporal context. Conventional remote sensing tools (such as satellite and aircraft imaging) provide data at limited spatial and temporal resolutions and relatively high costs for small-scale environmental science and ecologically-focussed studies. In this paper, we describe a low-cost, kite-based imaging system and photogrammetric/mapping procedure that was developed for constructing high-resolution, three-dimensional, multi-spectral terrain models of intertidal rocky shores. The processing procedure uses automatic image feature detection and matching, structure-from-motion and photo-textured terrain surface reconstruction algorithms that require minimal human input and only a small number of ground control points and allow the use of cheap, consumer-grade digital cameras. The resulting maps combine imagery at visible and near-infrared wavelengths and topographic information at sub-centimeter resolutions over an intertidal shoreline 200 m long, thus enabling spatial properties of the intertidal environment to be determined across a hierarchy of spatial scales. Results of the system are presented for an intertidal rocky shore at Jervis Bay, New South Wales, Australia. Potential uses of this technique include mapping of plant (micro- and macro-algae) and animal (e.g. gastropods) assemblages at multiple spatial and temporal scales. PMID:24069206

  2. Geochemical, aeromagnetic, and generalized geologic maps showing distribution and abundance of molybdenum and zinc, Golconda and Iron Point quadrangles, Humboldt County, Nevada

    USGS Publications Warehouse

    Erickson, R.L.; Marsh, S.P.

    1972-01-01

    This series of maps shows the distribution and abundance of mercury, arsenic, antimony, tungsten, gold, copper, lead, and silver related to a geologic and aeromagnetic base in the Golconda and Iron Point 7½-minute quadrangles. All samples are rock samples; most are from shear or fault zones, fractures, jasperoid, breccia reefs, and altered rocks. All the samples were prepared and analyzed in truck-mounted laboratories at Winnemucca, Nevada. Arsenic, tungsten, copper, lead, and silver were determined by semiquantitative spectrographic methods by D.F. Siems and E.F. Cooley. Mercury and gold were determined by atomic absorption methods and antimony was determined by wet chemical methods by R.M. O'Leary, M.S. Erickson, and others.

  3. Fine-Scale Map of Encyclopedia of DNA Elements Regions in the Korean Population

    PubMed Central

    Yoo, Yeon-Kyeong; Ke, Xiayi; Hong, Sungwoo; Jang, Hye-Yoon; Park, Kyunghee; Kim, Sook; Ahn, TaeJin; Lee, Yeun-Du; Song, Okryeol; Rho, Na-Young; Lee, Moon Sue; Lee, Yeon-Su; Kim, Jaeheup; Kim, Young J.; Yang, Jun-Mo; Song, Kyuyoung; Kimm, Kyuchan; Weir, Bruce; Cardon, Lon R.; Lee, Jong-Eun; Hwang, Jung-Joo

    2006-01-01

    The International HapMap Project aims to generate detailed human genome variation maps by densely genotyping single-nucleotide polymorphisms (SNPs) in CEPH, Chinese, Japanese, and Yoruba samples. This will undoubtedly become an important facility for genetic studies of diseases and complex traits in the four populations. To address how the genetic information contained in such variation maps is transferable to other populations, the Korean government, industries, and academics have launched the Korean HapMap project to genotype high-density Encyclopedia of DNA Elements (ENCODE) regions in 90 Korean individuals. Here we show that the LD pattern, block structure, haplotype diversity, and recombination rate are highly concordant between Korean and the two HapMap Asian samples, particularly Japanese. The availability of information from both Chinese and Japanese samples helps to predict more accurately the possible performance of HapMap markers in Korean disease-gene studies. Tagging SNPs selected from the two HapMap Asian maps, especially the Japanese map, were shown to be very effective for Korean samples. These results demonstrate that the HapMap variation maps are robust in related populations and will serve as an important resource for the studies of the Korean population in particular. PMID:16702437

  4. Detection of Mycobacterium avium subspecies paratuberculosis specific IS900 insertion sequences in bulk-tank milk samples obtained from different regions throughout Switzerland

    PubMed Central

    Corti, Sabrina; Stephan, Roger

    2002-01-01

    Background Since Mycobacterium avium subspecies paratuberculosis (MAP) was isolated from intestinal tissue of a human patient suffering Crohn's disease, a controversial discussion exists whether MAP have a role in the etiology of Crohn's disease or not. Raw milk may be a potential vehicle for the transmission of MAP to human population. In a previous paper, we have demonstrated that MAP are found in raw milk samples obtained from a defined region in Switzerland. The aim of this work is to collect data about the prevalence of MAP specific IS900 insertion sequence in bulk-tank milk samples in different regions of Switzerland. Furthermore, we examined eventual correlation between the presence of MAP and the somatic cell counts, the total colony counts and the presence of Enterobacteriaceae. Results 273 (19.7%) of the 1384 examined bulk-tank milk samples tested IS900 PCR-positive. The prevalence, however, in the different regions of Switzerland shows significant differences and ranged from 1.7% to 49.2%. Furthermore, there were no statistically significant (p >> 0.05) differences between the somatic cell counts and the total colony counts of PCR-positive and PCR-negative milk samples. Enterobacteriaceae occur as often in IS900 PCR-positive as in PCR-negative milk samples. Conclusion This is the first study, which investigates the prevalence of MAP in bulk-tank milk samples all over Switzerland and infers the herd-level prevalence of MAP infection in dairy herds. The prevalence of 19.7% IS900 PCR-positive bulk-milk samples shows a wide distribution of subclinical MAP-infections in dairy stock in Switzerland. MAP can therefore often be transmitted to humans by raw milk consumption. PMID:12097144

  5. Geochemical surveys in the United States in relation to health.

    USGS Publications Warehouse

    Tourtelot, H.A.

    1979-01-01

    Geochemical surveys in relation to health may be classified as having one, two or three dimensions. One-dimensional surveys examine relations between concentrations of elements such as Pb in soils and other media and burdens of the same elements in humans, at a given time. The spatial distributions of element concentrations are not investigated. The primary objective of two-dimensional surveys is to map the distributions of element concentrations, commonly according to stratified random sampling designs based on either conceptual landscape units or artificial sampling strata, but systematic sampling intervals have also been used. Political units have defined sample areas that coincide with the units used to accumulate epidemiological data. Element concentrations affected by point sources have also been mapped. Background values, location of natural or technological anomalies and the geographic scale of variation for several elements often are determined. Three-dimensional surveys result when two-dimensional surveys are repeated to detect environmental changes. -Author

  6. GESFIDE-PROPELLER approach for simultaneous R2 and R2* measurements in the abdomen.

    PubMed

    Jin, Ning; Guo, Yang; Zhang, Zhuoli; Zhang, Longjiang; Lu, Guangming; Larson, Andrew C

    2013-12-01

    To investigate the feasibility of combining GESFIDE with PROPELLER sampling approaches for simultaneous abdominal R2 and R2* mapping. R2 and R2* measurements were performed in 9 healthy volunteers and phantoms using the GESFIDE-PROPELLER and the conventional Cartesian-sampling GESFIDE approaches. Images acquired with the GESFIDE-PROPELLER sequence effectively mitigated the respiratory motion artifacts, which were clearly evident in the images acquired using the conventional GESFIDE approach. There was no significant difference between GESFIDE-PROPELLER and reference MGRE R2* measurements (p=0.162) whereas the Cartesian-sampling based GESFIDE methods significantly overestimated R2* values compared to MGRE measurements (p<0.001). The GESFIDE-PROPELLER sequence provided high quality images and accurate abdominal R2 and R2* maps while avoiding the motion artifacts common to the conventional Cartesian-sampling GESFIDE approaches. © 2013 Elsevier Inc. All rights reserved.
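
    Independently of the sampling trajectory, the per-voxel parameter estimation can be sketched with a generic mono-exponential model, S(TE) = S0·exp(-R2*·TE), fitted log-linearly; this is an illustrative stand-in, not the GESFIDE-PROPELLER reconstruction itself, and the echo times below are assumptions.

```python
import numpy as np

# Log-linear least-squares R2* fit per voxel from multi-echo magnitude images.
def fit_r2star(echo_images, te_seconds):
    te = np.asarray(te_seconds, float)                    # (n_echoes,)
    logs = np.log(np.maximum(echo_images, 1e-6))          # (n_echoes, ny, nx)
    A = np.column_stack([np.ones_like(te), -te])          # log S = log S0 - R2* * TE
    coef, *_ = np.linalg.lstsq(A, logs.reshape(len(te), -1), rcond=None)
    s0 = np.exp(coef[0]).reshape(echo_images.shape[1:])
    r2star = coef[1].reshape(echo_images.shape[1:])       # in 1/s
    return s0, r2star

te = [0.002, 0.005, 0.010, 0.020]                          # echo times (s), illustrative
truth = 80.0                                               # 1/s
sim = 100 * np.exp(-truth * np.asarray(te))[:, None, None] * np.ones((1, 8, 8))
_, r2map = fit_r2star(sim, te)                             # ~80 1/s everywhere
```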

  7. GESFIDE-PROPELLER Approach for Simultaneous R2 and R2* Measurements in the Abdomen

    PubMed Central

    Jin, Ning; Guo, Yang; Zhang, Zhuoli; Zhang, Longjiang; Lu, Guangming; Larson, Andrew C.

    2013-01-01

    Purpose To investigate the feasibility of combining GESFIDE with PROPELLER sampling approaches for simultaneous abdominal R2 and R2* mapping. Materials and Methods R2 and R2* measurements were performed in 9 healthy volunteers and phantoms using the GESFIDE-PROPELLER and the conventional Cartesian-sampling GESFIDE approaches. Results Images acquired with the GESFIDE-PROPELLER sequence effectively mitigated the respiratory motion artifacts, which were clearly evident in the images acquired using the conventional GESFIDE approach. There was no significant difference between GESFIDE-PROPELLER and reference MGRE R2* measurements (p = 0.162) whereas the Cartesian-sampling based GESFIDE methods significantly overestimated R2* values compared to MGRE measurements (p < 0.001). Conclusion The GESFIDE-PROPELLER sequence provided high quality images and accurate abdominal R2 and R2* maps while avoiding the motion artifacts common to the conventional Cartesian-sampling GESFIDE approaches. PMID:24041478

  8. A new method for automatic discontinuity traces sampling on rock mass 3D model

    NASA Astrophysics Data System (ADS)

    Umili, G.; Ferrero, A.; Einstein, H. H.

    2013-02-01

    A new automatic method for discontinuity trace mapping and sampling on a rock mass digital model is described in this work. The implemented procedure allows one to automatically identify discontinuity traces on a Digital Surface Model: traces are detected directly as surface breaklines, by means of maximum and minimum principal curvature values of the vertices that constitute the model surface. Color influence and user errors, which usually characterize trace mapping on images, are eliminated. Trace sampling procedures based on circular windows and circular scanlines have also been implemented: they are used to infer trace data and to calculate values of mean trace length, expected discontinuity diameter and intensity of rock discontinuities. The method is tested on a case study: results obtained by applying the automatic procedure on the DSM of a rock face are compared to those obtained by performing a manual sampling on the orthophotograph of the same rock face.
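
    The circular-window quantities mentioned above can be sketched with Mauldon-type estimators under assumed notation (n = number of trace intersections with a circle of radius c, m = number of trace endpoints inside it); this is an illustrative formulation, not the authors' implementation.

```python
import math

# Circular-scanline estimators of trace intensity, density and mean trace length.
def circular_window_estimates(n_intersections, m_endpoints, radius):
    intensity = n_intersections / (4.0 * radius)             # trace length per unit area (P21)
    density = m_endpoints / (2.0 * math.pi * radius ** 2)    # trace centers per unit area (P20)
    mean_trace_length = (math.pi * radius / 2.0) * n_intersections / m_endpoints
    return intensity, density, mean_trace_length

print(circular_window_estimates(n_intersections=26, m_endpoints=14, radius=3.0))
```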

  9. Statistical power calculations for mixed pharmacokinetic study designs using a population approach.

    PubMed

    Kloprogge, Frank; Simpson, Julie A; Day, Nicholas P J; White, Nicholas J; Tarning, Joel

    2014-09-01

    Simultaneous modelling of dense and sparse pharmacokinetic data is possible with a population approach. To determine the number of individuals required to detect the effect of a covariate, simulation-based power calculation methodologies can be employed. The Monte Carlo Mapped Power method (a simulation-based power calculation methodology using the likelihood ratio test) was extended in the current study to perform sample size calculations for mixed pharmacokinetic studies (i.e. both sparse and dense data collection). A workflow guiding straightforward pharmacokinetic study design, which also considers the cost-effectiveness of alternative study designs, was used in this analysis. Initially, data were simulated for a hypothetical drug and then for the anti-malarial drug, dihydroartemisinin. Two datasets (sampling design A: dense; sampling design B: sparse) were simulated using a pharmacokinetic model that included a binary covariate effect and subsequently re-estimated using (1) the same model and (2) a model not including the covariate effect in NONMEM 7.2. Power calculations were performed for varying numbers of patients with sampling designs A and B. Study designs with statistical power >80% were selected and further evaluated for cost-effectiveness. The simulation studies of the hypothetical drug and the anti-malarial drug dihydroartemisinin demonstrated that the simulation-based power calculation methodology, based on the Monte Carlo Mapped Power method, can be utilised to evaluate and determine the sample size of mixed (part sparsely and part densely sampled) study designs. The developed method can contribute to the design of robust and efficient pharmacokinetic studies.
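
    The general idea of simulation-based power with a likelihood ratio test can be illustrated with a deliberately simple toy model (a Gaussian linear model, not NONMEM or the Monte Carlo Mapped Power method itself): simulate a binary covariate effect, fit full and reduced models, and count how often the LRT exceeds the 5% chi-square cutoff. All effect sizes and model choices below are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def lrt_power(n_subjects, effect=0.3, sd=0.5, n_sim=2000, seed=0):
    rng = np.random.default_rng(seed)
    crit = stats.chi2.ppf(0.95, df=1)
    hits = 0
    for _ in range(n_sim):
        cov = rng.integers(0, 2, n_subjects)              # binary covariate
        y = effect * cov + rng.normal(0, sd, n_subjects)  # e.g. a log-scale PK parameter
        X_full = np.column_stack([np.ones(n_subjects), cov])
        X_red = np.ones((n_subjects, 1))
        rss_full = np.sum((y - X_full @ np.linalg.lstsq(X_full, y, rcond=None)[0]) ** 2)
        rss_red = np.sum((y - X_red @ np.linalg.lstsq(X_red, y, rcond=None)[0]) ** 2)
        lrt = n_subjects * np.log(rss_red / rss_full)     # asymptotic LRT statistic
        hits += lrt > crit
    return hits / n_sim

for n in (20, 40, 80):
    print(n, lrt_power(n))    # power rises with the number of subjects
```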

  10. The new Brazilian national forest inventory

    Treesearch

    Joberto V. de Freitas; Yeda M. M. de Oliveira; Doadi A. Brena; Guilherme L.A. Gomide; Jose Arimatea Silva; et al.

    2009-01-01

    The new Brazilian national forest inventory (NFI) is planned to be carried out through five components: (1) general coordination, led by the Brazilian Forest Service; (2) vegetation mapping, which will serve as the basis for sample plot location; (3) field data collection; (4) landscape data collection of 10 x 10-km sample plots, based on high-resolution...

  11. Accuracy and suitability of selected sampling methods within conifer dominated riparian zones

    Treesearch

    Theresa Marquardt; Hailemariam Temesgen; Paul D. Anderson

    2010-01-01

    Sixteen sampling alternatives were examined for their performance in quantifying selected attributes of overstory conifers in riparian areas of western Oregon. Each alternative was examined at eight headwater forest locations based on 0.52 ha square stem maps. The alternatives were evaluated for selected stand attributes (trees per hectare, basal area per hectare, and...

  12. Two States Mapping Based Time Series Neural Network Model for Compensation Prediction Residual Error

    NASA Astrophysics Data System (ADS)

    Jung, Insung; Koo, Lockjo; Wang, Gi-Nam

    2008-11-01

    The objective of this paper was to design a human bio-signal data prediction system that decreases prediction error by using a two-states-mapping-based time series neural network BP (back-propagation) model. Neural network models trained in a supervised manner with the error back-propagation algorithm have been widely applied in industry for time series prediction. However, such models still leave a residual error between the real value and the prediction result. We therefore designed a two-states neural network model that compensates for this residual error, which could be applied to the prevention of sudden death and of metabolic syndrome diseases such as hypertension and obesity. Most of the simulation cases were handled satisfactorily by the two-states-mapping-based time series prediction model. In particular, for small time-series sample sizes it was more accurate than the standard MLP model.
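
    The residual-compensation idea can be sketched with two simple autoregressive models standing in for the two neural networks (an interpretation of the two-state scheme, with an assumed lag length; not the paper's network architecture): a primary model predicts the series, a second model is fitted to the primary model's residual error, and the final prediction adds the estimated residual back.

```python
import numpy as np

def make_lagged(series, lags):
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    return X, series[lags:]

def ar_fit_predict(X, y):
    A = np.c_[X, np.ones(len(X))]
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ w

def two_stage_predict(series, lags=3):
    X, y = make_lagged(series, lags)        # stage 1: model the raw series
    primary = ar_fit_predict(X, y)
    resid = y - primary
    Xr, yr = make_lagged(resid, lags)       # stage 2: model the residual error
    resid_hat = ar_fit_predict(Xr, yr)
    compensated = primary[lags:] + resid_hat   # first `lags` residuals serve only as inputs
    return compensated, y[lags:]

rng = np.random.default_rng(3)
t = np.arange(300)
signal = np.sin(0.1 * t) + 0.1 * rng.standard_normal(300)   # stand-in bio-signal
pred, target = two_stage_predict(signal)
print("in-sample RMSE:", np.sqrt(np.mean((pred - target) ** 2)))
```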

  13. A method to estimate the effect of deformable image registration uncertainties on daily dose mapping

    PubMed Central

    Murphy, Martin J.; Salguero, Francisco J.; Siebers, Jeffrey V.; Staub, David; Vaman, Constantin

    2012-01-01

    Purpose: To develop a statistical sampling procedure for spatially-correlated uncertainties in deformable image registration and then use it to demonstrate their effect on daily dose mapping. Methods: Sequential daily CT studies are acquired to map anatomical variations prior to fractionated external beam radiotherapy. The CTs are deformably registered to the planning CT to obtain displacement vector fields (DVFs). The DVFs are used to accumulate the dose delivered each day onto the planning CT. Each DVF has spatially-correlated uncertainties associated with it. Principal components analysis (PCA) is applied to measured DVF error maps to produce decorrelated principal component modes of the errors. The modes are sampled independently and reconstructed to produce synthetic registration error maps. The synthetic error maps are convolved with dose mapped via deformable registration to model the resulting uncertainty in the dose mapping. The results are compared to the dose mapping uncertainty that would result from uncorrelated DVF errors that vary randomly from voxel to voxel. Results: The error sampling method is shown to produce synthetic DVF error maps that are statistically indistinguishable from the observed error maps. Spatially-correlated DVF uncertainties modeled by our procedure produce patterns of dose mapping error that are different from that due to randomly distributed uncertainties. Conclusions: Deformable image registration uncertainties have complex spatial distributions. The authors have developed and tested a method to decorrelate the spatial uncertainties and make statistical samples of highly correlated error maps. The sample error maps can be used to investigate the effect of DVF uncertainties on daily dose mapping via deformable image registration. An initial demonstration of this methodology shows that dose mapping uncertainties can be sensitive to spatial patterns in the DVF uncertainties. PMID:22320766
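
    The sampling step described in the Methods can be sketched with numpy's SVD standing in for a PCA toolbox (array shapes assumed): observed DVF error maps are flattened, decomposed into principal modes, and synthetic error maps are built by independently resampling the decorrelated mode coefficients.

```python
import numpy as np

# Build synthetic, spatially-correlated DVF error maps from observed ones.
def synthetic_error_maps(error_maps, n_synthetic=10, seed=0):
    X = error_maps.reshape(len(error_maps), -1)      # (n_obs, n_voxels * 3)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    scores = U * s                                   # per-map mode coefficients
    rng = np.random.default_rng(seed)
    # sample each mode independently with its observed standard deviation
    new_scores = rng.standard_normal((n_synthetic, len(s))) * scores.std(axis=0)
    synthetic = mean + new_scores @ Vt
    return synthetic.reshape((n_synthetic,) + error_maps.shape[1:])

rng = np.random.default_rng(4)
observed = rng.normal(0, 2.0, size=(12, 32, 32, 3))  # 12 observed DVF error maps (mm), fake
samples = synthetic_error_maps(observed, n_synthetic=5)
```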

  14. Optimisation of decontamination method and influence of culture media on the recovery of Mycobacterium avium subspecies paratuberculosis from spiked water sediments.

    PubMed

    Aboagye, G; Rowe, M T

    2018-07-01

    The recovery of Mycobacterium avium subspecies paratuberculosis (Map) from the environment can be a laborious process, owing to Map being fastidious, present in low numbers, and accompanied by high numbers of other microorganisms in such settings. Protocols, i.e. filtration, decontamination and modified elution, were devised to recover Map from spiked water sediments. Three culture media, Herrold's Egg Yolk Medium (HEYM), Middlebrook 7H10 (M-7H10) and Bactec 12B, were then employed to grow the organism following its elution. In the sterile sediment samples, the recovery of Map differed significantly between exposure times for each of HEYM and M-7H10, but not between the two media (P < 0.05). However, in the non-sterile sediment samples, HEYM grew other background microflora, including moulds, at all exposure times, whilst 4 h of exposure followed by M-7H10 culture yielded Map colonies without any background microflora. Using sterile samples only for the Bactec 12B medium, the recovery of Map decreased as the time of exposure increased. Based on these findings, M-7H10 should be considered for the recovery of Map from the natural environment, including water sediments, where the recovery of diverse microbial species remains a challenge. Map is a robust pathogen that persists in the environment. In water treatment operations, Map associates with floccules and other particulate matter, including sediments. It is also a fastidious organism, and its detection and recovery from the water environment is a laborious process that can be confounded by the abundance of other mycobacterial species owing to their close phylogenetic resemblance. In the absence of a reliable recovery method, Map continues to pose public health risks through biofilms in household water tanks, hence the need for a reliable recovery protocol to monitor the presence of Map in water systems and curtail its public health risks. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. EnviroAtlas -Milwaukee, WI- One Meter Resolution Urban Land Cover Data (2010) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Milwaukee, WI land cover data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Nine land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 85.39 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Milwaukee. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-
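
    The kind of accuracy assessment quoted here (overall accuracy from a random validation sample) can be sketched as follows; the confusion matrix is a toy example and the simple binomial interval is not the EnviroAtlas protocol itself.

```python
import numpy as np

# Overall accuracy from a confusion matrix of validation points, with a
# normal-approximation 95% confidence interval for a simple random sample.
def overall_accuracy(confusion, z=1.96):
    confusion = np.asarray(confusion, float)
    n = confusion.sum()
    p = np.trace(confusion) / n                 # agreement on the diagonal
    half_width = z * np.sqrt(p * (1 - p) / n)
    return p, (p - half_width, p + half_width)

# rows: reference class, columns: mapped class (toy 3-class example, n = 600)
cm = [[180, 12, 8],
      [15, 160, 10],
      [5, 9, 201]]
acc, ci = overall_accuracy(cm)
print(f"overall accuracy = {acc:.2%}, 95% CI = ({ci[0]:.2%}, {ci[1]:.2%})")
```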

  16. EnviroAtlas -- Woodbine, IA -- One Meter Resolution Urban Land Cover Data (2011) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Woodbine, IA land cover (LC) data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2011 at 1 m spatial resolution. Six land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, and agriculture. An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 87.03 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Woodbine. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  17. EnviroAtlas -Portland, ME- One Meter Resolution Urban Land Cover (2010) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The Portland, ME land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Nine land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a stratified random sampling of 600 samples yielded an overall accuracy of 87.5 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Portland. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  18. The Mapping X-ray Fluorescence Spectrometer (MapX)

    NASA Astrophysics Data System (ADS)

    Sarrazin, P.; Blake, D. F.; Marchis, F.; Bristow, T.; Thompson, K.

    2017-12-01

    Many planetary surface processes leave traces of their actions as features in the size range 10s to 100s of microns. The Mapping X-ray Fluorescence Spectrometer (MapX) will provide elemental imaging at 100 micron spatial resolution, yielding elemental chemistry at a scale where many relict physical, chemical, or biological features can be imaged and interpreted in ancient rocks on planetary bodies and planetesimals. MapX is an arm-based instrument positioned on a rock or regolith with touch sensors. During an analysis, an X-ray source (tube or radioisotope) bombards the sample with X-rays or alpha-particles / gamma-rays, resulting in sample X-ray Fluorescence (XRF). X-rays emitted in the direction of an X-ray sensitive CCD imager pass through a 1:1 focusing lens (X-ray micro-pore Optic (MPO)) that projects a spatially resolved image of the X-rays onto the CCD. The CCD is operated in single photon counting mode so that the energies and positions of individual X-ray photons are recorded. In a single analysis, several thousand frames are both stored and processed in real-time. Higher level data products include single-element maps with a lateral spatial resolution of 100 microns and quantitative XRF spectra from ground- or instrument- selected Regions of Interest (ROI). XRF spectra from ROI are compared with known rock and mineral compositions to extrapolate the data to rock types and putative mineralogies. When applied to airless bodies and implemented with an appropriate radioisotope source for alpha-particle excitation, MapX will be able to analyze biogenic elements C, N, O, P, S, in addition to the cations of the rock-forming elements >Na, accessible with either X-ray or gamma-ray excitation. The MapX concept has been demonstrated with a series of lab-based prototypes and is currently under refinement and TRL maturation.
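
    The single-photon-counting step can be illustrated with a short sketch that bins recorded events (x, y, energy) into per-element count maps using energy windows; the element lines and window widths below are approximate and illustrative, not the MapX calibration.

```python
import numpy as np

ENERGY_WINDOWS_KEV = {       # approximate K-alpha lines +/- 0.1 keV (illustrative)
    "Si": (1.64, 1.84),
    "Ca": (3.59, 3.79),
    "Fe": (6.30, 6.50),
}

def element_maps(events, shape=(100, 100)):
    # events: iterable of (x, y, energy_keV) single-photon detections
    maps = {el: np.zeros(shape, int) for el in ENERGY_WINDOWS_KEV}
    for x, y, e in events:
        for el, (lo, hi) in ENERGY_WINDOWS_KEV.items():
            if lo <= e <= hi:
                maps[el][int(y), int(x)] += 1
                break
    return maps

rng = np.random.default_rng(5)
demo_events = np.column_stack([rng.uniform(0, 100, 5000),    # x (pixels)
                               rng.uniform(0, 100, 5000),    # y (pixels)
                               rng.uniform(1.0, 7.0, 5000)]) # energy (keV)
maps = element_maps(demo_events)
```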

  19. Vegetation and terrain mapping in Alaska using Landsat MSS and digital terrain data

    USGS Publications Warehouse

    Shasby, Mark; Carneggie, David M.

    1986-01-01

    During the past 5 years, the U.S. Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center Field Office in Anchorage, Alaska has worked cooperatively with Federal and State resource management agencies to produce land-cover and terrain maps for 245 million acres of Alaska. The need for current land-cover information in Alaska comes principally from the mandates of the Alaska National Interest Lands Conservation Act (ANILCA), December 1980, which requires major land management agencies to prepare comprehensive management plans. The land-cover mapping projects integrate digital Landsat data, terrain data, aerial photographs, and field data. The resultant land-cover and terrain maps and associated data bases are used for resource assessment, management, and planning by many Alaskan agencies including the U.S. Fish and Wildlife Service, U.S. Forest Service, Bureau of Land Management, and Alaska Department of Natural Resources. Applications addressed through use of the digital land-cover and terrain data bases range from comprehensive refuge planning to multiphased sampling procedures designed to inventory vegetation statewide. The land-cover mapping programs in Alaska demonstrate the operational utility of digital Landsat data and have resulted in a new land-cover mapping program by the USGS National Mapping Division to compile 1:250,000-scale land-cover maps in Alaska using a common statewide land-cover map legend.

  20. Making Sense of 'Big Data' in Provenance Studies

    NASA Astrophysics Data System (ADS)

    Vermeesch, P.

    2014-12-01

    Huge online databases can be 'mined' to reveal previously hidden trends and relationships in society. One could argue that sedimentary geology has entered a similar era of 'Big Data', as modern provenance studies routinely apply multiple proxies to dozens of samples. Just like the Internet, sedimentary geology now requires specialised statistical tools to interpret such large datasets. These can be organised on three levels of progressively higher order: A single sample: The most effective way to reveal the provenance information contained in a representative sample of detrital zircon U-Pb ages is through probability density estimators such as histograms and kernel density estimates. The widely popular 'probability density plots' implemented in IsoPlot and AgeDisplay compound analytical uncertainty with geological scatter and are therefore invalid. Several samples: Multi-panel diagrams comprising many detrital age distributions or compositional pie charts quickly become unwieldy and uninterpretable. For example, if there are N samples in a study, then the number of pairwise comparisons between samples increases quadratically as N(N-1)/2. This is simply too much information for the human eye to process. To solve this problem, it is necessary to (a) express the 'distance' between two samples as a simple scalar and (b) combine all N(N-1)/2 such values in a single two-dimensional 'map', grouping similar and pulling apart dissimilar samples. This can be easily achieved using simple statistics-based dissimilarity measures and a standard statistical method called Multidimensional Scaling (MDS). Several methods: Suppose that we use four provenance proxies: bulk petrography, chemistry, heavy minerals and detrital geochronology. This will result in four MDS maps, each of which is likely to show slightly different trends and patterns. To deal with such cases, it may be useful to use a related technique called 'three way multidimensional scaling'. This results in two graphical outputs: an MDS map, and a map with 'weights' showing to what extent the different provenance proxies influence the horizontal and vertical axis of the MDS map. Thus, detrital data can not only inform the user about the provenance of sediments, but also about the causal relationships between the mineralogy, geochronology and chemistry.
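
    The "several samples" step can be sketched with the Kolmogorov-Smirnov statistic as the pairwise dissimilarity and scikit-learn's metric MDS; this is an illustrative recipe on synthetic age spectra, not the author's provenance toolbox.

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.manifold import MDS

# Build an N x N dissimilarity matrix between detrital age samples and map
# it to two dimensions; similar samples plot close together.
def mds_map(age_samples, seed=0):
    n = len(age_samples)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = ks_2samp(age_samples[i], age_samples[j]).statistic
    return MDS(n_components=2, dissimilarity="precomputed",
               random_state=seed).fit_transform(D)

rng = np.random.default_rng(6)
samples = [rng.normal(1000 + 200 * k, 150, 80) for k in range(6)]  # fake U-Pb age spectra (Ma)
print(mds_map(samples))
```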

  1. Tip-enhanced Raman mapping with top-illumination AFM.

    PubMed

    Chan, K L Andrew; Kazarian, Sergei G

    2011-04-29

    Tip-enhanced Raman mapping is a powerful, emerging technique that offers rich chemical information and high spatial resolution. Currently, most successes in tip-enhanced Raman scattering (TERS) measurements are based on the inverted configuration, where the tip and laser approach the sample from opposite sides. This limits measurements to transparent samples. Several approaches have been developed to obtain tip-enhanced Raman mapping in reflection mode, many of which involve certain customisations of the system. We have demonstrated in this work that it is also possible to obtain TERS nano-images using an upright microscope (top-illumination) with a gold-coated Si atomic force microscope (AFM) cantilever without significant modification to the existing integrated AFM/Raman system. A TERS image of a single-walled carbon nanotube has been achieved with a spatial resolution of ∼ 20-50 nm, demonstrating the potential of this technique for studying non-transparent nanoscale materials.

  2. Prediction of fracture profile using digital image correlation

    NASA Astrophysics Data System (ADS)

    Chaitanya, G. M. S. K.; Sasi, B.; Kumar, Anish; Babu Rao, C.; Purnachandra Rao, B.; Jayakumar, T.

    2015-04-01

    Digital Image Correlation (DIC) based full-field strain mapping methodology is used for mapping strain on an aluminum sample subjected to tensile deformation. The local strains on the surface of the specimen are calculated at different strain intervals. Early localization of strain is observed at a total strain of only 0.050ɛ, whereas a visually apparent localization of strain is observed at a total strain of 0.088ɛ. The orientation of the line of fracture (12.0°) is very close to the orientation of the locus of strain maxima (11.6°) computed from the strain map at a total strain of 0.063ɛ. These results show the efficacy of the DIC-based method to predict the location as well as the profile of the fracture at an early stage.

  3. Does the mind map learning strategy facilitate information retrieval and critical thinking in medical students?

    PubMed Central

    2010-01-01

    Background A learning strategy underutilized in medical education is mind mapping. Mind maps are multi-sensory tools that may help medical students organize, integrate, and retain information. Recent work suggests that using mind mapping as a note-taking strategy facilitates critical thinking. The purpose of this study was to investigate whether a relationship existed between mind mapping and critical thinking, as measured by the Health Sciences Reasoning Test (HSRT), and whether a relationship existed between mind mapping and recall of domain-based information. Methods In this quasi-experimental study, 131 first-year medical students were randomly assigned to a standard note-taking (SNT) group or mind map (MM) group during orientation. Subjects were given a demographic survey and pre-HSRT. They were then given an unfamiliar text passage, a pre-quiz based upon the passage, and a 30-minute break, during which time subjects in the MM group were given a presentation on mind mapping. After the break, subjects were given the same passage and wrote notes based on their group (SNT or MM) assignment. A post-quiz based upon the passage was administered, followed by a post-HSRT. Differences in mean pre- and post-quiz scores between groups were analyzed using independent samples t-tests, whereas differences in mean pre- and post-HSRT total scores and subscores between groups were analyzed using ANOVA. Mind map depth was assessed using the Mind Map Assessment Rubric (MMAR). Results There were no significant differences in mean scores on both the pre- and post-quizzes between note-taking groups. And, no significant differences were found between pre- and post-HSRT mean total scores and subscores. Conclusions Although mind mapping was not found to increase short-term recall of domain-based information or critical thinking compared to SNT, a brief introduction to mind mapping allowed novice MM subjects to perform similarly to SNT subjects. This demonstrates that medical students using mind maps can successfully retrieve information in the short term, and does not put them at a disadvantage compared to SNT students. Future studies should explore longitudinal effects of mind-map proficiency training on both short- and long-term information retrieval and critical thinking. PMID:20846442

  4. Efficient evaluation of sampling quality of molecular dynamics simulations by clustering of dihedral torsion angles and Sammon mapping.

    PubMed

    Frickenhaus, Stephan; Kannan, Srinivasaraghavan; Zacharias, Martin

    2009-02-01

    A direct conformational clustering and mapping approach for peptide conformations based on backbone dihedral angles has been developed and applied to compare conformational sampling of Met-enkephalin using two molecular dynamics (MD) methods. Efficient clustering in dihedrals has been achieved by evaluating all combinations resulting from independent clustering of each dihedral angle distribution, thus resolving all conformational substates. In contrast, Cartesian clustering was unable to accurately distinguish between all substates. Projection of clusters on dihedral principal component (PCA) subspaces did not result in efficient separation of highly populated clusters. However, representation in a nonlinear metric by Sammon mapping was able to separate well the 48 highest populated clusters in just two dimensions. In addition, this approach also allowed us to visualize the transition frequencies between clusters efficiently. Significantly, higher transition frequencies between more distinct conformational substates were found for a recently developed biasing-potential replica exchange MD simulation method allowing faster sampling of possible substates compared to conventional MD simulations. Although the number of theoretically possible clusters grows exponentially with peptide length, in practice, the number of clusters is only limited by the sampling size (typically much smaller), and therefore the method is well suited also for large systems. The approach could be useful to rapidly and accurately evaluate conformational sampling during MD simulations, to compare different sampling strategies and eventually to detect kinetic bottlenecks in folding pathways.
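
    Sammon mapping itself can be sketched by minimising Sammon's stress with a general-purpose optimiser; the toy "dihedral cluster centres" below (48 clusters of 4 dihedrals, represented by sines and cosines to respect periodicity) are illustrative and this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

# Sammon mapping: find a 2-D embedding whose pairwise distances match the
# original pairwise distances, weighted by 1/d (Sammon's stress).
def sammon(X, seed=0):
    D = squareform(pdist(X)) + 1e-12
    iu = np.triu_indices(len(X), k=1)
    d = D[iu]
    scale = 1.0 / d.sum()

    def stress(flat):
        Y = flat.reshape(len(X), 2)
        dy = squareform(pdist(Y))[iu] + 1e-12
        return scale * np.sum((d - dy) ** 2 / d)

    rng = np.random.default_rng(seed)
    res = minimize(stress, rng.standard_normal(len(X) * 2), method="L-BFGS-B")
    return res.x.reshape(len(X), 2)

rng = np.random.default_rng(7)
angles = rng.uniform(-np.pi, np.pi, (48, 4))            # 48 cluster centres x 4 dihedrals
features = np.hstack([np.sin(angles), np.cos(angles)])  # periodicity-safe representation
embedding = sammon(features)                            # one 2-D point per cluster
```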

  5. Sandwich mapping of schistosomiasis risk in Anhui Province, China.

    PubMed

    Hu, Yi; Bergquist, Robert; Lynn, Henry; Gao, Fenghua; Wang, Qizhi; Zhang, Shiqing; Li, Rui; Sun, Liqian; Xia, Congcong; Xiong, Chenglong; Zhang, Zhijie; Jiang, Qingwu

    2015-06-03

    Schistosomiasis mapping using data obtained from parasitological surveys is frequently used in planning and evaluation of disease control strategies. The available geostatistical approaches are, however, subject to the assumption of stationarity, i.e., that the joint probability distribution of the underlying stochastic process does not change when shifted in space or time. As this is impractical for large areas, we introduce here the sandwich method, the basic idea of which is to divide the study area (with its attributes) into homogeneous subareas and estimate the values for the reporting units using spatial stratified sampling. The sandwich method was applied to map the county-level prevalence of schistosomiasis japonica in Anhui Province, China based on parasitological data collected from sample villages and land use data. We first mapped the county-level prevalence using the sandwich method, then compared our findings with block kriging. The sandwich estimates ranged from 0.17 to 0.21% with a lower level of uncertainty, while the kriging estimates varied from 0 to 0.97% with a higher level of uncertainty, indicating that the former is more smoothed and stable compared to the latter. Aside from accommodating various forms of reporting units, the sandwich method has the particular merit of simple model assumptions coupled with full utilization of the sample data. It performs well when a disease presents stratified heterogeneity over space.
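
    The spatial stratified sampling idea can be sketched as an area-weighted combination of stratum means (assumed inputs and a simplified variance, not the full sandwich model): each homogeneous zone contributes its sampled-village mean, weighted by its share of the reporting unit's area.

```python
import numpy as np

# Stratified estimate of a reporting unit's prevalence from stratum samples.
def stratified_estimate(stratum_samples, area_shares):
    means = np.array([np.mean(s) for s in stratum_samples])
    vars_ = np.array([np.var(s, ddof=1) / len(s) for s in stratum_samples])
    w = np.asarray(area_shares, float)
    w = w / w.sum()
    return float(w @ means), float(w ** 2 @ vars_)      # estimate, variance

# prevalence (%) observed in sample villages of three landscape strata (toy data)
strata = [[0.05, 0.12, 0.08], [0.30, 0.22, 0.28, 0.25], [0.02, 0.01]]
shares = [0.5, 0.3, 0.2]        # fraction of the county's area in each stratum
est, var = stratified_estimate(strata, shares)
print(f"county prevalence estimate = {est:.3f}%, s.e. = {var ** 0.5:.3f}%")
```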

  6. A microarray-based genotyping and genetic mapping approach for highly heterozygous outcrossing species enables localization of a large fraction of the unassembled Populus trichocarpa genome sequence.

    PubMed

    Drost, Derek R; Novaes, Evandro; Boaventura-Novaes, Carolina; Benedict, Catherine I; Brown, Ryan S; Yin, Tongming; Tuskan, Gerald A; Kirst, Matias

    2009-06-01

    Microarrays have demonstrated significant power for genome-wide analyses of gene expression, and recently have also revolutionized the genetic analysis of segregating populations by genotyping thousands of loci in a single assay. Although microarray-based genotyping approaches have been successfully applied in yeast and several inbred plant species, their power has not been proven in an outcrossing species with extensive genetic diversity. Here we have developed methods for high-throughput microarray-based genotyping in such species using a pseudo-backcross progeny of 154 individuals of Populus trichocarpa and P. deltoides analyzed with long-oligonucleotide in situ-synthesized microarray probes. Our analysis resulted in high-confidence genotypes for 719 single-feature polymorphism (SFP) and 1014 gene expression marker (GEM) candidates. Using these genotypes and an established microsatellite (SSR) framework map, we produced a high-density genetic map comprising over 600 SFPs, GEMs and SSRs. The abundance of gene-based markers allowed us to localize over 35 million base pairs of previously unplaced whole-genome shotgun (WGS) scaffold sequence to putative locations in the genome of P. trichocarpa. A high proportion of sampled scaffolds could be verified for their placement with independently mapped SSRs, demonstrating the previously un-utilized power that high-density genotyping can provide in the context of map-based WGS sequence reassembly. Our results provide a substantial contribution to the continued improvement of the Populus genome assembly, while demonstrating the feasibility of microarray-based genotyping in a highly heterozygous population. The strategies presented are applicable to genetic mapping efforts in all plant species with similarly high levels of genetic diversity.

  7. Evaluation of groundwater quality and its suitability for drinking and agricultural use in Thanjavur city, Tamil Nadu, India.

    PubMed

    Nagarajan, R; Rajmohan, N; Mahendran, U; Senthamilkumar, S

    2010-12-01

    As groundwater is a vital source of water for domestic and agricultural activities in Thanjavur city due to lack of surface water resources, groundwater quality and its suitability for drinking and agricultural usage were evaluated. In this study, 102 groundwater samples were collected from dug wells and bore wells during March 2008 and analyzed for pH, electrical conductivity, temperature, major ions, and nitrate. Results suggest that, in 90% of groundwater samples, sodium and chloride are the predominant cation and anion, respectively, and NaCl and CaMgCl are the major water types in the study area. The groundwater quality in the study site is impaired by surface contamination sources, mineral dissolution, ion exchange, and evaporation. Nitrate, chloride, and sulfate concentrations strongly express the impact of surface contamination sources such as agricultural and domestic activities on groundwater quality, and 13% of samples have elevated nitrate content (>45 mg/l as NO(3)). The PHREEQC code and Gibbs plots were employed to evaluate the contribution of mineral dissolution and suggest that mineral dissolution, especially of carbonate minerals, regulates water chemistry. Groundwater suitability for drinking usage was evaluated against the World Health Organization and Indian standards, and results suggest that 34% of samples are not suitable for drinking. An integrated groundwater suitability map for drinking purposes was created using drinking water standards, based on the concept that if a groundwater sample exceeds any one of the standards, it is not suitable for drinking. This map illustrates that wells in zones 1, 2, 3, and 4 are not fit for drinking purposes. Likewise, the irrigational suitability of groundwater in the study region was evaluated, and results suggest that 20% of samples are not fit for irrigation. A groundwater suitability map for irrigation was also produced based on salinity and sodium hazards and denotes that wells mostly situated in zones 2 and 3 are not suitable for irrigation. Both integrated suitability maps for drinking and irrigation usage provide an overall picture of the groundwater quality in the study area. Finally, the study concluded that groundwater quality is impaired by man-made activities, and a proper management plan is necessary to protect the valuable groundwater resources of Thanjavur city.
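
    The integrated suitability map described above rests on a simple rule: a well is deemed unsuitable for drinking if any single parameter exceeds its standard. A minimal sketch of that rule follows; the 45 mg/l nitrate threshold comes from the abstract, while the other limits and the sample values are illustrative placeholders, not the WHO or Indian standard values.

    ```python
    # Sketch of the "exceeds any one standard -> unsuitable" rule for drinking-water suitability.
    # Thresholds other than nitrate, and all sample values, are illustrative placeholders.
    import pandas as pd

    limits = {"nitrate": 45.0, "chloride": 250.0, "sulfate": 200.0, "tds": 500.0}  # mg/l

    samples = pd.DataFrame(
        {"nitrate": [12.0, 61.0], "chloride": [180.0, 420.0],
         "sulfate": [90.0, 150.0], "tds": [350.0, 610.0]},
        index=["well_1", "well_2"])

    exceeds = samples.gt(pd.Series(limits))          # True where a parameter exceeds its limit
    samples["suitable_for_drinking"] = ~exceeds.any(axis=1)
    print(samples[["suitable_for_drinking"]])
    ```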

  8. The Sloan Digital Sky Survey Reverberation Mapping Project: Composite Lags at z ≤ 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jennifer; Shen, Yue; Horne, Keith

    We present composite broad-line region (BLR) reverberation mapping lag measurements for Hα, Hβ, He II λ4686, and Mg II for a sample of 144, z ≲ 1 quasars from the Sloan Digital Sky Survey Reverberation Mapping (SDSS-RM) project. Using only the 32-epoch spectroscopic light curves in the first six-month season of SDSS-RM observations, we compile correlation function measurements for individual objects and then coadd them to allow the measurement of the average lags for our sample at mean redshifts of 0.4 (for Hα) and ∼0.65 (for the other lines). At similar quasar luminosities and redshifts, the sample-averaged lag decreases in the order of Mg II, Hα, Hβ, and He II. This decrease in lags is accompanied by an increase in the mean line width of the four lines, and is roughly consistent with the virialized motion for BLR gas in photoionization equilibrium. These are among the first RM measurements of stratified BLR structure at z > 0.3. Dividing our sample by luminosity, Hα shows clear evidence of increasing lags with luminosity, consistent with the expectation from the measured BLR size-luminosity relation based on Hβ. The other three lines do not show a clear luminosity trend in their average lags due to the limited dynamic range of luminosity probed and the poor average correlation signals in the divided samples, a situation that will be improved with the incorporation of additional photometric and spectroscopic data from SDSS-RM. We discuss the utility and caveats of composite lag measurements for large statistical quasar samples with reverberation mapping data.

  9. The Sloan Digital Sky Survey Reverberation Mapping Project: Composite Lags at z ≤ 1

    NASA Astrophysics Data System (ADS)

    Li, Jennifer; Shen, Yue; Horne, Keith; Brandt, W. N.; Greene, Jenny E.; Grier, C. J.; Ho, Luis C.; Kochanek, Chris; Schneider, Donald P.; Trump, Jonathan R.; Dawson, Kyle S.; Pan, Kaike; Bizyaev, Dmitry; Oravetz, Daniel; Simmons, Audrey; Malanushenko, Elena

    2017-09-01

    We present composite broad-line region (BLR) reverberation mapping lag measurements for Hα, Hβ, He II λ4686, and Mg II for a sample of 144, z ≲ 1 quasars from the Sloan Digital Sky Survey Reverberation Mapping (SDSS-RM) project. Using only the 32-epoch spectroscopic light curves in the first six-month season of SDSS-RM observations, we compile correlation function measurements for individual objects and then coadd them to allow the measurement of the average lags for our sample at mean redshifts of 0.4 (for Hα) and ˜0.65 (for the other lines). At similar quasar luminosities and redshifts, the sample-averaged lag decreases in the order of Mg II, Hα, Hβ, and He II. This decrease in lags is accompanied by an increase in the mean line width of the four lines, and is roughly consistent with the virialized motion for BLR gas in photoionization equilibrium. These are among the first RM measurements of stratified BLR structure at z > 0.3. Dividing our sample by luminosity, Hα shows clear evidence of increasing lags with luminosity, consistent with the expectation from the measured BLR size-luminosity relation based on Hβ. The other three lines do not show a clear luminosity trend in their average lags due to the limited dynamic range of luminosity probed and the poor average correlation signals in the divided samples, a situation that will be improved with the incorporation of additional photometric and spectroscopic data from SDSS-RM. We discuss the utility and caveats of composite lag measurements for large statistical quasar samples with reverberation mapping data.
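
    The composite-lag idea described in the two records above, measuring a cross-correlation function per object and coadding before reading off the lag, can be sketched with synthetic light curves. This is an illustration of the general approach, not the SDSS-RM analysis pipeline; the interpolation scheme, lag grid, and mock data are all assumptions.

    ```python
    # Sketch of a composite-lag measurement: compute an interpolated cross-correlation
    # function (CCF) per object between its continuum and line light curves, coadd the
    # CCFs, and take the lag at the peak. Synthetic data; not the SDSS-RM pipeline.
    import numpy as np

    rng = np.random.default_rng(1)
    lags = np.arange(-30.0, 30.5, 0.5)   # trial lags in days

    def ccf(t, cont, line, lags):
        """Interpolated cross-correlation of two unevenly sampled light curves."""
        out = np.empty_like(lags)
        for i, tau in enumerate(lags):
            shifted = np.interp(t + tau, t, line)   # line light curve evaluated at t + tau
            out[i] = np.corrcoef(cont, shifted)[0, 1]
        return out

    ccfs = []
    true_lag = 8.0
    for _ in range(50):                              # 50 mock quasars
        t = np.sort(rng.uniform(0, 180, 32))         # 32 epochs over one season
        cont = np.sin(t / 15.0) + 0.1 * rng.normal(size=t.size)
        line = np.sin((t - true_lag) / 15.0) + 0.3 * rng.normal(size=t.size)
        ccfs.append(ccf(t, cont, line, lags))

    composite = np.mean(ccfs, axis=0)                # coadd individual CCFs
    print("composite lag ~", lags[np.argmax(composite)], "days")
    ```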

  10. Mapping a multiplexed zoo of mRNA expression.

    PubMed

    Choi, Harry M T; Calvert, Colby R; Husain, Naeem; Huss, David; Barsi, Julius C; Deverman, Benjamin E; Hunter, Ryan C; Kato, Mihoko; Lee, S Melanie; Abelin, Anna C T; Rosenthal, Adam Z; Akbari, Omar S; Li, Yuwei; Hay, Bruce A; Sternberg, Paul W; Patterson, Paul H; Davidson, Eric H; Mazmanian, Sarkis K; Prober, David A; van de Rijn, Matt; Leadbetter, Jared R; Newman, Dianne K; Readhead, Carol; Bronner, Marianne E; Wold, Barbara; Lansford, Rusty; Sauka-Spengler, Tatjana; Fraser, Scott E; Pierce, Niles A

    2016-10-01

    In situ hybridization methods are used across the biological sciences to map mRNA expression within intact specimens. Multiplexed experiments, in which multiple target mRNAs are mapped in a single sample, are essential for studying regulatory interactions, but remain cumbersome in most model organisms. Programmable in situ amplifiers based on the mechanism of hybridization chain reaction (HCR) overcome this longstanding challenge by operating independently within a sample, enabling multiplexed experiments to be performed with an experimental timeline independent of the number of target mRNAs. To assist biologists working across a broad spectrum of organisms, we demonstrate multiplexed in situ HCR in diverse imaging settings: bacteria, whole-mount nematode larvae, whole-mount fruit fly embryos, whole-mount sea urchin embryos, whole-mount zebrafish larvae, whole-mount chicken embryos, whole-mount mouse embryos and formalin-fixed paraffin-embedded human tissue sections. In addition to straightforward multiplexing, in situ HCR enables deep sample penetration, high contrast and subcellular resolution, providing an incisive tool for the study of interlaced and overlapping expression patterns, with implications for research communities across the biological sciences. © 2016. Published by The Company of Biologists Ltd.

  11. Mapping a multiplexed zoo of mRNA expression

    PubMed Central

    Choi, Harry M. T.; Calvert, Colby R.; Husain, Naeem; Huss, David; Barsi, Julius C.; Deverman, Benjamin E.; Hunter, Ryan C.; Kato, Mihoko; Lee, S. Melanie; Abelin, Anna C. T.; Rosenthal, Adam Z.; Akbari, Omar S.; Li, Yuwei; Hay, Bruce A.; Sternberg, Paul W.; Patterson, Paul H.; Davidson, Eric H.; Mazmanian, Sarkis K.; Prober, David A.; van de Rijn, Matt; Leadbetter, Jared R.; Newman, Dianne K.; Readhead, Carol; Bronner, Marianne E.; Wold, Barbara; Lansford, Rusty; Sauka-Spengler, Tatjana; Fraser, Scott E.

    2016-01-01

    In situ hybridization methods are used across the biological sciences to map mRNA expression within intact specimens. Multiplexed experiments, in which multiple target mRNAs are mapped in a single sample, are essential for studying regulatory interactions, but remain cumbersome in most model organisms. Programmable in situ amplifiers based on the mechanism of hybridization chain reaction (HCR) overcome this longstanding challenge by operating independently within a sample, enabling multiplexed experiments to be performed with an experimental timeline independent of the number of target mRNAs. To assist biologists working across a broad spectrum of organisms, we demonstrate multiplexed in situ HCR in diverse imaging settings: bacteria, whole-mount nematode larvae, whole-mount fruit fly embryos, whole-mount sea urchin embryos, whole-mount zebrafish larvae, whole-mount chicken embryos, whole-mount mouse embryos and formalin-fixed paraffin-embedded human tissue sections. In addition to straightforward multiplexing, in situ HCR enables deep sample penetration, high contrast and subcellular resolution, providing an incisive tool for the study of interlaced and overlapping expression patterns, with implications for research communities across the biological sciences. PMID:27702788

  12. Evaluation of Sampling Methods for Bacillus Spore ...

    EPA Pesticide Factsheets

    Journal Article Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  13. Communicating infectious disease prevalence through graphics: Results from an international survey.

    PubMed

    Fagerlin, Angela; Valley, Thomas S; Scherer, Aaron M; Knaus, Megan; Das, Enny; Zikmund-Fisher, Brian J

    2017-07-13

    Graphics are increasingly used to represent the spread of infectious diseases (e.g., influenza, Zika, Ebola); however, their impact on adequately informing the general population is unknown. We examined whether three ways of visually presenting data (heat map, dot map, or picto-trendline), all depicting the same information regarding the spread of a hypothetical influenza outbreak, influence intent to vaccinate, risk perception, and knowledge. In an international online survey, participants were randomized to receive a simulated news article accompanied by one of the three graphics, which communicated the prevalence of influenza and the number of influenza-related deaths. The sample comprised 16,510 adults living in 11 countries, selected using stratified random sampling based on age and gender. After reading the article and viewing the presented graphic, participants completed a survey that measured interest in vaccination, perceived risk of contracting disease, knowledge gained, interest in additional information about the disease, and perception of the graphic. Heat maps and picto-trendlines were evaluated more positively than dot maps. Heat maps were more effective than picto-trendlines, and no different from dot maps, at increasing interest in vaccination, perceived risk of contracting disease, and interest in additional information about the disease. Heat maps and picto-trendlines were more successful at conveying knowledge than dot maps. Overall, heat maps were the only graphic to be superior in every outcome. Results are based on a hypothetical scenario. Heat maps are a viable option to promote interest in and concern about infectious diseases. Published by Elsevier Ltd.

  14. Fast IR laser mapping ellipsometry for the study of functional organic thin films.

    PubMed

    Furchner, Andreas; Sun, Guoguang; Ketelsen, Helge; Rappich, Jörg; Hinrichs, Karsten

    2015-03-21

    Fast infrared mapping with sub-millimeter lateral resolution as well as time-resolved infrared studies of kinetic processes of functional organic thin films require a new generation of infrared ellipsometers. We present a novel laboratory-based infrared (IR) laser mapping ellipsometer, in which a laser is coupled to a variable-angle rotating analyzer ellipsometer. Compared to conventional Fourier-transform infrared (FT-IR) ellipsometers, the IR laser ellipsometer provides ten- to hundredfold shorter measurement times down to 80 ms per measured spot, as well as about tenfold increased lateral resolution of 120 μm, thus enabling mapping of small sample areas with thin-film sensitivity. The ellipsometer, equipped with a HeNe laser emitting at about 2949 cm⁻¹, was applied for the optical characterization of inhomogeneous poly(3-hexylthiophene) [P3HT] and poly(N-isopropylacrylamide) [PNIPAAm] organic thin films used for opto-electronics and bioapplications. With the constant development of tunable IR laser sources, laser-based infrared ellipsometry is a promising technique for fast in-depth mapping characterization of thin films and blends.

  15. Simultaneous comparison and assessment of eight remotely sensed maps of Philippine forests

    NASA Astrophysics Data System (ADS)

    Estoque, Ronald C.; Pontius, Robert G.; Murayama, Yuji; Hou, Hao; Thapa, Rajesh B.; Lasco, Rodel D.; Villar, Merlito A.

    2018-05-01

    This article compares and assesses eight remotely sensed maps of Philippine forest cover in the year 2010. We examined eight Forest versus Non-Forest maps reclassified from eight land cover products: the Philippine Land Cover, the Climate Change Initiative (CCI) Land Cover, the Landsat Vegetation Continuous Fields (VCF), the MODIS VCF, the MODIS Land Cover Type product (MCD12Q1), the Global Tree Canopy Cover, the ALOS-PALSAR Forest/Non-Forest Map, and the GlobeLand30. The reference data consisted of 9852 randomly distributed sample points interpreted from Google Earth. We created methods to assess the maps and their combinations. Results show that the percentage of the Philippines covered by forest ranges among the maps from a low of 23% for the Philippine Land Cover to a high of 67% for GlobeLand30. Landsat VCF estimates 36% forest cover, which is closest to the 37% estimate based on the reference data. The eight maps plus the reference data agree unanimously on 30% of the sample points, of which 11% are attributable to forest and 19% to non-forest. The overall disagreement between the reference data and Philippine Land Cover is 21%, which is the least among the eight Forest versus Non-Forest maps. About half of the 9852 points have a nested structure such that the forest in a given dataset is a subset of the forest in the datasets that have more forest than the given dataset. The variation among the maps regarding forest quantity and allocation relates to the combined effects of the various definitions of forest and classification errors. Scientists and policy makers must consider these insights when producing future forest cover maps and when establishing benchmarks for forest cover monitoring.
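
    The agreement analysis above (unanimous agreement at sample points, per-map disagreement with the reference data) can be expressed compactly. The sketch below uses synthetic binary Forest/Non-Forest calls at random points rather than the eight actual products or the 9852 Google Earth reference points.

    ```python
    # Sketch of a per-point agreement tally among binary Forest (1) / Non-Forest (0) maps
    # sampled at reference points. Map names and values are synthetic stand-ins.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    n_points, n_maps = 1000, 8
    maps = pd.DataFrame(rng.integers(0, 2, size=(n_points, n_maps)),
                        columns=[f"map_{i}" for i in range(1, n_maps + 1)])
    reference = pd.Series(rng.integers(0, 2, size=n_points), name="reference")

    votes = maps.sum(axis=1)                          # number of maps calling a point Forest
    unanimous = (votes == n_maps) | (votes == 0)
    print("fraction of points with unanimous map agreement:", unanimous.mean())

    # Per-map disagreement with the reference interpretation.
    disagreement = maps.ne(reference, axis=0).mean()
    print(disagreement.sort_values())
    ```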

  16. Identifier mapping performance for integrating transcriptomics and proteomics experimental results

    PubMed Central

    2011-01-01

    Background Studies integrating transcriptomic data with proteomic data can illuminate the proteome more clearly than either separately. Integromic studies can deepen understanding of the dynamic complex regulatory relationship between the transcriptome and the proteome. Integrating these data dictates a reliable mapping between the identifier nomenclature resultant from the two high-throughput platforms. However, this kind of analysis is well known to be hampered by lack of standardization of identifier nomenclature among proteins, genes, and microarray probe sets. Therefore data integration may also play a role in critiquing the fallible gene identifications that both platforms emit. Results We compared three freely available internet-based identifier mapping resources for mapping UniProt accessions (ACCs) to Affymetrix probesets identifications (IDs): DAVID, EnVision, and NetAffx. Liquid chromatography-tandem mass spectrometry analyses of 91 endometrial cancer and 7 noncancer samples generated 11,879 distinct ACCs. For each ACC, we compared the retrieval sets of probeset IDs from each mapping resource. We confirmed a high level of discrepancy among the mapping resources. On the same samples, mRNA expression was available. Therefore, to evaluate the quality of each ACC-to-probeset match, we calculated proteome-transcriptome correlations, and compared the resources presuming that better mapping of identifiers should generate a higher proportion of mapped pairs with strong inter-platform correlations. A mixture model for the correlations fitted well and supported regression analysis, providing a window into the performance of the mapping resources. The resources have added and dropped matches over two years, but their overall performance has not changed. Conclusions The methods presented here serve to achieve concrete context-specific insight, to support well-informed decisions in choosing an ID mapping strategy for "omic" data merging. PMID:21619611

  17. phylogeo: an R package for geographic analysis and visualization of microbiome data.

    PubMed

    Charlop-Powers, Zachary; Brady, Sean F

    2015-09-01

    We have created an R package named phylogeo that provides a set of geographic utilities for sequencing-based microbial ecology studies. Although the geographic location of samples is an important aspect of environmental microbiology, none of the major software packages used in processing microbiome data include utilities that allow users to map and explore the spatial dimension of their data. phylogeo solves this problem by providing a set of plotting and mapping functions that can be used to visualize the geographic distribution of samples, to look at the relatedness of microbiomes using ecological distance, and to map the geographic distribution of particular sequences. By extending the popular phyloseq package and using the same data structures and command formats, phylogeo allows users to easily map and explore the geographic dimensions of their data from the R programming language. phylogeo is documented and freely available at http://zachcp.github.io/phylogeo. Contact: zcharlop@rockefeller.edu. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Advancing the quantification of humid tropical forest cover loss with multi-resolution optical remote sensing data: Sampling & wall-to-wall mapping

    NASA Astrophysics Data System (ADS)

    Broich, Mark

    Humid tropical forest cover loss is threatening the sustainability of ecosystem goods and services as vast forest areas are rapidly cleared for industrial-scale agriculture and tree plantations. Despite the importance of humid tropical forest in the provision of ecosystem services and economic development opportunities, the spatial and temporal distribution of forest cover loss across large areas is not well quantified. Here I improve the quantification of humid tropical forest cover loss using two remote sensing-based methods: sampling and wall-to-wall mapping. In all of the presented studies, the integration of coarse spatial, high temporal resolution data with moderate spatial, low temporal resolution data enables advances in quantifying forest cover loss in the humid tropics. Imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS) is used as the source of coarse spatial resolution, high temporal resolution data, and imagery from the Landsat Enhanced Thematic Mapper Plus (ETM+) sensor is used as the source of moderate spatial, low temporal resolution data. In a first study, I compare the precision of different sampling designs for the Brazilian Amazon using the annual deforestation maps derived by the Brazilian Space Agency for reference. I show that sampling designs can provide reliable deforestation estimates; furthermore, sampling designs guided by MODIS data can provide more efficient estimates than the systematic design used for the United Nations Food and Agricultural Organization Forest Resource Assessment 2010. Sampling approaches, such as the one demonstrated, are viable in regions where data limitations, such as cloud contamination, limit exhaustive mapping methods. Cloud-contaminated regions experiencing high rates of change include Insular Southeast Asia, specifically Indonesia and Malaysia. Due to persistent cloud cover, forest cover loss in Indonesia has only been mapped at a 5-10 year interval using photo interpretation of the single best Landsat images. Such an approach does not provide timely results, and cloud cover reduces the utility of map outputs. In a second study, I develop a method to exhaustively mine the recently opened Landsat archive for cloud-free observations and automatically map forest cover loss for Sumatra and Kalimantan for the 2000-2005 interval. In a comparison with a reference dataset consisting of 64 Landsat sample blocks, I show that my method, using per-pixel time series, provides more accurate forest cover loss maps for multiyear intervals than approaches using image composites. In a third study, I disaggregate Landsat-mapped forest cover loss, mapped over a multiyear interval, by year using annual forest cover loss maps generated from coarse spatial, high temporal resolution MODIS imagery. I further disaggregate and analyze forest cover loss by forest land use and by province. Forest cover loss trends show high spatial and temporal variability. These results underline the importance of annual mapping for the quantification of forest cover loss in Indonesia, specifically in the light of the developing Reducing Emissions from Deforestation and Forest Degradation in Developing Countries policy framework (REDD). All three studies highlight the advances in quantifying forest cover loss in the humid tropics made by integrating coarse spatial, high temporal resolution data with moderate spatial, low temporal resolution data. The three methods presented can be combined into an integrated monitoring strategy.

  19. A Tamarisk Habitat Suitability Map for the Continental US

    NASA Technical Reports Server (NTRS)

    Morisette, Jeffrey T.; Jernevich, Catherine S.; Ullah, Asad; Cai, Weijie; Pedelty, Jeffrey A.; Gentle, Jim; Stohlgren, Thomas J.; Schnase, John L.

    2005-01-01

    This paper presents a national-scale map of habitat suitability for a high-priority invasive species, Tamarisk (Tamarix spp., salt cedar). We successfully integrate satellite data and tens of thousands of field sampling points through logistic regression modeling to create a habitat suitability map that is 90% accurate. This interagency effort uses field data collected and coordinated through the US Geological Survey and nation-wide environmental data layers derived from NASA's MODerate Resolution Imaging Spectroradiometer (MODIS). We demonstrate the utilization of the map by ranking the lower 48 US states (and the District of Columbia) based upon their absolute, as well as proportional, areas of highly likely and moderately likely habitat for Tamarisk. The interagency effort and modeling approach presented here could be applied to map other harmful species in the US and globally.
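
    Since the record above describes a logistic-regression habitat-suitability model built from field presence points and environmental layers, a compact sketch of that style of model may help. The predictors, presence rule, and data below are invented for illustration; this is not the USGS/NASA Tamarisk model or its MODIS layers.

    ```python
    # Sketch of a logistic-regression habitat-suitability model: presence/absence points plus
    # environmental predictors sampled at those points. Features and data are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(3)
    n = 5000
    X = np.column_stack([
        rng.normal(15, 8, n),      # e.g. mean annual temperature (hypothetical layer)
        rng.normal(400, 150, n),   # e.g. annual precipitation (hypothetical layer)
        rng.uniform(0, 1, n),      # e.g. a satellite-derived greenness metric (hypothetical)
    ])
    # Synthetic presence rule with noise, standing in for field observations.
    y = ((X[:, 0] > 14) & (X[:, 1] < 450) & (rng.uniform(size=n) > 0.2)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

    suitability = model.predict_proba(X_te)[:, 1]     # habitat-suitability score per location
    print("accuracy:", round(accuracy_score(y_te, suitability > 0.5), 3))
    ```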

  20. Nominal 30-M Cropland Extent Map of Continental Africa by Integrating Pixel-Based and Object-Based Algorithms Using Sentinel-2 and Landsat-8 Data on Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Xiong, Jun; Thenkabail, Prasad S.; Tilton, James C.; Gumma, Murali K.; Teluguntla, Pardhasaradhi; Oliphant, Adam; Congalton, Russell G.; Yadav, Kamini; Gorelick, Noel

    2017-01-01

    A satellite-derived cropland extent map at high spatial resolution (30-m or better) is a must for food and water security analysis. Precise and accurate global cropland extent maps, indicating cropland and non-cropland areas, are a starting point for developing high-level products such as crop watering methods (irrigated or rainfed), cropping intensities (e.g., single, double, or continuous cropping), crop types, cropland fallows, as well as assessments of cropland productivity (productivity per unit of land) and crop water productivity (productivity per unit of water). Uncertainties associated with the cropland extent map have cascading effects on all higher-level cropland products. However, precise and accurate cropland extent maps at high spatial resolution over large areas (e.g., continents or the globe) are challenging to produce due to the small-holder-dominated agricultural systems like those found in most of Africa and Asia. Cloud-based geospatial computing platforms and multi-date, multi-sensor satellite image inventories on Google Earth Engine offer opportunities for mapping croplands with precision and accuracy over large areas that satisfy the requirements of a broad range of applications. Such maps are expected to provide highly significant improvements compared to existing products, which tend to be coarser in resolution and often fail to capture fragmented small-holder farms, especially in regions with high dynamic change within and across years. To overcome these limitations, in this research we present an approach for cropland extent mapping at high spatial resolution (30-m or better) using the 10-day, 10 to 20-m Sentinel-2 data in combination with 16-day, 30-m Landsat-8 data on Google Earth Engine (GEE). First, nominal 30-m resolution satellite imagery composites were created from 36,924 scenes of Sentinel-2 and Landsat-8 images for the entire African continent in 2015-2016. These composites were generated using a median-mosaic of five bands (blue, green, red, near-infrared, NDVI) during each of the two periods (period 1: January-June 2016 and period 2: July-December 2015) plus a 30-m slope layer derived from the Shuttle Radar Topographic Mission (SRTM) elevation dataset. Second, we selected cropland/non-cropland training samples (sample size 9791) from various sources in GEE to create pixel-based classifications. Random Forest (RF) was used as the primary supervised classifier because of its efficiency, and where RF over-fitted due to noise in the input training data, a Support Vector Machine (SVM) was applied to compensate for such defects in specific areas. Third, the Recursive Hierarchical Segmentation (RHSeg) algorithm was employed to generate an object-oriented segmentation layer based on spectral and spatial properties from the same input data. This layer was merged with the pixel-based classification to improve segmentation accuracy. Accuracies of the merged 30-m crop extent product were computed using an error matrix approach in which 1754 independent validation samples were used. In addition, a comparison was performed with other available cropland maps as well as with LULC maps to show spatial similarity. Finally, the cropland area results derived from the map were compared with UN FAO statistics. The independent accuracy assessment showed a weighted overall accuracy of 94%, with a producer's accuracy of 85.9% (or omission error of 14.1%) and a user's accuracy of 68.5% (commission error of 31.5%) for the cropland class. The total net cropland area (TNCA) of Africa was estimated as 313 Mha for the nominal year 2015.
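
    A minimal sketch of the pixel-based classification step (Random Forest as the primary classifier, with an SVM available as an alternative) is given below using scikit-learn on synthetic per-pixel features; it stands in for, and is not, the Google Earth Engine implementation described above.

    ```python
    # Sketch of pixel-based cropland classification: a Random Forest trained on per-pixel
    # composite features (two seasons x five bands plus slope), with an SVM alternative.
    # Data are synthetic; this is not the Google Earth Engine workflow.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(4)
    n_pixels, n_features = 9791, 11          # 2 seasons x 5 bands + slope
    X = rng.normal(size=(n_pixels, n_features))
    y = (X[:, 4] + X[:, 9] + 0.5 * rng.normal(size=n_pixels) > 0).astype(int)  # 1 = cropland

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

    # Error matrix (rows: reference, columns: mapped) for each classifier.
    print("RF error matrix:\n", confusion_matrix(y_te, rf.predict(X_te)))
    print("SVM error matrix:\n", confusion_matrix(y_te, svm.predict(X_te)))
    ```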

  1. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data

    DOE PAGES

    Chang, C.

    2015-07-29

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg² from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We also find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arc min smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. Finally, we summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  2. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data.

    PubMed

    Chang, C; Vikram, V; Jain, B; Bacon, D; Amara, A; Becker, M R; Bernstein, G; Bonnett, C; Bridle, S; Brout, D; Busha, M; Frieman, J; Gaztanaga, E; Hartley, W; Jarvis, M; Kacprzak, T; Kovács, A; Lahav, O; Lin, H; Melchior, P; Peiris, H; Rozo, E; Rykoff, E; Sánchez, C; Sheldon, E; Troxel, M A; Wechsler, R; Zuntz, J; Abbott, T; Abdalla, F B; Allam, S; Annis, J; Bauer, A H; Benoit-Lévy, A; Brooks, D; Buckley-Geer, E; Burke, D L; Capozzi, D; Carnero Rosell, A; Carrasco Kind, M; Castander, F J; Crocce, M; D'Andrea, C B; Desai, S; Diehl, H T; Dietrich, J P; Doel, P; Eifler, T F; Evrard, A E; Fausti Neto, A; Flaugher, B; Fosalba, P; Gruen, D; Gruendl, R A; Gutierrez, G; Honscheid, K; James, D; Kent, S; Kuehn, K; Kuropatkin, N; Maia, M A G; March, M; Martini, P; Merritt, K W; Miller, C J; Miquel, R; Neilsen, E; Nichol, R C; Ogando, R; Plazas, A A; Romer, A K; Roodman, A; Sako, M; Sanchez, E; Sevilla, I; Smith, R C; Soares-Santos, M; Sobreira, F; Suchyta, E; Tarle, G; Thaler, J; Thomas, D; Tucker, D; Walker, A R

    2015-07-31

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg² from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arc min smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. We summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  3. Energy-selective Neutron Imaging for Three-dimensional Non-destructive Probing of Crystalline Structures

    NASA Astrophysics Data System (ADS)

    Peetermans, S.; Bopp, M.; Vontobel, P.; Lehmann, E. H.

    Common neutron imaging uses the full polychromatic neutron beam spectrum to reveal the material distribution in a non-destructive way. Performing it with a reduced energy band, i.e. energy-selective neutron imaging, allows access to local variation in sample crystallographic properties. Two sample categories can be discerned with different energy responses. Polycrystalline materials have an energy-dependent cross-section featuring Bragg edges. Energy-selective neutron imaging can be used to distinguish between crystallographic phases, increase material sensitivity or penetration, improve quantification, etc. An example of the latter is shown by the examination of copper discs prior to machining them into linear accelerator cavity structures. The cross-section of single crystals features distinct Bragg peaks. Based on their pattern, one can determine the orientation of the crystal, as in a Laue pattern, but with the tremendous advantage that the operation can be performed for each pixel, yielding crystal orientation maps at high spatial resolution. A wholly different method to investigate such samples is also introduced: neutron diffraction imaging. It is based on projections formed by neutrons diffracted from the crystal lattice out of the direct beam. The position of these projections on the detector gives information on the crystal orientation. The projection itself can be used to reconstruct the crystal shape. A three-dimensional mapping of local Bragg reflectivity or a grain orientation mapping can thus be obtained.

  4. Accounting for autocorrelation in multi-drug resistant tuberculosis predictors using a set of parsimonious orthogonal eigenvectors aggregated in geographic space.

    PubMed

    Jacob, Benjamin J; Krapp, Fiorella; Ponce, Mario; Gottuzzo, Eduardo; Griffith, Daniel A; Novak, Robert J

    2010-05-01

    Spatial autocorrelation is problematic for classical hierarchical cluster detection tests commonly used in multi-drug resistant tuberculosis (MDR-TB) analyses as considerable random error can occur. Therefore, when MDR-TB clusters are spatially autocorrelated the assumption that the clusters are independently random is invalid. In this research, a product moment correlation coefficient (i.e., the Moran's coefficient) was used to quantify local spatial variation in multiple clinical and environmental predictor variables sampled in San Juan de Lurigancho, Lima, Peru. Initially, QuickBird 0.61 m data, encompassing visible bands and the near infra-red bands, were selected to synthesize images of land cover attributes of the study site. Data of residential addresses of individual patients with smear-positive MDR-TB were geocoded, prevalence rates calculated and then digitally overlaid onto the satellite data within a 2 km buffer of 31 georeferenced health centers, using a 10 m² grid-based algorithm. Geographical information system (GIS)-gridded measurements of each health center were generated based on preliminary base maps of the georeferenced data aggregated to block groups and census tracts within each buffered area. A three-dimensional model of the study site was constructed based on a digital elevation model (DEM) to determine terrain covariates associated with the sampled MDR-TB covariates. Pearson's correlation was used to evaluate the linear relationship between the DEM and the sampled MDR-TB data. A SAS/GIS® module was then used to calculate univariate statistics and to perform linear and non-linear regression analyses using the sampled predictor variables. The estimates generated from a global autocorrelation analysis were then spatially decomposed into empirical orthogonal bases using a negative binomial regression with a non-homogeneous mean. Results of the DEM analyses indicated a statistically non-significant linear relationship between georeferenced health centers and the sampled covariate elevation. The data exhibited positive spatial autocorrelation, and the decomposition of Moran's coefficient into uncorrelated, orthogonal map pattern components revealed global spatial heterogeneities necessary to capture latent autocorrelation in the MDR-TB model. It was thus shown that Poisson regression analyses and spatial eigenvector mapping can elucidate the mechanics of MDR-TB transmission by prioritizing clinical and environmental-sampled predictor variables for identifying high risk populations.
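
    Since the record above centres on Moran's coefficient as the measure of spatial autocorrelation, a small sketch of the global statistic may help. The inverse-distance weight matrix, coordinates, and variable below are invented for illustration and do not reproduce the San Juan de Lurigancho analysis.

    ```python
    # Sketch of the global Moran's coefficient: I = (n / S0) * (z' W z) / (z' z), with z the
    # mean-centred variable and W a spatial weight matrix (here inverse-distance, row-standardized).
    import numpy as np

    rng = np.random.default_rng(5)
    n = 31                                             # e.g. one value per health centre
    coords = rng.uniform(0, 10, size=(n, 2))
    x = coords[:, 0] * 0.3 + rng.normal(size=n)        # variable with a mild spatial trend

    # Inverse-distance spatial weights, zero diagonal, rows standardized to sum to 1.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    with np.errstate(divide="ignore"):
        W = np.where(d > 0, 1.0 / d, 0.0)
    W /= W.sum(axis=1, keepdims=True)

    z = x - x.mean()
    moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
    expected = -1.0 / (n - 1)                          # expectation under no autocorrelation
    print(f"Moran's I = {moran_i:.3f} (expected {expected:.3f} under independence)")
    ```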

  5. A Statistics-Based Cracking Criterion of Resin-Bonded Silica Sand for Casting Process Simulation

    NASA Astrophysics Data System (ADS)

    Wang, Huimin; Lu, Yan; Ripplinger, Keith; Detwiler, Duane; Luo, Alan A.

    2017-02-01

    Cracking of sand molds/cores can result in many casting defects such as veining. A robust cracking criterion is needed in casting process simulation for predicting/controlling such defects. A cracking probability map, relating to fracture stress and effective volume, was proposed for resin-bonded silica sand based on Weibull statistics. Three-point bending test results of sand samples were used to generate the cracking map and set up a safety line for cracking criterion. Tensile test results confirmed the accuracy of the safety line for cracking prediction. A laboratory casting experiment was designed and carried out to predict cracking of a cup mold during aluminum casting. The stress-strain behavior and the effective volume of the cup molds were calculated using a finite element analysis code ProCAST®. Furthermore, an energy dispersive spectroscopy fractographic examination of the sand samples confirmed the binder cracking in resin-bonded silica sand.
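
    The Weibull-statistics criterion described above links cracking probability to both fracture stress and effective volume. The sketch below shows that relation and a derived "safety line" (allowable stress at a target cracking probability); the Weibull modulus, reference stress, and reference volume are illustrative values, not the fitted parameters from the three-point bending tests.

    ```python
    # Sketch of a Weibull cracking criterion: P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m).
    # Parameters below are illustrative, not the paper's fitted values.
    import numpy as np

    m = 8.0          # Weibull modulus (illustrative)
    sigma0 = 3.5     # characteristic strength in MPa at the reference volume (illustrative)
    V0 = 1.0e3       # reference effective volume in mm^3 (illustrative)

    def cracking_probability(sigma, volume):
        """Probability of cracking at stress sigma (MPa) for an effective volume (mm^3)."""
        return 1.0 - np.exp(-(volume / V0) * (sigma / sigma0) ** m)

    def allowable_stress(volume, p_allow=0.01):
        """Stress giving a target cracking probability: a 'safety line' for a given volume."""
        return sigma0 * (-np.log(1.0 - p_allow) * V0 / volume) ** (1.0 / m)

    volumes = np.array([5.0e2, 1.0e3, 5.0e3, 2.0e4])
    print("allowable stress (MPa) at 1% cracking probability:", allowable_stress(volumes).round(2))
    print("P_crack at 3 MPa, V = 2e4 mm^3:", round(float(cracking_probability(3.0, 2.0e4)), 3))
    ```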

  6. Methods for sampling geographically mobile female traders in an East African market setting

    PubMed Central

    Achiro, Lillian; Kwena, Zachary A.; McFarland, Willi; Neilands, Torsten B.; Cohen, Craig R.; Bukusi, Elizabeth A.; Camlin, Carol S.

    2018-01-01

    Background The role of migration in the spread of HIV in sub-Saharan Africa is well-documented. Yet migration and HIV research have often focused on HIV risks to male migrants and their partners, or migrants overall, often failing to measure the risks to women via their direct involvement in migration. Inconsistent measures of mobility, gender biases in those measures, and limited data sources for sex-specific population-based estimates of mobility have contributed to a paucity of research on the HIV prevention and care needs of migrant and highly mobile women. This study addresses an urgent need for novel methods for developing probability-based, systematic samples of highly mobile women, focusing on a population of female traders operating out of one of the largest open air markets in East Africa. Our method involves three stages: 1.) identification and mapping of all market stall locations using Global Positioning System (GPS) coordinates; 2.) using female market vendor stall GPS coordinates to build the sampling frame using replicates; and 3.) using maps and GPS data for recruitment of study participants. Results The location of 6,390 vendor stalls were mapped using GPS. Of these, 4,064 stalls occupied by women (63.6%) were used to draw four replicates of 128 stalls each, and a fifth replicate of 15 pre-selected random alternates for a total of 527 stalls assigned to one of five replicates. Staff visited 323 stalls from the first three replicates and from these successfully recruited 306 female vendors into the study for a participation rate of 94.7%. Mobilization strategies and involving traders association representatives in participant recruitment were critical to the study’s success. Conclusion The study’s high participation rate suggests that this geospatial sampling method holds promise for development of probability-based samples in other settings that serve as transport hubs for highly mobile populations. PMID:29324780
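
    The sampling-frame construction described above (map all stalls, restrict to stalls occupied by women, and draw replicates of 128) can be sketched as follows. The coordinates, proportions, and replicate handling are simplified assumptions for illustration, not the study's actual frame or GPS data.

    ```python
    # Sketch of building a probability sample from a mapped frame of market stalls: filter to
    # stalls occupied by women, shuffle, and assign fixed-size replicates for field recruitment.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(6)
    n_stalls = 6390
    frame = pd.DataFrame({
        "stall_id": np.arange(n_stalls),
        "lat": rng.uniform(-0.105, -0.090, n_stalls),      # synthetic GPS coordinates
        "lon": rng.uniform(34.74, 34.76, n_stalls),
        "female_vendor": rng.uniform(size=n_stalls) < 0.636,
    })

    eligible = frame[frame["female_vendor"]].sample(frac=1.0, random_state=0)  # shuffle
    replicate_size = 128
    eligible = eligible.assign(
        replicate=np.arange(len(eligible)) // replicate_size + 1)

    # Field teams work through replicates in order until the target sample size is reached.
    print(eligible.groupby("replicate").size().head())
    ```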

  7. The impact of CmapTools utilization towards students' conceptual change on optics topic

    NASA Astrophysics Data System (ADS)

    Rofiuddin, Muhammad Rifqi; Feranie, Selly

    2017-05-01

    Science teachers need to help students identify their prior ideas and modify them based on scientific knowledge. This process is called conceptual change. One essential tool for analyzing students' conceptual change is the concept map. Concept maps are graphical representations of knowledge that are comprised of concepts and the relationships between them. Constructing concept maps is implemented by adapting the role of technology to support the learning process, in line with Educational Ministry Regulation No. 68 year 2013. The Institute for Human and Machine Cognition (IHMC) has developed CmapTools, client-server software for easily constructing and visualizing concept maps. This research aims to investigate secondary students' conceptual change after experiencing a five-stage conceptual teaching model utilizing CmapTools in learning optics. A weak experimental method with a one-group pretest-posttest design was implemented in this study to collect preliminary and post concept maps as qualitative data. The sample was taken purposively from 8th grade students (n = 22) at a private school in Bandung, West Java. Conceptual change, based on a comparison of the preliminary and post concept map constructions, was assessed with a rubric for concept map scoring and structure. Results show a significant conceptual change of 50.92%, elaborated by concept map elements: propositions and hierarchical levels in the high category, cross links in the medium category, and specific examples in the low category. The results are supported by the students' positive response to CmapTools utilization, which indicates improvements in motivation, interest, and behavior towards the physics lesson.

  8. Design and performance of an ultra-high vacuum scanning tunneling microscope operating at dilution refrigerator temperatures and high magnetic fields.

    PubMed

    Misra, S; Zhou, B B; Drozdov, I K; Seo, J; Urban, L; Gyenis, A; Kingsley, S C J; Jones, H; Yazdani, A

    2013-10-01

    We describe the construction and performance of a scanning tunneling microscope capable of taking maps of the tunneling density of states with sub-atomic spatial resolution at dilution refrigerator temperatures and high (14 T) magnetic fields. The fully ultra-high vacuum system features visual access to a two-sample microscope stage at the end of a bottom-loading dilution refrigerator, which facilitates the transfer of in situ prepared tips and samples. The two-sample stage enables location of the best area of the sample under study and extends the experiment lifetime. The successful thermal anchoring of the microscope, described in detail, is confirmed through a base temperature reading of 20 mK, along with a measured electron temperature of 250 mK. Atomically resolved images, along with complementary vibration measurements, are presented to confirm the effectiveness of the vibration isolation scheme in this instrument. Finally, we demonstrate that the microscope is capable of the same level of performance as typical machines with more modest refrigeration by measuring spectroscopic maps at base temperature both at zero field and in an applied magnetic field.

  9. Singular value decomposition approach to the yttrium occurrence in mineral maps of rare earth element ores using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Romppanen, Sari; Häkkänen, Heikki; Kaski, Saara

    2017-08-01

    Laser-induced breakdown spectroscopy (LIBS) has been used in analysis of rare earth element (REE) ores from the geological formation of Norra Kärr Alkaline Complex in southern Sweden. Yttrium has been detected in eudialyte (Na15 Ca6(Fe,Mn)3 Zr3Si(Si25O73)(O,OH,H2O)3 (OH,Cl)2) and catapleiite (Ca/Na2ZrSi3O9·2H2O). Singular value decomposition (SVD) has been employed in classification of the minerals in the rock samples and maps representing the mineralogy in the sampled area have been constructed. Based on the SVD classification the percentage of the yttrium-bearing ore minerals can be calculated even in fine-grained rock samples.
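
    A small sketch of SVD-based spectral classification of the kind named above follows: reference spectra define a low-dimensional basis, each pixel spectrum is projected onto it, and the nearest class centroid is assigned to build a mineral map. The Gaussian "spectra", peak positions, and grid are synthetic stand-ins, not LIBS measurements of eudialyte or catapleiite.

    ```python
    # Sketch of SVD-based classification for spectral mapping: decompose reference spectra,
    # project each map-pixel spectrum onto the leading right singular vectors, and assign the
    # nearest reference class. All spectra are synthetic Gaussian peaks.
    import numpy as np

    rng = np.random.default_rng(7)
    wavelengths = np.linspace(300.0, 800.0, 500)

    def fake_spectrum(centres, noise=0.02):
        s = sum(np.exp(-0.5 * ((wavelengths - c) / 2.0) ** 2) for c in centres)
        return s + noise * rng.normal(size=wavelengths.size)

    classes = {"eudialyte": [410.0, 437.0, 520.0], "catapleiite": [350.0, 472.0], "host_rock": [390.0]}
    refs = np.array([fake_spectrum(c) for c in classes.values()])

    # Basis from the reference spectra; keep the two leading components.
    U, s, Vt = np.linalg.svd(refs, full_matrices=False)
    basis = Vt[:2]                                       # (2, n_wavelengths)
    centroids = refs @ basis.T                           # class centroids in the reduced space

    # Classify a small grid of measured spectra (here: noisy copies of the references).
    pixels = np.array([fake_spectrum(classes[k], noise=0.1)
                       for k in rng.choice(list(classes), size=25)])
    scores = pixels @ basis.T
    labels = np.argmin(np.linalg.norm(scores[:, None, :] - centroids[None, :, :], axis=-1), axis=1)
    mineral_map = np.array(list(classes))[labels].reshape(5, 5)
    print(mineral_map)
    ```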

  10. Learning Efficiency of Two ICT-Based Instructional Strategies in Greek Sheep Farmers

    ERIC Educational Resources Information Center

    Bellos, Georgios; Mikropoulos, Tassos A.; Deligeorgis, Stylianos; Kominakis, Antonis

    2016-01-01

    Purpose: The objective of the present study was to compare the learning efficiency of two information and communications technology (ICT)-based instructional strategies (multimedia presentation (MP) and concept mapping) in a sample (n = 187) of Greek sheep farmers operating mainly in Western Greece. Design/methodology/approach: In total, 15…

  11. Development of a Neutron Diffraction Based Experimental Capability for Investigating Hydraulic Fracturing for EGS-like Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polsky, Yarom; Anovitz, Lawrence; An, Ke

    2013-01-01

    Hydraulic fracturing to enhance formation permeability is an established practice in the Oil & Gas (O&G) industry and is expected to be an enabler for EGS. However, it is rarely employed in conventional geothermal systems and there are significant questions regarding the translation of practice from O&G to both conventional geothermal and EGS applications. Lithological differences (sedimentary versus crystalline rocks), significantly greater formation temperatures, and different desired fracture characteristics are among a number of factors that are likely to result in a gap of understanding of how to manage hydraulic fracturing practice for geothermal. Whereas the O&G community has had both the capital and the opportunity to develop its understanding of hydraulic fracturing operations empirically in the field as well as through extensive R&D efforts, field testing opportunities for EGS are likely to be minimal due to the high expense of hydraulic fracturing field trials. A significant portion of the knowledge needed to guide the management of geothermal/EGS hydraulic fracturing operations will therefore likely have to come from experimental efforts and simulation. This paper describes ongoing efforts at Oak Ridge National Laboratory (ORNL) to develop an experimental capability to map the internal stresses/strains in core samples subjected to triaxial stress states and temperatures representative of EGS-like conditions using neutron diffraction based strain mapping techniques. This capability is being developed at ORNL's Spallation Neutron Source, the world's most powerful pulsed neutron source, and is still in a proof of concept phase. A specialized pressure cell has been developed that permits independent radial and axial fluid pressurization of core samples, with axial flow-through capability and a temperature rating up to 300 degrees C. This cell will ultimately be used to hydraulically pressurize EGS-representative core samples to conditions of imminent fracture and map the associated internal strain states of the sample. This will hopefully enable a more precise mapping of the rock material failure envelope, facilitate a more refined understanding of the mechanism of hydraulically induced rock fracture, particularly in crystalline rocks, and serve as a platform for validating and improving fracture simulation codes. The elements of the research program and preliminary strain mapping results of a Sierra White granite sample subjected only to compressive loading will be discussed in this paper.
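
    As a complement to the capability description above, the basic reduction from measured diffraction angles to a strain map is compact: Bragg's law converts each measured peak position to a lattice spacing d, and strain follows as (d - d0)/d0 relative to a stress-free reference. The wavelength, reference spacing, and angles below are illustrative numbers, not ORNL measurements.

    ```python
    # Sketch of diffraction-based strain mapping: Bragg's law gives the lattice spacing at each
    # gauge volume, and strain is the relative change from a stress-free reference spacing.
    import numpy as np

    wavelength = 1.54e-10            # neutron wavelength in metres (illustrative)
    d0 = 1.920e-10                   # stress-free reference lattice spacing in metres (illustrative)

    # Measured Bragg angles (2-theta, degrees) on a small grid of gauge volumes in the core.
    two_theta = np.array([[47.30, 47.28, 47.26],
                          [47.29, 47.25, 47.22],
                          [47.27, 47.23, 47.20]])

    d = wavelength / (2.0 * np.sin(np.radians(two_theta) / 2.0))   # Bragg's law, n = 1
    strain_map = (d - d0) / d0                                     # dimensionless lattice strain
    print(np.round(strain_map * 1e6, 0), "(microstrain)")
    ```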

  12. Using Python to generate AHPS-based precipitation simulations over CONUS using Amazon distributed computing

    NASA Astrophysics Data System (ADS)

    Machalek, P.; Kim, S. M.; Berry, R. D.; Liang, A.; Small, T.; Brevdo, E.; Kuznetsova, A.

    2012-12-01

    We describe how The Climate Corporation uses Python and Clojure, a language implemented on top of Java, to generate climatological forecasts for precipitation based on the Advanced Hydrologic Prediction Service (AHPS) radar-based daily precipitation measurements. A 2-year-long forecast is generated on each of the ~650,000 CONUS land-based 4-km AHPS grids by constructing 10,000 ensembles sampled from a 30-year reconstructed AHPS history for each grid. The spatial and temporal correlations between neighboring AHPS grids and the sampling of the analogues are handled by Python. The parallelization for all the 650,000 CONUS stations is further achieved by utilizing the MapReduce framework (http://code.google.com/edu/parallel/mapreduce-tutorial.html). Each full-scale computational run requires hundreds of nodes with up to 8 processors each on the Amazon Elastic MapReduce (http://aws.amazon.com/elasticmapreduce/) distributed computing service, resulting in 3-terabyte datasets. We further describe how we have productionalized a monthly run of the simulation process at the full scale of the 4-km AHPS grids and how the resultant terabyte-sized datasets are handled.
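
    The per-grid ensemble step described above, drawing many plausible precipitation futures from a multi-decade history, can be sketched for a single grid cell. The history is synthetic and the resampling scheme is a simplified assumption; in production each grid cell would be one task in the MapReduce job, and this is not The Climate Corporation's actual code.

    ```python
    # Sketch of per-grid ensemble generation: for a 2-year horizon, resample each forecast day's
    # precipitation from the same calendar day in a random year of a 30-year history, repeated
    # for many ensemble members. Single synthetic grid cell only.
    import numpy as np

    rng = np.random.default_rng(8)
    n_years, days_per_year = 30, 365
    history = rng.gamma(0.3, 6.0, size=(n_years, days_per_year))   # mm/day, one AHPS grid cell

    n_members, horizon_days = 10000, 2 * days_per_year
    day_of_year = np.arange(horizon_days) % days_per_year

    # For every forecast day, draw the value observed on that calendar day in a random year.
    year_draws = rng.integers(0, n_years, size=(n_members, horizon_days))
    ensemble = history[year_draws, day_of_year]                    # shape (members, days)

    print("ensemble shape:", ensemble.shape)
    print("median 2-year total precip (mm):", round(float(np.median(ensemble.sum(axis=1))), 1))
    ```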

  13. Comparing the detection of iron-based pottery pigment on a carbon-coated sherd by SEM-EDS and by Micro-XRF-SEM.

    PubMed

    Pendleton, Michael W; Washburn, Dorothy K; Ellis, E Ann; Pendleton, Bonnie B

    2014-03-01

    The same sherd was analyzed using a scanning electron microscope with energy dispersive spectroscopy (SEM-EDS) and a micro X-ray fluorescence tube attached to a scanning electron microscope (Micro-XRF-SEM) to compare the effectiveness of elemental detection of iron-based pigment. To enhance SEM-EDS mapping, the sherd was carbon coated. The carbon coating was not required to produce Micro-XRF-SEM maps but was applied to maintain an unbiased comparison between the systems. The Micro-XRF-SEM analysis was capable of lower limits of detection than that of the SEM-EDS system, and therefore the Micro-XRF-SEM system could produce elemental maps of elements not easily detected by SEM-EDS mapping systems. Because SEM-EDS and Micro-XRF-SEM have been used for imaging and chemical analysis of biological samples, this comparison of the detection systems should be useful to biologists, especially those involved in bone or tooth (hard tissue) analysis.

  14. Comparing the Detection of Iron-Based Pottery Pigment on a Carbon-Coated Sherd by SEM-EDS and by Micro-XRF-SEM

    PubMed Central

    Pendleton, Michael W.; Washburn, Dorothy K.; Ellis, E. Ann; Pendleton, Bonnie B.

    2014-01-01

    The same sherd was analyzed using a scanning electron microscope with energy dispersive spectroscopy (SEM-EDS) and a micro X-ray fluorescence tube attached to a scanning electron microscope (Micro-XRF-SEM) to compare the effectiveness of elemental detection of iron-based pigment. To enhance SEM-EDS mapping, the sherd was carbon coated. The carbon coating was not required to produce Micro-XRF-SEM maps but was applied to maintain an unbiased comparison between the systems. The Micro-XRF-SEM analysis was capable of lower limits of detection than that of the SEM-EDS system, and therefore the Micro-XRF-SEM system could produce elemental maps of elements not easily detected by SEM-EDS mapping systems. Because SEM-EDS and Micro-XRF-SEM have been used for imaging and chemical analysis of biological samples, this comparison of the detection systems should be useful to biologists, especially those involved in bone or tooth (hard tissue) analysis. PMID:24600333

  15. Comparison of two Classification methods (MLC and SVM) to extract land use and land cover in Johor Malaysia

    NASA Astrophysics Data System (ADS)

    Rokni Deilmai, B.; Ahmad, B. Bin; Zabihi, H.

    2014-06-01

    Mapping is essential for the analysis of land use and land cover, which influence many environmental processes and properties. For the creation of land cover maps, it is important to minimize error, since these errors will propagate into later analyses based on the maps. The reliability of land cover maps derived from remotely sensed data depends on an accurate classification. In this study, we have analyzed multispectral data using two different classifiers, the Maximum Likelihood Classifier (MLC) and the Support Vector Machine (SVM). To pursue this aim, Landsat Thematic Mapper data and identical field-based training sample datasets in Johor, Malaysia were used for each classification method, resulting in five land cover classes: forest, oil palm, urban area, water, and rubber. Classification results indicate that SVM was more accurate than MLC. With a demonstrated capability to produce reliable cover results, the SVM method should be especially useful for land cover classification.
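
    Because the record above compares a maximum likelihood classifier with an SVM on the same training samples, a compact sketch of that comparison is shown below. The Gaussian maximum-likelihood step is represented by quadratic discriminant analysis (equivalent in form to the classic MLC); the band values and class structure are synthetic, not the Johor Landsat TM data.

    ```python
    # Sketch of an MLC-versus-SVM comparison on identical training samples: quadratic
    # discriminant analysis stands in for the per-class Gaussian maximum-likelihood classifier.
    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(9)
    classes = ["forest", "oil_palm", "urban", "water", "rubber"]
    X = np.vstack([rng.normal(loc=i * 2.0, scale=1.5, size=(400, 6)) for i in range(len(classes))])
    y = np.repeat(classes, 400)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
    mlc = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)
    svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

    print("Gaussian ML (MLC-like) accuracy:", round(accuracy_score(y_te, mlc.predict(X_te)), 3))
    print("SVM accuracy:", round(accuracy_score(y_te, svm.predict(X_te)), 3))
    ```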

  16. Phenology-based Spartina alterniflora mapping in coastal wetland of the Yangtze Estuary using time series of GaoFen satellite no. 1 wide field of view imagery

    NASA Astrophysics Data System (ADS)

    Ai, Jinquan; Gao, Wei; Gao, Zhiqiang; Shi, Runhe; Zhang, Chao

    2017-04-01

    Spartina alterniflora is an aggressive invasive plant species that replaces native species, changes the structure and function of the ecosystem across coastal wetlands in China, and is thus a major conservation concern. Mapping the spread of its invasion is a necessary first step for the implementation of effective ecological management strategies. The performance of a phenology-based approach for S. alterniflora mapping is explored in the coastal wetland of the Yangtze Estuary using a time series of GaoFen satellite no. 1 wide field of view camera (GF-1 WFV) imagery. First, a time series of the normalized difference vegetation index (NDVI) was constructed to evaluate the phenology of S. alterniflora. Two phenological stages (the senescence stage from November to mid-December and the green-up stage from late April to May) were determined as important for S. alterniflora detection in the study area based on NDVI temporal profiles, spectral reflectance curves of S. alterniflora and its coexistent species, and field surveys. Three phenology feature sets representing three major phenology-based detection strategies were then compared to map S. alterniflora: (1) the single-date imagery acquired within the optimal phenological window, (2) the multitemporal imagery, including four images from the two important phenological windows, and (3) the monthly NDVI time series imagery. Support vector machines and maximum likelihood classifiers were applied to each phenology feature set at different training sample sizes. For all phenology feature sets, the overall results were produced consistently with high mapping accuracies under sufficient training sample sizes, although significantly improved classification accuracies (10%) were obtained when the monthly NDVI time series imagery was employed. The optimal single-date imagery had the lowest accuracies of all detection strategies. The multitemporal analysis demonstrated little reduction in the overall accuracy compared with the use of monthly NDVI time series imagery. These results show the importance of considering the phenological stage when selecting imagery for mapping S. alterniflora using GF-1 WFV data. Furthermore, in light of the better tradeoff between the number of images and classification accuracy when using multitemporal GF-1 WFV imagery, we suggest using multitemporal imagery acquired at appropriate phenological windows for S. alterniflora mapping at regional scales.

  17. Evaluating Soil Health Using Remotely Sensed Evapotranspiration on the Benchmark Barnes Soils of North Dakota

    NASA Astrophysics Data System (ADS)

    Bohn, Meyer; Hopkins, David; Steele, Dean; Tuscherer, Sheldon

    2017-04-01

    The benchmark Barnes soil series is an extensive upland Hapludoll of the northern Great Plains that is both economically and ecologically vital to the region. Effects of tillage erosion coupled with wind and water erosion have degraded Barnes soil quality, but with unknown extent, distribution, or severity. Evidence of soil degradation documented for a half century warrants that the assumption of productivity be tested. Soil resilience is linked to several dynamic soil properties and National Cooperative Soil Survey initiatives are now focused on identifying those properties for benchmark soils. Quantification of soil degradation is dependent on a reliable method for broad-scale evaluation. The soil survey community is currently developing rapid and widespread soil property assessment technologies. Improvements in satellite based remote-sensing and image analysis software have stimulated the application of broad-scale resource assessment. Furthermore, these technologies have fostered refinement of land-based surface energy balance algorithms, i.e. Mapping Evapotranspiration at High Resolution with Internalized Calibration (METRIC) algorithm for evapotranspiration (ET) mapping. The hypothesis of this study is that ET mapping technology can differentiate soil function on extensive landscapes and identify degraded areas. A recent soil change study in eastern North Dakota resampled legacy Barnes pedons sampled prior to 1960 and found significant decreases in organic carbon. An ancillary study showed that evapotranspiration (ET) estimates from METRIC decreased with Barnes erosion class severity. An ET raster map has been developed for three eastern North Dakota counties using METRIC and Landsat 5 imagery. ET pixel candidates on major Barnes soil map units were stratified into tertiles and classified as ranked ET subdivisions. A sampling population of randomly selected points stratified by ET class and county proportion was established. Morphologic and chemical data will be recorded at each sampling site to test whether soil properties correlate to ET, thus serving as a non-biased proxy for soil health.
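
    The sampling design described above, stratifying candidate ET pixels into tertiles and drawing a stratified random sample of field sites, can be sketched briefly. The ET values, pixel counts, and the proportional allocation of 120 sites are illustrative assumptions, not the METRIC/Landsat 5 raster or the study's actual allocation.

    ```python
    # Sketch of tertile stratification of ET pixels followed by a stratified random sample of
    # field points with allocation proportional to stratum size. All values are synthetic.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(10)
    pixels = pd.DataFrame({
        "pixel_id": np.arange(20000),
        "et_mm_day": rng.normal(4.0, 1.2, size=20000).clip(min=0),
    })

    # Rank ET into tertiles (low / medium / high ET classes).
    pixels["et_class"] = pd.qcut(pixels["et_mm_day"], q=3, labels=["low", "medium", "high"])

    # Proportional allocation of 120 field sites across the three ET classes.
    n_sites = 120
    sample = pd.concat([
        g.sample(n=round(n_sites * len(g) / len(pixels)), random_state=0)
        for _, g in pixels.groupby("et_class", observed=True)
    ])
    print(sample["et_class"].value_counts())
    ```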

  18. Surficial geologic map of the Amboy 30' x 60' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2010-01-01

    The surficial geologic map of the Amboy 30' x 60' quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of southern California. This map consists of new surficial mapping conducted between 2000 and 2007, as well as compilations from previous surficial mapping. Surficial geologic units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects following deposition, and, where appropriate, the lithologic nature of the material. Many physical properties were noted and measured during the geologic mapping. This information was used to classify surficial deposits and to understand their ecological importance. We focus on physical properties that drive hydrologic, biologic, and physical processes such as particle-size distribution (PSD) and bulk density. The database contains point data representing the locations of samples for both laboratory-determined physical properties and semiquantitative field-based information. We include the locations of all field observations and note the type of information collected in the field to assist in assessing the quality of the mapping. The publication is separated into three parts: documentation, spatial data, and printable map graphics of the database. Documentation includes this pamphlet, which provides a discussion of the surficial geology and units and the map. Spatial data are distributed as an ArcGIS Geodatabase in Microsoft Access format and are accompanied by a readme file, which describes the database contents, and FGDC metadata for the spatial map information. Map graphics files are distributed as Postscript and Adobe Portable Document Format (PDF) files that provide a view of the spatial database at the mapped scale.

  19. Experiences from using Autonomous Underwater Vehicles and Synthetic Aperture Sonar for Sediment and Habitat Mapping

    NASA Astrophysics Data System (ADS)

    Thorsnes, T.; Bjarnadóttir, L. R.

    2017-12-01

    Emerging platforms and tools like autonomous underwater vehicles and synthetic aperture sonars provide interesting opportunities for making seabed mapping more efficient and precise. Sediment grain-size maps are an important product in their own right and a key input for habitat and biotope maps. National and regional mapping programmes are tasked with mapping large areas, and survey efficiency, data quality, and resulting map confidence are important considerations when selecting the mapping strategy. Since 2005, c. 175,000 square kilometres of the Norwegian continental shelf and continental slope have been mapped with respect to sediments, habitats and biodiversity, and pollution under the MAREANO programme (www.mareano.no). At present the sediment mapping is based on a combination of ship-borne multibeam bathymetry and backscatter, visual documentation using a towed video platform, and grab sampling. We have now tested a new approach, using an Autonomous Underwater Vehicle (AUV) as the survey platform for the collection of acoustic data (Synthetic Aperture Sonar (SAS), EM2040 bathymetry and backscatter) and visual data (still images using a TFish colour photo system). This pilot project was conducted together with the Norwegian Hydrographic Service, the Institute of Marine Research (biology observations) and the Norwegian Defence Research Establishment (operation of ship and AUV). The test site reported here is the Vesterdjupet area, offshore Lofoten, northern Norway. The water depth is between 170 and 300 metres, with sediments ranging from gravel, cobbles and boulders to sandy mud. A cold-water coral reef, associated with bioclastic sediments, was also present in the study area. The presentation will give an overview of the main findings and experiences gained from this pilot project with a focus on geological mapping and will also discuss the relevance of AUV-based mapping to large-area mapping programmes like MAREANO.

  20. Geographical Information Systems (GIS) Mapping of Environmental Samples across College Campuses

    ERIC Educational Resources Information Center

    Purvis-Roberts, Kathleen L.; Moeur, Harriet P.; Zanella, Andrew

    2007-01-01

    In this laboratory experiment, students take environmental samples at various locations around the college campuses, take geospatial coordinates with a global position systems (GPS) unit, and map their results on a geo-referenced campus map with geographical information systems (GIS) software. Nitrogen dioxide air pollution sampling is used as an…

  1. Mapping of fluoride endemic areas and correlation studies of fluoride with other quality parameters of drinking water of Veppanapalli block of Dharmapuri district in Tamil Nadu.

    PubMed

    Karthikeyan, G; Sundarraj, A Shunmuga; Elango, K P

    2003-10-01

    193 drinking water samples from water sources of 27 panchayats of Veppanapalli block of Dharmapuri district of Tamil Nadu were analysed for chemical quality parameters. Based on the fluoride content of the water sources, fluoride maps differentiating regions with high / low fluoride levels were prepared using the isopleth mapping technique. The interdependence among the important chemical quality parameters was assessed using correlation studies. The experimental results of the application of linear and multiple regression equations on the influence of hardness, alkalinity, total dissolved solids and pH on fluoride are discussed.
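
    The correlation and regression analysis summarized above can be sketched as follows; the DataFrame and column names ("fluoride", "hardness", "alkalinity", "tds", "ph") are hypothetical placeholders for the measured parameters, not names taken from the study.

      import pandas as pd
      import statsmodels.api as sm

      def fluoride_relationships(water: pd.DataFrame):
          cols = ["fluoride", "hardness", "alkalinity", "tds", "ph"]
          corr = water[cols].corr()                          # pairwise Pearson correlations
          X = sm.add_constant(water[cols[1:]])               # predictors with intercept
          model = sm.OLS(water["fluoride"], X).fit()         # multiple linear regression
          return corr, model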

  2. Mapping moderate-scale land-cover over very large geographic areas within a collaborative framework: A case study of the Southwest Regional Gap Analysis Project (SWReGAP)

    USGS Publications Warehouse

    Lowry, J.; Ramsey, R.D.; Thomas, K.; Schrupp, D.; Sajwaj, T.; Kirby, J.; Waller, E.; Schrader, S.; Falzarano, S.; Langs, L.; Manis, G.; Wallace, C.; Schulz, K.; Comer, P.; Pohs, K.; Rieth, W.; Velasquez, C.; Wolk, B.; Kepner, W.; Boykin, K.; O'Brien, L.; Bradford, D.; Thompson, B.; Prior-Magee, J.

    2007-01-01

    Land-cover mapping efforts within the USGS Gap Analysis Program have traditionally been state-centered, with each state having the responsibility of implementing a project design for the geographic area within its boundaries. The Southwest Regional Gap Analysis Project (SWReGAP) was the first formal GAP project designed at a regional, multi-state scale. The project area comprises the southwestern states of Arizona, Colorado, Nevada, New Mexico, and Utah. The land-cover map/dataset was generated using regionally consistent geospatial data (Landsat ETM+ imagery (1999-2001) and DEM derivatives), similar field data collection protocols, a standardized land-cover legend, and a common modeling approach (decision tree classifier). Partitioning of mapping responsibilities amongst the five collaborating states was organized around ecoregion-based "mapping zones". Over the course of 2.5 field seasons approximately 93,000 reference samples were collected directly, or obtained from other contemporary projects, for the land-cover modeling effort. The final map was made public in 2004 and contains 125 land-cover classes. An internal validation of 85 of the classes, representing 91% of the land area, was performed. Agreement between withheld samples and the validated dataset was 61% (KHAT = .60, n = 17,030). This paper presents an overview of the methodologies used to create the regional land-cover dataset and highlights issues associated with large-area mapping within a coordinated, multi-institutional management framework. © 2006 Elsevier Inc. All rights reserved.

  3. EnviroAtlas -Tampa, FL- One Meter Resolution Urban Land Cover (2010)

    EPA Pesticide Factsheets

    The EnviroAtlas Tampa, FL land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from April-May 2010 at 1 m spatial resolution. Eight land cover classes were mapped: impervious surface, soil and barren, grass and herbaceous, trees and forest, water, agriculture, woody wetland, and emergent wetland. The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Tampa, and includes the cities of Clearwater and St. Petersburg, as well as additional outlying areas. An accuracy assessment using a stratified random sampling of 600 samples (100 per class) yielded an overall accuracy of 70.67 percent and an area weighted accuracy of 81.87 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
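
    Accuracy figures of this kind can be reproduced from a confusion matrix along the lines of the sketch below, which assumes rows are mapped classes, columns are reference classes, and that the area-weighted accuracy weights each class's user's accuracy by its share of the mapped area; this is one common convention, not necessarily the exact EnviroAtlas procedure.

      import numpy as np

      def map_accuracies(confusion: np.ndarray, area_share: np.ndarray):
          # confusion: (k, k) counts from the stratified random sample.
          # area_share: mapped-area fraction of each class (sums to 1).
          overall = np.trace(confusion) / confusion.sum()
          users_acc = np.diag(confusion) / confusion.sum(axis=1)
          area_weighted = float(np.sum(users_acc * area_share))
          return overall, area_weighted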

  4. Single-breath-hold abdominal [Formula: see text]  mapping using 3D Cartesian Look-Locker with spatiotemporal sparsity constraints.

    PubMed

    Lugauer, Felix; Wetzl, Jens; Forman, Christoph; Schneider, Manuel; Kiefer, Berthold; Hornegger, Joachim; Nickel, Dominik; Maier, Andreas

    2018-06-01

    Our aim was to develop and validate a 3D Cartesian Look-Locker [Formula: see text] mapping technique that achieves high accuracy and whole-liver coverage within a single breath-hold. The proposed method combines sparse Cartesian sampling based on a spatiotemporally incoherent Poisson pattern and k-space segmentation, dedicated for high-temporal-resolution imaging. This combination allows capturing tissue with short relaxation times with volumetric coverage. A joint reconstruction of the 3D + inversion time (TI) data via compressed sensing exploits the spatiotemporal sparsity and ensures consistent quality for the subsequent multistep [Formula: see text] mapping. Data from the National Institute of Standards and Technology (NIST) phantom and 11 volunteers, along with reference 2D Look-Locker acquisitions, are used for validation. 2D and 3D methods are compared based on [Formula: see text] values in different abdominal tissues at 1.5 and 3 T. [Formula: see text] maps obtained from the proposed 3D method compare favorably with those from the 2D reference and additionally allow for reformatting or volumetric analysis. Excellent agreement is shown in phantom [bias[Formula: see text] < 2%, bias[Formula: see text] < 5% for (120; 2000) ms] and volunteer data (3D and 2D deviation < 4% for liver, muscle, and spleen) for clinically acceptable scan (20 s) and reconstruction times (< 4 min). Whole-liver [Formula: see text] mapping with high accuracy and precision is feasible in one breath-hold using spatiotemporally incoherent, sparse 3D Cartesian sampling.

  5. Effectiveness of Vegetation Index Transformation for Land Use Identifying and Mapping in the Area of Oil palm Plantation based on SPOT-6 Imagery (Case Study: PT.Tunggal Perkasa Plantations, Air Molek, Indragiri Hulu)

    NASA Astrophysics Data System (ADS)

    Setyowati, H. A.; S, S. H. Murti B.; Sukentyas, E. S.

    2016-11-01

    Land surface, atmospheric, and vegetation conditions affect the reflectance values recorded in remote sensing imagery and can therefore affect the outcome of information extraction methods such as multispectral classification. This study assesses the ability of a generic vegetation index transformation (the Wide Dynamic Range Vegetation Index), a transformation that reduces the influence of the atmosphere (the Atmospherically Resistant Vegetation Index), and a transformation that reduces the influence of the soil background (the Second Modified Soil Adjusted Vegetation Index) to identify and map land use in an oil palm plantation area, based on a SPOT-6 image acquired on June 13, 2013 and archived by LAPAN. The study area is the oil palm plantation of PT. Tunggal Perkasa Plantations, Air Molek, Indragiri Hulu, Riau Province. The ARVI, MSAVI2, and WDRVI vegetation index transformations were applied, samples were selected by stratified random sampling, and mapping accuracy was tested with a confusion matrix. The results showed that the best vegetation index transformation for identifying and mapping land use in the plantation area is ARVI, with an overall accuracy of 96%. Mapping accuracies by land use class were: settlements 100%, replanting 82.35%, young oil palm 81.25%, old oil palm 99.46%, bush 100%, water bodies 100%, and bare soil 100%.
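
    The three vegetation index transformations named above follow standard published formulas, sketched below for reflectance arrays from the SPOT-6 blue, red, and near-infrared bands; the gamma and alpha coefficients are common default values, not values reported in this study.

      import numpy as np

      def arvi(nir, red, blue, gamma=1.0):
          rb = red - gamma * (blue - red)                    # atmosphere-corrected red band
          return (nir - rb) / (nir + rb + 1e-9)

      def msavi2(nir, red):
          return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

      def wdrvi(nir, red, alpha=0.2):
          return (alpha * nir - red) / (alpha * nir + red + 1e-9)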

  6. Methodological considerations for the evaluation of EEG mapping data: a practical example based on a placebo/diazepam crossover trial.

    PubMed

    Jähnig, P; Jobert, M

    1995-01-01

    Quantitative EEG is a sensitive method for measuring pharmacological effects on the central nervous system. Nowadays, computers enable EEG data to be stored and spectral parameters to be computed for signals obtained from a large number of electrode locations. However, the statistical analysis of such vast amounts of EEG data is complicated due to the limited number of subjects usually involved in pharmacological studies. In the present study, data from a trial aimed at comparing diazepam and placebo were used to investigate different properties of EEG mapping data and to compare different methods of data analysis. Both the topography and the temporal changes of EEG activity were investigated using descriptive data analysis, which is based on an inspection of patterns of pd values (descriptive p values) assessed for all pair-wise tests for differences in time or treatment. An empirical measure (tri-mean) for the computation of group maps is suggested, allowing a better description of group effects with skewed data and small sample sizes. Finally, both the investigation of maps based on principal component analysis and the notion of distance between maps are discussed and applied to the analysis of the data collected under diazepam treatment, exemplifying the evaluation of pharmacodynamic drug effects.
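
    If the tri-mean referred to above is Tukey's trimean, a group map can be built per electrode as in the sketch below; the input array name and shape are assumptions.

      import numpy as np

      def trimean_group_map(power):
          # power: (n_subjects, n_electrodes) spectral parameter values.
          q1, med, q3 = np.percentile(power, [25, 50, 75], axis=0)
          return (q1 + 2 * med + q3) / 4.0      # robust to skew in small samples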

  7. Asteroid orbital inversion using uniform phase-space sampling

    NASA Astrophysics Data System (ADS)

    Muinonen, K.; Pentikäinen, H.; Granvik, M.; Oszkiewicz, D.; Virtanen, J.

    2014-07-01

    We review statistical inverse methods for asteroid orbit computation from a small number of astrometric observations and short time intervals of observations. With the help of Markov-chain Monte Carlo methods (MCMC), we present a novel inverse method that utilizes uniform sampling of the phase space for the orbital elements. The statistical orbital ranging method (Virtanen et al. 2001, Muinonen et al. 2001) set out to resolve the long-lasting challenges in the initial computation of orbits for asteroids. The ranging method starts from the selection of a pair of astrometric observations. Thereafter, the topocentric ranges and angular deviations in R.A. and Decl. are randomly sampled. The two Cartesian positions allow for the computation of orbital elements and, subsequently, the computation of ephemerides for the observation dates. Candidate orbital elements are included in the sample of accepted elements if the χ^2-value between the observed and computed observations is within a pre-defined threshold. The sample orbital elements obtain weights based on a certain debiasing procedure. When the weights are available, the full sample of orbital elements allows the probabilistic assessments for, e.g., object classification and ephemeris computation as well as the computation of collision probabilities. The MCMC ranging method (Oszkiewicz et al. 2009; see also Granvik et al. 2009) replaces the original sampling algorithm described above with a proposal probability density function (p.d.f.), and a chain of sample orbital elements results in the phase space. MCMC ranging is based on a bivariate Gaussian p.d.f. for the topocentric ranges, and allows for the sampling to focus on the phase-space domain with most of the probability mass. In the virtual-observation MCMC method (Muinonen et al. 2012), the proposal p.d.f. for the orbital elements is chosen to mimic the a posteriori p.d.f. for the elements: first, random errors are simulated for each observation, resulting in a set of virtual observations; second, corresponding virtual least-squares orbital elements are derived using the Nelder-Mead downhill simplex method; third, repeating the procedure two times allows for a computation of a difference for two sets of virtual orbital elements; and, fourth, this orbital-element difference constitutes a symmetric proposal in a random-walk Metropolis-Hastings algorithm, avoiding the explicit computation of the proposal p.d.f. In a discrete approximation, the allowed proposals coincide with the differences that are based on a large number of pre-computed sets of virtual least-squares orbital elements. The virtual-observation MCMC method is thus based on the characterization of the relevant volume in the orbital-element phase space. Here we utilize MCMC to map the phase-space domain of acceptable solutions. We can make use of the proposal p.d.f.s from the MCMC ranging and virtual-observation methods. The present phase-space mapping produces, upon convergence, a uniform sampling of the solution space within a pre-defined χ^2-value. The weights of the sampled orbital elements are then computed on the basis of the corresponding χ^2-values. The present method resembles the original ranging method. On one hand, MCMC mapping is insensitive to local extrema in the phase space and efficiently maps the solution space. This is somewhat contrary to the MCMC methods described above. On the other hand, MCMC mapping can suffer from producing a small number of sample elements with small χ^2-values, in resemblance to the original ranging method. We apply the methods to example near-Earth, main-belt, and transneptunian objects, and highlight the utilization of the methods in the data processing and analysis pipeline of the ESA Gaia space mission.
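
    A hedged sketch of the mapping idea described above (not the authors' code): a random-walk chain whose target is uniform over the orbital-element region with chi-square below a preset threshold, so that, upon convergence, the accepted samples tile the acceptable solution space. The chi2 function, starting point, and proposal scale are assumed inputs.

      import numpy as np

      def mcmc_phase_space_map(chi2, x0, proposal_scale, threshold,
                               n_steps=100_000, seed=0):
          rng = np.random.default_rng(seed)
          x = np.asarray(x0, dtype=float)
          samples = []
          for _ in range(n_steps):
              # Symmetric Gaussian random walk in orbital-element space.
              cand = x + proposal_scale * rng.standard_normal(x.size)
              # Uniform target inside the chi-square bound: accept iff inside.
              if chi2(cand) <= threshold:
                  x = cand
              samples.append(x.copy())
          return np.array(samples)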

  8. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
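
    The first workflow step (normalization of fragment-level intensities by total intensity sums) can be illustrated as below; the matrix layout is an assumption, and the retention-time-local variant and downstream model-based testing are not reproduced here.

      import numpy as np

      def total_intensity_normalize(intensity):
          # intensity: (n_fragments, n_samples) fragment-level intensity matrix.
          totals = intensity.sum(axis=0)                 # per-sample total intensity
          return intensity * (totals.mean() / totals)    # bring samples to a common scale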

  9. Stable Isotope Mapping of Alaskan Grasses and Marijuana

    NASA Astrophysics Data System (ADS)

    Booth, A. L.; Wooller, M. J.

    2008-12-01

    The spatial variation of isotope signatures in organic material is a useful forensic tool, particularly when applied to the task of tracking the production and distribution of plant-derived illicit drugs. In order to identify the likely grow-locations of drugs such as marijuana from unknown locations (i.e., confiscated during trafficking), base isotope maps are needed that include measurements of plants from known grow-locations. This task is logistically challenging in remote, large regions such as Alaska. We are therefore investigating the potential of supplementing our base (marijuana) isotope maps with data derived from other plants from known locations and with greater spatial coverage in Alaska. These currently include >150 samples of modern C3 grasses (Poaceae) as well as marijuana samples (n = 18) from known grow-locations across the state. We conducted oxygen, carbon and nitrogen stable isotope analyses of marijuana and grasses (Poaceae). Poaceae samples were obtained from the University of Alaska Fairbanks (UAF) Museum of the North herbarium collection, originally collected by field botanists from around Alaska. Results indicate that the oxygen isotopic composition of these grasses range from 10‰ to 30‰, and broadly mirror the spatial pattern of water isotopes in Alaska. Our marijuana samples were confiscated around the state of Alaska and supplied to us by the UAF Police Department. δ13C, δ15N and δ18O values exhibit geographic patterns similar to the modern grasses, but carbon and nitrogen isotopes of some marijuana plants appear to be influenced by additional factors related to indoor growing conditions (supplementary CO2 sources and the application of organic fertilizer). As well as providing a potential forensic resource, our Poaceae isotope maps could serve additional value by providing resources for studying ecosystem nutrient cycling, for tracing natural ecological processes (i.e., animal migration and food web dynamics) and providing modern data for comparison with isotope analyses conducted on fossil leaf material in paleoecological studies.

  10. Toward accelerating landslide mapping with interactive machine learning techniques

    NASA Astrophysics Data System (ADS)

    Stumpf, André; Lachiche, Nicolas; Malet, Jean-Philippe; Kerle, Norman; Puissant, Anne

    2013-04-01

    Despite important advances in the development of more automated methods for landslide mapping from optical remote sensing images, the elaboration of inventory maps after major triggering events still remains a tedious task. Image classification with expert-defined rules typically still requires significant manual labour for the elaboration and adaption of rule sets for each particular case. Machine learning algorithms, in contrast, have the ability to learn and identify complex image patterns from labelled examples but may require relatively large amounts of training data. In order to reduce the amount of required training data, active learning has evolved as a key concept to guide the sampling for applications such as document classification, genetics and remote sensing. The general underlying idea of most active learning approaches is to initialize a machine learning model with a small training set, and to subsequently exploit the model state and/or the data structure to iteratively select the most valuable samples that should be labelled by the user and added to the training set. With relatively few queries and labelled samples, an active learning strategy should ideally yield at least the same accuracy as an equivalent classifier trained with many randomly selected samples. Our study was dedicated to the development of an active learning approach for landslide mapping from VHR remote sensing images with special consideration of the spatial distribution of the samples. The developed approach is a region-based query heuristic that guides the user's attention towards a few compact spatial batches rather than distributed points, resulting in time savings of 50% and more compared to standard active learning techniques. The approach was tested with multi-temporal and multi-sensor satellite images capturing recent large-scale triggering events in Brazil and China and demonstrated balanced user's and producer's accuracies between 74% and 80%. The assessment also included an experimental evaluation of the uncertainties of manual mappings from multiple experts and demonstrated strong relationships between the uncertainty of the experts and the machine learning model.
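
    The generic active-learning loop that the study builds on can be sketched as below (uncertainty sampling with a random forest); the region-based, spatially compact query heuristic developed in the study is not reproduced, and all names are illustrative.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def uncertainty_active_learning(X_pool, oracle_labels, init_idx,
                                      n_rounds=10, batch=20):
          labelled = list(init_idx)
          clf = None
          for _ in range(n_rounds):
              clf = RandomForestClassifier(n_estimators=200, random_state=0)
              clf.fit(X_pool[labelled], oracle_labels[labelled])
              proba = np.sort(clf.predict_proba(X_pool), axis=1)
              margin = proba[:, -1] - proba[:, -2]       # small margin = high uncertainty
              seen = set(labelled)
              queries = [i for i in np.argsort(margin) if i not in seen][:batch]
              labelled.extend(queries)                   # labels provided by the user/oracle
          return clf, labelled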

  11. Characterizing regional soil mineral composition using spectroscopy and geostatistics

    USGS Publications Warehouse

    Mulder, V.L.; de Bruin, S.; Weyermann, J.; Kokaly, Raymond F.; Schaepman, M.E.

    2013-01-01

    This work aims at improving the mapping of major mineral variability at regional scale using scale-dependent spatial variability observed in remote sensing data. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data and statistical methods were combined with laboratory-based mineral characterization of field samples to create maps of the distributions of clay, mica and carbonate minerals and their abundances. The Material Identification and Characterization Algorithm (MICA) was used to identify the spectrally-dominant minerals in field samples; these results were combined with ASTER data using multinomial logistic regression to map mineral distributions. X-ray diffraction (XRD) was used to quantify mineral composition in field samples. XRD results were combined with ASTER data using multiple linear regression to map mineral abundances. We tested whether smoothing of the ASTER data to match the scale of variability of the target sample would improve model correlations. Smoothing was done with Fixed Rank Kriging (FRK) to represent the medium and long-range spatial variability in the ASTER data. Stronger correlations resulted using the smoothed data compared to results obtained with the original data. Highest model accuracies came from using both medium and long-range scaled ASTER data as input to the statistical models. High correlation coefficients were obtained for the abundances of calcite and mica (R2 = 0.71 and 0.70, respectively). Moderately-high correlation coefficients were found for smectite and kaolinite (R2 = 0.57 and 0.45, respectively). Maps of mineral distributions, obtained by relating ASTER data to MICA analysis of field samples, were found to characterize major soil mineral variability (overall accuracies for mica, smectite and kaolinite were 76%, 89% and 86% respectively). The results of this study suggest that the distributions of minerals and their abundances derived using FRK-smoothed ASTER data more closely match the spatial variability of soil and environmental properties at regional scale.

  12. Use of GIS-Based Sampling to Inform Food Security Assessments and Decision Making in Kenya

    NASA Astrophysics Data System (ADS)

    Wahome, A.; Ndubi, A. O.; Ndungu, L. W.; Mugo, R. M.; Flores Cordova, A. I.

    2017-12-01

    Kenya relies on agricultural production for supporting local consumption and other processing value chains. With a changing climate in a rain-fed agricultural production system, cropping zones are shifting, and proper decision making requires updated data. Where up-to-date data are not available, they should be generated and passed to relevant stakeholders to inform decision making, and the process of generating these data should be cost effective and less time consuming. The Kenyan State Department of Agriculture (SDA) runs an insurance programme for maize farmers in a number of counties in Kenya. Previously, SDA used a list of farmers to identify the crop fields for this insurance programme. However, listing all farmers in each Unit Area of Insurance (UAI) proved tedious and very costly, hence the need for an alternative, but acceptable, sampling methodology. Building on existing cropland maps, SERVIR, a joint NASA-USAID initiative that brings Earth observations (EO) to improved environmental decision making in developing countries, through its hub in Eastern and Southern Africa, developed a high-resolution map based on 10 m Sentinel satellite images, from which a GIS-based sampling frame for identifying maize fields was developed. Sampling points were randomly generated in each UAI and navigated to using hand-held GPS units to identify maize farmers. With GIS-based identification of farmers, SDA covers in 1 day an area that took 1 week with list-based identification. Similarly, SDA spends approximately 3,000 USD per sub-county to locate maize fields using GIS-based sampling, compared with the 10,000 USD spent previously. This has resulted in a 70% cost reduction.
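
    Random sample-point generation inside each Unit Area of Insurance can be done by simple rejection sampling within the polygon's bounding box, as in this sketch (assuming shapely geometries; not SDA's actual tooling):

      import random
      from shapely.geometry import Point, Polygon

      def random_points_in_uai(polygon: Polygon, n: int, seed: int = 0):
          rng = random.Random(seed)
          minx, miny, maxx, maxy = polygon.bounds
          points = []
          while len(points) < n:
              candidate = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
              if polygon.contains(candidate):            # keep only points inside the UAI
                  points.append(candidate)
          return points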

  13. Comparing conventional Descriptive Analysis and Napping®-UFP against physiochemical measurements: a case study using apples.

    PubMed

    Pickup, William; Bremer, Phil; Peng, Mei

    2018-03-01

    The extensive time and cost associated with conventional sensory profiling methods have spurred sensory researchers to develop rapid method alternatives, such as Napping® with Ultra-Flash Profiling (UFP). Napping®-UFP generates sensory maps by requiring untrained panellists to separate samples based on perceived sensory similarities. Evaluations of this method have been restricted to manufactured/formulated food models, and predominantly structured on comparisons against the conventional descriptive method. The present study aims to extend the validation of Napping®-UFP (N = 72) to natural biological products, and to evaluate this method against Descriptive Analysis (DA; N = 8) with physiochemical measurements as an additional evaluative criterion. The results revealed that sample configurations generated by DA and Napping®-UFP were not significantly correlated (RV = 0.425, P = 0.077); however, they were both correlated with the product map generated based on the instrumental measures (P < 0.05). The findings also showed that sample characterisations from DA and Napping®-UFP were driven by different sensory attributes, indicating potential structural differences between these two methods in configuring samples. Overall, these findings lent support for the extended use of Napping®-UFP for evaluations of natural biological products. Although DA was shown to be a better method for establishing sensory-instrumental relationships, Napping®-UFP exhibited strengths in generating informative sample configurations based on holistic perception of products. © 2017 Society of Chemical Industry.
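
    The RV coefficient used above to compare sample configurations can be computed as in the sketch below, where X and Y are (n_samples, n_dims) coordinate matrices from the two product maps (names assumed):

      import numpy as np

      def rv_coefficient(X, Y):
          Xc = X - X.mean(axis=0)                        # centre each configuration
          Yc = Y - Y.mean(axis=0)
          Sx, Sy = Xc @ Xc.T, Yc @ Yc.T                  # sample-by-sample Gram matrices
          return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))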

  14. Modeling and comparative study of fluid velocities in heterogeneous rocks

    NASA Astrophysics Data System (ADS)

    Hingerl, Ferdinand F.; Romanenko, Konstantin; Pini, Ronny; Balcom, Bruce; Benson, Sally

    2013-04-01

    Detailed knowledge of the distribution of effective porosity and fluid velocities in heterogeneous rock samples is crucial for understanding and predicting spatially resolved fluid residence times and kinetic reaction rates of fluid-rock interactions. The applicability of conventional MRI techniques to sedimentary rocks is limited by internal magnetic field gradients and short spin relaxation times. The approach developed at the UNB MRI Centre combines the 13-interval Alternating-Pulsed-Gradient Stimulated-Echo (APGSTE) scheme and three-dimensional Single Point Ramped Imaging with T1 Enhancement (SPRITE). These methods were designed to reduce the errors due to effects of background gradients and fast transverse relaxation. SPRITE is largely immune to time-evolution effects resulting from background gradients, paramagnetic impurities and chemical shift. Using these techniques, quantitative 3D porosity maps as well as single-phase fluid velocity fields in sandstone core samples were measured. Using this new Magnetic Resonance Imaging technique developed at the MRI Centre at UNB, we created 3D maps of porosity distributions as well as single-phase fluid velocity distributions of sandstone rock samples. Then, we evaluated the applicability of the Kozeny-Carman relationship for modeling measured fluid velocity distributions in sandstone samples showing meso-scale heterogeneities using two different modeling approaches. The MRI maps were used as reference points for the modeling approaches. For the first modeling approach, we applied the Kozeny-Carman relationship to the porosity distributions and computed respective permeability maps, which in turn provided input for a CFD simulation - using the Stanford CFD code GPRS - to compute averaged velocity maps. The latter were then compared to the measured velocity maps. For the second approach, the measured velocity distributions were used as input for inversely computing permeabilities using the GPRS CFD code. The computed permeabilities were then correlated with the ones based on the porosity maps and the Kozeny-Carman relationship. The findings of the comparative modeling study are discussed, and its potential impact on the modeling of fluid residence times and kinetic reaction rates of fluid-rock interactions in rocks containing meso-scale heterogeneities is reviewed.
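
    The Kozeny-Carman step described above (porosity map to permeability map) can be sketched with the grain-diameter form of the relationship; the constant (180) and the representative grain diameter are common textbook choices, not values from this study.

      import numpy as np

      def kozeny_carman_permeability(porosity, d_grain):
          # k = d^2 * phi^3 / (180 * (1 - phi)^2), one common Kozeny-Carman variant.
          phi = np.clip(porosity, 1e-6, 0.999)
          return (d_grain ** 2) * phi ** 3 / (180.0 * (1.0 - phi) ** 2)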

  15. Joint fMRI analysis and subject clustering using sparse dictionary learning

    NASA Astrophysics Data System (ADS)

    Kim, Seung-Jun; Dontaraju, Krishna K.

    2017-08-01

    Multi-subject fMRI data analysis methods based on sparse dictionary learning are proposed. In addition to identifying the component spatial maps by exploiting the sparsity of the maps, clusters of the subjects are learned by postulating that the fMRI volumes admit a subspace clustering structure. Furthermore, in order to tune the associated hyper-parameters systematically, a cross-validation strategy is developed based on entry-wise sampling of the fMRI dataset. Efficient algorithms for solving the proposed constrained dictionary learning formulations are developed. Numerical tests performed on synthetic fMRI data show promising results and provide insights into the proposed technique.
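
    A hedged sketch of the sparse dictionary learning component only (without the subspace clustering of subjects or the entry-wise cross-validation): sparse codes over voxels act as component spatial maps, and the learned atoms are temporal signatures. Parameter values are illustrative.

      from sklearn.decomposition import DictionaryLearning

      def sparse_spatial_maps(fmri_matrix, n_components=20, alpha=1.0):
          # fmri_matrix: (n_timepoints, n_voxels) for one subject or group.
          dl = DictionaryLearning(n_components=n_components, alpha=alpha,
                                  transform_algorithm="lasso_lars", random_state=0)
          spatial_maps = dl.fit_transform(fmri_matrix.T)   # (n_voxels, n_components), sparse
          temporal_atoms = dl.components_                  # (n_components, n_timepoints)
          return spatial_maps, temporal_atoms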

  16. Prevalence of Mycobacterium avium subspecies paratuberculosis and hepatitis E in New World camelids in Austria.

    PubMed

    Stanitznig, A; Khol, J L; Lambacher, B; Franz, S; Wittek, T; Kralik, P; Slana, I; Vasickova, P

    2017-07-07

    Mycobacterium avium subspecies paratuberculosis (MAP) is the causative agent of paratuberculosis in domestic ruminants and New World Camelids (NWC). Hepatitis E virus (HEV) is an important public health concern worldwide. The virus has been identified in several species, some of them serving as a reservoir for zoonotic HEV strains. Husbandry and breeding of llamas and alpacas have increased in Austria in recent years. Therefore, the aim of the present study was to evaluate the prevalence of MAP and HEV in NWC in Austria. Altogether, 445 animals originating from 78 farms were enrolled in the study. Of the animals sampled, 184 (41.35%) were llamas and 261 (58.65%) were alpacas. 443 blood samples for MAP-ELISA and 399 faecal samples for quantitative PCR (qPCR) and culture for MAP, as well as for HEV detection by RT-qPCR, were collected. All of the 399 animals tested for shedding of MAP were negative by faecal solid culture. Using qPCR, 15 (3.8%) of the animals were MAP positive and 384 (96.2%) negative. Of the 443 serum samples examined for specific antibodies against MAP by ELISA, 6 (1.4%) were positive, 1 (0.2%) was questionable and 436 (98.4%) samples were negative. All faecal samples tested negative for HEV.

  17. The accuracy of selected land use and land cover maps at scales of 1:250,000 and 1:100,000

    USGS Publications Warehouse

    Fitzpatrick-Lins, Katherine

    1980-01-01

    Land use and land cover maps produced by the U.S. Geological Survey are found to meet or exceed the established standard of accuracy. When analyzed using a point sampling technique and binomial probability theory, several maps, illustrative of those produced for different parts of the country, were found to meet or exceed accuracies of 85 percent. Those maps tested were Tampa, Fla., Portland, Me., Charleston, W. Va., and Greeley, Colo., published at a scale of 1:250,000, and Atlanta, Ga., and Seattle and Tacoma, Wash., published at a scale of 1:100,000. For each map, the values were determined by calculating the ratio of the total number of points correctly interpreted to the total number of points sampled. Six of the seven maps tested have accuracies of 85 percent or better at the 95-percent lower confidence limit. When the sample data for predominant categories (those sampled with a significant number of points) were grouped together for all maps, accuracies of those predominant categories met the 85-percent accuracy criterion, with one exception. One category, Residential, had less than 85-percent accuracy at the 95-percent lower confidence limit. Nearly all residential land sampled was mapped correctly, but some areas of other land uses were mapped incorrectly as Residential.
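
    The accuracy test described above can be sketched as a binomial calculation: the point estimate is the fraction of correctly interpreted sample points, and a one-sided 95% lower confidence limit is taken from the binomial distribution (the Clopper-Pearson form is shown; the original analysis may have used another variant).

      from scipy.stats import beta

      def accuracy_with_lower_limit(correct, total, conf=0.95):
          p_hat = correct / total
          lower = beta.ppf(1 - conf, correct, total - correct + 1) if correct else 0.0
          return p_hat, lower      # the 85% standard is met if lower >= 0.85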

  18. 36 CFR 9.42 - Well records and reports, plots and maps, samples, tests and surveys.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Well records and reports, plots and maps, samples, tests and surveys. Any technical data gathered... (36 CFR 9.42; Parks, Forests, and Public Property; revised as of 2010-07-01).

  19. LC-MS/MS Peptide Mapping with Automated Data Processing for Routine Profiling of N-Glycans in Immunoglobulins

    NASA Astrophysics Data System (ADS)

    Shah, Bhavana; Jiang, Xinzhao Grace; Chen, Louise; Zhang, Zhongqi

    2014-06-01

    Protein N-Glycan analysis is traditionally performed by high pH anion exchange chromatography (HPAEC), reversed phase liquid chromatography (RPLC), or hydrophilic interaction liquid chromatography (HILIC) on fluorescence-labeled glycans enzymatically released from the glycoprotein. These methods require time-consuming sample preparations and do not provide site-specific glycosylation information. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) peptide mapping is frequently used for protein structural characterization and, as a bonus, can potentially provide glycan profile on each individual glycosylation site. In this work, a recently developed glycopeptide fragmentation model was used for automated identification, based on their MS/MS, of N-glycopeptides from proteolytic digestion of monoclonal antibodies (mAbs). Experimental conditions were optimized to achieve accurate profiling of glycoforms. Glycan profiles obtained from LC-MS/MS peptide mapping were compared with those obtained from HPAEC, RPLC, and HILIC analyses of released glycans for several mAb molecules. Accuracy, reproducibility, and linearity of the LC-MS/MS peptide mapping method for glycan profiling were evaluated. The LC-MS/MS peptide mapping method with fully automated data analysis requires less sample preparation, provides site-specific information, and may serve as an alternative method for routine profiling of N-glycans on immunoglobulins as well as other glycoproteins with simple N-glycans.

  20. Cross-correlating 2D and 3D galaxy surveys

    DOE PAGES

    Passaglia, Samuel; Manzotti, Alessandro; Dodelson, Scott

    2017-06-08

    Galaxy surveys probe both structure formation and the expansion rate, making them promising avenues for understanding the dark universe. Photometric surveys accurately map the 2D distribution of galaxy positions and shapes in a given redshift range, while spectroscopic surveys provide sparser 3D maps of the galaxy distribution. We present a way to analyse overlapping 2D and 3D maps jointly and without loss of information. We represent 3D maps using spherical Fourier-Bessel (sFB) modes, which preserve radial coverage while accounting for the spherical sky geometry, and we decompose 2D maps in a spherical harmonic basis. In these bases, a simple expression exists for the cross-correlation of the two fields. One very powerful application is the ability to simultaneously constrain the redshift distribution of the photometric sample, the sample biases, and cosmological parameters. We use our framework to show that combined analysis of DESI and LSST can improve cosmological constraints by factors of ~1.2 to ~1.8 on the region where they overlap relative to identically sized disjoint regions. We also show that in the overlap of DES and SDSS-III in Stripe 82, cross-correlating improves photo-z parameter constraints by factors of ~2 to ~12 over internal photo-z reconstructions.

  1. Cross-correlating 2D and 3D galaxy surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Passaglia, Samuel; Manzotti, Alessandro; Dodelson, Scott

    Galaxy surveys probe both structure formation and the expansion rate, making them promising avenues for understanding the dark universe. Photometric surveys accurately map the 2D distribution of galaxy positions and shapes in a given redshift range, while spectroscopic surveys provide sparser 3D maps of the galaxy distribution. We present a way to analyse overlapping 2D and 3D maps jointly and without loss of information. We represent 3D maps using spherical Fourier-Bessel (sFB) modes, which preserve radial coverage while accounting for the spherical sky geometry, and we decompose 2D maps in a spherical harmonic basis. In these bases, a simple expression exists for the cross-correlation of the two fields. One very powerful application is the ability to simultaneously constrain the redshift distribution of the photometric sample, the sample biases, and cosmological parameters. We use our framework to show that combined analysis of DESI and LSST can improve cosmological constraints by factors of ~1.2 to ~1.8 on the region where they overlap relative to identically sized disjoint regions. We also show that in the overlap of DES and SDSS-III in Stripe 82, cross-correlating improves photo-z parameter constraints by factors of ~2 to ~12 over internal photo-z reconstructions.

  2. Semantic Mapping and Motion Planning with Turtlebot Roomba

    NASA Astrophysics Data System (ADS)

    Aslam Butt, Rizwan; Usman Ali, Syed M.

    2013-12-01

    In this paper, we demonstrate semantic mapping and motion planning experiments on the Turtlebot robot using a Microsoft Kinect in the ROS environment. Moreover, we have also performed comparative studies of various sampling-based motion planning algorithms with the Turtlebot in the Open Motion Planning Library. Our comparative analysis revealed that Expansive Space Trees (EST) outperformed all other approaches with respect to memory occupation and processing time. We also summarize the related concepts of autonomous robotics, which we hope will be helpful for beginners.

  3. Stability of Mixed Preparations Consisting of Commercial Moisturizing Creams with an Ointment Base Investigated by Magnetic Resonance Imaging.

    PubMed

    Onuki, Yoshinori; Funatani, Chiaki; Yamamoto, Yoshihisa; Fukami, Toshiro; Koide, Tatsuo; Hayashi, Yoshihiro; Takayama, Kozo

    2017-01-01

    A moisturizing cream mixed with a steroid ointment is frequently prescribed to patients suffering from atopic dermatitis. However, there is a concern that the mixing operation causes destabilization. The present study was performed to investigate the stability of such preparations closely using magnetic resonance imaging (MRI). As sample preparations, five commercial moisturizing creams that are popular in Japan were mixed with an ointment base, a white petrolatum, at a volume ratio of 1 : 1. The mixed preparations were stored at 60°C to accelerate the destabilization processes. Subsequently, the phase separations induced by the storage test were monitored using MRI. Using advanced MR technologies including spin-spin relaxation time (T2) mapping and MR spectroscopy, we successfully characterized the phase-separation behavior of the test samples. For most samples, phase separations developed by the bleeding of liquid oil components. From a sample consisting of an oil-in-water-type cream, Urepearl Cream 10%, a distinct phase-separation mode was observed, which was initiated by the aqueous component separating from the bottom part of the sample. The resultant phase separation was the most distinct among the test samples. To investigate the phase separation quantitatively and objectively, we conducted a histogram analysis on the acquired T2 maps. The water-in-oil-type creams were found to be much more stable after mixing with the ointment base than the oil-in-water-type creams. This finding strongly supported the validity of the mixing operation traditionally conducted in pharmacies.

  4. A scheme of pedagogical problems solving in kinematic to observe toulmin argumentation feasibility

    NASA Astrophysics Data System (ADS)

    Manurung, Sondang R.; Rustaman, Nuryani Y.; Siregar, Nelson

    2013-09-01

    The purpose of this study was to determine students' ability to map out problem solving. This paper presents a schematic template map used to pedagogically analyze students' problem-solving work. The students' problem-solving maps were schematized based on the Toulmin Argumentation Pattern (TAP) of argumentative discourse. The samples of this study were three worksheets from physics education students representing the upper, middle, and lower levels of a class at one LPTK in Medan. The instrument of this study was an essay test on a kinematics topic. The data were analyzed with the schematic template map in order to assess the students' ability to map out problem solving. The results showed that the upper-level student followed the appropriate solution pattern, while the two other students could not follow the pattern exactly.

  5. Integrating geophysical data for mapping the contamination of industrial sites by polycyclic aromatic hydrocarbons: A geostatistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colin, P.; Nicoletis, S.; Froidevaux, R.

    1996-12-31

    A case study is presented of building a map showing the probability that the concentration in polycyclic aromatic hydrocarbon (PAH) exceeds a critical threshold. This assessment is based on existing PAH sample data (direct information) and on an electrical resistivity survey (indirect information). Simulated annealing is used to build a model of the range of possible values for PAH concentrations and of the bivariate relationship between PAH concentrations and electrical resistivity. The geostatistical technique of simple indicator kriging is then used, together with the probabilistic model, to infer, at each node of a grid, the range of possible values which the PAH concentration can take. The risk map is then extracted from this characterization of the local uncertainty. The difference between this risk map and a traditional iso-concentration map is then discussed in terms of decision-making.

  6. Development of a seroprevalence map for avian influenza in broiler chickens from Comunidad Valenciana, Spain.

    PubMed

    2014-12-01

    The aim of this study was to design and implement a seroprevalence map based on business intelligence for low pathogenicity notifiable avian influenza (LPNAI) in broiler chickens in Comunidad Valenciana (Spain). The software mapping tool developed for this study consisted of three main phases: data collection, data analysis and data representation. To obtain the serological data, the authors analysed 8,520 serum samples from broiler farms over three years. The data were represented on a map of Comunidad Valenciana, including geographical information on flock locations to facilitate disease monitoring. No clinical signs of LPNAI were reported in the studied flocks. The data from this study showed no evidence of contact with LPNAI in broiler flocks, and the novel software mapping tool proved a valuable method for easily monitoring the serological response to avian influenza, including its geographical distribution.

  7. Spatial mapping of lead, arsenic, iron, and polycyclic aromatic hydrocarbon soil contamination in Sydney, Nova Scotia: community impact from the coke ovens and steel plant.

    PubMed

    Lambert, Timothy W; Boehmer, Jennifer; Feltham, Jason; Guyn, Lindsay; Shahid, Rizwan

    2011-01-01

    This paper presents spatial maps of the arsenic, lead, and polycyclic aromatic hydrocarbon (PAH) soil contamination in Sydney, Nova Scotia, Canada. The spatial maps were designed to create exposure cohorts to help understand the observed increase in health effects. To assess whether contamination can be a proxy for exposures, the following hypothesis was tested: residential soils were impacted by the coke oven and steel plant industrial complex. The spatial map showed contaminants are centered on the industrial facility, significantly correlated, and exceed Canadian health risk-based soil quality guidelines. Core samples taken at 5-cm intervals suggest a consistent deposition over time. The concentrations in Sydney significantly exceed background Sydney soil concentrations, and are significantly elevated compared with North Sydney, an adjacent industrial community. The contaminant spatial maps will also be useful for developing cohorts of exposure and guiding risk management decisions.

  8. Assessment of the prevalence of Mycobacterium avium subsp. paratuberculosis in commercially pasteurized milk.

    PubMed

    Cerf, O; Griffiths, M; Aziza, F

    2007-01-01

    Conflicting laboratory-acquired data have been published about the heat resistance of Mycobacterium avium subsp. paratuberculosis (MAP), the cause of the deadly paratuberculosis (Johne's disease) of ruminants. Results of surveys of the presence of MAP in industrially pasteurized milk from several countries are conflicting also. This paper critically reviews the available data on the heat resistance of MAP and, based on these studies, a quantitative model describing the probability of finding MAP in pasteurized milk under the conditions prevailing in industrialized countries was derived using Monte Carlo simulation. The simulation assesses the probability of detecting MAP in 50-mL samples of pasteurized milk as lower than 1%. Hypotheses are presented to explain why higher frequencies were found by some authors; these included improper pasteurization and cross-contamination in the analytical laboratory. Hypotheses implicating a high rate of inter- and intraherd prevalence of paratuberculosis or heavy contamination of raw milk by feces were rejected.
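
    A minimal sketch of the kind of Monte Carlo simulation described above, with entirely hypothetical input distributions (raw-milk concentration and pasteurization log reduction) standing in for the published model:

      import numpy as np

      def prob_map_in_50ml_sample(n_iter=100_000, sample_ml=50.0, seed=1):
          rng = np.random.default_rng(seed)
          raw_conc = rng.lognormal(mean=-2.0, sigma=1.5, size=n_iter)   # cells/mL, assumed
          log_reduction = rng.uniform(4.0, 7.0, size=n_iter)            # pasteurization, assumed
          expected_cells = raw_conc * 10.0 ** (-log_reduction) * sample_ml
          p_detect = 1.0 - np.exp(-expected_cells)                      # Poisson: >= 1 viable cell
          return p_detect.mean()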

  9. Fuel models and fire potential from satellite and surface observations

    USGS Publications Warehouse

    Burgan, R.E.; Klaver, R.W.; Klarer, J.M.

    1998-01-01

    A national 1-km resolution fire danger fuel model map was derived through use of previously mapped land cover classes and ecoregions, and extensive ground sample data, then refined through review by fire managers familiar with various portions of the U.S. The fuel model map will be used in the next generation fire danger rating system for the U.S., but it also made possible immediate development of a satellite and ground based fire potential index map. The inputs and algorithm of the fire potential index are presented, along with a case study of the correlation between the fire potential index and fire occurrence in California and Nevada. Application of the fire potential index in the Mediterranean ecosystems of Spain, Chile, and Mexico will be tested.

  10. Analysis of storage possibility of raw and smoked breast meat of oat-fattened White Kołuda® goose based on their quality characteristics.

    PubMed

    Damaziak, K; Stelmasiak, A; Michalczuk, M; Wyrwisz, J; Moczkowska, M; Marcinkowska-Lesiak, M M; Niemiec, J; Wierzbicka, A

    2016-09-01

    Raw and smoked (spickgans) fillets of oat-fattened White Kołuda® goose were packed in: PET - polyethylene terephthalate bags; VSP - 99% vacuum; MAP1 - 80% O2, 20% CO2; MAP2 - 70% N2, 30% O2; MAP3 - 30% O2, 40% N2, 30% CO2, and stored at a temperature of 2°C. On the day of packaging (0 d) and during storage of raw (5, 7, 10 d) and smoked fillets (5, 10, 15 d), the samples were analyzed for weight losses, physicochemical traits, and chemical composition. The study demonstrated the effect of storage time and packaging method on the storage yield of raw and smoked fillets. In VSP, the raw fillets were characterized by the lowest amount of leakage, whereas spickgans were characterized by the highest storage yield and weight loss. The analysis of the effect of the modified atmosphere demonstrated the lowest weight loss of raw fillets, with simultaneously the smallest amount of leakage, in MAP1. The spickgans stored in MAP2 showed higher weight, higher yield after storage, and lower storage loss at all analysis time points compared to MAP1 and MAP3. The greatest cooking loss, together with the lowest pH values, was determined for the samples stored in VSP. The WBSF values of raw fillets decreased with storage time, in contrast to the WBSF values of spickgans, for which the value of this parameter increased compared to 0 d. Raw fillets stored in MAP1 and MAP3 were characterized by the most significant increase in the value of L*, a decrease in the value of a*, and an increase in the b* parameter. Visual assessment of spickgans on 15 d of storage revealed the presence of white sediment on the surface of products, except for the samples stored in VSP. The study demonstrated the effect of storage time on the contents of protein and fat in raw fillets and on the contents of salt and fat in spickgans. © 2016 Poultry Science Association Inc.

  11. Communicating infectious disease prevalence through graphics: results from an international survey

    PubMed Central

    Fagerlin, Angela; Valley, Thomas S.; Scherer, Aaron M.; Knaus, Megan; Das, Enny; Zikmund-Fisher, Brian J.

    2017-01-01

    Background: Graphics are increasingly used to represent the spread of infectious diseases (e.g., influenza, Zika, Ebola); however, the impact of using graphics to adequately inform the general population is unknown. Objective: To examine whether three ways of visually presenting data (heat map, dot map, or picto-trendline), all depicting the same information regarding the spread of a hypothetical outbreak of influenza, influence intent to vaccinate, risk perception, and knowledge. Design: Survey with participants randomized to receive a simulated news article accompanied by one of the three graphics that communicated prevalence of influenza and number of influenza-related deaths. Setting: International online survey. Participants: 16,510 adults living in 11 countries selected using stratified random sampling based on age and gender. Measurements: After reading the article and viewing the presented graphic, participants completed a survey that measured interest in vaccination, perceived risk of contracting disease, knowledge gained, interest in additional information about the disease, and perception of the graphic. Results: Heat maps and picto-trendlines were evaluated more positively than dot maps. Heat maps were more effective than picto-trendlines and no different from dot maps at increasing interest in vaccination, perceived risk of contracting disease, and interest in additional information about the disease. Heat maps and picto-trendlines were more successful at conveying knowledge than dot maps. Overall, heat maps were the only graphic to be superior in every outcome. Limitations: Results are based on a hypothetical scenario. Conclusion: Heat maps are a viable option to promote interest in and concern about infectious diseases. PMID:28647168

  12. Protein side chain rotational isomerization: A minimum perturbation mapping study

    NASA Astrophysics Data System (ADS)

    Haydock, Christopher

    1993-05-01

    A theory of the rotational isomerization of the indole side chain of tryptophan-47 of variant-3 scorpion neurotoxin is presented. The isomerization potential energy, entropic part of the isomerization free energy, isomer probabilities, transition state theory reaction rates, and indole order parameters are calculated from a minimum perturbation mapping over tryptophan-47 χ1×χ2 torsion space. A new method for calculating the fluorescence anisotropy from molecular dynamics simulations is proposed. The method is based on an expansion that separates transition dipole orientation from chromophore dynamics. The minimum perturbation potential energy map is inverted and applied as a bias potential for a 100 ns umbrella sampling simulation. The entropic part of the isomerization free energy as calculated by minimum perturbation mapping and umbrella sampling are in fairly close agreement. Throughout, the approximation is made that two glutamine and three tyrosine side chains neighboring tryptophan-47 are truncated at the Cβ atom. Comparison with the previous combination thermodynamic perturbation and umbrella sampling study suggests that this truncated neighbor side chain approximation leads to at least a qualitatively correct theory of tryptophan-47 rotational isomerization in the wild type variant-3 scorpion neurotoxin. Analysis of van der Waals interactions in a transition state region indicates that for the simulation of barrier crossing trajectories a linear combination of three specially defined dihedral angles will be superior to a simple side chain dihedral reaction coordinate.

  13. Vector Doppler: spatial sampling analysis and presentation techniques for real-time systems

    NASA Astrophysics Data System (ADS)

    Capineri, Lorenzo; Scabia, Marco; Masotti, Leonardo F.

    2001-05-01

    The aim of the vector Doppler (VD) technique is the quantitative reconstruction of a velocity field independently of the ultrasonic probe axis to flow angle. In particular, vector Doppler is interesting for studying vascular pathologies related to complex blood flow conditions. Clinical applications require a real-time operating mode and the capability to perform Doppler measurements over a defined volume. The combination of these two characteristics produces a real-time vector velocity map. In previous works the authors investigated the theory of pulsed wave (PW) vector Doppler and developed an experimental system capable of producing off-line 3D vector velocity maps. Afterwards, for producing dynamic velocity vector maps, we realized a new 2D vector Doppler system based on a modified commercial echograph. The measurement and presentation of a vector velocity field require a correct spatial sampling that must satisfy the Shannon criterion. In this work we tackled this problem, establishing a relationship between the sampling steps and the scanning system characteristics. Another problem posed by the vector Doppler technique is the real-time data representation, which should be easy for the physician to interpret. With this in mind, we attempted a multimedia solution that uses both interpolated images and sound to represent the information of the measured vector velocity map. These presentation techniques were tested in real-time scanning of flow phantoms and in preliminary measurements in vivo on a human carotid artery.

  14. Mobility Patterns of Children of Migrant Agricultural Workers.

    ERIC Educational Resources Information Center

    Cox, J. Lamarr; And Others

    Narrative text, tables, and maps summarize information derived from a random sample of 20% of the Migrant Student Record Transfer System (MSRTS) data base as it existed in June 1976 related to the mobility patterns of migrant children in the contiguous United States and Puerto Rico from January 1975 to April 1976. The data base is a tabulation of…

  15. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    NASA Astrophysics Data System (ADS)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI) play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and a long processing time. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the processing time and sample volume because this system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we will present an automated mMAPS including an integrated microfluidic device, an automated stage, and electrical relays for high-throughput clinical screening. Based on this result, we estimated that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application to analyze tissue samples in a clinical setting.

  16. Space-resolved diffusing wave spectroscopy measurements of the macroscopic deformation and the microscopic dynamics in tensile strain tests

    NASA Astrophysics Data System (ADS)

    Nagazi, Med-Yassine; Brambilla, Giovanni; Meunier, Gérard; Marguerès, Philippe; Périé, Jean-Noël; Cipelletti, Luca

    2017-01-01

    We couple a laser-based, space-resolved dynamic light scattering apparatus to a universal traction machine for mechanical extensional tests. We perform simultaneous optical and mechanical measurements on polyether ether ketone, a semi-crystalline polymer widely used in the industry. Due to the high turbidity of the sample, light is multiply scattered by the sample and the diffusing wave spectroscopy (DWS) formalism is used to interpret the data. Space-resolved DWS yields spatial maps of the sample strain and of the microscopic dynamics. An excellent agreement is found between the strain maps thus obtained and those measured by a conventional stereo-digital image correlation technique. The microscopic dynamics reveals both affine motion and plastic rearrangements. Thanks to the extreme sensitivity of DWS to displacements as small as 1 nm, plastic activity and its spatial localization can be detected at an early stage of the sample strain, making the technique presented here a valuable complement to existing material characterization methods.

  17. The Science and Application of Satellite Based Fire Radiative Energy

    NASA Technical Reports Server (NTRS)

    Ellicott, Evan; Vermote, Eric (Editor)

    2012-01-01

    The accurate measurement of ecosystem biomass is of great importance in the scientific, resource management, and energy sectors. In particular, biomass is a direct measurement of carbon storage within an ecosystem and of great importance for carbon cycle science and carbon emission mitigation. Remote sensing is the most accurate tool for global biomass measurements because of its ability to measure large areas. Current biomass estimates are derived primarily from ground-based samples, as compiled and reported in inventories and ecosystem samples. By using remote sensing technologies, we are able to scale up the sample values and supply wall-to-wall mapping of biomass.

  18. Tampa Bay environmental atlas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunneke, J.T.; Palik, T.F.

    1984-12-01

    Biological and water resource data for Tampa Bay were compiled and mapped at a scale of 1:24,000. This atlas consists of (1) composited information overlain on 18 biological and 20 water resource base maps and (2) an accompanying map narrative. Subjects mapped on the water resource maps are contours of the mean middepth specific conductivity which can be converted to salinity; bathymetry, sediments, tidal currents, the freshwater/saltwater interface, dredge spoil disposal sites; locations of industrial and municipal point source discharges, tide stations, and water quality sampling stations. The point source discharge locations show permitted capacity and the water quality sampling stations show 5-year averages for chlorophyll, conductivity, turbidity, temperature, and total nitrogen. The subjects shown on the biological resource maps are clam and oyster beds, shellfish harvest areas, colonial bird nesting sites, manatee habitat, seagrass beds and artificial reefs. Spawning seasons, nursery habitats, and adult habitats are identified for major fish species. The atlas will provide useful information for coastal planning and management in Tampa Bay.

  19. Impact of a community-based prevention marketing intervention to promote physical activity among middle-aged women.

    PubMed

    Sharpe, Patricia A; Burroughs, Ericka L; Granner, Michelle L; Wilcox, Sara; Hutto, Brent E; Bryant, Carol A; Peck, Lara; Pekuri, Linda

    2010-06-01

    A physical activity intervention applied principles of community-based participatory research, the community-based prevention marketing framework, and social cognitive theory. A nonrandomized design included women ages 35 to 54 in the southeastern United States. Women (n = 430 preprogram, n = 217 postprogram) enrolled in a 24-week behavioral intervention and were exposed to a media campaign. They were compared to cross-sectional survey samples at pre- (n = 245) and postprogram (n = 820) from the media exposed county and a no-intervention county (n = 234 pre, n = 822 post). Women in the behavioral intervention had statistically significant positive changes on physical activity minutes, walking, park and trail use, knowledge of mapped routes and exercise partner, and negative change on exercise self-efficacy. Media exposed women had statistically significant pre- to postprogram differences on knowledge of mapped routes. No-intervention women had significant pre- to postprogram differences on physical activity minutes, walking, and knowledge of mapped routes.

  20. Automatic Construction of Wi-Fi Radio Map Using Smartphones

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Li, Qingquan; Zhang, Xing

    2016-06-01

    Indoor positioning could provide interesting services and applications. As one of the most popular indoor positioning methods, location fingerprinting determines the location of mobile users by matching the received signal strength (RSS), which is location dependent. However, fingerprinting-based indoor positioning requires calibration and updating of the fingerprints, which is labor-intensive and time-consuming. In this paper, we propose a visual-based approach for the construction of a radio map for anonymous indoor environments without any prior knowledge. This approach collects multi-sensor data, e.g. video, accelerometer, gyroscope, Wi-Fi signals, etc., while people (with smartphones) walk freely in indoor environments. Then, it uses the multi-sensor data to restore the trajectories of people based on an integrated structure from motion (SFM) and image matching method, and finally estimates the locations of sampling points on the trajectories and constructs the Wi-Fi radio map. Experiment results show that the average location error of the fingerprints is about 0.53 m.
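
    For illustration, the sketch below shows a common way such a radio map is used at positioning time: weighted k-nearest-neighbour matching of an observed RSS vector against the fingerprint database. The fingerprint coordinates, access-point signal values, and the k value are hypothetical placeholders; the paper itself concerns building the map with SFM rather than this matching step.

        import numpy as np

        # Hypothetical radio map: each fingerprint is an (x, y) position plus an RSS
        # vector (dBm) for three access points; all values are illustrative only.
        positions = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
        rss_map = np.array([[-40, -70, -65],
                            [-70, -42, -68],
                            [-66, -69, -41],
                            [-72, -55, -50]], dtype=float)

        def locate(rss_observed, k=3, eps=1e-6):
            """Weighted k-NN in signal space: closer fingerprints get larger weights."""
            d = np.linalg.norm(rss_map - rss_observed, axis=1)   # distance in RSS space
            idx = np.argsort(d)[:k]                              # k nearest fingerprints
            w = 1.0 / (d[idx] + eps)                             # inverse-distance weights
            return (w[:, None] * positions[idx]).sum(axis=0) / w.sum()

        print(locate(np.array([-45.0, -65.0, -60.0])))           # estimated (x, y)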

  1. Fast magnetic resonance fingerprinting for dynamic contrast-enhanced studies in mice.

    PubMed

    Gu, Yuning; Wang, Charlie Y; Anderson, Christian E; Liu, Yuchi; Hu, He; Johansen, Mette L; Ma, Dan; Jiang, Yun; Ramos-Estebanez, Ciro; Brady-Kalnay, Susann; Griswold, Mark A; Flask, Chris A; Yu, Xin

    2018-05-09

    The goal of this study was to develop a fast MR fingerprinting (MRF) method for simultaneous T1 and T2 mapping in DCE-MRI studies in mice. The MRF sequences based on balanced SSFP and fast imaging with steady-state precession were implemented and evaluated on a 7T preclinical scanner. The readout used a zeroth-moment-compensated variable-density spiral trajectory that fully sampled the entire k-space and the inner 10 × 10 k-space with 48 and 4 interleaves, respectively. In vitro and in vivo studies of mouse brain were performed to evaluate the accuracy of MRF measurements with both fully sampled and undersampled data. The application of MRF to dynamic T1 and T2 mapping in DCE-MRI studies was demonstrated in a mouse model of heterotopic glioblastoma using gadolinium-based and dysprosium-based contrast agents. The T1 and T2 measurements in phantom showed strong agreement between the MRF and the conventional methods. The MRF with spiral encoding allowed up to 8-fold undersampling without loss of measurement accuracy. This enabled simultaneous T1 and T2 mapping with 2-minute temporal resolution in DCE-MRI studies. Magnetic resonance fingerprinting provides the opportunity for dynamic quantification of contrast agent distribution in preclinical tumor models on high-field MRI scanners. © 2018 International Society for Magnetic Resonance in Medicine.
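
    A hedged sketch of the template-matching step that MRF reconstructions typically use: each voxel's measured signal evolution is compared against a precomputed dictionary of simulated evolutions, and the T1/T2 pair of the best-matching entry (maximum normalized inner product) is assigned. The dictionary below is random placeholder data; a real dictionary would be simulated from the bSSFP/FISP sequence via Bloch or EPG simulations, which this sketch does not attempt.

        import numpy as np

        rng = np.random.default_rng(0)
        n_timepoints, n_entries = 480, 1000

        # Placeholder dictionary: rows stand in for simulated signal evolutions of (T1, T2) pairs.
        dictionary = rng.standard_normal((n_entries, n_timepoints))
        t1_values = rng.uniform(100, 3000, n_entries)   # ms, hypothetical grid
        t2_values = rng.uniform(10, 300, n_entries)     # ms, hypothetical grid

        def match(signal, dictionary, t1_values, t2_values):
            """Return (T1, T2) of the dictionary entry with maximum normalized inner product."""
            d_norm = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
            s_norm = signal / np.linalg.norm(signal)
            best = np.argmax(np.abs(d_norm @ s_norm))
            return t1_values[best], t2_values[best]

        voxel_signal = rng.standard_normal(n_timepoints)  # stand-in for one voxel's data
        print(match(voxel_signal, dictionary, t1_values, t2_values))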

  2. Low-Cost Ultra-High Spatial and Temporal Resolution Mapping of Intertidal Rock Platforms

    NASA Astrophysics Data System (ADS)

    Bryson, M.; Johnson-Roberson, M.; Murphy, R.

    2012-07-01

    Intertidal ecosystems have primarily been studied using field-based sampling; remote sensing offers the ability to collect data over large areas in a snapshot of time, which could complement field-based sampling methods by extrapolating them into the wider spatial and temporal context. Conventional remote sensing tools (such as satellite and aircraft imaging) provide data at relatively coarse, sub-meter resolutions or with limited temporal resolutions and relatively high costs for small-scale environmental science and ecology studies. In this paper, we describe a low-cost, kite-based imaging system and photogrammetric pipeline that was developed for constructing high-resolution, 3D, photo-realistic terrain models of intertidal rocky shores. The processing pipeline uses automatic image feature detection and matching, structure-from-motion and photo-textured terrain surface reconstruction algorithms that require minimal human input and only a small number of ground control points, and allow the use of cheap, consumer-grade digital cameras. The resulting maps combine colour and topographic information at sub-centimeter resolutions over an area of approximately 100 m, thus enabling spatial properties of the intertidal environment to be determined across a hierarchy of spatial scales. Results of the system are presented for an intertidal rock platform at Cape Banks, Sydney, Australia. Potential uses of this technique include mapping of plant (micro- and macro-algae) and animal (e.g. gastropods) assemblages at multiple spatial and temporal scales.

  3. Use of geostatistics to predict virus decay rates for determination of septic tank setback distances.

    PubMed Central

    Yates, M V; Yates, S R; Warrick, A W; Gerba, C P

    1986-01-01

    Water samples were collected from 71 public drinking-water supply wells in the Tucson, Ariz., basin. Virus decay rates in the water samples were determined with MS-2 coliphage as a model virus. The correlations between the virus decay rates and the sample locations were shown by fitting a spherical model to the experimental semivariogram. Kriging, a geostatistical technique, was used to calculate virus decay rates at unsampled locations by using the known values at nearby wells. Based on the regional characteristics of groundwater flow and the kriged estimates of virus decay rates, a contour map of the area was constructed. The map shows the variation in separation distances that would have to be maintained between wells and sources of contamination to afford similar degrees of protection from viral contamination of the drinking water in wells throughout the basin. PMID:3532954
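
    As a hedged illustration of the interpolation step described above, the sketch below implements ordinary kriging with a spherical semivariogram model for a handful of made-up well locations and decay rates; the nugget, sill and range values are placeholders, not the parameters fitted in the study.

        import numpy as np

        def spherical(h, nugget=0.01, sill=0.1, rng=10.0):
            """Spherical semivariogram model gamma(h)."""
            h = np.asarray(h, dtype=float)
            g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
            return np.where(h >= rng, sill, np.where(h == 0, 0.0, g))

        # Hypothetical well coordinates (km) and virus decay rates (1/day).
        xy = np.array([[0, 0], [3, 1], [1, 4], [6, 5], [8, 2]], dtype=float)
        z = np.array([0.12, 0.18, 0.10, 0.25, 0.20])

        def ordinary_krige(x0):
            n = len(z)
            d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
            A = np.ones((n + 1, n + 1)); A[:n, :n] = spherical(d); A[n, n] = 0.0
            b = np.ones(n + 1); b[:n] = spherical(np.linalg.norm(xy - x0, axis=1))
            w = np.linalg.solve(A, b)          # kriging weights plus Lagrange multiplier
            return w[:n] @ z                   # estimated decay rate at x0

        print(ordinary_krige(np.array([4.0, 3.0])))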

  4. Detecting reactive islands using Lagrangian descriptors and the relevance to transition path sampling.

    PubMed

    Patra, Sarbani; Keshavamurthy, Srihari

    2018-02-14

    It has been known for some time now that isomerization reactions, classically, are mediated by phase space structures called reactive islands (RI). RIs provide one possible route to correct for the nonstatistical effects in the reaction dynamics. In this work, we map out the reactive islands for the two-dimensional Müller-Brown model potential and show that the reactive islands are intimately linked to the issue of rare event sampling. In particular, we establish the sensitivity of the so-called committor probabilities, useful quantities in the transition path sampling technique, to the hierarchical RI structures. Mapping out the RI structure for high dimensional systems, however, is a challenging task. Here, we show that the technique of Lagrangian descriptors is able to effectively identify the RI hierarchy in the model system. Based on our results, we suggest that the Lagrangian descriptors can be useful for detecting RIs in high dimensional systems.
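
    To make the Lagrangian descriptor idea concrete, here is a hedged sketch of the arc-length descriptor evaluated on the Müller-Brown potential: each phase-space point is integrated forward and backward in time and the trajectory arc length is accumulated. The integration time, step size, slice of initial conditions and the simple symplectic-Euler integrator are illustrative choices, not the paper's setup.

        import numpy as np

        # Mueller-Brown potential parameters (standard literature values).
        A  = np.array([-200.0, -100.0, -170.0, 15.0])
        a  = np.array([-1.0, -1.0, -6.5, 0.7])
        b  = np.array([0.0, 0.0, 11.0, 0.6])
        c  = np.array([-10.0, -10.0, -6.5, 0.7])
        X0 = np.array([1.0, 0.0, -0.5, -1.0])
        Y0 = np.array([0.0, 0.5, 1.5, 1.0])

        def grad_V(x, y):
            dx, dy = x - X0, y - Y0
            e = A * np.exp(a * dx**2 + b * dx * dy + c * dy**2)
            return np.sum(e * (2 * a * dx + b * dy)), np.sum(e * (b * dx + 2 * c * dy))

        def lagrangian_descriptor(x, y, px, py, tau=1.0, dt=1e-3):
            """Arc-length Lagrangian descriptor: integrate |v| forward and backward in time."""
            M = 0.0
            for sign in (+1.0, -1.0):                       # forward, then backward trajectory
                q, p = np.array([x, y]), np.array([px, py])
                for _ in range(int(tau / dt)):
                    gx, gy = grad_V(*q)                     # symplectic Euler step (mass = 1)
                    p = p - sign * dt * np.array([gx, gy])
                    q = q + sign * dt * p
                    M += np.linalg.norm(p) * dt             # accumulate arc length
            return M

        # Descriptor along a small slice of phase space (px = 0, py varied), purely illustrative.
        ld = [lagrangian_descriptor(-0.5, 1.5, 0.0, py) for py in np.linspace(-2, 2, 5)]
        print(np.round(ld, 2))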

  5. Calculation of susceptibility through multiple orientation sampling (COSMOS): a method for conditioning the inverse problem from measured magnetic field map to susceptibility source image in MRI.

    PubMed

    Liu, Tian; Spincemaille, Pascal; de Rochefort, Ludovic; Kressler, Bryan; Wang, Yi

    2009-01-01

    Magnetic susceptibility differs among tissues based on their contents of iron, calcium, contrast agent, and other molecular compositions. Susceptibility modifies the magnetic field detected in the MR signal phase. The determination of an arbitrary susceptibility distribution from the induced field shifts is a challenging, ill-posed inverse problem. A method called "calculation of susceptibility through multiple orientation sampling" (COSMOS) is proposed to stabilize this inverse problem. The field created by the susceptibility distribution is sampled at multiple orientations with respect to the polarization field, B(0), and the susceptibility map is reconstructed by weighted linear least squares to account for field noise and the signal void region. Numerical simulations and phantom and in vitro imaging validations demonstrated that COSMOS is a stable and precise approach to quantify a susceptibility distribution using MRI.
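
    To make the multi-orientation inversion concrete, here is a minimal, hedged sketch of a COSMOS-style reconstruction: the k-space dipole kernel is evaluated for several object orientations, noiseless field maps are combined by least squares at every k-point with uniform weights, and the susceptibility map is recovered by an inverse FFT. The grid size, rotation angles, toy susceptibility distribution and the thresholding of near-zero kernels are illustrative assumptions; the paper additionally weights by field noise and signal voids.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 32                                            # small 3-D grid, illustrative only

        # Hypothetical "true" susceptibility distribution (a small block of chi = 1 ppm).
        kx, ky, kz_ax = np.meshgrid(*(np.fft.fftfreq(N),) * 3, indexing="ij")
        chi_true = np.zeros((N, N, N)); chi_true[12:20, 12:20, 12:20] = 1.0

        def dipole_kernel(bdir):
            """k-space dipole kernel 1/3 - (k.b)^2 / |k|^2 for unit B0 direction bdir."""
            kb = kx * bdir[0] + ky * bdir[1] + kz_ax * bdir[2]
            k2 = kx**2 + ky**2 + kz_ax**2
            with np.errstate(invalid="ignore", divide="ignore"):
                d = 1.0 / 3.0 - kb**2 / k2
            d[0, 0, 0] = 0.0
            return d

        # Simulate field maps for three object orientations relative to B0.
        b_dirs = [np.array([0, 0, 1.0]),
                  np.array([0, np.sin(0.3), np.cos(0.3)]),
                  np.array([np.sin(0.3), 0, np.cos(0.3)])]
        kernels = [dipole_kernel(b) for b in b_dirs]
        fields_k = [d * np.fft.fftn(chi_true) for d in kernels]  # noiseless fields, k-space

        # COSMOS-style per-k-point least squares (uniform weights assumed here).
        num = sum(d * f for d, f in zip(kernels, fields_k))
        den = sum(d**2 for d in kernels)
        chi_k = np.where(den > 1e-6, num / np.maximum(den, 1e-12), 0.0)
        chi_est = np.real(np.fft.ifftn(chi_k))
        print(np.abs(chi_est - chi_true).max())           # report reconstruction error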

  6. GEMAS: Geochemical mapping of the agricultural and grazing land soils of Europe

    NASA Astrophysics Data System (ADS)

    Reimann, Clemens; Fabian, Karl; Birke, Manfred; Demetriades, Alecos; Matschullat, Jörg; Gemas Project Team

    2017-04-01

    Geochemical Mapping of Agricultural and grazing land Soil (GEMAS) is a cooperative project between the Geochemistry Expert Group of EuroGeoSurveys and Eurometaux. During 2008 and until early 2009, a total of 2108 samples of agricultural soil (ploughed land, 0-20 cm, Ap samples) and 2023 samples of grazing land soil (0-10 cm, Gr samples) were collected at a density of 1 site/2500 km2 each from 33 European countries, covering an area of 5,600,000 km2. All samples were analysed for 52 chemical elements following an aqua regia extraction, 42 elements by XRF (total), and soil properties, like CEC, TOC and pH (CaCl2), following tight external quality control procedures. In addition, the Ap soil samples were analysed for 57 elements in a mobile metal ion (MMI®) extraction, Pb isotopes, magnetic susceptibility, and total C, N and S. The results demonstrate that robust geochemical maps of Europe can be constructed based on low-density sampling; the two independent sample materials, Ap and Gr, show very comparable distribution patterns across Europe. At the European scale, element distribution patterns are governed by natural processes, most often a combination of geology and climate. The geochemical maps reflect most of the known metal mining districts in Europe. In addition, a number of new anomalies emerge that may indicate mineral potential. The size of some anomalies is such that they can only be detected when mapping at the continental scale. For some elements, completely new geological settings are detected. An anthropogenic impact at a much more local scale is discernible in the immediate vicinity of some major European cities (e.g., London, Paris) and some metal smelters. The impact of agriculture is visible for Cu (vineyard soils) and, for some further elements, only in the mobile metal ion (MMI) extraction. For several trace elements, deficiency issues are a larger threat to plant, animal and, finally, human health at the European scale than toxicity. Taking the famous step back to see the whole picture at the continental scale and to understand the relative importance of the processes leading to element enrichment/depletion in soil may hold unexpected promise for mineral exploration as well as for environmental sciences.

  7. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  8. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghattas, Omar

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
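
    As a hedged illustration of the "reduce then sample" idea described in these two project reports, the sketch below runs a random-walk Metropolis chain against a cheap surrogate log-posterior in place of an expensive full-model log-posterior. Both densities here are toy stand-ins; the project's actual reduced-order models and adjoint-based machinery are far more elaborate.

        import numpy as np

        rng = np.random.default_rng(0)

        def full_log_post(theta):
            """Stand-in for an expensive full-model log-posterior (imagine a PDE solve here)."""
            return -0.5 * np.sum((theta - 1.0) ** 2 / 0.25)

        def reduced_log_post(theta):
            """Cheap surrogate: slightly mis-specified but much faster to evaluate."""
            return -0.5 * np.sum((theta - 1.05) ** 2 / 0.25)

        def metropolis(log_post, n_steps=5000, dim=2, step=0.3):
            theta, lp, samples = np.zeros(dim), log_post(np.zeros(dim)), []
            for _ in range(n_steps):
                prop = theta + step * rng.standard_normal(dim)
                lp_prop = log_post(prop)
                if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
                    theta, lp = prop, lp_prop
                samples.append(theta.copy())
            return np.array(samples)

        # "Reduce then sample": run the chain on the surrogate, then (optionally) reweight
        # or spot-check a subset of samples against the full model.
        samples = metropolis(reduced_log_post)
        print(samples[1000:].mean(axis=0))                 # posterior mean under the surrogate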

  9. Shallow Water Habitat Mapping in Cape Cod National Seashore: A Post-Hurricane Sandy Study

    NASA Astrophysics Data System (ADS)

    Borrelli, M.; Smith, T.; Legare, B.; Mittermayr, A.

    2017-12-01

    Hurricane Sandy had a dramatic impact along coastal areas in proximity to landfall in late October 2012, and those impacts have been well-documented in terrestrial coastal settings. However, due to the lack of data on submerged marine habitats, similar subtidal impact studies have been limited. This study, one of four contemporaneous studies commissioned by the US National Park Service, developed maps of submerged shallow water marine habitats in and around Cape Cod National Seashore, Massachusetts. All four studies used similar methods of data collection, processing and analysis for the production of habitat maps. One of the motivations for the larger study conducted in the four coastal parks was to provide park managers with a baseline inventory of submerged marine habitats, against which to measure change after future storm events and other natural and anthropogenic phenomena. In this study data from a phase-measuring sidescan sonar, bottom grab samples, seismic reflection profiling, and sediment coring were all used to develop submerged marine habitat maps using the Coastal and Marine Ecological Classification Standard (CMECS). Vessel-based acoustic surveys (n = 76) were conducted in extreme shallow water across four embayments from 2014-2016. Sidescan sonar imagery covering 83.37 km2 was collected, and within that area, 49.53 km2 of co-located bathymetric data were collected with a mean depth of 4.00 m. Bottom grab samples (n = 476) to sample macroinvertebrates and sediments (along with other water column and habitat data) were collected, and these data were used along with the geophysical and coring data to develop final habitat maps using the CMECS framework.

  10. On the ecological relevance of landscape mapping and its application in the spatial planning of very large marine protected areas.

    PubMed

    Hogg, Oliver T; Huvenne, Veerle A I; Griffiths, Huw J; Linse, Katrin

    2018-06-01

    In recent years very large marine protected areas (VLMPAs) have become the dominant form of spatial protection in the marine environment. Whilst seen as a holistic and geopolitically achievable approach to conservation, there is currently a mismatch between the size of VLMPAs, and the data available to underpin their establishment and inform on their management. Habitat mapping has increasingly been adopted as a means of addressing paucity in biological data, through use of environmental proxies to estimate species and community distribution. Small-scale studies have demonstrated environmental-biological links in marine systems. Such links, however, are rarely demonstrated across larger spatial scales in the benthic environment. As such, the utility of habitat mapping as an effective approach to the ecosystem-based management of VLMPAs remains, thus far, largely undetermined. The aim of this study was to assess the ecological relevance of broadscale landscape mapping. Specifically we test the relationship between broad-scale marine landscapes and the structure of their benthic faunal communities. We focussed our work at the sub-Antarctic island of South Georgia, site of one of the largest MPAs in the world. We demonstrate a statistically significant relationship between environmentally derived landscape mapping clusters, and the composition of presence-only species data from the region. To demonstrate this relationship required specific re-sampling of historical species occurrence data to balance biological rarity, biological cosmopolitism, range-restricted sampling and fine-scale heterogeneity between sampling stations. The relationship reveals a distinct biological signature in the faunal composition of individual landscapes, attributing ecological relevance to South Georgia's environmentally derived marine landscape map. We argue therefore, that landscape mapping represents an effective framework for ensuring representative protection of habitats in management plans. Such scientific underpinning of marine spatial planning is critical in balancing the needs of multiple stakeholders whilst maximising conservation payoff. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Whole genome sequences are required to fully resolve the linkage disequilibrium structure of human populations.

    PubMed

    Pengelly, Reuben J; Tapper, William; Gibson, Jane; Knut, Marcin; Tearle, Rick; Collins, Andrew; Ennis, Sarah

    2015-09-03

    An understanding of linkage disequilibrium (LD) structures in the human genome underpins much of medical genetics and provides a basis for disease gene mapping and investigating biological mechanisms such as recombination and selection. Whole genome sequencing (WGS) provides the opportunity to determine LD structures at maximal resolution. We compare LD maps constructed from WGS data with LD maps produced from the array-based HapMap dataset, for representative European and African populations. WGS provides up to 5.7-fold greater SNP density than array-based data and achieves much greater resolution of LD structure, allowing for identification of up to 2.8-fold more regions of intense recombination. The absence of ascertainment bias in variant genotyping improves the population representativeness of the WGS maps, and highlights the extent of uncaptured variation using array genotyping methodologies. The complete capture of LD patterns using WGS allows for higher genome-wide association study (GWAS) power compared to array-based GWAS, with WGS also allowing for the analysis of rare variation. The impact of marker ascertainment issues in arrays has been greatest for Sub-Saharan African populations where larger sample sizes and substantially higher marker densities are required to fully resolve the LD structure. WGS provides the best possible resource for LD mapping due to the maximal marker density and lack of ascertainment bias. WGS LD maps provide a rich resource for medical and population genetics studies. The increasing availability of WGS data for large populations will allow for improved research utilising LD, such as GWAS and recombination biology studies.

  12. A quantitative approach to measure road network information based on edge diversity

    NASA Astrophysics Data System (ADS)

    Wu, Xun; Zhang, Hong; Lan, Tian; Cao, Weiwei; He, Jing

    2015-12-01

    The measure of map information has been one of the key issues in assessing cartographic quality and map generalization algorithms. It is also important for developing efficient approaches to transfer geospatial information. Road networks are the most common linear objects in the real world. Approximate descriptions of road network information will benefit road map generalization, navigation map production, and urban planning. Most current approaches focus on node diversity and assume that all edges are the same, which is inconsistent with real-life conditions and thus shows limitations in measuring network information. As real-life traffic flows are directed and of different quantities, the original undirected vector road map was first converted to a directed topographic connectivity map. Then, in consideration of preferential attachment in complex network studies and the rich-club phenomenon in social networks, from and to weights were assigned to each edge. The from weight of a given edge is defined as the ratio of the connectivity of its end node to the sum of the connectivities of all the neighbors of the edge's from node. After obtaining the from and to weights of each edge, the edge information, node information, and whole-network structure information entropies could be obtained based on information theory. The approach has been applied to several 1-square-mile road network samples. Results show that information entropies based on edge diversities can successfully describe the structural differences of road networks. This approach complements current map information measurements and can be extended to measure other kinds of geographical objects.
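
    To make the entropy calculation concrete, here is a hedged sketch of one plausible reading of the scheme: directed edges are weighted by the ratio of the end node's connectivity to the summed connectivities of the start node's neighbours, the weights are normalized into a probability distribution, and Shannon entropy is computed. The tiny graph and the exact weighting formula are illustrative assumptions, not the paper's specification.

        import math
        from collections import defaultdict

        # Hypothetical directed road graph: list of (from_node, to_node) edges.
        edges = [("A", "B"), ("B", "C"), ("C", "A"), ("B", "D"), ("D", "C")]
        succ, degree = defaultdict(list), defaultdict(int)
        for u, v in edges:
            succ[u].append(v)
            degree[u] += 1        # connectivity measured here as out-degree
            degree[v] += 0        # ensure every node has an entry

        def from_weight(u, v):
            """Ratio of the end node's connectivity to the summed connectivity of u's neighbours."""
            total = sum(degree[n] for n in succ[u])
            return degree[v] / total if total else 0.0

        weights = {(u, v): from_weight(u, v) for u, v in edges}

        # Normalize edge weights into a probability distribution, then Shannon entropy.
        s = sum(weights.values())
        probs = [w / s for w in weights.values() if w > 0]
        H = -sum(p * math.log2(p) for p in probs)
        print(f"edge-based network information entropy: {H:.3f} bits")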

  13. An evaluation of rapid methods for monitoring vegetation characteristics of wetland bird habitat

    USGS Publications Warehouse

    Tavernia, Brian G.; Lyons, James E.; Loges, Brian W.; Wilson, Andrew; Collazo, Jaime A.; Runge, Michael C.

    2016-01-01

    Wetland managers benefit from monitoring data of sufficient precision and accuracy to assess wildlife habitat conditions and to evaluate and learn from past management decisions. For large-scale monitoring programs focused on waterbirds (waterfowl, wading birds, secretive marsh birds, and shorebirds), precision and accuracy of habitat measurements must be balanced with fiscal and logistic constraints. We evaluated a set of protocols for rapid, visual estimates of key waterbird habitat characteristics made from the wetland perimeter against estimates from (1) plots sampled within wetlands, and (2) cover maps made from aerial photographs. Estimated percent cover of annuals and perennials using a perimeter-based protocol fell within 10 percent of plot-based estimates, and percent cover estimates for seven vegetation height classes were within 20 % of plot-based estimates. Perimeter-based estimates of total emergent vegetation cover did not differ significantly from cover map estimates. Post-hoc analyses revealed evidence for observer effects in estimates of annual and perennial covers and vegetation height. Median time required to complete perimeter-based methods was less than 7 percent of the time needed for intensive plot-based methods. Our results show that rapid, perimeter-based assessments, which increase sample size and efficiency, provide vegetation estimates comparable to more intensive methods.

  14. Using gradient-based ray and candidate shadow maps for environmental illumination distribution estimation

    NASA Astrophysics Data System (ADS)

    Eem, Changkyoung; Kim, Iksu; Hong, Hyunki

    2015-07-01

    A method to estimate the environmental illumination distribution of a scene with gradient-based ray and candidate shadow maps is presented. In the shadow segmentation stage, we apply a Canny edge detector to the shadowed image by using a three-dimensional (3-D) augmented reality (AR) marker of a known size and shape. Then the hierarchical tree of the connected edge components representing the topological relation is constructed, and the connected components are merged, taking their hierarchical structures into consideration. A gradient-based ray that is perpendicular to the gradient of the edge pixel in the shadow image can be used to extract the shadow regions. In the light source detection stage, shadow regions with both a 3-D AR marker and the light sources are partitioned into candidate shadow maps. A simple logic operation between each candidate shadow map and the segmented shadow is used to efficiently compute the area ratio between them. The proposed method successively extracts the main light sources according to their relative contributions on the segmented shadows. The proposed method can reduce unwanted effects due to the sampling positions in the shadow region and the threshold values in the shadow edge detection.

  15. Application of a New Grain-Based Reconstruction Algorithm to Microtomography Images for Quantitative Characterization and Flow Modeling

    DTIC Science & Technology

    2008-06-01

    mapping the X-ray absorption through the sample. The amount of absorption depends on the chemical composition and structure of the material and the X... obtained by measuring the X-ray attenuation coefficients of the sample at different angles as the sample is rotated about the vertical axis. These... McMaster University, Hamilton, Ontario, Canada. Allen H. Reed is a geologist with the Naval Research Laboratory. His research interests are in marine

  16. Mapping Secondary Forest Succession on Abandoned Agricultural Land in the Polish Carpathians

    NASA Astrophysics Data System (ADS)

    Kolecka, N.; Kozak, J.; Kaim, D.; Dobosz, M.; Ginzler, Ch.; Psomas, A.

    2016-06-01

    Land abandonment and secondary forest succession have played a significant role in land cover changes and forest cover increase in mountain areas in Europe over the past several decades. Land abandonment can be easily observed in the field over small areas, but it is difficult to map over large areas, e.g., with remote sensing, due to its subtle and spatially dispersed character. Our previous paper presented how LiDAR (Light Detection and Ranging) and topographic data were used to detect secondary forest succession on abandoned land in one commune located in the Polish Carpathians by means of object-based image analysis (OBIA) and GIS (Kolecka et al., 2015). This paper proposes how the method can be applied to efficiently map secondary forest succession over the entire Polish Carpathians, incorporating a spatial sampling strategy supported by various ancillary data. Here we discuss the methods of spatial sampling, their limitations, and the results in the context of future secondary forest succession modelling.

  17. Large area optical mapping of surface contact angle.

    PubMed

    Dutra, Guilherme; Canning, John; Padden, Whayne; Martelli, Cicero; Dligatch, Svetlana

    2017-09-04

    Top-down contact angle measurements have been validated and confirmed to be as good as, if not more reliable than, side-based measurements. A range of samples, including industrially relevant materials for roofing and printing, has been compared. Using the top-down approach, mapping in both 1-D and 2-D has been demonstrated. The method was applied to study the change in contact angle as a function of change in silver (Ag) nanoparticle size controlled by thermal evaporation. Large area mapping reveals good uniformity for commercial Aspen paper coated with black laser printer ink. A demonstration of the forensic and chemical analysis potential in 2-D is shown by uncovering the hidden CsF initials made with mineral oil on the coated Aspen paper. The method promises to revolutionize nanoscale characterization and industrial monitoring as well as chemical analyses by allowing rapid contact angle measurements over large areas or large numbers of samples in ways and times that have not been possible before.

  18. Quality control of the paracetamol drug by chemometrics and imaging spectroscopy in the near infrared region

    NASA Astrophysics Data System (ADS)

    Baptistao, Mariana; Rocha, Werickson Fortunato de Carvalho; Poppi, Ronei Jesus

    2011-09-01

    In this work, imaging spectroscopy and chemometric tools were used for the development and analysis of paracetamol and excipients in pharmaceutical formulations. Concentration maps were also built to study the distribution of the drug on the tablet surface. Multivariate models based on PLS regression were developed for the prediction of paracetamol and excipient concentrations. For the construction of the models, 31 samples in tablet form containing the active principle in a concentration range of 30.0-90.0% (w/w) were used, and errors below 5% were obtained for the validation samples. Finally, the distribution of the drug was studied through distribution maps of the concentration of the active principle and excipients. The analysis of the maps showed the complementarity between the active principle and the excipients in the tablets: a region with a high concentration of one constituent must necessarily show absence or low concentration of the other. Thus, an alternative method for paracetamol drug quality monitoring is presented.
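
    As a hedged illustration of the multivariate calibration step, the sketch below fits a PLS regression on synthetic spectra and predicts the concentration of a validation sample; the number of spectral channels, latent variables and the synthetic data are placeholders rather than the paper's 31-tablet NIR dataset.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n_samples, n_wavelengths = 31, 200               # sizes chosen for illustration only

        # Synthetic "NIR spectra": two overlapping Gaussian bands whose heights track the
        # paracetamol and excipient concentrations, plus noise.
        conc = rng.uniform(30.0, 90.0, n_samples)        # active principle, % (w/w)
        wl = np.linspace(0, 1, n_wavelengths)
        band_api = np.exp(-((wl - 0.4) / 0.05) ** 2)
        band_exc = np.exp(-((wl - 0.7) / 0.08) ** 2)
        X = (conc[:, None] * band_api + (100 - conc)[:, None] * band_exc
             + rng.normal(0, 0.5, (n_samples, n_wavelengths)))

        pls = PLSRegression(n_components=3)
        pls.fit(X, conc)

        # Predict a new "tablet" containing 60% active principle.
        x_new = 60 * band_api + 40 * band_exc
        print(pls.predict(x_new[None, :]))               # should be close to 60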

  19. Micro-Laser-Induced Breakdown Spectroscopy (Micro-LIBS) Study on Ancient Roman Mortars.

    PubMed

    Pagnotta, Stefano; Lezzerini, Marco; Ripoll-Seguer, Laura; Hidalgo, Montserrat; Grifoni, Emanuela; Legnaioli, Stefano; Lorenzetti, Giulia; Poggialini, Francesco; Palleschi, Vincenzo

    2017-04-01

    The laser-induced breakdown spectroscopy (LIBS) technique was used for analyzing the composition of an ancient Roman mortar (5th century A.D.), exploiting an experimental setup which allows the determination of the compositions of binder and aggregate in a few minutes, without the need for sample treatment. Four thousand LIBS spectra were acquired from an area of 10 mm², with a 50 µm lateral resolution. The elements of interest in the mortar sample (H, C, O, Na, Mg, Al, Si, K, Ca, Ti, Mn, Fe) were detected and mapped. The collected data, graphically shown as compositional images, were interpreted using different statistical approaches for the determination of the chemical composition of the binder and aggregate fractions. The methods of false color imaging, blind separation, and self-organizing maps were applied and their results are discussed in this paper. In particular, the method based on the use of self-organizing maps gives well interpretable results in very short times, without any reduction in the dimensionality of the system.
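
    For readers unfamiliar with the self-organizing map step, here is a hedged, from-scratch sketch of SOM training on stand-in multi-element intensity vectors; the grid size, learning schedule and random data are illustrative assumptions, not the configuration used for the LIBS maps.

        import numpy as np

        rng = np.random.default_rng(0)
        n_pixels, n_elements = 500, 12                 # stand-in for per-pixel LIBS line intensities
        data = rng.random((n_pixels, n_elements))

        grid_h, grid_w = 3, 3                          # small SOM grid, illustrative
        weights = rng.random((grid_h, grid_w, n_elements))
        coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1)

        n_iter, lr0, sigma0 = 2000, 0.5, 1.5
        for t in range(n_iter):
            x = data[rng.integers(n_pixels)]
            # Best-matching unit: node whose weight vector is closest to the sample.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Decaying learning rate and neighbourhood radius.
            lr = lr0 * np.exp(-t / n_iter)
            sigma = sigma0 * np.exp(-t / n_iter)
            dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-dist2 / (2 * sigma**2))[..., None]
            weights += lr * h * (x - weights)          # pull the neighbourhood toward the sample

        # Assign each pixel to its best-matching node -> cluster map of the scanned area.
        labels = np.array([np.argmin(np.linalg.norm(weights - x, axis=-1)) for x in data])
        print(np.bincount(labels, minlength=grid_h * grid_w))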

  20. The Use of Geostatistics in the Study of Floral Phenology of Vulpia geniculata (L.) Link

    PubMed Central

    León Ruiz, Eduardo J.; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time and they can be used for both quantifying spatial correlation and interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect phenological evolution of Vulpia geniculata (L.) Link throughout the study area during sampling season. Ten sampling points, scattered throughout the city and low mountains in the “Sierra de Córdoba” were chosen to carry out the weekly phenological monitoring during flowering season. The phenological data were interpolated by applying the traditional geostatistical method of Kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps. PMID:22629169

  1. The use of geostatistics in the study of floral phenology of Vulpia geniculata (L.) link.

    PubMed

    León Ruiz, Eduardo J; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time and they can be used for both quantifying spatial correlation and interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect phenological evolution of Vulpia geniculata (L.) Link throughout the study area during sampling season. Ten sampling points, scattered throughout the city and low mountains in the "Sierra de Córdoba" were chosen to carry out the weekly phenological monitoring during flowering season. The phenological data were interpolated by applying the traditional geostatistical method of Kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps.

  2. Ultra-high resolution, polarization sensitive transversal optical coherence tomography for structural analysis and strain mapping

    NASA Astrophysics Data System (ADS)

    Wiesauer, Karin; Pircher, Michael; Goetzinger, Erich; Hitzenberger, Christoph K.; Engelke, Rainer; Ahrens, Gisela; Pfeiffer, Karl; Ostrzinski, Ute; Gruetzner, Gabi; Oster, Reinhold; Stifter, David

    2006-02-01

    Optical coherence tomography (OCT) is a contactless and non-invasive technique nearly exclusively applied for bio-medical imaging of tissues. Besides the internal structure, additionally strains within the sample can be mapped when OCT is performed in a polarization sensitive (PS) way. In this work, we demonstrate the benefits of PS-OCT imaging for non-biological applications. We have developed the OCT technique beyond the state-of-the-art: based on transversal ultra-high resolution (UHR-)OCT, where an axial resolution below 2 μm within materials is obtained using a femtosecond laser as light source, we have modified the setup for polarization sensitive measurements (transversal UHR-PS-OCT). We perform structural analysis and strain mapping for different types of samples: for a highly strained elastomer specimen we demonstrate the necessity of UHR-imaging. Furthermore, we investigate epoxy waveguide structures, photoresist moulds for the fabrication of micro-electromechanical parts (MEMS), and the glass-fibre composite outer shell of helicopter rotor blades where cracks are present. For these examples, transversal scanning UHR-PS-OCT is shown to provide important information about the structural properties and the strain distribution within the samples.

  3. Tree Cover Mapping Tool—Documentation and user manual

    USGS Publications Warehouse

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2016-06-02

    The Tree Cover Mapping (TCM) tool was developed by scientists at the U.S. Geological Survey Earth Resources Observation and Science Center to allow a user to quickly map tree cover density over large areas using visual interpretation of high resolution imagery within a geographic information system interface. The TCM tool uses a systematic sample grid to produce maps of tree cover. The TCM tool allows the user to define sampling parameters to estimate tree cover within each sample unit. This mapping method generated the first on-farm tree cover maps of vast regions of Niger and Burkina Faso. The approach contributes to implementing integrated landscape management to scale up re-greening and restore degraded land in the drylands of Africa. The TCM tool is easy to operate, practical, and can be adapted to many other applications such as crop mapping, settlements mapping, or other features. This user manual provides step-by-step instructions for installing and using the tool, and creating tree cover maps. Familiarity with ArcMap tools and concepts is helpful for using the tool.

  4. Regional geochemistry Bandung Quadrangle West Java: for environmental and resources studies

    NASA Astrophysics Data System (ADS)

    Sendjaja, Purnama; Baharuddin

    2017-06-01

    Geochemical mapping based on the stream sediment method has been carried out across the whole of the Java Region by the Centre for Geological Survey. The Regional Geochemistry Bandung Quadrangle, part of the West Java Region, has been mapped at 1:100,000 scale, based on the Geological Map of the Bandung Quadrangle. About 82 stream sediment samples were collected and sieved to the 80-mesh fraction during field work in 2011. This fraction was prepared and analysed for 30 elements by X-ray fluorescence spectrometry at the Centre for Geological Survey Laboratory. Some elements show significant anomalies in this region, and it is important to determine their present abundance and spatial distribution in order to assess whether they result from natural processes or are derived from human activities. The volcanic products (Tangkuban Perahu Volcano, Volcanic Rock Complex and Quaternary Volcanic-Alluvial Deposit) are clearly identified in the distribution of the As, Ba, Cl, Cu, Zr and La elements. Mn, Zn, V and Sr are related to precipitation in the Tertiary sediments, while the influence of human activities is shown in the geochemical maps of Cl, Cr, Cu, Pb and Zn, which display scattered anomalies localized close to cities, farming and industries.

  5. Mapping the Gulf of Maine with side-scan sonar: A new bottom-type classification for complex seafloors

    USGS Publications Warehouse

    Barnhardt, W.A.; Kelley, J.T.; Dickson, S.M.; Belknap, D.F.

    1998-01-01

    The bedrock-framed seafloor in the northwestern Gulf of Maine is characterized by extreme changes in bathymetric relief and covered with a wide variety of surficial materials. Traditional methods of mapping cannot accurately represent the great heterogeneity of such a glaciated region. A new mapping scheme for complex seafloors, based primarily on the interpretation of side-scan sonar imagery, utilizes four easily recognized units: rock, gravel, sand and mud. In many places, however, the seafloor exhibits a complicated mixture or extremely 'patchy' distribution of the four basic units, which are too small to map individually. Twelve composite units, each a two-component mixture of the basic units, were established to represent this patchiness at a small scale (1:100,000). Using a geographic information system, these and all other available data (seismic profiles, grab samples, submersible dives and cores) were referenced to a common geographic base, superimposed on bathymetric contours and then integrated into surficial geologic maps of the regional inner continental shelf. This digital representation of the seafloor comprises a multidimensional, interactive model complete with explicit attributes (depth, bottom type) that allow for detailed analysis of marine environments.

  6. Evaluation of a PCR assay on overgrown environmental samples cultured for Mycobacterium avium subsp. paratuberculosis.

    PubMed

    Arango-Sabogal, Juan C; Labrecque, Olivia; Paré, Julie; Fairbrother, Julie-Hélène; Roy, Jean-Philippe; Wellemans, Vincent; Fecteau, Gilles

    2016-11-01

    Culture of Mycobacterium avium subsp. paratuberculosis (MAP) is the definitive antemortem test method for paratuberculosis. Microbial overgrowth is a challenge for MAP culture, as it complicates, delays, and increases the cost of the process. Additionally, herd status determination is impeded when noninterpretable (NI) results are obtained. The performance of PCR is comparable to fecal culture, thus it may be a complementary detection tool to classify NI samples. Our study aimed to determine if MAP DNA can be identified by PCR performed on NI environmental samples and to evaluate the performance of PCR before and after the culture of these samples in liquid media. A total of 154 environmental samples (62 NI, 62 negative, and 30 positive) were analyzed by PCR before being incubated in an automated system. Growth was confirmed by acid-fast bacilli stain and then the same PCR method was again applied on incubated samples, regardless of culture and stain results. Change in MAP DNA after incubation was assessed by converting the PCR quantification cycle (Cq) values into fold change using the 2 -ΔCq method (ΔCq = Cq after culture - Cq before culture). A total of 1.6% (standard error [SE] = 1.6) of the NI environmental samples had detectable MAP DNA. The PCR had a significantly better performance when applied after culture than before culture (p = 0.004). After culture, a 66-fold change (SE = 17.1) in MAP DNA was observed on average. Performing a PCR on NI samples improves MAP culturing. The PCR method used in our study is a reliable and consistent method to classify NI environmental samples. © 2016 The Author(s).
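
    The fold-change calculation mentioned above is simple enough to show directly; the Cq values below are made up for illustration and are not from the study.

        # Fold change in MAP DNA after liquid-media incubation, using the 2**(-dCq) method,
        # where dCq = Cq_after_culture - Cq_before_culture (hypothetical values).
        cq_before = 34.2
        cq_after = 28.1

        delta_cq = cq_after - cq_before
        fold_change = 2 ** (-delta_cq)
        print(f"fold change: {fold_change:.1f}x")   # ~68x, i.e., more target DNA after culture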

  7. Monitoring biological impacts of space shuttle launches from Vandenberg Air Force Base: Establishment of baseline conditions

    NASA Technical Reports Server (NTRS)

    Schmaizer, Paul A.; Hinkle, C. Ross

    1987-01-01

    Space shuttle launches produce environmental impacts resulting from the formation of an exhaust cloud containing hydrogen chloride aerosols and aluminum oxide particulates. Studies have shown that most impacts occur near-field (within 1.5 km) of the launch site while deposition from launches occurs far-field (as distant as 22 km). In order to establish baseline conditions of vegetation and soils in the areas likely to be impacted by shuttle launches from Vandenberg Air Force Base (VAFB), vegetation and soils in the vicinity of Space Launch Complex-6 (SLC-6) were sampled and a vegetation map prepared. The areas likely to be impacted by launches were determined considering the structure of the launch complex, the prevailing winds, the terrain, and predictions of the Rocket Exhaust Effluent Diffusion Model (REEDM). Fifty vegetation transects were established and sampled in March 1986 and resampled in September 1986. A vegetation map was prepared for six Master Planning maps surrounding SLC-6 using LANDSAT Thematic Mapper imagery as well as color and color infrared aerial photography. Soil samples were collected from the 0 to 7.5 cm layer at all transects in the wet season and at a subsample of the transects in the dry season and analyzed for pH, organic matter, conductivity, cation exchange capacity, exchangeable Ca, Mg, Na, K, and Al, available NH3-N, PO4-P, Cu, Fe, Mn, Zn, and TKN.

  8. Marine habitat mapping of the Milford Haven Waterway, Wales, UK: Comparison of facies mapping and EUNIS classification for monitoring sediment habitats in an industrialized estuary

    NASA Astrophysics Data System (ADS)

    Carey, Drew A.; Hayn, Melanie; Germano, Joseph D.; Little, David I.; Bullimore, Blaise

    2015-06-01

    A detailed map and dataset of sedimentary habitats of the Milford Haven Waterway (MHW) was compiled for the Milford Haven Waterway Environmental Surveillance Group (MHWESG) from seafloor images collected in May, 2012 using sediment-profile and plan-view imaging (SPI/PV) survey techniques. This is the most comprehensive synoptic assessment of sediment distribution and benthic habitat composition available for the MHW, with 559 stations covering over 40 km2 of subtidal habitats. In the context of the MHW, an interpretative framework was developed that classified each station within a 'facies' that included information on the location within the waterway and inferred sedimentary and biological processes. The facies approach provides critical information on landscape-scale habitats including relative location and inferred sediment transport processes and can be used to direct future monitoring activities within the MHW and to predict areas of greatest potential risk from contaminant transport. Intertidal sediment 'facies' maps have been compiled in the past for MHW; this approach was expanded to map the subtidal portions of the waterway. Because sediment facies can be projected over larger areas than individual samples (due to assumptions based on physiography, or landforms) they represent an observational model of the distribution of sediments in an estuary. This model can be tested over time and space through comparison with additional past or future sample results. This approach provides a means to evaluate stability or change in the physical and biological conditions of the estuarine system. Initial comparison with past results for intertidal facies mapping and grain size analysis from grab samples showed remarkable stability over time for the MHW. The results of the SPI/PV mapping effort were cross-walked to the European Nature Information System (EUNIS) classification to provide a comparison of locally derived habitat mapping with European-standard habitat mapping. Cross-walk was conducted by assigning each facies (or group of facies) to a EUNIS habitat (Levels 3 or 5) and compiling maps comparing facies distribution with EUNIS habitat distribution. The facies approach provides critical information on landscape-scale habitats including relative location and inferred sediment transport processes. The SPI/PV approach cannot consistently identify key species contained within the EUNIS Level 5 Habitats. For regional planning and monitoring efforts, a combination of EUNIS classification and facies description provides the greatest flexibility for management of dynamic soft-bottom habitats in coastal estuaries. The combined approach can be used to generate and test hypotheses of linkages between biological characteristics (EUNIS) and physical characteristics (facies). This approach is practical if a robust cross-walk methodology is developed to utilize both classification approaches. SPI/PV technology can be an effective rapid ground truth method for refining marine habitat maps based on predictive models.

  9. Application of a simple cerebellar model to geologic surface mapping

    USGS Publications Warehouse

    Hagens, A.; Doveton, J.H.

    1991-01-01

    Neurophysiological research into the structure and function of the cerebellum has inspired computational models that simulate information processing associated with coordination and motor movement. The cerebellar model arithmetic computer (CMAC) has a design structure which makes it readily applicable as an automated mapping device that "senses" a surface, based on a sample of discrete observations of surface elevation. The model operates as an iterative learning process, where cell weights are continuously modified by feedback to improve surface representation. The storage requirements are substantially less than those of a conventional memory allocation, and the model is extended easily to mapping in multidimensional space, where the memory savings are even greater. © 1991.
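
    The CMAC idea lends itself to a compact sketch. Below is a minimal NumPy illustration of a CMAC-style associative memory with several coarse, offset tilings that iteratively "learns" a surface from scattered elevation observations; the tiling layout, learning rate, and test surface are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target surface and scattered "elevation observations" (assumptions).
def surface(x, y):
    return np.sin(3 * x) * np.cos(2 * y) + 0.5 * x

n_obs = 400
xy = rng.uniform(0.0, 1.0, size=(n_obs, 2))
z = surface(xy[:, 0], xy[:, 1]) + rng.normal(0.0, 0.02, n_obs)

# CMAC-style associative memory: several coarse, randomly offset tilings over [0, 1]^2.
n_tilings, n_tiles = 8, 10
offsets = rng.uniform(0.0, 1.0 / n_tiles, size=(n_tilings, 2))
weights = np.zeros((n_tilings, n_tiles + 2, n_tiles + 2))   # margin for offsets

def active_tiles(point):
    """Return the (tiling, ix, iy) cells activated by a 2-D input point."""
    idx = np.floor((point + offsets) * n_tiles).astype(int)
    return np.arange(n_tilings), idx[:, 0], idx[:, 1]

def predict(point):
    t, ix, iy = active_tiles(point)
    return weights[t, ix, iy].sum()

# Iterative learning: each observation nudges only its active cells (feedback).
alpha = 0.3
for epoch in range(30):
    for p, target in zip(xy, z):
        t, ix, iy = active_tiles(p)
        error = target - weights[t, ix, iy].sum()
        weights[t, ix, iy] += alpha * error / n_tilings

# "Sense" the learned surface on a regular grid, as a mapping device would.
grid = np.linspace(0.0, 1.0, 50)
learned = np.array([[predict(np.array([gx, gy])) for gx in grid] for gy in grid])
truth = surface(*np.meshgrid(grid, grid))
print("RMS surface error:", np.sqrt(np.mean((learned - truth) ** 2)))
```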

  10. Mapping underwater sound noise and assessing its sources by using a self-organizing maps method.

    PubMed

    Rako, Nikolina; Vilibić, Ivica; Mihanović, Hrvoje

    2013-03-01

    This study aims to provide an objective mapping of underwater noise and its sources over an Adriatic coastal marine habitat by applying the self-organizing maps (SOM) method. Systematic sampling of sea ambient noise (SAN) was carried out at ten predefined acoustic stations between 2007 and 2009. Analyses of noise levels were performed for 1/3 octave band standard centered frequencies in terms of instantaneous sound pressure levels averaged over 300 s to calculate the equivalent continuous sound pressure levels. Data on vessels' presence, type, and distance from the monitoring stations were also collected at each acoustic station during the acoustic sampling. Altogether, 69 noise surveys were introduced into the predefined 2 × 2 SOM array. The overall results of the analysis distinguished two dominant underwater soundscapes, associating them mainly with seasonal changes in nautical tourism and fishing activities within the study area and with wind and wave action. The analysis identified recreational vessels as the dominant anthropogenic source of underwater noise, particularly during the tourist season. The method proved to be an efficient tool for predicting SAN levels based on the vessel distribution, indicating also the possibility of its wider application in marine conservation.
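
    As a rough illustration of the clustering step, the sketch below trains a small self-organizing map on synthetic "noise survey" vectors and assigns each survey to a unit of a 2 × 2 array, echoing the predefined SOM size used in the study; the data, learning schedule, and network size are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in data: 69 "noise surveys", each a vector of 1/3-octave band levels (dB).
# Two synthetic regimes mimic a quiet and a vessel-dominated soundscape (assumption).
n_bands = 27
quiet = rng.normal(90.0, 3.0, size=(35, n_bands))
busy = rng.normal(110.0, 3.0, size=(34, n_bands)) + np.linspace(0, 10, n_bands)
surveys = np.vstack([quiet, busy])

# 2 x 2 self-organizing map, as in the study's predefined array.
grid = np.array([[i, j] for i in range(2) for j in range(2)], dtype=float)
codebook = rng.normal(100.0, 5.0, size=(4, n_bands))

def train_som(data, codebook, epochs=200, lr0=0.5, sigma0=1.0):
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)                  # decaying learning rate
        sigma = sigma0 * (1.0 - epoch / epochs) + 0.3      # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))
            # Gaussian neighbourhood on the 2 x 2 grid around the best-matching unit.
            d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))
            codebook += lr * h[:, None] * (x - codebook)
    return codebook

codebook = train_som(surveys, codebook)
assignments = np.argmin(
    np.linalg.norm(surveys[:, None, :] - codebook[None, :, :], axis=2), axis=1)
for unit in range(4):
    members = assignments == unit
    if members.any():
        print(f"unit {unit}: {members.sum():2d} surveys, "
              f"mean level {surveys[members].mean():.1f} dB")
    else:
        print(f"unit {unit}: empty")
```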

  11. Chemical Cartography. I. A Carbonicity Map of the Galactic Halo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Young Sun; Kim, Young Kwang; Beers, Timothy C.

    We present the first map of carbonicity, [C/Fe], for the halo system of the Milky Way, based on a sample of over 100,000 main-sequence turnoff stars with available spectroscopy from the Sloan Digital Sky Survey. This map, which explores distances up to 15 kpc from the Sun, reveals clear evidence for the dual nature of the Galactic halo, based on the spatial distribution of stellar carbonicity. The metallicity distribution functions of stars in the inner- and outer-halo regions of the carbonicity map reproduce those previously argued to arise from contributions of the inner- and outer-halo populations, with peaks at [Fe/H] = −1.5 and −2.2, respectively. From consideration of the absolute carbon abundances for our sample, A(C), we also confirm that the carbon-enhanced metal-poor (CEMP) stars in the outer-halo region exhibit a higher frequency of CEMP-no stars (those with no overabundances of heavy neutron-capture elements) than of CEMP-s stars (those with strong overabundances of elements associated with the s-process), whereas the stars in the inner-halo region exhibit a higher frequency of CEMP-s stars. We argue that the contrast in the behavior of the CEMP-no and CEMP-s fractions in these regions arises from differences in the mass distributions of the mini-halos from which the stars of the inner- and outer-halo populations formed, which gives rise in turn to the observed dichotomy of the Galactic halo.

  12. EVALUATING ECOREGIONS FOR SAMPLING AND MAPPING LAND-COVER PATTERNS

    EPA Science Inventory

    Ecoregional stratification has been proposed for sampling and mapping land-cover composition and pattern over time. Using a wall-to-wall land-cover map of the United States, we evaluated geographic scales of variance for 17 landscape pattern indices, and compared stratification ...

  13. Map and table showing isotopic age data in Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Shew, Nora B.; DuBois, G.D.

    1994-01-01

    The source of the data reported here is a compilation of radiometric ages maintained in conjunction with the Alaska Mineral Resource Assessment Program (AMRAP) studies for Alaska. The symbol shape plotted at each location is coded for rock type, whether igneous, metamorphic, or other; the color of the symbol shows the geologic era or period for the sample(s) at each locale. A list of references for each quadrangle is given to enable the user to find specific information, including analytical data, for each sample dated within a particular quadrangle. At the scale of this map, the very large number of samples and the clustering of the samples in limited areas prevented the showing of individual sample numbers on the map. Synthesis and interpretation of any data set requires the user to evaluate the reliability or value of each component of the data set with respect to his or her intended use of the data. For geochronological data, this evaluation must be based on both analytical and geological criteria. Most age determinations are published with calculated estimates of analytical precision. Replicate analyses are infrequently performed; therefore, reported analytical precision is based on estimates of the precision of various components of the analysis and often on an intuitive factor to cover components that may not have been considered. Analytical accuracy is somewhat more difficult to determine; it is not only dependent on the actual measurement, it is also concerned with uncertainties in decay and abundance constants, uncertainties in the isotopic composition and size of the tracer for conventional K-Ar ages, and uncertainties in the original isotopic composition of the sample. Geologic accuracy of a date is variable; the interpretation of the meaning of an age determination is important in the evaluation of its geologic accuracy. Potassium-argon, rubidium-strontium, and uranium-lead age determinations on a single sample can differ widely, yet none or all may be wrong. Given that the basic conditions of each dating method were met, each method determines an age based on the equilibration of its particular isotopic system; yet these are different systems, and they react to heat, pressure, and recrystallization in different ways. This map is a compilation and not a synthesis or interpretation. Its purpose is to help the user determine the dating coverage of areas of Alaska and gain access to the available data for the state or a project area. Interpretation of those data and evaluation of their suitability for use with any particular project is left to the user. Compilations with sample data have been published for much of the state, as follows: Wilson and others (1979), southeastern Alaska; Wilson (1981), Aleutian Islands and Alaska Peninsula; Shew and Wilson (1981), southwestern Alaska; Wilson and others (1985), Yukon Crystalline terrane; Grybeck and others (1977), northern Alaska; Dadisman (1980), south-central Alaska.

  14. Extinction maps toward the Milky Way bulge: Two-dimensional and three-dimensional tests with apogee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultheis, M.; Zasowski, G.; Allende Prieto, C.

    Galactic interstellar extinction maps are powerful and necessary tools for Milky Way structure and stellar population analyses, particularly toward the heavily reddened bulge and in the midplane. However, due to the difficulty of obtaining reliable extinction measures and distances for a large number of stars that are independent of these maps, tests of their accuracy and systematics have been limited. Our goal is to assess a variety of photometric stellar extinction estimates, including both two-dimensional and three-dimensional extinction maps, using independent extinction measures based on a large spectroscopic sample of stars toward the Milky Way bulge. We employ stellar atmospheric parameters derived from high-resolution H-band Apache Point Observatory Galactic Evolution Experiment (APOGEE) spectra, combined with theoretical stellar isochrones, to calculate line-of-sight extinction and distances for a sample of more than 2400 giants toward the Milky Way bulge. We compare these extinction values to those predicted by individual near-IR and near+mid-IR stellar colors, two-dimensional bulge extinction maps, and three-dimensional extinction maps. The long baseline, near+mid-IR stellar colors are, on average, the most accurate predictors of the APOGEE extinction estimates, and the two-dimensional and three-dimensional extinction maps derived from different stellar populations along different sightlines show varying degrees of reliability. We present the results of all of the comparisons and discuss reasons for the observed discrepancies. We also demonstrate how the particular stellar atmospheric models adopted can have a strong impact on this type of analysis, and discuss related caveats.

  15. Mapping Petroleum Migration Pathways Using Magnetics and Seismic Interpretations

    NASA Astrophysics Data System (ADS)

    Abubakar, R.; Muxworthy, A. R.; Sephton, M. A.; Fraser, A.; Heslop, D.; Paterson, G. A.; Southern, P.

    2015-12-01

    We report the formation of magnetic minerals in petroleum reservoirs. Eleven wells from the Wessex Basin in Dorset, southern England, were sampled from the British Geological Core Store, across the main reservoir unit, the Bridport Sandstone, and the overlying Inferior Oolite. Sampling was carried out based on visible evidence of oil staining and high magnetic susceptibility readings. The samples were chemically extracted to determine which were naturally stained with hydrocarbon and which were not. Magnetic analysis was carried out on all the samples: this included hysteresis analysis at low temperature (5-15 K) and room temperature, and low-temperature thermomagnetic analysis. The results indicated a trend consistent with the migration of hydrocarbons from the source area to the reservoir through the carrier beds.

  16. Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.

    PubMed

    Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia

    2017-04-01

    Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts of flood risk assessment. This study develops an integrated framework for estimating the spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), geographic information system, and remote sensing. The north part of the Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, and drainage proximity and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated by the maximum inundation extent extracted from Moderate Resolution Imaging Spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inference, and spatial mapping. It is superior to an existing spatial naive Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
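
    The entropy-based weighting step can be sketched as follows. The code computes entropy weights for a few stand-in environmental indices and combines them into a simple per-cell flood-likelihood score; the index names, values, and the linear aggregation are illustrative assumptions and do not reproduce the published weighted naïve Bayes model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in raster of environmental indices for n grid cells (assumed names/values);
# columns: extreme rainfall, elevation, slope, drainage density (illustrative only).
n_cells, names = 5000, ["rainfall", "elevation", "slope", "drain_density"]
X = np.column_stack([
    rng.gamma(2.0, 40.0, n_cells),      # rainfall (mm)
    rng.uniform(5.0, 300.0, n_cells),   # elevation (m)
    rng.uniform(0.0, 15.0, n_cells),    # slope (deg)
    rng.uniform(0.1, 3.0, n_cells),     # drainage density (km/km^2)
])

# 1) Min-max normalise each index (for elevation and slope the direction would
#    normally be reversed so that larger means more flood-prone; simplified here).
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# 2) Entropy weight method: indices whose values vary more across cells carry
#    more information and therefore receive larger weights.
P = (Xn + 1e-12) / (Xn + 1e-12).sum(axis=0)
entropy = -(P * np.log(P)).sum(axis=0) / np.log(n_cells)
weights = (1.0 - entropy) / (1.0 - entropy).sum()
print(dict(zip(names, np.round(weights, 3))))

# 3) A simple weighted score per cell (the published model feeds the weights into
#    naive Bayes class-conditional probabilities; this linear aggregation is only
#    shown to illustrate how the weights are applied).
flood_likelihood = Xn @ weights
top10 = flood_likelihood > np.quantile(flood_likelihood, 0.9)
print("cells in top-10% flood likelihood:", int(top10.sum()))
```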

  17. Vegetation database for land-cover mapping, Clark and Lincoln Counties, Nevada

    USGS Publications Warehouse

    Charlet, David A.; Damar, Nancy A.; Leary, Patrick J.

    2014-01-01

    Floristic and other vegetation data were collected at 3,175 sample sites to support land-cover mapping projects in Clark and Lincoln Counties, Nevada, from 2007 to 2013. Data were collected at sample sites that were selected to fulfill mapping priorities by one of two different plot sampling approaches. Samples were described at the stand level and classified into the National Vegetation Classification hierarchy at the alliance level and above. The vegetation database is presented in geospatial and tabular formats.

  18. Documenting research with transgender and gender diverse people: protocol for an evidence map and thematic analysis.

    PubMed

    Marshall, Zack; Welch, Vivian; Thomas, James; Brunger, Fern; Swab, Michelle; Shemilt, Ian; Kaposy, Chris

    2017-02-20

    There is limited information about how transgender, gender diverse, and Two-Spirit (trans) people have been represented and studied by researchers. The objectives of this study are to (1) map and describe trans research in the social sciences, sciences, humanities, health, education, and business, (2) identify evidence gaps and opportunities for more responsible research with trans people, (3) assess the use of text mining for study identification, and (4) increase access to trans research for key stakeholders through the creation of a web-based evidence map. Study design was informed by community consultations and pilot searches. Eligibility criteria were established to include all original research of any design that included trans people or their health information and was published in English in peer-reviewed journals. A complex electronic search strategy based on relevant concepts in 15 databases was developed to obtain a broad range of results linked to transgender, gender diverse, and Two-Spirit individuals and communities. Searches conducted in early 2015 resulted in 25,242 references after removal of duplicates. Based on the number of references, the available resources, and an objective to capture upwards of 90% of the existing literature, this study is a good candidate for text mining using Latent Dirichlet Allocation to improve the efficiency of the screening process. The following information will be collected for evidence mapping: study topic, study design, methods and data sources, recruitment strategies, sample size, sample demographics, researcher name and affiliation, country where research was conducted, funding source, and year of publication. The proposed research incorporates an extensive search strategy, text mining, and an evidence map; it therefore has the potential to build on knowledge in several fields. Review results will increase awareness of existing trans research, identify evidence gaps, and inform strategic research prioritization. Publishing the map online will improve access to research for key stakeholders including community members, policy makers, and healthcare providers. This study will also contribute to knowledge in the area of text mining for study identification by providing an example of how semi-automation performs for screening on title and abstract and on full text.
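
    As a rough sketch of the text-mining idea, the code below fits a Latent Dirichlet Allocation topic model to a toy set of record titles with scikit-learn and prints the document-topic proportions that could be used to prioritize records for manual screening; the corpus and the number of topics are assumptions, and the published workflow's screening rules are not reproduced.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny illustrative corpus standing in for the ~25,000 retrieved titles/abstracts.
records = [
    "transgender youth access to gender affirming hormone therapy",
    "hiv prevention outcomes among transgender women in urban clinics",
    "thermal performance of transgenic crop varieties under drought",   # off-topic hit
    "mental health and resilience in two-spirit and gender diverse communities",
    "gender diverse adolescents school belonging and wellbeing survey",
    "spectral analysis of stellar populations in dwarf galaxies",        # off-topic hit
]

# Bag-of-words representation followed by LDA topic modelling; the resulting topic
# proportions can be used to rank records for manual title/abstract screening.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(records)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)

terms = np.array(vectorizer.get_feature_names_out())
for k, topic in enumerate(lda.components_):
    print(f"topic {k}: {', '.join(terms[np.argsort(topic)[::-1][:5]])}")
print("document-topic proportions:\n", np.round(doc_topics, 2))
```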

  19. Computer-aided diagnosis of early knee osteoarthritis based on MRI T2 mapping.

    PubMed

    Wu, Yixiao; Yang, Ran; Jia, Sen; Li, Zhanjun; Zhou, Zhiyang; Lou, Ting

    2014-01-01

    This work aimed to study a method for computer-aided diagnosis of early knee OA (OA: osteoarthritis). Based on the technique of MRI (MRI: magnetic resonance imaging) T2 mapping, through computer image processing, feature extraction, calculation, and analysis via constructing a classifier, an effective computer-aided diagnosis method for knee OA was created to assist doctors in their accurate, timely and convenient detection of potential risk of OA. In order to evaluate this method, a total of 1380 data points from the MRI images of 46 knee joint samples were collected. These data were then modeled through linear regression on an offline general platform by the use of the ImageJ software, and a map of the physical parameter T2 was reconstructed. After the image processing, the T2 values of ten regions in the WORMS (WORMS: whole-organ magnetic resonance imaging score) areas of the articular cartilage were extracted to be used as the eigenvalues in data mining. Then, an RBF (RBF: radial basis function) network classifier was built to classify and identify the collected data. The classifier exhibited a final identification accuracy of 75%, indicating a good result of assisted diagnosis. Since the knee OA classifier constituted by a weights-directly-determined RBF neural network did not require any iteration, our results demonstrated that the optimal weights, appropriate center and variance could be yielded through simple procedures. Furthermore, the accuracy for both the training samples and the testing samples from the normal group could reach 100%. Finally, the classifier was superior both in time efficiency and classification performance to the frequently used classifiers based on iterative learning. Thus it is suitable to be used as an aid to computer-aided diagnosis of early knee OA.
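
    A weights-directly-determined RBF classifier amounts to building a radial-basis design matrix and solving one linear least-squares problem, with no iterative training. The sketch below does this for synthetic T2-like features from ten cartilage regions; the feature values, centers, kernel width, and train/test split are illustrative assumptions, not the published classifier.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in features: T2 values from 10 WORMS cartilage regions per knee (assumption).
n_per_class = 23
healthy = rng.normal(35.0, 3.0, size=(n_per_class, 10))   # ms
early_oa = rng.normal(42.0, 4.0, size=(n_per_class, 10))  # ms
X = np.vstack([healthy, early_oa])
y = np.array([0] * n_per_class + [1] * n_per_class)       # 0 = normal, 1 = early OA

def rbf_design(X, centers, sigma):
    """Gaussian RBF design matrix: one column per centre."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# "Weights directly determined": pick centres (here, the training samples), choose a
# width, and solve a single linear least-squares problem.
train = rng.permutation(len(y))[:36]
test = np.setdiff1d(np.arange(len(y)), train)
centers, sigma = X[train], 8.0
G = rbf_design(X[train], centers, sigma)
w, *_ = np.linalg.lstsq(G, y[train], rcond=None)

pred = (rbf_design(X[test], centers, sigma) @ w > 0.5).astype(int)
print("test accuracy:", (pred == y[test]).mean())
```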

  20. Integrative Cardiac Health Project, Windber Research Institute

    DTIC Science & Technology

    2014-07-01

    laparoscopically placed adjustable gastric banding (LAGB) baseline (5) and one year (5), control baseline (5) and one year (5). OD260/280 ratios...coverage and detection of 3-4 million CpG sites . All samples had a bisulfite conversion rate of >98.25%; number of CpG (methylated) sites per sample...methylation) and hyper-methylated (increasing methylation) sites in the three groups were identified. For LAGB patients, a heat map based on

  1. The IRHUM (Isotopic Reconstruction of Human Migration) database - bioavailable strontium isotope ratios for geochemical fingerprinting in France

    NASA Astrophysics Data System (ADS)

    Willmes, M.; McMorrow, L.; Kinsley, L.; Armstrong, R.; Aubert, M.; Eggins, S.; Falguères, C.; Maureille, B.; Moffat, I.; Grün, R.

    2013-11-01

    Strontium isotope ratios (87Sr/86Sr) are a key geochemical tracer used in a wide range of fields including archaeology, ecology, food and forensic sciences. These applications are based on the principle that the Sr isotopic ratios of natural materials reflect the sources of strontium available during their formation. A major constraint for current studies is the lack of robust reference maps to evaluate the source of strontium isotope ratios measured in the samples. Here we provide a new dataset of bioavailable Sr isotope ratios for the major geologic units of France, based on plant and soil samples (Pangaea data repository doi:10.1594/PANGAEA.819142). The IRHUM (Isotopic Reconstruction of Human Migration) database is a web platform to access, explore and map our dataset. The database provides the spatial context and metadata for each sample, allowing the user to evaluate the suitability of the sample for their specific study. In addition, it allows users to upload and share their own datasets and data products, which will enhance collaboration across the different research fields. This article describes the sampling and analytical methods used to generate the dataset and how to access and use the dataset through the IRHUM database. Any interpretation of the isotope dataset is outside the scope of this publication.

  2. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    NASA Astrophysics Data System (ADS)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment because of the services vegetation provides, and it requires accurate and up-to-date mapping. Very High Resolution satellite images enable a detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation, but they require a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two state-of-the-art window-based active learning algorithms are compared with a classical stratified random sampling, and a third strategy combining active learning and stratification is proposed. The efficiency of these strategies is evaluated on two medium-sized French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.

  3. Comparison of Probabilistic Coastal Inundation Maps Based on Historical Storms and Statistically Modeled Storm Ensemble

    NASA Astrophysics Data System (ADS)

    Feng, X.; Sheng, Y.; Condon, A. J.; Paramygin, V. A.; Hall, T.

    2012-12-01

    A cost-effective method, JPM-OS (Joint Probability Method with Optimal Sampling), for determining storm response and inundation return frequencies was developed and applied to quantify the hazard of hurricane storm surges and inundation along the southwest Florida (SW FL), US coast (Condon and Sheng 2012). The JPM-OS uses piecewise multivariate regression splines coupled with dimension-adaptive sparse grids to enable the generation of a base flood elevation (BFE) map. Storms are characterized by their landfall characteristics (pressure deficit, radius to maximum winds, forward speed, heading, and landfall location), and a sparse grid algorithm determines the optimal set of storm parameter combinations so that the inundation from any other storm parameter combination can be determined. The end result is a sample of a few hundred (197 for SW FL) optimal storms which are simulated using a dynamically coupled storm surge / wave modeling system, CH3D-SSMS (Sheng et al. 2010). The limited historical climatology (1940-2009) is explored to develop probabilistic characterizations of the five storm parameters. The probability distributions are discretized, and the inundation response of all parameter combinations is determined by interpolation in five-dimensional parameter space from the optimal storms. The surge response and the associated joint probability of each parameter combination are used to determine the flood elevation with a 1% annual probability of occurrence. The limited historical data constrain the accuracy of the PDFs of the hurricane characteristics, which in turn affects the accuracy of the calculated BFE maps. To offset the deficiency of the limited historical dataset, this study presents a different method for producing coastal inundation maps. Instead of using the historical storm data, here we adopt 33,731 tracks that represent the storm climatology of the North Atlantic basin and the SW Florida coast. This large set of hurricane tracks is generated from a new statistical model which had been used for Western North Pacific (WNP) tropical cyclone (TC) genesis (Hall 2011) as well as North Atlantic tropical cyclone genesis (Hall and Jewson 2007). The introduction of these tracks compensates for the shortage of historical samples and allows more reliable PDFs to be derived for the implementation of JPM-OS. Using the 33,731 tracks and JPM-OS, an optimal storm ensemble is determined. This approach results in different storms/winds for storm surge and inundation modeling, and produces different base flood elevation maps for coastal regions. Coastal inundation maps produced by the two different methods will be discussed in detail in the poster paper.
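
    The core JPM-OS bookkeeping, interpolating surge responses from a sparse "optimal" storm set onto a discretized joint parameter distribution and reading off the 1% annual-chance level, can be sketched in a reduced form. The code below uses only two storm parameters, an invented surge-response function, assumed parameter PDFs, and an assumed annual storm rate; it is a conceptual illustration, not the CH3D-SSMS-based workflow.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(4)

# Simplified two-parameter stand-in: pressure deficit (hPa) and radius to maximum
# winds (km). The response function, PDFs and annual storm rate are assumptions.
def surge_response(dp, rmw):
    """Peak surge (m) as a smooth function of the two storm parameters."""
    return 0.05 * dp * np.exp(-((rmw - 40.0) / 50.0) ** 2)

# "Optimal" sparse storm set: a few dozen parameter combinations actually simulated.
dp_s = rng.uniform(20.0, 100.0, 60)
rmw_s = rng.uniform(15.0, 90.0, 60)
surge_s = surge_response(dp_s, rmw_s)

# Discretised joint probability of the parameters (independent here for simplicity).
dp_grid, rmw_grid = np.linspace(20, 100, 81), np.linspace(15, 90, 76)
DP, RMW = np.meshgrid(dp_grid, rmw_grid)
p_dp = np.exp(-(dp_grid - 40.0) / 25.0); p_dp /= p_dp.sum()
p_rmw = np.exp(-0.5 * ((rmw_grid - 45.0) / 15.0) ** 2); p_rmw /= p_rmw.sum()
P = np.outer(p_rmw, p_dp)                       # joint probability per grid cell

# Interpolate surge from the sparse optimal set onto the full parameter grid.
surge_grid = griddata((dp_s, rmw_s), surge_s, (DP, RMW), method="linear")
surge_grid = np.where(np.isnan(surge_grid),
                      griddata((dp_s, rmw_s), surge_s, (DP, RMW), method="nearest"),
                      surge_grid)

# Annual exceedance curve and the 1%-annual-chance ("base flood") elevation.
annual_rate = 0.35                              # landfalling storms per year (assumed)
levels = np.linspace(0.0, surge_grid.max(), 400)
exceedance = np.array([annual_rate * P[surge_grid > z].sum() for z in levels])
bfe = levels[np.searchsorted(-exceedance, -0.01)]
print(f"1% annual-chance surge elevation ~ {bfe:.2f} m")
```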

  4. Integrating depth functions and hyper-scale terrain analysis for 3D soil organic carbon modeling in agricultural fields at regional scale

    NASA Astrophysics Data System (ADS)

    Ramirez-Lopez, L.; van Wesemael, B.; Stevens, A.; Doetterl, S.; Van Oost, K.; Behrens, T.; Schmidt, K.

    2012-04-01

    Soil Organic Carbon (SOC) represents a key component of the global C cycle and has an important influence on the global CO2 fluxes between the terrestrial biosphere and the atmosphere. In the context of agricultural landscapes, SOC inventories are important since soil management practices have a strong influence on CO2 fluxes and SOC stocks. However, there is a lack of accurate and cost-effective methods for producing high-spatial-resolution SOC information. In this respect, our work is focused on the development of a three-dimensional modeling approach for SOC monitoring in agricultural fields. The study area comprises ~420 km2 and includes 4 of the 5 agro-geological regions of the Grand-Duchy of Luxembourg. The soil dataset consists of 172 profiles (1033 samples) which were not sampled specifically for this study. This dataset is a combination of profile samples collected in previous soil surveys and soil profiles sampled for other research purposes. The proposed strategy comprises two main steps. In the first step the SOC distribution within each profile (vertical distribution) is modeled. Depth functions are fitted in order to summarize the information content of the profile. By using these functions the SOC can be interpolated at any depth within the profiles. The second step involves the use of contextual terrain (ConMap) features (Behrens et al., 2010). These features are based on the differences in elevation between a given point location in the landscape and its circular neighbourhoods at a set of different radii. One of the main advantages of this approach is that it allows the integration of several spatial scales (e.g. local and regional) for soil spatial analysis. In this work the ConMap features are derived from a digital elevation model of the area and are used as predictors for spatial modeling of the parameters of the depth functions fitted in the previous step. In this poster we present some preliminary results in which we analyze: (i) the use of different depth functions, (ii) the use of different machine learning approaches for modeling the parameters of the fitted depth functions using the ConMap features, and (iii) the influence of different spatial scales on the variability of the SOC profile distribution. Keywords: 3D modeling, Digital soil mapping, Depth functions, Terrain analysis. Reference: Behrens, T., Schmidt, K., Zhu, A.X., Scholten, T., 2010. The ConMap approach for terrain-based digital soil mapping. European Journal of Soil Science, v. 61, p. 133-143.
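
    The depth-function step can be illustrated with a single profile: fit a simple exponential depth function to SOC observations and evaluate it at arbitrary depths; the fitted parameters are then the quantities that would be mapped spatially from the ConMap terrain features. The exponential form, the profile values, and the parameter names are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# One illustrative profile: SOC (g/kg) measured at horizon mid-depths (cm); the
# exponential form is one common choice of depth function, used here as an example.
depth = np.array([5.0, 20.0, 40.0, 70.0, 100.0])
soc = np.array([28.0, 19.0, 11.0, 6.0, 4.0])

def depth_function(d, c0, k):
    """SOC(d) = c0 * exp(-k * d): concentration decays with depth."""
    return c0 * np.exp(-k * d)

params, _ = curve_fit(depth_function, depth, soc, p0=(soc[0], 0.02))
c0, k = params
print(f"fitted c0 = {c0:.1f} g/kg, k = {k:.4f} 1/cm")

# The fitted parameters summarise the whole profile; SOC can now be evaluated at
# any depth, and the parameters themselves modeled spatially from terrain predictors.
query = np.array([10.0, 30.0, 60.0])
print("predicted SOC (g/kg) at", query, "cm:",
      np.round(depth_function(query, c0, k), 1))
```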

  5. Using Moss to Detect Fine-Scaled Deposition of Heavy Metals in Urban Environments

    NASA Astrophysics Data System (ADS)

    Jovan, S.; Donovan, G.; Demetrios, G.; Monleon, V. J.; Amacher, M. C.

    2017-12-01

    Mosses are commonly used as bio-indicators of heavy metal deposition to forests. Their application in urban airsheds is relatively rare. Our objective was to develop fine-scaled, city-wide maps for heavy metals in Portland, Oregon, to identify pollution "hotspots" and serve as a screening tool for more effective placement of air quality monitoring instruments. In 2013 we measured twenty-two elements in epiphytic moss sampled on a 1 km × 1 km sampling grid (n = 346). We detected large hotspots of cadmium and arsenic in two neighborhoods associated with stained glass manufacturers. Air instruments deployed by local regulators measured cadmium concentrations 49 times and arsenic levels 155 times the state health benchmarks. Moss maps also detected a large nickel hotspot in a neighborhood near a forge where air instruments later measured concentrations 4 times the health benchmark. In response, the facilities implemented new pollution controls, air quality improved in all three affected neighborhoods, revisions of the regulations for stained glass furnace emissions are underway, and Oregon's governor launched an initiative to develop health-based (vs. technology-based) regulations for air toxics in the state. The moss maps also indicated a couple dozen smaller hotspots of heavy metals, including lead, chromium, and cobalt, in Portland neighborhoods. Ongoing follow-up work includes: 1) use of moss sampling by local regulators to investigate the source and extent of the smaller hotspots, 2) use of lead isotopes to determine the origins of higher lead levels observed in moss collected from the inner city, and 3) co-location of air instruments and moss sampling to determine the accuracy, timeframe represented, and seasonality of heavy metals in moss.

  6. Simultaneous topography imaging and broadband nanomechanical mapping on atomic force microscope

    NASA Astrophysics Data System (ADS)

    Li, Tianwei; Zou, Qingze

    2017-12-01

    In this paper, an approach is proposed to achieve simultaneous imaging and broadband nanomechanical mapping of soft materials in air by using an atomic force microscope. Simultaneous imaging and nanomechanical mapping are needed, for example, to correlate the morphological and mechanical evolutions of the sample during dynamic phenomena such as the cell endocytosis process. Current techniques for nanomechanical mapping, however, are only capable of capturing the static elasticity of the material, or the material viscoelasticity in a narrow frequency band around the resonant frequency(ies) of the cantilever used; they are not suited to broadband nanomechanical mapping, nor to acquiring a topography image of the sample simultaneously. These limitations are addressed in this work by augmenting the imaging process with an excitation force stimulus of rich frequency spectrum for nanomechanical mapping. A Kalman-filtering technique is exploited to decouple and split the mixed signals into their imaging and mapping components. The sample indentation generated is then quantified online via a system-inversion method, and the effects of the indentation and the topography tracking error on the topography quantification are taken into account. Moreover, a data-driven feedforward-feedback control is utilized to track the sample topography. The proposed approach is illustrated through experimental implementation on a polydimethylsiloxane sample with a pre-fabricated pattern.

  7. Effect of increasing the sampling interval to 2 seconds on the radiation dose and accuracy of CT perfusion of the head and neck.

    PubMed

    Tawfik, Ahmed M; Razek, Ahmed A; Elhawary, Galal; Batouty, Nihal M

    2014-01-01

    To evaluate the effect of increasing the sampling interval from 1 second (1 image per second) to 2 seconds (1 image every 2 seconds) on computed tomographic (CT) perfusion (CTP) of head and neck tumors. Twenty patients underwent CTP studies of head and neck tumors with images acquired in cine mode for 50 seconds using a sampling interval of 1 second. Using deconvolution-based software, analysis of CTP was done with a sampling interval of 1 second and then 2 seconds. Perfusion maps representing blood flow, blood volume, mean transit time, and permeability surface area product (PS) were obtained. Quantitative tumor CTP values were compared between the 2 sampling intervals. Two blinded radiologists compared the subjective quality of CTP maps between the 2 sampling intervals using a 3-point scale. Radiation dose parameters were recorded for the 2 sampling intervals. No significant differences were observed between the means of the 4 perfusion parameters generated using both sampling intervals; all P > 0.05. The 95% limits of agreement between the 2 sampling intervals were -65.9 to 48.1 mL/min per 100 g for blood flow, -3.6 to 3.1 mL/100 g for blood volume, -2.9 to 3.8 seconds for mean transit time, and -10.0 to 12.5 mL/min per 100 g for PS. There was no significant difference between the subjective quality scores of CTP maps obtained using the 2 sampling intervals; all P > 0.05. Radiation dose was halved when the sampling interval increased from 1 to 2 seconds. Increasing the sampling interval to 1 image every 2 seconds does not compromise image quality and has no significant effect on quantitative perfusion parameters of head and neck tumors. The radiation dose is halved.

  8. Effect of Map-vaccination in ewes on body condition score, weight and Map-shedding.

    PubMed

    Hüttner, Klim; Krämer, Ulla; Kleist, Petra

    2012-01-01

    Vaccination against Mycobacterium avium subspecies paratuberculosis (Map) in sheep receives growing attention worldwide, particularly in countries with national Map control strategies. A field study was conducted investigating the effect of GUDAIR on body condition, weight, and Map-shedding in a professionally managed but largely Map-affected Suffolk flock before and after vaccination. For this, 80 ewes out of 1000 animals were randomly sampled. In the univariate analysis, body condition scores of ewes twelve months after vaccination were significantly improved compared with those sampled prior to vaccination. At the same time, the proportion of ewes shedding Map was reduced by 37%.

  9. Sampling Simulations for Assessing the Accuracy of U.S. Agricultural Crop Mapping from Remotely Sensed Imagery

    NASA Astrophysics Data System (ADS)

    Dwyer, Linnea; Yadav, Kamini; Congalton, Russell G.

    2017-04-01

    Providing adequate food and water for a growing, global population continues to be a major challenge. Mapping and monitoring crops are useful tools for estimating the extent of crop productivity. GFSAD30 (Global Food Security Analysis Data at 30 m) is a NASA-funded program that is producing global cropland maps by using field measurements and remote sensing images. This program studies 8 major crop types and includes information on cropland area/extent, whether crops are irrigated or rainfed, and cropping intensities. Using results from the US and the extensive reference data available from the USDA Crop Data Layer (CDL), we will experiment with various sampling simulations to determine optimal sampling for thematic map accuracy assessment. These simulations will include varying the sampling unit, the sampling strategy, and the sample number. Results of these simulations will allow us to recommend assessment approaches to handle different cropping scenarios.
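
    A minimal version of such a sampling simulation is sketched below: a synthetic crop map and reference layer are compared through repeated stratified random samples of varying size, and the spread of the resulting overall-accuracy estimates is reported. The map, error rate, and sample sizes are assumptions; with unequal class areas the accuracy estimate would additionally need area weighting.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for a crop map and its reference labels (e.g. a CDL-like layer):
# a 1000 x 1000 pixel map with 4 classes; ~12% of pixels have their reference label
# randomly redrawn, so some disagree with the map (assumed error pattern).
n_classes, shape = 4, (1000, 1000)
map_img = rng.integers(0, n_classes, size=shape)
ref_img = map_img.copy()
flip = rng.random(shape) < 0.12
ref_img[flip] = rng.integers(0, n_classes, size=flip.sum())

def stratified_sample_accuracy(map_img, ref_img, n_per_class):
    """Stratified random sample of pixels per map class -> confusion matrix, OA.
    Classes here have roughly equal areas, so the unweighted OA is adequate."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for c in range(n_classes):
        rows, cols = np.nonzero(map_img == c)
        pick = rng.choice(len(rows), size=n_per_class, replace=False)
        for r, k in zip(rows[pick], cols[pick]):
            cm[c, ref_img[r, k]] += 1
    return cm, np.trace(cm) / cm.sum()

true_oa = (map_img == ref_img).mean()
for n in (25, 50, 100, 250):
    oas = [stratified_sample_accuracy(map_img, ref_img, n)[1] for _ in range(200)]
    print(f"n={n:4d} per class: mean OA {np.mean(oas):.3f} "
          f"(sd {np.std(oas):.3f}), true OA {true_oa:.3f}")
```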

  10. Mapping the force field of a hydrogen-bonded assembly

    NASA Astrophysics Data System (ADS)

    Sweetman, A. M.; Jarvis, S. P.; Sang, Hongqian; Lekkas, I.; Rahe, P.; Wang, Yu; Wang, Jianbo; Champness, N. R.; Kantorovich, L.; Moriarty, P.

    2014-05-01

    Hydrogen bonding underpins the properties of a vast array of systems spanning a wide variety of scientific fields. From the elegance of base pair interactions in DNA to the symmetry of extended supramolecular assemblies, hydrogen bonds play an essential role in directing intermolecular forces. Yet fundamental aspects of the hydrogen bond continue to be vigorously debated. Here we use dynamic force microscopy (DFM) to quantitatively map the tip-sample force field for naphthalene tetracarboxylic diimide molecules hydrogen-bonded in two-dimensional assemblies. A comparison of experimental images and force spectra with their simulated counterparts shows that intermolecular contrast arises from repulsive tip-sample interactions whose interpretation can be aided via an examination of charge density depletion across the molecular system. Interpreting DFM images of hydrogen-bonded systems therefore necessitates detailed consideration of the coupled tip-molecule system: analyses based on intermolecular charge density in the absence of the tip fail to capture the essential physical chemistry underpinning the imaging mechanism.

  11. An internal reference model-based PRF temperature mapping method with Cramer-Rao lower bound noise performance analysis.

    PubMed

    Li, Cheng; Pan, Xinyi; Ying, Kui; Zhang, Qiang; An, Jing; Weng, Dehe; Qin, Wen; Li, Kuncheng

    2009-11-01

    The conventional phase difference method for MR thermometry suffers from disturbances caused by the presence of lipid protons, motion-induced error, and field drift. A signal model is presented with a multi-echo gradient echo (GRE) sequence using a fat signal as an internal reference to overcome these problems. The internal reference signal model is fit to the water and fat signals by the extended Prony algorithm and the Levenberg-Marquardt algorithm to estimate the chemical shifts between water and fat, which contain temperature information. A noise analysis of the signal model was conducted using the Cramer-Rao lower bound to evaluate the noise performance of various algorithms, the effects of imaging parameters, and the influence of the water:fat signal ratio in a sample on the temperature estimate. Comparison of the calculated temperature map and thermocouple temperature measurements shows that the maximum temperature estimation error is 0.614 degrees C, with a standard deviation of 0.06 degrees C, confirming the feasibility of this model-based temperature mapping method. The influence of the sample water:fat signal ratio on the accuracy of the temperature estimate is evaluated in a water-fat mixed phantom experiment, with an optimal ratio of approximately 0.66:1. © 2009 Wiley-Liss, Inc.

  12. Genome-wide base-resolution mapping of DNA methylation in single cells using single-cell bisulfite sequencing (scBS-seq).

    PubMed

    Clark, Stephen J; Smallwood, Sébastien A; Lee, Heather J; Krueger, Felix; Reik, Wolf; Kelsey, Gavin

    2017-03-01

    DNA methylation (DNAme) is an important epigenetic mark in diverse species. Our current understanding of DNAme is based on measurements from bulk cell samples, which obscures intercellular differences and prevents analyses of rare cell types. Thus, the ability to measure DNAme in single cells has the potential to make important contributions to the understanding of several key biological processes, such as embryonic development, disease progression and aging. We have recently reported a method for generating genome-wide DNAme maps from single cells, using single-cell bisulfite sequencing (scBS-seq), allowing the quantitative measurement of DNAme at up to 50% of CpG dinucleotides throughout the mouse genome. Here we present a detailed protocol for scBS-seq that includes our most recent developments to optimize recovery of CpGs, mapping efficiency and success rate; reduce hands-on time; and increase sample throughput with the option of using an automated liquid handler. We provide step-by-step instructions for each stage of the method, comprising cell lysis and bisulfite (BS) conversion, preamplification and adaptor tagging, library amplification, sequencing and, lastly, alignment and methylation calling. An individual with relevant molecular biology expertise can complete library preparation within 3 d. Subsequent computational steps require 1-3 d for someone with bioinformatics expertise.

  13. Estimating cross-validatory predictive p-values with integrated importance sampling for disease mapping models.

    PubMed

    Li, Longhai; Feng, Cindy X; Qiu, Shi

    2017-06-30

    An important statistical task in disease mapping problems is to identify divergent regions with unusually high or low risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is the gold standard for estimating predictive p-values that can flag such divergent regions. However, actual LOOCV is time-consuming because one needs to rerun a Markov chain Monte Carlo analysis for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called integrated importance sampling (iIS), for estimating LOOCV predictive p-values with only Markov chain samples drawn from the posterior based on the full data set. The key step in iIS is that we integrate away the latent variables associated with the test observation with respect to their conditional distribution without reference to the actual observation. By following the general theory for importance sampling, the formula used by iIS can be proved to be equivalent to the LOOCV predictive p-value. We compare iIS with three other existing methods in the literature using two disease mapping datasets. Our empirical results show that the predictive p-values estimated with iIS are almost identical to the predictive p-values estimated with actual LOOCV and outperform those given by the three existing methods, namely posterior predictive checking, ordinary importance sampling, and the ghosting method of Marshall and Spiegelhalter (2003). Copyright © 2017 John Wiley & Sons, Ltd.
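
    For orientation, the sketch below implements the ordinary importance-sampling baseline that the paper compares against (not iIS itself, which additionally integrates out the latent variables of the test observation) in a deliberately simple conjugate Poisson-Gamma toy model, and checks it against an analytic leave-one-out posterior; the data and prior are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Toy disease-mapping-like data: Poisson counts with a conjugate Gamma(a, b) prior on
# a common rate (a deliberately simple stand-in; the paper works with hierarchical
# spatial models and Markov chain Monte Carlo output).
y = rng.poisson(5.0, size=50)
y[0] = 14                                   # a suspiciously high count to flag
a, b, n = 2.0, 0.5, len(y)

# "Posterior samples" from the full-data posterior (analytic here, MCMC in practice).
S = 20000
lam = rng.gamma(a + y.sum(), 1.0 / (b + n), size=S)

# Ordinary importance sampling: weight each posterior draw by 1 / p(y_i | lambda_s)
# and estimate the LOOCV predictive p-value P(y_rep <= y_i | y_{-i}).
i = 0
w = 1.0 / stats.poisson.pmf(y[i], lam)
w /= w.sum()
p_is = np.sum(w * stats.poisson.cdf(y[i], lam))

# Actual LOOCV for comparison (available analytically in this conjugate toy model).
lam_loo = rng.gamma(a + y.sum() - y[i], 1.0 / (b + n - 1), size=S)
p_loo = stats.poisson.cdf(y[i], lam_loo).mean()

print(f"importance-sampling p-value: {p_is:.3f}, actual LOOCV p-value: {p_loo:.3f}")
```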

  14. Neighborhood size of training data influences soil map disaggregation

    USDA-ARS?s Scientific Manuscript database

    Soil class mapping relies on the ability of sample locations to represent portions of the landscape with similar soil types; however, most digital soil mapping (DSM) approaches intersect sample locations with one raster pixel per covariate layer regardless of pixel size. This approach does not take ...

  15. Detecting and Quantifying Topography in Neural Maps

    PubMed Central

    Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Seriès, Peggy

    2014-01-01

    Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279
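
    One of the simpler measures, the Pearson distance correlation, together with a permutation test for statistical significance, can be sketched as follows; the simulated map, noise level, and permutation count are illustrative assumptions rather than the authors' data or exact procedure.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr

rng = np.random.default_rng(9)

# Synthetic "map": recording sites along a 1-D cortical axis with a weakly ordered
# preferred azimuth plus noise and sparse sampling (illustrative stand-in data).
n_sites = 40
position = np.sort(rng.uniform(0.0, 4.0, n_sites))[:, None]         # mm
preferred = 20.0 * position[:, 0] + rng.normal(0.0, 15.0, n_sites)   # degrees azimuth

def distance_correlation_stat(pos, pref):
    """Pearson correlation between pairwise cortical and stimulus-space distances."""
    return pearsonr(pdist(pos), pdist(pref[:, None]))[0]

observed = distance_correlation_stat(position, preferred)

# Permutation test: shuffling selectivities across sites destroys any topography,
# giving the null distribution of the statistic.
n_perm = 2000
null = np.array([distance_correlation_stat(position, rng.permutation(preferred))
                 for _ in range(n_perm)])
p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
print(f"distance correlation = {observed:.2f}, permutation p = {p_value:.4f}")
```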

  16. Geochemical map of the Rattlesnake Roadless Area, Coconino and Yavapai counties, Arizona

    USGS Publications Warehouse

    Gerstel, W.J.

    1985-01-01

    The geochemical survey of the Rattlesnake Roadless Area was conducted in May 1982 by the U.S. Geological Survey to aid in a mineral resource appraisal of the area. A total of 114 stream-sediment samples, 68 heavy-mineral concentrates from stream sediment, 20 rock samples, and 4 water samples were collected by S.C. Rose, D.E. Hendzel, and W.J. Gerstel, with helicopter support from Jack Ruby, pilot for Helicopters Unlimited. All sample localities are plotted on the map; sample localities showing anomalous barium and lead are also indicated on the map.

  17. Group-regularized individual prediction: theory and application to pain.

    PubMed

    Lindquist, Martin A; Krishnan, Anjali; López-Solà, Marina; Jepma, Marieke; Woo, Choong-Wan; Koban, Leonie; Roy, Mathieu; Atlas, Lauren Y; Schmidt, Liane; Chang, Luke J; Reynolds Losin, Elizabeth A; Eisenbarth, Hedwig; Ashar, Yoni K; Delk, Elizabeth; Wager, Tor D

    2017-01-15

    Multivariate pattern analysis (MVPA) has become an important tool for identifying brain representations of psychological processes and clinical outcomes using fMRI and related methods. Such methods can be used to predict or 'decode' psychological states in individual subjects. Single-subject MVPA approaches, however, are limited by the amount and quality of individual-subject data. In spite of higher spatial resolution, predictive accuracy from single-subject data often does not exceed what can be accomplished using coarser, group-level maps, because single-subject patterns are trained on limited amounts of often-noisy data. Here, we present a method that combines population-level priors, in the form of biomarker patterns developed on prior samples, with single-subject MVPA maps to improve single-subject prediction. Theoretical results and simulations motivate a weighting based on the relative variances of biomarker-based prediction (based on population-level predictive maps from prior groups) and individual-subject, cross-validated prediction. Empirical results predicting pain using brain activity on a trial-by-trial basis (single-trial prediction) across 6 studies (N=180 participants) confirm the theoretical predictions. Regularization based on a population-level biomarker (in this case, the Neurologic Pain Signature, NPS) improved single-subject prediction accuracy compared with idiographic maps based on the individuals' data alone. The regularization scheme that we propose, which we term group-regularized individual prediction (GRIP), can be applied broadly to within-person MVPA-based prediction. We also show how GRIP can be used to evaluate data quality and provide benchmarks for the appropriateness of population-level maps like the NPS for a given individual or study. Copyright © 2015 Elsevier Inc. All rights reserved.
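
    One plain reading of variance-based weighting is an inverse-variance (precision-weighted) combination of the population-level and individual predictions, sketched below on simulated single-trial data; the simulated ratings, noise levels, and calibration split are assumptions, and the published GRIP estimator is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(10)

# Simulated single-trial pain ratings for one subject, plus two predictors of them:
# a population-level biomarker prediction (stable but generic) and an idiographic,
# cross-validated prediction (personalised but noisy). Values are illustrative.
n_trials = 60
true_pain = rng.uniform(1.0, 9.0, n_trials)
pop_pred = true_pain + rng.normal(0.8, 1.0, n_trials)     # small bias, moderate noise
ind_pred = true_pain + rng.normal(0.0, 2.0, n_trials)     # unbiased but noisier

# Estimate each predictor's error variance from held-out trials (first half), then
# combine on the remaining trials with inverse-variance (precision) weights -- one
# simple reading of "weighting by relative variances", not the published estimator.
cal, test = slice(0, 30), slice(30, None)
var_pop = np.var(pop_pred[cal] - true_pain[cal])
var_ind = np.var(ind_pred[cal] - true_pain[cal])
w_ind = (1.0 / var_ind) / (1.0 / var_ind + 1.0 / var_pop)
combined = w_ind * ind_pred[test] + (1.0 - w_ind) * pop_pred[test]

def rmse(pred, truth):
    return np.sqrt(np.mean((pred - truth) ** 2))

print(f"weight on individual map: {w_ind:.2f}")
for name, pred in [("population", pop_pred[test]), ("individual", ind_pred[test]),
                   ("combined", combined)]:
    print(f"{name:>10s} RMSE: {rmse(pred, true_pain[test]):.2f}")
```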

  18. Development of a transformation model to derive general population-based utility: Mapping the pruritus-visual analog scale (VAS) to the EQ-5D utility.

    PubMed

    Park, Sun-Young; Park, Eun-Ja; Suh, Hae Sun; Ha, Dongmun; Lee, Eui-Kyung

    2017-08-01

    Although nonpreference-based disease-specific measures are widely used in clinical studies, they cannot generate utilities for economic evaluation. A solution to this problem is to estimate utilities from disease-specific instruments using the mapping function. This study aimed to develop a transformation model for mapping the pruritus-visual analog scale (VAS) to the EuroQol 5-Dimension 3-Level (EQ-5D-3L) utility index in pruritus. A cross-sectional survey was conducted with a sample (n = 268) drawn from the general population of South Korea. Data were randomly divided into 2 groups, one for estimating and the other for validating mapping models. To select the best model, we developed and compared 3 separate models using demographic information and the pruritus-VAS as independent variables. The predictive performance was assessed using the mean absolute deviation and root mean square error in a separate dataset. Among the 3 models, model 2 using age, age squared, sex, and the pruritus-VAS as independent variables had the best performance based on the goodness of fit and model simplicity, with a log likelihood of 187.13. The 3 models had similar precision errors based on mean absolute deviation and root mean square error in the validation dataset. No statistically significant difference was observed between the mean observed and predicted values in all models. In conclusion, model 2 was chosen as the preferred mapping model. Outcomes measured as the pruritus-VAS can be transformed into the EQ-5D-3L utility index using this mapping model, which makes an economic evaluation possible when only pruritus-VAS data are available. © 2017 John Wiley & Sons, Ltd.
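
    The structure of such a mapping model is easy to sketch: an ordinary least-squares fit of EQ-5D utility on age, age squared, sex, and the pruritus-VAS, estimated on one half of the data and checked on the other with MAD and RMSE. The simulated data and coefficients below are assumptions; only the model form follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulated survey data standing in for the study sample: age, sex, pruritus-VAS
# (0-10) and EQ-5D-3L utility (the generating coefficients are made up).
n = 268
age = rng.uniform(20, 80, n)
sex = rng.integers(0, 2, n)                 # 1 = female (assumption)
vas = rng.uniform(0, 10, n)
eq5d = np.clip(0.95 - 0.03 * vas - 0.001 * (age - 50) + 0.02 * sex
               + rng.normal(0, 0.08, n), -0.2, 1.0)

# Mapping model 2 of the abstract: EQ-5D ~ age + age^2 + sex + VAS, fitted by OLS on
# an estimation half and checked on a validation half.
X = np.column_stack([np.ones(n), age, age ** 2, sex, vas])
est, val = slice(0, n // 2), slice(n // 2, None)
beta, *_ = np.linalg.lstsq(X[est], eq5d[est], rcond=None)

pred = X[val] @ beta
mad = np.mean(np.abs(pred - eq5d[val]))
rmse = np.sqrt(np.mean((pred - eq5d[val]) ** 2))
print("coefficients [const, age, age^2, sex, VAS]:", np.round(beta, 4))
print(f"validation MAD = {mad:.3f}, RMSE = {rmse:.3f}")
```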

  19. Non-Destructive Characterization of Engineering Materials Using High-Energy X-rays at the Advanced Photon Source

    DOE PAGES

    Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika; ...

    2017-05-30

    High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.

  20. Non-Destructive Characterization of Engineering Materials Using High-Energy X-rays at the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika

    High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.

  1. Spatial Prediction of Soil Classes by Using Soil Weathering Parameters Derived from vis-NIR Spectroscopy

    NASA Astrophysics Data System (ADS)

    Ramirez-Lopez, Leonardo; Alexandre Dematte, Jose

    2010-05-01

    There is consensus in the scientific community about the great need for spatial soil information. Conventional mapping methods are time consuming and involve high costs. Digital soil mapping has emerged as an area in which soil mapping is optimized by the application of mathematical and statistical approaches, as well as the application of expert knowledge in pedology. In this sense, the objective of the study was to develop a methodology for the spatial prediction of soil classes by simultaneously using soil spectroscopy methodologies related to fieldwork, spectral data from a satellite image, and terrain attributes. The studied area is located in São Paulo State and comprises 473 ha, which was covered by a regular grid (100 x 100 m). Soil samples were collected at each grid node at two depths (layers A and B). A total of 206 samples from transect sections were submitted to soil analysis (clay, Al2O3, Fe2O3, SiO2, TiO2, and weathering index). The first analog soil class map (ASC-N) contains only soil information from orders to subgroups of the USDA Soil Taxonomy System. The second map (ASC-H) contains additional information related to soil attributes such as color, ferric levels, and base sum. For the elaboration of the digital soil maps, the data were divided into three groups: i) predicted soil attributes of layer B (related to soil weathering), which were obtained by using a local soil spectral library; ii) spectral band data extracted from a Landsat image; and iii) terrain parameters. This information was summarized by a principal component analysis (PCA) within each group. Digital soil maps were generated by supervised classification using a maximum likelihood method. The training information for this classification was extracted from five toposequences based on the analog soil class maps. The spectral models of the weathering-related soil attributes showed high predictive performance with low error (R2 of 0.71 to 0.90). The spatial prediction of these attributes also showed high performance (validations with R2 > 0.78). These models allowed the spatial resolution of soil weathering information to be increased. On the other hand, the comparison between the analog and digital soil maps showed a global accuracy of 69% for the ASC-N map and 62% for the ASC-H map, with kappa indices of 0.52 and 0.45, respectively.
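
    The final classification step, a Gaussian maximum-likelihood classifier applied to PCA-summarized predictors, can be sketched as below; the synthetic principal-component scores, class means, and equal-prior assumption are illustrative stand-ins for the study's predictors and training toposequences.

```python
import numpy as np

rng = np.random.default_rng(13)

# Stand-in predictor stack: a few principal-component scores per pixel (the study
# summarises spectrally predicted soil attributes, Landsat bands and terrain
# parameters with a PCA per group); three hypothetical soil classes for training.
n_train_per_class, n_features, classes = 80, 4, [0, 1, 2]
means = np.array([[0.0, 0.0, 0.0, 0.0],
                  [1.5, -0.5, 0.8, 0.0],
                  [-1.0, 1.2, 0.0, 0.6]])
X_train = np.vstack([rng.normal(m, 0.7, size=(n_train_per_class, n_features))
                     for m in means])
y_train = np.repeat(classes, n_train_per_class)

# Maximum-likelihood (Gaussian) classifier: fit a mean vector and covariance matrix
# per class from the training areas, then assign each pixel to the class with the
# highest Gaussian log-likelihood (equal priors assumed).
class_stats = []
for c in classes:
    Xc = X_train[y_train == c]
    mu, cov = Xc.mean(axis=0), np.cov(Xc, rowvar=False)
    class_stats.append((mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1]))

def classify(X):
    scores = []
    for mu, cov_inv, logdet in class_stats:
        d = X - mu
        scores.append(-0.5 * (np.einsum("ij,jk,ik->i", d, cov_inv, d) + logdet))
    return np.argmax(np.column_stack(scores), axis=1)

X_pixels = rng.normal(means[1], 0.7, size=(1000, n_features))   # pixels from class 1
pred = classify(X_pixels)
print("fraction mapped to each class:", np.bincount(pred, minlength=3) / len(pred))
```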

  2. Comparison of rapid diagnostic tests to detect Mycobacterium avium subsp. paratuberculosis disseminated infection in bovine liver.

    PubMed

    Zarei, Mehdi; Ghorbanpour, Masoud; Tajbakhsh, Samaneh; Mosavari, Nader

    2017-08-01

    Mycobacterium avium subsp. paratuberculosis (MAP) causes Johne's disease, a chronic enteritis in cattle and other domestic and wild ruminants. The presence of MAP in tissues other than the intestines and associated lymph nodes, such as meat and liver, is a potential public health concern. In the present study, the relationship between the results of rapid diagnostic tests for Johne's disease, such as serum ELISA, rectal scraping PCR, and acid-fast staining, and the presence of MAP in liver was evaluated. Blood, liver, and rectal scraping samples were collected from 200 slaughtered cattle with unknown Johne's disease status. ELISA was performed to determine the MAP antibody activity in the serum. Acid-fast staining was performed on rectal scraping samples, and PCR was performed on rectal scraping and liver samples. PCR-positive liver samples were used for mycobacterial culture. Overall, the results of this study demonstrated that MAP can be detected and cultured from the liver of slaughtered cattle and that rapid diagnostic tests for Johne's disease have limited value in detecting cattle with MAP infection in the liver. These findings show that the presence of MAP in liver tissue may occur in cows with negative results for rapid diagnostic tests and vice versa. Hence, liver might represent another possible risk of human exposure to MAP. Given concerns about a potential zoonotic role for MAP, these results show the necessity of finding new methods for detecting cattle with MAP disseminated infection.

  3. Culture-Independent Identification of Mycobacterium avium Subspecies paratuberculosis in Ovine Tissues: Comparison with Bacterial Culture and Histopathological Lesions

    PubMed Central

    Acharya, Kamal R.; Dhand, Navneet K.; Whittington, Richard J.; Plain, Karren M.

    2017-01-01

    Johne’s disease is a chronic debilitating enteropathy of ruminants caused by Mycobacterium avium subspecies paratuberculosis (MAP). Current abattoir surveillance programs detect disease via examination of gross lesions and confirmation by histopathology and/or tissue culture, which is time-consuming and has relatively low sensitivity. This study aimed to investigate whether a high-throughput quantitative PCR (qPCR) test is a viable alternative for tissue testing. Intestine and mesenteric lymph nodes were sourced from sheep experimentally infected with MAP and the DNA extracted using a protocol developed for tissues, comprising enzymatic digestion of the tissue homogenate, chemical and mechanical lysis, and magnetic bead-based DNA purification. The extracted DNA was tested by adapting a previously validated qPCR for fecal samples, and the results were compared with culture and histopathology results of the corresponding tissues. The MAP tissue qPCR confirmed infection in the majority of sheep with gross lesions on postmortem (37/38). Likewise, almost all tissue culture (61/64) or histopathology (52/58) positives were detected, with good to moderate agreement (Cohen’s kappa statistic) and no significant difference from the reference tests (McNemar’s Chi-square test). Higher MAP DNA quantities corresponded to animals with more severe histopathology (odds ratio: 1.82; 95% confidence interval: 1.60, 2.07). Culture-independent strain typing on tissue DNA was successfully performed. This MAP tissue qPCR method had a sensitivity equivalent to the reference tests and is thus a viable replacement for gross and histopathological examination of tissue samples in abattoirs. In addition, the test could be validated for testing tissue samples intended for human consumption. PMID:29312970

  4. Culture-Independent Identification of Mycobacterium avium Subspecies paratuberculosis in Ovine Tissues: Comparison with Bacterial Culture and Histopathological Lesions.

    PubMed

    Acharya, Kamal R; Dhand, Navneet K; Whittington, Richard J; Plain, Karren M

    2017-01-01

    Johne's disease is a chronic debilitating enteropathy of ruminants caused by Mycobacterium avium subspecies paratuberculosis (MAP). Current abattoir surveillance programs detect disease via examination of gross lesions and confirmation by histopathology and/or tissue culture, which is time-consuming and has relatively low sensitivity. This study aimed to investigate whether a high-throughput quantitative PCR (qPCR) test is a viable alternative for tissue testing. Intestine and mesenteric lymph nodes were sourced from sheep experimentally infected with MAP and the DNA extracted using a protocol developed for tissues, comprising enzymatic digestion of the tissue homogenate, chemical and mechanical lysis, and magnetic bead-based DNA purification. The extracted DNA was tested by adapting a previously validated qPCR for fecal samples, and the results were compared with culture and histopathology results of the corresponding tissues. The MAP tissue qPCR confirmed infection in the majority of sheep with gross lesions on postmortem (37/38). Likewise, almost all tissue culture (61/64) or histopathology (52/58) positives were detected, with good to moderate agreement (Cohen's kappa statistic) and no significant difference from the reference tests (McNemar's Chi-square test). Higher MAP DNA quantities corresponded to animals with more severe histopathology (odds ratio: 1.82; 95% confidence interval: 1.60, 2.07). Culture-independent strain typing on tissue DNA was successfully performed. This MAP tissue qPCR method had a sensitivity equivalent to the reference tests and is thus a viable replacement for gross and histopathological examination of tissue samples in abattoirs. In addition, the test could be validated for testing tissue samples intended for human consumption.

  5. How serious a problem is subsoil compaction in the Netherlands? A survey based on probability sampling

    NASA Astrophysics Data System (ADS)

    Brus, Dick J.; van den Akker, Jan J. H.

    2018-02-01

    Although soil compaction is widely recognized as a threat to soil resources, reliable estimates of the acreage of overcompacted soil and of the level of soil compaction parameters are not available. In the Netherlands, data on subsoil compaction were collected at 128 locations selected by stratified random sampling. A map showing the risk of subsoil compaction in five classes was used for stratification. Measurements of bulk density, porosity, clay content and organic matter content were used to compute the relative bulk density and relative porosity, both expressed as a fraction of a threshold value. A subsoil was classified as overcompacted if either the relative bulk density exceeded 1 or the relative porosity was below 1. The sample data were used to estimate the means of the two subsoil compaction parameters and the overcompacted areal fraction. The estimated global means of relative bulk density and relative porosity were 0.946 and 1.090, respectively. The estimated areal fraction of the Netherlands with overcompacted subsoils was 43 %. The estimates per risk map unit showed two groups of map units: a low-risk group (units 1 and 2, covering only 4.6 % of the total area) and a high-risk group (units 3, 4 and 5). The estimated areal fraction of overcompacted subsoil was 0 % in the low-risk group and 47 % in the high-risk group. The map contains no information about where overcompacted subsoils occur. This was caused by the poor association of the risk map units 3, 4 and 5 with the subsoil compaction parameters and subsoil overcompaction. This can be explained by the lack of time for recuperation.
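
    To make the classification and estimation logic above concrete, the following minimal Python sketch flags a subsoil as overcompacted when its relative bulk density exceeds 1 or its relative porosity falls below 1, and forms an area-weighted (stratified) estimate of the overcompacted fraction. The column names, risk-class areas and values are hypothetical illustrations, not the study's data.

        import pandas as pd

        def overcompacted(row):
            # Overcompacted if relative bulk density > 1 OR relative porosity < 1.
            return (row["rel_bulk_density"] > 1.0) or (row["rel_porosity"] < 1.0)

        def stratified_areal_fraction(samples: pd.DataFrame, stratum_area: dict) -> float:
            """Area-weighted estimate of the overcompacted fraction from a stratified sample."""
            total_area = sum(stratum_area.values())
            estimate = 0.0
            for stratum, grp in samples.groupby("risk_class"):
                p_h = grp.apply(overcompacted, axis=1).mean()      # within-stratum fraction
                estimate += (stratum_area[stratum] / total_area) * p_h
            return estimate

        data = pd.DataFrame({                                      # made-up sample data
            "risk_class": [1, 1, 3, 3, 5, 5],
            "rel_bulk_density": [0.91, 0.95, 1.02, 0.99, 1.05, 0.97],
            "rel_porosity": [1.10, 1.05, 0.96, 1.01, 0.93, 1.02],
        })
        areas = {1: 500.0, 3: 3000.0, 5: 1500.0}                   # hypothetical stratum areas (km2)
        print(round(stratified_areal_fraction(data, areas), 3))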

  6. Development of a national geodatabase (Greece) for soil surveys and land evaluation using space technology and GIS

    NASA Astrophysics Data System (ADS)

    Bilas, George; Dionysiou, Nina; Karapetsas, Nikolaos; Silleos, Nikolaos; Kosmas, Konstantinos; Misopollinos, Nikolaos

    2016-04-01

    This project was funded by OPEKEPE, Ministry of Agricultural Development and Food, Greece and involves development of a national geodatabase and a WebGIS that encompass soil data of all the agricultural areas of Greece in order to supply the country with a multi-purpose master plan for agricultural land management. The area mapped covered more than 385,000 ha divided into more than 9,000 Soil Mapping Units (SMUs) based on physiographic analysis, field work and photo interpretation of satellite images. The field work included description and sampling at three depths (0-30, 30-60 and >60 cm) of 2,000 soil profiles and 8,000 augers (sampling 0-30 and >30 cm). In total, more than 22,000 soil samples were collected and analyzed for determining main soil properties associated with soil classification and soil evaluation. Additionally, the project included (1) integration of all data in the Soil Geodatabase, (2) finalization of SMUs, (3) development of a Master Plan for Agricultural Land Management and (4) development and operational testing of the Web Portal for e-information and e-services. The integrated system is expected, after being fully operational, to provide important electronic services and benefits to farmers, the private sector and governmental organizations. An e-book with the soil maps of Greece was also provided, including 570 sheets with data description and legends. The Master Plan for Agricultural Land Management includes soil quality maps for 30 agricultural crops, together with maps showing soil degradation risks, such as erosion, desertification, salinity and nitrates, thus providing the tools for soil conservation and sustainable land management.

  7. Stable Estimation of a Covariance Matrix Guided by Nuclear Norm Penalties

    PubMed Central

    Chi, Eric C.; Lange, Kenneth

    2014-01-01

    Estimation of a covariance matrix or its inverse plays a central role in many statistical methods. For these methods to work reliably, estimated matrices must not only be invertible but also well-conditioned. The current paper introduces a novel prior to ensure a well-conditioned maximum a posteriori (MAP) covariance estimate. The prior shrinks the sample covariance estimator towards a stable target and leads to a MAP estimator that is consistent and asymptotically efficient. Thus, the MAP estimator gracefully transitions towards the sample covariance matrix as the number of samples grows relative to the number of covariates. The utility of the MAP estimator is demonstrated in two standard applications – discriminant analysis and EM clustering – in this sampling regime. PMID:25143662
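
    The sketch below illustrates only the general idea of pulling an ill-conditioned sample covariance toward a well-conditioned target; it uses simple linear shrinkage toward a scaled identity, not the nuclear-norm MAP estimator described in the abstract, and the shrinkage weight alpha is an arbitrary illustrative choice.

        import numpy as np

        def shrink_covariance(X: np.ndarray, alpha: float = 0.2) -> np.ndarray:
            """X: n_samples x p data matrix; alpha in [0, 1] weights the shrinkage target."""
            S = np.cov(X, rowvar=False)                              # sample covariance (p x p)
            target = (np.trace(S) / S.shape[0]) * np.eye(S.shape[0]) # scaled identity target
            return (1.0 - alpha) * S + alpha * target                # convex combination

        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 50))                                # fewer samples than covariates
        print(np.linalg.cond(np.cov(X, rowvar=False)))               # huge: sample covariance is singular
        print(np.linalg.cond(shrink_covariance(X, alpha=0.3)))       # finite and moderate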

  8. Completeness and Reliability of Location Data Collected on the Web: Assessing the Quality of Self-Reported Locations in an Internet Sample of Men Who Have Sex With Men.

    PubMed

    Vaughan, Adam S; Kramer, Michael R; Cooper, Hannah Lf; Rosenberg, Eli S; Sullivan, Patrick S

    2016-06-09

    Place is critical to our understanding of human immunodeficiency virus (HIV) infections among men who have sex with men (MSM) in the United States. However, within the scientific literature, place is almost always represented by residential location, suggesting a fundamental assumption of equivalency between neighborhood of residence, place of risk, and place of prevention. However, the locations of behaviors among MSM show significant spatial variation, and theory has posited the importance of nonresidential contextual exposures. This focus on residential locations has been at least partially necessitated by the difficulties in collecting detailed geolocated data required to explore nonresidential locations. Using a Web-based map tool to collect locations, which may be relevant to the daily lives and health behaviors of MSM, this study examines the completeness and reliability of the collected data. MSM were recruited on the Web and completed a Web-based survey. Within this survey, men used a map tool embedded within a question to indicate their homes and multiple nonresidential locations, including those representing work, sex, socialization, physician, and others. We assessed data quality by examining data completeness and reliability. We used logistic regression to identify demographic, contextual, and location-specific predictors of answering all eligible map questions and answering specific map questions. We assessed data reliability by comparing selected locations with other participant-reported data. Of 247 men completing the survey, 167 (67.6%) answered the entire set of eligible map questions. Most participants (>80%) answered specific map questions, with sex locations being the least reported (80.6%). Participants with no college education were less likely than those with a college education to answer all map questions (prevalence ratio, 0.4; 95% CI, 0.2-0.8). Participants who reported sex at their partner's home were less likely to indicate the location of that sex (prevalence ratio, 0.8; 95% CI, 0.7-1.0). Overall, 83% of participants placed their home's location within the boundaries of their reported residential ZIP code. Of locations having a specific text description, the median distance between the participant-selected location and the location determined using the specific text description was 0.29 miles (25th and 75th percentiles, 0.06-0.88). Using this Web-based map tool, this Web-based sample of MSM was generally willing and able to provide accurate data regarding both home and nonresidential locations. This tool provides a mechanism to collect data that can be used in more nuanced studies of place and sexual risk and preventive behaviors of MSM.
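
    One ingredient of the reliability assessment above, the distance between a participant-selected map point and a geocoded text description, can be computed as a great-circle distance in miles. The haversine sketch below is a generic illustration with made-up coordinates, not the study's code.

        from math import radians, sin, cos, asin, sqrt

        def haversine_miles(lat1, lon1, lat2, lon2):
            """Great-circle distance between two (lat, lon) points, in miles."""
            lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
            a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
            return 2 * 3958.8 * asin(sqrt(a))        # mean Earth radius of about 3958.8 miles

        # Selected map point vs. geocoded description (hypothetical coordinates)
        print(round(haversine_miles(33.7490, -84.3880, 33.7525, -84.3915), 2))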

  9. Puzzle Imaging: Using Large-Scale Dimensionality Reduction Algorithms for Localization.

    PubMed

    Glaser, Joshua I; Zamft, Bradley M; Church, George M; Kording, Konrad P

    2015-01-01

    Current high-resolution imaging techniques require an intact sample that preserves spatial relationships. We here present a novel approach, "puzzle imaging," that allows imaging a spatially scrambled sample. This technique takes many spatially disordered samples, and then pieces them back together using local properties embedded within the sample. We show that puzzle imaging can efficiently produce high-resolution images using dimensionality reduction algorithms. We demonstrate the theoretical capabilities of puzzle imaging in three biological scenarios, showing that (1) relatively precise 3-dimensional brain imaging is possible; (2) the physical structure of a neural network can often be recovered based only on the neural connectivity matrix; and (3) a chemical map could be reproduced using bacteria with chemosensitive DNA and conjugative transfer. The ability to reconstruct scrambled images promises to enable imaging based on DNA sequencing of homogenized tissue samples.

  10. An evaluation of potential sampling locations in a reservoir with emphasis on conserved spatial correlation structure.

    PubMed

    Yenilmez, Firdes; Düzgün, Sebnem; Aksoy, Aysegül

    2015-01-01

    In this study, kernel density estimation (KDE) was coupled with ordinary two-dimensional kriging (OK) to reduce the number of sampling locations in measurement and kriging of dissolved oxygen (DO) concentrations in Porsuk Dam Reservoir (PDR). Conservation of the spatial correlation structure in the DO distribution was a target. KDE was used as a tool to aid in identification of the sampling locations that would be removed from the sampling network in order to decrease the total number of samples. Accordingly, several networks were generated in which sampling locations were reduced from 65 to 10 in increments of 4 or 5 points at a time based on kernel density maps. DO variograms were constructed, and DO values in PDR were kriged. Performance of the networks in DO estimation was evaluated through various error metrics, standard error maps (SEM), and whether the spatial correlation structure was conserved or not. Results indicated that a smaller number of sampling points resulted in loss of information regarding the spatial correlation structure in DO. The minimum number of representative sampling points for PDR was 35. Efficacy of the sampling location selection method was tested against the networks generated by experts. It was shown that the evaluation approach proposed in this study provided a better sampling network design in which the spatial correlation structure of DO was sustained for kriging.
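
    A minimal sketch of how a kernel density estimate over the monitoring network could be used to flag stations in the densest, most redundant areas as removal candidates before re-kriging. The coordinates are synthetic and the "remove the densest stations" rule is an assumption for illustration, not the authors' exact procedure.

        import numpy as np
        from scipy.stats import gaussian_kde

        def candidates_for_removal(xy: np.ndarray, n_remove: int) -> np.ndarray:
            """xy: (n, 2) station coordinates; returns indices of the n_remove densest stations."""
            kde = gaussian_kde(xy.T)                    # kernel density over the network
            density = kde(xy.T)                         # density evaluated at each station
            return np.argsort(density)[-n_remove:]      # densest stations are the most redundant

        rng = np.random.default_rng(1)
        stations = rng.uniform(0.0, 10.0, size=(65, 2)) # 65 hypothetical sampling locations
        print(sorted(candidates_for_removal(stations, n_remove=5).tolist()))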

  11. Automated Phase Segmentation for Large-Scale X-ray Diffraction Data Using a Graph-Based Phase Segmentation (GPhase) Algorithm.

    PubMed

    Xiong, Zheng; He, Yinyan; Hattrick-Simpers, Jason R; Hu, Jianjun

    2017-03-13

    The creation of composition-processing-structure relationships currently represents a key bottleneck for data analysis for high-throughput experimental (HTE) material studies. Here we propose an automated phase diagram attribution algorithm for HTE data analysis that uses a graph-based segmentation algorithm and Delaunay tessellation to create a crystal phase diagram from high throughput libraries of X-ray diffraction (XRD) patterns. We also propose the sample-pair based objective evaluation measures for the phase diagram prediction problem. Our approach was validated using 278 diffraction patterns from a Fe-Ga-Pd composition spread sample with a prediction precision of 0.934 and a Matthews Correlation Coefficient score of 0.823. The algorithm was then applied to the open Ni-Mn-Al thin-film composition spread sample to obtain the first predicted phase diagram mapping for that sample.
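
    The sketch below shows one plausible reading of a sample-pair-based evaluation (the authors' exact definition may differ): every pair of diffraction samples counts as a positive when a labelling places both samples in the same phase region, and precision and the Matthews Correlation Coefficient are computed over all pairs.

        from itertools import combinations
        from math import sqrt

        def pairwise_precision_mcc(true_labels, pred_labels):
            tp = fp = fn = tn = 0
            for i, j in combinations(range(len(true_labels)), 2):
                same_true = true_labels[i] == true_labels[j]
                same_pred = pred_labels[i] == pred_labels[j]
                if same_pred and same_true:
                    tp += 1
                elif same_pred:
                    fp += 1
                elif same_true:
                    fn += 1
                else:
                    tn += 1
            precision = tp / (tp + fp) if (tp + fp) else 0.0
            denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
            mcc = (tp * tn - fp * fn) / denom if denom else 0.0
            return precision, mcc

        print(pairwise_precision_mcc([0, 0, 1, 1, 2], [0, 0, 1, 2, 2]))  # toy labels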

  12. Geologic map and structure sections of the Clear Lake Volcanics, Northern California

    USGS Publications Warehouse

    Hearn, B.C.; Donnelly-Nolan, J. M.; Goff, F.E.

    1995-01-01

    The Clear Lake Volcanics are located in the California Coast Ranges about 150 km north of San Francisco. This Quaternary volcanic field has erupted intermittently since 2.1 million years ago. This volcanic field is considered a high-threat volcanic system (Ewert and others, 2005). The adjacent Geysers geothermal field, the largest power-producing geothermal field in the world, is powered by the magmatic heat source for the volcanic field. This report consists of three sheets that include the geologic map, one table, two figures, three cross sections, description of map units, charts of standard and diagrammatic correlation of map units, and references. This map supersedes U.S. Geological Survey Open-File Report 76-751. Descriptions of map units are grouped by geographic area. Summaries of the evolution, chemistry, structure, and tectonic setting of the Clear Lake Volcanics are given in Hearn and others (1981) and Donnelly-Nolan and others (1981). The geology of parts of the area underlain by the Cache Formation is based on mapping by Rymer (1981); the geology of parts of the areas underlain by the Sonoma Volcanics, Franciscan assemblage, and Great Valley sequence is based on mapping by McLaughlin (1978). Volcanic compositional map units are basalt, basaltic andesite, andesite, dacite, rhyodacite, and rhyolite, based on SiO2 content. Included in this report are maps showing the distribution of volcanic rocks through time and a chart showing erupted volumes of different lava types through time. A table gives petrographic data for each map unit by mineral type, abundance, and size. Most ages are potassium-argon (K/Ar) ages determined for whole-rock samples and mineral separates by Donnelly-Nolan and others (1981), unless otherwise noted. A few ages are carbon-14 ages or were estimated from geologic relationships. Magnetic polarities are from Mankinen and others (1978; 1981) or were determined in the field by B.C. Hearn, Jr., using a portable fluxgate magnetometer. Thickness for most units is estimated from topographic relief except where drill-hole data were available.

  13. The bedrock electrical conductivity map of the UK

    NASA Astrophysics Data System (ADS)

    Beamish, David

    2013-09-01

    Airborne electromagnetic (AEM) surveys, when regionally extensive, may sample a wide-range of geological formations. The majority of AEM surveys can provide estimates of apparent (half-space) conductivity and such derived data provide a mapping capability. Depth discrimination of the geophysical mapping information is controlled by the bandwidth of each particular system. The objective of this study is to assess the geological information contained in accumulated frequency-domain AEM survey data from the UK where existing geological mapping can be considered well-established. The methodology adopted involves a simple GIS-based, spatial join of AEM and geological databases. A lithology-based classification of bedrock is used to provide an inherent association with the petrophysical rock parameters controlling bulk conductivity. At a scale of 1:625k, the UK digital bedrock geological lexicon comprises just 86 lithological classifications compared with 244 standard lithostratigraphic assignments. The lowest common AEM survey frequency of 3 kHz is found to provide an 87% coverage (by area) of the UK formations. The conductivities of the unsampled classes have been assigned on the basis of inherent lithological associations between formations. The statistical analysis conducted uses over 8 M conductivity estimates and provides a new UK national scale digital map of near-surface bedrock conductivity. The new baseline map, formed from central moments of the statistical distributions, allows assessments/interpretations of data exhibiting departures from the norm. The digital conductivity map developed here is believed to be the first such UK geophysical map compilation for over 75 years. The methodology described can also be applied to many existing AEM data sets.
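
    A hedged sketch of the kind of GIS spatial join described above, assuming a recent GeoPandas release and hypothetical file and column names: point conductivity estimates are joined to bedrock lithology polygons and then summarised per lithological class.

        import geopandas as gpd

        points = gpd.read_file("aem_conductivity_points.gpkg")    # apparent-conductivity estimates (hypothetical file)
        bedrock = gpd.read_file("bedrock_lithology_625k.gpkg")    # 1:625k lithology polygons (hypothetical file)

        joined = gpd.sjoin(points, bedrock[["LITH_CLASS", "geometry"]],
                           how="inner", predicate="within")

        # Central moments of the conductivity distribution for each lithological class
        summary = joined.groupby("LITH_CLASS")["apparent_conductivity_mS_m"].describe()
        print(summary[["count", "mean", "50%", "std"]])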

  14. Counterintuitive increase in observed Mycobacterium avium subspecies paratuberculosis prevalence in sympatric rabbits following the introduction of paratuberculosis control measures in cattle.

    PubMed

    Fox, Naomi J; Caldow, George L; Liebeschuetz, Hilary; Stevenson, Karen; Hutchings, Michael R

    2018-06-02

    Paratuberculosis (Johne's disease) is caused by the bacterium Mycobacterium avium subspecies paratuberculosis (Map). Achieving herd-level control of mycobacterial infection is notoriously difficult, despite widespread adoption of test-and-cull-based control strategies. The presence of infection in wildlife populations could be contributing to this difficulty. Rabbits are naturally infected with the same Map strain as cattle, and can excrete high levels in their faeces. The aim of this study is to determine if implementation of paratuberculosis control in cattle leads to a decline in Map infection levels in rabbits. An island-wide, test-and-cull-based paratuberculosis control programme was initiated on a Scottish island in 2008. In this study annual tests were obtained from 15 cattle farms, from 2008 to 2011, totalling 2609 tests. Rabbits (1564) were sampled from the 15 participating farms, from 2008 to 2011, and Map was detected by faecal culture. Map seroprevalence in cattle decreased from 16 to 7.2 per cent, while Map prevalence in rabbits increased from 10.3 to 20.3 per cent. Results indicate that efforts to control paratuberculosis in cattle do not reduce Map levels in sympatric rabbits. This adds to mounting evidence that if Map becomes established in wild rabbit populations, rabbits represent a persistent and widespread source of infection, potentially impeding livestock control strategies. © British Veterinary Association (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Forest Biomass Mapping From Lidar and Radar Synergies

    NASA Technical Reports Server (NTRS)

    Sun, Guoqing; Ranson, K. Jon; Guo, Z.; Zhang, Z.; Montesano, P.; Kimes, D.

    2011-01-01

    The use of lidar and radar instruments to measure forest structure attributes such as height and biomass at global scales is being considered for a future Earth Observation satellite mission, DESDynI (Deformation, Ecosystem Structure, and Dynamics of Ice). Large footprint lidar makes a direct measurement of the heights of scatterers in the illuminated footprint and can yield accurate information about the vertical profile of the canopy within lidar footprint samples. Synthetic Aperture Radar (SAR) is known to sense the canopy volume, especially at longer wavelengths, and provides image data. Methods for biomass mapping by a combination of lidar sampling and radar mapping need to be developed. In this study, several issues in this respect were investigated using aircraft-borne lidar and SAR data in Howland, Maine, USA. The stepwise regression selected the height indices rh50 and rh75 of the Laser Vegetation Imaging Sensor (LVIS) data for predicting field-measured biomass with an R2 of 0.71 and RMSE of 31.33 Mg/ha. The above-ground biomass map generated from this regression model was considered to represent the true biomass of the area and used as a reference map since no better biomass map exists for the area. Random samples were taken from the biomass map and the correlation between the sampled biomass and co-located SAR signature was studied. The best models were used to extend the biomass from lidar samples into all forested areas in the study area, which mimics a procedure that could be used for the future DESDynI Mission. It was found that depending on the data types used (quad-pol or dual-pol), the SAR data can predict the lidar biomass samples with an R2 of 0.63-0.71 and RMSE of 32.0-28.2 Mg/ha up to biomass levels of 200-250 Mg/ha. The mean biomass of the study area calculated from the biomass maps generated by lidar-SAR synergy was within 10% of the reference biomass map derived from LVIS data. The results from this study are preliminary, but do show the potential of the combined use of lidar samples and radar imagery for forest biomass mapping. Various issues regarding lidar/radar data synergies for biomass mapping are discussed in the paper.
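
    The two-stage lidar-radar idea can be sketched with synthetic numbers as below: a regression of field biomass on the LVIS height indices gives reference biomass at the lidar footprints, and a second regression relates SAR backscatter to that lidar-derived biomass so it can be extended wall-to-wall. This is an illustration of the workflow, not the study's stepwise-regression code.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(42)

        # Stage 1: field biomass regressed on LVIS height indices rh50, rh75 (synthetic data).
        rh = rng.uniform(5.0, 30.0, size=(80, 2))                # [rh50, rh75] in metres
        field_biomass = 4.0 * rh[:, 0] + 6.0 * rh[:, 1] + rng.normal(0.0, 20.0, 80)   # Mg/ha
        lidar_model = LinearRegression().fit(rh, field_biomass)
        lidar_biomass = lidar_model.predict(rh)                  # reference biomass at footprints

        # Stage 2: a synthetic HV backscatter channel that saturates with biomass,
        # regressed against the lidar-derived biomass at the sampled footprints.
        sar_hv = -20.0 + 8.0 * (1.0 - np.exp(-lidar_biomass / 150.0)) + rng.normal(0.0, 0.4, 80)
        sar_model = LinearRegression().fit(sar_hv.reshape(-1, 1), lidar_biomass)

        # sar_model would then be applied to the full SAR image to map biomass beyond the lidar samples.
        print(round(sar_model.score(sar_hv.reshape(-1, 1), lidar_biomass), 2))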

  16. Identifying Interactions that Determine Fragment Binding at Protein Hotspots.

    PubMed

    Radoux, Chris J; Olsson, Tjelvar S G; Pitt, Will R; Groom, Colin R; Blundell, Tom L

    2016-05-12

    Locating a ligand-binding site is an important first step in structure-guided drug discovery, but current methods do little to suggest which interactions within a pocket are the most important for binding. Here we illustrate a method that samples atomic hotspots with simple molecular probes to produce fragment hotspot maps. These maps specifically highlight fragment-binding sites and their corresponding pharmacophores. For ligand-bound structures, they provide an intuitive visual guide within the binding site, directing medicinal chemists where to grow the molecule and alerting them to suboptimal interactions within the original hit. The fragment hotspot map calculation is validated using experimental binding positions of 21 fragments and subsequent lead molecules. The ligands are found in high scoring areas of the fragment hotspot maps, with fragment atoms having a median percentage rank of 97%. Protein kinase B and pantothenate synthetase are examined in detail. In each case, the fragment hotspot maps are able to rationalize a Free-Wilson analysis of SAR data from a fragment-based drug design project.

  17. Local activation time sampling density for atrial tachycardia contact mapping: how much is enough?

    PubMed

    Williams, Steven E; Harrison, James L; Chubb, Henry; Whitaker, John; Kiedrowicz, Radek; Rinaldi, Christopher A; Cooklin, Michael; Wright, Matthew; Niederer, Steven; O'Neill, Mark D

    2018-02-01

    Local activation time (LAT) mapping forms the cornerstone of atrial tachycardia diagnosis. Although anatomic and positional accuracy of electroanatomic mapping (EAM) systems have been validated, the effect of electrode sampling density on LAT map reconstruction is not known. Here, we study the effect of chamber geometry and activation complexity on optimal LAT sampling density using a combined in silico and in vivo approach. In vivo, 21 atrial tachycardia maps were studied in three groups: (1) focal activation, (2) macro-re-entry, and (3) localized re-entry. In silico, activation was simulated on a 4 × 4 cm atrial monolayer, sampled randomly at 0.25-10 points/cm2 and used to re-interpolate LAT maps. Activation patterns were studied in the geometrically simple porcine right atrium (RA) and complex human left atrium (LA). Activation complexity was introduced into the porcine RA by incomplete inter-caval linear ablation. In all cases, optimal sampling density was defined as the highest density resulting in minimal further error reduction in the re-interpolated maps. Optimal sampling densities for LA tachycardias were 0.67 ± 0.17 points/cm2 (focal activation), 1.05 ± 0.32 points/cm2 (macro-re-entry) and 1.23 ± 0.26 points/cm2 (localized re-entry), P = 0.0031. Increasing activation complexity was associated with increased optimal sampling density both in silico (focal activation 1.09 ± 0.14 points/cm2; re-entry 1.44 ± 0.49 points/cm2; spiral-wave 1.50 ± 0.34 points/cm2, P < 0.0001) and in vivo (porcine RA pre-ablation 0.45 ± 0.13 vs. post-ablation 0.78 ± 0.17 points/cm2, P = 0.0008). Increasing chamber geometric complexity was also associated with increased optimal sampling density (0.61 ± 0.22 points/cm2 vs. 1.0 ± 0.34 points/cm2, P = 0.0015). Optimal sampling densities can be identified to maximize diagnostic yield of LAT maps. Greater sampling density is required to correctly reveal complex activation and represent activation across complex geometries. Overall, the optimal sampling density for LAT map interpolation defined in this study was ∼1.0-1.5 points/cm2. Published on behalf of the European Society of Cardiology
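
    A toy version of the in silico experiment: local activation times on a synthetic focal-activation sheet are randomly subsampled at increasing densities, re-interpolated, and the interpolation error is tracked as density grows. The grid size, activation model and linear interpolation are assumptions for illustration only.

        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(0)
        nx = ny = 40                                            # 4 x 4 cm sheet, 1 mm spacing
        x, y = np.meshgrid(np.linspace(0, 4, nx), np.linspace(0, 4, ny))
        lat_true = 20.0 * np.hypot(x - 2, y - 2)                # focal activation: radial spread (ms)

        for density in [0.25, 0.5, 1.0, 2.0, 4.0]:              # sampling density in points/cm2
            n_pts = max(4, int(density * 16))                   # sheet area is 16 cm2
            idx = rng.choice(nx * ny, size=n_pts, replace=False)
            pts = np.column_stack([x.ravel()[idx], y.ravel()[idx]])
            lat_interp = griddata(pts, lat_true.ravel()[idx], (x, y), method="linear")
            err = np.nanmean(np.abs(lat_interp - lat_true))     # mean absolute LAT error (ms)
            print(f"{density:4.2f} points/cm2 -> mean error {err:5.2f} ms")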

  18. Study of the effect of atmosphere modification in conjunction with honey on the extent of shelf life of Greek bakery delicacy "touloumpaki".

    PubMed

    Arvanitoyannis, Ioannis S; Bosinas, Konstantinos P; Bouletis, Achilleas D; Gkagtzis, Dimitrios C; Hadjichristodoulou, Christos; Papaloucas, C

    2011-12-01

    The aim of the present work was to evaluate the effect of atmosphere modification on microbial (mesophiles, yeasts and molds) qualities, color, pH, texture and water activity of the Greek bakery product "touloumpaki". Samples were stored under MAP (60% CO(2)) either alone or with the addition of honey syrup for 16 days at room temperature (22-24 °C). Texture was better maintained under MAP, and the addition of honey prevented the increase in shear force needed (1.498 and 3.20 for samples with and without honey). Honey inhibited the growth of yeasts on samples stored under MAP (1.6 and 2.02 log CFU/g for samples under MAP with and without honey, respectively), while multivariate analysis showed that MAP and honey acted synergistically in confining yeasts. The presence of honey restrained mesophilic growth until the end of the storage period (5.21 and 4.29 log CFU/g for MAP and control samples, respectively), while MAP did not have any beneficial effect. Water activity (a(W) < 0.754) was strongly associated with reduced mesophile growth. Lightness values showed a significant decrease over time, with no significant changes among treatments in both internal layers and the external surface of the product. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Mapping of lead, magnesium and copper accumulation in plant tissues by laser-induced breakdown spectroscopy and laser-ablation inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Kaiser, J.; Galiová, M.; Novotný, K.; Červenka, R.; Reale, L.; Novotný, J.; Liška, M.; Samek, O.; Kanický, V.; Hrdlička, A.; Stejskal, K.; Adam, V.; Kizek, R.

    2009-01-01

    Laser-Induced Breakdown Spectroscopy (LIBS) and Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) were utilized for mapping the accumulation of Pb, Mg and Cu with a resolution of up to 200 μm over areas of up to cm × cm in sunflower (Helianthus annuus L.) leaves. The results obtained by LIBS and LA-ICP-MS are compared with the outcomes from Atomic Absorption Spectrometry (AAS) and Thin-Layer Chromatography (TLC). It is shown that laser-ablation based analytical methods can substitute or supplement these techniques, mainly in cases when fast multi-elemental mapping of a large sample area is needed.

  20. Evaluation of nine-frame enhanced multiband photography San Andreas fault zone, Carrizo Plain, California

    NASA Technical Reports Server (NTRS)

    Wallace, R. E.

    1969-01-01

    Nine-frame multiband aerial photography of a sample area 4500 feet on a side was processed to enhance spectral contrasts. The area concerned is in the Carrizo Plain, 45 miles west of Bakersfield, California, in sec. 29, T 31 S., R. 21 E., as shown on the Panorama Hills quadrangle topographic map published by the U. S. Geological Survey. The accompanying illustrations include an index map showing the location of the Carrizo Plain area; a geologic map of the area based on field studies and examination of black and white aerial photographs; an enhanced multiband aerial photograph; an Aero Ektachrome photograph; black and white aerial photographs; and an infrared image in the 8-13 micron band.

  1. Radiometric age map of Aleutian Islands

    USGS Publications Warehouse

    Wilson, Frederic H.; Turner, D.L.

    1975-01-01

    This map includes published, thesis, and open-file radiometric data available to us as of June, 1975. Some dates are not plotted because of inadequate location data in the original references. The map is divided into five sections, based on 1:1,000,000 scale enlargements of the National Atlas maps of Alaska. Within each section (e.g., southeastern Alaska), radiometric dates are plotted and keyed to 1:250,000 scale quadrangles. Accompanying each map section is table 1, listing map numbers and the sample identification numbers used in DGGS Special Report 10, "Radiometric Dates from Alaska - A 1975 Compilation". The reader is referred to Special Report 10 for more complete information on location, rock type, dating method, and literature references for each age entry. A listing of dates in Special Report 10 which require correction or deletion is included in table 2. Corrected and additional entries are listed in table 3. The listings in tables 2 and 3 follow the format of Special Report 10. Table 4 is a glossary of abbreviations used for quadrangle name, rock type, mineral dated, and type of dating method used.

  2. Radiometric age map of southcentral Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Turner, D.L.

    1975-01-01

    This map includes published, thesis, and open-file radiometric data available to us as of June, 1975. Some dates are not plotted because of inadequate location data in the original references. The map is divided into five sections, based on 1:1,000,000 scale enlargements of the National Atlas maps of Alaska. Within each section (e.g., southeastern Alaska), radiometric dates are plotted and keyed to 1:250,000 scale quadrangles. Accompanying each map section is table 1, listing map numbers and the sample identification numbers used in DGGS Special Report 10, "Radiometric Dates from Alaska - A 1975 Compilation". The reader is referred to Special Report 10 for more complete information on location, rock type, dating method, and literature references for each age entry. A listing of dates in Special Report 10 which require correction or deletion is included in table 2. Corrected and additional entries are listed in table 3. The listings in tables 2 and 3 follow the format of Special Report 10. Table 4 is a glossary of abbreviations used for quadrangle name, rock type, mineral dated, and type of dating method used.

  3. Radiometric age map of southwest Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Turner, D.L.

    1975-01-01

    This map includes published, thesis, and open-file radiometric data available to us as of June, 1975. Some dates are not plotted because of inadequate location data in the original references. The map is divided into five sections, based on 1:1,000,000 scale enlargements of the National Atlas maps of Alaska. Within each section (e.g., southeastern Alaska), radiometric dates are plotted and keyed to 1:250,000 scale quadrangles. Accompanying each map section is table 1, listing map numbers and the sample identification numbers used in DGGS Special Report 10, "Radiometric Dates from Alaska - A 1975 Compilation". The reader is referred to Special Report 10 for more complete information on location, rock type, dating method, and literature references for each age entry. A listing of dates in Special Report 10 which require correction or deletion is included in table 2. Corrected and additional entries are listed in table 3. The listings in tables 2 and 3 follow the format of Special Report 10. Table 4 is a glossary of abbreviations used for quadrangle name, rock type, mineral dated, and type of dating method used.

  4. Radiometric age map of southeast Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Turner, D.L.

    1975-01-01

    This map includes published, thesis, and open-file radiometric data available to us as of June, 1975. Some dates are not plotted because of inadequate location data in the original references. The map is divided into five sections, based on 1:1,000,000 scale enlargements of the National Atlas maps of Alaska. Within each section (e.g., southeastern Alaska), radiometric dates are plotted and keyed to 1:250,000 scale quadrangles. Accompanying each map section is table 1, listing map numbers and the sample identification numbers used in DGGS Special Report 10, "Radiometric Dates from Alaska - A 1975 Compilation". The reader is referred to Special Report 10 for more complete information on location, rock type, dating method, and literature references for each age entry. A listing of dates in Special Report 10 which require correction or deletion is included in table 2. Corrected and additional entries are listed in table 3. The listings in tables 2 and 3 follow the format of Special Report 10. Table 4 is a glossary of abbreviations used for quadrangle name, rock type, mineral dated, and type of dating method used.

  5. Radiometric age map of northern Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Turner, D.L.

    1975-01-01

    This map includes published, thesis, and open-file radiometric data available to us as of June, 1975. Some dates are not plotted because of inadequate location data in the original references. The map is divided into five sections, based on 1:1,000,000 scale enlargements of the National Atlas maps of Alaska. Within each section (e.g., southeastern Alaska), radiometric dates are plotted and keyed to 1:250,000 scale quadrangles. Accompanying each map section is table 1, listing map numbers and the sample identification numbers used in DGGS Special Report 10, "Radiometric Dates from Alaska - A 1975 Compilation". The reader is referred to Special Report 10 for more complete information on location, rock type, dating method, and literature references for each age entry. A listing of dates in Special Report 10 which require correction or deletion is included in table 2. Corrected and additional entries are listed in table 3. The listings in tables 2 and 3 follow the format of Special Report 10. Table 4 is a glossary of abbreviations used for quadrangle name, rock type, mineral dated, and type of dating method used.

  6. Digital Geologic Map of the Wallace 1:100,000 Quadrangle, Idaho

    USGS Publications Warehouse

    Lewis, Reed S.; Burmester, Russell F.; McFaddan, Mark D.; Derkey, Pamela D.; Oblad, Jon R.

    1999-01-01

    The geology of the Wallace 1:100,000 quadrangle, Idaho was compiled by Reed S. Lewis in 1997 primarily from published materials including 1983 data from Foster, Harrison's unpublished mapping done from 1975 to 1985, Hietenan's 1963, 1967, 1968, and 1984 mapping, Hobbs and others 1965 mapping, and Vance's 1981 mapping, supplemented by eight weeks of field mapping by Reed S. Lewis, Russell F. Burmester, and Mark D. McFaddan in 1997 and 1998. This geologic map information was inked onto a 1:100,000-scale greenline mylar of the topographic base map for input into a geographic information system (GIS). The resulting digital geologic map GIS can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000). The map area is located in north Idaho. The primary sources of map data are shown in figure 2 and additional sources are shown in figure 3. This open-file report describes the geologic map units, the methods used to convert the geologic map data into a digital format, the Arc/Info GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. Mapping and compilation was completed by the Idaho Geological Survey under contract with the U.S. Geological Survey (USGS) office in Spokane, Washington. The authors would like to acknowledge the help of the following field assistants: Josh Goodman, Yvonne Issak, Jeremy Johnson and Kevin Myer. Don Winston provided help with our ongoing study of Belt stratigraphy, and Tom Frost assisted with logistical problems and sample collection. Manuscript reviews by Steve Box, Tom Frost, and Brian White are greatly appreciated. We wish to thank Karen S. Bolm of the USGS for reviewing the digital files.

  7. California desert resource inventory using multispectral classification of digitally mosaicked Landsat frames

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Mcleod, R. G.; Zobrist, A. L.; Johnson, H. B.

    1979-01-01

    Procedures for adjustment of brightness values between frames and the digital mosaicking of Landsat frames to standard map projections are developed for providing a continuous data base for multispectral thematic classification. A combination of local terrain variations in the Californian deserts and a global sampling strategy based on transects provided the framework for accurate classification throughout the entire geographic region.

  8. Evaluating ecoregions for sampling and mapping land-cover patterns

    Treesearch

    Kurt H. Riitters; James D. Wickham; Timothy G. Wade

    2006-01-01

    Ecoregional stratification has been proposed for sampling and mapping land-cover composition and pattern over time. Using a wall-to-wall land-cover map of the United States, we evaluated geographic scales of variance for nine landscape-level and eight forest pattern indices, and compared stratification by ecoregions, administrative units, and watersheds. Ecoregions...

  9. Composition and origin of holotype Al-Cu-Zn minerals in relation to quasicrystals in the Khatyrka meteorite

    NASA Astrophysics Data System (ADS)

    Ivanova, Marina A.; Lorenz, Cyril A.; Borisovskiy, Sergey E.; Burmistrov, Andrey A.; Korost, Dmitriy V.; Korochantsev, Alexander V.; Logunova, Maria N.; Shornikov, Sergei I.; Petaev, Michail I.

    2017-05-01

    We investigated the khatyrkite-cupalite holotype sample, 1.2 × 0.5 mm across. It consists of khatyrkite (Cu,Zn)Al2, cupalite (Cu,Zn)Al, and interstitial material with approximate composition (Zn,Cu)Al3. All mineral phases of the holotype sample contain Zn and lack Fe, which distinguishes them from khatyrkite and cupalite in the Khatyrka meteorite particles (Bindi et al.; MacPherson et al.; Hollister et al.). Neither highly fractionated natural systems nor geo- or cosmochemical processes capable of forming the holotype sample are known so far. The bulk chemistry and thermal history of the khatyrkite-cupalite assemblage in the holotype sample hint at a possible industrial origin. Likewise, the aluminides in the Khatyrka meteorite particles may also be derived from industrial materials and mixed with extraterrestrial matter during gold prospecting in the Listvenitovy Stream valley.

  10. Historical Maps Potential on the Assessment of the Hydromorphological Changes in Large Rivers: Towards Sustainable Rivers Management under Altered Flows

    NASA Astrophysics Data System (ADS)

    Kuriqi, Alban; Rosário Fernandes, M.; Santos, Artur; Ferreira, M. Teresa

    2017-04-01

    Hydromorphological pattern changes in large rivers result from a long history of human interventions. In this study, we evaluate the causes and effects of hydromorphological alterations in the Iberian Minho River using a planform change analysis. We performed a temporal comparison using historical maps (nineteenth century) and contemporaneous maps. The studied river was divided into 2.5 km long river stretches, giving a total of 25 sampling units. The historical maps were initially georeferenced to the WGS84 coordinate system. We used a Geographic Information System (GIS) to extract the hydromorphological features and to store and organise the spatial data. The hydromorphological features (sinuosity index, braiding intensity, river corridor and active channel width, lotic and lentic habitats) were mapped by visual interpretation of the historical and the contemporaneous maps at a scale of 1:2500, applying the same methodology to both. We also analysed certain Indicators of Hydrological Alteration (IHA) based on pre- and post-dam daily streamflow data obtained from the Spanish Water Information System (SIA). The results revealed a significant reduction in the active channel width and all sinuosity indexes, representing an overall degradation of river conditions. We also noticed a drastic diminution in the number and total area of lentic habitats, causing fish habitat shifts. Changes were less evident in upstream sampling units due to diverse Land Use/Land Cover (LULC) changes combined with some geological constraints. These responses were consistent with reductions in mean annual discharge, decreased flood disturbance and increased minimum flows during the summer season. This work allows us to understand the evolutionary trajectory of a large fluvial system over more than 100 years and to implement concrete measures for sustainable river management. Keywords: historical maps, large rivers, flow alteration, sinuosity index, lotic and lentic habitats, regulated rivers, river restoration.

  11. Using digital soil maps to infer edaphic affinities of plant species in Amazonia: Problems and prospects.

    PubMed

    Moulatlet, Gabriel Massaine; Zuquim, Gabriela; Figueiredo, Fernando Oliveira Gouvêa; Lehtonen, Samuli; Emilio, Thaise; Ruokolainen, Kalle; Tuomisto, Hanna

    2017-10-01

    Amazonia combines semi-continental size with difficult access, so both current ranges of species and their ability to cope with environmental change have to be inferred from sparse field data. Although efficient techniques for modeling species distributions on the basis of a small number of species occurrences exist, their success depends on the availability of relevant environmental data layers. Soil data are important in this context, because soil properties have been found to determine plant occurrence patterns in Amazonian lowlands at all spatial scales. Here we evaluate three digital soil maps that are freely available online (SOTERLAC, HWSD, and SoilGrids) for this purpose. We first tested how well they reflect local soil cation concentration as documented with 1,500 widely distributed soil samples. We found that measured soil cation concentration differed by up to two orders of magnitude between sites mapped into the same soil class. The best map-based predictor of local soil cation concentration was obtained with a regression model combining soil classes from HWSD with cation exchange capacity (CEC) from SoilGrids. Next, we evaluated to what degree the known edaphic affinities of thirteen plant species (as documented with field data from 1,200 of the soil sample sites) can be inferred from the soil maps. The species segregated clearly along the soil cation concentration gradient in the field, but only partially along the model-estimated cation concentration gradient, and hardly at all along the mapped CEC gradient. The main problems reducing the predictive ability of the soil maps were insufficient spatial resolution and/or georeferencing errors combined with thematic inaccuracy and absence of the most relevant edaphic variables. Addressing these problems would provide better models of the edaphic environment for ecological studies in Amazonia.

  12. MGDS: Free, on-line, cutting-edge tools to enable the democratisation of geoscience data

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Ryan, W. B.; O'Hara, S.; Ferrini, V.; Arko, R. A.; Coplan, J.; Chan, S.; Carbotte, S. M.; Nitsche, F. O.; Bonczkowski, J.; Morton, J. J.; Weissel, R.; Leung, A.

    2010-12-01

    The availability of user-friendly, effective cyber-information resources for accessing and manipulating geoscience data has grown rapidly in recent years. Based at Lamont-Doherty Earth Observatory the MGDS group has developed a number of free tools that have wide application across the geosciences for both educators and researchers. A simple web page (http://www.marine-geo.org/) allows users to search for and download many types of data by key word, geographical region, or published citation. The popular Create Maps and Grids function and the downloadable Google Earth-compatible KML files appeal to a wide user base. MGDS MediaBank galleries (http://media.marine-geo.org/) enable users to view and download compelling images that are purposefully selected for their educational value from NSF-funded field programs. GeoMapApp (http://www.geomapapp.org), a free map-based interactive tool that works on any machine, is increasingly being adopted across a broad suite of users from middle school students to university researchers. GeoMapApp allows users to plot, manipulate and present data in an intuitive geographical reference frame. GeoMapApp offers a convenient way to explore the wide range of built-in data sets, to quickly generate maps and images that aid visualisation and, when importing their own gridded and tabular data sets, to access the same rich built-in functionality. A user guide, short multi-media tutorials, and webinar are available on-line. The regularly-updated Global Multi-Resolution Topography (GMRT) Synthesis is used as the default GeoMapApp base map and is an increasingly popular means to rapidly create location maps. Additionally, the layer manager offers a fast way to overlay and compare multiple data sets and is augmented by the ability to alter layer transparency so that underlying layers become visible. Examples of GeoMapApp built-in data sets include high-resolution land topography and ocean floor bathymetry derived from satellite and multi-beam swath mapping systems - these can be profiled, shaded, and contoured; geo-registered geochemical sample analyses from the EarthChem database; plate boundary, earthquake and volcano catalogues; physical oceanography global and water column data; seafloor photos and Alvin dive video images; geological maps at various scales; and, high-quality coastline, lakes and rivers data. Customised data portals offer enhanced functionality for multi-channel seismic profiles, drill core logs, and earthquake animations. GeoMapApp is used in many MARGINS undergraduate-level off-the-shelf interactive learning activities called mini-lessons (http://serc.carleton.edu/margins/collection.html). Examples of educational applicability will be shown.

  13. Drought Management Activities of the National Drought Mitigation Center (NDMC): Contributions Toward a Global Drought Early Warning System (GDEWS)

    NASA Astrophysics Data System (ADS)

    Stumpf, A.; Lachiche, N.; Malet, J.; Kerle, N.; Puissant, A.

    2011-12-01

    VHR satellite images have become a primary source for landslide inventory mapping after major triggering events such as earthquakes and heavy rainfall. Visual image interpretation is still the prevailing standard method for operational purposes but is time-consuming and not well suited to fully exploit the increasingly better supply of remote sensing data. Recent studies have addressed the development of more automated image analysis workflows for landslide inventory mapping. In particular, object-oriented approaches that account for spatial and textural image information have been demonstrated to be more adequate than pixel-based classification, but manually elaborated rule-based classifiers are difficult to adapt under changing scene characteristics. Machine learning algorithms allow learning classification rules for complex image patterns from labelled examples and can be adapted straightforwardly with available training data. In order to reduce the amount of costly training data, active learning (AL) has evolved as a key concept to guide the sampling for many applications. The underlying idea of AL is to initialize a machine learning model with a small training set, and to subsequently exploit the model state and data structure to iteratively select the most valuable samples that should be labelled by the user. With relatively few queries and labelled samples, an AL strategy yields higher accuracies than an equivalent classifier trained with many randomly selected samples. This study addressed the development of an AL method for landslide mapping from VHR remote sensing images with special consideration of the spatial distribution of the samples. Our approach [1] is based on the Random Forest algorithm and considers the classifier uncertainty as well as the variance of potential sampling regions to guide the user towards the most valuable sampling areas. The algorithm explicitly searches for compact regions and thereby avoids the spatially disperse sampling pattern inherent to most other AL methods. The accuracy, the sampling time and the computational runtime of the algorithm were evaluated on multiple satellite images capturing recent large-scale landslide events. Sampling between 1% and 4% of the study areas, accuracies between 74% and 80% were achieved, whereas standard sampling schemes yielded only accuracies between 28% and 50% with equal sampling costs. Compared to commonly used point-wise AL algorithms, the proposed approach significantly reduces the number of iterations and hence the computational runtime. Since the user can focus on relatively few compact areas (rather than on hundreds of distributed points), the overall labeling time is reduced by more than 50% compared to point-wise queries. An experimental evaluation of multiple expert mappings demonstrated strong relationships between the uncertainties of the experts and the machine learning model. It revealed that the achieved accuracies are within the range of the inter-expert disagreement and that it will be indispensable to consider ground truth uncertainties to truly achieve further enhancements in the future. The proposed method is generally applicable to a wide range of optical satellite images and landslide types. [1] A. Stumpf, N. Lachiche, J.-P. Malet, N. Kerle, and A. Puissant, Active learning in the spatial domain for remote sensing image classification, IEEE Transactions on Geoscience and Remote Sensing, 2013, DOI 10.1109/TGRS.2013.2262052.
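
    The core active-learning loop described above can be sketched with a Random Forest and plain uncertainty sampling on synthetic features, as below; the spatially compact, region-based selection of the cited approach is not reproduced here, and the data are entirely artificial.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 8))                          # e.g. object features from a VHR image
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)           # synthetic landslide / no-landslide labels

        labeled = list(rng.choice(len(X), size=20, replace=False))   # small initial training set
        pool = [i for i in range(len(X)) if i not in labeled]

        for _ in range(10):                                     # ten active-learning iterations
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            clf.fit(X[labeled], y[labeled])
            proba = clf.predict_proba(X[pool])[:, 1]
            uncertainty = 1.0 - np.abs(proba - 0.5) * 2.0       # highest where p is near 0.5
            query = pool[int(np.argmax(uncertainty))]           # the sample the "user" is asked to label
            labeled.append(query)
            pool.remove(query)

        print(round(clf.score(X, y), 3))                        # accuracy after the queries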

  14. Identification of BRCA1 and 2 Other Tumor Suppressor Genes on Chromosome 17 Through Positional Cloning

    DTIC Science & Technology

    2000-04-01

    Genes, LOH Mapping, Chromosome 17, Physical Mapping, Genetic Mapping, cDNA Screening, Humans, Anatomical Samples, Mutation Detection, Breast Cancer...According to the established model for LOH involving tumor suppressor genes, the allele remaining in the tumor sample would harbor the deleterious mutation ...sequencing on an ABI 373A sequencer (Applied Biosystems, Foster City, CA). As none of the samples we have sequenced have revealed any mutations, we have

  15. Composite Interval Mapping Based on Lattice Design for Error Control May Increase Power of Quantitative Trait Locus Detection.

    PubMed

    He, Jianbo; Li, Jijie; Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan

    2015-01-01

    Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which is usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design dictates that the arithmetic mean or adjusted mean of each line of observations in the lattice design had to be used as a response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication errors were simulated to show the power of QTL detection under different error controls. The simulation results showed that the arithmetic mean method, which is equivalent to a method under random complete block design (RCBD), was very sensitive to the size of the block variance and with the increase of block variance, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and the adjusted mean method did not change for different block variances. The CIMLD method showed 1.2- to 7.6-fold higher power of QTL detection than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by arithmetic and adjusted mean methods, respectively.
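
    A toy simulation of the error-control point above (not the CIMLD software): phenotypes carry block-within-replication effects, and line means adjusted by removing block means track the simulated genetic values much more closely than raw arithmetic means when the block variance is large. All variance components are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        n_lines, n_reps, block_size = 100, 2, 10
        line_eff = rng.normal(0.0, 1.0, n_lines)                # simulated "genetic" line effects

        records = []                                            # (line, block_id, phenotype)
        for rep in range(n_reps):
            order = rng.permutation(n_lines)                    # lines randomised into blocks
            block_eff = rng.normal(0.0, 2.0, n_lines // block_size)   # large block variance
            for pos, line in enumerate(order):
                b = rep * (n_lines // block_size) + pos // block_size
                y = line_eff[line] + block_eff[pos // block_size] + rng.normal(0.0, 0.5)
                records.append((line, b, y))

        lines = np.array([r[0] for r in records])
        blocks = np.array([r[1] for r in records])
        pheno = np.array([r[2] for r in records])

        arith_mean = np.array([pheno[lines == l].mean() for l in range(n_lines)])
        block_centered = pheno - np.array([pheno[blocks == b].mean() for b in blocks])
        adj_mean = np.array([block_centered[lines == l].mean() for l in range(n_lines)])

        print("correlation with true line effects:",
              round(np.corrcoef(line_eff, arith_mean)[0, 1], 2),   # degraded by block effects
              round(np.corrcoef(line_eff, adj_mean)[0, 1], 2))     # block effects removed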

  16. Hydrocarbon Reservoir Identification in Volcanic Zone by using Magnetotelluric and Geochemistry Information

    NASA Astrophysics Data System (ADS)

    Firda, S. I.; Permadi, A. N.; Supriyanto; Suwardi, B. N.

    2018-03-01

    The resistivity derived from magnetotelluric (MT) data provides a resistivity map of the volcanic reservoir zone, and geochemistry information is used to confirm the reservoir and source rock formations. In this research, we used 132 data points divided into two lines across the exploration area. We used several steps to produce the resistivity map: time-series correction, crosspower correction, and inversion of the MT data. Line-2 and line-3 show anomalous geological conditions associated with the Gabon fault. The geological structure derived from the resistivity map shows the fault and the geological formation, consistent with the distribution of mapped geological rock data. The geochemistry information indicates the maturity of the source rock formation. According to the core sample analysis, we obtained the visual porosity of the reservoir rock formation in several geological structures. Based on this, we built a geological model of the potential reservoir and the source rock around our area of interest.

  17. GT1_mbaes_1: HERschel Observations of Edge-on Spirals (HEROES)

    NASA Astrophysics Data System (ADS)

    Baes, M.

    2010-03-01

    We propose to use PACS and SPIRE to map the dust distribution in a sample of seven large edge-on spiral galaxies with regular dust lanes. We will look for the presence of cold dust at large galactocentric radii and investigate the link between dust, gas and metallicity as a function of radius. We will also constrain the vertical distribution of the dust and particularly look for dust emission at large heights above the plane of the galaxies. We will compare the observed Herschel maps with simulated maps resulting from detailed radiative transfer models based on optical and near-infrared images. This will enable us to investigate whether we can confirm the existence of a dust energy balance problem suggested by previous observations (the dust seen in absorption in optical maps underestimates the dust seen in emission) and investigate possible ways to alleviate this potential problem.

  18. Rapid detection methods for viable Mycobacterium avium subspecies paratuberculosis in milk and cheese.

    PubMed

    Botsaris, George; Slana, Iva; Liapi, Maria; Dodd, Christine; Economides, Constantinos; Rees, Catherine; Pavlik, Ivo

    2010-07-31

    Mycobacterium avium subsp. paratuberculosis (MAP) may have a role in the development of Crohn's disease in humans via the consumption of contaminated milk and milk products. Detection of MAP from milk and dairy products has been reported from countries on the European continent, Argentina, the UK and Australia. In this study three different methods (quantitative real time PCR, combined phage IS900 PCR and conventional cultivation) were used to detect the presence of MAP in bulk tank milk (BTM) and cheese originating from sheep, goat and mixed milks from farms and products in Cyprus. During the first survey the presence of MAP was detected in 63 (28.6%) of cows' BTM samples by quantitative real time PCR. A second survey of BTM used a new combined phage IS900 PCR assay, and in this case MAP was detected in 50 (22.2%) samples showing a good level of agreement by both methods. None of the herds tested were known to be affected by Johne's disease and the presence of viable MAP was confirmed by conventional culture in only two cases of cows BTM. This suggests that either rapid method used is more sensitive than the conventional culture when testing raw milk samples for MAP. The two isolates recovered from BTM were identified by IS1311 PCR REA as cattle and sheep strains, respectively. In contrast when cheese samples were tested, MAP DNA was detected by quantitative real time PCR in seven (25.0%) samples (n=28). However no viable MAP was detected when either the combined phage IS900 PCR or conventional culture methods were used. Copyright 2010 Elsevier B.V. All rights reserved.

  19. Quantification of 2D elemental distribution maps of intermediate-thick biological sections by low energy synchrotron μ-X-ray fluorescence spectrometry

    NASA Astrophysics Data System (ADS)

    Kump, P.; Vogel-Mikuš, K.

    2018-05-01

    Two fundamental-parameter (FP) based models for quantification of 2D elemental distribution maps of intermediate-thick biological samples by synchrotron low energy μ-X-ray fluorescence spectrometry (SR-μ-XRF) are presented and applied to elemental analysis in experiments with monochromatic focused photon beam excitation at two low energy X-ray fluorescence beamlines: TwinMic (Elettra Sincrotrone Trieste, Italy) and ID21 (ESRF, Grenoble, France). The models assume intermediate-thick biological samples composed of the measured elements, which are the sources of the measurable spectral lines, and of a residual matrix that affects the measured intensities through absorption. In the first model a fixed residual matrix of the sample is assumed, while in the second model the residual matrix is obtained by iterative refinement of the elemental concentrations and of the adjusted residual matrix. The absorption of the incident focused beam in the biological sample at each scanned pixel position, determined from the output of a photodiode or a CCD camera, is applied as a control in the iterative quantification procedure.

  20. Integration of Neuroimaging and Microarray Datasets through Mapping and Model-Theoretic Semantic Decomposition of Unstructured Phenotypes

    PubMed Central

    Pantazatos, Spiro P.; Li, Jianrong; Pavlidis, Paul; Lussier, Yves A.

    2009-01-01

    An approach towards heterogeneous neuroscience dataset integration is proposed that uses Natural Language Processing (NLP) and a knowledge-based phenotype organizer system (PhenOS) to link ontology-anchored terms to underlying data from each database, and then maps these terms based on a computable model of disease (SNOMED CT®). The approach was implemented using sample datasets from fMRIDC, GEO, The Whole Brain Atlas and Neuronames, and allowed for complex queries such as “List all disorders with a finding site of brain region X, and then find the semantically related references in all participating databases based on the ontological model of the disease or its anatomical and morphological attributes”. Precision of the NLP-derived coding of the unstructured phenotypes in each dataset was 88% (n = 50), and precision of the semantic mapping between these terms across datasets was 98% (n = 100). To our knowledge, this is the first example of the use of both semantic decomposition of disease relationships and hierarchical information found in ontologies to integrate heterogeneous phenotypes across clinical and molecular datasets. PMID:20495688

  1. Assessing genome-wide copy number variation in the Han Chinese population.

    PubMed

    Lu, Jianqi; Lou, Haiyi; Fu, Ruiqing; Lu, Dongsheng; Zhang, Feng; Wu, Zhendong; Zhang, Xi; Li, Changhua; Fang, Baijun; Pu, Fangfang; Wei, Jingning; Wei, Qian; Zhang, Chao; Wang, Xiaoji; Lu, Yan; Yan, Shi; Yang, Yajun; Jin, Li; Xu, Shuhua

    2017-10-01

    Copy number variation (CNV) is a valuable source of genetic diversity in the human genome and a well-recognised cause of various genetic diseases. However, CNVs have been considerably under-represented in population-based studies, particularly in the Han Chinese, the largest ethnic group in the world. To build a representative CNV map for the Han Chinese population, we conducted a genome-wide CNV study involving 451 male Han Chinese samples from 11 geographical regions encompassing 28 dialect groups, representing a less-biased panel compared with the currently available data. We detected CNVs by using a 4.2M NimbleGen comparative genomic hybridisation array and whole-genome deep sequencing of 51 samples to optimise the filtering conditions in CNV discovery. A comprehensive Han Chinese CNV map was built based on a set of high-quality variants (positive predictive value >0.8, with sizes ranging from 369 bp to 4.16 Mb and a median of 5907 bp). The map consists of 4012 CNV regions (CNVRs), and more than half are novel relative to the 30 East Asian CNV Project and the 1000 Genomes Project Phase 3. We further identified 81 CNVRs specific to regional groups, which was indicative of the subpopulation structure within the Han Chinese population. Our data are complementary to public data sources, and the CNV map may facilitate the identification of pathogenic CNVs and further biomedical research studies involving the Han Chinese population. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Mass spectrometric-based stable isotopic 2-aminobenzoic acid glycan mapping for rapid glycan screening of biotherapeutics.

    PubMed

    Prien, Justin M; Prater, Bradley D; Qin, Qiang; Cockrill, Steven L

    2010-02-15

    Fast, sensitive, robust methods for "high-level" glycan screening are necessary during various stages of a biotherapeutic product's lifecycle, including clone selection, process changes, and quality control for lot release testing. Traditional glycan screening involves chromatographic or electrophoretic separation-based methods, and, although reproducible, these methods can be time-consuming. Even ultrahigh-performance chromatographic and microfluidic integrated LC/MS systems, which work on the tens of minute time scale, become lengthy when hundreds of samples are to be analyzed. Comparatively, a direct infusion mass spectrometry (MS)-based glycan screening method acquires data on a millisecond time scale, exhibits exquisite sensitivity and reproducibility, and is amenable to automated peak annotation. In addition, characterization of glycan species via sequential mass spectrometry can be performed simultaneously. Here, we demonstrate a quantitative high-throughput MS-based mapping approach using stable isotope 2-aminobenzoic acid (2-AA) for rapid "high-level" glycan screening.

  3. Three-dimensional full-field X-ray orientation microscopy

    PubMed Central

    Viganò, Nicola; Tanguy, Alexandre; Hallais, Simon; Dimanov, Alexandre; Bornert, Michel; Batenburg, Kees Joost; Ludwig, Wolfgang

    2016-01-01

    A previously introduced mathematical framework for full-field X-ray orientation microscopy is for the first time applied to experimental near-field diffraction data acquired from a polycrystalline sample. Grain by grain tomographic reconstructions using convex optimization and prior knowledge are carried out in a six-dimensional representation of position-orientation space, used for modelling the inverse problem of X-ray orientation imaging. From the 6D reconstruction output we derive 3D orientation maps, which are then assembled into a common sample volume. The obtained 3D orientation map is compared to an EBSD surface map and local misorientations, as well as remaining discrepancies in grain boundary positions are quantified. The new approach replaces the single orientation reconstruction scheme behind X-ray diffraction contrast tomography and extends the applicability of this diffraction imaging technique to material micro-structures exhibiting sub-grains and/or intra-granular orientation spreads of up to a few degrees. As demonstrated on textured sub-regions of the sample, the new framework can be extended to operate on experimental raw data, thereby bypassing the concept of orientation indexation based on diffraction spot peak positions. This new method enables fast, three-dimensional characterization with isotropic spatial resolution, suitable for time-lapse observations of grain microstructures evolving as a function of applied strain or temperature. PMID:26868303

  4. Mass Spectrometric and Synchrotron Radiation based techniques for the identification and distribution of painting materials in samples from paints of Josep Maria Sert

    PubMed Central

    2012-01-01

    Background: Establishing the distribution of materials in paintings and that of their degradation products by imaging techniques is fundamental to understanding the painting technique and can improve our knowledge of the conservation status of the painting. The combined use of chromatographic-mass spectrometric techniques, such as GC/MS or Py/GC/MS, and the chemical mapping of functional groups by imaging SR FTIR in transmission mode on thin sections and SR XRD line scans is presented as a suitable approach for a detailed characterisation of the materials in a paint sample, while establishing their localisation in the sample build-up. This analytical approach has been used to study samples from Catalan paintings by Josep Maria Sert y Badía (20th century), a muralist of international recognition whose canvases adorned buildings worldwide. Results: The pigments used by the painter as well as the organic materials used as binders and varnishes could be identified by means of conventional techniques. The distribution of these materials determined by means of Synchrotron Radiation based techniques allowed us to establish the mixtures used by the painter depending on the purpose. Conclusions: Results show the suitability of the combined use of SR μFTIR and SR μXRD mapping and conventional techniques to unequivocally identify all the materials present in the sample and their localization in the sample build-up. This kind of approach becomes indispensable to solve the challenge of micro-heterogeneous samples. The complementary interpretation of the data obtained with all the different techniques allowed the characterization of both organic and inorganic materials in the samples layer by layer, as well as the identification of the painting techniques used by Sert in the works-of-art under study. PMID:22616949

  5. Drop transfer between superhydrophobic wells using air logic control.

    PubMed

    Vuong, Thach; Cheong, Brandon Huey-Ping; Huynh, So Hung; Muradoglu, Murat; Liew, Oi Wah; Ng, Tuck Wah

    2015-02-21

    Superhydrophobic surfaces aid biochemical analysis by limiting sample loss. A system based on wells here tolerated tilting up to 20° and allowed air logic transfer with evidence of mixing. Conditions for intact transfer on 15 to 60 μL drops using compressed air pressure operation were also mapped.

  6. Publications - GMC 383 | Alaska Division of Geological & Geophysical

    Science.gov Websites

    DGGS GMC 383 Publication Details. Title: Makushin Geothermal Project ST-1R, A-1, D-2 Core 2009 re-sampling and analysis: Analytical results for anomalous precious and base metals associated with geothermal…

  7. Publications - GMC 366 | Alaska Division of Geological & Geophysical

    Science.gov Websites

    DGGS GMC 366 Publication Details. Title: Makushin Geothermal Project ST-1R Core 2009 re-sampling and analysis: Analytical results for anomalous precious and base metals associated with geothermal systems

  8. Family Friendly Policies in STEM Departments: Awareness and Determinants

    ERIC Educational Resources Information Center

    Su, Xuhong; Bozeman, Barry

    2016-01-01

    Focused on academic departments in science, technology, engineering, and mathematics (STEM) fields in the United States, we attempt to map department chairs' awareness of family friendly policies and investigate possible determinants of their knowledge levels. Based on a sample of STEM department chairs in American research universities, we find…

  9. Mapping the Gaps in Services for L2 Writers

    ERIC Educational Resources Information Center

    Patton, Martha Davis

    2011-01-01

    Given complaints about preparation of international students for their writing-intensive courses, a director of first-year writing and an undergraduate researcher at a Midwestern research university conducted a needs assessment based on Kaufman's model. Instruments used included a survey, interviews, and analysis of commentary on sample papers.…

  10. Stroke Location Is an Independent Predictor of Cognitive Outcome.

    PubMed

    Munsch, Fanny; Sagnier, Sharmila; Asselineau, Julien; Bigourdan, Antoine; Guttmann, Charles R; Debruxelles, Sabrina; Poli, Mathilde; Renou, Pauline; Perez, Paul; Dousset, Vincent; Sibon, Igor; Tourdias, Thomas

    2016-01-01

    On top of functional outcome, accurate prediction of cognitive outcome for stroke patients is an unmet need with major implications for clinical management. We investigated whether stroke location may contribute independent prognostic value to multifactorial predictive models of functional and cognitive outcomes. Four hundred twenty-eight consecutive patients with ischemic stroke were prospectively assessed with magnetic resonance imaging at 24 to 72 hours and at 3 months for functional outcome using the modified Rankin Scale and cognitive outcome using the Montreal Cognitive Assessment (MoCA). Statistical maps of functional and cognitive eloquent regions were derived from the first 215 patients (development sample) using voxel-based lesion-symptom mapping. We used multivariate logistic regression models to study the influence of stroke location (number of eloquent voxels from voxel-based lesion-symptom mapping maps), age, initial National Institutes of Health Stroke Scale and stroke volume on modified Rankin Scale and MoCA. The second part of our cohort was used as an independent replication sample. In univariate analyses, stroke location, age, initial National Institutes of Health Stroke Scale, and stroke volume were all predictive of poor modified Rankin Scale and MoCA. In multivariable analyses, stroke location remained the strongest independent predictor of MoCA and significantly improved the prediction compared with using only age, initial National Institutes of Health Stroke Scale, and stroke volume (area under the curve increased from 0.697-0.771; difference=0.073; 95% confidence interval, 0.008-0.155). In contrast, stroke location did not persist as independent predictor of modified Rankin Scale that was mainly driven by initial National Institutes of Health Stroke Scale (area under the curve going from 0.840 to 0.835). Similar results were obtained in the replication sample. Stroke location is an independent predictor of cognitive outcome (MoCA) at 3 months post stroke. © 2015 American Heart Association, Inc.
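
    The multivariable modelling step summarised above can be illustrated with a minimal sketch: a logistic regression of a binary poor-outcome indicator on age, NIHSS, stroke volume and an eloquent-voxel count, with the AUC compared between models with and without the location term. All variable names and data below are synthetic placeholders, not the study's cohort.

    ```python
    # Minimal sketch of the multivariable modelling described above: compare the
    # AUC of a logistic model with and without the stroke-location predictor.
    # All data below are synthetic placeholders, not the study's cohort.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 428
    df = pd.DataFrame({
        "age": rng.normal(68, 12, n),
        "nihss": rng.integers(0, 25, n),
        "stroke_volume": rng.lognormal(2.0, 1.0, n),
        "eloquent_voxels": rng.lognormal(3.0, 1.5, n),
    })
    # Synthetic binary outcome loosely driven by the predictors (placeholder only).
    logit = 0.03 * df["age"] + 0.1 * df["nihss"] + 0.002 * df["eloquent_voxels"] - 4.0
    y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

    def fitted_auc(cols):
        """Fit a logistic model on the given columns and return its in-sample AUC."""
        model = LogisticRegression(max_iter=2000).fit(df[cols], y)
        return roc_auc_score(y, model.predict_proba(df[cols])[:, 1])

    base = ["age", "nihss", "stroke_volume"]
    print("AUC without location:", round(fitted_auc(base), 3))
    print("AUC with location:   ", round(fitted_auc(base + ["eloquent_voxels"]), 3))
    ```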

  11. Robotic ecological mapping: Habitats and the search for life in the Atacama Desert

    NASA Astrophysics Data System (ADS)

    Warren-Rhodes, K.; Weinstein, S.; Piatek, J. L.; Dohm, J.; Hock, A.; Minkley, E.; Pane, D.; Ernst, L. A.; Fisher, G.; Emani, S.; Waggoner, A. S.; Cabrol, N. A.; Wettergreen, D. S.; Grin, E.; Coppin, P.; Diaz, Chong; Moersch, J.; Oril, G. G.; Smith, T.; Stubbs, K.; Thomas, G.; Wagner, M.; Wyatt, M.; Boyle, L. Ng

    2007-12-01

    As part of the three-year `Life in the Atacama' (LITA) project, plant and microbial abundance were mapped within three sites in the Atacama Desert, Chile, using an automated robotic rover. On-board fluorescence imaging of six biological signatures (e.g., chlorophyll, DNA, proteins) was used to assess abundance, based on a percent positive sample rating system and standardized robotic ecological transects. The percent positive rating system scored each sample based on the measured signal strength (0 for no signal to 2 for strong signal) for each biological signature relative to the total rating possible. The 2005 field experiment results show that percent positive ratings varied significantly across Site D (coastal site with fog), with patchy zones of high abundance correlated with orbital and microscale habitat types (heaved surface crust and gravel bars); alluvial fan habitats generally had lower abundance. Non-random multi-scale biological patchiness also characterized interior desert Sites E and F, with relatively high abundance associated with (paleo)aqueous habitats such as playas. Localized variables, including topography, played an important, albeit complex, role in microbial spatial distribution. Site D biosignature trends correlated with culturable soil bacteria, with MPN ranging from 10-1000 CFU/g-soil, and chlorophyll ratings accurately mapped lichen/moss abundance (Site D) and higher plant (Site F) distributions. Climate also affected biological patchiness, with significant correlation shown between abundance and (rover) air relative humidity, while lichen patterns were linked to the presence of fog. Rover biological mapping results across sites parallel longitudinal W-E wet/dry/wet Atacama climate trends. Overall, the study highlights the success of targeting of aqueous-associated habitats identifiable from orbital geology and mineralogy. The LITA experience also suggests the terrestrial study of life and its distribution, particularly the fields of landscape ecology and ecohydrology, hold critical lessons for the search for life on other planets. Their applications to robotic sampling strategies on Mars should be further exploited.
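
    The percent-positive rating arithmetic described above reduces to a simple ratio; a small sketch is given below, with made-up per-biosignature scores.

    ```python
    # Sketch of the percent-positive rating described above: each sample is scored
    # 0 (no signal), 1 (weak) or 2 (strong) for each of the six biosignatures, and
    # the rating is the achieved score as a percentage of the maximum possible.
    # The example scores are made up.
    def percent_positive(scores, max_per_signature=2):
        """scores: list of per-biosignature ratings (0, 1 or 2)."""
        return 100.0 * sum(scores) / (max_per_signature * len(scores))

    # Example: strong chlorophyll and DNA signals, weak protein signal, rest absent.
    print(round(percent_positive([2, 2, 1, 0, 0, 0]), 1))
    ```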

  12. Hydrogeological interpretation of natural radionuclide contents in Austrian groundwaters

    NASA Astrophysics Data System (ADS)

    Schubert, Gerhard; Berka, Rudolf; Hörhan, Thomas; Katzlberger, Christian; Landstetter, Claudia; Philippitsch, Rudolf

    2010-05-01

    The Austrian Agency for Health and Food Safety (AGES) stores comprehensive data sets of radionuclide contents in Austrian groundwater. There are several analyses concerning Rn-222, Ra-226, gross alpha and gross beta as well as selected analyses of Ra-228, Pb-210, Po-210, Uranium and U-234/U-238. In a current project financed by the Austrian Federal Ministry of Agriculture, Forestry, Environment and Water Management, AGES and the Geological Survey of Austria (GBA) are evaluating these data sets with regard to the geological backgrounds. Several similar studies based on groundwater monitoring have been made in the USA (for instance by Focazio, M.J., Szabo, Z., Kraemer, T.F., Mullin, A.H., Barringer, T.H., De Paul, V.T. (2001): Occurrence of selected radionuclides in groundwater used for drinking water in the United States: a reconnaissance survey, 1998. U.S. Geological Survey Water-Resources Investigations Report 00-4273). The geological background for the radionuclide contents of groundwater will be derived from geological maps in combination with existing Thorium and Uranium analyses of the country rocks and stream-sediments, and from airborne radiometric maps. Airborne radiometric data could help to identify potential radionuclide hot spot areas, as only airborne radiometric mapping can provide countrywide Thorium and Uranium data coverage in high resolution. The project will also focus on the characteristics of the sampled wells and springs and the hydrological situation during sampling, as these factors can have an important influence on the Radon content of the sampled groundwater (Schubert, G., Alletsgruber, I., Finger, F., Gasser, V., Hobiger, G. and Lettner, H. (2010): Radon im Grundwasser des Mühlviertels (Oberösterreich). Grundwasser, Springer (in print)). Based on the project results, an overview map (1:500,000) of the radionuclide potential should be produced. The first version should be available in February 2011.

  13. Prospects and pitfalls of occupational hazard mapping: 'between these lines there be dragons'.

    PubMed

    Koehler, Kirsten A; Volckens, John

    2011-10-01

    Hazard data mapping is a promising new technique that can enhance the process of occupational exposure assessment and risk communication. Hazard maps have the potential to improve worker health by providing key input for the design of hazard intervention and control strategies. Hazard maps are developed with aid from direct-reading instruments, which can collect highly spatially and temporally resolved data in a relatively short period of time. However, quantifying spatial-temporal variability in the occupational environment is not a straightforward process, and our lack of understanding of how to ascertain and model spatial and temporal variability is a limiting factor in the use and interpretation of workplace hazard maps. We provide an example of how sources of and exposures to workplace hazards may be mischaracterized in a hazard map due to a lack of completeness and representativeness of collected measurement data. Based on this example, we believe that a major priority for research in this emerging area should focus on the development of a statistical framework to quantify uncertainty in spatially and temporally varying data. In conjunction with this need is one for the development of guidelines and procedures for the proper sampling, generation, and evaluation of workplace hazard maps.

  14. On the map: Nature and Science editorials.

    PubMed

    Waaijer, Cathelijn J F; van Bochove, Cornelis A; van Eck, Nees Jan

    2011-01-01

    Bibliometric mapping of scientific articles based on keywords and technical terms in abstracts is now frequently used to chart scientific fields. In contrast, no significant mapping has been applied to the full texts of non-specialist documents. Editorials in Nature and Science are such non-specialist documents, reflecting the views of the two most read scientific journals on science, technology and policy issues. We use the VOSviewer mapping software to chart the topics of these editorials. A term map and a document map are constructed and clusters are distinguished in both of them. The validity of the document clustering is verified by a manual analysis of a sample of the editorials. This analysis confirms the homogeneity of the clusters obtained by mapping and augments the latter with further detail. As a result, the analysis provides reliable information on the distribution of the editorials over topics, and on differences between the journals. The most striking difference is that Nature devotes more attention to internal science policy issues and Science more to the political influence of scientists. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s11192-010-0205-9) contains supplementary material, which is available to authorized users.

  15. Comparison of traditional burn wound mapping with a computerized program.

    PubMed

    Williams, James F; King, Booker T; Aden, James K; Serio-Melvin, Maria; Chung, Kevin K; Fenrich, Craig A; Salinas, José; Renz, Evan M; Wolf, Steven E; Blackbourne, Lorne H; Cancio, Leopoldo C

    2013-01-01

    Accurate burn estimation affects the use of burn resuscitation formulas and treatment strategies, and thus can affect patient outcomes. The objective of this process-improvement project was to compare the accuracy of a computer-based burn mapping program, WoundFlow (WF), with the widely used hand-mapped Lund-Browder (LB) diagram. Manikins with various burn representations (from 1% to more than 60% TBSA) were used for comparison of the WF system and LB diagrams. Burns were depicted on the manikins using red vinyl adhesive. Healthcare providers responsible for mapping of burn patients were asked to perform burn mapping of the manikins. Providers were randomized to either an LB or a WF group. Differences in the total map area between groups were analyzed. Also, direct measurements of the burn representations were taken and compared with LB and WF results. The results of 100 samples, compared using Bland-Altman analysis, showed no difference between the two methods. WF was as accurate as LB mapping for all burn surface areas. WF may be additionally beneficial in that it can track daily progress until complete wound closure, and can automatically calculate burn size, thus decreasing the chances of mathematical errors.
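
    A minimal sketch of the Bland-Altman comparison used above is shown below; the paired %TBSA values are placeholders, not the project's measurements.

    ```python
    # Minimal Bland-Altman sketch for paired TBSA estimates from WoundFlow and
    # Lund-Browder: compute the mean bias and the 95% limits of agreement.
    # The arrays below are hypothetical placeholders.
    import numpy as np

    wf = np.array([12.0, 35.5, 60.2, 5.1, 48.9])   # hypothetical %TBSA, WoundFlow
    lb = np.array([11.5, 36.0, 59.0, 5.4, 50.1])   # hypothetical %TBSA, Lund-Browder

    diff = wf - lb
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)            # half-width of limits of agreement
    print(f"bias = {bias:.2f} %TBSA, "
          f"limits of agreement = [{bias - half_width:.2f}, {bias + half_width:.2f}]")
    ```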

  16. DNA Probe Pooling for Rapid Delineation of Chromosomal Breakpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Chun-Mei; Kwan, Johnson; Baumgartner, Adolf

    2009-01-30

    Structural chromosome aberrations are hallmarks of many human genetic diseases. The precise mapping of translocation breakpoints in tumors is important for identification of genes with altered levels of expression, prediction of tumor progression, therapy response, or length of disease-free survival as well as the preparation of probes for detection of tumor cells in peripheral blood. Similarly, in vitro fertilization (IVF) and preimplantation genetic diagnosis (PGD) for carriers of balanced, reciprocal translocations benefit from accurate breakpoint maps in the preparation of patient-specific DNA probes followed by a selection of normal or balanced oocytes or embryos. We expedited the process of breakpoint mapping and preparation of case-specific probes by utilizing physically mapped bacterial artificial chromosome (BAC) clones. Historically, breakpoint mapping is based on the definition of the smallest interval between proximal and distal probes. Thus, many of the DNA probes prepared for multi-clone and multi-color mapping experiments do not generate additional information. Our pooling protocol described here with examples from thyroid cancer research and PGD accelerates the delineation of translocation breakpoints without sacrificing resolution. The turnaround time from clone selection to mapping results using tumor or IVF patient samples can be as short as three to four days.

  17. Systolic MOLLI T1 mapping with heart-rate-dependent pulse sequence sampling scheme is feasible in patients with atrial fibrillation.

    PubMed

    Zhao, Lei; Li, Songnan; Ma, Xiaohai; Greiser, Andreas; Zhang, Tianjing; An, Jing; Bai, Rong; Dong, Jianzeng; Fan, Zhanming

    2016-03-15

    T1 mapping enables assessment of myocardial characteristics. As the most common type of arrhythmia, atrial fibrillation (AF) is often accompanied by a variety of cardiac pathologies, whereby the irregular and usually rapid ventricle rate of AF may cause inaccurate T1 estimation due to mis-triggering and inadequate magnetization recovery. We hypothesized that systolic T1 mapping with a heart-rate-dependent (HRD) pulse sequence scheme may overcome this issue. 30 patients with AF and 13 healthy volunteers were enrolled and underwent cardiovascular magnetic resonance (CMR) at 3 T. CMR was repeated for 3 patients after electric cardioversion and for 2 volunteers after lowering heart rate (HR). A Modified Look-Locker Inversion Recovery (MOLLI) sequence was acquired before and 15 min after administration of 0.1 mmol/kg gadopentetate dimeglumine. For AF patients, both the fixed 5(3)3/4(1)3(1)2 and the HRD sampling scheme were performed at diastole and systole, respectively. The HRD pulse sequence sampling scheme was 5(n)3/4(n)3(n)2, where n was determined by the heart rate to ensure adequate magnetization recovery. Image quality of T1 maps was assessed. T1 times were measured in myocardium and blood. Extracellular volume fraction (ECV) was calculated. In volunteers with repeated T1 mapping, the myocardial native T1 and ECV generated from the 1st fixed sampling scheme were smaller than from the 1st HRD and 2nd fixed sampling scheme. In healthy volunteers, the overall native T1 times and ECV of the left ventricle (LV) in diastolic T1 maps were greater than in systolic T1 maps (P < 0.01, P < 0.05). In the 3 AF patients that had received electrical cardioversion therapy, the myocardial native T1 times and ECV generated from the fixed sampling scheme were smaller than in the 1st and 2nd HRD sampling scheme (all P < 0.05). In patients with AF (HR: 88 ± 20 bpm, HR fluctuation: 12 ± 9 bpm), more T1 maps with artifact were found in diastole than in systole (P < 0.01). The overall native T1 times and ECV of the left ventricle (LV) in diastolic T1 maps were greater than systolic T1 maps, either with fixed or HRD sampling scheme (all P < 0.05). Systolic MOLLI T1 mapping with heart-rate-dependent pulse sequence scheme can improve image quality and avoid T1 underestimation. It is feasible and with further validation may extend clinical applicability of T1 mapping to patients with atrial fibrillation.
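
    The abstract reports ECV values without spelling out the formula; the sketch below assumes the conventional definition based on pre- and post-contrast T1 of myocardium and blood and the haematocrit, with illustrative numbers only.

    ```python
    # Sketch of the conventional extracellular volume (ECV) calculation assumed to
    # underlie the values reported above (the abstract itself does not give the
    # formula): ECV = (1 - haematocrit) * (dR1_myocardium / dR1_blood),
    # where dR1 = 1/T1_post - 1/T1_pre. All numbers below are illustrative.
    def ecv(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hct):
        d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
        d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
        return (1.0 - hct) * d_r1_myo / d_r1_blood

    # T1 times in ms (native / 15 min post-contrast), haematocrit 0.42.
    print(round(ecv(1200.0, 450.0, 1800.0, 300.0, 0.42), 2))  # ~0.29 for these made-up values
    ```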

  18. Effect of different packaging methods and storage temperature on microbiological and physicochemical quality characteristics of meatball.

    PubMed

    Yilmaz, I; Demirci, M

    2010-06-01

    The objective of this research was to determine the physicochemical changes and microbiological quality of meatball samples packaged by different methods. Meatball samples were packaged in polystyrene trays closed with polyethylene film (PS packs), vacuum packed, or packed under modified atmosphere (MAP; 65% N2, 35% CO2), and held under refrigerated display (4 °C) for 8, 16 and 16 days for PS packs, vacuum and MAP, respectively. Microbial load, free fatty acid and thiobarbituric acid values of the samples tended to increase with storage time. Bacterial counts of the raw meatball samples increased by 2 log cycles at the end of storage compared with initial values. Meatball samples can be stored without any microbiological problem for 7 days at 4 °C. The results from this study suggest that the shelf-life assigned to MAP and vacuum-packed meatballs is appropriate. Meatball samples underwent physical deformation when they were packed before the vacuum process. With these negative factors considered, MAP is superior to the other two packaging methods.

  19. Improving the accuracy of livestock distribution estimates through spatial interpolation.

    PubMed

    Bryssinckx, Ward; Ducheyne, Els; Muhwezi, Bernard; Godfrey, Sunday; Mintiens, Koen; Leirs, Herwig; Hendrickx, Guy

    2012-11-01

    Animal distribution maps serve many purposes, such as estimating the transmission risk of zoonotic pathogens to both animals and humans. The reliability and usability of such maps is highly dependent on the quality of the input data. However, decisions on how to perform livestock surveys are often based on previous work without considering possible consequences. A better understanding of the impact of using different sample designs and processing steps on the accuracy of livestock distribution estimates was acquired through iterative experiments using detailed survey data. The importance of sample size, sample design and aggregation is demonstrated, and spatial interpolation is presented as a potential way to improve cattle number estimates. As expected, results show that an increasing sample size increased the precision of cattle number estimates, but these improvements were mainly seen when the initial sample size was relatively low (e.g. a median relative error decrease of 0.04% per sampled parish for sample sizes below 500 parishes). For higher sample sizes, the added value of further increasing the number of samples declined rapidly (e.g. a median relative error decrease of 0.01% per sampled parish for sample sizes above 500 parishes). When a two-stage stratified sample design was applied to yield more evenly distributed samples, accuracy levels were higher for low sample densities and stabilised at lower sample sizes compared to one-stage stratified sampling. Aggregating the resulting cattle number estimates yielded significantly more accurate results because of averaging under- and over-estimates (e.g. when aggregating cattle number estimates from subcounty to district level, P <0.009 based on a sample of 2,077 parishes using one-stage stratified samples). During aggregation, area-weighted mean values were assigned to higher administrative unit levels. However, when this step is preceded by a spatial interpolation to fill in missing values in non-sampled areas, accuracy is improved remarkably. This applies especially to low sample sizes and spatially evenly distributed samples (e.g. P <0.001 for a sample of 170 parishes using one-stage stratified sampling and aggregation on district level). Whether the same observations apply on a lower spatial scale should be further investigated.
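
    The area-weighted aggregation step described above can be sketched as follows; the parish records are hypothetical and only illustrate the weighting, not the survey data.

    ```python
    # Sketch of the area-weighted aggregation step described above: parish-level
    # cattle density estimates are combined into a district value by weighting
    # each parish with its area. Parish records below are hypothetical.
    parishes = [
        # (district, area_km2, cattle_per_km2)
        ("district_A", 40.0, 25.0),
        ("district_A", 60.0, 10.0),
        ("district_B", 55.0, 32.0),
    ]

    district_totals = {}
    for district, area, density in parishes:
        w_sum, a_sum = district_totals.get(district, (0.0, 0.0))
        district_totals[district] = (w_sum + area * density, a_sum + area)

    for district, (weighted, area) in district_totals.items():
        print(district, "area-weighted mean density:", round(weighted / area, 2))
    ```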

  20. Spatial Variability of Sources and Mixing State of Atmospheric Particles in a Metropolitan Area.

    PubMed

    Ye, Qing; Gu, Peishi; Li, Hugh Z; Robinson, Ellis S; Lipsky, Eric; Kaltsonoudis, Christos; Lee, Alex K Y; Apte, Joshua S; Robinson, Allen L; Sullivan, Ryan C; Presto, Albert A; Donahue, Neil M

    2018-05-30

    Characterizing intracity variations of atmospheric particulate matter has mostly relied on fixed-site monitoring and on quantifying variability in terms of different bulk aerosol species. In this study, we performed ground-based mobile measurements using a single-particle mass spectrometer to study spatial patterns of source-specific particles and the evolution of particle mixing state in 21 areas of the metropolitan area of Pittsburgh, PA. We selected sampling areas based on traffic density and restaurant density, with each area ranging from 0.2 to 2 km². Organics dominate particle composition in all of the areas we sampled, while the sources of organics differ. The contribution of particles from traffic and restaurant cooking varies greatly on the neighborhood scale. We also investigate how primary and aged components in particles mix across the urban scale. Lastly, we quantify and map the particle mixing state for all areas we sampled and discuss the overall pattern of mixing state evolution and its implications. We find that upwind and downwind of the urban areas, particles are more internally mixed, while in the city center, particle mixing state shows large spatial heterogeneity that is mostly driven by emissions. This is, to our knowledge, the first study to perform fine-spatial-scale mapping of particle mixing state using ground-based mobile measurement and single-particle mass spectrometry.

  1. Deriving photometric redshifts using fuzzy archetypes and self-organizing maps - I. Methodology

    NASA Astrophysics Data System (ADS)

    Speagle, Joshua S.; Eisenstein, Daniel J.

    2017-07-01

    We propose a method to substantially increase the flexibility and power of template fitting-based photometric redshifts by transforming a large number of galaxy spectral templates into a corresponding collection of 'fuzzy archetypes' using a suitable set of perturbative priors designed to account for empirical variation in dust attenuation and emission-line strengths. To bypass widely separated degeneracies in parameter space (e.g. the redshift-reddening degeneracy), we train self-organizing maps (SOMs) on large 'model catalogues' generated from Monte Carlo sampling of our fuzzy archetypes to cluster the predicted observables in a topologically smooth fashion. Subsequent sampling over the SOM then allows full reconstruction of the relevant probability distribution functions (PDFs). This combined approach enables the multimodal exploration of known variation among galaxy spectral energy distributions with minimal modelling assumptions. We demonstrate the power of this approach to recover full redshift PDFs using discrete Markov chain Monte Carlo sampling methods combined with SOMs constructed from Large Synoptic Survey Telescope ugrizY and Euclid YJH mock photometry.
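
    A minimal, self-contained self-organizing map in NumPy is sketched below to illustrate the kind of topological clustering of model photometry the abstract describes; it is not the authors' pipeline, and the mock catalogue is random.

    ```python
    # Minimal self-organizing map (SOM) sketch in NumPy, illustrating the kind of
    # topological clustering of model photometry described above. This is not the
    # authors' pipeline; the mock "catalogue" is random and the map is tiny.
    import numpy as np

    rng = np.random.default_rng(0)
    catalogue = rng.normal(size=(5000, 6))          # mock ugrizY + YJH-like colours
    side, dim, n_iter = 10, catalogue.shape[1], 20000
    weights = rng.normal(size=(side, side, dim))
    grid = np.stack(np.meshgrid(np.arange(side), np.arange(side), indexing="ij"), axis=-1)

    for t in range(n_iter):
        lr = 0.5 * (1.0 - t / n_iter)               # decaying learning rate
        sigma = max(side / 2.0 * (1.0 - t / n_iter), 0.5)
        x = catalogue[rng.integers(len(catalogue))]
        dist = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dist), dist.shape)   # best-matching unit
        # Gaussian neighbourhood pulls nearby cells towards the training vector.
        h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights)

    # Map a new object to its best-matching cell.
    obj = rng.normal(size=dim)
    print(np.unravel_index(np.argmin(np.linalg.norm(weights - obj, axis=-1)), (side, side)))
    ```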

  2. PhyloGeoViz: a web-based program that visualizes genetic data on maps.

    PubMed

    Tsai, Yi-Hsin E

    2011-05-01

    The first step of many population genetic studies is the simple visualization of allele frequencies on a landscape. This basic data exploration can be challenging without proprietary software, and the manual plotting of data is cumbersome and unfeasible at large sample sizes. I present an open source, web-based program that plots any kind of frequency or count data as pie charts in Google Maps (Google Inc., Mountain View, CA). Pie polygons are then exportable to Google Earth (Google Inc.), a free Geographic Information Systems platform. Import of genetic data into Google Earth allows phylogeographers access to a wealth of spatial information layers integral to forming hypotheses and understanding patterns in the data. © 2010 Blackwell Publishing Ltd.

  3. VIS/NIR mapping of TOC and extent of organic soils in the Nørre Å valley

    NASA Astrophysics Data System (ADS)

    Knadel, M.; Greve, M. H.; Thomsen, A.

    2009-04-01

    Organic soils represent a substantial pool of carbon in Denmark. The need for carbon stock assessment calls for more rapid and effective mapping methods to be developed. The aim of this study was to compare traditional soil mapping with maps produced from the results of a mobile VIS/NIR system and to evaluate the ability to estimate TOC and map the area of organic soils. The Veris mobile VIS/NIR spectroscopy system was compared to traditional manual sampling. The system is designed for in-situ near-surface measurements of soil carbon content. It measures diffuse reflectance in the 350-2200 nm region. The system consists of two spectrophotometers mounted on a toolbar and pulled by a tractor. Optical measurements are made through a sapphire window at the bottom of the shank. The shank was pulled at a depth of 5-7 cm at a speed of 4-5 km/hr. 20-25 spectra per second with 8 nm resolution were acquired by the spectrometers. Measurements were made on 10-12 m spaced transects. The system also acquired soil electrical conductivity (EC) for two soil depths: shallow conductivity EC-SH (0-31 cm) and deep conductivity EC-DP (0-91 cm). The conductivity was recorded together with GPS coordinates and spectral data for further construction of the calibration models. Two maps of organic soils in the Nørre Å valley (Central Jutland) were generated: (i) based on a conventional 25 m grid with 162 sampling points and laboratory analysis of TOC, (ii) based on in-situ VIS/NIR measurements supported by chemometrics. Before regression analysis, spectral information was compressed by calculating principal components. Outliers were identified using a Mahalanobis distance criterion and removed. Clustering using a fuzzy c-means algorithm was conducted. Within each cluster a location with minimal spatial variability was selected, and a map of 15 representative sample locations was proposed. The interpolation of the spectra into a single spectrum was performed using a Gaussian kernel weighting function. Spectra obtained near a sampled location were averaged. The collected spectra were correlated to the TOC of the 15 representative samples using multivariate regression techniques (Unscrambler 9.7; Camo ASA, Oslo, Norway). Two types of calibrations were performed: using only spectra, and using spectra together with the auxiliary data (EC-SH and EC-DP). These calibration equations were computed using PLS regression with a segmented cross-validation method on centred data (raw spectral data, log 1/R). Six different spectral pre-treatments were tested: (1) raw spectra only, (2) Savitzky-Golay smoothing over 11 wavelength points, and transformation to the (3) 1st and (4) 2nd Savitzky-Golay derivative with a derivative interval of 21 wavelength points, (5) with or (6) without smoothing. The best treatment was considered to be the one with the lowest Root Mean Square Error of Prediction (RMSEP), the highest r2 between the VIS/NIR-predicted and measured values in the calibration model, and the lowest mean deviation of predicted TOC values. The best calibration model was obtained with the pre-treatment that included smoothing, the 2nd derivative and outlier removal. The two TOC maps were compared after interpolation using kriging. They showed a similar pattern in the TOC distribution. Despite the unfavourable field conditions, the VIS/NIR system performed well in both low and high TOC areas.
    Water content, which in places exceeded field capacity in the lower parts of the investigated field, did not seriously degrade the measurements. The present study represents the first attempt to apply the mobile Veris VIS/NIR system to the mapping of TOC in peat soils in Denmark. The results from this study show that a mobile VIS/NIR system can be applied to cost-effective TOC mapping of mineral and organic soils with highly varying water content. Key words: VIS/NIR spectroscopy, organic soils, TOC
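
    The pre-treatment and PLS calibration workflow described above can be sketched with standard scientific-Python tools (Savitzky-Golay filtering and PLS regression); the spectra, window sizes and component numbers below are placeholders rather than the study's exact settings.

    ```python
    # Sketch of the spectra pre-treatment and PLS calibration workflow described
    # above: Savitzky-Golay second derivative followed by PLS regression of TOC on
    # the treated spectra, evaluated by cross-validated RMSEP. Data, window sizes
    # and component numbers here are placeholders, not the study's settings.
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    spectra = rng.random((15, 500))          # 15 representative samples x wavelengths
    toc = rng.random(15) * 10                # measured TOC (%) for those samples

    # Second Savitzky-Golay derivative over a 21-point window (one of the tested
    # pre-treatments); smoothing alone would use deriv=0.
    treated = savgol_filter(spectra, window_length=21, polyorder=2, deriv=2, axis=1)

    pls = PLSRegression(n_components=5)
    predicted = cross_val_predict(pls, treated, toc, cv=5).ravel()
    rmsep = np.sqrt(np.mean((predicted - toc) ** 2))
    print("cross-validated RMSEP:", round(rmsep, 3))
    ```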

  4. Vegetation mapping of Nowitna National Wildlife Refuge, Alaska using Landsat MSS digital data

    USGS Publications Warehouse

    Talbot, S. S.; Markon, Carl J.

    1986-01-01

    A Landsat-derived vegetation map was prepared for Nowitna National Wildlife Refuge. The refuge lies within the middle boreal subzone of north central Alaska. Seven major vegetation classes and sixteen subclasses were recognized: forest (closed needleleaf, open needleleaf, needleleaf woodland, mixed, and broadleaf); broadleaf scrub (lowland, alluvial, subalpine); dwarf scrub (prostrate dwarf shrub tundra, dwarf shrub-graminoid tussock peatland); herbaceous (graminoid bog, marsh and meadow); scarcely vegetated areas (scarcely vegetated scree and floodplain); water (clear, turbid); and other areas (mountain shadow). The methodology employed a cluster-block technique. Sample areas were described based on a combination of helicopter-ground survey, aerial photointerpretation, and digital Landsat data. Major steps in the Landsat analysis involved preprocessing (geometric correction), derivation of statistical parameters for spectral classes, spectral class labeling of sample areas, preliminary classification of the entire study area using a maximum-likelihood algorithm, and final classification utilizing ancillary information such as digital elevation data. The final product is a 1:250,000-scale vegetation map representative of distinctive regional patterns and suitable for use in comprehensive conservation planning.
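
    A per-pixel Gaussian maximum-likelihood classifier of the kind referred to above is sketched below; class statistics are estimated from training pixels and each pixel is assigned to the class with the highest log-likelihood. The data are random placeholders, not the Landsat MSS scenes.

    ```python
    # Sketch of a per-pixel Gaussian maximum-likelihood classifier of the kind
    # referred to above: each spectral class is modelled by the mean vector and
    # covariance of its training pixels, and every pixel is assigned to the class
    # with the highest log-likelihood. Data below are random placeholders.
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(1)
    n_bands = 4
    training = {                                        # class -> training pixels
        "closed_needleleaf": rng.normal(50, 5, (200, n_bands)),
        "graminoid_bog":     rng.normal(80, 8, (200, n_bands)),
        "water":             rng.normal(20, 3, (200, n_bands)),
    }
    models = {c: multivariate_normal(p.mean(axis=0), np.cov(p, rowvar=False))
              for c, p in training.items()}

    pixels = rng.normal(55, 10, (1000, n_bands))        # image pixels to classify
    scores = np.column_stack([m.logpdf(pixels) for m in models.values()])
    labels = np.array(list(models))[np.argmax(scores, axis=1)]
    print(labels[:10])
    ```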

  5. The Global Genome Biodiversity Network (GGBN) Data Standard specification

    PubMed Central

    Droege, G.; Barker, K.; Seberg, O.; Coddington, J.; Benson, E.; Berendsohn, W. G.; Bunk, B.; Butler, C.; Cawsey, E. M.; Deck, J.; Döring, M.; Flemons, P.; Gemeinholzer, B.; Güntsch, A.; Hollowell, T.; Kelbert, P.; Kostadinov, I.; Kottmann, R.; Lawlor, R. T.; Lyal, C.; Mackenzie-Dodds, J.; Meyer, C.; Mulcahy, D.; Nussbeck, S. Y.; O'Tuama, É.; Orrell, T.; Petersen, G.; Robertson, T.; Söhngen, C.; Whitacre, J.; Wieczorek, J.; Yilmaz, P.; Zetzsche, H.; Zhang, Y.; Zhou, X.

    2016-01-01

    Genomic samples of non-model organisms are becoming increasingly important in a broad range of studies from developmental biology and biodiversity analyses to conservation. Genomic sample definition, description, quality, voucher information and metadata all need to be digitized and disseminated across scientific communities. This information needs to be concise and consistent in today's ever-expanding bioinformatics era, so that complementary data aggregators can easily map databases to one another. In order to facilitate the exchange of information on genomic samples and their derived data, the Global Genome Biodiversity Network (GGBN) Data Standard is intended to provide a platform based on a documented agreement to promote the efficient sharing and usage of genomic sample material and associated specimen information in a consistent way. The new data standard presented here builds upon existing standards commonly used within the community, extending them with the capability to exchange data on tissue, environmental and DNA samples as well as sequences. The GGBN Data Standard will reveal and democratize the hidden contents of biodiversity biobanks, for the convenience of everyone in the wider biobanking community. Technical tools exist for data providers to easily map their databases to the standard. Database URL: http://terms.tdwg.org/wiki/GGBN_Data_Standard PMID:27694206

  6. Characterizing the zenithal night sky brightness in large territories: how many samples per square kilometre are needed?

    NASA Astrophysics Data System (ADS)

    Bará, Salvador

    2018-01-01

    A recurring question arises when trying to characterize, by means of measurements or theoretical calculations, the zenithal night sky brightness throughout a large territory: how many samples per square kilometre are needed? The optimum sampling distance should allow reconstructing, with sufficient accuracy, the continuous zenithal brightness map across the whole region, whilst at the same time avoiding unnecessary and redundant oversampling. This paper attempts to provide some tentative answers to this issue, using two complementary tools: the luminance structure function and the Nyquist-Shannon spatial sampling theorem. The analysis of several regions of the world, based on the data from the New world atlas of artificial night sky brightness, suggests that, as a rule of thumb, about one measurement per square kilometre could be sufficient for determining the zenithal night sky brightness of artificial origin at any point in a region to within ±0.1 mag_V arcsec^-2 (in the root-mean-square sense) of its true value in the Johnson-Cousins V band. The exact reconstruction of the zenithal night sky brightness maps from samples taken at the Nyquist rate seems to be considerably more demanding.
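
    The luminance structure function mentioned above can be estimated empirically from scattered samples by binning squared brightness differences by separation; a hedged sketch with mock data follows.

    ```python
    # Sketch of an empirical luminance structure function of the kind used above:
    # for scattered zenithal brightness samples, average the squared brightness
    # differences as a function of the separation between sample points. The
    # sample data and bin width are illustrative only.
    import numpy as np

    rng = np.random.default_rng(2)
    xy = rng.uniform(0, 50, (300, 2))                        # sample positions (km)
    b = 21.0 - 0.02 * xy[:, 0] + rng.normal(0, 0.05, 300)    # mock brightness (mag/arcsec^2)

    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)   # pairwise distances
    db2 = (b[:, None] - b[None, :]) ** 2                           # squared differences
    iu = np.triu_indices(len(b), k=1)                              # unique pairs only

    bins = np.arange(0, 30, 2.0)                              # 2 km separation bins
    which = np.digitize(d[iu], bins)
    for k in range(1, len(bins)):
        sel = which == k
        if sel.any():
            print(f"{bins[k-1]:4.1f}-{bins[k]:4.1f} km: SF = {db2[iu][sel].mean():.4f}")
    ```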

  7. Applying time series Landsat data for vegetation change analysis in the Florida Everglades Water Conservation Area 2A during 1996-2016

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun; Smith, Molly; Lv, Jie; Fang, Chaoyang

    2017-05-01

    Mapping plant communities and documenting their changes is critical to the on-going Florida Everglades restoration project. In this study, a framework was designed to map dominant vegetation communities and inventory their changes in the Florida Everglades Water Conservation Area 2A (WCA-2A) using time series Landsat images spanning 1996-2016. The object-based change analysis technique was combined in the framework. A hybrid pixel/object-based change detection approach was developed to effectively collect training samples for historical images with sparse reference data. An object-based quantification approach was also developed to assess the expansion/reduction of a specific class such as cattail (an invasive species in the Everglades) from the object-based classifications of two dates of imagery. The study confirmed the results in the literature that cattail was largely expanded during 1996-2007. It also revealed that cattail expansion was constrained after 2007. Application of time series Landsat data is valuable to document vegetation changes for the WCA-2A impoundment. The digital techniques developed will benefit global wetland mapping and change analysis in general, and the Florida Everglades WCA-2A in particular.
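
    A simplified illustration of quantifying the expansion or reduction of a single class between two classification dates is given below; the study's approach is object-based, whereas this sketch only compares pixel counts on made-up label maps.

    ```python
    # Simplified illustration of quantifying the expansion or reduction of one
    # class (e.g. cattail) between classifications of two image dates. The study
    # uses an object-based approach; this sketch only compares pixel counts and
    # uses random labels as stand-ins for the two classified maps.
    import numpy as np

    rng = np.random.default_rng(3)
    classes = ["cattail", "sawgrass", "open_water"]
    map_1996 = rng.choice(classes, size=(500, 500), p=[0.2, 0.6, 0.2])
    map_2007 = rng.choice(classes, size=(500, 500), p=[0.35, 0.45, 0.2])

    pixel_area_ha = 30 * 30 / 10_000                  # Landsat 30 m pixel in hectares
    for c in classes:
        a1 = (map_1996 == c).sum() * pixel_area_ha
        a2 = (map_2007 == c).sum() * pixel_area_ha
        print(f"{c:11s}: {a1:9.1f} ha -> {a2:9.1f} ha ({a2 - a1:+.1f} ha)")
    ```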

  8. The IRHUM (Isotopic Reconstruction of Human Migration) database - bioavailable strontium isotope ratios for geochemical fingerprinting in France

    NASA Astrophysics Data System (ADS)

    Willmes, M.; McMorrow, L.; Kinsley, L.; Armstrong, R.; Aubert, M.; Eggins, S.; Falguères, C.; Maureille, B.; Moffat, I.; Grün, R.

    2014-03-01

    Strontium isotope ratios (87Sr / 86Sr) are a key geochemical tracer used in a wide range of fields including archaeology, ecology, food and forensic sciences. These applications are based on the principle that the Sr isotopic ratios of natural materials reflect the sources of strontium available during their formation. A major constraint for current studies is the lack of robust reference maps to evaluate the source of strontium isotope ratios measured in the samples. Here we provide a new data set of bioavailable Sr isotope ratios for the major geologic units of France, based on plant and soil samples (Pangaea data repository doi:10.1594/PANGAEA.819142). The IRHUM (Isotopic Reconstruction of Human Migration) database is a web platform to access, explore and map our data set. The database provides the spatial context and metadata for each sample, allowing the user to evaluate the suitability of the sample for their specific study. In addition, it allows users to upload and share their own data sets and data products, which will enhance collaboration across the different research fields. This article describes the sampling and analytical methods used to generate the data set and how to use and access the data set through the IRHUM database. Any interpretation of the isotope data set is outside the scope of this publication.

  9. Integrating in-situ, Landsat, and MODIS data for mapping in Southern African savannas: experiences of LCCS-based land-cover mapping in the Kalahari in Namibia.

    PubMed

    Hüttich, Christian; Herold, Martin; Strohbach, Ben J; Dech, Stefan

    2011-05-01

    Integrated ecosystem assessment initiatives are important steps towards a global biodiversity observing system. Reliable earth observation data are key information for tracking biodiversity change on various scales. Regarding the establishment of standardized environmental observation systems, a key question is: What can be observed on each scale and how can land cover information be transferred? In this study, a land cover map of a dry semi-arid savanna ecosystem in Namibia was obtained based on the UN LCCS, in-situ data, and MODIS and Landsat satellite imagery. In situ botanical relevé samples were used as baseline data for the definition of a standardized LCCS legend. A standard LCCS code for savanna vegetation types is introduced. An object-oriented segmentation of Landsat imagery was used as an intermediate stage for downscaling in-situ training data to the coarser MODIS resolution. MODIS time series metrics of the 2004/2005 growing season were used to classify Kalahari vegetation types using a tree-based ensemble classifier (Random Forest). The prevailing Kalahari vegetation type based on the LCCS was open broadleaved deciduous shrubland with an herbaceous layer, which differs from the class assignments of the global and regional land-cover maps. The separability analysis based on Bhattacharyya distance measurements, applied at two LCCS levels, indicated that the spectral separability of the annual MODIS time series features depends on the thematic detail of the classification scheme. The analysis of LCCS classifiers showed that life-form composition and soil conditions contribute significantly to the mapping accuracy. An overall accuracy of 92.48% was achieved. Woody plant associations proved to be most stable due to small omission and commission errors. The case study comprised a first suitability assessment of the LCCS classifier approach for a southern African savanna ecosystem.
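
    The classification step described above (a Random Forest applied to annual MODIS time-series metrics with LCCS labels) can be sketched as follows; features and labels are random placeholders rather than the Kalahari data.

    ```python
    # Sketch of the classification step described above: a Random Forest trained
    # on annual MODIS time-series metrics with LCCS class labels derived from the
    # in-situ releve samples (downscaled via the Landsat segmentation). Features
    # and labels here are random placeholders, not the actual Kalahari data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(4)
    X = rng.normal(size=(2000, 24))                 # e.g. phenology metrics per pixel
    y = rng.integers(0, 5, size=2000)               # LCCS class codes (placeholder)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
    print("overall accuracy:", round(accuracy_score(y_te, rf.predict(X_te)), 3))
    ```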

  10. Systematic analysis of transcription start sites in avian development.

    PubMed

    Lizio, Marina; Deviatiiarov, Ruslan; Nagai, Hiroki; Galan, Laura; Arner, Erik; Itoh, Masayoshi; Lassmann, Timo; Kasukawa, Takeya; Hasegawa, Akira; Ros, Marian A; Hayashizaki, Yoshihide; Carninci, Piero; Forrest, Alistair R R; Kawaji, Hideya; Gusev, Oleg; Sheng, Guojun

    2017-09-01

    Cap Analysis of Gene Expression (CAGE) in combination with single-molecule sequencing technology allows precision mapping of transcription start sites (TSSs) and genome-wide capture of promoter activities in differentiated and steady state cell populations. Much less is known about whether TSS profiling can characterize diverse and non-steady state cell populations, such as the approximately 400 transitory and heterogeneous cell types that arise during ontogeny of vertebrate animals. To gain such insight, we used the chick model and performed CAGE-based TSS analysis on embryonic samples covering the full 3-week developmental period. In total, 31,863 robust TSS peaks (>1 tag per million [TPM]) were mapped to the latest chicken genome assembly, of which 34% to 46% were active in any given developmental stage. ZENBU, a web-based, open-source platform, was used for interactive data exploration. TSSs of genes critical for lineage differentiation could be precisely mapped and their activities tracked throughout development, suggesting that non-steady state and heterogeneous cell populations are amenable to CAGE-based transcriptional analysis. Our study also uncovered a large set of extremely stable housekeeping TSSs and many novel stage-specific ones. We furthermore demonstrated that TSS mapping could expedite motif-based promoter analysis for regulatory modules associated with stage-specific and housekeeping genes. Finally, using Brachyury as an example, we provide evidence that precise TSS mapping in combination with Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR)-on technology enables us, for the first time, to efficiently target endogenous avian genes for transcriptional activation. Taken together, our results represent the first report of genome-wide TSS mapping in birds and the first systematic developmental TSS analysis in any amniote species (birds and mammals). By facilitating promoter-based molecular analysis and genetic manipulation, our work also underscores the value of avian models in unravelling the complex regulatory mechanism of cell lineage specification during amniote development.
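
    The ">1 tag per million" threshold mentioned above rests on a simple library-size normalisation; a small sketch with made-up counts follows.

    ```python
    # Sketch of the tags-per-million (TPM) normalisation behind the ">1 TPM"
    # threshold used above: raw CAGE tag counts per TSS peak are scaled by the
    # library size, and peaks below 1 TPM are dropped. Counts are made up.
    import numpy as np

    tag_counts = np.array([3, 120, 0, 45, 7, 1])      # raw tags per candidate peak
    library_size = 2_500_000                          # total mapped CAGE tags

    tpm = tag_counts / library_size * 1e6
    robust = tpm > 1.0
    print("TPM:", np.round(tpm, 2))
    print("robust peaks (>1 TPM):", int(robust.sum()))
    ```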

  11. Combining XCO2 Measurements Derived from SCIAMACHY and GOSAT for Potentially Generating Global CO2 Maps with High Spatiotemporal Resolution

    PubMed Central

    Wang, Tianxing; Shi, Jiancheng; Jing, Yingying; Zhao, Tianjie; Ji, Dabin; Xiong, Chuan

    2014-01-01

    Global warming induced by atmospheric CO2 has attracted increasing attention of researchers all over the world. Although space-based technology provides the ability to map atmospheric CO2 globally, the number of valid CO2 measurements is generally limited for certain instruments owing to the presence of clouds, which in turn constrain the studies of global CO2 sources and sinks. Thus, it is a potentially promising work to combine the currently available CO2 measurements. In this study, a strategy for fusing SCIAMACHY and GOSAT CO2 measurements is proposed by fully considering the CO2 global bias, averaging kernel, and spatiotemporal variations as well as the CO2 retrieval errors. Based on this method, a global CO2 map with certain UTC time can also be generated by employing the pattern of the CO2 daily cycle reflected by Carbon Tracker (CT) data. The results reveal that relative to GOSAT, the global spatial coverage of the combined CO2 map increased by 41.3% and 47.7% on a daily and monthly scale, respectively, and even higher when compared with that relative to SCIAMACHY. The findings in this paper prove the effectiveness of the combination method in supporting the generation of global full-coverage XCO2 maps with higher temporal and spatial sampling by jointly using these two space-based XCO2 datasets. PMID:25119468
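
    As a toy illustration of one ingredient of such a fusion, the sketch below merges two co-located retrievals by inverse-variance weighting of their retrieval errors; the paper's method additionally accounts for inter-sensor bias, averaging kernels and spatiotemporal variation, none of which is modelled here.

    ```python
    # Toy illustration of one ingredient of the fusion described above: merging
    # two XCO2 retrievals in the same grid cell by inverse-variance weighting of
    # their retrieval errors. The full method also handles inter-sensor bias,
    # averaging kernels and spatiotemporal variation, which are omitted here.
    def merge_xco2(x_scia, err_scia, x_gosat, err_gosat):
        w1, w2 = 1.0 / err_scia**2, 1.0 / err_gosat**2
        merged = (w1 * x_scia + w2 * x_gosat) / (w1 + w2)
        merged_err = (w1 + w2) ** -0.5
        return merged, merged_err

    # Hypothetical column-averaged CO2 values (ppm) and 1-sigma retrieval errors.
    print(merge_xco2(398.5, 2.0, 400.1, 1.2))
    ```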

  12. Soil organic carbon content assessment in a heterogeneous landscape: comparison of digital soil mapping and visible and near Infrared spectroscopy approaches

    NASA Astrophysics Data System (ADS)

    Michot, Didier; Fouad, Youssef; Pascal, Pichelin; Viaud, Valérie; Soltani, Inès; Walter, Christian

    2017-04-01

    The aims of this study are: i) to assess SOC content distribution according to the global soil map (GSM) project recommendations in a heterogeneous landscape; ii) to compare the prediction performance of digital soil mapping (DSM) and visible and near-infrared (Vis-NIR) spectroscopy approaches. The study area of 140 ha, located at Plancoët, surrounds the only mineral spring water source of Brittany (Western France). It is a hillock characterized by a heterogeneous landscape mosaic with different types of forest, permanent pastures and wetlands along a small coastal river. We acquired two independent datasets: j) 50 points selected using conditioned Latin hypercube sampling (cLHS); jj) 254 points corresponding to the GSM grid. Soil samples were collected in three layers (0-5, 20-25 and 40-50 cm) for both sampling strategies. SOC content was only measured in cLHS soil samples, while Vis-NIR spectra were measured on all the collected samples. For the DSM approach, a machine-learning algorithm (Cubist) was applied to the cLHS calibration data to build rule-based models linking soil carbon content in the different layers with environmental covariates derived from a digital elevation model, geological variables, land use data and existing large-scale soil maps. For the spectroscopy approach, we used two calibration datasets: k) the local cLHS; kk) a subset selected from the regional spectral database of Brittany after a PCA with hierarchical clustering analysis, spiked with local cLHS spectra. The PLS regression algorithm with leave-one-out cross-validation was performed for both calibration datasets. SOC contents for the 3 layers of the GSM grid were predicted using the different approaches and compared with each other. Their prediction performance was evaluated by the following parameters: R2, RMSE and RPD. Both approaches led to satisfactory predictions of SOC content, with an advantage for the spectral approach, particularly as regards the pertinence of the variation range.
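
    The evaluation statistics named above (R2, RMSE and RPD) can be computed as in the sketch below; RPD is taken as the standard deviation of the reference values divided by the RMSE, a common convention that the abstract does not state explicitly.

    ```python
    # Small helper computing the evaluation statistics named above. RPD is taken
    # here as the standard deviation of the reference SOC values divided by the
    # RMSE of prediction, the usual convention (an assumption; the authors do not
    # spell out their definition). Example numbers are made up.
    import numpy as np

    def r2_rmse_rpd(observed, predicted):
        observed = np.asarray(observed, float)
        predicted = np.asarray(predicted, float)
        resid = observed - predicted
        rmse = np.sqrt(np.mean(resid ** 2))
        r2 = 1.0 - np.sum(resid ** 2) / np.sum((observed - observed.mean()) ** 2)
        rpd = observed.std(ddof=1) / rmse
        return r2, rmse, rpd

    print(r2_rmse_rpd([2.1, 4.5, 8.0, 1.2, 3.3], [2.4, 4.1, 7.2, 1.5, 3.6]))
    ```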

  13. Geographic Mapping of Use and Knowledge of the Existence of Projects or ICT-Based Devices in Dementia Care.

    PubMed

    Zacharopoulou, Vassiliki; Zarakovitis, Dimitrios; Zacharopoulou, Georgia; Tsaloukidis, Nikolaos; Lazakidou, Athina

    2017-01-01

    The purpose of this study is to investigate the knowledge and use of Information and Communication Technologies (ICT) among community-based patients; the results were visualized on maps using Geographic Information Systems (GIS). Of the 779 participants, 37.4% responded that they were aware of ICTs available for dementia, and only 9.2% reported using individual devices. Among those with a university education, 94.7% had knowledge of ICT and 47.4% actually used it, in contrast to patients with lower levels of education. In conclusion, based on a small sample of the Greek population, the knowledge and use of ICTs are still limited and mainly concern patients of high socioeconomic status.

  14. Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Donald J (Inventor)

    2011-01-01

    A computer-implemented process is disclosed and claimed for simultaneously measuring the velocity of terahertz electromagnetic radiation in a dielectric material sample without prior knowledge of the sample's thickness, and for measuring the thickness of a material sample without prior knowledge of the velocity of the terahertz radiation in that sample. Utilizing interactive software, the process evaluates the sample at a plurality of locations for microstructural and thickness variations and maps these variations by location. A thin sheet of dielectric material may be placed on top of the sample to create a dielectric mismatch. The approximate focal point of the radiation source (transceiver) is determined initially to ensure good measurements.

  15. Analyzing gene expression profiles in dilated cardiomyopathy via bioinformatics methods.

    PubMed

    Wang, Liming; Zhu, L; Luan, R; Wang, L; Fu, J; Wang, X; Sui, L

    2016-10-10

    Dilated cardiomyopathy (DCM) is characterized by ventricular dilatation, and it is a common cause of heart failure and cardiac transplantation. This study aimed to explore potential DCM-related genes and their underlying regulatory mechanism using methods of bioinformatics. The gene expression profiles of GSE3586 were downloaded from Gene Expression Omnibus database, including 15 normal samples and 13 DCM samples. The differentially expressed genes (DEGs) were identified between normal and DCM samples using Limma package in R language. Pathway enrichment analysis of DEGs was then performed. Meanwhile, the potential transcription factors (TFs) and microRNAs (miRNAs) of these DEGs were predicted based on their binding sequences. In addition, DEGs were mapped to the cMap database to find the potential small molecule drugs. A total of 4777 genes were identified as DEGs by comparing gene expression profiles between DCM and control samples. DEGs were significantly enriched in 26 pathways, such as lymphocyte TarBase pathway and androgen receptor signaling pathway. Furthermore, potential TFs (SP1, LEF1, and NFAT) were identified, as well as potential miRNAs (miR-9, miR-200 family, and miR-30 family). Additionally, small molecules like isoflupredone and trihexyphenidyl were found to be potential therapeutic drugs for DCM. The identified DEGs (PRSS12 and FOXG1), potential TFs, as well as potential miRNAs, might be involved in DCM.

  16. Analyzing gene expression profiles in dilated cardiomyopathy via bioinformatics methods

    PubMed Central

    Wang, Liming; Zhu, L.; Luan, R.; Wang, L.; Fu, J.; Wang, X.; Sui, L.

    2016-01-01

    Dilated cardiomyopathy (DCM) is characterized by ventricular dilatation, and it is a common cause of heart failure and cardiac transplantation. This study aimed to explore potential DCM-related genes and their underlying regulatory mechanism using methods of bioinformatics. The gene expression profiles of GSE3586 were downloaded from Gene Expression Omnibus database, including 15 normal samples and 13 DCM samples. The differentially expressed genes (DEGs) were identified between normal and DCM samples using Limma package in R language. Pathway enrichment analysis of DEGs was then performed. Meanwhile, the potential transcription factors (TFs) and microRNAs (miRNAs) of these DEGs were predicted based on their binding sequences. In addition, DEGs were mapped to the cMap database to find the potential small molecule drugs. A total of 4777 genes were identified as DEGs by comparing gene expression profiles between DCM and control samples. DEGs were significantly enriched in 26 pathways, such as lymphocyte TarBase pathway and androgen receptor signaling pathway. Furthermore, potential TFs (SP1, LEF1, and NFAT) were identified, as well as potential miRNAs (miR-9, miR-200 family, and miR-30 family). Additionally, small molecules like isoflupredone and trihexyphenidyl were found to be potential therapeutic drugs for DCM. The identified DEGs (PRSS12 and FOXG1), potential TFs, as well as potential miRNAs, might be involved in DCM. PMID:27737314

  17. Mercury in Slovenian soils: High, medium and low sample density geochemical maps

    NASA Astrophysics Data System (ADS)

    Gosar, Mateja; Šajn, Robert; Teršič, Tamara

    2017-04-01

    A regional geochemical survey was conducted over the whole territory of Slovenia (20,273 km2), and high, medium and low sample density surveys were compared. The high-density dataset consisted of the regional geochemical data set supplemented by local high-density sampling data (irregular grid, n=2835). Medium-density soil sampling was performed on a 5 x 5 km grid (n=817), and the low-density geochemical survey was conducted on a 25 x 25 km grid (n=54). Models of the mercury distribution in Slovenian soils were constructed from each of the three data sets. A distinct Hg anomaly in the western part of Slovenia is evident in all three models. It is a consequence of 500 years of mining and ore processing at the Idrija mine, the second largest mercury mine in the world. The determined mercury concentrations revealed an important difference between the western and eastern parts of the country: for the medium-scale geochemical mapping, the median value for western Slovenia (0.151 mg/kg) is almost twice the median value for eastern Slovenia (0.083 mg/kg). Moreover, the Hg median for the western part of Slovenia exceeds the Hg median for European soil by a factor of 4 (Gosar et al., 2016). Comparison of these sample density surveys showed that high sampling density allows the identification and characterization of anthropogenic influences on a local scale, while medium- and low-density sampling reveals general trends in the mercury spatial distribution but is not appropriate for identifying local contamination in industrial regions and urban areas. The resolution of the generated pattern is best when the high-density survey on a regional scale is supplemented with geochemical data from high-density surveys on a local scale. References: Gosar, M., Šajn, R., Teršič, T. Distribution pattern of mercury in the Slovenian soil: geochemical mapping based on multiple geochemical datasets. Journal of Geochemical Exploration, 2016, 167, 38-48.

  18. Converting Parkinson-Specific Scores into Health State Utilities to Assess Cost-Utility Analysis.

    PubMed

    Chen, Gang; Garcia-Gordillo, Miguel A; Collado-Mateo, Daniel; Del Pozo-Cruz, Borja; Adsuar, José C; Cordero-Ferrera, José Manuel; Abellán-Perpiñán, José María; Sánchez-Martínez, Fernando Ignacio

    2018-06-07

    The aim of this study was to compare the Parkinson's Disease Questionnaire-8 (PDQ-8) with three multi-attribute utility (MAU) instruments (EQ-5D-3L, EQ-5D-5L, and 15D) and to develop mapping algorithms that could be used to transform PDQ-8 scores into MAU scores. A cross-sectional study was conducted. A final sample of 228 evaluable patients was included in the analyses. Sociodemographic and clinical data were also collected. Two EQ-5D questionnaires were scored using Spanish tariffs. Two models and three statistical techniques were used to estimate each model in the direct mapping framework for all three MAU instruments, including the most widely used ordinary least squares (OLS), the robust MM-estimator, and the generalized linear model (GLM). For both EQ-5D-3L and EQ-5D-5L, indirect response mapping based on an ordered logit model was also conducted. Three goodness-of-fit tests were employed to compare the models: the mean absolute error (MAE), the root-mean-square error (RMSE), and the intra-class correlation coefficient (ICC) between the predicted and observed utilities. Health state utility scores ranged from 0.61 (EQ-5D-3L) to 0.74 (15D). The mean PDQ-8 score was 27.51. The correlation between overall PDQ-8 score and each MAU instrument ranged from -0.729 (EQ-5D-5L) to -0.752 (EQ-5D-3L). A mapping algorithm based on PDQ-8 items had better performance than using the overall score. For the two EQ-5D questionnaires, in general, the indirect mapping approach had comparable or even better performance than direct mapping based on MAE. Mapping algorithms developed in this study enable the estimation of utility values from the PDQ-8. The indirect mapping equations reported for two EQ-5D questionnaires will further facilitate the calculation of EQ-5D utility scores using other country-specific tariffs.
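
    The "direct mapping" idea described above can be illustrated with a small, hedged sketch: an ordinary least squares model predicting utility scores from the eight PDQ-8 item scores, judged by MAE and RMSE. The data, item coding, and train/test split below are synthetic placeholders, not the study's data or exact procedure.

    ```python
    # Schematic direct mapping: OLS regression from PDQ-8 item scores to
    # health state utilities, evaluated by MAE and RMSE on a hold-out set.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    pdq8 = rng.integers(0, 5, size=(228, 8))                        # item scores 0-4 (toy data)
    utility = 1.0 - 0.04 * pdq8.sum(axis=1) / 8 - rng.normal(0, 0.05, 228)

    X_tr, X_te, y_tr, y_te = train_test_split(pdq8, utility, random_state=0)
    ols = LinearRegression().fit(X_tr, y_tr)
    pred = ols.predict(X_te)

    mae = np.mean(np.abs(y_te - pred))
    rmse = np.sqrt(np.mean((y_te - pred) ** 2))
    print(f"MAE={mae:.3f}  RMSE={rmse:.3f}")
    ```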

  19. Soil and plant contamination with Mycobacterium avium subsp. paratuberculosis after exposure to naturally contaminated mouflon feces.

    PubMed

    Pribylova, Radka; Slana, Iva; Kaevska, Marija; Lamka, Jiri; Babak, Vladimir; Jandak, Jiri; Pavlik, Ivo

    2011-05-01

    The aim of this study was to demonstrate the persistence of Mycobacterium avium subsp. paratuberculosis (MAP) in soil and colonization of different plant parts after deliberate exposure to mouflon feces naturally contaminated with different amounts of MAP. Samples of aerial parts of plants, their roots, and the soil below the roots were collected after 15 weeks and examined using IS900 real-time quantitative PCR (qPCR) and cultivation. Although the presence of viable MAP cells was not demonstrated, almost all samples were found to be positive using qPCR. MAP IS900 was not only found in the upper green parts, but also in the roots and soil samples (from 1.00 × 10⁰ to 6.43 × 10³). The level of soil and plant contamination was influenced mainly by moisture, clay content, and the depth from which the samples were collected, rather than by the initial concentration of MAP in the feces at the beginning of the experiment.

  20. Shelf life of ready to use peeled shrimps as affected by thymol essential oil and modified atmosphere packaging.

    PubMed

    Mastromatteo, Marianna; Danza, Alessandra; Conte, Amalia; Muratore, Giuseppe; Del Nobile, Matteo Alessandro

    2010-12-15

    In this work the influence of different packaging strategies on the shelf life of ready-to-use peeled shrimps was investigated. First, the effectiveness of the coating (Coat) and of the active coating loaded with different concentrations of thymol (Coat-500, Coat-1000, and Coat-1500) against quality loss of the product packaged in air was addressed; afterwards, the thymol concentration that had shown the best performance was used in combination with MAP (5% O2; 95% CO2). Microbial cell loads of the main spoilage microorganisms, pH and sensory quality were monitored during refrigerated storage. Results of the first step suggested that the coating alone did not affect microbial growth. A slight antimicrobial effect was obtained when the coating was loaded with thymol, and a concentration dependence was also observed. Moreover, the active coating was effective in minimizing the sensory quality loss of the product; this was particularly true at the lowest thymol concentration. In the second step, the thymol concentration (1000 ppm) that struck the best balance between microbial and sensory quality was chosen in combination with MAP. As expected, MAP significantly affected the growth of the mesophilic bacteria. In particular, a cell load reduction of about 2 log cycles was obtained for the samples under MAP with respect to those in air. Moreover, MAP inhibited the growth of Pseudomonas spp. and hydrogen sulphide-producing bacteria. MAP alone was not able to improve the shelf life of the uncoated samples; in fact, no significant difference was observed between the control samples packaged in air and under MAP. In contrast, the use of the coating under MAP prolonged the shelf life by about 6 days with respect to the same samples packaged in air. Moreover, when MAP was used in combination with thymol, a further shelf-life extension with respect to the samples packaged in air was observed; in particular, a shelf life of about 14 days was obtained for the active coating under MAP, compared to about 5 days for the same samples in air. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. Spatial cluster analysis of nanoscopically mapped serotonin receptors for classification of fixed brain tissue

    NASA Astrophysics Data System (ADS)

    Sams, Michael; Silye, Rene; Göhring, Janett; Muresan, Leila; Schilcher, Kurt; Jacak, Jaroslaw

    2014-01-01

    We present a cluster spatial analysis method using nanoscopic dSTORM images to determine changes in protein cluster distributions within brain tissue. Such methods are suitable to investigate human brain tissue and will help to achieve a deeper understanding of brain disease along with aiding drug development. Human brain tissue samples are usually treated postmortem via standard fixation protocols, which are established in clinical laboratories. Therefore, our localization microscopy-based method was adapted to characterize protein density and protein cluster localization in samples fixed using different protocols followed by common fluorescent immunohistochemistry techniques. The localization microscopy allows nanoscopic mapping of serotonin 5-HT1A receptor groups within a two-dimensional image of a brain tissue slice. These nanoscopically mapped proteins can be confined to clusters by applying the proposed statistical spatial analysis. Selected features of such clusters were subsequently used to characterize and classify the tissue. Samples were obtained from different types of patients, fixed with different preparation methods, and finally stored in a human tissue bank. To verify the proposed method, samples of a cryopreserved healthy brain have been compared with epitope-retrieved and paraffin-fixed tissues. Furthermore, samples of healthy brain tissues were compared with data obtained from patients suffering from mental illnesses (e.g., major depressive disorder). Our work demonstrates the applicability of localization microscopy and image analysis methods for comparison and classification of human brain tissues at a nanoscopic level. Furthermore, the presented workflow marks a unique technological advance in the characterization of protein distributions in brain tissue sections.

  2. Effect of starter cultures and packaging methods on amino acid profile and eating quality characteristics of pork ham.

    PubMed

    Gogoi, Protiva; Borpuzari, R N; Borpuzari, T; Hazarika, R A; Bora, J R

    2015-08-01

    Wet-cured pork hams were inoculated with a mixed starter culture comprising Lactobacillus acidophilus and Micrococcus varians M483 at a dose level of 10⁶ cfu/g, and uninoculated hams served as controls. The amino acid profiles of the treated and control hams, stored at 4°C under MAP and VP and evaluated on the 60th day of storage, revealed that treated hams liberated higher concentrations of free amino acids, except for proline and methionine, which were found in higher concentration (P < 0.01) in the MAP control samples. The MAP control samples liberated glutamic acid (85.65 ± 1.40 ppm), cystine (21.56 ± 1.14 ppm) and tyrosine (16.63 ± 1.94 ppm), whereas the treated samples did not release these amino acids. The VP control samples likewise liberated cystine (6.98 ± 1.36 ppm) and arginine (42.70 ± 2.78 ppm), but the treated VP hams did not. The VP hams had higher concentrations (P < 0.01) of free proline, glycine, alanine, valine, methionine, isoleucine, phenylalanine, lysine and histidine than the MAP samples. Colour analysis of the hams using the CIE Lab colour system revealed that the treated samples had significantly higher L*, a* and b* values. The L* and a* values were higher under MAP than under VP, while the b* values were higher in the VP samples than in the MAP samples. Neither the bacterial cultures nor the packaging system influenced the textural properties of the ham. Hams inoculated with starter cultures were rated superior (P < 0.05) in terms of their sensory properties, and hams packaged under MAP were rated superior (P < 0.05) to those packaged under VP in terms of appearance, colour, taste, tenderness, flavour, juiciness and overall acceptability.

  3. Influence of storage duration and processing on chromatic attributes and flavonoid content of moxa floss.

    PubMed

    Lim, Min Yee; Huang, Jian; Zhao, Bai-xiao; Zou, Hui-qin; Yan, Yong-hong

    2016-01-01

    Moxibustion is an important traditional Chinese medicine therapy using heat from ignited moxa floss for disease treatment. The purpose of the present study is to establish a reproducible method to assess the color of moxa floss, discriminate samples based on chromatic coordinates, and explore the relationship between chromatic coordinates and total flavonoid content (TFC). Moxa floss samples of different storage years and production ratios were obtained from a moxa production factory in Henan Province, China. Chromatic coordinates (L*, a* and b*) were analyzed with an ultraviolet-visible spectrophotometer, and the chroma (C*) and hue angle (h°) values were calculated. TFC was determined by a colorimetric method. Data were analyzed with correlation analysis and principal component analysis (PCA). Significant differences in the chromatic values and TFC were observed among samples of different storage years and production ratios. Samples with higher production ratios displayed higher chromatic characteristics and lower TFC, and samples with longer storage contained higher TFC. Preliminary separation of moxa floss by production ratio was obtained by means of color feature maps developed using L*-a* or L*-b* as coordinates. PCA allowed separation of the samples by storage year and production ratio based on their chromatic characteristics and TFC. The use of a colorimetric technique and CIELAB coordinates coupled with chemometrics can be a practical and objective way to discriminate moxa floss of different storage years and production ratios. The resulting color feature maps could be used as a model for classifying the color grading of moxa floss.

  4. An expanded maize gene expression atlas based on RNA sequencing and its use to explore root development

    DOE PAGES

    Stelpflug, Scott C.; Sekhon, Rajandeep S.; Vaillancourt, Brieanne; ...

    2015-12-30

    Comprehensive and systematic transcriptome profiling provides valuable insight into biological and developmental processes that occur throughout the life cycle of a plant. We have enhanced our previously published microarray-based gene atlas of maize (Zea mays L.) inbred B73 to now include 79 distinct replicated samples that have been interrogated using RNA sequencing (RNA-seq). The current version of the atlas includes 50 original array-based gene atlas samples, a time-course of 12 stalk and leaf samples postflowering, and an additional set of 17 samples from the maize seedling and adult root system. The entire dataset contains 4.6 billion mapped reads, with an average of 20.5 million mapped reads per biological replicate, allowing for detection of genes with lower transcript abundance. As the new root samples represent key additions to the previously examined tissues, we highlight insights into the root transcriptome, which is represented by 28,894 (73.2%) annotated genes in maize. Additionally, we observed remarkable expression differences across both the longitudinal (four zones) and radial gradients (cortical parenchyma and stele) of the primary root supported by fourfold differential expression of 9353 and 4728 genes, respectively. Among the latter were 1110 genes that encode transcription factors, some of which are orthologs of previously characterized transcription factors known to regulate root development in Arabidopsis thaliana (L.) Heynh., while most are novel, and represent attractive targets for reverse genetics approaches to determine their roles in this important organ. As a result, this comprehensive transcriptome dataset is a powerful tool toward understanding maize development, physiology, and phenotypic diversity.

  5. An expanded maize gene expression atlas based on RNA sequencing and its use to explore root development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stelpflug, Scott C.; Sekhon, Rajandeep S.; Vaillancourt, Brieanne

    Comprehensive and systematic transcriptome profiling provides valuable insight into biological and developmental processes that occur throughout the life cycle of a plant. We have enhanced our previously published microarray-based gene atlas of maize (Zea mays L.) inbred B73 to now include 79 distinct replicated samples that have been interrogated using RNA sequencing (RNA-seq). The current version of the atlas includes 50 original array-based gene atlas samples, a time-course of 12 stalk and leaf samples postflowering, and an additional set of 17 samples from the maize seedling and adult root system. The entire dataset contains 4.6 billion mapped reads, with an average of 20.5 million mapped reads per biological replicate, allowing for detection of genes with lower transcript abundance. As the new root samples represent key additions to the previously examined tissues, we highlight insights into the root transcriptome, which is represented by 28,894 (73.2%) annotated genes in maize. Additionally, we observed remarkable expression differences across both the longitudinal (four zones) and radial gradients (cortical parenchyma and stele) of the primary root supported by fourfold differential expression of 9353 and 4728 genes, respectively. Among the latter were 1110 genes that encode transcription factors, some of which are orthologs of previously characterized transcription factors known to regulate root development in Arabidopsis thaliana (L.) Heynh., while most are novel, and represent attractive targets for reverse genetics approaches to determine their roles in this important organ. As a result, this comprehensive transcriptome dataset is a powerful tool toward understanding maize development, physiology, and phenotypic diversity.

  6. International Maps | Geospatial Data Science | NREL

    Science.gov Websites

    International Maps. This map collection provides examples of how geographic information system modeling is used in international resource analysis; the images provided on the site are samples of these maps.

  7. Updating the planetary time scale: focus on Mars

    USGS Publications Warehouse

    Tanaka, Kenneth L.; Quantin-Nataf, Cathy

    2013-01-01

    Formal stratigraphic systems have been developed for the surface materials of the Moon, Mars, Mercury, and the Galilean satellite Ganymede. These systems are based on geologic mapping, which establishes relative ages of surfaces delineated by superposition, morphology, impact crater densities, and other relations and features. Referent units selected from the mapping determine time-stratigraphic bases and/or representative materials characteristic of events and periods for definition of chronologic units. Absolute ages of these units in some cases can be estimated using crater size-frequency data. For the Moon, the chronologic units and cratering record are calibrated by radiometric ages measured from samples collected from the lunar surface. Model ages for other cratered planetary surfaces are constructed primarily by estimating cratering rates relative to that of the Moon. Other cratered bodies with estimated surface ages include Venus and the Galilean satellites of Jupiter. New global geologic mapping and crater dating studies of Mars are resulting in more accurate and detailed reconstructions of its geologic history.

  8. Maximum entropy estimation of a Benzene contaminated plume using ecotoxicological assays.

    PubMed

    Wahyudi, Agung; Bartzke, Mariana; Küster, Eberhard; Bogaert, Patrick

    2013-01-01

    Ecotoxicological bioassays, e.g. based on Danio rerio teratogenicity (DarT) or acute luminescence inhibition with Vibrio fischeri, could potentially lead to significant benefits for detecting on-site contamination on a qualitative or semi-quantitative basis. The aim was to use the observed effects of two ecotoxicological assays to estimate the extent of a benzene groundwater contamination plume. We used a Maximum Entropy (MaxEnt) method to rebuild a bivariate probability table that links the observed toxicity from the bioassays with benzene concentrations. Compared with direct mapping of the contamination plume from groundwater samples, the MaxEnt concentration map exhibits on average slightly higher concentrations, though the global pattern is close to it. This suggests that MaxEnt is a valuable method for building a relationship between quantitative data, e.g. contaminant concentrations, and more qualitative or indirect measurements in a spatial mapping framework, which is especially useful when a clear quantitative relation is not at hand. Copyright © 2012 Elsevier Ltd. All rights reserved.
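
    To make the MaxEnt step concrete, the following schematic sketch (not the authors' implementation) maximizes the entropy of a small bivariate toxicity-by-concentration table subject to assumed marginals and one assumed cross-moment constraint, using scipy's SLSQP solver; all class counts and constraint values are illustrative.

    ```python
    # Maximum-entropy reconstruction of a bivariate probability table
    # p(toxicity class, concentration class) under linear constraints.
    import numpy as np
    from scipy.optimize import minimize

    n_tox, n_conc = 3, 4
    p_tox = np.array([0.5, 0.3, 0.2])          # assumed toxicity-class marginal
    p_conc = np.array([0.4, 0.3, 0.2, 0.1])    # assumed concentration-class marginal
    target_xy = 1.1                            # assumed cross-moment E[tox_index * conc_index]

    ti, ci = np.meshgrid(np.arange(n_tox), np.arange(n_conc), indexing="ij")

    def neg_entropy(p):
        p = p.reshape(n_tox, n_conc)
        return np.sum(p * np.log(p + 1e-12))   # minimise negative entropy

    constraints = (
        # row marginals (these also force the table to sum to 1)
        [{"type": "eq", "fun": lambda p, i=i: p.reshape(n_tox, n_conc)[i].sum() - p_tox[i]}
         for i in range(n_tox)]
        # column marginals; the last one is implied and therefore dropped
        + [{"type": "eq", "fun": lambda p, j=j: p.reshape(n_tox, n_conc)[:, j].sum() - p_conc[j]}
           for j in range(n_conc - 1)]
        # one cross-moment constraint linking toxicity and concentration
        + [{"type": "eq", "fun": lambda p: np.sum(p.reshape(n_tox, n_conc) * ti * ci) - target_xy}]
    )

    p0 = np.full(n_tox * n_conc, 1.0 / (n_tox * n_conc))
    res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * p0.size,
                   constraints=constraints, method="SLSQP")
    print(res.x.reshape(n_tox, n_conc).round(3))
    ```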

  9. Puzzle Imaging: Using Large-Scale Dimensionality Reduction Algorithms for Localization

    PubMed Central

    Glaser, Joshua I.; Zamft, Bradley M.; Church, George M.; Kording, Konrad P.

    2015-01-01

    Current high-resolution imaging techniques require an intact sample that preserves spatial relationships. We here present a novel approach, “puzzle imaging,” that allows imaging a spatially scrambled sample. This technique takes many spatially disordered samples, and then pieces them back together using local properties embedded within the sample. We show that puzzle imaging can efficiently produce high-resolution images using dimensionality reduction algorithms. We demonstrate the theoretical capabilities of puzzle imaging in three biological scenarios, showing that (1) relatively precise 3-dimensional brain imaging is possible; (2) the physical structure of a neural network can often be recovered based only on the neural connectivity matrix; and (3) a chemical map could be reproduced using bacteria with chemosensitive DNA and conjugative transfer. The ability to reconstruct scrambled images promises to enable imaging based on DNA sequencing of homogenized tissue samples. PMID:26192446
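
    The second scenario above, recovering the physical layout of a network from its connectivity matrix alone, can be illustrated with a generic dimensionality-reduction sketch; the snippet below uses scikit-learn's spectral embedding on a synthetic, distance-dependent adjacency matrix and is only an analogy to, not a reproduction of, the paper's algorithms.

    ```python
    # Recover an approximate 2-D layout from connectivity alone via
    # spectral embedding; true positions are hidden and only used to
    # generate a plausible adjacency matrix.
    import numpy as np
    from sklearn.manifold import SpectralEmbedding

    rng = np.random.default_rng(2)
    coords = rng.uniform(0, 1, size=(300, 2))            # hidden true positions
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    adjacency = (rng.uniform(size=d.shape) < np.exp(-10 * d)).astype(float)
    adjacency = np.maximum(adjacency, adjacency.T)       # symmetric "connectivity"

    embedding = SpectralEmbedding(n_components=2, affinity="precomputed")
    recovered = embedding.fit_transform(adjacency)       # layout up to rotation/scale
    print(recovered[:5])
    ```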

  10. Spectrally based bathymetric mapping of a dynamic, sand‐bedded channel: Niobrara River, Nebraska, USA

    USGS Publications Warehouse

    Dilbone, Elizabeth; Legleiter, Carl; Alexander, Jason S.; McElroy, Brandon

    2018-01-01

    Methods for spectrally based mapping of river bathymetry have been developed and tested in clear‐flowing, gravel‐bed channels, with limited application to turbid, sand‐bed rivers. This study used hyperspectral images and field surveys from the dynamic, sandy Niobrara River to evaluate three depth retrieval methods. The first regression‐based approach, optimal band ratio analysis (OBRA), paired in situ depth measurements with image pixel values to estimate depth. The second approach used ground‐based field spectra to calibrate an OBRA relationship. The third technique, image‐to‐depth quantile transformation (IDQT), estimated depth by linking the cumulative distribution function (CDF) of depth to the CDF of an image‐derived variable. OBRA yielded the lowest depth retrieval mean error (0.005 m) and highest observed versus predicted R2 (0.817). Although misalignment between field and image data did not compromise the performance of OBRA in this study, poor georeferencing could limit regression‐based approaches such as OBRA in dynamic, sand‐bedded rivers. Field spectroscopy‐based depth maps exhibited a mean error with a slight shallow bias (0.068 m) but provided reliable estimates for most of the study reach. IDQT had a strong deep bias but provided informative relative depth maps. Overprediction of depth by IDQT highlights the need for an unbiased sampling strategy to define the depth CDF. Although each of the techniques we tested demonstrated potential to provide accurate depth estimates in sand‐bed rivers, each method also was subject to certain constraints and limitations.
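
    A minimal sketch of the OBRA step described above: for every band pair, depth is regressed against X = ln(b_i/b_j) and the pair with the highest R2 is retained. The `obra` helper, the synthetic spectra, and the survey depths below are illustrative assumptions, not the study's data, so the fitted values are meaningless and only demonstrate the mechanics.

    ```python
    # Optimal band ratio analysis (OBRA): exhaustively test band-pair
    # log ratios as depth predictors and keep the best linear fit.
    import numpy as np
    from itertools import combinations

    def obra(spectra, depth):
        """spectra: (n_points, n_bands) pixel values at survey points;
        depth: (n_points,) field-measured depths."""
        best = (None, -np.inf, None)
        for i, j in combinations(range(spectra.shape[1]), 2):
            x = np.log(spectra[:, i] / spectra[:, j])
            slope, intercept = np.polyfit(x, depth, 1)
            pred = slope * x + intercept
            r2 = 1 - np.sum((depth - pred) ** 2) / np.sum((depth - depth.mean()) ** 2)
            if r2 > best[1]:
                best = ((i, j), r2, (slope, intercept))
        return best

    rng = np.random.default_rng(3)
    spectra = rng.uniform(0.01, 0.2, size=(100, 48))   # hypothetical hyperspectral samples
    depth = rng.uniform(0.1, 1.5, size=100)            # hypothetical survey depths
    pair, r2, coeffs = obra(spectra, depth)
    print(pair, round(r2, 3))
    ```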

  11. Quantification of map similarity to magnetic pre-screening for heavy metal pollution assessment in top soil

    NASA Astrophysics Data System (ADS)

    Cao, L.; Appel, E.; Roesler, W.; Ojha, G.

    2013-12-01

    From numerous published results, the link between magnetic concentration and heavy metal (HM) concentrations is well established. However, bivariate correlation analysis does not imply causality, and extreme values, which often appear in magnetic data, can lead to seemingly excellent correlations. It seems clear that selecting sites for chemical sampling based on magnetic pre-screening can deliver a superior result for outlining HM pollution, but so far this conclusion has been drawn only from qualitative evaluation. In this study, we use map similarity comparison techniques to demonstrate the usefulness of a combined magnetic-chemical approach quantitatively. We chose available data around the 'Schwarze Pumpe', a large coal-burning power plant complex located in eastern Germany. The site is suitable for a demonstration study because the soil in its surroundings is heavily polluted by fly ash, the natural magnetic background is very low, and magnetic investigations can be carried out in undisturbed forest soil. Magnetic susceptibility (MS) of the topsoil was measured with a Bartington MS2D surface sensor at 180 locations and with an SM400 downhole device in ~0.5 m deep vertical sections at 90 locations. Cores from the 90 downhole sites were also studied for HM analysis. From these results, 85 sites could be used to determine a spatial distribution map of HM contents reflecting the 'True' pollution situation. Different sets of 30 sites were chosen by arbitrary selection from these 85 sample sites (we refer to four such maps here: S1-4). Additionally, we determined a 'Targeted' map from 30 sites selected on the basis of the pre-screening MS results. The map comparison process is as follows: (1) categorization of all absolute values into five classes using the Natural Breaks classification method; (2) Delaunay triangulation to connect the sample locations in the x-y plane; (3) determination of a distribution map of triangular planes with the classified values as the Z coordinate; (4) calculation of normal vectors for each triangular plane; (5) transformation of the TINs into raster data, assigning the same normal vector to all grid points inside the same TIN; (6) calculation of the root-mean-square of the angles between the normal vectors of two maps at the same grid points. Additionally, we applied the kappa statistics method to assess map similarities and developed a fuzzy-set approach. Combining both methods, using the indices Khisto, Klocation, Kappa, and Kfuzzy, yields a broad comparison system that allows determining both the degree of similarity and the spatial distribution of similarity between two maps. The results indicate that the similarity between the 'Targeted' and 'True' distribution maps is higher than that between 'S1-4' and the 'True' map. This shows that magnetic pre-screening can provide a reliable basis for the targeted selection of chemical sampling sites, demonstrating the superior efficiency of a combined magnetic-chemical site assessment compared with a traditional chemical-only approach.
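
    As a concrete example of one of the similarity indices mentioned above, the snippet below computes Cohen's kappa between a synthetic 'True' classified map and a noisy 'Targeted' counterpart on the same grid; the histogram-, location- and fuzzy-based variants used in the study are not reproduced here.

    ```python
    # Cohen's kappa between two classified maps on a common raster grid.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(4)
    true_map = rng.integers(1, 6, size=(60, 60))          # 5 classes ("True" map)
    noise = rng.integers(1, 6, size=true_map.shape)
    targeted = np.where(rng.uniform(size=true_map.shape) < 0.8, true_map, noise)

    kappa = cohen_kappa_score(true_map.ravel(), targeted.ravel())
    print(f"Kappa = {kappa:.2f}")
    ```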

  12. Satellite Remote Sensing of Cropland Characteristics in 30m Resolution: The First North American Continental-Scale Classification on High Performance Computing Platforms

    NASA Astrophysics Data System (ADS)

    Massey, Richard

    Cropland characteristics and accurate maps of their spatial distribution are required to develop strategies for global food security through continental-scale assessments and agricultural land use policies. North America is the major producer and exporter of coarse grains, wheat, and other crops. While cropland characteristics such as crop types are available at country scale in North America, continental-scale cropland products at sufficiently fine resolution, such as 30 m, are lacking. Additionally, automated, open, and rapid methods that map cropland characteristics over large areas without the need for ground samples are needed on efficient high-performance computing platforms for timely and long-term cropland monitoring. In this study, I developed novel, automated, and open methods to map cropland extent, crop intensity, and crop types for the North American continent using large remote sensing datasets on high-performance computing platforms. First, a novel method was developed to fuse pixel-based classification of continental-scale Landsat data, using the Random Forest algorithm available on the Google Earth Engine cloud computing platform, with an object-based classification approach, recursive hierarchical segmentation (RHSeg), to map cropland extent at continental scale. Using this fusion method, a continental-scale cropland extent map for North America at 30 m spatial resolution for the nominal year 2010 was produced. In this map, the total cropland area for North America was estimated at 275.2 million hectares (Mha). The map was assessed for accuracy using randomly distributed samples derived from the United States Department of Agriculture (USDA) cropland data layer (CDL), the Agriculture and Agri-Food Canada (AAFC) annual crop inventory (ACI), the Servicio de Informacion Agroalimentaria y Pesquera (SIAP) agricultural boundaries for Mexico, and photo-interpretation of high-resolution imagery. The overall accuracy of the map is 93.4%, with a producer's accuracy for the crop class of 85.4% and a user's accuracy of 74.5% across the continent. Sub-country statistics, including state-wise and county-wise cropland statistics derived from this map, compared well in regression models, with R2 > 0.84. Secondly, an automated phenological pattern matching (PPM) method to efficiently map cropping intensity was developed, and a continental-scale cropping intensity map for North America at 250 m spatial resolution for 2010 is presented. In this map, the total areas of single crop, double crop, continuous crop, and fallow were estimated at 123.5 Mha, 11.1 Mha, 64.0 Mha, and 83.4 Mha, respectively. The map was assessed using limited country-level reference datasets derived from the USDA cropland data layer and the AAFC annual crop inventory, with overall accuracies of 79.8% and 80.2%, respectively. Third, two novel and automated decision tree classification approaches to map crop types across the conterminous United States (U.S.) using MODIS 250 m resolution data were developed: 1) a generalized classification and 2) a year-specific classification. Both approaches use similarities and dissimilarities in crop type phenology derived from NDVI time-series data. Annual crop type maps were produced for 8 major crop types in the United States using the generalized classification approach for 2001-2014 and the year-specific approach for 2008, 2010, 2011 and 2012. The year-specific classification had overall accuracies greater than 78%, while the generalized classifier had accuracies greater than 75% for the conterminous U.S. for 2008, 2010, 2011, and 2012. The generalized classifier enables automated and routine crop type mapping without repeated and expensive ground sample collection year after year, with overall accuracies > 70% across all independent years. Taken together, these cropland products of extent, cropping intensity, and crop types are of significant benefit for agricultural and water use planning and monitoring, and for formulating policies that address global and North American food security.
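
    The pixel-based classification step can be sketched generically (here with scikit-learn rather than Google Earth Engine, and with synthetic features standing in for the Landsat or NDVI composites): a Random Forest is trained on labelled pixels and assessed by overall accuracy.

    ```python
    # Generic Random Forest pixel classification sketch: cropland vs.
    # non-cropland from a stack of spectral-temporal features (toy data).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(5)
    features = rng.normal(size=(5000, 12))                       # e.g. monthly NDVI composites
    labels = (features[:, 3:8].mean(axis=1) > 0.2).astype(int)   # 1 = cropland (toy labelling rule)

    X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("overall accuracy:", accuracy_score(y_te, rf.predict(X_te)))
    ```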

  13. Fine mapping on chromosome 13q32-34 and brain expression analysis implicates MYO16 in schizophrenia.

    PubMed

    Rodriguez-Murillo, Laura; Xu, Bin; Roos, J Louw; Abecasis, Gonçalo R; Gogos, Joseph A; Karayiorgou, Maria

    2014-03-01

    We previously reported linkage of schizophrenia and schizoaffective disorder to 13q32-34 in the European descent Afrikaner population from South Africa. The nature of genetic variation underlying linkage peaks in psychiatric disorders remains largely unknown and both rare and common variants may be contributing. Here, we examine the contribution of common variants located under the 13q32-34 linkage region. We used densely spaced SNPs to fine map the linkage peak region using both a discovery sample of 415 families and a meta-analysis incorporating two additional replication family samples. In a second phase of the study, we use one family-based data set with 237 families and independent case-control data sets for fine mapping of the common variant association signal using HapMap SNPs. We report a significant association with a genetic variant (rs9583277) within the gene encoding for the myosin heavy-chain Myr 8 (MYO16), which has been implicated in neuronal phosphoinositide 3-kinase signaling. Follow-up analysis of HapMap variation within MYO16 in a second set of Afrikaner families and additional case-control data sets of European descent highlighted a region across introns 2-6 as the most likely region to harbor common MYO16 risk variants. Expression analysis revealed a significant increase in the level of MYO16 expression in the brains of schizophrenia patients. Our results suggest that common variation within MYO16 may contribute to the genetic liability to schizophrenia.

  14. Stratigraphy and Tectonics of Southeastern Serenitatis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maxwell, T. A.

    1976-01-01

    Results of investigations of returned Apollo 17 samples, and Apollo 15 and 17 photographs have provided a broad data base on which to interpret the southeastern Serenitatis region of the moon. Although many of the pre-Apollo 17 mission interpretations remain valid, detailed mapping of this region and correlation with earth-based and orbital remote-sensing data have resulted in a revision of the local mare stratigraphy.

  15. Time-of-flight camera via a single-pixel correlation image sensor

    NASA Astrophysics Data System (ADS)

    Mao, Tianyi; Chen, Qian; He, Weiji; Dai, Huidong; Ye, Ling; Gu, Guohua

    2018-04-01

    A time-of-flight imager based on a single-pixel correlation image sensor is proposed for noise-free depth map acquisition in the presence of ambient light. A digital micro-mirror device and a time-modulated IR laser provide spatial and temporal illumination of the unknown object. Compressed sensing and the ‘four bucket principle’ are combined to reconstruct the depth map from a sequence of measurements at a low sampling rate. A second-order correlation transform is also introduced to reduce the noise from the detector itself and from direct ambient light. Computer simulations are presented to validate the computational models and the improvement in the reconstructions.
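
    A hedged illustration of the ‘four bucket principle’ for continuous-wave time-of-flight ranging: four correlation samples taken 90° apart yield the phase delay, and hence the depth, of the modulated return. The modulation frequency and correlation values below are toy numbers; the paper's compressed-sensing reconstruction and second-order correlation transform are not reproduced.

    ```python
    # Four-bucket phase estimation for a single CW time-of-flight pixel.
    import numpy as np

    f_mod = 20e6                    # assumed modulation frequency (Hz)
    c = 3e8                         # speed of light (m/s)
    C0, C1, C2, C3 = 0.90, 0.35, 0.10, 0.65   # toy correlation measurements

    phase = np.arctan2(C3 - C1, C0 - C2) % (2 * np.pi)
    depth = c * phase / (4 * np.pi * f_mod)   # half the round-trip path
    print(f"depth ~ {depth:.2f} m")
    ```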

  16. Quality of water from bedrock aquifers in the South Carolina Piedmont

    USGS Publications Warehouse

    Patterson, G.G.; Padgett, G.C.

    1984-01-01

    The geographic distributions of 12 common water-quality parameters of ground water from bedrock aquifers in the Piedmont physiographic province of South Carolina are presented in a series of maps. The maps are based on analyses by the South Carolina Department of Health and Environmental Control of water samples taken during the period 1972 to 1982 from 442 public and private wells developed in the Piedmont. In general, alkalinity, hardness, and concentrations of sodium, magnesium, and chloride were higher in the Carolina Slate Belt than they were in the other geologic belts of the Piedmont. (USGS)

  17. An object-based approach for tree species extraction from digital orthophoto maps

    NASA Astrophysics Data System (ADS)

    Jamil, Akhtar; Bayram, Bulent

    2018-05-01

    Tree segmentation is an active and ongoing research area in the field of photogrammetry and remote sensing. It is made particularly challenging by both intra-class and inter-class similarities among tree species. In this study, we exploited various statistical features for the extraction of hazelnut trees from 1:5000 scale digital orthophoto maps. Initially, the non-vegetation areas were eliminated using the traditional normalized difference vegetation index (NDVI), followed by application of mean shift segmentation to transform the pixels into meaningful homogeneous objects. In order to eliminate false positives, morphological opening and closing were applied to the candidate objects. A number of heuristics, based for example on shadow and bounding-box aspect ratios, were also derived to eliminate unwanted candidates before the classification stage. Finally, a knowledge-based decision tree was constructed to distinguish the hazelnut trees from the remaining objects, which include man-made objects and other types of vegetation. We evaluated the proposed methodology on 10 sample orthophoto maps obtained from Giresun province in Turkey. Manually digitized hazelnut tree boundaries were taken as reference data for accuracy assessment. Both the manually digitized and the segmented tree borders were converted into binary images and the differences were calculated. According to the obtained results, the proposed methodology achieved an overall accuracy of more than 85% for all sample images.
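
    The first two steps of the pipeline above, NDVI masking followed by mean shift segmentation, can be sketched as follows; the NDVI threshold of 0.3, the band arrays, and the mean shift bandwidth are illustrative assumptions rather than the study's parameters.

    ```python
    # NDVI-based vegetation masking followed by mean shift clustering of
    # the remaining pixels into homogeneous objects (toy band data).
    import numpy as np
    from sklearn.cluster import MeanShift

    rng = np.random.default_rng(6)
    red = rng.uniform(0, 1, size=(80, 80))
    green = rng.uniform(0, 1, size=(80, 80))
    nir = rng.uniform(0, 1, size=(80, 80))

    ndvi = (nir - red) / (nir + red + 1e-9)
    veg_mask = ndvi > 0.3                                  # assumed NDVI threshold

    features = np.stack([red[veg_mask], green[veg_mask], nir[veg_mask]], axis=1)
    segments = MeanShift(bandwidth=0.2).fit_predict(features)
    print("vegetation pixels:", veg_mask.sum(), "segments:", len(set(segments)))
    ```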

  18. Optical coherence tomography detection of shear wave propagation in inhomogeneous tissue equivalent phantoms and ex-vivo carotid artery samples

    PubMed Central

    Razani, Marjan; Luk, Timothy W.H.; Mariampillai, Adrian; Siegler, Peter; Kiehl, Tim-Rasmus; Kolios, Michael C.; Yang, Victor X.D.

    2014-01-01

    In this work, we explored the potential of measuring shear wave propagation using optical coherence elastography (OCE) in an inhomogeneous phantom and carotid artery samples based on a swept-source optical coherence tomography (OCT) system. Shear waves were generated using a piezoelectric transducer transmitting sine-wave bursts of 400 μs duration, applying acoustic radiation force (ARF) to inhomogeneous phantoms and carotid artery samples, synchronized with a swept-source OCT (SS-OCT) imaging system. The phantoms were composed of gelatin and titanium dioxide whereas the carotid artery samples were embedded in gel. Differential OCT phase maps, measured with and without the ARF, detected the microscopic displacement generated by shear wave propagation in these phantoms and samples of different stiffness. We present the technique for calculating tissue mechanical properties by propagating shear waves in inhomogeneous tissue equivalent phantoms and carotid artery samples using the ARF of an ultrasound transducer, and measuring the shear wave speed and its associated properties in the different layers with OCT phase maps. This method lays the foundation for future in-vitro and in-vivo studies of mechanical property measurements of biological tissues such as vascular tissues, where normal and pathological structures may exhibit significant contrast in the shear modulus. PMID:24688822
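
    For context, the link between the measured shear wave speed and the mechanical property of interest is usually taken, under a linear-elastic assumption, as G = ρ·c_s²; the toy calculation below uses an assumed tissue density and a placeholder wave speed, not values from the study.

    ```python
    # Shear modulus from shear wave speed under a linear-elastic assumption;
    # for nearly incompressible soft tissue, Young's modulus E ~ 3G.
    rho = 1000.0        # assumed tissue density, kg/m^3
    c_s = 2.5           # placeholder shear wave speed from the OCT phase maps, m/s

    G = rho * c_s ** 2          # shear modulus, Pa
    E = 3 * G                   # approximate Young's modulus (Poisson ratio ~ 0.5)
    print(f"G = {G/1e3:.1f} kPa, E = {E/1e3:.1f} kPa")
    ```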

  19. A land-cover map for South and Southeast Asia derived from SPOT-VEGETATION data

    USGS Publications Warehouse

    Stibig, H.-J.; Belward, A.S.; Roy, P.S.; Rosalina-Wasrin, U.; Agrawal, S.; Joshi, P.K.; Beuchle, R.; Fritz, S.; Mubareka, S.; Giri, C.

    2007-01-01

    Aim: Our aim was to produce a uniform ‘regional’ land-cover map of South and Southeast Asia based on ‘sub-regional’ mapping results generated in the context of the Global Land Cover 2000 project. Location: The ‘region’ of tropical and sub-tropical South and Southeast Asia stretches from the Himalayas and the southern border of China in the north, to Sri Lanka and Indonesia in the south, and from Pakistan in the west to the islands of New Guinea in the far east. Methods: The regional land-cover map is based on sub-regional digital mapping results derived from SPOT-VEGETATION satellite data for the years 1998–2000. Image processing, digital classification and thematic mapping were performed separately for the three sub-regions of South Asia, continental Southeast Asia, and insular Southeast Asia. Landsat TM images, field data and existing national maps served as references. We used the FAO (Food and Agriculture Organization) Land Cover Classification System (LCCS) for coding the sub-regional land-cover classes and for aggregating the latter to a uniform regional legend. A validation was performed based on a systematic grid of sample points, referring to visual interpretation from high-resolution Landsat imagery. Regional land-cover area estimates were obtained and compared with FAO statistics for the categories ‘forest’ and ‘cropland’. Results: The regional map displays 26 land-cover classes. The LCCS coding provided a standardized class description, independent from local class names; it also allowed us to maintain the link to the detailed sub-regional land-cover classes. The validation of the map displayed a mapping accuracy of 72% for the dominant classes of ‘forest’ and ‘cropland’; regional area estimates for these classes correspond reasonably well to existing regional statistics. Main conclusions: The land-cover map of South and Southeast Asia provides a synoptic view of the distribution of land cover of tropical and sub-tropical Asia, and it delivers reasonable thematic detail and quantitative estimates of the main land-cover proportions. The map may therefore serve for regional stratification or modelling of vegetation cover, but could also support the implementation of forest policies, watershed management or conservation strategies at regional scales.

  20. Detection and Spatial Mapping of Mercury Contamination in Water Samples Using a Smart-Phone

    PubMed Central

    2014-01-01

    Detection of environmental contamination such as trace-level toxic heavy metal ions mostly relies on bulky and costly analytical instruments. However, a considerable global need exists for portable, rapid, specific, sensitive, and cost-effective detection techniques that can be used in resource-limited and field settings. Here we introduce a smart-phone-based hand-held platform that allows the quantification of mercury(II) ions in water samples with parts per billion (ppb) level of sensitivity. For this task, we created an integrated opto-mechanical attachment to the built-in camera module of a smart-phone to digitally quantify mercury concentration using a plasmonic gold nanoparticle (Au NP) and aptamer based colorimetric transmission assay that is implemented in disposable test tubes. With this smart-phone attachment that weighs <40 g, we quantified mercury(II) ion concentration in water samples by using a two-color ratiometric method employing light-emitting diodes (LEDs) at 523 and 625 nm, where a custom-developed smart application was utilized to process each acquired transmission image on the same phone to achieve a limit of detection of ∼3.5 ppb. Using this smart-phone-based detection platform, we generated a mercury contamination map by measuring water samples at over 50 locations in California (USA), taken from city tap water sources, rivers, lakes, and beaches. With its cost-effective design, field-portability, and wireless data connectivity, this sensitive and specific heavy metal detection platform running on cellphones could be rather useful for distributed sensing, tracking, and sharing of water contamination information as a function of both space and time. PMID:24437470

  1. Geochemical and Pb isotopic characterization of soil, groundwater, human hair, and corn samples from the Domizio Flegreo and Agro Aversano area (Campania region, Italy)

    USGS Publications Warehouse

    Rezza, Carmela; Albanese, Stefano; Ayuso, Robert A.; Lima, Annamaria; Sorvari, Jaana; De Vivo, Benedetto

    2018-01-01

    A geochemical survey was carried out to investigate metal contamination in the Domizio Littoral and Agro Aversano area (Southern Italy) by means of soil, groundwater, human hair and corn samples. Pb isotope ratios were also determined to identify the sources of metals. Specifically, the investigation focused on topsoils (n = 1064), groundwater (n = 26), human hair (n = 24) and corn samples (n = 13). Topsoils had been sampled and analysed in a previous study for 53 elements (including potentially harmful ones), determined by ICP-MS after dissolution in aqua regia. Groundwater was analysed for 72 elements by ICP-MS and by ICP-ES. Samples of human hair were prepared and analysed for 16 elements by ICP-MS. Dried corn samples collected at several farms were also analysed for 53 elements by ICP-MS. The isotopic ratios of 206Pb/207Pb and 208Pb/207Pb in selected topsoil (n = 24), groundwater (n = 9), human hair (n = 9) and corn (n = 4) samples were analysed from both eluates and residues to investigate possible anthropogenic contamination and geogenic contributions. All data were processed and mapped with ArcGIS software to produce interpolated maps and contamination factor maps of potentially harmful elements, in accordance with Italian Environmental Law (Legislative Decree 152/06). Results show that soil sampling sites are characterized by As, Cd, Co, Cr, Cu, Hg, Pb, Se, and Zn contents exceeding the action limits established for residential land use (RAL) and, in some cases, also the action limits for industrial land use (IAL) as established by Legislative Decree 152/06. A map of contamination factors and a map showing the degrees of contamination indicate that the areas in the municipalities of Acerra, Casoria and Giugliano have been affected by considerable anthropogenic pollution. To interpret the isotopic data and roughly estimate the proportion of Pb from anthropogenic sources, we broadly defined possible natural and anthropogenic Pb end-member fields based on literature data. For example, we summarized data for Vesuvius and Campi Flegrei volcanic rocks, gasoline, and aerosol deposits. Lead isotope data show mixing between geogenic and anthropogenic sources. Topsoil, groundwater, human hair and corn samples show a greater contribution from geogenic sources like the Yellow Tuff (from Campi Flegrei) and volcanic rocks from Mt. Vesuvius. Aerosols, fly ash and gasoline (anthropogenic sources) have also been contributors. In detail, 46% of the topsoil residues, 96% of topsoil leachates, 88% of groundwater, 90% of human hair, and 25% of corn samples indicate that more than 50% of the lead in this area can be ascribed to anthropogenic activity.

  2. Photometric Modeling of Simulated Surface-Resolved Bennu Images

    NASA Astrophysics Data System (ADS)

    Golish, D.; DellaGiustina, D. N.; Clark, B.; Li, J. Y.; Zou, X. D.; Bennett, C. A.; Lauretta, D. S.

    2017-12-01

    The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) is a NASA mission to study and return a sample of asteroid (101955) Bennu. Imaging data from the mission will be used to develop empirical surface-resolved photometric models of Bennu at a series of wavelengths. These models will be used to photometrically correct panchromatic and color base maps of Bennu, compensating for variations due to shadows and photometric angle differences, thereby minimizing seams in mosaicked images. Well-corrected mosaics are critical to the generation of a global hazard map and a global 1064-nm reflectance map which predicts LIDAR response. These data products directly feed into the selection of a site from which to safely acquire a sample. We also require photometric correction for the creation of color ratio maps of Bennu. Color ratios maps provide insight into the composition and geological history of the surface and allow for comparison to other Solar System small bodies. In advance of OSIRIS-REx's arrival at Bennu, we use simulated images to judge the efficacy of both the photometric modeling software and the mission observation plan. Our simulation software is based on USGS's Integrated Software for Imagers and Spectrometers (ISIS) and uses a synthetic shape model, a camera model, and an empirical photometric model to generate simulated images. This approach gives us the flexibility to create simulated images of Bennu based on analog surfaces from other small Solar System bodies and to test our modeling software under those conditions. Our photometric modeling software fits image data to several conventional empirical photometric models and produces the best fit model parameters. The process is largely automated, which is crucial to the efficient production of data products during proximity operations. The software also produces several metrics on the quality of the observations themselves, such as surface coverage and the completeness of the data set for evaluating the phase and disk functions of the surface. Application of this software to simulated mission data has revealed limitations in the initial mission design, which has fed back into the planning process. The entire photometric pipeline further serves as an exercise of planned activities for proximity operations.
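
    As a rough illustration of the kind of empirical photometric correction described above (one of several candidate models, not necessarily the one the mission pipeline selects), the sketch below normalizes a measured radiance factor to a reference illumination geometry using the Lommel-Seeliger disk function.

    ```python
    # Lommel-Seeliger photometric normalisation of a single pixel to a
    # reference incidence/emission geometry (toy values throughout).
    import numpy as np

    def lommel_seeliger(inc, emi):
        """Disk function D = cos(i) / (cos(i) + cos(e)); angles in degrees."""
        ci, ce = np.cos(np.radians(inc)), np.cos(np.radians(emi))
        return ci / (ci + ce)

    def correct(radiance, inc, emi, inc_ref=30.0, emi_ref=0.0):
        """Scale measured radiance to the assumed reference geometry."""
        return radiance * lommel_seeliger(inc_ref, emi_ref) / lommel_seeliger(inc, emi)

    pixel = correct(radiance=0.031, inc=55.0, emi=20.0)
    print(f"corrected reflectance ~ {pixel:.4f}")
    ```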

  3. A Geosynchronous Synthetic Aperture Provides for Disaster Management, Measurement of Soil Moisture, and Measurement of Earth-Surface Dynamics

    NASA Technical Reports Server (NTRS)

    Madsen, Soren; Komar, George (Technical Monitor)

    2001-01-01

    A GEO-based Synthetic Aperture Radar (SAR) could provide daily coverage of basically all of North and South America with very good temporal coverage within the mapped area. This affords a key capability to disaster management, tectonic mapping and modeling, and vegetation mapping. The fine temporal sampling makes this system particularly useful for disaster management of flooding, hurricanes, and earthquakes. By using a fairly long wavelength, changing water boundaries caused by storms or flooding could be monitored in near real-time. This coverage would also provide revolutionary capabilities in the field of radar interferometry, including the capability to study the interferometric signature immediately before and after an earthquake, thus allowing unprecedented studies of Earth-surface dynamics. Preeruptive volcano dynamics could be studied as well as pre-seismic deformation, one of the most controversial and elusive aspects of earthquakes. Interferometric correlation would similarly allow near real-time mapping of surface changes caused by volcanic eruptions, mud slides, or fires. Finally, a GEO SAR provides an optimum configuration for soil moisture measurement that requires a high temporal sampling rate (1-2 days) with a moderate spatial resolution (1 km or better). From a technological point of view, the largest challenges involved in developing a geosynchronous SAR capability relate to the very large slant range distance from the radar to the mapped area. This leads to requirements for large power or alternatively very large antenna, the ability to steer the mapping area to the left and right of the satellite, and control of the elevation and azimuth angles. The weight of this system is estimated to be 2750 kg and it would require 20 kW of DC-power. Such a system would provide up to a 600 km ground swath in a strip-mapping mode and 4000 km dual-sided mapping in a scan-SAR mode.

  4. Cooperative Autonomous Observation of Volcanic Environments with sUAS

    NASA Astrophysics Data System (ADS)

    Ravela, S.

    2015-12-01

    The Cooperative Autonomous Observing System Project (CAOS) at the MIT Earth Signals and Systems Group has developed methodology and systems for dynamically mapping coherent fluids such as plumes using small unmanned aircraft systems (sUAS). In the CAOS approach, two classes of sUAS, one remote the other in-situ, implement a dynamic data-driven mapping system by closing the loop between Modeling, Estimation, Sampling, Planning and Control (MESPAC). The continually gathered measurements are assimilated to produce maps/analyses which also guide the sUAS network to adaptively resample the environment. Rather than scan the volume in fixed Eulerian or Lagrangian flight plans, the adaptive nature of the sampling process enables objectives for efficiency and resilience to be incorporated. Modeling includes realtime prediction using two types of reduced models, one based on nowcasting remote observations of plume tracer using scale-cascaded alignment, and another based on dynamically-deformable EOF/POD developed for coherent structures. Ensemble-based Information-theoretic machine learning approaches are used for the highly non-linear/non-Gaussian state/parameter estimation, and for planning. Control of the sUAS is based on model reference control coupled with hierarchical PID. MESPAC is implemented in part on a SkyCandy platform, and implements an airborne mesh that provides instantaneous situational awareness and redundant communication to an operating fleet. SkyCandy is deployed on Itzamna Aero's I9X/W UAS with low-cost sensors, and is currently being used to study the Popocatepetl volcano. Results suggest that operational communities can deploy low-cost sUAS to systematically monitor whilst optimizing for efficiency/maximizing resilience. The CAOS methodology is applicable to many other environments where coherent structures are present in the background. More information can be found at caos.mit.edu.

  5. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    PubMed

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and support proper function. With the increasing use of proteins as biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on these fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical abstract: This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.

  6. Synergistic effect of spice extracts and modified atmospheric packaging towards non-thermal preservation of chicken meat under refrigerated storage.

    PubMed

    Sivarajan, M; Lalithapriya, U; Mariajenita, Peter; Vajiha, B Aafrin; Harini, K; Madhushalini, D; Sukumar, M

    2017-08-01

    This study investigates an integrated approach of spice extracts and modified atmospheric packaging (MAP) for chicken meat preservation. Specifically, extracts from clove (CL) and cinnamon (CI), individually and in combination (3% w/w), along with MAP (30% CO2/70% N2 and 10% O2/30% CO2/60% N2), were used to increase the shelf life of fresh chicken meat stored at 4°C. The parameters evaluated as shelf-life indicators were microbiological (total viable count, Pseudomonas spp., lactic acid bacteria (LAB), and Enterobacteriaceae), physicochemical (pH, lipid oxidation, color changes), and sensory attributes. Microbial populations were reduced by 2.5 to 5 log cfu/g, with the greatest reduction achieved by the blend of clove and cinnamon extracts with 30% CO2/70% N2 MAP. Thiobarbituric acid values for all treated and MAP-packed samples remained lower than 1 mg malondialdehyde (MDA)/kg throughout the 24-day storage period. pH values varied from 5.5 for the fresh sample on day 0 to 7.11 (day 25) for the combined-extract-treated and MAP-packaged samples. The color parameters L*, a*, and b* were well maintained under oxygen-deficient MAP. Finally, sensory investigation demonstrated that the 3% combined clove and cinnamon extract conferred acceptable sensory attributes to the samples on day 24 of storage. These results indicate that the shelf life of chicken meat was extended from 4 days to 24 days when samples were coated with 3% combined clove and cinnamon extract and packaged under MAP without oxygen. The pooled extracts, together with MAP, extended the usability and preserved the organoleptic qualities of chicken meat. © 2017 Poultry Science Association Inc.

  7. Assessment of Scanning Tunneling Spectroscopy Modes Inspecting Electron Confinement in Surface-Confined Supramolecular Networks

    PubMed Central

    Krenner, Wolfgang; Kühne, Dirk; Klappenberger, Florian; Barth, Johannes V.

    2013-01-01

    Scanning tunneling spectroscopy (STS) enables the local, energy-resolved investigation of a sample's surface density of states (DOS) by measuring the differential conductance (dI/dV), which is approximately proportional to the DOS. It is popular to examine the electronic structure of elementary samples by acquiring dI/dV maps under constant-current conditions. Here we demonstrate the intricacy of STS mapping of samples exhibiting a strong corrugation originating from changes in electronic density and local work function. The confinement of the Ag(111) surface state by a porous organic network is studied with maps obtained under constant-current (CC) as well as open-feedback-loop (OFL) conditions. We show how the CC maps deviate markedly from the physically more meaningful OFL maps. By applying a renormalization procedure to the OFL data we can mimic the spurious effects of the CC mode and thereby rationalize the physical effects evoking the artefacts in the CC maps. PMID:23503526

  8. Mapping Sub-Saharan African Agriculture in High-Resolution Satellite Imagery with Computer Vision & Machine Learning

    NASA Astrophysics Data System (ADS)

    Debats, Stephanie Renee

    Smallholder farms dominate in many parts of the world, including Sub-Saharan Africa. These systems are characterized by small, heterogeneous, and often indistinct field patterns, requiring a specialized methodology to map agricultural landcover. In this thesis, we developed a benchmark labeled data set of high-resolution satellite imagery of agricultural fields in South Africa. We presented a new approach to mapping agricultural fields, based on efficient extraction of a vast set of simple, highly correlated, and interdependent features, followed by a random forest classifier. The algorithm achieved similarly high performance across agricultural types, including spectrally indistinct smallholder fields, and demonstrated the ability to generalize across large geographic areas. In sensitivity analyses, we determined that multi-temporal images provided greater performance gains than the addition of multi-spectral bands. We also demonstrated how active learning can be incorporated into the algorithm to create smaller, more efficient training data sets, which reduced computational resources, minimized the need for humans to hand-label data, and boosted performance. We designed a patch-based uncertainty metric to drive the active learning framework, based on the regular grid of a crowdsourcing platform, and demonstrated how subject matter experts can be replaced with fleets of crowdsourcing workers. Our active learning algorithm achieved performance similar to that of an algorithm trained with randomly selected data, but with 62% fewer data samples. This thesis furthers the goal of providing accurate agricultural landcover maps at a scale that is relevant for the dominant smallholder class. Accurate maps are crucial for monitoring and promoting agricultural production. Furthermore, improved agricultural landcover maps will aid a host of other applications, including landcover change assessments, cadastral surveys to strengthen smallholder land rights, and constraints for crop modeling and famine prediction.
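
    A minimal sketch of the uncertainty-driven active learning loop described above, assuming generic synthetic feature vectors in place of the thesis's per-patch image features and using a random forest's class-probability margin as the uncertainty score (function and variable names are illustrative):

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier

      # Active-learning sketch: train on a small labeled pool, then repeatedly
      # request labels for the patches the random forest is least certain about.
      X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
      rng = np.random.default_rng(0)

      labeled = list(rng.choice(len(X), size=50, replace=False))
      unlabeled = [i for i in range(len(X)) if i not in set(labeled)]

      for round_ in range(5):
          clf = RandomForestClassifier(n_estimators=200, random_state=0)
          clf.fit(X[labeled], y[labeled])
          proba = clf.predict_proba(X[unlabeled])
          margin = np.abs(proba[:, 1] - proba[:, 0])          # small margin = uncertain patch
          query = [unlabeled[i] for i in np.argsort(margin)[:25]]
          labeled.extend(query)                               # pretend a crowdworker labels these
          unlabeled = [i for i in unlabeled if i not in set(query)]

      print("final training-set size:", len(labeled))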

  9. Rapid genotyping with DNA micro-arrays for high-density linkage mapping and QTL mapping in common buckwheat (Fagopyrum esculentum Moench)

    PubMed Central

    Yabe, Shiori; Hara, Takashi; Ueno, Mariko; Enoki, Hiroyuki; Kimura, Tatsuro; Nishimura, Satoru; Yasui, Yasuo; Ohsawa, Ryo; Iwata, Hiroyoshi

    2014-01-01

    For genetic studies and genomics-assisted breeding, particularly of minor crops, a genotyping system that does not require a priori genomic information is preferable. Here, we demonstrated the potential of a novel array-based genotyping system for the rapid construction of a high-density linkage map and quantitative trait loci (QTL) mapping. Using the system, we successfully constructed an accurate, high-density linkage map for common buckwheat (Fagopyrum esculentum Moench); the map was composed of 756 loci and included 8,884 markers. The number of linkage groups converged to eight, which is the basic number of chromosomes in common buckwheat. The sizes of the linkage groups of the P1 and P2 maps were 773.8 and 800.4 cM, respectively. The average interval between adjacent loci was 2.13 cM. The linkage map constructed here will be useful for the analysis of other common buckwheat populations. We also performed QTL mapping for main stem length and detected four QTL. It took 37 days to process 178 samples from DNA extraction to genotyping, indicating that the system enables genotyping of genome-wide markers for a few hundred buckwheat plants before the plants mature. The novel system will be useful for genomics-assisted breeding in minor crops without a priori genomic information. PMID:25914583

  10. Rapid genotyping with DNA micro-arrays for high-density linkage mapping and QTL mapping in common buckwheat (Fagopyrum esculentum Moench).

    PubMed

    Yabe, Shiori; Hara, Takashi; Ueno, Mariko; Enoki, Hiroyuki; Kimura, Tatsuro; Nishimura, Satoru; Yasui, Yasuo; Ohsawa, Ryo; Iwata, Hiroyoshi

    2014-12-01

    For genetic studies and genomics-assisted breeding, particularly of minor crops, a genotyping system that does not require a priori genomic information is preferable. Here, we demonstrated the potential of a novel array-based genotyping system for the rapid construction of a high-density linkage map and quantitative trait loci (QTL) mapping. Using the system, we successfully constructed an accurate, high-density linkage map for common buckwheat (Fagopyrum esculentum Moench); the map was composed of 756 loci and included 8,884 markers. The number of linkage groups converged to eight, which is the basic number of chromosomes in common buckwheat. The sizes of the linkage groups of the P1 and P2 maps were 773.8 and 800.4 cM, respectively. The average interval between adjacent loci was 2.13 cM. The linkage map constructed here will be useful for the analysis of other common buckwheat populations. We also performed QTL mapping for main stem length and detected four QTL. It took 37 days to process 178 samples from DNA extraction to genotyping, indicating that the system enables genotyping of genome-wide markers for a few hundred buckwheat plants before the plants mature. The novel system will be useful for genomics-assisted breeding in minor crops without a priori genomic information.

  11. A complete mass spectrometric map for the analysis of the yeast proteome and its application to quantitative trait analysis

    PubMed Central

    Picotti, Paola; Clement-Ziza, Mathieu; Lam, Henry; Campbell, David S.; Schmidt, Alexander; Deutsch, Eric W.; Röst, Hannes; Sun, Zhi; Rinner, Oliver; Reiter, Lukas; Shen, Qin; Michaelson, Jacob J.; Frei, Andreas; Alberti, Simon; Kusebauch, Ulrike; Wollscheid, Bernd; Moritz, Robert; Beyer, Andreas; Aebersold, Ruedi

    2013-01-01

    Complete reference maps or datasets, like the genomic map of an organism, are highly beneficial tools for biological and biomedical research. Attempts to generate such reference datasets for a proteome have so far failed to reach complete proteome coverage, with saturation apparent at approximately two thirds of the proteomes tested, even for the most thoroughly characterized proteomes. Here, we used a strategy based on high-throughput peptide synthesis and mass spectrometry to generate a close-to-complete reference map (97% of the genome-predicted proteins) of the S. cerevisiae proteome. We generated two versions of this mass spectrometric map, one supporting discovery-driven (shotgun) and the other hypothesis-driven (targeted) proteomic measurements. The two versions of the map therefore constitute a complete set of proteomic assays to support most studies performed with contemporary proteomic technologies. The reference libraries can be browsed via a web-based repository and associated navigation tools. To demonstrate the utility of the reference libraries, we applied them to a protein quantitative trait locus (pQTL) analysis, which requires measurement of the same peptides over a large number of samples with high precision. Protein measurements over a set of 78 S. cerevisiae strains revealed a complex relationship among independent genetic loci impacting the levels of related proteins. Our results suggest that selective pressure favors the acquisition of sets of polymorphisms that maintain the stoichiometry of protein complexes and pathways. PMID:23334424

  12. Complex conductivity of volcanic rocks and the geophysical mapping of alteration in volcanoes

    NASA Astrophysics Data System (ADS)

    Ghorbani, A.; Revil, A.; Coperey, A.; Soueid Ahmed, A.; Roque, S.; Heap, M. J.; Grandis, H.; Viveiros, F.

    2018-05-01

    Induced polarization measurements can be used to image alteration at the scale of volcanic edifices to a depth of a few kilometers. Such a goal cannot be achieved with electrical conductivity alone, because too many textural and environmental parameters influence the electrical conductivity of volcanic rocks. We investigate spectral induced polarization (complex conductivity) measurements in the frequency band 10 mHz-45 kHz on 85 core samples from five volcanoes: Merapi and Papandayan in Indonesia (32 samples), Furnas in Portugal (5 samples), Yellowstone in the USA (26 samples), and Whakaari (White Island) in New Zealand (22 samples). This collection of samples covers not only different rock compositions (basaltic andesite, andesite, trachyte, and rhyolite) but also various degrees of alteration. The specific surface area is found to be correlated with the cation exchange capacity (CEC) of the samples measured by the cobalthexamine method, both serving as rough proxies of the hydrothermal alteration experienced by these materials. The in-phase (real) conductivity of the samples is the sum of a bulk contribution associated with conduction in the pore network and a surface conductivity that increases with alteration. The quadrature conductivity and the normalized chargeability are two parameters related to the polarization of the electrical double layer coating the minerals of the volcanic rocks. Both parameters increase with the degree of alteration. The surface conductivity, the quadrature conductivity, and the normalized chargeability (defined as the difference between the in-phase conductivity at high and low frequencies) are linearly correlated with the CEC normalized by the bulk tortuosity of the pore space. The effects of temperature and pyrite content are also investigated and can be understood in terms of a physics-based model. Finally, we performed a numerical study of the use of induced polarization to image the normalized chargeability of a volcanic edifice. Induced polarization tomography can be used to map alteration of volcanic edifices, with applications to geohazard mapping.
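
    The relationships summarized above can be written compactly; the following is a notational sketch using symbols chosen here for illustration rather than the paper's exact notation:

      \sigma'(\omega) \;=\; \sigma_{\mathrm{bulk}} + \sigma_{\mathrm{s}}, \qquad
      M_{n} \;\equiv\; \sigma'_{\infty} - \sigma'_{0}, \qquad
      \sigma_{\mathrm{s}},\ \sigma''(\omega),\ M_{n} \;\propto\; \frac{\mathrm{CEC}}{\tau}

    Here \sigma' and \sigma'' denote the in-phase and quadrature conductivities, \sigma'_{\infty} and \sigma'_{0} the high- and low-frequency in-phase conductivities, CEC the cation exchange capacity, and \tau the bulk tortuosity of the pore space.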

  13. Stroma Based Prognosticators Incorporating Differences between African and European Americans

    DTIC Science & Technology

    2017-10-01

    amenable to bisulfite sequencing of more than a few genes. Exploiting the recent three-fold reduction in the cost of sequencing per read, we developed oligo...cards. The ability of the HiSeq 4000 to obtain about three times as many reads as the HiSeq 2500, at the same price, means we can stay on track, though...capture, and sequencing (Table 2). We obtain tens of millions of mapped deduplicated reads per sample, while using only 5% of a sequencing lane per sample

  14. Maps and tables showing data and analyses of semiquantitative emission spectrometry and atomic-absorption spectrophotometry of rock samples, Ugashik, Bristol Bay, and part of Karluk quadrangles, Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; O'Leary, R. M.

    1986-01-01

    The accompanying maps and tables show analytical data and data analyses from rock samples collected in conjunction with geologic mapping in the Ugashik, Bristol Bay, and western Karluk quadrangles from 1979 through 1981. This work was conducted under the auspices of the Alaska Mineral Resource Assessment Program (AMRAP). A total of 337 samples were collected for analysis, primarily in areas of surficial alteration. The sample locations are shown on sheet 1; they are concentrated along the Pacific Ocean side of the area because the Bristol Bay lowlands part of the map area consists predominantly of unconsolidated Quaternary deposits. Samples were collected by the following people, with their respective two-letter identifying codes shown in parentheses: W.H. Allaway (AY), J.E. Case (CE), D.P. Cox (CX), R.L. Detterman (DT), T.G. Theodore (MK), F.H. Wilson (WS), and M.E. Yount (YB).

  15. Rapid case-based mapping of seasonal malaria transmission risk for strategic elimination planning in Swaziland

    PubMed Central

    2013-01-01

    Background: As successful malaria control programmes move towards elimination, they must identify residual transmission foci, target vector control to high-risk areas, focus on both asymptomatic and symptomatic infections, and manage importation risk. High spatial and temporal resolution maps of malaria risk can support all of these activities, but commonly available malaria maps are based on parasite rate, a poor metric for measuring malaria at extremely low prevalence. New approaches are required to provide case-based risk maps to countries seeking to identify remaining hotspots of transmission while managing the risk of transmission from imported cases. Methods: Household locations and travel histories of confirmed malaria patients during 2011 were recorded through routine surveillance by the Swaziland National Malaria Control Programme for the higher transmission months of January to April and the lower transmission months of May to December. Household locations for patients with no travel history to endemic areas were compared against a random set of background points sampled proportionate to population density with respect to a set of variables related to environment, population density, vector control, and distance to the locations of identified imported cases. Comparisons were made separately for the high and low transmission seasons. The Random Forests regression tree classification approach was used to generate maps predicting the probability of a locally acquired case at 100 m resolution across Swaziland for each season. Results: Case households during the high transmission season tended to be located in areas of lower elevation, closer to bodies of water, in more sparsely populated areas, with lower rainfall and warmer temperatures, and closer to imported cases than random background points (all p < 0.001). Similar differences were evident during the low transmission season. Maps from the fit models suggested better predictive ability during the high season. Both models proved useful at predicting the locations of local cases identified in 2012. Conclusions: The high-resolution mapping approaches described here can help elimination programmes understand the epidemiology of a disappearing disease. Generating case-based risk maps at high spatial and temporal resolution will allow control programmes to direct interventions proactively according to evidence-based measures of risk and ensure that the impact of limited resources is maximized to achieve and maintain malaria elimination. PMID:23398628
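
    A minimal sketch of the case-versus-background classification step described above, assuming a generic table of synthetic environmental covariates rather than the actual Swaziland surveillance data (all covariate values, names, and grid sizes below are illustrative):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      # Label confirmed local case households 1 and population-weighted random
      # background points 0, fit a random forest on environmental covariates, and
      # map the predicted probability of a local case. Synthetic covariates stand
      # in for elevation, distance to water, rainfall, etc.
      rng = np.random.default_rng(1)
      n_cases, n_background = 300, 1500
      cases = rng.normal([400, 0.5, 650], [80, 0.3, 90], size=(n_cases, 3))        # lower, nearer water, drier
      background = rng.normal([700, 2.0, 800], [150, 1.0, 120], size=(n_background, 3))

      X = np.vstack([cases, background])
      y = np.concatenate([np.ones(n_cases), np.zeros(n_background)])

      model = RandomForestClassifier(n_estimators=500, random_state=1).fit(X, y)

      # Predict a seasonal risk surface over a grid of covariate values
      # (a stand-in for the 100 m national grid used in the study).
      grid = rng.normal([550, 1.2, 720], [200, 1.0, 150], size=(10000, 3))
      risk = model.predict_proba(grid)[:, 1]
      print("mean predicted probability of a local case:", round(float(risk.mean()), 3))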

  16. Rapid case-based mapping of seasonal malaria transmission risk for strategic elimination planning in Swaziland.

    PubMed

    Cohen, Justin M; Dlamini, Sabelo; Novotny, Joseph M; Kandula, Deepika; Kunene, Simon; Tatem, Andrew J

    2013-02-11

    As successful malaria control programmes move towards elimination, they must identify residual transmission foci, target vector control to high-risk areas, focus on both asymptomatic and symptomatic infections, and manage importation risk. High spatial and temporal resolution maps of malaria risk can support all of these activities, but commonly available malaria maps are based on parasite rate, a poor metric for measuring malaria at extremely low prevalence. New approaches are required to provide case-based risk maps to countries seeking to identify remaining hotspots of transmission while managing the risk of transmission from imported cases. Household locations and travel histories of confirmed malaria patients during 2011 were recorded through routine surveillance by the Swaziland National Malaria Control Programme for the higher transmission months of January to April and the lower transmission months of May to December. Household locations for patients with no travel history to endemic areas were compared against a random set of background points sampled proportionate to population density with respect to a set of variables related to environment, population density, vector control, and distance to the locations of identified imported cases. Comparisons were made separately for the high and low transmission seasons. The Random Forests regression tree classification approach was used to generate maps predicting the probability of a locally acquired case at 100 m resolution across Swaziland for each season. Results indicated that case households during the high transmission season tended to be located in areas of lower elevation, closer to bodies of water, in more sparsely populated areas, with lower rainfall and warmer temperatures, and closer to imported cases than random background points (all p < 0.001). Similar differences were evident during the low transmission season. Maps from the fit models suggested better predictive ability during the high season. Both models proved useful at predicting the locations of local cases identified in 2012. The high-resolution mapping approaches described here can help elimination programmes understand the epidemiology of a disappearing disease. Generating case-based risk maps at high spatial and temporal resolution will allow control programmes to direct interventions proactively according to evidence-based measures of risk and ensure that the impact of limited resources is maximized to achieve and maintain malaria elimination.

  17. Estimating an exchange rate between the EQ-5D-3L and ASCOT.

    PubMed

    Stevens, Katherine; Brazier, John; Rowen, Donna

    2018-06-01

    The aim was to estimate an exchange rate between EQ-5D-3L and the Adult Social Care Outcome Tool (ASCOT) using preference-based mapping via common time trade-off (TTO) valuations. EQ-5D and ASCOT are useful for examining cost-effectiveness within the health and social care sectors, respectively, but there is a policy need to understand overall benefits and compare across sectors to assess relative value for money. Standard statistical mapping is unsuitable because it relies on conceptual overlap between the measures, but EQ-5D and ASCOT have different conceptualisations of quality of life. We use a preference-based mapping approach to estimate the exchange rate using common TTO valuations for both measures. A sample of health states from each measure was valued using TTO by 200 members of the UK adult general population. Regression analyses are used to generate separate equations relating EQ-5D-3L and ASCOT values, using their original value sets and the TTO values elicited here. These are solved as simultaneous equations to estimate the relationship between EQ-5D-3L and ASCOT. The relationship for moving from ASCOT to EQ-5D-3L is a linear transformation with an intercept of -0.0488 and a gradient of 0.978. This enables QALY gains generated by ASCOT and EQ-5D to be compared across different interventions. This paper estimated an exchange rate between ASCOT and EQ-5D-3L using a preference-based mapping approach that does not compromise the descriptive systems of the two measures. This contributes to the development of preference-based mapping through the use of TTO as the common metric used to estimate the exchange rate between measures.
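
    Applying the reported linear relationship is straightforward. The sketch below uses only the intercept and gradient quoted in the abstract; the function name is hypothetical, and any real cross-sector comparison would need the full published model and its uncertainty:

      def ascot_to_eq5d3l(ascot_value: float) -> float:
          """Map an ASCOT value onto the EQ-5D-3L scale using the linear
          exchange rate reported above (intercept -0.0488, gradient 0.978)."""
          return -0.0488 + 0.978 * ascot_value

      # Example: an intervention that raises ASCOT from 0.60 to 0.75 corresponds,
      # on the EQ-5D-3L scale, to a gain of 0.978 * 0.15 = 0.1467.
      gain = ascot_to_eq5d3l(0.75) - ascot_to_eq5d3l(0.60)
      print(round(gain, 4))   # 0.1467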

  18. Monitoring of thermal therapy based on shear modulus changes: I. shear wave thermometry.

    PubMed

    Arnal, Bastien; Pernot, Mathieu; Tanter, Mickael

    2011-02-01

    The clinical applicability of high-intensity focused ultrasound (HIFU) for noninvasive therapy is today hampered by the lack of robust, real-time monitoring of tissue damage during treatment. The goal of this study is to show that the estimation of local tissue elasticity from shear wave imaging (SWI) can lead to 2-D mapping of temperature changes during HIFU treatments. This new concept of shear wave thermometry is experimentally implemented here using conventional ultrasonic imaging probes. HIFU treatment and monitoring were performed, respectively, using a confocal setup consisting of a 2.5-MHz single-element transducer focused at 30 mm on ex vivo samples and an 8-MHz ultrasound diagnostic probe. Thermocouple measurements and ultrasound-based thermometry were used as gold standard techniques and were combined with SWI on the same device. The SWI sequences consisted of 2 successive shear waves induced at different lateral positions. Each wave was created using 100-μs pushing beams at 3 depths. The shear wave propagation was acquired at 17,000 frames/s, from which the elasticity map was recovered. HIFU sonications were interleaved with fast imaging acquisitions, allowing a duty cycle of more than 90%. Elasticity and temperature mapping was achieved every 3 s, leading to real-time monitoring of the treatment. Tissue stiffness was found to decrease in the focal zone for temperatures up to 43°C. Ultrasound-based temperature estimation was highly correlated with stiffness variation maps (r² = 0.91 to 0.97). A reversible calibration of the changes in elasticity with temperature can be performed locally using sighting shots. This calibration process allows for the derivation of temperature maps from shear wave imaging. Compared with conventional ultrasound-based approaches, shear wave thermometry is found to be much more robust to motion artifacts.
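
    The quantitative link exploited here is the standard relation between shear wave speed and shear modulus; the temperature map then follows from a locally calibrated relation between the stiffness change and temperature. The linear form below is an illustrative assumption, not the paper's exact calibration:

      \mu = \rho\, c_{s}^{2}, \qquad
      E \approx 3\mu \ \ \text{(soft, nearly incompressible tissue)}, \qquad
      \Delta T(x,z) \approx \alpha_{\mathrm{local}}\, \Delta\mu(x,z)

    Here c_{s} is the locally estimated shear wave speed, \rho \approx 1000 kg/m^3 for soft tissue, and \alpha_{\mathrm{local}} is the coefficient obtained from the reversible calibration (sighting) shots.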

  19. Enhancing the performance of regional land cover mapping

    NASA Astrophysics Data System (ADS)

    Wu, Weicheng; Zucca, Claudio; Karam, Fadi; Liu, Guangping

    2016-10-01

    Different pixel-based, object-based, and subpixel-based methods, such as time-series analysis, decision trees, and various supervised approaches, have been proposed for land use/cover classification. However, despite their proven advantages in small dataset tests, their performance is variable and less satisfactory when dealing with large datasets, particularly for regional-scale mapping with high-resolution data, owing to the complexity and diversity of landscapes and land cover patterns and the unacceptably long processing time. The objective of this paper is to demonstrate the comparatively high performance of an operational approach based on the integration of multisource information, ensuring high mapping accuracy over large areas with acceptable processing time. The information used includes phenologically contrasted multiseasonal and multispectral bands, a vegetation index, land surface temperature, and topographic features. The performance of different conventional and machine learning classifiers, namely Mahalanobis Distance (MD), Maximum Likelihood (ML), Artificial Neural Networks (ANNs), Support Vector Machines (SVMs), and Random Forests (RFs), was compared using the same datasets in the same IDL (Interactive Data Language) environment. An Eastern Mediterranean area with complex landscape and steep climate gradients was selected to test and develop the operational approach. The results showed that the SVM and RF classifiers produced the most accurate mapping at local scale (up to 96.85% overall accuracy) but were very time-consuming for whole-scene classification (more than five days per scene), whereas ML completed the task rapidly (about 10 min per scene) with satisfactory accuracy (94.2-96.4%). Thus, an approach combining seasonally contrasted multisource data and sampling at the subclass level, followed by ML classification, is a suitable candidate for an operational and effective regional land cover mapping method.
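
    The accuracy-versus-runtime trade-off reported above can be reproduced in miniature. In the sketch below, synthetic features stand in for the multiseasonal bands, vegetation index, land surface temperature, and topographic layers, and quadratic discriminant analysis plays the role of the Gaussian maximum-likelihood classifier; the comparison is illustrative, not the paper's experiment:

      import time
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      # Toy comparison of the classifier families discussed above.
      X, y = make_classification(n_samples=10000, n_features=12, n_informative=10,
                                 n_redundant=0, n_classes=5, n_clusters_per_class=1,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

      for name, clf in [("ML (Gaussian, via QDA)", QuadraticDiscriminantAnalysis()),
                        ("SVM (RBF)", SVC(kernel="rbf", gamma="scale")),
                        ("Random Forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
          t0 = time.time()
          clf.fit(X_tr, y_tr)
          acc = accuracy_score(y_te, clf.predict(X_te))
          print(f"{name:24s} accuracy={acc:.3f}  time={time.time() - t0:.1f}s")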

  20. Prediction of CT Substitutes from MR Images Based on Local Diffeomorphic Mapping for Brain PET Attenuation Correction.

    PubMed

    Wu, Yao; Yang, Wei; Lu, Lijun; Lu, Zhentai; Zhong, Liming; Huang, Meiyan; Feng, Yanqiu; Feng, Qianjin; Chen, Wufan

    2016-10-01

    Attenuation correction is important for PET reconstruction. In PET/MR, MR intensities are not directly related to the attenuation coefficients needed in PET imaging. The attenuation coefficient map can be derived from CT images; therefore, prediction of CT substitutes from MR images is desired for attenuation correction in PET/MR. This study presents a patch-based method for CT prediction from MR images, generating attenuation maps for PET reconstruction. Because no global relation exists between MR and CT intensities, we propose local diffeomorphic mapping (LDM) for CT prediction. In LDM, we assume that MR and CT patches are located on two nonlinear manifolds and that the mapping from the MR manifold to the CT manifold approximates a diffeomorphism under a local constraint. Locality is important in LDM and is enforced by the following techniques. The first is local dictionary construction: for each patch in the testing MR image, a local search window is used to extract patches from training MR/CT pairs to construct MR and CT dictionaries. The k-nearest neighbors and an outlier detection strategy are then used to constrain the locality of the MR and CT dictionaries. The second is local linear representation: local anchor embedding is used to solve for the MR dictionary coefficients that represent the MR testing sample. Under these local constraints, the dictionary coefficients are linearly transferred from the MR manifold to the CT manifold and used to combine CT training samples to generate CT predictions. Our dataset contains 13 healthy subjects, each with T1- and T2-weighted MR and CT brain images. The method provides CT predictions with a mean absolute error of 110.1 Hounsfield units, a Pearson linear correlation of 0.82, a peak signal-to-noise ratio of 24.81 dB, and a Dice coefficient in bone regions of 0.84 compared with real CT images. CT substitute-based PET reconstruction has a regression slope of 1.0084 and an R² of 0.9903 compared with real CT-based PET. In this method, no image segmentation or accurate registration is required. Our method demonstrates superior performance in CT prediction and PET reconstruction compared with competing methods. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
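
    A minimal sketch of the patch-based prediction idea: k-nearest-neighbour search in a local MR patch dictionary followed by a weighted combination of the paired CT values. Gaussian similarity weights here stand in for the paper's local anchor embedding and outlier rejection; all array names and sizes are illustrative:

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      def predict_ct_patch(mr_patch, mr_dict, ct_centers, k=10, h=25.0):
          """mr_patch: (p,) test patch; mr_dict: (n, p) local MR dictionary;
          ct_centers: (n,) CT intensities paired with the dictionary patches."""
          nn = NearestNeighbors(n_neighbors=k).fit(mr_dict)
          dist, idx = nn.kneighbors(mr_patch[None, :])
          w = np.exp(-(dist[0] ** 2) / (2.0 * h ** 2))        # similarity weights
          w /= w.sum()
          return float(np.dot(w, ct_centers[idx[0]]))          # predicted CT value (HU)

      # Toy usage with random data standing in for co-registered MR/CT training pairs.
      rng = np.random.default_rng(0)
      mr_dict = rng.normal(size=(500, 27))                      # 3x3x3 patches, flattened
      ct_centers = rng.normal(300.0, 400.0, size=500)           # paired CT centre voxels (HU)
      test_patch = rng.normal(size=27)
      print("predicted HU:", round(predict_ct_patch(test_patch, mr_dict, ct_centers), 1))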
