Conditional Random Field-Based Offline Map Matching for Indoor Environments
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-01-01
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892
Conditional Random Field-Based Offline Map Matching for Indoor Environments.
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-08-16
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm.
Digital Mapping Techniques '11–12 workshop proceedings
Soller, David R.
2014-01-01
At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
NASA Astrophysics Data System (ADS)
Chockalingam, Letchumanan
2005-01-01
The data of Gunung Ledang region of Malaysia acquired through LANDSAT are considered to map certain hydrogeolocial features. To map these significant features, image-processing tools such as contrast enhancement, edge detection techniques are employed. The advantages of these techniques over the other methods are evaluated from the point of their validity in properly isolating features of hydrogeolocial interest are discussed. As these techniques take the advantage of spectral aspects of the images, these techniques have several limitations to meet the objectives. To discuss these limitations, a morphological transformation, which generally considers the structural aspects rather than spectral aspects from the image, are applied to provide comparisons between the results derived from spectral based and the structural based filtering techniques.
Combining Techniques to Refine Item to Skills Q-Matrices with a Partition Tree
ERIC Educational Resources Information Center
Desmarais, Michel C.; Xu, Peng; Beheshti, Behzad
2015-01-01
The problem of mapping items to skills is gaining interest with the emergence of recent techniques that can use data for both defining this mapping, and for refining mappings given by experts. We investigate the problem of refining mapping from an expert by combining the output of different techniques. The combination is based on a partition tree…
Interagency Report: Astrogeology 58, television cartography
Batson, Raymond M.
1973-01-01
The purpose of this paper is to illustrate the processing of digital television pictures into base maps. In this context, a base map is defined as a pictorial representation of planetary surface morphology accurately reproduced on standard map projections. Topographic contour lines, albedo or geologic overprints may be super imposed on these base maps. The compilation of geodetic map controls, the techniques of mosaic compilation, computer processing and airbrush enhancement, and the compilation of con tour lines are discussed elsewhere by the originators of these techniques. A bibliography of applicable literature is included for readers interested in more detailed discussions.
NASA Astrophysics Data System (ADS)
Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.
Fusion of optical and synthetic aperture radar data has been attempted in the present study for mapping of various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area has been enhanced using Fast Fourier Transformation (FFT) based filtering approach, and also using Frost filtering technique. Both the enhanced SAR imagery have been then separately fused with histogram equalized IRS-1C LISS III image using Principal Component Analysis (PCA) technique. Later, Feature-oriented Principal Components Selection (FPCS) technique has been applied to generate False Color Composite (FCC) images, from which corresponding geological maps have been prepared. Finally, GIS techniques have been successfully used for change detection analysis in the lithological interpretation between the published geological map and the fusion based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, few areas could be identified which need attention for further detailed ground-based geological studies.
Salinet, João L; Masca, Nicholas; Stafford, Peter J; Ng, G André; Schlindwein, Fernando S
2016-03-08
Areas with high frequency activity within the atrium are thought to be 'drivers' of the rhythm in patients with atrial fibrillation (AF) and ablation of these areas seems to be an effective therapy in eliminating DF gradient and restoring sinus rhythm. Clinical groups have applied the traditional FFT-based approach to generate the three-dimensional dominant frequency (3D DF) maps during electrophysiology (EP) procedures but literature is restricted on using alternative spectral estimation techniques that can have a better frequency resolution that FFT-based spectral estimation. Autoregressive (AR) model-based spectral estimation techniques, with emphasis on selection of appropriate sampling rate and AR model order, were implemented to generate high-density 3D DF maps of atrial electrograms (AEGs) in persistent atrial fibrillation (persAF). For each patient, 2048 simultaneous AEGs were recorded for 20.478 s-long segments in the left atrium (LA) and exported for analysis, together with their anatomical locations. After the DFs were identified using AR-based spectral estimation, they were colour coded to produce sequential 3D DF maps. These maps were systematically compared with maps found using the Fourier-based approach. 3D DF maps can be obtained using AR-based spectral estimation after AEGs downsampling (DS) and the resulting maps are very similar to those obtained using FFT-based spectral estimation (mean 90.23 %). There were no significant differences between AR techniques (p = 0.62). The processing time for AR-based approach was considerably shorter (from 5.44 to 5.05 s) when lower sampling frequencies and model order values were used. Higher levels of DS presented higher rates of DF agreement (sampling frequency of 37.5 Hz). We have demonstrated the feasibility of using AR spectral estimation methods for producing 3D DF maps and characterised their differences to the maps produced using the FFT technique, offering an alternative approach for 3D DF computation in human persAF studies.
Fourier-Mellin moment-based intertwining map for image encryption
NASA Astrophysics Data System (ADS)
Kaur, Manjit; Kumar, Vijay
2018-03-01
In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and intertwining logistic map is proposed. Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity of an input image. Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on input image using secret keys. The performance of proposed image encryption technique has been evaluated on five well-known benchmark images and also compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms others in terms of entropy, correlation analysis, a unified average changing intensity and the number of changing pixel rate. The simulation results reveal that the proposed technique provides high level of security and robustness against various types of attacks.
Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009
Soller, David R.
2011-01-01
As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
NASA Astrophysics Data System (ADS)
Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia
2007-12-01
To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure alternatively employs both an a priori systems view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps where child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.
Bang, Yoonsik; Kim, Jiyoung; Yu, Kiyun
2016-01-01
Wearable and smartphone technology innovations have propelled the growth of Pedestrian Navigation Services (PNS). PNS need a map-matching process to project a user’s locations onto maps. Many map-matching techniques have been developed for vehicle navigation services. These techniques are inappropriate for PNS because pedestrians move, stop, and turn in different ways compared to vehicles. In addition, the base map data for pedestrians are more complicated than for vehicles. This article proposes a new map-matching method for locating Global Positioning System (GPS) trajectories of pedestrians onto road network datasets. The theory underlying this approach is based on the Fréchet distance, one of the measures of geometric similarity between two curves. The Fréchet distance approach can provide reasonable matching results because two linear trajectories are parameterized with the time variable. Then we improved the method to be adaptive to the positional error of the GPS signal. We used an adaptation coefficient to adjust the search range for every input signal, based on the assumption of auto-correlation between consecutive GPS points. To reduce errors in matching, the reliability index was evaluated in real time for each match. To test the proposed map-matching method, we applied it to GPS trajectories of pedestrians and the road network data. We then assessed the performance by comparing the results with reference datasets. Our proposed method performed better with test data when compared to a conventional map-matching technique for vehicles. PMID:27782091
Current trends in geomorphological mapping
NASA Astrophysics Data System (ADS)
Seijmonsbergen, A. C.
2012-04-01
Geomorphological mapping is a world currently in motion, driven by technological advances and the availability of new high resolution data. As a consequence, classic (paper) geomorphological maps which were the standard for more than 50 years are rapidly being replaced by digital geomorphological information layers. This is witnessed by the following developments: 1. the conversion of classic paper maps into digital information layers, mainly performed in a digital mapping environment such as a Geographical Information System, 2. updating the location precision and the content of the converted maps, by adding more geomorphological details, taken from high resolution elevation data and/or high resolution image data, 3. (semi) automated extraction and classification of geomorphological features from digital elevation models, broadly separated into unsupervised and supervised classification techniques and 4. New digital visualization / cartographic techniques and reading interfaces. Newly digital geomorphological information layers can be based on manual digitization of polygons using DEMs and/or aerial photographs, or prepared through (semi) automated extraction and delineation of geomorphological features. DEMs are often used as basis to derive Land Surface Parameter information which is used as input for (un) supervised classification techniques. Especially when using high-res data, object-based classification is used as an alternative to traditional pixel-based classifications, to cluster grid cells into homogeneous objects, which can be classified as geomorphological features. Classic map content can also be used as training material for the supervised classification of geomorphological features. In the classification process, rule-based protocols, including expert-knowledge input, are used to map specific geomorphological features or entire landscapes. Current (semi) automated classification techniques are increasingly able to extract morphometric, hydrological, and in the near future also morphogenetic information. As a result, these new opportunities have changed the workflows for geomorphological mapmaking, and their focus have shifted from field-based techniques to using more computer-based techniques: for example, traditional pre-field air-photo based maps are now replaced by maps prepared in a digital mapping environment, and designated field visits using mobile GIS / digital mapping devices now focus on gathering location information and attribute inventories and are strongly time efficient. The resulting 'modern geomorphological maps' are digital collections of geomorphological information layers consisting of georeferenced vector, raster and tabular data which are stored in a digital environment such as a GIS geodatabase, and are easily visualized as e.g. 'birds' eye' views, as animated 3D displays, on virtual globes, or stored as GeoPDF maps in which georeferenced attribute information can be easily exchanged over the internet. Digital geomorphological information layers are increasingly accessed via web-based services distributed through remote servers. Information can be consulted - or even build using remote geoprocessing servers - by the end user. Therefore, it will not only be the geomorphologist anymore, but also the professional end user that dictates the applied use of digital geomorphological information layers.
Denys Yemshanov; Frank H. Koch; Mark Ducey; Klaus Koehler
2013-01-01
Geographic mapping of risks is a useful analytical step in ecological risk assessments and in particular, in analyses aimed to estimate risks associated with introductions of invasive organisms. In this paper, we approach invasive species risk mapping as a portfolio allocation problem and apply techniques from decision theory to build an invasion risk map that combines...
The efficacy of the 'mind map' study technique.
Farrand, Paul; Hussain, Fearzana; Hennessy, Enid
2002-05-01
To examine the effectiveness of using the 'mind map' study technique to improve factual recall from written information. To obtain baseline data, subjects completed a short test based on a 600-word passage of text prior to being randomly allocated to form two groups: 'self-selected study technique' and 'mind map'. After a 30-minute interval the self-selected study technique group were exposed to the same passage of text previously seen and told to apply existing study techniques. Subjects in the mind map group were trained in the mind map technique and told to apply it to the passage of text. Recall was measured after an interfering task and a week later. Measures of motivation were taken. Barts and the London School of Medicine and Dentistry, University of London. 50 second- and third-year medical students. Recall of factual material improved for both the mind map and self-selected study technique groups at immediate test compared with baseline. However this improvement was only robust after a week for those in the mind map group. At 1 week, the factual knowledge in the mind map group was greater by 10% (adjusting for baseline) (95% CI -1% to 22%). However motivation for the technique used was lower in the mind map group; if motivation could have been made equal in the groups, the improvement with mind mapping would have been 15% (95% CI 3% to 27%). Mind maps provide an effective study technique when applied to written material. However before mind maps are generally adopted as a study technique, consideration has to be given towards ways of improving motivation amongst users.
A Different Web-Based Geocoding Service Using Fuzzy Techniques
NASA Astrophysics Data System (ADS)
Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.
2015-12-01
Geocoding - the process of finding position based on descriptive data such as address or postal code - is considered as one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, proximity and nearness concept is not modelled appropriately as well as these services search address only by address matching based on descriptive data. In addition there are also some limitations in display searching results. Resolving these limitations can enhance efficiency of the existing geocoding services. This paper proposes the idea of integrating fuzzy technique with geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. Then these fuzzy distance maps are integrated using fuzzy overlay technique for obtain the results. Proposed methods provides different capabilities for users such as ability to search multi-part addresses, searching places based on their location, non-point representation of results as well as displaying search results based on their priority.
NASA Technical Reports Server (NTRS)
Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.
1984-01-01
Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data. The two techniques are Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that the two techniques have potential for providing map products which can be incorporated into existing inventory procedures and automated alternatives to traditional inventory techniques and those which currently employ Landsat imagery.
Validation and application of Acoustic Mapping Velocimetry
NASA Astrophysics Data System (ADS)
Baranya, Sandor; Muste, Marian
2016-04-01
The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys while the estimation of the velocity of the bedforms is obtained with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport estimation is done using the Exner equation. A proof-of-concept experiment was performed to validate the AMV based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings and acceptable agreement was found. As a first field implementation of the AMV an attempt was made to estimate bedload transport for a section of the Ohio river in the United States, where bed geometry maps, resulted by repeated multibeam echo sounder (MBES) surveys, served as input data. Cross-sectional distributions of bedload transport rates from the AMV based method were compared with the ones obtained from another non-intrusive technique (due to the lack of direct samplings), ISSDOTv2, developed by the US Army Corps of Engineers. The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
ERIC Educational Resources Information Center
Seyihoglu, Aysegul; Kartal, Ayca
2010-01-01
The purpose of this study is to reveal the opinions of teachers on using the mind mapping technique in Life Science and Social Studies lessons. The participants of the study are 20 primary education teachers. In this study, a semi-structured interview technique was used. For content analysis, the themes and codes were defined, based on the views…
A map of dust reddening to 4.5 kpc from Pan-STARRS1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlafly, E. F.; Rix, H.-W.; Martin, N. F.
2014-07-01
We present a map of the dust reddening to 4.5 kpc derived from Pan-STARRS1 stellar photometry. The map covers almost the entire sky north of declination –30° at a resolution of 7'-14', and is based on the estimated distances and reddenings to more than 500 million stars. The technique is designed to map dust in the Galactic plane, where many other techniques are stymied by the presence of multiple dust clouds at different distances along each line of sight. This reddening-based dust map agrees closely with the Schlegel et al. (SFD) far-infrared emission-based dust map away from the Galactic plane,more » and the most prominent differences between the two maps stem from known limitations of SFD in the plane. We also compare the map with Planck, finding likewise good agreement in general at high latitudes. The use of optical data from Pan-STARRS1 yields reddening uncertainty as low as 25 mmag E(B – V).« less
Adaptive proxy map server for efficient vector spatial data rendering
NASA Astrophysics Data System (ADS)
Sayar, Ahmet
2013-01-01
The rapid transmission of vector map data over the Internet is becoming a bottleneck of spatial data delivery and visualization in web-based environment because of increasing data amount and limited network bandwidth. In order to improve both the transmission and rendering performances of vector spatial data over the Internet, we propose a proxy map server enabling parallel vector data fetching as well as caching to improve the performance of web-based map servers in a dynamic environment. Proxy map server is placed seamlessly anywhere between the client and the final services, intercepting users' requests. It employs an efficient parallelization technique based on spatial proximity and data density in case distributed replica exists for the same spatial data. The effectiveness of the proposed technique is proved at the end of the article by the application of creating map images enriched with earthquake seismic data records.
An Electronic Engineering Curriculum Design Based on Concept-Mapping Techniques
ERIC Educational Resources Information Center
Toral, S. L.; Martinez-Torres, M. R.; Barrero, F.; Gallardo, S.; Duran, M. J.
2007-01-01
Curriculum design is a concern in European Universities as they face the forthcoming European Higher Education Area (EHEA). This process can be eased by the use of scientific tools such as Concept-Mapping Techniques (CMT) that extract and organize the most relevant information from experts' experience using statistics techniques, and helps a…
Damage Evaluation Based on a Wave Energy Flow Map Using Multiple PZT Sensors
Liu, Yaolu; Hu, Ning; Xu, Hong; Yuan, Weifeng; Yan, Cheng; Li, Yuan; Goda, Riu; Alamusi; Qiu, Jinhao; Ning, Huiming; Wu, Liangke
2014-01-01
A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage as well as to realize visualization of wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map when waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various damages in aluminum and carbon fiber reinforced plastic laminated plates were experimentally and numerically evaluated to validate this technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map. PMID:24463430
Xu, Yiming; Smith, Scot E; Grunwald, Sabine; Abd-Elrahman, Amr; Wani, Suhas P; Nair, Vimala D
2017-09-11
Digital soil mapping (DSM) is gaining momentum as a technique to help smallholder farmers secure soil security and food security in developing regions. However, communications of the digital soil mapping information between diverse audiences become problematic due to the inconsistent scale of DSM information. Spatial downscaling can make use of accessible soil information at relatively coarse spatial resolution to provide valuable soil information at relatively fine spatial resolution. The objective of this research was to disaggregate the coarse spatial resolution soil exchangeable potassium (K ex ) and soil total nitrogen (TN) base map into fine spatial resolution soil downscaled map using weighted generalized additive models (GAMs) in two smallholder villages in South India. By incorporating fine spatial resolution spectral indices in the downscaling process, the soil downscaled maps not only conserve the spatial information of coarse spatial resolution soil maps but also depict the spatial details of soil properties at fine spatial resolution. The results of this study demonstrated difference between the fine spatial resolution downscaled maps and fine spatial resolution base maps is smaller than the difference between coarse spatial resolution base maps and fine spatial resolution base maps. The appropriate and economical strategy to promote the DSM technique in smallholder farms is to develop the relatively coarse spatial resolution soil prediction maps or utilize available coarse spatial resolution soil maps at the regional scale and to disaggregate these maps to the fine spatial resolution downscaled soil maps at farm scale.
Diffusion Tensor Magnetic Resonance Imaging Strategies for Color Mapping of Human Brain Anatomy
Boujraf, Saïd
2018-01-01
Background: A color mapping of fiber tract orientation using diffusion tensor imaging (DTI) can be prominent in clinical practice. The goal of this paper is to perform a comparative study of visualized diffusion anisotropy in the human brain anatomical entities using three different color-mapping techniques based on diffusion-weighted imaging (DWI) and DTI. Methods: The first technique is based on calculating a color map from DWIs measured in three perpendicular directions. The second technique is based on eigenvalues derived from the diffusion tensor. The last technique is based on three eigenvectors corresponding to sorted eigenvalues derived from the diffusion tensor. All magnetic resonance imaging measurements were achieved using a 1.5 Tesla Siemens Vision whole body imaging system. A single-shot DW echoplanar imaging sequence used a Stejskal–Tanner approach. Trapezoidal diffusion gradients are used. The slice orientation was transverse. The basic measurement yielded a set of 13 images. Each series consists of a single image without diffusion weighting, besides two DWIs for each of the next six noncollinear magnetic field gradient directions. Results: The three types of color maps were calculated consequently using the DWI obtained and the DTI. Indeed, we established an excellent similarity between the image data in the color maps and the fiber directions of known anatomical structures (e.g., corpus callosum and gray matter). Conclusions: In the meantime, rotationally invariant quantities such as the eigenvectors of the diffusion tensor reflected better, the real orientation found in the studied tissue. PMID:29928631
Efficient morse decompositions of vector fields.
Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene
2008-01-01
Existing topology-based vector field analysis techniques rely on the ability to extract the individual trajectories such as fixed points, periodic orbits, and separatrices that are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCG's, while fast, are overly conservative and usually results in MCG's that are too coarse to be useful for the applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCG's than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCG's and the computational costs. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach in constructing better approximations of the images of the triangles in the meshes used for simulation.. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces including engine simulation data sets.
Digital Mapping Techniques '07 - Workshop Proceedings
Soller, David R.
2008-01-01
The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous year's meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.
Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa
2015-04-13
Current methods for the development of pelvic finite element (FE) models generally are based upon specimen specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R(2)=0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Kruse, Fred A.; Dwyer, John L.
1993-01-01
The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectra bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different AVIRIS calibration techniques to ground reflectance: an empirically-based method and an atmospheric model based method to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicate that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
Three Reading Comprehension Strategies: TELLS, Story Mapping, and QARs.
ERIC Educational Resources Information Center
Sorrell, Adrian L.
1990-01-01
Three reading comprehension strategies are presented to assist learning-disabled students: an advance organizer technique called "TELLS Fact or Fiction" used before reading a passage, a schema-based technique called "Story Mapping" used while reading, and a postreading method of categorizing questions called…
Advantage of spatial map ion imaging in the study of large molecule photodissociation
NASA Astrophysics Data System (ADS)
Lee, Chin; Lin, Yen-Cheng; Lee, Shih-Huang; Lee, Yin-Yu; Tseng, Chien-Ming; Lee, Yuan-Tseh; Ni, Chi-Kung
2017-07-01
The original ion imaging technique has low velocity resolution, and currently, photodissociation is mostly investigated using velocity map ion imaging. However, separating signals from the background (resulting from undissociated excited parent molecules) is difficult when velocity map ion imaging is used for the photodissociation of large molecules (number of atoms ≥ 10). In this study, we used the photodissociation of phenol at the S1 band origin as an example to demonstrate how our multimass ion imaging technique, based on modified spatial map ion imaging, can overcome this difficulty. The photofragment translational energy distribution obtained when multimass ion imaging was used differed considerably from that obtained when velocity map ion imaging and Rydberg atom tagging were used. We used conventional translational spectroscopy as a second method to further confirm the experimental results, and we conclude that data should be interpreted carefully when velocity map ion imaging or Rydberg atom tagging is used in the photodissociation of large molecules. Finally, we propose a modified velocity map ion imaging technique without the disadvantages of the current velocity map ion imaging technique.
Backwards compatible high dynamic range video compression
NASA Astrophysics Data System (ADS)
Dolzhenko, Vladimir; Chesnokov, Vyacheslav; Edirisinghe, Eran A.
2014-02-01
This paper presents a two layer CODEC architecture for high dynamic range video compression. The base layer contains the tone mapped video stream encoded with 8 bits per component which can be decoded using conventional equipment. The base layer content is optimized for rendering on low dynamic range displays. The enhancement layer contains the image difference, in perceptually uniform color space, between the result of inverse tone mapped base layer content and the original video stream. Prediction of the high dynamic range content reduces the redundancy in the transmitted data while still preserves highlights and out-of-gamut colors. Perceptually uniform colorspace enables using standard ratedistortion optimization algorithms. We present techniques for efficient implementation and encoding of non-uniform tone mapping operators with low overhead in terms of bitstream size and number of operations. The transform representation is based on human vision system model and suitable for global and local tone mapping operators. The compression techniques include predicting the transform parameters from previously decoded frames and from already decoded data for current frame. Different video compression techniques are compared: backwards compatible and non-backwards compatible using AVC and HEVC codecs.
NASA Astrophysics Data System (ADS)
Al-Ruzouq, R.; Shanableh, A.; Merabtene, T.
2015-04-01
In United Arab Emirates (UAE) domestic water consumption has increased rapidly over the last decade. The increased demand for high-quality water, create an urgent need to evaluate the groundwater production of aquifers. The development of a reasonable model for groundwater potential is therefore crucial for future systematic developments, efficient management, and sustainable use of groundwater resources. The objective of this study is to map the groundwater potential zones in northern part of UAE and assess the contributing factors for exploration of potential groundwater resources. Remote sensing data and geographic information system will be used to locate potential zones for groundwater. Various maps (i.e., base, soil, geological, Hydro-geological, Geomorphologic Map, structural, drainage, slope, land use/land cover and average annual rainfall map) will be prepared based on geospatial techniques. The groundwater availability of the basin will qualitatively classified into different classes based on its hydro-geo-morphological conditions. The land use/land cover map will be also prepared for the different seasons using a digital classification technique with a ground truth based on field investigation.
2D Presentation Techniques of Mind-maps for Blind Meeting Participants.
Pölzer, Stephan; Miesenberger, Klaus
2015-01-01
Mind-maps, used as ideation technique in co-located meetings (e.g. in brainstorming sessions), which meet with increased importance in business and education, show considerably accessibility challenges for blind meeting participants. Besides an overview of general aspects of accessibility issues in co-located meetings, this paper focuses on the design and development of alternative non-visual presentation techniques for mind-maps. The different aspects of serialized presentation techniques (e.g. treeview) for Braille and audio rendering and two dimensional presentation techniques (e.g. tactile two dimensional array matrix and edge-projection method [1]) are discussed based on the user feedback gathered in intermediate tests following a user centered design approach.
NASA Technical Reports Server (NTRS)
Eppler, Dean B.; Bleacher, Jacob F.; Evans, Cynthia A.; Feng, Wanda; Gruener, John; Hurwitz, Debra M.; Skinner, J. A., Jr.; Whitson, Peggy; Janoiko, Barbara
2013-01-01
Geologic maps integrate the distributions, contacts, and compositions of rock and sediment bodies as a means to interpret local to regional formative histories. Applying terrestrial mapping techniques to other planets is challenging because data is collected primarily by orbiting instruments, with infrequent, spatiallylimited in situ human and robotic exploration. Although geologic maps developed using remote data sets and limited "Apollo-style" field access likely contain inaccuracies, the magnitude, type, and occurrence of these are only marginally understood. This project evaluates the interpretative and cartographic accuracy of both field- and remote-based mapping approaches by comparing two 1:24,000 scale geologic maps of the San Francisco Volcanic Field (SFVF), north-central Arizona. The first map is based on traditional field mapping techniques, while the second is based on remote data sets, augmented with limited field observations collected during NASA Desert Research & Technology Studies (RATS) 2010 exercises. The RATS mission used Apollo-style methods not only for pre-mission traverse planning but also to conduct geologic sampling as part of science operation tests. Cross-comparison demonstrates that the Apollo-style map identifies many of the same rock units and determines a similar broad history as the field-based map. However, field mapping techniques allow markedly improved discrimination of map units, particularly unconsolidated surficial deposits, and recognize a more complex eruptive history than was possible using Apollo-style data. Further, the distribution of unconsolidated surface units was more obvious in the remote sensing data to the field team after conducting the fieldwork. The study raises questions about the most effective approach to balancing mission costs with the rate of knowledge capture, suggesting that there is an inflection point in the "knowledge capture curve" beyond which additional resource investment yields progressively smaller gains in geologic knowledge.
Lentle, Roger G.; Hulls, Corrin M.
2018-01-01
The uses and limitations of the various techniques of video spatiotemporal mapping based on change in diameter (D-type ST maps), change in longitudinal strain rate (L-type ST maps), change in area strain rate (A-type ST maps), and change in luminous intensity of reflected light (I-maps) are described, along with their use in quantifying motility of the wall of hollow structures of smooth muscle such as the gut. Hence ST-methods for determining the size, speed of propagation and frequency of contraction in the wall of gut compartments of differing geometric configurations are discussed. We also discuss the shortcomings and problems that are inherent in the various methods and the use of techniques to avoid or minimize them. This discussion includes, the inability of D-type ST maps to indicate the site of a contraction that does not reduce the diameter of a gut segment, the manipulation of axis [the line of interest (LOI)] of L-maps to determine the true axis of propagation of a contraction, problems with anterior curvature of gut segments and the use of adjunct image analysis techniques that enhance particular features of the maps. PMID:29686624
Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008
Soller, David R.
2009-01-01
The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
ERIC Educational Resources Information Center
Sunal, Cynthia Szymanski; Warash, Bobbi Gibson
Techniques for encouraging young children to discover the purpose and use of maps are discussed. Motor activity and topological studies form a base from which the teacher and children can build a mapping program of progressive sophistication. Concepts important to mapping include boundaries, regions, exteriors, interiors, holes, order, point of…
A self-trained classification technique for producing 30 m percent-water maps from Landsat data
Rover, Jennifer R.; Wylie, Bruce K.; Ji, Lei
2010-01-01
Small bodies of water can be mapped with moderate-resolution satellite data using methods where water is mapped as subpixel fractions using field measurements or high-resolution images as training datasets. A new method, developed from a regression-tree technique, uses a 30 m Landsat image for training the regression tree that, in turn, is applied to the same image to map subpixel water. The self-trained method was evaluated by comparing the percent-water map with three other maps generated from established percent-water mapping methods: (1) a regression-tree model trained with a 5 m SPOT 5 image, (2) a regression-tree model based on endmembers and (3) a linear unmixing classification technique. The results suggest that subpixel water fractions can be accurately estimated when high-resolution satellite data or intensively interpreted training datasets are not available, which increases our ability to map small water bodies or small changes in lake size at a regional scale.
Determination of a Limited Scope Network's Lightning Detection Efficiency
NASA Technical Reports Server (NTRS)
Rompala, John T.; Blakeslee, R.
2008-01-01
This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD together with information regarding: site signal detection thresholds, type of solution algorithm used, and range attenuation; to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
Investigating the Use of ICT-Based Concept Mapping Techniques on Creativity in Literacy Tasks
ERIC Educational Resources Information Center
Riley, Nigel R.; Ahlberg, Mauri
2004-01-01
The key research question in this small-scale study focuses on the effects that an ICT (information and communications technologies)-based concept mapping intervention has on creativity and writing achievement in 10-11-year-old primary age pupils. The data shows that pupils using a concept mapping intervention significantly improve their NFER…
NASA Astrophysics Data System (ADS)
Jha, Madan K.; Chowdary, V. M.; Chowdhury, Alivia
2010-11-01
An approach is presented for the evaluation of groundwater potential using remote sensing, geographic information system, geoelectrical, and multi-criteria decision analysis techniques. The approach divides the available hydrologic and hydrogeologic data into two groups, exogenous (hydrologic) and endogenous (subsurface). A case study in Salboni Block, West Bengal (India), uses six thematic layers of exogenous parameters and four thematic layers of endogenous parameters. These thematic layers and their features were assigned suitable weights which were normalized by analytic hierarchy process and eigenvector techniques. The layers were then integrated using ArcGIS software to generate two groundwater potential maps. The hydrologic parameters-based groundwater potential zone map indicated that the `good' groundwater potential zone covers 27.14% of the area, the `moderate' zone 45.33%, and the `poor' zone 27.53%. A comparison of this map with the groundwater potential map based on subsurface parameters revealed that the hydrologic parameters-based map accurately delineates groundwater potential zones in about 59% of the area, and hence it is dependable to a certain extent. More than 80% of the study area has moderate-to-poor groundwater potential, which necessitates efficient groundwater management for long-term water security. Overall, the integrated technique is useful for the assessment of groundwater resources at a basin or sub-basin scale.
NASA Astrophysics Data System (ADS)
Lewis, Donna L.; Phinn, Stuart
2011-01-01
Aerial photography interpretation is the most common mapping technique in the world. However, unlike an algorithm-based classification of satellite imagery, accuracy of aerial photography interpretation generated maps is rarely assessed. Vegetation communities covering an area of 530 km2 on Bullo River Station, Northern Territory, Australia, were mapped using an interpretation of 1:50,000 color aerial photography. Manual stereoscopic line-work was delineated at 1:10,000 and thematic maps generated at 1:25,000 and 1:100,000. Multivariate and intuitive analysis techniques were employed to identify 22 vegetation communities within the study area. The accuracy assessment was based on 50% of a field dataset collected over a 4 year period (2006 to 2009) and the remaining 50% of sites were used for map attribution. The overall accuracy and Kappa coefficient for both thematic maps was 66.67% and 0.63, respectively, calculated from standard error matrices. Our findings highlight the need for appropriate scales of mapping and accuracy assessment of aerial photography interpretation generated vegetation community maps.
Choi, Kyongsik; Chon, James W; Gu, Min; Lee, Byoungho
2007-08-20
In this paper, a simple confocal laser scanning microscopic (CLSM) image mapping technique based on the finite-difference time domain (FDTD) calculation has been proposed and evaluated for characterization of a subwavelength-scale three-dimensional (3D) void structure fabricated inside polymer matrix. The FDTD simulation method adopts a focused Gaussian beam incident wave, Berenger's perfectly matched layer absorbing boundary condition, and the angular spectrum analysis method. Through the well matched simulation and experimental results of the xz-scanned 3D void structure, we first characterize the exact position and the topological shape factor of the subwavelength-scale void structure, which was fabricated by a tightly focused ultrashort pulse laser. The proposed CLSM image mapping technique based on the FDTD can be widely applied from the 3D near-field microscopic imaging, optical trapping, and evanescent wave phenomenon to the state-of-the-art bio- and nanophotonics.
Accuracy of vertical radial plume mapping technique in measuring lagoon gas emission
USDA-ARS?s Scientific Manuscript database
Recently, the U.S. Environmental Protection Agency (USEPA) posted a ground-based optical remote sensing method on its website called OTM 10 for measuring fugitive gas emission flux from area sources such as closed landfills. The OTM 10 utilizes the vertical radial plume mapping (VRPM) technique to c...
Mapping and Managing Knowledge and Information in Resource-Based Learning
ERIC Educational Resources Information Center
Tergan, Sigmar-Olaf; Graber, Wolfgang; Neumann, Anja
2006-01-01
In resource-based learning scenarios, students are often overwhelmed by the complexity of task-relevant knowledge and information. Techniques for the external interactive representation of individual knowledge in graphical format may help them to cope with complex problem situations. Advanced computer-based concept-mapping tools have the potential…
A combination of selected mapping and clipping to increase energy efficiency of OFDM systems
Lee, Byung Moo; Rim, You Seung
2017-01-01
We propose an energy efficient combination design for OFDM systems based on selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and show the related energy efficiency (EE) performance analysis. The combination of two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it can take advantages of both techniques. For the combination, we choose the clipping and SLM techniques, since the former technique is quite simple and effective, and the latter technique does not cause any signal distortion. We provide the structure and the systematic operating method, and show the various analyzes to derive the EE gain based on the combined technique. Our analysis show that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to only using SLM technique. PMID:29023591
Identification of understory invasive exotic plants with remote sensing in urban forests
NASA Astrophysics Data System (ADS)
Shouse, Michael; Liang, Liang; Fei, Songlin
2013-04-01
Invasive exotic plants (IEP) pose a significant threat to many ecosystems. To effectively manage IEP, it is important to efficiently detect their presences and determine their distribution patterns. Remote sensing has been a useful tool to map IEP but its application is limited in urban forests, which are often the sources and sinks for IEP. In this study, we examined the feasibility and tradeoffs of species level IEP mapping using multiple remote sensing techniques in a highly complex urban forest setting. Bush honeysuckle (Lonicera maackii), a pervasive IEP in eastern North America, was used as our modeling species. Both medium spatial resolution (MSR) and high spatial resolution (HSR) imagery were employed in bush honeysuckle mapping. The importance of spatial scale was also examined using an up-scaling simulation from the HSR object based classification. Analysis using both MSR and HSR imagery provided viable results for IEP distribution mapping in urban forests. Overall mapping accuracy ranged from 89.8% to 94.9% for HSR techniques and from 74.6% to 79.7% for MSR techniques. As anticipated, classification accuracy reduces as pixel size increases. HSR based techniques produced the most desirable results, therefore is preferred for precise management of IEP in heterogeneous environment. However, the use of MSR techniques should not be ruled out given their wide availability and moderate accuracy.
Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010
Soller, David R.; Soller, David R.
2012-01-01
The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
Murayama, Tomonori; Nakajima, Jun
2016-01-01
Anatomical segmentectomies play an important role in oncological lung resection, particularly for ground-glass types of primary lung cancers. This operation can also be applied to metastatic lung tumors deep in the lung. Virtual assisted lung mapping (VAL-MAP) is a novel technique that allows for bronchoscopic multi-spot dye markings to provide “geometric information” to the lung surface, using three-dimensional virtual images. In addition to wedge resections, VAL-MAP has been found to be useful in thoracoscopic segmentectomies, particularly complex segmentectomies, such as combined subsegmentectomies or extended segmentectomies. There are five steps in VAL-MAP-assisted segmentectomies: (I) “standing” stitches along the resection lines; (II) cleaning hilar anatomy; (III) confirming hilar anatomy; (IV) going 1 cm deeper; (V) step-by-step stapling technique. Depending on the anatomy, segmentectomies can be classified into linear (lingular, S6, S2), V- or U-shaped (right S1, left S3, S2b + S3a), and three dimensional (S7, S8, S9, S10) segmentectomies. Particularly three dimensional segmentectomies are challenging in the complexity of stapling techniques. This review focuses on how VAL-MAP can be utilized in segmentectomy, and how this technique can assist the stapling process in even the most challenging ones. PMID:28066675
Historical evolution of disease mapping in general and specifically of cancer mapping.
Howe, G M
1989-01-01
The presentation of areal data in epidemiology is illustrated by such mapping techniques as dots (spots), shading (choropleth, thematic) and isolines (isopleths). Examples are also given of computer-assisted cartography (computer graphics) which employs hardware and software components of digital computers, together with the use of geographical and demographic base maps.
The role of photogeologic mapping in traverse planning: Lessons from DRATS 2010 activities
Skinner, James A.; Fortezzo, Corey M.
2013-01-01
We produced a 1:24,000 scale photogeologic map of the Desert Research and Technology Studies (DRATS) 2010 simulated lunar mission traverse area and surrounding environments located within the northeastern part of the San Francisco Volcanic Field (SFVF), north-central Arizona. To mimic an exploratory mission, we approached the region “blindly” by rejecting prior knowledge or preconceived notions of the regional geologic setting and focused instead only on image and topographic base maps that were intended to be equivalent to pre-cursor mission “orbital returns”. We used photogeologic mapping techniques equivalent to those employed during the construction of modern planetary geologic maps. Based on image and topographic base maps, we identified 4 surficial units (talus, channel, dissected, and plains units), 5 volcanic units (older cone, younger cone, older flow, younger flow, and block field units), and 5 basement units (grey-toned mottled, red-toned platy, red-toned layered, light-toned slabby, and light-toned layered units). Comparison of our remote-based map units with published field-based map units indicates that the two techniques yield pervasively similar results of contrasting detail, with higher accuracies linked to remote-based units that have high topographic relief and tonal contrast relative to adjacent units. We list key scientific questions that remained after photogeologic mapping and prior to DRATS activities and identify 13 specific observations that the crew and science team would need to make in order to address those questions and refine the interpreted geologic context. We translated potential observations into 62 recommended sites for visitation and observation during the mission traverse. The production and use of a mission-specific photogeologic map for DRATS 2010 activities resulted in strategic and tactical recommendations regarding observational context and hypothesis tracking over the course of an exploratory mission.
Procedures for woody vegetation surveys in the Kazgail rural council area, Kordofan, Sudan
Falconer, Allan; Cross, Matthew D.; Orr, Donald G.
1990-01-01
Efforts to reforest parts of the Kordofan Province of Sudan are receiving support from international development agencies. These efforts include planning and implementing reforestation activities that require the collection of natural resources and socioeconomic data, and the preparation of base maps. A combination of remote sensing, geographic information system and global positioning systems procedures are used in this study to meet these requirements.Remote sensing techniques were used to provide base maps and to guide the compilation of vegetation resources maps. These techniques provided a rapid and efficient method for documenting available resources. Pocket‐sized global positioning system units were used to establish the location of field data collected for mapping and resource analysis. A microcomputer data management system tabulated and displayed the field data. The resulting system for data analysis, management, and planning has been adopted for the mapping and inventory of the Gum Belt of Sudan.
Green Map Exercises as an Avenue for Problem-Based Learning in a Data-Rich Environment
ERIC Educational Resources Information Center
Tulloch, David; Graff, Elizabeth
2007-01-01
This article describes a series of data-based Green Map learning exercises positioned within a problem-based framework and examines the appropriateness of projects like these as a form of geography education. Problem-based learning (PBL) is an educational technique that engages students in learning through activities that require creative problem…
2009-01-01
Background Expressed sequence tags (ESTs) are an important source of gene-based markers such as those based on insertion-deletions (Indels) or single-nucleotide polymorphisms (SNPs). Several gel based methods have been reported for the detection of sequence variants, however they have not been widely exploited in common bean, an important legume crop of the developing world. The objectives of this project were to develop and map EST based markers using analysis of single strand conformation polymorphisms (SSCPs), to create a transcript map for common bean and to compare synteny of the common bean map with sequenced chromosomes of other legumes. Results A set of 418 EST based amplicons were evaluated for parental polymorphisms using the SSCP technique and 26% of these presented a clear conformational or size polymorphism between Andean and Mesoamerican genotypes. The amplicon based markers were then used for genetic mapping with segregation analysis performed in the DOR364 × G19833 recombinant inbred line (RIL) population. A total of 118 new marker loci were placed into an integrated molecular map for common bean consisting of 288 markers. Of these, 218 were used for synteny analysis and 186 presented homology with segments of the soybean genome with an e-value lower than 7 × 10-12. The synteny analysis with soybean showed a mosaic pattern of syntenic blocks with most segments of any one common bean linkage group associated with two soybean chromosomes. The analysis with Medicago truncatula and Lotus japonicus presented fewer syntenic regions consistent with the more distant phylogenetic relationship between the galegoid and phaseoloid legumes. Conclusion The SSCP technique is a useful and inexpensive alternative to other SNP or Indel detection techniques for saturating the common bean genetic map with functional markers that may be useful in marker assisted selection. In addition, the genetic markers based on ESTs allowed the construction of a transcript map and given their high conservation between species allowed synteny comparisons to be made to sequenced genomes. This synteny analysis may support positional cloning of target genes in common bean through the use of genomic information from these other legumes. PMID:20030833
A note on chaotic unimodal maps and applications.
Zhou, C T; He, X T; Yu, M Y; Chew, L Y; Wang, X G
2006-09-01
Based on the word-lift technique of symbolic dynamics of one-dimensional unimodal maps, we investigate the relation between chaotic kneading sequences and linear maximum-length shift-register sequences. Theoretical and numerical evidence that the set of the maximum-length shift-register sequences is a subset of the set of the universal sequence of one-dimensional chaotic unimodal maps is given. By stabilizing unstable periodic orbits on superstable periodic orbits, we also develop techniques to control the generation of long binary sequences.
NASA Astrophysics Data System (ADS)
Madrucci, Vanessa; Taioli, Fabio; de Araújo, Carlos César
2008-08-01
SummaryThis paper presents the groundwater favorability mapping on a fractured terrain in the eastern portion of São Paulo State, Brazil. Remote sensing, airborne geophysical data, photogeologic interpretation, geologic and geomorphologic maps and geographic information system (GIS) techniques have been used. The results of cross-tabulation between these maps and well yield data allowed groundwater prospective parameters in a fractured-bedrock aquifer. These prospective parameters are the base for the favorability analysis whose principle is based on the knowledge-driven method. The multicriteria analysis (weighted linear combination) was carried out to give a groundwater favorability map, because the prospective parameters have different weights of importance and different classes of each parameter. The groundwater favorability map was tested by cross-tabulation with new well yield data and spring occurrence. The wells with the highest values of productivity, as well as all the springs occurrence are situated in the excellent and good favorability mapped areas. It shows good coherence between the prospective parameters and the well yield and the importance of GIS techniques for definition of target areas for detail study and wells location.
Influence of pansharpening techniques in obtaining accurate vegetation thematic maps
NASA Astrophysics Data System (ADS)
Ibarrola-Ulzurrun, Edurne; Gonzalo-Martin, Consuelo; Marcello-Ruiz, Javier
2016-10-01
In last decades, there have been a decline in natural resources, becoming important to develop reliable methodologies for their management. The appearance of very high resolution sensors has offered a practical and cost-effective means for a good environmental management. In this context, improvements are needed for obtaining higher quality of the information available in order to get reliable classified images. Thus, pansharpening enhances the spatial resolution of the multispectral band by incorporating information from the panchromatic image. The main goal in the study is to implement pixel and object-based classification techniques applied to the fused imagery using different pansharpening algorithms and the evaluation of thematic maps generated that serve to obtain accurate information for the conservation of natural resources. A vulnerable heterogenic ecosystem from Canary Islands (Spain) was chosen, Teide National Park, and Worldview-2 high resolution imagery was employed. The classes considered of interest were set by the National Park conservation managers. 7 pansharpening techniques (GS, FIHS, HCS, MTF based, Wavelet `à trous' and Weighted Wavelet `à trous' through Fractal Dimension Maps) were chosen in order to improve the data quality with the goal to analyze the vegetation classes. Next, different classification algorithms were applied at pixel-based and object-based approach, moreover, an accuracy assessment of the different thematic maps obtained were performed. The highest classification accuracy was obtained applying Support Vector Machine classifier at object-based approach in the Weighted Wavelet `à trous' through Fractal Dimension Maps fused image. Finally, highlight the difficulty of the classification in Teide ecosystem due to the heterogeneity and the small size of the species. Thus, it is important to obtain accurate thematic maps for further studies in the management and conservation of natural resources.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights. PMID:25843987
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
A campus-based course in field geology
NASA Astrophysics Data System (ADS)
Richard, G. A.; Hanson, G. N.
2009-12-01
GEO 305: Field Geology offers students practical experience in the field and in the computer laboratory conducting geological field studies on the Stony Brook University campus. Computer laboratory exercises feature mapping techniques and field studies of glacial and environmental geology, and include geophysical and hydrological analysis, interpretation, and mapping. Participants learn to use direct measurement and mathematical techniques to compute the location and geometry of features and gain practical experience in representing raster imagery and vector geographic data as features on maps. Data collecting techniques in the field include the use of hand-held GPS devices, compasses, ground-penetrating radar, tape measures, pacing, and leveling devices. Assignments that utilize these skills and techniques include mapping campus geology with GPS, using Google Earth to explore our geologic context, data file management and ArcGIS, tape and compass mapping of woodland trails, pace and compass mapping of woodland trails, measuring elevation differences on a hillside, measuring geologic sections and cores, drilling through glacial deposits, using ground penetrating radar on glaciotectonic topography, mapping the local water table, and the identification and mapping of boulders. Two three-hour sessions are offered per week, apportioned as needed between lecture; discussion; guided hands-on instruction in geospatial and other software such as ArcGIS, Google Earth, spreadsheets, and custom modules such as an arc intersection calculator; outdoor data collection and mapping; and writing of illustrated reports.
USDA-ARS?s Scientific Manuscript database
Ultra high resolution digital aerial photography has great potential to complement or replace ground measurements of vegetation cover for rangeland monitoring and assessment. We investigated object-based image analysis (OBIA) techniques for classifying vegetation in southwestern U.S. arid rangelands...
Subsurface Mapping: A Question of Position and Interpretation
ERIC Educational Resources Information Center
Kellie, Andrew C.
2009-01-01
This paper discusses the character and challenges inherent in the graphical portrayal of features in subsurface mapping. Subsurface structures are, by their nature, hidden and must be mapped based on drilling and/or geophysical data. Efficient use of graphical techniques is central to effectively communicating the results of expensive exploration…
Concept Mapping: A Critical Thinking Technique
ERIC Educational Resources Information Center
Harris, Charles M.; Zha, Shenghua
2013-01-01
Concept mapping, graphically depicting the structure of abstract concepts, is based on the observation that pictures and line drawings are often more easily comprehended than the words that represent an abstract concept. The efficacy of concept mapping for facilitating critical thinking was assessed in four sections of an introductory psychology…
Toward standardized mapping for left atrial analysis and cardiac ablation guidance
NASA Astrophysics Data System (ADS)
Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.
2014-03-01
In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure, and for this reason, have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population based studies. Standardized maps are important tools for characterizing anatomic variability across subjects with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects which are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the location of the inferior ostia have higher variability than the superior ostia and the variability of the left atrial appendage is similar to the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.
Correction techniques for depth errors with stereo three-dimensional graphic displays
NASA Technical Reports Server (NTRS)
Parrish, Russell V.; Holden, Anthony; Williams, Steven P.
1992-01-01
Three-dimensional (3-D), 'real-world' pictorial displays that incorporate 'true' depth cues via stereopsis techniques have proved effective for displaying complex information in a natural way to enhance situational awareness and to improve pilot/vehicle performance. In such displays, the display designer must map the depths in the real world to the depths available with the stereo display system. However, empirical data have shown that the human subject does not perceive the information at exactly the depth at which it is mathematically placed. Head movements can also seriously distort the depth information that is embedded in stereo 3-D displays because the transformations used in mapping the visual scene to the depth-viewing volume (DVV) depend intrinsically on the viewer location. The goal of this research was to provide two correction techniques; the first technique corrects the original visual scene to the DVV mapping based on human perception errors, and the second (which is based on head-positioning sensor input data) corrects for errors induced by head movements. Empirical data are presented to validate both correction techniques. A combination of the two correction techniques effectively eliminates the distortions of depth information embedded in stereo 3-D displays.
Page layout analysis and classification for complex scanned documents
NASA Astrophysics Data System (ADS)
Erkilinc, M. Sezer; Jaber, Mustafa; Saber, Eli; Bauer, Peter; Depalov, Dejan
2011-09-01
A framework for region/zone classification in color and gray-scale scanned documents is proposed in this paper. The algorithm includes modules for extracting text, photo, and strong edge/line regions. Firstly, a text detection module which is based on wavelet analysis and Run Length Encoding (RLE) technique is employed. Local and global energy maps in high frequency bands of the wavelet domain are generated and used as initial text maps. Further analysis using RLE yields a final text map. The second module is developed to detect image/photo and pictorial regions in the input document. A block-based classifier using basis vector projections is employed to identify photo candidate regions. Then, a final photo map is obtained by applying probabilistic model based on Markov random field (MRF) based maximum a posteriori (MAP) optimization with iterated conditional mode (ICM). The final module detects lines and strong edges using Hough transform and edge-linkages analysis, respectively. The text, photo, and strong edge/line maps are combined to generate a page layout classification of the scanned target document. Experimental results and objective evaluation show that the proposed technique has a very effective performance on variety of simple and complex scanned document types obtained from MediaTeam Oulu document database. The proposed page layout classifier can be used in systems for efficient document storage, content based document retrieval, optical character recognition, mobile phone imagery, and augmented reality.
Voxel-based lesion mapping of meningioma: a comprehensive lesion location mapping of 260 lesions.
Hirayama, Ryuichi; Kinoshita, Manabu; Arita, Hideyuki; Kagawa, Naoki; Kishima, Haruhiko; Hashimoto, Naoya; Fujimoto, Yasunori; Yoshimine, Toshiki
2018-06-01
OBJECTIVE In the present study the authors aimed to determine preferred locations of meningiomas by avoiding descriptive analysis and instead using voxel-based lesion mapping and 3D image-rendering techniques. METHODS Magnetic resonance images obtained in 248 treatment-naïve meningioma patients with 260 lesions were retrospectively and consecutively collected. All images were registered to a 1-mm isotropic, high-resolution, T1-weighted brain atlas provided by the Montreal Neurological Institute (the MNI152), and a lesion frequency map was created, followed by 3D volume rendering to visualize the preferred locations of meningiomas in 3D. RESULTS The 3D lesion frequency map clearly showed that skull base structures such as parasellar, sphenoid wing, and petroclival regions were commonly affected by the tumor. The middle one-third of the superior sagittal sinus was most commonly affected in parasagittal tumors. Substantial lesion accumulation was observed around the leptomeninges covering the central sulcus and the sylvian fissure, with very few lesions observed at the frontal, parietal, and occipital convexities. CONCLUSIONS Using an objective visualization method, meningiomas were shown to be located around the middle third of the superior sagittal sinus, the perisylvian convexity, and the skull base. These observations, which are in line with previous descriptive analyses, justify further use of voxel-based lesion mapping techniques to help understand the biological nature of this disease.
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Kerber, A. G.; Sellers, P. J.
1993-01-01
Spatial averaging errors which may occur when creating hemispherical reflectance maps for different cover types from direct nadir technique to estimate the hemispherical reflectance are assessed by comparing the results with those obtained with a knowledge-based system called VEG (Kimes et al., 1991, 1992). It was found that hemispherical reflectance errors provided by using VEG are much less than those using the direct nadir techniques, depending on conditions. Suggestions are made concerning sampling and averaging strategies for creating hemispherical reflectance maps for photosynthetic, carbon cycle, and climate change studies.
Remote sensing techniques for conservation and management of natural vegetation ecosystems
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Verdesio, J. J.; Dossantos, J. R.
1981-01-01
The importance of using remote sensing techniques, in the visible and near-infrared ranges, for mapping, inventory, conservation and management of natural ecosystems is discussed. Some examples realized in Brazil or other countries are given to evaluate the products from orbital platform (MSS and RBV imagery of LANDSAT) and aerial level (photography) for ecosystems study. The maximum quantitative and qualitative information which can be obtained from each sensor, at different level, are discussed. Based on the developed experiments it is concluded that the remote sensing technique is a useful tool in mapping vegetation units, estimating biomass, forecasting and evaluation of fire damage, disease detection, deforestation mapping and change detection in land-use. In addition, remote sensing techniques can be used in controling implantation and planning natural/artificial regeneration.
Principles and techniques of polarimetric mapping.
NASA Technical Reports Server (NTRS)
Halajian, J.; Hallock, H.
1973-01-01
This paper introduces the concept and potential value of polarimetric maps and the techniques for generating these maps in operational remote sensing. The application-oriented polarimetric signature analyses in the literature are compiled, and several optical models are illustrated to bring out requirements of a sensor system for polarimetric mapping. By use of the concepts of Stokes parameters the descriptive specification of one sensor system is refined. The descriptive specification for a multichannel digital photometric-polarimetric mapper is based upon our experience with the present single channel device which includes the generation of polarimetric maps and pictures. High photometric accuracy and stability coupled with fast, accurate digital output has enabled us to overcome the handicap of taking sequential data from the same terrain.
Face recognition using 3D facial shape and color map information: comparison and combination
NASA Astrophysics Data System (ADS)
Godil, Afzal; Ressler, Sandy; Grother, Patrick
2004-08-01
In this paper, we investigate the use of 3D surface geometry for face recognition and compare it to one based on color map information. The 3D surface and color map data are from the CAESAR anthropometric database. We find that the recognition performance is not very different between 3D surface and color map information using a principal component analysis algorithm. We also discuss the different techniques for the combination of the 3D surface and color map information for multi-modal recognition by using different fusion approaches and show that there is significant improvement in results. The effectiveness of various techniques is compared and evaluated on a dataset with 200 subjects in two different positions.
Computerized data reduction techniques for nadir viewing remote sensors
NASA Technical Reports Server (NTRS)
Tiwari, S. N.; Gormsen, Barbara B.
1985-01-01
Computer resources have been developed for the analysis and reduction of MAPS experimental data from the OSTA-1 payload. The MAPS Research Project is concerned with the measurement of the global distribution of mid-tropospheric carbon monoxide. The measurement technique for the MAPS instrument is based on non-dispersive gas filter radiometer operating in the nadir viewing mode. The MAPS experiment has two passive remote sensing instruments, the prototype instrument which is used to measure tropospheric air pollution from aircraft platforms and the third generation (OSTA) instrument which is used to measure carbon monoxide in the mid and upper troposphere from space platforms. Extensive effort was also expended in support of the MAPS/OSTA-3 shuttle flight. Specific capabilities and resources developed are discussed.
ERIC Educational Resources Information Center
Kokko, Suvi; Lagerkvist, Carl Johan
2017-01-01
Using a case example of an innovative sanitation solution in a slum setting, this study explores the usefulness of the Zaltman Metaphor Elicitation Technique in a program planning and evaluation context. Using a qualitative image-based method to map people's mental models of ill-structured problems such as sanitation can aid program planners and…
NASA Astrophysics Data System (ADS)
Lerner, Michael G.; Meagher, Kristin L.; Carlson, Heather A.
2008-10-01
Use of solvent mapping, based on multiple-copy minimization (MCM) techniques, is common in structure-based drug discovery. The minima of small-molecule probes define locations for complementary interactions within a binding pocket. Here, we present improved methods for MCM. In particular, a Jarvis-Patrick (JP) method is outlined for grouping the final locations of minimized probes into physical clusters. This algorithm has been tested through a study of protein-protein interfaces, showing the process to be robust, deterministic, and fast in the mapping of protein "hot spots." Improvements in the initial placement of probe molecules are also described. A final application to HIV-1 protease shows how our automated technique can be used to partition data too complicated to analyze by hand. These new automated methods may be easily and quickly extended to other protein systems, and our clustering methodology may be readily incorporated into other clustering packages.
NASA Technical Reports Server (NTRS)
Mader, G. L.
1981-01-01
A technique for producing topographic information is described which is based on same side/same time viewing using a dissimilar combination of radar imagery and photographic images. Common geographic areas viewed from similar space reference locations produce scene elevation displacements in opposite direction and proper use of this characteristic can yield the perspective information necessary for determination of base to height ratios. These base to height ratios can in turn be used to produce a topographic map. A test area covering the Harrisburg, Pennsylvania region was observed by synthetic aperture radar on the Seasat satellite and by return beam vidicon on by the LANDSAT - 3 satellite. The techniques developed for the scaling re-orientation and common registration of the two images are presented along with the topographic determination data. Topographic determination based exclusively on the images content is compared to the map information which is used as a performance calibration base.
Time-efficient high-resolution whole-brain three-dimensional macromolecular proton fraction mapping
Yarnykh, Vasily L.
2015-01-01
Purpose Macromolecular proton fraction (MPF) mapping is a quantitative MRI method that reconstructs parametric maps of a relative amount of macromolecular protons causing the magnetization transfer (MT) effect and provides a biomarker of myelination in neural tissues. This study aimed to develop a high-resolution whole-brain MPF mapping technique utilizing a minimal possible number of source images for scan time reduction. Methods The described technique is based on replacement of an actually acquired reference image without MT saturation by a synthetic one reconstructed from R1 and proton density maps, thus requiring only three source images. This approach enabled whole-brain three-dimensional MPF mapping with isotropic 1.25×1.25×1.25 mm3 voxel size and scan time of 20 minutes. The synthetic reference method was validated against standard MPF mapping with acquired reference images based on data from 8 healthy subjects. Results Mean MPF values in segmented white and gray matter appeared in close agreement with no significant bias and small within-subject coefficients of variation (<2%). High-resolution MPF maps demonstrated sharp white-gray matter contrast and clear visualization of anatomical details including gray matter structures with high iron content. Conclusions Synthetic reference method improves resolution of MPF mapping and combines accurate MPF measurements with unique neuroanatomical contrast features. PMID:26102097
Appropriating Invention through Concept Maps in Writing for Multimedia and the Web
ERIC Educational Resources Information Center
Bacabac, Florence Elizabeth
2015-01-01
As an alternative approach to web preproduction, I propose the use of concept maps for invention of website projects in business and professional writing courses. This mapping device approximates our students' initial site plans since rough ideas are formed based on a substantial exploratory technique. Incorporated in various disciplines, the…
High-resolution carbon mapping on the million-hectare Island of Hawaii
Gregory P. Asner; R. Flint Hughes; Joseph Mascaro; Amanda L. Uowolo; David E. Knapp; James Jacobson; Ty Kennedy-Bowdoin; John K . Clark
2011-01-01
Current markets and international agreements for reducing emissions from deforestation and forest degradation (REDD) rely on carbon (C) monitoring techniques. Combining field measurements, airborne light detection and ranging (LiDAR)-based observations, and satellite-based imagery, we developed a 30-meter-resolution map of aboveground C density spanning 40 vegetation...
Spotlight-Mode Synthetic Aperture Radar Processing for High-Resolution Lunar Mapping
NASA Technical Reports Server (NTRS)
Harcke, Leif; Weintraub, Lawrence; Yun, Sang-Ho; Dickinson, Richard; Gurrola, Eric; Hensley, Scott; Marechal, Nicholas
2010-01-01
During the 2008-2009 year, the Goldstone Solar System Radar was upgraded to support radar mapping of the lunar poles at 4 m resolution. The finer resolution of the new system and the accompanying migration through resolution cells called for spotlight, rather than delay-Doppler, imaging techniques. A new pre-processing system supports fast-time Doppler removal and motion compensation to a point. Two spotlight imaging techniques which compensate for phase errors due to i) out of focus-plane motion of the radar and ii) local topography, have been implemented and tested. One is based on the polar format algorithm followed by a unique autofocus technique, the other is a full bistatic time-domain backprojection technique. The processing system yields imagery of the specified resolution. Products enabled by this new system include topographic mapping through radar interferometry, and change detection techniques (amplitude and coherent change) for geolocation of the NASA LCROSS mission impact site.
NASA Technical Reports Server (NTRS)
Smedes, H. W. (Principal Investigator); Root, R. R.; Roller, N. E. G.; Despain, D.
1978-01-01
The author has identified the following significant results. A terrain map of Yellowstone National Park showed plant community types and other classes of ground cover in what is basically a wild land. The map comprised 12 classes, six of which were mapped with accuracies of 70 to 95%. The remaining six classes had spectral reflectances that overlapped appreciably, and hence, those were mapped less accurately. Techniques were devised for quantitatively comparing the recognition map of the park with control data acquired from ground inspection and from analysis of sidelooking radar images, a thermal IR mosaic, and IR aerial photos of several scales. Quantitative analyses were made in ten 40 sq km test areas. Comparison mechanics were performed by computer with the final results displayed on line printer output. Forested areas were mapped by computer using ERTS data for less than 1/4 the cost of the conventional forest mapping technique for topographic base maps.
Magnetic mapping of iron in rodent spleen
Blissett, Angela R.; Ollander, Brooke; Penn, Brittany; McTigue, Dana M.; Agarwal, Gunjan
2016-01-01
Evaluation of iron distribution and density in biological tissues is important to understand the pathogenesis of a variety of diseases and the fate of exogenously administered iron-based carriers and contrast agents. Iron distribution in tissues is typically characterized via histochemical (Perl’s) stains or immunohistochemistry for ferritin, the major iron storage protein. A more accurate mapping of iron can be achieved via ultrastructural transmission electron microscopy (TEM) based techniques, which involve stringent sample preparation conditions. In this study, we elucidate the capability of magnetic force microscopy (MFM) as a label-free technique to map iron at the nanoscale level in rodent spleen tissue. We complemented and compared our MFM results with those obtained using Perl’s staining and TEM. Our results show how MFM mapping corresponded to sizes of iron-rich lysosomes at a resolution comparable to that of TEM. In addition MFM is compatible with tissue sections commonly prepared for routine histology. PMID:27890658
Assessing MODIS-based Products and Techniques for Detecting Gypsy Moth Defoliation
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Hargrove, William; Smoot, James C.; Prados, Don; McKellip, Rodney; Sader, Steven A.; Gasser, Jerry; May, George
2008-01-01
The project showed potential of MODIS and VIIRS time series data for contributing defoliation detection products to the USFS forest threat early warning system. This study yielded the first satellite-based wall-to-wall 2001 gypsy moth defoliation map for the study area. Initial results led to follow-on work to map 2007 gypsy moth defoliation over the eastern United States (in progress). MODIS-based defoliation maps offer promise for aiding aerial sketch maps either in planning surveys and/or adjusting acreage estimates of annual defoliation. More work still needs to be done to assess potential of technology for "now casts"of defoliation.
Increased-resolution OCT thickness mapping of the human macula: a statistically based registration.
Bernardes, Rui; Santos, Torcato; Cunha-Vaz, José
2008-05-01
To describe the development of a technique that enhances spatial resolution of retinal thickness maps of the Stratus OCT (Carl Zeiss Meditec, Inc., Dublin, CA). A retinal thickness atlas (RT-atlas) template was calculated, and a macular coordinate system was established, to pursue this objective. The RT-atlas was developed from principal component analysis of retinal thickness analyzer (RTA) maps acquired from healthy volunteers. The Stratus OCT radial thickness measurements were registered on the RT-atlas, from which an improved macular thickness map was calculated. Thereafter, Stratus OCT circular scans were registered on the previously calculated map to enhance spatial resolution. The developed technique was applied to Stratus OCT thickness data from healthy volunteers and from patients with diabetic retinopathy (DR) or age-related macular degeneration (AMD). Results showed that for normal, or close to normal, macular thickness maps from healthy volunteers and patients with DR, this technique can be an important aid in determining retinal thickness. Efforts are under way to improve the registration of retinal thickness data in patients with AMD. The developed technique enhances the evaluation of data acquired by the Stratus OCT, helping the detection of early retinal thickness abnormalities. Moreover, a normative database of retinal thickness measurements gained from this technique, as referenced to the macular coordinate system, can be created without errors induced by missed fixation and eye tilt.
Multi-Autonomous Ground-robotic International Challenge (MAGIC) 2010
2010-12-14
SLAM technique since this setup, having a LIDAR with long-range high-accuracy measurement capability, allows accurate localization and mapping more...achieve the accuracy of 25cm due to the use of multi-dimensional information. OGM is, similarly to SLAM , carried out by using LIDAR data. The OGM...a result of the development and implementation of the hybrid feature-based/scan-matching Simultaneous Localization and Mapping ( SLAM ) technique, the
Calculation of three-dimensional, inviscid, supersonic, steady flows
NASA Technical Reports Server (NTRS)
Moretti, G.
1981-01-01
A detailed description of a computational program for the evaluation of three dimensional supersonic, inviscid, steady flow past airplanes is presented. Emphasis was put on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and described. Results of computations based on sample geometrics and discussions are also presented.
Mapping yeast origins of replication via single-stranded DNA detection.
Feng, Wenyi; Raghuraman, M K; Brewer, Bonita J
2007-02-01
Studies in th Saccharomyces cerevisiae have provided a framework for understanding how eukaryotic cells replicate their chromosomal DNA to ensure faithful transmission of genetic information to their daughter cells. In particular, S. cerevisiae is the first eukaryote to have its origins of replication mapped on a genomic scale, by three independent groups using three different microarray-based approaches. Here we describe a new technique of origin mapping via detection of single-stranded DNA in yeast. This method not only identified the majority of previously discovered origins, but also detected new ones. We have also shown that this technique can identify origins in Schizosaccharomyces pombe, illustrating the utility of this method for origin mapping in other eukaryotes.
Patient-specific coronary territory maps
NASA Astrophysics Data System (ADS)
Beliveau, Pascale; Setser, Randolph; Cheriet, Farida; O'Donnell, Thomas
2007-03-01
It is standard practice for physicians to rely on empirical, population based models to define the relationship between regions of left ventricular (LV) myocardium and the coronary arteries which supply them with blood. Physicians use these models to infer the presence and location of disease within the coronary arteries based on the condition of the myocardium within their distribution (which can be established non-invasively using imaging techniques such as ultrasound or magnetic resonance imaging). However, coronary artery anatomy often varies from the assumed model distribution in the individual patient; thus, a non-invasive method to determine the correspondence between coronary artery anatomy and LV myocardium would have immediate clinical impact. This paper introduces an image-based rendering technique for visualizing maps of coronary distribution in a patient-specific approach. From an image volume derived from computed tomography (CT) images, a segmentation of the LV epicardial surface, as well as the paths of the coronary arteries, is obtained. These paths form seed points for a competitive region growing algorithm applied to the surface of the LV. A ray casting procedure in spherical coordinates from the center of the LV is then performed. The cast rays are mapped to a two-dimensional circular based surface forming our coronary distribution map. We applied our technique to a patient with known coronary artery disease and a qualitative evaluation by an expert in coronary cardiac anatomy showed promising results.
Lithology and aggregate quality attributes for the digital geologic map of Colorado
Knepper, Daniel H.; Green, Gregory N.; Langer, William H.
1999-01-01
This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map.
Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes
NASA Astrophysics Data System (ADS)
Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen
2016-06-01
Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique of LBS is the map building for unknown environments, this technique also named as simultaneous localization and mapping (SLAM) in robotic society. In this paper, we propose a novel approach for SLAMin dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV) with the help of the grid-based occupancy likelihood map. Instead of applying scan matching in two adjacent scans, we propose to match current scan with the occupancy likelihood map learned from all previous scans in multiple scales to avoid the accumulation of matching errors. Due to that the acquisition of the points in a scan is sequential but not simultaneous, there unavoidably exists the scan distortion at different extents. To compensate the scan distortion caused by the motion of the UGV, we propose to integrate a velocity of a laser range finder (LRF) into the scan matching optimization framework. Besides, to reduce the effect of dynamic objects such as walking pedestrians often existed in indoor scenes as much as possible, we propose a new occupancy likelihood map learning strategy by increasing or decreasing the probability of each occupancy grid after each scan matching. Experimental results in several challenged indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
Kamarudin, Kamarulzaman; Mamduh, Syed Muhammad; Shakaff, Ali Yeon Md; Zakaria, Ammar
2014-12-05
This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open source SLAM algorithms. The experiments were conducted in two different environments; a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of Windows-based SLAM algorithm with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of Kinect's depth sensor often causes the map to be inaccurate, especially in featureless areas, therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks.
Kamarudin, Kamarulzaman; Mamduh, Syed Muhammad; Shakaff, Ali Yeon Md; Zakaria, Ammar
2014-01-01
This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open source SLAM algorithms. The experiments were conducted in two different environments; a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of Windows-based SLAM algorithm with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of Kinect's depth sensor often causes the map to be inaccurate, especially in featureless areas, therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks. PMID:25490595
Northern Everglades, Florida, satellite image map
Thomas, Jean-Claude; Jones, John W.
2002-01-01
These satellite image maps are one product of the USGS Land Characteristics from Remote Sensing project, funded through the USGS Place-Based Studies Program with support from the Everglades National Park. The objective of this project is to develop and apply innovative remote sensing and geographic information system techniques to map the distribution of vegetation, vegetation characteristics, and related hydrologic variables through space and over time. The mapping and description of vegetation characteristics and their variations are necessary to accurately simulate surface hydrology and other surface processes in South Florida and to monitor land surface changes. As part of this research, data from many airborne and satellite imaging systems have been georeferenced and processed to facilitate data fusion and analysis. These image maps were created using image fusion techniques developed as part of this project.
ERIC Educational Resources Information Center
Jain, G. Panka; Gurupur, Varadraj P.; Schroeder, Jennifer L.; Faulkenberry, Eileen D.
2014-01-01
In this paper, we describe a tool coined as artificial intelligence-based student learning evaluation tool (AISLE). The main purpose of this tool is to improve the use of artificial intelligence techniques in evaluating a student's understanding of a particular topic of study using concept maps. Here, we calculate the probability distribution of…
TSEMA: interactive prediction of protein pairings between interacting families
Izarzugaza, José M. G.; Juan, David; Pons, Carles; Ranea, Juan A. G.; Valencia, Alfonso; Pazos, Florencio
2006-01-01
An entire family of methodologies for predicting protein interactions is based on the observed fact that families of interacting proteins tend to have similar phylogenetic trees due to co-evolution. One application of this concept is the prediction of the mapping between the members of two interacting protein families (which protein within one family interacts with which protein within the other). The idea is that the real mapping would be the one maximizing the similarity between the trees. Since the exhaustive exploration of all possible mappings is not feasible for large families, current approaches use heuristic techniques which do not ensure the best solution to be found. This is why it is important to check the results proposed by heuristic techniques and to manually explore other solutions. Here we present TSEMA, the server for efficient mapping assessment. This system calculates an initial mapping between two families of proteins based on a Monte Carlo approach and allows the user to interactively modify it based on performance figures and/or specific biological knowledge. All the explored mappings are graphically shown over a representation of the phylogenetic trees. The system is freely available at . Standalone versions of the software behind the interface are available upon request from the authors. PMID:16845017
Data selection techniques in the interpretation of MAGSAT data over Australia
NASA Technical Reports Server (NTRS)
Johnson, B. D.; Dampney, C. N. G.
1983-01-01
The MAGSAT data require critical selection in order to produce a self-consistent data set suitable for map construction and subsequent interpretation. Interactive data selection techniques are described which involve the use of a special-purpose profile-oriented data base and a colour graphics display. The careful application of these data selection techniques permits validation every data value and ensures that the best possible self-consistent data set is being used to construct the maps of the magnetic field measured at satellite altitudes over Australia.
Radiant thinking and the use of the mind map in nurse practitioner education.
Spencer, Julie R; Anderson, Kelley M; Ellis, Kathryn K
2013-05-01
The concept of radiant thinking, which led to the concept of mind mapping, promotes all aspects of the brain working in synergy, with thought beginning from a central point. The mind map, which is a graphical technique to improve creative thinking and knowledge attainment, utilizes colors, images, codes, and dimensions to amplify and enhance key ideas. This technique augments the visualization of relationships and links between concepts, which aids in information acquisition, data retention, and overall comprehension. Faculty can promote students' use of the technique for brainstorming, organizing ideas, taking notes, learning collaboratively, presenting, and studying. These applications can be used in problem-based learning, developing plans of care, health promotion activities, synthesizing disease processes, and forming differential diagnoses. Mind mapping is a creative way for students to engage in a unique method of learning that can expand memory recall and help create a new environment for processing information. Copyright 2013, SLACK Incorporated.
From a meso- to micro-scale connectome: array tomography and mGRASP
Rah, Jong-Cheol; Feng, Linqing; Druckmann, Shaul; Lee, Hojin; Kim, Jinhyun
2015-01-01
Mapping mammalian synaptic connectivity has long been an important goal of neuroscience because knowing how neurons and brain areas are connected underpins an understanding of brain function. Meeting this goal requires advanced techniques with single synapse resolution and large-scale capacity, especially at multiple scales tethering the meso- and micro-scale connectome. Among several advanced LM-based connectome technologies, Array Tomography (AT) and mammalian GFP-Reconstitution Across Synaptic Partners (mGRASP) can provide relatively high-throughput mapping synaptic connectivity at multiple scales. AT- and mGRASP-assisted circuit mapping (ATing and mGRASPing), combined with techniques such as retrograde virus, brain clearing techniques, and activity indicators will help unlock the secrets of complex neural circuits. Here, we discuss these useful new tools to enable mapping of brain circuits at multiple scales, some functional implications of spatial synaptic distribution, and future challenges and directions of these endeavors. PMID:26089781
Mapping wetlands on beaver flowages with 35-mm photography
Kirby, R.E.
1976-01-01
Beaver flowages and associated wetlands on the Chippewa National Forest, north-central Minnesota, were photographed from the ground and from the open side window of a small high-wing monoplane. The 35-mm High Speed Ektachrome transparencies obtained were used to map the cover-type associations visible on the aerial photographs. Nearly vertical aerial photos were rectified by projecting the slides onto a base map consisting ofcontrol points located by plane-table survey. Maps were prepared by tracing the recognizable stands of vegetation in the rectified projection at the desired map scale. Final map scales ranging from 1:260 to 1:571 permitted identification and mapping of 26 cover-type associations on 10 study flowages in 1971. This cover-mapping technique was economical and substituted for detailed ground surveys. Comparative data from 10 flowages were collected serially throughout the entire open-water season. Although developed for analysis of waterfowl habitat, the technique has application to other areas of wildlife management and ecological investigation.
Progress in diode-pumped alexandrite lasers as a new resource for future space lidar missions
NASA Astrophysics Data System (ADS)
Damzen, M. J.; Thomas, G. M.; Teppitaksak, A.; Minassian, A.
2017-11-01
Satellite-based remote sensing using laser-based lidar techniques provides a powerful tool for global 3-D mapping of atmospheric species (e.g. CO2, ozone, clouds, aerosols), physical attributes of the atmosphere (e.g. temperature, wind speed), and spectral indicators of Earth features (e.g. vegetation, water). Such information provides a valuable source for weather prediction, understanding of climate change, atmospheric science and health of the Earth eco-system. Similarly, laser-based altimetry can provide high precision ground topography mapping and more complex 3-D mapping (e.g. canopy height profiling). The lidar technique requires use of cutting-edge laser technologies and engineered designs that are capable of enduring the space environment over the mission lifetime. The laser must operate with suitably high electrical-to-optical efficiency and risk reduction strategy adopted to mitigate against laser failure or excessive operational degradation of laser performance.
Long term economic relationships from cointegration maps
NASA Astrophysics Data System (ADS)
Vicente, Renato; Pereira, Carlos de B.; Leite, Vitor B. P.; Caticha, Nestor
2007-07-01
We employ the Bayesian framework to define a cointegration measure aimed to represent long term relationships between time series. For visualization of these relationships we introduce a dissimilarity matrix and a map based on the sorting points into neighborhoods (SPIN) technique, which has been previously used to analyze large data sets from DNA arrays. We exemplify the technique in three data sets: US interest rates (USIR), monthly inflation rates and gross domestic product (GDP) growth rates.
Color reproduction system based on color appearance model and gamut mapping
NASA Astrophysics Data System (ADS)
Cheng, Fang-Hsuan; Yang, Chih-Yuan
2000-06-01
By the progress of computer, computer peripherals such as color monitor and printer are often used to generate color image. However, cross media color reproduction by human perception is usually different. Basically, the influence factors are device calibration and characterization, viewing condition, device gamut and human psychology. In this thesis, a color reproduction system based on color appearance model and gamut mapping is proposed. It consists of four parts; device characterization, color management technique, color appearance model and gamut mapping.
Secure positioning technique based on the encrypted visible light map
NASA Astrophysics Data System (ADS)
Lee, Y. U.; Jung, G.
2017-01-01
For overcoming the performance degradation problems of the conventional visible light (VL) positioning system, which are due to the co-channel interference by adjacent light and the irregularity of the VL reception position in the three dimensional (3-D) VL channel, the secure positioning technique based on the two dimensional (2-D) encrypted VL map is proposed, implemented as the prototype for the specific embedded positioning system, and verified by performance tests in this paper. It is shown from the test results that the proposed technique achieves the performance enhancement over 21.7% value better than the conventional one in the real positioning environment, and the well known PN code is the optimal stream encryption key for the good VL positioning.
Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro
2016-06-01
In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high.
Innovative Visualization Techniques applied to a Flood Scenario
NASA Astrophysics Data System (ADS)
Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael
2013-04-01
The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time sampled data sets, providing an integrated mechanism and that aids the user to collaboratively explore, present and communicate visually complex and dynamic data. Here we present these concepts in the context of a 4 hour flood scenario from Lisbon in 2010, with data that includes measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain two of these in more detail that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for colour legend, etc. and provide a snapshot of the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, includes multiple windows which interactively respond to the user selections, so that when selecting an object and changing it one window, it will automatically update in all the other windows. These concepts can be part of a collaborative platform, where multiple people share and work together on the data, via online access, which also allows its remote usage from a mobile platform. Storytelling augments analysis and decision-making capabilities allowing to assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs and scatter plots. The map provides flood and building information, on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization that incorporates the ideas of storytelling, web-based linked views, and other visualization techniques, for a 4 hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.
Training site statistics from Landsat and Seasat satellite imagery registered to a common map base
NASA Technical Reports Server (NTRS)
Clark, J.
1981-01-01
Landsat and Seasat satellite imagery and training site boundary coordinates were registered to a common Universal Transverse Mercator map base in the Newport Beach area of Orange County, California. The purpose was to establish a spatially-registered, multi-sensor data base which would test the use of Seasat synthetic aperture radar imagery to improve spectral separability of channels used for land use classification of an urban area. Digital image processing techniques originally developed for the digital mosaics of the California Desert and the State of Arizona were adapted to spatially register multispectral and radar data. Techniques included control point selection from imagery and USGS topographic quadrangle maps, control point cataloguing with the Image Based Information System, and spatial and spectral rectifications of the imagery. The radar imagery was pre-processed to reduce its tendency toward uniform data distributions, so that training site statistics for selected Landsat and pre-processed Seasat imagery indicated good spectral separation between channels.
NASA Technical Reports Server (NTRS)
Ho, C.; Wilson, B.; Mannucci, A.; Lindqwister, U.; Yuan, D.
1997-01-01
Global ionospheric mapping (GIM) is a new, emerging technique for determining global ionospheric TEC (total electron content) based on measurements from a worldwide network of Global Positioning System (GPS) receivers.
Secure positioning technique based on encrypted visible light map for smart indoor service
NASA Astrophysics Data System (ADS)
Lee, Yong Up; Jung, Gillyoung
2018-03-01
Indoor visible light (VL) positioning systems for smart indoor services are negatively affected by both cochannel interference from adjacent light sources and VL reception position irregularity in the three-dimensional (3-D) VL channel. A secure positioning methodology based on a two-dimensional (2-D) encrypted VL map is proposed, implemented in prototypes of the specific positioning system, and analyzed based on performance tests. The proposed positioning technique enhances the positioning performance by more than 21.7% compared to the conventional method in real VL positioning tests. Further, the pseudonoise code is found to be the optimal encryption key for secure VL positioning for this smart indoor service.
Mapping snags and understory shrubs for LiDAR based assessment of wildlife habitat suitability
Sebastian Martinuzzi; Lee A. Vierling; William A. Gould; Michael J. Falkowski; Jeffrey S. Evans; Andrew T. Hudak; Kerri T. Vierling
2009-01-01
The lack of maps depicting forest three-dimensional structure, particularly as pertaining to snags and understory shrub species distribution, is a major limitation for managing wildlife habitat in forests. Developing new techniques to remotely map snags and understory shrubs is therefore an important need. To address this, we first evaluated the use of LiDAR data for...
USDA-ARS?s Scientific Manuscript database
Fragaria iinumae is recognized as an ancestor of the octoploid strawberry species, including the cultivated strawberry, Fragaria ×ananassa. Here we report the construction of the first high density linkage map for F. iinumae. The map is based on two high-throughput techniques of single nucleotide p...
2006-01-01
information of the robot (Figure 1) acquired via laser-based localization techniques. The results are maps of the global soundscape . The algorithmic...environments than noise maps. Furthermore, provided the acoustic localization algorithm can detect the sources, the soundscape can be mapped with many...gathering information about the auditory soundscape in which it is working. In addition to robustness in the presence of noise, it has also been
NASA Astrophysics Data System (ADS)
Mogaji, K. A.
2017-04-01
Producing a bias-free vulnerability assessment map model is significantly needed for planning a scheme of groundwater quality protection. This study developed a GIS-based AHPDST vulnerability index model for producing groundwater vulnerability model map in the hard rock terrain, Nigeria by exploiting the potentials of analytic hierarchy process (AHP) and Dempster-Shafer theory (DST) data mining models. The acquired borehole and geophysical data in the study area were processed to derive five groundwater vulnerability conditioning factors (GVCFs), namely recharge rate, aquifer transmissivity, hydraulic conductivity, transverse resistance and longitudinal conductance. The produced GVCFs' thematic maps were multi-criterially analyzed by employing the mechanisms of AHP and DST models to determine the normalized weight ( W) parameter for the GVCFs and mass function factors (MFFs) parameter for the GVCFs' thematic maps' class boundaries, respectively. Based on the application of the weighted linear average technique, the determined W and MFFs parameters were synthesized to develop groundwater vulnerability potential index (GVPI)-based AHPDST model algorithm. The developed model was applied to establish four GVPI mass/belief function indices. The estimates based on the applied GVPI belief function indices were processed in GIS environment to create prospective groundwater vulnerability potential index maps. The most representative of the resulting vulnerability maps (the GVPIBel map) was considered for producing the groundwater vulnerability potential zones (GVPZ) map for the area. The produced GVPZ map established 48 and 52% of the areal extent to be covered by the lows/moderate and highs vulnerable zones, respectively. The success and the prediction rates of the produced GVPZ map were determined using the relative operating characteristics technique to give 82.3 and 77.7%, respectively. The analyzed results reveal that the developed GVPI-based AHPDST model algorithm is capable of producing efficient groundwater vulnerability potential zones prediction map and characterizing the predicted zones uncertainty via the DST mechanism processes in the area. The produced GVPZ map in this study can be used by decision makers to formulate appropriate groundwater management strategies and the approach may be well opted in other hard rock regions of the world, especially in economically poor nations.
Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem
NASA Astrophysics Data System (ADS)
Zhang, Caiyun
2015-06-01
Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary imagery processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of outcomes from three classifiers. The framework was tested for classifying a group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved with an overall accuracy of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
South Florida Everglades: satellite image map
Jones, John W.; Thomas, Jean-Claude; Desmond, G.B.
2001-01-01
These satellite image maps are one product of the USGS Land Characteristics from Remote Sensing project, funded through the USGS Place-Based Studies Program (http://access.usgs.gov/) with support from the Everglades National Park (http://www.nps.gov/ever/). The objective of this project is to develop and apply innovative remote sensing and geographic information system techniques to map the distribution of vegetation, vegetation characteristics, and related hydrologic variables through space and over time. The mapping and description of vegetation characteristics and their variations are necessary to accurately simulate surface hydrology and other surface processes in South Florida and to monitor land surface changes. As part of this research, data from many airborne and satellite imaging systems have been georeferenced and processed to facilitate data fusion and analysis. These image maps were created using image fusion techniques developed as part of this project.
Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon
2014-04-15
For the effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting the suitable location of the PV system installation. Therefore, this study aimed to develop a framework for the mapping of the MADSR using an advanced case-based reasoning (CBR) and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for the mapping of the MADSR is set, and the measured MADSR and meteorological data in the geographic scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to the geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea. It was determined that the MADSR map developed through the proposed framework has been improved in terms of accuracy. The developed MADSR map can be used for estimating the MADSR at unmeasured locations and for determining the optimal location for the PV system installation.
System Considerations and Challendes in 3d Mapping and Modeling Using Low-Cost Uav Systems
NASA Astrophysics Data System (ADS)
Lari, Z.; El-Sheimy, N.
2015-08-01
In the last few years, low-cost UAV systems have been acknowledged as an affordable technology for geospatial data acquisition that can meet the needs of a variety of traditional and non-traditional mapping applications. In spite of its proven potential, UAV-based mapping is still lacking in terms of what is needed for it to become an acceptable mapping tool. In other words, a well-designed system architecture that considers payload restrictions as well as the specifications of the utilized direct geo-referencing component and the imaging systems in light of the required mapping accuracy and intended application is still required. Moreover, efficient data processing workflows, which are capable of delivering the mapping products with the specified quality while considering the synergistic characteristics of the sensors onboard, the wide range of potential users who might lack deep knowledge in mapping activities, and time constraints of emerging applications, are still needed to be adopted. Therefore, the introduced challenges by having low-cost imaging and georeferencing sensors onboard UAVs with limited payload capability, the necessity of efficient data processing techniques for delivering required products for intended applications, and the diversity of potential users with insufficient mapping-related expertise needs to be fully investigated and addressed by UAV-based mapping research efforts. This paper addresses these challenges and reviews system considerations, adaptive processing techniques, and quality assurance/quality control procedures for achievement of accurate mapping products from these systems.
Viladot, D; Véron, M; Gemmi, M; Peiró, F; Portillo, J; Estradé, S; Mendoza, J; Llorca-Isern, N; Nicolopoulos, S
2013-10-01
A recently developed technique based on the transmission electron microscope, which makes use of electron beam precession together with spot diffraction pattern recognition now offers the possibility to acquire reliable orientation/phase maps with a spatial resolution down to 2 nm on a field emission gun transmission electron microscope. The technique may be described as precession-assisted crystal orientation mapping in the transmission electron microscope, precession-assisted crystal orientation mapping technique-transmission electron microscope, also known by its product name, ASTAR, and consists in scanning the precessed electron beam in nanoprobe mode over the specimen area, thus producing a collection of precession electron diffraction spot patterns, to be thereafter indexed automatically through template matching. We present a review on several application examples relative to the characterization of microstructure/microtexture of nanocrystalline metals, ceramics, nanoparticles, minerals and organics. The strengths and limitations of the technique are also discussed using several application examples. ©2013 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
Pan, W R; Rozen, W M; Stretch, J; Thierry, B; Ashton, M W; Corlett, R J
2008-09-01
Lymphatic anatomy has become increasingly clinically important as surgical techniques evolve for investigating and treating cancer metastases. However, due to limited anatomical techniques available, research in this field has been insufficient. The techniques of computed tomography (CT) and magnetic resonance (MR) lymphangiography have not been described previously in the imaging of cadaveric lymphatic anatomy. This preliminary work describes the feasibility of these advanced imaging technologies for imaging lymphatic anatomy. A single, fresh cadaveric lower limb underwent lymphatic dissection and cannulation utilizing microsurgical techniques. Contrast materials for both CT and MR studies were chosen based on their suitability for subsequent clinical use, and imaging was undertaken with a view to mapping lymphatic anatomy. Microdissection studies were compared with imaging findings in each case. Both MR-based and CT-based contrast media in current clinical use were found to be suitable for demonstrating cadaveric lymphatic anatomy upon direct intralymphatic injection. MR lymphangiography and CT lymphangiography are feasible modalities for cadaveric anatomical research for lymphatic anatomy. Future studies including refinements in scanning techniques may offer these technologies to the clinical setting.
NASA Technical Reports Server (NTRS)
Sabol, Donald E., Jr.; Roberts, Dar A.; Adams, John B.; Smith, Milton O.
1993-01-01
An important application of remote sensing is to map and monitor changes over large areas of the land surface. This is particularly significant with the current interest in monitoring vegetation communities. Most of traditional methods for mapping different types of plant communities are based upon statistical classification techniques (i.e., parallel piped, nearest-neighbor, etc.) applied to uncalibrated multispectral data. Classes from these techniques are typically difficult to interpret (particularly to a field ecologist/botanist). Also, classes derived for one image can be very different from those derived from another image of the same area, making interpretation of observed temporal changes nearly impossible. More recently, neural networks have been applied to classification. Neural network classification, based upon spectral matching, is weak in dealing with spectral mixtures (a condition prevalent in images of natural surfaces). Another approach to mapping vegetation communities is based on spectral mixture analysis, which can provide a consistent framework for image interpretation. Roberts et al. (1990) mapped vegetation using the band residuals from a simple mixing model (the same spectral endmembers applied to all image pixels). Sabol et al. (1992b) and Roberts et al. (1992) used different methods to apply the most appropriate spectral endmembers to each image pixel, thereby allowing mapping of vegetation based upon the the different endmember spectra. In this paper, we describe a new approach to classification of vegetation communities based upon the spectra fractions derived from spectral mixture analysis. This approach was applied to three 1992 AVIRIS images of Jasper Ridge, California to observe seasonal changes in surface composition.
A Digital Tectonic Activity Map of the Earth
NASA Technical Reports Server (NTRS)
Lowman, Paul; Masuoka, Penny; Montgomery, Brian; OLeary, Jay; Salisbury, Demetra; Yates, Jacob
1999-01-01
The subject of neotectonics, covering the structures and structural activity of the last 5 million years (i.e., post-Miocene) is a well-recognized field, including "active tectonics," focussed on the last 500,000 years in a 1986 National Research Council report of that title. However, there is a cartographic gap between tectonic maps, generally showing all features regardless of age, and maps of current seismic or volcanic activity. We have compiled a map intended to bridge this gap, using modern data bases and computer-aided cartographic techniques. The maps presented here are conceptually descended from an earlier map showing tectonic and volcanic activity of the last one million years. Drawn by hand with the National Geographic Society's 1975 "The Physical World" map as a base, the 1981 map in various revisions has been widely reproduced in textbooks and various technical publications. However, two decades of progress call for a completely new map that can take advantage of new knowledge and cartographic techniques. The digital tectonic activity map (DTM), presented in shaded relief (Fig. 1) and schematic (Fig. 2) versions, is the result. The DTM is intended to show tectonism and volcanism of the last one million years, a period long enough to be representative of global activity, but short enough that features such as fault scarps and volcanos are still geomorphically recognizable. Data Sources and Cartographic Methods The DTM is based on a wide range of sources, summarized in Table 1. The most important is the digital elevation model, used to construct a shaded relief map. The bathymetry is largely from satellite altimetry, specifically the marine gravity compilations by Smith and Sandwell (1996). The shaded relief map was designed to match the new National Geographic Society world physical map (1992), although drawn independently, from the digital elevation model. The Robinson Projection is used instead of the earlier Van der Grinten one. Although neither conformal nor equal-area, the Robinson Projection provides a reasonable compromise and retains useful detail at high latitudes.
Three-Dimensional Maps for Disaster Management
NASA Astrophysics Data System (ADS)
Bandrova, T.; Zlatanova, S.; Konecny, M.
2012-07-01
Geo-information techniques have proven their usefulness for the purposes of early warning and emergency response. These techniques enable us to generate extensive geo-information to make informed decisions in response to natural disasters that lead to better protection of citizens, reduce damage to property, improve the monitoring of these disasters, and facilitate estimates of the damages and losses resulting from them. The maintenance and accessibility of spatial information has improved enormously with the development of spatial data infrastructures (SDIs), especially with second-generation SDIs, in which the original product-based SDI was improved to a process-based SDI. Through the use of SDIs, geo-information is made available to local, national and international organisations in regions affected by natural disasters as well as to volunteers serving in these areas. Volunteer-based systems for information collection (e.g., Ushahidi) have been created worldwide. However, the use of 3D maps is still limited. This paper discusses the applicability of 3D geo-information to disaster management. We discuss some important aspects of maps for disaster management, such as user-centred maps, the necessary components for 3D maps, symbols, and colour schemas. In addition, digital representations are evaluated with respect to their visual controls, i.e., their usefulness for the navigation and exploration of the information. Our recommendations are based on responses from a variety of users of these technologies, including children, geospecialists and disaster managers from different countries.
Music-therapy analyzed through conceptual mapping
NASA Astrophysics Data System (ADS)
Martinez, Rodolfo; de la Fuente, Rebeca
2002-11-01
Conceptual maps have been employed lately as a learning tool, as a modern study technique, and as a new way to understand intelligence, which allows for the development of a strong theoretical reference, in order to prove the research hypothesis. This paper presents a music-therapy analysis based on this tool to produce a conceptual mapping network, which ranges from magic through the rigor of the hard sciences.
Ronald E. McRoberts; Warren B. Cohen; Erik Naesset; Stephen V. Stehman; Erkki O. Tomppo
2010-01-01
Tremendous advances in the construction and assessment of forest attribute maps and related spatial products have been realized in recent years, partly as a result of the use of remotely sensed data as an information source. This review focuses on the current state of techniques for the construction and assessment of remote sensing-based maps and addresses five topic...
Evaluation of Landslide Mapping Techniques and LiDAR-based Conditioning Factors
NASA Astrophysics Data System (ADS)
Mahalingam, R.; Olsen, M. J.
2014-12-01
Landslides are a major geohazard, which result in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities to plan and prepare for these damaging events. Mapping landslide susceptible locations using GIS and remote sensing techniques is gaining popularity in the past three decades. These efforts use a wide variety of procedures and consider a wide range of factors. Unfortunately, each study is often completed differently and independently of others. Further, the quality of the datasets used varies in terms of source, data collection, and generation, which can propagate errors or inconsistencies into the resulting output maps. Light detection and ranging (LiDAR) has proved to have higher accuracy in representing the continuous topographic surface, which can help minimize this uncertainty. The primary objectives of this paper are to investigate the applicability and performance of terrain factors in landslide hazard mapping, determine if LiDAR-derived datasets (slope, slope roughness, terrain roughness, stream power index and compound topographic index) can be used for predictive mapping without data representing other common landslide conditioning factors, and evaluate the differences in landslide susceptibility mapping using widely-used statistical approaches. The aforementioned factors were used to produce landslide susceptibility maps for a 140 km2 study area in northwest Oregon using six representative techniques: frequency ratio, weights of evidence, logistic regression, discriminant analysis, artificial neural network, and support vector machine. Most notably, the research showed an advantage in selecting fewer critical conditioning factors. The most reliable factors all could be derived from a single LiDAR DEM, reducing the need for laborious and costly data gathering. Most of the six techniques showed similar statistical results; however, ANN showed less accuracy for predictive mapping. Keywords : LiDAR, Landslides, Oregon, Inventory, Hazard
NASA Astrophysics Data System (ADS)
Gallagher, Anne; Tremblay, Julie; Vannasing, Phetsamone
2016-12-01
Patients with brain tumor or refractory epilepsy may be candidates for neurosurgery. Presurgical evaluation often includes language investigation to prevent or reduce the risk of postsurgical language deficits. Current techniques involve significant limitations with pediatric populations. Recently, near-infrared spectroscopy (NIRS) has been shown to be a valuable neuroimaging technique for language localization in children. However, it typically requires the child to perform a task (task-based NIRS), which may constitute a significant limitation. Resting-state functional connectivity NIRS (fcNIRS) is an approach that can be used to identify language networks at rest. This study aims to assess the utility of fcNIRS in children by comparing fcNIRS to more conventional task-based NIRS for language mapping in 33 healthy participants: 25 children (ages 3 to 16) and 8 adults. Data were acquired at rest and during a language task. Results show very good concordance between both approaches for language localization (Dice similarity coefficient=0.81±0.13) and hemispheric language dominance (kappa=0.86, p<0.006). The fcNIRS technique may be a valuable tool for language mapping in clinical populations, including children and patients with cognitive and behavioral impairments.
Geology of the Sklodowska Region, Lunar Farside. M.S. Thesis Final Report
NASA Technical Reports Server (NTRS)
Kauffman, J. D.
1974-01-01
Investigation of an area on the lunar farside has resulted in a geologic map, development of a regional stratigraphic sequence, and interpretation of surface materials. Apollo 15 metric photographs were used in conjunction with photogrammetric techniques to produce a base map to which geologic units were later added. Geologic units were first delineated on the metric photographs and then transferred to the base map. Materials were defined and described from selected Lunar Orbiter and Apollo 15 metric, panoramic, and Hasselblad photographs on the basis of distinctive morphologic characteristics.
Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods
NASA Technical Reports Server (NTRS)
Berry, J. K.; Tomlin, C. D.
1982-01-01
Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
NASA Technical Reports Server (NTRS)
Batson, R. M.; Bridges, P. M.; Mullins, K. F.
1985-01-01
The Jovian and Saturnian satellites are being mapped at several scales from Voyager 1 and 2 data. The maps include specially formatted color mosaics, controlled photomosaics, and airbrush maps. More than 500 Voyager images of the Jovian and Saturnian satellites were radiometrically processed in preparation for cartographic processing. Of these images, 235 were geometrically transformed to map projections for base mosaic compilations. Special techniques for producing hybrid photomosaic/airbrush maps of Callisto are under investigation. The techniques involve making controlled computer mosaics of all available images with highest resolution images superimposed on lowest resolution images. The mosaics are then improved by airbrushing: seams and artifacts are removed, and image details enhanced that had been lost by saturation in some images. A controlled mosaic of the northern hemisphere of Rhea is complete, as is all processing for a similar mosaic of the equatorial region. Current plans and status of the various series are shown in a table.
DNA nanomapping using CRISPR-Cas9 as a programmable nanoparticle.
Mikheikin, Andrey; Olsen, Anita; Leslie, Kevin; Russell-Pavier, Freddie; Yacoot, Andrew; Picco, Loren; Payton, Oliver; Toor, Amir; Chesney, Alden; Gimzewski, James K; Mishra, Bud; Reed, Jason
2017-11-21
Progress in whole-genome sequencing using short-read (e.g., <150 bp), next-generation sequencing technologies has reinvigorated interest in high-resolution physical mapping to fill technical gaps that are not well addressed by sequencing. Here, we report two technical advances in DNA nanotechnology and single-molecule genomics: (1) we describe a labeling technique (CRISPR-Cas9 nanoparticles) for high-speed AFM-based physical mapping of DNA and (2) the first successful demonstration of using DVD optics to image DNA molecules with high-speed AFM. As a proof of principle, we used this new "nanomapping" method to detect and map precisely BCL2-IGH translocations present in lymph node biopsies of follicular lymphoma patents. This HS-AFM "nanomapping" technique can be complementary to both sequencing and other physical mapping approaches.
Three-dimensional analysis of magnetometer array data
NASA Technical Reports Server (NTRS)
Richmond, A. D.; Baumjohann, W.
1984-01-01
A technique is developed for mapping magnetic variation fields in three dimensions using data from an array of magnetometers, based on the theory of optimal linear estimation. The technique is applied to data from the Scandinavian Magnetometer Array. Estimates of the spatial power spectra for the internal and external magnetic variations are derived, which in turn provide estimates of the spatial autocorrelation functions of the three magnetic variation components. Statistical errors involved in mapping the external and internal fields are quantified and displayed over the mapping region. Examples of field mapping and of separation into external and internal components are presented. A comparison between the three-dimensional field separation and a two-dimensional separation from a single chain of stations shows that significant differences can arise in the inferred internal component.
Bas-relief map using texture analysis with application to live enhancement of ultrasound images.
Du, Huarui; Ma, Rui; Wang, Xiaoying; Zhang, Jue; Fang, Jing
2015-05-01
For ultrasound imaging, speckle is one of the most important factors in the degradation of contrast resolution because it masks meaningful texture and has the potential to interfere with diagnosis. It is expected that researchers would explore appropriate ways to reduce the speckle noise, to find the edges of structures and enhance weak borders between different organs in ultrasound imaging. Inspired by the principle of differential interference contrast microscopy, a "bas-relief map" is proposed that depicts the texture structure of ultrasound images. Based on a bas-relief map, an adaptive bas-relief filter was developed for ultrafast despeckling. Subsequently, an edge map was introduced to enhance the edges of images in real time. The holistic bas-relief map approach has been used experimentally with synthetic phantoms and digital ultrasound B-scan images of liver, kidney and gallbladder. Based on the visual inspection and the performance metrics of the despeckled images, it was found that the bas-relief map approach is capable of effectively reducing the speckle while significantly enhancing contrast and tissue boundaries for ultrasonic images, and its speckle reduction ability is comparable to that of Kuan, Lee and Frost filters. Meanwhile, the proposed technique could preserve more intra-region details compared with the popular speckle reducing anisotropic diffusion technique and more effectively enhance edges. In addition, the adaptive bas-relief filter was much less time consuming than the Kuan, Lee and Frost filter and speckle reducing anisotropic diffusion techniques. The bas-relief map strategy is effective for speckle reduction and live enhancement of ultrasound images, and can provide a valuable tool for clinical diagnosis. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Page, Lance; Shen, C. N.
1991-01-01
This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.
An Experimental Realization of a Chaos-Based Secure Communication Using Arduino Microcontrollers.
Zapateiro De la Hoz, Mauricio; Acho, Leonardo; Vidal, Yolanda
2015-01-01
Security and secrecy are some of the important concerns in the communications world. In the last years, several encryption techniques have been proposed in order to improve the secrecy of the information transmitted. Chaos-based encryption techniques are being widely studied as part of the problem because of the highly unpredictable and random-look nature of the chaotic signals. In this paper we propose a digital-based communication system that uses the logistic map which is a mathematically simple model that is chaotic under certain conditions. The input message signal is modulated using a simple Delta modulator and encrypted using a logistic map. The key signal is also encrypted using the same logistic map with different initial conditions. In the receiver side, the binary-coded message is decrypted using the encrypted key signal that is sent through one of the communication channels. The proposed scheme is experimentally tested using Arduino shields which are simple yet powerful development kits that allows for the implementation of the communication system for testing purposes.
Automatic spatiotemporal matching of detected pleural thickenings
NASA Astrophysics Data System (ADS)
Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas
2014-01-01
Pleural thickenings can be found in asbestos exposed patient's lung. Non-invasive diagnosis including CT imaging can detect aggressive malignant pleural mesothelioma in its early stage. In order to create a quantitative documentation of automatic detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening via visual comparison which provides neither quantitative nor qualitative measures. In this work, automatic spatiotemporal matching techniques of the detected pleural thickenings at two points of time based on the semi-automatic registration have been developed, implemented, and tested so that the same thickening can be compared fully automatically. As result, the application of the mapping technique using the principal components analysis turns out to be advantageous than the feature-based mapping using centroid and mean Hounsfield Units of each thickening, since the resulting sensitivity was improved to 98.46% from 42.19%, while the accuracy of feature-based mapping is only slightly higher (84.38% to 76.19%).
Weather or Not To Teach Junior High Meteorology.
ERIC Educational Resources Information Center
Knorr, Thomas P.
1984-01-01
Presents a technique for teaching meteorology allowing students to observe and analyze consecutive weather maps and relate local conditions; a model illustrating the three-dimensional nature of the atmosphere is employed. Instructional methods based on studies of daily weather maps to trace systems sweeping across the United States are discussed.…
Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program
ERIC Educational Resources Information Center
Jeffrey, Scott; Alvarez, Jaime
2010-01-01
The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…
RAMP: a computer system for mapping regional areas
Bradley B. Nickey
1975-01-01
Until 1972, the U.S. Forest Service's Individual Fire Reports recorded locations by the section-township-range system..These earlier fire reports, therefore, lacked congruent locations. RAMP (Regional Area Mapping Procedure) was designed to make the reports more useful for quantitative analysis. This computer-based technique converts locations expressed in...
Mapping and localization for extraterrestrial robotic explorations
NASA Astrophysics Data System (ADS)
Xu, Fengliang
In the exploration of an extraterrestrial environment such as Mars, orbital data, such as high-resolution imagery Mars Orbital Camera-Narrow Angle (MOC-NA), laser ranging data Mars Orbital Laser Altimeter (MOLA), and multi-spectral imagery Thermal Emission Imaging System (THEMIS), play more and more important roles. However, these remote sensing techniques can never replace the role of landers and rovers, which can provide a close up and inside view. Similarly, orbital mapping can not compete with ground-level close-range mapping in resolution, precision, and speed. This dissertation addresses two tasks related to robotic extraterrestrial exploration: mapping and rover localization. Image registration is also discussed as an important aspect for both of them. Techniques from computer vision and photogrammetry are applied for automation and precision. Image registration is classified into three sub-categories: intra-stereo, inter-stereo, and cross-site, according to the relationship between stereo images. In the intra-stereo registration, which is the most fundamental sub-category, interest point-based registration and verification by parallax continuity in the principal direction are proposed. Two other techniques, inter-scanline search with constrained dynamic programming for far range matching and Markov Random Field (MRF) based registration for big terrain variation, are explored as possible improvements. Creating using rover ground images mainly involves the generation of Digital Terrain Model (DTM) and ortho-rectified map (orthomap). The first task is to derive the spatial distribution statistics from the first panorama and model the DTM with a dual polynomial model. This model is used for interpolation of the DTM, using Kriging in the close range and Triangular Irregular Network (TIN) in the far range. To generate a uniformly illuminated orthomap from the DTM, a least-squares-based automatic intensity balancing method is proposed. Finally a seamless orthomap is constructed by a split-and-merge technique: the mapped area is split or subdivided into small regions of image overlap, and then each small map piece was processed and all of the pieces are merged together to form a seamless map. Rover localization has three stages, all of which use a least-squares adjustment procedure: (1) an initial localization which is accomplished by adjustment over features common to rover images and orbital images, (2) an adjustment of image pointing angles at a single site through inter and intra-stereo tie points, and (3) an adjustment of the rover traverse through manual cross-site tie points. The first stage is based on adjustment of observation angles of features. The second stage and third stage are based on bundle-adjustment. In the third-stage an incremental adjustment method was proposed. Automation in rover localization includes automatic intra/inter-stereo tie point selection, computer-assisted cross-site tie point selection, and automatic verification of accuracy. (Abstract shortened by UMI.)
Knick, Steven T.; Rotenberry, J.T.
1998-01-01
We tested the potential of a GIS mapping technique, using a resource selection model developed for black-tailed jackrabbits (Lepus californicus) and based on the Mahalanobis distance statistic, to track changes in shrubsteppe habitats in southwestern Idaho. If successful, the technique could be used to predict animal use areas, or those undergoing change, in different regions from the same selection function and variables without additional sampling. We determined the multivariate mean vector of 7 GIS variables that described habitats used by jackrabbits. We then ranked the similarity of all cells in the GIS coverage from their Mahalanobis distance to the mean habitat vector. The resulting map accurately depicted areas where we sighted jackrabbits on verification surveys. We then simulated an increase in shrublands (which are important habitats). Contrary to expectation, the new configurations were classified as lower similarity relative to the original mean habitat vector. Because the selection function is based on a unimodal mean, any deviation, even if biologically positive, creates larger Malanobis distances and lower similarity values. We recommend the Mahalanobis distance technique for mapping animal use areas when animals are distributed optimally, the landscape is well-sampled to determine the mean habitat vector, and distributions of the habitat variables does not change.
NASA Technical Reports Server (NTRS)
Skinner, J. A., Jr.; Eppler, D. B.; Bleacher, J. E.; Evans, C. A.; Feng, W.; Gruener, J.; Hurwitz, D. M.; Janoiko, B.; Whitson, P.
2014-01-01
Cartographic products and - specifically - geologic maps provide critical assistance for establishing physical and temporal frameworks of planetary surfaces. The technical methods that result in the creation of geologic maps vary depending on how observations are made as well as the overall intent of the final products [1-3]. These methods tend to follow a common linear work flow, including the identification and delineation of spatially and temporally discrete materials (units), the documentation of their primary (emplacement) and secondary (erosional) characteristics, analysis of the relative and absolute age relationships between these materials, and the collation of observations and interpretations into an objective map product. The "objectivity" of a map is critical cross comparison with overlapping maps and topical studies as well as its relevance to scientific posterity. However, the "accuracy" and "correctness" of a geologic map is very subject to debate. This can be evidenced by comparison of existing geologic maps at various scales, particularly those compiled through field- and remote-based mapped efforts. Our study focuses on comparing the fidelity of (1) "Apollo-style" geologic investigations, where typically non-geologist crew members follow static traverse routes established through pre-mission planning, and (2) "traditional" field-based investigations, where geologists are given free rein to observe without preplanned routes. This abstract summarizes the regional geology wherein our study was conducted, presents the geologic map created from traditional field mapping techniques, and offers basic insights into how geologic maps created from different tactics can be reconciled in support of exploratory missions. Additional abstracts [4-6] from this study discuss various exploration and science results of these efforts.
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-12
Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicate that a better performance was obtained with the STSIS method.
NASA Astrophysics Data System (ADS)
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-01
Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicate that a better performance was obtained with the STSIS method.
Characteristics of Forests in Western Sayani Mountains, Siberia from SAR Data
NASA Technical Reports Server (NTRS)
Ranson, K. Jon; Sun, Guoqing; Kharuk, V. I.; Kovacs, Katalin
1998-01-01
This paper investigated the possibility of using spaceborne radar data to map forest types and logging in the mountainous Western Sayani area in Siberia. L and C band HH, HV, and VV polarized images from the Shuttle Imaging Radar-C instrument were used in the study. Techniques to reduce topographic effects in the radar images were investigated. These included radiometric correction using illumination angle inferred from a digital elevation model, and reducing apparent effects of topography through band ratios. Forest classification was performed after terrain correction utilizing typical supervised techniques and principal component analyses. An ancillary data set of local elevations was also used to improve the forest classification. Map accuracy for each technique was estimated for training sites based on Russian forestry maps, satellite imagery and field measurements. The results indicate that it is necessary to correct for topography when attempting to classify forests in mountainous terrain. Radiometric correction based on a DEM (Digital Elevation Model) improved classification results but required reducing the SAR (Synthetic Aperture Radar) resolution to match the DEM. Using ratios of SAR channels that include cross-polarization improved classification and
Towards Unmanned Systems for Dismounted Operations in the Canadian Forces
2011-01-01
LIDAR , and RADAR) and lower power/mass, passive imaging techniques such as structure from motion and simultaneous localisation and mapping ( SLAM ...sensors and learning algorithms. 5.1.2 Simultaneous localisation and mapping SLAM algorithms concurrently estimate a robot pose and a map of unique...locations and vehicle pose are part of the SLAM state vector and are estimated in each update step. AISS developed a monocular camera-based SLAM
A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.
Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun
2015-08-31
Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in a simulation and realenvironments.
A Probabilistic Feature Map-Based Localization System Using a Monocular Camera
Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun
2015-01-01
Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in a simulation and realenvironments. PMID:26404284
Alexander C. Vibrans; Ronald E. McRoberts; Paolo Moser; Adilson L. Nicoletti
2013-01-01
Estimation of large area forest attributes, such as area of forest cover, from remote sensing-based maps is challenging because of image processing, logistical, and data acquisition constraints. In addition, techniques for estimating and compensating for misclassification and estimating uncertainty are often unfamiliar. Forest area for the state of Santa Catarina in...
Lymphatic drainage in renal cell carcinoma: back to the basics.
Karmali, Riaz J; Suami, Hiroo; Wood, Christopher G; Karam, Jose A
2014-12-01
Lymphatic drainage in renal cell carcinoma (RCC) is unpredictable, however, basic patterns can be observed in cadaveric and sentinel lymph node mapping studies in patients with RCC. The existence of peripheral lymphovenous communications at the level of the renal vein has been shown in mammals but remains unknown in humans. The sentinel lymph node biopsy technique can be safely applied to map lymphatic drainage patterns in patients with RCC. Further standardisation of sentinel node biopsy techniques is required to improve the clinical significance of mapping studies. Understanding lymphatic drainage in RCC may lead to an evidence-based consensus on the surgical management of retroperitoneal lymph nodes. © 2014 The Authors. BJU International © 2014 BJU International.
NASA Astrophysics Data System (ADS)
Hao, Ming; Rohrdantz, Christian; Janetzko, Halldór; Keim, Daniel; Dayal, Umeshwar; Haug, Lars-Erik; Hsu, Mei-Chun
2012-01-01
Twitter currently receives over 190 million tweets (small text-based Web posts) and manufacturing companies receive over 10 thousand web product surveys a day, in which people share their thoughts regarding a wide range of products and their features. A large number of tweets and customer surveys include opinions about products and services. However, with Twitter being a relatively new phenomenon, these tweets are underutilized as a source for determining customer sentiments. To explore high-volume customer feedback streams, we integrate three time series-based visual analysis techniques: (1) feature-based sentiment analysis that extracts, measures, and maps customer feedback; (2) a novel idea of term associations that identify attributes, verbs, and adjectives frequently occurring together; and (3) new pixel cell-based sentiment calendars, geo-temporal map visualizations and self-organizing maps to identify co-occurring and influential opinions. We have combined these techniques into a well-fitted solution for an effective analysis of large customer feedback streams such as for movie reviews (e.g., Kung-Fu Panda) or web surveys (buyers).
Surface registration technique for close-range mapping applications
NASA Astrophysics Data System (ADS)
Habib, Ayman F.; Cheng, Rita W. T.
2006-08-01
Close-range mapping applications such as cultural heritage restoration, virtual reality modeling for the entertainment industry, and anatomical feature recognition for medical activities require 3D data that is usually acquired by high resolution close-range laser scanners. Since these datasets are typically captured from different viewpoints and/or at different times, accurate registration is a crucial procedure for 3D modeling of mapped objects. Several registration techniques are available that work directly with the raw laser points or with extracted features from the point cloud. Some examples include the commonly known Iterative Closest Point (ICP) algorithm and a recently proposed technique based on matching spin-images. This research focuses on developing a surface matching algorithm that is based on the Modified Iterated Hough Transform (MIHT) and ICP to register 3D data. The proposed algorithm works directly with the raw 3D laser points and does not assume point-to-point correspondence between two laser scans. The algorithm can simultaneously establish correspondence between two surfaces and estimates the transformation parameters relating them. Experiment with two partially overlapping laser scans of a small object is performed with the proposed algorithm and shows successful registration. A high quality of fit between the two scans is achieved and improvement is found when compared to the results obtained using the spin-image technique. The results demonstrate the feasibility of the proposed algorithm for registering 3D laser scanning data in close-range mapping applications to help with the generation of complete 3D models.
X-Ray Fluorescence Solvent Detection at the Substrate-Adhesive Interface
NASA Technical Reports Server (NTRS)
Wurth, Laura; Evans, Kurt; Weber, Bart; Headrick, Sarah
2005-01-01
With environmental regulations limiting the use of volatile organic compounds, low-vapor pressure solvents have replaced traditional degreasing solvents for bond substrate preparation. When used to clean and prepare porous bond substrates such as phenolic composites, low vapor pressure solvents can penetrate deep into substrate pore networks and remain there for extended periods. Trapped solvents can interact with applied adhesives either prior to or during cure, potentially compromising bond properties. Currently, methods for characterizing solvent time-depth profiles in bond substrates are limited to bulk gravimetric or sectioning techniques. While sectioning techniques such as microtome allow construction of solvent depth profiles, their depth resolution and reliability are limited by substrate type. Sectioning techniques are particularly limited near the adhesive-substrate interface where depth resolution is further limited by adhesive-substrate hardness and, in the case of a partially cured adhesive, mechanical properties differences. Additionally, sectioning techniques cannot provide information about lateral solvent diffusion. Cross-section component mapping is an alternative method for measuring solvent migration in porous substrates that eliminates the issues associated with sectioning techniques. With cross-section mapping, the solvent-wiped substrate is sectioned perpendicular rather than parallel to the wiped surface, and the sectioned surface is analyzed for the solvent or solvent components of interest using a two-dimensional mapping or imaging technique. Solvent mapping can be performed using either direct or indirect methods. With a direct method, one or more solvent components are mapped using red or Raman spectroscopy together with a moveable sample stage and/or focal plane array detector. With an indirect method, an elemental "tag" not present in the substrate is added to the solvent before the substrate is wiped. Following cross sectioning, the tag element can then be mapped by its characteristic x-ray emission using either x-ray fluorescence, or electron-beam energy-and wavelength-dispersive x-ray spectrometry. The direct mapping techniques avoid issues of different diffusion or migration rates of solvents and elemental tags, while the indirect techniques avoid spectral resolution issues in cases where solvents and substrates have adjacent or overlapping peaks. In this study, cross-section component indirect mapping is being evaluated as a method for measuring migration of d-limonene based solvents in glass-cloth phenolic composite (GCP) prior to and during subsequent bonding and epoxy adhesive cure.
Fabrication of glass gas cells for the HALOE and MAPS satellite experiments
NASA Technical Reports Server (NTRS)
Sullivan, E. M.; Walthall, H. G.
1984-01-01
The Halogen Occultation Experiment (HALOE) and the Measurement of Air Pollution from Satellites (MAPS) experiment are satellite-borne experiments which measure trace constituents in the Earth's atmosphere. The instruments which obtain the data for these experiments are based on the gas filter correlation radiometer measurement technique. In this technique, small samples of the gases of interest are encapsulated in glass cylinders, called gas cells, which act as very selective optical filters. This report describes the techniques employed in the fabrication of the gas cells for the HALOE and MAPS instruments. Details of the method used to fuse the sapphire windows (required for IR transmission) to the glass cell bodies are presented along with detailed descriptions of the jigs and fixtures used during the assembly process. The techniques and equipment used for window inspection and for pairing the HALOE windows are discussed. Cell body materials and the steps involved in preparing the cell bodies for the glass-to-sapphire fusion process are given.
Mapping cardiogenic oscillations using synchrotron-based phase contrast CT imaging
NASA Astrophysics Data System (ADS)
Thurgood, Jordan; Dubsky, Stephen; Siu, Karen K. W.; Wallace, Megan; Siew, Melissa; Hooper, Stuart; Fouras, Andreas
2012-10-01
In many animals, including humans, the lungs encase the majority of the heart thus the motion of each organ affects the other. The effects of the motion of the heart on the lungs potentially provides information with regards to both lung and heart health. We present a novel technique that is capable of measuring the effect of the heart on the surrounding lung tissue through the use of advanced synchrotron imaging techniques and recently developed X-ray velocimetry methods. This technique generates 2D frequency response maps of the lung tissue motion at multiple projection angles from projection X-ray images. These frequency response maps are subsequently used to generate 3D reconstructions of the lung tissue exhibiting motion at the frequency of ventilation and the lung tissue exhibiting motion at the frequency of the heart. This technique has a combined spatial and temporal resolution sufficient to observe the dynamic and complex 3D nature of lung-heart interactions.
NASA Astrophysics Data System (ADS)
Macander, M. J.; Frost, G. V., Jr.
2015-12-01
Regional-scale mapping of vegetation and other ecosystem properties has traditionally relied on medium-resolution remote sensing such as Landsat (30 m) and MODIS (250 m). Yet, the burgeoning availability of high-resolution (<=2 m) imagery and ongoing advances in computing power and analysis tools raises the prospect of performing ecosystem mapping at fine spatial scales over large study domains. Here we demonstrate cutting-edge mapping approaches over a ~35,000 km² study area on Alaska's North Slope using calibrated and atmospherically-corrected mosaics of high-resolution WorldView-2 and GeoEye-1 imagery: (1) an a priori spectral approach incorporating the Satellite Imagery Automatic Mapper (SIAM) algorithms; (2) image segmentation techniques; and (3) texture metrics. The SIAM spectral approach classifies radiometrically-calibrated imagery to general vegetation density categories and non-vegetated classes. The SIAM classes were developed globally and their applicability in arctic tundra environments has not been previously evaluated. Image segmentation, or object-based image analysis, automatically partitions high-resolution imagery into homogeneous image regions that can then be analyzed based on spectral, textural, and contextual information. We applied eCognition software to delineate waterbodies and vegetation classes, in combination with other techniques. Texture metrics were evaluated to determine the feasibility of using high-resolution imagery to algorithmically characterize periglacial surface forms (e.g., ice-wedge polygons), which are an important physical characteristic of permafrost-dominated regions but which cannot be distinguished by medium-resolution remote sensing. These advanced mapping techniques provide products which can provide essential information supporting a broad range of ecosystem science and land-use planning applications in northern Alaska and elsewhere in the circumpolar Arctic.
Digital mapping techniques '06 - Workshop proceedings
Soller, David R.
2007-01-01
The Digital Mapping Techniques `06 (DMT`06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year?s meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops.Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public.At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.
NASA Astrophysics Data System (ADS)
Salsone, Silvia; Taylor, Andrew; Gomez, Juliana; Pretty, Iain; Ellwood, Roger; Dickinson, Mark; Lombardo, Giuseppe; Zakian, Christian
2012-07-01
Near infrared (NIR) multispectral imaging is a novel noninvasive technique that maps and quantifies dental caries. The technique has the ability to reduce the confounding effect of stain present on teeth. The aim of this study was to develop and validate a quantitative NIR multispectral imaging system for caries detection and assessment against a histological reference standard. The proposed technique is based on spectral imaging at specific wavelengths in the range from 1000 to 1700 nm. A total of 112 extracted teeth (molars and premolars) were used and images of occlusal surfaces at different wavelengths were acquired. Three spectral reflectance images were combined to generate a quantitative lesion map of the tooth. The maximum value of the map at the corresponding histological section was used as the NIR caries score. The NIR caries score significantly correlated with the histological reference standard (Spearman's Coefficient=0.774, p<0.01). Caries detection sensitivities and specificities of 72% and 91% for sound areas, 36% and 79% for lesions on the enamel, and 82% and 69% for lesions in dentin were found. These results suggest that NIR spectral imaging is a novel and promising method for the detection, quantification, and mapping of dental caries.
Salsone, Silvia; Taylor, Andrew; Gomez, Juliana; Pretty, Iain; Ellwood, Roger; Dickinson, Mark; Lombardo, Giuseppe; Zakian, Christian
2012-07-01
Near infrared (NIR) multispectral imaging is a novel noninvasive technique that maps and quantifies dental caries. The technique has the ability to reduce the confounding effect of stain present on teeth. The aim of this study was to develop and validate a quantitative NIR multispectral imaging system for caries detection and assessment against a histological reference standard. The proposed technique is based on spectral imaging at specific wavelengths in the range from 1000 to 1700 nm. A total of 112 extracted teeth (molars and premolars) were used and images of occlusal surfaces at different wavelengths were acquired. Three spectral reflectance images were combined to generate a quantitative lesion map of the tooth. The maximum value of the map at the corresponding histological section was used as the NIR caries score. The NIR caries score significantly correlated with the histological reference standard (Spearman's Coefficient=0.774, p<0.01). Caries detection sensitivities and specificities of 72% and 91% for sound areas, 36% and 79% for lesions on the enamel, and 82% and 69% for lesions in dentin were found. These results suggest that NIR spectral imaging is a novel and promising method for the detection, quantification, and mapping of dental caries.
Han, Yang; Wang, Shutao; Payen, Thomas; Konofagou, Elisa
2017-01-01
The successful clinical application of High Intensity Focused Ultrasound (HIFU) ablation depends on reliable monitoring of the lesion formation. Harmonic Motion Imaging guided Focused Ultrasound (HMIgFUS) is an ultrasound-based elasticity imaging technique, which monitors HIFU ablation based on the stiffness change of the tissue instead of the echo intensity change in conventional B-mode monitoring, rendering it potentially more sensitive to lesion development. Our group has shown that predicting the lesion location based on the radiation force-excited region is feasible during HMIgFUS. In this study, the feasibility of a fast lesion mapping method is explored to directly monitor the lesion map during HIFU. The HMI lesion map was generated by subtracting the reference HMI image from the present HMI peak-to-peak displacement map to be streamed on the computer display. The dimensions of the HMIgFUS lesions were compared against gross pathology. Excellent agreement was found between the lesion depth (r2 = 0.81, slope = 0.90), width (r2 = 0.85, slope = 1.12) and area (r2 = 0.58, slope = 0.75). In vivo feasibility was assessed in a mouse with a pancreatic tumor. These findings demonstrate that HMIgFUS can successfully map thermal lesion and monitor lesion development in real time in vitro and in vivo. The HMIgFUS technique may therefore constitute a novel clinical tool for HIFU treatment monitoring. PMID:28323638
Using image mapping towards biomedical and biological data sharing
2013-01-01
Image-based data integration in eHealth and life sciences is typically concerned with the method used for anatomical space mapping, needed to retrieve, compare and analyse large volumes of biomedical data. In mapping one image onto another image, a mechanism is used to match and find the corresponding spatial regions which have the same meaning between the source and the matching image. Image-based data integration is useful for integrating data of various information structures. Here we discuss a broad range of issues related to data integration of various information structures, review exemplary work on image representation and mapping, and discuss the challenges that these techniques may bring. PMID:24059352
Allones, J L; Martinez, D; Taboada, M
2014-10-01
Clinical terminologies are considered a key technology for capturing clinical data in a precise and standardized manner, which is critical to accurately exchange information among different applications, medical records and decision support systems. An important step to promote the real use of clinical terminologies, such as SNOMED-CT, is to facilitate the process of finding mappings between local terms of medical records and concepts of terminologies. In this paper, we propose a mapping tool to discover text-to-concept mappings in SNOMED-CT. Name-based techniques were combined with a query expansion system to generate alternative search terms, and with a strategy to analyze and take advantage of the semantic relationships of the SNOMED-CT concepts. The developed tool was evaluated and compared to the search services provided by two SNOMED-CT browsers. Our tool automatically mapped clinical terms from a Spanish glossary of procedures in pathology with 88.0% precision and 51.4% recall, providing a substantial improvement of recall (28% and 60%) over other publicly accessible mapping services. The improvements reached by the mapping tool are encouraging. Our results demonstrate the feasibility of accurately mapping clinical glossaries to SNOMED-CT concepts, by means a combination of structural, query expansion and named-based techniques. We have shown that SNOMED-CT is a great source of knowledge to infer synonyms for the medical domain. Results show that an automated query expansion system overcomes the challenge of vocabulary mismatch partially.
Mapping knowledge domains: Characterizing PNAS
Boyack, Kevin W.
2004-01-01
A review of data mining and analysis techniques that can be used for the mapping of knowledge domains is given. Literature mapping techniques can be based on authors, documents, journals, words, and/or indicators. Most mapping questions are related to research assessment or to the structure and dynamics of disciplines or networks. Several mapping techniques are demonstrated on a data set comprising 20 years of papers published in PNAS. Data from a variety of sources are merged to provide unique indicators of the domain bounded by PNAS. By using funding source information and citation counts, it is shown that, on an aggregate basis, papers funded jointly by the U.S. Public Health Service (which includes the National Institutes of Health) and non-U.S. government sources outperform papers funded by other sources, including by the U.S. Public Health Service alone. Grant data from the National Institute on Aging show that, on average, papers from large grants are cited more than those from small grants, with performance increasing with grant amount. A map of the highest performing papers over the 20-year period was generated by using citation analysis. Changes and trends in the subjects of highest impact within the PNAS domain are described. Interactions between topics over the most recent 5-year period are also detailed. PMID:14963238
Liborg: a lidar-based robot for efficient 3D mapping
NASA Astrophysics Data System (ADS)
Vlaminck, Michiel; Luong, Hiep; Philips, Wilfried
2017-09-01
In this work we present Liborg, a spatial mapping and localization system that is able to acquire 3D models on the y using data originated from lidar sensors. The novelty of this work is in the highly efficient way we deal with the tremendous amount of data to guarantee fast execution times while preserving sufficiently high accuracy. The proposed solution is based on a multi-resolution technique based on octrees. The paper discusses and evaluates the main benefits of our approach including its efficiency regarding building and updating the map and its compactness regarding compressing the map. In addition, the paper presents a working prototype consisting of a robot equipped with a Velodyne Lidar Puck (VLP-16) and controlled by a Raspberry Pi serving as an independent acquisition platform.
Improving IMRT delivery efficiency with reweighted L1-minimization for inverse planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hojin; Becker, Stephen; Lee, Rena
2013-07-15
Purpose: This study presents an improved technique to further simplify the fluence-map in intensity modulated radiation therapy (IMRT) inverse planning, thereby reducing plan complexity and improving delivery efficiency, while maintaining the plan quality.Methods: First-order total-variation (TV) minimization (min.) based on L1-norm has been proposed to reduce the complexity of fluence-map in IMRT by generating sparse fluence-map variations. However, with stronger dose sparing to the critical structures, the inevitable increase in the fluence-map complexity can lead to inefficient dose delivery. Theoretically, L0-min. is the ideal solution for the sparse signal recovery problem, yet practically intractable due to its nonconvexity of themore » objective function. As an alternative, the authors use the iteratively reweighted L1-min. technique to incorporate the benefits of the L0-norm into the tractability of L1-min. The weight multiplied to each element is inversely related to the magnitude of the corresponding element, which is iteratively updated by the reweighting process. The proposed penalizing process combined with TV min. further improves sparsity in the fluence-map variations, hence ultimately enhancing the delivery efficiency. To validate the proposed method, this work compares three treatment plans obtained from quadratic min. (generally used in clinic IMRT), conventional TV min., and our proposed reweighted TV min. techniques, implemented by a large-scale L1-solver (template for first-order conic solver), for five patient clinical data. Criteria such as conformation number (CN), modulation index (MI), and estimated treatment time are employed to assess the relationship between the plan quality and delivery efficiency.Results: The proposed method yields simpler fluence-maps than the quadratic and conventional TV based techniques. To attain a given CN and dose sparing to the critical organs for 5 clinical cases, the proposed method reduces the number of segments by 10-15 and 30-35, relative to TV min. and quadratic min. based plans, while MIs decreases by about 20%-30% and 40%-60% over the plans by two existing techniques, respectively. With such conditions, the total treatment time of the plans obtained from our proposed method can be reduced by 12-30 s and 30-80 s mainly due to greatly shorter multileaf collimator (MLC) traveling time in IMRT step-and-shoot delivery.Conclusions: The reweighted L1-minimization technique provides a promising solution to simplify the fluence-map variations in IMRT inverse planning. It improves the delivery efficiency by reducing the entire segments and treatment time, while maintaining the plan quality in terms of target conformity and critical structure sparing.« less
NASA Astrophysics Data System (ADS)
Sternberg, Oren; Bednarski, Valerie R.; Perez, Israel; Wheeland, Sara; Rockway, John D.
2016-09-01
Non-invasive optical techniques pertaining to the remote sensing of power quality disturbances (PQD) are part of an emerging technology field typically dominated by radio frequency (RF) and invasive-based techniques. Algorithms and methods to analyze and address PQD such as probabilistic neural networks and fully informed particle swarms have been explored in industry and academia. Such methods are tuned to work with RF equipment and electronics in existing power grids. As both commercial and defense assets are heavily power-dependent, understanding electrical transients and failure events using non-invasive detection techniques is crucial. In this paper we correlate power quality empirical models to the observed optical response. We also empirically demonstrate a first-order approach to map household, office and commercial equipment PQD to user functions and stress levels. We employ a physics-based image and signal processing approach, which demonstrates measured non-invasive (remote sensing) techniques to detect and map the base frequency associated with the power source to the various PQD on a calibrated source.
Specifications for updating USGS land use and land cover maps
Milazzo, Valerie A.
1983-01-01
To meet the increasing demands for up-to-date land use and land cover information, a primary goal of the U.S. Geological Survey's (USGS) national land use and land cover mapping program is to provide for periodic updating of maps and data in a timely and uniform manner. The technical specifications for updating existing USGS land use and land cover maps that are presented here cover both the interpretive aspects of detecting and identifying land use and land cover changes and the cartographic aspects of mapping and presenting the change data in conventional map format. They provide the map compiler with the procedures and techniques necessary to then use these change data to update existing land use and land cover maps in a manner that is both standardized and repeatable. Included are specifications for the acquisition of remotely sensed source materials, selection of compilation map bases, handling of data base corrections, editing and quality control operations, generation of map update products for USGS open file, and the reproduction and distribution of open file materials. These specifications are planned to become part of the National Mapping Division's Technical Instructions.
Boudaghpour, Siamak; Bagheri, Majid; Bagheri, Zahra
2014-01-01
High flood occurrences with large environmental damages have a growing trend in Iran. Dynamic movements of water during a flood cause different environmental damages in geographical areas with different characteristics such as topographic conditions. In general, environmental effects and damages caused by a flood in an area can be investigated from different points of view. The current essay is aiming at detecting environmental effects of flood occurrences in Halilrood catchment area of Kerman province in Iran using flood zone mapping techniques. The intended flood zone map was introduced in four steps. Steps 1 to 3 pave the way to calculate and estimate flood zone map in the understudy area while step 4 determines the estimation of environmental effects of flood occurrence. Based on our studies, wide range of accuracy for estimating the environmental effects of flood occurrence was introduced by using of flood zone mapping techniques. Moreover, it was identified that the existence of Jiroft dam in the study area can decrease flood zone from 260 hectares to 225 hectares and also it can decrease 20% of flood peak intensity. As a result, 14% of flood zone in the study area can be saved environmentally.
Developing a mapping tool for tablets
NASA Astrophysics Data System (ADS)
Vaughan, Alan; Collins, Nathan; Krus, Mike
2014-05-01
Digital field mapping offers significant benefits when compared with traditional paper mapping techniques in that it provides closer integration with downstream geological modelling and analysis. It also provides the mapper with the ability to rapidly integrate new data with existing databases without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. In order to achieve these benefits, a number of PC-based digital mapping tools are available which have been developed for specific communities, eg the BGS•SIGMA project, Midland Valley's FieldMove®, and a range of solutions based on ArcGIS® software, which can be combined with either traditional or digital orientation and data collection tools. However, with the now widespread availability of inexpensive tablets and smart phones, a user led demand for a fully integrated tablet mapping tool has arisen. This poster describes the development of a tablet-based mapping environment specifically designed for geologists. The challenge was to deliver a system that would feel sufficiently close to the flexibility of paper-based geological mapping while being implemented on a consumer communication and entertainment device. The first release of a tablet-based geological mapping system from this project is illustrated and will be shown as implemented on an iPad during the poster session. Midland Valley is pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field based projects throughout this year and will be integrating feedback in further developments of this technology.
NASA Astrophysics Data System (ADS)
Gao, Z.; Song, Y.; Li, C.; Zeng, F.; Wang, F.
2017-08-01
Rapid acquisition and processing method of large scale topographic map data, which relies on the Unmanned Aerial Vehicle (UAV) low-altitude aerial photogrammetry system, is studied in this paper, elaborating the main work flow. Key technologies of UAV photograph mapping is also studied, developing a rapid mapping system based on electronic plate mapping system, thus changing the traditional mapping mode and greatly improving the efficiency of the mapping. Production test and achievement precision evaluation of Digital Orth photo Map (DOM), Digital Line Graphic (DLG) and other digital production were carried out combined with the city basic topographic map update project, which provides a new techniques for large scale rapid surveying and has obvious technical advantage and good application prospect.
Comparing five modelling techniques for predicting forest characteristics
Gretchen G. Moisen; Tracey S. Frescino
2002-01-01
Broad-scale maps of forest characteristics are needed throughout the United States for a wide variety of forest land management applications. Inexpensive maps can be produced by modelling forest class and structure variables collected in nationwide forest inventories as functions of satellite-based information. But little work has been directed at comparing modelling...
A new multicriteria risk mapping approach based on a multiattribute frontier concept
Denys Yemshanov; Frank H. Koch; Yakov Ben-Haim; Marla Downing; Frank Sapio; Marty Siltanen
2013-01-01
Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that...
Single-shot three-dimensional reconstruction based on structured light line pattern
NASA Astrophysics Data System (ADS)
Wang, ZhenZhou; Yang, YongMing
2018-07-01
Reconstruction of the object by single-shot is of great importance in many applications, in which the object is moving or its shape is non-rigid and changes irregularly. In this paper, we propose a single-shot structured light 3D imaging technique that calculates the phase map from the distorted line pattern. This technique makes use of the image processing techniques to segment and cluster the projected structured light line pattern from one single captured image. The coordinates of the clustered lines are extracted to form a low-resolution phase matrix which is then transformed to full-resolution phase map by spline interpolation. The 3D shape of the object is computed from the full-resolution phase map and the 2D camera coordinates. Experimental results show that the proposed method was able to reconstruct the three-dimensional shape of the object robustly from one single image.
NASA Astrophysics Data System (ADS)
Roelfsema, C. M.; Phinn, S. R.; Lyons, M. B.; Kovacs, E.; Saunders, M. I.; Leon, J. X.
2013-12-01
Corals and Submerged Aquatic Vegetation (SAV) are typically found in highly dynamic environments where the magnitude and types of physical and biological processes controlling their distribution, diversity and function changes dramatically. Recent advances in the types of satellite image data and the length of their archives that are available globally, coupled with new techniques for extracting environmental information from these data sets has enabled significant advances to be made in our ability to map and monitor coral and SAV environments. Object Based Image Analysis techniques are one of the most significant advances in information extraction techniques for processing images to deliver environmental information at multiple spatial scales. This poster demonstrates OBIA applied to high spatial resolution satellite image data to map and monitor coral and SAV communities across a variety of environments in the Western Pacific that vary in their extent, biological composition, forcing physical factors and location. High spatial resolution satellite imagery (Quickbird, Ikonos and Worldview2) were acquired coincident with field surveys on each reef to collect georeferenced benthic photo transects, over various areas in the Western Pacific. Base line maps were created, from Roviana Lagoon Solomon island (600 km2), Bikini Atoll Marshall Island (800 Km2), Lizard Island, Australia (30 km2) and time series maps for geomorphic and benthic communities were collected for Heron Reef, Australia (24 km2) and Eastern Banks area of Moreton Bay, Australia (200 km2). The satellite image data were corrected for radiometric and atmospheric distortions to at-surface reflectance. Georeferenced benthic photos were acquired by divers or Autonomous Underwater Vehicles, analysed for benthic cover composition, and used for calibration and validation purposes. Hierarchical mapping from: reef/non-reef (1000's - 10000's m); reef type (100's - 1000's m); 'geomorphic zone' (10's - 100's m); to dominant components of benthic cover compositions (1 - 10's m); and individual benthic cover type scale (0.5-5.0's m), was completed using object based segmentation and semi-automated labelling through membership rules. Accuracy assessment of the satellite image based maps and field data sets scales maps produced with 90% maximum accuracy larger scales and less complex maps, versus 40 % at smaller scale and complex maps. The study showed that current data sets and object based analysis are able to reliable map at various scales and level of complexity covering a variety of extent and environments at various times; as a result science and management can use these tools to assess and understand the ecological processes taking place in coral and SAV environments.
NASA Technical Reports Server (NTRS)
Hollandsworth, Stacey M.; Schoeberl, Mark R.; Morris, Gary A.; Long, Craig; Zhou, Shuntai; Miller, Alvin J.
1999-01-01
In this study we utilize potential vorticity - isentropic (PVI) coordinate transformations as a means of combining ozone data from different sources to construct daily, synthetic three-dimensional ozone fields. This methodology has been used successfully to reconstruct ozone maps in particular regions from aircraft data over the period of the aircraft campaign. We expand this method to create high-resolution daily global maps of profile ozone data, particularly in the lower stratosphere, where high-resolution ozone data are sparse. Ozone climatologies in PVI-space are constructed from satellite-based SAGE II and UARS/HALOE data, both of which-use solar occultation techniques to make high vertical resolution ozone profile measurements, but with low spatial resolution. A climatology from ground-based balloonsonde data is also created. The climatologies are used to establish the relationship between ozone and dynamical variability, which is defined by the potential vorticity (in the form of equivalent latitude) and potential temperature fields. Once a PVI climatology has been created from data taken by one or more instruments, high-resolution daily profile ozone field estimates are constructed based solely on the PVI fields, which are available on a daily basis from NCEP analysis. These profile ozone maps could be used for a variety of applications, including use in conjunction with total ozone maps to create a daily tropospheric ozone product, as input to forecast models, or as a tool for validating independent ozone measurements when correlative data are not available. This technique is limited to regions where the ozone is a long-term tracer and the flow is adiabatic. We evaluate the internal consistency of the technique by transforming the ozone back to physical space and comparing to the original profiles. Biases in the long-term average of the differences are used to identify regions where the technique is consistently introducing errors. Initial results show the technique is useful in the lower stratosphere at most latitudes throughout the year,and in the winter hemisphere in the middle stratosphere. The results are problematic in the summer hemisphere middle stratosphere due to increased ozone photochemistry and weak PV gradients. Alternate techniques in these regions will be discussed. An additional limitation is the quality and resolution of the meteorological data.
Layout Slam with Model Based Loop Closure for 3d Indoor Corridor Reconstruction
NASA Astrophysics Data System (ADS)
Baligh Jahromi, A.; Sohn, G.; Jung, J.; Shahbazi, M.; Kang, J.
2018-05-01
In this paper, we extend a recently proposed visual Simultaneous Localization and Mapping (SLAM) techniques, known as Layout SLAM, to make it robust against error accumulations, abrupt changes of camera orientation and miss-association of newly visited parts of the scene to the previously visited landmarks. To do so, we present a novel technique of loop closing based on layout model matching; i.e., both model information (topology and geometry of reconstructed models) and image information (photometric features) are used to address a loop-closure detection. The advantages of using the layout-related information in the proposed loop-closing technique are twofold. First, it imposes a metric constraint on the global map consistency and, thus, adjusts the mapping scale drifts. Second, it can reduce matching ambiguity in the context of indoor corridors, where the scene is homogenously textured and extracting sufficient amount of distinguishable point features is a challenging task. To test the impact of the proposed technique on the performance of Layout SLAM, we have performed the experiments on wide-angle videos captured by a handheld camera. This dataset was collected from the indoor corridors of a building at York University. The obtained results demonstrate that the proposed method successfully detects the instances of loops while producing very limited trajectory errors.
Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian
2014-01-01
We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this "Atlas-T1w-DUTE" approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered to the "silver standard"; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT and MRI-based attenuation corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally as well as regionally.
NASA Astrophysics Data System (ADS)
Li, Long; Solana, Carmen; Canters, Frank; Kervyn, Matthieu
2017-10-01
Mapping lava flows using satellite images is an important application of remote sensing in volcanology. Several volcanoes have been mapped through remote sensing using a wide range of data, from optical to thermal infrared and radar images, using techniques such as manual mapping, supervised/unsupervised classification, and elevation subtraction. So far, spectral-based mapping applications mainly focus on the use of traditional pixel-based classifiers, without much investigation into the added value of object-based approaches and into advantages of using machine learning algorithms. In this study, Nyamuragira, characterized by a series of > 20 overlapping lava flows erupted over the last century, was used as a case study. The random forest classifier was tested to map lava flows based on pixels and objects. Image classification was conducted for the 20 individual flows and for 8 groups of flows of similar age using a Landsat 8 image and a DEM of the volcano, both at 30-meter spatial resolution. Results show that object-based classification produces maps with continuous and homogeneous lava surfaces, in agreement with the physical characteristics of lava flows, while lava flows mapped through the pixel-based classification are heterogeneous and fragmented including much "salt and pepper noise". In terms of accuracy, both pixel-based and object-based classification performs well but the former results in higher accuracies than the latter except for mapping lava flow age groups without using topographic features. It is concluded that despite spectral similarity, lava flows of contrasting age can be well discriminated and mapped by means of image classification. The classification approach demonstrated in this study only requires easily accessible image data and can be applied to other volcanoes as well if there is sufficient information to calibrate the mapping.
Orsi, Rebecca
2017-02-01
Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
Addressing Inter-set Write-Variation for Improving Lifetime of Non-Volatile Caches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh; Vetter, Jeffrey S
We propose a technique which minimizes inter-set write variation in NVM caches for improving its lifetime. Our technique uses cache coloring scheme to add a software-controlled mapping layer between groups of physical pages (called memory regions) and cache sets. Periodically, the number of writes to different colors of the cache is computed and based on this result, the mapping of a few colors is changed to channel the write traffic to least utilized cache colors. This change helps to achieve wear-leveling.
Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika; ...
2017-05-30
High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika
High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.
NASA Astrophysics Data System (ADS)
Mercer, Jason J.; Westbrook, Cherie J.
2016-11-01
Microform is important in understanding wetland functions and processes. But collecting imagery of and mapping the physical structure of peatlands is often expensive and requires specialized equipment. We assessed the utility of coupling computer vision-based structure from motion with multiview stereo photogrammetry (SfM-MVS) and ground-based photos to map peatland topography. The SfM-MVS technique was tested on an alpine peatland in Banff National Park, Canada, and guidance was provided on minimizing errors. We found that coupling SfM-MVS with ground-based photos taken with a point and shoot camera is a viable and competitive technique for generating ultrahigh-resolution elevations (i.e., <0.01 m, mean absolute error of 0.083 m). In evaluating 100+ viable SfM-MVS data collection and processing scenarios, vegetation was found to considerably influence accuracy. Vegetation class, when accounted for, reduced absolute error by as much as 50%. The logistic flexibility of ground-based SfM-MVS paired with its high resolution, low error, and low cost makes it a research area worth developing as well as a useful addition to the wetland scientists' toolkit.
Qian, Wei; Fan, Guiyan; Liu, Dandan; Zhang, Helong; Wang, Xiaowu; Wu, Jian; Xu, Zhaosheng
2017-04-04
Cultivated spinach (Spinacia oleracea L.) is one of the most widely cultivated types of leafy vegetable in the world, and it has a high nutritional value. Spinach is also an ideal plant for investigating the mechanism of sex determination because it is a dioecious species with separate male and female plants. Some reports on the sex labeling and localization of spinach in the study of molecular markers have surfaced. However, there have only been two reports completed on the genetic map of spinach. The lack of rich and reliable molecular markers and the shortage of high-density linkage maps are important constraints in spinach research work. In this study, a high-density genetic map of spinach based on the Specific-locus Amplified Fragment Sequencing (SLAF-seq) technique was constructed; the sex-determining gene was also finely mapped. Through bio-information analysis, 50.75 Gb of data in total was obtained, including 207.58 million paired-end reads. Finally, 145,456 high-quality SLAF markers were obtained, with 27,800 polymorphic markers and 4080 SLAF markers were finally mapped onto the genetic map after linkage analysis. The map spanned 1,125.97 cM with an average distance of 0.31 cM between the adjacent marker loci. It was divided into 6 linkage groups corresponding to the number of spinach chromosomes. Besides, the combination of Bulked Segregation Analysis (BSA) with SLAF-seq technology(super-BSA) was employed to generate the linkage markers with the sex-determining gene. Combined with the high-density genetic map of spinach, the sex-determining gene X/Y was located at the position of the linkage group (LG) 4 (66.98 cM-69.72 cM and 75.48 cM-92.96 cM), which may be the ideal region for the sex-determining gene. A high-density genetic map of spinach based on the SLAF-seq technique was constructed with a backcross (BC 1 ) population (which is the highest density genetic map of spinach reported at present). At the same time, the sex-determining gene X/Y was mapped to LG4 with super-BSA. This map will offer a suitable basis for further study of spinach, such as gene mapping, map-based cloning of Specific genes, quantitative trait locus (QTL) mapping and marker-assisted selection (MAS). It will also provide an efficient reference for studies on the mechanism of sex determination in other dioecious plants.
High-definition X-ray fluorescence elemental mapping of paintings.
Howard, Daryl L; de Jonge, Martin D; Lau, Deborah; Hay, David; Varcoe-Cocks, Michael; Ryan, Chris G; Kirkham, Robin; Moorhead, Gareth; Paterson, David; Thurrowgood, David
2012-04-03
A historical self-portrait painted by Sir Arthur Streeton (1867-1943) has been studied with fast-scanning X-ray fluorescence microscopy using synchrotron radiation. One of the technique's unique strengths is the ability to reveal metal distributions in the pigments of underlying brushstrokes, thus providing information critical to the interpretation of a painting. We have applied the nondestructive technique with the event-mode Maia X-ray detector, which has the capability to record elemental maps at megapixels per hour with the full X-ray fluorescence spectrum collected per pixel. The painting poses a difficult challenge to conventional X-ray analysis, because it was completely obscured with heavy brushstrokes of highly X-ray absorptive lead white paint (2PbCO(3)·Pb(OH)(2)) by the artist, making it an excellent candidate for the application of the synchrotron-based technique. The 25 megapixel elemental maps were successfully observed through the lead white paint across the 200 × 300 mm(2) scan area. The sweeping brushstrokes of the lead white overpaint contributed significant detrimental structure to the elemental maps. A corrective procedure was devised to enhance the visualization of the elemental maps by using the elastic X-ray scatter as a proxy for the lead white overpaint. We foresee the technique applied to the most demanding of culturally significant artworks where conventional analytical methods are inadequate.
NASA Astrophysics Data System (ADS)
Asal Kzar, Ahmed; Mat Jafri, M. Z.; Hwee San, Lim; Al-Zuky, Ali A.; Mutter, Kussay N.; Hassan Al-Saleh, Anwar
2016-06-01
There are many techniques that have been given for water quality problem, but the remote sensing techniques have proven their success, especially when the artificial neural networks are used as mathematical models with these techniques. Hopfield neural network is one type of artificial neural networks which is common, fast, simple, and efficient, but it when it deals with images that have more than two colours such as remote sensing images. This work has attempted to solve this problem via modifying the network that deals with colour remote sensing images for water quality mapping. A Feed-forward Hopfield Neural Network Algorithm (FHNNA) was modified and used with a satellite colour image from type of Thailand earth observation system (THEOS) for TSS mapping in the Penang strait, Malaysia, through the classification of TSS concentrations. The new algorithm is based essentially on three modifications: using HNN as feed-forward network, considering the weights of bitplanes, and non-self-architecture or zero diagonal of weight matrix, in addition, it depends on a validation data. The achieved map was colour-coded for visual interpretation. The efficiency of the new algorithm has found out by the higher correlation coefficient (R=0.979) and the lower root mean square error (RMSE=4.301) between the validation data that were divided into two groups. One used for the algorithm and the other used for validating the results. The comparison was with the minimum distance classifier. Therefore, TSS mapping of polluted water in Penang strait, Malaysia, can be performed using FHNNA with remote sensing technique (THEOS). It is a new and useful application of HNN, so it is a new model with remote sensing techniques for water quality mapping which is considered important environmental problem.
Non-laser-based scanner for three-dimensional digitization of historical artifacts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hahn, Daniel V.; Baldwin, Kevin C.; Duncan, Donald D
2007-05-20
A 3D scanner, based on incoherent illumination techniques, and associated data-processing algorithms are presented that can be used to scan objects at lateral resolutions ranging from 5 to100 {mu}m (or more) and depth resolutions of approximately 2 {mu}m.The scanner was designed with the specific intent to scan cuneiform tablets but can be utilized for other applications. Photometric stereo techniques are used to obtain both a surface normal map and a parameterized model of the object's bidirectional reflectance distribution function. The normal map is combined with height information,gathered by structured light techniques, to form a consistent 3D surface. Data from Lambertianmore » and specularly diffuse spherical objects are presented and used to quantify the accuracy of the techniques. Scans of a cuneiform tablet are also presented. All presented data are at a lateral resolution of 26.8 {mu}m as this is approximately the minimum resolution deemed necessary to accurately represent cuneiform.« less
NASA Astrophysics Data System (ADS)
Mobasheri, Mohammad Reza; Ghamary-Asl, Mohsen
2011-12-01
Imaging through hyperspectral technology is a powerful tool that can be used to spectrally identify and spatially map materials based on their specific absorption characteristics in electromagnetic spectrum. A robust method called Tetracorder has shown its effectiveness at material identification and mapping, using a set of algorithms within an expert system decision-making framework. In this study, using some stages of Tetracorder, a technique called classification by diagnosing all absorption features (CDAF) is introduced. This technique enables one to assign a class to the most abundant mineral in each pixel with high accuracy. The technique is based on the derivation of information from reflectance spectra of the image. This can be done through extraction of spectral absorption features of any minerals from their respected laboratory-measured reflectance spectra, and comparing it with those extracted from the pixels in the image. The CDAF technique has been executed on the AVIRIS image where the results show an overall accuracy of better than 96%.
Digital mapping techniques '00, workshop proceedings - May 17-20, 2000, Lexington, Kentucky
Soller, David R.
2000-01-01
Introduction: The Digital Mapping Techniques '00 (DMT'00) workshop was attended by 99 technical experts from 42 agencies, universities, and private companies, including representatives from 28 state geological surveys (see Appendix A). This workshop was similar in nature to the first three meetings, held in June, 1997, in Lawrence, Kansas (Soller, 1997), in May, 1998, in Champaign, Illinois (Soller, 1998a), and in May, 1999, in Madison, Wisconsin (Soller, 1999). This year's meeting was hosted by the Kentucky Geological Survey, from May 17 to 20, 2000, on the University of Kentucky campus in Lexington. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. When, based on discussions at the workshop, an attendee adopts or modifies a newly learned technique, the workshop clearly has met that objective. Evidence of learning and cooperation among participating agencies continued to be a highlight of the DMT workshops (see example in Soller, 1998b, and various papers in this volume). The meeting's general goal was to help move the state geological surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and geographic information systems (GIS) analysis. Through oral and poster presentations and special discussion sessions, emphasis was given to: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) continued development of the National Geologic Map Database; 3) progress toward building a standard geologic map data model; 4) field data-collection systems; and 5) map citation and authorship guidelines. Four representatives of the GIS hardware and software vendor community were invited to participate. The four annual DMT workshops were coordinated by the AASG/USGS Data Capture Working Group, which was formed in August, 1996, to support the Association of American State Geologists and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ncgmp.usgs.gov/ngmdbproject/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed to help the Database, and the State and Federal geological surveys, provide more high-quality digital maps to the public.
Adaptive video-based vehicle classification technique for monitoring traffic : [executive summary].
DOT National Transportation Integrated Search
2015-08-01
Federal Highway Administration (FHWA) recommends axle-based classification standards to map : passenger vehicles, single unit trucks, and multi-unit trucks, at Automatic Traffic Recorder (ATR) stations : statewide. Many state Departments of Transport...
Behavior Analysis of Novel Wearable Indoor Mapping System Based on 3D-SLAM.
Lagüela, Susana; Dorado, Iago; Gesto, Manuel; Arias, Pedro; González-Aguilera, Diego; Lorenzo, Henrique
2018-03-02
This paper presents a Wearable Prototype for indoor mapping developed by the University of Vigo. The system is based on a Velodyne LiDAR, acquiring points with 16 rays for a simplistic or low-density 3D representation of reality. With this, a Simultaneous Localization and Mapping (3D-SLAM) method is developed for the mapping and generation of 3D point clouds of scenarios deprived from GNSS signal. The quality of the system presented is validated through the comparison with a commercial indoor mapping system, Zeb-Revo, from the company GeoSLAM and with a terrestrial LiDAR, Faro Focus 3D X330. The first is considered as a relative reference with other mobile systems and is chosen due to its use of the same principle for mapping: SLAM techniques based on Robot Operating System (ROS), while the second is taken as ground-truth for the determination of the final accuracy of the system regarding reality. Results show that the accuracy of the system is mainly determined by the accuracy of the sensor, with little increment in the error introduced by the mapping algorithm.
Optimized MLAA for quantitative non-TOF PET/MR of the brain
NASA Astrophysics Data System (ADS)
Benoit, Didier; Ladefoged, Claes N.; Rezaei, Ahmadreza; Keller, Sune H.; Andersen, Flemming L.; Højgaard, Liselotte; Hansen, Adam E.; Holm, Søren; Nuyts, Johan
2016-12-01
For quantitative tracer distribution in positron emission tomography, attenuation correction is essential. In a hybrid PET/CT system the CT images serve as a basis for generation of the attenuation map, but in PET/MR, the MR images do not have a similarly simple relationship with the attenuation map. Hence attenuation correction in PET/MR systems is more challenging. Typically either of two MR sequences are used: the Dixon or the ultra-short time echo (UTE) techniques. However these sequences have some well-known limitations. In this study, a reconstruction technique based on a modified and optimized non-TOF MLAA is proposed for PET/MR brain imaging. The idea is to tune the parameters of the MLTR applying some information from an attenuation image computed from the UTE sequences and a T1w MR image. In this MLTR algorithm, an {αj} parameter is introduced and optimized in order to drive the algorithm to a final attenuation map most consistent with the emission data. Because the non-TOF MLAA is used, a technique to reduce the cross-talk effect is proposed. In this study, the proposed algorithm is compared to the common reconstruction methods such as OSEM using a CT attenuation map, considered as the reference, and OSEM using the Dixon and UTE attenuation maps. To show the robustness and the reproducibility of the proposed algorithm, a set of 204 [18F]FDG patients, 35 [11C]PiB patients and 1 [18F]FET patient are used. The results show that by choosing an optimized value of {αj} in MLTR, the proposed algorithm improves the results compared to the standard MR-based attenuation correction methods (i.e. OSEM using the Dixon or the UTE attenuation maps), and the cross-talk and the scale problem are limited.
Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005
Soller, David R.
2005-01-01
Intorduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996, to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database?and for the State and Federal geological surveys?to provide more high-quality digital maps to the public. At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; 6) continued development of the National Geologic Map Database; and 7) progress toward building and implementing a standard geologic map data model and standard science language for the U.S. and for North America.
Dooley, Kathryn A; Conover, Damon M; Glinsman, Lisha Deming; Delaney, John K
2014-12-08
Two imaging modalities based on molecular and elemental spectroscopy were used to characterize a painting by Cosimo Tura. Visible-to-near-infrared (400-1680 nm) reflectance imaging spectroscopy (RIS) and X-ray fluorescence (XRF) imaging spectroscopy were employed to identify pigments and determine their spatial distribution with higher confidence than from either technique alone. For example, Mary's red robe was modeled through the distribution of an insect-derived red lake (RIS map) and lead white (XRF lead map), rather than a layer of red lake on vermilion. The RIS image cube was also used to isolate the preparatory design by mapping the reflectance spectra associated with it. In conjunction with results from an earlier RIS study (1650-2500 nm) to map and identify the binding media, a more thorough understanding was gained of the materials and techniques used in the painting. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Application of filtering techniques in preprocessing magnetic data
NASA Astrophysics Data System (ADS)
Liu, Haijun; Yi, Yongping; Yang, Hongxia; Hu, Guochuang; Liu, Guoming
2010-08-01
High precision magnetic exploration is a popular geophysical technique for its simplicity and its effectiveness. The explanation in high precision magnetic exploration is always a difficulty because of the existence of noise and disturbance factors, so it is necessary to find an effective preprocessing method to get rid of the affection of interference factors before further processing. The common way to do this work is by filtering. There are many kinds of filtering methods. In this paper we introduced in detail three popular kinds of filtering techniques including regularized filtering technique, sliding averages filtering technique, compensation smoothing filtering technique. Then we designed the work flow of filtering program based on these techniques and realized it with the help of DELPHI. To check it we applied it to preprocess magnetic data of a certain place in China. Comparing the initial contour map with the filtered contour map, we can see clearly the perfect effect our program. The contour map processed by our program is very smooth and the high frequency parts of data are disappeared. After filtering, we separated useful signals and noisy signals, minor anomaly and major anomaly, local anomaly and regional anomaly. It made us easily to focus on the useful information. Our program can be used to preprocess magnetic data. The results showed the effectiveness of our program.
Satellite SAR applied in offhore wind resource mapping: possibilities and limitations
NASA Astrophysics Data System (ADS)
Hasager, C. B.
Satellite remote sensing of ocean wind fields from Synthetic Aperture Radar (SAR) observations is presented. The study is based on a series of more than 60 ERS-2 SAR satellite scenes from the Horns Rev in the North Sea. The wind climate from the coastline and 80 km offshore is mapped in detail with a resolution of 400 m by 400 m grid cells. Spatial variations in wind speed as a function of wind direction and fetch are observed and discussed. The satellite wind fields are compared to in-situ observations from a tall offshore meteorological mast at which wind speed at 4 levels are analysed. The mast is located 14 km offshore and the wind climate is observed continously since May 1999. For offshore wind resource mapping the SAR-based wind field maps can constitute an alternative to in-situ observations and a practical method is developed for applied use in WAsP (Wind Atlas Analysis and Application Program). The software is the de facto world standard tool used for prediction of wind climate and power production from wind turbines and wind farms. The possibilities and limitations on achieving offshore wind resource estimates using SAR-based wind fields in lieu of in-situ data are discussed. It includes a presentation of the footprint area-averaging techniques tailored for SAR-based wind field maps. Averaging techniques are relevant for the reduction of noise apparent in SAR wind speed maps. Acknowledgments: Danish Research Agency (SAT-WIND Sagsnr. 2058-03-0006) for funding, ESA (EO-1356, AO-153) for ERS-2 SAR scenes, and Elsam Engineering A/S for in-situ met-data.
Assessment of myocardial fibrosis with T1 mapping MRI.
Everett, R J; Stirrat, C G; Semple, S I R; Newby, D E; Dweck, M R; Mirsadraee, S
2016-08-01
Myocardial fibrosis can arise from a range of pathological processes and its presence correlates with adverse clinical outcomes. Cardiac magnetic resonance (CMR) can provide a non-invasive assessment of cardiac structure, function, and tissue characteristics, which includes late gadolinium enhancement (LGE) techniques to identify focal irreversible replacement fibrosis with a high degree of accuracy and reproducibility. Importantly the presence of LGE is consistently associated with adverse outcomes in a range of common cardiac conditions; however, LGE techniques are qualitative and unable to detect diffuse myocardial fibrosis, which is an earlier form of fibrosis preceding replacement fibrosis that may be reversible. Novel T1 mapping techniques allow quantitative CMR assessment of diffuse myocardial fibrosis with the two most common measures being native T1 and extracellular volume (ECV) fraction. Native T1 differentiates normal from infarcted myocardium, is abnormal in hypertrophic cardiomyopathy, and may be particularly useful in the diagnosis of Anderson-Fabry disease and amyloidosis. ECV is a surrogate measure of the extracellular space and is equivalent to the myocardial volume of distribution of the gadolinium-based contrast medium. It is reproducible and correlates well with fibrosis on histology. ECV is abnormal in patients with cardiac failure and aortic stenosis, and is associated with functional impairment in these groups. T1 mapping techniques promise to allow earlier detection of disease, monitor disease progression, and inform prognosis; however, limitations remain. In particular, reference ranges are lacking for T1 mapping values as these are influenced by specific CMR techniques and magnetic field strength. In addition, there is significant overlap between T1 mapping values in healthy controls and most disease states, particularly using native T1, limiting the clinical application of these techniques at present. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Mapping the layer count of few-layer hexagonal boron nitride at high lateral spatial resolutions
NASA Astrophysics Data System (ADS)
Mohsin, Ali; Cross, Nicholas G.; Liu, Lei; Watanabe, Kenji; Taniguchi, Takashi; Duscher, Gerd; Gu, Gong
2018-01-01
Layer count control and uniformity of two dimensional (2D) layered materials are critical to the investigation of their properties and to their electronic device applications, but methods to map 2D material layer count at nanometer-level lateral spatial resolutions have been lacking. Here, we demonstrate a method based on two complementary techniques widely available in transmission electron microscopes (TEMs) to map the layer count of multilayer hexagonal boron nitride (h-BN) films. The mass-thickness contrast in high-angle annular dark-field (HAADF) imaging in the scanning transmission electron microscope (STEM) mode allows for thickness determination in atomically clean regions with high spatial resolution (sub-nanometer), but is limited by surface contamination. To complement, another technique based on the boron K ionization edge in the electron energy loss spectroscopy spectrum (EELS) of h-BN is developed to quantify the layer count so that surface contamination does not cause an overestimate, albeit at a lower spatial resolution (nanometers). The two techniques agree remarkably well in atomically clean regions with discrepancies within ±1 layer. For the first time, the layer count uniformity on the scale of nanometers is quantified for a 2D material. The methodology is applicable to layer count mapping of other 2D layered materials, paving the way toward the synthesis of multilayer 2D materials with homogeneous layer count.
Semantic Image Based Geolocation Given a Map (Author’s Initial Manuscript)
2016-09-01
novel technique for detection and identification of building facades from geo-tagged reference view using the map and geometry of the building facades. We...2D map of the environment, and geometry of building facades. We evaluate our approach for building identification and geo-localization on a new...location recognition and building identification is done by matching the query view to a reference set, followed by estimation of 3D building facades
Rapid Landslide Mapping by Means of Post-Event Polarimetric SAR Imagery
NASA Astrophysics Data System (ADS)
Plank, Simon; Martinis, Sandro; Twele, Andre
2016-08-01
Rapid mapping of landslides, quickly providing information about the extent of the affected area and type and grade of damage, is crucial to enable fast crisis response. Reviewing the literature shows that most synthetic aperture radar (SAR) data-based landslide mapping procedures use change detection techniques. However, the required very high resolution (VHR) pre-event SAR imagery, acquired shortly before the landslide event, is commonly not available. Due to limitations in onboard disk space and downlink transmission rates modern VHR SAR missions do not systematically cover the entire world. We present a fast and robust procedure for mapping of landslides, based on change detection between freely available and systematically acquired pre-event optical and post-event polarimetric SAR data.
NASA Technical Reports Server (NTRS)
Nez, G. (Principal Investigator); Mutter, D.
1977-01-01
The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.
Bistatic LIDAR experiment proposed for the shuttle/tethered satellite system missions
NASA Technical Reports Server (NTRS)
Mccomas, D. J.; Spense, H. E.; Karl, R. R.; Horak, H. G.; Wilkerson, T. D.
1986-01-01
A new experiment concept has been proposed for the shuttle/tethered satellite system missions, which can provide high resolution, global density mappings of certain ionospheric species. The technique utilizes bistatic LIDAR to take advantage of the unique dual platform configuration offered by these missions. A tuned, shuttle-based laser is used to excite a column of the atmosphere adjacent to the tethered satellite, while triangulating photometic detectors on the satellite are employed to measure the fluorescence from sections of the column. The fluorescent intensity at the detectors is increased about six decades over both ground-based and monostatic shuttle-based LIDAR sounding of the same region. In addition, the orbital motion of the Shuttle provides for quasi-global mapping unattainable with ground-based observations. Since this technique provides such vastly improved resolution on a synoptic scale, many important middle atmospheric studies, heretofore untenable, may soon be addressed.
A new gradient shimming method based on undistorted field map of B0 inhomogeneity.
Bao, Qingjia; Chen, Fang; Chen, Li; Song, Kan; Liu, Zao; Liu, Chaoyang
2016-04-01
Most existing gradient shimming methods for NMR spectrometers estimate field maps that resolve B0 inhomogeneity spatially from dual gradient-echo (GRE) images acquired at different echo times. However, the distortions induced by B0 inhomogeneity that always exists in the GRE images can result in estimated field maps that are distorted in both geometry and intensity, leading to inaccurate shimming. This work proposes a new gradient shimming method based on undistorted field map of B0 inhomogeneity obtained by a more accurate field map estimation technique. Compared to the traditional field map estimation method, this new method exploits both the positive and negative polarities of the frequency encoded gradients to eliminate the distortions caused by B0 inhomogeneity in the field map. Next, the corresponding automatic post-data procedure is introduced to obtain undistorted B0 field map based on knowledge of the invariant characteristics of the B0 inhomogeneity and the variant polarity of the encoded gradient. The experimental results on both simulated and real gradient shimming tests demonstrate the high performance of this new method. Copyright © 2015 Elsevier Inc. All rights reserved.
Bosch, Thijs; Verkade, Erwin; van Luit, Martijn; Pot, Bruno; Vauterin, Paul; Burggrave, Ronald; Savelkoul, Paul; Kluytmans, Jan; Schouls, Leo
2013-01-01
After its emergence in 2003, a livestock-associated (LA-)MRSA clade (CC398) has caused an impressive increase in the number of isolates submitted for the Dutch national MRSA surveillance and now comprises 40% of all isolates. The currently used molecular typing techniques have limited discriminatory power for this MRSA clade, which hampers studies on the origin and transmission routes. Recently, a new molecular analysis technique named whole genome mapping was introduced. This method creates high-resolution, ordered whole genome restriction maps that may have potential for strain typing. In this study, we assessed and validated the capability of whole genome mapping to differentiate LA-MRSA isolates. Multiple validation experiments showed that whole genome mapping produced highly reproducible results. Assessment of the technique on two well-documented MRSA outbreaks showed that whole genome mapping was able to confirm one outbreak, but revealed major differences between the maps of a second, indicating that not all isolates belonged to this outbreak. Whole genome mapping of LA-MRSA isolates that were epidemiologically unlinked provided a much higher discriminatory power than spa-typing or MLVA. In contrast, maps created from LA-MRSA isolates obtained during a proven LA-MRSA outbreak were nearly indistinguishable showing that transmission of LA-MRSA can be detected by whole genome mapping. Finally, whole genome maps of LA-MRSA isolates originating from two unrelated veterinarians and their household members showed that veterinarians may carry and transmit different LA-MRSA strains at the same time. No such conclusions could be drawn based spa-typing and MLVA. Although PFGE seems to be suitable for molecular typing of LA-MRSA, WGM provides a much higher discriminatory power. Furthermore, whole genome mapping can provide a comparison with other maps within 2 days after the bacterial culture is received, making it suitable to investigate transmission events and outbreaks caused by LA-MRSA. PMID:23805225
An Experimental Realization of a Chaos-Based Secure Communication Using Arduino Microcontrollers
Zapateiro De la Hoz, Mauricio; Vidal, Yolanda
2015-01-01
Security and secrecy are some of the important concerns in the communications world. In the last years, several encryption techniques have been proposed in order to improve the secrecy of the information transmitted. Chaos-based encryption techniques are being widely studied as part of the problem because of the highly unpredictable and random-look nature of the chaotic signals. In this paper we propose a digital-based communication system that uses the logistic map which is a mathematically simple model that is chaotic under certain conditions. The input message signal is modulated using a simple Delta modulator and encrypted using a logistic map. The key signal is also encrypted using the same logistic map with different initial conditions. In the receiver side, the binary-coded message is decrypted using the encrypted key signal that is sent through one of the communication channels. The proposed scheme is experimentally tested using Arduino shields which are simple yet powerful development kits that allows for the implementation of the communication system for testing purposes. PMID:26413563
NASA Astrophysics Data System (ADS)
Han, Yang; Wang, Shutao; Payen, Thomas; Konofagou, Elisa
2017-04-01
The successful clinical application of high intensity focused ultrasound (HIFU) ablation depends on reliable monitoring of the lesion formation. Harmonic motion imaging guided focused ultrasound (HMIgFUS) is an ultrasound-based elasticity imaging technique, which monitors HIFU ablation based on the stiffness change of the tissue instead of the echo intensity change in conventional B-mode monitoring, rendering it potentially more sensitive to lesion development. Our group has shown that predicting the lesion location based on the radiation force-excited region is feasible during HMIgFUS. In this study, the feasibility of a fast lesion mapping method is explored to directly monitor the lesion map during HIFU. The harmonic motion imaging (HMI) lesion map was generated by subtracting the reference HMI image from the present HMI peak-to-peak displacement map, as streamed on the computer display. The dimensions of the HMIgFUS lesions were compared against gross pathology. Excellent agreement was found between the lesion depth (r 2 = 0.81, slope = 0.90), width (r 2 = 0.85, slope = 1.12) and area (r 2 = 0.58, slope = 0.75). In vivo feasibility was assessed in a mouse with a pancreatic tumor. These findings demonstrate that HMIgFUS can successfully map thermal lesions and monitor lesion development in real time in vitro and in vivo. The HMIgFUS technique may therefore constitute a novel clinical tool for HIFU treatment monitoring.
Han, Yang; Wang, Shutao; Payen, Thomas; Konofagou, Elisa
2017-04-21
The successful clinical application of high intensity focused ultrasound (HIFU) ablation depends on reliable monitoring of the lesion formation. Harmonic motion imaging guided focused ultrasound (HMIgFUS) is an ultrasound-based elasticity imaging technique, which monitors HIFU ablation based on the stiffness change of the tissue instead of the echo intensity change in conventional B-mode monitoring, rendering it potentially more sensitive to lesion development. Our group has shown that predicting the lesion location based on the radiation force-excited region is feasible during HMIgFUS. In this study, the feasibility of a fast lesion mapping method is explored to directly monitor the lesion map during HIFU. The harmonic motion imaging (HMI) lesion map was generated by subtracting the reference HMI image from the present HMI peak-to-peak displacement map, as streamed on the computer display. The dimensions of the HMIgFUS lesions were compared against gross pathology. Excellent agreement was found between the lesion depth (r 2 = 0.81, slope = 0.90), width (r 2 = 0.85, slope = 1.12) and area (r 2 = 0.58, slope = 0.75). In vivo feasibility was assessed in a mouse with a pancreatic tumor. These findings demonstrate that HMIgFUS can successfully map thermal lesions and monitor lesion development in real time in vitro and in vivo. The HMIgFUS technique may therefore constitute a novel clinical tool for HIFU treatment monitoring.
Teaching children the structure of science
NASA Astrophysics Data System (ADS)
Börner, Katy; Palmer, Fileve; Davis, Julie M.; Hardy, Elisha; Uzzo, Stephen M.; Hook, Bryan J.
2009-01-01
Maps of the world are common in classroom settings. They are used to teach the juxtaposition of natural and political functions, mineral resources, political, cultural and geographical boundaries; occurrences of processes such as tectonic drift; spreading of epidemics; and weather forecasts, among others. Recent work in scientometrics aims to create a map of science encompassing our collective scholarly knowledge. Maps of science can be used to see disciplinary boundaries; the origin of ideas, expertise, techniques, or tools; the birth, evolution, merging, splitting, and death of scientific disciplines; the spreading of ideas and technology; emerging research frontiers and bursts of activity; etc. Just like the first maps of our planet, the first maps of science are neither perfect nor correct. Today's science maps are predominantly generated based on English scholarly data: Techniques and procedures to achieve local and global accuracy of these maps are still being refined, and a visual language to communicate something as abstract and complex as science is still being developed. Yet, the maps are successfully used by institutions or individuals who can afford them to guide science policy decision making, economic decision making, or as visual interfaces to digital libraries. This paper presents the process and results of creating hands-on science maps for kids that teaches children ages 4-14 about the structure of scientific disciplines. The maps were tested in both formal and informal science education environments. The results show that children can easily transfer their (world) map and concept map reading skills to utilize maps of science in interesting ways.
Vegetation burn severity mapping using Landsat-8 and WorldView-2
Wu, Zhuoting; Middleton, Barry R.; Hetzler, Robert; Vogel, John M.; Dye, Dennis G.
2015-01-01
We used remotely sensed data from the Landsat-8 and WorldView-2 satellites to estimate vegetation burn severity of the Creek Fire on the San Carlos Apache Reservation, where wildfire occurrences affect the Tribe's crucial livestock and logging industries. Accurate pre- and post-fire canopy maps at high (0.5-meter) resolution were created from World- View-2 data to generate canopy loss maps, and multiple indices from pre- and post-fire Landsat-8 images were used to evaluate vegetation burn severity. Normalized difference vegetation index based vegetation burn severity map had the highest correlation coefficients with canopy loss map from WorldView-2. Two distinct approaches - canopy loss mapping from WorldView-2 and spectral index differencing from Landsat-8 - agreed well with the field-based burn severity estimates and are both effective for vegetation burn severity mapping. Canopy loss maps created with WorldView-2 imagery add to a short list of accurate vegetation burn severity mapping techniques that can help guide effective management of forest resources on the San Carlos Apache Reservation, and the broader fire-prone regions of the Southwest.
Park, Sung-Hong; Wang, Danny J J; Duong, Timothy Q
2013-09-01
We implemented pseudo-continuous ASL (pCASL) with 2D and 3D balanced steady state free precession (bSSFP) readout for mapping blood flow in the human brain, retina, and kidney, free of distortion and signal dropout, which are typically observed in the most commonly used echo-planar imaging acquisition. High resolution functional brain imaging in the human visual cortex was feasible with 3D bSSFP pCASL. Blood flow of the human retina could be imaged with pCASL and bSSFP in conjunction with a phase cycling approach to suppress the banding artifacts associated with bSSFP. Furthermore, bSSFP based pCASL enabled us to map renal blood flow within a single breath hold. Control and test-retest experiments suggested that the measured blood flow values in retina and kidney were reliable. Because there is no specific imaging tool for mapping human retina blood flow and the standard contrast agent technique for mapping renal blood flow can cause problems for patients with kidney dysfunction, bSSFP based pCASL may provide a useful tool for the diagnosis of retinal and renal diseases and can complement existing imaging techniques. Copyright © 2013 Elsevier Inc. All rights reserved.
Single-Molecule Denaturation Mapping of Genomic DNA in Nanofluidic Channels
NASA Astrophysics Data System (ADS)
Reisner, Walter; Larsen, Niels; Kristensen, Anders; Tegenfeldt, Jonas O.; Flyvbjerg, Henrik
2009-03-01
We have developed a new DNA barcoding technique based on the partial denaturation of extended fluorescently labeled DNA molecules. We partially melt DNA extended in nanofluidic channels via a combination of local heating and added chemical denaturants. The melted molecules, imaged via a standard fluorescence videomicroscopy setup, exhibit a nonuniform fluorescence profile corresponding to a series of local dips and peaks in the intensity trace along the stretched molecule. We show that this barcode is consistent with the presence of locally melted regions and can be explained by calculations of sequence-dependent melting probability. We believe this melting mapping technology is the first optically based single molecule technique sensitive to genome wide sequence variation that does not require an additional enzymatic labeling or restriction scheme.
NASA Astrophysics Data System (ADS)
Singh, Arun K.; Auton, Gregory; Hill, Ernie; Song, Aimin
2018-07-01
Due to a very high carrier concentration and low band gap, graphene based self-switching diodes do not demonstrate a very high rectification ratio. Despite that, it takes the advantage of graphene’s high carrier mobility and has been shown to work at very high microwave frequencies. However, the AC component of these devices is hidden in the very linear current–voltage characteristics. Here, we extract and quantitatively study the device capacitance that determines the device nonlinearity by implementing a conformal mapping technique. The estimated value of the nonlinear component or curvature coefficient from DC results based on Shichman–Hodges model predicts the rectified output voltage, which is in good agreement with the experimental RF results.
Inverse full state hybrid projective synchronization for chaotic maps with different dimensions
NASA Astrophysics Data System (ADS)
Ouannas, Adel; Grassi, Giuseppe
2016-09-01
A new synchronization scheme for chaotic (hyperchaotic) maps with different dimensions is presented. Specifically, given a drive system map with dimension n and a response system with dimension m, the proposed approach enables each drive system state to be synchronized with a linear response combination of the response system states. The method, based on the Lyapunov stability theory and the pole placement technique, presents some useful features: (i) it enables synchronization to be achieved for both cases of n < m and n > m; (ii) it is rigorous, being based on theorems; (iii) it can be readily applied to any chaotic (hyperchaotic) maps defined to date. Finally, the capability of the approach is illustrated by synchronization examples between the two-dimensional Hénon map (as the drive system) and the three-dimensional hyperchaotic Wang map (as the response system), and the three-dimensional Hénon-like map (as the drive system) and the two-dimensional Lorenz discrete-time system (as the response system).
Comparison of Image Restoration Methods for Lunar Epithermal Neutron Emission Mapping
NASA Technical Reports Server (NTRS)
McClanahan, T. P.; Ivatury, V.; Milikh, G.; Nandikotkur, G.; Puetter, R. C.; Sagdeev, R. Z.; Usikov, D.; Mitrofanov, I. G.
2009-01-01
Orbital measurements of neutrons by the Lunar Exploring Neutron Detector (LEND) onboard the Lunar Reconnaissance Orbiter are being used to quantify the spatial distribution of near surface hydrogen (H). Inferred H concentration maps have low signal-to-noise (SN) and image restoration (IR) techniques are being studied to enhance results. A single-blind. two-phase study is described in which four teams of researchers independently developed image restoration techniques optimized for LEND data. Synthetic lunar epithermal neutron emission maps were derived from LEND simulations. These data were used as ground truth to determine the relative quantitative performance of the IR methods vs. a default denoising (smoothing) technique. We review and used factors influencing orbital remote sensing of neutrons emitted from the lunar surface to develop a database of synthetic "true" maps for performance evaluation. A prior independent training phase was implemented for each technique to assure methods were optimized before the blind trial. Method performance was determined using several regional root-mean-square error metrics specific to epithermal signals of interest. Results indicate unbiased IR methods realize only small signal gains in most of the tested metrics. This suggests other physically based modeling assumptions are required to produce appreciable signal gains in similar low SN IR applications.
Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.
Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J
2017-10-20
This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to differentiate non-invasively benign versus malignant breast lesions. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of mean, contrast, correlation, energy and homogeneity features of parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences (p < 0.05) between the two lesion types. A hybrid biomarker developed using a stepwise feature selection method could classify the legions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in clinic.
Checking of individuality by DNA profiling.
Brdicka, R; Nürnberg, P
1993-08-25
A review of methods of DNA analysis used in forensic medicine for identification, paternity testing, etc. is provided. Among other techniques, DNA fingerprinting using different probes and polymerase chain reaction-based techniques such as amplified sequence polymorphisms and minisatellite variant repeat mapping are thoroughly described and both theoretical and practical aspects are discussed.
NASA Astrophysics Data System (ADS)
Zhang, Caiyun; Smith, Molly; Lv, Jie; Fang, Chaoyang
2017-05-01
Mapping plant communities and documenting their changes is critical to the on-going Florida Everglades restoration project. In this study, a framework was designed to map dominant vegetation communities and inventory their changes in the Florida Everglades Water Conservation Area 2A (WCA-2A) using time series Landsat images spanning 1996-2016. The object-based change analysis technique was combined in the framework. A hybrid pixel/object-based change detection approach was developed to effectively collect training samples for historical images with sparse reference data. An object-based quantification approach was also developed to assess the expansion/reduction of a specific class such as cattail (an invasive species in the Everglades) from the object-based classifications of two dates of imagery. The study confirmed the results in the literature that cattail was largely expanded during 1996-2007. It also revealed that cattail expansion was constrained after 2007. Application of time series Landsat data is valuable to document vegetation changes for the WCA-2A impoundment. The digital techniques developed will benefit global wetland mapping and change analysis in general, and the Florida Everglades WCA-2A in particular.
Ningaloo Reef: Shallow Marine Habitats Mapped Using a Hyperspectral Sensor
Kobryn, Halina T.; Wouters, Kristin; Beckley, Lynnath E.; Heege, Thomas
2013-01-01
Research, monitoring and management of large marine protected areas require detailed and up-to-date habitat maps. Ningaloo Marine Park (including the Muiron Islands) in north-western Australia (stretching across three degrees of latitude) was mapped to 20 m depth using HyMap airborne hyperspectral imagery (125 bands) at 3.5 m resolution across the 762 km2 of reef environment between the shoreline and reef slope. The imagery was corrected for atmospheric, air-water interface and water column influences to retrieve bottom reflectance and bathymetry using the physics-based Modular Inversion and Processing System. Using field-validated, image-derived spectra from a representative range of cover types, the classification combined a semi-automated, pixel-based approach with fuzzy logic and derivative techniques. Five thematic classification levels for benthic cover (with probability maps) were generated with varying degrees of detail, ranging from a basic one with three classes (biotic, abiotic and mixed) to the most detailed with 46 classes. The latter consisted of all abiotic and biotic seabed components and hard coral growth forms in dominant or mixed states. The overall accuracy of mapping for the most detailed maps was 70% for the highest classification level. Macro-algal communities formed most of the benthic cover, while hard and soft corals represented only about 7% of the mapped area (58.6 km2). Dense tabulate coral was the largest coral mosaic type (37% of all corals) and the rest of the corals were a mix of tabulate, digitate, massive and soft corals. Our results show that for this shallow, fringing reef environment situated in the arid tropics, hyperspectral remote sensing techniques can offer an efficient and cost-effective approach to mapping and monitoring reef habitats over large, remote and inaccessible areas. PMID:23922921
Object based technique for delineating and mapping 15 tree species using VHR WorldView-2 imagery
NASA Astrophysics Data System (ADS)
Mustafa, Yaseen T.; Habeeb, Hindav N.
2014-10-01
Monitoring and analyzing forests and trees are required task to manage and establish a good plan for the forest sustainability. To achieve such a task, information and data collection of the trees are requested. The fastest way and relatively low cost technique is by using satellite remote sensing. In this study, we proposed an approach to identify and map 15 tree species in the Mangish sub-district, Kurdistan Region-Iraq. Image-objects (IOs) were used as the tree species mapping unit. This is achieved using the shadow index, normalized difference vegetation index and texture measurements. Four classification methods (Maximum Likelihood, Mahalanobis Distance, Neural Network, and Spectral Angel Mapper) were used to classify IOs using selected IO features derived from WorldView-2 imagery. Results showed that overall accuracy was increased 5-8% using the Neural Network method compared with other methods with a Kappa coefficient of 69%. This technique gives reasonable results of various tree species classifications by means of applying the Neural Network method with IOs techniques on WorldView-2 imagery.
NASA Astrophysics Data System (ADS)
Aydogan, D.
2007-04-01
An image processing technique called the cellular neural network (CNN) approach is used in this study to locate geological features giving rise to gravity anomalies such as faults or the boundary of two geologic zones. CNN is a stochastic image processing technique based on template optimization using the neighborhood relationships of cells. These cells can be characterized by a functional block diagram that is typical of neural network theory. The functionality of CNN is described in its entirety by a number of small matrices (A, B and I) called the cloning template. CNN can also be considered to be a nonlinear convolution of these matrices. This template describes the strength of the nearest neighbor interconnections in the network. The recurrent perceptron learning algorithm (RPLA) is used in optimization of cloning template. The CNN and standard Canny algorithms were first tested on two sets of synthetic gravity data with the aim of checking the reliability of the proposed approach. The CNN method was compared with classical derivative techniques by applying the cross-correlation method (CC) to the same anomaly map as this latter approach can detect some features that are difficult to identify on the Bouguer anomaly maps. This approach was then applied to the Bouguer anomaly map of Biga and its surrounding area, in Turkey. Structural features in the area between Bandirma, Biga, Yenice and Gonen in the southwest Marmara region are investigated by applying the CNN and CC to the Bouguer anomaly map. Faults identified by these algorithms are generally in accordance with previously mapped surface faults. These examples show that the geologic boundaries can be detected from Bouguer anomaly maps using the cloning template approach. A visual evaluation of the outputs of the CNN and CC approaches is carried out, and the results are compared with each other. This approach provides quantitative solutions based on just a few assumptions, which makes the method more powerful than the classical methods.
Fjodorova, Natalja; Novič, Marjana
2012-01-01
The knowledge-based Toxtree expert system (SAR approach) was integrated with the statistically based counter propagation artificial neural network (CP ANN) model (QSAR approach) to contribute to a better mechanistic understanding of a carcinogenicity model for non-congeneric chemicals using Dragon descriptors and carcinogenic potency for rats as a response. The transparency of the CP ANN algorithm was demonstrated using intrinsic mapping technique specifically Kohonen maps. Chemical structures were represented by Dragon descriptors that express the structural and electronic features of molecules such as their shape and electronic surrounding related to reactivity of molecules. It was illustrated how the descriptors are correlated with particular structural alerts (SAs) for carcinogenicity with recognized mechanistic link to carcinogenic activity. Moreover, the Kohonen mapping technique enables one to examine the separation of carcinogens and non-carcinogens (for rats) within a family of chemicals with a particular SA for carcinogenicity. The mechanistic interpretation of models is important for the evaluation of safety of chemicals. PMID:24688639
[Application of electronic fence technology based on GIS in Oncomelania hupensis snail monitoring].
Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang
2017-07-27
To study the application of Geographic Information System (GIS) electronic fence technique in Oncomelania hupensis snail monitoring. The electronic fence was set around the history and existing snail environments in the electronic map, the information about snail monitoring and controlling was linked to the electronic fence, and the snail monitoring information system was established on these bases. The monitoring information was input through the computer and smart phone. The electronic fence around the history and existing snail environments was set in the electronic map (Baidu map), and the snail monitoring information system and smart phone APP were established. The monitoring information was input and upload real-time, and the snail monitoring information was demonstrated in real time on Baidu map. By using the electronic fence technology based on GIS, the unique "environment electronic archives" for each snail monitoring environment can be established in the electronic map, and real-time, dynamic monitoring and visual management can be realized.
Techniques, problems and uses of mega-geomorphological mapping
NASA Technical Reports Server (NTRS)
Embleton, C.
1985-01-01
A plea for a program of global geomorphological mapping based on remote sensing data is presented. It is argued that the program is a necessary step in bringing together the rapidly evolving concepts of plate tectonics with the science of geomorphology. Geomorphologists are urged to bring temporal scales into their subject and to abandon their recent isolation from tectonics and geological history. It is suggested that a start be made with a new geomorphological map of Europe, utilizing the latest space technology.
NASA Astrophysics Data System (ADS)
Pelikan, Erich; Vogelsang, Frank; Tolxdorff, Thomas
1996-04-01
The texture-based segmentation of x-ray images of focal bone lesions using topological maps is introduced. Texture characteristics are described by image-point correlation of feature images to feature vectors. For the segmentation, the topological map is labeled using an improved labeling strategy. Results of the technique are demonstrated on original and synthetic x-ray images and quantified with the aid of quality measures. In addition, a classifier-specific contribution analysis is applied for assessing the feature space.
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
On the reconstruction of the surface structure of the spotted stars
NASA Astrophysics Data System (ADS)
Kolbin, A. I.; Shimansky, V. V.; Sakhibullin, N. A.
2013-07-01
We have developed and tested a light-curve inversion technique for photometric mapping of spotted stars. The surface of a spotted star is partitioned into small area elements, over which a search is carried out for the intensity distribution providing the best agreement between the observed and model light curves within a specified uncertainty. We have tested mapping techniques based on the use of both a single light curve and several light curves obtained in different photometric bands. Surface reconstruction artifacts due to the ill-posed nature of the problem have been identified.
Crowdsourcing The National Map
McCartney, Elizabeth; Craun, Kari J.; Korris, Erin M.; Brostuen, David A.; Moore, Laurence R.
2015-01-01
Using crowdsourcing techniques, the US Geological Survey’s (USGS) Volunteered Geographic Information (VGI) project known as “The National Map Corps (TNMCorps)” encourages citizen scientists to collect and edit data about man-made structures in an effort to provide accurate and authoritative map data for the USGS National Geospatial Program’s web-based The National Map. VGI is not new to the USGS, but past efforts have been hampered by available technologies. Building on lessons learned, TNMCorps volunteers are successfully editing 10 different structure types in all 50 states as well as Puerto Rico and the US Virgin Islands.
Using satellite data in map design and production
Hutchinson, John A.
2002-01-01
Satellite image maps have been produced by the U.S. Geological Survey (USGS) since shortly after the launch of the first Landsat satellite in 1972. Over the years, the use of image data to design and produce maps has developed from a manual and photographic process to one that incorporates geographic information systems, desktop publishing, and digital prepress techniques. At the same time, the content of most image-based maps produced by the USGS has shifted from raw image data to land cover or other information layers derived from satellite imagery, often portrayed in combination with shaded relief.
Weighted image de-fogging using luminance dark prior
NASA Astrophysics Data System (ADS)
Kansal, Isha; Kasana, Singara Singh
2017-10-01
In this work, the weighted image de-fogging process based upon dark channel prior is modified by using luminance dark prior. Dark channel prior estimates the transmission by using three colour channels whereas luminance dark prior does the same by making use of only Y component of YUV colour space. For each pixel in a patch of ? size, the luminance dark prior uses ? pixels, rather than ? pixels used in DCP technique, which speeds up the de-fogging process. To estimate the transmission map, weighted approach based upon difference prior is used which mitigates halo artefacts at the time of transmission estimation. The major drawback of weighted technique is that it does not maintain the constancy of the transmission in a local patch even if there are no significant depth disruptions, due to which the de-fogged image looks over smooth and has low contrast. Apart from this, in some images, weighted transmission still carries less visible halo artefacts. Therefore, Gaussian filter is used to blur the estimated weighted transmission map which enhances the contrast of de-fogged images. In addition to this, a novel approach is proposed to remove the pixels belonging to bright light source(s) during the atmospheric light estimation process based upon histogram of YUV colour space. To show the effectiveness, the proposed technique is compared with existing techniques. This comparison shows that the proposed technique performs better than the existing techniques.
ERIC Educational Resources Information Center
Chiou, Chei-Chang; Lee, Li-Tze; Tien, Li-Chu; Wang, Yu-Min
2017-01-01
This study explored the effectiveness of different concept mapping techniques on the learning achievement of senior accounting students and whether achievements attained using various techniques are affected by different learning styles. The techniques are computer-assisted construct-by-self-concept mapping (CACSB), computer-assisted…
D Model Visualization Enhancements in Real-Time Game Engines
NASA Astrophysics Data System (ADS)
Merlo, A.; Sánchez Belenguer, C.; Vendrell Vidal, E.; Fantini, F.; Aliperta, A.
2013-02-01
This paper describes two procedures used to disseminate tangible cultural heritage through real-time 3D simulations providing accurate-scientific representations. The main idea is to create simple geometries (with low-poly count) and apply two different texture maps to them: a normal map and a displacement map. There are two ways to achieve models that fit with normal or displacement maps: with the former (normal maps), the number of polygons in the reality-based model may be dramatically reduced by decimation algorithms and then normals may be calculated by rendering them to texture solutions (baking). With the latter, a LOD model is needed; its topology has to be quad-dominant for it to be converted to a good quality subdivision surface (with consistent tangency and curvature all over). The subdivision surface is constructed using methodologies for the construction of assets borrowed from character animation: these techniques have been recently implemented in many entertainment applications known as "retopology". The normal map is used as usual, in order to shade the surface of the model in a realistic way. The displacement map is used to finish, in real-time, the flat faces of the object, by adding the geometric detail missing in the low-poly models. The accuracy of the resulting geometry is progressively refined based on the distance from the viewing point, so the result is like a continuous level of detail, the only difference being that there is no need to create different 3D models for one and the same object. All geometric detail is calculated in real-time according to the displacement map. This approach can be used in Unity, a real-time 3D engine originally designed for developing computer games. It provides a powerful rendering engine, fully integrated with a complete set of intuitive tools and rapid workflows that allow users to easily create interactive 3D contents. With the release of Unity 4.0, new rendering features have been added, including DirectX 11 support. Real-time tessellation is a technique that can be applied by using such technology. Since the displacement and the resulting geometry are calculated by the GPU, the time-based execution cost of this technique is very low.
Ehler, Martin; Dobrosotskaya, Julia; Cunningham, Denise; Wong, Wai T.; Chew, Emily Y.; Czaja, Wojtek; Bonner, Robert F.
2015-01-01
We introduce and describe a novel non-invasive in-vivo method for mapping local rod rhodopsin distribution in the human retina over a 30-degree field. Our approach is based on analyzing the brightening of detected lipofuscin autofluorescence within small pixel clusters in registered imaging sequences taken with a commercial 488nm confocal scanning laser ophthalmoscope (cSLO) over a 1 minute period. We modeled the kinetics of rhodopsin bleaching by applying variational optimization techniques from applied mathematics. The physical model and the numerical analysis with its implementation are outlined in detail. This new technique enables the creation of spatial maps of the retinal rhodopsin and retinal pigment epithelium (RPE) bisretinoid distribution with an ≈ 50μm resolution. PMID:26196397
NASA Astrophysics Data System (ADS)
Liu, Zhanwen; Feng, Yan; Chen, Hang; Jiao, Licheng
2017-10-01
A novel and effective image fusion method is proposed for creating a highly informative and smooth surface of fused image through merging visible and infrared images. Firstly, a two-scale non-subsampled shearlet transform (NSST) is employed to decompose the visible and infrared images into detail layers and one base layer. Then, phase congruency is adopted to extract the saliency maps from the detail layers and a guided filtering is proposed to compute the filtering output of base layer and saliency maps. Next, a novel weighted average technique is used to make full use of scene consistency for fusion and obtaining coefficients map. Finally the fusion image was acquired by taking inverse NSST of the fused coefficients map. Experiments show that the proposed approach can achieve better performance than other methods in terms of subjective visual effect and objective assessment.
Measurable realistic image-based 3D mapping
NASA Astrophysics Data System (ADS)
Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.
2011-12-01
Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data is obtained for the construction of 3D visualized models.The3D map not only provides the capabilities of 3D measurements and knowledge mining, but also provides the virtual experienceof places of interest, such as demonstrated in the Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially for automatic implementation of 3D models and the representation of complicated surfacesthat still need improvements with in the visualisation techniques. The shortcoming of 3D model-based maps is the limitation of detailed coverage since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information of the real world than 3D model-based maps. The image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of stereo images. The panoramic function makes 3D maps more interactive with users but also creates an interesting immersive circumstance. Actually, unmeasurable image-based 3D maps already exist, such as Google street view, but only provide virtual experiences in terms of photos. The topographic and terrain attributes, such as shapes and heights though are omitted. This paper also discusses the potential for using a low cost land Mobile Mapping System (MMS) to implement realistic image 3D mapping, and evaluates the positioning accuracy that a measureable realistic image-based (MRI) system can produce. The major contribution here is the implementation of measurable images on 3D maps to obtain various measurements from real scenes.
ERIC Educational Resources Information Center
Ruiz-Palomino, Pablo; Martinez-Canas, Ricardo
2013-01-01
In the search to improve the quality of education at the university level, the use of concept mapping is becoming an important instructional technique for enhancing the teaching-learning process. This educational tool is based on cognitive theories by making a distinction between learning by rote (memorizing) and learning by meaning, where…
Forest type mapping with satellite data
NASA Technical Reports Server (NTRS)
Dodge, A. G., Jr.; Bryant, E. S.
1976-01-01
Computer classification of data from Landsat, an earth-orbiting satellite, has resulted in measurements and maps of forest types for two New Hampshire counties. The acreages of hardwood and softwood types and total forested areas compare favorably with Forest Service figures for the same areas. These techniques have advantages for field application, particularly in states having forest taxation laws based on general productivity.
Comparison of simulation modeling and satellite techniques for monitoring ecological processes
NASA Technical Reports Server (NTRS)
Box, Elgene O.
1988-01-01
In 1985 improvements were made in the world climatic data base for modeling and predictive mapping; in individual process models and the overall carbon-balance models; and in the interface software for mapping the simulation results. Statistical analysis of the data base was begun. In 1986 mapping was shifted to NASA-Goddard. The initial approach involving pattern comparisons was modified to a more statistical approach. A major accomplishment was the expansion and improvement of a global data base of measurements of biomass and primary production, to complement the simulation data. The main accomplishments during 1987 included: production of a master tape with all environmental and satellite data and model results for the 1600 sites; development of a complete mapping system used for the initial color maps comparing annual and monthly patterns of Normalized Difference Vegetation Index (NDVI), actual evapotranspiration, net primary productivity, gross primary productivity, and net ecosystem production; collection of more biosphere measurements for eventual improvement of the biological models; and development of some initial monthly models for primary productivity, based on satellite data.
Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather
2018-04-01
Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.
Current state of the art of vision based SLAM
NASA Astrophysics Data System (ADS)
Muhammad, Naveed; Fofi, David; Ainouz, Samia
2009-02-01
The ability of a robot to localise itself and simultaneously build a map of its environment (Simultaneous Localisation and Mapping or SLAM) is a fundamental characteristic required for autonomous operation of the robot. Vision Sensors are very attractive for application in SLAM because of their rich sensory output and cost effectiveness. Different issues are involved in the problem of vision based SLAM and many different approaches exist in order to solve these issues. This paper gives a classification of state-of-the-art vision based SLAM techniques in terms of (i) imaging systems used for performing SLAM which include single cameras, stereo pairs, multiple camera rigs and catadioptric sensors, (ii) features extracted from the environment in order to perform SLAM which include point features and line/edge features, (iii) initialisation of landmarks which can either be delayed or undelayed, (iv) SLAM techniques used which include Extended Kalman Filtering, Particle Filtering, biologically inspired techniques like RatSLAM, and other techniques like Local Bundle Adjustment, and (v) use of wheel odometry information. The paper also presents the implementation and analysis of stereo pair based EKF SLAM for synthetic data. Results prove the technique to work successfully in the presence of considerable amounts of sensor noise. We believe that state of the art presented in the paper can serve as a basis for future research in the area of vision based SLAM. It will permit further research in the area to be carried out in an efficient and application specific way.
Application of AIS Technology to Forest Mapping
NASA Technical Reports Server (NTRS)
Yool, S. R.; Star, J. L.
1985-01-01
Concerns about environmental effects of large scale deforestation have prompted efforts to map forests over large areas using various remote sensing data and image processing techniques. Basic research on the spectral characteristics of forest vegetation are required to form a basis for development of new techniques, and for image interpretation. Examination of LANDSAT data and image processing algorithms over a portion of boreal forest have demonstrated the complexity of relations between the various expressions of forest canopies, environmental variability, and the relative capacities of different image processing algorithms to achieve high classification accuracies under these conditions. Airborne Imaging Spectrometer (AIS) data may in part provide the means to interpret the responses of standard data and techniques to the vegetation based on its relatively high spectral resolution.
Dilbone, Elizabeth; Legleiter, Carl; Alexander, Jason S.; McElroy, Brandon
2018-01-01
Methods for spectrally based mapping of river bathymetry have been developed and tested in clear‐flowing, gravel‐bed channels, with limited application to turbid, sand‐bed rivers. This study used hyperspectral images and field surveys from the dynamic, sandy Niobrara River to evaluate three depth retrieval methods. The first regression‐based approach, optimal band ratio analysis (OBRA), paired in situ depth measurements with image pixel values to estimate depth. The second approach used ground‐based field spectra to calibrate an OBRA relationship. The third technique, image‐to‐depth quantile transformation (IDQT), estimated depth by linking the cumulative distribution function (CDF) of depth to the CDF of an image‐derived variable. OBRA yielded the lowest depth retrieval mean error (0.005 m) and highest observed versus predicted R2 (0.817). Although misalignment between field and image data did not compromise the performance of OBRA in this study, poor georeferencing could limit regression‐based approaches such as OBRA in dynamic, sand‐bedded rivers. Field spectroscopy‐based depth maps exhibited a mean error with a slight shallow bias (0.068 m) but provided reliable estimates for most of the study reach. IDQT had a strong deep bias but provided informative relative depth maps. Overprediction of depth by IDQT highlights the need for an unbiased sampling strategy to define the depth CDF. Although each of the techniques we tested demonstrated potential to provide accurate depth estimates in sand‐bed rivers, each method also was subject to certain constraints and limitations.
Assessing LiDAR elevation data for KDOT applications.
DOT National Transportation Integrated Search
2013-02-01
LiDAR-based elevation surveys are a cost-effective means for mapping topography over large areas. LiDAR : surveys use an airplane-mounted or ground-based laser radar unit to scan terrain. Post-processing techniques are : applied to remove vegetation ...
Behavior Analysis of Novel Wearable Indoor Mapping System Based on 3D-SLAM
Dorado, Iago; Gesto, Manuel; Arias, Pedro; Lorenzo, Henrique
2018-01-01
This paper presents a Wearable Prototype for indoor mapping developed by the University of Vigo. The system is based on a Velodyne LiDAR, acquiring points with 16 rays for a simplistic or low-density 3D representation of reality. With this, a Simultaneous Localization and Mapping (3D-SLAM) method is developed for the mapping and generation of 3D point clouds of scenarios deprived from GNSS signal. The quality of the system presented is validated through the comparison with a commercial indoor mapping system, Zeb-Revo, from the company GeoSLAM and with a terrestrial LiDAR, Faro Focus3D X330. The first is considered as a relative reference with other mobile systems and is chosen due to its use of the same principle for mapping: SLAM techniques based on Robot Operating System (ROS), while the second is taken as ground-truth for the determination of the final accuracy of the system regarding reality. Results show that the accuracy of the system is mainly determined by the accuracy of the sensor, with little increment in the error introduced by the mapping algorithm. PMID:29498715
Mucke, M; Zhaunerchyk, V; Frasinski, L J; ...
2015-07-01
Few-photon ionization and relaxation processes in acetylene (C 2H 2) and ethane (C 2H 6) were investigated at the linac coherent light source x-ray free electron laser (FEL) at SLAC, Stanford using a highly efficient multi-particle correlation spectroscopy technique based on a magnetic bottle. The analysis method of covariance mapping has been applied and enhanced, allowing us to identify electron pairs associated with double core hole (DCH) production and competing multiple ionization processes including Auger decay sequences. The experimental technique and the analysis procedure are discussed in the light of earlier investigations of DCH studies carried out at the samemore » FEL and at third generation synchrotron radiation sources. In particular, we demonstrate the capability of the covariance mapping technique to disentangle the formation of molecular DCH states which is barely feasible with conventional electron spectroscopy methods.« less
Environmental mapping and monitoring of Iceland by remote sensing (EMMIRS)
NASA Astrophysics Data System (ADS)
Pedersen, Gro B. M.; Vilmundardóttir, Olga K.; Falco, Nicola; Sigurmundsson, Friðþór S.; Rustowicz, Rose; Belart, Joaquin M.-C.; Gísladóttir, Gudrun; Benediktsson, Jón A.
2016-04-01
Iceland is exposed to rapid and dynamic landscape changes caused by natural processes and man-made activities, which impact and challenge the country. Fast and reliable mapping and monitoring techniques are needed on a big spatial scale. However, currently there is lack of operational advanced information processing techniques, which are needed for end-users to incorporate remote sensing (RS) data from multiple data sources. Hence, the full potential of the recent RS data explosion is not being fully exploited. The project Environmental Mapping and Monitoring of Iceland by Remote Sensing (EMMIRS) bridges the gap between advanced information processing capabilities and end-user mapping of the Icelandic environment. This is done by a multidisciplinary assessment of two selected remote sensing super sites, Hekla and Öræfajökull, which encompass many of the rapid natural and man-made landscape changes that Iceland is exposed to. An open-access benchmark repository of the two remote sensing supersites is under construction, providing high-resolution LIDAR topography and hyperspectral data for land-cover and landform classification. Furthermore, a multi-temporal and multi-source archive stretching back to 1945 allows a decadal evaluation of landscape and ecological changes for the two remote sensing super sites by the development of automated change detection techniques. The development of innovative pattern recognition and machine learning-based approaches to image classification and change detection is one of the main tasks of the EMMIRS project, aiming to extract and compute earth observation variables as automatically as possible. Ground reference data collected through a field campaign will be used to validate the implemented methods, which outputs are then inferred with geological and vegetation models. Here, preliminary results of an automatic land-cover classification based on hyperspectral image analysis are reported. Furthermore, the EMMIRS project investigates the complex landscape dynamics between geological and ecological processes. This is done through cross-correlation of mapping results and implementation of modelling techniques that simulate geological and ecological processes in order to extrapolate the landscape evolution
Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian
2014-01-01
We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this “Atlas-T1w-DUTE” approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered to the “silver standard”; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT and MRI-based attenuation corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally as well as regionally. PMID:24753982
Dugdale, Stephanie; Ward, Jonathan; Hernen, Jan; Elison, Sarah; Davies, Glyn; Donkor, Daniel
2016-07-22
In recent years, research within the field of health psychology has made significant progress in terms of advancing and standardizing the science of developing, evaluating and reporting complex behavioral change interventions. A major part of this work has involved the development of an evidence-based Behavior Change Technique Taxonomy v1 (BCTTv1), as a means of describing the active components contained within such complex interventions. To date, however, this standardized approach derived from health psychology research has not been applied to the development of complex interventions for the treatment of substance use disorders (SUD). Therefore, this paper uses Breaking Free Online (BFO), a computer-assisted therapy program for SUD, as an example of how the clinical techniques contained within such an intervention might be mapped onto the BCTTv1. The developers of BFO were able to produce a full list of the clinical techniques contained within BFO. Exploratory mapping of the BCTTv1 onto the clinical content of the BFO program was conducted separately by the authors of the paper. This included the developers of the BFO program and psychology professionals working within the SUD field. These coded techniques were reviewed by the authors and any discrepancies in the coding were discussed between all authors until an agreement was reached. The BCTTv1 was mapped onto the clinical content of the BFO program. At least one behavioral change technique was found in 12 out of 16 grouping categories within the BCTTv1. A total of 26 out of 93 behavior change techniques were identified across the clinical content of the program. This exploratory mapping exercise has identified the specific behavior change techniques contained within BFO, and has provided a means of describing these techniques in a standardized way using the BCTTv1 terminology. It has also provided an opportunity for the BCTTv1 mapping process to be reported to the wider SUD treatment community, as it may have real utility in the development and evaluation of other psychosocial and behavioral change interventions within this field.
Lee, Ho; Fahimian, Benjamin P; Xing, Lei
2017-03-21
This paper proposes a binary moving-blocker (BMB)-based technique for scatter correction in cone-beam computed tomography (CBCT). In concept, a beam blocker consisting of lead strips, mounted in front of the x-ray tube, moves rapidly in and out of the beam during a single gantry rotation. The projections are acquired in alternating phases of blocked and unblocked cone beams, where the blocked phase results in a stripe pattern in the width direction. To derive the scatter map from the blocked projections, 1D B-Spline interpolation/extrapolation is applied by using the detected information in the shaded regions. The scatter map of the unblocked projections is corrected by averaging two scatter maps that correspond to their adjacent blocked projections. The scatter-corrected projections are obtained by subtracting the corresponding scatter maps from the projection data and are utilized to generate the CBCT image by a compressed-sensing (CS)-based iterative reconstruction algorithm. Catphan504 and pelvis phantoms were used to evaluate the method's performance. The proposed BMB-based technique provided an effective method to enhance the image quality by suppressing scatter-induced artifacts, such as ring artifacts around the bowtie area. Compared to CBCT without a blocker, the spatial nonuniformity was reduced from 9.1% to 3.1%. The root-mean-square error of the CT numbers in the regions of interest (ROIs) was reduced from 30.2 HU to 3.8 HU. In addition to high resolution, comparable to that of the benchmark image, the CS-based reconstruction also led to a better contrast-to-noise ratio in seven ROIs. The proposed technique enables complete scatter-corrected CBCT imaging with width-truncated projections and allows reducing the acquisition time to approximately half. This work may have significant implications for image-guided or adaptive radiation therapy, where CBCT is often used.
NASA Astrophysics Data System (ADS)
Lee, Ho; Fahimian, Benjamin P.; Xing, Lei
2017-03-01
This paper proposes a binary moving-blocker (BMB)-based technique for scatter correction in cone-beam computed tomography (CBCT). In concept, a beam blocker consisting of lead strips, mounted in front of the x-ray tube, moves rapidly in and out of the beam during a single gantry rotation. The projections are acquired in alternating phases of blocked and unblocked cone beams, where the blocked phase results in a stripe pattern in the width direction. To derive the scatter map from the blocked projections, 1D B-Spline interpolation/extrapolation is applied by using the detected information in the shaded regions. The scatter map of the unblocked projections is corrected by averaging two scatter maps that correspond to their adjacent blocked projections. The scatter-corrected projections are obtained by subtracting the corresponding scatter maps from the projection data and are utilized to generate the CBCT image by a compressed-sensing (CS)-based iterative reconstruction algorithm. Catphan504 and pelvis phantoms were used to evaluate the method’s performance. The proposed BMB-based technique provided an effective method to enhance the image quality by suppressing scatter-induced artifacts, such as ring artifacts around the bowtie area. Compared to CBCT without a blocker, the spatial nonuniformity was reduced from 9.1% to 3.1%. The root-mean-square error of the CT numbers in the regions of interest (ROIs) was reduced from 30.2 HU to 3.8 HU. In addition to high resolution, comparable to that of the benchmark image, the CS-based reconstruction also led to a better contrast-to-noise ratio in seven ROIs. The proposed technique enables complete scatter-corrected CBCT imaging with width-truncated projections and allows reducing the acquisition time to approximately half. This work may have significant implications for image-guided or adaptive radiation therapy, where CBCT is often used.
NASA Astrophysics Data System (ADS)
Harman, Philip V.; Flack, Julien; Fox, Simon; Dowley, Mark
2002-05-01
The conversion of existing 2D images to 3D is proving commercially viable and fulfills the growing need for high quality stereoscopic images. This approach is particularly effective when creating content for the new generation of autostereoscopic displays that require multiple stereo images. The dominant technique for such content conversion is to develop a depth map for each frame of 2D material. The use of a depth map as part of the 2D to 3D conversion process has a number of desirable characteristics: 1. The resolution of the depth may be lower than that of the associated 2D image. 2. It can be highly compressed. 3. 2D compatibility is maintained. 4. Real time generation of stereo, or multiple stereo pairs, is possible. The main disadvantage has been the laborious nature of the manual conversion techniques used to create depth maps from existing 2D images, which results in a slow and costly process. An alternative, highly productive technique has been developed based upon the use of Machine Leaning Algorithm (MLAs). This paper describes the application of MLAs to the generation of depth maps and presents the results of the commercial application of this approach.
Contour-Driven Atlas-Based Segmentation
Wachinger, Christian; Fritscher, Karl; Sharp, Greg; Golland, Polina
2016-01-01
We propose new methods for automatic segmentation of images based on an atlas of manually labeled scans and contours in the image. First, we introduce a Bayesian framework for creating initial label maps from manually annotated training images. Within this framework, we model various registration- and patch-based segmentation techniques by changing the deformation field prior. Second, we perform contour-driven regression on the created label maps to refine the segmentation. Image contours and image parcellations give rise to non-stationary kernel functions that model the relationship between image locations. Setting the kernel to the covariance function in a Gaussian process establishes a distribution over label maps supported by image structures. Maximum a posteriori estimation of the distribution over label maps conditioned on the outcome of the atlas-based segmentation yields the refined segmentation. We evaluate the segmentation in two clinical applications: the segmentation of parotid glands in head and neck CT scans and the segmentation of the left atrium in cardiac MR angiography images. PMID:26068202
Yatsushiro, Satoshi; Hirayama, Akihiro; Matsumae, Mitsunori; Kajiwara, Nao; Abdullah, Afnizanfaizal; Kuroda, Kagayaki
2014-01-01
Correlation time mapping based on magnetic resonance (MR) velocimetry has been applied to pulsatile cerebrospinal fluid (CSF) motion to visualize the pressure transmission between CSF at different locations and/or between CSF and arterial blood flow. Healthy volunteer experiments demonstrated that the technique exhibited transmitting pulsatile CSF motion from CSF space in the vicinity of blood vessels with short delay and relatively high correlation coefficients. Patient and healthy volunteer experiments indicated that the properties of CSF motion were different from the healthy volunteers. Resultant images in healthy volunteers implied that there were slight individual difference in the CSF driving source locations. Clinical interpretation for these preliminary results is required to apply the present technique for classifying status of hydrocephalus.
In vivo quantification of amyloid burden in TTR-related cardiac amyloidosis
Kollikowski, Alexander Marco; Kahles, Florian; Kintsler, Svetlana; Hamada, Sandra; Reith, Sebastian; Knüchel, Ruth; Röcken, Christoph; Mottaghy, Felix Manuel; Marx, Nikolaus; Burgmaier, Mathias
2017-01-01
Summary Cardiac transthyretin-related (ATTR) amyloidosis is a severe cardiomyopathy for which therapeutic approaches are currently under development. Because non-invasive imaging techniques such as cardiac magnetic resonance imaging and echocardiography are non-specific, the diagnosis of ATTR amyloidosis is still based on myocardial biopsy. Thus, diagnosis of ATTR amyloidosis is difficult in patients refusing myocardial biopsy. Furthermore, myocardial biopsy does not allow 3D-mapping and quantification of myocardial ATTR amyloid. In this report we describe a 99mTc-DPD-based molecular imaging technique for non-invasive single-step diagnosis, three-dimensional mapping and semiquantification of cardiac ATTR amyloidosis in a patient with suspected amyloid heart disease who initially rejected myocardial biopsy. This report underlines the clinical value of SPECT-based nuclear medicine imaging to enable non-invasive diagnosis of cardiac ATTR amyloidosis, particularly in patients rejecting biopsy. PMID:29259858
Feedback mechanism for smart nozzles and nebulizers
Montaser, Akbar [Potomac, MD; Jorabchi, Kaveh [Arlington, VA; Kahen, Kaveh [Kleinburg, CA
2009-01-27
Nozzles and nebulizers able to produce aerosol with optimum and reproducible quality based on feedback information obtained using laser imaging techniques. Two laser-based imaging techniques based on particle image velocimetry (PTV) and optical patternation map and contrast size and velocity distributions for indirect and direct pneumatic nebulizations in plasma spectrometry. Two pulses from thin laser sheet with known time difference illuminate droplets flow field. Charge coupled device (CCL)) captures scattering of laser light from droplets, providing two instantaneous particle images. Pointwise cross-correlation of corresponding images yields two-dimensional velocity map of aerosol velocity field. For droplet size distribution studies, solution is doped with fluorescent dye and both laser induced florescence (LIF) and Mie scattering images are captured simultaneously by two CCDs with the same field of view. Ratio of LIF/Mie images provides relative droplet size information, then scaled by point calibration method via phase Doppler particle analyzer.
Whole brain myelin mapping using T1- and T2-weighted MR imaging data
Ganzetti, Marco; Wenderoth, Nicole; Mantini, Dante
2014-01-01
Despite recent advancements in MR imaging, non-invasive mapping of myelin in the brain still remains an open issue. Here we attempted to provide a potential solution. Specifically, we developed a processing workflow based on T1-w and T2-w MR data to generate an optimized myelin enhanced contrast image. The workflow allows whole brain mapping using the T1-w/T2-w technique, which was originally introduced as a non-invasive method for assessing cortical myelin content. The hallmark of our approach is a retrospective calibration algorithm, applied to bias-corrected T1-w and T2-w images, that relies on image intensities outside the brain. This permits standardizing the intensity histogram of the ratio image, thereby allowing for across-subject statistical analyses. Quantitative comparisons of image histograms within and across different datasets confirmed the effectiveness of our normalization procedure. Not only did the calibrated T1-w/T2-w images exhibit a comparable intensity range, but also the shape of the intensity histograms was largely corresponding. We also assessed the reliability and specificity of the ratio image compared to other MR-based techniques, such as magnetization transfer ratio (MTR), fractional anisotropy (FA), and fluid-attenuated inversion recovery (FLAIR). With respect to these other techniques, T1-w/T2-w had consistently high values, as well as low inter-subject variability, in brain structures where myelin is most abundant. Overall, our results suggested that the T1-w/T2-w technique may be a valid tool supporting the non-invasive mapping of myelin in the brain. Therefore, it might find important applications in the study of brain development, aging and disease. PMID:25228871
Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results
NASA Astrophysics Data System (ADS)
Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.
2014-03-01
Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
Westbrook, Johanna I; Coiera, Enrico W; Braithwaite, Jeffrey
2005-01-01
Online evidence retrieval systems are one potential tool in supporting evidence-based practice. We have undertaken a program of research to investigate how hospital-based clinicians (doctors, nurses and allied health professionals) use these systems, factors influencing use and their impact on decision-making and health care delivery. A central component of this work has been the development and testing of a broad range of evaluation techniques. This paper provides an overview of the results obtained from three stages of this evaluation and details the results derived from the final stage which sought to test two methods for assessing the integration of an online evidence system and its impact on decision making and patient care. The critical incident and journey mapping techniques were applied. Semi-structured interviews were conducted with 29 clinicians who were experienced users of the online evidence system. Clinicians were asked to described recent instances in which the information obtained using the online evidence system was especially helpful with their work. A grounded approach to data analysis was taken producing three categories of impact. The journey mapping technique was adapted as a method to describe and quantify clinicians' integration of CIAP into their practice and the impact of this on patient care. The analogy of a journey is used to capture the many stages in this integration process, from introduction to the system to full integration into everyday clinical practice with measurable outcomes. Transcribed interview accounts of system use were mapped against the journey stages and scored. Clinicians generated 85 critical incidents and one quarter of these provided specific examples of system use leading to improvements in patient care. The journey mapping technique proved to be a useful method for providing a quantification of the ways and extent to which clincians had integrated system use into practice, and insights into how information systems can influence organisational culture. Further work is required on this technique to assess its value as an evaluation method. The study demonstrates the strength of a triangulated evidence approach to assessing the use and impact of online clinical evidence systems.
HiC-spector: a matrix library for spectral and reproducibility analysis of Hi-C contact maps.
Yan, Koon-Kiu; Yardimci, Galip Gürkan; Yan, Chengfei; Noble, William S; Gerstein, Mark
2017-07-15
Genome-wide proximity ligation based assays like Hi-C have opened a window to the 3D organization of the genome. In so doing, they present data structures that are different from conventional 1D signal tracks. To exploit the 2D nature of Hi-C contact maps, matrix techniques like spectral analysis are particularly useful. Here, we present HiC-spector, a collection of matrix-related functions for analyzing Hi-C contact maps. In particular, we introduce a novel reproducibility metric for quantifying the similarity between contact maps based on spectral decomposition. The metric successfully separates contact maps mapped from Hi-C data coming from biological replicates, pseudo-replicates and different cell types. Source code in Julia and Python, and detailed documentation is available at https://github.com/gersteinlab/HiC-spector . koonkiu.yan@gmail.com or mark@gersteinlab.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Photogrammetric portrayal of Mars topography.
Wu, S.S.C.
1979-01-01
Special photogrammetric techniques have been developed to portray Mars topography, using Mariner and Viking imaging and nonimaging topographic information and earth-based radar data. Topography is represented by the compilation of maps at three scales: global, intermediate, and very large scale. The global map is a synthesis of topographic information obtained from Mariner 9 and earth-based radar, compiled at a scale of 1:25,000,000 with a contour interval of 1 km; it gives a broad quantitative view of the planet. At intermediate scales, Viking Orbiter photographs of various resolutions are used to compile detailed contour maps of a broad spectrum of prominent geologic features; a contour interval as small as 20 m has been obtained from very high resolution orbital photography. Imagery from the Viking lander facsimile cameras permits construction of detailed, very large scale (1:10) topographic maps of the terrain surrounding the two landers; these maps have a contour interval of 1 cm. This paper presents several new detailed topographic maps of Mars.-Author
Photogrammetric portrayal of Mars topography
NASA Technical Reports Server (NTRS)
Wu, S. S. C.
1979-01-01
Special photogrammetric techniques have been developed to portray Mars topography, using Mariner and Viking imaging and nonimaging topographic information and earth-based radar data. Topography is represented by the compilation of maps at three scales: global, intermediate, and very large scale. The global map is a synthesis of topographic information obtained from Mariner 9 and earth-based radar, compiled at a scale of 1:25,000,000 with a contour interval of 1 km; it gives a broad quantitative view of the planet. At intermediate scales, Viking Orbiter photographs of various resolutions are used to compile detailed contour maps of a broad spectrum of prominent geologic features; a contour interval as small as 20 m has been obtained from very high resolution orbital photography. Imagery from the Viking lander facsimile cameras permits construction of detailed, very large scale (1:10) topographic maps of the terrain surrounding the two landers; these maps have a contour interval of 1 cm. This paper presents several new detailed topographic maps of Mars.
Novel techniques of real-time blood flow and functional mapping: technical note.
Kamada, Kyousuke; Ogawa, Hiroshi; Saito, Masato; Tamura, Yukie; Anei, Ryogo; Kapeller, Christoph; Hayashi, Hideaki; Prueckl, Robert; Guger, Christoph
2014-01-01
There are two main approaches to intraoperative monitoring in neurosurgery. One approach is related to fluorescent phenomena and the other is related to oscillatory neuronal activity. We developed novel techniques to visualize blood flow (BF) conditions in real time, based on indocyanine green videography (ICG-VG) and the electrophysiological phenomenon of high gamma activity (HGA). We investigated the use of ICG-VG in four patients with moyamoya disease and two with arteriovenous malformation (AVM), and we investigated the use of real-time HGA mapping in four patients with brain tumors who underwent lesion resection with awake craniotomy. Real-time data processing of ICG-VG was based on perfusion imaging, which generated parameters including arrival time (AT), mean transit time (MTT), and BF of brain surface vessels. During awake craniotomy, we analyzed the frequency components of brain oscillation and performed real-time HGA mapping to identify functional areas. Processed results were projected on a wireless monitor linked to the operating microscope. After revascularization for moyamoya disease, AT and BF were significantly shortened and increased, respectively, suggesting hyperperfusion. Real-time fusion images on the wireless monitor provided anatomical, BF, and functional information simultaneously, and allowed the resection of AVMs under the microscope. Real-time HGA mapping during awake craniotomy rapidly indicated the eloquent areas of motor and language function and significantly shortened the operation time. These novel techniques, which we introduced might improve the reliability of intraoperative monitoring and enable the development of rational and objective surgical strategies.
Novel Techniques of Real-time Blood Flow and Functional Mapping: Technical Note
KAMADA, Kyousuke; OGAWA, Hiroshi; SAITO, Masato; TAMURA, Yukie; ANEI, Ryogo; KAPELLER, Christoph; HAYASHI, Hideaki; PRUECKL, Robert; GUGER, Christoph
2014-01-01
There are two main approaches to intraoperative monitoring in neurosurgery. One approach is related to fluorescent phenomena and the other is related to oscillatory neuronal activity. We developed novel techniques to visualize blood flow (BF) conditions in real time, based on indocyanine green videography (ICG-VG) and the electrophysiological phenomenon of high gamma activity (HGA). We investigated the use of ICG-VG in four patients with moyamoya disease and two with arteriovenous malformation (AVM), and we investigated the use of real-time HGA mapping in four patients with brain tumors who underwent lesion resection with awake craniotomy. Real-time data processing of ICG-VG was based on perfusion imaging, which generated parameters including arrival time (AT), mean transit time (MTT), and BF of brain surface vessels. During awake craniotomy, we analyzed the frequency components of brain oscillation and performed real-time HGA mapping to identify functional areas. Processed results were projected on a wireless monitor linked to the operating microscope. After revascularization for moyamoya disease, AT and BF were significantly shortened and increased, respectively, suggesting hyperperfusion. Real-time fusion images on the wireless monitor provided anatomical, BF, and functional information simultaneously, and allowed the resection of AVMs under the microscope. Real-time HGA mapping during awake craniotomy rapidly indicated the eloquent areas of motor and language function and significantly shortened the operation time. These novel techniques, which we introduced might improve the reliability of intraoperative monitoring and enable the development of rational and objective surgical strategies. PMID:25263624
Quantifying torso deformity in scoliosis
NASA Astrophysics Data System (ADS)
Ajemba, Peter O.; Kumar, Anish; Durdle, Nelson G.; Raso, V. James
2006-03-01
Scoliosis affects the alignment of the spine and the shape of the torso. Most scoliosis patients and their families are more concerned about the effect of scoliosis on the torso than its effect on the spine. There is a need to develop robust techniques for quantifying torso deformity based on full torso scans. In this paper, deformation indices obtained from orthogonal maps of full torso scans are used to quantify torso deformity in scoliosis. 'Orthogonal maps' are obtained by applying orthogonal transforms to 3D surface maps. (An 'orthogonal transform' maps a cylindrical coordinate system to a Cartesian coordinate system.) The technique was tested on 361 deformed computer models of the human torso and on 22 scans of volunteers (8 normal and 14 scoliosis). Deformation indices from the orthogonal maps correctly classified up to 95% of the volunteers with a specificity of 1.00 and a sensitivity of 0.91. In addition to classifying scoliosis, the system gives a visual representation of the entire torso in one view and is viable for use in a clinical environment for managing scoliosis.
Non-supervised method for early forest fire detection and rapid mapping
NASA Astrophysics Data System (ADS)
Artés, Tomás; Boca, Roberto; Liberta, Giorgio; San-Miguel, Jesús
2017-09-01
Natural hazards are a challenge for the society. Scientific community efforts have been severely increased assessing tasks about prevention and damage mitigation. The most important points to minimize natural hazard damages are monitoring and prevention. This work focuses particularly on forest fires. This phenomenon depends on small-scale factors and fire behavior is strongly related to the local weather. Forest fire spread forecast is a complex task because of the scale of the phenomena, the input data uncertainty and time constraints in forest fire monitoring. Forest fire simulators have been improved, including some calibration techniques avoiding data uncertainty and taking into account complex factors as the atmosphere. Such techniques increase dramatically the computational cost in a context where the available time to provide a forecast is a hard constraint. Furthermore, an early mapping of the fire becomes crucial to assess it. In this work, a non-supervised method for forest fire early detection and mapping is proposed. As main sources, the method uses daily thermal anomalies from MODIS and VIIRS combined with land cover map to identify and monitor forest fires with very few resources. This method relies on a clustering technique (DBSCAN algorithm) and on filtering thermal anomalies to detect the forest fires. In addition, a concave hull (alpha shape algorithm) is applied to obtain rapid mapping of the fire area (very coarse accuracy mapping). Therefore, the method leads to a potential use for high-resolution forest fire rapid mapping based on satellite imagery using the extent of each early fire detection. It shows the way to an automatic rapid mapping of the fire at high resolution processing as few data as possible.
Linear time relational prototype based learning.
Gisbrecht, Andrej; Mokbel, Bassam; Schleif, Frank-Michael; Zhu, Xibin; Hammer, Barbara
2012-10-01
Prototype based learning offers an intuitive interface to inspect large quantities of electronic data in supervised or unsupervised settings. Recently, many techniques have been extended to data described by general dissimilarities rather than Euclidean vectors, so-called relational data settings. Unlike the Euclidean counterparts, the techniques have quadratic time complexity due to the underlying quadratic dissimilarity matrix. Thus, they are infeasible already for medium sized data sets. The contribution of this article is twofold: On the one hand we propose a novel supervised prototype based classification technique for dissimilarity data based on popular learning vector quantization (LVQ), on the other hand we transfer a linear time approximation technique, the Nyström approximation, to this algorithm and an unsupervised counterpart, the relational generative topographic mapping (GTM). This way, linear time and space methods result. We evaluate the techniques on three examples from the biomedical domain.
Assessing LiDAR elevation data for KDOT applications : [technical summary].
DOT National Transportation Integrated Search
2013-02-01
LiDAR-based elevation surveys : are a cost-effective means for : mapping topography over large : areas. LiDAR surveys use an : airplane-mounted or ground-based : laser radar unit to scan terrain. : Post-processing techniques are : applied to remove v...
NASA Astrophysics Data System (ADS)
Tulsyan, Gaurav
Doping profiles are engineered to manipulate device properties and to determine electrical performances of microelectronic devices frequently. To support engineering studies afterward, essential information is usually required from physically characterized doping profiles. Secondary Ion Mass Spectrometry (SIMS), Spreading Resistance Profiling (SRP) and Electrochemical Capacitance Voltage (ECV) profiling are standard techniques for now to map profile. SIMS yields a chemical doping profile via ion sputtering process and owns a better resolution, whereas ECV and SRP produce an electrical doping profile detecting free carriers in microelectronic devices. The major difference between electrical and chemical doping profiles is at heavily doped regions greater than 1020 atoms/cm3. At the profile region over the solubility limit, inactive dopants induce a flat plateau and detected by electrical measurements only. Destructive techniques are usually designed as stand-alone systems to study impurities. For an in-situ process control purpose, non-contact methods, such as ellipsometry and non-contact capacitance voltage (CV) techniques are current under development. In this theses work, terahertz time domain spectroscopy (THz-TDS) is utilized to achieve electrical doping profile in both destructive and non-contact manners. In recent years the Terahertz group at Rochester Institute Technology developed several techniques that use terahertz pulses to non-destructively map doping profiles. In this thesis, we study a destructive but potentially higher resolution version of the terahertz based approach to map the profile of activated dopants and augment the non-destructive approaches already developed. The basic idea of the profile mapping approach developed in this MS thesis is to anodize, and thus oxidize to silicon dioxide, thin layers (down to below 10 nm) of the wafer with the doping profile to be mapped. Since the dopants atoms and any free carriers in the silicon oxide thin film are invisible to the terahertz probe this anodization step very effectively removes a 'thin slice' from the doping profile to be mapped. By iterating between anodization and terahertz measurements that detect only the 'remaining' non-oxidized portion of the doping profile one can re-construct the doping profile with significantly higher precision compared to what is possible by only a single non-destructive measurement of the un-anodized profile as used in the non-destructive version of our technique. In this MS thesis we explore all aspects of this anodization based variation of doping profile mapping using free space terahertz pulses. This includes a study of silicon dioxide thin film growth using a room temperature electrochemical oxidation process. Etching procedures providing the option to remove between successive anodization and terahertz measurement steps. THz-TDS measurements of successively anodized profiles will be compared with sheet resistance and SIMS measurements to benchmark and improve the new technique.
Application of the 1:2,000,000-scale data base: A National Atlas sectional prototype
Dixon, Donna M.
1985-01-01
A study of the potential to produce a National Atlas sectional prototype from the 1:2,000,000-scale data base was concluded recently by the National Mapping Division, U. S. Geological Survey. This paper discusses the specific digital cartographic production procedures involved in the preparation of the prototype map, as well as the theoretical and practical cartographic framework for the study. Such items as data organization, data classification, digital techniques, data conversions, and modification of traditional design specifications for an automated environment are discussed. The bulk of the cartographic work for the production of the prototype was carried out in raster format on the Scitex Response-250 mapping system.
NASA Astrophysics Data System (ADS)
Costanzo, Antonio; Montuori, Antonio; Silva, Juan Pablo; Silvestri, Malvina; Musacchio, Massimo; Buongiorno, Maria Fabrizia; Stramondo, Salvatore
2016-08-01
In this work, a web-GIS procedure to map the risk of road blockage in urban environments through the combined use of space-borne and airborne remote sensing sensors is presented. The methodology concerns (1) the provision of a geo-database through the integration of space-borne multispectral images and airborne LiDAR data products; (2) the modeling of building vulnerability, based on the corresponding 3D geometry and construction time information; (3) the GIS-based mapping of road closure due to seismic- related building collapses based on the building characteristic height and the width of the road. Experimental results, gathered for the Cosenza urban area, allow demonstrating the benefits of both the proposed approach and the GIS-based integration of multi-platforms remote sensing sensors and techniques for seismic road assessment purposes.
NASA Astrophysics Data System (ADS)
Balzarini, R.; Dalmasso, A.; Murat, M.
2015-08-01
This article presents preliminary results from a research project in progress that brings together geographers, cognitive scientists, historians and computer scientists. The project investigates the evolution of a particular territorial model: ski trails maps. Ski resorts, tourist and sporting innovations for mountain economies since the 1930s, have needed cartographic representations corresponding to new practices of the space.Painter artists have been involved in producing ski maps with painting techniques and panoramic views, which are by far the most common type of map, because they allow the resorts to look impressive to potential visitors. These techniques have evolved throughout the mutations of the ski resorts. Paper ski maps no longer meet the needs of a large part of the customers; the question now arises of their adaptation to digital media. In a computerized process perspective, the early stage of the project aims to identify the artist-representations, based on conceptual and technical rules, which are handled by users-skiers to perform a task (location, wayfinding, decision-making) and can be transferred to a computer system. This article presents the experimental phase that analyzes artist and user mental representations that are at stake during the making and the reading of a paper ski map. It particularly focuses on how the invention of the artist influences map reading.
NASA Astrophysics Data System (ADS)
Rebich, S.
2003-12-01
The concept mapping technique has been proposed as a method for examining the evolving nature of students' conceptualizations of scientific concepts, and promises insight into a dimension of learning different from the one accessible through more conventional classroom testing techniques. The theory behind concept mapping is based on an assumption that knowledge acquisition is accomplished through "linking" of new information to an existing knowledge framework, and that meaningful (as opposed to arbitrary or verbatim) links allow for deeper understanding and conceptual change. Reflecting this theory, concept maps are constructed as a network of related concepts connected by labeled links that illustrate the relationship between the concepts. Two concepts connected by one such link make up a "proposition", the basic element of the concept map structure. In this paper, we examine the results of a pre- and post-test assessment program for an upper-division undergraduate geography course entitled "Mock Environmental Summit," which was part of a research project on assessment. Concept mapping was identified as a potentially powerful assessment tool for this course, as more conventional tools such as multiple-choice tests did not seem to provide a reliable indication of the learning students were experiencing as a result of the student-directed research, presentations, and discussions that make up a substantial portion of the course. The assessment program began at the beginning of the course with a one-hour training session during which students were introduced to the theory behind concept mapping, provided with instructions and guidance for constructing a concept map using the CMap software developed and maintained by the Institute for Human and Machine Cognition at the University of West Florida, and asked to collaboratively construct a concept map on a topic not related to the one to be assessed. This training session was followed by a 45-minute "pre-test" on the topic of global climate change, for which students were provided with a list of questions to guide their thoughts during the concept map construction. Following the pre-test, students were not exposed to further concept mapping until the end of the course, when they were asked to complete a "post-test" consisting of exactly the same task. In addition to a summary of our results, this paper presents an overview of available digital concept-mapping tools, proposed scoring techniques, and design principles to keep in mind when designing a concept-mapping assessment program. We also discuss our experience with concept map assessment, the insights it provided into the evolution in student understanding of global climate change that resulted from the course, and our ideas about the potential role of concept mapping in an overall assessment program for interdisciplinary and/or student-directed curricula.
A contrast enhancement method for improving the segmentation of breast lesions on ultrasonography.
Flores, Wilfrido Gómez; Pereira, Wagner Coelho de Albuquerque
2017-01-01
This paper presents an adaptive contrast enhancement method based on sigmoidal mapping function (SACE) used for improving the computerized segmentation of breast lesions on ultrasound. First, from the original ultrasound image an intensity variation map is obtained, which is used to generate local sigmoidal mapping functions related to distinct contextual regions. Then, a bilinear interpolation scheme is used to transform every original pixel to a new gray level value. Also, four contrast enhancement techniques widely used in breast ultrasound enhancement are implemented: histogram equalization (HEQ), contrast limited adaptive histogram equalization (CLAHE), fuzzy enhancement (FEN), and sigmoid based enhancement (SEN). In addition, these contrast enhancement techniques are considered in a computerized lesion segmentation scheme based on watershed transformation. The performance comparison among techniques is assessed in terms of both the quality of contrast enhancement and the segmentation accuracy. The former is quantified by the measure, where the greater the value, the better the contrast enhancement, whereas the latter is calculated by the Jaccard index, which should tend towards unity to indicate adequate segmentation. The experiments consider a data set with 500 breast ultrasound images. The results show that SACE outperforms its counterparts, where the median values for the measure are: SACE: 139.4, SEN: 68.2, HEQ: 64.1, CLAHE: 62.8, and FEN: 7.9. Considering the segmentation performance results, the SACE method presents the largest accuracy, where the median values for the Jaccard index are: SACE: 0.81, FEN: 0.80, CLAHE: 0.79, HEQ: 77, and SEN: 0.63. The SACE method performs well due to the combination of three elements: (1) the intensity variation map reduces intensity variations that could distort the real response of the mapping function, (2) the sigmoidal mapping function enhances the gray level range where the transition between lesion and background is found, and (3) the adaptive enhancing scheme for coping with local contrasts. Hence, the SACE approach is appropriate for enhancing contrast before computerized lesion segmentation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Vector Doppler: spatial sampling analysis and presentation techniques for real-time systems
NASA Astrophysics Data System (ADS)
Capineri, Lorenzo; Scabia, Marco; Masotti, Leonardo F.
2001-05-01
The aim of the vector Doppler (VD) technique is the quantitative reconstruction of a velocity field independently of the ultrasonic probe axis to flow angle. In particular vector Doppler is interesting for studying vascular pathologies related to complex blood flow conditions. Clinical applications require a real-time operating mode and the capability to perform Doppler measurements over a defined volume. The combination of these two characteristics produces a real-time vector velocity map. In previous works the authors investigated the theory of pulsed wave (PW) vector Doppler and developed an experimental system capable of producing off-line 3D vector velocity maps. Afterwards, for producing dynamic velocity vector maps, we realized a new 2D vector Doppler system based on a modified commercial echograph. The measurement and presentation of a vector velocity field requires a correct spatial sampling that must satisfy the Shannon criterion. In this work we tackled this problem, establishing a relationship between sampling steps and scanning system characteristics. Another problem posed by the vector Doppler technique is the data representation in real-time that should be easy to interpret for the physician. With this in mine we attempted a multimedia solution that uses both interpolated images and sound to represent the information of the measured vector velocity map. These presentation techniques were experimented for real-time scanning on flow phantoms and preliminary measurements in vivo on a human carotid artery.
Generalized image contrast enhancement technique based on Heinemann contrast discrimination model
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1994-03-01
This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.
NASA Astrophysics Data System (ADS)
Fischer, J.; Doolan, C.
2017-12-01
A method to improve the quality of acoustic beamforming in reverberant environments is proposed in this paper. The processing is based on a filtering of the cross-correlation matrix of the microphone signals obtained using a microphone array. The main advantage of the proposed method is that it does not require information about the geometry of the reverberant environment and thus it can be applied to any configuration. The method is applied to the particular example of aeroacoustic testing in a hard-walled low-speed wind tunnel; however, the technique can be used in any reverberant environment. Two test cases demonstrate the technique. The first uses a speaker placed in the hard-walled working section with no wind tunnel flow. In the second test case, an airfoil is placed in a flow and acoustic beamforming maps are obtained. The acoustic maps have been improved, as the reflections observed in the conventional maps have been removed after application of the proposed method.
Soddu, Andrea; Gómez, Francisco; Heine, Lizette; Di Perri, Carol; Bahri, Mohamed Ali; Voss, Henning U; Bruno, Marie-Aurélie; Vanhaudenhuyse, Audrey; Phillips, Christophe; Demertzi, Athena; Chatelle, Camille; Schrouff, Jessica; Thibaut, Aurore; Charland-Verville, Vanessa; Noirhomme, Quentin; Salmon, Eric; Tshibanda, Jean-Flory Luaba; Schiff, Nicholas D; Laureys, Steven
2016-01-01
The mildly invasive 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) is a well-established imaging technique to measure 'resting state' cerebral metabolism. This technique made it possible to assess changes in metabolic activity in clinical applications, such as the study of severe brain injury and disorders of consciousness. We assessed the possibility of creating functional MRI activity maps, which could estimate the relative levels of activity in FDG-PET cerebral metabolic maps. If no metabolic absolute measures can be extracted, our approach may still be of clinical use in centers without access to FDG-PET. It also overcomes the problem of recognizing individual networks of independent component selection in functional magnetic resonance imaging (fMRI) resting state analysis. We extracted resting state fMRI functional connectivity maps using independent component analysis and combined only components of neuronal origin. To assess neuronality of components a classification based on support vector machine (SVM) was used. We compared the generated maps with the FDG-PET maps in 16 healthy controls, 11 vegetative state/unresponsive wakefulness syndrome patients and four locked-in patients. The results show a significant similarity with ρ = 0.75 ± 0.05 for healthy controls and ρ = 0.58 ± 0.09 for vegetative state/unresponsive wakefulness syndrome patients between the FDG-PET and the fMRI based maps. FDG-PET, fMRI neuronal maps, and the conjunction analysis show decreases in frontoparietal and medial regions in vegetative patients with respect to controls. Subsequent analysis in locked-in syndrome patients produced also consistent maps with healthy controls. The constructed resting state fMRI functional connectivity map points toward the possibility for fMRI resting state to estimate relative levels of activity in a metabolic map.
BROOKER, S.; KABATEREINE, N. B.; GYAPONG, J. O.; STOTHARD, J. R.; UTZINGER, J.
2009-01-01
SUMMARY There is growing interest and commitment to the control of schistosomiasis and other so-called neglected tropical diseases (NTDs). Resources for control are inevitably limited, necessitating assessment methods that can rapidly and accurately identify and map high-risk communities so that interventions can be targeted in a spatially-explicit and cost-effective manner. Here, we review progress made with (i) mapping schistosomiasis across Africa using available epidemiological data and more recently, climate-based risk prediction; (ii) the development and use of morbidity questionnaires for rapid identification of high-risk communities of urinary schistosomiasis; and (iii) innovative sampling-based approaches for intestinal schistosomiasis, using the lot quality assurance sampling technique. Experiences are also presented for the rapid mapping of other NTDs, including onchocerciasis, loiasis and lymphatic filariasis. Future directions for an integrated rapid mapping approach targeting multiple NTDs simultaneously are outlined, including potential challenges in developing an integrated survey tool. The lessons from the mapping of human helminth infections may also be relevant for the rapid mapping of malaria as its control efforts are intensified. PMID:19450373
In vivo correlation mapping microscopy
NASA Astrophysics Data System (ADS)
McGrath, James; Alexandrov, Sergey; Owens, Peter; Subhash, Hrebesh; Leahy, Martin
2016-04-01
To facilitate regular assessment of the microcirculation in vivo, noninvasive imaging techniques such as nailfold capillaroscopy are required in clinics. Recently, a correlation mapping technique has been applied to optical coherence tomography (OCT), which extends the capabilities of OCT to microcirculation morphology imaging. This technique, known as correlation mapping optical coherence tomography, has been shown to extract parameters, such as capillary density and vessel diameter, and key clinical markers associated with early changes in microvascular diseases. However, OCT has limited spatial resolution in both the transverse and depth directions. Here, we extend this correlation mapping technique to other microscopy modalities, including confocal microscopy, and take advantage of the higher spatial resolution offered by these modalities. The technique is achieved as a processing step on microscopy images and does not require any modification to the microscope hardware. Results are presented which show that this correlation mapping microscopy technique can extend the capabilities of conventional microscopy to enable mapping of vascular networks in vivo with high spatial resolution in both the transverse and depth directions.
Efficient, adaptive estimation of two-dimensional firing rate surfaces via Gaussian process methods.
Rad, Kamiar Rahnama; Paninski, Liam
2010-01-01
Estimating two-dimensional firing rate maps is a common problem, arising in a number of contexts: the estimation of place fields in hippocampus, the analysis of temporally nonstationary tuning curves in sensory and motor areas, the estimation of firing rates following spike-triggered covariance analyses, etc. Here we introduce methods based on Gaussian process nonparametric Bayesian techniques for estimating these two-dimensional rate maps. These techniques offer a number of advantages: the estimates may be computed efficiently, come equipped with natural errorbars, adapt their smoothness automatically to the local density and informativeness of the observed data, and permit direct fitting of the model hyperparameters (e.g., the prior smoothness of the rate map) via maximum marginal likelihood. We illustrate the method's flexibility and performance on a variety of simulated and real data.
Application of Satellite SAR Imagery in Mapping the Active Layer of Arctic Permafrost
NASA Technical Reports Server (NTRS)
Li, Shu-Sun; Romanovsky, V.; Lovick, Joe; Wang, Z.; Peterson, Rorik
2003-01-01
A method of mapping the active layer of Arctic permafrost using a combination of conventional synthetic aperture radar (SAR) backscatter and more sophisticated interferometric SAR (INSAR) techniques is proposed. The proposed research is based on the sensitivity of radar backscatter to the freeze and thaw status of the surface soil, and the sensitivity of INSAR techniques to centimeter- to sub-centimeter-level surface differential deformation. The former capability of SAR is investigated for deriving the timing and duration of the thaw period for surface soil of the active layer over permafrost. The latter is investigated for the feasibility of quantitative measurement of frost heaving and thaw settlement of the active layer during the freezing and thawing processes. The resulting knowledge contributes to remote sensing mapping of the active layer dynamics and Arctic land surface hydrology.
Maleke, Caroline; Luo, Jianwen; Gamarnik, Viktor; Lu, Xin L; Konofagou, Elisa E
2010-07-01
The objective of this study is to show that Harmonic Motion Imaging (HMI) can be used as a reliable tumor-mapping technique based on the tumor's distinct stiffness at the early onset of disease. HMI is a radiation-force-based imaging method that generates a localized vibration deep inside the tissue to estimate the relative tissue stiffness based on the resulting displacement amplitude. In this paper, a finite-element model (FEM) study is presented, followed by an experimental validation in tissue-mimicking polyacrylamide gels and excised human breast tumors ex vivo. This study compares the resulting tissue motion in simulations and experiments at four different gel stiffnesses and three distinct spherical inclusion diameters. The elastic moduli of the gels were separately measured using mechanical testing. Identical transducer parameters were used in both the FEM and experimental studies, i.e., a 4.5-MHz single-element focused ultrasound (FUS) and a 7.5-MHz diagnostic (pulse-echo) transducer. In the simulation, an acoustic pressure field was used as the input stimulus to generate a localized vibration inside the target. Radiofrequency (rf) signals were then simulated using a 2D convolution model. A one-dimensional cross-correlation technique was performed on the simulated and experimental rf signals to estimate the axial displacement resulting from the harmonic radiation force. In order to measure the reliability of the displacement profiles in estimating the tissue stiffness distribution, the contrast-transfer efficiency (CTE) was calculated. For tumor mapping ex vivo, a harmonic radiation force was applied using a 2D raster-scan technique. The 2D HMI images of the breast tumor ex vivo could detect a malignant tumor (20 x 10 mm2) surrounded by glandular and fat tissues. The FEM and experimental results from both gels and breast tumors ex vivo demonstrated that HMI was capable of detecting and mapping the tumor or stiff inclusion with various diameters or stiffnesses. HMI may thus constitute a promising technique in tumor detection (>3 mm in diameter) and mapping based on its distinct stiffness.
Landsat for practical forest type mapping - A test case
NASA Technical Reports Server (NTRS)
Bryant, E.; Dodge, A. G., Jr.; Warren, S. D.
1980-01-01
Computer classified Landsat maps are compared with a recent conventional inventory of forest lands in northern Maine. Over the 196,000 hectare area mapped, estimates of the areas of softwood, mixed wood and hardwood forest obtained by a supervised classification of the Landsat data and a standard inventory based on aerial photointerpretation, probability proportional to prediction, field sampling and a standard forest measurement program are found to agree to within 5%. The cost of the Landsat maps is estimated to be $0.065/hectare. It is concluded that satellite techniques are worth developing for forest inventories, although they are not yet refined enough to be incorporated into current practical inventories.
Remote sensing sensors and applications in environmental resources mapping and modeling
Melesse, Assefa M.; Weng, Qihao; Thenkabail, Prasad S.; Senay, Gabriel B.
2007-01-01
The history of remote sensing and development of different sensors for environmental and natural resources mapping and data acquisition is reviewed and reported. Application examples in urban studies, hydrological modeling such as land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, surface energy flux and micro-topography correlation studies is discussed. The review also discusses the use of remotely sensed-based rainfall and potential evapotranspiration for estimating crop water requirement satisfaction index and hence provides early warning information for growers. The review is not an exhaustive application of the remote sensing techniques rather a summary of some important applications in environmental studies and modeling.
A flexible and rapid frequency selective scheme for SRS microscopy
NASA Astrophysics Data System (ADS)
Li, Jingting; Yue, Yuankai; Shih, Wei-Chuan
2017-02-01
Stimulated Raman scattering (SRS) is a label-free imaging technique suitable for studying biological systems. Due to stimulated nature by ultrafast laser pulses, SRS microscopy has the advantage of significantly higher sensitivity but often reduced spectroscopic information. In this paper, we present a newly constructed femtosecond SRS microscope with a high-speed dynamic micromirror device based pulse shaper to achieve flexible and rapid frequency selection within the C-H stretch region near 2800 to 3100 cm-1 with spectral width of 30 cm-1. This technique is applicable to lipid profiling such as cell activity mapping, lipid distribution mapping and distinction among subclasses.
NASA Astrophysics Data System (ADS)
Glass, John O.; Reddick, Wilburn E.; Reeves, Cara; Pui, Ching-Hon
2004-05-01
Reliably quantifying therapy-induced leukoencephalopathy in children treated for cancer is a challenging task due to its varying MR properties and similarity to normal tissues and imaging artifacts. T1, T2, PD, and FLAIR images were analyzed for a subset of 15 children from an institutional protocol for the treatment of acute lymphoblastic leukemia. Three different analysis techniques were compared to examine improvements in the segmentation accuracy of leukoencephalopathy versus manual tracings by two expert observers. The first technique utilized no apriori information and a white matter mask based on the segmentation of the first serial examination of each patient. MR images were then segmented with a Kohonen Self-Organizing Map. The other two techniques combine apriori maps from the ICBM atlas spatially normalized to each patient and resliced using SPM99 software. The apriori maps were included as input and a gradient magnitude threshold calculated on the FLAIR images was also utilized. The second technique used a 2-dimensional threshold, while the third algorithm utilized a 3-dimensional threshold. Kappa values were compared for the three techniques to each observer, and improvements were seen with each addition to the original algorithm (Observer 1: 0.651, 0.653, 0.744; Observer 2: 0.603, 0.615, 0.699).
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. It was found that color composite transparencies and monocular magnification provided the best base for land use interpretation. New methods for determining optimum sample sizes and analyzing interpretation accuracy levels were developed. All stages of the methodology were assessed, in the operational sense, during the production of a 1:250,000 rural land use map of Murcia Province, Southeast Spain.
Konofagou, Elisa E.; Provost, Jean
2014-01-01
Cardiovascular diseases rank as America’s primary killer, claiming the lives of over 41% of more than 2.4 million Americans. One of the main reasons for this high death toll is the severe lack of effective imaging techniques for screening, early detection and localization of an abnormality detected on the electrocardiogram (ECG). The two most widely used imaging techniques in the clinic are CT angiography and echocardiography with limitations in speed of application and reliability, respectively. It has been established that the mechanical and electrical properties of the myocardium change dramatically as a result of ischemia, infarction or arrhythmia; both at their onset and after survival. Despite these findings, no imaging technique currently exists that is routinely used in the clinic and can provide reliable, non-invasive, quantitative mapping of the regional, mechanical and electrical function of the myocardium. Electromechanical Wave Imaging (EWI) is an ultrasound-based technique that utilizes the electromechanical coupling and its associated resulting strain to infer to the underlying electrical function of the myocardium. The methodology of EWI is first described and its fundamental performance is presented. Subsequent in vivo canine and human applications are provided that demonstrate the applicability of Electromechanical Wave Imaging in differentiating between sinus rhythm and induced pacing schemes as well as mapping arrhythmias. Preliminary validation with catheter mapping is also provided and transthoracic electromechanical mapping in all four chambers of the human heart is also presented demonstrating the potential of this novel methodology to noninvasively infer to both the normal and pathological electrical conduction of the heart. PMID:22284425
Carrier-phase multipath corrections for GPS-based satellite attitude determination
NASA Technical Reports Server (NTRS)
Axelrad, A.; Reichert, P.
2001-01-01
This paper demonstrates the high degree of spatial repeatability of these errors for a spacecraft environment and describes a correction technique, termed the sky map method, which exploits the spatial correlation to correct measurements and improve the accuracy of GPS-based attitude solutions.
Spectral features based tea garden extraction from digital orthophoto maps
NASA Astrophysics Data System (ADS)
Jamil, Akhtar; Bayram, Bulent; Kucuk, Turgay; Zafer Seker, Dursun
2018-05-01
The advancements in the photogrammetry and remote sensing technologies has made it possible to extract useful tangible information from data which plays a pivotal role in various application such as management and monitoring of forests and agricultural lands etc. This study aimed to evaluate the effectiveness of spectral signatures for extraction of tea gardens from 1 : 5000 scaled digital orthophoto maps obtained from Rize city in Turkey. First, the normalized difference vegetation index (NDVI) was derived from the input images to suppress the non-vegetation areas. NDVI values less than zero were discarded and the output images was normalized in the range 0-255. Individual pixels were then mapped into meaningful objects using global region growing technique. The resulting image was filtered and smoothed to reduce the impact of noise. Furthermore, geometrical constraints were applied to remove small objects (less than 500 pixels) followed by morphological opening operator to enhance the results. These objects served as building blocks for further image analysis. Finally, for the classification stage, a range of spectral values were empirically calculated for each band and applied on candidate objects to extract tea gardens. For accuracy assessment, we employed an area based similarity metric by overlapping obtained tea garden boundaries with the manually digitized tea garden boundaries created by experts of photogrammetry. The overall accuracy of the proposed method scored 89 % for tea gardens from 10 sample orthophoto maps. We concluded that exploiting the spectral signatures using object based analysis is an effective technique for extraction of dominant tree species from digital orthophoto maps.
Comparison between genetic algorithm and self organizing map to detect botnet network traffic
NASA Astrophysics Data System (ADS)
Yugandhara Prabhakar, Shinde; Parganiha, Pratishtha; Madhu Viswanatham, V.; Nirmala, M.
2017-11-01
In Cyber Security world the botnet attacks are increasing. To detect botnet is a challenging task. Botnet is a group of computers connected in a coordinated fashion to do malicious activities. Many techniques have been developed and used to detect and prevent botnet traffic and the attacks. In this paper, a comparative study is done on Genetic Algorithm (GA) and Self Organizing Map (SOM) to detect the botnet network traffic. Both are soft computing techniques and used in this paper as data analytics system. GA is based on natural evolution process and SOM is an Artificial Neural Network type, uses unsupervised learning techniques. SOM uses neurons and classifies the data according to the neurons. Sample of KDD99 dataset is used as input to GA and SOM.
Ringler, Max; Mangione, Rosanna; Pašukonis, Andrius; Rainer, Gerhard; Gyimesi, Kristin; Felling, Julia; Kronaus, Hannes; Réjou-Méchain, Maxime; Chave, Jérôme; Reiter, Karl; Ringler, Eva
2015-01-01
For animals with spatially complex behaviours at relatively small scales, the resolution of a global positioning system (GPS) receiver location is often below the resolution needed to correctly map animals’ spatial behaviour. Natural conditions such as canopy cover, canyons or clouds can further degrade GPS receiver reception. Here we present a detailed, high-resolution map of a 4.6 ha Neotropical river island and a 8.3 ha mainland plot with the location of every tree >5 cm DBH and all structures on the forest floor, which are relevant to our study species, the territorial frog Allobates femoralis (Dendrobatidae). The map was derived using distance- and compass-based survey techniques, rooted on dGPS reference points, and incorporates altitudinal information based on a LiDAR survey of the area. PMID:27053943
Noguchi, Kyo; Itoh, Toshihide; Naruto, Norihito; Takashima, Shutaro; Tanaka, Kortaro; Kuroda, Satoshi
2017-01-01
We evaluated whether X-map, a novel imaging technique, can visualize ischemic lesions within 20 hours after the onset in patients with acute ischemic stroke, using noncontrast dual-energy computed tomography (DECT). Six patients with acute ischemic stroke were included in this study. Noncontrast head DECT scans were acquired with 2 X-ray tubes operated at 80 kV and Sn150 kV between 32 minutes and 20 hours after the onset. Using these DECT scans, the X-map was reconstructed based on 3-material decomposition and compared with a simulated standard (120 kV) computed tomography (CT) and diffusion-weighted imaging (DWI). The X-map showed more sensitivity to identify the lesions as an area of lower attenuation value than a simulated standard CT in all 6 patients. The lesions on the X-map correlated well with those on DWI. In 3 of 6 patients, the X-map detected a transient decrease in the attenuation value in the peri-infarct area within 1 day after the onset. The X-map is a powerful tool to supplement a simulated standard CT and characterize acute ischemic lesions. However, the X-map cannot replace a simulated standard CT to diagnose acute cerebral infarction. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Chiu, Bernard; Li, Bing; Chow, Tommy W. S.
2013-09-01
With the advent of new therapies and management strategies for carotid atherosclerosis, there is a parallel need for measurement tools or biomarkers to evaluate the efficacy of these new strategies. 3D ultrasound has been shown to provide reproducible measurements of plaque area/volume and vessel wall volume. However, since carotid atherosclerosis is a focal disease that predominantly occurs at bifurcations, biomarkers based on local plaque change may be more sensitive than global volumetric measurements in demonstrating efficacy of new therapies. The ultimate goal of this paper is to develop a biomarker that is based on the local distribution of vessel-wall-plus-plaque thickness change (VWT-Change) that has occurred during the course of a clinical study. To allow comparison between different treatment groups, the VWT-Change distribution of each subject must first be mapped to a standardized domain. In this study, we developed a technique to map the 3D VWT-Change distribution to a 2D standardized template. We then applied a feature selection technique to identify regions on the 2D standardized map on which subjects in different treatment groups exhibit greater difference in VWT-Change. The proposed algorithm was applied to analyse the VWT-Change of 20 subjects in a placebo-controlled study of the effect of atorvastatin (Lipitor). The average VWT-Change for each subject was computed (i) over all points in the 2D map and (ii) over feature points only. For the average computed over all points, 97 subjects per group would be required to detect an effect size of 25% that of atorvastatin in a six-month study. The sample size is reduced to 25 subjects if the average were computed over feature points only. The introduction of this sensitive quantification technique for carotid atherosclerosis progression/regression would allow many proof-of-principle studies to be performed before a more costly and longer study involving a larger population is held to confirm the treatment efficacy.
Yatsushiro, Satoshi; Sunohara, Saeko; Hayashi, Naokazu; Hirayama, Akihiro; Matsumae, Mitsunori; Atsumi, Hideki; Kuroda, Kagayaki
2018-04-10
A correlation mapping technique delineating delay time and maximum correlation for characterizing pulsatile cerebrospinal fluid (CSF) propagation was proposed. After proofing its technical concept, this technique was applied to healthy volunteers and idiopathic normal pressure hydrocephalus (iNPH) patients. A time-resolved three dimensional-phase contrast (3D-PC) sampled the cardiac-driven CSF velocity at 32 temporal points per cardiac period at each spatial location using retrospective cardiac gating. The proposed technique visualized distributions of propagation delay and correlation coefficient of the PC-based CSF velocity waveform with reference to a waveform at a particular point in the CSF space. The delay time was obtained as the amount of time-shift, giving the maximum correlation for the velocity waveform at an arbitrary location with that at the reference location. The validity and accuracy of the technique were confirmed in a flow phantom equipped with a cardiovascular pump. The technique was then applied to evaluate the intracranial CSF motions in young, healthy (N = 13), and elderly, healthy (N = 13) volunteers and iNPH patients (N = 13). The phantom study demonstrated that root mean square error of the delay time was 2.27%, which was less than the temporal resolution of PC measurement used in this study (3.13% of a cardiac cycle). The human studies showed a significant difference (P < 0.01) in the mean correlation coefficient between the young, healthy group and the other two groups. A significant difference (P < 0.05) was also recognized in standard deviation of the correlation coefficients in intracranial CSF space among all groups. The result suggests that the CSF space compliance of iNPH patients was lower than that of healthy volunteers. The correlation mapping technique allowed us to visualize pulsatile CSF velocity wave propagations as still images. The technique may help to classify diseases related to CSF dynamics, such as iNPH.
Fusion-based multi-target tracking and localization for intelligent surveillance systems
NASA Astrophysics Data System (ADS)
Rababaah, Haroun; Shirkhodaie, Amir
2008-04-01
In this paper, we have presented two approaches addressing visual target tracking and localization in complex urban environment. The two techniques presented in this paper are: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for associated tracking of multi-targets of interests. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian Mixture Model. Foreground segmentation, on other hand, was achieved by the Connected Components Analysis (CCA) technique. The tracking of individual targets was estimated by fusing two sources of information, the centroid with the spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image pixel to world coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs. In coarse estimation, an approximate neighborhood of the target position is estimated based on nearest 4-neighbor method, and in fine estimation, we use Euclidean interpolation to localize the position within the estimated four neighbors. Both techniques were tested and shown reliable results for tracking and localization of Targets of interests in complex urban environment.
Generating realistic images using Kray
NASA Astrophysics Data System (ADS)
Tanski, Grzegorz
2004-07-01
Kray is an application for creating realistic images. It is written in C++ programming language, has a text-based interface, solves global illumination problem using techniques such as radiosity, path tracing and photon mapping.
NASA Astrophysics Data System (ADS)
Roberge, S.; Chokmani, K.; De Sève, D.
2012-04-01
The snow cover plays an important role in the hydrological cycle of Quebec (Eastern Canada). Consequently, evaluating its spatial extent interests the authorities responsible for the management of water resources, especially hydropower companies. The main objective of this study is the development of a snow-cover mapping strategy using remote sensing data and ensemble based systems techniques. Planned to be tested in a near real-time operational mode, this snow-cover mapping strategy has the advantage to provide the probability of a pixel to be snow covered and its uncertainty. Ensemble systems are made of two key components. First, a method is needed to build an ensemble of classifiers that is diverse as much as possible. Second, an approach is required to combine the outputs of individual classifiers that make up the ensemble in such a way that correct decisions are amplified, and incorrect ones are cancelled out. In this study, we demonstrate the potential of ensemble systems to snow-cover mapping using remote sensing data. The chosen classifier is a sequential thresholds algorithm using NOAA-AVHRR data adapted to conditions over Eastern Canada. Its special feature is the use of a combination of six sequential thresholds varying according to the day in the winter season. Two versions of the snow-cover mapping algorithm have been developed: one is specific for autumn (from October 1st to December 31st) and the other for spring (from March 16th to May 31st). In order to build the ensemble based system, different versions of the algorithm are created by varying randomly its parameters. One hundred of the versions are included in the ensemble. The probability of a pixel to be snow, no-snow or cloud covered corresponds to the amount of votes the pixel has been classified as such by all classifiers. The overall performance of ensemble based mapping is compared to the overall performance of the chosen classifier, and also with ground observations at meteorological stations.
A comparison of two conformal mapping techniques applied to an aerobrake body
NASA Technical Reports Server (NTRS)
Hommel, Mark J.
1987-01-01
Conformal mapping is a classical technique which has been utilized for solving problems in aerodynamics and hydrodynamics. Conformal mapping has been successfully applied in the construction of grids around airfoils, engine inlets and other aircraft configurations. Conformal mapping techniques were applied to an aerobrake body having an axis of symmetry. Two different approaches were utilized: (1) Karman-Trefftz transformation; and (2) Point Wise Schwarz Christoffel transformation. In both cases, the aerobrake body was mapped onto a near circle, and a grid was generated in the mapped plane. The mapped body and grid were then mapped back into physical space and the properties of the associated grids were examined. Advantages and disadvantages of both approaches are discussed.
Teuho, Jarmo; Saunavaara, Virva; Tolvanen, Tuula; Tuokkola, Terhi; Karlsson, Antti; Tuisku, Jouni; Teräs, Mika
2017-10-01
In PET, corrections for photon scatter and attenuation are essential for visual and quantitative consistency. MR attenuation correction (MRAC) is generally conducted by image segmentation and assignment of discrete attenuation coefficients, which offer limited accuracy compared with CT attenuation correction. Potential inaccuracies in MRAC may affect scatter correction, because the attenuation image (μ-map) is used in single scatter simulation (SSS) to calculate the scatter estimate. We assessed the impact of MRAC to scatter correction using 2 scatter-correction techniques and 3 μ-maps for MRAC. Methods: The tail-fitted SSS (TF-SSS) and a Monte Carlo-based single scatter simulation (MC-SSS) algorithm implementations on the Philips Ingenuity TF PET/MR were used with 1 CT-based and 2 MR-based μ-maps. Data from 7 subjects were used in the clinical evaluation, and a phantom study using an anatomic brain phantom was conducted. Scatter-correction sinograms were evaluated for each scatter correction method and μ-map. Absolute image quantification was investigated with the phantom data. Quantitative assessment of PET images was performed by volume-of-interest and ratio image analysis. Results: MRAC did not result in large differences in scatter algorithm performance, especially with TF-SSS. Scatter sinograms and scatter fractions did not reveal large differences regardless of the μ-map used. TF-SSS showed slightly higher absolute quantification. The differences in volume-of-interest analysis between TF-SSS and MC-SSS were 3% at maximum in the phantom and 4% in the patient study. Both algorithms showed excellent correlation with each other with no visual differences between PET images. MC-SSS showed a slight dependency on the μ-map used, with a difference of 2% on average and 4% at maximum when a μ-map without bone was used. Conclusion: The effect of different MR-based μ-maps on the performance of scatter correction was minimal in non-time-of-flight 18 F-FDG PET/MR brain imaging. The SSS algorithm was not affected significantly by MRAC. The performance of the MC-SSS algorithm is comparable but not superior to TF-SSS, warranting further investigations of algorithm optimization and performance with different radiotracers and time-of-flight imaging. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
Interactive Mapping on Virtual Terrain Models Using RIMS (Real-time, Interactive Mapping System)
NASA Astrophysics Data System (ADS)
Bernardin, T.; Cowgill, E.; Gold, R. D.; Hamann, B.; Kreylos, O.; Schmitt, A.
2006-12-01
Recent and ongoing space missions are yielding new multispectral data for the surfaces of Earth and other planets at unprecedented rates and spatial resolution. With their high spatial resolution and widespread coverage, these data have opened new frontiers in observational Earth and planetary science. But they have also precipitated an acute need for new analytical techniques. To address this problem, we have developed RIMS, a Real-time, Interactive Mapping System that allows scientists to visualize, interact with, and map directly on, three-dimensional (3D) displays of georeferenced texture data, such as multispectral satellite imagery, that is draped over a surface representation derived from digital elevation data. The system uses a quadtree-based multiresolution method to render in real time high-resolution (3 to 10 m/pixel) data over large (800 km by 800 km) spatial areas. It allows users to map inside this interactive environment by generating georeferenced and attributed vector-based elements that are draped over the topography. We explain the technique using 15 m ASTER stereo-data from Iraq, P.R. China, and other remote locations because our particular motivation is to develop a technique that permits the detailed (10 m to 1000 m) neotectonic mapping over large (100 km to 1000 km long) active fault systems that is needed to better understand active continental deformation on Earth. RIMS also includes a virtual geologic compass that allows users to fit a plane to geologic surfaces and thereby measure their orientations. It also includes tools that allow 3D surface reconstruction of deformed and partially eroded surfaces such as folded bedding planes. These georeferenced map and measurement data can be exported to, or imported from, a standard GIS (geographic information systems) file format. Our interactive, 3D visualization and analysis system is designed for those who study planetary surfaces, including neotectonic geologists, geomorphologists, marine geophysicists, and planetary scientists. The strength of our system is that it combines interactive rendering with interactive mapping and measurement of features observed in topographic and texture data. Comparison with commercially available software indicates that our system improves mapping accuracy and efficiency. More importantly, it enables Earth scientists to rapidly achieve a deeper level of understanding of remotely sensed data, as observations can be made that are not possible with existing systems.
The Survey of Vision-based 3D Modeling Techniques
NASA Astrophysics Data System (ADS)
Ruan, Mingzhe
2017-10-01
This paper reviews the vision-based localization and map construction methods from the perspectives of VSLAM, SFM, 3DMax and Unity3D. It focuses on the key technologies and the latest research progress on each aspect, analyzes the advantages and disadvantages of each method, illustrates their implementation process and system framework, and further discusses the way to promote the combination for their complementary strength. Finally, the future opportunity of the combination of the four techniques is expected.
Surface plasmon resonance sensing: from purified biomolecules to intact cells.
Su, Yu-Wen; Wang, Wei
2018-04-12
Surface plasmon resonance (SPR) has become a well-recognized label-free technique for measuring the binding kinetics between biomolecules since the invention of the first SPR-based immunosensor in 1980s. The most popular and traditional format for SPR analysis is to monitor the real-time optical signals when a solution containing ligand molecules is flowing over a sensor substrate functionalized with purified receptor molecules. In recent years, rapid development of several kinds of SPR imaging techniques have allowed for mapping the dynamic distribution of local mass density within single living cells with high spatial and temporal resolutions and reliable sensitivity. Such capability immediately enabled one to investigate the interaction between important biomolecules and intact cells in a label-free, quantitative, and single cell manner, leading to an exciting new trend of cell-based SPR bioanalysis. In this Trend Article, we first describe the principle and technical features of two types of SPR imaging techniques based on prism and objective, respectively. Then we survey the intact cell-based applications in both fundamental cell biology and drug discovery. We conclude the article with comments and perspectives on the future developments. Graphical abstract Recent developments in surface plasmon resonance (SPR) imaging techniques allow for label-free mapping the mass-distribution within single living cells, leading to great expansions in biomolecular interactions studies from homogeneous substrates functionalized with purified biomolecules to heterogeneous substrates containing individual living cells.
NASA Astrophysics Data System (ADS)
Kincal, Cem; Singleton, Andrew; Liu, Peng; Li, Zhenhong; Drummond, Jane; Hoey, Trevor; Muller, Jan-Peter; Qu, Wei; Zeng, Qiming; Zhang, Jingfa; Du, Peijun
2010-10-01
Mass movements on steep slopes are a major hazard to communities and infrastructure in the Three Gorges region, China. Developing susceptibility maps of mass movements is therefore very important in both current and future land use planning. This study employed satellite optical imagery and an ASTER GDEM (15 m) to derive various parameters (namely geology; slope gradient; proximity to drainage networks and proximity to lineaments) in order to create a GIS-based map of mass movement susceptibility. This map was then evaluated using highly accurate deformation signals processed using the Persistent Scatterer (PS) InSAR technique. Areas of high susceptibility correspond well to points of high subsidence, which provides a strong support of our susceptibility map.
Mapping local anisotropy axis for scattering media using backscattering Mueller matrix imaging
NASA Astrophysics Data System (ADS)
He, Honghui; Sun, Minghao; Zeng, Nan; Du, E.; Guo, Yihong; He, Yonghong; Ma, Hui
2014-03-01
Mueller matrix imaging techniques can be used to detect the micro-structure variations of superficial biological tissues, including the sizes and shapes of cells, the structures in cells, and the densities of the organelles. Many tissues contain anisotropic fibrous micro-structures, such as collagen fibers, elastin fibers, and muscle fibers. Changes of these fibrous structures are potentially good indicators for some pathological variations. In this paper, we propose a quantitative analysis technique based on Mueller matrix for mapping local anisotropy axis of scattering media. By conducting both experiments on silk sample and Monte Carlo simulation based on the sphere-cylinder scattering model (SCSM), we extract anisotropy axis parameters from different backscattering Mueller matrix elements. Moreover, we testify the possible applications of these parameters for biological tissues. The preliminary experimental results of human cancerous samples show that, these parameters are capable to map the local axis of fibers. Since many pathological changes including early stage cancers affect the well aligned structures for tissues, the experimental results indicate that these parameters can be used as potential tools in clinical applications for biomedical diagnosis purposes.
NASA Technical Reports Server (NTRS)
Komjathy, Attila; Sparks, Lawrence; Wilson, Brian D.; Mannucci, Anthony J.
2005-01-01
To take advantage of the vast amount of GPS data, researchers use a number of techniques to estimate satellite and receiver interfrequency biases and the total electron content (TEC) of the ionosphere. Most techniques estimate vertical ionospheric structure and, simultaneously, hardware-related biases treated as nuisance parameters. These methods often are limited to 200 GPS receivers and use a sequential least squares or Kalman filter approach. The biases are later removed from the measurements to obtain unbiased TEC. In our approach to calibrating GPS receiver and transmitter interfrequency biases we take advantage of all available GPS receivers using a new processing algorithm based on the Global Ionospheric Mapping (GIM) software developed at the Jet Propulsion Laboratory. This new capability is designed to estimate receiver biases for all stations. We solve for the instrumental biases by modeling the ionospheric delay and removing it from the observation equation using precomputed GIM maps. The precomputed GIM maps rely on 200 globally distributed GPS receivers to establish the ''background'' used to model the ionosphere at the remaining 800 GPS sites.
ERIC Educational Resources Information Center
Kandiko, Camille B.; Kinchin, Ian M.
2012-01-01
Background: Concept-mapping and interview techniques are used to track knowledge and understanding over the duration of PhD study amongst four students and their supervisors in the course of full-time research towards their PhDs. This work is in contrast to much PhD supervision research and policy research that focuses on supervisory styles and…
Planetary cartography in the next decade: Digital cartography and emerging opportunities
NASA Technical Reports Server (NTRS)
1989-01-01
Planetary maps being produced today will represent views of the solar system for many decades to come. The primary objective of the planetary cartography program is to produce the most complete and accurate maps from hundreds of thousands of planetary images in support of scientific studies and future missions. Here, the utilization of digital techniques and digital bases in response to recent advances in computer technology are emphasized.
Impervious surface mapping with Quickbird imagery
Lu, Dengsheng; Hetrick, Scott; Moran, Emilio
2010-01-01
This research selects two study areas with different urban developments, sizes, and spatial patterns to explore the suitable methods for mapping impervious surface distribution using Quickbird imagery. The selected methods include per-pixel based supervised classification, segmentation-based classification, and a hybrid method. A comparative analysis of the results indicates that per-pixel based supervised classification produces a large number of “salt-and-pepper” pixels, and segmentation based methods can significantly reduce this problem. However, neither method can effectively solve the spectral confusion of impervious surfaces with water/wetland and bare soils and the impacts of shadows. In order to accurately map impervious surface distribution from Quickbird images, manual editing is necessary and may be the only way to extract impervious surfaces from the confused land covers and the shadow problem. This research indicates that the hybrid method consisting of thresholding techniques, unsupervised classification and limited manual editing provides the best performance. PMID:21643434
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ) viz. heuristic, semi quantitative, quantitative, probabilistic and multi-criteria decision making process. However, no one method is accepted universally for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best suited model. This paper presents the review of researches on landslide hazard mapping published in recent years. The advanced multivariate techniques are proved to be effective in spatial prediction of landslides with high degree of accuracy. Physical process based models also perform well in LHZ mapping even in the areas with poor database. Multi-criteria decision making approach also play significant role in determining relative importance of landslide causative factors in slope instability process. Remote Sensing and Geographical Information System (GIS) are powerful tools to assess landslide hazards and are being used extensively in landslide researches since last decade. Aerial photographs and high resolution satellite data are useful in detection, mapping and monitoring landslide processes. GIS based LHZ models helps not only to map and monitor landslides but also to predict future slope failures. The advancements in Geo-spatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.
GPU-BSM: A GPU-Based Tool to Map Bisulfite-Treated Reads
Manconi, Andrea; Orro, Alessandro; Manca, Emanuele; Armano, Giuliano; Milanesi, Luciano
2014-01-01
Cytosine DNA methylation is an epigenetic mark implicated in several biological processes. Bisulfite treatment of DNA is acknowledged as the gold standard technique to study methylation. This technique introduces changes in the genomic DNA by converting cytosines to uracils while 5-methylcytosines remain nonreactive. During PCR amplification 5-methylcytosines are amplified as cytosine, whereas uracils and thymines as thymine. To detect the methylation levels, reads treated with the bisulfite must be aligned against a reference genome. Mapping these reads to a reference genome represents a significant computational challenge mainly due to the increased search space and the loss of information introduced by the treatment. To deal with this computational challenge we devised GPU-BSM, a tool based on modern Graphics Processing Units. Graphics Processing Units are hardware accelerators that are increasingly being used successfully to accelerate general-purpose scientific applications. GPU-BSM is a tool able to map bisulfite-treated reads from whole genome bisulfite sequencing and reduced representation bisulfite sequencing, and to estimate methylation levels, with the goal of detecting methylation. Due to the massive parallelization obtained by exploiting graphics cards, GPU-BSM aligns bisulfite-treated reads faster than other cutting-edge solutions, while outperforming most of them in terms of unique mapped reads. PMID:24842718
Lan, Yihua; Li, Cunhua; Ren, Haozheng; Zhang, Yong; Min, Zhifang
2012-10-21
A new heuristic algorithm based on the so-called geometric distance sorting technique is proposed for solving the fluence map optimization with dose-volume constraints which is one of the most essential tasks for inverse planning in IMRT. The framework of the proposed method is basically an iterative process which begins with a simple linear constrained quadratic optimization model without considering any dose-volume constraints, and then the dose constraints for the voxels violating the dose-volume constraints are gradually added into the quadratic optimization model step by step until all the dose-volume constraints are satisfied. In each iteration step, an interior point method is adopted to solve each new linear constrained quadratic programming. For choosing the proper candidate voxels for the current dose constraint adding, a so-called geometric distance defined in the transformed standard quadratic form of the fluence map optimization model was used to guide the selection of the voxels. The new geometric distance sorting technique can mostly reduce the unexpected increase of the objective function value caused inevitably by the constraint adding. It can be regarded as an upgrading to the traditional dose sorting technique. The geometry explanation for the proposed method is also given and a proposition is proved to support our heuristic idea. In addition, a smart constraint adding/deleting strategy is designed to ensure a stable iteration convergence. The new algorithm is tested on four cases including head-neck, a prostate, a lung and an oropharyngeal, and compared with the algorithm based on the traditional dose sorting technique. Experimental results showed that the proposed method is more suitable for guiding the selection of new constraints than the traditional dose sorting method, especially for the cases whose target regions are in non-convex shapes. It is a more efficient optimization technique to some extent for choosing constraints than the dose sorting method. By integrating a smart constraint adding/deleting scheme within the iteration framework, the new technique builds up an improved algorithm for solving the fluence map optimization with dose-volume constraints.
Cognitive Mapping Techniques: Implications for Research in Engineering and Technology Education
ERIC Educational Resources Information Center
Dixon, Raymond A.; Lammi, Matthew
2014-01-01
The primary goal of this paper is to present the theoretical basis and application of two types of cognitive maps, concept map and mind map, and explain how they can be used by educational researchers in engineering design research. Cognitive mapping techniques can be useful to researchers as they study students' problem solving strategies…
Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time.
Dhar, Amrit; Minin, Vladimir N
2017-05-01
Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences.
Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time
Dhar, Amrit
2017-01-01
Abstract Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences. PMID:28177780
Imaging of stellar surfaces with the Occamian approach and the least-squares deconvolution technique
NASA Astrophysics Data System (ADS)
Järvinen, S. P.; Berdyugina, S. V.
2010-10-01
Context. We present in this paper a new technique for the indirect imaging of stellar surfaces (Doppler imaging, DI), when low signal-to-noise spectral data have been improved by the least-squares deconvolution (LSD) method and inverted into temperature maps with the Occamian approach. We apply this technique to both simulated and real data and investigate its applicability for different stellar rotation rates and noise levels in data. Aims: Our goal is to boost the signal of spots in spectral lines and to reduce the effect of photon noise without loosing the temperature information in the lines. Methods: We simulated data from a test star, to which we added different amounts of noise, and employed the inversion technique based on the Occamian approach with and without LSD. In order to be able to infer a temperature map from LSD profiles, we applied the LSD technique for the first time to both the simulated observations and theoretical local line profiles, which remain dependent on temperature and limb angles. We also investigated how the excitation energy of individual lines effects the obtained solution by using three submasks that have lines with low, medium, and high excitation energy levels. Results: We show that our novel approach enables us to overcome the limitations of the two-temperature approximation, which was previously employed for LSD profiles, and to obtain true temperature maps with stellar atmosphere models. The resulting maps agree well with those obtained using the inversion code without LSD, provided the data are noiseless. However, using LSD is only advisable for poor signal-to-noise data. Further, we show that the Occamian technique, both with and without LSD, approaches the surface temperature distribution reasonably well for an adequate spatial resolution. Thus, the stellar rotation rate has a great influence on the result. For instance, in a slowly rotating star, closely situated spots are usually recovered blurred and unresolved, which affects the obtained temperature range of the map. This limitation is critical for small unresolved cool spots and is common for all DI techniques. Finally the LSD method was carried out for high signal-to-noise observations of the young active star V889 Her: the maps obtained with and without LSD are found to be consistent. Conclusions: Our new technique provides meaningful information on the temperature distribution on the stellar surfaces, which was previously inaccessible in DI with LSD. Our approach can be easily adopted for any other multi-line techniques.
Spatial analysis of plutonium-239 + 240 and Americium-241 in soils around Rocky Flats, Colorado
DOE Office of Scientific and Technical Information (OSTI.GOV)
Litaor, M.I.
1995-05-01
Plutonium and american contamination of soils around Rocky Flats, Colorado resulted from past outdoor storage practices. Four previous studies produce four different Pu isopleth maps. Spatial estimation techniques were not used in the construction of these maps and were also based on an extremely small number of soil samples. The purpose of this study was to elucidate the magnitude of Pu-239 + 240 and Am-241 dispersion in the soil environment east of Rocky Flats using robust spatial estimation techniques. Soils were sampled from 118 plots of 1.01 and 4.05 ha by compositing 25 evenly spaced samples in each plot frommore » the top 0.64 cm. Plutonium-239 + 240 activity ranged from 1.85 to 53 560 Bq/kg with a mean of 1924 Bq/kg and a standard deviation of 6327 Bq/kg. Americium-241 activity ranged from 0.18 to 9990 Bq/kg with a mean of 321 Bq/kg and a standard deviation of 1143 Bq/kg. Geostatistical techniques were used to model the spatial dependency and construct isopleth maps showing Pu-239 + 240 and Am-241 distribution. The isopleth configuration was consistent with the hypothesis that the dominant dispersal mechanism of Pu-239 + 240 was wind dispersion from west to east. The Pu-239 + 240 isopleth map proposed to this study differed significantly in the direction and distance of dispersal from the previously published maps. This ispleth map as well as the Am-241 map should be used as the primary data for future risk assessment associated with public exposure to Pu-239 + 240 and Am-241. 37 refs., 7 figs., 2 tabs.« less
Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS
NASA Astrophysics Data System (ADS)
Ahmad, Raed; Adris, Ahmad; Singh, Ramesh
2016-07-01
In the present work, we discuss the use of an integrated remote sensing and Geographical Information System (GIS) techniques for evaluation of seismic hazard areas in Syria. The present study is the first time effort to create seismic hazard map with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for evaluation of seismic hazard were identified and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme has been developed for spatial data analysis using GIS to identify ranking of parameters to be included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained different classes with the local probability that produced using conventional analysis of observed earthquakes. Using earthquake data of Syria and the peak ground acceleration (PGA) data is introduced to the model to develop final seismic hazard map based on Gutenberg-Richter (a and b values) parameters and using the concepts of local probability and recurrence time. The application of the proposed technique in Syrian region indicates that this method provides good estimate of seismic hazard map compared to those developed from traditional techniques (Deterministic (DSHA) and probabilistic seismic hazard (PSHA). For the first time we have used numerous parameters using remote sensing and GIS in preparation of seismic hazard map which is found to be very realistic.
Mapping of the Land Cover Spatiotemporal Characteristics in Northern Russia Caused by Climate Change
NASA Astrophysics Data System (ADS)
Panidi, E.; Tsepelev, V.; Torlopova, N.; Bobkov, A.
2016-06-01
The study is devoted to the investigation of regional climate change in Northern Russia. Due to sparseness of the meteorological observation network in northern regions, we investigate the application capabilities of remotely sensed vegetation cover as indicator of climate change at the regional scale. In previous studies, we identified statistically significant relationship between the increase of surface air temperature and increase of the shrub vegetation productivity. We verified this relationship using ground observation data collected at the meteorological stations and Normalised Difference Vegetation Index (NDVI) data produced from Terra/MODIS satellite imagery. Additionally, we designed the technique of growing seasons separation for detailed investigation of the land cover (shrub cover) dynamics. Growing seasons are the periods when the temperature exceeds +5°C and +10°C. These periods determine the vegetation productivity conditions (i.e., conditions that allow growth of the phytomass). We have discovered that the trend signs for the surface air temperature and NDVI coincide on planes and river floodplains. On the current stage of the study, we are working on the automated mapping technique, which allows to estimate the direction and magnitude of the climate change in Northern Russia. This technique will make it possible to extrapolate identified relationship between land cover and climate onto territories with sparse network of meteorological stations. We have produced the gridded maps of NDVI and NDWI for the test area in European part of Northern Russia covered with the shrub vegetation. Basing on these maps, we may determine the frames of growing seasons for each grid cell. It will help us to obtain gridded maps of the NDVI linear trend for growing seasons on cell-by-cell basis. The trend maps can be used as indicative maps for estimation of the climate change on the studied areas.
NASA Astrophysics Data System (ADS)
Mansaray, Lamin R.; Liu, Lei; Zhou, Jun; Ma, Zhimin
2013-10-01
The Tonkolili iron field in northern Sierra Leone has the largest known iron ore deposit in Africa. It occurs in a greenstone belt in an Achaean granitic basement. This study focused mainly on mapping areas with iron-oxide and hydroxyl bearing minerals, and identifying potential areas for haematite mineralization and banded iron formations (BIFs) in Tonkolili. The predominant mineral assemblage at the surface (laterite duricrust) of this iron field is haematitegoethite- limonite ±magnetite. The mineralization occurs in quartzitic banded ironstones, layered amphibolites, granites, schists and hornblendites. In this study, Crosta techniques were applied on Enhanced Thematic Mapper (ETM+) data to enhance areas with alteration minerals and target potential areas of haematite and BIF units in the Tonkolili iron field. Synthetic analysis shows that alteration zones mapped herein are consistent with the already discovered magnetite BIFs in Tonkolili. Based on the overlaps of the simplified geological map and the remote sensing-based alteration mineral maps obtained in this study, three new haematite prospects were inferred within, and one new haematite prospect was inferred outside the tenement boundary of the Tonkolili exploration license. As the primary iron mineral in Tonkolili is magnetite, the study concludes that, these haematite prospects could also be underlain by magnetite BIFs. This study also concludes that, the application of Crosta techniques on ETM+ data is effective not only in mapping iron-oxide and hydroxyl alterations but can also provide a basis for inferring areas of potential iron resources in Algoma-type banded iron formations (BIFs), such as those in the Tonkolili field.
NASA Astrophysics Data System (ADS)
Abe, O. E.; Otero Villamide, X.; Paparini, C.; Radicella, S. M.; Nava, B.; Rodríguez-Bouza, M.
2017-04-01
Global Navigation Satellite Systems (GNSS) have become a powerful tool use in surveying and mapping, air and maritime navigation, ionospheric/space weather research and other applications. However, in some cases, its maximum efficiency could not be attained due to some uncorrelated errors associated with the system measurements, which is caused mainly by the dispersive nature of the ionosphere. Ionosphere has been represented using the total number of electrons along the signal path at a particular height known as Total Electron Content (TEC). However, there are many methods to estimate TEC but the outputs are not uniform, which could be due to the peculiarity in characterizing the biases inside the observables (measurements), and sometimes could be associated to the influence of mapping function. The errors in TEC estimation could lead to wrong conclusion and this could be more critical in case of safety-of-life application. This work investigated the performance of Ciraolo's and Gopi's GNSS-TEC calibration techniques, during 5 geomagnetic quiet and disturbed conditions in the month of October 2013, at the grid points located in low and middle latitudes. The data used are obtained from the GNSS ground-based receivers located at Borriana in Spain (40°N, 0°E; mid latitude) and Accra in Ghana (5.50°N, -0.20°E; low latitude). The results of the calibrated TEC are compared with the TEC obtained from European Geostationary Navigation Overlay System Processing Set (EGNOS PS) TEC algorithm, which is considered as a reference data. The TEC derived from Global Ionospheric Maps (GIM) through International GNSS service (IGS) was also examined at the same grid points. The results obtained in this work showed that Ciraolo's calibration technique (a calibration technique based on carrier-phase measurements only) estimates TEC better at middle latitude in comparison to Gopi's technique (a calibration technique based on code and carrier-phase measurements). At the same time, Gopi's calibration was also found more reliable in low latitude than Ciraolo's technique. In addition, the TEC derived from IGS GIM seems to be much reliable in middle-latitude than in low-latitude region.
Historical shoreline mapping (I): improving techniques and reducing positioning errors
Thieler, E. Robert; Danforth, William W.
1994-01-01
A critical need exists among coastal researchers and policy-makers for a precise method to obtain shoreline positions from historical maps and aerial photographs. A number of methods that vary widely in approach and accuracy have been developed to meet this need. None of the existing methods, however, address the entire range of cartographic and photogrammetric techniques required for accurate coastal mapping. Thus, their application to many typical shoreline mapping problems is limited. In addition, no shoreline mapping technique provides an adequate basis for quantifying the many errors inherent in shoreline mapping using maps and air photos. As a result, current assessments of errors in air photo mapping techniques generally (and falsely) assume that errors in shoreline positions are represented by the sum of a series of worst-case assumptions about digitizer operator resolution and ground control accuracy. These assessments also ignore altogether other errors that commonly approach ground distances of 10 m. This paper provides a conceptual and analytical framework for improved methods of extracting geographic data from maps and aerial photographs. We also present a new approach to shoreline mapping using air photos that revises and extends a number of photogrammetric techniques. These techniques include (1) developing spatially and temporally overlapping control networks for large groups of photos; (2) digitizing air photos for use in shoreline mapping; (3) preprocessing digitized photos to remove lens distortion and film deformation effects; (4) simultaneous aerotriangulation of large groups of spatially and temporally overlapping photos; and (5) using a single-ray intersection technique to determine geographic shoreline coordinates and express the horizontal and vertical error associated with a given digitized shoreline. As long as historical maps and air photos are used in studies of shoreline change, there will be a considerable amount of error (on the order of several meters) present in shoreline position and rate-of- change calculations. The techniques presented in this paper, however, provide a means to reduce and quantify these errors so that realistic assessments of the technological noise (as opposed to geological noise) in geographic shoreline positions can be made.
NASA Astrophysics Data System (ADS)
Dilalos, S.; Alexopoulos, J. D.
2017-05-01
In this paper, we discuss the correlation between isoseismal contour maps and gravity residual anomaly maps and how it might contribute to the characterization of vulnerable areas to earthquake damage, especially in urban areas, where the geophysical data collection is difficult. More specifically, we compare a couple of isoseismal maps that have been produced and published after the catastrophic earthquake of 7th September 1999 (5.9R) in Athens, the metropolis of Greece, with the residual map produced from the processing and data reduction of a gravity survey that has been carried out in the Athens basin recently. The geologic and tectonic regime of the Athens basin is quite complicated and it is still being updated with new elements. Basically it is comprised of four different geotectonic units, one of them considered as the autochthon. During the gravity investigation, 807 gravity stations were collected, based on a grid plan with spacing almost 1 km, covering the entire basin and supported by a newly established gravity base network comprised by thirteen bases. Differential DGPS technique was used for the accurate measurement of all the gravity stations and bases coordinates. After the appropriate data reduction and the construction of the Complete Bouguer Anomaly map, we applied FFT filtering in order to remove the regional component and produce the Residual Anomaly Map. The comparison of the Residual Anomaly Map with the isoseismal contours revealed that the areas with the most damage because of the earthquake were located in the areas with the minimum values of the Residual Anomaly Map.
Characterization and delineation of caribou habitat on Unimak Island using remote sensing techniques
NASA Astrophysics Data System (ADS)
Atkinson, Brain M.
The assessment of herbivore habitat quality is traditionally based on quantifying the forages available to the animal across their home range through ground-based techniques. While these methods are highly accurate, they can be time-consuming and highly expensive, especially for herbivores that occupy vast spatial landscapes. The Unimak Island caribou herd has been decreasing in the last decade at rates that have prompted discussion of management intervention. Frequent inclement weather in this region of Alaska has provided for little opportunity to study the caribou forage habitat on Unimak Island. The overall objectives of this study were two-fold 1) to assess the feasibility of using high-resolution color and near-infrared aerial imagery to map the forage distribution of caribou habitat on Unimak Island and 2) to assess the use of a new high-resolution multispectral satellite imagery platform, RapidEye, and use of the "red-edge" spectral band on vegetation classification accuracy. Maximum likelihood classification algorithms were used to create land cover maps in aerial and satellite imagery. Accuracy assessments and transformed divergence values were produced to assess vegetative spectral information and classification accuracy. By using RapidEye and aerial digital imagery in a hierarchical supervised classification technique, we were able to produce a high resolution land cover map of Unimak Island. We obtained overall accuracy rates of 71.4 percent which are comparable to other land cover maps using RapidEye imagery. The "red-edge" spectral band included in the RapidEye imagery provides additional spectral information that allows for a more accurate overall classification, raising overall accuracy 5.2 percent.
Comparison of clinical knowledge bases for summarization of electronic health records.
McCoy, Allison B; Sittig, Dean F; Wright, Adam
2013-01-01
Automated summarization tools that create condition-specific displays may improve clinician efficiency. These tools require new kinds of knowledge that is difficult to obtain. We compared five problem-medication pair knowledge bases generated using four previously described knowledge base development approaches. The number of pairs in the resulting mapped knowledge bases varied widely due to differing mapping techniques from the source terminologies, ranging from 2,873 to 63,977,738 pairs. The number of overlapping pairs across knowledge bases was low, with one knowledge base having half of the pairs overlapping with another knowledge base, and most having less than a third overlapping. Further research is necessary to better evaluate the knowledge bases independently in additional settings, and to identify methods to integrate the knowledge bases.
NASA Astrophysics Data System (ADS)
Yilmaz, Işik; Marschalko, Marian; Bednarik, Martin
2013-04-01
The paper presented herein compares and discusses the use of bivariate, multivariate and soft computing techniques for collapse susceptibility modelling. Conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN) models representing the bivariate, multivariate and soft computing techniques were used in GIS based collapse susceptibility mapping in an area from Sivas basin (Turkey). Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) by means of vegetation cover, distance from roads and settlements were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the models, and they were then compared by means of their validations. However, Area Under Curve (AUC) values obtained from all three models showed that the map obtained from soft computing (ANN) model looks like more accurate than the other models, accuracies of all three models can be evaluated relatively similar. The results also showed that the conditional probability is an essential method in preparation of collapse susceptibility map and highly compatible with GIS operating features.
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-09
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized-recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, for taking the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-01
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized-recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, for taking the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly. PMID:25572661
NASA Astrophysics Data System (ADS)
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-01
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized-recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, for taking the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
Mapping edge-based traffic measurements onto the internal links in MPLS network
NASA Astrophysics Data System (ADS)
Zhao, Guofeng; Tang, Hong; Zhang, Yi
2004-09-01
Applying multi-protocol label switching techniques to IP-based backbone for traffic engineering goals has shown advantageous. Obtaining a volume of load on each internal link of the network is crucial for traffic engineering applying. Though collecting can be available for each link, such as applying traditional SNMP scheme, the approach may cause heavy processing load and sharply degrade the throughput of the core routers. Then monitoring merely at the edge of the network and mapping the measurements onto the core provides a good alternative way. In this paper, we explore a scheme for traffic mapping with edge-based measurements in MPLS network. It is supposed that the volume of traffic on each internal link over the domain would be mapped onto by measurements available only at ingress nodes. We apply path-based measurements at ingress nodes without enabling measurements in the core of the network. We propose a method that can infer a path from the ingress to the egress node using label distribution protocol without collecting routing data from core routers. Based on flow theory and queuing theory, we prove that our approach is effective and present the algorithm for traffic mapping. We also show performance simulation results that indicate potential of our approach.
NASA Astrophysics Data System (ADS)
Gawior, D.; Rutkiewicz, P.; Malik, I.; Wistuba, M.
2017-11-01
LiDAR data provide new insights into the historical development of mining industry recorded in the topography and landscape. In the study on the lead ore mining in the 13th-17th century we identified remnants of mining activity in relief that are normally obscured by dense vegetation. The industry in Tarnowice Plateau was based on exploitation of galena from the bedrock. New technologies, including DEM from airborne LiDAR provide show that present landscape and relief of post-mining area under study developed during several, subsequent phases of exploitation when different techniques of exploitation were used and probably different types of ores were exploited. Study conducted on the Tarnowice Plateau proved that combining GIS visualization techniques with historical maps, among all geological maps, is a promising approach in reconstructing development of anthropogenic relief and landscape..
Tip-enhanced Raman mapping with top-illumination AFM.
Chan, K L Andrew; Kazarian, Sergei G
2011-04-29
Tip-enhanced Raman mapping is a powerful, emerging technique that offers rich chemical information and high spatial resolution. Currently, most of the successes in tip-enhanced Raman scattering (TERS) measurements are based on the inverted configuration where tips and laser are approaching the sample from opposite sides. This results in the limitation of measurement for transparent samples only. Several approaches have been developed to obtain tip-enhanced Raman mapping in reflection mode, many of which involve certain customisations of the system. We have demonstrated in this work that it is also possible to obtain TERS nano-images using an upright microscope (top-illumination) with a gold-coated Si atomic force microscope (AFM) cantilever without significant modification to the existing integrated AFM/Raman system. A TERS image of a single-walled carbon nanotube has been achieved with a spatial resolution of ∼ 20-50 nm, demonstrating the potential of this technique for studying non-transparent nanoscale materials.
Application of remote sensing to monitoring and studying dispersion in ocean dumping
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Ohlhorst, C. W.
1981-01-01
Remotely sensed wide area synoptic data provides information on ocean dumping that is not readily available by other means. A qualitative approach has been used to map features, such as river plumes. Results of quantitative analyses have been used to develop maps showing quantitative distributions of one or more water quality parameters, such as suspended solids or chlorophyll a. Joint NASA/NOAA experiments have been conducted at designated dump areas in the U.S. coastal zones to determine the applicability of aircraft remote sensing systems to map plumes resulting from ocean dumping of sewage sludge and industrial wastes. A second objective is related to the evaluation of previously developed quantitative analysis techniques for studying dispersion of materials in these plumes. It was found that plumes resulting from dumping of four waste materials have distinctive spectral characteristics. The development of a technology for use in a routine monitoring system, based on remote sensing techniques, is discussed.
Detection and Tracking Based on a Dynamical Hierarchical Occupancy Map in Agent-Based Simulations
2008-09-01
describes various techniques for targeting with probabilty reasoning. Advantages and disadvantages of the different methods will be discussed...Psum. Such an algorithm could decrease the performance of the prototype itself and therefore was not considered. Probabilty over time 0 0.2 0.4 0.6
White matter tractography by means of Turboprop diffusion tensor imaging.
Arfanakis, Konstantinos; Gui, Minzhi; Lazar, Mariana
2005-12-01
White matter fiber-tractography by means of diffusion tensor imaging (DTI) is a noninvasive technique that provides estimates of the structural connectivity of the brain. However, conventional fiber-tracking methods using DTI are based on echo-planar image acquisitions (EPI), which suffer from image distortions and artifacts due to magnetic susceptibility variations and eddy currents. Thus, a large percentage of white matter fiber bundles that are mapped using EPI-based DTI data are distorted, and/or terminated early, while others are completely undetected. This severely limits the potential of fiber-tracking techniques. In contrast, Turboprop imaging is a multiple-shot gradient and spin-echo (GRASE) technique that provides images with significantly fewer susceptibility and eddy current-related artifacts than EPI. The purpose of this work was to evaluate the performance of fiber-tractography techniques when using data obtained with Turboprop-DTI. All fiber pathways that were mapped were found to be in agreement with the anatomy. There were no visible distortions in any of the traced fiber bundles, even when these were located in the vicinity of significant magnetic field inhomogeneities. Additionally, the Turboprop-DTI data used in this research were acquired in less than 19 min of scan time. Thus, Turboprop appears to be a promising DTI data acquisition technique for tracing white matter fibers.
Sequence-structure mapping errors in the PDB: OB-fold domains
Venclovas, Česlovas; Ginalski, Krzysztof; Kang, Chulhee
2004-01-01
The Protein Data Bank (PDB) is the single most important repository of structural data for proteins and other biologically relevant molecules. Therefore, it is critically important to keep the PDB data, as much as possible, error-free. In this study, we have analyzed PDB crystal structures possessing oligonucleotide/oligosaccharide binding (OB)-fold, one of the highly populated folds, for the presence of sequence-structure mapping errors. Using energy-based structure quality assessment coupled with sequence analyses, we have found that there are at least five OB-structures in the PDB that have regions where sequences have been incorrectly mapped onto the structure. We have demonstrated that the combination of these computation techniques is effective not only in detecting sequence-structure mapping errors, but also in providing guidance to correct them. Namely, we have used results of computational analysis to direct a revision of X-ray data for one of the PDB entries containing a fairly inconspicuous sequence-structure mapping error. The revised structure has been deposited with the PDB. We suggest use of computational energy assessment and sequence analysis techniques to facilitate structure determination when homologs having known structure are available to use as a reference. Such computational analysis may be useful in either guiding the sequence-structure assignment process or verifying the sequence mapping within poorly defined regions. PMID:15133161
Hospital positioning: a strategic tool for the 1990s.
San Augustine, A J; Long, W J; Pantzallis, J
1992-03-01
The authors extend the process of market positioning in the health care sector by focusing on the simultaneous utilization of traditional research methods and emerging new computer-based adaptive perceptual mapping technologies and techniques.
Self-Organizing Maps-based ocean currents forecasting system.
Vilibić, Ivica; Šepić, Jadranka; Mihanović, Hrvoje; Kalinić, Hrvoje; Cosoli, Simone; Janeković, Ivica; Žagar, Nedjeljka; Jesenko, Blaž; Tudor, Martina; Dadić, Vlado; Ivanković, Damir
2016-03-16
An ocean surface currents forecasting system, based on a Self-Organizing Maps (SOM) neural network algorithm, high-frequency (HF) ocean radar measurements and numerical weather prediction (NWP) products, has been developed for a coastal area of the northern Adriatic and compared with operational ROMS-derived surface currents. The two systems differ significantly in architecture and algorithms, being based on either unsupervised learning techniques or ocean physics. To compare performance of the two methods, their forecasting skills were tested on independent datasets. The SOM-based forecasting system has a slightly better forecasting skill, especially during strong wind conditions, with potential for further improvement when data sets of higher quality and longer duration are used for training.
Self-Organizing Maps-based ocean currents forecasting system
Vilibić, Ivica; Šepić, Jadranka; Mihanović, Hrvoje; Kalinić, Hrvoje; Cosoli, Simone; Janeković, Ivica; Žagar, Nedjeljka; Jesenko, Blaž; Tudor, Martina; Dadić, Vlado; Ivanković, Damir
2016-01-01
An ocean surface currents forecasting system, based on a Self-Organizing Maps (SOM) neural network algorithm, high-frequency (HF) ocean radar measurements and numerical weather prediction (NWP) products, has been developed for a coastal area of the northern Adriatic and compared with operational ROMS-derived surface currents. The two systems differ significantly in architecture and algorithms, being based on either unsupervised learning techniques or ocean physics. To compare performance of the two methods, their forecasting skills were tested on independent datasets. The SOM-based forecasting system has a slightly better forecasting skill, especially during strong wind conditions, with potential for further improvement when data sets of higher quality and longer duration are used for training. PMID:26979129
Accuracy of vertical radial plume mapping technique in measuring lagoon gas emissions.
Viguria, Maialen; Ro, Kyoung S; Stone, Kenneth C; Johnson, Melvin H
2015-04-01
Recently, the U.S. Environmental Protection Agency (EPA) posted a ground-based optical remote sensing method on its Web site called Other Test Method (OTM) 10 for measuring fugitive gas emission flux from area sources such as closed landfills. The OTM 10 utilizes the vertical radial plume mapping (VRPM) technique to calculate fugitive gas emission mass rates based on measured wind speed profiles and path-integrated gas concentrations (PICs). This study evaluates the accuracy of the VRPM technique in measuring gas emission from animal waste treatment lagoons. A field trial was designed to evaluate the accuracy of the VRPM technique. Control releases of methane (CH4) were made from a 45 m×45 m floating perforated pipe network located on an irrigation pond that resembled typical treatment lagoon environments. The accuracy of the VRPM technique was expressed by the ratio of the calculated emission rates (QVRPM) to actual emission rates (Q). Under an ideal condition of having mean wind directions mostly normal to a downwind vertical plane, the average VRPM accuracy was 0.77±0.32. However, when mean wind direction was mostly not normal to the downwind vertical plane, the emission plume was not adequately captured resulting in lower accuracies. The accuracies of these nonideal wind conditions could be significantly improved if we relaxed the VRPM wind direction criteria and combined the emission rates determined from two adjacent downwind vertical planes surrounding the lagoon. With this modification, the VRPM accuracy improved to 0.97±0.44, whereas the number of valid data sets also increased from 113 to 186. The need for developing accurate and feasible measuring techniques for fugitive gas emission from animal waste lagoons is vital for livestock gas inventories and implementation of mitigation strategies. This field lagoon gas emission study demonstrated that the EPA's vertical radial plume mapping (VRPM) technique can be used to accurately measure lagoon gas emission with two downwind vertical concentration planes surrounding the lagoon.
New type of chaos synchronization in discrete-time systems: the F-M synchronization
NASA Astrophysics Data System (ADS)
Ouannas, Adel; Grassi, Giuseppe; Karouma, Abdulrahman; Ziar, Toufik; Wang, Xiong; Pham, Viet-Thanh
2018-04-01
In this paper, a new type of synchronization for chaotic (hyperchaotic) maps with different dimensions is proposed. The novel scheme is called F - M synchronization, since it combines the inverse generalized synchronization (based on a functional relationship F) with the matrix projective synchronization (based on a matrix M). In particular, the proposed approach enables F - M synchronization with index d to be achieved between n-dimensional drive system map and m-dimensional response system map, where the synchronization index d corresponds to the dimension of the synchronization error. The technique, which exploits nonlinear controllers and Lyapunov stability theory, proves to be effective in achieving the F - M synchronization not only when the synchronization index d equals n or m, but even if the synchronization index d is larger than the map dimensions n and m. Finally, simulation results are reported, with the aim to illustrate the capabilities of the novel scheme proposed herein.
The art and science of weed mapping
Barnett, David T.; Stohlgren, Thomas J.; Jarnevich, Catherine S.; Chong, Geneva W.; Ericson, Jenny A.; Davern, Tracy R.; Simonson, Sara E.
2007-01-01
Land managers need cost-effective and informative tools for non-native plant species management. Many local, state, and federal agencies adopted mapping systems designed to collect comparable data for the early detection and monitoring of non-native species. We compared mapping information to statistically rigorous, plot-based methods to better understand the benefits and compatibility of the two techniques. Mapping non-native species locations provided a species list, associated species distributions, and infested area for subjectively selected survey sites. The value of this information may be compromised by crude estimates of cover and incomplete or biased estimations of species distributions. Incorporating plot-based assessments guided by a stratified-random sample design provided a less biased description of non-native species distributions and increased the comparability of data over time and across regions for the inventory, monitoring, and management of non-native and native plant species.
Tannure, Meire Chucre; Salgado, Patrícia de Oliveira; Chianca, Tânia Couto Machado
2014-01-01
This descriptive study aimed at elaborating nursing diagnostic labels according to ICNP®; conducting a cross-mapping between the diagnostic formulations and the diagnostic labels of NANDA-I; identifying the diagnostic labels thus obtained that were also listed in the NANDA-I; and mapping them according to Basic Human Needs. The workshop technique was applied to 32 intensive care nurses, the cross-mapping and validation based on agreement with experts. The workshop produced 1665 diagnostic labels which were further refined into 120 labels. They were then submitted to a cross-mapping process with both NANDA-I diagnostic labels and the Basic Human Needs. The mapping results underwent content validation by two expert nurses leading to concordance rates of 92% and 100%. It was found that 63 labels were listed in NANDA-I and 47 were not.
Li, Zhifei; Qin, Dongliang
2014-01-01
In defense related programs, the use of capability-based analysis, design, and acquisition has been significant. In order to confront one of the most challenging features of a huge design space in capability based analysis (CBA), a literature review of design space exploration was first examined. Then, in the process of an aerospace system of systems design space exploration, a bilayer mapping method was put forward, based on the existing experimental and operating data. Finally, the feasibility of the foregoing approach was demonstrated with an illustrative example. With the data mining RST (rough sets theory) and SOM (self-organized mapping) techniques, the alternative to the aerospace system of systems architecture was mapping from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation. PMID:24790572
Li, Zhifei; Qin, Dongliang; Yang, Feng
2014-01-01
In defense related programs, the use of capability-based analysis, design, and acquisition has been significant. In order to confront one of the most challenging features of a huge design space in capability based analysis (CBA), a literature review of design space exploration was first examined. Then, in the process of an aerospace system of systems design space exploration, a bilayer mapping method was put forward, based on the existing experimental and operating data. Finally, the feasibility of the foregoing approach was demonstrated with an illustrative example. With the data mining RST (rough sets theory) and SOM (self-organized mapping) techniques, the alternative to the aerospace system of systems architecture was mapping from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation.
Graphical approach for multiple values logic minimization
NASA Astrophysics Data System (ADS)
Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.
1999-03-01
Multiple valued logic (MVL) is sought for designing high complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system is dependent on optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL logic optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.
Enrichment of OpenStreetMap Data Completeness with Sidewalk Geometries Using Data Mining Techniques.
Mobasheri, Amin; Huang, Haosheng; Degrossi, Lívia Castro; Zipf, Alexander
2018-02-08
Tailored routing and navigation services utilized by wheelchair users require certain information about sidewalk geometries and their attributes to execute efficiently. Except some minor regions/cities, such detailed information is not present in current versions of crowdsourced mapping databases including OpenStreetMap. CAP4Access European project aimed to use (and enrich) OpenStreetMap for making it fit to the purpose of wheelchair routing. In this respect, this study presents a modified methodology based on data mining techniques for constructing sidewalk geometries using multiple GPS traces collected by wheelchair users during an urban travel experiment. The derived sidewalk geometries can be used to enrich OpenStreetMap to support wheelchair routing. The proposed method was applied to a case study in Heidelberg, Germany. The constructed sidewalk geometries were compared to an official reference dataset ("ground truth dataset"). The case study shows that the constructed sidewalk network overlays with 96% of the official reference dataset. Furthermore, in terms of positional accuracy, a low Root Mean Square Error (RMSE) value (0.93 m) is achieved. The article presents our discussion on the results as well as the conclusion and future research directions.
Remote Sensing Sensors and Applications in Environmental Resources Mapping and Modelling
Melesse, Assefa M.; Weng, Qihao; S.Thenkabail, Prasad; Senay, Gabriel B.
2007-01-01
The history of remote sensing and development of different sensors for environmental and natural resources mapping and data acquisition is reviewed and reported. Application examples in urban studies, hydrological modeling such as land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, surface energy flux and micro-topography correlation studies is discussed. The review also discusses the use of remotely sensed-based rainfall and potential evapotranspiration for estimating crop water requirement satisfaction index and hence provides early warning information for growers. The review is not an exhaustive application of the remote sensing techniques rather a summary of some important applications in environmental studies and modeling. PMID:28903290
NASA Astrophysics Data System (ADS)
Semenishchev, E. A.; Marchuk, V. I.; Fedosov, V. P.; Stradanchenko, S. G.; Ruslyakov, D. V.
2015-05-01
This work aimed to study computationally simple method of saliency map calculation. Research in this field received increasing interest for the use of complex techniques in portable devices. A saliency map allows increasing the speed of many subsequent algorithms and reducing the computational complexity. The proposed method of saliency map detection based on both image and frequency space analysis. Several examples of test image from the Kodak dataset with different detalisation considered in this paper demonstrate the effectiveness of the proposed approach. We present experiments which show that the proposed method providing better results than the framework Salience Toolbox in terms of accuracy and speed.
Anomaly Detection for Beam Loss Maps in the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja
2017-07-01
In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts with such methods. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
Testing statistical isotropy in cosmic microwave background polarization maps
NASA Astrophysics Data System (ADS)
Rath, Pranati K.; Samal, Pramoda Kumar; Panda, Srikanta; Mishra, Debesh D.; Aluri, Pavan K.
2018-04-01
We apply our symmetry based Power tensor technique to test conformity of PLANCK Polarization maps with statistical isotropy. On a wide range of angular scales (l = 40 - 150), our preliminary analysis detects many statistically anisotropic multipoles in foreground cleaned full sky PLANCK polarization maps viz., COMMANDER and NILC. We also study the effect of residual foregrounds that may still be present in the Galactic plane using both common UPB77 polarization mask, as well as the individual component separation method specific polarization masks. However, some of the statistically anisotropic modes still persist, albeit significantly in NILC map. We further probed the data for any coherent alignments across multipoles in several bins from the chosen multipole range.
NASA Astrophysics Data System (ADS)
Zuo, Chao; Chen, Qian; Gu, Guohua; Feng, Shijie; Feng, Fangxiaoyu; Li, Rubin; Shen, Guochen
2013-08-01
This paper introduces a high-speed three-dimensional (3-D) shape measurement technique for dynamic scenes by using bi-frequency tripolar pulse-width-modulation (TPWM) fringe projection. Two wrapped phase maps with different wavelengths can be obtained simultaneously by our bi-frequency phase-shifting algorithm. Then the two phase maps are unwrapped using a simple look-up-table based number-theoretical approach. To guarantee the robustness of phase unwrapping as well as the high sinusoidality of projected patterns, TPWM technique is employed to generate ideal fringe patterns with slight defocus. We detailed our technique, including its principle, pattern design, and system setup. Several experiments on dynamic scenes were performed, verifying that our method can achieve a speed of 1250 frames per second for fast, dense, and accurate 3-D measurements.
NASA Astrophysics Data System (ADS)
Qin, Y.; Lu, P.; Li, Z.
2018-04-01
Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, such method is labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability over different study areas and data, and there is large room for improvement in terms of the accuracy and automation degree. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: 1) change detection-based multi-threshold for training samples generation and 2) MRF for landslide inventory mapping. Compared with the previous methods, the proposed method in this study has three advantages: 1) it combines multiple image difference techniques with multi-threshold method to generate reliable training samples; 2) it takes the spectral characteristics of landslides into account; and 3) it is highly automatic with little parameter tuning. The proposed method was applied for regional landslides mapping from 10 m Sentinel-2 images in Western China. Results corroborated the effectiveness and applicability of the proposed method especially the capability of rapid landslide mapping. Some directions for future research are offered. This study to our knowledge is the first attempt to map landslides from free and medium resolution satellite (i.e., Sentinel-2) images in China.
Planetary Geologic Mapping Handbook - 2010. Appendix
NASA Technical Reports Server (NTRS)
Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.
2010-01-01
Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of projectspecific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well. Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically. As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.
Application of sensitivity-analysis techniques to the calculation of topological quantities
NASA Astrophysics Data System (ADS)
Gilchrist, Stuart
2017-08-01
Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient in the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of ``sensitivity-analysis''. The method is based on the calculation of a three dimensional field-line mapping Jacobian from which all the present topological quantities of interest can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement the algorithm.
2014-03-27
fidelity. This pairing is accomplished through the use of a space mapping technique, which is a process where the design space of a lower fidelity model...is aligned a higher fidelity model. The intent of applying space mapping techniques to the field of surrogate construction is to leverage the
Three-dimensional mapping of the local interstellar medium with composite data
NASA Astrophysics Data System (ADS)
Capitanio, L.; Lallement, R.; Vergely, J. L.; Elyajouri, M.; Monreal-Ibero, A.
2017-10-01
Context. Three-dimensional maps of the Galactic interstellar medium are general astrophysical tools. Reddening maps may be based on the inversion of color excess measurements for individual target stars or on statistical methods using stellar surveys. Three-dimensional maps based on diffuse interstellar bands (DIBs) have also been produced. All methods benefit from the advent of massive surveys and may benefit from Gaia data. Aims: All of the various methods and databases have their own advantages and limitations. Here we present a first attempt to combine different datasets and methods to improve the local maps. Methods: We first updated our previous local dust maps based on a regularized Bayesian inversion of individual color excess data by replacing Hipparcos or photometric distances with Gaia Data Release 1 values when available. Secondly, we complemented this database with a series of ≃5000 color excess values estimated from the strength of the λ15273 DIB toward stars possessing a Gaia parallax. The DIB strengths were extracted from SDSS/APOGEE spectra. Third, we computed a low-resolution map based on a grid of Pan-STARRS reddening measurements by means of a new hierarchical technique and used this map as the prior distribution during the inversion of the two other datasets. Results: The use of Gaia parallaxes introduces significant changes in some areas and globally increases the compactness of the structures. Additional DIB-based data make it possible to assign distances to clouds located behind closer opaque structures and do not introduce contradictory information for the close structures. A more realistic prior distribution instead of a plane-parallel homogeneous distribution helps better define the structures. We validated the results through comparisons with other maps and with soft X-ray data. Conclusions: Our study demonstrates that the combination of various tracers is a potential tool for more accurate maps. An online tool makes it possible to retrieve maps and reddening estimations. Our online tool is available at http://stilism.obspm.fr
Burch, Matthew J.; Fancher, Chris M.; Patala, Srikanth; ...
2016-11-18
A novel technique, which directly and nondestructively maps polar domains using electron backscatter diffraction (EBSD) is described and demonstrated. Through dynamical diffraction simulations and quantitative comparison to experimental EBSD patterns, the absolute orientation of a non-centrosymmetric crystal can be determined. With this information, the polar domains of a material can be mapped. The technique is demonstrated by mapping the non-ferroelastic, or 180°, ferroelectric domains in periodically poled LiNbO 3 single crystals. Furthermore, the authors demonstrate the possibility of mapping polarity using this technique in other polar materials system.
Flattening maps for the visualization of multibranched vessels.
Zhu, Lei; Haker, Steven; Tannenbaum, Allen
2005-02-01
In this paper, we present two novel algorithms which produce flattened visualizations of branched physiological surfaces, such as vessels. The first approach is a conformal mapping algorithm based on the minimization of two Dirichlet functionals. From a triangulated representation of vessel surfaces, we show how the algorithm can be implemented using a finite element technique. The second method is an algorithm which adjusts the conformal mapping to produce a flattened representation of the original surface while preserving areas. This approach employs the theory of optimal mass transport. Furthermore, a new way of extracting center lines for vessel fly-throughs is provided.
Flattening Maps for the Visualization of Multibranched Vessels
Zhu, Lei; Haker, Steven; Tannenbaum, Allen
2013-01-01
In this paper, we present two novel algorithms which produce flattened visualizations of branched physiological surfaces, such as vessels. The first approach is a conformal mapping algorithm based on the minimization of two Dirichlet functionals. From a triangulated representation of vessel surfaces, we show how the algorithm can be implemented using a finite element technique. The second method is an algorithm which adjusts the conformal mapping to produce a flattened representation of the original surface while preserving areas. This approach employs the theory of optimal mass transport. Furthermore, a new way of extracting center lines for vessel fly-throughs is provided. PMID:15707245
NASA Astrophysics Data System (ADS)
Hofierka, Jaroslav; Gallay, Michal; Bandura, Peter; Šašak, Ján
2018-05-01
Karst sinkholes (dolines) play an important role in a karst landscape by controlling infiltration of surficial water, air flow or spatial distribution of solar energy. These landforms also present a limiting factor for human activities in agriculture or construction. Therefore, mapping such geomorphological forms is vital for appropriate landscape management and planning. There are several mapping techniques available; however, their applicability can be reduced in densely forested areas with poor accessibility and visibility of the landforms. In such conditions, airborne laser scanning (ALS) provides means for efficient and accurate mapping of both land and landscape canopy surfaces. Taking the benefits of ALS into account, we present an innovative method for identification and evaluation of karst sinkholes based on numerical water flow modelling. The suggested method was compared to traditional techniques for sinkhole mapping which use topographic maps and digital terrain modelling. The approach based on simulation of a rainfall event very closely matched the reference datasets derived by manual inspection of the ALS digital elevation model and field surveys. However, our process-based approach provides advantage of assessing the magnitude how sinkholes influence concentration of overland water flow during extreme rainfall events. This was performed by calculating the volume of water accumulated in sinkholes during the simulated rainfall. In this way, the influence of particular sinkholes on underground geomorphological systems can be assessed. The method was demonstrated in a case study of Slovak Karst in the West Carpathians where extreme rainfalls or snow-thaw events occur annually. We identified three spatially contiguous groups of sinkholes with a different effect on overland flow concentration. These results are discussed in relation to the known underground hydrological systems.
NaviCell Web Service for network-based data visualization.
Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei
2015-07-01
Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
NaviCell Web Service for network-based data visualization
Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei
2015-01-01
Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of ‘omics’ data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. PMID:25958393
Erasing the Milky Way: new cleaning technique applied to GBT intensity mapping data
NASA Astrophysics Data System (ADS)
Wolz, L.; Blake, C.; Abdalla, F. B.; Anderson, C. J.; Chang, T.-C.; Li, Y.-C.; Masui, K. W.; Switzer, E.; Pen, U.-L.; Voytek, T. C.; Yadav, J.
2017-02-01
We present the first application of a new foreground removal pipeline to the current leading H I intensity mapping data set, obtained by the Green Bank Telescope (GBT). We study the 15- and 1-h-field data of the GBT observations previously presented in Mausui et al. and Switzer et al., covering about 41 deg2 at 0.6 < z < 1.0, for which cross-correlations may be measured with the galaxy distribution of the WiggleZ Dark Energy Survey. In the presented pipeline, we subtract the Galactic foreground continuum and the point-source contamination using an independent component analysis technique (FASTICA), and develop a Fourier-based optimal estimator to compute the temperature power spectrum of the intensity maps and cross-correlation with the galaxy survey data. We show that FASTICA is a reliable tool to subtract diffuse and point-source emission through the non-Gaussian nature of their probability distributions. The temperature power spectra of the intensity maps are dominated by instrumental noise on small scales which FASTICA, as a conservative subtraction technique of non-Gaussian signals, cannot mitigate. However, we determine similar GBT-WiggleZ cross-correlation measurements to those obtained by the singular value decomposition (SVD) method, and confirm that foreground subtraction with FASTICA is robust against 21 cm signal loss, as seen by the converged amplitude of these cross-correlation measurements. We conclude that SVD and FASTICA are complementary methods to investigate the foregrounds and noise systematics present in intensity mapping data sets.
Texture mapping via optimal mass transport.
Dominitz, Ayelet; Tannenbaum, Allen
2010-01-01
In this paper, we present a novel method for texture mapping of closed surfaces. Our method is based on the technique of optimal mass transport (also known as the "earth-mover's metric"). This is a classical problem that concerns determining the optimal way, in the sense of minimal transportation cost, of moving a pile of soil from one site to another. In our context, the resulting mapping is area preserving and minimizes angle distortion in the optimal mass sense. Indeed, we first begin with an angle-preserving mapping (which may greatly distort area) and then correct it using the mass transport procedure derived via a certain gradient flow. In order to obtain fast convergence to the optimal mapping, we incorporate a multiresolution scheme into our flow. We also use ideas from discrete exterior calculus in our computations.
Roland, Jarod L; Griffin, Natalie; Hacker, Carl D; Vellimana, Ananth K; Akbari, S Hassan; Shimony, Joshua S; Smyth, Matthew D; Leuthardt, Eric C; Limbrick, David D
2017-12-01
OBJECTIVE Cerebral mapping for surgical planning and operative guidance is a challenging task in neurosurgery. Pediatric patients are often poor candidates for many modern mapping techniques because of inability to cooperate due to their immature age, cognitive deficits, or other factors. Resting-state functional MRI (rs-fMRI) is uniquely suited to benefit pediatric patients because it is inherently noninvasive and does not require task performance or significant cooperation. Recent advances in the field have made mapping cerebral networks possible on an individual basis for use in clinical decision making. The authors present their initial experience translating rs-fMRI into clinical practice for surgical planning in pediatric patients. METHODS The authors retrospectively reviewed cases in which the rs-fMRI analysis technique was used prior to craniotomy in pediatric patients undergoing surgery in their institution. Resting-state analysis was performed using a previously trained machine-learning algorithm for identification of resting-state networks on an individual basis. Network maps were uploaded to the clinical imaging and surgical navigation systems. Patient demographic and clinical characteristics, including need for sedation during imaging and use of task-based fMRI, were also recorded. RESULTS Twenty patients underwent rs-fMRI prior to craniotomy between December 2013 and June 2016. Their ages ranged from 1.9 to 18.4 years, and 12 were male. Five of the 20 patients also underwent task-based fMRI and one underwent awake craniotomy. Six patients required sedation to tolerate MRI acquisition, including resting-state sequences. Exemplar cases are presented including anatomical and resting-state functional imaging. CONCLUSIONS Resting-state fMRI is a rapidly advancing field of study allowing for whole brain analysis by a noninvasive modality. It is applicable to a wide range of patients and effective even under general anesthesia. The nature of resting-state analysis precludes any need for task cooperation. These features make rs-fMRI an ideal technology for cerebral mapping in pediatric neurosurgical patients. This review of the use of rs-fMRI mapping in an initial pediatric case series demonstrates the feasibility of utilizing this technique in pediatric neurosurgical patients. The preliminary experience presented here is a first step in translating this technique to a broader clinical practice.
A computational visual saliency model based on statistics and machine learning.
Lin, Ru-Je; Lin, Wei-Song
2014-08-01
Identifying the type of stimuli that attracts human visual attention has been an appealing topic for scientists for many years. In particular, marking the salient regions in images is useful for both psychologists and many computer vision applications. In this paper, we propose a computational approach for producing saliency maps using statistics and machine learning methods. Based on four assumptions, three properties (Feature-Prior, Position-Prior, and Feature-Distribution) can be derived and combined by a simple intersection operation to obtain a saliency map. These properties are implemented by a similarity computation, support vector regression (SVR) technique, statistical analysis of training samples, and information theory using low-level features. This technique is able to learn the preferences of human visual behavior while simultaneously considering feature uniqueness. Experimental results show that our approach performs better in predicting human visual attention regions than 12 other models in two test databases. © 2014 ARVO.
Effects of partitioning and scheduling sparse matrix factorization on communication and load balance
NASA Technical Reports Server (NTRS)
Venugopal, Sesh; Naik, Vijay K.
1991-01-01
A block based, automatic partitioning and scheduling methodology is presented for sparse matrix factorization on distributed memory systems. Using experimental results, this technique is analyzed for communication and load imbalance overhead. To study the performance effects, these overheads were compared with those obtained from a straightforward 'wrap mapped' column assignment scheme. All experimental results were obtained using test sparse matrices from the Harwell-Boeing data set. The results show that there is a communication and load balance tradeoff. The block based method results in lower communication cost whereas the wrap mapped scheme gives better load balance.
Respiration-induced movement correlation for synchronous noninvasive renal cancer surgery.
Abhilash, Rakkunedeth H; Chauhan, Sunita
2012-07-01
Noninvasive surgery (NIS), such as high-intensity focused ultrasound (HIFU)-based ablation or radiosurgery, is used for treating tumors and cancers in various parts of the body. The soft tissue targets (usually organs) deform and move as a result of physiological processes such as respiration. Moreover, other deformations induced during surgery by changes in patient position, changes in physical properties caused by repeated exposures and uncertainties resulting from cavitation also occur. In this paper, we present a correlation-based movement prediction technique to address respiration-induced movement of the urological organs while targeting through extracorporeal trans-abdominal route access. Among other organs, kidneys are worst affected during respiratory cycles, with significant three-dimensional displacements observed on the order of 20 mm. Remote access to renal targets such as renal carcinomas and cysts during noninvasive surgery, therefore, requires a tightly controlled real-time motion tracking and quantitative estimate for compensation routine to synchronize the energy source(s) for precise energy delivery to the intended regions. The correlation model finds a mapping between the movement patterns of external skin markers placed on the abdominal access window and the internal movement of the targeted kidney. The coarse estimate of position is then fine-tuned using the Adaptive Neuro-Fuzzy Inference System (ANFIS), thereby achieving a nonlinear mapping. The technical issues involved in this tracking scheme are threefold: the model must have sufficient accuracy in mapping the movement pattern; there must be an image-based tracking scheme to provide the organ position within allowable system latency; and the processing delay resulting from modeling and tracking must be within the achievable prediction horizon to accommodate the latency in the therapeutic delivery system. The concept was tested on ultrasound image sequences collected from 20 healthy volunteers. The results indicate that the modeling technique can be practically integrated into an image-guided noninvasive robotic surgical system with an indicative targeting accuracy of more than 94%. A comparative analysis showed the superiority of this technique over conventional linear mapping and modelfree blind search techniques.
Complete Bouguer gravity map of the Medicine Lake Quadrangle, California
Finn, C.
1981-01-01
A mathematical technique, called kriging, was programmed for a computer to interpolate hydrologic data based on a network of measured values in west-central Kansas. The computer program generated estimated values at the center of each 1-mile section in the Western Kansas Groundwater Management District No. 1 and facilitated contouring of selected values that are needed in the effective management of ground water for irrigation. The kriging technique produced objective and reproducible maps that illustrated hydrologic conditions in the Ogallala aquifer, the principal source of water in west-central Kansas. Maps of the aquifer, which use a 3-year average, included the 1978-80 water-table altitudes, which ranged from about 2,580 to 3,720 feet; the 1978-80 saturated thicknesses, which ranged from about 0 to 250 feet; and the percentage changes in saturated thickness from 1950 to 1978-80, which ranged from about a 50-percent increase to a 100-percent decrease. A map showing errors of estimate also was provided as a measure of reliability for the 1978-80 water-table altitudes. Errors of estimate ranged from 2 to 24 feet. (USGS)
Solar thematic maps for space weather operations
Rigler, E. Joshua; Hill, Steven M.; Reinard, Alysha A.; Steenburgh, Robert A.
2012-01-01
Thematic maps are arrays of labels, or "themes", associated with discrete locations in space and time. Borrowing heavily from the terrestrial remote sensing discipline, a numerical technique based on Bayes' theorem captures operational expertise in the form of trained theme statistics, then uses this to automatically assign labels to solar image pixels. Ultimately, regular thematic maps of the solar corona will be generated from high-cadence, high-resolution SUVI images, the solar ultraviolet imager slated to fly on NOAA's next-generation GOES-R series of satellites starting ~2016. These thematic maps will not only provide quicker, more consistent synoptic views of the sun for space weather forecasters, but digital thematic pixel masks (e.g., coronal hole, active region, flare, etc.), necessary for a new generation of operational solar data products, will be generated. This paper presents the mathematical underpinnings of our thematic mapper, as well as some practical algorithmic considerations. Then, using images from the Solar Dynamics Observatory (SDO) Advanced Imaging Array (AIA) as test data, it presents results from validation experiments designed to ascertain the robustness of the technique with respect to differing expert opinions and changing solar conditions.
Zhang, Kaiming; Keane, Sarah C; Su, Zhaoming; Irobalieva, Rossitza N; Chen, Muyuan; Van, Verna; Sciandra, Carly A; Marchant, Jan; Heng, Xiao; Schmid, Michael F; Case, David A; Ludtke, Steven J; Summers, Michael F; Chiu, Wah
2018-03-06
Cryoelectron microscopy (cryo-EM) and nuclear magnetic resonance (NMR) spectroscopy are routinely used to determine structures of macromolecules with molecular weights over 65 and under 25 kDa, respectively. We combined these techniques to study a 30 kDa HIV-1 dimer initiation site RNA ([DIS] 2 ; 47 nt/strand). A 9 Å cryo-EM map clearly shows major groove features of the double helix and a right-handed superhelical twist. Simulated cryo-EM maps generated from time-averaged molecular dynamics trajectories (10 ns) exhibited levels of detail similar to those in the experimental maps, suggesting internal structural flexibility limits the cryo-EM resolution. Simultaneous inclusion of the cryo-EM map and 2 H-edited NMR-derived distance restraints during structure refinement generates a structure consistent with both datasets and supporting a flipped-out base within a conserved purine-rich bulge. Our findings demonstrate the power of combining global and local structural information from these techniques for structure determination of modest-sized RNAs. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mafanya, Madodomzi; Tsele, Philemon; Botai, Joel; Manyama, Phetole; Swart, Barend; Monate, Thabang
2017-07-01
Invasive alien plants (IAPs) not only pose a serious threat to biodiversity and water resources but also have impacts on human and animal wellbeing. To support decision making in IAPs monitoring, semi-automated image classifiers which are capable of extracting valuable information in remotely sensed data are vital. This study evaluated the mapping accuracies of supervised and unsupervised image classifiers for mapping Harrisia pomanensis (a cactus plant commonly known as the Midnight Lady) using two interlinked evaluation strategies i.e. point and area based accuracy assessment. Results of the point-based accuracy assessment show that with reference to 219 ground control points, the supervised image classifiers (i.e. Maxver and Bhattacharya) mapped H. pomanensis better than the unsupervised image classifiers (i.e. K-mediuns, Euclidian Length and Isoseg). In this regard, user and producer accuracies were 82.4% and 84% respectively for the Maxver classifier. The user and producer accuracies for the Bhattacharya classifier were 90% and 95.7%, respectively. Though the Maxver produced a higher overall accuracy and Kappa estimate than the Bhattacharya classifier, the Maxver Kappa estimate of 0.8305 is not significantly (statistically) greater than the Bhattacharya Kappa estimate of 0.8088 at a 95% confidence interval. The area based accuracy assessment results show that the Bhattacharya classifier estimated the spatial extent of H. pomanensis with an average mapping accuracy of 86.1% whereas the Maxver classifier only gave an average mapping accuracy of 65.2%. Based on these results, the Bhattacharya classifier is therefore recommended for mapping H. pomanensis. These findings will aid in the algorithm choice making for the development of a semi-automated image classification system for mapping IAPs.
Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Xie, Dong; Yang, Yixian
2015-06-01
The Telecare Medicine Information Systems (TMISs) provide an efficient communicating platform supporting the patients access health-care delivery services via internet or mobile networks. Authentication becomes an essential need when a remote patient logins into the telecare server. Recently, many extended chaotic maps based authentication schemes using smart cards for TMISs have been proposed. Li et al. proposed a secure smart cards based authentication scheme for TMISs using extended chaotic maps based on Lee's and Jiang et al.'s scheme. In this study, we show that Li et al.'s scheme has still some weaknesses such as violation the session key security, vulnerability to user impersonation attack and lack of local verification. To conquer these flaws, we propose a chaotic maps and smart cards based password authentication scheme by applying biometrics technique and hash function operations. Through the informal and formal security analyses, we demonstrate that our scheme is resilient possible known attacks including the attacks found in Li et al.'s scheme. As compared with the previous authentication schemes, the proposed scheme is more secure and efficient and hence more practical for telemedical environments.
Non-invasive imaging of barriers to drug delivery in tumors.
Hassid, Yaron; Eyal, Erez; Margalit, Raanan; Furman-Haran, Edna; Degani, Hadassa
2008-08-01
Solid tumors often develop high interstitial fluid pressure (IFP) as a result of increased water leakage and impaired lymphatic drainage, as well as changes in the extracellular matrix composition and elasticity. This high fluid pressure forms a barrier to drug delivery and hence, resistance to therapy. We have developed techniques based on contrast enhanced magnetic resonance imaging for mapping in tumors the vascular and transport parameters determining the delivery efficiency of blood borne substances. Sequential images are recorded during continuous infusion of a Gd-based contrast agent and analyzed according to a new physiological model, yielding maps of microvascular transfer constants, as well as outward convective interstitial transfer constants and steady state interstitial contrast agent concentrations both reflecting IFP distribution. We further demonstrated in non small cell human lung cancer xenografts the capability of our techniques to monitor in vivo collagenase induced increase in contrast agent delivery as a result of decreased IFP. These techniques can be applied to test drugs that affect angiogenesis and modulate interstitial fluid pressure and has the potential to be extended to cancer patients for assessing resistance to drug delivery.
Non-Invasive Imaging of Barriers to Drug Delivery in Tumors
Hassid, Yaron; Eyal, Erez; Margalit, Raanan; Furman-Haran, Edna; Degani, Hadassa
2011-01-01
Solid tumors often develop high interstitial fluid pressure (IFP) as a result of increased water leakage and impaired lymphatic drainage, as well as changes in the extracellular matrix composition and elasticity. This high fluid pressure forms a barrier to drug delivery and hence, resistance to therapy. We have developed techniques based on contrast enhanced magnetic resonance imaging for mapping in tumors the vascular and transport parameters determining the delivery efficiency of blood borne substances. Sequential images are recorded during continuous infusion of a Gd-based contrast agent and analyzed according to a new physiological model, yielding maps of microvascular transfer constants, as well as outward convective interstitial transfer constants and steady state interstitial contrast agent concentrations both reflecting IFP distribution. We further demonstrated in non small cell human lung cancer xenografts the capability of our techniques to monitor in vivo collagenase induced increase in contrast agent delivery as a result of decreased IFP. These techniques can be applied to test drugs that affect angiogenesis and modulate interstitial fluid pressure and has the potential to be extended to cancer patients for assessing resistance to drug delivery. PMID:18638494
NASA Astrophysics Data System (ADS)
Liu, Ping; Hall-Aquitania, Moorea; Hermens, Erma; Groves, Roger M.
2017-07-01
Optical diagnostics techniques are becoming important for technical art history (TAH) as well as for heritage conservation. In recent years, optical coherence tomography (OCT) has been increasingly used as a novel technique for the inspection of artwork, revealing the stratigraphy of paintings. It has also shown to be an effective tool for vanish layer inspection. OCT is a contactless and non-destructive technique for microstructural imaging of turbid media, originally developed for medical applications. However current OCT instruments have difficulty in paint layer inspection due to the opacity of most pigments. This paper explores the potential of OCT for the investigation of paintings with coloured grounds. Depth scans were processed to determine the light penetration depth at the optical wavelength based on a 1/e light attenuation calculation. The variation in paint opacity was mapped based on the microstructural images and 3D penetration depth profiles was calculated and related back to the construction of the artwork. By determining the light penetration depth over a range of wavelengths the 3D depth perception of a painting with coloured grounds can be characterized optically.
Applied learning-based color tone mapping for face recognition in video surveillance system
NASA Astrophysics Data System (ADS)
Yew, Chuu Tian; Suandi, Shahrel Azmin
2012-04-01
In this paper, we present an applied learning-based color tone mapping technique for video surveillance system. This technique can be applied onto both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appeared in the surveillance images, and remap the color or intensity of the input image so that the color or intensity statistics match those in the training dataset. It is well known that the difference in commercial surveillance cameras models, and signal processing chipsets used by different manufacturers will cause the color and intensity of the images to differ from one another, thus creating additional challenges for face recognition in video surveillance system. Using Multi-Class Support Vector Machines as the classifier on a publicly available video surveillance camera database, namely SCface database, this approach is validated and compared to the results of using holistic approach on grayscale images. The results show that this technique is suitable to improve the color or intensity quality of video surveillance system for face recognition.
Operational monitoring of land-cover change using multitemporal remote sensing data
NASA Astrophysics Data System (ADS)
Rogan, John
2005-11-01
Land-cover change, manifested as either land-cover modification and/or conversion, can occur at all spatial scales, and changes at local scales can have profound, cumulative impacts at broader scales. The implication of operational land-cover monitoring is that researchers have access to a continuous stream of remote sensing data, with the long term goal of providing for consistent and repetitive mapping. Effective large area monitoring of land-cover (i.e., >1000 km2) can only be accomplished by using remotely sensed images as an indirect data source in land-cover change mapping and as a source for land-cover change model projections. Large area monitoring programs face several challenges: (1) choice of appropriate classification scheme/map legend over large, topographically and phenologically diverse areas; (2) issues concerning data consistency and map accuracy (i.e., calibration and validation); (3) very large data volumes; (4) time consuming data processing and interpretation. Therefore, this dissertation research broadly addresses these challenges in the context of examining state-of-the-art image pre-processing, spectral enhancement, classification, and accuracy assessment techniques to assist the California Land-cover Mapping and Monitoring Program (LCMMP). The results of this dissertation revealed that spatially varying haze can be effectively corrected from Landsat data for the purposes of change detection. The Multitemporal Spectral Mixture Analysis (MSMA) spectral enhancement technique produced more accurate land-cover maps than those derived from the Multitemporal Kauth Thomas (MKT) transformation in northern and southern California study areas. A comparison of machine learning classifiers showed that Fuzzy ARTMAP outperformed two classification tree algorithms, based on map accuracy and algorithm robustness. Variation in spatial data error (positional and thematic) was explored in relation to environmental variables using geostatistical interpolation techniques. Finally, the land-cover modification maps generated for three time intervals (1985--1990--1996--2000), with nine change-classes revealed important variations in land-cover gain and loss between northern and southern California study areas.
NASA Astrophysics Data System (ADS)
Pedersen, G. B. M.
2016-02-01
A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform elements boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.
,
1999-01-01
Currently, the U.S. Geological Survey (USGS) uses conventional lithographic printing techniques to produce paper copies of most of its mapping products. This practice is not economical for those products that are in low demand. With the advent of newer technologies, high-speed, large-format printers have been coupled with innovative computer software to turn digital map data into a printed map. It is now possible to store and retrieve data from vast geospatial data bases and print a map on an as-needed basis; that is, print on demand, thereby eliminating the need to warehouse an inventory of paper maps for which there is low demand. Using print-on-demand technology, the USGS is implementing map-on-demand (MOD) printing for certain infrequently requested maps. By providing MOD, the USGS can offer an alternative to traditional, large-volume printing and can improve its responsiveness to customers by giving them greater access to USGS scientific data in a format that otherwise might not be available.
NASA Astrophysics Data System (ADS)
Bardi, Federica; Frodella, William; Ciampalini, Andrea; Bianchini, Silvia; Del Ventisette, Chiara; Gigli, Giovanni; Fanti, Riccardo; Moretti, Sandro; Basile, Giuseppe; Casagli, Nicola
2014-10-01
The potential use of the integration of PSI (Persistent Scatterer Interferometry) and GB-InSAR (Ground-based Synthetic Aperture Radar Interferometry) for landslide hazard mitigation was evaluated for mapping and monitoring activities of the San Fratello landslide (Sicily, Italy). Intense and exceptional rainfall events are the main factors that triggered several slope movements in the study area, which is susceptible to landslides, because of its steep slopes and silty-clayey sedimentary cover. In the last three centuries, the town of San Fratello was affected by three large landslides, developed in different periods: the oldest one occurred in 1754, damaging the northeastern sector of the town; in 1922 a large landslide completely destroyed a wide area in the western hillside of the town. In this paper, the attention is focussed on the most recent landslide that occurred on 14 February 2010: in this case, the phenomenon produced the failure of a large sector of the eastern hillside, causing severe damages to buildings and infrastructures. In particular, several slow-moving rotational and translational slides occurred in the area, making it suitable to monitor ground instability through different InSAR techniques. PS-InSAR™ (permanent scatterers SAR interferometry) techniques, using ERS-1/ERS-2, ENVISAT, RADARSAT-1, and COSMO-SkyMed SAR images, were applied to analyze ground displacements during pre- and post-event phases. Moreover, during the post-event phase in March 2010, a GB-InSAR system, able to acquire data continuously every 14 min, was installed collecting ground displacement maps for a period of about three years, until March 2013. Through the integration of space-borne and ground-based data sets, ground deformation velocity maps were obtained, providing a more accurate delimitation of the February 2010 landslide boundary, with respect to the carried out traditional geomorphological field survey. The integration of GB-InSAR and PSI techniques proved to be very effective in landslide mapping in the San Fratello test site, representing a valid scientific support for local authorities and decision makers during the post-emergency management.
BowMapCL: Burrows-Wheeler Mapping on Multiple Heterogeneous Accelerators.
Nogueira, David; Tomas, Pedro; Roma, Nuno
2016-01-01
The computational demand of exact-search procedures has pressed the exploitation of parallel processing accelerators to reduce the execution time of many applications. However, this often imposes strict restrictions in terms of the problem size and implementation efforts, mainly due to their possibly distinct architectures. To circumvent this limitation, a new exact-search alignment tool (BowMapCL) based on the Burrows-Wheeler Transform and FM-Index is presented. Contrasting to other alternatives, BowMapCL is based on a unified implementation using OpenCL, allowing the exploitation of multiple and possibly different devices (e.g., NVIDIA, AMD/ATI, and Intel GPUs/APUs). Furthermore, to efficiently exploit such heterogeneous architectures, BowMapCL incorporates several techniques to promote its performance and scalability, including multiple buffering, work-queue task-distribution, and dynamic load-balancing, together with index partitioning, bit-encoding, and sampling. When compared with state-of-the-art tools, the attained results showed that BowMapCL (using a single GPU) is 2 × to 7.5 × faster than mainstream multi-threaded CPU BWT-based aligners, like Bowtie, BWA, and SOAP2; and up to 4 × faster than the best performing state-of-the-art GPU implementations (namely, SOAP3 and HPG-BWT). When multiple and completely distinct devices are considered, BowMapCL efficiently scales the offered throughput, ensuring a convenient load-balance of the involved processing in the several distinct devices.
NASA Technical Reports Server (NTRS)
Turner, B. J. (Principal Investigator)
1982-01-01
A user friendly front end was constructed to facilitate access to the LANDSAT mosaic data base supplied by JPL and to process both LANDSAT and ancillary data. Archieval and retrieval techniques were developed to efficiently handle this data base and make it compatible with requirements of the Pennsylvania Bureau of Forestry. Procedures are ready for: (1) forming the forest/nonforest mask in ORSER compressed map format using GSFC-supplied classification procedures; (2) registering data from a new scene (defoliated) to the mask (which may involve mosaicking if the area encompasses two LANDSAT scenes; (3) producing a masked new data set using the MASK program; (4) analyzing this data set to produce a map showing degrees of defoliation, output on the Versatec plotter; and (5) producing color composite maps by a diazo-type process.
Dynamic Analysis of the Carotid-Kundalini Map
NASA Astrophysics Data System (ADS)
Wang, Xingyuan; Liang, Qingyong; Meng, Juan
The nature of the fixed points of the Carotid-Kundalini (C-K) map was studied and the boundary equation of the first bifurcation of the C-K map in the parameter plane is presented. Using the quantitative criterion and rule of chaotic system, the paper reveals the general features of the C-K Map transforming from regularity to chaos. The following conclusions are obtained: (i) chaotic patterns of the C-K map may emerge out of double-periodic bifurcation; (ii) the chaotic crisis phenomena are found. At the same time, the authors analyzed the orbit of critical point of the complex C-K Map and put forward the definition of Mandelbrot-Julia set of the complex C-K Map. The authors generalized the Welstead and Cromer's periodic scanning technique and using this technology constructed a series of the Mandelbrot-Julia sets of the complex C-K Map. Based on the experimental mathematics method of combining the theory of analytic function of one complex variable with computer aided drawing, we investigated the symmetry of the Mandelbrot-Julia set and studied the topological inflexibility of distribution of the periodic region in the Mandelbrot set, and found that the Mandelbrot set contains abundant information of the structure of Julia sets by finding the whole portray of Julia sets based on Mandelbrot set qualitatively.
NASA Technical Reports Server (NTRS)
Colwell, R. N. (Principal Investigator); Hay, C. M.; Thomas, R. W.; Benson, A. S.
1976-01-01
The progress of research conducted in support of the Large Area Crop Inventory Experiment (LACIE) is documented. Specific tasks include (1) evaluation of the static stratification procedure and modification of that procedure if warranted, and (2) the development of alternative photointerpretative techniques to the present LACIE procedures for the identification and selection of training fields (areas).
NASA Technical Reports Server (NTRS)
Colwell, R. N. (Principal Investigator); Hay, C. M.; Thomas, R. W.; Benson, A. S.
1977-01-01
Progress in the evaluation of the static stratification procedure and the development of alternative photointerpretive techniques to the present LACIE procedure for the identification of training fields is reported. Statistically significant signature controlling variables were defined for use in refining the stratification procedure. A subset of the 1973-74 Kansas LACIE segments for wheat was analyzed.
Updating Landsat-derived land-cover maps using change detection and masking techniques
NASA Technical Reports Server (NTRS)
Likens, W.; Maw, K.
1982-01-01
The California Integrated Remote Sensing System's San Bernardino County Project was devised to study the utilization of a data base at a number of jurisdictional levels. The present paper discusses the implementation of change-detection and masking techniques in the updating of Landsat-derived land-cover maps. A baseline landcover classification was first created from a 1976 image, then the adjusted 1976 image was compared with a 1979 scene by the techniques of (1) multidate image classification, (2) difference image-distribution tails thresholding, (3) difference image classification, and (4) multi-dimensional chi-square analysis of a difference image. The union of the results of methods 1, 3 and 4 was used to create a mask of possible change areas between 1976 and 1979, which served to limit analysis of the update image and reduce comparison errors in unchanged areas. The techniques of spatial smoothing of change-detection products, and of combining results of difference change-detection algorithms are also shown to improve Landsat change-detection accuracies.
Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho
2017-01-01
Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.
NASA Technical Reports Server (NTRS)
Landgrebe, D.
1974-01-01
A broad study is described to evaluate a set of machine analysis and processing techniques applied to ERTS-1 data. Based on the analysis results in urban land use analysis and soil association mapping together with previously reported results in general earth surface feature identification and crop species classification, a profile of general applicability of this procedure is beginning to emerge. Put in the hands of a user who knows well the information needed from the data and also is familiar with the region to be analyzed it appears that significantly useful information can be generated by these methods. When supported by preprocessing techniques such as the geometric correction and temporal registration capabilities, final products readily useable by user agencies appear possible. In parallel with application, through further research, there is much potential for further development of these techniques both with regard to providing higher performance and in new situations not yet studied.
ACCELERATING MR PARAMETER MAPPING USING SPARSITY-PROMOTING REGULARIZATION IN PARAMETRIC DIMENSION
Velikina, Julia V.; Alexander, Andrew L.; Samsonov, Alexey
2013-01-01
MR parameter mapping requires sampling along additional (parametric) dimension, which often limits its clinical appeal due to a several-fold increase in scan times compared to conventional anatomic imaging. Data undersampling combined with parallel imaging is an attractive way to reduce scan time in such applications. However, inherent SNR penalties of parallel MRI due to noise amplification often limit its utility even at moderate acceleration factors, requiring regularization by prior knowledge. In this work, we propose a novel regularization strategy, which utilizes smoothness of signal evolution in the parametric dimension within compressed sensing framework (p-CS) to provide accurate and precise estimation of parametric maps from undersampled data. The performance of the method was demonstrated with variable flip angle T1 mapping and compared favorably to two representative reconstruction approaches, image space-based total variation regularization and an analytical model-based reconstruction. The proposed p-CS regularization was found to provide efficient suppression of noise amplification and preservation of parameter mapping accuracy without explicit utilization of analytical signal models. The developed method may facilitate acceleration of quantitative MRI techniques that are not suitable to model-based reconstruction because of complex signal models or when signal deviations from the expected analytical model exist. PMID:23213053
Semantics by analogy for illustrative volume visualization☆
Gerl, Moritz; Rautek, Peter; Isenberg, Tobias; Gröller, Eduard
2012-01-01
We present an interactive graphical approach for the explicit specification of semantics for volume visualization. This explicit and graphical specification of semantics for volumetric features allows us to visually assign meaning to both input and output parameters of the visualization mapping. This is in contrast to the implicit way of specifying semantics using transfer functions. In particular, we demonstrate how to realize a dynamic specification of semantics which allows to flexibly explore a wide range of mappings. Our approach is based on three concepts. First, we use semantic shader augmentation to automatically add rule-based rendering functionality to static visualization mappings in a shader program, while preserving the visual abstraction that the initial shader encodes. With this technique we extend recent developments that define a mapping between data attributes and visual attributes with rules, which are evaluated using fuzzy logic. Second, we let users define the semantics by analogy through brushing on renderings of the data attributes of interest. Third, the rules are specified graphically in an interface that provides visual clues for potential modifications. Together, the presented methods offer a high degree of freedom in the specification and exploration of rule-based mappings and avoid the limitations of a linguistic rule formulation. PMID:23576827
Active Voodoo Dolls: A Vision Based Input Device for Nonrigid Control.
1998-08-01
A vision based technique for nonrigid control is presented that can be used for animation and video game applications. The user grasps a soft...allowing the user to control it interactively. Our use of texture mapping hardware in tracking makes the system responsive enough for interactive animation and video game character control.
ERIC Educational Resources Information Center
Barker, Lauren N.; Ziino, Carlo
2010-01-01
This study aimed to produce indicators and guidelines for clinician use in determining whether individual therapy sessions for community rehabilitation services should be delivered in a home/community-based setting or centre-based setting within a flexible service delivery model. Concept mapping techniques as described by Tochrim and Kane (2005)…
Mogaji, Kehinde Anthony; Lim, Hwee San
2017-07-01
This study integrates the application of Dempster-Shafer-driven evidential belief function (DS-EBF) methodology with remote sensing and geographic information system techniques to analyze surface and subsurface data sets for the spatial prediction of groundwater potential in Perak Province, Malaysia. The study used additional data obtained from the records of the groundwater yield rate of approximately 28 bore well locations. The processed surface and subsurface data produced sets of groundwater potential conditioning factors (GPCFs) from which multiple surface hydrologic and subsurface hydrogeologic parameter thematic maps were generated. The bore well location inventories were partitioned randomly into a ratio of 70% (19 wells) for model training to 30% (9 wells) for model testing. Application results of the DS-EBF relationship model algorithms of the surface- and subsurface-based GPCF thematic maps and the bore well locations produced two groundwater potential prediction (GPP) maps based on surface hydrologic and subsurface hydrogeologic characteristics which established that more than 60% of the study area falling within the moderate-high groundwater potential zones and less than 35% falling within the low potential zones. The estimated uncertainty values within the range of 0 to 17% for the predicted potential zones were quantified using the uncertainty algorithm of the model. The validation results of the GPP maps using relative operating characteristic curve method yielded 80 and 68% success rates and 89 and 53% prediction rates for the subsurface hydrogeologic factor (SUHF)- and surface hydrologic factor (SHF)-based GPP maps, respectively. The study results revealed that the SUHF-based GPP map accurately delineated groundwater potential zones better than the SHF-based GPP map. However, significant information on the low degree of uncertainty of the predicted potential zones established the suitability of the two GPP maps for future development of groundwater resources in the area. The overall results proved the efficacy of the data mining model and the geospatial technology in groundwater potential mapping.
Technical Note: A 3-D rendering algorithm for electromechanical wave imaging of a beating heart.
Nauleau, Pierre; Melki, Lea; Wan, Elaine; Konofagou, Elisa
2017-09-01
Arrhythmias can be treated by ablating the heart tissue in the regions of abnormal contraction. The current clinical standard provides electroanatomic 3-D maps to visualize the electrical activation and locate the arrhythmogenic sources. However, the procedure is time-consuming and invasive. Electromechanical wave imaging is an ultrasound-based noninvasive technique that can provide 2-D maps of the electromechanical activation of the heart. In order to fully visualize the complex 3-D pattern of activation, several 2-D views are acquired and processed separately. They are then manually registered with a 3-D rendering software to generate a pseudo-3-D map. However, this last step is operator-dependent and time-consuming. This paper presents a method to generate a full 3-D map of the electromechanical activation using multiple 2-D images. Two canine models were considered to illustrate the method: one in normal sinus rhythm and one paced from the lateral region of the heart. Four standard echographic views of each canine heart were acquired. Electromechanical wave imaging was applied to generate four 2-D activation maps of the left ventricle. The radial positions and activation timings of the walls were automatically extracted from those maps. In each slice, from apex to base, these values were interpolated around the circumference to generate a full 3-D map. In both cases, a 3-D activation map and a cine-loop of the propagation of the electromechanical wave were automatically generated. The 3-D map showing the electromechanical activation timings overlaid on realistic anatomy assists with the visualization of the sources of earlier activation (which are potential arrhythmogenic sources). The earliest sources of activation corresponded to the expected ones: septum for the normal rhythm and lateral for the pacing case. The proposed technique provides, automatically, a 3-D electromechanical activation map with a realistic anatomy. This represents a step towards a noninvasive tool to efficiently localize arrhythmias in 3-D. © 2017 American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Goodwillie, A. M.
2015-12-01
We often demand information and data to be accessible over the web at no cost, and no longer do we expect to spend time labouriously compiling data from myriad sources with frustratingly-different formats. Instead, we increasingly expect convenience and consolidation. Recent advances in web-enabled technologies and cyberinfrastructure are answering those calls by providing data tools and resources that can transform undergraduate education. By freeing up valuable classroom time, students can focus upon gaining deeper insights and understanding from real-world data. GeoMapApp (http://www.geomapapp.org) is a map-based data discovery and visualisation tool developed at Lamont-Doherty Earth Observatory. GeoMapApp promotes U-Learning by working across all major computer platforms and functioning anywhere with internet connectivity, by lowering socio-economic barriers (it is free), by seamlessly integrating thousands of built-in research-grade data sets under intuitive menus, and by being adaptable to a range of learning environments - from lab sessions, group projects, and homework assignments to in-class pop-ups. GeoMapApp caters to casual and specialist users alike. Contours, artificial illumination, 3-D displays, data point manipulations, cross-sectional profiles, and other display techniques help students better grasp the content and geospatial context of data. Layering capabilities allow easy data set comparisons. The core functionality also applies to imported data sets: Student-collected data can thus be imported and analysed using the same techniques. A new Save Session function allows educators to preserve a pre-loaded state of GeoMapApp. When shared with a class, the saved file allows every student to open GeoMapApp at exactly the same starting point from which to begin their data explorations. Examples of built-in data sets include seafloor crustal age, earthquake locations and focal mechanisms, analytical geochemistry, ocean water physical properties, US and international geological maps, and satellite imagery. Student-generated data sets can be imported in Excel, ASCII, shapefile, and gridded format. Base maps can be saved for posters and publications. A wide range of undergraduate enquiry-driven education modules for GeoMapApp is already available at SERC.
Spatial Thinking and Visualisation of Real-World Concepts using GeoMapApp
NASA Astrophysics Data System (ADS)
Goodwillie, A. M.
2015-12-01
Commonly, geoscience data is presented to students in the lab and classroom in the form of data tables, maps and graphs. Successful data interpretation requires learners to become proficient with spatial thinking skills, allowing them to gain insight and understanding of the underlying real-world 3-D processes and concepts. Yet, educators at both the school and university level often witness students having difficulty in performing that translation. As a result, tools and resources that help to bridge that spatial capability gap can have useful application in the educational realm. A free, map-based data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory caters to students and teachers alike by providing a variety of data display and manipulation techniques that enhance geospatial awareness. Called GeoMapApp (http://www.geomapapp.org), the tool provides access to hundreds of built-in authentic geoscience data sets. Examples include earthquake and volcano data, geological maps, lithospheric plate boundary information, geochemical, oceanographic, and environmental data. Barriers to entry are lowered through easy installation, seamless integration of research-grade data sets, intuitive menus, and project-saving continuity. The default base map is a cutting-edge elevation model covering the oceans and land. Dynamic contouring, artificial illumination, 3-D visualisations, data point manipulations, cross-sectional profiles, and other display techniques help students grasp the content and geospatial context of data. Data sets can also be layered for easier comparison. Students may import their own data sets in Excel, ASCII, shapefile, and gridded format, and they can gain a sense of ownership by being able to tailor their data explorations and save their own projects. GeoMapApp is adaptable to a range of learning environments from lab sessions, group projects, and homework assignments to in-class pop-ups. A new Save Session function allows educators to preserve a pre-loaded state of GeoMapApp. When shared with a class, the saved file allows every student to open GeoMapApp at exactly the same starting point from which to begin their data explorations. A wide range of enquiry-driven education modules for GeoMapApp is already available at SERC.
Evidence for Crater Ejecta on Venus Tessera Terrain from Earth-Based Radar Images
NASA Technical Reports Server (NTRS)
Campbell, Bruce A.; Campbell, Donald B.; Morgan, Gareth A.; Carter, Lynn M.; Nolan, Michael C.; Chandler, John F.
2014-01-01
We combine Earth-based radar maps of Venus from the 1988 and 2012 inferior conjunctions, which had similar viewing geometries. Processing of both datasets with better image focusing and co-registration techniques, and summing over multiple looks, yields maps with 1-2 km spatial resolution and improved signal to noise ratio, especially in the weaker same-sense circular (SC) polarization. The SC maps are unique to Earth-based observations, and offer a different view of surface properties from orbital mapping using same-sense linear (HH or VV) polarization. Highland or tessera terrains on Venus, which may retain a record of crustal differentiation and processes occurring prior to the loss of water, are of great interest for future spacecraft landings. The Earth-based radar images reveal multiple examples of tessera mantling by impact ''parabolas'' or ''haloes'', and can extend mapping of locally thick material from Magellan data by revealing thinner deposits over much larger areas. Of particular interest is an ejecta deposit from Stuart crater that we infer to mantle much of eastern Alpha Regio. Some radar-dark tessera occurrences may indicate sediments that are trapped for longer periods than in the plains. We suggest that such radar information is important for interpretation of orbital infrared data and selection of future tessera landing sites.
Mapping urban green open space in Bontang city using QGIS and cloud computing
NASA Astrophysics Data System (ADS)
Agus, F.; Ramadiani; Silalahi, W.; Armanda, A.; Kusnandar
2018-04-01
Digital mapping techniques are available freely and openly so that map-based application development is easier, faster and cheaper. A rapid development of Cloud Computing Geographic Information System makes this system can help the needs of the community for the provision of geospatial information online. The presence of urban Green Open Space (GOS) provide great benefits as an oxygen supplier, carbon-binding agent and can contribute to providing comfort and beauty of city life. This study aims to propose a platform application of GIS Cloud Computing (CC) of Bontang City GOS mapping. The GIS-CC platform uses the basic map available that’s free and open source. The research used survey method to collect GOS data obtained from Bontang City Government, while application developing works Quantum GIS-CC. The result section describes the existence of GOS Bontang City and the design of GOS mapping application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Chuck W.; Hanson, James P.; Ivarson, Kristine A.
2015-01-14
The Hanford Site nuclear reactor operations required large quantities of high-quality cooling water, which was treated with chemicals including sodium dichromate dihydrate for corrosion control. Cooling water leakage, as well as intentional discharge of cooling water to ground during upset conditions, produced extensive groundwater recharge mounds consisting largely of contaminated cooling water and resulted in wide distribution of hexavalent chromium (Cr[VI]) contamination in the unconfined aquifer. The 2013 Cr(VI) groundwater plumes in the 100 Areas cover approximately 6 km2 (1500 acres), primarily in the 100-HR-3 and 100-KR-4 groundwater operable units (OUs). The Columbia River is a groundwater discharge boundary; wheremore » the plumes are adjacent to the Columbia River there remains a potential to discharge Cr(VI) to the river at concentrations above water quality criteria. The pump-and-treat systems along the River Corridor are operating with two main goals: 1) protection of the Columbia River, and 2) recovery of contaminant mass. An evaluation of the effectiveness of the pump-and-treat systems was needed to determine if the Columbia River was protected from contamination, and also to determine where additional system modifications may be needed. In response to this need, a technique for assessing the river protection was developed which takes into consideration seasonal migration of the plume and hydraulic performance of the operating well fields. Groundwater contaminant plume maps are generated across the Hanford Site on an annual basis. The assessment technique overlays the annual plume and the capture efficiency maps for the various pump and treat systems. The river protection analysis technique was prepared for use at the Hanford site and is described in detail in M.J. Tonkin, 2013. Interpolated capture frequency maps, based on mapping dynamic water level observed in observation wells and derived water levels in the vicinity of extraction and injection wells, are developed initially. Second, simulated capture frequency maps are developed, based on transport modelling results. Both interpolated and simulated capture frequency maps are based on operation of the systems over a full year. These two capture maps are then overlaid on the plume distribution maps for inspection of the relative orientation of the contaminant plumes with the capture frequency. To quantify the relative degree of protection of the river from discharges of Cr(VI) (and conversely, the degree of threat) at any particular location, a systematic method of evaluating and mapping the plume/capture relationship was developed. By comparing the spatial relationship between contaminant plumes and hydraulic capture frequency, an index of relative protectiveness is developed and the results posted on the combined plume/capture plan view map. Areas exhibiting lesser degrees of river protection are identified for remedial process optimization actions to control plumes and prevent continuing discharge of Cr(VI) to the river.« less
Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective
NASA Astrophysics Data System (ADS)
Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.
2016-06-01
We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).
NASA Astrophysics Data System (ADS)
Wong, Kelvin K. L.; Kelso, Richard M.; Worthley, Stephen G.; Sanders, Prashanthan; Mazumdar, Jagannath; Abbott, Derek
2008-12-01
Modelling of non-stationary cardiac structures is complicated by the complexity of their intrinsic and extrinsic motion. The first known study of haemodynamics due to the beating of heart was made by Leonardo Da Vinci, giving the idea of fluid-solid interaction by describing how vortices develop during cardiac structural interaction with the blood. Heart morphology affects in changes of cardio dynamics during the systolic and diastolic phrases. In a chamber of the heart, vortices are discovered to exist as the result of the unique morphological changes of the cardiac chamber wall by using flow-imaging techniques such as phase contrast magnetic resonance imaging. The first part of this paper attempts to quantify vortex characteristics by means of calculating vorticity numerically and devising two dimensional vortical flow maps. The technique relies on determining the properties of vorticity using a statistical quantification of the flow maps and comparison of these quantities based on different scenarios. As the characteristics of our vorticity maps vary depending on the phase of a cardiac cycle, there is a need for robust quantification method to analyse vorticity. In the second part of the paper, the approach is then utilised for examining vortices within the human right atrium. Our study has shown that a proper quantification of vorticity for the flow field can indicate the strength and number of vortices within a heart chamber.
A ddRAD Based Linkage Map of the Cultivated Strawberry, Fragaria xananassa
Davik, Jahn; Sargent, Daniel James; Brurberg, May Bente; Lien, Sigbjørn; Kent, Matthew; Alsheikh, Muath
2015-01-01
The cultivated strawberry (Fragaria ×ananassa Duch.) is an allo-octoploid considered difficult to disentangle genetically due to its four relatively similar sub-genomic chromosome sets. This has been alleviated by the recent release of the strawberry IStraw90 whole genome genotyping array. However, array resolution relies on the genotypes used in the array construction and may be of limited general use. SNP detection based on reduced genomic sequencing approaches has the potential of providing better coverage in cases where the studied genotypes are only distantly related from the SNP array’s construction foundation. Here we have used double digest restriction-associated DNA sequencing (ddRAD) to identify SNPs in a 145 seedling F1 hybrid population raised from the cross between the cultivars Sonata (♀) and Babette (♂). A linkage map containing 907 markers which spanned 1,581.5 cM across 31 linkage groups representing the 28 chromosomes of the species. Comparing the physical span of the SNP markers with the F. vesca genome sequence, the linkage groups resolved covered 79% of the estimated 830 Mb of the F. ×ananassa genome. Here, we have developed the first linkage map for F. ×ananassa using ddRAD and show that this technique and other related techniques are useful tools for linkage map development and downstream genetic studies in the octoploid strawberry. PMID:26398886
NASA Astrophysics Data System (ADS)
Bibi, T.; Azahari Razak, K.; Rahman, A. Abdul; Latif, A.
2017-10-01
Landslides are an inescapable natural disaster, resulting in massive social, environmental and economic impacts all over the world. The tropical, mountainous landscape in generally all over Malaysia especially in eastern peninsula (Borneo) is highly susceptible to landslides because of heavy rainfall and tectonic disturbances. The purpose of the Landslide hazard mapping is to identify the hazardous regions for the execution of mitigation plans which can reduce the loss of life and property from future landslide incidences. Currently, the Malaysian research bodies e.g. academic institutions and government agencies are trying to develop a landslide hazard and risk database for susceptible areas to backing the prevention, mitigation, and evacuation plan. However, there is a lack of devotion towards landslide inventory mapping as an elementary input of landslide susceptibility, hazard and risk mapping. The developing techniques based on remote sensing technologies (satellite, terrestrial and airborne) are promising techniques to accelerate the production of landslide maps, shrinking the time and resources essential for their compilation and orderly updates. The aim of the study is to provide a better perception regarding the use of virtual mapping of landslides with the help of LiDAR technology. The focus of the study is spatio temporal detection and virtual mapping of landslide inventory via visualization and interpretation of very high-resolution data (VHR) in forested terrain of Mesilau river, Kundasang. However, to cope with the challenges of virtual inventory mapping on in forested terrain high resolution LiDAR derivatives are used. This study specifies that the airborne LiDAR technology can be an effective tool for mapping landslide inventories in a complex climatic and geological conditions, and a quick way of mapping regional hazards in the tropics.
3D thermography imaging standardization technique for inflammation diagnosis
NASA Astrophysics Data System (ADS)
Ju, Xiangyang; Nebel, Jean-Christophe; Siebert, J. Paul
2005-01-01
We develop a 3D thermography imaging standardization technique to allow quantitative data analysis. Medical Digital Infrared Thermal Imaging is very sensitive and reliable mean of graphically mapping and display skin surface temperature. It allows doctors to visualise in colour and quantify temperature changes in skin surface. The spectrum of colours indicates both hot and cold responses which may co-exist if the pain associate with an inflammatory focus excites an increase in sympathetic activity. However, due to thermograph provides only qualitative diagnosis information, it has not gained acceptance in the medical and veterinary communities as a necessary or effective tool in inflammation and tumor detection. Here, our technique is based on the combination of visual 3D imaging technique and thermal imaging technique, which maps the 2D thermography images on to 3D anatomical model. Then we rectify the 3D thermogram into a view independent thermogram and conform it a standard shape template. The combination of these imaging facilities allows the generation of combined 3D and thermal data from which thermal signatures can be quantified.
Cellular-based preemption system
NASA Technical Reports Server (NTRS)
Bachelder, Aaron D. (Inventor)
2011-01-01
A cellular-based preemption system that uses existing cellular infrastructure to transmit preemption related data to allow safe passage of emergency vehicles through one or more intersections. A cellular unit in an emergency vehicle is used to generate position reports that are transmitted to the one or more intersections during an emergency response. Based on this position data, the one or more intersections calculate an estimated time of arrival (ETA) of the emergency vehicle, and transmit preemption commands to traffic signals at the intersections based on the calculated ETA. Additional techniques may be used for refining the position reports, ETA calculations, and the like. Such techniques include, without limitation, statistical preemption, map-matching, dead-reckoning, augmented navigation, and/or preemption optimization techniques, all of which are described in further detail in the above-referenced patent applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forest, E.; Bengtsson, J.; Reusch, M.F.
1991-04-01
The full power of Yoshida's technique is exploited to produce an arbitrary order implicit symplectic integrator and multi-map explicit integrator. This implicit integrator uses a characteristic function involving the force term alone. Also we point out the usefulness of the plain Ruth algorithm in computing Taylor series map using the techniques first introduced by Berz in his 'COSY-INFINITY' code.
Hyperspectral image visualization based on a human visual model
NASA Astrophysics Data System (ADS)
Zhang, Hongqin; Peng, Honghong; Fairchild, Mark D.; Montag, Ethan D.
2008-02-01
Hyperspectral image data can provide very fine spectral resolution with more than 200 bands, yet presents challenges for visualization techniques for displaying such rich information on a tristimulus monitor. This study developed a visualization technique by taking advantage of both the consistent natural appearance of a true color image and the feature separation of a PCA image based on a biologically inspired visual attention model. The key part is to extract the informative regions in the scene. The model takes into account human contrast sensitivity functions and generates a topographic saliency map for both images. This is accomplished using a set of linear "center-surround" operations simulating visual receptive fields as the difference between fine and coarse scales. A difference map between the saliency map of the true color image and that of the PCA image is derived and used as a mask on the true color image to select a small number of interesting locations where the PCA image has more salient features than available in the visible bands. The resulting representations preserve hue for vegetation, water, road etc., while the selected attentional locations may be analyzed by more advanced algorithms.
Semi-automatic mapping of linear-trending bedforms using 'Self-Organizing Maps' algorithm
NASA Astrophysics Data System (ADS)
Foroutan, M.; Zimbelman, J. R.
2017-09-01
Increased application of high resolution spatial data such as high resolution satellite or Unmanned Aerial Vehicle (UAV) images from Earth, as well as High Resolution Imaging Science Experiment (HiRISE) images from Mars, makes it necessary to increase automation techniques capable of extracting detailed geomorphologic elements from such large data sets. Model validation by repeated images in environmental management studies such as climate-related changes as well as increasing access to high-resolution satellite images underline the demand for detailed automatic image-processing techniques in remote sensing. This study presents a methodology based on an unsupervised Artificial Neural Network (ANN) algorithm, known as Self Organizing Maps (SOM), to achieve the semi-automatic extraction of linear features with small footprints on satellite images. SOM is based on competitive learning and is efficient for handling huge data sets. We applied the SOM algorithm to high resolution satellite images of Earth and Mars (Quickbird, Worldview and HiRISE) in order to facilitate and speed up image analysis along with the improvement of the accuracy of results. About 98% overall accuracy and 0.001 quantization error in the recognition of small linear-trending bedforms demonstrate a promising framework.
Xiao, Shijun; Li, Jiongtang; Ma, Fengshou; Fang, Lujing; Xu, Shuangbin; Chen, Wei; Wang, Zhi Yong
2015-09-03
Large yellow croaker (Larimichthys crocea) is an important commercial fish in China and East-Asia. The annual product of the species from the aqua-farming industry is about 90 thousand tons. In spite of its economic importance, genetic studies of economic traits and genomic selections of the species are hindered by the lack of genomic resources. Specifically, a whole-genome physical map of large yellow croaker is still missing. The traditional BAC-based fingerprint method is extremely time- and labour-consuming. Here we report the first genome map construction using the high-throughput whole-genome mapping technique by nanochannel arrays in BioNano Genomics Irys system. For an optimal marker density of ~10 per 100 kb, the nicking endonuclease Nt.BspQ1 was chosen for the genome map generation. 645,305 DNA molecules with a total length of ~112 Gb were labelled and detected, covering more than 160X of the large yellow croaker genome. Employing IrysView package and signature patterns in raw DNA molecules, a whole-genome map of large yellow croaker was assembled into 686 maps with a total length of 727 Mb, which was consistent with the estimated genome size. The N50 length of the whole-genome map, including 126 maps, was up to 1.7 Mb. The excellent hybrid alignment with large yellow croaker draft genome validated the consensus genome map assembly and highlighted a promising application of whole-genome mapping on draft genome sequence super-scaffolding. The genome map data of large yellow croaker are accessible on lycgenomics.jmu.edu.cn/pm. Using the state-of-the-art whole-genome mapping technique in Irys system, the first whole-genome map for large yellow croaker has been constructed and thus highly facilitates the ongoing genomic and evolutionary studies for the species. To our knowledge, this is the first public report on genome map construction by the whole-genome mapping for aquatic-organisms. Our study demonstrates a promising application of the whole-genome mapping on genome maps construction for other non-model organisms in a fast and reliable manner.
NASA Astrophysics Data System (ADS)
Liu, Zeyu; Xia, Tiecheng; Wang, Jinbo
2018-03-01
We propose a new fractional two-dimensional triangle function combination discrete chaotic map (2D-TFCDM) with the discrete fractional difference. Moreover, the chaos behaviors of the proposed map are observed and the bifurcation diagrams, the largest Lyapunov exponent plot, and the phase portraits are derived, respectively. Finally, with the secret keys generated by Menezes–Vanstone elliptic curve cryptosystem, we apply the discrete fractional map into color image encryption. After that, the image encryption algorithm is analyzed in four aspects and the result indicates that the proposed algorithm is more superior than the other algorithms. Project supported by the National Natural Science Foundation of China (Grant Nos. 61072147 and 11271008).
Correlation mapping microscopy
NASA Astrophysics Data System (ADS)
McGrath, James; Alexandrov, Sergey; Owens, Peter; Subhash, Hrebesh M.; Leahy, Martin J.
2015-03-01
Changes in the microcirculation are associated with conditions such as Raynauds disease. Current modalities used to assess the microcirculation such as nailfold capillaroscopy are limited due to their depth ambiguity. A correlation mapping technique was recently developed to extend the capabilities of Optical Coherence Tomography to generate depth resolved images of the microcirculation. Here we present the extension of this technique to microscopy modalities, including confocal microscopy. It is shown that this correlation mapping microscopy technique can extend the capabilities of conventional microscopy to enable mapping of vascular networks in vivo with high spatial resolution.
Gray, B.A.; Zori, Roberto T.; McGuire, P.M.; Bonde, R.K.
2002-01-01
Detailed chromosome studies were conducted for the Florida manatee (Trichechus manatus latirostris) utilizing primary chromosome banding techniques (G- and Q-banding). Digital microscopic imaging methods were employed and a standard G-banded karyotype was constructed for both sexes. Based on chromosome banding patterns and measurements obtained in these studies, a standard karyotype and ideogram are proposed. Characterization of additional cytogenetic features of this species by supplemental chromosome banding techniques, C-banding (constitutive heterochromatin), Ag-NOR staining (nucleolar organizer regions), and DA/DAPI staining, was also performed. These studies provide detailed cytogenetic data for T. manatus latirostris, which could enhance future genetic mapping projects and interspecific and intraspecific genomic comparisons by techniques such as zoo-FISH.
Planetary Geologic Mapping Handbook - 2009
NASA Technical Reports Server (NTRS)
Tanaka, K. L.; Skinner, J. A.; Hare, T. M.
2009-01-01
Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well (e.g., Hare and others, 2009). Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically (e.g., Wilhelms, 1972, 1990; Tanaka and others, 1994). As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program s Planetary Cartography and Geologic Mapping Working Group s (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.
NASA Astrophysics Data System (ADS)
Mogaji, Kehinde Anthony; Omobude, Osayande Bright
2017-12-01
Modeling of groundwater potentiality zones is a vital scheme for effective management of groundwater resources. This study developed a new multi-criteria decision making algorithm for groundwater potentiality modeling through modifying the standard GOD model. The developed model christened as GODT model was applied to assess groundwater potential in a multi-faceted crystalline geologic terrain, southwestern, Nigeria using the derived four unify groundwater potential conditioning factors namely: Groundwater hydraulic confinement (G), aquifer Overlying strata resistivity (O), Depth to water table (D) and Thickness of aquifer (T) from the interpreted geophysical data acquired in the area. With the developed model algorithm, the GIS-based produced G, O, D and T maps were synthesized to estimate groundwater potential index (GWPI) values for the area. The estimated GWPI values were processed in GIS environment to produce groundwater potential prediction index (GPPI) map which demarcate the area into four potential zones. The produced GODT model-based GPPI map was validated through application of both correlation technique and spatial attribute comparative scheme (SACS). The performance of the GODT model was compared with that of the standard analytic hierarchy process (AHP) model. The correlation technique results established 89% regression coefficients for the GODT modeling algorithm compared with 84% for the AHP model. On the other hand, the SACS validation results for the GODT and AHP models are 72.5% and 65%, respectively. The overall results indicate that both models have good capability for predicting groundwater potential zones with the GIS-based GODT model as a good alternative. The GPPI maps produced in this study can form part of decision making model for environmental planning and groundwater management in the area.
Fast and robust generation of feature maps for region-based visual attention.
Aziz, Muhammad Zaheer; Mertsching, Bärbel
2008-05-01
Visual attention is one of the important phenomena in biological vision which can be followed to achieve more efficiency, intelligence, and robustness in artificial vision systems. This paper investigates a region-based approach that performs pixel clustering prior to the processes of attention in contrast to late clustering as done by contemporary methods. The foundation steps of feature map construction for the region-based attention model are proposed here. The color contrast map is generated based upon the extended findings from the color theory, the symmetry map is constructed using a novel scanning-based method, and a new algorithm is proposed to compute a size contrast map as a formal feature channel. Eccentricity and orientation are computed using the moments of obtained regions and then saliency is evaluated using the rarity criteria. The efficient design of the proposed algorithms allows incorporating five feature channels while maintaining a processing rate of multiple frames per second. Another salient advantage over the existing techniques is the reusability of the salient regions in the high-level machine vision procedures due to preservation of their shapes and precise locations. The results indicate that the proposed model has the potential to efficiently integrate the phenomenon of attention into the main stream of machine vision and systems with restricted computing resources such as mobile robots can benefit from its advantages.
Hybrid discrete-time neural networks.
Cao, Hongjun; Ibarz, Borja
2010-11-13
Hybrid dynamical systems combine evolution equations with state transitions. When the evolution equations are discrete-time (also called map-based), the result is a hybrid discrete-time system. A class of biological neural network models that has recently received some attention falls within this category: map-based neuron models connected by means of fast threshold modulation (FTM). FTM is a connection scheme that aims to mimic the switching dynamics of a neuron subject to synaptic inputs. The dynamic equations of the neuron adopt different forms according to the state (either firing or not firing) and type (excitatory or inhibitory) of their presynaptic neighbours. Therefore, the mathematical model of one such network is a combination of discrete-time evolution equations with transitions between states, constituting a hybrid discrete-time (map-based) neural network. In this paper, we review previous work within the context of these models, exemplifying useful techniques to analyse them. Typical map-based neuron models are low-dimensional and amenable to phase-plane analysis. In bursting models, fast-slow decomposition can be used to reduce dimensionality further, so that the dynamics of a pair of connected neurons can be easily understood. We also discuss a model that includes electrical synapses in addition to chemical synapses with FTM. Furthermore, we describe how master stability functions can predict the stability of synchronized states in these networks. The main results are extended to larger map-based neural networks.
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.
2001-01-01
Northeast Yellowstone National Park (YNP) has a diversity of forest, range, and wetland cover types. Several remote sensing studies have recently been done in this area, including the NASA Earth Observations Commercial Applications Program (EOCAP) hyperspectral project conducted by Yellowstone Ecosystems Studies (YES) on the use of hyperspectral imaging for assessing riparian and in-stream habitats. In 1999, YES and NASA's Commercial Remote Sensing Program Office began collaborative study of this area, assessing the potential of synergistic use of hyperspectral, synthetic aperture radar (SAR), and multiband thermal data for mapping forest, range, and wetland land cover. Since the beginning, a quality 'reference' land cover map has been desired as a tool for developing and validating other land cover maps produced during the project. This paper recounts an effort to produce such a reference land cover map using low-altitude Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data and unsupervised classification techniques. The main objective of this study is to assess ISODATA classification for mapping land cover in Northeast YNP using select bands of low-altitude AVIRIS data. A secondary, more long-term objective is to assess the potential for improving ISODATA-based classification of land cover through use of principal components analysis and minimum noise fraction (MNF) techniques. This paper will primarily report on work regarding the primary research objective. This study focuses on an AVIRIS cube acquired on July 23, 1999, by the confluence of Soda Butte Creek with the Lamar River. Range and wetland habitats dominate the image with forested habitats being a comparatively minor component of the scene. The scene generally tracks from southwest to northeast. Most of the scene is valley bottom with some lower side slopes occurring on the western portion. Elevations within the AVIRIS scene range from approximately 1998 to 2165 m above sea level, based on US Geological Survey (USGS) 30-m digital elevation model (DEM) data. Despain and the National Park Service (NPS) provide additional description of the study area.
Map Your Way to Speech Success! Employing Mind Mapping as a Speech Preparation Technique
ERIC Educational Resources Information Center
Paxman, Christina G.
2011-01-01
Mind mapping has gained considerable credibility recently in corporations such as Boeing and Nabisco, as well as in the classroom in terms of preparing for examinations and preparing for speeches. A mind map is a graphic technique for organizing an individual's thoughts and other information. It harnesses the full range of cortical skills--word,…
Mapping coastal morphodynamics with geospatial techniques, Cape Henry, Virginia, USA
NASA Astrophysics Data System (ADS)
Allen, Thomas R.; Oertel, George F.; Gares, Paul A.
2012-01-01
The advent and proliferation of digital terrain technologies have spawned concomitant advances in coastal geomorphology. Airborne topographic Light Detection and Ranging (LiDAR) has stimulated a renaissance in coastal mapping, and field-based mapping techniques have benefitted from improvements in real-time kinematic (RTK) Global Positioning System (GPS). Varied methodologies for mapping suggest a need to match geospatial products to geomorphic forms and processes, a task that should consider product and process ontologies from each perspective. Towards such synthesis, coastal morphodynamics on a cuspate foreland are reconstructed using spatial analysis. Sequential beach ridge and swale topography are mapped using photogrammetric spot heights and airborne LiDAR data and integrated with digital bathymetry and large-scale vector shoreline data. Isobaths from bathymetric charts were digitized to determine slope and toe depth of the modern shoreface and a reconstructed three-dimensional antecedent shoreface. Triangulated irregular networks were created for the subaerial cape and subaqueous shoreface models of the cape beach ridges and sets for volumetric analyses. Results provide estimates of relative age and progradation rate and corroborate other paleogeologic sea-level rise data from the region. Swale height elevations and other measurements quantifiable in these data provide several parameters suitable for studying coastal geomorphic evolution. Mapped paleoshorelines and volumes suggest the Virginia Beach coastal compartment is related to embryonic spit development from a late Holocene shoreline located some 5 km east of the current beach.
Commowick, Olivier; Akhondi-Asl, Alireza; Warfield, Simon K.
2012-01-01
We present a new algorithm, called local MAP STAPLE, to estimate from a set of multi-label segmentations both a reference standard segmentation and spatially varying performance parameters. It is based on a sliding window technique to estimate the segmentation and the segmentation performance parameters for each input segmentation. In order to allow for optimal fusion from the small amount of data in each local region, and to account for the possibility of labels not being observed in a local region of some (or all) input segmentations, we introduce prior probabilities for the local performance parameters through a new Maximum A Posteriori formulation of STAPLE. Further, we propose an expression to compute confidence intervals in the estimated local performance parameters. We carried out several experiments with local MAP STAPLE to characterize its performance and value for local segmentation evaluation. First, with simulated segmentations with known reference standard segmentation and spatially varying performance, we show that local MAP STAPLE performs better than both STAPLE and majority voting. Then we present evaluations with data sets from clinical applications. These experiments demonstrate that spatial adaptivity in segmentation performance is an important property to capture. We compared the local MAP STAPLE segmentations to STAPLE, and to previously published fusion techniques and demonstrate the superiority of local MAP STAPLE over other state-of-the- art algorithms. PMID:22562727
Kinsman, Nicole; Gibbs, Ann E.; Nolan, Matt
2015-01-01
For extensive and remote coastlines, the absence of high-quality elevation models—for example, those produced with lidar—leaves some coastal populations lacking one of the essential elements for mapping shoreline positions or flood extents. Here, we compare seven different elevation products in a lowlying area in western Alaska to establish their appropriateness for coastal mapping applications that require the delineation of elevation-based vectors. We further investigate the effective use of a Structure from Motion (SfM)-derived surface model (vertical RMSE<20 cm) by generating a tidal datum-based shoreline and an inundation extent map for a 2011 flood event. Our results suggest that SfM-derived elevation products can yield elevation-based vector features that have horizontal positional uncertainties comparable to those derived from other techniques. We also provide a rule-of-thumb equation to aid in the selection of minimum elevation model specifications based on terrain slope, vertical uncertainties, and desired horizontal accuracy.
Surface mapping of spike potential fields: experienced EEGers vs. computerized analysis.
Koszer, S; Moshé, S L; Legatt, A D; Shinnar, S; Goldensohn, E S
1996-03-01
An EEG epileptiform spike focus recorded with scalp electrodes is clinically localized by visual estimation of the point of maximal voltage and the distribution of its surrounding voltages. We compared such estimated voltage maps, drawn by experienced electroencephalographers (EEGers), with a computerized spline interpolation technique employed in the commercially available software package FOCUS. Twenty-two spikes were recorded from 15 patients during long-term continuous EEG monitoring. Maps of voltage distribution from the 28 electrodes surrounding the points of maximum change in slope (the spike maximum) were constructed by the EEGer. The same points of maximum spike and voltage distributions at the 29 electrodes were mapped by computerized spline interpolation and a comparison between the two methods was made. The findings indicate that the computerized spline mapping techniques employed in FOCUS construct voltage maps with similar maxima and distributions as the maps created by experienced EEGers. The dynamics of spike activity, including correlations, are better visualized using the computerized technique than by manual interpretation alone. Its use as a technique for spike localization is accurate and adds information of potential clinical value.
Na, X D; Zang, S Y; Wu, C S; Li, W L
2015-11-01
Knowledge of the spatial extent of forested wetlands is essential to many studies including wetland functioning assessment, greenhouse gas flux estimation, and wildlife suitable habitat identification. For discriminating forested wetlands from their adjacent land cover types, researchers have resorted to image analysis techniques applied to numerous remotely sensed data. While with some success, there is still no consensus on the optimal approaches for mapping forested wetlands. To address this problem, we examined two machine learning approaches, random forest (RF) and K-nearest neighbor (KNN) algorithms, and applied these two approaches to the framework of pixel-based and object-based classifications. The RF and KNN algorithms were constructed using predictors derived from Landsat 8 imagery, Radarsat-2 advanced synthetic aperture radar (SAR), and topographical indices. The results show that the objected-based classifications performed better than per-pixel classifications using the same algorithm (RF) in terms of overall accuracy and the difference of their kappa coefficients are statistically significant (p<0.01). There were noticeably omissions for forested and herbaceous wetlands based on the per-pixel classifications using the RF algorithm. As for the object-based image analysis, there were also statistically significant differences (p<0.01) of Kappa coefficient between results performed based on RF and KNN algorithms. The object-based classification using RF provided a more visually adequate distribution of interested land cover types, while the object classifications based on the KNN algorithm showed noticeably commissions for forested wetlands and omissions for agriculture land. This research proves that the object-based classification with RF using optical, radar, and topographical data improved the mapping accuracy of land covers and provided a feasible approach to discriminate the forested wetlands from the other land cover types in forestry area.
NASA Astrophysics Data System (ADS)
Berthias, F.; Feketeová, L.; Della Negra, R.; Dupasquier, T.; Fillol, R.; Abdoul-Carime, H.; Farizon, B.; Farizon, M.; Märk, T. D.
2018-01-01
The combination of the Dispositif d'Irradiation d'Agrégats Moléculaire with the correlated ion and neutral time of flight-velocity map imaging technique provides a new way to explore processes occurring subsequent to the excitation of charged nano-systems. The present contribution describes in detail the methods developed for the quantitative measurement of branching ratios and cross sections for collision-induced dissociation processes of water cluster nano-systems. These methods are based on measurements of the detection efficiency of neutral fragments produced in these dissociation reactions. Moreover, measured detection efficiencies are used here to extract the number of neutral fragments produced for a given charged fragment.
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1975-01-01
Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.
Characterization of a bent Laue double-crystal beam-expanding monochromator
Martinson, Mercedes; Samadi, Nazanin; Shi, Xianbo; ...
2017-10-19
A bent Laue double-crystal monochromator system has been designed for vertically expanding the X-ray beam at the Canadian Light Source's BioMedical Imaging and Therapy beamlines. Expansion by a factor of 12 has been achieved without deteriorating the transverse coherence of the beam, allowing phase-based imaging techniques to be performed with high flux and a large field of view. However, preliminary studies revealed a lack of uniformity in the beam, presumed to be caused by imperfect bending of the silicon crystal wafers used in the system. Results from finite-element analysis of the system predicted that the second crystal would be mostmore » severely affected and has been shown experimentally. It has been determined that the majority of the distortion occurs in the second crystal and is likely caused by an imperfection in the surface of the bending frame. Here, measurements were then taken to characterize the bending of the crystal using both mechanical and diffraction techniques. In particular, two techniques commonly used to map dislocations in crystal structures have been adapted to map local curvature of the bent crystals. One of these, a variation of Berg–Berrett topography, has been used to quantify the diffraction effects caused by the distortion of the crystal wafer. This technique produces a global mapping of the deviation of the diffraction angle relative to a perfect cylinder. Finally, this information is critical for improving bending and measuring tolerances of imperfections by correlating this mapping to areas of missing intensity in the beam.« less
Characterization of a bent Laue double-crystal beam-expanding monochromator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinson, Mercedes; Samadi, Nazanin; Shi, Xianbo
A bent Laue double-crystal monochromator system has been designed for vertically expanding the X-ray beam at the Canadian Light Source's BioMedical Imaging and Therapy beamlines. Expansion by a factor of 12 has been achieved without deteriorating the transverse coherence of the beam, allowing phase-based imaging techniques to be performed with high flux and a large field of view. However, preliminary studies revealed a lack of uniformity in the beam, presumed to be caused by imperfect bending of the silicon crystal wafers used in the system. Results from finite-element analysis of the system predicted that the second crystal would be mostmore » severely affected and has been shown experimentally. It has been determined that the majority of the distortion occurs in the second crystal and is likely caused by an imperfection in the surface of the bending frame. Here, measurements were then taken to characterize the bending of the crystal using both mechanical and diffraction techniques. In particular, two techniques commonly used to map dislocations in crystal structures have been adapted to map local curvature of the bent crystals. One of these, a variation of Berg–Berrett topography, has been used to quantify the diffraction effects caused by the distortion of the crystal wafer. This technique produces a global mapping of the deviation of the diffraction angle relative to a perfect cylinder. Finally, this information is critical for improving bending and measuring tolerances of imperfections by correlating this mapping to areas of missing intensity in the beam.« less
Fassier, J-B; Lamort-Bouché, M; Sarnin, P; Durif-Bruckert, C; Péron, J; Letrilliart, L; Durand, M-J
2016-02-01
Health promotion programs are expected to improve population health and reduce social inequalities in health. However, their theoretical foundations are frequently ill-defined, and their implementation faces many obstacles. The aim of this article is to describe the intervention mapping protocol in health promotion programs planning, used recently in several countries. The challenges of planning health promotion programs are presented, and the six steps of the intervention mapping protocol are described with an example. Based on a literature review, the use of this protocol, its requirements and potential limitations are discussed. The intervention mapping protocol has four essential characteristics: an ecological perspective (person-environment), a participative approach, the use of theoretical models in human and social sciences and the use of scientific evidence. It comprises six steps: conduct a health needs assessment, define change objectives, select theory-based change techniques and practical applications, organize techniques and applications into an intervention program (logic model), plan for program adoption, implementation, and sustainability, and generate an evaluation plan. This protocol was used in different countries and domains such as obesity, tobacco, physical activity, cancer and occupational health. Although its utilization requires resources and a critical stance, this protocol was used to develop interventions which efficacy was demonstrated. The intervention mapping protocol is an integrated process that fits the scientific and practical challenges of health promotion. It could be tested in France as it was used in other countries, in particular to reduce social inequalities in health. Copyright © 2016. Published by Elsevier Masson SAS.
Zhu, Mingxing; Yu, Bin; Yang, Wanzhang; Jiang, Yanbing; Lu, Lin; Huang, Zhen; Chen, Shixiong; Li, Guanglin
2017-11-21
Swallowing is a continuous process with substantive interdependencies among different muscles, and it plays a significant role in our daily life. The aim of this study was to propose a novel technique based on high-density surface electromyography (HD sEMG) for the evaluation of normal swallowing functions. A total of 96 electrodes were placed on the front neck to acquire myoelectric signals from 12 healthy subjects while they were performing different swallowing tasks. HD sEMG energy maps were constructed based on the root mean square values to visualize muscular activities during swallowing. The effects of different volumes, viscosities, and head postures on the normal swallowing process were systemically investigated by using the energy maps. The results showed that the HD sEMG energy maps could provide detailed spatial and temporal properties of the muscle electrical activity, and visualize the muscle contractions that closely related to the swallowing function. The energy maps also showed that the swallowing time and effort was also explicitly affected by the volume and viscosity of the bolus. The concentration of the muscular activities shifted to the opposite side when the subjects turned their head to either side. The proposed method could provide an alternative method to physiologically evaluate the dynamic characteristics of normal swallowing and had the advantage of providing a full picture of how different muscle activities cooperate in time and location. The findings from this study suggested that the HD sEMG technique might be a useful tool for fast screening and objective assessment of swallowing disorders or dysphagia.
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
Feathering effect detection and artifact agglomeration index-based video deinterlacing technique
NASA Astrophysics Data System (ADS)
Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo
2018-03-01
Several video deinterlacing techniques have been developed, and each one presents a better performance in certain conditions. Occasionally, even the most modern deinterlacing techniques create frames with worse quality than primitive deinterlacing processes. This paper validates that the final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make the local correction of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified, and, if deemed appropriate, these areas are replaced by pixels generated through the "edge-based line average" method. Test results have proven that the proposed technique is able to produce video frames with higher quality than applying a single deinterlacing technique through getting what is good from intra- and interfield methods.
A new simple technique for improving the random properties of chaos-based cryptosystems
NASA Astrophysics Data System (ADS)
Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.
2018-03-01
A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique manages to improve the randomness properties of the generated keystream by preventing the system to fall into short period cycles due to digitation. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have proved that our method can considerably improve the randomness of the generated keystreams. In order to incorporate our randomness-enhancement technique, only 41 extra slices have been needed, proving that, apart from effective, this method is also efficient in terms of area and hardware resources.
Equipment and techniques for low-altitude aerial sensing of water-vapor concentration and movement
Howell, R.L.
1969-01-01
Progress in the development of equipment and techniques for making rapid measurements of moisture movement through the atmosphere over a large area is described. Airborne sensing elements measure relative humidity, temperature, and air currents. These data are telemetered to a ground-based station and recorded. A radar unit tracks the aircraft and electronically plots its position on a base map of the area being studied. Thus the distribution of atmospheric conditions can be directly related to the underlying terrain and vegetation features. ?? 1969 American Elsevier Publishing Company, Inc.
A new multicriteria risk mapping approach based on a multiattribute frontier concept.
Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty
2013-09-01
Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
Execution models for mapping programs onto distributed memory parallel computers
NASA Technical Reports Server (NTRS)
Sussman, Alan
1992-01-01
The problem of exploiting the parallelism available in a program to efficiently employ the resources of the target machine is addressed. The problem is discussed in the context of building a mapping compiler for a distributed memory parallel machine. The paper describes using execution models to drive the process of mapping a program in the most efficient way onto a particular machine. Through analysis of the execution models for several mapping techniques for one class of programs, we show that the selection of the best technique for a particular program instance can make a significant difference in performance. On the other hand, the results of benchmarks from an implementation of a mapping compiler show that our execution models are accurate enough to select the best mapping technique for a given program.
Middle Atmosphere Program. Handbook for MAP. Volume 13: Ground-based Techniques
NASA Technical Reports Server (NTRS)
Vincent, R. A. (Editor)
1984-01-01
Topics of activities in the middle Atmosphere program covered include: lidar systems of aerosol studies; mesosphere temperature; upper atmosphere temperatures and winds; D region electron densities; nitrogen oxides; atmospheric composition and structure; and optical sounding of ozone.
Karthikeyan, G; Sundarraj, A Shunmuga; Elango, K P
2003-10-01
193 drinking water samples from water sources of 27 panchayats of Veppanapalli block of Dharmapuri district of Tamil Nadu were analysed for chemical quality parameters. Based on the fluoride content of the water sources, fluoride maps differentiating regions with high / low fluoride levels were prepared using Isopleth mapping technique. The interdependence among the important chemical quality parameters were assessed using correlation studies. The experimental results of the application of linear and multiple regression equations on the influence of hardness, alkalinity, total dissolved solids and pH on fluoride are discussed.
Mathematical biodescriptors of proteomics maps: background and applications.
Basak, Subhash C; Gute, Brian D
2008-05-01
This article reviews recent developments in the formulation and application of biodescriptors to characterize proteomics maps. Such biodescriptors can be derived by applying techniques from discrete mathematics (graph theory, linear algebra and information theory). This review focuses on the development of biodescriptors for proteomics maps derived from 2D gel electrophoresis. Preliminary results demonstrated that such descriptors have a reasonable ability to differentiate between proteomics patterns that result from exposure to closely related individual chemicals and complex mixtures, such as the jet fuel JP-8. Further research is required to evaluate the utility of these proteomics-based biodescriptors for drug discovery and predictive toxicology.
Error reduction and parameter optimization of the TAPIR method for fast T1 mapping.
Zaitsev, M; Steinhoff, S; Shah, N J
2003-06-01
A methodology is presented for the reduction of both systematic and random errors in T(1) determination using TAPIR, a Look-Locker-based fast T(1) mapping technique. The relations between various sequence parameters were carefully investigated in order to develop recipes for choosing optimal sequence parameters. Theoretical predictions for the optimal flip angle were verified experimentally. Inversion pulse imperfections were identified as the main source of systematic errors in T(1) determination with TAPIR. An effective remedy is demonstrated which includes extension of the measurement protocol to include a special sequence for mapping the inversion efficiency itself. Copyright 2003 Wiley-Liss, Inc.
Fast IR laser mapping ellipsometry for the study of functional organic thin films.
Furchner, Andreas; Sun, Guoguang; Ketelsen, Helge; Rappich, Jörg; Hinrichs, Karsten
2015-03-21
Fast infrared mapping with sub-millimeter lateral resolution as well as time-resolved infrared studies of kinetic processes of functional organic thin films require a new generation of infrared ellipsometers. We present a novel laboratory-based infrared (IR) laser mapping ellipsometer, in which a laser is coupled to a variable-angle rotating analyzer ellipsometer. Compared to conventional Fourier-transform infrared (FT-IR) ellipsometers, the IR laser ellipsometer provides ten- to hundredfold shorter measurement times down to 80 ms per measured spot, as well as about tenfold increased lateral resolution of 120 μm, thus enabling mapping of small sample areas with thin-film sensitivity. The ellipsometer, equipped with a HeNe laser emitting at about 2949 cm(-1), was applied for the optical characterization of inhomogeneous poly(3-hexylthiophene) [P3HT] and poly(N-isopropylacrylamide) [PNIPAAm] organic thin films used for opto-electronics and bioapplications. With the constant development of tunable IR laser sources, laser-based infrared ellipsometry is a promising technique for fast in-depth mapping characterization of thin films and blends.
Global localization of 3D point clouds in building outline maps of urban outdoor environments.
Landsiedel, Christian; Wollherr, Dirk
2017-01-01
This paper presents a method to localize a robot in a global coordinate frame based on a sparse 2D map containing outlines of building and road network information and no location prior information. Its input is a single 3D laser scan of the surroundings of the robot. The approach extends the generic chamfer matching template matching technique from image processing by including visibility analysis in the cost function. Thus, the observed building planes are matched to the expected view of the corresponding map section instead of to the entire map, which makes a more accurate matching possible. Since this formulation operates on generic edge maps from visual sensors, the matching formulation can be expected to generalize to other input data, e.g., from monocular or stereo cameras. The method is evaluated on two large datasets collected in different real-world urban settings and compared to a baseline method from literature and to the standard chamfer matching approach, where it shows considerable performance benefits, as well as the feasibility of global localization based on sparse building outline data.
Single-shot real-time three dimensional measurement based on hue-height mapping
NASA Astrophysics Data System (ADS)
Wan, Yingying; Cao, Yiping; Chen, Cheng; Fu, Guangkai; Wang, Yapin; Li, Chengmeng
2018-06-01
A single-shot three-dimensional (3D) measurement based on hue-height mapping is proposed. The color fringe pattern is encoded by three sinusoidal fringes with the same frequency but different shifting phase into red (R), green (G) and blue (B) color channels, respectively. It is found that the hue of the captured color fringe pattern on the reference plane maintains monotonic in one period even it has the color crosstalk. Thus, unlike the traditional color phase shifting technique, the hue information is utilized to decode the color fringe pattern and map to the pixels of the fringe displacement in the proposed method. Because the monotonicity of the hue is limited within one period, displacement unwrapping is proposed to obtain the continuous displacement that is finally used to map to the height distribution. This method directly utilizes the hue under the effect of color crosstalk for mapping the height so that no color calibration is involved. Also, as it requires only single shot deformed color fringe pattern, this method can be applied into the real-time or dynamic 3D measurements.
NASA Astrophysics Data System (ADS)
Kinkingnehun, Serge R. J.; du Boisgueheneuc, Foucaud; Golmard, Jean-Louis; Zhang, Sandy X.; Levy, Richard; Dubois, Bruno
2004-04-01
We have developed a new technique to analyze correlations between brain anatomy and its neurological functions. The technique is based on the anatomic MRI of patients with brain lesions who are administered neuropsychological tests. Brain lesions of the MRI scans are first manually segmented. The MRI volumes are then normalized to a reference map, using the segmented area as a mask. After normalization, the brain lesions of the MRI are segmented again in order to redefine the border of the lesions in the context of the normalized brain. Once the MRI is segmented, the patient's score on the neuropsychological test is assigned to each voxel in the lesioned area, while the rest of the voxels of the image are set to 0. Subsequently, the individual patient's MRI images are superimposed, and each voxel is reassigned the average score of the patients who have a lesion at that voxel. A threshold is applied to remove regions having less than three overlaps. This process leads to an anatomo-functional map that links brain areas to functional loss. Other maps can be created to aid in analyzing the functional maps, such as one that indicates the 95% confidence interval of the averaged scores for each area. This anatomo-clinical overlapping map (AnaCOM) method was used to obtain functional maps from patients with lesions in the superior frontal gyrus. By finding particular subregions more responsible for a particular deficit, this method can generate new hypotheses to be tested by conventional group methods.
Kirov, Ilya; Van Laere, Katrijn; De Riek, Jan; De Keyser, Ellen; Van Roy, Nadine; Khrustaleva, Ludmila
2014-01-01
In order to anchor Rosa linkage groups to physical chromosomes, a combination of the Tyramide-FISH technology and the modern molecular marker system based on High Resolution Melting (HRM) is an efficient approach. Although, Tyramide-FISH is a very promising technique for the visualization of short DNA probes, it is very challenging for plant species with small chromosomes such as Rosa. In this study, we successfully applied the Tyramide-FISH technique for Rosa and compared different detection systems. An indirect detection system exploiting biotinylated tyramides was shown to be the most suitable technique for reliable signal detection. Three gene fragments with a size of 1100 pb–1700 bp (Phenylalanine Ammonia Lyase, Pyrroline-5-Carboxylate Synthase and Orcinol O-Methyl Transferase) have been physically mapped on chromosomes 7, 4 and 1, respectively, of Rosa wichurana. The signal frequency was between 25% and 40%. HRM markers of these 3 gene fragments were used to include the gene fragments on the existing genetic linkage map of Rosa wichurana. As a result, three linkage groups could be anchored to their physical chromosomes. The information was used to check for synteny between the Rosa chromosomes and Fragaria. PMID:24755945
DOE Office of Scientific and Technical Information (OSTI.GOV)
David, Sabrina N.; Zhai, Yao; van der Zande, Arend M.
Two-dimensional (2D) atomic materials such as graphene and transition metal dichalcogenides (TMDCs) have attracted significant research and industrial interest for their electronic, optical, mechanical, and thermal properties. While large-area crystal growth techniques such as chemical vapor deposition have been demonstrated, the presence of grain boundaries and orientation of grains arising in such growths substantially affect the physical properties of the materials. There is currently no scalable characterization method for determining these boundaries and orientations over a large sample area. We here present a second-harmonic generation based microscopy technique for rapidly mapping grain orientations and boundaries of 2D TMDCs. We experimentallymore » demonstrate the capability to map large samples to an angular resolution of ±1° with minimal sample preparation and without involved analysis. A direct comparison of the all-optical grain orientation maps against results obtained by diffraction-filtered dark-field transmission electron microscopy plus selected-area electron diffraction on identical TMDC samples is provided. This rapid and accurate tool should enable large-area characterization of TMDC samples for expedited studies of grain boundary effects and the efficient characterization of industrial-scale production techniques.« less
Price, Jeff
1995-01-01
These maps show changes in the distribution and abundance patterns of some North American birds for the last 20 years. For each species there are four maps, each representing the average distribution and abundance pattern over the five-year periods 1970-1974, 1975-1979, 1980-1984, and 1985-1989. The maps are based on data collected by the USFWS/CWS Breeding Bird Survey (BBS). Only BBS routes that were run at least once during each of the five-year periods were used (about 1300 routes). The maps were created in the software package Surfer using a kriging technique to interpolate mean relative abundances for areas where no routes were run. On each map, a portion of northeast Canada was blanked out because there were not enough routes to allow for adequate interpolation. All of the maps in this presentation use the same color scale (shown below). The minimum value mapped was 0.5 birds per route, which represents the edge of the species range.
Evaluation of MRI sequences for quantitative T1 brain mapping
NASA Astrophysics Data System (ADS)
Tsialios, P.; Thrippleton, M.; Glatz, A.; Pernet, C.
2017-11-01
T1 mapping constitutes a quantitative MRI technique finding significant application in brain imaging. It allows evaluation of contrast uptake, blood perfusion, volume, providing a more specific biomarker of disease progression compared to conventional T1-weighted images. While there are many techniques for T1-mapping there is a wide range of reported T1-values in tissues, raising the issue of protocols reproducibility and standardization. The gold standard for obtaining T1-maps is based on acquiring IR-SE sequence. Widely used alternative sequences are IR-SE-EPI, VFA (DESPOT), DESPOT-HIFI and MP2RAGE that speed up scanning and fitting procedures. A custom MRI phantom was used to assess the reproducibility and accuracy of the different methods. All scans were performed using a 3T Siemens Prisma scanner. The acquired data processed using two different codes. The main difference was observed for VFA (DESPOT) which grossly overestimated T1 relaxation time by 214 ms [126 270] compared to the IR-SE sequence. MP2RAGE and DESPOT-HIFI sequences gave slightly shorter time than IR-SE (~20 to 30ms) and can be considered as alternative and time-efficient methods for acquiring accurate T1 maps of the human brain, while IR-SE-EPI gave identical result, at a cost of a lower image quality.
Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model
NASA Astrophysics Data System (ADS)
Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley
2017-05-01
Disease mapping contains a set of statistical techniques that detail maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measure the relative risk of the disease is called Standardized Morbidity Ratio (SMR). It is the ratio of an observed and expected number of accounts in an area, which has the greatest uncertainty if the disease is rare or if geographical area is small. Therefore, Bayesian models or statistical smoothing based on Log-normal model are introduced which might solve SMR problem. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and log-normal model, which were fitted to data using WinBUGS software. This study starts with a brief review of these models, starting with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates compared to the classical method. The log-normal model has can overcome the SMR problem when there is no observed bladder cancer in an area.
Improved disparity map analysis through the fusion of monocular image segmentations
NASA Technical Reports Server (NTRS)
Perlant, Frederic P.; Mckeown, David M.
1991-01-01
The focus is to examine how estimates of three dimensional scene structure, as encoded in a scene disparity map, can be improved by the analysis of the original monocular imagery. The utilization of surface illumination information is provided by the segmentation of the monocular image into fine surface patches of nearly homogeneous intensity to remove mismatches generated during stereo matching. These patches are used to guide a statistical analysis of the disparity map based on the assumption that such patches correspond closely with physical surfaces in the scene. Such a technique is quite independent of whether the initial disparity map was generated by automated area-based or feature-based stereo matching. Stereo analysis results are presented on a complex urban scene containing various man-made and natural features. This scene contains a variety of problems including low building height with respect to the stereo baseline, buildings and roads in complex terrain, and highly textured buildings and terrain. The improvements are demonstrated due to monocular fusion with a set of different region-based image segmentations. The generality of this approach to stereo analysis and its utility in the development of general three dimensional scene interpretation systems are also discussed.
Jin, Di; Zhou, Renjie; Yaqoob, Zahid; So, Peter T C
2018-01-08
Optical diffraction tomography (ODT) is an emerging microscopy technique for three-dimensional (3D) refractive index (RI) mapping of transparent specimens. Recently, the digital micromirror device (DMD) based scheme for angle-controlled plane wave illumination has been proposed to improve the imaging speed and stability of ODT. However, undesired diffraction noise always exists in the reported DMD-based illumination scheme, which leads to a limited contrast ratio of the measurement fringe and hence inaccurate RI mapping. Here we present a novel spatial filtering method, based on a second DMD, to dynamically remove the diffraction noise. The reported results illustrate significantly enhanced image quality of the obtained interferograms and the subsequently derived phase maps. And moreover, with this method, we demonstrate mapping of 3D RI distribution of polystyrene beads as well as biological cells with high accuracy. Importantly, with the proper hardware configuration, our method does not compromise the 3D imaging speed advantage promised by the DMD-based illumination scheme. Specifically, we have been able to successfully obtain interferograms at over 1 kHz speed, which is critical for potential high-throughput label-free 3D image cytometry applications.
Erasing the Milky Way: New Cleaning Technique Applied to GBT Intensity Mapping Data
NASA Technical Reports Server (NTRS)
Wolz, L.; Blake, C.; Abdalla, F. B.; Anderson, C. J.; Chang, T.-C.; Li, Y.-C.; Masi, K.W.; Switzer, E.; Pen, U.-L.; Voytek, T. C.;
2016-01-01
We present the first application of a new foreground removal pipeline to the current leading HI intensity mapping dataset, obtained by the Green Bank Telescope (GBT). We study the 15- and 1-h field data of the GBT observations previously presented in Masui et al. (2013) and Switzer et al. (2013), covering about 41 square degrees at 0.6 less than z is less than 1.0, for which cross-correlations may be measured with the galaxy distribution of the WiggleZ Dark Energy Survey. In the presented pipeline, we subtract the Galactic foreground continuum and the point source contamination using an independent component analysis technique (fastica), and develop a Fourier-based optimal estimator to compute the temperature power spectrum of the intensity maps and cross-correlation with the galaxy survey data. We show that fastica is a reliable tool to subtract diffuse and point-source emission through the non-Gaussian nature of their probability distributions. The temperature power spectra of the intensity maps is dominated by instrumental noise on small scales which fastica, as a conservative sub-traction technique of non-Gaussian signals, can not mitigate. However, we determine similar GBT-WiggleZ cross-correlation measurements to those obtained by the Singular Value Decomposition (SVD) method, and confirm that foreground subtraction with fastica is robust against 21cm signal loss, as seen by the converged amplitude of these cross-correlation measurements. We conclude that SVD and fastica are complementary methods to investigate the foregrounds and noise systematics present in intensity mapping datasets.
NASA Astrophysics Data System (ADS)
Nardi, F.; Grimaldi, S.; Petroselli, A.
2012-12-01
Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built in Geographic Information Systems (GIS), provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic modelling optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series that are routed along the channel using a bidimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events of channel runoff generation, propagation, and overland flow within the floodplain domain. This physically-based model neglects the need for synthetic design hyetograph and hydrograph estimation that constitute the main source of subjective analysis and uncertainty of standard methods for flood mapping. Selected case studies show results and performances of the proposed procedure as respect to standard event-based approaches.
NASA Astrophysics Data System (ADS)
Hapca, Simona
2015-04-01
Many soil properties and functions emerge from interactions of physical, chemical and biological processes at microscopic scales, which can be understood only by integrating techniques that traditionally are developed within separate disciplines. While recent advances in imaging techniques, such as X-ray computed tomography (X-ray CT), offer the possibility to reconstruct the 3D physical structure at fine resolutions, for the distribution of chemicals in soil, existing methods, based on scanning electron microscope (SEM) and energy dispersive X-ray detection (EDX), allow for characterization of the chemical composition only on 2D surfaces. At present, direct 3D measurement techniques are still lacking, sequential sectioning of soils, followed by 2D mapping of chemical elements and interpolation to 3D, being an alternative which is explored in this study. Specifically, we develop an integrated experimental and theoretical framework which combines 3D X-ray CT imaging technique with 2D SEM-EDX and use spatial statistics methods to map the chemical composition of soil in 3D. The procedure involves three stages 1) scanning a resin impregnated soil cube by X-ray CT, followed by precision cutting to produce parallel thin slices, the surfaces of which are scanned by SEM-EDX, 2) alignment of the 2D chemical maps within the internal 3D structure of the soil cube, and 3) development, of spatial statistics methods to predict the chemical composition of 3D soil based on the observed 2D chemical and 3D physical data. Specifically, three statistical models consisting of a regression tree, a regression tree kriging and cokriging model were used to predict the 3D spatial distribution of carbon, silicon, iron and oxygen in soil, these chemical elements showing a good spatial agreement between the X-ray grayscale intensities and the corresponding 2D SEM-EDX data. Due to the spatial correlation between the physical and chemical data, the regression-tree model showed a great potential in predicting chemical composition in particular for iron, which is generally sparsely distributed in soil. For carbon, silicon and oxygen, which are more densely distributed, the additional kriging of the regression tree residuals improved significantly the prediction, whereas prediction based on co-kriging was less consistent across replicates, underperforming regression-tree kriging. The present study shows a great potential in integrating geo-statistical methods with imaging techniques to unveil the 3D chemical structure of soil at very fine scales, the framework being suitable to be further applied to other types of imaging data such as images of biological thin sections for characterization of microbial distribution. Key words: X-ray CT, SEM-EDX, segmentation techniques, spatial correlation, 3D soil images, 2D chemical maps.
2012-01-01
Background Establishing the distribution of materials in paintings and that of their degradation products by imaging techniques is fundamental to understand the painting technique and can improve our knowledge on the conservation status of the painting. The combined use of chromatographic-mass spectrometric techniques, such as GC/MS or Py/GC/MS, and the chemical mapping of functional groups by imaging SR FTIR in transmission mode on thin sections and SR XRD line scans will be presented as a suitable approach to have a detailed characterisation of the materials in a paint sample, assuring their localisation in the sample build-up. This analytical approach has been used to study samples from Catalan paintings by Josep Maria Sert y Badía (20th century), a muralist achieving international recognition whose canvases adorned international buildings. Results The pigments used by the painter as well as the organic materials used as binders and varnishes could be identified by means of conventional techniques. The distribution of these materials by means of Synchrotron Radiation based techniques allowed to establish the mixtures used by the painter depending on the purpose. Conclusions Results show the suitability of the combined use of SR μFTIR and SR μXRD mapping and conventional techniques to unequivocally identify all the materials present in the sample and their localization in the sample build-up. This kind of approach becomes indispensable to solve the challenge of micro heterogeneous samples. The complementary interpretation of the data obtained with all the different techniques allowed the characterization of both organic and inorganic materials in the samples layer by layer as well as to establish the painting techniques used by Sert in the works-of-art under study. PMID:22616949
A grid spacing control technique for algebraic grid generation methods
NASA Technical Reports Server (NTRS)
Smith, R. E.; Kudlinski, R. A.; Everton, E. L.
1982-01-01
A technique which controls the spacing of grid points in algebraically defined coordinate transformations is described. The technique is based on the generation of control functions which map a uniformly distributed computational grid onto parametric variables defining the physical grid. The control functions are smoothed cubic splines. Sets of control points are input for each coordinate directions to outline the control functions. Smoothed cubic spline functions are then generated to approximate the input data. The technique works best in an interactive graphics environment where control inputs and grid displays are nearly instantaneous. The technique is illustrated with the two-boundary grid generation algorithm.
NASA Astrophysics Data System (ADS)
Dilbone, Elizabeth K.
Methods for spectrally-based bathymetric mapping of rivers mainly have been developed and tested on clear-flowing, gravel bedded channels, with limited application to turbid, sand-bedded rivers. Using hyperspectral images of the Niobrara River, Nebraska, and field-surveyed depth data, this study evaluated three methods of retrieving depth from remotely sensed data in a dynamic, sand-bedded channel. The first regression-based approach paired in situ depth measurements and image pixel values to predict depth via Optimal Band Ratio Analysis (OBRA). The second approach used ground-based reflectance measurements to calibrate an OBRA relationship. For this approach, CASI images were atmospherically corrected to units of apparent surface reflectance using an empirical line calibration. For the final technique, we used Image-to-Depth Quantile Transformation (IDQT) to predict depth by linking the cumulative distribution function (CDF) of depth to the CDF of an image derived variable. OBRA yielded the lowest overall depth retrieval error (0.0047 m) and highest observed versus predicted R2 (0.81). Although misalignment between field and image data were not problematic to OBRA's performance in this study, such issues present potential limitations to standard regression-based approaches like OBRA in dynamic, sand-bedded rivers. Field spectroscopy-based maps exhibited a slight shallow bias (0.0652 m) but provided reliable depth estimates for most of the study reach. IDQT had a strong deep bias, but still provided informative relative depth maps that portrayed general patterns of shallow and deep areas of the channel. The over-prediction of depth by IDQT highlights the need for an unbiased sampling strategy to define the CDF of depth. While each of the techniques tested in this study demonstrated the potential to provide accurate depth estimates in sand-bedded rivers, each method also was subject to certain constraints and limitations.
Hidden explosives detector employing pulsed neutron and x-ray interrogation
Schultz, F.J.; Caldwell, J.T.
1993-04-06
Methods and systems for the detection of small amounts of modern, highly-explosive nitrogen-based explosives, such as plastic explosives, hidden in airline baggage. Several techniques are employed either individually or combined in a hybrid system. One technique employed in combination is X-ray imaging. Another technique is interrogation with a pulsed neutron source in a two-phase mode of operation to image both nitrogen and oxygen densities. Another technique employed in combination is neutron interrogation to form a hydrogen density image or three-dimensional map. In addition, deliberately-placed neutron-absorbing materials can be detected.
Hidden explosives detector employing pulsed neutron and x-ray interrogation
Schultz, Frederick J.; Caldwell, John T.
1993-01-01
Methods and systems for the detection of small amounts of modern, highly-explosive nitrogen-based explosives, such as plastic explosives, hidden in airline baggage. Several techniques are employed either individually or combined in a hybrid system. One technique employed in combination is X-ray imaging. Another technique is interrogation with a pulsed neutron source in a two-phase mode of operation to image both nitrogen and oxygen densities. Another technique employed in combination is neutron interrogation to form a hydrogen density image or three-dimensional map. In addition, deliberately-placed neutron-absorbing materials can be detected.
Surface characterization based on optical phase shifting interferometry
Mello, Michael , Rosakis; Ares, J [Altadena, CA
2011-08-02
Apparatus, techniques and systems for implementing an optical interferometer to measure surfaces, including mapping of instantaneous curvature or in-plane and out-of-plane displacement field gradients of a sample surface based on obtaining and processing four optical interferograms from a common optical reflected beam from the sample surface that are relatively separated in phase by .pi./2.
Middle atmosphere thermal structure during MAP/WINE
NASA Technical Reports Server (NTRS)
Offermann, D.
1989-01-01
Middle atmosphere temperatures were measured during the MAP/WINE campaign by various ground-based techniques, by rocket instruments, and by satellites. Respective data were analyzed for atmospheric thermal mean state as well as for long and short period variations. A brief survey of the results is given. Monthly mean temperatures agree well with the new CIRA model. Long period (planetary) waves frequently exhibit peculiar vertical amplitude and phase structures, resembling those of standing waves. Short period oscillations tend to begin breaking well below the stratosphere.
2013-11-01
for rovers operating in close proximity to points of interest. Techniques such as Simultaneous Localization and Mapping ( SLAM ) have been utilized...successfully to localize rovers in a variety of settings and scenarios [3,4]. SLAM focuses on building a local map of landmarks as observed by a rover...more landmarks are observed and errors filtered. SLAM therefore does not require a priori knowledge of the locations of landmarks or that of the rover
Performance Mapping Studies in Redox Flow Cells
NASA Technical Reports Server (NTRS)
Hoberecht, M. A.; Thaller, L. H.
1981-01-01
Pumping power requirements in any flow battery system constitute a direct parasitic energy loss. It is therefore useful to determine the practical lower limit for reactant flow rates. Through the use of a theoretical framework based on electrochemical first principles, two different experimental flow mapping techniques were developed to evaluate and compare electrodes as a function of flow rate. For the carbon felt electrodes presently used in NASA-Lewis Redox cells, a flow rate 1.5 times greater than the stoichiometric rate seems to be the required minimum.
A Fast Approximate Algorithm for Mapping Long Reads to Large Reference Databases.
Jain, Chirag; Dilthey, Alexander; Koren, Sergey; Aluru, Srinivas; Phillippy, Adam M
2018-04-30
Emerging single-molecule sequencing technologies from Pacific Biosciences and Oxford Nanopore have revived interest in long-read mapping algorithms. Alignment-based seed-and-extend methods demonstrate good accuracy, but face limited scalability, while faster alignment-free methods typically trade decreased precision for efficiency. In this article, we combine a fast approximate read mapping algorithm based on minimizers with a novel MinHash identity estimation technique to achieve both scalability and precision. In contrast to prior methods, we develop a mathematical framework that defines the types of mapping targets we uncover, establish probabilistic estimates of p-value and sensitivity, and demonstrate tolerance for alignment error rates up to 20%. With this framework, our algorithm automatically adapts to different minimum length and identity requirements and provides both positional and identity estimates for each mapping reported. For mapping human PacBio reads to the hg38 reference, our method is 290 × faster than Burrows-Wheeler Aligner-MEM with a lower memory footprint and recall rate of 96%. We further demonstrate the scalability of our method by mapping noisy PacBio reads (each ≥5 kbp in length) to the complete NCBI RefSeq database containing 838 Gbp of sequence and >60,000 genomes.
Kernelized Locality-Sensitive Hashing for Fast Image Landmark Association
2011-03-24
based Simultaneous Localization and Mapping ( SLAM ). The problem, however, is that vision-based navigation techniques can re- quire excessive amounts of...up and optimizing the data association process in vision-based SLAM . Specifically, this work studies the current methods that algorithms use to...required for location identification than that of other methods. This work can then be extended into a vision- SLAM implementation to subsequently
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Le; Timbie, Peter T.; Bunn, Emory F.
In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H i Expectation–Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approachmore » can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.« less
Hatam, Nahid; Kafashi, Shahnaz; Kavosi, Zahra
2015-07-01
The importance of health indicators in the recent years has created challenges in resource allocation. Balanced and fair distribution of health resources is one of the main principles in achieving equity. The goal of this cross-sectional descriptive study, conducted in 2010, was to classify health structural indicators in the Fars province using the scalogram technique. Health structural indicators were selected and classified in three categories; namely institutional, human resources, and rural health. The data were obtained from the statistical yearbook of Iran and was analyzed according to the scalogram technique. The distribution map of the Fars province was drawn using ArcGIS (geographic information system). The results showed an interesting health structural indicator map across the province. Our findings revealed that the city of Mohr with 85 and Zarindasht with 36 had the highest and the lowest scores, respectively. This information is valuable to provincial health policymakers to plan appropriately based on factual data and minimize chaos in allocating health resources. Based on such data and reflecting on the local needs, one could develop equity based resource allocation policies and prevent inequality. It is concluded that, as top priority, the provincial policymakers should place dedicated deprivation programs for Farashband, Eghlid and Zaindasht regions.
Delakis, Ioannis; Hammad, Omer; Kitney, Richard I
2007-07-07
Wavelet-based de-noising has been shown to improve image signal-to-noise ratio in magnetic resonance imaging (MRI) while maintaining spatial resolution. Wavelet-based de-noising techniques typically implemented in MRI require that noise displays uniform spatial distribution. However, images acquired with parallel MRI have spatially varying noise levels. In this work, a new algorithm for filtering images with parallel MRI is presented. The proposed algorithm extracts the edges from the original image and then generates a noise map from the wavelet coefficients at finer scales. The noise map is zeroed at locations where edges have been detected and directional analysis is also used to calculate noise in regions of low-contrast edges that may not have been detected. The new methodology was applied on phantom and brain images and compared with other applicable de-noising techniques. The performance of the proposed algorithm was shown to be comparable with other techniques in central areas of the images, where noise levels are high. In addition, finer details and edges were maintained in peripheral areas, where noise levels are low. The proposed methodology is fully automated and can be applied on final reconstructed images without requiring sensitivity profiles or noise matrices of the receiver coils, therefore making it suitable for implementation in a clinical MRI setting.
High-resolution Antibody Array Analysis of Childhood Acute Leukemia Cells*
Kanderova, Veronika; Kuzilkova, Daniela; Stuchly, Jan; Vaskova, Martina; Brdicka, Tomas; Fiser, Karel; Hrusak, Ondrej; Lund-Johansen, Fridtjof
2016-01-01
Acute leukemia is a disease pathologically manifested at both genomic and proteomic levels. Molecular genetic technologies are currently widely used in clinical research. In contrast, sensitive and high-throughput proteomic techniques for performing protein analyses in patient samples are still lacking. Here, we used a technology based on size exclusion chromatography followed by immunoprecipitation of target proteins with an antibody bead array (Size Exclusion Chromatography-Microsphere-based Affinity Proteomics, SEC-MAP) to detect hundreds of proteins from a single sample. In addition, we developed semi-automatic bioinformatics tools to adapt this technology for high-content proteomic screening of pediatric acute leukemia patients. To confirm the utility of SEC-MAP in leukemia immunophenotyping, we tested 31 leukemia diagnostic markers in parallel by SEC-MAP and flow cytometry. We identified 28 antibodies suitable for both techniques. Eighteen of them provided excellent quantitative correlation between SEC-MAP and flow cytometry (p < 0.05). Next, SEC-MAP was applied to examine 57 diagnostic samples from patients with acute leukemia. In this assay, we used 632 different antibodies and detected 501 targets. Of those, 47 targets were differentially expressed between at least two of the three acute leukemia subgroups. The CD markers correlated with immunophenotypic categories as expected. From non-CD markers, we found DBN1, PAX5, or PTK2 overexpressed in B-cell precursor acute lymphoblastic leukemias, LAT, SH2D1A, or STAT5A overexpressed in T-cell acute lymphoblastic leukemias, and HCK, GLUD1, or SYK overexpressed in acute myeloid leukemias. In addition, OPAL1 overexpression corresponded to ETV6-RUNX1 chromosomal translocation. In summary, we demonstrated that SEC-MAP technology is a powerful tool for detecting hundreds of proteins in clinical samples obtained from pediatric acute leukemia patients. It provides information about protein size and reveals differences in protein expression between particular leukemia subgroups. Forty-seven of SEC-MAP identified targets were validated by other conventional method in this study. PMID:26785729
Oberg, Kevin A.; Mades, Dean M.
1987-01-01
Four techniques for estimating generalized skew in Illinois were evaluated: (1) a generalized skew map of the US; (2) an isoline map; (3) a prediction equation; and (4) a regional-mean skew. Peak-flow records at 730 gaging stations having 10 or more annual peaks were selected for computing station skews. Station skew values ranged from -3.55 to 2.95, with a mean of -0.11. Frequency curves computed for 30 gaging stations in Illinois using the variations of the regional-mean skew technique are similar to frequency curves computed using a skew map developed by the US Water Resources Council (WRC). Estimates of the 50-, 100-, and 500-yr floods computed for 29 of these gaging stations using the regional-mean skew techniques are within the 50% confidence limits of frequency curves computed using the WRC skew map. Although the three variations of the regional-mean skew technique were slightly more accurate than the WRC map, there is no appreciable difference between flood estimates computed using the variations of the regional-mean technique and flood estimates computed using the WRC skew map. (Peters-PTT)
Joint fMRI analysis and subject clustering using sparse dictionary learning
NASA Astrophysics Data System (ADS)
Kim, Seung-Jun; Dontaraju, Krishna K.
2017-08-01
Multi-subject fMRI data analysis methods based on sparse dictionary learning are proposed. In addition to identifying the component spatial maps by exploiting the sparsity of the maps, clusters of the subjects are learned by postulating that the fMRI volumes admit a subspace clustering structure. Furthermore, in order to tune the associated hyper-parameters systematically, a cross-validation strategy is developed based on entry-wise sampling of the fMRI dataset. Efficient algorithms for solving the proposed constrained dictionary learning formulations are developed. Numerical tests performed on synthetic fMRI data show promising results and provides insights into the proposed technique.
3D silicon breast surface mapping via structured light profilometry
NASA Astrophysics Data System (ADS)
Vairavan, R.; Ong, N. R.; Sauli, Z.; Kirtsaeng, S.; Sakuntasathien, S.; Shahimin, M. M.; Alcain, J. B.; Lai, S. L.; Paitong, P.; Retnasamy, V.
2017-09-01
Digital fringe projection technique is one of the promising optical methods for 3D surface imaging as it demonstrates non contact and non invasive characteristics. The potential of this technique matches the requirement for human body evaluation, as it is vital for disease diagnosis and for treatment option selection. Thus, the digital fringe projection has addressed this requirement with its wide clinical related application and studies. However, the application of this technique for 3D surface mapping of the breast is very minimal. Hence, in this work, the application of digital fringe projection for 3D breast surface mapping is reported. Phase shift fringe projection technique was utilized to perform the 3D breast surface mapping. Maiden results have confirmed the feasibility of using the digital fringe projection method for 3D surface mapping of the breast and it can be extended for breast cancer detection.
A tone mapping operator based on neural and psychophysical models of visual perception
NASA Astrophysics Data System (ADS)
Cyriac, Praveen; Bertalmio, Marcelo; Kane, David; Vazquez-Corral, Javier
2015-03-01
High dynamic range imaging techniques involve capturing and storing real world radiance values that span many orders of magnitude. However, common display devices can usually reproduce intensity ranges only up to two to three orders of magnitude. Therefore, in order to display a high dynamic range image on a low dynamic range screen, the dynamic range of the image needs to be compressed without losing details or introducing artefacts, and this process is called tone mapping. A good tone mapping operator must be able to produce a low dynamic range image that matches as much as possible the perception of the real world scene. We propose a two stage tone mapping approach, in which the first stage is a global method for range compression based on a gamma curve that equalizes the lightness histogram the best, and the second stage performs local contrast enhancement and color induction using neural activity models for the visual cortex.
ERIC Educational Resources Information Center
Malycha, Charlotte P.; Maier, Günter W.
2017-01-01
Although creativity techniques are highly recommended in working environments, their effects have been scarcely investigated. Two cognitive processes are often considered to foster creative potential and are, therefore, taken as a basis for creativity techniques: knowledge activation and conceptual combination. In this study, both processes were…
NASA Astrophysics Data System (ADS)
Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian
2018-06-01
Infrared (IR) small target enhancement plays a significant role in modern infrared search and track (IRST) systems and is the basic technique of target detection and tracking. In this paper, a coarse-to-fine grey level mapping method using improved sigmoid transformation and saliency histogram is designed to enhance IR small targets under different backgrounds. For the stage of rough enhancement, the intensity histogram is modified via an improved sigmoid function so as to narrow the regular intensity range of background as much as possible. For the part of further enhancement, a linear transformation is accomplished based on a saliency histogram constructed by averaging the cumulative saliency values provided by a saliency map. Compared with other typical methods, the presented method can achieve both better visual performances and quantitative evaluations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Prashant, E-mail: prashantkumar@csio.res.in; Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030; Bansod, Baban K.S.
2015-02-15
Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the generated vulnerability maps by various models. Implicit challenges associated with the development of the groundwater vulnerability assessment models have also been identified with scientific considerations to the parameter relations and their selections. - Highlights: • Various index-based groundwater vulnerability assessment models havemore » been discussed. • A comparative analysis of the models and its applicability in different hydrogeological settings has been discussed. • Research problems of underlying vulnerability assessment models are also reported in this review paper.« less
A Physics-Based Deep Learning Approach to Shadow Invariant Representations of Hyperspectral Images.
Windrim, Lloyd; Ramakrishnan, Rishi; Melkumyan, Arman; Murphy, Richard J
2018-02-01
This paper proposes the Relit Spectral Angle-Stacked Autoencoder, a novel unsupervised feature learning approach for mapping pixel reflectances to illumination invariant encodings. This work extends the Spectral Angle-Stacked Autoencoder so that it can learn a shadow-invariant mapping. The method is inspired by a deep learning technique, Denoising Autoencoders, with the incorporation of a physics-based model for illumination such that the algorithm learns a shadow invariant mapping without the need for any labelled training data, additional sensors, a priori knowledge of the scene or the assumption of Planckian illumination. The method is evaluated using datasets captured from several different cameras, with experiments to demonstrate the illumination invariance of the features and how they can be used practically to improve the performance of high-level perception algorithms that operate on images acquired outdoors.
Classification of the Regional Ionospheric Disturbance Based on Machine Learning Techniques
NASA Astrophysics Data System (ADS)
Terzi, Merve Begum; Arikan, Orhan; Karatay, Secil; Arikan, Feza; Gulyaeva, Tamara
2016-08-01
In this study, Total Electron Content (TEC) estimated from GPS receivers is used to model the regional and local variability that differs from global activity along with solar and geomagnetic indices. For the automated classification of regional disturbances, a classification technique based on a robust machine learning technique that have found wide spread use, Support Vector Machine (SVM) is proposed. Performance of developed classification technique is demonstrated for midlatitude ionosphere over Anatolia using TEC estimates generated from GPS data provided by Turkish National Permanent GPS Network (TNPGN-Active) for solar maximum year of 2011. As a result of implementing developed classification technique to Global Ionospheric Map (GIM) TEC data, which is provided by the NASA Jet Propulsion Laboratory (JPL), it is shown that SVM can be a suitable learning method to detect anomalies in TEC variations.
NASA Technical Reports Server (NTRS)
Butera, M. K.
1979-01-01
The success of remotely mapping wetland vegetation of the southwestern coast of Florida is examined. A computerized technique to process aircraft and LANDSAT multispectral scanner data into vegetation classification maps was used. The cost effectiveness of this mapping technique was evaluated in terms of user requirements, accuracy, and cost. Results indicate that mangrove communities are classified most cost effectively by the LANDSAT technique, with an accuracy of approximately 87 percent and with a cost of approximately 3 cent per hectare compared to $46.50 per hectare for conventional ground survey methods.
Tanaka, K.L.; Skinner, J.A.; Crumpler, L.S.; Dohm, J.M.
2009-01-01
We photogeologically mapped the SP Mountain region of the San Francisco Volcanic Field in northern Arizona, USA to evaluate and improve the fidelity of approaches used in geologic mapping of Mars. This test site, which was previously mapped in the field, is chiefly composed of Late Cenozoic cinder cones, lava flows, and alluvium perched on Permian limestone of the Kaibab Formation. Faulting and folding has deformed the older rocks and some of the volcanic materials, and fluvial erosion has carved drainage systems and deposited alluvium. These geologic materials and their formational and modificational histories are similar to those for regions of the Martian surface. We independently prepared four geologic maps using topographic and image data at resolutions that mimic those that are commonly used to map the geology of Mars (where consideration was included for the fact that Martian features such as lava flows are commonly much larger than their terrestrial counterparts). We primarily based our map units and stratigraphic relations on geomorphology, color contrasts, and cross-cutting relationships. Afterward, we compared our results with previously published field-based mapping results, including detailed analyses of the stratigraphy and of the spatial overlap and proximity of the field-based vs. remote-based (photogeologic) map units, contacts, and structures. Results of these analyses provide insights into how to optimize the photogeologic mapping of Mars (and, by extension, other remotely observed planetary surfaces). We recommend the following: (1) photogeologic mapping as an excellent approach to recovering the general geology of a region, along with examination of local, high-resolution datasets to gain insights into the complexity of the geology at outcrop scales; (2) delineating volcanic vents and lava-flow sequences conservatively and understanding that flow abutment and flow overlap are difficult to distinguish in remote data sets; (3) taking care to understand that surficial materials (such as alluvium and volcanic ash deposits) are likely to be under-mapped yet are important because they obscure underlying units and contacts; (4) where possible, mapping multiple contact and structure types based on their varying certainty and exposure that reflect the perceived accuracy of the linework; (5) reviewing the regional context and searching for evidence of geologic activity that may have affected the map area yet for which evidence within the map area may be absent; and (6) for multi-authored maps, collectively analyzing the mapping relations, approaches, and methods throughout the duration of the mapping project with the objective of achieving a solid, harmonious product.
A close-range photogrammetric technique for mapping neotectonic features in trenches
Fairer, G.M.; Whitney, J.W.; Coe, J.A.
1989-01-01
Close-range photogrammetric techniques and newly available computerized plotting equipment were used to map exploratory trench walls that expose Quaternary faults in the vicinity of Yucca Mountain, Nevada. Small-scale structural, lithologic, and stratigraphic features can be rapidly mapped by the photogrammetric method. This method is more accurate and significantly more rapid than conventional trench-mapping methods, and the analytical plotter is capable of producing cartographic definition of high resolution when detailed trench maps are necessary. -from Authors
MERINOVA: Meteorological risks as drivers of environmental innovation in agro-ecosystem management
NASA Astrophysics Data System (ADS)
Gobin, Anne; Oger, Robert; Marlier, Catherine; Van De Vijver, Hans; Vandermeulen, Valerie; Van Huylenbroeck, Guido; Zamani, Sepideh; Curnel, Yannick; Mettepenningen, Evi
2013-04-01
The BELSPO funded project 'MERINOVA' deals with risks associated with extreme weather phenomena and with risks of biological origin such as pests and diseases. The major objectives of the proposed project are to characterise extreme meteorological events, assess the impact on Belgian agro-ecosystems, characterise their vulnerability and resilience to these events, and explore innovative adaptation options to agricultural risk management. The project comprises of five major parts that reflect the chain of risks: (i) Hazard: Assessing the likely frequency and magnitude of extreme meteorological events by means of probability density functions; (ii) Impact: Analysing the potential bio-physical and socio-economic impact of extreme weather events on agro-ecosystems in Belgium using process-based modelling techniques commensurate with the regional scale; (iii) Vulnerability: Identifying the most vulnerable agro-ecosystems using fuzzy multi-criteria and spatial analysis; (iv) Risk Management: Uncovering innovative risk management and adaptation options using actor-network theory and fuzzy cognitive mapping techniques; and, (v) Communication: Communicating to research, policy and practitioner communities using web-based techniques. The different tasks of the MERINOVA project require expertise in several scientific disciplines: meteorology, statistics, spatial database management, agronomy, bio-physical impact modelling, socio-economic modelling, actor-network theory, fuzzy cognitive mapping techniques. These expertises are shared by the four scientific partners who each lead one work package. The MERINOVA project will concentrate on promoting a robust and flexible framework by demonstrating its performance across Belgian agro-ecosystems, and by ensuring its relevance to policy makers and practitioners. Impacts developed from physically based models will not only provide information on the state of the damage at any given time, but also assist in understanding the links between different factors causing damage and determining bio-physical vulnerability. Socio-economic impacts will enlarge the basis for vulnerability mapping, risk management and adaptation options. A strong expert and end-user network will be established to help disseminating and exploiting project results to meet user needs.
Semiautomated model building for RNA crystallography using a directed rotameric approach.
Keating, Kevin S; Pyle, Anna Marie
2010-05-04
Structured RNA molecules play essential roles in a variety of cellular processes; however, crystallographic studies of such RNA molecules present a large number of challenges. One notable complication arises from the low resolutions typical of RNA crystallography, which results in electron density maps that are imprecise and difficult to interpret. This problem is exacerbated by the lack of computational tools for RNA modeling, as many of the techniques commonly used in protein crystallography have no equivalents for RNA structure. This leads to difficulty and errors in the model building process, particularly in modeling of the RNA backbone, which is highly error prone due to the large number of variable torsion angles per nucleotide. To address this, we have developed a method for accurately building the RNA backbone into maps of intermediate or low resolution. This method is semiautomated, as it requires a crystallographer to first locate phosphates and bases in the electron density map. After this initial trace of the molecule, however, an accurate backbone structure can be built without further user intervention. To accomplish this, backbone conformers are first predicted using RNA pseudotorsions and the base-phosphate perpendicular distance. Detailed backbone coordinates are then calculated to conform both to the predicted conformer and to the previously located phosphates and bases. This technique is shown to produce accurate backbone structure even when starting from imprecise phosphate and base coordinates. A program implementing this methodology is currently available, and a plugin for the Coot model building program is under development.
Shih, Yen-Yu I; Chen, You-Yin; Chen, Chiao-Chi V; Chen, Jyh-Cheng; Chang, Chen; Jaw, Fu-Shan
2008-06-01
Nociceptive neuronal activation in subcortical regions has not been well investigated in functional magnetic resonance imaging (fMRI) studies. The present report aimed to use the blood oxygenation level-dependent (BOLD) fMRI technique to map nociceptive responses in both subcortical and cortical regions by employing a refined data processing method, the atlas registration-based event-related (ARBER) analysis technique. During fMRI acquisition, 5% formalin (50 mul) was injected into the left hindpaw to induce nociception. ARBER was then used to normalize the data among rats, and images were analyzed using automatic selection of the atlas-based region of interest. It was found that formalin-induced nociceptive processing increased BOLD signals in both cortical and subcortical regions. The cortical activation was distributed over the cingulate, motor, somatosensory, insular, and visual cortices, and the subcortical activation involved the caudate putamen, hippocampus, periaqueductal gray, superior colliculus, thalamus, and hypothalamus. With the aid of ARBER, the present study revealed a detailed activation pattern that possibly indicated the recruitment of various parts of the nociceptive system. The results also demonstrated the utilization of ARBER in establishing an fMRI-based whole-brain nociceptive map. The formalin induced nociceptive images may serve as a template of central nociceptive responses, which can facilitate the future use of fMRI in evaluation of new drugs and preclinical therapies for pain. (c) 2008 Wiley-Liss, Inc.
Jones, Ryan M.; O’Reilly, Meaghan A.; Hynynen, Kullervo
2013-01-01
The feasibility of transcranial passive acoustic mapping with hemispherical sparse arrays (30 cm diameter, 16 to 1372 elements, 2.48 mm receiver diameter) using CT-based aberration corrections was investigated via numerical simulations. A multi-layered ray acoustic transcranial ultrasound propagation model based on CT-derived skull morphology was developed. By incorporating skull-specific aberration corrections into a conventional passive beamforming algorithm (Norton and Won 2000 IEEE Trans. Geosci. Remote Sens. 38 1337–43), simulated acoustic source fields representing the emissions from acoustically-stimulated microbubbles were spatially mapped through three digitized human skulls, with the transskull reconstructions closely matching the water-path control images. Image quality was quantified based on main lobe beamwidths, peak sidelobe ratio, and image signal-to-noise ratio. The effects on the resulting image quality of the source’s emission frequency and location within the skull cavity, the array sparsity and element configuration, the receiver element sensitivity, and the specific skull morphology were all investigated. The system’s resolution capabilities were also estimated for various degrees of array sparsity. Passive imaging of acoustic sources through an intact skull was shown possible with sparse hemispherical imaging arrays. This technique may be useful for the monitoring and control of transcranial focused ultrasound (FUS) treatments, particularly non-thermal, cavitation-mediated applications such as FUS-induced blood-brain barrier disruption or sonothrombolysis, for which no real-time monitoring technique currently exists. PMID:23807573
Som, Dipasree; Tak, Megha; Setia, Mohit; Patil, Asawari; Sengupta, Amit; Chilakapati, C Murali Krishna; Srivastava, Anurag; Parmar, Vani; Nair, Nita; Sarin, Rajiv; Badwe, R
2016-01-01
Raman spectroscopy which is based upon inelastic scattering of photons has a potential to emerge as a noninvasive bedside in vivo or ex vivo molecular diagnostic tool. There is a need to improve the sensitivity and predictability of Raman spectroscopy. We developed a grid matrix-based tissue mapping protocol to acquire cellular-specific spectra that also involved digital microscopy for localizing malignant and lymphocytic cells in sentinel lymph node biopsy sample. Biosignals acquired from specific cellular milieu were subjected to an advanced supervised analytical method, i.e., cross-correlation and peak-to-peak ratio in addition to PCA and PC-LDA. We observed decreased spectral intensity as well as shift in the spectral peaks of amides and lipid bands in the completely metastatic (cancer cells) lymph nodes with high cellular density. Spectral library of normal lymphocytes and metastatic cancer cells created using the cellular specific mapping technique can be utilized to create an automated smart diagnostic tool for bench side screening of sampled lymph nodes. Spectral library of normal lymphocytes and metastatic cancer cells created using the cellular specific mapping technique can be utilized to develop an automated smart diagnostic tool for bench side screening of sampled lymph nodes supported by ongoing global research in developing better technology and signal and big data processing algorithms.
Kranz, Christine
2014-01-21
In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is based on developing more sophisticated probes beyond conventional micro-disc electrodes usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes particularly enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.
NASA Technical Reports Server (NTRS)
Wilson, C.; Dye, R.; Reed, L.
1982-01-01
The errors associated with planimetric mapping of the United States using satellite remote sensing techniques are analyzed. Assumptions concerning the state of the art achievable for satellite mapping systems and platforms in the 1995 time frame are made. An analysis of these performance parameters is made using an interactive cartographic satellite computer model, after first validating the model using LANDSAT 1 through 3 performance parameters. An investigation of current large scale (1:24,000) US National mapping techniques is made. Using the results of this investigation, and current national mapping accuracy standards, the 1995 satellite mapping system is evaluated for its ability to meet US mapping standards for planimetric and topographic mapping at scales of 1:24,000 and smaller.
Forest and range mapping in the Houston area with ERTS-1
NASA Technical Reports Server (NTRS)
Heath, G. R.; Parker, H. D.
1973-01-01
ERTS-1 data acquired over the Houston area has been analyzed for applications to forest and range mapping. In the field of forestry the Sam Houston National Forest (Texas) was chosen as a test site, (Scene ID 1037-16244). Conventional imagery interpretation as well as computer processing methods were used to make classification maps of timber species, condition and land-use. The results were compared with timber stand maps which were obtained from aircraft imagery and checked in the field. The preliminary investigations show that conventional interpretation techniques indicated an accuracy in classification of 63 percent. The computer-aided interpretations made by a clustering technique gave 70 percent accuracy. Computer-aided and conventional multispectral analysis techniques were applied to range vegetation type mapping in the gulf coast marsh. Two species of salt marsh grasses were mapped.
Demirci, Oguz; Clark, Vincent P; Calhoun, Vince D
2008-02-15
Schizophrenia is diagnosed based largely upon behavioral symptoms. Currently, no quantitative, biologically based diagnostic technique has yet been developed to identify patients with schizophrenia. Classification of individuals into patient with schizophrenia and healthy control groups based on quantitative biologically based data is of great interest to support and refine psychiatric diagnoses. We applied a novel projection pursuit technique on various components obtained with independent component analysis (ICA) of 70 subjects' fMRI activation maps obtained during an auditory oddball task. The validity of the technique was tested with a leave-one-out method and the detection performance varied between 80% and 90%. The findings suggest that the proposed data reduction algorithm is effective in classifying individuals into schizophrenia and healthy control groups and may eventually prove useful as a diagnostic tool.
Narayan, Sreenath; Kalhan, Satish C.; Wilson, David L.
2012-01-01
I.Abstract Purpose To reduce swaps in fat-water separation methods, a particular issue on 7T small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Materials and Methods Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Results Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Conclusion Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. PMID:23023815
Narayan, Sreenath; Kalhan, Satish C; Wilson, David L
2013-05-01
To reduce swaps in fat-water separation methods, a particular issue on 7 Tesla (T) small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. Copyright © 2012 Wiley Periodicals, Inc.
Spatial Analysis of Land Adjustment as a Rehabilitation Base of Mangrove in Indramayu Regency
NASA Astrophysics Data System (ADS)
Sodikin; Sitorus, S. R. P.; Prasetyo, L. B.; Kusmana, C.
2018-02-01
Indramayu Regency is the area that has the largest mangrove in West Java. According to the environment and forestry ministry of Indramayu district will be targeted to be the central area of mangrove Indonesia. Mangroves in the regency from the 1990s have experienced a significant decline caused by the conversion of mangrove land into ponds and settlements. To stop the mangrove decline that continues to occur, it is necessary to rehabilitate mangroves in the area. The rehabilitation of mangrove should be in the area suitable for mangrove growth and what kind of vegetation analysis is appropriate to plant the area, so the purpose of this research is to analyze the suitability of land for mangrove in Indramayu Regency. This research uses geographic information system with overlay technique, while the data used in this research is tidal map of sea water, salintas map, land ph map, soil texture map, sea level rise map, land use map, community participation level map, and Map of organic soil. Then overlay and adjusted to matrix environmental parameters for mangrove growth. Based on the results of the analysis is known that in Indramayu District there are 5 types of mangroves that fit among others Bruguera, Soneratia, Nypah, Rhizophora, and Avicennia. The area of each area is Bruguera with an area of 6260 ha, 2958 ha, nypah 1756 ha, Rhizophora 936, and Avicennia 433 ha.
Computer-composite mapping for geologists
van Driel, J.N.
1980-01-01
A computer program for overlaying maps has been tested and evaluated as a means for producing geologic derivative maps. Four maps of the Sugar House Quadrangle, Utah, were combined, using the Multi-Scale Data Analysis and Mapping Program, in a single composite map that shows the relative stability of the land surface during earthquakes. Computer-composite mapping can provide geologists with a powerful analytical tool and a flexible graphic display technique. Digitized map units can be shown singly, grouped with different units from the same map, or combined with units from other source maps to produce composite maps. The mapping program permits the user to assign various values to the map units and to specify symbology for the final map. Because of its flexible storage, easy manipulation, and capabilities of graphic output, the composite-mapping technique can readily be applied to mapping projects in sedimentary and crystalline terranes, as well as to maps showing mineral resource potential. ?? 1980 Springer-Verlag New York Inc.
Kumar, Mukesh; Rath, Nitish Kumar; Rath, Santanu Kumar
2016-04-01
Microarray-based gene expression profiling has emerged as an efficient technique for classification, prognosis, diagnosis, and treatment of cancer. Frequent changes in the behavior of this disease generates an enormous volume of data. Microarray data satisfies both the veracity and velocity properties of big data, as it keeps changing with time. Therefore, the analysis of microarray datasets in a small amount of time is essential. They often contain a large amount of expression, but only a fraction of it comprises genes that are significantly expressed. The precise identification of genes of interest that are responsible for causing cancer are imperative in microarray data analysis. Most existing schemes employ a two-phase process such as feature selection/extraction followed by classification. In this paper, various statistical methods (tests) based on MapReduce are proposed for selecting relevant features. After feature selection, a MapReduce-based K-nearest neighbor (mrKNN) classifier is also employed to classify microarray data. These algorithms are successfully implemented in a Hadoop framework. A comparative analysis is done on these MapReduce-based models using microarray datasets of various dimensions. From the obtained results, it is observed that these models consume much less execution time than conventional models in processing big data. Copyright © 2016 Elsevier Inc. All rights reserved.
Optimization techniques for integrating spatial data
Herzfeld, U.C.; Merriam, D.F.
1995-01-01
Two optimization techniques ta predict a spatial variable from any number of related spatial variables are presented. The applicability of the two different methods for petroleum-resource assessment is tested in a mature oil province of the Midcontinent (USA). The information on petroleum productivity, usually not directly accessible, is related indirectly to geological, geophysical, petrographical, and other observable data. This paper presents two approaches based on construction of a multivariate spatial model from the available data to determine a relationship for prediction. In the first approach, the variables are combined into a spatial model by an algebraic map-comparison/integration technique. Optimal weights for the map comparison function are determined by the Nelder-Mead downhill simplex algorithm in multidimensions. Geologic knowledge is necessary to provide a first guess of weights to start the automatization, because the solution is not unique. In the second approach, active set optimization for linear prediction of the target under positivity constraints is applied. Here, the procedure seems to select one variable from each data type (structure, isopachous, and petrophysical) eliminating data redundancy. Automating the determination of optimum combinations of different variables by applying optimization techniques is a valuable extension of the algebraic map-comparison/integration approach to analyzing spatial data. Because of the capability of handling multivariate data sets and partial retention of geographical information, the approaches can be useful in mineral-resource exploration. ?? 1995 International Association for Mathematical Geology.
A New Active Cavitation Mapping Technique for Pulsed HIFU Applications – Bubble Doppler
Li, Tong; Khokhlova, Tatiana; Sapozhnikov, Oleg; Hwang, Joo Ha; Sapozhnikov, Oleg; O’Donnell, Matthew
2015-01-01
In this work, a new active cavitation mapping technique for pulsed high-intensity focused ultrasound (pHIFU) applications termed bubble Doppler is proposed and its feasibility tested in tissue-mimicking gel phantoms. pHIFU therapy uses short pulses, delivered at low pulse repetition frequency, to cause transient bubble activity that has been shown to enhance drug and gene delivery to tissues. The current gold standard for detecting and monitoring cavitation activity during pHIFU treatments is passive cavitation detection (PCD), which provides minimal information on the spatial distribution of the bubbles. B-mode imaging can detect hyperecho formation, but has very limited sensitivity, especially to small, transient microbubbles. The bubble Doppler method proposed here is based on a fusion of the adaptations of three Doppler techniques that had been previously developed for imaging of ultrasound contrast agents – color Doppler, pulse inversion Doppler, and decorrelation Doppler. Doppler ensemble pulses were interleaved with therapeutic pHIFU pulses using three different pulse sequences and standard Doppler processing was applied to the received echoes. The information yielded by each of the techniques on the distribution and characteristics of pHIFU-induced cavitation bubbles was evaluated separately, and found to be complementary. The unified approach - bubble Doppler – was then proposed to both spatially map the presence of transient bubbles and to estimate their sizes and the degree of nonlinearity. PMID:25265178
NASA Astrophysics Data System (ADS)
Braud, Isabelle; Desprats, Jean-François; Ayral, Pierre-Alain; Bouvier, Christophe; Vandervaere, Jean-Pierre
2017-04-01
Topsoil field-saturated hydraulic conductivity, Kfs, is a parameter that controls the partition of rainfall between infiltration and runoff. It is a key parameter in most distributed hydrological models. However, there is a mismatch between the scale of local in situ measurements and the scale at which the parameter is required in models. Therefore it is necessary to design methods to regionally map this parameter at the model scale. The paper propose a method for mapping Kfs in the Cévennes-Vivarais region, south-east France, using more easily available GIS data: geology and land cover. The mapping is based on a data set gathering infiltration tests performed in the area or close to it for more than ten years. The data set is composed of infiltration tests performed using various techniques: Guelph permeameter, double ring and single ring infiltration tests, infiltrometers with multiple suctions. The different methods lead to different orders of magnitude for Kfs rendering the pooling of all the data challenging. Therefore, a method is first proposed to pool the data from the different infiltration methods, leading to a homogenized set of Kfs, based on an equivalent double ring/tension disk infiltration value. Statistical tests showed significant differences in distributions among different geologies and land covers. Thus those variables were retained as proxy for mapping Kfs at the regional scale. This map was compared to a map based on the Rawls and Brakensiek (RB) pedo-transfer function (Manus et al., 2009, Vannier et al., 2016), showing very different patterns between both maps. In addition, RB values did not fit observed values at the plot scale, highlighting that soil texture only is not a good predictor of Kfs. References Manus, C., Anquetin, S., Braud, I., Vandervaere, J.P., Viallet, P., Creutin, J.D., Gaume, E., 2009. A modelling approach to assess the hydrological response of small Mediterranean catchments to the variability of soil characteristics in a context of extreme events. Hydrology and Earth System Sciences, 13: 79-87. Vannier, O., Anquetin, S., Braud, I., 2016. Investigating the role of geology in the hydrological response of Mediterranean catchments prone to flash-floods: regional modelling study and process understanding. Journal of Hydrology, 541 Part A, 158-172.
Bioanalysis: its past, present, and some future.
Righetti, Pier Giorgio
2004-07-01
An overview of about 100 years of bioanalysis is here disastrously attempted. The beginning of rigorous analytical systems can perhaps be traced back to the building and testing of the analytical ultracentrifuge by Svedberg and the apparatus for moving-boundary electrophoresis of Tiselius, both systems relying on expensive and hard to operate machines. In the sixties, Porath discovered porous beads for the determination of relative molecular mass (Mr) of proteins, based on the principle of molecular sieving. Concomitantly, Svensson and his pupil Vesterberg described a revolutionary principle for fractionating proteins in a nonisocratic environment, based on generation of stable pH gradients in an electric field, a technique that went down to history as isoelectric focusing (IEF). Polyacrylamide gel electrophoresis (PAGE), with the brilliant idea of discontinuous buffers, was brought to the limelight and in 1967, sodium dodecyl sulfate (SDS)-PAGE was described, permitting easy assessment of protein purity and reasonable measurements of Mr values of denatured polypeptide chains. By the mid seventies, another explosive concept was realized: orthogonal combination of two unrelated techniques, based on surface charge and mass fractionation, namely, two-dimensional (2-D) PAGE already in the very first papers by O'Farrell elaborated to its utmost sophistication. The eighties saw the systematic growth of 2-D PAGE, accompanied by systematic efforts to develop instrumentation for large-scale production of 2-D maps and computer evaluation for 2-D map analysis, based on the sophisticated algorithms adopted by astronomers for mapping stars in the sky. Another fundamental innovation in the field of IEF was the discovery of immobilized pH gradients (IPGs) that brought the much needed reproducibility in 2-D maps while allowing exquisite resolution in very narrow pH ranges. The nineties were definitely the decade of capillary zone electrophoresis, with the concomitant concept of automation and miniaturization in electrokinetic methodologies. Also 2-D map analysis witnessed a big revival, thanks to the adoption of IPGs for the first dimension. The enormous progress of mass spectrometry resulted in first reports on the analysis of macromolecules and the building of data bases on gene and protein banks. The third millennium is, perhaps, exasperating the concept of miniaturization at all costs, while not disdaining increasingly larger maps for 2-D analysis of complex protein mixtures.
A mobile mapping system for spatial information based on DGPS/EGIS
NASA Astrophysics Data System (ADS)
Pei, Ling; Wang, Qing; Gu, Juan
2007-11-01
With the rapid developments of mobile device and wireless communication, it brings a new challenge for acquiring the spatial information. A mobile mapping system based on differential global position system (DGPS) integrated with embedded geographic information system (EGIS) is designed. A mobile terminal adapts to various GPS differential environments such as single base mode and network GPS mode like Virtual Reference Station (VRS) and Master- Auxiliary Concept (MAC) by the mobile communication technology. The spatial information collected through DGPS is organized in an EGIS running in the embedded device. A set of mobile terminal in real-time DGPS based on GPRS adopting multithreading technique of serial port in manner of simulating overlapped I/O operating is developed, further more, the GPS message analysis and checkout based on Strategy Pattern for various receivers are included in the process of development. A mobile terminal accesses to the GPS network successfully by NTRIP (Networked Transport of RTCM via Internet Protocol) compliance. Finally, the accuracy and reliability of the mobile mapping system are proved by a lot of testing in 9 provinces all over the country.
Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing
NASA Astrophysics Data System (ADS)
Ari, Gizem; Toker, Cenk
2016-07-01
Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways for drift measurement depends on instrumentation based measurements, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift on the received signal, where the main cause of Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world or even Europe is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two dimensional TEC map is constructed by using the IONOLAB-MAP tool which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real situation electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically with a period of as low as 30 seconds. By processing two consequent snapshots together and calculating the propagation paths, we estimate the drift measurements over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.