Sample records for processing map map

  1. Assessing the impact of graphical quality on automatic text recognition in digital maps

    NASA Astrophysics Data System (ADS)

    Chiang, Yao-Yi; Leyk, Stefan; Honarvar Nazari, Narges; Moghaddam, Sima; Tan, Tian Xiang

    2016-08-01

    Converting geographic features (e.g., place names) in map images into a vector format is the first step for incorporating cartographic information into a geographic information system (GIS). With the advancement in computational power and algorithm design, map processing systems have been considerably improved over the last decade. However, the fundamental map processing techniques such as color image segmentation, (map) layer separation, and object recognition are sensitive to minor variations in graphical properties of the input image (e.g., scanning resolution). As a result, most map processing results would not meet user expectations if the user does not "properly" scan the map of interest, pre-process the map image (e.g., using compression or not), and train the processing system, accordingly. These issues could slow down the further advancement of map processing techniques as such unsuccessful attempts create a discouraged user community, and less sophisticated tools would be perceived as more viable solutions. Thus, it is important to understand what kinds of maps are suitable for automatic map processing and what types of results and process-related errors can be expected. In this paper, we shed light on these questions by using a typical map processing task, text recognition, to discuss a number of map instances that vary in suitability for automatic processing. We also present an extensive experiment on a diverse set of scanned historical maps to provide measures of baseline performance of a standard text recognition tool under varying map conditions (graphical quality) and text representations (that can vary even within the same map sheet). Our experimental results help the user understand what to expect when a fully or semi-automatic map processing system is used to process a scanned map with certain (varying) graphical properties and complexities in map content.
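
    As an illustration only: the "standard text recognition tool" benchmarked in the experiments is not named here, but the kind of baseline run it describes can be sketched with an off-the-shelf OCR engine. The file name, threshold, and the choice of Tesseract (via pytesseract) below are assumptions, not the authors' setup.

    ```python
    # Minimal sketch: run an off-the-shelf OCR engine on a scanned map image.
    # "map_sheet.png" and the global threshold are hypothetical; this is not
    # the specific text recognition system evaluated in the paper.
    from PIL import Image, ImageOps
    import pytesseract

    img = Image.open("map_sheet.png").convert("L")        # grayscale scan
    img = ImageOps.autocontrast(img)                      # crude normalization of graphical quality
    binary = img.point(lambda p: 255 if p > 180 else 0)   # simple global threshold

    # Recognition quality will vary with scanning resolution, compression
    # artifacts, and how text overlaps other map layers, as the paper discusses.
    print(pytesseract.image_to_string(binary))
    ```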

  2. Enhancements to Demilitarization Process Maps Program (ProMap)

    DTIC Science & Technology

    2016-10-14

    map tool, ProMap, was improved by implementing new features, and sharing data with MIDAS and AMDIT databases. Specifically, process efficiency was...improved by 1) providing access to APE information contained in the AMDIT database directly from inside ProMap when constructing a process map, 2...what equipment can be efficiently used to demil a particular munition. Associated with this task was the upgrade of the AMDIT database so that

  3. The use of process mapping in healthcare quality improvement projects.

    PubMed

    Antonacci, Grazia; Reed, Julie E; Lennox, Laura; Barlow, James

    2018-05-01

    Introduction: Process mapping provides insight into the systems and processes in which improvement interventions are introduced and is seen as useful in healthcare quality improvement projects. There is little empirical evidence on the use of process mapping in healthcare practice. This study advances understanding of the benefits and success factors of process mapping within quality improvement projects. Methods: Eight quality improvement projects were purposively selected from different healthcare settings within the UK's National Health Service. Data were gathered from multiple sources, including interviews exploring participants' experience of using process mapping in their projects and their perceptions of the benefits and challenges related to its use. These were analysed using inductive analysis. Results: Eight key benefits of process mapping were reported by participants (building a shared understanding of reality; identifying improvement opportunities; engaging stakeholders in the project; defining the project's objectives; monitoring project progress; learning; increased empathy; simplicity of the method), along with five factors related to successful process mapping exercises (simple and appropriate visual representation, information gathered from multiple stakeholders, the facilitator's experience and soft skills, basic training, iterative use of process mapping throughout the project). Conclusions: Findings highlight the benefits and versatility of process mapping and provide practical suggestions to improve its use in practice.

  4. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences, including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data are collected. This interpretive process that results in the final geological map is often supported by recording observations, ideas and alternative geological models in a field notebook, explored with the use of sketches and evolutionary diagrams. In combination, the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has turned smartphones into geological mapping tools that can be used to collect large amounts of geological data quickly. With GPS functionality, these data are also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced in-field photography. In contrast, line drawing, for example for lithological boundary interpretation and sketching, is yet to find the digital flow that is achieved with pencil on notebook page or map. Free-form integrated sketching and notebook functionality in geological mapping software packages is in its nascence. Hence, the result is a tendency for digital geological mapping to focus on the ease of data collection rather than on the thoughts and careful observations that come from notebook sketching and interpreting boundaries on a map in the field. The final digital geological map can be assessed for when and where data were recorded, but the thought processes of the mapper are less easily assessed, and the use of observations and sketching to generate ideas and interpretations may be inhibited by reliance on digital mapping methods. All mapping methods have their own distinct advantages and disadvantages, and with more recent technologies both hardware and software issues have arisen. We present field examples of conventional fieldslip mapping and compare these with more advanced technologies to highlight some of the main advantages and disadvantages of each method, and discuss where geological mapping may be going in the future.

  5. Revision of Primary Series Maps

    USGS Publications Warehouse

    2000-01-01

    In 1992, the U.S. Geological Survey (USGS) completed a 50-year effort to provide primary series map coverage of the United States. Many of these maps now need to be updated to reflect the construction of new roads and highways and other changes that have taken place over time. The USGS has formulated a graphic revision plan to help keep the primary series maps current. Primary series maps include 1:20,000-scale quadrangles of Puerto Rico, 1:24,000- or 1:25,000-scale quadrangles of the conterminous United States, Hawaii, and U.S. Territories, and 1:63,360-scale quadrangles of Alaska. The revision of primary series maps from new collection sources is accomplished using a variety of processes. The raster revision process combines the scanned content of paper maps with raster updating technologies. The vector revision process involves the automated plotting of updated vector files. Traditional processes use analog stereoplotters and manual scribing instruments on specially coated map separates. The ability to select from or combine these processes increases the efficiency of the National Mapping Division map revision program.

  6. Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes

    PubMed Central

    Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin

    2012-01-01

    Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
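
    As a rough illustration of the processing steps surveyed (spatial filtering, temporal filtering, baseline drift removal, activation mapping), here is a minimal sketch on a placeholder fluorescence stack. The filter types, cutoffs, and sampling rate are assumptions for the example, not recommendations from the review.

    ```python
    # Minimal sketch of common optical-mapping processing steps on a stack of
    # shape (frames, rows, cols). Filter choices and parameters are illustrative.
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from scipy.signal import butter, filtfilt, detrend

    fs = 1000.0                                    # frames per second (assumed)
    stack = np.random.rand(2000, 64, 64)           # placeholder for recorded data

    # Spatial filtering: Gaussian smoothing within each frame.
    stack = gaussian_filter(stack, sigma=(0, 1.5, 1.5))

    # Temporal filtering: low-pass Butterworth along the time axis.
    b, a = butter(3, 100.0 / (fs / 2.0), btype="low")
    stack = filtfilt(b, a, stack, axis=0)

    # Baseline drift removal: subtract a linear trend per pixel.
    stack = detrend(stack, axis=0, type="linear")

    # Activation mapping: time of maximum upstroke (dF/dt) per pixel.
    activation_ms = np.argmax(np.diff(stack, axis=0), axis=0) / fs * 1000.0
    ```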

  7. Harvesting geographic features from heterogeneous raster maps

    NASA Astrophysics Data System (ADS)

    Chiang, Yao-Yi

    2010-11-01

    Raster maps offer a great deal of geospatial information and are easily accessible compared to other geospatial data. However, harvesting geographic features locked in heterogeneous raster maps to obtain the geospatial information is challenging. This is because of the varying image quality of raster maps (e.g., scanned maps with poor image quality and computer-generated maps with good image quality), the overlapping geographic features in maps, and the typical lack of metadata (e.g., map geocoordinates, map source, and original vector data). Previous work on map processing is typically limited to a specific type of map and often relies on intensive manual work. In contrast, this thesis investigates a general approach that does not rely on any prior knowledge and requires minimal user effort to process heterogeneous raster maps. This approach includes automatic and supervised techniques to process raster maps for separating individual layers of geographic features from the maps and recognizing geographic features in the separated layers (i.e., detecting road intersections, generating and vectorizing road geometry, and recognizing text labels). The automatic technique eliminates user intervention by exploiting common map properties of how road lines and text labels are drawn in raster maps. For example, the road lines are elongated linear objects and the characters are small connected objects. The supervised technique utilizes labels of road and text areas to handle complex raster maps, or maps with poor image quality, and can process a variety of raster maps with minimal user input. The results show that the general approach can handle raster maps with varying map complexity, color usage, and image quality. By matching extracted road intersections to another geospatial dataset, we can identify the geocoordinates of a raster map and further align the raster map, separated feature layers from the map, and recognized features from the layers with the geospatial dataset. The road vectorization and text recognition results outperform state-of-the-art commercial products while requiring considerably less user input. The approach in this thesis allows us to make use of the geospatial information of heterogeneous maps locked in raster format.
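
    The automatic technique rests on the observation quoted above: road lines are elongated linear objects while characters are small connected objects. A minimal sketch of that separation idea on a binarized foreground layer follows; the input file and the size/elongation thresholds are assumptions, not the thesis' actual algorithm.

    ```python
    # Minimal sketch: split a binarized map foreground into "text-like" and
    # "road-like" components using connected-component size and elongation.
    # Threshold values and the input array are illustrative assumptions.
    import numpy as np
    from scipy import ndimage

    foreground = np.load("map_foreground.npy")          # hypothetical boolean layer
    labels, n = ndimage.label(foreground)

    text_mask = np.zeros_like(foreground, dtype=bool)
    road_mask = np.zeros_like(foreground, dtype=bool)

    for label_id, obj_slice in enumerate(ndimage.find_objects(labels), start=1):
        component = labels[obj_slice] == label_id
        h = obj_slice[0].stop - obj_slice[0].start
        w = obj_slice[1].stop - obj_slice[1].start
        elongation = max(h, w) / max(1, min(h, w))
        if component.sum() < 400 and elongation < 5:    # small and compact: character-like
            text_mask[obj_slice] |= component
        else:                                           # long or large: road-line-like
            road_mask[obj_slice] |= component
    ```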

  8. Image processing for optical mapping.

    PubMed

    Ravindran, Prabu; Gupta, Aditya

    2015-01-01

    Optical Mapping is an established single-molecule, whole-genome analysis system, which has been used to gain a comprehensive understanding of genomic structure and to study structural variation of complex genomes. A critical component of the Optical Mapping system is the image processing module, which extracts single-molecule restriction maps from image datasets of immobilized, restriction-digested and fluorescently stained large DNA molecules. In this review, we describe robust and efficient image processing techniques to process these massive datasets and extract accurate restriction maps in the presence of noise, ambiguity and confounding artifacts. We also highlight a few applications of the Optical Mapping system.

  9. Fast Mapping Across Time: Memory Processes Support Children's Retention of Learned Words.

    PubMed

    Vlach, Haley A; Sandhofer, Catherine M

    2012-01-01

    Children's remarkable ability to map linguistic labels to referents in the world is commonly called fast mapping. The current study examined children's (N = 216) and adults' (N = 54) retention of fast-mapped words over time (immediately, after a 1-week delay, and after a 1-month delay). The fast mapping literature often characterizes children's retention of words as consistently high across timescales. However, the current study demonstrates that learners forget word mappings at a rapid rate. Moreover, these patterns of forgetting parallel forgetting functions of domain-general memory processes. Memory processes are critical to children's word learning and the role of one such process, forgetting, is discussed in detail - forgetting supports extended mapping by promoting the memory and generalization of words and categories.

  10. Strong Convergence of Iteration Processes for Infinite Family of General Extended Mappings

    NASA Astrophysics Data System (ADS)

    Hussein Maibed, Zena

    2018-05-01

    In this paper, we introduce the concept of a general extended mapping, which is independent of nonexpansive mappings, and give an iteration process for families of quasi-nonexpansive and general extended mappings. The existence of common fixed points for these processes in Hilbert spaces is also studied.

  11. A new image enhancement algorithm with applications to forestry stand mapping

    NASA Technical Reports Server (NTRS)

    Kan, E. P. F. (Principal Investigator); Lo, J. K.

    1975-01-01

    The author has identified the following significant results. Results show that the new algorithm produced cleaner classification maps in which holes of small predesignated sizes were eliminated and significant boundary information was preserved. These cleaner post-processed maps better resemble real-life timber stand maps and are thus more usable products than the maps before post-processing. Compared to an accepted neighbor-checking post-processing technique, the new algorithm is more appropriate for timber stand mapping.
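
    As a simplified stand-in for the hole-elimination step described above (the report's algorithm additionally preserves boundary information), the following sketch removes holes and speckles below a predesignated size from a binary class map; the file name and size thresholds are assumptions.

    ```python
    # Minimal sketch: eliminate holes and speckles smaller than a predesignated
    # size from a binary class map. A generic morphological clean-up, not the
    # boundary-preserving algorithm described in the report.
    import numpy as np
    from skimage.morphology import remove_small_holes, remove_small_objects

    class_map = np.load("timber_class_map.npy").astype(bool)    # hypothetical input

    cleaned = remove_small_holes(class_map, area_threshold=32)  # fill small holes
    cleaned = remove_small_objects(cleaned, min_size=32)        # drop small isolated patches
    ```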

  12. Neural network-based multiple robot simultaneous localization and mapping.

    PubMed

    Saeedi, Sajad; Paull, Liam; Trentini, Michael; Li, Howard

    2011-12-01

    In this paper, a decentralized platform for simultaneous localization and mapping (SLAM) with multiple robots is developed. Each robot performs single robot view-based SLAM using an extended Kalman filter to fuse data from two encoders and a laser ranger. To extend this approach to multiple robot SLAM, a novel occupancy grid map fusion algorithm is proposed. Map fusion is achieved through a multistep process that includes image preprocessing, map learning (clustering) using neural networks, relative orientation extraction using norm histogram cross correlation and a Radon transform, relative translation extraction using matching norm vectors, and then verification of the results. The proposed map learning method is a process based on the self-organizing map. In the learning phase, the obstacles of the map are learned by clustering the occupied cells of the map into clusters. The learning is an unsupervised process which can be done on the fly without any need to have output training patterns. The clusters represent the spatial form of the map and make further analyses of the map easier and faster. Also, clusters can be interpreted as features extracted from the occupancy grid map so the map fusion problem becomes a task of matching features. Results of the experiments from tests performed on a real environment with multiple robots prove the effectiveness of the proposed solution.
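
    One step of the fusion pipeline extracts relative orientation using norm histogram cross correlation and a Radon transform. The sketch below follows only the general idea (comparing per-angle Radon projections of two occupancy grids to estimate their relative rotation); the inputs are hypothetical and the formulation is not the paper's exact method.

    ```python
    # Rough sketch: estimate the relative rotation between two occupancy grids
    # by circularly cross-correlating per-angle energies of their Radon
    # transforms. Illustrates the general idea only, not the exact algorithm.
    import numpy as np
    from skimage.transform import radon

    def angle_signature(grid):
        """Per-angle energy of the Radon transform of an occupancy grid."""
        theta = np.arange(180.0)
        sinogram = radon(grid.astype(float), theta=theta, circle=False)
        return sinogram.var(axis=0)                  # one value per projection angle

    def relative_rotation(grid_a, grid_b):
        sig_a, sig_b = angle_signature(grid_a), angle_signature(grid_b)
        scores = [np.dot(sig_a, np.roll(sig_b, s)) for s in range(180)]
        return float(np.argmax(scores))              # rotation estimate in degrees (mod 180)

    map_a = np.load("robot_a_grid.npy")              # hypothetical occupancy grids
    map_b = np.load("robot_b_grid.npy")
    print("relative orientation ~", relative_rotation(map_a, map_b), "deg")
    ```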

  13. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjević-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as background in geographical information systems. The process of continuous raster map creation using the MapEdit "mosaicking" function is also described, as well as the georeferencing and rectification algorithms used in MapEdit. Our approach for georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest-neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps with satisfactory quality for many purposes (±1 pixel). Quality assessment of several continuous raster maps at different scales that have been created using our software and methodology has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: US Standard for Digital Cartographic Data, National Standard for Spatial Data Accuracy and US National Map Accuracy Standard. The results obtained during the quality assessment process are given in the paper and show that our maps meet all three standards.
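
    The georeferencing step pairs control points in the scan with known map coordinates and applies linear transformations with nearest-neighbour resampling. The sketch below fits a single affine transform to four control points by least squares as a generic stand-in; the coordinates are invented and the scheme is not MapEdit's exact two-transformation approach.

    ```python
    # Minimal sketch: least-squares affine pixel-to-map transform from control
    # points, plus nearest-neighbour sampling. A generic stand-in, not the two
    # linear transformations MapEdit applies per scanned map part.
    import numpy as np

    # Hypothetical control points: (col, row) in the scan and (x, y) on the map.
    pixel_pts = np.array([[10, 20], [980, 25], [975, 1180], [12, 1175]], float)
    map_pts = np.array([[500000, 4800000], [505000, 4800000],
                        [505000, 4794000], [500000, 4794000]], float)

    # Solve [col, row, 1] @ A = [x, y] in the least-squares sense.
    A, *_ = np.linalg.lstsq(np.hstack([pixel_pts, np.ones((4, 1))]),
                            map_pts, rcond=None)

    def pixel_to_map(col, row):
        return np.array([col, row, 1.0]) @ A

    def nearest_neighbour_sample(image, col, row):
        # Nearest-neighbour resampling: take the closest source pixel.
        return image[int(round(row)), int(round(col))]
    ```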

  14. Uncertainties in ecosystem service maps: a comparison on the European scale.

    PubMed

    Schulp, Catharina J E; Burkhard, Benjamin; Maes, Joachim; Van Vliet, Jasper; Verburg, Peter H

    2014-01-01

    Safeguarding the benefits that ecosystems provide to society is increasingly included as a target in international policies. To support such policies, ecosystem service maps are made. However, little attention is paid to the accuracy of these maps. We conducted a systematic review and quantitative comparison of ecosystem service maps on the European scale to generate insights into the uncertainty of ecosystem service maps and discuss the possibilities for quantitative validation. Maps of climate regulation and recreation were reasonably similar, while large uncertainties among maps of erosion protection and flood regulation were observed. Pollination maps had a moderate similarity. Differences among the maps were caused by differences in indicator definition, level of process understanding, mapping aim, data sources and methodology. The absence of suitable observed data on ecosystem service provisioning hampers independent validation of the maps. Consequently, there are, so far, no accurate measures for ecosystem service map quality. Policy makers and other users need to be cautious when applying ecosystem service maps for decision-making. The results illustrate the need for better process understanding and data acquisition to advance ecosystem service mapping, modelling and validation.

  15. Land cover mapping for development planning in Eastern and Southern Africa

    NASA Astrophysics Data System (ADS)

    Oduor, P.; Flores Cordova, A. I.; Wakhayanga, J. A.; Kiema, J.; Farah, H.; Mugo, R. M.; Wahome, A.; Limaye, A. S.; Irwin, D.

    2016-12-01

    Africa continues to experience intensification of land use, driven by competition for resources and a growing population. Land cover maps are some of the fundamental datasets required by numerous stakeholders to inform a number of development decisions. For instance, they can be integrated with other datasets to create value added products such as vulnerability impact assessment maps, and natural capital accounting products. In addition, land cover maps are used as inputs into Greenhouse Gas (GHG) inventories to inform the Agriculture, Forestry and other Land Use (AFOLU) sector. However, the processes and methodologies of creating land cover maps consistent with international and national land cover classification schemes can be challenging, especially in developing countries where skills, hardware and software resources can be limiting. To meet this need, SERVIR Eastern and Southern Africa developed methodologies and stakeholder engagement processes that led to a successful initiative in which land cover maps for 9 countries (Malawi, Rwanda, Namibia, Botswana, Lesotho, Ethiopia, Uganda, Zambia and Tanzania) were developed, using 2 major classification schemes. The first sets of maps were developed based on an internationally acceptable classification system, while the second sets of maps were based on a nationally defined classification system. The mapping process benefited from reviews from national experts and also from technical advisory groups. The maps have found diverse uses, among them the definition of the Forest Reference Levels in Zambia. In Ethiopia, the maps have been endorsed by the national mapping agency as part of national data. The data for Rwanda is being used to inform the Natural Capital Accounting process, through the WAVES program, a World Bank Initiative. This work illustrates the methodologies and stakeholder engagement processes that brought success to this land cover mapping initiative.

  16. Does the process map influence the outcome of quality improvement work? A comparison of a sequential flow diagram and a hierarchical task analysis diagram.

    PubMed

    Colligan, Lacey; Anderson, Janet E; Potts, Henry W W; Berman, Jonathan

    2010-01-07

    Many quality and safety improvement methods in healthcare rely on a complete and accurate map of the process. Process mapping in healthcare is often achieved using a sequential flow diagram, but there is little guidance available in the literature about the most effective type of process map to use. Moreover, there is evidence that the organisation of information in an external representation affects reasoning and decision making. This exploratory study examined whether the type of process map - sequential or hierarchical - affects healthcare practitioners' judgments. A sequential and a hierarchical process map of a community-based anticoagulation clinic were produced based on data obtained from interviews, talk-throughs, attendance at a training session and examination of protocols and policies. Clinic practitioners were asked to specify the parts of the process that they judged to contain quality and safety concerns. The process maps were then shown to them in counter-balanced order and they were asked to circle on the diagrams the parts of the process where they had the greatest quality and safety concerns. A structured interview was then conducted, in which they were asked about various aspects of the diagrams. Quality and safety concerns cited by practitioners differed depending on whether they were or were not looking at a process map, and whether they were looking at a sequential diagram or a hierarchical diagram. More concerns were identified using the hierarchical diagram compared with the sequential diagram and more concerns were identified in relation to clinical work than administrative work. Participants' preference for the sequential or hierarchical diagram depended on the context in which they would be using it. The difficulties of determining the boundaries for the analysis and the granularity required were highlighted. The results indicated that the layout of a process map does influence perceptions of quality and safety problems in a process. In quality improvement work, it is important to carefully consider the type of process map to be used and to consider using more than one map to ensure that different aspects of the process are captured.

  17. Multiple Concurrent Visual-Motor Mappings: Implications for Models of Adaptation

    NASA Technical Reports Server (NTRS)

    Cunningham, H. A.; Welch, Robert B.

    1994-01-01

    Previous research on adaptation to visual-motor rearrangement suggests that the central nervous system accurately represents only 1 visual-motor mapping at a time. This idea was examined in 3 experiments where subjects tracked a moving target under repeated alternations between 2 initially interfering mappings (the 'normal' mapping characteristic of computer input devices and a 108° rotation of the normal mapping). Alternation between the 2 mappings led to significant reduction in error under the rotated mapping and significant reduction in the adaptation aftereffect ordinarily caused by switching between mappings. Color as a discriminative cue, interference versus decay in adaptation aftereffect, and intermanual transfer were also examined. The results reveal a capacity for multiple concurrent visual-motor mappings, possibly controlled by a parametric process near the motor output stage of processing.

  18. Hybrid optical acoustic seafloor mapping

    NASA Astrophysics Data System (ADS)

    Inglis, Gabrielle

    The oceanographic research and industrial communities have a persistent demand for detailed three-dimensional seafloor maps that convey both shape and texture. Such data products are used for archeology, geology, ship inspection, biology, and habitat classification. There are a variety of sensing modalities and processing techniques available to produce these maps, and each has its own potential benefits and related challenges. Multibeam sonar and stereo vision are two such sensors with complementary strengths, making them ideally suited for data fusion. Data fusion approaches, however, have seen only limited application to underwater mapping, and there are no established methods for creating hybrid, 3D reconstructions from two underwater sensing modalities. This thesis develops a processing pipeline to synthesize hybrid maps from multi-modal survey data. It is helpful to think of this processing pipeline as having two distinct phases: Navigation Refinement and Map Construction. This thesis extends existing work in underwater navigation refinement by incorporating methods which increase measurement consistency between both multibeam and camera. The result is a self-consistent 3D point cloud composed of camera and multibeam measurements. In the map construction phase, a subset of the multi-modal point cloud retaining the best characteristics of each sensor is selected to be part of the final map. To quantify the desired traits of a map, several characteristics of a useful map are distilled into specific criteria. The different ways that hybrid maps can address these criteria provide justification for producing them as an alternative to current methodologies. The processing pipeline implements multi-modal data fusion and outlier rejection with emphasis on different aspects of map fidelity. The resulting point cloud is evaluated in terms of how well it addresses the map criteria. The final hybrid maps retain the strengths of both sensors and show significant improvement over the single-modality maps and naively assembled multi-modal maps.

  19. Cognitive Processes in Orienteering: A Review.

    ERIC Educational Resources Information Center

    Seiler, Roland

    1996-01-01

    Reviews recent research on information processing and decision making in orienteering. The main cognitive demands investigated were selection of relevant map information for route choice, comparison between map and terrain in map reading and in relocation, and quick awareness of mistakes. Presents a model of map reading based on results. Contains…

  20. How does creating a concept map affect item-specific encoding?

    PubMed

    Grimaldi, Phillip J; Poston, Laurel; Karpicke, Jeffrey D

    2015-07-01

    Concept mapping has become a popular learning tool. However, the processes underlying the task are poorly understood. In the present study, we examined the effect of creating a concept map on the processing of item-specific information. In 2 experiments, subjects learned categorized or ad hoc word lists by making pleasantness ratings, sorting words into categories, or creating a concept map. Memory was tested using a free recall test and a recognition memory test, which is considered to be especially sensitive to item-specific processing. Typically, tasks that promote item-specific processing enhance free recall of categorized lists, relative to category sorting. Concept mapping resulted in lower recall performance than both the pleasantness rating and category sorting condition for categorized words. Moreover, concept mapping resulted in lower recognition memory performance than the other 2 tasks. These results converge on the conclusion that creating a concept map disrupts the processing of item-specific information. (c) 2015 APA, all rights reserved.

  1. Middle-School Students' Map Construction: Understanding Complex Spatial Displays.

    ERIC Educational Resources Information Center

    Bausmith, Jennifer Merriman; Leinhardt, Gaea

    1998-01-01

    Examines the map-making process of middle-school students to determine which actions influence their accuracy, how prior knowledge helps their map construction, and what lessons can be learned from map making. Indicates that instruction that focuses on recognition of interconnections between map elements can promote map reasoning skills. (DSK)

  2. Symbolic, Nonsymbolic and Conceptual: An Across-Notation Study on the Space Mapping of Numerals.

    PubMed

    Zhang, Yu; You, Xuqun; Zhu, Rongjuan

    2016-07-01

    Previous studies have suggested that there are interconnections between the two numeral modalities of symbolic notation and nonsymbolic notation (arrays of dots); both differences and similarities in the processing and representation of the two modalities have been found in previous research. However, whether there are differences between the spatial representation and numeral-space mapping of these two numeral modalities remains uninvestigated. The present study aims to examine whether such differences exist; in particular, how zero, as both a symbolic magnitude numeral and a nonsymbolic conceptual numeral, maps onto space, and whether the mapping happens automatically at an early stage of numeral information processing. Results of the two experiments demonstrate that the low-level processing of symbolic numerals including zero, and of nonsymbolic numerals except zero, can map onto space, whereas the low-level processing of nonsymbolic zero as a semantic conceptual numeral cannot, indicating the special status of zero in the numeral domain. The present study thus indicates that the processing of non-semantic numerals can map onto space, whereas that of semantic conceptual numerals cannot. © The Author(s) 2016.

  3. Prioritizing Seafloor Mapping for Washington’s Pacific Coast

    PubMed Central

    Battista, Timothy; Buja, Ken; Christensen, John; Hennessey, Jennifer; Lassiter, Katrina

    2017-01-01

    Remote sensing systems are critical tools used for characterizing the geological and ecological composition of the seafloor. However, creating comprehensive and detailed maps of ocean and coastal environments has been hindered by the high cost of operating ship- and aircraft-based sensors. While a number of groups (e.g., academic research, government resource management, and private sector) are engaged in or would benefit from the collection of additional seafloor mapping data, disparate priorities, dauntingly large data gaps, and insufficient funding have confounded strategic planning efforts. In this study, we addressed these challenges by implementing a quantitative, spatial process to facilitate prioritizing seafloor mapping needs in Washington State. The Washington State Prioritization Tool (WASP), a custom web-based mapping tool, was developed to solicit and analyze mapping priorities from each participating group. The process resulted in the identification of several discrete, high priority mapping hotspots. As a result, several of the areas have been or will be subsequently mapped. Furthermore, information captured during the process about the intended application of the mapping data was paramount for identifying the optimum remote sensing sensors and acquisition parameters to use during subsequent mapping surveys. PMID:28350338

  4. Experiments to Distribute Map Generalization Processes

    NASA Astrophysics Data System (ADS)

    Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas

    2018-05-01

    Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make them scalable and usable in practice. But map generalization is a highly contextual process, and the surroundings of a generalized map feature need to be known to generalize the feature, which is a problem as distribution might partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate the past propositions to distribute map generalization, and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective at taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results as it better integrates the geographical context.

  5. A qualitative enquiry into OpenStreetMap making

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Wei

    2011-04-01

    Based on a case study on the OpenStreetMap community, this paper provides a contextual and embodied understanding of the user-led, user-participatory and user-generated produsage phenomenon. It employs Grounded Theory, Social Worlds Theory, and qualitative methods to illuminate and explore the produsage processes of OpenStreetMap making, and how knowledge artefacts such as maps can be collectively and collaboratively produced by a community of people, who are situated in different places around the world but engaged with the same repertoire of mapping practices. The empirical data illustrate that OpenStreetMap itself acts as a boundary object that enables actors from different social worlds to co-produce the Map through interacting with each other and negotiating the meanings of mapping, the mapping data and the Map itself. The discourses also show that unlike traditional maps that black-box cartographic knowledge and offer a single dominant perspective of cities or places, OpenStreetMap is an embodied epistemic object that embraces different world views. The paper also explores how contributors build their identities as an OpenStreetMapper alongside some other identities they have. Understanding the identity-building process helps to understand mapping as an embodied activity with emotional, cognitive and social repertoires.

  6. Mapping wildland fuels for fire management across multiple scales: integrating remote sensing, GIS, and biophysical modeling

    USGS Publications Warehouse

    Keane, Robert E.; Burgan, Robert E.; Van Wagtendonk, Jan W.

    2001-01-01

    Fuel maps are essential for computing spatial fire hazard and risk and simulating fire growth and intensity across a landscape. However, fuel mapping is an extremely difficult and complex process requiring expertise in remotely sensed image classification, fire behavior, fuels modeling, ecology, and geographical information systems (GIS). This paper first presents the challenges of mapping fuels: canopy concealment, fuelbed complexity, fuel type diversity, fuel variability, and fuel model generalization. Then, four approaches to mapping fuels are discussed with examples provided from the literature: (1) field reconnaissance; (2) direct mapping methods; (3) indirect mapping methods; and (4) gradient modeling. A fuel mapping method is proposed that uses current remote sensing and image processing technology. Future fuel mapping needs are also discussed which include better field data and fuel models, accurate GIS reference layers, improved satellite imagery, and comprehensive ecosystem models.

  7. Speech processing using maximum likelihood continuity mapping

    DOEpatents

    Hogden, John E.

    2000-01-01

    Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.

  8. Speech processing using maximum likelihood continuity mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, J.E.

    Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.

  9. Coming To Know: The Role of the Concept Map--Mirror, Assistant, Master?

    ERIC Educational Resources Information Center

    McAleese, Ray

    This paper explains the process of creating and managing concept maps, using reflection as a focus for its argument. Section 1, What is a Concept Map?, highlights the background and definition of concept mapping, explains how maps signify virtual conceptual structures, looks at structural knowledge, provides an example of a concept map, and…

  10. Cooperative studies between the United States of America and the People's Republic of China on applications of remote sensing to surveying and mapping

    USGS Publications Warehouse

    Lauer, Donald T.; Chu, Liangcai

    1992-01-01

    A Protocol established between the National Bureau of Surveying and Mapping, People's Republic of China (PRC) and the U.S. Geological Survey, United States of America (US), resulted in the exchange of scientific personnel, technical training, and exploration of the processing of remotely sensed data. These activities were directed toward the application of remotely sensed data to surveying and mapping. Data were processed and various products were generated for the Black Hills area in the US and the Ningxiang area of the PRC. The results of these investigations defined applicable processes in the creation of satellite image maps, land use maps, and the use of ancillary data for further map enhancements.

  11. Suitability of aero-geophysical methods for generating conceptual soil maps and their use in the modeling of process-related susceptibility maps

    NASA Astrophysics Data System (ADS)

    Tilch, Nils; Römer, Alexander; Jochum, Birgit; Schattauer, Ingrid

    2014-05-01

    In recent years, Austria has several times experienced large-scale disasters characterized not only by flooding but also by numerous shallow landslides and debris flows. Therefore, for the purpose of risk prevention, national and regional authorities require more objective and realistic maps with information about the spatially variable susceptibility of the geosphere to hazard-relevant gravitational mass movements. Many proven methods and models (e.g. neural networks, logistic regression, heuristic methods) are available to create such process-related susceptibility maps (e.g. for shallow gravitational mass movements in soil). However, numerous national and international studies show that the suitability of a method depends on the quality of the process data and parameter maps (e.g. Tilch & Schwarz 2011, Schwarz & Tilch 2011). It is therefore important that maps with detailed and process-oriented information on the process-relevant geosphere are also considered. One major disadvantage is that area-wide process-relevant information exists only occasionally. Similarly, in Austria soil maps are often available only for treeless areas. However, in almost all previous studies, whatever geological and geotechnical maps happened to exist were used, and these had often been adapted to other issues and objectives. This is one reason why conceptual soil maps must very often be derived from geological maps containing only hard-rock information, which are often of rather low quality. Based on such maps, for example, adjacent areas of different geological composition and process-relevant physical properties are delineated with razor-sharp boundaries, which rarely occurs in nature. In order to obtain more realistic information about the spatial variability of the process-relevant geosphere (soil cover) and its physical properties, aerogeophysical measurements (electromagnetic, radiometric) carried out by helicopter in different regions of Austria were interpreted. Previous studies show that, especially with radiometric measurements, the two-dimensional spatial variability of the process-relevant soil close to the surface can be determined. In addition, the electromagnetic measurements are important for obtaining three-dimensional information on the deeper geological conditions and for improving the area-specific geological knowledge and understanding. The validation of these measurements is done with terrestrial geoelectrical measurements. Both aspects, radiometric and electromagnetic measurements, are therefore important, and the interpretation of the geophysical results can subsequently be used as parameter maps in the modeling of more realistic susceptibility maps for various processes. Within this presentation, results of the geophysical measurements, the derived parameter maps, and first process-oriented susceptibility maps for gravitational soil mass movements will be presented. As an example, results obtained with a heuristic method in an area in Vorarlberg (western Austria) will be shown. References: Schwarz, L. & Tilch, N. (2011): Why are good process data so important for the modelling of landslide susceptibility maps? EGU Poster Session "Landslide hazard and risk assessment, and landslide management" (NH 3.6), Vienna. [http://www.geologie.ac.at/fileadmin/user_upload/dokumente/pdf/poster/poster_2011_egu_schwarz_tilch_1.pdf] Tilch, N. & Schwarz, L. (2011): Spatial and scale-dependent variability in data quality and their influence on susceptibility maps for gravitational mass movements in soil, modelled by heuristic method. EGU Poster Session "Landslide hazard and risk assessment, and landslide management" (NH 3.6), Vienna. [http://www.geologie.ac.at/fileadmin/user_upload/dokumente/pdf/poster/poster_2011_egu_tilch_schwarz.pdf]

  12. Implementing Dementia Care Mapping to develop person-centred care: results of a process evaluation within the Leben-QD II trial.

    PubMed

    Quasdorf, Tina; Riesner, Christine; Dichter, Martin Nikolaus; Dortmann, Olga; Bartholomeyczik, Sabine; Halek, Margareta

    2017-03-01

    To evaluate Dementia Care Mapping implementation in nursing homes. Dementia Care Mapping, an internationally applied method for supporting and enhancing person-centred care for people with dementia, must be successfully implemented into care practice for its effective use. Various factors influence the implementation of complex interventions such as Dementia Care Mapping; few studies have examined the specific factors influencing Dementia Care Mapping implementation. A convergent parallel mixed-methods design embedded in a quasi-experimental trial was used to assess Dementia Care Mapping implementation success and influential factors. From 2011-2013, nine nursing units in nine different nursing homes implemented either Dementia Care Mapping (n = 6) or a periodic quality of life measurement using the dementia-specific instrument QUALIDEM (n = 3). Diverse data (interviews, n = 27; questionnaires, n = 112; resident records, n = 81; and process documents) were collected. Each data set was separately analysed and then merged to comprehensively portray the implementation process. Four nursing units implemented the particular intervention without deviating from the preplanned intervention. Translating Dementia Care Mapping results into practice was challenging. Necessary organisational preconditions for Dementia Care Mapping implementation included well-functioning networks, a dementia-friendly culture and flexible organisational structures. Involved individuals' positive attitudes towards Dementia Care Mapping also facilitated implementation. Precisely planning the intervention and its implementation, recruiting champions who supported Dementia Care Mapping implementation and having well-qualified, experienced project coordinators were essential to the implementation process. For successful Dementia Care Mapping implementation, it must be embedded in a systematic implementation strategy considering the specific setting. Organisational preconditions may need to be developed before Dementia Care Mapping implementation. Necessary steps may include team building, developing and realising a person-centred care-based mission statement or educating staff regarding general dementia care. The implementation strategy may include attracting and involving individuals on different hierarchical levels in Dementia Care Mapping implementation and supporting staff to translate Dementia Care Mapping results into practice. The identified facilitating factors can guide Dementia Care Mapping implementation strategy development. © 2016 John Wiley & Sons Ltd.

  13. cudaMap: a GPU accelerated program for gene expression connectivity mapping.

    PubMed

    McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong

    2013-10-11

    Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
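
    Connectivity mapping repeatedly scores one gene signature against thousands of reference expression profiles (and permutations of them), and it is this embarrassingly parallel arithmetic that the GPU accelerates. The sketch below computes a simple signed, rank-based score with NumPy purely as an illustration; it is not the sscMap/cudaMap statistic, and the data are random placeholders.

    ```python
    # Illustrative sketch of a signed, rank-based connection score between a
    # query signature and many reference profiles. NOT the sscMap/cudaMap
    # statistic; it only shows the per-profile arithmetic a GPU can parallelize.
    import numpy as np

    rng = np.random.default_rng(0)
    n_genes, n_profiles = 10000, 500
    profiles = rng.standard_normal((n_profiles, n_genes))    # placeholder reference data

    # Query signature: gene indices expected up (+1) or down (-1) regulated.
    sig_genes = rng.choice(n_genes, size=50, replace=False)
    sig_signs = rng.choice([-1.0, 1.0], size=50)

    # Rank genes within each profile, centre the ranks, and sum the signed
    # ranks of the signature genes to get one connection score per profile.
    ranks = profiles.argsort(axis=1).argsort(axis=1)
    scores = (ranks - (n_genes - 1) / 2.0)[:, sig_genes] @ sig_signs

    candidates = np.argsort(scores)[:10]    # profiles most strongly reversing the signature
    ```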

  14. Development of Generation System of Simplified Digital Maps

    NASA Astrophysics Data System (ADS)

    Uchimura, Keiichi; Kawano, Masato; Tokitsu, Hiroki; Hu, Zhencheng

    In recent years, digital maps have been used in a variety of scenarios, including car navigation systems and map information services over the Internet. These digital maps are formed by multiple layers of maps of different scales; the map data most suitable for the specific situation are used. Currently, the production of map data of different scales is done by hand due to constraints related to processing time and accuracy. We conducted research concerning technologies for automatic generation of simplified map data from detailed map data. In the present paper, the authors propose the following: (1) a method to transform data related to streets, rivers, etc. containing widths into line data, (2) a method to eliminate the component points of the data, and (3) a method to eliminate data that lie below a certain threshold. In addition, in order to evaluate the proposed method, a user survey was conducted; in this survey we compared maps generated using the proposed method with the commercially available maps. From the viewpoint of the amount of data reduction and processing time, and on the basis of the results of the survey, we confirmed the effectiveness of the automatic generation of simplified maps using the proposed methods.
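
    The second of the proposed methods eliminates component points of line data. As an illustration of that kind of point elimination, here is a sketch of the classical Douglas-Peucker simplification under a distance tolerance; it is a standard technique offered for orientation only, not the authors' specific elimination rules.

    ```python
    # Sketch of polyline point elimination (Douglas-Peucker): drop interior
    # points closer than a tolerance to the chord. Offered as an illustration,
    # not the specific elimination rules proposed in the paper.
    import numpy as np

    def simplify(points, tol):
        points = np.asarray(points, float)
        if len(points) < 3:
            return points
        start, end = points[0], points[-1]
        dx, dy = end - start
        chord_len = np.hypot(dx, dy) or 1.0
        rel = points[1:-1] - start
        # Perpendicular distance of each interior point to the start-end chord.
        d = np.abs(dx * rel[:, 1] - dy * rel[:, 0]) / chord_len
        if d.max() <= tol:
            return np.vstack([start, end])            # all interior points eliminated
        i = int(np.argmax(d)) + 1                     # keep the farthest point, recurse
        return np.vstack([simplify(points[: i + 1], tol)[:-1],
                          simplify(points[i:], tol)])

    line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7)]
    print(simplify(line, tol=0.5))
    ```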

  15. Leveraging electronic health record documentation for Failure Mode and Effects Analysis team identification

    PubMed Central

    Carson, Matthew B; Lee, Young Ji; Benacka, Corrine; Mutharasan, R. Kannan; Ahmad, Faraz S; Kansal, Preeti; Yancy, Clyde W; Anderson, Allen S; Soulakis, Nicholas D

    2017-01-01

    Objective: Using Failure Mode and Effects Analysis (FMEA) as an example quality improvement approach, our objective was to evaluate whether secondary use of orders, forms, and notes recorded by the electronic health record (EHR) during daily practice can enhance the accuracy of process maps used to guide improvement. We examined discrepancies between expected and observed activities and individuals involved in a high-risk process and devised diagnostic measures for understanding discrepancies that may be used to inform quality improvement planning. Methods: Inpatient cardiology unit staff developed a process map of discharge from the unit. We matched activities and providers identified on the process map to EHR data. Using four diagnostic measures, we analyzed discrepancies between expectation and observation. Results: EHR data showed that 35% of activities were completed by unexpected providers, including providers from 12 categories not identified as part of the discharge workflow. The EHR also revealed sub-components of process activities not identified on the process map. Additional information from the EHR was used to revise the process map and show differences between expectation and observation. Conclusion: Findings suggest EHR data may reveal gaps in process maps used for quality improvement and identify characteristics about workflow activities that can identify perspectives for inclusion in an FMEA. Organizations with access to EHR data may be able to leverage clinical documentation to enhance process maps used for quality improvement. While focused on FMEA protocols, findings from this study may be applicable to other quality activities that require process maps. PMID:27589944

  16. Computer generated maps from digital satellite data - A case study in Florida

    NASA Technical Reports Server (NTRS)

    Arvanitis, L. G.; Reich, R. M.; Newburne, R.

    1981-01-01

    Ground cover maps are important tools to a wide array of users. Over the past three decades, much progress has been made in supplementing planimetric and topographic maps with ground cover details obtained from aerial photographs. The present investigation evaluates the feasibility of using computer maps of ground cover from satellite input tapes. Attention is given to the selection of test sites, a satellite data processing system, a multispectral image analyzer, general purpose computer-generated maps, the preliminary evaluation of computer maps, a test for areal correspondence, the preparation of overlays and acreage estimation of land cover types on the Landsat computer maps. There is every indication to suggest that digital multispectral image processing systems based on Landsat input data will play an increasingly important role in pattern recognition and mapping land cover in the years to come.

  17. Exploring the Interactive Patterns of Concept Map-Based Online Discussion: A Sequential Analysis of Users' Operations, Cognitive Processing, and Knowledge Construction

    ERIC Educational Resources Information Center

    Wu, Sheng-Yi; Chen, Sherry Y.; Hou, Huei-Tse

    2016-01-01

    Concept maps can be used as a cognitive tool to assist learners' knowledge construction. However, in a concept map-based online discussion environment, studies that take into consideration learners' manipulative actions of composing concept maps, cognitive process among learners' discussion, and social knowledge construction at the same time are…

  18. Database for the geologic map of Upper Geyser Basin, Yellowstone National Park, Wyoming

    USGS Publications Warehouse

    Abendini, Atosa A.; Robinson, Joel E.; Muffler, L. J. Patrick; White, D. E.; Beeson, Melvin H.; Truesdell, A. H.

    2015-01-01

    This dataset contains contacts, geologic units, and map boundaries from Miscellaneous Investigations Series Map I-1371, "The geologic map of Upper Geyser Basin, Yellowstone National Park, Wyoming". This dataset was constructed to produce a digital geologic map as a basis for ongoing studies of hydrothermal processes.

  19. A low-frequency near-field interferometric-TOA 3-D Lightning Mapping Array

    NASA Astrophysics Data System (ADS)

    Lyu, Fanchao; Cummer, Steven A.; Solanki, Rahulkumar; Weinert, Joel; McTague, Lindsay; Katko, Alex; Barrett, John; Zigoneanu, Lucian; Xie, Yangbo; Wang, Wenqi

    2014-11-01

    We report on the development of an easily deployable LF near-field interferometric-time of arrival (TOA) 3-D Lightning Mapping Array applied to imaging of entire lightning flashes. An interferometric cross-correlation technique is applied in our system to compute windowed two-sensor time differences with submicrosecond time resolution before TOA is used for source location. Compared to previously reported LF lightning location systems, our system captures many more LF sources. This is due mainly to the improved mapping of continuous lightning processes by using this type of hybrid interferometry/TOA processing method. We show with five station measurements that the array detects and maps different lightning processes, such as stepped and dart leaders, during both in-cloud and cloud-to-ground flashes. Lightning images mapped by our LF system are remarkably similar to those created by VHF mapping systems, which may suggest some special links between LF and VHF emission during lightning processes.
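
    The first processing step named above, windowed two-sensor time differences from interferometric cross-correlation, can be sketched as follows on synthetic waveforms; the sampling rate, pulse shape, and delay are assumptions, and the real system then feeds such time differences into the TOA source-location solver.

    ```python
    # Bare-bones sketch: windowed cross-correlation between two sensors to get
    # the time difference that a TOA solver would use for source location.
    # Sampling rate, pulse shape, and delay are illustrative assumptions.
    import numpy as np
    from scipy.signal import correlate

    fs = 1.0e6                                         # samples per second (assumed)
    t = np.arange(4096) / fs
    pulse = np.exp(-((t - 1e-3) / 2e-5) ** 2)          # synthetic LF pulse
    sensor1 = pulse + 0.05 * np.random.randn(t.size)
    sensor2 = np.roll(pulse, 37) + 0.05 * np.random.randn(t.size)   # 37-sample delay

    # Cross-correlate one analysis window; the lag of the peak is the
    # two-sensor time difference (sub-sample accuracy needs interpolation).
    xc = correlate(sensor2, sensor1, mode="full")
    lag = int(np.argmax(xc)) - (sensor1.size - 1)
    print("time difference:", lag / fs * 1e6, "microseconds")
    ```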

  20. Toward an operational framework for fine-scale urban land-cover mapping in Wallonia using submeter remote sensing and ancillary vector data

    NASA Astrophysics Data System (ADS)

    Beaumont, Benjamin; Grippa, Tais; Lennert, Moritz; Vanhuysse, Sabine; Stephenne, Nathalie; Wolff, Eléonore

    2017-07-01

    Encouraged by the EU INSPIRE directive requirements and recommendations, the Walloon authorities, like other EU regional or national authorities, want to develop operational land-cover (LC) and land-use (LU) mapping methods using existing geodata. Urban planners and environmental monitoring stakeholders in Wallonia currently have to rely on outdated, mixed, and incomplete LC and LU information; the current reference map is 10 years old. Two object-based classification methods, one rule-based and one classifier-based, are compared for detailed regional urban LC mapping. The added value of using the different existing geospatial datasets in the process is assessed. This includes a comparison between satellite and aerial optical data in terms of mapping accuracy, visual quality of the map, costs, processing, data availability, and property rights. The combination of spectral, tridimensional, and vector data provides accuracy values close to 0.90 for mapping the LC into nine categories with a minimum mapping unit of 15 m2. Such a detailed LC map offers opportunities for fine-scale environmental and spatial planning activities. Still, the regional application poses challenges regarding automation, big data handling, and processing time, which are discussed.

  1. An alternative methionine aminopeptidase, MAP-A, is required for nitrogen starvation and high-light acclimation in the cyanobacterium Synechocystis sp. PCC 6803.

    PubMed

    Drath, Miriam; Baier, Kerstin; Forchhammer, Karl

    2009-05-01

    Methionine aminopeptidases (MetAPs or MAPs, encoded by map genes) are ubiquitous and pivotal enzymes for protein maturation in all living organisms. Whereas most bacteria harbour only one map gene, many cyanobacterial genomes contain two map paralogues, and the genome of Synechocystis sp. PCC 6803 even contains three. The physiological function of multiple map paralogues has so far remained elusive. This communication reports for the first time differential MetAP function in a cyanobacterium. In Synechocystis sp. PCC 6803, the universally conserved mapC gene (sll0555) is predominantly expressed in exponentially growing cells and appears to be a housekeeping gene. By contrast, expression of the mapA (slr0918) and mapB (slr0786) genes increases during stress conditions. The mapB paralogue is only transiently expressed, whereas the widely distributed mapA gene appears to encode the major MetAP during stress conditions. A mapA-deficient Synechocystis mutant shows a subtle impairment of photosystem II properties even under non-stressed conditions. In particular, the binding site for the quinone Q(B) is affected, indicating specific N-terminal methionine processing requirements of photosystem II components. MAP-A-specific processing becomes essential under certain stress conditions, since the mapA-deficient mutant is severely impaired in surviving prolonged nitrogen starvation and high light exposure.

  2. Smartphone-based noise mapping: Integrating sound level meter app data into the strategic noise mapping process.

    PubMed

    Murphy, Enda; King, Eoin A

    2016-08-15

    The strategic noise mapping process of the EU has now been ongoing for more than ten years. However, although a significant volume of research has been conducted on the process and related issues, there has been little change or innovation in how relevant authorities and policymakers conduct the process since its inception. This paper reports on research undertaken to assess the possibility of integrating smartphone-based noise mapping data into the traditional strategic noise mapping process. We compare maps generated using the traditional approach with those generated using smartphone-based measurement data. The advantage of the latter approach is that it has the potential to remove the need for exhaustive input data to the source calculation model for noise prediction. In addition, the study tests the accuracy of smartphone-based measurements against simultaneous measurements taken using traditional sound level meters in the field.

  3. Managing mapping data using commercial data base management software.

    USGS Publications Warehouse

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow among them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper.

  4. Development of a competency mapping tool for undergraduate professional degree programmes, using mechanical engineering as a case study

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.

  5. Asymmetric neighborhood functions accelerate ordering process of self-organizing maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ota, Kaiichiro; Aoki, Takaaki; Kurata, Koji

    2011-02-15

    A self-organizing map (SOM) algorithm can generate a topographic map from a high-dimensional stimulus space to a low-dimensional array of units. Because a topographic map preserves neighborhood relationships between the stimuli, the SOM can be applied to certain types of information processing such as data visualization. During the learning process, however, topological defects frequently emerge in the map. The presence of defects tends to drastically slow down the formation of a globally ordered topographic map. To remove such topological defects, it has been reported that an asymmetric neighborhood function is effective, but only in the simple case of mapping one-dimensional stimuli to a chain of units. In this paper, we demonstrate that even when high-dimensional stimuli are used, the asymmetric neighborhood function is effective for both artificial and real-world data. Our results suggest that applying the asymmetric neighborhood function to the SOM algorithm improves the reliability of the algorithm. In addition, it enables processing of complicated, high-dimensional data by using this algorithm.
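
    The asymmetric neighborhood idea is easy to illustrate in code. The following is a minimal NumPy sketch of a single SOM update step with a Gaussian neighborhood whose center is shifted away from the winning unit along a chosen direction; the shift amount, learning rate, and grid size are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def som_update(weights, grid, x, lr=0.1, sigma=1.5, shift=0.5, direction=(1.0, 0.0)):
    """One SOM update with an asymmetric (shifted Gaussian) neighborhood.

    weights : (n_units, dim) codebook vectors
    grid    : (n_units, 2) unit coordinates on the low-dimensional array
    x       : (dim,) input stimulus
    shift   : displacement of the neighborhood center from the winner along
              `direction`; shift = 0 recovers the usual symmetric SOM.
    """
    # 1. Find the best-matching unit (winner) for the stimulus.
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))

    # 2. Shift the neighborhood center away from the winner.
    center = grid[bmu] + shift * np.asarray(direction)

    # 3. Gaussian neighborhood evaluated around the shifted center.
    d2 = np.sum((grid - center) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * sigma ** 2))

    # 4. Move every unit toward the stimulus, weighted by the neighborhood.
    weights += lr * h[:, None] * (x - weights)
    return weights

# Example: a 10x10 map learning random 3-D stimuli.
rng = np.random.default_rng(0)
grid = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
weights = rng.random((100, 3))
for _ in range(1000):
    weights = som_update(weights, grid, rng.random(3))
```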

  6. Application of Ifsar Technology in Topographic Mapping: JUPEM's Experience

    NASA Astrophysics Data System (ADS)

    Zakaria, Ahamad

    2018-05-01

    The application of Interferometric Synthetic Aperture Radar (IFSAR) in topographic mapping has increased during the past decades. This is due to the advantages that IFSAR technology offers in solving data acquisition problems in tropical regions. Unlike aerial photography, radar offers wave penetration through cloud cover, fog, and haze; as a consequence, images can be acquired free of defects caused by these natural phenomena. In Malaysia, the Department of Survey and Mapping Malaysia (JUPEM) has been utilizing IFSAR products since 2009 to update topographic maps at the 1:50,000 map scale. Orthorectified radar imagery (ORI), Digital Surface Models (DSM), and Digital Terrain Models (DTM) procured under the project have been further processed before the products are ingested into a revamped mapping workflow consisting of stereo and mono digitizing processes. The paper highlights the experience of the Department of Survey and Mapping Malaysia (DSMM/JUPEM) in using such technology to speed up mapping production.

  7. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    PubMed

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is a prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, such as ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
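
    The paper's aggregation model is not given in detail here, but one simple way to combine crowdsourced candidate mappings and measure their diversity is sketched below; the majority-vote rule and normalized-entropy diversity score are illustrative assumptions, not the CrowdMapping model itself.

```python
from collections import Counter
import math

def aggregate_mappings(candidates):
    """Aggregate crowdsourced terminology mappings for one local term.

    candidates : list of standard codes (e.g., ICD-10) proposed by different
                 organizations for the same local term.
    Returns the majority code, its support, and a diversity score
    (normalized entropy; 0 = full agreement, 1 = maximal disagreement).
    """
    counts = Counter(candidates)
    total = sum(counts.values())
    best_code, best_n = counts.most_common(1)[0]
    probs = [n / total for n in counts.values()]
    entropy = -sum(p * math.log(p) for p in probs)
    diversity = entropy / math.log(len(counts)) if len(counts) > 1 else 0.0
    return best_code, best_n / total, diversity

# Example: three hospitals agree on one ICD-10 code, one disagrees.
print(aggregate_mappings(["I10", "I10", "I10", "I15.0"]))
```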

  8. Mapping land use changes in the carboniferous region of Santa Catarina, report 2

    NASA Technical Reports Server (NTRS)

    Valeriano, D. D. (Principal Investigator); Bitencourtpereira, M. D.

    1983-01-01

    The techniques applied to MSS-LANDSAT data in the land-use mapping of the Criciuma region (Santa Catarina state, Brazil) are presented along with the results of a classification accuracy estimate tested on the resulting map. The digital processing of the MSS-LANDSAT data involves noise suppression, feature selection, and a hybrid classifier. The accuracy test is performed through comparison with aerial photographs of sampled points. The use of digital processing to map the classes agricultural lands, forest lands, and urban areas is recommended, while coal refuse areas should be mapped visually.

  9. a Conceptual Framework for Indoor Mapping by Using Grammars

    NASA Astrophysics Data System (ADS)

    Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.

    2017-09-01

    Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework that represents the layout principles of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of sensor data required by traditional reconstruction approaches. In addition, we present further details of the core modules of the framework. An example using the proposed framework is given to show the generation process of a semantic map. This framework is part of ongoing research toward an approach for reconstructing semantic maps.

  10. Preliminary surficial geologic map database of the Amboy 30 x 60 minute quadrangle, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2006-01-01

    The surficial geologic map database of the Amboy 30x60 minute quadrangle presents characteristics of surficial materials for an area approximately 5,000 km2 in the eastern Mojave Desert of California. This map consists of new surficial mapping conducted between 2000 and 2005, as well as compilations of previous surficial mapping. Surficial geology units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects occurring post-deposition, and, where appropriate, the lithologic nature of the material. The physical properties recorded in the database focus on those that drive hydrologic, biologic, and physical processes such as particle size distribution (PSD) and bulk density. This version of the database is distributed with point data representing locations of samples for both laboratory determined physical properties and semi-quantitative field-based information. Future publications will include the field and laboratory data as well as maps of distributed physical properties across the landscape tied to physical process models where appropriate. The database is distributed in three parts: documentation, spatial map-based data, and printable map graphics of the database. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data, a database 'readme' file, which describes the database contents, and FGDC metadata for the spatial map information. Spatial data are distributed as Arc/Info coverage in ESRI interchange (e00) format, or as tabular data in the form of DBF3-file (.DBF) file formats. Map graphics files are distributed as Postscript and Adobe Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.

  11. Mapping disease at an approximated individual level using aggregate data: a case study of mapping New Hampshire birth defects.

    PubMed

    Shi, Xun; Miller, Stephanie; Mwenda, Kevin; Onda, Akikazu; Reese, Judy; Onega, Tracy; Gui, Jiang; Karagas, Margret; Demidenko, Eugene; Moeschler, John

    2013-09-06

    Limited by data availability, most disease maps in the literature are for relatively large and subjectively-defined areal units, which are subject to problems associated with polygon maps. High resolution maps based on objective spatial units are needed to more precisely detect associations between disease and environmental factors. We propose to use a Restricted and Controlled Monte Carlo (RCMC) process to disaggregate polygon-level location data to achieve mapping aggregate data at an approximated individual level. RCMC assigns a random point location to a polygon-level location, in which the randomization is restricted by the polygon and controlled by the background (e.g., population at risk). RCMC allows analytical processes designed for individual data to be applied, and generates high-resolution raster maps. We applied RCMC to the town-level birth defect data for New Hampshire and generated raster maps at the resolution of 100 m. Besides the map of significance of birth defect risk represented by p-value, the output also includes a map of spatial uncertainty and a map of hot spots. RCMC is an effective method to disaggregate aggregate data. An RCMC-based disease mapping maximizes the use of available spatial information, and explicitly estimates the spatial uncertainty resulting from aggregation.
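
    As described, RCMC assigns each polygon-level case a random point location that is restricted to the polygon and controlled by the population at risk. A minimal raster-based sketch of that sampling step is shown below, assuming the town polygon and population surface have already been rasterized; in the full method this randomization would be repeated many times to build the p-value and uncertainty surfaces.

```python
import numpy as np

def rcmc_points(polygon_mask, population, n_cases, rng=None):
    """Place n_cases random points inside a polygon, weighted by population.

    polygon_mask : 2-D boolean array, True where the cell lies in the polygon
                   (the restriction).
    population   : 2-D array of population at risk per cell (the control).
    Returns an (n_cases, 2) array of (row, col) cell indices.
    """
    rng = rng or np.random.default_rng()
    rows, cols = np.nonzero(polygon_mask)
    weights = population[rows, cols].astype(float)
    weights /= weights.sum()
    idx = rng.choice(len(rows), size=n_cases, p=weights)
    return np.column_stack((rows[idx], cols[idx]))

# Example: 5 cases reported for a town polygon on a 100 m grid.
mask = np.zeros((50, 50), dtype=bool)
mask[10:40, 10:40] = True
pop = np.random.default_rng(1).integers(0, 20, size=(50, 50))
print(rcmc_points(mask, pop, n_cases=5))
```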

  12. Advanced Map For Real-Time Process Control

    NASA Astrophysics Data System (ADS)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of the data to be exchanged.

  13. Friction Mapping as a Tool for Measuring the Elastohydrodynamic Contact Running-in Process

    DTIC Science & Technology

    2015-10-01

    ARL-TR-7501, October 2015. US Army Research Laboratory technical report by Stephen Berkebile (Vehicle Technology Directorate), covering the period 1 January-30 June 2015: friction mapping as a tool for measuring the elastohydrodynamic contact running-in process.

  14. Making clinical case-based learning in veterinary medicine visible: analysis of collaborative concept-mapping processes and reflections.

    PubMed

    Khosa, Deep K; Volet, Simone E; Bolton, John R

    2014-01-01

    The value of collaborative concept mapping in assisting students to develop an understanding of complex concepts across a broad range of basic and applied science subjects is well documented. Less is known about students' learning processes that occur during the construction of a concept map, especially in the context of clinical cases in veterinary medicine. This study investigated the unfolding collaborative learning processes that took place in real-time concept mapping of a clinical case by veterinary medical students and explored students' and their teacher's reflections on the value of this activity. This study had two parts. The first part investigated the cognitive and metacognitive learning processes of two groups of students who displayed divergent learning outcomes in a concept mapping task. Meaningful group differences were found in their level of learning engagement in terms of the extent to which they spent time understanding and co-constructing knowledge along with completing the task at hand. The second part explored students' and their teacher's views on the value of concept mapping as a learning and teaching tool. The students' and their teacher's perceptions revealed congruent and contrasting notions about the usefulness of concept mapping. The relevance of concept mapping to clinical case-based learning in veterinary medicine is discussed, along with directions for future research.

  15. The use of concept mapping in measurement development and evaluation: Application and future directions.

    PubMed

    Rosas, Scott R; Ridings, John W

    2017-02-01

    The past decade has seen an increase in measurement development research in the social and health sciences featuring the use of concept mapping as a core technique. The purpose, application, and utility of concept mapping have varied across this emerging literature. Despite the variety of uses and range of outputs, little has been done to critically review how researchers have approached the application of concept mapping in the measurement development and evaluation process. This article reviews the current state of practice regarding the use of concept mapping as a methodological tool in this process. We systematically reviewed 23 scale or measure development and evaluation studies, and detail the application of concept mapping in the context of traditional measurement development and psychometric testing processes. Although several limitations surfaced, we also found notable strengths in the contemporary application of the method. We determined that concept mapping (a) provides a solid method for establishing content validity, (b) facilitates researcher decision-making, (c) offers insight into target population perspectives that are integrated a priori, and (d) establishes a foundation for analytical and interpretative choices. Based on these results, we outline how concept mapping can be situated in the measurement development and evaluation processes for new instrumentation.

  16. Software for Verifying Image-Correlation Tie Points

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Yagi, Gary

    2008-01-01

    A computer program enables assessment of the quality of tie points in the image-correlation processes of the software described in the immediately preceding article. Tie points are computed in mappings between corresponding pixels in the left and right images of a stereoscopic pair. The mappings are sometimes not perfect because image data can be noisy and parallax can cause some points to appear in one image but not the other. The present computer program relies on the availability of a left-to-right correlation map in addition to the usual right-to-left correlation map. The additional map must be generated, which doubles the processing time. Such increased time can now be afforded in the data-processing pipeline, since the time for map generation has been reduced from about 60 to 3 minutes by the parallelization discussed in the previous article; the parallel cluster processing therefore enabled this better science result. The first mapping is typically from a point (denoted by coordinates x,y) in the left image to a point (x',y') in the right image. The second mapping is from (x',y') in the right image to some point (x",y") in the left image. If (x,y) and (x",y") are identical, then the mapping is considered perfect. The perfect-match criterion can be relaxed by introducing an error window that allows for round-off error and a small amount of noise. The mapping procedure can be repeated until all points in each image not connected to points in the other image are eliminated, so that what remains are verified correlation data.
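
    The round-trip consistency check described above can be sketched compactly. The array layout, error window, and function name below are illustrative assumptions rather than the actual program's interface.

```python
import numpy as np

def verify_tie_points(lr_map, rl_map, tol=1.0):
    """Verify stereo correlation tie points by a round-trip consistency check.

    lr_map : (H, W, 2) array mapping each left-image pixel (x, y) to (x', y')
             in the right image (NaN where no match was found).
    rl_map : (H, W, 2) array mapping each right-image pixel back to the left image.
    tol    : error window in pixels; a tie point is kept if the round trip
             (x, y) -> (x', y') -> (x", y") lands within tol of (x, y).
    Returns a boolean (H, W) mask of verified tie points.
    """
    h, w, _ = lr_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xp, yp = lr_map[..., 0], lr_map[..., 1]
    valid = np.isfinite(xp) & np.isfinite(yp)

    # Look up the reverse mapping at the (rounded) right-image location.
    yi = np.clip(np.round(yp[valid]).astype(int), 0, h - 1)
    xi = np.clip(np.round(xp[valid]).astype(int), 0, w - 1)
    back = rl_map[yi, xi]                       # (x", y") for each valid pixel

    # Keep only points whose round-trip error falls inside the window.
    err = np.hypot(back[:, 0] - xs[valid], back[:, 1] - ys[valid])
    verified = np.zeros((h, w), dtype=bool)
    verified[valid] = np.isfinite(err) & (err <= tol)
    return verified
```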

  17. The Effectiveness of a Single Intervention of Computer-Aided Argument Mapping in a Marketing and a Financial Accounting Subject

    ERIC Educational Resources Information Center

    Carrington, Michal; Chen, Richard; Davies, Martin; Kaur, Jagjit; Neville, Benjamin

    2011-01-01

    An argument map visually represents the structure of an argument, outlining its informal logical connections and informing judgments as to its worthiness. Argument mapping can be augmented with dedicated software that aids the mapping process. Empirical evidence suggests that semester-length subjects using argument mapping along with dedicated…

  18. Functional-to-form mapping for assembly design automation

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Liu, W. M.; Shen, W. D.; Yang, D. Y.; Liu, T. T.

    2017-11-01

    Assembly-level function-to-form mapping is the most effective procedure towards design automation. The research work mainly includes the assembly-level function definitions, the product network model, and the two-step mapping mechanism. The function-to-form mapping is divided into two steps: the first-step mapping from function to behavior, and the second-step mapping from behavior to structure. After the first-step mapping, the three-dimensional transmission chain (or 3D sketch) is studied, and feasible design computing tools are developed. The mapping procedure is relatively easy to implement interactively but quite difficult to complete automatically, so manual, semi-automatic, automatic, and interactive modification of the mapping model are studied. A mechanical hand F-F mapping process is illustrated to verify the design methodologies.

  19. Contribution of radiation hybrids to genome mapping in domestic animals.

    PubMed

    Faraut, T; de Givry, S; Hitte, C; Lahbib-Mansais, Y; Morisson, M; Milan, D; Schiex, T; Servin, B; Vignal, A; Galibert, F; Yerle, M

    2009-01-01

    Radiation hybrid mapping emerged at the end of the 1990s as a successful and complementary approach to mapping genomes, essentially because of its ability to bridge the gaps between genetic and clone-based physical maps and, through comparative mapping approaches, between 'gene-rich' and 'gene-poor' maps. Since its early development in humans, radiation hybrid mapping has played a pivotal role in the process of mapping animal genomes, especially mammalian ones. We review here all the different steps involved in radiation hybrid mapping, from the constitution of panels to the construction of maps. A description of its contribution to whole genome maps, with a special emphasis on domestic animals, is also presented. Finally, current applications of radiation hybrid mapping in the context of whole genome assemblies are described.

  20. Mapping landscape corridors

    Treesearch

    Peter Vogt; Kurt H. Riitters; Marcin Iwanowski; Christine Estreguil; Jacek Kozak; Pierre Soille

    2007-01-01

    Corridors are important geographic features for biological conservation and biodiversity assessment. The identification and mapping of corridors is usually based on visual interpretations of movement patterns (functional corridors) or habitat maps (structural corridors). We present a method for automated corridor mapping with morphological image processing, and...

  1. ActionMap: A web-based software that automates loci assignments to framework maps.

    PubMed

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  2. ActionMap: a web-based software that automates loci assignments to framework maps

    PubMed Central

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-01-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  3. cudaMap: a GPU accelerated program for gene expression connectivity mapping

    PubMed Central

    2013-01-01

    Background: Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2 h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. Results: cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU-assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Conclusion: Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.

  4. Clustering of color map pixels: an interactive approach

    NASA Astrophysics Data System (ADS)

    Moon, Yiu Sang; Luk, Franklin T.; Yuen, K. N.; Yeung, Hoi Wo

    2003-12-01

    The demand for digital maps continues to grow as mobile electronic devices become more popular. Instead of creating the entire map from scratch, we may convert a scanned paper map into a digital one. Color clustering is the very first step of this conversion process. Currently, most existing clustering algorithms are fully automatic. They are fast and efficient but may not work well in map conversion because of the numerous ambiguities associated with printed maps. Here we introduce two interactive approaches for color clustering on the map: color clustering with pre-calculated index colors (PCIC) and color clustering with pre-calculated color ranges (PCCR). We also introduce a memory model that can enhance and integrate different image processing techniques for fine-tuning the clustering results. Problems and examples of the algorithms are discussed in the paper.
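
    The PCIC idea, assigning every pixel to the nearest color in a user-supplied index palette, can be sketched as follows; the palette handling and nearest-color rule here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def cluster_with_index_colors(image, index_colors):
    """Assign every pixel of a scanned map to the nearest pre-calculated index color.

    image        : (H, W, 3) RGB array of the scanned map.
    index_colors : (K, 3) array of user-supplied palette colors
                   (e.g., road red, water blue, contour brown, background white).
    Returns (labels, quantized): an (H, W) index map and the map redrawn
    with the palette colors.
    """
    pixels = image.reshape(-1, 3).astype(float)
    palette = np.asarray(index_colors, dtype=float)
    # Euclidean distance from every pixel to every palette color.
    # (For very large scans this matrix is big; process the pixels in chunks.)
    dists = np.linalg.norm(pixels[:, None, :] - palette[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    quantized = palette[labels].reshape(image.shape).astype(np.uint8)
    return labels.reshape(image.shape[:2]), quantized
```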

  5. Occupancy change detection system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2009-09-01

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes instructions for producing an occupancy grid map of an environment around the robot, scanning the environment to generate a current obstacle map relative to a current robot position, and converting the current obstacle map to a current occupancy grid map. The instructions also include processing each grid cell in the occupancy grid map. Within the processing of each grid cell, the instructions include comparing each grid cell in the occupancy grid map to a corresponding grid cell in the current occupancy grid map. For grid cells with a difference, the instructions include defining a change vector for each changed grid cell, wherein the change vector includes a direction from the robot to the changed grid cell and a range from the robot to the changed grid cell.
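
    A minimal sketch of the comparison step is given below: each cell of the stored occupancy grid is compared with the current one, and a change vector (range and bearing from the robot) is produced for every differing cell. The array layout, cell size, and dictionary output format are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def occupancy_changes(reference_grid, current_grid, robot_cell, cell_size=0.1):
    """Compare a stored occupancy grid with the current one and build a change
    vector (bearing and range from the robot) for every changed cell.

    reference_grid, current_grid : 2-D boolean arrays (True = occupied),
                                   already aligned to the same map frame.
    robot_cell : (row, col) of the robot in the grid.
    cell_size  : metres per grid cell.
    Returns a list of change records, one per changed cell.
    """
    changed = np.argwhere(reference_grid != current_grid)
    changes = []
    for r, c in changed:
        dr, dc = r - robot_cell[0], c - robot_cell[1]
        changes.append({
            "cell": (int(r), int(c)),
            "now_occupied": bool(current_grid[r, c]),
            "range_m": float(np.hypot(dr, dc) * cell_size),
            "bearing_rad": float(np.arctan2(dc, dr)),
        })
    return changes
```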

  6. MaMR: High-performance MapReduce programming model for material cloud applications

    NASA Astrophysics Data System (ADS)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With increasing data sizes in materials science, existing programming models no longer satisfy application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently based on a hybrid shared-memory BSP model. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework deliver effective performance improvements compared to previous work.
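
    The following sketch illustrates the general idea of running several Map/Reduce jobs concurrently and combining their outputs in an extra merge phase; it is a plain Python illustration of the concept, not the MaMR framework itself, and the dataset names and functions are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor
from collections import defaultdict

def run_mapreduce(records, map_fn, reduce_fn):
    """Plain MapReduce: map every record, group by key, reduce each group."""
    grouped = defaultdict(list)
    for record in records:
        for key, value in map_fn(record):
            grouped[key].append(value)
    return {k: reduce_fn(vs) for k, vs in grouped.items()}

def merge_phase(*partial_results):
    """Extra merge step: combine the outputs of several Map/Reduce jobs by key."""
    merged = defaultdict(dict)
    for name, result in partial_results:
        for key, value in result.items():
            merged[key][name] = value
    return dict(merged)

# Two related "material" datasets processed by different Map/Reduce pairs
# running concurrently, then merged on a shared key (the material id).
densities = [("steel", 7.85), ("steel", 7.80), ("al", 2.70)]
strengths = [("steel", 400.0), ("al", 90.0), ("al", 110.0)]

with ThreadPoolExecutor() as pool:
    job1 = pool.submit(run_mapreduce, densities, lambda r: [r], lambda v: sum(v) / len(v))
    job2 = pool.submit(run_mapreduce, strengths, lambda r: [r], lambda v: max(v))

print(merge_phase(("mean_density", job1.result()), ("max_strength", job2.result())))
```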

  7. From conceptual modeling to a map

    NASA Astrophysics Data System (ADS)

    Gotlib, Dariusz; Olszewski, Robert

    2018-05-01

    Nowadays almost every map is a component of an information system. The design and production of maps requires the use of specific rules for modeling information systems: conceptual, application, and data modelling. While analyzing the various stages of cartographic modeling, the authors ask at what stage of this process a map actually occurs. Can we say that the "life of the map" begins even before someone defines its form of presentation? This question is particularly important at a time when the number of new geoinformation products is increasing exponentially. Through an analysis of the theory of cartography and the relations of the discipline to other fields of knowledge, the authors attempt to define a few properties of cartographic modeling that distinguish the process from other methods of spatial modeling. Assuming that the map is a model of reality (created in the process of cartographic modeling supported by domain modeling), the article proposes an analogy between the process of cartographic modeling and the scheme of conceptual modeling presented in the ISO 19101 standard.

  8. Iterative framework radiation hybrid mapping

    USDA-ARS?s Scientific Manuscript database

    Building comprehensive radiation hybrid maps for large sets of markers is a computationally expensive process, since the basic mapping problem is equivalent to the traveling salesman problem. The mapping problem is also susceptible to noise, and as a result, it is often beneficial to remove markers ...

  9. Interactive Geophysical Mapping on the Web

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Hamburger, M.; Estey, L.; Weingroff, M.; Deardorff, R.; Holt, W.

    2002-12-01

    We have developed a set of interactive, web-based map utilities that make geophysical results accessible to a large number and variety of users. These tools provide access to pre-determined map regions via a simple HTML/JavaScript interface or to user-selectable areas using a Java interface to a Generic Mapping Tools (GMT) engine. Users can access a variety of maps, satellite images, and geophysical data at a range of spatial scales for the Earth and other planets of the solar system. Developed initially by UNAVCO for the study of global-scale geodynamic processes, the tools let users choose from a variety of base maps (satellite mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others) and then add a number of geographic and geophysical overlays, for example coastlines, political boundaries, rivers and lakes, NEIC earthquake and volcano locations, stress axes, and observed and model plate motion and deformation velocity vectors representing a compilation of 2933 geodetic measurements from around the world. The software design is flexible, allowing for the construction of special editions for different target audiences. Custom maps have been implemented for UNAVCO as the "Jules Verne Voyager" and "Voyager Junior", for the International Lithosphere Project's "Global Strain Rate Map", and for EarthScope Education and Outreach as "EarthScope Voyager Jr.". For the latter, a number of EarthScope-specific features have been added, including locations of proposed USArray (seismic), Plate Boundary Observatory (geodetic), and San Andreas Fault Observatory at Depth sites, plus detailed maps and geographically referenced examples of EarthScope-related scientific investigations. In addition, we are developing a website that incorporates background materials and curricular activities that encourage users to explore Earth processes. A cluster of map processing computers and nearly a terabyte of disk storage has been assembled to power the generation of interactive maps and provide space for a very large collection of map data. A portal to these map tools can be found at: http://jules.unavco.ucar.edu.

  10. Interagency Report: Astrogeology 58, television cartography

    USGS Publications Warehouse

    Batson, Raymond M.

    1973-01-01

    The purpose of this paper is to illustrate the processing of digital television pictures into base maps. In this context, a base map is defined as a pictorial representation of planetary surface morphology accurately reproduced on standard map projections. Topographic contour lines, albedo or geologic overprints may be superimposed on these base maps. The compilation of geodetic map controls, the techniques of mosaic compilation, computer processing and airbrush enhancement, and the compilation of contour lines are discussed elsewhere by the originators of these techniques. A bibliography of applicable literature is included for readers interested in more detailed discussions.

  11. Iconicity as structure mapping

    PubMed Central

    Emmorey, Karen

    2014-01-01

    Linguistic and psycholinguistic evidence is presented to support the use of structure-mapping theory as a framework for understanding effects of iconicity on sign language grammar and processing. The existence of structured mappings between phonological form and semantic mental representations has been shown to explain the nature of metaphor and pronominal anaphora in sign languages. With respect to processing, it is argued that psycholinguistic effects of iconicity may only be observed when the task specifically taps into such structured mappings. In addition, language acquisition effects may only be observed when the relevant cognitive abilities are in place (e.g. the ability to make structural comparisons) and when the relevant conceptual knowledge has been acquired (i.e. information key to processing the iconic mapping). Finally, it is suggested that iconicity is better understood as a structured mapping between two mental representations than as a link between linguistic form and human experience.

  12. Discrimination of fluoride and phosphate contamination in central Florida for analyses of environmental effects

    NASA Technical Reports Server (NTRS)

    Coker, A. E.; Marshall, R.; Thomson, F.

    1972-01-01

    A study was made of the spatial registration of fluoride and phosphate pollution parameters in central Florida by utilizing remote sensing techniques. Multispectral remote sensing data were collected over the area and processed to produce multispectral recognition maps. These processed data were used to map land areas and waters containing concentrations of fluoride and phosphate. Maps showing distribution of affected and unaffected vegetation were produced. In addition, the multispectral data were processed by single band radiometric slicing to produce radiometric maps used to delineate areas of high ultraviolet radiance, which indicates high fluoride concentrations. The multispectral parameter maps and radiometric maps in combination showed distinctive patterns, which are correlated with areas known to be affected by fluoride and phosphate contamination. These remote sensing techniques have the potential for regional use to assess the environmental impact of fluoride and phosphate wastes in central Florida.

  13. Drawing Road Networks with Mental Maps.

    PubMed

    Lin, Shih-Syun; Lin, Chao-Hung; Hu, Yan-Jhang; Lee, Tong-Yee

    2014-09-01

    Tourist and destination maps are thematic maps designed to represent specific themes. The road network topologies in these maps are generally more important than the geometric accuracy of the roads. A road network warping method is proposed to facilitate map generation and improve theme representation in maps. The basic idea is to deform a road network to match a user-specified mental map while an optimization process propagates the distortions originating from the road network warping. To generate a map, the proposed method includes algorithms for estimating road significance and for deforming a road network according to various geometric and aesthetic constraints. The proposed method can produce an iconic mark of a theme from a road network and satisfy a user-specified mental map. Therefore, the resulting map can serve as a tourist or destination map that not only provides visual aids for route planning and navigation tasks, but also visually emphasizes the presentation of a theme in a map for the purpose of advertising. In the experiments, the demonstrations of map generation show that our method enables map generation systems to generate deformed tourist and destination maps efficiently.

  14. Volunteer map data collection at the USGS

    USGS Publications Warehouse

    Eric, B. Wolf; Poore, Barbara S.; Caro, Holly K.; Matthews, Greg D.

    2011-01-01

    Since 1994, citizen volunteers have helped the U.S. Geological Survey (USGS) improve its topographic maps. Through the Earth Science Corps program, citizens were able to "adopt a quad" and collect new information and update existing map features. Until its conclusion in 2001, as many as 300 volunteers annotated paper maps which were incorporated into the USGS topographic-map revision process.

  15. Mapping the Stacks: Sustainability and User Experience of Animated Maps in Library Discovery Interfaces

    ERIC Educational Resources Information Center

    McMillin, Bill; Gibson, Sally; MacDonald, Jean

    2016-01-01

    Animated maps of the library stacks were integrated into the catalog interface at Pratt Institute and into the EBSCO Discovery Service interface at Illinois State University. The mapping feature was developed for optimal automation of the update process to enable a range of library personnel to update maps and call-number ranges. The development…

  16. Higher resolution satellite remote sensing and the impact on image mapping

    USGS Publications Warehouse

    Watkins, Allen H.; Thormodsgard, June M.

    1987-01-01

    Recent advances in the spatial, spectral, and temporal resolution of civil land remote sensing satellite data are presenting new opportunities for image mapping applications. The U.S. Geological Survey's experimental satellite image mapping program is evolving toward larger scale image map products with increased information content as a result of improved image processing techniques and increased resolution. Thematic mapper data are being used to produce experimental image maps at 1:100,000 scale that meet established U.S. and European map accuracy standards. The availability of high quality, cloud-free, 30-meter ground resolution multispectral data from the Landsat thematic mapper sensor, along with 10-meter ground resolution panchromatic and 20-meter ground resolution multispectral data from the recently launched French SPOT satellite, presents new cartographic and image processing challenges. The need to fully exploit these higher resolution data increases the complexity of processing the images into large-scale image maps. The removal of radiometric artifacts and noise prior to geometric correction can be accomplished by using a variety of image processing filters and transforms. Sensor modeling and image restoration techniques allow maximum retention of spatial and radiometric information. An optimum combination of spectral information and spatial resolution can be obtained by merging different sensor types. These processing techniques are discussed and examples are presented.
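
    One common way to merge the spectral information of a lower-resolution multispectral sensor with the spatial detail of a higher-resolution panchromatic band is a Brovey-style ratio transform, sketched below; this is an illustrative example of sensor merging in general and not necessarily the procedure used in the USGS program described here.

```python
import numpy as np

def brovey_merge(ms, pan):
    """Brovey-style merge of multispectral bands with a panchromatic band.

    ms  : (3, H, W) multispectral bands already resampled to the pan grid.
    pan : (H, W) higher-resolution panchromatic band.
    Each band is rescaled by the ratio of the pan intensity to the sum of the
    multispectral intensities, injecting spatial detail while preserving the
    relative spectral balance of the bands.
    """
    ms = ms.astype(float)
    intensity = ms.sum(axis=0) + 1e-6          # guard against division by zero
    return ms * (pan.astype(float) / intensity)[None, :, :]
```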

  17. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, and minor and trace elements can now be mapped very accurately thanks to the larger detector area and higher count rate detectors. Live X-ray imaging can now be performed with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps, including elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM), and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical back scatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to occur in the images, and are especially helpful for examining possible interface artefacts. The post-processing techniques used to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.
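
    Two of the simpler products mentioned above, elemental ratio maps and pixel-wise scatter diagrams, can be computed directly from a pair of quantitative element maps. A minimal sketch follows; the array shapes and the small epsilon guard are illustrative assumptions.

```python
import numpy as np

def ratio_map(map_a, map_b, eps=1e-9):
    """Elemental ratio map from two co-registered quantitative X-ray maps."""
    return map_a / (map_b + eps)

def scatter_pairs(map_a, map_b, mask=None):
    """Pixel-by-pixel (A, B) value pairs for an elemental scatter diagram.

    mask : optional boolean array selecting the pixels to include
           (e.g., a single chemical phase); defaults to the whole map.
    """
    if mask is None:
        mask = np.ones(map_a.shape, dtype=bool)
    return np.column_stack((map_a[mask], map_b[mask]))
```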

  18. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    USDA-ARS?s Scientific Manuscript database

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...

  19. Accessible maps for the color vision deficient observers: past and present knowledge and future possibilities

    NASA Astrophysics Data System (ADS)

    Kvitle, Anne Kristin

    2018-05-01

    Color is one of the visual variables in maps, serving an aesthetic purpose and acting as a guide for attention. Impaired color vision affects the ability to distinguish colors, which makes the task of decoding map colors difficult. Map reading is reported as a challenging task for these observers, especially when the size of the stimuli is small. The aim of this study is to review existing methods of map design for color vision deficient (CVD) users. A systematic review of the research literature and case studies of map design for CVD observers has been conducted in order to give an overview of current knowledge and future research challenges. In addition, relevant research on simulations of CVD and color image enhancement for these observers from other fields of industry is included. The study identified two main approaches: pre-processing by using accessible colors and post-processing by using enhancement methods. Some of the methods may be applied to maps, but they require tailoring of test images according to map type.

  20. Genome contact map explorer: a platform for the comparison, interactive visualization and analysis of genome contact maps

    PubMed Central

    Kumar, Rajendra; Sobhy, Haitham

    2017-01-01

    Hi-C experiments generate data in the form of large genome contact maps (Hi-C maps). These show that chromosomes are arranged in a hierarchy of three-dimensional compartments. But to understand how these compartments form and by how much they affect genetic processes such as gene regulation, biologists and bioinformaticians need efficient tools to visualize and analyze Hi-C data. However, this is technically challenging because these maps are big. In this paper, we remedy this problem, partly by implementing an efficient file format, and present the genome contact map explorer platform. Apart from tools to process Hi-C data, such as normalization methods and a programmable interface, we built a graphical interface that lets users browse, scroll, and zoom Hi-C maps to visually search for patterns in the Hi-C data. In the software, it is also possible to browse several maps simultaneously and plot related genomic data. The software is openly accessible to the scientific community.

  1. Areas activated during naturalistic reading comprehension overlap topological visual, auditory, and somatomotor maps

    PubMed Central

    2016-01-01

    Cortical mapping techniques using fMRI have been instrumental in identifying the boundaries of topological (neighbor-preserving) maps in early sensory areas. The presence of topological maps beyond early sensory areas raises the possibility that they might play a significant role in other cognitive systems, and that topological mapping might help to delineate areas involved in higher cognitive processes. In this study, we combine surface-based visual, auditory, and somatomotor mapping methods with a naturalistic reading comprehension task in the same group of subjects to provide a qualitative and quantitative assessment of the cortical overlap between sensory-motor maps in all major sensory modalities and reading processing regions. Our results suggest that cortical activation during naturalistic reading comprehension overlaps more extensively with topological sensory-motor maps than has been heretofore appreciated. Reading activation in regions adjacent to the occipital lobe and inferior parietal lobe almost completely overlaps visual maps, whereas a significant portion of frontal activation for reading in dorsolateral and ventral prefrontal cortex overlaps both visual and auditory maps. Even classical language regions in superior temporal cortex are partially overlapped by topological visual and auditory maps. By contrast, the main overlap with somatomotor maps is restricted to a small region on the anterior bank of the central sulcus near the border between the face and hand representations of M-I.

  2. Teaching children the structure of science

    NASA Astrophysics Data System (ADS)

    Börner, Katy; Palmer, Fileve; Davis, Julie M.; Hardy, Elisha; Uzzo, Stephen M.; Hook, Bryan J.

    2009-01-01

    Maps of the world are common in classroom settings. They are used to teach the juxtaposition of natural and political functions, mineral resources, political, cultural and geographical boundaries; occurrences of processes such as tectonic drift; spreading of epidemics; and weather forecasts, among others. Recent work in scientometrics aims to create a map of science encompassing our collective scholarly knowledge. Maps of science can be used to see disciplinary boundaries; the origin of ideas, expertise, techniques, or tools; the birth, evolution, merging, splitting, and death of scientific disciplines; the spreading of ideas and technology; emerging research frontiers and bursts of activity; etc. Just like the first maps of our planet, the first maps of science are neither perfect nor correct. Today's science maps are predominantly generated based on English scholarly data: techniques and procedures to achieve local and global accuracy of these maps are still being refined, and a visual language to communicate something as abstract and complex as science is still being developed. Yet, the maps are successfully used by institutions or individuals who can afford them to guide science policy decision making, economic decision making, or as visual interfaces to digital libraries. This paper presents the process and results of creating hands-on science maps for kids that teach children ages 4-14 about the structure of scientific disciplines. The maps were tested in both formal and informal science education environments. The results show that children can easily transfer their (world) map and concept map reading skills to utilize maps of science in interesting ways.

  3. Challenges and Opportunities: One Stop Processing of Automatic Large-Scale Base Map Production Using Airborne LIDAR Data Within GIS Environment. Case Study: Makassar City, Indonesia

    NASA Astrophysics Data System (ADS)

    Widyaningrum, E.; Gorte, B. G. H.

    2017-05-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographic base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating large-scale topographic base map provision by the Geospatial Information Agency in Indonesia. As a progressive, advanced technology, Geographic Information Systems (GIS) open possibilities for automatic geospatial data processing and analysis. Considering the further needs of spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach to base map provision. The quality of the automated topographic base map is assessed and analysed in terms of completeness, correctness, and quality, derived from a confusion matrix.
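
    Completeness, correctness, and quality are commonly computed from the true-positive, false-positive, and false-negative counts of a confusion matrix when the automated map is compared against reference data. The sketch below uses the standard per-class definitions, which may differ in detail from the measures applied in the paper, and the example counts are purely illustrative.

```python
def map_quality(true_positive, false_positive, false_negative):
    """Standard per-class quality measures for an extracted map layer.

    completeness (producer's accuracy / recall) : TP / (TP + FN)
    correctness  (user's accuracy / precision)  : TP / (TP + FP)
    quality                                     : TP / (TP + FP + FN)
    """
    completeness = true_positive / (true_positive + false_negative)
    correctness = true_positive / (true_positive + false_positive)
    quality = true_positive / (true_positive + false_positive + false_negative)
    return {"completeness": completeness, "correctness": correctness, "quality": quality}

# Example: building footprints extracted from LiDAR compared with reference data.
print(map_quality(true_positive=850, false_positive=60, false_negative=90))
```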

  4. User Preferences in Image Map Using

    NASA Astrophysics Data System (ADS)

    Vondráková, A.; Vozenilek, V.

    2016-06-01

    In the process of map making, attention is given to the resulting image map (it must be accurate, readable, and suited to its primary purpose) and to its user aspects. Current cartography understands user issues as all matters relating to user perception, map use, and user preferences. Most commercial cartographic production is strongly connected to economic circumstances: companies investigate users' interests and market demands. However, is it sufficient to focus only on users' preferences? Recent research on user aspects at Palacký University Olomouc addresses a much wider scope of user aspects. Users' preferences are often misleading: a user may find a particular image map pleasing, beautiful, and useful and want to buy it (or use it, depending on the form of map production), but when the same user is given the task of practically using that map (such as finding the shortest route), the user may conclude that the initially preferred map is useless and instead use a map that was rated worse according to his or her preferences. It is therefore necessary to evaluate not only the correctness of image maps and their aesthetics but also user perception and other user issues. For accomplishing such testing, eye-tracking technology is a useful tool. The research analysed how users read image maps and whether they prefer image maps over traditional maps. An eye-tracking experiment comparing conventional and image map reading was conducted: map readers were asked to solve a few simple tasks with either a conventional or an image map, and the readers' choice of map to solve the task was one of the investigated aspects of user preferences. Results demonstrate that user preferences and user needs are often quite different issues. The research outcomes show that it is crucial to implement map user testing into the cartographic production process.

  5. System Considerations and Challenges in 3D Mapping and Modeling Using Low-Cost Uav Systems

    NASA Astrophysics Data System (ADS)

    Lari, Z.; El-Sheimy, N.

    2015-08-01

    In the last few years, low-cost UAV systems have been acknowledged as an affordable technology for geospatial data acquisition that can meet the needs of a variety of traditional and non-traditional mapping applications. In spite of its proven potential, UAV-based mapping still lacks what is needed for it to become an accepted mapping tool. In other words, a well-designed system architecture that considers payload restrictions as well as the specifications of the utilized direct georeferencing component and imaging systems, in light of the required mapping accuracy and intended application, is still required. Moreover, efficient data processing workflows still need to be adopted that are capable of delivering the mapping products with the specified quality while considering the synergistic characteristics of the onboard sensors, the wide range of potential users who might lack deep knowledge of mapping activities, and the time constraints of emerging applications. Therefore, the challenges introduced by having low-cost imaging and georeferencing sensors onboard UAVs with limited payload capability, the necessity of efficient data processing techniques for delivering the required products for intended applications, and the diversity of potential users with insufficient mapping-related expertise need to be fully investigated and addressed by UAV-based mapping research efforts. This paper addresses these challenges and reviews system considerations, adaptive processing techniques, and quality assurance/quality control procedures for achieving accurate mapping products from these systems.

  6. Geodatabase model for global geologic mapping: concept and implementation in planetary sciences

    NASA Astrophysics Data System (ADS)

    Nass, Andrea

    2017-04-01

    One aim of the NASA Dawn mission is to generate global geologic maps of the asteroid Vesta and the dwarf planet Ceres. To accomplish this, the Dawn Science Team followed the technical recommendations for cartographic basemap production. The geological mapping campaign for Vesta has been completed and published, but mapping of the dwarf planet Ceres is still ongoing. The tiling schema for the geological mapping is the same for both planetary bodies; for Ceres it is divided into two parts: four overview quadrangles (Survey Orbit, 415 m/pixel) and 15 more detailed quadrangles (High Altitude Mapping Orbit, HAMO, 140 m/pixel). The first global geologic map was based on Survey images (415 m/pixel). The four combined Survey quadrangles, complemented by HAMO data, served as the basis for generating a more detailed view of the geologic history and for defining the chronostratigraphy and time scale of the dwarf planet. The most detailed view can be expected within the 15 mapping quadrangles based on HAMO resolution and complemented by Low Altitude Mapping Orbit (LAMO) data with 35 m/pixel. One responsible mapper was assigned to the interpretative mapping of each quadrangle. Unifying the geological mapping of each quadrangle and bringing it together into regionally and globally valid statements is already a very time-intensive task. A further challenge, however, is to consider how the 15 individual mappers can generate one homogeneous GIS-based project (w.r.t. geometrical and visual character) and thus produce a geologically consistent final map. Our approach to this challenge was already discussed for the mapping of Vesta. To accommodate the map requirements regarding rules for data storage and database management, the computer-based GIS environment used for the interpretative mapping process must be designed so that it can be adjusted to the unique features of the individual investigation areas. Within this contribution, a template is presented that uses standards for digitizing, visualization, data merging and synchronization in the interpretative mapping process. Following new technological innovations within GIS software and the individual requirements for mapping Ceres, the template was developed based on the symbology and framework. The template for (GIS-based) mapping presented here directly links the generically descriptive attributes of planetary objects to predefined and standardized symbology in one data structure. Using this template, the map results are more comparable and easier to control. Furthermore, merging and synchronization of the individual maps, map projects and sheets will be far more efficient. The template can be adapted to any other planetary body and to future Discovery missions (e.g., Lucy and Psyche, which were selected by NASA to explore the early solar system) for generating reusable map results.

  7. Recommendations for the user-specific enhancement of flood maps

    NASA Astrophysics Data System (ADS)

    Meyer, V.; Kuhlicke, C.; Luther, J.; Fuchs, S.; Priest, S.; Dorner, W.; Serrhini, K.; Pardoe, J.; McCarthy, S.; Seidel, J.; Palka, G.; Unnerstall, H.; Viavattene, C.; Scheuer, S.

    2012-05-01

    The European Union Floods Directive requires the establishment of flood maps for high-risk areas in all European member states by 2013. However, the current practice of flood mapping in Europe still shows some deficits. Firstly, flood maps are frequently seen as an information tool rather than a communication tool; this means that, for example, local stocks of knowledge are not incorporated. Secondly, the contents of flood maps often do not match the requirements of the end-users. Finally, flood maps are often designed and visualised in a way that cannot be easily understood by residents at risk and/or that is not suitable for the respective needs of public authorities in risk and event management. The RISK MAP project examined how end-user participation in the mapping process may be used to overcome these barriers and enhance the communicative power of flood maps, fundamentally increasing their effectiveness. Based on empirical findings from a participatory approach that incorporated interviews, workshops and eye-tracking tests, conducted in five European case studies, this paper outlines recommendations for user-specific enhancements of flood maps. More specifically, recommendations are given with regard to (1) appropriate stakeholder participation processes, which allow local knowledge and preferences to be incorporated, (2) the improvement of the contents of flood maps by considering user-specific needs, and (3) the improvement of the visualisation of risk maps in order to produce user-friendly and understandable risk maps for the user groups concerned. Furthermore, "idealised" maps for different user groups are presented: for strategic planning, emergency management and the public.

  8. Areas activated during naturalistic reading comprehension overlap topological visual, auditory, and somatomotor maps.

    PubMed

    Sood, Mariam R; Sereno, Martin I

    2016-08-01

    Cortical mapping techniques using fMRI have been instrumental in identifying the boundaries of topological (neighbor-preserving) maps in early sensory areas. The presence of topological maps beyond early sensory areas raises the possibility that they might play a significant role in other cognitive systems, and that topological mapping might help to delineate areas involved in higher cognitive processes. In this study, we combine surface-based visual, auditory, and somatomotor mapping methods with a naturalistic reading comprehension task in the same group of subjects to provide a qualitative and quantitative assessment of the cortical overlap between sensory-motor maps in all major sensory modalities, and reading processing regions. Our results suggest that cortical activation during naturalistic reading comprehension overlaps more extensively with topological sensory-motor maps than has been heretofore appreciated. Reading activation in regions adjacent to occipital lobe and inferior parietal lobe almost completely overlaps visual maps, whereas a significant portion of frontal activation for reading in dorsolateral and ventral prefrontal cortex overlaps both visual and auditory maps. Even classical language regions in superior temporal cortex are partially overlapped by topological visual and auditory maps. By contrast, the main overlap with somatomotor maps is restricted to a small region on the anterior bank of the central sulcus near the border between the face and hand representations of M-I. Hum Brain Mapp 37:2784-2810, 2016. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  9. Research on the Application of Rapid Surveying and Mapping for Large Scale Topographic Map by Uav Aerial Photography System

    NASA Astrophysics Data System (ADS)

    Gao, Z.; Song, Y.; Li, C.; Zeng, F.; Wang, F.

    2017-08-01

    A method for the rapid acquisition and processing of large-scale topographic map data, relying on an Unmanned Aerial Vehicle (UAV) low-altitude aerial photogrammetry system, is studied in this paper, and the main workflow is elaborated. Key technologies of UAV photogrammetric mapping are also studied, and a rapid mapping system based on an electronic plate mapping system is developed, changing the traditional mapping mode and greatly improving mapping efficiency. Production tests and accuracy evaluation of the Digital Orthophoto Map (DOM), Digital Line Graphic (DLG) and other digital products were carried out in combination with a city basic topographic map update project, which provides a new technique for large-scale rapid surveying and shows a clear technical advantage and good application prospects.

  10. Musical Maps as Narrative Inquiry

    ERIC Educational Resources Information Center

    Blair, Deborah V.

    2007-01-01

    This study explores the metaphorical relationship between the process of narrative inquiry and the process of "musical mapping." The creation of musical maps was used as a classroom tool for enabling students' musical understanding while listening to music. As teacher-researcher, I studied my fifth-grade music students as they interacted with…

  11. Process for Generating Engine Fuel Consumption Map: Ricardo Cooled EGR Boost 24-bar Standard Car Engine Tier 2 Fuel

    EPA Pesticide Factsheets

    This document summarizes the process followed to utilize the fuel consumption map of a Ricardo modeled engine and vehicle fuel consumption data to generate a full engine fuel consumption map that can be used in EPA's ALPHA vehicle simulations.
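
    An engine fuel consumption map of this kind is, in practice, a lookup table over engine speed and torque that a vehicle simulation interpolates at each operating point. The sketch below illustrates that idea with invented grid values (it is not the Ricardo or ALPHA data):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical engine fuel-consumption map: fuel flow (g/s) on a speed x torque grid.
speed_rpm = np.array([1000, 2000, 3000, 4000, 5000])
torque_nm = np.array([50, 100, 150, 200])
fuel_gps = np.array([
    [0.4, 0.8, 1.3, 1.9],
    [0.7, 1.4, 2.2, 3.1],
    [1.1, 2.1, 3.3, 4.6],
    [1.6, 3.0, 4.6, 6.4],
    [2.2, 4.1, 6.2, 8.5],
])

fuel_map = RegularGridInterpolator((speed_rpm, torque_nm), fuel_gps)

# Query the map at an operating point produced by one simulation time step.
print(fuel_map([[2500.0, 120.0]]))   # interpolated fuel flow in g/s
```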

  12. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool.

    PubMed

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-06-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13-17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences.

  13. Recovery of chemical Estimates by Field Inhomogeneity Neighborhood Error Detection (REFINED): Fat/Water Separation at 7T

    PubMed Central

    Narayan, Sreenath; Kalhan, Satish C.; Wilson, David L.

    2012-01-01

    Purpose To reduce swaps in fat-water separation methods, a particular issue on 7T small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Materials and Methods Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Results Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Conclusion Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. PMID:23023815

  14. Recovery of chemical estimates by field inhomogeneity neighborhood error detection (REFINED): fat/water separation at 7 tesla.

    PubMed

    Narayan, Sreenath; Kalhan, Satish C; Wilson, David L

    2013-05-01

    To reduce swaps in fat-water separation methods, a particular issue on 7 Tesla (T) small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. Copyright © 2012 Wiley Periodicals, Inc.
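
    The k-means step described above groups field-map intensities so that regions whose values sit far from the bulk of the map can be flagged and reinitialized. The sketch below illustrates that idea on a synthetic B0 field map; the offset region, cluster count, and flagging rule are invented for illustration and do not reproduce the REFINED implementation:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical B0 field map (Hz) with a localized offset mimicking a swap-prone region.
rng = np.random.default_rng(0)
field_map = rng.normal(0.0, 5.0, size=(64, 64))
field_map[20:35, 20:35] += 220.0          # offset "error" region

# Cluster field-map intensities; the cluster far from the global median is suspect.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(field_map.reshape(-1, 1))
labels = labels.reshape(field_map.shape)

means = [field_map[labels == k].mean() for k in (0, 1)]
suspect = int(np.argmax(np.abs(np.array(means) - np.median(field_map))))
print("candidate error-region pixels:", int((labels == suspect).sum()))
```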

  15. Conceptual framework for the mapping of management process with information technology in a business process.

    PubMed

    Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of mapping management processes to technology in a business environment. We define ERP as a software tool that has to provide a business solution but not necessarily an integration of all departments. Any business process can be classified as a management process, an operational process or a supportive process. We have gone through the entire management process and were able to identify the influencing components to be mapped to a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components to the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify ERP misfit.

  16. Conceptual Framework for the Mapping of Management Process with Information Technology in a Business Process

    PubMed Central

    Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of mapping management processes to technology in a business environment. We define ERP as a software tool that has to provide a business solution but not necessarily an integration of all departments. Any business process can be classified as a management process, an operational process or a supportive process. We have gone through the entire management process and were able to identify the influencing components to be mapped to a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components to the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify ERP misfit. PMID:25861688

  17. Visualizing complex processes using a cognitive-mapping tool to support the learning of clinical reasoning.

    PubMed

    Wu, Bian; Wang, Minhong; Grotzer, Tina A; Liu, Jun; Johnson, Janice M

    2016-08-22

    Practical experience with clinical cases has played an important role in supporting the learning of clinical reasoning. However, learning through practical experience involves complex processes that are difficult for students to capture. This study aimed to examine the effects of a computer-based cognitive-mapping approach that helps students to externalize the reasoning process, and the knowledge underlying it, when they work with clinical cases. A comparison between the cognitive-mapping approach and a verbal-text approach was made by analyzing their effects on learning outcomes. Fifty-two third-year or higher students from two medical schools participated in the study. Students in the experimental group used the computer-based cognitive-mapping approach, while the control group used the verbal-text approach, to make sense of their thinking and actions when they worked with four simulated cases over 4 weeks. For each case, students in both groups reported their reasoning process (involving data capture, hypothesis formulation, and reasoning with justifications) and the underlying knowledge (involving identified concepts and the relationships between the concepts) using the given approach. The learning products (cognitive maps or verbal text) revealed that students in the cognitive-mapping group outperformed those in the verbal-text group in the reasoning process, but not in making sense of the knowledge underlying the reasoning process. No significant differences were found in a knowledge posttest between the two groups. The computer-based cognitive-mapping approach has shown a promising advantage over the verbal-text approach in improving students' reasoning performance. Further studies are needed to examine the effects of the cognitive-mapping approach in improving the construction of subject-matter knowledge on the basis of practical experience.

  18. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

    Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and application of computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share our experience gained while creating the Atlas of Cancer Signalling Network. In the step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology using Google Maps API to explore comprehensive molecular maps similar to geographical maps and explain the advantages of semantic zooming principles for map navigation. We also provide the outline to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383

  19. exocartographer: Constraining surface maps and orbital parameters of exoplanets

    NASA Astrophysics Data System (ADS)

    Farr, Ben; Farr, Will M.; Cowan, Nicolas B.; Haggard, Hal M.; Robinson, Tyler

    2018-05-01

    exocartographer solves the exo-cartography inverse problem. This flexible forward-modeling framework, written in Python, retrieves the albedo map and spin geometry of a planet from time-resolved photometry; it uses a Markov chain Monte Carlo method to extract albedo maps and planet spin along with their uncertainties. A Gaussian process is used to fit the characteristic length scale of the map from the data and to enforce smooth maps.
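
    The core retrieval step is a Bayesian fit of map and spin parameters to a photometric time series. The sketch below shows that pattern on a toy two-parameter model with a plain Metropolis-Hastings sampler; the model, priors, and noise level are invented stand-ins and do not reflect exocartographer's actual API or parameterization:

```python
import numpy as np

# Toy forward model: disc-integrated flux of a rotating planet with one bright patch,
# parameterized by mean albedo A and rotation phase offset phi.
def model(t, A, phi, period=10.0):
    return A * (1.0 + 0.3 * np.cos(2 * np.pi * t / period + phi))

rng = np.random.default_rng(1)
t = np.linspace(0, 30, 120)
flux = model(t, 0.25, 0.8) + rng.normal(0, 0.005, t.size)   # synthetic photometry

def log_prob(theta):
    A, phi = theta
    if not 0.0 < A < 1.0:                    # flat prior on albedo
        return -np.inf
    resid = flux - model(t, A, phi)
    return -0.5 * np.sum((resid / 0.005) ** 2)

# Minimal Metropolis-Hastings loop standing in for the MCMC retrieval step.
theta = np.array([0.5, 0.0])
lp = log_prob(theta)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(0.0, [0.01, 0.05])
    lp_prop = log_prob(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = proposal, lp_prop
    samples.append(theta.copy())

samples = np.array(samples[5000:])           # discard burn-in
print("posterior mean A, phi:", samples.mean(axis=0))
```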

  20. Automatically Generated Vegetation Density Maps with LiDAR Survey for Orienteering Purpose

    NASA Astrophysics Data System (ADS)

    Petrovič, Dušan

    2018-05-01

    The focus of our research was to automatically generate the most adequate vegetation density maps for orienteering purposes. The application Karttapullautin, which requires LiDAR data as input, was used to automatically generate the vegetation density maps. A part of the orienteering map of the Kazlje-Tomaj area was used to compare the graphical display of vegetation density. With different parameter settings in the Karttapullautin application we changed how the vegetation density of the automatically generated map was presented, and tried to match it as closely as possible with the orienteering map of Kazlje-Tomaj. By comparing the resulting vegetation density maps, the most suitable parameter settings for automatically generating maps of other areas were also proposed.
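
    The underlying quantity is simple: for each grid cell, the share of LiDAR returns falling in a vegetation height band above the ground surface. The sketch below computes such a per-cell density from synthetic height-normalized returns; the cell size and height thresholds are invented for illustration and are not Karttapullautin's parameters:

```python
import numpy as np

# Hypothetical height-normalized LiDAR returns: x, y (metres) and height above ground (metres).
rng = np.random.default_rng(2)
x = rng.uniform(0, 100, 5000)
y = rng.uniform(0, 100, 5000)
h = rng.exponential(2.0, 5000)

cell = 10.0                                  # grid cell size in metres
ix = (x // cell).astype(int)
iy = (y // cell).astype(int)
nx, ny = int(100 // cell), int(100 // cell)

density = np.zeros((nx, ny))
for i in range(nx):
    for j in range(ny):
        in_cell = (ix == i) & (iy == j)
        if in_cell.any():
            veg = (h[in_cell] > 0.5) & (h[in_cell] < 10.0)   # understory/canopy band
            density[i, j] = veg.mean()                        # share of vegetation returns

print(density.round(2))
```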

  1. Method and system for processing optical elements using magnetorheological finishing

    DOEpatents

    Menapace, Joseph Arthur; Schaffers, Kathleen Irene; Bayramian, Andrew James; Molander, William A

    2012-09-18

    A method of finishing an optical element includes mounting the optical element in an optical mount having a plurality of fiducials overlapping with the optical element and obtaining a first metrology map for the optical element and the plurality of fiducials. The method also includes obtaining a second metrology map for the optical element without the plurality of fiducials, forming a difference map between the first metrology map and the second metrology map, and aligning the first metrology map and the second metrology map. The method further includes placing mathematical fiducials onto the second metrology map using the difference map to form a third metrology map and associating the third metrology map to the optical element. Moreover, the method includes mounting the optical element in the fixture in an MRF tool, positioning the optical element in the fixture, removing the plurality of fiducials, and finishing the optical element.
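
    The central data operation here is subtracting two metrology maps of the same optic so that only the fiducials remain, which can then be re-applied as "mathematical fiducials" on the fiducial-free map. A minimal sketch of that differencing and thresholding step, using synthetic height maps with an invented fiducial and threshold, follows:

```python
import numpy as np

# Hypothetical surface-height metrology maps (nm) of the same optic, measured with
# and without physical fiducials attached.
rng = np.random.default_rng(3)
base = rng.normal(0.0, 2.0, size=(256, 256))
with_fiducials = base.copy()
with_fiducials[10:20, 10:20] += 50.0       # one fiducial bump for illustration

difference = with_fiducials - base         # difference map isolates the fiducials
fiducial_mask = np.abs(difference) > 10.0  # threshold to obtain "mathematical fiducials"

# The mask can then be stamped onto the fiducial-free map to register it to the
# coordinates used by the finishing tool.
print("fiducial pixels found:", int(fiducial_mask.sum()))
```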

  2. A Comprehensive Three-Dimensional Cortical Map of Vowel Space

    ERIC Educational Resources Information Center

    Scharinger, Mathias; Idsardi, William J.; Poe, Samantha

    2011-01-01

    Mammalian cortex is known to contain various kinds of spatial encoding schemes for sensory information including retinotopic, somatosensory, and tonotopic maps. Tonotopic maps are especially interesting for human speech sound processing because they encode linguistically salient acoustic properties. In this study, we mapped the entire vowel space…

  3. SHARAKU: an algorithm for aligning and clustering read mapping profiles of deep sequencing in non-coding RNA processing.

    PubMed

    Tsuchiya, Mariko; Amano, Kojiro; Abe, Masaya; Seki, Misato; Hase, Sumitaka; Sato, Kengo; Sakakibara, Yasubumi

    2016-06-15

    Deep sequencing of the transcripts of regulatory non-coding RNA generates footprints of post-transcriptional processes. After obtaining sequence reads, the short reads are mapped to a reference genome, and specific mapping patterns can be detected called read mapping profiles, which are distinct from random non-functional degradation patterns. These patterns reflect the maturation processes that lead to the production of shorter RNA sequences. Recent next-generation sequencing studies have revealed not only the typical maturation process of miRNAs but also the various processing mechanisms of small RNAs derived from tRNAs and snoRNAs. We developed an algorithm termed SHARAKU to align two read mapping profiles of next-generation sequencing outputs for non-coding RNAs. In contrast with previous work, SHARAKU incorporates the primary and secondary sequence structures into an alignment of read mapping profiles to allow for the detection of common processing patterns. Using a benchmark simulated dataset, SHARAKU exhibited superior performance to previous methods for correctly clustering the read mapping profiles with respect to 5'-end processing and 3'-end processing from degradation patterns and in detecting similar processing patterns in deriving the shorter RNAs. Further, using experimental data of small RNA sequencing for the common marmoset brain, SHARAKU succeeded in identifying the significant clusters of read mapping profiles for similar processing patterns of small derived RNA families expressed in the brain. The source code of our program SHARAKU is available at http://www.dna.bio.keio.ac.jp/sharaku/, and the simulated dataset used in this work is available at the same link. Accession code: The sequence data from the whole RNA transcripts in the hippocampus of the left brain used in this work is available from the DNA DataBank of Japan (DDBJ) Sequence Read Archive (DRA) under the accession number DRA004502. yasu@bio.keio.ac.jp Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
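
    At its core, this kind of analysis compares per-position read-coverage vectors ("read mapping profiles") and groups profiles that share a processing pattern. The sketch below builds synthetic profiles with two different 5'-end peaks and clusters them by correlation distance; it is a generic illustration, not the SHARAKU algorithm, which additionally aligns profiles using primary and secondary sequence structure:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(4)
positions = np.arange(60)

def profile(peak):
    # Gaussian-shaped pile-up of read 5' ends around a processing site, plus noise.
    return np.exp(-0.5 * ((positions - peak) / 2.0) ** 2) + rng.normal(0.0, 0.02, positions.size)

# Three profiles processed near position 10 and three near position 40.
profiles = np.vstack([profile(10) for _ in range(3)] + [profile(40) for _ in range(3)])
profiles = profiles / profiles.sum(axis=1, keepdims=True)   # normalize read counts

# Cluster profiles by correlation distance; similar processing patterns group together.
clusters = fcluster(linkage(pdist(profiles, metric="correlation"), method="average"),
                    t=2, criterion="maxclust")
print(clusters)
```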

  4. Application of process mapping to understand integration of high risk medicine care bundles within community pharmacy practice.

    PubMed

    Weir, Natalie M; Newham, Rosemary; Corcoran, Emma D; Ali Atallah Al-Gethami, Ashwag; Mohammed Abd Alridha, Ali; Bowie, Paul; Watson, Anne; Bennie, Marion

    2017-11-21

    The Scottish Patient Safety Programme - Pharmacy in Primary Care collaborative is a quality improvement initiative adopting the Institute of Healthcare Improvement Breakthrough Series collaborative approach. The programme developed and piloted High Risk Medicine (HRM) Care Bundles (CB), focused on warfarin and non-steroidal anti-inflammatories (NSAIDs), within 27 community pharmacies over 4 NHS Regions. Each CB involves clinical assessment and patient education, although the CB content varies between regions. To support national implementation, this study aims to understand how the pilot pharmacies integrated the HRM CBs into routine practice to inform the development of a generic HRM CB process map. Regional process maps were developed in 4 pharmacies through simulation of the CB process, staff interviews and documentation of resources. Commonalities were collated to develop a process map for each HRM, which were used to explore variation at a national event. A single, generic process map was developed which underwent validation by case study testing. The findings allowed development of a generic process map applicable to warfarin and NSAID CB implementation. Five steps were identified as required for successful CB delivery: patient identification; clinical assessment; pharmacy CB prompt; CB delivery; and documentation. The generic HRM CB process map encompasses the staff and patients' journey and the CB's integration into routine community pharmacy practice. Pharmacist involvement was required only for clinical assessment, indicating suitability for whole-team involvement. Understanding CB integration into routine practice has positive implications for successful implementation. The generic process map can be used to develop targeted resources, and/or be disseminated to facilitate CB delivery and foster whole team involvement. Similar methods could be utilised within other settings, to allow those developing novel services to distil the key processes and consider their integration within routine workflows to effect maximal, efficient implementation and benefit to patient care. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Polar Views of Titan Global Topography

    NASA Image and Video Library

    2013-05-15

    These polar maps show the first global topographic mapping of Saturn's moon Titan, using data from NASA's Cassini mission. To create these maps, scientists employed a mathematical process called splining.
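
    Splining, in this context, fits a smooth surface through sparse, irregularly spaced elevation samples so that they can be evaluated on a regular grid. A minimal sketch of that idea with SciPy's smoothing bivariate spline on synthetic scattered altimetry (the sample values and smoothing factor are invented) follows:

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Hypothetical sparse altimetry: scattered (lon, lat, height) samples of a surface.
rng = np.random.default_rng(5)
lon = rng.uniform(0, 360, 400)
lat = rng.uniform(-80, 80, 400)
height = 200 * np.sin(np.radians(lat)) + 50 * np.cos(np.radians(lon)) + rng.normal(0, 5, 400)

# Fit a smoothing spline to the scattered samples and evaluate it on a regular grid,
# which is the basic idea behind producing a gridded global topography map.
spline = SmoothBivariateSpline(lon, lat, height, s=400 * 25.0)
grid_lon = np.linspace(0, 360, 73)
grid_lat = np.linspace(-80, 80, 33)
topo_map = spline(grid_lon, grid_lat)      # gridded heights, shape (73, 33)
print(topo_map.shape, float(topo_map.min()), float(topo_map.max()))
```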

  6. Surficial geology of Mars: A study in support of a penetrator mission to Mars

    NASA Technical Reports Server (NTRS)

    Spudis, P.; Greeley, R.

    1976-01-01

    Physiographic and surficial cover information were combined into unified surficial geology maps (30 quadrangles and 1 synoptic map). The surface of Mars is heterogeneous and is modified by wind, water, volcanism, tectonism, mass wasting and other processes. Surficial mapping identifies areas modified by these processes on a regional basis. Viking I mission results indicate that, at least in the landing site area, the surficial mapping based on Mariner data is fairly accurate. This area was mapped as a lightly cratered plain with thin or discontinuous eolian sediment. Analysis of lander images indicates that this interpretation is very close to actual surface conditions. These initial results do not imply that all surficial units are mapped correctly, but they do increase confidence in estimates based on photogeologic interpretations of orbital pictures.

  7. Demystifying process mapping: a key step in neurosurgical quality improvement initiatives.

    PubMed

    McLaughlin, Nancy; Rodstein, Jennifer; Burke, Michael A; Martin, Neil A

    2014-08-01

    Reliable delivery of optimal care can be challenging for care providers. Health care leaders have integrated various business tools to assist them and their teams in ensuring consistent delivery of safe and top-quality care. The cornerstone to all quality improvement strategies is the detailed understanding of the current state of a process, captured by process mapping. Process mapping empowers caregivers to audit how they are currently delivering care to subsequently strategically plan improvement initiatives. As a community, neurosurgery has clearly shown dedication to enhancing patient safety and delivering quality care. A care redesign strategy named NERVS (Neurosurgery Enhanced Recovery after surgery, Value, and Safety) is currently being developed and piloted within our department. Through this initiative, a multidisciplinary team led by a clinician neurosurgeon has process mapped the way care is currently being delivered throughout the entire episode of care. Neurosurgeons are becoming leaders in quality programs, and their education on the quality improvement strategies and tools is essential. The authors present a comprehensive review of process mapping, demystifying its planning, its building, and its analysis. The particularities of using process maps, initially a business tool, in the health care arena are discussed, and their specific use in an academic neurosurgical department is presented.

  8. Thermal Spray Maps: Material Genomics of Processing Technologies

    NASA Astrophysics Data System (ADS)

    Ang, Andrew Siao Ming; Sanpo, Noppakun; Sesso, Mitchell L.; Kim, Sun Yung; Berndt, Christopher C.

    2013-10-01

    There is currently no method whereby material properties of thermal spray coatings may be predicted from fundamental processing inputs such as temperature-velocity correlations. The first step in such an important understanding would involve establishing a foundation that consolidates the thermal spray literature so that known relationships could be documented and any trends identified. This paper presents a method to classify and reorder thermal spray data so that relationships and correlations between competing processes and materials can be identified. Extensive data mining of published experimental work was performed to create thermal spray property-performance maps, known as "TS maps" in this work. Six TS maps will be presented. The maps are based on coating characteristics of major importance; i.e., porosity, microhardness, adhesion strength, and the elastic modulus of thermal spray coatings.

  9. On the mapping associated with the complex representation of functions and processes.

    NASA Technical Reports Server (NTRS)

    Harger, R. O.

    1972-01-01

    The mapping between function spaces that is implied by the representation of a real 'bandpass' function by a complex 'low-pass' function is made explicit. The discussion is extended to the representation of stationary random processes, where the mapping is between spaces of random processes. This approach clarifies the nature of the complex representation, especially in the case of random processes, and, in addition, derives the properties of the complex representation.
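
    For reference, the standard form of this correspondence (stated here in its usual textbook form, not quoted from the paper) writes the real bandpass signal in terms of its complex low-pass envelope and relates their second-order statistics:

```latex
% Real bandpass signal x(t) and its complex low-pass (baseband) representation z(t)
% about the carrier frequency f_c:
\[
  x(t) = \operatorname{Re}\left\{ z(t)\, e^{\,j 2\pi f_c t} \right\},
  \qquad
  z(t) = x_I(t) + j\, x_Q(t).
\]
% For a wide-sense-stationary x(t) whose complex envelope z(t) is proper (circular),
% the autocorrelation functions of the two representations are related by
\[
  R_x(\tau) = \tfrac{1}{2}\,\operatorname{Re}\left\{ R_z(\tau)\, e^{\,j 2\pi f_c \tau} \right\}.
\]
```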

  10. Vision-based mapping with cooperative robots

    NASA Astrophysics Data System (ADS)

    Little, James J.; Jennings, Cullen; Murray, Don

    1998-10-01

    Two stereo-vision-based mobile robots navigate and autonomously explore their environment safely while building occupancy grid maps of the environment. The robots maintain position estimates within a global coordinate frame using landmark recognition. This allows them to build a common map by sharing position information and stereo data. Stereo vision processing and map updates are done at 3 Hz and the robots move at speeds of 200 cm/s. Cooperative mapping is achieved through autonomous exploration of unstructured and dynamic environments. The map is constructed conservatively, so as to be useful for collision-free path planning. Each robot maintains a separate copy of a shared map, and then posts updates to the common map when it returns to observe a landmark at home base. Issues include synchronization, mutual localization, navigation, exploration, registration of maps, merging repeated views (fusion), centralized vs decentralized maps.
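
    Occupancy grid maps of this kind are conventionally maintained as per-cell log-odds that each stereo range observation increments, which keeps updates commutative so independently exploring robots can share and merge evidence. A minimal log-odds update sketch (the sensor-model constants are invented, and this is not the authors' code) follows:

```python
import numpy as np

# Minimal log-odds occupancy grid update (illustrative values).
L_OCC, L_FREE, L_PRIOR = 0.85, -0.4, 0.0

grid = np.full((50, 50), L_PRIOR)          # log-odds occupancy grid

def update_cell(grid, i, j, hit):
    """Fuse one range observation of cell (i, j) into the grid."""
    grid[i, j] += (L_OCC if hit else L_FREE) - L_PRIOR

# Example: a stereo range return says cell (10, 20) is occupied and the cells on the
# ray before it are free.
for j in range(20):
    update_cell(grid, 10, j, hit=False)
update_cell(grid, 10, 20, hit=True)

prob = 1.0 - 1.0 / (1.0 + np.exp(grid))    # convert log-odds back to probability
print(round(float(prob[10, 20]), 3), round(float(prob[10, 5]), 3))
```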

  11. Appendix A: Ecoprovinces of the Central North American Cordillera and adjacent plains

    Treesearch

    Dennis A. Demarchi

    1994-01-01

    The fundamental difference between the map presented here and other regional ecosystem classifications is that this map's ecological units are based on climatic processes rather than vegetation communities (map appears at the end of this appendix). Macroclimatic processes are the physical and thermodynamic interaction between climatic controls, or the relatively...

  12. How Does Creating a Concept Map Affect Item-Specific Encoding?

    ERIC Educational Resources Information Center

    Grimaldi, Phillip J.; Poston, Laurel; Karpicke, Jeffrey D.

    2015-01-01

    Concept mapping has become a popular learning tool. However, the processes underlying the task are poorly understood. In the present study, we examined the effect of creating a concept map on the processing of item-specific information. In 2 experiments, subjects learned categorized or ad hoc word lists by making pleasantness ratings, sorting…

  13. Use of Networked Collaborative Concept Mapping To Measure Team Processes and Team Outcomes.

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.; O'Neil, Harold F., Jr.; Herl, Howard E.; Dennis, Robert A.

    The feasibility of using a computer-based networked collaborative concept mapping system to measure teamwork skills was studied. A concept map is a node-link-node representation of content, where the nodes represent concepts and links represent relationships between connected concepts. Teamwork processes were examined for a group concept mapping…

  14. The need for sustained and integrated high-resolution mapping of dynamic coastal environments

    USGS Publications Warehouse

    Stockdon, Hilary F.; Lillycrop, Jeff W.; Howd, Peter A.; Wozencraft, Jennifer M.

    2007-01-01

    The United States' coastal zone evolves dynamically in response to both human activities and natural processes. Protecting coastal resources and populations requires a detailed understanding of the processes driving change as well as of the physical setting. Sustained coastal mapping allows change to be documented, baseline conditions to be established, and future behavior to be predicted in conjunction with physical process models. Hyperspectral imagers and airborne lidars, along with other recent advances in mapping technology, allow rapid collection of national-scale land use information and high-resolution elevation data. Coastal hazard risk evaluation depends critically on these rich data sets. Coastal elevation data, for example, are a fundamental input to storm surge models that predict flooding locations, and land use maps are a foundation for identifying the most vulnerable populations and resources. A comprehensive national coastal mapping plan, designed to take advantage of recent progress in mapping technology and in data collection, management, and distribution, provides a wealth of information for studying physical change processes, managing and protecting coastal resources and communities, and determining the hazard vulnerability of coastal areas.

  15. Exploring Students' Mapping Behaviors and Interactive Discourses in a Case Diagnosis Problem: Sequential Analysis of Collaborative Causal Map Drawing Processes

    ERIC Educational Resources Information Center

    Lee, Woon Jee

    2012-01-01

    The purpose of this study was to explore the nature of students' mapping and discourse behaviors while constructing causal maps to articulate their understanding of a complex, ill-structured problem. In this study, six graduate-level students were assigned to one of three pair groups, and each pair used the causal mapping software program,…

  16. A computational linguistics motivated mapping of ICPC-2 PLUS to SNOMED CT.

    PubMed

    Wang, Yefeng; Patrick, Jon; Miller, Graeme; O'Hallaran, Julie

    2008-10-27

    A great challenge in sharing data across information systems in general practice is the lack of interoperability between the different terminologies or coding schemes used in those systems. Mapping of medical vocabularies to a standardised terminology is needed to solve data interoperability problems. We present a system to automatically map the interface terminology ICPC-2 PLUS to SNOMED CT. Three mapping steps are proposed in this system. UMLS Metathesaurus mapping utilises explicit relationships between ICPC-2 PLUS and SNOMED CT terms in the UMLS library to perform the first stage of the mapping. Computational linguistic mapping uses natural language processing techniques and lexical similarities for the second stage of mapping between the terminologies. Finally, post-coordination mapping allows one ICPC-2 PLUS term to be mapped to an aggregation of two or more SNOMED CT terms. A total of 5,971 of the 7,410 ICPC-2 PLUS terms (80.58%) were mapped to SNOMED CT using the three stages, but with different levels of accuracy. UMLS mapping achieved the mapping of 53.0% of ICPC-2 PLUS terms to SNOMED CT with a precision of 96.46% and an overall recall of 44.89%. Lexical mapping increased the result to 60.31%, and post-coordination mapping gave a further increase of 20.27% in mapped terms. A manual review of part of the mapping shows that the precision of the lexical mappings is around 90%. The accuracy of post-coordination has not yet been evaluated. Unmapped and mismatched terms are due to differences in structure between ICPC-2 PLUS and SNOMED CT; terms contained in ICPC-2 PLUS but not in SNOMED CT caused a large proportion of the mapping failures. Mapping terminologies to a standard vocabulary is a way to facilitate consistent medical data exchange and achieve system interoperability and data standardisation. Broad-scale mapping cannot be achieved by any single method, and methods based on computational linguistics can be very useful for the task. Automating as much of this process as possible turns the searching and mapping task into a validation task, which can effectively reduce the cost and increase the efficiency and accuracy of the task compared with manual methods.
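
    The lexical stage of such a pipeline typically scores candidate target terms by string or token similarity and keeps the best match above a cut-off. The sketch below uses a simple token-level Jaccard similarity; the terms, tokenizer, and scoring rule are invented illustrations rather than the system's actual method or real terminology content:

```python
import re

def tokens(term: str) -> set[str]:
    """Lower-case word tokens of a clinical term."""
    return set(re.findall(r"[a-z]+", term.lower()))

def best_match(source: str, targets: list[str]) -> tuple[str, float]:
    """Return the target term with the highest token-level Jaccard similarity."""
    src = tokens(source)
    scored = [(t, len(src & tokens(t)) / len(src | tokens(t))) for t in targets]
    return max(scored, key=lambda pair: pair[1])

# Invented example terms (not actual ICPC-2 PLUS or SNOMED CT content).
candidates = ["acute upper respiratory infection", "chronic sinusitis", "asthma"]
print(best_match("upper respiratory tract infection acute", candidates))
```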

  17. Topological visual mapping in robotics.

    PubMed

    Romero, Anna; Cazorla, Miguel

    2012-08-01

    A key problem in robotics is the construction of a map of the environment. Such a map can be used in different tasks, such as localization, recognition and obstacle avoidance. In addition, the simultaneous localization and mapping (SLAM) problem has attracted a lot of interest in the robotics community. This paper presents a new method for visual mapping that uses topological instead of metric information. For that purpose, we propose prior image segmentation into regions in order to group the extracted invariant features into a graph, so that each graph defines a single region of the image. Although other methods have been proposed for visual SLAM, our method is complete in the sense that it covers the whole process: it presents a new method for image matching, it defines a way to build the topological map, and it defines a matching criterion for loop closing. The matching process takes into account visual features and their structure using the graph transformation matching (GTM) algorithm, which allows us to perform the matching and remove the outliers. Then, using this image comparison method, we propose an algorithm for constructing topological maps. During the experimentation phase, we test the robustness of the method and its ability to construct topological maps. We have also introduced a new hysteresis behavior in order to solve some problems found when building the graph.
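
    A topological map of this kind is naturally represented as a graph whose nodes are images (places) and whose edges are verified feature matches, with loop closures appearing as edges between non-consecutive nodes. The sketch below illustrates that structure with networkx; the image names, inlier counts, and edge-acceptance threshold are invented and do not come from the paper:

```python
import networkx as nx

# Nodes are acquired images/places; edges link pairs whose feature match passed a
# verification step (e.g. a GTM-style outlier rejection). Values are invented.
topo_map = nx.Graph()
matches = [("img_000", "img_001", 142),    # (place A, place B, verified inlier matches)
           ("img_001", "img_002", 98),
           ("img_002", "img_017", 87)]     # a loop-closure candidate

for a, b, n_inliers in matches:
    if n_inliers >= 50:                    # matching criterion for adding an edge
        topo_map.add_edge(a, b, weight=n_inliers)

# Path queries on the graph stand in for metric planning on a geometric map.
print(nx.shortest_path(topo_map, "img_000", "img_017"))
```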

  18. What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products.

    PubMed

    Miake-Lye, Isomi M; Hempel, Susanne; Shanman, Roberta; Shekelle, Paul G

    2016-02-10

    The need for systematic methods for reviewing evidence is continuously increasing. Evidence mapping is one emerging method. There are no authoritative recommendations for what constitutes an evidence map or what methods should be used, and anecdotal evidence suggests heterogeneity in both. Our objectives are to identify published evidence maps and to compare and contrast the presented definitions of evidence mapping, the domains used to classify data in evidence maps, and the form the evidence map takes. We conducted a systematic review of publications that presented results with a process termed "evidence mapping" or included a figure called an "evidence map." We identified publications from searches of ten databases through 8/21/2015, reference mining, and consulting topic experts. We abstracted the research question, the unit of analysis, the search methods and search period covered, and the country of origin. Data were narratively synthesized. Thirty-nine publications met inclusion criteria. Published evidence maps varied in their definition and the form of the evidence map. Of the 31 definitions provided, 67 % described the purpose as identification of gaps and 58 % referenced a stakeholder engagement process or user-friendly product. All evidence maps explicitly used a systematic approach to evidence synthesis. Twenty-six publications referred to a figure or table explicitly called an "evidence map," eight referred to an online database as the evidence map, and five stated they used a mapping methodology but did not present a visual depiction of the evidence. The principal conclusion of our evaluation of studies that call themselves "evidence maps" is that the implied definition of what constitutes an evidence map is a systematic search of a broad field to identify gaps in knowledge and/or future research needs that presents results in a user-friendly format, often a visual figure or graph, or a searchable database. Foundational work is needed to better standardize the methods and products of an evidence map so that researchers and policymakers will know what to expect of this new type of evidence review. Although an a priori protocol was developed, no registration was completed; this review did not fit the PROSPERO format.

  19. The Research and Compilation of City Maps in the National Geomatics Atlas of the People's Republic of China

    NASA Astrophysics Data System (ADS)

    Wang, G.; Wang, D.; Zhou, W.; Chen, M.; Zhao, T.

    2018-04-01

    The research and compilation of the new-century edition of the National Huge Atlas of the People's Republic of China is a special basic research project of the Ministry of Science and Technology of the People's Republic of China. The research and compilation of the National Geomatics Atlas of the People's Republic of China is its main component. The National Geomatics Atlas of China consists of four groups of maps and a place name index. The four map groups are the nationwide thematic maps, the provincial fundamental geographical maps, the land cover maps and the city maps. The city map group is an important component of the National Geomatics Atlas of China and mainly shows the process of urbanization in China. This paper, focusing on the design and compilation of 39 city-wide maps, briefly introduces mapping-area research and scale design, the mapping technical route, content selection and cartographic generalization, symbol design and map visualization, etc.

  20. Flood mapping from Sentinel-1 and Landsat-8 data: a case study from river Evros, Greece

    NASA Astrophysics Data System (ADS)

    Kyriou, Aggeliki; Nikolakopoulos, Konstantinos

    2015-10-01

    Floods are sudden and temporary natural events affecting areas that are not normally covered by water. Floods have a significant influence on both society and the natural environment, so flood mapping is crucial. Remote sensing data can be used to develop flood maps in an efficient and effective way. This work focuses on the expansion of water bodies overtopping the natural levees of the river Evros, invading the surrounding areas and flooding them. Different flood mapping techniques were applied using data from active and passive remote sensing sensors, namely Sentinel-1 and Landsat-8 respectively. Spaceborne pairs obtained from Sentinel-1 were processed in this study. Each pair included an image acquired during the flood, called the "crisis image", and one acquired before the event, called the "archived image". Both images covering the same area were processed to produce a map showing the extent of the flood. Multispectral data from Landsat-8 were also processed in order to detect and map the flooded areas. Different image processing techniques were applied and the results were compared with the respective results of the radar data processing.
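
    On the optical side, one common processing step is to compute a water index from the multispectral bands and threshold it, then difference the pre-event and crisis water masks to obtain the flooded area. The sketch below applies McFeeters' NDWI (green and NIR bands, Landsat-8 bands 3 and 5) to synthetic reflectance arrays; the band values and threshold are invented and this is not necessarily the exact technique used in the study:

```python
import numpy as np

# Synthetic stand-ins for calibrated green and NIR surface reflectance.
rng = np.random.default_rng(6)
green = rng.uniform(0.02, 0.2, size=(100, 100))
nir = rng.uniform(0.05, 0.4, size=(100, 100))
nir[40:60, :] = 0.02                        # a dark-in-NIR strip mimicking open water

ndwi = (green - nir) / (green + nir + 1e-9)  # McFeeters' NDWI
water_mask = ndwi > 0.0                      # simple threshold; tuned per scene in practice

# Differencing masks from a pre-event and a crisis acquisition would then give the
# newly inundated (flooded) area.
print("water pixels:", int(water_mask.sum()))
```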

  1. AVIRIS Spectrometer Maps Total Water Vapor Column

    NASA Technical Reports Server (NTRS)

    Conel, James E.; Green, Robert O.; Carrere, Veronique; Margolis, Jack S.; Alley, Ronald E.; Vane, Gregg A.; Bruegge, Carol J.; Gary, Bruce L.

    1992-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) produces maps of vertical-column abundances of water vapor in the atmosphere with good precision and spatial resolution. The maps provide information for meteorology, climatology, and agriculture.

  2. Development of AHPDST Vulnerability Indexing Model for Groundwater Vulnerability Assessment Using Hydrogeophysical Derived Parameters and GIS Application

    NASA Astrophysics Data System (ADS)

    Mogaji, K. A.

    2017-04-01

    Producing a bias-free vulnerability assessment map is significantly needed for planning a groundwater quality protection scheme. This study developed a GIS-based AHPDST vulnerability index model for producing a groundwater vulnerability map in a hard rock terrain of Nigeria by exploiting the potential of the analytic hierarchy process (AHP) and Dempster-Shafer theory (DST) data mining models. The borehole and geophysical data acquired in the study area were processed to derive five groundwater vulnerability conditioning factors (GVCFs), namely recharge rate, aquifer transmissivity, hydraulic conductivity, transverse resistance and longitudinal conductance. The resulting GVCF thematic maps were analysed multi-criterially by employing the mechanisms of the AHP and DST models to determine the normalized weight (W) parameter for the GVCFs and the mass function factor (MFF) parameters for the class boundaries of the GVCF thematic maps, respectively. Based on the weighted linear average technique, the determined W and MFF parameters were synthesized to develop a groundwater vulnerability potential index (GVPI)-based AHPDST model algorithm. The developed model was applied to establish four GVPI mass/belief function indices. The estimates based on the applied GVPI belief function indices were processed in a GIS environment to create prospective groundwater vulnerability potential index maps. The most representative of the resulting vulnerability maps (the GVPIBel map) was used to produce the groundwater vulnerability potential zones (GVPZ) map for the area. The produced GVPZ map shows that 48% and 52% of the areal extent are covered by low-to-moderate and high vulnerability zones, respectively. The success and prediction rates of the produced GVPZ map were determined using the relative operating characteristic technique to be 82.3 and 77.7%, respectively. The results reveal that the developed GVPI-based AHPDST model algorithm is capable of producing an efficient groundwater vulnerability potential zone prediction map and of characterizing the uncertainty of the predicted zones via the DST mechanism. The GVPZ map produced in this study can be used by decision makers to formulate appropriate groundwater management strategies, and the approach may be adopted in other hard rock regions of the world, especially in economically poor nations.
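
    In an AHP-based weighting step such as the one described, the factor weights are usually derived from a pairwise comparison matrix via its principal eigenvector, together with a consistency check, before the weighted overlay is computed. The sketch below shows that computation for the five conditioning factors named in the abstract; the pairwise judgements are invented for illustration and are not the study's values:

```python
import numpy as np

# Invented pairwise comparison matrix for: recharge, transmissivity, hydraulic
# conductivity, transverse resistance, longitudinal conductance.
A = np.array([
    [1,   3,   3,   5,   5],
    [1/3, 1,   1,   3,   3],
    [1/3, 1,   1,   3,   3],
    [1/5, 1/3, 1/3, 1,   1],
    [1/5, 1/3, 1/3, 1,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalized factor weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
cr = ci / 1.12                                 # random index for n = 5 is about 1.12
print(np.round(weights, 3), round(cr, 3))      # weights then drive the weighted overlay
```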

  3. Automated strip-mine and reclamation mapping from ERTS

    NASA Technical Reports Server (NTRS)

    Rogers, R. H. (Principal Investigator); Reed, L. E.; Pettyjohn, W. A.

    1974-01-01

    The author has identified the following significant results. Computer processing techniques were applied to ERTS-1 computer-compatible tape (CCT) data acquired in August 1972 on the Ohio Power Company's coal mining operation in Muskingum County, Ohio. Processing results succeeded in automatically classifying, with an accuracy greater than 90%: (1) stripped earth and major sources of erosion; (2) partially reclaimed areas and minor sources of erosion; (3) water with sedimentation; (4) water without sedimentation; and (5) vegetation. Computer-generated tables listing the area in acres and square kilometers were produced for each target category. Processing results also included geometrically corrected map overlays, one for each target category, drawn on a transparent material by a pen under computer control. Each target category is assigned a distinctive color on the overlay to facilitate interpretation. The overlays, drawn at a scale of 1:250,000 when placed over an AMS map of the same area, immediately provided map locations for each target. These mapping products were generated at a tenth of the cost of conventional mapping techniques.

  4. High resolution hybrid optical and acoustic sea floor maps (Invited)

    NASA Astrophysics Data System (ADS)

    Roman, C.; Inglis, G.

    2013-12-01

    This abstract presents a method for creating hybrid optical and acoustic sea floor reconstructions at centimeter-scale grid resolutions with robotic vehicles. Multibeam sonar and stereo vision are two common sensing modalities with complementary strengths that are well suited for data fusion. We have recently developed an automated two-stage pipeline to create such maps. The steps can be broken down as navigation refinement and map construction. During navigation refinement a graph-based optimization algorithm is used to align 3D point clouds created with both the multibeam sonar and stereo cameras. The process combats the typical growth in navigation error that has a detrimental effect on map fidelity and typically introduces artifacts at small grid sizes. During this process we are able to automatically register local point clouds created by each sensor to themselves and to each other where they overlap in a survey pattern. The process also estimates the sensor offsets, such as heading, pitch and roll, that describe how each sensor is mounted to the vehicle. The end results of the navigation step are a refined vehicle trajectory that ensures the point clouds from each sensor are consistently aligned, and the individual sensor offsets. In the mapping step, grid cells in the map are selectively populated by choosing data points from each sensor in an automated manner. The selection process is designed to pick points that preserve the best characteristics of each sensor and honor some specific map quality criteria to reduce outliers and ghosting. In general, the algorithm selects dense 3D stereo points in areas of high texture and point density. In areas where the stereo vision is poor, such as in a scene with low contrast or texture, multibeam sonar points are inserted in the map. This process is automated and results in a hybrid map populated with data from both sensors. Additional cross-modality checks are made to reject outliers in a robust manner. The final hybrid map retains the strengths of both sensors and shows improvement over the single-modality maps and a naively assembled multi-modal map where all the data points are included and averaged. Results will be presented from marine geological and archaeological applications using a 1350 kHz BlueView multibeam sonar and 1.3 megapixel digital still cameras.

  5. Construct Maps as a Foundation for Standard Setting

    ERIC Educational Resources Information Center

    Wyse, Adam E.

    2013-01-01

    Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…

  6. Analogical Processes in Children's Understanding of Spatial Representations

    ERIC Educational Resources Information Center

    Yuan, Lei; Uttal, David; Gentner, Dedre

    2017-01-01

    We propose that map reading can be construed as a form of analogical mapping. We tested 2 predictions that follow from this claim: First, young children's patterns of performance in map reading tasks should parallel those found in analogical mapping tasks; and, second, children will benefit from guided alignment instructions that help them see the…

  7. Journey Mapping the User Experience

    ERIC Educational Resources Information Center

    Samson, Sue; Granath, Kim; Alger, Adrienne

    2017-01-01

    This journey-mapping pilot study was designed to determine whether journey mapping is an effective method to enhance the student experience of using the library by assessing our services from their point of view. Journey mapping plots a process or service to produce a visual representation of a library transaction--from the point at which the…

  8. Release of the World Digital Magnetic Anomaly Map version 2 (WDMAM v2) scheduled

    NASA Astrophysics Data System (ADS)

    Dyment, Jérôme; Lesur, Vincent; Choi, Yujin; Hamoudi, Mohamed; Thébault, Erwan; Catalan, Manuel

    2015-04-01

    The World Digital Magnetic Anomaly Map is an international initiative carried out under the auspices of the International Association of Geomagnetism and Aeronomy (IAGA) and the Commission for the Geological Map of the World (CGMW). A first version of the map was published and distributed eight years ago (WDMAM v1; Korhonen et al., 2007). After a call for an improved second version of the map in 2011, the slow process of data compilation, map preparation, evaluation and finalization is near completion, and the WDMAM v2 will be released at the International Union of Geodesy and Geophysics (IUGG) meeting to be held in Prague in June-July 2015. In this presentation we display several shortcomings of the WDMAM v1, over both continental and oceanic areas, that are hopefully alleviated in the WDMAM v2, and discuss the process leading to the new map. We reiterate a long-standing call for aeromagnetic and marine magnetic data contributions, and explore future directions to pursue the effort toward a more complete, higher resolution magnetic anomaly map of the World.

  9. Parallel algorithms for mapping pipelined and parallel computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Many computational problems in image processing, signal processing, and scientific computing are naturally structured for either pipelined or parallel computation. When mapping such problems onto a parallel architecture it is often necessary to aggregate an obvious problem decomposition. Even in this context the general mapping problem is known to be computationally intractable, but recent advances have been made in identifying classes of problems and architectures for which optimal solutions can be found in polynomial time. Among these, the mapping of pipelined or parallel computations onto linear array, shared memory, and host-satellite systems figures prominently. This paper extends that work first by showing how to improve existing serial mapping algorithms. These improvements have significantly lower time and space complexities: in one case a published O(nm³) time algorithm for mapping m modules onto n processors is reduced to an O(nm log m) time complexity, and its space requirements reduced from O(nm²) to O(m). Run time complexity is further reduced with parallel mapping algorithms based on these improvements, which run on the architecture for which they create the mappings.
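
    The specific improved algorithms are not given in the abstract. As a hedged illustration of the underlying problem class, the sketch below solves a closely related task: assigning m ordered pipeline modules to n processors as contiguous blocks while minimizing the bottleneck load, via binary search on the answer. It is not the paper's O(nm log m) algorithm.

```python
def min_bottleneck_mapping(loads, n_procs):
    """Assign m pipeline modules (in order) to n processors as contiguous blocks,
    minimizing the heaviest per-processor load.  Binary search on the bottleneck;
    O(m log(sum(loads))) time, O(1) extra space.  Illustrative only -- not the
    specific algorithms analyzed in the paper."""
    def feasible(cap):
        procs, current = 1, 0
        for w in loads:
            if w > cap:
                return False
            if current + w > cap:
                procs, current = procs + 1, w
            else:
                current += w
        return procs <= n_procs

    lo, hi = max(loads), sum(loads)
    while lo < hi:
        mid = (lo + hi) // 2
        if feasible(mid):
            hi = mid
        else:
            lo = mid + 1
    return lo  # minimal achievable bottleneck load

print(min_bottleneck_mapping([4, 2, 7, 1, 5, 3], n_procs=3))  # -> 8
```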

  10. An interactive method for digitizing zone maps

    NASA Technical Reports Server (NTRS)

    Giddings, L. E.; Thompson, E. J.

    1975-01-01

    A method is presented for digitizing maps that consist of zones, such as contour or climatic zone maps. A color-coded map is prepared by any convenient process. The map is then read into the memory of an Image 100 computer by means of its table scanner, using colored filters. Zones are separated and stored in themes, using standard classification procedures. Thematic data are written on magnetic tape and these data, appropriately coded, are combined to make a digitized image on tape. Step-by-step procedures are given for digitizing crop moisture index maps in this way. In addition, a complete example of the digitization of a climatic zone map is given.

  11. Historical Topographic Map Collection bookmark

    USGS Publications Warehouse

    Fishburn, Kristin A.; Allord, Gregory J.

    2017-06-29

    The U.S. Geological Survey (USGS) National Geospatial Program is scanning published USGS 1:250,000-scale and larger topographic maps printed between 1884, the inception of the topographic mapping program, and 2006. The goal of this project, which began publishing the historical scanned maps in 2011, is to provide a digital repository of USGS topographic maps, available to the public at no cost. For more than 125 years, USGS topographic maps have accurately portrayed the complex geography of the Nation. The USGS is the Nation’s largest producer of printed topographic maps, and prior to 2006, USGS topographic maps were created using traditional cartographic methods and printed using a lithographic printing process. As the USGS continues the release of a new generation of topographic maps (US Topo) in electronic form, the topographic map remains an indispensable tool for government, science, industry, land management planning, and leisure.

  12. The uses of emotion maps in research and clinical practice with families and couples: methodological innovation and critical inquiry.

    PubMed

    Gabb, Jacqui; Singh, Reenee

    2015-03-01

    We explore how "emotion maps" can be productively used in clinical assessment and clinical practice with families and couples. This graphic participatory method was developed in sociological studies to examine everyday family relationships. Emotion maps enable us to effectively "see" the dynamic experience and emotional repertoires of family life. Through the use of a case example, in this article we illustrate how emotion maps can add to the systemic clinician's repertoire of visual methods. For clinicians working with families, couples, and young people, the importance of gaining insight into how lives are lived, at home, cannot be overstated. Producing emotion maps can encourage critical personal reflection and expedite change in family practice. Hot spots in the household become visualized, facilitating dialogue on prevailing issues and how these events may be perceived differently by different family members. As emotion maps are not reliant on literacy or language skills, they can be completed by parents and children alike, enabling children's perspectives to be heard. Emotion maps can be used as assessment tools to demonstrate the process of change within families. Furthermore, emotion maps can be extended to use through technology and hence are particularly well suited to working with young people. We end the article with a wider discussion of the place of emotions and emotion maps within systemic psychotherapy. © 2014 The Authors. Family Process published by Wiley Periodicals, Inc. on behalf of Family Process Institute.

  13. A novel algorithm for fully automated mapping of geospatial ontologies

    NASA Astrophysics Data System (ADS)

    Chaabane, Sana; Jaziri, Wassim

    2018-01-01

    Geospatial information is collected from different sources, making spatial ontologies built for the same geographic domain heterogeneous; as a result, different and heterogeneous conceptualizations may coexist. Ontology integration helps create a common repository of geospatial ontologies and allows the heterogeneities between existing ontologies to be removed. Ontology mapping is a process used in ontology integration and consists of finding correspondences between the source ontologies. This paper deals with the "mapping" process for geospatial ontologies, which consists of applying an automated algorithm that finds correspondences between concepts according to the definitions of matching relationships. The proposed algorithm, called the "geographic ontologies mapping algorithm," defines three types of mapping: semantic, topological, and spatial.
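
    The abstract names the three mapping types but not the matching rules. The toy sketch below shows only what the "semantic" step might resemble (label-token overlap between concepts); topological and spatial matching would require geometry and topology predicates and are omitted. Function names and the threshold are assumptions for illustration.

```python
def semantic_matches(onto_a, onto_b, threshold=0.5):
    """Very rough sketch of a 'semantic' ontology-mapping step: pair concepts whose
    label token sets overlap strongly (Jaccard similarity).  Not the paper's
    algorithm; names and threshold are illustrative."""
    def tokens(label):
        return set(label.lower().replace('_', ' ').split())
    pairs = []
    for a in onto_a:                      # onto_a, onto_b: lists of concept labels
        for b in onto_b:
            ta, tb = tokens(a), tokens(b)
            jac = len(ta & tb) / len(ta | tb)
            if jac >= threshold:
                pairs.append((a, b, round(jac, 2)))
    return pairs

print(semantic_matches(["water_body", "road_segment"], ["Water body", "highway"]))
```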

  14. Temporal mapping and analysis

    NASA Technical Reports Server (NTRS)

    O'Hara, Charles G. (Inventor); Shrestha, Bijay (Inventor); Vijayaraj, Veeraraghavan (Inventor); Mali, Preeti (Inventor)

    2011-01-01

    A compositing process for selecting spatial data collected over a period of time, creating temporal data cubes from the spatial data, and processing and/or analyzing the data using temporal mapping algebra functions. In some embodiments, the process creates a masked cube from the data cubes and computes a composite from the masked cube by using temporal mapping algebra.
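
    A minimal sketch of the compositing idea (not the patented implementation): stack co-registered rasters observed over time into a cube, mask unwanted observations, and reduce along the time axis with a map-algebra-style operator.

```python
import numpy as np

# Illustrative temporal compositing: the arrays below are random stand-ins for a
# stack of co-registered rasters and a per-observation quality mask (e.g., cloud).
times, rows, cols = 8, 100, 100
cube = np.random.rand(times, rows, cols)            # stand-in temporal data cube
cloud = np.random.rand(times, rows, cols) > 0.8     # stand-in quality mask

masked_cube = np.ma.masked_array(cube, mask=cloud)  # the "masked cube"
composite = masked_cube.max(axis=0)                 # e.g., max-value composite per pixel
print(composite.shape)                              # (100, 100)
```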

  15. Beyond Event Segmentation: Spatial- and Social-Cognitive Processes in Verb-to-Action Mapping

    ERIC Educational Resources Information Center

    Friend, Margaret; Pace, Amy

    2011-01-01

    The present article investigates spatial- and social-cognitive processes in toddlers' mapping of concepts to real-world events. In 2 studies we explore how event segmentation might lay the groundwork for extracting actions from the event stream and conceptually mapping novel verbs to these actions. In Study 1, toddlers demonstrated the ability to…

  16. Integrating Vegetation Classification, Mapping, and Strategic Inventory for Forest Management

    Treesearch

    C. K. Brewer; R. Bush; D. Berglund; J. A. Barber; S. R. Brown

    2006-01-01

    Many of the analyses needed to address multiple resource issues are focused on vegetation pattern and process relationships and most rely on the data models produced from vegetation classification, mapping, and/or inventory. The Northern Region Vegetation Mapping Project (R1-VMP) data models are based on these three integrally related, yet separate processes. This...

  17. How to achieve customer service through short-cycle paperwork.

    PubMed

    Hunter, M

    1998-02-01

    The ultimate goal of short-cycle paperwork is to satisfy customers by filling their orders as quickly as possible. Tools and techniques that can help achieve this goal include Just-in-Time paperwork elimination, process mapping, paper flow mapping, function/process mapping, work cells, and electronic kanban. Each of these is described briefly in the article.

  18. CIMOSA process classification for business process mapping in non-manufacturing firms: A case study

    NASA Astrophysics Data System (ADS)

    Latiffianti, Effi; Siswanto, Nurhadi; Wiratno, Stefanus Eko; Saputra, Yudha Andrian

    2017-11-01

    Business process mapping is one important means of enabling an enterprise to effectively manage the value chain. One widely used approach to classifying business processes for mapping purposes is the Computer Integrated Manufacturing System Open Architecture (CIMOSA). CIMOSA was initially designed for enterprises based on Computer Integrated Manufacturing (CIM) systems. This paper aims to analyze the use of the CIMOSA process classification for business process mapping in firms that do not fall within the area of CIM. Three firms from different business areas that have used the CIMOSA process classification were observed: an airline, a marketing and trading firm for oil and gas products, and an industrial estate management firm. The results of the research show that CIMOSA can be used in non-manufacturing firms with some adjustment. The adjustment includes the addition, reduction, or modification of some processes suggested by the CIMOSA process classification, as evidenced by the case studies.

  19. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool

    PubMed Central

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-01-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13–17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences. PMID:29303048

  20. Creating a career legacy map to help assure meaningful work in nursing.

    PubMed

    Hinds, Pamela S; Britton, Dorienda R; Coleman, Lael; Engh, Eileen; Humbel, Tina Kunze; Keller, Susan; Kelly, Katherine Patterson; Menard, Johanna; Lee, Marlene A; Roberts-Turner, Renee; Walczak, Dory

    2015-01-01

    When nurses declare a professional legacy (or what they intend to be better in health care because of their efforts), they are likely to maintain a focus on achieving their legacy and to experience meaning in the process. We describe the legacy concept and the steps involved in creating a legacy map, which is a concrete guide toward intended career outcomes. Informed by the "meaningful work" literature, we describe a legacy map, its function, the process of creating one, and the application of a legacy map to guide careers. We also describe an administrative benefit of the legacy map: the map can be used by team leaders and members to secure needed resources and opportunities to support the desired legacy of team members. Legacy mapping can be a self-use career guidance tool for nurses and other health care professionals, or a tool that links the career efforts of a team member with the career support efforts of a team leader. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Novice to Expert Cognition During Geologic Bedrock Mapping

    NASA Astrophysics Data System (ADS)

    Petcovic, H. L.; Libarkin, J.; Hambrick, D. Z.; Baker, K. M.; Elkins, J. T.; Callahan, C. N.; Turner, S.; Rench, T. A.; LaDue, N.

    2011-12-01

    Bedrock geologic mapping is a complex and cognitively demanding task. Successful mapping requires domain-specific content knowledge, visuospatial ability, navigation through the field area, creating a mental model of the geology that is consistent with field data, and metacognition. Most post-secondary geology students in the United States receive training in geologic mapping; however, not much is known about the cognitive processes that underlie successful bedrock mapping, or about how these processes change with education and experience. To better understand cognition during geologic mapping, we conducted a 2-year research study in which 67 volunteers, ranging from undergraduate sophomores to professionals with more than 20 years of experience, completed a suite of cognitive measures plus a 1-day bedrock mapping task in the Rocky Mountains, Montana, USA. In addition to participants' geologic maps and field notes, the cognitive suite included tests and questionnaires designed to measure: (1) prior geologic experience, via a self-report survey; (2) geologic content knowledge, via a modified version of the Geoscience Concept Inventory; (3) visuospatial ability, working memory capacity, and perceptual speed, via paper-and-pencil and computerized tests; (4) use of space and time during mapping, via GPS tracking; and (5) problem-solving in the field, via think-aloud audio logs during mapping and post-mapping semi-structured interviews. Data were examined for correlations between performance on the mapping task and the other measures. We found that both geological knowledge and spatial visualization ability correlated positively with accuracy in the field mapping task. More importantly, we found a Visuospatial Ability × Geological Knowledge interaction, such that visuospatial ability positively predicted mapping performance at low, but not high, levels of geological knowledge. In other words, we found evidence to suggest that visuospatial ability mattered for bedrock mapping for the novices in our sample, but not for the experts. For experienced mappers, we found a significant correlation between GCI scores and the thoroughness with which they covered the map area, plus a relationship between speed and map accuracy such that faster mappers produced better maps. However, fast novice mappers tended to produce the worst maps. Successful mappers formed a mental model of the underlying geologic structure immediately or early in the mapping task, then spent field time collecting observations to confirm, disconfirm, or modify their initial model. In contrast, the least successful mappers (all inexperienced) rarely generated explanations or models of the underlying geologic structure in the field.

  2. Three-Dimensional Geologic Map of the Hayward Fault Zone, San Francisco Bay Region, California

    USGS Publications Warehouse

    Phelps, G.A.; Graymer, R.W.; Jachens, R.C.; Ponce, D.A.; Simpson, R.W.; Wentworth, C.M.

    2008-01-01

    A three-dimensional (3D) geologic map of the Hayward Fault zone was created by integrating the results from geologic mapping, potential field geophysics, and seismology investigations. The map volume is 100 km long, 20 km wide, and extends to a depth of 12 km below sea level. The map volume is oriented northwest and is approximately bisected by the Hayward Fault. The complex geologic structure of the region makes it difficult to trace many geologic units into the subsurface; therefore, the map units are generalized from 1:24,000-scale geologic maps. Descriptions of geologic units and structures are offered, along with a discussion of the methods used to map them and incorporate them into the 3D geologic map. The map spatial database and associated viewing software are provided. Elements of the map, such as individual fault surfaces, are also provided in a non-proprietary format so that the user can access the map via open-source software. The sheet accompanying this manuscript shows selected views taken from the 3D geologic map. The 3D geologic map is designed as a multi-purpose resource for further geologic investigations and process modeling.

  3. Digital floodplain mapping and an analysis of errors involved

    USGS Publications Warehouse

    Hamblen, C.S.; Soong, D.T.; Cai, X.

    2007-01-01

    Mapping floodplain boundaries using geographic information systems (GIS) and digital elevation models (DEMs) was completed in a recent study. However convenient this method may appear at first, the resulting maps can contain unaccounted-for errors. Mapping the floodplain using GIS is faster than mapping manually, and digital mapping is expected to be more common in the future. When mapping is done manually, the experience and judgment of the engineer or geographer completing the mapping and the contour resolution of the surface topography are critical in determining the floodplain and floodway boundaries between cross sections. When mapping is done digitally, discrepancies can result from the use of the computing algorithm and digital topographic datasets. Understanding the possible sources of error and how the error accumulates through these processes is necessary for the validation of automated digital mapping. This study evaluates the procedure of floodplain mapping using GIS and a 3 m by 3 m resolution DEM, with a focus on the accumulated errors involved in the process. Within the GIS environment of this mapping method, the procedural steps of most interest include: (1) the accurate spatial representation of the stream centerline and cross sections; (2) properly using a triangulated irregular network (TIN) model for the flood elevations of the studied cross sections, the interpolated elevations between them, and the extrapolated flood elevations beyond the cross sections; and (3) the comparison of the flood elevation TIN with the ground elevation DEM, from which the appropriate inundation boundaries are delineated. The study area is of relatively low topographic relief, thereby making it representative of common suburban development and a prime setting for the need for accurately mapped floodplains. This paper emphasizes the impacts of integrating supplemental digital terrain data between cross sections on floodplain delineation. © 2007 ASCE.
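
    The comparison in step (3) reduces, in the simplest case, to a cell-by-cell test of the flood water surface against the ground DEM. The sketch below assumes both surfaces are already on the same 3 m grid and vertical datum; variable names and the nodata convention are illustrative.

```python
import numpy as np

def delineate_inundation(flood_surface, ground_dem, nodata=-9999.0):
    """Cell-by-cell inundation test: flooded where the flood water surface sits
    above the ground elevation.  Both inputs are 2-D arrays on the same grid."""
    valid = (flood_surface != nodata) & (ground_dem != nodata)
    inundated = valid & (flood_surface > ground_dem)
    depth = np.where(inundated, flood_surface - ground_dem, 0.0)
    return inundated, depth

dem = np.array([[10.0, 10.5], [11.0, 12.0]])   # toy ground elevations (m)
wse = np.array([[10.8, 10.8], [10.8, 10.8]])   # toy flood water surface (m)
mask, depth = delineate_inundation(wse, dem)
print(mask)    # [[ True  True] [False False]]
print(depth)   # [[0.8 0.3] [0.  0. ]]
```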

  4. Surficial geologic map of the Amboy 30' x 60' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2010-01-01

    The surficial geologic map of the Amboy 30' x 60' quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of southern California. This map consists of new surficial mapping conducted between 2000 and 2007, as well as compilations from previous surficial mapping. Surficial geologic units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects following deposition, and, where appropriate, the lithologic nature of the material. Many physical properties were noted and measured during the geologic mapping. This information was used to classify surficial deposits and to understand their ecological importance. We focus on physical properties that drive hydrologic, biologic, and physical processes, such as particle-size distribution (PSD) and bulk density. The database contains point data representing locations of samples for both laboratory-determined physical properties and semiquantitative field-based information. We include the locations of all field observations and note the type of information collected in the field to assist in assessing the quality of the mapping. The publication is separated into three parts: documentation, spatial data, and printable map graphics of the database. Documentation includes this pamphlet, which provides a discussion of the surficial geology and units and the map. Spatial data are distributed as an ArcGIS geodatabase in Microsoft Access format and are accompanied by a readme file, which describes the database contents, and FGDC metadata for the spatial map information. Map graphics files are distributed as PostScript and Adobe Portable Document Format (PDF) files that provide a view of the spatial database at the mapped scale.

  5. Spatial data software integration - Merging CAD/CAM/mapping with GIS and image processing

    NASA Technical Reports Server (NTRS)

    Logan, Thomas L.; Bryant, Nevin A.

    1987-01-01

    The integration of CAD/CAM/mapping with image processing using geographic information systems (GISs) as the interface is examined. Particular emphasis is given to the development of software interfaces between JPL's Video Image Communication and Retrieval (VICAR)/Image Based Information System (IBIS) raster-based GIS and the CAD/CAM/mapping system. The design and functions of VICAR and IBIS are described. Vector data capture and editing are studied. Various software programs for interfacing between VICAR/IBIS and CAD/CAM/mapping are presented and analyzed.

  6. Physiographic map of the Sicilian region (1:250,000 scale)

    NASA Astrophysics Data System (ADS)

    Priori, Simone; Fantappiè, Maria; Costantini, Edoardo A. C.

    2015-04-01

    Physiographic maps summarize and group the landforms of a territory into areas that are homogeneous in terms of the kind and intensity of the main geomorphological processes. Most physiographic maps cover national or continental extents; others have been produced at semi-detailed scales, while examples at the regional scale are much less common. However, since the Region is the main administrative level in Europe, regional maps can be very useful for land planning in many fields, such as ecological studies, risk maps, and soil mapping. This work presents a methodological example of a regional physiographic map, compiled at 1:250,000 scale, representing the whole Sicilian region, the largest and one of the most characteristic Mediterranean islands. The physiographic units were classified by matching thematic layers (NDVI, geology, DEM, land cover) with the main geomorphological processes identified by stereo-interpretation of aerial photographs (1:70,000 scale). In addition, information from other published maps representing geomorphological forms, aeolian deposits, anthropic terraced slopes, and landslides was used to improve the accuracy and reliability of the map. The classification of the physiographic units, and thus the map legend, was built up on the basis of the literature and taking into account the Italian geomorphological legend. The legend proposed in this map, which can also be applied in other Mediterranean countries, is suitable for different scales. The landform units were grouped, on the basis of a geomorphological classification of the forms, into: anthropogenic, aeolian, coastal, valley floor, intermountain fluvial, slope erosional, structural, karstic, and volcanic.

  7. Preliminary Assessment of the Impact of Culture on Understanding Cartographic Representations

    NASA Astrophysics Data System (ADS)

    Reolon Schmidt, Marcio Augusto; de Alencar Mendonça, André Luiz; Wieczorek, Małgorzata

    2018-05-01

    When users read a topographic map, they have to decode the represented information. This decoding passes through various processes in order to perceive, interpret, and understand the reported information. This set of processes is intrinsically influenced by culture. In particular, when one thinks of maps distributed across the internet or representations for audiences of different origins, the chance of efficient communication is reduced or at least influenced. Therefore, there should be some degree of common visual communication, and map symbology can be applied to assure adequate communication of the phenomenon being represented. In this context, the present work aims to test which factors influence map reading, the understanding of space, and the reasoning of the map user, in particular for national topographic maps. The assessment was conducted over the internet using official map representations from Brazil and Poland together with questionnaires. The results show that conventional topographic maps at the same scale are not capable of producing correct interpretation by users from another culture. This means that formal training has a direct influence on the quality of interpretation and spatial reasoning. These results indicate that high levels of formal training positively influence map reading and interpretation, and that there is no evidence that specialists using the symbology of their own country perform significantly better than those who used maps with systematic mapping from another country.

  8. Comparing Two Forms of Concept Map Critique Activities to Facilitate Knowledge Integration Processes in Evolution Education

    ERIC Educational Resources Information Center

    Schwendimann, Beat A.; Linn, Marcia C.

    2016-01-01

    Concept map activities often lack a subsequent revision step that facilitates knowledge integration. This study compares two collaborative critique activities using a Knowledge Integration Map (KIM), a form of concept map. Four classes of high school biology students (n = 81) using an online inquiry-based learning unit on evolution were assigned…

  9. Mapping fuels at multiple scales: landscape application of the fuel characteristic classification system.

    Treesearch

    D. McKenzie; C.L. Raymond; L.-K.B. Kellogg; R.A. Norheim; A.G. Andreu; A.C. Bayard; K.E. Kopper; E. Elman

    2007-01-01

    Fuel mapping is a complex and often multidisciplinary process, involving remote sensing, ground-based validation, statistical modeling, and knowledge-based systems. The scale and resolution of fuel mapping depend both on objectives and availability of spatial data layers. We demonstrate use of the Fuel Characteristic Classification System (FCCS) for fuel mapping at two...

  10. Smart "geomorphological" map browsing - a tale about geomorphological maps and the internet

    NASA Astrophysics Data System (ADS)

    Geilhausen, M.; Otto, J.-C.

    2012-04-01

    With the digital production of geomorphological maps, the dissemination of research outputs now extends beyond simple paper products. Internet technologies can contribute to both the dissemination of geomorphological maps and access to geomorphological data, and can help make geomorphological knowledge available to a greater public. Indeed, many national geological surveys employ end-to-end digital workflows from data capture in the field to final map production and dissemination. This paper deals with the potential of web mapping applications and interactive, portable georeferenced PDF maps for the distribution of geomorphological information. Web mapping applications such as Google Maps have become very popular and widespread and have increased interest in and access to mapping. They link the Internet with GIS technology and are a common way of presenting dynamic maps online. The GIS processing is performed online and maps are visualised in interactive web viewers characterised by different capabilities such as zooming, panning, or adding further thematic layers, with the map refreshed after each task. Depending on the system architecture and the components used, advanced symbology, map overlays from different applications and sources, and their integration into a Desktop GIS are possible. This interoperability is achieved through the use of international open standards that include mechanisms for the integration and visualisation of information from multiple sources. The portable document format (PDF) is commonly used for printing and is a standard format that can be processed by many graphics programs and printers without loss of information. A GeoPDF enables the sharing of geospatial maps and data in PDF documents. Multiple, independent map frames with individual spatial reference systems are possible within a GeoPDF, for example, for map overlays or insets. Geospatial functionality of a GeoPDF includes scalable map display, layer visibility control, access to attribute data, coordinate queries, and spatial measurements. The full functionality of GeoPDFs requires free and user-friendly plug-ins for PDF readers and GIS software. A GeoPDF thus provides fundamental GIS functionality, turning the formerly static PDF map into an interactive, portable georeferenced map. GeoPDFs are easy to create and provide an interesting and valuable way to disseminate geomorphological maps. Our motivation to engage with the online distribution of geomorphological maps originates in the increasing number of web mapping applications available today, indicating that the Internet has become a medium for displaying geographical information in rich forms and through user-friendly interfaces. So, why not use the Internet to distribute geomorphological maps and enhance their practical application? Web mapping and dynamic PDF maps can play a key role in the movement towards a global dissemination of geomorphological information. This will be exemplified by live demonstrations of (i) existing geomorphological WebGIS applications, (ii) data merging from various sources using web map services, and (iii) free-to-download GeoPDF maps during the presentations.

  11. Does Value Stream Mapping affect the structure, process, and outcome quality in care facilities? A systematic review.

    PubMed

    Nowak, Marina; Pfaff, Holger; Karbach, Ute

    2017-08-24

    Quality improvement within health and social care facilities is needed and has to be evidence-based and patient-centered. Value Stream Mapping, a method of Lean management, aims to increase the patients' value and the quality of care through visualization and quantification of the care process. The aim of this research is to examine the effectiveness of Value Stream Mapping on structure, process, and outcome quality in care facilities. A systematic review is conducted. PubMed; EBSCOhost, including Business Source Complete, Academic Search Complete, PsycINFO, PSYNDEX, and SocINDEX with Full Text; Web of Knowledge; and EMBASE ScienceDirect are searched in February 2016. All peer-reviewed papers evaluating Value Stream Mapping and published in English or German from January 2000 are included. For data synthesis, all study results are categorized into Donabedian's model of structure, process, and outcome quality. To assess and interpret the effectiveness of Value Stream Mapping, the frequencies of the statistically examined results are considered. Of the 903 articles retrieved, 22 studies fulfill the inclusion criteria. Of these, 11 studies are used to answer the research question. Value Stream Mapping has positive effects on the time dimension of process and outcome quality. It seems to reduce non-value-added time (e.g., waiting time) and length of stay. All study designs are before-and-after studies without controls, and methodologically sophisticated studies are missing. For a final conclusion about Value Stream Mapping's effectiveness, more research with improved methodology is needed. Despite this lack of evidence, Value Stream Mapping has the potential to improve quality of care on the time dimension. The contextual influence has to be investigated to draw conclusions about the relationship between different quality domains when applying Value Stream Mapping. However, when using this review's conclusions, the limitation of including heterogeneous and potentially biased results has to be considered.

  12. Growing a hypercubical output space in a self-organizing feature map.

    PubMed

    Bauer, H U; Villmann, T

    1997-01-01

    Neural maps project data from an input space onto a neuron position in an (often lower dimensional) output space grid in a neighborhood-preserving way, with neighboring neurons in the output space responding to neighboring data points in the input space. A map-learning algorithm can achieve optimal neighborhood preservation only if the output space topology roughly matches the effective structure of the data in the input space. We here present a growth algorithm, called the GSOM or growing self-organizing map, which enhances a widespread map self-organization process, Kohonen's self-organizing feature map (SOFM), by adapting the output space grid during learning. The GSOM restricts the output space structure to a general hypercubical shape, with the overall dimensionality of the grid and its extensions along the different directions being subject to the adaptation. This constraint meets the demands of many larger information processing systems, of which the neural map can be a part. We apply our GSOM algorithm to three examples, two of which involve real-world data. Using recently developed methods for measuring the degree of neighborhood preservation in neural maps, we find the GSOM algorithm to produce maps that preserve neighborhoods in a nearly optimal fashion.
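
    For readers unfamiliar with the underlying learning rule, the sketch below shows one plain Kohonen SOFM update step, which the GSOM builds on; the growth of the hypercubical output grid described in the abstract is not reproduced here. Parameter values are illustrative.

```python
import numpy as np

def som_step(weights, grid, x, lr=0.1, sigma=1.0):
    """One Kohonen SOFM update.  weights: (n_units, d) codebook vectors;
    grid: (n_units, k) output-space coordinates of the units.  The GSOM's extra
    ingredient -- growing the hypercubical grid during learning -- is not shown."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))    # best-matching unit
    dist2 = np.sum((grid - grid[bmu]) ** 2, axis=1)         # output-space distances
    h = np.exp(-dist2 / (2.0 * sigma ** 2))                 # neighborhood function
    weights += lr * h[:, None] * (x - weights)              # pull neighbors toward x
    return weights

# toy usage: a 4x4 output grid mapping 3-D input data
grid = np.array([[i, j] for i in range(4) for j in range(4)], dtype=float)
weights = np.random.rand(16, 3)
for x in np.random.rand(200, 3):
    weights = som_step(weights, grid, x)
```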

  13. Field methods and data processing techniques associated with mapped inventory plots

    Treesearch

    William A. Bechtold; Stanley J. Zarnoch

    1999-01-01

    The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...

  14. The Impact of Concept Mapping on the Process of Problem-Based Learning

    ERIC Educational Resources Information Center

    Zwaal, Wichard; Otting, Hans

    2012-01-01

    A concept map is a graphical tool to activate and elaborate on prior knowledge, to support problem solving, promote conceptual thinking and understanding, and to organize and memorize knowledge. The aim of this study is to determine if the use of concept mapping (CM) in a problem-based learning (PBL) curriculum enhances the PBL process. The paper…

  15. Image display device in digital TV

    DOEpatents

    Choi, Seung Jong [Seoul, KR

    2006-07-18

    Disclosed is an image display device in a digital TV that is capable of carrying out conversion into various resolutions by using single bit map data in the digital TV. The image display device includes: a data processing part for executing bit map conversion, compression, restoration, and format conversion for text data; a memory for storing the bit map data obtained according to the bit map conversion and compression in the data processing part and image data inputted from an arbitrary receiving part, the receiving part receiving one of digital image data and analog image data; an image outputting part for reading the image data from the memory; and a display processing part for mixing the image data read from the image outputting part and the bit map data converted in format by the data processing part. Therefore, the image display device according to the present invention can convert text data in such a manner as to correspond with various resolutions, carry out compression of bit map data, thereby reducing the memory space, and support text data of an HTML format, thereby providing the image with text data of various shapes.

  16. Computer Programs to Display and Modify Data in Geographic Coordinates and Methods to Transfer Positions to and from Maps, with Applications to Gravity Data Processing, Global Positioning Systems, and 30-Meter Digital Elevation Models

    USGS Publications Warehouse

    Plouff, Donald

    1998-01-01

    Computer programs were written in the Fortran language to process and display gravity data with locations expressed in geographic coordinates. The programs and associated processes have been tested for gravity data in an area of about 125,000 square kilometers in northwest Nevada, southeast Oregon, and northeast California. This report discusses the geographic aspects of data processing. Utilization of the programs begins with application of a template (printed in PostScript format) to transfer locations obtained with Global Positioning Systems to and from field maps and includes a 5-digit geographic-based map naming convention for field maps. Computer programs, with source codes that can be copied, are used to display data values (printed in PostScript format) and data coverage, insert data into files, extract data from files, shift locations, test for redundancy, and organize data by map quadrangles. It is suggested that 30-meter Digital Elevation Models needed for gravity terrain corrections and other applications should be accessed in a file search by using the USGS 7.5-minute map name as a file name; for example, file '40117_B8.DEM' contains elevation data for the map with a southeast corner at lat 40° 07' 30" N. and lon 117° 52' 30" W.

  17. Mapping soil erosion risk in Serra de Grândola (Portugal)

    NASA Astrophysics Data System (ADS)

    Neto Paixão, H. M.; Granja Martins, F. M.; Zavala, L. M.; Jordán, A.; Bellinfante, N.

    2012-04-01

    Geomorphological processes can pose environmental risks to people and economic activities. Information and better knowledge of the genesis of these processes are important for environmental planning, since they allow risks to be modelled, quantified, and classified, which can mitigate the threats. The objective of this research is to assess the soil erosion risk in Serra de Grândola, a north-south oriented mountain ridge with an altitude of 383 m located in southwest Alentejo (southern Portugal). The study area is 675 km2, including the councils of Grândola, Santiago do Cacém, and Sines. The mapping of erosive status was based on the guidelines for measuring and mapping erosion processes in Mediterranean coastal areas proposed by PAP/RAC (1997) and later developed and modified by other authors in different areas. This method is based on the application of a geographic information system that integrates different types of spatial information into a digital terrain model and its derivative models. Erosive status is classified using information on soil erodibility, slope, land use, and vegetation cover. The rainfall erosivity map was obtained using the modified Fournier index, calculated from the mean monthly rainfall recorded at 30 meteorological stations with influence in the study area. Finally, the soil erosion risk map was produced by overlaying the erosive status map and the rainfall erosivity map.
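
    The rainfall erosivity step relies on the modified Fournier index. A small sketch using its standard formulation (sum of squared monthly precipitation divided by annual precipitation) is given below; the abstract does not state which variant was used, so treat this as an assumption, and the monthly values are invented for illustration.

```python
def modified_fournier_index(monthly_precip_mm):
    """Modified Fournier Index in its commonly used form (Arnoldus): sum of squared
    monthly precipitation divided by annual precipitation.  The exact variant used
    in the study is not stated in the abstract."""
    if len(monthly_precip_mm) != 12:
        raise ValueError("expected 12 monthly values")
    annual = sum(monthly_precip_mm)
    return sum(p * p for p in monthly_precip_mm) / annual if annual else 0.0

# toy example with a Mediterranean-type regime (values in mm, illustrative only)
monthly = [80, 70, 60, 45, 30, 10, 2, 3, 25, 60, 85, 90]
print(round(modified_fournier_index(monthly), 1))
```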

  18. Investigating the Use of 3d Geovisualizations for Urban Design in Informal Settlement Upgrading in South Africa

    NASA Astrophysics Data System (ADS)

    Rautenbach, V.; Coetzee, S.; Çöltekin, A.

    2016-06-01

    Informal settlements are a common occurrence in South Africa, and to improve in-situ circumstances of communities living in informal settlements, upgrades and urban design processes are necessary. Spatial data and maps are essential throughout these processes to understand the current environment, plan new developments, and communicate the planned developments. All stakeholders need to understand maps to actively participate in the process. However, previous research demonstrated that map literacy was relatively low for many planning professionals in South Africa, which might hinder effective planning. Because 3D visualizations resemble the real environment more than traditional maps, many researchers posited that they would be easier to interpret. Thus, our goal is to investigate the effectiveness of 3D geovisualizations for urban design in informal settlement upgrading in South Africa. We consider all involved processes: 3D modelling, visualization design, and cognitive processes during map reading. We found that procedural modelling is a feasible alternative to time-consuming manual modelling, and can produce high quality models. When investigating the visualization design, the visual characteristics of 3D models and relevance of a subset of visual variables for urban design activities of informal settlement upgrades were qualitatively assessed. The results of three qualitative user experiments contributed to understanding the impact of various levels of complexity in 3D city models and map literacy of future geoinformatics and planning professionals when using 2D maps and 3D models. The research results can assist planners in designing suitable 3D models that can be used throughout all phases of the process.

  19. Using mind mapping techniques for rapid qualitative data analysis in public participation processes.

    PubMed

    Burgess-Allen, Jilla; Owen-Smith, Vicci

    2010-12-01

    In a health service environment where timescales for patient participation in service design are short and resources scarce, a balance needs to be achieved between research rigour and the timeliness and utility of the findings of patient participation processes. To develop a pragmatic mind mapping approach to managing the qualitative data from patient participation processes. While this article draws on experience of using mind maps in a variety of participation processes, a single example is used to illustrate the approach. In this example mind maps were created during the course of patient participation focus groups. Two group discussions were also transcribed verbatim to allow comparison of the rapid mind mapping approach with traditional thematic analysis of qualitative data. The illustrative example formed part of a local alcohol service review which included consultation with local alcohol service users, their families and staff groups. The mind mapping approach provided a pleasing graphical format for representing the key themes raised during the focus groups. It helped stimulate and galvanize discussion and keep it on track, enhanced transparency and group ownership of the data analysis process, allowed a rapid dynamic between data collection and feedback, and was considerably faster than traditional methods for the analysis of focus groups, while resulting in similar broad themes. This study suggests that the use of a mind mapping approach to managing qualitative data can provide a pragmatic resolution of the tension between limited resources and quality in patient participation processes. © 2010 The Authors. Health Expectations © 2010 Blackwell Publishing Ltd.

  20. Lean implementation in primary care health visiting services in National Health Service UK.

    PubMed

    Grove, A L; Meredith, J O; Macintyre, M; Angelis, J; Neailey, K

    2010-10-01

    This paper presents the findings of a 13-month lean implementation in National Health Service (NHS) primary care health visiting services from May 2008 to June 2009. Lean was chosen for this study because of its reported success in other healthcare organisations. Value-stream mapping was utilised to map out essential tasks for the participating health visiting service. Stakeholder mapping was conducted to determine the links between all relevant stakeholders. Waste processes were then identified through discussions with these stakeholders, and a redesigned future state process map was produced. Quantitative data were provided through a 10-day time-and-motion study of a selected number of staff within the service. This was analysed to provide an indication of waste activity that could be removed from the system following planned improvements. The value-stream map demonstrated that there were 67 processes in the original health visiting service studied. Analysis revealed that 65% of these processes were waste and could be removed in the redesigned process map. The baseline time-and-motion data demonstrate that clinical staff performed on average 15% waste activities, and the administrative support staff performed 46% waste activities. Opportunities for significant waste reduction have been identified during the study using the lean tools of value-stream mapping and a time-and-motion study. These opportunities include simplification of standard tasks, reduction in paperwork and standardisation of processes. Successful implementation of these improvements will free up resources within the organisation which can be redirected towards providing better direct care to patients.

  1. A Mathematical Model for Storage and Recall of Images using Targeted Synchronization of Coupled Maps.

    PubMed

    Palaniyandi, P; Rangarajan, Govindan

    2017-08-21

    We propose a mathematical model for storage and recall of images using coupled maps. We start by theoretically investigating targeted synchronization in coupled map systems wherein only a desired (partial) subset of the maps is made to synchronize. A simple method is introduced to specify coupling coefficients such that targeted synchronization is ensured. The principle of this method is extended to storage/recall of images using coupled Rulkov maps. The process of adjusting coupling coefficients between Rulkov maps (often used to model neurons) for the purpose of storing a desired image mimics the process of adjusting synaptic strengths between neurons to store memories. Our method uses both synchronisation and synaptic weight modification, as the human brain is thought to do. The stored image can be recalled by providing an initial random pattern to the dynamical system. The storage and recall of the standard image of Lena is explicitly demonstrated.
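
    As a generic illustration of the synchronization mechanism the method builds on (not the authors' targeted scheme or their Rulkov-map image storage), the sketch below couples two identical chaotic logistic maps diffusively; with sufficiently strong coupling their states converge.

```python
import numpy as np

def f(x, r=3.9):
    """Logistic map, used here only as a simple chaotic stand-in."""
    return r * x * (1.0 - x)

rng = np.random.default_rng(0)
x = rng.random(2)                      # two maps, random initial conditions
eps = 0.4                              # coupling strength (illustrative)
for n in range(200):
    fx = f(x)
    x = np.array([(1 - eps) * fx[0] + eps * fx[1],
                  (1 - eps) * fx[1] + eps * fx[0]])

print(abs(x[0] - x[1]))                # ~0 once the pair has synchronized
```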

  2. Graphic Strategies for Analyzing and Interpreting Curricular Mapping Data

    PubMed Central

    Leonard, Sean T.

    2010-01-01

    Objective: To describe curricular mapping strategies used in analyzing and interpreting curricular mapping data and present findings on how these strategies were used to facilitate curricular development. Design: Nova Southeastern University's doctor of pharmacy curriculum was mapped to the college's educational outcomes. The mapping process included development of educational outcomes followed by analysis of course material and semi-structured interviews with course faculty members. Data collected per course outcome included learning opportunities and assessment measures used. Assessment: Nearly 1,000 variables and 10,000 discrete rows of curricular data were collected. Graphic representations of curricular data were created using bar charts and stacked area graphs relating the learning opportunities to the educational outcomes. Graphs were used in the curricular evaluation and development processes to facilitate the identification of curricular holes, sequencing misalignments, learning opportunities, and assessment measures. Conclusion: Mapping strategies that use graphic representations of curricular data serve as effective diagnostic and curricular development tools. PMID:20798804

  3. Method for removing atomic-model bias in macromolecular crystallography

    DOEpatents

    Terwilliger, Thomas C [Santa Fe, NM

    2006-08-01

    Structure factor bias in an electron density map for an unknown crystallographic structure is minimized by using information in a first electron density map to elicit expected structure factor information. Observed structure factor amplitudes are combined with a starting set of crystallographic phases to form a first set of structure factors. A first electron density map is then derived and features of the first electron density map are identified to obtain expected distributions of electron density. Crystallographic phase probability distributions are established for possible crystallographic phases of reflection k, and the process is repeated as k is indexed through all of the plurality of reflections. An updated electron density map is derived from the crystallographic phase probability distributions for each one of the reflections. The entire process is then iterated to obtain a final set of crystallographic phases with minimum bias from known electron density maps.

  4. Graphic strategies for analyzing and interpreting curricular mapping data.

    PubMed

    Armayor, Graciela M; Leonard, Sean T

    2010-06-15

    To describe curricular mapping strategies used in analyzing and interpreting curricular mapping data and present findings on how these strategies were used to facilitate curricular development. Nova Southeastern University's doctor of pharmacy curriculum was mapped to the college's educational outcomes. The mapping process included development of educational outcomes followed by analysis of course material and semi-structured interviews with course faculty members. Data collected per course outcome included learning opportunities and assessment measures used. Nearly 1,000 variables and 10,000 discrete rows of curricular data were collected. Graphic representations of curricular data were created using bar charts and stacked area graphs relating the learning opportunities to the educational outcomes. Graphs were used in the curricular evaluation and development processes to facilitate the identification of curricular holes, sequencing misalignments, learning opportunities, and assessment measures. Mapping strategies that use graphic representations of curricular data serve as effective diagnostic and curricular development tools.

  5. Validation of a novel mapping system and utility for mapping complex atrial tachycardias.

    PubMed

    Honarbakhsh, S; Hunter, R J; Dhillon, G; Ullah, W; Keating, E; Providencia, R; Chow, A; Earley, M J; Schilling, R J

    2018-03-01

    This study sought to validate a novel wavefront mapping system utilizing whole-chamber basket catheters (CARTOFINDER, Biosense Webster). The system was validated in terms of (1) mapping atrial-paced beats and (2) mapping complex wavefront patterns in atrial tachycardia (AT). Patients undergoing catheter ablation for AT and persistent AF were included. A 64-pole-basket catheter was used to acquire unipolar signals that were processed by CARTOFINDER mapping system to generate dynamic wavefront propagation maps. The left atrium was paced from four sites to demonstrate focal activation. ATs were mapped with the mechanism confirmed by conventional mapping, entrainment, and response to ablation. Twenty-two patients were included in the study (16 with AT and 6 with AF initially who terminated to AT during ablation). In total, 172 maps were created with the mapping system. It correctly identified atrial-pacing sites in all paced maps. It accurately mapped 9 focal/microreentrant and 18 macroreentrant ATs both in the left and right atrium. A third and fourth observer independently identified the sites of atrial pacing and the AT mechanism from the CARTOFINDER maps, while being blinded to the conventional activation maps. This novel mapping system was effectively validated by mapping focal activation patterns from atrial-paced beats. The system was also effective in mapping complex wavefront patterns in a range of ATs in patients with scarred atria. The system may therefore be of practical use in the mapping and ablation of AT and could have potential for mapping wavefront activations in AF. © 2018 Wiley Periodicals, Inc.

  6. Occupancy mapping and surface reconstruction using local Gaussian processes with Kinect sensors.

    PubMed

    Kim, Soohwan; Kim, Jonghyuk

    2013-10-01

    Although RGB-D sensors have been successfully applied to visual SLAM and surface reconstruction, most of the applications aim at visualization. In this paper, we propose a novel method of building continuous occupancy maps and reconstructing surfaces in a single framework for both navigation and visualization. In particular, we apply a Bayesian nonparametric approach, Gaussian process classification, to occupancy mapping. However, it suffers from a high computational complexity of O(n³) + O(n²m), where n and m are the numbers of training and test data, respectively, limiting its use for large-scale mapping with huge training data, which is common with high-resolution RGB-D sensors. Therefore, we partition both training and test data with a coarse-to-fine clustering method and apply Gaussian processes to each local cluster. In addition, we consider Gaussian processes as implicit functions, and thus extract iso-surfaces from the scalar fields (continuous occupancy maps) using marching cubes. By doing so, we are able to build two types of map representations within a single framework of Gaussian processes. Experimental results with 2-D simulated data show that the accuracy of our approximated method is comparable to previous work, while the computational time is dramatically reduced. We also demonstrate our method with 3-D real data to show its feasibility in large-scale environments.
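
    A rough sketch of the "local GP" idea follows: partition the training data into clusters and fit one occupancy classifier per cluster, so that no single O(n³) GP has to see all n points. The coarse-to-fine clustering and marching-cubes surface extraction from the paper are omitted, and the data are synthetic stand-ins.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Synthetic 2-D stand-ins for free (0) / occupied (1) observations.
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 2))
y = (X[:, 0] + X[:, 1] > 10).astype(int)            # toy occupancy boundary

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)

def fit_local(Xc, yc):
    """Fit one local occupancy model; fall back to a constant for single-class clusters."""
    if len(np.unique(yc)) < 2:
        return lambda q: float(yc[0])
    gp = GaussianProcessClassifier(kernel=1.0 * RBF(1.0)).fit(Xc, yc)
    return lambda q: gp.predict_proba(q[None, :])[0, 1]

local_models = [fit_local(X[km.labels_ == i], y[km.labels_ == i]) for i in range(5)]

queries = rng.uniform(0, 10, size=(5, 2))
p_occ = [local_models[c](q) for c, q in zip(km.predict(queries), queries)]
print(np.round(p_occ, 2))                            # per-query occupancy probabilities
```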

  7. Voyager Cartography

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Bridges, P. M.; Mullins, K. F.

    1985-01-01

    The Jovian and Saturnian satellites are being mapped at several scales from Voyager 1 and 2 data. The maps include specially formatted color mosaics, controlled photomosaics, and airbrush maps. More than 500 Voyager images of the Jovian and Saturnian satellites were radiometrically processed in preparation for cartographic processing. Of these images, 235 were geometrically transformed to map projections for base mosaic compilations. Special techniques for producing hybrid photomosaic/airbrush maps of Callisto are under investigation. The techniques involve making controlled computer mosaics of all available images, with the highest resolution images superimposed on the lowest resolution images. The mosaics are then improved by airbrushing: seams and artifacts are removed, and image details that had been lost to saturation in some images are enhanced. A controlled mosaic of the northern hemisphere of Rhea is complete, as is all processing for a similar mosaic of the equatorial region. Current plans and the status of the various series are shown in a table.

  8. Values mapping with Latino forest users: Contributing to the dialogue on multiple land use conflict management

    Treesearch

    Kelly Biedenweg; Lee Cerveny; Rebecca J. McLain

    2014-01-01

    Participatory mapping of landscape values is gaining ground as a method for engaging communities and stakeholders in natural resource management. Socio-spatial mapping allows the public to identify places of economic, social, cultural, or personal importance. In addition to providing data for planning and land management, the mapping process can open dialogue about...

  9. Thematic and positional accuracy assessment of digital remotely sensed data

    Treesearch

    Russell G. Congalton

    2007-01-01

    Accuracy assessment or validation has become a standard component of any land cover or vegetation map derived from remotely sensed data. Knowing the accuracy of the map is vital to any decisionmaking performed using that map. The process of assessing the map accuracy is time consuming and expensive. It is very important that the procedure be well thought out and...

  10. Concept Mapping in the Humanities to Facilitate Reflection: Externalizing the Relationship between Public and Personal Learning

    ERIC Educational Resources Information Center

    Kandiko, Camille; Hay, David; Weller, Saranne

    2013-01-01

    This article discusses how mapping techniques were used in university teaching in a humanities subject. The use of concept mapping was expanded as a pedagogical tool, with a focus on reflective learning processes. Data were collected through a longitudinal study of concept mapping in a university-level Classics course. This was used to explore how…

  11. Scanning and georeferencing historical USGS quadrangles

    USGS Publications Warehouse

    Fishburn, Kristin A.; Davis, Larry R.; Allord, Gregory J.

    2017-06-23

    The U.S. Geological Survey (USGS) National Geospatial Program is scanning published USGS 1:250,000-scale and larger topographic maps printed between 1884, the inception of the topographic mapping program, and 2006. The goal of this project, which began publishing the Historical Topographic Map Collection in 2011, is to provide access to a digital repository of USGS topographic maps that is available to the public at no cost. For more than 125 years, USGS topographic maps have accurately portrayed the complex geography of the Nation. The USGS is the Nation’s largest producer of traditional topographic maps, and, prior to 2006, USGS topographic maps were created using traditional cartographic methods and printed using a lithographic process. The next generation of topographic maps, US Topo, is being released by the USGS in digital form, and newer technologies make it possible to also deliver historical maps in the same electronic format that is more publicly accessible.

  12. Applications of Remote Sensing and GIS(Geographic Information System) in Crime Analysis of Gujranwala City.

    NASA Astrophysics Data System (ADS)

    Munawar, Iqra

    2016-07-01

    Crime mapping is a dynamic process. It can be used to assist all stages of the problem solving process. Mapping crime can help police protect citizens more effectively. The decision to utilize a certain type of map or design element may change based on the purpose of a map, the audience, or the available data. If the purpose of the crime analysis map is to assist in the identification of a particular problem, selected data may be mapped to identify patterns of activity that have previously gone undetected. The main objective of this research was to study the spatial distribution patterns of four common crimes, i.e., narcotics, arms, burglary, and robbery, in Gujranwala City using spatial statistical techniques to identify the hotspots. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Crime analysis mapping can be used to conduct a comprehensive spatial analysis of the problem. Graphic presentations of such findings provide a powerful medium to communicate conditions, patterns, and trends, thus creating an avenue for analysts to bring about significant policy changes. Moreover, crime mapping also helps reduce crime rates.
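    As a rough illustration of the hotspot step described above, the sketch below computes the Getis-Ord Gi* statistic over gridded crime counts with plain NumPy. It assumes a binary spatial weight matrix whose diagonal is 1 (the "star" form), and the variable names are illustrative rather than taken from the study.

```python
import numpy as np

def getis_ord_gi_star(x, W):
    """Getis-Ord Gi* z-score for each location.

    x : 1-D array of counts (e.g., robberies per grid cell)
    W : binary spatial weight matrix with W[i, i] = 1, so each cell's
        own count is included in its neighbourhood (the "star" form)
    """
    n = x.size
    x_bar = x.mean()
    s = np.sqrt((x ** 2).mean() - x_bar ** 2)

    wx = W @ x                      # weighted sum of neighbouring counts
    w_sum = W.sum(axis=1)           # sum of weights per location
    w_sq = (W ** 2).sum(axis=1)

    num = wx - x_bar * w_sum
    den = s * np.sqrt((n * w_sq - w_sum ** 2) / (n - 1))
    return num / den                # large positive z-scores indicate hot spots
```

    Cells with Gi* z-scores above roughly 1.96 would typically be flagged as statistically significant hot spots.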

  13. Exploring Pacific Seamounts through Telepresence Mapping on the NOAA Ship Okeanos Explorer

    NASA Astrophysics Data System (ADS)

    Lobecker, E.; Malik, M.; Sowers, D.; Kennedy, B. R.

    2016-12-01

    Telepresence utilizes modern computer networks and a high-bandwidth satellite connection to enable remote users to participate virtually in ocean research and exploration cruises. NOAA's Office of Ocean Exploration and Research (OER) has been leveraging telepresence capabilities since the early 2000s. Through telepresence, remote users have provided support for operations planning and execution, troubleshooting hardware and software, and data interpretation during exploratory ocean mapping and remotely operated vehicle missions conducted by OER. The potential for this technology's application to immersive data acquisition and processing during mapping missions, however, has not yet been fully realized. We report the results of the application of telepresence to an 18-day, 24-hours-a-day seafloor mapping expedition with the NOAA Ship Okeanos Explorer. The mapping team was split between shipboard and shore-based mission team members based at the Exploration Command Center at the University of New Hampshire. This cruise represented the third dedicated mapping cruise in a multi-year NOAA Campaign to Address the Pacific monument Science, Technology, and Ocean Needs (CAPSTONE). Cruise objectives included mapping several previously unmapped seamounts in the Wake Atoll Unit of the recently expanded Pacific Remote Islands Marine National Monument, and mapping of prominent seamount, ridge, and fracture zone features during transits. We discuss (1) expanded shore-based data processing of multiple sonar data streams leading to enhanced, rapid, initial site characterization, (2) remote access control of shipboard sonar data acquisition and processing computers, and (3) potential for broadening multidisciplinary applications of ocean mapping cruises, including outreach, education, and communications efforts focused on expanding societal awareness and the benefits of ocean exploration.

  14. First Steps in Initiating an Effective Maternal, Neonatal, and Child Health Program in Urban Slums: the BRAC Manoshi Project's Experience with Community Engagement, Social Mapping, and Census Taking in Bangladesh.

    PubMed

    Marcil, Lucy; Afsana, Kaosar; Perry, Henry B

    2016-02-01

    The processes for implementing effective programs at scale in low-income countries have not been well-documented in the peer-reviewed literature. This article describes the initial steps taken by one such program, the BRAC Manoshi Project, which now reaches a population of 6.9 million. The project has achieved notable increases in facility births and reductions in maternal and neonatal mortality. The focus of the paper is on the initial steps: community engagement, social mapping, and census taking. Community engagement began with (1) engaging local leaders, (2) creating Maternal, Neonatal, and Child Health Committees for populations of approximately 10,000 people, (3) responding to advice from the community, (4) social mapping of the community, and (5) census taking. Social mapping involved community members working with BRAC staff to map all important physical features that affect how the community carries out its daily functions, such as alleys, lanes and roads, schools, mosques, markets, pharmacies, health facilities, latrine sites, and ponds. As the social mapping progressed, it became possible to conduct household censuses with maps identifying every household and listing family members by household. Again, this was a process of collaboration between BRAC staff and community members. Thus, social mapping and census taking were also instrumental for advancing community engagement. These three processes (community engagement, social mapping, and census taking) can be valuable strategies for strengthening health programs in urban slum settings of low-income countries.

  15. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-07-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff and flood predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP-mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexities of automatic DRP-mapping approaches affect hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison, and a deviation map were derived. The automatically derived DRP maps were used in synthetic runoff simulations with an adapted version of the PREVAH hydrological model, and the simulation results were compared with those from simulations using the reference maps. The DRP maps derived with the automatic approach with the highest complexity and data requirements were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. Furthermore, we argue that expert knowledge should be used not only for model building and constraining, but also in the landscape classification phase.
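    The abstract mentions deriving measures of agreement between the automatically derived and reference DRP maps. As one illustration of such a comparison (the paper's specific measures are not named beyond "agreement and association"), the sketch below computes overall agreement and Cohen's kappa between two categorical rasters on the same grid; it assumes class labels are integers 0..n_classes-1 and all names are illustrative.

```python
import numpy as np

def overall_agreement_and_kappa(map_a, map_b, n_classes):
    """Compare two categorical raster maps cell by cell (same grid assumed)."""
    a, b = map_a.ravel(), map_b.ravel()
    confusion = np.zeros((n_classes, n_classes))
    np.add.at(confusion, (a, b), 1)          # build the confusion matrix

    total = confusion.sum()
    p_o = np.trace(confusion) / total                        # observed agreement
    p_e = (confusion.sum(0) @ confusion.sum(1)) / total ** 2 # chance agreement
    kappa = (p_o - p_e) / (1 - p_e)
    return p_o, kappa
```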

  16. Planetary Geologic Mapping Handbook - 2010. Appendix

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.

    2010-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views, and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well. Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically. As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  17. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
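    Since the abstract describes MapFactory as a design pattern for creating different maps from an input design specification, the sketch below shows a generic factory-pattern shape such a system might take. The class and field names (DesignSpec, ChoroplethMap, HeatMap, the theme registry) are hypothetical illustrations under that assumption, not the authors' published API.

```python
from dataclasses import dataclass

@dataclass
class DesignSpec:
    theme: str          # e.g. "choropleth" or "heatmap" (illustrative themes)
    extent: tuple       # bounding box of the area to be mapped
    resolution_m: float # target ground resolution in metres

class ChoroplethMap:
    def __init__(self, spec: DesignSpec): self.spec = spec
    def render(self, data): ...   # rendering logic would go here

class HeatMap:
    def __init__(self, spec: DesignSpec): self.spec = spec
    def render(self, data): ...

class MapFactory:
    """Factory that picks a map product class from the design specification."""
    _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

    @classmethod
    def create(cls, spec: DesignSpec):
        try:
            return cls._registry[spec.theme](spec)
        except KeyError:
            raise ValueError(f"No map type registered for theme {spec.theme!r}")

# Usage: the factory hides which concrete map class is instantiated.
density_map = MapFactory.create(DesignSpec("heatmap", (16.0, -35.0, 33.0, -22.0), 250.0))
```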

  18. Improving Flood Risk Maps as a Capacity Building Activity: Fostering Public Participation and Raising Flood Risk Awareness in the German Mulde Region (project RISK MAP)

    NASA Astrophysics Data System (ADS)

    Luther, J.; Meyer, V.; Kuhlicke, C.; Scheuer, S.; Unnerstall, H.

    2012-04-01

    The EU Floods Directive requires the establishment of flood risk maps for high risk areas in all EU Member States by 2013. However, if existing at all, the current practice of risk mapping still shows some deficits: risk maps are often seen as an information tool rather than a communication tool. This means, for example, that important local knowledge is not incorporated, which contrasts with the understanding of capacity building, which calls for engaging individuals in the process of learning and adapting to change and for establishing a more interactive public administration that learns equally from its actions and from the feedback it receives. Furthermore, the contents of risk maps often do not match the requirements of the end users, so that risk maps are often designed and visualised in a way which cannot be easily understood by laypersons and/or which is not suitable for the respective needs of public authorities in risk and flood event management. The project RISK MAP aimed at improving flood risk maps as a means to foster public participation and raise flood risk awareness. To achieve this aim, RISK MAP (1) developed rules for appropriate stakeholder participation enabling the incorporation of local knowledge and preferences; (2) improved the content of risk maps by considering different risk criteria through the use of a deliberative multicriteria risk mapping tool; and (3) improved the visualisation of risk maps in order to produce user-friendly risk maps by applying the experimental graphic semiology (EGS) method that uses the eye tracking approach. The research was carried out in five European case studies where the status quo of risk mapping and the legal framework was analysed, several stakeholder interviews and workshops were conducted, the visual perception of risk maps was tested and - based on this empirical work - exemplary improved risk maps were produced. The presentation and paper will outline the main findings of the project, which ended in September 2011, focussing on the participatory aspects in one of the German case studies (the Mulde River in Saxony). In short, different map users such as strategic planners, emergency managers or the (affected) public require different maps, with varying information density and complexity. The purpose of participation may therefore have a substantive rationale (i.e. improving the content, including local knowledge) or a more instrumental rationale (i.e. building trust, raising awareness, increasing legitimacy). The degree to which both rationales are accommodated depends on the project objectives and determines the participants and process type. In the Mulde case study, both the process of collaborating with each other and considering the (local) knowledge and different experiences, as well as the results, were highly appreciated. Hazard and risk maps are thus not an end product: they could be complemented, for example, by emergency management information on existing or planned defences, evacuation routes, and assembly points, and they should be embedded in a participatory maintenance and updating framework. Map visualisation could be enhanced by using more common and/or self-explanatory symbols, text and a limited number of colour grades for hazard and risk information. Keywords: Flood mapping, hazard and risk maps, participation, risk communication, flood risk awareness, emergency management

  19. a Model Study of Small-Scale World Map Generalization

    NASA Astrophysics Data System (ADS)

    Cheng, Y.; Yin, Y.; Li, C. M.; Wu, W.; Guo, P. P.; Ma, X. L.; Hu, F. M.

    2018-04-01

    With globalization and rapid development, every field is taking an increasing interest in physical geography and human economics, and there is a surging demand worldwide for small-scale world maps in large formats. Advancing automated mapping technology, and in particular the production of small-scale maps of global extent, is therefore a key problem for the cartographic field. In light of this, this paper adopts an improved generalization model that separates geographic data from mapping data, built mainly on a cross-platform symbol library and an automatic map-making knowledge engine. In the cross-platform symbol library, map symbols and the corresponding features in the geographic information are configured at all scale levels. The automatic map-making knowledge engine consists of 97 types, 1,086 subtypes, 21,845 basic algorithms, and over 2,500 related functional modules. To evaluate the accuracy and visual effect of the model for topographic and thematic maps, we take small-scale world map generalization as an example. After the generalization process, combining and simplifying the scattered islands makes the map clearer at the 1:2.1 billion scale, and the map features are more complete and accurate. The model not only significantly improves map generalization at various scales but also integrates map making across scales, suggesting that it can serve as a reference for cartographic generalization at multiple scales.

  20. Design and application of star map simulation system for star sensors

    NASA Astrophysics Data System (ADS)

    Wu, Feng; Shen, Weimin; Zhu, Xifang; Chen, Yuheng; Xu, Qinquan

    2013-12-01

    Modern star sensors measure spacecraft attitude automatically and are essential to spacecraft performance. They obtain highly accurate attitudes by applying algorithms to star maps captured by the star camera mounted on them. Star maps therefore play an important role in designing star cameras and developing processing algorithms, and they are needed to fully verify the performance of star sensors before launch. However, it is not always practical to collect abundant star maps by photographing the sky, so computer-based star map simulation is attractive for its low cost and convenience. A method is proposed to simulate star maps by programming against and extending the optical design program ZEMAX, and a star map simulation system is established. First, based on an analysis of how star sensors measure attitude and of the basic method of optical system design in ZEMAX, the principle of simulating star sensor imaging is presented in detail. The treatment of false stars and noise, and the output of maps, is discussed, and corresponding approaches are proposed. The star map simulation program is then built by external programming, and its user interface and operation are introduced. Applications of the simulation method in evaluating the optical system, the star image extraction and star identification algorithms, and in calibrating system errors are presented. The results show that the proposed simulation method strongly supports star sensor studies and efficiently improves star sensor performance.
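    To make the imaging principle concrete, the sketch below projects catalogue star unit vectors onto a pinhole detector given the camera attitude. This is a generic geometric sketch, not the authors' ZEMAX-based pipeline; noise, false stars, and point-spread-function rendering (which the paper handles explicitly) are omitted, and all names are illustrative.

```python
import numpy as np

def project_stars(unit_vectors, R_body_to_inertial, focal_px, width, height):
    """Project catalogue star unit vectors onto a pinhole detector.

    unit_vectors       : (n, 3) star direction unit vectors in the inertial frame
    R_body_to_inertial : 3x3 attitude matrix of the star camera (body -> inertial)
    focal_px           : focal length expressed in pixels
    width, height      : detector size in pixels
    """
    # Rotate inertial directions into the camera frame (row vectors: v_cam = v_in @ R).
    cam = unit_vectors @ R_body_to_inertial
    visible = cam[:, 2] > 0                        # keep stars in front of the lens
    u = focal_px * cam[visible, 0] / cam[visible, 2] + width / 2
    v = focal_px * cam[visible, 1] / cam[visible, 2] + height / 2
    on_chip = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    return np.column_stack([u[on_chip], v[on_chip]])   # pixel coordinates of stars
```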

  1. Does Sleep Improve Your Grammar? Preferential Consolidation of Arbitrary Components of New Linguistic Knowledge

    PubMed Central

    Mirković, Jelena; Gaskell, M. Gareth

    2016-01-01

    We examined the role of sleep-related memory consolidation processes in learning new form-meaning mappings. Specifically, we examined a Complementary Learning Systems account, which implies that sleep-related consolidation should be more beneficial for new hippocampally dependent arbitrary mappings (e.g. new vocabulary items) relative to new systematic mappings (e.g. grammatical regularities), which can be better encoded neocortically. The hypothesis was tested using a novel language with an artificial grammatical gender system. Stem-referent mappings implemented arbitrary aspects of the new language, and determiner/suffix+natural gender mappings implemented systematic aspects (e.g. tib scoiffesh + ballerina, tib mofeem + bride; ked jorool + cowboy, ked heefaff + priest). Importantly, the determiner-gender and the suffix-gender mappings varied in complexity and salience, thus providing a range of opportunities to detect beneficial effects of sleep for this type of mapping. Participants were trained on the new language using a word-picture matching task, and were tested after a 2-hour delay which included sleep or wakefulness. Participants in the sleep group outperformed participants in the wake group on tests assessing memory for the arbitrary aspects of the new mappings (individual vocabulary items), whereas we saw no evidence of a sleep benefit in any of the tests assessing memory for the systematic aspects of the new mappings: Participants in both groups extracted the salient determiner-natural gender mapping, but not the more complex suffix-natural gender mapping. The data support the predictions of the complementary systems account and highlight the importance of the arbitrariness/systematicity dimension in the consolidation process for declarative memories. PMID:27046022

  2. Natural Resource Information System, design analysis

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.

  3. Demonstration of wetland vegetation mapping in Florida from computer-processed satellite and aircraft multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Butera, M. K.

    1979-01-01

    The success of remotely mapping wetland vegetation of the southwestern coast of Florida is examined. A computerized technique to process aircraft and LANDSAT multispectral scanner data into vegetation classification maps was used. The cost effectiveness of this mapping technique was evaluated in terms of user requirements, accuracy, and cost. Results indicate that mangrove communities are classified most cost effectively by the LANDSAT technique, with an accuracy of approximately 87 percent and a cost of approximately 3 cents per hectare, compared to $46.50 per hectare for conventional ground survey methods.

  4. Mapping racism.

    PubMed

    Moss, Donald B

    2006-01-01

    The author uses the metaphor of mapping to illuminate a structural feature of racist thought, locating the degraded object along vertical and horizontal axes. These axes establish coordinates of hierarchy and of distance. With the coordinates in place, racist thought begins to seem grounded in natural processes. The other's identity becomes consolidated, and parochialism results. The use of this kind of mapping is illustrated via two patient vignettes. The author presents Freud's (1905, 1927) views in relation to such a "mapping" process, as well as Adorno's (1951) and Baldwin's (1965). Finally, the author conceptualizes the crucial status of primitivity in the workings of racist thought.

  5. Becoming Bermuda grass: mapping and tracing rhizomes to practice reflexivity

    NASA Astrophysics Data System (ADS)

    Murakami, Christopher D.; Siegel, Marcelle A.

    2017-09-01

    This narrative project used rhizomatic analysis and reflexivity to describe a layered process of responding to a student's identity of non-participation within an undergraduate science classroom. Mapping rhizomes represents an ongoing and experimental process in consciousness. Rhizomatic mapping in educational studies is too often left out of the products of academic pursuits. In this paper, we try to capture this process, and let the process capture us. This manuscript starts with a focus on just one student, but maps our reflexive terrain that helped us think in new ways about persistent problems in science learning. As we decided how to address this student's identity of non-participation, we learned about the intertwined stories of the researchers and the researched and the challenges of designing inclusive learning environments.

  6. Curriculum Mapping in Academic Libraries

    ERIC Educational Resources Information Center

    Buchanan, Heidi; Webb, Katy Kavanagh; Houk, Amy Harris; Tingelstad, Catherine

    2015-01-01

    Librarians at four different academic institutions concurrently completed curriculum mapping projects using varying methods to analyze their information literacy instruction. Curriculum mapping is a process for systematically evaluating components of an instructional program for cohesiveness, proper sequencing, and goal achievement. There is a…

  7. Translation from the collaborative OSM database to cartography

    NASA Astrophysics Data System (ADS)

    Hayat, Flora

    2018-05-01

    The OpenStreetMap (OSM) database includes original items that are very useful for geographical analysis and for creating thematic maps. Contributors record in the open database various themes regarding amenities, leisure, transport, buildings, and boundaries. The Michelin mapping department develops map prototypes to test the feasibility of mapping based on OSM. To translate the OSM database structure into a database structure fitted to Michelin graphic guidelines, a research project is under development; it aims at defining the right structure for Michelin's uses. The research project relies on the analysis of semantic and geometric heterogeneities in OSM data. To that end, Michelin implements methods to transform the input geographical database into a cartographic image dedicated to specific uses (routing and tourist maps). The paper focuses on the mapping tools available to produce a personalised spatial database. Based on the processed data, paper and web maps can be displayed. Two prototypes are described in this article: a vector tile web map and a mapping method to produce paper maps on a regional scale. The vector tile mapping method offers easy navigation within the map and within graphic and thematic guidelines. Paper maps can be partly automatically drawn. Drawing automation and data management are part of the map creation, as is the final hand-drawing phase. Both prototypes have been set up using the OSM technical ecosystem.

  8. LANDSAT survey of near-shore ice conditions along the Arctic coast of Alaska

    NASA Technical Reports Server (NTRS)

    Stringer, W. J. (Principal Investigator); Barrett, S. A.

    1978-01-01

    The author has identified the following significant results. Winter and spring near-shore ice conditions were analyzed for the Beaufort Sea 1973-77, and the Chukchi Sea 1973-76. LANDSAT imagery was utilized to map major ice features related to regional ice morphology. Significant features from individual LANDSAT image maps were combined to yield regional maps of major ice ridge systems for each year of study and maps of flaw lead systems for representative seasons during each year. These regional maps were, in turn, used to prepare seasonal ice morphology maps. These maps showed, in terms of a zonal analysis, regions of statistically uniform ice behavior. The behavioral characteristics of each zone were described in terms of coastal processes and bathymetric configuration.

  9. Active Interaction Mapping as a tool to elucidate hierarchical functions of biological processes.

    PubMed

    Farré, Jean-Claude; Kramer, Michael; Ideker, Trey; Subramani, Suresh

    2017-07-03

    Increasingly, various 'omics data are contributing significantly to our understanding of novel biological processes, but it has not been possible to iteratively elucidate hierarchical functions in complex phenomena. We describe a general systems biology approach called Active Interaction Mapping (AI-MAP), which elucidates the hierarchy of functions for any biological process. Existing and new 'omics data sets can be iteratively added to create and improve hierarchical models which enhance our understanding of particular biological processes. The best datatypes to further improve an AI-MAP model are predicted computationally. We applied this approach to our understanding of general and selective autophagy, which are conserved in most eukaryotes, setting the stage for the broader application to other cellular processes of interest. In the particular application to autophagy-related processes, we uncovered and validated new autophagy and autophagy-related processes, expanded known autophagy processes with new components, integrated known non-autophagic processes with autophagy, and predicted other unexplored connections.

  10. Using Sentinel-1 SAR satellites to map wind speed variation across offshore wind farm clusters

    NASA Astrophysics Data System (ADS)

    James, S. F.

    2017-11-01

    Offshore wind speed maps at 500m resolution are derived from freely available satellite Synthetic Aperture Radar (SAR) data. The method for processing many SAR images to derive wind speed maps is described in full. The results are tested against coincident offshore mast data. Example wind speed maps for the UK Thames Estuary offshore wind farm cluster are presented.
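    The abstract describes processing many SAR scenes into 500 m wind speed maps. The sketch below shows only the final aggregation step, averaging co-registered per-scene wind grids into a mean map with a per-cell coverage count; the geophysical model function inversion that turns radar backscatter into wind speed is omitted, and the function and variable names are illustrative.

```python
import numpy as np

def mean_wind_map(scene_maps):
    """Average co-registered per-scene wind-speed grids into a mean map.

    scene_maps : list of 2-D arrays, one retrieved wind field per SAR scene,
                 already resampled to a common 500 m grid; NaN marks no data.
    """
    stack = np.stack(scene_maps)                 # (n_scenes, rows, cols)
    mean = np.nanmean(stack, axis=0)             # cells never observed stay NaN
    coverage = np.sum(~np.isnan(stack), axis=0)  # number of scenes per cell
    return mean, coverage
```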

  11. Developing Land Use Land Cover Maps for the Lower Mekong Basin to Aid SWAT Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Spruce, J.; Bolten, J. D.; Srinivasan, R.

    2017-12-01

    This presentation discusses research to develop Land Use Land Cover (LULC) maps for the Lower Mekong Basin (LMB). Funded by a NASA ROSES Disasters grant, the main objective was to produce updated LULC maps to aid the Mekong River Commission's (MRC's) Soil and Water Assessment Tool (SWAT) hydrologic model. In producing the needed LULC maps, temporally processed MODIS monthly NDVI data for 2010 were used as the primary data source for classifying regionally prominent forest and agricultural types. The MODIS NDVI data were derived from processing MOD09 and MYD09 8-day reflectance data with the Time Series Product Tool, a custom software package. Circa 2010 Landsat multispectral data from the dry season were processed into top-of-atmosphere reflectance mosaics and then classified to derive certain locally common LULC types, such as urban areas and industrial forest plantations. Unsupervised ISODATA clustering was used to derive most LULC classifications. GIS techniques were used to merge MODIS and Landsat classifications into final LULC maps for Sub-Basins (SBs) 1-8 of the LMB. The final LULC maps were produced at 250-meter resolution and delivered to the MRC for use in SWAT modeling for the LMB. A map accuracy assessment was performed for the SB 7 LULC map with 14 classes. This assessment was performed by comparing random locations for sampled LULC types to geospatial reference data such as Landsat RGBs, MODIS NDVI phenologic profiles, high resolution satellite data from Google Map/Earth, and other reference data from the MRC (e.g., crop calendars). LULC accuracy assessment results for SB 7 indicated an overall agreement with reference data of 81% at full scheme specificity. However, by grouping 3 deciduous forest classes into 1 class, the overall agreement improved to 87%. The project produced updated LULC maps and classified rice types in more detail than the previous LULC maps. The LULC maps from this project should improve the use of SWAT for modeling hydrology in the LMB, and improve water and disaster management in a region vulnerable to flooding, droughts, and anthropogenic change (e.g., from dam building and other LULC change).
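    As a rough sketch of the unsupervised classification step, the code below clusters per-pixel monthly NDVI profiles; scikit-learn's KMeans stands in for the ISODATA clustering mentioned in the abstract (ISODATA additionally splits and merges clusters), and the array names are illustrative. An analyst would still assign LULC class labels to the resulting clusters.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_ndvi(ndvi_stack, n_clusters=20, seed=0):
    """Cluster monthly NDVI profiles into spectral-temporal classes.

    ndvi_stack : array of shape (n_months, rows, cols), e.g. 12 monthly composites.
    Returns a (rows, cols) label raster; cells with missing data get label -1.
    """
    months, rows, cols = ndvi_stack.shape
    pixels = ndvi_stack.reshape(months, -1).T      # one NDVI profile per pixel
    valid = ~np.isnan(pixels).any(axis=1)

    labels = np.full(rows * cols, -1, dtype=int)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    labels[valid] = km.fit_predict(pixels[valid])
    return labels.reshape(rows, cols)
```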

  12. GIS-mapping of environmental assessment of the territories in the region of intense activity for the oil and gas complex for achievement the goals of the Sustainable Development (on the example of Russia)

    NASA Astrophysics Data System (ADS)

    Yermolaev, Oleg

    2014-05-01

    A uniform system of comprehensive scientific-reference ecological-geographical atlases should serve as the basis for implementing the Sustainable Development (SD) concept in the subjects of the Russian Federation or in individual regions. The assessment of the ecological situation in the regions can then be addressed by combining two interrelated systems: mapping and geoinformation. The report discusses methodological aspects of atlas mapping for SD purposes in the regions of Russia. The Republic of Tatarstan is viewed as a model territory in which the large-scale oil and gas company "Tatneft" PLC operates. The company has functioned for more than 60 years: its oil fields occupy an area of more than 38,000 km2, about 40,000 oil wells and more than 55,000 km of pipelines are located in its territory, and more than 3 billion tons of oil have been extracted. Methods for the structure and requirements of the Atlas's content are outlined, and approaches to mapping the "ecological dominant" of SD are conceptually substantiated for a large region of Russia. Several thematic mapping directions are distinguished in the Atlas's structure: the history of oil-field development; nature-preservation technologies in oil extraction; the assessment of natural conditions for human activity; unfavorable and dangerous natural processes and phenomena; anthropogenic effects and environmental change; social-economic processes and phenomena; and medical-ecological and geochemical processes and phenomena. Within these groups numerous further subgroups can be distinguished. Maps of unfavorable and dangerous processes and phenomena are subdivided by process type, endogenous or exogenous. Among the maps of anthropogenic effects on the environment, maps of the influence on different spheres of nature (atmosphere, hydrosphere, lithosphere, biosphere, etc.) can be differentiated. All thematic groups are thus brought together into four main sections: an introduction (maps of the general condition and social-economic state, and the region's rating within the Republic); the components of natural and social-economic systems that shape the ecological situation; integrated maps of environmental pressure and change; and the strategy for reaching ecological equilibrium. The scope of the work is confirmed by more than 200 electronic analytical, complex, and synthetic maps covering more than 1,000 small river basins, 6,000 landscape areas, 500 anthropogenic pollution sources, etc. The richness and diversity of the maps' content and the objective indices used in them open wide opportunities to apply different methods of cartographic analysis, from conventional visual interpretation to geographical constructions, cartometry, and statistical data treatment. Methods of mathematical mapping and computer modeling make it possible to compute spatial correlations and the mutual conformity of phenomena, to estimate the homogeneity of ecological conditions, and to reveal the leading factors in the distribution and development of phenomena and processes by means of multidimensional statistical analysis.

  13. Facilitating participatory multilevel decision-making by using interactive mental maps.

    PubMed

    Pfeiffer, Constanze; Glaser, Stephanie; Vencatesan, Jayshree; Schliermann-Kraus, Elke; Drescher, Axel; Glaser, Rüdiger

    2008-11-01

    Participation of citizens in political, economic or social decisions is increasingly recognized as a precondition to foster sustainable development processes. Since spatial information is often important during planning and decision making, participatory mapping gains in popularity. However, little attention has been paid to the fact that information must be presented in a useful way to reach city planners and policy makers. Above all, the importance of visualisation tools to support collaboration, analytical reasoning, problem solving and decision-making in analysing and planning processes has been underestimated. In this paper, we describe how an interactive mental map tool has been developed in a highly interdisciplinary disaster management project in Chennai, India. We moved from a hand drawn mental maps approach to an interactive mental map tool. This was achieved by merging socio-economic and geospatial data on infrastructure, local perceptions, coping and adaptation strategies with remote sensing data and modern technology of map making. This newly developed interactive mapping tool allowed for insights into different locally-constructed realities and facilitated the communication of results to the wider public and respective policy makers. It proved to be useful in visualising information and promoting participatory decision-making processes. We argue that the tool bears potential also for health research projects. The interactive mental map can be used to spatially and temporally assess key health themes such as availability of, and accessibility to, existing health care services, breeding sites of disease vectors, collection and storage of water, waste disposal, location of public toilets or defecation sites.

  14. Application of remote sensing technology to land evaluation, planning utilization of land resources, and assessment of wetland habitat in eastern South Dakota, parts 1 and 2

    NASA Technical Reports Server (NTRS)

    Myers, V. I. (Principal Investigator); Cox, T. L.; Best, R. G.

    1976-01-01

    The author has identified the following significant results. LANDSAT fulfilled the requirements for general soils and land use information. RB-57 imagery was required to provide the information and detail needed for mapping soils for land evaluation. Soils maps for land evaluation were provided on clear mylar at the scale of the county highway map to aid users in locating mapping units. The resulting mapped data were computer processed to provide a series of interpretive maps (land value, limitations to development, etc.) and area summaries for the users.

  15. Developing Process Maps as a Tool for a Surgical Infection Prevention Quality Improvement Initiative in Resource-Constrained Settings.

    PubMed

    Forrester, Jared A; Koritsanszky, Luca A; Amenu, Demisew; Haynes, Alex B; Berry, William R; Alemu, Seifu; Jiru, Fekadu; Weiser, Thomas G

    2018-06-01

    Surgical infections cause substantial morbidity and mortality in low- and middle-income countries (LMICs). To improve adherence to critical perioperative infection prevention standards, we developed Clean Cut, a checklist-based quality improvement program to improve compliance with best practices. We hypothesized that process mapping infection prevention activities can help clinicians identify strategies for improving surgical safety. We introduced Clean Cut at a tertiary hospital in Ethiopia. Infection prevention standards included skin antisepsis, ensuring a sterile field, instrument decontamination/sterilization, prophylactic antibiotic administration, routine swab/gauze counting, and use of a surgical safety checklist. Processes were mapped by a visiting surgical fellow and local operating theater staff to facilitate the development of contextually relevant solutions; processes were reassessed for improvements. Process mapping helped identify barriers to using alcohol-based hand solution due to skin irritation, inconsistent administration of prophylactic antibiotics due to variable delivery outside of the operating theater, inefficiencies in assuring sterility of surgical instruments through lack of confirmatory measures, and occurrences of retained surgical items through inappropriate guidelines, staffing, and training in proper routine gauze counting. Compliance with most processes improved significantly following organizational changes to align tasks with specific process goals. Enumerating the steps involved in surgical infection prevention using a process mapping technique helped identify opportunities for improving adherence and plotting contextually relevant solutions, resulting in superior compliance with antiseptic standards. Simplifying these process maps into an adaptable tool could be a powerful strategy for improving safe surgery delivery in LMICs. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Large-Scale SNP Discovery and Genotyping for Constructing a High-Density Genetic Map of Tea Plant Using Specific-Locus Amplified Fragment Sequencing (SLAF-seq)

    PubMed Central

    Ma, Chun-Lei; Jin, Ji-Qiang; Li, Chun-Fang; Wang, Rong-Kai; Zheng, Hong-Kun; Yao, Ming-Zhe; Chen, Liang

    2015-01-01

    Genetic maps are important tools in plant genomics and breeding. The present study reports the large-scale discovery of single nucleotide polymorphisms (SNPs) for genetic map construction in tea plant. We developed a total of 6,042 valid SNP markers using specific-locus amplified fragment sequencing (SLAF-seq), and subsequently mapped them into the previous framework map. The final map contained 6,448 molecular markers, distributing on fifteen linkage groups corresponding to the number of tea plant chromosomes. The total map length was 3,965 cM, with an average inter-locus distance of 1.0 cM. This map is the first SNP-based reference map of tea plant, as well as the most saturated one developed to date. The SNP markers and map resources generated in this study provide a wealth of genetic information that can serve as a foundation for downstream genetic analyses, such as the fine mapping of quantitative trait loci (QTL), map-based cloning, marker-assisted selection, and anchoring of scaffolds to facilitate the process of whole genome sequencing projects for tea plant. PMID:26035838

  17. What Is Web Mapping Anyway?

    NASA Astrophysics Data System (ADS)

    Veenendaal, B.; Brovelli, M. A.; Li, S.; Ivánová, I.

    2017-09-01

    Although maps have been around for a very long time, web maps are yet very young in their origin. Despite their relatively short history, web maps have been developing very rapidly over the past few decades. The use, users and usability of web maps have rapidly expanded along with developments in web technologies and new ways of mapping. In the process of these developments, the terms and terminology surrounding web mapping have also changed and evolved, often relating to the new technologies or new uses. Examples include web mapping, web GIS, cloud mapping, internet mapping, internet GIS, geoweb, map mashup, online mapping etc., not to mention those with prefixes such as "web-based" and "internet-based". So, how do we keep track of these terms, relate them to each other and have common understandings of their meanings so that references to them are not ambiguous, misunderstood or even different? This paper explores the terms surrounding web mapping and web GIS, and the development of their meaning over time. The paper then suggests the current context in which these terms are used and provides meanings that may assist in better understanding and communicating using these terms in the future.

  18. Updating flood maps efficiently using existing hydraulic models, very-high-accuracy elevation data, and a geographic information system; a pilot study on the Nisqually River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.

    2001-01-01

    A method of updating flood inundation maps at a fraction of the expense of using traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost 3 billion (1997) dollars. The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps. GISs produce digital maps without manual estimation of inundated areas between cross sections, and can generate working maps across a broad range of scales, for any selected area, and overlaid with easily updated cultural features. Local governments are aggressively collecting very-high-accuracy elevation data for numerous reasons; this not only lowers the cost and increases accuracy of flood maps, but also inherently boosts the level of community involvement in the mapping process. These elevation data are also ideal for hydraulic modeling, should an existing model be judged inadequate.
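    The core of the updating method is the stage-discharge relation built from archived hydraulic model output. The sketch below shows the interpolation step for a single cross section; the discharge and stage values are illustrative placeholders, and in the actual workflow the interpolated water-surface elevations would then be turned into an inundation surface and intersected with the elevation data in a GIS.

```python
import numpy as np

def stage_for_discharge(archived_q, archived_stage, new_q):
    """Interpolate a water-surface elevation for a recalculated flood discharge
    from a cross section's archived model output (discharge vs. stage pairs)."""
    order = np.argsort(archived_q)                 # np.interp needs increasing x
    return np.interp(new_q, archived_q[order], archived_stage[order])

# Example: archived runs at one cross section and a recalculated 100-year discharge
q = np.array([5000.0, 12000.0, 20000.0, 31000.0])   # discharge, cfs (illustrative)
stage = np.array([61.2, 64.8, 67.5, 70.1])           # stage, ft (illustrative)
print(stage_for_discharge(q, stage, 26500.0))         # updated flood stage
```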

  19. Northern Everglades, Florida, satellite image map

    USGS Publications Warehouse

    Thomas, Jean-Claude; Jones, John W.

    2002-01-01

    These satellite image maps are one product of the USGS Land Characteristics from Remote Sensing project, funded through the USGS Place-Based Studies Program with support from the Everglades National Park. The objective of this project is to develop and apply innovative remote sensing and geographic information system techniques to map the distribution of vegetation, vegetation characteristics, and related hydrologic variables through space and over time. The mapping and description of vegetation characteristics and their variations are necessary to accurately simulate surface hydrology and other surface processes in South Florida and to monitor land surface changes. As part of this research, data from many airborne and satellite imaging systems have been georeferenced and processed to facilitate data fusion and analysis. These image maps were created using image fusion techniques developed as part of this project.

  20. USGS ShakeMap Developments, Implementation, and Derivative Tools

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Lin, K.; Quitoriano, V.; Worden, B.

    2007-12-01

    We discuss ongoing development and enhancements of ShakeMap, a system for automatically generating maps of ground shaking and intensity in the minutes following an earthquake. The rapid availability of these maps is of particular value to emergency response organizations, utilities, insurance companies, government decision- makers, the media, and the general public. ShakeMap Version 3.2 was released in March, 2007, on a download site which allows ShakeMap developers to track operators' updates and provide follow-up information; V3.2 has now been downloaded in 15 countries. The V3.2 release supports LINUX in addition to other UNIX operating systems and adds enhancements to XML, KML, metadata, and other products. We have also added an uncertainty measure, quantified as a function of spatial location. Uncertainty is essential for evaluating the range of possible losses. Though not released in V3.2, we will describe a new quantitative uncertainty letter grading for each ShakeMap produced, allowing users to gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of their post-earthquake critical decision-making process. Since the V3.2 release, several new ground motion predictions equations have also been added to the prediction equation modules. ShakeMap is implemented in several new regions as reported in this Session. Within the U.S., robust systems serve California, Nevada, Utah, Washington and Oregon, Hawaii, and Anchorage. Additional systems are in development and efforts to provide backup capabilities for all Advanced National Seismic System (ANSS) regions at the National Earthquake Information Center are underway. Outside the U.S., this Session has descriptions of ShakeMap systems in Italy, Switzerland, Romania, and Turkey, among other countries. We also describe our predictive global ShakeMap system for the rapid evaluation of significant earthquakes globally for the Prompt Assessment of Global Earthquakes for Response (PAGER) system. These global ShakeMaps are constrained by rapidly gathered intensity data via the Internet and by finite fault and aftershock analyses for portraying fault rupture dimensions. As part of the PAGER loss calibration process we have produced an Atlas of ShakeMaps for significant earthquakes around the globe since 1973 (Allen and others, this Session); these Atlas events have additional constraints provided by archival strong motion, faulting dimensions, and macroseismic intensity data. We also describe derivative tools for further utilizing ShakeMap including ShakeCast, a fully automated system for delivering specific ShakeMap products to critical users and triggering established post-earthquake response protocols. We have released ShakeCast Version 2.0 (Lin and others, this Session), which allows RSS feeds for automatically receiving ShakeMap files, auto-launching of post-download processing scripts, and delivering notifications based on users' likely facility damage states derived from ShakeMap shaking parameters. As part of our efforts to produce estimated ShakeMaps globally, we have developed a procedure for deriving Vs30 estimates from correlations with topographic slope, and we have now implemented a global Vs30 Server, allowing users to generate Vs30 maps for custom user-selected regions around the globe (Allen and Wald, this Session). 
Finally, as a further derivative product of the ShakeMap Atlas project, we will present a shaking hazard map for the past 30 years based on approximately 3,900 ShakeMaps of historical earthquakes.
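    One concrete piece of the workflow described above is the derivation of Vs30 proxies from topographic slope for the global Vs30 server. The sketch below shows the general shape of such a slope-to-Vs30 lookup; the bin edges and Vs30 values are placeholders for illustration, not the published correlation coefficients.

```python
import numpy as np

# Illustrative slope-to-Vs30 lookup table; these numbers are placeholders,
# NOT the published Wald & Allen slope-Vs30 correlation.
SLOPE_BINS = np.array([0.0002, 0.002, 0.01, 0.05, 0.14])      # slope (m/m)
VS30_VALUES = np.array([180.0, 240.0, 300.0, 420.0, 560.0, 760.0])  # Vs30 (m/s)

def vs30_from_slope(slope):
    """Assign a proxy Vs30 to each grid cell from topographic slope."""
    idx = np.digitize(slope, SLOPE_BINS)   # flatter terrain -> softer site class
    return VS30_VALUES[idx]

print(vs30_from_slope(np.array([0.0001, 0.02, 0.2])))
```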

  1. Sparsity-constrained PET image reconstruction with learned dictionaries

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Yang, Bao; Wang, Yanhua; Ying, Leslie

    2016-09-01

    PET imaging plays an important role in scientific and clinical measurement of biochemical and physiological processes. Model-based PET image reconstruction, such as the iterative expectation maximization algorithm seeking the maximum-likelihood solution, leads to increased noise. The maximum a posteriori (MAP) estimate removes divergence at higher iterations. However, a conventional smoothing prior or a total-variation (TV) prior in a MAP reconstruction algorithm causes over-smoothing or blocky artifacts in the reconstructed images. We propose to use dictionary learning (DL) based sparse signal representation in the formation of the prior for MAP PET image reconstruction. The dictionary to sparsify the PET images in the reconstruction process is learned from various training images, including the corresponding MR structural image and a self-created hollow sphere. Using simulated and patient brain PET data with corresponding MR images, we study the performance of the DL-MAP algorithm and compare it quantitatively with a conventional MAP algorithm, a TV-MAP algorithm, and a patch-based algorithm. The DL-MAP algorithm achieves improved bias and contrast (or regional mean values) at noise comparable to that of the other MAP algorithms. The dictionary learned from the hollow sphere leads to similar results as the dictionary learned from the corresponding MR image. Achieving robust performance in various noise-level simulations and patient studies, the DL-MAP algorithm with a general dictionary demonstrates its potential in quantitative PET imaging.
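    The sketch below is a deliberately simplified schematic of how a sparsity step can be interleaved with MLEM updates; it is not the authors' DL-MAP algorithm. The real method learns an overcomplete patch dictionary from training images and optimizes a MAP objective, whereas this toy treats the dictionary as a single global orthonormal basis and simply blends each MLEM iterate toward its sparse approximation. All names and parameters are illustrative.

```python
import numpy as np

def mlem_step(x, A, y, eps=1e-8):
    """One MLEM update: x <- x * A^T(y / Ax) / (A^T 1)."""
    proj = A @ x
    ratio = y / np.maximum(proj, eps)
    return x * (A.T @ ratio) / np.maximum(A.T @ np.ones_like(y), eps)

def sparse_denoise(x, D, k):
    """Crude sparsity step: keep only the k largest coefficients of x in the
    (assumed orthonormal) dictionary D and resynthesize."""
    coeffs = D.T @ x
    keep = np.argsort(np.abs(coeffs))[-k:]
    sparse = np.zeros_like(coeffs)
    sparse[keep] = coeffs[keep]
    return D @ sparse

def toy_dl_map_reconstruct(A, y, D, n_iter=20, k=50, beta=0.5):
    """Alternate data-fidelity (MLEM) updates with pulls toward the sparse model."""
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        x = mlem_step(x, A, y)
        x = (1 - beta) * x + beta * sparse_denoise(x, D, k)
    return x
```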

  2. AEKF-SLAM: A New Algorithm for Robotic Underwater Navigation

    PubMed Central

    Yuan, Xin; Martínez-Ortega, José-Fernán; Fernández, José Antonio Sánchez; Eckert, Martina

    2017-01-01

    In this work, we focus on key topics related to underwater Simultaneous Localization and Mapping (SLAM) applications. Moreover, a detailed review of major studies in the literature and our proposed solutions for addressing the problem are presented. The main goal of this paper is the enhancement of the accuracy and robustness of SLAM-based navigation for underwater robotics at low computational cost. Therefore, we present a new method called AEKF-SLAM that employs an Augmented Extended Kalman Filter (AEKF)-based SLAM algorithm. The AEKF-based SLAM approach stores the robot poses and map landmarks in a single state vector, while estimating the state parameters via a recursive and iterative estimation-update process. Hereby, the prediction and update steps (which also exist in the conventional EKF) are complemented by a newly proposed augmentation stage. Applied to underwater robot navigation, the AEKF-SLAM has been compared with the classic and popular FastSLAM 2.0 algorithm. Concerning the dense loop mapping and line mapping experiments, it shows much better performance in map management with respect to landmark addition and removal, which avoids the long-term accumulation of errors and clutter in the created map. Additionally, the underwater robot achieves more precise and efficient self-localization and mapping of the surrounding landmarks with much lower processing times. Altogether, the presented AEKF-SLAM method achieves reliable map revisiting and consistent map updating on loop closure. PMID:28531135
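    To illustrate the kind of augmentation stage described above, the sketch below shows a generic EKF-SLAM landmark-augmentation step: the newly observed landmark is appended to the state vector and the covariance matrix is grown accordingly. It is not the authors' AEKF-SLAM code; it assumes the robot pose occupies the first three state entries and that landmarks are two-dimensional, and all symbols are illustrative.

```python
import numpy as np

def augment_state(x, P, landmark, G_x, G_z, R):
    """Append a newly observed landmark to the SLAM state.

    x, P     : current state vector (pose first, then landmarks) and covariance
    landmark : (2,) landmark position computed from the robot pose and measurement
    G_x      : (2, 3) Jacobian of the landmark-initialization function w.r.t. the pose
    G_z      : (2, 2) Jacobian w.r.t. the measurement
    R        : (2, 2) measurement noise covariance
    """
    x_aug = np.concatenate([x, landmark])

    P_xx = P[:3, :3]                              # robot-pose covariance block
    P_lm = G_x @ P_xx @ G_x.T + G_z @ R @ G_z.T   # new landmark covariance
    P_cross = G_x @ P[:3, :]                      # correlation with the full state

    P_aug = np.block([[P,        P_cross.T],
                      [P_cross,  P_lm     ]])
    return x_aug, P_aug
```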

  3. Ecology and space: A case study in mapping harmful invasive species

    USGS Publications Warehouse

    David T. Barnett,; Jarnevich, Catherine S.; Chong, Geneva W.; Stohlgren, Thomas J.; Sunil Kumar,; Holcombe, Tracy R.; Brunn, Stanley D.; Dodge, Martin

    2017-01-01

    The establishment and invasion of non-native plant species have the ability to alter the composition of native species and functioning of ecological systems with financial costs resulting from mitigation and loss of ecological services. Spatially documenting invasions has applications for management and theory, but the utility of maps is challenged by availability and uncertainty of data, and the reliability of extrapolating mapped data in time and space. The extent and resolution of projections also impact the ability to inform invasive species science and management. Early invasive species maps were coarse-grained representations that underscored the phenomena, but had limited capacity to direct management aside from development of watch lists for priorities for prevention and containment. Integrating mapped data sets with fine-resolution environmental variables in the context of species-distribution models allows a description of species-environment relationships and an understanding of how, why, and where invasions may occur. As with maps, the extent and resolution of models impact the resulting insight. Models of cheatgrass (Bromus tectorum) across a variety of spatial scales and grain result in divergent species-environment relationships. New data can improve models and efficiently direct further inventories. Mapping can target areas of greater model uncertainty or the bounds of modeled distribution to efficiently refine models and maps. This iterative process results in dynamic, living maps capable of describing the ongoing process of species invasions.

  4. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    PubMed Central

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions are multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
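
    To make the Map and Reduce roles concrete, the following minimal Python sketch mimics the two phases (plus the intermediate shuffle) on a toy list of clinical records, counting diagnosis codes; it runs in a single process and does not use Hadoop or HDFS.

        from collections import defaultdict

        # Toy "records": (patient_id, diagnosis_code)
        records = [("p1", "E11"), ("p2", "I10"), ("p3", "E11"), ("p4", "J45"), ("p5", "I10")]

        def map_phase(record):
            """Map: emit (key, value) pairs -- here, one count per diagnosis code."""
            patient_id, code = record
            yield (code, 1)

        def shuffle(pairs):
            """Group intermediate pairs by key, as the framework would between phases."""
            grouped = defaultdict(list)
            for key, value in pairs:
                grouped[key].append(value)
            return grouped

        def reduce_phase(key, values):
            """Reduce: aggregate all values for one key."""
            return key, sum(values)

        intermediate = [pair for record in records for pair in map_phase(record)]
        results = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
        print(results)   # {'E11': 2, 'I10': 2, 'J45': 1}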

  5. New Topographic Maps of Io Using Voyager and Galileo Stereo Imaging and Photoclinometry

    NASA Astrophysics Data System (ADS)

    White, O. L.; Schenk, P. M.; Hoogenboom, T.

    2012-03-01

    Stereo and photoclinometry processing have been applied to Voyager and Galileo images of Io in order to derive regional- and local-scale topographic maps of 20% of the moon’s surface to date. We present initial mapping results.

  6. Libraries, the MAP, and Student Achievement.

    ERIC Educational Resources Information Center

    Jones, Cherri; Singer, Marietta; Miller, David W.; Makemson, Carroll; Elliott, Kara; Litsch, Diana; Irwin, Barbara; Hoemann, Cheryl; Elmore, Jennifer; Roe, Patty; Gregg, Diane; Needham, Joyce; Stanley, Jerri; Reinert, John; Holtz, Judy; Jenkins, Sandra; Giles, Paula

    2002-01-01

    Includes 17 articles that discuss the Missouri Assessment Program (MAP) and the role of school library media centers. Highlights include improving student achievement; improving student scores on the MAP; graphic organizers; programs for volunteer student library workers; research process; research skills; reading initiatives; collaborative…

  7. The SO2 Cycle on Io as Seen by the Near Infrared Mapping Spectrometer

    NASA Technical Reports Server (NTRS)

    Douté, S.; Lopes-Gautier, R.; Carlson, R. W.; Schmitt, B.; Soderblom, L. A.

    2000-01-01

    Based on the analysis of Near Infrared Mapping Spectrometer (NIMS) hyperspectral images of Io that leads to sulfur dioxide distribution maps, we intend to give some insights about different processes occurring throughout the SO2 cycle.

  8. Concept Mapping

    ERIC Educational Resources Information Center

    Technology & Learning, 2005

    2005-01-01

    Concept maps are graphical ways of working with ideas and presenting information. They reveal patterns and relationships and help students to clarify their thinking, and to process, organize and prioritize. Displaying information visually--in concept maps, word webs, or diagrams--stimulates creativity. Being able to think logically teaches…

  9. Integrating volcanic hazard data in a systematic approach to develop volcanic hazard maps in the Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Lindsay, Jan M.; Robertson, Richard E. A.

    2018-04-01

    We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s) of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past 10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that the documentation of our experience might be useful for other map makers to take into account when creating new or updating existing maps.

  10. Applied photo interpretation for airbrush cartography

    NASA Technical Reports Server (NTRS)

    Inge, J. L.; Bridges, P. M.

    1976-01-01

    New techniques of cartographic portrayal have been developed for the compilation of maps of lunar and planetary surfaces. Conventional photo interpretation methods utilizing size, shape, shadow, tone, pattern, and texture are applied to computer-processed satellite television images. The variety of the image data allows the illustrator to interpret image details by inter-comparison and intra-comparison of photographs. Comparative judgements are affected by illumination, resolution, variations in surface coloration, and transmission or processing artifacts. The validity of the interpretation process is tested by making a representational drawing by an airbrush portrayal technique. Production controls ensure the consistency of a map series. Photo interpretive cartographic portrayal skills are used to prepare two kinds of map series and are adaptable to map products of different kinds and purposes.

  11. The MAPS Reporting Statement for Studies Mapping onto Generic Preference-Based Outcome Measures: Explanation and Elaboration.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    The process of "mapping" is increasingly being used to predict health utilities, for application within health economic evaluations, using data on other indicators or measures of health. Guidance for the reporting of mapping studies is currently lacking. The overall objective of this research was to develop a checklist of essential items, which authors should consider when reporting mapping studies. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a checklist, which aims to promote complete and transparent reporting by researchers. This paper provides a detailed explanation and elaboration of the items contained within the MAPS statement. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items and accompanying explanations was created. A two-round, modified Delphi survey, with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorised within six sections, namely, (i) title and abstract, (ii) introduction, (iii) methods, (iv) results, (v) discussion and (vi) other. For each item, we summarise the recommendation, illustrate it using an exemplar of good reporting practice identified from the published literature, and provide a detailed explanation to accompany the recommendation. It is anticipated that the MAPS statement will promote clarity, transparency and completeness of reporting of mapping studies. It is targeted at researchers developing mapping algorithms, peer reviewers and editors involved in the manuscript review process for mapping studies, and the funders of the research. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  12. A fruit quality gene map of Prunus

    PubMed Central

    2009-01-01

    Background Prunus fruit development, growth, ripening, and senescence includes major biochemical and sensory changes in texture, color, and flavor. The genetic dissection of these complex processes has important applications in crop improvement, to facilitate maximizing and maintaining stone fruit quality from production and processing through to marketing and consumption. Here we present an integrated fruit quality gene map of Prunus containing 133 genes putatively involved in the determination of fruit texture, pigmentation, flavor, and chilling injury resistance. Results A genetic linkage map of 211 markers was constructed for an intraspecific peach (Prunus persica) progeny population, Pop-DG, derived from a canning peach cultivar 'Dr. Davis' and a fresh market cultivar 'Georgia Belle'. The Pop-DG map covered 818 cM of the peach genome and included three morphological markers, 11 ripening candidate genes, 13 cold-responsive genes, 21 novel EST-SSRs from the ChillPeach database, 58 previously reported SSRs, 40 RAFs, 23 SRAPs, 14 IMAs, and 28 accessory markers from candidate gene amplification. The Pop-DG map was co-linear with the Prunus reference T × E map, with 39 SSR markers in common to align the maps. A further 158 markers were bin-mapped to the reference map: 59 ripening candidate genes, 50 cold-responsive genes, and 50 novel EST-SSRs from ChillPeach, with deduced locations in Pop-DG via comparative mapping. Several candidate genes and EST-SSRs co-located with previously reported major trait loci and quantitative trait loci for chilling injury symptoms in Pop-DG. Conclusion The candidate gene approach combined with bin-mapping and availability of a community-recognized reference genetic map provides an efficient means of locating genes of interest in a target genome. We highlight the co-localization of fruit quality candidate genes with previously reported fruit quality QTLs. The fruit quality gene map developed here is a valuable tool for dissecting the genetic architecture of fruit quality traits in Prunus crops. PMID:19995417

  13. Historical shoreline mapping (II): Application of the Digital Shoreline Mapping and Analysis Systems (DSMS/DSAS) to shoreline change mapping in Puerto Rico

    USGS Publications Warehouse

    Thieler, E. Robert; Danforth, William W.

    1994-01-01

    A new, state-of-the-art method for mapping historical shorelines from maps and aerial photographs, the Digital Shoreline Mapping System (DSMS), has been developed. The DSMS is a freely available, public domain software package that meets the cartographic and photogrammetric requirements of precise coastal mapping, and provides a means to quantify and analyze different sources of error in the mapping process. The DSMS is also capable of resolving imperfections in aerial photography that commonly are assumed to be nonexistent. The DSMS utilizes commonly available computer hardware and software, and permits the entire shoreline mapping process to be executed rapidly by a single person in a small lab. The DSMS generates output shoreline position data that are compatible with a variety of Geographic Information Systems (GIS). A second suite of programs, the Digital Shoreline Analysis System (DSAS), has been developed to calculate shoreline rates-of-change from a series of shoreline data residing in a GIS. Four rate-of-change statistics are calculated simultaneously (end-point rate, average of rates, linear regression and jackknife) at a user-specified interval along the shoreline using a measurement baseline approach. An example of DSMS and DSAS application using historical maps and air photos of Punta Uvero, Puerto Rico provides a basis for assessing the errors associated with the source materials as well as the accuracy of computed shoreline positions and erosion rates. The maps and photos used here represent a common situation in shoreline mapping: marginal-quality source materials. The maps and photos are near the usable upper limit of scale and accuracy, yet the shoreline positions are still accurate to within ±9.25 m when all sources of error are considered. This level of accuracy yields a resolution of ±0.51 m/yr for shoreline rates-of-change in this example, and is sufficient to identify the short-term trend (36 years) of shoreline change in the study area.
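
    Two of the rate-of-change statistics named above (end-point rate and linear-regression rate) can be illustrated with a short, hedged sketch: given dated shoreline positions measured along one transect, they are computed as below. The numbers are synthetic, not data from the Punta Uvero study.

        import numpy as np

        # Distances (m) of the shoreline from a fixed baseline along one transect,
        # measured on historical maps/photos at the given years (synthetic values).
        years = np.array([1936.0, 1951.0, 1964.0, 1972.0])
        positions = np.array([120.0, 112.5, 104.0, 101.5])

        # End-point rate: change between the oldest and most recent shorelines only.
        end_point_rate = (positions[-1] - positions[0]) / (years[-1] - years[0])

        # Linear-regression rate: slope of a least-squares fit through all positions.
        slope, intercept = np.polyfit(years, positions, 1)

        print(f"end-point rate:         {end_point_rate:.2f} m/yr")
        print(f"linear-regression rate: {slope:.2f} m/yr")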

  14. Energy map of southwestern Wyoming - Energy data archived, organized, integrated, and accessible

    USGS Publications Warehouse

    Biewick, Laura; Jones, Nicholas R.; Wilson, Anna B.

    2013-01-01

    The Wyoming Landscape Conservation Initiative (WLCI) focuses on conserving world-class wildlife resources while facilitating responsible energy development in southwestern Wyoming. To further advance the objectives of the WLCI long-term, science-based effort, a comprehensive inventory of energy resource and production data is being published in two parts. Energy maps, data, documentation and spatial data processing capabilities are available in geodatabase, published map file (pmf), ArcMap document (mxd), Adobe Acrobat PDF map, and other digital formats that can be downloaded at the USGS website.

  15. Optimal mapping of neural-network learning on message-passing multicomputers

    NASA Technical Reports Server (NTRS)

    Chu, Lon-Chan; Wah, Benjamin W.

    1992-01-01

    A minimization of learning-algorithm completion time is sought in the present optimal-mapping study of the learning process in multilayer feed-forward artificial neural networks (ANNs) for message-passing multicomputers. A novel approximation algorithm for mappings of this kind is derived from observations of the dominance of a parallel ANN algorithm over its communication time. Attention is given to both static and dynamic mapping schemes for systems with static and dynamic background workloads, as well as to experimental results obtained for simulated mappings on multicomputers with dynamic background workloads.

  16. Self-mapping in treating suicide ideation: a case study.

    PubMed

    Robertson, Lloyd Hawkeye

    2011-03-01

    This case study traces the development and use of a self-mapping exercise in the treatment of a youth who had been at risk for re-attempting suicide. A life skills exercise was modified to identify units of culture called memes from which a map of the youth's self was prepared. A successful treatment plan followed the mapping exercise. The process of self-map construction is presented along with an interpretive analysis. It is suggested that therapists from a range of perspectives could use this technique in assessment and treatment.

  17. Soil mapping and processes modelling for sustainable land management: a review

    NASA Astrophysics Data System (ADS)

    Pereira, Paulo; Brevik, Eric; Muñoz-Rojas, Miriam; Miller, Bradley; Smetanova, Anna; Depellegrin, Daniel; Misiune, Ieva; Novara, Agata; Cerda, Artemi

    2017-04-01

    Soil maps and models are fundamental for correct and sustainable land management (Pereira et al., 2017). They are important in the assessment of the territory and the implementation of sustainable measures in urban areas, agriculture, forests, and ecosystem services, among others. Soil maps represent an important basis for the evaluation and restoration of degraded areas, an important issue for our society as a consequence of climate change and the increasing pressure of humans on ecosystems (Brevik et al. 2016; Depellegrin et al., 2016). Understanding soil spatial variability and the phenomena that influence this dynamic is crucial to the implementation of sustainable practices that prevent degradation and decrease the economic costs of soil restoration. In this context, soil maps and models are important to identify areas affected by degradation and to optimize the resources available to restore them. Overall, soil data, alone or integrated with data from other sciences, are an important part of sustainable land management. This information is extremely important for land managers and decision makers implementing sustainable land management policies. The objective of this work is to present a review of the advantages of soil mapping and process modelling for sustainable land management. References Brevik, E., Calzolari, C., Miller, B., Pereira, P., Kabala, C., Baumgarten, A., Jordán, A. (2016) Historical perspectives and future needs in soil mapping, classification and pedological modelling, Geoderma, 264, Part B, 256-274. Depellegrin, D.A., Pereira, P., Misiune, I., Egarter-Vigl, L. (2016) Mapping Ecosystem Services in Lithuania. International Journal of Sustainable Development and World Ecology, 23, 441-455. Pereira, P., Brevik, E., Munoz-Rojas, M., Miller, B., Smetanova, A., Depellegrin, D., Misiune, I., Novara, A., Cerda, A. (2017) Soil mapping and process modelling for sustainable land management. In: Pereira, P., Brevik, E., Munoz-Rojas, M., Miller, B. (Eds.) Soil mapping and process modelling for sustainable land use management (Elsevier Publishing House) ISBN: 9780128052006

  18. Creating soil moisture maps based on radar satellite imagery

    NASA Astrophysics Data System (ADS)

    Hnatushenko, Volodymyr; Garkusha, Igor; Vasyliev, Volodymyr

    2017-10-01

    The presented work concerns mapping soil moisture based on radar data from Sentinel-1 and testing the adequacy of the constructed models against data obtained from alternative sources. Radar signals are reflected from the ground differently, depending on its properties. In radar images obtained, for example, in the C band of the electromagnetic spectrum, soils saturated with moisture usually appear in dark tones. Although, at first glance, the problem of constructing moisture maps based on radar data seems intuitively clear, an implementation based on Sentinel-1 data at an industrial scale and in the public domain is not yet available. In the mapping process, soil moisture measurements from the logs of the NOAA US Climate Reference Network (USCRN) climate stations were used to verify the results. This network covers almost the entire territory of the United States. Data from the passive microwave radiometers of the Aqua and SMAP satellites were used for comparison. In addition, other supplementary cartographic materials were used, such as maps of soil types and existing moisture maps. The paper presents a comparison of the effect of certain methods of degrading radar data quality on the resulting moisture maps. Regression models were constructed showing the dependence of the backscatter coefficient Sigma0, for calibrated radar data of different spatial resolutions obtained at different times, on soil moisture values. The resulting soil moisture maps of the study areas, as well as conceptual solutions for automating the construction of such digital maps, are presented. A comparative assessment of the time required to process a given set of radar scenes with the developed tools and with the ESA SNAP product was carried out.
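
    As a rough illustration of the kind of regression model described (backscatter coefficient Sigma0 versus reference soil moisture), the following sketch fits a simple linear model to synthetic values; it is not the authors' calibrated model and ignores resolution, incidence angle, and land-cover effects.

        import numpy as np

        # Synthetic example: calibrated Sigma0 (dB) from SAR scenes and volumetric
        # soil moisture (m^3/m^3) from reference stations at matching locations/dates.
        sigma0_db = np.array([-14.2, -12.8, -11.5, -10.1, -9.0, -8.2])
        soil_moisture = np.array([0.08, 0.12, 0.17, 0.22, 0.27, 0.31])

        # Least-squares fit: soil_moisture ~ a * sigma0_db + b
        a, b = np.polyfit(sigma0_db, soil_moisture, 1)

        def predict_moisture(sigma0):
            """Apply the fitted regression to new backscatter values."""
            return a * np.asarray(sigma0) + b

        print(f"fit: moisture = {a:.4f} * sigma0 + {b:.4f}")
        print(predict_moisture([-13.0, -9.5]))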

  19. Planetary Geologic Mapping Handbook - 2009

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well (e.g., Hare and others, 2009). Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically (e.g., Wilhelms, 1972, 1990; Tanaka and others, 1994). As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  20. Cartographic research 1977

    USGS Publications Warehouse

    ,

    1978-01-01

    Two major subjects of the current research of the Topographic Division as reported here are related to policy decisions affecting the National Mapping Program of the Geological Survey. The adoption of a metric mapping policy has resulted in new cartographic products with associated changes in map design that require new looks in graphics and new equipment. The increasing use of digitized cartographic information has led to developments in data acquisition, processing, and storage and consequent changes in equipment and techniques. This report summarizes the activities in cartographic research and development for the 12-month period ending June 1977 and covers work done at the several facilities of the Topographic Division: the Western Mapping Center at Menlo Park, Calif., the Rocky Mountain Mapping Center at Denver, Colo., the Mid-Continent Mapping Center at Rolla, Mo., and the Eastern Mapping Center, the Special Mapping Center, the Office of Plans and Program Development, and the Office of Research and Technical Standards all at Reston, Va.

  1. Monitoring and evaluation of rowing performance using mobile mapping data

    NASA Astrophysics Data System (ADS)

    Mpimis, A.; Gikas, V.

    2011-12-01

    Traditionally, the term mobile mapping refers to a means of collecting geospatial data using mapping sensors that are mounted on a mobile platform. Historically, this process was mainly driven by the need for highway infrastructure mapping and transportation corridor inventories. However, recent advances in mapping sensor and telecommunication technologies create the opportunity for completely new, emergent application areas of mobile mapping to evolve rapidly. This article examines the potential of mobile mapping technology (MMT) in sports science and in particular in competitive rowing. Notably, in this study the concept of mobile mapping differs somewhat from the traditional one, in that the end result is not the geospatial information acquired as the moving platform travels through space. Instead, the interest is placed on the moving platform (the rowing boat) itself and on its various subsystems, which are also in continuous motion.

  2. In search of the motor engram: motor map plasticity as a mechanism for encoding motor experience.

    PubMed

    Monfils, Marie-H; Plautz, Erik J; Kleim, Jeffrey A

    2005-10-01

    Motor skill acquisition occurs through modification and organization of muscle synergies into effective movement sequences. The learning process is reflected neurophysiologically as a reorganization of movement representations within the primary motor cortex, suggesting that the motor map is a motor engram. However, the specific neural mechanisms underlying map plasticity are unknown. Here the authors review evidence that 1) motor map topography reflects the capacity for skilled movement, 2) motor skill learning induces reorganization of motor maps in a manner that reflects the kinematics of acquired skilled movement, 3) map plasticity is supported by a reorganization of cortical microcircuitry involving changes in synaptic efficacy, and 4) motor map integrity and topography are influenced by various neurochemical signals that coordinate changes in cortical circuitry to encode motor experience. Finally, the role of motor map plasticity in recovery of motor function after brain damage is discussed.

  3. A Map of Kilometer-Scale Topographic Roughness of Mercury

    NASA Astrophysics Data System (ADS)

    Kreslavsky, M. A.; Head, J. W., III; Kokhanov, A. A.; Neumann, G. A.; Smith, D. E.; Zuber, M. T.; Kozlova, N. A.

    2014-12-01

    We present a new map of the multiscale topographic roughness of the northern circumpolar area of Mercury. The map utilizes high internal vertical precision surface ranging by the laser altimeter MLA onboard the MESSENGER mission to Mercury. This map is analogous to global roughness maps that had been created by M.A.K. with collaborators for Mars (MOLA data) and the Moon (LOLA data). As measures of roughness, we used the interquartile range of along-track profile curvature at three baselines: 0.7 km, 2.8 km, and 11 km. Unlike the cases of LOLA data for the Moon and MOLA data for Mars, the MLA data allow high-quality roughness mapping only for a small part of the surface of the planet: the map covers the 65N-84N latitude zone, where the density of MLA data is the highest. The map captures the regional variations of the typical background topographic texture of the surface. The map shows the clear dichotomy between smooth northern plains and rougher cratered terrains. The lowered contrast of this dichotomy at the shortest (0.7 km) baseline indicates that regolith on Mercury is thicker and/or gardening processes are more intensive in comparison to the Moon, approximately by a factor of three. The map reveals sharp roughness contrasts within the northern plains of Mercury that we interpret as geologic boundaries of volcanic plains of different age. In particular, the map suggests a younger volcanic plains unit inside Goethe basin and inside another unnamed stealth basin. -- Acknowledgement: Work on data processing was carried out at MIIGAiK by MAK, AAK, NAK and supported by Russian Science Foundation project 14-22-00197.
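
    The roughness measure used here, the interquartile range of along-track profile curvature at a given baseline, can be sketched for a single synthetic altimeter profile as follows; the sampling step and profile are invented, and the real processing of MLA tracks involves additional editing and gridding.

        import numpy as np

        def curvature_iqr(elevation, spacing, baseline):
            """Interquartile range of profile curvature at one baseline length.

            elevation : along-track elevations (m), evenly sampled
            spacing   : along-track sample spacing (m)
            baseline  : baseline length L (m); curvature uses points at +/- L/2
            """
            half = max(1, int(round(0.5 * baseline / spacing)))
            z = np.asarray(elevation, dtype=float)
            # Second difference over the baseline, normalised by (L/2)^2 -> curvature proxy
            c = (z[2 * half:] - 2.0 * z[half:-half] + z[:-2 * half]) / (half * spacing) ** 2
            q25, q75 = np.percentile(c, [25, 75])
            return q75 - q25

        # Synthetic profile: smooth regional trend plus small-scale roughness
        x = np.arange(0, 200_000, 350.0)   # 350 m along-track spacing
        z = 50 * np.sin(x / 30_000) + np.random.default_rng(0).normal(0, 2.0, x.size)

        for L in (700, 2800, 11000):       # the three baselines used in the map
            print(L, curvature_iqr(z, 350.0, L))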

  4. Operational shoreline mapping with high spatial resolution radar and geographic processing

    USGS Publications Warehouse

    Rangoonwala, Amina; Jones, Cathleen E; Chi, Zhaohui; Ramsey, Elijah W.

    2017-01-01

    A comprehensive mapping technology was developed utilizing standard image processing and available GIS procedures to automate shoreline identification and mapping from 2 m synthetic aperture radar (SAR) HH amplitude data. The development used four NASA Uninhabited Aerial Vehicle SAR (UAVSAR) data collections between summer 2009 and 2012 and a fall 2012 collection of wetlands dominantly fronted by vegetated shorelines along the Mississippi River Delta that are beset by severe storms, toxic releases, and relative sea-level rise. In comparison to shorelines interpreted from 0.3 m and 1 m orthophotography, the automated GIS 10 m alongshore sampling found SAR shoreline mapping accuracy to be ±2 m, well within the lower range of reported shoreline mapping accuracies. The high comparability was obtained even though water levels differed between the SAR and photography image pairs and included all shorelines regardless of complexity. The SAR mapping technology is highly repeatable and extendable to other SAR instruments with similar operational functionality.

  5. Planning or something else? Examining neuropsychological predictors of Zoo Map performance.

    PubMed

    Oosterman, Joukje M; Wijers, Marijn; Kessels, Roy P C

    2013-01-01

    The Zoo Map Test of the Behavioral Assessment of the Dysexecutive Syndrome battery is often applied to measure planning ability as part of executive function. Successful performance on this test is, however, dependent on various cognitive functions, and deficient Zoo Map performance does therefore not necessarily imply selectively disrupted planning abilities. To address this important issue, we examined whether planning is still the most important predictor of Zoo Map performance in a heterogeneous sample of neurologic and psychiatric outpatients (N = 71). In addition to the Zoo Map Test, the patients completed other neuropsychological tests of planning, inhibition, processing speed, and episodic memory. Planning was the strongest predictor of the total raw score and inappropriate places visited, and no additional contribution of other cognitive scores was found. One exception to this was the total time, which was associated with processing speed. Overall, our findings indicate that the Zoo Map Test is a valid indicator of planning ability in a heterogeneous patient sample.

  6. Assessing Volunteered Geographic Information (vgi) Quality Based on CONTRIBUTORS' Mapping Behaviours

    NASA Astrophysics Data System (ADS)

    Bégin, D.; Devillers, R.; Roche, S.

    2013-05-01

    VGI changed the mapping landscape by allowing people who are not professional cartographers to contribute to large mapping projects, resulting at the same time in concerns about the quality of the data produced. While a number of early VGI studies used conventional methods to assess data quality, such approaches are not always well adapted to VGI. Since VGI is user-generated content, we posit that features and places mapped by contributors largely reflect contributors' personal interests. This paper proposes studying contributors' mapping processes to understand the characteristics and quality of the data produced. We argue that contributors' behaviour when mapping reflects contributors' motivation and individual preferences in selecting mapped features and delineating mapped areas. Such knowledge of contributors' behaviour could allow for the derivation of information about the quality of VGI datasets. This approach was tested using a sample area from OpenStreetMap, leading to a better understanding of data completeness for contributors' preferred features.

  7. How Albot0 finds its way home: a novel approach to cognitive mapping using robots.

    PubMed

    Yeap, Wai K

    2011-10-01

    Much of what we know about cognitive mapping comes from observing how biological agents behave in their physical environments, and several of these ideas were implemented on robots, imitating such a process. In this paper a novel approach to cognitive mapping is presented whereby robots are treated as a species of their own and their cognitive mapping is investigated. Such robots are referred to as Albots. The design of the first Albot, Albot0, is presented. Albot0 computes an imprecise map and employs a novel method to find its way home. Both the map and the return-home algorithm exhibited characteristics commonly found in biological agents. What we have learned from Albot0's cognitive mapping is discussed. One major lesson is that the spatiality in a cognitive map affords us rich and useful information and this argues against recent suggestions that the notion of a cognitive map is not a useful one. Copyright © 2011 Cognitive Science Society, Inc.

  8. User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package

    USGS Publications Warehouse

    Shapiro, Jason

    2018-05-29

    MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.

  9. Cadastral Map Assembling Using Generalized Hough Transformation

    NASA Astrophysics Data System (ADS)

    Liu, Fei; Ohyama, Wataru; Wakabayashi, Tetsushi; Kimura, Fumitaka

    Numerous cadastral maps were generated by past land surveying. The raster digitization of these paper maps is in progress. For effective and efficient use of these maps, we have to assemble the set of maps to make them superimposable on other geographic information in a GIS. The problem can be seen as a complex jigsaw puzzle where the pieces are the cadastral sections extracted from the map. We present an automatic solution to this geographic jigsaw puzzle, based on the generalized Hough transformation that detects the longest common boundary between every piece and its neighbors. Experiments were conducted using the map of Mie Prefecture, Japan, and the French cadastral map. The results of the experiments with the French cadastral maps showed that the proposed method, which includes a flood-filling procedure for internal areas and detection and normalization of the north arrow direction, is suitable for assembling the cadastral map. The final goal of the process is to integrate every piece of the puzzle into a national geographic reference frame and database.
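
    The boundary-matching idea can be illustrated with a much-simplified, translation-only voting scheme in Python: each pair of boundary points votes for the offset that would align them, and the most-voted offset corresponds to the longest shared boundary. This is a hedged sketch of the general idea, not the authors' full generalized Hough implementation (rotation is assumed to be handled separately, e.g. by the north-arrow normalization).

        from collections import Counter

        def best_translation(boundary_a, boundary_b, cell=5.0):
            """Vote, Hough-style, for the translation of piece B that brings the most
            of its boundary points into coincidence with boundary points of piece A.
            A simplified, translation-only stand-in for generalized Hough matching."""
            votes = Counter()
            for ax, ay in boundary_a:
                for bx, by in boundary_b:
                    # Each point pair votes for the offset (a - b), binned into cells.
                    key = (round((ax - bx) / cell), round((ay - by) / cell))
                    votes[key] += 1
            (ix, iy), count = votes.most_common(1)[0]
            return ix * cell, iy * cell, count

        # Toy pieces sharing an L-shaped boundary; piece B is offset by (+40, -25).
        shared = [(float(x), 100.0) for x in range(0, 50, 5)] + \
                 [(50.0, float(y)) for y in range(100, 150, 5)]
        piece_a = shared + [(0.0, float(y)) for y in range(105, 150, 5)]
        piece_b = [(x + 40.0, y - 25.0) for x, y in shared] + \
                  [(140.0, float(y)) for y in range(80, 120, 5)]

        dx, dy, n_votes = best_translation(piece_a, piece_b)
        print(dx, dy, n_votes)   # expect (-40.0, 25.0) supported by 20 votes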

  10. Unified Ecoregions of Alaska: 2001

    USGS Publications Warehouse

    Nowacki, Gregory J.; Spencer, Page; Fleming, Michael; Brock, Terry; Jorgenson, Torre

    2003-01-01

    Major ecosystems have been mapped and described for the State of Alaska and nearby areas. Ecoregion units are based on newly available datasets and field experience of ecologists, biologists, geologists and regional experts. Recently derived datasets for Alaska included climate parameters, vegetation, surficial geology and topography. Additional datasets incorporated in the mapping process were lithology, soils, permafrost, hydrography, fire regime and glaciation. Thirty-two units are mapped using a combination of the approaches of Bailey (hierarchical) and Omernik (integrated). The ecoregions are grouped into two higher levels using a 'tri-archy' based on climate parameters, vegetation response and disturbance processes. The ecoregions are described with text, photos and tables on the published map.

  11. Refining Landsat classification results using digital terrain data

    USGS Publications Warehouse

    Miller, Wayne A.; Shasby, Mark

    1982-01-01

    Scientists at the U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center have recently completed two land-cover mapping projects in which digital terrain data were used to refine Landsat classification results. Digital terrain data were incorporated into the Landsat classification process using two different procedures that required developing decision criteria either subjectively or quantitatively. The subjective procedure was used in a vegetation mapping project in Arizona, and the quantitative procedure was used in a forest-fuels mapping project in Montana. By incorporating digital terrain data into the Landsat classification process, more spatially accurate land-cover maps were produced for both projects.

  12. LANDSAT and radar mapping of intrusive rocks in SE-Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Dossantos, A. R.; Dosanjos, C. E.; Moreira, J. C.; Barbosa, M. P.; Veneziani, P.

    1982-01-01

    The feasibility of intrusive rock mapping was investigated and criteria for regional geological mapping were established at the scale of 1:500,000 in polycyclic and polymetamorphic areas, using the logic method of photointerpretation of LANDSAT imagery and radar from the RADAMBRASIL project. The spectral behavior of intrusive rocks was evaluated using the interactive multispectral image analysis system (Image-100). The region of Campos (city) in northern Rio de Janeiro State was selected as the study area, and digital image processing and pattern recognition techniques were applied. Various maps at the 1:250,000 scale were obtained to evaluate the results of automatic data processing.

  13. Paleobathymetric Reconstruction of Ross Sea: seismic data processing and regional reflectors mapping

    NASA Astrophysics Data System (ADS)

    Olivo, Elisabetta; De Santis, Laura; Wardell, Nigel; Geletti, Riccardo; Busetti, Martina; Sauli, Chiara; Bergamasco, Andrea; Colleoni, Florence; Vanzella, Walter; Sorlien, Christopher; Wilson, Doug; De Conto, Robert; Powell, Ross; Bart, Phil; Luyendyk, Bruce

    2017-04-01

    PURPOSE: New maps of some major unconformities of the Ross Sea have been reconstructed using seismic data grids from new and reprocessed seismic profiles, combined with acoustic velocities from previous works. This work is carried out with the support of PNRA and in the frame of the bilateral Italy-USA project GLAISS (Global Sea Level Rise & Antarctic Ice Sheet Stability predictions), funded by the Ministry of Foreign Affairs. Paleobathymetric maps are produced for 30, 14 and 4 million years ago, three 'key moments' for the glacial history of the Antarctic Ice Sheet, coinciding with global climatic changes. The paleobathymetric maps will then be used for numeric simulations focused on the width and thickness of the Ross Sea Ice Sheet. PRELIMINARY RESULTS: The first step was to create TWT maps of three main unconformities (RSU6, RSU4, and RSU2) of the Ross Sea, revisiting and updating the ANTOSTRAT maps; through the interpretation of sedimentary bodies and erosional features, used to infer active or old processes along the slope, we identified the main seismic unconformities. We used the IHS Kingdom academic license. The different groups contributed the analysis of the eastern Ross Sea continental slope and rise (OGS), of the Central Basin (KOPRI), and of the western and central Ross Sea (Univ. of Santa Barbara and OGS), where new drill sites and seismic profiles were collected after the publication of the ANTOSTRAT maps. Then we joined our interpretation with previous interpretations. We examined previous processing of several seismic lines and all the old acoustic velocity analyses. In addition, we reprocessed some lines in order to obtain higher data coverage. Then, combining the TWT maps of the unconformities with the old and new velocity data, we created new depth maps of the study area. The new depth maps will then be used for reconstructing the paleobathymetry of the Ross Sea by applying the backstripping technique.
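
    In its simplest form, the depth conversion mentioned above combines a two-way travel time (TWT) grid with an interval velocity via depth = velocity x TWT / 2. The sketch below shows a single-layer version with illustrative numbers; the actual study uses laterally varying velocities and multiple layers, followed by backstripping.

        import numpy as np

        def twt_to_depth(twt_s, velocity_ms):
            """Convert a two-way travel time grid (seconds) to depth (metres) for a
            single interval velocity (m/s): depth = velocity * TWT / 2.
            Illustrative only; real depth conversion stacks several layers with
            laterally varying velocities."""
            return np.asarray(twt_s) * velocity_ms / 2.0

        # Example: a small TWT grid (s) for one horizon, converted with one velocity.
        twt_grid = np.array([[0.85, 0.90, 0.95],
                             [0.88, 0.93, 1.00]])
        print(twt_to_depth(twt_grid, velocity_ms=1800.0))   # depths in metres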

  14. The Use of Multiple Data Sources in the Process of Topographic Maps Updating

    NASA Astrophysics Data System (ADS)

    Cantemir, A.; Visan, A.; Parvulescu, N.; Dogaru, M.

    2016-06-01

    The methods used in the process of updating maps have evolved and become more complex, especially with the development of digital technology. At the same time, the development of technology has led to an abundance of available data that can be used in the updating process. The data sources come in a great variety of forms and formats from different acquisition sensors. Satellite images provided by certain satellite missions are now available on space agency portals. Images stored in the archives of satellite missions such as Sentinel and Landsat can be downloaded free of charge. The main advantages are the large coverage area and rather good spatial resolution, which enable the use of these images for map updating at an appropriate scale. In our study we focused our research of these images on the 1:50,000 scale map. Globally available DEMs represent an appropriate input for watershed delineation and stream network generation, which can support the update of the hydrography thematic layer. If, in addition to remote sensing, aerial photogrammetry and LiDAR data are used, the accuracy of the data sources is enhanced. Orthophotoimages and Digital Terrain Models are the main products that can be used for feature extraction and update. In addition, the use of georeferenced analogue basemaps represents a significant contribution to the process. Concerning the thematic maps, the classic representation of the terrain by contour lines derived from a DTM remains the best method of portraying the earth's surface on a map; nevertheless, correlation with other layers, such as hydrography, is mandatory. In the context of the current national coverage of the Digital Terrain Model, one of the main concerns of the National Center of Cartography, through the Cartography and Photogrammetry Department, is the exploitation of the available data in order to update the layers of the Topographic Reference Map 1:5000, known as TOPRO5, and at the same time, through generalization and additional data sources, the Romanian 1:50,000 scale map. This paper also investigates the general perspective of automatically using DTM-derived products in the process of updating the topographic maps.

  15. Magellan mission summary

    NASA Technical Reports Server (NTRS)

    Saunders, R. S.; Spear, A. J.; Allin, P. C.; Austin, R. S.; Berman, A. L.; Chandlee, R. C.; Clark, J.; Decharon, A. V.; De Jong, E. M.; Griffith, D. G.

    1992-01-01

    Magellan started mapping the planet Venus on September 15, 1990, and after one cycle (one Venus day or 243 earth days) had mapped 84 percent of the planet's surface. This returned an image data volume greater than all past planetary missions combined. Spacecraft problems were experienced in flight. Changes in operational procedures and reprogramming of onboard computers minimized the amount of mapping data lost. Magellan data processing is the largest planetary image-processing challenge to date. Compilation of global maps of tectonic and volcanic features, as well as impact craters and related phenomena and surface processes related to wind, weathering, and mass wasting, has begun. The Magellan project is now in an extended mission phase, with plans for additional cycles out to 1995. The Magellan project will fill in mapping gaps, obtain a global gravity data set between mid-September 1992 and May 1993, acquire images at different view angles, and look for changes on the surface from one cycle to another caused by surface activity such as volcanism, faulting, or wind activity.

  16. Using a concept map as a tool for strategic planning: The Healthy Brain Initiative.

    PubMed

    Anderson, Lynda A; Day, Kristine L; Vandenberg, Anna E

    2011-09-01

    Concept mapping is a tool to assist in strategic planning that allows planners to work through a sequence of phases to produce a conceptual framework. Although several studies describe how concept mapping is applied to various public health problems, the flexibility of the methods used in each phase of the process is often overlooked. If practitioners were more aware of the flexibility, more public health endeavors could benefit from using concept mapping as a tool for strategic planning. The objective of this article is to describe how the 6 concept-mapping phases originally outlined by William Trochim guided our strategic planning process and how we adjusted the specific methods in the first 2 phases to meet the specialized needs and requirements to create The Healthy Brain Initiative: A National Public Health Road Map to Maintaining Cognitive Health. In the first stage (phases 1 and 2 of concept mapping), we formed a steering committee, convened 4 work groups over a period of 3 months, and generated an initial set of 42 action items grounded in science. In the second stage (phases 3 and 4), we engaged stakeholders in sorting and rating the action items and constructed a series of concept maps. In the third and final stage (phases 5 and 6), we examined and refined the action items and generated a final concept map consisting of 44 action items. We then selected the top 10 action items, and in 2007, we published The Healthy Brain Initiative: A National Public Health Road Map to Maintaining Cognitive Health, which represents the strategic plan for The Healthy Brain Initiative.

  17. Surname distribution in France: a distance analysis by a distorted geographical map.

    PubMed

    Mourrieras, B; Darlu, P; Hochez, J; Hazout, S

    1995-01-01

    The distribution of surnames in 90 distinct regions in France during two successive periods, 1889-1915 and 1916-1940, is analysed from the civil birth registers of the 36,500 administrative units in France. A new approach, called the 'Mobile Site Method' (MSM), is developed to allow representation of a surname distance matrix by a distorted geographical map. A surname distance matrix between the various regions in France is first calculated, then a distorted geographical map called the 'surname similarity map' is built up from the surname distances between regions. To interpret this map we draw (a) successive map contours obtained during the step-by-step distortion process, revealing zones of high surname dissimilarity, and (b) maps in grey levels representing the displacement magnitude, allowing the segmentation of the geographical and surname maps into 'homogeneous surname zones'. By integrating geography and surname information in the same analysis, and by comparing results obtained for the two successive periods, the MSM approach produces convenient maps showing: (a) 'regionalism' of some peripheral populations such as Pays Basque, Alsace, Corsica and Brittany; (b) the presence of preferential axes of communications (Rhodanian corridor, Garonne valley); (c) barriers such as the Central Massif and the Vosges; (d) the weak modification of the distorted maps between the two periods studied, suggesting a limited extension of the tendency toward surname uniformity in France. These results are interpreted, in the nineteenth- and twentieth-century context, as the consequences of a slow process of local migrations occurring over a long period of time.
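
    The first step of the method, building a surname distance matrix between regions and turning it into a "similarity map", can be sketched as follows; classical multidimensional scaling is used here only as a stand-in for the authors' iterative Mobile Site Method, and the surname frequency profiles are synthetic.

        import numpy as np

        # Synthetic surname frequency profiles for four regions (rows) over six surnames.
        freq = np.array([
            [0.30, 0.20, 0.15, 0.15, 0.10, 0.10],
            [0.28, 0.22, 0.14, 0.16, 0.10, 0.10],
            [0.05, 0.10, 0.35, 0.20, 0.20, 0.10],
            [0.04, 0.08, 0.33, 0.22, 0.21, 0.12],
        ])

        # Pairwise surname distance between regions (Euclidean distance of profiles).
        diff = freq[:, None, :] - freq[None, :, :]
        D = np.sqrt((diff ** 2).sum(axis=-1))

        # Classical MDS: embed the distance matrix in 2D so that map distance
        # approximates surname distance (a stand-in for the iterative MSM distortion).
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n
        B = -0.5 * J @ (D ** 2) @ J
        eigvals, eigvecs = np.linalg.eigh(B)
        order = np.argsort(eigvals)[::-1][:2]
        coords = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0.0))
        print(coords)   # 2D "surname similarity map" coordinates for the four regions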

  18. Valorisation of Como Historical Cadastral Maps Through Modern Web Geoservices

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Zamboni, G.

    2012-07-01

    Cartographic cultural heritage preserved in worldwide archives is often stored in the original paper version only, thus restricting both the chances of utilization and the range of possible users. The Web C.A.R.T.E. system addressed this issue with regard to the precious cadastral maps preserved at the State Archive of Como. The aim of the project was to improve the visibility and accessibility of this heritage using the latest free and open source tools for processing, cataloguing and web publishing the maps. The resulting architecture should therefore assist the State Archive of Como in managing its cartographic contents. After a pre-processing consisting of digitization and georeferencing steps, maps were provided with metadata, compiled according to the current Italian standards and managed through an ad hoc version of the GeoNetwork Opensource geocatalog software. A dedicated MapFish-based webGIS client, with an optimized version also for mobile platforms, was built for map publication and 2D navigation. A module for 3D visualization of cadastral maps was finally developed using the NASA World Wind Virtual Globe. Thanks to a temporal slidebar, time was also included in the system, producing a 4D Graphical User Interface. The overall architecture was built entirely with free and open source software and allows a direct and intuitive consultation of historical maps. Besides the notable advantage of keeping original paper maps intact, the system greatly simplifies the work of the State Archive of Como's common users and at the same time widens the range of users thanks to the modernization of map consultation tools.

  19. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application to the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.

  20. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application to the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392
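
    As a hedged illustration of the core computation behind such density maps, the sketch below estimates a kernel density surface from synthetic geocoded case points using SciPy; the REST service layer, backend database, and actual TB case data are outside the scope of this example.

        import numpy as np
        from scipy.stats import gaussian_kde

        # Synthetic geocoded case locations (projected x, y in metres)
        rng = np.random.default_rng(42)
        cases = np.vstack([
            rng.normal([1000, 2000], 150, size=(60, 2)),   # one compact cluster of cases
            rng.normal([2500, 1200], 300, size=(40, 2)),   # a second, more diffuse cluster
        ])

        # Kernel density estimate evaluated on a regular grid over the study area
        kde = gaussian_kde(cases.T)
        xg, yg = np.meshgrid(np.linspace(0, 3500, 120), np.linspace(0, 3000, 100))
        density = kde(np.vstack([xg.ravel(), yg.ravel()])).reshape(xg.shape)

        # 'density' is the raster behind a spatial density map; a web service would
        # style it and return it to the map client as an image or tiles.
        print(density.shape, float(density.max()))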

  1. Alternative transitions between existing representations in multi-scale maps

    NASA Astrophysics Data System (ADS)

    Dumont, Marion; Touya, Guillaume; Duchêne, Cécile

    2018-05-01

    Map users may struggle to achieve multi-scale navigation tasks, as cartographic objects may have various representations across scales. We assume that adding intermediate representations could be one way to reduce the differences between existing representations and to ease the transitions across scales. We consider an existing multi-scale map covering the range from 1:25k to 1:100k. Based on hypotheses about intermediate representation design, we build custom multi-scale maps with alternative transitions. We will conduct a user evaluation in the near future to compare the efficiency of these alternative maps for multi-scale navigation. This paper discusses the hypotheses and the production process of these alternative maps.

  2. Galactic background maps at 3.93 and 6.55 MHz. M.S. Thesis - Maryland Univ.

    NASA Technical Reports Server (NTRS)

    Novaco, J. C.

    1973-01-01

    The Radio Astronomy Explorer Satellite (RAE-1), its hardware and its data processing are discussed. The data from the prime mapping antenna are discussed with emphasis on the problems involved in reducing the data. Particular attention is drawn to two problems - receiver instability and ground breakthrough - and their influence on the data. Galactic background maps of the nonthermal radiation at 3.93 and 6.55 MHz are produced. It is demonstrated that the positional uncertainty of the maps is about 20 deg. The maps at 3.93 and 6.55 MHz are compared to two ground-based maps made at higher frequencies that are smoothed to the larger RAE antenna patterns.

  3. South Florida Everglades: satellite image map

    USGS Publications Warehouse

    Jones, John W.; Thomas, Jean-Claude; Desmond, G.B.

    2001-01-01

    These satellite image maps are one product of the USGS Land Characteristics from Remote Sensing project, funded through the USGS Place-Based Studies Program (http://access.usgs.gov/) with support from the Everglades National Park (http://www.nps.gov/ever/). The objective of this project is to develop and apply innovative remote sensing and geographic information system techniques to map the distribution of vegetation, vegetation characteristics, and related hydrologic variables through space and over time. The mapping and description of vegetation characteristics and their variations are necessary to accurately simulate surface hydrology and other surface processes in South Florida and to monitor land surface changes. As part of this research, data from many airborne and satellite imaging systems have been georeferenced and processed to facilitate data fusion and analysis. These image maps were created using image fusion techniques developed as part of this project.

  4. Korean coastal water depth/sediment and land cover mapping (1:25,000) by computer analysis of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Park, K. Y.; Miller, L. D.

    1978-01-01

    Computer analysis was applied to single date LANDSAT MSS imagery of a sample coastal area near Seoul, Korea equivalent to a 1:50,000 topographic map. Supervised image processing yielded a test classification map from this sample image containing 12 classes: 5 water depth/sediment classes, 2 shoreline/tidal classes, and 5 coastal land cover classes at a scale of 1:25,000 and with a training set accuracy of 76%. Unsupervised image classification was applied to a subportion of the site analyzed and produced classification maps comparable in results in a spatial sense. The results of this test indicated that it is feasible to produce such quantitative maps for detailed study of dynamic coastal processes given a LANDSAT image data base at sufficiently frequent time intervals.
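
    The supervised pixel classification described above can be sketched roughly as follows; the band values, the twelve class labels and the use of a random forest classifier are illustrative assumptions, not the original 1978 procedure.

        # Illustrative supervised classification of multispectral pixels into
        # water-depth/sediment and land-cover classes (not the original procedure).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        # Hypothetical training set: 4 MSS band values per pixel, 12 class labels.
        X = rng.uniform(0, 255, size=(600, 4))
        y = rng.integers(0, 12, size=600)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                            random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_train, y_train)
        print("held-out accuracy:", clf.score(X_test, y_test))

        # Classifying a full image: reshape (rows, cols, bands) to (pixels, bands),
        # predict, then reshape back into a class map.
        image = rng.uniform(0, 255, size=(100, 100, 4))
        class_map = clf.predict(image.reshape(-1, 4)).reshape(100, 100)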

  5. Radiation hybrid map of barley chromosome 3H

    USDA-ARS?s Scientific Manuscript database

    Assembly of the barley genome is complicated by its large size (5.1 Gb) and proportion of repetitive elements (84%). This process is facilitated by high resolution maps for aligning BAC contigs along chromosomes. Available genetic maps, however, do not provide accurate information on the physical po...

  6. As-built design specification for segment map (Sgmap) program

    NASA Technical Reports Server (NTRS)

    Tompkins, M. A. (Principal Investigator)

    1981-01-01

    The segment map program (SGMAP), which is part of the CLASFYT package, is described in detail. This program is designed to output symbolic maps or numerical dumps from LANDSAT cluster/classification files or aircraft ground truth/processed ground truth files which are in 'universal' format.

  7. 39 CFR 776.5 - Review procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Floodplain Management § 776.5 Review procedures. Officials shall follow the decision-making process outlined in paragraphs (a) through (f) of this section, when a facility action may involve floodplain issues... Emergency Management Agency (FEMA) maps, or more detailed maps if available. If such maps are not available...

  8. Creating a literature database of low-calorie sweeteners and health studies: evidence mapping.

    PubMed

    Wang, Ding Ding; Shams-White, Marissa; Bright, Oliver John M; Parrott, J Scott; Chung, Mei

    2016-01-05

    Evidence mapping is an emerging tool used to systematically identify, organize and summarize the quantity and focus of scientific evidence on a broad topic, but there are currently no methodological standards. Using the topic of low-calorie sweeteners (LCS) and selected health outcomes, we describe the process of creating an evidence-map database and demonstrate several example descriptive analyses using this database. The process of creating an evidence-map database is described in detail. The steps include: developing a comprehensive literature search strategy, establishing study eligibility criteria and a systematic study selection process, extracting data, developing outcome groups with input from expert stakeholders and tabulating data using descriptive analyses. The database was uploaded onto SRDR™ (Systematic Review Data Repository), an open public data repository. Our final LCS evidence-map database included 225 studies, of which 208 were interventional studies and 17 were cohort studies. An example bubble plot was produced to display the evidence-map data and visualize research gaps according to four parameters: comparison types, population baseline health status, outcome groups, and study sample size. This plot indicated a lack of studies assessing appetite and dietary intake related outcomes using LCS with a sugar intake comparison in people with diabetes. Evidence mapping is an important tool for the contextualization of in-depth systematic reviews within broader literature and identifies gaps in the evidence base, which can be used to inform future research. An open evidence-map database has the potential to promote knowledge translation from nutrition science to policy.
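
    The bubble plot described above, which crosses comparison type with outcome group and scales bubbles by the number of studies, can be drawn roughly as in the sketch below; the categories and counts are hypothetical placeholders, not values from the LCS database.

        # Rough sketch of an evidence-map bubble plot (hypothetical categories and counts).
        import matplotlib.pyplot as plt

        comparisons = ["LCS vs sugar", "LCS vs placebo", "LCS vs water"]
        outcomes = ["Appetite", "Dietary intake", "Body weight", "Glycemia"]

        # counts[i][j]: number of studies for comparison i and outcome group j.
        counts = [[2, 1, 8, 10],
                  [5, 6, 12, 9],
                  [1, 3, 4, 2]]

        fig, ax = plt.subplots()
        for i, comp in enumerate(comparisons):
            for j, out in enumerate(outcomes):
                ax.scatter(j, i, s=counts[i][j] * 60, alpha=0.5)   # bubble area ~ study count
                ax.annotate(str(counts[i][j]), (j, i), ha="center", va="center")

        ax.set_xticks(range(len(outcomes)))
        ax.set_xticklabels(outcomes)
        ax.set_yticks(range(len(comparisons)))
        ax.set_yticklabels(comparisons)
        ax.set_title("Evidence map: study counts by comparison and outcome group")
        plt.tight_layout()
        plt.show()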

  9. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    USGS Publications Warehouse

    Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.

    2011-01-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be made entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data such that every researcher can now access digital mapping facilities on his desktop. The efforts to maintain planetary missions data accessible to the scientific community have led to the creation of standardized digital archives that facilitate the access to different datasets by software capable of processing these data from the raw level to the map projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, the analysis, and the retrieval of spatially referenced Earth-based environmental geodata; since the last decade these computer programs have become popular among the planetary science community, and recent mission data start to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey, represents the state of the art for processing planetary remote sensing data, from the raw unprocessed state to the map projected product. The second, the Geographic Resources Analysis Support System (GRASS) is a Geographic Information System developed by an international team of developers, and one of the core projects promoted by the Open Source Geospatial Foundation (OSGeo). We have worked on enabling the combined use of these software systems through the set-up of a common user interface, the unification of the cartographic reference system nomenclature and the minimization of data conversion. Both software packages are distributed with free open source licenses, as well as the source code, scripts and configuration files hereafter presented. In this paper we describe our work done to merge these working environments into a common one, where the user benefits from functionalities of both systems without the need to switch or transfer data from one software suite to the other one. Thereafter we provide an example of its usage in the handling of planetary data and the crafting of a digital geologic map. © 2010 Elsevier Ltd. All rights reserved.

  10. Near-real-time simulation and internet-based delivery of forecast-flood inundation maps using two-dimensional hydraulic modeling--A pilot study for the Snoqualmie River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Fulford, Janice M.; Voss, Frank D.

    2002-01-01

    A system of numerical hydraulic modeling, geographic information system processing, and Internet map serving, supported by new data sources and application automation, was developed that generates inundation maps for forecast floods in near real time and makes them available through the Internet. Forecasts for flooding are generated by the National Weather Service (NWS) River Forecast Center (RFC); these forecasts are retrieved automatically by the system and prepared for input to a hydraulic model. The model, TrimR2D, is a new, robust, two-dimensional model capable of simulating wide varieties of discharge hydrographs and relatively long stream reaches. TrimR2D was calibrated for a 28-kilometer reach of the Snoqualmie River in Washington State, and is used to estimate flood extent, depth, arrival time, and peak time for the RFC forecast. The results of the model are processed automatically by a Geographic Information System (GIS) into maps of flood extent, depth, and arrival and peak times. These maps subsequently are processed into formats acceptable by an Internet map server (IMS). The IMS application is a user-friendly interface to access the maps over the Internet; it allows users to select what information they wish to see presented and allows the authors to define scale-dependent availability of map layers and their symbology (appearance of map features). For example, the IMS presents a background of a digital USGS 1:100,000-scale quadrangle at smaller scales, and automatically switches to an ortho-rectified aerial photograph (a digital photograph that has camera angle and tilt distortions removed) at larger scales so viewers can see ground features that help them identify their area of interest more effectively. For the user, the option exists to select either background at any scale. Similar options are provided for both the map creator and the viewer for the various flood maps. This combination of a robust model, emerging IMS software, and application interface programming should allow the technology developed in the pilot study to be applied to other river systems where NWS forecasts are provided routinely.

  11. A new mapping function in table-mounted eye tracker

    NASA Astrophysics Data System (ADS)

    Tong, Qinqin; Hua, Xiao; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng

    2018-01-01

    The eye tracker is a relatively new apparatus for human-computer interaction that has attracted much attention in recent years. Eye tracking technology obtains the subject's current "visual attention (gaze)" direction using mechanical, electronic, optical, image processing, and other means of detection. The mapping function is one of the key technologies of the image processing stage and determines the accuracy of the whole eye tracker system. In this paper, we present a new mapping model based on the relationship among the eyes, the camera, and the screen being gazed at. Firstly, according to the geometrical relationship among the eyes, the camera, and the screen, the framework of the mapping function between the pupil center and the screen coordinates is constructed. Secondly, in order to simplify the vector inversion in the mapping function, the coordinates of the eyes, the camera, and the screen were modeled in a coaxial coordinate system. A corresponding experiment was conducted to verify the mapping function and to compare it with the traditional quadratic polynomial function. The results show that our approach improves the accuracy of gaze-point determination. Compared with other methods, this mapping function is simple and valid.
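
    The traditional quadratic-polynomial calibration that the new model is compared against maps pupil-centre coordinates to screen coordinates by least squares; a generic sketch is shown below, with the calibration points and polynomial terms chosen purely for illustration.

        # Sketch of the conventional quadratic-polynomial gaze mapping: fit screen
        # coordinates (sx, sy) from pupil-centre coordinates (px, py) by least squares.
        import numpy as np

        def design_matrix(px, py):
            # Second-order polynomial terms: 1, x, y, x*y, x^2, y^2
            return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

        # Hypothetical calibration data: pupil positions and the known screen targets
        # fixated during calibration.
        rng = np.random.default_rng(2)
        pupil = rng.uniform(-1, 1, size=(9, 2))
        screen = np.column_stack([800 + 500 * pupil[:, 0] + 30 * pupil[:, 0]**2,
                                  450 + 300 * pupil[:, 1] + 20 * pupil[:, 0] * pupil[:, 1]])

        A = design_matrix(pupil[:, 0], pupil[:, 1])
        coef_x, *_ = np.linalg.lstsq(A, screen[:, 0], rcond=None)
        coef_y, *_ = np.linalg.lstsq(A, screen[:, 1], rcond=None)

        def gaze_point(px, py):
            a = design_matrix(np.atleast_1d(px), np.atleast_1d(py))
            return a @ coef_x, a @ coef_y   # estimated screen coordinates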

  12. A comparative survey of current and proposed tropospheric refraction-delay models for DSN radio metric data calibration

    NASA Technical Reports Server (NTRS)

    Estefan, J. A.; Sovers, O. J.

    1994-01-01

    The standard tropospheric calibration model implemented in the operational Orbit Determination Program is the seasonal model developed by C. C. Chao in the early 1970s. The seasonal model has seen only slight modification since its release, particularly in the format and content of the zenith delay calibrations. Chao's most recent standard mapping tables, which are used to project the zenith delay calibrations along the station-to-spacecraft line of sight, have not been modified since they were first published in late 1972. This report focuses principally on proposed upgrades to the zenith delay mapping process, although modeling improvements to the zenith delay calibration process are also discussed. A number of candidate approximation models for the tropospheric mapping are evaluated, including the semi-analytic mapping function of Lanyi and the semi-empirical mapping functions of Davis et al. ('CfA-2.2'), of Ifadis (global solution model), of Herring ('MTT'), and of Niell ('NMF'). All of the candidate mapping functions are superior to the Chao standard mapping tables and approximation formulas when evaluated against the current Deep Space Network Mark 3 intercontinental very long baseline interferometry database.
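
    The Herring (MTT) and Niell (NMF) mapping functions mentioned above share a normalized three-term continued-fraction form; the sketch below shows that form with placeholder coefficients, since the published coefficients depend on site, season and, for NMF, latitude tables.

        # Generic normalized continued-fraction elevation mapping function (the form
        # used by the Herring/Niell families); a, b, c are placeholder coefficients,
        # not values from any published model.
        import math

        def mapping_function(elevation_deg, a, b, c):
            """Ratio of slant tropospheric delay to zenith delay at a given elevation."""
            s = math.sin(math.radians(elevation_deg))
            numerator = 1.0 + a / (1.0 + b / (1.0 + c))
            denominator = s + a / (s + b / (s + c))
            return numerator / denominator

        # At zenith (90 deg) the factor is 1 by construction; at low elevations it
        # approaches roughly 1/sin(E).
        print(mapping_function(90.0, 1.2e-3, 3.0e-3, 6.0e-2))   # ~1.0
        print(mapping_function(10.0, 1.2e-3, 3.0e-3, 6.0e-2))   # ~5.6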

  13. Prospects and pitfalls of occupational hazard mapping: 'between these lines there be dragons'.

    PubMed

    Koehler, Kirsten A; Volckens, John

    2011-10-01

    Hazard data mapping is a promising new technique that can enhance the process of occupational exposure assessment and risk communication. Hazard maps have the potential to improve worker health by providing key input for the design of hazard intervention and control strategies. Hazard maps are developed with aid from direct-reading instruments, which can collect highly spatially and temporally resolved data in a relatively short period of time. However, quantifying spatial-temporal variability in the occupational environment is not a straightforward process, and our lack of understanding of how to ascertain and model spatial and temporal variability is a limiting factor in the use and interpretation of workplace hazard maps. We provide an example of how sources of and exposures to workplace hazards may be mischaracterized in a hazard map due to a lack of completeness and representativeness of collected measurement data. Based on this example, we believe that a major priority for research in this emerging area should focus on the development of a statistical framework to quantify uncertainty in spatially and temporally varying data. In conjunction with this need is one for the development of guidelines and procedures for the proper sampling, generation, and evaluation of workplace hazard maps.
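
    As a simple illustration of how sparse direct-reading measurements are turned into a hazard map, the sketch below interpolates point measurements onto a grid with inverse-distance weighting; the locations, concentrations and method are placeholders, and the sketch deliberately ignores the representativeness and uncertainty issues the authors raise.

        # Inverse-distance-weighted (IDW) interpolation of sparse direct-reading
        # instrument measurements onto a workplace grid (illustrative only).
        import numpy as np

        def idw_grid(xy, values, nx=50, ny=30, power=2.0):
            xs = np.linspace(xy[:, 0].min(), xy[:, 0].max(), nx)
            ys = np.linspace(xy[:, 1].min(), xy[:, 1].max(), ny)
            gx, gy = np.meshgrid(xs, ys)
            grid_pts = np.column_stack([gx.ravel(), gy.ravel()])
            # Distances from every grid cell to every measurement location.
            d = np.linalg.norm(grid_pts[:, None, :] - xy[None, :, :], axis=2)
            w = 1.0 / np.maximum(d, 1e-9) ** power
            z = (w * values).sum(axis=1) / w.sum(axis=1)
            return xs, ys, z.reshape(ny, nx)

        # Hypothetical measurement locations (metres) and concentrations (mg/m^3).
        rng = np.random.default_rng(3)
        pts = rng.uniform([0, 0], [40, 20], size=(25, 2))
        conc = rng.lognormal(mean=0.0, sigma=0.8, size=25)
        xs, ys, hazard_map = idw_grid(pts, conc)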

  14. Effectiveness of higher order thinking skills (HOTS) based i-Think map concept towards primary students

    NASA Astrophysics Data System (ADS)

    Ping, Owi Wei; Ahmad, Azhar; Adnan, Mazlini; Hua, Ang Kean

    2017-05-01

    Higher Order Thinking Skills (HOTS) is a new concept in education reform based on Bloom's Taxonomy. The concept concentrates on students' understanding of the learning process based on their own methods. HOTS questions can train students to think creatively, critically, and innovatively. The aim of this study was to identify students' proficiency in solving HOTS Mathematics questions by using the i-Think map. The research took place in Sabak Bernam, Selangor. A quantitative approach was applied, involving approximately all of the standard five students. A pre- and post-test was conducted before and after the intervention of using the i-Think map to solve HOTS questions. The results indicate a significant improvement in the post-test, which shows that applying the i-Think map enhances students' ability to solve HOTS questions. Survey analysis showed that 90% of the students agreed that the i-Think map helped them analyse the questions carefully and use keywords in the map to solve them. In conclusion, this process helps students minimize mistakes when solving the questions. Therefore, teachers need to guide students in applying the appropriate i-Think map and methods for analyzing the questions by finding the keywords.

  15. An optimization method of VON mapping for energy efficiency and routing in elastic optical networks

    NASA Astrophysics Data System (ADS)

    Liu, Huanlin; Xiong, Cuilian; Chen, Yong; Li, Changping; Chen, Derun

    2018-03-01

    To improve resource utilization efficiency, network virtualization in elastic optical networks has been developed by sharing the same physical network among different users and applications. In the process of virtual node mapping, longer paths between physical nodes consume more spectrum resources and energy. To address this problem, we propose a virtual optical network mapping algorithm, the genetic multi-objective optimized virtual optical network mapping algorithm (GM-OVONM-AL), which jointly optimizes energy consumption and spectrum resource consumption during virtual optical network mapping. Firstly, a vector function is proposed to balance energy consumption and spectrum resources by optimizing population classification and crowding-distance sorting. Then, an adaptive crossover operator based on hierarchical comparison is proposed to improve search ability and convergence speed. In addition, the principle of survival of the fittest is introduced to select better individuals according to their domination rank. Compared with the spectrum consecutiveness-opaque virtual optical network mapping algorithm and the baseline opaque virtual optical network mapping algorithm, simulation results show that the proposed GM-OVONM-AL achieves the lowest bandwidth blocking probability and saves energy.
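
    Crowding-distance sorting, one of the ingredients the algorithm above builds on, can be sketched for a two-objective population (energy, spectrum) as below; this is a generic NSGA-II-style computation, not the authors' GM-OVONM-AL implementation.

        # Generic crowding-distance computation for a two-objective population
        # (energy consumption, spectrum consumption); a standard NSGA-II ingredient.
        import numpy as np

        def crowding_distance(objectives):
            """objectives: (n_individuals, n_objectives) array; returns one distance per individual."""
            n, m = objectives.shape
            dist = np.zeros(n)
            for j in range(m):
                order = np.argsort(objectives[:, j])
                dist[order[0]] = dist[order[-1]] = np.inf      # keep boundary solutions
                span = objectives[order[-1], j] - objectives[order[0], j]
                if span == 0:
                    continue
                dist[order[1:-1]] += (objectives[order[2:], j]
                                      - objectives[order[:-2], j]) / span
            return dist

        pop = np.array([[3.0, 9.0], [5.0, 5.0], [7.0, 3.0], [9.0, 2.0]])  # energy, spectrum
        print(crowding_distance(pop))   # boundary individuals get inf, interior ones finite values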

  16. Database for geologic maps of pyroclastic-flow and related deposits of the 1980 eruptions of Mount St. Helens, Washington

    USGS Publications Warehouse

    Furze, Andrew J.; Bard, Joseph A.; Robinson, Joel; Ramsey, David W.; Kuntz, Mel A.; Rowley, Peter D.; MacLeod, Norman S.

    2017-10-31

    This publication releases digital versions of the geologic maps in U.S. Geological Survey Miscellaneous Investigations Map 1950 (USGS I-1950), “Geologic maps of pyroclastic-flow and related deposits of the 1980 eruptions of Mount St. Helens, Washington” (Kuntz, Rowley, and MacLeod, 1990) (https://pubs.er.usgs.gov/publication/i1950). The 1980 Mount St. Helens eruptions on May 18, May 25, June 12, July 22, August 7, and October 16–18 produced pyroclastic-flow and related deposits. The distribution and morphology of these deposits, as determined from extensive field studies and examination of vertical aerial photographs, are shown on four maps in I-1950 (maps A–D) on two map sheets. Map A shows the May 18, May 25, and June 12 deposits; map B shows the July 22 deposits; map C shows the August 7 deposits; and map D shows the October 16–18 deposits. No digital geospatial versions of the geologic data were made available at the time of publication of the original maps. This data release consists of attributed vector features, data tables, and the cropped and georeferenced scans from which the features were digitized, in order to enable visualization and analysis of these data in GIS software. This data release enables users to digitally re-create the maps and description of map units of USGS I-1950; map sheet 1 includes text sections (Introduction, Physiography of Mount St. Helens at the time of the 1980 eruptions, Processes of the 1980 eruptions, Deposits of the 1980 eruptions, Limitations of the maps, Preparation of the maps, and References cited) and associated tables and figures that are not included in this data release.

  17. Analogical processes in children's understanding of spatial representations.

    PubMed

    Yuan, Lei; Uttal, David; Gentner, Dedre

    2017-06-01

    We propose that map reading can be construed as a form of analogical mapping. We tested 2 predictions that follow from this claim: First, young children's patterns of performance in map reading tasks should parallel those found in analogical mapping tasks; and, second, children will benefit from guided alignment instructions that help them see the relational correspondences between the map and the space. In 4 experiments, 3-year-olds completed a map reading task in which they were asked to find hidden objects in a miniature room, using a corresponding map. We manipulated the availability of guided alignment (showing children the analogical mapping between maps and spaces; Experiments 1, 2, and 3a), the format of guided alignment (gesture or relational language; Experiment 2), and the iconicity of maps (Experiments 3a and 3b). We found that (a) young children's difficulties in map reading follow from known patterns of analogical development-for example, focusing on object similarity over relational similarity; and (b) guided alignment based on analogical reasoning led to substantially better performance. Results also indicated that children's map reading performance was affected by the format of guided alignment, the iconicity of the maps, and the order of tasks. The results bear on the developmental mechanisms underlying young children's learning of spatial representations and also suggest ways to support this learning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. Here, the statistical selection method is extended to take into consideration variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  19. Intelligent process mapping through systematic improvement of heuristics

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.

    1992-01-01

    The present system for automatic learning/evaluation of novel heuristic methods applicable to the mapping of communication-process sets on a computer network has its basis in the testing of a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new post-game analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.

  20. [MapDraw: a microsoft excel macro for drawing genetic linkage maps based on given genetic linkage data].

    PubMed

    Liu, Ren-Hu; Meng, Jin-Ling

    2003-05-01

    MAPMAKER is one of the most widely used software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh version, MAPMAKER 3.0 for Macintosh, is able to draw. In recent years especially, the Macintosh has become much less popular than the PC, and most geneticists use PCs to analyze their genetic linkage data. New software that can draw on a PC the same genetic linkage maps that MAPMAKER for Macintosh draws on a Macintosh has therefore long been needed. Microsoft Excel, one component of the Microsoft Office package, is one of the most popular programs for laboratory data processing. Microsoft Visual Basic for Applications (VBA) is one of the most powerful features of Microsoft Excel. Using this programming language, we can take creative control of Excel, including genetic linkage map construction, automatic data processing, and more. In this paper, a Microsoft Excel macro called MapDraw is constructed to draw genetic linkage maps on a PC based on given genetic linkage data. Using this software, you can freely construct genetic linkage maps in Excel and freely edit and copy them to Word or other applications. The software is simply an Excel-format file. You can freely copy it from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.
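
    A linkage-group drawing of the kind MapDraw automates can be sketched in a few lines; the marker names and centimorgan positions below are invented, and matplotlib is used here as a stand-in, so the sketch illustrates the output style rather than the Excel/VBA macro itself.

        # Illustrative drawing of one genetic linkage group from marker positions
        # (invented data; matplotlib stand-in for the Excel/VBA macro's output).
        import matplotlib.pyplot as plt

        markers = {"RM101": 0.0, "RM212": 12.4, "RM318": 25.1, "RM407": 38.9, "RM522": 55.3}

        fig, ax = plt.subplots(figsize=(2.5, 6))
        ax.plot([0, 0], [0.0, max(markers.values())], linewidth=6, color="lightgray")  # chromosome bar

        for name, cm in markers.items():
            ax.plot([-0.05, 0.05], [cm, cm], color="black")              # tick on the bar
            ax.text(-0.1, cm, f"{cm:.1f}", ha="right", va="center")      # cM position
            ax.text(0.1, cm, name, ha="left", va="center")               # marker name

        ax.set_xlim(-0.5, 0.5)
        ax.invert_yaxis()          # 0 cM at the top, as linkage maps are usually drawn
        ax.axis("off")
        ax.set_title("LG 1")
        plt.tight_layout()
        plt.show()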

  1. REGULATION OF EPHRIN-A EXPRESSION IN COMPRESSED RETINOCOLLICULAR MAPS

    PubMed Central

    Tadesse, T.; Cheng, Q.; Xu, M.; Baro, D.J.; Young, L.J.; Pallas, S.L.

    2012-01-01

    Retinotopic maps can undergo compression and expansion in response to changes in target size, but the mechanism underlying this compensatory process has remained a mystery. The discovery of ephrins as molecular mediators of Sperry’s chemoaffinity process allows a mechanistic approach to this important issue. In Syrian hamsters, neonatal, partial (PT) ablation of posterior superior colliculus (SC) leads to compression of the retinotopic map, independent of neural activity. Graded, repulsive EphA receptor/ephrin-A ligand interactions direct the formation of the retinocollicular map, but whether ephrins might also be involved in map compression is unknown. To examine whether map compression might be directed by changes in the ephrin expression pattern, we compared ephrin-A2 and ephrin-A5 mRNA expression between normal SC and PT SC using in situ hybridization and quantitative real-time PCR. We found that ephrin-A ligand expression in the compressed maps was low anteriorly and high posteriorly, as in normal animals. Consistent with our hypothesis, the steepness of the ephrin gradient increased in the lesioned colliculi. Interestingly, overall levels of ephrin-A2 and -A5 expression declined immediately after neonatal target damage, perhaps promoting axon outgrowth. These data establish a correlation between changes in ephrin-A gradients and map compression, and suggest that ephrin-A expression gradients may be regulated by target size. This in turn could lead to compression of the retinocollicular map onto the reduced target. These findings have important implications for mechanisms of recovery from traumatic brain injury. PMID:23008269

  2. Map scale effects on estimating the number of undiscovered mineral deposits

    USGS Publications Warehouse

    Singer, D.A.; Menzie, W.D.

    2008-01-01

    Estimates of numbers of undiscovered mineral deposits, fundamental to assessing mineral resources, are affected by map scale. Where consistently defined deposits of a particular type are estimated, spatial and frequency distributions of deposits are linked in that some frequency distributions can be generated by processes randomly in space whereas others are generated by processes suggesting clustering in space. Possible spatial distributions of mineral deposits and their related frequency distributions are affected by map scale and associated inclusions of non-permissive or covered geological settings. More generalized map scales are more likely to cause inclusion of geologic settings that are not really permissive for the deposit type, or that include unreported cover over permissive areas, resulting in the appearance of deposit clustering. Thus, overly generalized map scales can cause deposits to appear clustered. We propose a model that captures the effects of map scale and the related inclusion of non-permissive geologic settings on numbers of deposits estimates, the zero-inflated Poisson distribution. Effects of map scale as represented by the zero-inflated Poisson distribution suggest that the appearance of deposit clustering should diminish as mapping becomes more detailed because the number of inflated zeros would decrease with more detailed maps. Based on observed worldwide relationships between map scale and areas permissive for deposit types, mapping at a scale with twice the detail should cut permissive area size of a porphyry copper tract to 29% and a volcanic-hosted massive sulfide tract to 50% of their original sizes. Thus some direct benefits of mapping an area at a more detailed scale are indicated by significant reductions in areas permissive for deposit types, increased deposit density and, as a consequence, reduced uncertainty in the estimate of number of undiscovered deposits. Exploration enterprises benefit from reduced areas requiring detailed and expensive exploration, and land-use planners benefit from reduced areas of concern. © 2008 International Association for Mathematical Geology.
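
    The zero-inflated Poisson model proposed above mixes a point mass at zero (probability pi, representing tracts that appear permissive only because of map generalization) with an ordinary Poisson count; a small numerical sketch of its probability mass function follows, with pi and lambda chosen arbitrarily.

        # Zero-inflated Poisson (ZIP) probability mass function: with probability pi a
        # tract contributes a structural zero (non-permissive ground included by a
        # generalized map); otherwise deposit counts are Poisson(lambda).
        from scipy.stats import poisson

        def zip_pmf(k, pi, lam):
            base = poisson.pmf(k, lam)
            return pi + (1.0 - pi) * base if k == 0 else (1.0 - pi) * base

        pi, lam = 0.4, 1.5          # arbitrary illustrative parameters
        for k in range(5):
            print(k, round(zip_pmf(k, pi, lam), 4))

        # With a more detailed map the inflation probability pi shrinks toward zero,
        # and the apparent excess of zero-deposit tracts (apparent clustering) diminishes.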

  3. The Structure-Mapping Engine: Algorithm and Examples.

    ERIC Educational Resources Information Center

    Falkenhainer, Brian; And Others

    This description of the Structure-Mapping Engine (SME), a flexible, cognitive simulation program for studying analogical processing which is based on Gentner's Structure-Mapping theory of analogy, points out that the SME provides a "tool kit" for constructing matching algorithms consistent with this theory. This report provides: (1) a…

  4. A vegetation mapping strategy for conifer forests by combining airborne LiDAR data and aerial imagery

    Treesearch

    Yanjun Su; Qinghua Guo; Danny L. Fry; Brandon M. Collins; Maggi Kelly; Jacob P. Flanagan; John J. Battles

    2016-01-01

    Abstract. Accurate vegetation mapping is critical for natural resources management, ecological analysis, and hydrological modeling, among other tasks. Remotely sensed multispectral and hyperspectral imageries have proved to be valuable inputs to the vegetation mapping process, but they can provide only limited vegetation structure...

  5. Evidence-Based Concept Mapping for the Athletic Training Student

    ERIC Educational Resources Information Center

    Speicher, Timothy E.; Martin, Malissa; Zigmont, Jason

    2013-01-01

    Context: A concept map is a graphical and cognitive tool that enables learners to link together interrelated concepts using propositions or statements that answer a posed problem. As an assessment tool, concept mapping reveals a learner's research skill proficiency and cognitive processing. Background: The identification and organization of the…

  6. An Interdisciplinary Theme: Topographic Maps and Plate Tectonics

    ERIC Educational Resources Information Center

    Concannon, James P.; Aulgur, Linda

    2011-01-01

    This is an interdisciplinary lesson designed for middle school students studying landforms and geological processes. Students create a two-dimensional topographic map from a three-dimensional landform that they create using clay. Students then use other groups' topographic maps to re-create landforms. Following this, students explore some basic…

  7. Ripple Effect Mapping: A "Radiant" Way to Capture Program Impacts

    ERIC Educational Resources Information Center

    Kollock, Debra Hansen; Flage, Lynette; Chazdon, Scott; Paine, Nathan; Higgins, Lorie

    2012-01-01

    Learn more about a promising follow-up, participatory group process designed to document the results of Extension educational efforts within complex, real-life settings. The method, known as Ripple Effect Mapping, uses elements of Appreciative Inquiry, mind mapping, and qualitative data analysis to engage program participants and other community…

  8. Music Regions and Mental Maps: Teaching Cultural Geography

    ERIC Educational Resources Information Center

    Shobe, Hunter; Banis, David

    2010-01-01

    Music informs understandings of place and is an excellent vehicle for teaching cultural geography. A study was developed of geography students' perception of where music genres predominate in the United States. Its approach, involving mental map exercises, reveals the usefulness and importance of maps as an iterative process in teaching cultural…

  9. Map design and production issues for the Utah Gap Analysis Project

    USGS Publications Warehouse

    Hutchinson, John A.; Wittmann, J.H.

    1997-01-01

    The cartographic preparation and printing of four maps for the Utah GAP Project presented a wide range of challenges in cartographic design and production. In meeting these challenges, the map designers had to balance the purpose of the maps together with their legibility and utility against both the researchers' desire to show as much detail as possible and the technical limitations inherent in the printing process. This article describes seven design and production issues in order to illustrate the challenges of making maps from a merger of satellite data and GIS databases, and to point toward future investigation and development.

  10. Spatial downscaling of soil prediction models based on weighted generalized additive models in smallholder farm settings.

    PubMed

    Xu, Yiming; Smith, Scot E; Grunwald, Sabine; Abd-Elrahman, Amr; Wani, Suhas P; Nair, Vimala D

    2017-09-11

    Digital soil mapping (DSM) is gaining momentum as a technique to help smallholder farmers secure soil security and food security in developing regions. However, communication of digital soil mapping information among diverse audiences becomes problematic due to the inconsistent scale of DSM information. Spatial downscaling can make use of accessible soil information at relatively coarse spatial resolution to provide valuable soil information at relatively fine spatial resolution. The objective of this research was to disaggregate the coarse spatial resolution soil exchangeable potassium (Kex) and soil total nitrogen (TN) base maps into fine spatial resolution downscaled soil maps using weighted generalized additive models (GAMs) in two smallholder villages in South India. By incorporating fine spatial resolution spectral indices in the downscaling process, the downscaled soil maps not only conserve the spatial information of the coarse spatial resolution soil maps but also depict the spatial details of soil properties at fine spatial resolution. The results of this study demonstrated that the difference between the fine spatial resolution downscaled maps and the fine spatial resolution base maps is smaller than the difference between the coarse spatial resolution base maps and the fine spatial resolution base maps. The appropriate and economical strategy to promote the DSM technique in smallholder farms is to develop relatively coarse spatial resolution soil prediction maps, or to utilize available coarse spatial resolution soil maps at the regional scale, and to disaggregate these maps to fine spatial resolution downscaled soil maps at the farm scale.
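
    A weighted-GAM downscaling step along these lines could look roughly like the sketch below, assuming the third-party pygam package; the predictors (the coarse-resolution soil value plus two fine-resolution spectral indices), the weights and all values are placeholders rather than the study's actual covariates.

        # Rough sketch of GAM-based spatial downscaling (assumes the pygam package):
        # predict a soil property at fine resolution from the coarse-resolution map
        # value plus fine-resolution spectral indices. All variables are placeholders.
        import numpy as np
        from pygam import LinearGAM, s

        rng = np.random.default_rng(4)
        n = 500
        coarse_k = rng.uniform(50, 300, n)        # coarse-resolution Kex at each sample
        ndvi = rng.uniform(0.1, 0.8, n)           # fine-resolution spectral index 1
        brightness = rng.uniform(0.2, 0.9, n)     # fine-resolution spectral index 2
        X = np.column_stack([coarse_k, ndvi, brightness])
        y = 0.8 * coarse_k + 40 * ndvi - 20 * brightness + rng.normal(0, 5, n)
        w = rng.uniform(0.5, 1.0, n)              # per-sample weights

        gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, y, weights=w)

        # Downscaled prediction at fine-resolution pixels (placeholder values).
        X_fine = np.column_stack([np.full(3, 180.0), [0.25, 0.45, 0.65], [0.5, 0.4, 0.3]])
        print(gam.predict(X_fine))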

  11. Care maps for children with medical complexity.

    PubMed

    Adams, Sherri; Nicholas, David; Mahant, Sanjay; Weiser, Natalie; Kanani, Ronik; Boydell, Katherine; Cohen, Eyal

    2017-12-01

    Children with medical complexity require multiple providers and services to keep them well and at home. A care map is a patient/family-created diagram that pictorially maps out this complex web of services. This study explored what care maps mean for families and healthcare providers to inform potential for clinical use. Parents (n=15) created care maps (hand drawn n=10 and computer-generated n=5) and participated in semi-structured interviews about the process of developing care maps and their perceived impact. Healthcare providers (n=30) reviewed the parent-created care maps and participated in semi-structured interviews. Data were analysed for themes and emerging theory using a grounded theory analytical approach. Data analysis revealed 13 overarching themes that were further categorized into three domains: features (characteristics of care maps), functions (what care maps do), and emerging outcomes (benefits of care map use). These domains further informed a definition and a theoretical model of how care maps work. Our findings suggest that care maps may be a way of supporting patient- and family-centred care by graphically identifying and integrating experiences of the family as well as priorities for moving forward. Care maps were endorsed as a useful tool by families and providers. They help healthcare providers better understand parental priorities for care. Parents can create care maps to demonstrate the complex burden of care. They are a unique visual way to incorporate narrative medicine into practice. © 2017 Mac Keith Press.

  12. Employing Geodatabases for Planetary Mapping Conduct - Requirements, Concepts and Solutions

    NASA Technical Reports Server (NTRS)

    vanGasselt, Stephan; Nass, A.

    2010-01-01

    Planetary geologic mapping has become complex in terms of merging and co-registering a variety of different datasets for analysis and mapping, but it has also become more convenient when it comes to conducting actual (geoscientific) mapping with the help of desktop Geographic Information Systems (GIS). The complexity and variety of data, however, are major issues that need to be taken care of in order to provide mappers with a consistent and easy-to-use mapping basis. Furthermore, the high degree of functionality and interoperability of various commercial and open-source GIS and remote sensing applications allows mappers to organize map data, map components and attribute data in a more sophisticated and intuitive way than in workflows of 15 years ago. Integration of the mapping results of different groups becomes an awkward task as each mapper follows his or her own style, especially if mapping conduct is not coordinated and organized programmatically. Problems of data homogenization start with various interpretations and implementations of planetary map projections and reference systems, which form the core component of any mapping and analysis work. If the data basis is inconsistent, mapping results become hard to integrate in terms of object georeferencing. Apart from data organization and referencing issues, which are important on the mapping as well as the data-processing side of every project, the organization of planetary geologic map units and attributes, as well as their representation within a common GIS environment, are key components that need to be taken care of in a consistent and persistent way.

  13. How to compare the faces of the Earth? Walachia in mid-19th century and nowadays

    NASA Astrophysics Data System (ADS)

    Bartos-Elekes, Zsombor; Magyari-Sáska, Zsolt; Timár, Gábor; Imecs, Zoltán

    2014-05-01

    In 1864 a detailed map of Walachia was produced under the title Charta României Meridionale (Map of Southern Romania); it has 112 map sheets and is often named after its draughtsman: Szathmári's map. The map holds an outstanding position in the history of Romanian cartography because it marks a turning point. Before it, foreigners (Austrians and Russians) had made the topographic maps of this vassal principality of the Ottoman Empire. The Austrian topographic survey (1855-1859), which served as the basis for this map, was the last and the most detailed of these surveys. The map was made between the personal union (1859) and the independence (1878) of the Danubian Principalities, and it was, to a certain extent, the forming country's first map of its own. As a consequence of this survey and map, the Romanian mapping institute was founded, which, building on them, began the topographic mapping of the country. Imperfect and contradictory information about this map has been published in the Romanian scientific literature. Only a dozen copies of the map were kept in a few map collections, and researchers could access them only with difficulty. During our research we reconstructed the circumstances of the survey and mapmaking, discovering its documentation in the archives of Vienna and drawing on the Romanian, Hungarian and German scientific literature. We found copies in map collections from Vienna to Bucharest and digitized all the map sheets from the different collections. We calculated the parameters of the geodetic datum and map projection used. We published the map on the web, making it accessible to everybody. The map can be viewed at different zoom levels and downloaded; settlements can be found using the place-name index; and areas can be exported in a modern projection, so the conditions of that time can be compared with today's reality. Our poster presents, on the one hand, the survey and the map realized in the mid-19th century and our digital methods, and on the other hand the faces of the Earth in Walachia, comparing details of the georeferenced 19th-century map with present-day maps. This work was supported by a grant of the Romanian National Authority for Scientific Research, CNCS - UEFISCDI, project number PN-II-RU-TE-2011-3-0125.

  14. Adapting Nielsen’s Design Heuristics to Dual Processing for Clinical Decision Support

    PubMed Central

    Taft, Teresa; Staes, Catherine; Slager, Stacey; Weir, Charlene

    2016-01-01

    The study objective was to improve the applicability of Nielsen's standard design heuristics for evaluating electronic health record (EHR) alerts and linked ordering support by integrating them with Dual Process theory. Through initial heuristic evaluation and a user study of 7 physicians, usability problems were identified. Through independent mapping of specific usability criteria to support for each of the Dual Cognitive processes (S1 and S2) and deliberation, agreement was reached on mapping criteria. Finally, usability errors from the heuristic and user study were mapped to S1 and S2. Adding a dual process perspective to specific heuristic analysis increases the applicability and relevance of computerized health information design evaluations. This mapping enables designers to measure that their systems are tailored to support attention allocation. System 1 will be supported by improving pattern recognition and saliency, and system 2 through efficiency and control of information access. PMID:28269915

  15. Adapting Nielsen's Design Heuristics to Dual Processing for Clinical Decision Support.

    PubMed

    Taft, Teresa; Staes, Catherine; Slager, Stacey; Weir, Charlene

    2016-01-01

    The study objective was to improve the applicability of Nielsen's standard design heuristics for evaluating electronic health record (EHR) alerts and linked ordering support by integrating them with Dual Process theory. Through initial heuristic evaluation and a user study of 7 physicians, usability problems were identified. Through independent mapping of specific usability criteria to support for each of the Dual Cognitive processes (S1 and S2) and deliberation, agreement was reached on mapping criteria. Finally, usability errors from the heuristic and user study were mapped to S1 and S2. Adding a dual process perspective to specific heuristic analysis increases the applicability and relevance of computerized health information design evaluations. This mapping enables designers to measure that their systems are tailored to support attention allocation. System 1 will be supported by improving pattern recognition and saliency, and system 2 through efficiency and control of information access.

  16. Voyager Interactive Web Interface to EarthScope

    NASA Astrophysics Data System (ADS)

    Eriksson, S. C.; Meertens, C. M.; Estey, L.; Weingroff, M.; Hamburger, M. W.; Holt, W. E.; Richard, G. A.

    2004-12-01

    Visualization of data is essential in helping scientists and students develop a conceptual understanding of relationships among many complex types of data and keep track of large amounts of information. Developed initially by UNAVCO for study of global-scale geodynamic processes, the Voyager map visualization tools have evolved into interactive, web-based map utilities that can make scientific results accessible to a large number and variety of educators and students as well as the originally targeted scientists. A portal to these map tools can be found at: http://jules.unavco.org. The Voyager tools provide on-line interactive data visualization through pre-determined map regions via a simple HTML/JavaScript interface (for large numbers of students using the tools simultaneously) or through student-selectable areas using a Java interface to a Generic Mapping Tools (GMT) engine. Students can access a variety of maps, satellite images, and geophysical data at a range of spatial scales for the earth and other planets of the solar system. Students can also choose from a variety of base maps (satellite mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others) and can then add a number of geographic and geophysical overlays, for example coastlines, political boundaries, rivers and lakes, earthquake and volcano locations, stress axes, and observed and model plate motion, as well as deformation velocity vectors representing a compilation of over 5000 geodetic measurements from around the world. The related educational website, "Exploring our Dynamic Planet", (http://www.dpc.ucar.edu/VoyagerJr/jvvjrtool.html) incorporates background materials and curricular activities that encourage students to explore Earth processes. One of the present curricular modules is designed for high school students or introductory-level undergraduate non-science majors. The purpose of the module is for students to examine real data to investigate how plate tectonic processes are reflected in observed geophysical phenomena. Constructing maps by controlling map parameters and answering open-ended questions which describe, compare relationships, and work with both observed and model data, promote conceptual understanding of plate tectonics and related processes. The goals of curricular development emphasize inquiry, development of critical thinking skills, and student-centered interests. Custom editions of the map utility have been made as the "Jules Verne Voyager" and "Voyager Junior", for the International Lithosphere Project's "Global Strain Rate Map", and for EarthScope Education and Outreach as "EarthScope Voyager Jr.". For the latter, a number of EarthScope-specific features have been added, including locations of proposed USArray (seismic), Plate Boundary Observatory (geodetic), and San Andreas Fault Observatory at Depth sites, plus detailed maps and geographically referenced examples of EarthScope-related scientific investigations. As EarthScope develops, maps will be updated in `real time' so that students of all ages can use the data in formal and informal educational settings.

  17. System and method for image mapping and visual attention

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard A. (Inventor)

    2010-01-01

    A method is described for mapping dense sensory data to a Sensory Ego Sphere (SES). Methods are also described for finding and ranking areas of interest in the images that form a complete visual scene on an SES. Further, attentional processing of image data is best done by performing attentional processing on individual full-size images from the image sequence, mapping each attentional location to the nearest node, and then summing attentional locations at each node.

  18. System and method for image mapping and visual attention

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard A. (Inventor)

    2011-01-01

    A method is described for mapping dense sensory data to a Sensory Ego Sphere (SES). Methods are also described for finding and ranking areas of interest in the images that form a complete visual scene on an SES. Further, attentional processing of image data is best done by performing attentional processing on individual full-size images from the image sequence, mapping each attentional location to the nearest node, and then summing all attentional locations at each node.
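
    The step of mapping each attentional location to the nearest node and summing scores per node can be sketched with unit vectors on a sphere; the node layout, attention directions and scores below are arbitrary stand-ins for the SES structure described in the patent.

        # Sketch of mapping attention locations to nearest Sensory Ego Sphere nodes
        # and summing their scores per node (node layout and scores are arbitrary).
        import numpy as np

        def unit(v):
            return v / np.linalg.norm(v, axis=-1, keepdims=True)

        rng = np.random.default_rng(5)
        nodes = unit(rng.normal(size=(100, 3)))        # stand-in SES node directions
        attn_dirs = unit(rng.normal(size=(400, 3)))    # attention locations (directions)
        attn_scores = rng.uniform(0.0, 1.0, size=400)  # saliency of each location

        # Nearest node = node with the largest dot product (smallest angular distance).
        nearest = np.argmax(attn_dirs @ nodes.T, axis=1)

        node_totals = np.zeros(len(nodes))
        np.add.at(node_totals, nearest, attn_scores)   # sum attention per node

        print("most attended node:", node_totals.argmax(), node_totals.max())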

  19. Development and evaluation of a specialized task taxonomy for spatial planning - A map literacy experiment with topographic maps

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena; Çöltekin, Arzu

    2017-05-01

    Topographic maps are among the most commonly used map types; however, their complex and information-rich designs depicting natural, human-made and cultural features make them difficult to read. Regardless of their complexity, spatial planners make extensive use of topographic maps in their work. On the other hand, various studies suggest that map literacy among development planning professionals in South Africa is not very high. The widespread use of topographic maps combined with low levels of map literacy presents challenges for effective development planning. In this paper we address some of these challenges by developing a specialized task taxonomy based on systematically assessed map literacy levels and by conducting an empirical experiment with topographic maps to evaluate our task taxonomy. In such empirical studies, if unrealistic tasks are used, the results of map literacy tests may be skewed. Furthermore, experience and familiarity with the studied map type play a role in map literacy. There is thus a need to develop map literacy tests aimed at planners specifically. We developed a taxonomy of realistic map reading tasks typically executed during the planning process. The taxonomy defines six levels of tasks of increasing difficulty and complexity, ranging from recognising symbols to extracting knowledge. We hypothesized that competence in the first four levels indicates functional map literacy. In this paper, we present results from an empirical experiment with 49 map-literate participants solving a subset of tasks from the first four levels of the taxonomy with a topographic map. Our findings suggest that the proposed taxonomy is a good reference for evaluating topographic map literacy. Participants solved the tasks on all four levels as expected, and we therefore conclude that the experiment based on the first four levels of the taxonomy successfully determined the functional map literacy of the participants. We plan to continue the study for the remaining levels and to repeat the experiments with a group of map-illiterate participants to confirm that the taxonomy can also be used to determine map illiteracy.

  20. Preparation and Presentation of Digital Maps in Raster Format

    USGS Publications Warehouse

    Edwards, K.; Batson, R.M.

    1980-01-01

    A set of algorithms has been developed at USGS Flagstaff for displaying digital map data in raster format. The set includes: FILLIN, which assigns a specified attribute code to units of a map which have been outlined on a digitizer and converted to raster format; FILBND, which removes the outlines; ZIP, which adds patterns to the map units; and COLOR, which provides a simplified process for creating color separation plates for either photographic or lithographic reproduction. - Authors
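
    The FILLIN step, assigning an attribute code to each outlined map unit after the digitized outlines are rasterized, is essentially a connected-component fill; a rough stand-in using scipy is shown below, with the toy outline raster and attribute codes invented for illustration.

        # Rough stand-in for a FILLIN-style step: label the connected regions enclosed
        # by digitized outlines and assign each an attribute code (toy data).
        import numpy as np
        from scipy import ndimage

        # 0 = interior cell, 1 = digitized outline cell.
        outlines = np.array([
            [1, 1, 1, 1, 1, 1, 1],
            [1, 0, 0, 1, 0, 0, 1],
            [1, 0, 0, 1, 0, 0, 1],
            [1, 1, 1, 1, 1, 1, 1],
        ])

        regions, n_units = ndimage.label(outlines == 0)    # connected map units
        codes = {1: 41, 2: 67}                             # attribute code per unit

        attributed = np.zeros_like(regions)
        for unit_id, code in codes.items():
            attributed[regions == unit_id] = code

        print(n_units, "map units")
        print(attributed)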

  1. Evaluation of using digital gravity field models for zoning map creation

    NASA Astrophysics Data System (ADS)

    Loginov, Dmitry

    2018-05-01

    At the present time, digital cartographic models of geophysical fields are taking on special significance in geophysical mapping. One important direction for their application is the creation of zoning maps, which allow the morphology of the geophysical field to be taken into account in the automated choice of contour intervals. The purpose of this work is the comparative evaluation of various digital models in the creation of an integrated gravity field zoning map. Two models were chosen for comparison: the digital model of the gravity field of Russia, created from the analog map at a scale of 1 : 2 500 000, and the open global model of the Earth's gravity field, WGM2012. As a result of the experimental work, four integrated gravity field zoning maps were obtained using raw and processed data from each gravity field model. The study demonstrates that open data can be used to create integrated zoning maps, provided the noise component of the model is eliminated by processing in specialized software systems. In that case, for the problem of automated choice of contour intervals, the open digital models are not inferior to regional gravity field models created for individual countries. This allows us to assert the universality and independence of integrated zoning map creation, regardless of the detail of the digital cartographic model of geophysical fields.
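
    One simple way to make the choice of contour intervals reflect the morphology of the field, as described above, is to place class boundaries at quantiles of the gridded values rather than at fixed steps; the sketch below illustrates that idea on synthetic data and is not the authors' zoning procedure.

        # Illustrative quantile-based choice of contour intervals for a gridded
        # geophysical field (synthetic data; not the authors' procedure).
        import numpy as np

        rng = np.random.default_rng(6)
        field = rng.normal(0, 20, size=(200, 200))   # stand-in gravity-anomaly grid (mGal)

        n_classes = 8
        # Interior quantiles adapt the class boundaries to the field's distribution,
        # so flat areas are not over-contoured and steep areas are not under-contoured.
        levels = np.quantile(field, np.linspace(0, 1, n_classes + 1)[1:-1])
        print("contour levels (mGal):", np.round(levels, 1))

        zones = np.digitize(field, levels)           # zone index per grid cell
        print("cells per zone:", np.bincount(zones.ravel()))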

  2. Abstracts of the Annual Meeting of Planetary Geologic Mappers, Flagstaff, AZ, 2008

    NASA Technical Reports Server (NTRS)

    Bleamaster, Leslie F., III (Editor); Tanaka, Kenneth L. (Editor); Kelley, Michael S. (Editor)

    2008-01-01

    Topics discussed include: Merging of the USGS Atlas of Mercury 1:5,000,000 Geologic Series; Geologic Mapping of the V-36 Thetis Regio Quadrangle: 2008 Progress Report; Structural Maps of the V-17 Beta Regio Quadrangle, Venus; Geologic Mapping of Isabella Quadrangle (V-50) and Helen Planitia, Venus; Renewed Mapping of the Nepthys Mons Quadrangle (V-54), Venus; Mapping the Sedna-Lavinia Region of Venus; Geologic Mapping of the Guinevere Planitia Quadrangle of Venus; Geological Mapping of Fortuna Tessera (V-2): Venus and Earth's Archean Process Comparisons; Geological Mapping of the North Polar Region of Venus (V-1 Snegurochka Planitia): Significant Problems and Comparisons to the Earth's Archean; Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training; Geologic Map of the V-1 Snegurochka Planitia Quadrangle: Progress Report; The Fredegonde (V-57) Quadrangle, Venus: Characterization of the Venus Midlands; Formation and Evolution of Lakshmi Planum (V-7), Venus: Assessment of Models using Observations from Geological Mapping; Geologic Map of the Meskhent Tessera Quadrangle (V-3), Venus: Evidence for Early Formation and Preservation of Regional Topography; Geological Mapping of the Lada Terra (V-56) Quadrangle, Venus: A Progress Report; Geology of the Lachesis Tessera Quadrangle (V-18), Venus; Geologic Mapping of the Juno Chasma Quadrangle, Venus: Establishing the Relation Between Rifting and Volcanism; Geologic Mapping of V-19, V-28, and V-53; Lunar Geologic Mapping Program: 2008 Update; Geologic Mapping of the Marius Quadrangle, the Moon; Geologic Mapping along the Arabia Terra Dichotomy Boundary: Mawrth Vallis and Nili Fossae, Mars: Introductory Report; New Geologic Map of the Argyre Region of Mars; Geologic Evolution of the Martian Highlands: MTMs -20002, -20007, -25002, and -25007; Mapping Hesperia Planum, Mars; Geologic Mapping of the Meridiani Region, Mars; Geology of Holden Crater and the Holden and Ladon Multi-Ring Impact Basins, Margaritifer Terra, Mars; Geologic Mapping of Athabasca Valles; Geologic Mapping of MTM -30247, -35247 and -40247 Quadrangles, Reull Vallis Region of Mars; Geologic Mapping of the Martian Impact Crater Tooting; Geology of the Southern Utopia Planitia Highland-Lowland Boundary Plain: First Year Results and Second Year Plan; Mars Global Geologic Mapping: Amazonian Results; Recent Geologic Mapping Results for the Polar Regions of Mars; Geologic Mapping of the Medusae Fossae Formation on Mars (MC-8 SE and MC-23 NW) and the Northern Lowlands of Venus (V-16 and V-15); Geologic Mapping of the Zal, Hi'iaka, and Shamshu Regions of Io; Global Geologic Map of Europa; Material Units, Structures/Landforms, and Stratigraphy for the Global Geologic Map of Ganymede (1:15M); and Global Geologic Mapping of Io: Preliminary Results.

  3. Global, quantitative and dynamic mapping of protein subcellular localization.

    PubMed

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh

    2016-06-09

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.

  4. Method for the visualization of landform by mapping using low altitude UAV application

    NASA Astrophysics Data System (ADS)

    Sharan Kumar, N.; Ashraf Mohamad Ismail, Mohd; Sukor, Nur Sabahiah Abdul; Cheang, William

    2018-05-01

    Unmanned Aerial Vehicles (UAVs) and digital photogrammetry are advancing mapping technology rapidly, and the significance of and need for digital landform mapping grow every year. In this study, a mapping workflow is applied to obtain two input data sets, an orthophoto and a DSM. Low Altitude Aerial Photography (LAAP) was captured with a low-altitude UAV (drone) carrying a fixed camera, while digital photogrammetric processing in PhotoScan was used for cartographic data collection. Photogrammetric processing and orthomosaic generation are the main data-processing steps. High image quality is essential for the effectiveness and quality of the usual mapping outputs, such as the 3D model, Digital Elevation Model (DEM), Digital Surface Model (DSM) and orthoimages. The accuracy of the Ground Control Points (GCPs), the flight altitude and the resolution of the camera are essential for a good-quality DEM and orthophoto.

  5. Lexical processing and distributional knowledge in sound-spelling mapping in a consistent orthography: A longitudinal study of reading and spelling in dyslexic and typically developing children.

    PubMed

    Marinelli, Chiara Valeria; Cellini, Pamela; Zoccolotti, Pierluigi; Angelelli, Paola

    This study examined the ability to master lexical processing and use knowledge of the relative frequency of sound-spelling mappings in both reading and spelling. Twenty-four dyslexic and dysgraphic children and 86 typically developing readers were followed longitudinally in 3rd and 5th grades. Effects of word regularity, word frequency, and probability of sound-spelling mappings were examined in two experimental tasks: (a) spelling to dictation; and (b) orthographic judgment. Dyslexic children showed larger regularity and frequency effects than controls in both tasks. Sensitivity to distributional information of sound-spelling mappings was already detected by third grade, indicating early acquisition even in children with dyslexia. Although with notable differences, knowledge of the relative frequencies of sound-spelling mapping influenced both reading and spelling. Results are discussed in terms of their theoretical and empirical implications.

  6. Towards a Full-sky, High-resolution Dust Extinction Map with WISE and Planck

    NASA Astrophysics Data System (ADS)

    Meisner, Aaron M.; Finkbeiner, D. P.

    2014-01-01

    We have recently completed a custom processing of the entire WISE 12 micron All-sky imaging data set. The result is a full-sky map of diffuse, mid-infrared Galactic dust emission with angular resolution of 15 arcseconds, and with contaminating artifacts such as compact sources removed. At the same time, the 2013 Planck HFI maps represent a complementary data set in the far-infrared, with zero-point relatively immune to zodiacal contamination and angular resolution superior to previous full-sky data sets at similar frequencies. Taken together, these WISE and Planck data products present an opportunity to improve upon the SFD (1998) dust extinction map, by virtue of enhanced angular resolution and potentially better-controlled systematics on large scales. We describe our continuing efforts to construct and test high-resolution dust extinction and temperature maps based on our custom WISE processing and Planck HFI data.

  7. Use of concept mapping in an undergraduate introductory exercise physiology course.

    PubMed

    Henige, Kim

    2012-09-01

    Physiology is often considered a challenging course for students. It is up to teachers to structure courses and create learning opportunities that will increase the chance of student success. In an undergraduate exercise physiology course, concept maps are assigned to help students actively process and organize information into manageable and meaningful chunks and to teach them to recognize the patterns and regularities of physiology. Students are first introduced to concept mapping with a commonly relatable nonphysiology concept and are then assigned a series of maps that become more and more complex. Students map the acute response to a drop in blood pressure, the causes of the acute increase in stroke volume during cardiorespiratory exercise, and the factors contributing to an increase in maximal O(2) consumption with cardiorespiratory endurance training. In the process, students draw the integrative nature of physiology, identify causal relationships, and learn about general models and core principles of physiology.

  8. A Simple Secure Hash Function Scheme Using Multiple Chaotic Maps

    NASA Astrophysics Data System (ADS)

    Ahmad, Musheer; Khurana, Shruti; Singh, Sushmita; AlSharari, Hamed D.

    2017-06-01

    Chaotic maps possess high parameter sensitivity, random-like behavior and one-way computation, properties that favor the construction of cryptographic hash functions. In this paper, we present a novel hash function scheme which uses multiple chaotic maps to generate efficient variable-sized hashes. The message is divided into four parts, and each part is processed by a different 1D chaotic map unit to yield an intermediate hash code. The four codes are concatenated into two blocks, and each block is then processed through a 2D chaotic map unit separately. The final hash value is generated by combining the two partial hash codes. Simulation analyses such as distribution of hashes, statistical properties of confusion and diffusion, message and key sensitivity, collision resistance and flexibility are performed. The results reveal that the proposed hash scheme is simple, efficient and holds capabilities comparable to some recent chaos-based hash algorithms.
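
    The abstract outlines the structure of the scheme but not the specific maps used, so the following is only a toy sketch in that spirit: the logistic map stands in for the paper's 1D and 2D chaotic map units, and the construction is for illustration, not a secure hash.

        # Toy chaos-based hash in the spirit of the scheme above; NOT the
        # authors' algorithm and NOT cryptographically secure.

        def logistic(x, r=3.99):
            """One iteration of the logistic map, a common 1D chaotic map."""
            return r * x * (1.0 - x)

        def toy_chaotic_hash(message: bytes, n_bits: int = 128) -> str:
            # Split the message into four parts, one per "1D chaotic map unit".
            quarter = max(1, len(message) // 4)
            parts = [message[i * quarter:(i + 1) * quarter] for i in range(3)]
            parts.append(message[3 * quarter:])

            # Each part perturbs the trajectory of its own logistic map.
            states = []
            for idx, part in enumerate(parts):
                x = 0.1 + 0.2 * idx          # distinct initial condition per unit
                for byte in part:
                    x = logistic((x + byte / 256.0) % 1.0)
                states.append(x)

            # Concatenate the four intermediate codes into two blocks and mix
            # each block further (standing in for the 2D map units).
            block_a = (states[0] + states[1]) % 1.0
            block_b = (states[2] + states[3]) % 1.0
            for _ in range(64):
                block_a = logistic((block_a + block_b) % 1.0)
                block_b = logistic((block_b + block_a) % 1.0)

            # Combine the two partial codes into the final variable-sized hash.
            bits = []
            x = (block_a + block_b) % 1.0
            for _ in range(n_bits):
                x = logistic(x)
                bits.append('1' if x > 0.5 else '0')
            return '%x' % int(''.join(bits), 2)

        print(toy_chaotic_hash(b"map processing"))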

  9. Modelling of Singapore's topographic transformation based on DEMs

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Belle, Iris; Hassler, Uta

    2015-02-01

    Singapore's topography has been heavily transformed by industrialization and urbanization processes. To investigate topographic changes and evaluate soil mass flows, historical topographic maps of 1924 and 2012 were employed, and basic topographic features were vectorized. Digital elevation models (DEMs) for the two years were reconstructed based on vector features. Corresponding slope maps, a surface difference map and a scatter plot of elevation changes were generated and used to quantify and categorize the nature of the topographic transformation. The surface difference map is aggregated into five main categories of changes: (1) areas without significant height changes, (2) lowered-down areas where hill ranges were cut down, (3) raised-up areas where valleys and swamps were filled in, (4) reclaimed areas from the sea, and (5) new water-covered areas. Considering spatial proximity and configurations of different types of changes, topographic transformation can be differentiated as either creating inland flat areas or reclaiming new land from the sea. Typical topographic changes are discussed in the context of Singapore's urbanization processes. The two slope maps and elevation histograms show that generally, the topographic surface of Singapore has become flatter and lower since 1924. More than 89% of height changes have happened within a range of 20 m and 95% have been below 40 m. Because of differences in land surveying and map drawing methods, uncertainties and inaccuracies inherent in the 1924 topographic maps are discussed in detail. In this work, a modified version of a traditional scatter plot is used to present height transformation patterns intuitively. This method of deriving categorical maps of topographical changes from a surface difference map can be used in similar studies to qualitatively interpret transformation. Slope maps and histograms were also used jointly to reveal additional patterns of topographic change.
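
    A minimal sketch of the categorization step described above, assuming two co-registered DEM arrays plus boolean sea masks for the two years; the 2 m "no significant change" threshold and the sample grids are illustrative values, not those of the study.

        # Categorize a surface difference map into the five classes described
        # in the abstract. Inputs are assumed to share one grid.
        import numpy as np

        def categorize_change(dem_1924, dem_2012, sea_1924, sea_2012, threshold=2.0):
            diff = dem_2012 - dem_1924
            categories = np.zeros(diff.shape, dtype=np.uint8)
            land_both = ~sea_1924 & ~sea_2012

            categories[land_both & (np.abs(diff) <= threshold)] = 1  # no significant change
            categories[land_both & (diff < -threshold)] = 2          # lowered (hills cut down)
            categories[land_both & (diff > threshold)] = 3           # raised (valleys/swamps filled)
            categories[sea_1924 & ~sea_2012] = 4                     # reclaimed from the sea
            categories[~sea_1924 & sea_2012] = 5                     # new water-covered areas
            return categories

        # Example on synthetic 3x3 grids
        dem_a = np.array([[1.0, 10.0, 0.0], [5.0, 5.0, 0.0], [2.0, 2.0, 2.0]])
        dem_b = np.array([[1.5,  3.0, 2.0], [9.0, 5.0, 0.0], [2.0, 2.0, 2.0]])
        sea_a = np.array([[False, False, True],  [False, False, True],  [False, False, False]])
        sea_b = np.array([[False, False, False], [False, False, True],  [False, False, True]])
        print(categorize_change(dem_a, dem_b, sea_a, sea_b))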

  10. Compiling Mercury relief map using several data sources

    NASA Astrophysics Data System (ADS)

    Zakharova, M.

    2015-12-01

    Several Mercury topography datasets exist, obtained by processing materials collected by two spacecraft, Mariner 10 and MESSENGER, during their Mercury flybys. The history of visual mapping of Mercury is recent, as the first significant observations were made during the latter half of the 20th century, and no dataset yet provides 100% coverage of the surface except the global mosaic composed of images acquired by MESSENGER. The main objective of this work is to produce the first Mercury relief map using all existing elevation data. The workflow included collecting, combining and processing the existing data and then merging them into a single map. Preference was given to topography data, while the global mosaic was used to fill gaps where topography was insufficient. The Mercury relief map was created from four different types of data: a global mosaic with 100% coverage of Mercury's surface created from MESSENGER orbital images (36% of the final map); Digital Terrain Models obtained by processing stereo images from the Mariner 10 flybys (15% of the map) (Cook and Robinson, 2000); Digital Terrain Models obtained from images acquired during the MESSENGER flybys (24% of the map) (F. Preusker et al., 2011); and the datasets produced by the MESSENGER Mercury Laser Altimeter (MLA) (25% of the map). The final map is compiled in the Lambert azimuthal equal-area projection at a scale of 1:18 000 000. It represents two hemispheres, western and eastern, separated by the zero meridian, and mainly shows the hypsometric features of the planet and craters with diameters greater than 200 kilometers.
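
    A hedged sketch of the gap-filling merge described above (not the compiler's actual workflow): elevation layers on a common grid are combined in priority order, with lower-priority data filling gaps (NaN cells) left by higher-priority data. The small arrays are placeholders.

        # Merge elevation layers by priority, filling gaps with lower-priority data.
        import numpy as np

        def merge_by_priority(layers):
            """layers: list of 2D arrays on one grid, ordered from highest to
            lowest priority; gaps are encoded as NaN."""
            merged = np.full(layers[0].shape, np.nan)
            for layer in layers:
                merged = np.where(np.isnan(merged), layer, merged)
            return merged

        mla = np.array([[np.nan, 2.0], [np.nan, np.nan]])       # laser altimetry (highest priority)
        stereo = np.array([[1.0, np.nan], [np.nan, np.nan]])    # stereo-derived DTM
        mosaic_proxy = np.array([[0.5, 0.5], [0.5, 0.5]])       # image-mosaic fallback
        print(merge_by_priority([mla, stereo, mosaic_proxy]))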

  11. Seafloor 2030 - Building a Global Ocean Map through International Collaboration

    NASA Astrophysics Data System (ADS)

    Ferrini, V. L.; Wigley, R. A.; Falconer, R. K. H.; Jakobsson, M.; Allen, G.; Mayer, L. A.; Schmitt, T.; Rovere, M.; Weatherall, P.; Marks, K. M.

    2016-12-01

    With more than 85% of the ocean floor unmapped, a huge proportion of our planet remains unexplored. Creating a comprehensive map of seafloor bathymetry remains a true global challenge that can only be accomplished through collaboration and partnership between governments, industry, academia, research organizations and non-government organizations. The objective of Seafloor 2030 is to comprehensively map the global ocean floor to resolutions that enable exploration and improved understanding of ocean processes, while informing maritime policy and supporting the management of natural marine resources for a sustainable Blue Economy. Seafloor 2030 is the outcome of the Forum for Future of Ocean Floor Mapping, held in Monaco in June 2016 under the auspices of GEBCO and the Nippon Foundation of Japan. GEBCO is the only international organization mandated to map the global ocean floor and is guided by the International Hydrographic Organization (IHO) and the Intergovernmental Oceanographic Commission of UNESCO. The task of completely mapping the ocean floor will require new global coordination to ensure that both existing data are identified and new mapping efforts are coordinated to help efficiently "map the gaps." Fundamental to achieving Seafloor 2030 will be greater access to data, tools and technology, particularly for developing and coastal nations. This includes bathymetric post-processing and analysis software, database technology, computing infrastructure and gridding techniques as well as the latest developments in seafloor mapping methods and emerging crowd-sourced bathymetry initiatives. The key to achieving this global bathymetric map is capacity building and education - including greater coordination between scientific research and industry and the effective engagement of international organizations such as the United Nations.

  12. [The experiment of participatory mapping in order to construct a cartographical alternative to the FHS].

    PubMed

    Goldstein, Roberta Argento; Barcellos, Christovam; Magalhães, Monica de Avelar Figueiredo Mafra; Gracie, Renata; Viacava, Francisco

    2013-01-01

    Maps and mapping procedures are useful tools for systematic interpretation and evaluation and for reporting of results to management. Applied to the Family Health Strategy (FHS), these maps permit the demarcation of the territory and the establishment of links between the territory, its population and health services. In this paper the use of maps by the FHS in 17 municipalities in northern and northeastern Brazil is studied and the process of demarcation and digitization of areas with the participation of teams is described. The survey conducted using questionnaires and discussion workshops showed that difficulties still prevail in reconciling the map (drawing) produced at the local level with maps produced by other government sectors. In general, the maps used at local level employ their own references, which prevent the interplay of information with other cartographic documents and their full use as a tool for evaluation and management. The combination of participatory mapping tools, associated with Geographic Information Systems (GIS) applications proposed in this paper, represents an alternative to mapping the territory of operations of FHS teams, as well as a reflection on the concept of territory and operation by the FHS.

  13. Rapid Crop Cover Mapping for the Conterminous United States.

    PubMed

    Dahal, Devendra; Wylie, Bruce; Howard, Danny

    2018-06-05

    Timely crop cover maps with sufficient resolution are important components to various environmental planning and research applications. Through the modification and use of a previously developed crop classification model (CCM), which was originally developed to generate historical annual crop cover maps, we hypothesized that such crop cover maps could be generated rapidly during the growing season. Through a process of incrementally removing weekly and monthly independent variables from the CCM and implementing a 'two model mapping' approach, we found it viable to generate conterminous United States-wide rapid crop cover maps at a resolution of 250 m for the current year by the month of September. In this approach, we divided the CCM model into one 'crop type model' to handle the classification of nine specific crops and a second, binary model to classify the presence or absence of 'other' crops. Under the two model mapping approach, the training errors were 0.8% and 1.5% for the crop type and binary model, respectively, while test errors were 5.5% and 6.4%, respectively. With spatial mapping accuracies for annual maps reaching upwards of 70%, this approach demonstrated a strong potential for generating rapid crop cover maps by the 1st of September.
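
    A hedged sketch of the 'two model mapping' idea described above, not the CCM itself: one classifier separates the nine specific crop types while a second, binary classifier flags the presence or absence of 'other' crops. The random-forest models, features and data below are placeholders.

        # Two-model mapping sketch: a crop-type classifier plus a binary
        # "other crop" classifier, trained on synthetic stand-in features.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 6))              # e.g. early-season spectral/phenology metrics
        crop_type = rng.integers(0, 9, size=500)   # labels for nine specific crops
        other_crop = rng.integers(0, 2, size=500)  # presence/absence of other crops

        crop_type_model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, crop_type)
        other_crop_model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, other_crop)

        X_new = rng.normal(size=(10, 6))           # pixels from the current season
        pred_type = crop_type_model.predict(X_new)
        pred_other = other_crop_model.predict(X_new)
        # The rule for combining the two predictions into one map label is not
        # specified in the abstract, so it is left out here.
        print(pred_type, pred_other)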

  14. Harmonization of forest disturbance datasets of the conterminous USA from 1986 to 2011

    USGS Publications Warehouse

    Soulard, Christopher E.; Acevedo, William; Cohen, Warren B.; Yang, Zhiqiang; Stehman, Stephen V.; Taylor, Janis L.

    2017-01-01

    Several spatial forest disturbance datasets exist for the conterminous USA. The major problem with forest disturbance mapping is that variability between map products leads to uncertainty regarding the actual rate of disturbance. In this article, harmonized maps were produced from multiple data sources (i.e., Global Forest Change, LANDFIRE Vegetation Disturbance, National Land Cover Database, Vegetation Change Tracker, and Web-Enabled Landsat Data). The harmonization process involved fitting common class ontologies and determining spatial congruency to produce forest disturbance maps for four time intervals (1986–1992, 1992–2001, 2001–2006, and 2006–2011). Pixels mapped as disturbed for two or more datasets were labeled as disturbed in the harmonized maps. The primary advantage gained by harmonization was improvement in commission error rates relative to the individual disturbance products. Disturbance omission errors were high for both harmonized and individual forest disturbance maps due to underlying limitations in mapping subtle disturbances with Landsat classification algorithms. To enhance the value of the harmonized disturbance products, we used fire perimeter maps to add information on the cause of disturbance.

  15. Harmonization of forest disturbance datasets of the conterminous USA from 1986 to 2011.

    PubMed

    Soulard, Christopher E; Acevedo, William; Cohen, Warren B; Yang, Zhiqiang; Stehman, Stephen V; Taylor, Janis L

    2017-04-01

    Several spatial forest disturbance datasets exist for the conterminous USA. The major problem with forest disturbance mapping is that variability between map products leads to uncertainty regarding the actual rate of disturbance. In this article, harmonized maps were produced from multiple data sources (i.e., Global Forest Change, LANDFIRE Vegetation Disturbance, National Land Cover Database, Vegetation Change Tracker, and Web-Enabled Landsat Data). The harmonization process involved fitting common class ontologies and determining spatial congruency to produce forest disturbance maps for four time intervals (1986-1992, 1992-2001, 2001-2006, and 2006-2011). Pixels mapped as disturbed for two or more datasets were labeled as disturbed in the harmonized maps. The primary advantage gained by harmonization was improvement in commission error rates relative to the individual disturbance products. Disturbance omission errors were high for both harmonized and individual forest disturbance maps due to underlying limitations in mapping subtle disturbances with Landsat classification algorithms. To enhance the value of the harmonized disturbance products, we used fire perimeter maps to add information on the cause of disturbance.
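
    The harmonization rule described in the two records above reduces to a simple agreement count; a minimal sketch, assuming co-registered boolean disturbance rasters for a single time interval, follows.

        # Label a pixel as disturbed when two or more input maps flag it.
        import numpy as np

        def harmonize(disturbance_maps, min_agreement=2):
            stack = np.stack(disturbance_maps, axis=0).astype(np.uint8)
            return stack.sum(axis=0) >= min_agreement

        gfc  = np.array([[1, 0], [1, 0]], dtype=bool)   # e.g. Global Forest Change
        nlcd = np.array([[1, 0], [0, 0]], dtype=bool)   # e.g. NLCD-based change
        vct  = np.array([[0, 1], [1, 0]], dtype=bool)   # e.g. Vegetation Change Tracker
        print(harmonize([gfc, nlcd, vct]))              # [[ True False] [ True False]]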

  16. Portability issues for a structured clinical vocabulary: mapping from Yale to the Columbia medical entities dictionary.

    PubMed Central

    Kannry, J L; Wright, L; Shifman, M; Silverstein, S; Miller, P L

    1996-01-01

    OBJECTIVE: To examine the issues involved in mapping an existing structured controlled vocabulary, the Medical Entities Dictionary (MED) developed at Columbia University, to an institutional vocabulary, the laboratory and pharmacy vocabularies of the Yale New Haven Medical Center. DESIGN: 200 Yale pharmacy terms and 200 Yale laboratory terms were randomly selected from database files containing all of the Yale laboratory and pharmacy terms. These 400 terms were then mapped to the MED in three phases: mapping terms, mapping relationships between terms, and mapping attributes that modify terms. RESULTS: 73% of the Yale pharmacy terms mapped to MED terms. 49% of the Yale laboratory terms mapped to MED terms. After certain obsolete and otherwise inappropriate laboratory terms were eliminated, the latter rate improved to 59%. 23% of the unmatched Yale laboratory terms failed to match because of differences in granularity with MED terms. The Yale and MED pharmacy terms share 12 of 30 distinct attributes. The Yale and MED laboratory terms share 14 of 23 distinct attributes. CONCLUSION: The mapping of an institutional vocabulary to a structured controlled vocabulary requires that the mapping be performed at the level of terms, relationships, and attributes. The mapping process revealed the importance of standardization of local vocabulary subsets, standardization of attribute representation, and term granularity. PMID:8750391

  17. PPDMs-a resource for mapping small molecule bioactivities from ChEMBL to Pfam-A protein domains.

    PubMed

    Kruger, Felix A; Gaulton, Anna; Nowotka, Michal; Overington, John P

    2015-03-01

    PPDMs is a resource that maps small molecule bioactivities to protein domains from the Pfam-A collection of protein families. Small molecule bioactivities mapped to protein domains add important precision to approaches that use protein sequence searches and alignments to assist applications in computational drug discovery and systems and chemical biology. We have previously proposed a mapping heuristic for a subset of bioactivities stored in ChEMBL with the Pfam-A domain most likely to mediate small molecule binding. We have since refined this mapping using a manual procedure. Here, we present a resource that provides up-to-date mappings and the possibility to review assigned mappings as well as to participate in their assignment and curation. We also describe how mappings provided through the PPDMs resource are made accessible through the main schema of the ChEMBL database. The PPDMs resource and curation interface is available at https://www.ebi.ac.uk/chembl/research/ppdms/pfam_maps. The source-code for PPDMs is available under the Apache license at https://github.com/chembl/pfam_maps. Source code is available at https://github.com/chembl/pfam_map_loader to demonstrate the integration process with the main schema of ChEMBL. © The Author 2014. Published by Oxford University Press.

  18. A Practical Framework for Cartographic Design

    NASA Astrophysics Data System (ADS)

    Denil, Mark

    2018-05-01

    Creation of a map artifact that can be recognized, accepted, read, and absorbed is the cartographer's chief responsibility. This involves bringing coherence and order out of chaos and randomness through the construction of map artifacts that mediate processes of social communication. Maps are artifacts, first and foremost: they are artifacts with particular formal attributes, and it is the formal aspects of the map artifact that allow it to invoke and sustain a reading as a map. This paper examines Cartographic Design as the sole means at the cartographer's disposal for constructing the meaning-bearing artifacts we know as maps, by placing it at the center of a practical analytic framework. The framework draws together the Theoretic and Craft aspects of map making, and examines how Style and Taste operate through the rubric of a schema of Mapicity to produce high quality maps. The roles of the Cartographic Canon and of Critique are also explored, and a few design resources are identified.

  19. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time.

    PubMed

    Dhar, Amrit; Minin, Vladimir N

    2017-05-01

    Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences.
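
    As a point of reference for the simulation-based approach the paper improves upon, the following is a naive sketch of stochastic mapping on a single branch: substitution histories are drawn by rejection sampling, conditioning on the observed endpoint states, and the mean and variance of the substitution count are estimated from the accepted histories. The two-state rate matrix and branch length are illustrative, and the paper's linear-time, simulation-free algorithm is not reproduced here.

        # Naive conditional simulation of substitution histories on one branch.
        import numpy as np

        rng = np.random.default_rng(1)
        Q = np.array([[-1.0, 1.0],
                      [ 1.0, -1.0]])        # 2-state symmetric rate matrix
        t_branch, start, end = 0.8, 0, 1     # branch length and observed endpoint states

        def simulate_path(start_state, t_total):
            """Simulate one unconditioned CTMC path; return (end state, #substitutions)."""
            state, t, n_subs = start_state, 0.0, 0
            while True:
                rate = -Q[state, state]
                t += rng.exponential(1.0 / rate)
                if t >= t_total:
                    return state, n_subs
                probs = Q[state].copy()
                probs[state] = 0.0
                state = rng.choice(len(probs), p=probs / probs.sum())
                n_subs += 1

        counts = []
        while len(counts) < 5000:
            final_state, n_subs = simulate_path(start, t_branch)
            if final_state == end:           # rejection step: condition on the observed data
                counts.append(n_subs)

        counts = np.array(counts)
        print("mean substitutions:", counts.mean(), "variance:", counts.var())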

  20. Quaternary Geologic Map of the Regina 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    USGS Publications Warehouse

    Fullerton, David S.; Christiansen, Earl A.; Schreiner, Bryan T.; Colton, Roger B.; Clayton, Lee; Bush, Charles A.; Fullerton, David S.

    2007-01-01

    For scientific purposes, the map differentiates Quaternary surficial deposits and materials on the basis of clast lithology or composition, matrix texture or particle size, structure, genesis, stratigraphic relations, engineering geologic properties, and relative age, as shown on the correlation diagram and indicated in the 'Description of Map Units'. Deposits of some constructional landforms, such as end moraines, are distinguished as map units. Deposits of erosional landforms, such as outwash terraces, are not distinguished, although glaciofluvial, ice-contact, fluvial, and lacustrine deposits that are mapped may be terraced. Differentiation of sequences of fluvial and glaciofluvial deposits at this scale is not possible. For practical purposes, the map is a surficial materials map. Materials are distinguished on the basis of lithology or composition, texture or particle size, and other physical, chemical, and engineering characteristics. It is not a map of soils that are recognized and classified in pedology or agronomy. Rather, it is a generalized map of soils as recognized in engineering geology, or of substrata or parent materials in which pedologic or agronomic soils are formed. As a materials map, it serves as a base from which a variety of maps for use in planning engineering, land-use planning, or land-management projects can be derived and from which a variety of maps relating to earth surface processes and Quaternary geologic history can be derived.

  1. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time

    PubMed Central

    Dhar, Amrit

    2017-01-01

    Abstract Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences. PMID:28177780

  2. Medicines access programs to cancer medicines in Australia and New Zealand: An exploratory study.

    PubMed

    Grover, Piyush; Babar, Zaheer-Ud-Din; Oehmen, Raoul; Vitry, Agnes

    2018-03-01

    Medicines Access Programs (MAPs) offer access to publicly unfunded medicines at the discretion of pharmaceutical companies. Limited literature is available on their extent and scope in Australia and New Zealand. This study aims to identify MAPs for cancer medicines that were operational in 2014-15 in Australia and New Zealand and to describe their characteristics. A preliminary list of MAPs was sent to hospital pharmacists in Australia and New Zealand to validate and collect further information. Pharmaceutical companies were contacted directly to provide information regarding the MAPs they offered. Key stakeholders were interviewed to identify issues with MAPs. Fifty-one MAPs were identified, covering a range of indications. The majority of MAPs were provided free of charge to the patient for medicines that were registered, or in the process of being registered, but not funded. Variability across institutions in the number and characteristics of MAPs was observed. Australia offered more MAPs than New Zealand. Only two of 17 pharmaceutical companies contacted agreed to provide information on their MAPs. Eight stakeholder interviews were conducted. These identified that, while MAPs are widely operational, there is a lack of clinical monitoring, inequitable access, operational issues and a lack of transparency. Our results suggest a need for a standardised and mandated policy to mitigate issues with MAPs. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Computer aided manufacturing for complex freeform optics

    NASA Astrophysics Data System (ADS)

    Wolfs, Franciscus; Fess, Ed; Johns, Dustin; LePage, Gabriel; Matthews, Greg

    2017-10-01

    Recently, the desire to use freeform optics has been increasing. Freeform optics can be used to expand the capabilities of optical systems and reduce the number of optics needed in an assembly. The traits that increase optical performance also present challenges in manufacturing. As tolerances on freeform optics become more stringent, it is necessary to continue to improve methods for how the grinding and polishing processes interact with metrology. To create these complex shapes, OptiPro has developed a computer aided manufacturing package called PROSurf. PROSurf generates the tool paths required for grinding and polishing freeform optics with multiple axes of motion. It also uses metrology feedback for deterministic corrections. PROSurf handles two key aspects of the manufacturing process that most other CAM systems struggle with. The first is the ability to support several input types (equations, CAD models, point clouds) and still create a uniform high-density surface map usable for generating a smooth tool path. The second is improving the accuracy of mapping a metrology file to the part surface. To do this, OptiPro uses 3D error maps instead of traditional 2D maps. The metrology error map drives the tool path adjustment applied during processing. For grinding, the error map adjusts the tool position to compensate for repeatable system error. For polishing, the error map drives the relative dwell times of the tool across the part surface. This paper will present the challenges associated with these issues and the solutions that we have created.
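
    A hedged sketch of how a metrology error map can drive relative polishing dwell times, under a simplified assumption that material removal is proportional to dwell time; this is not OptiPro's implementation, and the error values and removal rate are hypothetical.

        # Derive relative dwell times from a surface error map.
        import numpy as np

        error_map = np.array([[0.00, 0.10, 0.25],
                              [0.05, 0.30, 0.15],
                              [0.00, 0.05, 0.10]])   # surface error above target (microns)
        removal_rate = 0.02                           # microns removed per second of dwell

        dwell_time = np.clip(error_map, 0.0, None) / removal_rate   # seconds per point
        print(dwell_time)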

  4. Mapping of Arithmetic Processing by Navigated Repetitive Transcranial Magnetic Stimulation in Patients with Parietal Brain Tumors and Correlation with Postoperative Outcome.

    PubMed

    Ille, Sebastian; Drummer, Katharina; Giglhuber, Katrin; Conway, Neal; Maurer, Stefanie; Meyer, Bernhard; Krieg, Sandro M

    2018-06-01

    Preserving functionality is important during neurosurgical resection of brain tumors. Specialized centers also map further brain functions apart from motor and language functions, such as arithmetic processing (AP). The mapping of AP by navigated repetitive transcranial magnetic stimulation (nrTMS) in healthy volunteers has been reported. The present study aimed to correlate the results of mapping AP with functional patient outcomes. We included 26 patients with parietal brain tumors. Because of preoperative impairment of AP, mapping was not possible in 8 patients (31%). We stimulated 52 cortical sites by nrTMS while patients performed a calculation task. Preoperatively and postoperatively, patients underwent a standardized number-processing and calculation test (NPCT). Tumor resection was blinded to nrTMS results, and the change in NPCT performance was correlated to resected AP-positive spots as identified by nrTMS. The resection of AP-positive sites correlated with a worsening of the postoperative NPCT result in 12 cases. In 3 cases, no AP-positive sites were resected and the postoperative NPCT result was similar to or better than preoperatively. Also, in 3 cases, the postoperative NPCT result was better than preoperatively, although AP-positive sites were resected. Despite presenting only a few cases, nrTMS might be a useful tool for preoperative mapping of AP. However, the reliability of the present results has to be evaluated in a larger series and by intraoperative mapping data. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. WRIS: a resource information system for wildland management

    Treesearch

    Robert M. Russell; David A. Sharpnack; Elliot Amidon

    1975-01-01

    WRIS (Wildland Resource Information System) is a computer system for processing, storing, retrieving, updating, and displaying geographic data. The polygon, representing a land area boundary, forms the building block of WRIS. Polygons form a map. Maps are digitized manually or by automatic scanning. Computer programs can extract and produce polygon maps and can overlay...

  6. How Do (Some) People Make a Cognitive Map? Routes, Places, and Working Memory

    ERIC Educational Resources Information Center

    Weisberg, Steven M.; Newcombe, Nora S.

    2016-01-01

    Research on the existence of cognitive maps and on the cognitive processes that support effective navigation has often focused on functioning across individuals. However, there are pronounced individual differences in navigation proficiency, which need to be explained and which can illuminate our understanding of cognitive maps and effective…

  7. Asset Mapping: A Tool to Enhance Your CSPAP Efforts

    ERIC Educational Resources Information Center

    Allar, Ishonté; Bulger, Sean

    2018-01-01

    Comprehensive school physical activity programs (CSPAPs) are one way to help students achieve most, if not all, of the recommended 60 minutes of daily moderate-to-vigorous physical activity (MVPA). Early in the process, one can use asset mapping to help enhance CSPAP efforts. Asset maps provide a valuable opportunity to identify potential partners…

  8. Painting a picture across the landscape with ModelMap

    Treesearch

    Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino

    2017-01-01

    Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...

  9. A Control Algorithm for Chaotic Physical Systems

    DTIC Science & Technology

    1991-10-01

    revision expands the grid to cover the entire area of any attractor that is present. 5 Map Selection: The final choices of the state-space mapping process... interval h?; overrange R0; control parameter interval AkO and range [kbro, khigh]; iteration depth. * State-space mapping: 1. Set up grid by expanding

  10. Competency Mapping of Teachers in Tertiary Education

    ERIC Educational Resources Information Center

    Sugumar, V. Raji

    2009-01-01

    Competency of teachers assumes a lot of importance in the era of knowledge society who are expected to produce students of high calibre. In India however competency development and mapping still remains an unexplored process. Not much study has been done on competency mapping in higher education sector, thus the present study is ventured upon. The…

  11. Initial implementation of The National Map

    USGS Publications Warehouse

    Roth, K.

    2003-01-01

    The development of The National Map is "national" in the broadest sense of the word. Although the U.S. Geological Survey is taking the lead, local governments, states, and regions are active and essential partners in the process, contributing, for example, data updates, problem-solving data integration, and map development from multiple data layers.

  12. Modeling Research Project Risks with Fuzzy Maps

    ERIC Educational Resources Information Center

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risks evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built from a causal and cognitive map of risks. The map was developed specifically for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  13. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
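
    A hedged sketch of how an agricultural areal estimate can automate the classification, in the spirit of the approach described above (the actual method differs in detail): a per-pixel corn score is thresholded so that the mapped area matches the reported corn acreage, and looser and tighter thresholds define the 'likely' and 'highly likely' classes. All values are synthetic.

        # Threshold a corn score so mapped area matches a reported areal estimate.
        import numpy as np

        rng = np.random.default_rng(2)
        corn_score = rng.random((200, 200))        # e.g. a greenness/phenology score per pixel
        pixel_area_ha = 6.25                       # nominal area of a 250 m pixel
        reported_corn_area_ha = 50_000.0           # county/state areal estimate

        target_pixels = int(reported_corn_area_ha / pixel_area_ha)
        # Threshold chosen so that roughly the reported area is labeled as corn.
        threshold = np.sort(corn_score.ravel())[-target_pixels]

        classes = np.full(corn_score.shape, "unlikely corn", dtype=object)
        classes[corn_score >= threshold * 0.9] = "likely corn"
        classes[corn_score >= threshold] = "highly likely corn"
        print((classes == "highly likely corn").sum(), "pixels highly likely corn")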

  14. Development of a Mapped Diabetes Community Program Guide for a Safety Net Population

    PubMed Central

    Zallman, Leah; Ibekwe, Lynn; Thompson, Jennifer W.; Ross-Degnan, Dennis; Oken, Emily

    2014-01-01

    Purpose Enhancing linkages between patients and community programs is increasingly recognized as a method for improving physical activity, nutrition and weight management. Although interactive mapped community program guides may be beneficial, there remains a dearth of articles that describe the processes and practicalities of creating such guides. This article describes the development of an interactive, web-based mapped community program guide at a safety net institution and the lessons learned from that process. Conclusions This project demonstrated the feasibility of creating two maps – a program guide and a population health map. It also revealed some key challenges and lessons for future work in this area, particularly within safety-net institutions. Our work underscores the need for developing partnerships outside of the health care system and the importance of employing community-based participatory methods. In addition to facilitating improvements in individual wellness, mapping community programs also has the potential to improve population health management by healthcare delivery systems such as hospitals, health centers, or public health systems, including city and state departments of health. PMID:24752180

  15. A comprehensive map of the mTOR signaling network

    PubMed Central

    Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki

    2010-01-01

    The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells, and thus an attractive target for anticancer therapy. Using CellDesigner, a modeling support software for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with both the systems biology markup language (SBML) and graphical notation (SBGN) for computational analysis and graphical representation, respectively. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulations on drug-based cancer therapy. This map is available on the Payao platform, a Web 2.0 based community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025
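
    Because the map is distributed in SBML, it can be inspected programmatically; a minimal sketch with python-libsbml follows, assuming the map has been downloaded to a local file (the filename is hypothetical).

        # Load an SBML model and list basic contents as a starting point for
        # systems-level analysis.
        import libsbml

        document = libsbml.readSBML("mTOR_map.xml")   # hypothetical local copy of the map
        if document.getNumErrors() > 0:
            document.printErrors()
        model = document.getModel()
        if model is not None:
            print("species:", model.getNumSpecies())
            print("reactions:", model.getNumReactions())
            for i in range(min(5, model.getNumSpecies())):
                species = model.getSpecies(i)
                print(species.getId(), species.getName())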

  16. Applying Value Stream Mapping Technique for Production Improvement in a Manufacturing Company: A Case Study

    NASA Astrophysics Data System (ADS)

    Jeyaraj, K. L.; Muralidharan, C.; Mahalingam, R.; Deshmukh, S. G.

    2013-01-01

    The purpose of this paper is to explain how value stream mapping (VSM) is helpful in lean implementation and to develop the road map to tackle improvement areas to bridge the gap between the existing state and the proposed state of a manufacturing firm. Through this case study, the existing stage of manufacturing is mapped with the help of VSM process symbols and the biggest improvement areas like excessive TAKT time, production, and lead time are identified. Some modifications in current state map are suggested and with these modifications future state map is prepared. Further TAKT time is calculated to set the pace of production processes. This paper compares the current state and future state of a manufacturing firm and witnessed 20 % reduction in TAKT time, 22.5 % reduction in processing time, 4.8 % reduction in lead time, 20 % improvement in production, 9 % improvement in machine utilization, 7 % improvement in man power utilization, objective improvement in workers skill level, and no change in the product and semi finished product inventory level. The findings are limited due to the focused nature of the case study. This case study shows that VSM is a powerful tool for lean implementation and allows the industry to understand and continuously improve towards lean manufacturing.
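
    A small worked example of the TAKT time calculation that underpins the analysis above; the shift and demand figures are illustrative, not the case-study values.

        # TAKT time = available working time / customer demand.
        available_minutes_per_shift = 450        # e.g. 8 h shift minus breaks
        shifts_per_day = 2
        customer_demand_per_day = 600            # units required per day

        takt_time = (available_minutes_per_shift * shifts_per_day) / customer_demand_per_day
        print(f"TAKT time: {takt_time:.2f} minutes per unit")   # 1.50 min/unit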

  17. Developmental Levels of Processing in Metaphor Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Janice; Pascual-Leone, Juan

    1989-01-01

    Outlines a theory of metaphor that posits varying levels of semantic processing and formalizes the levels in terms of kinds of semantic mapping operators. Predicted complexity of semantic mapping operators was tested using metaphor interpretations of 204 children aged 6-12 years and 24 adults. Processing score increased predictably with age. (SAK)

  18. Absence of rotational activity detected using 2-dimensional phase mapping in the corresponding 3-dimensional phase maps in human persistent atrial fibrillation.

    PubMed

    Pathik, Bhupesh; Kalman, Jonathan M; Walters, Tomos; Kuklik, Pawel; Zhao, Jichao; Madry, Andrew; Sanders, Prashanthan; Kistler, Peter M; Lee, Geoffrey

    2018-02-01

    Current phase mapping systems for atrial fibrillation create 2-dimensional (2D) maps. This process may affect the accurate detection of rotors. We developed a 3-dimensional (3D) phase mapping technique that uses the 3D locations of basket electrodes to project phase onto patient-specific left atrial 3D surface anatomy. We sought to determine whether rotors detected in 2D phase maps were present at the corresponding time segments and anatomical locations in 3D phase maps. One-minute left atrial atrial fibrillation recordings were obtained in 14 patients using the basket catheter and analyzed off-line. Using the same phase values, 2D and 3D phase maps were created. Analysis involved determining the dominant propagation patterns in 2D phase maps and evaluating the presence of rotors detected in 2D phase maps in the corresponding 3D phase maps. Using 2D phase mapping, the dominant propagation pattern was single wavefront (36.6%) followed by focal activation (34.0%), disorganized activity (23.7%), rotors (3.3%), and multiple wavefronts (2.4%). Ten transient rotors were observed in 9 of 14 patients (64%). The mean rotor duration was 1.1 ± 0.7 seconds. None of the 10 rotors observed in 2D phase maps were seen at the corresponding time segments and anatomical locations in 3D phase maps; 4 of 10 corresponded with single wavefronts in 3D phase maps, 2 of 10 with 2 simultaneous wavefronts, 1 of 10 with disorganized activity, and in 3 of 10 there was no coverage by the basket catheter at the corresponding 3D anatomical location. Rotors detected in 2D phase maps were not observed in the corresponding 3D phase maps. These findings may have implications for current systems that use 2D phase mapping. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
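
    A hedged sketch of one way electrode phase values can be projected onto a 3D surface (nearest-electrode assignment); the study's actual projection method is not specified in the abstract, and the electrode positions, surface vertices and phases below are synthetic.

        # Project phase values from basket electrodes onto surface vertices.
        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(3)
        electrode_xyz = rng.normal(size=(64, 3))           # basket electrode positions
        electrode_phase = rng.uniform(-np.pi, np.pi, 64)   # instantaneous phase per electrode
        surface_xyz = rng.normal(size=(5000, 3))           # left atrial surface vertices

        tree = cKDTree(electrode_xyz)
        _, nearest_electrode = tree.query(surface_xyz)     # index of the closest electrode
        surface_phase = electrode_phase[nearest_electrode]
        print(surface_phase[:5])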

  19. Population weighted raster maps can communicate findings of social audits: examples from three continents.

    PubMed

    Mitchell, Steven; Cockcroft, Anne; Andersson, Neil

    2011-12-21

    Maps can portray trends, patterns, and spatial differences that might be overlooked in tabular data and are now widely used in health research. Little has been reported about the process of using maps to communicate epidemiological findings. Population weighted raster maps show colour changes over the study area. Similar to the rasters of barometric pressure in a weather map, data are the health occurrence--a peak on the map represents a higher value of the indicator in question. The population relevance of each sentinel site, as determined in the stratified last stage random sample, combines with geography (inverse-distance weighting) to provide a population-weighted extension of each colour. This transforms the map to show population space rather than simply geographic space. Maps allowed discussion of strategies to reduce violence against women in a context of political sensitivity about quoting summary indicator figures. Time-series maps showed planners how experiences of health services had deteriorated despite a reform programme; where in a country HIV risk behaviours were improving; and how knowledge of an economic development programme quickly fell off across a region. Change maps highlighted where indicators were improving and where they were deteriorating. Maps of potential impact of interventions, based on multivariate modelling, displayed how partial and full implementation of programmes could improve outcomes across a country. Scale depends on context. To support local planning, district maps or local government authority maps of health indicators were more useful than national maps; but multinational maps of outcomes were more useful for regional institutions. Mapping was useful to illustrate in which districts enrolment in religious schools--a rare occurrence--was more prevalent. Population weighted raster maps can present social audit findings in an accessible and compelling way, increasing the use of evidence by planners with limited numeracy skills or little time to look at evidence. Maps complement epidemiological analysis, but they are not a substitute. Much less do they substitute for rigorous epidemiological designs, like randomised controlled trials.
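
    A minimal sketch of a population weighted raster, assuming each sentinel site's indicator value spreads over the grid by inverse-distance weighting scaled by the site's population relevance, as described above; coordinates, populations and indicator values are synthetic.

        # Population-scaled inverse-distance weighted raster.
        import numpy as np

        site_xy = np.array([[10.0, 10.0], [40.0, 15.0], [25.0, 40.0]])
        site_value = np.array([0.2, 0.8, 0.5])          # e.g. indicator prevalence per site
        site_population = np.array([5000.0, 20000.0, 8000.0])

        ny, nx = 50, 50
        yy, xx = np.mgrid[0:ny, 0:nx]
        raster = np.zeros((ny, nx))
        weight_sum = np.zeros((ny, nx))

        for (sx, sy), value, pop in zip(site_xy, site_value, site_population):
            dist = np.hypot(xx - sx, yy - sy) + 1e-6    # avoid division by zero
            w = pop / dist**2                            # population-scaled IDW weight
            raster += w * value
            weight_sum += w

        raster /= weight_sum                             # population weighted surface
        print(raster.shape, raster.min(), raster.max())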

  20. The Use of Uas for Rapid 3d Mapping in Geomatics Education

    NASA Astrophysics Data System (ADS)

    Teo, Tee-Ann; Tian-Yuan Shih, Peter; Yu, Sz-Cheng; Tsai, Fuan

    2016-06-01

    With the development of technology, UAS has become an advanced technology for supporting rapid mapping in disaster response. The aim of this study is to develop educational modules for UAS data processing in rapid 3D mapping. The modules designed for this study focus on UAV data processing using available freeware or trial software for educational purposes. The key modules cover orientation modelling, 3D point cloud generation, image georeferencing and visualization. The orientation modelling module adopts VisualSFM to determine the projection matrix for each image station, and approximate ground control points are measured from OpenStreetMap for absolute orientation. The second module uses SURE and the orientation files from the previous module for 3D point cloud generation; ground point selection and digital terrain model generation are then achieved with LAStools. The third module stitches the individual rectified images into a mosaic image using Microsoft ICE (Image Composite Editor). The last module visualizes and measures the generated dense point clouds in CloudCompare. These comprehensive UAS processing modules allow students to gain the skills to process and deliver UAS photogrammetric products for rapid 3D mapping. Moreover, they can also apply the photogrammetric products for analysis in practice.

  1. Mapping biomass for a northern forest ecosystem using multi-frequency SAR data

    NASA Technical Reports Server (NTRS)

    Ranson, K. J.; Sun, Guoqing

    1992-01-01

    Image processing methods for mapping standing biomass for a forest in Maine, using NASA/JPL airborne synthetic aperture radar (AIRSAR) polarimeter data, are presented. By examining the dependence of backscattering on standing biomass, it is determined that the ratio of HV backscattering from a longer wavelength (P- or L-band) to a shorter wavelength (C) is a good combination for mapping total biomass. This ratio enhances the correlation of the image signature to the standing biomass and compensates for a major part of the variations in backscattering attributed to radar incidence angle. The image processing methods used include image calibration, ratioing, filtering, and segmentation. The image segmentation algorithm uses both means and variances of the image, and it is combined with the image filtering process. Preliminary assessment of the resultant biomass maps suggests that this is a promising method.
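
    A minimal sketch of the band-ratio step described above: the ratio of HV backscatter at a longer wavelength (P-band here) to C-band HV, computed on calibrated backscatter images. The arrays are placeholders, not the AIRSAR processing chain, and any relation of the ratio to biomass would come from an empirical fit to field data.

        # HV band ratio (longer wavelength over C-band) as a biomass indicator.
        import numpy as np

        p_hv = np.array([[0.030, 0.055], [0.012, 0.048]])   # P-band HV backscatter (linear power)
        c_hv = np.array([[0.020, 0.022], [0.015, 0.021]])   # C-band HV backscatter (linear power)

        ratio_db = 10.0 * np.log10(p_hv / c_hv)             # P-HV / C-HV ratio in dB
        print(ratio_db)
        # A biomass map would then be obtained by applying an empirically fitted
        # relation between this ratio and field-measured biomass, followed by the
        # filtering and segmentation steps mentioned in the abstract.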

  2. Comparison of three orientation and mobility aids for individuals with blindness: Verbal description, audio-tactile map and audio-haptic map.

    PubMed

    Papadopoulos, Konstantinos; Koustriava, Eleni; Koukourikos, Panagiotis; Kartasidou, Lefkothea; Barouti, Marialena; Varveris, Asimis; Misiou, Marina; Zacharogeorga, Timoclia; Anastasiadis, Theocharis

    2017-01-01

    Disorientation and inability of wayfinding are phenomena with a great frequency for individuals with visual impairments during the process of travelling novel environments. Orientation and mobility aids could suggest important tools for the preparation of a more secure and cognitively mapped travelling. The aim of the present study was to examine if spatial knowledge structured after an individual with blindness had studied the map of an urban area that was delivered through a verbal description, an audio-tactile map or an audio-haptic map, could be used for detecting in the area specific points of interest. The effectiveness of the three aids with reference to each other was also examined. The results of the present study highlight the effectiveness of the audio-tactile and the audio-haptic maps as orientation and mobility aids, especially when these are compared to verbal descriptions.

  3. Alaska Interim Land Cover Mapping Program; final report

    USGS Publications Warehouse

    Fitzpatrick-Lins, Katherine; Doughty, E.F.; Shasby, Mark; Benjamin, Susan

    1989-01-01

    In 1985, the U.S. Geological Survey initiated a research project to develop an interim land cover data base for Alaska as an alternative to the nationwide Land Use and Land Cover Mapping Program. The Alaska Interim Land Cover Mapping Program was subsequently created to develop methods for producing a series of land cover maps that utilized the existing Landsat digital land cover classifications produced by and for the major land management agencies for mapping the vegetation of Alaska. The program was successful in producing digital land cover classifications and statistical summaries using a common statewide classification and in reformatting these data to produce 1:250,000-scale quadrangle-based maps directly from the Scitex laser plotter. A Federal and State agency review of these products found considerable user support for the maps. Presently the Geological Survey is committed to digital processing of six to eight quadrangles each year.

  4. Geological mapping goes 3-D in response to societal needs

    USGS Publications Warehouse

    Thorleifson, H.; Berg, R.C.; Russell, H.A.J.

    2010-01-01

    The transition to 3-D mapping has been made possible by technological advances in digital cartography, GIS, data storage, analysis, and visualization. Despite various challenges, technological advancements facilitated a gradual transition from 2-D maps to 2.5-D draped maps to 3-D geological mapping, supported by digital spatial and relational databases that can be interrogated horizontally or vertically and viewed interactively. Challenges associated with data collection, human resources, and information management are daunting due to their resource and training requirements. The exchange of strategies at the workshops has highlighted the use of basin analysis to develop a process-based predictive knowledge framework that facilitates data integration. Three-dimensional geological information meets a public demand that fills in the blanks left by conventional 2-D mapping. Two-dimensional mapping will, however, remain the standard method for extensive areas of complex geology, particularly where deformed igneous and metamorphic rocks defy attempts at 3-D depiction.

  5. Road Map for Development of Crystal-Tolerant High Level Waste Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matyas, Josef; Vienna, John D.; Peeler, David

    This road map guides the research and development for formulation and processing of crystal-tolerant glasses, identifying near- and long-term activities that need to be completed over the period from 2014 to 2019. The primary objective is to maximize waste loading for Hanford waste glasses without jeopardizing melter operation by crystal accumulation in the melter or melter discharge riser. The potential applicability to the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) is also addressed in this road map.

  6. A physical map of the bovine genome

    PubMed Central

    Snelling, Warren M; Chiu, Readman; Schein, Jacqueline E; Hobbs, Matthew; Abbey, Colette A; Adelson, David L; Aerts, Jan; Bennett, Gary L; Bosdet, Ian E; Boussaha, Mekki; Brauning, Rudiger; Caetano, Alexandre R; Costa, Marcos M; Crawford, Allan M; Dalrymple, Brian P; Eggen, André; Everts-van der Wind, Annelie; Floriot, Sandrine; Gautier, Mathieu; Gill, Clare A; Green, Ronnie D; Holt, Robert; Jann, Oliver; Jones, Steven JM; Kappes, Steven M; Keele, John W; de Jong, Pieter J; Larkin, Denis M; Lewin, Harris A; McEwan, John C; McKay, Stephanie; Marra, Marco A; Mathewson, Carrie A; Matukumalli, Lakshmi K; Moore, Stephen S; Murdoch, Brenda; Nicholas, Frank W; Osoegawa, Kazutoyo; Roy, Alice; Salih, Hanni; Schibler, Laurent; Schnabel, Robert D; Silveri, Licia; Skow, Loren C; Smith, Timothy PL; Sonstegard, Tad S; Taylor, Jeremy F; Tellam, Ross; Van Tassell, Curtis P; Williams, John L; Womack, James E; Wye, Natasja H; Yang, George; Zhao, Shaying

    2007-01-01

    Background Cattle are important agriculturally and relevant as a model organism. Previously described genetic and radiation hybrid (RH) maps of the bovine genome have been used to identify genomic regions and genes affecting specific traits. Application of these maps to identify influential genetic polymorphisms will be enhanced by integration with each other and with bacterial artificial chromosome (BAC) libraries. The BAC libraries and clone maps are essential for the hybrid clone-by-clone/whole-genome shotgun sequencing approach taken by the bovine genome sequencing project. Results A bovine BAC map was constructed with HindIII restriction digest fragments of 290,797 BAC clones from animals of three different breeds. Comparative mapping of 422,522 BAC end sequences assisted with BAC map ordering and assembly. Genotypes and pedigree from two genetic maps and marker scores from three whole-genome RH panels were consolidated on a 17,254-marker composite map. Sequence similarity allowed integrating the BAC and composite maps with the bovine draft assembly (Btau3.1), establishing a comprehensive resource describing the bovine genome. Agreement between the marker and BAC maps and the draft assembly is high, although discrepancies exist. The composite and BAC maps are more similar than either is to the draft assembly. Conclusion Further refinement of the maps and greater integration into the genome assembly process may contribute to a high quality assembly. The maps provide resources to associate phenotypic variation with underlying genomic variation, and are crucial resources for understanding the biology underpinning this important ruminant species so closely associated with humans. PMID:17697342

  7. Information system for preserving culture heritage in areas affected by heavy industry and mining

    NASA Astrophysics Data System (ADS)

    Pacina, Jan; Kopecký, Jiří; Bedrníková, Lenka; Handrychová, Barbora; Švarcová, Martina; Holá, Markéta; Pončíková, Edita

    2014-05-01

    The natural development of the Ústí region (North-West Bohemia, Czech Republic) has been affected by human activity over the past hundred years. Heavy industrialization and brown-coal mining have completely changed the land use in the region. The open-pit coal mines destroy the surrounding landscape, including settlements, communications, the hydrological network, and the overall natural development of the region. Another factor affecting the natural development of the landscape, land use, and settlement was the political situation in 1945 (the end of the Second World War), when the borderland was depopulated. These factors caused more than two hundred settlements, villages, and towns to vanish during this period. The task of this project is to prepare and offer for public use a comprehensive information system preserving this cultural heritage in the form of processed old maps, aerial imagery, land-use and georelief reconstructions, local studies, and text and photo documents covering the extinct landscape and settlement. A wide range of maps was used for this area: Müller's map of Bohemia (ca. 1720), followed by the 1st, 2nd, and 3rd Military Surveys of the Habsburg Empire (1792, 1894, 1938), maps of the Stable Cadastre (ca. 1840), and the State Map Derived at the scale 1:5,000 (1953, 1972, 1981). All the maps were processed, georeferenced, and hand digitized, and are further used as base layers for visualization and analysis. The historical aerial imagery was processed with standard photogrammetric methods and covers 1938, 1953, and the current state. The other important task covered by this project is georelief reconstruction: the old maps and aerial imagery are used to reconstruct the complete time-line of georelief development from 1938 to the present. The derived digital terrain models are then analyzed and printed on a 3D printer. Other reconstruction tasks are performed using the processed old maps; here we study land-use change, settlement development, and the effect of industrialization and brown-coal mining on the structure of the hydrological network. The processed data (old maps, aerial photographs, land-use and georelief reconstructions) are published as a web-mapping application built with the ArcGIS API for Flex. The application offers visualization and overlay tools so the user can perform basic landscape and land-use development analyses. The resulting information system will consist of three parts: the web-mapping application, a database containing the text and photo information about the vanished towns and villages (spatially linked to the web-mapping application), and local studies performed at single sites in the region. The local studies focus on the application of data collection methods such as UAV (Unmanned Aerial Vehicle), KAP (Kite Aerial Photography), and LIDAR.
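
    The abstract does not name the georeferencing tooling, so the following is only a minimal, hedged sketch of how a scanned historical map sheet could be georeferenced with ground control points using the GDAL Python bindings; the file names, GCP coordinates, and target coordinate system (EPSG:5514, the Czech S-JTSK / Krovak East North system) are illustrative assumptions rather than the project's actual workflow.

      # Hypothetical example: georeference a scanned map sheet with GCPs using GDAL.
      from osgeo import gdal

      gcps = [
          # gdal.GCP(map_x, map_y, elevation, pixel_column, pixel_row)
          gdal.GCP(-783000.0, -985000.0, 0, 120.5, 310.2),
          gdal.GCP(-780000.0, -985000.0, 0, 4020.8, 305.7),
          gdal.GCP(-783000.0, -988000.0, 0, 118.3, 4210.4),
          gdal.GCP(-780000.0, -988000.0, 0, 4018.1, 4205.9),
      ]

      # Attach the GCPs and the target CRS to the scanned sheet ...
      ds = gdal.Translate("sheet_gcps.tif", "scanned_sheet.tif",
                          GCPs=gcps, outputSRS="EPSG:5514")
      ds = None  # close the dataset so it is flushed to disk

      # ... then rectify (warp) it into the target coordinate system.
      gdal.Warp("sheet_georef.tif", "sheet_gcps.tif",
                dstSRS="EPSG:5514", resampleAlg="bilinear")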

  8. 15 maps merged in one data structure - GIS-based template for Dawn at Ceres

    NASA Astrophysics Data System (ADS)

    Naß, A.; Dawn Mapping Team

    2017-09-01

    Deriving regionally and globally valid statements from the individual map quadrangles is already a very time-intensive task. A further challenge, however, is how individual mappers can generate one homogeneous GIS-based project (with respect to geometrical and visual character) representing one geologically consistent final map. This contribution presents a template that was generated for the interpretative mapping project of Ceres to meet the requirement of unifying and merging the individual quadrangles.

  9. Restoration of distorted depth maps calculated from stereo sequences

    NASA Technical Reports Server (NTRS)

    Damour, Kevin; Kaufman, Howard

    1991-01-01

    A model-based Kalman estimator is developed for spatial-temporal filtering of noise and other degradations in velocity and depth maps derived from image sequences or cinema. As an illustration of the proposed procedures, edge information from image sequences of rigid objects is used in the processing of the velocity maps by selecting from a series of models for directional adaptive filtering. Adaptive filtering then allows for noise reduction while preserving sharpness in the velocity maps. Results from several synthetic and real image sequences are given.
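
    The abstract describes a model-based Kalman estimator with directional adaptive filtering; the exact formulation is not given, so the sketch below is only a deliberately simplified illustration of the underlying idea: a scalar Kalman filter applied independently to every pixel of a temporal sequence of noisy depth maps under a random-walk state model. The noise variances q and r and the synthetic data are assumptions.

      # Simplified per-pixel temporal Kalman filtering of noisy depth maps (illustration only).
      import numpy as np

      def kalman_filter_depth(depth_maps, q=1e-4, r=1e-2):
          """depth_maps: array of shape (T, H, W); returns the filtered sequence."""
          est = depth_maps[0].astype(float)      # initial state estimate
          p = np.ones_like(est)                  # initial error covariance
          filtered = [est.copy()]
          for z in depth_maps[1:]:
              p = p + q                          # predict: random-walk model
              k = p / (p + r)                    # Kalman gain
              est = est + k * (z - est)          # update with the new measurement
              p = (1.0 - k) * p
              filtered.append(est.copy())
          return np.stack(filtered)

      # Usage: ten noisy 64x64 depth maps of a constant scene.
      rng = np.random.default_rng(0)
      noisy = 5.0 + rng.normal(0, 0.1, size=(10, 64, 64))
      smoothed = kalman_filter_depth(noisy)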

  10. DOSoReMI.hu: collection of countrywide DSM products partly according to GSM.net specifications, partly driven by specific user demands

    NASA Astrophysics Data System (ADS)

    Pásztor, László; Laborczi, Annamária; Takács, Katalin; Szatmári, Gábor; Illés, Gábor; Bakacsi, Zsófia; Szabó, József

    2017-04-01

    Owing to former soil surveys and mapping activities, a significant amount of soil information has accumulated in Hungary. In traditional soil mapping, the creation of a new map was troublesome and laborious; as a consequence, robust maps were elaborated and demands were instead fitted to the available map products. Until recently, spatial soil information demands have been serviced with the available datasets, either in their actual form or after certain specific, often forced, thematic and spatial inference. Considerable imperfection may occur in the accuracy and reliability of such map products, since there may be significant discrepancies between the available data and the expected information. The DOSoReMI.hu (Digital, Optimized, Soil Related Maps and Information in Hungary) project was started with the aim of renewing the national soil spatial infrastructure in Hungary. In the course of our activities we have significantly extended the ways in which soil information requirements can be satisfied. Soil property, soil type, and functional soil maps were targeted. The set of applied digital soil mapping techniques has been gradually broadened, incorporating and eventually integrating geostatistical, data mining, and GIS tools. Soil property maps have been compiled partly according to GSM.net specifications and partly by slightly or more substantially changing some of their predefined parameters (depth intervals, pixel size, property, etc.) according to the specific demands on the final products. The elaborated primary maps were further processed, since DOSoReMI.hu also intended to take steps toward the regionalization of higher-level soil information (processes, functions, and services), involving crop models in the spatial modelling. The framework of DOSoReMI.hu also provides the opportunity to elaborate goal-specific soil maps, with prescription of the parameters (theme, resolution, accuracy, reliability, etc.) characterizing the map product. As a result, unique digital soil map products (in a more general sense) were elaborated, regionalizing specific soil-related features that had never been mapped before, even nationally, at high (approximately 1 ha) spatial resolution. Based on the collected experience, the full range of GSM.net products was also targeted. The web publishing of the results was also elaborated, creating a proper WMS environment. Our paper will present the resulting national maps, as well as some conclusions drawn from the experience. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA) under Grant K105167 and AGRARKLÍMA.2 VKSZ_12-1-2013-0034.

  11. Similarity and accuracy of mental models formed during nursing handovers: A concept mapping approach.

    PubMed

    Drach-Zahavy, Anat; Broyer, Chaya; Dagan, Efrat

    2017-09-01

    Shared mental models are crucial for constructing mutual understanding of the patient's condition during a clinical handover. Yet scant research, if any, has empirically explored the mental models of the parties involved in a clinical handover. This study aimed to examine the similarities among mental models of incoming and outgoing nurses, and to test their accuracy by comparing them with the mental models of expert nurses. A cross-sectional study exploring nurses' mental models via the concept mapping technique, covering 40 clinical handovers. Data were collected via concept mapping of the incoming, outgoing, and expert nurses' mental models (a total of 120 concept maps). Similarity and accuracy indexes for concepts and associations were calculated to compare the different maps. About one fifth of the concepts emerged in both outgoing and incoming nurses' concept maps (concept similarity=23%±10.6). Concept accuracy indexes were 35%±18.8 for incoming and 62%±19.6 for outgoing nurses' maps. Although incoming nurses absorbed a smaller number of concepts and associations (23% and 12%, respectively), they partially closed the gap (35% and 22%, respectively) relative to the expert nurses' maps. The correlations between concept similarities and incoming as well as outgoing nurses' concept accuracy were significant (r=0.43, p<0.01; r=0.68, p<0.01, respectively). Finally, in 90% of the maps, outgoing nurses added information concerning the processes enacted during the shift, beyond the expert nurses' gold standard. Two seemingly contradictory processes in the handover were identified: "information loss", captured by the low similarity indexes among the mental models of incoming and outgoing nurses; and "information restoration", based on the accuracy indexes among the mental models of the incoming nurses. Based on mental model theory, we propose possible explanations for these processes and derive implications for how to improve a clinical handover. Copyright © 2017 Elsevier Ltd. All rights reserved.
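
    The abstract reports concept similarity and accuracy as percentages but does not give the exact formulas, so the following is only a hedged sketch of one plausible way such indexes could be computed: similarity as the share of concepts common to two maps relative to all concepts used, and accuracy as the share of the expert map's concepts captured by a nurse's map. The concept lists are invented for illustration.

      # Hypothetical concept-overlap indexes for comparing concept maps.
      def concept_similarity(map_a, map_b):
          """Percentage of concepts appearing in both maps, relative to all concepts used."""
          a, b = set(map_a), set(map_b)
          return 100.0 * len(a & b) / len(a | b)

      def concept_accuracy(nurse_map, expert_map):
          """Percentage of the expert ('gold standard') concepts captured by a nurse's map."""
          n, e = set(nurse_map), set(expert_map)
          return 100.0 * len(n & e) / len(e)

      outgoing = {"pain score", "IV antibiotics", "mobility", "family meeting"}
      incoming = {"pain score", "IV antibiotics", "discharge plan"}
      expert   = {"pain score", "IV antibiotics", "mobility", "discharge plan", "wound care"}

      print(concept_similarity(outgoing, incoming))   # 40.0
      print(concept_accuracy(incoming, expert))       # 60.0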

  12. Getting past the dual logic: findings from a pilot asset mapping exercise in Sheffield, UK.

    PubMed

    South, Jane; Giuntoli, Gianfranco; Kinsella, Karina

    2017-01-01

    Asset-based approaches seek to identify and mobilise the personal, social and organisational resources available to communities. Asset mapping is a recognised method of gathering an inventory of neighbourhood assets and is underpinned by a fundamentally different logic to traditional needs assessments. The aim of this paper is to explore how asset mapping might be used as a tool for health improvement. It reports on a qualitative evaluation of a pilot asset mapping project carried out in two economically disadvantaged neighbourhoods in Sheffield, UK. The project involved community health champions working with two community organisations to identify assets linked to the health and wellbeing of their neighbourhoods. The evaluation was undertaken in 2012 after mapping activities had been completed. A qualitative design, using theory of change methodology, was used to explore assumptions between activities, mechanisms and outcomes. Semi-structured interviews were undertaken with a purposive sample of 11 stakeholders including champions, community staff and strategic partners. Thematic analysis was used and themes were identified on the process of asset mapping, the role of champions and the early outcomes for neighbourhoods and services. Findings showed that asset mapping was developmental and that understandings grew as participatory activities were planned and implemented. The role of the champions was limited by the numbers involved; nonetheless, meaningful engagement occurred with residents, which led to personal and social resources being identified. Most early outcomes were focused on the lead community organisations. There was less evidence of results feeding into wider planning processes because of the requirements for more quantifiable information. The paper discusses the importance of relational aspects of asset mapping both within communities and between communities and services. The conclusions are that it is insufficient to switch from the logic of needs to assets without building asset mapping into a broader planning process. © 2015 John Wiley & Sons Ltd.

  13. Heralded processes on continuous-variable spaces as quantum maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreyrol, Franck; Spagnolo, Nicolò; Blandino, Rémi

    2014-12-04

    Heralded processes, which succeed only when a measurement on part of the system gives the desired result, are particularly interesting for continuous variables. They permit non-Gaussian transformations that are necessary for several continuous-variable quantum information tasks. However, although maps and quantum process tomography are commonly used to describe quantum transformations in discrete-variable spaces, they are much rarer in the continuous-variable domain. Moreover, no convenient tool for representing maps in a way better adapted to the particularities of continuous variables has yet been explored. In this paper we try to fill this gap by presenting such a tool.

  14. A Brief History of Soil Mapping and Classification in the USA

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Hartemink, Alfred E.

    2014-05-01

    Soil maps show the distribution of soils across an area but also depict soil science theory and ideas on soil formation and classification at the time the maps were created. The national soil mapping program in the USA was established in 1899. The first nation-wide soil map was published by M. Whitney in 1909 and showed soil provinces that were largely based on geology. In 1912, G.N. Coffey published the first country-wide map based on soil properties. The map showed 5 broad soil units that used parent material, color and drainage as diagnostic criteria. The 1913 national map was produced by C.F. Marbut, H.H. Bennett, J.E. Lapham, and M.H. Lapham and showed broad physiographic units that were further subdivided into soil series, soil classes and soil types. In 1935, Marbut drafted a series of maps based on soil properties, but these maps were replaced as official U.S. soil maps in 1938 with the work of M. Baldwin, C.E. Kellogg, and J. Thorp. A series of soil maps similar to modern USA maps appeared in the 1960s with the 7th Approximation followed by revisions with the 1975 and 1999 editions of Soil Taxonomy. This review has shown that soil maps in the United States produced since the early 1900s moved initially from a geologic-based concept to a pedologic concept of soils. Later changes were from property-based systems to process-based, and then back to property-based. The information in this presentation is based on Brevik and Hartemink (2013). Brevik, E.C., and A.E. Hartemink. 2013. Soil Maps of the United States of America. Soil Science Society of America Journal 77:1117-1132. doi:10.2136/sssaj2012.0390.

  15. Geologic map of the Khanneshin carbonatite complex, Helmand Province, Afghanistan, modified from the 1976 original map compilation of V.G. Cheremytsin

    USGS Publications Warehouse

    Tucker, Robert D.; Peters, Stephen G.; Schulz, Klaus J.; Renaud, Karine M.; Stettner, Will R.; Masonic, Linda M.; Packard, Patricia H.

    2011-01-01

    This map is a modified version of the Geological map of the Khanneshin carbonatite complex, scale 1:10,000, which was compiled by V.G. Cheremytsin in 1976. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original map and also visited the field area in September 2009, August 2010, and February 2011. This modified map, which includes cross sections, illustrates the geologic structure of the Khanneshin carbonatite complex. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross sections and includes modifications based on our examination of that map and a related report, and based on observations made during our field visits. (Refer to the References section in the Map PDF for complete citations of the original map and related report.) Elevations on the cross section are derived from the original Soviet topography and may not match the newer topography used on the current map. We have attempted to translate the original Russian terminology and rock classification into modern English geologic usage as literally as possible without changing any genetic or process-oriented implications in the original descriptions. We also use the age designations from the original map. The unit colors on the map and cross sections differ from the colors shown on the original version. The units are colored according to the color and pattern scheme of the Commission for the Geological Map of the World (CGMW) (http://www.ccgm.org).

  16. Scoping of Flood Hazard Mapping Needs for Androscoggin County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed and as funds allow. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Androscoggin County. Scoping activities included assembling existing data and map needs information for communities in Androscoggin County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Androscoggin County, Maine, is at least 17 years. Most studies were published in the early 1990s, and some towns have partial maps that are more recent than their study date. Since the studies were done, development has occurred in many of the watersheds and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.

  17. Scoping of Flood Hazard Mapping Needs for Lincoln County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Lincoln County. Scoping activities included assembling existing data and map needs information for communities in Lincoln County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Lincoln County, Maine is at least 17 years. Many of these studies were published in the mid- to late-1980s, and some towns have partial maps that are more recent than their study. However, in the ensuing 15-20 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.

  18. USGS standard quadrangle maps for emergency response

    USGS Publications Warehouse

    Moore, Laurence R.

    2009-01-01

    The 1:24,000-scale topographic quadrangle was the primary product of the U.S. Geological Survey's (USGS) National Mapping Program from 1947-1992. This map series includes about 54,000 map sheets for the conterminous United States, and is the only uniform map series ever produced that covers this area at such a large scale. The map series was partially revised under several programs, starting as early as 1968, but these programs were not adequate to keep the series current. Through the 1990s the emphasis of the USGS mapping program shifted away from topographic maps and toward more specialized digital data products. Topographic map revision dropped off rapidly after 1999, and stopped completely by 2004. Since 2001, emergency-response and homeland security requirements have revived the question of whether a standard national topographic series is needed. Emergencies such as Hurricane Katrina in 2005 and the California wildfires of 2007-08 demonstrated that familiar maps are important to first responders. Maps that have a standard scale, extent, and grids help reduce confusion and save time in emergencies. Traditional maps are designed to allow the human brain to quickly process large amounts of information, and depend on artistic layout and design that cannot be fully automated. In spite of technical advances, creating a traditional, general-purpose topographic map is still expensive. Although the content and layout of traditional topographic maps probably is still desirable, the preferred packaging and delivery of maps has changed. Digital image files are now desired by most users, but to be useful to the emergency-response community, these files must be easy to view and easy to print without specialized geographic information system expertise or software.

  19. Application of OpenStreetMap (OSM) to Support the Mapping Village in Indonesia

    NASA Astrophysics Data System (ADS)

    Swasti Kanthi, Nurin; Hery Purwanto, Taufik

    2016-11-01

    Geospatial information is important in this era because location information is needed to understand the condition of a region. In 2015 the Indonesian government released standards for detailed mapping at the village level, set forth in the regulation on Norms, Standards, Procedures and Criteria for Village Mapping (NSPK). Over time, Web and Mobile GIS have been developed with a wide range of applications, but the combination of detailed mapping with Web GIS is still rarely performed and not used optimally. OpenStreetMap (OSM) is a Web GIS that can also be used as a Mobile GIS; it provides information down to the level of individual buildings and can be used for village mapping. Village mapping using OSM was conducted with a remote sensing and Geographical Information Systems (GIS) approach, interpreting the remote sensing imagery available through OSM. The study analyzed how far OSM can support village mapping by entering house-number data, administrative boundaries, public facilities, and land use into OSM, with reference data and Village Plan imagery. The resulting maps of parts of the villages in OSM were used as reference village maps and analyzed for conformity with the NSPK for detailed mapping of Rukun Warga (RW) units, which form part of village mapping. The use of OSM greatly assists detailed mapping of the region, with data sources in the form of imagery that can be accessed openly. However, continued maintenance and updating of the data source are needed to preserve the validity of the data.

  20. Chosen Aspects of the Production of the Basic Map Using Uav Imagery

    NASA Astrophysics Data System (ADS)

    Kedzierski, M.; Fryskowska, A.; Wierzbicki, D.; Nerc, P.

    2016-06-01

    For several years there has been increasing interest in the use of unmanned aerial vehicles for acquiring image data from low altitude. Considering the cost-effectiveness of UAV flight time vs. conventional airplanes, the use of UAVs is advantageous when generating large-scale, accurate orthophotos. Through the development of UAV imagery, we can update large-scale basic maps. These maps are cartographic products used for registration, economic, and strategic planning; other cartographic products are derived from them, for example maps used for building planning. The article presents an assessment of the usefulness of orthophotos based on UAV imagery for upgrading the basic map. In the research a compact, non-metric camera mounted on a fixed-wing platform powered by an electric motor was used. The tested area covered flat agricultural and woodland terrain. The processing and analysis of the orthorectification were carried out with the INPHO UASMaster programme. Owing to the effect of UAV instability on low-altitude imagery, the use of non-metric digital cameras, and the low-accuracy GPS-INS sensors, the geometry of the images is visibly poorer compared with conventional digital aerial photographs (large values of the phi and kappa angles). Therefore, low-altitude images typically require large along- and across-track overlap, usually above 70%. As a result of the research, orthoimages were obtained with a resolution of 0.06 m and a horizontal accuracy of 0.10 m. Digitized basic maps were used as the reference data. The accuracy of the orthoimages vs. the basic maps was estimated based on the study and on the available reference sources. As a result, it was found that the geometric accuracy and interpretative advantages of the final orthoimages allow the updating of basic maps. It is estimated that such an update of basic maps based on UAV imagery reduces processing time by approximately 40%.
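
    The reported horizontal accuracy of 0.10 m is the kind of figure obtained by comparing check-point coordinates on the UAV orthoimage with the same points on the reference (the digitized basic map); the sketch below shows the standard horizontal root-mean-square-error calculation, with made-up coordinates, as an illustration rather than the authors' exact procedure.

      # Horizontal RMSE between orthoimage check points and reference (basic map) points.
      import math

      def horizontal_rmse(ortho_pts, reference_pts):
          """Both arguments: lists of (x, y) tuples in metres, in matching order."""
          sq = [(xo - xr) ** 2 + (yo - yr) ** 2
                for (xo, yo), (xr, yr) in zip(ortho_pts, reference_pts)]
          return math.sqrt(sum(sq) / len(sq))

      ortho = [(1000.06, 2000.02), (1050.11, 2100.05), (1100.01, 2200.09)]
      basic = [(1000.00, 2000.00), (1050.00, 2100.00), (1100.00, 2200.00)]
      print(round(horizontal_rmse(ortho, basic), 3))  # 0.095 (metres)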

  1. Current trends in satellite based emergency mapping - the need for harmonisation

    NASA Astrophysics Data System (ADS)

    Voigt, Stefan

    2013-04-01

    During the past years, the availability and use of satellite image data to support disaster management and humanitarian relief organisations has increased greatly. Automation and data processing techniques are improving considerably, and the capacity for accessing and processing satellite imagery is getting better globally. More and more global activities, via the internet and through organisations such as the United Nations or the International Charter on Space and Major Disasters, engage in the topic, while at the same time more and more national or local centres undertake rapid mapping operations and activities. In order to make even more effective use of this very positive increase in capacity, for the operational provision of analysis results, for fast validation of satellite-derived damage assessments, for better cooperation in the joint inter-agency generation of rapid mapping products, and for general scientific use, rapid mapping results need to be better harmonised, if not standardised. In this presentation, experiences from several years of rapid mapping gained by the DLR Center for Satellite Based Crisis Information (ZKI) within the context of national activities, the International Charter on Space and Major Disasters, GMES/Copernicus, etc., are reported. Furthermore, an overview is given of how automation, quality assurance, and optimisation can be achieved through standard operating procedures within a rapid mapping workflow. Building on this long-term rapid mapping experience, and on the DLR initiative to set in place an "International Working Group on Satellite Based Emergency Mapping", current trends in rapid mapping are discussed and thoughts are presented on how the sharing of rapid mapping information can be optimised by harmonising analysis results and data structures. Such harmonisation of analysis procedures, nomenclatures, and representations of data and metadata is the basis for better cooperation within the global rapid mapping community across local/national, regional/supranational, and global scales.

  2. Creation of a full color geologic map by computer: A case history from the Port Moller project resource assessment, Alaska Peninsula: A section in Geologic studies in Alaska by the U.S. Geological Survey, 1988

    USGS Publications Warehouse

    Wilson, Frederic H.

    1989-01-01

    Graphics programs on computers can facilitate the compilation and production of geologic maps, including full-color maps of publication quality. This paper describes the application of two different programs, GSMAP and ARC/INFO, to the production of a geologic map of the Port Moller and adjacent 1:250,000-scale quadrangles on the Alaska Peninsula. GSMAP was used at first because of easy digitizing on inexpensive computer hardware. Limitations in its editing capability led to transfer of the digital data to ARC/INFO, a Geographic Information System, which has better editing and adds data analysis capability. Although these improved capabilities are accompanied by increased complexity, the availability of ARC/INFO's data analysis capability provides unanticipated advantages: it allows digital map data to be processed as one of multiple data layers for mineral resource assessment. As a result of the development of both software packages, it is now easier to apply them to geologic map production. Both systems accelerate the drafting and revision of maps and enhance the compilation process. Additionally, ARC/INFO's analysis capability enhances the geologist's ability to develop answers to questions of interest that were previously difficult or impossible to obtain.

  3. Using perceptual rules in interactive visualization

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.; Treinish, Lloyd A.

    1994-05-01

    In visualization, data are represented as variations in grayscale, hue, shape, and texture. They can be mapped to lines, surfaces, and glyphs, and can be represented statically or in animation. In modern visualization systems, the choices for representing data seem unlimited. This is both a blessing and a curse, however, since the visual impression created by the visualization depends critically on which dimensions are selected for representing the data (Bertin, 1967; Tufte, 1983; Cleveland, 1991). In modern visualization systems, the user can interactively select many different mapping and representation operations, and can interactively select processing operations (e.g., applying a color map), realization operations (e.g., generating geometric structures such as contours or streamlines), and rendering operations (e.g., shading or ray-tracing). The user can, for example, map data to a color map, then apply contour lines, then shift the viewing angle, then change the color map again, etc. In many systems, the user can vary the choices for each operation, selecting, for example, particular color maps, contour characteristics, and shading techniques. The hope is that this process will eventually converge on a visual representation which expresses the structure of the data and effectively communicates its message in a way that meets the user's goals. Sometimes, however, it results in visual representations which are confusing, misleading, and garish.

  4. Simultaneous topography imaging and broadband nanomechanical mapping on atomic force microscope

    NASA Astrophysics Data System (ADS)

    Li, Tianwei; Zou, Qingze

    2017-12-01

    In this paper, an approach is proposed to achieve simultaneous imaging and broadband nanomechanical mapping of soft materials in air using an atomic force microscope. Simultaneous imaging and nanomechanical mapping are needed, for example, to correlate the morphological and mechanical evolution of a sample during dynamic phenomena such as the cell endocytosis process. Current techniques for nanomechanical mapping, however, are only capable of capturing the static elasticity of the material, or the material viscoelasticity in a narrow frequency band around the resonant frequency(ies) of the cantilever used; they are competent neither for broadband nanomechanical mapping nor for acquiring a topography image of the sample simultaneously. These limitations are addressed in this work by augmenting the imaging process with an excitation force stimulus of rich frequency spectrum for nanomechanical mapping. A Kalman-filtering technique is exploited to decouple and split the mixed signals for imaging and mapping, respectively. The sample indentation generated is then quantified online via a system-inversion method, and the effects of the indentation and the topography tracking error on the topography quantification are taken into account. Moreover, a data-driven feedforward-feedback control is utilized to track the sample topography. The proposed approach is illustrated through experimental implementation on a polydimethylsiloxane sample with a pre-fabricated pattern.

  5. Hyperspectral imaging—An advanced instrument concept for the EnMAP mission (Environmental Mapping and Analysis Programme)

    NASA Astrophysics Data System (ADS)

    Stuffler, Timo; Förster, Klaus; Hofer, Stefan; Leipold, Manfred; Sang, Bernhard; Kaufmann, Hermann; Penné, Boris; Mueller, Andreas; Chlebek, Christian

    2009-10-01

    In the upcoming generation of satellite sensors, hyperspectral instruments will play a significant role. This payload type is being considered worldwide in various future mission plans. Our team has successfully finalized the Phase B study for the advanced hyperspectral mission EnMAP (Environmental Mapping and Analysis Programme), Germany's next optical satellite, scheduled for launch in 2012. GFZ in Potsdam has the scientific lead on EnMAP; Kayser-Threde in Munich is the industrial prime. The EnMAP instrument provides over 240 continuous spectral bands in the wavelength range between 420 and 2450 nm with a ground resolution of 30 m × 30 m. Thus, the broad science and application community can draw from an extensive and highly resolved pool of information supporting modeling and optimization of their results. The performance of the hyperspectral instrument allows for detailed monitoring, characterization, and parameter extraction of rock/soil targets, vegetation, and inland and coastal waters on a global scale, supporting a wide variety of applications in agriculture, forestry, water management, and geology. The operation of an airborne system (ARES) as an element in the HGF hyperspectral network, and the ongoing evolution of data handling and extraction procedures, will support the later integration of EnMAP into the growing scientific and user communities.

  6. Mind the gap! Automated concept map feedback supports students in writing cohesive explanations.

    PubMed

    Lachner, Andreas; Burkhart, Christian; Nückles, Matthias

    2017-03-01

    Many students are challenged with the demand of writing cohesive explanations. To support students in writing cohesive explanations, we developed a computer-based feedback tool that visualizes cohesion deficits of students' explanations in a concept map. We conducted three studies to investigate the effectiveness of such feedback as well as the underlying cognitive processes. In Study 1, we found that the concept map helped students identify potential cohesion gaps in their drafts and plan remedial revisions. In Study 2, students with concept map feedback conducted revisions that resulted in more locally and globally cohesive, and also more comprehensible, explanations than the explanations of students who revised without concept map feedback. In Study 3, we replicated the findings of Study 2 by and large. More importantly, students who had received concept map feedback on a training explanation 1 week later wrote a transfer explanation without feedback that was more cohesive than the explanation of students who had received no feedback on their training explanation. The automated concept map feedback appears to particularly support the evaluation phase of the revision process. Furthermore, the feedback enabled novice writers to acquire sustainable skills in writing cohesive explanations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. High-throughput SNP genotyping in Cucurbita pepo for map construction and quantitative trait loci mapping

    PubMed Central

    2012-01-01

    Background Cucurbita pepo is a member of the Cucurbitaceae family, the second-most important horticultural family in terms of economic importance after Solanaceae. The "summer squash" types, including Zucchini and Scallop, rank among the highest-valued vegetables worldwide. There are few genomic tools available for this species. The first Cucurbita transcriptome, along with a large collection of Single Nucleotide Polymorphisms (SNP), was recently generated using massive sequencing. A set of 384 SNP was selected to generate an Illumina GoldenGate assay in order to construct the first SNP-based genetic map of Cucurbita and map quantitative trait loci (QTL). Results We herein present the construction of the first SNP-based genetic map of Cucurbita pepo using a population derived from the cross of two varieties with contrasting phenotypes, representing the main cultivar groups of the species' two subspecies: Zucchini (subsp. pepo) × Scallop (subsp. ovifera). The mapping population was genotyped with 384 SNP, a set of selected EST-SNP identified in silico after massive sequencing of the transcriptomes of both parents, using the Illumina GoldenGate platform. The global success rate of the assay was higher than 85%. In total, 304 SNP were mapped, along with 11 SSR from a previous map, giving a map density of 5.56 cM/marker. This map was used to infer syntenic relationships between C. pepo and cucumber and to successfully map QTL that control plant, flowering and fruit traits that are of benefit to squash breeding. The QTL effects were validated in backcross populations. Conclusion Our results show that massive sequencing in different genotypes is an excellent tool for SNP discovery, and that the Illumina GoldenGate platform can be successfully applied to constructing genetic maps and performing QTL analysis in Cucurbita. This is the first SNP-based genetic map in the Cucurbita genus and is an invaluable new tool for biological research, especially considering that most of these markers are located in the coding regions of genes involved in different physiological processes. The platform will also be useful for future mapping and diversity studies, and will be essential in order to accelerate the process of breeding new and better-adapted squash varieties. PMID:22356647

  8. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    NASA Astrophysics Data System (ADS)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be extendable to multi-channel data represented by data cubes in Stokes I, Q, and U.
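
    NOD3's restoration and "basket-weaving" algorithms are considerably more sophisticated than anything that fits here, so the following is only a toy sketch of the principle of combining two maps scanned in orthogonal directions: scanning effects appear as offsets along the scan rows of one map and along the scan columns of the other, so each set of offsets can be estimated against the orthogonal map, removed, and the cleaned maps averaged. All names and data are illustrative assumptions; this is not the NOD3 implementation.

      # Toy illustration of combining orthogonally scanned maps (not the NOD3 algorithm).
      import numpy as np

      def combine_orthogonal_scans(map_rows, map_cols):
          """map_rows: scanned along rows; map_cols: scanned along columns (same shape)."""
          # Estimate and remove per-row offsets of the row-scanned map.
          row_offsets = np.median(map_rows - map_cols, axis=1, keepdims=True)
          clean_rows = map_rows - row_offsets
          # Estimate and remove per-column offsets of the column-scanned map.
          col_offsets = np.median(map_cols - clean_rows, axis=0, keepdims=True)
          clean_cols = map_cols - col_offsets
          return 0.5 * (clean_rows + clean_cols)

      # Usage: a synthetic source field with artificial scanning stripes added.
      rng = np.random.default_rng(1)
      field = np.outer(np.hanning(128), np.hanning(128))
      scan_a = field + rng.normal(0, 0.02, size=(128, 1))   # row-wise offsets
      scan_b = field + rng.normal(0, 0.02, size=(1, 128))   # column-wise offsets
      combined = combine_orthogonal_scans(scan_a, scan_b)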

  9. Execution models for mapping programs onto distributed memory parallel computers

    NASA Technical Reports Server (NTRS)

    Sussman, Alan

    1992-01-01

    The problem of exploiting the parallelism available in a program to efficiently employ the resources of the target machine is addressed. The problem is discussed in the context of building a mapping compiler for a distributed memory parallel machine. The paper describes using execution models to drive the process of mapping a program in the most efficient way onto a particular machine. Through analysis of the execution models for several mapping techniques for one class of programs, we show that the selection of the best technique for a particular program instance can make a significant difference in performance. On the other hand, the results of benchmarks from an implementation of a mapping compiler show that our execution models are accurate enough to select the best mapping technique for a given program.

  10. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    PubMed

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and the graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing, by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. The paper concludes by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
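
    As a concrete illustration of the Map and Reduce tasks described above, the following single-process sketch counts diagnosis codes in toy clinical records; a real deployment would run the same map/shuffle/reduce logic distributed across a Hadoop cluster (for example via Hadoop Streaming), which this sketch does not attempt to reproduce, and the record contents are invented.

      # Single-process illustration of the MapReduce pattern: count diagnosis codes.
      from collections import defaultdict

      records = [
          {"patient": "A", "codes": ["I10", "E11"]},
          {"patient": "B", "codes": ["E11"]},
          {"patient": "C", "codes": ["I10", "J45", "E11"]},
      ]

      # Map phase: each record is independently turned into (key, 1) pairs,
      # which is what allows records to be processed in parallel across nodes.
      def map_record(record):
          return [(code, 1) for code in record["codes"]]

      # Shuffle phase: group the intermediate pairs by key.
      grouped = defaultdict(list)
      for record in records:
          for key, value in map_record(record):
              grouped[key].append(value)

      # Reduce phase: aggregate the values collected for each key.
      def reduce_counts(key, values):
          return key, sum(values)

      counts = dict(reduce_counts(k, v) for k, v in grouped.items())
      print(counts)  # {'I10': 2, 'E11': 3, 'J45': 1}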

  11. A High Density Consensus Genetic Map of Tetraploid Cotton That Integrates Multiple Component Maps through Molecular Marker Redundancy Check

    PubMed Central

    Blenda, Anna; Fang, David D.; Rami, Jean-François; Garsmeur, Olivier; Luo, Feng; Lacape, Jean-Marc

    2012-01-01

    A consensus genetic map of tetraploid cotton was constructed using six high-density maps and after the integration of a sequence-based marker redundancy check. Public cotton SSR libraries (17,343 markers) were curated for sequence redundancy using 90% as a similarity cutoff. As a result, 20% of the markers (3,410) could be considered redundant with other markers. The marker redundancy information was a crucial part of the map integration process, in which the six most informative interspecific Gossypium hirsutum × G. barbadense genetic maps were used to assemble a high-density consensus (HDC) map for tetraploid cotton. With redundant markers removed, the HDC map could be constructed thanks to a sufficient number of collinear non-redundant markers in common between the component maps. The HDC map consists of 8,254 loci, originating from 6,669 markers, and spans 4,070 cM, with an average of 2 loci per cM. The HDC map presents a high rate of locus duplication, as 1,292 of the 6,669 markers were mapped to more than one locus. Two thirds of the duplications bridge the homoeologous AT and DT chromosomes constitutive of the allopolyploid cotton genome, with an average of 64 duplications per AT/DT chromosome pair. Sequences of 4,744 mapped markers were used for a mutual BLAST alignment (BBMH) with the 13 major scaffolds of the recently released Gossypium raimondii genome, indicating a high level of homology between the diploid D genome and the tetraploid cotton genetic map, with only a few minor possible structural rearrangements. Overall, the HDC map will serve as a valuable resource for trait QTL comparative mapping, map-based cloning of important genes, and a better understanding of the genome structure and evolution of tetraploid cotton. PMID:23029214
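
    The 90% sequence-similarity cutoff for marker redundancy can be illustrated with a short sketch; note that the similarity function below is a naive position-wise identity over the shorter sequence, standing in for the alignment-based similarity the study actually used, and the marker names and sequences are invented.

      # Hypothetical marker redundancy grouping with a 90% similarity cutoff.
      def similarity(seq_a, seq_b):
          n = min(len(seq_a), len(seq_b))
          matches = sum(1 for a, b in zip(seq_a[:n], seq_b[:n]) if a == b)
          return matches / n

      def redundancy_groups(markers, cutoff=0.90):
          """markers: dict of name -> sequence; returns a list of groups of names."""
          groups = []
          for name, seq in markers.items():
              for group in groups:
                  if similarity(seq, markers[group[0]]) >= cutoff:
                      group.append(name)   # redundant with the group representative
                      break
              else:
                  groups.append([name])    # starts a new, non-redundant group
          return groups

      markers = {
          "SSR001": "ACGTACGTACGTACGTACGT",
          "SSR002": "ACGTACGTACGTACGTACGA",   # 95% identical to SSR001 -> redundant
          "SSR003": "TTTTGGGGCCCCAAAATTTT",
      }
      print(redundancy_groups(markers))  # [['SSR001', 'SSR002'], ['SSR003']]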

  12. Recent Geologic Mapping Results for the Polar Regions of Mars

    NASA Technical Reports Server (NTRS)

    tanaka, K. L.; Kolb, E. J.

    2008-01-01

    The polar regions of Mars include the densest data coverage for the planet because of the polar orbits of MGS, ODY, and MEX. Because the geology of the polar plateaus has been among the most dynamic on the planet in recent geologic time, the data enable the most detailed and complex geologic investigations of any regions on Mars, superseding previous, even recent, mapping efforts [e.g., 1-3]. Geologic mapping at regional and local scales is revealing that the stratigraphy and modificational histories of polar materials by various processes are highly complex at both poles. Here, we describe some of our recent results in polar geologic mapping and how they address the geologic processes involved and implications for polar climate history.

  13. Analysis of Large-Scale Resurfacing Processes on Mercury: Mapping the Derain (H-10) Quadrangle

    NASA Astrophysics Data System (ADS)

    Whitten, J. L.; Ostrach, L. R.; Fassett, C. I.

    2018-05-01

    The Derain (H-10) Quadrangle of Mercury contains a large region of "average" crustal materials, with minimal smooth plains and basin ejecta, allowing the relative contribution of volcanic and impact processes to be assessed through geologic mapping.

  14. Process for Generating Engine Fuel Consumption Map: Future Atkinson Engine with Cooled EGR and Cylinder Deactivation

    EPA Pesticide Factsheets

    This document summarizes the process followed to utilize GT-POWER modeled engine and laboratory engine dyno test data to generate a full engine fuel consumption map which can be used by EPA's ALPHA vehicle simulations.
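
    The document referenced here describes the EPA process only at a high level, so the sketch below is merely a generic illustration of one step such a process could involve: interpolating scattered (speed, torque, fuel-flow) operating points from modeling and dyno testing onto a regular speed/torque grid to form a fuel consumption map. The numbers, grid, and use of scipy.interpolate.griddata are assumptions, not the ALPHA tool chain.

      # Generic fuel-map gridding from scattered operating points (illustration only).
      import numpy as np
      from scipy.interpolate import griddata

      # Scattered points: engine speed [rpm], brake torque [Nm], fuel flow [g/s].
      speed  = np.array([1000, 1000, 2000, 2000, 3000, 3000, 1500, 2500])
      torque = np.array([  50,  150,   50,  150,   50,  150,  100,  100])
      fuel   = np.array([ 0.4,  1.1,  0.9,  2.3,  1.5,  3.6,  0.9,  1.7])

      # Regular grid covering the mapped operating range.
      grid_speed, grid_torque = np.meshgrid(np.linspace(1000, 3000, 21),
                                            np.linspace(50, 150, 11))
      fuel_map = griddata((speed, torque), fuel,
                          (grid_speed, grid_torque), method="linear")
      # fuel_map[i, j] is the interpolated fuel flow at grid_speed[i, j], grid_torque[i, j].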

  15. Process mapping as a framework for performance improvement in emergency general surgery.

    PubMed

    DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad

    2017-12-01

    Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.

  17. Searching for the missing pieces between the hospital and primary care: mapping the patient process during care transitions.

    PubMed

    Johnson, Julie K; Farnan, Jeanne M; Barach, Paul; Hesselink, Gijs; Wollersheim, Hub; Pijnenborg, Loes; Kalkman, Cor; Arora, Vineet M

    2012-12-01

    Safe patient transitions depend on effective communication and a functioning care coordination process. Evidence suggests that primary care physicians are not satisfied with communication at transition points between inpatient and ambulatory care, and that communication often is not provided in a timely manner, omits essential information, or contains ambiguities that put patients at risk. Our aim was to demonstrate how process mapping can illustrate current handover practices between ambulatory and inpatient care settings, identify existing barriers and facilitators to effective transitions of care, and highlight potential areas for quality improvement. We conducted focus group interviews to facilitate a process mapping exercise with clinical teams in six academic health centres in the USA, Poland, Sweden, Italy, Spain and the Netherlands. At a high level, the processes of patient admission to the hospital through the emergency department, inpatient care, and discharge back into the community were comparable across sites. In addition, the process maps highlighted similar barriers to providing information to primary care physicians, inaccurate or incomplete information on referral and discharge, a lack of time and priority to collaborate with counterpart colleagues, and a lack of feedback to clinicians involved in the handovers. Process mapping is effective in bringing together key stakeholders and makes explicit the mental models that frame their understanding of the clinical process. Exploring the barriers and facilitators to safe and reliable patient transitions highlights opportunities for further improvement work and illustrates ideas for best practices that might be transferrable to other settings.

  18. Prior Knowledge Activation: How Different Concept Mapping Tasks Lead to Substantial Differences in Cognitive Processes, Learning Outcomes, and Perceived Self-Efficacy

    ERIC Educational Resources Information Center

    Gurlitt, Johannes; Renkl, Alexander

    2010-01-01

    Two experiments investigated the effects of characteristic features of concept mapping used for prior knowledge activation. Characteristic demands of concept mapping include drawing connecting lines that represent the relationships between concepts and labeling these lines to specify the type of semantic relationship. In the first experiment,…

  19. Wood transportation systems-a spin-off of a computerized information and mapping technique

    Treesearch

    William W. Phillips; Thomas J. Corcoran

    1978-01-01

    A computerized mapping system originally developed for planning the control of the spruce budworm in Maine has been extended into a tool for planning road-network development and optimizing transportation costs. A budgetary process and a mathematical linear programming routine are used interactively with the mapping and information retrieval capabilities of the system...

  20. Mapping marine habitat suitability and uncertainty of Bayesian networks: a case study using Pacific benthic macrofauna

    Treesearch

    Andrea Havron; Chris Goldfinger; Sarah Henkel; Bruce G. Marcot; Chris Romsos; Lisa Gilbane

    2017-01-01

    Resource managers increasingly use habitat suitability map products to inform risk management and policy decisions. Modeling habitat suitability of data-poor species over large areas requires careful attention to assumptions and limitations. Resulting habitat suitability maps can harbor uncertainties from data collection and modeling processes; yet these limitations...

  1. A Method of Surrogate Model Construction which Leverages Lower-Fidelity Information using Space Mapping Techniques

    DTIC Science & Technology

    2014-03-27

    fidelity. This pairing is accomplished through the use of a space mapping technique, which is a process where the design space of a lower fidelity model...is aligned with that of a higher fidelity model. The intent of applying space mapping techniques to the field of surrogate construction is to leverage the

  2. Development of an Intervention Map for a Parent Education Intervention to Prevent Violence Among Hispanic Middle School Students.

    ERIC Educational Resources Information Center

    Murray, Nancy; Kelder, Steve; Parcel, Guy; Orpinas, Pamela

    1998-01-01

    Describes development of an intervention program for Hispanic parents to reduce violence by increased monitoring of their middle school students. Program development used a five-step guided intervention mapping process. Student surveys and parent interviews provided data to inform program design. Intervention mapping ensured involvement with the…

  3. Using Concept Mapping as a Tool for Program Theory Development

    ERIC Educational Resources Information Center

    Orsi, Rebecca

    2011-01-01

    The purpose of this methodological study is to explore how well a process called "concept mapping" (Trochim, 1989) can articulate the theory which underlies a social program. Articulation of a program's theory is a key step in completing a sound theory-based evaluation (Weiss, 1997a). In this study, concept mapping is used to…

  4. Conceptual Maps for Training Tutors in the Distance Learning of Business Administration Course

    ERIC Educational Resources Information Center

    Mendes, Elise; Jordão de Carvalho, Claudinê; Gargiulo, Victor; da Mota Alves, João Bosco

    2014-01-01

    This article reports on the process of training tutors for the planning of distance education in the undergraduate Administration course at the Federal University of Uberlandia, Brazil. It describes participatory research on training tutors in the use of concept mapping (CM) and concept mapping software to encourage individual…

  5. How similar are forest disturbance maps derived from different Landsat time series algorithms?

    Treesearch

    Warren B. Cohen; Sean P. Healey; Zhiqiang Yang; Stephen V. Stehman; C. Kenneth Brewer; Evan B. Brooks; Noel Gorelick; Chengqaun Huang; M. Joseph Hughes; Robert E. Kennedy; Thomas R. Loveland; Gretchen G. Moisen; Todd A. Schroeder; James E. Vogelmann; Curtis E. Woodcock; Limin Yang; Zhe Zhu

    2017-01-01

    Disturbance is a critical ecological process in forested systems, and disturbance maps are important for understanding forest dynamics. Landsat data are a key remote sensing dataset for monitoring forest disturbance and there recently has been major growth in the development of disturbance mapping algorithms. Many of these algorithms take advantage of the high temporal...

  6. Effects of Concept Mapping Strategy on Learning Performance in Business and Economics Statistics

    ERIC Educational Resources Information Center

    Chiou, Chei-Chang

    2009-01-01

    A concept map (CM) is a hierarchically arranged, graphic representation of the relationships among concepts. Concept mapping (CMING) is the process of constructing a CM. This paper examines whether a CMING strategy can be useful in helping students to improve their learning performance in a business and economics statistics course. A single…

  7. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    PubMed

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function was used to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of reconstruction time is found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^-7. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. An ultrafast, reliable and scalable 4D CBCT/CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment.
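
    The map/reduce decomposition described above can be pictured with a small sketch. This is not the authors' Hadoop code: the ramp filter is crude, the backprojection step is a placeholder, and the "cluster" is just a local process pool; only the pattern of mapping projection subsets to partial volumes and reducing them into one volume is shown.

      # Minimal sketch of the map/reduce pattern (assumptions: synthetic projections,
      # a crude ramp filter, a placeholder instead of true FDK backprojection,
      # and a local process pool standing in for the cluster).
      from functools import reduce
      from multiprocessing import Pool

      import numpy as np

      VOLUME_SHAPE = (64, 64, 64)  # assumed reconstruction grid

      def map_task(projection_subset):
          """Filter and backproject one subset of projections into a partial volume."""
          partial = np.zeros(VOLUME_SHAPE)
          for proj in projection_subset:
              ramp = np.abs(np.fft.fftfreq(proj.shape[0]))[:, None]
              filtered = np.fft.ifft(np.fft.fft(proj, axis=0) * ramp, axis=0).real
              partial += filtered.mean()  # placeholder for the true backprojection step
          return partial

      def reduce_task(vol_a, vol_b):
          """Aggregate two partial backprojections into one volume."""
          return vol_a + vol_b

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          projections = [rng.random((128, 128)) for _ in range(32)]  # fake gated projections
          subsets = [projections[i::4] for i in range(4)]            # four "map" shards
          with Pool(4) as pool:
              partial_volumes = pool.map(map_task, subsets)
          volume = reduce(reduce_task, partial_volumes)
          print(volume.shape)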

  8. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment

    PubMed Central

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-01-01

    Purpose: Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. Methods: In this work, we accelerated the Feldkamp–Davis–Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function was used to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Results: Speedup of reconstruction time is found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10−7. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. Conclusions: An ultrafast, reliable and scalable 4D CBCT/CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment. PMID:22149842

  9. Mapping patterns of change in emotion-focused psychotherapy: Implications for theory, research, practice, and training.

    PubMed

    Watson, Jeanne C

    2018-05-01

    An important objective in humanistic-experiential psychotherapies and particularly emotion-focused psychotherapy (EFT) is to map patterns of change. Effective mapping of the processes and pathways of change requires that in-session processes be linked to in-session resolutions, immediate post-session changes, intermediate outcome, final therapy outcome, and longer-term change. This is a challenging and long-term endeavour. Fine-grained descriptions of in-session processes that lead to resolution of specific interpersonal and intrapersonal issues linked with longer-term outcomes are the foundation of EFT, the process-experiential approach. In this paper, evidence in support of EFT as a treatment approach will be reviewed along with research on two mechanisms of change, viewed as central to EFT, clients' emotional processing and the therapeutic relationship conditions. The implications for psychotherapy research are discussed. Given the methodological constraints, there is a need for more innovative methodologies and strategies to investigate specific psychotherapy processes within and across different approaches to map patterns and mechanisms of change to enhance theory, research, practice, and training.

  10. Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.

    PubMed

    Ivory, Catherine H

    2016-07-01

    The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.

  11. A perturbation method to the tent map based on Lyapunov exponent and its application

    NASA Astrophysics Data System (ADS)

    Cao, Lv-Chen; Luo, Yu-Ling; Qiu, Sen-Hui; Liu, Jun-Xiu

    2015-10-01

    Perturbation imposed on a chaotic system is an effective way to maintain its chaotic features. A novel parameter perturbation method for the tent map based on the Lyapunov exponent is proposed in this paper. The pseudo-random sequence generated by the tent map is sent to another chaotic map, the Chebyshev map, for post-processing. If the output value of the Chebyshev map falls into a certain range, it is sent back to replace the parameter of the tent map. As a result, the parameter of the tent map keeps changing dynamically. Statistical analysis and experimental results show that the disturbed tent map has a highly random distribution and achieves good cryptographic properties of a pseudo-random sequence. This weakens the strong-correlation phenomenon caused by finite precision and effectively compensates for the dynamics degradation of the digital chaotic system. Project supported by the Guangxi Provincial Natural Science Foundation, China (Grant No. 2014GXNSFBA118271), the Research Project of Guangxi University, China (Grant No. ZD2014022), the Fund from Guangxi Provincial Key Laboratory of Multi-source Information Mining & Security, China (Grant No. MIMS14-04), the Fund from the Guangxi Provincial Key Laboratory of Wireless Wideband Communication & Signal Processing, China (Grant No. GXKL0614205), the Education Development Foundation and the Doctoral Research Foundation of Guangxi Normal University, the State Scholarship Fund of China Scholarship Council (Grant No. [2014]3012), and the Innovation Project of Guangxi Graduate Education, China (Grant No. YCSZ2015102).
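
    The feedback scheme described above lends itself to a short sketch. The feedback range, parameter rescaling, and Chebyshev order below are illustrative assumptions rather than the authors' values; only the structure (tent-map output pushed through a Chebyshev map, with in-range values fed back as the new tent-map parameter) follows the abstract.

      # Illustrative sketch of the perturbation scheme (assumed feedback range,
      # rescaling, and Chebyshev order; not the paper's exact parameters).
      import math

      def tent(x, p):
          """Tent map on [0, 1] with parameter p in (0, 1)."""
          return x / p if x < p else (1.0 - x) / (1.0 - p)

      def chebyshev(x, k=4):
          """Chebyshev map of order k on [-1, 1]."""
          return math.cos(k * math.acos(x))

      def perturbed_tent_sequence(x0=0.37, p=0.499, n=10):
          seq, x = [], x0
          for _ in range(n):
              x = tent(x, p)
              seq.append(x)
              c = chebyshev(2.0 * x - 1.0)       # rescale [0, 1] -> [-1, 1]
              if 0.1 < abs(c) < 0.9:             # assumed feedback range
                  p = 0.25 + 0.5 * abs(c)        # replace the tent-map parameter
          return seq

      print(perturbed_tent_sequence())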

  12. Enhancing The National Map Through Tactical Planning and Performance Monitoring

    USGS Publications Warehouse

    ,

    2008-01-01

    Tactical planning and performance monitoring are initial steps toward improving 'the way The National Map works' and supporting the U.S. Geological Survey (USGS) Science Strategy. This Tactical Performance Planning Summary for The National Map combines information from The National Map 2.0 Tactical Plan and The National Map Performance Milestone Matrix. The National Map 2.0 Tactical Plan is primarily a working document to guide The National Map program's execution, production, and metrics monitoring for fiscal years (FY) 2008 and 2009. The Tactical Plan addresses data, products, and services, as well as supporting and enabling activities. The National Map's 2-year goal for FY 2008 and FY 2009 is to provide a range of geospatial products and services that further the National Spatial Data Infrastructure and underpin USGS science. To do this, the National Geospatial Program will develop a renewed understanding during FY 2008 of key customer needs and requirements, develop the infrastructure to support The National Map business model, modernize its business processes, and reengineer its workforce. Priorities for The National Map will be adjusted if necessary to respond to changes to the project that may impact resources, constrain timeframes, or change customer needs. The supporting and enabling activities that make it possible to produce the products and services of The National Map will include partnership activities, improved compatibility of systems, outreach, and integration of data themes.

  13. Transfer of an implied incompatible spatial mapping to a Simon task.

    PubMed

    Luo, Chunming; Proctor, Robert W

    2016-02-01

    When location words left and right are presented in left and right locations and mapped to left and right keypress responses in the Hedge and Marsh (1975) task (Arend & Wandmacher, 1987), a compatible mapping of words to responses yields a benefit for stimulus-response location correspondence (sometimes called the Simon effect), whereas an incompatible mapping yields a benefit for noncorrespondence (called the Hedge and Marsh reversal). Experiment 1 replicated the correspondence benefit and its reversal by using the Chinese location words 左 (left) and 右 (right) in the Hedge and Marsh task. Experiments 2 and 3 examined whether the tendency to respond with the noncorresponding response when the mapping is incompatible transfers to the task version in which the mapping is compatible, and Experiment 4 examined whether transfer similarly occurs from the compatible mapping to the task version with incompatible mapping. Transfer of the incompatible relation was apparent in a lack of correspondence benefit when the mapping was changed to compatible, but transfer of the compatible relation to the incompatible mapping did not occur. The results suggest that an association between noncorresponding stimulus-response locations is acquired when the word-response mapping is incompatible, even though this relation is only implicit, regardless of whether it arises through misapplication of a logical recoding rule or through spatial representations shared by the locations and words. These associations then continue to affect processing of location when the mapping is compatible.

  14. Perfusion weighted imaging and its application in stroke

    NASA Astrophysics Data System (ADS)

    Li, Enzhong; Tian, Jie; Han, Ying; Wang, Huifang; Li, Xingfeng; Zhu, Fuping

    2003-05-01

    To study the technique and application of perfusion weighted imaging (PWI) in the diagnosis and medical treatment of acute stroke, 25 patients were examined by a 1.5 T or 1.0 T MRI scanner. Data analysis was done with the "3D Med System" developed by our laboratory to process the data and obtain apparent diffusion coefficient (ADC), cerebral blood volume (CBV), cerebral blood flow (CBF) and mean transit time (MTT) maps. In the acute stage of stroke, normal or slightly hypointense signal in T1-weighted images and hyperintense signal in T2- and diffusion-weighted images were seen in the cerebral infarction areas. There was hypointensity in the CBV, CBF and ADC maps and hyperintensity in the MTT map, indicating that the infarct area could still be salvaged. If the hyperintense area in the MTT map was larger than the abnormal area in diffusion weighted imaging (DWI), the difference was called the penumbra and could be treated by an appropriate thrombolytic or other therapy. The CBV, CBF and MTT maps are very important in the diagnosis and medical treatment of acute, especially hyperacute, stroke. By comparison with DWI, we can readily assess the penumbra and the effect of curative therapy. We can also make a differential diagnosis with this method.

  15. Benthic Habitat Mapping by Combining Lyzenga’s Optical Model and Relative Water Depth Model in Lintea Island, Southeast Sulawesi

    NASA Astrophysics Data System (ADS)

    Hafizt, M.; Manessa, M. D. M.; Adi, N. S.; Prayudha, B.

    2017-12-01

    Benthic habitat mapping using satellite data is a challenging task for practitioners and academics, as benthic objects are covered by a light-attenuating water column that obscures object discrimination. One common method to reduce this water-column effect is to use a depth-invariant index (DII) image. However, applying the correction in shallow coastal areas is challenging because a dark object such as seagrass can have a very low pixel value, preventing its reliable identification and classification. This limitation can be addressed by applying the classification process separately to areas with different water depth levels. The water depth level can be extracted from satellite imagery using a Relative Water Depth Index (RWDI). This study proposes a new approach to improve mapping accuracy, particularly for dark benthic objects, by combining the DII of Lyzenga's water column correction method and the RWDI of Stumpf's method. The research was conducted on Lintea Island, which has high variation in benthic cover, using Sentinel-2A imagery. To assess the effectiveness of the proposed approach for benthic habitat mapping, two classification procedures were implemented. The first is the commonly applied method, in which the DII image is used as input for classifying the whole coastal area regardless of depth variation. The second is the proposed approach, which begins by separating the study area into shallow and deep waters using the RWDI image; the shallow area is then classified using the sunglint-corrected image as input and the deep area using the DII image. The final classification maps of the two areas were merged into a single benthic habitat map, and a confusion matrix was applied to evaluate its accuracy. The results show that the proposed mapping approach can map all benthic objects across all depth ranges and achieves better accuracy than the classification map produced using the DII alone.
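
    For readers unfamiliar with the two indices, a minimal NumPy sketch follows. The deep-water radiances, the attenuation-coefficient ratio used in the Lyzenga-style index, and the scaling constant in the Stumpf-style ratio are placeholder values; in practice they are estimated from the image itself (for example, from sand pixels at varying depth).

      # Minimal sketch of the two indices combined in this approach (placeholder
      # deep-water radiances, attenuation ratio, and scaling constant; synthetic bands).
      import numpy as np

      def depth_invariant_index(blue, green, deep_blue=0.02, deep_green=0.015, k_ratio=0.8):
          """Lyzenga-style depth-invariant index for a blue/green band pair."""
          return (np.log(np.clip(blue - deep_blue, 1e-6, None)) -
                  k_ratio * np.log(np.clip(green - deep_green, 1e-6, None)))

      def relative_water_depth(blue, green, n=1000.0):
          """Stumpf-style band-ratio proxy for relative water depth."""
          return np.log(n * blue) / np.log(n * green)

      rng = np.random.default_rng(0)
      blue = rng.uniform(0.03, 0.2, (100, 100))      # fake water-leaving reflectance bands
      green = rng.uniform(0.03, 0.2, (100, 100))
      dii = depth_invariant_index(blue, green)
      rwd = relative_water_depth(blue, green)
      shallow_mask = rwd < np.median(rwd)             # split the study area by relative depth
      print(dii.shape, shallow_mask.mean())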

  16. Geologic map of the Haji-Gak iron deposit, Bamyan Province, Afghanistan, modified from the 1965 original map compilation of M.S. Smirnov and I.K. Kusov

    USGS Publications Warehouse

    Renaud, Karine M.; Tucker, Robert D.; Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geological-structural map of Hajigak iron-ore deposit, scale 1:10,000, which was compiled by M.S. Smirnov and I.K. Kusov in 1965. (Refer to the References Cited section in the Map PDF for complete citations of the original map and a related report.) USGS scientists, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original documents and also visited the field area in November 2009. This modified map illustrates the geological structure of the Haji-Gak iron deposit and includes cross sections of the same area. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross sections and includes modifications based on our examination of these documents. Elevations on the cross sections are derived from the original Soviet topography and may not match the newer topography used on the current map. We have attempted to translate the original Russian terminology and rock classification into modern English geologic usage as literally as possible without changing any genetic or process-oriented implications in the original descriptions. We also use the age designations from the original map. The unit colors on the map and cross sections differ from the colors shown on the original version. The units are colored according to the color and pattern scheme of the Commission for the Geological Map of the World (CGMW) (http://www.ccgm.org).

  17. Passive language mapping combining real-time oscillation analysis with cortico-cortical evoked potentials for awake craniotomy.

    PubMed

    Tamura, Yukie; Ogawa, Hiroshi; Kapeller, Christoph; Prueckl, Robert; Takeuchi, Fumiya; Anei, Ryogo; Ritaccio, Anthony; Guger, Christoph; Kamada, Kyousuke

    2016-12-01

    OBJECTIVE Electrocortical stimulation (ECS) is the gold standard for functional brain mapping; however, precise functional mapping is still difficult in patients with language deficits. High gamma activity (HGA) between 80 and 140 Hz on electrocorticography is assumed to reflect localized cortical processing, whereas the cortico-cortical evoked potential (CCEP) can reflect bidirectional responses evoked by monophasic pulse stimuli to the language cortices when there is no patient cooperation. The authors propose the use of "passive" mapping by combining HGA mapping and CCEP recording without active tasks during conscious resections of brain tumors. METHODS Five patients, each with an intraaxial tumor in their dominant hemisphere, underwent conscious resection of their lesion with passive mapping. The authors performed functional localization for the receptive language area, using real-time HGA mapping, by listening passively to linguistic sounds. Furthermore, single electrical pulses were delivered to the identified receptive temporal language area to detect CCEPs in the frontal lobe. All mapping results were validated by ECS, and the sensitivity and specificity were evaluated. RESULTS Linguistic HGA mapping quickly identified the language area in the temporal lobe. Electrical stimulation by linguistic HGA mapping to the identified temporal receptive language area evoked CCEPs on the frontal lobe. The combination of linguistic HGA and frontal CCEPs needed no patient cooperation or effort. In this small case series, the sensitivity and specificity were 93.8% and 89%, respectively. CONCLUSIONS The described technique allows for simple and quick functional brain mapping with higher sensitivity and specificity than ECS mapping. The authors believe that this could improve the reliability of functional brain mapping and facilitate rational and objective operations. Passive mapping also sheds light on the underlying physiological mechanisms of language in the human brain.

  18. Proteomic analysis to investigate color changes of chilled beef longissimus steaks held under carbon monoxide and high oxygen packaging.

    PubMed

    Yang, Xiaoyin; Wu, Shuang; Hopkins, David L; Liang, Rongrong; Zhu, Lixian; Zhang, Yimin; Luo, Xin

    2018-08-01

    This study investigated the proteome basis for color stability variations in beef steaks packaged under two modified atmosphere packaging (MAP) methods: HiOx-MAP (80% O2/20% CO2) and CO-MAP (0.4% CO/30% CO2/69.6% N2) during 15 days of storage. The color stability, pH, and sarcoplasmic proteome analysis of steaks were evaluated on days 0, 5, 10 and 15 of storage. Proteomic results revealed that the differential expression of the sarcoplasmic proteome during storage contributed to the variations in meat color stability between the two MAP methods. Compared with HiOx-MAP steaks, some glycolytic and energy metabolic enzymes important in NADH regeneration and antioxidant processes, antioxidant peroxiredoxins (thioredoxin-dependent peroxide reductase, peroxiredoxin-2, peroxiredoxin-6) and protein DJ-1 were more abundant in CO-MAP steaks. The over-expression of these proteins could induce CO-MAP steaks to maintain high levels of metmyoglobin reducing activity and oxygen consumption rate, resulting in CO-MAP steaks exhibiting better color stability than HiOx-MAP steaks during storage.

  19. Snake River Plain Geothermal Play Fairway Analysis - Phase 1 Raster Files

    DOE Data Explorer

    John Shervais

    2015-10-09

    Snake River Plain Play Fairway Analysis - Phase 1 CRS Raster Files. This dataset contains raster files created in ArcGIS. These raster images depict Common Risk Segment (CRS) maps for HEAT, PERMEABILITY, AND SEAL, as well as selected maps of Evidence Layers. These evidence layers consist of either Bayesian krige functions or kernel density functions, and include: (1) HEAT: Heat flow (Bayesian krige map), heat flow standard error on the krige function (data confidence), volcanic vent distribution as a function of age and size, groundwater temperature (equivalue interval and natural breaks bins), and groundwater temperature standard error. (2) PERMEABILITY: Fault and lineament maps, both as mapped and as kernel density functions, processed for both dilational tendency (TD) and slip tendency (ST), along with data confidence maps for each data type. Data types include mapped surface faults from USGS and Idaho Geological Survey databases, as well as unpublished mapping; lineations derived from maximum gradients in magnetic, deep gravity, and intermediate-depth gravity anomalies. (3) SEAL: Seal maps based on presence and thickness of lacustrine sediments and the base of the SRP aquifer. Raster cell size is 2 km. All files were generated in ArcGIS.

  20. a Mapping Method of Slam Based on Look up Table

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Li, J.; Wang, A.; Wang, J.

    2017-09-01

    In recent years several V-SLAM (Visual Simultaneous Localization and Mapping) approaches have appeared, showing impressive reconstructions of the world. However, these maps are built with far more than the required information, a limitation that comes from processing each key-frame in full. In this paper we present, for the first time, a mapping method based on a look-up table (LUT) for visual SLAM that can improve mapping effectively. Because the method relies on extracting features in each cell into which the image is divided, it can obtain a camera pose that is more representative of the whole key-frame. The tracking direction of key-frames is obtained by counting the number of parallax directions of the feature points. The LUT stores, for each tracking direction, the number of cells needed for mapping, which reduces the redundant information in the key-frame and makes mapping more efficient. The results show that a better map with less noise is built in less than one-third of the time. We believe that the capacity of the LUT to build maps efficiently makes it a good choice for the community to investigate for scene reconstruction problems.

  1. 77 FR 21991 - Federal Housing Administration (FHA): Multifamily Accelerated Processing (MAP)-Lender and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-12

    ... Administration (FHA): Multifamily Accelerated Processing (MAP)--Lender and Underwriter Eligibility Criteria and....gov . FOR FURTHER INFORMATION CONTACT: Terry W. Clark, Office of Multifamily Development, Office of... qualifications could underwrite loans involving more complex multifamily housing programs and transactions. II...

  2. Development and comparison of processing maps of Mg-3Sn-1Ca alloy from data obtained in tension versus compression

    NASA Astrophysics Data System (ADS)

    Rao, K. P.; Suresh, K.; Prasad, Y. V. R. K.; Hort, N.

    2018-01-01

    The hot workability of extruded Mg-3Sn-1Ca alloy has been evaluated by developing processing maps with flow stress data from compression and tensile tests, with a view to finding the effect of the applied state-of-stress. The processing maps developed at a strain of 0.2 are essentially similar irrespective of the mode of deformation (compression or tension) and exhibit three domains in the temperature ranges (1) 350 - 425 °C and (2) 450 - 550 °C, both occurring at lower strain rates, and (3) 400 - 500 °C, occurring at higher strain rates. In all three domains, dynamic recrystallization occurs; it is caused by non-basal slip and is controlled by lattice self-diffusion in the first and second domains and by grain boundary self-diffusion in the third domain. The state-of-stress imposed on the specimen (compression or tension) does not have any significant effect on the processing maps.
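
    The abstract does not spell out how a processing map is computed; the sketch below assumes the standard dynamic-materials-model definitions commonly used in processing-map work (strain-rate sensitivity m = dln(sigma)/dln(strain rate), power-dissipation efficiency eta = 2m/(m+1)) and uses synthetic flow-stress values rather than the study's data.

      # Hedged sketch of a processing-map efficiency calculation (assumed
      # dynamic-materials-model formulas; flow stresses below are synthetic).
      import numpy as np

      strain_rates = np.logspace(-3, 1, 5)                   # 1/s
      temperatures = np.array([350.0, 400, 450, 500, 550])   # deg C
      # synthetic flow stress sigma(T, strain rate) at a fixed strain of 0.2, in MPa
      sigma = 80.0 * np.exp(-0.004 * temperatures)[:, None] * strain_rates[None, :] ** 0.15

      m = np.gradient(np.log(sigma), np.log(strain_rates), axis=1)  # strain-rate sensitivity
      efficiency = 2.0 * m / (m + 1.0)                               # dissipation efficiency

      for T, row in zip(temperatures, efficiency):
          print(f"{T:5.0f} C  eta = {np.round(row, 3)}")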

  3. The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Zhou, Liqing

    2015-12-01

    With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional remote sensing image segmentation techniques cannot meet the requirements of massive remote sensing image processing and storage. This article applies cloud computing and parallel computing technology to the remote sensing image segmentation process and builds a cheap and efficient computer cluster system that uses parallel processing to implement the mean shift algorithm for remote sensing image segmentation based on the MapReduce model. This not only ensures the quality of remote sensing image segmentation but also improves segmentation speed and better meets real-time requirements. The parallel mean shift segmentation algorithm based on MapReduce is therefore of practical significance and value.
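
    A compact way to picture the per-tile parallelism is sketched below. This is not the paper's Hadoop implementation: each "map" task simply runs mean shift on the pixel values of one synthetic image tile, and the "reduce" step just collects the tile label maps; the tile sizes and bandwidth are assumptions.

      # Sketch of mean-shift segmentation applied tile by tile, standing in for the
      # MapReduce decomposition (synthetic tiles, illustrative bandwidth).
      import numpy as np
      from sklearn.cluster import MeanShift

      def segment_tile(tile):
          """Mean-shift clustering of a tile's pixel values -> label image."""
          h, w, c = tile.shape
          labels = MeanShift(bandwidth=0.2, bin_seeding=True).fit_predict(tile.reshape(-1, c))
          return labels.reshape(h, w)

      rng = np.random.default_rng(4)
      tiles = [np.clip(rng.normal(loc, 0.05, (32, 32, 3)), 0, 1)   # fake image tiles
               for loc in (0.2, 0.5, 0.8)]
      label_maps = [segment_tile(t) for t in tiles]                # "map" phase
      print([int(lm.max()) + 1 for lm in label_maps], "segments per tile")  # "reduce": collect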

  4. Knowledge mapping as a technique to support knowledge translation.

    PubMed Central

    Ebener, S.; Khan, A.; Shademani, R.; Compernolle, L.; Beltran, M.; Lansang, Ma; Lippman, M.

    2006-01-01

    This paper explores the possibility of integrating knowledge mapping into a conceptual framework that could serve as a tool for understanding the many complex processes, resources and people involved in a health system, and for identifying potential gaps within knowledge translation processes in order to address them. After defining knowledge mapping, this paper presents various examples of the application of this process in health, before looking at the steps that need to be taken to identify potential gaps, to determine to what extent these gaps affect the knowledge translation process and to establish their cause. This is followed by proposals for interventions aimed at strengthening the overall process. Finally, potential limitations on the application of this framework at the country level are addressed. PMID:16917651

  5. Granger-causality maps of diffusion processes.

    PubMed

    Wahl, Benjamin; Feudel, Ulrike; Hlinka, Jaroslav; Wächter, Matthias; Peinke, Joachim; Freund, Jan A

    2016-02-01

    Granger causality is a statistical concept devised to reconstruct and quantify predictive information flow between stochastic processes. Although the general concept can be formulated model-free, it is often considered in the framework of linear stochastic processes. Here we show how local linear model descriptions can be employed to extend Granger causality into the realm of nonlinear systems. This novel treatment results in maps that resolve Granger causality in regions of state space. Through examples we provide a proof of concept and illustrate the utility of these maps. Moreover, by integration we convert the local Granger causality into a global measure that yields a consistent picture for a global Ornstein-Uhlenbeck process. Finally, we recover invariance transformations known from the theory of autoregressive processes.
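
    As background for the local-linear extension, the sketch below shows the standard global, linear Granger-causality statistic it builds on: the log ratio of the residual variance of a restricted autoregression (own past only) to that of a full autoregression (own past plus the other series' past). The lag order, data, and coupling strength are illustrative assumptions.

      # Minimal sketch of linear Granger causality GC(y -> x) = ln(var_restricted / var_full),
      # estimated by least squares at lag 1 on synthetic coupled series.
      import numpy as np

      def granger_causality(x, y, lag=1):
          """Return ln(residual var. of x on own past / residual var. on past of x and y)."""
          xt = x[lag:]
          x_past = np.column_stack([x[lag - k - 1: len(x) - k - 1] for k in range(lag)])
          y_past = np.column_stack([y[lag - k - 1: len(y) - k - 1] for k in range(lag)])
          ones = np.ones((len(xt), 1))
          restricted = np.hstack([ones, x_past])
          full = np.hstack([ones, x_past, y_past])
          var_r = np.var(xt - restricted @ np.linalg.lstsq(restricted, xt, rcond=None)[0])
          var_f = np.var(xt - full @ np.linalg.lstsq(full, xt, rcond=None)[0])
          return np.log(var_r / var_f)

      rng = np.random.default_rng(1)
      y = rng.standard_normal(2000)
      x = np.zeros(2000)
      for t in range(1, 2000):                   # x is driven by the past of y
          x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + 0.1 * rng.standard_normal()
      print(granger_causality(x, y), granger_causality(y, x))  # first value should be larger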

  6. Development and validation of a complementary map to enhance the existing 1998 to 2008 Abbreviated Injury Scale map

    PubMed Central

    2011-01-01

    Introduction Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Methods Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. Results The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. Conclusions The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available. PMID:21548991

  7. Development and validation of a complementary map to enhance the existing 1998 to 2008 Abbreviated Injury Scale map.

    PubMed

    Palmer, Cameron S; Franklyn, Melanie; Read-Allsopp, Christine; McLellan, Susan; Niggemeyer, Louise E

    2011-05-08

    Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available.

  8. Citizen-Scientist Digitization of a Complex Geologic Map of the McDowell Mountains (Scottsdale, Arizona).

    NASA Astrophysics Data System (ADS)

    Gruber, D.; Skotnicki, S.; Gootee, B.

    2016-12-01

    The work of citizen scientists has become very important to researchers doing field work and internet-based projects but has not been widely utilized in digital mapping. The McDowell Mountains - located in Scottsdale, Arizona, at the edge of the basin-and-range province and protected as part of the McDowell Sonoran Preserve - are geologically complex. Until recently, no comprehensive geologic survey of the entire range had been done. Over the last 9 years geologist Steven Skotnicki spent 2000 hours mapping the complex geology of the range. His work, born of personal interest and partially supported by the McDowell Sonoran Conservancy, resulted in highly detailed hand-drawn survey maps. Dr. Skotnicki's work provides important new information and raises interesting research questions about the geology of this range. Citizen scientists of the McDowell Sonoran Conservancy Field Institute digitized Dr. Skotnicki's maps. A team of 10 volunteers, trained in ArcMap digitization techniques and led by volunteer project leader Daniel Gruber, performed the digitization work. Technical oversight of mapping using ArcMap, including provision of USGS-based mapping toolbars, was provided by Arizona Geological Survey (AZGS) research geologist Brian Gootee. The map digitization process identified and helped resolve a number of mapping questions. The citizen-scientist team spent 900 hours on training, digitization, quality checking, and project coordination with support and review by Skotnicki and Gootee. The resulting digital map has approximately 3000 polygons, 3000 points, and 86 map units with complete metadata and unit descriptions. The finished map is available online through AZGS and can be accessed in the field on mobile devices. User location is shown on the map and metadata can be viewed with a tap. The citizen scientist map digitization team has made this important geologic information available to the public and accessible to other researchers quickly and efficiently.

  9. Scoping of Flood Hazard Mapping Needs for Penobscot County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background: The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program (MFMP), began scoping work in 2006 for Penobscot County. Scoping activities included assembling existing data and map needs information for communities in Penobscot County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database with information gathered during the scoping process. As of 2007, the average age of the FEMA floodplain maps in Penobscot County, Maine, is 22 years, based on the most recent revisions to the maps. Because the revisions did not affect all the map panels in each town, however, the true average age is probably more than 22 years. Many of the studies were published in the mid-1980s. Since the studies were completed, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.

  10. A large-area, spatially continuous assessment of land cover map error and its impact on downstream analyses.

    PubMed

    Estes, Lyndon; Chen, Peng; Debats, Stephanie; Evans, Tom; Ferreira, Stefanus; Kuemmerle, Tobias; Ragazzo, Gabrielle; Sheffield, Justin; Wolf, Adam; Wood, Eric; Caylor, Kelly

    2018-01-01

    Land cover maps increasingly underlie research into socioeconomic and environmental patterns and processes, including global change. It is known that map errors impact our understanding of these phenomena, but quantifying these impacts is difficult because many areas lack adequate reference data. We used a highly accurate, high-resolution map of South African cropland to assess (1) the magnitude of error in several current generation land cover maps, and (2) how these errors propagate in downstream studies. We first quantified pixel-wise errors in the cropland classes of four widely used land cover maps at resolutions ranging from 1 to 100 km, and then calculated errors in several representative "downstream" (map-based) analyses, including assessments of vegetative carbon stocks, evapotranspiration, crop production, and household food security. We also evaluated maps' spatial accuracy based on how precisely they could be used to locate specific landscape features. We found that cropland maps can have substantial biases and poor accuracy at all resolutions (e.g., at 1 km resolution, up to ∼45% underestimates of cropland (bias) and nearly 50% mean absolute error (MAE, describing accuracy); at 100 km, up to 15% underestimates and nearly 20% MAE). National-scale maps derived from higher-resolution imagery were most accurate, followed by multi-map fusion products. Constraining mapped values to match survey statistics may be effective at minimizing bias (provided the statistics are accurate). Errors in downstream analyses could be substantially amplified or muted, depending on the values ascribed to cropland-adjacent covers (e.g., with forest as adjacent cover, carbon map error was 200%-500% greater than in input cropland maps, but ∼40% less for sparse cover types). The average locational error was 6 km (600%). These findings provide deeper insight into the causes and potential consequences of land cover map error, and suggest several recommendations for land cover map users.
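
    The bias and MAE measures referred to above can be illustrated with a small sketch in which cropland fractions from a coarse map are compared against an aggregated high-resolution reference. Both grids below are synthetic stand-ins, not the study's data, and the aggregation block size is an assumption.

      # Illustrative bias / MAE comparison between a coarse cropland map and an
      # aggregated high-resolution reference (synthetic grids, assumed block size).
      import numpy as np

      rng = np.random.default_rng(2)
      reference = (rng.random((400, 400)) < 0.3).astype(float)                 # "truth" mask
      coarse = np.clip(reference + rng.normal(0, 0.4, reference.shape), 0, 1)  # noisy map

      def aggregate(grid, block=100):
          """Mean cropland fraction per block x block cell (coarser resolution)."""
          h, w = grid.shape
          return grid.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

      ref_frac, map_frac = aggregate(reference), aggregate(coarse)
      bias = np.mean(map_frac - ref_frac)          # systematic over/underestimate
      mae = np.mean(np.abs(map_frac - ref_frac))   # accuracy
      print(f"bias = {bias:+.3f}, MAE = {mae:.3f}")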

  11. Using participatory design to develop (public) health decision support systems through GIS.

    PubMed

    Dredger, S Michelle; Kothari, Anita; Morrison, Jason; Sawada, Michael; Crighton, Eric J; Graham, Ian D

    2007-11-27

    Organizations that collect substantial data for decision-making purposes are often characterized as being 'data rich' but 'information poor'. Maps and mapping tools can be very useful for research transfer in converting locally collected data into information. Challenges involved in incorporating GIS applications into the decision-making process within the non-profit (public) health sector include a lack of financial resources for software acquisition and training for non-specialists to use such tools. This ongoing project has two primary phases. This paper critically reflects on Phase 1: the participatory design (PD) process of developing a collaborative web-based GIS tool. A case study design is being used whereby the case is defined as the data analyst and manager dyad (a two-person team) in selected Ontario Early Years Centres (OEYCs). Multiple cases are used to support the reliability of findings. With nine producer/user pair participants, the goal in Phase 1 was to identify barriers to map production, and through the participatory design process, develop a web-based GIS tool suited for data analysts and their managers. This study has been guided by the Ottawa Model of Research Use (OMRU) conceptual framework. Due to wide variations in OEYC structures, only some data analysts used mapping software and there was no consistency or standardization in the software being used. Consequently, very little sharing of maps and data occurred among data analysts. Using PD, this project developed a web-based mapping tool (EYEMAP) that was easy to use, protected proprietary data, and permitted limited and controlled sharing between participants. By providing data analysts with training on its use, the project also ensured that data analysts would not break cartographic conventions (e.g. using a choropleth map for count data). Interoperability was built into the web-based solution; that is, EYEMAP can read many different standard mapping file formats (e.g. ESRI, MapInfo, CSV). Based on the evaluation of Phase 1, the PD process has served both as a facilitator and a barrier. In terms of successes, the PD process identified two key components that are important to users: increased data/map sharing functionality and interoperability. Some of the challenges affected developers and users, both individually and as a collective. From a development perspective, this project experienced difficulties in obtaining personnel skilled in web application development and GIS. For users, some data sharing barriers are beyond what a technological tool can address (e.g. third party data). Lastly, the PD process occurs in real time, which is both a strength and a limitation. Programmatic changes at the provincial level and staff turnover at the organizational level made it difficult to maintain buy-in as participants changed over time. The impacts of these successes and challenges will be evaluated more concretely at the end of Phase 2. PD approaches, by their very nature, encourage buy-in to the development process, better address user needs, and create a sense of user investment and ownership.

  12. Classification of fMRI resting-state maps using machine learning techniques: A comparative study

    NASA Astrophysics Data System (ADS)

    Gallos, Ioannis; Siettos, Constantinos

    2017-11-01

    We compare the efficiency of Principal Component Analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and Diffusion Maps) for classifying brain maps between groups of schizophrenia patients and healthy controls from fMRI scans during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial Independent Component Analysis (ICA) to reduce (a) noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP and Diffusion Maps to find an embedded low-dimensional space. Finally, support vector machine (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
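
    The pipeline compared above can be sketched with off-the-shelf tools. This is not the authors' code: the feature vectors below are synthetic stand-ins for the vectorized ICA cross-correlation matrices, Diffusion Maps are omitted (no standard scikit-learn implementation), and the embedding dimensionalities are assumptions.

      # Sketch of PCA / Isomap embedding followed by SVM classification
      # (synthetic features standing in for ICA cross-correlation matrices).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.manifold import Isomap
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)
      # 60 subjects x upper triangle of a 20x20 ICA cross-correlation matrix (190 features)
      X = rng.standard_normal((60, 190))
      y = np.repeat([0, 1], 30)                  # patients vs. controls
      X[y == 1, :10] += 0.8                      # inject a weak group difference

      for name, embed in [("PCA", PCA(n_components=10)),
                          ("Isomap", Isomap(n_components=10, n_neighbors=8))]:
          clf = make_pipeline(StandardScaler(), embed, SVC(kernel="rbf", C=1.0))
          scores = cross_val_score(clf, X, y, cv=5)
          print(f"{name}: mean accuracy = {scores.mean():.2f}")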

  13. Mapping the seafloor geology offshore of Massachusetts

    USGS Publications Warehouse

    Barnhardt, Walter A.; Andrews, Brian D.

    2006-01-01

    Geologic and bathymetric maps help us understand the evolutionary history of the Massachusetts coast and the processes that have shaped it. The maps show the distribution of bottom types (for example, bedrock, gravel, sand, mud) and water depths over large areas of the seafloor. In turn, these two fundamental parameters largely determine the species of flora and fauna that inhabit a particular area. Knowledge of bottom types and water depths provides a framework for mapping benthic habitats and managing marine resources. The need for coastal–zone mapping to inform policy and management is widely recognized as critical for mitigating hazards, creating resource inventories, and tracking environmental changes (National Research Council, 2004; U.S. Commission on Ocean Policy, 2004).

  14. Multiresolution saliency map based object segmentation

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Wang, Xin; Dai, ZhenYou

    2015-11-01

    Salient object detection and segmentation have gained increasing research interest in recent years. A saliency map can be obtained from different models presented in previous studies. Based on this saliency map, the most salient region (MSR) in an image can be extracted. This MSR, generally a rectangle, can be used as the initial parameters for object segmentation algorithms. However, to our knowledge, all of those saliency maps are represented at a single resolution, although some models introduce multiscale principles in the calculation process. Furthermore, some segmentation methods, such as the well-known GrabCut algorithm, need more iteration time or additional interactions to get more precise results when pixel types are not predefined. The concept of a multiresolution saliency map is introduced. This saliency map is provided in a multiresolution format, which naturally follows the principle of the human visual mechanism. Moreover, the points in this map can be utilized to initialize parameters for GrabCut segmentation by labeling the feature pixels automatically. Both the computing speed and segmentation precision are evaluated. The results imply that this multiresolution saliency map-based object segmentation method is simple and efficient.
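
    The initialization idea, a salient region reduced to a rectangle that seeds GrabCut, can be sketched as follows. The saliency measure here is a crude single-scale center-surround stand-in rather than the multiresolution map proposed in the paper, and the image is synthetic.

      # Sketch: threshold a (stand-in) saliency map, take its bounding box, and use
      # it to initialize OpenCV's GrabCut instead of a manually drawn rectangle.
      import cv2
      import numpy as np

      rng = np.random.default_rng(5)
      img = rng.integers(30, 50, (200, 200, 3), dtype=np.uint8)       # noisy background
      cv2.circle(img, (100, 100), 40, (200, 180, 60), -1)             # synthetic "object"

      gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY).astype(np.float32)
      saliency = np.abs(gray - cv2.GaussianBlur(gray, (51, 51), 0))   # crude center-surround
      ys, xs = np.where(saliency > 0.5 * saliency.max())              # most salient region
      rect = (int(xs.min()), int(ys.min()),
              int(xs.max() - xs.min()), int(ys.max() - ys.min()))

      mask = np.zeros(img.shape[:2], np.uint8)
      bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
      cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
      segmented = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
      print(int(segmented.sum()), "foreground pixels")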

  15. QSL Squasher: A Fast Quasi-separatrix Layer Map Calculator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tassev, Svetlin; Savcheva, Antonia, E-mail: svetlin.tassev@cfa.harvard.edu

    Quasi-Separatrix Layers (QSLs) are a useful proxy for the locations where current sheets can develop in the solar corona, and give valuable information about the connectivity in complicated magnetic field configurations. However, calculating QSL maps, even for two-dimensional slices through three-dimensional models of coronal magnetic fields, is a non-trivial task, as it usually involves tracing out millions of magnetic field lines with immense precision. Thus, extending QSL calculations to three dimensions has rarely been done until now. In order to address this challenge, we present QSL Squasher—a public, open-source code, which is optimized for calculating QSL maps in both two and three dimensions on graphics processing units. The code achieves large processing speeds for three reasons, each of which results in an order-of-magnitude speed-up. (1) The code is parallelized using OpenCL. (2) The precision requirements for the QSL calculation are drastically reduced by using perturbation theory. (3) A new boundary detection criterion between quasi-connectivity domains is used, which quickly identifies possible QSL locations that need to be finely sampled by the code. That boundary detection criterion relies on finding the locations of abrupt field-line length changes, which we do by introducing a new Field-line Length Edge (FLEDGE) map. We find FLEDGE maps useful on their own as a quick-and-dirty substitute for QSL maps. QSL Squasher allows construction of high-resolution 3D FLEDGE maps in a matter of minutes, which is two orders of magnitude faster than calculating the corresponding 3D QSL maps. We include a sample of calculations done using QSL Squasher to demonstrate its capabilities as a QSL calculator, as well as to compare QSL and FLEDGE maps.
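
    The FLEDGE idea, flagging abrupt field-line length changes as candidate QSL locations, can be illustrated in two dimensions with a short sketch. The field-line-length map below is a synthetic stand-in; the actual code traces field lines on GPUs and also works in three dimensions.

      # Illustrative 2D FLEDGE-style map: large gradients of the (log) field-line
      # length flag cells that deserve finer sampling (synthetic length map).
      import numpy as np

      yy, xx = np.mgrid[-1:1:200j, -1:1:200j]
      # fake field-line-length map with a sharp jump across a circular "separatrix"
      lengths = 1.0 + 4.0 * (xx**2 + yy**2 < 0.25) + 0.05 * np.sin(10 * xx)

      gy, gx = np.gradient(np.log(lengths))            # log compresses the dynamic range
      fledge = np.hypot(gx, gy)                        # edge strength of length changes
      candidates = fledge > np.percentile(fledge, 99)  # cells to sample finely
      print(int(candidates.sum()), "candidate QSL cells out of", fledge.size)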

  16. Developing Tsunami Evacuation Plans, Maps, And Procedures: Pilot Project in Central America

    NASA Astrophysics Data System (ADS)

    Arcos, N. P.; Kong, L. S. L.; Arcas, D.; Aliaga, B.; Coetzee, D.; Leonard, J.

    2015-12-01

    In the end-to-end tsunami warning chain, once a forecast is provided and a warning alert issued, communities must know what to do and where to go. The "where to go" answer requires reliable and practical community-level tsunami evacuation maps. Following Exercise Pacific Wave 2011, a questionnaire was sent to the 46 Member States of the Pacific Tsunami Warning System (PTWS). The results revealed that over 42 percent of Member States lacked tsunami mass coastal evacuation plans. Additionally, a significant gap in mapping was exposed, as over 55 percent of Member States lacked tsunami evacuation maps, routes, signs and assembly points. Thus, a significant portion of countries in the Pacific lack appropriate tsunami planning and mapping for their at-risk coastal communities. While a variety of tools exist to establish tsunami inundation areas, they are applied inconsistently, and no standard methodology has been developed to assist countries in developing tsunami evacuation maps, plans, and procedures. The International Tsunami Information Center (ITIC) and its partners are leading a Pilot Project in Honduras demonstrating that globally standardized tools and methodologies can be applied by a country with minimal tsunami warning and mitigation resources to determine tsunami inundation areas and, subsequently, to produce community-owned tsunami evacuation maps and plans for at-risk communities. The Pilot involves a 1- to 2-year process centered on a series of linked tsunami training workshops on evacuation planning, evacuation map development, inundation modeling and map creation, tsunami warning and emergency response Standard Operating Procedures (SOPs), and conducting tsunami exercises (including evacuation). The Pilot concludes with a UNESCO/IOC document so that other countries can replicate the process in their tsunami-prone communities.

  17. QSL Squasher: A Fast Quasi-separatrix Layer Map Calculator

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Savcheva, Antonia

    2017-05-01

    Quasi-Separatrix Layers (QSLs) are a useful proxy for the locations where current sheets can develop in the solar corona, and give valuable information about the connectivity in complicated magnetic field configurations. However, calculating QSL maps, even for two-dimensional slices through three-dimensional models of coronal magnetic fields, is a non-trivial task, as it usually involves tracing out millions of magnetic field lines with immense precision. Thus, extending QSL calculations to three dimensions has rarely been done until now. In order to address this challenge, we present QSL Squasher—a public, open-source code, which is optimized for calculating QSL maps in both two and three dimensions on graphics processing units. The code achieves large processing speeds for three reasons, each of which results in an order-of-magnitude speed-up. (1) The code is parallelized using OpenCL. (2) The precision requirements for the QSL calculation are drastically reduced by using perturbation theory. (3) A new boundary detection criterion between quasi-connectivity domains is used, which quickly identifies possible QSL locations that need to be finely sampled by the code. That boundary detection criterion relies on finding the locations of abrupt field-line length changes, which we do by introducing a new Field-line Length Edge (FLEDGE) map. We find FLEDGE maps useful on their own as a quick-and-dirty substitute for QSL maps. QSL Squasher allows construction of high-resolution 3D FLEDGE maps in a matter of minutes, which is two orders of magnitude faster than calculating the corresponding 3D QSL maps. We include a sample of calculations done using QSL Squasher to demonstrate its capabilities as a QSL calculator, as well as to compare QSL and FLEDGE maps.
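
    The FLEDGE criterion above flags locations where the traced field-line length changes abruptly. As a rough illustration only (not the QSL Squasher implementation), the sketch below derives a FLEDGE-style edge map from a precomputed 2D grid of field-line lengths by taking the magnitude of the gradient of log L; the input array and grid spacings are assumed.

```python
import numpy as np

def fledge_map(fieldline_length, dx=1.0, dy=1.0):
    """Edge map of abrupt field-line length changes on a 2D slice.

    fieldline_length is a 2D array L(x, y) of field-line lengths traced from
    each grid point (assumed precomputed).  Large values of |grad(log L)|
    mark candidate quasi-separatrix-layer locations.
    """
    log_l = np.log(fieldline_length)
    gx, gy = np.gradient(log_l, dx, dy)
    return np.hypot(gx, gy)

# Toy example: a sharp jump in field-line length shows up as a bright line.
lengths = np.ones((100, 100))
lengths[:, 50:] = 10.0
edges = fledge_map(lengths)
```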

  18. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    NASA Astrophysics Data System (ADS)

    Lehrack, S.; Duckeck, G.; Ebke, J.

    2014-06-01

    The Apache Hadoop software is a Java-based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets that can be split into arbitrary chunks, and it must be adapted to the use case of binary data files that cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives for distributing the analysis data were considered: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as the execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data quickly and reliably with MapReduce. In the evaluation of HDFS, read/write data rates of the local Hadoop cluster were measured and compared to standard data rates of the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
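
    Because ROOT files cannot be split by Hadoop automatically, one common workaround (sketched here as an assumption, not necessarily the setup used in this evaluation) is to feed MapReduce a plain-text list of file paths so that each map task processes whole files. A minimal Hadoop Streaming-style mapper along those lines, using the uproot library as a stand-in ROOT reader, might look like this:

```python
#!/usr/bin/env python
# Hypothetical Hadoop Streaming mapper: each input line is the path of one
# (non-splittable) ROOT file; the mapper emits "<path>\t<total entries>".
import sys

import uproot  # assumed to be installed on the worker nodes

for line in sys.stdin:
    path = line.strip()
    if not path:
        continue
    with uproot.open(path) as f:
        # Sum entries over every TTree in the file; other objects are skipped.
        total = sum(obj.num_entries for _, obj in f.items()
                    if hasattr(obj, "num_entries"))
    print(f"{path}\t{total}")
```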

  19. Assessing the Agreement Between Eo-Based Semi-Automated Landslide Maps with Fuzzy Manual Landslide Delineation

    NASA Astrophysics Data System (ADS)

    Albrecht, F.; Hölbling, D.; Friedl, B.

    2017-09-01

    Landslide mapping benefits from the ever-increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and improved infrastructure for data access. However, there is a growing need for improved automated landslide information extraction from EO data, since the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding of how they affect the levels of agreement in a direct comparison of OBIA-derived and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared using accuracy metrics accepted in the remote sensing community. We have tested this approach on high-resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.
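
    One simple way to score agreement against a fuzzy reference (shown only as an illustration; the study relies on accuracy metrics accepted in the remote sensing community) is a fuzzy Jaccard index, where the manual delineation assigns each pixel a membership value in [0, 1]:

```python
import numpy as np

def fuzzy_jaccard(obia_mask, fuzzy_reference):
    """Fuzzy overlap between a crisp OBIA landslide mask (0/1 per pixel) and
    a fuzzy manual delineation (membership in [0, 1] per pixel): the ratio
    of the fuzzy intersection (pixel-wise minimum) to the fuzzy union
    (pixel-wise maximum)."""
    obia = obia_mask.astype(float)
    ref = fuzzy_reference.astype(float)
    intersection = np.minimum(obia, ref).sum()
    union = np.maximum(obia, ref).sum()
    return intersection / union if union > 0 else 1.0
```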

  20. Frame of reference for electronic maps - The relevance of spatial cognition, mental rotation, and componential task analysis

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Aretz, Anthony; Harwood, Kelly

    1989-01-01

    Three experiments are reported that examine the difference between north-up and track-up maps for airborne navigation. The results of the first two experiments, conducted in a basic laboratory setting, identified the cost associated with mental rotation, when a north-up map is used. However, the data suggest that these costs are neither large nor consistent. The third experiment examined a range of tasks in a higher fidelity helicopter flight simulation, and associated the costs of north-up maps with a cognitive component related to orientation, and the costs of track-up maps with a cognitive component related to inconsistent landmark location. Different tasks are associated with different dependence on these components. The results are discussed in terms of their implications for map design, and for cognitive models of navigational processes.

  1. Using concept mapping for assessing and promoting relational conceptual change in science

    NASA Astrophysics Data System (ADS)

    Liu, Xiufeng

    2004-05-01

    In this article, we adopted the relational conceptual change as our theoretical framework to accommodate current views of conceptual change such as ontological beliefs, epistemological commitment, and social/affective contexts commonly mentioned in the literature. We used a specific concept mapping format and process - digraphs and digraphing - as an operational framework for assessing and promoting relational conceptual change. We wanted to find out how concept mapping can be used to account for relational conceptual change. We collected data from a Grade 12 chemistry class using collaborative computerized concept mapping on an ongoing basis during a unit of instruction. Analysis of progressive concept maps and interview transcripts of representative students and the teacher showed that ongoing and collaborative computerized concept mapping is able to account for student conceptual change in ontological, epistemological, and social/affective domains.

  2. Forest and range mapping in the Houston area with ERTS-1

    NASA Technical Reports Server (NTRS)

    Heath, G. R.; Parker, H. D.

    1973-01-01

    ERTS-1 data acquired over the Houston area have been analyzed for applications to forest and range mapping. In the field of forestry, the Sam Houston National Forest (Texas) was chosen as a test site (Scene ID 1037-16244). Conventional imagery interpretation as well as computer processing methods were used to make classification maps of timber species, condition and land-use. The results were compared with timber stand maps which were obtained from aircraft imagery and checked in the field. The preliminary investigations show that conventional interpretation techniques yielded a classification accuracy of 63 percent. The computer-aided interpretations made by a clustering technique gave 70 percent accuracy. Computer-aided and conventional multispectral analysis techniques were applied to range vegetation type mapping in the gulf coast marsh. Two species of salt marsh grasses were mapped.
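
    The clustering-based, computer-aided interpretation referred to above can be sketched as an unsupervised classification of multispectral pixels; the use of k-means, the band count, and the number of classes here are placeholder assumptions rather than the original ERTS-1 processing chain:

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_classify(image, n_classes=8, seed=0):
    """Unsupervised classification of a multispectral image.

    image is an (H, W, bands) array; each pixel is assigned to one of
    n_classes spectral clusters, which an analyst would then label as
    timber species, range vegetation types, land use, and so on.
    """
    h, w, bands = image.shape
    pixels = image.reshape(-1, bands).astype(float)
    labels = KMeans(n_clusters=n_classes, random_state=seed,
                    n_init=10).fit_predict(pixels)
    return labels.reshape(h, w)
```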

  3. [Cross-Mapping: diagnostic labels formulated according to the ICNP® versus diagnosis of NANDA International].

    PubMed

    Tannure, Meire Chucre; Salgado, Patrícia de Oliveira; Chianca, Tânia Couto Machado

    2014-01-01

    This descriptive study aimed at elaborating nursing diagnostic labels according to the ICNP®; conducting a cross-mapping between the diagnostic formulations and the diagnostic labels of NANDA-I; identifying the diagnostic labels thus obtained that were also listed in NANDA-I; and mapping them according to Basic Human Needs. The workshop technique was applied with 32 intensive care nurses, and the cross-mapping and validation were based on agreement with experts. The workshop produced 1665 diagnostic labels, which were further refined into 120 labels. These were then submitted to a cross-mapping process with both the NANDA-I diagnostic labels and the Basic Human Needs. The mapping results underwent content validation by two expert nurses, leading to concordance rates of 92% and 100%. It was found that 63 labels were listed in NANDA-I and 47 were not.

  4. Using high-resolution digital aerial imagery to map land cover

    USGS Publications Warehouse

    Dieck, J.J.; Robinson, Larry

    2014-01-01

    The Upper Midwest Environmental Sciences Center (UMESC) has used aerial photography to map land cover/land use on federally owned and managed lands for over 20 years. Until recently, that process used 23- by 23-centimeter (9- by 9-inch) analog aerial photos to classify vegetation along the Upper Mississippi River System, on National Wildlife Refuges, and in National Parks. With digital aerial cameras becoming more common and offering distinct advantages over analog film, UMESC transitioned to an entirely digital mapping process in 2009. Though not without challenges, this method has proven to be much more accurate and efficient when compared to the analog process.

  5. Procedure for extraction of disparate data from maps into computerized data bases

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1979-01-01

    A procedure is presented for extracting disparate sources of data from geographic maps and for the conversion of these data into a suitable format for processing on a computer-oriented information system. Several graphic digitizing considerations are included and related to the NASA Earth Resources Laboratory's Digitizer System. Current operating procedures for the Digitizer System are given in a simplified and logical manner. The report serves as a guide to those organizations interested in converting map-based data by using a comparable map digitizing system.

  6. Creating affordable Internet map server applications for regional scale applications.

    PubMed

    Lembo, Arthur J; Wagenet, Linda P; Schusler, Tania; DeGloria, Stephen D

    2007-12-01

    This paper presents an overview and process for developing an Internet Map Server (IMS) application for a local volunteer watershed group using an Internal Internet Map Server (IIMS) strategy. The paper illustrates that modern GIS architectures utilizing an internal Internet map server coupled with a spatial SQL command language allow for rapid development of IMS applications. The implication of this approach means that powerful IMS applications can be rapidly and affordably developed for volunteer organizations that lack significant funds or a full time information technology staff.

  7. Mapping alpha-Particle X-Ray Fluorescence Spectrometer (Map-X)

    NASA Technical Reports Server (NTRS)

    Blake, D. F.; Sarrazin, P.; Bristow, T.

    2014-01-01

    Many planetary surface processes (such as physical and chemical weathering, water activity, diagenesis, low-temperature or impact metamorphism, and biogenic activity) leave traces of their actions as features in the size range of 10s to 100s of microns. The Mapping alpha-particle X-ray Spectrometer ("Map-X") is intended to provide chemical imaging at two orders of magnitude higher spatial resolution than previously flown instruments, yielding elemental chemistry at or below the scale length where many relict physical, chemical, and biological features can be imaged and interpreted in ancient rocks.

  8. Preliminary investigation of submerged aquatic vegetation mapping using hyperspectral remote sensing.

    PubMed

    William, David J; Rybicki, Nancy B; Lombana, Alfonso V; O'Brien, Tim M; Gomez, Richard B

    2003-01-01

    The use of airborne hyperspectral remote sensing imagery for automated mapping of submerged aquatic vegetation (SAV) in the tidal Potomac River was investigated for near to real-time resource assessment and monitoring. Airborne hyperspectral imagery and field spectrometer measurements were obtained in October of 2000. A spectral library database containing selected ground-based and airborne sensor spectra was developed for use in image processing. The spectral library is used to automate the processing of hyperspectral imagery for potential real-time material identification and mapping. Field based spectra were compared to the airborne imagery using the database to identify and map two species of SAV (Myriophyllum spicatum and Vallisneria americana). Overall accuracy of the vegetation maps derived from hyperspectral imagery was determined by comparison to a product that combined aerial photography and field based sampling at the end of the SAV growing season. The algorithms and databases developed in this study will be useful with the current and forthcoming space-based hyperspectral remote sensing systems.
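
    A common way to match image spectra against a spectral library (given purely as an illustration; the abstract does not specify the matching algorithm used) is the spectral angle mapper, which assigns each pixel the library spectrum with the smallest spectral angle:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a library spectrum."""
    cos = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_cube(cube, library, names, max_angle=0.1):
    """Assign each pixel of an (H, W, bands) cube the best-matching library
    entry, or 'unclassified' if no spectrum is within max_angle radians."""
    h, w, _ = cube.shape
    out = np.full((h, w), "unclassified", dtype=object)
    for i in range(h):
        for j in range(w):
            angles = [spectral_angle(cube[i, j], ref) for ref in library]
            k = int(np.argmin(angles))
            if angles[k] <= max_angle:
                out[i, j] = names[k]
    return out
```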

  9. Global, quantitative and dynamic mapping of protein subcellular localization

    PubMed Central

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg HH

    2016-01-01

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology. DOI: http://dx.doi.org/10.7554/eLife.16950.001 PMID:27278775

  10. Recent development in preparation of European soil hydraulic maps

    NASA Astrophysics Data System (ADS)

    Toth, B.; Weynants, M.; Pasztor, L.; Hengl, T.

    2017-12-01

    Reliable quantitative information on soil hydraulic properties is crucial for modelling hydrological, meteorological, ecological and biological processes of the Critical Zone. Most of the Earth system models need information on soil moisture retention capacity and hydraulic conductivity in the full matric potential range. These soil hydraulic properties can be quantified, but their measurement is expensive and time consuming, therefore measurement-based catchment scale mapping of these soil properties is not possible. The increasing availability of soil information and methods describing relationships between simple soil characteristics and soil hydraulic properties provide the possibility to derive soil hydraulic maps based on spatial soil datasets and pedotransfer functions (PTFs). Over the last decade there has been a significant development in preparation of soil hydraulic maps. Spatial datasets on model parameters describing the soil hydraulic processes have become available for countries, continents and even for the whole globe. Our aim is to present European soil hydraulic maps, show their performance, highlight their advantages and drawbacks, and propose possible ways to further improve the performance of those.

  11. LAMMR world data base documentation support and demonstrations

    NASA Technical Reports Server (NTRS)

    Chin, R.; Beaudet, P.

    1980-01-01

    The primary purpose of the World Surface Map is to provide the LAMMR subsystem with world surface type classifications that are used to set up LAMMR LEVEL II process control. This data base will be accessed solely by the LAMMR subsystem. The SCATT and ALT subsystems will access the data base indirectly through the T sub b (Brightness Temperature) Data Bank, where the surface types were updated from a priori to current classification, and where the surface types were organized on an orbital subtrack basis. The single most important factor in the design of the World Surface Maps is the ease of access to the information while the complexity of generating these maps is of lesser importance because their generation is a one-time, off-line process. The World Surface Map provides storage of information with a resolution of 7 km necessary to set flags concerning the earth's features with a different set of maps for each month of the year.

  12. Investigating Word Learning in Fragile X Syndrome: A Fast-Mapping Study

    ERIC Educational Resources Information Center

    McDuffie, Andrea; Kover, Sara T.; Hagerman, Randi; Abbeduto, Leonard

    2013-01-01

    Fast-mapping paradigms have not been used previously to examine the process of word learning in boys with fragile X syndrome (FXS), who are likely to have intellectual impairment, language delays, and symptoms of autism. In this study, a fast-mapping task was used to investigate associative word learning in 4- to 10-year-old boys with FXS relative…

  13. Using Discovery Maps as a Free-Choice Learning Process Can Enhance the Effectiveness of Environmental Education in a Botanical Garden

    ERIC Educational Resources Information Center

    Yang, Xi; Chen, Jin

    2017-01-01

    Botanical gardens (BGs) are important agencies that enhance human knowledge and attitude towards flora conservation. By following free-choice learning model, we developed a "Discovery map" and distributed the map to visitors at the Xishuangbanna Tropical Botanical Garden in Yunnan, China. Visitors, who did and did not receive discovery…

  14. The Contribution of Latino Studies to Social Science Research on Immigration. JSRI Occasional Paper No. 36. Latino Studies Series.

    ERIC Educational Resources Information Center

    Pedraza, Silvia

    This paper offers a conceptual "map" of issues and approaches in immigration research and illustrates features of the map with the significant contributions of Latino Studies to immigration research. One axis of the map concerns the time line of various waves of immigration. Although research on immigrants and immigration processes was a…

  15. Adding It Up: A Rationale for Mapping Public Resources for Children, Youth and Families

    ERIC Educational Resources Information Center

    Flynn-Khan, Margaret; Ferber, Thaddeus; Gaines, Elizabeth; Pittman, Karen

    2006-01-01

    This introduction, one of three parts of "Adding It Up: A Guide to Mapping Public Resources for Children, Youth and Families," explains the why, how and what behind creating a children, youth, and families (CYF) resource map. Setting the stage for what's involved in the process, this overview provides a good framework for understanding both the…

  16. Kalman/Map filtering-aided fast normalized cross correlation-based Wi-Fi fingerprinting location sensing.

    PubMed

    Sun, Yongliang; Xu, Yubin; Li, Cheng; Ma, Lin

    2013-11-13

    A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results.
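
    The core similarity measure described above, normalized cross correlation between on-line RSS samples and reference-point fingerprints, can be sketched as follows; this is the basic NCC with a simple weighted nearest-reference estimate, not the paper's accelerated FNCC or the Kalman/map filter:

```python
import numpy as np

def ncc(online_rss, reference_rss):
    """Normalized cross correlation between an on-line RSS vector and one
    reference-point fingerprint (both over the same set of access points)."""
    a = online_rss - online_rss.mean()
    b = reference_rss - reference_rss.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def locate(online_rss, fingerprints, positions, k=3):
    """Weighted average of the k reference points most similar to the
    on-line measurement, using NCC as the similarity score."""
    scores = np.array([ncc(online_rss, fp) for fp in fingerprints])
    top = np.argsort(scores)[-k:]
    weights = np.clip(scores[top], 0.0, None) + 1e-12
    return (positions[top] * weights[:, None]).sum(axis=0) / weights.sum()
```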

  17. Kalman/Map Filtering-Aided Fast Normalized Cross Correlation-Based Wi-Fi Fingerprinting Location Sensing

    PubMed Central

    Sun, Yongliang; Xu, Yubin; Li, Cheng; Ma, Lin

    2013-01-01

    A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results. PMID:24233027

  18. Mapping spatial patterns with morphological image processing

    Treesearch

    Peter Vogt; Kurt H. Riitters; Christine Estreguil; Jacek Kozak; Timothy G. Wade; James D. Wickham

    2006-01-01

    We use morphological image processing for classifying spatial patterns at the pixel level on binary land-cover maps. Land-cover pattern is classified as 'perforated,' 'edge,' 'patch,' and 'core' with higher spatial precision and thematic accuracy compared to a previous approach based on image convolution, while retaining the...
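
    In the same spirit, a rough sketch (not the authors' exact morphological rules) of deriving core/perforated/edge/patch classes from a binary land-cover map with a few morphological operations:

```python
import numpy as np
from scipy import ndimage

def classify_pattern(cover):
    """Rough morphological pattern classes for a binary land-cover map.

    'core' pixels survive erosion; other cover pixels bordering interior
    holes are 'perforated'; components containing no core are 'patch';
    remaining cover pixels along the outer boundary are 'edge'.
    """
    cover = cover.astype(bool)
    core = ndimage.binary_erosion(cover)
    holes = ndimage.binary_fill_holes(cover) & ~cover
    near_holes = ndimage.binary_dilation(holes) & cover

    labels, _ = ndimage.label(cover)
    core_ids = np.unique(labels[core])
    in_core_component = np.isin(labels, core_ids[core_ids > 0])

    classes = np.full(cover.shape, "background", dtype=object)
    classes[cover] = "edge"
    classes[near_holes & ~core] = "perforated"
    classes[cover & ~in_core_component] = "patch"
    classes[core] = "core"
    return classes
```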

  19. Double-Labeled Metabolic Maps of Memory.

    ERIC Educational Resources Information Center

    John, E. R.; And Others

    1986-01-01

    Reviews a study which sought to obtain a quantitative metabolic map of the neurons mediating a specific memory. Research results support notions of cooperative processes in which nonrandom behavior of high ensembles of neural elements mediates the integration and processing of information and the retrieval of memory. (ML)

  20. GIS-based realization of international standards for digital geological mapping - developments in planetary mapping

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan; Jaumann, Ralf

    2010-05-01

    The Helmholtz Alliance and the European Planetary Network are research communities with different main topics. One of the main research topics shared by these communities is the geomorphological evolution of planetary surfaces as well as the geological context of life. This research addresses questions such as "Is there volcanic activity on a planet?" or "Where are possible landing sites?". In order to help answer such questions, analyses of surface features and morphometric measurements need to be performed. This ultimately leads to the generation of thematic maps (e.g. geological and geomorphologic maps) as a basis for further studies. By using modern GIS techniques, the comparative work and generalisation during mapping processes result in new information. These insights are crucial for subsequent investigations. Therefore, the aim is to make these results available to the research community as a secondary data basis. In order to obtain a common and interoperable data collection, the results of different mapping projects have to follow a standardised data infrastructure, metadata definition and map layout. Therefore, we are currently focussing on the generation of a database model arranging all data and processes in a uniform mapping schema. With the help of such a schema, the mapper will be able to utilise a predefined (but customisable) GIS environment with individual tool items as well as a standardised symbolisation and a metadata environment. This environment is based on a data model which is currently at a conceptual level and provides the layout of the data infrastructure including relations and topologies. One of the first tasks towards this data model is the definition of a consistent basis of symbolisation standards developed for planetary mapping. The mapper/geologist will be able to access the pre-built signatures and utilise them in a scale-dependent manner within the mapping project. The symbolisation will be related to the data model in the next step. As a second task, we designed a concept for describing the digital mapping result. To this end, we are creating a metadata template based on existing standards for individual needs in planetary sciences. This template is subdivided into metadata about the general map content (e.g. which data the mapping result is based on) and metadata for each individual mapping element/layer, comprising information such as minimum mapping scale, interpretation hints, etc. The assignment of such a metadata description, in combination with the usage of a predefined mapping schema, facilitates the efficient and traceable storage of data information on a network server and enables a subsequent representation, e.g. as a mapserver data structure. Acknowledgement: This work is partly supported by DLR and the Helmholtz Alliance "Planetary Evolution and Life".

  1. Facilitating the exploitation of ERTS-1 imagery using snow enhancement techniques. [geological fault maps of Massachusetts and Connecticut

    NASA Technical Reports Server (NTRS)

    Wobber, F. J. (Principal Investigator); Martin, K. R.; Amato, R. V.; Leshendok, T.

    1973-01-01

    The author has identified the following significant results. The application of ERTS-1 imagery to geological fracture mapping, regardless of season, has been repeatedly confirmed. The enhancement provided by a differential cover of snow increases the number and length of fracture-lineaments which can be detected with ERTS-1 data and accelerates the fracture mapping process for a variety of practical applications. The geological mapping benefits of the program will be realized in geographic areas where data are most needed - complex glaciated terrain and areas of deep residual soils. ERTS-1 derived fracture-lineament maps which provide detail well in excess of existing geological maps are not available in the Massachusetts-Connecticut area. The large quantity of new data provided by ERTS-1 may accelerate and improve field mapping now in progress in the area. Numerous other user groups have requested data on the techniques. This represents a major change in operating philosophy for groups who to date judged that snow obscured geological detail.

  2. Globally optimal superconducting magnets part I: minimum stored energy (MSE) current density map.

    PubMed

    Tieng, Quang M; Vegh, Viktor; Brereton, Ian M

    2009-01-01

    An optimal current density map is crucial in magnet design to provide the initial values within search spaces in an optimization process for determining the final coil arrangement of the magnet. A strategy for obtaining globally optimal current density maps for the purpose of designing magnets with coaxial cylindrical coils in which the stored energy is minimized within a constrained domain is outlined. The current density maps obtained utilising the proposed method suggest that peak current densities occur around the perimeter of the magnet domain, where the adjacent peaks have alternating current directions for the most compact designs. As the dimensions of the domain are increased, the current density maps yield traditional magnet designs of positive current alone. These unique current density maps are obtained by minimizing the stored magnetic energy cost function and therefore suggest magnet coil designs of minimal system energy. Current density maps are provided for a number of different domain arrangements to illustrate the flexibility of the method and the quality of the achievable designs.

  3. Automated land-use mapping from spacecraft data. [Oakland County, Michigan

    NASA Technical Reports Server (NTRS)

    Chase, P. E. (Principal Investigator); Rogers, R. H.; Reed, L. E.

    1974-01-01

    The author has identified the following significant results. In response to the need for a faster, more economical means of producing land use maps, this study evaluated the suitability of using ERTS-1 computer compatible tape (CCT) data as a basis for automatic mapping. Significant findings are: (1) automatic classification accuracy greater than 90% is achieved on categories of deep and shallow water, tended grass, rangeland, extractive (bare earth), urban, forest land, and nonforested wet lands; (2) computer-generated printouts by target class provide a quantitative measure of land use; and (3) the generation of map overlays showing land use from ERTS-1 CCTs offers a significant breakthrough in the rate at which land use maps are generated. Rather than uncorrected classified imagery or computer line printer outputs, the processing results in geometrically-corrected computer-driven pen drawing of land categories, drawn on a transparent material at a scale specified by the operator. These map overlays are economically produced and provide an efficient means of rapidly updating maps showing land use.

  4. Design of an image encryption scheme based on a multiple chaotic map

    NASA Astrophysics Data System (ADS)

    Tong, Xiao-Jun

    2013-07-01

    To address the problems that chaos degenerates under limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, whose chaotic characteristics are proved using Devaney's definition. To produce a large key space, a Cat map variant called the block Cat map is also designed for the permutation process, based on multi-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by a different chaotic map. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and key- and plaintext-dependent cipher sensitivity analysis are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with AES, DES and Logistic encryption methods, we conclude that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and offers higher speed and higher security.
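
    For illustration, the classical permutation-substitution pattern mentioned above can be sketched with the standard Arnold cat map for permutation and a logistic-map keystream for substitution; this is a textbook construction, not the paper's block Cat map or its key schedule:

```python
import numpy as np

def cat_map_permute(img, iterations=1):
    """Arnold cat map permutation of a square N x N image:
    (x, y) -> ((x + y) mod N, (x + 2y) mod N), applied 'iterations' times."""
    n = img.shape[0]
    out = img.copy()
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    for _ in range(iterations):
        out = out[(x + y) % n, (x + 2 * y) % n]
    return out

def logistic_keystream(length, x0=0.3141592, r=3.99):
    """Keystream bytes generated by iterating the logistic map x -> r*x*(1-x)."""
    x = x0
    stream = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1 - x)
        stream[i] = int(x * 256) % 256
    return stream

def encrypt(img, iterations=3):
    """Permute pixel positions with the cat map, then XOR with the keystream."""
    permuted = cat_map_permute(img.astype(np.uint8), iterations)
    keystream = logistic_keystream(permuted.size).reshape(permuted.shape)
    return permuted ^ keystream
```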

  5. Speech processing using conditional observable maximum likelihood continuity mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, John; Nix, David

    A computer implemented method enables the recognition of speech and speech characteristics. Parameters are initialized of first probability density functions that map between the symbols in the vocabulary of one or more sequences of speech codes that represent speech sounds and a continuity map. Parameters are also initialized of second probability density functions that map between the elements in the vocabulary of one or more desired sequences of speech transcription symbols and the continuity map. The parameters of the probability density functions are then trained to maximize the probabilities of the desired sequences of speech-transcription symbols. A new sequence of speech codes is then input to the continuity map having the trained first and second probability function parameters. A smooth path is identified on the continuity map that has the maximum probability for the new sequence of speech codes. The probability of each speech transcription symbol for each input speech code can then be output.

  6. A Servicewide Benthic Mapping Program for National Parks

    USGS Publications Warehouse

    Moses, Christopher S.; Nayegandhi, Amar; Beavers, Rebecca; Brock, John

    2010-01-01

    In 2007, the National Park Service (NPS) Inventory and Monitoring Program directed the initiation of a benthic habitat mapping program in ocean and coastal parks in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With 74 ocean and Great Lakes parks stretching over more than 5,000 miles of coastline across 26 States and territories, this Servicewide Benthic Mapping Program (SBMP) is essential. This program will deliver benthic habitat maps and their associated inventory reports to NPS managers in a consistent, servicewide format to support informed management and protection of 3 million acres of submerged National Park System natural and cultural resources. The NPS and the U.S. Geological Survey (USGS) convened a workshop June 3-5, 2008, in Lakewood, Colo., to discuss the goals and develop the design of the NPS SBMP with an assembly of experts (Moses and others, 2010) who identified park needs and suggested best practices for inventory and mapping of bathymetry, benthic cover, geology, geomorphology, and some water-column properties. The recommended SBMP protocols include servicewide standards (such as gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). SBMP Mapping Process. The SBMP calls for a multi-step mapping process for each park, beginning with a gap assessment and data mining to determine data resources and needs. An interagency announcement of intent to acquire new data will provide opportunities to leverage partnerships. Prior to new data acquisition, all involved parties should be included in a scoping meeting held at network scale. Data collection will be followed by processing and interpretation, and finally expert review and publication. After publication, all digital materials will be archived in a common format. SBMP Classification Scheme. The SBMP will map using the Coastal and Marine Ecological Classification Standard (CMECS) that is being modified to include all NPS needs, such as lacustrine ecosystems and submerged cultural resources. CMECS Version III (Madden and others, 2010) includes components for water column, biotic cover, surface geology, sub-benthic, and geoform. SBMP Data Archiving. The SBMP calls for the storage of all raw data and final products in common-use data formats. The concept of 'collect once, use often' is essential to efficient use of mapping resources. Data should also be shared with other agencies and the public through various digital clearing houses, such as Geospatial One-Stop (http://gos2.geodata.gov/wps/portal/gos). To be most useful for managing submerged resources, the SBMP advocates the inventory and mapping of the five components of marine ecosystems: surface geology, biotic cover, geoform, sub-benthic, and water column. A complete benthic inventory of a park would include maps of bathymetry and the five components of CMECS. The completion of mapping for any set of components, such as bathymetry and surface geology, or a particular theme (for example, submerged aquatic vegetation) should also include a printed report.

  7. Analysis of production flow process with lean manufacturing approach

    NASA Astrophysics Data System (ADS)

    Siregar, Ikhsan; Arif Nasution, Abdillah; Prasetio, Aji; Fadillah, Kharis

    2017-09-01

    This research was conducted at a company engaged in the production of Fast Moving Consumer Goods (FMCG). The production process in the company still contains several activities that cause waste. Non-value-added activities (NVA) are still widely found in the implementation, so the cycle time required to make the product becomes longer. One form of improvement on the production line is to apply the lean manufacturing method to identify waste along the value stream and find non-value-added activities. Non-value-added activities can be eliminated or reduced by utilizing value stream mapping and identifying them with process activity mapping. The results show that 26% of activities are value-added and 74% are non-value-added. The current state map of the production process shows a process lead time of 678.11 minutes and a processing time of 173.94 minutes. In the proposed improvement, value-added time rises to 41% of production process activities while non-value-added time falls to 59%, and the future state map of the production process shows a process lead time of 426.69 minutes and a processing time of 173.89 minutes.
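
    The value-added percentages quoted above appear to follow from the ratio of processing time to production lead time (the process cycle efficiency); a two-line check of the reported figures under that assumption:

```python
# Process cycle efficiency = processing time / production lead time,
# using the current- and future-state values reported in the abstract.
def cycle_efficiency(processing_time_min, lead_time_min):
    return processing_time_min / lead_time_min

current = cycle_efficiency(173.94, 678.11)  # ~0.26 -> ~26% value-added
future = cycle_efficiency(173.89, 426.69)   # ~0.41 -> ~41% value-added
print(f"current: {current:.0%}, future: {future:.0%}")
```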

  8. Quaternary and pre-Quaternary materials and processes of southeast Ohio: Overview, speculations, and recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, T.M.

    1992-01-01

    Investigations and mapping of surficial deposits in Ohio have focused largely on the glacial deposits which cover nearly two-thirds of the state. Research on Quaternary deposits beyond the glacial border has been done by Foster, Hildreth, Andrews, Leverett, Tight, Stout, Goldthwait, Forsyth, Lessig, White, Totten, Hoyer, and Noltimier. However, growing human interaction with surficial materials of southeast Ohio now requires much more detailed mapping and characterization of these deposits. Recognition of periglacial, proglacial, and preglacial processes and materials in eastern and southern states has led to the search for similar processes and materials in southeast Ohio. Evidence for gelifraction, gelifluction, cryoturbation, and considerable periglacial colluviation is more extensive than previously thought. Proglacial deposits are also much more extensive; outwash and glaciolacustrine deposits cover large areas in southeast Ohio and are poorly mapped and characterized, or not mapped at all. Preglacial processes, including a long span of profound weathering and formation of saprolite, have been given little or no attention in southeast Ohio. The signature of protracted preglacial weathering still remains in this part of the state, and should change prevailing views of the terrain upon which periglacial processes worked. Mapping and characterization of these materials are urgently needed as citizens make important land-use decisions such as locating landfills and new developments.

  9. Elemental mapping of biofortified wheat grains using micro X-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Ramos, I.; Pataco, I. M.; Mourinho, M. P.; Lidon, F.; Reboredo, F.; Pessoa, M. F.; Carvalho, M. L.; Santos, J. P.; Guerra, M.

    2016-06-01

    Micro X-ray fluorescence has been used to obtain elemental maps of biofortified wheat grains. Two varieties of wheat were used in the study, Triticum aestivum L. and Triticum durum Desf. Two treatments, with different nutrient concentrations, were applied to the plants during the whole plant growth cycle. From the obtained elemental maps it was possible to extract information regarding the plant's physiological processes under the biofortification procedures. Both macro- and micronutrients were mapped, providing useful insight into the subsequent food processing mechanisms of this biofortified staple food. We have also shown that this kind of study can now be performed with laboratory benchtop apparatus, rather than using synchrotron radiation, increasing the overall attractiveness of micro X-ray fluorescence in the study of highly heterogeneous biological samples.

  10. Optimal mapping of irregular finite element domains to parallel processors

    NASA Technical Reports Server (NTRS)

    Flower, J.; Otto, S.; Salama, M.

    1987-01-01

    A mapping of a solution domain of n finite elements into N subdomains that may be processed in parallel by N processors is optimal if the subdomain decomposition results in a well-balanced workload distribution among the processors. The problem is discussed in the context of irregular finite element domains as an important aspect of the efficient utilization of the capabilities of emerging multiprocessor computers. Finding the optimal mapping is an intractable combinatorial optimization problem, for which a satisfactory approximate solution is obtained here by analogy to a method used in statistical mechanics for simulating the annealing process in solids. The simulated annealing analogy and algorithm are described, and numerical results are given for mapping an irregular two-dimensional finite element domain containing a singularity onto the Hypercube computer.
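
    As a toy illustration of the simulated-annealing analogy (the element workloads, adjacency list, and cost weights below are made-up placeholders, not the authors' formulation), a mapping of elements onto processors can be annealed like this:

```python
import math
import random

def anneal_mapping(workloads, edges, n_proc, steps=20000, t0=1.0, alpha=0.9995):
    """Toy simulated-annealing mapping of finite elements onto processors.

    workloads[i] is the cost of element i, edges is a list of (i, j) pairs of
    elements that share data; the cost penalises load imbalance and cut edges.
    """
    assign = [random.randrange(n_proc) for _ in workloads]

    def cost(a):
        loads = [0.0] * n_proc
        for w, p in zip(workloads, a):
            loads[p] += w
        imbalance = max(loads) - min(loads)
        cut = sum(1 for i, j in edges if a[i] != a[j])
        return imbalance + 0.5 * cut

    current, temperature = cost(assign), t0
    for _ in range(steps):
        i = random.randrange(len(workloads))
        previous = assign[i]
        assign[i] = random.randrange(n_proc)
        candidate = cost(assign)
        accept = (candidate <= current or
                  random.random() < math.exp((current - candidate) / temperature))
        if accept:
            current = candidate
        else:
            assign[i] = previous  # undo the rejected move
        temperature *= alpha
    return assign
```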

  11. Study of USGS/NASA land use classification system. [compatibility of land use classification system with computer processing techniques employed for land use mapping from ERTS data

    NASA Technical Reports Server (NTRS)

    Spann, G. W.; Faust, N. L.

    1974-01-01

    It is known from several previous investigations that many categories of land-use can be mapped via computer processing of Earth Resources Technology Satellite data. The results are presented of one such experiment using the USGS/NASA land-use classification system. Douglas County, Georgia, was chosen as the test site for this project. It was chosen primarily because of its recent rapid growth and future growth potential. Results of the investigation indicate an overall land-use mapping accuracy of 67% with higher accuracies in rural areas and lower accuracies in urban areas. It is estimated, however, that 95% of the State of Georgia could be mapped by these techniques with an accuracy of 80% to 90%.

  12. Near real-time skin deformation mapping

    NASA Astrophysics Data System (ADS)

    Kacenjar, Steve; Chen, Suzie; Jafri, Madiha; Wall, Brian; Pedersen, Richard; Bezozo, Richard

    2013-02-01

    A novel in vivo approach is described that provides large area mapping of the mechanical properties of the skin in human patients. Such information is important in the understanding of skin health, cosmetic surgery[1], aging, and impacts of sun exposure. Currently, several methods have been developed to estimate the local biomechanical properties of the skin, including the use of a physical biopsy of local areas of the skin (in vitro methods) [2, 3, and 4], and also the use of non-invasive methods (in vivo) [5, 6, and 7]. All such methods examine localized areas of the skin. Our approach examines the local elastic properties via the generation of field displacement maps of the skin created using time-sequence imaging [9] with 2D digital imaging correlation (DIC) [10]. In this approach, large areas of the skin are reviewed rapidly, and skin displacement maps are generated showing the contour maps of skin deformation. These maps are then used to precisely register skin images for purposes of diagnostic comparison. This paper reports on our mapping and registration approach, and demonstrates its ability to accurately measure the skin deformation through a described nulling interpolation process. The result of local translational DIC alignment is compared using this interpolation process. The effectiveness of the approach is reported in terms of residual RMS, image entropy measures, and differential segmented regional errors.

  13. BowMapCL: Burrows-Wheeler Mapping on Multiple Heterogeneous Accelerators.

    PubMed

    Nogueira, David; Tomas, Pedro; Roma, Nuno

    2016-01-01

    The computational demand of exact-search procedures has driven the exploitation of parallel processing accelerators to reduce the execution time of many applications. However, this often imposes strict restrictions in terms of problem size and implementation effort, mainly due to their possibly distinct architectures. To circumvent this limitation, a new exact-search alignment tool (BowMapCL) based on the Burrows-Wheeler Transform and FM-Index is presented. In contrast to other alternatives, BowMapCL is based on a unified implementation using OpenCL, allowing the exploitation of multiple and possibly different devices (e.g., NVIDIA, AMD/ATI, and Intel GPUs/APUs). Furthermore, to efficiently exploit such heterogeneous architectures, BowMapCL incorporates several techniques to promote its performance and scalability, including multiple buffering, work-queue task-distribution, and dynamic load-balancing, together with index partitioning, bit-encoding, and sampling. When compared with state-of-the-art tools, the attained results showed that BowMapCL (using a single GPU) is 2 × to 7.5 × faster than mainstream multi-threaded CPU BWT-based aligners, like Bowtie, BWA, and SOAP2; and up to 4 × faster than the best performing state-of-the-art GPU implementations (namely, SOAP3 and HPG-BWT). When multiple and completely distinct devices are considered, BowMapCL efficiently scales the offered throughput, ensuring a convenient load-balance of the involved processing in the several distinct devices.
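
    The exact-search core of BWT/FM-index aligners is backward search. The sketch below builds a tiny FM-index naively and counts exact occurrences of a pattern; it is purely illustrative and bears no relation to BowMapCL's OpenCL kernels, index partitioning, or sampling:

```python
from collections import Counter

def build_fm_index(text):
    """Tiny, naive FM-index: suffix array, BWT, C-table and occurrence table."""
    text += "$"
    sa = sorted(range(len(text)), key=lambda i: text[i:])
    bwt = "".join(text[i - 1] for i in sa)
    counts = Counter(bwt)
    c_table, running = {}, 0
    for ch in sorted(counts):            # C[c] = number of characters smaller than c
        c_table[ch] = running
        running += counts[ch]
    occ = {ch: [0] * (len(bwt) + 1) for ch in counts}
    for i, ch in enumerate(bwt):         # occ[c][i] = occurrences of c in bwt[:i]
        for a in occ:
            occ[a][i + 1] = occ[a][i] + (1 if a == ch else 0)
    return c_table, occ, len(bwt)

def count_occurrences(pattern, index):
    """Count exact matches of 'pattern' using FM-index backward search."""
    c_table, occ, n = index
    lo, hi = 0, n
    for ch in reversed(pattern):
        if ch not in c_table:
            return 0
        lo = c_table[ch] + occ[ch][lo]
        hi = c_table[ch] + occ[ch][hi]
        if lo >= hi:
            return 0
    return hi - lo

index = build_fm_index("ACGTACGTACGA")
print(count_occurrences("ACG", index))   # -> 3
```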

  14. Geomorphological map of a coastal stretch of north-eastern Gozo (Maltese archipelago, Mediterranean Sea)

    NASA Astrophysics Data System (ADS)

    Soldati, Mauro; Micallef, Anton; Biolchi, Sara; Chelli, Alessandro; Cuoghi, Alessandro; Devoto, Stefano; Gauci, Christopher; Graff, Kevin; Lolli, Federico; Mantovani, Matteo; Mastronuzzi, Giuseppe; Pisani, Luca; Prampolini, Mariacristina; Restall, Brian; Roulland, Thomas; Saliba, Michael; Selmi, Lidia; Vandelli, Vittoria

    2017-04-01

    Geomorphological investigations carried out along the north-eastern coast of the Island of Gozo (Malta) have led to the production of a detailed geomorphological map. Field surveys, accompanied by aerial photo-interpretation, were carried out within the framework of the EUR-OPA Major Hazards Agreement Project "Developing Geomorphological mapping skills and datasets in anticipation of subsequent Susceptibility, Vulnerability, Hazard and Risk Mapping" (Council of Europe). In particular, this geomorphological map is the main output of a 'Training Course on Geomorphological Mapping in Coastal Areas' held within the Project in November 2016. The study area selected was between Ramla Bay and Dahlet Qorrot Bay on the Island of Gozo (67 km2), part of the Maltese archipelago in the central Mediterranean Sea. From a geological viewpoint, the stratigraphic sequence includes Late Oligocene (Chattian) to Late Miocene (Messinian) sedimentary rocks. The hard limestones of the Upper Coralline Limestone Formation, the youngest lithostratigraphic unit, dominate the study area. Underlying this formation, marls and clays belonging to the Blue Clay Formation extensively outcrop. The oldest lithostratigraphic unit observed in the study area is the Globigerina Limestone Formation, a fine-grained limestone. The lithostructural features of the outcropping units clearly condition the morphography of the landscape. The coast is characterised by the alternation of inlets and promontories. Worthy of notice is the large sandy beach of Ramla Bay, partly backed by dunes. From a geomorphological perspective, the investigated coastal stretch is characterised by limestone plateaus bounded by steep structural scarps which are reshaped by gravitational and/or degradation processes, and by milder slopes in Blue Clays at their foot, comprising numerous rock block deposits (rdum in Maltese) and active or abandoned terraced fields used for agricultural purposes. Landforms and processes related to structural, gravitational, coastal, alluvial and karst processes were mapped. Particular attention was devoted to the recognition and classification of landslides of different types (in particular block slides and earth flows/slides) which affect large sectors of the north-eastern coast of Gozo. In most cases, landslide accumulations reach the coastline and cover shore platforms. In addition, wide portions of the plateau areas are affected by rock spreading related to the presence of limestones overlying clayey terrains. The climatic conditions, the dense joint systems and the karstification of limestone determine a temporary superficial drainage pattern. Temporary streambeds (wieden in Maltese) were identified in correspondence with V-shaped valleys once occupied by permanent water courses. Karst processes widely affect the Upper Coralline Limestone Formation, resulting in caves, diffuse solution pools, grooves and furrows. The geomorphological map produced represents a baseline document on which to undertake first the landslide susceptibility mapping, subsequently the hazard mapping and finally the risk mapping, a critical part of the wider-scoped risk management process of this and similar coastal areas.

  15. Optimal contact definition for reconstruction of contact maps.

    PubMed

    Duarte, Jose M; Sathyapriya, Rajagopal; Stehr, Henning; Filippis, Ioannis; Lappe, Michael

    2010-05-27

    Contact maps have been extensively used as a simplified representation of protein structures. They capture most important features of a protein's fold, being preferred by a number of researchers for the description and study of protein structures. Inspired by the model's simplicity many groups have dedicated a considerable amount of effort towards contact prediction as a proxy for protein structure prediction. However a contact map's biological interest is subject to the availability of reliable methods for the 3-dimensional reconstruction of the structure. We use an implementation of the well-known distance geometry protocol to build realistic protein 3-dimensional models from contact maps, performing an extensive exploration of many of the parameters involved in the reconstruction process. We try to address the questions: a) to what accuracy does a contact map represent its corresponding 3D structure, b) what is the best contact map representation with regard to reconstructability and c) what is the effect of partial or inaccurate contact information on the 3D structure recovery. Our results suggest that contact maps derived from the application of a distance cutoff of 9 to 11 Å around the Cβ atoms constitute the most accurate representation of the 3D structure. The reconstruction process does not provide a single solution to the problem but rather an ensemble of conformations that are within 2 Å RMSD of the crystal structure and with lower values for the pairwise average ensemble RMSD. Interestingly it is still possible to recover a structure with partial contact information, although wrong contacts can lead to dramatic loss in reconstruction fidelity. Thus contact maps represent a valid approximation to the structures with an accuracy comparable to that of experimental methods. The optimal contact definitions constitute key guidelines for methods based on contact maps such as structure prediction through contacts and structural alignments based on maximum contact map overlap.
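
    For reference, a contact map in this sense can be computed directly from residue coordinates with a plain distance cutoff; the (N, 3) array of C-beta coordinates below is an assumed input:

```python
import numpy as np

def contact_map(cbeta_coords, cutoff=10.0):
    """Binary contact map from an (N, 3) array of C-beta coordinates:
    residues i and j are in contact if their distance is below 'cutoff'
    (the study finds 9-11 Angstrom C-beta cutoffs most informative)."""
    diff = cbeta_coords[:, None, :] - cbeta_coords[None, :, :]
    distances = np.sqrt((diff ** 2).sum(axis=-1))
    return (distances < cutoff).astype(np.uint8)
```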

  16. Optimal contact definition for reconstruction of Contact Maps

    PubMed Central

    2010-01-01

    Background Contact maps have been extensively used as a simplified representation of protein structures. They capture most important features of a protein's fold, being preferred by a number of researchers for the description and study of protein structures. Inspired by the model's simplicity many groups have dedicated a considerable amount of effort towards contact prediction as a proxy for protein structure prediction. However a contact map's biological interest is subject to the availability of reliable methods for the 3-dimensional reconstruction of the structure. Results We use an implementation of the well-known distance geometry protocol to build realistic protein 3-dimensional models from contact maps, performing an extensive exploration of many of the parameters involved in the reconstruction process. We try to address the questions: a) to what accuracy does a contact map represent its corresponding 3D structure, b) what is the best contact map representation with regard to reconstructability and c) what is the effect of partial or inaccurate contact information on the 3D structure recovery. Our results suggest that contact maps derived from the application of a distance cutoff of 9 to 11Å around the Cβ atoms constitute the most accurate representation of the 3D structure. The reconstruction process does not provide a single solution to the problem but rather an ensemble of conformations that are within 2Å RMSD of the crystal structure and with lower values for the pairwise average ensemble RMSD. Interestingly it is still possible to recover a structure with partial contact information, although wrong contacts can lead to dramatic loss in reconstruction fidelity. Conclusions Thus contact maps represent a valid approximation to the structures with an accuracy comparable to that of experimental methods. The optimal contact definitions constitute key guidelines for methods based on contact maps such as structure prediction through contacts and structural alignments based on maximum contact map overlap. PMID:20507547

  17. Volumetric calibration of a plenoptic camera.

    PubMed

    Hall, Elise Munz; Fahringer, Timothy W; Guildenbecher, Daniel R; Thurow, Brian S

    2018-02-01

    The volumetric calibration of a plenoptic camera is explored to correct for inaccuracies due to real-world lens distortions and thin-lens assumptions in current processing methods. Two methods of volumetric calibration based on a polynomial mapping function that does not require knowledge of specific lens parameters are presented and compared to a calibration based on thin-lens assumptions. The first method, volumetric dewarping, is executed by creation of a volumetric representation of a scene using the thin-lens assumptions, which is then corrected in post-processing using a polynomial mapping function. The second method, direct light-field calibration, uses the polynomial mapping in creation of the initial volumetric representation to relate locations in object space directly to image sensor locations. The accuracy and feasibility of these methods are examined experimentally by capturing images of a known dot card at a variety of depths. Results suggest that use of a 3D polynomial mapping function provides a significant increase in reconstruction accuracy and that the achievable accuracy is similar using either polynomial-mapping-based method. Additionally, direct light-field calibration provides significant computational benefits by eliminating some intermediate processing steps found in other methods. Finally, the flexibility of this method is shown for a nonplanar calibration.
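
    A polynomial volume calibration of this general kind can be fit by ordinary least squares from matched point sets, for instance dot-card targets reconstructed under thin-lens assumptions versus their known true positions; this sketch assumes generic monomial features and is not the paper's specific parameterization:

```python
import numpy as np
from itertools import combinations_with_replacement

def poly_features(points, degree=3):
    """Monomial features of (x, y, z) points up to the given degree."""
    cols = [np.ones(len(points))]
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(3), d):
            cols.append(np.prod(points[:, list(idx)], axis=1))
    return np.column_stack(cols)

def fit_volume_calibration(measured, true, degree=3):
    """Least-squares polynomial mapping from reconstructed (thin-lens)
    coordinates to true target coordinates, one coefficient column per axis."""
    features = poly_features(measured, degree)
    coeffs, *_ = np.linalg.lstsq(features, true, rcond=None)
    return coeffs

def apply_calibration(coeffs, points, degree=3):
    """Apply the fitted polynomial mapping to new reconstructed points."""
    return poly_features(points, degree) @ coeffs
```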

  18. Mapping rice ecosystem dynamics and greenhouse gas emissions using multiscale imagery and biogeochemical models

    NASA Astrophysics Data System (ADS)

    Salas, W.; Torbick, N.

    2017-12-01

    Rice greenhouse gas (GHG) emissions in production hot spots have been mapped using multiscale satellite imagery and a process-based biogeochemical model. The multiscale Synthetic Aperture Radar (SAR) and optical imagery were co-processed and fed into a machine learning framework to map paddy attributes that are tuned using field observations and surveys. Geospatial maps of rice extent, crop calendar, hydroperiod, and cropping intensity were then used to parameterize the DeNitrification-DeComposition (DNDC) model to estimate emissions. Results, in the Red River Delta for example, show total methane emissions at 345.4 million kgCH4-C, equivalent to 11.5 million tonnes CO2e (carbon dioxide equivalent). We further assessed the role of Alternate Wetting and Drying and its impact on GHG emissions and yield across production hot spots, with uncertainty estimates. The approach described in this research provides a framework for using SAR to derive maps of rice and landscape characteristics to drive process models like DNDC. These types of tools and approaches will support the next generation of Monitoring, Reporting, and Verification (MRV) to combat climate change and support ecosystem service markets.

  19. Geologic map of the western Haji-Gak iron deposit, Bamyan Province, Afghanistan, modified from the 1965 original map compilation of V.V. Reshetniak and I.K. Kusov

    USGS Publications Warehouse

    Renaud, Karine M.; Tucker, Robert D.; Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geologic-prospecting plan of western area of Hajigak iron-ore deposit, scale 1:2,000, which was compiled by V.V. Reshetniak and I.K. Kusov in 1965. (Refer to the References Cited section in the Map PDF for complete citations of the original map and related reports.) USGS scientists, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original documents and also visited the field area in November 2009. This modified map illustrates the geological structure of the western Haji-Gak iron deposit and includes cross sections of the same area. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and includes modifications based on our examination of that document. We constructed the cross sections from data derived from the original map. Elevations on the cross sections are derived from the original Soviet topography and may not match the newer topography used on the current map. We have attempted to translate the original Russian terminology and rock classification into modern English geologic usage as literally as possible without changing any genetic or process-oriented implications in the original descriptions. We also use the age designations from the original map. The unit colors on the map and cross sections differ from the colors shown on the original version. The units are colored according to the color and pattern scheme of the Commission for the Geological Map of the World (CGMW) (http://www.ccgm.org).

  20. Mapping cell-specific functional connections in the mouse brain using ChR2-evoked hemodynamics (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bauer, Adam Q.; Kraft, Andrew; Baxter, Grant A.; Bruchas, Michael; Lee, Jin-Moo; Culver, Joseph P.

    2017-02-01

    Functional magnetic resonance imaging (fMRI) has transformed our understanding of the brain's functional organization. However, mapping subunits of a functional network using hemoglobin alone presents several disadvantages. Evoked and spontaneous hemodynamic fluctuations reflect ensemble activity from several populations of neurons, making it difficult to discern excitatory vs. inhibitory network activity. Still, blood-based methods of brain mapping remain powerful because hemoglobin provides endogenous contrast in all mammalian brains. To add greater specificity to hemoglobin assays, we integrated optical intrinsic signal (OIS) imaging with optogenetic stimulation to create an Opto-OIS mapping tool that combines the cell-specificity of optogenetics with label-free hemoglobin imaging. Before mapping, titrated photostimuli determined which stimulus parameters elicited linear hemodynamic responses in the cortex. Optimized stimuli were then scanned over the left hemisphere to create a set of optogenetically-defined effective connectivity (Opto-EC) maps. For many sites investigated, Opto-EC maps exhibited higher spatial specificity than those determined using spontaneous hemodynamic fluctuations. For example, resting-state functional connectivity (RS-FC) patterns exhibited widespread ipsilateral connectivity while Opto-EC maps contained distinct short- and long-range constellations of ipsilateral connectivity. Further, RS-FC maps were usually symmetric about midline while Opto-EC maps displayed more heterogeneous contralateral homotopic connectivity. Both Opto-EC and RS-FC patterns were compared to mouse connectivity data from the Allen Institute. Unlike RS-FC maps, Thy1-based maps collected in awake, behaving mice closely recapitulated the connectivity structure derived using ex vivo anatomical tracer methods. Opto-OIS mapping could be a powerful tool for understanding cellular and molecular contributions to network dynamics and processing in the mouse brain.

  1. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    PubMed

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
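
    As a rough illustration of a Bayesian-style mapping-quality estimate of the kind described above (not AlignerBoost's actual algorithm), the sketch below converts per-hit alignment log-likelihoods into a Phred-scaled posterior for the best hit; the function name and toy numbers are assumptions.

```python
# Hypothetical sketch: Phred-scaled posterior of the best candidate alignment,
# computed from the log10-likelihoods of all reported hits for one read.
import math

def mapping_quality(log_likelihoods, max_q=60):
    """Return a Phred-scaled mapping quality for the best of several candidate hits."""
    best = max(log_likelihoods)
    # normalise in likelihood space; subtracting `best` keeps the exponentials stable
    weights = [10 ** (ll - best) for ll in log_likelihoods]
    posterior = 1.0 / sum(weights)            # the best hit has weight 10**0 == 1
    p_wrong = max(1.0 - posterior, 1e-300)
    return min(int(round(-10 * math.log10(p_wrong))), max_q)

# a read with one strong hit and two weak repetitive hits -> high quality
print(mapping_quality([-5.0, -9.0, -9.5]))
# a read that maps equally well to two repeats -> quality near zero
print(mapping_quality([-5.0, -5.0]))
```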

  2. Cross-terminology mapping challenges: a demonstration using medication terminological systems.

    PubMed

    Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer V; Chute, Christopher G; Johnson, Todd R

    2012-08-01

    Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems-a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. Copyright © 2012 Elsevier Inc. All rights reserved.
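
    A semi-automated string-comparison step of the sort mentioned above can be sketched as follows: rank candidate target-terminology descriptions by string similarity and leave the final selection to an expert. The codes, descriptions, and threshold are illustrative assumptions, not data from the study.

```python
# Hedged illustration (not the paper's pipeline): propose ranked candidates for
# each local medication description, for subsequent expert review.
from difflib import SequenceMatcher

def candidate_matches(source_term, target_terms, top_n=3, threshold=0.6):
    """Rank target descriptions by string similarity to a source description."""
    scored = []
    for code, desc in target_terms.items():
        ratio = SequenceMatcher(None, source_term.lower(), desc.lower()).ratio()
        if ratio >= threshold:
            scored.append((ratio, code, desc))
    return sorted(scored, reverse=True)[:top_n]

# toy target terminology; codes and descriptions are made up for illustration
target = {
    "C001": "metoprolol tartrate 50 mg oral tablet",
    "C002": "metoprolol succinate 100 mg extended release tablet",
    "C003": "lisinopril 10 mg oral tablet",
}
for score, code, desc in candidate_matches("metoprolol 50mg tablet", target):
    print(f"{score:.2f}  {code}  {desc}")
```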

  3. An efficient cardiac mapping strategy for radiofrequency catheter ablation with active learning.

    PubMed

    Feng, Yingjing; Guo, Ziyan; Dong, Ziyang; Zhou, Xiao-Yun; Kwok, Ka-Wai; Ernst, Sabine; Lee, Su-Lin

    2017-07-01

    A major challenge in radiofrequency catheter ablation procedures is the voltage and activation mapping of the endocardium, given a limited mapping time. By learning from expert interventional electrophysiologists (operators), while also making use of an active-learning framework, guidance on performing cardiac voltage mapping can be provided to novice operators or even directly to catheter robots. A learning from demonstration (LfD) framework, based upon previous cardiac mapping procedures performed by an expert operator, in conjunction with Gaussian process (GP) model-based active learning, was developed to efficiently perform voltage mapping over right ventricles (RV). The GP model was used to output the next best mapping point, while getting updated towards the underlying voltage data pattern as more mapping points are taken. A regularized particle filter was used to keep track of the kernel hyperparameter used by GP. The travel cost of the catheter tip was incorporated to produce time-efficient mapping sequences. The proposed strategy was validated on a simulated 2D grid mapping task, with leave-one-out experiments on 25 retrospective datasets, in an RV phantom using the Stereotaxis Niobe ® remote magnetic navigation system, and on a tele-operated catheter robot. In comparison with an existing geometry-based method, regression error was reduced and was minimized at a faster rate over retrospective procedure data. A new method of catheter mapping guidance has been proposed based on LfD and active learning. The proposed method provides real-time guidance for the procedure, as well as a live evaluation of mapping sufficiency.
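
    The Gaussian process-based selection of the next mapping point can be illustrated with a small numpy sketch: fit a GP to the locations mapped so far and pick the candidate that maximises predictive uncertainty minus a travel-cost penalty. This is a simplified, assumption-laden sketch (fixed kernel length-scale, no particle-filter hyperparameter tracking), not the authors' implementation.

```python
import numpy as np

def rbf(a, b, length=5.0, var=1.0):
    """Squared-exponential covariance between two sets of 2D points."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def next_mapping_point(mapped, candidates, tip, noise=1e-2, travel_weight=0.05):
    """Candidate maximising GP predictive std minus travel cost from the catheter tip.

    With a fixed kernel the predictive variance depends only on where samples were
    taken; the paper additionally adapts the kernel to the measured voltages with a
    particle filter, which is omitted here.
    """
    K = rbf(mapped, mapped) + noise * np.eye(len(mapped))
    Ks = rbf(candidates, mapped)
    L = np.linalg.cholesky(K)
    v = np.linalg.solve(L, Ks.T)
    prior_var = np.diag(rbf(candidates, candidates))
    std = np.sqrt(np.clip(prior_var - (v ** 2).sum(0), 0.0, None))
    travel = np.linalg.norm(candidates - tip, axis=1)
    return candidates[np.argmax(std - travel_weight * travel)]

# toy 2D grid-mapping example: ten points already mapped, pick the next one
rng = np.random.default_rng(2)
mapped = rng.uniform(0, 20, size=(10, 2))
grid = np.stack(np.meshgrid(np.arange(21.0), np.arange(21.0)), -1).reshape(-1, 2)
print("next point to map:", next_mapping_point(mapped, grid, tip=mapped[-1]))
```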

  4. Cross-terminology mapping challenges: A demonstration using medication terminological systems

    PubMed Central

    Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer; Chute, Christopher G.; Johnson, Todd R.

    2015-01-01

    Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems—a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED-CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. PMID:22750536

  5. Scoping of Flood Hazard Mapping Needs for Hancock County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Hancock County. Scoping activities included assembling existing data and map needs information for communities in Hancock County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps (all types) in Hancock County, Maine, is at least 19 years. Most of these studies were published in the late 1980s and early 1990s, and no study is more recent than 1992. Some towns have partial maps that are more recent than their study, indicating that the true average age of the data is probably more than 19 years. Since the studies were done, development has occurred in some of the watersheds and the characteristics of the watersheds have changed. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.

  6. Self Consistent Bathymetric Mapping Using Sub-maps: Survey Results From the TAG Hydrothermal Structure

    NASA Astrophysics Data System (ADS)

    Roman, C. N.; Reves-Sohn, R.; Singh, H.; Humphris, S.

    2005-12-01

    The spatial resolution of microbathymetry maps created using robotic vehicles such as ROVs, AUVs and manned submersibles in the deep ocean is currently limited by the accuracy of the vehicle navigation data. Errors in the vehicle position estimate commonly exceed the ranging errors of the acoustic mapping sensor itself, which creates inconsistency in the map making process and produces artifacts that lower resolution and distort map integrity. We present a methodology for producing self-consistent maps and improving vehicle position estimation by exploiting accurate local navigation and utilizing terrain relative measurements. The complete map is broken down into individual "sub-maps", which are generated using short term Doppler based navigation. The sub-maps are pairwise registered to constrain the vehicle position estimates by matching terrain that has been imaged multiple times. This procedure is implemented using a delayed state Kalman filter to incorporate the sub-map registrations as relative position measurements between previously visited vehicle locations. Archiving of previous positions in a filter state vector allows for continual adjustment of the sub-map locations. The terrain registration is accomplished using a two dimensional correlation and a six degree of freedom point cloud alignment method tailored to bathymetric data. This registration procedure is applicable to fully 3-dimensional complex underwater environments. The complete bathymetric map is then created from the union of all sub-maps that have been aligned in a consistent manner. The method is applied to an SM2000 multibeam survey of the TAG hydrothermal structure on the Mid-Atlantic Ridge at 26°N using the Jason II ROV. The survey included numerous crossing tracklines designed to test this algorithm, and the final gridded bathymetry data is sub-meter accurate. The high-resolution map has allowed for the identification of previously unrecognized fracture patterns associated with flow focusing at TAG, as well as imaging of fine-scale features such as individual sulfide talus blocks and ODP re-entry cones.
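
    The two-dimensional correlation step used in the sub-map registration can be illustrated with a simplified sketch that estimates only a horizontal translation between two overlapping depth grids via FFT-based cross-correlation; the full method additionally solves a six-degree-of-freedom alignment and feeds the result to a delayed-state Kalman filter. The grid names and synthetic terrain are assumptions.

```python
# Illustrative sketch under simplifying assumptions (translation only, circular
# correlation): estimate the (row, col) offset between two overlapping sub-maps.
import numpy as np

def submap_offset(grid_a, grid_b):
    """Return the (row, col) shift that best aligns grid_b to grid_a."""
    a = grid_a - grid_a.mean()
    b = grid_b - grid_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map peak indices to signed shifts
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shift)

# toy example: the same synthetic terrain observed with a known (3, -5) offset
rng = np.random.default_rng(3)
terrain = rng.normal(size=(128, 128)).cumsum(0).cumsum(1)   # smooth-ish surface
shifted = np.roll(terrain, shift=(3, -5), axis=(0, 1))
print(submap_offset(shifted, terrain))   # expect (3, -5)
```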

  7. Building Better Volcanic Hazard Maps Through Scientific and Stakeholder Collaboration

    NASA Astrophysics Data System (ADS)

    Thompson, M. A.; Lindsay, J. M.; Calder, E.

    2015-12-01

    All across the world information about natural hazards such as volcanic eruptions, earthquakes and tsunami is shared and communicated using maps that show which locations are potentially exposed to hazards of varying intensities. Unlike earthquakes and tsunami, which typically produce one dominant hazardous phenomenon (ground shaking and inundation, respectively), volcanic eruptions can produce a wide variety of phenomena that range from near-vent (e.g. pyroclastic flows, ground shaking) to distal (e.g. volcanic ash, inundation via tsunami), and that vary in intensity depending on the type and location of the volcano. This complexity poses challenges in depicting volcanic hazard on a map, and to date there has been no consistent approach, with a wide range of hazard maps produced and little evaluation of their relative efficacy. Moreover, in traditional hazard mapping practice, scientists analyse data about a hazard, and then display the results on a map that is then presented to stakeholders. This one-way, top-down approach to hazard communication does not necessarily translate into effective hazard education, or, as tragically demonstrated by Nevado del Ruiz, Colombia in 1985, its use in risk mitigation by civil authorities. Furthermore, messages taken away from a hazard map can be strongly influenced by its visual design. Thus, hazard maps are more likely to be useful, usable and used if relevant stakeholders are engaged during the hazard map process to ensure a) the map is designed in a relevant way and b) the map takes into account how users interpret and read different map features and designs. The IAVCEI Commission on Volcanic Hazards and Risk has recently launched a Hazard Mapping Working Group to collate some of these experiences in graphically depicting volcanic hazard from around the world, including Latin America and the Caribbean, with the aim of preparing some Considerations for Producing Volcanic Hazard Maps that may help map makers in the future.

  8. Preliminary soil-slip susceptibility maps, southwestern California

    USGS Publications Warehouse

    Morton, Douglas M.; Alvarez, Rachel M.; Campbell, Russell H.; Digital preparation by Bovard, Kelly R.; Brown, D.T.; Corriea, K.M.; Lesser, J.N.

    2003-01-01

    This group of maps shows relative susceptibility of hill slopes to the initiation sites of rainfall-triggered soil slip-debris flows in southwestern California. As such, the maps offer a partial answer to one of the three parts necessary to predict the soil-slip/debris-flow process. A complete prediction of the process would include assessments of “where”, “when”, and “how big”. These maps empirically show part of the “where” of prediction (i.e., relative susceptibility to sites of initiation of the soil slips) but do not attempt to show the extent of run out of the resultant debris flows. Some information pertinent to “when” the process might begin is developed. “When” is determined mostly by dynamic factors such as rainfall rate and duration, for which local variations are not amenable to long-term prediction. “When” information is not provided on the maps but is described later in this narrative. The prediction of “how big” is addressed indirectly by restricting the maps to a single type of landslide process—soil slip-debris flows. The susceptibility maps were created through an iterative process from two kinds of information. First, locations of sites of past soil slips were obtained from inventory maps of past events. Aerial photographs, taken during six rainy seasons that produced abundant soil slips, were used as the basis for the soil slip-debris flow inventory. Second, digital elevation models (DEM) of the areas that were inventoried were used to analyze the spatial characteristics of soil slip locations. These data were supplemented by observations made on the ground. Certain physical attributes of the locations of the soil-slip debris flows were found to be important and others were not. The most important attribute was the mapped bedrock formation at the site of initiation of the soil slip. However, because the soil slips occur in surficial materials overlying the bedrock units, the bedrock formation can only serve as a surrogate for the susceptibility of the overlying surficial materials. The maps of susceptibility were created from those physical attributes found to be important from the inventories. The multiple inventories allow a model to be created from one set of inventory data and evaluated with others. The resultant maps of relative susceptibility represent the best estimate generated from available inventory and DEM data. Slope and aspect values used in the susceptibility analysis were 10-meter DEM cells at a scale of 1:24,000. For most of the area 10-meter DEMs were available; for those quadrangles that have only 30-meter DEMs, the 30-meter DEMs were resampled to 10 meters to maintain resolution of 10-meter cells. Geologic unit values used in the susceptibility analysis were five-meter cells. For convenience, the soil slip susceptibility values are assembled on 1:100,000-scale bases. Any area of the 1:100,000-scale maps can be transferred to a 1:24,000-scale base without any loss of accuracy. Figure 32 is an example of part of a 1:100,000-scale susceptibility map transferred back to a 1:24,000-scale quadrangle.
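
    For readers who want a feel for the terrain attributes mentioned above, the sketch below derives slope and a grid-relative downslope direction from a 10-meter DEM with simple finite differences; this is a hypothetical illustration (the study used standard GIS processing), and the toy DEM is invented.

```python
import numpy as np

def slope_and_downslope(dem, cell_size=10.0):
    """Slope in degrees and grid-relative downslope direction (degrees CCW from the +x/column axis)."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)        # elevation change per metre along rows, columns
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    downslope = np.degrees(np.arctan2(-dz_dy, -dz_dx)) % 360.0
    return slope, downslope

# toy DEM: a plane dropping 2 m per 10-m cell toward increasing x (a 20% grade)
x = np.arange(0.0, 500.0, 10.0)
dem = np.tile(-0.2 * x, (50, 1))
slope, downslope = slope_and_downslope(dem)
print(round(float(slope.mean()), 1), "degrees slope,",
      round(float(downslope.mean()), 1), "degrees downslope direction")
```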

  9. Implementation of the EU environmental noise directive: lessons from the first phase of strategic noise mapping and action planning in Ireland.

    PubMed

    King, E A; Murphy, E; Rice, H J

    2011-03-01

    The first phase of noise mapping and action planning in Ireland, in accordance with EU Directive 2002/49/EC, is now complete. In total this included one agglomeration, one airport and approximately 600 km of major roads outside the agglomeration. These noise maps describe the level of noise exposure of approximately 1.25 million people. The first phase of noise mapping was dealt with by five noise mapping bodies while 26 action planning authorities were involved in the development of the associated action plans. The second phase of noise mapping, due to be completed in 2012, sees a reduction in the defined thresholds describing the required agglomerations, roads and railways that have to be mapped. This will have a significant impact on the extent of mapping required. In Ireland this will result in an increased number of local authorities being required to develop strategic noise maps for their area along with the further development of associated action plans. It is appropriate at this point to review the work process and results from the first phase of noise mapping in Ireland in order to establish areas that could be improved, throughout the noise mapping project. In this paper a review of the implementation procedures focussing on (dominant) road traffic noise is presented. It is identified that more standardisation is needed and this could be achieved by the establishment of a national expert steering group. Copyright © 2010 Elsevier Ltd. All rights reserved.

  10. Flexible Learning Itineraries Based on Conceptual Maps

    ERIC Educational Resources Information Center

    Agudelo, Olga Lucía; Salinas, Jesús

    2015-01-01

    The use of learning itineraries based on conceptual maps is studied in order to propose a more flexible instructional design that strengthens the learning process focused on the student, generating non-linear processes, characterising its elements, setting up relationships between them and shaping a general model with specifications for each…

  11. Mapping a Process of Negotiated Identity among Incarcerated Male Juvenile Offenders

    ERIC Educational Resources Information Center

    Abrams, Laura S.; Hyun, Anna

    2009-01-01

    Building on theories of youth identity transitions, this study maps a process of negotiated identity among incarcerated young men. Data are drawn from an ethnographic study of three juvenile correctional institutions and longitudinal semistructured interviews with facility residents. Cross-case analysis of 10 cases finds that youth offenders adapted…

  12. The impact of CmapTools utilization towards students' conceptual change on optics topic

    NASA Astrophysics Data System (ADS)

    Rofiuddin, Muhammad Rifqi; Feranie, Selly

    2017-05-01

    Science teachers need to help students identify their prior ideas and modify them based on scientific knowledge. This process is called conceptual change. One of the essential tools for analyzing students' conceptual change is the concept map. Concept maps are graphical representations of knowledge that are comprised of concepts and the relationships between them. Constructing concept maps is implemented by adapting the role of technology to support the learning process, in line with Educational Ministry Regulation No. 68 year 2013. The Institute for Human and Machine Cognition (IHMC) has developed CmapTools, a client-server software for easily constructing and visualizing concept maps. This research aims to investigate secondary students' conceptual change after experiencing a five-stage conceptual teaching model utilizing CmapTools in learning optics. A weak experimental method with a one-group pretest-posttest design was implemented in this study to collect preliminary and post concept maps as qualitative data. The sample was taken purposively from 8th-grade students (n = 22) at a private school in Bandung, West Java. Conceptual change, based on comparison of preliminary and post concept map construction, was assessed with a rubric for concept map scoring and structure. Results show significant conceptual change differences at 50.92%, elaborated into concept map elements such as propositions and hierarchical levels in the high category, cross links in the medium category, and specific examples in the low category. All of the results are supported by the students' positive response towards CmapTools utilization, which indicates improvement of motivation, interest, and behavior towards the physics lesson.

  13. Identification of QTLs Associated with Callogenesis and Embryogenesis in Oil Palm Using Genetic Linkage Maps Improved with SSR Markers

    PubMed Central

    Ting, Ngoot-Chin; Jansen, Johannes; Nagappan, Jayanthi; Ishak, Zamzuri; Chin, Cheuk-Weng; Tan, Soon-Guan; Cheah, Suan-Choo; Singh, Rajinder

    2013-01-01

    Clonal reproduction of oil palm by means of tissue culture is a very inefficient process. Tissue culturability is known to be genotype dependent, with some genotypes being more amenable to tissue culture than others. In this study, genetic linkage maps enriched with simple sequence repeat (SSR) markers were developed for dura (ENL48) and pisifera (ML161), the two fruit forms of oil palm, Elaeis guineensis. The SSR markers were mapped onto earlier reported parental maps based on amplified fragment length polymorphism (AFLP) and restriction fragment length polymorphism (RFLP) markers. The new linkage map of ENL48 contains 148 markers (33 AFLPs, 38 RFLPs and 77 SSRs) in 23 linkage groups (LGs), covering a total map length of 798.0 cM. The ML161 map contains 240 markers (50 AFLPs, 71 RFLPs and 119 SSRs) in 24 LGs covering a total of 1,328.1 cM. Using the improved maps, two quantitative trait loci (QTLs) associated with tissue culturability were identified, one each for callusing rate and embryogenesis rate. A QTL for callogenesis was identified in LGD4b of ENL48 and explained 17.5% of the phenotypic variation. For embryogenesis rate, a QTL was detected on LGP16b in ML161 and explained 20.1% of the variation. This study is the first attempt to identify QTLs associated with tissue culture amenability in oil palm, which is an important step towards understanding the molecular processes underlying clonal regeneration of oil palm. PMID:23382832

  14. Multi-Depth-Map Raytracing for Efficient Large-Scene Reconstruction.

    PubMed

    Arikan, Murat; Preiner, Reinhold; Wimmer, Michael

    2016-02-01

    With the enormous advances of the acquisition technology over the last years, fast processing and high-quality visualization of large point clouds have gained increasing attention. Commonly, a mesh surface is reconstructed from the point cloud and a high-resolution texture is generated over the mesh from the images taken at the site to represent surface materials. However, this global reconstruction and texturing approach becomes impractical with increasing data sizes. Recently, due to its potential for scalability and extensibility, a method for texturing a set of depth maps in a preprocessing step and stitching them at runtime has been proposed to represent large scenes. However, the rendering performance of this method is strongly dependent on the number of depth maps and their resolution. Moreover, for the proposed scene representation, every single depth map has to be textured by the images, which in practice heavily increases processing costs. In this paper, we present a novel method to break these dependencies by introducing an efficient raytracing of multiple depth maps. In a preprocessing phase, we first generate high-resolution textured depth maps by rendering the input points from image cameras and then perform a graph-cut based optimization to assign a small subset of these points to the images. At runtime, we use the resulting point-to-image assignments (1) to identify for each view ray which depth map contains the closest ray-surface intersection and (2) to efficiently compute this intersection point. The resulting algorithm accelerates both the texturing and the rendering of the depth maps by an order of magnitude.

  15. Application of Remote Sensing in Geological Mapping, Case Study al Maghrabah Area - Hajjah Region, Yemen

    NASA Astrophysics Data System (ADS)

    Al-Nahmi, F.; Saddiqi, O.; Hilali, A.; Rhinane, H.; Baidder, L.; El arabi, H.; Khanbari, K.

    2017-11-01

    Remote sensing technology plays an important role today in geological survey, mapping, analysis and interpretation, providing a unique opportunity to investigate the geological characteristics of remote areas of the earth's surface without the need to gain access to an area on the ground. The aim of this study is to produce a geological map of the study area. The data used are Sentinel-2 imagery. The methods applied in this study are as follows. The Optimum Index Factor (OIF) is a statistical value that can be used to select the optimum combination of three bands in a satellite image; it is based on the total variance within bands and the correlation coefficients between bands. Independent component analysis (ICA) (3, 4, 6) is a statistical and computational technique for revealing hidden factors that underlie sets of random variables, measurements, or signals. Minimum Noise Fraction (MNF) (1, 2, 3) is used to determine the inherent dimensionality of image data, to segregate noise in the data, and to reduce the computational requirements for subsequent processing. The Optimum Index Factor is a good method for choosing the best bands for lithological mapping; ICA and MNF are also practical ways to extract structural geology maps. The results in this paper indicate that the studied area can be divided into four main geological units: basement rocks (metavolcanics, metasediments), sedimentary rocks, intrusive rocks, and volcanic rocks. The method used in this study offers great potential for lithological mapping using Sentinel-2 imagery; the results were compared with existing geologic maps, were superior, and could be used to update the existing maps.
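
    The Optimum Index Factor described above (sum of band standard deviations divided by the sum of absolute pairwise correlation coefficients) can be computed for every three-band combination with a few lines of Python; the band names and synthetic arrays below are placeholders, not Sentinel-2 data from the study.

```python
# Small sketch of the OIF band-selection statistic for all 3-band combinations.
import numpy as np
from itertools import combinations

def oif(bands):
    """OIF = sum of band standard deviations / sum of absolute pairwise correlations."""
    stds = [b.std() for b in bands]
    corrs = [abs(np.corrcoef(a.ravel(), b.ravel())[0, 1]) for a, b in combinations(bands, 2)]
    return sum(stds) / sum(corrs)

def best_combination(band_stack):
    """Return the highest-OIF 3-band combination of a dict {name: 2D array}."""
    scored = [(oif([band_stack[n] for n in names]), names)
              for names in combinations(band_stack, 3)]
    return max(scored)

# toy "bands": two correlated bands plus two independent ones
rng = np.random.default_rng(4)
base = rng.normal(size=(100, 100))
stack = {"B2": base + rng.normal(scale=0.2, size=base.shape),
         "B3": base + rng.normal(scale=0.2, size=base.shape),
         "B4": 2 * rng.normal(size=base.shape),
         "B8": rng.normal(size=base.shape)}
print(best_combination(stack))
```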

  16. Phosphorylated and Nonphosphorylated PfMAP2 Are Localized in the Nucleus, Dependent on the Stage of Plasmodium falciparum Asexual Maturation

    PubMed Central

    Dahalan, Farah Aida; Sidek, Hasidah Mohd; Murtey, Mogana Das; Embi, Mohammed Noor; Ibrahim, Jamaiah; Fei Tieng, Lim; Zakaria, Nurul Aiezzah

    2016-01-01

    Plasmodium falciparum mitogen-activated protein (MAP) kinases, a family of enzymes central to signal transduction processes including inflammatory responses, are a promising target for antimalarial drug development. Our study shows for the first time that the P. falciparum-specific MAP kinase 2 (PfMAP2) is localized in the nucleus of all of the asexual erythrocytic stages of P. falciparum and is particularly elevated in its phosphorylated form. It was also discovered that PfMAP2 is expressed in its highest quantity during the early trophozoite (ring form) stage and significantly reduced in the mature trophozoite and schizont stages. Although the phosphorylated form of the kinase is always more prevalent, its ratio relative to the nonphosphorylated form remained constant irrespective of the parasites' developmental stage. We have also shown that the TSH motif specifically renders PfMAP2 genetically divergent from the other plasmodial MAP kinase activation sites using Neighbour Joining analysis. Furthermore, a TSH motif-specific designed antibody is crucial in determining the location of the expression of the PfMAP2 protein. However, by using immunoelectron microscopy, phosphorylated PfMAP2 was detected ubiquitously in the parasitized erythrocytes. In summary, PfMAP2 may play a far more important role than previously thought and is a worthy candidate for research as an antimalarial. PMID:27525262

  17. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    PubMed

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
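
    A minimal sketch of a term-level mapping pass of the general kind described above (not MTERMS itself): try exact and token-normalised string matches against the target terminology, then score the automatic mapping against an expert gold standard with precision and recall. All codes and descriptions are invented for illustration.

```python
# Hedged sketch: normalised term matching plus precision/recall against a gold standard.
import re

def normalise(term):
    return re.sub(r"[^a-z0-9 ]", " ", term.lower()).split()

def map_terms(source, target):
    """Map {source_code: description} to target codes by normalised token-set match."""
    norm_target = {frozenset(normalise(d)): c for c, d in target.items()}
    mapping = {}
    for code, desc in source.items():
        key = frozenset(normalise(desc))
        if key in norm_target:
            mapping[code] = norm_target[key]
    return mapping

def precision_recall(predicted, gold):
    correct = sum(1 for k, v in predicted.items() if gold.get(k) == v)
    precision = correct / len(predicted) if predicted else 0.0
    recall = correct / len(gold) if gold else 0.0
    return precision, recall

# toy data; codes are illustrative only
source = {"MDD1": "Aspirin 81 MG Oral Tablet", "MDD2": "warfarin sodium 5 mg tablet"}
target = {"RX1": "aspirin 81 mg oral tablet", "RX2": "warfarin sodium 2 mg oral tablet"}
gold = {"MDD1": "RX1"}
pred = map_terms(source, target)
print(pred, precision_recall(pred, gold))
```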

  18. Mars Polar Thermal Inertia and Albedo Properties Using TES Data

    NASA Astrophysics Data System (ADS)

    Scherbenski, J. M.; Paige, D. A.

    2002-12-01

    We present north and south polar thermal inertia and albedo maps derived from MGS TES observations. The maps were derived using the same robust approach developed to make polar thermal inertia and albedo maps using IRTM observations by Paige, Bachman, and Keegan (1994) and Paige and Keegan (1994). The data processing approach involved reading TES reduced data records in PDS format using the Vanilla software tool, and sending the data down a processing pipeline that constrains and bins the data, and compares it to the results of a diurnal and seasonal thermal model to obtain the best fit thermal inertia and apparent albedo. To facilitate comparison, the TES maps were created at the same Ls ranges as the published IRTM maps using TES spectral surface temperature results. The north polar maps used TES nadir observations obtained during a 50-day period from Ls 98.39 to Ls 121.25. The south polar maps used TES nadir observations obtained during a 30-day period from Ls 321.58 to 338.07. The creation of these maps employs a basic thermal model that does not include the effects of the atmosphere, as well as a one-dimensional radiative-convective model that does include the effects of the atmosphere. The spatial resolution of the north polar maps is 0.1 degrees of latitude and 1.0 degrees of longitude. The spatial resolution of the south polar maps is 2 degrees of latitude and 2 degrees of longitude. The TES north polar maps show the residual cap area in significantly greater detail than has been available previously. The IRTM maps showed that the north polar sand sea that surrounds the cap has unusually low thermal inertia. The TES maps confirm this conclusion, but also show that the dark re-entrant features in Chasma Boreale and elsewhere on the cap also have low thermal inertias. This strongly supports the proposal that these dark re-entrants are the sources of the dune material. The TES maps also show that the darker layered deposits which are found at the periphery of the cap have high thermal inertias, just like the brighter water ice deposits elsewhere on the cap. This strongly supports the conclusion that even the darker north polar layered deposits are mostly ice. The TES south polar maps show similar features to those observed by IRTM, including the presence of a low thermal inertia region centered on the south pole, and a region of anomalously high apparent albedo southward of 78 degrees latitude. References: Paige, D. A., J. E. Bachman and K. D. Keegan, Thermal and albedo mapping of the polar regions of Mars using Viking thermal mapper observations: 1. North polar region, J. Geophys. Res. 99, 24,959-25,991, 1994. Paige, D. A. and K. D. Keegan, Thermal and albedo mapping of the polar regions of Mars using Viking thermal mapper observations: 2. South polar region, J. Geophys. Res. 99, 24,993-26,013, 1994.

  19. Mapping knowledge translation and innovation processes in Cancer Drug Development: the case of liposomal doxorubicin.

    PubMed

    Fajardo-Ortiz, David; Duran, Luis; Moreno, Laura; Ochoa, Hector; Castaño, Victor M

    2014-09-03

    We explored how the knowledge translation and innovation processes are structured when they result in innovations, as in the case of liposomal doxorubicin research. In order to map the processes, a literature network analysis was made through Cytoscape and semantic analysis was performed with GoPubMed, which is based on the controlled vocabularies MeSH (Medical Subject Headings) and GO (Gene Ontology). We found clusters related to different stages of the technological development (invention, innovation and imitation) and the knowledge translation process (preclinical, translational and clinical research), and we were able to map the historic emergence of Doxil as a paradigmatic nanodrug. This research could be a powerful methodological tool for decision-making and innovation management in drug delivery research.

  20. Postprocessing classification images

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1979-01-01

    Program cleans up remote-sensing maps. It can be used with existing image-processing software. Remapped images closely resemble familiar resource information maps and can replace or supplement classification images not postprocessed by this program.

  1. 3D Geological Mapping - uncovering the subsurface to increase environmental understanding

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Mathers, S.; Peach, D.

    2012-12-01

    Geological understanding is required by many disciplines studying natural processes, from hydrology to landscape evolution. The subsurface structure of rocks and soils and their properties occupies three-dimensional (3D) space, and geological processes operate in time. Traditionally, geologists have captured their spatial and temporal knowledge in 2-dimensional maps and cross-sections and through narrative, because paper maps and, later, two-dimensional geographical information systems (GIS) were the only tools available to them. Another major constraint on using more explicit and numerical systems to express geological knowledge is the fact that a geologist only ever observes and measures a fraction of the system they study. Only on rare occasions does the geologist have access to enough real data to generate meaningful predictions of the subsurface without the input of conceptual understanding developed from knowledge of the geological processes responsible for the deposition, emplacement and diagenesis of the rocks. This in turn has led to geology becoming an increasingly marginalised science as other disciplines have embraced the digital world and have increasingly turned to implicit numerical modelling to understand environmental processes and interactions. Recent developments in geoscience methodology and technology have gone some way to overcoming these barriers, and geologists across the world are beginning to routinely capture their knowledge and combine it with all available subsurface data (of often highly varying spatial distribution and quality) to create regional and national three-dimensional geological maps. This is re-defining the way geologists interact with other science disciplines, as their concepts and knowledge are now expressed in an explicit form that can be used downstream to design the structure of process models. For example, groundwater modellers can refine their understanding of groundwater flow in three dimensions or even directly parameterize their numerical models using outputs from 3D mapping. In some cases model code is being re-designed in order to deal with the increasing geological complexity expressed by geologists. These 3D maps have inherent uncertainty, just as their predecessors, 2D geological maps, did, and there remains a significant body of work to quantify and effectively communicate this uncertainty. Here we present examples of regional and national 3D maps from Geological Survey Organisations worldwide and show how these are being used to better solve real-life environmental problems. The future challenge for geologists is to make these 3D maps easily available in an accessible and interoperable form so that the environmental science community can truly integrate the hidden subsurface into a common understanding of the whole geosphere.

  2. High-resolution Ceres LAMO atlas derived from Dawn FC images

    NASA Astrophysics Data System (ADS)

    Roatsch, T.; Kersten, E.; Matz, K. D.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C.

    2016-12-01

    Introduction: NASA's Dawn spacecraft has been orbiting the dwarf planet Ceres since December 2015 in LAMO (Low Altitude Mapping Orbit) with an altitude of about 400 km to characterize, for instance, the geology, topography, and shape of Ceres. One of the major goals of this mission phase is the global high-resolution mapping of Ceres. Data: The Dawn mission is equipped with a framing camera (FC). The framing camera took, until the time of writing, about 27,500 clear filter images in LAMO with a resolution of about 30 m/pixel and different viewing angles and different illumination conditions. Data Processing: The first step of the processing chain towards the cartographic products is to ortho-rectify the images to the proper scale and map projection type. This process requires detailed information of the Dawn orbit and attitude data and of the topography of the target. A high-resolution shape model was provided by stereo processing of the HAMO dataset; orbit and attitude data are available as reconstructed SPICE data. Ceres' HAMO shape model is used for the calculation of the ray intersection points while the map projection itself was done onto a reference sphere of Ceres. The final step is the controlled mosaicking of all nadir images to a global mosaic of Ceres, the so-called basemap. Ceres map tiles: The Ceres atlas will be produced at a scale of 1:250,000 and will consist of 62 tiles that conform to the quadrangle schema for Venus at 1:5,000,000. A map scale of 1:250,000 is a compromise between the very high resolution in LAMO and a proper map sheet size of the single tiles. Nomenclature: The Dawn team proposed to the International Astronomical Union (IAU) to use the names of gods and goddesses of agriculture and vegetation from world mythology as names for the craters and to use names of agricultural festivals of the world for other geological features. This proposal was accepted by the IAU and the team proposed 92 names for geological features to the IAU based on the LAMO mosaic. These feature names will be applied to the map tiles.

  3. High-resolution Ceres HAMO Atlas derived from Dawn FC Images

    NASA Astrophysics Data System (ADS)

    Roatsch, T.; Kersten, E.; Matz, K. D.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

    2015-12-01

    Introduction: NASA's Dawn spacecraft will orbit the dwarf planet Ceres in August and September 2015 in HAMO (High Altitude Mapping Orbit) with an altitude of about 1,500 km to characterize, for instance, the geology, topography, and shape of Ceres before it is transferred to the lowest orbit. One of the major goals of this mission phase is the global mapping of Ceres. Data: The Dawn mission is equipped with a framing camera (FC). The framing camera will take about 2600 clear filter images with a resolution of about 120 m/pixel and different viewing angles and different illumination conditions. Data Processing: The first step of the processing chain towards the cartographic products is to ortho-rectify the images to the proper scale and map projection type. This process requires detailed information of the Dawn orbit and attitude data and of the topography of the target. Both improved orientation and high-resolution shape models are provided by stereo processing of the HAMO dataset. Ceres' HAMO shape model is used for the calculation of the ray intersection points while the map projection itself will be done onto a reference sphere for Ceres. The final step is the controlled mosaicking of all nadir images to a global mosaic of Ceres, the so-called basemap. Ceres map tiles: The Ceres atlas will be produced at a scale of 1:750,000 and will consist of 15 tiles that conform to the quadrangle schema for small planets and medium-sized icy satellites. A map scale of 1:750,000 guarantees a mapping at the highest available Dawn resolution in HAMO. Nomenclature: The Dawn team proposed to the International Astronomical Union (IAU) to use the names of gods and goddesses of agriculture and vegetation from world mythology as names for the craters. This proposal was accepted by the IAU and the team proposed names for geological features to the IAU based on the HAMO mosaic. These feature names will be applied to the map tiles.

  4. Where to Go Next? Identifying Target Areas in the North Atlantic for Future Seafloor Mapping Initiatives

    NASA Astrophysics Data System (ADS)

    Woelfl, A. C.; Jencks, J.; Johnston, G.; Varner, J. D.; Devey, C. W.

    2017-12-01

    Human activities are rapidly expanding into the oceans, yet detailed bathymetric maps that would permit governments to formulate sensible usage rules do not exist for most of the seafloor. Changing this situation will require an enormous international mapping effort. To ensure that this effort is directed towards the regions most in need of mapping, we need to know which areas have already been mapped and which areas are potentially most interesting. Despite various mapping efforts in recent years, large parts of the Atlantic still lack detailed bathymetric information. To successfully plan future mapping efforts to fill these gaps, knowledge of current data coverage is imperative to avoid duplication of effort. While certain datasets are publicly available online (e.g. NOAA's NCEI, EMODnet, IHO-DCDB, LDEO's GMRT), many are not. However, with the limited information we do have at hand, the question remains: where should we map next? And what criteria should we take into account? In 2016, a study was undertaken as part of the efforts of the International Atlantic Seabed Mapping Working Group (ASMIWG). The ASMIWG, established by the Tri-Partite Galway Statement Implementation Committee, was tasked to develop a cohesive seabed mapping strategy for the Atlantic Ocean. The aim of our study was to develop a reproducible process for identifying and evaluating potential target areas within the North Atlantic that represent suitable sites for future bathymetric surveys. The sites were selected by applying a GIS-based suitability analysis that included specific user group-based parameters of the marine environment. Furthermore, information regarding current data coverage was gathered and taken into account in the selection process. The results reveal the suitability of sites within the North Atlantic based on the selected criteria. The three potential target sites should be seen as flexible suggestions for future mapping initiatives rather than a rigid, defined set of areas. This methodology can be adjusted to other areas of interest and can include a variety of parameters based on stakeholder interest. Furthermore, this work included only accessible and displayable information about multibeam data coverage and would certainly benefit from more easily available and discoverable datasets, or at least from location information.

  5. An ocean gazetteer for education and research

    NASA Astrophysics Data System (ADS)

    Delaney, R.; Staudigel, D.; Staudigel, H.

    2003-04-01

    Global travel, economy, and news coverage often challenge students' and teachers' knowledge of the geography of the seas. The International Hydrographic Organization (IHO) has published a description of all the major seas making up Earth's oceans, but there is currently no electronic tool that identifies them on a digital map. During an internship at Scripps Institution of Oceanography, we transferred the printed visual description of the seas from IHO publication 23 into a digital format. This digital map was turned into a (Flash) web application that allows a user to identify any of the IHO seas on a world map, simply by moving the computer cursor over it. In our presentation, we will describe the path taken to produce this web application and the learning process involved in this path during our internship at Scripps. The main steps in this process included the digitization of the official IHO maps and the transfer of this information onto a modern digital map by Smith and Sandwell. Adjustments were necessary due to the fact that many of the landmasses were placed incorrectly on a lat/long grid, off by as much as 100 km. Boundaries between seas were often misrepresented by the IHO as straight lines on a Mercator projection. Once the digitization of the seas was completed, we used the 2D animation environment Flash to produce an interactive map environment that allows any teacher or student of ocean geography to identify an ocean by name and location. Aside from learning about the geography of the oceans, we were introduced to the use of digitizers, we learned to make maps using Generic Mapping Tools (GMT) and digital global bathymetry data sets, and we learned about map projections. We studied Flash to produce an interactive map of the oceans that displays bathymetry and topography, highlighting any particular sea the cursor moves across. The name of the selected sea in our Flash application appears in a textbox on the bottom of the map. The result of this project can be found at http://earthref.org/PACER/beta/IH023seas.

  6. Optimisation of decontamination method and influence of culture media on the recovery of Mycobacterium avium subspecies paratuberculosis from spiked water sediments.

    PubMed

    Aboagye, G; Rowe, M T

    2018-07-01

    The recovery of Mycobacterium avium subspecies paratuberculosis (Map) from the environment can be a laborious process, owing to Map being fastidious, present in low numbers, and accompanied by high numbers of other microbial populations in such settings. Protocols (i.e. filtration, decontamination and modified elution) were devised to recover Map from spiked water sediments. Three culture media, Herrold's Egg Yolk Media (HEYM), Middlebrook 7H10 (M-7H10) and Bactec 12B, were then employed to grow the organism following its elution. In the sterile sediment samples, the recovery of Map differed significantly between times of exposure for each of HEYM and M-7H10, but not between the two media (P < 0.05). However, in the non-sterile sediment samples, HEYM grew other background microflora including moulds at all times of exposure, whereas 4 h of exposure followed by M-7H10 culture yielded Map colonies without any background microflora. Using sterile samples only for the Bactec 12B, the recovery of Map decreased as the time of exposure increased. Based on these findings, M-7H10 should be considered for the recovery of Map from the natural environment, including water sediments, where the recovery of diverse microbial species remains a challenge. Map is a robust pathogen that persists in the environment. In water treatment operations, Map associates with floccules and other particulate matter including sediments. It is also a fastidious organism, and its detection and recovery from the water environment is a laborious process and can be misleading amid the abundance of other mycobacterial species owing to their close resemblance in phylogenetic traits. In the absence of a reliable recovery method, Map continues to pose public health risks through biofilm in household water tanks, hence the need for the development of a reliable recovery protocol to monitor the presence of Map in water systems in order to curtail its public health risks. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Adding It Up: A Guide for Mapping Public Resources for Children, Youth and Families

    ERIC Educational Resources Information Center

    Flynn-Khan, Margaret; Ferber, Thaddeus; Gaines, Elizabeth; Pittman, Karen

    2006-01-01

    This guide is a joint effort from the Forum for Youth Investment and the Finance Project designed to help decision makers and community leaders both learn the importance of a good children, youth and families (CYF) resource map and lay out the process of creating or improving a CYF resource map of their own. The handbook has been designed to…

  8. Sampling intensity and normalizations: Exploring cost-driving factors in nationwide mapping of tree canopy cover

    Treesearch

    John Tipton; Gretchen Moisen; Paul Patterson; Thomas A. Jackson; John Coulston

    2012-01-01

    There are many factors that will determine the final cost of modeling and mapping tree canopy cover nationwide. For example, applying a normalization process to Landsat data used in the models is important in standardizing reflectance values among scenes and eliminating visual seams in the final map product. However, normalization at the national scale is expensive and...

  9. Subjectivity in Design Education: The Perception of the City through Personal Maps

    ERIC Educational Resources Information Center

    Yilmaz, Ebru

    2016-01-01

    Our mental maps of cities are limited by our personal perception and fragmented in the process. There are many inner and outer effects that shape our mental maps, and as a result the fragmented whole constitutes the total city image in our minds. To represent this image, an experimental study was conducted with a group of students…

  10. Runway Detection From Map, Video and Aircraft Navigational Data

    DTIC Science & Technology

    2016-03-01

    Runway Detection From Map, Video and Aircraft Navigational Data, by Jose R. Espinosa Gloria, Master's thesis, March 2016. Thesis Advisor: Roberto Cristi; Co-Advisor: Oleg… In the Mexican Navy, unmanned aerial vehicles (UAV) have been equipped with daylight and infrared cameras. Processing the video information obtained from these

  11. Failure detection in high-performance clusters and computers using chaotic map computations

    DOEpatents

    Rao, Nageswara S.

    2015-09-01

    A programmable media includes a processing unit capable of independent operation in a machine that can execute 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
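
    The detection idea described in the abstract relies on chaos amplifying any small computational discrepancy. A rough sketch of that principle, assuming a logistic map as the chaotic iteration and an arbitrary tolerance (the patent specifies neither):

```python
# Sketch of failure detection by comparing chaotic (logistic) map trajectories.
# A healthy component reproduces the reference trajectory exactly; any arithmetic,
# memory, or interconnect fault quickly diverges because the map is chaotic.

def logistic_trajectory(x0, r=3.99, steps=100):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def trajectories_diverge(traj_a, traj_b, tol=1e-12):
    """Flag a failure if the two trajectories differ beyond a small tolerance."""
    return any(abs(a - b) > tol for a, b in zip(traj_a, traj_b))

reference = logistic_trajectory(0.123456789)

# A faulty node: a tiny bit-flip-like perturbation early in the computation.
faulty = logistic_trajectory(0.123456789 + 1e-15)

print(trajectories_diverge(reference, reference))  # False: healthy node
print(trajectories_diverge(reference, faulty))     # True: perturbation amplified
```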

  12. Decomposition Technique for Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)

    2014-01-01

    The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from these), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of different sources of uncertainty present in the process. Next, the maps are used in an on-line mode where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both the current damage state and future damage accumulation. Remaining life is computed by subtracting the instant when the prediction is made from the instant when the extrapolated damage reaches the failure threshold.
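
    A minimal sketch of the decomposition idea, with synthetic data and simple linear regressors standing in for whatever models the patent actually uses: one map from a health feature to damage, one from operating conditions to damage rate, and an extrapolation of damage to a failure threshold.

```python
# Sketch of the two-map RUL decomposition: feature -> damage and
# operating condition -> damage rate, fitted offline and used online
# to extrapolate damage to a failure threshold. Data here are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# --- Offline: fit the two maps on ground-truth (synthetic) training data ---
features = rng.uniform(0, 1, size=(200, 1))             # e.g. a health feature
damage = 0.8 * features[:, 0] + rng.normal(0, 0.01, 200)
feature_to_damage = LinearRegression().fit(features, damage)

conditions = rng.uniform(0, 1, size=(200, 1))            # e.g. load level
damage_rate = 0.002 + 0.01 * conditions[:, 0] + rng.normal(0, 0.0005, 200)
condition_to_rate = LinearRegression().fit(conditions, damage_rate)

# --- Online: estimate current damage and extrapolate to the failure threshold ---
def remaining_useful_life(current_feature, expected_condition, threshold=1.0):
    d_now = feature_to_damage.predict([[current_feature]])[0]
    rate = condition_to_rate.predict([[expected_condition]])[0]
    if d_now >= threshold:
        return 0.0
    return (threshold - d_now) / max(rate, 1e-9)   # cycles (or hours) remaining

print(remaining_useful_life(current_feature=0.6, expected_condition=0.5))
```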

  13. Iterative Refinement of Transmission Map for Stereo Image Defogging Using a Dual Camera Sensor.

    PubMed

    Kim, Heegwang; Park, Jinho; Park, Hasil; Paik, Joonki

    2017-12-09

    Recently, the stereo imaging-based image enhancement approach has attracted increasing attention in the field of video analysis. This paper presents a dual camera-based stereo image defogging algorithm. Optical flow is first estimated from the stereo foggy image pair, and the initial disparity map is generated from the estimated optical flow. Next, an initial transmission map is generated using the initial disparity map. Atmospheric light is then estimated using the color line theory. The defogged result is finally reconstructed using the estimated transmission map and atmospheric light. The proposed method can refine the transmission map iteratively. Experimental results show that the proposed method can successfully remove fog without color distortion. The proposed method can be used as a pre-processing step for an outdoor video analysis system and a high-end smartphone with a dual camera system.
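
    The reconstruction step of transmission-map defogging generally follows the atmospheric scattering model I = J·t + A·(1 − t), inverted as J = (I − A)/max(t, t0) + A. The sketch below shows only that final step with toy arrays; the optical-flow, disparity and color-line stages of the paper are not reproduced.

```python
# Sketch of the defogged-image reconstruction step from a transmission map,
# using the standard atmospheric scattering model I = J*t + A*(1 - t).
import numpy as np

def defog(image, transmission, atmospheric_light, t_min=0.1):
    """Recover scene radiance J = (I - A) / max(t, t_min) + A.

    image: HxWx3 float array in [0, 1] (the foggy input I)
    transmission: HxW float array in (0, 1]
    atmospheric_light: length-3 array (estimated A, e.g. via color-line theory)
    """
    t = np.clip(transmission, t_min, 1.0)[..., np.newaxis]
    J = (image - atmospheric_light) / t + atmospheric_light
    return np.clip(J, 0.0, 1.0)

# Toy example: a uniformly hazy image with a synthetic transmission map.
foggy = np.full((4, 4, 3), 0.7)
t_map = np.full((4, 4), 0.5)
A = np.array([0.9, 0.9, 0.9])
print(defog(foggy, t_map, A)[0, 0])   # -> [0.5 0.5 0.5] for this toy input
```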

  14. A heuristic multi-criteria classification approach incorporating data quality information for choropleth mapping

    PubMed Central

    Sun, Min; Wong, David; Kronenfeld, Barry

    2016-01-01

    Despite conceptual and technological advances in cartography over the decades, choropleth map design and classification fail to address a fundamental issue: estimates that are statistically indistinguishable may be assigned to different classes on maps, or vice versa. Recently, the class separability concept was introduced as a map classification criterion to evaluate the likelihood that estimates in two classes are statistically different. Unfortunately, choropleth maps created according to the separability criterion usually have highly unbalanced classes. To produce reasonably separable but more balanced classes, we propose a heuristic classification approach that considers not just the class separability criterion but also other classification criteria such as evenness and intra-class variability. A geovisual-analytic package was developed to support the heuristic mapping process, to evaluate the trade-off between relevant criteria, and to select the most preferable classification. Class break values can be adjusted to improve the performance of a classification. PMID:28286426
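
    To illustrate the kind of trade-off involved, the sketch below scores candidate class breaks on a simplified separability measure and an evenness measure and picks the best weighted combination; the scoring functions and weights are placeholders, not the authors' criteria.

```python
# Sketch: score candidate choropleth classifications on two criteria,
# class separability (gap between adjacent classes relative to estimate error)
# and evenness (balanced class sizes), then pick the best trade-off.
# The scoring below is a simplified placeholder, not the authors' formulation.
import itertools
import numpy as np

def classify(values, breaks):
    """Assign each value to a class given interior break points."""
    return np.digitize(values, breaks)

def separability(values, errors, breaks):
    """Smallest standardized gap between values on either side of a break."""
    gaps = []
    for b in breaks:
        below = values[values <= b]
        above = values[values > b]
        if len(below) == 0 or len(above) == 0:
            return 0.0
        gaps.append((above.min() - below.max()) / (errors.mean() + 1e-9))
    return min(gaps)

def evenness(values, breaks):
    """1 minus the spread of class proportions (1.0 = perfectly balanced)."""
    counts = np.bincount(classify(values, breaks), minlength=len(breaks) + 1)
    props = counts / counts.sum()
    return 1.0 - props.std()

values = np.sort(np.random.default_rng(1).gamma(2.0, 10.0, 50))
errors = np.full_like(values, 2.0)

candidates = itertools.combinations(values[1:-1], 2)   # all 3-class partitions
best = max(candidates,
           key=lambda b: 0.5 * separability(values, errors, list(b))
                       + 0.5 * evenness(values, list(b)))
print("chosen breaks:", best)
```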

  15. Alteration, slope-classified alteration, and potential lahar inundation maps of volcanoes for the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Volcano Archive

    USGS Publications Warehouse

    Mars, John C.; Hubbard, Bernard E.; Pieri, David; Linick, Justin

    2015-01-01

    This study was undertaken during 2012–2013 in cooperation with the National Aeronautics and Space Administration (NASA). Since completion of this study, a new lahar modeling program (LAHAR_pz) has been released, which may produce slightly different modeling results from the LAHARZ model used in this study. The maps and data from this study should not be used in place of existing volcano hazard maps published by local authorities. For volcanoes without hazard maps and (or) published lahar-related hazard studies, this work will provide a starting point from which more accurate hazard maps can be produced. This is the first dataset to provide digital maps of altered volcanoes and adjacent watersheds that can be used for assessing volcanic hazards, hydrothermal alteration, and other volcanic processes in future studies.

  16. Decreased theta power at encoding and cognitive mapping deficits in elderly individuals during a spatial memory task.

    PubMed

    Lithfous, Ségolène; Tromp, Delphine; Dufour, André; Pebayle, Thierry; Goutagny, Romain; Després, Olivier

    2015-10-01

    The purpose of this study was to investigate the role of theta activity in cognitive mapping, and to determine whether age-associated decreased theta power may account for navigational difficulties in elderly individuals. Cerebral activity was recorded using electroencephalography in young and older individuals performing a spatial memory task that required the creation of cognitive maps. Power spectra were computed in the frontal and parietal regions and correlated with recognition performance. We found that accuracy of cognitive mapping was positively correlated with left frontal theta activity during encoding in young adults but not in older individuals. Compared with young adults, older participants were impaired in the creation of cognitive maps and showed reduced theta and alpha activity at encoding. These results suggest that encoding processes are impaired in older individuals, which may explain age-related cognitive mapping deficits. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Iterative Refinement of Transmission Map for Stereo Image Defogging Using a Dual Camera Sensor

    PubMed Central

    Park, Jinho; Park, Hasil

    2017-01-01

    Recently, the stereo imaging-based image enhancement approach has attracted increasing attention in the field of video analysis. This paper presents a dual camera-based stereo image defogging algorithm. Optical flow is first estimated from the stereo foggy image pair, and the initial disparity map is generated from the estimated optical flow. Next, an initial transmission map is generated using the initial disparity map. Atmospheric light is then estimated using the color line theory. The defogged result is finally reconstructed using the estimated transmission map and atmospheric light. The proposed method can refine the transmission map iteratively. Experimental results show that the proposed method can successfully remove fog without color distortion. The proposed method can be used as a pre-processing step for an outdoor video analysis system and a high-end smartphone with a dual camera system. PMID:29232826

  18. Tectonic evaluation of the Nubian shield of Northeastern Sudan using thematic mapper imagery

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Bechtel is nearing completion of a one-year program that uses digitally enhanced LANDSAT Thematic Mapper (TM) data to compile the first comprehensive regional tectonic map of the Proterozoic Nubian Shield exposed in the northern Red Sea Hills of northeastern Sudan. The status of the significant objectives of this study is given. Pertinent published and unpublished geologic literature and maps of the northern Red Sea Hills were reviewed to establish the geologic framework of the region. Thematic Mapper imagery was processed for optimal base-map enhancements. Photo mosaics of enhanced images to serve as base maps for compilation of geologic information were completed. Interpretation of TM imagery to define and delineate structural and lithologic provinces was completed. Geologic information (petrologic and radiometric data) was compiled from the literature review onto base-map overlays. Evaluation of the tectonic evolution of the Nubian Shield based on the image interpretation and the compiled tectonic maps is continuing.

  19. Imaging spectroscopy: Earth and planetary remote sensing with the USGS Tetracorder and expert systems

    USGS Publications Warehouse

    Clark, Roger N.; Swayze, Gregg A.; Livo, K. Eric; Kokaly, Raymond F.; Sutley, Steve J.; Dalton, J. Brad; McDougal, Robert R.; Gent, Carol A.

    2003-01-01

    Imaging spectroscopy is a tool that can be used to spectrally identify and spatially map materials based on their specific chemical bonds. Spectroscopic analysis requires significantly more sophistication than has been employed in conventional broadband remote sensing analysis. We describe a new system that is effective at material identification and mapping: a set of algorithms within an expert system decision‐making framework that we call Tetracorder. The expertise in the system has been derived from scientific knowledge of spectral identification. The expert system rules are implemented in a decision tree where multiple algorithms are applied to spectral analysis, additional expert rules and algorithms can be applied based on initial results, and more decisions are made until spectral analysis is complete. Because certain spectral features are indicative of specific chemical bonds in materials, the system can accurately identify and map those materials. In this paper we describe the framework of the decision making process used for spectral identification, describe specific spectral feature analysis algorithms, and give examples of what analyses and types of maps are possible with imaging spectroscopy data. We also present the expert system rules that describe which diagnostic spectral features are used in the decision making process for a set of spectra of minerals and other common materials. We demonstrate the applications of Tetracorder to identify and map surface minerals, to detect sources of acid rock drainage, and to map vegetation species, ice, melting snow, water, and water pollution, all with one set of expert system rules. Mineral mapping can aid in geologic mapping and fault detection and can provide a better understanding of weathering, mineralization, hydrothermal alteration, and other geologic processes. Environmental site assessment, such as mapping source areas of acid mine drainage, has resulted in the acceleration of site cleanup, saving millions of dollars and years in cleanup time. Imaging spectroscopy data and Tetracorder analysis can be used to study both terrestrial and planetary science problems. Imaging spectroscopy can be used to probe planetary systems, including their atmospheres, oceans, and land surfaces.
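
    At the heart of such systems is a comparison of continuum-removed diagnostic absorption features between an observed spectrum and library reference spectra, with the best-fitting material taken as the candidate identification. The following is a heavily simplified sketch of that comparison using synthetic spectra; it is not the Tetracorder code or its expert-system rules.

```python
# Simplified sketch of spectral-feature matching: remove a linear continuum
# over a diagnostic absorption feature, then score each library material by
# how well its continuum-removed feature correlates with the observation.
# This illustrates the general approach only, not the Tetracorder rule set.
import numpy as np

def continuum_removed(wavelengths, reflectance):
    """Divide out a straight-line continuum anchored at the feature endpoints."""
    w0, w1 = wavelengths[0], wavelengths[-1]
    r0, r1 = reflectance[0], reflectance[-1]
    continuum = r0 + (r1 - r0) * (wavelengths - w0) / (w1 - w0)
    return reflectance / continuum

def fit_score(observed_cr, reference_cr):
    """Correlation between continuum-removed observed and reference features."""
    return float(np.corrcoef(observed_cr, reference_cr)[0, 1])

wl = np.linspace(2.1, 2.4, 30)                       # micrometers, feature window

# Hypothetical library: two materials with absorption features at different positions.
library = {
    "mineral_A": 1.0 - 0.3 * np.exp(-((wl - 2.20) / 0.02) ** 2),
    "mineral_B": 1.0 - 0.3 * np.exp(-((wl - 2.30) / 0.02) ** 2),
}

# A brightness-scaled, mineral_A-like observation.
observed = 0.8 * (1.0 - 0.25 * np.exp(-((wl - 2.20) / 0.02) ** 2))

obs_cr = continuum_removed(wl, observed)
scores = {name: fit_score(obs_cr, continuum_removed(wl, ref))
          for name, ref in library.items()}
print(max(scores, key=scores.get), scores)           # best match: mineral_A
```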

  20. SWIMRT: A graphical user interface using the sliding window algorithm to construct a fluence map machine file

    PubMed Central

    Chow, James C.L.; Grigorov, Grigor N.; Yazdani, Nuri

    2006-01-01

    A custom‐made computer program, SWIMRT, to construct “multileaf collimator (MLC) machine” file for intensity‐modulated radiotherapy (IMRT) fluence maps was developed using MATLAB® and the sliding window algorithm. The user can either import a fluence map with a graphical file format created by an external treatment‐planning system such as Pinnacle3 or create his or her own fluence map using the matrix editor in the program. Through comprehensive calibrations of the dose and the dimension of the imported fluence field, the user can use associated image‐processing tools such as field resizing and edge trimming to modify the imported map. When the processed fluence map is suitable, a “MLC machine” file is generated for our Varian 21 EX linear accelerator with a 120‐leaf Millennium MLC. This machine file is transferred to the MLC console of the LINAC to control the continuous motions of the leaves during beam irradiation. An IMRT field is then irradiated with the 2D intensity profiles, and the irradiated profiles are compared to the imported or modified fluence map. This program was verified and tested using film dosimetry to address the following uncertainties: (1) the mechanical limitation due to the leaf width and maximum traveling speed, and (2) the dosimetric limitation due to the leaf leakage/transmission and penumbra effect. Because the fluence map can be edited, resized, and processed according to the requirement of a study, SWIMRT is essential in studying and investigating the IMRT technique using the sliding window algorithm. Using this program, future work on the algorithm may include redistributing the time space between segmental fields to enhance the fluence resolution, and readjusting the timing of each leaf during delivery to avoid small fields. Possible clinical utilities and examples for SWIMRT are given in this paper. PACS numbers: 87.53.Kn, 87.53.St, 87.53.Uv PMID:17533330
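
    In the sliding window technique, each row of the fluence map is converted into synchronized opening (leading-leaf) and closing (trailing-leaf) times so that the exposure at each position equals the desired fluence. The one-dimensional sketch below shows that conversion while ignoring leaf-speed limits, leakage and penumbra; it illustrates the general algorithm, not the SWIMRT implementation.

```python
# One-dimensional sketch of the sliding window idea: decompose a fluence
# profile f(x) into leading/trailing leaf "opening times" so that
# fluence = trailing_time(x) - leading_time(x) at each position.
# Machine constraints (max leaf speed, leakage, penumbra) are ignored here.

def sliding_window_trajectories(fluence):
    """Return (leading, trailing) cumulative opening times per position."""
    leading = [0.0]                  # time the leading leaf uncovers position i
    trailing = [float(fluence[0])]   # time the trailing leaf covers it again
    for i in range(1, len(fluence)):
        rise = max(fluence[i] - fluence[i - 1], 0.0)
        fall = max(fluence[i - 1] - fluence[i], 0.0)
        # Where fluence rises, the trailing leaf arrives later (longer exposure);
        # where it falls, the leading leaf uncovers the position later.
        leading.append(leading[-1] + fall)
        trailing.append(trailing[-1] + rise)
    return leading, trailing

profile = [1.0, 3.0, 2.0, 4.0, 1.0]          # desired fluence along one leaf pair
lead, trail = sliding_window_trajectories(profile)
delivered = [t - l for l, t in zip(lead, trail)]
print(delivered)                              # reproduces the input profile
```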

  1. Soil mapping and process modeling for sustainable land use management: a brief historical review

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Pereira, Paulo; Muñoz-Rojas, Miriam; Miller, Bradley A.; Cerdà, Artemi; Parras-Alcántara, Luis; Lozano-García, Beatriz

    2017-04-01

    Basic soil management goes back to the earliest days of agricultural practices, approximately 9,000 BCE. Through time humans developed soil management techniques of ever increasing complexity, including plows, contour tillage, terracing, and irrigation. Spatial soil patterns were being recognized as early as 3,000 BCE, but the first soil maps did not appear until the 1700s and the first soil models finally arrived in the 1880s (Brevik et al., in press). The beginning of the 20th century saw an increase in standardization in many soil science methods and widespread soil mapping in many parts of the world, particularly in developed countries. However, the classification systems used, mapping scale, and national coverage varied considerably from country to country. Major advances were made in pedologic modeling starting in the 1940s, and in erosion modeling starting in the 1950s. In the 1970s and 1980s, advances in computing power, remote and proximal sensing, geographic information systems (GIS), global positioning systems (GPS), and statistics and spatial statistics, among other numerical techniques, significantly enhanced our ability to map and model soils (Brevik et al., 2016). These types of advances positioned soil science to make meaningful contributions to sustainable land use management as we moved into the 21st century. References Brevik, E., Pereira, P., Muñoz-Rojas, M., Miller, B., Cerda, A., Parras-Alcantara, L., Lozano-Garcia, B. Historical perspectives on soil mapping and process modelling for sustainable land use management. In: Pereira, P., Brevik, E., Muñoz-Rojas, M., Miller, B. (eds) Soil mapping and process modelling for sustainable land use management (in press). Brevik, E., Calzolari, C., Miller, B., Pereira, P., Kabala, C., Baumgarten, A., Jordán, A. 2016. Historical perspectives and future needs in soil mapping, classification and pedological modelling, Geoderma, 264, Part B, 256-274.

  2. A global interaction network maps a wiring diagram of cellular function

    PubMed Central

    Costanzo, Michael; VanderSluis, Benjamin; Koch, Elizabeth N.; Baryshnikova, Anastasia; Pons, Carles; Tan, Guihong; Wang, Wen; Usaj, Matej; Hanchard, Julia; Lee, Susan D.; Pelechano, Vicent; Styles, Erin B.; Billmann, Maximilian; van Leeuwen, Jolanda; van Dyk, Nydia; Lin, Zhen-Yuan; Kuzmin, Elena; Nelson, Justin; Piotrowski, Jeff S.; Srikumar, Tharan; Bahr, Sondra; Chen, Yiqun; Deshpande, Raamesh; Kurat, Christoph F.; Li, Sheena C.; Li, Zhijian; Usaj, Mojca Mattiazzi; Okada, Hiroki; Pascoe, Natasha; Luis, Bryan-Joseph San; Sharifpoor, Sara; Shuteriqi, Emira; Simpkins, Scott W.; Snider, Jamie; Suresh, Harsha Garadi; Tan, Yizhao; Zhu, Hongwei; Malod-Dognin, Noel; Janjic, Vuk; Przulj, Natasa; Troyanskaya, Olga G.; Stagljar, Igor; Xia, Tian; Ohya, Yoshikazu; Gingras, Anne-Claude; Raught, Brian; Boutros, Michael; Steinmetz, Lars M.; Moore, Claire L.; Rosebrock, Adam P.; Caudy, Amy A.; Myers, Chad L.; Andrews, Brenda; Boone, Charles

    2017-01-01

    We generated a global genetic interaction network for Saccharomyces cerevisiae, constructing over 23 million double mutants, identifying ~550,000 negative and ~350,000 positive genetic interactions. This comprehensive network maps genetic interactions for essential gene pairs, highlighting essential genes as densely connected hubs. Genetic interaction profiles enabled assembly of a hierarchical model of cell function, including modules corresponding to protein complexes and pathways, biological processes, and cellular compartments. Negative interactions connected functionally related genes, mapped core bioprocesses, and identified pleiotropic genes, whereas positive interactions often mapped general regulatory connections among gene pairs, rather than shared functionality. The global network illustrates how coherent sets of genetic interactions connect protein complex and pathway modules to map a functional wiring diagram of the cell. PMID:27708008

  3. Mass Movement Susceptibility Mapping Using Satellite Optical Imagery Compared With INSAR Monitoring: Zigui County, Three Gorges Region, China

    NASA Astrophysics Data System (ADS)

    Kincal, Cem; Singleton, Andrew; Liu, Peng; Li, Zhenhong; Drummond, Jane; Hoey, Trevor; Muller, Jan-Peter; Qu, Wei; Zeng, Qiming; Zhang, Jingfa; Du, Peijun

    2010-10-01

    Mass movements on steep slopes are a major hazard to communities and infrastructure in the Three Gorges region, China. Developing susceptibility maps of mass movements is therefore very important in both current and future land use planning. This study employed satellite optical imagery and an ASTER GDEM (15 m) to derive various parameters (namely geology, slope gradient, proximity to drainage networks, and proximity to lineaments) in order to create a GIS-based map of mass movement susceptibility. This map was then evaluated using highly accurate deformation signals processed using the Persistent Scatterer (PS) InSAR technique. Areas of high susceptibility correspond well to points of high subsidence, which provides strong support for our susceptibility map.

  4. Landmarks selection in street map design

    NASA Astrophysics Data System (ADS)

    Kao, C. J.

    2014-02-01

    In Taiwan, many electronic maps present their landmarks according to the category of the feature; a designer lacking knowledge of the mental representation of space can cause the map to lose its communicative effect. To address this map design problem, in this research 111 participants were asked to select appropriate landmarks from the study area through long-term memory recall, navigation and observation, and short-term memory processing. The results reveal that in Taiwan convenience stores are the most popular local landmark in both rural and urban areas. Their commercial signs have a unique design and bright colors; contrasted with their background, this makes the convenience store a salient feature. This study also developed a rule to assess the priority of landmarks for designing maps at different scales.

  5. Surface topography of the Greenland Ice Sheet from satellite radar altimetry

    NASA Technical Reports Server (NTRS)

    Bindschadler, Robert A.; Zwally, H. Jay; Major, Judith A.; Brenner, Anita C.

    1989-01-01

    Surface elevation maps of the southern half of the Greenland subcontinent are produced from radar altimeter data acquired by the Seasat satellite. A summary of the processing procedure and examples of return waveform data are given. The elevation data are used to generate a regular grid which is then computer contoured to provide an elevation contour map. Ancillary maps show the statistical quality of the elevation data and various characteristics of the surface. The elevation map is used to define ice flow directions and delineate the major drainage basins. Regular maps of the Jakobshavns Glacier drainage basin and the ice divide in the vicinity of Crete Station are presented. Altimeter derived elevations are compared with elevations measured both by satellite geoceivers and optical surveying.

  6. Internet protocol network mapper

    DOEpatents

    Youd, David W.; Colon III, Domingo R.; Seidl, Edward T.

    2016-02-23

    A network mapper for performing tasks on targets is provided. The mapper generates a map of a network that specifies the overall configuration of the network. The mapper inputs a procedure that defines how the network is to be mapped. The procedure specifies what, when, and in what order the tasks are to be performed. Each task specifies processing that is to be performed for a target to produce results. The procedure may also specify input parameters for a task. The mapper inputs initial targets that specify a range of network addresses to be mapped. The mapper maps the network by, for each target, executing the procedure to perform the tasks on the target. The results of the tasks represent the mapping of the network defined by the initial targets.
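
    The structure described, an ordered procedure of tasks executed against each target to build up a map, can be sketched as follows; the task names, stubbed results and addresses are hypothetical, and no real probing is performed.

```python
# Sketch of a procedure-driven network mapper: a procedure is an ordered list
# of tasks; each task is run against every target and contributes its results
# to the overall map. Task implementations here are hypothetical stubs.

def ping_task(target):
    """Hypothetical reachability check (stubbed: always 'up')."""
    return {"reachable": True}

def port_scan_task(target):
    """Hypothetical port enumeration (stubbed with fixed results)."""
    return {"open_ports": [22, 80]}

def run_procedure(procedure, targets):
    """Execute each task, in order, on every target and collect a network map."""
    network_map = {}
    for target in targets:
        target_results = {}
        for task_name, task_fn in procedure:
            target_results[task_name] = task_fn(target)
        network_map[target] = target_results
    return network_map

procedure = [("ping", ping_task), ("ports", port_scan_task)]
targets = ["10.0.0.1", "10.0.0.2"]          # hypothetical address range
print(run_procedure(procedure, targets))
```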

  7. Comparing orbiter and rover image-based mapping of an ancient sedimentary environment, Aeolis Palus, Gale crater, Mars

    NASA Astrophysics Data System (ADS)

    Stack, K. M.; Edwards, C. S.; Grotzinger, J. P.; Gupta, S.; Sumner, D. Y.; Calef, F. J.; Edgar, L. A.; Edgett, K. S.; Fraeman, A. A.; Jacob, S. R.; Le Deit, L.; Lewis, K. W.; Rice, M. S.; Rubin, D.; Williams, R. M. E.; Williford, K. H.

    2016-12-01

    This study provides the first systematic comparison of orbital facies maps with detailed ground-based geology observations from the Mars Science Laboratory (MSL) Curiosity rover to examine the validity of geologic interpretations derived from orbital image data. Orbital facies maps were constructed for the Darwin, Cooperstown, and Kimberley waypoints visited by the Curiosity rover using High Resolution Imaging Science Experiment (HiRISE) images. These maps, which represent the most detailed orbital analysis of these areas to date, were compared with rover image-based geologic maps and stratigraphic columns derived from Curiosity's Mast Camera (Mastcam) and Mars Hand Lens Imager (MAHLI). Results show that bedrock outcrops can generally be distinguished from unconsolidated surficial deposits in high-resolution orbital images and that orbital facies mapping can be used to recognize geologic contacts between well-exposed bedrock units. However, process-based interpretations derived from orbital image mapping are difficult to infer without known regional context or observable paleogeomorphic indicators, and layer-cake models of stratigraphy derived from orbital maps oversimplify depositional relationships as revealed from a rover perspective. This study also shows that fine-scale orbital image-based mapping of current and future Mars landing sites is essential for optimizing the efficiency and science return of rover surface operations.

  8. Ubiquitous Creation of Bas-Relief Surfaces with Depth-of-Field Effects Using Smartphones.

    PubMed

    Sohn, Bong-Soo

    2017-03-11

    This paper describes a new method to automatically generate digital bas-reliefs with depth-of-field effects from general scenes. Most previous methods for bas-relief generation take input in the form of 3D models. However, obtaining 3D models of real scenes or objects is often difficult, inaccurate, and time-consuming. From this motivation, we developed a method that takes as input a set of photographs that can be quickly and ubiquitously captured by ordinary smartphone cameras. A depth map is computed from the input photographs. The value range of the depth map is compressed and used as a base map representing the overall shape of the bas-relief. However, the resulting base map contains little information on details of the scene. Thus, we construct a detail map using pixel values of the input image to express the details. The base and detail maps are blended to generate a new depth map that reflects both overall depth and scene detail information. This map is selectively blurred to simulate the depth-of-field effects. The final depth map is converted to a bas-relief surface mesh. Experimental results show that our method generates a realistic bas-relief surface of general scenes with no expensive manual processing.
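
    The pipeline described, depth-range compression into a base map, blending with an image-derived detail map, and selective blurring for depth of field, can be roughly sketched with array operations; the weights and the focal-depth rule below are hypothetical, and the depth-from-photographs and mesh-generation steps are omitted.

```python
# Rough sketch of the bas-relief depth-map pipeline described above:
# compress the depth range (base map), add image-derived detail, and
# blur regions far from a chosen focal depth for a depth-of-field effect.
# Weights and the focal-depth rule are hypothetical placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter

def bas_relief_depth(depth, image_gray, compression=0.2, detail_weight=0.05,
                     focal_depth=0.5, blur_sigma=2.0):
    # Base map: depth normalized to [0, 1] and compressed to flatten the relief.
    d = (depth - depth.min()) / (np.ptp(depth) + 1e-9)
    base = d * compression

    # Detail map: high-frequency image content (pixel value minus local mean).
    detail = image_gray - gaussian_filter(image_gray, sigma=3.0)

    blended = base + detail_weight * detail

    # Depth-of-field: blur the relief where depth is far from the focal plane.
    blurred = gaussian_filter(blended, sigma=blur_sigma)
    out_of_focus = np.abs(d - focal_depth)          # 0 = in focus, larger = farther
    return (1.0 - out_of_focus) * blended + out_of_focus * blurred

depth = np.random.default_rng(2).random((64, 64))
gray = np.random.default_rng(3).random((64, 64))
relief = bas_relief_depth(depth, gray)
print(relief.shape, float(relief.min()), float(relief.max()))
```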

  9. Rapid crop cover mapping for the conterminous United States

    USGS Publications Warehouse

    Dahal, Devendra; Wylie, Bruce K.; Howard, Daniel

    2018-01-01

    Timely crop cover maps with sufficient resolution are important components to various environmental planning and research applications. Through the modification and use of a previously developed crop classification model (CCM), which was originally developed to generate historical annual crop cover maps, we hypothesized that such crop cover maps could be generated rapidly during the growing season. Through a process of incrementally removing weekly and monthly independent variables from the CCM and implementing a ‘two model mapping’ approach, we found it viable to generate conterminous United States-wide rapid crop cover maps at a resolution of 250 m for the current year by the month of September. In this approach, we divided the CCM model into one ‘crop type model’ to handle the classification of nine specific crops and a second, binary model to classify the presence or absence of ‘other’ crops. Under the two model mapping approach, the training errors were 0.8% and 1.5% for the crop type and binary model, respectively, while test errors were 5.5% and 6.4%, respectively. With spatial mapping accuracies for annual maps reaching upwards of 70%, this approach demonstrated a strong potential for generating rapid crop cover maps by the 1st of September.
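
    The 'two model mapping' idea can be sketched as a pair of classifiers, a multi-class model for the nine specific crops and a binary model for the presence of 'other' crops, combined per pixel. The features, labels and model choice below are synthetic stand-ins, not the CCM's actual predictors.

```python
# Sketch of a "two model mapping" approach: a multi-class crop-type model for
# specific crops plus a binary model for presence/absence of "other" crops,
# combined per pixel. Features and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 6))                   # e.g. weekly/monthly reflectance metrics

crop_type = rng.integers(0, 9, size=n)        # labels 0..8 = nine specific crops
other_crop = rng.integers(0, 2, size=n)       # 1 = "other" crop present

type_model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, crop_type)
other_model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, other_crop)

def classify_pixels(features):
    """Label pixels with a specific crop unless the binary model says 'other'."""
    specific = type_model.predict(features)
    is_other = other_model.predict(features).astype(bool)
    labels = specific.astype(object)
    labels[is_other] = "other"
    return labels

print(classify_pixels(X[:5]))
```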

  10. Ubiquitous Creation of Bas-Relief Surfaces with Depth-of-Field Effects Using Smartphones

    PubMed Central

    Sohn, Bong-Soo

    2017-01-01

    This paper describes a new method to automatically generate digital bas-reliefs with depth-of-field effects from general scenes. Most previous methods for bas-relief generation take input in the form of 3D models. However, obtaining 3D models of real scenes or objects is often difficult, inaccurate, and time-consuming. From this motivation, we developed a method that takes as input a set of photographs that can be quickly and ubiquitously captured by ordinary smartphone cameras. A depth map is computed from the input photographs. The value range of the depth map is compressed and used as a base map representing the overall shape of the bas-relief. However, the resulting base map contains little information on details of the scene. Thus, we construct a detail map using pixel values of the input image to express the details. The base and detail maps are blended to generate a new depth map that reflects both overall depth and scene detail information. This map is selectively blurred to simulate the depth-of-field effects. The final depth map is converted to a bas-relief surface mesh. Experimental results show that our method generates a realistic bas-relief surface of general scenes with no expensive manual processing. PMID:28287487

  11. One perspective on spatial variability in geologic mapping

    USGS Publications Warehouse

    Markewich, H.W.; Cooper, S.C.

    1991-01-01

    This paper discusses some of the differences between geologic mapping and soil mapping, and how the resultant maps are interpreted. The role of spatial variability in geologic mapping is addressed only indirectly because in geologic mapping there have been few attempts at quantification of spatial differences. This is largely because geologic maps deal with temporal as well as spatial variability and consider time, age, and origin, as well as composition and geometry. Both soil scientists and geologists use spatial variability to delineate mappable units; however, the classification systems from which these mappable units are defined differ greatly. Mappable soil units are derived from systematic, well-defined, highly structured sets of taxonomic criteria, whereas mappable geologic units are based on a more arbitrary hierarchy of categories that integrate many features without strict values or definitions. Soil taxonomy is a sorting tool used to reduce heterogeneity between soil units. Thus at the series level, soils in any one series are relatively homogeneous because their range of properties is small and well-defined. Soil maps show the distribution of soils on the land surface. Within a map area, soils, which are often less than 2 m thick, show a direct correlation to topography and to active surface processes as well as to parent material.

  12. Comparing orbiter and rover image-based mapping of an ancient sedimentary environment, Aeolis Palus, Gale crater, Mars

    USGS Publications Warehouse

    Stack, Kathryn M.; Edwards, Christopher; Grotzinger, J. P.; Gupta, S.; Sumner, D.; Edgar, Lauren; Fraeman, A.; Jacob, S.; LeDeit, L.; Lewis, K.W.; Rice, M.S.; Rubin, D.; Calef, F.; Edgett, K.; Williams, R.M.E.; Williford, K.H.

    2016-01-01

    This study provides the first systematic comparison of orbital facies maps with detailed ground-based geology observations from the Mars Science Laboratory (MSL) Curiosity rover to examine the validity of geologic interpretations derived from orbital image data. Orbital facies maps were constructed for the Darwin, Cooperstown, and Kimberley waypoints visited by the Curiosity rover using High Resolution Imaging Science Experiment (HiRISE) images. These maps, which represent the most detailed orbital analysis of these areas to date, were compared with rover image-based geologic maps and stratigraphic columns derived from Curiosity’s Mast Camera (Mastcam) and Mars Hand Lens Imager (MAHLI). Results show that bedrock outcrops can generally be distinguished from unconsolidated surficial deposits in high-resolution orbital images and that orbital facies mapping can be used to recognize geologic contacts between well-exposed bedrock units. However, process-based interpretations derived from orbital image mapping are difficult to infer without known regional context or observable paleogeomorphic indicators, and layer-cake models of stratigraphy derived from orbital maps oversimplify depositional relationships as revealed from a rover perspective. This study also shows that fine-scale orbital image-based mapping of current and future Mars landing sites is essential for optimizing the efficiency and science return of rover surface operations.

  13. Linear programming model to develop geodiversity map using utility theory

    NASA Astrophysics Data System (ADS)

    Sepehr, Adel

    2015-04-01

    In this article, the classification and mapping of geodiversity based on a quantitative methodology were accomplished using linear programming, the central idea being that geosites and geomorphosites, as main indicators of geodiversity, can be evaluated by utility theory. A linear programming method was applied for geodiversity mapping over Khorasan-Razavi province, located in northeastern Iran. The main criteria for distinguishing geodiversity potential in the studied area were rock type (lithology), fault positions (tectonic processes), karst areas (dynamic processes), the frequency of Aeolian landforms, and surface river forms. These parameters were investigated using thematic maps, including geology, topography and geomorphology at scales of 1:100,000, 1:50,000 and 1:250,000, imagery data from SPOT and ETM+ (Landsat 7), and direct field operations. The geological thematic layer was simplified from the original map using a practical lithologic criterion based on a primary genetic rock classification representing metamorphic, igneous and sedimentary rocks. The geomorphology map was produced using a 30 m DEM extracted from ASTER data, the geology map and Google Earth images. The geology map shows tectonic status, and the geomorphology map indicates dynamic processes and landforms (karst, Aeolian and river). Then, following utility theory algorithms, we proposed a linear program to classify the degree of geodiversity in the studied area based on geology/morphology parameters. The algorithm consisted of a linear function maximizing geodiversity subject to constraints in the form of linear equations. The results of this research indicated three classes of geodiversity potential: low, medium and high. The geodiversity potential shows favourable conditions in the karstic areas and the Aeolian landscape. The utility theory used in the research also decreased the uncertainty of the evaluations.
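
    As a rough illustration of posing such a problem for a standard LP solver, the sketch below maximizes a utility-weighted combination of geodiversity criteria subject to linear constraints; the weights, bounds and constraints are hypothetical, not the study's formulation.

```python
# Sketch of a linear program that maximizes a utility-weighted geodiversity
# score subject to linear constraints. Weights, bounds and constraints are
# hypothetical placeholders, not the formulation used in the study.
from scipy.optimize import linprog

# Decision variables: contribution of each criterion to the geodiversity score
# x = [lithology, tectonics (faults), karst, aeolian, river forms]
utility = [0.30, 0.20, 0.25, 0.15, 0.10]      # hypothetical utility weights

# linprog minimizes, so negate the utilities to maximize the weighted score.
c = [-u for u in utility]

# Hypothetical constraints: total normalized contribution <= 1, and the
# dynamic-process criteria (karst + aeolian + river) capped at 0.6.
A_ub = [[1, 1, 1, 1, 1],
        [0, 0, 1, 1, 1]]
b_ub = [1.0, 0.6]
bounds = [(0.0, 0.4)] * 5                      # each criterion capped at 0.4

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(result.x, -result.fun)                   # optimal mix and geodiversity score
```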

  14. Use of landsat ETM+ SLC-off segment-based gap-filled imagery for crop type mapping

    USGS Publications Warehouse

    Maxwell, S.K.; Craig, M.E.

    2008-01-01

    Failure of the Scan Line Corrector (SLC) on the Landsat ETM+ sensor has had a major impact on many applications that rely on continuous medium resolution imagery to meet their objectives. The United States Department of Agriculture (USDA) Cropland Data Layer (CDL) program uses Landsat imagery as the primary source of data to produce crop-specific maps for 20 states in the USA. A new method has been developed to fill the image gaps resulting from the SLC failure to support the needs of Landsat users who require coincident spectral data, such as for crop type mapping and monitoring. We tested the new gap-filled method for a CDL crop type mapping project in eastern Nebraska. Scan line gaps were simulated on two Landsat 5 images (spring and late summer 2003) and then gap-filled using landscape boundary models, or segment models, that were derived from 1992 and 2002 Landsat images (used in the gap-fill process). Various date combinations of original and gap-filled images were used to derive crop maps using a supervised classification process. Overall kappa values were slightly higher for crop maps derived from SLC-off gap-filled images compared to crop maps derived from the original imagery (0.3–1.3% higher). Although the age of the segment model used to derive the SLC-off gap-filled product did not negatively impact the overall agreement, differences in individual cover type agreement did increase (−0.8%–1.6% using the 2002 segment model to −5.0–5.1% using the 1992 segment model). Classification agreement also decreased for most of the classes as the size of the segment used in the gap-fill process increased.

  15. Lactic acid bacteria associated with a heat-processed pork product and sources of variation affecting chemical indices of spoilage and sensory characteristics.

    PubMed

    Laursen, B G; Byrne, D V; Kirkegaard, J B; Leisner, J J

    2009-02-01

    To evaluate the potential for developing a quality index for a Danish modified atmosphere packaged (MAP) heat-processed and naturally contaminated pork meat product stored at 5 degrees C. The composition of the predominating microflora and changes in contents of tyramine, arginine, organic acids and sensory characteristics were analysed. The microflora was predominated by Lactobacillus sakei, Leuconostoc carnosum and Carnobacterium divergens. The presence of each species varied between products and batches resulting in limited usefulness of the concentrations of these bacteria or their metabolites as indices of quality. Furthermore, the three species differed in their metabolic activities as shown by use of a model meat extract. However, when MAP storage of the processed pork product was followed by aerobic storage then acetic acid showed some potential as a chemical indicator of sensory quality. Variation in processing parameters and spoilage microbiota limited the usefulness of concentrations of micro-organisms and their metabolites as indices of spoilage for the studied processed MAP pork product. The present study contributes to an understanding of the difficulties experienced in developing quality indices to be used in the control of microbial spoilage of processed MAP meat products.

  16. Map of Pluto Surface

    NASA Image and Video Library

    1998-03-28

    This image-based surface map of Pluto was assembled by computer image processing software from four separate images of Pluto's disk taken with the European Space Agency's Faint Object Camera aboard NASA's Hubble Space Telescope.

  17. Electronic atlas of the Russian Arctic coastal zone: natural conditions and technogenic risk

    NASA Astrophysics Data System (ADS)

    Drozdov, D. S.; Rivkin, F. M.; Rachold, V.

    2004-12-01

    The Arctic coast is characterized by a diversity of geological-geomorphological structures and geocryological conditions, which are expected to respond differently to changes in the natural environment and in anthropogenic impacts. At present, oil fields are prospected and developed and permanent and temporary ports are constructed in the Arctic regions of Russia. Thus, profound understanding of the processes involved and measures of nature conservation for the coastal zone of the Arctic Seas are required. One of the main fields of Arctic coastal investigation, and of database formation on coastal conditions, is the mapping of the coasts. This poster presents a set of digital maps including geology, quaternary sediments, landscapes, engineering-geology, vegetation, geocryology and a series of regional sources, which have been selected to characterize the Russian Arctic coast. The area covered in this work includes the 200-km-wide band along the entire Russian Arctic coast from the Norwegian boundary in the west to the Bering Strait in the east. Methods included the collection of the majority of available hard copies of cartographic material and their digital formats and the transformation of these sources into a uniform digital graphic format. The atlas consists of environmental maps and maps of engineering-geological zoning. The set of environmental maps includes geology, quaternary sediments, landscapes and vegetation of the Russian Arctic coast at a scale of 1:4,000,000. The set of engineering-geocryological maps includes a map of engineering-geocryological zoning of the Russian Arctic coast, a map of the intensity of destructive coastal processes and a map of industrial impact risk assessment (1:8,000,000 scale). Detailed mapping has been performed for key sites (at a scale of 1:100,000) in order to enable more precise estimates of the intensity of destructive coastal processes and industrial impact. The engineering-geocryological map of the Russian Arctic coast was compiled based on the analysis of geotechnical and geocryological conditions in the areas adjacent to the coastal band. Industrial impact has been assessed differently for each engineering-geocryological region distinguished on the coast, considering technological features of construction and engineering facilities: aerial construction, highways and airdromes, underground (with positive and negative pipe temperatures) and surface pipelines, and quarries. The atlas is being used as a base for the circum-Arctic segmentation of the coastline and the analyses of coastal dynamics within the Arctic Coastal Dynamics (ACD) Project. The work has been supported by INTAS (project number 01-2332).

  18. Geologic Mapping of Vesta

    NASA Technical Reports Server (NTRS)

    Yingst, R. A.; Mest, S. C.; Berman, D. C.; Garry, W. B.; Williams, D. A.; Buczkowski, D.; Jaumann, R.; Pieters, C. M.; De Sanctis, M. C.; Frigeri, A.

    2014-01-01

    We report on a preliminary global geologic map of Vesta, based on data from the Dawn spacecraft's High- Altitude Mapping Orbit (HAMO) and informed by Low-Altitude Mapping Orbit (LAMO) data. This map is part of an iterative mapping effort; the geologic map has been refined with each improvement in resolution. Vesta has a heavily-cratered surface, with large craters evident in numerous locations. The south pole is dominated by an impact structure identified before Dawn's arrival. Two large impact structures have been resolved: the younger, larger Rheasilvia structure, and the older, more degraded Veneneia structure. The surface is also characterized by a system of deep, globe-girdling equatorial troughs and ridges, as well as an older system of troughs and ridges to the north. Troughs and ridges are also evident cutting across, and spiraling arcuately from, the Rheasilvia central mound. However, no volcanic features have been unequivocally identified. Vesta can be divided very broadly into three terrains: heavily-cratered terrain; ridge-and-trough terrain (equatorial and northern); and terrain associated with the Rheasilvia crater. Localized features include bright and dark material and ejecta (some defined specifically by color); lobate deposits; and mass-wasting materials. No obvious volcanic features are evident. Stratigraphy of Vesta's geologic units suggests a history in which formation of a primary crust was followed by the formation of impact craters, including Veneneia and the associated Saturnalia Fossae unit. Formation of Rheasilvia followed, along with associated structural deformation that shaped the Divalia Fossae ridge-and-trough unit at the equator. Subsequent impacts and mass wasting events subdued impact craters, rims and portions of ridge-and-trough sets, and formed slumps and landslides, especially within crater floors and along crater rims and scarps. Subsequent to the formation of Rheasilvia, discontinuous low-albedo deposits formed or were emplaced; these lie stratigraphically above the equatorial ridges that likely were formed by Rheasilvia. The last features to be formed were craters with bright rays and other surface mantling deposits. Executed progressively throughout data acquisition, the iterative mapping process provided the team with geologic proto-units in a timely manner. However, interpretation of the resulting map was hampered by the necessity to provide the team with a standard nomenclature and symbology early in the process. With regard to mapping and interpreting units, the mapping process was hindered by the lack of calibrated mineralogic information. Topography and shadow played an important role in discriminating features and terrains, especially in the early stages of data acquisition.

  19. Methods for landslide susceptibility modelling in Lower Austria

    NASA Astrophysics Data System (ADS)

    Bell, Rainer; Petschko, Helene; Glade, Thomas; Leopold, Philip; Heiss, Gerhard; Proske, Herwig; Granica, Klaus; Schweigl, Joachim; Pomaroli, Gilbert

    2010-05-01

    Landslide susceptibility modelling and implementation of the resulting maps is still a challenge for geoscientists, spatial and infrastructure planners. Particularly on a regional scale landslide processes and their dynamics are poorly understood. Furthermore, the availability of appropriate spatial data in high resolution is often a limiting factor for modelling high quality landslide susceptibility maps for large study areas. However, these maps form an important basis for preventive spatial planning measures. Thus, new methods have to be developed, especially focussing on the implementation of final maps into spatial planning processes. The main objective of the project "MoNOE" (Method development for landslide susceptibility modelling in Lower Austria) is to design a method for landslide susceptibility modelling for a large study area (about 10.200 km²) and to produce landslide susceptibility maps which are finally implemented in the spatial planning strategies of the Federal state of Lower Austria. The project focuses primarily on the landslide types fall and slide. To enable susceptibility modelling, landslide inventories for the respective landslide types must be compiled and relevant data has to be gathered, prepared and homogenized. Based on this data new methods must be developed to tackle the needs of the spatial planning strategies. Considerable efforts will also be spent on the validation of the resulting maps for each landslide type. A great challenge will be the combination of the susceptibility maps for slides and falls in just one single susceptibility map (which is requested by the government) and the definition of the final visualisation. Since numerous landslides have been favoured or even triggered by human impact, the human influence on landslides will also have to be investigated. Furthermore possibilities to integrate respective findings in regional susceptibility modelling will be explored. According to these objectives the project is structured in four work packages namely data preparation and homogenization (WP1), susceptibility modelling and validation (WP2), integrative susceptibility assessment (WP3) and human impact (WP4). The expected results are a landslide inventory map covering all endangered parts of the Federal state of Lower Austria, a land cover map of Lower Austria with high spatial resolution, processed spatial input data and an optimized integrative susceptibility map visualized at a scale of 1:25.000. The structure of the research project, research strategies as well as first results will be presented at the conference. The project is funded by the Federal state government of Lower Austria.

  20. Using a detailed uncertainty analysis to adjust mapped rates of forest disturbance derived from Landsat time series data (Invited)

    NASA Astrophysics Data System (ADS)

    Cohen, W. B.; Yang, Z.; Stehman, S.; Huang, C.; Healey, S. P.

    2013-12-01

    Forest ecosystem process models require spatially and temporally detailed disturbance data to accurately predict fluxes of carbon or changes in biodiversity over time. A variety of new mapping algorithms using dense Landsat time series show great promise for providing disturbance characterizations at an annual time step. These algorithms provide unprecedented detail with respect to timing, magnitude, and duration of individual disturbance events, and causal agent. But all maps have error and disturbance maps in particular can have significant omission error because many disturbances are relatively subtle. Because disturbance, although ubiquitous, can be a relatively rare event spatially in any given year, omission errors can have a great impact on mapped rates. Using a high quality reference disturbance dataset, it is possible to not only characterize map errors but also to adjust mapped disturbance rates to provide unbiased rate estimates with confidence intervals. We present results from a national-level disturbance mapping project (the North American Forest Dynamics project) based on the Vegetation Change Tracker (VCT) with annual Landsat time series and uncertainty analyses that consist of three basic components: response design, statistical design, and analyses. The response design describes the reference data collection, in terms of the tool used (TimeSync), a formal description of interpretations, and the approach for data collection. The statistical design defines the selection of plot samples to be interpreted, whether stratification is used, and the sample size. Analyses involve derivation of standard agreement matrices between the map and the reference data, and use of inclusion probabilities and post-stratification to adjust mapped disturbance rates. Because for NAFD we use annual time series, both mapped and adjusted rates are provided at an annual time step from ~1985-present. Preliminary evaluations indicate that VCT captures most of the higher intensity disturbances, but that many of the lower intensity disturbances (thinnings, stress related to insects and disease, etc.) are missed. Because lower intensity disturbances are a large proportion of the total set of disturbances, adjusting mapped disturbance rates to include these can be important for inclusion in ecosystem process models. The described statistical disturbance rate adjustments are aspatial in nature, such that the basic underlying map is unchanged. For spatially explicit ecosystem modeling, such adjustments, although important, can be difficult to directly incorporate. One approach for improving the basic underlying map is an ensemble modeling approach that uses several different complementary maps, each derived from a different algorithm and having their own strengths and weaknesses relative to disturbance magnitude and causal agent of disturbance. We will present results from a pilot study associated with the Landscape Change Monitoring System (LCMS), an emerging national-level program that builds upon NAFD and the well-established Monitoring Trends in Burn Severity (MTBS) program.
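
    A common way to adjust a mapped rate with reference data is to treat map classes as strata and combine the error matrix with the map class proportions into a post-stratified estimate and confidence interval. The sketch below walks through that calculation with illustrative counts; the numbers are not NAFD results.

```python
# Sketch of adjusting a mapped disturbance rate with reference data:
# map classes act as strata, and the error matrix (expressed as area
# proportions) yields an unbiased estimate of the true disturbance rate
# plus a standard error. The counts below are illustrative only.
import numpy as np

# Map class proportions (weights): e.g. 5% mapped disturbed, 95% undisturbed.
W = np.array([0.05, 0.95])

# Reference interpretations (e.g. TimeSync plots) sampled within each map class:
# rows = map class, columns = reference class [disturbed, undisturbed].
counts = np.array([[90, 10],     # plots mapped disturbed
                   [30, 470]])   # plots mapped undisturbed (omission errors here)

n_per_stratum = counts.sum(axis=1)
p = counts / n_per_stratum[:, None]            # within-stratum reference proportions

# Post-stratified (area-adjusted) estimate of the true disturbed proportion.
adjusted_rate = float((W * p[:, 0]).sum())

# Standard error of the stratified estimator, then a 95% confidence interval.
var = ((W**2) * p[:, 0] * (1 - p[:, 0]) / (n_per_stratum - 1)).sum()
ci95 = 1.96 * np.sqrt(var)

print(f"mapped rate: {W[0]:.3f}")
print(f"adjusted rate: {adjusted_rate:.3f} +/- {ci95:.3f}")
```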
