Sample records for tools mapping targeted

  1. Grid Visualization Tool

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steven

    2005-01-01

    The Grid Visualization Tool (GVT) is a computer program for displaying the path of a mobile robotic explorer (rover) on a terrain map. The GVT reads a map-data file in either portable graymap (PGM) or portable pixmap (PPM) format, representing a gray-scale or color map image, respectively. The GVT also accepts input from path-planning and activity-planning software. From these inputs, the GVT generates a map overlaid with one or more rover path(s), waypoints, locations of targets to be explored, and/or target-status information (indicating success or failure in exploring each target). The display can also indicate different types of paths or path segments, such as the path actually traveled versus a planned path or the path traveled to the present position versus planned future movement along a path. The program provides for updating of the display in real time to facilitate visualization of progress. The size of the display and the map scale can be changed as desired by the user. The GVT was written in the C++ language using the Open Graphics Library (OpenGL) software. It has been compiled for both Sun Solaris and Linux operating systems.
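
    The record above describes file formats and overlays rather than code; as a rough illustration of the same idea (and not the original C++/OpenGL implementation), the following Python sketch loads a PGM/PPM terrain map and overlays planned and traveled paths plus target markers. File names, coordinates, and target labels are hypothetical.

```python
# Minimal sketch (not the original GVT): load a PGM/PPM terrain map and
# overlay a planned path, the traveled path, and target markers.
import matplotlib.pyplot as plt
import numpy as np
from PIL import Image  # PIL reads both PGM (gray-scale) and PPM (color)

terrain = np.asarray(Image.open("terrain.pgm"))      # placeholder map file
planned = np.array([[10, 12], [40, 35], [80, 60]])   # planned path (x, y)
traveled = np.array([[10, 12], [38, 33]])            # path traveled so far
targets = {"rock_1": (40, 35), "crater_rim": (80, 60)}

fig, ax = plt.subplots()
ax.imshow(terrain, cmap="gray")
ax.plot(planned[:, 0], planned[:, 1], "--", label="planned path")
ax.plot(traveled[:, 0], traveled[:, 1], "-", label="traveled path")
for name, (x, y) in targets.items():
    ax.plot(x, y, "o")
    ax.annotate(name, (x, y))
ax.legend()
plt.show()
```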

  2. Locating Sequence on FPC Maps and Selecting a Minimal Tiling Path

    PubMed Central

    Engler, Friedrich W.; Hatfield, James; Nelson, William; Soderlund, Carol A.

    2003-01-01

    This study discusses three software tools: the first two aid in integrating sequence with an FPC physical map, and the third automatically selects a minimal tiling path given genomic draft sequence and BAC end sequences. The first tool, FSD (FPC Simulated Digest), takes a sequenced clone and adds it back to the map based on a fingerprint generated by an in silico digest of the clone. This allows verification of sequenced clone positions and the integration of sequenced clones that were not originally part of the FPC map. The second tool, BSS (Blast Some Sequence), takes a query sequence and positions it on the map based on sequence associated with the clones in the map. BSS has multiple uses as follows: (1) When the query is a file of marker sequences, they can be added as electronic markers. (2) When the query is draft sequence, the results of BSS can be used to close gaps in a sequenced clone or the physical map. (3) When the query is a sequenced clone and the target is BAC end sequences, one may select the next clone for sequencing using both sequence comparison results and map location. (4) When the query is whole-genome draft sequence and the target is BAC end sequences, the results can be used to select many clones for a minimal tiling path at once. The third tool, pickMTP, automates the majority of this last usage of BSS. Results are presented using the rice FPC map, BAC end sequences, and whole-genome shotgun sequence from Syngenta. PMID:12915486
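
    As a rough illustration of the in silico digest behind FSD (not the published tool itself), the sketch below computes a restriction fragment-size fingerprint for a clone sequence with Biopython; the enzyme and the toy clone sequence are placeholders, and the comparison against an FPC map is not shown.

```python
# In silico digest of a clone sequence into a fragment-size fingerprint.
# Enzyme choice and the clone sequence are illustrative placeholders.
from Bio.Seq import Seq
from Bio.Restriction import HindIII

clone = Seq("AAGCTTACGT" * 500 + "GGATCCAAGCTTTTTT")  # toy clone sequence

# HindIII.search returns 1-based cut positions within the sequence.
cuts = sorted(HindIII.search(clone))
bounds = [0] + cuts + [len(clone)]
fragment_sizes = sorted(b - a for a, b in zip(bounds, bounds[1:]))

print("in silico fingerprint (fragment sizes):", fragment_sizes)
```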

  3. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool.

    PubMed

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-06-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13-17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences.

  4. Incorporating Concept Mapping in Project-Based Learning: Lessons from Watershed Investigations

    NASA Astrophysics Data System (ADS)

    Rye, James; Landenberger, Rick; Warner, Timothy A.

    2013-06-01

    The concept map tool set forth by Novak and colleagues is underutilized in education. A meta-analysis has encouraged teachers to make extensive use of concept mapping, and researchers have advocated computer-based concept mapping applications that exploit hyperlink technology. Through an NSF sponsored geosciences education grant, middle and secondary science teachers participated in professional development to apply computer-based concept mapping in project-based learning (PBL) units that investigated local watersheds. Participants attended a summer institute, engaged in a summer through spring online learning academy, and presented PBL units at a subsequent fall science teachers' convention. The majority of 17 teachers who attended the summer institute had previously used the concept mapping strategy with students and rated it highly. Of the 12 teachers who continued beyond summer, applications of concept mapping ranged from collaborative planning of PBL projects to building students' vocabulary to students producing maps related to the PBL driving question. Barriers to the adoption and use of concept mapping included technology access at the schools, lack of time for teachers to advance their technology skills, lack of student motivation to choose to learn, and student difficulty with linking terms. In addition to mitigating the aforementioned barriers, projects targeting teachers' use of technology tools may enhance adoption by recruiting teachers as partners from schools as well as a small number that already are proficient in the targeted technology and emphasizing the utility of the concept map as a planning tool.

  5. Interventions and assessment tools addressing key concepts people need to know to appraise claims about treatment effects: a systematic mapping review.

    PubMed

    Austvoll-Dahlgren, Astrid; Nsangi, Allen; Semakula, Daniel

    2016-12-29

    People's ability to appraise claims about treatment effects is crucial for informed decision-making. Our objective was to systematically map this area of research in order to (a) provide an overview of interventions targeting key concepts that people need to understand to assess treatment claims and (b) identify assessment tools used to evaluate people's understanding of these concepts. The findings of this review provide a starting point for decisions about which key concepts to address when developing new interventions, and which assessment tools should be considered. We conducted a systematic mapping review of interventions and assessment tools addressing key concepts important for people to be able to assess treatment claims. A systematic literature search was done by a research librarian in relevant databases. Judgement about inclusion of studies and data collection was done by at least two researchers. We included all quantitative study designs targeting one or more of the key concepts, and targeting patients, healthy members of the public, and health professionals. The studies were divided into four categories: risk communication and decision aids, evidence-based medicine and critical appraisal, understanding of controlled trials, and science education. Findings were summarised descriptively. We included 415 studies; the interventions and assessment tools we identified covered only a handful of the key concepts. The most common key concepts in interventions were "Treatments usually have beneficial and harmful effects," "Treatment comparisons should be fair," "Compare like with like," and "Single studies can be misleading." A variety of assessment tools were identified, but only four assessment tools included 10 or more key concepts. There is great potential for developing learning and assessment tools targeting key concepts that people need to understand to assess claims about treatment effects. There is currently no instrument covering assessment of all these key concepts.

  6. Developing Automated Spectral Analysis Tools for Interstellar Features Extraction to Support Construction of the 3D ISM Map

    NASA Astrophysics Data System (ADS)

    Puspitarini, L.; Lallement, R.; Monreal-Ibero, A.; Chen, H.-C.; Malasan, H. L.; Aprilia; Arifyanto, M. I.; Irfan, M.

    2018-04-01

    One of the ways to obtain a detailed 3D ISM map is by gathering interstellar (IS) absorption data toward widely distributed background target stars at known distances (line-of-sight/LOS data). The radial and angular evolution of the LOS measurements allows the inference of the ISM spatial distribution. For better spatial resolution, one needs a large number of LOS data, which requires building fast tools to measure IS absorption. One of these tools is a global analysis that fits two different diffuse interstellar bands (DIBs) simultaneously. We derived the equivalent width (EW) ratio of the two DIBs recorded in each spectrum of the target stars. The variability of this ratio can be used to study IS environmental conditions or to detect DIB families.
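
    The equivalent-width measurement underlying this kind of tool can be sketched briefly: integrate the normalized absorption depth over each DIB window and take the ratio. The following Python sketch uses a synthetic spectrum; the wavelength windows, continuum, and line depths are illustrative only.

```python
import numpy as np

def equivalent_width(wave, flux, continuum, window):
    """EW = integral of (1 - F/Fc) d(lambda) over the DIB window (same units as wave)."""
    mask = (wave >= window[0]) & (wave <= window[1])
    depth = 1.0 - flux[mask] / continuum[mask]
    return np.trapz(depth, wave[mask])

# Toy spectrum with two Gaussian "DIBs"; windows (Angstroms) are illustrative.
wave = np.linspace(6190, 6290, 2000)
continuum = np.ones_like(wave)
flux = continuum - 0.08 * np.exp(-0.5 * ((wave - 6196.0) / 0.6) ** 2) \
                 - 0.15 * np.exp(-0.5 * ((wave - 6283.8) / 1.8) ** 2)

ew_a = equivalent_width(wave, flux, continuum, (6193, 6199))
ew_b = equivalent_width(wave, flux, continuum, (6278, 6290))
print("EW ratio:", ew_a / ew_b)
```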

  7. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool

    PubMed Central

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-01-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13–17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences. PMID:29303048

  8. EcoEvo-MAPS: An Ecology and Evolution Assessment for Introductory through Advanced Undergraduates

    ERIC Educational Resources Information Center

    Summers, Mindi M.; Couch, Brian A.; Knight, Jennifer K.; Brownell, Sara E.; Crowe, Alison J.; Semsar, Katharine; Wright, Christian D.; Smith, Michelle K.

    2018-01-01

    A new assessment tool, Ecology and Evolution--Measuring Achievement and Progression in Science or EcoEvo-MAPS, measures student thinking in ecology and evolution during an undergraduate course of study. EcoEvo-MAPS targets foundational concepts in ecology and evolution and uses a novel approach that asks students to evaluate a series of…

  9. The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.

    PubMed

    Schubert, Christian R; Stultz, Collin M

    2009-08-01

    Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.

  10. EcoEvo-MAPS: An Ecology and Evolution Assessment for Introductory through Advanced Undergraduates.

    PubMed

    Summers, Mindi M; Couch, Brian A; Knight, Jennifer K; Brownell, Sara E; Crowe, Alison J; Semsar, Katharine; Wright, Christian D; Smith, Michelle K

    2018-06-01

    A new assessment tool, Ecology and Evolution-Measuring Achievement and Progression in Science or EcoEvo-MAPS, measures student thinking in ecology and evolution during an undergraduate course of study. EcoEvo-MAPS targets foundational concepts in ecology and evolution and uses a novel approach that asks students to evaluate a series of predictions, conclusions, or interpretations as likely or unlikely to be true given a specific scenario. We collected evidence of validity and reliability for EcoEvo-MAPS through an iterative process of faculty review, student interviews, and analyses of assessment data from more than 3000 students at 34 associate's-, bachelor's-, master's-, and doctoral-granting institutions. The 63 likely/unlikely statements range in difficulty and target student understanding of key concepts aligned with the Vision and Change report. This assessment provides departments with a tool to measure student thinking at different time points in the curriculum and provides data that can be used to inform curricular and instructional modifications.

  11. A Toolkit for bulk PCR-based marker design from next-generation sequence data: application for development of a framework linkage map in bulb onion (Allium cepa L.)

    PubMed Central

    2012-01-01

    Background Although modern sequencing technologies permit the ready detection of numerous DNA sequence variants in any organisms, converting such information to PCR-based genetic markers is hampered by a lack of simple, scalable tools. Onion is an example of an under-researched crop with a complex, heterozygous genome where genome-based research has previously been hindered by limited sequence resources and genetic markers. Results We report the development of generic tools for large-scale web-based PCR-based marker design in the Galaxy bioinformatics framework, and their application for development of next-generation genetics resources in a wide cross of bulb onion (Allium cepa L.). Transcriptome sequence resources were developed for the homozygous doubled-haploid bulb onion line ‘CUDH2150’ and the genetically distant Indian landrace ‘Nasik Red’, using 454™ sequencing of normalised cDNA libraries of leaf and shoot. Read mapping of ‘Nasik Red’ reads onto ‘CUDH2150’ assemblies revealed 16836 indel and SNP polymorphisms that were mined for portable PCR-based marker development. Tools for detection of restriction polymorphisms and primer set design were developed in BioPython and adapted for use in the Galaxy workflow environment, enabling large-scale and targeted assay design. Using PCR-based markers designed with these tools, a framework genetic linkage map of over 800cM spanning all chromosomes was developed in a subset of 93 F2 progeny from a very large F2 family developed from the ‘Nasik Red’ x ‘CUDH2150’ inter-cross. The utility of tools and genetic resources developed was tested by designing markers to transcription factor-like polymorphic sequences. Bin mapping these markers using a subset of 10 progeny confirmed the ability to place markers within 10 cM bins, enabling increased efficiency in marker assignment and targeted map refinement. The major genetic loci conditioning red bulb colour (R) and fructan content (Frc) were located on this map by QTL analysis. Conclusions The generic tools developed for the Galaxy environment enable rapid development of sets of PCR assays targeting sequence variants identified from Illumina and 454 sequence data. They enable non-specialist users to validate and exploit large volumes of next-generation sequence data using basic equipment. PMID:23157543

  12. A toolkit for bulk PCR-based marker design from next-generation sequence data: application for development of a framework linkage map in bulb onion (Allium cepa L.).

    PubMed

    Baldwin, Samantha; Revanna, Roopashree; Thomson, Susan; Pither-Joyce, Meeghan; Wright, Kathryn; Crowhurst, Ross; Fiers, Mark; Chen, Leshi; Macknight, Richard; McCallum, John A

    2012-11-19

    Although modern sequencing technologies permit the ready detection of numerous DNA sequence variants in any organisms, converting such information to PCR-based genetic markers is hampered by a lack of simple, scalable tools. Onion is an example of an under-researched crop with a complex, heterozygous genome where genome-based research has previously been hindered by limited sequence resources and genetic markers. We report the development of generic tools for large-scale web-based PCR-based marker design in the Galaxy bioinformatics framework, and their application for development of next-generation genetics resources in a wide cross of bulb onion (Allium cepa L.). Transcriptome sequence resources were developed for the homozygous doubled-haploid bulb onion line 'CUDH2150' and the genetically distant Indian landrace 'Nasik Red', using 454™ sequencing of normalised cDNA libraries of leaf and shoot. Read mapping of 'Nasik Red' reads onto 'CUDH2150' assemblies revealed 16836 indel and SNP polymorphisms that were mined for portable PCR-based marker development. Tools for detection of restriction polymorphisms and primer set design were developed in BioPython and adapted for use in the Galaxy workflow environment, enabling large-scale and targeted assay design. Using PCR-based markers designed with these tools, a framework genetic linkage map of over 800cM spanning all chromosomes was developed in a subset of 93 F(2) progeny from a very large F(2) family developed from the 'Nasik Red' x 'CUDH2150' inter-cross. The utility of tools and genetic resources developed was tested by designing markers to transcription factor-like polymorphic sequences. Bin mapping these markers using a subset of 10 progeny confirmed the ability to place markers within 10 cM bins, enabling increased efficiency in marker assignment and targeted map refinement. The major genetic loci conditioning red bulb colour (R) and fructan content (Frc) were located on this map by QTL analysis. The generic tools developed for the Galaxy environment enable rapid development of sets of PCR assays targeting sequence variants identified from Illumina and 454 sequence data. They enable non-specialist users to validate and exploit large volumes of next-generation sequence data using basic equipment.
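
    As a rough sketch of the restriction-polymorphism screen described above (not the authors' Galaxy tools), the following Biopython snippet digests the two parental alleles around a SNP in silico and flags enzymes whose cut pattern differs, i.e. candidate CAPS markers; the flanking sequences and enzyme panel are placeholders.

```python
from Bio.Seq import Seq
from Bio.Restriction import RestrictionBatch

# Flanking sequence around one SNP in each parent (placeholder sequences).
allele_ref = Seq("ACGTACGTGAATTCTTAGGCCTAGCTAGCTAGGATCCTT")  # reference allele
allele_alt = Seq("ACGTACGTGAGTTCTTAGGCCTAGCTAGCTAGGATCCTT")  # SNP destroys EcoRI site

enzymes = RestrictionBatch(["EcoRI", "BamHI", "HindIII", "DraI"])

for enz in enzymes:
    cuts_ref = enz.search(allele_ref)
    cuts_alt = enz.search(allele_alt)
    if cuts_ref != cuts_alt:
        print(f"{enz}: candidate CAPS marker "
              f"({len(cuts_ref)} vs {len(cuts_alt)} cut sites)")
```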

  13. Improving mapping and SNP-calling performance in multiplexed targeted next-generation sequencing

    PubMed Central

    2012-01-01

    Background Compared to classical genotyping, targeted next-generation sequencing (tNGS) can be custom-designed to interrogate entire genomic regions of interest, in order to detect novel as well as known variants. To bring down the per-sample cost, one approach is to pool barcoded NGS libraries before sample enrichment. Still, we lack a complete understanding of how this multiplexed tNGS approach and the varying performance of the ever-evolving analytical tools can affect the quality of variant discovery. Therefore, we evaluated the impact of different software tools and analytical approaches on the discovery of single nucleotide polymorphisms (SNPs) in multiplexed tNGS data. To generate our own test model, we combined a sequence capture method with NGS in three experimental stages of increasing complexity (E. coli genes, multiplexed E. coli, and multiplexed HapMap BRCA1/2 regions). Results We successfully enriched barcoded NGS libraries instead of genomic DNA, achieving reproducible coverage profiles (Pearson correlation coefficients of up to 0.99) across multiplexed samples, with <10% strand bias. However, the SNP calling quality was substantially affected by the choice of tools and mapping strategy. With the aim of reducing computational requirements, we compared conventional whole-genome mapping and SNP-calling with a new faster approach: target-region mapping with subsequent ‘read-backmapping’ to the whole genome to reduce the false detection rate. Consequently, we developed a combined mapping pipeline, which includes standard tools (BWA, SAMtools, etc.), and tested it on public HiSeq2000 exome data from the 1000 Genomes Project. Our pipeline saved 12 hours of run time per Hiseq2000 exome sample and detected ~5% more SNPs than the conventional whole genome approach. This suggests that more potential novel SNPs may be discovered using both approaches than with just the conventional approach. Conclusions We recommend applying our general ‘two-step’ mapping approach for more efficient SNP discovery in tNGS. Our study has also shown the benefit of computing inter-sample SNP-concordances and inspecting read alignments in order to attain more confident results. PMID:22913592
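
    The 'two-step' mapping idea can be sketched as a small driver script (this is not the authors' pipeline; file names are placeholders, and bwa, samtools, and indexed references are assumed to be available):

```python
# Minimal sketch of the 'two-step' idea: map to the target regions first,
# then 'back-map' those reads to the whole genome to discard off-target hits.
import subprocess

def run(cmd):
    print("+", cmd)
    subprocess.run(cmd, shell=True, check=True)

# Step 1: fast mapping against the small target-region reference only.
run("bwa index target_regions.fa")
run("bwa mem target_regions.fa reads.fq | samtools view -b -F 4 - > on_target.bam")

# Step 2: back-map the on-target reads to the whole genome and keep only
# those whose genome-wide alignment still overlaps the target intervals.
run("samtools fastq on_target.bam > candidates.fq")
run("bwa index whole_genome.fa")
run("bwa mem whole_genome.fa candidates.fq | samtools sort -o backmapped.bam -")
run("samtools view -b -L targets.bed backmapped.bam > confirmed.bam")

# SNP calling would then run on confirmed.bam, with inter-sample
# concordance checks as a final sanity filter.
```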

  14. SU-E-J-109: Accurate Contour Transfer Between Different Image Modalities Using a Hybrid Deformable Image Registration and Fuzzy Connected Image Segmentation Method.

    PubMed

    Yang, C; Paulson, E; Li, X

    2012-06-01

    To develop and evaluate a tool that can improve the accuracy of contour transfer between different image modalities under challenging conditions of low image contrast and large image deformation, compared with a few commonly used methods, for radiation treatment planning. The software tool includes the following steps and functionalities: (1) accepting input of images of different modalities, (2) converting existing contours on reference images (e.g., MRI) into delineated volumes and adjusting the intensity within the volumes to match the target images' (e.g., CT) intensity distribution for an enhanced similarity metric, (3) registering reference and target images using appropriate deformable registration algorithms (e.g., B-spline, demons) and generating deformed contours, (4) mapping the deformed volumes onto the target images and calculating mean, variance, and center of mass as the initialization parameters for subsequent fuzzy connectedness (FC) image segmentation on the target images, (5) generating an affinity map from the FC segmentation, and (6) achieving final contours by modifying the deformed contours using the affinity map with a gradient distance weighting algorithm. The tool was tested with the CT and MR images of four pancreatic cancer patients acquired at the same respiration phase to minimize motion distortion. Dice's coefficient was calculated against direct delineation on the target image. Contours generated by various methods, including rigid transfer, auto-segmentation, deformable-only transfer, and the proposed method, were compared. Fuzzy connected image segmentation needs careful parameter initialization and user involvement. Automatic contour transfer by multi-modality deformable registration leads to up to 10% accuracy improvement over rigid transfer. The two extra proposed steps, adjusting the intensity distribution and modifying the deformed contour with the affinity map, improve the transfer accuracy further, to 14% on average. Deformable image registration aided by contrast adjustment and fuzzy connectedness segmentation improves the contour transfer accuracy between multi-modality images, particularly with large deformation and low image contrast. © 2012 American Association of Physicists in Medicine.
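
    Step (2) above, remapping the reference-volume intensities to resemble the target modality before registration, can be approximated with off-the-shelf histogram matching; the sketch below is not the authors' implementation, and the volumes and mask are random placeholders.

```python
import numpy as np
from skimage.exposure import match_histograms

# Toy stand-ins for an MR reference volume, a CT target volume, and the
# delineated structure mask on the MR image (all placeholders).
mri = np.random.rand(64, 64, 32).astype(np.float32)
ct = (1000 * np.random.rand(64, 64, 32) - 200).astype(np.float32)
mask = np.zeros(mri.shape, dtype=bool)
mask[20:40, 20:40, 10:20] = True

# Remap MR intensities so their distribution resembles the CT distribution,
# then substitute the remapped values inside the delineated volume only.
mri_matched = match_histograms(mri, ct)
adjusted = mri.copy()
adjusted[mask] = mri_matched[mask]
# 'adjusted' would then be passed to the deformable registration step.
```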

  15. AlphaSpace: Fragment-Centric Topographical Mapping To Target Protein–Protein Interaction Interfaces

    PubMed Central

    2016-01-01

    Inhibition of protein–protein interactions (PPIs) is emerging as a promising therapeutic strategy despite the difficulty in targeting such interfaces with drug-like small molecules. PPIs generally feature large and flat binding surfaces as compared to typical drug targets. These features pose a challenge for structural characterization of the surface using geometry-based pocket-detection methods. An attractive mapping strategy—that builds on the principles of fragment-based drug discovery (FBDD)—is to detect the fragment-centric modularity at the protein surface and then characterize the large PPI interface as a set of localized, fragment-targetable interaction regions. Here, we introduce AlphaSpace, a computational analysis tool designed for fragment-centric topographical mapping (FCTM) of PPI interfaces. Our approach uses the alpha sphere construct, a geometric feature of a protein’s Voronoi diagram, to map out concave interaction space at the protein surface. We introduce two new features—alpha-atom and alpha-space—and the concept of the alpha-atom/alpha-space pair to rank pockets for fragment-targetability and to facilitate the evaluation of pocket/fragment complementarity. The resulting high-resolution interfacial map of targetable pocket space can be used to guide the rational design and optimization of small molecule or biomimetic PPI inhibitors. PMID:26225450
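
    The alpha sphere construct can be illustrated with a small geometric sketch (not the AlphaSpace code): each Delaunay tetrahedron of the atom coordinates defines a circumsphere that touches four atoms, and spheres within an intermediate radius range flag concave, fragment-sized pockets. The coordinates and radius cut-offs below are placeholders.

```python
import numpy as np
from scipy.spatial import Delaunay

def alpha_spheres(coords, r_min=3.2, r_max=5.4):
    """Return (center, radius) of Delaunay-tetrahedron circumspheres whose
    radius falls in [r_min, r_max]; the thresholds are illustrative."""
    tri = Delaunay(coords)
    spheres = []
    for simplex in tri.simplices:
        p = coords[simplex]                         # the 4 atoms of the tetrahedron
        a = 2.0 * (p[1:] - p[0])                    # circumcenter c solves:
        b = np.sum(p[1:] ** 2 - p[0] ** 2, axis=1)  # 2(p_i - p_0).c = |p_i|^2 - |p_0|^2
        try:
            center = np.linalg.solve(a, b)
        except np.linalg.LinAlgError:               # degenerate (flat) tetrahedron
            continue
        radius = np.linalg.norm(center - p[0])
        if r_min <= radius <= r_max:
            spheres.append((center, radius))
    return spheres

# Toy "protein": random atom coordinates in Angstroms (placeholder).
rng = np.random.default_rng(0)
atoms = rng.uniform(0, 30, size=(200, 3))
print(len(alpha_spheres(atoms)), "candidate alpha spheres")
```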

  16. High-resolution Antibody Array Analysis of Childhood Acute Leukemia Cells*

    PubMed Central

    Kanderova, Veronika; Kuzilkova, Daniela; Stuchly, Jan; Vaskova, Martina; Brdicka, Tomas; Fiser, Karel; Hrusak, Ondrej; Lund-Johansen, Fridtjof

    2016-01-01

    Acute leukemia is a disease pathologically manifested at both genomic and proteomic levels. Molecular genetic technologies are currently widely used in clinical research. In contrast, sensitive and high-throughput proteomic techniques for performing protein analyses in patient samples are still lacking. Here, we used a technology based on size exclusion chromatography followed by immunoprecipitation of target proteins with an antibody bead array (Size Exclusion Chromatography-Microsphere-based Affinity Proteomics, SEC-MAP) to detect hundreds of proteins from a single sample. In addition, we developed semi-automatic bioinformatics tools to adapt this technology for high-content proteomic screening of pediatric acute leukemia patients. To confirm the utility of SEC-MAP in leukemia immunophenotyping, we tested 31 leukemia diagnostic markers in parallel by SEC-MAP and flow cytometry. We identified 28 antibodies suitable for both techniques. Eighteen of them provided excellent quantitative correlation between SEC-MAP and flow cytometry (p < 0.05). Next, SEC-MAP was applied to examine 57 diagnostic samples from patients with acute leukemia. In this assay, we used 632 different antibodies and detected 501 targets. Of those, 47 targets were differentially expressed between at least two of the three acute leukemia subgroups. The CD markers correlated with immunophenotypic categories as expected. From non-CD markers, we found DBN1, PAX5, or PTK2 overexpressed in B-cell precursor acute lymphoblastic leukemias, LAT, SH2D1A, or STAT5A overexpressed in T-cell acute lymphoblastic leukemias, and HCK, GLUD1, or SYK overexpressed in acute myeloid leukemias. In addition, OPAL1 overexpression corresponded to ETV6-RUNX1 chromosomal translocation. In summary, we demonstrated that SEC-MAP technology is a powerful tool for detecting hundreds of proteins in clinical samples obtained from pediatric acute leukemia patients. It provides information about protein size and reveals differences in protein expression between particular leukemia subgroups. Forty-seven of the SEC-MAP-identified targets were validated by other conventional methods in this study. PMID:26785729

  17. EO based Agro-ecosystem approach for climate change adaptation in enhancing the crop production efficiency in the Indo-gangetic plains of India

    NASA Astrophysics Data System (ADS)

    Pandey, Suraj

    This study develops a spatial mapping of agro-ecological zones based on an earth observation model using a regional MODIS dataset, as a tool to identify key cropping-system areas and to target climate change adaptation strategies. The tool is applied to the Indo-Gangetic Plains of north India to target domains of bio-physical characteristics and socio-economics with respect to the changing climate in the region. It draws on secondary data for spatially explicit variables at the state/district level, which serve as natural, social, and human indicators of climate variability based on a sustainable livelihood approach. The study details the methodology used and generates spatial climate risk maps for composite indicators of livelihood and a vulnerability index in the region.

  18. Stakeholder analysis and mapping as targeted communication strategy.

    PubMed

    Shirey, Maria R

    2012-09-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author highlights the importance of stakeholder theory and discusses how to apply the theory to conduct a stakeholder analysis. This article also provides an explanation of how to use related stakeholder mapping techniques with targeted communication strategies.

  19. KinMap: a web-based tool for interactive navigation through human kinome data.

    PubMed

    Eid, Sameh; Turk, Samo; Volkamer, Andrea; Rippmann, Friedrich; Fulle, Simone

    2017-01-05

    Annotation of the phylogenetic tree of the human kinome is an intuitive way to visualize compound profiling data, structural features of kinases, or functional relationships within this important class of proteins. The increasing volume and complexity of kinase-related data underlines the need for a tool that enables complex queries pertaining to kinase disease involvement and potential therapeutic uses of kinase inhibitors. Here, we present KinMap, a user-friendly online tool that facilitates the interactive navigation through kinase knowledge by linking biochemical, structural, and disease association data to the human kinome tree. To this end, preprocessed data from freely-available sources, such as ChEMBL, the Protein Data Bank, and the Center for Therapeutic Target Validation platform are integrated into KinMap and can easily be complemented by proprietary data. The value of KinMap is demonstrated, by way of example, for uncovering new therapeutic indications of known kinase inhibitors and for prioritizing kinases for drug development efforts. KinMap represents a new generation of kinome tree viewers that facilitates interactive exploration of the human kinome. KinMap enables generation of high-quality annotated images of the human kinome tree as well as exchange of kinome-related data in scientific communications. Furthermore, KinMap supports multiple input and output formats and recognizes alternative kinase names and links them to a unified naming scheme, which makes it a useful tool across different disciplines and applications. A web-service of KinMap is freely available at http://www.kinhub.org/kinmap/.

  20. Scheduling Results for the THEMIS Observation Scheduling Tool

    NASA Technical Reports Server (NTRS)

    Mclaren, David; Rabideau, Gregg; Chien, Steve; Knight, Russell; Anwar, Sadaat; Mehall, Greg; Christensen, Philip

    2011-01-01

    We describe a scheduling system intended to assist in the development of instrument data acquisitions for the THEMIS instrument, onboard the Mars Odyssey spacecraft, and compare results from multiple scheduling algorithms. This tool creates observations of both (a) targeted geographical regions of interest and (b) general mapping observations, while respecting spacecraft constraints such as data volume, observation timing, visibility, lighting, season, and science priorities. This tool therefore must address both geometric and state/timing/resource constraints. We describe a tool that maps geometric polygon overlap constraints to set covering constraints using a grid-based approach. These set covering constraints are then incorporated into a greedy optimization scheduling algorithm incorporating operations constraints to generate feasible schedules. The resultant tool generates schedules of hundreds of observations per week out of potential thousands of observations. This tool is currently under evaluation by the THEMIS observation planning team at Arizona State University.
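
    The grid-based set-covering step can be sketched as a greedy loop (this is not the JPL scheduler): each candidate observation covers a set of grid cells, and the scheduler repeatedly picks the feasible observation that covers the most still-uncovered cells, with a toy data-volume budget standing in for the real timing, visibility, and resource constraints.

```python
# Greedy set cover over grid cells, with a toy data-volume budget standing
# in for the real operations constraints (all values are placeholders).
def greedy_schedule(observations, budget):
    """observations: dict name -> (set of covered grid cells, data volume)."""
    uncovered = set().union(*(cells for cells, _ in observations.values()))
    schedule, used = [], 0.0
    while uncovered:
        best = max(
            (o for o in observations
             if o not in schedule and used + observations[o][1] <= budget),
            key=lambda o: len(observations[o][0] & uncovered),
            default=None,
        )
        if best is None or not observations[best][0] & uncovered:
            break                                  # nothing feasible adds coverage
        schedule.append(best)
        used += observations[best][1]
        uncovered -= observations[best][0]
    return schedule, uncovered

obs = {
    "obs_A": ({(0, 0), (0, 1), (1, 0)}, 2.0),
    "obs_B": ({(1, 0), (1, 1)}, 1.0),
    "obs_C": ({(0, 1), (1, 1), (2, 1)}, 1.5),
}
print(greedy_schedule(obs, budget=3.5))
```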

  1. Use of concurrent mixed methods combining concept mapping and focus groups to adapt a health equity tool in Canada.

    PubMed

    Guichard, Anne; Tardieu, Émilie; Dagenais, Christian; Nour, Kareen; Lafontaine, Ginette; Ridde, Valéry

    2017-04-01

    The aim of this project was to identify and prioritize a set of conditions to be considered for incorporating a health equity tool into public health practice. Concept mapping and focus groups were implemented as complementary methods to investigate the conditions of use of a health equity tool by public health organizations in Quebec. Using a hybrid integrated research design is a richer way to address the complexity of questions emerging from intervention and planning settings. This approach provides a deeper, operational, and contextualized understanding of research results involving different professional and organizational cultures, and thereby supports the decision-making process. Concept mapping served to identify and prioritize, within a limited timeframe, the conditions to be considered for incorporating a health equity tool into public health practices. Focus groups then provided a more refined understanding of the barriers, issues, and facilitating factors surrounding the tool's adoption, helped distinguish among participants' perspectives based on functional roles and organizational contexts, and clarified some apparently contradictory results from the concept map. The combined use of these two techniques brought the strengths of each approach to bear, thereby overcoming some of the respective limitations of concept mapping and focus groups. This design is appropriate for investigating targets with multiple levels of complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Comparative map and trait viewer (CMTV): an integrated bioinformatic tool to construct consensus maps and compare QTL and functional genomics data across genomes and experiments.

    PubMed

    Sawkins, M C; Farmer, A D; Hoisington, D; Sullivan, J; Tolopko, A; Jiang, Z; Ribaut, J-M

    2004-10-01

    In the past few decades, a wealth of genomic data has been produced in a wide variety of species using a diverse array of functional and molecular marker approaches. In order to unlock the full potential of the information contained in these independent experiments, researchers need efficient and intuitive means to identify common genomic regions and genes involved in the expression of target phenotypic traits across diverse conditions. To address this need, we have developed a Comparative Map and Trait Viewer (CMTV) tool that can be used to construct dynamic aggregations of a variety of types of genomic datasets. By algorithmically determining correspondences between sets of objects on multiple genomic maps, the CMTV can display syntenic regions across taxa, combine maps from separate experiments into a consensus map, or project data from different maps into a common coordinate framework using dynamic coordinate translations between source and target maps. We present a case study that illustrates the utility of the tool for managing large and varied datasets by integrating data collected by CIMMYT in maize drought tolerance research with data from public sources. This example will focus on one of the visualization features for Quantitative Trait Locus (QTL) data, using likelihood ratio (LR) files produced by generic QTL analysis software and displaying the data in a unique visual manner across different combinations of traits, environments and crosses. Once a genomic region of interest has been identified, the CMTV can search and display additional QTLs meeting a particular threshold for that region, or other functional data such as sets of differentially expressed genes located in the region; it thus provides an easily used means for organizing and manipulating data sets that have been dynamically integrated under the focus of the researcher's specific hypothesis.
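
    The coordinate-translation idea can be sketched with shared markers anchoring a piecewise-linear mapping between a source map and a target (e.g. consensus) map, onto which any other feature, such as a QTL peak, is projected. This is not the CMTV code, and the marker names and positions are placeholders.

```python
import numpy as np

# Positions (cM) of markers shared by the two maps (placeholder values).
shared = {"m1": (0.0, 0.0), "m2": (25.0, 22.0), "m3": (60.0, 58.5), "m4": (90.0, 95.0)}
src = np.array([p[0] for p in shared.values()])
tgt = np.array([p[1] for p in shared.values()])
order = np.argsort(src)

def project(position_cm):
    """Piecewise-linear translation of a source-map position into target-map coordinates."""
    return float(np.interp(position_cm, src[order], tgt[order]))

qtl_peak_src = 47.0                      # hypothetical QTL likelihood-ratio peak
print("QTL peak on target map:", project(qtl_peak_src), "cM")
```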

  3. A global comparability approach for biosimilar monoclonal antibodies using LC-tandem MS based proteomics.

    PubMed

    Chen, Shun-Li; Wu, Shiaw-Lin; Huang, Li-Juan; Huang, Jia-Bao; Chen, Shu-Hui

    2013-06-01

    Liquid chromatography-tandem mass spectrometry-based proteomics for peptide mapping and sequencing was used to characterize the marketed monoclonal antibody trastuzumab and compare it with two biosimilar products, mAb A containing D359E and L361M variations at the Fc site and mAb B without variants. Complete sequence coverage (100%) including disulfide linkages, glycosylations and other commonly occurring modifications (i.e., deamidation, oxidation, dehydration and K-clipping) were identified using maps generated from multi-enzyme digestions. In addition to the targeted comparison for the relative populations of targeted modification forms, a non-targeted approach was used to globally compare ion intensities in tryptic maps. The non-targeted comparison provided an extra-dimensional view to examine any possible differences related to variants or modifications. A peptide containing the two variants in mAb A, D359E and L361M, was revealed using the non-targeted comparison of the tryptic maps. In contrast, no significant differences were observed when trastuzumab was self-compared or compared with mAb B. These results were consistent with the data derived from peptide sequencing via collision induced dissociation/electron transfer dissociation. Thus, combined targeted and non-targeted approaches using powerful mass spectrometry-based proteomic tools hold great promise for the structural characterization of biosimilar products. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Evaluation of the User Strategy on 2d and 3d City Maps Based on Novel Scanpath Comparison Method and Graph Visualization

    NASA Astrophysics Data System (ADS)

    Dolezalova, J.; Popelka, S.

    2016-06-01

    The paper deals with scanpath comparison of eye-tracking data recorded during a case study focused on the evaluation of 2D and 3D city maps. The experiment contained screenshots from three map portals. Two types of maps were used - a standard map and a 3D visualization. Respondents' task was to find a particular point symbol on the map as fast as possible. Scanpath comparison is one group of eye-tracking data analysis methods used for revealing the strategies of respondents. In cartographic studies, the most commonly used application for scanpath comparison is eyePatterns, whose output is hierarchical clustering and a tree graph representing the relationships between the analysed sequences. During an analysis of the algorithm generating the tree graph, it was found that the outputs do not correspond to reality. We therefore created a new tool called ScanGraph. This tool uses visualization of cliques in simple graphs and is freely available at www.eyetracking.upol.cz/scangraph. The results of the study confirmed the functionality of the tool and its suitability for analysing the different strategies of map readers. Based on the results of the tool, similar scanpaths were selected, and groups of respondents with similar strategies were identified. With this knowledge, it is possible to analyse the relationship between membership of a group with a similar strategy and data gathered from the questionnaire (age, sex, cartographic knowledge, etc.) or the type of stimulus (2D or 3D map).
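
    A rough sketch of the clique-based grouping that ScanGraph performs is shown below, assuming scanpaths are encoded as strings of area-of-interest labels, similarity is a simple SequenceMatcher ratio, and the 0.7 threshold is arbitrary (the actual tool's similarity measures may differ).

```python
from difflib import SequenceMatcher
from itertools import combinations
import networkx as nx

# Scanpaths encoded as sequences of fixated areas of interest (placeholders).
scanpaths = {
    "P01": "ABBCDDE",
    "P02": "ABCDDE",
    "P03": "AFFGE",
    "P04": "AFGGE",
}

G = nx.Graph()
G.add_nodes_from(scanpaths)
for a, b in combinations(scanpaths, 2):
    sim = SequenceMatcher(None, scanpaths[a], scanpaths[b]).ratio()
    if sim >= 0.7:                       # arbitrary similarity threshold
        G.add_edge(a, b, weight=sim)

# Maximal cliques = groups of respondents whose strategies are mutually similar.
print(list(nx.find_cliques(G)))
```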

  5. Semi-automatic Data Integration using Karma

    NASA Astrophysics Data System (ADS)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of Karma specifically for the geosciences. In particular, we show how Karma can be used intuitively to obtain the mapping model between case study data sources and a publicly available and expressive target ontology that has been designed to capture a broad set of concepts in geoscience with standardized, easily searchable names.

  6. Mapping invasive aquatic vegetation in the Sacramento-San Joaquin Delta using hyperspectral imagery.

    PubMed

    Underwood, E C; Mulitsch, M J; Greenberg, J A; Whiting, M L; Ustin, S L; Kefauver, S C

    2006-10-01

    The ecological and economic impacts associated with invasive species are of critical concern to land managers. The ability to map the extent and severity of invasions would be a valuable contribution to management decisions relating to control and monitoring efforts. We investigated the use of hyperspectral imagery for mapping invasive aquatic plant species in the Sacramento-San Joaquin Delta in the Central Valley of California, at two spatial scales. Sixty-four flightlines of HyMap hyperspectral imagery were acquired over the study region covering an area of 2,139 km² and field work was conducted to acquire GPS locations of target invasive species. We used spectral mixture analysis to classify two target invasive species: Brazilian waterweed (Egeria densa), a submerged invasive, and water hyacinth (Eichhornia crassipes), a floating emergent invasive. At the relatively fine spatial scale for five sites within the Delta (average size 51 ha) average classification accuracies were 93% for Brazilian waterweed and 73% for water hyacinth. However, at the coarser, Delta-wide scale (177,000 ha) these accuracy results were 29% for Brazilian waterweed and 65% for water hyacinth. The difference in accuracy is likely accounted for by the broad range in water turbidity and tide heights encountered across the Delta. These findings illustrate that hyperspectral imagery is a promising tool for discriminating target invasive species within the Sacramento-San Joaquin Delta waterways although more work is needed to develop classification tools that function under changing environmental conditions.
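
    Linear spectral mixture analysis can be sketched per pixel as a non-negative least-squares fit of endmember spectra (e.g. water, Egeria densa, Eichhornia crassipes); this is not the authors' processing chain, and the endmember matrix and pixel spectra below are random placeholders.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_bands = 126                                    # HyMap-like band count (illustrative)
endmembers = rng.random((n_bands, 3))            # columns: water, E. densa, E. crassipes
pixels = rng.random((1000, n_bands))             # toy image reshaped to (pixels, bands)

fractions = np.empty((pixels.shape[0], 3))
for i, spectrum in enumerate(pixels):
    coeffs, _ = nnls(endmembers, spectrum)       # non-negative abundances
    total = coeffs.sum()
    fractions[i] = coeffs / total if total > 0 else 0.0

# A pixel would be labelled as a target species when its fraction exceeds a threshold.
print("mean fractions (water, E. densa, E. crassipes):", fractions.mean(axis=0))
```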

  7. Mapping small molecule binding data to structural domains

    PubMed Central

    2012-01-01

    Background Large-scale bioactivity/SAR Open Data has recently become available, and this has allowed new analyses and approaches to be developed to help address the productivity and translational gaps of current drug discovery. One of the current limitations of these data is the relative sparsity of reported interactions per protein target, and complexities in establishing clear relationships between bioactivity and targets using bioinformatics tools. We detail in this paper the indexing of targets by the structural domains that bind (or are likely to bind) the ligand within a full-length protein. Specifically, we present a simple heuristic to map small molecule binding to Pfam domains. This profiling can be applied to all proteins within a genome to give some indications of the potential pharmacological modulation and regulation of all proteins. Results In this implementation of our heuristic, ligand binding to protein targets from the ChEMBL database was mapped to structural domains as defined by profiles contained within the Pfam-A database. Our mapping suggests that the majority of assay targets within the current version of the ChEMBL database bind ligands through a small number of highly prevalent domains, and conversely the majority of Pfam domains sampled by our data play no currently established role in ligand binding. Validation studies, carried out firstly against Uniprot entries with expert binding-site annotation and secondly against entries in the wwPDB repository of crystallographic protein structures, demonstrate that our simple heuristic maps ligand binding to the correct domain in about 90 percent of all assessed cases. Using the mappings obtained with our heuristic, we have assembled ligand sets associated with each Pfam domain. Conclusions Small molecule binding has been mapped to Pfam-A domains of protein targets in the ChEMBL bioactivity database. The result of this mapping is an enriched annotation of small molecule bioactivity data and a grouping of activity classes following the Pfam-A specifications of protein domains. This is valuable for data-focused approaches in drug discovery, for example when extrapolating potential targets of a small molecule with known activity against one or few targets, or in the assessment of a potential target for drug discovery or screening studies. PMID:23282026
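
    The flavor of such a heuristic can be sketched in a few lines (an illustrative simplification, not the authors' exact rules): if a target protein contains exactly one domain drawn from a list of domains already known to bind small molecules, assign the measured activity to that domain. The domain lists and target architectures below are placeholders.

```python
# Map a compound's protein-level activity to a Pfam domain (illustrative heuristic).
KNOWN_BINDING_DOMAINS = {"PF00069", "PF00001", "PF00067"}   # placeholder "binder" list

# Pfam architectures of a few hypothetical assay targets.
target_domains = {
    "target_A": ["PF01030", "PF00069", "PF07714"],
    "target_B": ["PF00001"],
    "target_C": ["PF00069", "PF00067"],           # ambiguous: two candidate binders
}

def assign_domain(target_id):
    candidates = [d for d in target_domains[target_id] if d in KNOWN_BINDING_DOMAINS]
    return candidates[0] if len(candidates) == 1 else None    # None = unresolved

for tid in target_domains:
    print(tid, "->", assign_domain(tid))
```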

  8. Rapid mapping of schistosomiasis and other neglected tropical diseases in the context of integrated control programmes in Africa

    PubMed Central

    BROOKER, S.; KABATEREINE, N. B.; GYAPONG, J. O.; STOTHARD, J. R.; UTZINGER, J.

    2009-01-01

    SUMMARY There is growing interest and commitment to the control of schistosomiasis and other so-called neglected tropical diseases (NTDs). Resources for control are inevitably limited, necessitating assessment methods that can rapidly and accurately identify and map high-risk communities so that interventions can be targeted in a spatially-explicit and cost-effective manner. Here, we review progress made with (i) mapping schistosomiasis across Africa using available epidemiological data and more recently, climate-based risk prediction; (ii) the development and use of morbidity questionnaires for rapid identification of high-risk communities of urinary schistosomiasis; and (iii) innovative sampling-based approaches for intestinal schistosomiasis, using the lot quality assurance sampling technique. Experiences are also presented for the rapid mapping of other NTDs, including onchocerciasis, loiasis and lymphatic filariasis. Future directions for an integrated rapid mapping approach targeting multiple NTDs simultaneously are outlined, including potential challenges in developing an integrated survey tool. The lessons from the mapping of human helminth infections may also be relevant for the rapid mapping of malaria as its control efforts are intensified. PMID:19450373

  9. Visiting Vehicle Ground Trajectory Tool

    NASA Technical Reports Server (NTRS)

    Hamm, Dustin

    2013-01-01

    The International Space Station (ISS) Visiting Vehicle Group needed a targeting tool for vehicles that rendezvous with the ISS. The Visiting Vehicle Ground Trajectory targeting tool provides the ability to perform both realtime and planning operations for the Visiting Vehicle Group. This tool provides a highly reconfigurable base, which allows the Visiting Vehicle Group to perform their work. The application is composed of a telemetry processing function, a relative motion function, a targeting function, a vector view, and 2D/3D world map type graphics. The software tool provides the ability to plan a rendezvous trajectory for vehicles that visit the ISS. It models these relative trajectories using planned and realtime data from the vehicle. The tool monitors ongoing rendezvous trajectory relative motion, and ensures visiting vehicles stay within agreed corridors. The software provides the ability to update or re-plan a rendezvous to support contingency operations. Previously, new parameters could not be added and incorporated into the system on the fly; if an unanticipated capability was not discovered until the vehicle was flying, there was no way to update the tool.

  10. I-FORCAST: Rapid Flight Planning Tool

    NASA Technical Reports Server (NTRS)

    Oaida, Bogdan; Khan, Mohammed; Mercury, Michael B.

    2012-01-01

    I-FORCAST (Instrument - Field of Regard Coverage Analysis and Simulation Tool) is a flight planning tool specifically designed for quickly verifying the feasibility and estimating the cost of airborne remote sensing campaigns (see figure). Flights are simulated by being broken into three predefined routing algorithms as necessary: mapping in a snaking pattern, mapping the area around a point target (like a volcano) with a star pattern, and mapping the area between a list of points. The tool has been used to plan missions for radar, lidar, and in-situ atmospheric measuring instruments for a variety of aircraft. It has also been used for global and regional scale campaigns and automatically includes landings when refueling is required. The software has been compared to the flight times of known commercial aircraft route travel times, as well as a UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar) campaign, and was within 15% of the actual flight time. Most of the discrepancy is due to non-optimal flight paths taken by actual aircraft to avoid restricted airspace and used to follow landing and take-off corridors.
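
    The 'snaking' mapping pattern, one of the three routing primitives mentioned above, can be sketched as a boustrophedon waypoint generator; the bounding box, swath spacing, cruise speed, and flat-Earth distance approximation below are all simplifications.

```python
# Generate a boustrophedon ("snaking") waypoint list over a lat/lon box and
# estimate flight time; box, swath width, and speed are placeholders.
import math

def snake_waypoints(lat_min, lat_max, lon_min, lon_max, swath_deg):
    waypoints, lon, heading_north = [], lon_min, True
    while lon <= lon_max:
        leg = [(lat_min, lon), (lat_max, lon)]
        waypoints += leg if heading_north else leg[::-1]
        lon += swath_deg
        heading_north = not heading_north
    return waypoints

def leg_km(p, q):
    # Equirectangular approximation, adequate for a small survey box.
    km_per_deg = 111.0
    dlat = (q[0] - p[0]) * km_per_deg
    dlon = (q[1] - p[1]) * km_per_deg * math.cos(math.radians((p[0] + q[0]) / 2))
    return math.hypot(dlat, dlon)

wps = snake_waypoints(34.0, 35.0, -119.0, -117.5, swath_deg=0.1)
distance = sum(leg_km(a, b) for a, b in zip(wps, wps[1:]))
print(f"{len(wps)} waypoints, {distance:.0f} km, ~{distance / 750:.1f} h at 750 km/h")
```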

  11. Doppler Imaging of Exoplanets and Brown Dwarfs

    NASA Astrophysics Data System (ADS)

    Crossfield, I.; Biller, B.; Schlieder, J.; Deacon, N.; Bonnefoy, M.; Homeier, D.; Allard, F.; Buenzli, E.; Henning, T.; Brandner, W.; Goldman, Bertr; Kopytova, T.

    2014-03-01

    Doppler Imaging produces 2D global maps. When applied to cool planets or more massive brown dwarfs, it can map atmospheric features and track global weather patterns. The first substellar map, of the 2 pc-distant brown dwarf Luhman 16B (Crossfield et al. 2014), revealed patchy regions of thin & thick clouds. Here, I investigate the feasibility of future Doppler Imaging of additional objects. Searching the literature, I find that all 3 of P, v sin i, and variability are published for 22 brown dwarfs. At least one datum exists for 333 targets. The sample is very incomplete below ~L5; we need more surveys to find the best targets for Doppler Imaging! I estimate limiting magnitudes for Doppler Imaging with various high-resolution near-infrared spectrographs. Only a handful of objects - at the M/L and L/T transitions - can be mapped with current tools. Large telescopes such as TMT and GMT will allow Doppler Imaging of many dozens of brown dwarfs and the brightest exoplanets. More targets beyond type L5 likely remain to be found. Future observations will let us probe the global atmospheric dynamics of many diverse objects.

  12. Three-dimensional mapping of the local interstellar medium with composite data

    NASA Astrophysics Data System (ADS)

    Capitanio, L.; Lallement, R.; Vergely, J. L.; Elyajouri, M.; Monreal-Ibero, A.

    2017-10-01

    Context. Three-dimensional maps of the Galactic interstellar medium are general astrophysical tools. Reddening maps may be based on the inversion of color excess measurements for individual target stars or on statistical methods using stellar surveys. Three-dimensional maps based on diffuse interstellar bands (DIBs) have also been produced. All methods benefit from the advent of massive surveys and may benefit from Gaia data. Aims: All of the various methods and databases have their own advantages and limitations. Here we present a first attempt to combine different datasets and methods to improve the local maps. Methods: We first updated our previous local dust maps based on a regularized Bayesian inversion of individual color excess data by replacing Hipparcos or photometric distances with Gaia Data Release 1 values when available. Secondly, we complemented this database with a series of ≃5000 color excess values estimated from the strength of the λ15273 DIB toward stars possessing a Gaia parallax. The DIB strengths were extracted from SDSS/APOGEE spectra. Third, we computed a low-resolution map based on a grid of Pan-STARRS reddening measurements by means of a new hierarchical technique and used this map as the prior distribution during the inversion of the two other datasets. Results: The use of Gaia parallaxes introduces significant changes in some areas and globally increases the compactness of the structures. Additional DIB-based data make it possible to assign distances to clouds located behind closer opaque structures and do not introduce contradictory information for the close structures. A more realistic prior distribution instead of a plane-parallel homogeneous distribution helps better define the structures. We validated the results through comparisons with other maps and with soft X-ray data. Conclusions: Our study demonstrates that the combination of various tracers is a potential tool for more accurate maps. An online tool makes it possible to retrieve maps and reddening estimations. Our online tool is available at http://stilism.obspm.fr

  13. A comprehensive map of the mTOR signaling network

    PubMed Central

    Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki

    2010-01-01

    The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells, and thus an attractive target for anticancer therapy. Using CellDesigner, a modeling support software for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with both the systems biology markup language (SBML) and graphical notation (SBGN) for computational analysis and graphical representation, respectively. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulations on drug-based cancer therapy. This map is available on the Payao platform, a Web 2.0 based community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025

  14. Pathview: an R/Bioconductor package for pathway-based data integration and visualization.

    PubMed

    Luo, Weijun; Brouwer, Cory

    2013-07-15

    Pathview is a novel tool set for pathway-based data integration and visualization. It maps and renders user data on relevant pathway graphs. Users only need to supply their data and specify the target pathway. Pathview automatically downloads the pathway graph data, parses the data file, maps and integrates user data onto the pathway and renders pathway graphs with the mapped data. Although built as a stand-alone program, Pathview may seamlessly integrate with pathway and functional analysis tools for large-scale and fully automated analysis pipelines. The package is freely available under the GPLv3 license through Bioconductor and R-Forge. It is available at http://bioconductor.org/packages/release/bioc/html/pathview.html and at http://Pathview.r-forge.r-project.org/. luo_weijun@yahoo.com Supplementary data are available at Bioinformatics online.

  15. A ‘tool box’ for deciphering neuronal circuits in the developing chick spinal cord

    PubMed Central

    Hadas, Yoav; Etlin, Alex; Falk, Haya; Avraham, Oshri; Kobiler, Oren; Panet, Amos; Lev-Tov, Aharon; Klar, Avihu

    2014-01-01

    The genetic dissection of spinal circuits is an essential new means for understanding the neural basis of mammalian behavior. Molecular targeting of specific neuronal populations, a key instrument in the genetic dissection of neuronal circuits in the mouse model, is a complex and time-demanding process. Here we present a circuit-deciphering ‘tool box’ for fast, reliable and cheap genetic targeting of neuronal circuits in the developing spinal cord of the chick. We demonstrate targeting of motoneurons and spinal interneurons, mapping of axonal trajectories and synaptic targeting in both single spinal interneurons and populations of spinal interneurons, and viral vector-mediated labeling of pre-motoneurons. We also demonstrate fluorescent imaging of the activity pattern of defined spinal neurons during rhythmic motor behavior, and assess the role of a channelrhodopsin-targeted population of interneurons in rhythmic behavior using specific photoactivation. PMID:25147209

  16. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-03

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
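
    The passage above describes a pipeline of extract, store-in-neutral-objects, map, and transform steps. The sketch below is a conceptual illustration of that flow only; the class and field names, and the unit conversion chosen as the "transformation", are hypothetical and are not taken from the patent.

```python
# Conceptual sketch: extract data from an architectural model, hold it in
# neutral "interoperability" objects, transform it with a mapping function,
# and emit a simulation-oriented record. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class InteropObject:          # neutral container for extracted data + metadata
    element_id: str
    kind: str                 # e.g. "wall", "window"
    properties: dict

def extract(ifc_like_records):
    """Pretend extraction step: wrap raw records in interoperability objects."""
    return [InteropObject(r["id"], r["type"], r["props"]) for r in ifc_like_records]

def transform(obj, mm_to_m=0.001):
    """Mapping/transformation step: convert geometry to the units the
    target simulation tool expects (here, millimetres to metres)."""
    geom = {k: v * mm_to_m for k, v in obj.properties.items() if k in ("width", "height")}
    return {"id": obj.element_id, "kind": obj.kind, **geom}

raw = [{"id": "W1", "type": "wall", "props": {"width": 4000, "height": 2700}}]
simulation_input = [transform(o) for o in extract(raw)]
print(simulation_input)   # [{'id': 'W1', 'kind': 'wall', 'width': 4.0, 'height': 2.7}]
```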

  17. Dynamic approximate entropy electroanatomic maps detect rotors in a simulated atrial fibrillation model.

    PubMed

    Ugarte, Juan P; Orozco-Duque, Andrés; Tobón, Catalina; Kremen, Vaclav; Novak, Daniel; Saiz, Javier; Oesterlein, Tobias; Schmitt, Clauss; Luik, Armin; Bustamante, John

    2014-01-01

    There is evidence that rotors could be drivers that maintain atrial fibrillation. Complex fractionated atrial electrograms have been located in rotor tip areas. However, the concept of electrogram fractionation, defined using time intervals, is still controversial as a tool for locating target sites for ablation. We hypothesize that the fractionation phenomenon is better described using non-linear dynamic measures, such as approximate entropy, and that this tool could be used for locating the rotor tip. The aim of this work has been to determine the relationship between approximate entropy and fractionated electrograms, and to develop a new tool for rotor mapping based on fractionation levels. Two episodes of chronic atrial fibrillation were simulated in a 3D human atrial model, in which rotors were observed. Dynamic approximate entropy maps were calculated using unipolar electrogram signals generated over the whole surface of the 3D atrial model. In addition, we optimized the approximate entropy calculation using two real multi-center databases of fractionated electrogram signals, labeled in 4 levels of fractionation. We found that the values of approximate entropy and the levels of fractionation are positively correlated. This allows the dynamic approximate entropy maps to localize the tips from stable and meandering rotors. Furthermore, we assessed the optimized approximate entropy using bipolar electrograms generated over a vicinity enclosing a rotor, achieving rotor detection. Our results suggest that high approximate entropy values are able to detect a high level of fractionation and to locate rotor tips in simulated atrial fibrillation episodes. We suggest that dynamic approximate entropy maps could become a tool for atrial fibrillation rotor mapping.
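
    The non-linear measure at the core of the method is approximate entropy. A minimal sketch of ApEn(m, r) for a one-dimensional electrogram-like signal is shown below; the embedding dimension and tolerance are conventional defaults (m = 2, r = 0.2 times the signal standard deviation), not necessarily the optimized values used in the study.

```python
import numpy as np

def approx_entropy(signal, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D signal.
    r is taken as r_factor times the signal's standard deviation, a common
    convention; the parameters used in the paper may differ."""
    x = np.asarray(signal, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def phi(m):
        # Embed the signal into overlapping template vectors of length m.
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of template vectors.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # Fraction of vectors within tolerance r of each template (self-matches included).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Example: a noisy signal yields a higher ApEn than a pure sine wave,
# mirroring the idea that fractionation raises entropy.
t = np.linspace(0, 1, 1000)
print(approx_entropy(np.sin(2 * np.pi * 5 * t)))
print(approx_entropy(np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)))
```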

  18. On Voxel based Iso-Tumor Control Probability and Iso-Complication Maps for Selective Boosting and Selective Avoidance Intensity Modulated Radiotherapy.

    PubMed

    Kim, Yusung; Tomé, Wolfgang A

    2008-01-01

    Voxel-based iso-Tumor Control Probability (TCP) maps and iso-Complication maps are proposed as a plan-review tool, especially for functional image-guided intensity-modulated radiotherapy (IMRT) strategies such as selective boosting (dose painting) and conformal avoidance IMRT. The maps employ voxel-based phenomenological biological dose-response models for target volumes and normal organs. Two IMRT strategies for prostate cancer, namely conventional uniform IMRT delivering an EUD = 84 Gy (equivalent uniform dose) to the entire PTV and selective boosting delivering an EUD = 82 Gy to the entire PTV, are investigated to illustrate the advantages of this approach over iso-dose maps. Conventional uniform IMRT yielded a more uniform isodose map across the entire PTV, while selective boosting resulted in a nonuniform isodose map. However, when employing voxel-based iso-TCP maps, selective boosting exhibited a more uniform tumor control probability map than could be achieved using conventional uniform IMRT, which showed TCP cold spots in high-risk tumor subvolumes despite delivering a higher EUD to the entire PTV. Voxel-based iso-Complication maps are presented for rectum and bladder, and their utilization for selective avoidance IMRT strategies is discussed. We believe that, as the need for functional image-guided treatment planning grows, voxel-based iso-TCP and iso-Complication maps will become an important tool to assess the integrity of such treatment plans.
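
    A minimal sketch of how a voxel-based TCP map can be derived from a dose map with a phenomenological logistic dose-response curve is given below. The parameter values (D50, γ50) and the toy dose distributions are purely illustrative and are not the models or plans used in the study.

```python
import numpy as np

def voxel_tcp_map(dose, d50=70.0, gamma50=2.0):
    """Voxel-based TCP map using a logistic (logit) dose-response curve,
    TCP(D) = 1 / (1 + (D50/D)**(4*gamma50)). d50 and gamma50 are
    illustrative values, not the parameters used in the paper."""
    dose = np.asarray(dose, dtype=float)
    return 1.0 / (1.0 + (d50 / np.maximum(dose, 1e-6)) ** (4.0 * gamma50))

# A uniform 84 Gy plan vs. a selectively boosted plan (78 Gy base, 90 Gy subvolume).
uniform = np.full((10, 10), 84.0)
boosted = np.full((10, 10), 78.0)
boosted[3:6, 3:6] = 90.0
print(voxel_tcp_map(uniform).min(), voxel_tcp_map(boosted).min())
```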

  19. Making Air Pollution Visible: A Tool for Promoting Environmental Health Literacy.

    PubMed

    Cleary, Ekaterina Galkina; Patton, Allison P; Wu, Hsin-Ching; Xie, Alan; Stubblefield, Joseph; Mass, William; Grinstein, Georges; Koch-Weser, Susan; Brugge, Doug; Wong, Carolyn

    2017-04-12

    Digital maps are instrumental in conveying information about environmental hazards geographically. For laypersons, computer-based maps can serve as tools to promote environmental health literacy about invisible traffic-related air pollution and ultrafine particles. Concentrations of these pollutants are higher near major roadways and increasingly linked to adverse health effects. Interactive computer maps provide visualizations that can allow users to build mental models of the spatial distribution of ultrafine particles in a community and learn about the risk of exposure in a geographic context. The objective of this work was to develop a new software tool appropriate for educating members of the Boston Chinatown community (Boston, MA, USA) about the nature and potential health risks of traffic-related air pollution. The tool, the Interactive Map of Chinatown Traffic Pollution ("Air Pollution Map" hereafter), is a prototype that can be adapted for the purpose of educating community members across a range of socioeconomic contexts. We built the educational visualization tool on the open source Weave software platform. We designed the tool as the centerpiece of a multimodal and intergenerational educational intervention about the health risk of traffic-related air pollution. We used a previously published fine resolution (20 m) hourly land-use regression model of ultrafine particles as the algorithm for predicting pollution levels and applied it to one neighborhood, Boston Chinatown. In designing the map, we consulted community experts to help customize the user interface to communication styles prevalent in the target community. The product is a map that displays ultrafine particulate concentrations averaged across census blocks using a color gradation from white to dark red. The interactive features allow users to explore and learn how changing meteorological conditions and traffic volume influence ultrafine particle concentrations. Users can also select from multiple map layers, such as a street map or satellite view. The map legends and labels are available in both Chinese and English, and are thus accessible to immigrants and residents with proficiency in either language. The map can be either Web or desktop based. The Air Pollution Map incorporates relevant language and landmarks to make complex scientific information about ultrafine particles accessible to members of the Boston Chinatown community. In future work, we will test the map in an educational intervention that features intergenerational colearning and the use of supplementary multimedia presentations. ©Ekaterina Galkina Cleary, Allison P Patton, Hsin-Ching Wu, Alan Xie, Joseph Stubblefield, William Mass, Georges Grinstein, Susan Koch-Weser, Doug Brugge, Carolyn Wong. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 12.04.2017.

  20. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments

    PubMed Central

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-01-01

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe the MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
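
    For orientation, the sketch below shows the kind of single-primer filtering constraints (length, GC content, a Wallace-rule melting temperature, homopolymer runs) that primer-design pipelines typically apply. The thresholds are illustrative only; MRPrimerW applies its own constraint set and, crucially, the exhaustive homology testing against off-target sequences that is not reproduced here.

```python
def passes_basic_filters(primer, len_range=(19, 23), gc_range=(0.4, 0.6),
                         tm_range=(58.0, 62.0), max_run=4):
    """Single-primer filtering sketch. Thresholds are illustrative only;
    MRPrimerW applies its own constraints plus homology testing against
    off-target sequences, which is not shown here."""
    p = primer.upper()
    if not (len_range[0] <= len(p) <= len_range[1]):
        return False
    gc = (p.count("G") + p.count("C")) / len(p)
    if not (gc_range[0] <= gc <= gc_range[1]):
        return False
    # Wallace rule melting temperature: Tm = 2*(A+T) + 4*(G+C).
    tm = 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))
    if not (tm_range[0] <= tm <= tm_range[1]):
        return False
    # Reject homopolymer runs longer than max_run.
    if any(base * (max_run + 1) in p for base in "ACGT"):
        return False
    return True

print(passes_basic_filters("ATGCGTACGTTAGCCGTAGC"))  # 20-mer, 55% GC -> True
```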

  1. A polyalanine peptide derived from polar fish with anti-infectious activities

    NASA Astrophysics Data System (ADS)

    Cardoso, Marlon H.; Ribeiro, Suzana M.; Nolasco, Diego O.; de La Fuente-Núñez, César; Felício, Mário R.; Gonçalves, Sónia; Matos, Carolina O.; Liao, Luciano M.; Santos, Nuno C.; Hancock, Robert E. W.; Franco, Octávio L.; Migliolo, Ludovico

    2016-02-01

    Due to the growing concern about antibiotic-resistant microbial infections, increasing support has been given to new drug discovery programs. A promising alternative to counter bacterial infections includes the antimicrobial peptides (AMPs), which have emerged as model molecules for rational design strategies. Here we focused on the study of Pa-MAP 1.9, a rationally designed AMP derived from the polar fish Pleuronectes americanus. Pa-MAP 1.9 was active against Gram-negative planktonic bacteria and biofilms, without being cytotoxic to mammalian cells. By using AFM, leakage assays, CD spectroscopy and in silico tools, we found that Pa-MAP 1.9 may be acting both on intracellular targets and on the bacterial surface, also being more efficient at interacting with anionic LUVs mimicking Gram-negative bacterial surface, where this peptide adopts α-helical conformations, than cholesterol-enriched LUVs mimicking mammalian cells. Thus, as bacteria present varied physiological features that favor antibiotic-resistance, Pa-MAP 1.9 could be a promising candidate in the development of tools against infections caused by pathogenic bacteria.

  2. Towards local implementation of Dutch health policy guidelines: a concept-mapping approach.

    PubMed

    Kuunders, Theo J M; van Bon-Martens, Marja J H; van de Goor, Ien A M; Paulussen, Theo G W M; van Oers, Hans A M

    2017-02-22

    To develop a targeted implementation strategy for a municipal health policy guideline, implementation targets of two guideline users [Regional Health Services (RHSs)] and guideline developers of leading national health institutes were made explicit. Therefore, characteristics of successful implementation of the guideline were identified. Differences and similarities in perceptions of these characteristics between RHSs and developers were explored. Separate concept mapping procedures were executed in two RHSs, one with representatives from partner local health organizations and municipalities, the second with RHS members only. A third mapping procedure was conducted with the developers of the guideline. All mapping procedures followed the same design of generating statements up to interpretation of results with participants. Concept mapping, as a practical implementation tool, will be discussed in the context of international research literature on guideline implementation in public health. Guideline developers consider implementation successful when substantive components (health issues) of the guideline's content are visible in local policy practice. RHSs, local organizations and municipalities view the implementation process itself within and between organizations as more relevant, and state that usability of the guideline for municipal policy and commitment by officials and municipal managers are critical targets for successful implementation. Between the RHSs, differences in implementation targets were smaller than between RHSs and guideline developers. For successful implementation, RHSs tend to focus on process targets while developers focus more on the thematic contents of the guideline. Implications of these different orientations for implementation strategies are dealt with in the discussion. © The Author 2017. Published by Oxford University Press.

  3. Interactive Geophysical Mapping on the Web

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Hamburger, M.; Estey, L.; Weingroff, M.; Deardorff, R.; Holt, W.

    2002-12-01

    We have developed a set of interactive, web-based map utilities that make geophysical results accessible to a large number and variety of users. These tools provide access to pre-determined map regions via a simple HTML/JavaScript interface or to user-selectable areas using a Java interface to a Generic Mapping Tools (GMT) engine. Users can access a variety of maps, satellite images, and geophysical data at a range of spatial scales for the earth and other planets of the solar system. Developed initially by UNAVCO for the study of global-scale geodynamic processes, the tools let users choose from a variety of base maps (satellite mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others) and then add a number of geographic and geophysical overlays, for example coastlines, political boundaries, rivers and lakes, NEIC earthquake and volcano locations, stress axes, and observed and model plate motion and deformation velocity vectors representing a compilation of 2933 geodetic measurements from around the world. The software design is flexible, allowing for construction of special editions for different target audiences. Custom maps have been implemented for UNAVCO as the "Jules Verne Voyager" and "Voyager Junior", for the International Lithosphere Project's "Global Strain Rate Map", and for EarthScope Education and Outreach as "EarthScope Voyager Jr.". For the latter, a number of EarthScope-specific features have been added, including locations of proposed USArray (seismic), Plate Boundary Observatory (geodetic), and San Andreas Fault Observatory at Depth sites plus detailed maps and geographically referenced examples of EarthScope-related scientific investigations. In addition, we are developing a website that incorporates background materials and curricular activities that encourage users to explore Earth processes. A cluster of map processing computers and nearly a terabyte of disk storage have been assembled to power the generation of interactive maps and provide space for a very large collection of map data. A portal to these map tools can be found at: http://jules.unavco.ucar.edu.

  4. Next generation tools for high-throughput promoter and expression analysis employing single-copy knock-ins at the Hprt1 locus.

    PubMed

    Yang, G S; Banks, K G; Bonaguro, R J; Wilson, G; Dreolini, L; de Leeuw, C N; Liu, L; Swanson, D J; Goldowitz, D; Holt, R A; Simpson, E M

    2009-03-01

    We have engineered a set of useful tools that facilitate targeted single copy knock-in (KI) at the hypoxanthine guanine phosphoribosyl transferase 1 (Hprt1) locus. We employed fine scale mapping to delineate the precise breakpoint location at the Hprt1(b-m3) locus allowing allele specific PCR assays to be established. Our suite of tools contains four targeting expression vectors and a complementing series of embryonic stem cell lines. Two of these vectors encode enhanced green fluorescent protein (EGFP) driven by the human cytomegalovirus immediate-early enhancer/modified chicken beta-actin (CAG) promoter, whereas the other two permit flexible combinations of a chosen promoter combined with a reporter and/or gene of choice. We have validated our tools as part of the Pleiades Promoter Project (http://www.pleiades.org), with the generation of brain-specific EGFP positive germline mouse strains.

  5. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.

  6. Dynamic Approximate Entropy Electroanatomic Maps Detect Rotors in a Simulated Atrial Fibrillation Model

    PubMed Central

    Ugarte, Juan P.; Orozco-Duque, Andrés; Tobón, Catalina; Kremen, Vaclav; Novak, Daniel; Saiz, Javier; Oesterlein, Tobias; Schmitt, Clauss; Luik, Armin; Bustamante, John

    2014-01-01

    There is evidence that rotors could be drivers that maintain atrial fibrillation. Complex fractionated atrial electrograms have been located in rotor tip areas. However, the concept of electrogram fractionation, defined using time intervals, is still controversial as a tool for locating target sites for ablation. We hypothesize that the fractionation phenomenon is better described using non-linear dynamic measures, such as approximate entropy, and that this tool could be used for locating the rotor tip. The aim of this work has been to determine the relationship between approximate entropy and fractionated electrograms, and to develop a new tool for rotor mapping based on fractionation levels. Two episodes of chronic atrial fibrillation were simulated in a 3D human atrial model, in which rotors were observed. Dynamic approximate entropy maps were calculated using unipolar electrogram signals generated over the whole surface of the 3D atrial model. In addition, we optimized the approximate entropy calculation using two real multi-center databases of fractionated electrogram signals, labeled in 4 levels of fractionation. We found that the values of approximate entropy and the levels of fractionation are positively correlated. This allows the dynamic approximate entropy maps to localize the tips from stable and meandering rotors. Furthermore, we assessed the optimized approximate entropy using bipolar electrograms generated over a vicinity enclosing a rotor, achieving rotor detection. Our results suggest that high approximate entropy values are able to detect a high level of fractionation and to locate rotor tips in simulated atrial fibrillation episodes. We suggest that dynamic approximate entropy maps could become a tool for atrial fibrillation rotor mapping. PMID:25489858

  7. On Voxel based Iso-Tumor Control Probability and Iso-Complication Maps for Selective Boosting and Selective Avoidance Intensity Modulated Radiotherapy

    PubMed Central

    Kim, Yusung; Tomé, Wolfgang A.

    2010-01-01

    Voxel-based iso-Tumor Control Probability (TCP) maps and iso-Complication maps are proposed as a plan-review tool, especially for functional image-guided intensity-modulated radiotherapy (IMRT) strategies such as selective boosting (dose painting) and conformal avoidance IMRT. The maps employ voxel-based phenomenological biological dose-response models for target volumes and normal organs. Two IMRT strategies for prostate cancer, namely conventional uniform IMRT delivering an EUD = 84 Gy (equivalent uniform dose) to the entire PTV and selective boosting delivering an EUD = 82 Gy to the entire PTV, are investigated to illustrate the advantages of this approach over iso-dose maps. Conventional uniform IMRT yielded a more uniform isodose map across the entire PTV, while selective boosting resulted in a nonuniform isodose map. However, when employing voxel-based iso-TCP maps, selective boosting exhibited a more uniform tumor control probability map than could be achieved using conventional uniform IMRT, which showed TCP cold spots in high-risk tumor subvolumes despite delivering a higher EUD to the entire PTV. Voxel-based iso-Complication maps are presented for rectum and bladder, and their utilization for selective avoidance IMRT strategies is discussed. We believe that, as the need for functional image-guided treatment planning grows, voxel-based iso-TCP and iso-Complication maps will become an important tool to assess the integrity of such treatment plans. PMID:21151734

  8. Map Sensitivity vs. Map Dependency: A Case Study of Subway Maps’ Impact on Passenger Route Choices in Washington DC

    PubMed Central

    Xu, John

    2017-01-01

    This paper addresses the key assumption in behavioral and transportation planning literature that, when people use a transit system more frequently, they become less dependent on and less sensitive to transit maps in their decision-making. Therefore, according to this assumption, map changes are much less impactful to the travel decisions of frequent riders than to those of first-time or new passengers. This assumption—though never empirically validated—has been the major hurdle to transit maps becoming a planning tool to change passengers’ behavior. This paper examines this assumption using the Washington DC metro map as a case study by conducting a route choice experiment between 30 Origin-Destination (O-D) pairs on seven metro map designs. The experiment targets two types of passengers: frequent metro riders, recruited through advertisements on a free daily newspaper available at DC metro stations, and general residents in the Washington metropolitan area, recruited through Amazon Mechanical Turk, an online crowdsourcing platform. A total of 255 and 371 participants made 2024 and 2960 route choices in the respective experiments. The results show that frequent passengers are in fact more sensitive to subtle changes in map design than general residents who are less likely to be familiar with the metro map and therefore unaffected by map changes presented in the alternative designs. The work disproves the aforementioned assumption and further validates metro maps as an effective planning tool in transit systems. PMID:29068371

  9. A Cooperative Approach to Virtual Machine Based Fault Injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naughton III, Thomas J; Engelmann, Christian; Vallee, Geoffroy R

    Resilience investigations often employ fault injection (FI) tools to study the effects of simulated errors on a target system. It is important to keep the target system under test (SUT) isolated from the controlling environment in order to maintain control of the experiment. Virtual machines (VMs) have been used to aid these investigations due to the strong isolation properties of system-level virtualization. A key challenge in fault injection tools is to gain proper insight and context about the SUT. In VM-based FI tools, this challenge of target context is increased due to the separation between host and guest (VM). We discuss an approach to VM-based FI that leverages virtual machine introspection (VMI) methods to gain insight into the target's context running within the VM. The key to this environment is the ability to provide basic information to the FI system that can be used to create a map of the target environment. We describe a proof-of-concept implementation and a demonstration of its use to introduce simulated soft errors into an iterative solver benchmark running in user-space of a guest VM.
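
    To give a concrete feel for what "introducing simulated soft errors into an iterative solver" can look like (this is a generic sketch, not the VMI-based tool described above), the example below flips a single bit in one entry of a Jacobi iterate and lets the solver continue:

```python
import struct
import numpy as np

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit of a float64's IEEE-754 representation (simulated soft error)."""
    bits = struct.unpack("<Q", struct.pack("<d", value))[0]
    return struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))[0]

def jacobi(A, b, iters=200, inject_at=50, inject_bit=52):
    """Plain Jacobi solver with a single injected bit flip in one solution entry."""
    x = np.zeros_like(b)
    D = np.diag(A)
    R = A - np.diagflat(D)
    for k in range(iters):
        x = (b - R @ x) / D
        if k == inject_at:
            x[0] = flip_bit(x[0], inject_bit)   # corrupt one entry once
    return x

# Small diagonally dominant system: Jacobi converges with or without the fault,
# but the injected error perturbs the iterate and can be tracked in experiments.
A = np.array([[4.0, 1.0, 0.0], [1.0, 5.0, 2.0], [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
print(jacobi(A, b))
print(np.linalg.solve(A, b))
```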

  10. Remote image analysis for Mars Exploration Rover mobility and manipulation operations

    NASA Technical Reports Server (NTRS)

    Leger, Chris; Deen, Robert G.; Bonitz, Robert G.

    2005-01-01

    NASA's Mars Exploration Rovers are two six-wheeled, 175-kg robotic vehicles that have operated on Mars for over a year as of March 2005. The rovers are controlled by teams who must understand the rover's surroundings and develop command sequences on a daily basis. The tight tactical planning timeline and ever-changing environment call for tools that allow quick assessment of potential manipulator targets and traverse goals, since command sequences must be developed in a matter of hours after receipt of new data from the rovers. Reachability maps give a visual indication of which targets are reachable by each rover's manipulator, while slope and solar energy maps show the rover operator which terrain areas are safe and unsafe from different standpoints.
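
    A slope map of the kind mentioned above can be derived directly from a digital elevation model. The sketch below uses finite differences on a synthetic terrain and flags cells above an illustrative traversability threshold; the grid spacing and the 15° limit are assumptions, not mission parameters.

```python
import numpy as np

def slope_map(dem, cell_size=1.0):
    """Slope (degrees) of a digital elevation model using central differences."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Synthetic 100 m x 100 m terrain: a gentle ramp plus a small hill.
y, x = np.mgrid[0:100, 0:100]
dem = 0.05 * x + 5.0 * np.exp(-((x - 60) ** 2 + (y - 40) ** 2) / 200.0)

slopes = slope_map(dem, cell_size=1.0)
unsafe = slopes > 15.0          # 15 degrees is an illustrative traversability limit
print(f"Fraction of terrain flagged unsafe: {unsafe.mean():.2%}")
```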

  11. Rapid and long-lasting plasticity of input-output mapping.

    PubMed

    Yamamoto, Kenji; Hoffman, Donna S; Strick, Peter L

    2006-11-01

    Skilled use of tools requires us to learn an "input-output map" for the device, i.e., how our movements relate to the actions of the device. We used the paradigm of visuo-motor rotation to examine two questions about the plasticity of input-output maps: 1) does extensive practice on one mapping make it difficult to modify and/or to form a new input-output map and 2) once a map has been modified or a new map has been formed, does this map survive a gap in performance? Humans and monkeys made wrist movements to control the position of a cursor on a computer monitor. Humans practiced the task for approximately 1.5 h; monkeys practiced for 3-9 yr. After this practice, we gradually altered the direction of cursor movement relative to wrist movement while subjects moved either to a single target or to four targets. Subjects were unaware of the change in cursor-movement relationship. Despite their prior practice on the task, the humans and the monkeys quickly adjusted their motor output to compensate for the visuo-motor rotation. Monkeys retained the modified input-output map during a 2-wk gap in motor performance. Humans retained the altered map during a gap of >1 yr. Our results show that sensorimotor performance remains flexible despite considerable practice on a specific task, and even relatively short-term exposure to a new input-output mapping leads to a long-lasting change in motor performance.
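
    The compensation the subjects learn can be written as a simple rotation of the motor command: if the device rotates cursor motion by an angle θ relative to the wrist movement, the adapted input-output map issues the target direction rotated by −θ. A worked example, assuming a 30° visuo-motor rotation for illustration:

```python
import numpy as np

def rotate(vec, angle_deg):
    """Rotate a 2-D vector counter-clockwise by angle_deg."""
    a = np.radians(angle_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return R @ vec

# The device rotates cursor motion 30 degrees relative to the wrist movement.
visuomotor_rotation = 30.0
target_direction = np.array([1.0, 0.0])          # cursor should move toward +x

# A fully adapted input-output map issues the inverse rotation as the motor command.
motor_command = rotate(target_direction, -visuomotor_rotation)
cursor_motion = rotate(motor_command, visuomotor_rotation)
print(np.allclose(cursor_motion, target_direction))   # True
```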

  12. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments.

    PubMed

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-07-08

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe the MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. Gene Delivery to Adipose Tissue Using Transcriptionally Targeted rAAV8 Vectors

    PubMed Central

    Uhrig-Schmidt, Silke; Geiger, Matthias; Luippold, Gerd; Birk, Gerald; Mennerich, Detlev; Neubauer, Heike; Grimm, Dirk; Wolfrum, Christian; Kreuz, Sebastian

    2014-01-01

    In recent years, the increasing prevalence of obesity and obesity-related co-morbidities fostered intensive research in the field of adipose tissue biology. To further unravel molecular mechanisms of adipose tissue function, genetic tools enabling functional studies in vitro and in vivo are essential. While the use of transgenic animals is well established, attempts using viral and non-viral vectors to genetically modify adipocytes in vivo are rare. Therefore, we here characterized recombinant Adeno-associated virus (rAAV) vectors regarding their potency as gene transfer vehicles for adipose tissue. Our results demonstrate that a single dose of systemically applied rAAV8-CMV-eGFP can give rise to remarkable transgene expression in murine adipose tissues. Upon transcriptional targeting of the rAAV8 vector to adipocytes using a 2.2 kb fragment of the murine adiponectin (mAP2.2) promoter, eGFP expression was significantly decreased in off-target tissues while efficient transduction was maintained in subcutaneous and visceral fat depots. Moreover, rAAV8-mAP2.2-mediated expression of perilipin A – a lipid-droplet-associated protein – resulted in significant changes in metabolic parameters only three weeks post vector administration. Taken together, our findings indicate that rAAV vector technology is applicable as a flexible tool to genetically modify adipocytes for functional proof-of-concept studies and the assessment of putative therapeutic targets in vivo. PMID:25551639

  14. Liposomes containing NY‑ESO‑1/tetanus toxoid and adjuvant peptides targeted to human dendritic cells via the Fc receptor for cancer vaccines.

    PubMed

    Cruz, Luis J; Rueda, Felix; Simón, Lorena; Cordobilla, Begoña; Albericio, Fernando; Domingo, Joan C

    2014-04-01

    To improve the immunological response against tumors, a vaccine based on nanoliposomes targeted to the Fcγ receptor was developed to enhance the immunogenicity of tumor-associated antigens (TAAs). Using human dendritic cells in vitro, a fragment of the TAA NY-ESO-1 combined with a T-helper peptide from the tetanus toxoid encapsulated in nanoliposomes was evaluated. In addition, peptides Palm-IL-1 and MAP-IFN-γ were coadministered as adjuvants to enhance the immunological response. Coadministration of Palm-IL-1 or MAP-IFN-γ peptide adjuvants and the hybrid NY-ESO-1-tetanus toxoid (soluble or encapsulated in nanoliposomes without targeting) increased immunogenicity. However, the most potent immunological response was obtained when the peptide adjuvants were encapsulated in liposomes targeted to human dendritic cells via the Fc receptor. This targeted vaccine strategy is a promising tool to activate and deliver antigens to dendritic cells, thus improving immunotherapeutic response in situations in which the immune system is frequently compromised, as in advanced cancers.

  15. The Beaks of Finches & the Tool Analogy: Use with Care

    ERIC Educational Resources Information Center

    Milne, Catherine

    2008-01-01

    Analogies are an integral feature of scientific theories, like evolution. They are developed to support explanations, proposed on the basis of evidence collected from experimental studies, field studies, and other observational studies. They map a known source or process to an unknown target with the goal of helping educators understand the…

  16. Use of The Yeast Two-Hybrid System to Identify Targets of Fungal Effectors

    USDA-ARS?s Scientific Manuscript database

    The yeast-two hybrid (Y2H) system is a binary method widely used to determine direct interactions between paired proteins. Although having certain limitations, this method has become one of the two main systemic tools (along with affinity purification/mass spectrometry) for interactome mapping in mo...

  17. Biomimetic Signal Processing Using the Biosonar Measurement Tool (BMT)

    NASA Astrophysics Data System (ADS)

    Abawi, Ahmad T.; Hursky, Paul; Porter, Michael B.; Tiemann, Chris; Martin, Stephen

    2004-11-01

    In this paper, data recorded on the Biosonar Measurement Tool (BMT) during a target echolocation experiment are used to 1) find ways to separate target echoes from clutter echoes, 2) analyze target returns and 3) find features in target returns that distinguish them from clutter returns. The BMT is an instrumentation package used in dolphin echolocation experiments developed at SPAWARSYSCEN. It can be held by the dolphin using a bite-plate during echolocation experiments and records the movement and echolocation strategy of a target-hunting dolphin without interfering with its motion through the search field. The BMT was developed to record a variety of data from a free-swimming dolphin engaged in a bottom target detection task. These data include the three-dimensional location of the dolphin, including its heading, pitch, roll, and velocity, as well as passive acoustic data recorded on three channels. The outgoing dolphin click is recorded on one channel and the resulting echoes are recorded on the two remaining channels. For each outgoing click the BMT records a large number of echoes that come from the entire ensonified field. Given the large number of transmitted clicks and the returned echoes, it is almost impossible to find a target return from the recorded data on the BMT. As a means of separating target echoes from those of clutter, an echo-mapping tool was developed. This tool produces an echomap on which echoes from targets (and other regular objects such as surface buoys, the side of a boat and so on) stack together as tracks, while echoes from clutter are scattered. Once these tracks are identified, the returned echoes can easily be extracted for further analysis.

  18. A computational tool to predict the evolutionarily conserved protein-protein interaction hot-spot residues from the structure of the unbound protein.

    PubMed

    Agrawal, Neeraj J; Helk, Bernhard; Trout, Bernhardt L

    2014-01-21

    Identifying hot-spot residues - residues that are critical to protein-protein binding - can help to elucidate a protein's function and assist in designing therapeutic molecules to target those residues. We present a novel computational tool, termed spatial-interaction-map (SIM), to predict the hot-spot residues of an evolutionarily conserved protein-protein interaction from the structure of an unbound protein alone. SIM can predict the protein hot-spot residues with an accuracy of 36-57%. Thus, the SIM tool can be used to predict the yet unknown hot-spot residues for many proteins for which the structures of the protein-protein complexes are not available, thereby providing a clue to their functions and an opportunity to design therapeutic molecules to target these proteins. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  19. Disease mapping for informing targeted health interventions: childhood pneumonia in Bohol, Philippines.

    PubMed

    Thomas, Deborah S K; Anthamatten, Peter; Root, Elisabeth Dowling; Lucero, Marilla; Nohynek, Hanna; Tallo, Veronica; Williams, Gail M; Simões, Eric A F

    2015-11-01

    Acute lower respiratory tract infections (ALRI) are the leading cause of childhood mortality worldwide. Currently, most developing countries assign resources at a district level, and yet District Medical Officers have few tools for directing targeted interventions to high mortality or morbidity areas. Mapping of ALRI at the local level can guide more efficient allocation of resources, coordination of efforts and targeted interventions, which are particularly relevant for health management in resource-scarce settings. An efficacy study of 11-valent pneumococcal vaccine was conducted in six municipalities in the Bohol Province of central Philippines from July 2000 to December 2004. Geocoded under-five pneumonia cases (using WHO classifications) were mapped to create spatial patterns of pneumonia at the local health unit (barangay) level. There were 2951 children with WHO-defined clinical pneumonia, of whom 1074 were severe or very severely ill, 278 were radiographic, and 219 were hypoxaemic. While most children with pneumonia were from urban barangays, there was a disproportionately higher distribution of severe/very severe pneumonia in rural barangays and the most severe hypoxaemic children were concentrated in the northern barangays most distant from the regional hospital. Mapping of ALRI at the local administrative health level can be performed relatively simply. If these principles are applied to routinely collected IMCI classification of disease at the district level in developing countries, such efforts can form the basis for directing public health and healthcare delivery efforts in a targeted manner. © 2015 John Wiley & Sons Ltd.

  20. SU-F-J-194: Development of Dose-Based Image Guided Proton Therapy Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, R; Sun, B; Zhao, T

    Purpose: To implement image-guided proton therapy (IGPT) based on daily proton dose distribution. Methods: Unlike x-ray therapy, simple alignment based on anatomy cannot ensure proper dose coverage in proton therapy. Anatomy changes along the beam path may lead to underdosing the target, or overdosing the organ-at-risk (OAR). With an in-room mobile computed tomography (CT) system, we are developing a dose-based IGPT software tool that allows patient positioning and treatment adaptation based on daily dose distributions. During an IGPT treatment, daily CT images are acquired in treatment position. After initial positioning based on rigid image registration, proton dose distribution is calculated on daily CT images. The target and OARs are automatically delineated via deformable image registration. Dose distributions are evaluated to decide if repositioning or plan adaptation is necessary in order to achieve proper coverage of the target and sparing of OARs. Besides online dose-based image guidance, the software tool can also map daily treatment doses to the treatment planning CT images for offline adaptive treatment. Results: An in-room helical CT system is commissioned for IGPT purposes. It produces accurate CT numbers that allow proton dose calculation. GPU-based deformable image registration algorithms are developed and evaluated for automatic ROI-delineation and dose mapping. The online and offline IGPT functionalities are evaluated with daily CT images of the proton patients. Conclusion: The online and offline IGPT software tool may improve the safety and quality of proton treatment by allowing dose-based IGPT and adaptive proton treatments. Research is partially supported by Mevion Medical Systems.

  1. Using mental mapping to unpack perceived cycling risk.

    PubMed

    Manton, Richard; Rau, Henrike; Fahy, Frances; Sheahan, Jerome; Clifford, Eoghan

    2016-03-01

    Cycling is the most energy-efficient mode of transport and can bring extensive environmental, social and economic benefits. Research has highlighted negative perceptions of safety as a major barrier to the growth of cycling. Understanding these perceptions through the application of novel place-sensitive methodological tools such as mental mapping could inform measures to increase cyclist numbers and consequently improve cyclist safety. Key steps to achieving this include: (a) the design of infrastructure to reduce actual risks and (b) targeted work on improving safety perceptions among current and future cyclists. This study combines mental mapping, a stated-preference survey and a transport infrastructure inventory to unpack perceptions of cycling risk and to reveal both overlaps and discrepancies between perceived and actual characteristics of the physical environment. Participants translate mentally mapped cycle routes onto hard-copy base-maps, colour-coding road sections according to risk, while a transport infrastructure inventory captures the objective cycling environment. These qualitative and quantitative data are matched using Geographic Information Systems and exported to statistical analysis software to model the individual and (infra)structural determinants of perceived cycling risk. This method was applied to cycling conditions in Galway City (Ireland). Participants' (n=104) mental maps delivered data-rich perceived safety observations (n=484) and initial comparison with locations of cycling collisions suggests some alignment between perception and reality, particularly relating to danger at roundabouts. Attributing individual and (infra)structural characteristics to each observation, a Generalised Linear Mixed Model statistical analysis identified segregated infrastructure, road width, the number of vehicles as well as gender and cycling experience as significant, and interactions were found between individual and infrastructural variables. The paper concludes that mental mapping is a highly useful tool for assessing perceptions of cycling risk with a strong visual aspect and significant potential for public participation. This distinguishes it from more traditional cycling safety assessment tools that focus solely on the technical assessment of cycling infrastructure. Further development of online mapping tools is recommended as part of bicycle suitability measures to engage cyclists and the general public and to inform 'soft' and 'hard' cycling policy responses. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Geospatial Analysis and Remote Sensing from Airplanes and Satellites for Cultural Resources Management

    NASA Technical Reports Server (NTRS)

    Giardino, Marco J.; Haley, Bryan S.

    2005-01-01

    Cultural resource management consists of research to identify, evaluate, document and assess cultural resources, planning to assist in decision-making, and stewardship to implement the preservation, protection and interpretation of these decisions and plans. One technique that may be useful in cultural resource management archaeology is remote sensing. It is the acquisition of data and derivative information about objects or materials (targets) located on the Earth's surface or in its atmosphere by using sensors mounted on platforms located at a distance from the targets to make measurements of interactions between the targets and electromagnetic radiation. Included in this definition are systems that acquire imagery by photographic methods and digital multispectral sensors. Data collected by digital multispectral sensors on aircraft and satellite platforms play a prominent role in many earth science applications, including land cover mapping, geology, soil science, agriculture, forestry, water resource management, urban and regional planning, and environmental assessments. Inherent in the analysis of remotely sensed data is the use of computer-based image processing techniques. Geographical information systems (GIS), designed for collecting, managing, and analyzing spatial information, are also useful in the analysis of remotely sensed data. A GIS can be used to integrate diverse types of spatially referenced digital data, including remotely sensed and map data. In archaeology, these tools have been used in various ways to aid in cultural resource projects. For example, they have been used to predict the presence of archaeological resources using modern environmental indicators. Remote sensing techniques have also been used to directly detect the presence of unknown sites based on the impact of past occupation on the Earth's surface. Additionally, remote sensing has been used as a mapping tool aimed at delineating the boundaries of a site or mapping previously unknown features. All of these applications are pertinent to the goals of site discovery and assessment in cultural resource management.

  3. Voyager Interactive Web Interface to EarthScope

    NASA Astrophysics Data System (ADS)

    Eriksson, S. C.; Meertens, C. M.; Estey, L.; Weingroff, M.; Hamburger, M. W.; Holt, W. E.; Richard, G. A.

    2004-12-01

    Visualization of data is essential in helping scientists and students develop a conceptual understanding of relationships among many complex types of data and keep track of large amounts of information. Developed initially by UNAVCO for study of global-scale geodynamic processes, the Voyager map visualization tools have evolved into interactive, web-based map utilities that can make scientific results accessible to a large number and variety of educators and students as well as the originally targeted scientists. A portal to these map tools can be found at: http://jules.unavco.org. The Voyager tools provide on-line interactive data visualization through pre-determined map regions via a simple HTML/JavaScript interface (for large numbers of students using the tools simultaneously) or through student-selectable areas using a Java interface to a Generic Mapping Tools (GMT) engine. Students can access a variety of maps, satellite images, and geophysical data at a range of spatial scales for the earth and other planets of the solar system. Students can also choose from a variety of base maps (satellite mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others) and can then add a number of geographic and geophysical overlays, for example coastlines, political boundaries, rivers and lakes, earthquake and volcano locations, stress axes, and observed and model plate motion, as well as deformation velocity vectors representing a compilation of over 5000 geodetic measurements from around the world. The related educational website, "Exploring our Dynamic Planet", (http://www.dpc.ucar.edu/VoyagerJr/jvvjrtool.html) incorporates background materials and curricular activities that encourage students to explore Earth processes. One of the present curricular modules is designed for high school students or introductory-level undergraduate non-science majors. The purpose of the module is for students to examine real data to investigate how plate tectonic processes are reflected in observed geophysical phenomena. Constructing maps by controlling map parameters and answering open-ended questions which describe, compare relationships, and work with both observed and model data, promote conceptual understanding of plate tectonics and related processes. The goals of curricular development emphasize inquiry, development of critical thinking skills, and student-centered interests. Custom editions of the map utility have been made as the "Jules Verne Voyager" and "Voyager Junior", for the International Lithosphere Project's "Global Strain Rate Map", and for EarthScope Education and Outreach as "EarthScope Voyager Jr.". For the latter, a number of EarthScope-specific features have been added, including locations of proposed USArray (seismic), Plate Boundary Observatory (geodetic), and San Andreas Fault Observatory at Depth sites, plus detailed maps and geographically referenced examples of EarthScope-related scientific investigations. As EarthScope develops, maps will be updated in `real time' so that students of all ages can use the data in formal and informal educational settings.

  4. Using DNase Hi-C techniques to map global and local three-dimensional genome architecture at high resolution.

    PubMed

    Ma, Wenxiu; Ay, Ferhat; Lee, Choli; Gulsoy, Gunhan; Deng, Xinxian; Cook, Savannah; Hesson, Jennifer; Cavanaugh, Christopher; Ware, Carol B; Krumm, Anton; Shendure, Jay; Blau, C Anthony; Disteche, Christine M; Noble, William S; Duan, ZhiJun

    2018-06-01

    The folding and three-dimensional (3D) organization of chromatin in the nucleus critically impacts genome function. The past decade has witnessed rapid advances in genomic tools for delineating 3D genome architecture. Among them, chromosome conformation capture (3C)-based methods such as Hi-C are the most widely used techniques for mapping chromatin interactions. However, traditional Hi-C protocols rely on restriction enzymes (REs) to fragment chromatin and are therefore limited in resolution. We recently developed DNase Hi-C for mapping 3D genome organization, which uses DNase I for chromatin fragmentation. DNase Hi-C overcomes RE-related limitations associated with traditional Hi-C methods, leading to improved methodological resolution. Furthermore, combining this method with DNA capture technology provides a high-throughput approach (targeted DNase Hi-C) that allows for mapping fine-scale chromatin architecture at exceptionally high resolution. Hence, targeted DNase Hi-C will be valuable for delineating the physical landscapes of cis-regulatory networks that control gene expression and for characterizing phenotype-associated chromatin 3D signatures. Here, we provide a detailed description of method design and step-by-step working protocols for these two methods. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Electromagnetic pulse (EMP) coupling codes for use with the vulnerability/lethality (VIL) taxonomy. Final report, June-October 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mar, M.H.

    1995-07-01

    Based on the Vulnerability/Lethality (V/L) taxonomy developed by the Ballistic Vulnerability Lethality Division (BVLD) of the Survivability Lethality Analysis Directorate (SLAD), a nuclear electromagnetic pulse (EMP) coupling V/L analysis taxonomy has been developed. A nuclear EMP threat to a military system can be divided into two levels: (1) coupling to a system level through a cable, antenna, or aperture; and (2) the component level. This report will focus on the initial condition, which includes threat definition and target description, as well as the mapping process from the initial condition to the damaged-components state. EMP coupling analysis at a system level is used to accomplish this. This report introduces the nature of the EMP threat, interaction between the threat and target, and how the output of EMP coupling analysis at a system level becomes the input to the component-level analysis. Many different tools (EMP coupling codes) will be discussed for the mapping process, which corresponds to the physics of the phenomenology. This EMP coupling V/L taxonomy and the models identified in this report will provide the tools necessary to conduct basic V/L analysis of EMP coupling.

  6. X-ray absorption radiography for high pressure shock wave studies

    NASA Astrophysics Data System (ADS)

    Antonelli, L.; Atzeni, S.; Batani, D.; Baton, S. D.; Brambrink, E.; Forestier-Colleoni, P.; Koenig, M.; Le Bel, E.; Maheut, Y.; Nguyen-Bui, T.; Richetta, M.; Rousseaux, C.; Ribeyre, X.; Schiavi, A.; Trela, J.

    2018-01-01

    The study of laser-compressed matter, both warm dense matter (WDM) and hot dense matter (HDM), is relevant to several research areas, including materials science, astrophysics, and inertial confinement fusion. X-ray absorption radiography is a unique tool to diagnose compressed WDM and HDM. The application of radiography to shock-wave studies is presented and discussed. In addition to the standard Abel inversion to recover a density map from a transmission map, a procedure has been developed to generate synthetic radiographs using density maps produced by the hydrodynamics code DUED. This procedure takes into account both source-target geometry and source size (which plays a non-negligible role in the interpretation of the data), and makes it possible to reproduce transmission data with a good degree of accuracy.
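
    At its simplest, a synthetic radiograph is a map of line-integrated opacity, T = exp(−κ ∫ ρ dl). The sketch below computes such a parallel-ray transmission map from a toy 2-D density map; the mass absorption coefficient and the neglect of source size and source-target geometry are simplifying assumptions relative to the procedure described above.

```python
import numpy as np

def synthetic_radiograph(density, cell_size, kappa):
    """Parallel-ray transmission map T = exp(-kappa * integral(rho dl)),
    integrating along axis 0 of a 2-D density map (g/cm^3, cm, cm^2/g).
    Real forward models also fold in source size and source-target geometry."""
    areal_density = density.sum(axis=0) * cell_size     # g/cm^2 along each ray
    return np.exp(-kappa * areal_density)

# Toy density map: a compressed shell (2-D slice) on a 200 x 200 grid.
n, cell = 200, 5e-4                     # 5 micron cells, in cm
y, x = np.mgrid[0:n, 0:n]
r = np.hypot(x - n / 2, y - n / 2) * cell
density = np.where((r > 0.02) & (r < 0.03), 10.0, 1.0)   # g/cm^3

transmission = synthetic_radiograph(density, cell, kappa=50.0)  # kappa is illustrative
print(transmission.min(), transmission.max())
```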

  7. Acoustic mapping of shallow water gas releases using shipborne multibeam systems

    NASA Astrophysics Data System (ADS)

    Urban, Peter; Köser, Kevin; Weiß, Tim; Greinert, Jens

    2015-04-01

    Water column imaging (WCI) shipborne multibeam systems are effective tools for investigating marine free gas (bubble) release. Like single- and split-beam systems, they are very sensitive to gas bubbles in the water column, and they have the advantage of a wide swath opening angle of 120° or more, allowing better mapping and possible 3D investigation of targets in the water column. On the downside, WCI data are degraded by specific noise from side-lobe effects and are usually not calibrated for target backscattering strength analysis. Most approaches so far have concentrated on manual investigation of bubbles in the water column data. Such investigations allow the detection of bubble streams (flares) and make it possible to get an impression of the strength of the detected flares and thus of the gas release. Because of the subjective character of these investigations, it is difficult to understand how well an area has been investigated by a flare mapping survey, and subjective impressions about flare strength can easily be fooled by the many acoustic effects multibeam systems create. Here we present a semi-automated approach that uses the behavior of bubble streams in varying water currents to detect and map their exact source positions. The focus of the method is the application of objective rules for flare detection, which makes it possible to extract information about the quality of the seepage mapping survey, perform automated noise reduction and create acoustic maps with quality discriminators indicating how well an area has been mapped.

  8. CloudDOE: a user-friendly tool for deploying Hadoop clouds and analyzing high-throughput sequencing data with MapReduce.

    PubMed

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.
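
    For readers unfamiliar with how a MapReduce job is expressed, the sketch below is a Hadoop Streaming-style mapper and reducer in Python that counts aligned reads per chromosome from tab-separated alignment records. It is not one of the bundled tools (CloudBurst, CloudBrush, CloudRS); the record format and file names are assumptions for illustration.

```python
# Hadoop Streaming-style mapper and reducer (a sketch, not a bundled tool):
# count aligned reads per chromosome from lines of the form
# "read_id<TAB>chrom<TAB>position". It can be tested locally with
#   cat alignments.tsv | python job.py mapper | sort | python job.py reducer
import sys

def mapper(lines):
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 2:
            print(f"{fields[1]}\t1")           # emit (chromosome, 1)

def reducer(lines):
    current, count = None, 0
    for line in lines:                         # input arrives sorted by key
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    role = sys.argv[1] if len(sys.argv) > 1 else "mapper"
    (mapper if role == "mapper" else reducer)(sys.stdin)
```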

  9. CloudDOE: A User-Friendly Tool for Deploying Hadoop Clouds and Analyzing High-Throughput Sequencing Data with MapReduce

    PubMed Central

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D. T.; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Background: Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. Results: We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. Conclusions: CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. Availability: CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/. PMID:24897343

  10. Using GIS and secondary data to target diabetes-related public health efforts.

    PubMed

    Curtis, Amy B; Kothari, Catherine; Paul, Rajib; Connors, Elyse

    2013-01-01

    To efficiently help communities prevent and manage diabetes, health departments need to be able to target populations with high risk but low resources. To aid in this process, we mapped county-level diabetes-related rates and resources/use using publicly available secondary data to identify Michigan counties with high diabetes prevalence and low or no medical and/or community resources. We collected county-level diabetes-related rates and resources from Web-based sources and mapped them using geographic information systems (GIS) software. Data included age-adjusted county diabetes rates, diabetes-related medical resources and resource use (i.e., the number of endocrinologists and percentage of Medicare patients with diabetes who received hemoglobin A1c testing in the past year), community resources (i.e., the number of certified diabetes self-management education programs and diabetes support groups), as well as population estimates and demographics (e.g., rural residence, education, poverty, and race/ethnicity). We created GIS maps highlighting areas that had higher-than-median rates of disease and lower-than-median resources. We also conducted linear, logistic, and Poisson regression analyses to confirm GIS findings. There were clear regional trends in resource distribution across Michigan. The 15 counties in the Upper Peninsula were lacking in medical resources but higher in community resources compared with the 68 counties in the Lower Peninsula. There was little apparent association between need (diabetes prevalence) and diabetes-related resources/use. Specific counties with high diabetes prevalence and low resources were easily identified using GIS mapping. Using public data and mapping tools, we identified diabetes health-service shortage areas for targeted public health programming.
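
    The record's core selection rule - flag counties whose diabetes prevalence is above the median while their resources are below the median - is easy to express in a few lines. A minimal sketch with pandas, using invented county names and hypothetical column names (diabetes_rate, endocrinologists, dsme_programs); the published analysis was performed in GIS software rather than Python.

```python
import pandas as pd

# Hypothetical county-level table; values and column names are illustrative only.
counties = pd.DataFrame({
    "county": ["Alcona", "Baraga", "Kent", "Wayne"],
    "diabetes_rate": [12.1, 11.4, 8.9, 10.8],   # age-adjusted prevalence (%)
    "endocrinologists": [0, 0, 14, 52],         # medical resources
    "dsme_programs": [1, 2, 5, 9],              # community resources
})

high_need = counties["diabetes_rate"] > counties["diabetes_rate"].median()
low_medical = counties["endocrinologists"] < counties["endocrinologists"].median()
low_community = counties["dsme_programs"] < counties["dsme_programs"].median()

# Counties warranting targeted programming: high prevalence and low resources.
targets = counties[high_need & (low_medical | low_community)]
print(targets[["county", "diabetes_rate"]])
```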

  11. Automated MRI segmentation for individualized modeling of current flow in the human head.

    PubMed

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria included segmentation accuracy, differences in the current flow distributions of the resulting HD-tDCS models, and the optimized current flow intensities on cortical targets. The segmentation tool segments not just the brain but also provides accurate results for CSF, skull and other soft tissues, with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.
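
    The evaluation above compares automated against manual segmentations. As a generic illustration of how such agreement can be quantified (the paper reports percent deviations of the resulting current-flow models, not necessarily this metric), here is a short sketch of a Dice overlap coefficient between two binary masks.

```python
import numpy as np


def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two binary segmentation masks of equal shape."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0


# Toy example: "automated" vs. "manual" masks on a small synthetic volume.
rng = np.random.default_rng(0)
manual = rng.random((64, 64, 64)) > 0.5
automated = manual.copy()
automated[:4] = ~automated[:4]        # perturb a slab to mimic disagreement
print(f"Dice = {dice(manual, automated):.3f}")
```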

  12. Modeling protein structure at near atomic resolutions with Gorgon.

    PubMed

    Baker, Matthew L; Abeysinghe, Sasakthi S; Schuh, Stephen; Coleman, Ross A; Abrams, Austin; Marsh, Michael P; Hryc, Corey F; Ruths, Troy; Chiu, Wah; Ju, Tao

    2011-05-01

    Electron cryo-microscopy (cryo-EM) has played an increasingly important role in elucidating the structure and function of macromolecular assemblies in near native solution conditions. Typically, however, only non-atomic resolution reconstructions have been obtained for these large complexes, necessitating computational tools for integrating and extracting structural details. With recent advances in cryo-EM, maps at near-atomic resolutions have been achieved for several macromolecular assemblies from which models have been manually constructed. In this work, we describe a new interactive modeling toolkit called Gorgon targeted at intermediate to near-atomic resolution density maps (10-3.5 Å), particularly from cryo-EM. Gorgon's de novo modeling procedure couples sequence-based secondary structure prediction with feature detection and geometric modeling techniques to generate initial protein backbone models. Beyond model building, Gorgon is an extensible interactive visualization platform with a variety of computational tools for annotating a wide variety of 3D volumes. Examples from cryo-EM maps of Rotavirus and Rice Dwarf Virus are used to demonstrate its applicability to modeling protein structure. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. Exploration of depth modeling mode one lossless wedgelets storage strategies for 3D-high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Sanchez, Gustavo; Marcon, César; Agostini, Luciano Volcan

    2018-01-01

    3D-High Efficiency Video Coding (3D-HEVC) introduced tools to obtain higher efficiency in 3-D video coding, most of which are related to depth map coding. Among these tools, depth modeling mode 1 (DMM-1) focuses on better encoding the edge regions of depth maps. The large memory required for storing all wedgelet patterns is one of the bottlenecks in DMM-1 hardware design for both encoder and decoder, since many patterns must be stored. Three algorithms to reduce the DMM-1 memory requirements and a hardware design targeting the most efficient of these algorithms are presented. Experimental results demonstrate that the proposed solutions surpass related works, reducing the wedgelet memory by up to 78.8% without degrading encoding efficiency. Synthesis results demonstrate that the proposed algorithm reduces power dissipation by almost 75% compared to the standard approach.

  14. Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.

    PubMed

    Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H

    2017-07-01

    Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.
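
    MBE maps functional connectivity using time-dependent correlation. A minimal, hedged sketch of that underlying operation - correlating a seed pixel's time course with every other pixel in a (frames, height, width) stack - written independently of the MBE codebase.

```python
import numpy as np


def seed_correlation_map(stack: np.ndarray, seed_yx: tuple[int, int]) -> np.ndarray:
    """Pearson correlation of every pixel's time course with the seed pixel.

    stack: array of shape (frames, height, width), e.g. dF/F imaging data.
    """
    frames, h, w = stack.shape
    data = stack.reshape(frames, -1)
    data = data - data.mean(axis=0)               # remove each pixel's temporal mean
    norms = np.linalg.norm(data, axis=0)
    norms[norms == 0] = 1.0                       # avoid division by zero
    seed = data[:, seed_yx[0] * w + seed_yx[1]]
    corr = (data.T @ seed) / (norms * np.linalg.norm(seed) + 1e-12)
    return corr.reshape(h, w)


# Toy usage with synthetic data.
movie = np.random.default_rng(1).standard_normal((200, 32, 32))
cmap = seed_correlation_map(movie, seed_yx=(16, 16))
print(cmap.shape, cmap[16, 16])   # the seed correlates perfectly with itself (~1.0)
```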

  15. Ergonomics action research II: a framework for integrating HF into work system design.

    PubMed

    Neumann, W P; Village, J

    2012-01-01

    This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.

  16. Mapping a multiplexed zoo of mRNA expression.

    PubMed

    Choi, Harry M T; Calvert, Colby R; Husain, Naeem; Huss, David; Barsi, Julius C; Deverman, Benjamin E; Hunter, Ryan C; Kato, Mihoko; Lee, S Melanie; Abelin, Anna C T; Rosenthal, Adam Z; Akbari, Omar S; Li, Yuwei; Hay, Bruce A; Sternberg, Paul W; Patterson, Paul H; Davidson, Eric H; Mazmanian, Sarkis K; Prober, David A; van de Rijn, Matt; Leadbetter, Jared R; Newman, Dianne K; Readhead, Carol; Bronner, Marianne E; Wold, Barbara; Lansford, Rusty; Sauka-Spengler, Tatjana; Fraser, Scott E; Pierce, Niles A

    2016-10-01

    In situ hybridization methods are used across the biological sciences to map mRNA expression within intact specimens. Multiplexed experiments, in which multiple target mRNAs are mapped in a single sample, are essential for studying regulatory interactions, but remain cumbersome in most model organisms. Programmable in situ amplifiers based on the mechanism of hybridization chain reaction (HCR) overcome this longstanding challenge by operating independently within a sample, enabling multiplexed experiments to be performed with an experimental timeline independent of the number of target mRNAs. To assist biologists working across a broad spectrum of organisms, we demonstrate multiplexed in situ HCR in diverse imaging settings: bacteria, whole-mount nematode larvae, whole-mount fruit fly embryos, whole-mount sea urchin embryos, whole-mount zebrafish larvae, whole-mount chicken embryos, whole-mount mouse embryos and formalin-fixed paraffin-embedded human tissue sections. In addition to straightforward multiplexing, in situ HCR enables deep sample penetration, high contrast and subcellular resolution, providing an incisive tool for the study of interlaced and overlapping expression patterns, with implications for research communities across the biological sciences. © 2016. Published by The Company of Biologists Ltd.

  17. Mapping a multiplexed zoo of mRNA expression

    PubMed Central

    Choi, Harry M. T.; Calvert, Colby R.; Husain, Naeem; Huss, David; Barsi, Julius C.; Deverman, Benjamin E.; Hunter, Ryan C.; Kato, Mihoko; Lee, S. Melanie; Abelin, Anna C. T.; Rosenthal, Adam Z.; Akbari, Omar S.; Li, Yuwei; Hay, Bruce A.; Sternberg, Paul W.; Patterson, Paul H.; Davidson, Eric H.; Mazmanian, Sarkis K.; Prober, David A.; van de Rijn, Matt; Leadbetter, Jared R.; Newman, Dianne K.; Readhead, Carol; Bronner, Marianne E.; Wold, Barbara; Lansford, Rusty; Sauka-Spengler, Tatjana; Fraser, Scott E.

    2016-01-01

    In situ hybridization methods are used across the biological sciences to map mRNA expression within intact specimens. Multiplexed experiments, in which multiple target mRNAs are mapped in a single sample, are essential for studying regulatory interactions, but remain cumbersome in most model organisms. Programmable in situ amplifiers based on the mechanism of hybridization chain reaction (HCR) overcome this longstanding challenge by operating independently within a sample, enabling multiplexed experiments to be performed with an experimental timeline independent of the number of target mRNAs. To assist biologists working across a broad spectrum of organisms, we demonstrate multiplexed in situ HCR in diverse imaging settings: bacteria, whole-mount nematode larvae, whole-mount fruit fly embryos, whole-mount sea urchin embryos, whole-mount zebrafish larvae, whole-mount chicken embryos, whole-mount mouse embryos and formalin-fixed paraffin-embedded human tissue sections. In addition to straightforward multiplexing, in situ HCR enables deep sample penetration, high contrast and subcellular resolution, providing an incisive tool for the study of interlaced and overlapping expression patterns, with implications for research communities across the biological sciences. PMID:27702788

  18. Tree Cover Mapping Tool—Documentation and user manual

    USGS Publications Warehouse

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2016-06-02

    The Tree Cover Mapping (TCM) tool was developed by scientists at the U.S. Geological Survey Earth Resources Observation and Science Center to allow a user to quickly map tree cover density over large areas using visual interpretation of high resolution imagery within a geographic information system interface. The TCM tool uses a systematic sample grid to produce maps of tree cover. The TCM tool allows the user to define sampling parameters to estimate tree cover within each sample unit. This mapping method generated the first on-farm tree cover maps of vast regions of Niger and Burkina Faso. The approach contributes to implementing integrated landscape management to scale up re-greening and restore degraded land in the drylands of Africa. The TCM tool is easy to operate, practical, and can be adapted to many other applications such as crop mapping, settlements mapping, or other features. This user manual provides step-by-step instructions for installing and using the tool, and creating tree cover maps. Familiarity with ArcMap tools and concepts is helpful for using the tool.

  19. Mapping how information about childhood vaccination is communicated in two regions of Cameroon: What is done and where are the gaps?

    PubMed

    Ames, Heather; Njang, Diangha Mabel; Glenton, Claire; Fretheim, Atle; Kaufman, Jessica; Hill, Sophie; Oku, Afiong; Cliff, Julie; Cartier, Yuri; Bosch-Capblanch, Xavier; Rada, Gabriel; Muloliwa, Artur; Oyo-Ita, Angela; Lewin, Simon

    2015-12-21

    The 'Communicate to vaccinate' (COMMVAC) project builds research evidence for improving communication with parents and communities about childhood vaccinations in low- and middle-income countries. Understanding and mapping the range of vaccination communication strategies used in different settings is an important component of this work. In this part of the COMMVAC project, our objectives were: (1) to identify the vaccination communication interventions used in two regions of Cameroon; (2) to apply the COMMVAC taxonomy, a global taxonomy of vaccination communication interventions, to these communication interventions to help us classify these interventions, including their purposes and target audiences; and identify whether gaps in purpose or target audiences exist; (3) to assess the COMMVAC taxonomy as a research tool for data collection and analysis. We used the following qualitative methods to identify communication strategies in the Central and North West Regions of Cameroon in the first half of 2014: interviews with program managers, non-governmental organizations, vaccinators, parents and community members; observations and informal conversations during routine immunization clinics and three rounds of the National Polio Immunization Campaign; and document analysis of reports and mass media communications about vaccination. A survey of parents and caregivers was also done. We organised the strategies using the COMMVAC taxonomy and produced a map of Cameroon-specific interventions, which we presented to local stakeholders for feedback. Our map of the interventions used in Cameroon suggests that most childhood vaccination communication interventions focus on national campaigns against polio rather than routine immunisation. The map also indicates that most communication interventions target communities more broadly, rather than parents, and that very few interventions target health workers. The majority of the communication interventions aimed to inform or educate or remind or recall members of the community about vaccination. The COMMVAC taxonomy provided a useful framework for quickly and simply mapping existing vaccination communication strategies. By identifying the interventions used in Cameroon and developing an intervention map, we allowed stakeholders to see where they were concentrating their communication efforts and where gaps exist, allowing them to reflect on whether changes are needed to the communication strategies they are using.

  20. A complete mass spectrometric map for the analysis of the yeast proteome and its application to quantitative trait analysis

    PubMed Central

    Picotti, Paola; Clement-Ziza, Mathieu; Lam, Henry; Campbell, David S.; Schmidt, Alexander; Deutsch, Eric W.; Röst, Hannes; Sun, Zhi; Rinner, Oliver; Reiter, Lukas; Shen, Qin; Michaelson, Jacob J.; Frei, Andreas; Alberti, Simon; Kusebauch, Ulrike; Wollscheid, Bernd; Moritz, Robert; Beyer, Andreas; Aebersold, Ruedi

    2013-01-01

    Complete reference maps or datasets, like the genomic map of an organism, are highly beneficial tools for biological and biomedical research. Attempts to generate such reference datasets for a proteome have so far failed to reach complete proteome coverage, with saturation apparent at approximately two thirds of the proteomes tested, even for the most thoroughly characterized proteomes. Here, we used a strategy based on high-throughput peptide synthesis and mass spectrometry to generate a close to complete reference map (97% of the genome-predicted proteins) of the S. cerevisiae proteome. We generated two versions of this mass spectrometric map, one supporting discovery-driven (shotgun) and the other hypothesis-driven (targeted) proteomic measurements. The two versions of the map, therefore, constitute a complete set of proteomic assays to support most studies performed with contemporary proteomic technologies. The reference libraries can be browsed via a web-based repository and associated navigation tools. To demonstrate the utility of the reference libraries we applied them to a protein quantitative trait locus (pQTL) analysis, which requires measurement of the same peptides over a large number of samples with high precision. Protein measurements over a set of 78 S. cerevisiae strains revealed a complex relationship between independent genetic loci, impacting on the levels of related proteins. Our results suggest that selective pressure favors the acquisition of sets of polymorphisms that maintain the stoichiometry of protein complexes and pathways. PMID:23334424

  1. Having trouble with your strategy? Then map it.

    PubMed

    Kaplan, R S; Norton, D P

    2000-01-01

    If you were a military general on the march, you'd want your troops to have plenty of maps--detailed information about the mission they were on, the roads they would travel, the campaigns they would undertake, and the weapons at their disposal. The same holds true in business: a workforce needs clear and detailed information to execute a business strategy successfully. Until now, there haven't been many tools that can communicate both an organization's strategy and the processes and systems needed to implement that strategy. But authors Robert Kaplan and David Norton, cocreators of the balanced scorecard, have adapted that seminal tool to create strategy maps. Strategy maps let an organization describe and illustrate--in clear and general language--its objectives, initiatives, target markets, performance measures, and the links between all the pieces of its strategy. Employees get a visual representation of how their jobs are tied to the company's overall goals, while managers get a clearer understanding of their strategies and a means to detect and correct any flaws in those plans. Using Mobil North American Marketing and Refining Company as an example, Kaplan and Norton walk through the creation of a strategy map and its four distinct regions--financial, customer, internal process, and learning and growth--which correspond to the four perspectives of the balanced scorecard. The authors show step by step how the Mobil division used the map to transform itself from a centrally controlled manufacturer of commodity products to a decentralized, customer-driven organization.

  2. Molecular Neuroanatomy: A Generation of Progress

    PubMed Central

    Pollock, Jonathan D.; Wu, Da-Yu; Satterlee, John

    2014-01-01

    The neuroscience research landscape has changed dramatically over the past decade. An impressive array of neuroscience tools and technologies have been generated, including brain gene expression atlases, genetically encoded proteins to monitor and manipulate neuronal activity and function, cost effective genome sequencing, new technologies enabling genome manipulation, new imaging methods and new tools for mapping neuronal circuits. However, despite these technological advances, several significant scientific challenges must be overcome in the coming decade to enable a better understanding of brain function and to develop next generation cell type-targeted therapeutics to treat brain disorders. For example, we do not have an inventory of the different types of cells that exist in the brain, nor do we know how to molecularly phenotype them. We also lack robust technologies to map connections between cells. This review will provide an overview of some of the tools and technologies neuroscientists are currently using to move the field of molecular neuroanatomy forward and also discuss emerging technologies that may enable neuroscientists to address these critical scientific challenges over the coming decade. PMID:24388609

  3. Genomic Tools in Groundnut Breeding Program: Status and Perspectives

    PubMed Central

    Janila, P.; Variath, Murali T.; Pandey, Manish K.; Desmae, Haile; Motagi, Babu N.; Okori, Patrick; Manohar, Surendra S.; Rathnakumar, A. L.; Radhakrishnan, T.; Liao, Boshou; Varshney, Rajeev K.

    2016-01-01

    Groundnut, a nutrient-rich food legume, is cultivated the world over. It is valued for its good quality cooking oil, energy and protein rich food, and nutrient-rich fodder. Globally, groundnut improvement programs have developed varieties to meet the preferences of farmers, traders, processors, and consumers. Enhanced yield, tolerance to biotic and abiotic stresses and quality parameters have been the target traits. A spurt in genetic information on groundnut was facilitated by the development of molecular markers and genetic and physical maps, the generation of expressed sequence tags (ESTs), the discovery of genes, and the identification of quantitative trait loci (QTL) for some important biotic and abiotic stresses and quality traits. The first groundnut variety developed using marker assisted breeding (MAB) was registered in 2003. Since then, USA, China, Japan, and India have begun to use genomic tools in routine groundnut improvement programs. Introgression lines that combine foliar fungal disease resistance and early maturity were developed using MAB. Establishment of marker-trait associations (MTA) paved the way for integrating genomic tools into groundnut breeding for accelerated genetic gain. Genomic Selection (GS) tools are employed to improve drought tolerance and pod yield, governed by several minor effect QTLs. The draft genome sequence and low-cost genotyping tools such as genotyping by sequencing (GBS) are expected to accelerate the use of genomic tools to enhance genetic gains for target traits in groundnut. PMID:27014312

  4. Chemical mapping of cytosines enzymatically flipped out of the DNA helix

    PubMed Central

    Liutkevičiūtė, Zita; Tamulaitis, Gintautas; Klimašauskas, Saulius

    2008-01-01

    Haloacetaldehydes can be employed for probing unpaired DNA structures involving cytosine and adenine residues. Using an enzyme that was structurally proven to flip its target cytosine out of the DNA helix, the HhaI DNA methyltransferase (M.HhaI), we demonstrate the suitability of the chloroacetaldehyde modification for mapping extrahelical (flipped-out) cytosine bases in protein–DNA complexes. The generality of this method was verified with two other DNA cytosine-5 methyltransferases, M.AluI and M.SssI, as well as with two restriction endonucleases, R.Ecl18kI and R.PspGI, which represent a novel class of base-flipping enzymes. Our results thus offer a simple and convenient laboratory tool for detection and mapping of flipped-out cytosines in protein–DNA complexes. PMID:18450817

  5. Airborne Gravity Gradiometry Resolves a Full Range of Gravity Frequencies

    NASA Astrophysics Data System (ADS)

    Mataragio, J.; Brewster, J.; Mims, J.

    2007-12-01

    Airborne Full Tensor Gradiometry (Air-FTGR) was flown at high altitude coincident with Airborne Gravity (AG) flown in 2003 in West Arnhem Land, Australia. A preliminary analysis of the two data sets indicates that the Air-FTGR system has the capability of resolving intermediate- to long-wavelength features that may be associated with relatively deeper geological structures. A comparison of frequency filtered slices and power spectral density (PSD) for both data sets using the short (> 5 km), intermediate (10 km) and long (20 km) wavelengths reveals that high altitude Air-FTGR data show a greater response in high frequency anomalies than conventional Airborne Gravity and match well with the AG even for the longest-wavelength anomalies. The effect of line spacing on target resolution was examined between the two data sets. Reprocessed gradient and AG data at 2, 4 and 6 km line spacing suggest that Air-FTGR could be effectively flown at a comparatively wider line spacing to resolve targets similar to those the AG would resolve with tighter line spacing. Introduction: Airborne Full Tensor Gradiometry (Air-FTGR) data have been available to the mining industry since 2002 and their use for geologic applications is well established. However, Air-FTGR data have mostly been considered and used in mapping and delineation of near-surface geological targets. This is due to the fact that gravity gradiometer measurements are well suited to capture the high frequency signal associated with near-surface targets (Li, 2001). This is possible because the gradiometer signal strength falls off with the cube of the distance to the target. Nonetheless, in recent years there has been an increasing demand from the mining, oil, and gas industries for utilizing Full Tensor Gravity Gradiometry as a mapping tool for both regional and prospect-level surveys. Air-FTGR as a Regional Mapping Tool: Several relatively low altitude surveys have been successfully flown in Brazil, Canada and Australia, mostly targeting large, regional-scale crustal structures as well as regional mapping of both lithology and regolith. Air-FTGR mapping is especially effective in areas of thick lateritic and/or clay cover where other geophysical methods such as airborne magnetics or electromagnetics become less effective. For instance, an Air-FTGR survey was successfully flown in Brazil in the Province of Minas Gerais, where several crustal-scale structures associated with iron oxide mineralization were identified (Mataragio et al., 2006). In addition, in 2006 Air-FTGR had good success in the regional mapping of structures associated with Iron Oxide Copper Gold (IOCG) and uranium mineralization in the Wernecke Mountains in the Yukon and Northwest Territories, Canada. On the basis of these successful surveys, Bell Geospace has initiated a number of high altitude test surveys aiming at evaluating the performance of the Air-FTGR system in capturing low frequency signal that may be associated with regional-scale, deeper structures. One of the test surveys was conducted in December of 2006 in Australia, where the performance of Air-FTGR and conventional Airborne Gravity was evaluated. Airborne gravity is currently considered well suited for capturing low frequency signal.
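
    The comparison above rests on power spectral density estimates of the two data sets in different wavelength bands. A hedged sketch of such a comparison for two synthetic along-line profiles using Welch's method; the sample spacing, profile content and band limits are invented for illustration and do not reproduce the survey data.

```python
import numpy as np
from scipy.signal import welch

dx = 50.0                          # assumed along-line sample spacing in metres
x = np.arange(0, 60_000, dx)       # one 60 km synthetic profile
rng = np.random.default_rng(2)

# Synthetic profiles: a shared long-wavelength regional signal, plus extra
# short-wavelength content in the gradiometer-derived channel.
regional = np.sin(2 * np.pi * x / 20_000)
ftgr = regional + 0.4 * np.sin(2 * np.pi * x / 2_000) + 0.05 * rng.standard_normal(x.size)
ag = regional + 0.05 * rng.standard_normal(x.size)

for name, profile in [("Air-FTGR", ftgr), ("Airborne Gravity", ag)]:
    freq, psd = welch(profile, fs=1.0 / dx, nperseg=512)
    short = psd[freq > 1 / 5_000].sum()                  # wavelengths shorter than 5 km
    long_ = psd[(freq > 0) & (freq < 1 / 10_000)].sum()  # wavelengths longer than 10 km
    print(f"{name}: short-wavelength power {short:.4f}, long-wavelength power {long_:.4f}")
```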

  6. The cancer experience map: an approach to including the patient voice in supportive care solutions.

    PubMed

    Hall, Leslie Kelly; Kunz, Breanne F; Davis, Elizabeth V; Dawson, Rose I; Powers, Ryan S

    2015-05-28

    The perspective of the patient, also called the "patient voice", is an essential element in materials created for cancer supportive care. Identifying that voice, however, can be a challenge for researchers and developers. A multidisciplinary team at a health information company tasked with addressing this issue created a representational model they call the "cancer experience map". This map, designed as a tool for content developers, offers a window into the complex perspectives inside the cancer experience. Informed by actual patient quotes, the map shows common overall themes for cancer patients, concerns at key treatment points, strategies for patient engagement, and targeted behavioral goals. In this article, the team members share the process by which they created the map as well as its first use as a resource for cancer support videos. The article also addresses the broader policy implications of including the patient voice in supportive cancer content, particularly with regard to mHealth apps.

  7. Thematic mapping of likely target areas for the occurrence of cassiterite in the Serra do Mocambo (GO) granitic massifs using LANDSAT 2 digital imaging

    NASA Technical Reports Server (NTRS)

    Almeidofilho, R. (Principal Investigator)

    1984-01-01

    The applicability of LANDSAT/MSS images, enhanced by computer-derived techniques, as essential tools in mineral research was investigated, with the Serra do Mocambo granitic massif used as an illustration. Given the peculiar factors found in this area, orbital imagery permitted the delineation of potential target areas of mineralization occurrences associated with albitized/greisenized types. Follow-up prospecting for primary tin deposits in this granitic massif should be restricted to the delineated areas, which cover less than 5% of the total surface area of the massif.

  8. Hybrid overlay metrology for high order correction by using CDSEM

    NASA Astrophysics Data System (ADS)

    Leray, Philippe; Halder, Sandip; Lorusso, Gian; Baudemprez, Bart; Inoue, Osamu; Okagawa, Yutaka

    2016-03-01

    Overlay control has become one of the most critical issues for semiconductor manufacturing. Advanced lithographic scanners use high-order corrections or correction per exposure to reduce the residual overlay. Traditional feedback based on overlay measurements of ADI (after-develop inspection) wafers is not sufficient, because overlay error also depends on other processes (etching, film stress, etc.); high-accuracy overlay measurement on AEI (after-etch inspection) wafers is needed. WIS (Wafer Induced Shift) is the main issue for optical overlay, both IBO (Image Based Overlay) and DBO (Diffraction Based Overlay). We design dedicated SEM overlay targets for an N10 dual damascene process using i-ArF multi-patterning. Locally, the target pattern is the same as the device pattern. Optical overlay tools use segmented patterns to reduce WIS; however, segmentation has limits, especially for via patterns, in maintaining sensitivity and accuracy. We evaluate the difference between the via pattern and relaxed-pitch gratings, which are similar to optical overlay targets, at AEI. CDSEM can estimate the asymmetry of a target from the image of the pattern edge. We will compare the full map of SEM overlay to the full map of optical overlay for high-order correction (correctables and residual fingerprints).

  9. Toward a view-oriented approach for aligning RDF-based biomedical repositories.

    PubMed

    Anguita, A; García-Remesal, M; de la Iglesia, D; Graf, N; Maojo, V

    2015-01-01

    This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The need for complementary access to multiple RDF databases has fostered new lines of research, but also entailed new challenges due to data representation disparities. While several approaches for RDF-based database integration have been proposed, those focused on schema alignment have become the most widely adopted. All state-of-the-art solutions for aligning RDF-based sources resort to a simple technique inherited from legacy relational database integration methods. This technique - known as element-to-element (e2e) mappings - is based on establishing 1:1 mappings between single primitive elements - e.g. concepts, attributes, relationships, etc. - belonging to the source and target schemas. However, due to the intrinsic nature of RDF - a representation language based on defining tuples <subject, predicate, object> - one may find RDF elements whose semantics vary dramatically when combined into a view involving other RDF elements - i.e. they depend on their context. The latter cannot be adequately represented in the target schema by resorting to the traditional e2e approach. These approaches fail to properly address this issue without explicitly modifying the target ontology, thus lacking the required expressiveness to properly reflect the intended semantics in the alignment information. Our objective is to enhance existing RDF schema alignment techniques by providing a mechanism to properly represent elements with context-dependent semantics, thus enabling users to perform more expressive alignments, including scenarios that cannot be adequately addressed by the existing approaches. Instead of establishing 1:1 correspondences between single primitive elements of the schemas, we propose adopting a view-based approach. The latter is targeted at establishing mapping relationships between RDF subgraphs - which can be regarded as the equivalent of views in traditional databases - rather than between single schema elements. This approach enables users to represent scenarios defined by context-dependent RDF elements that cannot be properly represented when adopting the currently existing approaches. We developed a software tool implementing our view-based strategy. Our tool is currently being used in the context of the European Commission funded p-medicine project, targeted at creating a technological framework to integrate clinical and genomic data to facilitate the development of personalized drugs and therapies for cancer, based on the genetic profile of the patient. We used our tool to integrate different RDF-based databases - including different repositories of clinical trials and DICOM images - using the Health Data Ontology Trunk (HDOT) ontology as the target schema. The importance of database integration methods and tools in the context of biomedical research has been widely recognized. Modern research in this area - e.g. identification of disease biomarkers, or design of personalized therapies - heavily relies on the availability of a technical framework to enable researchers to uniformly access disparate repositories. We present a method and a tool that implement a novel alignment method specifically designed to support and enhance the integration of RDF-based data sources at the schema (metadata) level. This approach provides an increased level of expressiveness compared to other existing solutions and can resolve heterogeneity scenarios that cannot be properly represented using other state-of-the-art techniques.
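
    To make the view-based idea concrete, the sketch below (not taken from the p-medicine tool) uses a SPARQL CONSTRUCT query in rdflib to rewrite a source pattern that spans several triples - a combination whose meaning is context-dependent - into a single target-schema statement. The namespaces and properties are invented; HDOT terms are not reproduced.

```python
from rdflib import Graph

source = Graph()
source.parse(data="""
@prefix src: <http://example.org/source#> .
src:obs1 src:aboutPatient src:p1 ;
         src:hasCode "C50.9" ;
         src:codeSystem "ICD-10" .
""", format="turtle")

# A view-level mapping: only the *combination* of hasCode and codeSystem
# determines the meaning in the target schema, so the rule matches a
# subgraph, not a single property.
view_mapping = """
PREFIX src: <http://example.org/source#>
PREFIX tgt: <http://example.org/target#>
CONSTRUCT { ?patient tgt:hasICD10Diagnosis ?code . }
WHERE {
  ?obs src:aboutPatient ?patient ;
       src:hasCode ?code ;
       src:codeSystem "ICD-10" .
}
"""

target = Graph()
for triple in source.query(view_mapping):   # CONSTRUCT results iterate as triples
    target.add(triple)

print(target.serialize(format="turtle"))
```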

  10. Geodemographics as a tool for targeting neighbourhoods in public health campaigns

    NASA Astrophysics Data System (ADS)

    Petersen, Jakob; Gibin, Maurizio; Longley, Paul; Mateos, Pablo; Atkinson, Philip; Ashby, David

    2011-06-01

    Geodemographics offers the prospect of integrating, modelling and mapping health care needs and other health indicators that are useful for targeting neighbourhoods in public health campaigns. Yet reports on this application domain have to date been sporadic. The purpose of this paper is to examine the potential of a bespoke geodemographic system for neighbourhood targeting in an inner city public health authority, Southwark Primary Care Trust, London. This system, the London Output Area Classification (LOAC), is compared to six other geodemographic systems from both governmental and commercial sources. The paper proposes two new indicators for assessing the performance of geodemographic systems for neighbourhood targeting based on local hospital demand data. The paper also analyses and discusses the utility of age- and sex-standardisation of geodemographic profiles of health care demand.

  11. Curation of inhibitor-target data: process and impact on pathway analysis.

    PubMed

    Devidas, Sreenivas

    2009-01-01

    The past decade has seen a significant increase in the availability and use of pathway analysis tools. The workflow supported by most pathway analysis tools is limited to either (a) a network of genes based on the input data set, or (b) that network filtered down by a few criteria such as (but not limited to) (i) disease association of the genes in the network; (ii) targets of one or more launched drugs; (iii) targets of one or more compounds in clinical trials; and (iv) targets reasonably believed to be potential candidate or clinical biomarkers. Almost all the tools in use today are biased towards the biological side and contain little, if any, information on the chemical inhibitors associated with the components of a given biological network. The limitations are as follows: the number of inhibitors that have been published or patented is probably severalfold (likely greater than 10-fold) larger than the number of published protein-protein interactions, and curation of such data is both expensive and time consuming and could impact ROI significantly; the non-standardization of protein and gene names makes mapping far from straightforward; the number of patented and published inhibitors across target classes increases by over a million per year, so keeping the databases current becomes a monumental problem; and modifications are required in the product architectures to accommodate chemistry-related content. GVK Bio has, over the past 7 years, curated the compound-target data necessary for the addition of such compound-centric workflows. This chapter focuses on the identification, curation and utility of such data.

  12. User Guide for the Anvil Threat Cooridor Forecast Tool V2.4 for AWIPS

    NASA Technical Reports Server (NTRS)

    Barett, Joe H., III; Bauman, William H., III

    2008-01-01

    The Anvil Tool GUI allows users to select a Data Type, toggle the map refresh on/off, place labels, and choose the Profiler Type (source of the KSC 50 MHz profiler data), the Date-Time of the data, the Center of Plot, and the Station (location of the RAOB or 50 MHz profiler). If the Data Type is Models, the user selects a Fcst Hour (forecast hour) instead of Station. There are menus for User Profiles, Circle Label Options, and Frame Label Options. Labels can be placed near the center circle of the plot and/or at a specified distance and direction from the center of the circle (Center of Plot). The default selection for the map refresh is "ON". When the user creates a new Anvil Tool map with Refresh Map "ON", the plot is automatically displayed in the AWIPS frame. If another Anvil Tool map is already displayed and the user does not change the existing map number shown at the bottom of the GUI, the new Anvil Tool map will overwrite the old one. If the user turns Refresh Map "OFF", the new Anvil Tool map is created but not automatically displayed. The user can still display the Anvil Tool map through the Maps dropdown menu, as shown in Figure 4.

  13. Making Space for Place: Mapping Tools and Practices to Teach for Spatial Justice

    ERIC Educational Resources Information Center

    Rubel, Laurie H.; Hall-Wieckert, Maren; Lim, Vivian Y.

    2017-01-01

    This article presents a set of spatial tools for classroom learning about spatial justice. As part of a larger team, we designed a curriculum that engaged 10 learners with 3 spatial tools: (a) an oversized floor map, (b) interactive geographic information systems (GIS) maps, and (c) participatory mapping. We analyze how these tools supported…

  14. Delineating Beach and Dune Morphology from Massive Terrestrial Laser Scanning Data Using the Generic Mapping Tools

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Wang, G.; Yan, B.; Kearns, T.

    2016-12-01

    Terrestrial laser scanning (TLS) techniques have proven to be efficient tools for collecting three-dimensional, high-density and high-accuracy point clouds for coastal research and resource management. However, processing and presenting massive TLS data is always a challenge when targeting a large area at high resolution. This article introduces a workflow using shell-scripting techniques to chain together tools from the Generic Mapping Tools (GMT), Geographic Resources Analysis Support System (GRASS), and other command-based open-source utilities for automating TLS data processing. TLS point clouds acquired in the beach and dune area near Freeport, Texas in May 2015 were used for the case study. Shell scripts for rotating the coordinate system, removing anomalous points, assessing data quality, generating high-accuracy bare-earth DEMs, and quantifying beach and sand dune features (shoreline, cross-dune section, dune ridge, toe, and volume) are presented in this article. According to this investigation, the accuracy of the laser measurements (distance from the scanner to the targets) is within a couple of centimeters. However, the positional accuracy of TLS points with respect to a global coordinate system is about 5 cm, which is dominated by the accuracy of GPS solutions for obtaining the positions of the scanner and reflector. The accuracy of the TLS-derived bare-earth DEM is primarily determined by the size of the grid cells and the roughness of the terrain surface for the case study. A DEM with grid cells of 4 m x 1 m (shoreline by cross-shore) provides a suitable spatial resolution and accuracy for deriving major beach and dune features.
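
    One step in the workflow described above is gridding dense point clouds into a bare-earth DEM. As an illustration of that step (the published workflow chains GMT/GRASS command-line tools rather than Python), here is a minimal NumPy sketch that bins points into cells and keeps the lowest return per cell as a crude ground estimate; the cell size and synthetic data are assumptions.

```python
import numpy as np


def min_elevation_grid(points: np.ndarray, cell: float = 1.0):
    """Bin x,y,z points into a regular grid, keeping the minimum z per cell.

    points: array of shape (n, 3) with columns x, y, z.
    Returns (grid, x_edges, y_edges); empty cells are NaN.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    xi = np.floor((x - x.min()) / cell).astype(int)
    yi = np.floor((y - y.min()) / cell).astype(int)
    grid = np.full((yi.max() + 1, xi.max() + 1), np.nan)
    for col, row, elev in zip(xi, yi, z):
        if np.isnan(grid[row, col]) or elev < grid[row, col]:
            grid[row, col] = elev
    x_edges = x.min() + cell * np.arange(grid.shape[1] + 1)
    y_edges = y.min() + cell * np.arange(grid.shape[0] + 1)
    return grid, x_edges, y_edges


# Toy usage: a gently sloping surface with some "vegetation" returns above ground.
rng = np.random.default_rng(3)
pts = rng.uniform(0, 50, size=(20_000, 2))
ground = 0.02 * pts[:, 0]
z = ground + np.where(rng.random(20_000) < 0.1, rng.uniform(0.5, 2.0, 20_000), 0.0)
dem, _, _ = min_elevation_grid(np.column_stack([pts, z]), cell=4.0)
print(dem.shape, np.nanmax(dem) - np.nanmin(dem))
```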

  15. Open-source image registration for MRI-TRUS fusion-guided prostate interventions.

    PubMed

    Fedorov, Andriy; Khallaghi, Siavash; Sánchez, C Antonio; Lasso, Andras; Fels, Sidney; Tuncali, Kemal; Sugar, Emily Neubauer; Kapur, Tina; Zhang, Chenxi; Wells, William; Nguyen, Paul L; Abolmaesumi, Purang; Tempany, Clare

    2015-06-01

    We propose two software tools for non-rigid registration of MRI and transrectal ultrasound (TRUS) images of the prostate. Our ultimate goal is to develop an open-source solution to support MRI-TRUS fusion image guidance of prostate interventions, such as targeted biopsy for prostate cancer detection and focal therapy. It is widely hypothesized that image registration is an essential component in such systems. The two non-rigid registration methods are: (1) a deformable registration of the prostate segmentation distance maps with B-spline regularization and (2) a finite element-based deformable registration of the segmentation surfaces in the presence of partial data. We evaluate the methods retrospectively using clinical patient image data collected during standard clinical procedures. Computation time and Target Registration Error (TRE) calculated at the expert-identified anatomical landmarks were used as quantitative measures for the evaluation. The presented image registration tools were capable of completing deformable registration computation within 5 min. Average TRE was approximately 3 mm for both methods, which is comparable with the slice thickness in our MRI data. Both tools are available under nonrestrictive open-source license. We release open-source tools that may be used for registration during MRI-TRUS-guided prostate interventions. Our tools implement novel registration approaches and produce acceptable registration results. We believe these tools will lower the barriers in development and deployment of interventional research solutions and facilitate comparison with similar tools.
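
    The quantitative evaluation above uses Target Registration Error at expert-identified landmarks. A minimal sketch of that calculation - the mean Euclidean distance between transformed moving landmarks and their fixed counterparts - with a toy rigid transform standing in for the deformable registrations described in the record.

```python
import numpy as np


def target_registration_error(moving_lm: np.ndarray, fixed_lm: np.ndarray,
                              transform) -> float:
    """Mean Euclidean distance (mm) between transformed moving landmarks
    and their corresponding fixed landmarks."""
    mapped = np.apply_along_axis(transform, 1, moving_lm)
    return float(np.linalg.norm(mapped - fixed_lm, axis=1).mean())


# Toy example: landmarks in mm, and a rigid transform estimated elsewhere.
fixed = np.array([[10.0, 22.0, 5.0], [14.0, 30.0, 8.0], [9.0, 26.0, 12.0]])
offset = np.array([1.5, -0.8, 0.4])
moving = fixed - offset + 0.5 * np.random.default_rng(4).standard_normal(fixed.shape)

rigid = lambda p: p + offset          # stand-in for the estimated registration
print(f"TRE = {target_registration_error(moving, fixed, rigid):.2f} mm")
```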

  16. Planetary Geologic Mapping Handbook - 2010. Appendix

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.

    2010-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well. Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically. As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  17. Concept Mapping Using Cmap Tools to Enhance Meaningful Learning

    NASA Astrophysics Data System (ADS)

    Cañas, Alberto J.; Novak, Joseph D.

    Concept maps are graphical tools that have been used in all facets of education and training for organizing and representing knowledge. When learners build concept maps, meaningful learning is facilitated. Computer-based concept mapping software such as CmapTools have further extended the use of concept mapping and greatly enhanced the potential of the tool, facilitating the implementation of a concept map-centered learning environment. In this chapter, we briefly present concept mapping and its theoretical foundation, and illustrate how it can lead to an improved learning environment when it is combined with CmapTools and the Internet. We present the nationwide “Proyecto Conéctate al Conocimiento” in Panama as an example of how concept mapping, together with technology, can be adopted by hundreds of schools as a means to enhance meaningful learning.

  18. Map reading tools for map libraries.

    USGS Publications Warehouse

    Greenberg, G.L.

    1982-01-01

    Engineers, navigators and military strategists employ a broad array of mechanical devices to facilitate map use. A larger number of map users such as educators, students, tourists, journalists, historians, politicians, economists and librarians are unaware of the available variety of tools which can be used with maps to increase the speed and efficiency of their application and interpretation. This paper identifies map reading tools such as coordinate readers, protractors, dividers, planimeters, and symbol-templets according to a functional classification. Particularly, arrays of tools are suggested for use in determining position, direction, distance, area and form (perimeter-shape-pattern-relief). -from Author

  19. Matching spatial with ontological brain regions using Java tools for visualization, database access, and integrated data analysis.

    PubMed

    Bezgin, Gleb; Reid, Andrew T; Schubert, Dirk; Kötter, Rolf

    2009-01-01

    Brain atlases are widely used in experimental neuroscience as tools for locating and targeting specific brain structures. Delineated structures in a given atlas, however, are often difficult to interpret and to interface with database systems that supply additional information using hierarchically organized vocabularies (ontologies). Here we discuss the concept of volume-to-ontology mapping in the context of macroscopical brain structures. We present Java tools with which we have implemented this concept for retrieval of mapping and connectivity data on the macaque brain from the CoCoMac database in connection with an electronic version of "The Rhesus Monkey Brain in Stereotaxic Coordinates" authored by George Paxinos and colleagues. The software, including our manually drawn monkey brain template, can be downloaded freely under the GNU General Public License. It adds value to the printed atlas and has a wider (neuro-)informatics application since it can read appropriately annotated data from delineated sections of other species and organs, and turn them into 3D registered stacks. The tools provide additional features, including visualization and analysis of connectivity data, volume and centre-of-mass estimates, and graphical manipulation of entire structures, which are potentially useful for a range of research and teaching applications.
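
    Among the features listed are volume and centre-of-mass estimates for delineated structures. A short sketch of those two estimates computed from a labelled 3D atlas array; the label value and voxel size are made up, and the CoCoMac/Paxinos data themselves are not included.

```python
import numpy as np


def structure_stats(labels: np.ndarray, label_id: int, voxel_mm=(0.5, 0.5, 0.5)):
    """Volume (mm^3) and centre of mass (voxel coordinates) of one structure."""
    mask = labels == label_id
    voxel_volume = float(np.prod(voxel_mm))
    volume = mask.sum() * voxel_volume
    com = np.array(np.nonzero(mask)).mean(axis=1) if mask.any() else None
    return volume, com


# Toy labelled volume: 0 = background, 7 = a hypothetical delineated structure.
atlas = np.zeros((40, 40, 40), dtype=int)
atlas[10:20, 12:22, 15:25] = 7
vol, com = structure_stats(atlas, 7)
print(f"volume = {vol:.1f} mm^3, centre of mass (voxels) = {com}")
```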

  20. Accurate estimation of short read mapping quality for next-generation genome sequencing

    PubMed Central

    Ruffalo, Matthew; Koyutürk, Mehmet; Ray, Soumya; LaFramboise, Thomas

    2012-01-01

    Motivation: Several software tools specialize in the alignment of short next-generation sequencing reads to a reference sequence. Some of these tools report a mapping quality score for each alignment—in principle, this quality score tells researchers the likelihood that the alignment is correct. However, the reported mapping quality often correlates weakly with actual accuracy and the qualities of many mappings are underestimated, encouraging researchers to discard correct mappings. Further, these low-quality mappings tend to correlate with variations in the genome (both single nucleotide and structural), and such mappings are important in accurately identifying genomic variants. Approach: We develop a machine learning tool, LoQuM (LOgistic regression tool for calibrating the Quality of short read Mappings), to assign reliable mapping quality scores to mappings of Illumina reads returned by any alignment tool. LoQuM uses statistics on the read (base quality scores reported by the sequencer) and the alignment (number of matches, mismatches and deletions, mapping quality score returned by the alignment tool, if available, and number of mappings) as features for classification and uses simulated reads to learn a logistic regression model that relates these features to actual mapping quality. Results: We test the predictions of LoQuM on an independent dataset generated by the ART short read simulation software and observe that LoQuM can ‘resurrect’ many mappings that are assigned zero quality scores by the alignment tools and are therefore likely to be discarded by researchers. We also observe that the recalibration of mapping quality scores greatly enhances the precision of called single nucleotide polymorphisms. Availability: LoQuM is available as open source at http://compbio.case.edu/loqum/. Contact: matthew.ruffalo@case.edu. PMID:22962451
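
    LoQuM's central idea is a logistic regression relating read- and alignment-level features to the probability that a mapping is correct. A hedged sketch of that idea with scikit-learn on simulated features named after those listed in the abstract (mean base quality, mismatches, reported mapping quality, number of candidate mappings); it is not the authors' code, and the simulated ground truth is purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 5_000

# Simulated per-read features (stand-ins for those listed in the abstract).
mean_base_q = rng.normal(32, 4, n)
mismatches = rng.poisson(1.5, n)
reported_mapq = rng.integers(0, 61, n)
n_hits = rng.integers(1, 6, n)

# Simulated ground truth: higher base quality and reported MAPQ, fewer
# mismatches and fewer alternative hits make a correct mapping more likely.
logit = (0.15 * (mean_base_q - 30) + 0.05 * reported_mapq
         - 0.6 * mismatches - 0.8 * (n_hits - 1))
correct = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([mean_base_q, mismatches, reported_mapq, n_hits])
model = LogisticRegression(max_iter=1000).fit(X, correct)

# Recalibrated mapping quality on the Phred scale: -10*log10(P(incorrect)).
p_correct = model.predict_proba(X)[:, 1]
recalibrated_mapq = -10 * np.log10(np.clip(1 - p_correct, 1e-6, None))
print(recalibrated_mapq[:5].round(1))
```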

  1. Fast photochemical oxidation of proteins (FPOP) maps the epitope of EGFR binding to adnectin.

    PubMed

    Yan, Yuetian; Chen, Guodong; Wei, Hui; Huang, Richard Y-C; Mo, Jingjie; Rempel, Don L; Tymiak, Adrienne A; Gross, Michael L

    2014-12-01

    Epitope mapping is an important tool for the development of monoclonal antibodies, mAbs, as therapeutic drugs. Recently, a class of therapeutic mAb alternatives, adnectins, has been developed as targeted biologics. They are derived from the 10th type III domain of human fibronectin ((10)Fn3). A common approach to map the epitope binding of these therapeutic proteins to their binding partners is X-ray crystallography. Although the crystal structure is known for Adnectin 1 binding to human epidermal growth factor receptor (EGFR), we seek to determine complementary binding in solution and to test the efficacy of footprinting for this purpose. As a relatively new tool in structural biology and complementary to X-ray crystallography, protein footprinting coupled with mass spectrometry is promising for protein-protein interaction studies. We report here the use of fast photochemical oxidation of proteins (FPOP) coupled with MS to map the epitope of EGFR-Adnectin 1 at both the peptide and amino-acid residue levels. The data correlate well with the previously determined epitopes from the crystal structure and are consistent with HDX MS data, which are presented in an accompanying paper. The FPOP-determined binding interface involves various amino-acid and peptide regions near the N terminus of EGFR. The outcome adds credibility to oxidative labeling by FPOP for epitope mapping and motivates more applications in the therapeutic protein area as a stand-alone method or in conjunction with X-ray crystallography, NMR, site-directed mutagenesis, and other orthogonal methods.

  2. DotMapper: an open source tool for creating interactive disease point maps.

    PubMed

    Smith, Catherine M; Hayward, Andrew C

    2016-04-12

    Molecular strain typing of tuberculosis isolates has led to increased understanding of the epidemiological characteristics of the disease and improvements in its control, diagnosis and treatment. However, molecular cluster investigations, which aim to detect previously unidentified cases, remain challenging. Interactive dot mapping is a simple approach which could aid investigations by highlighting cases likely to share epidemiological links. Current tools generally require technical expertise or lack interactivity. We designed a flexible application for producing disease dot maps using Shiny, a web application framework for the statistical software, R. The application displays locations of cases on an interactive map colour coded according to levels of categorical variables such as demographics and risk factors. Cases can be filtered by selecting combinations of these characteristics and by notification date. It can be used to rapidly identify geographic patterns amongst cases in molecular clusters of tuberculosis in space and time; generate hypotheses about disease transmission; identify outliers, and guide targeted control measures. DotMapper is a user-friendly application which enables rapid production of maps displaying locations of cases and their epidemiological characteristics without the need for specialist training in geographic information systems. Enhanced understanding of tuberculosis transmission using this application could facilitate improved detection of cases with epidemiological links and therefore lessen the public health impacts of the disease. It is a flexible system and also has broad international potential application to other investigations using geo-coded health information.
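
    DotMapper itself is built with R and Shiny; the core idea — geocoded cases on an interactive base map, colour-coded by a categorical attribute and filterable by notification date — can be sketched in Python with pandas and folium. The columns, coordinates and colour scheme below are invented for illustration.

        import pandas as pd
        import folium

        # Hypothetical line list of geocoded cases.
        cases = pd.DataFrame({
            "lat": [51.52, 51.50, 51.51],
            "lon": [-0.10, -0.12, -0.09],
            "risk_factor": ["homelessness", "none", "prison"],
            "notified": pd.to_datetime(["2015-01-10", "2015-03-02", "2015-06-21"]),
        })

        # Simple filter by notification date, mimicking an interactive date slider.
        subset = cases[cases["notified"] >= "2015-02-01"]

        colours = {"homelessness": "red", "prison": "blue", "none": "gray"}
        m = folium.Map(location=[51.51, -0.10], zoom_start=12)
        for _, row in subset.iterrows():
            folium.CircleMarker(
                location=[row["lat"], row["lon"]],
                radius=6,
                color=colours[row["risk_factor"]],
                fill=True,
                tooltip=row["risk_factor"],
            ).add_to(m)
        m.save("dot_map.html")  # open in a browser for the interactive map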

  3. Fast Photochemical Oxidation of Proteins (FPOP) Maps the Epitope of EGFR Binding to Adnectin

    NASA Astrophysics Data System (ADS)

    Yan, Yuetian; Chen, Guodong; Wei, Hui; Huang, Richard Y.-C.; Mo, Jingjie; Rempel, Don L.; Tymiak, Adrienne A.; Gross, Michael L.

    2014-12-01

    Epitope mapping is an important tool for the development of monoclonal antibodies, mAbs, as therapeutic drugs. Recently, a class of therapeutic mAb alternatives, adnectins, has been developed as targeted biologics. They are derived from the 10th type III domain of human fibronectin (10Fn3). A common approach to map the epitope binding of these therapeutic proteins to their binding partners is X-ray crystallography. Although the crystal structure is known for Adnectin 1 binding to human epidermal growth factor receptor (EGFR), we seek to determine complementary binding in solution and to test the efficacy of footprinting for this purpose. As a relatively new tool in structural biology and complementary to X-ray crystallography, protein footprinting coupled with mass spectrometry is promising for protein-protein interaction studies. We report here the use of fast photochemical oxidation of proteins (FPOP) coupled with MS to map the epitope of EGFR-Adnectin 1 at both the peptide and amino-acid residue levels. The data correlate well with the previously determined epitopes from the crystal structure and are consistent with HDX MS data, which are presented in an accompanying paper. The FPOP-determined binding interface involves various amino-acid and peptide regions near the N terminus of EGFR. The outcome adds credibility to oxidative labeling by FPOP for epitope mapping and motivates more applications in the therapeutic protein area as a stand-alone method or in conjunction with X-ray crystallography, NMR, site-directed mutagenesis, and other orthogonal methods.

  4. Communication strategies to promote the uptake of childhood vaccination in Nigeria: a systematic map

    PubMed Central

    Oku, Afiong; Oyo-Ita, Angela; Glenton, Claire; Fretheim, Atle; Ames, Heather; Muloliwa, Artur; Kaufman, Jessica; Hill, Sophie; Cliff, Julie; Cartier, Yuri; Bosch-Capblanch, Xavier; Rada, Gabriel; Lewin, Simon

    2016-01-01

    Background Effective communication is a critical component in ensuring that children are fully vaccinated. Although numerous communication interventions have been proposed and implemented in various parts of Nigeria, the range of communication strategies used has not yet been mapped systematically. This study forms part of the ‘Communicate to vaccinate’ (COMMVAC) project, an initiative aimed at building research evidence for improving communication with parents and communities about childhood vaccinations in low- and middle-income countries. Objective This study aims to: 1) identify the communication strategies used in two states in Nigeria; 2) map these strategies against the existing COMMVAC taxonomy, a global taxonomy of vaccination communication interventions; 3) create a specific Nigerian country map of interventions organised by purpose and target; and 4) analyse gaps between the COMMVAC taxonomy and the Nigerian map. Design We conducted the study in two Nigerian states: Bauchi State in Northern Nigeria and Cross River State in Southern Nigeria. We identified vaccination communication interventions through interviews carried out among purposively selected stakeholders in the health services and relevant agencies involved in vaccination information delivery; through observations and through relevant documents. We used the COMMVAC taxonomy to organise the interventions we identified based on the intended purpose of the communication and the group to which the intervention was targeted. Results The Nigerian map revealed that most of the communication strategies identified aimed to inform and educate and remind or recall. Few aimed to teach skills, enhance community ownership, and enable communication. We did not identify any intervention that aimed to provide support or facilitate decision-making. Many interventions had more than one purpose. The main targets for most interventions were caregivers and community members, with few interventions directed at health workers. Most interventions identified were used in the context of campaigns rather than routine immunisation programmes. Conclusions The identification and development of the Nigerian vaccination communication interventions map could assist programme managers to identify gaps in vaccination communication. The map may be a useful tool as part of efforts to address vaccine hesitancy and improve vaccination coverage in Nigeria and similar settings. PMID:26880154

  5. Communication strategies to promote the uptake of childhood vaccination in Nigeria: a systematic map.

    PubMed

    Oku, Afiong; Oyo-Ita, Angela; Glenton, Claire; Fretheim, Atle; Ames, Heather; Muloliwa, Artur; Kaufman, Jessica; Hill, Sophie; Cliff, Julie; Cartier, Yuri; Bosch-Capblanch, Xavier; Rada, Gabriel; Lewin, Simon

    2016-01-01

    Effective communication is a critical component in ensuring that children are fully vaccinated. Although numerous communication interventions have been proposed and implemented in various parts of Nigeria, the range of communication strategies used has not yet been mapped systematically. This study forms part of the 'Communicate to vaccinate' (COMMVAC) project, an initiative aimed at building research evidence for improving communication with parents and communities about childhood vaccinations in low- and middle-income countries. This study aims to: 1) identify the communication strategies used in two states in Nigeria; 2) map these strategies against the existing COMMVAC taxonomy, a global taxonomy of vaccination communication interventions; 3) create a specific Nigerian country map of interventions organised by purpose and target; and 4) analyse gaps between the COMMVAC taxonomy and the Nigerian map. We conducted the study in two Nigerian states: Bauchi State in Northern Nigeria and Cross River State in Southern Nigeria. We identified vaccination communication interventions through interviews carried out among purposively selected stakeholders in the health services and relevant agencies involved in vaccination information delivery; through observations and through relevant documents. We used the COMMVAC taxonomy to organise the interventions we identified based on the intended purpose of the communication and the group to which the intervention was targeted. The Nigerian map revealed that most of the communication strategies identified aimed to inform and educate and remind or recall. Few aimed to teach skills, enhance community ownership, and enable communication. We did not identify any intervention that aimed to provide support or facilitate decision-making. Many interventions had more than one purpose. The main targets for most interventions were caregivers and community members, with few interventions directed at health workers. Most interventions identified were used in the context of campaigns rather than routine immunisation programmes. The identification and development of the Nigerian vaccination communication interventions map could assist programme managers to identify gaps in vaccination communication. The map may be a useful tool as part of efforts to address vaccine hesitancy and improve vaccination coverage in Nigeria and similar settings.

  6. Linguistic validation of the Alberta Context Tool and two measures of research use, for German residential long term care.

    PubMed

    Hoben, Matthias; Bär, Marion; Mahler, Cornelia; Berger, Sarah; Squires, Janet E; Estabrooks, Carole A; Kruse, Andreas; Behrens, Johann

    2014-01-31

    To study the association between organizational context and research utilization in German residential long term care (LTC), we translated three Canadian assessment instruments: the Alberta Context Tool (ACT), Estabrooks' Kinds of Research Utilization (RU) items and the Conceptual Research Utilization Scale. Target groups for the tools were health care aides (HCAs), registered nurses (RNs), allied health professionals (AHPs), clinical specialists and care managers. Through a cognitive debriefing process, we assessed response processes validity-an initial stage of validity, necessary before more advanced validity assessment. We included 39 participants (16 HCAs, 5 RNs, 7 AHPs, 5 specialists and 6 managers) from five residential LTC facilities. We created lists of questionnaire items containing problematic items plus items randomly selected from the pool of remaining items. After participants completed the questionnaires, we conducted individual semi-structured cognitive interviews using verbal probing. We asked participants to reflect on their answers for list items in detail. Participants' answers were compared to concept maps defining the instrument concepts in detail. If at least two participants gave answers not matching concept map definitions, items were revised and re-tested with new target group participants. Cognitive debriefings started with HCAs. Based on the first round, we modified 4 of 58 ACT items, 1 ACT item stem and all 8 items of the RU tools. All items were understood by participants after another two rounds. We included revised HCA ACT items in the questionnaires for the other provider groups. In the RU tools for the other provider groups, we used different wording than the HCA version, as was done in the original English instruments. Only one cognitive debriefing round was needed with each of the other provider groups. Cognitive debriefing is essential to detect and respond to problematic instrument items, particularly when translating instruments for heterogeneous, less well educated provider groups such as HCAs. Cognitive debriefing is an important step in research tool development and a vital component of establishing response process validity evidence. Publishing cognitive debriefing results helps researchers to determine potentially critical elements of the translated tools and assists with interpreting scores.

  7. Linguistic validation of the Alberta Context Tool and two measures of research use, for German residential long term care

    PubMed Central

    2014-01-01

    Background To study the association between organizational context and research utilization in German residential long term care (LTC), we translated three Canadian assessment instruments: the Alberta Context Tool (ACT), Estabrooks’ Kinds of Research Utilization (RU) items and the Conceptual Research Utilization Scale. Target groups for the tools were health care aides (HCAs), registered nurses (RNs), allied health professionals (AHPs), clinical specialists and care managers. Through a cognitive debriefing process, we assessed response processes validity–an initial stage of validity, necessary before more advanced validity assessment. Methods We included 39 participants (16 HCAs, 5 RNs, 7 AHPs, 5 specialists and 6 managers) from five residential LTC facilities. We created lists of questionnaire items containing problematic items plus items randomly selected from the pool of remaining items. After participants completed the questionnaires, we conducted individual semi-structured cognitive interviews using verbal probing. We asked participants to reflect on their answers for list items in detail. Participants’ answers were compared to concept maps defining the instrument concepts in detail. If at least two participants gave answers not matching concept map definitions, items were revised and re-tested with new target group participants. Results Cognitive debriefings started with HCAs. Based on the first round, we modified 4 of 58 ACT items, 1 ACT item stem and all 8 items of the RU tools. All items were understood by participants after another two rounds. We included revised HCA ACT items in the questionnaires for the other provider groups. In the RU tools for the other provider groups, we used different wording than the HCA version, as was done in the original English instruments. Only one cognitive debriefing round was needed with each of the other provider groups. Conclusion Cognitive debriefing is essential to detect and respond to problematic instrument items, particularly when translating instruments for heterogeneous, less well educated provider groups such as HCAs. Cognitive debriefing is an important step in research tool development and a vital component of establishing response process validity evidence. Publishing cognitive debriefing results helps researchers to determine potentially critical elements of the translated tools and assists with interpreting scores. PMID:24479645

  8. Friction Mapping as a Tool for Measuring the Elastohydrodynamic Contact Running-in Process

    DTIC Science & Technology

    2015-10-01

    Report ARL-TR-7501, US Army Research Laboratory, October 2015: Friction Mapping as a Tool for Measuring the Elastohydrodynamic Contact Running-in Process, by Stephen Berkebile. Final report covering 1 January–30 June 2015; no abstract text is available in this record.

  9. Magnetic resonance imaging-directed transperineal limited-mapping prostatic biopsies to diagnose prostate cancer: a Scottish experience.

    PubMed

    Mukherjee, Ankur; Morton, Simon; Fraser, Sioban; Salmond, Jonathan; Baxter, Grant; Leung, Hing Y

    2014-11-01

    Transperineal prostatic biopsy is firmly established as an important tool in the diagnosis of prostate cancer. The benefit of additional imaging (magnetic resonance imaging) to target biopsy remains to be fully addressed. Using a cohort of consecutive patients undergoing transperineal template mapping biopsies, we studied positive biopsies in the context of magnetic resonance imaging findings and examined the accuracy of magnetic resonance imaging in predicting the location of transperineal template mapping biopsies-detected prostate cancer. Forty-four patients (mean age: 65 years, range 53-78) underwent transperineal template mapping biopsies. Thirty-four patients had 1-2 and 10 patients had ≥3 previous transrectal ultrasound scan-guided biopsies. The mean prostate-specific antigen was 15 ng/mL (range 2.5-79 ng/mL). High-grade prostatic intraepithelial neoplasia was found in 12 (27%) patients and prostate cancer with Gleason <7, 7 and >7 in 13, 10 and 8 patients, respectively. Suspicious lesions on magnetic resonance imaging scans were scored from 1 to 5. In 28 patients, magnetic resonance imaging detected lesions with score ≥3. Magnetic resonance imaging correctly localised transperineal template mapping biopsies-detected prostate cancer in a hemi-gland approach, particularly in a right to left manner (79% positive prediction rate), but not in a quadrant approach (33% positive prediction rate). Our findings support the notion of magnetic resonance imaging-based selection of patients for transperineal template mapping biopsies and that lesions revealed by magnetic resonance imaging are likely useful for targeted biopsies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  10. Interactome Mapping Guided by Tissue-Specific Phosphorylation in Age-Related Macular Degeneration

    PubMed Central

    Sripathi, Srinivas R.; He, Weilue; Prigge, Cameron L.; Sylvester, O’Donnell; Um, Ji-Yeon; Powell, Folami L.; Neksumi, Musa; Bernstein, Paul S.; Choo, Dong-Won; Bartoli, Manuela; Gutsaeva, Diana R.; Jahng, Wan Jin

    2017-01-01

    The current study aims to determine the molecular mechanisms of age-related macular degeneration (AMD) using the phosphorylation network. Specifically, we examined novel biomarkers for oxidative stress by protein interaction mapping using in vitro and in vivo models that mimic the complex and progressive characteristics of AMD. We hypothesized that the early apoptotic reactions could be initiated by protein phosphorylation in region-dependent (peripheral retina vs. macular) and tissue-dependent (retinal pigment epithelium vs. retina) manner under chronic oxidative stress. The analysis of protein interactome and oxidative biomarkers showed the presence of tissue- and region-specific post-translational mechanisms that contribute to AMD progression and suggested new therapeutic targets that include ubiquitin, erythropoietin, vitronectin, MMP2, crystalline, nitric oxide, and prohibitin. Phosphorylation of specific target proteins in RPE cells is a central regulatory mechanism as a survival tool under chronic oxidative imbalance. The current interactome map demonstrates a positive correlation between oxidative stress-mediated phosphorylation and AMD progression and provides a basis for understanding oxidative stress-induced cytoskeletal changes and the mechanism of aggregate formation induced by protein phosphorylation. This information could provide an effective therapeutic approach to treat age-related neurodegeneration. PMID:28580316

  11. Interactome Mapping Guided by Tissue-Specific Phosphorylation in Age-Related Macular Degeneration.

    PubMed

    Sripathi, Srinivas R; He, Weilue; Prigge, Cameron L; Sylvester, O'Donnell; Um, Ji-Yeon; Powell, Folami L; Neksumi, Musa; Bernstein, Paul S; Choo, Dong-Won; Bartoli, Manuela; Gutsaeva, Diana R; Jahng, Wan Jin

    2017-02-01

    The current study aims to determine the molecular mechanisms of age-related macular degeneration (AMD) using the phosphorylation network. Specifically, we examined novel biomarkers for oxidative stress by protein interaction mapping using in vitro and in vivo models that mimic the complex and progressive characteristics of AMD. We hypothesized that the early apoptotic reactions could be initiated by protein phosphorylation in region-dependent (peripheral retina vs. macular) and tissue-dependent (retinal pigment epithelium vs. retina) manner under chronic oxidative stress. The analysis of protein interactome and oxidative biomarkers showed the presence of tissue- and region-specific post-translational mechanisms that contribute to AMD progression and suggested new therapeutic targets that include ubiquitin, erythropoietin, vitronectin, MMP2, crystalline, nitric oxide, and prohibitin. Phosphorylation of specific target proteins in RPE cells is a central regulatory mechanism as a survival tool under chronic oxidative imbalance. The current interactome map demonstrates a positive correlation between oxidative stress-mediated phosphorylation and AMD progression and provides a basis for understanding oxidative stress-induced cytoskeletal changes and the mechanism of aggregate formation induced by protein phosphorylation. This information could provide an effective therapeutic approach to treat age-related neurodegeneration.

  12. C-SPADE: a web-tool for interactive analysis and visualization of drug screening experiments through compound-specific bioactivity dendrograms

    PubMed Central

    Alam, Zaid; Peddinti, Gopal

    2017-01-01

    The advent of the polypharmacology paradigm in drug discovery calls for novel chemoinformatic tools for analyzing compounds’ multi-targeting activities. Such tools should provide an intuitive representation of the chemical space through capturing and visualizing underlying patterns of compound similarities linked to their polypharmacological effects. Most of the existing compound-centric chemoinformatics tools lack interactive options and user interfaces that are critical for the real-time needs of chemical biologists carrying out compound screening experiments. Toward that end, we introduce C-SPADE, an open-source exploratory web-tool for interactive analysis and visualization of drug profiling assays (biochemical, cell-based or cell-free) using compound-centric similarity clustering. C-SPADE allows the users to visually map the chemical diversity of a screening panel, explore investigational compounds in terms of their similarity to the screening panel, perform polypharmacological analyses and guide drug-target interaction predictions. C-SPADE requires only the raw drug profiling data as input, and it automatically retrieves the structural information and constructs the compound clusters in real-time, thereby reducing the time required for manual analysis in drug development or repurposing applications. The web-tool provides a customizable visual workspace that can either be downloaded as a figure or Newick tree file or shared as a hyperlink with other users. C-SPADE is freely available at http://cspade.fimm.fi/. PMID:28472495
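
    C-SPADE's central step — fingerprint the compounds, compute pairwise structural similarity, and cluster them into a compound-centric dendrogram — can be sketched in Python with RDKit and SciPy. The example compounds are arbitrary and this is not the tool's own implementation.

        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem
        from scipy.cluster.hierarchy import linkage, dendrogram
        from scipy.spatial.distance import squareform
        import numpy as np
        import matplotlib.pyplot as plt

        # Arbitrary example compounds (SMILES).
        smiles = {"aspirin": "CC(=O)Oc1ccccc1C(=O)O",
                  "caffeine": "CN1C=NC2=C1C(=O)N(C(=O)N2C)C",
                  "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O"}

        fps = {name: AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), 2, nBits=2048)
               for name, s in smiles.items()}

        names = list(fps)
        n = len(names)
        dist = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                sim = DataStructs.TanimotoSimilarity(fps[names[i]], fps[names[j]])
                dist[i, j] = dist[j, i] = 1.0 - sim  # Tanimoto distance

        # Hierarchical clustering and a compound-centric dendrogram.
        tree = linkage(squareform(dist), method="average")
        dendrogram(tree, labels=names)
        plt.tight_layout()
        plt.savefig("compound_dendrogram.png")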

  13. Exploiting rice-sorghum synteny for targeted development of EST-SSRs to enrich the sorghum genetic linkage map.

    PubMed

    Ramu, P; Kassahun, B; Senthilvel, S; Ashok Kumar, C; Jayashree, B; Folkertsma, R T; Reddy, L Ananda; Kuruvinashetti, M S; Haussmann, B I G; Hash, C T

    2009-11-01

    The sequencing and detailed comparative functional analysis of genomes of a number of select botanical models open new doors into comparative genomics among the angiosperms, with potential benefits for improvement of many orphan crops that feed large populations. In this study, a set of simple sequence repeat (SSR) markers was developed by mining the expressed sequence tag (EST) database of sorghum. Among the SSR-containing sequences, only those sharing considerable homology with rice genomic sequences across the lengths of the 12 rice chromosomes were selected. Thus, 600 SSR-containing sorghum EST sequences (50 homologous sequences on each of the 12 rice chromosomes) were selected, with the intention of providing coverage for corresponding homologous regions of the sorghum genome. Primer pairs were designed and polymorphism detection ability was assessed using parental pairs of two existing sorghum mapping populations. About 28% of these new markers detected polymorphism in this 4-entry panel. A subset of 55 polymorphic EST-derived SSR markers were mapped onto the existing skeleton map of a recombinant inbred population derived from cross N13 x E 36-1, which is segregating for Striga resistance and the stay-green component of terminal drought tolerance. These new EST-derived SSR markers mapped across all 10 sorghum linkage groups, mostly to regions expected based on prior knowledge of rice-sorghum synteny. The ESTs from which these markers were derived were then mapped in silico onto the aligned sorghum genome sequence, and 88% of the best hits corresponded to linkage-based positions. This study demonstrates the utility of comparative genomic information in targeted development of markers to fill gaps in linkage maps of related crop species for which sufficient genomic tools are not available.
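
    The marker-mining step described here — scanning EST sequences for simple sequence repeats (SSRs) before primer design — reduces to a repeat-finding pass over the sequences. A minimal Python sketch with invented sequences follows.

        import re

        # Invented EST sequences; real input would be parsed from a FASTA file.
        ests = {
            "EST001": "ATGGCGTATATATATATATATGGCCTTAG",
            "EST002": "TTGACCAGCAGCAGCAGCAGCAGTTTAAC",
        }

        def find_ssrs(seq):
            """Find di-/trinucleotide motifs repeated at least five times in a row."""
            hits = []
            for match in re.finditer(r"(([ACGT]{2,3})\2{4,})", seq):
                full, motif = match.group(1), match.group(2)
                hits.append((motif, len(full) // len(motif), match.start()))
            return hits

        for name, seq in ests.items():
            for motif, count, pos in find_ssrs(seq):
                print(f"{name}: ({motif})x{count} at position {pos}")
        # Candidate SSR loci would then be passed to a primer-design step (e.g. Primer3).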

  14. Mapping Inhibitory Neuronal Circuits by Laser Scanning Photostimulation

    PubMed Central

    Ikrar, Taruna; Olivas, Nicholas D.; Shi, Yulin; Xu, Xiangmin

    2011-01-01

    Inhibitory neurons are crucial to cortical function. They comprise about 20% of the entire cortical neuronal population and can be further subdivided into diverse subtypes based on their immunochemical, morphological, and physiological properties1-4. Although previous research has revealed much about intrinsic properties of individual types of inhibitory neurons, knowledge about their local circuit connections is still relatively limited3,5,6. Given that each individual neuron's function is shaped by its excitatory and inhibitory synaptic input within cortical circuits, we have been using laser scanning photostimulation (LSPS) to map local circuit connections to specific inhibitory cell types. Compared to conventional electrical stimulation or glutamate puff stimulation, LSPS has unique advantages allowing for extensive mapping and quantitative analysis of local functional inputs to individually recorded neurons3,7-9. Laser photostimulation via glutamate uncaging selectively activates neurons perisomatically, without activating axons of passage or distal dendrites, which ensures a sub-laminar mapping resolution. The sensitivity and efficiency of LSPS for mapping inputs from many stimulation sites over a large region are well suited for cortical circuit analysis. Here we introduce the technique of LSPS combined with whole-cell patch clamping for local inhibitory circuit mapping. Targeted recordings of specific inhibitory cell types are facilitated by use of transgenic mice expressing green fluorescent proteins (GFP) in limited inhibitory neuron populations in the cortex3,10, which enables consistent sampling of the targeted cell types and unambiguous identification of the cell types recorded. As for LSPS mapping, we outline the system instrumentation, describe the experimental procedure and data acquisition, and present examples of circuit mapping in mouse primary somatosensory cortex. As illustrated in our experiments, caged glutamate is activated in a spatially restricted region of the brain slice by UV laser photolysis; simultaneous voltage-clamp recordings allow detection of photostimulation-evoked synaptic responses. Maps of either excitatory or inhibitory synaptic input to the targeted neuron are generated by scanning the laser beam to stimulate hundreds of potential presynaptic sites. Thus, LSPS enables the construction of detailed maps of synaptic inputs impinging onto specific types of inhibitory neurons through repeated experiments. Taken together, the photostimulation-based technique offers neuroscientists a powerful tool for determining the functional organization of local cortical circuits. PMID:22006064

  15. Magrit: a new thematic cartography tool

    NASA Astrophysics Data System (ADS)

    Viry, Matthieu; Giraud, Timothée; Lambert, Nicolas

    2018-05-01

    The article provides an overview of the features of the Magrit web application, a free online thematic mapping tool with a strong pedagogical dimension that brings together all the elements necessary to produce a thematic map. The tool offers several simple modes of representation, such as proportional symbol maps and choropleth maps, as well as more complex modes such as smoothed maps and cartograms. Each map can be finished using layout and customization features (projection, scale, orientation, toponyms, etc.) and exported in vector format. Magrit is therefore a complete, lightweight and versatile tool, particularly well suited to teaching cartography at university level.
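
    Magrit runs entirely in the browser, but its choropleth mode corresponds conceptually to a few lines of geopandas; the tiny layer and attribute column below are placeholders and have nothing to do with Magrit's own code.

        import geopandas as gpd
        import matplotlib.pyplot as plt
        from shapely.geometry import box

        # Stand-in layer: two square "regions" with a numeric attribute to map.
        gdf = gpd.GeoDataFrame(
            {"name": ["Region A", "Region B"],
             "unemployment_rate": [4.2, 9.7]},
            geometry=[box(0, 0, 1, 1), box(1, 0, 2, 1)],
            crs="EPSG:4326",
        )

        ax = gdf.plot(column="unemployment_rate", cmap="OrRd", legend=True,
                      edgecolor="black", linewidth=0.5)
        ax.set_axis_off()
        ax.set_title("Choropleth of a numeric attribute by region")
        plt.savefig("choropleth.svg")  # vector export, as Magrit also provides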

  16. An open-water electrical geophysical tool for mapping sub-seafloor heavy placer minerals in 3D and migrating hydrocarbon plumes in 4D

    USGS Publications Warehouse

    Wynn, J.; Williamson, M.; Urquhart, S.; Fleming, J.

    2011-01-01

    A towed-streamer technology has been developed for mapping placer heavy minerals and dispersed hydrocarbon plumes in the open ocean. The approach uses induced polarization (IP), an electrical measurement that encompasses several different surface-reactive capacitive and electrochemical phenomena, and thus is ideally suited for mapping dispersed or disseminated targets. The application is operated at sea by towing active electrical geophysical streamers behind a ship; a wide area can be covered in three dimensions by folding tow-paths over each other in lawn-mower fashion. This technology has already been proven in laboratory and ocean settings to detect IP-reactive titanium- and rare-earth (REE) minerals such as ilmenite and monazite. By extension, minerals that weather and accumulate/concentrate by a similar mechanism, including gold, platinum, and diamonds, may be rapidly detected and mapped indirectly even when dispersed and covered with thick, inert sediment. IP is also highly reactive to metal structures such as pipelines and cables. © 2011 MTS.

  17. Invasive species management and research using GIS

    USGS Publications Warehouse

    Holcombe, Tracy R.; Stohlgren, Thomas J.; Jarnevich, Catherine S.

    2007-01-01

    Geographical Information Systems (GIS) are powerful tools in the field of invasive species management. GIS can be used to create potential distribution maps for all manner of taxa, including plants, animals, and diseases. GIS also performs well in the early detection and rapid assessment of invasive species. Here, we used GIS applications to investigate species richness and invasion patterns in fish in the United States (US) at the 6-digit Hydrologic Unit Code (HUC) level. We also created maps of potential spread of the cane toad (Bufo marinus) in the southeastern US at the 8-digit HUC level using regression and environmental envelope techniques. Equipped with this potential map, resource managers can target their field surveys to areas most vulnerable to invasion. Advances in GIS technology, maps, data, and many of these techniques can be found on websites such as the National Institute of Invasive Species Science (www.NIISS.org). Such websites provide a forum for data sharing and analysis that is an invaluable service to the invasive species community.
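
    The environmental envelope technique mentioned above can be illustrated simply: take the range of each environmental predictor at known presence locations and flag grid cells whose values fall inside every range. The data in this Python sketch are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic predictors on a grid: mean annual temperature (°C) and precipitation (mm).
        temp = rng.uniform(0, 30, size=(100, 100))
        precip = rng.uniform(200, 2000, size=(100, 100))

        # Synthetic presence records (row, col indices of occupied cells).
        presence = rng.integers(0, 100, size=(50, 2))
        pt = temp[presence[:, 0], presence[:, 1]]
        pp = precip[presence[:, 0], presence[:, 1]]

        # Envelope = min/max of each predictor over the presence points.
        suitable = ((temp >= pt.min()) & (temp <= pt.max()) &
                    (precip >= pp.min()) & (precip <= pp.max()))

        print(f"{suitable.mean():.0%} of cells fall inside the climatic envelope")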

  18. Urban local climate zone mapping and apply in urban environment study

    NASA Astrophysics Data System (ADS)

    He, Shan; Zhang, Yunwei; Zhang, Jili

    2018-02-01

    The local climate zone (LCZ) scheme is considered a powerful tool for urban climate mapping, but because LCZ classification methods and results differ between cities in different countries and regions, targeted studies are needed. In the current work, an LCZ mapping method is proposed that is convenient to operate and oriented toward city planning. The LCZ types were first adjusted to reflect the characteristics of Chinese cities, which have more tall buildings and higher building density. The remote-sensing-based classification method proposed by WUDAPT was then applied to the city of Xi'an as an example of LCZ mapping, and the zoning results were combined with the city road network so that they could be expressed in terms of land parcels, the basic unit used in city planning. The proposed method was validated against land-use and construction data surveyed in Xi'an, and the results indicate its feasibility for urban LCZ mapping in China.
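
    The WUDAPT-style workflow referred to above — label training areas for each LCZ class, then classify satellite pixels with a machine-learning classifier — can be sketched in Python with scikit-learn. The band values and class labels below are synthetic stand-ins for real Landsat training samples digitised from training polygons.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)

        # Synthetic training samples: 6 spectral bands per pixel, LCZ class labels 1-17.
        X = rng.normal(size=(3000, 6))
        y = rng.integers(1, 18, size=3000)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
        print("held-out accuracy:", clf.score(X_test, y_test))

        # Classify a whole image: reshape (rows, cols, bands) -> (pixels, bands) and back.
        image = rng.normal(size=(50, 50, 6))
        lcz_map = clf.predict(image.reshape(-1, 6)).reshape(50, 50)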

  19. Planetary Geologic Mapping Handbook - 2009

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well (e.g., Hare and others, 2009). Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically (e.g., Wilhelms, 1972, 1990; Tanaka and others, 1994). As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  20. Automated MRI Segmentation for Individualized Modeling of Current Flow in the Human Head

    PubMed Central

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-01-01

    Objective High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography (HD-EEG) require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images (MRI) requires labor-intensive manual segmentation, even when leveraging available automated segmentation tools. Also, accurate placement of many high-density electrodes on individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach A fully automated segmentation technique based on Statistical Parametric Mapping 8 (SPM8), including an improved tissue probability map (TPM) and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on 4 healthy subjects and 7 stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view (FOV) extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials. PMID:24099977

  1. Automated MRI segmentation for individualized modeling of current flow in the human head

    NASA Astrophysics Data System (ADS)

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-12-01

    Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance. Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.
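
    The segmentation "deviation" reported in these two records is typically quantified with an overlap metric; the Dice-coefficient sketch below, on synthetic masks, illustrates one such metric (the choice of metric here is an assumption for illustration, not necessarily the authors' exact criterion).

        import numpy as np

        def dice(mask_a, mask_b):
            """Dice overlap between two binary masks (1 = identical, 0 = disjoint)."""
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Synthetic example: a manual mask and an automated one shifted by one voxel.
        manual = np.zeros((64, 64, 64), dtype=bool)
        manual[20:40, 20:40, 20:40] = True
        auto = np.roll(manual, shift=1, axis=0)

        print(f"Dice overlap: {dice(manual, auto):.3f}")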

  2. The use of concept mapping in measurement development and evaluation: Application and future directions.

    PubMed

    Rosas, Scott R; Ridings, John W

    2017-02-01

    The past decade has seen an increase in measurement development research in social and health sciences that featured the use of concept mapping as a core technique. The purpose, application, and utility of concept mapping have varied across this emerging literature. Despite the variety of uses and range of outputs, little has been done to critically review how researchers have approached the application of concept mapping in the measurement development and evaluation process. This article focuses on a review of the current state of practice regarding the use of concept mapping as a methodological tool in this process. We systematically reviewed 23 scale or measure development and evaluation studies, and detail the application of concept mapping in the context of traditional measurement development and psychometric testing processes. Although several limitations surfaced, we found several strengths in the contemporary application of the method. We determined that concept mapping provides (a) a solid method for establishing content validity, (b) support for researcher decision-making, (c) insight into target population perspectives that are integrated a priori, and (d) a foundation for analytical and interpretative choices. Based on these results, we outline how concept mapping can be situated in the measurement development and evaluation processes for new instrumentation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Improving polio vaccination coverage in Nigeria through the use of geographic information system technology.

    PubMed

    Barau, Inuwa; Zubairu, Mahmud; Mwanza, Michael N; Seaman, Vincent Y

    2014-11-01

    Historically, microplanning for polio vaccination campaigns in Nigeria relied on inaccurate and incomplete hand-drawn maps, resulting in the exclusion of entire settlements and missed children. The goal of this work was to create accurate, coordinate-based maps for 8 polio-endemic states in northern Nigeria to improve microplanning and support tracking of vaccination teams, thereby enhancing coverage, supervision, and accountability. Settlement features were identified in the target states, using high-resolution satellite imagery. Field teams collected names and geocoordinates for each settlement feature, with the help of local guides. Global Positioning System (GPS) tracking of vaccination teams was conducted in selected areas and daily feedback provided to supervisors. Geographic information system (GIS)-based maps were created for 2238 wards in the 8 target states. The resulting microplans included all settlements and more-efficient team assignments, owing to the improved spatial reference. GPS tracking was conducted in 111 high-risk local government areas, resulting in improved team performance and the identification of missed/poorly covered settlements. Accurate and complete maps are a necessary part of an effective polio microplan, and tracking vaccinators gives supervisors a tool to ensure that all settlements are visited. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
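
    Checking whether a vaccination team's GPS trace actually reached each mapped settlement reduces to a nearest-distance test; the Python sketch below uses the haversine formula with made-up coordinates and a hypothetical 200 m "visited" threshold (not values from the study).

        import math

        def haversine_m(lat1, lon1, lat2, lon2):
            """Great-circle distance in metres between two WGS84 points."""
            r = 6371000.0
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
            a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
            return 2 * r * math.asin(math.sqrt(a))

        # Made-up settlement coordinates and one team's GPS trace (lat, lon).
        settlements = {"Settlement A": (10.3001, 9.8432), "Settlement B": (10.3120, 9.8605)}
        track = [(10.3000, 9.8430), (10.3010, 9.8445), (10.3025, 9.8470)]

        THRESHOLD_M = 200  # hypothetical "visited" radius
        for name, (slat, slon) in settlements.items():
            nearest = min(haversine_m(slat, slon, tlat, tlon) for tlat, tlon in track)
            status = "visited" if nearest <= THRESHOLD_M else "MISSED"
            print(f"{name}: nearest track point {nearest:.0f} m -> {status}")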

  4. NASA's Solar System Treks: Online Portals for Planetary Mapping and Modeling

    NASA Technical Reports Server (NTRS)

    Day, Brian

    2017-01-01

    NASA's Solar System Treks are a suite of web-based lunar and planetary mapping and modeling portals providing interactive visualization and analysis tools enabling mission planners, planetary scientists, students, and the general public to access mapped data products from past and current missions for the Moon, Mars, Vesta, and more. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look ahead to future features and releases. Moon Trek is a new portal replacing its predecessor, the Lunar Mapping and Modeling Portal (LMMP), that significantly upgrades and builds upon the capabilities of LMMP. It features greatly improved navigation, 3D visualization, fly-overs, performance, and reliability. Additional data products and tools continue to be added. These include both generalized products as well as polar data products specifically targeting potential sites for NASA's Resource Prospector mission as well as for missions being planned by NASA's international partners. The latest release of Mars Trek includes new tools and data products requested by NASA's Planetary Science Division to support site selection and analysis for Mars Human Landing Exploration Zone Sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. Phobos Trek, the latest effort in the Solar System Treks suite, is being developed in coordination with the International Phobos/Deimos Landing Site Working Group, with landing site selection and analysis for JAXA's MMX (Martian Moons eXploration) mission as a primary driver.

  5. NASA's Solar System Treks: Online Portals for Planetary Mapping and Modeling

    NASA Astrophysics Data System (ADS)

    Day, B. H.; Law, E.

    2017-12-01

    NASA's Solar System Treks are a suite of web-based lunar and planetary mapping and modeling portals providing interactive visualization and analysis tools enabling mission planners, planetary scientists, students, and the general public to access mapped data products from past and current missions for the Moon, Mars, Vesta, and more. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look ahead to future features and releases. Moon Trek is a new portal replacing its predecessor, the Lunar Mapping and Modeling Portal (LMMP), that significantly upgrades and builds upon the capabilities of LMMP. It features greatly improved navigation, 3D visualization, fly-overs, performance, and reliability. Additional data products and tools continue to be added. These include both generalized products as well as polar data products specifically targeting potential sites for NASA's Resource Prospector mission as well as for missions being planned by NASA's international partners. The latest release of Mars Trek includes new tools and data products requested by NASA's Planetary Science Division to support site selection and analysis for Mars Human Landing Exploration Zone Sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. Phobos Trek, the latest effort in the Solar System Treks suite, is being developed in coordination with the International Phobos/Deimos Landing Site Working Group, with landing site selection and analysis for JAXA's MMX mission as a primary driver.

  6. Carbon nanotubes allow capture of krypton, barium and lead for multichannel biological X-ray fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Serpell, Christopher J.; Rutte, Reida N.; Geraki, Kalotina; Pach, Elzbieta; Martincic, Markus; Kierkowicz, Magdalena; de Munari, Sonia; Wals, Kim; Raj, Ritu; Ballesteros, Belén; Tobias, Gerard; Anthony, Daniel C.; Davis, Benjamin G.

    2016-10-01

    The desire to study biology in situ has been aided by many imaging techniques. Among these, X-ray fluorescence (XRF) mapping permits observation of elemental distributions in a multichannel manner. However, XRF imaging is underused, in part, because of the difficulty in interpreting maps without an underlying cellular 'blueprint'; this could be supplied using contrast agents. Carbon nanotubes (CNTs) can be filled with a wide range of inorganic materials, and thus can be used as 'contrast agents' if biologically absent elements are encapsulated. Here we show that sealed single-walled CNTs filled with lead, barium and even krypton can be produced, and externally decorated with peptides to provide affinity for sub-cellular targets. The agents are able to highlight specific organelles in multiplexed XRF mapping, and are, in principle, a general and versatile tool for this and other modes of biological imaging.

  7. Drug targets in the cytokine universe for autoimmune disease.

    PubMed

    Liu, Xuebin; Fang, Lei; Guo, Taylor B; Mei, Hongkang; Zhang, Jingwu Z

    2013-03-01

    In autoimmune disease, a network of diverse cytokines is produced in association with disease susceptibility to constitute the 'cytokine milieu' that drives chronic inflammation. It remains elusive how cytokines interact in such a complex network to sustain inflammation in autoimmune disease. This has presented huge challenges for successful drug discovery because it has been difficult to predict how individual cytokine-targeted therapy would work. Here, we combine the principles of Chinese Taoism philosophy and modern bioinformatics tools to dissect multiple layers of arbitrary cytokine interactions into discernible interfaces and connectivity maps to predict movements in the cytokine network. The key principles presented here have important implications in our understanding of cytokine interactions and development of effective cytokine-targeted therapies for autoimmune disorders. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Identification, Characterization, and Epitope Mapping of Human Monoclonal Antibody J19 That Specifically Recognizes Activated Integrin α4β7*

    PubMed Central

    Qi, JunPeng; Zhang, Kun; Zhang, Qiao; Sun, Yi; Fu, Ting; Li, GuoHui; Chen, JianFeng

    2012-01-01

    Integrin α4β7 is a lymphocyte homing receptor that mediates both rolling and firm adhesion of lymphocytes on vascular endothelium, two of the critical steps in lymphocyte migration and tissue-specific homing. The rolling and firm adhesions of lymphocytes rely on the dynamic shift between the inactive and active states of integrin α4β7, which is associated with the conformational rearrangement of integrin molecules. Activation-specific antibodies, which specifically recognize the activated integrins, have been used as powerful tools in integrin studies, whereas there is no well characterized activation-specific antibody to integrin α4β7. Here, we report the identification, characterization, and epitope mapping of an activation-specific human mAb J19 against integrin α4β7. J19 was discovered by screening a human single-chain variable fragment phage library using an activated α4β7 mutant as target. J19 IgG specifically bound to the high affinity α4β7 induced by Mn2+, DTT, ADP, or CXCL12, but not to the low affinity integrin. Moreover, J19 IgG did not interfere with α4β7-MAdCAM-1 interaction. The epitope of J19 IgG was mapped to Ser-331, Ala-332, and Ala-333 of β7 I domain and a seven-residue segment from 184 to 190 of α4 β-propeller domain, which are buried in low affinity integrin with bent conformation and only exposed in the high affinity extended conformation. Taken together, J19 is a potentially powerful tool for both studies on α4β7 activation mechanism and development of novel therapeutics targeting the activated lymphocyte expressing high affinity α4β7. PMID:22418441

  9. Optical Imaging and Radiometric Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.

    2010-01-01

    OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented using MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digitally sampled versions of functions ranging from Zernike polynomials to combinations of sinusoidal wave functions in 2D, to functions generated from a spatial frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge diffusion modulation transfer function (MTF).
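
    The Fourier-optics approach used by such tools rests on the standard relation that the PSF is proportional to the squared magnitude of the Fourier transform of the complex exit-pupil function. The Python sketch below applies this to a circular pupil with a small defocus-like wavefront error; the grid size and aberration values are illustrative, not OPTOOL or JWST parameters.

        import numpy as np

        N = 512                       # grid size (illustrative)
        x = np.linspace(-1, 1, N)
        X, Y = np.meshgrid(x, x)
        R = np.hypot(X, Y)

        pupil_amp = (R <= 0.5).astype(float)           # circular aperture
        wfe_waves = 0.05 * (2 * R**2 - 1) * pupil_amp  # small defocus-like OPD, in waves
        pupil = pupil_amp * np.exp(2j * np.pi * wfe_waves)

        # PSF is proportional to the squared magnitude of the pupil's Fourier transform.
        field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
        psf = np.abs(field) ** 2
        psf /= psf.sum()              # normalise to unit energy

        print("peak of the normalised PSF:", psf.max())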

  10. The VIMS Data Explorer: A tool for locating and visualizing hyperspectral data

    NASA Astrophysics Data System (ADS)

    Pasek, V. D.; Lytle, D. M.; Brown, R. H.

    2016-12-01

    Since the Cassini spacecraft successfully entered Saturn's orbit in the summer of 2004, over 300,000 hyperspectral data cubes have been returned from the visible and infrared mapping spectrometer (VIMS) instrument onboard. The VIMS Science Investigation is a multidisciplinary effort that uses these hyperspectral data to study a variety of scientific problems, including surface characterizations of the icy satellites and atmospheric analyses of Titan and Saturn. Such investigations may need to identify thousands of exemplary data cubes for analysis and can span many years in scope. Here we describe the VIMS data explorer (VDE) application, currently employed by the VIMS Investigation to search for and visualize data. The VDE application facilitates real-time inspection of the entire VIMS hyperspectral dataset, the construction of in situ maps, and markers to save and recall work. The application relies on two databases to provide comprehensive search capabilities. The first database contains metadata for every cube. These metadata searches are used to identify records based on parameters such as target, observation name, or date taken, but they fall short for some investigations because the cube metadata contain no target geometry information. Through the introduction of a post-calibration pixel database, the VDE tool enables users to greatly expand their searching capabilities. Users can select favorable cubes for further processing into 2-D and 3-D interactive maps, aiding in the data interpretation and selection process. The VDE application enables efficient search, visualization, and access to VIMS hyperspectral data. It is simple to use, requiring nothing more than a browser for access. Hyperspectral bands can be individually selected or combined to create real-time color images, a technique commonly employed by hyperspectral researchers to highlight compositional differences.

  11. King County Nearshore Habitat Mapping Data Report: Picnic Point to Shilshole Bay Marina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodruff, Dana L.; Farley, Paul J.; Borde, Amy B.

    2000-12-31

    The objective of this study is to provide accurate, georeferenced maps of benthic habitats to assist in the siting of a new wastewater treatment plant outfall and the assessment of habitats of endangered, threatened, and economically important species. The mapping was conducted in the fall of 1999 using two complementary techniques: side-scan sonar and underwater videography. Products derived from these techniques include geographic information system (GIS) compatible polygon data of substrate type and vegetation cover, including eelgrass and kelp. Additional GIS overlays include underwater video track line data of total macroalgae, selected macroalgal species, fish, and macroinvertebrates. The combination of geo-referenced side-scan sonar and underwater video is a powerful technique for assessing and mapping nearshore habitat in Puget Sound. Side-scan sonar offers the ability to map eelgrass with high spatial accuracy and resolution, and provides information on patch size, shape, and coverage. It also provides information on substrate change and the location of specific targets (e.g., piers, docks, pilings, large boulders, debris piles). The addition of underwater video is a complementary tool providing both groundtruthing for the sonar and additional information on macro fauna and flora. As a groundtruthing technique, the video was able to confirm differences between substrate types, as well as detect subtle spatial changes in substrate. It also verified information related to eelgrass, including the density classification categories and the type of substrate associated with eelgrass, which could not be determined easily with side-scan sonar. Video is also a powerful tool for mapping the location of macroalgae (including kelp and Ulva), fish, and macroinvertebrates. The ability to geo-locate these resources in their functional habitat provides an added layer of information and analytical potential.

  12. Visualizing and communicating climate change using the ClimateWizard: decision support and education through web-based analysis and mapping

    NASA Astrophysics Data System (ADS)

    Girvetz, E. H.; Zganjar, C.; Raber, G. T.; Maurer, E. P.; Duffy, P.

    2009-12-01

    Virtually all fields of study and parts of society—from ecological science and nature conservation, to global development, multinational corporations, and government bodies—need to know how climate change has impacted and may impact specific locations of interest. Our ability to respond to climate change depends on having convenient tools that make past and projected climate trends available to planners, managers, scientists, and the general public at scales ranging from global to local. Web-mapping applications provide an effective platform for communicating climate change impacts in specific geographic areas of interest to the public. Here, we present one such application, the ClimateWizard, which allows users to analyze, visualize and explore climate change maps for specific geographic areas of interest throughout the world (http://ClimateWizard.org). Built on Web 2.0 web services (SOAP), Google Maps mash-ups, and cloud computing technologies, the ClimateWizard analyzes large databases of climate information located on remote servers to create synthesized information and useful products tailored to geographic areas of interest (e.g. maps, graphs, tables, GIS layers). We demonstrate how the ClimateWizard can be used to assess projected changes to temperature and precipitation across all states in the contiguous United States and all countries of the world using statistically downscaled general circulation models from the CMIP3 dataset. We then show how ClimateWizard can be used to analyze changes to other climate-related variables, such as moisture stress and water production. Finally, we discuss how this tool can be adapted to develop a wide range of web-based tools that are targeted at informing specific audiences—from scientific research and natural resource management, to K-12 and higher education—about how climate change may affect different aspects of human and natural systems.
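
    A minimal sketch of the core grid arithmetic behind a projected-change map of the kind described above, using synthetic placeholder arrays rather than the downscaled CMIP3 data; the region-of-interest bounding box and all values are invented for illustration.

    ```python
    # Sketch of the arithmetic behind a projected-change map:
    # mean(future period) - mean(historical period) on a lat/lon grid.
    # Synthetic placeholder data, not the ClimateWizard datasets.
    import numpy as np

    nlat, nlon, nyears = 90, 180, 30
    rng = np.random.default_rng(0)
    hist_annual_t = 14.0 + rng.normal(0.0, 0.5, size=(nyears, nlat, nlon))  # deg C
    fut_annual_t = 15.5 + rng.normal(0.0, 0.5, size=(nyears, nlat, nlon))   # deg C

    change_map = fut_annual_t.mean(axis=0) - hist_annual_t.mean(axis=0)     # deg C

    # Summarise over a rectangular region of interest (e.g., a country bounding box)
    lat = np.linspace(-89, 89, nlat)
    lon = np.linspace(-179, 179, nlon)
    roi = (lat[:, None] > 30) & (lat[:, None] < 50) & \
          (lon[None, :] > -125) & (lon[None, :] < -65)
    print("Mean projected warming over ROI: %.2f C" % change_map[roi].mean())
    ```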

  13. Implementation of structure-mapping inference by event-file binding and action planning: a model of tool-improvisation analogies.

    PubMed

    Fields, Chris

    2011-03-01

    Structure-mapping inferences are generally regarded as dependent upon relational concepts that are understood and expressible in language by subjects capable of analogical reasoning. However, tool-improvisation inferences are executed by members of a variety of non-human primate and other species. Tool improvisation requires correctly inferring the motion and force-transfer affordances of an object; hence tool improvisation requires structure mapping driven by relational properties. Observational and experimental evidence can be interpreted to indicate that structure-mapping analogies in tool improvisation are implemented by multi-step manipulation of event files by binding and action-planning mechanisms that act in a language-independent manner. A functional model of language-independent event-file manipulations that implement structure mapping in the tool-improvisation domain is developed. This model provides a mechanism by which motion and force representations commonly employed in tool-improvisation structure mappings may be sufficiently reinforced to be available to inwardly directed attention and hence conceptualization. Predictions and potential experimental tests of this model are outlined.

  14. Breaking Barriers and Building Bridges: Using EJ SCREEN ...

    EPA Pesticide Factsheets

    Communities across the United States are faced with concerns about environmental risks and exposures, including air contaminants near roadways, proximity to hazardous waste sites, and children's environmental health. These concerns are compounded by complicated data, limited opportunities for collaboration, and resource-based restrictions such as funding. This workshop will introduce innovative approaches for combining the capacity of EPA science tools - EJ SCREEN and the recently released Community Focused Exposure and Risk Screening Tool (C-FERST). Following a nationally applicable case study, participants will learn how these tools can be used sequentially to (1) identify community environmental health 'hotspots'; (2) take a closer look at local-scale sources of exposure; and (3) use new features of the tool to target potential partners and resources across the country. By exploring the power of GIS mapping and crowdsourced data, participants will leave with simple, user-defined approaches for using state-of-the-science tools to advance their community and environmental health projects. Presentation using EJ SCREEN and C-FERST.

  15. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  16. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
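
    As a minimal illustration of the read-counting step that follows alignment in a small-RNA workflow of this kind (this is not eRNA's own code), the sketch below assumes a tab-delimited file with one aligned read per line, where the second column names the miRNA the read aligned to; the file name is hypothetical.

    ```python
    # Minimal read-counting sketch for a small-RNA workflow.
    # Assumes a tab-delimited file: read_id<TAB>mirna_name (one aligned read per line).
    # Illustrative only; not eRNA's implementation.
    import csv
    from collections import Counter

    def count_mirna_reads(alignment_tsv):
        counts = Counter()
        with open(alignment_tsv, newline="") as handle:
            for row in csv.reader(handle, delimiter="\t"):
                counts[row[1]] += 1          # column 1 holds the miRNA name
        return counts

    if __name__ == "__main__":
        counts = count_mirna_reads("aligned_reads.tsv")   # hypothetical input file
        for mirna, n in counts.most_common(10):
            print(mirna, n)
    ```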

  17. Decision-level fusion of SAR and IR sensor information for automatic target detection

    NASA Astrophysics Data System (ADS)

    Cho, Young-Rae; Yim, Sung-Hyuk; Cho, Hyun-Woong; Won, Jin-Ju; Song, Woo-Jin; Kim, So-Hyeon

    2017-05-01

    We propose a decision-level architecture that combines synthetic aperture radar (SAR) and an infrared (IR) sensor for automatic target detection. We present a new size-based feature, called the target silhouette, to reduce the number of false alarms produced by the conventional target-detection algorithm. Boolean Map Visual Theory is used to combine a pair of SAR and IR images to generate the target-enhanced map. Basic belief assignment is then used to transform this map into a belief map. The detection results of the sensors are combined to build the target-silhouette map. We integrate the fusion mass and the target-silhouette map at the decision level to exclude false alarms. The proposed algorithm is evaluated using a SAR and IR synthetic database generated by the SE-WORKBENCH simulator and compared with conventional algorithms. The proposed fusion scheme achieves a higher detection rate and a lower false alarm rate than the conventional algorithms.
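
    A simplified sketch of decision-level fusion of two detectors using basic belief assignments and Dempster's rule over the hypotheses target/clutter/unknown; the mass values and the decision threshold are invented, and this is not the exact fusion scheme of the paper.

    ```python
    # Simplified decision-level fusion of a SAR detector and an IR detector using
    # basic belief assignments and Dempster's rule over {target, clutter, unknown}.
    # Masses and the final threshold are invented for illustration.
    def dempster_combine(m1, m2):
        """m1, m2: dicts with keys 'T' (target), 'C' (clutter), 'U' (unknown)."""
        conflict = m1['T'] * m2['C'] + m1['C'] * m2['T']
        norm = 1.0 - conflict
        fused = {
            'T': (m1['T']*m2['T'] + m1['T']*m2['U'] + m1['U']*m2['T']) / norm,
            'C': (m1['C']*m2['C'] + m1['C']*m2['U'] + m1['U']*m2['C']) / norm,
        }
        fused['U'] = 1.0 - fused['T'] - fused['C']
        return fused

    # Example: SAR is fairly confident, IR is weakly supportive.
    sar_bba = {'T': 0.7, 'C': 0.1, 'U': 0.2}
    ir_bba = {'T': 0.5, 'C': 0.2, 'U': 0.3}
    fused = dempster_combine(sar_bba, ir_bba)
    print(fused, "-> declare target" if fused['T'] > 0.6 else "-> reject")
    ```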

  18. Combined ultrasound and MR imaging to guide focused ultrasound therapies in the brain

    NASA Astrophysics Data System (ADS)

    Arvanitis, Costas D.; Livingstone, Margaret S.; McDannold, Nathan

    2013-07-01

    Several emerging therapies with potential for use in the brain harness effects produced by acoustic cavitation—the interaction between ultrasound and microbubbles either generated during sonication or introduced into the vasculature. Systems developed for transcranial MRI-guided focused ultrasound (MRgFUS) thermal ablation can enable their clinical translation, but methods for real-time monitoring and control are currently lacking. Acoustic emissions produced during sonication can provide information about the location, strength and type of the microbubble oscillations within the ultrasound field, and they can be mapped in real time using passive imaging approaches. Here, we tested whether such mapping can be achieved transcranially within a clinical brain MRgFUS system. We integrated an ultrasound imaging array into the hemisphere transducer of the MRgFUS device. Passive cavitation maps were obtained during sonications combined with a circulating microbubble agent at 20 targets in the cingulate cortex in three macaques. The maps were compared with MRI-evident tissue effects. The system successfully mapped microbubble activity during both stable and inertial cavitation, which was correlated with MRI-evident transient blood-brain barrier disruption and vascular damage, respectively. The location of this activity was coincident with the resulting tissue changes within the expected resolution limits of the system. While preliminary, these data clearly demonstrate, for the first time, that it is possible to construct maps of stable and inertial cavitation transcranially, in a large animal model, and under clinically relevant conditions. Further, these results suggest that this hybrid ultrasound/MRI approach can provide comprehensive guidance for targeted drug delivery via blood-brain barrier disruption and other emerging ultrasound treatments, facilitating their clinical translation. We anticipate that it will also prove to be an important research tool that will further the development of a broad range of microbubble-enhanced therapies.

  19. Advances in Maize Genomics and Their Value for Enhancing Genetic Gains from Breeding

    PubMed Central

    Xu, Yunbi; Skinner, Debra J.; Wu, Huixia; Palacios-Rojas, Natalia; Araus, Jose Luis; Yan, Jianbing; Gao, Shibin; Warburton, Marilyn L.; Crouch, Jonathan H.

    2009-01-01

    Maize is an important crop for food, feed, forage, and fuel across tropical and temperate areas of the world. Diversity studies at genetic, molecular, and functional levels have revealed that tropical maize germplasm, landraces, and wild relatives harbor a significantly wider range of genetic variation. Among all types of markers, SNP markers are increasingly the marker-of-choice for all genomics applications in maize breeding. Genetic mapping has been developed through conventional linkage mapping and, more recently, through linkage disequilibrium-based association analyses. Maize genome sequencing, initially focused on gene-rich regions, now aims for the availability of a complete genome sequence. Conventional insertion mutation-based cloning has been complemented recently by EST- and map-based cloning. Transgenics and nutritional genomics are rapidly advancing fields targeting important agronomic traits, including pest resistance and grain quality. Substantial advances have been made in methodologies for genomics-assisted breeding, enhancing progress in yield as well as abiotic and biotic stress resistances. Various genomic databases and informatics tools have been developed, among which MaizeGDB is the most developed and widely used by the maize research community. In the future, more emphasis should be given to the development of tools and strategic germplasm resources for more effective molecular breeding of tropical maize products. PMID:19688107

  20. Space moving target detection using time domain feature

    NASA Astrophysics Data System (ADS)

    Wang, Min; Chen, Jin-yong; Gao, Feng; Zhao, Jin-yu

    2018-01-01

    Traditional space target detection methods mainly use the spatial characteristics of the star map to detect targets, and therefore cannot make full use of time-domain information. This paper presents a new space moving target detection method based on time-domain features. We first construct the time spectral data of the star map, then analyze the time-domain features of the main objects (target, stars, and background) in star maps, and finally detect moving targets using the single-pulse feature of the time-domain signal. Experimental results on real star maps show that the proposed method can effectively detect the trajectory of moving targets in the star map sequence, and the detection probability reaches 99% when the false alarm rate is about 8×10⁻⁵, outperforming the compared algorithms.
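
    A toy sketch of the time-domain idea: stars stay bright in every frame of the sequence, while a moving target crosses a given pixel only briefly, producing a single pulse in that pixel's time series. The synthetic data and thresholds below are invented for illustration and do not reproduce the paper's detector.

    ```python
    # Toy sketch: flag pixels whose time series contains exactly one pulse-like
    # excursion above a robust per-pixel background estimate. Synthetic data;
    # thresholds are invented and do not reproduce the paper's detector.
    import numpy as np

    rng = np.random.default_rng(1)
    n_frames, h, w = 50, 64, 64
    stack = rng.normal(100.0, 3.0, size=(n_frames, h, w))   # background + noise
    stack[:, 20, 20] += 500.0      # a star: bright in every frame
    stack[25, 40, 10] += 400.0     # moving target: one-frame pulse at this pixel

    med = np.median(stack, axis=0)                  # per-pixel background level
    mad = np.median(np.abs(stack - med), axis=0)    # robust per-pixel spread
    hits = stack > (med + 8.0 * 1.4826 * mad)       # per-frame exceedances

    n_hits = hits.sum(axis=0)
    pulse_pixels = np.argwhere(n_hits == 1)         # exactly one exceedance = pulse-like
    print("candidate moving-target pixels:", pulse_pixels.tolist())
    ```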

  1. Linking rare and common disease: mapping clinical disease-phenotypes to ontologies in therapeutic target validation.

    PubMed

    Sarntivijai, Sirarat; Vasant, Drashtti; Jupp, Simon; Saunders, Gary; Bento, A Patrícia; Gonzalez, Daniel; Betts, Joanna; Hasan, Samiul; Koscielny, Gautier; Dunham, Ian; Parkinson, Helen; Malone, James

    2016-01-01

    The Centre for Therapeutic Target Validation (CTTV - https://www.targetvalidation.org/) was established to generate therapeutic target evidence from genome-scale experiments and analyses. CTTV aims to support the validity of therapeutic targets by integrating existing and newly generated data. Data integration has been achieved in some resources by mapping metadata such as disease and phenotypes to the Experimental Factor Ontology (EFO). Additionally, the relationship between ontology descriptions of rare and common diseases and their phenotypes can offer insights into shared biological mechanisms and potential drug targets. Ontologies are not ideal for representing the 'sometimes associated' type of relationship required. This work addresses two challenges: annotation of diverse big data, and representation of complex, sometimes associated relationships between concepts. Semantic mapping uses a combination of custom scripting, our annotation tool 'Zooma', and expert curation. Disease-phenotype associations were generated using literature mining on Europe PubMed Central abstracts and were manually verified by experts for validity. Representation of the disease-phenotype association was achieved by the Ontology of Biomedical AssociatioN (OBAN), a generic association representation model. OBAN represents associations between a subject and an object, i.e., a disease and its associated phenotypes, together with the source of evidence for that association. Indirect disease-to-disease associations are exposed through shared phenotypes. This was applied to the use case of linking rare to common diseases at the CTTV. EFO yields an average of over 80% mapping coverage in all data sources. A 42% precision is obtained from the manual verification of the text-mined disease-phenotype associations. This results in 1452 and 2810 disease-phenotype pairs for IBD and autoimmune disease, respectively, and contributes towards 11,338 rare disease associations (merged with existing published work [Am J Hum Genet 97:111-24, 2015]). An OBAN result file is downloadable at http://sourceforge.net/p/efo/code/HEAD/tree/trunk/src/efoassociations/. Twenty common diseases are linked to 85 rare diseases by shared phenotypes. A generalizable OBAN model for association representation is presented in this study. Here we present solutions to large-scale annotation-ontology mapping in the CTTV knowledge base, a process for disease-phenotype mining, and propose a generic association model, 'OBAN', as a means to integrate diseases using shared phenotypes. EFO is released monthly and is available for download at http://www.ebi.ac.uk/efo/.

  2. Earthdata User Interface Patterns: Building Usable Web Interfaces Through a Shared UI Pattern Library

    NASA Astrophysics Data System (ADS)

    Siarto, J.

    2014-12-01

    As more Earth science software tools and services move to the web, the design and usability of those tools become ever more important. A good user interface is now expected, and users are increasingly intolerant of websites and web applications that work against them. The Earthdata UI Pattern Library attempts to give these scientists and developers the design tools they need to make usable, compelling user interfaces without the associated overhead of using a full design team. The patterns are tested, functional user interface elements targeted specifically at the Earth science community, and will include web layouts, buttons, tables, typography, iconography, mapping, and visualization/graphing widgets. These UI elements have emerged as the result of extensive user testing, research, and software development within the NASA Earthdata team over the past year.

  3. Conservation plan based on the concept of integrity

    NASA Astrophysics Data System (ADS)

    Yen, Y. N.; Cheng, C. F.

    2015-08-01

    The value-based concept has been accepted as a universal principle for the conservation of Cultural Heritage. Authenticity and integrity are the two main issues in protecting those values. Authenticity is the major tool in value assessment, and integrity plays an important role in the procedure of the conservation plan. From the perspective of integrity, this research explores the principles of the conservation plan and discusses its relation to the restoration plan and the urban plan. A conservation plan for Quing-Lin village, Kinmen, is taken as an example of implementation. The research shows that a conservation plan with integrity in mind helps to clarify the conservation target areas and their buffer zones. It also serves as a tool for developing control and risk management. Cultural mapping is an efficient tool for communication with stakeholders in the process of the conservation plan.

  4. Knowledge maps: a tool for online assessment with automated feedback.

    PubMed

    Ho, Veronica W; Harris, Peter G; Kumar, Rakesh K; Velan, Gary M

    2018-12-01

    In higher education, most assessments or examinations comprise either multiple-choice items or open-ended questions such as modified essay questions (MEQs). Online concept and knowledge maps are potential tools for assessment, which might emphasize meaningful, integrated understanding of phenomena. We developed an online knowledge-mapping assessment tool, which provides automated feedback on student-submitted maps. We conducted a pilot study to investigate the potential utility of online knowledge mapping as a tool for automated assessment by comparing the scores generated by the software with manual grading of a MEQ on the same topic for a cohort of first-year medical students. In addition, an online questionnaire was used to gather students' perceptions of the tool. Map items were highly discriminating between students of differing knowledge of the topic overall. Regression analysis showed a significant correlation between map scores and MEQ scores, and responses to the questionnaire regarding use of knowledge maps for assessment were overwhelmingly positive. These results suggest that knowledge maps provide a similar indication of students' understanding of a topic as a MEQ, with the advantage of instant, consistent computer grading and time savings for educators. Online concept and knowledge maps could be a useful addition to the assessment repertoire in higher education.
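
    One simple way such automated feedback could be computed (illustrative only, not the scoring algorithm used by the tool in this study) is to compare the student's propositions, expressed as concept-link-concept triples, against a reference map and report matched, missing, and extra propositions.

    ```python
    # Simple auto-scoring sketch: compare a student's propositions
    # (concept -> link -> concept triples) against a reference map.
    # Illustrative only; not the study's scoring algorithm.
    def score_map(student_props, reference_props):
        student, reference = set(student_props), set(reference_props)
        matched = student & reference
        score = len(matched) / len(reference) if reference else 0.0
        return {
            "matched": sorted(matched),
            "missing": sorted(reference - student),   # feedback: what to add
            "extra": sorted(student - reference),     # feedback: what to reconsider
            "score": round(score, 2),
        }

    reference = {("hypoxia", "induces", "HIF-1"), ("HIF-1", "upregulates", "EPO")}
    student = {("hypoxia", "induces", "HIF-1"), ("EPO", "causes", "polycythemia")}
    print(score_map(student, reference))
    ```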

  5. Real-time MRI guidance of cardiac interventions.

    PubMed

    Campbell-Washburn, Adrienne E; Tavallaei, Mohammad A; Pop, Mihaela; Grant, Elena K; Chubb, Henry; Rhode, Kawal; Wright, Graham A

    2017-10-01

    Cardiac magnetic resonance imaging (MRI) is appealing to guide complex cardiac procedures because it is ionizing radiation-free and offers flexible soft-tissue contrast. Interventional cardiac MR promises to improve existing procedures and enable new ones for complex arrhythmias, as well as congenital and structural heart disease. Guiding invasive procedures demands faster image acquisition, reconstruction and analysis, as well as intuitive intraprocedural display of imaging data. Standard cardiac MR techniques such as 3D anatomical imaging, cardiac function and flow, parameter mapping, and late-gadolinium enhancement can be used to gather valuable clinical data at various procedural stages. Rapid intraprocedural image analysis can extract and highlight critical information about interventional targets and outcomes. In some cases, real-time interactive imaging is used to provide a continuous stream of images displayed to interventionalists for dynamic device navigation. Alternatively, devices are navigated relative to a roadmap of major cardiac structures generated through fast segmentation and registration. Interventional devices can be visualized and tracked throughout a procedure with specialized imaging methods. In a clinical setting, advanced imaging must be integrated with other clinical tools and patient data. In order to perform these complex procedures, interventional cardiac MR relies on customized equipment, such as interactive imaging environments, in-room image display, audio communication, hemodynamic monitoring and recording systems, and electroanatomical mapping and ablation systems. Operating in this sophisticated environment requires coordination and planning. This review provides an overview of the imaging technology used in MRI-guided cardiac interventions. Specifically, this review outlines clinical targets, standard image acquisition and analysis tools, and the integration of these tools into clinical workflow. Level of Evidence: 1. Technical Efficacy: Stage 5. J. Magn. Reson. Imaging 2017;46:935-950. © 2017 International Society for Magnetic Resonance in Medicine.

  6. Combining Machine Learning Systems and Multiple Docking Simulation Packages to Improve Docking Prediction Reliability for Network Pharmacology

    PubMed Central

    Hsin, Kun-Yi; Ghosh, Samik; Kitano, Hiroaki

    2013-01-01

    Increased availability of bioinformatics resources is creating opportunities for the application of network pharmacology to predict drug effects and toxicity resulting from multi-target interactions. Here we present a high-precision computational prediction approach that combines two elaborately built machine learning systems and multiple molecular docking tools to assess binding potentials of a test compound against proteins involved in a complex molecular network. One of the two machine learning systems is a re-scoring function to evaluate binding modes generated by docking tools. The second is a binding mode selection function to identify the most predictive binding mode. Results from a series of benchmark validations and a case study show that this approach surpasses the prediction reliability of other techniques and that it also identifies either primary or off-targets of kinase inhibitors. Integrating this approach with molecular network maps makes it possible to address drug safety issues by comprehensively investigating network-dependent effects of a drug or drug candidate. PMID:24391846

  7. Technological advances in the surgical treatment of movement disorders.

    PubMed

    Gross, Robert E; McDougal, Margaret E

    2013-08-01

    Technological innovations have driven the advancement of the surgical treatment of movement disorders, from the invention of the stereotactic frame to the adaptation of deep brain stimulation (DBS). Along these lines, this review will describe recent advances in inserting neuromodulation modalities, including DBS, to the target, and in the delivery of therapy at the target. Recent radiological advances are altering the way that DBS leads are targeted and inserted, by refining the ability to visualize the subcortical targets using high-field strength magnetic resonance imaging and other innovations, such as diffusion tensor imaging, and the development of novel targeting devices enabling purely anatomical implantations without the need for neurophysiological monitoring. New portable computed tomography scanners are also facilitating lead implantation without monitoring, as well as improving radiological verification of DBS lead location. Advances in neurophysiological mapping include efforts to develop automatic target verification algorithms, and probabilistic maps to guide target selection. The delivery of therapy at the target is being improved by the development of the next generation of internal pulse generators (IPGs). These include constant current devices that mitigate the variability introduced by impedance changes of the stimulated tissue and, in the near future, devices that deliver novel stimulation patterns with improved efficiency. Closed-loop adaptive IPGs are being tested, which may tailor stimulation to ongoing changes in the nervous system, reflected in biomarkers continuously recorded by the devices. Finer-grained DBS leads, in conjunction with new IPGs and advanced programming tools, may offer improved outcomes via current steering algorithms. Finally, even thermocoagulation (essentially replaced by DBS) is being advanced by new minimally-invasive approaches that may improve this therapy for selected patients in whom it may be preferred. Functional neurosurgery has a history of being driven by technological innovation, a tradition that continues into its future.

  8. Using environmental public health tracking to identify community targets for public health actions in childhood lead poisoning in Wisconsin.

    PubMed

    Berney, Dawn; Camponeschi, Jenny; Coons, Marjorie; Creswell, Paul D; Schirmer, Joe; Walsh, Reghan

    2015-01-01

    In an effort to improve the ability of local public health departments to target resources to the highest-need regions, the Wisconsin Environmental Public Health Tracking (WI EPHT) Program worked to enhance its public portal to benefit the Wisconsin Childhood Lead Poisoning Prevention Program (WCLPPP) and other programs. The WI EPHT Program conducted this enhancement in collaboration with WCLPPP. The WI EPHT enhanced public portal is the next phase of Wisconsin's ongoing efforts in environmental public health tracking. As part of this process, the new mapping application provides information on childhood lead testing and results at county and census tract levels in Wisconsin. The WI EPHT Program will update its public portal to have the capability to map data at a subcounty level (ie, census tract or zip code) for some data topics when such data are available. This tool is available to local public health departments and other public health organizations throughout Wisconsin as a resource to identify communities most affected by the Centers for Disease Control and Prevention's new guidelines with regard to childhood lead poisoning. The collaboration between WI EPHT and WCLPPP on updating and enhancing the portal exemplifies the power of environmental health data to inform a more accurate understanding of public health problems.

  9. Internet protocol network mapper

    DOEpatents

    Youd, David W.; Colon III, Domingo R.; Seidl, Edward T.

    2016-02-23

    A network mapper for performing tasks on targets is provided. The mapper generates a map of a network that specifies the overall configuration of the network. The mapper inputs a procedure that defines how the network is to be mapped. The procedure specifies what, when, and in what order the tasks are to be performed. Each task specifies processing that is to be performed for a target to produce results. The procedure may also specify input parameters for a task. The mapper inputs initial targets that specify a range of network addresses to be mapped. The mapper maps the network by, for each target, executing the procedure to perform the tasks on the target. The results of the tasks represent the mapping of the network defined by the initial targets.
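
    A minimal sketch of the procedure-driven idea described above: a procedure is an ordered list of tasks with parameters, each task runs against each target and returns results, and the accumulated results form the network map. The task implementations below are stubs, not a real scanner.

    ```python
    # Minimal sketch of procedure-driven network mapping: a procedure lists tasks,
    # each task runs against a target and returns results, and the accumulated
    # results form the map. Task implementations are stubs, not a real scanner.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class Task:
        name: str
        run: Callable[[str, dict], dict]      # (target, params) -> results
        params: dict = field(default_factory=dict)

    def ping_sweep(target: str, params: dict) -> dict:
        return {"alive": True}                               # stub response

    def port_probe(target: str, params: dict) -> dict:
        return {"open_ports": params.get("ports", [])[:2]}   # stub response

    def map_network(targets: List[str], procedure: List[Task]) -> Dict[str, dict]:
        network_map: Dict[str, dict] = {}
        for target in targets:                # e.g., expanded from an address range
            results = {}
            for task in procedure:            # tasks run in the procedure's order
                results[task.name] = task.run(target, task.params)
            network_map[target] = results
        return network_map

    procedure = [Task("ping", ping_sweep),
                 Task("ports", port_probe, {"ports": [22, 80, 443]})]
    print(map_network(["10.0.0.1", "10.0.0.2"], procedure))
    ```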

  10. MED SUV TASK 6.3 Capacity building and interaction with decision makers: Improving volcanic risk communication through volcanic hazard tools evaluation, Campi Flegrei Caldera case study (Italy)

    NASA Astrophysics Data System (ADS)

    Nave, Rosella; Isaia, Roberto; Sandri, Laura; Cristiani, Chiara

    2016-04-01

    In the communication chain between scientists and decision makers (end users), scientific outputs such as maps are a fundamental source of information on hazard zoning and the definition of the related at-risk areas. However, the relationship between volcanic phenomena, their probability, and their potential impact can be complex, and the geospatial information is not easily decoded or understood by non-experts, even when they are decision makers. Focusing on volcanic hazard, the goal of MED SUV WP6 Task 3 is to improve the communication efficacy of scientific outputs and thereby help fill the gap between scientists and decision makers. The Campi Flegrei caldera, in the Neapolitan area, was chosen as the pilot research area in which to apply an evaluation/validation procedure providing a robust assessment of the volcanic maps based on end-user responses. The sample involved decision makers and officials from the Campanian Region Civil Protection and the municipalities included in the Campi Flegrei RED ZONE, the area exposed to risk from pyroclastic currents. Semi-structured interviews with a sample of decision makers and civil protection officials were conducted to acquire both quantitative and qualitative data. The maps tested were the official Campi Flegrei caldera RED ZONE map; three maps produced by overlapping the Red Zone limit on an orthophoto, a DTM, and a contour map; and other maps, including a probabilistic one, showing the volcanological data used to delimit the Red Zone. Analysis of the outcomes assessed the respondents' level of understanding of the content as displayed and their needs regarding the representation of the complex information embedded in volcanic hazard. The final output was a leaflet serving as "guidelines" that can support decision makers and officials in understanding volcanic hazard and risk maps, and in using them as a communication tool in information programs for the population at risk. The same evaluation/validation process was also applied to the scientific output of MED-SUV WP6, a tool for short-term probabilistic volcanic hazard assessment. For the Campi Flegrei volcanic system, this tool was implemented to compute hazard curves, hazard maps, and probability maps for tephra fallout on a target grid covering the Campania region, allowing the end user to visualize the hazard from tephra fallout and its uncertainty. The response of end users to such products will help determine to what extent they understand them, find them useful, and see their requirements matched. To involve the Etna area in WP6 Task 3 activities as well, a questionnaire developed in the VUELCO project (Volcanic Unrest in Europe and Latin America) was proposed to Sicily Civil Protection officials with decision-making responsibility in case of volcanic unrest at Etna and Stromboli, to survey their opinions and requirements.

  11. Behavioral Targeting—Consumer Tracking

    NASA Astrophysics Data System (ADS)

    Srimani, P. K.; Srinivas, A.

    2011-12-01

    Behavioral targeting is an online marketing method that collects data on the browsing activities of consumers in order to 'target' more relevant online advertising. Behavioral targeting enables marketers to reach in-market consumers and increases the value of publisher inventory. At the heart of behavioral targeting is a learning-based investigation of consumer behaviors. It helps marketers understand consumers' purchase patterns over time, mapping out a customer's activities based not only on a single purchase but also on an annual or even lifetime basis. As marketers increasingly appreciate the importance of customer lifetime value, behavioral targeting can be a foundation for creating a continuous analytical study of consumer trends and patterns. But as behavioural-targeting systems become more sophisticated and invasive, it is vital that the companies behind them are open with users about what is going on, and give them control over their personal information. The aim of this paper is to explore the various tools and techniques of behavioral targeting and its benefits to online marketing. A multiple-case study approach was used to explore the effectiveness and acceptance of online marketing.

  12. Concept-Mapping Tools and the Development of Students' Critical-Thinking Skills

    ERIC Educational Resources Information Center

    Tseng, Sheng-Shiang

    2015-01-01

    Developing students' critical-thinking skills has recently received attention at all levels of education. This article proposes the use of concept-mapping tools to improve students' critical-thinking skills. The article introduces a Web-based concept-mapping tool--Popplet--and demonstrates its application for teaching critical-thinking skills in…

  13. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified at high sensitivity reliably in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest, proteotypic peptides, are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  14. Leveraging community health worker system to map a mountainous rural district in low resource setting: a low-cost approach to expand use of geographic information systems for public health.

    PubMed

    Munyaneza, Fabien; Hirschhorn, Lisa R; Amoroso, Cheryl L; Nyirazinyoye, Laetitia; Birru, Ermyas; Mugunga, Jean Claude; Murekatete, Rachel M; Ntaganira, Joseph

    2014-12-06

    Geographic Information Systems (GIS) have become an important tool in monitoring and improving health services, particularly at local levels. However, GIS data are often unavailable in rural settings and village-level mapping is resource-intensive. This study describes the use of community health worker (CHW) supervisors to map villages in a mountainous rural district of Northern Rwanda and the subsequent use of these data to map village-level variability in safe water availability. We developed a low-literacy, skills-focused training in the local language (Kinyarwanda) to train 86 CHW supervisors and 25 nurses in charge of community health at the health center (HC) and health post (HP) levels to collect the geographic coordinates of the villages using Global Positioning Systems (GPS). Data were validated through meetings with key stakeholders at the sub-district and district levels and joined using ArcMap 10 Geo-processing tools. Costs were calculated using program budgets and activity records, and compared with the estimated costs of mapping using a separate, trained GIS team. To demonstrate the usefulness of this work, we mapped drinking water sources (DWS) from data collected by CHW supervisors from the chief of each village. DWSs were categorized as safe versus unsafe using World Health Organization definitions. Following training, each CHW supervisor spent five days collecting data on the villages in their coverage area. Over 12 months, the CHW supervisors mapped the district's 573 villages using 12 shared GPS devices. Sector maps were produced and distributed to local officials. The cost of mapping using CHW supervisors was $29,692, about half of the estimated cost of mapping using a trained and dedicated GIS team ($60,112). The availability of local mapping made it possible to rapidly identify village-level disparities in DWS, with lower access among populations living near lakes and wetlands (p < .001). The existing national CHW system can be leveraged to inexpensively and rapidly map villages even in mountainous rural areas. These data are important to provide managers and decision makers with local-level GIS data to rapidly identify variability in health and other related services, to better target and evaluate interventions.
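
    A small sketch of the kind of data join this enables: combine CHW-collected village coordinates with reported drinking-water sources and flag villages with no safe source. The file layout, field names, and safe/unsafe classification below are illustrative assumptions, not the study's actual data model.

    ```python
    # Join CHW-collected village coordinates with reported drinking-water sources
    # and flag villages with no safe source. File names, columns, and the
    # safe/unsafe classification are illustrative assumptions.
    import csv

    SAFE_SOURCES = {"protected_spring", "borehole", "piped", "protected_well"}

    def load_rows(path):
        with open(path, newline="") as handle:
            return list(csv.DictReader(handle))

    villages = load_rows("villages_gps.csv")     # columns: village_id, name, lat, lon
    sources = load_rows("water_sources.csv")     # columns: village_id, source_type

    safe_by_village = {}
    for row in sources:
        vid = row["village_id"]
        is_safe = row["source_type"] in SAFE_SOURCES
        safe_by_village[vid] = safe_by_village.get(vid, False) or is_safe

    for v in villages:
        if not safe_by_village.get(v["village_id"], False):
            print("No safe water source mapped:", v["name"], v["lat"], v["lon"])
    ```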

  15. Map based multimedia tool on Pacific theatre in World War II

    NASA Astrophysics Data System (ADS)

    Pakala Venkata, Devi Prasada Reddy

    Maps have been used for depicting data of all kinds in the educational community for many years. One rapidly evolving method of teaching is the use of interactive and dynamic maps. The emphasis of this thesis is to develop an intuitive map-based multimedia tool that provides a timeline of battles and events in the Pacific theatre of World War II. The tool contains summaries of major battles and commanders and has multimedia content embedded in it. The primary advantage of this map tool is that one can quickly learn about all the battles and campaigns of the Pacific Theatre by accessing, in an interactive way, the Timeline of Battles in each region, the Individual Battles in each region, or the Summary of each Battle. The tool can be accessed via any standard web browser and motivates the user to learn more about the battles of the Pacific Theatre. It was made responsive using the Google Maps API, JavaScript, HTML5, and CSS.

  16. A decision-support tool for the control of urban noise pollution.

    PubMed

    Suriano, Marcia Thais; de Souza, Léa Cristina Lucas; da Silva, Antonio Nelson Rodrigues

    2015-07-01

    Improving the quality of life is increasingly seen as an important urban planning goal. In order to reach it, various tools are being developed to mitigate the negative impacts of human activities on society. This paper develops a methodology for quantifying the population's exposure to noise by proposing a classification of urban blocks. Taking into account the vehicular flow and traffic composition of the surroundings of urban blocks, we generated a noise map by applying a computational simulation. The urban blocks were classified according to their noise range, and the population of each urban block was then estimated through a process based on the census tract and the constructed area of the blocks. The acoustical classes of urban blocks and the number of inhabitants per block were compared, so that the population exposed to noise levels above 65 dB(A), the highest limit established by legislation, could be estimated. As a result, we developed a map of the study area in which urban blocks that should be priority targets for noise mitigation actions can be quickly identified.
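
    A compact sketch of the block-level exposure estimate described above: distribute each census tract's population to its blocks in proportion to built area, then sum the population of blocks whose simulated noise level exceeds 65 dB(A). All numbers are invented for illustration.

    ```python
    # Block-level exposure estimate: allocate tract population to blocks by built
    # area, then sum the population of blocks above 65 dB(A). Numbers are invented.
    blocks = [
        # (block_id, census_tract, built_area_m2, simulated_noise_dBA)
        ("B1", "T1", 12000, 71.5),
        ("B2", "T1",  8000, 63.0),
        ("B3", "T1",  5000, 67.2),
        ("B4", "T2", 20000, 58.4),
    ]
    tract_population = {"T1": 2500, "T2": 1800}

    # Total built area per tract, used to apportion the tract's population
    tract_area = {}
    for _, tract, area, _ in blocks:
        tract_area[tract] = tract_area.get(tract, 0) + area

    exposed = 0.0
    for block_id, tract, area, noise in blocks:
        block_pop = tract_population[tract] * area / tract_area[tract]
        if noise > 65.0:
            exposed += block_pop
    print("Estimated population exposed above 65 dB(A): %.0f" % exposed)
    ```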

  17. Sensitivity of Attitude Determination on the Model Assumed for ISAR Radar Mappings

    NASA Astrophysics Data System (ADS)

    Lemmens, S.; Krag, H.

    2013-09-01

    Inverse synthetic aperture radars (ISAR) are valuable instruments for assessing the state of a large object in low Earth orbit. The images generated by these radars can reach a sufficient quality to be used during launch support or contingency operations, e.g. for confirming the deployment of structures, determining structural integrity, or analysing the dynamic behaviour of an object. However, the direct interpretation of ISAR images can be a demanding task due to the nature of the range-Doppler space in which these images are produced. Recently, a tool has been developed by the European Space Agency's Space Debris Office to generate radar mappings of a target in orbit. Such mappings are a 3D-model-based simulation of how an ideal ISAR image would be generated by a ground-based radar under given processing conditions. These radar mappings can be used to support a data interpretation process: by processing predefined attitude scenarios during an observation sequence and comparing them with actual observations, one can detect non-nominal behaviour. Vice versa, one can also estimate the attitude states of the target by fitting the radar mappings to the observations. It has been demonstrated for the latter use case that a coarse approximation of the target through a 3D model is already sufficient to derive the attitude information from the generated mappings. The level of detail required for the 3D model is determined by the process of generating ISAR images, which is based on the theory of scattering bodies. A complex surface can therefore return an intrinsically noisy ISAR image; for example, when many instruments on a satellite are visible to the observer, the ISAR image can suffer from multipath reflections. In this paper, we further analyse the sensitivity of the attitude fitting algorithms to variations in the dimensions and the level of detail of the underlying 3D model. Moreover, we investigate the ability to estimate the orientations of different spacecraft components with respect to each other from the fitting procedure.

  18. iDrug: a web-accessible and interactive drug discovery and design platform

    PubMed Central

    2014-01-01

    Background The progress in computer-aided drug design (CADD) approaches over the past decades has accelerated early-stage pharmaceutical research. Many powerful standalone tools for CADD have been developed in academia. Because programs are developed by various research groups, a consistent, user-friendly online graphical working environment combining computational techniques such as pharmacophore mapping, similarity calculation, scoring, and target identification is needed. Results We present a versatile, user-friendly, and efficient online tool for computer-aided drug design based on pharmacophore and 3D molecular similarity searching. The web interface enables binding-site detection, virtual screening hit identification, and drug target prediction in an interactive manner through a seamless interface to all adapted packages (e.g., Cavity, PocketV.2, PharmMapper, SHAFTS). Several commercially available compound databases for hit identification and a well-annotated pharmacophore database for drug target prediction were integrated into iDrug as well. The web interface provides tools for real-time molecular building/editing, converting, displaying, and analyzing. All the customized configurations of the functional modules can be accessed through the featured session files provided, which can be saved to the local disk and uploaded to resume or update previous work. Conclusions iDrug is easy to use, and provides a novel, fast and reliable tool for conducting drug design experiments. By using iDrug, various molecular design processing tasks can be submitted and visualized simply in one browser without locally installing any standalone modeling software. iDrug is accessible free of charge at http://lilab.ecust.edu.cn/idrug. PMID:24955134

  19. Using a map-based assessment tool for the development of cost-effective WFD river basin action programmes in a changing climate.

    PubMed

    Kaspersen, Bjarke Stoltze; Jacobsen, Torsten Vammen; Butts, Michael Brian; Jensen, Niels H; Boegh, Eva; Seaby, Lauren Paige; Müller, Henrik Gioertz; Kjaer, Tyge

    2016-08-01

    For the 2nd and 3rd river basin management cycles (2015-2027) of the Water Framework Directive (WFD), EU Member States are required to fully integrate climate change into the process of river basin management planning (RBMP). Complying with the main WFD objective of achieving 'good ecological status' in all water bodies in Denmark requires Programmes of Measures (PoMs) to reduce nitrogen (N) pollution from point and diffuse sources. Denmark is among the world's most intensively farmed countries and in spite of thirty years of significant policy actions to reduce diffuse nutrient emissions, there is still a need for further reductions. In addition, the impacts of climate change are projected to lead to a situation where nutrient loads will have to be reduced still further in comparison to current climate conditions. There is an urgent need to address this challenge in WFD action programmes in order to develop robust and cost-effective adaptation strategies for the next WFD RBMP cycles. The aim of this paper is to demonstrate and discuss how a map-based PoMs assessment tool can support the development of adaptive and cost-effective strategies to reduce N losses in the Isefjord and Roskilde Fjord River Basin in the north east of Denmark. The tool facilitates assessments of the application of agri-environmental measures that are targeted towards low retention agricultural areas, where limited or no surface and subsurface N reduction takes place. Effects of climate change on nitrate leaching were evaluated using the dynamic agro-ecosystem model 'Daisy'. Results show that nitrate leaching rates increase by approx. 25% under current management practices. This impact outweighs the expected total N reduction effect of Baseline 2015 and the first RBMP in the case study river basin. The particular PoMs investigated in our study show that WFD N reduction targets can be achieved by targeted land use changes on approx. 4% of the agricultural area under current climate conditions and approx. 9% of the agricultural area, when projected climate change impacts on nitrate leaching rates are included in the assessment. The study highlights the potential of the PoMs assessment tool to assist in evaluation of alternative WFD RBMP scenarios to achieve spatially targeted and cost-effective reductions of N loads at catchment scale in the context of a changing climate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Automated first-principles mapping for phase-change materials.

    PubMed

    Esser, Marc; Maintz, Stefan; Dronskowski, Richard

    2017-04-05

    Plotting materials on bi-coordinate maps according to physically meaningful descriptors has a successful tradition in computational solid-state science spanning more than four decades. Equipped with new ab initio techniques introduced in this work, we generate an improved version of the treasure map for phase-change materials (PCMs) as introduced previously by Lencer et al. which, other than before, charts all industrially used PCMs correctly. Furthermore, we suggest seven new PCM candidates, namely SiSb4Te7, Si2Sb2Te5, SiAs2Te4, PbAs2Te4, SiSb2Te4, Sn2As2Te5, and PbAs4Te7, to be used as synthetic targets. To realize the aforementioned maps based on orbital mixing (or "hybridization") and ionicity coordinates, structural information was first included into an ab initio numerical descriptor for sp3 orbital mixing and then generalized beyond high-symmetry structures. In addition, a simple, yet powerful quantum-mechanical ionization measure also including structural information was introduced. Taken together, these tools allow for (automatically) generating materials maps solely relying on first-principles calculations. © 2017 Wiley Periodicals, Inc.

  1. Magnetic Separation Methods for the Detection of Mycobacterium avium subsp. paratuberculosis in Various Types of Matrices: A Review

    PubMed Central

    Dziedzinska, Radka

    2017-01-01

    The main reasons to improve the detection of Mycobacterium avium subsp. paratuberculosis (MAP) are animal health and monitoring of MAP entering the food chain via meat, milk, and/or dairy products. Different approaches can be used for the detection of MAP, but the use of magnetic separation, especially in conjunction with PCR as an end-point detection method, has risen in recent years. However, the extraction of DNA, which is a crucial step prior to PCR detection, can be complicated by the presence of inhibitory substances. Magnetic separation methods involving either antibodies or peptides represent a powerful tool for the selective separation of target bacteria from other nontarget microorganisms and inhibitory sample components. These methods enable the concentration of pathogens present in the initial matrix into a smaller volume and facilitate the isolation of sufficient quantities of pure DNA. The purpose of this review is to summarize the methods based on the magnetic separation approach that are currently available for the detection of MAP in a broad range of matrices. PMID:28642876

  2. Sinking Maps: A Conceptual Tool for Visual Metaphor

    ERIC Educational Resources Information Center

    Giampa, Joan Marie

    2012-01-01

    Sinking maps, created by Northern Virginia Community College professor Joan Marie Giampa, are tools that teach fine art students how to construct visual metaphor by conceptually mapping sensory perceptions. Her dissertation answers the question, "Can visual metaphor be conceptually mapped in the art classroom?" In the Prologue, Giampa…

  3. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. This map information has recently been distributed in accordance with web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former was implemented using OpenLayers; the latter was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying a WMTS layer in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, the available WMTS layers are limited to the ones distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is one of the georeferenced raster formats available in many kinds of Geographic Information Systems. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study would contribute to creating better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file formats.
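
    A minimal sketch of the cropping step using GDAL's Python bindings: cut a georeferenced window out of a raster and write it as GeoTIFF. The input path and bounding box are placeholders, and the actual tool's WMTS tile retrieval, zoom-level handling, and WebGL export are omitted; this is not the study's own implementation.

    ```python
    # Crop a georeferenced window out of a raster and write it as GeoTIFF with
    # GDAL's Python bindings. Input path and bounding box are placeholders.
    from osgeo import gdal

    gdal.UseExceptions()

    src_path = "seamless_geology_mosaic.tif"        # hypothetical mosaic of WMTS tiles
    ulx, uly, lrx, lry = 139.5, 36.0, 140.0, 35.5   # window: upper-left, lower-right

    gdal.Translate(
        "clip.tif",
        src_path,
        format="GTiff",
        projWin=[ulx, uly, lrx, lry],               # window in the raster's georeferenced CRS
    )
    print("wrote clip.tif")
    ```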

  4. Molecular neuroanatomy: a generation of progress.

    PubMed

    Pollock, Jonathan D; Wu, Da-Yu; Satterlee, John S

    2014-02-01

    The neuroscience research landscape has changed dramatically over the past decade. Specifically, an impressive array of new tools and technologies has been generated, including but not limited to: brain gene expression atlases, genetically encoded proteins to monitor and manipulate neuronal activity, and new methods for imaging and mapping circuits. However, despite these technological advances, several significant challenges must be overcome to enable a better understanding of brain function and to develop cell type-targeted therapeutics to treat brain disorders. This review provides an overview of some of the tools and technologies currently being used to advance the field of molecular neuroanatomy, and also discusses emerging technologies that may enable neuroscientists to address these crucial scientific challenges over the coming decade. Published by Elsevier Ltd.

  5. Facilitating in vivo tumor localization by principal component analysis based on dynamic fluorescence molecular imaging

    NASA Astrophysics Data System (ADS)

    Gao, Yang; Chen, Maomao; Wu, Junyu; Zhou, Yuan; Cai, Chuangjian; Wang, Daliang; Luo, Jianwen

    2017-09-01

    Fluorescence molecular imaging has been used to target tumors in mice with xenograft tumors. However, tumor imaging is largely distorted by the aggregation of fluorescent probes in the liver. A principal component analysis (PCA)-based strategy was applied on the in vivo dynamic fluorescence imaging results of three mice with xenograft tumors to facilitate tumor imaging, with the help of a tumor-specific fluorescent probe. Tumor-relevant features were extracted from the original images by PCA and represented by the principal component (PC) maps. The second principal component (PC2) map represented the tumor-related features, and the first principal component (PC1) map retained the original pharmacokinetic profiles, especially of the liver. The distribution patterns of the PC2 map of the tumor-bearing mice were in good agreement with the actual tumor location. The tumor-to-liver ratio and contrast-to-noise ratio were significantly higher on the PC2 map than on the original images, thus distinguishing the tumor from its nearby fluorescence noise of liver. The results suggest that the PC2 map could serve as a bioimaging marker to facilitate in vivo tumor localization, and dynamic fluorescence molecular imaging with PCA could be a valuable tool for future studies of in vivo tumor metabolism and progression.
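    The core of the PCA step described above can be sketched as follows, treating each pixel's time course in the dynamic image stack as one observation; the array shapes and data are placeholders, not the study's measurements.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical dynamic fluorescence stack: T time frames of an H x W image.
    T, H, W = 60, 128, 128
    stack = np.random.rand(T, H, W)        # placeholder for measured frames

    # Treat each pixel's time course as one sample: shape (H*W, T).
    X = stack.reshape(T, -1).T

    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)          # per-pixel scores on PC1 and PC2

    # Reshape the scores back into images: the PC1 map reflects the dominant
    # kinetics (e.g. liver), while the PC2 map carries tumor-related features.
    pc1_map = scores[:, 0].reshape(H, W)
    pc2_map = scores[:, 1].reshape(H, W)
    ```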

  6. Eye tracking a self-moved target with complex hand-target dynamics

    PubMed Central

    Landelle, Caroline; Montagnini, Anna; Madelain, Laurent

    2016-01-01

    Previous work has shown that the ability to track with the eye a moving target is substantially improved when the target is self-moved by the subject's hand compared with when being externally moved. Here, we explored a situation in which the mapping between hand movement and target motion was perturbed by simulating an elastic relationship between the hand and target. Our objective was to determine whether the predictive mechanisms driving eye-hand coordination could be updated to accommodate this complex hand-target dynamics. To fully appreciate the behavioral effects of this perturbation, we compared eye tracking performance when self-moving a target with a rigid mapping (simple) and a spring mapping as well as when the subject tracked target trajectories that he/she had previously generated when using the rigid or spring mapping. Concerning the rigid mapping, our results confirmed that smooth pursuit was more accurate when the target was self-moved than externally moved. In contrast, with the spring mapping, eye tracking had initially similar low spatial accuracy (though shorter temporal lag) in the self versus externally moved conditions. However, within ∼5 min of practice, smooth pursuit improved in the self-moved spring condition, up to a level similar to the self-moved rigid condition. Subsequently, when the mapping unexpectedly switched from spring to rigid, the eye initially followed the expected target trajectory and not the real one, thereby suggesting that subjects used an internal representation of the new hand-target dynamics. Overall, these results emphasize the stunning adaptability of smooth pursuit when self-maneuvering objects with complex dynamics. PMID:27466129

  7. Risk maps for targeting exotic plant pest detection programs in the United States

    Treesearch

    R.D. Magarey; D.M. Borchert; J.S. Engle; M Garcia-Colunga; Frank H. Koch; et al

    2011-01-01

    In the United States, pest risk maps are used by the Cooperative Agricultural Pest Survey for spatial and temporal targeting of exotic plant pest detection programs. Methods are described to create standardized host distribution, climate and pathway risk maps for the top nationally ranked exotic pest targets. Two examples are provided to illustrate the risk mapping...

  8. A comparative study evaluating the efficacy of IS_MAP04 with IS900 and IS_MAP02 as a new diagnostic target for the detection of Mycobacterium avium subspecies paratuberculosis from bovine faeces.

    PubMed

    de Kruijf, Marcel; Govender, Rodney; Yearsley, Dermot; Coffey, Aidan; O'Mahony, Jim

    2017-05-01

    The aim of this study was to investigate the efficacy of IS_MAP04 as a potential new diagnostic quantitative PCR (qPCR) target for the detection of Mycobacterium avium subspecies paratuberculosis from bovine faeces. IS_MAP04 primers were designed and tested negative against non-MAP strains. The detection limit of IS_MAP04 qPCR was evaluated on different MAP K-10 DNA concentrations and on faecal samples spiked with different MAP K-10 cell dilutions. A collection of 106 faecal samples was analysed and the efficacy of IS_MAP04 was statistically compared with IS900 and IS_MAP02. The detection limits observed for IS_MAP04 and IS900 on MAP DNA were 34 fg and 3.4 fg, respectively. The detection limit of MAP from inoculated faecal samples was 10^2 CFU/g for both IS_MAP04 and IS900 targets, and a detection limit of 10^2 CFU/g was also achieved with a TaqMan qPCR targeting IS_MAP04. The efficacy of IS_MAP04 to detect positive MAP faecal samples was 83.0% compared to 85.8% and 83.9% for IS900 and IS_MAP02, respectively. Strong kappa agreements were observed between IS_MAP04 and IS900 (κ=0.892) and between IS_MAP04 and IS_MAP02 (κ=0.897). As a new molecular target, IS_MAP04 showed a detection limit comparable to that of IS900 for detecting MAP from inoculated faecal material. The MAP detection efficacy of IS_MAP04 from naturally infected faecal samples proved to be relatively comparable to IS_MAP02, but yielded efficacy results slightly lower than IS900. Moreover, IS_MAP04 could be of significant value when used in duplex or multiplex qPCR assays. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation.

    PubMed

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications.

  10. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation

    PubMed Central

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Objects: Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. Methods: The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. Results: The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Conclusions: Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications. PMID:27199729

  11. AEGIS: a wildfire prevention and management information system

    NASA Astrophysics Data System (ADS)

    Kalabokidis, K.; Ager, A.; Finney, M.; Athanasis, N.; Palaiologou, P.; Vasilakos, C.

    2015-10-01

    A Web-GIS wildfire prevention and management platform (AEGIS) was developed as an integrated and easy-to-use decision support tool (http://aegis.aegean.gr). The AEGIS platform assists with early fire warning, fire planning, fire control and coordination of firefighting forces by providing access to information that is essential for wildfire management. Databases were created with spatial and non-spatial data to support key system functionalities. Updated land use/land cover maps were produced by combining field inventory data with high resolution multispectral satellite images (RapidEye) to be used as inputs in fire propagation modeling with the Minimum Travel Time algorithm. End users provide a minimum number of inputs, such as fire duration, ignition point and weather information, to conduct a fire simulation. AEGIS offers three types of simulation: single-fire propagation, conditional burn probabilities, and landscape-level burn probabilities, similar to the FlamMap fire behavior modeling software. Artificial neural networks (ANN) were utilized for wildfire ignition risk assessment based on various parameters, training methods, activation functions, pre-processing methods and network structures. The combination of ANNs and expected burned area maps produced an integrated output map for fire danger prediction. The system also incorporates weather measurements from remote automatic weather stations and weather forecast maps. The structure of the algorithms relies on parallel processing techniques (i.e. High Performance Computing and Cloud Computing) that ensure computational power and speed. All AEGIS functionalities are accessible to authorized end users through a web-based graphical user interface. An innovative mobile application, AEGIS App, acts as a complementary tool to the web-based version of the system.
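    As a hedged sketch of the ignition-risk step only, the snippet below trains a small feed-forward network on per-cell predictors with scikit-learn; the feature set, labels and network settings are placeholders and do not reproduce the AEGIS configuration.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Placeholder per-cell predictors (e.g. temperature, humidity, wind, land cover code)
    # and a binary historical-ignition label.
    X = np.random.rand(500, 4)
    y = np.random.randint(0, 2, 500)

    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(16, 8), activation="relu", max_iter=1000),
    )
    model.fit(X, y)

    # The predicted ignition probability per cell can then be rendered as a risk map.
    risk = model.predict_proba(X)[:, 1]
    ```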

  12. Forming maps of targets having multiple reflectors with a biomimetic audible sonar.

    PubMed

    Kuc, Roman

    2018-05-01

    A biomimetic audible sonar mimics human echolocation by emitting clicks and sensing echoes binaurally to investigate the limitations in acoustic mapping of 2.5-dimensional targets. A monaural sonar that provides only echo time-of-flight values produces biased maps that lie outside the target surfaces. Reflector bearing estimates derived from the first echoes detected by a binaural sonar are employed to form unbiased maps. Multiple echoes from a target introduce phantom-reflector (PR) artifacts into its map because later echoes are produced by reflectors at bearings different from those determined from the first echoes. In addition, overlapping echoes interfere to produce bearing errors. Addressing the causes of these bearing errors motivates a processing approach that employs template matching to extract valid echoes. Interfering echoes can mimic a valid echo and also form PR artifacts. These artifacts are eliminated by recognizing the bearing fluctuations that characterize echo interference. Removing PR artifacts produces a map that resembles the physical target shape to within the resolution capabilities of the sonar. The remaining differences between the target shape and the final map are void artifacts caused by invalid or missing echoes.
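    The basic geometry behind the range and bearing estimates discussed above can be summarized in a few lines; the speed of sound and receiver baseline below are placeholder constants, and the paper's template-matching echo validation is not shown.

    ```python
    import numpy as np

    C = 343.0        # speed of sound in air, m/s
    BASELINE = 0.15  # hypothetical spacing between the two receivers, m

    def range_from_tof(tof_s: float) -> float:
        """Round-trip time of flight -> target range in meters."""
        return C * tof_s / 2.0

    def bearing_from_delay(delta_t_s: float) -> float:
        """Interaural (left/right) arrival-time difference -> bearing in degrees.

        Far-field approximation: sin(theta) = c * delta_t / baseline.
        """
        s = np.clip(C * delta_t_s / BASELINE, -1.0, 1.0)
        return float(np.degrees(np.arcsin(s)))

    # Example: an echo arriving 5.8 ms after the click, 0.2 ms earlier at one ear.
    print(range_from_tof(5.8e-3), bearing_from_delay(0.2e-3))
    ```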

  13. Laurentide: The Crime Fighting Geologist, A Comic-Book Curriculum Tool

    NASA Astrophysics Data System (ADS)

    McGillis, A.; Gilbert, L. A.; Enright, K. P.

    2014-12-01

    When the police are just too ill-informed on matters of earth science to solve the case, it is up to Laurentide and her crew of geologists to bring justice to evildoers. Using every tool available, from a rock hammer to LiDAR, Laurentide fights crime while teaching her apprentice Esker about how geologists uncover mysteries every day. This is the first of what will be a series of free teaching materials targeted at grades 5-8 and based around the National Science Education Standards. Students will get the chance to practice problem solving and data analysis in order to solve mysteries with a combination of comic-book-style storytelling and hands-on worksheets. The pilot story, "The Caper of the Ridiculously Cheap Condominiums," will cover 4 of the 9 Earth Science Literacy Principles 'Big Ideas'. The material will explore earthquakes, the hazards and risks they present, and the tools geologists use to map faults and estimate recurrence intervals.

  14. iVAX: An integrated toolkit for the selection and optimization of antigens and the design of epitope-driven vaccines.

    PubMed

    Moise, Leonard; Gutierrez, Andres; Kibria, Farzana; Martin, Rebecca; Tassone, Ryan; Liu, Rui; Terry, Frances; Martin, Bill; De Groot, Anne S

    2015-01-01

    Computational vaccine design, also known as computational vaccinology, encompasses epitope mapping, antigen selection and immunogen design using computational tools. The iVAX toolkit is an integrated set of tools that has been in development since 1998 by De Groot and Martin. It comprises a suite of immunoinformatics algorithms for triaging candidate antigens, selecting immunogenic and conserved T cell epitopes, eliminating regulatory T cell epitopes, and optimizing antigens for immunogenicity and protection against disease. iVAX has been applied to vaccine development programs for emerging infectious diseases, cancer antigens and biodefense targets. Several iVAX vaccine design projects have had success in pre-clinical studies in animal models and are progressing toward clinical studies. The toolkit now incorporates a range of immunoinformatics tools for infectious disease and cancer immunotherapy vaccine design. This article will provide a guide to the iVAX approach to computational vaccinology.

  15. Targeting trachoma control through risk mapping: the example of Southern Sudan.

    PubMed

    Clements, Archie C A; Kur, Lucia W; Gatpan, Gideon; Ngondi, Jeremiah M; Emerson, Paul M; Lado, Mounir; Sabasio, Anthony; Kolaczinski, Jan H

    2010-08-17

    Trachoma is a major cause of blindness in Southern Sudan. Its distribution has only been partially established and many communities in need of intervention have therefore not been identified or targeted. The present study aimed to develop a tool to improve targeting of survey and control activities. A national trachoma risk map was developed using Bayesian geostatistics models, incorporating trachoma prevalence data from 112 geo-referenced communities surveyed between 2001 and 2009. Logistic regression models were developed using active trachoma (trachomatous inflammation follicular and/or trachomatous inflammation intense) in 6345 children aged 1-9 years as the outcome, and incorporating fixed effects for age, long-term average rainfall (interpolated from weather station data) and land cover (i.e. vegetation type, derived from satellite remote sensing), as well as geostatistical random effects describing spatial clustering of trachoma. The model predicted the west of the country to be at no or low trachoma risk. Trachoma clusters in the central, northern and eastern areas had a radius of 8 km after accounting for the fixed effects. In Southern Sudan, large-scale spatial variation in the risk of active trachoma infection is associated with aridity. Spatial prediction has identified likely high-risk areas to be prioritized for more data collection, potentially to be followed by intervention.

  16. Targeting Trachoma Control through Risk Mapping: The Example of Southern Sudan

    PubMed Central

    Clements, Archie C. A.; Kur, Lucia W.; Gatpan, Gideon; Ngondi, Jeremiah M.; Emerson, Paul M.; Lado, Mounir; Sabasio, Anthony; Kolaczinski, Jan H.

    2010-01-01

    Background Trachoma is a major cause of blindness in Southern Sudan. Its distribution has only been partially established and many communities in need of intervention have therefore not been identified or targeted. The present study aimed to develop a tool to improve targeting of survey and control activities. Methods/Principal Findings A national trachoma risk map was developed using Bayesian geostatistics models, incorporating trachoma prevalence data from 112 geo-referenced communities surveyed between 2001 and 2009. Logistic regression models were developed using active trachoma (trachomatous inflammation follicular and/or trachomatous inflammation intense) in 6345 children aged 1–9 years as the outcome, and incorporating fixed effects for age, long-term average rainfall (interpolated from weather station data) and land cover (i.e. vegetation type, derived from satellite remote sensing), as well as geostatistical random effects describing spatial clustering of trachoma. The model predicted the west of the country to be at no or low trachoma risk. Trachoma clusters in the central, northern and eastern areas had a radius of 8 km after accounting for the fixed effects. Conclusion In Southern Sudan, large-scale spatial variation in the risk of active trachoma infection is associated with aridity. Spatial prediction has identified likely high-risk areas to be prioritized for more data collection, potentially to be followed by intervention. PMID:20808910

  17. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    PubMed Central

    2012-01-01

    Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human ortholog. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single software environment with the added capability to interact with public data sources and visual analytic tools for HTP data analysis at a systems level. BRM is developed using Java™ and other open-source technologies for free distribution (http://www.sysbio.org/dataresources/brm.stm). PMID:23174015

  18. Hole Lotta Grindin Going On

    NASA Image and Video Library

    2004-03-06

    The red marks in this image, taken by the Mars Exploration Rover Opportunity's panoramic camera, indicate holes made by the rover's rock abrasion tool, located on its instrument deployment device, or "arm." The lower hole, located on a target called "McKittrick," was made on the 30th martian day, or sol, of Opportunity's journey. The upper hole, located on a target called "Guadalupe," was made on sol 34 of the rover's mission. The mosaic image was taken using a blue filter at the "El Capitan" region of the Meridiani Planum, Mars, rock outcrop. The image, shown in a vertical-perspective map projection, consists of images acquired on sols 27, 29 and 30 of the rover's mission. http://photojournal.jpl.nasa.gov/catalog/PIA05513

  19. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    ERIC Educational Resources Information Center

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  20. Construct Maps: A Tool to Organize Validity Evidence

    ERIC Educational Resources Information Center

    McClarty, Katie Larsen

    2013-01-01

    The construct map is a promising tool for organizing the data standard-setting panelists interpret. The challenge in applying construct maps to standard-setting procedures will be the judicious selection of data to include within this organizing framework. Therefore, this commentary focuses on decisions about what to include in the construct map.…

  1. Accurate atom-mapping computation for biochemical reactions.

    PubMed

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

    The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to take into account the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
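    As a simplified stand-in for the MWED formulation (which is solved as a mixed-integer linear program), the sketch below frames atom mapping as an assignment problem with element-compatibility constraints and a placeholder cost matrix; it is illustrative only and not the authors' algorithm.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Toy stand-in for atom mapping: assign reactant atoms to product atoms of the same
    # element so that a hypothetical per-pair cost (here, random) is minimized.
    reactant_atoms = ["C", "C", "O", "H", "H"]
    product_atoms  = ["C", "O", "C", "H", "H"]

    INF = 1e6
    cost = np.random.rand(len(reactant_atoms), len(product_atoms))
    for i, a in enumerate(reactant_atoms):
        for j, b in enumerate(product_atoms):
            if a != b:
                cost[i, j] = INF  # forbid mapping an atom onto a different element

    rows, cols = linear_sum_assignment(cost)
    mapping = {i: j for i, j in zip(rows, cols)}  # reactant index -> product index
    print(mapping)
    ```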

  2. Ganymede’s stratigraphy and crater distributions in Voyager and Galileo SSI images: results from the anti-jovian hemisphere

    NASA Astrophysics Data System (ADS)

    Wagner, Roland Josef; Schmedemann, Nico; Stephan, Katrin; Werner, Stephanie; Ivanov, Boris A.; Roatsch, Thomas; Jaumann, Ralf; Palumbo, Pasquale

    2017-10-01

    Crater size distributions are a valuable tool in planetary stratigraphy for deriving the sequence of geologic events. In this study, we extend our previous work [1] in Ganymede’s sub-jovian hemisphere to the anti-jovian hemisphere. For geologic mapping, the map by [2] is used as a reference. Our study provides groundwork for the upcoming imaging by the JANUS camera aboard ESA’s JUICE mission [3]. Voyager 2 images are reprocessed at a map scale of 700 m/pxl, achieved for parts of the anti-jovian hemisphere. To obtain relative ages from crater frequencies, we apply an updated crater scaling law for cratering into icy targets in order to derive a crater production function for Ganymede [1]. Also, we adopt the Poisson timing analysis method discussed and implemented recently [4] to obtain relative (and absolute model) ages. Results are compared to those from the sub-jovian hemisphere [1] and are used to support and/or refine the global stratigraphic system of [2]. Further emphasis is placed on local target areas in the anti-jovian hemisphere imaged by Galileo SSI at regional map scales of 100 to 300 m/pxl in order to study local geologic effects and processes. These areas incorporate (1) dark and (2) light tectonized materials, and (3) impact crater materials, including an area with numerous secondaries from the ray crater Osiris. References: [1] Wagner R. et al. (2014), DPS meeting #46, abstract 418.09. [2] Collins G. et al. (2013), U.S.G.S. Sci. Inv. Map 3237. [3] Della Corte V. et al. (2014), Proc. SPIE 9143, doi:10.1117/12.2056353. [4] Michael G. et al. (2016), Icarus 277, 279-285.

  3. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    PubMed

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include molecular mode of action of disease research, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).

  4. The Dutch National Atlas of Public Health.

    PubMed

    Zwakhals, S L N; Giesbers, H; Mac Gillavry, E; van Boven, P F; van der Veen, A A

    2004-09-01

    The Dutch National Atlas of Public Health (http://www.zorgatlas.nl) maps the regional distribution of demand and usage of health care, public health status and influencing factors. The Atlas provides answers to locational questions, e.g. 'Where are the highest mortality rates?', 'Where are the longest waiting lists?' and 'Where are hospitals located?' Maps play a pivotal role in the Atlas. Texts, graphics and diagrams support the interpretation of the maps. The information in the Atlas specifically targets policy makers at the Ministry of Health, Welfare and Sport. For them, the Atlas is a tool for problem detection, policy making and policy evaluation. The Atlas is also aimed at all professionals in health care. In practice, the general public also appears to access and use the Atlas. The Atlas is part of the Dutch Public Health Status and Forecasts (PHSF). The PHSF is produced by the National Institute of Public Health and the Environment, mandated by the Ministry of Health, Welfare and Sport.

  5. DOSoReMI.hu: collection of countrywide DSM products partly according to GSM.net specifications, partly driven by specific user demands

    NASA Astrophysics Data System (ADS)

    Pásztor, László; Laborczi, Annamária; Takács, Katalin; Szatmári, Gábor; Illés, Gábor; Bakacsi, Zsófia; Szabó, József

    2017-04-01

    Due to former soil surveys and mapping activities, a significant amount of soil information has accumulated in Hungary. In traditional soil mapping the creation of a new map was troublesome and laborious. As a consequence, robust maps were elaborated, and demands were instead fitted to the available map products. Until recently, spatial soil information demands were serviced with the available datasets, either in their actual form or after certain specific and often enforced thematic and spatial inference. Considerable imperfections may occur in the accuracy and reliability of the map products, since there might be significant discrepancies between the available data and the expected information. The DOSoReMI.hu (Digital, Optimized, Soil Related Maps and Information in Hungary) project was started for the renewal of the national soil spatial infrastructure in Hungary. During our activities we have significantly extended the ways in which soil information requirements can be satisfied. Soil property, soil type as well as functional soil maps were targeted. The set of applied digital soil mapping techniques has been gradually broadened, incorporating and eventually integrating geostatistical, data mining and GIS tools. Soil property maps have been compiled partly according to GSM.net specifications and partly by changing, slightly or more strictly, some of their predefined parameters (depth intervals, pixel size, property etc.) according to the specific demands on the final products. The elaborated primary maps were further processed, since DOSoReMI.hu also intended to take steps towards the regionalization of higher-level soil information (processes, functions, and services), involving crop models in the spatial modelling. The framework of DOSoReMI.hu also provides the opportunity to elaborate goal-specific soil maps, with prescription of the parameters (theme, resolution, accuracy, reliability etc.) characterizing the map product. As a result, unique digital soil map products (in a more general meaning) were elaborated, regionalizing specific soil-related features that had never been mapped before, even nationally, at high (1 ha) spatial resolution. Based upon the collected experience, the full range of GSM.net products was also targeted. The web publishing of the results was also elaborated, creating a proper WMS environment. Our paper will present the resulting national maps as well as some conclusions drawn from the experience. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA) under Grant K105167 and AGRARKLÍMA.2 VKSZ_12-1-2013-0034.

  6. Advancement and applications of peptide phage display technology in biomedical science.

    PubMed

    Wu, Chien-Hsun; Liu, I-Ju; Lu, Ruei-Min; Wu, Han-Chung

    2016-01-19

    Combinatorial phage library is a powerful research tool for high-throughput screening of protein interactions. Of all available molecular display techniques, phage display has proven to be the most popular approach. Screening phage-displayed random peptide libraries is an effective means of identifying peptides that can bind target molecules and regulate their function. Phage-displayed peptide libraries can be used for (i) B-cell and T-cell epitope mapping, (ii) selection of bioactive peptides bound to receptors or proteins, disease-specific antigen mimics, peptides bound to non-protein targets, cell-specific peptides, or organ-specific peptides, and (iii) development of peptide-mediated drug delivery systems and other applications. Targeting peptides identified using phage display technology may be useful for basic research and translational medicine. In this review article, we summarize the latest technological advancements in the application of phage-displayed peptide libraries to applied biomedical sciences.

  7. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

  8. High-resolution genetic mapping of allelic variants associated with cell wall chemistry in Populus.

    PubMed

    Muchero, Wellington; Guo, Jianjun; DiFazio, Stephen P; Chen, Jin-Gui; Ranjan, Priya; Slavov, Gancho T; Gunter, Lee E; Jawdy, Sara; Bryan, Anthony C; Sykes, Robert; Ziebell, Angela; Klápště, Jaroslav; Porth, Ilga; Skyba, Oleksandr; Unda, Faride; El-Kassaby, Yousry A; Douglas, Carl J; Mansfield, Shawn D; Martin, Joel; Schackwitz, Wendy; Evans, Luke M; Czarnecki, Olaf; Tuskan, Gerald A

    2015-01-23

    QTL cloning for the discovery of genes underlying polygenic traits has historically been cumbersome in long-lived perennial plants like Populus. Linkage disequilibrium-based association mapping has been proposed as a cloning tool, and recent advances in high-throughput genotyping and whole-genome resequencing enable marker saturation to levels sufficient for association mapping with no a priori candidate gene selection. Here, multiyear and multienvironment evaluation of cell wall phenotypes was conducted in an interspecific P. trichocarpa x P. deltoides pseudo-backcross mapping pedigree and two partially overlapping populations of unrelated P. trichocarpa genotypes using pyrolysis molecular beam mass spectrometry, saccharification, and/ or traditional wet chemistry. QTL mapping was conducted using a high-density genetic map with 3,568 SNP markers. As a fine-mapping approach, chromosome-wide association mapping targeting a QTL hot-spot on linkage group XIV was performed in the two P. trichocarpa populations. Both populations were genotyped using the 34 K Populus Infinium SNP array and whole-genome resequencing of one of the populations facilitated marker-saturation of candidate intervals for gene identification. Five QTLs ranging in size from 0.6 to 1.8 Mb were mapped on linkage group XIV for lignin content, syringyl to guaiacyl (S/G) ratio, 5- and 6-carbon sugars using the mapping pedigree. Six candidate loci exhibiting significant associations with phenotypes were identified within QTL intervals. These associations were reproducible across multiple environments, two independent genotyping platforms, and different plant growth stages. cDNA sequencing for allelic variants of three of the six loci identified polymorphisms leading to variable length poly glutamine (PolyQ) stretch in a transcription factor annotated as an ANGUSTIFOLIA C-terminus Binding Protein (CtBP) and premature stop codons in a KANADI transcription factor as well as a protein kinase. Results from protoplast transient expression assays suggested that each of the polymorphisms conferred allelic differences in the activation of cellulose, hemicelluloses, and lignin pathway marker genes. This study illustrates the utility of complementary QTL and association mapping as tools for gene discovery with no a priori candidate gene selection. This proof of concept in a perennial organism opens up opportunities for discovery of novel genetic determinants of economically important but complex traits in plants.

  9. Developing INFOMAR's Seabed Mapping Data to Support a Sustainable Marine Economy

    NASA Astrophysics Data System (ADS)

    Judge, M. T.; Guinan, J.

    2016-02-01

    As Ireland's national seabed mapping programme, INFOMAR (INtegrated mapping FOr the sustainable development of Ireland's MARine resource) enters its eleventh year, it continues to provide pivotal seabed mapping data products, e.g. databases, charts and physical habitat maps, to support Ireland's Integrated Marine Plan. The programme, jointly coordinated by the Geological Survey of Ireland and the Marine Institute, has gained a world-class reputation for developing seabed mapping technologies, infrastructure and expertise. In the government's current Integrated Marine Plan, the programme's critical role in marine spatial planning, enabling infrastructural development, research and education, has been cited. INFOMAR's free data policy supports a thriving maritime economy by promoting easy access to seabed mapping datasets that underpin maritime safety, security and surveillance, governance, business development, research and technology innovation, and infrastructure. The first hydrographic surveys of the national marine mapping programme mapped the extent of Ireland's deepest offshore area, whilst in recent years the focus has been to map the coastal and shallow areas. Targeted coastal areas include 26 bays and 3 priority areas for which specialised equipment, techniques and vessels are required. This talk will discuss how the INFOMAR programme has evolved to address the scientific and technological challenges of seabed mapping across a range of water depths, particularly the challenges associated with addressing inshore data gaps. It will describe how the data converts to bathymetric and geological maps detailing seabed characteristics and habitats. We will expand on how maps are incorporated into collaborative marine projects such as EMODnet, commercialised to identify marine resources, and used as marine decision support tools that drive policy and promote protection of the vastly under-discovered marine area.

  10. Engineering With Nature Geographic Project Mapping Tool (EWN ProMap)

    DTIC Science & Technology

    2015-07-01

    EWN ProMap database provides numerous case studies for infrastructure projects such as breakwaters, river engineering dikes, and seawalls that have...the EWN Project Mapping Tool (EWN ProMap) is to assist users in their search for case study information that can be valuable for developing EWN ideas...Essential elements of EWN include: (1) using science and engineering to produce operational efficiencies supporting sustainable delivery of

  11. Using Generalizability Theory to Examine Different Concept Map Scoring Methods

    ERIC Educational Resources Information Center

    Cetin, Bayram; Guler, Nese; Sarica, Rabia

    2016-01-01

    Problem Statement: In addition to being teaching tools, concept maps can be used as effective assessment tools. The use of concept maps for assessment has raised the issue of scoring them. Concept maps generated and used in different ways can be scored via various methods. Holistic and relational scoring methods are two of them. Purpose of the…

  12. Development of a Competency Mapping Tool for Undergraduate Professional Degree Programmes, Using Mechanical Engineering as a Case Study

    ERIC Educational Resources Information Center

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that…

  13. An Experiment in Mind-Mapping and Argument-Mapping: Tools for Assessing Outcomes in the Business Curriculum

    ERIC Educational Resources Information Center

    Gargouri, Chanaz; Naatus, Mary Kate

    2017-01-01

    Distinguished from other teaching-learning tools, such as mind and concept mapping in which students draw pictures and concepts and show relationships and correlation between them to demonstrate their own understanding of complex concepts, argument mapping is used to demonstrate clarity of reasoning, based on supporting evidence, and come to a…

  14. Teachers' Perceptions of Esri Story Maps as Effective Teaching Tools

    ERIC Educational Resources Information Center

    Strachan, Caitlin; Mitchell, Jerry

    2014-01-01

    The current study explores teachers' perceptions of Esri Story Maps as effective teaching tools. Story Maps are a relatively new web application created using Esri's cloud-based GIS platform, ArcGIS Online. They combine digitized, dynamic web maps with other story elements to help the creator effectively convey a message. The relative ease…

  15. Crystal Ball Replica

    NASA Astrophysics Data System (ADS)

    Ajamian, John

    2016-09-01

    The A2 collaboration of the Institute for Nuclear Physics of Johannes Gutenberg University performs research on (multiple) meson photoproduction and nucleon structure and dynamics using a high-energy polarized photon beam on specific targets. Particles scattered from the target are detected in the Crystal Ball, or CB. The CB is composed of 672 NaI crystals that surround the target and can analyze the particle type and energy of ejected particles. Our project was to create a 3-dimensional scale replica of the CB that could display what is happening in real time. Our replica was constructed to help explain the physics to the general public, to be used as a tool when calibrating each of the 672 NaI crystals, and to better analyze the electron showering of particles coming from the target. This poster will focus on the hardware steps necessary to construct the replica and wire the 672 programmable LEDs in such a way that they can be mapped to correspond to the Crystal Ball elements. George Washington NSF Grant.

  16. NASA Lunar and Planetary Mapping and Modeling

    NASA Astrophysics Data System (ADS)

    Day, B. H.; Law, E.

    2016-12-01

    NASA's Lunar and Planetary Mapping and Modeling Portals provide web-based suites of interactive visualization and analysis tools to enable mission planners, planetary scientists, students, and the general public to access mapped lunar data products from past and current missions for the Moon, Mars, and Vesta. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look forward to the results of the exciting work currently being undertaken. Additional data products and tools continue to be added to the Lunar Mapping and Modeling Portal (LMMP). These include both generalized products as well as polar data products specifically targeting potential sites for the Resource Prospector mission. Current development work on LMMP also includes facilitating mission planning and data management for lunar CubeSat missions, and working with the NASA Astromaterials Acquisition and Curation Office's Lunar Apollo Sample database in order to help better visualize the geographic contexts from which samples were retrieved. A new user interface provides, among other improvements, significantly enhanced 3D visualizations and navigation. Mars Trek, the project's Mars portal, has now been assigned by NASA's Planetary Science Division to support site selection and analysis for the Mars 2020 Rover mission as well as for the Mars Human Landing Exploration Zone Sites. This effort is concentrating on enhancing Mars Trek with data products and analysis tools specifically requested by the proposing teams for the various sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in these upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. The portals also serve as outstanding resources for education and outreach. As such, they have been designated by NASA's Science Mission Directorate as key supporting infrastructure for the new education programs selected through the division's recent CAN.

  17. Problem-solving tools for analyzing system problems. The affinity map and the relationship diagram.

    PubMed

    Lepley, C J

    1998-12-01

    The author describes how to use two management tools, an affinity map and a relationship diagram, to define and analyze aspects of a complex problem in a system. The affinity map identifies the key influencing elements of the problem, whereas the relationship diagram helps to identify the area that is the most important element of the issue. Managers can use the tools to draw a map of problem drivers, graphically display the drivers in a diagram, and use the diagram to develop a cause-and-effect relationship.

  18. Tools for model-building with cryo-EM maps

    DOE PAGES

    Terwilliger, Thomas Charles

    2018-01-01

    There are new tools available to you in Phenix for interpreting cryo-EM maps. You can automatically sharpen (or blur) a map with phenix.auto_sharpen and you can segment a map with phenix.segment_and_split_map. If you have overlapping partial models for a map, you can merge them with phenix.combine_models. If you have a protein-RNA complex and protein chains have been accidentally built in the RNA region, you can try to remove them with phenix.remove_poor_fragments. You can put these together and automatically sharpen, segment and build a map with phenix.map_to_model.

  19. Tools for model-building with cryo-EM maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terwilliger, Thomas Charles

    There are new tools available to you in Phenix for interpreting cryo-EM maps. You can automatically sharpen (or blur) a map with phenix.auto_sharpen and you can segment a map with phenix.segment_and_split_map. If you have overlapping partial models for a map, you can merge them with phenix.combine_models. If you have a protein-RNA complex and protein chains have been accidentally built in the RNA region, you can try to remove them with phenix.remove_poor_fragments. You can put these together and automatically sharpen, segment and build a map with phenix.map_to_model.

  20. The Small Body Mapping Tool (SBMT) for Accessing, Visualizing, and Analyzing Spacecraft Data in Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barnouin, O. S.; Ernst, C. M.; Daly, R. T.

    2018-04-01

    The free, publicly available Small Body Mapping Tool (SBMT) developed at the Johns Hopkins University Applied Physics Laboratory is a powerful, easy-to-use tool for accessing and analyzing data from small bodies.

  1. The Synthesis Map Is a Multidimensional Educational Tool That Provides Insight into Students' Mental Models and Promotes Students' Synthetic Knowledge Generation

    ERIC Educational Resources Information Center

    Ortega, Ryan A.; Brame, Cynthia J.

    2015-01-01

    Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping,…

  2. SpecTIR and SEBASS analysis of the National Mining District, Humboldt County, Nevada

    NASA Astrophysics Data System (ADS)

    Morken, Todd O.

    The purpose of this study was to evaluate the minerals and materials that could be uniquely identified and mapped from measurements made with airborne hyperspectral SpecTIR VNIR/SWIR and SEBASS TIR sensors over areas in the National Mining District. SpecTIR Corporation and Aerospace Corporation acquired hyperspectral measurements on June 26, 2008, using their ProSpecTIR and SEBASS sensors, respectively. In addition, the effects of vegetation, elevation, and the atmosphere on spectral measurements were evaluated to determine their impact on the data analysis and target identification. The National Mining District is located approximately 75 miles northeast of Winnemucca, Nevada, at the northern end of the Santa Rosa Mountains. Precious-metal mining has been dormant in this area since the 1940s; however, with increased metal prices over the last decade, economic interest in the region has increased substantially. Buckskin Mountain has a preserved alteration assemblage that is exposed in topographically steep terrain, ideal for exploring what hydrothermal alteration products can be identified and mapped in these datasets. These Visible Near Infrared (VNIR), Short Wave Infrared (SWIR), and Long Wave Infrared (LWIR) hyperspectral datasets were used to identify and map kaolinite, alunite, quartz, opal, and illite/muscovite, all of which are useful exploration target identifiers and can indicate regions of alteration. These mapping results were then combined with and compared to other geospatial data in a geographic information system (GIS) database. The TIR hyperspectral data provided significant additional information that can benefit geologic exploration and demonstrated its usefulness as an additional tool for geological exploration.
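    One common way to map minerals from such hyperspectral cubes (not necessarily the method used in this study) is the spectral angle mapper, sketched below with placeholder data and an arbitrary angle threshold.

    ```python
    import numpy as np

    def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
        """Angle (radians) between a pixel spectrum and a reference mineral spectrum."""
        cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return float(np.arccos(np.clip(cos, -1.0, 1.0)))

    # Hypothetical cube (rows x cols x bands) and a library spectrum for, e.g., kaolinite.
    cube = np.random.rand(100, 100, 128)
    kaolinite_ref = np.random.rand(128)

    angles = np.apply_along_axis(spectral_angle, 2, cube, kaolinite_ref)
    kaolinite_map = angles < 0.10  # small angle -> spectrally similar to the reference
    ```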

  3. Combining Google Earth and GIS mapping technologies in a dengue surveillance system for developing countries

    PubMed Central

    Chang, Aileen Y; Parrales, Maria E; Jimenez, Javier; Sobieszczyk, Magdalena E; Hammer, Scott M; Copenhaver, David J; Kulkarni, Rajan P

    2009-01-01

    Background Dengue fever is a mosquito-borne illness that places significant burden on tropical developing countries with unplanned urbanization. A surveillance system using Google Earth and GIS mapping technologies was developed in Nicaragua as a management tool. Methods and Results Satellite imagery of the town of Bluefields, Nicaragua captured from Google Earth was used to create a base-map in ArcGIS 9. Indices of larval infestation, locations of tire dumps, cemeteries, large areas of standing water, etc. that may act as larval development sites, and locations of the homes of dengue cases collected during routine epidemiologic surveying were overlaid onto this map. Visual imagery of the location of dengue cases, larval infestation, and locations of potential larval development sites were used by dengue control specialists to prioritize specific neighborhoods for targeted control interventions. Conclusion This dengue surveillance program allows public health workers in resource-limited settings to accurately identify areas with high indices of mosquito infestation and interpret the spatial relationship of these areas with potential larval development sites such as garbage piles and large pools of standing water. As a result, it is possible to prioritize control strategies and to target interventions to highest risk areas in order to eliminate the likely origin of the mosquito vector. This program is well-suited for resource-limited settings since it utilizes readily available technologies that do not rely on Internet access for daily use and can easily be implemented in many developing countries for very little cost. PMID:19627614
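    A minimal sketch of the overlay idea, using the open-source folium library instead of the Google Earth/ArcGIS stack described above; all coordinates are placeholders rather than actual survey data.

    ```python
    import folium

    # Placeholder coordinates (not actual survey data) illustrating how case locations
    # and potential larval development sites can be overlaid on a web basemap.
    center = [12.01, -83.77]  # approximate Bluefields area; placeholder
    m = folium.Map(location=center, zoom_start=14)

    dengue_cases = [(12.012, -83.768), (12.008, -83.772)]
    larval_sites = [(12.010, -83.775)]

    for lat, lon in dengue_cases:
        folium.CircleMarker([lat, lon], radius=5, color="red", tooltip="dengue case").add_to(m)
    for lat, lon in larval_sites:
        folium.Marker([lat, lon], tooltip="potential larval site").add_to(m)

    m.save("dengue_surveillance.html")
    ```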

  4. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.
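
    SWATH2stats itself is an R/Bioconductor package, so the snippet below is not its API; it is a minimal Python/pandas sketch of the underlying conversion step, reshaping a hypothetical long-format peptide table into the protein-by-run matrix that downstream statistics tools typically expect. Column names and values are invented for illustration.

```python
import pandas as pd

# Hypothetical long-format output, one row per peptide measurement per run,
# loosely modelled on what OpenSWATH-style tools export.
long = pd.DataFrame({
    "run":       ["run1", "run1", "run2", "run2"],
    "protein":   ["P1",   "P2",   "P1",   "P2"],
    "peptide":   ["AAK",  "CCR",  "AAK",  "CCR"],
    "intensity": [1.2e5,  3.4e4,  1.1e5,  3.9e4],
})

# Sum peptide intensities per protein and pivot to a protein x run matrix,
# the shape most downstream statistics tools expect.
protein_matrix = (long.groupby(["protein", "run"])["intensity"]
                      .sum()
                      .unstack("run"))
print(protein_matrix)
```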

  5. Tools for Understanding Identity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael

    Identity attribution and enrichment is critical to many aspects of law-enforcement and intelligence gathering; this identity typically spans a number of domains in the natural world, such as biographic information (factual information – e.g. names, addresses), biometric information (e.g. fingerprints) and psychological information. In addition to these natural-world projections of identity, identity elements are projected in the cyber-world. Conversely, undesirable elements may use similar techniques to target individuals for spear-phishing attacks (or worse), and potential targets or their organizations may want to determine how to minimize the attack surface exposed. Our research has been exploring the construction of a mathematical model for identity that supports such holistic identities. The model captures the ways in which an identity is constructed through a combination of data elements (e.g. a username on a forum, an address, a telephone number). Some of these elements may allow new characteristics to be inferred, hence enriching the holistic view of the identity. An example use-case would be the inference of real names from usernames; the ‘path’ created by inferring new elements of identity is highlighted in the ‘critical information’ panel. Individual attribution exercises can be understood as paths through a number of elements. Intuitively, the entire realizable ‘capability’ can be modeled as a directed graph, where the elements are nodes and the inferences are represented by links connecting one or more antecedents with a conclusion. The model can be operationalized with two levels of tool support described in this paper; the first is a working prototype, and the second is expected to reach prototype by July 2013. Understanding the Model: The tool allows a user to easily determine, given a particular set of inferences and attributes, which elements or inferences are of most value to an investigator (or an attacker). The tool is also able to take into account the difficulty of the inferences, allowing the user to consider different scenarios depending on the perceived resources of the attacker, or to prioritize lines of investigation. It also has a number of interesting visualizations that are designed to aid the user in understanding the model. The tool works by considering the inferences as a graph and runs various graph-theoretic algorithms, with some novel adaptations, in order to deduce various properties. Using the Model: To help investigators exploit the model to perform identity attribution, we have developed the Identity Map visualization. For a user-provided set of known starting elements and a set of desired target elements for a given identity, the Identity Map generates investigative workflows as paths through the model. Each path consists of a series of elements and inferences between them that connect the input and output elements. Each path also has an associated confidence level that estimates the reliability of the resulting attribution. Identity Map can help investigators understand the possible ways to make an identification decision and guide them toward the data-collection or analysis steps required to reach that decision.
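
    The directed-graph formulation described above can be illustrated with a small sketch. The inference table, element names and confidence values below are entirely hypothetical, and real inferences may combine several antecedents; the sketch only shows how attribution paths, each with a multiplied confidence, might be enumerated from a known starting element to a target element.

```python
# Hypothetical inference graph: each inference takes one antecedent element to a
# conclusion element with an estimated confidence (the real model allows several
# antecedents per inference; a single antecedent keeps the sketch short).
inferences = {
    "forum_username": [("real_name", 0.6), ("email_address", 0.8)],
    "email_address":  [("real_name", 0.7), ("employer", 0.5)],
    "real_name":      [("home_address", 0.4)],
}

def attribution_paths(start, target, confidence=1.0, path=None):
    """Enumerate (path, confidence) pairs from a known element to a target element,
    multiplying per-inference confidences along the way."""
    path = (path or []) + [start]
    if start == target:
        yield path, confidence
        return
    for nxt, conf in inferences.get(start, []):
        if nxt not in path:                      # avoid cycles
            yield from attribution_paths(nxt, target, confidence * conf, path)

# Rank candidate investigative workflows by their estimated reliability.
for p, conf in sorted(attribution_paths("forum_username", "home_address"),
                      key=lambda x: -x[1]):
    print(" -> ".join(p), f"(confidence {conf:.2f})")
```

    Ranking the enumerated paths by confidence mirrors the Identity Map idea of suggesting which data-collection or analysis steps are most likely to yield a reliable attribution.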

  6. Visual classification of medical data using MLP mapping.

    PubMed

    Cağatay Güler, E; Sankur, B; Kahya, Y P; Raudys, S

    1998-05-01

    In this work we discuss the design of a novel non-linear mapping method for visual classification based on multilayer perceptrons (MLP) and assigned class target values. In training the perceptron, one or more target output values for each class in a 2-dimensional space are used. In other words, class membership information is interpreted visually as closeness to target values in a 2D feature space. This mapping is obtained by training the multilayer perceptron (MLP) using class membership information, input data and judiciously chosen target values. Weights are estimated in such a way that each training feature of the corresponding class is forced to be mapped onto the corresponding 2-dimensional target value.
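
    A minimal sketch of the idea, using scikit-learn's MLPRegressor in place of the authors' own network: each class is assigned a fixed 2-D target point and the network is trained to map every sample onto (near) its class target, so that closeness in the 2-D output space can be read as class membership. The data, architecture and target coordinates are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy 4-dimensional data from two classes (purely illustrative).
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
labels = np.array([0] * 50 + [1] * 50)

# Assign each class a target point in 2-D; the MLP learns to map samples
# of a class onto that point, as described in the abstract.
class_targets = np.array([[-1.0, -1.0], [1.0, 1.0]])
Y = class_targets[labels]

mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X, Y)

# The 2-D outputs can now be plotted; closeness to a class target point
# serves as the visual classification criterion.
mapped = mlp.predict(X)
print(mapped[:3])
```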

  7. Combining mouse mammary gland gene expression and comparative mapping for the identification of candidate genes for QTL of milk production traits in cattle

    PubMed Central

    Ron, Micha; Israeli, Galit; Seroussi, Eyal; Weller, Joel I; Gregg, Jeffrey P; Shani, Moshe; Medrano, Juan F

    2007-01-01

    Background Many studies have found segregating quantitative trait loci (QTL) for milk production traits in different dairy cattle populations. However, even for relatively large effects with a saturated marker map, the confidence interval for QTL location by linkage analysis spans tens of map units, or hundreds of genes. Combining mapping and arraying has been suggested as an approach to identify candidate genes. Thus, gene expression analysis in the mammary gland of genes positioned in the confidence interval of the QTL can bridge the gap between fine mapping and quantitative trait nucleotide (QTN) determination. Results We hybridized the Affymetrix microarray (MG-U74v2), containing 12,488 murine probes, with RNA derived from the mammary gland of virgin, pregnant, lactating and involuting C57BL/6J mice in a total of nine biological replicates. We combined microarray data from two additional studies that used the same design in mice with a total of 75 biological replicates. The same filtering and normalization was applied to each microarray dataset using GeneSpring software. Analysis of variance identified 249 differentially expressed probe sets common to the three experiments along the four developmental stages of puberty, pregnancy, lactation and involution. A total of 212 genes were assigned to their bovine map positions through comparative mapping, and thus form a list of candidate genes for previously identified QTLs for milk production traits. A total of 82 of the genes showed mammary gland-specific expression with at least 3-fold expression over the median representing all tissues tested in GeneAtlas. Conclusion This work presents a web tool for candidate genes for QTL (cgQTL) that allows navigation between the map of bovine milk production QTL, potential candidate genes and their level of expression in mammary gland arrays and in GeneAtlas. Three out of four confirmed genes that affect QTL in livestock (ABCG2, DGAT1, GDF8, IGF2) were overexpressed in the target organ. Thus, cgQTL can be used to determine priority of candidate genes for QTN analysis based on differential expression in the target organ. PMID:17584498

  8. The Lunar Reconnaissance Orbiter, a Planning Tool for Missions to the Moon

    NASA Astrophysics Data System (ADS)

    Keller, J. W.; Petro, N. E.

    2017-12-01

    The Lunar Reconnaissance Orbiter Mission was conceived as a one-year exploration mission to pave the way for a return to the lunar surface, both robotically and by humans. After a year in orbit, LRO transitioned to a science mission but has operated in a dual role of science and exploration ever since. Over the years LRO has compiled a wealth of data that can be and is being used for planning future missions to the Moon by NASA, other national agencies, and private enterprises. While collecting this unique and unprecedented data set, LRO's science investigations have uncovered new questions that motivate new missions and targets. Examples include: when did volcanism on the Moon cease (motivating a sample return mission from an irregular mare patch such as Ina-D), and is there significant water ice sequestered near the poles outside of the permanently shaded regions? In this presentation we will review the data products, tools and maps that are available for mission planning, discuss how the operating LRO mission can further enhance future missions, and suggest new targets motivated by LRO's scientific investigations.

  9. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result, there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition, the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
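
    The article itself is descriptive, but as a concrete reminder of how one of the named tools is conventionally scored, the sketch below computes the usual FMEA risk priority number (RPN = severity x occurrence x detection) for a few hypothetical radiotherapy failure modes; the process steps, failure modes and scores are invented for illustration and are not taken from the article.

```python
# Hypothetical failure modes for a radiotherapy-style process, scored on the
# common 1-10 FMEA scales for severity (S), occurrence (O) and detectability (D).
failure_modes = [
    {"step": "patient setup",  "mode": "wrong immobilization device", "S": 7, "O": 3, "D": 4},
    {"step": "treatment plan", "mode": "incorrect prescription dose", "S": 9, "O": 2, "D": 3},
    {"step": "data transfer",  "mode": "plan not exported",           "S": 5, "O": 4, "D": 2},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]   # risk priority number

# Rank failure modes so mitigation effort goes to the highest RPN first.
for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
    print(f'{fm["RPN"]:>4}  {fm["step"]:<15} {fm["mode"]}')
```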

  10. Green Infrastructure Design Based on Spatial Conservation Prioritization and Modeling of Biodiversity Features and Ecosystem Services.

    PubMed

    Snäll, Tord; Lehtomäki, Joona; Arponen, Anni; Elith, Jane; Moilanen, Atte

    2016-02-01

    There is high-level political support for the use of green infrastructure (GI) across Europe, to maintain viable populations and to provide ecosystem services (ES). Even though GI is inherently a spatial concept, the modern tools for spatial planning have not been recognized, such as in the recent European Environment Agency (EEA) report. We outline a toolbox of methods useful for GI design that explicitly accounts for biodiversity and ES. Data on species occurrence, habitats, and environmental variables are increasingly available via open-access internet platforms. Such data can be synthesized by statistical species distribution modeling, producing maps of biodiversity features. These, together with maps of ES, can form the basis for GI design. We argue that spatial conservation prioritization (SCP) methods are effective tools for GI design, as the overall SCP goal is cost-effective allocation of conservation efforts. Corridors are currently promoted by the EEA as the means for implementing GI design, but they typically target the needs of only a subset of the regional species pool. SCP methods would help to ensure that GI provides a balanced solution for the requirements of many biodiversity features (e.g., species, habitat types) and ES simultaneously in a cost-effective manner. Such tools are necessary to make GI into an operational concept for combating biodiversity loss and promoting ES.

  11. Green Infrastructure Design Based on Spatial Conservation Prioritization and Modeling of Biodiversity Features and Ecosystem Services

    NASA Astrophysics Data System (ADS)

    Snäll, Tord; Lehtomäki, Joona; Arponen, Anni; Elith, Jane; Moilanen, Atte

    2016-02-01

    There is high-level political support for the use of green infrastructure (GI) across Europe, to maintain viable populations and to provide ecosystem services (ES). Even though GI is inherently a spatial concept, the modern tools for spatial planning have not been recognized, such as in the recent European Environment Agency (EEA) report. We outline a toolbox of methods useful for GI design that explicitly accounts for biodiversity and ES. Data on species occurrence, habitats, and environmental variables are increasingly available via open-access internet platforms. Such data can be synthesized by statistical species distribution modeling, producing maps of biodiversity features. These, together with maps of ES, can form the basis for GI design. We argue that spatial conservation prioritization (SCP) methods are effective tools for GI design, as the overall SCP goal is cost-effective allocation of conservation efforts. Corridors are currently promoted by the EEA as the means for implementing GI design, but they typically target the needs of only a subset of the regional species pool. SCP methods would help to ensure that GI provides a balanced solution for the requirements of many biodiversity features (e.g., species, habitat types) and ES simultaneously in a cost-effective manner. Such tools are necessary to make GI into an operational concept for combating biodiversity loss and promoting ES.

  12. miR-MaGiC improves quantification accuracy for small RNA-seq.

    PubMed

    Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina

    2018-05-15

    Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more optimal quantification to a more meaningful unit (i.e., miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
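
    The collapsing step described above can be pictured with a small pandas sketch: per-locus counts are summed into user-defined "functional groups" so that family members transcribed from multiple loci are reported as one unit. The miRNA names, counts and grouping below are hypothetical, and the code is not the miR-MaGiC implementation.

```python
import pandas as pd

# Hypothetical per-miRNA read counts and a mapping of individual miRNAs to
# "functional groups" (e.g. family members transcribed from multiple loci).
counts = pd.Series({"mir-1-3p_locus1": 120, "mir-1-3p_locus2": 95, "mir-21-5p": 310})
group_of = {"mir-1-3p_locus1": "mir-1-3p", "mir-1-3p_locus2": "mir-1-3p",
            "mir-21-5p": "mir-21-5p"}

# Collapse the miRNA space: reads mapping to any member of a group are summed,
# so ambiguous assignments within a family no longer split or inflate counts.
group_counts = counts.groupby(counts.index.map(group_of)).sum()
print(group_counts)   # mir-1-3p: 215, mir-21-5p: 310
```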

  13. Combined Ultrasound and MR Imaging to Guide Focused Ultrasound Therapies in the Brain

    PubMed Central

    Arvanitis, Costas D.; Livingstone, Margaret S.; McDannold, Nathan

    2013-01-01

    Purpose Several emerging therapies with potential for use in the brain harness effects produced by acoustic cavitation – the interaction between ultrasound and microbubbles either generated during sonication or introduced into the vasculature. Systems developed for transcranial MRI-guided focused ultrasound (MRgFUS) thermal ablation can enable their clinical translation, but methods for real-time monitoring and control are currently lacking. Acoustic emissions produced during sonication can provide information about the location, strength, and type of the microbubble oscillations within the ultrasound field, and they can be mapped in real-time using passive imaging approaches. Here, we tested whether such mapping can be achieved transcranially within a clinical brain MRgFUS system. Materials and Methods We integrated an ultrasound imaging array into the hemisphere transducer of the MRgFUS device. Passive cavitation maps were obtained during sonications combined with a circulating microbubble agent at 20 targets in the cingulate cortex in three macaques. The maps were compared with MRI-evident tissue effects. Results The system successfully mapped microbubble activity during both stable and inertial cavitation, which was correlated with MRI-evident transient blood-brain barrier disruption and vascular damage, respectively. The location of this activity was coincident with the resulting tissue changes within the expected resolution limits of the system. Conclusion While preliminary, these data clearly demonstrate, for the first time, that it is possible to construct maps of stable and inertial cavitation transcranially, in a large animal model, and under clinically relevant conditions. Further, these results suggest that this hybrid ultrasound/MRI approach can provide comprehensive guidance for targeted drug delivery via blood-brain barrier disruption and other emerging ultrasound treatments, facilitating their clinical translation. We anticipate it will also prove to be an important research tool that will further the development of a broad range of microbubble-enhanced therapies. PMID:23788054

  14. Mapping texts through dimensionality reduction and visualization techniques for interactive exploration of document collections

    NASA Astrophysics Data System (ADS)

    de Andrade Lopes, Alneu; Minghim, Rosane; Melo, Vinícius; Paulovich, Fernando V.

    2006-01-01

    The current availability of information often impairs the tasks of searching, browsing and analyzing information pertinent to a topic of interest. This paper presents a methodology to create a meaningful graphical representation of document corpora targeted at supporting exploration of correlated documents. The purpose of such an approach is to produce a map of a document body on a research topic or field based on the analysis of document contents and the similarities amongst articles. The document map is generated, after text pre-processing, by projecting the data into two dimensions using Latent Semantic Indexing. The projection is followed by hierarchical clustering to support sub-area identification. The map can be interactively explored, helping to narrow down the search for relevant articles. Tests were performed using a collection of documents pre-classified into three research subject classes: Case-Based Reasoning, Information Retrieval, and Inductive Logic Programming. The map produced was capable of separating the main areas, placing similar documents near one another, revealing possible topics, and identifying boundaries between them. The tool supports the exploration of inter-topic and intra-topic relationships and is useful in many contexts that require deciding which articles are relevant to read, such as scientific research, education, and training.
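
    A compact sketch of the pipeline described above (tf-idf, Latent Semantic Indexing via truncated SVD to two dimensions, then hierarchical clustering) using scikit-learn on a toy corpus; the documents and parameter choices are illustrative, not those of the original system.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import AgglomerativeClustering

# Hypothetical mini-corpus standing in for abstracts from three research areas.
docs = [
    "case based reasoning retrieval of past cases",
    "adaptation and reuse in case based reasoning",
    "information retrieval ranking of documents by relevance",
    "query expansion for document retrieval",
    "inductive logic programming learns clauses from examples",
    "predicate invention in inductive logic programming",
]

# Latent Semantic Indexing: tf-idf followed by a truncated SVD to 2 dimensions,
# giving each document a position on the map.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
coords = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Hierarchical clustering on the 2-D coordinates to suggest sub-areas.
labels = AgglomerativeClustering(n_clusters=3).fit_predict(coords)
for doc, (x, y), lab in zip(docs, coords, labels):
    print(f"cluster {lab}  ({x:+.2f}, {y:+.2f})  {doc[:40]}")
```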

  15. Methyllycaconitine: a non-radiolabeled ligand for mapping α7 neuronal nicotinic acetylcholine receptors - in vivo target localization and biodistribution in rat brain.

    PubMed

    Nirogi, Ramakrishna; Kandikere, Vishwottam; Bhyrapuneni, Gopinadh; Saralaya, Ramanatha; Muddana, Nageswararao; Komarneni, Prashanth

    2012-07-01

    Reduction of cerebral cortical and hippocampal α7 neuronal nicotinic acetylcholine receptor (nAChR) density has been observed in Alzheimer's disease (AD) and other neurodegenerative diseases. Mapping the subtypes of nAChRs with a selective ligand by a viable, quick and consistent method in preclinical drug discovery may lead to rapid development of more effective therapeutic agents. The objective of this study was to evaluate the use of methyllycaconitine (MLA) in non-radiolabeled form for mapping α7 nAChRs in rat brain. MLA pharmacokinetic and brain penetration properties were assessed in male Wistar rats. The tracer properties of MLA were evaluated in rat brain by dose- and time-dependent differential regional distribution studies. Target specificity was validated after blocking with the potent α7 nAChR agonists ABBF, PNU282987 and nicotine. High performance liquid chromatography combined with a triple quadrupole mass spectrometric detector (LC-MS/MS) was used to measure the plasma and brain tissue concentrations of MLA. MLA showed rapid brain uptake followed by 3-5 fold higher specific binding in regions containing the α7 nAChRs (hypothalamus - 1.60 ng/g), when compared to non-specific regions (striatum - 0.53 ng/g, hippocampus - 0.46 ng/g, midbrain - 0.37 ng/g, frontal cortex - 0.35 ng/g and cerebellum - 0.30 ng/g). Pretreatment with potent α7 nAChR agonists significantly blocked the MLA uptake in hypothalamus. The non-radiolabeled MLA binding to brain regions was comparable with the α7 mRNA localization and receptor distribution reported for [3H]MLA in rat brain. The rat pharmacokinetic, brain penetration and differential brain regional distribution features indicate that MLA is suitable for use at the preclinical stage for mapping α7 nAChRs. Hence, this approach can be employed as an essential tool for quicker development of novel selective ligands to map variation in α7 receptor densities, as well as to evaluate potential new chemical entities targeting neurodegenerative diseases. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Mappability of drug-like space: towards a polypharmacologically competent map of drug-relevant compounds.

    PubMed

    Sidorov, Pavel; Gaspar, Helena; Marcou, Gilles; Varnek, Alexandre; Horvath, Dragos

    2015-12-01

    Intuitive visual rendering (mapping) of high-dimensional chemical spaces (CS) is an important topic in chemoinformatics. Until now, such maps were dedicated to specific compound collections: either limited series of known activities, or large, even exhaustive enumerations of molecules without associated property data. Typically, they were challenged to answer some classification problem with respect to those same molecules, admired for their aesthetic virtues and then forgotten, because they were set-specific constructs. This work addresses the question of whether a general, compound set-independent map can be generated, and the claim of "universality" quantitatively justified, with respect to all the structure-activity information available so far, or, more realistically, an exploitable but significant fraction thereof. The "universal" CS map is expected to project molecules from the initial CS into a lower-dimensional space that is neighborhood behavior-compliant with respect to a large panel of ligand properties. Such a map should be able to discriminate actives from inactives, or even support quantitative neighborhood-based, parameter-free property prediction (regression) models, for a wide panel of targets and target families. It should be polypharmacologically competent, without requiring any target-specific parameter fitting. This work describes an evolutionary growth procedure for such maps, based on generative topographic mapping, followed by the validation of their polypharmacological competence. Validation was achieved with respect to a maximum of exploitable structure-activity information, covering all Homo sapiens proteins of the ChEMBL database, antiparasitic and antiviral data, etc. Five evolved maps satisfactorily solved hundreds of activity-based ligand classification challenges for targets, and even for in vivo properties independent of the training data. They also stood up to chemogenomics-related challenges, as cumulated responsibility vectors obtained by mapping target-specific ligand collections were shown to represent validated target descriptors, complying with currently accepted target classification in biology. Therefore, they represent, in our opinion, a robust and well documented answer to the key question "What is a good CS map?"

  17. Comparative Genomics Analyses Reveal Extensive Chromosome Colinearity and Novel Quantitative Trait Loci in Eucalyptus.

    PubMed

    Li, Fagen; Zhou, Changpin; Weng, Qijie; Li, Mei; Yu, Xiaoli; Guo, Yong; Wang, Yu; Zhang, Xiaohong; Gan, Siming

    2015-01-01

    Dense genetic maps, along with quantitative trait loci (QTLs) detected on such maps, are powerful tools for genomics and molecular breeding studies. In the important woody genus Eucalyptus, the recent release of the E. grandis genome sequence allows for sequence-based genomic comparison and searching for positional candidate genes within QTL regions. Here, dense genetic maps were constructed for E. urophylla and E. tereticornis using genomic simple sequence repeats (SSR), expressed sequence tag (EST) derived SSR, EST-derived cleaved amplified polymorphic sequence (EST-CAPS), and diversity arrays technology (DArT) markers. The E. urophylla and E. tereticornis maps comprised 700 and 585 markers across 11 linkage groups, totaling 1,208.2 and 1,241.4 cM in length, respectively. Extensive synteny and colinearity were observed as compared to three earlier DArT-based eucalypt maps (two maps with E. grandis × E. urophylla and one map of E. globulus) and with the E. grandis genome sequence. Fifty-three QTLs for growth (10-56 months of age) and wood density (56 months) were identified in 22 discrete regions on both maps, in which only one colocalization was found between growth and wood density. Novel QTLs were revealed as compared with those previously detected on DArT-based maps for similar ages in Eucalyptus. Eleven to 585 positional candidate genes were obtained for a 56-month-old QTL through aligning the QTL confidence interval with the E. grandis genome. These results will assist in comparative genomics studies, targeted gene characterization, and marker-assisted selection in Eucalyptus and the related taxa.

  18. Comparative Genomics Analyses Reveal Extensive Chromosome Colinearity and Novel Quantitative Trait Loci in Eucalyptus

    PubMed Central

    Weng, Qijie; Li, Mei; Yu, Xiaoli; Guo, Yong; Wang, Yu; Zhang, Xiaohong; Gan, Siming

    2015-01-01

    Dense genetic maps, along with quantitative trait loci (QTLs) detected on such maps, are powerful tools for genomics and molecular breeding studies. In the important woody genus Eucalyptus, the recent release of the E. grandis genome sequence allows for sequence-based genomic comparison and searching for positional candidate genes within QTL regions. Here, dense genetic maps were constructed for E. urophylla and E. tereticornis using genomic simple sequence repeats (SSR), expressed sequence tag (EST) derived SSR, EST-derived cleaved amplified polymorphic sequence (EST-CAPS), and diversity arrays technology (DArT) markers. The E. urophylla and E. tereticornis maps comprised 700 and 585 markers across 11 linkage groups, totaling 1,208.2 and 1,241.4 cM in length, respectively. Extensive synteny and colinearity were observed as compared to three earlier DArT-based eucalypt maps (two maps with E. grandis × E. urophylla and one map of E. globulus) and with the E. grandis genome sequence. Fifty-three QTLs for growth (10–56 months of age) and wood density (56 months) were identified in 22 discrete regions on both maps, in which only one colocalization was found between growth and wood density. Novel QTLs were revealed as compared with those previously detected on DArT-based maps for similar ages in Eucalyptus. Eleven to 585 positional candidate genes were obtained for a 56-month-old QTL through aligning the QTL confidence interval with the E. grandis genome. These results will assist in comparative genomics studies, targeted gene characterization, and marker-assisted selection in Eucalyptus and the related taxa. PMID:26695430

  19. PFGE MAPPER and PFGE READER: two tools to aid in the analysis and data input of pulse field gel electrophoresis maps.

    PubMed Central

    Shifman, M. A.; Nadkarni, P.; Miller, P. L.

    1992-01-01

    Pulse field gel electrophoresis mapping is an important technique for characterizing large segments of DNA. We have developed two tools to aid in the construction of pulse field electrophoresis gel maps: PFGE READER, which stores experimental conditions and calculates fragment sizes, and PFGE MAPPER, which constructs pulse field gel electrophoresis maps. PMID:1482898
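
    The abstract does not describe how PFGE READER calculates fragment sizes; a common convention is to interpolate log fragment size against migration distance using a ladder of known standards, as in the hypothetical numpy sketch below (not necessarily what PFGE READER does).

```python
import numpy as np

# Hypothetical size ladder: migration distance (mm) of fragments of known size (kb).
ladder_distance = np.array([20.0, 28.0, 35.0, 44.0, 55.0])
ladder_size_kb  = np.array([600., 300., 150.,  75.,  30.])

def estimate_size(distance_mm):
    """Interpolate log10(size) against migration distance to size an unknown band."""
    log_size = np.interp(distance_mm, ladder_distance, np.log10(ladder_size_kb))
    return 10 ** log_size

print(round(estimate_size(31.5), 1), "kb")   # a band between the 300 and 150 kb markers
```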

  20. Battle of France WWII

    NASA Astrophysics Data System (ADS)

    Gadhath, Arpitha Rao

    The purpose of this thesis is to build an interactive Geographical Information System (GIS) tool relating to the series of events that occurred during the Battle of France in World War II. The tool gives an insight into the countries involved in the battle, their allies and their strategies. It was created to serve as a one-stop source of information on all the important battles that took place and led to the fall of France. The tool brings together the maps of all the countries involved. Integrated with each map is the data relevant to that map. The data for each country include the place of attack, the strategies used during the attack, and the kind of warfare. The tool also makes use of HTML files to present the information, along with images from the time of the war and footage that explains each particular battle. The tool was built using Java, along with MOJO (Map Objects Java Objects), to develop maps of each of the countries. MOJO is developed by ESRI (Environmental Systems Research Institute) and makes it easier to add data to the maps; it also makes highlighting important information easier through pop-up windows, charts and infographics. The HTML files were designed using an open-source template from Bootstrap. The tool is built in such a way that the interface is simple and easy for the user to use and understand.

  1. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

    To achieve public awareness and thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way, scientific output can be communicated to the public, allowing climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems to be a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics and too narrow uncertainty ranges will classify little or no area as having similar climate, while too few indicators and too wide uncertainty ranges will classify overly large regions as similar, which may not be correct. Similarity cannot be explored just by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, like maxima, minima, variation magnitude, frequency of extreme events etc., the identification of appropriate similarity conditions is a crucial question to be solved. For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague to conduct a useful Climate Twins region search. The Climate Twins tool currently compares future climate conditions of a source area in the Greater Alpine Region with current climate conditions across the whole of Europe and the neighbouring southern and south-eastern areas as target regions. A next version will integrate web crawling features for searching information about climate-related local adaptations observed today in the target regions, which may turn out to be appropriate solutions for the source region under future climate conditions. The contribution will present the current tool functionality and discuss which indicator sets, similarity conditions and uncertainty ranges work best to deliver scientifically sound climate comparisons and distinct mapping results.
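
    A minimal sketch of the indicator-and-tolerance comparison discussed above: a target cell counts as a Climate Twin only if every climate indicator of its current conditions falls within an uncertainty range around the source area's future value. The indicators, values and tolerances below are invented for illustration and do not reflect the tool's actual configuration.

```python
import numpy as np

# Hypothetical climate indicators for the source area under a future scenario
# and for candidate target cells under current conditions (units are illustrative).
indicators = ["mean_temp_C", "summer_max_C", "annual_precip_mm", "dry_days"]
future_source = np.array([12.5, 31.0, 640.0, 95.0])
current_targets = {
    "cell_A": np.array([12.1, 30.4, 660.0, 90.0]),
    "cell_B": np.array([15.9, 34.0, 410.0, 140.0]),
}

# Uncertainty ranges (tolerances) per indicator; a cell is a "Climate Twin"
# only if every indicator falls inside its range.  Wider ranges admit more
# cells, narrower ranges admit fewer, which is exactly the trade-off above.
tolerance = np.array([1.0, 1.5, 80.0, 10.0])

twins = [name for name, vals in current_targets.items()
         if np.all(np.abs(vals - future_source) <= tolerance)]
print(twins)   # ['cell_A']
```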

  2. Molecular inversion probe assay.

    PubMed

    Absalan, Farnaz; Ronaghi, Mostafa

    2007-01-01

    We have described molecular inversion probe technologies for large-scale genetic analyses. This technique provides a comprehensive and powerful tool for the analysis of genetic variation and enables affordable, large-scale studies that will help uncover the genetic basis of complex disease and explain the individual variation in response to therapeutics. Major applications of the molecular inversion probes (MIP) technologies include targeted genotyping from focused regions to whole-genome studies, and allele quantification of genomic rearrangements. The MIP technology (used in the HapMap project) provides an efficient, scalable, and affordable way to score polymorphisms in case/control populations for genetic studies. The MIP technology provides the highest commercially available multiplexing levels and assay conversion rates for targeted genotyping. This enables more informative, genome-wide studies with either the functional (direct detection) approach or the indirect detection approach.

  3. Evidence-Based Concept Mapping for the Athletic Training Student

    ERIC Educational Resources Information Center

    Speicher, Timothy E.; Martin, Malissa; Zigmont, Jason

    2013-01-01

    Context: A concept map is a graphical and cognitive tool that enables learners to link together interrelated concepts using propositions or statements that answer a posed problem. As an assessment tool, concept mapping reveals a learner's research skill proficiency and cognitive processing. Background: The identification and organization of the…

  4. Mapping one strong 'Ohana: using network analysis and GIS to enhance the effectiveness of a statewide coalition to prevent child abuse and neglect.

    PubMed

    Cardazone, Gina; U Sy, Angela; Chik, Ivan; Corlew, Laura Kate

    2014-06-01

    Network analysis and GIS enable the presentation of meaningful data about organizational relationships and community characteristics, respectively. Together, these tools can provide a concrete representation of the ecological context in which coalitions operate, and may help coalitions identify opportunities for growth and enhanced effectiveness. This study uses network analysis and GIS mapping as part of an evaluation of the One Strong 'Ohana (OSO) campaign. The OSO campaign was launched in 2012 via a partnership between the Hawai'i Children's Trust Fund (HCTF) and the Joyful Heart Foundation. The OSO campaign uses a collaborative approach aimed at increasing public awareness of child maltreatment and protective factors that can prevent maltreatment, as well as enhancing the effectiveness of the HCTF Coalition. This study focuses on three elements of the OSO campaign evaluation: (1) Network analysis exploring the relationships between 24 active Coalition member organizations, (2) GIS mapping of responses to a randomized statewide phone survey (n = 1,450) assessing awareness of factors contributing to child maltreatment, and (3) Combined GIS maps and network data, illustrating opportunities for geographically-targeted coalition building and public awareness activities.
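
    As an illustration of the network-analysis component, the sketch below builds a toy collaboration network with networkx and reports degree and betweenness centrality, the kind of measures that highlight well-connected members and brokers between otherwise weakly linked parts of a coalition; the organizations and ties are hypothetical.

```python
import networkx as nx

# Hypothetical collaboration ties among a few coalition member organizations.
G = nx.Graph()
G.add_edges_from([
    ("HCTF", "Org A"), ("HCTF", "Org B"), ("HCTF", "Org C"),
    ("Org A", "Org B"), ("Org C", "Org D"), ("Org D", "Org E"),
])

# Degree and betweenness centrality highlight well-connected members and
# potential brokers whose removal would fragment the network.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

for org in sorted(G, key=betweenness.get, reverse=True):
    print(f"{org:<6} degree={degree[org]:.2f} betweenness={betweenness[org]:.2f}")
```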

  5. The historical development of the magnetic method in exploration

    USGS Publications Warehouse

    Nabighian, M.N.; Grauch, V.J.S.; Hansen, R.O.; LaFehr, T.R.; Li, Y.; Peirce, J.W.; Phillips, J.D.; Ruder, M.E.

    2005-01-01

    The magnetic method, perhaps the oldest of geophysical exploration techniques, blossomed after the advent of airborne surveys in World War II. With improvements in instrumentation, navigation, and platform compensation, it is now possible to map the entire crustal section at a variety of scales, from strongly magnetic basement at regional scale to weakly magnetic sedimentary contacts at local scale. Methods of data filtering, display, and interpretation have also advanced, especially with the availability of low-cost, high-performance personal computers and color raster graphics. The magnetic method is the primary exploration tool in the search for minerals. In other arenas, the magnetic method has evolved from its sole use for mapping basement structure to include a wide range of new applications, such as locating intrasedimentary faults, defining subtle lithologic contacts, mapping salt domes in weakly magnetic sediments, and better defining targets through 3D inversion. These new applications have increased the method's utility in all realms of exploration - in the search for minerals, oil and gas, geothermal resources, and groundwater, and for a variety of other purposes such as natural hazards assessment, mapping impact structures, and engineering and environmental studies. © 2005 Society of Exploration Geophysicists. All rights reserved.

  6. Infrared small target detection based on directional zero-crossing measure

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangyue; Ding, Qinghai; Luo, Haibo; Hui, Bin; Chang, Zheng; Zhang, Junchao

    2017-12-01

    Infrared small target detection under complex backgrounds and low signal-to-clutter ratio (SCR) conditions is of great significance to the development of precision guidance and infrared surveillance. In order to detect targets precisely and extract them from intricate clutter effectively, a detection method based on a zero-crossing saliency (ZCS) map is proposed. The original image is first decomposed into different first-order directional derivative (FODD) maps using FODD filters. The ZCS map is then obtained by fusing all directional zero-crossing points. Finally, an adaptive threshold is adopted to segment targets from the ZCS map. Experimental results on a series of images show that our method is effective and robust for detection under complex backgrounds. Moreover, compared with five other state-of-the-art methods, our method achieves better performance in terms of detection rate, SCR gain and background suppression factor.
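
    A rough numpy sketch of the zero-crossing idea: first-order differences are taken along four directions, sign changes from positive to negative are fused into a saliency score weighted by local contrast, and an adaptive threshold segments candidate targets. The filters, weighting and threshold here are simplified assumptions, not the paper's exact FODD formulation.

```python
import numpy as np

DIRECTIONS = [(0, 1), (1, 0), (1, 1), (1, -1)]    # 0, 90, 45 and 135 degrees

def zero_crossing_saliency(img):
    """Fuse zero crossings of first-order directional derivatives into a
    saliency map: a point-like bright target makes the derivative switch from
    positive to negative in every direction, while smooth background and long
    edges do not."""
    score = np.zeros_like(img, dtype=float)
    for dr, dc in DIRECTIONS:
        forward = np.roll(img, (-dr, -dc), axis=(0, 1)) - img     # d(r,c) = I(r+dr,c+dc)-I(r,c)
        backward = np.roll(forward, (dr, dc), axis=(0, 1))        # derivative one step earlier
        crossing = (backward > 0) & (forward < 0)                 # + to - sign change
        score += crossing * (backward - forward)                  # weight by local contrast
    return score

rng = np.random.default_rng(1)
frame = rng.normal(0.0, 0.05, (64, 64))
frame[30, 40] += 1.0                                # implant a point-like "target"

zcs = zero_crossing_saliency(frame)
threshold = zcs.mean() + 5.0 * zcs.std()            # simple adaptive threshold
print(np.argwhere(zcs > threshold))                 # should include (30, 40)
```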

  7. iPads at Field Camp: A First Test of the Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Hurst, S. D.; Stewart, M. A.

    2011-12-01

    An iPad 2 was given to approximately half of the University of Illinois students attending the Wasatch-Uinta Field Camp (WUFC) in summer 2011. The iPads were provisioned with orientation measuring, mapping and location software. The software would automatically transfer an orientation measurement to the current location on the Google Maps application, and was able to output a full list of orientation data. Students also had normal access to more traditional mapping tools such as Brunton compasses and GPS units and were required to map with these tools along with other students of WUFC not provided iPads. Compared to traditional tools, iPads have drawbacks such as increased weight, break-ability, need for power source and wireless connectivity; in sum, they need a substantial infrastructure that reduces range, availability, and probably most importantly, convenience. Some of these drawbacks inhibited adoption by our students, the primary reasons being the added weight and the inability to map directly to a GIS application with detailed topographic maps equivalent to the physical topographic map sheets used at WUFC. In their favor, the iPads combine a host of tools into one, including software that can measure orientation in a fashion more intuitively than a Brunton. They also allow storage, editing and analysis of data, notes (spoken and/or written) and potentially unlimited access to a variety of maps. Via a post-field camp survey of the University of Illinois students at WUFC, we have identified some of the important issues that need to be addressed before portable tablets like the iPad become the tool of choice for general field work. Some problems are intrinsic to almost any advanced technology, some are artifacts of the current generations of hardware and software available for these devices. Technical drawbacks aside, the adoption of iPads was further inhibited primarily by inexperience with their use as a mapping tool and secondarily by their redundancy with traditional tools. We are addressing some aspects of software limitations and future technology improvements by the industry will naturally reduce other limitations. We will continue testing iPads during field trips and courses for the foreseeable future. As we begin to deal with these limitations and students become more accustomed to their use in the field, we expect our students to more fully embrace iPads as a convenient field and mapping tool.

  8. Coproducing Aboriginal patient journey mapping tools for improved quality and coordination of care.

    PubMed

    Kelly, Janet; Dwyer, Judith; Mackean, Tamara; O'Donnell, Kim; Willis, Eileen

    2016-12-08

    This paper describes the rationale and process for developing a set of Aboriginal patient journey mapping tools with Aboriginal patients, health professionals, support workers, educators and researchers in the Managing Two Worlds Together project between 2008 and 2015. Aboriginal patients and their families from rural and remote areas, and healthcare providers in urban, rural and remote settings, shared their perceptions of the barriers and enablers to quality care in interviews and focus groups, and individual patient journey case studies were documented. Data were thematically analysed. In the absence of suitable existing tools, a new analytical framework and mapping approach was developed. The utility of the tools in other settings was then tested with health professionals, and the tools were further modified for use in quality improvement in health and education settings in South Australia and the Northern Territory. A central set of patient journey mapping tools with flexible adaptations, a workbook, and five sets of case studies describing how staff adapted and used the tools at different sites are available for wider use.

  9. Facilitating participatory multilevel decision-making by using interactive mental maps.

    PubMed

    Pfeiffer, Constanze; Glaser, Stephanie; Vencatesan, Jayshree; Schliermann-Kraus, Elke; Drescher, Axel; Glaser, Rüdiger

    2008-11-01

    Participation of citizens in political, economic or social decisions is increasingly recognized as a precondition to foster sustainable development processes. Since spatial information is often important during planning and decision making, participatory mapping gains in popularity. However, little attention has been paid to the fact that information must be presented in a useful way to reach city planners and policy makers. Above all, the importance of visualisation tools to support collaboration, analytical reasoning, problem solving and decision-making in analysing and planning processes has been underestimated. In this paper, we describe how an interactive mental map tool has been developed in a highly interdisciplinary disaster management project in Chennai, India. We moved from a hand drawn mental maps approach to an interactive mental map tool. This was achieved by merging socio-economic and geospatial data on infrastructure, local perceptions, coping and adaptation strategies with remote sensing data and modern technology of map making. This newly developed interactive mapping tool allowed for insights into different locally-constructed realities and facilitated the communication of results to the wider public and respective policy makers. It proved to be useful in visualising information and promoting participatory decision-making processes. We argue that the tool bears potential also for health research projects. The interactive mental map can be used to spatially and temporally assess key health themes such as availability of, and accessibility to, existing health care services, breeding sites of disease vectors, collection and storage of water, waste disposal, location of public toilets or defecation sites.

  10. Technological Advances In The Surgical Treatment Of Movement Disorders

    PubMed Central

    Gross, Robert E.; McDougal, Margaret E.

    2013-01-01

    Technological innovations have driven the advancement of the surgical treatment of movement disorders, from the invention of the stereotactic frame to the adaptation of deep brain stimulation (DBS). Along these lines, this review will describe recent advances in getting neuromodulation modalities, including DBS, to the target; and in the delivery of therapy at the target. Recent radiological advances are altering the way that DBS leads are targeted and inserted, by refining the ability to visualize the subcortical targets using high-field strength MRI and other innovations such as diffusion tensor imaging, and the development of novel targeting devices enabling purely anatomical implantations without the need for neurophysiological monitoring. New portable CT scanners also are facilitating lead implantation without monitoring as well as improving radiological verification of DBS lead location. Advances in neurophysiological mapping include efforts to develop automatic target verification algorithms, and probabilistic maps to guide target selection. The delivery of therapy at the target is being improved by the development of the next generation of internal pulse generators (IPGs). These include constant current devices that mitigate the variability introduced by impedance changes of the stimulated tissue, and in the near future, devices that deliver novel stimulation patterns with improved efficiency. Closed-loop adaptive IPGs are being tested, which may tailor stimulation to ongoing changes in the nervous system reflected in 'biomarkers' continuously recorded by the devices. Finer grained DBS leads, in conjunction with new IPGs and advanced programming tools, may offer improved outcomes via 'current steering' algorithms. Finally, even thermocoagulation - essentially replaced by DBS - is being advanced by new 'minimally-invasive' approaches that may improve this therapy for selected patients in whom it may be preferred. Functional neurosurgery has a history of being driven by technological innovation, a tradition that continues into its future. PMID:23812894

  11. RISK COMMUNICATION IN ACTION: THE TOOLS OF MESSAGE MAPPING

    EPA Science Inventory

    Risk Communication in Action: The Tools of Message Mapping, is a workbook designed to guide risk communicators in crisis situations. The first part of this workbook will review general guidelines for risk communication. The second part will focus on one of the most robust tools o...

  12. Jules Verne Voyager, Jr: An Interactive Map Tool for Teaching Plate Tectonics

    NASA Astrophysics Data System (ADS)

    Hamburger, M. W.; Meertens, C. M.

    2010-12-01

    We present an interactive, web-based map utility that can make new geological and geophysical results accessible to a large number and variety of users. The tool provides a user-friendly interface that allows users to access a variety of maps, satellite images, and geophysical data at a range of spatial scales. The map tool, dubbed 'Jules Verne Voyager, Jr.', allows users to interactively create maps of a variety of study areas around the world. The utility was developed in collaboration with the UNAVCO Consortium for study of global-scale tectonic processes. Users can choose from a variety of base maps (including "Face of the Earth" and "Earth at Night" satellite imagery mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others), add a number of geographic and geophysical overlays (coastlines, political boundaries, rivers and lakes, earthquake and volcano locations, stress axes, etc.), and then superimpose both observed and model velocity vectors representing a compilation of 2933 GPS geodetic measurements from around the world. A remarkable characteristic of the geodetic compilation is that users can select from some 21 plates' frames of reference, allowing a visual representation of both 'absolute' plate motion (in a no-net rotation reference frame) and relative motion along all of the world's plate boundaries. The tool allows users to zoom among at least three map scales. The map tool can be viewed at http://jules.unavco.org/VoyagerJr/Earth. A more detailed version of the map utility, developed in conjunction with the EarthScope initiative, focuses on North America geodynamics, and provides more detailed geophysical and geographic information for the United States, Canada, and Mexico. The ‘EarthScope Voyager’ can be accessed at http://jules.unavco.org/VoyagerJr/EarthScope. Because the system uses pre-constructed gif images and overlays, the system can rapidly create and display maps to a large number of users simultaneously and does not require any special software installation on users' systems. In addition, a javascript-based educational interface, dubbed "Exploring our Dynamic Planet", incorporates the map tool, explanatory material, background scientific material, and curricular activities that encourage users to explore Earth processes using the Jules Verne Voyager, Jr. tool. Exploring our Dynamic Planet can be viewed at http://www.dpc.ucar.edu/VoyagerJr/. Because of its flexibility, the map utilities can be used for hands-on exercises exploring plate interaction in a range of academic settings, from high school science classes to entry-level undergraduate to graduate-level tectonics courses.

  13. National Seabed Mapping Programmes Collaborate to Advance Marine Geomorphological Mapping in Adjoining European Seas

    NASA Astrophysics Data System (ADS)

    Monteys, X.; Guinan, J.; Green, S.; Gafeira, J.; Dove, D.; Baeten, N. J.; Thorsnes, T.

    2017-12-01

    Marine geomorphological mapping is an effective means of characterising and understanding the seabed and its features, with direct relevance to offshore infrastructure placement, benthic habitat mapping, conservation & policy, marine spatial planning, fisheries management and pure research. Advancements in acoustic survey techniques and data processing methods, resulting in the availability of high-resolution marine datasets (e.g. multibeam echosounder bathymetry and shallow seismic), mean that geological interpretations can be greatly improved by combining them with geomorphological maps. Since December 2015, representatives from the national seabed mapping programmes of Norway (MAREANO), Ireland (INFOMAR) and the United Kingdom (MAREMAP) have collaborated and established the MIM geomorphology working group with the common aim of advancing best practice for geological mapping in their adjoining sea areas in north-west Europe. A recently developed two-part classification system for Seabed Geomorphology ('Morphology' and 'Geomorphology') has been established as a result of an initiative led by the British Geological Survey (BGS) with contributions from the MIM group (Dove et al. 2016). To support the scheme, existing BGS GIS tools (SIGMA) have been adapted to apply this two-part classification system, and here we present the tool's effectiveness in mapping geomorphological features, along with progress in harmonising the classification and feature nomenclature. Recognising that manual mapping of seabed features can be time-consuming and subjective, semi-automated approaches for mapping seabed features and improving mapping efficiency are being developed using ArcGIS-based tools. These methods recognise, spatially delineate and morphologically describe seabed features such as pockmarks (Gafeira et al., 2012) and cold-water coral mounds. Such tools utilise multibeam echosounder data or any other bathymetric dataset (e.g. 3D seismic, Geldof et al., 2014) that can produce a digital depth model. The tools have the capability to capture an extensive list of morphological attributes. The MIM geomorphology working group's strategy to develop methods for more efficient marine geomorphological mapping is presented with data examples and case studies showing the latest results.
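
    The semi-automated feature mapping mentioned above can be approximated, very loosely, by detecting closed depressions in a bathymetric grid. The sketch below, using scipy.ndimage on a synthetic depth surface, labels cells that sit well below a smoothed regional surface and reports area and relief per feature; it is not the BGS/SIGMA tool, and the grid, filter size and 1 m relief cutoff are arbitrary assumptions.

```python
import numpy as np
from scipy import ndimage

# Hypothetical bathymetry grid (depths in metres, negative down) with two
# seeded depressions standing in for pockmarks.
rng = np.random.default_rng(0)
depth = -50.0 + rng.normal(0, 0.2, (100, 100))
yy, xx = np.mgrid[0:100, 0:100]
for cy, cx in [(30, 40), (70, 20)]:
    depth -= 3.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 20.0)

# A simple morphological approach: a cell belongs to a depression if it sits
# well below the smoothed regional surface; connected cells are then labelled
# and described (area, relief), loosely in the spirit of the semi-automated
# GIS tools mentioned above.
regional = ndimage.uniform_filter(depth, size=25)
mask = depth < regional - 1.0                      # at least 1 m below regional surface
labels, n = ndimage.label(mask)
for i in range(1, n + 1):
    cells = labels == i
    print(f"pockmark {i}: area={cells.sum()} cells, relief={(regional - depth)[cells].max():.1f} m")
```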

  14. Assessment and application of national environmental databases and mapping tools at the local level to two community case studies.

    PubMed

    Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad

    2011-03-01

    Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.

  15. Self Organizing Map-Based Classification of Cathepsin k and S Inhibitors with Different Selectivity Profiles Using Different Structural Molecular Fingerprints: Design and Application for Discovery of Novel Hits.

    PubMed

    Ihmaid, Saleh K; Ahmed, Hany E A; Zayed, Mohamed F; Abadleh, Mohammed M

    2016-01-30

    The main step in a successful drug discovery pipeline is the identification of small potent compounds that selectively bind to the target of interest with high affinity. However, there is still a shortage of efficient and accurate computational methods capable of studying and hence predicting compound selectivity properties. In this work, we propose an affordable machine learning method to perform compound selectivity classification and prediction. For this purpose, we collected compounds with reported activity and built a selectivity database of 153 cathepsin K and S inhibitors that are considered of medicinal interest. This database has three compound sets: two selective ones (K/S and S/K) and one non-selective one (KS). We subjected this database to the selectivity classification tool 'Emergent Self-Organizing Maps' to explore its capability to differentiate cathepsin inhibitors selective for one target over the other. The method exhibited good clustering performance for selective ligands with high accuracy (up to 100%). Among the possibilities, BAPs and MACCS molecular structural fingerprints were used for the classification. The results demonstrated the method's ability to support structure-selectivity relationship interpretation, and selectivity markers were identified for the design of further novel inhibitors with high activity and target selectivity.
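
    To make the approach concrete, the sketch below trains a tiny self-organizing map from scratch in numpy on hypothetical binary fingerprints of two selectivity classes and then checks whether map nodes end up dominated by one class. It is a toy stand-in for the Emergent Self-Organizing Maps software, with invented data, grid size and training schedule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary fingerprints (e.g. MACCS-like bit vectors) for two
# selectivity classes of inhibitors.
n_bits = 64
class_K = (rng.random((40, n_bits)) < 0.2).astype(float)
class_S = (rng.random((40, n_bits)) < 0.2).astype(float)
class_S[:, :8] = 1.0                     # give one class a distinguishing substructure
X = np.vstack([class_K, class_S])
y = np.array(["K-selective"] * 40 + ["S-selective"] * 40)

# A very small self-organizing map (4 x 4 grid), trained with a shrinking
# neighbourhood and learning rate.
grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
weights = rng.random((16, n_bits))
for t in range(2000):
    x = X[rng.integers(len(X))]
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))          # best-matching unit
    sigma = 2.0 * np.exp(-t / 1000)                            # neighbourhood radius
    lr = 0.5 * np.exp(-t / 1000)                               # learning rate
    dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[:, None]
    weights += lr * h * (x - weights)

# Each compound lands on its best-matching node; nodes dominated by one class
# indicate that the fingerprints carry a selectivity signal.
nodes = np.array([int(np.argmin(((weights - x) ** 2).sum(axis=1))) for x in X])
for node in sorted(set(nodes)):
    members = y[nodes == node]
    print(node, dict(zip(*np.unique(members, return_counts=True))))
```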

  16. Rapid discrimination of different Apiaceae species based on HPTLC fingerprints and targeted flavonoids determination using multivariate image analysis.

    PubMed

    Shawky, Eman; Abou El Kheir, Rasha M

    2018-02-11

    Species of Apiaceae are used in folk medicine as spices and in officinal medicinal preparations of drugs. They are an excellent source of phenolics exhibiting antioxidant activity, which are of great benefit to human health. Discrimination among Apiaceae medicinal herbs remains an intricate challenge due to their morphological similarity. In this study, a combined "untargeted" and "targeted" approach to investigating different Apiaceae plant species was proposed by merging high-performance thin layer chromatography (HPTLC) image analysis and pattern recognition methods, which were used for fingerprinting and classification of 42 different Apiaceae samples collected from Egypt. Software for image processing was applied for fingerprinting and data acquisition. HPTLC fingerprinting assisted by principal component analysis (PCA) and hierarchical cluster analysis (HCA) heat maps resulted in a reliable untargeted approach for discrimination and classification of the different samples. The "targeted" approach was performed by developing and validating an HPTLC method allowing the quantification of eight flavonoids. The combination of quantitative data with PCA and HCA heat maps allowed the different samples to be discriminated from each other. The use of chemometric tools for evaluation of fingerprints reduced expense and analysis time. The proposed method can be adopted for routine discrimination and evaluation of the phytochemical variability in different Apiaceae species extracts. Copyright © 2018 John Wiley & Sons, Ltd.
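
    As an illustration of the untargeted chemometric step, the short sketch below runs PCA and a clustered (HCA) heat map on a samples-by-bands intensity matrix; the random matrix stands in for densitometric HPTLC band intensities and is not data from the study.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler
        import seaborn as sns

        rng = np.random.default_rng(0)
        X = rng.random((42, 25))                   # placeholder: 42 samples x 25 HPTLC bands

        X_std = StandardScaler().fit_transform(X)  # autoscale each band

        # PCA scores used for sample discrimination
        scores = PCA(n_components=2).fit_transform(X_std)
        print("first two PC scores of the first samples:\n", scores[:3])

        # Hierarchical cluster analysis rendered as a clustered heat map
        sns.clustermap(X_std, method="ward", metric="euclidean", cmap="viridis")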

  17. Barley whole exome capture: a tool for genomic research in the genus Hordeum and beyond

    PubMed Central

    Mascher, Martin; Richmond, Todd A; Gerhardt, Daniel J; Himmelbach, Axel; Clissold, Leah; Sampath, Dharanya; Ayling, Sarah; Steuernagel, Burkhard; Pfeifer, Matthias; D'Ascenzo, Mark; Akhunov, Eduard D; Hedley, Pete E; Gonzales, Ana M; Morrell, Peter L; Kilian, Benjamin; Blattner, Frank R; Scholz, Uwe; Mayer, Klaus FX; Flavell, Andrew J; Muehlbauer, Gary J; Waugh, Robbie; Jeddeloh, Jeffrey A; Stein, Nils

    2013-01-01

    Advanced resources for genome-assisted research in barley (Hordeum vulgare), including a whole-genome shotgun assembly and an integrated physical map, have recently become available. These have made possible studies that aim to assess genetic diversity or to isolate single genes by whole-genome resequencing and in silico variant detection. However, such an approach remains expensive given the 5 Gb size of the barley genome. Targeted sequencing of the mRNA-coding exome reduces barley genomic complexity more than 50-fold, thus dramatically reducing this heavy sequencing and analysis load. We have developed and employed an in-solution hybridization-based sequence capture platform to selectively enrich for a 61.6 megabase coding sequence target that includes predicted genes from the genome assembly of the cultivar Morex as well as publicly available full-length cDNAs and de novo assembled RNA-Seq consensus sequence contigs. The platform provides a highly specific capture with substantial and reproducible enrichment of targeted exons, both for cultivated barley and related species. We show that this exome capture platform provides a clear path towards a broader and deeper understanding of the natural variation residing in the mRNA-coding part of the barley genome and will thus constitute a valuable resource for applications such as mapping-by-sequencing and genetic diversity analyses. PMID:23889683

  18. Mapping as a visual health communication tool: promises and dilemmas.

    PubMed

    Parrott, Roxanne; Hopfer, Suellen; Ghetian, Christie; Lengerich, Eugene

    2007-01-01

    In the era of evidence-based public health promotion and planning, the use of maps as a form of evidence to communicate about the multiple determinants of cancer is on the rise. Geographic information systems and mapping technologies make future proliferation of this strategy likely. Yet disease maps as a communication form remain largely unexamined. This content analysis considers the presence of multivariate information, credibility cues, and the communication function of publicly accessible maps for cancer control activities. Thirty-six state comprehensive cancer control plans were publicly available in July 2005 and were reviewed for the presence of maps. Fourteen of the 36 state cancer plans (39%) contained map images (N = 59 static maps). A continuum of map interactivity was observed, with 10 states having interactive mapping tools available to query and map cancer information. Four states had both cancer plans with map images and interactive mapping tools available to the public on their Web sites. Of the 14 state cancer plans that depicted map images, two displayed multivariate data in a single map. Nine of the 10 states with interactive mapping capability offered the option to display multivariate health risk messages. The most frequent content category mapped was cancer incidence and mortality, with stage at diagnosis infrequently available. The most frequent communication function served by the maps reviewed was redundancy, as maps repeated information contained in textual forms. The social and ethical implications for communicating about cancer through the use of visual geographic representations are discussed.

  19. Visualization and analysis of pulsed ion beam energy density profile with infrared imaging

    NASA Astrophysics Data System (ADS)

    Isakova, Y. I.; Pushkarev, A. I.

    2018-03-01

    An infrared imaging technique was used as a surface temperature-mapping tool to characterize the energy density distribution of intense pulsed ion beams on a thin metal target. The technique enables measurement of the total ion beam energy and of the energy density distribution across the beam cross section, and allows one to optimize the operation of an ion diode and to control the target irradiation mode. The diagnostics was tested on the TEMP-4M accelerator at TPU, Tomsk, Russia and on the TEMP-6 accelerator at DUT, Dalian, China. It was applied in studies of the dynamics of target cooling in vacuum after irradiation and in experiments with target ablation. Errors caused by target ablation and target cooling during measurements have been analyzed. For Fluke Ti10 and Fluke Ti400 infrared cameras, the technique can achieve a surface energy density sensitivity of 0.05 J/cm2 and a spatial resolution of 1-2 mm. The thermal imaging diagnostics does not require expensive consumable materials. The measurement time does not exceed 0.1 s; therefore, this diagnostics can be used for prompt evaluation of the energy density distribution of a pulsed ion beam and during automation of the irradiation process.
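
    For orientation, converting a measured temperature rise into a surface energy density for a thin foil is essentially a calorimetric calculation. The sketch below shows that back-of-the-envelope form; the copper material constants, foil thickness and temperature values are assumptions for illustration, not parameters from the study.

        import numpy as np

        rho = 8.96          # g/cm^3, density of copper (assumed target material)
        c_p = 0.385         # J/(g*K), specific heat of copper
        thickness = 0.01    # cm, 100-um foil (assumed)

        delta_T = np.array([[5.0, 12.0],
                            [20.0, 8.0]])          # K, temperature rise per pixel

        # Energy absorbed per unit area, assuming uniform heating through the foil
        energy_density = rho * c_p * thickness * delta_T   # J/cm^2
        print(energy_density)    # roughly 0.17 J/cm^2 for a 5 K rise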

  20. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public

    PubMed Central

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314

  2. Interactive Learning Modules: Enabling Near Real-Time Oceanographic Data Use In Undergraduate Education

    NASA Astrophysics Data System (ADS)

    Kilb, D. L.; Fundis, A. T.; Risien, C. M.

    2012-12-01

    The focus of the Education and Public Engagement (EPE) component of the NSF's Ocean Observatories Initiative (OOI) is to provide a new layer of cyber-interactivity for undergraduate educators to bring near real-time data from the global ocean into learning environments. To accomplish this, we are designing six online services including: 1) visualization tools, 2) a lesson builder, 3) a concept map builder, 4) educational web services (middleware), 5) collaboration tools and 6) an educational resource database. Here, we report on our Fall 2012 release that includes the first four of these services: 1) Interactive visualization tools allow users to interactively select data of interest, display the data in various views (e.g., maps, time-series and scatter plots) and obtain statistical measures such as mean, standard deviation and a regression line fit to select data. Specific visualization tools include a tool to compare different months of data, a time series explorer tool to investigate the temporal evolution of select data parameters (e.g., sea water temperature or salinity), a glider profile tool that displays ocean glider tracks and associated transects, and a data comparison tool that allows users to view the data either in scatter plot view comparing one parameter with another, or in time series view. 2) Our interactive lesson builder tool allows users to develop a library of online lesson units, which are collaboratively editable and sharable and provides starter templates designed from learning theory knowledge. 3) Our interactive concept map tool allows the user to build and use concept maps, a graphical interface to map the connection between concepts and ideas. This tool also provides semantic-based recommendations, and allows for embedding of associated resources such as movies, images and blogs. 4) Education web services (middleware) will provide an educational resource database API.

  3. Evaluating time-lapse ERT for monitoring DNAPL remediation via numerical simulation

    NASA Astrophysics Data System (ADS)

    Power, C.; Karaoulis, M.; Gerhard, J.; Tsourlos, P.; Giannopoulos, A.

    2012-12-01

    Dense non-aqueous phase liquids (DNAPLs) remain a challenging geoenvironmental problem in the near subsurface. Numerous thermal, chemical, and biological treatment methods are being applied at sites but without a non-destructive, rapid technique to map the evolution of DNAPL mass in space and time, the degree of remedial success is difficult to quantify. Electrical resistivity tomography (ERT) has long been presented as highly promising in this context but has not yet become a practitioner's tool due to challenges in interpreting the survey results at real sites where the initial condition (DNAPL mass, DNAPL distribution, subsurface heterogeneity) is typically unknown. Recently, a new numerical model was presented that couples DNAPL and ERT simulation at the field scale, providing a tool for optimizing ERT application and interpretation at DNAPL sites (Power et al., 2011, Fall AGU, H31D-1191). The objective of this study is to employ this tool to evaluate the effectiveness of time-lapse ERT to monitor DNAPL source zone remediation, taking advantage of new inversion methodologies that exploit the differences in the target over time. Several three-dimensional releases of chlorinated solvent DNAPLs into heterogeneous clayey sand at the field scale were generated, varying in the depth and complexity of the source zone (target). Over time, dissolution of the DNAPL in groundwater was simulated with simultaneous mapping via periodic ERT surveys. Both surface and borehole ERT surveys were conducted for comparison purposes. The latest four-dimensional ERT inversion algorithms were employed to generate time-lapse isosurfaces of the DNAPL source zone for all cases. This methodology provided a qualitative assessment of the ability of ERT to track DNAPL mass removal for complex source zones in realistically heterogeneous environments. In addition, it provided a quantitative comparison between the actual DNAPL mass removed and that interpreted by ERT as a function of depth below the water table, as well as an estimate of the minimum DNAPL saturation changes necessary for an observable response from ERT.

  4. WE-AB-207B-07: Dose Cloud: Generating “Big Data” for Radiation Therapy Treatment Plan Optimization Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Folkerts, MM; University of California San Diego, La Jolla, California; Long, T

    Purpose: To provide a tool to generate large sets of realistic virtual patient geometries and beamlet doses for treatment optimization research. This tool enables countless studies exploring the fundamental interplay between patient geometry, objective functions, weight selections, and achievable dose distributions for various algorithms and modalities. Methods: Generating realistic virtual patient geometries requires a small set of real patient data. We developed a normalized patient shape model (PSM) which captures organ and target contours in a correspondence-preserving manner. Using PSM-processed data, we perform principal component analysis (PCA) to extract major modes of variation from the population. These PCA modes can be shared without exposing patient information. The modes are re-combined with different weights to produce sets of realistic virtual patient contours. Because virtual patients lack imaging information, we developed a shape-based dose calculation (SBD) relying on the assumption that the region inside the body contour is water. SBD utilizes a 2D fluence-convolved scatter kernel, derived from Monte Carlo simulations, and can compute either the full dose for a given set of fluence maps or a dose matrix (dose per fluence pixel) for many modalities. Combining the shape model with SBD provides the data needed for treatment plan optimization research. Results: We used the PSM to capture organ and target contours for 96 prostate cases, extracted the first 20 PCA modes, and generated 2048 virtual patient shapes by randomly sampling mode scores. Nearly half of the shapes were discarded for failing anatomical checks; the remaining 1124 were used in computing dose matrices via SBD and a standard 7-beam protocol. As a proof of concept, and to generate data for later study, we performed fluence map optimization emphasizing PTV coverage. Conclusions: We successfully developed and tested a tool for creating customizable sets of virtual patients suitable for large-scale radiation therapy optimization research.
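
    The shape-sampling step follows a standard PCA recipe, sketched below with random placeholder shape vectors rather than the PSM data: fit 20 modes, draw random mode scores with the population spread, reconstruct virtual contours, and apply a trivial stand-in for the anatomical plausibility checks.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        shapes = rng.random((96, 3000))       # placeholder: 96 patients x flattened contour points

        pca = PCA(n_components=20)
        scores = pca.fit_transform(shapes)    # per-patient mode scores

        # Sample random mode scores with the population's spread, then reconstruct
        std = scores.std(axis=0)
        virtual_scores = rng.normal(0.0, std, size=(2048, 20))
        virtual_shapes = pca.inverse_transform(virtual_scores)

        # Placeholder plausibility filter; the real checks are anatomical rules
        keep = np.all(np.isfinite(virtual_shapes), axis=1)
        print("virtual patients kept:", int(keep.sum()))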

  5. Predicting diffuse microbial pollution risk across catchments: The performance of SCIMAP and recommendations for future development.

    PubMed

    Porter, Kenneth D H; Reaney, Sim M; Quilliam, Richard S; Burgess, Chris; Oliver, David M

    2017-12-31

    Microbial pollution of surface waters in agricultural catchments can be a consequence of poor farm management practices, such as excessive stocking of livestock on vulnerable land or inappropriate handling of manures and slurries. Catchment interventions such as fencing of watercourses, streamside buffer strips and constructed wetlands have the potential to reduce faecal pollution of watercourses. However, these interventions are expensive and occupy valuable productive land. There is, therefore, a requirement for tools to assist in the spatial targeting of such interventions to areas where they will have the biggest impact on water quality improvements whilst occupying the minimal amount of productive land. SCIMAP is a risk-based model that has been developed for this purpose but with a focus on diffuse sediment and nutrient pollution. In this study we investigated the performance of SCIMAP in predicting microbial pollution of watercourses and assessed modelled outputs of E. coli, a common faecal indicator organism (FIO), against observed water quality information. SCIMAP was applied to two river catchments in the UK. SCIMAP uses land cover risk weightings, which are routed through the landscape based on hydrological connectivity to generate catchment scale maps of relative in-stream pollution risk. Assessment of the model's performance and derivation of optimum land cover risk weightings was achieved using a Monte Carlo sampling approach. Performance of the SCIMAP framework for informing on FIO risk was variable, with better performance in the Yealm catchment (rs = 0.88; p < 0.01) than the Wyre (rs = -0.36; p > 0.05). Across both catchments much uncertainty was associated with the application of optimum risk weightings attributed to different land use classes. Overall, SCIMAP showed potential as a useful tool in the spatial targeting of FIO diffuse pollution management strategies; however, improvements are required to transition the existing SCIMAP framework to a robust FIO risk-mapping tool. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
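
    A minimal sketch of the weight-search idea, with placeholder data: candidate land-cover risk weightings are drawn at random, a relative in-stream risk score is formed for each monitoring site, and the weighting giving the best Spearman rank correlation against observed E. coli is retained. The hydrological connectivity routing is omitted and replaced by a precomputed export matrix.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(7)
        n_sites, n_landcovers = 30, 5
        export = rng.dirichlet(np.ones(n_landcovers), size=n_sites)  # placeholder connectivity-weighted export
        observed_fio = rng.random(n_sites)                           # placeholder observed E. coli levels

        best_rs, best_w = -1.0, None
        for _ in range(10000):
            w = rng.dirichlet(np.ones(n_landcovers))   # candidate risk weights summing to 1
            predicted_risk = export @ w
            rs, _ = spearmanr(predicted_risk, observed_fio)
            if rs > best_rs:
                best_rs, best_w = rs, w

        print("optimum weights:", np.round(best_w, 2), " rs =", round(best_rs, 2))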

  6. Mappability of drug-like space: towards a polypharmacologically competent map of drug-relevant compounds

    NASA Astrophysics Data System (ADS)

    Sidorov, Pavel; Gaspar, Helena; Marcou, Gilles; Varnek, Alexandre; Horvath, Dragos

    2015-12-01

    Intuitive, visual rendering—mapping—of high-dimensional chemical spaces (CS), is an important topic in chemoinformatics. Such maps were so far dedicated to specific compound collections—either limited series of known activities, or large, even exhaustive enumerations of molecules, but without associated property data. Typically, they were challenged to answer some classification problem with respect to those same molecules, admired for their aesthetical virtues and then forgotten—because they were set-specific constructs. This work wishes to address the question whether a general, compound set-independent map can be generated, and the claim of "universality" quantitatively justified, with respect to all the structure-activity information available so far—or, more realistically, an exploitable but significant fraction thereof. The "universal" CS map is expected to project molecules from the initial CS into a lower-dimensional space that is neighborhood behavior-compliant with respect to a large panel of ligand properties. Such map should be able to discriminate actives from inactives, or even support quantitative neighborhood-based, parameter-free property prediction (regression) models, for a wide panel of targets and target families. It should be polypharmacologically competent, without requiring any target-specific parameter fitting. This work describes an evolutionary growth procedure of such maps, based on generative topographic mapping, followed by the validation of their polypharmacological competence. Validation was achieved with respect to a maximum of exploitable structure-activity information, covering all of Homo sapiens proteins of the ChEMBL database, antiparasitic and antiviral data, etc. Five evolved maps satisfactorily solved hundreds of activity-based ligand classification challenges for targets, and even in vivo properties independent from training data. They also stood chemogenomics-related challenges, as cumulated responsibility vectors obtained by mapping of target-specific ligand collections were shown to represent validated target descriptors, complying with currently accepted target classification in biology. Therefore, they represent, in our opinion, a robust and well documented answer to the key question "What is a good CS map?"

  7. ConMap: Investigating New Computer-Based Approaches to Assessing Conceptual Knowledge Structure in Physics.

    ERIC Educational Resources Information Center

    Beatty, Ian D.

    There is a growing consensus among educational researchers that traditional problem-based assessments are not effective tools for diagnosing a student's knowledge state and for guiding pedagogical intervention, and that new tools grounded in the results of cognitive science research are needed. The ConMap ("Conceptual Mapping") project, described…

  8. Using Digital Mapping Tool in Ill-Structured Problem Solving

    ERIC Educational Resources Information Center

    Bai, Hua

    2013-01-01

    Scaffolding students' problem solving and helping them to improve problem solving skills are critical in instructional design courses. This study investigated the effects of students' uses of a digital mapping tool on their problem solving performance in a design case study. It was found that the students who used the digital mapping tool…

  9. A Case Study Optimizing Human Resources in Rwanda's First Dental School: Three Innovative Management Tools.

    PubMed

    Hackley, Donna M; Mumena, Chrispinus H; Gatarayiha, Agnes; Cancedda, Corrado; Barrow, Jane R

    2018-06-01

    Harvard School of Dental Medicine, University of Maryland School of Dentistry, and the University of Rwanda (UR) are collaborating to create Rwanda's first School of Dentistry as part of the Human Resources for Health (HRH) Rwanda initiative that aims to strengthen the health care system of Rwanda. The HRH oral health team developed three management tools to measure progress in systems-strengthening efforts: 1) the road map is an operations plan for the entire dental school and facilitates delivery of the curriculum and management of human and material resources; 2) each HRH U.S. faculty member develops a work plan with targeted deliverables for his or her rotation, which is facilitated with biweekly flash reports that measure progress and keep the faculty member focused on his or her specific deliverables; and 3) the redesigned HRH twinning model, changed from twinning of an HRH faculty member with a single Rwandan faculty member to twinning with multiple Rwandan faculty members based on shared academic interests and goals, has improved efficiency, heightened engagement of the UR dental faculty, and increased the impact of HRH U.S. faculty members. These new tools enable the team to measure its progress toward the collaborative's goals and understand the successes and challenges in moving toward the planned targets. The tools have been valuable instruments in fostering discussion around priorities and deployment of resources as well as in developing strong relationships, enabling two-way exchange of knowledge, and promoting sustainability.

  10. Liquid chromatography-mass spectrometry in metabolomics research: mass analyzers in ultra high pressure liquid chromatography coupling.

    PubMed

    Forcisi, Sara; Moritz, Franco; Kanawati, Basem; Tziotis, Dimitrios; Lehmann, Rainer; Schmitt-Kopplin, Philippe

    2013-05-31

    The present review gives an introduction to the concept of metabolomics and provides an overview of the analytical tools applied in non-targeted metabolomics, with a focus on liquid chromatography (LC). LC is a powerful analytical tool in the study of complex sample matrices. A further development and configuration employing Ultra-High Pressure Liquid Chromatography (UHPLC) is optimized to provide the largest known liquid chromatographic resolution and peak capacity. Accordingly, UHPLC plays an important role in the separation and consequent metabolite identification of complex molecular mixtures such as bio-fluids. The most sensitive detectors for these purposes are mass spectrometers. Almost any mass analyzer can be optimized to identify and quantify small pre-defined sets of targets; however, the number of analytes in metabolomics is far greater. Optimized protocols for quantification of large sets of targets may be rendered inapplicable. Results on small target set analyses on different sample matrices are easily comparable with each other. In non-targeted metabolomics there is almost no analytical method which is applicable to all different matrices, due to limitations pertaining to mass analyzers and chromatographic tools. The specifications of the most important interfaces and mass analyzers are discussed. We additionally provide an exemplary application in order to demonstrate the level of complexity which remains intractable to date. The potential of coupling a high field Fourier Transform Ion Cyclotron Resonance Mass Spectrometer (ICR-FT/MS), the mass analyzer with the largest known mass resolving power, to UHPLC is illustrated with an example of one human pre-treated plasma sample. This experimental example illustrates one way of overcoming the necessity of faster scanning rates in the coupling with UHPLC. The experiment enabled the extraction of thousands of features (analytical signals). A small subset of this compositional space could be mapped into a mass difference network whose topology shows specificity toward putative metabolite classes and retention time. Copyright © 2013 Elsevier B.V. All rights reserved.
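
    The mass difference network mentioned at the end can be illustrated in a few lines: features (exact neutral masses) become nodes, and an edge is drawn whenever the mass difference between two features matches a putative biochemical transformation within a tight tolerance. The masses and transformation list below are placeholders, not data from the plasma example.

        import itertools
        import networkx as nx

        features = [180.0634, 162.0528, 342.1162, 194.0790]   # placeholder neutral masses (Da)

        # Assumed transformations: loss/gain of water, a methylene unit, a glycosyl unit
        transformations = {"H2O": 18.0106, "CH2": 14.0157, "C6H10O5": 162.0528}
        tol = 0.001   # Da

        G = nx.Graph()
        G.add_nodes_from(features)
        for m1, m2 in itertools.combinations(features, 2):
            diff = abs(m1 - m2)
            for name, dm in transformations.items():
                if abs(diff - dm) <= tol:
                    G.add_edge(m1, m2, transformation=name)

        print(list(G.edges(data=True)))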

  11. Analysis of discriminants for experimental 3D SAR imagery of human targets

    NASA Astrophysics Data System (ADS)

    Chan, Brigitte; Sévigny, Pascale; DiFilippo, David D. J.

    2014-10-01

    Development of a prototype 3-D through-wall synthetic aperture radar (SAR) system is currently underway at Defence Research and Development Canada. The intent is to map out building wall layouts and to detect targets of interest and their location behind walls such as humans, arms caches, and furniture. This situational awareness capability can be invaluable to the military working in an urban environment. Tools and algorithms are being developed to exploit the resulting 3-D imagery. Current work involves analyzing signatures of targets behind a wall and understanding the clutter and multipath signals in a room of interest. In this paper, a comprehensive study of 3-D human target signature metrics in free space is presented. The aim is to identify features for discrimination of the human target from other targets. Targets used in this investigation include a human standing, a human standing with arms stretched out, a chair, a table, and a metallic plate. Several features were investigated as potential discriminants and five which were identified as good candidates are presented in this paper. Based on this study, no single feature could be used to fully discriminate the human targets from all others. A combination of at least two different features is required to achieve this.

  12. An evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool): Concurrent, face and content validity.

    PubMed

    De Groef, An; Van Kampen, Marijke; Moortgat, Peter; Anthonissen, Mieke; Van den Kerckhove, Eric; Christiaens, Marie-Rose; Neven, Patrick; Geraerts, Inge; Devoogdt, Nele

    2018-01-01

    To investigate the concurrent, face and content validity of an evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool). 1) Concurrent validity of the MAP-BC evaluation tool was investigated by exploring correlations (Spearman's rank correlation coefficient) between the subjective scores (0 = no adhesions to 3 = very strong adhesions) of the skin level using the MAP-BC evaluation tool and objective elasticity parameters (maximal skin extension and gross elasticity) generated by the Cutometer Dual MPA 580. Nine different examination points on and around the mastectomy scar were evaluated. 2) Face and content validity were explored by questioning therapists experienced with myofascial therapy in breast cancer patients about the comprehensibility and comprehensiveness of the MAP-BC evaluation tool. 1) Only three meaningful correlations were found, all on the mastectomy scar. For the most lateral examination point on the mastectomy scar, a moderate negative correlation (-0.44, p = 0.01) with the maximal skin extension and a moderate positive correlation with the resistance versus ability of returning, or 'gross elasticity' (0.42, p = 0.02), were found. For the middle point on the mastectomy scar an almost moderate positive correlation with gross elasticity was found as well (0.38, p = 0.04). 2) Content and face validity were found to be good. Eighty-nine percent of the respondents found the instructions understandable and 98% found the scoring system obvious. Thirty-seven percent of the therapists suggested adding the possibility to evaluate additional anatomical locations in case of reconstructive and/or bilateral surgery. The MAP-BC evaluation tool for myofascial adhesions in breast cancer patients has good face and content validity. Evidence for good concurrent validity of the skin level was found only on the mastectomy scar itself.

  13. PHLUX: Photographic Flux Tools for Solar Glare and Flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2010-12-02

    A web-based tool to a) analytically and empirically quantify glare from reflected light and determine the potential impact (e.g., temporary flash blindness, retinal burn), and b) produce flux maps for central receivers. The tool accepts RAW digital photographs of the glare source (for hazard assessment) or the receiver (for flux mapping), as well as a photograph of the sun for intensity and size scaling. For glare hazard assessment, the tool determines the retinal irradiance (W/cm2) and subtended source angle for an observer and plots the glare source on a hazard spectrum (i.e., low potential for flash blindness impact, potential for flash blindness impact, retinal burn). For flux mapping, the tool provides a colored map of the receiver scaled by incident solar flux (W/m2) and unwraps the physical dimensions of the receiver while accounting for the perspective of the photographer (e.g., for a flux map of a cylindrical receiver, the horizontal axis denotes receiver angle in degrees and the vertical axis denotes vertical position in meters; for a flat panel receiver, the horizontal axis denotes horizontal position in meters and the vertical axis denotes vertical position in meters). The flux mapping capability also allows the user to specify transects along which the program plots incident solar flux on the receiver.

  14. Mapping soil texture targeting predefined depth range or synthetizing from standard layers?

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Dezső Kaposi, András; Szatmári, Gábor; Takács, Katalin; Pásztor, László

    2017-04-01

    There are increasing demands for spatial soil information to support environment-related and land use management decisions. Physical soil properties, especially particle size distribution, play an important role in this context. Some of these requirements can be satisfied by sand, silt and clay content maps compiled according to global standards such as GlobalSoilMap (GSM) or SoilGrids. Soil texture classes (e.g. according to the USDA classification) can be derived from the three fraction data, so a texture map can be compiled from the corresponding fraction maps. Both soil texture class and fraction information serve as direct inputs to crop, meteorological and hydrological models. Model inputs frequently require maps representing soil features for the 0-30 cm depth, which the standard specifications cover with three consecutive depth intervals: 0-5 cm, 5-15 cm and 15-30 cm. As GSM and SoilGrids have become the most detailed freely available spatial soil data sources, typical model users (e.g. meteorologists, agronomists or hydrologists) would produce an input map from (the weighted mean of) these three layers. However, if the basic soil data and proper knowledge are available, a soil texture map targeting the 0-30 cm layer directly can be compiled independently. In our work we compared soil texture maps of Hungary compiled with the same reference data, auxiliary data and inference methods but for different layer distributions. We produced the 0-30 cm clay, silt and sand maps as well as the maps for the three standard layers (0-5 cm, 5-15 cm, 15-30 cm). Maps of sand, silt and clay percentage were computed through regression kriging (RK) applying the additive log-ratio (alr) transformation. In addition to the Hungarian Soil Information and Monitoring System as reference soil data, a digital elevation model and its derived components, soil physical property maps, remotely sensed images, and land use, geological and meteorological data were applied as auxiliary variables. We compared the directly compiled and the synthesized clay content, sand content and texture class maps with different tools. In addition to pairwise comparison of basic statistical features (histograms, scatter plots), we examined the spatial distribution of the differences, and we quantified the taxonomic distances between the texture classes to investigate the differences between the map pairs. We concluded that the directly computed and the synthesized maps differ in various respects. The clay and sand content map pairs have to be considered statistically different, whereas the differences between the texture class maps are not significant. In all cases, however, the differences mainly concern the extreme ranges and categories. Using synthesized maps can amplify such extremes through error propagation in models and scenarios. Based on our results, we recommend using the directly compiled maps.
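
    The additive log-ratio step can be sketched in a few lines (the kriging itself is omitted): the three fractions are mapped to two unconstrained log-ratio coordinates for interpolation, and the inverse transform returns predictions to the sand/silt/clay simplex. The example compositions are invented.

        import numpy as np

        def alr(comp):
            """comp: rows of (sand, silt, clay) fractions summing to 1; clay is the denominator."""
            sand, silt, clay = comp[:, 0], comp[:, 1], comp[:, 2]
            return np.column_stack([np.log(sand / clay), np.log(silt / clay)])

        def alr_inverse(z):
            e = np.exp(z)
            denom = 1.0 + e.sum(axis=1, keepdims=True)
            sand_silt = e / denom
            clay = 1.0 / denom.ravel()
            return np.column_stack([sand_silt, clay])

        profiles = np.array([[0.40, 0.35, 0.25],
                             [0.10, 0.30, 0.60]])   # invented observed fractions

        z = alr(profiles)          # these two coordinates are what gets interpolated spatially
        back = alr_inverse(z)      # predictions map back onto the 0-1 simplex
        print(np.round(back, 2))   # recovers the original compositions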

  15. EJSCREEN: Environmental Justice Screening and Mapping Tool

    EPA Pesticide Factsheets

    EJSCREEN is an environmental justice screening and mapping tool that provides EPA and the public with a nationally consistent approach to characterizing areas that may warrant further consideration, analysis, or outreach.

  16. Quantitative targeting maps based on experimental investigations for a branched tube model in magnetic drug targeting

    NASA Astrophysics Data System (ADS)

    Gitter, K.; Odenbach, S.

    2011-12-01

    Magnetic drug targeting (MDT), because of its high targeting efficiency, is a promising approach for tumour treatment. Unwanted side effects are considerably reduced, since the nanoparticles are concentrated within the target region due to the influence of a magnetic field. Nevertheless, understanding the transport phenomena of nanoparticles in an artery system is still challenging. This work presents experimental results for a branched tube model. Quantitative results describe, for example, the net amount of nanoparticles that are targeted towards the chosen region due to the influence of a magnetic field. As a result of measurements, novel drug targeting maps, combining, e.g. the magnetic volume force, the position of the magnet and the net amount of targeted nanoparticles, are presented. The targeting maps are valuable for evaluation and comparison of setups and are also helpful for the design and the optimisation of a magnet system with an appropriate strength and distribution of the field gradient. The maps indicate the danger of accretion within the tube and also show the promising result that up to 97% of the nanoparticles were successfully targeted.

  17. Prioritizing Seafloor Mapping for Washington’s Pacific Coast

    PubMed Central

    Battista, Timothy; Buja, Ken; Christensen, John; Hennessey, Jennifer; Lassiter, Katrina

    2017-01-01

    Remote sensing systems are critical tools used for characterizing the geological and ecological composition of the seafloor. However, creating comprehensive and detailed maps of ocean and coastal environments has been hindered by the high cost of operating ship- and aircraft-based sensors. While a number of groups (e.g., academic research, government resource management, and private sector) are engaged in or would benefit from the collection of additional seafloor mapping data, disparate priorities, dauntingly large data gaps, and insufficient funding have confounded strategic planning efforts. In this study, we addressed these challenges by implementing a quantitative, spatial process to facilitate prioritizing seafloor mapping needs in Washington State. The Washington State Prioritization Tool (WASP), a custom web-based mapping tool, was developed to solicit and analyze mapping priorities from each participating group. The process resulted in the identification of several discrete, high priority mapping hotspots. As a result, several of the areas have been or will be subsequently mapped. Furthermore, information captured during the process about the intended application of the mapping data was paramount for identifying the optimum remote sensing sensors and acquisition parameters to use during subsequent mapping surveys. PMID:28350338

  18. [The experiment of participatory mapping in order to construct a cartographical alternative to the FHS].

    PubMed

    Goldstein, Roberta Argento; Barcellos, Christovam; Magalhães, Monica de Avelar Figueiredo Mafra; Gracie, Renata; Viacava, Francisco

    2013-01-01

    Maps and mapping procedures are useful tools for systematic interpretation and evaluation and for reporting of results to management. Applied to the Family Health Strategy (FHS), these maps permit the demarcation of the territory and the establishment of links between the territory, its population and health services. In this paper the use of maps by the FHS in 17 municipalities in northern and northeastern Brazil is studied and the process of demarcation and digitization of areas with the participation of teams is described. The survey conducted using questionnaires and discussion workshops showed that difficulties still prevail in reconciling the map (drawing) produced at the local level with maps produced by other government sectors. In general, the maps used at local level employ their own references, which prevent the interplay of information with other cartographic documents and their full use as a tool for evaluation and management. The combination of participatory mapping tools, associated with Geographic Information Systems (GIS) applications proposed in this paper, represents an alternative to mapping the territory of operations of FHS teams, as well as a reflection on the concept of territory and operation by the FHS.

  19. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections. These include azimuthal, cylindrical, Mercator, Lambert, and sinusoidal projections. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view and export maps to other platforms.
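
    For reference, the sketch below gives the standard spherical equations for two of the projections listed, sinusoidal and Mercator; it is a textbook illustration, not MAPPER's implementation.

        import numpy as np

        R = 6371.0   # km, mean Earth radius

        def sinusoidal(lon_deg, lat_deg, lon0_deg=0.0):
            lon, lat, lon0 = np.radians([lon_deg, lat_deg, lon0_deg])
            return R * (lon - lon0) * np.cos(lat), R * lat

        def mercator(lon_deg, lat_deg, lon0_deg=0.0):
            lon, lat, lon0 = np.radians([lon_deg, lat_deg, lon0_deg])
            return R * (lon - lon0), R * np.log(np.tan(np.pi / 4 + lat / 2))

        print(sinusoidal(10.0, 45.0))   # (x, y) in km
        print(mercator(10.0, 45.0))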

  20. Social Network Mapping: A New Tool For The Leadership Toolbox

    DTIC Science & Technology

    2002-04-01

    SOCIAL NETWORK MAPPING: A NEW TOOL FOR THE LEADERSHIP TOOLBOX, by Elisabeth J. Strines, Colonel, USAF, April 2002. The report describes the concept of social network mapping and demonstrates how it can be used by squadron commanders and leaders at all levels to provide subtle…

  1. Identifying and Tracing User Needs

    NASA Astrophysics Data System (ADS)

    To, C.; Tauer, E.

    2017-12-01

    Providing adequate tools to the user community hinges on reaching the specific goals and needs behind the intended application of the tool. While the approach of leveraging user-supplied inputs and use cases to identify those goals is not new, there frequently remains the challenge of tracing those use cases through to implementation in an efficient and manageable fashion. Processes can become overcomplicated very quickly, and additionally, explicitly mapping progress towards the achievement of the user demands can become overwhelming when hundreds of use-cases are at play. This presentation will discuss a demonstrated use-case approach that has achieved an initial success with a tool re-design and deployment, the means to apply use cases in the generation of a roadmap for future releases over time, and the ability to include and adjust to new user requirements and suggestions with minimal disruption to the traceability. It is hoped that the findings and lessons learned will help make use case employment easier for others seeking to create user-targeted capabilities.

  2. A visualization tool to support decision making in environmental and biological planning

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Conzelmann, Craig; Suir, Kevin J.

    2014-01-01

    Large-scale ecosystem management involves consideration of many factors for informed decision making. The EverVIEW Data Viewer is a cross-platform desktop decision support tool to help decision makers compare simulation model outputs from competing plans for restoring Florida's Greater Everglades. The integration of NetCDF metadata conventions into EverVIEW allows end-users from multiple institutions within and beyond the Everglades restoration community to share information and tools. Our development process incorporates continuous interaction with targeted end-users for increased likelihood of adoption. One of EverVIEW's signature features is side-by-side map panels, which can be used to simultaneously compare species or habitat impacts from alternative restoration plans. Other features include examination of potential restoration plan impacts across multiple geographic or tabular displays, and animation through time. As a result of an iterative, standards-driven approach, EverVIEW is relevant to large-scale planning beyond Florida, and is used in multiple biological planning efforts in the United States.

  3. Assessment of a Bayesian Belief Network-GIS framework as a practical tool to support marine planning.

    PubMed

    Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I

    2010-10-01

    For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding consequences of potential planning targets, and necessary management measures with spatially-explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool allowing for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, the engagement of different stakeholder views, and enables a quick update of new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management. Copyright © 2010 Elsevier Ltd. All rights reserved.
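
    As a toy illustration of the conditional reasoning such a BN-GIS node encodes, the sketch below marginalises a small, invented conditional probability table for landscape vulnerability over beliefs about cumulative pressure and landscape sensitivity in one grid cell; none of the numbers come from the study.

        import numpy as np

        # P(vulnerability = high | pressure, sensitivity); rows: pressure (low, high),
        # columns: sensitivity (low, high) -- an invented conditional probability table
        cpt_high_vuln = np.array([[0.05, 0.30],
                                  [0.40, 0.90]])

        # Beliefs for one grid cell, e.g. derived from mapped footprints and intensities
        p_pressure = np.array([0.3, 0.7])   # P(pressure = low), P(pressure = high)
        p_sens = np.array([0.4, 0.6])       # P(sensitivity = low), P(sensitivity = high)

        # Marginalise over both parents to get the cell's probability of high vulnerability
        p_high_vuln = p_pressure @ cpt_high_vuln @ p_sens
        print(round(float(p_high_vuln), 2))  # 0.55 with these invented numbers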

  4. CaveCAD: a tool for architectural design in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Schulze, Jürgen P.; Hughes, Cathleen E.; Zhang, Lelin; Edelstein, Eve; Macagno, Eduardo

    2014-02-01

    Existing 3D modeling tools were designed to run on desktop computers with monitor, keyboard and mouse. To make 3D modeling possible with mouse and keyboard, many 3D interactions, such as point placement or translations of geometry, had to be mapped to the 2D parameter space of the mouse, possibly supported by mouse buttons or keyboard keys. We hypothesize that, had the designers of these existing systems been able to assume immersive virtual reality systems as their target platforms, they would have been able to design 3D interactions much more intuitively. In collaboration with professional architects, we created a simple but complete 3D modeling tool for virtual environments from the ground up and use direct 3D interaction wherever possible and adequate. In this publication, we present our approaches for interactions for typical 3D modeling functions, such as geometry creation, modification of existing geometry, and assignment of surface materials. We also discuss preliminary user experiences with this system.

  5. GEsture: an online hand-drawing tool for gene expression pattern search.

    PubMed

    Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning

    2018-01-01

    Gene expression profiling data provide useful information for the investigation of biological function and process. However, identifying a specific expression pattern from extensive time series gene expression data is not an easy task. Clustering, a popular method, is often used to classify similarly expressed genes; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw or graph a curve using a mouse instead of inputting abstract parameters of clustering methods. GEsture explores genes showing similar, opposite and time-delayed expression patterns in time series datasets, taking a gene expression curve as input. We presented three examples that illustrate the capacity of GEsture in gene hunting while following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps and correlation networks) to display the search results. The result outputs may provide useful information for researchers to understand the targets, functions and biological processes of the involved genes.
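
    The matching idea behind a drawn-curve query can be sketched as a lagged correlation search (a simplification, not GEsture's actual algorithm): each gene's time series is scored against the drawn curve at several lags, so similar, opposite and time-delayed patterns all surface.

        import numpy as np

        def best_match(drawn, series, max_lag=3):
            """Return (lag, correlation) with the largest absolute correlation."""
            best = (0, 0.0)
            for lag in range(-max_lag, max_lag + 1):
                if lag >= 0:
                    a, b = drawn[lag:], series[:len(series) - lag]
                else:
                    a, b = drawn[:lag], series[-lag:]
                if len(a) > 2:
                    r = np.corrcoef(a, b)[0, 1]
                    if abs(r) > abs(best[1]):
                        best = (lag, r)
            return best

        drawn = np.array([0, 1, 3, 6, 4, 2, 1], dtype=float)   # user-drawn curve
        gene = np.array([1, 0, 1, 3, 6, 4, 2], dtype=float)    # gene delayed by one time point
        print(best_match(drawn, gene))   # strong positive correlation at lag -1 (time-delayed match)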

  6. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    PubMed

    Hanson, Marta

    2017-09-01

    Argument This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.

  7. The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology

    NASA Astrophysics Data System (ADS)

    Messerotti, Mauro

    2010-05-01

    The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application for the high-level development of a set of multi-level concept maps in the framework of Space Meteorology to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form via e.g. OWL (Web Ontology Language) is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. Therefore they are suitable to be published on the web: the coded knowledge can be exploited for educational purposes by the students and the public, as the level of information can be naturally organized among linked concept maps in progressively increasing complexity levels. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.

  8. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics.

    PubMed

    Danion, Frederic; Mathew, James; Flanagan, J Randall

    2017-01-01

    Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.
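
    The nonrigid mapping described above amounts to a mass-spring-damper cursor driven by the hand. The sketch below simulates such a target with assumed parameter values (not those of the experiment), using a semi-implicit Euler step.

        import numpy as np

        dt, T = 0.001, 5.0                 # s, integration step and duration
        m, k, c = 0.5, 40.0, 2.0           # kg, N/m, N*s/m (assumed target dynamics)

        t = np.arange(0.0, T, dt)
        hand = 0.05 * np.sin(2 * np.pi * 1.0 * t)   # 1 Hz hand oscillation, 5 cm amplitude

        x = np.zeros_like(t)               # target position
        v = np.zeros_like(t)               # target velocity
        for i in range(len(t) - 1):
            force = -k * (x[i] - hand[i]) - c * v[i]   # spring + damping toward the hand
            v[i + 1] = v[i] + (force / m) * dt
            x[i + 1] = x[i] + v[i + 1] * dt            # semi-implicit Euler step

        print("peak target excursion (m):", round(float(np.abs(x).max()), 3))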

  10. The impact of CmapTools utilization towards students' conceptual change on optics topic

    NASA Astrophysics Data System (ADS)

    Rofiuddin, Muhammad Rifqi; Feranie, Selly

    2017-05-01

    Science teachers need to help students identify their prior ideas and modify them based on scientific knowledge; this process is called conceptual change. One of the essential tools for analyzing students' conceptual change is the concept map. Concept maps are graphical representations of knowledge composed of concepts and the relationships between them. Concept map construction was implemented here by adopting technology to support the learning process, in line with Educational Ministry Regulation No. 68 of 2013. The Institute for Human and Machine Cognition (IHMC) has developed CmapTools, client-server software for easily constructing and visualizing concept maps. This research aims to investigate secondary students' conceptual change after experiencing a five-stage conceptual teaching model that utilizes CmapTools in learning optics. A weak experimental method with a one-group pretest-posttest design was implemented to collect preliminary and post-instruction concept maps as qualitative data. A purposive sample of 8th grade students (n = 22) was drawn from a private school in Bandung, West Java. Conceptual change was assessed by comparing the preliminary and post-instruction concept maps against a rubric for concept map scoring and structure. Results show a significant conceptual change of 50.92%, elaborated by concept map element: propositions and hierarchical levels in the high category, cross-links in the medium category, and specific examples in the low category. These results are supported by the students' positive response to CmapTools, indicating improved motivation, interest and behavior towards the physics lessons.
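
    Concept map scores of this kind are typically computed with a Novak-and-Gowin-style weighting over propositions, hierarchy levels, cross-links and examples. The sketch below uses the classic weights as an assumption (the study's exact rubric may differ) and invented counts.

        def concept_map_score(propositions, hierarchy_levels, cross_links, examples):
            return (1 * propositions         # each valid proposition
                    + 5 * hierarchy_levels   # each valid level of hierarchy
                    + 10 * cross_links       # each valid, significant cross-link
                    + 1 * examples)          # each specific example

        pre = concept_map_score(propositions=8, hierarchy_levels=2, cross_links=0, examples=1)
        post = concept_map_score(propositions=14, hierarchy_levels=3, cross_links=2, examples=3)
        print(pre, post, f"gain = {100 * (post - pre) / pre:.1f}%")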

  11. Shrinking risk profiles after deworming of children in Port Elizabeth, South Africa, with special reference to Ascaris lumbricoides and Trichuris trichiura.

    PubMed

    Müller, Ivan; Gall, Stefanie; Beyleveld, Lindsey; Gerber, Markus; Pühse, Uwe; Du Randt, Rosa; Steinmann, Peter; Zondie, Leyli; Walter, Cheryl; Utzinger, Jürg

    2017-11-27

    Risk maps facilitate discussion among different stakeholders and provide a tool for spatial targeting of health interventions. We present maps documenting shrinking risk profiles after deworming with respect to soil-transmitted helminthiasis among schoolchildren from disadvantaged neighbourhoods in Port Elizabeth, South Africa. Children were examined for soil-transmitted helminth infections using duplicate Kato-Katz thick smears in March 2015, October 2015 and May 2016, and subsequently treated with albendazole after each survey. The mean infection intensities for Ascaris lumbricoides were 9,554 eggs per gram of stool (EPG) in March 2015, 4,317 EPG in October 2015 and 1,684 EPG in March 2016. The corresponding figures for Trichuris trichiura were 664 EPG, 331 EPG and 87 EPG. Repeated deworming shrank the risk of soil-transmitted helminthiasis, but should be complemented by other public health measures.
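
    As simple arithmetic on the figures reported above (not a reanalysis of the study data), the relative reductions in mean infection intensity between the first and last surveys come to roughly 82% for A. lumbricoides and 87% for T. trichiura:

        # Mean EPG values quoted in the abstract, first versus last survey
        baseline_al, final_al = 9554, 1684    # A. lumbricoides
        baseline_tt, final_tt = 664, 87       # T. trichiura

        for name, b, f in [("A. lumbricoides", baseline_al, final_al),
                           ("T. trichiura", baseline_tt, final_tt)]:
            print(f"{name}: {100 * (b - f) / b:.0f}% reduction in mean intensity")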

  12. High-Redshift Astrophysics Using Every Photon

    NASA Astrophysics Data System (ADS)

    Breysse, Patrick; Kovetz, Ely; Rahman, Mubdi; Kamionkowski, Marc

    2017-01-01

    Large galaxy surveys have dramatically improved our understanding of the complex processes which govern gas dynamics and star formation in the nearby universe. However, we know far less about the most distant galaxies, as existing high-redshift observations can only detect the very brightest sources. Intensity mapping surveys provide a promising tool to access this poorly-studied population. By observing emission lines with low angular resolution, these surveys can make use of every photon in a target line to study faint emitters which are inaccessible using traditional techniques. With upcoming carbon monoxide experiments in mind, I will demonstrate how an intensity map can be used to measure the luminosity function of a galaxy population, and in turn how these measurements will allow us to place robust constraints on the cosmic star formation history. I will then show how cross-correlating CO isotopologue lines will make it possible to study gas dynamics within the earliest galaxies in unprecedented detail.

  13. Three-Dimensional Optical Mapping of Nanoparticle Distribution in Intact Tissues.

    PubMed

    Sindhwani, Shrey; Syed, Abdullah Muhammad; Wilhelm, Stefan; Glancy, Dylan R; Chen, Yih Yang; Dobosz, Michael; Chan, Warren C W

    2016-05-24

    The role of tissue architecture in mediating nanoparticle transport, targeting, and biological effects is unknown due to the lack of tools for imaging nanomaterials in whole organs. Here, we developed a rapid optical mapping technique to image nanomaterials in intact organs ex vivo and in three dimensions (3D). We engineered a high-throughput electrophoretic flow device to simultaneously transform up to 48 tissues into optically transparent structures, allowing subcellular imaging of nanomaterials more than 1 mm deep into tissues, a 25-fold improvement over current techniques. A key finding is that nanomaterials can be retained in the processed tissue by chemical cross-linking of surface-adsorbed serum proteins to the tissue matrix, which enables nanomaterials to be imaged with respect to cells, blood vessels, and other structures. We developed a computational algorithm to analyze and quantitatively map nanomaterial distribution. This method can be universally applied to visualize the distribution and interactions of materials in whole tissues and animals, including applications such as the imaging of nanomaterials, tissue-engineered constructs, and biosensors within their intact biological environment.

  14. Mapping the Human Toxome by Systems Toxicology

    PubMed Central

    Bouhifd, Mounir; Hogberg, Helena T.; Kleensang, Andre; Maertens, Alexandra; Zhao, Liang; Hartung, Thomas

    2014-01-01

    Toxicity testing typically involves studying adverse health outcomes in animals subjected to high doses of toxicants with subsequent extrapolation to expected human responses at lower doses. The low throughput of current toxicity testing approaches (which are largely the same for industrial chemicals, pesticides and drugs) has led to a backlog of more than 80,000 chemicals to which human beings are potentially exposed and whose toxicity remains largely unknown. By employing new testing strategies that use predictive, high-throughput cell-based assays (of human origin) to evaluate perturbations in key pathways, referred to as pathways of toxicity, and to conduct targeted testing against those pathways, we can begin to greatly accelerate our ability to test the vast “storehouses” of chemical compounds using a rational, risk-based approach to chemical prioritization, and to provide test results that are more predictive of human toxicity than current methods. The NIH Transformative Research Grant project Mapping the Human Toxome by Systems Toxicology aims at developing the tools for pathway mapping, annotation and validation as well as the respective knowledge base to share this information. PMID:24443875

  15. Genetic Modifiers and Oligogenic Inheritance

    PubMed Central

    Kousi, Maria; Katsanis, Nicholas

    2015-01-01

    Despite remarkable progress in the identification of mutations that drive genetic disorders, progress in understanding the effect of genetic background on the penetrance and expressivity of causal alleles has been modest, in part because of the methodological challenges in identifying genetic modifiers. Nonetheless, the progressive discovery of modifier alleles has improved both our interpretative ability and our analytical tools to dissect such phenomena. In this review, we analyze the genetic properties and behaviors of modifiers as derived from studies in patient populations and model organisms and we highlight conceptual and technological tools used to overcome some of the challenges inherent in modifier mapping and cloning. Finally, we discuss how the identification of these modifiers has facilitated the elucidation of biological pathways and holds the potential to improve the clinical predictive value of primary causal mutations and to develop novel drug targets. PMID:26033081

  16. Methylation oligonucleotide microarray: a novel tool to analyze methylation patterns

    NASA Astrophysics Data System (ADS)

    Hou, Peng; Ji, Meiju; He, Nongyao; Lu, Zuhong

    2003-04-01

    A new technique to analyze methylation patterns across several adjacent CpG sites was developed and is reported here. We selected as the target a 336 bp segment spanning the 5'-untranslated region and the first exon of the p16Ink4a gene, which includes the most densely packed CpG fragment of the island and contains 32 CpG dinucleotides. Probes covering all possible methylation patterns were designed to fabricate a DNA microarray that determines the methylation patterns of seven adjacent CpG dinucleotide sites. High accuracy and reproducibility were observed in several parallel experiments. The results led us to conclude that the methylation oligonucleotide microarray can be applied as a novel and powerful tool to map methylation patterns and changes in multiple CpG island loci in a variety of tumors.

  17. Mapping Application Partnership Tool for Anacostia Watershed (Washington, DC/Maryland)

    EPA Pesticide Factsheets

    Mapping Application Partnership Tool (MAPT) of the Urban Waters Federal Partnership (UWFP) reconnects urban communities with their waterways by improving coordination among federal agencies and collaborating with community-led efforts.

  18. Curriculum Mapping: A Method to Assess and Refine Undergraduate Degree Programs

    ERIC Educational Resources Information Center

    Joyner-Melito, Helen S.

    2016-01-01

    Over the past several decades, there has been increasing interest in program- and university-level assessment and aligning learning outcomes to program content. Curriculum mapping is a tool that creates a visual map of all courses in the curriculum and how they relate to curriculum learning outcomes. Assessment tools/activities are often included…

  19. Assessing ecological departure from reference conditions with the Fire Regime Condition Class (FRCC) Mapping Tool

    Treesearch

    Stephen W. Barrett; Thomas DeMeo; Jeffrey L. Jones; J.D. Zeiler; Lee C. Hutter

    2006-01-01

    Knowledge of ecological departure from a range of reference conditions provides a critical context for managing sustainable ecosystems. Fire Regime Condition Class (FRCC) is a qualitative measure characterizing possible departure from historical fire regimes. The FRCC Mapping Tool was developed as an ArcMap extension utilizing the protocol identified by the Interagency...

  20. Concept Maps: An Alternative Methodology to Assess Young Children

    ERIC Educational Resources Information Center

    Atiles, Julia T.; Dominique-Maikell, Nikole; McKean, Kathleen

    2014-01-01

    The authors investigated the utility and efficacy of using concept maps as a research tool to assess young children. Pre- and post-concept maps have been used as an assessment and evaluation tool with teachers and with older students, typically children who can read and write; this article summarizes an investigation into the utility of using…

  1. Separate visual representations for perception and for visually guided behavior

    NASA Technical Reports Server (NTRS)

    Bridgeman, Bruce

    1989-01-01

    Converging evidence from several sources indicates that two distinct representations of visual space mediate perception and visually guided behavior, respectively. The two maps of visual space follow different rules; spatial values in either one can be biased without affecting the other. Ordinarily the two maps give equivalent responses because both are veridically in register with the world; special techniques are required to pull them apart. One such technique is saccadic suppression: small target displacements during saccadic eye movements are not perceived, though the displacements can change eye movements or pointing to the target. A second way to separate cognitive and motor-oriented maps is with induced motion: a slowly moving frame will make a fixed target appear to drift in the opposite direction, while motor behavior toward the target is unchanged. The same result occurs with stroboscopic induced motion, where the frame jumps abruptly and the target seems to jump in the opposite direction. A third method of separating cognitive and motor maps, requiring no motion of target, background or eye, is the Roelofs effect: a target surrounded by an off-center rectangular frame will appear to be off-center in the direction opposite the frame. Again the effect influences perception, but in half of the subjects it does not influence pointing to the target. This experiment also reveals more characteristics of the maps and their interactions with one another: the motor map apparently has little or no memory, and must be fed from the biased cognitive map if an enforced delay occurs between stimulus presentation and motor response. In designing spatial displays, the results mean that what you see isn't necessarily what you get. Displays must be designed with either perception or visually guided behavior in mind.

  2. DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.

    PubMed

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/

  3. DistMap: A Toolkit for Distributed Short Read Mapping on a Hadoop Cluster

    PubMed Central

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/ PMID:24009693

  4. STK Integrated Message Production List Editor (SIMPLE) for CEO Operations

    NASA Technical Reports Server (NTRS)

    Trenchard, Mike; Heydorn, James

    2014-01-01

    Late in fiscal year 2011, the Crew Earth Observations (CEO) team was tasked to upgrade and replace its mission planning and mission operations software systems, which were developed in the Space Shuttle era of the 1980s and 1990s. The impetuses for this change were the planned transition of all workstations to the Windows 7 64-bit operating system and the desire for more efficient and effective use of Satellite Tool Kit (STK) software required for reliable International Space Station (ISS) Earth location tracking. An additional requirement of this new system was the use of the same SQL database of CEO science sites from the SMMS, which was also being developed. STK Integrated Message Production List Editor (SIMPLE) is the essential, all-in-one tool now used by CEO staff to perform daily ISS mission planning to meet its requirement to acquire astronaut photography of specific sites on Earth. The sites are part of a managed, long-term database that has been defined and developed for scientific, educational, and public interest. SIMPLE's end product is a set of basic time and location data computed for an operator-selected set of targets that the ISS crew will be asked to photograph (photography is typically planned 12 to 36 hours out). The CEO operator uses SIMPLE to (a) specify a payload operations planning period; (b) acquire and validate the best available ephemeris data (vectors) for the ISS during the planning period; (c) ingest and display mission-specific site information from the CEO database; (d) identify and display potential current dynamic event targets as map features; (e) compute and display time and location information for each target; (f) screen and select targets based on known crew availability constraints, obliquity constraints, and real-time evaluated constraints to target visibility due to illumination (sun elevation) and atmospheric conditions (weather); and finally (g) incorporate basic, computed time and location information for each selected target into the daily CEO Target List product (message) for submission to ISS payload planning and integration teams for their review and approval prior to uplink. SIMPLE requires and uses the following resources: an ISS mission planning period (Greenwich Mean Time start date/time and end date/time), the best available ISS mission ephemeris data (vectors) for that planning period, the STK software package configured for the ISS, and an ISS mission-specific subset of the CEO sites database. The primary advantages realized by the development and implementation of SIMPLE into the CEO payload operations support activity are a smooth transition to the Windows 7 operating system upon scheduled workstation refresh; streamlining of the input and verification of the current ISS ephemeris (vector data); seamless incorporation of selected contents of the SQL database of science sites; the ability to tag and display potential dynamic event opportunities on orbit track maps; simplification of the display and selection of encountered sites based on crew availability, illumination, obliquity, and weather constraints; the incorporation of high-quality mapping of the Earth with various satellite-based datasets for use in describing targets; and the ability to encapsulate and export the essential selected target elements in XML format for use by onboard Earth-location systems, such as Worldmap.
SIMPLE is a carefully designed and crafted in-house software package that includes detailed help files for the user and meticulous internal documentation for future modifications. It was delivered in February 2012 for test and evaluation. Following acceptance, it was implemented for CEO mission operations support in May 2012.
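
    The constraint-screening step (f) above is easy to picture in code. The following minimal Python sketch filters a candidate target list by crew availability, weather, sun elevation, and obliquity thresholds; the class name, field names, and threshold values are illustrative assumptions, not SIMPLE's actual data model or limits.

      # Hypothetical sketch of SIMPLE-style target screening (step f): names,
      # thresholds, and data layout are illustrative assumptions, not CEO code.
      from dataclasses import dataclass

      @dataclass
      class Candidate:
          site_id: str
          sun_elevation_deg: float   # solar elevation at the site at encounter time
          look_angle_deg: float      # off-nadir angle from the ISS to the site
          crew_available: bool
          weather_ok: bool

      def screen_targets(candidates, min_sun_elev=25.0, max_look_angle=45.0):
          """Keep only sites satisfying illumination, obliquity, crew and weather constraints."""
          return [c for c in candidates
                  if c.crew_available
                  and c.weather_ok
                  and c.sun_elevation_deg >= min_sun_elev
                  and c.look_angle_deg <= max_look_angle]

      if __name__ == "__main__":
          pool = [
              Candidate("CEO-0001", 32.1, 18.4, True, True),
              Candidate("CEO-0002", 12.7, 10.0, True, True),   # sun too low
              Candidate("CEO-0003", 41.5, 52.3, True, False),  # weather rules it out
          ]
          for c in screen_targets(pool):
              print(c.site_id)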

  5. Structured feedback on students' concept maps: the proverbial path to learning?

    PubMed

    Joseph, Conran; Conradsson, David; Nilsson Wikmar, Lena; Rowe, Michael

    2017-05-25

    Good conceptual knowledge is an essential requirement for health professions students, in that they are required to apply concepts learned in the classroom to a variety of different contexts. However, the use of traditional methods of assessment limits the educator's ability to correct students' conceptual knowledge prior to altering the educational context. Concept mapping (CM) is an educational tool for evaluating conceptual knowledge, but little is known about its use in facilitating the development of richer knowledge frameworks. In addition, structured feedback has the potential to develop good conceptual knowledge. The purpose of this study was to use Kinchin's criteria to assess the impact of structured feedback on the graphical complexity of CMs by observing the development of richer knowledge frameworks. Fifty-eight physiotherapy students created CMs targeting the integration of two knowledge domains within a case-based teaching paradigm. Each student received one round of structured feedback that addressed correction, reinforcement, forensic diagnosis, benchmarking, and longitudinal development on their CMs prior to the final submission. The concept maps were categorized according to Kinchin's criteria as either Spoke, Chain or Net representations, and then evaluated against defined traits of meaningful learning. The inter-rater reliability of categorizing CMs was good. Pre-feedback CMs were predominantly Chain structures (57%), with Net structures appearing least often. There was a significant reduction of the basic Spoke-structured CMs (P = 0.002) and a significant increase of Net-structured maps (P < 0.001) at the final evaluation (post-feedback). Changes in structural complexity of CMs appeared to be indicative of broader knowledge frameworks as assessed against the meaningful learning traits. Feedback on CMs seemed to have contributed towards improving conceptual knowledge and correcting naive conceptions of related knowledge. Educators in medical education could therefore consider using CMs to target individual student development.
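
    As a rough illustration of how Spoke, Chain, and Net structures can be told apart from a map's topology, the Python sketch below classifies a concept map given as an edge list. It is a simplified proxy for Kinchin's criteria (cross-links imply Net, a single hub implies Spoke, otherwise Chain), assumes the networkx library, and is not the rubric used in the study.

      # Illustrative sketch (not the study's instrument): classify a concept map's
      # gross structure as Spoke, Chain, or Net from its undirected topology.
      import networkx as nx

      def classify_concept_map(edges):
          """Very rough proxy for Kinchin's Spoke/Chain/Net categories (simplified)."""
          g = nx.Graph(edges)
          n, m = g.number_of_nodes(), g.number_of_edges()
          if m > n - 1:                          # extra edges imply cycles, i.e. cross-links
              return "Net"
          hub_degree = max(dict(g.degree()).values())
          if n > 2 and hub_degree == n - 1:      # one concept linked to all others
              return "Spoke"
          return "Chain"                         # remaining tree-like maps treated as chains

      print(classify_concept_map([("gait", "balance"), ("gait", "strength"), ("gait", "posture")]))  # Spoke
      print(classify_concept_map([("a", "b"), ("b", "c"), ("c", "d")]))                              # Chain
      print(classify_concept_map([("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]))                  # Net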

  6. Self-optimizing Monte Carlo method for nuclear well logging simulation

    NASA Astrophysics Data System (ADS)

    Liu, Lianyan

    1997-09-01

    In order to increase the efficiency of Monte Carlo simulation for nuclear well logging problems, a new method has been developed for variance reduction. With this method, an importance map is generated in the regular Monte Carlo calculation as a by-product, and the importance map is later used to conduct the splitting and Russian roulette for particle population control. By adopting a spatial mesh system, which is independent of the physical geometrical configuration, the method allows superior user-friendliness. This new method is incorporated into the general purpose Monte Carlo code MCNP4A through a patch file. Two nuclear well logging problems, a neutron porosity tool and a gamma-ray lithology density tool, are used to test the performance of this new method. The calculations are sped up over analog simulation by factors of 120 and 2,600 for the neutron porosity tool and the gamma-ray lithology density log, respectively. The new method outperforms MCNP's cell-based weight window by a factor of 4-6, as measured by the converged figures of merit. An indirect comparison indicates that the new method also outperforms the AVATAR process for gamma-ray density tool problems. Even though it takes quite some time to generate a reasonable importance map from an analog run, a good initial map can create significant CPU time savings. This makes the method especially suitable for nuclear well logging problems, since one or several reference importance maps are usually available for a given tool. The study shows that the spatial mesh sizes should be chosen according to the mean free path. The overhead of the importance map generator is 6% and 14% for the neutron and gamma-ray cases, respectively. The learning ability towards a correct importance map is also demonstrated. Although false learning may occur, physical judgement aided by contributon maps can help diagnose it. Calibration and analysis are performed for the neutron tool and the gamma-ray tool. Because a very good initial importance map is always available after the first point has been calculated, high computing efficiency is maintained. The availability of contributon maps provides an easy way of understanding the logging measurement and analyzing the depth of investigation.
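
    The population-control step driven by an importance map can be sketched in a few lines. The Python example below applies splitting to particles that are heavy relative to the ideal weight in their cell and Russian roulette to light ones, conserving weight on average; it is an illustrative toy, not the MCNP4A patch described in the abstract, and the cell indexing and thresholds are assumptions.

      # Minimal sketch of importance-driven population control (splitting /
      # Russian roulette). Illustrative only -- not the MCNP4A patch described above.
      import random

      def population_control(particles, importance, target_weight=1.0):
          """particles: list of dicts with 'cell' and 'weight';
          importance: map from cell index to relative importance."""
          survivors = []
          for p in particles:
              # Ideal weight in a cell is inversely proportional to its importance.
              w_ideal = target_weight / importance[p["cell"]]
              ratio = p["weight"] / w_ideal
              if ratio >= 2.0:                       # split heavy particles
                  n = int(ratio)
                  for _ in range(n):
                      survivors.append({"cell": p["cell"], "weight": p["weight"] / n})
              elif ratio < 0.5:                      # Russian roulette on light particles
                  if random.random() < ratio:
                      survivors.append({"cell": p["cell"], "weight": w_ideal})
                  # else: the particle is killed; weight is conserved on average
              else:
                  survivors.append(p)
          return survivors

      random.seed(0)
      pop = [{"cell": 0, "weight": 1.0}, {"cell": 1, "weight": 1.0}]
      print(population_control(pop, importance={0: 1.0, 1: 8.0}))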

  7. Catchment Models and Management Tools for diffuse Contaminants (Sediment, Phosphorus and Pesticides): DIFFUSE Project

    NASA Astrophysics Data System (ADS)

    Mockler, Eva; Reaney, Simeon; Mellander, Per-Erik; Wade, Andrew; Collins, Adrian; Arheimer, Berit; Bruen, Michael

    2017-04-01

    The agricultural sector is the most common suspected source of nutrient pollution in Irish rivers. However, it is also often the most difficult source to characterise due to its predominantly diffuse nature. Particulate phosphorus in surface water and dissolved phosphorus in groundwater are of particular concern in Irish water bodies. Hence the further development of models and indices to assess diffuse sources of contaminants is required for use by the Irish Environmental Protection Agency (EPA) to provide support for river basin planning. Understanding connectivity in the landscape is a vital component of characterising the source-pathway-receptor relationships for water-borne contaminants, and hence is a priority in this research. The DIFFUSE Project will focus on connectivity modelling and incorporation of connectivity into sediment, nutrient and pesticide risk mapping. The Irish approach to understanding and managing natural water bodies has developed substantially in recent years assisted by outputs from multiple research projects, including modelling and analysis tools developed during the Pathways and CatchmentTools projects. These include the Pollution Impact Potential (PIP) maps, which are an example of research output that is used by the EPA to support catchment management. The PIP maps integrate an understanding of the pollution pressures and mobilisation pathways and, using the source-pathways-receptor model, provide a scientific basis for evaluation of mitigation measures. These maps indicate the potential risk posed by nitrate and phosphate from diffuse agricultural sources to surface and groundwater receptors and delineate critical source areas (CSAs) as a means of facilitating the targeting of mitigation measures. Building on this previous research, the DIFFUSE Project will develop revised and new catchment management tools focused on connectivity, sediment, phosphorus and pesticides. The DIFFUSE Project will strive to identify the state-of-the-art methods and models that are most applicable to Irish conditions and management challenges. All styles of modelling considered useful for water resources management are relevant to this project and a balance of technical sophistication, data availability and operational practicalities is the ultimate goal. Achievement of this objective will be measured by comparing the performance of the new models developed in the project with models used in other countries. The models and tools developed in the course of the project will be evaluated by comparison with Irish catchment data and with other state-of-the-art models in a model-inter-comparison workshop which will be open to other models and the wider research community.

  8. Mapping the tumour human leukocyte antigen (HLA) ligandome by mass spectrometry.

    PubMed

    Freudenmann, Lena Katharina; Marcu, Ana; Stevanović, Stefan

    2018-07-01

    The entirety of human leukocyte antigen (HLA)-presented peptides is referred to as the HLA ligandome of a cell or tissue, in tumours often termed immunopeptidome. Mapping the tumour immunopeptidome by mass spectrometry (MS) comprehensively views the pathophysiologically relevant antigenic signature of human malignancies. MS is an unbiased approach stringently filtering the candidates to be tested as opposed to epitope prediction algorithms. In the setting of peptide-specific immunotherapies, MS-based strategies significantly diminish the risk of lacking clinical benefit, as they yield highly enriched amounts of truly presented peptides. Early immunopeptidomic efforts were severely limited by technical sensitivity and manual spectra interpretation. The technological progress with development of orbitrap mass analysers and enhanced chromatographic performance led to vast improvements in mass accuracy, sensitivity, resolution, and speed. Concomitantly, bioinformatic tools were developed to process MS data, integrate sequencing results, and deconvolute multi-allelic datasets. This enabled the immense advancement of tumour immunopeptidomics. Studying the HLA-presented peptide repertoire bears high potential for both answering basic scientific questions and translational application. Mapping the tumour HLA ligandome has started to significantly contribute to target identification for the design of peptide-specific cancer immunotherapies in clinical trials and compassionate need treatments. In contrast to prediction algorithms, rare HLA allotypes and HLA class II can be adequately addressed when choosing MS-guided target identification platforms. Herein, we review the identification of tumour HLA ligands focusing on sources, methods, bioinformatic data analysis, translational application, and provide an outlook on future developments. © 2018 John Wiley & Sons Ltd.

  9. Computer aided manufacturing for complex freeform optics

    NASA Astrophysics Data System (ADS)

    Wolfs, Franciscus; Fess, Ed; Johns, Dustin; LePage, Gabriel; Matthews, Greg

    2017-10-01

    Recently, the desire to use freeform optics has been increasing. Freeform optics can be used to expand the capabilities of optical systems and reduce the number of optics needed in an assembly. The traits that increase optical performance also present challenges in manufacturing. As tolerances on freeform optics become more stringent, it is necessary to continue to improve methods for how the grinding and polishing processes interact with metrology. To create these complex shapes, OptiPro has developed a computer-aided manufacturing package called PROSurf. PROSurf generates the tool paths required for grinding and polishing freeform optics with multiple axes of motion. It also uses metrology feedback for deterministic corrections. PROSurf handles two key aspects of the manufacturing process that most other CAM systems struggle with. The first is the ability to support several input types (equations, CAD models, point clouds) and still create a uniform high-density surface map usable for generating a smooth tool path. The second is to improve the accuracy of mapping a metrology file to the part surface. To do this, OptiPro uses 3D error maps instead of traditional 2D maps. The metrology error map drives the tool path adjustment applied during processing. For grinding, the error map adjusts the tool position to compensate for repeatable system error. For polishing, the error map drives the relative dwell times of the tool across the part surface. This paper will present the challenges associated with these issues and the solutions that we have created.
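
    To illustrate the idea that the error map drives relative dwell times in polishing, the Python sketch below turns a toy 2D error map into normalized dwell-time fractions (dwell longer where more material must be removed). It is not PROSurf code; a real system would also deconvolve the tool's removal function, and the array layout and units are assumptions.

      # Hedged sketch of error-map-driven dwell times: dwell longer where more
      # material must be removed. Illustrative only; not PROSurf's algorithm.
      import numpy as np

      def relative_dwell(error_map_nm, floor_nm=0.0):
          """error_map_nm: 2D array of surface error (material to remove), in nm."""
          removal = np.clip(error_map_nm - floor_nm, 0.0, None)
          total = removal.sum()
          return removal / total if total > 0 else np.zeros_like(removal)

      err = np.array([[50.0, 120.0], [10.0, 20.0]])   # toy 2x2 error map
      print(relative_dwell(err))                       # fractions of total dwell time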

  10. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    NASA Astrophysics Data System (ADS)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API -- an Internet-based tool combining DHTML and AJAX -- which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which is then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its early stages, high school and college teachers, as well as researchers, have expressed interest in using and extending these tools for visualizing and interacting with data on Earth and other planetary bodies.
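
    The coordinate conversion described above (pre-projecting latitude/longitude data so it can be overlaid on a Mercator-only map) reduces to the standard spherical Mercator formulas. The Python sketch below shows the conversion; it is a generic illustration, not the MARTIAN project's own utilities, and the clamp near the poles is an assumption.

      # Sketch of the conversion described above: plotting data given in a general
      # cylindrical (lat/lon) frame on a Mercator-only map by pre-projecting it.
      # Spherical Mercator formulas; not the MARTIAN project's own utilities.
      import math

      def lonlat_to_mercator(lon_deg, lat_deg):
          """Return Mercator x, y (spherical approximation, in radian units)."""
          lat_deg = max(min(lat_deg, 85.05), -85.05)   # clamp to avoid the poles
          x = math.radians(lon_deg)
          y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
          return x, y

      print(lonlat_to_mercator(137.4, -14.6))   # e.g. one fault-trace vertex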

  11. Joint explorative analysis of neuroreceptor subsystems in the human brain: application to receptor-transporter correlation using PET data.

    PubMed

    Cselényi, Zsolt; Lundberg, Johan; Halldin, Christer; Farde, Lars; Gulyás, Balázs

    2004-10-01

    Positron emission tomography (PET) has proved to be a highly successful technique in the qualitative and quantitative exploration of the human brain's neurotransmitter-receptor systems. In recent years, the number of PET radioligands, targeted to different neuroreceptor systems of the human brain, has increased considerably. This development paves the way for a simultaneous analysis of different receptor systems and subsystems in the same individual. The detailed exploration of the versatility of neuroreceptor systems requires novel technical approaches, capable of operating on huge parametric image datasets. An initial step of such explorative data processing and analysis should be the development of novel exploratory data-mining tools to gain insight into the "structure" of complex multi-individual, multi-receptor data sets. For practical reasons, a possible and feasible starting point of multi-receptor research can be the analysis of the pre- and post-synaptic binding sites of the same neurotransmitter. In the present study, we propose an unsupervised, unbiased data-mining tool for this task and demonstrate its usefulness by using quantitative receptor maps, obtained with positron emission tomography, from five healthy subjects on (pre-synaptic) serotonin transporters (5-HTT or SERT) and (post-synaptic) 5-HT(1A) receptors. Major components of the proposed technique include the projection of the input receptor maps to a feature space, the quasi-clustering and classification of projected data (neighbourhood formation), trans-individual analysis of neighbourhood properties (trajectory analysis), and the back-projection of the results of trajectory analysis to normal space (creation of multi-receptor maps). The resulting multi-receptor maps suggest that complex relationships and tendencies in the relationship between pre- and post-synaptic transporter-receptor systems can be revealed and classified by using this method. As an example, we demonstrate the regional correlation of the serotonin transporter-receptor systems. These parameter-specific multi-receptor maps can usefully guide the researchers in their endeavour to formulate models of multi-receptor interactions and changes in the human brain.

  12. AEGIS: a wildfire prevention and management information system

    NASA Astrophysics Data System (ADS)

    Kalabokidis, Kostas; Ager, Alan; Finney, Mark; Athanasis, Nikos; Palaiologou, Palaiologos; Vasilakos, Christos

    2016-03-01

    We describe a Web-GIS wildfire prevention and management platform (AEGIS) developed as an integrated and easy-to-use decision support tool to manage wildland fire hazards in Greece (http://aegis.aegean.gr). The AEGIS platform assists with early fire warning, fire planning, fire control and coordination of firefighting forces by providing online access to information that is essential for wildfire management. The system uses a number of spatial and non-spatial data sources to support key system functionalities. Land use/land cover maps were produced by combining field inventory data with high-resolution multispectral satellite images (RapidEye). These data support wildfire simulation tools that allow the users to examine potential fire behavior and hazard with the Minimum Travel Time fire spread algorithm. End-users provide a minimum number of inputs, such as fire duration, ignition point and weather information, to conduct a fire simulation. AEGIS offers three types of simulations, i.e., single-fire propagation, point-scale calculation of potential fire behavior, and burn probability analysis, similar to the FlamMap fire behavior modeling software. Artificial neural networks (ANNs) were utilized for wildfire ignition risk assessment based on various parameters, training methods, activation functions, pre-processing methods and network structures. The combination of ANNs and expected burned area maps is used to generate an integrated fire hazard prediction map. The system also incorporates weather information obtained from remote automatic weather stations and weather forecast maps. The system and associated computation algorithms leverage parallel processing techniques (i.e., High Performance Computing and Cloud Computing) that ensure the computational power required for real-time application. All AEGIS functionalities are accessible to authorized end-users through a web-based graphical user interface. An innovative smartphone application, AEGIS App, also provides mobile access to the web-based version of the system.
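
    As a hedged sketch of the ANN-based ignition risk idea, the Python example below trains a small multilayer perceptron on synthetic tabular predictors and outputs an ignition probability for one grid cell. The feature set, labels, and network size are invented for illustration and bear no relation to the actual AEGIS model or its training data.

      # Illustrative sketch of ANN ignition-risk scoring on tabular predictors.
      # Features, labels, and architecture are invented; not the AEGIS model.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      # toy predictors: [air temperature, relative humidity, wind speed, distance to roads]
      X = rng.uniform([10, 10, 0, 0], [40, 90, 15, 5000], size=(500, 4))
      # synthetic labels: hot, dry, windy cells ignite more often
      risk = (X[:, 0] > 30) & (X[:, 1] < 40) & (X[:, 2] > 5)
      y = (risk & (rng.random(500) < 0.8)) | (~risk & (rng.random(500) < 0.1))

      model = MLPClassifier(hidden_layer_sizes=(16, 8), activation="relu",
                            max_iter=2000, random_state=0)
      model.fit(X, y)
      # ignition probability for one hypothetical grid cell
      print(model.predict_proba([[35.0, 25.0, 8.0, 300.0]])[0, 1])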

  13. Advances in the Validation of Satellite-Based Maps of Volcanic Sulfur Dioxide Plumes

    NASA Astrophysics Data System (ADS)

    Realmuto, V. J.; Berk, A.; Acharya, P. K.; Kennett, R.

    2013-12-01

    The monitoring of volcanic gas emissions with gas cameras, spectrometer arrays, tethersondes, and UAVs presents new opportunities for the validation of satellite-based retrievals of gas concentrations. Gas cameras and spectrometer arrays provide instantaneous observations of the gas burden, or concentration along an optical path, over broad sections of a plume, similar to the observations acquired by nadir-viewing satellites. Tethersondes and UAVs provide us with direct measurements of the vertical profiles of gas concentrations within plumes. This presentation will focus on our current efforts to validate ASTER-based maps of sulfur dioxide plumes at Turrialba and Kilauea Volcanoes (located in Costa Rica and Hawaii, respectively). These volcanoes, which are the subjects of comprehensive monitoring programs, are challenging targets for thermal infrared (TIR) remote sensing due to the warm and humid atmospheric conditions. The high spatial resolution of ASTER in the TIR (90 meters) allows us to map the plumes back to their source vents, but also requires us to pay close attention to the temperature and emissivity of the surfaces beneath the plumes. Our knowledge of the surface and atmospheric conditions is never perfect, and we employ interactive mapping techniques that allow us to evaluate the impact of these uncertainties on our estimates of plume composition. To accomplish this interactive mapping, we have developed the Plume Tracker tool kit, which integrates retrieval procedures, visualization tools, and a customized version of the MODTRAN radiative transfer (RT) model under a single graphical user interface (GUI). We are in the process of porting the RT calculations to graphics processing units (GPUs) with the goal of achieving a 100-fold increase in the speed of computation relative to conventional CPU-based processing. We will report on our progress with this evolution of Plume Tracker. Portions of this research were conducted at the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  14. Higher versus lower blood pressure targets for vasopressor therapy in shock: a multicentre pilot randomized controlled trial.

    PubMed

    Lamontagne, François; Meade, Maureen O; Hébert, Paul C; Asfar, Pierre; Lauzier, François; Seely, Andrew J E; Day, Andrew G; Mehta, Sangeeta; Muscedere, John; Bagshaw, Sean M; Ferguson, Niall D; Cook, Deborah J; Kanji, Salmaan; Turgeon, Alexis F; Herridge, Margaret S; Subramanian, Sanjay; Lacroix, Jacques; Adhikari, Neill K J; Scales, Damon C; Fox-Robichaud, Alison; Skrobik, Yoanna; Whitlock, Richard P; Green, Robert S; Koo, Karen K Y; Tanguay, Teddie; Magder, Sheldon; Heyland, Daren K

    2016-04-01

    In shock, hypotension may contribute to inadequate oxygen delivery, organ failure and death. We conducted the Optimal Vasopressor Titration (OVATION) pilot trial to inform the design of a larger trial examining the effect of lower versus higher mean arterial pressure (MAP) targets for vasopressor therapy in shock. We randomly assigned critically ill patients who were presumed to suffer from vasodilatory shock regardless of admission diagnosis to a lower (60-65 mmHg) versus a higher (75-80 mmHg) MAP target. The primary objective was to measure the separation in MAP between groups. We also recorded days with protocol deviations, enrolment rate, cardiac arrhythmias and mortality for prespecified subgroups. A total of 118 patients were enrolled from 11 centres (2.3 patients/site/month of screening). The between-group separation in MAP was 9 mmHg (95% CI 7-11). In the lower and higher MAP groups, we observed deviations on 12 versus 8% of all days on vasopressors (p = 0.059). Risks of cardiac arrhythmias (20 versus 36%, p = 0.07) and hospital mortality (30 versus 33%, p = 0.84) were not different between lower and higher MAP arms. Among patients aged 75 years or older, a lower MAP target was associated with reduced hospital mortality (13 versus 60%, p = 0.03) but not in younger patients. This pilot study supports the feasibility of a large trial comparing lower versus higher MAP targets for shock. Further research may help delineate the reasons for vasopressor dosing in excess of prescribed targets and how individual patient characteristics modify the response to vasopressor therapy.

  15. Automated strip-mine and reclamation mapping from ERTS

    NASA Technical Reports Server (NTRS)

    Rogers, R. H. (Principal Investigator); Reed, L. E.; Pettyjohn, W. A.

    1974-01-01

    The author has identified the following significant results. Computer processing techniques were applied to ERTS-1 computer-compatible tape (CCT) data acquired in August 1972 on the Ohio Power Company's coal mining operation in Muskingum County, Ohio. Processing results succeeded in automatically classifying, with an accuracy greater than 90%: (1) stripped earth and major sources of erosion; (2) partially reclaimed areas and minor sources of erosion; (3) water with sedimentation; (4) water without sedimentation; and (5) vegetation. Computer-generated tables listing the area in acres and square kilometers were produced for each target category. Processing results also included geometrically corrected map overlays, one for each target category, drawn on a transparent material by a pen under computer control. Each target category is assigned a distinctive color on the overlay to facilitate interpretation. The overlays, drawn at a scale of 1:250,000 when placed over an AMS map of the same area, immediately provided map locations for each target. These mapping products were generated at a tenth of the cost of conventional mapping techniques.

  16. Phosphorylation-Dependent Regulation of Ryanodine Receptors

    PubMed Central

    Marx, Steven O.; Reiken, Steven; Hisamatsu, Yuji; Gaburjakova, Marta; Gaburjakova, Jana; Yang, Yi-Ming; Rosemblit, Nora; Marks, Andrew R.

    2001-01-01

    Ryanodine receptors (RyRs), intracellular calcium release channels required for cardiac and skeletal muscle contraction, are macromolecular complexes that include kinases and phosphatases. Phosphorylation/dephosphorylation plays a key role in regulating the function of many ion channels, including RyRs. However, the mechanism by which kinases and phosphatases are targeted to ion channels is not well understood. We have identified a novel mechanism involved in the formation of ion channel macromolecular complexes: kinase and phosphatase targeting proteins binding to ion channels via leucine/isoleucine zipper (LZ) motifs. Activation of kinases and phosphatases bound to RyR2 via LZs regulates phosphorylation of the channel, and disruption of kinase binding via LZ motifs prevents phosphorylation of RyR2. Elucidation of this new role for LZs in ion channel macromolecular complexes now permits: (a) rapid mapping of kinase and phosphatase targeting protein binding sites on ion channels; (b) predicting which kinases and phosphatases are likely to regulate a given ion channel; (c) rapid identification of novel kinase and phosphatase targeting proteins; and (d) tools for dissecting the role of kinases and phosphatases as modulators of ion channel function. PMID:11352932

  17. Solution NMR Spectroscopy in Target-Based Drug Discovery.

    PubMed

    Li, Yan; Kang, Congbao

    2017-08-23

    Solution NMR spectroscopy is a powerful tool to study protein structures and dynamics under physiological conditions. This technique is particularly useful in target-based drug discovery projects as it provides protein-ligand binding information in solution. Accumulating studies have shown that NMR plays increasingly important roles in multiple steps of the drug discovery process. In a fragment-based drug discovery process, ligand-observed and protein-observed NMR spectroscopy can be applied to screen fragments with low binding affinities. The screened fragments can be further optimized into drug-like molecules. In combination with other biophysical techniques, NMR will guide structure-based drug discovery. In this review, we describe the possible roles of NMR spectroscopy in drug discovery. We also illustrate the challenges encountered in the drug discovery process. We include several examples demonstrating the roles of NMR in target-based drug discovery, such as hit identification, ranking ligand binding affinities, and mapping the ligand binding site. We also speculate on the possible roles of NMR in assessing target engagement, based on recent progress in in-cell NMR spectroscopy.

  18. Cartographic Design in Flood Risk Mapping - A Challenge for Communication and Stakeholder Involvement

    NASA Astrophysics Data System (ADS)

    Fuchs, S.; Serrhini, K.; Dorner, W.

    2009-12-01

    In order to mitigate flood hazards and to minimise associated losses, technical protection measures have increasingly been supplemented by non-technical mitigation, i.e. land-use planning activities. This is commonly done by creating maps which indicate hazard-prone areas using different cartographic symbols, such as colour, size, shape, and typography. Hazard and risk mapping is the accepted procedure when communicating potential threats to stakeholders, and is therefore required in the European Member States in order to meet the demands of the European Flood Risk Directive. However, available information is sparse concerning the impact of such maps on different stakeholders, i.e., specialists in flood risk management, politicians, and affected citizens. The lack of information stems from a traditional approach to map production which does not take into account specific end-user needs. In order to overcome this information shortage, the current study used a circular approach in which feedback mechanisms originating from different perception patterns of the end users were considered. Different sets of small-scale as well as large-scale risk maps were presented to different groups of test persons in order to (1) study reading behaviour as well as understanding and (2) deduce the most attractive components that are essential for target-oriented communication of cartographic information. Therefore, the method of eye tracking was applied using a video-oculography technique. This resulted in a suggestion for a map template which fulfils the requirement to serve as an efficient communication tool for specialists and practitioners in hazard and risk mapping as well as for laypersons. The results of this study will enable public authorities who are responsible for flood mitigation to (1) improve their flood risk maps, (2) enhance flood risk awareness, and therefore (3) create more disaster-resilient communities.

  19. Developing a mapping tool for tablets

    NASA Astrophysics Data System (ADS)

    Vaughan, Alan; Collins, Nathan; Krus, Mike

    2014-05-01

    Digital field mapping offers significant benefits when compared with traditional paper mapping techniques in that it provides closer integration with downstream geological modelling and analysis. It also provides the mapper with the ability to rapidly integrate new data with existing databases without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. In order to achieve these benefits, a number of PC-based digital mapping tools are available which have been developed for specific communities, eg the BGS•SIGMA project, Midland Valley's FieldMove®, and a range of solutions based on ArcGIS® software, which can be combined with either traditional or digital orientation and data collection tools. However, with the now widespread availability of inexpensive tablets and smart phones, a user led demand for a fully integrated tablet mapping tool has arisen. This poster describes the development of a tablet-based mapping environment specifically designed for geologists. The challenge was to deliver a system that would feel sufficiently close to the flexibility of paper-based geological mapping while being implemented on a consumer communication and entertainment device. The first release of a tablet-based geological mapping system from this project is illustrated and will be shown as implemented on an iPad during the poster session. Midland Valley is pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field based projects throughout this year and will be integrating feedback in further developments of this technology.

  20. National Interest Shown in Watershed Mapping Tool

    EPA Pesticide Factsheets

    The State of Maryland is able to identify prime locations for watershed restoration and preservation using an interactive mapping tool developed by a partnership of agencies led by EPA’s Mid-Atlantic Water Protection Division.

  1. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture); in each, the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
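
    Fuzzy Cognitive Maps lend themselves to a compact numerical sketch: concepts are nodes, signed weights encode causal influence, and the state vector is iterated through a squashing function until it settles. The Python example below shows this standard inference step; the concepts and weights are invented for illustration and are not the stakeholder maps elicited in the study.

      # Minimal FCM inference sketch: iterate A(t+1) = f(A(t) + A(t) @ W) with a
      # sigmoid squashing function. Concepts and weights below are invented for
      # illustration, not the stakeholder maps from the study.
      import numpy as np

      def run_fcm(weights, initial_state, steps=20, lam=1.0):
          f = lambda x: 1.0 / (1.0 + np.exp(-lam * x))    # sigmoid squashing function
          a = np.asarray(initial_state, dtype=float)
          w = np.asarray(weights, dtype=float)
          for _ in range(steps):
              a = f(a + a @ w)
          return a

      concepts = ["coastal tourism", "untreated sewage", "water quality", "fish stocks"]
      W = np.array([
          [0.0,  0.4, -0.2,  0.0],   # tourism -> sewage (+), water quality (-)
          [0.0,  0.0, -0.6, -0.3],   # sewage degrades water quality and fish stocks
          [0.3,  0.0,  0.0,  0.5],   # good water quality helps tourism and fisheries
          [0.2,  0.0,  0.0,  0.0],   # fish stocks support fishing-related tourism
      ])
      state = run_fcm(W, [1.0, 0.5, 0.5, 0.5])
      print(dict(zip(concepts, state.round(2))))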

  2. Monitoring urban subsidence based on SAR interferometric point target analysis

    USGS Publications Warehouse

    Zhang, Y.; Zhang, Jiahua; Gong, W.; Lu, Z.

    2009-01-01

    Interferometric point target analysis (IPTA) is one of the latest developments in radar interferometric processing. It is achieved by analyzing the interferometric phases of individual point targets, which are discrete and present temporally stable backscattering characteristics, in long temporal series of interferometric SAR images. This paper analyzes the interferometric phase model of point targets and then addresses two key issues within the IPTA process. Firstly, a spatial searching method is proposed to unwrap the interferometric phase difference between two neighboring point targets. The height residual error and linear deformation rate of each point target can then be calculated when a global reference point with known height correction and deformation history is chosen. Secondly, a spatial-temporal filtering scheme is proposed to further separate the atmospheric phase and the nonlinear deformation phase from the residual interferometric phase. Finally, an experiment with the developed IPTA methodology is conducted over the Suzhou urban area. In total, 38 ERS-1/2 SAR scenes are analyzed, and deformation information for 3,546 point targets spanning 1992-2002 is generated. The IPTA-derived deformation shows very good agreement with the published result, which demonstrates that the IPTA technique can be developed into an operational tool to map ground subsidence over urban areas.
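
    The second step, separating atmospheric phase from nonlinear deformation in the residual phase, rests on their different temporal behaviour: deformation is smooth in time, while atmosphere is largely uncorrelated between acquisitions. The Python sketch below applies a toy temporal low-pass filter to the residual phase of a single point target; it is an illustrative simplification under assumed data, not the IPTA processing chain itself.

      # Hedged sketch of temporal filtering for one point target: a moving-average
      # low-pass estimates nonlinear deformation, the remainder is attributed to
      # atmosphere plus noise. Illustrative only; real IPTA filtering is richer.
      import numpy as np

      def split_residual_phase(residual_phase, window=5):
          """residual_phase: 1D array, one value per SAR acquisition (radians)."""
          kernel = np.ones(window) / window
          nonlinear_defo = np.convolve(residual_phase, kernel, mode="same")  # temporal low-pass
          atmosphere_plus_noise = residual_phase - nonlinear_defo            # temporal high-pass
          return nonlinear_defo, atmosphere_plus_noise

      rng = np.random.default_rng(1)
      t = np.arange(38)                                  # e.g. 38 ERS-1/2 acquisitions
      resid = 0.02 * (t - 19) ** 2 / 19 + rng.normal(0, 0.3, t.size)  # toy residual phase
      defo, atmo = split_residual_phase(resid)
      print(defo[:5].round(2), atmo[:5].round(2))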

  3. Targeted Recombinant Progeny: a design for ultra-high resolution mapping of Quantitative Trait Loci in crosses between inbred or pure lines.

    PubMed

    Heifetz, Eliyahu M; Soller, Morris

    2015-07-07

    High-resolution mapping of the loci (QTN) responsible for genetic variation in quantitative traits is essential for positional cloning of candidate genes, and for effective marker-assisted selection. The confidence interval (QTL) flanking the point estimate of QTN location is inversely proportional to the number of individuals in the mapping population carrying chromosomes recombinant in the given interval. Consequently, many designs for high resolution QTN mapping are based on increasing the proportion of recombinants in the mapping population. The "Targeted Recombinant Progeny" (TRP) design is a new design for high resolution mapping of a target QTN in crosses between pure or inbred lines. It is a three-generation procedure generating a large number of recombinant individuals within a QTL previously shown to contain a QTN. This is achieved by having individuals that carry chromosomes recombinant across the target QTL interval as parents of a large mapping population, most of whom will therefore carry recombinant chromosomes targeted to the given QTL. The TRP design is particularly useful for high resolution mapping of QTN that differentiate inbred or pure lines, and hence are not amenable to high resolution mapping by genome-wide association tests. In the absence of residual polygenic variation, population sizes required for achieving a given mapping resolution by the TRP-F2 design, relative to a standard F2 design, ranged from 0.289 for a QTN with a standardized allele substitution effect of 0.2 mapped to an initial QTL of 0.2 Morgan, to 0.041 for an equivalent QTN mapped to an initial QTL of 0.02 M. In the presence of residual polygenic variation, the relative effectiveness of the TRP design ranges from 1.068 to 0.151 for the same initial QTL intervals and QTN effect. Thus even in the presence of polygenic variation, the TRP can still provide major savings. Simulation showed that mapping by TRP should be based on 30-50 markers spanning the initial interval, and on at least 50 G2 families representing this number of recombination points. The TRP design can be an effective procedure for achieving high and ultra-high mapping resolution of a target QTN previously mapped to a known confidence interval (QTL).

  4. mrtailor: a tool for PDB-file preparation for the generation of external restraints.

    PubMed

    Gruene, Tim

    2013-09-01

    Model building starting from, for example, a molecular-replacement solution with low sequence similarity introduces model bias, which can be difficult to detect, especially at low resolution. The program mrtailor removes low-similarity regions from a template PDB file according to sequence similarity between the target sequence and the template sequence and maps the target sequence onto the PDB file. The modified PDB file can be used to generate external restraints for low-resolution refinement with reduced model bias and can be used as a starting point for model building and refinement. The program can call ProSMART [Nicholls et al. (2012), Acta Cryst. D68, 404-417] directly in order to create external restraints suitable for REFMAC5 [Murshudov et al. (2011), Acta Cryst. D67, 355-367]. Both a command-line version and a GUI exist.

  5. Harmonisation of geological data to support geohazard mapping: the case of eENVplus project

    NASA Astrophysics Data System (ADS)

    Cipolloni, Carlo; Krivic, Matija; Novak, Matevž; Pantaloni, Marco; Šinigoj, Jasna

    2014-05-01

    In the eENVplus project, which aims to unlock huge amounts of environmental data managed by the national and regional environmental agencies and other public and private organisations, we have developed a cross-border pilot on geological data harmonisation through the integration and harmonisation of existing services. The pilot analyses the methodology and results of the OneGeology-Europe project, elaborated at the scale of 1:1M, to point out difficulties and unsolved problems highlighted during the project. This preliminary analysis is followed by a comparison of two geological maps provided by the neighbouring countries, with the objective of comparing and defining the geometric and semantic anomalous contacts between geological polygons and lines in the maps. This phase will be followed by a detailed-scale geological map analysis aimed at resolving the anomalies identified in the previous phase. The two Geological Surveys involved in the pilot will discuss the problems highlighted during this phase. Subsequently, the semantic description will be redefined and the geometry of the polygons in the geological maps will be redrawn or adjusted according to a lithostratigraphic approach that takes into account the homogeneity of age, lithology, depositional environment and consolidation degree of geological units. The two Geological Surveys have decided to apply the harmonisation process to two different datasets: the first is the Geological Map at the scale of 1:1,000,000, partially harmonised within the OneGeology-Europe project, which will be re-aligned with the GE INSPIRE data model to produce data and services compliant with the INSPIRE target schema. The main target of the Geological Surveys is to produce data and web services compliant with the wider international schema, where there are more options to provide data, with specific attributes that are important for obtaining the geohazard map as in the case of this pilot project; therefore we have decided to apply the GeoSciML 3.2 schema to the dataset that represents the Geological Map at the scale of 1:100,000. Within the pilot, two main geohazard examples will be realised with a semi-automated procedure based on a specific tool component integrated in the client: a landslide susceptibility map and a potential flooding map. In this work we present the first results obtained with the use-case geo-processing procedure in the first test phase, where we have developed a dataset compliant with GE INSPIRE to perform the landslide and flooding susceptibility maps.

  6. A fruit quality gene map of Prunus

    PubMed Central

    2009-01-01

    Background Prunus fruit development, growth, ripening, and senescence includes major biochemical and sensory changes in texture, color, and flavor. The genetic dissection of these complex processes has important applications in crop improvement, to facilitate maximizing and maintaining stone fruit quality from production and processing through to marketing and consumption. Here we present an integrated fruit quality gene map of Prunus containing 133 genes putatively involved in the determination of fruit texture, pigmentation, flavor, and chilling injury resistance. Results A genetic linkage map of 211 markers was constructed for an intraspecific peach (Prunus persica) progeny population, Pop-DG, derived from a canning peach cultivar 'Dr. Davis' and a fresh market cultivar 'Georgia Belle'. The Pop-DG map covered 818 cM of the peach genome and included three morphological markers, 11 ripening candidate genes, 13 cold-responsive genes, 21 novel EST-SSRs from the ChillPeach database, 58 previously reported SSRs, 40 RAFs, 23 SRAPs, 14 IMAs, and 28 accessory markers from candidate gene amplification. The Pop-DG map was co-linear with the Prunus reference T × E map, with 39 SSR markers in common to align the maps. A further 158 markers were bin-mapped to the reference map: 59 ripening candidate genes, 50 cold-responsive genes, and 50 novel EST-SSRs from ChillPeach, with deduced locations in Pop-DG via comparative mapping. Several candidate genes and EST-SSRs co-located with previously reported major trait loci and quantitative trait loci for chilling injury symptoms in Pop-DG. Conclusion The candidate gene approach combined with bin-mapping and availability of a community-recognized reference genetic map provides an efficient means of locating genes of interest in a target genome. We highlight the co-localization of fruit quality candidate genes with previously reported fruit quality QTLs. The fruit quality gene map developed here is a valuable tool for dissecting the genetic architecture of fruit quality traits in Prunus crops. PMID:19995417

  7. Benchmarking short sequence mapping tools

    PubMed Central

    2013-01-01

    Background The development of next-generation sequencing instruments has led to the generation of millions of short sequences in a single run. The process of aligning these reads to a reference genome is time consuming and demands the development of fast and accurate alignment tools. However, the current proposed tools make different compromises between the accuracy and the speed of mapping. Moreover, many important aspects are overlooked while comparing the performance of a newly developed tool to the state of the art. Therefore, there is a need for an objective evaluation method that covers all the aspects. In this work, we introduce a benchmarking suite to extensively analyze sequencing tools with respect to various aspects and provide an objective comparison. Results We applied our benchmarking tests on 9 well known mapping tools, namely, Bowtie, Bowtie2, BWA, SOAP2, MAQ, RMAP, GSNAP, Novoalign, and mrsFAST (mrFAST) using synthetic data and real RNA-Seq data. MAQ and RMAP are based on building hash tables for the reads, whereas the remaining tools are based on indexing the reference genome. The benchmarking tests reveal the strengths and weaknesses of each tool. The results show that no single tool outperforms all others in all metrics. However, Bowtie maintained the best throughput for most of the tests while BWA performed better for longer read lengths. The benchmarking tests are not restricted to the mentioned tools and can be further applied to others. Conclusion The mapping process is still a hard problem that is affected by many factors. In this work, we provided a benchmarking suite that reveals and evaluates the different factors affecting the mapping process. Still, there is no tool that outperforms all of the others in all the tests. Therefore, the end user should clearly specify his needs in order to choose the tool that provides the best results. PMID:23758764
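
    As an illustration of the kind of accuracy metric such a suite computes (not the benchmarking suite itself), the sketch below scores one mapper's output against simulated reads whose true origin is encoded in the read name; that encoding, and the reduced call format, are assumptions of this example only.

    ```python
    # Illustrative scoring of one mapping tool, not the benchmarking suite itself.
    # Assumption of this sketch only: simulated reads carry their true origin in
    # the read name as "chrom:pos", and the tool's output has been reduced to a
    # dict of read name -> (chrom, pos), or None for unmapped reads.
    def evaluate(calls, tolerance=5):
        correct = wrong = unmapped = 0
        for read_name, hit in calls.items():
            true_chrom, true_pos = read_name.rsplit(":", 1)
            if hit is None:
                unmapped += 1
            elif hit[0] == true_chrom and abs(hit[1] - int(true_pos)) <= tolerance:
                correct += 1
            else:
                wrong += 1
        total = len(calls)
        return {
            "mapped_fraction": (correct + wrong) / total,
            "precision": correct / max(correct + wrong, 1),  # correct among mapped
            "recall": correct / total,                       # correct among all reads
        }

    # toy truth-encoded reads and mapper calls
    calls = {"chr1:1000": ("chr1", 1002), "chr1:2000": ("chr2", 50), "chr2:300": None}
    print(evaluate(calls))
    ```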

  8. Radiometric spectral and band rendering of targets using anisotropic BRDFs and measured backgrounds

    NASA Astrophysics Data System (ADS)

    Hilgers, John W.; Hoffman, Jeffrey A.; Reynolds, William R.; Jafolla, James C.

    2000-07-01

    Achievement of ultra-high fidelity signature modeling of targets requires a significant level of complexity for all of the components required in the rendering process. Specifically, the reflectance of the surface must be described using the bi-directional distribution function (BRDF). In addition, the spatial representation of the background must be high fidelity. A methodology and corresponding model for spectral and band rendering of targets using both isotropic and anisotropic BRDFs is presented. In addition, a set of tools will be described for generating theoretical anisotropic BRDFs and for reducing data required for a description of an anisotropic BRDF by 5 orders of magnitude. This methodology is hybrid using a spectrally measured panoramic of the background mapped to a large hemisphere. Both radiosity and ray-tracing approaches are incorporated simultaneously for a robust solution. In the thermal domain the spectral emission is also included in the solution. Rendering examples using several BRDFs will be presented.

  9. EnGeoMAP - geological applications within the EnMAP hyperspectral satellite science program

    NASA Astrophysics Data System (ADS)

    Boesche, N. K.; Mielke, C.; Rogass, C.; Guanter, L.

    2016-12-01

    Hyperspectral investigations from the near field to space contribute substantially to geological exploration and to the mining-related monitoring of raw materials and mineral deposits. Owing to their spectral characteristics, large mineral occurrences and mine fields can be identified from space and the spatial distribution of distinct proxy minerals can be mapped. In the frame of the EnMAP hyperspectral satellite science program, a mineral and elemental mapping tool was developed: EnGeoMAP. It contains a basic mineral mapping approach and a rare earth element mapping approach. This study shows the performance of EnGeoMAP on simulated EnMAP data of the rare-earth-element-bearing Mountain Pass Carbonatite Complex, USA, and of the Rodalquilar and Lomilla Calderas, Spain, which host economically relevant gold-silver, lead-zinc-silver-gold and alunite deposits. The Mountain Pass image data were simulated on the basis of AVIRIS Next Generation images, while the Rodalquilar data are based on HyMap images. The EnGeoMAP-Base approach was applied to both images, and the Mountain Pass image data were additionally analysed using the EnGeoMAP-REE software tool. The results are mineral and elemental maps that serve as proxies for the regional lithology and deposit types. Validation of the maps is based on chemical analyses of field samples. Current airborne sensors meet the spatial and spectral requirements for detailed mineral mapping, and future hyperspectral spaceborne missions will additionally provide large coverage. For those hyperspectral missions, EnGeoMAP is a rapid data analysis tool provided to spectral geologists working in mineral exploration.
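
    EnGeoMAP's actual fitting of diagnostic absorption features is not reproduced here; purely for orientation, the sketch below classifies image spectra against a small reference library with the classic spectral-angle measure, assuming NumPy and invented toy spectra.

    ```python
    # Not the EnGeoMAP algorithm; a stand-in showing the general idea of matching
    # image spectra against reference mineral spectra, here with the classic
    # spectral-angle measure. `cube` is a (rows, cols, bands) reflectance array and
    # `library` maps (invented) mineral names to reference spectra.
    import numpy as np

    def spectral_angle(a, b):
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        return np.arccos(np.clip(cos, -1.0, 1.0))

    def map_minerals(cube, library, max_angle=0.10):
        rows, cols, _ = cube.shape
        names = list(library)
        out = np.full((rows, cols), -1, dtype=int)   # -1 means unclassified
        for r in range(rows):
            for c in range(cols):
                angles = [spectral_angle(cube[r, c], library[n]) for n in names]
                best = int(np.argmin(angles))
                if angles[best] <= max_angle:
                    out[r, c] = best
        return out, names

    lib = {"mineral_A": np.array([0.6, 0.5, 0.7, 0.4]),
           "mineral_B": np.array([0.3, 0.8, 0.2, 0.6])}
    scene = np.stack([[lib["mineral_A"], lib["mineral_B"]],
                      [lib["mineral_B"], lib["mineral_A"]]])
    labels, names = map_minerals(scene, lib)
    print(labels, names)
    ```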

  10. The Family Map: A Tool for Understanding the Risks for Children in Families with Substance Abuse

    ERIC Educational Resources Information Center

    Bokony, Patti A.; Conners-Burrow, Nicola A.; Whiteside-Mansell, Leanne; Johnson, Danya; McKelvey, Lorraine; Bradley, Robert H.

    2010-01-01

    This article reviews the findings from our assessments of children and their families in two Head Start programs using the Family Map. Specifically, we used the Family Map assessment tool to identify risks to children associated with alcohol and drug use in families with young children. Practical suggestions are offered to administrators about the…

  11. A Study to Determine the Contribution Made by Concept Maps to a Computer Architecture and Organization Course

    ERIC Educational Resources Information Center

    Aydogan, Tuncay; Ergun, Serap

    2016-01-01

    Concept mapping is a method of graphical learning that can be beneficial as a study method for concept linking and organization. Concept maps, which provide an elegant, easily understood representation of an expert's domain knowledge, are tools for organizing and representing knowledge. These tools have been used in educational environments to…

  12. A Mapping of Drug Space from the Viewpoint of Small Molecule Metabolism

    PubMed Central

    Basuino, Li; Chambers, Henry F.; Lee, Deok-Sun; Wiest, Olaf G.; Babbitt, Patricia C.

    2009-01-01

    Small molecule drugs target many core metabolic enzymes in humans and pathogens, often mimicking endogenous ligands. The effects may be therapeutic or toxic, but are frequently unexpected. A large-scale mapping of the intersection between drugs and metabolism is needed to better guide drug discovery. To map the intersection between drugs and metabolism, we have grouped drugs and metabolites by their associated targets and enzymes using ligand-based set signatures created to quantify their degree of similarity in chemical space. The results reveal the chemical space that has been explored for metabolic targets, where successful drugs have been found, and what novel territory remains. To aid other researchers in their drug discovery efforts, we have created an online resource of interactive maps linking drugs to metabolism. These maps predict the “effect space” comprising likely target enzymes for each of the 246 MDDR drug classes in humans. The online resource also provides species-specific interactive drug-metabolism maps for each of the 385 model organisms and pathogens in the BioCyc database collection. Chemical similarity links between drugs and metabolites predict potential toxicity, suggest routes of metabolism, and reveal drug polypharmacology. The metabolic maps enable interactive navigation of the vast biological data on potential metabolic drug targets and the drug chemistry currently available to prosecute those targets. Thus, this work provides a large-scale approach to ligand-based prediction of drug action in small molecule metabolism. PMID:19701464
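
    The paper's ligand-based set signatures are not reproduced here; as a hedged illustration of the underlying operation — quantifying drug–metabolite similarity in chemical space — the sketch below uses RDKit Morgan fingerprints and Tanimoto similarity. RDKit and the example molecules are assumptions of this sketch, not part of the original work.

    ```python
    # Hedged illustration only: the paper's ligand-based set signatures are not
    # reproduced. This simply scores drug-metabolite similarity in chemical space
    # with RDKit Morgan fingerprints and Tanimoto similarity (RDKit assumed).
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    def fingerprint(smiles):
        return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles),
                                                     radius=2, nBits=2048)

    drug = ("aspirin", "CC(=O)Oc1ccccc1C(=O)O")
    metabolites = {
        "salicylic acid": "O=C(O)c1ccccc1O",                   # known metabolite
        "glucose": "OC[C@H]1OC(O)[C@H](O)[C@@H](O)[C@@H]1O",   # unrelated metabolite
    }

    drug_fp = fingerprint(drug[1])
    for name, smi in metabolites.items():
        sim = DataStructs.TanimotoSimilarity(drug_fp, fingerprint(smi))
        print(f"{drug[0]} vs {name}: Tanimoto = {sim:.2f}")
    ```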

  13. GIS-based interactive tool to map the advent of world conquerors

    NASA Astrophysics Data System (ADS)

    Lakkaraju, Mahesh

    The objective of this thesis is to show the scale and extent of some of the greatest empires the world has ever seen. This is a hybrid project between the GIS based interactive tool and the web-based JavaScript tool. This approach lets the students learn effectively about the emperors themselves while understanding how long and far their empires spread. In the GIS based tool, a map is displayed with various points on it, and when a user clicks on one point, the relevant information of what happened at that particular place is displayed. Apart from this information, users can also select the interactive animation button and can walk through a set of battles in chronological order. As mentioned, this uses Java as the main programming language, and MOJO (Map Objects Java Objects) provided by ESRI. MOJO is very effective as its GIS related features can be included in the application itself. This app. is a simple tool and has been developed for university or high school level students. D3.js is an interactive animation and visualization platform built on the Javascript framework. Though HTML5, CSS3, Javascript and SVG animations can be used to derive custom animations, this tool can help bring out results with less effort and more ease of use. Hence, it has become the most sought after visualization tool for multiple applications. D3.js has provided a map-based visualization feature so that we can easily display text-based data in a map-based interface. To draw the map and the points on it, D3.js uses data rendered in TOPO JSON format. The latitudes and longitudes can be provided, which are interpolated into the Map svg. One of the main advantages of doing it this way is that more information is retained when we use a visual medium.

  14. QTL Mapping and CRISPR/Cas9 Editing to Identify a Drug Resistance Gene in Toxoplasma gondii.

    PubMed

    Shen, Bang; Powell, Robin H; Behnke, Michael S

    2017-06-22

    Scientific knowledge is intrinsically linked to available technologies and methods. This article presents two methods that allowed for the identification and verification of a drug resistance gene in the apicomplexan parasite Toxoplasma gondii: Quantitative Trait Locus (QTL) mapping using a Whole Genome Sequence (WGS)-based genetic map, and Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)/Cas9-based gene editing. QTL mapping allows one to test whether there is a correlation between a genomic region (or regions) and a phenotype. Two datasets are required to run a QTL scan: a genetic map based on the progeny of a recombinant cross, and a quantifiable phenotype assessed in each of the progeny of that cross. These datasets are then formatted to be compatible with the R/qtl software, which generates a QTL scan to identify significant loci correlated with the phenotype. Although this can greatly narrow the search window of possible candidates, QTLs span regions containing a number of genes from which the causal gene needs to be identified. Having WGS of the progeny was critical to identify the causal drug resistance mutation at the gene level. Once identified, the candidate mutation can be verified by genetic manipulation of drug-sensitive parasites. The most facile and efficient method to genetically modify T. gondii is the CRISPR/Cas9 system. This system is composed of just two components, both encoded on a single plasmid: a single guide RNA (gRNA) containing a 20 bp sequence complementary to the genomic target, and the Cas9 endonuclease, which generates a double-strand DNA break (DSB) at the target; repair of the break allows for insertion or deletion of sequences around the break site. This article provides detailed protocols for using CRISPR/Cas9-based genome editing tools to verify the gene responsible for sinefungin resistance and to construct transgenic parasites.

  15. QTL Mapping and CRISPR/Cas9 Editing to Identify a Drug Resistance Gene in Toxoplasma gondii

    PubMed Central

    Shen, Bang; Powell, Robin H.; Behnke, Michael S.

    2017-01-01

    Scientific knowledge is intrinsically linked to available technologies and methods. This article presents two methods that allowed for the identification and verification of a drug resistance gene in the apicomplexan parasite Toxoplasma gondii: Quantitative Trait Locus (QTL) mapping using a Whole Genome Sequence (WGS)-based genetic map, and Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)/Cas9-based gene editing. QTL mapping allows one to test whether there is a correlation between a genomic region (or regions) and a phenotype. Two datasets are required to run a QTL scan: a genetic map based on the progeny of a recombinant cross, and a quantifiable phenotype assessed in each of the progeny of that cross. These datasets are then formatted to be compatible with the R/qtl software, which generates a QTL scan to identify significant loci correlated with the phenotype. Although this can greatly narrow the search window of possible candidates, QTLs span regions containing a number of genes from which the causal gene needs to be identified. Having WGS of the progeny was critical to identify the causal drug resistance mutation at the gene level. Once identified, the candidate mutation can be verified by genetic manipulation of drug-sensitive parasites. The most facile and efficient method to genetically modify T. gondii is the CRISPR/Cas9 system. This system is composed of just two components, both encoded on a single plasmid: a single guide RNA (gRNA) containing a 20 bp sequence complementary to the genomic target, and the Cas9 endonuclease, which generates a double-strand DNA break (DSB) at the target; repair of the break allows for insertion or deletion of sequences around the break site. This article provides detailed protocols for using CRISPR/Cas9-based genome editing tools to verify the gene responsible for sinefungin resistance and to construct transgenic parasites. PMID:28671645

  16. Using habitat suitability models to target invasive plant species surveys.

    PubMed

    Crall, Alycia W; Jarnevich, Catherine S; Panke, Brendon; Young, Nick; Renz, Mark; Morisette, Jeffrey

    2013-01-01

    Managers need new tools for detecting the movement and spread of nonnative, invasive species. Habitat suitability models are a popular tool for mapping the potential distribution of current invaders, but the ability of these models to prioritize monitoring efforts has not been tested in the field. We tested the utility of an iterative sampling design (i.e., models based on field observations used to guide subsequent field data collection to improve the model), hypothesizing that model performance would increase when new data were gathered from targeted sampling using criteria based on the initial model results. We also tested the ability of habitat suitability models to predict the spread of invasive species, hypothesizing that models would accurately predict occurrences in the field, and that the use of targeted sampling would detect more species with less sampling effort than a nontargeted approach. We tested these hypotheses on two species at the state scale (Centaurea stoebe and Pastinaca sativa) in Wisconsin (USA), and one genus at the regional scale (Tamarix) in the western United States. These initial data were merged with environmental data at 30-m² resolution for Wisconsin and 1-km² resolution for the western United States to produce our first iteration models. We stratified these initial models to target field sampling and compared our models and success at detecting our species of interest to other surveys being conducted during the same field season (i.e., nontargeted sampling). Although more data did not always improve our models based on correct classification rate (CCR), sensitivity, specificity, kappa, or area under the curve (AUC), our models generated from targeted sampling data always performed better than models generated from nontargeted data. For Wisconsin species, the model described actual locations in the field fairly well (kappa = 0.51, 0.19, P < 0.01), and targeted sampling did detect more species than nontargeted sampling with less sampling effort (χ² = 47.42, P < 0.01). From these findings, we conclude that habitat suitability models can be highly useful tools for guiding invasive species monitoring, and we support the use of an iterative sampling design for guiding such efforts.
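
    The evaluation metrics named above can be reproduced from field presence/absence checks in a few lines; the sketch below is illustrative only (invented data, scikit-learn assumed), not the study's analysis code.

    ```python
    # Illustrative only (not the study's code): scoring a habitat-suitability
    # model against field presence/absence checks with the metrics named in the
    # abstract (CCR, sensitivity, specificity, kappa, AUC). Assumes scikit-learn.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score, confusion_matrix, roc_auc_score

    observed = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])        # field detections
    suitability = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3, 0.55, 0.35])
    predicted = (suitability >= 0.5).astype(int)               # threshold to presence/absence

    tn, fp, fn, tp = confusion_matrix(observed, predicted).ravel()
    print("CCR        :", (tp + tn) / len(observed))
    print("sensitivity:", tp / (tp + fn))
    print("specificity:", tn / (tn + fp))
    print("kappa      :", cohen_kappa_score(observed, predicted))
    print("AUC        :", roc_auc_score(observed, suitability))
    ```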

  17. [siRNAs with high specificity to the target: a systematic design by CRM algorithm].

    PubMed

    Alsheddi, T; Vasin, L; Meduri, R; Randhawa, M; Glazko, G; Baranova, A

    2008-01-01

    The 'off-target' silencing effect hinders the development of siRNA-based therapeutic and research applications. A common solution to this problem is to employ BLAST, which may miss significant alignments, or an exhaustive Smith-Waterman algorithm, which is very time-consuming. We have developed a Comprehensive Redundancy Minimizer (CRM) approach for mapping all unique sequences ("targets") 9-to-15 nt in size within large sets of sequences (e.g. transcriptomes). CRM outputs a list of potential siRNA candidates for every transcript of the particular species. These candidates can be further analyzed by traditional "set-of-rules" siRNA design tools. For human, 91% of transcripts are covered by candidate siRNAs with kernel targets of N = 15. We tested our approach on a collection of previously described, experimentally assessed siRNAs and found that the correlation between efficacy and presence in the CRM-approved set is significant (r = 0.215, p-value = 0.0001). An interactive database containing a precompiled set of all human siRNA candidates with minimized redundancy is available at http://129.174.194.243. Application of CRM-based filtering minimizes potential "off-target" silencing effects and could improve routine siRNA applications.
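
    The redundancy-minimisation idea — restricting candidates to k-mers that occur in exactly one transcript — can be sketched as below. This is a minimal illustration with toy sequences, not the CRM implementation.

    ```python
    # Minimal sketch of the redundancy-minimisation idea (not the CRM code):
    # find k-mers that occur in exactly one transcript, so that an siRNA built
    # around such a "kernel target" has no perfect match elsewhere.
    from collections import defaultdict

    def unique_kmers(transcripts, k=15):
        """transcripts: dict of transcript_id -> sequence.
        Returns transcript_id -> set of k-mers found in that transcript only."""
        owners = defaultdict(set)
        for tid, seq in transcripts.items():
            for i in range(len(seq) - k + 1):
                owners[seq[i:i + k]].add(tid)
        unique = defaultdict(set)
        for kmer, tids in owners.items():
            if len(tids) == 1:
                unique[next(iter(tids))].add(kmer)
        return unique

    toy = {"tx1": "ATGGCGTACGTTAGCATCGATCG",
           "tx2": "ATGGCGTACGTTAGCTTTTTTTT"}   # shares a 15-nt prefix with tx1
    for tid, kmers in unique_kmers(toy, k=15).items():
        print(tid, len(kmers), "unique 15-mers")
    ```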

  18. Gbm.auto: A software tool to simplify spatial modelling and Marine Protected Area planning

    PubMed Central

    Officer, Rick; Clarke, Maurice; Reid, David G.; Brophy, Deirdre

    2017-01-01

    Boosted Regression Trees: excellent for data-poor spatial management, but hard to use. Marine resource managers and scientists often advocate spatial approaches to manage data-poor species. Existing spatial prediction and management techniques are either insufficiently robust, struggle with sparse input data, or make suboptimal use of multiple explanatory variables. Boosted Regression Trees feature excellent performance and are well suited to modelling the distribution of data-limited species, but are extremely complicated and time-consuming to learn and use, hindering access for a wide potential user base and therefore limiting uptake and usage. BRTs automated and simplified for accessible general use with a rich feature set. We have built a software suite in R which integrates pre-existing functions with new tailor-made functions to automate the processing and predictive mapping of species abundance data: by automating and greatly simplifying Boosted Regression Tree spatial modelling, the gbm.auto R package suite makes this powerful statistical modelling technique more accessible to potential users in the ecological and modelling communities. The package and its documentation allow the user to generate maps of predicted abundance, visualise the representativeness of those abundance maps and plot the relative influence of explanatory variables and their relationship to the response variables. Databases of the processed model objects and a report explaining all the steps taken within the model are also generated. The package includes a previously unavailable Decision Support Tool which combines estimated escapement biomass (the percentage of an exploited population which must be retained each year to conserve it) with the predicted abundance maps to generate maps showing the location and size of habitat that should be protected to conserve the target stocks (candidate MPAs), based on stakeholder priorities such as the minimisation of fishing effort displacement. Gbm.auto for management in various settings: by bridging the gap between advanced statistical methods for species distribution modelling and conservation science, management and policy, these tools can allow improved spatial abundance predictions, and therefore better management, decision-making, and conservation. Although this package was built to support spatial management of a data-limited marine elasmobranch fishery, it should be equally applicable to spatial abundance modelling, area protection, and stakeholder engagement in various scenarios. PMID:29216310
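
    gbm.auto itself is an R package; purely to illustrate the modelling step it automates (boosted regression trees relating abundance to environmental predictors, then predicting a mappable surface), here is a language-neutral sketch using scikit-learn and synthetic survey data. None of the predictor names or values come from the original work.

    ```python
    # gbm.auto is an R package; this is only an illustration of the core step it
    # automates: fitting boosted regression trees to species abundance against
    # environmental predictors and predicting a map-able surface.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    # synthetic survey data: depth, temperature, grain size -> abundance
    X = rng.uniform([5, 6, 0.1], [120, 16, 2.0], size=(300, 3))
    abundance = 50 * np.exp(-((X[:, 0] - 60) ** 2) / 800) + rng.normal(0, 2, 300)

    brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01,
                                    max_depth=3, subsample=0.75)
    brt.fit(X, abundance)

    # predict over a toy gradient of depths to build an abundance profile
    grid = np.column_stack([np.linspace(5, 120, 50),
                            np.full(50, 11.0), np.full(50, 1.0)])
    predicted = brt.predict(grid)
    print("relative influence of predictors:", brt.feature_importances_)
    print("peak predicted abundance at depth (m):", grid[predicted.argmax(), 0])
    ```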

  19. PLASMAP: an interactive computational tool for storage, retrieval and device-independent graphic display of conventional restriction maps.

    PubMed Central

    Stone, B N; Griesinger, G L; Modelevsky, J L

    1984-01-01

    We describe an interactive computational tool, PLASMAP, which allows the user to electronically store, retrieve, and display circular restriction maps. PLASMAP permits users to construct libraries of plasmid restriction maps as a set of files which may be edited in the laboratory at any time. The display feature of PLASMAP quickly generates device-independent, artist-quality, full-color or monochrome, hard copies or CRT screens of complex, conventional circular restriction maps. PMID:6320096

  20. NASA Lunar and Planetary Mapping and Modeling

    NASA Astrophysics Data System (ADS)

    Day, Brian; Law, Emily

    2016-10-01

    NASA's Lunar and Planetary Mapping and Modeling Portals provide web-based suites of interactive visualization and analysis tools to enable mission planners, planetary scientists, students, and the general public to access mapped lunar data products from past and current missions for the Moon, Mars, and Vesta. New portals for additional planetary bodies are being planned. This presentation will recap some of the enhancements to these products during the past year and preview work currently being undertaken. New data products added to the Lunar Mapping and Modeling Portal (LMMP) include both generalized products as well as polar data products specifically targeting potential sites for the Resource Prospector mission. New tools being developed include traverse planning and surface potential analysis. Current development work on LMMP also includes facilitating mission planning and data management for lunar CubeSat missions. Looking ahead, LMMP is working with the NASA Astromaterials Office to integrate with their Lunar Apollo Sample database to help better visualize the geographic contexts of retrieved samples. All of this will be done within the framework of a new user interface which, among other improvements, will provide significantly enhanced 3D visualizations and navigation. Mars Trek, the project's Mars portal, has now been assigned by NASA's Planetary Science Division to support site selection and analysis for the Mars 2020 Rover mission as well as for the Mars Human Landing Exploration Zone Sites, and is being enhanced with data products and analysis tools specifically requested by the proposing teams for the various sites. NASA Headquarters is giving high priority to Mars Trek's use as a means to directly involve the public in these upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. The portals also serve as outstanding resources for education and outreach. As such, they have been designated by NASA's Science Mission Directorate as key supporting infrastructure for the new education programs selected through the division's recent CAN.

  1. The IBD interactome: an integrated view of aetiology, pathogenesis and therapy.

    PubMed

    de Souza, Heitor S P; Fiocchi, Claudio; Iliopoulos, Dimitrios

    2017-12-01

    Crohn's disease and ulcerative colitis are prototypical complex diseases characterized by chronic and heterogeneous manifestations, induced by interacting environmental, genomic, microbial and immunological factors. These interactions result in an overwhelming complexity that cannot be tackled by studying the totality of each pathological component (an '-ome') in isolation without consideration of the interaction among all relevant -omes that yield an overall 'network effect'. The outcome of this effect is the 'IBD interactome', defined as a disease network in which dysregulation of individual -omes causes intestinal inflammation mediated by dysfunctional molecular modules. To define the IBD interactome, new concepts and tools are needed to implement a systems approach; an unbiased data-driven integration strategy that reveals key players of the system, pinpoints the central drivers of inflammation and enables development of targeted therapies. Powerful bioinformatics tools able to query and integrate multiple -omes are available, enabling the integration of genomic, epigenomic, transcriptomic, proteomic, metabolomic and microbiome information to build a comprehensive molecular map of IBD. This approach will enable identification of IBD molecular subtypes, correlations with clinical phenotypes and elucidation of the central hubs of the IBD interactome that will aid discovery of compounds that can specifically target the hubs that control the disease.

  2. Using telephony data to facilitate discovery of clinical workflows.

    PubMed

    Rucker, Donald W

    2017-04-19

    Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units, using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open-source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds, and for these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short; the compact data transfers within short calls may serve as automation or redesign targets. The large absolute amount of time medical center employees spent engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
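
    A minimal sketch of the edge-building step described above, with assumed field names and invented records; the real framework parses enterprise call detail records and rolls numbers up through a cost-center dictionary in the same spirit.

    ```python
    # Minimal sketch (assumed field names, invented data): cross-correlate call
    # detail records, roll numbers up to cost centers, and emit an edge table
    # that a graph tool such as Gephi could display.
    import csv
    from collections import Counter

    def build_edges(cdr_rows, cost_center_of):
        """cdr_rows: iterable of dicts with 'calling', 'called', 'seconds'.
        cost_center_of: dict mapping phone number -> cost-center name."""
        edges = Counter()
        for row in cdr_rows:
            src = cost_center_of.get(row["calling"], "unknown")
            dst = cost_center_of.get(row["called"], "unknown")
            if src != dst:                      # keep inter-unit traffic only
                edges[(src, dst)] += 1
        return edges

    cdrs = [{"calling": "5551", "called": "5702", "seconds": 27},
            {"calling": "5551", "called": "5702", "seconds": 31},
            {"calling": "5702", "called": "5890", "seconds": 240}]
    centers = {"5551": "Emergency Dept", "5702": "Pharmacy", "5890": "Radiology"}

    with open("edges.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["source", "target", "calls"])
        for (src, dst), n in build_edges(cdrs, centers).items():
            writer.writerow([src, dst, n])
    ```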

  3. Controlling feeding behavior by chemical or gene-directed targeting in the brain: what's so spatial about our methods?

    PubMed Central

    Khan, Arshad M.

    2013-01-01

    Intracranial chemical injection (ICI) methods have been used to identify the locations in the brain where feeding behavior can be controlled acutely. Scientists conducting ICI studies often document their injection site locations, thereby leaving kernels of valuable location data for others to use to further characterize feeding control circuits. Unfortunately, this rich dataset has not yet been formally contextualized with other published neuroanatomical data. In particular, axonal tracing studies have delineated several neural circuits originating in the same areas where ICI injection feeding-control sites have been documented, but it remains unclear whether these circuits participate in feeding control. Comparing injection sites with other types of location data would require careful anatomical registration between the datasets. Here, a conceptual framework is presented for how such anatomical registration efforts can be performed. For example, by using a simple atlas alignment tool, a hypothalamic locus sensitive to the orexigenic effects of neuropeptide Y (NPY) can be aligned accurately with the locations of neurons labeled by anterograde tracers or those known to express NPY receptors or feeding-related peptides. This approach can also be applied to those intracranial “gene-directed” injection (IGI) methods (e.g., site-specific recombinase methods, RNA expression or interference, optogenetics, and pharmacosynthetics) that involve viral injections to targeted neuronal populations. Spatial alignment efforts can be accelerated if location data from ICI/IGI methods are mapped to stereotaxic brain atlases to allow powerful neuroinformatics tools to overlay different types of data in the same reference space. Atlas-based mapping will be critical for community-based sharing of location data for feeding control circuits, and will accelerate our understanding of structure-function relationships in the brain for mammalian models of obesity and metabolic disorders. PMID:24385950

  4. CRISPR-directed mitotic recombination enables genetic mapping without crosses.

    PubMed

    Sadhu, Meru J; Bloom, Joshua S; Day, Laura; Kruglyak, Leonid

    2016-05-27

    Linkage and association studies have mapped thousands of genomic regions that contribute to phenotypic variation, but narrowing these regions to the underlying causal genes and variants has proven much more challenging. Resolution of genetic mapping is limited by the recombination rate. We developed a method that uses CRISPR (clustered, regularly interspaced, short palindromic repeats) to build mapping panels with targeted recombination events. We tested the method by generating a panel with recombination events spaced along a yeast chromosome arm, mapping trait variation, and then targeting a high density of recombination events to the region of interest. Using this approach, we fine-mapped manganese sensitivity to a single polymorphism in the transporter Pmr1. Targeting recombination events to regions of interest allows us to rapidly and systematically identify causal variants underlying trait differences. Copyright © 2016, American Association for the Advancement of Science.

  5. Enhanced Management of and Access to Hurricane Sandy Ocean and Coastal Mapping Data

    NASA Astrophysics Data System (ADS)

    Eakins, B.; Neufeld, D.; Varner, J. D.; McLean, S. J.

    2014-12-01

    NOAA's National Geophysical Data Center (NGDC) has significantly improved the discovery and delivery of its geophysical data holdings, initially targeting ocean and coastal mapping (OCM) data in the U.S. coastal region impacted by Hurricane Sandy in 2012. We have developed a browser-based, interactive interface that permits users to refine their initial map-driven data-type choices prior to bulk download (e.g., by selecting individual surveys), including the ability to choose ancillary files, such as reports or derived products. Initial OCM data types now available in a U.S. East Coast map viewer, as well as underlying web services, include: NOS hydrographic soundings and multibeam sonar bathymetry. Future releases will include trackline geophysics, airborne topographic and bathymetric-topographic lidar, bottom sample descriptions, and digital elevation models. This effort also includes working collaboratively with other NOAA offices and partners to develop automated methods to receive and verify data, stage data for archive, and notify data providers when ingest and archive are completed. We have also developed improved metadata tools to parse XML and auto-populate OCM data catalogs, support the web-based creation and editing of ISO-compliant metadata records, and register metadata in appropriate data portals. This effort supports a variety of NOAA mission requirements, from safe navigation to coastal flood forecasting and habitat characterization.

  6. Investigation of relationships between linears, total and hazy areas, and petroleum production in the Williston Basin: An ERTS approach

    NASA Technical Reports Server (NTRS)

    Erickson, J. M.; Street, J. S. (Principal Investigator); Munsell, C. J.; Obrien, D. E.

    1975-01-01

    The author has identified the following significant results. ERTS-1 imagery in a variety of formats was used to locate linear, tonal, and hazy features and to relate them to areas of hydrocarbon production in the Williston Basin of North Dakota, eastern Montana, and northern South Dakota. Derivative maps of rectilinear, curvilinear, tonal, and hazy features were made using standard laboratory techniques. Mapping of rectilinears on both bands 5 and 7 over the entire region indicated the presence of northeast-southwest and northwest-southeast regional trends, which are indicative of the bedrock fracture pattern in the basin. Curved lines generally bound areas of unique tone, and maps of tonal patterns repeat many of the boundaries seen on curvilinear maps. Tones were best analyzed on spring and fall imagery in the Williston Basin. It is postulated that hazy areas are caused by atmospheric phenomena. The ability to use ERTS imagery as an exploration tool was examined where petroleum and gas are presently produced (Bottineau Field, Nesson and Antelope anticlines, Redwing Creek, and Cedar Creek anticline). Some tonal and linear features were determined to coincide with the locations of present production at Redwing Creek and Cedar Creek. In the remaining cases, targets could not be sufficiently well defined to justify this method.

  7. Story Map Instruction: A Road Map for Reading Comprehension.

    ERIC Educational Resources Information Center

    Davis, Zephaniah, T.; McPherson, Michael D.

    1989-01-01

    Introduces teachers to the development and use of story maps as a tool for promoting reading comprehension. Presents a definition and review of story map research. Explains how to construct story maps, and offers suggestions for starting story map instruction. Provides variations on the use of story maps. (MG)

  8. Music-therapy analyzed through conceptual mapping

    NASA Astrophysics Data System (ADS)

    Martinez, Rodolfo; de la Fuente, Rebeca

    2002-11-01

    Conceptual maps have been employed lately as a learning tool, as a modern study technique, and as a new way to understand intelligence, which allows for the development of a strong theoretical reference, in order to prove the research hypothesis. This paper presents a music-therapy analysis based on this tool to produce a conceptual mapping network, which ranges from magic through the rigor of the hard sciences.

  9. Science Teachers' Use of a Concept Map Marking Guide as a Formative Assessment Tool for the Concept of Energy

    ERIC Educational Resources Information Center

    Won, Mihye; Krabbe, Heiko; Ley, Siv Ling; Treagust, David F.; Fischer, Hans E.

    2017-01-01

    In this study, we investigated the value of a concept map marking guide as an alternative formative assessment tool for science teachers to adopt for the topic of energy. Eight high school science teachers marked students' concept maps using an itemized holistic marking guide. Their marking was compared with the researchers' marking and the scores…

  10. Mapping, Awareness, And Virtualization Network Administrator Training Tool Virtualization Module

    DTIC Science & Technology

    2016-03-01

    Mapping, Awareness, and Virtualization Network Administrator Training Tool: Virtualization Module. Master's thesis by Erik W. Berndt, Naval Postgraduate School, March 2016; thesis advisor: John Gibson.

  11. Mapping healthcare systems: a policy relevant analytic tool

    PubMed Central

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L.V.

    2017-01-01

    Abstract Background In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool – the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Methods Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. Results We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. Conclusions As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. PMID:28541518

  12. A software tool for rapid flood inundation mapping

    USGS Publications Warehouse

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
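
    The hydraulic core of the tool is the Manning equation for steady open-channel flow; the sketch below works it for a simplified rectangular channel. The GFT itself derives channel geometry from digital elevation data, which is not reproduced here, and the channel dimensions below are invented.

    ```python
    # Worked illustration of the tool's hydraulic basis, Manning's equation for
    # steady open-channel flow (SI units): Q = (1/n) * A * R**(2/3) * sqrt(S).
    # The rectangular cross-section is a simplification, not the GFT's
    # DEM-derived geometry.
    def manning_discharge(depth, width, slope, n=0.035):
        area = width * depth
        wetted_perimeter = width + 2 * depth
        hydraulic_radius = area / wetted_perimeter
        return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

    def depth_for_discharge(q_target, width, slope, n=0.035):
        """Bisection for the depth that carries q_target (m^3/s)."""
        lo, hi = 1e-3, 50.0
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if manning_discharge(mid, width, slope, n) < q_target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    print("Q at 2 m depth:", round(manning_discharge(2.0, width=40.0, slope=0.001), 1), "m^3/s")
    print("depth for 500 m^3/s:", round(depth_for_discharge(500.0, 40.0, 0.001), 2), "m")
    ```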

  13. Automated mapping of clinical terms into SNOMED-CT. An application to codify procedures in pathology.

    PubMed

    Allones, J L; Martinez, D; Taboada, M

    2014-10-01

    Clinical terminologies are considered a key technology for capturing clinical data in a precise and standardized manner, which is critical to accurately exchange information among different applications, medical records and decision support systems. An important step to promote the real use of clinical terminologies, such as SNOMED-CT, is to facilitate the process of finding mappings between local terms of medical records and concepts of terminologies. In this paper, we propose a mapping tool to discover text-to-concept mappings in SNOMED-CT. Name-based techniques were combined with a query expansion system to generate alternative search terms, and with a strategy to analyze and take advantage of the semantic relationships of the SNOMED-CT concepts. The developed tool was evaluated and compared to the search services provided by two SNOMED-CT browsers. Our tool automatically mapped clinical terms from a Spanish glossary of procedures in pathology with 88.0% precision and 51.4% recall, providing a substantial improvement in recall (28% and 60%) over other publicly accessible mapping services. The improvements reached by the mapping tool are encouraging. Our results demonstrate the feasibility of accurately mapping clinical glossaries to SNOMED-CT concepts by means of a combination of structural, query-expansion and name-based techniques. We have shown that SNOMED-CT is a great source of knowledge to infer synonyms for the medical domain. Results show that an automated query expansion system partially overcomes the challenge of vocabulary mismatch.

  14. COSMO-SkyMed and GIS applications

    NASA Astrophysics Data System (ADS)

    Milillo, Pietro; Sole, Aurelia; Serio, Carmine

    2013-04-01

    Geographic Information Systems (GIS) and remote sensing have become key technology tools for the collection, storage and analysis of spatially referenced data. Industries that utilise these spatial technologies include agriculture, forestry, mining and market research, as well as environmental analysis. Synthetic Aperture Radar (SAR) is a coherent active sensor operating in the microwave band which exploits the relative motion between antenna and target, via the Doppler effect, to obtain a finer spatial resolution in the flight direction. SAR has wide applications in remote sensing, such as cartography, surface deformation detection, forest cover mapping, urban planning, disaster monitoring and surveillance. The utilization of satellite remote sensing and GIS technology for these applications has proven to be a powerful and effective tool for environmental monitoring. Remote sensing techniques are often less costly and time-consuming for large geographic areas compared to conventional methods; moreover, GIS technology provides a flexible environment for analyzing and displaying digital data from various sources, as is necessary for classification, change detection and database development. The aim of this work is to illustrate the potential of COSMO-SkyMed data and SAR applications in a GIS environment; in particular, a demonstration of the operational use of COSMO-SkyMed SAR data and GIS in real cases will be provided for DEM validation, river basin estimation, flood mapping and landslide monitoring.

  15. Malleable architecture generator for FPGA computing

    NASA Astrophysics Data System (ADS)

    Gokhale, Maya; Kaba, James; Marks, Aaron; Kim, Jang

    1996-10-01

    The malleable architecture generator (MARGE) is a tool set that translates high-level parallel C to configuration bit streams for field-programmable logic based computing systems. MARGE creates an application-specific instruction set and generates the custom hardware components required to perform exactly those computations specified by the C program. In contrast to traditional fixed-instruction processors, MARGE's dynamic instruction set creation provides for efficient use of hardware resources. MARGE processes intermediate code in which each operation is annotated by the bit lengths of the operands. Each basic block (sequence of straight line code) is mapped into a single custom instruction which contains all the operations and logic inherent in the block. A synthesis phase maps the operations comprising the instructions into register transfer level structural components and control logic which have been optimized to exploit functional parallelism and function unit reuse. As a final stage, commercial technology-specific tools are used to generate configuration bit streams for the desired target hardware. Technology- specific pre-placed, pre-routed macro blocks are utilized to implement as much of the hardware as possible. MARGE currently supports the Xilinx-based Splash-2 reconfigurable accelerator and National Semiconductor's CLAy-based parallel accelerator, MAPA. The MARGE approach has been demonstrated on systolic applications such as DNA sequence comparison.

  16. Depth assisted compression of full parallax light fields

    NASA Astrophysics Data System (ADS)

    Graziosi, Danillo B.; Alpaslan, Zahir Y.; El-Ghoroury, Hussein S.

    2015-03-01

    Full parallax light field displays require high pixel density and huge amounts of data. Compression is a necessary tool used by 3D display systems to cope with the high bandwidth requirements. One of the formats adopted by MPEG for 3D video coding standards is the use of multiple views with associated depth maps. Depth maps enable the coding of a reduced number of views, and are used by compression and synthesis software to reconstruct the light field. However, most of the developed coding and synthesis tools target linearly arranged cameras with small baselines. Here we propose to use the 3D video coding format for full parallax light field coding. We introduce a view selection method inspired by plenoptic sampling followed by transform-based view coding and view synthesis prediction to code residual views. We determine the minimal requirements for view sub-sampling and present the rate-distortion performance of our proposal. We also compare our method with established video compression techniques, such as H.264/AVC, H.264/MVC, and the new 3D video coding algorithm, 3DV-ATM. Our results show that our method not only has an improved rate-distortion performance, it also preserves the structure of the perceived light fields better.

  17. Mars Exploration Rovers Landing Dispersion Analysis

    NASA Technical Reports Server (NTRS)

    Knocke, Philip C.; Wawrzyniak, Geoffrey G.; Kennedy, Brian M.; Desai, Prasun N.; Parker, Timothy J.; Golombek, Matthew P.; Duxbury, Thomas C.; Kass, David M.

    2004-01-01

    Landing dispersion estimates for the Mars Exploration Rover missions were key elements in the site targeting process and in the evaluation of landing risk. This paper addresses the process and results of the landing dispersion analyses performed for both Spirit and Opportunity. The several contributors to landing dispersions (navigation and atmospheric uncertainties, spacecraft modeling, winds, and margins) are discussed, as are the analysis tools used. JPL's MarsLS program, a MATLAB-based landing dispersion visualization and statistical analysis tool, was used to calculate the probability of landing within hazardous areas. By convolving this with the probability of landing within flight system limits (in-spec landing) for each hazard area, a single overall measure of landing risk was calculated for each landing ellipse. In-spec probability contours were also generated, allowing a more synoptic view of site risks, illustrating the sensitivity to changes in landing location, and quantifying the possible consequences of anomalies such as incomplete maneuvers. Data and products required to support these analyses are described, including the landing footprints calculated by NASA Langley's POST program and JPL's AEPL program, cartographically registered base maps and hazard maps, and flight system estimates of in-spec landing probabilities for each hazard terrain type. Various factors encountered during operations, including evolving navigation estimates and changing atmospheric models, are discussed and final landing points are compared with approach estimates.
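
    The combination of dispersion and hazard information described above reduces to a weighted sum over terrain types; the sketch below illustrates the arithmetic with hypothetical numbers, not MER products or the MarsLS code.

    ```python
    # Illustrative arithmetic only (hypothetical numbers): combine the probability
    # of landing on each terrain type inside the ellipse with the flight-system
    # estimate of an in-spec landing on that terrain to get a single risk figure.
    terrain_fraction = {          # share of the landing dispersion on each terrain
        "smooth plains": 0.70,
        "cratered":      0.20,
        "rocky/hazard":  0.10,
    }
    p_in_spec = {                 # probability of an in-spec landing on that terrain
        "smooth plains": 0.98,
        "cratered":      0.90,
        "rocky/hazard":  0.60,
    }

    p_success = sum(terrain_fraction[t] * p_in_spec[t] for t in terrain_fraction)
    print(f"overall in-spec landing probability: {p_success:.3f}")
    print(f"overall landing risk:                {1 - p_success:.3f}")
    ```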

  18. Validation of the MedUseQ: A Self-Administered Screener for Older Adults to Assess Medication Use Problems.

    PubMed

    Berman, Rebecca L; Iris, Madelyn; Conrad, Kendon J; Robinson, Carrie

    2018-01-01

    Older adults taking multiple prescription and nonprescription drugs are at risk for medication use problems, yet there are few brief, self-administered screening tools designed specifically for them. The study objective was to develop and validate a patient-centered screener for community-dwelling older adults. In phase 1, a convenience sample of 57 stakeholders (older adults, pharmacists, nurses, and physicians) participated in concept mapping, using Concept System® Global MAX™, to identify items for a questionnaire. In phase 2, a 40-item questionnaire was tested with a convenience sample of 377 adults and a 24-item version was tested with 306 older adults, aged 55 and older, using Rasch methodology. In phase 3, stakeholder focus groups provided feedback on the format of questionnaire materials and recommended strategies for addressing problems. The concept map contained 72 statements organized into 6 conceptual clusters or domains. The 24-item screener was unidimensional. Cronbach's alpha was .87, person reliability was acceptable (.74), and item reliability was high (.96). The MedUseQ is a validated, patient-centered tool targeting older adults that can be used to assess a wide range of medication use problems in clinical and community settings and to identify areas for education, intervention, or further assessment.

  19. Interactive Web Interface to the Global Strain Rate Map Project

    NASA Astrophysics Data System (ADS)

    Meertens, C. M.; Estey, L.; Kreemer, C.; Holt, W.

    2004-05-01

    An interactive web interface allows users to explore the results of a global strain rate and velocity model and to compare them to other geophysical observations. The most recent model, an updated version of Kreemer et al., 2003, has 25 independent rigid plate-like regions separated by deformable boundaries covered by about 25,000 grid areas. A least-squares fit was made to 4900 geodetic velocities from 79 different geodetic studies. In addition, Quaternary fault slip rate data are used to infer geologic strain rate estimates (currently only for central Asia). Information about the style and direction of expected strain rate is inferred from the principal axes of the seismic strain rate field. The current model, as well as source data, references and an interactive map tool, are located at the International Lithosphere Program (ILP) "A Global Strain Rate Map (ILP II-8)" project website: http://www-world-strain-map.org. The purpose of the ILP GSRM project is to provide new information from this and other investigations that will contribute to a better understanding of continental dynamics and to the quantification of seismic hazards. A unique aspect of the GSRM interactive Java map tool is that the user can zoom in and make custom views of the model grid and results for any area of the globe, selecting strain rate and style contour plots and principal axes, observed and model velocity fields in specified frames of reference, and geologic fault data. The results can be displayed with other data sets such as Harvard CMT earthquake focal mechanisms, stress directions from the ILP World Stress Map Project, and topography. With the GSRM Java map tool, the user views custom maps generated by a Generic Mapping Tool (GMT) server. These interactive capabilities greatly extend what is possible to present in a published paper. A JavaScript version, using pre-constructed maps, as well as a related information site have also been created for broader education and outreach access. The GSRM map tool will be demonstrated and the latest GSRM 1.1 model results, containing important new data for Asia, Iran, the western Pacific, and Southern California, will be presented.

  20. PepMapper: a collaborative web tool for mapping epitopes from affinity-selected peptides.

    PubMed

    Chen, Wenhan; Guo, William W; Huang, Yanxin; Ma, Zhiqiang

    2012-01-01

    Epitope mapping from affinity-selected peptides has become popular in epitope prediction, and correspondingly many Web-based tools have been developed in recent years. However, the performance of these tools varies in different circumstances. To address this problem, we employed an ensemble approach to incorporate two popular Web tools, MimoPro and Pep-3D-Search, together for taking advantages offered by both methods so as to give users more options for their specific purposes of epitope-peptide mapping. The combined operation of Union finds as many associated peptides as possible from both methods, which increases sensitivity in finding potential epitopic regions on a given antigen surface. The combined operation of Intersection achieves to some extent the mutual verification by the two methods and hence increases the likelihood of locating the genuine epitopic region on a given antigen in relation to the interacting peptides. The Consistency between Intersection and Union is an indirect sufficient condition to assess the likelihood of successful peptide-epitope mapping. On average from 27 tests, the combined operations of PepMapper outperformed either MimoPro or Pep-3D-Search alone. Therefore, PepMapper is another multipurpose mapping tool for epitope prediction from affinity-selected peptides. The Web server can be freely accessed at: http://informatics.nenu.edu.cn/PepMapper/
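
    The ensemble logic of Union, Intersection and Consistency can be expressed directly with set operations; the residue numbers below are invented for illustration and do not come from either underlying tool.

    ```python
    # Small sketch of the ensemble logic described above (invented residue numbers;
    # the real tools return predicted epitopic residues on the antigen).
    mimopro_hits = {12, 13, 14, 15, 40, 41, 42, 77}
    pep3d_hits = {13, 14, 15, 16, 41, 42, 90}

    union = mimopro_hits | pep3d_hits          # maximise sensitivity
    intersection = mimopro_hits & pep3d_hits   # mutual verification
    consistency = len(intersection) / len(union) if union else 0.0

    print("Union        :", sorted(union))
    print("Intersection :", sorted(intersection))
    print(f"Consistency  : {consistency:.2f}")
    ```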

  1. Learning to merge: a new tool for interactive mapping

    NASA Astrophysics Data System (ADS)

    Porter, Reid B.; Lundquist, Sheng; Ruggiero, Christy

    2013-05-01

    The task of turning raw imagery into semantically meaningful maps and overlays is a key area of remote sensing activity. Image analysts, in applications ranging from environmental monitoring to intelligence, use imagery to generate and update maps of terrain, vegetation, road networks, buildings and other relevant features. Often these tasks can be cast as a pixel labeling problem, and several interactive pixel labeling tools have been developed. These tools exploit training data, which is generated by analysts using simple and intuitive paint-program annotation tools, in order to tailor the labeling algorithm for the particular dataset and task. In other cases, the task is best cast as a pixel segmentation problem. Interactive pixel segmentation tools have also been developed, but these tools typically do not learn from training data like the pixel labeling tools do. In this paper we investigate tools for interactive pixel segmentation that also learn from user input. The input has the form of segment merging (or grouping). Merging examples are 1) easily obtained from analysts using vector annotation tools, and 2) more challenging to exploit than traditional labels. We outline the key issues in developing these interactive merging tools, and describe their application to remote sensing.

  2. GDR (Genome Database for Rosaceae): integrated web-database for Rosaceae genomics and genetics data

    PubMed Central

    Jung, Sook; Staton, Margaret; Lee, Taein; Blenda, Anna; Svancara, Randall; Abbott, Albert; Main, Dorrie

    2008-01-01

    The Genome Database for Rosaceae (GDR) is a central repository of curated and integrated genetics and genomics data of Rosaceae, an economically important family which includes apple, cherry, peach, pear, raspberry, rose and strawberry. GDR contains annotated databases of all publicly available Rosaceae ESTs, the genetically anchored peach physical map, Rosaceae genetic maps and comprehensively annotated markers and traits. The ESTs are assembled to produce unigene sets for each genus and for the entire Rosaceae. Other annotations include putative function, microsatellites, open reading frames, single nucleotide polymorphisms, gene ontology terms and anchored map position where applicable. Most of the published Rosaceae genetic maps can be viewed and compared through CMap, the comparative map viewer. The peach physical map can be viewed using WebFPC/WebChrom, and also through our integrated GDR map viewer, which serves as a portal to the combined genetic, transcriptome and physical mapping information. ESTs, BACs, markers and traits can be queried by various categories, and the search result pages are linked to the mapping visualization tools. GDR also provides online analysis tools such as a batch BLAST/FASTA server for the GDR datasets, a sequence assembly server and microsatellite and primer detection tools. GDR is available at http://www.rosaceae.org. PMID:17932055

  3. From direct-space discrepancy functions to crystallographic least squares.

    PubMed

    Giacovazzo, Carmelo

    2015-01-01

    Crystallographic least squares are a fundamental tool for crystal structure analysis. In this paper their properties are derived from functions estimating the degree of similarity between two electron-density maps. The new approach leads also to modifications of the standard least-squares procedures, potentially able to improve their efficiency. The role of the scaling factor between observed and model amplitudes is analysed: the concept of unlocated model is discussed and its scattering contribution is combined with that arising from the located model. Also, the possible use of an ancillary parameter, to be associated with the classical weight related to the variance of the observed amplitudes, is studied. The crystallographic discrepancy factors, basic tools often combined with least-squares procedures in phasing approaches, are analysed. The mathematical approach here described includes, as a special case, the so-called vector refinement, used when accurate estimates of the target phases are available.
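
    For context, the textbook form of the weighted crystallographic least-squares residual, including the scale factor discussed above (this is the standard formulation, not quoted from the paper):

    ```latex
    % Refine model parameters and the scale factor k by minimizing the weighted
    % discrepancy between observed and calculated structure-factor amplitudes.
    \begin{equation}
      Q = \sum_{\mathbf{h}} w_{\mathbf{h}}
          \left( |F_{\mathrm{obs}}(\mathbf{h})| - k\,|F_{\mathrm{calc}}(\mathbf{h})| \right)^{2},
      \qquad
      w_{\mathbf{h}} \approx \frac{1}{\sigma^{2}\!\left(|F_{\mathrm{obs}}(\mathbf{h})|\right)}
    \end{equation}
    ```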

  4. ClusCo: clustering and comparison of protein models.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej

    2013-02-22

    The development, optimization and validation of protein modeling methods require efficient tools for structural comparison. Frequently, a large number of models need to be compared with the target native structure. The main reason for the development of ClusCo was to create a high-throughput tool for all-versus-all comparison, because calculating the similarity matrix is one of the bottlenecks in the protein modeling pipeline. ClusCo is fast and easy-to-use software for high-throughput comparison of protein models with different similarity measures (cRMSD, dRMSD, GDT_TS, TM-Score, MaxSub, Contact Map Overlap) and for clustering of the comparison results with standard methods: K-means clustering or hierarchical agglomerative clustering. The application was highly optimized and written in C/C++, including code for parallel execution on CPU and GPU, which resulted in a significant speedup over similar clustering and scoring computation programs.
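
    An illustrative sketch of the workflow (not ClusCo itself, which is optimized C/C++): compute an all-versus-all distance matrix over models, then cluster it hierarchically. The coordinates are synthetic and the distance function is a simple stand-in for measures such as cRMSD or TM-score.

    ```python
    # All-versus-all comparison of protein models followed by clustering.
    import numpy as np
    from scipy.spatial.distance import squareform
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    models = [rng.random((50, 3)) + 0.05 * i for i in range(20)]   # fake CA coordinates

    def coord_dist(a, b):
        """Plain coordinate RMSD without superposition -- a stand-in metric."""
        return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

    n = len(models)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):                 # the all-versus-all bottleneck
            dist[i, j] = dist[j, i] = coord_dist(models[i], models[j])

    Z = linkage(squareform(dist), method="average")   # hierarchical agglomerative
    labels = fcluster(Z, t=3, criterion="maxclust")
    print(labels)
    ```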

  5. Heat and Health in a Changing Climate: Building a Decision Support Tool for California Public Health Officials

    NASA Astrophysics Data System (ADS)

    Steinberg, N.

    2017-12-01

    There is considerable interest in overlaying climate projections with social vulnerability maps as a mechanism for targeting community adaptation efforts. Yet the identification of relevant factors for adaptation- and resilience-based decisions remains a challenge. Our findings show that successful adaptation interventions are more likely when factors are grouped and spatially represented. By designing a decision-support tool that is focused on informing long-term planning to mitigate the public health impacts of extreme heat, communities can more easily integrate climate, land use, and population characteristics into local planning processes. The ability to compare risks and potential health impacts across census tracts may also position local practitioners to leverage scarce resources. This presentation will discuss the information gaps identified by planners and public health practitioners throughout California and illustrate the spatial variations of key health risk factors.

  6. Automating the implementation of an equilibrium profile model for glacier reconstruction in a GIS environment

    NASA Astrophysics Data System (ADS)

    Frew, Craig R.; Pellitero, Ramón; Rea, Brice R.; Spagnolo, Matteo; Bakke, Jostein; Hughes, Philip D.; Ivy-Ochs, Susan; Lukas, Sven; Renssen, Hans; Ribolini, Adriano

    2014-05-01

    Reconstruction of glacier equilibrium line altitudes (ELAs) associated with advance stages of former ice masses is widely used as a tool for palaeoclimatic reconstruction. This requires an accurate reconstruction of palaeo-glacier surface hypsometry, based on mapping of available ice-marginal landform evidence. Classically, the approach used to define ice-surface elevations from such evidence follows the 'cartographic method', whereby contours are estimated based on an 'understanding' of the typical surface form of contemporary ice masses. This method introduces inherent uncertainties in the palaeoclimatic interpretation of reconstructed ELAs, especially where the upper limits of glaciation are less well constrained and/or the age of such features in relation to terminal moraine sequences is unknown. An alternative approach is to use equilibrium profile models to define ice surface elevations. Such models are tuned, generally using basal shear stress, in order to generate an ice surface that reaches 'target elevations' defined by geomorphology. In areas where there are no geomorphological constraints on the former ice surface, the reconstruction is undertaken using glaciologically representative values for basal shear stress. Numerical reconstructions have been shown to produce glaciologically "realistic" ice surface geometries, allowing for more objective and robust comparative studies at local to regional scales. User-friendly tools for the calculation of equilibrium profiles are presently available in the literature. Despite this, their use is not yet widespread, perhaps owing to the difficult and time-consuming nature of acquiring the necessary inputs from contour maps or digital elevation models. Here we describe a tool for automatically reconstructing palaeo-glacier surface geometry using an equilibrium profile equation implemented in ArcGIS. The only necessary inputs for this tool are 1) a suitable digital elevation model and 2) mapped outlines of the former glacier terminus position (usually a frontal moraine system) and any relevant geomorphological constraints on ice surface elevation (e.g. lateral moraines, trimlines etc.). This provides a standardised method for glacier reconstruction that can be applied rapidly and systematically to large geomorphological datasets.
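
    A minimal sketch of the textbook equilibrium ("perfect plasticity") profile calculation, stepping up-glacier from the mapped terminus along a flowline. This is the generic formulation, not the specific ArcGIS tool described in the abstract; the bed profile, seed thickness and shear-stress value are illustrative.

    ```python
    # Step an ice surface up-glacier so that tau = rho * g * H * dS/dx is satisfied.
    import numpy as np

    RHO = 900.0       # ice density, kg m-3
    G = 9.81          # gravitational acceleration, m s-2
    TAU = 1.0e5       # assumed basal shear stress, Pa (typical tuning range ~50-150 kPa)

    dx = 100.0                                       # step along the flowline, m
    bed = 1000.0 + 0.05 * dx * np.arange(200)        # synthetic bed elevations, m

    surface = np.empty_like(bed)
    surface[0] = bed[0] + 10.0                       # small seed thickness at the terminus
    for i in range(len(bed) - 1):
        thickness = max(surface[i] - bed[i], 1.0)    # guard against zero thickness
        # surface slope needed to supply the assumed basal shear stress
        surface[i + 1] = surface[i] + dx * TAU / (RHO * G * thickness)
        surface[i + 1] = max(surface[i + 1], bed[i + 1])  # surface cannot dip below the bed

    thickness_profile = surface - bed
    print(round(float(thickness_profile.max()), 1), "m max reconstructed ice thickness")
    ```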

  7. Noninvasive, targeted gene therapy for acute spinal cord injury using LIFU-mediated BDNF-loaded cationic nanobubble destruction.

    PubMed

    Song, Zhaojun; Ye, Yongjie; Zhang, Zhi; Shen, Jieliang; Hu, Zhenming; Wang, Zhigang; Zheng, Jiazhuang

    2018-02-12

    Various gene delivery systems have been widely studied for the treatment of acute spinal cord injury (SCI). In the present study, a novel type of brain-derived neurotrophic factor (BDNF)-loaded cationic nanobubbles (CNBs) conjugated with a MAP-2 antibody (mAbMAP-2/BDNF/CNBs) was prepared to provide low-intensity focused ultrasound (LIFU)-targeted gene therapy. In vitro, ultrasound-targeted transfection leading to BDNF overexpression in neurons and efficient inhibition of neuronal apoptosis were demonstrated, and the mAbMAP-2/BDNF/CNBs were shown to specifically target neurons. Furthermore, in an acute SCI rat model, LIFU-mediated mAbMAP-2/BDNF/CNBs transfection significantly increased BDNF expression, attenuated histological injury, decreased neuronal loss, inhibited neuronal apoptosis in injured spinal cords, and increased BBB scores in SCI rats. LIFU-mediated mAbMAP-2/BDNF/CNBs destruction significantly increased the transfection efficiency of the BDNF gene both in vitro and in vivo and had a significant neuroprotective effect on the injured spinal cord. Therefore, the combination of LIFU irradiation and gene therapy through mAbMAP-2/BDNF/CNBs can be considered a novel non-invasive and targeted treatment for gene therapy of SCI. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. IASMHYN: A web tool for mapping Soil Water Budget and agro-hydrological assessment trough the integration of monitoring and remote sensing data

    NASA Astrophysics Data System (ADS)

    Bagli, Stefano; Pistocchi, Alberto; Mazzoli, Paolo; Borga, Marco; Bertoldi, Giacomo; Brenner, Johannes; Luzzi, Valerio

    2016-04-01

    Climate change, increasing pressure on farmland to satisfy growing demand, and the need to ensure environmental quality make greater water-management capacity essential for agriculture to remain competitive. In this context, web-based tools for forecasting and monitoring the hydrological conditions of topsoil can be an effective means to save water, maximize crop protection and reduce soil loss and the leaching of pollutants. Such tools need to be targeted to their users and accessible in a simple way to allow adequate take-up in practice. IASMHYN ("Improved management of Agricultural Systems by Monitoring and Hydrological evaluation") is a web mapping service designed to provide and update, on a daily basis, the main water-budget variables for farmland management. A beta version of the tool is available at www.gecosistema.com/iasmhyn. IASMHYN is an instrument for "second level monitoring" that takes accurate hydro-meteorological information from ground stations and remote sensing sources and turns it into practically usable decision variables for precision farming, making use of geostatistical analysis and hydrological models. The main routines embedded in IASMHYN exclusively use open-source libraries (R packages and Python) to perform the following operations: (1) automatic acquisition of observed data, both from ground stations and remote sensing, concerning precipitation (RADAR) and temperature (MODIS-LST), available from various sources; (2) interpolation of these acquisitions through regression kriging in order to spatially map the meteorological data; (3) running of hydrological models to obtain spatial information on hydrological soil variables of immediate interest in agriculture. The real-time results are available through a web interface and provide the user with spatial maps and time series of the following variables, supporting decisions on irrigation, soil protection from erosion, and pollution risk to groundwater and streams: daily precipitation and its characteristics (rain, snow or hail, rain erosiveness); maximum, minimum and average daily temperature; Soil Water Content (SWC); infiltration into the deep layers of the soil and surface runoff; potential loss of soil due to erosion; and residence time of a possible chemical (pesticides, fertilizers) applied to the soil. Thematic real-time maps are produced to give the user decision support on irrigation, soil management and pesticide/fertilizer application. The ongoing project will also lead to validation and improvement of estimates of hydrological variables from satellite imagery and radar data. The tool has been cross-validated with estimates of evapotranspiration and soil water content at agricultural sites in South Tyrol (Italy) in the framework of the MONALISA project (http://www.monalisa-project.eu). A comparison with physically based models, satellite imagery and radar data will allow further generalization of the product. The ultimate goal of the tool is to make available on the market a service that is generally applicable in Europe, using commonly available data, to provide single farmers and organizations with effective and up-to-date information for planning and programming their activities.
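
    A hedged sketch of the kind of daily soil-water-budget step such a service computes; the actual IASMHYN routines are not given in the abstract, and the simple bucket formulation and parameter values below are illustrative only.

    ```python
    # Advance soil water content (SWC, mm) by one day with a simple bucket model.
    def daily_water_budget(swc, precip_mm, et_mm, field_capacity_mm=120.0,
                           runoff_coeff=0.15):
        runoff = runoff_coeff * precip_mm             # quick surface runoff
        infiltration = precip_mm - runoff
        swc = swc + infiltration - et_mm              # add infiltration, remove ET
        drainage = max(swc - field_capacity_mm, 0.0)  # excess percolates to depth
        swc = min(max(swc - drainage, 0.0), field_capacity_mm)
        return swc, runoff, drainage

    swc = 80.0  # initial soil water content, mm
    for p, et in [(12.0, 3.1), (0.0, 4.0), (35.0, 2.5)]:   # toy daily forcing
        swc, runoff, drainage = daily_water_budget(swc, p, et)
        print(round(swc, 1), round(runoff, 1), round(drainage, 1))
    ```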

  9. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh matrix coupled with multi-generation planning enabled the development of a strong rationale to activate the project and set the vision and goals. Improved risk management and a product map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features, as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. Within those tools, only certain ones required minimum resources (i.e., business case, multi-generation plan, project value proposition, Pugh matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All of the techniques used provided savings exceeding their investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and generated estimated savings. Globally, all companies can benefit from using DEX by smartly selecting and estimating those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  10. A Mathematical Model for Storage and Recall of Images using Targeted Synchronization of Coupled Maps.

    PubMed

    Palaniyandi, P; Rangarajan, Govindan

    2017-08-21

    We propose a mathematical model for storage and recall of images using coupled maps. We start by theoretically investigating targeted synchronization in coupled map systems, wherein only a desired (partial) subset of the maps is made to synchronize. A simple method is introduced to specify coupling coefficients such that targeted synchronization is ensured. The principle of this method is extended to storage/recall of images using coupled Rulkov maps. The process of adjusting coupling coefficients between Rulkov maps (often used to model neurons) for the purpose of storing a desired image mimics the process of adjusting synaptic strengths between neurons to store memories. Our method uses both synchronization and synaptic weight modification, as the human brain is thought to do. The stored image can be recalled by providing an initial random pattern to the dynamical system. The storage and recall of the standard image of Lena is explicitly demonstrated.
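
    An exploratory sketch of targeted synchronization in its simplest form: only a chosen subset of Rulkov maps is coupled to a reference trajectory, so only those units synchronize. The paper derives specific coupling coefficients; the uniform diffusive coupling strength EPS used below is a stand-in, and the parameter values are commonly quoted ones rather than the paper's.

    ```python
    import numpy as np

    ALPHA, MU, SIGMA = 4.1, 0.001, -1.2   # commonly used Rulkov parameters
    EPS = 0.9                             # ad hoc coupling strength

    def rulkov_step(x, y):
        """One iteration of the Rulkov map (fast variable x, slow variable y)."""
        return ALPHA / (1.0 + x**2) + y, y - MU * (x - SIGMA)

    rng = np.random.default_rng(2)
    n = 10
    x = rng.uniform(-1.0, 1.0, n)
    y = rng.uniform(-3.5, -2.5, n)
    xr, yr = 0.5, -3.0                    # reference ("target") trajectory
    targeted = np.zeros(n, dtype=bool)
    targeted[:5] = True                   # only the first five maps are coupled

    for _ in range(3000):
        xr, yr = rulkov_step(xr, yr)
        xn, yn = rulkov_step(x, y)
        # pull both variables of the targeted maps toward the reference state
        xn = np.where(targeted, xn + EPS * (xr - xn), xn)
        yn = np.where(targeted, yn + EPS * (yr - yn), yn)
        x, y = xn, yn

    print(np.abs(x - xr).round(3))   # targeted entries ~0, the rest stay apart
    ```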

  11. iMindMap as an Innovative Tool in Teaching and Learning Accounting: An Exploratory Study

    ERIC Educational Resources Information Center

    Wan Jusoh, Wan Noor Hazlina; Ahmad, Suraya

    2016-01-01

    Purpose: The purpose of this study is to explore the use of iMindMap software as an interactive tool in the teaching and learning method and also to consider iMindMap as an alternative instrument for achieving the ultimate learning outcome. Design/Methodology/Approach: Out of 268 students of management accounting at the University of…

  12. Knowledge Mapping: A Multipurpose Task Analysis Tool.

    ERIC Educational Resources Information Center

    Esque, Timm J.

    1988-01-01

    Describes knowledge mapping, a tool developed to increase the objectivity and accuracy of task difficulty ratings for job design. Application in a semiconductor manufacturing environment is discussed, including identifying prerequisite knowledge for a given task; establishing training development priorities; defining knowledge levels; identifying…

  13. Mapping auditory nerve firing density using high-level compound action potentials and high-pass noise masking

    PubMed Central

    Earl, Brian R.; Chertoff, Mark E.

    2012-01-01

    Future implementation of regenerative treatments for sensorineural hearing loss may be hindered by the lack of diagnostic tools that specify the target(s) within the cochlea and auditory nerve for delivery of therapeutic agents. Recent research has indicated that the amplitude of high-level compound action potentials (CAPs) is a good predictor of overall auditory nerve survival, but does not pinpoint the location of neural damage. A location-specific estimate of nerve pathology may be possible by using a masking paradigm and high-level CAPs to map auditory nerve firing density throughout the cochlea. This initial study in gerbil utilized a high-pass masking paradigm to determine normative ranges for CAP-derived neural firing density functions using broadband chirp stimuli and low-frequency tonebursts, and to determine if cochlear outer hair cell (OHC) pathology alters the distribution of neural firing in the cochlea. Neural firing distributions for moderate-intensity (60 dB pSPL) chirps were affected by OHC pathology whereas those derived with high-level (90 dB pSPL) chirps were not. These results suggest that CAP-derived neural firing distributions for high-level chirps may provide an estimate of auditory nerve survival that is independent of OHC pathology. PMID:22280596

  14. Mapping multiple components of malaria risk for improved targeting of elimination interventions.

    PubMed

    Cohen, Justin M; Le Menach, Arnaud; Pothin, Emilie; Eisele, Thomas P; Gething, Peter W; Eckhoff, Philip A; Moonen, Bruno; Schapira, Allan; Smith, David L

    2017-11-13

    There is a long history of considering the constituent components of malaria risk and the malaria transmission cycle via the use of mathematical models, yet strategic planning in endemic countries tends not to take full advantage of available disease intelligence to tailor interventions. National malaria programmes typically make operational decisions about where to implement vector control and surveillance activities based upon simple categorizations of annual parasite incidence. With technological advances, an enormous opportunity exists to better target specific malaria interventions to the places where they will have greatest impact by mapping and evaluating metrics related to a variety of risk components, each of which describes a different facet of the transmission cycle. Here, these components and their implications for operational decision-making are reviewed. For each component, related mappable malaria metrics are also described which may be measured and evaluated by malaria programmes seeking to better understand the determinants of malaria risk. Implementing tailored programmes based on knowledge of the heterogeneous distribution of the drivers of malaria transmission rather than only consideration of traditional metrics such as case incidence has the potential to result in substantial improvements in decision-making. As programmes improve their ability to prioritize their available tools to the places where evidence suggests they will be most effective, elimination aspirations may become increasingly feasible.

  15. Physicochemical properties of the modeled structure of astacin metalloprotease moulting enzyme NAS-36 and mapping the druggable allosteric space of Haemonchus contortus, Brugia malayi and Caenorhabditis elegans via molecular dynamics simulation.

    PubMed

    Sharma, Om Prakash; Agrawal, Sonali; Kumar, M Suresh

    2013-12-01

    Nematodes represent the second largest phylum in the animal kingdom and, with an estimated 500,000 species, are among the most abundant organisms on the planet. They cause chronic, debilitating infections worldwide, such as ascariasis, trichuriasis, hookworm, enterobiasis, strongyloidiasis, filariasis and trichinosis, among others. Molecular modeling tools can play an important role in the identification and structural investigation of molecular targets that can act as vital candidates against filariasis. In this study, sequence analysis of NAS-36 from H. contortus (Haemonchus contortus), B. malayi (Brugia malayi) and C. elegans (Caenorhabditis elegans) has been performed in order to identify the conserved residues. A tertiary structure was developed for insight into the molecular structure of the enzyme. Molecular dynamics simulation (MDS) studies have been carried out to analyze the stability and the physical properties of the proposed enzyme models in H. contortus, B. malayi and C. elegans. Moreover, the drug binding sites have been mapped for inhibiting the function of the NAS-36 enzyme. The molecular identity of this protease could eventually demonstrate how ex-sheathment is regulated, as well as provide a potential target of anthelmintics for the prevention of nematode infections.

  16. Bamgineer: Introduction of simulated allele-specific copy number variants into exome and targeted sequence data sets.

    PubMed

    Samadian, Soroush; Bruce, Jeff P; Pugh, Trevor J

    2018-03-01

    Somatic copy number variations (CNVs) play a crucial role in the development of many human cancers. The broad availability of next-generation sequencing data has enabled the development of algorithms to computationally infer CNV profiles from a variety of data types including exome and targeted sequence data, currently the most prevalent types of cancer genomics data. However, systematic evaluation and comparison of these tools remains challenging due to a lack of ground truth reference sets. To address this need, we have developed Bamgineer, a tool written in Python to introduce user-defined haplotype-phased allele-specific copy number events into an existing Binary Alignment Mapping (BAM) file, with a focus on targeted and exome sequencing experiments. As input, this tool requires a read alignment file (BAM format), lists of non-overlapping genome coordinates for introduction of gains and losses (bed file), and an optional file defining known haplotypes (vcf format). To improve runtime performance, Bamgineer introduces the desired CNVs in parallel using queuing and parallel processing on a local machine or on a high-performance computing cluster. As proof of principle, we applied Bamgineer to a single high-coverage (mean: 220X) exome sequence file from a blood sample to simulate copy number profiles of 3 exemplar tumors from each of 10 tumor types at 5 tumor cellularity levels (20-100%, 150 BAM files in total). To demonstrate feasibility beyond exome data, we introduced read alignments into a targeted 5-gene cell-free DNA sequencing library to simulate EGFR amplifications at frequencies consistent with circulating tumor DNA (10, 1, 0.1 and 0.01%) while retaining the multimodal insert size distribution of the original data. We expect Bamgineer to be of use for the development and systematic benchmarking of CNV calling algorithms by users with locally generated data for a variety of applications. The source code is freely available at http://github.com/pughlab/bamgineer.
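
    A toy illustration of the underlying idea of spiking a BAM file with extra reads in a target region to mimic a copy-number gain. This is NOT Bamgineer's algorithm (Bamgineer introduces haplotype-phased, allele-specific events and handles read pairing properly); file names and coordinates below are hypothetical placeholders.

    ```python
    # Naively boost read depth in one interval of an existing, indexed BAM.
    import random
    import pysam

    REGION = ("chr7", 55_000_000, 55_200_000)   # placeholder target interval
    GAIN_FRACTION = 0.5                          # re-emit ~50% of reads => ~1.5x depth

    bam_in = pysam.AlignmentFile("input.bam", "rb")             # must have a .bai index
    bam_out = pysam.AlignmentFile("extra_reads.bam", "wb", template=bam_in)

    random.seed(0)
    for read in bam_in.fetch(*REGION):
        if random.random() < GAIN_FRACTION:
            read.query_name = read.query_name + "_cnvsim"   # avoid duplicate read names
            bam_out.write(read)

    bam_out.close()
    bam_in.close()
    # One would then merge extra_reads.bam with input.bam (e.g. with samtools merge),
    # re-sort and re-index to obtain the simulated-gain alignment file.
    ```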

  17. TU-AB-202-03: Prediction of PET Transfer Uncertainty by DIR Error Estimating Software, AUTODIRECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Phillips, J

    2016-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool, but DIR errors can adversely affect its clinical applications. To estimate voxel-specific DIR uncertainty, a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), has been developed and validated. This work tests the ability of this software to predict uncertainty for the transfer of standard uptake values (SUV) from positron-emission tomography (PET) with DIR. Methods: Virtual phantoms are used for this study. Each phantom has a planning computed tomography (CT) image and a diagnostic PET-CT image set. A deformation was digitally applied to the diagnostic CT to create the planning CT image and establish a known deformation between the images. One lung and three rectum patient datasets were employed to create the virtual phantoms. Both of these sites have difficult deformation scenarios associated with them, which can affect DIR accuracy (lung tissue sliding and changes in rectal filling). The virtual phantoms were created to simulate these scenarios by introducing discontinuities in the deformation field at the lung and rectum borders. The DIR algorithm from the Plastimatch software was applied to these phantoms. The SUV mapping errors from the DIR were then compared to those predicted by AUTODIRECT. Results: The SUV error distributions closely followed the AUTODIRECT-predicted error distribution for the 4 test cases. The minimum and maximum PET SUVs were produced from AUTODIRECT at the 95% confidence interval before applying gradient-based SUV segmentation for each of these volumes. Notably, 93.5% of the target volume warped by the true deformation was included within the AUTODIRECT-predicted maximum SUV volume after the segmentation, while 78.9% of the target volume was within the target volume warped by Plastimatch. Conclusion: The AUTODIRECT framework is able to predict PET transfer uncertainty caused by DIR, which enables an understanding of the associated target volume uncertainty.

  18. Study of Tools for Network Discovery and Network Mapping

    DTIC Science & Technology

    2003-11-01

    ... connected to the switch. iv. Accessibility of historical data and event data: in general, network discovery tools keep a history of the collected ... has the following software dependencies: Java Virtual Machine, Perl modules, RRD Tool, Tomcat, PostgreSQL. Strengths and ... systems: provide a simple view of the current network status; generate alarms on status change; generate a history of status changes (visual map).

  19. Artificial Intelligence-Based Student Learning Evaluation: A Concept Map-Based Approach for Analyzing a Student's Understanding of a Topic

    ERIC Educational Resources Information Center

    Jain, G. Panka; Gurupur, Varadraj P.; Schroeder, Jennifer L.; Faulkenberry, Eileen D.

    2014-01-01

    In this paper, we describe a tool coined as artificial intelligence-based student learning evaluation tool (AISLE). The main purpose of this tool is to improve the use of artificial intelligence techniques in evaluating a student's understanding of a particular topic of study using concept maps. Here, we calculate the probability distribution of…

  20. Regional Geological Mapping in the Graham Land of Antarctic Peninsula Using LANDSAT-8 Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Pour, A. B.; Hashim, M.; Park, Y.

    2017-10-01

    Geological investigations in Antarctica confront many difficulties due to its remoteness and extreme environmental conditions. In this study, the applications of Landsat-8 data were investigated to extract geological information for lithological and alteration mineral mapping of poorly exposed lithologies in inaccessible domains such as Antarctica. North-eastern Graham Land, Antarctic Peninsula (AP), was selected for this study to develop a satellite-based remote sensing mapping technique. The Continuum Removal (CR) spectral mapping tool and Independent Component Analysis (ICA) were applied to Landsat-8 spectral bands to map poorly exposed lithologies at regional scale. Pixels exhibiting the distinctive absorption features of alteration mineral assemblages associated with poorly exposed lithological units were detected by applying the CR mapping tool to the VNIR and SWIR bands of Landsat-8. Pixels related to Si-O bond emission minima features were identified by applying the CR mapping tool to the TIR bands in poorly mapped and unmapped zones of north-eastern Graham Land at regional scale. Anomaly pixels in the ICA image maps related to spectral features of Al-O-H, Fe, Mg-O-H and CO3 groups, and well-constrained lithological attributions from felsic to mafic rocks, were detected using the VNIR, SWIR and TIR datasets of Landsat-8. The approach used in this study performed very well for lithological and alteration mineral mapping with little available geological data or without prior information on the study region.
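
    A hedged sketch of the ICA step described above: apply Independent Component Analysis to a stack of co-registered multispectral bands so that anomaly pixels related to particular absorption or emission features separate into individual components. The array below is synthetic; real input would be Landsat-8 VNIR/SWIR/TIR bands read with a raster library, and the anomaly threshold is illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    rows, cols, n_bands = 100, 100, 7
    cube = rng.random((rows, cols, n_bands)).astype(np.float32)   # fake band stack

    X = cube.reshape(-1, n_bands)                 # pixels x bands
    ica = FastICA(n_components=4, random_state=0)
    components = ica.fit_transform(X)             # pixels x independent components
    component_maps = components.reshape(rows, cols, -1)

    # Simple anomaly rule: flag pixels whose score on a chosen component exceeds
    # a few standard deviations.
    score = component_maps[..., 0]
    anomaly_mask = np.abs(score - score.mean()) > 3 * score.std()
    print(component_maps.shape, int(anomaly_mask.sum()))
    ```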

  1. WE-AB-BRA-05: Fully Automatic Segmentation of Male Pelvic Organs On CT Without Manual Intervention

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Y; Lian, J; Chen, R

    Purpose: We aim to develop a fully automatic tool for accurate contouring of major male pelvic organs in CT images for radiotherapy without any manual initialization, yet still achieving superior performance over existing tools. Methods: A learning-based 3D deformable shape model was developed for automatic contouring. Specifically, we utilized a recent machine learning method, random forest, to jointly learn both an image regressor and a classifier for each organ. In particular, the image regressor is trained to predict the 3D displacement from each vertex of the 3D shape model towards the organ boundary based on the local image appearance around the location of this vertex. The predicted 3D displacements are then used to drive the 3D shape model towards the target organ. Once the shape model is deformed close to the target organ, it is further refined by an organ likelihood map estimated by the learned classifier. As the organ likelihood map provides a good guideline for the organ boundary, a precise contouring result can be achieved by deforming the 3D shape model locally to fit boundaries in the organ likelihood map. Results: We applied our method to 29 previously-treated prostate cancer patients, each with one planning CT scan. Compared with manually delineated pelvic organs, our method obtains overlap ratios of 85.2%±3.74% for the prostate, 94.9%±1.62% for the bladder, and 84.7%±1.97% for the rectum, respectively. Conclusion: This work demonstrated the feasibility of a novel machine-learning based approach for accurate and automatic contouring of major male pelvic organs. It shows the potential to replace the time-consuming and inconsistent manual contouring in the clinic. Also, compared with existing works, our method is more accurate and also efficient, since it does not require any manual intervention, such as manual landmark placement. Moreover, our method obtained contouring results very similar to those of clinical experts. This project is partially supported by a grant from NCI (1R01CA140413).

  2. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.

  3. Nutritional metabolomics: Progress in addressing complexity in diet and health

    PubMed Central

    Jones, Dean P.; Park, Youngja; Ziegler, Thomas R.

    2013-01-01

    Nutritional metabolomics is rapidly maturing to use small molecule chemical profiling to support integration of diet and nutrition in complex biosystems research. These developments are critical to facilitate transition of nutritional sciences from population-based to individual-based criteria for nutritional research, assessment and management. This review addresses progress in making these approaches manageable for nutrition research. Important concept developments concerning the exposome, predictive health and complex pathobiology, serve to emphasize the central role of diet and nutrition in integrated biosystems models of health and disease. Improved analytic tools and databases for targeted and non-targeted metabolic profiling, along with bioinformatics, pathway mapping and computational modeling, are now used for nutrition research on diet, metabolism, microbiome and health associations. These new developments enable metabolome-wide association studies (MWAS) and provide a foundation for nutritional metabolomics, along with genomics, epigenomics and health phenotyping, to support integrated models required for personalized diet and nutrition forecasting. PMID:22540256

  4. Analyzing multiple data sets by interconnecting RSAT programs via SOAP Web services: an example with ChIP-chip data.

    PubMed

    Sand, Olivier; Thomas-Chollier, Morgane; Vervisch, Eric; van Helden, Jacques

    2008-01-01

    This protocol shows how to access the Regulatory Sequence Analysis Tools (RSAT) via a programmatic interface in order to automate the analysis of multiple data sets. We describe the steps for writing a Perl client that connects to the RSAT Web services and implements a workflow to discover putative cis-acting elements in promoters of gene clusters. In the presented example, we apply this workflow to lists of transcription factor target genes resulting from ChIP-chip experiments. For each factor, the protocol predicts the binding motifs by detecting significantly overrepresented hexanucleotides in the target promoters and generates a feature map that displays the positions of putative binding sites along the promoter sequences. This protocol is addressed to bioinformaticians and biologists with programming skills (notions of Perl). Running time is approximately 6 min on the example data set.

  5. Targeted, noninvasive blockade of cortical neuronal activity

    NASA Astrophysics Data System (ADS)

    McDannold, Nathan; Zhang, Yongzhi; Power, Chanikarn; Arvanitis, Costas D.; Vykhodtseva, Natalia; Livingstone, Margaret

    2015-11-01

    Here we describe a novel method to noninvasively modulate targeted brain areas through the temporary disruption of the blood-brain barrier (BBB) via focused ultrasound, enabling focal delivery of a neuroactive substance. Ultrasound was used to locally disrupt the BBB in rat somatosensory cortex, and intravenous administration of GABA then produced a dose-dependent suppression of somatosensory-evoked potentials in response to electrical stimulation of the sciatic nerve. No suppression was observed 1-5 days afterwards or in control animals where the BBB was not disrupted. This method has several advantages over existing techniques: it is noninvasive; it is repeatable via additional GABA injections; multiple brain regions can be affected simultaneously; suppression magnitude can be titrated by GABA dose; and the method can be used with freely behaving subjects. We anticipate that the application of neuroactive substances in this way will be a useful tool for noninvasively mapping brain function, and potentially for surgical planning or novel therapies.

  6. Modifications to risk-targeted seismic design maps for subduction and near-fault hazards

    USGS Publications Warehouse

    Liel, Abbie B.; Luco, Nicolas; Raghunandan, Meera; Champion, C.; Haukaas, Terje

    2015-01-01

    ASCE 7-10 introduced new seismic design maps that define risk-targeted ground motions such that buildings designed according to these maps will have a 1% chance of collapse in 50 years. These maps were developed by iterative risk calculation, wherein a generic building collapse fragility curve is convolved with the U.S. Geological Survey hazard curve until target risk criteria are met. Recent research shows that this approach may be unconservative at locations where the tectonic environment is much different from that used to develop the generic fragility curve. This study illustrates how risk-targeted ground motions at selected sites would change if the generic building fragility curve and hazard assessment were modified to account for seismic risk from subduction earthquakes and near-fault pulses. The paper also explores the difficulties in implementing these changes.
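
    A schematic sketch of the iterative risk calculation described above, under simplifying assumptions: collapse risk is obtained by integrating a lognormal collapse fragility against the derivative of a hazard curve, and the fragility's anchor ground motion is adjusted until the 50-year collapse probability reaches 1%. The hazard curve and all numerical values below are illustrative, not the USGS data or the exact ASCE procedure.

    ```python
    import numpy as np
    from scipy.stats import lognorm
    from scipy.integrate import trapezoid

    BETA = 0.6                 # generic fragility dispersion
    TARGET_P50 = 0.01          # 1% probability of collapse in 50 years

    im = np.linspace(0.01, 5.0, 2000)                  # spectral acceleration grid (g)
    hazard = 4e-4 * (0.5 / im) ** 3                    # toy hazard curve lambda(IM)

    def collapse_rate(anchor_im):
        """Annual collapse rate for a fragility whose 10% point sits at anchor_im."""
        median = anchor_im / lognorm.ppf(0.10, BETA)   # P(collapse | anchor_im) = 0.10
        frag = lognorm.cdf(im, BETA, scale=median)
        d_lambda = -np.gradient(hazard, im)            # |d lambda / d IM|
        return trapezoid(frag * d_lambda, im)

    # Bisection on the anchor (design) ground motion
    lo, hi = 0.05, 4.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        p50 = 1.0 - np.exp(-collapse_rate(mid) * 50.0)
        lo, hi = (lo, mid) if p50 < TARGET_P50 else (mid, hi)

    print(round(0.5 * (lo + hi), 3), "g (risk-targeted anchor for this toy hazard)")
    ```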

  7. Enhanced STEM Learning with the GeoMapApp Data Exploration Tool

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2014-12-01

    GeoMapApp (http://www.geomapapp.org) is a free, map-based data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory. GeoMapApp provides casual and specialist users alike with access to hundreds of built-in geoscience data sets covering geology, geophysics, geochemistry, oceanography, climatology, cryospherics, and the environment. Users can also import their own data tables, spreadsheets, shapefiles, grids and images. Simple manipulation and analysis tools combined with layering capabilities and engaging visualisations provide a powerful platform with which to explore and interrogate geoscience data in its proper geospatial context, thus helping users more easily gain insight into the meaning of the data. A global elevation base map covering the oceans as well as the continents forms the backbone of GeoMapApp. The multi-resolution base map is updated regularly and includes data sources ranging from Space Shuttle elevation data for land areas to ultra-high-resolution surveys of coral reefs and seafloor hydrothermal vent fields. Examples of built-in data sets that can be layered over the elevation model include interactive earthquake and volcano data, plate tectonic velocities, hurricane tracks, land and ocean temperature, water column properties, age of the ocean floor, and deep submersible bottom photos. A versatile profiling tool provides instant access to data cross-sections. Contouring and 3-D views are also offered, for example a 3-D view of East Africa's Ngorongoro Crater. Tabular data - both imported and built-in - can be displayed in a variety of ways, and a lasso tool enables users to quickly select data points directly from the map. A range of STEM-based education material based upon GeoMapApp is already available, including a number of self-contained modules for school- and college-level students (http://www.geomapapp.org/education/contributed_material.html). More learning modules are planned, such as one on the effects of sea-level rise. GeoMapApp users include students, teachers, researchers, curriculum developers and outreach specialists.

  8. GSyellow, a Multifaceted Tag for Functional Protein Analysis in Monocot and Dicot Plants.

    PubMed

    Besbrugge, Nienke; Van Leene, Jelle; Eeckhout, Dominique; Cannoot, Bernard; Kulkarni, Shubhada R; De Winne, Nancy; Persiau, Geert; Van De Slijke, Eveline; Bontinck, Michiel; Aesaert, Stijn; Impens, Francis; Gevaert, Kris; Van Damme, Daniel; Van Lijsebettens, Mieke; Inzé, Dirk; Vandepoele, Klaas; Nelissen, Hilde; De Jaeger, Geert

    2018-06-01

    The ability to tag proteins has boosted the emergence of generic molecular methods for protein functional analysis. Fluorescent protein tags are used to visualize protein localization, and affinity tags enable the mapping of molecular interactions by, for example, tandem affinity purification or chromatin immunoprecipitation. To apply these widely used molecular techniques on a single transgenic plant line, we developed a multifunctional tandem affinity purification tag, named GSyellow, which combines the streptavidin-binding peptide tag with citrine yellow fluorescent protein. We demonstrated the versatility of the GSyellow tag in the dicot Arabidopsis (Arabidopsis thaliana) using a set of benchmark proteins. For proof of concept in monocots, we assessed the localization and dynamic interaction profile of the leaf growth regulator ANGUSTIFOLIA3 (AN3), fused to the GSyellow tag, along the growth zone of the maize (Zea mays) leaf. To further explore the function of ZmAN3, we mapped its DNA-binding landscape in the growth zone of the maize leaf through chromatin immunoprecipitation sequencing. Comparison with AN3 target genes mapped in the developing maize tassel or in Arabidopsis cell cultures revealed strong conservation of AN3 target genes between different maize tissues and across monocots and dicots, respectively. In conclusion, the GSyellow tag offers a powerful molecular tool for distinct types of protein functional analyses in dicots and monocots. As this approach involves transforming a single construct, it is likely to accelerate both basic and translational plant research. © 2018 American Society of Plant Biologists. All rights reserved.

  9. QuIN: A Web Server for Querying and Visualizing Chromatin Interaction Networks.

    PubMed

    Thibodeau, Asa; Márquez, Eladio J; Luo, Oscar; Ruan, Yijun; Menghi, Francesca; Shin, Dong-Guk; Stitzel, Michael L; Vera-Licona, Paola; Ucar, Duygu

    2016-06-01

    Recent studies of the human genome have indicated that regulatory elements (e.g. promoters and enhancers) at distal genomic locations can interact with each other via chromatin folding and affect gene expression levels. Genomic technologies for mapping interactions between DNA regions, e.g., ChIA-PET and Hi-C, can generate genome-wide maps of interactions between regulatory elements. These interaction datasets are important resources to infer distal gene targets of non-coding regulatory elements and to facilitate prioritization of critical loci for important cellular functions. With the increasing diversity and complexity of genomic information and public ontologies, making sense of these datasets demands integrative and easy-to-use software tools. Moreover, network representation of chromatin interaction maps enables effective data visualization, integration, and mining. Currently, there is no software that can take full advantage of network theory approaches for the analysis of chromatin interaction datasets. To fill this gap, we developed a web-based application, QuIN, which enables: 1) building and visualizing chromatin interaction networks, 2) annotating networks with user-provided private and publicly available functional genomics and interaction datasets, 3) querying network components based on gene name or chromosome location, and 4) utilizing network-based measures to identify and prioritize critical regulatory targets and their direct and indirect interactions. QuIN's web server is available at http://quin.jax.org. QuIN is developed in Java and JavaScript, utilizing an Apache Tomcat web server and a MySQL database; the source code is available under the GPLv3 license on GitHub: https://github.com/UcarLab/QuIN/.
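
    A hedged sketch of the network view such a tool builds (QuIN itself is Java/JavaScript): nodes are genomic anchors, edges are observed interactions, and network measures help prioritize regulatory targets. The interaction pairs below are made up; real input would come from ChIA-PET or Hi-C interaction calls.

    ```python
    import networkx as nx

    interactions = [
        ("chr1:1000-2000", "chr1:50000-51000"),
        ("chr1:50000-51000", "chr1:90000-91000"),
        ("chr1:50000-51000", "chr1:120000-121000"),
        ("chr1:90000-91000", "chr1:120000-121000"),
    ]

    G = nx.Graph()
    G.add_edges_from(interactions)

    # Query: direct and indirect partners of one anchor
    anchor = "chr1:1000-2000"
    direct = set(G.neighbors(anchor))
    indirect = set(nx.single_source_shortest_path_length(G, anchor, cutoff=2)) - direct - {anchor}

    # Prioritize anchors by a simple network-based measure
    ranking = sorted(nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1])
    print(direct, indirect, ranking[0])
    ```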

  10. A saliency-based approach to detection of infrared target

    NASA Astrophysics Data System (ADS)

    Chen, Yanfei; Sang, Nong; Dan, Zhiping

    2013-10-01

    Automatic target detection in infrared images is an active research field in national defense technology. We propose a new saliency-based infrared target detection model in this paper, based on the fact that human attention is directed towards the relevant target in order to interpret the most promising information. For a given image, the convolution of the image log amplitude spectrum with a low-pass Gaussian kernel of an appropriate scale is equivalent to an image saliency detector in the frequency domain. At the same time, extracted orientation and shape features are combined into a saliency map in the spatial domain. Our proposed model detects salient targets based on a final saliency map, which is generated by integrating the saliency maps from the frequency and spatial domains. Finally, the size of each salient target is obtained by maximizing the entropy of the final saliency map. Experimental results show that the proposed model can highlight both small and large salient regions in infrared images, as well as inhibit repeated distractors in cluttered images. In addition, its detection efficiency is significantly improved.
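
    A rough sketch of the frequency-domain step described above: smooth the log amplitude spectrum with a low-pass Gaussian kernel and reconstruct the image to obtain a saliency map. The spatial-domain orientation/shape channel, the fusion step and the entropy-based sizing of the paper are omitted, and the input image is synthetic.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(4)
    img = rng.random((128, 128)).astype(np.float64)
    img[60:68, 60:68] += 3.0                      # a bright "target" patch

    F = np.fft.fft2(img)
    log_amp = np.log(np.abs(F) + 1e-8)
    phase = np.angle(F)

    smoothed = gaussian_filter(log_amp, sigma=3.0)    # low-pass Gaussian on the spectrum
    sal = np.abs(np.fft.ifft2(np.exp(smoothed) * np.exp(1j * phase))) ** 2
    sal = gaussian_filter(sal, sigma=2.5)             # smooth the spatial saliency map
    sal = (sal - sal.min()) / (sal.max() - sal.min())

    # location of the saliency peak (expected near the inserted patch)
    print(np.unravel_index(np.argmax(sal), sal.shape))
    ```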

  11. REGULATION OF EPHRIN-A EXPRESSION IN COMPRESSED RETINOCOLLICULAR MAPS

    PubMed Central

    Tadesse, T.; Cheng, Q.; Xu, M.; Baro, D.J.; Young, L.J.; Pallas, S.L.

    2012-01-01

    Retinotopic maps can undergo compression and expansion in response to changes in target size, but the mechanism underlying this compensatory process has remained a mystery. The discovery of ephrins as molecular mediators of Sperry’s chemoaffinity process allows a mechanistic approach to this important issue. In Syrian hamsters, neonatal, partial (PT) ablation of posterior superior colliculus (SC) leads to compression of the retinotopic map, independent of neural activity. Graded, repulsive EphA receptor/ephrin-A ligand interactions direct the formation of the retinocollicular map, but whether ephrins might also be involved in map compression is unknown. To examine whether map compression might be directed by changes in the ephrin expression pattern, we compared ephrin-A2 and ephrin-A5 mRNA expression between normal SC and PT SC using in situ hybridization and quantitative real-time PCR. We found that ephrin-A ligand expression in the compressed maps was low anteriorly and high posteriorly, as in normal animals. Consistent with our hypothesis, the steepness of the ephrin gradient increased in the lesioned colliculi. Interestingly, overall levels of ephrin-A2 and -A5 expression declined immediately after neonatal target damage, perhaps promoting axon outgrowth. These data establish a correlation between changes in ephrin-A gradients and map compression, and suggest that ephrin-A expression gradients may be regulated by target size. This in turn could lead to compression of the retinocollicular map onto the reduced target. These findings have important implications for mechanisms of recovery from traumatic brain injury. PMID:23008269

  12. YouGenMap: a web platform for dynamic multi-comparative mapping and visualization of genetic maps

    Treesearch

    Keith Batesole; Kokulapalan Wimalanathan; Lin Liu; Fan Zhang; Craig S. Echt; Chun Liang

    2014-01-01

    Comparative genetic maps are used in examination of genome organization, detection of conserved gene order, and exploration of marker order variations. YouGenMap is an open-source web tool that offers dynamic comparative mapping capability of users' own genetic mapping between 2 or more map sets. Users' genetic map data and optional gene annotations are...

  13. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
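
    An illustrative sketch of a factory-style design pattern for map production of the kind the MapFactory paper proposes; the class names, product types and design-specification keys below are invented for illustration and are not taken from the paper.

    ```python
    from abc import ABC, abstractmethod


    class MapProduct(ABC):
        @abstractmethod
        def render(self, dataset: str) -> str:
            """Produce a map (here just a description string) from a dataset."""


    class ChoroplethMap(MapProduct):
        def render(self, dataset: str) -> str:
            return f"choropleth map of {dataset}"


    class HeatMap(MapProduct):
        def render(self, dataset: str) -> str:
            return f"heat map of {dataset}"


    class MapFactory:
        """Creates the appropriate map type from a (metadata-driven) design spec."""
        _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

        @classmethod
        def create(cls, design_spec: dict) -> MapProduct:
            return cls._registry[design_spec["map_type"]]()


    spec = {"map_type": "heatmap", "theme": "population density"}   # toy design spec
    print(MapFactory.create(spec).render("big geospatial dataset"))
    ```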

  14. PharmMapper server: a web server for potential drug target identification using pharmacophore mapping approach

    PubMed Central

    Liu, Xiaofeng; Ouyang, Sisheng; Yu, Biao; Liu, Yabo; Huang, Kai; Gong, Jiayu; Zheng, Siyuan; Li, Zhihua; Li, Honglin; Jiang, Hualiang

    2010-01-01

    In silico drug target identification, which includes many distinct algorithms for finding disease genes and proteins, is the first step in the drug discovery pipeline. When the 3D structures of the targets are available, the problem of target identification is usually converted to finding the best interaction mode between the potential target candidates and small molecule probes. A pharmacophore, which is the spatial arrangement of features essential for a molecule to interact with a specific target receptor, is an alternative to molecular docking for achieving this goal. PharmMapper server is a freely accessible web server designed to identify potential target candidates for given small molecules (drugs, natural products or other newly discovered compounds with unidentified binding targets) using a pharmacophore mapping approach. PharmMapper hosts a large, in-house pharmacophore database (namely PharmTargetDB) annotated from all the target information in TargetBank, BindingDB, DrugBank and the potential drug target database, including over 7000 receptor-based pharmacophore models (covering information on over 1500 drug targets). PharmMapper automatically finds the best mapping poses of the query molecule against all the pharmacophore models in PharmTargetDB and lists the top N best-fitted hits with appropriate target annotations, together with the respective aligned poses of the query molecule. Benefiting from a highly efficient and robust triangle-hashing mapping method, PharmMapper has high-throughput capability and takes only about 1 h on average to screen the whole PharmTargetDB. The protocol was successful in finding the proper targets among the top 300 pharmacophore candidates in a retrospective benchmarking test of tamoxifen. PharmMapper is available at http://59.78.96.61/pharmmapper. PMID:20430828

  15. LACO-Wiki: A land cover validation tool and a new, innovative teaching resource for remote sensing and the geosciences

    NASA Astrophysics Data System (ADS)

    See, Linda; Perger, Christoph; Dresel, Christopher; Hofer, Martin; Weichselbaum, Juergen; Mondel, Thomas; Steffen, Fritz

    2016-04-01

    The validation of land cover products is an important step in the workflow of generating a land cover map from remotely-sensed imagery. Many students of remote sensing will be given exercises on classifying a land cover map followed by the validation process. Many algorithms exist for classification, embedded within proprietary image processing software or increasingly available as open source tools. However, there is little standardization for land cover validation, nor is there a set of open tools available for implementing this process. The LACO-Wiki tool was developed as a way of filling this gap, bringing together standardized land cover validation methods and workflows into a single portal. This includes the storage and management of land cover maps and validation data; step-by-step instructions to guide users through the validation process; sound sampling designs; an easy-to-use environment for validation sample interpretation; and the generation of accuracy reports based on the validation process. The tool was developed for a range of users including producers of land cover maps, researchers, teachers and students. The use of such a tool could be embedded within the curriculum of remote sensing courses at a university level but is simple enough for use by students aged 13-18. A beta version of the tool is available for testing at: http://www.laco-wiki.net.
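
    A simple sketch of the kind of accuracy report generated at the end of a land cover validation workflow: overall, user's and producer's accuracy from a confusion matrix. The class names and counts below are invented for illustration.

    ```python
    import numpy as np

    classes = ["forest", "cropland", "urban"]
    # rows = map (classified) class, columns = reference (validation sample) class
    confusion = np.array([
        [48,  3,  1],
        [ 4, 40,  2],
        [ 1,  2, 29],
    ], dtype=float)

    total = confusion.sum()
    overall_accuracy = np.trace(confusion) / total
    users_accuracy = np.diag(confusion) / confusion.sum(axis=1)      # commission view
    producers_accuracy = np.diag(confusion) / confusion.sum(axis=0)  # omission view

    print(f"overall accuracy: {overall_accuracy:.2%}")
    for c, ua, pa in zip(classes, users_accuracy, producers_accuracy):
        print(f"{c:9s} user's: {ua:.2%}  producer's: {pa:.2%}")
    ```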

  16. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  17. PhosphOrtholog: a web-based tool for cross-species mapping of orthologous protein post-translational modifications.

    PubMed

    Chaudhuri, Rima; Sadrieh, Arash; Hoffman, Nolan J; Parker, Benjamin L; Humphrey, Sean J; Stöckli, Jacqueline; Hill, Adam P; James, David E; Yang, Jean Yee Hwa

    2015-08-19

    Most biological processes are influenced by protein post-translational modifications (PTMs). Identifying novel PTM sites in different organisms, including humans and model organisms, has expedited our understanding of key signal transduction mechanisms. However, with the increasing availability of deep, quantitative datasets in diverse species, there is a growing need for tools to facilitate cross-species comparison of PTM data. This is particularly important because functionally important modification sites are more likely to be evolutionarily conserved; yet cross-species comparison of PTMs is difficult since they often lie in structurally disordered protein domains. Current tools that address this can only map known PTMs between species based on known orthologous phosphosites, and do not enable the cross-species mapping of newly identified modification sites. Here, we addressed this by developing a web-based software tool, PhosphOrtholog (www.phosphortholog.com), that accurately maps protein modification sites between different species. This facilitates the comparison of datasets derived from multiple species, and should be a valuable tool for the proteomics community. Here we describe PhosphOrtholog, a web-based application for mapping known and novel orthologous PTM sites from experimental data obtained from different species. PhosphOrtholog is the only generic and automated tool that enables cross-species comparison of large-scale PTM datasets without relying on existing PTM databases. This is achieved through pairwise sequence alignment of orthologous protein residues. To demonstrate its utility we apply it to two sets of human and rat muscle phosphoproteomes generated following insulin and exercise stimulation, respectively, and one publicly available mouse phosphoproteome following cellular stress, revealing high mapping and coverage efficiency. Although coverage statistics are dataset dependent, PhosphOrtholog more than doubled the number of cross-species mapped sites in all our example data sets compared to those recovered using existing resources such as PhosphoSitePlus. PhosphOrtholog is the first tool that enables mapping of thousands of novel and known protein phosphorylation sites across species, accessible through an easy-to-use web interface. Identification of conserved PTMs across species from large-scale experimental data increases our knowledge base of functional PTM sites. Moreover, PhosphOrtholog is generic, being applicable to other PTM datasets such as acetylation, ubiquitination and methylation.
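
    A minimal sketch of the core idea, assuming a pairwise alignment is already available: walk the aligned columns to map a modification site from one protein to the orthologous position in the other. The gapped sequences below are toy inputs, not PhosphOrtholog data; in practice the alignment would come from any pairwise aligner run on the two orthologous sequences.

    ```python
    def map_site(aligned_src, aligned_dst, src_pos):
        """Map a 1-based residue position in the source protein to the target protein."""
        src_i = dst_i = 0
        for a, b in zip(aligned_src, aligned_dst):
            if a != "-":
                src_i += 1
            if b != "-":
                dst_i += 1
            if a != "-" and src_i == src_pos:
                return dst_i if b != "-" else None   # None: the site falls in a gap
        return None

    human = "MKS-PSLLTR"   # toy gapped alignment of a human protein fragment
    rat   = "MKSAPS-LTR"   # and its rat ortholog

    print(map_site(human, rat, 5))   # human Ser5 maps to rat residue 6
    ```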

  18. NaviCell Web Service for network-based data visualization.

    PubMed

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei

    2015-07-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
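
    The server mode mentioned above is driven by standard HTTP calls, and official Python and R bindings exist. The snippet below only illustrates the general RESTful pattern with the requests library: the URL, command and parameter names are placeholders, not the actual NaviCell Web Service API.

```python
import requests

SERVER = "https://example.org/navicell/api"          # placeholder, not a real endpoint

payload = {
    "command": "import_datatable",                   # hypothetical command name
    "map": "example_pathway_map",                    # hypothetical map identifier
    "data": "GENE\tlogFC\nTP53\t1.8\nMYC\t-0.7\n",   # inline omics table
}
resp = requests.post(SERVER, data=payload, timeout=30)
print(resp.status_code, resp.text[:200])             # inspect the server's reply
```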

  19. NaviCell Web Service for network-based data visualization

    PubMed Central

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei

    2015-01-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of ‘omics’ data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. PMID:25958393

  20. Demands on attention and the role of response priming in visual discrimination of feature conjunctions.

    PubMed

    Fournier, Lisa R; Herbert, Rhonda J; Farris, Carrie

    2004-10-01

    This study examined how response mapping of features within single- and multiple-feature targets affects decision-based processing and attentional capacity demands. Observers judged the presence or absence of 1 or 2 target features within an object either presented alone or with distractors. Judging the presence of 2 features relative to the less discriminable of these features alone was faster (conjunction benefits) when the task-relevant features differed in discriminability and were consistently mapped to responses. Conjunction benefits were attributed to asynchronous decision priming across attended, task-relevant dimensions. A failure to find conjunction benefits for disjunctive conjunctions was attributed to increased memory demands and variable feature-response mapping for 2- versus single-feature targets. Further, attentional demands were similar between single- and 2-feature targets when response mapping, memory demands, and discriminability of the task-relevant features were equated between targets. Implications of the findings for recent attention models are discussed. (c) 2004 APA, all rights reserved

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshmukh, Ranjit; Wu, Grace

    The MapRE (Multi-criteria Analysis for Planning Renewable Energy) GIS (Geographic Information Systems) Tools are a set of ArcGIS tools to a) conduct site suitability analysis for wind and solar resources using inclusion and exclusion criteria, and create resource maps; b) create project opportunity areas and compute various attributes such as cost, distances to existing and planned infrastructure, and environmental impact factors; and c) calculate and update various attributes for already processed renewable energy zones. In addition, MapRE data sets are geospatial data of renewable energy project opportunity areas and zones with pre-calculated attributes for several countries. These tools and data are available at mapre.lbl.gov.
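
    Criteria-based suitability analysis of this kind amounts to combining thresholded resource layers with exclusion masks on a common grid. The following Python sketch is illustrative only; the layers, thresholds and synthetic grid are assumptions, and it does not reproduce the MapRE ArcGIS workflow.

```python
import numpy as np

# Synthetic co-registered rasters on one grid
rng = np.random.default_rng(0)
wind_speed = rng.uniform(3.0, 10.0, size=(100, 100))    # m/s at hub height
slope      = rng.uniform(0.0, 30.0, size=(100, 100))    # percent
protected  = rng.random((100, 100)) < 0.1               # True = excluded area

# Inclusion thresholds combined with exclusion criteria into one boolean mask
suitable = (wind_speed >= 6.5) & (slope <= 20.0) & ~protected
print(f"Suitable share of study area: {suitable.mean():.1%}")
```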

  2. Evaluation of equipment and methods to map lost circulation zones in geothermal wells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, W.J.; Leon, P.A.; Pittard, G.

    A study and evaluation of methods to locate, characterize, and quantify lost circulation zones are described. Twenty-five methods of mapping and quantifying lost circulation zones were evaluated, including electrical, acoustical, mechanical, radioactive, and optical systems. Each tool studied is described. The structured, numerical evaluation plan used as the basis for comparing the 25 tools, and the resulting ranking among the tools, are presented.

  3. Map of the Pluto System - Children's Edition

    NASA Astrophysics Data System (ADS)

    Hargitai, H. I.

    2016-12-01

    Cartography is a powerful tool in the scientific visualization and communication of spatial data. Cartographic visualization for children requires special methods. Although almost all known solid-surface bodies in the Solar System have been mapped in detail over more than five decades, books and publications that target children, tweens and teens never include any of the cartographic results of these missions. We have developed a series of large-size planetary maps with the collaboration of planetary scientists, cartographers and graphic artists. The maps are based on photomosaics and DTMs that were redrawn as artwork. This process necessarily involved generalization, interpretation and transformation into a visual language that can be understood by children. In the first project we selected six planetary bodies (Venus, the Moon, Mars, Io, Europa and Titan) and invited six illustrators of children's books. Although the overall structure of the maps looks similar, the visual approach was significantly different. An important addition was that the maps contained a narrative: different characters - astronauts or "alien-like lifeforms" - interacted with the surface. The map contents were translated into 11 languages and published online at https://childrensmaps.wordpress.com. We report here on the new map of the series. Following the New Horizons Pluto flyby, we have started working on a map that, unlike the others, depicts a planetary system, not only one body. Since only one hemisphere was imaged in high resolution, this map shows the encounter hemispheres of Pluto and Charon. Projected high-resolution image mosaics with informal nomenclature were provided by the New Horizons Team. The graphic artist is Adrienn Gyöngyösi. Our future plan is to produce a book-format Children's Atlas of Solar System bodies that makes planetary cartographic and astrogeologic results more accessible for children, and the next generation of planetary scientists among them.

  4. Arctic Research Mapping Application (ARMAP): 2D Maps and 3D Globes Support Arctic Science

    NASA Astrophysics Data System (ADS)

    Johnson, G.; Gaylord, A. G.; Brady, J. J.; Cody, R. P.; Aguilar, J. A.; Dover, M.; Garcia-Lavigne, D.; Manley, W.; Score, R.; Tweedie, C. E.

    2007-12-01

    The Arctic Research Mapping Application (ARMAP) is a suite of online services that support Arctic science. These services include a text-based online search utility, a 2D Internet Map Server (IMS), 3D globes, and Open Geospatial Consortium (OGC) Web Map Services (WMS). With ARMAP's 2D maps and 3D globes, users can navigate to areas of interest, view a variety of map layers, and explore U.S. Federally funded research projects. Projects can be queried by location, year, funding program, discipline, and keyword. Links take users to specific information and other web sites associated with a particular research project. The Arctic Research Logistics Support Service (ARLSS) database is the foundation of ARMAP and includes US research funded by the National Science Foundation, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, and the United States Geological Survey. Avoiding duplication of effort has been a primary objective of the ARMAP project, which incorporates best practices (e.g. Spatial Data Infrastructure and OGC standard web services and metadata) and off-the-shelf technologies where appropriate. The ARMAP suite provides tools for users of various levels of technical ability to interact with the data by importing the web services directly into their own GIS applications and virtual globes; performing advanced GIS queries; simply printing maps from a set of predefined images in the map gallery; browsing the layers in an IMS; or choosing to "fly to" sites using a 3D globe. With special emphasis on the International Polar Year (IPY), ARMAP has targeted science planners, scientists, educators, and the general public. In sum, ARMAP goes beyond a simple map display to enable analysis, synthesis, and coordination of Arctic research. ARMAP may be accessed via the gateway web site at http://www.armap.org.
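
    Because the layers are exposed as standard OGC WMS, they can in principle be pulled into a script with a library such as OWSLib. The sketch below only illustrates that pattern; the service URL, layer choice and bounding box are assumptions, not ARMAP's actual endpoints.

```python
from owslib.wms import WebMapService

# Placeholder endpoint; substitute the WMS URL advertised by the service you use
wms = WebMapService("https://example.org/arcgis/services/wms", version="1.1.1")
layer = sorted(wms.contents)[0]                 # pick the first advertised layer

img = wms.getmap(layers=[layer], styles=[""],
                 srs="EPSG:4326",
                 bbox=(-170.0, 60.0, -130.0, 75.0),   # rough Arctic Alaska extent
                 size=(800, 400), format="image/png", transparent=True)
with open("layer.png", "wb") as f:
    f.write(img.read())                         # save the rendered map image
```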

  5. Mapping healthcare systems: a policy relevant analytic tool.

    PubMed

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  6. Data and Tools | NREL

    Science.gov Websites

    NREL develops data sets, maps, models, and tools for analysis; the data, models, and tools are provided in an alphabetical listing. Popular resources include the PVWatts Calculator and geospatial data.

  7. VSDMIP: virtual screening data management on an integrated platform

    NASA Astrophysics Data System (ADS)

    Gil-Redondo, Rubén; Estrada, Jorge; Morreale, Antonio; Herranz, Fernando; Sancho, Javier; Ortiz, Ángel R.

    2009-03-01

    A novel software package (VSDMIP) for the virtual screening (VS) of chemical libraries integrated within a MySQL relational database is presented. Two main features make VSDMIP clearly distinguishable from other existing computational tools: (i) its database, which stores not only ligand information but also the results from every step in the VS process, and (ii) its modular and pluggable architecture, which allows customization of the VS stages (such as the programs used for conformer generation or docking) through the definition of a detailed workflow employing user-configurable XML files. VSDMIP, therefore, facilitates the storage and retrieval of VS results, easily adapts to the specific requirements of each method and tool used in the experiments, and allows the comparison of different VS methodologies. To validate the usefulness of VSDMIP as an automated tool for carrying out VS, several experiments were run on six protein targets (acetylcholinesterase, cyclin-dependent kinase 2, coagulation factor Xa, estrogen receptor alpha, p38 MAP kinase, and neuraminidase) using nine binary (active/inactive) test sets. The performance of several VS configurations was evaluated by means of enrichment factors and receiver operating characteristic plots.

  8. Unique Sensor Plane Maps Invisible Toxins for First Responders

    ScienceCinema

    Kroutil, Robert; Thomas, Mark; Aten, Keith

    2018-05-30

    A unique airborne emergency response tool, ASPECT is a Los Alamos/U.S. Environmental Protection Agency project that can put chemical and radiological mapping tools in the air over an accident scene. The name ASPECT is an acronym for Airborne Spectral Photometric Environmental Collection Technology.

  9. Teaching science with technology: Using EPA’s EnviroAtlas in the classroom

    EPA Science Inventory

    Background/Question/Methods U.S. EPA’s EnviroAtlas provides a collection of web-based, interactive tools and resources for exploring ecosystem goods and services. EnviroAtlas contains two primary tools: An Interactive Map, which provides access to 300+ maps at multiple exte...

  10. Hyperspectral remote sensing applied to mineral exploration in southern Peru: A multiple data integration approach in the Chapi Chiara gold prospect

    NASA Astrophysics Data System (ADS)

    Carrino, Thais Andressa; Crósta, Alvaro Penteado; Toledo, Catarina Labouré Bemfica; Silva, Adalene Moreira

    2018-02-01

    Remote sensing is a key strategic tool for mineral exploration, due to its capacity to detect hydrothermal alteration minerals or alteration mineral zones associated with different types of mineralization systems. A case study of an epithermal system located in southern Peru is presented, aimed at the characterization of mineral assemblages for discriminating potential high-sulfidation epithermal targets, using hyperspectral imagery integrated with petrography, XRD and magnetic data. HyMap images were processed using the Mixture Tuned Matched Filtering (MTMF) technique to produce an alteration map of the Chapi Chiara epithermal gold prospect. Extensive areas marked by advanced argillic alteration (alunite-kaolinite-dickite ± topaz) were mapped in detail, as well as limited argillic (illite-smectite) and propylitic (chlorite spectral domain) alteration. The magmatic-hydrothermal processes responsible for the formation of hypogene minerals were also related to the destruction of ferrimagnetic minerals (e.g., magnetite) in host rocks such as andesite, and the remobilization/formation of paramagnetic Fe-Ti oxides (e.g., rutile, anatase).
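
    At its core, the matched-filter score underlying MTMF projects each pixel spectrum onto a covariance-whitened target spectrum (the "mixture tuned" infeasibility term is omitted here). The Python sketch below uses synthetic data and a stand-in target spectrum; it is illustrative, not the processing chain used in the study.

```python
import numpy as np

def matched_filter(cube, target):
    """cube: (pixels, bands) reflectances; target: (bands,) reference spectrum.
    Returns one matched-filter score per pixel (relative target abundance)."""
    mu = cube.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(cube, rowvar=False))  # pseudo-inverse for stability
    d = target - mu
    return (cube - mu) @ cov_inv @ d / (d @ cov_inv @ d)

rng = np.random.default_rng(0)
cube = rng.normal(0.3, 0.05, size=(5000, 126))       # HyMap-like cube: 126 bands
alunite_like = rng.normal(0.35, 0.05, size=126)      # stand-in target spectrum
scores = matched_filter(cube, alunite_like)
print(scores.min(), scores.max())                    # pixels scoring high resemble the target
```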

  11. Mapping Skills and Activities with Children's Literature

    ERIC Educational Resources Information Center

    Gandy, S. Kay

    2006-01-01

    In the primary grades, maps are useful tools to help the young reader put stories into perspective. This article presents 18 quality children's books that contain maps or lessons about maps, as well as activities to use in the classroom to teach map skills. A table is included with ratings of the usability of the maps.

  12. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods.

    PubMed

    Skjerdal, Taran; Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; de Cecare, Alessandra; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Trevisiani, Marcello; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes , quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as "good"; "sufficient"; or "corrective action needed" based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.
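
    The "good" / "sufficient" / "corrective action needed" categorisation is a simple thresholding of each simulated output against user-set limits. The sketch below is a hedged illustration in Python; the limits, parameter names and direction conventions are assumptions, not STARTEC defaults.

```python
def categorise(value, good_limit, sufficient_limit, higher_is_worse=True):
    """Map a simulated output onto the tool's three categories."""
    if higher_is_worse:   # e.g. predicted log10 CFU/g of L. monocytogenes
        if value <= good_limit:
            return "good"
        return "sufficient" if value <= sufficient_limit else "corrective action needed"
    # otherwise higher is better, e.g. vitamin C retention or a quality score
    if value >= good_limit:
        return "good"
    return "sufficient" if value >= sufficient_limit else "corrective action needed"

print(categorise(1.7, good_limit=2.0, sufficient_limit=3.0))                      # good
print(categorise(65, good_limit=80, sufficient_limit=60, higher_is_worse=False))  # sufficient
```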

  13. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods

    PubMed Central

    Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch.; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes, quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as “good”; “sufficient”; or “corrective action needed” based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users. PMID:29457031

  14. MODSNOW-Tool: an operational tool for daily snow cover monitoring using MODIS data

    NASA Astrophysics Data System (ADS)

    Gafurov, Abror; Lüdtke, Stefan; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Schöne, Tilo; Schmidt, Sebastian; Kalashnikova, Olga; Merz, Bruno

    2017-04-01

    Spatially distributed snow cover information in mountain areas is extremely important for water storage estimations, seasonal water availability forecasting, or the assessment of snow-related hazards (e.g. enhanced snow-melt following intensive rains, or avalanche events). Moreover, spatially distributed snow cover information can be used to calibrate and/or validate hydrological models. We present the MODSNOW-Tool, an operational monitoring tool that offers a user-friendly application for catchment-based operational snow cover monitoring. The application automatically downloads and processes freely available daily Moderate Resolution Imaging Spectroradiometer (MODIS) snow cover data. The MODSNOW-Tool uses a step-wise approach for cloud removal and delivers cloud-free snow cover maps for the selected river basins, including basin-specific snow cover extent statistics. The accuracy of cloud-eliminated MODSNOW snow cover maps was validated for 84 almost cloud-free days in the Karadarya river basin in Central Asia, and an average accuracy of 94 % was achieved. The MODSNOW-Tool can be used in operational and non-operational mode. In the operational mode, the tool is set up as a scheduled task on a local computer, allowing automatic execution without user interaction, and delivers snow cover maps on a daily basis. In the non-operational mode, the tool can be used to process historical time series of snow cover maps. The MODSNOW-Tool is currently implemented and in use at the national hydrometeorological services of four Central Asian states (Kazakhstan, Kyrgyzstan, Uzbekistan and Turkmenistan) and is used for seasonal water availability forecasting.
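
    The basin-specific snow cover statistics reduce, per day, to the share of non-cloud basin cells classified as snow. A minimal Python sketch follows; the class codes and the synthetic grid are assumptions, not the MODSNOW-Tool's actual data model.

```python
import numpy as np

SNOW, LAND, CLOUD = 1, 0, 255             # hypothetical class codes

def snow_cover_fraction(grid, basin_mask):
    """Fraction of the basin that is snow-covered, ignoring cloudy cells."""
    cells = grid[basin_mask]
    valid = cells != CLOUD
    return float((cells[valid] == SNOW).mean())

rng = np.random.default_rng(1)
grid = rng.choice([SNOW, LAND, CLOUD], size=(200, 200), p=[0.5, 0.4, 0.1])
basin = np.ones_like(grid, dtype=bool)    # toy basin mask covering the whole grid
print(f"Snow cover extent: {snow_cover_fraction(grid, basin):.1%}")
```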

  15. Using habitat suitability models to target invasive plant species surveys

    USGS Publications Warehouse

    Crall, Alycia W.; Jarnevich, Catherine S.; Panke, Brendon; Young, Nick; Renz, Mark; Morisette, Jeffrey

    2013-01-01

    Managers need new tools for detecting the movement and spread of nonnative, invasive species. Habitat suitability models are a popular tool for mapping the potential distribution of current invaders, but the ability of these models to prioritize monitoring efforts has not been tested in the field. We tested the utility of an iterative sampling design (i.e., models based on field observations used to guide subsequent field data collection to improve the model), hypothesizing that model performance would increase when new data were gathered from targeted sampling using criteria based on the initial model results. We also tested the ability of habitat suitability models to predict the spread of invasive species, hypothesizing that models would accurately predict occurrences in the field, and that the use of targeted sampling would detect more species with less sampling effort than a nontargeted approach. We tested these hypotheses on two species at the state scale (Centaurea stoebe and Pastinaca sativa) in Wisconsin (USA), and one genus at the regional scale (Tamarix) in the western United States. These initial data were merged with environmental data at 30-m2 resolution for Wisconsin and 1-km2 resolution for the western United States to produce our first iteration models. We stratified these initial models to target field sampling and compared our models and success at detecting our species of interest to other surveys being conducted during the same field season (i.e., nontargeted sampling). Although more data did not always improve our models based on correct classification rate (CCR), sensitivity, specificity, kappa, or area under the curve (AUC), our models generated from targeted sampling data always performed better than models generated from nontargeted data. For Wisconsin species, the model described actual locations in the field fairly well (kappa = 0.51 and 0.19; test statistic = 47.42, P < 0.01). From these findings, we conclude that habitat suitability models can be highly useful tools for guiding invasive species monitoring, and we support the use of an iterative sampling design for guiding such efforts.
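
    The evaluation metrics named above (CCR, sensitivity, specificity, kappa, AUC) can be computed with scikit-learn; the snippet below uses synthetic presence/absence data purely for illustration and is unrelated to the study's datasets.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, cohen_kappa_score,
                             confusion_matrix, roc_auc_score)

rng = np.random.default_rng(42)
y_true = rng.integers(0, 2, size=200)                               # observed presence/absence
y_prob = np.clip(0.6 * y_true + rng.normal(0.2, 0.2, 200), 0, 1)    # modelled suitability
y_pred = (y_prob >= 0.5).astype(int)                                # thresholded prediction

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("CCR        :", accuracy_score(y_true, y_pred))
print("Sensitivity:", tp / (tp + fn))
print("Specificity:", tn / (tn + fp))
print("Kappa      :", cohen_kappa_score(y_true, y_pred))
print("AUC        :", roc_auc_score(y_true, y_prob))
```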

  16. Google Maps offers a new way to evaluate claudication.

    PubMed

    Khambati, Husain; Boles, Kim; Jetty, Prasad

    2017-05-01

    Accurate determination of walking capacity is important for the clinical diagnosis and management plan for patients with peripheral arterial disease. The current "gold standard" of measurement is walking distance on a treadmill. However, treadmill testing is not always reflective of the patient's natural walking conditions, and it may not be fully accessible in every vascular clinic. The objective of this study was to determine whether Google Maps, the readily available GPS-based mapping tool, offers an accurate and accessible method of evaluating walking distances in vascular claudication patients. Patients presenting to the outpatient vascular surgery clinic between November 2013 and April 2014 at the Ottawa Hospital with vasculogenic calf, buttock, and thigh claudication symptoms were identified and prospectively enrolled in our study. Onset of claudication symptoms and maximal walking distance (MWD) were evaluated using four tools: history; Walking Impairment Questionnaire (WIQ), a validated claudication survey; Google Maps distance calculator (patients were asked to report their daily walking routes on the Google Maps-based tool runningmap.com, and walking distances were calculated accordingly); and treadmill testing for onset of symptoms and MWD, recorded in a double-blinded fashion. Fifteen patients were recruited for the study. Determination of walking distances using Google Maps proved to be more accurate than both clinical history and the WIQ, correlating highly with the gold standard of treadmill testing for both claudication onset (r = .805; P < .001) and MWD (r = .928; P < .0001). In addition, distances were generally under-reported on history and the WIQ. The Google Maps tool was also efficient, with reporting times averaging below 4 minutes. For vascular claudicants with no other walking limitations, Google Maps is a promising new tool that combines the objective strengths of the treadmill test and incorporates real-world walking environments. It offers an accurate, efficient, inexpensive, and readily accessible way to assess walking distances in patients with peripheral vascular disease. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  17. Direct shear mapping - a new weak lensing tool

    NASA Astrophysics Data System (ADS)

    de Burgh-Day, C. O.; Taylor, E. N.; Webster, R. L.; Hopkins, A. M.

    2015-08-01

    We have developed a new technique called direct shear mapping (DSM) to measure gravitational lensing shear directly from observations of a single background source. The technique assumes the velocity map of an unlensed, stably rotating galaxy will be rotationally symmetric. Lensing distorts the velocity map, making it asymmetric. The degree of lensing can be inferred by determining the transformation required to restore axisymmetry. This technique is in contrast to traditional weak lensing methods, which require averaging an ensemble of background galaxy ellipticity measurements to obtain a single shear measurement. We have tested the efficacy of our fitting algorithm with a suite of systematic tests on simulated data. We demonstrate that we are in principle able to measure shears as small as 0.01. In practice, we have fitted for the shear in very low redshift (and hence unlensed) velocity maps, and have obtained a null result with an error of ±0.01. This high sensitivity results from analysing spatially resolved spectroscopic images (i.e. 3D data cubes), including not just shape information (as in traditional weak lensing measurements) but velocity information as well. Spirals and rotating ellipticals are ideal targets for this new technique. Data from any large Integral Field Unit (IFU) or radio telescope are suitable, or indeed from any instrument with spatially resolved spectroscopy, such as the Sydney-Australian-Astronomical Observatory Multi-Object Integral Field Spectrograph (SAMI), the Atacama Large Millimeter/submillimeter Array (ALMA), the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX) and the Square Kilometer Array (SKA).
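
    To make the distortion concrete, the toy Python sketch below applies a weak shear to the coordinates of an idealised rotating-disc velocity field and measures how far the observed map departs from the symmetric, unlensed one. The disc model, grid and shear values are assumptions for illustration; this is not the authors' fitting pipeline.

```python
import numpy as np

def source_coords(x, y, g1, g2):
    """Weak-lensing mapping from observed to source-plane coordinates (kappa = 0)."""
    A = np.array([[1 - g1, -g2],
                  [-g2, 1 + g1]])
    xs, ys = A @ np.stack([x.ravel(), y.ravel()])
    return xs.reshape(x.shape), ys.reshape(y.shape)

# Idealised rotating disc: line-of-sight velocity ~ cos(azimuth), flat rotation curve
x, y = np.meshgrid(np.linspace(-1, 1, 65), np.linspace(-1, 1, 65))
v_unlensed = 200.0 * x / (np.hypot(x, y) + 1e-9)        # km/s, symmetric map

xs, ys = source_coords(x, y, g1=0.01, g2=0.0)           # shear of 1 per cent
v_observed = 200.0 * xs / (np.hypot(xs, ys) + 1e-9)     # lensed (asymmetric) map
print(f"Max velocity distortion: {np.abs(v_observed - v_unlensed).max():.2f} km/s")
```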

  18. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than are general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, for target KA tools used, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  19. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data.

    PubMed

    Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun

    2006-08-25

    We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and correctly yet efficiently map PETs to reference genome sequences. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data processing pipeline is desirable. We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. With a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.
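
    As a toy illustration of the ditag idea handled by the Extractor, the Python sketch below splits a concatenated ditag into its 5' and 3' tags, assuming an 18 bp + 18 bp structure; the adapter/linker removal, quality checks and artifact filtering done by PET-Tool are omitted.

```python
def split_ditag(ditag, tag_len=18):
    """Split a concatenated paired-end ditag into its 5' and 3' tags."""
    if len(ditag) < 2 * tag_len:
        raise ValueError("sequence shorter than a full ditag")
    return ditag[:tag_len], ditag[-tag_len:]

tag5, tag3 = split_ditag("ACGTACGTACGTACGTAC" "GGCCTTAAGGCCTTAAGG")
# Each tag would then be located in the reference genome, subject to
# orientation and spacing constraints, by the Mapper module.
print(tag5, tag3)
```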

  20. Analyzing the effectiveness of flare dispensing programs against pulse width modulation seekers using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Şahingil, Mehmet C.; Aslan, Murat Š.

    2013-10-01

    Infrared guided missile seekers utilizing pulse width modulation in target tracking are among the threats against air platforms. To achieve "soft-kill" protection of one's own platform against this type of threat, one needs to examine carefully the seeker operating principle along with its special electronic counter-countermeasure (ECCM) capability. One of the cost-effective ways of soft-kill protection is to use flare decoys in accordance with an optimized dispensing program. Such an optimization requires a good understanding of the threat seeker, the capabilities of the air platform, and the engagement scenario between them. Modeling and simulation is a very powerful tool for gaining valuable insight and understanding the underlying phenomenology. A careful interpretation of simulation results is crucial to infer valuable conclusions from the data. In such an interpretation, many factors (features) affect the results. Therefore, powerful statistical tools and pattern recognition algorithms are of special interest in the analysis. In this paper, we show how self-organizing maps (SOMs), which are one such powerful tool, can be used in analyzing the effectiveness of various flare dispensing programs against a PWM seeker. We perform several Monte Carlo runs for a typical engagement scenario in a MATLAB-based simulation environment. In each run, we randomly change the flare dispensing program and obtain the corresponding class, "successful" or "unsuccessful", depending on whether the flare dispensing program deceives the seeker or not. Then, in the analysis phase, we use SOMs to interpret and visualize the results.
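
    A SOM-based analysis of this kind can be prototyped in Python with the MiniSom library, clustering feature vectors that describe each dispensing program and then inspecting how the success labels distribute over the map nodes. The feature encoding and labels below are synthetic assumptions, not the paper's MATLAB simulation.

```python
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(7)
# Hypothetical features per program: [flare count, dispense interval, start delay], scaled to [0, 1]
programs = rng.random((500, 3))
deceived = (programs[:, 0] + programs[:, 1] > 1.0).astype(int)   # toy success label

som = MiniSom(8, 8, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=7)
som.random_weights_init(programs)
som.train_random(programs, 2000)

# Success rate of the programs falling on each SOM node
node_labels = {}
for features, label in zip(programs, deceived):
    node_labels.setdefault(som.winner(features), []).append(label)
success_rate = {node: float(np.mean(labels)) for node, labels in node_labels.items()}
print(sorted(success_rate.items())[:5])
```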

  1. A web-based tool for groundwater mapping and drought analysis

    NASA Astrophysics Data System (ADS)

    Christensen, S.; Burns, M.; Jones, N.; Strassberg, G.

    2012-12-01

    In 2011-2012, the state of Texas saw the worst one-year drought on record. Fluctuations in gravity measured by GRACE satellites indicate that as much as 100 cubic kilometers of water was lost during this period. Much of this came from reservoirs and shallow soil moisture, but a significant amount came from aquifers. In response to this crisis, a Texas Drought Technology Steering Committee (TDTSC) consisting of academics and water managers was formed to develop new tools and strategies to assist the state in monitoring, predicting, and responding to drought events. In this presentation, we describe one of the tools that was developed as part of this effort. When analyzing the impact of drought on groundwater levels, it is fairly common to examine time series data at selected monitoring wells. However, accurately assessing impacts and trends requires both spatial and temporal analysis involving the development of detailed water level maps at various scales. Creating such maps in a flexible and rapid fashion is critical for effective drought analysis, but can be challenging due to the massive amounts of data involved and the processing required to generate such maps. Furthermore, wells are typically not sampled at the same points in time, and so developing a water table map for a particular date requires both spatial and temporal interpolation of water elevations. To address this challenge, a Cloud-based water level mapping system was developed for the state of Texas. The system is based on the Texas Water Development Board (TWDB) groundwater database, but can be adapted to use other databases as well. The system involves a set of ArcGIS workflows running on a server with a web-based front end and a Google Earth plug-in. A temporal interpolation geoprocessing tool was developed to estimate the piezometric heads for all wells in a given region at a specific date using a regression analysis. This interpolation tool is coupled with other geoprocessing tools to filter data and interpolate point elevations spatially to produce water level, drawdown, and depth to groundwater maps. The web interface allows for users to generate these maps at locations and times of interest. A sequence of maps can be generated over a period of time and animated to visualize how water levels are changing. The time series regression analysis can also be used to do short-term predictions of future water levels.
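
    The temporal-interpolation step amounts to regressing each well's observed heads against time and evaluating the fit on the common target date before spatial interpolation. The short Python sketch below illustrates that idea with made-up observations; it is not the ArcGIS geoprocessing tool described above.

```python
import numpy as np
from datetime import date

# Hypothetical observations for one monitoring well
obs_dates = [date(2010, 3, 1), date(2010, 9, 15), date(2011, 4, 2), date(2012, 1, 20)]
heads_m = [312.4, 311.8, 311.1, 309.9]                 # piezometric head, metres

t = np.array([(d - obs_dates[0]).days for d in obs_dates], dtype=float)
slope, intercept = np.polyfit(t, heads_m, 1)           # linear head-versus-time trend

target_date = date(2011, 8, 1)                         # common date for the water level map
t_target = (target_date - obs_dates[0]).days
print(f"Estimated head on {target_date}: {intercept + slope * t_target:.2f} m")
```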

  2. Analysis and Visualization Tool for Targeted Amplicon Bisulfite Sequencing on Ion Torrent Sequencers

    PubMed Central

    Pabinger, Stephan; Ernst, Karina; Pulverer, Walter; Kallmeyer, Rainer; Valdes, Ana M.; Metrustry, Sarah; Katic, Denis; Nuzzo, Angelo; Kriegner, Albert; Vierlinger, Klemens; Weinhaeusel, Andreas

    2016-01-01

    Targeted sequencing of PCR amplicons generated from bisulfite deaminated DNA is a flexible, cost-effective way to study methylation of a sample at single CpG resolution and perform subsequent multi-target, multi-sample comparisons. Currently, no platform specific protocol, support, or analysis solution is provided to perform targeted bisulfite sequencing on a Personal Genome Machine (PGM). Here, we present a novel tool, called TABSAT, for analyzing targeted bisulfite sequencing data generated on Ion Torrent sequencers. The workflow starts with raw sequencing data, performs quality assessment, and uses a tailored version of Bismark to map the reads to a reference genome. The pipeline visualizes results as lollipop plots and is able to deduce specific methylation-patterns present in a sample. The obtained profiles are then summarized and compared between samples. In order to assess the performance of the targeted bisulfite sequencing workflow, 48 samples were used to generate 53 different Bisulfite-Sequencing PCR amplicons from each sample, resulting in 2,544 amplicon targets. We obtained a mean coverage of 282X using 1,196,822 aligned reads. Next, we compared the sequencing results of these targets to the methylation level of the corresponding sites on an Illumina 450k methylation chip. The calculated average Pearson correlation coefficient of 0.91 confirms the sequencing results with one of the industry-leading CpG methylation platforms and shows that targeted amplicon bisulfite sequencing provides an accurate and cost-efficient method for DNA methylation studies, e.g., to provide platform-independent confirmation of Illumina Infinium 450k methylation data. TABSAT offers a novel way to analyze data generated by Ion Torrent instruments and can also be used with data from the Illumina MiSeq platform. It can be easily accessed via the Platomics platform, which offers a web-based graphical user interface along with sample and parameter storage. TABSAT is freely available under a GNU General Public License version 3.0 (GPLv3) at https://github.com/tadkeys/tabsat/ and http://demo.platomics.com/. PMID:27467908
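
    Two of the steps described above, per-CpG methylation levels from bisulfite read counts and the Pearson comparison against 450k beta values, can be sketched in a few lines of Python. The counts and beta values are synthetic; this is not TABSAT's pipeline code.

```python
import numpy as np
from scipy.stats import pearsonr

# Read counts per CpG site: methylated (C retained) vs unmethylated (C->T converted)
methylated = np.array([250, 40, 198, 12, 160])
unmethylated = np.array([30, 260, 22, 280, 120])
seq_level = methylated / (methylated + unmethylated)    # sequencing-based methylation level

chip_beta = np.array([0.86, 0.15, 0.92, 0.05, 0.55])    # matched 450k beta values
r, p = pearsonr(seq_level, chip_beta)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")
```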

  3. Exploring physics concepts among novice teachers through CMAP tools

    NASA Astrophysics Data System (ADS)

    Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.

    2018-03-01

    Concept maps are graphical tools for organising, elaborating and representing knowledge. Through Cmap tools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between lecturer and peer-teacher scoring were also illustrated. The study offered some implications, especially for physics educators, on determining the hierarchical structure of physics concepts, constructing a physics focus question, and seeing how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.

  4. Automated search method for AFM and profilers

    NASA Astrophysics Data System (ADS)

    Ray, Michael; Martin, Yves C.

    2001-08-01

    A new automation software package creates a search model as an initial setup and searches for a user-defined target in atomic force microscopes or stylus profilometers used in semiconductor manufacturing. The need for such automation has become critical in manufacturing lines. The new method starts with a survey map of a small area of a chip obtained from a chip-design database or an image of the area. The user interface requires a user to point to and define a precise location to be measured, and to select a macro function for an application such as line width or contact hole. The search algorithm automatically constructs a range of possible scan sequences within the survey, and provides increased speed and functionality compared to the methods used in instruments to date. Each sequence consists of a starting point relative to the target, a scan direction, and a scan length. The search algorithm stops when the location of a target is found and the criteria for certainty in positioning are met. With today's capability in high-speed processing and signal control, the tool can simultaneously scan and search for a target in a robotic and continuous manner. Examples are given that illustrate the key concepts.

  5. Geomorphic Unit Tool (GUT): Applications of Fluvial Mapping

    NASA Astrophysics Data System (ADS)

    Kramer, N.; Bangen, S. G.; Wheaton, J. M.; Bouwes, N.; Wall, E.; Saunders, C.; Bennett, S.; Fortney, S.

    2017-12-01

    Geomorphic units are the building blocks of rivers and represent distinct habitat patches for many fluvial organisms. We present the Geomorphic Unit Toolkit (GUT), a flexible GIS geomorphic unit mapping tool, to generate maps of fluvial landforms from topography. GUT applies attributes to landforms based on flow stage (Tier 1), topographic signatures (Tier 2), geomorphic characteristics (Tier 3) and patch characteristics (Tier 4) to derive attributed maps at the level of detail required by analysts. We hypothesize that if more rigorous and consistent geomorphic mapping is conducted, better correlations between physical habitat units and ecohydraulic model results will be obtained compared to past work. Using output from GUT for coarse bed tributary streams in the Columbia River Basin, we explore relationships between salmonid habitat and geomorphic spatial metrics. We also highlight case studies of how GUT can be used to showcase geomorphic impact from large wood restoration efforts. Provided high resolution topography exists, this tool can be used to quickly assess changes in fluvial geomorphology in watersheds impacted by human activities.

  6. Common features of microRNA target prediction tools

    PubMed Central

    Peterson, Sarah M.; Thompson, Jeffrey A.; Ufkin, Melanie L.; Sathyanarayana, Pradeep; Liaw, Lucy; Congdon, Clare Bates

    2014-01-01

    The human genome encodes for over 1800 microRNAs (miRNAs), which are short non-coding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one miRNA to target multiple gene transcripts, miRNAs are recognized as a major mechanism to regulate gene expression and mRNA translation. Computational prediction of miRNA targets is a critical initial step in identifying miRNA:mRNA target interactions for experimental validation. The available tools for miRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to miRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities, and these are DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In comparison across all miRNA target prediction tools, four main aspects of the miRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MiRNA target prediction is a dynamic field with increasing attention on development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output. PMID:24600468
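
    Of the four features listed, the seed match is the simplest to make concrete: a site in the 3'UTR must contain the reverse complement of the miRNA seed (nucleotides 2-8 for a 7mer-m8-style match). The Python sketch below is illustrative only; the sequences are toy examples and conservation, free energy and site accessibility are not scored.

```python
def seed_sites(mirna, utr):
    """Return start positions in the 3'UTR that match the miRNA seed (nt 2-8)."""
    seed = mirna[1:8]                                        # positions 2-8, 5'->3'
    comp = {"A": "U", "U": "A", "G": "C", "C": "G"}
    site = "".join(comp[b] for b in reversed(seed))          # reverse complement
    utr = utr.replace("T", "U")
    return [i for i in range(len(utr) - len(site) + 1) if utr[i:i + len(site)] == site]

mirna_example = "UAGCUUAUCAGACUGAUGUUGA"       # illustrative miRNA sequence
utr_example = "AAGCUAAUAAGCUAGCAUAAGCUAUUCA"   # illustrative 3'UTR fragment
print(seed_sites(mirna_example, utr_example))  # -> [6, 16] for these toy sequences
```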

  7. Common features of microRNA target prediction tools.

    PubMed

    Peterson, Sarah M; Thompson, Jeffrey A; Ufkin, Melanie L; Sathyanarayana, Pradeep; Liaw, Lucy; Congdon, Clare Bates

    2014-01-01

    The human genome encodes for over 1800 microRNAs (miRNAs), which are short non-coding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one miRNA to target multiple gene transcripts, miRNAs are recognized as a major mechanism to regulate gene expression and mRNA translation. Computational prediction of miRNA targets is a critical initial step in identifying miRNA:mRNA target interactions for experimental validation. The available tools for miRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to miRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities, and these are DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In comparison across all miRNA target prediction tools, four main aspects of the miRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MiRNA target prediction is a dynamic field with increasing attention on development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output.

  8. Development and testing of a tool for assessing and resolving medication-related problems in older adults in an ambulatory care setting: the individualized medication assessment and planning (iMAP) tool.

    PubMed

    Crisp, Ginny D; Burkhart, Jena Ivey; Esserman, Denise A; Weinberger, Morris; Roth, Mary T

    2011-12-01

    Medication is one of the most important interventions for improving the health of older adults, yet it has great potential for causing harm. Clinical pharmacists are well positioned to engage in medication assessment and planning. The Individualized Medication Assessment and Planning (iMAP) tool was developed to aid clinical pharmacists in documenting medication-related problems (MRPs) and associated recommendations. The purpose of our study was to assess the reliability and usability of the iMAP tool in classifying MRPs and associated recommendations in older adults in the ambulatory care setting. Three cases, representative of older adults seen in an outpatient setting, were developed. Pilot testing was conducted and a "gold standard" key developed. Eight eligible pharmacists consented to participate in the study. They were instructed to read each case, make an assessment of MRPs, formulate a plan, and document the information using the iMAP tool. Inter-rater reliability was assessed for each case, comparing the pharmacists' identified MRPs and recommendations to the gold standard. Consistency of categorization across reviewers was assessed using the κ statistic or percent agreement. The mean κ across the 8 pharmacists in classifying MRPs compared with the gold standard was 0.74 (range, 0.54-1.00) for case 1 and 0.68 (range, 0.36-1.00) for case 2, indicating substantial agreement. For case 3, percent agreement was 63% (range, 40%-100%). The mean κ across the 8 pharmacists when classifying recommendations compared with the gold standard was 0.87 (range, 0.58-1.00) for case 1 and 0.88 (range, 0.75-1.00) for case 2, indicating almost perfect agreement. For case 3, percent agreement was 68% (range, 40%-100%). Clinical pharmacists found the iMAP tool easy to use. The iMAP tool provides a reliable and standardized approach for clinical pharmacists to use in the ambulatory care setting to classify MRPs and associated recommendations. Future studies will explore the predictive validity of the tool on clinical outcomes such as health care utilization. Copyright © 2011 Elsevier HS Journals, Inc. All rights reserved.

  9. Hydration Map, Based on Mastcam Spectra, for Knorr Rock Target

    NASA Image and Video Library

    2013-03-18

    On this image of the rock target Knorr, color coding maps the amount of mineral hydration indicated by a ratio of near-infrared reflectance intensities measured by the Mastcam on NASA Mars rover Curiosity.

  10. The Adversarial Route Analysis Tool: A Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  11. GuidosToolbox: universal digital image object analysis

    Treesearch

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  12. Linking the ACT ASPIRE Assessments to NWEA MAP Assessments

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  13. Assessment and Application of National Environmental Databases and Mapping Tools at the Local Level to Two Community Case Studies

    EPA Science Inventory

    Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessm...

  14. Mind Maps as Facilitative Tools in Science Education

    ERIC Educational Resources Information Center

    Safar, Ammar H.; Jafer, Yaqoub J.; Alqadiri, Mohammad A.

    2014-01-01

    This study explored the perceptions, attitudes, and willingness of pre-service science teachers in the College of Education at Kuwait University about using concept/mind maps and its related application software as facilitative tools, for teaching and learning, in science education. The first level (i.e., reaction) of Kirkpatrick's/Phillips'…

  15. Groundnut improvement: use of genetic and genomic tools

    PubMed Central

    Janila, Pasupuleti; Nigam, S. N.; Pandey, Manish K.; Nagesh, P.; Varshney, Rajeev K.

    2013-01-01

    Groundnut (Arachis hypogaea L.), a self-pollinated legume is an important crop cultivated in 24 million ha world over for extraction of edible oil and food uses. The kernels are rich in oil (48–50%) and protein (25–28%), and are source of several vitamins, minerals, antioxidants, biologically active polyphenols, flavonoids, and isoflavones. Improved varieties of groundnut with high yield potential were developed and released for cultivation world over. The improved varieties belong to different maturity durations and possess resistance to diseases, tolerance to drought, enhanced oil content, and improved quality traits for food uses. Conventional breeding procedures along with the tools for phenotyping were largely used in groundnut improvement programs. Mutations were used to induce variability and wide hybridization was attempted to tap variability from wild species. Low genetic variability has been a bottleneck for groundnut improvement. The vast potential of wild species, reservoir of new alleles remains under-utilized. Development of linkage maps of groundnut during the last decade was followed by identification of markers and quantitative trait loci for the target traits. Consequently, the last decade has witnessed the deployment of molecular breeding approaches to complement the ongoing groundnut improvement programs in USA, China, India, and Japan. The other potential advantages of molecular breeding are the feasibility to target multiple traits for improvement and provide tools to tap new alleles from wild species. The first groundnut variety developed through marker-assisted back-crossing is a root-knot nematode-resistant variety, NemaTAM in USA. The uptake of molecular breeding approaches in groundnut improvement programs by NARS partners in India and many African countries is slow or needs to be initiated in part due to inadequate infrastructure, high genotyping costs, and human capacities. Availability of draft genome sequence for diploid (AA and BB) and tetraploid, AABB genome species of Arachis in coming years is expected to bring low-cost genotyping to the groundnut community that will facilitate use of modern genetics and breeding approaches such as genome-wide association studies for trait mapping and genomic selection for crop improvement. PMID:23443056

  16. Optimizing farm landscape by two decision-support tools for present and future: A case study in a mountainous farm of Taiwan

    NASA Astrophysics Data System (ADS)

    Chou, S.; Lin, Y.

    2013-12-01

    Rapid expansion of agricultural land-use has been identified as the main factor degrading biodiversity. Many studies have indicated that habitat quality and connectivity for multiple species can be preserved by applying systematic conservation planning, and software programs for spatial conservation prioritization are commonly used by planners to solve conservation problems for the present and the future. However, each conservation software program uses different algorithms and may not be suitable or efficient for all case studies. Therefore, in this study we compared the performance of two commonly used decision-support tools, Marxan and Zonation, in identifying priority areas as a reserve region for 16 bird species in the mountain area of Taiwan. The priority areas are considered the result of the tradeoff between bird presence (biological factor) and agricultural products (economic factor). Marxan uses the minimum set approach to identify priority areas for meeting specific targets, while Zonation uses the maximum coverage approach to identify priority areas given a fixed budget. Therefore, we designed the scenarios with the most comparable settings, selecting target-based planning as the removal rule and the boundary length penalty option in Zonation. The landscape composition and configuration of the simulated priority areas were further evaluated using landscape metrics, and their similarity was examined using Spearman's rank tests. The results showed that Marxan performed more efficiently, while Zonation generated priority areas with better connectivity. As the selection of conservation programs depends on users' objectives and needs for the present and future, this study provides useful information on determining suitable and efficient decision-support tools for future bird conservation.
    Figure captions (panels not reproduced): Conservation maps for Zonation based on different BLP parameters ((a) BLP = 1000, (b) BLP = 3000, (c) BLP = 5000, (d) BLP = 7000); the conservation value for Zonation is based on the hierarchical solution output. Conservation maps for Marxan based on different BMP parameters ((a) BMP = 2500, (b) BMP = 5000, (c) BMP = 7500, (d) BMP = 10000); the conservation value for Marxan is based on the selection frequency.

  17. High resolution and low altitude magnetic surveys for structural geology mapping in the Seabee mine, Saskatchewan, Canada, using UAV-MAG™ technology.

    NASA Astrophysics Data System (ADS)

    Braun, A.; Parvar, K.; Burns, M.

    2017-12-01

    Uninhabited aerial vehicles (UAVs) provide the operational flexibility and ease of use that make them ideal tools for low-altitude, high-resolution magnetic surveys. Flying at lower altitudes than manned aircraft provides the proximity to the target needed to increase sensitivity to smaller and less magnetic targets; for the same sensor specifications, this also increases the signal-to-noise ratio. However, increasing spatial resolution requires tighter line spacing, which increases survey time. We describe a case study at the Seabee mine in Saskatchewan, Canada. Using Pioneer Exploration Ltd.'s UAV-MAG™ technology, we emphasize the importance of altitude and line spacing in UAV magnetic surveys for resolving smaller and less magnetic targets than conventional manned airborne magnetic surveys. Mapping lithological or stratigraphic changes along the target structure requires an existing gradient in magnetic susceptibility; often this criterion is either not met, or the gradient is weaker than the sensor's signal-to-noise ratio at a given flying altitude. The folded structure in the study region, however, shows strong susceptibility contrasts between rock formations in high-altitude regional magnetic surveys. To confirm that no structural elements were missed in the target region, a UAV magnetic survey using a GEM Systems GSMP-35A potassium vapour magnetometer on Pioneer Exploration's UAV-MAG™ platform was conducted to map the structure in detail and to quantify the gain in spatial resolution from flying at lower altitude with denser flight lines. The survey was flown at 25 m above ground level (AGL) with a line spacing of 15 m, and a total of 550 km was covered using an autonomous UAV. The collected data were compared to regional airborne data collected at 150 m AGL with a line spacing of 100 m. The comparison revealed an anticline with a plunge on the northeastern side of the grid. Analysis of the magnetic data, both total magnetic intensity and gradients, shows that the UAV survey resolves much smaller structures than the manned airborne survey. These details also match observations made in previous geological mapping missions.
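    A back-of-the-envelope sketch of why the lower flying height matters: the anomaly of a compact, roughly dipolar magnetic source falls off approximately as 1/r^3 with distance to the source, so halving the sensor-to-source distance gives roughly an eight-fold stronger anomaly. The source depth and moment below are arbitrary illustrative values, not parameters from the Seabee survey.

```python
# Rough sensitivity illustration, not survey data: a compact (dipole-like)
# source's anomaly falls off roughly as 1/r^3 with sensor-to-source distance,
# so lowering the sensor sharply increases the amplitude available above noise.
def dipole_amplitude(moment: float, sensor_agl_m: float, source_depth_m: float) -> float:
    """Relative peak anomaly of a buried dipole (arbitrary units, ~ moment / r^3)."""
    r = sensor_agl_m + source_depth_m
    return moment / r ** 3

SOURCE_DEPTH_M = 10.0          # assumed shallow body; illustrative only
for agl in (25.0, 150.0):      # UAV survey height vs regional airborne height
    amp = dipole_amplitude(1.0, agl, SOURCE_DEPTH_M)
    print(f"AGL {agl:5.1f} m -> relative amplitude {amp:.2e}")

gain = dipole_amplitude(1.0, 25.0, SOURCE_DEPTH_M) / dipole_amplitude(1.0, 150.0, SOURCE_DEPTH_M)
print(f"amplitude gain at 25 m vs 150 m AGL: ~{gain:.0f}x")
```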

  18. Using intervention mapping to develop a work-related guidance tool for those affected by cancer.

    PubMed

    Munir, Fehmidah; Kalawsky, Katryna; Wallis, Deborah J; Donaldson-Feilder, Emma

    2013-01-05

    Working-aged individuals diagnosed and treated for cancer require support and assistance to make decisions regarding work. However, healthcare professionals do not consider the work-related needs of patients and employers do not understand the full impact cancer can have upon the employee and their work. We therefore developed a work-related guidance tool for those diagnosed with cancer that enables them to take the lead in stimulating discussion with a range of different healthcare professionals, employers, employment agencies and support services. The tool facilitates discussions through a set of questions individuals can utilise to find solutions and minimise the impact cancer diagnosis, prognosis and treatment may have on their employment, sick leave and return to work outcomes. The objective of the present article is to describe the systematic development and content of the tool using Intervention Mapping Protocol (IMP). The study used the first five steps of the intervention mapping process to guide the development of the tool. A needs assessment identified the 'gaps' in information/advice received from healthcare professionals and other stakeholders. The intended outcomes and performance objectives for the tool were then identified followed by theory-based methods and an implementation plan. A draft of the tool was developed and subjected to a two-stage Delphi process with various stakeholders. The final tool was piloted with 38 individuals at various stages of the cancer journey. The tool was designed to be a self-led tool that can be used by any person with a cancer diagnosis and working for most types of employers. The pilot study indicated that the tool was relevant and much needed. Intervention Mapping is a valuable protocol for designing complex guidance tools. The process and design of this particular tool can lend itself to other situations both occupational and more health-care based.

  19. Using intervention mapping to develop a work-related guidance tool for those affected by cancer

    PubMed Central

    2013-01-01

    Background Working-aged individuals diagnosed and treated for cancer require support and assistance to make decisions regarding work. However, healthcare professionals do not consider the work-related needs of patients and employers do not understand the full impact cancer can have upon the employee and their work. We therefore developed a work-related guidance tool for those diagnosed with cancer that enables them to take the lead in stimulating discussion with a range of different healthcare professionals, employers, employment agencies and support services. The tool facilitates discussions through a set of questions individuals can utilise to find solutions and minimise the impact cancer diagnosis, prognosis and treatment may have on their employment, sick leave and return to work outcomes. The objective of the present article is to describe the systematic development and content of the tool using Intervention Mapping Protocol (IMP). Methods The study used the first five steps of the intervention mapping process to guide the development of the tool. A needs assessment identified the ‘gaps’ in information/advice received from healthcare professionals and other stakeholders. The intended outcomes and performance objectives for the tool were then identified followed by theory-based methods and an implementation plan. A draft of the tool was developed and subjected to a two-stage Delphi process with various stakeholders. The final tool was piloted with 38 individuals at various stages of the cancer journey. Results The tool was designed to be a self-led tool that can be used by any person with a cancer diagnosis and working for most types of employers. The pilot study indicated that the tool was relevant and much needed. Conclusions Intervention Mapping is a valuable protocol for designing complex guidance tools. The process and design of this particular tool can lend itself to other situations both occupational and more health-care based. PMID:23289708

  20. Evaluation of a color-coded Landsat 5/6 ratio image for mapping lithologic differences in western South Dakota

    USGS Publications Warehouse

    Raines, Gary L.; Bretz, R.F.; Shurr, George W.

    1979-01-01

    From analysis of a color-coded Landsat 5/6 ratio image, Raines produced a map of the vegetation density distribution over 25,000 sq km of western South Dakota. The 5/6 ratio image is produced by digitally calculating the ratio of bands 5 and 6 of the Landsat data and then color coding these ratios in an image. Bretz and Shurr compared this vegetation density map with published and unpublished data, primarily from the U.S. Geological Survey and the South Dakota Geological Survey; good correspondence is seen between this map and existing geologic maps, especially the soils map. We believe that this Landsat ratio image can be used as a tool to refine existing maps of surficial geology and bedrock, where bedrock is exposed, and to improve mapping accuracy in the areas of poor exposure common in South Dakota. In addition, this type of image could be a useful additional tool in mapping areas that are unmapped.
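    A minimal sketch of the band-ratio step described above, assuming two co-registered Landsat bands already loaded as floating-point arrays; the array names, the quantile-based class breaks, and the colormap are illustrative choices, not the USGS production workflow.

```python
# Minimal band-ratio sketch with synthetic stand-ins for real Landsat bands.
import numpy as np
import matplotlib.pyplot as plt

def ratio_image(band5: np.ndarray, band6: np.ndarray) -> np.ndarray:
    """Return the band-5 / band-6 ratio, masking divide-by-zero pixels as NaN."""
    return np.divide(band5, band6,
                     out=np.full_like(band5, np.nan, dtype=float),
                     where=band6 > 0)

def color_coded(ratio: np.ndarray, n_classes: int = 8) -> None:
    """Bin the ratio into quantile classes and display a color-coded map."""
    valid = ratio[np.isfinite(ratio)]
    edges = np.percentile(valid, np.linspace(0, 100, n_classes + 1))
    classes = np.digitize(ratio, edges[1:-1])         # 0 .. n_classes - 1
    plt.imshow(classes, cmap="viridis")
    plt.colorbar(label="band 5 / band 6 ratio class (vegetation density proxy)")
    plt.show()

rng = np.random.default_rng(0)
b5 = rng.uniform(10, 120, size=(200, 200))
b6 = rng.uniform(10, 120, size=(200, 200))
color_coded(ratio_image(b5, b6))
```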

  1. Development of a competency mapping tool for undergraduate professional degree programmes, using mechanical engineering as a case study

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.

  2. Neural mechanisms underlying sensitivity to reverse-phi motion in the fly

    PubMed Central

    Meier, Matthias; Serbe, Etienne; Eichner, Hubert; Borst, Alexander

    2017-01-01

    Optical illusions provide powerful tools for mapping the algorithms and circuits that underlie visual processing, revealing structure through atypical function. Of particular note in the study of motion detection has been the reverse-phi illusion. When contrast reversals accompany discrete movement, detected direction tends to invert. This occurs across a wide range of organisms, spanning humans and invertebrates. Here, we map an algorithmic account of the phenomenon onto neural circuitry in the fruit fly Drosophila melanogaster. Through targeted silencing experiments in tethered walking flies as well as electrophysiology and calcium imaging, we demonstrate that ON- or OFF-selective local motion detector cells T4 and T5 are sensitive to certain interactions between ON and OFF. A biologically plausible detector model accounts for subtle features of this particular form of illusory motion reversal, like the re-inversion of turning responses occurring at extreme stimulus velocities. In light of comparable circuit architecture in the mammalian retina, we suggest that similar mechanisms may apply even to human psychophysics. PMID:29261684

  3. Neural mechanisms underlying sensitivity to reverse-phi motion in the fly.

    PubMed

    Leonhardt, Aljoscha; Meier, Matthias; Serbe, Etienne; Eichner, Hubert; Borst, Alexander

    2017-01-01

    Optical illusions provide powerful tools for mapping the algorithms and circuits that underlie visual processing, revealing structure through atypical function. Of particular note in the study of motion detection has been the reverse-phi illusion. When contrast reversals accompany discrete movement, detected direction tends to invert. This occurs across a wide range of organisms, spanning humans and invertebrates. Here, we map an algorithmic account of the phenomenon onto neural circuitry in the fruit fly Drosophila melanogaster. Through targeted silencing experiments in tethered walking flies as well as electrophysiology and calcium imaging, we demonstrate that ON- or OFF-selective local motion detector cells T4 and T5 are sensitive to certain interactions between ON and OFF. A biologically plausible detector model accounts for subtle features of this particular form of illusory motion reversal, like the re-inversion of turning responses occurring at extreme stimulus velocities. In light of comparable circuit architecture in the mammalian retina, we suggest that similar mechanisms may apply even to human psychophysics.

  4. The Applications of Model-Based Geostatistics in Helminth Epidemiology and Control

    PubMed Central

    Magalhães, Ricardo J. Soares; Clements, Archie C.A.; Patil, Anand P.; Gething, Peter W.; Brooker, Simon

    2011-01-01

    Funding agencies are dedicating substantial resources to tackle helminth infections. Reliable maps of the distribution of helminth infection can assist these efforts by targeting control resources to areas of greatest need. The ability to define the distribution of infection at regional, national and subnational levels has been enhanced greatly by the increased availability of good quality survey data and the use of model-based geostatistics (MBG), enabling spatial prediction in unsampled locations. A major advantage of MBG risk mapping approaches is that they provide a flexible statistical platform for handling and representing different sources of uncertainty, providing plausible and robust information on the spatial distribution of infections to inform the design and implementation of control programmes. Focussing on schistosomiasis and soil-transmitted helminthiasis, with additional examples for lymphatic filariasis and onchocerciasis, we review the progress made to date with the application of MBG tools in large-scale, real-world control programmes and propose a general framework for their application to inform integrative spatial planning of helminth disease control programmes. PMID:21295680

  5. Relax with CouchDB - Into the non-relational DBMS era of Bioinformatics

    PubMed Central

    Manyam, Ganiraju; Payton, Michelle A.; Roth, Jack A.; Abruzzo, Lynne V.; Coombes, Kevin R.

    2012-01-01

    With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. PMID:22609849
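    As a hedged illustration of the kind of programmatic access mentioned above, the snippet below queries a CouchDB instance over its standard HTTP API (documents by id, and map/reduce views by key). The host, database, design-document, and view names are hypothetical placeholders, not the actual geneSmash, drugBase, or HapMap-CN endpoints.

```python
# Sketch of CouchDB's REST interface; all resource names are hypothetical.
import json
import urllib.parse
import urllib.request

COUCH = "http://localhost:5984"                     # assumed local CouchDB instance

def get_doc(db: str, doc_id: str) -> dict:
    """Fetch a single JSON document by id."""
    with urllib.request.urlopen(f"{COUCH}/{db}/{urllib.parse.quote(doc_id)}") as r:
        return json.load(r)

def query_view(db: str, ddoc: str, view: str, key: str) -> list:
    """Query a map/reduce view for rows matching a JSON-encoded key."""
    url = (f"{COUCH}/{db}/_design/{ddoc}/_view/{view}"
           f"?key={urllib.parse.quote(json.dumps(key))}")
    with urllib.request.urlopen(url) as r:
        return json.load(r)["rows"]

# Hypothetical usage: look up a gene document, then the drugs that target it.
# print(get_doc("genes", "TP53"))
# print(query_view("drug_targets", "lookup", "by_gene", "TP53"))
```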

  6. Kazusa Marker DataBase: a database for genomics, genetics, and molecular breeding in plants.

    PubMed

    Shirasawa, Kenta; Isobe, Sachiko; Tabata, Satoshi; Hirakawa, Hideki

    2014-09-01

    In order to provide useful genomic information for agronomical plants, we have established a database, the Kazusa Marker DataBase (http://marker.kazusa.or.jp). This database includes information on DNA markers, e.g., SSR and SNP markers, genetic linkage maps, and physical maps, that were developed at the Kazusa DNA Research Institute. Keyword searches for the markers, sequence data used for marker development, and experimental conditions are also available through this database. Currently, 10 plant species have been targeted: tomato (Solanum lycopersicum), pepper (Capsicum annuum), strawberry (Fragaria × ananassa), radish (Raphanus sativus), Lotus japonicus, soybean (Glycine max), peanut (Arachis hypogaea), red clover (Trifolium pratense), white clover (Trifolium repens), and eucalyptus (Eucalyptus camaldulensis). In addition, the number of plant species registered in this database will be increased as our research progresses. The Kazusa Marker DataBase will be a useful tool for both basic and applied sciences, such as genomics, genetics, and molecular breeding in crops.

  7. Scalable Regression Tree Learning on Hadoop using OpenPlanet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yin, Wei; Simmhan, Yogesh; Prasanna, Viktor

    As scientific and engineering domains attempt to effectively analyze the deluge of data arriving from sensors and instruments, machine learning is becoming a key data mining tool for building prediction models. The regression tree is a popular learning model that combines decision trees and linear regression to forecast numerical target variables from a set of input features. MapReduce is well suited to such data-intensive learning applications, and a proprietary MapReduce-based regression tree algorithm, PLANET, has been proposed earlier. In this paper, we describe an open-source implementation of this algorithm, OpenPlanet, on the Hadoop framework using a hybrid approach. Further, we evaluate the performance of OpenPlanet using real-world datasets from the smart power grid domain to perform energy use forecasting, and propose tuning strategies for Hadoop parameters that improve the performance of the default configuration by 75% for a training dataset of 17 million tuples on a 64-core Hadoop cluster on FutureGrid.
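    The sketch below illustrates, in plain Python, the PLANET-style division of labour that OpenPlanet builds on: map tasks emit per-(feature, threshold) sufficient statistics for their data shard, and a reducer aggregates them and picks the split with the largest variance reduction. It simulates the MapReduce flow locally and is not the OpenPlanet/Hadoop code; the feature name and shard data are invented.

```python
# Local simulation of the map/reduce split search behind PLANET-style trees.
from collections import defaultdict

def map_phase(rows, thresholds):
    """Emit (feature, threshold, side) -> [count, sum, sum of squares] partial stats."""
    out = defaultdict(lambda: [0, 0.0, 0.0])
    for x, y in rows:                       # x: dict of features, y: numeric target
        for feat, cuts in thresholds.items():
            for t in cuts:
                side = "L" if x[feat] <= t else "R"
                s = out[(feat, t, side)]
                s[0] += 1; s[1] += y; s[2] += y * y
    return out

def reduce_phase(partials):
    """Aggregate partial stats and return the split with maximum variance reduction."""
    total = defaultdict(lambda: [0, 0.0, 0.0])
    for part in partials:
        for key, (n, s, ss) in part.items():
            agg = total[key]
            agg[0] += n; agg[1] += s; agg[2] += ss

    def sse(n, s, ss):                      # sum of squared errors around the mean
        return ss - s * s / n if n else 0.0

    best = None
    for f, t in {(f, t) for (f, t, _) in total}:
        nl, sl, ssl = total.get((f, t, "L"), [0, 0.0, 0.0])
        nr, sr, ssr = total.get((f, t, "R"), [0, 0.0, 0.0])
        n, s, ss = nl + nr, sl + sr, ssl + ssr
        gain = sse(n, s, ss) - sse(nl, sl, ssl) - sse(nr, sr, ssr)
        if best is None or gain > best[0]:
            best = (gain, f, t)
    return best

# Two "map tasks" over disjoint data shards, then one "reduce":
shard1 = [({"kWh_lag1": v}, 2.0 * v) for v in range(0, 10)]
shard2 = [({"kWh_lag1": v}, 2.0 * v + 1) for v in range(10, 20)]
thresholds = {"kWh_lag1": [5, 10, 15]}
print(reduce_phase([map_phase(shard1, thresholds), map_phase(shard2, thresholds)]))
```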

  8. Transportable Manned and Robotic Digital Geophysical Mapping Tow Vehicle, Phase 1

    DTIC Science & Technology

    2007-08-01

    ...by using the UX PROCESS QC/QA tools to evaluate quality. Areas evaluated included induced noise, position and track accuracy, and synchronization/latency... To gain additional data on productivity and the effect of alternate direction of travel, we mapped an unobstructed subset of the Grid 1-4 area...

  9. GIS based application tool -- history of East India Company

    NASA Astrophysics Data System (ADS)

    Phophaliya, Sudhir

    The emphasis of this thesis is to build an intuitive and robust GIS (Geographic Information Systems) tool which gives in-depth information on the history of the East India Company. The GIS tool also incorporates various achievements of the East India Company which helped establish its business all over the world, especially in India. The user has the option to select these movements and acts by clicking on any of the marked states on the world map. The world map also incorporates key features for the East India Company, such as the landing of the East India Company in India, the Darjeeling tea establishment, and the East India Company Stock Redemption Act. The user can learn more about these features simply by clicking on each of them. The primary focus of the tool is to give the user a unique insight into the East India Company; for this, the tool has several HTML (Hypertext Markup Language) pages which the user can select. These HTML pages give information on various topics like the first voyage, trade with China, and the 1857 revolt. The tool has been developed in Java. For the Indian map, MOJO (Map Objects Java Objects) is used. MOJO is developed by ESRI. The major features shown on the world map were designed using MOJO, which made it easy to incorporate the statistical data with these features. The user interface was intentionally kept simple and easy to use. To keep the user engaged, key aspects are explained using HTML pages. The idea is that pictures will help the user garner interest in the history of the East India Company.

  10. Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows

    NASA Astrophysics Data System (ADS)

    Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.

    2017-06-01

    The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.

  11. Unmanned aircraft systems image collection and computer vision image processing for surveying and mapping that meets professional needs

    NASA Astrophysics Data System (ADS)

    Peterson, James Preston, II

    Unmanned Aerial Systems (UAS) are rapidly blurring the lines between traditional and close range photogrammetry, and between surveying and photogrammetry. UAS are providing an economic platform for performing aerial surveying on small projects. The focus of this research was to describe traditional photogrammetric imagery and Light Detection and Ranging (LiDAR) geospatial products, describe close range photogrammetry (CRP), introduce UAS and computer vision (CV), and investigate whether industry mapping standards for accuracy can be met using UAS collection and CV processing. A 120-acre site was selected and 97 aerial targets were surveyed for evaluation purposes. Four UAS flights of varying heights above ground level (AGL) were executed, and three different target patterns of varying distances between targets were analyzed for compliance with American Society for Photogrammetry and Remote Sensing (ASPRS) and National Standard for Spatial Data Accuracy (NSSDA) mapping standards. This analysis resulted in twelve datasets. Error patterns were evaluated and reasons for these errors were determined. The relationship between the AGL, ground sample distance, target spacing and the root mean square error of the targets is exploited by this research to develop guidelines that use the ASPRS and NSSDA map standard as the template. These guidelines allow the user to select the desired mapping accuracy and determine what target spacing and AGL is required to produce the desired accuracy. These guidelines also address how UAS/CV phenomena affect map accuracy. General guidelines and recommendations are presented that give the user helpful information for planning a UAS flight using CV technology.
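    Two routine calculations implied by the study design, sketched under simplifying assumptions: ground sample distance (GSD) from pixel pitch, focal length, and flying height, and NSSDA accuracy at 95% confidence from check-point residuals (1.7308 × horizontal RMSE, 1.9600 × vertical RMSE). The camera parameters and residuals below are illustrative examples, not the thesis's sensor or data.

```python
# GSD and NSSDA accuracy sketch; numeric inputs are illustrative only.
import math

def ground_sample_distance(pixel_pitch_mm: float, focal_mm: float, agl_m: float) -> float:
    """GSD (m/pixel) = pixel pitch * flying height / focal length."""
    return pixel_pitch_mm * agl_m / focal_mm

def nssda_horizontal(dx, dy):
    """NSSDA horizontal accuracy at 95% confidence = 1.7308 * radial RMSE."""
    rmse_r = math.sqrt(sum(x * x + y * y for x, y in zip(dx, dy)) / len(dx))
    return 1.7308 * rmse_r

def nssda_vertical(dz):
    """NSSDA vertical accuracy at 95% confidence = 1.9600 * vertical RMSE."""
    rmse_z = math.sqrt(sum(z * z for z in dz) / len(dz))
    return 1.9600 * rmse_z

# Example: a 4.8-micron pixel, 16 mm lens flown at 60 m AGL,
# and small residuals at a handful of surveyed check points.
print(round(ground_sample_distance(0.0048, 16.0, 60.0), 3), "m/pixel")
print(round(nssda_horizontal([0.02, -0.03, 0.01], [0.01, 0.02, -0.02]), 3), "m")
print(round(nssda_vertical([0.04, -0.05, 0.03]), 3), "m")
```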

  12. GIS prospectivity mapping and 3D modeling validation for potential uranium deposit targets in Shangnan district, China

    NASA Astrophysics Data System (ADS)

    Xie, Jiayu; Wang, Gongwen; Sha, Yazhou; Liu, Jiajun; Wen, Botao; Nie, Ming; Zhang, Shuai

    2017-04-01

    Integrating multi-source geoscience information (such as geology, geophysics, geochemistry, and remote sensing) using GIS mapping is one of the key topics and frontiers in quantitative geosciences for mineral exploration. GIS prospective mapping and three-dimensional (3D) modeling can be used not only to extract exploration criteria and delineate metallogenetic targets but also to provide important information for the quantitative assessment of mineral resources. This paper uses the Shangnan district of Shaanxi province (China) as a case study area. GIS mapping and potential granite-hydrothermal uranium targeting were conducted in the study area combining weights of evidence (WofE) and concentration-area (C-A) fractal methods with multi-source geoscience information. 3D deposit-scale modeling using GOCAD software was performed to validate the shapes and features of the potential targets at the subsurface. The research results show that: (1) the known deposits have potential zones at depth, and the 3D geological models can delineate surface or subsurface ore-forming features, which can be used to analyze the uncertainty of the shape and feature of prospectivity mapping at the subsurface; (2) single geochemistry anomalies or remote sensing anomalies at the surface require combining the depth exploration criteria of geophysics to identify potential targets; and (3) the single or sparse exploration criteria zone with few mineralization spots at the surface has high uncertainty in terms of the exploration target.
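    A minimal weights-of-evidence (WofE) sketch for a single binary evidence layer, omitting the variance terms and the concentration-area (C-A) fractal step used in the study; the cell and deposit counts are invented for illustration and are not the Shangnan data.

```python
# Weights of evidence for one binary evidence layer on unit cells.
import math

def weights_of_evidence(n_cells: int, n_deposit: int,
                        n_evidence: int, n_evidence_and_deposit: int):
    """Return (W+, W-, contrast C) for a single binary evidence layer."""
    d, nd = n_deposit, n_cells - n_deposit
    bd = n_evidence_and_deposit               # evidence present, deposit present
    b_nd = n_evidence - bd                    # evidence present, no deposit
    nbd = d - bd                              # evidence absent, deposit present
    nb_nd = nd - b_nd                         # evidence absent, no deposit
    w_plus = math.log((bd / d) / (b_nd / nd))
    w_minus = math.log((nbd / d) / (nb_nd / nd))
    return w_plus, w_minus, w_plus - w_minus

# Example: 10,000 cells, 20 known occurrences, a geochemical anomaly covering
# 1,500 cells that contains 12 of the occurrences.
print(weights_of_evidence(10_000, 20, 1_500, 12))
```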

  13. Scientific Approaches | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    CPTAC employs two complementary scientific approaches, a "Targeting Genome to Proteome" (Targeting G2P) approach and a "Mapping Proteome to Genome" (Mapping P2G) approach, in order to address biological questions from data generated on a sample.

  14. Usability Evaluation of Public Web Mapping Sites

    NASA Astrophysics Data System (ADS)

    Wang, C.

    2014-04-01

    Web mapping sites are interactive maps that are accessed via webpages. With the rapid development of the Internet and the Geographic Information System (GIS) field, public web mapping sites are familiar to most people. Nowadays, people use these web mapping sites for various reasons, as an increasing number of maps and related map services are freely available to end users. The growing number of users of web mapping sites has, in turn, led to more usability studies. Usability Engineering (UE), for instance, is an approach for analyzing and improving the usability of websites through examining and evaluating an interface. In this research, the UE method was employed to explore usability problems of four public web mapping sites, analyze the problems quantitatively, and provide guidelines for future design based on the test results. Firstly, the development of usability studies is described, and several usability evaluation approaches such as Usability Engineering (UE), User-Centered Design (UCD), and Human-Computer Interaction (HCI) are briefly introduced. Then the method and procedure of the usability test experiments are presented in detail. In this usability evaluation experiment, four public web mapping sites (Google Maps, Bing Maps, MapQuest, Yahoo Maps) were chosen as the test websites, and 42 people with different GIS skills (test users or experts), gender (male or female), age, and nationality participated in the test, completing several test tasks in different teams. The test comprised three parts: a pretest background information questionnaire, several test tasks for quantitative statistics and progress analysis, and a posttest questionnaire. The pretest and posttest questionnaires focused on gaining qualitative, verbal explanations of the participants' actions, while the test tasks were designed to gather quantitative data on the errors and problems of the websites. The results, mainly from the test tasks, were then analyzed: the success rates for the different public web mapping sites were calculated, compared, and displayed in diagrams, and the answers from the questionnaires were classified and organized. Moreover, based on the analysis, this paper expands the discussion to layout, map visualization, map tools, search logic, and related aspects. Finally, the paper closes with some valuable guidelines and suggestions for the design of public web mapping sites, and limitations of this research are stated at the end.

  15. Concept mapping in a critical care orientation program: a pilot study to develop critical thinking and decision-making skills in novice nurses.

    PubMed

    Wahl, Stacy E; Thompson, Anita M

    2013-10-01

    Newly graduated registered nurses who were hired into a critical care intensive care unit showed a lack of critical thinking skills to inform their clinical decision-making abilities. This study evaluated the effectiveness of concept mapping as a teaching tool to improve critical thinking and clinical decision-making skills in novice nurses. A self-evaluation tool was administered before and after the learning intervention. The 25-item tool measured five key indicators of the development of critical thinking skills: problem recognition, clinical decision-making, prioritization, clinical implementation, and reflection. Statistically significant improvements were seen in 10 items encompassing all five indicators. Concept maps are an effective tool for educators to use in assisting novice nurses to develop their critical thinking and clinical decision-making skills. Copyright 2013, SLACK Incorporated.

  16. Alzheimer's disease master regulators analysis: search for potential molecular targets and drug repositioning candidates.

    PubMed

    Vargas, D M; De Bastiani, M A; Zimmer, E R; Klamt, F

    2018-06-23

    Alzheimer's disease (AD) is a multifactorial and complex neuropathology that involves impairment of many intricate molecular mechanisms. Despite recent advances, AD pathophysiological characterization remains incomplete, which hampers the development of effective treatments. In fact, currently, there are no effective pharmacological treatments for AD. Integrative strategies such as transcription regulatory network and master regulator analyses exemplify promising new approaches to study complex diseases and may help in the identification of potential pharmacological targets. In this study, we used transcription regulatory network and master regulator analyses on transcriptomic data of human hippocampus to identify transcription factors (TFs) that can potentially act as master regulators in AD. All expression profiles were obtained from the Gene Expression Omnibus database using the GEOquery package. A normal hippocampus transcription factor-centered regulatory network was reconstructed using the ARACNe algorithm. Master regulator analysis and two-tail gene set enrichment analysis were employed to evaluate the inferred regulatory units in AD case-control studies. Finally, we used a connectivity map adaptation to prospect new potential therapeutic interventions by drug repurposing. We identified TFs with already reported involvement in AD, such as ATF2 and PARK2, as well as possible new targets for future investigations, such as CNOT7, CSRNP2, SLC30A9, and TSC22D1. Furthermore, Connectivity Map Analysis adaptation suggested the repositioning of six FDA-approved drugs that can potentially modulate master regulator candidate regulatory units (Cefuroxime, Cyproterone, Dydrogesterone, Metrizamide, Trimethadione, and Vorinostat). Using a transcription factor-centered regulatory network reconstruction we were able to identify several potential molecular targets and six drug candidates for repositioning in AD. Our study provides further support for the use of bioinformatics tools as exploratory strategies in neurodegenerative diseases research, and also provides new perspectives on molecular targets and drug therapies for future investigation and validation in AD.

  17. The Salient Map Analysis for Research and Teaching (SMART) method: Powerful potential as a formative assessment in the biomedical sciences

    NASA Astrophysics Data System (ADS)

    Cathcart, Laura Anne

    This dissertation consists of two studies: 1) development and characterization of the Salient Map Analysis for Research and Teaching (SMART) method as a formative assessment tool and 2) a case study exploring how a paramedic instructor's beliefs about learners affect her utilization of the SMART method and vice versa. The first study explored: How can a novel concept map analysis method be designed as an effective formative assessment tool? The SMART method improves upon existing concept map analysis methods because it does not require hierarchically structured concept maps and it preserves the rich content of the maps instead of reducing each map down to a numerical score. The SMART method is performed by comparing a set of students' maps to each other and to an instructor's map. The resulting composite map depicts, in percentages and highlighted colors, the similarities and differences between all of the maps. Some advantages of the SMART method as a formative assessment tool include its ability to highlight changes across time, problematic or alternative conceptions, and patterns of student responses at a glance. Study two explored: How do a paramedic instructor's beliefs about students and learning affect---and become affected by---her use of the SMART method as a formative assessment tool? This case study of Angel, an expert paramedic instructor, begins to address a gap in the emergency medical services (EMS) education literature, which contains almost no research on teachers or pedagogy. Angel and I worked together as participant co-researchers (Heron & Reason, 1997) exploring the affordances of the SMART method. This study, based on those interactions with Angel, involved using open coding to identify themes (Strauss & Corbin, 1998) from Angel's views of students and use of the SMART method. Angel views learning as a sense-making process. She has a multi-faceted view of her students as novices and invests substantial time trying to understand their concept maps. Not only do these beliefs affect her use of the SMART method; in addition, her beliefs are refined through the use of the SMART method.

  18. Quantitative Analysis of Electro-Anatomical Maps: Application to an Experimental Model of Left Bundle Branch Block/Cardiac Resynchronization Therapy

    PubMed Central

    Duchateau, Nicolas; Butakoff, Constantine; Andreu, David; Fernández-Armenta, Juan; Bijnens, Bart; Berruezo, Antonio; Sitges, Marta; Camara, Oscar

    2017-01-01

    Electro-anatomical maps (EAMs) are commonly acquired in clinical routine for guiding ablation therapies. They provide voltage and activation time information on a 3-D anatomical mesh representation, making them useful for analyzing the electrical activation patterns in specific pathologies. However, the variability between the different acquisitions and anatomies hampers the comparison between different maps. This paper presents two contributions for the analysis of electrical patterns in EAM data from biventricular surfaces of cardiac chambers. The first contribution is an integrated automatic 2-D disk representation (2-D bull’s eye plot) of the left ventricle (LV) and right ventricle (RV) obtained with a quasi-conformal mapping from the 3-D EAM meshes, that allows an analysis of cardiac resynchronization therapy (CRT) lead positioning, interpretation of global (total activation time), and local indices (local activation time (LAT), surrogates of conduction velocity, inter-ventricular, and transmural delays) that characterize changes in the electrical activation pattern. The second contribution is a set of indices derived from the electrical activation: speed maps, computed from LAT values, to study the electrical wave propagation, and histograms of isochrones to analyze regional electrical heterogeneities in the ventricles. We have applied the proposed methods to look for the underlying physiological mechanisms of left bundle branch block (LBBB) and CRT, with the goal of optimizing the therapy by improving CRT response. To better illustrate the benefits of the proposed tools, we created a set of synthetically generated and fully controlled activation patterns, where the proposed representation and indices were validated. Then, the proposed analysis tools are used to analyze EAM data from an experimental swine model of induced LBBB with an implanted CRT device. We have analyzed and compared the electrical activation patterns at baseline, LBBB, and CRT stages in four animals: two without any structural disease and two with an induced infarction. By relating the CRT lead location with electrical dyssynchrony, we evaluated current hypotheses about lead placement in CRT and showed that optimal pacing sites should target the RV lead close to the apex and the LV one distant from it. PMID:29164019
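    A minimal illustration of the speed-map index mentioned above, using the common approximation speed ≈ 1/|∇LAT| on a regular 2-D grid with a synthetic planar wave; the paper's maps live on 3-D EAM meshes, so this is only a sketch of the index, not the published pipeline.

```python
# Speed map from local activation times (LAT) on a gridded, synthetic example.
import numpy as np

def speed_from_lat(lat_ms: np.ndarray, spacing_mm: float) -> np.ndarray:
    """Return conduction speed (mm/ms == m/s) from a gridded LAT map."""
    gy, gx = np.gradient(lat_ms, spacing_mm)        # ms per mm along each axis
    grad_mag = np.hypot(gx, gy)
    with np.errstate(divide="ignore"):
        return np.where(grad_mag > 0, 1.0 / grad_mag, np.nan)

# Synthetic planar wave travelling along x at 0.5 m/s on a 1 mm grid:
x = np.arange(0, 50.0, 1.0)                         # mm
lat = np.tile(x / 0.5, (20, 1))                     # ms (LAT = distance / speed)
print(np.nanmedian(speed_from_lat(lat, 1.0)))       # ~0.5
```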

  19. Telling Stories about the Changing Landscape: One Center's Evolution

    NASA Astrophysics Data System (ADS)

    Arnold, C. L., Jr.; Wilson, E. H.; Chadwick, C.; Dickson, D.

    2016-12-01

    Since its inception, the Center for Land Use Education and Research (CLEAR) at the University of Connecticut has had a strong applied research and public outreach focus. As a center that focuses on topics that virtually all have a geographic component, the intersection of Web and mapping technologies over the past decade has been an invaluable tool for communicating information. The primary target audience of this information is land use decision makers, who in New England are almost exclusively at the local (municipal) level and are often unpaid volunteers with little or no science background. Data-driven science communication focusing on this very worthy - and very needy - sector of the populace poses problems different from communicating with academic peers at one end of the spectrum, or the general public on the other end. The information must be understandable and accessible to non-technical users, yet specific and authoritative enough to inform decisions. CLEAR's approach to reaching this audience has evolved over the years in response to new internet and GIS technologies on the one hand, and internal deliberations on the other. A critical point was the 2004 public debut of the Center's Changing Landscape project, comprised of complex remotely-sensed land cover data: CLEAR principals decided to make the data publicly available via the Center website, but also to design a website to make the information accessible in as many ways, and for as many different audiences, as possible. This approach has had considerable success, as evidenced in the widespread use of the land cover information by communities, NGOs, federal and state agencies, and academia. Over the past several years, CLEAR has embraced the ESRI story map as a technological tool that embodies the Center's goal of "democratization" of science-based information through multifaceted accessibility. CLEAR's Story Map Gallery currently has six maps, covering a wide range of topics including the Changing Landscape project, black bear behavior and distribution, historical coastline changes, and social science research on the adoption of green infrastructure practices. More will be coming as both the story map format and the Center's projects grow and evolve.

  20. The PARIGA server for real time filtering and analysis of reciprocal BLAST results.

    PubMed

    Orsini, Massimiliano; Carcangiu, Simone; Cuccuru, Gianmauro; Uva, Paolo; Tramontano, Anna

    2013-01-01

    BLAST-based similarity searches are commonly used in several applications involving both nucleotide and protein sequences. These applications span from simple tasks, such as mapping sequences over a database, to more complex procedures such as clustering or annotation. When the amount of analysed data increases, manual inspection of BLAST results becomes a tedious procedure, and tools for parsing or filtering BLAST results for different purposes are then required. We describe here PARIGA (http://resources.bioinformatica.crs4.it/pariga/), a server that enables users to perform all-against-all BLAST searches on two sets of sequences selected by the user. Moreover, since it stores the two BLAST outputs in a python-serialized-objects database, results can be filtered according to several parameters in real time, without re-running the process and avoiding additional programming efforts. Results can be interrogated by the user using logical operations, for example to retrieve cases where two queries match the same targets, where sequences from the two datasets are reciprocal best hits, or where a query matches a target in multiple regions. The PARIGA web server is designed to be a helpful tool for managing the results of sequence similarity searches. The design and implementation of the server render all operations very fast and easy to use.

  1. Evolution of Force Sensing Technologies.

    PubMed

    Shah, Dipen

    2017-06-01

    In order to improve the procedural success and long-term outcomes of catheter ablation techniques for atrial fibrillation (AF), an important unfulfilled requirement is to create durable electrophysiologically complete lesions. Measurement of contact force (CF) between the catheter tip and the target tissue can guide physicians to optimise both mapping and ablation procedures. Contact force can affect lesion size and clinical outcomes following catheter ablation of AF. Force sensing technologies have matured since their advent several years ago, and now allow the direct measurement of CF between the catheter tip and the target myocardium in real time. In order to obtain complete durable lesions, catheter tip spatial stability and stable contact force are important. Suboptimal energy delivery, lesion density/contiguity and/or excessive wall thickness of the pulmonary vein-left atrial (PV-LA) junction may result in conduction recovery at these sites. Lesion assessment tools may help predict and localise electrical weak points resulting in conduction recovery during and after ablation. There is increasing clinical evidence to show that optimal use of CF sensing during ablation can reduce acute PV re-conduction, although prospective randomised studies are desirable to confirm long-term favourable clinical outcomes. In combination with optimised lesion assessment tools, contact force sensing technology has the potential to become the standard of care for all patients undergoing AF catheter ablation.

  2. Parametric tools over crowdsourced maps as means for participatory consideration of environmental issues in cities

    NASA Astrophysics Data System (ADS)

    Montoya, Paula; Ballesteros, José; Gervás, Pablo

    2015-04-01

    The increasing complexity of space use and resource cycles in cities demands an understanding of the built environment as "ecological": enabling mutation while remaining balanced and biologically sustainable. Designing man's environment is no longer a question of defining types, but rather an act of inserting changes within a complex system. Architecture and urban planning have become increasingly aware of their condition as system-oriented disciplines, and they are in the process of developing the necessary languages, design tools, and alliances. We will argue the relevance of parametric maps as one of the most powerful of those tools, in terms of their potential for adaptive prototype design, convergence of disciplines, and collaborative work. Cities need to change in order to survive. As the main human landscape (by 2050, 75% of the world's population will live in urban areas), cities follow biological patterns of behaviour, constantly replacing their cells, renovating infrastructure systems, and refining methods for energy provision and waste management. They need to adapt constantly. As responsive entities, they develop their own protocols for reacting to environmental change and face the increasing pressure of several issues related to scale: population, mobility, water and energy supply, pollution... The representation of these urban issues on maps becomes crucial for understanding and addressing them in design. Maps enhanced with parametric tools are relational: they not only register environmental dynamics but also allow adaptation of the system through interwoven parameters of mutation. Citizens are taking part in decisions and becoming aware of their role as urban experts in a bottom-up design process of the cities where they live. Modern tools for dynamic visualisation and collaborative editing of maps have an important role to play in this process. More and more people consult maps on hand-held devices as part of their daily routine. The advent of open-access collaborative maps allows them to actively extend and modify these maps by uploading data of their own design. This can generate an immense amount of unique information that is publicly available. The work of architects, planners, and political agents can be informed by the contributions of a community of volunteer cartographers. Counter-cartographies built through collaboration arise from spontaneous processes of knowledge and data collection, and demand continuous non-commercial revision. Both scientific and non-academic users have direct access to geostrategic information and actively take part in exploring, recording, and inserting their contrasted contributions into the way in which our world is described. This proposal explores the idea of a counter-cartography as a collection of maps that unveil territorial environmental conditions different from those shown in official maps. By using parametric tools we can incorporate information of this type directly into architectural documents and generate interlaced changes in the design. A parametric map is a flexible yet accurate tool for design and discovery: it integrates multiple particular views into a precise physical context that culminates in a generative design. A complex map worked in this way is gradually becoming the ultimate document for designing the city in an integrated manner.

  3. Automatic Rock Detection and Mapping from HiRISE Imagery

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Adams, Douglas S.; Cheng, Yang

    2008-01-01

    This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has been evolved into a production tool that can be used by engineers and geologists with minor training.

  4. Linking the Kentucky K-PREP Assessments to NWEA MAP Tests

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  5. Linking the Kansas KAP Assessments to NWEA MAP Tests

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  6. Linking the PARCC Assessments to NWEA MAP Tests for Illinois

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  7. Linking the Texas STAAR Assessments to NWEA MAP Tests

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  8. Linking the Nebraska NeSA Assessments to NWEA MAP Tests

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  9. Linking the Alaska AMP Assessments to NWEA MAP Tests

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  10. Linking the PARCC Assessments to NWEA MAP Tests for New Mexico

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  11. Linking the Arizona AZMERIT Assessments to NWEA MAP Tests

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from the Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  12. Improving the Usefulness of Concept Maps as a Research Tool for Science Education

    ERIC Educational Resources Information Center

    Van Zele, Els; Lenaerts, Josephina; Wieme, Willem

    2004-01-01

    The search for authentic science research tools to evaluate student understanding in a hybrid learning environment with a large multimedia component has resulted in the use of concept maps as a representation of student's knowledge organization. One hundred and seventy third-semester introductory university-level engineering students represented…

  13. BACCardI--a tool for the validation of genomic assemblies, assisting genome finishing and intergenome comparison.

    PubMed

    Bartels, Daniela; Kespohl, Sebastian; Albaum, Stefan; Drüke, Tanja; Goesmann, Alexander; Herold, Julia; Kaiser, Olaf; Pühler, Alfred; Pfeiffer, Friedhelm; Raddatz, Günter; Stoye, Jens; Meyer, Folker; Schuster, Stephan C

    2005-04-01

    We provide the graphical tool BACCardI for the construction of virtual clone maps from standard assembler output files or BLAST based sequence comparisons. This new tool has been applied to numerous genome projects to solve various problems including (a) validation of whole genome shotgun assemblies, (b) support for contig ordering in the finishing phase of a genome project, and (c) intergenome comparison between related strains when only one of the strains has been sequenced and a large insert library is available for the other. The BACCardI software can seamlessly interact with various sequence assembly packages. Genomic assemblies generated from sequence information need to be validated by independent methods such as physical maps. The time-consuming task of building physical maps can be circumvented by virtual clone maps derived from read pair information of large insert libraries.

  14. Ground-penetrating radar--A tool for mapping reservoirs and lakes

    USGS Publications Warehouse

    Truman, C.C.; Asmussen, L.E.; Allison, H.D.

    1991-01-01

    Ground-penetrating radar was evaluated as a tool for mapping reservoir and lake bottoms and providing stage-storage information. An impulse radar was used on a 1.4-ha (3.5-acre) reservoir with 31 transects located 6.1 m (20 feet) apart. Depth of water and lateral extent of the lake bottom were accurately measured by ground-penetrating radar. A linear (positive) relationship existed between measured water depth and ground-penetrating radar-determined water depth (R2=0.989). Ground-penetrating radar data were used to create a contour map of the lake bottom. Relationships between water (contour) elevation and water surface area and volume were established. Ground-penetrating radar proved to be a useful tool for mapping lakes, detecting lake bottom variations, locating old stream channels, and determining water depths. The technology provides accurate, continuous profile data in a relatively short time compared to traditional surveying and depth-sounding techniques.
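    For reference, the depth estimate behind such GPR profiles follows d = v·t/2, where v = c/√εr is the wave velocity in the medium; the textbook relative permittivity of about 80 for fresh water gives v ≈ 0.033 m/ns. The travel times in the sketch below are made up for illustration and are not the study's data.

```python
# Water depth from GPR two-way travel time through a single medium.
C_M_PER_NS = 0.2998          # speed of light in m/ns

def depth_from_twt(twt_ns: float, rel_permittivity: float = 80.0) -> float:
    """Depth (m) from two-way travel time (ns): d = v * t / 2, v = c / sqrt(eps_r)."""
    v = C_M_PER_NS / rel_permittivity ** 0.5      # ~0.0335 m/ns in fresh water
    return v * twt_ns / 2.0

for twt in (30.0, 90.0, 180.0):                   # illustrative travel times, ns
    print(f"{twt:6.1f} ns  ->  {depth_from_twt(twt):5.2f} m")
```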

  15. THE HOLDRIDGE LIFE ZONES OF THE CONTERMINOUS UNITED STATES IN RELATION TO ECOSYSTEM MAPPING

    EPA Science Inventory

    Our main goals were to develop a map of the life zones for the conterminous United States, based on the Holdridge Life Zone system as a tool for ecosystem mapping, and to compare the map of Holdridge life zones with other global vegetation classification and mapping efforts.
    ...

  16. Using E-Maps to Organize and Navigate Online Content

    ERIC Educational Resources Information Center

    Ruffini, Michael F.

    2008-01-01

    Computer-generated mind maps, or e-maps, provide an outstanding e-learning tool for organizing and navigating web-based content and files. Considerable research indicates the effectiveness of using graphic organizers such as mind maps to facilitate meaningful learning. Tony Buzan and Barry Buzan argue that mind maps better harness the way the…

  17. GIS tool to locate major Sikh temples in USA

    NASA Astrophysics Data System (ADS)

    Sharma, Saumya

    This is a GIS-based interactive tool with a graphical user interface that locates the major Sikh temples of the USA on a map. The tool uses the Java programming language along with MOJO (Map Objects Java Objects), provided by ESRI, the organization that produces the GIS software. It also integrates some of Google's APIs, such as the Google Translator API. The application tells users about the origin of Sikhism in India and the USA and presents the major Sikh temples in each state of the USA, with their location, name, and detailed information through their websites. The primary purpose of the application is to make people aware of this religion and culture. The tool can also measure the distance between two temple points on the map and display the result in miles and kilometers. In addition, there is added support to convert each temple's website language from English to Punjabi or any other language using a language converter tool, so that people from different nationalities can understand the culture. Clicking on a point on the map opens a new window showing a picture of the temple and a hyperlink that redirects to the website of that particular temple. The tool also contains links to dance, music, and history resources, as well as a help menu to guide users in using the software efficiently.
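    The distance feature described above amounts to a great-circle computation; the sketch below shows a standard haversine implementation reporting kilometers and miles. It is an illustrative Python version, not the thesis's Java/MOJO code, and the coordinates are approximate example locations.

```python
# Great-circle (haversine) distance between two latitude/longitude points.
import math

EARTH_RADIUS_KM = 6371.0
KM_PER_MILE = 1.609344

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

km = haversine_km(37.69, -121.78, 34.05, -118.24)   # roughly Livermore to Los Angeles
print(f"{km:.1f} km = {km / KM_PER_MILE:.1f} miles")
```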

  18. Utility assessment of a map-based online geo-collaboration tool.

    PubMed

    Sidlar, Christopher L; Rinner, Claus

    2009-05-01

    Spatial group decision-making processes often include both informal and analytical components. Discussions among stakeholders or planning experts are an example of an informal component. When participants discuss spatial planning projects they typically express concerns and comments by pointing to places on a map. The Argumentation Map model provides a conceptual basis for collaborative tools that enable explicit linkages of arguments to the places to which they refer. These tools allow for the input of explicitly geo-referenced arguments as well as the visual access to arguments through a map interface. In this paper, we will review previous utility studies in geo-collaboration and evaluate a case study of a Web-based Argumentation Map application. The case study was conducted in the summer of 2005 when student participants discussed planning issues on the University of Toronto St. George campus. During a one-week unmoderated discussion phase, 11 participants wrote 60 comments on issues such as safety, facilities, parking, and building aesthetics. By measuring the participants' use of geographic references, we draw conclusions on how well the software tool supported the potential of the underlying concept. This research aims to contribute to a scientific approach to geo-collaboration in which the engineering of novel spatial decision support methods is complemented by a critical assessment of their utility in controlled, realistic experiments.

  19. Storm Prediction Center Fire Weather Forecasts

    Science.gov Websites

    Fire Weather Graphical Composite Maps: forecast and observational maps for various fire ...

  20. RadMap

    EPA Pesticide Factsheets

    RadMap is an interactive desktop tool featuring a nationwide geographic information systems (GIS) map of long-term radiation monitoring locations across the United States with access to key information about the monitor and the area surrounding it.

  1. A phenomenographic case study: Concept maps from the perspectives of middle school students

    NASA Astrophysics Data System (ADS)

    Saglam, Yilmaz

    The objective of this study was to investigate the experiences of middle school students when concept maps were used as a learning tool. Twenty-nine students' written responses, concept maps and videotapes were analyzed. Out of the 29 students, thirteen were interviewed using a semi-structured and open-ended interview protocol. The students' initial written responses provided us with their initial reactions to concept maps. The videotapes captured the students' behavior and interpersonal interactions. The interviews probed students': (1) knowledge about drawing concept maps, (2) perception of the meaning and usefulness of concept maps, and (3) attitudes towards concept maps. The results indicated that the students viewed concept maps as useful tools in learning science. They believed that concept maps organized and summarized the information, which thereby helped them understand the topic easily. They also believed that concept maps had some cognitive benefits. However, the students viewed concept maps as hard to construct because it was difficult for them to think of related concepts. The students' initial written responses, interviews and videotapes indicated that the students seemed to see both positive and negative aspects of concept maps. Some students had more positive and some had more negative attitudes.

  2. Dynamics of paramagnetic agents by off-resonance rotating frame technique in the presence of magnetization transfer effect

    NASA Astrophysics Data System (ADS)

    Zhang, Huiming; Xie, Yang

    2007-02-01

    The simple method for measuring the rotational correlation time of paramagnetic ion chelates via the off-resonance rotating frame technique is challenged in vivo by the magnetization transfer effect. A theoretical model for the spin relaxation of water protons in the presence of paramagnetic ion chelates and the magnetization transfer effect is described. This model considers the competitive relaxation of water protons by the paramagnetic relaxation pathway and the magnetization transfer pathway. The influence of magnetization transfer on the total residual z-magnetization has been quantitatively evaluated in the context of the magnetization map and various difference magnetization profiles for macromolecule-conjugated Gd-DTPA in cross-linked protein gels. The numerical simulations and experimental validations confirm that the rotational correlation time for the paramagnetic ion chelates can be measured even in the presence of strong magnetization transfer. This spin relaxation model also provides novel approaches to enhance the detection sensitivity for paramagnetic labeling by suppressing the spin relaxation caused by magnetization transfer. The inclusion of the magnetization transfer effect allows us to use the magnetization map as a simulation tool to design efficient paramagnetic labeling targeting specific tissues, to design experiments running at low RF power deposition, and to optimize the sensitivity for detecting paramagnetic labeling. Thus, the presented method will be a very useful tool for in vivo applications such as molecular imaging via paramagnetic labeling.

  3. Soil Security Assessment of Tasmania

    NASA Astrophysics Data System (ADS)

    Field, Damien; Kidd, Darren; McBratney, Alex

    2017-04-01

    The concept of soil security aligns well with the aspirational and marketing policies of the Tasmanian Government, where increased agricultural expansion through new irrigation schemes and multiple-use State-managed production forests co-exists beside pristine World Heritage conservation land, a major drawcard of the economically important tourism industry. Regarding the Sustainable Development Goals (SDGs), this could be seen as an exemplar of the emerging tools for quantification of spatial soil security to effectively protect our soil resource in terms of food (SDG 2.4, 3.9) and water security (SDG 6.4, 6.6), biodiversity maintenance and safeguarding fragile ecosystems (SDG 15.3, 15.9). The recent development and application of Digital Soil Mapping and Assessment capacities in Tasmania, to stimulate agricultural production and better target appropriate soil resources, has formed the foundational systems that enable the first efforts in quantifying and mapping Tasmanian soil security, in particular the five Soil Security dimensions (Capability, Condition, Capital, Codification and Connectivity). However, to provide a measure of overall soil security, it was necessary to separately assess the State's three major soil uses: Agriculture, Conservation and Forestry. These products will provide an indication of where different activities are sustainable or at risk, where more soil data are needed, and a tool to better plan for a State requiring optimal food and fibre production without depleting its natural soil resources or impacting the fragile ecosystems supporting environmental benefits and the tourism industry.

  4. magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation

    NASA Astrophysics Data System (ADS)

    Angleraud, Christophe

    2014-06-01

    The ever-increasing amount of data and processing capability - following the well-known Moore's law - is challenging the way scientists and engineers currently exploit large datasets. Scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross-discipline fertilization. On the other hand, Geographic Information Systems allow nice and visually appealing maps to be built, but these often become very cluttered as more layers are added. Moreover, the introduction of time as a fourth analysis dimension, to allow the analysis of time-dependent phenomena such as meteorological or climate models, is encouraging real-time data exploration techniques in which spatio-temporal points of interest are detected through the human brain's integration of moving images. Magellium has been involved in high-performance image processing chains for satellite image processing, as well as scientific signal analysis and geographic information management, since its creation in 2003. We believe that recent work on big data, GPU and peer-to-peer collaborative processing can open a new breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping and understanding. The magHD (for Magellium Hyper-Dimension) project aims at developing software solutions that will bring highly interactive tools for complex dataset analysis and exploration to commodity hardware, targeting small to medium-scale clusters with expansion capabilities to large cloud-based clusters.

  5. Hawaiian Volcano Observatory seismic data, January to March 2009

    USGS Publications Warehouse

    Nakata, Jennifer S.; Okubo, Paul G.

    2010-01-01

    Figures 11–14 are maps showing computer-located hypocenters. The maps were generated using the Generic Mapping Tools (GMT), found at http://gmt.soest.hawaii.edu/ (last accessed 01/22/2010), in place of traditional QPLOT maps.
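
    The same kind of hypocenter map can be sketched from Python through PyGMT, the Python interface to GMT. The region, coordinates and symbol styling below are illustrative assumptions, not the values used for Figures 11-14.

    ```python
    import pygmt  # third-party package: Python interface to the Generic Mapping Tools

    # Hypothetical hypocenter coordinates near the Island of Hawai'i (decimal degrees).
    lons = [-155.28, -155.45, -155.60]
    lats = [19.40, 19.55, 19.70]

    fig = pygmt.Figure()
    fig.basemap(region=[-156.2, -154.7, 18.8, 20.3], projection="M12c", frame=True)
    fig.coast(shorelines="1/0.5p", land="gray90", water="lightblue")
    fig.plot(x=lons, y=lats, style="c0.15c", fill="red", pen="black")  # 'fill' is 'color' in older PyGMT releases
    fig.savefig("hypocenters.png")
    ```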

  6. Genome contact map explorer: a platform for the comparison, interactive visualization and analysis of genome contact maps

    PubMed Central

    Kumar, Rajendra; Sobhy, Haitham

    2017-01-01

    Hi-C experiments generate data in the form of large genome contact maps (Hi-C maps). These show that chromosomes are arranged in a hierarchy of three-dimensional compartments. But to understand how these compartments form and by how much they affect genetic processes such as gene regulation, biologists and bioinformaticians need efficient tools to visualize and analyze Hi-C data. However, this is technically challenging because these maps are big. In this paper, we remedy this problem, partly by implementing an efficient file format, and present the genome contact map explorer platform. Apart from tools to process Hi-C data, such as normalization methods and a programmable interface, we made a graphical interface that lets users browse, scroll and zoom Hi-C maps to visually search for patterns in the Hi-C data. In the software, it is also possible to browse several maps simultaneously and plot related genomic data. The software is openly accessible to the scientific community. PMID:28973466
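
    As a generic illustration only (not the genome contact map explorer's own file format or API), a small dense contact matrix can be rendered on a log scale with numpy and matplotlib; the matrix below is synthetic.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic contact matrix for one chromosome at a fixed bin size: counts decay
    # with genomic distance, roughly mimicking the look of a real Hi-C map.
    rng = np.random.default_rng(0)
    n_bins = 200
    distance = np.abs(np.subtract.outer(np.arange(n_bins), np.arange(n_bins)))
    contacts = rng.poisson(lam=1.0 + 50.0 / (1.0 + distance))

    # Log-scale the counts so both short- and long-range contacts remain visible.
    plt.imshow(np.log1p(contacts), cmap="YlOrRd", origin="lower")
    plt.colorbar(label="log(1 + contact count)")
    plt.xlabel("genomic bin")
    plt.ylabel("genomic bin")
    plt.savefig("toy_hic_map.png", dpi=150)
    ```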

  7. S-MART, a software toolbox to aid RNA-Seq data analysis.

    PubMed

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the whole biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour for most queries, even for gigabytes of data. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses of their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci.

  8. S-MART, A Software Toolbox to Aid RNA-seq Data Analysis

    PubMed Central

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the whole biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour for most queries, even for gigabytes of data. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses of their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci. PMID:21998740

  9. Framework See-Think as a Tool for Crowdsourcing Support - Case Study on Crisis Management

    NASA Astrophysics Data System (ADS)

    Netek, R.; Panek, J.

    2016-06-01

    See-Think-Do is a framework originally used as a marketing approach for services and products on the Internet. Customers can be classified into three groups according to their involvement, from potential users to real customers. The article presents an idea of public involvement in community mapping at three levels: "See"—almost any user; "Think"—potential contributors; and "Do"—interested users. The case study implements the See-Think-Do framework as an awareness-based approach for The Crisis Map of the Czech Republic, an Ushahidi-based crowdsourcing platform for sharing spatial and multimedia information during crisis situations, e.g. the disaster floods in 2013. While current crisis projects engage the public in mapping only at the onset of a disaster, according to See-Think-Do any user can be considered a potential contributor even during the dormant period. The focus is put on the "See" and "Think" groups of contributors, which are currently ignored. The objective of this paper is to summarize approaches (social networks, mass media, emailing, gamification, …) and tools (GIT/GIS, ICT, multimedia) for increasing awareness of the project during the resting phase, which recruits a higher number of both active and passive users during a disaster. It also allows training in ICT, cartographical, spatial and GIS skills in a non-stressful way, as well as the targeting of specific operators. Volunteers from the "Think" group may be used for data processing or rectification, and GIS professionals from the "Do" group for data verification. The results indicate that contributors with already established skills and the required literacy (interface, data uploading) provide data faster and more accurately, and that the usability of the project increases based on users' comments.

  10. Ghost analysis visualization techniques for complex systems: examples from the NIF Final Optics Assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beer, G K; Hendrix, J L; Rowe, J

    1998-06-26

    The stray light or "ghost" analysis of the National Ignition Facility's (NIF) Final Optics Assembly (FOA) has proved to be one of the most complex ghost analyses ever attempted. The NIF FOA consists of a bundle of four beam lines that: 1) provides the vacuum seal to the target chamber, 2) converts 1ω to 3ω light, 3) focuses the light on the target, 4) separates a fraction of the 3ω beam for energy diagnostics, 5) separates the three wavelengths to diffract unwanted 1ω & 2ω light away from the target, 6) provides spatial beam smoothing, and 7) provides a debris barrier between the target chamber and the switchyard mirrors. The three wavelengths of light and seven optical elements with three diffractive optic surfaces generate three million ghosts through 4th order. Approximately 24,000 of these ghosts have peak fluence exceeding 1 J/cm2. The sheer number of ghost paths requires a visualization method that allows overlapping ghosts on optics and mechanical components to be summed and then mapped to the optical and mechanical component surfaces in 3D space. This paper addresses the following aspects of the NIF Final Optics ghost analysis: 1) materials issues for stray light mitigation, 2) limitations of current software tools (especially in modeling diffractive optics), 3) computer resource limitations affecting automated coherent raytracing, 4) folding the stray light analysis into the opto-mechanical design process, 5) analysis and visualization tools ranging from simple hand calculations to specialized stray light analysis computer codes, and 6) attempts at visualizing these ghosts, one using a CAD model and another using a high-end data visualization software approach.

  11. New Tool Quantitatively Maps Minority-Carrier Lifetime of Multicrystalline Silicon Bricks (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2011-11-01

    NREL's new imaging tool could provide manufacturers with insight into their processes. Scientists at the National Renewable Energy Laboratory (NREL) have used capabilities within the Process Development and Integration Laboratory (PDIL) to generate quantitative minority-carrier lifetime maps of multicrystalline silicon (mc-Si) bricks. This feat has been accomplished by using the PDIL's photoluminescence (PL) imaging system in conjunction with transient lifetime measurements obtained using a custom NREL-designed resonance-coupled photoconductive decay (RCPCD) system. PL imaging can obtain rapid high-resolution images that provide a qualitative assessment of the material lifetime, with the lifetime proportional to the pixel intensity. In contrast, the RCPCD technique provides a fast quantitative measure of the lifetime at lower resolution and penetrates millimeters into the mc-Si brick, providing information on bulk lifetimes and material quality. This approach contrasts with commercially available minority-carrier lifetime mapping systems that use microwave conductivity measurements; such measurements are dominated by surface recombination and lack information on the material quality within the bulk of the brick. By combining these two complementary techniques, we obtain high-resolution lifetime maps at very fast data acquisition times, attributes necessary for a production-based diagnostic tool. These bulk lifetime measurements provide manufacturers with invaluable feedback on their silicon ingot casting processes. NREL has been applying the PL imaging of lifetime in mc-Si bricks in collaboration with a U.S. photovoltaic industry partner through Recovery Act Funded Project ARRA T24. NREL developed a new tool to quantitatively map minority-carrier lifetime of multicrystalline silicon bricks by using photoluminescence imaging in conjunction with resonance-coupled photoconductive decay measurements. Researchers are not hindered by surface recombination and can look deeper into the material to map bulk lifetimes. The tool is being applied to silicon bricks in a project collaborating with a U.S. photovoltaic industry partner. Photovoltaic manufacturers can use the NREL tool to obtain valuable feedback on their silicon ingot casting processes.

  12. A tool for the estimation of the distribution of landslide area in R

    NASA Astrophysics Data System (ADS)

    Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.

    2012-04-01

    We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings, and in most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result due to convergence problems. The two tested models (Double Pareto and Inverse Gamma) produced very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases. A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery. The tool can also be used to evaluate the probability density and the frequency density of landslide volume.
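
    A minimal Python sketch of the density-estimation idea is given below, using synthetic landslide areas rather than a real inventory and scipy in place of the authors' R/WPS implementation.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic landslide areas in square meters (a real analysis would use an inventory map).
    rng = np.random.default_rng(1)
    areas = stats.invgamma.rvs(a=1.4, scale=800.0, size=500, random_state=rng)

    # Non-parametric estimate: Gaussian kernel density of log10(area).
    log_area = np.log10(areas)
    kde = stats.gaussian_kde(log_area)
    grid = np.linspace(log_area.min(), log_area.max(), 200)
    density = kde(grid)

    # Parametric estimate: maximum-likelihood fit of an Inverse Gamma model to the areas.
    shape, loc, scale = stats.invgamma.fit(areas, floc=0.0)
    print(f"Inverse Gamma MLE: shape={shape:.2f}, scale={scale:.1f}")
    ```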

  13. Time-Resolved Spectroscopy and Near Infrared Imaging for Prostate Cancer Detection: Receptor-targeted and Native Biomarker

    NASA Astrophysics Data System (ADS)

    Pu, Yang

    Optical spectroscopy and imaging using near-infrared (NIR) light provide powerful tools for non-invasive detection of cancer in tissue. Optical techniques are capable of quantitative reconstruction of maps of tissue absorption and scattering properties, and thus can map in vivo the differences in the content of certain marker chromophores and/or fluorophores in normal and cancerous tissues (for example: water, tryptophan, collagen and NADH contents). Potential clinical applications of optical spectroscopy and imaging include functional tumor detection and photothermal therapeutics. Optical spectroscopy and imaging apply contrast from intrinsic tissue chromophores such as water, collagen and NADH, and from extrinsic optical contrast agents such as Indocyanine Green (ICG), to distinguish diseased tissue from normal tissue. Fluorescence spectroscopy and imaging also give high sensitivity and specificity for biomedical diagnosis. Recent developments in specific-targeting fluorophores, such as small receptor-targeted dye-peptide conjugate contrast agents, offer high contrast between normal and cancerous tissues and hence a promising route to early tumor detection. This thesis focuses on a study to distinguish cancerous prostate tissue from normal prostate tissue, with the enhancement of specific receptor-targeted prostate cancer contrast agents, using optical spectroscopy and imaging techniques. The scattering and absorption coefficients and the anisotropy factor of cancerous and normal prostate tissues were investigated first, as the basis for biomedical diagnostics and optical imaging. Building on an understanding of the receptors over-expressed by prostate cancer cells and the molecular targeting mechanism of their ligands, two small ICG-derivative dye-peptides, namely Cypate-Bombesin Peptide Analogue Conjugate (Cybesin) and Cypate-Octreotate Peptide Conjugate (Cytate), were applied to study their clinical potential for human prostate cancer detection. In this work, the steady-state and time-resolved fluorescence spectroscopy of Cybesin (Cytate) in solution, and in cancerous and normal prostate tissues, was studied. It was found that more Cybesin (Cytate) was taken up in cancerous prostate tissue than in normal tissue. The preferential uptake of Cybesin (Cytate) in cancerous tissue was used to image and distinguish cancerous areas from normal tissue. To investigate the rotational dynamics and fluorescence polarization anisotropy of the contrast agents in prostate tissues, an analytical model was used to extract the rotational times and polarization anisotropies, both of which were higher in Cybesin (Cytate)-stained cancerous prostate tissue than in normal tissue. These differences reflect changes in the microstructure of cancerous and normal tissues and their different binding affinities for the contrast agents. The results indicate that the use of optical spectroscopy and imaging combined with receptor-targeted contrast agents is a valuable tool to study microenvironmental changes of tissue and to detect prostate cancer at an early stage.
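
    For reference, the textbook single-exponential model commonly used to extract rotational correlation times from time-resolved anisotropy data is shown below; the thesis' exact analytical model is not reproduced here, so this is only the standard form, with r_0 the limiting anisotropy and theta the rotational correlation time.

    ```latex
    % Standard single-exponential fluorescence anisotropy decay (textbook model, not
    % necessarily the exact analytical model used in the thesis).
    r(t) = \frac{I_{\parallel}(t) - I_{\perp}(t)}{I_{\parallel}(t) + 2 I_{\perp}(t)} = r_0 \, e^{-t/\theta}
    ```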

  14. Robust Small Target Co-Detection from Airborne Infrared Image Sequences.

    PubMed

    Gao, Jingli; Wen, Chenglin; Liu, Meiqin

    2017-09-29

    In this paper, a novel infrared target co-detection model combining the self-correlation features of backgrounds and the commonality features of targets in the spatio-temporal domain is proposed to detect small targets in a sequence of infrared images with complex backgrounds. Firstly, a dense target extraction model based on nonlinear weights is proposed, which suppresses the image background and enhances small targets better than weights based on singular values. Secondly, a sparse target extraction model based on entry-wise weighted robust principal component analysis is proposed. The entry-wise weight adaptively incorporates a structural prior in terms of local weighted entropy; thus, it can extract real targets accurately and suppress background clutter efficiently. Finally, the commonality of targets in the spatio-temporal domain is used to construct a target refinement model for false alarm suppression and target confirmation. Since real targets can appear in both the dense and sparse reconstruction maps of a single frame and form trajectories after tracklet association of consecutive frames, the location correlation of the dense and sparse reconstruction maps for a single frame, together with tracklet association of the location correlation maps for successive frames, has a strong ability to discriminate between small targets and background clutter. Experimental results demonstrate that the proposed small target co-detection method can not only suppress background clutter effectively, but also detect targets accurately even in the presence of target-like interference.
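
    As a rough, simplified stand-in for the entry-wise weights based on local weighted entropy (the paper's exact weighting scheme is not reproduced here), the sketch below computes a per-pixel local Shannon entropy map with plain numpy; the window size and bin count are arbitrary.

    ```python
    import numpy as np

    def local_entropy_map(image, window=5, bins=16):
        """Per-pixel Shannon entropy of grey levels in a sliding window (simplified illustration)."""
        pad = window // 2
        padded = np.pad(image, pad, mode="reflect")
        h, w = image.shape
        out = np.zeros((h, w), dtype=float)
        lo, hi = float(image.min()), float(image.max())
        for i in range(h):
            for j in range(w):
                patch = padded[i:i + window, j:j + window]
                hist, _ = np.histogram(patch, bins=bins, range=(lo, hi))
                p = hist / hist.sum()
                p = p[p > 0]
                out[i, j] = -np.sum(p * np.log2(p))
        return out

    # Toy frame: a flat noisy background with one small bright target.
    frame = 0.1 * np.random.default_rng(2).random((64, 64))
    frame[30:32, 40:42] += 1.0
    weights = local_entropy_map(frame)
    print(weights.max(), weights.mean())
    ```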

  15. Using the intervention mapping protocol to reduce European preschoolers' sedentary behavior, an application to the ToyBox-Study.

    PubMed

    De Decker, Ellen; De Craemer, Marieke; De Bourdeaudhuij, Ilse; Verbestel, Vera; Duvinage, Kristin; Iotova, Violeta; Grammatikaki, Evangelia; Wildgruber, Andreas; Mouratidou, Theodora; Manios, Yannis; Cardon, Greet

    2014-02-19

    High levels of sedentary behavior are often measured in preschoolers, but only a few interventions have been developed to counteract this. Furthermore, detailed descriptions of interventions in preschoolers targeting different forms of sedentary behavior could not be located in the literature. The aim of the present paper was to describe the different steps of the Intervention Mapping Protocol used towards the development of an intervention component of the ToyBox-study focusing on decreasing preschoolers' sedentary behavior. The ToyBox-study focuses on the prevention of overweight in 4- to 6-year-old children by implementing a multi-component kindergarten-based intervention with family involvement in six different European countries. Applying the Intervention Mapping Protocol, six different steps were systematically completed for the structured planning and development of the intervention. A literature search and results from focus groups with parents/caregivers and kindergarten teachers were used as a guide during the development of the intervention and the intervention materials. The application of the different steps in the Intervention Mapping Protocol resulted in the creation of matrices of change objectives, followed by the selection of practical applications for five different intervention tools that could be used at the individual level of the preschool child, at the interpersonal level (i.e., parents/caregivers) and at the organizational level (i.e., kindergarten teachers). No cultural differences regarding preschoolers' sedentary behavior were identified between the participating countries during the focus groups, so cultural and local adaptations of the intervention materials were not necessary to improve the adoption and implementation of the intervention. A systematic and evidence-based approach was used for the development of this kindergarten-based family-involved intervention targeting preschoolers, with the inclusion of parental involvement. The application of the Intervention Mapping Protocol may lead to the development of more effective interventions. The detailed intervention matrices that were developed as part of the ToyBox-study can be used by other researchers as an aid in order to avoid repetitive work for the design of similar interventions.

  16. Using the intervention mapping protocol to reduce European preschoolers’ sedentary behavior, an application to the ToyBox-Study

    PubMed Central

    2014-01-01

    Background High levels of sedentary behavior are often measured in preschoolers, but only a few interventions have been developed to counteract this. Furthermore, detailed descriptions of interventions in preschoolers targeting different forms of sedentary behavior could not be located in the literature. The aim of the present paper was to describe the different steps of the Intervention Mapping Protocol used towards the development of an intervention component of the ToyBox-study focusing on decreasing preschoolers’ sedentary behavior. The ToyBox-study focuses on the prevention of overweight in 4- to 6-year-old children by implementing a multi-component kindergarten-based intervention with family involvement in six different European countries. Methods Applying the Intervention Mapping Protocol, six different steps were systematically completed for the structured planning and development of the intervention. A literature search and results from focus groups with parents/caregivers and kindergarten teachers were used as a guide during the development of the intervention and the intervention materials. Results The application of the different steps in the Intervention Mapping Protocol resulted in the creation of matrices of change objectives, followed by the selection of practical applications for five different intervention tools that could be used at the individual level of the preschool child, at the interpersonal level (i.e., parents/caregivers) and at the organizational level (i.e., kindergarten teachers). No cultural differences regarding preschoolers’ sedentary behavior were identified between the participating countries during the focus groups, so cultural and local adaptations of the intervention materials were not necessary to improve the adoption and implementation of the intervention. Conclusions A systematic and evidence-based approach was used for the development of this kindergarten-based family-involved intervention targeting preschoolers, with the inclusion of parental involvement. The application of the Intervention Mapping Protocol may lead to the development of more effective interventions. The detailed intervention matrices that were developed as part of the ToyBox-study can be used by other researchers as an aid in order to avoid repetitive work for the design of similar interventions. PMID:24552138

  17. Using a concept map as a tool for strategic planning: The Healthy Brain Initiative.

    PubMed

    Anderson, Lynda A; Day, Kristine L; Vandenberg, Anna E

    2011-09-01

    Concept mapping is a tool to assist in strategic planning that allows planners to work through a sequence of phases to produce a conceptual framework. Although several studies describe how concept mapping is applied to various public health problems, the flexibility of the methods used in each phase of the process is often overlooked. If practitioners were more aware of the flexibility, more public health endeavors could benefit from using concept mapping as a tool for strategic planning. The objective of this article is to describe how the 6 concept-mapping phases originally outlined by William Trochim guided our strategic planning process and how we adjusted the specific methods in the first 2 phases to meet the specialized needs and requirements to create The Healthy Brain Initiative: A National Public Health Road Map to Maintaining Cognitive Health. In the first stage (phases 1 and 2 of concept mapping), we formed a steering committee, convened 4 work groups over a period of 3 months, and generated an initial set of 42 action items grounded in science. In the second stage (phases 3 and 4), we engaged stakeholders in sorting and rating the action items and constructed a series of concept maps. In the third and final stage (phases 5 and 6), we examined and refined the action items and generated a final concept map consisting of 44 action items. We then selected the top 10 action items, and in 2007, we published The Healthy Brain Initiative: A National Public Health Road Map to Maintaining Cognitive Health, which represents the strategic plan for The Healthy Brain Initiative.

  18. Review of State Legislative Approaches to Eliminating Racial and Ethnic Health Disparities, 2002–2011

    PubMed Central

    Pollack, Keshia; Rutkow, Lainie

    2015-01-01

    We conducted a legal mapping study of state bills related to racial/ethnic health disparities in all 50 states between 2002 and 2011. Forty-five states introduced at least 1 bill that specifically targeted racial/ethnic health disparities; we analyzed 607 total bills. Of these 607 bills, 330 were passed into law (54.4%). These bills approached eliminating racial/ethnic health disparities by developing governmental infrastructure, providing appropriations, and focusing on specific diseases and data collection. In addition, states tackled emerging topics that were previously lacking laws, particularly Hispanic health. Legislation is an important policy tool for states to advance the elimination of racial/ethnic health disparities. PMID:25905834

  19. The efieldbook program: A teaching resource for geology

    NASA Astrophysics Data System (ADS)

    Vacas Peña, José Manuel; Chamoso, José M.; Urones, Carmen

    2011-04-01

    The eFieldBook program is a geology teaching tool with high didactic potential that guides a student's work in the field using multimedia and other resources. This program allows the collection of geo-referenced geological information as well as its storage and transmission, if necessary, as soon as it is collected. The data can be collected as in traditional field notebooks or on maps and photographs. The information can be used as soon as it is collected and can be exported to other programs such as Word, Excel, Georient or statistical packages. eFieldBook safely stores and backs up user information by sending any data collected to a selected Internet target at regular time intervals.

  20. Aircraft Detection in High-Resolution SAR Images Based on a Gradient Textural Saliency Map.

    PubMed

    Tan, Yihua; Li, Qingyun; Li, Yansheng; Tian, Jinwen

    2015-09-11

    This paper proposes a new automatic and adaptive aircraft target detection algorithm for high-resolution synthetic aperture radar (SAR) images of airports. The proposed method is based on a gradient textural saliency map under the contextual cues of the apron area. Firstly, candidate regions in which targets may exist are detected within the apron area. Secondly, a directional local gradient distribution detector is used to obtain a gradient textural saliency map over the candidate regions. The final targets are then detected by segmenting the saliency map using a CFAR-type algorithm. Real high-resolution airborne SAR image data are used to verify the proposed algorithm. The results demonstrate that this algorithm can detect aircraft targets quickly and accurately and decrease the false alarm rate.
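
    A toy sketch of the general idea, gradient-based saliency followed by a CFAR-style local threshold, is given below. It uses a synthetic scene and plain numpy/scipy, and it is not the paper's directional local gradient distribution detector.

    ```python
    import numpy as np
    from scipy import ndimage

    def gradient_saliency(image):
        """Gradient-magnitude saliency map (a simple stand-in for a textural saliency measure)."""
        gy, gx = np.gradient(image.astype(float))
        return np.hypot(gx, gy)

    def cfar_detect(saliency, window=21, k=3.0):
        """CFAR-style detection: flag pixels exceeding local mean + k * local standard deviation."""
        mean = ndimage.uniform_filter(saliency, size=window)
        mean_sq = ndimage.uniform_filter(saliency ** 2, size=window)
        std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
        return saliency > mean + k * std

    # Synthetic "apron" scene with speckle-like clutter and two bright point targets.
    scene = np.random.default_rng(3).rayleigh(scale=1.0, size=(128, 128))
    scene[40, 60] += 15.0
    scene[90, 30] += 15.0
    detections = cfar_detect(gradient_saliency(scene))
    print("detected pixels:", int(detections.sum()))
    ```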

  1. Developing nurses' intercultural/intraprofessional communication skills using the EXCELLence in Cultural Experiential Learning and Leadership Social Interaction Maps.

    PubMed

    Henderson, Saras; Barker, Michelle

    2017-09-27

    To examine how the use of Social Interaction Maps, a tool in the EXCELLence in Cultural Experiential Learning and Leadership Program, can enhance the development of nurses' intercultural/intraprofessional communication skills. Nurses face communication challenges when interacting with others from similar background as well as those from a culturally and linguistically diverse background. We used the EXCELLence in Cultural Experiential Learning and Leadership Program's Social Interaction Maps tool to foster intercultural/intraprofessional communication skills in nurses. Social Interaction Maps describe verbal and nonverbal communication behaviours that model ways of communicating in a culturally appropriate manner. The maps include four stages of an interaction, namely Approach, Bridging, Communicating and Departing using the acronym ABCD. Qualitative approach was used with a purposeful sample of nurses enrolled in a postgraduate course. Fifteen participants were recruited. The Social Interaction Map tool was taught to participants in a workshop where they engaged in sociocultural communication activities using scenarios. Participants were asked to apply Social Interaction Maps in their workplaces. Six weeks later, participants completed a semistructured open-ended questionnaire and participated in a discussion forum on their experience of using Social Interaction Maps. Data were content-analysed. Four themes identified in the use of the Social Interaction Maps were (i) enhancing self-awareness of communication skills; (ii) promoting skills in being nonconfrontational during difficult interactions; (iii) highlighting the importance of A (Approach) and B (Bridging) in interaction with others; and (iv) awareness of how others interpret what is said C (Communicating) and discussing to resolve issues before closure D (Departing). Application of the EXCELLence in Cultural Experiential Learning and Leadership Social Interaction Mapping tool was shown to be useful in developing intercultural/intraprofessional communication skills in nurses. Professional development programmes that incorporate EXCELLence in Cultural Experiential Learning and Leadership Social Interaction Maps can enhance nurses' intercultural/intraprofessional communication competencies when engaging with others from culturally and linguistically diverse backgrounds and improve the way nurses communicate with each other. © 2017 John Wiley & Sons Ltd.

  2. Prioritization of malaria endemic zones using self-organizing maps in the Manipur state of India.

    PubMed

    Murty, Upadhyayula Suryanarayana; Srinivasa Rao, Mutheneni; Misra, Sunil

    2008-09-01

    A huge amount of epidemiological and public health data is now available and requires analysis and interpretation with appropriate mathematical tools, so that existing methods of controlling mosquitoes and mosquito-borne diseases can be supported more effectively; data-mining tools are used to make sense of this chaos. Using data-mining tools, one can develop predictive models, patterns, association rules and clusters of diseases, which can help decision-makers in controlling the diseases. This paper focuses on the application of data-mining tools, used for the first time to prioritize the malaria endemic regions of Manipur state, by means of Self-Organizing Maps (SOM). The SOM results (two-dimensional images called Kohonen maps) clearly show the visual classification of malaria endemic zones into high, medium and low in the different districts of Manipur, and are discussed in the paper.
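
    A minimal sketch of SOM-based clustering with the third-party minisom package is shown below; the village-level features and the grid size are invented for illustration and are not the epidemiological variables used in the study.

    ```python
    import numpy as np
    from minisom import MiniSom  # third-party package: pip install minisom

    # Hypothetical, normalized village-level indicators (e.g. case incidence, vector
    # density, rainfall, forest cover); values are random placeholders.
    rng = np.random.default_rng(4)
    features = rng.random((120, 4))

    # Train a small self-organizing map and read off each village's winning node on the Kohonen grid.
    som = MiniSom(6, 6, input_len=4, sigma=1.0, learning_rate=0.5, random_seed=0)
    som.train_random(features, num_iteration=1000)
    nodes = [som.winner(v) for v in features]

    # Villages mapped to the same or neighbouring nodes form clusters that an epidemiologist
    # could then label as high-, medium- or low-priority endemic zones.
    print(nodes[:5])
    ```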

  3. Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning

    NASA Astrophysics Data System (ADS)

    Cui, J.; Dong, B.; Li, J.; Li, L.

    2017-09-01

    As a fundamental task in urban planning, the intensity analysis of construction land involves much repetitive data processing that is prone to errors and loss of data precision, and current practice lacks efficient methods and tools for visualizing the analysis results. In this research, a portable tool is developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid visualization of the results. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values, the zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be distributed quickly between planning teams.
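
    The core intensity computation can be sketched outside ArcGIS with geopandas; the file name, the floor_area field and the classification breaks below are hypothetical, and this is not the Model Builder chain described in the paper.

    ```python
    import geopandas as gpd  # third-party packages: geopandas, pandas, matplotlib
    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical parcel layer in a projected CRS with a 'floor_area' attribute.
    parcels = gpd.read_file("parcels.shp")
    parcels["intensity"] = parcels["floor_area"] / parcels.geometry.area  # plot ratio (FAR)

    # Classify parcels into intensity zones and draw a simple zoning map.
    bins = [0.0, 1.0, 2.5, 4.0, float("inf")]
    parcels["zone"] = pd.cut(parcels["intensity"], bins=bins, labels=["low", "medium", "high", "very high"])
    parcels.plot(column="zone", legend=True, cmap="OrRd")
    plt.savefig("intensity_zoning_map.png", dpi=150)
    ```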

  4. NIRS-SPM: statistical parametric mapping for near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul

    2008-02-01

    Even though there exists a powerful statistical parametric mapping (SPM) tool for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using the general linear model (GLM) and makes inferences based on the excursion probability of the random field interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of the NIRS signal with fMRI, the spatial mapping between the fMRI image and the real-world coordinates from a 3-D digitizer is estimated using Horn's algorithm. These powerful tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
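
    A toy general-linear-model fit for a single NIRS channel is sketched below with plain numpy; the design matrix, regressor and noise are invented, and this is not NIRS-SPM's implementation (which additionally handles temporal correlation via pre-coloring/pre-whitening).

    ```python
    import numpy as np

    # Hypothetical single-channel HbO time series: a boxcar task regressor plus noise.
    rng = np.random.default_rng(5)
    n = 300
    task = (np.sin(2 * np.pi * np.arange(n) / 60.0) > 0).astype(float)
    X = np.column_stack([task, np.ones(n)])          # design matrix: task regressor + constant
    y = 0.8 * task + rng.normal(scale=0.5, size=n)   # simulated measurement

    # Ordinary least-squares estimate and a t-statistic for the task regressor.
    beta, _, rank, _ = np.linalg.lstsq(X, y, rcond=None)
    dof = n - rank
    sigma2 = np.sum((y - X @ beta) ** 2) / dof
    cov_beta = sigma2 * np.linalg.inv(X.T @ X)
    t_task = beta[0] / np.sqrt(cov_beta[0, 0])
    print(f"beta_task={beta[0]:.3f}, t={t_task:.2f}")
    ```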

  5. RadMap Installation Instructions

    EPA Pesticide Factsheets

    RadMap is an interactive desktop tool featuring a nationwide geographic information systems (GIS) map of long-term radiation monitoring locations across the United States with access to key information about the monitor and the area surrounding it.

  6. Salient target detection based on pseudo-Wigner-Ville distribution and Rényi entropy.

    PubMed

    Xu, Yuannan; Zhao, Yuan; Jin, Chenfei; Qu, Zengfeng; Liu, Liping; Sun, Xiudong

    2010-02-15

    We present what we believe to be a novel method based on the pseudo-Wigner-Ville distribution (PWVD) and Rényi entropy for salient target detection. Building on a study of the statistical properties of Rényi entropy computed via the PWVD, a residual-entropy-based saliency map of an input image can be obtained. From the saliency map, target detection is completed by simple and convenient threshold segmentation. Experimental results demonstrate that the proposed method can detect targets effectively in complex ground scenes.
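
    For reference, the standard Rényi entropy of order alpha is shown below; treating the normalized PWVD time-frequency energies as the distribution p_i is the usual convention, though the paper's exact residual-entropy construction is not reproduced here.

    ```latex
    % Standard Renyi entropy of order \alpha for a discrete distribution p_1, ..., p_N;
    % it reduces to the Shannon entropy in the limit \alpha -> 1.
    H_{\alpha}(p) = \frac{1}{1-\alpha} \log \left( \sum_{i=1}^{N} p_i^{\alpha} \right),
    \qquad \alpha > 0, \ \alpha \neq 1
    ```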

  7. MiroRNA-188 Acts as Tumor Suppressor in Non-Small-Cell Lung Cancer by Targeting MAP3K3.

    PubMed

    Zhao, Lili; Ni, Xin; Zhao, Linlin; Zhang, Yao; Jin, Dan; Yin, Wei; Wang, Dandan; Zhang, Wei

    2018-04-02

    Non-small cell lung cancer (NSCLC) is the most prevalent form of lung cancer. MicroRNAs have been increasingly implicated in NSCLC and may serve as novel therapeutic targets to combat cancer. Here we investigated the functional implication of miR-188 in NSCLC. We first analyzed miR-188 expression in both NSCLC clinical samples and cancer cell lines. Next, we investigated its role in A549 and H2126 cells with cell proliferation, migration, and apoptosis assays. To extend the in vitro study, we employed both a xenograft model and the LSL-K-ras G12D lung cancer model to examine the role of miR-188 in tumorigenesis. Last, we tested MAP3K3 as a miR-188 target in the NSCLC model. MiR-188 expression was significantly downregulated at NSCLC tumor sites and in lung cancer cells. In vitro transfection of miR-188 reduced cell proliferation and migration potential and promoted cell apoptosis. In the xenograft model, miR-188 inhibited the growth of tumors derived from cancer cells. Intranasal miR-188 administration reduced tumor formation in the NSCLC animal model. MAP3K3 was validated as a direct target of miR-188. Knocking down MAP3K3 in mice also inhibited tumorigenesis in the LSL-K-ras G12D model. Our results demonstrate that miR-188 and its downstream target MAP3K3 could be a potential therapeutic target for NSCLC.

  8. Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)

    2000-01-01

    HiMAP is a three level parallel middleware that can be interfaced to a large scale global design environment for code independent, multidisciplinary analysis using high fidelity equations. Aerospace technology needs are rapidly changing. Computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computation tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).

  9. HapZipper: sharing HapMap populations just got easier.

    PubMed

    Chanda, Pritam; Elhaik, Eran; Bader, Joel S

    2012-11-01

    The rapidly growing amount of genomic sequence data being generated and made publicly available necessitates the development of new data storage and archiving methods. The vast amount of data being shared and manipulated also creates new challenges for network resources. Thus, developing advanced data compression techniques is becoming an integral part of data production and analysis. The HapMap project is one of the largest public resources of human single-nucleotide polymorphisms (SNPs), characterizing over 3 million SNPs genotyped in over 1000 individuals. The standard format and biological properties of HapMap data suggest that a dedicated genetic compression method can outperform generic compression tools. We propose a compression methodology for genetic data by introducing HapZipper, a lossless compression tool tailored to compress HapMap data beyond benchmarks defined by generic tools such as gzip, bzip2 and lzma. We demonstrate the usefulness of HapZipper by compressing HapMap 3 populations to <5% of their original sizes. HapZipper is freely downloadable from https://bitbucket.org/pchanda/hapzipper/downloads/HapZipper.tar.bz2.
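
    The generic baselines mentioned above can be compared directly from the Python standard library; the toy genotype-like text below merely stands in for a HapMap file, and HapZipper itself is not reimplemented here.

    ```python
    import bz2
    import gzip
    import lzma

    # Toy genotype-like text block standing in for a HapMap file (not real HapMap data).
    record = "rs123456 A G 0 1 2 1 0 2 1 1 0 2\n"
    data = (record * 20000).encode("ascii")

    # Compare the generic compressors that HapZipper is benchmarked against.
    for name, compress in [("gzip", gzip.compress), ("bzip2", bz2.compress), ("lzma", lzma.compress)]:
        ratio = len(compress(data)) / len(data)
        print(f"{name}: compressed to {ratio:.3%} of original size")
    ```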

  10. Web GIS in practice III: creating a simple interactive map of England's Strategic Health Authorities using Google Maps API, Google Earth KML, and MSN Virtual Earth Map Control

    PubMed Central

    Boulos, Maged N Kamel

    2005-01-01

    This eye-opener article aims at introducing the health GIS community to the emerging online consumer geoinformatics services from Google and Microsoft (MSN), and their potential utility in creating custom online interactive health maps. Using the programmable interfaces provided by Google and MSN, we created three interactive demonstrator maps of England's Strategic Health Authorities. These can be browsed online at – Google Maps API (Application Programming Interface) version, – Google Earth KML (Keyhole Markup Language) version, and – MSN Virtual Earth Map Control version. Google and MSN's worldwide distribution of "free" geospatial tools, imagery, and maps is to be commended as a significant step towards the ultimate "wikification" of maps and GIS. A discussion is provided of these emerging online mapping trends, their expected future implications and development directions, and associated individual privacy, national security and copyrights issues. Although ESRI have announced their planned response to Google (and MSN), it remains to be seen how their envisaged plans will materialize and compare to the offerings from Google and MSN, and also how Google and MSN mapping tools will further evolve in the near future. PMID:16176577

  11. Smiles2Monomers: a link between chemical and biological structures for polymers.

    PubMed

    Dufresne, Yoann; Noé, Laurent; Leclère, Valérie; Pupin, Maude

    2015-01-01

    The monomeric composition of polymers is powerful for structure comparison and synthetic biology, among other applications. Many databases give access to the atomic structure of compounds, but the monomeric structure of polymers is often lacking. We have designed a smart algorithm, implemented in the tool Smiles2Monomers (s2m), to infer efficiently and accurately the monomeric structure of a polymer from its chemical structure. Our strategy is divided into two steps: first, monomers are mapped onto the atomic structure by an efficient subgraph-isomorphism algorithm; second, the best tiling is computed so that non-overlapping monomers cover the whole structure of the target polymer. The mapping is based on a Markovian index built by a dynamic programming algorithm. The index enables s2m to search quickly for all the given monomers on a target polymer. Then, a greedy algorithm combines the mapped monomers into a consistent monomeric structure. Finally, a local branch-and-cut algorithm refines the structure. We tested this method on two manually annotated databases of polymers and reconstructed the structures de novo with a sensitivity over 90%. The average computation time per polymer is 2 s. s2m automatically creates de novo monomeric annotations for polymers, efficiently in terms of computation time and sensitivity. s2m allowed us to detect annotation errors in the tested databases and to easily find the accurate structures. Thus, s2m could be integrated into the curation process of databases of small compounds to verify the current entries and accelerate the annotation of new polymers. The full method can be downloaded or accessed via a website for peptide-like polymers at http://bioinfo.lifl.fr/norine/smiles2monomers.jsp.
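
    The monomer-mapping step can be illustrated generically with RDKit substructure search (s2m uses its own Markovian index and tiling algorithm rather than RDKit); the SMILES and SMARTS strings below are simple illustrative choices.

    ```python
    from rdkit import Chem  # third-party package: rdkit

    # A dipeptide (alanyl-alanine) and a crude amino-acid backbone pattern.
    dipeptide = Chem.MolFromSmiles("CC(N)C(=O)NC(C)C(=O)O")
    backbone = Chem.MolFromSmarts("[NX3][CX4][CX3](=O)")

    # Each match is a tuple of atom indices covered by the pattern; non-overlapping
    # matches would then be tiled to cover the whole polymer, as s2m does with monomers.
    matches = dipeptide.GetSubstructMatches(backbone)
    print(matches)
    ```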

  12. QuIN: A Web Server for Querying and Visualizing Chromatin Interaction Networks

    PubMed Central

    Thibodeau, Asa; Márquez, Eladio J.; Luo, Oscar; Ruan, Yijun; Shin, Dong-Guk; Stitzel, Michael L.; Ucar, Duygu

    2016-01-01

    Recent studies of the human genome have indicated that regulatory elements (e.g. promoters and enhancers) at distal genomic locations can interact with each other via chromatin folding and affect gene expression levels. Genomic technologies for mapping interactions between DNA regions, e.g., ChIA-PET and Hi-C, can generate genome-wide maps of interactions between regulatory elements. These interaction datasets are important resources to infer distal gene targets of non-coding regulatory elements and to facilitate prioritization of critical loci for important cellular functions. With the increasing diversity and complexity of genomic information and public ontologies, making sense of these datasets demands integrative and easy-to-use software tools. Moreover, network representation of chromatin interaction maps enables effective data visualization, integration, and mining. Currently, there is no software that can take full advantage of network theory approaches for the analysis of chromatin interaction datasets. To fill this gap, we developed a web-based application, QuIN, which enables: 1) building and visualizing chromatin interaction networks, 2) annotating networks with user-provided private and publicly available functional genomics and interaction datasets, 3) querying network components based on gene name or chromosome location, and 4) utilizing network-based measures to identify and prioritize critical regulatory targets and their direct and indirect interactions. AVAILABILITY: QuIN's web server is available at http://quin.jax.org. QuIN is developed in Java and JavaScript, utilizing an Apache Tomcat web server and a MySQL database; the source code is available under the GPLv3 license on GitHub: https://github.com/UcarLab/QuIN/. PMID:27336171
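
    A minimal sketch of the network representation with networkx is given below; the anchor coordinates and interaction counts are invented, and QuIN's own data model and web interface are much richer than this.

    ```python
    import networkx as nx  # third-party package

    # Hypothetical ChIA-PET-style anchor pairs (genomic bins) with interaction counts.
    interactions = [
        ("chr1:1000-2000", "chr1:50000-51000", 12),
        ("chr1:50000-51000", "chr1:90000-91000", 7),
        ("chr1:1000-2000", "chr1:90000-91000", 3),
        ("chr2:500-1500", "chr2:70000-71000", 9),
    ]

    G = nx.Graph()
    for a, b, count in interactions:
        G.add_edge(a, b, weight=count)

    # Simple network measures can help flag anchors worth prioritizing.
    degree = dict(G.degree(weight="weight"))
    betweenness = nx.betweenness_centrality(G)
    top = max(degree, key=degree.get)
    print("highest weighted-degree anchor:", top, degree[top])
    ```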

  13. Using intervention mapping (IM) to develop a self-management programme for employees with a chronic disease in the Netherlands.

    PubMed

    Detaille, Sarah I; van der Gulden, Joost W J; Engels, Josephine A; Heerkens, Yvonne F; van Dijk, Frank J H

    2010-06-21

    Employees with a chronic disease often encounter problems at work because of their chronic disease. The current paper describes the development of a self-management programme based on the Chronic Disease Self-Management Programme (CDSMP) of Stanford University to help employees with a chronic somatic disease cope with these problems at work. The objective of this article is to present the systematic development and content of this programme. The method of intervention mapping (Bartholomew 2006) was used to tailor the original CDSMP for employees with a chronic somatic disease. This paper describes the process of adjusting the CDSMP for this target group. A needs assessment was carried out through a literature review and qualitative focus groups with employees with a chronic disease and the health professionals involved. On the basis of the needs assessment, the relevant determinants of self-management behaviour at work were identified for the target population and the objectives of the training were formulated. Furthermore, techniques were chosen to influence self-management and the determinants of behaviour, and a programme plan was developed. The intervention was designed to address general personal factors such as lifestyle, disease-related factors (for example, coping with the disease) and work-related personal factors (such as self-efficacy at work). The course consists of six sessions of two and a half hours each and intends to increase the self-management and empowerment of employees with a chronic somatic disease. Intervention mapping has been found to be a useful tool for systematically tailoring the original CDSMP for employees with a chronic somatic disease. It might be valuable to use IM for the development or adjustment of interventions in occupational health care.

  14. Effectiveness of a Low-Cost, Graduate Student–Led Intervention on Study Habits and Performance in Introductory Biology

    PubMed Central

    Hoskins, Tyler D.; Gantz, J. D.; Chaffee, Blake R.; Arlinghaus, Kel; Wiebler, James; Hughes, Michael; Fernandes, Joyce J.

    2017-01-01

    Institutions have developed diverse approaches that vary in effectiveness and cost to improve student performance in introductory science, technology, engineering, and mathematics courses. We developed a low-cost, graduate student–led, metacognition-based study skills course taught in conjunction with the introductory biology series at Miami University. Our approach aimed to improve performance for underachieving students by combining an existing framework for the process of learning (the study cycle) with concrete tools (outlines and concept maps) that have been shown to encourage deep understanding. To assess the effectiveness of our efforts, we asked 1) how effective our voluntary recruitment model was at enrolling the target cohort, 2) how the course impacted performance on lecture exams, 3) how the course impacted study habits and techniques, and 4) whether there are particular study habits or techniques that are associated with large improvements in exam scores. Voluntary recruitment attracted only 11–17% of our target cohort. While focal students improved on lecture exams relative to their peers who did not enroll, gains were relatively modest, and not all students improved. Further, although students across both semesters of our study reported improved study habits (based on pre- and post-surveys) and improved on outlines and concept maps (based on retrospectively scored assignments), gains were more dramatic in the Fall semester. Multivariate models revealed that, while changes in study habits and in the quality of outlines and concept maps were weakly associated with changes in performance on lecture exams, the relationships were only significant in the Fall semester and were sometimes counterintuitive. Although the benefits of the course were offset somewhat by the inefficiency of voluntary recruitment, we demonstrate the effectiveness of our course, which is inexpensive to implement and has the advantage of providing pedagogical experience to future educators. PMID:28747353

  15. A tool for exploring space-time patterns: an animation user research.

    PubMed

    Ogao, Patrick J

    2006-08-29

    Ever since Dr. John Snow (1813-1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they are accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed in urban growth and census mapping--all critical aspects of a country's socio-economic development. In this paper, a user test research was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Three types of animation were used, namely passive, interactive and inference-based animation, with the key differences between them being the level of interactivity and the complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status: the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media player controls and navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of the animations' passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics (map reading, interpretation and analysis abilities) background. Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool in identifying, interpreting and providing explanations about observed geospatial phenomena. Also, exploring geospatial data structures using animation is best achieved using provocative interactive tools, such as was seen with the inference-based animation. The visual methods employed using the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge discovery environment.

  17. A Tool for Teaching Three-Dimensional Dermatomes Combined with Distribution of Cutaneous Nerves on the Limbs

    ERIC Educational Resources Information Center

    Kooloos, Jan G. M.; Vorstenbosch, Marc A. T. M.

    2013-01-01

    A teaching tool that facilitates student understanding of a three-dimensional (3D) integration of dermatomes with peripheral cutaneous nerve field distributions is described. This model is inspired by the confusion in novice learners between dermatome maps and nerve field distribution maps. This confusion leads to the misconception that these two…

  18. Tracking the fate of watershed nitrogen: The “N-Sink” Web Tool and Two Case Studies

    EPA Science Inventory

    This product describes the application of a web-based decision support tool, N-Sink, in two case study watersheds. N-Sink is a customized ArcMap© program that provides maps of N sources and sinks within a watershed, and estimates the delivery efficiency of N movement from sou...

  19. Integrating Concept Mapping into Information Systems Education for Meaningful Learning and Assessment

    ERIC Educational Resources Information Center

    Wei, Wei; Yue, Kwok-Bun

    2017-01-01

    A concept map (CM) is a theoretically sound yet easy-to-learn tool that can be used effectively to represent knowledge. Even though many disciplines have adopted CMs as a teaching and learning tool to improve learning effectiveness, their application in the IS curriculum is sparse. Meaningful learning happens when one iteratively integrates new concepts and…

  20. Mind Maps: Hot New Tools Proposed for Cyberspace Librarians.

    ERIC Educational Resources Information Center

    Humphreys, Nancy K.

    1999-01-01

    Describes how online searchers can use a software tool based on back-of-the-book indexes to assist in dealing with search engine databases compiled by spiders that crawl across the entire Internet or through large Web sites. Discusses human versus machine knowledge, conversion of indexes to mind maps or mini-thesauri, middleware, eXtensible Markup…
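
    The article gives no implementation; the following sketch is only a guess at what "converting a back-of-the-book index into a mind map or mini-thesaurus" could look like as a data-structure transformation. The index entries, root label, and node layout are invented for illustration.

```python
# Hypothetical sketch: turn a back-of-the-book index (heading -> subheadings)
# into a two-level mind map represented as an adjacency list. Nothing here
# comes from the article itself.
from collections import defaultdict
from typing import Dict, List

book_index: Dict[str, List[str]] = {
    "metadata": ["Dublin Core", "schemas"],
    "markup languages": ["XML", "SGML"],
    "search engines": ["spiders", "relevance ranking"],
}

def index_to_mind_map(index: Dict[str, List[str]], root: str) -> Dict[str, List[str]]:
    """Attach each index heading to the root node and each subheading to its heading."""
    mind_map: Dict[str, List[str]] = defaultdict(list)
    for heading, subheadings in index.items():
        mind_map[root].append(heading)
        mind_map[heading].extend(subheadings)
    return dict(mind_map)

print(index_to_mind_map(book_index, root="cyberspace librarianship"))
```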
