Sample records for generic mapping tools

  1. Hawaiian Volcano Observatory seismic data, January to March 2009

    USGS Publications Warehouse

    Nakata, Jennifer S.; Okubo, Paul G.

    2010-01-01

    Figures 11–14 are maps showing computer-located hypocenters. The maps were generated using the Generic Mapping Tools (GMT), found at http://gmt.soest.hawaii.edu/ (last accessed 01/22/2010), in place of traditional QPLOT maps.
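
    A minimal sketch of this kind of epicenter map, written for today's PyGMT (the Python interface to GMT described in record 7 below). The three-event catalog and the map bounds are invented for illustration.

    ```python
    import pygmt

    # Hypothetical catalog: longitude, latitude, depth (km) -- invented values,
    # not HVO data.
    lons = [-155.28, -155.29, -155.10]
    lats = [19.40, 19.41, 19.35]
    depths = [2.5, 3.1, 8.0]

    fig = pygmt.Figure()
    fig.coast(
        region=[-156.1, -154.7, 18.8, 20.3],  # Island of Hawai`i, approximate
        projection="M12c",                    # Mercator, 12 cm wide
        shorelines=True, land="gray90", water="white", frame=True,
    )
    pygmt.makecpt(cmap="viridis", series=[0, 15])  # color scale for focal depth
    fig.plot(x=lons, y=lats, style="c0.25c", fill=depths, cmap=True, pen="black")
    fig.colorbar(frame='af+l"Depth (km)"')  # note: fill= was color= in old PyGMT
    fig.savefig("hypocenters.png")
    ```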

  2. Development of a New Branded UK Food Composition Database for an Online Dietary Assessment Tool

    PubMed Central

    Carter, Michelle C.; Hancock, Neil; Albar, Salwa A.; Brown, Helen; Greenwood, Darren C.; Hardie, Laura J.; Frost, Gary S.; Wark, Petra A.; Cade, Janet E.

    2016-01-01

    The current UK food composition tables are limited, containing ~3300 mostly generic food and drink items. To reflect the wide range of food products available to British consumers and to potentially improve the accuracy of dietary assessment, a large UK-specific electronic food composition database (FCDB) has been developed. A mapping exercise was conducted that matched micronutrient data from generic food codes to “Back of Pack” data from branded food products using a semi-automated process. After cleaning and processing, version 1.0 of the new FCDB contains 40,274 generic and branded items with associated data for 120 macronutrients and micronutrients, and 5669 items with portion images. Over 50% of food and drink items were individually mapped to within 10% agreement with the generic food item for energy. Several quality-checking procedures were applied after mapping, including identifying foods above and below the expected range for a particular nutrient within their food group and cross-checking the mapping of items such as concentrated and raw/dried products. The new electronic FCDB has substantially increased the size of the current, publicly available UK food tables. The FCDB has been incorporated into myfood24, a new fully automated online dietary assessment tool, and a smartphone application for weight loss. PMID:27527214
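
    A minimal sketch of the "within 10% agreement for energy" check described above, with invented item names and energy values (kcal/100 g); the real pipeline matches thousands of branded items semi-automatically.

    ```python
    # Hypothetical mapped pairs: branded item, branded kcal/100 g,
    # generic code, generic kcal/100 g.
    mapped_pairs = [
        ("Brand X baked beans", 85.0, "Baked beans, canned", 81.0),
        ("Brand Y cola",        44.0, "Cola, regular",       42.0),
    ]

    def agrees_within_10pct(branded_kcal, generic_kcal):
        """True if branded energy is within 10% of the generic table value."""
        return abs(branded_kcal - generic_kcal) <= 0.10 * generic_kcal

    for b_name, b_kcal, g_name, g_kcal in mapped_pairs:
        verdict = "agrees" if agrees_within_10pct(b_kcal, g_kcal) else "flag for review"
        print(f"{b_name} -> {g_name}: {verdict}")
    ```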

  3. Development of a New Branded UK Food Composition Database for an Online Dietary Assessment Tool.

    PubMed

    Carter, Michelle C; Hancock, Neil; Albar, Salwa A; Brown, Helen; Greenwood, Darren C; Hardie, Laura J; Frost, Gary S; Wark, Petra A; Cade, Janet E

    2016-08-05

    The current UK food composition tables are limited, containing ~3300 mostly generic food and drink items. To reflect the wide range of food products available to British consumers and to potentially improve the accuracy of dietary assessment, a large UK-specific electronic food composition database (FCDB) has been developed. A mapping exercise was conducted that matched micronutrient data from generic food codes to "Back of Pack" data from branded food products using a semi-automated process. After cleaning and processing, version 1.0 of the new FCDB contains 40,274 generic and branded items with associated data for 120 macronutrients and micronutrients, and 5669 items with portion images. Over 50% of food and drink items were individually mapped to within 10% agreement with the generic food item for energy. Several quality-checking procedures were applied after mapping, including identifying foods above and below the expected range for a particular nutrient within their food group and cross-checking the mapping of items such as concentrated and raw/dried products. The new electronic FCDB has substantially increased the size of the current, publicly available UK food tables. The FCDB has been incorporated into myfood24, a new fully automated online dietary assessment tool, and a smartphone application for weight loss.

  4. HapZipper: sharing HapMap populations just got easier.

    PubMed

    Chanda, Pritam; Elhaik, Eran; Bader, Joel S

    2012-11-01

    The rapidly growing amount of genomic sequence data being generated and made publicly available necessitates the development of new data storage and archiving methods. The vast amount of data being shared and manipulated also creates new challenges for network resources. Thus, developing advanced data compression techniques is becoming an integral part of data production and analysis. The HapMap project is one of the largest public resources of human single-nucleotide polymorphisms (SNPs), characterizing over 3 million SNPs genotyped in over 1000 individuals. The standard format and biological properties of HapMap data suggest that a dedicated genetic compression method can outperform generic compression tools. We propose a compression methodology for genetic data by introducing HapZipper, a lossless compression tool tailored to compress HapMap data beyond benchmarks defined by generic tools such as gzip, bzip2 and lzma. We demonstrate the usefulness of HapZipper by compressing HapMap 3 populations to <5% of their original sizes. HapZipper is freely downloadable from https://bitbucket.org/pchanda/hapzipper/downloads/HapZipper.tar.bz2.
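
    A rough sketch of the kind of benchmark the abstract describes: a genotype-aware encoding (two bits per biallelic call, four calls per byte) pitted against the generic compressors from the Python standard library. The data are synthetic, and HapZipper's actual algorithm is not reproduced here.

    ```python
    import bz2, gzip, lzma, random

    random.seed(0)
    genotypes = [random.choice((0, 1, 2)) for _ in range(100_000)]  # 0/1/2 calls

    # Naive textual representation, one genotype per token.
    text = " ".join(map(str, genotypes)).encode()

    # Domain-aware representation: pack four 2-bit genotypes into each byte.
    packed = bytearray()
    for i in range(0, len(genotypes), 4):
        byte = 0
        for g in genotypes[i:i + 4]:
            byte = (byte << 2) | g
        packed.append(byte)

    for label, blob in (("text", text), ("2-bit packed", bytes(packed))):
        print(label,
              "gzip:", len(gzip.compress(blob)),
              "bzip2:", len(bz2.compress(blob)),
              "lzma:", len(lzma.compress(blob)))
    ```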

  5. Mapping Successful Language Learning Approaches in the Adaptation of Generic Software

    ERIC Educational Resources Information Center

    Hourigan, Triona; Murray, Liam

    2006-01-01

    This paper investigates the use of a generic piece of software, the "Copernic Summarizer" (www.copernic.com), as a language learning tool and considers two discrete pedagogical approaches used as part of its integration within the context of teaching and learning a foreign language. Firstly, this paper will present a brief overview of the emerging…

  6. PhosphOrtholog: a web-based tool for cross-species mapping of orthologous protein post-translational modifications.

    PubMed

    Chaudhuri, Rima; Sadrieh, Arash; Hoffman, Nolan J; Parker, Benjamin L; Humphrey, Sean J; Stöckli, Jacqueline; Hill, Adam P; James, David E; Yang, Jean Yee Hwa

    2015-08-19

    Most biological processes are influenced by protein post-translational modifications (PTMs). Identifying novel PTM sites in different organisms, including humans and model organisms, has expedited our understanding of key signal transduction mechanisms. However, with the increasing availability of deep, quantitative datasets in diverse species, there is a growing need for tools to facilitate cross-species comparison of PTM data. This is particularly important because functionally important modification sites are more likely to be evolutionarily conserved; yet cross-species comparison of PTMs is difficult since they often lie in structurally disordered protein domains. Current tools that address this can only map known PTMs between species based on known orthologous phosphosites, and do not enable the cross-species mapping of newly identified modification sites. Here, we addressed this by developing a web-based software tool, PhosphOrtholog (www.phosphortholog.com), that accurately maps protein modification sites between different species. This facilitates the comparison of datasets derived from multiple species, and should be a valuable tool for the proteomics community. We describe PhosphOrtholog, a web-based application for mapping known and novel orthologous PTM sites from experimental data obtained from different species. PhosphOrtholog is the only generic and automated tool that enables cross-species comparison of large-scale PTM datasets without relying on existing PTM databases. This is achieved through pairwise sequence alignment of orthologous protein residues. To demonstrate its utility we apply it to two sets of human and rat muscle phosphoproteomes generated following insulin and exercise stimulation, respectively, and one publicly available mouse phosphoproteome following cellular stress, revealing high mapping and coverage efficiency. Although coverage statistics are dataset dependent, PhosphOrtholog more than doubled the number of cross-species mapped sites in all our example datasets compared with those recovered using existing resources such as PhosphoSitePlus. PhosphOrtholog is the first tool that enables mapping of thousands of novel and known protein phosphorylation sites across species, accessible through an easy-to-use web interface. Identification of conserved PTMs across species from large-scale experimental data increases our knowledge base of functional PTM sites. Moreover, PhosphOrtholog is generic, being applicable to other PTM datasets such as acetylation, ubiquitination and methylation.
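
    The core idea, mapping a modification site through a pairwise alignment of orthologous sequences, can be sketched in a few lines. This is a toy stand-in, not PhosphOrtholog's code: the fragments are invented and Biopython's PairwiseAligner replaces the tool's own alignment step.

    ```python
    from Bio import Align

    human = "MSDSKEPKLTPSQRAK"   # hypothetical orthologous fragments
    rat   = "MSDTKEPKLSPSQRAK"

    aligner = Align.PairwiseAligner()
    aligner.mode = "global"
    aligner.match_score, aligner.mismatch_score = 2, -1
    aligner.open_gap_score, aligner.extend_gap_score = -2, -0.5
    alignment = aligner.align(human, rat)[0]

    def map_site(alignment, pos):
        """Map a 0-based residue index in the first sequence onto the second."""
        for (ts, te), (qs, qe) in zip(*alignment.aligned):
            if ts <= pos < te:        # position falls inside an aligned block
                return qs + (pos - ts)
        return None                   # site sits in a gap: no orthologous residue

    print(map_site(alignment, 9))     # human residue 10 -> rat residue 10 here
    ```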

  7. A modern Python interface for the Generic Mapping Tools

    NASA Astrophysics Data System (ADS)

    Uieda, L.; Wessel, P.

    2017-12-01

    Figures generated by The Generic Mapping Tools (GMT) are present in countless publications across the Earth sciences. The command-line interface of GMT lends the tool its flexibility but also creates a barrier to entry for beginners. Meanwhile, adoption of the Python programming language has grown across the scientific community. This growth is largely due to the simplicity and low barrier to entry of the language and its ecosystem of tools. Thus, it is not surprising that there have been at least three attempts to create Python interfaces for GMT: gmtpy (github.com/emolch/gmtpy), pygmt (github.com/ian-r-rose/pygmt), and PyGMT (github.com/glimmer-cism/PyGMT). None of these projects is currently active and, with the exception of pygmt, they do not use the GMT Application Programming Interface (API) introduced in GMT 5. The two main Python libraries for plotting data on maps are the matplotlib Basemap toolkit (matplotlib.org/basemap) and Cartopy (scitools.org.uk/cartopy), both of which rely on matplotlib (matplotlib.org) as the backend for generating the figures. Basemap is known to have limitations and is being discontinued. Cartopy is an improvement over Basemap but is still bound by the speed and memory constraints of matplotlib. We present a new Python interface for GMT (GMT/Python) that makes use of the GMT API and of new features being developed for the upcoming GMT 6 release. The GMT/Python library is designed according to the norms and styles of the Python community. The library integrates with the scientific Python ecosystem by using the "virtual files" from the GMT API to implement input and output of Python data types (numpy "ndarray" for tabular data and xarray "Dataset" for grids). Other features include an object-oriented interface for creating figures, the ability to display figures in the Jupyter notebook, and descriptive aliases for GMT arguments (e.g., "region" instead of "R" and "projection" instead of "J"). GMT/Python can also serve as a backend for developing new high-level interfaces, which can help make GMT more accessible to beginners and more intuitive for Python users. GMT/Python is an open-source project hosted on GitHub (github.com/GenericMappingTools/gmt-python) and is in the early stages of development. A first release will accompany the release of GMT 6, which is expected in early 2018.
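
    The interface described here evolved into today's PyGMT package. A minimal sketch of the workflow the abstract highlights, xarray grid input plus long-form aliases such as "region" and "projection", assuming a current PyGMT installation and its bundled remote relief dataset:

    ```python
    import pygmt

    # Global relief grid delivered as an xarray.DataArray (1-degree resolution).
    grid = pygmt.datasets.load_earth_relief(resolution="01d")

    fig = pygmt.Figure()
    fig.grdimage(grid=grid, region="d", projection="W0/15c",  # Mollweide, 15 cm
                 frame=True, cmap="geo")
    fig.colorbar(frame='af+l"Elevation (m)"')
    fig.savefig("relief.png")
    ```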

  8. A Toolkit for bulk PCR-based marker design from next-generation sequence data: application for development of a framework linkage map in bulb onion (Allium cepa L.)

    PubMed Central

    2012-01-01

    Background Although modern sequencing technologies permit the ready detection of numerous DNA sequence variants in any organism, converting such information to PCR-based genetic markers is hampered by a lack of simple, scalable tools. Onion is an example of an under-researched crop with a complex, heterozygous genome where genome-based research has previously been hindered by limited sequence resources and genetic markers. Results We report the development of generic tools for large-scale web-based PCR-based marker design in the Galaxy bioinformatics framework, and their application for development of next-generation genetics resources in a wide cross of bulb onion (Allium cepa L.). Transcriptome sequence resources were developed for the homozygous doubled-haploid bulb onion line ‘CUDH2150’ and the genetically distant Indian landrace ‘Nasik Red’, using 454™ sequencing of normalised cDNA libraries of leaf and shoot. Mapping of ‘Nasik Red’ reads onto ‘CUDH2150’ assemblies revealed 16,836 indel and SNP polymorphisms that were mined for portable PCR-based marker development. Tools for detection of restriction polymorphisms and primer set design were developed in BioPython and adapted for use in the Galaxy workflow environment, enabling large-scale and targeted assay design. Using PCR-based markers designed with these tools, a framework genetic linkage map of over 800 cM spanning all chromosomes was developed in a subset of 93 F2 progeny from a very large F2 family developed from the ‘Nasik Red’ x ‘CUDH2150’ inter-cross. The utility of the tools and genetic resources developed was tested by designing markers to transcription factor-like polymorphic sequences. Bin mapping these markers using a subset of 10 progeny confirmed the ability to place markers within 10 cM bins, enabling increased efficiency in marker assignment and targeted map refinement. The major genetic loci conditioning red bulb colour (R) and fructan content (Frc) were located on this map by QTL analysis. Conclusions The generic tools developed for the Galaxy environment enable rapid development of sets of PCR assays targeting sequence variants identified from Illumina and 454 sequence data. They enable non-specialist users to validate and exploit large volumes of next-generation sequence data using basic equipment. PMID:23157543
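
    A toy version of the restriction-polymorphism screen described above: find enzymes that cut one allele but not the other, which yields a cleaved amplified polymorphic sequence (CAPS) marker. The allele sequences are invented; Bio.Restriction supplies real enzyme definitions.

    ```python
    from Bio.Seq import Seq
    from Bio.Restriction import RestrictionBatch

    allele_a = Seq("ATCGGAATTCTTAGCCGGATCAAGTT")  # carries an EcoRI site (GAATTC)
    allele_b = Seq("ATCGGACTTCTTAGCCGGATCAAGTT")  # the SNP destroys that site

    enzymes = RestrictionBatch(["EcoRI", "HindIII", "TaqI"])
    cuts_a = enzymes.search(allele_a)
    cuts_b = enzymes.search(allele_b)

    for enzyme in enzymes:
        if bool(cuts_a[enzyme]) != bool(cuts_b[enzyme]):
            print(enzyme, "distinguishes the alleles -> candidate CAPS marker")
    ```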

  9. A toolkit for bulk PCR-based marker design from next-generation sequence data: application for development of a framework linkage map in bulb onion (Allium cepa L.).

    PubMed

    Baldwin, Samantha; Revanna, Roopashree; Thomson, Susan; Pither-Joyce, Meeghan; Wright, Kathryn; Crowhurst, Ross; Fiers, Mark; Chen, Leshi; Macknight, Richard; McCallum, John A

    2012-11-19

    Although modern sequencing technologies permit the ready detection of numerous DNA sequence variants in any organism, converting such information to PCR-based genetic markers is hampered by a lack of simple, scalable tools. Onion is an example of an under-researched crop with a complex, heterozygous genome where genome-based research has previously been hindered by limited sequence resources and genetic markers. We report the development of generic tools for large-scale web-based PCR-based marker design in the Galaxy bioinformatics framework, and their application for development of next-generation genetics resources in a wide cross of bulb onion (Allium cepa L.). Transcriptome sequence resources were developed for the homozygous doubled-haploid bulb onion line 'CUDH2150' and the genetically distant Indian landrace 'Nasik Red', using 454™ sequencing of normalised cDNA libraries of leaf and shoot. Mapping of 'Nasik Red' reads onto 'CUDH2150' assemblies revealed 16,836 indel and SNP polymorphisms that were mined for portable PCR-based marker development. Tools for detection of restriction polymorphisms and primer set design were developed in BioPython and adapted for use in the Galaxy workflow environment, enabling large-scale and targeted assay design. Using PCR-based markers designed with these tools, a framework genetic linkage map of over 800 cM spanning all chromosomes was developed in a subset of 93 F2 progeny from a very large F2 family developed from the 'Nasik Red' x 'CUDH2150' inter-cross. The utility of the tools and genetic resources developed was tested by designing markers to transcription factor-like polymorphic sequences. Bin mapping these markers using a subset of 10 progeny confirmed the ability to place markers within 10 cM bins, enabling increased efficiency in marker assignment and targeted map refinement. The major genetic loci conditioning red bulb colour (R) and fructan content (Frc) were located on this map by QTL analysis. The generic tools developed for the Galaxy environment enable rapid development of sets of PCR assays targeting sequence variants identified from Illumina and 454 sequence data. They enable non-specialist users to validate and exploit large volumes of next-generation sequence data using basic equipment.

  10. GOMMA: a component-based infrastructure for managing and analyzing life science ontologies and their evolution

    PubMed Central

    2011-01-01

    Background Ontologies are increasingly used to structure and semantically describe entities of domains, such as genes and proteins in the life sciences. Their increasing size and the high frequency of updates, which result in a large set of ontology versions, necessitate efficient management and analysis of this data. Results We present GOMMA, a generic infrastructure for managing and analyzing life science ontologies and their evolution. GOMMA utilizes a generic repository to uniformly and efficiently manage ontology versions and different kinds of mappings. Furthermore, it provides components for ontology matching and for determining evolutionary ontology changes. These components are used by analysis tools, such as the Ontology Evolution Explorer (OnEX) and the detection of unstable ontology regions. We introduce the component-based infrastructure and show analysis results for selected components and life science applications. GOMMA is available at http://dbs.uni-leipzig.de/GOMMA. Conclusions GOMMA provides a comprehensive and scalable infrastructure to manage large life science ontologies and analyze their evolution. Key functions include generic storage of ontology versions and mappings, support for ontology matching, and determination of ontology changes. The supported features for analyzing ontology changes are helpful for assessing their impact on ontology-dependent applications such as term enrichment. GOMMA complements OnEX by providing functionality to manage various versions of mappings between two ontologies and allows combining different match approaches. PMID:21914205
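
    One of the simplest analyses in this family, diffing two ontology versions to list added and removed concepts, reduces to set operations; the toy versions below are invented and stand in for GOMMA's repository-backed version management.

    ```python
    # Two hypothetical ontology versions, accession -> label.
    v1 = {"GO:0001": "cell", "GO:0002": "membrane", "GO:0003": "nucleus"}
    v2 = {"GO:0001": "cell", "GO:0003": "nucleus", "GO:0004": "ribosome"}

    added = v2.keys() - v1.keys()      # concepts introduced in the new version
    removed = v1.keys() - v2.keys()    # concepts retired from the old version
    print("added:", sorted(added), "removed:", sorted(removed))
    ```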

  11. Geometric Comparisons of Selected Small Topographically Fresh Volcanoes in the Borealis and Elysium Planitia Volcanic Fields, Mars: Implications for Eruptive Styles

    NASA Technical Reports Server (NTRS)

    Taylor, K.; Sakimoto, S. E. H.; Mitchell, D.

    2002-01-01

    MOLA (Mars Orbiter Laser Altimeter) data from small, topographically fresh volcanoes in the Elysium and Borealis regions were gridded and analyzed using GMT (Generic Mapping Tools) programs. The results compare the eruptive styles of the two regions and support conclusions about the differences between these volcanic provinces. Additional information is contained in the original extended abstract.

  12. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed by a methodology based on design patterns that allow an improved experience for new algorithms development. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.
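
    For concreteness, a generic 2D cellular-automaton step on a binary grid (such as a protein contact map), with the update rule passed in as a parameter; this is the kind of rule space such a framework searches with metaheuristics. The grid, rule, and sizes below are purely illustrative.

    ```python
    import numpy as np

    def ca_step(grid, rule):
        """Apply rule(cell_state, live_neighbor_count) -> 0/1 to every cell."""
        # Moore neighborhood with wrap-around boundaries via np.roll.
        counts = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
        return np.vectorize(rule)(grid, counts)

    rng = np.random.default_rng(1)
    state = rng.integers(0, 2, size=(20, 20))                # toy binary grid
    state = ca_step(state, lambda c, n: 1 if n >= 5 else c)  # toy majority rule
    ```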

  13. Fiber-connected, indefinite Morse 2-functions on connected n-manifolds

    PubMed Central

    Gay, David T.; Kirby, Robion C.

    2011-01-01

    We discuss generic smooth maps from smooth manifolds to smooth surfaces, which we call “Morse 2-functions,” and homotopies between such maps. The two central issues are to keep the fibers connected, in which case the Morse 2-function is “fiber-connected,” and to avoid local extrema over one-dimensional submanifolds of the range, in which case the Morse 2-function is “indefinite.” This is foundational work for the long-range goal of defining smooth invariants from Morse 2-functions using tools analogous to classical Morse homology and Cerf theory. PMID:21518894

  14. A Computational Solution to Automatically Map Metabolite Libraries in the Context of Genome Scale Metabolic Networks.

    PubMed

    Merlet, Benjamin; Paulhe, Nils; Vinson, Florence; Frainay, Clément; Chazalviel, Maxime; Poupin, Nathalie; Gloaguen, Yoann; Giacomoni, Franck; Jourdan, Fabien

    2016-01-01

    This article describes a generic programmatic method for mapping chemical compound libraries onto organism-specific metabolic networks from various databases (KEGG, BioCyc) and flat file formats (SBML and Matlab files). We show how this pipeline was successfully applied to decipher the coverage of chemical libraries set up by two metabolomics facilities, MetaboHub (French National Infrastructure for Metabolomics and Fluxomics) and Glasgow Polyomics (GP), on the metabolic networks available in the MetExplore web server. The present generic protocol is designed to formalize and reduce the volume of information transferred between the library and the network database. Matching of metabolites between libraries and metabolic networks is based on InChIs or InChIKeys and therefore requires that these identifiers are specified in both libraries and networks. In addition to providing coverage statistics, this pipeline also allows the visualization of mapping results in the context of metabolic networks. In order to achieve this goal, we tackled issues of programmatic interaction between two servers, improvement of metabolite annotation in metabolic networks, and automatic loading of a mapping into the genome-scale metabolic network analysis tool MetExplore. It is important to note that this mapping can also be performed on a single organism or a selection of organisms of interest, and is thus not limited to large facilities.
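
    The matching step itself is simple once identifiers are in place; a bare-bones sketch, matching on the first (connectivity) block of the InChIKey so that stereoisomers collapse together. The keys and identifiers below are illustrative placeholders.

    ```python
    # Hypothetical library and network entries: name/id -> InChIKey.
    library = {"glucose": "WQZGKKKJIJFFOK-GASJEMHNSA-N"}
    network = {"M_glc__D": "WQZGKKKJIJFFOK-VFUOTHLCSA-N"}

    def skeleton(inchikey):
        """First 14-character block: molecular connectivity, ignoring stereo."""
        return inchikey.split("-")[0]

    for name, key in library.items():
        hits = [m for m, k in network.items() if skeleton(k) == skeleton(key)]
        print(name, "->", hits if hits else "unmapped")
    ```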

  15. Improved regional-scale Brazilian cropping systems' mapping based on a semi-automatic object-based clustering approach

    NASA Astrophysics Data System (ADS)

    Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha

    2018-06-01

    Cropping systems' maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large-area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
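
    The two variants reduce to where the clustering is run; a compact sketch with k-means on per-object NDVI time series, using synthetic arrays and placeholder cluster counts:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    ndvi = rng.random((500, 23))              # 500 objects x 23 dates (synthetic)
    landscape = rng.integers(0, 3, size=500)  # landscape-unit label per object

    # (1) Hyperclustering: one k-means over all objects at once.
    labels_hyper = KMeans(n_clusters=8, n_init=10).fit_predict(ndvi)

    # (2) Landscape-clustering: k-means within each landscape unit separately.
    labels_strat = np.empty(500, dtype=int)
    for unit in np.unique(landscape):
        mask = landscape == unit
        km = KMeans(n_clusters=4, n_init=10).fit_predict(ndvi[mask])
        labels_strat[mask] = unit * 100 + km  # offset keeps labels unit-specific
    ```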

  16. Delineating Beach and Dune Morphology from Massive Terrestrial Laser Scanning Data Using the Generic Mapping Tools

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Wang, G.; Yan, B.; Kearns, T.

    2016-12-01

    Terrestrial laser scanning (TLS) techniques have been proven to be efficient tools for collecting three-dimensional, high-density and high-accuracy point clouds for coastal research and resource management. However, processing and presenting massive TLS data remains a challenge when targeting a large area at high resolution. This article introduces a workflow using shell-scripting techniques to chain together tools from the Generic Mapping Tools (GMT), the Geographic Resources Analysis Support System (GRASS), and other command-based open-source utilities for automating TLS data processing. TLS point clouds acquired in the beach and dune area near Freeport, Texas, in May 2015 were used for the case study. Shell scripts for rotating the coordinate system, removing anomalous points, assessing data quality, generating high-accuracy bare-earth DEMs, and quantifying beach and sand dune features (shoreline, cross-dune section, dune ridge, toe, and volume) are presented in this article. According to this investigation, the accuracy of the laser measurements (distance from the scanner to the targets) is within a couple of centimeters. However, the positional accuracy of TLS points with respect to a global coordinate system is about 5 cm, which is dominated by the accuracy of the GPS solutions for obtaining the positions of the scanner and reflector. The accuracy of the TLS-derived bare-earth DEM is primarily determined by the size of the grid cells and the roughness of the terrain surface. A DEM with grid cells of 4 m × 1 m (shoreline by cross-shore) provides a suitable spatial resolution and accuracy for deriving the major beach and dune features.
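
    The skeleton of such a script-driven pipeline, here driven from Python rather than the shell: median-filter the point cloud onto a grid, then interpolate a DEM with GMT's blockmedian and surface modules. The file names, region, and 1 m spacing are placeholders for the study's local coordinates.

    ```python
    import subprocess

    region, spacing = "-R0/200/0/60", "-I1"  # local coordinates (m), 1 m cells

    # One representative elevation per grid cell (robust to anomalous points).
    subprocess.run(f"gmt blockmedian points.xyz {region} {spacing} > median.xyz",
                   shell=True, check=True)
    # Continuous-curvature spline interpolation onto the bare-earth DEM grid.
    subprocess.run(f"gmt surface median.xyz {region} {spacing} -Gdem.nc",
                   shell=True, check=True)
    ```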

  17. Groundwater vulnerability maps for pesticides for Flanders

    NASA Astrophysics Data System (ADS)

    Dams, Jef; Joris, Ingeborg; Bronders, Jan; Van Looy, Stijn; Vanden Boer, Dirk; Heuvelmans, Griet; Seuntjens, Piet

    2017-04-01

    Pesticides are increasingly being detected in shallow groundwater and are one of the main causes of the poor chemical status of phreatic groundwater bodies in Flanders. Groundwater vulnerability maps are needed in order to design monitoring strategies and land-use strategies for sensitive areas such as drinking water capture zones. This research focuses on the development of generic vulnerability maps for pesticides for Flanders and a tool to calculate substance-specific vulnerability maps at the scale of Flanders and at the local scale. (1) The generic vulnerability maps are constructed using an index-based method in which maps of the main factors in the soil and saturated zone contributing to high concentrations of pesticides in groundwater are classified and overlain. Different weights are assigned to the contributing factors according to the type of pesticide (low/high mobility, low/high persistence). Factors that are taken into account are the organic matter content and texture of the soil, the depth of the unsaturated zone, the organic carbon and redox potential of the phreatic groundwater, and the thickness and conductivity of the phreatic layer. (2) Secondly, a tool is developed that calculates substance-specific vulnerability maps for Flanders using a hybrid approach in which the process-based leaching model GeoPEARL is combined with vulnerability indices that account for dilution in the phreatic layer. The GeoPEARL model is parameterized for Flanders in 1434 unique combinations of soil properties, climate and groundwater depth. Leaching is calculated over a 20-year period for each 50 × 50 m grid cell in Flanders. (3) At the local scale, finally, a fully process-based approach is applied, combining GeoPEARL leaching calculations with flowline calculations of pesticide transport in the saturated zone to define critical zones in the capture zone of a receptor such as a drinking water well or a river segment. The three approaches are explained in more detail and illustrated with results for the entire Flanders region and for a case study of a drinking water production site in West Flanders.
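
    A minimal sketch of the index-based overlay behind the generic maps: classify each factor raster, weight it by pesticide type, and sum. The factor names and weights below are invented placeholders, not the study's calibrated values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    shape = (100, 100)
    factors = {                                    # classes 1 (low) .. 5 (high)
        "soil_organic_matter":  rng.integers(1, 6, shape),
        "unsaturated_depth":    rng.integers(1, 6, shape),
        "aquifer_conductivity": rng.integers(1, 6, shape),
    }
    weights = {                            # e.g. a mobile, short-lived compound
        "soil_organic_matter": 0.5,
        "unsaturated_depth": 0.3,
        "aquifer_conductivity": 0.2,
    }

    vulnerability = sum(w * factors[name] for name, w in weights.items())
    ```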

  18. Interactive Web Interface to the Global Strain Rate Map Project

    NASA Astrophysics Data System (ADS)

    Meertens, C. M.; Estey, L.; Kreemer, C.; Holt, W.

    2004-05-01

    An interactive web interface allows users to explore the results of a global strain rate and velocity model and to compare them with other geophysical observations. The most recent model, an updated version of Kreemer et al., 2003, has 25 independent rigid plate-like regions separated by deformable boundaries covered by about 25,000 grid areas. A least-squares fit was made to 4900 geodetic velocities from 79 different geodetic studies. In addition, Quaternary fault slip rate data are used to infer geologic strain rate estimates (currently only for central Asia). Information about the style and direction of expected strain rate is inferred from the principal axes of the seismic strain rate field. The current model, as well as source data, references and an interactive map tool, are located at the International Lithosphere Program (ILP) "A Global Strain Rate Map (ILP II-8)" project website: http://www-world-strain-map.org. The purpose of the ILP GSRM project is to provide new information from this and other investigations that will contribute to a better understanding of continental dynamics and to the quantification of seismic hazards. A unique aspect of the GSRM interactive Java map tool is that the user can zoom in and make custom views of the model grid and results for any area of the globe, selecting strain rate and style contour plots and principal axes, observed and model velocity fields in specified frames of reference, and geologic fault data. The results can be displayed with other data sets such as Harvard CMT earthquake focal mechanisms, stress directions from the ILP World Stress Map Project, and topography. With the GSRM Java map tool, the user views custom maps generated by a Generic Mapping Tools (GMT) server. These interactive capabilities greatly extend what it is possible to present in a published paper. A JavaScript version, using pre-constructed maps, as well as a related information site, have also been created for broader education and outreach access. The GSRM map tool will be demonstrated, and the latest model GSRM 1.1 results, containing important new data for Asia, Iran, the western Pacific, and Southern California, will be presented.

  19. ADOPT: A tool for automatic detection of tectonic plates at the surface of convection models

    NASA Astrophysics Data System (ADS)

    Mallard, C.; Jacquet, B.; Coltice, N.

    2017-08-01

    Mantle convection models with plate-like behavior produce surface structures comparable to Earth's plate boundaries. However, analyzing those structures is a difficult task, since convection models produce, as on Earth, diffuse deformation and elusive plate boundaries. We therefore present and share a quantitative tool to identify plate boundaries and produce plate polygon layouts from the results of numerical convection models: Automatic Detection Of Plate Tectonics (ADOPT). This digital tool operates within the free open-source visualization software ParaView. It is based on image segmentation techniques for detecting objects. The fundamental algorithm used in ADOPT is the watershed transform. We transform the output of convection models into a topographic map, the crest lines being the regions of deformation (plate boundaries) and the catchment basins being the plate interiors. We propose two generic protocols (the field and the distance methods) that we test against an independent visual detection of plate polygons. We show that ADOPT is effective in identifying the smaller plates and in closing plate polygons in areas where boundaries are diffuse or elusive. ADOPT allows the export of plate polygons in the standard OGR-GMT format for visualization, modification, and analysis under generic software packages such as GMT or GPlates.
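
    The essence of the field method fits in a few lines: treat the deformation field as topography and let a watershed transform carve out the basins (plates), seeding markers in the low-deformation interiors. The synthetic field below stands in for a convection model's output; scikit-image provides the transform.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.segmentation import watershed

    rng = np.random.default_rng(3)
    deformation = ndi.gaussian_filter(rng.random((200, 200)), sigma=8)  # stand-in

    # Plate cores = connected regions of low deformation; one marker per core.
    markers, _ = ndi.label(deformation < np.percentile(deformation, 20))
    plates = watershed(deformation, markers)  # basins = plates, crests = boundaries
    print(plates.max(), "plates detected")
    ```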

  20. Interactive Geophysical Mapping on the Web

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Hamburger, M.; Estey, L.; Weingroff, M.; Deardorff, R.; Holt, W.

    2002-12-01

    We have developed a set of interactive, web-based map utilities that make geophysical results accessible to a large number and variety of users. These tools provide access to pre-determined map regions via a simple HTML/JavaScript interface or to user-selectable areas using a Java interface to a Generic Mapping Tools (GMT) engine. Users can access a variety of maps, satellite images, and geophysical data at a range of spatial scales for the Earth and other planets of the solar system. Developed initially by UNAVCO for the study of global-scale geodynamic processes, the tools let users choose from a variety of base maps (satellite mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others) and then add a number of geographic and geophysical overlays, for example coastlines, political boundaries, rivers and lakes, NEIC earthquake and volcano locations, stress axes, and observed and model plate motion and deformation velocity vectors representing a compilation of 2933 geodetic measurements from around the world. The software design is flexible, allowing for construction of special editions for different target audiences. Custom maps have been implemented for UNAVCO as the "Jules Verne Voyager" and "Voyager Junior", for the International Lithosphere Project's "Global Strain Rate Map", and for EarthScope Education and Outreach as "EarthScope Voyager Jr.". For the latter, a number of EarthScope-specific features have been added, including locations of proposed USArray (seismic), Plate Boundary Observatory (geodetic), and San Andreas Fault Observatory at Depth sites, plus detailed maps and geographically referenced examples of EarthScope-related scientific investigations. In addition, we are developing a website that incorporates background materials and curricular activities that encourage users to explore Earth processes. A cluster of map-processing computers and nearly a terabyte of disk storage have been assembled to power the generation of interactive maps and provide space for a very large collection of map data. A portal to these map tools can be found at: http://jules.unavco.ucar.edu.

  21. Using the Generic Mapping Tools From Within the MATLAB, Octave and Julia Computing Environments

    NASA Astrophysics Data System (ADS)

    Luis, J. M. F.; Wessel, P.

    2016-12-01

    The Generic Mapping Tools (GMT) is a widely used software infrastructure tool set for analyzing and displaying geoscience data. Its power to analyze and process data and produce publication-quality graphics has made it one of several standard processing toolsets used by a large segment of the Earth and ocean sciences. GMT's strengths lie in superior publication-quality vector graphics, geodetic-quality map projections, robust data processing algorithms scalable to enormous data sets, and the ability to run under all common operating systems. The GMT tool chest offers over 120 modules sharing a common set of command options, file structures, and documentation. GMT modules are command-line tools that accept input and write output, and this design allows users to write scripts in which one module's output becomes another module's input, creating highly customized GMT workflows. With the release of GMT 5, these modules are high-level functions with a C API, potentially allowing users access to high-level GMT capabilities from any programmable environment. Many scientists who use GMT also use other computational tools, such as MATLAB® and its clone Octave. We have built a MATLAB/Octave interface on top of the GMT 5 C API. Thus, MATLAB or Octave now has full access to all GMT modules as well as fundamental input/output of GMT data objects via a MEX function. Internally, the GMT/MATLAB C API defines six high-level composite data objects that handle input and output of data via individual GMT modules. These are data tables, grids, text tables (text/data mixed records), color palette tables, raster images (1-4 color bands), and PostScript. The API is responsible for translating between the six GMT objects and the corresponding native MATLAB objects. References to data arrays are passed if transposing of matrices is not required. The GMT and MATLAB/Octave combination is extremely flexible, letting the user harvest the general numerical and graphical capabilities of both systems, and represents a giant step forward in interoperability between GMT and other software packages. We will present examples of the symbiotic benefits of combining these platforms. Two other extensions are also in the works: a nearly finished Julia wrapper and an embryonic Python module. Publication supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz

  22. Working in disadvantaged communities: What additional competencies do we need?

    PubMed Central

    Harris, Elizabeth; Harris, Mark F; Madden, Lynne; Wise, Marilyn; Sainsbury, Peter; MacDonald, John; Gill, Betty

    2009-01-01

    Background Residents of socioeconomically disadvantaged locations are more likely to have poor health than residents of socioeconomically advantaged locations, and this has been comprehensively mapped in Australian cities. These inequalities present a challenge for the public health workers based in or responsible for improving the health of people living in disadvantaged localities. The purpose of this study was to develop a generic workforce needs assessment tool and to use it to identify the competencies needed by the public health workforce to work effectively in disadvantaged communities. Methods A two-step mixed-method process was used to identify the workforce needs. In step 1 a generic workforce needs assessment tool was developed and applied in three NSW Area Health Services using focus groups, key stakeholder interviews and a staff survey. In step 2 the findings of this needs assessment process were mapped against the existing National Health Training Package (HLT07) competencies, gaps were identified, additional competencies described and modules of training developed to fill identified gaps. Results There was a high level of agreement among the AHS staff on the nature of the problems to be addressed but less confidence identifying the work to be done. Processes for needs assessments, community consultations and adapting mainstream programs to local needs were frequently mentioned as points of intervention. Recruiting and retaining experienced staff to work in these communities and ensuring their safety were major concerns. Workforce skill development needs were seen in two areas: higher-order planning/epidemiological skills and more effective working relationships with communities and other sectors. Organisational barriers to effective practice were high levels of annual compulsory training, balancing state and national priorities with local needs, and giving equal attention to the population groups that are easy to reach and to those that are difficult to engage. A number of additional competency areas were identified and three training modules developed. Conclusion The generic workforce needs assessment tool was easy to use and interpret. It appears that the public health workforce involved in this study has a high level of understanding of the relationship between the social determinants and health. However, there is a skill gap in identifying and undertaking effective interventions. PMID:19393091

  23. The GMT/MATLAB Toolbox

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  24. Color management with a hammer: the B-spline fitter

    NASA Astrophysics Data System (ADS)

    Bell, Ian E.; Liu, Bonny H. P.

    2003-01-01

    To paraphrase Abraham Maslow: If the only tool you have is a hammer, every problem looks like a nail. We have a B-spline fitter customized for 3D color data, and many problems in color management can be solved with this tool. Whereas color devices were once modeled with extensive measurement, look-up tables and trilinear interpolation, recent improvements in hardware have made B-spline models an affordable alternative. Such device characterizations require fewer color measurements than piecewise linear models, and have uses beyond simple interpolation. A B-spline fitter, for example, can act as a filter to remove noise from measurements, leaving a model with guaranteed smoothness. Inversion of the device model can then be carried out consistently and efficiently, as the spline model is well behaved and its derivatives easily computed. Spline-based algorithms also exist for gamut mapping, the composition of maps, and the extrapolation of a gamut. Trilinear interpolation, a degree-one spline, can still be used after nonlinear spline smoothing for high-speed evaluation with robust convergence. Using data from several color devices, this paper examines the use of B-splines as a generic tool for modeling devices and mapping one gamut to another, and concludes with applications to high-dimensional and spectral data.
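
    A one-dimensional analogue of that machinery, using SciPy: a smoothing B-spline filters noisy device measurements, and its derivative comes essentially for free, which is what makes Newton-style inversion of the model cheap. The synthetic curve stands in for real 3D color data.

    ```python
    import numpy as np
    from scipy.interpolate import splev, splrep

    x = np.linspace(0, 1, 50)
    rng = np.random.default_rng(4)
    y = x**2.2 + rng.normal(0, 0.01, 50)  # noisy "device response" (synthetic)

    tck = splrep(x, y, s=0.01)    # s > 0 trades fidelity for guaranteed smoothness
    y_smooth = splev(x, tck)      # filtered model of the device
    dy_dx = splev(x, tck, der=1)  # exact spline derivative, useful for inversion
    ```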

  25. Modifications to risk-targeted seismic design maps for subduction and near-fault hazards

    USGS Publications Warehouse

    Liel, Abbie B.; Luco, Nicolas; Raghunandan, Meera; Champion, C.; Haukaas, Terje

    2015-01-01

    ASCE 7-10 introduced new seismic design maps that define risk-targeted ground motions such that buildings designed according to these maps will have 1% chance of collapse in 50 years. These maps were developed by iterative risk calculation, wherein a generic building collapse fragility curve is convolved with the U.S. Geological Survey hazard curve until target risk criteria are met. Recent research shows that this current approach may be unconservative at locations where the tectonic environment is much different than that used to develop the generic fragility curve. This study illustrates how risk-targeted ground motions at selected sites would change if generic building fragility curve and hazard assessment were modified to account for seismic risk from subduction earthquakes and near-fault pulses. The paper also explores the difficulties in implementing these changes.
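
    A numerical sketch of that iterative risk calculation: convolve a lognormal collapse fragility with the site hazard curve, then rescale the fragility median until the 50-year collapse probability hits 1%. The power-law hazard curve and the dispersion value are toy inputs, not the USGS data.

    ```python
    import numpy as np
    from scipy.stats import norm

    sa = np.linspace(0.01, 5, 2000)           # spectral acceleration grid (g)
    hazard = 1e-4 * sa**-2.5                  # toy annual exceedance curve H(Sa)
    rate_density = -np.gradient(hazard, sa)   # |dH/dSa|
    beta = 0.6                                # fragility dispersion (assumed)

    def collapse_prob_50yr(median):
        fragility = norm.cdf(np.log(sa / median) / beta)
        annual_rate = np.trapz(fragility * rate_density, sa)
        return 1.0 - np.exp(-50.0 * annual_rate)

    median = 1.0                              # initial fragility median (g)
    for _ in range(60):                       # damped fixed-point iteration
        median *= (collapse_prob_50yr(median) / 0.01) ** 0.3
    print(round(median, 3), collapse_prob_50yr(median))
    ```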

  26. Simulation of facial expressions using person-specific sEMG signals controlling a biomechanical face model.

    PubMed

    Eskes, Merijn; Balm, Alfons J M; van Alphen, Maarten J A; Smeele, Ludi E; Stavness, Ian; van der Heijden, Ferdinand

    2018-01-01

    Functional inoperability in advanced oral cancer is difficult to assess preoperatively. To assess functions of lips and tongue, biomechanical models are required. Apart from adjusting generic models to individual anatomy, muscle activation patterns (MAPs) driving patient-specific functional movements are necessary to predict remaining functional outcome. We aim to evaluate how volunteer-specific MAPs derived from surface electromyographic (sEMG) signals control a biomechanical face model. Muscle activity of seven facial muscles in six volunteers was measured bilaterally with sEMG. A triple camera set-up recorded 3D lip movement. The generic face model in ArtiSynth was adapted to our needs. We controlled the model using the volunteer-specific MAPs. Three activation strategies were tested: activating all muscles [Formula: see text], selecting the three muscles showing highest muscle activity bilaterally [Formula: see text]-this was calculated by taking the mean of left and right muscles and then selecting the three with highest variance-and activating the muscles considered most relevant per instruction [Formula: see text], bilaterally. The model's lip movement was compared to the actual lip movement performed by the volunteers, using 3D correlation coefficients [Formula: see text]. The correlation coefficient between simulations and measurements with [Formula: see text] resulted in a median [Formula: see text] of 0.77. [Formula: see text] had a median [Formula: see text] of 0.78, whereas with [Formula: see text] the median [Formula: see text] decreased to 0.45. We demonstrated that MAPs derived from noninvasive sEMG measurements can control movement of the lips in a generic finite element face model with a median [Formula: see text] of 0.78. Ultimately, this is important to show the patient-specific residual movement using the patient's own MAPs. When the required treatment tools and personalisation techniques for geometry and anatomy become available, this may enable surgeons to test the functional results of wedge excisions for lip cancer in a virtual environment and to weigh surgery versus organ-sparing radiotherapy or photodynamic therapy.

  27. Application of process mapping to understand integration of high risk medicine care bundles within community pharmacy practice.

    PubMed

    Weir, Natalie M; Newham, Rosemary; Corcoran, Emma D; Ali Atallah Al-Gethami, Ashwag; Mohammed Abd Alridha, Ali; Bowie, Paul; Watson, Anne; Bennie, Marion

    2017-11-21

    The Scottish Patient Safety Programme - Pharmacy in Primary Care collaborative is a quality improvement initiative adopting the Institute of Healthcare Improvement Breakthrough Series collaborative approach. The programme developed and piloted High Risk Medicine (HRM) Care Bundles (CB), focused on warfarin and non-steroidal anti-inflammatories (NSAIDs), within 27 community pharmacies over 4 NHS Regions. Each CB involves clinical assessment and patient education, although the CB content varies between regions. To support national implementation, this study aims to understand how the pilot pharmacies integrated the HRM CBs into routine practice to inform the development of a generic HRM CB process map. Regional process maps were developed in 4 pharmacies through simulation of the CB process, staff interviews and documentation of resources. Commonalities were collated to develop a process map for each HRM, which were used to explore variation at a national event. A single, generic process map was developed which underwent validation by case study testing. The findings allowed development of a generic process map applicable to warfarin and NSAID CB implementation. Five steps were identified as required for successful CB delivery: patient identification; clinical assessment; pharmacy CB prompt; CB delivery; and documentation. The generic HRM CB process map encompasses the staff and patients' journey and the CB's integration into routine community pharmacy practice. Pharmacist involvement was required only for clinical assessment, indicating suitability for whole-team involvement. Understanding CB integration into routine practice has positive implications for successful implementation. The generic process map can be used to develop targeted resources, and/or be disseminated to facilitate CB delivery and foster whole team involvement. Similar methods could be utilised within other settings, to allow those developing novel services to distil the key processes and consider their integration within routine workflows to effect maximal, efficient implementation and benefit to patient care.

  28. Facet Theory and the Mapping Sentence As Hermeneutically Consistent Structured Meta-Ontology and Structured Meta-Mereology

    PubMed Central

    Hackett, Paul M. W.

    2016-01-01

    When behavior is interpreted in a reliable manner (i.e., robustly across different situations and times), its explained meaning may be seen to possess hermeneutic consistency. In this essay I present an evaluation of the hermeneutic consistency that I propose may be present when the research tool known as the mapping sentence is used to create generic structural ontologies. I also claim that theoretical and empirical validity is a likely result of employing the mapping sentence in research design and interpretation. These claims are non-contentious within the realm of quantitative psychological and behavioral research. However, I extend the scope of both facet theory based research and claims for its structural utility, reliability and validity to philosophical and qualitative investigations. I assert that the hermeneutic consistency of a structural ontology is a product of a structural representation's ontological components and the mereological relationships between these ontological sub-units: the mapping sentence seminally allows for the depiction of such structure. PMID:27065932

  29. Keep your models up-to-date: connecting community mapping data to complex urban flood modelling

    NASA Astrophysics Data System (ADS)

    Winsemius, Hessel; Eilander, Dirk; Ward, Philip; Diaz Loaiza, Andres; Iliffe, Mark; Mawanda, Shaban; Luo, Tianyi; Kimacha, Nyambiri; Chen, Jorik

    2017-04-01

    The world is urbanizing rapidly. According to the United Nations' World Urbanization Prospects, 50% of the global population already lives in urban areas today. This number is expected to grow to 66% by 2050. The rapid changes in these urban environments go hand in hand with rapid changes in natural hazard risks, in particular in informal, unplanned neighbourhoods. In Dar es Salaam, Tanzania, flood risk dominates, and given the rapid changes in the city, continuous updates of detailed street-level hazard and risk mapping are needed to adequately support decision making for urban planning, infrastructure design and disaster response. Over the past years, the Ramani Huria and Zuia Mafuriko projects have mapped the most flood-prone neighbourhoods, including roads, buildings, drainage and land use, and contributed data to the open-source OpenStreetMap database. In this contribution, we demonstrate how we mobilize these contributed data to establish dynamic flood models for Dar es Salaam and keep them up-to-date by making a direct link between the data and the model schematization. The tools automatically establish a sound 1D drainage network as well as a high-resolution terrain dataset by fusing the OpenStreetMap data with existing lower-resolution terrain data such as the globally available satellite-based SRTM 30. They then translate these fully automatically into the inputs required for the D-HYDRO modeling suite. Our tools are built so that community and stakeholder knowledge can be incorporated into the model through workshops, allowing missing essential information about the city's details to be added on the fly. This process creates a continuous dialogue between the members of the community who collect data and the stakeholders requiring data for flood models. Moreover, the taxonomy and data filtering used can be configured for conditions in other cities, making the tools generic and scalable. The tools are made available open-source.

  30. Globes from global data: Charting international research networks with the GRASS GIS r.out.polycones add-on module.

    NASA Astrophysics Data System (ADS)

    Löwe, Peter

    2015-04-01

    Many Free and Open Source Software (FOSS) tools have been created for the various application fields within geoscience. While FOSS allows re-implementation of functionalities in new environments through access to the original codebase, the easiest approach to building new software solutions for new problems is the combination or merging of existing software tools. Such mash-ups are implemented by embedding and encapsulating FOSS tools within one another, effectively focusing the use of the embedded software on the specific role it needs to perform in the given scenario, while ignoring all its other capabilities. GRASS GIS is a powerful and established FOSS GIS for raster, vector and volume data processing, while the Generic Mapping Tools (GMT) are a suite of powerful Open Source mapping tools, which exceed the mapping capabilities of GRASS GIS. This poster reports on the new GRASS GIS add-on module r.out.polycones. It enables users to utilize non-continuous projections for map production within the GRASS production environment. This is implemented on the software level by encapsulating a subset of GMT mapping capabilities in a GRASS GIS (version 6.x) add-on module. The module was developed at the German National Library of Science and Technology (TIB) to provide custom global maps of scientific collaboration networks, such as the DataCite consortium, the registration agency for Digital Object Identifiers (DOI) for research data. The GRASS GIS add-on module can be used for global mapping of raster data into a variety of non-continuous sinusoidal projections, allowing the creation of printable biangles (gores) to be used for globe making. Due to the well-structured modular nature of GRASS modules, technical follow-up work will focus on API-level Python-based integration in GRASS 7 [1]. Based on this, GMT-based mapping capabilities in GRASS will be extended beyond non-continuous sinusoidal maps and advanced from raster layers to the contents of GRASS display monitors. References: [1] Petras, V., Petrasova, A., Chemin, Y., Zambelli, P., Landa, M., Gebbert, S., Neteler, N., Löwe, P.: Analyzing rasters, vectors and time series using new Python interfaces in GRASS GIS 7, Geophysical Research Abstracts Vol. 17, EGU2015-8142, 2015 (in preparation)

  31. Comparative map and trait viewer (CMTV): an integrated bioinformatic tool to construct consensus maps and compare QTL and functional genomics data across genomes and experiments.

    PubMed

    Sawkins, M C; Farmer, A D; Hoisington, D; Sullivan, J; Tolopko, A; Jiang, Z; Ribaut, J-M

    2004-10-01

    In the past few decades, a wealth of genomic data has been produced in a wide variety of species using a diverse array of functional and molecular marker approaches. In order to unlock the full potential of the information contained in these independent experiments, researchers need efficient and intuitive means to identify common genomic regions and genes involved in the expression of target phenotypic traits across diverse conditions. To address this need, we have developed a Comparative Map and Trait Viewer (CMTV) tool that can be used to construct dynamic aggregations of a variety of types of genomic datasets. By algorithmically determining correspondences between sets of objects on multiple genomic maps, the CMTV can display syntenic regions across taxa, combine maps from separate experiments into a consensus map, or project data from different maps into a common coordinate framework using dynamic coordinate translations between source and target maps. We present a case study that illustrates the utility of the tool for managing large and varied datasets by integrating data collected by CIMMYT in maize drought tolerance research with data from public sources. This example will focus on one of the visualization features for Quantitative Trait Locus (QTL) data, using likelihood ratio (LR) files produced by generic QTL analysis software and displaying the data in a unique visual manner across different combinations of traits, environments and crosses. Once a genomic region of interest has been identified, the CMTV can search and display additional QTLs meeting a particular threshold for that region, or other functional data such as sets of differentially expressed genes located in the region; it thus provides an easily used means for organizing and manipulating data sets that have been dynamically integrated under the focus of the researcher's specific hypothesis.
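
    The coordinate translation underlying such consensus views can be approximated with a piecewise-linear map defined by markers shared between two maps, as in this sketch (positions in cM are invented):

    ```python
    import numpy as np

    shared_src = np.array([0.0, 20.0, 55.0, 90.0])   # shared markers, source map
    shared_tgt = np.array([0.0, 25.0, 60.0, 110.0])  # same markers, target map

    def project(pos_cm):
        """Project a source-map position (cM) onto the target map's coordinates."""
        return np.interp(pos_cm, shared_src, shared_tgt)

    print(project(40.0))   # a QTL peak at 40 cM lands at 45 cM on the target map
    ```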

  32. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  13. A tool for exploring space-time patterns: an animation user research.

    PubMed

    Ogao, Patrick J

    2006-08-29

    Ever since Dr. John Snow (1813-1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors demands that they be accurate and easy to interpret, enabling prompt decision making by health experts. Similar spatio-temporal maps are seen in urban growth and census mapping--all critical aspects of a country's socio-economic development. In this paper, a user study was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Three types of animation were used, namely passive, interactive and inference-based animation, the key differences between them being the level of interactivity and the complementary domain knowledge that each offers the user. Passive animation maintains a view-only status: the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media player controls, navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of the animations' passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics background (map reading, interpretation and analysis abilities). Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool for identifying, interpreting and providing explanations about observed geospatial phenomena. Also, exploring geospatial data structures using animation is best achieved using provocative interactive tools, as was seen with the inference-based animation. The visual methods employed using the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge discovery environment.

  14. A tool for exploring space-time patterns : an animation user research

    PubMed Central

    Ogao, Patrick J

    2006-01-01

    Background Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors demands that they be accurate and easy to interpret, enabling prompt decision making by health experts. Similar spatio-temporal maps are seen in urban growth and census mapping – all critical aspects of a country's socio-economic development. In this paper, a user study was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Results Three types of animation were used, namely passive, interactive and inference-based animation, the key differences between them being the level of interactivity and the complementary domain knowledge that each offers the user. Passive animation maintains a view-only status: the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media player controls, navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of the animations' passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics background (map reading, interpretation and analysis abilities). Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool for identifying, interpreting and providing explanations about observed geospatial phenomena. Also, exploring geospatial data structures using animation is best achieved using provocative interactive tools, as was seen with the inference-based animation. The visual methods employed using the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. Conclusion The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge discovery environment. PMID:16938138

  15. Digit replacement: A generic map for nonlinear dynamical systems.

    PubMed

    García-Morales, Vladimir

    2016-09-01

    A simple discontinuous map is proposed as a generic model for nonlinear dynamical systems. The orbit of the map admits exact solutions for wide regions in parameter space and the method employed (digit manipulation) allows the mathematical design of useful signals, such as regular or aperiodic oscillations with specific waveforms, the construction of complex attractors with nontrivial properties as well as the coexistence of different basins of attraction in phase space with different qualitative properties. A detailed analysis of the dynamical behavior of the map suggests how the latter can be used in the modeling of complex nonlinear dynamics including, e.g., aperiodic nonchaotic attractors and the hierarchical deposition of grains of different sizes on a surface.
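
    The map itself is not reproduced in the abstract. As a minimal, generic illustration of digit-manipulation dynamics (not the published map), the decimal shift map makes the orbit literally readable from the digits of the initial condition, which is the spirit of the signal-design idea described above.

      from fractions import Fraction

      def decimal_shift(x):
          # The simplest digit-manipulation map: shift the decimal expansion
          # one digit to the left, i.e. x -> 10x mod 1. Each step discards
          # the leading digit, so choosing the digits of the initial
          # condition scripts the entire orbit.
          return (10 * x) % 1

      x = Fraction(123456789, 10**9)   # exact arithmetic for 0.123456789
      for _ in range(4):
          x = decimal_shift(x)
          print(float(x))              # 0.23456789, 0.3456789, ...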

  16. Creation of a Web-Based GIS Server and Custom Geoprocessing Tools for Enhanced Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.

    2010-12-01

    Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for the assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets that were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings including: (1) an extensional environment (Red Sea rift), (2) a transcurrent fault system (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS could also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was also developed to allow for a more complete and customizable view of the area of interest. The most notable addition to the standard GIS Server tools is the custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster creation, profile, TRMM). The generation of a wide range of derivative maps (e.g., buffer zone, contour map, graphs, temporal rainfall distribution maps) from various map layers (e.g., geologic maps, geophysics, satellite images) allows for more user flexibility. The use of these tools, along with the Google Maps API, enables website users to combine the high-quality GeoEye 2 images provided by Google with our data, creating a more complete image of the area being observed and allowing custom derivative maps to be created in the field and viewed immediately on the web, processes that were previously restricted to offline databases.

  17. Visual Data Analysis for Satellites

    NASA Technical Reports Server (NTRS)

    Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick

    2008-01-01

    The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tools (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, time series, and color fill images.
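
    The package drives GMT through shell scripts, which are not shown here; a comparable "difference map" can be sketched through PyGMT, GMT's official Python wrapper. The station coordinates and difference values below are made up for illustration (note that PyGMT >= 0.8 uses fill=; older versions call the same argument color=).

      import numpy as np
      import pygmt

      # Hypothetical buoy positions and SST differences (satellite minus in situ).
      lon = np.array([-95.0, -90.5, -88.2, -85.7])
      lat = np.array([25.1, 27.4, 28.9, 26.3])
      diff = np.array([-0.4, 0.2, 0.7, -0.1])

      fig = pygmt.Figure()
      fig.basemap(region=[-100, -80, 20, 32], projection="M15c", frame=True)
      fig.coast(shorelines=True, land="gray90")
      # Color-code the difference values to mimic a GMT difference map.
      pygmt.makecpt(cmap="polar", series=[-1, 1])
      fig.plot(x=lon, y=lat, style="c0.35c", fill=diff, cmap=True, pen="black")
      fig.colorbar(frame='af+l"SST difference (C)"')
      fig.savefig("difference_map.png")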

  18. Voyager Interactive Web Interface to EarthScope

    NASA Astrophysics Data System (ADS)

    Eriksson, S. C.; Meertens, C. M.; Estey, L.; Weingroff, M.; Hamburger, M. W.; Holt, W. E.; Richard, G. A.

    2004-12-01

    Visualization of data is essential in helping scientists and students develop a conceptual understanding of relationships among many complex types of data and keep track of large amounts of information. Developed initially by UNAVCO for the study of global-scale geodynamic processes, the Voyager map visualization tools have evolved into interactive, web-based map utilities that can make scientific results accessible to a large number and variety of educators and students as well as the originally targeted scientists. A portal to these map tools can be found at: http://jules.unavco.org. The Voyager tools provide on-line interactive data visualization through pre-determined map regions via a simple HTML/JavaScript interface (for large numbers of students using the tools simultaneously) or through student-selectable areas using a Java interface to a Generic Mapping Tools (GMT) engine. Students can access a variety of maps, satellite images, and geophysical data at a range of spatial scales for the earth and other planets of the solar system. Students can also choose from a variety of base maps (satellite mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others) and can then add a number of geographic and geophysical overlays, for example coastlines, political boundaries, rivers and lakes, earthquake and volcano locations, stress axes, and observed and model plate motion, as well as deformation velocity vectors representing a compilation of over 5000 geodetic measurements from around the world. The related educational website, "Exploring our Dynamic Planet" (http://www.dpc.ucar.edu/VoyagerJr/jvvjrtool.html), incorporates background materials and curricular activities that encourage students to explore Earth processes. One of the present curricular modules is designed for high school students or introductory-level undergraduate non-science majors. The purpose of the module is for students to examine real data to investigate how plate tectonic processes are reflected in observed geophysical phenomena. Constructing maps by controlling map parameters and answering open-ended questions which describe, compare relationships, and work with both observed and model data promotes conceptual understanding of plate tectonics and related processes. The goals of curricular development emphasize inquiry, development of critical thinking skills, and student-centered interests. Custom editions of the map utility have been made as the "Jules Verne Voyager" and "Voyager Junior", for the International Lithosphere Project's "Global Strain Rate Map", and for EarthScope Education and Outreach as "EarthScope Voyager Jr.". For the latter, a number of EarthScope-specific features have been added, including locations of proposed USArray (seismic), Plate Boundary Observatory (geodetic), and San Andreas Fault Observatory at Depth sites, plus detailed maps and geographically referenced examples of EarthScope-related scientific investigations. As EarthScope develops, maps will be updated in 'real time' so that students of all ages can use the data in formal and informal educational settings.

  19. Estimating B1+ in the breast at 7 T using a generic template.

    PubMed

    van Rijssel, Michael J; Pluim, Josien P W; Luijten, Peter R; Gilhuijs, Kenneth G A; Raaijmakers, Alexander J E; Klomp, Dennis W J

    2018-05-01

    Dynamic contrast-enhanced MRI is the workhorse of breast MRI, where the diagnosis of lesions is largely based on the enhancement curve shape. However, this curve shape is biased by RF transmit (B1+) field inhomogeneities. B1+ field information is required in order to correct these. The use of a generic, coil-specific B1+ template is proposed and tested. Finite-difference time-domain simulations for B1+ were performed for healthy female volunteers with a wide range of breast anatomies. A generic B1+ template was constructed by averaging simulations based on four volunteers. Three-dimensional B1+ maps were acquired in 15 other volunteers. Root mean square error (RMSE) metrics were calculated between individual simulations and the template, and between individual measurements and the template. The agreement between the proposed template approach and a B1+ mapping method was compared against the agreement between acquisition and reacquisition using the same mapping protocol. RMSE values (% of nominal flip angle) comparing individual simulations with the template were in the range 2.00-4.01%, with mean 2.68%. RMSE values comparing individual measurements with the template were in the range 8.1-16%, with mean 11.7%. The agreement between the proposed template approach and a B1+ mapping method was only slightly worse than the agreement between two consecutive acquisitions using the same mapping protocol in one volunteer: the range of agreement increased from ±16% of the nominal angle for the repeated measurement to ±22% for the B1+ template. With local RF transmit coils, intersubject differences in B1+ fields of the breast are comparable to the accuracy of B1+ mapping methods, even at 7 T. Consequently, a single generic B1+ template suits subjects over a wide range of breast anatomies, eliminating the need for a time-consuming B1+ mapping protocol. © 2018 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
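
    The RMSE metric quoted above is straightforward to compute; a minimal sketch, assuming flip-angle maps in degrees and the percentage-of-nominal normalization stated in the abstract (the arrays are hypothetical):

      import numpy as np

      def rmse_percent(measured, template, nominal_flip_angle):
          """RMSE between an individual B1+ map and the generic template,
          expressed as a percentage of the nominal flip angle, mirroring
          the metric quoted in the abstract."""
          measured = np.asarray(measured, dtype=float)
          template = np.asarray(template, dtype=float)
          return 100.0 * np.sqrt(np.mean((measured - template) ** 2)) / nominal_flip_angle

      # Hypothetical 2 x 2 flip-angle maps (degrees), 60-degree nominal angle.
      measured = np.array([[55.0, 63.0], [58.0, 61.0]])
      template = np.array([[57.0, 60.0], [59.0, 62.0]])
      print(rmse_percent(measured, template, 60.0))  # ~3.2%, a few percent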

  20. Estimating B1+ in the breast at 7 T using a generic template

    PubMed Central

    van Rijssel, Michael J.; Pluim, Josien P. W.; Luijten, Peter R.; Gilhuijs, Kenneth G. A.; Raaijmakers, Alexander J. E.; Klomp, Dennis W. J.

    2018-01-01

    Dynamic contrast-enhanced MRI is the workhorse of breast MRI, where the diagnosis of lesions is largely based on the enhancement curve shape. However, this curve shape is biased by RF transmit (B1+) field inhomogeneities. B1+ field information is required in order to correct these. The use of a generic, coil-specific B1+ template is proposed and tested. Finite-difference time-domain simulations for B1+ were performed for healthy female volunteers with a wide range of breast anatomies. A generic B1+ template was constructed by averaging simulations based on four volunteers. Three-dimensional B1+ maps were acquired in 15 other volunteers. Root mean square error (RMSE) metrics were calculated between individual simulations and the template, and between individual measurements and the template. The agreement between the proposed template approach and a B1+ mapping method was compared against the agreement between acquisition and reacquisition using the same mapping protocol. RMSE values (% of nominal flip angle) comparing individual simulations with the template were in the range 2.00-4.01%, with mean 2.68%. RMSE values comparing individual measurements with the template were in the range 8.1-16%, with mean 11.7%. The agreement between the proposed template approach and a B1+ mapping method was only slightly worse than the agreement between two consecutive acquisitions using the same mapping protocol in one volunteer: the range of agreement increased from ±16% of the nominal angle for the repeated measurement to ±22% for the B1+ template. With local RF transmit coils, intersubject differences in B1+ fields of the breast are comparable to the accuracy of B1+ mapping methods, even at 7 T. Consequently, a single generic B1+ template suits subjects over a wide range of breast anatomies, eliminating the need for a time-consuming B1+ mapping protocol. PMID:29570887

  1. Analysis of stationary displacement patterns in rotating machinery subject to local harmonic excitation

    NASA Astrophysics Data System (ADS)

    Österlind, Tomas; Kari, Leif; Nicolescu, Cornel Mihai

    2017-02-01

    Rotor vibration and stationary displacement patterns observed in rotating machinery subject to local harmonic excitation are analysed for improved understanding and dynamic characterization. The analysis stresses the importance of the coordinate transformation between the rotating and stationary frames of reference for accurate results and estimation of dynamic properties. A generic method which can be used for various rotor applications, such as machine tool spindle and turbomachinery vibration, is presented. The phenomenon shares similarities with stationary waves in rotating disks, though the focus here is on vibration in shafts. The paper further proposes a graphical tool, the displacement map, which can be used for the selection of stable rotational speeds for rotating machinery. The results are validated through simulation of the dynamic response of a milling cutter, which is a typical example of a variable speed rotor operating under different load conditions.
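
    A minimal sketch of the rotating-to-stationary coordinate transformation the analysis stresses (not the authors' full method): a displacement fixed in the rotating frame is carried into the stationary frame by a planar rotation through the accumulated shaft angle.

      import numpy as np

      def rotating_to_stationary(xy_rot, omega, t):
          """Transform a displacement measured in the rotating frame into
          the stationary frame for a shaft spinning at angular speed
          omega (rad/s)."""
          c, s = np.cos(omega * t), np.sin(omega * t)
          R = np.array([[c, -s],
                        [s,  c]])   # planar rotation by the shaft angle
          return R @ np.asarray(xy_rot)

      # A unit displacement fixed in the rotating frame traces a circle
      # (a stationary displacement pattern) in the stationary frame.
      for t in np.linspace(0.0, 0.5, 5):
          print(rotating_to_stationary([1.0, 0.0], omega=4 * np.pi, t=t))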

  2. The coffee genome hub: a resource for coffee genomes

    PubMed Central

    Dereeper, Alexis; Bocs, Stéphanie; Rouard, Mathieu; Guignon, Valentin; Ravel, Sébastien; Tranchant-Dubreuil, Christine; Poncet, Valérie; Garsmeur, Olivier; Lashermes, Philippe; Droc, Gaëtan

    2015-01-01

    The whole genome sequence of Coffea canephora, the perennial diploid species known as Robusta, has been recently released. In the context of the C. canephora genome sequencing project and to support post-genomics efforts, we developed the Coffee Genome Hub (http://coffee-genome.org/), an integrative genome information system that allows centralized access to genomics and genetics data and analysis tools to facilitate translational and applied research in coffee. We provide the complete genome sequence of C. canephora along with gene structure, gene product information, metabolism, gene families, transcriptomics, syntenic blocks, genetic markers and genetic maps. The hub relies on generic software (e.g. GMOD tools) for easy querying, visualizing and downloading research data. It includes a Genome Browser enhanced by a Community Annotation System, enabling the improvement of automatic gene annotation through an annotation editor. In addition, the hub aims at developing interoperability among other existing South Green tools managing coffee data (phylogenomics resources, SNPs) and/or supporting data analyses with the Galaxy workflow manager. PMID:25392413

  3. Tripal v1.1: a standards-based toolkit for construction of online genetic and genomic databases.

    PubMed

    Sanderson, Lacey-Anne; Ficklin, Stephen P; Cheng, Chun-Huai; Jung, Sook; Feltus, Frank A; Bett, Kirstin E; Main, Dorrie

    2013-01-01

    Tripal is an open-source freely available toolkit for construction of online genomic and genetic databases. It aims to facilitate development of community-driven biological websites by integrating the GMOD Chado database schema with Drupal, a popular website creation and content management software. Tripal provides a suite of tools for interaction with a Chado database and display of content therein. The tools are designed to be generic to support the various ways in which data may be stored in Chado. Previous releases of Tripal have supported organisms, genomic libraries, biological stocks, stock collections and genomic features, their alignments and annotations. Also, Tripal and its extension modules provided loaders for commonly used file formats such as FASTA, GFF, OBO, GAF, BLAST XML, KEGG heir files and InterProScan XML. Default generic templates were provided for common views of biological data, which could be customized using an open Application Programming Interface to change the way data are displayed. Here, we report additional tools and functionality that are part of release v1.1 of Tripal. These include (i) a new bulk loader that allows a site curator to import data stored in a custom tab delimited format; (ii) full support of every Chado table for Drupal Views (a powerful tool allowing site developers to construct novel displays and search pages); (iii) new modules including 'Feature Map', 'Genetic', 'Publication', 'Project', 'Contact' and the 'Natural Diversity' modules. Tutorials, mailing lists, download and set-up instructions, extension modules and other documentation can be found at the Tripal website located at http://tripal.info. DATABASE URL: http://tripal.info/.

  4. 7 CFR 1485.10 - General purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... participants may receive assistance for either generic or brand promotion activities. EIP/MAP participants are U.S. commercial entities that receive assistance for brand promotion activities. (c) The MAP and EIP...

  5. A GIS-based generic real-time risk assessment framework and decision tools for chemical spills in the river basin.

    PubMed

    Jiang, Jiping; Wang, Peng; Lung, Wu-seng; Guo, Liang; Li, Mei

    2012-08-15

    This paper presents a generic framework and decision tools for real-time risk assessment within an Emergency Environmental Decision Support System for response to chemical spills in river basins. The generic "4-step-3-model" framework is able to delineate the warning area and the impact on vulnerable receptors, considering four types of hazards relating to functional areas, societal impact, human health and the ecological system. Decision tools, including a stand-alone system and software components, were implemented on a GIS platform. A detailed case study of the Songhua River nitrobenzene spill illustrated the effectiveness of the framework and tools. Spill first responders and decision makers in catchment management will benefit from the rich, visual and dynamic hazard information output by the software. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Recent Developments Related To An Optically Controlled Microwave Phased Array Antenna.

    NASA Astrophysics Data System (ADS)

    Kittel, A.; Peinke, J.; Klein, M.; Baier, G.; Parisi, J.; Rössler, O. E.

    1990-12-01

    A generic 3-dimensional diffeomorphic map, with constant Jacobian determinant, is proposed and looked at numerically. It contains a lower-dimensional basin boundary along which a chaotic motion takes place. This boundary is nowhere differentiable in one direction. Therefore, nowhere differentiable limit sets exist generically in nature.

  7. Mapping Generic Skills Curricula: Outcomes and Discussion

    ERIC Educational Resources Information Center

    Robley, Will; Whittle, Sue; Murdoch-Eaton, Deborah

    2005-01-01

    Generic skills development is increasingly being embedded into UK higher education curricula to improve the employability and lifelong learning skills of graduates. At the same time universities are being required to benchmark their curricular outcomes against national and employer standards. This paper presents and discusses the results of a…

  8. An ocean gazetteer for education and research

    NASA Astrophysics Data System (ADS)

    Delaney, R.; Staudigel, D.; Staudigel, H.

    2003-04-01

    Global travel, economy, and news coverage often challenge the student's and teacher's knowledge of the geography of the seas. The International Hydrographic Organization (IHO) has published a description of all the major seas making up earth's oceans, but there is currently no electronic tool that identifies them on a digital map. During an internship at Scripps Institution of Oceanography, we transferred the printed visual description of the seas from IHO publication 23 into a digital format. This digital map was turned into a (Flash) web application that allows a user to identify any of the IHO seas on a world map, simply by moving the computer cursor over it. In our presentation, we will describe the path taken to produce this web application and the learning process involved along the way during our internship at Scripps. The main steps in this process included the digitization of the official IHO maps and the transfer of this information onto a modern digital map by Smith and Sandwell. Adjustments were necessary because many of the landmasses were placed incorrectly on a lat/long grid, off by as much as 100 km. Boundaries between seas were often misrepresented by the IHO as straight lines on a Mercator projection. Once the digitization of the seas was completed, we used the 2D animation environment Flash to produce an interactive map environment that allows any teacher or student of ocean geography to identify an ocean by name and location. Aside from learning about the geography of the oceans, we were introduced to the use of digitizers, learned to make maps using Generic Mapping Tools (GMT) and digital global bathymetry data sets, and learned about map projections. We studied Flash to produce an interactive map of the oceans that displays bathymetry and topography, highlighting any particular sea the cursor moves across. The name of the selected sea in our Flash application appears in a textbox at the bottom of the map. The result of this project can be found at http://earthref.org/PACER/beta/IH023seas.

  9. A period-doubling cascade precedes chaos for planar maps.

    PubMed

    Sander, Evelyn; Yorke, James A

    2013-09-01

    A period-doubling cascade is often seen in numerical studies of those smooth (one-parameter families of) maps for which, as the parameter is varied, the map transitions from one without chaos to one with chaos. Our emphasis in this paper is on establishing the existence of such a cascade for many maps with phase space dimension 2. We use continuation methods to show the following: under certain general assumptions, if at one parameter there are only finitely many periodic orbits, and at another parameter value there is chaos, then between those two parameter values there must be a cascade. We investigate only families that are generic in the sense that all periodic orbit bifurcations are generic. Our method of proof in showing there is one cascade is to show there must be infinitely many cascades. We discuss in detail two-dimensional families like those which arise as time-2π maps for the Duffing equation and the forced damped pendulum equation.
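
    The Duffing and pendulum time-2π maps require integrating an ODE; the Hénon map is a standard planar stand-in whose cascade can be tabulated the same way. A sketch, with parameter values chosen for illustration (not from the paper):

      def henon_samples(a, b=0.3, n_transient=2000, n_keep=100):
          # Iterate the Henon map x' = 1 - a*x^2 + y, y' = b*x, discard the
          # transient, and return post-transient x values of the attractor.
          x, y = 0.1, 0.0
          for _ in range(n_transient):
              x, y = 1.0 - a * x * x + y, b * x
          xs = []
          for _ in range(n_keep):
              x, y = 1.0 - a * x * x + y, b * x
              xs.append(x)
          return xs

      # Distinct attractor values roughly double through the cascade
      # (expect about 1, 2, 4, then ~n_keep once chaos is reached);
      # plotting (a, xs) over a fine sweep gives a bifurcation diagram.
      for a in (0.3, 0.5, 0.95, 1.4):
          print(a, len({round(v, 6) for v in henon_samples(a)}))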

  10. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  11. A mapping from the unitary to doubly stochastic matrices and symbols on a finite set

    NASA Astrophysics Data System (ADS)

    Karabegov, Alexander V.

    2008-11-01

    We prove that the mapping from the unitary to the doubly stochastic matrices that maps a unitary matrix (u_kl) to the doubly stochastic matrix (|u_kl|^2) is a submersion at a generic unitary matrix. The proof uses the framework of operator symbols on a finite set.
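
    The map itself is easy to check numerically; a sketch that builds a random unitary via the QR decomposition of a complex Gaussian matrix and verifies that its image under (u_kl) -> (|u_kl|^2) is doubly stochastic:

      import numpy as np

      rng = np.random.default_rng(0)

      # Random unitary from the QR decomposition of a complex Gaussian matrix.
      n = 4
      z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
      q, _ = np.linalg.qr(z)

      # Apply the map (u_kl) -> (|u_kl|^2).
      p = np.abs(q) ** 2

      # Unitarity forces every row and column of p to sum to 1,
      # i.e. p is doubly stochastic.
      print(np.allclose(p.sum(axis=0), 1.0), np.allclose(p.sum(axis=1), 1.0))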

  12. Regional mapping of depression-focussed groundwater recharge incorporating variable topography, climate, and land use

    NASA Astrophysics Data System (ADS)

    Pavlovskii, I.; Noorduijn, S. L.; Abrakhimova, P.; Bentley, L. R.; Cey, E. E.; Hayashi, M.

    2016-12-01

    In the water-deficient setting of the Northern Great Plains (or Prairie Pothole Region, PPR), groundwater recharge constitutes only a small fraction of the water budget, meaning that recharge estimates have a high degree of uncertainty. Additionally, recharge primarily occurs as focussed recharge when small topographical depressions are inundated by surface runoff, typically during spring melt while underlying soils are still frozen. This results in a high spatial and temporal variability of recharge rates, which further complicates their evaluation. As part of a major research project called Groundwater Recharge in the Prairies (GRIP), we have developed a soil water balance model to estimate recharge rates at a scale of a single depression and its catchment (< 10 ha). In the next stage of the GRIP project, the present study investigates the possibility of applying this tool for recharge mapping on a regional scale in the Edmonton-Calgary corridor in Alberta, located in the north-western fringe of the PPR. The entire area (49500 km2) was divided into elements based on the proximity to one of 24 Alberta Agriculture weather stations. For each element, the model was run for a series of generic scenarios consisting of representative land use and depression catchment parameters. The latter were constructed using a high-resolution digital elevation model (DEM). The recharge value for each element was then computed using a weighted average of the generic scenario outputs. The new method has a number of benefits. Use of generic scenarios instead of real depressions dramatically reduces computational cost. Extraction of relevant parameters from DEM accounts for depressions which are only flooded sporadically and thus may be absent from the inventories of wet areas based on satellite images. If extra data on topographical parameters become available, the recharge may be recalculated without repeating the entire workflow.
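
    The element-level aggregation described above reduces to a weighted average of scenario outputs; a toy sketch with hypothetical scenario names, recharge rates and area-fraction weights:

      # Each generic scenario (land use x depression-catchment class) has a
      # modelled recharge rate; the element value is the average weighted by
      # the fraction of the element each scenario represents. All values
      # below are invented for illustration.
      scenario_recharge_mm = {"cropland_small": 12.0, "cropland_large": 25.0,
                              "grassland_small": 6.0, "grassland_large": 15.0}
      area_fraction = {"cropland_small": 0.45, "cropland_large": 0.20,
                       "grassland_small": 0.25, "grassland_large": 0.10}

      element_recharge = sum(scenario_recharge_mm[k] * area_fraction[k]
                             for k in scenario_recharge_mm)
      print(f"{element_recharge:.1f} mm/yr")  # 13.4 mm/yr for these weights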

  13. Toward an Integrated Design, Inspection and Redundancy Research Program.

    DTIC Science & Technology

    1984-01-01

    William Creelman, National Marine Service, St. Louis, Missouri; William H. Silcox, Standard Oil Company of California, San Francisco, California. ... develop physical models and generic tools for analyzing the effects of redundancy, reserve strength, and residual strength on the system behavior of marine structures. ... For probabilistic analyses to be applicable to real-world problems, this program needs to provide the deterministic physical models and generic tools upon ...

  14. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    PubMed

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.
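
    The four models themselves are not reproduced in the abstract; the sketch below is a generic presence/absence spread-on-habitat toy in the same spirit, with a single rate parameter. It is an illustration of the simplest model class described, not the paper's code.

      import numpy as np
      from scipy.ndimage import binary_dilation

      def spread(presence, habitat, years, cells_per_year=1):
          """Each year the infested area grows by a fixed radius into
          suitable habitat (presence/absence, spatially explicit)."""
          structure = np.ones((2 * cells_per_year + 1,) * 2, dtype=bool)
          out = presence.copy()
          for _ in range(years):
              out = binary_dilation(out, structure=structure) & habitat
          return out

      habitat = np.ones((20, 20), dtype=bool)
      habitat[:, 10] = False             # an unsuitable barrier column
      start = np.zeros_like(habitat)
      start[10, 2] = True                # single introduction point
      print(spread(start, habitat, years=5).sum(), "occupied cells after 5 years")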

  15. A Suite of Models to Support the Quantitative Assessment of Spread in Pest Risk Analysis

    PubMed Central

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J.; Baker, Richard H. A.; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice. PMID:23056174

  16. A Visual-Based Approach to the Mapping of Generic Skills: Its Application to a Marketing Degree

    ERIC Educational Resources Information Center

    Ang, Lawrence; D'Alessandro, Steven; Winzar, Hume

    2014-01-01

    With increasing complexity in the world, universities continue to face pressure to demonstrate that their graduates have acquired skills beyond discipline-based knowledge. These are generic skills like critical thinking, intellectual curiosity, problem-solving and so forth. In order to demonstrate this, universities have to show how their teaching…

  17. Some Comments on Mapping from Disease-Specific to Generic Health-Related Quality-of-Life Scales

    PubMed Central

    Palta, Mari

    2013-01-01

    An article by Lu et al. in this issue of Value in Health addresses the mapping of treatment or group differences in disease-specific measures (DSMs) of health-related quality of life onto differences in generic health-related quality-of-life scores, with special emphasis on how the mapping is affected by the reliability of the DSM. In the proposed mapping, a factor analytic model defines a conversion factor between the scores as the ratio of factor loadings. Hence, the mapping applies to convert true underlying scales and has desirable properties facilitating the alignment of instruments and understanding their relationship in a coherent manner. It is important to note, however, that when DSM means or differences in mean DSMs are estimated, their mapping is still of a measurement error–prone predictor, and the correct conversion coefficient is the true mapping multiplied by the reliability of the DSM in the relevant sample. In addition, the proposed strategy for estimating the factor analytic mapping in practice requires assumptions that may not hold. We discuss these assumptions and how they may be the reason we obtain disparate estimates of the mapping factor in an application of the proposed methods to groups of patients. PMID:23337233
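
    A worked toy version of the attenuation point made in the commentary, with all numbers hypothetical: the factor-analytic conversion applies to true scores, so an estimated DSM difference must additionally be scaled by the DSM's reliability.

      # Hypothetical inputs for illustration only.
      loading_generic = 0.6        # factor loading of the generic instrument
      loading_dsm = 0.8            # factor loading of the disease-specific measure
      reliability_dsm = 0.75       # reliability of the DSM in this sample

      true_conversion = loading_generic / loading_dsm   # ratio of loadings
      observed_conversion = true_conversion * reliability_dsm

      dsm_group_difference = 10.0  # observed difference in DSM score units
      print(true_conversion * dsm_group_difference)       # naive: 7.5
      print(observed_conversion * dsm_group_difference)   # attenuated: 5.625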

  18. Free software helps map and display data

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Smith, Walter H. F.

    When creating camera-ready figures, most scientists are familiar with the sequence of raw data → processing → final illustration and with the spending of large sums of money to finalize papers for submission to scientific journals, prepare proposals, and create overheads and slides for various presentations. This process can be tedious and is often done manually, since available commercial or in-house software usually can do only part of the job.To expedite this process, we introduce the Generic Mapping Tools (GMT), which is a free, public domain software package that can be used to manipulate columns of tabular data, time series, and gridded data sets and to display these data in a variety of forms ranging from simple x-y plots to maps and color, perspective, and shaded-relief illustrations. GMT uses the PostScript page description language, which can create arbitrarily complex images in gray tones or 24-bit true color by superimposing multiple plot files. Line drawings, bitmapped images, and text can be easily combined in one illustration. PostScript plot files are device-independent, meaning the same file can be printed at 300 dots per inch (dpi) on an ordinary laserwriter or at 2470 dpi on a phototypesetter when ultimate quality is needed. GMT software is written as a set of UNIX tools and is totally self contained and fully documented. The system is offered free of charge to federal agencies and nonprofit educational organizations worldwide and is distributed over the computer network Internet.
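
    For the gridded-data path described (grid to shaded-relief illustration to a finished file), a modern equivalent through the PyGMT wrapper around the same engine might look like this; the region and output name are arbitrary choices.

      import pygmt

      # Load a bundled global relief grid for a small region and render it
      # as a shaded-relief map with a colorbar, then save to a raster file
      # (the classic GMT workflow emitted PostScript instead).
      grid = pygmt.datasets.load_earth_relief(resolution="10m",
                                              region=[-160, -150, 18, 24])

      fig = pygmt.Figure()
      fig.grdimage(grid, projection="M12c", cmap="geo", shading=True, frame=True)
      fig.coast(shorelines="1/0.5p")
      fig.colorbar(frame='af+l"Elevation (m)"')
      fig.savefig("hawaii_relief.png")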

  19. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon's AWS Marketplace VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  20. A generic nuclei detection method for histopathological breast images

    NASA Astrophysics Data System (ADS)

    Kost, Henning; Homeyer, André; Bult, Peter; Balkenhol, Maschenka C. A.; van der Laak, Jeroen A. W. M.; Hahn, Horst K.

    2016-03-01

    The detection of cell nuclei plays a key role in various histopathological image analysis problems. Considering the high variability of its applications, we propose a novel generic and trainable detection approach. Adaption to specific nuclei detection tasks is done by providing training samples. A trainable deconvolution and classification algorithm is used to generate a probability map indicating the presence of a nucleus. The map is processed by an extended watershed segmentation step to identify the nuclei positions. We have tested our method on data sets with different stains and target nuclear types. We obtained F1-measures between 0.83 and 0.93.
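
    The extended watershed step is not specified in the abstract; a conventional baseline, seeding a watershed with local maxima of the probability map, can be sketched with scikit-image. The thresholds and the synthetic map below are illustrative, not the authors' settings.

      import numpy as np
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      def detect_nuclei(prob_map, min_distance=5, threshold=0.5):
          # Seeds: local maxima of the probability map above a confidence
          # threshold (recent scikit-image returns their coordinates).
          coords = peak_local_max(prob_map, min_distance=min_distance,
                                  threshold_abs=threshold)
          markers = np.zeros(prob_map.shape, dtype=int)
          markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
          # Watershed on the inverted map splits touching nuclei between seeds.
          labels = watershed(-prob_map, markers, mask=prob_map > threshold)
          return labels, coords

      # Synthetic map with two overlapping blobs standing in for nuclei.
      yy, xx = np.mgrid[0:60, 0:60]
      bump = lambda cy, cx: np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 40.0)
      prob = bump(25, 25) + bump(30, 34)
      prob /= prob.max()
      labels, seeds = detect_nuclei(prob)
      print(labels.max(), "nuclei, seeds at", seeds.tolist())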

  1. Constructing a Graphic Organizer in the Classroom: Introductory Students' Perception of Achievement Using a Decision Map to Solve Aqueous Acid-Base Equilibria Problems

    ERIC Educational Resources Information Center

    DeMeo, Stephen

    2007-01-01

    Common examples of graphic organizers include flow diagrams, concept maps, and decision trees. The author has created a novel type of graphic organizer called a decision map. A decision map is a directional heuristic that helps learners solve problems within a generic framework. It incorporates questions that the user must answer and contains…

  2. Integrable mappings with transcendental invariants

    NASA Astrophysics Data System (ADS)

    Grammaticos, B.; Ramani, A.

    2007-06-01

    We examine a family of integrable mappings which possess rational invariants involving polynomials of arbitrarily high degree. Next we extend these mappings to the case where their parameters are functions of the independent variable. The resulting mappings do not preserve any invariant but are solvable by linearisation. Using this result we then proceed to construct the solution of the initial autonomous mappings and use it to explicitly construct the invariant, which turns out to be transcendental in the generic case.

  3. NAS Grid Benchmarks: A Tool for Grid Space Exploration

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; VanderWijngaart, Rob F.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We present an approach for benchmarking services provided by computational Grids. It is based on the NAS Parallel Benchmarks (NPB) and is called NAS Grid Benchmark (NGB) in this paper. We present NGB as a data flow graph encapsulating an instance of an NPB code in each graph node, which communicates with other nodes by sending/receiving initialization data. These nodes may be mapped to the same or different Grid machines. Like NPB, NGB will specify several different classes (problem sizes). NGB also specifies the generic Grid services sufficient for running the benchmark. The implementor has the freedom to choose any specific Grid environment. However, we describe a reference implementation in Java, and present some scenarios for using NGB.
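
    A toy rendering of the data-flow-graph idea, with the NPB solver launches replaced by a placeholder function; the node names echo NPB code names, but the graph shape and the "class A" composition are invented for illustration (requires Python 3.9+ for graphlib).

      from graphlib import TopologicalSorter

      # Each node would launch an NPB-style solver on some grid resource,
      # consuming its predecessors' outputs as initialization data.
      graph = {"BT.A": [], "SP.A": ["BT.A"], "LU.A": ["BT.A"],
               "FT.A": ["SP.A", "LU.A"]}

      def run_node(name, inputs):
          # Stand-in for submitting a benchmark task to a grid machine.
          return f"{name}(init<-{sorted(inputs)})"

      results = {}
      for node in TopologicalSorter(graph).static_order():
          results[node] = run_node(node, [results[d] for d in graph[node]])
      print(results["FT.A"])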

  4. Developing multiple-choices test items as tools for measuring the scientific-generic skills on solar system

    NASA Astrophysics Data System (ADS)

    Bhakti, Satria Seto; Samsudin, Achmad; Chandra, Didi Teguh; Siahaan, Parsaoran

    2017-05-01

    The aim of this research is to develop multiple-choice test items as tools for measuring scientific generic skills regarding the solar system. To achieve this aim, the researchers used the ADDIE model, consisting of Analysis, Design, Development, Implementation, and Evaluation, as the research method. The research limited the scientific generic skills to five indicators: (1) indirect observation, (2) awareness of scale, (3) logical inference, (4) causal relations, and (5) mathematical modeling. The participants were 32 students at a junior high school in Bandung. The results show that the constructed multiple-choice test items were declared valid by the expert validator and, after testing, that the developed items are able to measure scientific generic skills regarding the solar system.

  5. Patient-reported outcome measures in reconstructive breast surgery: is there a role for generic measures?

    PubMed

    Korus, Lisa J; Cypel, Tatiana; Zhong, Toni; Wu, Albert W

    2015-03-01

    Patient-reported outcomes provide an invaluable tool in the assessment of outcomes in plastic surgery. Traditionally, patient-reported outcomes have consisted of either generic or ad hoc measures; however, more recently, there has been interest in formally constructed and validated questionnaires that are specifically designed for a particular patient population. The purpose of this systematic review was to determine whether generic measures still have a role in the evaluation of breast reconstruction outcomes, given the recent popularity and push for use of specific measures. A systematic review was performed to identify all articles using patient-reported outcomes in the assessment of postmastectomy breast reconstruction. Frequency of use was tabulated and the most frequently used tools were assessed for success of use, using criteria described previously by the Medical Outcomes Trust. To date, the most frequently used measures are still generic measures. The 36-Item Short-Form Health Survey was the most frequently used and most successfully applied showing evidence of responsiveness in multiple settings. Other measures such as the Hospital Anxiety and Depression Scale, the Hopwood Body Image Scale, and the Rosenberg Self-Esteem Scale were able to show responsiveness in certain settings but lacked evidence as universal tools for the assessment of outcomes in reconstructive breast surgery. Despite the recent advent of measures designed specifically to assess patient-reported outcomes in the breast reconstruction population, there still appears to be a role for the use of generic instruments. Many of these tools would benefit from undergoing formal validation in the breast reconstruction population.

  6. An XML transfer schema for exchange of genomic and genetic mapping data: implementation as a web service in a Taverna workflow.

    PubMed

    Paterson, Trevor; Law, Andy

    2009-08-14

    Genomic analysis, particularly for less well-characterized organisms, is greatly assisted by performing comparative analyses between different types of genome maps and across species boundaries. Various providers publish a plethora of on-line resources collating genome mapping data from a multitude of species. Datasources range in scale and scope from small bespoke resources for particular organisms, through larger web-resources containing data from multiple species, to large-scale bioinformatics resources providing access to data derived from genome projects for model and non-model organisms. The heterogeneity of information held in these resources reflects both the technologies used to generate the data and the target users of each resource. Currently there is no common information exchange standard or protocol to enable access and integration of these disparate resources. Consequently, data integration and comparison must be performed in an ad hoc manner. We have developed a simple generic XML schema (GenomicMappingData.xsd - GMD) to allow export and exchange of mapping data in a common lightweight XML document format. This schema represents the various types of data objects commonly described across mapping datasources and provides a mechanism for recording relationships between data objects. The schema is sufficiently generic to allow representation of any map type (for example genetic linkage maps, radiation hybrid maps, sequence maps and physical maps). It also provides mechanisms for recording data provenance and for cross-referencing external datasources (including, for example, ENSEMBL, PubMed and GenBank). The schema is extensible via the inclusion of additional datatypes, which can be achieved by importing further schemas, e.g. a schema defining relationship types. We have built demonstration web services that export data from our ArkDB database according to the GMD schema, facilitating the integration of data retrieval into Taverna workflows. The data exchange standard we present here provides a useful generic format for transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for inclusion of additional data and provides a mechanism for typing mapping objects via third party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically-controlled access to the data.
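
    The schema itself is not reproduced in the abstract, so the element and attribute names below are hypothetical, not taken from GenomicMappingData.xsd; the sketch only shows how a GMD-style document could be assembled and serialized with the standard library.

      import xml.etree.ElementTree as ET

      # Hypothetical GMD-like structure: a map set for a species, one
      # genetic-linkage map, and a positioned feature with a provenance
      # cross-reference. Names are illustrative only.
      mapset = ET.Element("mapSet", species="Gallus gallus", source="ArkDB")
      gmap = ET.SubElement(mapset, "map", type="genetic_linkage", unit="cM")
      ET.SubElement(gmap, "feature", name="MCW0080", position="12.5",
                    accession="PubMed:19682365")

      print(ET.tostring(mapset, encoding="unicode"))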

  7. An XML transfer schema for exchange of genomic and genetic mapping data: implementation as a web service in a Taverna workflow

    PubMed Central

    Paterson, Trevor; Law, Andy

    2009-01-01

    Background Genomic analysis, particularly for less well-characterized organisms, is greatly assisted by performing comparative analyses between different types of genome maps and across species boundaries. Various providers publish a plethora of on-line resources collating genome mapping data from a multitude of species. Datasources range in scale and scope from small bespoke resources for particular organisms, through larger web-resources containing data from multiple species, to large-scale bioinformatics resources providing access to data derived from genome projects for model and non-model organisms. The heterogeneity of information held in these resources reflects both the technologies used to generate the data and the target users of each resource. Currently there is no common information exchange standard or protocol to enable access and integration of these disparate resources. Consequently, data integration and comparison must be performed in an ad hoc manner. Results We have developed a simple generic XML schema (GenomicMappingData.xsd – GMD) to allow export and exchange of mapping data in a common lightweight XML document format. This schema represents the various types of data objects commonly described across mapping datasources and provides a mechanism for recording relationships between data objects. The schema is sufficiently generic to allow representation of any map type (for example genetic linkage maps, radiation hybrid maps, sequence maps and physical maps). It also provides mechanisms for recording data provenance and for cross-referencing external datasources (including, for example, ENSEMBL, PubMed and GenBank). The schema is extensible via the inclusion of additional datatypes, which can be achieved by importing further schemas, e.g. a schema defining relationship types. We have built demonstration web services that export data from our ArkDB database according to the GMD schema, facilitating the integration of data retrieval into Taverna workflows. Conclusion The data exchange standard we present here provides a useful generic format for transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for inclusion of additional data and provides a mechanism for typing mapping objects via third party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically-controlled access to the data. PMID:19682365

  8. Global localization of 3D point clouds in building outline maps of urban outdoor environments.

    PubMed

    Landsiedel, Christian; Wollherr, Dirk

    2017-01-01

    This paper presents a method to localize a robot in a global coordinate frame based on a sparse 2D map containing building outlines and road network information, with no prior location information. Its input is a single 3D laser scan of the surroundings of the robot. The approach extends the generic chamfer matching technique for template matching from image processing by including visibility analysis in the cost function. Thus, the observed building planes are matched to the expected view of the corresponding map section instead of to the entire map, which makes more accurate matching possible. Since this formulation operates on generic edge maps from visual sensors, the matching formulation can be expected to generalize to other input data, e.g., from monocular or stereo cameras. The method is evaluated on two large datasets collected in different real-world urban settings and compared to a baseline method from the literature and to the standard chamfer matching approach, where it shows considerable performance benefits, as well as the feasibility of global localization based on sparse building outline data.
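
    A bare-bones chamfer matching scorer (distance transform of the edge map, then the mean distance at shifted template points) conveys the baseline the paper extends; the visibility weighting described above is deliberately not included, and the toy data are invented.

      import numpy as np
      from scipy.ndimage import distance_transform_edt

      def chamfer_cost(edge_map, template_pts, offset):
          """Score one candidate pose by the mean distance from each
          (shifted) template point to the nearest observed edge pixel.
          Lower is better; in a real matcher the distance transform is
          precomputed once and reused across candidate poses."""
          dist = distance_transform_edt(~edge_map)   # True = edge pixel
          pts = np.asarray(template_pts) + np.asarray(offset)
          inside = ((pts >= 0) & (pts < edge_map.shape)).all(axis=1)
          if not inside.any():
              return np.inf
          r, c = pts[inside].T
          return dist[r, c].mean()

      edges = np.zeros((50, 50), dtype=bool)
      edges[20, 10:40] = True                          # an observed wall segment
      template = np.array([[0, 0], [0, 5], [0, 10]])   # expected wall from the map
      print(chamfer_cost(edges, template, offset=(20, 12)))  # 0.0: good pose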

  9. 7 CFR 1485.18 - Advances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AGRICULTURE EXPORT PROGRAMS COOPERATIVE AGREEMENTS FOR THE DEVELOPMENT OF FOREIGN MARKETS FOR AGRICULTURAL COMMODITIES Market Access Program § 1485.18 Advances. (a) Policy. In general, CCC operates MAP and EIP/MAP on... participant for generic promotion activities. Prior to making an advance, CCC may require the participant to...

  10. Mapping functional connectivity

    Treesearch

    Peter Vogt; Joseph R. Ferrari; Todd R. Lookingbill; Robert H. Gardner; Kurt H. Riitters; Katarzyna Ostapowicz

    2009-01-01

    An objective and reliable assessment of wildlife movement is important in theoretical and applied ecology. The identification and mapping of landscape elements that may enhance functional connectivity is usually a subjective process based on visual interpretations of species movement patterns. New methods based on mathematical morphology provide a generic, flexible,...

  11. What we know about the purpose, theoretical foundation, scope and dimensionality of existing self-management measurement tools: A scoping review.

    PubMed

    Packer, Tanya L; Fracini, America; Audulv, Åsa; Alizadeh, Neda; van Gaal, Betsie G I; Warner, Grace; Kephart, George

    2018-04-01

    To identify self-report, self-management measures for adults with chronic conditions, and describe their purpose, theoretical foundation, dimensionality (multi- versus uni-dimensional), and scope (generic versus condition-specific). A search of four databases (8479 articles) resulted in a scoping review of 28 self-management measures. Although authors identified tools as measures of self-management, wide variation existed in the constructs measured, purpose, and theoretical foundations. Subscales on 13 multidimensional tools collectively measure domains of self-management relevant to clients; however, no one tool's subscales cover all domains. Viewing self-management as a complex, multidimensional whole demonstrated that existing measures assess different, related aspects of self-management. Activities and social roles, though important to patients, are rarely measured. Measures with the capacity to quantify and distinguish aspects of self-management may promote tailored patient care. In selecting tools for research or assessment, the reason for development, definitions, and theories underpinning the measure should be scrutinized. Our ability to measure self-management must be rigorously mapped to provide comprehensive and system-wide care for clients with chronic conditions. Viewing self-management as a complex whole will help practitioners to understand the patient perspective and their contribution in supporting each individual patient.

  12. Some comments on mapping from disease-specific to generic health-related quality-of-life scales.

    PubMed

    Palta, Mari

    2013-01-01

    An article by Lu et al. in this issue of Value in Health addresses the mapping of treatment or group differences in disease-specific measures (DSMs) of health-related quality of life onto differences in generic health-related quality-of-life scores, with special emphasis on how the mapping is affected by the reliability of the DSM. In the proposed mapping, a factor analytic model defines a conversion factor between the scores as the ratio of factor loadings. Hence, the mapping applies to true underlying scales and has desirable properties facilitating the alignment of instruments and understanding of their relationship in a coherent manner. It is important to note, however, that when DSM means or differences in mean DSMs are estimated, their mapping is still that of a measurement error-prone predictor, and the correct conversion coefficient is the true mapping multiplied by the reliability of the DSM in the relevant sample. In addition, the proposed strategy for estimating the factor analytic mapping in practice requires assumptions that may not hold. We discuss these assumptions and how they may be the reason we obtain disparate estimates of the mapping factor in an application of the proposed methods to groups of patients.

  13. The GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

    The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application, or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER imagery or glacier outlines from 2002 only, or from autumn of any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
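
    Because the server is a standards-compliant OGC WMS, any generic client can request map imagery with a plain GetMap call. The sketch below uses standard WMS 1.1.1 parameters; the endpoint URL and layer name are placeholders, not the actual GLIMS service addresses.

        import requests

        params = {
            "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
            "LAYERS": "glacier_outlines",   # hypothetical layer name
            "SRS": "EPSG:4326",
            "BBOX": "-180,-90,180,90",      # minx, miny, maxx, maxy
            "WIDTH": 800, "HEIGHT": 400,
            "FORMAT": "image/png",
        }
        resp = requests.get("https://example.org/glims/wms", params=params, timeout=30)
        with open("glaciers.png", "wb") as f:
            f.write(resp.content)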

  14. Big Bang Bifurcation Analysis and Allee Effect in Generic Growth Functions

    NASA Astrophysics Data System (ADS)

    Leonel Rocha, J.; Taha, Abdel-Kaddous; Fournier-Prunaret, D.

    2016-06-01

    The main purpose of this work is to study the dynamics and bifurcation properties of generic growth functions, which are defined by the population size functions of the generic growth equation. This family of unimodal maps naturally incorporates a principal focus of ecological and biological research: the Allee effect. The analysis of this kind of extinction phenomenon allows us to identify a class of Allee functions and to characterize the corresponding Allee effect region and Allee bifurcation curve. The bifurcation analysis is founded on the study of fold and flip bifurcations. The dynamical behavior is rich, with abundant complex bifurcation structures, the big bang bifurcations of the so-called “box-within-a-box” fractal type being the most outstanding. Moreover, these bifurcation cascades converge to different big bang bifurcation curves with distinct kinds of boxes, with several attractors associated with the corresponding parameter values. To the best of our knowledge, these results represent an original contribution to clarifying the big bang bifurcation analysis of continuous 1D maps.
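
    As a concrete toy version of the extinction phenomenon, the quadratic-logistic map below exhibits a strong Allee effect: orbits starting below a threshold density collapse to zero, while orbits above it settle on a positive attractor. The functional form is our illustrative assumption, not the paper's generic growth family.

        import numpy as np

        def orbit(r, x0, n=500):
            # Iterate x -> r * x**2 * (1 - x); growth is suppressed at low
            # density, which is the hallmark of a strong Allee effect.
            xs = np.empty(n)
            x = x0
            for i in range(n):
                x = r * x * x * (1.0 - x)
                xs[i] = x
            return xs

        print(orbit(5.5, 0.10)[-3:])  # below the Allee threshold: extinction
        print(orbit(5.5, 0.50)[-3:])  # above it: a positive attractor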

  15. Developing Daily Quantitative Damage Estimates From Geospatial Layers To Support Post Event Recovery

    NASA Astrophysics Data System (ADS)

    Woods, B. K.; Wei, L. H.; Connor, T. C.

    2014-12-01

    With the growth of natural hazard data available in near real-time, it is increasingly feasible to deliver estimates of damage caused by natural disasters. These estimates can be used in a disaster management setting or by commercial entities to optimize the deployment of resources and/or the routing of goods and materials. This work outlines an end-to-end, modular process to generate estimates of damage caused by severe weather. The processing stream consists of five generic components: 1) hazard modules that provide quantitative data layers for each peril; 2) standardized methods to map the hazard data to an exposure layer based on atomic geospatial blocks; 3) peril-specific damage functions that compute damage metrics at the atomic geospatial block level; 4) standardized data aggregators, which map damage to user-specific geometries; and 5) data dissemination modules, which provide the resulting damage estimates in a variety of output forms. This presentation provides a description of this generic tool set, and an illustrated example using HWRF-based hazard data for Hurricane Arthur (2014). In this example, the Python-based real-time processing ingests GRIB2 output from the HWRF numerical model and dynamically downscales it in conjunction with a land cover database, using a multiprocessing pool and a just-in-time compiler (JIT). The resulting wind fields are contoured and ingested into a PostGIS database using OGR. Finally, the damage estimates are calculated at the atomic block level and aggregated to user-defined regions using PostgreSQL queries to construct application-specific tabular and graphics output.
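
    A hedged sketch of steps 2-4 of the processing stream is given below: sample the hazard layer at each atomic exposure block, apply a peril-specific damage function, and aggregate losses to user regions. The vulnerability curve and field names are invented for illustration, not the authors' calibrated functions.

        import numpy as np
        import pandas as pd

        def wind_damage_fraction(wind_ms):
            # Toy sigmoidal vulnerability curve: negligible damage at modest
            # wind speeds, saturating towards 1 at extreme wind speeds.
            return 1.0 / (1.0 + np.exp(-(wind_ms - 45.0) / 5.0))

        blocks = pd.DataFrame({
            "block_id": [1, 2, 3],
            "region":   ["A", "A", "B"],     # user-specified aggregation geometry
            "exposure": [1e6, 5e5, 2e6],     # value at risk per atomic block
            "wind_ms":  [30.0, 52.0, 48.0],  # hazard layer sampled per block
        })
        blocks["loss"] = blocks["exposure"] * wind_damage_fraction(blocks["wind_ms"])
        print(blocks.groupby("region")["loss"].sum())  # aggregate to regions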

  16. Generalized exact holographic mapping with wavelets

    NASA Astrophysics Data System (ADS)

    Lee, Ching Hua

    2017-12-01

    The idea of renormalization and scale invariance is pervasive across disciplines. It has not only drawn numerous surprising connections between physical systems under the guise of holographic duality, but has also inspired the development of wavelet theory, now widely used in signal processing. Synergizing these two developments, we describe in this paper a generalized exact holographic mapping that maps a generic N-dimensional lattice system to an (N+1)-dimensional holographic dual, with the emergent dimension representing scale. In previous works, this was achieved via the iterations of the simplest of all unitary mappings, the Haar mapping, which fails to preserve the form of most Hamiltonians. By taking advantage of the full generality of biorthogonal wavelets, our new generalized holographic mapping framework is able to preserve the form of a large class of lattice Hamiltonians. By explicitly separating features that are fundamentally associated with the physical system from those that are basis specific, we also obtain a clearer understanding of how the resultant bulk geometry arises. For instance, the number of nonvanishing moments of the high-pass wavelet filter is revealed to be proportional to the radius of the dual anti-de Sitter space geometry. We conclude by proposing modifications to the mapping for systems with generic Fermi pockets.
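
    The simplest case mentioned above, the Haar mapping, is easy to state concretely: each layer splits the field into normalized pair averages (passed to the next layer) and pair differences (assigned to the emergent bulk dimension). A minimal numpy sketch for a 1D chain:

        import numpy as np

        def haar_step(field):
            a, b = field[0::2], field[1::2]
            coarse = (a + b) / np.sqrt(2.0)  # low-pass: next holographic layer
            detail = (a - b) / np.sqrt(2.0)  # high-pass: bulk data at this depth
            return coarse, detail

        x = np.random.randn(16)              # a field on a 16-site chain
        bulk = []
        while len(x) > 1:
            x, d = haar_step(x)
            bulk.append(d)                   # the stack of details is the "bulk"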

  17. A Generic Approach for Inversion of Surface Reflectance over Land: Overview, Application and Validation Using MODIS and LANDSAT8 Data

    NASA Technical Reports Server (NTRS)

    Vermote, E.; Roger, J. C.; Justice, C. O.; Franch, B.; Claverie, M.

    2016-01-01

    This paper presents a generic approach developed to derive surface reflectance over land from a variety of sensors. This technique builds on the extensive dataset acquired by the Terra platform by combining MODIS and MISR to derive an explicit and dynamic map of band ratios between the blue and red channels, and is a refinement of the operational approach used for MODIS and LANDSAT over the past 15 years. We present the generic approach, its application to MODIS and LANDSAT data, and its validation using AERONET data.

  18. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  19. 76 FR 50993 - Agency Information Collection Activities: Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...

  20. Application of Gene Expression Trajectories Initiated from ErbB Receptor Activation Highlights the Dynamics of Divergent Promoter Usage.

    PubMed

    Carbajo, Daniel; Magi, Shigeyuki; Itoh, Masayoshi; Kawaji, Hideya; Lassmann, Timo; Arner, Erik; Forrest, Alistair R R; Carninci, Piero; Hayashizaki, Yoshihide; Daub, Carsten O; Okada-Hatakeyama, Mariko; Mar, Jessica C

    2015-01-01

    Understanding how cells use complex transcriptional programs to alter their fate in response to specific stimuli is an important question in biology. For the MCF-7 human breast cancer cell line, we applied gene expression trajectory models to identify the genes involved in driving cell fate transitions. We modified trajectory models to account for the scenario where cells were exposed to different stimuli, in this case epidermal growth factor and heregulin, to arrive at different cell fates, i.e., proliferation and differentiation, respectively. Using genome-wide CAGE time series data collected from the FANTOM5 consortium, we identified the sets of promoters that were involved in the transition of MCF-7 cells to their specific fates versus those with expression changes that were generic to both stimuli. Of the 1,552 promoters identified, 1,091 had stimulus-specific expression, while 461 promoters had generic expression profiles over the time course surveyed. Many of these stimulus-specific promoters mapped to key regulators of the ERK (extracellular signal-regulated kinase) signaling pathway, such as FHL2 (four and a half LIM domains 2). We observed that, in general, generic promoters peaked in their expression early in the time course, while stimulus-specific promoters tended to show activation of their expression at a later stage. The genes that mapped to stimulus-specific promoters were enriched for pathways that control focal adhesion, p53 signaling and MAPK signaling, while generic promoters were enriched for cell death, transcription and the cell cycle. We identified 162 genes that were controlled by an alternative promoter during the time course, where a subset of 37 genes had separate promoters that were classified as stimulus-specific and generic. The results of our study highlight the degree of complexity involved in regulating a cell fate transition, where multiple promoters mapping to the same gene can demonstrate quite divergent expression profiles.

  1. Generic trending and analysis system

    NASA Technical Reports Server (NTRS)

    Keehan, Lori; Reese, Jay

    1994-01-01

    The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts, supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long-term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.

  2. Development and validation of a tool to assess knowledge and attitudes towards generic medicines among students in Greece: The ATtitude TOwards GENerics (ATTOGEN) questionnaire.

    PubMed

    Domeyer, Philip J; Aletras, Vassilis; Anagnostopoulos, Fotios; Katsari, Vasiliki; Niakas, Dimitris

    2017-01-01

    The use of generic medicines is a cost-effective policy, often dictated by fiscal restraints. To our knowledge, no fully validated tool exploring students' knowledge and attitudes towards generic medicines exists. The aim of our study was to develop and validate a questionnaire exploring the knowledge and attitudes of M.Sc. in Health Care Management students and recent alumni towards generic drugs in Greece. The development of the questionnaire was a result of literature review and pilot-testing of its preliminary versions with researchers and students. The final version of the questionnaire contains 18 items measuring the respondents' knowledge and attitude towards generic medicines on a 5-point Likert scale. Given the ordinal nature of the data, ordinal alpha and polychoric correlations were computed. The sample was randomly split into two halves. Exploratory factor analysis, performed in the first sample, was used for the creation of multi-item scales. Confirmatory factor analysis and Generalized Linear Latent and Mixed Model (GLLAMM) analysis with the use of the rating scale model were used in the second sample to assess goodness of fit. An assessment of internal consistency reliability, test-retest reliability, and construct validity was also performed. Among 1402 persons contacted, 986 persons completed our questionnaire (response rate = 70.3%). Overall Cronbach's alpha was 0.871. The conjoint use of exploratory and confirmatory factor analysis resulted in a six-scale model, which seemed to fit the data well. Five of the six scales, namely trust, drug quality, state audit, fiscal impact and drug substitution, were found to be valid and reliable, while the knowledge scale suffered only from low inter-scale correlations and a ceiling effect. However, the subsequent confirmatory factor and GLLAMM analyses indicated a good fit of the model to the data. The ATTOGEN instrument proved to be a reliable and valid tool, suitable for assessing students' knowledge and attitudes towards generic medicines.
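
    For orientation, the classical (non-ordinal) Cronbach's alpha for a respondents-by-items score matrix is straightforward to compute. Note that the study itself reports ordinal alpha based on polychoric correlations, which this simpler classical statistic only approximates for Likert data.

        import numpy as np

        def cronbach_alpha(items):
            # items: respondents x items matrix of scores
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

        rng = np.random.default_rng(0)
        trait = rng.normal(size=(200, 1))    # shared latent attitude
        scores = np.clip(np.round(3 + trait + rng.normal(scale=0.8, size=(200, 6))), 1, 5)
        print(round(cronbach_alpha(scores), 3))   # high alpha expected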

  3. GMODWeb: a web framework for the generic model organism database

    PubMed Central

    O'Connor, Brian D; Day, Allen; Cain, Scott; Arnaiz, Olivier; Sperling, Linda; Stein, Lincoln D

    2008-01-01

    The Generic Model Organism Database (GMOD) initiative provides species-agnostic data models and software tools for representing curated model organism data. Here we describe GMODWeb, a GMOD project designed to speed the development of model organism database (MOD) websites. Sites created with GMODWeb provide integration with other GMOD tools and allow users to browse and search through a variety of data types. GMODWeb was built using the open source Turnkey web framework and is available from . PMID:18570664

  4. IAU nomenclature for albedo features on the planet Mercury

    NASA Technical Reports Server (NTRS)

    Dollfus, A.; Chapman, C. R.; Davies, M. E.; Gingerich, O.; Goldstein, R.; Guest, J.; Morrison, D.; Smith, B. A.

    1978-01-01

    The International Astronomical Union has endorsed a nomenclature for the albedo features on Mercury. Designations are based upon the mythological names related to the god Hermes; they are expressed in Latin form. The dark-hued albedo features are associated with the generic term Solitudo. The light-hued areas are designated by a single name without generic term. The 32 names adopted are allocated on the Mercury map.

  5. Influence of neighbourhood information on 'Local Climate Zone' mapping in heterogeneous cities

    NASA Astrophysics Data System (ADS)

    Verdonck, Marie-Leen; Okujeni, Akpona; van der Linden, Sebastian; Demuzere, Matthias; De Wulf, Robert; Van Coillie, Frieke

    2017-10-01

    Local climate zone (LCZ) mapping is an emerging field in urban climate research. LCZs potentially provide an objective framework to assess urban form and function worldwide. The scheme is currently being used to globally map LCZs as part of the World Urban Database and Access Portal Tools (WUDAPT) initiative. So far, most of the LCZ maps lack proper quantitative assessment, challenging the generic character of the WUDAPT workflow. Using the standard method introduced by the WUDAPT community, difficulties arose concerning the built zones due to high levels of heterogeneity. To overcome this problem, a contextual classifier is adopted in the mapping process. This paper quantitatively assesses the influence of neighbourhood information on the LCZ mapping result of three cities in Belgium: Antwerp, Brussels and Ghent. Overall accuracies for the maps were, respectively, 85.7 ± 0.5, 79.6 ± 0.9 and 90.2 ± 0.4%. The approach presented here results in overall accuracies of 93.6 ± 0.2, 92.6 ± 0.3 and 95.6 ± 0.3% for Antwerp, Brussels and Ghent. The results thus indicate a positive influence of neighbourhood information for all study areas, with an increase in overall accuracy of 7.9, 13.0 and 5.4 percentage points. This paper reaches two main conclusions. Firstly, evidence was introduced on the relevance of a quantitative accuracy assessment in LCZ mapping, showing that the accuracies reported in previous papers are not easily achieved. Secondly, the method presented in this paper proves to be highly effective in Belgian cities and, given its open character, shows promise for application in other heterogeneous cities worldwide.
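
    One simple way to inject neighbourhood information into a per-pixel classifier is to append moving-window means of each feature band before classification, as sketched below; the authors' contextual classifier may differ, so this illustrates the general idea only.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def add_context(bands, window=5):
            # bands: (n_bands, rows, cols) feature stack; append a smoothed
            # copy of each band so the classifier sees its neighbourhood too.
            context = np.stack([uniform_filter(b, size=window) for b in bands])
            return np.concatenate([bands, context], axis=0)

        features = np.random.rand(4, 100, 100)
        print(add_context(features).shape)   # (8, 100, 100)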

  6. Linking rare and common disease: mapping clinical disease-phenotypes to ontologies in therapeutic target validation.

    PubMed

    Sarntivijai, Sirarat; Vasant, Drashtti; Jupp, Simon; Saunders, Gary; Bento, A Patrícia; Gonzalez, Daniel; Betts, Joanna; Hasan, Samiul; Koscielny, Gautier; Dunham, Ian; Parkinson, Helen; Malone, James

    2016-01-01

    The Centre for Therapeutic Target Validation (CTTV - https://www.targetvalidation.org/) was established to generate therapeutic target evidence from genome-scale experiments and analyses. CTTV aims to support the validity of therapeutic targets by integrating existing and newly-generated data. Data integration has been achieved in some resources by mapping metadata such as disease and phenotypes to the Experimental Factor Ontology (EFO). Additionally, the relationship between ontology descriptions of rare and common diseases and their phenotypes can offer insights into shared biological mechanisms and potential drug targets. Ontologies are not ideal for representing the 'sometimes associated' type of relationship required. This work addresses two challenges: annotation of diverse big data, and representation of complex, sometimes associated relationships between concepts. Semantic mapping uses a combination of custom scripting, our annotation tool 'Zooma', and expert curation. Disease-phenotype associations were generated using literature mining on Europe PubMed Central abstracts, and were manually verified by experts for validity. Representation of the disease-phenotype association was achieved by the Ontology of Biomedical AssociatioN (OBAN), a generic association representation model. OBAN represents associations between a subject and an object, i.e., a disease and its associated phenotypes, together with the source of evidence for that association. Indirect disease-to-disease associations are exposed through shared phenotypes. This was applied to the use case of linking rare to common diseases at the CTTV. EFO yields an average of over 80% mapping coverage in all data sources. A 42% precision is obtained from the manual verification of the text-mined disease-phenotype associations. This results in 1452 and 2810 disease-phenotype pairs for IBD and autoimmune disease, respectively, and contributes towards 11,338 rare disease associations (merged with existing published work [Am J Hum Genet 97:111-24, 2015]). An OBAN result file is downloadable at http://sourceforge.net/p/efo/code/HEAD/tree/trunk/src/efoassociations/. Twenty common diseases are linked to 85 rare diseases by shared phenotypes. A generalizable OBAN model for association representation is presented in this study. Here we present solutions to large-scale annotation-ontology mapping in the CTTV knowledge base, a process for disease-phenotype mining, and propose a generic association model, 'OBAN', as a means to integrate diseases using shared phenotypes. EFO is released monthly and available for download at http://www.ebi.ac.uk/efo/.
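
    At its core, an OBAN record is a provenance-carrying triple. The minimal sketch below mirrors that structure in plain Python; the field names and example identifiers are illustrative, and OBAN itself is expressed in OWL/RDF rather than code.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Association:
            subject: str      # e.g. a disease term identifier
            object: str       # e.g. a phenotype ontology term
            provenance: str   # source of evidence for the association

        a = Association(subject="EFO:0003767",    # illustrative disease ID
                        object="HP:0002014",      # illustrative phenotype ID
                        provenance="Europe PMC text mining, manually verified")
        # Indirect disease-to-disease links emerge when two subjects share
        # the same object (phenotype).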

  7. Expressive map design: OGC SLD/SE++ extension for expressive map styles

    NASA Astrophysics Data System (ADS)

    Christophe, Sidonie; Duménieu, Bertrand; Masse, Antoine; Hoarau, Charlotte; Ory, Jérémie; Brédif, Mathieu; Lecordix, François; Mellado, Nicolas; Turbet, Jérémie; Loi, Hugo; Hurtut, Thomas; Vanderhaeghe, David; Vergne, Romain; Thollot, Joëlle

    2018-05-01

    In the context of custom map design, handling more artistic and expressive tools has been identified as a cartographic need, in order to design stylized and expressive maps. Based on previous works on style formalization, an approach for specifying the map style has been proposed and experimented with for particular use cases. A first step deals with the analysis of inspiration sources, in order to extract 'what makes the style of the source', i.e., the salient visual characteristics to be automatically reproduced (textures, spatial arrangements, linear stylization, etc.). In a second step, in order to mimic and generate those visual characteristics, existing and innovative rendering techniques have been implemented in our GIS engine, thus extending its capabilities to generate expressive renderings. Therefore, an extension of the existing cartographic pipeline has been proposed based on the following aspects: 1) extension of the symbolization specifications OGC SLD/SE in order to provide a formalism to specify and reference expressive rendering methods; 2) separation of the specification of each rendering method from its parameterization, as metadata. The main contribution has been described in (Christophe et al. 2016). In this paper, we focus firstly on the extension of the cartographic pipeline (SLD++ and metadata) and secondly on map design capabilities, which have been experimented with on various topographic styles: old cartographic styles (Cassini), artistic styles (watercolor, impressionism, Japanese print), hybrid topographic styles (ortho-imagery & vector data) and, finally, abstract and photo-realist styles for the geovisualization of coastal areas. The genericity and interoperability of our approach are promising and have already been tested for 3D visualization.

  8. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

    Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as the sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. Supplementary data are available at Bioinformatics online.
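
    A typical pair-specific metric in such a workflow is the variant concordance between tumor and matched normal, which guards against sample swaps. The sketch below is a generic illustration under our own naming, not the metric implementation shipped in ngs-bits.

        def variant_concordance(tumor_variants, normal_variants):
            # Variants as (chrom, pos, ref, alt) tuples; Jaccard overlap.
            union = tumor_variants | normal_variants
            shared = tumor_variants & normal_variants
            return len(shared) / len(union) if union else 0.0

        tumor = {("chr1", 12345, "A", "G"), ("chr2", 555, "C", "T"), ("chr7", 901, "G", "A")}
        normal = {("chr1", 12345, "A", "G"), ("chr2", 555, "C", "T")}
        # A matched pair shares most germline variants; a very low value
        # suggests the two samples are not from the same patient.
        print(f"{variant_concordance(tumor, normal):.2f}")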

  9. Biological data integration: wrapping data and tools.

    PubMed

    Lacroix, Zoé

    2002-06-01

    Nowadays, scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views, mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to seamlessly access data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform Object Protocol Model (OPM) interfaces.
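
    The two wrapper tasks described above can be made concrete with a small sketch: a retrieval component that filters a source by the attributes its search view exposes, and an XML step that shapes the result to the virtual structure. All names here are illustrative assumptions, not the authors' implementation.

        import xml.etree.ElementTree as ET

        class FlatFileWrapper:
            def __init__(self, path, fields):
                self.path, self.fields = path, fields

            def retrieve(self, **constraints):
                # Retrieval task: scan the source, keep matching records.
                rows = []
                with open(self.path) as fh:
                    for line in fh:
                        rec = dict(zip(self.fields, line.rstrip("\n").split("\t")))
                        if all(rec.get(k) == v for k, v in constraints.items()):
                            rows.append(rec)
                return rows

            def to_xml(self, rows):
                # XML engine task: build output matching the virtual structure.
                root = ET.Element("records")
                for rec in rows:
                    e = ET.SubElement(root, "record")
                    for k, v in rec.items():
                        ET.SubElement(e, k).text = v
                return ET.tostring(root, encoding="unicode")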

  10. magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation

    NASA Astrophysics Data System (ADS)

    Angleraud, Christophe

    2014-06-01

    The ever increasing amount of data and processing capabilities - following the well-known Moore's law - is challenging the way scientists and engineers currently exploit large datasets. Scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross-discipline fertilization. On the other hand, Geographic Information Systems allow nice and visually appealing maps to be built, but they often become cluttered as more layers are added. Moreover, the introduction of time as a fourth analysis dimension, to allow analysis of time-dependent phenomena such as meteorological or climate models, is encouraging real-time data exploration techniques that allow spatio-temporal points of interest to be detected by integration of moving images by the human brain. Magellium has been involved in high performance image processing chains for satellite image processing, as well as scientific signal analysis and geographic information management, since its creation (2003). We believe that recent work on big data, GPU and peer-to-peer collaborative processing can open a new breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping and understanding. The magHD (for Magellium Hyper-Dimension) project aims at developing software solutions that bring highly interactive tools for complex dataset analysis and exploration to commodity hardware, targeting small to medium scale clusters with expansion capabilities to large cloud-based clusters.

  11. Comparison between utility of the Thai Pediatric Quality of Life Inventory 4.0 Generic Core Scales and 3.0 Cerebral Palsy Module.

    PubMed

    Tantilipikorn, Pinailug; Watter, Pauline; Prasertsukdee, Saipin

    2013-03-01

    Health-related quality of life (HRQOL) is increasingly being considered in the management of patients with various conditions. HRQOL instruments can be broadly classified as generic or disease-specific measures. Several generic HRQOL instruments in different languages have been developed for paediatric populations, including the Pediatric Quality of Life Inventory 4.0 (PedsQL 4.0) Generic Core Scales. This tool and a condition-specific tool, the PedsQL 3.0 Cerebral Palsy (CP) Module, are widely used in children with CP. No psychometric properties have been reported for the Thai PedsQL 4.0. Therefore, this study aimed to explore the psychometric properties of the Thai version of the PedsQL 4.0 Generic Core Scales and compare these with the values for the Thai PedsQL 3.0 CP Module reported previously. The Thai PedsQL 4.0 Generic Core Scales and the PedsQL 3.0 CP Module were completed by children with CP and their parents or caregivers twice within 2-4 weeks. Respondents were 97 parents or caregivers and 54 children. Minimal missing data were found in most scales. Acceptable internal consistency was supported, except for Emotional, Social, and School Functioning. Intraclass correlation coefficients for parent-proxy report and self-report were good to excellent (0.625-0.849). The feasibility and reliability of the Thai PedsQL 4.0 Generic Core Scales were supported. The Thai PedsQL 3.0 CP Module showed higher values for the psychometric properties. Low-to-good correlations were found among the scales between the PedsQL 4.0 Generic Core Scales and the 3.0 CP Module. Both instruments can be used to measure HRQOL for children with CP, and may provide different information.

  12. Cross-sectional mapping for refined beam elements with applications to shell-like structures

    NASA Astrophysics Data System (ADS)

    Pagani, A.; de Miguel, A. G.; Carrera, E.

    2017-06-01

    This paper discusses the use of higher-order mapping functions for enhancing the physical representation of refined beam theories. Based on the Carrera unified formulation (CUF), advanced one-dimensional models are formulated by expressing the displacement field as a generic expansion of the generalized unknowns. According to CUF, a novel physically/geometrically consistent model is devised by employing Legendre-like polynomial sets to approximate the generalized unknowns at the cross-sectional level, whereas a local mapping technique based on the blending functions method is used to describe the exact physical boundaries of the cross-section domain. Classical and innovative finite element methods, including hierarchical p-elements and locking-free integration schemes, are utilized to solve the governing equations of the unified beam theory. Several numerical applications accounting for small displacements/rotations and strains are discussed, including beam structures with cross-sectional curved edges, cylindrical shells, and thin-walled aeronautical wing structures with reinforcements. The results from the proposed methodology are widely assessed by comparisons with solutions from the literature and commercial finite element software tools. The attention is focussed on the high computational efficiency and the marked capabilities of the present beam model, which can deal with a broad spectrum of structural problems with unveiled accuracy in terms of geometrical representation of the domain boundaries.
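
    For reference, the heart of the Carrera unified formulation is the generic expansion mentioned above; in standard CUF notation (beam axis along y, summation implied over repeated indices),

        \[
          \mathbf{u}(x, y, z) = F_\tau(x, z)\,\mathbf{u}_\tau(y),
          \qquad \tau = 1, 2, \dots, M,
        \]

    where the F_\tau are the cross-sectional expansion functions (the Legendre-like polynomial sets in the model above), the \mathbf{u}_\tau(y) are the vectors of generalized unknowns, and M sets the order, and hence the refinement, of the resulting beam theory.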

  13. Planetary SUrface Portal (PSUP): a tool for easy visualization and analysis of Martian surface

    NASA Astrophysics Data System (ADS)

    Poulet, Francois; Quantin-Nataf, Cathy; Ballans, Hervé; Lozac'h, Loic; Audouard, Joachim; Carter, John; Dassas, Karin; Malapert, Jean-Christophe; Marmo, Chiara; Poulleau, Gilles; Riu, Lucie; Séjourné, Antoine

    2016-10-01

    PSUP consists of two software application platforms for working with raster, vector, DTM, and hyperspectral data acquired by various space instruments analyzing the surface of Mars from orbit. The first platform of PSUP is MarsSI (Martian surface data processing Information System, http://emars.univ-lyon1.fr). It provides data analysis functionalities to select and download ready-to-use products or to process data through specific and validated pipelines. To date, MarsSI handles CTX, HiRISE and CRISM data of the NASA/MRO mission, HRSC and OMEGA data of the ESA/MEx mission, and THEMIS data of the NASA/ODY mission (Lozac'h et al., EPSC 2015). The second part of PSUP is also open to the scientific community and can be visited at http://psup.ias.u-psud.fr/. This web-based user interface provides access to many data products for Mars: image footprints and rasters from the MarsSI tool; compositional maps from OMEGA and TES; albedo and thermal inertia from OMEGA and TES; mosaics from THEMIS, Viking, and CTX; and high-level specific products (defined as catalogues) such as hydrated mineral sites derived from CRISM and OMEGA data, central peaks mineralogy,… In addition, OMEGA C channel data cubes corrected for atmospheric and aerosol contributions can be downloaded. The architecture of PSUP data management and visualization is based on SITools2 and MIZAR, two generic tools developed by a joint effort between CNES and scientific laboratories. SITools2 provides a self-manageable data access layer deployed on the PSUP data, while MIZAR is a 3D application for discovering and visualizing geospatial data in a browser. Further developments, including the addition of high-level products of Mars (regional geological maps, new global compositional maps,…), are foreseen. Ultimately, PSUP will be adapted to other planetary surfaces and space missions in which the French research institutes are involved.

  14. Computer aided planning of orthopaedic surgeries: the definition of generic planning steps for bone removal procedures.

    PubMed

    Putzer, David; Moctezuma, Jose Luis; Nogler, Michael

    2017-11-01

    An increasing number of orthopaedic surgeons are using computer-aided planning tools for bone removal applications. The aim of the study was to consolidate a set of generic functions to be used for 3D computer-assisted planning or simulation. A limited subset of 30 surgical procedures was analyzed and verified in 243 surgical procedures of a surgical atlas. Fourteen generic functions to be used in 3D computer-assisted planning and simulations were extracted. Our results showed that the average procedure comprises 14 ± 10 (SD) steps, with ten different generic planning steps and four generic bone removal steps. In conclusion, the study shows that with a limited number of 14 planning functions it is possible to perform 243 surgical procedures out of Campbell's Operative Orthopedics atlas. The results may be used as a basis for versatile generic intraoperative planning software.

  15. On generic obstructions to recovering correct statistics from climate simulations: Homogenization for deterministic maps and multiplicative noise

    NASA Astrophysics Data System (ADS)

    Gottwald, Georg; Melbourne, Ian

    2013-04-01

    Whereas diffusion limits of stochastic multi-scale systems have a long and successful history, the case of constructing stochastic parametrizations of chaotic deterministic systems has been much less studied. We present rigorous results on the convergence of a chaotic slow-fast system to a stochastic differential equation with multiplicative noise. Furthermore, we present rigorous results for chaotic slow-fast maps, occurring as numerical discretizations of continuous-time systems. This raises the issue of how to interpret certain stochastic integrals; surprisingly, in the case of maps the resulting integrals of the stochastic limit system are generically neither of Stratonovich nor of Ito type. It is shown that the limit system of a numerical discretisation differs from that of the associated continuous-time system. This has important consequences when interpreting the statistics of long-time simulations of multi-scale systems; they may be very different from those of the original continuous-time system which we set out to study.
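
    Schematically, and with all precise hypotheses deferred to the paper, the slow-fast maps in question take the form

        \[
          x_{n+1} = x_n + \varepsilon^2\, a(x_n) + \varepsilon\, b(x_n)\, v(y_n),
          \qquad y_{n+1} = T y_n,
        \]

    with T a chaotic map, and on time scales n \approx t\,\varepsilon^{-2} the slow variable converges to a limit of the form

        \[
          \mathrm{d}X = a(X)\,\mathrm{d}t + c\, b'(X)\, b(X)\,\mathrm{d}t + b(X)\,\mathrm{d}W_t,
        \]

    where the drift constant c is fixed by a Green-Kubo-type formula for the autocorrelations of v along orbits of T; c = 0 would be the Ito interpretation, and chaotic maps generically yield values matching neither Ito nor Stratonovich. This is a hedged sketch of the setting, not a statement of the theorem.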

  16. Integrated Control Modeling for Propulsion Systems Using NPSS

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Felder, James L.; Lavelle, Thomas M.; Withrow, Colleen A.; Yu, Albert Y.; Lehmann, William V. A.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS), an advanced engineering simulation environment used to design and analyze aircraft engines, has been enhanced by integrating control development tools into it. One of these tools is a generic controller interface that allows NPSS to communicate with control development software environments such as MATLAB and EASY5. The other tool is a linear model generator (LMG) that gives NPSS the ability to generate linear, time-invariant state-space models. Integrating these tools into NPSS enables it to be used for control system development. This paper discusses the development and integration of these tools into NPSS. In addition, it shows a comparison of transient model results for a generic, dual-spool, military-type engine model that has been implemented in NPSS and Simulink. It also shows the linear model generator's ability to approximate the dynamics of a nonlinear NPSS engine model.
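
    The models the LMG emits are of the standard linear time-invariant state-space form

        \[
          \dot{\mathbf{x}} = A\mathbf{x} + B\mathbf{u}, \qquad
          \mathbf{y} = C\mathbf{x} + D\mathbf{u},
        \]

    with x the engine state perturbations, u the actuator inputs, and y the sensed outputs. How NPSS populates A, B, C and D (presumably by perturbing the nonlinear engine model about an operating point) is not spelled out in the abstract, so that detail is our assumption.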

  17. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL, with its geospatial extension PostGIS, can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research, which is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available, especially in GDAL and Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools as extensions with specific features can be defined, used, and studied. Linking with external libraries is possible using the Perl foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.

  18. A Generic Framework of Performance Measurement in Networked Enterprises

    NASA Astrophysics Data System (ADS)

    Kim, Duk-Hyun; Kim, Cheolhan

    Performance measurement (PM) is essential for managing networked enterprises (NEs) because it greatly affects the effectiveness of collaboration among members of an NE. PM in NEs requires somewhat different approaches from PM in a single enterprise because of the heterogeneity, dynamism, and complexity of NEs. This paper introduces a generic framework of PM in NEs (we call it NEPM) based on the Balanced Scorecard (BSC) approach. In NEPM, key performance indicators and the cause-and-effect relationships among them are defined in a generic strategy map. NEPM can be applied to various types of NEs after specializing the KPIs and the relationships among them. The effectiveness of NEPM is shown through a case study of some Korean NEs.

  19. A Web Geographic Information System to share data and explorative analysis tools: The application to West Nile disease in the Mediterranean basin.

    PubMed

    Savini, Lara; Tora, Susanna; Di Lorenzo, Alessio; Cioci, Daniela; Monaco, Federica; Polci, Andrea; Orsini, Massimiliano; Calistri, Paolo; Conte, Annamaria

    2018-01-01

    In recent decades an increasing number of West Nile disease cases has been observed in equines and humans in the Mediterranean basin, and surveillance systems have been set up in numerous countries to manage and control the disease. The collection, storage and distribution of information on the spread of the disease is important for a shared intervention and control strategy. To this end, a Web Geographic Information System has been developed, and disease data, climatic and environmental remote-sensed data, and full genome sequences of selected isolated strains are made available. This paper describes the Disease Monitoring Dashboard (DMD) web system application, the tools available for preliminary analysis of climatic and environmental factors, and the other interactive tools for epidemiological analysis. WNV occurrence data are collected from multiple official and unofficial sources. Whole genome sequences and metadata of WNV strains are retrieved from public databases or generated in the framework of the Italian surveillance activities. Climatic and environmental data are provided by the NASA website. The Geographical Information System is composed of an Oracle 10g database and ESRI ArcGIS Server 10.03; the web mapping client application is developed with the ArcGIS API for JavaScript and the Phylocanvas library to facilitate and optimize the mash-up approach. ESRI ArcSDE 10.1 has been used to store spatial data. The DMD application is accessible through a generic web browser at https://netmed.izs.it/networkMediterraneo/. The system collects data through on-line forms and automated procedures and visualizes data as interactive graphs, maps and tables. The spatial and temporal dynamic visualization of disease events is managed by a time slider that returns results on both the map and the epidemiological curve. Climatic and environmental data can be associated with cases through Python procedures and downloaded as Excel files. The system compiles multiple datasets through user-friendly web tools; it integrates entomological, veterinary and human surveillance, molecular information on pathogens, and environmental and climatic data. The principal result of the DMD development is the transfer and dissemination of knowledge and technologies to develop strategies for integrated prevention and control measures for animal and human diseases.

  20. Preliminary Analysis of Assessment Instrument Design to Reveal Science Generic Skill and Chemistry Literacy

    ERIC Educational Resources Information Center

    Sumarni, Woro; Sudarmin; Supartono, Wiyanto

    2016-01-01

    The purpose of this research is to design an assessment instrument to evaluate science generic skill (SGS) achievement and chemistry literacy in ethnoscience-integrated chemistry learning. The steps of tool design follow the Plomp model, including 1) Investigation Phase (Preliminary Investigation); 2) Designing Phase (Design); 3)…

  1. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    ERIC Educational Resources Information Center

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programing language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  2. Generic Service Integration in Adaptive Learning Experiences Using IMS Learning Design

    ERIC Educational Resources Information Center

    de-la-Fuente-Valentin, Luis; Pardo, Abelardo; Kloos, Carlos Delgado

    2011-01-01

    IMS Learning Design is a specification to capture the orchestration taking place in a learning scenario. This paper presents an extension called Generic Service Integration. This paradigm allows a bidirectional communication between the course engine in charge of the orchestration and conventional Web 2.0 tools. This communication allows the…

  3. Generic Reflective Feedback: An Effective Approach to Developing Clinical Reasoning Skills

    ERIC Educational Resources Information Center

    Wojcikowski, K.; Brownie, S.

    2013-01-01

    Problem-based learning can be an effective tool to develop clinical reasoning skills. However, it traditionally takes place in tutorial groups, giving students little flexibility in how and when they learn. This pilot study compared the effectiveness of generic reflective feedback (GRF) with tutorial-based reflective feedback on the development of…

  4. Development and validation of a tool to assess knowledge and attitudes towards generic medicines among students in Greece: The ATtitude TOwards GENerics (ATTOGEN) questionnaire

    PubMed Central

    Katsari, Vasiliki; Niakas, Dimitris

    2017-01-01

    Introduction The use of generic medicines is a cost-effective policy, often dictated by fiscal restraints. To our knowledge, no fully validated tool exploring students’ knowledge and attitudes towards generic medicines exists. The aim of our study was to develop and validate a questionnaire exploring the knowledge and attitudes of M.Sc. in Health Care Management students and recent alumni towards generic drugs in Greece. Materials and methods The development of the questionnaire was a result of literature review and pilot-testing of its preliminary versions with researchers and students. The final version of the questionnaire contains 18 items measuring the respondents’ knowledge and attitude towards generic medicines on a 5-point Likert scale. Given the ordinal nature of the data, ordinal alpha and polychoric correlations were computed. The sample was randomly split into two halves. Exploratory factor analysis, performed in the first sample, was used for the creation of multi-item scales. Confirmatory factor analysis and Generalized Linear Latent and Mixed Model (GLLAMM) analysis with the use of the rating scale model were used in the second sample to assess goodness of fit. An assessment of internal consistency reliability, test-retest reliability, and construct validity was also performed. Results Among 1402 persons contacted, 986 persons completed our questionnaire (response rate = 70.3%). Overall Cronbach’s alpha was 0.871. The conjoint use of exploratory and confirmatory factor analysis resulted in a six-scale model, which seemed to fit the data well. Five of the six scales, namely trust, drug quality, state audit, fiscal impact and drug substitution, were found to be valid and reliable, while the knowledge scale suffered only from low inter-scale correlations and a ceiling effect. However, the subsequent confirmatory factor and GLLAMM analyses indicated a good fit of the model to the data. Conclusions The ATTOGEN instrument proved to be a reliable and valid tool, suitable for assessing students’ knowledge and attitudes towards generic medicines. PMID:29186163

  5. 7 CFR 1485.18 - Advances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... MARKETS FOR AGRICULTURAL COMMODITIES Market Access Program § 1485.18 Advances. (a) Policy. In general, CCC... payments to an MAP participant for generic promotion activities. Prior to making an advance, CCC may...

  6. 7 CFR 1485.18 - Advances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... MARKETS FOR AGRICULTURAL COMMODITIES Market Access Program § 1485.18 Advances. (a) Policy. In general, CCC... payments to an MAP participant for generic promotion activities. Prior to making an advance, CCC may...

  7. Encouraging the use of generic medicines: implications for transition economies.

    PubMed

    King, Derek R; Kanavos, Panos

    2002-08-01

    Generic drugs have a key role to play in the efficient allocation of financial resources for pharmaceutical medicines. Policies implemented in the countries with a high rate of generic drug use, such as Canada, Denmark, Germany, the Netherlands, the United Kingdom, and the United States, are reviewed, with consideration of the market structures that facilitate strong competition. Savings in these countries are realized through increases in the volume of generic drugs used and the frequently significant differences in the price between generic medicines and branded originator medicines. Their policy tools include the mix of supply-side measures and demand-side measures that are relevant for generic promotion and higher generic use. On the supply-side, key policy measures include generic drug marketing regulation that facilitates market entry soon after patent expiration, reference pricing, the pricing of branded originator products, and the degree of price competition in pharmaceutical markets. On the demand-side, measures typically encompass influencing prescribing and dispensing patterns as well as introducing a co-payment structure for consumers/patients that takes into consideration the difference in cost between branded and generic medicines. Quality of generic medicines is a pre-condition for all other measures discussed to take effect. The paper concludes by offering a list of policy options for decision-makers in Central and Eastern European economies in transition.

  8. Two-dimensional ice mapping of molecular cores

    NASA Astrophysics Data System (ADS)

    Noble, J. A.; Fraser, H. J.; Pontoppidan, K. M.; Craigon, A. M.

    2017-06-01

    We present maps of the column densities of H2O, CO2 and CO ices towards the molecular cores B 35A, DC 274.2-00.4, BHR 59 and DC 300.7-01.0. These ice maps, probing spatial distances in molecular cores as low as 2200 au, challenge the traditional hypothesis that the denser the region observed, the more ice is present, providing evidence that the relationships between solid molecular species are more varied than the generic picture we often adopt to model gas-grain chemical processes and explain feedback between solid phase processes and gas phase abundances. We present the first combined solid-gas maps of a single molecular species, based upon observations of both CO ice and gas phase C18O towards B 35A, a star-forming dense core in Orion. We conclude that molecular species in the solid phase are powerful tracers of 'small-scale' chemical diversity, prior to the onset of star formation. With a component analysis approach, we can probe the solid phase chemistry of a region at a level of detail greater than that provided by statistical analyses or generic conclusions drawn from single pointing line-of-sight observations alone.

  9. Cosmological N -body simulations with generic hot dark matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandbyge, Jacob; Hannestad, Steen, E-mail: jacobb@phys.au.dk, E-mail: sth@phys.au.dk

    2017-10-01

    We have calculated the non-linear effects of generic fermionic and bosonic hot dark matter components in cosmological N -body simulations. For sub-eV masses, the non-linear power spectrum suppression caused by thermal free-streaming resembles the one seen for massive neutrinos, whereas for masses larger than 1 eV, the non-linear relative suppression of power is smaller than in linear theory. We furthermore find that in the non-linear regime, one can map fermionic to bosonic models by performing a simple transformation.

  10. Cosmological N-body simulations with generic hot dark matter

    NASA Astrophysics Data System (ADS)

    Brandbyge, Jacob; Hannestad, Steen

    2017-10-01

    We have calculated the non-linear effects of generic fermionic and bosonic hot dark matter components in cosmological N-body simulations. For sub-eV masses, the non-linear power spectrum suppression caused by thermal free-streaming resembles the one seen for massive neutrinos, whereas for masses larger than 1 eV, the non-linear relative suppression of power is smaller than in linear theory. We furthermore find that in the non-linear regime, one can map fermionic to bosonic models by performing a simple transformation.

  11. Impact of medicare part D plan features on use of generic drugs.

    PubMed

    Tang, Yan; Gellad, Walid F; Men, Aiju; Donohue, Julie M

    2014-06-01

    Little is known about how Medicare Part D plan features influence choice of generic versus brand drugs. To examine the association between Part D plan features and generic medication use. Data from a 2009 random sample of 1.6 million fee-for-service, Part D enrollees aged 65 years and above, who were not dually eligible or receiving low-income subsidies, were used to examine the association between plan features (generic cost-sharing, difference in brand and generic copay, prior authorization, step therapy) and choice of generic antidepressants, antidiabetics, and statins. Logistic regression models accounting for plan-level clustering were adjusted for sociodemographic and health status. Generic cost-sharing ranged from $0 to $9 for antidepressants and statins, and from $0 to $8 for antidiabetics (across 5th-95th percentiles). Brand-generic cost-sharing differences were smallest for statins (5th-95th percentiles: $16-$37) and largest for antidepressants ($16-$64) across plans. Beneficiaries with higher generic cost-sharing had lower generic use [adjusted odds ratio (OR)=0.97, 95% confidence interval (CI), 0.95-0.98 for antidepressants; OR=0.97, 95% CI, 0.96-0.98 for antidiabetics; OR=0.94, 95% CI, 0.92-0.95 for statins]. Larger brand-generic cost-sharing differences and prior authorization were significantly associated with greater generic use in all categories. Plans could increase generic use by 5-12 percentage points by reducing generic cost-sharing from the 75th ($7) to 25th percentiles ($4-$5), increasing brand-generic cost-sharing differences from the 25th ($25-$26) to 75th ($32-$33) percentiles, and using prior authorization and step therapy. Cost-sharing features and utilization management tools were significantly associated with generic use in 3 commonly used medication categories.
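
    To make the modeling step concrete, here is a minimal sketch of a logistic regression with plan-level clustered standard errors, fit with statsmodels on synthetic data (all variable names, ranges, and effect sizes below are illustrative assumptions, not the study's data):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 2000
        df = pd.DataFrame({
            "plan_id": rng.integers(0, 50, n),             # 50 hypothetical Part D plans
            "generic_copay": rng.uniform(0, 9, n),         # $0-$9, echoing the 5th-95th percentile range
            "brand_generic_diff": rng.uniform(16, 64, n),  # brand-generic copay difference
            "prior_auth": rng.integers(0, 2, n),           # plan uses prior authorization
        })
        # Simulated choice: cheaper generics and larger brand-generic gaps favor generic use
        xb = -0.5 - 0.05 * df["generic_copay"] + 0.02 * df["brand_generic_diff"] + 0.3 * df["prior_auth"]
        df["generic"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-xb)))

        # cov_type="cluster" accounts for the plan-level clustering described above
        res = smf.logit("generic ~ generic_copay + brand_generic_diff + prior_auth", data=df).fit(
            cov_type="cluster", cov_kwds={"groups": df["plan_id"]})
        print(res.summary())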

  12. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh Matrix coupled with multi-generational planning enabled the development of a strong rationale to activate the project and set the vision and goals. Improved risk management and a product usage map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. Within those tools, only certain ones required minimal resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All the techniques used provided savings exceeding their investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and the estimated savings generated. Globally, all companies can benefit from using DEX by smartly selecting, at the start of the project, those tools with the best return on investment. For this, a good understanding of the available company resources, background, and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  13. Generic Modeling of a Life Support System for Process Technology Comparison

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.

  14. Automatic control system generation for robot design validation

    NASA Technical Reports Server (NTRS)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool that uses both the generic robotic description and the control system.

  15. Tools for Rapid Understanding of Malware Code

    DTIC Science & Technology

    2015-05-07

    cloaking techniques. We used three malware detectors, covering a wide spectrum of detection technologies, for our experiments: VirusTotal, an online ...Analysis and Manipulation (SCAM), 2014. [9] Babak Yadegari, Brian Johannesmeyer, Benjamin Whitely, and Saumya Debray. A generic approach to automatic...

  16. "I'm No Lady Astronaut": Nonsexist Language for Tomorrow.

    ERIC Educational Resources Information Center

    Vardell, Sylvia M.

    As a powerful tool for education, language informs, influences, discloses, and communicates. Research on the use of language has found that it also discriminates. Among the different manifestations of sexism in language are (1) the use of "he" as a generic pronoun; (2) the "generic" use of "man" as an exclusively male referent; (3) the use of "you…

  17. AADL Fault Modeling and Analysis Within an ARP4761 Safety Assessment

    DTIC Science & Technology

    2014-10-01

    ...Analysis Generator; 3.2.3 Mapping to OpenFTA Format File; 3.2.4 Mapping to Generic XML Format; 3.2.5 AADL and FTA Mapping Rules; 3.2.6 Issues... PSSA), System Safety Assessment (SSA), Common Cause Analysis (CCA), Fault Tree Analysis (FTA), Failure Modes and Effects Analysis (FMEA), Failure Modes and Effects Summary, Markov Analysis (MA), and Dependence Diagrams (DDs), also referred to as Reliability Block Diagrams (RBDs).

  18. The Invasive Species Forecasting System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Most, Neal; Gill, Roger; Ma, Peter

    2011-01-01

    The Invasive Species Forecasting System (ISFS) provides computational support for the generic work processes found in many regional-scale ecosystem modeling applications. Decision support tools built using ISFS allow a user to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of management concern, such as a national park, monument, forest, or refuge. This type of decision product helps resource managers plan invasive species protection, monitoring, and control strategies for the lands they manage. Until now, scientists and resource managers have lacked the data-assembly and computing capabilities to produce these maps quickly and cost efficiently. ISFS focuses on regional-scale habitat suitability modeling for invasive terrestrial plants. ISFS's component architecture emphasizes simplicity and adaptability. Its core services can be easily adapted to produce model-based decision support tools tailored to particular parks, monuments, forests, refuges, and related management units. ISFS can be used to build standalone run-time tools that require no connection to the Internet, as well as fully Internet-based decision support applications. ISFS provides the core data structures, operating system interfaces, network interfaces, and inter-component constraints comprising the canonical workflow for habitat suitability modeling. The predictors, analysis methods, and geographic extents involved in any particular model run are elements of the user space and arbitrarily configurable by the user. ISFS provides small, lightweight, readily hardened core components of general utility. These components can be adapted to unanticipated uses, are tailorable, and require at most a loosely coupled, nonproprietary connection to the Web. Users can invoke capabilities from a command line; programmers can integrate ISFS's core components into more complex systems and services. Taken together, these features enable a degree of decentralization and distributed ownership that has helped other types of scientific information services succeed in recent years.

  19. Predictors of generic substitution: The role of psychological, sociodemographic, and contextual factors.

    PubMed

    Drozdowska, Aleksandra; Hermanowski, Tomasz

    2016-01-01

    Escalating pharmaceutical costs have become a global challenge for both governments and patients. Generic substitution is one way of decreasing these costs. The aim of this study was to investigate factors associated with patients' choice between generic drugs and innovator drugs. The survey was conducted in June 2013; 1000 people from across Poland were chosen as a representative population sample. The outcome (a preference for generics/a preference for innovator pharmaceuticals/no preference) was modeled by multinomial logistic regression, adjusted for several variables describing patients' sensitivity to selected generic features (price, brand, and country of origin), to third-party opinions about generics (information on generics in the mass media, opinions of health professionals (i.e. physicians, pharmacists), relatives/friends), as well as patients' personal experiences and income per household. The results supported the predictive capacity of most independent variables (except for patient sensitivity to the country of origin and to information on generics in the mass media) in explaining patients' preferences toward generic substitution. Patient sensitivity to recommendations by physicians, generic brand, and household income were the strongest predictors of the choice between generic and innovator pharmaceuticals (P < 0.001). The probability of choosing generics over innovator drugs was significantly higher among respondents with the lowest income levels, in those who were indifferent to generic brand or their physician's opinion, as well as in respondents who were sensitive to recommendations by pharmacists or attached a greater value to a past experience with generics (their own experience or that of relatives/friends). Given these findings, awareness-raising campaigns may be recommended, supported by a variety of systemic solutions and tools to encourage generic substitution. Copyright © 2016 Elsevier Inc. All rights reserved.
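
    As a sketch of the three-category outcome model described above, the following fits a multinomial logistic regression with scikit-learn on synthetic survey data (the predictors, coding, and effects are invented for illustration and do not reproduce the study):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 1000                          # survey sample size, as in the study
        income = rng.integers(1, 4, n)    # income tercile (1 = lowest)
        phys = rng.integers(0, 2, n)      # sensitive to physician recommendation
        brand = rng.integers(0, 2, n)     # sensitive to generic brand

        # Synthetic outcome: 0 = prefers generics, 1 = prefers innovator drugs, 2 = no preference
        score = -0.5 * income + 0.8 * brand + 0.6 * phys
        y = np.where(score > 0.5, 1, np.where(score < -0.8, 0, 2))

        X = np.column_stack([income, phys, brand])
        clf = LogisticRegression(max_iter=1000).fit(X, y)  # lbfgs fits one multinomial model
        print(clf.predict_proba(X[:3]).round(2))           # class probabilities per respondent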

  20. VHDL Modeling of PRC-70 Radio ASICs for Reverse Engineering

    DTIC Science & Technology

    1994-08-01

    [The indexed text for this record is garbled OCR of VHDL structural netlist code. The recoverable content consists of repeated gate instantiations of the form: inv generic map (DELAY => 1 ns, FANOUT => 1, VDD => VDD, TEMPERATURE => TEMPERATURE) port map (N945, N2467);]

  1. Why are dunkels sticky? Preschoolers infer functionality and intentional creation for artifact properties learned from generic language.

    PubMed

    Cimpian, Andrei; Cadena, Cristina

    2010-10-01

    Artifacts pose a potential learning problem for children because the mapping between their features and their functions is often not transparent. In solving this problem, children are likely to rely on a number of information sources (e.g., others' actions, affordances). We argue that children's sensitivity to nuances in the language used to describe artifacts is an important, but so far unacknowledged, piece of this puzzle. Specifically, we hypothesize that children are sensitive to whether an unfamiliar artifact's features are highlighted using generic (e.g., "Dunkels are sticky") or non-generic (e.g., "This dunkel is sticky") language. Across two studies, older, but not younger, preschoolers who heard such features introduced via generic statements inferred that they are a functional part of the artifact's design more often than children who heard the same features introduced via non-generic statements. The ability to pick up on this linguistic cue may expand considerably the amount of conceptual information about artifacts that children derive from conversations with adults. Copyright 2010 Elsevier B.V. All rights reserved.

  2. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  3. Proceedings of the 1987 conference on tools for the simulation profession

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawkins, R.; Klukis, K.

    1987-01-01

    This book covers the proceedings of the 1987 conference on tools for the simulation profession. Some of the topics are: SIMULACT: a generic tool for simulating distributed systems; ESL language simulation of spacecraft batteries; and Trends in global cadmium levels from increased use of fossil fuels.

  4. Mapping Turnaround Times (TAT) to a Generic Timeline: A Systematic Review of TAT Definitions in Clinical Domains

    PubMed Central

    2011-01-01

    Background Assessing turnaround times can help to analyse workflows in hospital information systems. This paper presents a systematic review of literature concerning different turnaround time definitions. Our objectives were to collect relevant literature with respect to these process times in hospitals and their respective domains. We then analysed the existing definitions and summarised them in an appropriate format. Methods Our search strategy was based on PubMed queries and manual reviews of the bibliographies of retrieved articles. Studies were included if precise definitions of turnaround times were available. A generic timeline was designed through a consensus process to provide an overview of these definitions. Results More than 1000 articles were analysed, yielding 122 relevant papers. Of those, 162 turnaround time definitions in different clinical domains were identified. Starting and end points vary between these domains. To illustrate those turnaround time definitions, a generic timeline was constructed using preferred terms derived from the identified definitions. The consensus process resulted in the following 15 terms: admission, order, biopsy/examination, receipt of specimen in laboratory, procedure completion, interpretation, dictation, transcription, verification, report available, delivery, physician views report, treatment, discharge and discharge letter sent. Based on this analysis, several standard terms for turnaround time definitions are proposed. Conclusion Using turnaround times to benchmark clinical workflows is still difficult, because even within the same clinical domain many different definitions exist. Mapping of turnaround time definitions to a generic timeline is feasible. PMID:21609424
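
    The generic timeline maps naturally to a simple data structure: an ordered list of the 15 preferred terms, with any domain-specific turnaround time defined as the elapsed time between two named points. A minimal sketch (the event times are invented):

        from datetime import datetime

        # The 15 preferred terms of the generic timeline, in their consensus order
        TIMELINE = [
            "admission", "order", "biopsy/examination",
            "receipt of specimen in laboratory", "procedure completion",
            "interpretation", "dictation", "transcription", "verification",
            "report available", "delivery", "physician views report",
            "treatment", "discharge", "discharge letter sent",
        ]

        def turnaround_time(events, start, end):
            """TAT between two named points; each definition picks its own start/end."""
            if TIMELINE.index(start) >= TIMELINE.index(end):
                raise ValueError("start must precede end on the generic timeline")
            return events[end] - events[start]

        events = {"order": datetime(2011, 5, 2, 8, 15),
                  "report available": datetime(2011, 5, 2, 11, 40)}
        print(turnaround_time(events, "order", "report available"))  # 3:25:00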

  5. The Experiences of Counselors Who Use Cognitive Behavioral Therapy with Middle School Students Who Were Bullied: A Generic Qualitative Study

    ERIC Educational Resources Information Center

    Tucker, Gloria J.

    2016-01-01

    This generic qualitative study investigated the experiences of counselors who use cognitive behavioral therapy with middle school students who were bullied. Counselors can play a significant role in the life of an adolescent when tools are offered to help the adolescent recognize negative thought patterns and help them work towards attaining…

  6. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  7. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  8. Psychometric properties of the self-report Malay version of the Pediatric Quality of Life (PedsQLTM) 4.0 Generic Core Scales among multiethnic Malaysian adolescents.

    PubMed

    Ainuddin, Husna A; Loh, Siew Yim; Chinna, Karuthan; Low, Wah Yun; Roslani, April Camilla

    2015-06-01

    Adolescence is the potential period for growth and optimal functioning, but developmental issues like the transition from childhood to adulthood create stress and affect the adolescent's quality of life (QOL). However, there is a lack of research tools for measuring adolescents' QOL in Malaysia. The aim of the study was to determine the validity and reliability of the self-report Malay version of the pediatric QOL (PedsQL™) 4.0 Generic Core Scales in assessing the QOL of Malaysian adolescents. A cross-sectional study design using the 23-item self-report Malay version of the PedsQL 4.0 Generic Core Scales was administered to a convenience cluster sample (n = 297 adolescents) from a secondary school. The internal consistency reliability had Cronbach's α values ranging from .70 to .89. Factor analysis reported a six-factor structure via principal axis factor analysis. In conclusion, the self-report Malay version of the pediatric QOL 4.0 Generic Core Scales is a reliable and valid tool to measure the QOL of multiethnic Malaysian adolescents. © The Author(s) 2013.
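
    For reference, Cronbach's α for a k-item scale is k/(k-1) times (1 minus the sum of the item variances divided by the variance of the summed score). A short sketch on simulated responses (the data are synthetic, not the study's):

        import numpy as np

        def cronbach_alpha(items):
            """items: respondents x items matrix of scale scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(2)
        latent = rng.normal(size=(297, 1))                         # n = 297, as in the study
        responses = latent + rng.normal(scale=0.7, size=(297, 8))  # 8 hypothetical items
        print(round(cronbach_alpha(responses), 2))                 # high alpha, e.g. ~0.9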

  9. Doing accelerator physics using SDDS, UNIX, and EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borland, M.; Emery, L.; Sereno, N.

    1995-12-31

    The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Control System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of the application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.

  10. Transactions in domain-specific information systems

    NASA Astrophysics Data System (ADS)

    Zacek, Jaroslav

    2017-07-01

    A substantial number of current information system (IS) implementations are based on a transaction approach. In addition, most implementations are domain-specific (e.g. accounting IS, resource planning IS). Therefore, a generic transaction model is needed to build and verify domain-specific IS. The paper proposes a new transaction model for domain-specific ontologies. This model is based on a value-oriented business process modelling technique. The transaction model is formalized by Petri net theory. The first part of the paper presents common business processes and analyses related to business process modeling. The second part defines the transactional model delimited by the REA enterprise ontology paradigm and introduces the states of the generic transaction model. The generic model proposal is defined and visualized with a Petri net modelling tool. The third part shows an application of the generic transaction model. The last part of the paper concludes the results and discusses the practical usability of the generic transaction model.

  11. Physics-Based Design Tools for Lightweight Ceramic Composite Turbine Components with Durable Microstructures

    NASA Technical Reports Server (NTRS)

    DiCarlo, James A.

    2011-01-01

    Under the Supersonics Project of the NASA Fundamental Aeronautics Program, modeling and experimental efforts are underway to develop generic physics-based tools to better implement lightweight ceramic matrix composites into supersonic engine components and to assure sufficient durability for these components in the engine environment. These activities, which have a crosscutting aspect for other areas of the Fundamental Aero program, are focusing primarily on improving the multi-directional design strength and rupture strength of high-performance SiC/SiC composites by advanced fiber architecture design. This presentation discusses progress in tool development with particular focus on the use of 2.5D-woven architectures and state-of-the-art constituents for a generic un-cooled SiC/SiC low-pressure turbine blade.

  12. M-and-C Domain Map Maker: an environment complimenting MDE with M-and-C knowledge and ensuring solution completeness

    NASA Astrophysics Data System (ADS)

    Patwari, Puneet; Choudhury, Subhrojyoti R.; Banerjee, Amar; Swaminathan, N.; Pandey, Shreya

    2016-07-01

    Model Driven Engineering (MDE) as a key driver to reduce development cost of M&C systems is beginning to find acceptance across scientific instruments such as Radio Telescopes and Nuclear Reactors. Such projects are adopting it to reduce time to integrate, test and simulate their individual controllers and increase reusability and traceability in the process. The creation and maintenance of models is still a significant challenge to realizing MDE benefits. Creating domain-specific modelling environments reduces the barriers, and we have been working along these lines, creating a domain-specific language and environment based on an M&C knowledge model. However, large projects involve several such domains, and there is still a need to interconnect the domain models, in order to ensure modelling completeness. This paper presents a knowledge-centric approach to doing that, by creating a generic system model that underlies the individual domain knowledge models. We present our vision for M&C Domain Map Maker, a set of processes and tools that enables explication of domain knowledge in terms of domain models with mutual consistency relationships to aid MDE.

  13. Universal ventricular coordinates: A generic framework for describing position within the heart and transferring data.

    PubMed

    Bayer, Jason; Prassl, Anton J; Pashaei, Ali; Gomez, Juan F; Frontera, Antonio; Neic, Aurel; Plank, Gernot; Vigmond, Edward J

    2018-04-01

    Being able to map a particular set of cardiac ventricles to a generic topologically equivalent representation has many applications, including facilitating comparison of different hearts, as well as mapping quantities and structures of interest between them. In this paper we describe Universal Ventricular Coordinates (UVC), which can be used to describe position within any biventricular heart. UVC comprise four unique coordinates that we have chosen to be intuitive, well defined, and relevant for physiological descriptions. We describe how to determine these coordinates for any volumetric mesh by illustrating how to properly assign boundary conditions and utilize solutions to Laplace's equation. Using UVC, we transferred scalar, vector, and tensor data between four unstructured ventricular meshes from three different species. Performing the mappings was very fast, on the order of a few minutes, since mesh nodes were searched in a KD tree. Distance errors in mapping mesh nodes back and forth between meshes were less than the size of an element. Analytically derived fiber directions were also mapped across meshes and compared, showing  < 5° difference over most of the ventricles. The ability to transfer gradients was also demonstrated. Topologically variable structures, like papillary muscles, required further definition outside of the UVC framework. In conclusion, UVC can aid in transferring many types of data between different biventricular geometries. Copyright © 2018 The Author(s). Published by Elsevier B.V. All rights reserved.
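
    The transfer step described above can be sketched with a KD tree in coordinate space: build the tree on the source mesh's UVC values and query it with the target mesh's values. The snippet below uses nearest-neighbour lookup on random points as a stand-in for real meshes; the four coordinate names follow the paper, everything else is illustrative, and a faithful implementation would also handle the periodic rotational coordinate:

        import numpy as np
        from scipy.spatial import cKDTree

        # Hypothetical UVC arrays: rows are (apicobasal, transmural, rotational, transventricular)
        rng = np.random.default_rng(3)
        uvc_source = rng.random((5000, 4))  # coordinates of source-mesh nodes
        uvc_target = rng.random((3000, 4))  # coordinates of target-mesh nodes
        scalar_on_source = np.sin(2 * np.pi * uvc_source[:, 0])  # a field on the source mesh

        # Nearest-neighbour search in UVC space transfers the field between meshes
        tree = cKDTree(uvc_source)
        _, idx = tree.query(uvc_target)
        scalar_on_target = scalar_on_source[idx]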

  14. Light propagation in linearly perturbed ΛLTB models

    NASA Astrophysics Data System (ADS)

    Meyer, Sven; Bartelmann, Matthias

    2017-11-01

    We apply a generic formalism of light propagation to linearly perturbed spherically symmetric dust models including a cosmological constant. For a comoving observer on the central worldline, we derive the equation of geodesic deviation and perform a suitable spherical harmonic decomposition. This allows us to map the abstract gauge-invariant perturbation variables to well-known quantities from weak gravitational lensing like convergence or cosmic shear. The resulting set of differential equations can effectively be solved by a Green's function approach leading to line-of-sight integrals sourced by the perturbation variables on the backward lightcone. The resulting spherical harmonic coefficients of the lensing observables are presented and the shear field is decomposed into its E- and B-modes. The results of this work are an essential tool for adding information from linear structure formation to the analysis of spherically symmetric dust models, with the purpose of testing the Copernican Principle with multiple cosmological probes.

  15. A generic multi-flex-body dynamics, controls simulation tool for space station

    NASA Technical Reports Server (NTRS)

    London, Ken W.; Lee, John F.; Singh, Ramen P.; Schubele, Buddy

    1991-01-01

    An order-(n) multi-flex-body Space Station simulation tool is introduced. The flex multibody modeling is generic enough to model all phases of Space Station from buildup through to the Assembly Complete configuration and beyond. Multibody subsystems such as the Mobile Servicing System (MSS) undergoing a prescribed translation and rotation are also allowed. The software includes aerodynamic, gravity gradient, and magnetic field models. User-defined controllers can be discrete or continuous. Extensive preprocessing of 'body by body' NASTRAN flex data is built in. A significant aspect, too, is the integrated controls design capability, which includes model reduction and analytic linearization.

  16. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
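
    As a generic illustration of the normalization and abundance-ratio-deduction steps such a pipeline performs (a common median-based scheme, not necessarily MilQuant's exact algorithm):

        import numpy as np

        # intensities: rows = peptide-spectrum matches, columns = isobaric reporter channels
        rng = np.random.default_rng(4)
        intensities = rng.lognormal(mean=10, sigma=1, size=(1000, 6))

        # Median normalization equalizes channel medians to remove loading differences
        norm = intensities / np.median(intensities, axis=0)

        # Relative abundance as the log2 ratio of each channel to reference channel 0
        log2_ratio = np.log2(norm / norm[:, [0]])
        channel_ratio = np.median(log2_ratio, axis=0)  # summarize over spectra per channel
        print(channel_ratio.round(2))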

  17. Gustaf: Detecting and correctly classifying SVs in the NGS twilight zone.

    PubMed

    Trappe, Kathrin; Emde, Anne-Katrin; Ehrlich, Hans-Christian; Reinert, Knut

    2014-12-15

    The landscape of structural variation (SV) including complex duplication and translocation patterns is far from resolved. SV detection tools usually exhibit low agreement, are often geared toward certain types or size ranges of variation and struggle to correctly classify the type and exact size of SVs. We present Gustaf (Generic mUlti-SpliT Alignment Finder), a sound generic multi-split SV detection tool that detects and classifies deletions, inversions, dispersed duplications and translocations of ≥ 30 bp. Our approach is based on a generic multi-split alignment strategy that can identify SV breakpoints with base pair resolution. We show that Gustaf correctly identifies SVs, especially in the range from 30 to 100 bp, which we call the next-generation sequencing (NGS) twilight zone of SVs, as well as larger SVs >500 bp. Gustaf performs better than similar tools in our benchmark and is furthermore able to correctly identify size and location of dispersed duplications and translocations, which otherwise might be wrongly classified, for example, as large deletions. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
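
    The classification logic of a multi-split approach can be sketched from the geometry of two split-alignment segments of one read: a same-chromosome, same-strand pair with a reference gap implies a deletion, a strand flip an inversion, and so on. A simplified sketch (the rules and threshold are illustrative, not Gustaf's exact implementation):

        from dataclasses import dataclass

        @dataclass
        class Segment:
            chrom: str
            ref_start: int  # reference coordinates of the aligned block
            ref_end: int
            strand: str     # '+' or '-'

        def classify_sv(a, b, min_size=30):
            """Rough SV call from two ordered split-alignment segments of one read."""
            if a.chrom != b.chrom:
                return "translocation"
            if a.strand != b.strand:
                return "inversion"
            gap = b.ref_start - a.ref_end
            if gap >= min_size:
                return "deletion"
            if gap <= -min_size:
                return "duplication"  # the read revisits earlier reference sequence
            return "no SV >= min_size"

        print(classify_sv(Segment("chr1", 100, 250, "+"),
                          Segment("chr1", 400, 550, "+")))  # deletion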

  18. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  19. [A functional analysis of healthcare auditors' skills in Venezuela, 2008].

    PubMed

    Chirinos-Muñoz, Mónica S

    2010-10-01

    The objective was to use functional analysis to identify the basic, working, specific and generic skills and values which a health service auditor must have. The functional analysis technique was implemented with 10 experts, identifying specific, basic and generic skills and values by means of deductive logic. A functional map was obtained which started by establishing a key purpose, based on improving healthcare and service quality, from which three key functions emerged. The main functions and skill units were then broken down into the competence elements defining what a health service auditor is able to do. This functional map (following functional analysis methodology) shows in detail the simple and complex tasks which a healthcare auditor should perform in the workplace, adopting a forward-looking management approach to improving healthcare and health service quality. The logical-deductive, consensus-based methodology provides expert-validated information for each element of the overall skill set.

  20. The Need and Keys for a New Generation Network Adjustment Software

    NASA Astrophysics Data System (ADS)

    Colomina, I.; Blázquez, M.; Navarro, J. A.; Sastre, J.

    2012-07-01

    Orientation and calibration of photogrammetric and remote sensing instruments is a fundamental capacity of current mapping systems and a fundamental research topic. Neither digital remote sensing acquisition systems nor direct orientation gear, like INS and GNSS technologies, made block adjustment obsolete. On the contrary, the continuous flow of new primary data acquisition systems has challenged the capacity of the legacy block adjustment systems - in general, network adjustment systems - in many aspects: extensibility, genericity, portability, large data sets capacity, metadata support and many others. In this article, we concentrate on the extensibility and genericity challenges that current and future network systems shall face. For this purpose we propose a number of software design strategies, with emphasis on rigorous abstract modeling, that help in achieving simplicity, genericity and extensibility together with the protection of intellectual property rights in a flexible manner. We illustrate our suggestions with the general design approach of GENA, the generic extensible network adjustment system of GeoNumerics.

  1. Technology to Enhance Mathematics and Science Instruction: Changes in Teacher Perceptions after Participating in a Yearlong Professional Development Program

    ERIC Educational Resources Information Center

    Kersaint, Gladis; Ritzhaupt, Albert D.; Liu, Feng

    2014-01-01

    The purpose of this study is to examine the extent to which teachers of mathematics or science who were engaged in a year-long initiative to help them integrate technological tools were (a) familiar with generic and mathematics- or science-specific technology, (b) comfortable integrating generic and content-specific technology, (c) believe that…

  2. Predicting the Ability of Marine Mammal Populations to Compensate for Behavioral Disturbances

    DTIC Science & Technology

    2014-09-30

    ...determine the ability of marine mammal populations to respond to behavioral disturbances. These tools are to be generic and applicable in a wide range...scale consequences. OBJECTIVES • Develop simple, generic measures that allow the estimation of marine mammal populations and individuals to

  3. Automated analysis in generic groups

    NASA Astrophysics Data System (ADS)

    Fagerholm, Edvard

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings --- symmetric or asymmetric (leveled) k-linear groups --- and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems, identifying different classes of assumptions and obtaining decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption, and outputs either a proof of its generic hardness or an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies the optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of the minimal number of pairing computations needed to verify structure-preserving signatures. We build an automated tool to search for structure-preserving signatures matching a template. Through exhaustive search we conjecture lower bounds for the number of pairings required in the Type II setting and prove our conjecture to be true. Finally, our tool exhibits examples of structure-preserving signatures matching the lower bounds, which proves the tightness of our bounds, as well as improving on previously known structure-preserving signature schemes.
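
    For context, a pairing-product equation, the verification primitive counted above, has the general form (standard notation, not necessarily the thesis's exact formulation):

        \prod_{i=1}^{m} \prod_{j=1}^{n} e(X_i, Y_j)^{a_{ij}} = T,
        \qquad X_i \in \mathbb{G}_1,\; Y_j \in \mathbb{G}_2,\; a_{ij} \in \mathbb{Z}_p,\; T \in \mathbb{G}_T.

    Verifying a structure-preserving signature amounts to checking a small number of such equations, and each distinct pairing e(X_i, Y_j) costs one pairing evaluation, which is why the pairing count, rather than the equation count alone, governs verification time.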

  4. Progress in Multi-Disciplinary Data Life Cycle Management

    NASA Astrophysics Data System (ADS)

    Jung, C.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.

    2015-12-01

    Modern science is most often driven by data. Improvements in state-of-the-art technologies and methods in many scientific disciplines lead not only to increasing data rates, but also to the need to improve or even completely overhaul their data life cycle management. Communities usually face two kinds of challenges: generic ones like federated authorization and authentication infrastructures and data preservation, and ones that are specific to their community and their respective data life cycle. In practice, the specific requirements often hinder the use of generic tools and methods. The German Helmholtz Association project "Large-Scale Data Management and Analysis" (LSDMA) addresses both challenges: its five Data Life Cycle Labs (DLCLs) closely collaborate with communities in joint research and development to optimize the communities' data life cycle management, while its Data Services Integration Team (DSIT) provides generic data tools and services. We present the most recent developments and results from the DLCLs, covering communities ranging from heavy ion physics and photon science to high-throughput microscopy, and from DSIT.

  5. Heterogeneous Sensor Data Exploration and Sustainable Declarative Monitoring Architecture: Application to Smart Building

    NASA Astrophysics Data System (ADS)

    Servigne, S.; Gripay, Y.; Pinarer, O.; Samuel, J.; Ozgovde, A.; Jay, J.

    2016-09-01

    Concerning energy consumption and monitoring architectures, our first goal is to develop a sustainable declarative monitoring architecture for lower energy consumption, taking into account the consumption of the monitoring system itself. Our second goal is to develop theoretical and practical tools to model, explore and exploit heterogeneous data from various sources in order to understand a phenomenon like the energy consumption of a smart building versus inhabitants' social behaviours. We focus on a generic model for data acquisition campaigns based on the concept of a generic sensor. The concept of a generic sensor is centered on acquired data and on their inherent multi-dimensional structure, to support complex domain-specific or field-oriented analysis processes. We consider that a methodological breakthrough may pave the way to deep understanding of voluminous and heterogeneous scientific data sets. Our use case concerns the energy efficiency of buildings, to understand the relationship between physical phenomena and user behaviors. The aim of this paper is to present our methodology and results concerning the architecture and user-centric tools.

  6. Generic Airspace Concepts and Research

    NASA Technical Reports Server (NTRS)

    Mogford, Richard H.

    2010-01-01

    The purpose of this study was to evaluate methods for reducing the training and memorization required to manage air traffic in mid-term, Next Generation Air Transportation System (NextGen) airspace. We contrasted the performance of controllers using a sector information display and NextGen automation tools while working with familiar and unfamiliar sectors. The airspace included five sectors from Oakland and Salt Lake City Centers configured as a "generic center" called "West High Center." The Controller Information Tool was used to present essential information for managing these sectors. The Multi Aircraft Control System air traffic control simulator provided data link and conflict detection and resolution. There were five experienced air traffic controller participants. Each was familiar with one or two of the five sectors, but not the others. The participants rotated through all five sectors during the ten data collection runs. The results addressing workload, traffic management, and safety, as well as controller and observer comments, supported the generic sector concept. The unfamiliar sectors were comparable to the familiar sectors on all relevant measures.

  7. Using generic tool kits to build intelligent systems

    NASA Technical Reports Server (NTRS)

    Miller, David J.

    1994-01-01

    The Intelligent Systems and Robots Center at Sandia National Laboratories is developing technologies for the automation of processes associated with environmental remediation and information-driven manufacturing. These technologies, which focus on automated planning and programming and sensor-based and model-based control, are used to build intelligent systems which are able to generate plans of action, program the necessary devices, and use sensors to react to changes in the environment. By automating tasks through the use of programmable devices tied to computer models which are augmented by sensing, requirements for faster, safer, and cheaper systems are being satisfied. However, because of the need for rapid, cost-effective prototyping and multi-laboratory teaming, it is also necessary to define a consistent approach to the construction of controllers for such systems. As a result, the Generic Intelligent System Controller (GISC) concept has been developed. This concept promotes the philosophy of producing generic tool kits which can be used and reused to build intelligent control systems.

  8. Planetary mapping—The datamodel's perspective and GIS framework

    NASA Astrophysics Data System (ADS)

    van Gasselt, S.; Nass, A.

    2011-09-01

    Demands for a broad range of integrated geospatial data-analysis tools and methods for planetary data organization have been growing considerably since the late 1990s, when a plethora of missions equipped with new instruments entered planetary orbits or landed on the surface. They sent back terabytes of new data which soon became accessible for the scientific community and public and which needed to be organized. On the terrestrial side, issues of data access, organization and utilization for scientific and economic analyses are handled by using a range of well-established geographic information systems (GIS) that also found their way into the field of planetary sciences in the late 1990s. Here we address key issues concerning the field of planetary mapping by making use of established GIS environments and discuss methods of addressing data organization and mapping requirements by using an easily integrable datamodel that is, for the time being, designed as a file-geodatabase (FileGDB) environment in ESRI's ArcGIS. A major design-driving requirement for this datamodel is its extensibility and scalability for growing scientific as well as technical needs, e.g., the utilization of such a datamodel for surface mapping of different planetary objects as defined by their respective reference system and by using different instrument data. Furthermore, it is a major goal to construct a generic model which allows performing combined geologic as well as geomorphologic mapping tasks, making use of international standards, without loss of information and while maintaining topologic integrity. An integration of such a datamodel within a geospatial DBMS context can practically be performed by individuals as well as groups without having to deal with the details of administrative tasks and data ingestion issues. Besides the actual mapping, key components of such a mapping datamodel deal with the organization of and search for image-sensor data and previous mapping efforts, as well as the proper organization of cartographic representations and assignments of geologic/geomorphologic units within their stratigraphic context.

  9. The Generic Mapping Tools 6: Classic versus Modern Mode

    NASA Astrophysics Data System (ADS)

    Wessel, P.; Uieda, L.; Luis, J. M. F.; Scharroo, R.; Smith, W. H. F.; Wobbe, F.

    2017-12-01

    The Generic Mapping Tools (GMT; gmt.soest.hawaii.edu) is a 25-year old, mature open-source software package for the analysis and display of geoscience data (e.g., interpolate, filter, manipulate, project and plot temporal and spatial data). The GMT "toolbox" includes about 80 core and 40 supplemental modules sharing a common set of command options, file structures, and documentation. GMT5, when released in 2013, introduced an application programming interface (API) to allow programmatic access to GMT from other computing environments. Since then, we have released a GMT/MATLAB toolbox, an experimental GMT/Julia package, and will soon introduce a GMT/Python module. In developing these extensions, we wanted to simplify the GMT learning curve but quickly realized the main stumbling blocks to GMT command-line mastery would be ported to the external environments unless we introduced major changes. With thousands of GMT scripts already in use by scientists around the world, we were acutely aware of the need for backwards compatibility. Our solution, to be released as GMT 6, was to add a modern run mode that complements the classic mode offered so far. Modern mode completely eliminates the top three obstacles for new (and not so new) GMT users: (1) The responsibility to properly stack PostScript layers manually (i.e., the -O -K dance), (2) the responsibility of handling output redirection of PostScript (create versus append), and (3) the need to provide commands with repeated information about regions (-R) and projections (-J). Thus, modern mode results in shorter, simpler scripts with fewer pitfalls, without interfering with classic scripts. Our implementation adds five new commands that begin and end a modern session, simplify figure management, automate the conversion of PostScript to more suitable formats, automate region detection, and offer a new automated subplot environment for multi-panel illustrations. Here, we highlight the GMT modern mode and the simplifications it offers, both for command-line use and in external environments. GMT 6 is in beta mode but accessible from our repository. Numerous improvements have been added in addition to modern mode; we expect a formal release in early 2018. Publication partially supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz.
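
    The bookkeeping that modern mode removes is easiest to see from an external environment. Below is a minimal sketch using PyGMT, the GMT/Python module mentioned above, assuming a recent PyGMT release: the figure object carries region, projection, and layer state between calls, so nothing like -O/-K or repeated -R/-J appears:

        import pygmt

        fig = pygmt.Figure()
        # Region and projection are stated once; later layers inherit them
        fig.coast(region=[-161, -154, 18, 23], projection="M12c",
                  land="lightgray", water="azure", frame="af")
        fig.plot(x=[-158.0, -155.5], y=[21.3, 19.6],
                 style="c0.3c", fill="red", pen="black")
        fig.savefig("hawaii_map.pdf")  # PostScript conversion is handled automatically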

  10. Picture Pile: A citizen-powered tool for rapid post-disaster damage assessments

    NASA Astrophysics Data System (ADS)

    Danylo, Olha; Sturn, Tobias; Giovando, Cristiano; Moorthy, Inian; Fritz, Steffen; See, Linda; Kapur, Ravi; Girardot, Blake; Ajmar, Andrea; Giulio Tonolo, Fabio; Reinicke, Tobias; Mathieu, Pierre Philippe; Duerauer, Martina

    2017-04-01

    According to the World Bank's global risk analysis, around 34% of the total world's population lives in areas of high mortality risk from two or more natural hazards. Therefore, timely and innovative methods to rapidly assess damage and subsequently aid relief and recovery efforts are critical. In this field of post-disaster damage assessment, several crowdsourcing-based technological tools that engage citizens in carrying out various tasks, including data collection, satellite image analysis and online interactive mapping, have recently been developed. One such tool is Picture Pile, a cross-platform application that is designed as a generic and flexible tool for ingesting satellite imagery for rapid classification. As part of the ESA's Crowd4Sat initiative led by Imperative Space, this study develops a workflow for employing Picture Pile for rapid post-disaster damage assessment. We outline how satellite image interpretation tasks within Picture Pile can be crowdsourced using the example of Hurricane Matthew, which affected large regions of Haiti in September 2016. The application provides simple microtasks, where the user is presented with satellite images and is asked a simple yes/no question. A "before" disaster satellite image is displayed next to an "after" disaster image and the user is asked to assess whether there is any visible, detectable damage. The question is formulated precisely to focus the user's attention on a particular aspect of the damage. The user interface of Picture Pile is also built for users to rapidly classify the images by swiping to indicate their answer, thereby efficiently completing each microtask. The proposed approach will not only help to increase citizen awareness of natural disasters, but also provide citizens with a unique opportunity to contribute directly to relief efforts. Furthermore, to gain confidence in the crowdsourced results, quality assurance methods were integrated during the testing phase of the application using image classifications from experts. The application has a built-in real-time quality assurance system to provide volunteers with feedback when their answer does not agree with that of an expert. Picture Pile is intended to supplement existing approaches for post-disaster damage assessment and can be used by different networks of volunteers (e.g., the Humanitarian OpenStreetMap Team) to assess damage and create up-to-date maps in response to disaster events.

  11. Generic Skills. Trade Families. Based on Data on the Use of 588 Tool Skills from 1600 Workers and Supervisors in 131 Occupations.

    ERIC Educational Resources Information Center

    Smith, Arthur De W.

    The Generic Skills studies were designed to provide training specifications that will enable graduates of trades training programs to compete for job placement in a range of occupations rather than in a single occupation. The studies identified a number of trade families, classified on the basis of skills used in work performance, and also…

  12. The Banana Genome Hub

    PubMed Central

    Droc, Gaëtan; Larivière, Delphine; Guignon, Valentin; Yahiaoui, Nabila; This, Dominique; Garsmeur, Olivier; Dereeper, Alexis; Hamelin, Chantal; Argout, Xavier; Dufayard, Jean-François; Lengelle, Juliette; Baurens, Franc-Christophe; Cenci, Alberto; Pitollat, Bertrand; D’Hont, Angélique; Ruiz, Manuel; Rouard, Mathieu; Bocs, Stéphanie

    2013-01-01

    Banana is one of the world's favorite fruits and one of the most important crops for developing countries. The banana reference genome sequence (Musa acuminata) was recently released. Given the taxonomic position of Musa, the completed genomic sequence has particular comparative value to provide fresh insights about the evolution of the monocotyledons. The study of the banana genome has been enhanced by a number of tools and resources that allow harnessing its sequence. First, we set up essential tools such as a Community Annotation System, phylogenomics resources and metabolic pathways. Then, to support post-genomic efforts, we improved existing banana systems (e.g. web front end, query builder), we integrated available Musa data into generic systems (e.g. markers and genetic maps, synteny blocks), we made other existing systems containing Musa data (e.g. transcriptomics, the rice reference genome, a workflow manager) interoperable with the banana hub and, finally, we generated new results from sequence analyses (e.g. SNP and polymorphism analysis). Several use cases illustrate how the Banana Genome Hub can be used to study gene families. Overall, with this collaborative effort, we discuss the importance of interoperability toward data integration between existing information systems. Database URL: http://banana-genome.cirad.fr/ PMID:23707967

  13. Building energy simulation in real time through an open standard interface

    DOE PAGES

    Pang, Xiufeng; Nouidui, Thierry S.; Wetter, Michael; ...

    2015-10-20

    Building energy models (BEMs) are typically used for design and code compliance for new buildings and in the renovation of existing buildings to predict energy use. The increasing adoption of BEM as standard practice in the building industry presents an opportunity to extend the use of BEMs into construction, commissioning and operation. In 2009, the authors developed a real-time simulation framework to execute an EnergyPlus model in real time to improve building operation. This paper reports an enhancement of that real-time energy simulation framework. The previous version only works with software tools that implement the custom co-simulation interface of the Building Controls Virtual Test Bed (BCVTB), such as EnergyPlus, Dymola and TRNSYS. The new version uses an open standard interface, the Functional Mockup Interface (FMI), to provide a generic interface to any application that supports the FMI protocol. In addition, the new version utilizes the Simple Measurement and Actuation Profile (sMAP) tool as the data acquisition system to acquire, store and present data. Lastly, this paper introduces the updated architecture of the real-time simulation framework using FMI and presents proof-of-concept demonstration results which validate the new framework.
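
    The FMI side of such a framework can be exercised with the open-source FMPy library (our choice for illustration; the paper's framework is its own implementation, not FMPy). The FMU filename and output variable below are placeholders.

        # Sketch: running a building FMU over one simulated day with FMPy.
        # "building.fmu" and the output variable name are hypothetical.
        from fmpy import simulate_fmu

        result = simulate_fmu(
            "building.fmu",                   # any FMU exported for co-simulation
            start_time=0.0,
            stop_time=24 * 3600.0,            # one day, in seconds
            output=["zone_air_temperature"],  # hypothetical model output
        )
        for row in result[:5]:                # structured array of (time, value)
            print(row)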

  14. CCSDS Advanced Orbiting Systems Virtual Channel Access Service for QoS MACHETE Model

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John S.

    2011-01-01

    To support various communications requirements imposed by different missions, interplanetary communication protocols need to be designed, validated, and evaluated carefully. Multimission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE), described in "Simulator of Space Communication Networks" (NPO-41373), NASA Tech Briefs, Vol. 29, No. 8 (August 2005), p. 44, combines various tools for simulation and performance analysis of space networks. The MACHETE environment supports orbital analysis, link budget analysis, communications network simulations, and hardware-in-the-loop testing. By building abstract behavioral models of network protocols, one can validate performance after identifying the appropriate metrics of interest. The innovators have extended the MACHETE model library to include a generic link-layer Virtual Channel (VC) model supporting quality-of-service (QoS) controls based on IP streams. The main purpose of this generic Virtual Channel model addition was to interface fine-grain, flow-based QoS between the network and MAC layers of the QualNet simulator, a commercial component of MACHETE. This software model adds the capability of mapping IP streams, based on header fields, to virtual channel numbers, allowing extended QoS handling at the link layer. This feature further refines the QoS already provided at the network layer. QoS at the network layer (e.g. diffserv) supports only a few QoS classes, so data from one class are aggregated together; differentiating between flows internal to a class/priority is not supported. By adding QoS classification capability between the network and MAC layers through VCs, one maps multiple VCs onto the same physical link. Users then specify different VC weights, and different queuing and scheduling policies, at the link layer. This VC model supports system performance analysis of various virtual channel link-layer QoS queuing schemes independent of the network-layer QoS systems.
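
    The essential mechanism, classifying packets onto virtual channels by header fields and then serving the channels by weight, can be sketched in a few lines. The field names, mapping rules and weights below are invented for illustration and are not MACHETE or QualNet code.

        # Sketch: header-field -> VC classification plus weighted VC selection.
        import random

        VC_TABLE = {("orbiter", "EF"): 0, ("orbiter", "BE"): 1, ("lander", "BE"): 2}
        VC_WEIGHTS = {0: 5, 1: 2, 2: 1}   # relative link-layer scheduling weights

        def classify(packet):
            """Map a packet to a VC number, defaulting to a best-effort VC."""
            return VC_TABLE.get((packet["dst"], packet["dscp"]), 1)

        def next_vc(backlogged_vcs):
            """Weighted random pick among backlogged VCs (approximates WFQ)."""
            weights = [VC_WEIGHTS[vc] for vc in backlogged_vcs]
            return random.choices(backlogged_vcs, weights=weights, k=1)[0]

        print(classify({"dst": "orbiter", "dscp": "EF"}), next_vc([0, 1, 2]))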

  15. Mapping the Paediatric Quality of Life Inventory (PedsQL™) Generic Core Scales onto the Child Health Utility Index-9 Dimension (CHU-9D) Score for Economic Evaluation in Children.

    PubMed

    Lambe, Tosin; Frew, Emma; Ives, Natalie J; Woolley, Rebecca L; Cummins, Carole; Brettell, Elizabeth A; Barsoum, Emma N; Webb, Nicholas J A

    2018-04-01

    The Paediatric Quality of Life Inventory (PedsQL™) questionnaire is a widely used, generic instrument designed for measuring health-related quality of life (HRQoL); however, it is not preference-based and therefore not suitable for cost-utility analysis. The Child Health Utility Index-9 Dimension (CHU-9D), however, is a preference-based instrument that has been primarily developed to support cost-utility analysis. This paper presents a method for estimating CHU-9D index scores from responses to the PedsQL™ using data from a randomised controlled trial of prednisolone therapy for treatment of childhood corticosteroid-sensitive nephrotic syndrome. HRQoL data were collected from children at randomisation, week 16, and months 12, 18, 24, 36 and 48. Observations on children aged 5 years and older were pooled across all data collection timepoints and were then randomised into an estimation (n = 279) and a validation (n = 284) sample. A number of models were developed using the estimation data before internal validation. The best model was chosen using multi-stage selection criteria. Most of the models developed accurately predicted the CHU-9D mean index score. The best performing model was a generalised linear model (mean absolute error = 0.0408; mean square error = 0.0035). The proportion of index scores deviating from the observed scores by < 0.03 was 53%. The mapping algorithm provides an empirical tool for estimating CHU-9D index scores and for conducting cost-utility analyses within clinical studies that have only collected PedsQL™ data. It is valid for children aged 5 years or older. Caution should be exercised when using this with children younger than 5 years, older adolescents (> 13 years) or patient groups with particularly poor quality of life.
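
    Structurally, such a mapping is a regression of CHU-9D utilities on PedsQL™ scores. A minimal sketch with statsmodels, on simulated data; the coefficients below bear no relation to the published algorithm.

        # Sketch: GLM mapping a PedsQL total score to CHU-9D utility.
        # Data are simulated; this is not the published mapping algorithm.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        pedsql = rng.uniform(30, 100, size=300)    # total score, 0-100
        chu9d = np.clip(0.3 + 0.006 * pedsql + rng.normal(0, 0.04, 300), 0, 1)

        X = sm.add_constant(pedsql)
        model = sm.GLM(chu9d, X, family=sm.families.Gaussian()).fit()
        pred = model.predict(X)

        print("MAE:", np.abs(chu9d - pred).mean())   # cf. the paper's 0.0408
        print("MSE:", ((chu9d - pred) ** 2).mean())  # cf. the paper's 0.0035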

  16. Generic synopsis of the jumping plant-lice (Hemiptera: Sternorrhyncha: Psylloidea) from Colombia.

    PubMed

    Rendón-Mera, Diana Isabel; Serna, Francisco; Burckhardt, Daniel

    2017-11-20

    Jumping plant-lice (Hemiptera: Sternorrhyncha: Psylloidea) are a group of phloem-feeding insects with nearly 4000 described species. Previous records from Colombia comprise 19 genera of all eight known families. The revision of material deposited in six Colombian and three foreign museums yielded another nine genera that constitute new country records. Material from 16 departments was examined. Each genus is diagnosed and information is provided on biology, damage and host-plants. Local distribution maps and a generic key for the identification of adults are provided.

  17. Pharmacists' experiences and attitudes regarding generic drugs and generic substitution: two sides of the coin.

    PubMed

    Olsson, Erika; Kälvemark Sporrong, Sofia

    2012-12-01

    Generic drug substitution reduces costs for medicines, but the downsides include unintentional double medication, confusion and anxiety among patients. Information from pharmacists affects patients' experiences of substitution with generic drugs. The aim of this study was to explore experiences of and attitudes to generic substitution among Swedish community pharmacists. An interview guide was developed. Semi-structured interviews with community pharmacists were conducted and transcribed verbatim. Analysis was inductive; extracts from the transcripts were compared and combined to form themes and subcategories. Pharmacists from a heterogeneous convenience sample of pharmacies were interviewed until data saturation had been achieved. Sixteen pharmacists were interviewed. Three main themes and twelve subcategories were identified, the main themes being the role of the pharmacist, pharmacists' concerns regarding patients, and the generic drug. Pharmacists found it positive that generic substitution decreases the costs of pharmaceuticals but also emphasized that the switch can confuse and worry patients, which could result in less benefit from treatment. Respondents claimed that generic substitution has changed the focus of the pharmacist-patient meeting towards economics and regulations. According to the interviewed pharmacists, generic substitution is not primarily an issue of generic versus brand-name products, but concerns above all the challenges that the switch implies for patients and pharmacists. To prevent the known confusion and concern among patients, it is important that community pharmacists acquire the necessary tools and knowledge to manage this situation; pharmacists themselves, as well as pharmacy owners and authorities, share responsibility for this.

  18. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data-flow paradigm.

  19. WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks

    ERIC Educational Resources Information Center

    Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.

    2013-01-01

    A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…

  20. European solvent industry group generic exposure scenario risk and exposure tool

    PubMed Central

    Zaleski, Rosemary T; Qian, Hua; Zelenka, Michael P; George-Ares, Anita; Money, Chris

    2014-01-01

    The European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) was developed to facilitate the safety evaluation of consumer uses of solvents, as required by the European Union Registration, Evaluation and Authorization of Chemicals (REACH) Regulation. This exposure-based risk assessment tool provides estimates of both exposure and risk characterization ratios for consumer uses. It builds upon the consumer portion of the European Center for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) tool by implementing refinements described in ECETOC TR107. Technical enhancements included the use of additional data to refine scenario defaults and the ability to include additional parameters in exposure calculations. Scenarios were also added to cover all frequently encountered consumer uses of solvents. The TRA tool structure was modified to automatically determine conditions necessary for safe use. EGRET reports results using specific standard phrases in a format consistent with REACH exposure scenario guidance, in order that the outputs can be readily assimilated within safety data sheets and similar information technology systems. Evaluation of tool predictions for a range of commonly encountered consumer uses of solvents found it provides reasonable yet still conservative exposure estimates. PMID:23361440
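
    At its core, a tool of this kind compares an exposure estimate against a safe-use threshold to form a risk characterisation ratio (RCR). A generic REACH-style sketch follows; every number is a placeholder, not an EGRET default.

        # Sketch: crude screening-level consumer inhalation exposure and RCR.
        # All values are invented placeholders, not EGRET parameters.
        def inhalation_exposure(amount_g, weight_fraction, room_m3):
            """Worst case: the whole solvent fraction evaporates into room air."""
            return amount_g * 1000.0 * weight_fraction / room_m3  # mg/m3

        exposure = inhalation_exposure(amount_g=50, weight_fraction=0.2, room_m3=20)
        dnel = 150.0                  # hypothetical Derived No-Effect Level, mg/m3
        rcr = exposure / dnel
        print(f"exposure = {exposure:.0f} mg/m3, RCR = {rcr:.2f}",
              "-> safe use demonstrated" if rcr < 1 else "-> refine the assessment")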

  2. The differences between the branded and generic medicines using solid dosage forms: In-vitro dissolution testing

    PubMed Central

    Al Ameri, Mubarak Nasser; Nayuni, Nanda; Anil Kumar, K.G.; Perrett, David; Tucker, Arthur; Johnston, Atholl

    2011-01-01

    Introduction Dissolution is the amount of substance that goes into solution per unit time under standardised conditions of liquid/solid interface, solvent composition and temperature. Dissolution is one of the most important tools to predict in-vivo bioavailability and, in some cases, to determine bioequivalence and assure interchangeability. Aim To compare the differences in dissolution behaviour of solid dosage forms between innovators (reference products) and their generic counterparts (tested products). Methods Four replicates for each batch of the 37 tested medicines were carried out using a PT-DT70 dissolution tester from Pharma Test. A total of 13 branded medicines and 24 generic counterparts were obtained locally and internationally to detect any differences in their dissolution behaviour. They were tested according to the British Pharmacopoeia, European Pharmacopoeia and the US Pharmacopeia, with the rate of dissolution determined by ultraviolet spectrophotometry. Results Most tested medicines complied with the pharmacopoeial specifications and achieved 85% dissolution in 60 min. However, some generic medicines showed significant differences in dissolution rate at 60 and 120 min. Many generic medicines showed a slower dissolution rate than their branded counterparts, such as the generic forms of omeprazole 20 mg. Some showed an incomplete dissolution, such as the generic form of nifedipine 10 mg. Other generics showed a faster dissolution rate than their branded counterpart, such as the generic forms of meloxicam 15 mg. Moreover, some generics from different batches of the same manufacturer showed significant differences in their dissolution rate, such as the generic forms of meloxicam 7.5 mg. Nevertheless, some generic medicines violated the EMA and the FDA guidelines for industry when they failed to achieve 85% dissolution at 60 min, such as the generic form of diclofenac sodium 50 mg. Conclusion Most medicines in this study complied with the pharmacopoeial limits. However, some generics dissolved differently than their branded counterparts. This clearly calls into question the interchangeability between a branded medicine and its generic counterpart, or even among generics. PMID:25755988
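
    A standard way to quantify how far two dissolution profiles diverge is the FDA/EMA similarity factor f2 (not computed in the abstract itself, but the usual metric for such comparisons). A sketch with invented profiles:

        # Sketch: similarity factor f2 between reference and test dissolution
        # profiles; f2 >= 50 is conventionally read as "similar". Data invented.
        import numpy as np

        def f2(reference, test):
            r, t = np.asarray(reference, float), np.asarray(test, float)
            msd = np.mean((r - t) ** 2)   # mean squared difference, % points
            return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

        branded = [35, 60, 82, 91, 96]    # % dissolved at 10/20/30/45/60 min
        generic = [28, 51, 72, 85, 93]
        print(f"f2 = {f2(branded, generic):.1f}")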

  3. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

    We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services supporting multiple data-intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000-CPU-core research cloud. The development, maintenance and sustainability of VLs are best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or as scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types, whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open-source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into a generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.

  4. An Automated Tool for Supporting FMEAs of Digital Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.

    2008-09-07

    Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was proposed for this purpose: decomposing the system to the level of the generic digital components and propagating failure modes to the system level, which generally is time-consuming and difficult to implement. To overcome the associated implementation issues of the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is essentially a simulation platform developed by reusing or recreating the original source code of the different software modules, interfaced through input and output variables that represent physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on associated signals are determined first, and the variables that correspond to these signals are then modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
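
    The simulation-driven FMEA idea, perturbing the variables that stand in for physical signals and then testing a system-level failure criterion, can be sketched generically. The toy controller, process model and criterion below are invented, not DFWCS code.

        # Sketch: inject a failure mode into a control loop and test a
        # system-level failure criterion. Controller and plant are toys.
        def controller(level, setpoint=0.5, gain=2.0):
            return gain * (setpoint - level)          # feedwater demand

        def run(fail_mode=None, steps=200, dt=0.1):
            level = 0.5
            for _ in range(steps):
                demand = controller(level)
                if fail_mode == "sensor_stuck_high":  # sensor frozen at maximum
                    demand = controller(1.0)
                level += dt * (demand - 0.1 * level)  # crude process response
                if not (0.0 < level < 1.0):           # loss of automatic control
                    return "system failure"
            return "controlled"

        print(run())                      # -> controlled
        print(run("sensor_stuck_high"))   # -> system failure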

  5. The utility of clinical decision tools for diagnosing osteoporosis in postmenopausal women with rheumatoid arthritis

    PubMed Central

    Brand, Caroline; Lowe, Adrian; Hall, Stephen

    2008-01-01

    Background Patients with rheumatoid arthritis have a higher risk of low bone mineral density than normal age-matched populations. There is limited evidence to support the cost effectiveness of population screening in rheumatoid arthritis, and case-finding strategies have been proposed as a means to increase the cost effectiveness of diagnostic screening for osteoporosis. This study aimed to assess the performance attributes of generic and rheumatoid arthritis-specific clinical decision tools for diagnosing osteoporosis in a postmenopausal population with rheumatoid arthritis attending ambulatory specialist rheumatology clinics. Methods A cross-sectional study of 127 ambulatory post-menopausal women with rheumatoid arthritis was performed. Patients currently receiving or who had previously received bone-active therapy were excluded. Eligible women underwent clinical assessment and dual-energy X-ray absorptiometry (DXA) bone mineral density assessment. Clinical decision tools, including those specific for rheumatoid arthritis, were compared with seven generic post-menopausal tools to predict osteoporosis (defined as T score < -2.5). Sensitivity, specificity, positive and negative predictive values, and area under the receiver operating characteristic (ROC) curve were assessed. The diagnostic attributes of the clinical decision tools were compared by examination of the area under the ROC curve. Results One hundred and twenty-seven women participated. The median age was 62 (IQR 56–71) years. Median disease duration was 108 (60–168) months. Seventy-two (57%) women had no record of a previous DXA examination. Eighty (63%) women had T scores at the femoral neck or lumbar spine less than -1. The area under the ROC curve for clinical decision tool prediction of T score < -2.5 varied between 0.63 and 0.76. The rheumatoid arthritis-specific decision tools did not perform better than generic tools; however, the National Osteoporosis Foundation score could potentially reduce the number of unnecessary DXA tests by approximately 45% in this population. Conclusion There was limited utility of clinical decision tools for predicting osteoporosis in this patient population. Fracture prediction tools that include risk factors independent of BMD are needed. PMID:18230132

  6. Tree Cover Mapping Tool—Documentation and user manual

    USGS Publications Warehouse

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2016-06-02

    The Tree Cover Mapping (TCM) tool was developed by scientists at the U.S. Geological Survey Earth Resources Observation and Science Center to allow a user to quickly map tree cover density over large areas using visual interpretation of high resolution imagery within a geographic information system interface. The TCM tool uses a systematic sample grid to produce maps of tree cover. The TCM tool allows the user to define sampling parameters to estimate tree cover within each sample unit. This mapping method generated the first on-farm tree cover maps of vast regions of Niger and Burkina Faso. The approach contributes to implementing integrated landscape management to scale up re-greening and restore degraded land in the drylands of Africa. The TCM tool is easy to operate, practical, and can be adapted to many other applications such as crop mapping, settlements mapping, or other features. This user manual provides step-by-step instructions for installing and using the tool, and creating tree cover maps. Familiarity with ArcMap tools and concepts is helpful for using the tool.

  7. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    NASA Astrophysics Data System (ADS)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
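
    An event tree chains conditional probabilities along each branch; multiplying down a branch and weighting by each person's exposure gives the individual risk described above. A toy numeric sketch (every probability is invented, none is an Arenal estimate):

        # Sketch: individual annual risk along one event-tree branch.
        # All probabilities are illustrative placeholders.
        p_collapse = 0.05              # P(partial crater-wall collapse in a year)
        p_flow_given_collapse = 0.4    # P(ATPF generated | collapse)
        p_reach_zone = 0.25            # P(flow reaches the zone | ATPF)
        p_death_given_hit = 0.5        # P(fatality | present when flow arrives)
        occupancy = {"resident": 0.9, "worker": 0.3, "tourist": 0.02}

        for person, p_present in occupancy.items():
            risk = (p_collapse * p_flow_given_collapse * p_reach_zone
                    * p_present * p_death_given_hit)
            print(f"{person:8s} annual risk ~ {risk:.1e}")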

  8. Tools for Energized Teaching: Revitalize Instruction with Ease

    ERIC Educational Resources Information Center

    Wilson, Kenneth

    2006-01-01

    "Challenge yourself to break out of your old routines. Think anew." Ken Wilson, educator, trainer and consultant has assembled a versatile, practical and generic book to use across disciplines and with all age levels. This collection of accessible, user-friendly tools incorporates and connects current education research--without the jargon. "This…

  9. Public Domain Generic Tools: An Overview.

    ERIC Educational Resources Information Center

    Erjavec, Tomaz

    This paper presents an introduction to language engineering software, especially for computerized language and text corpora. The focus of the paper is on small and relatively independent pieces of software designed for specific, often low-level language analysis tasks, and on tools in the public domain. Discussion begins with the application of…

  10. Development and Command-Control Tools for Many-Robot Systems

    DTIC Science & Technology

    2005-01-01

    been components such as pressure sensors and accelerometers for the automobile market. In fact, robots of any size have yet to appear in our daily... mode, so that the target hardware is neither reprogrammable nor rechargeable. The goal of this paper is to propose some generic tools that the...

  11. The Protein Identifier Cross-Referencing (PICR) service: reconciling protein identifiers across multiple source databases.

    PubMed

    Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning

    2007-10-18

    Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr.

  12. Paving the way for the use of the SDQ in economic evaluations of school-based population health interventions: an empirical analysis of the external validity of SDQ mapping algorithms to the CHU9D in an educational setting.

    PubMed

    Boyer, Nicole R S; Miller, Sarah; Connolly, Paul; McIntosh, Emma

    2016-04-01

    The Strengths and Difficulties Questionnaire (SDQ) is a behavioural screening tool for children. The SDQ is increasingly used as the primary outcome measure in population health interventions involving children, but it is not preference based; therefore, its role in allocative economic evaluation is limited. The Child Health Utility 9D (CHU9D) is a generic preference-based health-related quality-of-life measure. This study investigates the applicability of the SDQ outcome measure for use in economic evaluations and examines its relationship with the CHU9D by testing previously published mapping algorithms. The aim of the paper is to explore the feasibility of using the SDQ within economic evaluations of school-based population health interventions. Data were available from children participating in a cluster randomised controlled trial of the school-based Roots of Empathy programme in Northern Ireland. Utility was calculated using the original and alternative CHU9D tariffs along with two SDQ mapping algorithms. t tests were performed for pairwise differences in utility values from the preference-based tariffs and mapping algorithms. Mean (standard deviation) SDQ total difficulties and prosocial scores were 12 (3.2) and 8.3 (2.1). Utility values obtained from the original tariff, the alternative tariff, and the mapping algorithms using five and three SDQ subscales were 0.84 (0.11), 0.80 (0.13), 0.84 (0.05), and 0.83 (0.04), respectively. Each method for calculating utility produced statistically significantly different values, except the original tariff and the five-subscale algorithm. Initial evidence suggests the SDQ and CHU9D are related in some of their measurement properties. The mapping algorithm using five SDQ subscales was found to be optimal in predicting mean child health utility. Future research valuing changes in SDQ scores would contribute to this research.

  13. On macroeconomic characteristics of pharmaceutical generics and the potential for manufacturing and consumption under fuzzy conditions.

    PubMed

    Gascón, Fernando; de la Fuente, David; Puente, Javier; Lozano, Jesús

    2007-11-01

    The aim of this paper is to develop a methodology that is useful for analyzing, from a macroeconomic perspective, the aggregate demand and aggregate supply features of the market for pharmaceutical generics. In order to determine the potential consumption and potential production of pharmaceutical generics in different countries, two fuzzy decision support systems are proposed. Both are based on the Mamdani model and were generated with the Matlab Toolbox 'Fuzzy' (v. 2.0); they are able to determine the potential of a country for the manufacturing or the consumption of pharmaceutical generics. The systems make use of three macroeconomic input variables. In an empirical application of the proposed methodology, the potential for consumption and manufacturing in Holland, Sweden, Italy and Spain was estimated from national indicators. Cross-country comparisons are made and graphical surfaces are analyzed in order to interpret the results. The main contribution of this work is the development of a methodology that is useful for analyzing aggregate demand and aggregate supply characteristics of pharmaceutical generics, and that is valid for carrying out a systematic analysis of the potential that generics have at the macro level in different countries. The main advantages of using fuzzy decision support systems in the context of pharmaceutical generics are the flexibility in the construction of the system, the speed of interpreting the results offered by the inference and surface maps, and the ease with which a sensitivity analysis of the potential behavior of a given country may be performed.
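
    A Mamdani system of this kind can be reproduced with the open-source scikit-fuzzy package (our substitute for the paper's Matlab toolbox). The variable names, universes and rules below are invented for illustration; the paper's three macroeconomic inputs are not reproduced here.

        # Sketch: Mamdani fuzzy inference for a "generics potential" score.
        # Inputs, membership functions and rules are illustrative only.
        import numpy as np
        from skfuzzy import control as ctrl

        price = ctrl.Antecedent(np.arange(0, 101, 1), "drug_price_index")
        income = ctrl.Antecedent(np.arange(0, 101, 1), "income_index")
        potential = ctrl.Consequent(np.arange(0, 101, 1), "generics_potential")
        for var in (price, income, potential):
            var.automf(3)          # labels: poor / average / good

        rules = [
            ctrl.Rule(price["good"] & income["poor"], potential["good"]),
            ctrl.Rule(price["poor"], potential["poor"]),
            ctrl.Rule(income["average"], potential["average"]),
        ]
        sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
        sim.input["drug_price_index"] = 80
        sim.input["income_index"] = 30
        sim.compute()
        print(sim.output["generics_potential"])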

  14. A Generic Communication Protocol for Remote Laboratories: an Implementation on e-lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henriques, Rafael B.; Fernandes, H.; Duarte, Andre S.

    2015-07-01

    The remote laboratories at IST (Instituto Superior Tecnico), e-lab, serve as a valuable tool for education and training based on remote control technologies. Due to the high and increasing number of remotely operated experiments, a generic protocol was developed to perform the communication between the software driver and the respective experimental setup in an easier and more unified way. Training of students and personnel in these fields can take advantage of such an infrastructure, allowing new experiments to be deployed faster. More than 10 experiments using the generic protocol are available on-line on a 24 x 7 basis.

  15. Canonical Representations of the Simple Map

    NASA Astrophysics Data System (ADS)

    Kerwin, Olivia; Punjabi, Alkesh; Ali, Halima; Boozer, Allen

    2007-11-01

    The simple map is the simplest map that has the topology of a divertor tokamak. The simple map has three canonical representations: (i) toroidal flux and poloidal angle (ψ,θ) as canonical coordinates, (ii) the physical variables (R,Z) or (X,Y) as canonical coordinates, and (iii) the action-angle (J,ζ) or magnetic variables (ψ,θ) as canonical coordinates. We give the derivation of the simple map in the (X,Y) representation. The simple map in this representation has been studied extensively (Ref. 1 and references therein). We calculate the magnetic coordinates for the simple map, construct the simple map in magnetic coordinates, and calculate generic topological effects of magnetic perturbations in divertor tokamaks using the map. We also construct the simple map in the (ψ,θ) representation. Preliminary results of these studies will be presented. This work is supported by US DOE OFES DE-FG02-01ER54624 and DE-FG02-04ER54793. [1] A. Punjabi, H. Ali, T. Evans, and A. Boozer, Phys. Lett. A 364, 140-145 (2007).

  16. IC-Finder: inferring robustly the hierarchical organization of chromatin folding

    PubMed Central

    Haddad, Noelle

    2017-01-01

    Abstract The spatial organization of the genome plays a crucial role in the regulation of gene expression. Recent experimental techniques like Hi-C have emphasized the segmentation of genomes into interaction compartments that constitute conserved functional domains participating in the maintenance of a proper cell identity. Here, we propose a novel method, IC-Finder, to identify interaction compartments (IC) from experimental Hi-C maps. IC-Finder is based on a hierarchical clustering approach that we adapted to account for the polymeric nature of chromatin. Based on a benchmark of realistic in silico Hi-C maps, we show that IC-Finder is one of the best methods in terms of reliability and is the most efficient numerically. IC-Finder proposes two original options: a probabilistic description of the inferred compartments and the possibility to explore the various hierarchies of chromatin organization. Applying the method to experimental data in fly and human, we show how the predicted segmentation may depend on the normalization scheme and how 3D compartmentalization is tightly associated with epigenomic information. IC-Finder provides a robust and generic ‘all-in-one’ tool to uncover the general principles of 3D chromatin folding and their influence on gene regulation. The software is available at http://membres-timc.imag.fr/Daniel.Jost/DJ-TIMC/Software.html. PMID:28130423
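
    IC-Finder's starting point, hierarchical clustering of a Hi-C contact map into domains, can be imitated crudely with SciPy. The sketch below is the plain, unconstrained variant on a synthetic map; the published method additionally accounts for chromatin's polymeric nature and gives probabilistic calls.

        # Sketch: naive hierarchical clustering of a toy Hi-C map into 2 domains.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        n = 40
        hic = np.full((n, n), 2.0)
        hic[:20, :20] += 8.0          # two artificial high-contact blocks
        hic[20:, 20:] += 8.0
        hic += np.random.default_rng(1).uniform(0, 1, (n, n))
        hic = (hic + hic.T) / 2.0

        dist = 1.0 / hic              # high contact -> small distance
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="average")
        print(fcluster(Z, t=2, criterion="maxclust"))  # bins 0-19 vs 20-39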

  17. The Full Kostant-Toda Hierarchy on the Positive Flag Variety

    NASA Astrophysics Data System (ADS)

    Kodama, Yuji; Williams, Lauren

    2015-04-01

    We study some geometric and combinatorial aspects of the solution to the full Kostant-Toda (f-KT) hierarchy, when the initial data is given by an arbitrary point on the totally non-negative (tnn) flag variety of SL_n(R). The f-KT flows on the tnn flag variety are complete, and we show that their asymptotics are completely determined by the cell decomposition of the tnn flag variety given by Rietsch (Total positivity and real flag varieties. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, 1998). Our results are the first on the asymptotics of the f-KT hierarchy (and even of the f-KT lattice); moreover, they are not confined to the generic flow, but cover non-generic flows as well. We define the f-KT flow on the weight space via the moment map, and show that the closure of each f-KT flow forms an interesting convex polytope which we call a Bruhat interval polytope. In particular, the Bruhat interval polytope for the generic flow is the permutohedron of the symmetric group S_n. We also prove analogous results for the full symmetric Toda hierarchy, by mapping our f-KT solutions to those of the full symmetric Toda hierarchy. In the appendix we show that Bruhat interval polytopes are generalized permutohedra, in the sense of Postnikov (Int. Math. Res. Not. IMRN (6):1026-1106, 2009).

  18. Visuospatial and psychomotor aptitude predicts endovascular performance of inexperienced individuals on a virtual reality simulator.

    PubMed

    Van Herzeele, Isabelle; O'Donoghue, Kevin G L; Aggarwal, Rajesh; Vermassen, Frank; Darzi, Ara; Cheshire, Nicholas J W

    2010-04-01

    This study evaluated virtual reality (VR) simulation for endovascular training of medical students to determine whether innate perceptual, visuospatial, and psychomotor aptitude (VSA) can predict the initial and plateau phases of technical endovascular skills acquisition. Twenty medical students received didactic and endovascular training on a commercially available VR simulator. Each student treated a series of 10 identical noncomplex renal artery stenoses endovascularly. The simulator recorded performance data instantly and objectively. An experienced interventionalist rated the performance at the initial and final sessions using generic (out of 40) and procedure-specific (out of 30) rating scales. VSA were tested with fine motor dexterity (FMD, Purdue Pegboard), psychomotor ability (minimally invasive virtual reality surgical trainer [MIST-VR]), image recall (Rey-Osterrieth), and organizational aptitude (map-planning). VSA performance scores were correlated with the assessment parameters of endovascular skills at commencement and completion of training. Medical students exhibited statistically significant learning curves from the initial to the plateau performance for contrast usage (medians, 28 vs 17 mL, P < .001), total procedure time (2120 vs 867 seconds, P < .001), and fluoroscopy time (993 vs 507 seconds, P < .001). Scores on generic and procedure-specific rating scales improved significantly (10 vs 25, P < .001; 8 vs 17, P < .001). Significant correlations were noted for FMD with the initial and plateau sessions for fluoroscopy time (r(s) = -0.564, P = .010; r(s) = -0.449, P = .047). FMD correlated with procedure-specific scores at the initial session (r(s) = 0.607, P = .006). Image recall correlated with generic skills at the end of training (r(s) = 0.587, P = .006). Simulator-based training in endovascular skills improved performance in medical students. There were significant correlations between initial endovascular skill and fine motor dexterity, as well as with image recall at the end of the training period. In addition to current recruitment strategies, VSA may be a useful tool for predictive validity studies.

  19. An open-source java platform for automated reaction mapping.

    PubMed

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  20. Hawaiian Volcano Observatory Seismic Data, January to December 2008

    USGS Publications Warehouse

    Nakata, Jennifer S.; Okubo, Paul G.

    2009-01-01

    The U.S. Geological Survey (USGS), Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year. The seismic summary is offered without interpretation as a source of preliminary data and is complete in that most data for events of M greater than 1.5 are included. All latitude and longitude references in this report are stated in Old Hawaiian Datum. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data necessitated an annual publication, beginning with Summary 74 for the year 1974. Beginning in 2004, summaries are simply identified by the year, rather than by summary number. Summaries originally issued as administrative reports were republished in 2007 as Open-File Reports. All the summaries since 1956 are listed at http://geopubs.wr.usgs.gov/ (last accessed 09/21/2009). In January 1986, HVO adopted CUSP (California Institute of Technology USGS Seismic Processing). Summary 86 includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes background information about the seismic network to provide the end user an understanding of the processing parameters and how the data were gathered. A report by Klein and Koyanagi (1980) tabulates the instrumentation, calibration, and recording history of each seismic station in the network. It is designed as a reference for users of seismograms and phase data, and includes and augments the information in the station table in this summary. Figures 11-14 are maps showing computer-located hypocenters. The maps were generated using the Generic Mapping Tools (GMT; http://gmt.soest.hawaii.edu/, last accessed 09/21/2009) in place of traditional QPLOT maps.
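
    The same kind of hypocenter map can now be scripted through GMT's official Python wrapper, PyGMT (a modern interface; the report itself used command-line GMT). The region and the three plotted events below are placeholders.

        # Sketch: epicenter map of the Island of Hawai'i with PyGMT,
        # in the spirit of the GMT figures cited above. Events are made up.
        import pygmt

        lons = [-155.29, -155.47, -155.08]   # placeholder epicenters
        lats = [19.41, 19.60, 19.35]

        fig = pygmt.Figure()
        fig.coast(region=[-156.2, -154.7, 18.8, 20.3], projection="M12c",
                  shorelines=True, land="gray90", water="lightblue", frame="af")
        fig.plot(x=lons, y=lats, style="c0.2c", fill="red", pen="black")
        fig.savefig("hypocenters.png")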

  1. User Guide for the Anvil Threat Corridor Forecast Tool V2.4 for AWIPS

    NASA Technical Reports Server (NTRS)

    Barett, Joe H., III; Bauman, William H., III

    2008-01-01

    The Anvil Tool GUI allows users to select a Data Type, toggle the map refresh on/off, place labels, and choose the Profiler Type (source of the KSC 50 MHz profiler data), the Date-Time of the data, the Center of Plot, and the Station (location of the RAOB or 50 MHz profiler). If the Data Type is Models, the user selects a Fcst Hour (forecast hour) instead of a Station. There are menus for User Profiles, Circle Label Options, and Frame Label Options. Labels can be placed near the center circle of the plot and/or at a specified distance and direction from the center of the circle (Center of Plot). The default selection for the map refresh is "ON". When the user creates a new Anvil Tool map with Refresh Map "ON", the plot is automatically displayed in the AWIPS frame. If another Anvil Tool map is already displayed and the user does not change the existing map number shown at the bottom of the GUI, the new Anvil Tool map will overwrite the old one. If the user turns the Refresh Map "OFF", the new Anvil Tool map is created but not automatically displayed. The user can still display the Anvil Tool map through the Maps dropdown menu, as shown in Figure 4.

  2. Making Space for Place: Mapping Tools and Practices to Teach for Spatial Justice

    ERIC Educational Resources Information Center

    Rubel, Laurie H.; Hall-Wieckert, Maren; Lim, Vivian Y.

    2017-01-01

    This article presents a set of spatial tools for classroom learning about spatial justice. As part of a larger team, we designed a curriculum that engaged 10 learners with 3 spatial tools: (a) an oversized floor map, (b) interactive geographic information systems (GIS) maps, and (c) participatory mapping. We analyze how these tools supported…

  3. The generic unfolding of a codimension-two connection to a two-fold singularity of planar Filippov systems

    NASA Astrophysics Data System (ADS)

    Novaes, Douglas D.; Teixeira, Marco A.; Zeli, Iris O.

    2018-05-01

    Generic bifurcation theory was classically well developed for smooth differential systems, establishing results for k-parameter families of planar vector fields. In the present study we focus on a qualitative analysis of 2-parameter families of planar Filippov systems, assuming that the member Z_{0,0} presents a codimension-two minimal set. Such an object, named an elementary simple two-fold cycle, is characterized by a regular trajectory connecting a visible two-fold singularity to itself, for which the second derivative of the first return map is nonvanishing. We analyze the codimension-two scenario through the exhibition of its bifurcation diagram.

  4. PSUP: A Planetary SUrface Portal

    NASA Astrophysics Data System (ADS)

    Poulet, F.; Quantin-Nataf, C.; Ballans, H.; Dassas, K.; Audouard, J.; Carter, J.; Gondet, B.; Lozac'h, L.; Malapert, J.-C.; Marmo, C.; Riu, L.; Séjourné, A.

    2018-01-01

    The large size and complexity of planetary data acquired by spacecraft during the last two decades create a demand within the planetary community for access to archives of raw and high-level data and for the tools necessary to analyze these data. Among the different targets of the Solar System, Mars is unique in that the combined datasets from the Viking, Mars Global Surveyor, Mars Odyssey, Mars Express and Mars Reconnaissance Orbiter missions provide a tremendous wealth of information that can be used to study the surface of Mars. The number and size of the datasets require an information system to process, manage and distribute data. The Observatories of Paris Sud (OSUPS) and Lyon (OSUL) have developed a portal, called PSUP (Planetary SUrface Portal), to provide users with efficient and easy access to data products dedicated to the Martian surface. The objectives of the portal are: 1) to allow processing and downloading of data via a specific application called MarsSI (Martian surface data processing Information System); 2) to provide visualization and merging of high-level (image, spectral, and topographic) products and catalogs via a web-based user interface (MarsVisu); and 3) to distribute some of these specific high-level data with an emphasis on products issued by the science teams of OSUPS and OSUL. As the MarsSI service is extensively described in a companion paper (Quantin-Nataf et al., submitted to this special issue), the present paper focuses on the general architecture and functionalities of the web-based user interface MarsVisu. This service provides access to many data products for Mars: albedo, mineral and thermal inertia global maps from spectrometers; mosaics from imagers; image footprints and rasters from the MarsSI tool; and high-level specific products (defined as catalogs or vectors). MarsVisu can be used to quickly assess the visualized processed data and maps, as well as to identify areas that have not been mapped yet. It also allows overlaying these data products, which can be difficult to use collectively, on a virtual Martian globe. The architecture of the PSUP data management layer and visualization is based on SITools2 (Malapert and Marseille, 2012) and MIZAR (Module for Interactive visualiZation from Astronomical Repositories) respectively, two generic tools developed jointly by the French space agency (CNES) and French scientific laboratories. Future developments include the addition of high-level products for Mars (regional geological maps, new global compositional maps…) and tools (spectra extraction from hyperspectral cubes). Ultimately, PSUP will be adapted to other planetary surfaces and space missions in which the French research institutes are involved.

  5. An analogy of the charge distribution on Julia sets with the Brownian motion

    NASA Astrophysics Data System (ADS)

    Lopes, Artur O.

    1989-09-01

    A way to compute the entropy of an invariant measure of a hyperbolic rational map from the information given by a Ruelle-Perron-Frobenius operator of a generic Hölder-continuous function will be shown. This result was motivated by an analogy of the Brownian motion with the dynamical system given by a rational map and the maximal measure. In the case where the rational map is a polynomial, the maximal measure is the charge distribution on the Julia set. The main theorem of this paper can be seen as a large deviation result; it is a kind of Donsker-Varadhan formula for dynamical systems.

  6. Economic principles for resource allocation decisions at national level to mitigate the effects of disease in farm animal populations.

    PubMed

    Howe, K S; Häsler, B; Stärk, K D C

    2013-01-01

    This paper originated in a project to develop a practical, generic tool for the economic evaluation of surveillance for farm animal diseases at national level by a state veterinary service. Fundamental to that process is integration of epidemiological and economic perspectives. Using a generalized example of epidemic disease, we show that an epidemic curve maps into its economic equivalent, a disease mitigation function, that traces the relationship between value losses avoided and mitigation resources expended. Crucially, elementary economic principles show that mitigation, defined as loss reduction achieved by surveillance and intervention, must be explicitly conceptualized as a three-variable process, and the relative contributions of surveillance and intervention resources investigated with regard to the substitution possibilities between them. Modelling the resultant mitigation surfaces for different diseases should become a standard approach to animal health policy analysis for economic efficiency, a contribution to the evolving agenda for animal health economics research.

  7. Microsatellite diversity of isolates of the parasitic nematode Haemonchus contortus.

    PubMed

    Otsen, M; Plas, M E; Lenstra, J A; Roos, M H; Hoekstra, R

    2000-09-01

    The alarming development of anthelmintic resistance in important gastrointestinal nematode parasites of man and livestock is caused by selection for specific genotypes. In order to provide genetic tools to study nematode populations and the consequences of anthelmintic treatment, we isolated and sequenced 59 microsatellites of the sheep and goat parasite Haemonchus contortus. These microsatellites typically consist of 2-10 tandem CA/GT repeats that are interrupted by sequences of 1-10 bp. A predominant cause of the imperfect structure of the microsatellites appeared to be mutations of G/C base pairs in the tandem repeat. About 44% of the microsatellites were associated with the HcREP1 direct repeat, and it was demonstrated that a generic HcREP1 primer could be used to amplify HcREP1-associated microsatellites. Thirty microsatellites could be typed by polymerase chain reaction (PCR), of which 27 were polymorphic. A number of these markers were used to detect genetic contamination of an experimental inbred population. The microsatellites may also contribute to the genetic mapping of drug resistance genes.
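
    Scanning sequence for imperfect CA/GT repeats of the kind described can be prototyped with a regular expression. A sketch on a toy sequence, with a simplified rule for the interruptions:

        # Sketch: find CA/TG microsatellite stretches, allowing 1-10 bp
        # interruptions between runs, loosely mirroring the structures above.
        import re

        seq = "TTGACACACACAGGCACACATTTGTGTGTGTGCCAAA"  # toy sequence
        unit = r"(?:(?:CA){2,}|(?:TG){2,})"
        pattern = re.compile(unit + r"(?:[ACGT]{1,10}" + unit + r")*")

        for m in pattern.finditer(seq):
            print(m.start(), m.group())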

  8. Nanobits, Nembranes and Micro Four-Point Probes: Customizable Tools for in situ Manipulation and Characterisation of Nanostructures

    NASA Astrophysics Data System (ADS)

    Boggild, Peter; Hjorth Petersen, Dirch; Sardan Sukas, Ozlem; Dam, Henrik Friis; Lei, Anders; Booth, Timothy; Molhave, Kristian; Eicchorn, Volkmar

    2010-03-01

    We present a range of highly adaptable microtools for direct interaction with nanoscale structures: (i) semiautomatic pick-and-place assembly of multiwalled carbon nanotubes onto cantilevers for high-aspect-ratio scanning probe microscopy, using electrothermal microgrippers inside an SEM. Topology optimisation was used to calculate the optimal gripper shape defined by the boundary conditions, resulting in 10-100 times better performance. By instead pre-defining detachable tips using electron beam lithography, free-form scanning probe tips (Nanobits) can be mounted in virtually any position on a cantilever. (ii) Scanning micro four-point probes allow fast, non-destructive mapping of local electrical properties (sheet resistance and Hall mobility) and hysteresis effects of graphene sheets. (iii) Sub-100 nm freestanding devices with wires, heaters, actuators, sensors, resonators and probes were defined in a 100 nm thin membrane by focused ion beam milling. By patterning generic membrane templates (Nembranes), the fabrication time of a TEM-compatible NEMS device is effectively reduced to around 20 minutes.

  9. Concept Mapping Using Cmap Tools to Enhance Meaningful Learning

    NASA Astrophysics Data System (ADS)

    Cañas, Alberto J.; Novak, Joseph D.

    Concept maps are graphical tools that have been used in all facets of education and training for organizing and representing knowledge. When learners build concept maps, meaningful learning is facilitated. Computer-based concept mapping software such as CmapTools have further extended the use of concept mapping and greatly enhanced the potential of the tool, facilitating the implementation of a concept map-centered learning environment. In this chapter, we briefly present concept mapping and its theoretical foundation, and illustrate how it can lead to an improved learning environment when it is combined with CmapTools and the Internet. We present the nationwide “Proyecto Conéctate al Conocimiento” in Panama as an example of how concept mapping, together with technology, can be adopted by hundreds of schools as a means to enhance meaningful learning.

  10. Modeling of Protection in Dynamic Simulation Using Generic Relay Models and Settings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samaan, Nader A.; Dagle, Jeffery E.; Makarov, Yuri V.

    This paper shows how generic protection relay models available in planning tools can be augmented with settings that are based on NERC standards or best engineering practice. Selected generic relay models in Siemens PSS®E have been used in dynamic simulations in the proposed approach. Undervoltage, overvoltage, underfrequency, and overfrequency relays have been modeled for each generating unit. Distance-relay protection was modeled for transmission system protection. Two types of load-shedding schemes were modeled: underfrequency (frequency-responsive non-firm load shedding) and underfrequency and undervoltage firm load shedding. Several case studies are given to show the impact of protection devices on dynamic simulations. This is useful for simulating cascading outages.
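
    A generic definite-time undervoltage relay of the kind being parameterised reduces to a timer over a voltage threshold. A sketch follows; the pickup level and delay are illustrative placeholders, not NERC settings.

        # Sketch: definite-time undervoltage relay on a sampled voltage trace.
        def undervoltage_trip(voltages_pu, dt=0.02, pickup=0.85, delay=0.5):
            """Trip when voltage stays below `pickup` (pu) for `delay` seconds."""
            below = 0.0
            for i, v in enumerate(voltages_pu):
                below = below + dt if v < pickup else 0.0
                if below >= delay:
                    return i * dt        # trip time in seconds
            return None                  # no trip

        trace = [1.0] * 50 + [0.7] * 50  # voltage dips to 0.7 pu after 1 s
        print(undervoltage_trip(trace))  # trips ~0.5 s into the dip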

  11. Synthesizing cognition in neuromorphic electronic systems

    PubMed Central

    Neftci, Emre; Binas, Jonathan; Rutishauser, Ueli; Chicca, Elisabetta; Indiveri, Giacomo; Douglas, Rodney J.

    2013-01-01

    The quest to implement intelligent processing in electronic neuromorphic systems lacks methods for achieving reliable behavioral dynamics on substrates of inherently imprecise and noisy neurons. Here we report a solution to this problem that involves first mapping an unreliable hardware layer of spiking silicon neurons into an abstract computational layer composed of generic reliable subnetworks of model neurons and then composing the target behavioral dynamics as a “soft state machine” running on these reliable subnets. In the first step, the neural networks of the abstract layer are realized on the hardware substrate by mapping the neuron circuit bias voltages to the model parameters. This mapping is obtained by an automatic method in which the electronic circuit biases are calibrated against the model parameters by a series of population activity measurements. The abstract computational layer is formed by configuring neural networks as generic soft winner-take-all subnetworks that provide reliable processing by virtue of their active gain, signal restoration, and multistability. The necessary states and transitions of the desired high-level behavior are then easily embedded in the computational layer by introducing only sparse connections between some neurons of the various subnets. We demonstrate this synthesis method for a neuromorphic sensory agent that performs real-time context-dependent classification of motion patterns observed by a silicon retina. PMID:23878215
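
    As a rough illustration of the soft winner-take-all motif underlying these reliable subnetworks, the rate-based sketch below (Python/NumPy) shows how recurrent self-excitation plus shared inhibition lets the strongest input dominate while weaker units are suppressed; the network size, gains and time constants are illustrative assumptions, not parameters from the paper.

        import numpy as np

        def soft_wta(inputs, n_steps=500, dt=1e-3, tau=0.02,
                     w_exc=1.2, w_inh=1.5, gain=1.0):
            """Soft winner-take-all: self-excitation plus shared
            inhibition amplifies the strongest input (active gain)
            and suppresses the rest (signal restoration)."""
            r = np.zeros(len(inputs))            # firing rates
            for _ in range(n_steps):
                inh = w_inh * r.sum()            # global inhibitory feedback
                drive = inputs + w_exc * r - inh
                r += dt / tau * (-r + gain * np.maximum(drive, 0.0))
            return r

        print(soft_wta(np.array([0.9, 1.0, 0.8])))  # middle unit wins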

  12. Audio Tracking in Noisy Environments by Acoustic Map and Spectral Signature.

    PubMed

    Crocco, Marco; Martelli, Samuele; Trucco, Andrea; Zunino, Andrea; Murino, Vittorio

    2018-05-01

    A novel method is proposed for generic target tracking by audio measurements from a microphone array. To cope with noisy environments characterized by persistent and high energy interfering sources, a classification map (CM) based on spectral signatures is calculated by means of a machine learning algorithm. Next, the CM is combined with the acoustic map, describing the spatial distribution of sound energy, in order to obtain a cleaned joint map in which contributions from the disturbing sources are removed. A likelihood function is derived from this map and fed to a particle filter yielding the target location estimation on the acoustic image. The method is tested on two real environments, addressing both speaker and vehicle tracking. The comparison with a couple of trackers, relying on the acoustic map only, shows a sharp improvement in performance, paving the way to the application of audio tracking in real challenging environments.

  13. A mesh generation and machine learning framework for Drosophila gene expression pattern image analysis

    PubMed Central

    2013-01-01

    Background Multicellular organisms consist of cells of many different types that are established during development. Each type of cell is characterized by the unique combination of expressed gene products as a result of spatiotemporal gene regulation. Currently, a fundamental challenge in regulatory biology is to elucidate the gene expression controls that generate the complex body plans during development. Recent advances in high-throughput biotechnologies have generated spatiotemporal expression patterns for thousands of genes in the model organism fruit fly Drosophila melanogaster. Enhancing existing qualitative methods with the quantitative analysis based on the computational tools presented in this paper provides a promising way to address key scientific questions. Results We develop a set of computational methods and open source tools for identifying co-expressed embryonic domains and the associated genes simultaneously. To map the expression patterns of many genes into the same coordinate space and account for the embryonic shape variations, we develop a mesh generation method to deform a meshed generic ellipse to each individual embryo. We then develop a co-clustering formulation to cluster the genes and the mesh elements, thereby identifying co-expressed embryonic domains and the associated genes simultaneously. Experimental results indicate that the gene and mesh co-clusters can be correlated to key developmental events during the stages of embryogenesis we study. The open source software tool has been made available at http://compbio.cs.odu.edu/fly/. Conclusions Our mesh generation and machine learning methods and tools improve upon the flexibility, ease-of-use and accuracy of existing methods.
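
    The co-clustering formulation itself is specific to the paper; purely as a generic illustration of clustering genes and mesh elements simultaneously, the sketch below applies scikit-learn's SpectralCoclustering to a synthetic gene-by-mesh-element expression matrix (the sizes and planted block structure are hypothetical).

        import numpy as np
        from sklearn.cluster import SpectralCoclustering

        rng = np.random.default_rng(0)
        # synthetic expression matrix: 60 genes x 40 mesh elements,
        # with three planted co-expressed blocks plus background noise
        X = rng.random((60, 40)) * 0.1
        for genes, elements in [(slice(0, 20), slice(0, 15)),
                                (slice(20, 40), slice(15, 28)),
                                (slice(40, 60), slice(28, 40))]:
            X[genes, elements] += 1.0

        model = SpectralCoclustering(n_clusters=3, random_state=0).fit(X)
        # rows_ / columns_ are boolean masks: genes and mesh elements per co-cluster
        for k in range(3):
            print(f"domain {k}: {model.rows_[k].sum()} genes, "
                  f"{model.columns_[k].sum()} mesh elements")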

  14. Tools for automated acoustic monitoring within the R package monitoR

    USGS Publications Warehouse

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
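
    monitoR itself is an R package; as a language-neutral illustration of its spectrogram cross-correlation detector, the minimal SciPy sketch below slides a template spectrogram across a survey recording and reports frame offsets whose Pearson correlation exceeds a threshold (window sizes and the threshold are illustrative).

        import numpy as np
        from scipy.signal import spectrogram

        def xcorr_detect(survey, template, fs, threshold=0.6):
            """Detect template-like events in a longer survey recording."""
            _, _, S = spectrogram(survey, fs=fs, nperseg=512, noverlap=256)
            _, _, T = spectrogram(template, fs=fs, nperseg=512, noverlap=256)
            S, T = np.log(S + 1e-12), np.log(T + 1e-12)   # log power
            t_std = (T - T.mean()) / T.std()
            hits = []
            for i in range(S.shape[1] - T.shape[1] + 1):
                win = S[:, i:i + T.shape[1]]
                w_std = (win - win.mean()) / (win.std() + 1e-12)
                score = (w_std * t_std).mean()            # Pearson correlation
                if score >= threshold:
                    hits.append((i, score))
            return hits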

  15. Systematic review of the impact of urinary tract infections on health-related quality of life.

    PubMed

    Bermingham, Sarah L; Ashe, Joanna F

    2012-12-01

    What's known on the subject? and What does the study add? Values for equivalent health states can vary substantially depending on the measure used and method of valuation; this has a direct impact on the results of economic analyses. To date, the majority of existing economic evaluations that include UTI as a health state refer to an analysis in which the Index of Well Being was used to estimate the quality of life experienced by young women with UTIs. Currently, there are no validated methods or filters for systematically searching for the type of generic quality of life data required for decision analytic models. This study is the only systematic review of quality of life in people with UTI in the literature. Twelve studies were identified which report quality of life using a variety of generic methods; the results of these papers were summarized in a way that is useful for a health researcher seeking to populate a decision model, design a clinical study or assess the effect of UTI on quality of life relative to other conditions. One research group provided previously unpublished data from a large cohort study; these scores were mapped to EuroQol 5-Dimension values using published algorithms and probabilistic simulations. The aim of this review was to identify studies that have evaluated the impact of symptomatic urinary tract infection (UTI) and UTI-associated bacteraemia on quality of life, and to summarize these data in a way that is useful for a health researcher seeking to populate a cost-utility model, design a clinical study or assess the effect of UTIs on quality of life relative to other conditions. We conducted a systematic search of the literature using MEDLINE, EMBASE, the NHS Economic Evaluations database, Health Technology Assessment database, Health Economics Evaluations database, Cost-Effectiveness Analysis Registry and EuroQol website. Studies that reported utility values for symptomatic UTI or UTI-associated bacteraemia derived from a generic QoL measurement tool or expert opinion were included. Studies using disease-specific instruments were excluded. Twelve studies were identified that included a generic measure of health-related quality of life for patients with UTIs. These measures included: the short-form (SF)-36 and SF-12 questionnaires; the Health Utilities Index Mark 2; Quality of Well Being; the Index of Well Being, standard gamble; the Health and Activity Limitation Index; and expert opinion. The authors of studies using either of the SF questionnaires were contacted for additional data. One research group provided previously unpublished data from a large cohort study; these scores were mapped to EuroQol 5-Dimension (EQ-5D) values using published algorithms and probabilistic simulations. The present review provides health researchers with several sources from which to select utility values to populate cost-utility models. It also shows that very few studies have measured quality of life in patients with UTI using generic preference-based measures of health and none have evaluated the impact of this health state on quality of life in children. Future studies ought to consider the inclusion of commonly used preference-based measures of health, such as the EQ-5D, in all patient populations experiencing symptomatic UTI or UTI-related complications. © 2012 NATIONAL CLINICAL GUIDELINE CENTRE OF ROYAL COLLEGE OF PHYSICIANS.

  16. Experimental Stage Separation Tool Development in NASA Langley's Aerothermodynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Scallion, William I.

    2005-01-01

    As part of the research effort at NASA in support of the stage separation and ascent aerothermodynamics research program, proximity testing of a generic bimese wing-body configuration was conducted in NASA Langley's Aerothermodynamics Laboratory in the 20-Inch Mach 6 Air Tunnel. The objective of this work is the development of experimental tools and testing methodologies to apply to hypersonic stage separation problems for future multi-stage launch vehicle systems. Aerodynamic force and moment proximity data were generated at a nominal Mach number of 6 over a small range of angles of attack. The generic bimese configuration was tested in a belly-to-belly and back-to-belly orientation at 86 relative proximity locations. Over 800 aerodynamic proximity data points were taken to serve as a database for code validation. Longitudinal aerodynamic data generated in this test program show very good agreement with viscous computational predictions. Thus a framework has been established to study separation problems in the hypersonic regime using coordinated experimental and computational tools.

  17. The Generic Spacecraft Analyst Assistant (gensaa): a Tool for Developing Graphical Expert Systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real-time data. The analysts must watch for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As the satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At NASA GSFC, fault-isolation expert systems are in operation supporting this data monitoring task. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will readily support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.

  18. Map reading tools for map libraries.

    USGS Publications Warehouse

    Greenberg, G.L.

    1982-01-01

    Engineers, navigators and military strategists employ a broad array of mechanical devices to facilitate map use. A larger number of map users such as educators, students, tourists, journalists, historians, politicians, economists and librarians are unaware of the available variety of tools which can be used with maps to increase the speed and efficiency of their application and interpretation. This paper identifies map reading tools such as coordinate readers, protractors, dividers, planimeters, and symbol-templets according to a functional classification. Particularly, arrays of tools are suggested for use in determining position, direction, distance, area and form (perimeter-shape-pattern-relief). -from Author

  19. Accurate estimation of short read mapping quality for next-generation genome sequencing

    PubMed Central

    Ruffalo, Matthew; Koyutürk, Mehmet; Ray, Soumya; LaFramboise, Thomas

    2012-01-01

    Motivation: Several software tools specialize in the alignment of short next-generation sequencing reads to a reference sequence. Some of these tools report a mapping quality score for each alignment—in principle, this quality score tells researchers the likelihood that the alignment is correct. However, the reported mapping quality often correlates weakly with actual accuracy and the qualities of many mappings are underestimated, encouraging researchers to discard correct mappings. Further, these low-quality mappings tend to correlate with variations in the genome (both single nucleotide and structural), and such mappings are important in accurately identifying genomic variants. Approach: We develop a machine learning tool, LoQuM (LOgistic regression tool for calibrating the Quality of short read mappings), to assign reliable mapping quality scores to mappings of Illumina reads returned by any alignment tool. LoQuM uses statistics on the read (base quality scores reported by the sequencer) and the alignment (number of matches, mismatches and deletions, mapping quality score returned by the alignment tool, if available, and number of mappings) as features for classification and uses simulated reads to learn a logistic regression model that relates these features to actual mapping quality. Results: We test the predictions of LoQuM on an independent dataset generated by the ART short read simulation software and observe that LoQuM can ‘resurrect’ many mappings that are assigned zero quality scores by the alignment tools and are therefore likely to be discarded by researchers. We also observe that the recalibration of mapping quality scores greatly enhances the precision of called single nucleotide polymorphisms. Availability: LoQuM is available as open source at http://compbio.case.edu/loqum/. Contact: matthew.ruffalo@case.edu. PMID:22962451
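
    A minimal sketch of the recalibration idea with scikit-learn standing in for LoQuM's own implementation: per-alignment features are fit against truth labels from simulated reads, and predicted probabilities are converted to Phred-scaled qualities. The feature columns and synthetic data are illustrative, not LoQuM's actual inputs.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        # features per alignment: e.g. mean base quality, mismatches,
        # deletions, aligner-reported MAPQ, number of candidate mappings
        X = rng.random((5000, 5))
        # truth labels from simulated reads: 1 if the alignment is correct
        y = (rng.random(5000) < 1 / (1 + np.exp(-4 * (X[:, 3] - 0.5)))).astype(int)

        model = LogisticRegression().fit(X, y)
        p_correct = model.predict_proba(X[:5])[:, 1]
        # recalibrated Phred-scaled quality: -10 * log10 P(incorrect)
        mapq = -10 * np.log10(np.clip(1 - p_correct, 1e-10, None))
        print(mapq.round(1))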

  20. A Geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity

    USGS Publications Warehouse

    Blair, J.L.; McCrory, P.A.; Oppenheimer, D.H.; Waldhauser, F.

    2011-01-01

    We present a Geographic Information System (GIS) of a new 3-dimensional (3D) model of the subducted Juan de Fuca Plate beneath western North America and associated seismicity of the Cascadia subduction system. The geo-referenced 3D model was constructed from weighted control points that integrate depth information from hypocenter locations and regional seismic velocity studies. We used the 3D model to differentiate earthquakes that occur above the Juan de Fuca Plate surface from earthquakes that occur below the plate surface. This GIS project of the Cascadia subduction system supersedes the one previously published by McCrory and others (2006). Our new slab model updates the model with new constraints. The most significant updates to the model include: (1) weighted control points to incorporate spatial uncertainty, (2) an additional gridded slab surface based on the Generic Mapping Tools (GMT) Surface program which constructs surfaces based on splines in tension (see expanded description below), (3) double-differenced hypocenter locations in northern California to better constrain slab location there, and (4) revised slab shape based on new hypocenter profiles that incorporate routine depth uncertainties as well as data from new seismic-reflection and seismic-refraction studies. We also provide a 3D fly-through animation of the model for use as a visualization tool.
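
    The GMT Surface program mentioned in item (2) fits splines in tension to scattered control points; a hedged sketch of such a call, wrapped in Python, is shown below. The file names, region and tension value are illustrative, not the settings used by the authors.

        import subprocess

        # grid weighted slab control points (lon, lat, depth) with splines
        # in tension; -T sets the tension factor (0 = minimum curvature,
        # values closer to 1 suppress spurious oscillations)
        subprocess.run(
            ["gmt", "surface", "slab_control_points.xyz",
             "-R-128/-120/39/50",   # region: lon_min/lon_max/lat_min/lat_max
             "-I2m",                # 2 arc-minute grid spacing
             "-T0.35",              # tension factor
             "-Gjdf_slab_surface.grd"],
            check=True,
        )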

  1. Friction Mapping as a Tool for Measuring the Elastohydrodynamic Contact Running-in Process

    DTIC Science & Technology

    2015-10-01

    ARL-TR-7501, October 2015. US Army Research Laboratory final report (covering 1 January-30 June 2015) by Stephen Berkebile, Vehicle Technology Directorate: Friction Mapping as a Tool for Measuring the Elastohydrodynamic Contact Running-in Process. Only report-cover text is available in the indexed record.

  2. 40 CFR 1033.110 - Emission diagnostics-general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a... and understand the diagnostic trouble codes stored in the onboard computer with generic tools and...

  3. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
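
    The abstract does not give the sizing function or the component weights, so the sketch below is purely hypothetical; it only illustrates the shape of such a pipeline, in which rules map specifications to generic components and a nonlinear (here power-law) function of their summed complexity weights yields the size estimate.

        # hypothetical generic components and complexity weights
        COMPONENT_WEIGHTS = {"input_parsing": 3.0, "numeric_kernel": 8.0,
                             "report_generation": 2.5, "file_io": 2.0}

        def estimate_sloc(components, a=120.0, b=0.9):
            """Nonlinear sizing: SLOC ~ a * (sum of weights) ** b, with
            a and b fit to a historical database of completed programs."""
            total = sum(COMPONENT_WEIGHTS[c] for c in components)
            return a * total ** b

        print(round(estimate_sloc(["input_parsing", "numeric_kernel", "file_io"])))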

  4. GSyellow, a Multifaceted Tag for Functional Protein Analysis in Monocot and Dicot Plants.

    PubMed

    Besbrugge, Nienke; Van Leene, Jelle; Eeckhout, Dominique; Cannoot, Bernard; Kulkarni, Shubhada R; De Winne, Nancy; Persiau, Geert; Van De Slijke, Eveline; Bontinck, Michiel; Aesaert, Stijn; Impens, Francis; Gevaert, Kris; Van Damme, Daniel; Van Lijsebettens, Mieke; Inzé, Dirk; Vandepoele, Klaas; Nelissen, Hilde; De Jaeger, Geert

    2018-06-01

    The ability to tag proteins has boosted the emergence of generic molecular methods for protein functional analysis. Fluorescent protein tags are used to visualize protein localization, and affinity tags enable the mapping of molecular interactions by, for example, tandem affinity purification or chromatin immunoprecipitation. To apply these widely used molecular techniques on a single transgenic plant line, we developed a multifunctional tandem affinity purification tag, named GSyellow, which combines the streptavidin-binding peptide tag with citrine yellow fluorescent protein. We demonstrated the versatility of the GSyellow tag in the dicot Arabidopsis (Arabidopsis thaliana) using a set of benchmark proteins. For proof of concept in monocots, we assessed the localization and dynamic interaction profile of the leaf growth regulator ANGUSTIFOLIA3 (AN3), fused to the GSyellow tag, along the growth zone of the maize (Zea mays) leaf. To further explore the function of ZmAN3, we mapped its DNA-binding landscape in the growth zone of the maize leaf through chromatin immunoprecipitation sequencing. Comparison with AN3 target genes mapped in the developing maize tassel or in Arabidopsis cell cultures revealed strong conservation of AN3 target genes between different maize tissues and across monocots and dicots, respectively. In conclusion, the GSyellow tag offers a powerful molecular tool for distinct types of protein functional analyses in dicots and monocots. As this approach involves transforming a single construct, it is likely to accelerate both basic and translational plant research. © 2018 American Society of Plant Biologists. All rights reserved.

  5. The Balanced Scorecard of acute settings: development process, definition of 20 strategic objectives and implementation.

    PubMed

    Groene, Oliver; Brandt, Elimer; Schmidt, Werner; Moeller, Johannes

    2009-08-01

    Strategy development and implementation in acute care settings is often restricted by competing challenges, the pace of policy reform and the existence of parallel hierarchies. To describe a generic approach to strategy development, illustrate the use of the Balanced Scorecard as a tool to facilitate strategy implementation and demonstrate how to break down strategic goals into measurable elements. Multi-method approach using three different conceptual models: Health Promoting Hospitals Standards and Strategies, the European Foundation for Quality Management (EFQM) Model and the Balanced Scorecard. A bundle of qualitative and quantitative methods were used including in-depth interviews, standardized organization-wide surveys on organizational values, staff satisfaction and patient experience. Three acute care hospitals in four different locations belonging to a German holding group. Chief executive officer, senior medical officers, working group leaders and hospital staff. Development and implementation of the Balanced Scorecard. Twenty strategic objectives with corresponding Balanced Scorecard measures. A stepped approach from strategy development to implementation is presented to identify key themes for strategy development, drafting a strategy map and developing strategic objectives and measures. The Balanced Scorecard, in combination with the EFQM model, is a useful tool to guide strategy development and implementation in health care organizations. As for other quality improvement and management tools not specifically developed for health care organizations, some adaptations are required to improve acceptability among professionals. The step-wise approach of strategy development and implementation presented here may support similar processes in comparable organizations.

  6. An avionics scenario and command model description for Space Generic Open Avionics Architecture (SGOAA)

    NASA Technical Reports Server (NTRS)

    Stovall, John R.; Wray, Richard B.

    1994-01-01

    This paper presents a description of a model for a space vehicle operational scenario and the commands for avionics. This model will be used in developing a dynamic architecture simulation model using the Statemate CASE tool for validation of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA has been proposed as an avionics architecture standard to NASA through its Strategic Avionics Technology Working Group (SATWG) and has been accepted by the Society of Automotive Engineers (SAE) for conversion into an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division (FDSD) of the NASA Johnson Space Center (JSC) by the Lockheed Engineering and Sciences Company (LESC), Houston, Texas. This SGOAA includes a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, and a nine class model of interfaces. The SGOAA is both scalable and recursive and can be applied to any hierarchical level of hardware/software processing systems.

  7. Magrit: a new thematic cartography tool

    NASA Astrophysics Data System (ADS)

    Viry, Matthieu; Giraud, Timothée; Lambert, Nicolas

    2018-05-01

    The article provides an overview of the features of the Magrit web application: a free online thematic mapping tool with a strong pedagogical dimension that brings together all the elements necessary to produce a thematic map. The tool offers several simple modes of representation, such as proportional-symbol and choropleth maps, as well as more complex modes such as smoothed maps and cartograms. Each map can be finalized thanks to layout and customization features (projection, scale, orientation, toponyms, etc.) and exported in vector format. Magrit is therefore a complete, lightweight and versatile tool particularly well suited to teaching cartography at the university level.

  8. Algorithm To Architecture Mapping Model (ATAMM) multicomputer operating system functional specification

    NASA Technical Reports Server (NTRS)

    Mielke, R.; Stoughton, J.; Som, S.; Obando, R.; Malekpour, M.; Mandala, B.

    1990-01-01

    A functional description of the ATAMM Multicomputer Operating System is presented. ATAMM (Algorithm to Architecture Mapping Model) is a marked graph model which describes the implementation of large grained, decomposed algorithms on data flow architectures. AMOS, the ATAMM Multicomputer Operating System, is an operating system which implements the ATAMM rules. A first generation version of AMOS which was developed for the Advanced Development Module (ADM) is described. A second generation version of AMOS being developed for the Generic VHSIC Spaceborne Computer (GVSC) is also presented.

  9. Chaotic attractors of relaxation oscillators

    NASA Astrophysics Data System (ADS)

    Guckenheimer, John; Wechselberger, Martin; Young, Lai-Sang

    2006-03-01

    We develop a general technique for proving the existence of chaotic attractors for three-dimensional vector fields with two time scales. Our results connect two important areas of dynamical systems: the theory of chaotic attractors for discrete two-dimensional Henon-like maps and geometric singular perturbation theory. Two-dimensional Henon-like maps are diffeomorphisms that limit on non-invertible one-dimensional maps. Wang and Young formulated hypotheses that suffice to prove the existence of chaotic attractors in these families. Three-dimensional singularly perturbed vector fields have return maps that are also two-dimensional diffeomorphisms limiting on one-dimensional maps. We describe a generic mechanism that produces folds in these return maps and demonstrate that the Wang-Young hypotheses are satisfied. Our analysis requires a careful study of the convergence of the return maps to their singular limits in the C^k topology for k >= 3. The theoretical results are illustrated with a numerical study of a variant of the forced van der Pol oscillator.
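
    For readers unfamiliar with the objects involved, the Henon-like family referred to is the standard one below (textbook definitions, not formulas from the abstract):

        \[
        T_{a,b}(x, y) = \bigl(1 - a x^{2} + y,\; b x\bigr),
        \]

    a diffeomorphism for \(b \neq 0\) that, in the singular limit \(b \to 0\), collapses onto the non-invertible one-dimensional quadratic map \(x \mapsto 1 - a x^{2}\).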

  10. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    PubMed

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

    Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians of problematic situations, or decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of developments, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is represented by a stable and small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable and domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes for modelling the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer, akin to a virtual health record (VHR), over the EHR whose contents need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools. We also describe a case study where the tools and methodology have been employed in a CDSS to support patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes not only has proved satisfactory to achieve interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are of a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, due to their deliberate independence of the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable, e.g., feeding back the EHR with data derived by abstraction mechanisms. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. The Generic Resolution Advisor and Conflict Evaluator (GRACE) for Unmanned Aircraft Detect-And-Avoid Systems

    NASA Technical Reports Server (NTRS)

    Abramson, Michael; Refai, Mohamad; Santiago, Confesor

    2017-01-01

    The paper describes the Generic Resolution Advisor and Conflict Evaluator (GRACE), a novel alerting and guidance algorithm that combines flexibility, robustness, and computational efficiency. GRACE is generic since it was designed without any assumptions regarding temporal or spatial scales, aircraft performance, or its sensor and communication systems. Therefore, GRACE was adopted as a core component of the Java Architecture for Detect-And-Avoid (DAA) Extensibility and Modeling, developed by NASA as a research and modeling tool for Unmanned Aerial Systems Integration in the National Airspace System (NAS). GRACE has been used in a number of real-time and fast-time experiments supporting evolving requirements of DAA research, including parametric studies, NAS-wide simulations, human-in-the-loop experiments, and live flight tests.

  12. 7 CFR 1485.12 - Participation eligibility.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... value of resources provided by CCC for such generic promotion; or (ii) In the case of brand promotion, at least 50 percent of the total cost of such brand promotions. (b) To participate in the EIP/MAP, an entity: (1) Shall be a U.S. commercial entity that either owns the brand(s) of the agricultural commodity...

  13. Wireless Sensor Node Data Gathering and Location Mapping

    DTIC Science & Technology

    2012-03-01

    The indexed excerpt contains only fragments of the report's reference list (citations on an adaptive two-phase approach to WiFi location sensing, 4th Int. Conf. on Pervasive Computing and Communications Workshops, Pisa, Italy, 2006; a firmware image, wrt.v24_micro_generic.bin, March 2010; and P. Asadoorian and L. Pesce, Linksys WRT54G Ultimate Hacking, Burlington, MA: Syngress, 2007); no abstract is available.

  14. Soils [Chapter 4.2

    Treesearch

    Daniel G. Neary; Johannes W. A. Langeveld

    2015-01-01

    Soils are crucial for profitable and sustainable biomass feedstock production. They provide nutrients and water, give support for plants, and provide habitat for enormous numbers of biota. There are several systems for soil classification. FAO has provided a generic classification system that was used for a global soil map (Bot et al., 2000). The USDA Natural Resources...

  15. Cognitive-Operative Model of Intelligent Learning Systems Behavior

    ERIC Educational Resources Information Center

    Laureano-Cruces, Ana Lilia; Ramirez-Rodriguez, Javier; Mora-Torres, Martha; de Arriaga, Fernando; Escarela-Perez, Rafael

    2010-01-01

    In this paper behavior during the teaching-learning process is modeled by means of a fuzzy cognitive map. The elements used to model such behavior are part of a generic didactic model, which emphasizes the use of cognitive and operative strategies as part of the student-tutor interaction. Examples of possible initial scenarios for the…

  16. Design of compound libraries for fragment screening

    NASA Astrophysics Data System (ADS)

    Blomberg, Niklas; Cosgrove, David A.; Kenny, Peter W.; Kolmodin, Karin

    2009-08-01

    Approaches to the design of libraries for fragment screening are illustrated with reference to a 20 k generic fragment screening library and a 1.2 k generic NMR screening library. Tools and methods for library design that have been developed within AstraZeneca are described, including Foyfi fingerprints and the Flush program for neighborhood characterization. It will be shown how Flush and the BigPicker, which selects maximally diverse sets of compounds, are used to apply the Core and Layer method for library design. Approaches to partitioning libraries into cocktails are also described.
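
    Foyfi fingerprints, Flush and the BigPicker are in-house AstraZeneca tools; as an open-source stand-in for maximally diverse selection, the sketch below uses RDKit's MaxMin picker over Morgan fingerprints (the SMILES and subset size are illustrative).

        from rdkit import Chem
        from rdkit.Chem import AllChem
        from rdkit.SimDivFilters.rdSimDivPickers import MaxMinPicker

        smiles = ["c1ccccc1O", "c1ccncc1", "CC(=O)Nc1ccccc1",
                  "O=C(O)c1ccccc1", "c1ccc2[nH]ccc2c1", "CCN(CC)CC"]
        mols = [Chem.MolFromSmiles(s) for s in smiles]
        fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, 2048) for m in mols]

        # pick a maximally diverse subset (3 of 6) by Tanimoto distance
        picks = MaxMinPicker().LazyBitVectorPick(fps, len(fps), 3)
        print(list(picks))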

  17. Enhancing compliance at Department of Defense facilities: comparison of three environmental audit tools.

    PubMed

    Hepler, Jeff A; Neumann, Cathy

    2003-04-01

    To enhance environmental compliance, the U.S. Department of Defense (DOD) recently developed and implemented a standardized environmental audit tool called The Environmental Assessment and Management (TEAM) Guide. Utilization of a common audit tool (TEAM Guide) throughout DOD agencies could be an effective agent of positive change. If, however, the audit tool is inappropriate, environmental compliance at DOD facilities could worsen. Furthermore, existing audit systems such as the U.S. Environmental Protection Agency's (U.S. EPA's) Generic Protocol for Conducting Environmental Audits of Federal Facilities and the International Organization for Standardization's (ISO's) Standard 14001, "Environmental Management System Audits," may be abandoned even if they offer significant advantages over the TEAM Guide audit tool. Widespread use of TEAM Guide should not take place until a thorough and independent evaluation has been performed. The purpose of this paper is to compare DOD's TEAM Guide audit tool with U.S. EPA's Generic Protocol for Conducting Environmental Audits of Federal Facilities and ISO 14001, in order to assess which is most appropriate and effective for DOD facilities, and in particular those operated by the U.S. Army Corps of Engineers (USACE). USACE was selected as a result of one author's recent experience as a district environmental compliance coordinator responsible for the audit mission at this agency. Specific recommendations for enhancing the quality of environmental audits at all DOD facilities also are given.

  18. Awareness Development Across Perspectives Tool (ADAPT)

    DTIC Science & Technology

    2010-10-01

    The indexed excerpt is fragmentary. It indicates that individualist and collectivist cultures are described and linked in the generic knowledge base, along with specific cultural aspects and how they relate to behavior; that the effort focuses on making a tool based on knowledge developed within diverse scientific disciplines (e.g., cultural anthropology, social psychology); and that such work (e.g., psychological operations, humanitarian missions) is performed in a large variety of locations and cultures (e.g., Africa, Asia), requiring a diversity of cultural knowledge.

  19. Optimal health and disease management using spatial uncertainty: a geographic characterization of emergent artemisinin-resistant Plasmodium falciparum distributions in Southeast Asia.

    PubMed

    Grist, Eric P M; Flegg, Jennifer A; Humphreys, Georgina; Mas, Ignacio Suay; Anderson, Tim J C; Ashley, Elizabeth A; Day, Nicholas P J; Dhorda, Mehul; Dondorp, Arjen M; Faiz, M Abul; Gething, Peter W; Hien, Tran T; Hlaing, Tin M; Imwong, Mallika; Kindermans, Jean-Marie; Maude, Richard J; Mayxay, Mayfong; McDew-White, Marina; Menard, Didier; Nair, Shalini; Nosten, Francois; Newton, Paul N; Price, Ric N; Pukrittayakamee, Sasithon; Takala-Harrison, Shannon; Smithuis, Frank; Nguyen, Nhien T; Tun, Kyaw M; White, Nicholas J; Witkowski, Benoit; Woodrow, Charles J; Fairhurst, Rick M; Sibley, Carol Hopkins; Guerin, Philippe J

    2016-10-24

    Artemisinin-resistant Plasmodium falciparum malaria parasites are now present across much of mainland Southeast Asia, where ongoing surveys are measuring and mapping their spatial distribution. These efforts require substantial resources. Here we propose a generic 'smart surveillance' methodology to identify optimal candidate sites for future sampling and thus map the distribution of artemisinin resistance most efficiently. The approach uses the 'uncertainty' map generated iteratively by a geostatistical model to determine optimal locations for subsequent sampling. The methodology is illustrated using recent data on the prevalence of the K13-propeller polymorphism (a genetic marker of artemisinin resistance) in the Greater Mekong Subregion. This methodology, which has broader application to geostatistical mapping in general, could improve the quality and efficiency of drug resistance mapping and thereby guide practical operations to eliminate malaria in affected areas.
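
    A minimal sketch of the 'sample where the model is most uncertain' loop, with a Gaussian-process surrogate from scikit-learn standing in for the authors' geostatistical model; the coordinates and prevalence values are synthetic.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(2)
        # surveyed sites (lon, lat) with observed marker prevalence (synthetic)
        X_obs = rng.uniform([98, 10], [107, 22], size=(25, 2))
        y_obs = rng.uniform(0, 0.6, size=25)

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0),
                                      alpha=1e-2).fit(X_obs, y_obs)

        # evaluate a candidate grid and propose the site with the largest
        # posterior standard deviation as the next survey location
        lon, lat = np.meshgrid(np.linspace(98, 107, 40), np.linspace(10, 22, 40))
        grid = np.column_stack([lon.ravel(), lat.ravel()])
        _, std = gp.predict(grid, return_std=True)
        print("next survey site:", grid[np.argmax(std)])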

  20. External validity of a generic safety climate scale for lone workers across different industries and companies.

    PubMed

    Lee, Jin; Huang, Yueng-hsiang; Robertson, Michelle M; Murphy, Lauren A; Garabet, Angela; Chang, Wen-Ruey

    2014-02-01

    The goal of this study was to examine the external validity of a 12-item generic safety climate scale for lone workers in order to evaluate the appropriateness of generalized use of the scale in the measurement of safety climate across various lone work settings. External validity evidence was established by investigating the measurement equivalence (ME) across different industries and companies. Confirmatory factor analysis (CFA)-based and item response theory (IRT)-based perspectives were adopted to examine the ME of the generic safety climate scale for lone workers across 11 companies from the trucking, electrical utility, and cable television industries. Fairly strong evidence of ME was observed for both organization- and group-level generic safety climate sub-scales. Although significant invariance was observed in the item intercepts across the different lone work settings, absolute model fit indices remained satisfactory in the most robust step of CFA-based ME testing. IRT-based ME testing identified only one differentially functioning item from the organization-level generic safety climate sub-scale, but its impact was minimal and strong ME was supported. The generic safety climate scale for lone workers reported good external validity and supported the presence of a common feature of safety climate among lone workers. The scale can be used as an effective safety evaluation tool in various lone work situations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Information Power Grid (IPG) Tutorial 2003

    NASA Technical Reports Server (NTRS)

    Meyers, George

    2003-01-01

    For NASA and the general community today, Grid middleware: a) provides tools to access/use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); and c) is an enabler of large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. The IPG provides tools that enable the building of application frameworks, and value-added services that let the NASA user base utilize resources on the grid in new and more efficient ways.

  2. Integrating genomics and proteomics data to predict drug effects using binary linear programming.

    PubMed

    Ji, Zhiwei; Su, Jing; Liu, Chenglin; Wang, Hongyan; Huang, Deshuang; Zhou, Xiaobo

    2014-01-01

    The Library of Integrated Network-Based Cellular Signatures (LINCS) project aims to create a network-based understanding of biology by cataloging changes in gene expression and signal transduction that occur when cells are exposed to a variety of perturbations. It is helpful for understanding cell pathways and facilitating drug discovery. Here, we developed a novel approach to infer cell-specific pathways and identify a compound's effects using gene expression and phosphoproteomics data under treatments with different compounds. Gene expression data were employed to infer potential targets of compounds and create a generic pathway map. Binary linear programming (BLP) was then developed to optimize the generic pathway topology based on the mid-stage signaling response of phosphorylation. To demonstrate effectiveness of this approach, we built a generic pathway map for the MCF7 breast cancer cell line and inferred the cell-specific pathways by BLP. The first group of 11 compounds was utilized to optimize the generic pathways, and then 4 compounds were used to identify effects based on the inferred cell-specific pathways. Cross-validation indicated that the cell-specific pathways reliably predicted a compound's effects. Finally, we applied BLP to re-optimize the cell-specific pathways to predict the effects of 4 compounds (trichostatin A, MS-275, staurosporine, and digoxigenin) according to compound-induced topological alterations. Trichostatin A and MS-275 (both HDAC inhibitors) inhibited the downstream pathway of HDAC1 and caused cell growth arrest via activation of p53 and p21; the effects of digoxigenin were totally opposite. Staurosporine blocked the cell cycle via p53 and p21, but also promoted cell growth via activated HDAC1 and its downstream pathway. Our approach was also applied to the PC3 prostate cancer cell line, and the cross-validation analysis showed very good accuracy in predicting effects of 4 compounds. In summary, our computational model can be used to elucidate potential mechanisms of a compound's efficacy.
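
    The paper's BLP formulation is not reproduced in the abstract; the toy sketch below (PuLP) only conveys the flavor: binary variables keep or drop edges of a small pathway so that predicted node activity best matches measured phosphorylation, with source nodes pinned to their measurements. The topology and data are invented.

        from pulp import LpProblem, LpVariable, LpMaximize, lpSum, PULP_CBC_CMD

        # toy pathway: directed edges (source, target); measured activity 0/1
        edges = [("EGFR", "AKT"), ("EGFR", "ERK"), ("HDAC1", "p53"), ("p53", "p21")]
        measured = {"EGFR": 1, "AKT": 1, "ERK": 0, "HDAC1": 0, "p53": 1, "p21": 1}

        prob = LpProblem("pathway_topology", LpMaximize)
        keep = LpVariable.dicts("keep", edges, cat="Binary")         # edge kept?
        act = LpVariable.dicts("act", list(measured), cat="Binary")  # predicted activity

        for v in measured:
            incoming = [e for e in edges if e[1] == v]
            if incoming:
                # a node may be active only if a kept edge arrives from an
                # active parent (parents fixed to their measured values)
                prob += act[v] <= lpSum(keep[e] * measured[e[0]] for e in incoming)
            else:
                prob += act[v] == measured[v]  # source nodes pinned to data

        # objective: maximize agreement between predicted and measured activity
        prob += lpSum(act[v] if measured[v] == 1 else 1 - act[v] for v in measured)
        prob.solve(PULP_CBC_CMD(msg=False))
        print({e: int(keep[e].value()) for e in edges})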

  3. Leverage and Delegation in Developing an Information Model for Geology

    NASA Astrophysics Data System (ADS)

    Cox, S. J.

    2007-12-01

    GeoSciML is an information model and XML encoding developed by a group of primarily geologic survey organizations under the auspices of the IUGS CGI. The scope of the core model broadly corresponds with information traditionally portrayed on a geologic map, viz. interpreted geology, some observations, the map legend and accompanying memoir. The development of GeoSciML has followed the methodology specified for an Application Schema defined by OGC and ISO 19100 series standards. This requires agreement within a community concerning their domain model, its formal representation using UML, documentation as a Feature Type Catalogue, with an XML Schema implementation generated from the model by applying a rule-based transformation. The framework and technology support a modular governance process. Standard datatypes and GI components (geometry, the feature and coverage metamodels, metadata) are imported from the ISO framework. The observation and sampling model (including boreholes) is imported from OGC. The scale used for most scalar literal values (terms, codes, measures) allows for localization where necessary. Wildcards and abstract base-classes provide explicit extensibility points. Link attributes appear in a regular way in the encodings, allowing reference to external resources using URIs. The encoding is compatible with generic GI data-service interfaces (WFS, WMS, SOS). For maximum interoperability within a community, the interfaces may be specialised through domain-specified constraints (e.g. feature-types, scale and vocabulary bindings, query-models). Formalization using UML and XML allows use of standard validation and processing tools. Use of upper-level elements defined for generic GI application reduces the development effort and governance responsibility, while maximising cross-domain interoperability. On the other hand, enabling specialization to be delegated in a controlled manner is essential to adoption across a range of subdisciplines and jurisdictions. The GeoSciML design team is responsible only for the part of the model that is unique to geology but for which general agreement can be reached within the domain. This paper is presented on behalf of the Interoperability Working Group of the IUGS Commission for Geoscience Information (CGI) - follow web-link for details of the membership.

  4. Implementation of structure-mapping inference by event-file binding and action planning: a model of tool-improvisation analogies.

    PubMed

    Fields, Chris

    2011-03-01

    Structure-mapping inferences are generally regarded as dependent upon relational concepts that are understood and expressible in language by subjects capable of analogical reasoning. However, tool-improvisation inferences are executed by members of a variety of non-human primate and other species. Tool improvisation requires correctly inferring the motion and force-transfer affordances of an object; hence tool improvisation requires structure mapping driven by relational properties. Observational and experimental evidence can be interpreted to indicate that structure-mapping analogies in tool improvisation are implemented by multi-step manipulation of event files by binding and action-planning mechanisms that act in a language-independent manner. A functional model of language-independent event-file manipulations that implement structure mapping in the tool-improvisation domain is developed. This model provides a mechanism by which motion and force representations commonly employed in tool-improvisation structure mappings may be sufficiently reinforced to be available to inwardly directed attention and hence conceptualization. Predictions and potential experimental tests of this model are outlined.

  5. Patient-Reported Measures for Person-Centered Coordinated Care: A Comparative Domain Map and Web-Based Compendium for Supporting Policy Development and Implementation

    PubMed Central

    Wheat, Hannah; Horrell, Jane; Sugavanam, Thavapriya; Fosh, Benjamin; Valderas, Jose M

    2018-01-01

    Background Patient-reported measure (PRM) questionnaires were originally used in research to measure outcomes of intervention studies. They have now evolved into a diverse family of tools measuring a range of constructs including quality of life and experiences of care. Current health and social care policy increasingly advocates their use for embedding the patient voice into service redesign through new models of care such as person-centered coordinated care (P3C). If chosen carefully and used efficiently, these tools can help improve care delivery through a variety of novel ways, including system-level feedback for health care management and commissioning. Support and guidance on how to use these tools would be critical to achieve these goals. Objective The objective of this study was to develop evidence-based guidance and support for the use of P3C-PRMs in health and social care policy through identification of PRMs that can be used to enhance the development of P3C, mapping P3C-PRMs against an existing model of domains of P3C, and integration and organization of the information in a user-friendly Web-based database. Methods A pragmatic approach was used for the systematic identification of candidate P3C-PRMs, which aimed at balancing comprehensiveness and feasibility. This utilized a number of resources, including existing compendiums, peer-reviewed and gray literature (using a flexible search strategy), and stakeholder engagement (which included guidance for relevant clinical areas). A subset of those candidate measures (meeting prespecified eligibility criteria) was then mapped against a theoretical model of P3C, facilitating classification of the construct being measured and the subsequent generation of shortlists for generic P3C measures, specific aspects of P3C (eg, communication or decision making), and condition-specific measures (eg, diabetes, cancer) in priority areas, as highlighted by stakeholders. Results In total, 328 P3C-PRMs were identified, which were used to populate a freely available Web-based database. Of these, 63 P3C-PRMs met the eligibility criteria for shortlisting and were classified according to their measurement constructs and mapped against the theoretical P3C model. We identified tools with the best coverage of P3C, thereby providing evidence of their content validity as outcome measures for new models of care. Transitions and medications were 2 areas currently poorly covered by existing measures. All the information is currently available at a user-friendly web-based portal (p3c.org.uk), which includes all relevant information on each measure, such as the constructs targeted and links to relevant literature, in addition to shortlists according to relevant constructs. Conclusions A detailed compendium of P3C-PRMs has been developed using a pragmatic systematic approach supported by stakeholder engagement. Our user-friendly suite of tools is designed to act as a portal to the world of PRMs for P3C, and have utility for a broad audience, including (but not limited to) health care commissioners, managers, and researchers. PMID:29444767

  6. Usability and Functional Enhancements to an Online Interface for Predicting Post Fire Erosion (WEPP-PEP)

    NASA Astrophysics Data System (ADS)

    Lew, Roger; Dobre, Mariana; Elliot, William; Robichaud, Pete; Brooks, Erin; Frankenberger, Jim

    2017-04-01

    There is an increased interest in the United States to use soil burn severity maps in watershed-scale hydrologic models to estimate post-fire sediment erosion from burned areas. This information is needed by stakeholders in order to concentrate their pre- or post-fire management efforts in ecologically sensitive areas to decrease the probability of post-fire sediment delivery. But these tools traditionally have been time-consuming and difficult for managers to use because input datasets must be obtained and correctly processed for valid results. The Water Erosion Prediction Project (WEPP) has previously been developed as an online and easy-to-use interface to help land managers with running simulations without any knowledge of computer programming or hydrologic modeling. The interface automates the acquisition of DEM, climate, soils, and landcover data, and also automates channel and hillslope delineation for the users. The backend is built with Mapserver, GDAL, PHP, C++, and Python, while the front end uses OpenLayers, and, of course, JavaScript. The existing WEPP online interface was enhanced to provide better usability to stakeholders in the United States (Forest Service, BLM, USDA) as well as to provide enhanced functionality for managing both pre-fire and post-fire treatments. Previously, only site administrators could add burn severity maps. The interface now allows users to create accounts to upload and share FlamMap prediction maps, differenced Normalized Burned Ratio (dNBR), or Burned Area Reflectance Classification (BARC) maps. All maps are loaded into a sortable catalog so users can quickly find their area of interest. Once a map is loaded, the interface supports running comparisons between a "no burn" baseline condition and a burn severity classification map. The interface has also been enhanced to allow users to conduct single storm analyses to examine, for example, how much soil loss would result after a 100-year storm. An OpenLayers map allows users to overlay the watershed hillslopes and channels, burn severity, and erosion. The interface provides flowpath results for each hillslope and at the outlet, as well as return period and frequency analysis reports. Once problematic areas have been identified, the interface allows users to export the watershed in a format that can be used by the Erosion Risk Management Tool (ERMiT) and Disturbed WEPP (post-disturbance modeling) for more detailed hillslope-level analyses. Numerous other changes were made to improve the overall usability of the interface: simulations are now allowed in both SI and English units, immovable pop-up dialogs were added to guide the users, and extraneous information was removed from the interface. In upcoming months, a workshop will be conducted to demonstrate these new capabilities to stakeholders. Efforts are underway to use site-specific SSURGO soils that are modified based on burn severity rather than generic soil classes.

  7. Applying open source data visualization tools to standard based medical data.

    PubMed

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The varied backgrounds of patients, especially elderly people, call for simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The use of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.

  8. Visualizing astronomy data using VRML

    NASA Astrophysics Data System (ADS)

    Beeson, Brett; Lancaster, Michael; Barnes, David G.; Bourke, Paul D.; Rixon, Guy T.

    2004-09-01

    Visualisation is a powerful tool for understanding the large data sets typical of astronomical surveys and can reveal unsuspected relationships and anomalous regions of parameter space which may be difficult to find programatically. Visualisation is a classic information technology for optimising scientific return. We are developing a number of generic on-line visualisation tools as a component of the Australian Virtual Observatory project. The tools will be deployed within the framework of the International Virtual Observatory Alliance (IVOA), and follow agreed-upon standards to make them accessible by other programs and people. We and our IVOA partners plan to utilise new information technologies (such as grid computing and web services) to advance the scientific return of existing and future instrumentation. Here we present a new tool - VOlume - which visualises point data. Visualisation of astronomical data normally requires the local installation of complex software, the downloading of potentially large datasets, and very often time-consuming and tedious data format conversions. VOlume enables the astronomer to visualise data using just a web browser and plug-in. This is achieved using IVOA standards which allow us to pass data between Web Services, Java Servlet Technology and Common Gateway Interface programs. Data from a catalogue server can be streamed in eXtensible Mark-up Language format to a servlet which produces Virtual Reality Modeling Language output. The user selects elements of the catalogue to map to geometry and then visualises the result in a browser plug-in such as Cortona or FreeWRL. Other than requiring an input VOTable format file, VOlume is very general. While its major use will likely be to display and explore astronomical source catalogues, it can easily render other important parameter fields such as the sky and redshift coverage of proposed surveys or the sampling of the visibility plane by a rotation-synthesis interferometer.
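
    A minimal sketch of the final step the article describes, turning catalogue rows into a VRML97 point set that a browser plug-in such as Cortona or FreeWRL can render; the coordinate mapping and file name are illustrative.

        def catalogue_to_vrml(rows, path="catalogue.wrl"):
            """Write (x, y, z) points, e.g. RA/Dec/redshift scaled to a
            unit cube, as a VRML97 PointSet."""
            points = ",\n      ".join(f"{x:.4f} {y:.4f} {z:.4f}" for x, y, z in rows)
            vrml = ("#VRML V2.0 utf8\n"
                    "Shape {\n"
                    "  geometry PointSet {\n"
                    "    coord Coordinate {\n"
                    f"      point [ {points} ]\n"
                    "    }\n"
                    "  }\n"
                    "}\n")
            with open(path, "w") as f:
                f.write(vrml)

        catalogue_to_vrml([(0.1, 0.2, 0.3), (0.5, 0.4, 0.9), (0.7, 0.1, 0.2)])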

  9. Basic to Advanced InSAR Processing: GMTSAR

    NASA Astrophysics Data System (ADS)

    Sandwell, D. T.; Xu, X.; Baker, S.; Hogrelius, A.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.

    2017-12-01

    Monitoring crustal deformation using InSAR is becoming a standard technique for the science and application communities. Optimal use of the new data streams from Sentinel-1 and NISAR will require open software tools as well as education on the strengths and limitations of the InSAR methods. Over the past decade we have developed freely available, open-source software for processing InSAR data. The software relies on the Generic Mapping Tools (GMT) for the back-end data analysis and display and is thus called GMTSAR. With startup funding from NSF, we accelerated the development of GMTSAR to include more satellite data sources and provide better integration and distribution with GMT. In addition, with support from UNAVCO we have offered 6 GMTSAR short courses to educate mostly novice InSAR users. Currently, the software is used by hundreds of scientists and engineers around the world to study deformation at more than 4300 different sites. The most challenging aspect of the recent software development was the transition from image alignment using the cross-correlation method to a completely new alignment algorithm that uses only the precise orbital information to geometrically align images to an accuracy of better than 7 cm. This development was needed to process a new data type that is being acquired by the Sentinel-1A/B satellites. This combination of software and open data is transforming radar interferometry from a research tool into a fully operational time series analysis tool. Over the next 5 years we are planning to continue to broaden the user base through: improved software delivery methods; code hardening; better integration with data archives; support for high level products being developed for NISAR; and continued education and outreach.

  10. Mapping WIL Activities in the Curriculum to Develop Graduate Capabilities: A Case Study in Accounting

    ERIC Educational Resources Information Center

    Natoli, Riccardo; Jackling, Beverley; Kaider, Friederika; Clark, Colin

    2013-01-01

    Big business continues to request universities to produce graduates who possess both technical and generic skills. Although work-integrated learning (WIL) programs can be used to develop these skills, WIL placements in Australia are undertaken by a minority of students. Perceiving a gap, one Australian university undertook a major WIL revamp to…

  11. In Search of Museum Professional Knowledge Base: Mapping the Professional Knowledge Debate onto Museum Work

    ERIC Educational Resources Information Center

    Tlili, Anwar

    2016-01-01

    Museum professionalism remains an unexplored area in museum studies, particularly with regard to what is arguably the core generic question of a "sui generis" professional knowledge base, and its necessary and sufficient conditions. The need to examine this question becomes all the more important with the increasing expansion of the…

  12. Mapping a Domain Model and Architecture to a Generic Design

    DTIC Science & Technology

    1994-05-01

    ...this step is a record (in no specific form) of the selected features. This record (list, highlighted features diagram, or other media) is used

  13. System Framework for a Multi-Band, Multi-Mode Software Defined Radio

    DTIC Science & Technology

    2014-06-01

    detection, while the VITA Radio Transport (VRT) protocol over Gigabit Ethernet (GIGE) is implemented for the data interface. In addition to the SoC...

  14. Generic extravehicular (EVA) and telerobot task primitives for analysis, design, and integration. Version 1.0: Reference compilation for the EVA and telerobotics communities

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Drews, Michael

    1990-01-01

    The results of an effort to establish commonality and standardization of generic crew extravehicular (crew-EVA) and telerobotic task analysis primitives used for the study of spaceborne operations are described. Although direct crew-EVA plans are the most visible output of spaceborne operations, significant ongoing efforts by a wide variety of projects and organizations also require tools for estimation of crew-EVA and telerobotic times. Task analysis tools provide estimates for input to technical and cost tradeoff studies. A workshop was convened to identify the issues and needs involved in establishing a common language and syntax for task analysis primitives. In addition, the importance of such a syntax was shown to have precedence over the level to which it is applied. The syntax, lists of crew-EVA and telerobotic primitives, and the data base in diskette form are presented.

  15. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.
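
    A toy sketch of the generative idea follows, under the assumption of a drastically simplified archetype structure (real ISO 13606 archetypes are far richer): walk the archetype's element definitions and emit a platform-neutral form description that a UI layer could render.

```python
# Invented, much-simplified archetype: a name plus a list of typed elements.
archetype = {
    "name": "blood_pressure",
    "elements": [
        {"label": "Systolic", "type": "quantity", "units": "mmHg"},
        {"label": "Diastolic", "type": "quantity", "units": "mmHg"},
    ],
}

def generate_form(archetype):
    """Derive a generic UI form description from the archetype definition."""
    widgets = []
    for el in archetype["elements"]:
        widget = "number_field" if el["type"] == "quantity" else "text_field"
        widgets.append({"widget": widget, "label": el["label"],
                        "suffix": el.get("units", "")})
    return {"form": archetype["name"], "widgets": widgets}

print(generate_form(archetype))
```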

  16. Advanced Robotics for In-Space Vehicle Processing

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Estus, Jay; Heneghan, Cate; Bosley, John

    1990-01-01

    An analysis of spaceborne vehicle processing is described. Generic crew-EVA tasks are presented for a specific vehicle, the orbital maneuvering vehicle (OMV), with general implications to other on-orbit vehicles. The OMV is examined with respect to both servicing and maintenance. Crew-EVA activities are presented by task and mapped to a common set of generic crew-EVA primitives to identify high-demand areas for telerobot services. Similarly, a set of telerobot primitives is presented that can be used to model telerobot actions for alternative telerobot reference configurations. The telerobot primitives are tied to technologies and used for composing telerobot operations for an automated refueling scenario. Telerobotics technology issues and design accommodation guidelines (hooks and scars) for the Space Station Freedom are described.

  17. A visual interface for generic message translation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blattner, M.M.; Kou, L.T.; Carlson, J.W.

    1988-06-21

    This paper is concerned with the translation of data structures we call messages. Messages are an example of a type of data structure encountered in generic data translation. Our objective is to provide a system that the nonprogrammer can use to specify the nature of translations from one type to another. For this reason we selected a visual interface that uses interaction techniques that do not require a knowledge of programming or command languages. The translator must accomplish two tasks: create a mapping between fields in different message types that specifies which fields have similar semantic content, and reformat or translate data specifications within those fields. The translations are accomplished with appropriate, but different, visual metaphors. 14 refs., 4 figs.
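
    A minimal sketch of the two translator tasks, with invented message layouts: a field mapping carries the semantic correspondence between message types, and per-field reformatters translate data specifications within the mapped fields.

```python
# Task 1: field mapping (source field -> target field, matched by semantics).
field_map = {"msg_id": "id", "orig_time": "timestamp"}
# Task 2: per-field reformatting rules for the mapped target fields.
reformatters = {"timestamp": lambda v: v.replace("/", "-")}

def translate(message):
    out = {}
    for src_field, dst_field in field_map.items():
        value = message[src_field]
        fix = reformatters.get(dst_field, lambda v: v)  # identity if no rule
        out[dst_field] = fix(value)
    return out

print(translate({"msg_id": "A17", "orig_time": "1988/06/21"}))
```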

  18. Knowledge maps: a tool for online assessment with automated feedback.

    PubMed

    Ho, Veronica W; Harris, Peter G; Kumar, Rakesh K; Velan, Gary M

    2018-12-01

    In higher education, most assessments or examinations comprise either multiple-choice items or open-ended questions such as modified essay questions (MEQs). Online concept and knowledge maps are potential tools for assessment, which might emphasize meaningful, integrated understanding of phenomena. We developed an online knowledge-mapping assessment tool, which provides automated feedback on student-submitted maps. We conducted a pilot study to investigate the potential utility of online knowledge mapping as a tool for automated assessment by comparing the scores generated by the software with manual grading of a MEQ on the same topic for a cohort of first-year medical students. In addition, an online questionnaire was used to gather students' perceptions of the tool. Map items were highly discriminating between students of differing knowledge of the topic overall. Regression analysis showed a significant correlation between map scores and MEQ scores, and responses to the questionnaire regarding use of knowledge maps for assessment were overwhelmingly positive. These results suggest that knowledge maps provide a similar indication of students' understanding of a topic as a MEQ, with the advantage of instant, consistent computer grading and time savings for educators. Online concept and knowledge maps could be a useful addition to the assessment repertoire in higher education.
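
    One plausible scoring scheme for such automated grading (a guess for illustration only, not the authors' algorithm) is to compare the propositions, i.e. labelled edges, of a student map against a reference map:

```python
# Propositions as (concept, link label, concept) triples; content is invented.
reference = {("asthma", "causes", "airway inflammation"),
             ("inflammation", "treated by", "corticosteroids")}
student = {("asthma", "causes", "airway inflammation"),
           ("asthma", "treated by", "antibiotics")}

matched = reference & student               # propositions the student got right
score = len(matched) / len(reference)
print(f"matched {len(matched)} of {len(reference)} propositions -> {score:.0%}")
```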

  19. On the use of Bayesian decision theory for issuing natural hazard warnings

    NASA Astrophysics Data System (ADS)

    Economou, T.; Stephenson, D. B.; Rougier, J. C.; Neal, R. A.; Mylne, K. R.

    2016-10-01

    Warnings for natural hazards improve societal resilience and are a good example of decision-making under uncertainty. A warning system is only useful if well defined and thus understood by stakeholders. However, most operational warning systems are heuristic: not formally or transparently defined. Bayesian decision theory provides a framework for issuing warnings under uncertainty but has not been fully exploited. Here, a decision theoretic framework is proposed for hazard warnings. The framework allows any number of warning levels and future states of nature, and a mathematical model for constructing the necessary loss functions for both generic and specific end-users is described. The approach is illustrated using one-day ahead warnings of daily severe precipitation over the UK, and compared to the current decision tool used by the UK Met Office. A probability model is proposed to predict precipitation, given ensemble forecast information, and loss functions are constructed for two generic stakeholders: an end-user and a forecaster. Results show that the Met Office tool issues fewer high-level warnings compared with our system for the generic end-user, suggesting the former may not be suitable for risk averse end-users. In addition, raw ensemble forecasts are shown to be unreliable and result in higher losses from warnings.
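
    The decision rule at the heart of such a framework can be sketched in a few lines: issue the warning level that minimises expected loss under the forecast state probabilities. The loss matrix and probabilities below are illustrative, not the paper's fitted values.

```python
import numpy as np

# loss[w, s] = end-user loss when warning level w is issued and state s occurs
loss = np.array([[0.0, 10.0, 50.0],    # no warning
                 [1.0,  2.0, 20.0],    # amber
                 [5.0,  3.0,  4.0]])   # red
p_states = np.array([0.7, 0.2, 0.1])   # forecast P(benign, severe, extreme)

expected_loss = loss @ p_states        # expected loss of each warning level
best_level = int(np.argmin(expected_loss))
print(expected_loss, "-> issue warning level", best_level)
```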

  20. On the use of Bayesian decision theory for issuing natural hazard warnings.

    PubMed

    Economou, T; Stephenson, D B; Rougier, J C; Neal, R A; Mylne, K R

    2016-10-01

    Warnings for natural hazards improve societal resilience and are a good example of decision-making under uncertainty. A warning system is only useful if well defined and thus understood by stakeholders. However, most operational warning systems are heuristic: not formally or transparently defined. Bayesian decision theory provides a framework for issuing warnings under uncertainty but has not been fully exploited. Here, a decision theoretic framework is proposed for hazard warnings. The framework allows any number of warning levels and future states of nature, and a mathematical model for constructing the necessary loss functions for both generic and specific end-users is described. The approach is illustrated using one-day ahead warnings of daily severe precipitation over the UK, and compared to the current decision tool used by the UK Met Office. A probability model is proposed to predict precipitation, given ensemble forecast information, and loss functions are constructed for two generic stakeholders: an end-user and a forecaster. Results show that the Met Office tool issues fewer high-level warnings compared with our system for the generic end-user, suggesting the former may not be suitable for risk averse end-users. In addition, raw ensemble forecasts are shown to be unreliable and result in higher losses from warnings.

  1. On the use of Bayesian decision theory for issuing natural hazard warnings

    PubMed Central

    Stephenson, D. B.; Rougier, J. C.; Neal, R. A.; Mylne, K. R.

    2016-01-01

    Warnings for natural hazards improve societal resilience and are a good example of decision-making under uncertainty. A warning system is only useful if well defined and thus understood by stakeholders. However, most operational warning systems are heuristic: not formally or transparently defined. Bayesian decision theory provides a framework for issuing warnings under uncertainty but has not been fully exploited. Here, a decision theoretic framework is proposed for hazard warnings. The framework allows any number of warning levels and future states of nature, and a mathematical model for constructing the necessary loss functions for both generic and specific end-users is described. The approach is illustrated using one-day ahead warnings of daily severe precipitation over the UK, and compared to the current decision tool used by the UK Met Office. A probability model is proposed to predict precipitation, given ensemble forecast information, and loss functions are constructed for two generic stakeholders: an end-user and a forecaster. Results show that the Met Office tool issues fewer high-level warnings compared with our system for the generic end-user, suggesting the former may not be suitable for risk averse end-users. In addition, raw ensemble forecasts are shown to be unreliable and result in higher losses from warnings. PMID:27843399

  2. Usefulness of the WHOQOL-BREF questionnaire in assessing the quality of life of parents of children with asthma

    PubMed Central

    Roncada, Cristian; Dias, Caroline Pieta; Goecks, Suelen; Cidade, Simone Elenise Falcão; Pitrez, Paulo Márcio Condessa

    2015-01-01

    Objective: To evaluate the quality of life (QOL) of parents of children with asthma and to analyze the internal consistency of the generic QOL tool World Health Organization Quality of Life, abbreviated version (WHOQOL-BREF). Methods: We evaluated the QOL of parents of asthmatic and healthy children aged between 8 and 16, using the generic WHOQOL-BREF questionnaire. We also evaluated the internal consistency using Cronbach's alpha (αC), in order to determine whether the tool had good validity for the target audience. Results: The study included 162 individuals with a mean age of 43.8±13.6 years, of which 104 were female (64.2%) and 128 were married (79.0%). When assessing the QOL, the group of parents of healthy children had higher scores than the group of parents of asthmatic children in the four areas evaluated by the questionnaire (Physical, Psychological Health, Social Relationships and Environment), indicating a better quality of life. Regarding the internal consistency of the WHOQOL-BREF, values of αC were 0.86 for the group of parents of asthmatic children, and 0.88 for the group of parents of healthy children. Conclusions: Parents of children with asthma have impaired quality of life due to their children's disease. Furthermore, the WHOQOL-BREF, even as a generic tool, proved to be practical and efficient for evaluating the quality of life of parents of asthmatic children. © 2015 Sociedade de Pediatria de São Paulo. Published by Elsevier Editora Ltda. All rights reserved.

  3. Analyzing existing conventional soil information sources to be incorporated in thematic Spatial Data Infrastructures

    NASA Astrophysics Data System (ADS)

    Pascual-Aguilar, J. A.; Rubio, J. L.; Domínguez, J.; Andreu, V.

    2012-04-01

    New information technologies make it possible to disseminate spatial information widely, at geographical scales from continental to local, by means of Spatial Data Infrastructures. Growing administrative awareness of the need for open-access information services has also given citizens access to this spatial information through legal instruments such as the INSPIRE Directive of the European Union, adapted into national laws as in the case of Spain. Translating the general criteria of generic Spatial Data Infrastructures (SDI) to thematic ones is a crucial step if these instruments are to progress as large-scale tools for the dissemination of information. In that case, the intrinsic criteria of digital information, such as harmonisation of the information and disclosure of metadata, must be supplemented by the characteristics of the environmental information itself and the techniques employed in obtaining it. For soil inventories and mapping, existing information obtained by traditional means, prior to digital technologies, is considered a valid, and indeed unique, source for the development of thematic SDI. In this work, we evaluate the existing and accessible information that could form the basis of a thematic SDI of soils in Spain; this information framework shares common features with other European Union states. From a set of more than 1,500 publications covering the national territory of Spain, the study examined the documents (94) found for five autonomous regions of the northern Iberian Peninsula (Asturias, Cantabria, Basque Country, Navarra and La Rioja). The analysis was performed taking into account the criteria of soil mapping and inventories. The results show wide variation in almost all criteria: geographic representation (projections, scales) and geo-referencing of profile locations, map location of profiles integrated with edaphic units, description and taxonomic classification systems of soils (FAO, Soil Taxonomy, etc.), the number and type of soil analysis parameters, and the dates of the inventories. In conclusion, the construction of a thematic SDI on soil should, prior to the integration of all maps and inventories, include a series of harmonisation processes that allow spatial continuity between existing information as well as temporal identification of the inventories and maps. This requires the development of at least two types of integration tools: (1) tools enabling spatial continuity without contradictions between maps made at different times and with different criteria, and (2) information systems for data documentation (metadata) that highlight the characteristics of the information and its possibilities of connection with other sources in the Spatial Data Infrastructure. Acknowledgements: This research was financed by the European Union within the framework of the GS Soil project (eContentplus Programme ECP-2008-GEO-318004).

  4. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based, customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.

  5. Concept-Mapping Tools and the Development of Students' Critical-Thinking Skills

    ERIC Educational Resources Information Center

    Tseng, Sheng-Shiang

    2015-01-01

    Developing students' critical-thinking skills has recently received attention at all levels of education. This article proposes the use of concept-mapping tools to improve students' critical-thinking skills. The article introduces a Web-based concept-mapping tool--Popplet--and demonstrates its application for teaching critical-thinking skills in…

  6. NRMRL/TTSD CUSTOMER SATISFACTION FOCUS GROUP

    EPA Science Inventory

    TTB uses a variety of technology transfer products and tools to communicate risk and information about technologies and research. TTB has begun a project to use EPA's generic Customer Satisfaction Survey Information Collection Request (ICR) to determine satisfaction with their pr...

  7. EpiCollect: linking smartphones to web applications for epidemiology, ecology and community data collection.

    PubMed

    Aanensen, David M; Huntley, Derek M; Feil, Edward J; al-Own, Fada'a; Spratt, Brian G

    2009-09-16

    Epidemiologists and ecologists often collect data in the field and, on returning to their laboratory, enter their data into a database for further analysis. The recent introduction of mobile phones that utilise the open source Android operating system, and which include (among other features) both GPS and Google Maps, provides new opportunities for developing mobile phone applications which, in conjunction with web applications, allow two-way communication between field workers and their project databases. Here we describe a generic framework, consisting of mobile phone software, EpiCollect, and a web application located within www.spatialepidemiology.net. Data collected by multiple field workers can be submitted by phone, together with GPS data, to a common web database and can be displayed and analysed, along with previously collected data, using Google Maps (or Google Earth). Similarly, data from the web database can be requested and displayed on the mobile phone, again using Google Maps. Data filtering options allow the display of data submitted by the individual field workers or, for example, those data within certain values of a measured variable or a time period. Data collection frameworks utilising mobile phones with data submission to and from central databases are widely applicable and can give a field worker, on their mobile phone, display and analysis tools similar to those they would have when viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting 'citizen scientists' to contribute data easily to central databases through their mobile phone.
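
    The phone-to-database round trip described above can be sketched as a simple HTTP submission; the endpoint URL and field names below are hypothetical, not EpiCollect's actual API.

```python
# Hedged sketch of a field record, with its GPS fix, POSTed to a project server.
import requests

record = {
    "project": "cholera_survey",   # hypothetical project identifier
    "latitude": -6.7924,           # GPS fix from the phone
    "longitude": 39.2083,
    "measured_value": 42,
}
resp = requests.post("https://example.org/api/submit", json=record, timeout=10)
resp.raise_for_status()
print("server acknowledged record:", resp.json())
```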

  8. The Julia sets of basic uniCremer polynomials of arbitrary degree

    NASA Astrophysics Data System (ADS)

    Blokh, Alexander; Oversteegen, Lex

    Let P be a polynomial of degree d with a Cremer point p and no repelling or parabolic periodic bi-accessible points. We show that there are two types of such Julia sets J_P. The red dwarf J_P are nowhere connected im kleinen and such that the intersection of all impressions of external angles is a continuum containing p and the orbits of all critical images. The solar J_P are such that every angle with dense orbit has a degenerate impression disjoint from other impressions and J_P is connected im kleinen at its landing point. We study bi-accessible points and locally connected models of J_P and show that such sets J_P appear through polynomial-like maps for generic polynomials with Cremer points. Since known tools break down for d>2 (if d>2, it is not known if there are small cycles near p, while if d=2, this result is due to Yoccoz), we introduce wandering ray continua in J_P and provide a new application of Thurston laminations.

  9. Spontaneously broken Yang-Mills-Einstein supergravities as double copies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiodaroli, Marco; Günaydin, Murat; Johansson, Henrik

    Color/kinematics duality and the double-copy construction have proved to be systematic tools for gaining new insight into gravitational theories. Extending our earlier work, in this article we introduce new double-copy constructions for large classes of spontaneously-broken Yang-Mills-Einstein theories with adjoint Higgs fields. One gauge-theory copy entering the construction is a spontaneously-broken (super-)Yang-Mills theory, while the other copy is a bosonic Yang-Mills-scalar theory with trilinear scalar interactions that display an explicitly-broken global symmetry. We show that the kinematic numerators of these gauge theories can be made to obey color/kinematics duality by exhibiting particular additional Lie-algebraic relations. We discuss in detail explicit examples with N = 2 supersymmetry, focusing on Yang-Mills-Einstein supergravity theories belonging to the generic Jordan family in four and five dimensions, and identify the map between the supergravity and double-copy fields and parameters. We also briefly discuss the application of our results to N = 4 supergravity theories. The constructions are illustrated by explicit examples of tree-level and one-loop scattering amplitudes.

  10. Spontaneously broken Yang-Mills-Einstein supergravities as double copies

    DOE PAGES

    Chiodaroli, Marco; Günaydin, Murat; Johansson, Henrik; ...

    2017-06-13

    Color/kinematics duality and the double-copy construction have proved to be systematic tools for gaining new insight into gravitational theories. Extending our earlier work, in this article we introduce new double-copy constructions for large classes of spontaneously-broken Yang-Mills-Einstein theories with adjoint Higgs fields. One gauge-theory copy entering the construction is a spontaneously-broken (super-)Yang-Mills theory, while the other copy is a bosonic Yang-Mills-scalar theory with trilinear scalar interactions that display an explicitly-broken global symmetry. We show that the kinematic numerators of these gauge theories can be made to obey color/kinematics duality by exhibiting particular additional Lie-algebraic relations. We discuss in detail explicit examples with N = 2 supersymmetry, focusing on Yang-Mills-Einstein supergravity theories belonging to the generic Jordan family in four and five dimensions, and identify the map between the supergravity and double-copy fields and parameters. We also briefly discuss the application of our results to N = 4 supergravity theories. The constructions are illustrated by explicit examples of tree-level and one-loop scattering amplitudes.

  11. [HUMAN RESOURCES MANAGEMENT BASED ON COMPETENCIES].

    PubMed

    Larumbe Andueza, Ma Carmen; De Mendoza Cánton, Juana Hermoso

    2016-05-01

    We are living in a time of great change, in which health organizations face ever more challenges. One of them is to recognize, strengthen, develop and retain the talent they have. Competency-based human resources management is emerging as a tool that contributes to achieving that aim. Competencies, from the generic or characteristic perspective, comprise personality traits, values and motivations, which are deeply rooted in the person. By elaborating a competencies map for the organization and identifying the competency profile of each job, above all of key jobs, employees know what is expected of them. Then, by detecting and covering learning needs, a better fit between worker and job can be achieved. The nursing unit manager is a key job because it links the management team and the nursing team. The way this role is performed will have an impact on the quality of care and the motivation of the team. The most adequate person to fill this job is therefore one whose knowledge, skills, attitudes and interests are compatible with it. Competency-based management helps identify both the potential and the learning needs required to perform this job.

  12. Open Technologies at Athabasca University's Geospace Observatories

    NASA Astrophysics Data System (ADS)

    Connors, M. G.; Schofield, I. S.

    2012-12-01

    Athabasca University Geophysical Observatories feature two auroral observation sites situated in the subauroral zone of western Canada, separated by approximately 25 km. These sites are both on high-speed internet and ideal for observing phenomena detectable from this latitude, which include noctilucent clouds, meteors, and magnetic and optical aspects of the aurora. General aspects of use of Linux in observatory management are described, with emphasis on recent imaging projects involving control of high resolution digital SLR cameras at low cadence, and inexpensive white light analog video cameras at 30 Hz. Linux shell scripts are extensively used, with image capture controlled by gphoto2, the ivtv-utils package, x264 video coding library, and ffmpeg. Imagemagick allows processing of images in an automated fashion. Image archives and movies are created and can be correlated with magnetic data. Much of the magnetic data stream also uses GMT (Generic Mapping Tools) within shell scripts for display. Additionally, SPASE metadata are generated for most of the magnetic data, thus allowing users of our AUTUMN magnetic data repository to perform SPASE queries on the dataset. Visualization products from our twin observatories will be presented.
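
    The kind of shell-script automation described above can be sketched in Python via subprocess, assuming the gphoto2 and ffmpeg command-line tools are installed; this mirrors the capture-and-encode pipeline but is not the observatory's actual script.

```python
# Illustrative capture step: trigger a DSLR with gphoto2, then assemble the
# night's frames into a time-lapse with ffmpeg and the x264 encoder.
import subprocess, datetime

stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%S")
# Trigger the camera and download the frame to a time-stamped file.
subprocess.run(["gphoto2", "--capture-image-and-download",
                f"--filename=aurora_{stamp}.jpg"], check=True)
# Encode all frames matching the glob pattern into an H.264 movie.
subprocess.run(["ffmpeg", "-framerate", "25", "-pattern_type", "glob",
                "-i", "aurora_*.jpg", "-c:v", "libx264", f"night_{stamp}.mp4"],
               check=True)
```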

  13. PTMscape: an open source tool to predict generic post-translational modifications and map modification crosstalk in protein domains and biological processes.

    PubMed

    Li, Ginny X H; Vogel, Christine; Choi, Hyungwon

    2018-06-07

    While tandem mass spectrometry can detect post-translational modifications (PTM) at the proteome scale, reported PTM sites are often incomplete and include false positives. Computational approaches can complement these datasets by additional predictions, but most available tools use prediction models pre-trained by the developers for a single PTM type, and it remains a difficult task to perform large-scale batch prediction for multiple PTMs with flexible user control, including the choice of training data. We developed an R package called PTMscape which predicts PTM sites across the proteome based on a unified and comprehensive set of descriptors of the physico-chemical microenvironment of modified sites, with additional downstream analysis modules to test enrichment of individual or pairs of PTMs in protein domains. PTMscape is flexible in the ability to process any major modifications, such as phosphorylation and ubiquitination, while achieving sensitivity and specificity comparable to single-PTM methods and outperforming other multi-PTM tools. Applying this framework, we expanded proteome-wide coverage of five major PTMs affecting different residues by prediction, especially for lysine and arginine modifications. Using a combination of experimentally acquired sites (PSP) and newly predicted sites, we discovered that crosstalk among multiple PTMs occurs more frequently than by random chance in key protein domains such as histone, protein kinase, and RNA recognition motifs, spanning various biological processes such as RNA processing, DNA damage response, signal transduction, and regulation of cell cycle. These results provide a proteome-scale analysis of crosstalk among major PTMs and can be easily extended to other types of PTM.
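
    PTMscape itself is an R package; as a language-neutral illustration of the domain-enrichment test it describes, a hypergeometric test asks whether modified sites fall inside a domain more often than chance would predict. The counts below are made up.

```python
# Sketch of a domain-enrichment test for PTM sites (illustrative counts only).
from scipy.stats import hypergeom

N = 10000   # residues of the relevant type in the proteome
K = 800     # of those, residues lying inside the domain of interest
n = 500     # modified sites observed proteome-wide
k = 90      # modified sites falling inside the domain

# P(X >= k): probability of seeing at least k in-domain sites by chance
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p_value:.3g}")
```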

  14. A life scientist's gateway to distributed data management and computing: the PathPort/ToolBus framework.

    PubMed

    Eckart, J Dana; Sobral, Bruno W S

    2003-01-01

    The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capabilities to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's capabilities of understanding biology and leveraging that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework using two major components. On the server-side, we employ web-services. On the client-side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.

  15. Map based multimedia tool on Pacific theatre in World War II

    NASA Astrophysics Data System (ADS)

    Pakala Venkata, Devi Prasada Reddy

    Maps have been used for depicting data of all kinds in the educational community for many years. One of the most rapidly evolving teaching methods is the development of interactive and dynamic maps. The emphasis of this thesis is to develop an intuitive map-based multimedia tool which provides a timeline of battles and events in the Pacific theatre of World War II. The tool contains summaries of major battles and commanders and has multimedia content embedded in it. Its primary advantage is that one can quickly learn about all the battles and campaigns of the Pacific Theatre by interactively accessing a timeline of battles in each region, individual battles in each region, or a summary of each battle. The tool can be accessed via any standard web browser and motivates the user to learn more about the battles of the Pacific Theatre. It was made responsive using the Google Maps API, JavaScript, HTML5 and CSS.

  16. Comparison of disease-specific quality of life tools in patients with chronic venous disease.

    PubMed

    Kuet, Mong-Loon; Lane, Tristan Ra; Anwar, Muzaffar A; Davies, Alun H

    2014-12-01

    This work was presented as a poster in the American Venous Forum 25th Annual Meeting; 28 February 2013; Phoenix, Arizona, USA. Quality of life (QoL) is an important outcome measure in the treatment for chronic venous disease. The Aberdeen Varicose Vein Questionnaire (AVVQ) and the ChronIc Venous Insufficiency quality of life Questionnaire (CIVIQ-14) are two validated disease-specific QoL questionnaires in current use. The aim of this study is to evaluate the relationship between the AVVQ and the CIVIQ-14 to enable better comparison between studies and to compare these disease-specific QoL tools with generic QoL and clinician-driven tools. Adults attending our institution for management of their varicose veins completed the AVVQ, CIVIQ-14 and EuroQol-5D (EQ-5D). Clinical data, CEAP classification and the Venous Clinical Severity Score (VCSS) were collected. The relationship between the AVVQ and CIVIQ-14 scores was analysed using Spearman's correlation. The AVVQ and CIVIQ-14 scores were also analysed with a generic QoL tool (EQ-5D) and a clinician-driven tool, the VCSS. One hundred patients, mean age 57.5 (44 males; 56 females), participated in the study. The median AVVQ score was 21.9 (range 0-74) and the median CIVIQ-14 score was 30 (range 0-89). A strong correlation was demonstrated between the AVVQ and CIVIQ-14 scores (r = 0.8; p < 0.0001). Strong correlation was maintained for patients with C1-3 disease (r = 0.7; p < 0.0001) and C4-6 disease (r = 0.8; p < 0.0001). The VCSS correlated strongly with the AVVQ and CIVIQ-14 scores (r = 0.7; p < 0.0001 and r = 0.7; p < 0.0001, respectively). Both the AVVQ and CIVIQ-14 scores correlated well with the EQ-5D score (r = -0.5; p < 0.0001 and r = -0.7; p < 0.0001, respectively). This study demonstrates that there is good correlation between two widely used varicose vein specific QoL tools (AVVQ and CIVIQ-14) across the whole spectrum of disease severity. Strong correlation exists between these disease-specific QoL tools and generic and clinician-driven tools. Our findings confirm valid comparisons between studies using either disease-specific QoL tool. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
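
    The core analysis here is a rank correlation between questionnaire scores; a minimal sketch with illustrative score vectors (not the study's data):

```python
# Spearman's rank correlation between two QoL questionnaire scores.
from scipy.stats import spearmanr

avvq = [21.9, 15.0, 40.2, 8.5, 33.1, 12.7]    # illustrative AVVQ scores
civiq = [30.0, 18.0, 55.0, 10.0, 47.0, 20.0]  # illustrative CIVIQ-14 scores

rho, p = spearmanr(avvq, civiq)
print(f"rho = {rho:.2f}, p = {p:.4f}")
```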

  17. Translating global recommendations on HIV and infant feeding to the local context: the development of culturally sensitive counselling tools in the Kilimanjaro Region, Tanzania.

    PubMed

    Leshabari, Sebalda C; Koniz-Booher, Peggy; Astrøm, Anne N; de Paoli, Marina M; Moland, Karen M

    2006-10-03

    This paper describes the process used to develop an integrated set of culturally sensitive, evidence-based counselling tools (job aids) by using qualitative participatory research. The aim of the intervention was to contribute to improving infant feeding counselling services for HIV positive women in the Kilimanjaro Region of Tanzania. Formative research using a combination of qualitative methods preceded the development of the intervention and mapped existing practices, perceptions and attitudes towards HIV and infant feeding (HIV/IF) among mothers, counsellors and community members. Intervention Mapping (IM) protocol guided the development of the overall intervention strategy. Theories of behaviour change, a review of the international HIV/IF guidelines and formative research findings contributed to the definition of performance and learning objectives. Key communication messages and colourful graphic illustrations related to infant feeding in the context of HIV were then developed and/or adapted from existing generic materials. Draft materials were field tested with intended audiences and subjected to stakeholder technical review. An integrated set of infant feeding counselling tools, referred to as 'job aids', was developed and included brochures on feeding methods that were found to be socially and culturally acceptable, a Question and Answer Guide for counsellors, a counselling card on the risk of transmission of HIV, and an infant feeding toolbox for demonstration. Each brochure describes the steps to ensure safer infant feeding using simple language and images based on local ideas and resources. The brochures are meant to serve as both a reference material during infant feeding counselling in the ongoing prevention of mother to child transmission (pMTCT) of HIV programme and as take home material for the mother. The study underscores the importance of formative research and a systematic theory based approach to developing an intervention aimed at improving counselling and changing customary feeding practices. The identification of perceived barriers and facilitators for change contributed to developing the key counselling messages and graphics, reflecting the socio-economic reality, cultural beliefs and norms of mothers and their significant others.

  18. Translating global recommendations on HIV and infant feeding to the local context: the development of culturally sensitive counselling tools in the Kilimanjaro Region, Tanzania

    PubMed Central

    Leshabari, Sebalda C; Koniz-Booher, Peggy; Åstrøm, Anne N; de Paoli, Marina M; Moland, Karen M

    2006-01-01

    Background This paper describes the process used to develop an integrated set of culturally sensitive, evidence-based counselling tools (job aids) by using qualitative participatory research. The aim of the intervention was to contribute to improving infant feeding counselling services for HIV positive women in the Kilimanjaro Region of Tanzania. Methods Formative research using a combination of qualitative methods preceded the development of the intervention and mapped existing practices, perceptions and attitudes towards HIV and infant feeding (HIV/IF) among mothers, counsellors and community members. Intervention Mapping (IM) protocol guided the development of the overall intervention strategy. Theories of behaviour change, a review of the international HIV/IF guidelines and formative research findings contributed to the definition of performance and learning objectives. Key communication messages and colourful graphic illustrations related to infant feeding in the context of HIV were then developed and/or adapted from existing generic materials. Draft materials were field tested with intended audiences and subjected to stakeholder technical review. Results An integrated set of infant feeding counselling tools, referred to as 'job aids', was developed and included brochures on feeding methods that were found to be socially and culturally acceptable, a Question and Answer Guide for counsellors, a counselling card on the risk of transmission of HIV, and an infant feeding toolbox for demonstration. Each brochure describes the steps to ensure safer infant feeding using simple language and images based on local ideas and resources. The brochures are meant to serve as both a reference material during infant feeding counselling in the ongoing prevention of mother to child transmission (pMTCT) of HIV programme and as take home material for the mother. Conclusion The study underscores the importance of formative research and a systematic theory based approach to developing an intervention aimed at improving counselling and changing customary feeding practices. The identification of perceived barriers and facilitators for change contributed to developing the key counselling messages and graphics, reflecting the socio-economic reality, cultural beliefs and norms of mothers and their significant others. PMID:17018140

  19. Map_plot and bgg_plot: software for integration of geoscience datasets

    NASA Astrophysics Data System (ADS)

    Gaillot, Philippe; Punongbayan, Jane T.; Rea, Brice

    2004-02-01

    Since 1985, the Ocean Drilling Program (ODP) has been supporting multidisciplinary research in exploring the structure and history of Earth beneath the oceans. After more than 200 Legs, complementary datasets covering different geological environments, periods and space scales have been obtained and distributed world-wide using the ODP-Janus and Lamont Doherty Earth Observatory-Borehole Research Group (LDEO-BRG) database servers. In Earth Sciences, more than in any other science, the ensemble of these data is characterized by heterogeneous formats and graphical representation modes. In order to fully and quickly assess this information, a set of Unix/Linux and Generic Mapping Tool-based C programs has been designed to convert and integrate datasets acquired during the present ODP and the future Integrated ODP (IODP) Legs. Using ODP Leg 199 datasets, we show examples of the capabilities of the proposed programs. The program map_plot is used to easily display datasets onto 2-D maps. The program bgg_plot (borehole geology and geophysics plot) displays data with respect to depth and/or time. The latter program includes depth shifting, filtering and plotting of core summary information, continuous and discrete-sample core measurements (e.g. physical properties, geochemistry, etc.), in situ continuous logs, magneto- and bio-stratigraphies, specific sedimentological analyses (lithology, grain size, texture, porosity, etc.), as well as core and borehole wall images. Outputs from both programs are initially produced in PostScript format that can be easily converted to Portable Document Format (PDF) or standard image formats (GIF, JPEG, etc.) using widely distributed conversion programs. Based on command line operations and customization of parameter files, these programs can be included in other shell- or database-scripts, automating plotting procedures of data requests. As an open source software, these programs can be customized and interfaced to fulfill any specific plotting need of geoscientists using ODP-like datasets.
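
    map_plot itself is a GMT-based C program; as a hedged modern analogue, the same points-on-a-map task can be sketched with the PyGMT bindings to GMT (the coordinates are invented ODP-style site positions):

```python
# Plot point data on a 2-D map, in the spirit of map_plot, using PyGMT.
import pygmt

fig = pygmt.Figure()
fig.basemap(region=[-160, -130, 0, 20], projection="M12c", frame=True)
fig.coast(shorelines=True)
fig.plot(x=[-150.5, -142.0, -138.3], y=[5.2, 8.9, 12.1],
         style="c0.25c", fill="red", pen="black")  # circles at site positions
fig.savefig("sites.png")
```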

  20. Generic skills in medical education: developing the tools for successful lifelong learning.

    PubMed

    Murdoch-Eaton, Deborah; Whittle, Sue

    2012-01-01

    Higher education has invested in defining the role of generic skills in developing effective, adaptable graduates fit for a changing workplace. Research confirms that the development of generic skills that underpin effectiveness and adaptability in graduates is highly context-dependent and is shaped by the discipline within which these skills are conceptualised, valued and taught. This places the responsibility for generic skills enhancement clearly within the remit of global medical education. Many factors will influence the skill set with which students begin their medical training and experience at entry needs to be taken into account. Learning and teaching environments enhance effective skill development through active learning, teaching for understanding, feedback, and teacher-student and student-student interaction. Medical curricula need to provide students with opportunities to practise and develop their generic skills in a range of discipline-specific contexts. Curricular design should include explicit and integrated generic skills objectives against which students' progress can be monitored. Assessment and feedback serve as valuable reinforcements of the professed importance of generic skills to both learner and teacher, and will encourage students to self-evaluate and take responsibility for their own skill development. The continual need for students to modify their practice in response to changes in their environment and the requirements of their roles will help students to develop the ability to transfer these skills at transition points in their training and future careers. If they are to take their place in an ever-changing profession, medical students need to be competent in the skills that underpin lifelong learning. Only then will the doctors of the future be well placed to adapt to changes in knowledge, update their practice in line with the changing evidence base, and continue to contribute effectively as societal needs change. © Blackwell Publishing Ltd 2012.

  1. A generic, cost-effective, and scalable cell lineage analysis platform

    PubMed Central

    Biezuner, Tamir; Spiro, Adam; Raz, Ofir; Amir, Shiran; Milo, Lilach; Adar, Rivka; Chapal-Ilani, Noa; Berman, Veronika; Fried, Yael; Ainbinder, Elena; Cohen, Galit; Barr, Haim M.; Halaban, Ruth; Shapiro, Ehud

    2016-01-01

    Advances in single-cell genomics enable commensurate improvements in methods for uncovering lineage relations among individual cells. Current sequencing-based methods for cell lineage analysis depend on low-resolution bulk analysis or rely on extensive single-cell sequencing, which is not scalable and could be biased by functional dependencies. Here we show an integrated biochemical-computational platform for generic single-cell lineage analysis that is retrospective, cost-effective, and scalable. It consists of a biochemical-computational pipeline that inputs individual cells, produces targeted single-cell sequencing data, and uses it to generate a lineage tree of the input cells. We validated the platform by applying it to cells sampled from an ex vivo grown tree and analyzed its feasibility landscape by computer simulations. We conclude that the platform may serve as a generic tool for lineage analysis and thus pave the way toward large-scale human cell lineage discovery. PMID:27558250

  2. Generic Raman-based calibration models enabling real-time monitoring of cell culture bioreactors.

    PubMed

    Mehdizadeh, Hamidreza; Lauri, David; Karry, Krizia M; Moshgbar, Mojgan; Procopio-Melino, Renee; Drapeau, Denis

    2015-01-01

    Raman-based multivariate calibration models have been developed for real-time in situ monitoring of multiple process parameters within cell culture bioreactors. Developed models are generic, in the sense that they are applicable to various products, media, and cell lines based on Chinese Hamster Ovary (CHO) host cells, and are scalable to large pilot and manufacturing scales. Several batches using different CHO-based cell lines and corresponding proprietary media and process conditions have been used to generate calibration datasets, and models have been validated using independent datasets from separate batch runs. All models have been validated to be generic and capable of predicting process parameters with acceptable accuracy. The developed models allow monitoring multiple key bioprocess metabolic variables, and hence can be utilized as an important enabling tool for Quality by Design approaches which are strongly supported by the U.S. Food and Drug Administration. © 2015 American Institute of Chemical Engineers.
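
    A common choice for such multivariate calibration is partial least squares regression; the sketch below uses scikit-learn with random stand-in spectra (the abstract does not specify the exact model, so this is an assumption for illustration):

```python
# PLS calibration mapping Raman spectra to a metabolite concentration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))   # 60 spectra x 500 wavenumber channels (fake)
y = X[:, 120] * 2.0 + rng.normal(scale=0.1, size=60)   # synthetic target

pls = PLSRegression(n_components=5)
pls.fit(X[:40], y[:40])                     # calibrate on training batches
print("held-out R^2:", pls.score(X[40:], y[40:]))
```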

  3. Tiered co-payments, pricing, and demand in reference price markets for pharmaceuticals.

    PubMed

    Herr, Annika; Suppliet, Moritz

    2017-12-01

    Health insurance companies curb price-insensitive behavior and the moral hazard of insureds by means of cost-sharing, such as tiered co-payments or reference pricing in drug markets. This paper evaluates the effect of price limits - below which drugs are exempt from co-payments - on prices and on demand. First, using a difference-in-differences estimation strategy, we find that the new policy decreases prices by 5 percent for generics and increases prices by 4 percent for brand-name drugs in the German reference price market. Second, estimating a nested-logit demand model, we show that consumers appreciate co-payment exempt drugs and calculate lower price elasticities for brand-name drugs than for generics. This explains the different price responses of brand-name and generic drugs and shows that price-related co-payment tiers are an effective tool to steer demand to low-priced drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
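
    The difference-in-differences idea can be sketched with a two-way interaction regression; the data frame and effect size below are fabricated for illustration, not the paper's data.

```python
# Difference-in-differences via an interaction term in OLS.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "generic": rng.integers(0, 2, 400),   # 1 = generic drug (treated group)
    "post": rng.integers(0, 2, 400),      # 1 = after the policy change
})
df["log_price"] = 2.0 - 0.05 * df["generic"] * df["post"] \
    + rng.normal(0, 0.1, 400)             # built-in -5% treatment effect

fit = smf.ols("log_price ~ generic * post", data=df).fit()
print(fit.params["generic:post"])         # the difference-in-differences estimate
```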

  4. 3D Scene Reconstruction Using Omnidirectional Vision and LiDAR: A Hybrid Approach

    PubMed Central

    Vlaminck, Michiel; Luong, Hiep; Goeman, Werner; Philips, Wilfried

    2016-01-01

    In this paper, we propose a novel approach to obtain accurate 3D reconstructions of large-scale environments by means of a mobile acquisition platform. The system incorporates a Velodyne LiDAR scanner, as well as a Point Grey Ladybug panoramic camera system. It was designed with genericity in mind, and hence, it does not make any assumption about the scene or about the sensor set-up. The main novelty of this work is that the proposed LiDAR mapping approach deals explicitly with the inhomogeneous density of point clouds produced by LiDAR scanners. To this end, we keep track of a global 3D map of the environment, which is continuously improved and refined by means of a surface reconstruction technique. Moreover, we perform surface analysis on consecutive generated point clouds in order to assure a perfect alignment with the global 3D map. In order to cope with drift, the system incorporates loop closure by determining the pose error and propagating it back in the pose graph. Our algorithm was exhaustively tested on data captured at a conference building, a university campus and an industrial site of a chemical company. Experiments demonstrate that it is capable of generating highly accurate 3D maps in very challenging environments. We can state that the average distance of corresponding point pairs between the ground truth and estimated point cloud approximates one centimeter for an area covering approximately 4000 m². To prove the genericity of the system, it was tested on the well-known KITTI vision benchmark. The results show that our approach competes with state-of-the-art methods without making any additional assumptions. PMID:27854315
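
    The scan-to-map alignment step can be sketched with the Open3D library's ICP registration (an assumed toolchain for illustration, not necessarily the authors'); the file names are placeholders.

```python
# Refine a new LiDAR scan's pose against the accumulated global map with ICP.
import open3d as o3d

source = o3d.io.read_point_cloud("scan_0001.pcd")   # new LiDAR scan
target = o3d.io.read_point_cloud("global_map.pcd")  # accumulated global map

result = o3d.pipelines.registration.registration_icp(
    source, target, max_correspondence_distance=0.5,
    estimation_method=(
        o3d.pipelines.registration.TransformationEstimationPointToPoint()))
print(result.transformation)   # 4x4 pose refining the scan into the map frame
```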

  5. 40 CFR 86.096-38 - Maintenance instructions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and distributing the information, excluding any research and development costs incurred in designing... available through, for example, generic aftermarket tools, a pass-through device, or inexpensive... dealerships, whichever is earlier. The index shall describe the title of the course or instructional session...

  6. 40 CFR 86.1808-01 - Maintenance instructions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and distributing the information, excluding any research and development costs incurred in designing... available through, for example, generic aftermarket tools, a pass-through device, or inexpensive... dealerships, whichever is earlier. The index shall describe the title of the course or instructional session...

  7. MOLECULAR MODELING AS A TOOL FOR UNDERSTANDING HUMAN HEALTH RISKS

    EPA Science Inventory

    A generic step in many mechanisms for chemical toxicity is the interaction between a small molecule and a biological macromolecule. The information that is gathered from this study will then be used to extract relationships among the information domains.

  8. Sinking Maps: A Conceptual Tool for Visual Metaphor

    ERIC Educational Resources Information Center

    Giampa, Joan Marie

    2012-01-01

    Sinking maps, created by Northern Virginia Community College professor Joan Marie Giampa, are tools that teach fine art students how to construct visual metaphor by conceptually mapping sensory perceptions. Her dissertation answers the question, "Can visual metaphor be conceptually mapped in the art classroom?" In the Prologue, Giampa…

  9. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. Recently, this map information has been distributed in accordance with Web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former was implemented using OpenLayers; the latter was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying a WMTS layer in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, available WMTS layers are limited to the ones distributed by WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is one of the georeferenced raster formats available in many kinds of Geographical Information System. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study would contribute to creating better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file formats.
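
    The cutting function can be sketched with GDAL's Python bindings, assuming the tile layers have been assembled into a georeferenced raster; the paths and bounding box below are illustrative.

```python
# Clip a georeferenced map image to a bounding box and save it as GeoTIFF.
from osgeo import gdal

# projWin is [ulx, uly, lrx, lry] in the raster's coordinate system
gdal.Translate("clip.tif", "seamless_geology.tif",
               projWin=[139.5, 36.0, 140.5, 35.0], format="GTiff")
```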

  10. Common Accounting System for Monitoring the ATLAS Distributed Computing Resources

    NASA Astrophysics Data System (ADS)

    Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration

    2014-06-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  11. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation.

    PubMed

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications.

  12. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation

    PubMed Central

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Objects: Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. Methods: The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. Results: The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Conclusions: Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications. PMID:27199729

  13. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    ERIC Educational Resources Information Center

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  14. Construct Maps: A Tool to Organize Validity Evidence

    ERIC Educational Resources Information Center

    McClarty, Katie Larsen

    2013-01-01

    The construct map is a promising tool for organizing the data standard-setting panelists interpret. The challenge in applying construct maps to standard-setting procedures will be the judicious selection of data to include within this organizing framework. Therefore, this commentary focuses on decisions about what to include in the construct map.…

  15. SGML Authoring Tools for Technical Communication.

    ERIC Educational Resources Information Center

    Davidson, W. J.

    1993-01-01

    Explains that structured authoring systems designed for the creation of generically encoded reusable information have context-sensitive application of markup, markup suppression, queuing and automated formatting, structural navigation, and self-validation features. Maintains that they are a real alternative to conventional publishing systems. (SR)

  16. Knowledge mobilisation in healthcare: a critical review of health sector and generic management literature.

    PubMed

    Ferlie, Ewan; Crilly, Tessa; Jashapara, Ashok; Peckham, Anna

    2012-04-01

    The health policy domain has displayed increasing interest in questions of knowledge management and knowledge mobilisation within healthcare organisations. We analyse here the findings of a critical review of generic management and health-related literatures, covering the period 2000-2008. Using 29 pre-selected journals, supplemented by a search of selected electronic databases, we map twelve substantive domains classified into four broad groups: taxonomic and philosophical (e.g. different types of knowledge); theoretical discourse (e.g. critical organisational studies); disciplinary fields (e.g. organisational learning and Information Systems/Information Technology); and organisational processes and structures (e.g. organisational form). We explore cross-overs and gaps between these traditionally separate literature streams. We found that health sector literature has absorbed some generic concepts, notably Communities of Practice, but has not yet deployed the performance-oriented perspective of the Resource Based View (RBV) of the Firm. The generic literature uses healthcare sites to develop critical analyses of power and control in knowledge management, rooted in neo-Marxist/labour process and Foucauldian approaches. The review generates three theoretically grounded statements to inform future enquiry, by: (a) importing the RBV stream; (b) developing the critical organisational studies perspective further; and (c) exploring the theoretical argument that networks and other alternative organisational forms facilitate knowledge sharing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Assessment of wear dependence parameters in complex model of cutting tool wear

    NASA Astrophysics Data System (ADS)

    Antsev, A. V.; Pasko, N. I.; Antseva, N. V.

    2018-03-01

    This paper addresses the wear dependence of the generic efficient life period of cutting tools, treated as an aggregate of the tool wear rate distribution law and the dependence of that law's parameters on the cutting mode, factoring in randomness, as exemplified by a complex model of wear. The complex model of wear takes into account the variance of cutting properties within one batch of tools, the variance in machinability within one batch of workpieces, and the stochastic nature of the wear process itself. A technique for assessing the wear dependence parameters in a complex model of cutting tool wear is provided, and it is supported by a numerical example.
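
    The record gives no equations, so as a generic orientation only: the deterministic backbone that stochastic wear models of this kind randomize is the extended Taylor tool-life relation. The form below is a textbook illustration in my own notation, not the authors' model.

        % Extended Taylor tool-life relation (generic textbook illustration)
        T \;=\; \frac{C_T}{v^{\,x}\, f^{\,y}\, a_p^{\,z}}

    Here T is tool life, v cutting speed, f feed, a_p depth of cut, and C_T, x, y, z empirical constants fitted per tool-workpiece pair. A stochastic model of the kind described then treats the parameters of the wear-rate law as random variables, capturing the tool-batch and workpiece-batch variance the abstract mentions.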

  18. A Machine-Learning-Driven Sky Model.

    PubMed

    Satylmys, Pynar; Bashford-Rogers, Thomas; Chalmers, Alan; Debattista, Kurt

    2017-01-01

    Sky illumination is responsible for much of the lighting in a virtual environment. A machine-learning-based approach can compactly represent sky illumination from both existing analytic sky models and from captured environment maps. The proposed approach can approximate the captured lighting at a significantly reduced memory cost and enable smooth transitions of sky lighting to be created from a small set of environment maps captured at discrete times of day. The authors' results demonstrate accuracy close to the ground truth for both analytical and capture-based methods. The approach has a low runtime overhead, so it can be used as a generic approach for both offline and real-time applications.

  19. Sustainable development induction in organizations: a convergence analysis of ISO standards management tools' parameters.

    PubMed

    Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar

    2012-01-01

    Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among which ISO standards are prominent. Although there is evidence of the contributions provided by these standards, it is questionable whether their parameters converge in a way that could induce sustainable development in organizations. This work presents a theoretical study, grounded in a structuralist worldview and a descriptive, deductive method, which aims to analyze the convergence of management tool parameters in ISO standards. To support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000 and ISO 26000) with sustainable development and positioning them according to organizational levels (strategic, tactical and operational). The framework was designed around the Brundtland report's concept of sustainable development. The analysis explored the generic framework for possible convergence using the Nadler and Tushman model. The results indicate that the standards can contribute to inducing sustainable development in organizations, as long as they meet certain minimum conditions related to their strategic alignment.

  20. A screening tool to prioritize public health risk associated with accidental or deliberate release of chemicals into the atmosphere

    PubMed Central

    2013-01-01

    The Chemical Events Working Group of the Global Health Security Initiative has developed a flexible screening tool for chemicals that present a risk when accidentally or deliberately released into the atmosphere. The tool is generic, semi-quantitative, independent of site, situation and scenario, encompasses all chemical hazards (toxicity, flammability and reactivity), and can be easily and quickly implemented by non-subject matter experts using freely available, authoritative information. Public health practitioners and planners can use the screening tool to assist them in directing their activities in each of the five stages of the disaster management cycle. PMID:23517410

  1. A Generic Evaluation Model for Semantic Web Services

    NASA Astrophysics Data System (ADS)

    Shafiq, Omair

    Semantic Web Services research has gained momentum over the last few years, and by now several realizations exist. They are being used in a number of industrial use-cases. Soon software developers will be expected to use this infrastructure to build their B2B applications requiring dynamic integration. However, there is still a lack of guidelines for the evaluation of tools developed to realize Semantic Web Services and applications built on top of them. In normal software engineering practice such guidelines can already be found for traditional component-based systems. Also some efforts are being made to build performance models for service-based systems. Drawing on these related efforts in component-oriented and service-based systems, we identified the need for a generic evaluation model for Semantic Web Services applicable to any realization. The generic evaluation model will help users and customers to orient their systems and solutions towards using Semantic Web Services. In this chapter, we have presented the requirements for the generic evaluation model for Semantic Web Services and further discussed the initial steps that we took to sketch such a model. Finally, we discuss related activities for evaluating semantic technologies.

  2. Engineering With Nature Geographic Project Mapping Tool (EWN ProMap)

    DTIC Science & Technology

    2015-07-01

    EWN ProMap database provides numerous case studies for infrastructure projects such as breakwaters, river engineering dikes, and seawalls that have...the EWN Project Mapping Tool (EWN ProMap) is to assist users in their search for case study information that can be valuable for developing EWN ideas...Essential elements of EWN include: (1) using science and engineering to produce operational efficiencies supporting sustainable delivery of

  3. SU-E-T-365: Estimation of Neutron Ambient Dose Equivalents for Radioprotection Exposed Workers in Radiotherapy Facilities Based On Characterization Patient Risk Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Irazola, L; Terron, J; Sanchez-Doblado, F

    2015-06-15

    Purpose: Previous measurements with Bonner spheres [1] showed that normalized neutron spectra are essentially identical for the majority of existing linacs [2]. This information, in addition to the thermal neutron fluences obtained in the characterization procedure [3], would allow estimation of neutron doses accidentally received by exposed workers without the need for an extra experimental measurement. Methods: Monte Carlo (MC) simulations demonstrated that the thermal neutron fluence distribution inside the bunker is quite uniform, as a consequence of multiple scattering in the walls [4]. Although the inverse-square law is approximately valid for the fast component, a more precise calculation can be obtained with a generic fast-fluence distribution map around the linac, derived from MC simulations [4]. Thus, measurements of thermal neutron fluences performed during the characterization procedure [3], together with a generic unitary spectrum [2], allow estimation of the total neutron fluence and H*(10) at any point [5]. As an example, we compared estimates with Bonner sphere measurements [1] at two points in five facilities: three Siemens (15–23 MV), one Elekta (15 MV) and one Varian (15 MV). Results: Thermal neutron fluences obtained from characterization are within (0.2–1.6)×10⁶ cm⁻²·Gy⁻¹ for the five studied facilities. This implies ambient dose equivalents ranging from 0.27–2.01 mSv/Gy at 50 cm from the isocenter and 0.03–0.26 mSv/Gy at the detector location, with an average deviation of ±12.1% with respect to the Bonner measurements. Conclusion: These results demonstrate that neutron fluence and H*(10) can be estimated based on: (a) the characterization procedure established for patient risk estimation in each facility, (b) a generic unitary neutron spectrum, and (c) a generic MC map of the fast component. [1] Radiat. Meas. (2010) 45:1391–1397; [2] Phys. Med. Biol. (2012) 57:6167–6191; [3] Med. Phys. (2015) 42:276–281; [4] IFMBE (2012) 39:1245–1248; [5] ICRU Report 57 (1998).
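
    A schematic reading of the estimation chain described above (my paraphrase and symbols, not the authors' notation): the measured thermal fluence per unit photon dose scales a generic unit spectrum, and fluence-to-ambient-dose-equivalent conversion coefficients from ICRU Report 57 [5] turn the resulting spectrum into H*(10).

        % Schematic estimation chain (paraphrase, not the authors' formulas)
        \Phi_E(E) \;=\; \Phi_{\mathrm{th}}\,\hat{\varphi}(E)
        \qquad\Longrightarrow\qquad
        H^{*}(10) \;=\; \int h^{*}_{\Phi}(E)\,\Phi_E(E)\,\mathrm{d}E

    where \Phi_{\mathrm{th}} is the measured thermal fluence per unit dose, \hat{\varphi}(E) the generic unitary spectrum, and h^{*}_{\Phi}(E) the tabulated fluence-to-ambient-dose-equivalent conversion coefficients.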

  4. Using Generalizability Theory to Examine Different Concept Map Scoring Methods

    ERIC Educational Resources Information Center

    Cetin, Bayram; Guler, Nese; Sarica, Rabia

    2016-01-01

    Problem Statement: In addition to being teaching tools, concept maps can be used as effective assessment tools. The use of concept maps for assessment has raised the issue of scoring them. Concept maps generated and used in different ways can be scored via various methods. Holistic and relational scoring methods are two of them. Purpose of the…

  5. Development of a Competency Mapping Tool for Undergraduate Professional Degree Programmes, Using Mechanical Engineering as a Case Study

    ERIC Educational Resources Information Center

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that…

  6. An Experiment in Mind-Mapping and Argument-Mapping: Tools for Assessing Outcomes in the Business Curriculum

    ERIC Educational Resources Information Center

    Gargouri, Chanaz; Naatus, Mary Kate

    2017-01-01

    Distinguished from other teaching-learning tools, such as mind and concept mapping in which students draw pictures and concepts and show relationships and correlation between them to demonstrate their own understanding of complex concepts, argument mapping is used to demonstrate clarity of reasoning, based on supporting evidence, and come to a…

  7. Teachers' Perceptions of Esri Story Maps as Effective Teaching Tools

    ERIC Educational Resources Information Center

    Strachan, Caitlin; Mitchell, Jerry

    2014-01-01

    The current study explores teachers' perceptions of Esri Story Maps as effective teaching tools. Story Maps are a relatively new web application created using Esri's cloud-based GIS platform, ArcGIS Online. They combine digitized, dynamic web maps with other story elements to help the creator effectively convey a message. The relative ease…

  8. Problem-solving tools for analyzing system problems. The affinity map and the relationship diagram.

    PubMed

    Lepley, C J

    1998-12-01

    The author describes how to use two management tools, an affinity map and a relationship diagram, to define and analyze aspects of a complex problem in a system. The affinity map identifies the key influencing elements of the problem, whereas the relationship diagram helps to identify the area that is the most important element of the issue. Managers can use the tools to draw a map of problem drivers, graphically display the drivers in a diagram, and use the diagram to develop a cause-and-effect relationship.

  9. Advanced EVA system design requirements study

    NASA Technical Reports Server (NTRS)

    Woods, T. G.

    1988-01-01

    The results are presented of a study to identify specific criteria regarding space station extravehicular activity system (EVAS) hardware requirements. Key EVA design issues include maintainability, technology readiness, LSS volume vs. EVA time available, suit pressure/cabin pressure relationship and productivity effects, crew autonomy, integration of EVA as a program resource, and standardization of task interfaces. A variety of DOD EVA systems issues were taken into consideration. Recommendations include: (1) crew limitations, not hardware limitations; (2) capability to perform all of 15 generic missions; (3) 90 days on-orbit maintainability with 50 percent duty cycle as minimum; and (4) use by payload sponsors of JSC document 10615A plus a Generic Tool Kit and Specialized Tool Kit description. EVA baseline design requirements and criteria, including requirements of various subsystems, are outlined. Space station/EVA system interface requirements and EVA accommodations are discussed in the areas of atmosphere composition and pressure, communications, data management, logistics, safe haven, SS exterior and interior requirements, and SS airlock.

  10. Modeling Zone-3 Protection with Generic Relay Models for Dynamic Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vyakaranam, Bharat GNVSR; Diao, Ruisheng

    This paper presents a cohesive approach for calculating and coordinating the settings of multiple zone-3 protections for dynamic contingency analysis. The zone-3 protections are represented by generic distance relay models. A two-step approach for determining zone-3 relay settings is proposed. The first step is to calculate settings, particularly the reach, of each zone-3 relay individually by iteratively running line open-end fault short circuit analysis; the blinder is also employed and properly set to meet the industry standard under extreme loading conditions. The second step is to systematically coordinate the protection settings of the zone-3 relays. The main objective of this coordination step is to address the over-reaching issues. We have developed a tool to automate the proposed approach and generate the settings of all distance relays in a PSS/E dyr format file. The calculated zone-3 settings have been tested on a modified IEEE 300 system using a dynamic contingency analysis tool (DCAT).
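
    The paper computes reach by iterative open-end-fault short-circuit analysis; as a much cruder orientation, a common textbook heuristic sets the zone-3 reach to cover the protected line plus the longest adjacent line with a margin. The Python sketch below implements only that heuristic (impedance values invented), not the paper's procedure.

        def zone3_reach(z_line, z_adjacent, margin=1.2):
            """Textbook heuristic: protected line + margin * longest adjacent line."""
            return z_line + margin * max(z_adjacent)

        # Hypothetical per-line positive-sequence impedance magnitudes (ohms).
        print(zone3_reach(8.0, [5.5, 12.3, 9.1]))  # 8.0 + 1.2*12.3 = 22.76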

  11. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.

  12. Tools for model-building with cryo-EM maps

    DOE PAGES

    Terwilliger, Thomas Charles

    2018-01-01

    There are new tools available to you in Phenix for interpreting cryo-EM maps. You can automatically sharpen (or blur) a map with phenix.auto_sharpen and you can segment a map with phenix.segment_and_split_map. If you have overlapping partial models for a map, you can merge them with phenix.combine_models. If you have a protein-RNA complex and protein chains have been accidentally built in the RNA region, you can try to remove them with phenix.remove_poor_fragments. You can put these together and automatically sharpen, segment and build a map with phenix.map_to_model.
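
    A minimal usage sketch in Python, chaining the command-line tools named above via subprocess. The tool names come from the record itself; the argument forms (file names and ordering) are assumptions for illustration, so consult the Phenix documentation for the actual parameter names.

        # Hypothetical pipeline built from the Phenix tools named in the record.
        # Argument forms are assumed, not verified against the Phenix docs.
        import subprocess

        def run(cmd):
            """Run one Phenix command, echoing it first; stop on failure."""
            print("running:", " ".join(cmd))
            subprocess.run(cmd, check=True)

        run(["phenix.auto_sharpen", "my_map.mrc"])                  # sharpen (or blur) the map
        run(["phenix.segment_and_split_map", "my_map.mrc"])         # cut the map into regions
        run(["phenix.combine_models", "part_a.pdb", "part_b.pdb"])  # merge partial models
        run(["phenix.map_to_model", "my_map.mrc", "seq.fasta"])     # sharpen+segment+build in one step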

  13. Tools for model-building with cryo-EM maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terwilliger, Thomas Charles

    There are new tools available to you in Phenix for interpreting cryo-EM maps. You can automatically sharpen (or blur) a map with phenix.auto_sharpen and you can segment a map with phenix.segment_and_split_map. If you have overlapping partial models for a map, you can merge them with phenix.combine_models. If you have a protein-RNA complex and protein chains have been accidentally built in the RNA region, you can try to remove them with phenix.remove_poor_fragments. You can put these together and automatically sharpen, segment and build a map with phenix.map_to_model.

  14. ASSIP Study of Real-Time Safety-Critical Embedded Software-Intensive System Engineering Practices

    DTIC Science & Technology

    2008-02-01

    Topics include assessment, product engineering processes, and tooling processes. Process standards discussed include IEC/ISO 12207 (software life-cycle processes), IEC/ISO 15026 (system and software integrity levels), generic safety standards, and SAE ARP 4754 (certification considerations). Process frameworks in revision at the time included ISO 9001 and ISO 9004, the ISO 15288/ISO 12207 harmonization, RTCA DO-178B, and UK MOD Standard 00-56/3, along with associated methods and tools.

  15. The Small Body Mapping Tool (SBMT) for Accessing, Visualizing, and Analyzing Spacecraft Data in Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barnouin, O. S.; Ernst, C. M.; Daly, R. T.

    2018-04-01

    The free, publicly available Small Body Mapping Tool (SBMT) developed at the Johns Hopkins University Applied Physics Laboratory is a powerful, easy-to-use tool for accessing and analyzing data from small bodies.

  16. The Synthesis Map Is a Multidimensional Educational Tool That Provides Insight into Students' Mental Models and Promotes Students' Synthetic Knowledge Generation

    ERIC Educational Resources Information Center

    Ortega, Ryan A.; Brame, Cynthia J.

    2015-01-01

    Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping,…

  17. Synopsis of the cyclocephaline scarab beetles (Coleoptera, Scarabaeidae, Dynastinae)

    PubMed Central

    Moore, Matthew R.; Cave, Ronald D.; Branham, Marc A.

    2018-01-01

    Abstract The cyclocephaline scarabs (Scarabaeidae: Dynastinae: Cyclocephalini) are a speciose tribe of beetles that include species that are ecologically and economically important as pollinators and pests of agriculture and turf. We provide an overview and synopsis of the 14 genera of Cyclocephalini that includes information on: 1) the taxonomic and nomenclatural history of the group; 2) diagnosis and identification of immature life-stages; 3) economic importance in agroecosystems; 4) natural enemies of these beetles; 5) use as food by humans; 6) the importance of adults as pollination mutualists; 7) fossil cyclocephalines and the evolution of the group; 8) generic-level identification of adults. We provide an expanded identification key to genera of world Cyclocephalini and diagnoses for each genus. Character illustrations and generic-level distribution maps are provided along with discussions on the relationships of the tribe’s genera. PMID:29670448

  18. Synopsis of the cyclocephaline scarab beetles (Coleoptera, Scarabaeidae, Dynastinae).

    PubMed

    Moore, Matthew R; Cave, Ronald D; Branham, Marc A

    2018-01-01

    The cyclocephaline scarabs (Scarabaeidae: Dynastinae: Cyclocephalini) are a speciose tribe of beetles that include species that are ecologically and economically important as pollinators and pests of agriculture and turf. We provide an overview and synopsis of the 14 genera of Cyclocephalini that includes information on: 1) the taxonomic and nomenclatural history of the group; 2) diagnosis and identification of immature life-stages; 3) economic importance in agroecosystems; 4) natural enemies of these beetles; 5) use as food by humans; 6) the importance of adults as pollination mutualists; 7) fossil cyclocephalines and the evolution of the group; 8) generic-level identification of adults. We provide an expanded identification key to genera of world Cyclocephalini and diagnoses for each genus. Character illustrations and generic-level distribution maps are provided along with discussions on the relationships of the tribe's genera.

  19. Administrative Job Level Study and Factoring System.

    ERIC Educational Resources Information Center

    Portland Community Coll., OR.

    The administrative job classification system and generic job descriptions presented in this report were developed at Portland Community College (PCC) as management tools. After introductory material outlining the objectives of and criteria used in the administrative job-level study, and offering information on the administrative job factoring…

  20. The Communication Audit as a Library Management Tool.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.; Bunge, Charles A.

    1987-01-01

    Discusses the relationship between effective organizational communication and specific library contexts and reviews the historical development of communication audits as a means of studying communication effectiveness. The generic structure of such audits is described and two models are presented in detail, including techniques for gathering and…

  1. Registration of 4D time-series of cardiac images with multichannel Diffeomorphic Demons.

    PubMed

    Peyrat, Jean-Marc; Delingette, Hervé; Sermesant, Maxime; Pennec, Xavier; Xu, Chenyang; Ayache, Nicholas

    2008-01-01

    In this paper, we propose a generic framework for intersubject non-linear registration of 4D time-series images. In this framework, spatio-temporal registration is defined by mapping trajectories of physical points as opposed to spatial registration that solely aims at mapping homologous points. First, we determine the trajectories we want to register in each sequence using a motion tracking algorithm based on the Diffeomorphic Demons algorithm. Then, we perform simultaneously pairwise registrations of corresponding time-points with the constraint to map the same physical points over time. We show this trajectory registration can be formulated as a multichannel registration of 3D images. We solve it using the Diffeomorphic Demons algorithm extended to vector-valued 3D images. This framework is applied to the inter-subject non-linear registration of 4D cardiac CT sequences.
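
    Schematically, "multichannel" here means the demons similarity term sums over the N corresponding time-points, all warped by one common transformation. The functional below is a paraphrase under that reading, in my notation, not the authors' exact formulation.

        % Schematic multichannel similarity criterion (paraphrase)
        E(c) \;=\; \sum_{i=1}^{N} \frac{1}{\sigma_i^{2}}\,\bigl\| F_i - M_i \circ c \bigr\|^{2} \;+\; \mathrm{Reg}(c)

    where F_i and M_i are the i-th channel (time-point) of the fixed and moving 4D sequences, c is the shared diffeomorphic transformation optimized by the demons iterations, and Reg(c) is the regularization term.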

  2. SocialMood: an information visualization tool to measure the mood of the people in social networks

    NASA Astrophysics Data System (ADS)

    Amorim, Guilherme; Franco, Roberto; Moraes, Rodolfo; Figueiredo, Bruno; Miranda, João.; Dobrões, José; Afonso, Ricardo; Meiguins, Bianchi

    2013-12-01

    Set in the arena of social networks, the tool developed in this study aims to identify mood trends among undergraduate students. Drawing on the Self-Assessment Manikin (SAM) methodology, which originated in the field of psychology, the system filters content provided on the Web and isolates certain words, establishing a range of values perceived as positive, negative or neutral. A big-data back end summarizes the results, assisting in the construction and visualization of generic behavioral profiles and providing a guideline for the development of information visualization tools for social networks.
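
    As a generic illustration of the word-filtering step described (not the SocialMood implementation; the lexicon and scores below are invented), a minimal valence-lexicon scorer in Python:

        # Minimal sketch of lexicon-based mood scoring; entries are invented.
        LEXICON = {"happy": 1, "great": 1, "calm": 1,
                   "sad": -1, "tired": -1, "angry": -1}

        def mood(text: str) -> str:
            """Bucket a text as positive / negative / neutral by summed word valence."""
            score = sum(LEXICON.get(word.strip(".,!?").lower(), 0)
                        for word in text.split())
            return "positive" if score > 0 else "negative" if score < 0 else "neutral"

        print(mood("Great day, feeling happy!"))  # positive
        print(mood("So tired and sad today."))    # negative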

  3. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools, how to use them, when they should be used (and not used), and the intended purposes for their use. In addition, the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate their application in health care settings.
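
    Of the tools listed, FMEA is the most directly numerical: each failure mode is scored for severity, occurrence and detectability, and ranked by their product, the Risk Priority Number (RPN). A minimal sketch in Python (the failure modes and 1–10 scores below are invented for illustration):

        # Rank invented radiotherapy failure modes by RPN = S * O * D.
        failure_modes = [
            # (description, severity, occurrence, detectability)
            ("wrong patient selected at console", 9, 2, 4),
            ("couch position entered manually",   6, 4, 3),
            ("plan transferred without check",    8, 3, 5),
        ]

        scored = [(desc, s * o * d) for desc, s, o, d in failure_modes]
        for desc, rpn in sorted(scored, key=lambda x: -x[1]):
            print(f"RPN {rpn:4d}  {desc}")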

  4. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool.

    PubMed

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-06-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13-17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences.

  5. Transition Flight Control Room Automation

    NASA Technical Reports Server (NTRS)

    Welborn, Curtis Ray

    1990-01-01

    The Workstation Prototype Laboratory is currently working on a number of projects which we feel can have a direct impact on ground operations automation. These projects include: The Fuel Cell Monitoring System (FCMS), which will monitor and detect problems with the fuel cells on the Shuttle. FCMS will use a combination of rules (forward/backward) and multi-threaded procedures which run concurrently with the rules, to implement the malfunction algorithms of the EGIL flight controllers. The combination of rule-based reasoning and procedural reasoning allows us to more easily map the malfunction algorithms into a real-time system implementation. A graphical computation language (AGCOMPL). AGCOMPL is an experimental prototype to determine the benefits and drawbacks of using a graphical language to design computations (algorithms) to work on Shuttle or Space Station telemetry and trajectory data. The design of a system which will allow a model of an electrical system, including telemetry sensors, to be configured on the screen graphically using previously defined electrical icons. This electrical model would then be used to generate rules and procedures for detecting malfunctions in the electrical components of the model. A generic message management (GMM) system. GMM is being designed as a message management system for real-time applications which send advisory messages to a user. The primary purpose of GMM is to reduce the risk of overloading a user with information when multiple failures occur and to assist the developer in devising an explanation facility. The emphasis of our work is to develop practical tools and techniques, while determining the feasibility of a given approach, including identification of appropriate software tools to support research, application and tool building activities.

  6. Transition flight control room automation

    NASA Technical Reports Server (NTRS)

    Welborn, Curtis Ray

    1990-01-01

    The Workstation Prototype Laboratory is currently working on a number of projects which can have a direct impact on ground operations automation. These projects include: (1) The fuel cell monitoring system (FCMS), which will monitor and detect problems with the fuel cells on the shuttle. FCMS will use a combination of rules (forward/backward) and multithreaded procedures, which run concurrently with the rules, to implement the malfunction algorithms of the EGIL flight controllers. The combination of rule-based reasoning and procedural reasoning allows us to more easily map the malfunction algorithms into a real-time system implementation. (2) A graphical computation language (AGCOMPL) is an experimental prototype to determine the benefits and drawbacks of using a graphical language to design computations (algorithms) to work on shuttle or space station telemetry and trajectory data. (3) The design of a system will allow a model of an electrical system, including telemetry sensors, to be configured on the screen graphically using previously defined electrical icons. This electrical model would then be used to generate rules and procedures for detecting malfunctions in the electrical components of the model. (4) A generic message management (GMM) system is being designed for real-time applications as a message management system which sends advisory messages to a user. The primary purpose of GMM is to reduce the risk of overloading a user with information when multiple failures occur and to assist the developer in devising an explanation facility. The emphasis of our work is to develop practical tools and techniques, including identification of appropriate software tools to support research, application, and tool building activities, while determining the feasibility of a given approach.

  7. Tree-level gluon amplitudes on the celestial sphere

    NASA Astrophysics Data System (ADS)

    Schreiber, Anders Ø.; Volovich, Anastasia; Zlotnikov, Michael

    2018-06-01

    Pasterski, Shao and Strominger have recently proposed that massless scattering amplitudes can be mapped to correlators on the celestial sphere at infinity via a Mellin transform. We apply this prescription to arbitrary n-point tree-level gluon amplitudes. The Mellin transforms of MHV amplitudes are given by generalized hypergeometric functions on the Grassmannian Gr(4, n), while generic non-MHV amplitudes are given by more complicated Gelfand A-hypergeometric functions.
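
    For orientation, the Pasterski–Shao–Strominger prescription applied here is a Mellin transform in each external energy (written below in standard notation; the symbol names are mine):

        % Mellin map from momentum space to the celestial sphere
        \widetilde{\mathcal{A}}(\Delta_j, z_j, \bar{z}_j)
        \;=\; \prod_{j=1}^{n} \int_{0}^{\infty} d\omega_j \,\omega_j^{\Delta_j - 1}\,
        \mathcal{A}(\omega_j, z_j, \bar{z}_j)

    where the \omega_j are the energies of the massless particles, (z_j, \bar{z}_j) are the points where they pierce the celestial sphere, and \Delta_j are the conformal dimensions of the resulting two-dimensional correlator.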

  8. PFGE MAPPER and PFGE READER: two tools to aid in the analysis and data input of pulse field gel electrophoresis maps.

    PubMed Central

    Shifman, M. A.; Nadkarni, P.; Miller, P. L.

    1992-01-01

    Pulse field gel electrophoresis mapping is an important technique for characterizing large segments of DNA. We have developed two tools to aid in the construction of pulse field electrophoresis gel maps: PFGE READER which stores experimental conditions and calculates fragment sizes and PFGE MAPPER which constructs pulse field gel electrophoresis maps. PMID:1482898
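
    The size-calling step a tool like PFGE READER performs can be illustrated generically: interpolate an unknown band's migration distance against a ladder of known fragment sizes on a semi-log scale. The Python sketch below uses invented ladder values; real tools use calibrated mobility curves rather than this simple interpolation.

        import numpy as np

        ladder_dist = np.array([12.0, 18.5, 26.0, 35.5, 48.0])   # mm, hypothetical
        ladder_kb   = np.array([1000., 500., 250., 100., 50.])   # kb, hypothetical

        def fragment_size(dist_mm):
            """Estimate fragment size (kb) from migration distance (mm)."""
            # Interpolate log(size) against distance; distance rises as size falls.
            log_kb = np.interp(dist_mm, ladder_dist, np.log(ladder_kb))
            return float(np.exp(log_kb))

        print(f"{fragment_size(30.0):.0f} kb")  # band between the 250 and 100 kb rungs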

  9. Battle of France WWII

    NASA Astrophysics Data System (ADS)

    Gadhath, Arpitha Rao

    The purpose of this thesis is to build an interactive Geographical Information System (GIS) tool relating to the series of events that occurred during the Battle of France in World War II. The tool gives insight into the countries involved in the battle, their allies and their strategies. It was created to serve as a one-stop source of information on all the important battles that took place and that led to the fall of France. The tool brings together the maps of all the countries involved. Integrated with each map is the data relevant to that map. The data for each country include the place of attack, the strategies used during the attack, and the kind of warfare. The tool also makes use of HTML files to present the information, along with images from the time of the war and footage explaining each battle. The tool was built using Java, along with MOJO (Map Objects Java Objects), to develop the maps of each of the countries. MOJO is developed by ESRI (Environmental Systems Research Institute) and makes it easier to add data to the maps. It also simplifies highlighting important information through pop-up windows, charts and infographics. The HTML files were designed using the open-source Bootstrap template. The tool is built in such a way that the interface is simple and easy for the user to use and understand.

  10. Application of Porter's generic strategies in ambulatory health care: a comparison of managerial perceptions in two Israeli sick funds.

    PubMed

    Torgovicky, Refael; Goldberg, Avishay; Shvarts, Shifra; Bar Dayan, Yosefa; Onn, Erez; Levi, Yehezkel; BarDayan, Yaron

    2005-01-01

    A number of typologies have been developed in the strategic management literature to categorize strategies that an organization can pursue at the business level. Extensive research has established Porter's generic strategies of (1) cost leadership, (2) differentiation, (3) differentiation focus, (4) cost focus, and (5) stuck-in-the-middle as the dominant paradigm in the literature. The purpose of the current study was to research competitive strategies in the Israeli ambulatory health care system by comparing managerial perceptions of present and ideal business strategies in two Israeli sick funds. We developed a unique research tool, which reliably examines the gap between managerial views of the present and ideal status. We found a relation between business strategy and performance measures, thus strengthening Porter's original theory about the nonviability of the stuck-in-the-middle strategy and suggesting the applicability of Porter's generic strategies to not-for-profit institutions in an ambulatory health care system.

  11. Generic Module for Collecting Data in Smart Cities

    NASA Astrophysics Data System (ADS)

    Martinez, A.; Ramirez, F.; Estrada, H.; Torres, L. A.

    2017-09-01

    The Future Internet brings new technologies into the everyday lives of people, such as the Internet of Things, Cloud Computing and Big Data. All these technologies have changed the way people communicate and also the way devices interact with their context, giving rise to new paradigms, such as smart cities. Currently, mobile devices represent one of the main sources of information for new applications that take the user context into account, such as apps for mobility, health, or security. Several platforms have been proposed for the development of Future Internet applications; however, no generic modules can be found that implement the collection of context data from smartphones. In this research work we present a generic module that collects data from the different sensors of a mobile device and sends the data, in a standard manner, to the open FIWARE Cloud to be stored or analyzed by software tools. The proposed module enables the human-as-a-sensor approach for the FIWARE platform.
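
    A minimal sketch of sending one sensor reading to FIWARE in a standard manner, assuming an NGSI v2 Orion Context Broker. The broker URL, entity id and attribute names below are hypothetical; POST /v2/entities itself is the standard NGSI v2 entity-creation call.

        import json
        import urllib.request

        BROKER = "http://orion.example.org:1026"   # hypothetical endpoint

        # One hypothetical smartphone reading as an NGSI v2 entity.
        entity = {
            "id": "Smartphone:042",
            "type": "DeviceReading",
            "temperature": {"value": 21.7, "type": "Number"},
            "location": {"value": "41.38, 2.17", "type": "geo:point"},
        }

        req = urllib.request.Request(
            BROKER + "/v2/entities",
            data=json.dumps(entity).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print("broker replied:", resp.status)   # 201 Created on success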

  12. Evidence-Based Concept Mapping for the Athletic Training Student

    ERIC Educational Resources Information Center

    Speicher, Timothy E.; Martin, Malissa; Zigmont, Jason

    2013-01-01

    Context: A concept map is a graphical and cognitive tool that enables learners to link together interrelated concepts using propositions or statements that answer a posed problem. As an assessment tool, concept mapping reveals a learner's research skill proficiency and cognitive processing. Background: The identification and organization of the…

  13. iPads at Field Camp: A First Test of the Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Hurst, S. D.; Stewart, M. A.

    2011-12-01

    An iPad 2 was given to approximately half of the University of Illinois students attending the Wasatch-Uinta Field Camp (WUFC) in summer 2011. The iPads were provisioned with orientation measuring, mapping and location software. The software would automatically transfer an orientation measurement to the current location on the Google Maps application, and was able to output a full list of orientation data. Students also had normal access to more traditional mapping tools such as Brunton compasses and GPS units and were required to map with these tools along with other students of WUFC not provided iPads. Compared to traditional tools, iPads have drawbacks such as increased weight, breakability, and the need for a power source and wireless connectivity; in sum, they need a substantial infrastructure that reduces range, availability, and probably most importantly, convenience. Some of these drawbacks inhibited adoption by our students, the primary reasons being the added weight and the inability to map directly to a GIS application with detailed topographic maps equivalent to the physical topographic map sheets used at WUFC. In their favor, the iPads combine a host of tools into one, including software that can measure orientation in a fashion more intuitive than a Brunton. They also allow storage, editing and analysis of data, notes (spoken and/or written) and potentially unlimited access to a variety of maps. Via a post-field camp survey of the University of Illinois students at WUFC, we have identified some of the important issues that need to be addressed before portable tablets like the iPad become the tool of choice for general field work. Some problems are intrinsic to almost any advanced technology; others are artifacts of the current generations of hardware and software available for these devices. Technical drawbacks aside, the adoption of iPads was further inhibited primarily by inexperience with their use as a mapping tool and secondarily by their redundancy with traditional tools. We are addressing some aspects of the software limitations, and future technology improvements by industry will naturally reduce others. We will continue testing iPads during field trips and courses for the foreseeable future. As we begin to deal with these limitations and students become more accustomed to their use in the field, we expect our students to more fully embrace iPads as a convenient field and mapping tool.

  14. Impact of European pharmaceutical price regulation on generic price competition: a review.

    PubMed

    Puig-Junoy, Jaume

    2010-01-01

    Although economic theory indicates that it should not be necessary to intervene in the generic drug market through price regulation, most EU countries intervene in this market, both by regulating the maximum sale price of generics (price cap) and by setting the maximum reimbursement rate, especially by means of reference pricing systems. We analyse current knowledge of the impact of direct price-cap regulation of generic drugs and the implementation of systems regulating the reimbursement rate, particularly through reference pricing and similar tools, on dynamic price competition between generic competitors in Europe. A literature search was carried out in the EconLit and PubMed databases, and on Google Scholar. The search included papers published in English or Spanish between January 2000 and July 2009. Inclusion criteria included that studies had to present empirical results of a quantitative nature for EU countries of the impact of price capping and/or regulation of the reimbursement rate (reference pricing or similar systems) on price dynamics, corresponding to pharmacy sales, in the generic drug market. The available evidence indicates that price-cap regulation leads to a levelling off of generic prices at a higher level than would occur in the absence of this regulation. Reference pricing systems cause an obvious and almost compulsory reduction in the consumer price of all pharmaceuticals subject to this system, to a varying degree in different countries and periods, the reduction being greater for originator-branded drugs than for generics. In several countries with a reference pricing system, it was observed that generics with a consumer price lower than the reference price do not undergo price reductions until the reference price is reduced, even when there are other lower-priced generics on the market (absence of price competition below the reference price). Beyond the price reduction forced by the price-cap and/or reference pricing regulation itself, the entry of new generic competitors is useful for lowering the real transaction price of purchases made by pharmacies (dynamic price competition at ex-factory level), although this effect is weaker or non-significant for official ex-factory prices and consumer prices in some countries. When maximum reimbursement systems such as reference pricing or similar types are applied, pharmacies are seen to receive large discounts on the price they pay for the pharmaceuticals, although these discounts are not transferred to the consumer price. The percentage discount offered to pharmacies in a country that uses a price-cap system combined with reference pricing is positively and significantly related to the number of generic competitors in the market for the pharmaceutical (dynamic price competition at ex-factory level).

  15. Coproducing Aboriginal patient journey mapping tools for improved quality and coordination of care.

    PubMed

    Kelly, Janet; Dwyer, Judith; Mackean, Tamara; O'Donnell, Kim; Willis, Eileen

    2016-12-08

    This paper describes the rationale and process for developing a set of Aboriginal patient journey mapping tools with Aboriginal patients, health professionals, support workers, educators and researchers in the Managing Two Worlds Together project between 2008 and 2015. Aboriginal patients and their families from rural and remote areas, and healthcare providers in urban, rural and remote settings, shared their perceptions of the barriers and enablers to quality care in interviews and focus groups, and individual patient journey case studies were documented. Data were thematically analysed. In the absence of suitable existing tools, a new analytical framework and mapping approach was developed. The utility of the tools in other settings was then tested with health professionals, and the tools were further modified for use in quality improvement in health and education settings in South Australia and the Northern Territory. A central set of patient journey mapping tools with flexible adaptations, a workbook, and five sets of case studies describing how staff adapted and used the tools at different sites are available for wider use.

  16. Spacecraft control center automation using the generic inferential executor (GENIE)

    NASA Technical Reports Server (NTRS)

    Hartley, Jonathan; Luczak, Ed; Stump, Doug

    1996-01-01

    The increasing requirement to dramatically reduce the cost of mission operations led to increased emphasis on automation technology. The expert system technology used at the Goddard Space Flight Center (MD) is currently being applied to the automation of spacecraft control center activities. The generic inferential executor (GENIE) is a tool which allows pass automation applications to be constructed. The pass script templates constructed encode the tasks necessary to mimic flight operations team interactions with the spacecraft during a pass. These templates can be configured with data specific to a particular pass. Animated graphical displays illustrate the progress during the pass. The first GENIE application automates passes of the solar, anomalous and magnetospheric particle explorer (SAMPEX) spacecraft.

  17. The MEDA Project: Developing Evaluation Competence in the Training Software Domain.

    ERIC Educational Resources Information Center

    Machell, Joan; Saunders, Murray

    1992-01-01

    The MEDA (Methodologie d'Evaluation des Didacticiels pour les Adultes) tool is a generic instrument to evaluate training courseware. It was developed for software designers to improve products, for instructors to select appropriate courseware, and for distributors and consultants to match software to client needs. Describes software evaluation…

  18. Generic Educational Knowledge Representation for Adaptive and Cognitive Systems

    ERIC Educational Resources Information Center

    Caravantes, Arturo; Galan, Ramon

    2011-01-01

    The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…

  19. Towards Evolutional Authoring Support Systems

    ERIC Educational Resources Information Center

    Aroyo, Lora; Mizoguchi, Riichiro

    2004-01-01

    The ultimate aim of this research is to specify and implement a general authoring framework for content and knowledge engineering for Intelligent Educational Systems (IES). In this context we attempt to develop an authoring tool supporting this framework that is powerful in its functionality, generic in its support of instructional strategies and…

  20. Mapping polycyclic aromatic hydrocarbon and total toxicity equivalent soil concentrations by visible and near-infrared spectroscopy.

    PubMed

    Okparanma, Reuben N; Coulon, Frederic; Mayr, Thomas; Mouazen, Abdul M

    2014-09-01

    In this study, we used data from spectroscopic models based on visible and near-infrared (vis-NIR; 350-2500 nm) diffuse reflectance spectroscopy to develop soil maps of polycyclic aromatic hydrocarbons (PAHs) and total toxicity equivalent concentrations (TTEC) of the PAH mixture. The TTEC maps were then used for hazard assessment of three petroleum release sites in the Niger Delta province of Nigeria (5.317°N, 6.467°E). As the paired t-test revealed, there were non-significant (p > 0.05) differences between soil maps of PAH and TTEC developed with chemically measured and vis-NIR-predicted data. Comparison maps of PAH showed a slight to moderate agreement between measured and predicted data (Kappa coefficient = 0.19-0.56). Using proposed generic assessment criteria, hazard assessment showed that the degree of action for site-specific risk assessment and/or remediation is similar for both measurement methods. This demonstrates that the vis-NIR method may be useful for monitoring hydrocarbon contamination in a petroleum release site. Copyright © 2014 Elsevier Ltd. All rights reserved.
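
    A sketch of the map-agreement statistic reported above: Cohen's kappa between contamination classes derived from measured versus vis-NIR-predicted values. The class labels below are invented for illustration; scikit-learn's cohen_kappa_score does the computation.

        from sklearn.metrics import cohen_kappa_score

        # Invented per-pixel contamination classes from the two methods.
        measured  = ["low", "low", "high", "med", "high", "low", "med", "med"]
        predicted = ["low", "med", "high", "med", "high", "low", "low", "med"]

        kappa = cohen_kappa_score(measured, predicted)
        print(f"kappa = {kappa:.2f}")  # values of 0.19-0.56 match the reported range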

  1. Facilitating participatory multilevel decision-making by using interactive mental maps.

    PubMed

    Pfeiffer, Constanze; Glaser, Stephanie; Vencatesan, Jayshree; Schliermann-Kraus, Elke; Drescher, Axel; Glaser, Rüdiger

    2008-11-01

    Participation of citizens in political, economic or social decisions is increasingly recognized as a precondition to foster sustainable development processes. Since spatial information is often important during planning and decision making, participatory mapping gains in popularity. However, little attention has been paid to the fact that information must be presented in a useful way to reach city planners and policy makers. Above all, the importance of visualisation tools to support collaboration, analytical reasoning, problem solving and decision-making in analysing and planning processes has been underestimated. In this paper, we describe how an interactive mental map tool has been developed in a highly interdisciplinary disaster management project in Chennai, India. We moved from a hand drawn mental maps approach to an interactive mental map tool. This was achieved by merging socio-economic and geospatial data on infrastructure, local perceptions, coping and adaptation strategies with remote sensing data and modern technology of map making. This newly developed interactive mapping tool allowed for insights into different locally-constructed realities and facilitated the communication of results to the wider public and respective policy makers. It proved to be useful in visualising information and promoting participatory decision-making processes. We argue that the tool bears potential also for health research projects. The interactive mental map can be used to spatially and temporally assess key health themes such as availability of, and accessibility to, existing health care services, breeding sites of disease vectors, collection and storage of water, waste disposal, location of public toilets or defecation sites.

  2. The Planck Legacy Archive

    NASA Astrophysics Data System (ADS)

    Dupac, X.; Arviset, C.; Fernandez Barreiro, M.; Lopez-Caniego, M.; Tauber, J.

    2015-12-01

    The Planck Collaboration released its second major dataset in 2015 through the Planck Legacy Archive (PLA). It includes cosmological, extragalactic and Galactic science data in temperature (intensity) and polarization. Full-sky maps are provided with unprecedented angular resolution and sensitivity, together with a large number of ancillary maps, catalogues (generic, SZ clusters and Galactic cold clumps), time-ordered data and other information. The extensive cosmological likelihood package allows cosmologists to fully explore the plausible parameters of the Universe. A new web-based PLA user interface has been public since Dec. 2014, allowing easier and faster access to all Planck data and replacing the previous Java-based software. Numerous additional improvements to the PLA are being developed through the so-called PLA Added-Value Interface, making use of an external contract with the Planetek Hellas and Expert Analytics software companies. These will allow users to process time-ordered data into sky maps, separate astrophysical components in existing maps, simulate the microwave and infrared sky through the Planck Sky Model, and use a number of other functionalities.
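
    As a usage sketch (not part of the PLA interface itself): the full-sky maps are distributed as HEALPix FITS files, which can be read and displayed with the healpy package. The file name below is a placeholder for whatever map the user downloads.

        import healpy as hp
        import matplotlib.pyplot as plt

        # Read a downloaded Planck HEALPix map (placeholder file name).
        m = hp.read_map("planck_map.fits")
        # Mollweide projection of the full sky.
        hp.mollview(m, title="Planck full-sky map", unit="K_CMB")
        plt.show()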

  3. Satellites vs. fiber optics based networks and services - Road map to strategic planning

    NASA Astrophysics Data System (ADS)

    Marandi, James H. R.

    An overview of a generic telecommunications network and its components is presented, and the current developments in satellite and fiber optics technologies are discussed with an eye on the trends in industry. A baseline model is proposed, and a cost comparison of fiber- vs satellite-based networks is made. A step-by-step 'road map' to the successful strategic planning of telecommunications services and facilities is presented. This road map provides for optimization of the current and future networks and services through effective utilization of both satellites and fiber optics. The road map is then applied to different segments of the telecommunications industry and market place, to show its effectiveness for the strategic planning of executives of three types: (1) those heading telecommunications manufacturing concerns, (2) those leading communication service companies, and (3) managers of telecommunication/MIS departments of major corporations. Future networking issues, such as developments in integrated-services digital network standards and technologies, are addressed.

  4. Failure Maps for Rectangular 17-4PH Stainless Steel Sandwiched Foam Panels

    NASA Technical Reports Server (NTRS)

    Raj, S. V.; Ghosn, L. J.

    2007-01-01

    A new and innovative concept is proposed for designing lightweight fan blades for aircraft engines using commercially available 17-4PH precipitation hardened stainless steel. Rotating fan blades in aircraft engines experience a complex loading state consisting of combinations of centrifugal, distributed pressure and torsional loads. Theoretical plastic-collapse failure maps, showing plots of the foam relative density versus face sheet thickness, t, normalized by the fan blade span length, L, have been generated for rectangular 17-4PH sandwiched foam panels under these three loading modes, assuming three plastic-collapse failure modes. These maps show that the 17-4PH sandwiched foam panels can fail by yielding of the face sheets, yielding of the foam core, or wrinkling of the face sheets, depending on foam relative density, the magnitude of t/L, and the loading mode. The design envelope of a generic fan blade is superimposed on the maps to provide valuable insight into the probable failure modes in a sandwiched foam fan blade.
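
    The three collapse modes named above have simple classical sandwich-beam forms; the expressions below are that generic textbook version in my notation, not the paper's exact equations.

        % Classical sandwich-beam collapse criteria (textbook forms)
        \sigma_{\mathrm{face}} = \frac{M}{b\,t\,d} \;\ge\; \sigma_{y,f}   % face-sheet yield
        \qquad
        \tau_{\mathrm{core}} = \frac{Q}{b\,d} \;\ge\; \tau_{y,c}          % core yield
        \qquad
        \sigma_{\mathrm{wr}} \;\approx\; 0.5\,\bigl(E_f E_c G_c\bigr)^{1/3}  % face wrinkling

    Here M is the bending moment, Q the transverse shear force, b the panel width, t the face-sheet thickness and d the core depth; for a foam core the moduli E_c and G_c themselves scale with relative density, which is why the failure maps use relative density as an axis.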

  5. COMPASS: A general purpose computer aided scheduling tool

    NASA Technical Reports Server (NTRS)

    Mcmahon, Mary Beth; Fox, Barry; Culbert, Chris

    1991-01-01

    COMPASS is a generic scheduling system developed by McDonnell Douglas under the direction of the Software Technology Branch at JSC. COMPASS is intended to illustrate the latest advances in scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to potential NASA Space Station Freedom standards. COMPASS has some unique characteristics that distinguish it from commercial products. These characteristics are discussed and used to illustrate some differences between scheduling tools.

  6. RISK COMMUNICATION IN ACTION: THE TOOLS OF MESSAGE MAPPING

    EPA Science Inventory

    Risk Communication in Action: The Tools of Message Mapping, is a workbook designed to guide risk communicators in crisis situations. The first part of this workbook will review general guidelines for risk communication. The second part will focus on one of the most robust tools o...

  7. Jules Verne Voyager, Jr: An Interactive Map Tool for Teaching Plate Tectonics

    NASA Astrophysics Data System (ADS)

    Hamburger, M. W.; Meertens, C. M.

    2010-12-01

    We present an interactive, web-based map utility that can make new geological and geophysical results accessible to a large number and variety of users. The tool provides a user-friendly interface that allows users to access a variety of maps, satellite images, and geophysical data at a range of spatial scales. The map tool, dubbed 'Jules Verne Voyager, Jr.', allows users to interactively create maps of a variety of study areas around the world. The utility was developed in collaboration with the UNAVCO Consortium for the study of global-scale tectonic processes. Users can choose from a variety of base maps (including "Face of the Earth" and "Earth at Night" satellite imagery mosaics, global topography, geoid, sea-floor age, strain rate and seismic hazard maps, and others), add a number of geographic and geophysical overlays (coastlines, political boundaries, rivers and lakes, earthquake and volcano locations, stress axes, etc.), and then superimpose both observed and model velocity vectors representing a compilation of 2933 GPS geodetic measurements from around the world. A remarkable characteristic of the geodetic compilation is that users can select from some 21 plates' frames of reference, allowing a visual representation of both 'absolute' plate motion (in a no-net-rotation reference frame) and relative motion along all of the world's plate boundaries. The tool allows users to zoom among at least three map scales. The map tool can be viewed at http://jules.unavco.org/VoyagerJr/Earth. A more detailed version of the map utility, developed in conjunction with the EarthScope initiative, focuses on North American geodynamics and provides more detailed geophysical and geographic information for the United States, Canada, and Mexico. The 'EarthScope Voyager' can be accessed at http://jules.unavco.org/VoyagerJr/EarthScope. Because the system uses pre-constructed gif images and overlays, it can rapidly create and display maps for a large number of users simultaneously and does not require any special software installation on users' systems. In addition, a JavaScript-based educational interface, dubbed "Exploring our Dynamic Planet", incorporates the map tool, explanatory material, background scientific material, and curricular activities that encourage users to explore Earth processes using the Jules Verne Voyager, Jr. tool. Exploring our Dynamic Planet can be viewed at http://www.dpc.ucar.edu/VoyagerJr/. Because of their flexibility, the map utilities can be used for hands-on exercises exploring plate interaction in a range of academic settings, from high school science classes to entry-level undergraduate to graduate-level tectonics courses.
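    The plate-relative velocity vectors such a tool displays follow from the standard Euler-pole relation v = ω × r; a small sketch with an illustrative pole (not the tool's actual model values):

    ```python
    # A minimal sketch of plate velocities computed from an Euler pole (v = omega x r).
    # Pole parameters below are illustrative, not those used by the Voyager tool.
    import numpy as np

    R_EARTH = 6.371e6  # metres

    def surface_velocity(lat, lon, pole_lat, pole_lon, rate_deg_per_myr):
        """Velocity (east, north) in mm/yr of a point on a plate rotating about an Euler pole."""
        to_rad = np.radians
        # Position and rotation vectors in Earth-centred Cartesian coordinates.
        r = R_EARTH * np.array([np.cos(to_rad(lat)) * np.cos(to_rad(lon)),
                                np.cos(to_rad(lat)) * np.sin(to_rad(lon)),
                                np.sin(to_rad(lat))])
        omega_mag = np.radians(rate_deg_per_myr) / 1e6  # rad/yr
        w = omega_mag * np.array([np.cos(to_rad(pole_lat)) * np.cos(to_rad(pole_lon)),
                                  np.cos(to_rad(pole_lat)) * np.sin(to_rad(pole_lon)),
                                  np.sin(to_rad(pole_lat))])
        v = np.cross(w, r) * 1e3  # m/yr -> mm/yr
        # Project onto local east/north unit vectors.
        east = np.array([-np.sin(to_rad(lon)), np.cos(to_rad(lon)), 0.0])
        north = np.array([-np.sin(to_rad(lat)) * np.cos(to_rad(lon)),
                          -np.sin(to_rad(lat)) * np.sin(to_rad(lon)),
                          np.cos(to_rad(lat))])
        return v @ east, v @ north

    # e.g. a point at (0N, 140W) with an illustrative Pacific-plate-like pole:
    print(surface_velocity(0.0, -140.0, -63.0, 110.0, 0.68))
    ```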

  8. National Seabed Mapping Programmes Collaborate to Advance Marine Geomorphological Mapping in Adjoining European Seas

    NASA Astrophysics Data System (ADS)

    Monteys, X.; Guinan, J.; Green, S.; Gafeira, J.; Dove, D.; Baeten, N. J.; Thorsnes, T.

    2017-12-01

    Marine geomorphological mapping is an effective means of characterising and understanding the seabed and its features, with direct relevance to offshore infrastructure placement, benthic habitat mapping, conservation and policy, marine spatial planning, fisheries management and pure research. Advances in acoustic survey techniques and data processing methods, resulting in the availability of high-resolution marine datasets (e.g. multibeam echosounder bathymetry and shallow seismic), mean that geological interpretations can be greatly improved by combining them with geomorphological maps. Since December 2015, representatives from the national seabed mapping programmes of Norway (MAREANO), Ireland (INFOMAR) and the United Kingdom (MAREMAP) have collaborated and established the MIM geomorphology working group, with the common aim of advancing best practice for geological mapping in their adjoining sea areas in north-west Europe. A recently developed two-part classification system for seabed geomorphology ('Morphology' and 'Geomorphology') has been established as a result of an initiative led by the British Geological Survey (BGS) with contributions from the MIM group (Dove et al. 2016). To support the scheme, existing BGS GIS tools (SIGMA) have been adapted to apply this two-part classification system, and here we present the tools' effectiveness in mapping geomorphological features, along with progress in harmonising the classification and feature nomenclature. Recognising that manual mapping of seabed features can be time-consuming and subjective, semi-automated approaches for mapping seabed features and improving mapping efficiency are being developed using ArcGIS-based tools. These methods recognise, spatially delineate and morphologically describe seabed features such as pockmarks (Gafeira et al., 2012) and cold-water coral mounds. Such tools utilise multibeam echosounder data or any other bathymetric dataset (e.g. 3D seismic, Geldof et al., 2014) that can produce a digital depth model. The tools have the capability to capture an extensive list of morphological attributes. The MIM geomorphology working group's strategy to develop methods for more efficient marine geomorphological mapping is presented, with data examples and case studies showing the latest results.
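    A generic flavour of such semi-automated depression mapping can be sketched in a few lines: subtract a smoothed regional surface from the bathymetry and label connected anomalies. This is only an assumed illustration of the general approach, not the BGS SIGMA tool's actual algorithm.

    ```python
    # A rough sketch of semi-automated depression (pockmark) detection: subtract a
    # smoothed regional surface from the bathymetry and flag connected anomalies
    # deeper than a threshold. Generic approach, not the BGS tool's algorithm.
    import numpy as np
    from scipy import ndimage

    def detect_depressions(bathy, window=25, min_depth=1.0):
        """bathy: 2D grid of elevations (negative down). Returns labelled depressions."""
        regional = ndimage.uniform_filter(bathy, size=window)  # smoothed regional surface
        residual = bathy - regional                            # local relief
        mask = residual < -min_depth                           # anomalously deep cells
        labels, n = ndimage.label(mask)                        # connected features
        return labels, n

    bathy = np.random.default_rng(0).normal(-50.0, 0.2, (500, 500))  # synthetic seabed
    labels, n = detect_depressions(bathy)
    print(f"{n} candidate features")
    ```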

  9. Assessment and application of national environmental databases and mapping tools at the local level to two community case studies.

    PubMed

    Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad

    2011-03-01

    Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.

  10. Mapping as a visual health communication tool: promises and dilemmas.

    PubMed

    Parrott, Roxanne; Hopfer, Suellen; Ghetian, Christie; Lengerich, Eugene

    2007-01-01

    In the era of evidence-based public health promotion and planning, the use of maps as a form of evidence to communicate about the multiple determinants of cancer is on the rise. Geographic information systems and mapping technologies make future proliferation of this strategy likely. Yet disease maps as a communication form remain largely unexamined. This content analysis considers the presence of multivariate information, credibility cues, and the communication function of publicly accessible maps for cancer control activities. Thirty-six state comprehensive cancer control plans were publicly available in July 2005 and were reviewed for the presence of maps. Fourteen of the 36 state cancer plans (39%) contained map images (N = 59 static maps). A continuum of map interactivity was observed, with 10 states having interactive mapping tools available to query and map cancer information. Four states had both cancer plans with map images and interactive mapping tools available to the public on their Web sites. Of the 14 state cancer plans that depicted map images, two displayed multivariate data in a single map. Nine of the 10 states with interactive mapping capability offered the option to display multivariate health risk messages. The most frequent content category mapped was cancer incidence and mortality, with stage at diagnosis infrequently available. The most frequent communication function served by the maps reviewed was redundancy, as maps repeated information contained in textual forms. The social and ethical implications for communicating about cancer through the use of visual geographic representations are discussed.

  11. A generic rabies risk assessment tool to support surveillance.

    PubMed

    Ward, Michael P; Hernández-Jover, Marta

    2015-06-01

    The continued spread of rabies in Indonesia poses a risk to human and animal populations in the remaining free islands, as well as the neighbouring rabies-free countries of Timor Leste, Papua New Guinea and Australia. Here we describe the development of a generic risk assessment tool which can be used to rapidly determine the vulnerability of rabies-free islands, so that scarce resources can be targeted to surveillance activities and the sensitivity of surveillance systems increased. The tool was developed by integrating information on the historical spread of rabies, anthropological studies, and the opinions of local animal health experts. The resulting tool is based on eight critical parameters that can be estimated from the literature, expert opinion, observational studies and information generated from routine surveillance. In the case study presented, results generated by this tool were most sensitive to the probability that dogs are present on private and fishing boats, and it was predicted that rabies infection (one infected case) might occur in a rabies-free island (upper 95% prediction interval) with a volume of 1000 boat movements. With 25,000 boat movements, the median of the probability distribution would be equal to one infected case, with an upper 95% prediction interval of six infected cases. This tool could also be used at the national level to guide control and eradication plans. An initial recommendation from this study is to develop a surveillance programme to determine the likelihood that boats transport dogs, for example by port surveillance or regularly conducted surveys of fishermen and passenger ferries. However, the illegal nature of dog transportation from rabies-infected to rabies-free islands is a challenge for developing such surveillance. Copyright © 2014 Elsevier B.V. All rights reserved.
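    The boat-movement risk calculation lends itself to a simple Monte Carlo illustration. The probabilities below are invented placeholders, not the paper's estimates.

    ```python
    # A hedged Monte Carlo sketch of an import-risk calculation of this kind:
    # infected dogs arriving per year given a volume of boat movements.
    # Parameter values are invented for illustration, not the paper's.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sims = 100_000
    boat_movements = 25_000
    p_dog_on_boat = 0.01     # probability a boat carries a dog (assumed)
    p_dog_infected = 0.004   # probability a carried dog is infected (assumed)

    introductions = rng.binomial(boat_movements, p_dog_on_boat * p_dog_infected, n_sims)
    print("median:", np.median(introductions))
    print("upper bound of 95% prediction interval:", np.percentile(introductions, 97.5))
    ```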

  12. A System for Modelling Cell–Cell Interactions during Plant Morphogenesis

    PubMed Central

    Dupuy, Lionel; Mackenzie, Jonathan; Rudge, Tim; Haseloff, Jim

    2008-01-01

    Background and Aims: During the development of multicellular organisms, cells are capable of interacting with each other through a range of biological and physical mechanisms. A description of these networks of cell-cell interactions is essential for an understanding of how cellular activity is co-ordinated in regionalized functional entities such as tissues or organs. The difficulty of experimenting on living tissues has been a major limitation to describing such systems, and the multitude of parallel interactions that underlie cellular morphogenesis has led to the need for computer models, which appear particularly helpful for characterizing the behaviour of multicellular systems. Methods: A new generic model of plant cellular morphogenesis is described that expresses interactions amongst cellular entities explicitly: the plant is described as a multi-scale structure, and interactions between distinct entities are established through a topological neighbourhood. Tissues are represented as 2D biphasic systems in which the cell wall responds to turgor pressure through viscous yielding. Key Results: This principle was used in the development of the CellModeller software, a generic tool dedicated to the analysis and modelling of plant morphogenesis. The system was applied to three contrasting study cases illustrating genetic, hormonal and mechanical factors involved in plant morphogenesis. Conclusions: Plant morphogenesis is fundamentally a cellular process, and the CellModeller software, through its underlying generic model, provides an advanced research tool to analyse coupled physical and biological morphogenetic mechanisms. PMID:17921524
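    Viscous yielding of the wall under turgor is commonly modelled with a Lockhart-type relation, in which irreversible growth occurs only when turgor exceeds a yield threshold; a toy sketch of that principle follows (it is not CellModeller's actual implementation).

    ```python
    # A minimal sketch of viscous cell-wall yielding under turgor, using a
    # Lockhart-type relation (growth rate proportional to turgor in excess of a
    # yield threshold). Illustrates the modelling principle only.
    import numpy as np

    def grow(areas, turgor, yield_P, extensibility, dt):
        """One explicit time step of irreversible wall expansion for each cell."""
        strain_rate = extensibility * np.maximum(turgor - yield_P, 0.0)
        return areas * (1.0 + strain_rate * dt)

    areas = np.array([1.0, 1.2, 0.9])    # relative cell areas
    turgor = np.array([0.6, 0.5, 0.45])  # MPa, assumed
    for _ in range(100):
        areas = grow(areas, turgor, yield_P=0.4, extensibility=0.05, dt=1.0)
    print(areas)  # cells above the yield threshold have expanded
    ```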

  13. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public

    PubMed Central

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314

  14. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public.

    PubMed

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge.

  15. Interactive Learning Modules: Enabling Near Real-Time Oceanographic Data Use In Undergraduate Education

    NASA Astrophysics Data System (ADS)

    Kilb, D. L.; Fundis, A. T.; Risien, C. M.

    2012-12-01

    The focus of the Education and Public Engagement (EPE) component of the NSF's Ocean Observatories Initiative (OOI) is to provide a new layer of cyber-interactivity for undergraduate educators to bring near real-time data from the global ocean into learning environments. To accomplish this, we are designing six online services including: 1) visualization tools, 2) a lesson builder, 3) a concept map builder, 4) educational web services (middleware), 5) collaboration tools and 6) an educational resource database. Here, we report on our Fall 2012 release that includes the first four of these services: 1) Interactive visualization tools allow users to interactively select data of interest, display the data in various views (e.g., maps, time-series and scatter plots) and obtain statistical measures such as mean, standard deviation and a regression line fit to select data. Specific visualization tools include a tool to compare different months of data, a time series explorer tool to investigate the temporal evolution of select data parameters (e.g., sea water temperature or salinity), a glider profile tool that displays ocean glider tracks and associated transects, and a data comparison tool that allows users to view the data either in scatter plot view comparing one parameter with another, or in time series view. 2) Our interactive lesson builder tool allows users to develop a library of online lesson units, which are collaboratively editable and sharable and provides starter templates designed from learning theory knowledge. 3) Our interactive concept map tool allows the user to build and use concept maps, a graphical interface to map the connection between concepts and ideas. This tool also provides semantic-based recommendations, and allows for embedding of associated resources such as movies, images and blogs. 4) Education web services (middleware) will provide an educational resource database API.
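    The statistical overlays mentioned above (mean, standard deviation, regression fit) amount to a few lines of standard numpy; a sketch with synthetic data standing in for OOI observations:

    ```python
    # A minimal sketch of the statistics such visualization tools overlay on a
    # time series (mean, standard deviation, least-squares trend line).
    # Synthetic data stands in for OOI sea-water temperature observations.
    import numpy as np

    days = np.arange(365.0)
    temperature = (10 + 3 * np.sin(2 * np.pi * days / 365)
                   + np.random.default_rng(0).normal(0, 0.5, 365))

    mean, std = temperature.mean(), temperature.std()
    slope, intercept = np.polyfit(days, temperature, deg=1)  # linear trend
    trend = slope * days + intercept
    print(f"mean={mean:.2f} C, std={std:.2f} C, trend={slope * 365:.3f} C/yr")
    ```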

  16. Development of a tool to support holistic generic assessment of clinical procedure skills.

    PubMed

    McKinley, Robert K; Strand, Janice; Gray, Tracey; Schuwirth, Lambert; Alun-Jones, Tom; Miller, Helen

    2008-06-01

    The challenges of maintaining comprehensive banks of valid checklists make context-specific checklists for assessment of clinical procedural skills problematic. This paper reports the development of a tool which supports generic holistic assessment of clinical procedural skills. We carried out a literature review, focus groups and non-participant observation of assessments with interviews of participants, participant evaluation of a pilot objective structured clinical examination (OSCE), a national modified Delphi study with prior definitions of consensus, and an OSCE. Participants were volunteers from a large acute teaching trust, a teaching primary care trust and a national sample of National Health Service staff. Results: In total, 86 students, trainees and staff took part in the focus groups, observation of assessments and pilot OSCE, 252 in the Delphi study, and 46 candidates and 50 assessors in the final OSCE. We developed a prototype tool with 5 broad categories amongst which were distributed 38 component competencies. There was > 70% agreement (our prior definition of consensus) at the first round of the Delphi study for inclusion of all categories and themes, and no consensus for inclusion of additional categories or themes. Generalisability was 0.76. An OSCE based on the instrument has a predicted reliability of 0.79 with 12 stations and 1 assessor per station, or 10 stations and 2 assessors per station. This clinical procedural skills assessment tool enables reliable assessment and has content and face validity for the assessment of clinical procedural skills. We have designated it the Leicester Clinical Procedure Assessment Tool (LCAT).

  17. ConMap: Investigating New Computer-Based Approaches to Assessing Conceptual Knowledge Structure in Physics.

    ERIC Educational Resources Information Center

    Beatty, Ian D.

    There is a growing consensus among educational researchers that traditional problem-based assessments are not effective tools for diagnosing a student's knowledge state and for guiding pedagogical intervention, and that new tools grounded in the results of cognitive science research are needed. The ConMap ("Conceptual Mapping") project, described…

  18. Using Digital Mapping Tool in Ill-Structured Problem Solving

    ERIC Educational Resources Information Center

    Bai, Hua

    2013-01-01

    Scaffolding students' problem solving and helping them to improve problem solving skills are critical in instructional design courses. This study investigated the effects of students' uses of a digital mapping tool on their problem solving performance in a design case study. It was found that the students who used the digital mapping tool…

  19. Building generic anatomical models using virtual model cutting and iterative registration.

    PubMed

    Xiao, Mei; Soh, Jung; Meruvia-Pastor, Oscar; Schmidt, Eric; Hallgrímsson, Benedikt; Sensen, Christoph W

    2010-02-08

    Using 3D generic models to statistically analyze trends in biological structure changes is an important tool in morphometrics research. Therefore, 3D generic models built for a range of populations are in high demand. However, due to the complexity of biological structures and the limited views of them that medical images can offer, it is still an exceptionally difficult task to quickly and accurately create 3D generic models (a model is a 3D graphical representation of a biological structure) based on medical image stacks (a stack is an ordered collection of 2D images). We show that the creation of a generic model that captures spatial information exploitable in statistical analyses is facilitated by coupling our generalized segmentation method to existing automatic image registration algorithms. The method of creating generic 3D models consists of the following processing steps: (i) scanning subjects to obtain image stacks; (ii) creating individual 3D models from the stacks; (iii) interactively extracting sub-volumes by cutting each model to generate the sub-model of interest; (iv) creating image stacks that contain only the information pertaining to the sub-models; (v) iteratively registering the corresponding new 2D image stacks; (vi) averaging the newly created sub-models based on intensity to produce the generic model from all the individual sub-models. After several registration procedures are applied to the image stacks, we can create averaged image stacks with sharp boundaries. The averaged 3D model created from those image stacks is very close to the average representation of the population. The image registration time varies depending on the image size and the desired accuracy of the registration. Both volumetric data and a surface model for the generic 3D model are created at the final step. Our method is very flexible and easy to use, such that anyone can use image stacks to create models and retrieve sub-regions from them with ease. The Java-based implementation allows our method to be used on various visualization systems, including personal computers, workstations, computers equipped with stereo displays, and even virtual reality rooms such as the CAVE Automated Virtual Environment. The technique allows biologists to build generic 3D models of their interest quickly and accurately.
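    Step (vi), intensity averaging of the registered stacks, is conceptually simple; a schematic numpy sketch follows (filenames are hypothetical, and registration is assumed already done by an external tool).

    ```python
    # A simplified sketch of step (vi): once per-subject image stacks have been
    # registered to a common space (step v), the generic model is an intensity
    # average across subjects. Filenames and array shapes are illustrative.
    import numpy as np

    # One 3D volume (slices x rows x cols) per subject, already resampled onto a
    # common grid by the registration step (hypothetical files).
    registered_stacks = [np.load(f"subject_{i}_registered.npy") for i in range(10)]

    generic_volume = np.mean(np.stack(registered_stacks), axis=0)

    # A binary volume for surface extraction can then be obtained by thresholding
    # the averaged intensities.
    generic_mask = generic_volume > generic_volume.mean()
    np.save("generic_model.npy", generic_volume)
    ```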

  20. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure-effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
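    The qualitative propagation idea can be made concrete with a toy directed graph: a failure mode can explain an observation if the observation point is reachable from it. The graph below is invented for illustration and is not drawn from the paper.

    ```python
    # A toy sketch of the core FFM idea: failure effects propagate along directed
    # paths from failure modes to observation points, and a diagnosis is the set
    # of failure modes whose effects reach the observed symptoms.
    from collections import deque

    # Directed failure-effect propagation graph: node -> downstream nodes (invented).
    graph = {
        "valve_stuck": ["low_flow"],
        "pump_degraded": ["low_flow", "high_current"],
        "low_flow": ["low_tank_pressure"],
        "high_current": [],
        "low_tank_pressure": [],
    }

    def reachable(start):
        """All nodes reachable from a failure mode (breadth-first search)."""
        seen, queue = {start}, deque([start])
        while queue:
            for nxt in graph[queue.popleft()]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    observed = {"low_tank_pressure"}
    candidates = [m for m in ("valve_stuck", "pump_degraded")
                  if observed <= reachable(m)]
    print(candidates)  # both failure modes can explain the observation
    ```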

  1. An evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool): Concurrent, face and content validity.

    PubMed

    De Groef, An; Van Kampen, Marijke; Moortgat, Peter; Anthonissen, Mieke; Van den Kerckhove, Eric; Christiaens, Marie-Rose; Neven, Patrick; Geraerts, Inge; Devoogdt, Nele

    2018-01-01

    To investigate the concurrent, face and content validity of an evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool). 1) Concurrent validity of the MAP-BC evaluation tool was investigated by exploring correlations (Spearman's rank correlation coefficient) between the subjective scores (0 = no adhesions to 3 = very strong adhesions) of the skin level using the MAP-BC evaluation tool and objective elasticity parameters (maximal skin extension and gross elasticity) generated by the Cutometer Dual MPA 580. Nine different examination points on and around the mastectomy scar were evaluated. 2) Face and content validity were explored by questioning therapists experienced with myofascial therapy in breast cancer patients about the comprehensibility and comprehensiveness of the MAP-BC evaluation tool. 1) Only three meaningful correlations were found on the mastectomy scar. For the most lateral examination point on the mastectomy scar, a moderate negative correlation (-0.44, p = 0.01) with the maximal skin extension and a moderate positive correlation with the resistance versus ability of returning, or 'gross elasticity' (0.42, p = 0.02), were found. For the middle point on the mastectomy scar, an almost moderate positive correlation with gross elasticity was found as well (0.38, p = 0.04). 2) Content and face validity were found to be good. Eighty-nine percent of the respondents found the instructions understandable and 98% found the scoring system obvious. Thirty-seven percent of the therapists suggested adding the possibility of evaluating additional anatomical locations in case of reconstructive and/or bilateral surgery. The MAP-BC evaluation tool for myofascial adhesions in breast cancer patients has good face and content validity. Evidence for good concurrent validity of the skin level was found only on the mastectomy scar itself.
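    The concurrent-validity computation itself is a rank correlation; a small scipy sketch with invented scores:

    ```python
    # A small sketch of the concurrent-validity computation: Spearman rank
    # correlation between subjective adhesion scores and an objective elasticity
    # parameter. All numbers are invented for illustration.
    from scipy import stats

    adhesion_scores = [0, 1, 1, 2, 3, 2, 0, 1, 3, 2]  # 0 = none ... 3 = very strong
    max_skin_extension = [0.9, 0.8, 0.7, 0.5, 0.4, 0.6, 0.95, 0.75, 0.35, 0.55]  # mm, assumed

    rho, p = stats.spearmanr(adhesion_scores, max_skin_extension)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # negative rho: stronger adhesions, less extension
    ```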

  2. SigWin-detector: a Grid-enabled workflow for discovering enriched windows of genomic features related to DNA sequences.

    PubMed

    Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M

    2008-08-08

    Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
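    The core window-median test can be sketched compactly; the naive O(n·w) median below is for clarity, whereas the paper's contribution includes a faster moving-median algorithm and an analytical null distribution in place of this simple threshold.

    ```python
    # A compact sketch of the window-median idea behind SigWin-detector: compute
    # the median over a sliding window and flag windows whose median exceeds a
    # null threshold. Naive median for clarity; synthetic data for illustration.
    import numpy as np

    def enriched_windows(values, window, threshold):
        """Return start indices of windows whose median exceeds `threshold`."""
        medians = np.array([np.median(values[i:i + window])
                            for i in range(len(values) - window + 1)])
        return np.flatnonzero(medians > threshold)

    expression = np.random.default_rng(1).lognormal(size=5_000)  # synthetic transcriptome map
    hits = enriched_windows(expression, window=39, threshold=np.median(expression) * 1.5)
    print(len(hits), "candidate RIDGE windows")
    ```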

  3. PHLUX: Photographic Flux Tools for Solar Glare and Flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2010-12-02

    A web-based tool to a) analytically and empirically quantify glare from reflected light and determine the potential impact (e.g., temporary flash blindness, retinal burn), and b) produce flux maps for central receivers. The tool accepts RAW digital photographs of the glare source (for hazard assessment) or the receiver (for flux mapping), as well as a photograph of the sun for intensity and size scaling. For glare hazard assessment, the tool determines the retinal irradiance (W/cm2) and subtended source angle for an observer and plots the glare source on a hazard spectrum (i.e., low potential for flash blindness impact, potential for flash blindness impact, retinal burn). For flux mapping, the tool provides a colored map of the receiver scaled by incident solar flux (W/m2) and unwraps the physical dimensions of the receiver while accounting for the perspective of the photographer (e.g., for a flux map of a cylindrical receiver, the horizontal axis denotes receiver angle in degrees and the vertical axis denotes vertical position in meters; for a flat panel receiver, the horizontal axis denotes horizontal position in meters and the vertical axis denotes vertical position in meters). The flux mapping capability also allows the user to specify transects along which the program plots incident solar flux on the receiver.

  4. EJSCREEN: Environmental Justice Screening and Mapping Tool

    EPA Pesticide Factsheets

    EJSCREEN is an environmental justice screening and mapping tool that provides EPA and the public with a nationally consistent approach to characterizing areas that may warrant further consideration, analysis, or outreach.

  5. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new and generic open-source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of COSTA, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan, 2007] and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al. 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood forecasting purposes, A.H. Weerts, G.Y.H. El Serafy, S. Hummel, J. Dhondia, and H. Gerritsen (2009), accepted by Computers & Geosciences.
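    As an illustration of the kind of filtering technique such an environment couples to a model through its generic interfaces, here is the textbook stochastic ensemble Kalman filter update; this is not OpenDA's API, just the standard analysis step.

    ```python
    # A textbook stochastic ensemble Kalman filter (EnKF) analysis step:
    # x_a = x_f + K (y - H x_f), with K built from the ensemble covariance.
    # Toy dimensions and values; NOT OpenDA's API.
    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(ensemble, H, y, obs_std):
        """ensemble: (n_state, n_members) forecast states; H: (n_obs, n_state)."""
        n_members = ensemble.shape[1]
        X = ensemble - ensemble.mean(axis=1, keepdims=True)
        P = X @ X.T / (n_members - 1)                   # sample covariance
        R = obs_std**2 * np.eye(H.shape[0])
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
        # Perturb observations so the analysis ensemble keeps the right spread.
        y_pert = y[:, None] + rng.normal(0.0, obs_std, (H.shape[0], n_members))
        return ensemble + K @ (y_pert - H @ ensemble)

    ensemble = rng.normal(0.0, 1.0, (3, 50))  # toy 3-variable state, 50 members
    H = np.array([[1.0, 0.0, 0.0]])           # observe the first variable only
    analysis = enkf_update(ensemble, H, y=np.array([0.8]), obs_std=0.2)
    print(analysis.mean(axis=1))              # mean pulled towards the observation
    ```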

  6. Prioritizing Seafloor Mapping for Washington’s Pacific Coast

    PubMed Central

    Battista, Timothy; Buja, Ken; Christensen, John; Hennessey, Jennifer; Lassiter, Katrina

    2017-01-01

    Remote sensing systems are critical tools used for characterizing the geological and ecological composition of the seafloor. However, creating comprehensive and detailed maps of ocean and coastal environments has been hindered by the high cost of operating ship- and aircraft-based sensors. While a number of groups (e.g., academic research, government resource management, and private sector) are engaged in or would benefit from the collection of additional seafloor mapping data, disparate priorities, dauntingly large data gaps, and insufficient funding have confounded strategic planning efforts. In this study, we addressed these challenges by implementing a quantitative, spatial process to facilitate prioritizing seafloor mapping needs in Washington State. The Washington State Prioritization Tool (WASP), a custom web-based mapping tool, was developed to solicit and analyze mapping priorities from each participating group. The process resulted in the identification of several discrete, high priority mapping hotspots. As a result, several of the areas have been or will be subsequently mapped. Furthermore, information captured during the process about the intended application of the mapping data was paramount for identifying the optimum remote sensing sensors and acquisition parameters to use during subsequent mapping surveys. PMID:28350338

  7. [The experiment of participatory mapping in order to construct a cartographical alternative to the FHS].

    PubMed

    Goldstein, Roberta Argento; Barcellos, Christovam; Magalhães, Monica de Avelar Figueiredo Mafra; Gracie, Renata; Viacava, Francisco

    2013-01-01

    Maps and mapping procedures are useful tools for systematic interpretation and evaluation and for reporting of results to management. Applied to the Family Health Strategy (FHS), these maps permit the demarcation of the territory and the establishment of links between the territory, its population and health services. In this paper, the use of maps by the FHS in 17 municipalities in northern and northeastern Brazil is studied, and the process of demarcation and digitization of areas with the participation of teams is described. The survey, conducted using questionnaires and discussion workshops, showed that difficulties still prevail in reconciling the map (drawing) produced at the local level with maps produced by other government sectors. In general, the maps used at the local level employ their own references, which prevent the interplay of information with other cartographic documents and their full use as a tool for evaluation and management. The combination of participatory mapping tools with the Geographic Information Systems (GIS) applications proposed in this paper represents an alternative for mapping the territory of operations of FHS teams, as well as a reflection on the concept of territory and its operation by the FHS.

  8. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections: azimuthal, cylindrical, Mercator, Lambert, and sinusoidal. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view and export maps to other platforms.
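    For reference, the spherical forward equations of two of the listed projections are short enough to show directly; this is a generic sketch, not MAPPER's own implementation.

    ```python
    # Forward equations for two of the projections named above, in spherical form
    # (lon/lat in radians, unit sphere). A generic sketch of what such a tool computes.
    import math

    def mercator(lon, lat, lon0=0.0):
        return lon - lon0, math.log(math.tan(math.pi / 4 + lat / 2))

    def sinusoidal(lon, lat, lon0=0.0):
        return (lon - lon0) * math.cos(lat), lat

    # e.g. project 45N, 30E:
    lon, lat = math.radians(30), math.radians(45)
    print(mercator(lon, lat))    # (0.5236, 0.8814)
    print(sinusoidal(lon, lat))  # (0.3702, 0.7854)
    ```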

  9. Social Network Mapping: A New Tool For The Leadership Toolbox

    DTIC Science & Technology

    2002-04-01

    Social Network Mapping: A New Tool for the Leadership Toolbox. By Elisabeth J. Strines, Colonel, USAF (April 2002). The paper describes the concept of social network mapping and demonstrates how it can be used by squadron commanders and leaders at all levels to provide subtle…

  10. No elliptic islands for the universal area-preserving map

    NASA Astrophysics Data System (ADS)

    Johnson, Tomas

    2011-07-01

    A renormalization approach has been used in Eckmann et al (1982) and Eckmann et al (1984) to prove the existence of a universal area-preserving map, a map with hyperbolic orbits of all binary periods. The existence of a horseshoe, with positive Hausdorff dimension, in its domain was demonstrated in Gaidashev and Johnson (2009a). In this paper the coexistence problem is studied, and a computer-aided proof is given that no elliptic islands with period less than 18 exist in the domain. It is also shown that less than 1.5% of the measure of the domain consists of elliptic islands. This is proven by showing that the measure of initial conditions that escape to infinity is at least 98.5% of the measure of the domain, and we conjecture that the escaping set has full measure. This is highly unexpected, since generically it is believed that for conservative systems hyperbolicity and ellipticity coexist.

  11. Cavity approach to noisy learning in nonlinear perceptrons.

    PubMed

    Luo, P; Michael Wong, K Y

    2001-12-01

    We analyze the learning of noisy teacher-generated examples by nonlinear and differentiable student perceptrons using the cavity method. The generic activation of an example is a function of the cavity activation of the example, which is its activation in the perceptron that learns without the example. Mean-field equations for the macroscopic parameters and the stability condition yield results consistent with the replica method. When a single value of the cavity activation maps to multiple values of the generic activation, there is a competition in learning strategy between preferentially learning an example and sacrificing it in favor of the background adjustment. We find parameter regimes in which examples are learned preferentially or sacrificially, leading to a gap in the activation distribution. Full phase diagrams of this complex system are presented, and the theory predicts the existence of a phase transition from poor to good generalization states in the system. Simulation results confirm the theoretical predictions.

  12. MEqTrees Telescope and Radio-sky Simulations and CPU Benchmarking

    NASA Astrophysics Data System (ADS)

    Shanmugha Sundaram, G. A.

    2009-09-01

    MEqTrees is a Python-based implementation of the classical Measurement Equation, wherein the various 2×2 Jones matrices are parametrized representations in the spatial and sky domains for any generic radio telescope. Customized simulations of radio-source sky models and corrupt Jones terms are demonstrated based on a policy framework, with performance estimates derived for array configurations, "dirty"-map residuals and processing power requirements for such computations on conventional platforms.
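    The 2×2 measurement equation itself is compact; a numpy sketch with illustrative gain (G-Jones) values:

    ```python
    # A minimal numpy sketch of the 2x2 measurement equation that MEqTrees
    # parametrizes: the visibility on baseline (p, q) is the source brightness B
    # corrupted by per-antenna Jones matrices, V_pq = J_p B J_q^H. Values invented.
    import numpy as np

    B = np.array([[1.0, 0.0],
                  [0.0, 1.0]], dtype=complex)  # unpolarized unit source

    def gain_jones(gx, gy):
        """Diagonal G-Jones matrix for one antenna (complex X/Y gains)."""
        return np.diag([gx, gy]).astype(complex)

    J_p = gain_jones(1.02 * np.exp(0.10j), 0.98 * np.exp(-0.05j))
    J_q = gain_jones(0.99 * np.exp(0.02j), 1.01 * np.exp(0.07j))

    V_pq = J_p @ B @ J_q.conj().T  # corrupted visibility for baseline (p, q)
    print(V_pq)
    ```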

  13. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprising six health economists and one Delphi methodologist. A two-round, modified Delphi survey, with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorized within six sections: (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS Explanation and Elaboration paper. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of the reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  14. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-08-01

    "Mapping" onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite the publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist that aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprised of 6 health economists and 1 Delphi methodologist. A 2-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies, and the biomedical journal editorial community was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorized within 6 sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency, and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by 7 health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years.

  15. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    PubMed

    Hanson, Marta

    2017-09-01

    Argument: This article analyzes for the first time the earliest Western maps of diseases in China, spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine, from medical geography to laboratory medicine, wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy), as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, disease maps were marshaled by physicians for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda, legitimating new medical concepts, public health interventions, and political structures governing human and non-human populations.

  16. The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology

    NASA Astrophysics Data System (ADS)

    Messerotti, Mauro

    2010-05-01

    The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application to the high-level development of a set of multi-level concept maps in the framework of Space Meteorology, intended to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form via, e.g., OWL (Web Ontology Language) is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. Therefore they are suitable to be published on the web: the coded knowledge can be exploited for educational purposes by students and the public, as the level of information can be naturally organized among linked concept maps in progressively increasing complexity levels. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.

  17. A generic method for improving the spatial interoperability of medical and ecological databases.

    PubMed

    Ghenassia, A; Beuscart, J B; Ficheur, G; Occelli, F; Babykina, E; Chazard, E; Genin, M

    2017-10-03

    The availability of big data in healthcare and the intensive development of data reuse and georeferencing have opened up perspectives for health spatial analysis. However, fine-scale spatial studies of ecological and medical databases are limited by the change-of-support problem and thus a lack of spatial unit interoperability. The use of spatial disaggregation methods to solve this problem introduces errors into the spatial estimations. Here, we present a generic, two-step method for merging medical and ecological databases that avoids the use of spatial disaggregation methods, while maximizing the spatial resolution. Firstly, a mapping table is created after one or more transition matrices have been defined. The latter link the spatial units of the original databases to the spatial units of the final database. Secondly, the mapping table is validated by (1) comparing the covariates contained in the two original databases, and (2) checking the spatial validity with a spatial continuity criterion and a spatial resolution index. We used our novel method to merge a medical database (the French national diagnosis-related group database, containing 5644 spatial units) with an ecological database (produced by the French National Institute of Statistics and Economic Studies, and containing 36,594 spatial units). The mapping table yielded 5632 final spatial units. The mapping table's validity was evaluated by comparing the number of births in the medical and ecological databases in each final spatial unit. The median [interquartile range] relative difference was 2.3% [0; 5.7]. The spatial continuity criterion was low (2.4%), and the spatial resolution index was greater than for most French administrative areas. Our innovative approach improves interoperability between medical and ecological databases and facilitates fine-scale spatial analyses. We have shown that disaggregation models and large aggregation techniques are not necessarily the best ways to tackle the change-of-support problem.
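    The two-step method can be sketched schematically: build a transition (mapping) table, then validate by aggregating a shared covariate through it. Unit codes and counts below are invented for illustration.

    ```python
    # A schematic sketch of the two-step merge: (1) a transition table linking
    # each source spatial unit to a final unit, (2) validation by comparing a
    # covariate (here, birth counts) aggregated through the table against the
    # other database. All unit codes and counts are invented.
    medical_births = {"hosp_A": 120, "hosp_B": 80}       # medical DB spatial units
    ecological_births = {"zone_1": 118, "zone_2": 85}    # ecological DB spatial units

    # Step 1: mapping table from each original unit to a final spatial unit.
    mapping = {"hosp_A": "final_1", "hosp_B": "final_2",
               "zone_1": "final_1", "zone_2": "final_2"}

    def aggregate(counts):
        out = {}
        for unit, n in counts.items():
            out[mapping[unit]] = out.get(mapping[unit], 0) + n
        return out

    # Step 2: validation via the relative difference per final unit.
    med, eco = aggregate(medical_births), aggregate(ecological_births)
    for unit in med:
        rel_diff = abs(med[unit] - eco[unit]) / eco[unit] * 100
        print(f"{unit}: {rel_diff:.1f}% relative difference")
    ```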

  18. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool

    PubMed Central

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-01-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13–17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences. PMID:29303048

  19. The impact of CmapTools utilization towards students' conceptual change on optics topic

    NASA Astrophysics Data System (ADS)

    Rofiuddin, Muhammad Rifqi; Feranie, Selly

    2017-05-01

    Science teachers need to help students identify their prior ideas and modify them based on scientific knowledge. This process is called conceptual change. One essential tool for analyzing students' conceptual change is the concept map. Concept maps are graphical representations of knowledge that are comprised of concepts and the relationships between them. Constructing concept maps adapts the role of technology to support the learning process, in line with Educational Ministry Regulation No. 68 year 2013. The Institute for Human and Machine Cognition (IHMC) has developed CmapTools, client-server software for easily constructing and visualizing concept maps. This research investigates secondary students' conceptual change after experiencing a five-stage conceptual teaching model utilizing CmapTools in learning optics. A weak experimental method with a one-group pretest-posttest design was implemented to collect preliminary and post concept maps as qualitative data. A purposive sample of 8th-grade students (n = 22) was taken at a private school in Bandung, West Java. Conceptual change, based on a comparison of the preliminary and post concept map constructions, was assessed using a rubric for concept map scoring and structure. Results show significant conceptual change differences at 50.92%, elaborated across concept map elements: propositions and hierarchical levels in the high category, cross-links in the medium category, and specific examples in the low category. These results are supported by the students' positive response towards CmapTools utilization, indicating improvements in motivation, interest, and behaviour towards the Physics lesson.

  20. Mapping Application Partnership Tool for Anacostia Watershed (Washington, DC/Maryland)

    EPA Pesticide Factsheets

    Mapping Application Partnership Tool (MAPT) of the Urban Waters Federal Partnership (UWFP) reconnects urban communities with their waterways by improving coordination among federal agencies and collaborating with community-led efforts.

  1. Curriculum Mapping: A Method to Assess and Refine Undergraduate Degree Programs

    ERIC Educational Resources Information Center

    Joyner-Melito, Helen S.

    2016-01-01

    Over the past several decades, there has been increasing interest in program- and university-level assessment and aligning learning outcomes to program content. Curriculum mapping is a tool that creates a visual map of all courses in the curriculum and how they relate to curriculum learning outcomes. Assessment tools/activities are often included…

  2. Assessing ecological departure from reference conditions with the Fire Regime Condition Class (FRCC) Mapping Tool

    Treesearch

    Stephen W. Barrett; Thomas DeMeo; Jeffrey L. Jones; J.D. Zeiler; Lee C. Hutter

    2006-01-01

    Knowledge of ecological departure from a range of reference conditions provides a critical context for managing sustainable ecosystems. Fire Regime Condition Class (FRCC) is a qualitative measure characterizing possible departure from historical fire regimes. The FRCC Mapping Tool was developed as an ArcMap extension utilizing the protocol identified by the Interagency...

  3. Concept Maps: An Alternative Methodology to Assess Young Children

    ERIC Educational Resources Information Center

    Atiles, Julia T.; Dominique-Maikell, Nikole; McKean, Kathleen

    2014-01-01

    The authors investigated the utility and efficacy of using concept maps as a research tool to assess young children. Pre- and post-concept maps have been used as an assessment and evaluation tool with teachers and with older students, typically children who can read and write; this article summarizes an investigation into the utility of using…

  4. DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.

    PubMed

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/

  5. DistMap: A Toolkit for Distributed Short Read Mapping on a Hadoop Cluster

    PubMed Central

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/ PMID:24009693

  6. Preferred reporting items for studies mapping onto preference-based outcome measures: The MAPS statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-01-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprised of six health economists and one Delphi methodologist. Following a two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user friendly 23 item checklist. They are presented numerically and categorised within six sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in five years' time.

  7. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite the publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprising six health economists and one Delphi methodologist. Following a two-round modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user-friendly 23-item checklist. They are presented numerically and categorised within six sections, namely: (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  8. Self-optimizing Monte Carlo method for nuclear well logging simulation

    NASA Astrophysics Data System (ADS)

    Liu, Lianyan

    1997-09-01

    In order to increase the efficiency of Monte Carlo simulation for nuclear well logging problems, a new method has been developed for variance reduction. With this method, an importance map is generated as a by-product of the regular Monte Carlo calculation and later used to conduct splitting and Russian roulette for particle population control. By adopting a spatial mesh system that is independent of the physical geometrical configuration, the method allows superior user-friendliness. This new method is incorporated into the general-purpose Monte Carlo code MCNP4A through a patch file. Two nuclear well logging problems, a neutron porosity tool and a gamma-ray lithology density tool, are used to test the performance of this new method. The calculations are sped up over analog simulation by factors of 120 and 2600 for the neutron porosity tool and the gamma-ray lithology density log, respectively. The new method outperforms MCNP's cell-based weight window by a factor of 4-6, as measured by the converged figures of merit. An indirect comparison indicates that the new method also outperforms the AVATAR process for gamma-ray density tool problems. Even though it takes quite some time to generate a reasonable importance map from an analog run, a good initial map can create significant CPU time savings. This makes the method especially suitable for nuclear well logging problems, since one or several reference importance maps are usually available for a given tool. The study shows that the spatial mesh sizes should be chosen according to the mean free path. The overhead of the importance map generator is 6% and 14% for the neutron and gamma-ray cases, respectively. The learning ability towards a correct importance map is also demonstrated: although false learning may happen, physical judgment aided by contributon maps can help diagnose it. Calibration and analysis are performed for the neutron tool and the gamma-ray tool. Because a very good initial importance map is always available after the first point has been calculated, high computing efficiency is maintained. The availability of contributon maps provides an easy way to understand the logging measurement and to analyze the depth of investigation.
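
    The splitting and Russian roulette step described above is generic enough to sketch. The fragment below is a toy stand-alone version, not the MCNP4A patch: an importance ratio read from the mesh map controls particle multiplication and termination while conserving total statistical weight.

        # Importance-map-driven population control: a minimal sketch of
        # splitting / Russian roulette, not the MCNP4A patch itself.
        import random

        def population_control(weight, imp_here, imp_next, rng=random.random):
            """Particle crosses from a cell with importance imp_here to one
            with imp_next; returns a list of (copies, weight_per_copy)."""
            ratio = imp_next / imp_here
            if ratio >= 1.0:
                n = int(ratio)
                if rng() < ratio - n:      # probabilistic rounding keeps E[n] = ratio
                    n += 1
                return [(n, weight / n)]   # split: n copies share the weight
            if rng() < ratio:              # Russian roulette: survive with p = ratio
                return [(1, weight / ratio)]
            return []                      # particle killed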

  9. Neutrinos as Probes of Lorentz Invariance

    DOE PAGES

    Díaz, Jorge S.

    2014-01-01

    Neutrinos can be used to search for deviations from exact Lorentz invariance. The worldwide experimental program in neutrino physics makes these particles a remarkable tool to search for a variety of signals that could reveal minute relativity violations. This paper reviews the generic experimental signatures of the breakdown of Lorentz symmetry in the neutrino sector.

  10. Monte Carlo Simulation to Estimate Likelihood of Direct Lightning Strikes

    NASA Technical Reports Server (NTRS)

    Mata, Carlos; Medelius, Pedro

    2008-01-01

    A software tool has been designed to quantify the lightning exposure of the stack at the pads of launch sites under different configurations. In order to predict lightning strikes to generic structures, this model uses leaders whose origins (in the x-y plane) are obtained from a 2D random, normal distribution.
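
    The abstract describes the sampling scheme explicitly, so a toy version is easy to sketch: draw leader origins from a 2D normal distribution and count those that fall inside an assumed attachment zone around the structure. The spread and attachment radius below are invented placeholders, not values from the NASA model.

        # Monte Carlo estimate of direct-strike likelihood (illustrative only).
        import numpy as np

        rng = np.random.default_rng(42)
        n_leaders = 100_000
        sigma = 500.0            # m, std. dev. of leader origins (assumed)
        attach_radius = 45.0     # m, effective attachment zone (assumed)

        xy = rng.normal(0.0, sigma, size=(n_leaders, 2))  # leader origins
        dist = np.hypot(xy[:, 0], xy[:, 1])               # distance to stack
        print(f"strike probability: {np.mean(dist < attach_radius):.4f}")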

  11. PelePhysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-05-17

    PelePhysics is a suite of physics packages that provides functionality of use to reacting hydrodynamics CFD codes. The initial release includes an interface to reaction rate mechanism evaluation, transport coefficient evaluation, and a generalized equation of state (EOS) facility. Both generic evaluators and interfaces to code from externally available tools (Fuego for chemical rates, EGLib for transport coefficients) are provided.

  12. Methods and Frequency of Sharing of Learning Resources by Medical Students

    ERIC Educational Resources Information Center

    Judd, Terry; Elliott, Kristine

    2017-01-01

    University students have ready access to quality learning resources through learning management systems (LMS), online library collections and generic search tools. However, anecdotal evidence suggests they sometimes turn to peer-based sharing rather than sourcing resources directly. We know little about this practice--how common it is, what sort…

  13. BLAST for Behind-the-Meter Applications Lite Tool | Transportation Research

    Science.gov Websites

    The tool draws on data provided by NREL's PV Watts calculator. A generic utility rate structure framework makes it possible to define demand charges and energy costs that best represent the utility rate structure of interest; see the BLAST documentation for proper CSV formatting of rate structure values.

  14. Excel Yourself with Personalised Email Messages

    ERIC Educational Resources Information Center

    McClean, Stephen

    2008-01-01

    Combining the Excel spreadsheet with an email program provides a very powerful tool for sending students personalised emails. Most email clients now support a Mail Merge facility whereby a generic template is created and information unique to each student record in the spreadsheet is filled into that template, generating tens if not hundreds of…
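
    The same mail-merge pattern is straightforward to reproduce outside Excel. Below is a minimal sketch in Python: a generic template is filled from each record of a class list exported to CSV and sent via SMTP. The column names (name, assignment, mark, email) and the SMTP host are assumptions for illustration.

        # Mail-merge sketch: one personalised message per CSV record.
        import csv
        import smtplib
        from email.message import EmailMessage

        TEMPLATE = ("Dear {name},\n\n"
                    "Your mark for assignment {assignment} was {mark}/100.\n")

        with open("students.csv", newline="") as fh, \
                smtplib.SMTP("smtp.example.ac.uk") as smtp:
            for row in csv.DictReader(fh):
                msg = EmailMessage()
                msg["From"] = "lecturer@example.ac.uk"
                msg["To"] = row["email"]
                msg["Subject"] = "Your assignment feedback"
                msg.set_content(TEMPLATE.format(**row))
                smtp.send_message(msg)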

  15. An XML-based Generic Tool for Information Retrieval in Solar Databases

    NASA Astrophysics Data System (ADS)

    Scholl, Isabelle F.; Legay, Eric; Linsolas, Romain

    This paper presents the current architecture of the `Solar Web Project' now in its development phase. This tool will provide scientists interested in solar data with a single web-based interface for browsing distributed and heterogeneous catalogs of solar observations. The main goal is to have a generic application that can be easily extended to new sets of data or to new missions with a low level of maintenance. It is developed with Java and XML is used as a powerful configuration language. The server, independent of any database scheme, can communicate with a client (the user interface) and several local or remote archive access systems (such as existing web pages, ftp sites or SQL databases). Archive access systems are externally described in XML files. The user interface is also dynamically generated from an XML file containing the window building rules and a simplified database description. This project is developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France). Successful tests have been conducted with other solar archive access systems.
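
    The paper does not reproduce its XML schemas, but the configuration idea can be sketched: each archive access system is described externally in XML and parsed at start-up to discover, for example, which catalog fields are searchable. The element and attribute names below are hypothetical, not the Solar Web Project's actual schema.

        # Parsing a hypothetical external archive description.
        import xml.etree.ElementTree as ET

        ARCHIVE_XML = """
        <archive name="example-solar" type="sql">
          <connection url="jdbc:postgresql://host/solar"/>
          <field name="date_obs" type="datetime" searchable="true"/>
          <field name="wavelength" type="float" searchable="true"/>
        </archive>
        """

        root = ET.fromstring(ARCHIVE_XML)
        searchable = [f.get("name") for f in root.iter("field")
                      if f.get("searchable") == "true"]
        print(root.get("name"), "searchable fields:", searchable)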

  16. A generic testbed for the design of plasma spectrometer control software with application to the THOR-CSW solar wind instrument

    NASA Astrophysics Data System (ADS)

    De Keyser, Johan; Lavraud, Benoit; Neefs, Eddy; Berkenbosch, Sophie; Beeckman, Bram; Maggiolo, Romain; Gamby, Emmanuel; Fedorov, Andrei; Baruah, Rituparna; Wong, King-Wah; Amoros, Carine; Mathon, Romain; Génot, Vincent; Marcucci, Federica; Brienza, Daniele

    2017-04-01

    Modern plasma spectrometers require intelligent software that is able to exploit their capabilities to the fullest. While the low-level control of the instrument and basic tasks such as performing the basic measurement, temperature control, and production of housekeeping data are to be done by software that is executed on an FPGA and/or processor inside the instrument, higher level tasks such as control of measurement sequences, on-board moment calculation, beam tracking decisions, and data compression, may be performed by the instrument or in the payload data processing unit. Such design decisions, as well as an assessment of the workload on the different processing components, require early prototyping. We have developed a generic simulation testbed for the design of plasma spectrometer control software that allows an early evaluation of the level of resources that is needed at each level. Early prototyping can pinpoint bottlenecks in the design allowing timely remediation. We have applied this tool to the THOR Cold Solar Wind (CSW) plasma spectrometer. Some examples illustrating the usefulness of the tool are given.

  17. Computer aided manufacturing for complex freeform optics

    NASA Astrophysics Data System (ADS)

    Wolfs, Franciscus; Fess, Ed; Johns, Dustin; LePage, Gabriel; Matthews, Greg

    2017-10-01

    Recently, the desire to use freeform optics has been increasing. Freeform optics can be used to expand the capabilities of optical systems and reduce the number of optics needed in an assembly. The traits that increase optical performance also present challenges in manufacturing. As tolerances on freeform optics become more stringent, it is necessary to continue to improve methods for how the grinding and polishing processes interact with metrology. To create these complex shapes, OptiPro has developed a computer aided manufacturing package called PROSurf. PROSurf generates the tool paths required for grinding and polishing freeform optics with multiple axes of motion. It also uses metrology feedback for deterministic corrections. PROSurf handles two key aspects of the manufacturing process that most other CAM systems struggle with. The first is the ability to support several input types (equations, CAD models, point clouds) and still create a uniform high-density surface map usable for generating a smooth tool path. The second is improving the accuracy of mapping a metrology file to the part surface. To achieve this, OptiPro uses 3D error maps instead of traditional 2D maps. The metrology error map drives the tool path adjustment applied during processing. For grinding, the error map adjusts the tool position to compensate for repeatable system error. For polishing, the error map drives the relative dwell times of the tool across the part surface. This paper will present the challenges associated with these issues and the solutions that we have created.
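
    As a toy illustration of the error-map-driven polishing step (not OptiPro's actual algorithm), dwell time at each point of the surface can be made proportional to the positive part of the measured error map divided by an assumed removal rate:

        # Dwell-time map from a measured error map (illustrative sketch).
        import numpy as np

        rng = np.random.default_rng(0)
        error_map = rng.normal(0.0, 50e-9, (64, 64))  # m, measured surface error
        removal_rate = 1e-9                           # m/s of material (assumed)

        excess = np.clip(error_map, 0.0, None)   # only high spots need removal
        dwell = excess / removal_rate            # seconds of dwell per pixel
        print(f"total polishing time: {dwell.sum() / 3600:.2f} h")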

  18. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    NASA Astrophysics Data System (ADS)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API, an Internet-based tool combining DHTML and AJAX that allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which it then maps in the Mercator projection only. We have developed utilities for general cylindrical coordinate systems that convert these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its early stages, high school and college teachers, as well as researchers, have expressed interest in using and extending these tools for visualizing and interacting with data on Earth and other planetary bodies.
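
    The cylindrical-to-Mercator utility mentioned above reduces, for the spherical case, to a one-line transform of latitude; a minimal sketch (our reconstruction, not the MARTIAN source) is:

        # Spherical Mercator ordinate from latitude.
        import math

        def lat_to_mercator_y(lat_deg: float) -> float:
            phi = math.radians(lat_deg)
            return math.log(math.tan(math.pi / 4.0 + phi / 2.0))

        for lat in (0.0, 30.0, 60.0, 80.0):
            print(f"{lat:4.0f} deg -> y = {lat_to_mercator_y(lat):.3f}")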

  19. Developing a mapping tool for tablets

    NASA Astrophysics Data System (ADS)

    Vaughan, Alan; Collins, Nathan; Krus, Mike

    2014-05-01

    Digital field mapping offers significant benefits when compared with traditional paper mapping techniques in that it provides closer integration with downstream geological modelling and analysis. It also provides the mapper with the ability to rapidly integrate new data with existing databases without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. In order to achieve these benefits, a number of PC-based digital mapping tools are available which have been developed for specific communities, e.g. the BGS•SIGMA project, Midland Valley's FieldMove®, and a range of solutions based on ArcGIS® software, which can be combined with either traditional or digital orientation and data collection tools. However, with the now widespread availability of inexpensive tablets and smart phones, a user-led demand for a fully integrated tablet mapping tool has arisen. This poster describes the development of a tablet-based mapping environment specifically designed for geologists. The challenge was to deliver a system that would feel sufficiently close to the flexibility of paper-based geological mapping while being implemented on a consumer communication and entertainment device. The first release of a tablet-based geological mapping system from this project is illustrated and will be shown as implemented on an iPad during the poster session. Midland Valley is pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field-based projects throughout this year and will be integrating feedback in further developments of this technology.

  20. National Interest Shown in Watershed Mapping Tool

    EPA Pesticide Factsheets

    The State of Maryland is able to identify prime locations for watershed restoration and preservation using an interactive mapping tool developed by a partnership of agencies led by EPA’s Mid-Atlantic Water Protection Division.

  1. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture); in each, the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perceptions and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
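
    A fuzzy cognitive map itself is a small computational object: concepts as nodes, signed weighted edges, and an activation vector iterated through a squashing function until it settles. The sketch below uses invented concept names and weights purely to illustrate the mechanics; it is not the North Lebanon model.

        # Minimal fuzzy cognitive map iteration.
        import numpy as np

        concepts = ["tourism", "pollution", "fisheries", "governance"]
        # W[i, j] = signed influence of concept i on concept j (invented)
        W = np.array([[ 0.0,  0.6, -0.2,  0.0],
                      [-0.4,  0.0, -0.7,  0.3],
                      [ 0.2, -0.1,  0.0,  0.2],
                      [ 0.5, -0.5,  0.4,  0.0]])

        state = np.full(4, 0.5)                    # initial activations
        sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
        for _ in range(50):                        # iterate toward a fixed point
            state = sigmoid(state + state @ W)
        print(dict(zip(concepts, state.round(2))))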

  2. FuGEFlow: data model and markup language for flow cytometry.

    PubMed

    Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R

    2009-06-16

    Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt compliant experiment description. The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets. Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experience the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange, including public flow cytometry repositories currently under development.

  3. JACOB: an enterprise framework for computational chemistry.

    PubMed

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.

  4. Architecture-driven reuse of code in KASE

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    In order to support the synthesis of large, complex software systems, we need to focus on issues pertaining to the architectural design of a system in addition to algorithm and data structure design. An approach that is based on abstracting the architectural design of a set of problems in the form of a generic architecture, and providing tools that can be used to instantiate the generic architecture for specific problem instances is presented. Such an approach also facilitates reuse of code between different systems belonging to the same problem class. An application of our approach on a realistic problem is described; the results of the exercise are presented; and how our approach compares to other work in this area is discussed.

  5. Managing prescription drug costs: a case study.

    PubMed

    DuBois, R W; Feinberg, P E

    1994-06-01

    Pharmacy costs in most private insurance companies and public concerns have risen over the past several years. To address the problem of increased expenditures in its government employee pharmacy program, the State of New York sought bids from outside vendors to help it control pharmaceutical costs. The following is a case study of the tools the state employed in that effort. Over time, both prescription drug coverage and mental health and substance abuse benefits were carved out of the medical plan and are now provided under free-standing programs. In order to participate, an independent pharmacy must accept a discount of 10% off the average wholesale price of brand name drugs and 25% off the average generic price of generic drugs.

  6. Benchmarking short sequence mapping tools

    PubMed Central

    2013-01-01

    Background The development of next-generation sequencing instruments has led to the generation of millions of short sequences in a single run. The process of aligning these reads to a reference genome is time consuming and demands the development of fast and accurate alignment tools. However, the current proposed tools make different compromises between the accuracy and the speed of mapping. Moreover, many important aspects are overlooked while comparing the performance of a newly developed tool to the state of the art. Therefore, there is a need for an objective evaluation method that covers all the aspects. In this work, we introduce a benchmarking suite to extensively analyze sequencing tools with respect to various aspects and provide an objective comparison. Results We applied our benchmarking tests on 9 well known mapping tools, namely, Bowtie, Bowtie2, BWA, SOAP2, MAQ, RMAP, GSNAP, Novoalign, and mrsFAST (mrFAST) using synthetic data and real RNA-Seq data. MAQ and RMAP are based on building hash tables for the reads, whereas the remaining tools are based on indexing the reference genome. The benchmarking tests reveal the strengths and weaknesses of each tool. The results show that no single tool outperforms all others in all metrics. However, Bowtie maintained the best throughput for most of the tests while BWA performed better for longer read lengths. The benchmarking tests are not restricted to the mentioned tools and can be further applied to others. Conclusion The mapping process is still a hard problem that is affected by many factors. In this work, we provided a benchmarking suite that reveals and evaluates the different factors affecting the mapping process. Still, there is no tool that outperforms all of the others in all the tests. Therefore, the end user should clearly specify his needs in order to choose the tool that provides the best results. PMID:23758764

  7. SU-F-T-540: Comprehensive Fluence Delivery Optimization with Multileaf Collimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weppler, S; Villarreal-Barajas, J; Department of Medical Physics, Tom Baker Cancer Center, Calgary, Alberta

    2016-06-15

    Purpose: Multileaf collimator (MLC) leaf sequencing is performed via commercial black-box implementations, on which a user has limited to no access. We have developed an explicit, generic MLC sequencing model to serve as a tool for future investigations of fluence map optimization, fluence delivery optimization, and rotational collimator delivery methods. Methods: We have developed a novel, comprehensive model to effectively account for a variety of transmission and penumbra effects previously treated on an ad hoc basis in the literature. As the model is capable of quantifying a variety of effects, we utilize the asymmetric leakage intensity across each leaf to deliver fluence maps with pixel size smaller than the narrowest leaf width. Developed using linear programming and mixed integer programming formulations, the model is implemented using state of the art open-source solvers. To demonstrate the versatility of the algorithm, a graphical user interface (GUI) was developed in MATLAB capable of accepting custom leaf specifications and transmission parameters. As a preliminary proof-of-concept, we have sequenced the leaves of a Varian 120 Leaf Millennium MLC for five prostate cancer patient fields and one head and neck field. Predetermined fluence maps have been processed by data smoothing methods to obtain pixel sizes of 2.5 cm². The quality of output was analyzed using computer simulations. Results: For the prostate fields, an average root mean squared error (RMSE) of 0.82 and gamma (0.5mm/0.5%) of 91.4% were observed compared to RMSE and gamma (0.5mm/0.5%) values of 7.04 and 34.0% when the leakage considerations were omitted. Similar results were observed for the head and neck case. Conclusion: A model to sequence MLC leaves to optimality has been proposed. Future work will involve extensive testing and evaluation of the method on clinical MLCs and comparison with black-box leaf sequencing algorithms currently used by commercial treatment planning systems.
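
    The two figures of merit quoted in the results are simple to compute from a planned and a delivered fluence map. The sketch below gives RMSE and, in place of a full gamma analysis, a dose-difference-only pass rate; a real gamma test would also search a spatial distance-to-agreement neighbourhood.

        # Fluence map comparison metrics (illustrative sketch).
        import numpy as np

        def rmse(planned: np.ndarray, delivered: np.ndarray) -> float:
            return float(np.sqrt(np.mean((planned - delivered) ** 2)))

        def dose_diff_pass_rate(planned, delivered, tol=0.005):
            """Fraction of pixels agreeing within tol (gamma's dose-difference
            component only; no distance-to-agreement search)."""
            return float(np.mean(np.abs(planned - delivered) <= tol))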

  8. EnGeoMAP - geological applications within the EnMAP hyperspectral satellite science program

    NASA Astrophysics Data System (ADS)

    Boesche, N. K.; Mielke, C.; Rogass, C.; Guanter, L.

    2016-12-01

    Hyperspectral investigations from near field to space substantially contribute to geological exploration and mining monitoring of raw material and mineral deposits. Due to their spectral characteristics, large mineral occurrences and minefields can be identified from space and the spatial distribution of distinct proxy minerals be mapped. In the frame of the EnMAP hyperspectral satellite science program a mineral and elemental mapping tool was developed - the EnGeoMAP. It contains a basic mineral mapping and a rare earth element mapping approach. This study shows the performance of EnGeoMAP based on simulated EnMAP data of the rare earth element bearing Mountain Pass Carbonatite Complex, USA, and the Rodalquilar and Lomilla Calderas, Spain, which host the economically relevant gold-silver, lead-zinc-silver-gold and alunite deposits. The mountain pass image data was simulated on the basis of AVIRIS Next Generation images, while the Rodalquilar data is based on HyMap images. The EnGeoMAP - Base approach was applied to both images, while the mountain pass image data were additionally analysed using the EnGeoMAP - REE software tool. The results are mineral and elemental maps that serve as proxies for the regional lithology and deposit types. The validation of the maps is based on chemical analyses of field samples. Current airborne sensors meet the spatial and spectral requirements for detailed mineral mapping and future hyperspectral space borne missions will additionally provide a large coverage. For those hyperspectral missions, EnGeoMAP is a rapid data analysis tool that is provided to spectral geologists working in mineral exploration.

  9. Mapping CHU9D Utility Scores from the PedsQL™ 4.0 SF-15.

    PubMed

    Mpundu-Kaambwa, Christine; Chen, Gang; Russo, Remo; Stevens, Katherine; Petersen, Karin Dam; Ratcliffe, Julie

    2017-04-01

    The Pediatric Quality of Life Inventory™ 4.0 Short Form 15 Generic Core Scales (hereafter the PedsQL) and the Child Health Utility-9 Dimensions (CHU9D) are two generic instruments designed to measure health-related quality of life in children and adolescents in the general population and paediatric patient groups living with specific health conditions. Although the PedsQL is widely used among paediatric patient populations, presently it is not possible to directly use the scores from the instrument to calculate quality-adjusted life-years (QALYs) for application in economic evaluation because it produces summary scores which are not preference-based. This paper examines different econometric mapping techniques for estimating CHU9D utility scores from the PedsQL for the purpose of calculating QALYs for cost-utility analysis. The PedsQL and the CHU9D were completed by a community sample of 755 Australian adolescents aged 15-17 years. Seven regression models were estimated: ordinary least squares estimator, generalised linear model, robust MM estimator, multivariate factorial polynomial estimator, beta-binomial estimator, finite mixture model and multinomial logistic model. The mean absolute error (MAE) and the mean squared error (MSE) were used to assess predictive ability of the models. The MM estimator with stepwise-selected PedsQL dimension scores as explanatory variables had the best predictive accuracy using MAE and the equivalent beta-binomial model had the best predictive accuracy using MSE. Our mapping algorithm facilitates the estimation of health-state utilities for use within economic evaluations where only PedsQL data is available and is suitable for use in community-based adolescents aged 15-17 years. Applicability of the algorithm in younger populations should be assessed in further research.
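
    Of the seven estimators compared, the ordinary least squares mapping is the easiest to sketch, together with the two error metrics used to rank the models. The arrays below are random placeholders, not the Australian survey data.

        # OLS mapping from PedsQL dimension scores to CHU9D utilities (sketch).
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import mean_absolute_error, mean_squared_error

        rng = np.random.default_rng(1)
        X = rng.uniform(0, 100, (200, 4))                 # PedsQL dimension scores
        y = 0.4 + 0.005 * X.mean(axis=1) + rng.normal(0, 0.05, 200)  # toy utilities

        model = LinearRegression().fit(X[:150], y[:150])  # estimation sample
        pred = model.predict(X[150:])                     # validation sample
        print("MAE:", mean_absolute_error(y[150:], pred))
        print("MSE:", mean_squared_error(y[150:], pred))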

  10. The Family Map: A Tool for Understanding the Risks for Children in Families with Substance Abuse

    ERIC Educational Resources Information Center

    Bokony, Patti A.; Conners-Burrow, Nicola A.; Whiteside-Mansell, Leanne; Johnson, Danya; McKelvey, Lorraine; Bradley, Robert H.

    2010-01-01

    This article reviews the findings from our assessments of children and their families in two Head Start programs using the Family Map. Specifically, we used the Family Map assessment tool to identify risks to children associated with alcohol and drug use in families with young children. Practical suggestions are offered to administrators about the…

  11. A Study to Determine the Contribution Made by Concept Maps to a Computer Architecture and Organization Course

    ERIC Educational Resources Information Center

    Aydogan, Tuncay; Ergun, Serap

    2016-01-01

    Concept mapping is a method of graphical learning that can be beneficial as a study method for concept linking and organization. Concept maps, which provide an elegant, easily understood representation of an expert's domain knowledge, are tools for organizing and representing knowledge. These tools have been used in educational environments to…

  12. GIS-based interactive tool to map the advent of world conquerors

    NASA Astrophysics Data System (ADS)

    Lakkaraju, Mahesh

    The objective of this thesis is to show the scale and extent of some of the greatest empires the world has ever seen. This is a hybrid project combining a GIS-based interactive tool and a web-based JavaScript tool. This approach lets students learn effectively about the emperors themselves while understanding how far and for how long their empires spread. In the GIS-based tool, a map is displayed with various points on it; when a user clicks on a point, the relevant information about what happened at that particular place is displayed. Users can also select the interactive animation button and walk through a set of battles in chronological order. As mentioned, this tool uses Java as the main programming language and MOJO (Map Objects Java Objects) provided by ESRI. MOJO is very effective because its GIS-related features can be included directly in the application. The application is a simple tool developed for university- and high-school-level students. D3.js is an interactive animation and visualization platform built on JavaScript. Although HTML5, CSS3, JavaScript and SVG can be used to build custom animations, D3.js delivers results with less effort and greater ease of use; hence, it has become a much sought-after visualization tool for many applications. D3.js provides a map-based visualization feature, so text-based data can easily be displayed in a map-based interface. To draw the map and the points on it, D3.js uses data rendered in TopoJSON format. Latitudes and longitudes can be provided, and these are interpolated into the map SVG. One of the main advantages of this approach is that more information is retained when using a visual medium.

  13. Querying XML Data with SPARQL

    NASA Astrophysics Data System (ADS)

    Bikakis, Nikos; Gioldasis, Nektarios; Tsinaraki, Chrisa; Christodoulakis, Stavros

    SPARQL is today the standard access language for Semantic Web data. In the recent years XML databases have also acquired industrial importance due to the widespread applicability of XML in the Web. In this paper we present a framework that bridges the heterogeneity gap and creates an interoperable environment where SPARQL queries are used to access XML databases. Our approach assumes that fairly generic mappings between ontology constructs and XML Schema constructs have been automatically derived or manually specified. The mappings are used to automatically translate SPARQL queries to semantically equivalent XQuery queries which are used to access the XML databases. We present the algorithms and the implementation of SPARQL2XQuery framework, which is used for answering SPARQL queries over XML databases.

  14. Mapping the current–current correlation function near a quantum critical point

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prodan, Emil, E-mail: prodan@yu.edu; Bellissard, Jean

    2016-05-15

    The current–current correlation function is a useful concept in the theory of electron transport in homogeneous solids. The finite-temperature conductivity tensor as well as Anderson’s localization length can be computed entirely from this correlation function. Based on the critical behavior of these two physical quantities near the plateau–insulator or plateau–plateau transitions in the integer quantum Hall effect, we derive an asymptotic formula for the current–current correlation function, which enables us to make several theoretical predictions about its generic behavior. For the disordered Hofstadter model, we employ numerical simulations to map the current–current correlation function, obtain its asymptotic form near a critical point and confirm the theoretical predictions.

  15. Smart Cameras for Remote Science Survey

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Abbey, William; Allwood, Abigail; Bekker, Dmitriy; Bornstein, Benjamin; Cabrol, Nathalie A.; Castano, Rebecca; Estlin, Tara; Fuchs, Thomas; Wagstaff, Kiri L.

    2012-01-01

    Communication with remote exploration spacecraft is often intermittent and bandwidth is highly constrained. Future missions could use onboard science data understanding to prioritize downlink of critical features [1], draft summary maps of visited terrain [2], or identify targets of opportunity for followup measurements [3]. We describe a generic approach to classify geologic surfaces for autonomous science operations, suitable for parallelized implementations in FPGA hardware. We map these surfaces with texture channels - distinctive numerical signatures that differentiate properties such as roughness, pavement coatings, regolith characteristics, sedimentary fabrics and differential outcrop weathering. This work describes our basic image analysis approach and reports an initial performance evaluation using surface images from the Mars Exploration Rovers. Future work will incorporate these methods into camera hardware for real-time processing.
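
    A "texture channel" in this sense can be as simple as a sliding-window statistic. The sketch below computes a local standard deviation image, one plausible roughness signature, and thresholds it into a crude two-class surface map; it is our illustration, not the flight implementation.

        # Local standard deviation as a toy texture channel.
        import numpy as np
        from scipy import ndimage

        def local_std(img: np.ndarray, size: int = 7) -> np.ndarray:
            img = img.astype(float)
            mean = ndimage.uniform_filter(img, size)
            mean_sq = ndimage.uniform_filter(img ** 2, size)
            return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

        img = np.random.default_rng(3).random((128, 128))  # placeholder image
        rough = local_std(img) > 0.25                      # crude surface class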

  16. Adaptation of the Tool to Estimate Patient Costs Questionnaire into Indonesian Context for Tuberculosis-affected Households.

    PubMed

    Fuady, Ahmad; Houweling, Tanja A; Mansyur, Muchtaruddin; Richardus, Jan H

    2018-01-01

    Indonesia is the second-highest country for tuberculosis (TB) incidence worldwide. Hence, it urgently requires improvements and innovations beyond the strategies that are currently being implemented throughout the country. One fundamental step in monitoring its progress is preparing a validated tool to measure total patient costs and catastrophic total costs. The World Health Organization (WHO) recommends using a version of the generic questionnaire that has been adapted to the local cultural context in order to interpret findings correctly. This study aimed to adapt the Tool to Estimate Patient Costs questionnaire, which measures total costs and catastrophic total costs for tuberculosis-affected households, to the Indonesian context. The tool was adapted using best-practice guidelines. On the basis of a pre-test performed in a previous study (referred to as the Phase 1 Study), we refined the adaptation process by comparing it with the generic tool introduced by the WHO. We also held an expert committee review and performed pre-testing by interviewing 30 TB patients. After pre-testing, the tool was provided with complete explanation sheets for finalization. Seventy-two major changes were made during the adaptation process, including changing the answer choices to match the Indonesian context, refining the flow of questions, deleting questions, changing some words and restoring original questions that had been changed in the Phase 1 Study. Participants indicated that most questions were clear and easy to understand. To address recall difficulties by the participants, we made some adaptations to obtain data that might be missing, such as tracking data to medical records, developing a proxy of costs and guiding interviewers to ask for a specific value when participants were uncertain about the estimated market value of property they had sold. The adapted Tool to Estimate Patient Costs in Bahasa Indonesia is comprehensive and ready for use in future studies on TB-related catastrophic costs and is suitable for monitoring progress towards the target of the End TB Strategy.

  17. PLASMAP: an interactive computational tool for storage, retrieval and device-independent graphic display of conventional restriction maps.

    PubMed Central

    Stone, B N; Griesinger, G L; Modelevsky, J L

    1984-01-01

    We describe an interactive computational tool, PLASMAP, which allows the user to electronically store, retrieve, and display circular restriction maps. PLASMAP permits users to construct libraries of plasmid restriction maps as a set of files which may be edited in the laboratory at any time. The display feature of PLASMAP quickly generates device-independent, artist-quality, full-color or monochrome, hard copies or CRT screens of complex, conventional circular restriction maps. PMID:6320096

  18. Making Air Pollution Visible: A Tool for Promoting Environmental Health Literacy.

    PubMed

    Cleary, Ekaterina Galkina; Patton, Allison P; Wu, Hsin-Ching; Xie, Alan; Stubblefield, Joseph; Mass, William; Grinstein, Georges; Koch-Weser, Susan; Brugge, Doug; Wong, Carolyn

    2017-04-12

    Digital maps are instrumental in conveying information about environmental hazards geographically. For laypersons, computer-based maps can serve as tools to promote environmental health literacy about invisible traffic-related air pollution and ultrafine particles. Concentrations of these pollutants are higher near major roadways and increasingly linked to adverse health effects. Interactive computer maps provide visualizations that can allow users to build mental models of the spatial distribution of ultrafine particles in a community and learn about the risk of exposure in a geographic context. The objective of this work was to develop a new software tool appropriate for educating members of the Boston Chinatown community (Boston, MA, USA) about the nature and potential health risks of traffic-related air pollution. The tool, the Interactive Map of Chinatown Traffic Pollution ("Air Pollution Map" hereafter), is a prototype that can be adapted for the purpose of educating community members across a range of socioeconomic contexts. We built the educational visualization tool on the open source Weave software platform. We designed the tool as the centerpiece of a multimodal and intergenerational educational intervention about the health risk of traffic-related air pollution. We used a previously published fine resolution (20 m) hourly land-use regression model of ultrafine particles as the algorithm for predicting pollution levels and applied it to one neighborhood, Boston Chinatown. In designing the map, we consulted community experts to help customize the user interface to communication styles prevalent in the target community. The product is a map that displays ultrafine particulate concentrations averaged across census blocks using a color gradation from white to dark red. The interactive features allow users to explore and learn how changing meteorological conditions and traffic volume influence ultrafine particle concentrations. Users can also select from multiple map layers, such as a street map or satellite view. The map legends and labels are available in both Chinese and English, and are thus accessible to immigrants and residents with proficiency in either language. The map can be either Web or desktop based. The Air Pollution Map incorporates relevant language and landmarks to make complex scientific information about ultrafine particles accessible to members of the Boston Chinatown community. In future work, we will test the map in an educational intervention that features intergenerational colearning and the use of supplementary multimedia presentations. ©Ekaterina Galkina Cleary, Allison P Patton, Hsin-Ching Wu, Alan Xie, Joseph Stubblefield, William Mass, Georges Grinstein, Susan Koch-Weser, Doug Brugge, Carolyn Wong. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 12.04.2017.

  19. Patscanui: an intuitive web interface for searching patterns in DNA and protein data.

    PubMed

    Blin, Kai; Wohlleben, Wolfgang; Weber, Tilmann

    2018-05-02

    Patterns in biological sequences frequently signify interesting features in the underlying molecule. Many tools exist to search for well-known patterns. Less support is available for exploratory analysis, where no well-defined patterns are known yet. PatScanUI (https://patscan.secondarymetabolites.org/) provides a highly interactive web interface to the powerful generic pattern search tool PatScan. The complex PatScan-patterns are created in a drag-and-drop aware interface allowing researchers to do rapid prototyping of the often complicated patterns useful to identifying features of interest.

  20. The Alba ray tracing code: ART

    NASA Astrophysics Data System (ADS)

    Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi

    2013-09-01

    The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as an easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, and still providing normalized values of flux and resolution in physically meaningful units.

  1. DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.

    PubMed

    Lhachimi, Stefan K; Nusselder, Wilma J; Smit, Henriette A; van Baal, Pieter; Baili, Paolo; Bennett, Kathleen; Fernández, Esteve; Kulik, Margarete C; Lobstein, Tim; Pomerleau, Joceline; Mackenbach, Johan P; Boshuizen, Hendriek C

    2012-01-01

    Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures--e.g. life expectancy and disease-free life expectancy--and detailed data--e.g. prevalences and mortality/survival rates--by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiologic evidence.
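
    The Markov-based approach with explicit risk-factor states can be sketched at toy scale: a population split over risk-factor states accumulates disease from state-specific incidence while mortality removes people each year. Every number below is an invented placeholder, not DYNAMO-HIA output.

        # Toy Markov projection over risk-factor states (illustrative only).
        import numpy as np

        healthy = np.array([0.70, 0.30])        # [non-smoker, smoker] shares
        diseased = np.zeros(2)
        incidence = np.array([0.002, 0.008])    # annual incidence by state (assumed)
        mortality = np.array([0.010, 0.015])    # annual mortality by state (assumed)
        excess = 2.0                            # assumed mortality multiplier if diseased

        for _ in range(20):                     # project 20 years
            new_cases = healthy * incidence
            healthy = (healthy - new_cases) * (1 - mortality)
            diseased = (diseased + new_cases) * (1 - excess * mortality)
        print("prevalence by state:", diseased / (healthy + diseased))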

  2. Using Petri Net Tools to Study Properties and Dynamics of Biological Systems

    PubMed Central

    Peleg, Mor; Rubin, Daniel; Altman, Russ B.

    2005-01-01

    Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
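
    A qualitative Petri net needs very little machinery: places holding tokens and transitions that fire when every input place is marked. The toy enzyme-substrate net below is an invented example of the kind of biological process such tools represent.

        # Minimal Petri net interpreter (places, tokens, transitions).
        transitions = {
            "bind":    (["enzyme", "substrate"], ["complex"]),
            "release": (["complex"], ["enzyme", "product"]),
        }
        marking = {"enzyme": 1, "substrate": 3, "complex": 0, "product": 0}

        def enabled(name):
            return all(marking[p] > 0 for p in transitions[name][0])

        while any(enabled(t) for t in transitions):
            t = next(t for t in transitions if enabled(t))
            for p in transitions[t][0]:
                marking[p] -= 1       # consume input tokens
            for p in transitions[t][1]:
                marking[p] += 1       # produce output tokens
        print(marking)                # all substrate converted to product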

  3. Development of emergency response tools for accidental radiological contamination of French coastal areas.

    PubMed

    Duffa, Céline; Bailly du Bois, Pascal; Caillaud, Matthieu; Charmasson, Sabine; Couvez, Céline; Didier, Damien; Dumas, Franck; Fievet, Bruno; Morillon, Mehdi; Renaud, Philippe; Thébault, Hervé

    2016-01-01

    The Fukushima nuclear accident resulted in the largest ever accidental release of artificial radionuclides in coastal waters. This accident has shown the importance of marine assessment capabilities for emergency response and the need to develop tools for adequately predicting the evolution and potential impact of radioactive releases to the marine environment. The French Institute for Radiological Protection and Nuclear Safety (IRSN) equips its emergency response centre with operational tools to assist experts and decision makers in the event of accidental atmospheric releases and contamination of the terrestrial environment. The on-going project aims to develop tools for the management of marine contamination events in French coastal areas. This should allow us to evaluate and anticipate post-accident conditions, including potential contamination sites, contamination levels and potential consequences. In order to achieve this goal, two complementary tools are developed: site-specific marine data sheets and a dedicated simulation tool (STERNE, Simulation du Transport et du transfert d'Eléments Radioactifs dans l'environNEment marin). Marine data sheets are used to summarize the marine environment characteristics of the various sites considered, and to identify vulnerable areas requiring implementation of population protection measures, such as aquaculture areas, beaches or industrial water intakes, as well as areas of major ecological interest. Local climatological data (dominant sea currents as a function of meteorological or tidal conditions) serving as the basis for an initial environmental sampling strategy is provided whenever possible, along with a list of possible local contacts for operational management purposes. The STERNE simulation tool is designed to predict radionuclide dispersion and contamination in seawater and marine species by incorporating spatio-temporal data. 3D hydrodynamic forecasts are used as input data. Direct discharge points or atmospheric deposition source terms can be taken into account. STERNE calculates Eulerian radionuclide dispersion using advection and diffusion equations established offline from hydrodynamic calculations. A radioecological model based on dynamic transfer equations is implemented to evaluate activity concentrations in aquatic organisms. Essential radioecological parameters (concentration factors and single or multicomponent biological half-lives) have been compiled for main radionuclides and generic marine species (fish, molluscs, crustaceans and algae). Dispersion and transfer calculations are performed simultaneously on a 3D grid. Results can be plotted on maps, with possible tracking of spatio-temporal evolution. Post-processing and visualization can then be performed. Copyright © 2015 Elsevier Ltd. All rights reserved.
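
    The dynamic transfer equations mentioned above are, in their single-compartment form, a first-order relaxation of organism activity toward the product of the concentration factor and the seawater activity, at a rate fixed by the biological half-life. A minimal sketch with placeholder parameter values:

        # One-compartment uptake/depuration model (illustrative parameters).
        import math

        def organism_activity(c_water, cf, t_half_days, days, c0=0.0):
            """Activity (Bq/kg) after `days` at constant seawater activity
            c_water (Bq/L), concentration factor cf (L/kg)."""
            k = math.log(2) / t_half_days      # depuration rate constant
            c_eq = cf * c_water                # equilibrium activity
            return c_eq + (c0 - c_eq) * math.exp(-k * days)

        print(organism_activity(c_water=100.0, cf=50.0, t_half_days=30.0, days=60))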

  4. Spaceborne imaging radar research in the 90's

    NASA Technical Reports Server (NTRS)

    Elachi, Charles

    1986-01-01

    The imaging radar experiments on SEASAT and on the space shuttle (SIR-A and SIR-B) have led to a wide interest in the use of spaceborne imaging radars in Earth and planetary sciences. The radar sensors provide unique and complementary information to what is acquired with visible and infrared imagers. This includes subsurface imaging in arid regions, all-weather observation of ocean surface dynamic phenomena, structural mapping, soil moisture mapping, stereo imaging and resulting topographic mapping. However, experiments up to now have exploited only a very limited range of the generic capability of radar sensors. With planned sensor developments in the late 80's and early 90's, a quantum jump will be made in our ability to fully exploit the potential of these sensors. These developments include: multiparameter research sensors such as SIR-C and X-SAR, long-term and global monitoring sensors such as ERS-1, JERS-1, EOS, Radarsat, GLORI and the spaceborne sounder, planetary mapping sensors such as the Magellan and Cassini/Titan mappers, topographic three-dimensional imagers such as the scanning radar altimeter and three-dimensional rain mapping. These sensors and their associated research are briefly described.

  5. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    PubMed

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, action identification theory suggests a psychological tension between them: the more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  6. Story Map Instruction: A Road Map for Reading Comprehension.

    ERIC Educational Resources Information Center

    Davis, Zephaniah T.; McPherson, Michael D.

    1989-01-01

    Introduces teachers to the development and use of story maps as a tool for promoting reading comprehension. Presents a definition and review of story map research. Explains how to construct story maps, and offers suggestions for starting story map instruction. Provides variations on the use of story maps. (MG)

  7. Music-therapy analyzed through conceptual mapping

    NASA Astrophysics Data System (ADS)

    Martinez, Rodolfo; de la Fuente, Rebeca

    2002-11-01

    Conceptual maps have been employed lately as a learning tool, as a modern study technique, and as a new way to understand intelligence, which allows for the development of a strong theoretical reference, in order to prove the research hypothesis. This paper presents a music-therapy analysis based on this tool to produce a conceptual mapping network, which ranges from magic through the rigor of the hard sciences.

  8. Science Teachers' Use of a Concept Map Marking Guide as a Formative Assessment Tool for the Concept of Energy

    ERIC Educational Resources Information Center

    Won, Mihye; Krabbe, Heiko; Ley, Siv Ling; Treagust, David F.; Fischer, Hans E.

    2017-01-01

    In this study, we investigated the value of a concept map marking guide as an alternative formative assessment tool for science teachers to adopt for the topic of energy. Eight high school science teachers marked students' concept maps using an itemized holistic marking guide. Their marking was compared with the researchers' marking and the scores…

  9. New 3D seismicity maps using chromo-stereoscopy with two alternative freewares

    NASA Astrophysics Data System (ADS)

    Okamoto, Y.

    2011-12-01

    Seismicity maps play a key role in introductory geoscience studies and outreach programs. Various techniques are used to show earthquakes in three dimensions; using "chromo-stereoscopy" is our simple and easy-to-produce solution. ChromaDepth 3D glasses are employed for this purpose. The glasses consist of two transparent blazed-grating films in a paper holder and cost about US$1. Viewed through these glasses, a colored chart takes on three-dimensional perspective because the gratings' dispersion converts color into apparent depth. We use two complementary freeware packages to make the maps: GMT (Generic Mapping Tools; Wessel and Smith, 1988) and POV-Ray (Persistence of Vision Pty. Ltd., 2004). The two programs have their own advantages: GMT is specialized for map making with simple scripts, while POV-Ray produces realistic 3D rendered images with more complicated scripts. Earthquakes are plotted with rainbow color codes according to their depths on a black background, as printed or on-screen images, so that red shallow earthquakes float in front while blue deeper ones sink behind. The effect is so striking that students wearing these glasses for the first time are moved and fascinated by this simple mechanism. The data used here are from the JMA seismicity catalogue and the USGS (ANSS) catalogue. The POV-Ray version needs coastline data, which we obtained from the Coastline Extractor (NGDC) website. Also, POV-Ray has no built-in function to draw lines in three dimensions, so some experimentation was needed to show them in relief. The main target of our maps is the Wadati-Benioff zone, in which the surface of the subducting oceanic plate is fringed by deeper earthquakes colored yellow, green to blue. Active volcanic regions such as the Hawaiian islands and active fault regions such as the San Andreas Fault are also effective targets of our method. However, because these regions have shallow, complicated seismic structures rather than subducting plate boundaries, the depth effect is somewhat diminished; we are now trying to render a transparent sphere model to improve it. A future task is to evaluate the three-dimensional effect quantitatively. The present version of our maps has some drawbacks, but the simple and easy production process is quite suitable for classroom use and outreach purposes, not only for geoscience study itself but also for optics study at the secondary level. The maps described here are now available on our website (http://www.osaka-kyoiku.ac.jp/ yossi/).
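
    The core of the technique is simply mapping hypocentral depth to hue on a black background so that ChromaDepth glasses read colour as depth. A minimal Python sketch of that encoding follows; the synthetic catalogue is a hypothetical stand-in for the JMA or ANSS data, and the authors' actual maps were produced with GMT and POV-Ray rather than matplotlib.

```python
# Depth-to-colour encoding for chromo-stereoscopic viewing: shallow events red,
# deep events blue/violet, on a black background. The synthetic catalogue is a
# hypothetical stand-in for JMA/ANSS data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n = 2000
lon = 130.0 + 15.0 * rng.random(n)         # toy region, roughly around Japan
lat = 30.0 + 12.0 * rng.random(n)
depth = 700.0 * rng.random(n) ** 2         # more shallow events than deep ones

fig, ax = plt.subplots(figsize=(8, 6), facecolor="black")
ax.set_facecolor("black")
# rainbow_r puts small depths at the red end, as ChromaDepth glasses require
sc = ax.scatter(lon, lat, c=depth, cmap="rainbow_r", s=4, vmin=0, vmax=700)
cbar = fig.colorbar(sc, ax=ax)
cbar.set_label("Depth (km)", color="white")
ax.set_xlabel("Longitude", color="white")
ax.set_ylabel("Latitude", color="white")
ax.tick_params(colors="white")
plt.savefig("chromadepth_map.png", dpi=300, facecolor="black")
```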

  10. Preferred reporting items for studies mapping onto preference-based outcome measures: the MAPS statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2016-02-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MApping onto Preference-based measures reporting Standards (MAPS) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprised of six health economists and one Delphi methodologist. A two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorised within six sections, namely (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  11. PREFERRED REPORTING ITEMS FOR STUDIES MAPPING ONTO PREFERENCE-BASED OUTCOME MEASURES: THE MAPS STATEMENT.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-01-01

    "Mapping" onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprised of six health economists and one Delphi methodologist. A two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies, and the biomedical journal editorial community was used to identify a list of essential reporting items from this larger list. From the initial de novo list of twenty-nine candidate items, a set of twenty-three essential reporting items was developed. The items are presented numerically and categorized within six sections, namely: (i) title and abstract, (ii) introduction, (iii) methods, (iv) results, (v) discussion, and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency. and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in five years' time.

  12. Mapping, Awareness, And Virtualization Network Administrator Training Tool Virtualization Module

    DTIC Science & Technology

    2016-03-01

    Master's thesis by Erik W. Berndt, Naval Postgraduate School, March 2016; thesis advisor: John Gibson. Only report-documentation front matter survives in this record.

  13. Mapping healthcare systems: a policy relevant analytic tool

    PubMed Central

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L.V.

    2017-01-01

    Abstract Background In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool, the University of California, San Francisco mapping tool (the Tool), which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Methods Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and the amount spent through each source, purchasers, populations covered, provider categories, and the relationships between these entities. Results We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. Conclusions As part of the systems-strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare system structural designs, using a common framework that fosters multi-country comparative analyses. PMID:28541518

  14. Reconstruction of genome-scale human metabolic models using omics data.

    PubMed

    Ryu, Jae Yong; Kim, Hyun Uk; Lee, Sang Yup

    2015-08-01

    The impact of genome-scale human metabolic models on human systems biology and medical sciences is growing, thanks to the increasing number of model-building platforms and the volume of publicly available omics data. Genome-scale human metabolic models started with Recon 1 in 2007, and have since been used to describe metabolic phenotypes of healthy and diseased human tissues and cells, and to predict therapeutic targets. Here we review recent trends in genome-scale human metabolic modeling, including the various generic and tissue/cell-type-specific human metabolic models developed to date, and the methods, databases and platforms used to construct them. For generic human metabolic models, we pay attention to Recon 2 and HMR 2.0, with emphasis on the data sources used to construct them. Draft and high-quality tissue/cell-type-specific human metabolic models have been generated using these generic human metabolic models. Integration of tissue/cell-type-specific omics data with the generic human metabolic models is the key step, and we discuss omics data and their integration methods to achieve this task. The initial version of a tissue/cell-type-specific human metabolic model can be further refined computationally through gap filling, reaction directionality assignment and subcellular localization of metabolic reactions. We review relevant tools for this model refinement procedure as well. Finally, we suggest directions for further studies on reconstructing an improved human metabolic model.
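
    As a concrete illustration of what these genome-scale models support, the sketch below loads a generic human model from SBML with the cobrapy package and runs flux balance analysis. The file name is a hypothetical placeholder, and the tissue-specific extraction, gap filling and localization steps discussed above use additional tools not shown here.

```python
# Minimal constraint-based workflow sketch (assumes the cobrapy package and a
# locally downloaded SBML file of a generic human model such as Recon 2).
import cobra

model = cobra.io.read_sbml_model("Recon2.xml")   # hypothetical local file name
print(len(model.reactions), "reactions,", len(model.metabolites), "metabolites")

solution = model.optimize()                      # flux balance analysis
print("objective flux:", round(solution.objective_value, 3))

# Probe the metabolic role of one gene with a single-gene knockout;
# changes made inside the context manager are reverted on exit.
with model:
    model.genes[0].knock_out()
    print("flux after knockout:", round(model.optimize().objective_value, 3))
```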

  15. Intrinsically Disordered Protein Specific Force Field CHARMM36IDPSFF.

    PubMed

    Liu, Hao; Song, Dong; Lu, Hui; Luo, Ray; Chen, Hai-Feng

    2018-05-28

    Intrinsically disordered proteins (IDPs) are closely related to various human diseases. Because IDPs lack a well-defined tertiary structure, it is difficult to use X-ray and NMR methods to measure their structures. Molecular dynamics simulation is therefore a useful tool to study the conformer distribution of IDPs. However, most generic protein force fields have been found insufficient for simulations of IDPs. Here we report our development for the CHARMM community. Our residue-specific IDP force field (CHARMM36IDPSFF) was developed based on the base generic force field with CMAP corrections for all 20 naturally occurring amino acids. Multiple tests show that chemical shifts simulated with the newly developed force field are in quantitative agreement with NMR experiments and are more accurate than those from the base generic force field. Comparison of J-couplings with previous work shows that CHARMM36IDPSFF and its corresponding base generic force field each have their own advantages. In addition, CHARMM36IDPSFF simulations also agree with experiment for SAXS profiles and radii of gyration of IDPs. Detailed analysis shows that CHARMM36IDPSFF can sample more diverse and disordered conformers. These findings confirm that the newly developed force field improves the balance of accuracy and efficiency for the conformer sampling of IDPs. This article is protected by copyright. All rights reserved.

  16. A Prototype Performance Assessment Model for Generic Deep Borehole Repository for High-Level Nuclear Waste - 12132

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Arnold, Bill W.; Swift, Peter N.

    2012-07-01

    A deep borehole repository is one of the four geologic disposal system options currently under study by the U.S. DOE to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic deep borehole repository study is to develop the necessary modeling tools to evaluate and improve the understanding of the repository system response and processes relevant to long-term disposal of UNF and HLW in a deep borehole. A prototype performance assessment model for a generic deep borehole repository has been developed using the approach for a mined geological repository. The preliminary results from the simplified deep borehole generic repository performance assessment indicate that soluble, non-sorbing (or weakly sorbing) fission product radionuclides, such as I-129, Se-79 and Cl-36, are the likely major dose contributors, and that the annual radiation doses to hypothetical future humans associated with those releases may be extremely small. While much work needs to be done to validate the model assumptions and parameters, these preliminary results highlight the importance of a robust seal design in assuring long-term isolation, and suggest that deep boreholes may be a viable alternative to mined repositories for disposal of both HLW and UNF. (authors)

  17. Key Technical Aspects Influencing the Accuracy of Tablet Subdivision.

    PubMed

    Teixeira, Maíra T; Sá-Barreto, Lívia C L; Gratieri, Taís; Gelfuso, Guilherme M; Silva, Izabel C R; Cunha-Filho, Marcílio S S

    2017-05-01

    Tablet subdivision is a common practice used mainly for dose adjustment. The aim of this study was to investigate how the technical aspects of production, as well as the method of tablet subdivision (employing a tablet splitter or a kitchen knife), influence the accuracy of this practice. Five drugs commonly used as subdivided tablets were selected. For each drug, the innovator drug product, a scored generic and a non-scored generic were investigated, totaling fifteen drug products. Mechanical and physical tests, including image analysis, were performed. Additionally, comparisons were made between tablet subdivision methods, score, shape, diluent composition and coating. Image analysis based on surface area was a useful alternative assay for evaluating the accuracy of tablet subdivision. The tablet splitter demonstrated an advantage over the knife, showing better results in weight loss and friability tests. Oblong, coated and scored tablets had better results after subdivision than round, uncoated and non-scored tablets. The presence of elastic diluents such as starch and dibasic phosphate dihydrate conferred more appropriate behaviour for the subdivision process than plastic materials such as microcrystalline cellulose and lactose. Finally, differences were observed between generics and their innovator products for all selected drugs with regard to the quality control assays of divided tablets, which highlights the need for health regulations to consider subdivision performance, at least in the marketing authorization of generic products.

  18. Community-Based Individual Knowledge Construction in the Classroom: A Process-Oriented Account

    ERIC Educational Resources Information Center

    Looi, C.-K.; Chen, W.

    2010-01-01

    This paper explores the process of knowledge convergence and knowledge sharing in the context of classroom collaboration in which students do a group learning activity mediated by a generic representation tool. In analysing the transcript of the interactions of a group, we adapt the group cognition method of Stahl and the uptake analysis…

  19. Remote Learning for the Manipulation and Control of Robotic Cells

    ERIC Educational Resources Information Center

    Goldstain, Ofir; Ben-Gal, Irad; Bukchin, Yossi

    2007-01-01

    This work proposes an approach to remote learning of robotic cells based on internet and simulation tools. The proposed approach, which integrates remote-learning and tele-operation into a generic scheme, is designed to enable students and developers to set-up and manipulate a robotic cell remotely. Its implementation is based on a dedicated…

  20. Language Resources for Language Technology: Proceedings of the TELRI (Trans-European Language Resources Infrastructure) European Seminar (1st, Tihany, Hungary, September 15-16, 1995).

    ERIC Educational Resources Information Center

    Rettig, Heike, Ed.

    This proceedings contains papers from the first European seminar of the Trans-European Language Resources Infrastructure (TELRI), including: "Cooperation with Central and Eastern Europe in Language Engineering" (Poul Andersen); "Language Technology and Language Resources in China" (Feng Zhiwei); "Public Domain Generic Tools:…

  1. The Intelligent Monitoring System: Generic Database Interface (GDI). User Manual. Revision

    DTIC Science & Technology

    1994-01-03

    The abstract of this record is a garbled scan; the recoverable content is: the User Manual consists of FrameMaker source organized into a book named gdibk, and FrameMaker is a document publishing tool from Frame Technology Corporation.

  2. Applying a Generic Intelligent Tutoring System (ITS) Authoring Tool to Specific Military Domains

    DTIC Science & Technology

    2006-01-01

    Only fragments of this report survive in the record: the authoring tool can evaluate a student's actions in a free-play simulation, or compare them against correct and likely incorrect solutions for each scenario; IISAT's comparison libraries successfully evaluated a student's battle plan in the free-play simulation.

  3. A software tool for rapid flood inundation mapping

    USGS Publications Warehouse

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development's Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid-assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites against inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface-water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
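
    The Manning equation that the abstract names as the GFT's technical basis is easy to state in code. The sketch below computes discharge for a rectangular channel in SI units; the channel geometry and roughness value are illustrative assumptions, not GFT parameters.

```python
# Manning equation for steady uniform open-channel flow (SI units):
# Q = (1/n) * A * R^(2/3) * S^(1/2), illustrated for a rectangular channel.
import math

def manning_discharge(width_m: float, depth_m: float, slope: float, n: float) -> float:
    """Discharge Q (m^3/s) for a rectangular channel of given width and depth."""
    area = width_m * depth_m                       # flow area A
    wetted_perimeter = width_m + 2.0 * depth_m     # bed plus both banks
    hydraulic_radius = area / wetted_perimeter     # R = A / P
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)

# Example: 50 m wide channel, 2 m deep, slope 0.0005, n = 0.035 (natural stream)
print(f"Q = {manning_discharge(50.0, 2.0, 0.0005, 0.035):.1f} m^3/s")
```

    In inundation mapping the relation is typically used in the opposite direction: given a flood discharge, a root-finder over this function yields the water depth, which is then draped over the processed elevation data.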

  4. Locating Sequence on FPC Maps and Selecting a Minimal Tiling Path

    PubMed Central

    Engler, Friedrich W.; Hatfield, James; Nelson, William; Soderlund, Carol A.

    2003-01-01

    This study discusses three software tools: the first two aid in integrating sequence with an FPC physical map, and the third automatically selects a minimal tiling path given genomic draft sequence and BAC end sequences. The first tool, FSD (FPC Simulated Digest), takes a sequenced clone and adds it back to the map based on a fingerprint generated by an in silico digest of the clone. This allows verification of sequenced clone positions and the integration of sequenced clones that were not originally part of the FPC map. The second tool, BSS (Blast Some Sequence), takes a query sequence and positions it on the map based on sequence associated with the clones in the map. BSS has multiple uses, as follows: (1) When the query is a file of marker sequences, they can be added as electronic markers. (2) When the query is draft sequence, the results of BSS can be used to close gaps in a sequenced clone or the physical map. (3) When the query is a sequenced clone and the target is BAC end sequences, one may select the next clone for sequencing using both sequence comparison results and map location. (4) When the query is whole-genome draft sequence and the target is BAC end sequences, the results can be used to select many clones for a minimal tiling path at once. The third tool, pickMTP, automates the majority of this last usage of BSS. Results are presented using the rice FPC map, BAC end sequences, and whole-genome shotgun from Syngenta. PMID:12915486

  5. KinMap: a web-based tool for interactive navigation through human kinome data.

    PubMed

    Eid, Sameh; Turk, Samo; Volkamer, Andrea; Rippmann, Friedrich; Fulle, Simone

    2017-01-05

    Annotation of the phylogenetic tree of the human kinome is an intuitive way to visualize compound profiling data, structural features of kinases or functional relationships within this important class of proteins. The increasing volume and complexity of kinase-related data underline the need for a tool that enables complex queries pertaining to kinase disease involvement and potential therapeutic uses of kinase inhibitors. Here, we present KinMap, a user-friendly online tool that facilitates interactive navigation through kinase knowledge by linking biochemical, structural, and disease association data to the human kinome tree. To this end, preprocessed data from freely available sources, such as ChEMBL, the Protein Data Bank, and the Center for Therapeutic Target Validation platform, are integrated into KinMap and can easily be complemented by proprietary data. The value of KinMap is demonstrated with examples of uncovering new therapeutic indications for known kinase inhibitors and of prioritizing kinases for drug development efforts. KinMap represents a new generation of kinome tree viewers which facilitates interactive exploration of the human kinome. KinMap enables generation of high-quality annotated images of the human kinome tree as well as exchange of kinome-related data in scientific communications. Furthermore, KinMap supports multiple input and output formats and recognizes alternative kinase names, linking them to a unified naming scheme, which makes it a useful tool across different disciplines and applications. A web-service of KinMap is freely available at http://www.kinhub.org/kinmap/ .

  6. Automated mapping of clinical terms into SNOMED-CT. An application to codify procedures in pathology.

    PubMed

    Allones, J L; Martinez, D; Taboada, M

    2014-10-01

    Clinical terminologies are considered a key technology for capturing clinical data in a precise and standardized manner, which is critical for accurately exchanging information among different applications, medical records and decision support systems. An important step in promoting the real use of clinical terminologies, such as SNOMED-CT, is to facilitate the process of finding mappings between local terms in medical records and concepts in terminologies. In this paper, we propose a mapping tool to discover text-to-concept mappings in SNOMED-CT. Name-based techniques were combined with a query expansion system to generate alternative search terms, and with a strategy to analyze and take advantage of the semantic relationships of SNOMED-CT concepts. The developed tool was evaluated and compared to the search services provided by two SNOMED-CT browsers. Our tool automatically mapped clinical terms from a Spanish glossary of procedures in pathology with 88.0% precision and 51.4% recall, providing a substantial improvement in recall (28% and 60%) over other publicly accessible mapping services. The improvements achieved by the mapping tool are encouraging. Our results demonstrate the feasibility of accurately mapping clinical glossaries to SNOMED-CT concepts by means of a combination of structural, query expansion and name-based techniques. We have shown that SNOMED-CT is a rich source of knowledge from which to infer synonyms for the medical domain. Results show that an automated query expansion system partially overcomes the challenge of vocabulary mismatch.
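
    A minimal sketch of the general approach, name-based matching plus query expansion, is shown below. The three-concept index and the synonym table are hypothetical stand-ins for SNOMED-CT content, and difflib stands in for the paper's more sophisticated matching strategy.

```python
# Name-based term-to-concept matching with simple query expansion; the concept
# ids and synonym table are hypothetical stand-ins for SNOMED-CT content.
import difflib

concepts = {
    "excision of skin lesion": "177250009",
    "biopsy of skin": "240977001",
    "fine needle aspiration": "14766002",
}
synonyms = {"fna": "fine needle aspiration", "skin biopsy": "biopsy of skin"}

def normalize(term: str) -> str:
    return " ".join(term.lower().split())

def map_term(term: str, cutoff: float = 0.8):
    q = normalize(term)
    q = synonyms.get(q, q)                     # query expansion via synonym table
    if q in concepts:                          # exact match first
        return q, concepts[q]
    hits = difflib.get_close_matches(q, list(concepts), n=1, cutoff=cutoff)
    return (hits[0], concepts[hits[0]]) if hits else None

print(map_term("FNA"))                         # synonym expansion, then exact hit
print(map_term("excision of skin  lesions"))  # fuzzy match after normalization
```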

  7. Diversity Arrays Technology (DArT) for whole-genome profiling of barley

    PubMed Central

    Wenzl, Peter; Carling, Jason; Kudrna, David; Jaccoud, Damian; Huttner, Eric; Kleinhofs, Andris; Kilian, Andrzej

    2004-01-01

    Diversity Arrays Technology (DArT) can detect and type DNA variation at several hundred genomic loci in parallel without relying on sequence information. Here we show that it can be effectively applied to genetic mapping and diversity analyses of barley, a species with a 5,000-Mbp genome. We tested several complexity reduction methods and selected two that generated the most polymorphic genomic representations. Arrays containing individual fragments from these representations generated DArT fingerprints with a genotype call rate of 98.0% and a scoring reproducibility of at least 99.8%. The fingerprints grouped barley lines according to known genetic relationships. To validate the Mendelian behavior of DArT markers, we constructed a genetic map for a cross between cultivars Steptoe and Morex. Nearly all polymorphic array features could be incorporated into one of seven linkage groups (98.8%). The resulting map comprised ≈385 unique DArT markers and spanned 1,137 centimorgans. A comparison with the restriction fragment length polymorphism-based framework map indicated that the quality of the DArT map was equivalent, if not superior, to that of the framework map. These results highlight the potential of DArT as a generic technique for genome profiling in the context of molecular breeding and genomics. PMID:15192146

  8. Singapore Genome Variation Project: a haplotype map of three Southeast Asian populations.

    PubMed

    Teo, Yik-Ying; Sim, Xueling; Ong, Rick T H; Tan, Adrian K S; Chen, Jieming; Tantoso, Erwin; Small, Kerrin S; Ku, Chee-Seng; Lee, Edmund J D; Seielstad, Mark; Chia, Kee-Seng

    2009-11-01

    The Singapore Genome Variation Project (SGVP) provides a publicly available resource of 1.6 million single nucleotide polymorphisms (SNPs) genotyped in 268 individuals from the Chinese, Malay, and Indian population groups in Southeast Asia. This online database catalogs information and summaries on genotype and phased haplotype data, including allele frequencies, assessment of linkage disequilibrium (LD), and recombination rates in a format similar to the International HapMap Project. Here, we introduce this resource and describe the analysis of human genomic variation upon agglomerating data from the HapMap and the Human Genome Diversity Project, providing useful insights into the population structure of the three major population groups in Asia. In addition, this resource also surveyed across the genome for variation in regional patterns of LD between the HapMap and SGVP populations, and for signatures of positive natural selection using two well-established metrics: iHS and XP-EHH. The raw and processed genetic data, together with all population genetic summaries, are publicly available for download and browsing through a web browser modeled with the Generic Genome Browser.

  9. Singapore Genome Variation Project: A haplotype map of three Southeast Asian populations

    PubMed Central

    Teo, Yik-Ying; Sim, Xueling; Ong, Rick T.H.; Tan, Adrian K.S.; Chen, Jieming; Tantoso, Erwin; Small, Kerrin S.; Ku, Chee-Seng; Lee, Edmund J.D.; Seielstad, Mark; Chia, Kee-Seng

    2009-01-01

    The Singapore Genome Variation Project (SGVP) provides a publicly available resource of 1.6 million single nucleotide polymorphisms (SNPs) genotyped in 268 individuals from the Chinese, Malay, and Indian population groups in Southeast Asia. This online database catalogs information and summaries on genotype and phased haplotype data, including allele frequencies, assessment of linkage disequilibrium (LD), and recombination rates in a format similar to the International HapMap Project. Here, we introduce this resource and describe the analysis of human genomic variation upon agglomerating data from the HapMap and the Human Genome Diversity Project, providing useful insights into the population structure of the three major population groups in Asia. In addition, this resource also surveyed across the genome for variation in regional patterns of LD between the HapMap and SGVP populations, and for signatures of positive natural selection using two well-established metrics: iHS and XP-EHH. The raw and processed genetic data, together with all population genetic summaries, are publicly available for download and browsing through a web browser modeled with the Generic Genome Browser. PMID:19700652
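
    The two selection metrics named above, iHS and XP-EHH, are implemented in the scikit-allel package; the sketch below applies them to toy haplotype data. The random arrays are hypothetical stand-ins for phased SGVP/HapMap haplotypes, and a real analysis would standardize the scores and use genetic map positions.

```python
# Toy computation of iHS and XP-EHH with scikit-allel; random haplotypes stand
# in for real phased data, so most scores will be NaN or meaningless here.
import numpy as np
import allel

rng = np.random.default_rng(0)
h1 = allel.HaplotypeArray(rng.integers(0, 2, size=(100, 40)))  # population 1
h2 = allel.HaplotypeArray(rng.integers(0, 2, size=(100, 40)))  # population 2
pos = np.sort(rng.choice(1_000_000, size=100, replace=False))  # variant positions

ihs = allel.ihs(h1, pos)           # within-population decay of haplotype homozygosity
xpehh = allel.xpehh(h1, h2, pos)   # cross-population comparison of EHH decay
print(np.nanmax(np.abs(ihs)), np.nanmax(np.abs(xpehh)))
```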

  10. A Deformable Generic 3D Model of Haptoral Anchor of Monogenean

    PubMed Central

    Teo, Bee Guan; Dhillon, Sarinder Kaur; Lim, Lee Hong Susan

    2013-01-01

    In this paper, a digital 3D model which allows visualisation in three dimensions and interactive manipulation is explored as a tool to help us understand the structural morphology and elucidate the functions of morphological structures of fragile microorganisms which defy live study. We developed a deformable generic 3D model of the haptoral anchor of dactylogyridean monogeneans that can subsequently be deformed into different desired anchor shapes using a direct manipulation deformation technique. We used point primitives to construct the rectangular building blocks of our deformable 3D model. Point primitives are manually marked on a 2D illustration of an anchor on Cartesian graph paper, and a set of Cartesian coordinates for each point primitive is manually extracted from the graph paper. A Python script is then written in Blender to construct 3D rectangular building blocks based on the Cartesian coordinates. The rectangular building blocks are stacked on top of or beside each other following the Cartesian coordinates of their respective point primitives. More point primitives are added at the sites in the 3D model where more structural variation is likely to occur, in order to generate complex anchor structures. We used the Catmull-Clark subdivision surface modifier to smooth the surfaces and edges of the generic 3D model, obtaining a smoother and more natural 3D shape, and an antialiasing option to reduce the jagged edges of the 3D model. This deformable generic 3D model can be deformed into different desired 3D anchor shapes through the direct manipulation deformation technique by aligning the vertices (pilot points) of the newly developed deformable generic 3D model onto 2D illustrations of the desired shapes and moving the vertices until the desired 3D shapes are formed. In this generic 3D model, all the vertices present are deployed for displacement during deformation. PMID:24204903

  11. A deformable generic 3D model of haptoral anchor of Monogenean.

    PubMed

    Teo, Bee Guan; Dhillon, Sarinder Kaur; Lim, Lee Hong Susan

    2013-01-01

    In this paper, a digital 3D model which allows visualisation in three dimensions and interactive manipulation is explored as a tool to help us understand the structural morphology and elucidate the functions of morphological structures of fragile microorganisms which defy live study. We developed a deformable generic 3D model of the haptoral anchor of dactylogyridean monogeneans that can subsequently be deformed into different desired anchor shapes using a direct manipulation deformation technique. We used point primitives to construct the rectangular building blocks of our deformable 3D model. Point primitives are manually marked on a 2D illustration of an anchor on Cartesian graph paper, and a set of Cartesian coordinates for each point primitive is manually extracted from the graph paper. A Python script is then written in Blender to construct 3D rectangular building blocks based on the Cartesian coordinates. The rectangular building blocks are stacked on top of or beside each other following the Cartesian coordinates of their respective point primitives. More point primitives are added at the sites in the 3D model where more structural variation is likely to occur, in order to generate complex anchor structures. We used the Catmull-Clark subdivision surface modifier to smooth the surfaces and edges of the generic 3D model, obtaining a smoother and more natural 3D shape, and an antialiasing option to reduce the jagged edges of the 3D model. This deformable generic 3D model can be deformed into different desired 3D anchor shapes through the direct manipulation deformation technique by aligning the vertices (pilot points) of the newly developed deformable generic 3D model onto 2D illustrations of the desired shapes and moving the vertices until the desired 3D shapes are formed. In this generic 3D model, all the vertices present are deployed for displacement during deformation.
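
    The construction step described above, a Blender Python script that places rectangular building blocks at digitized point primitives, can be sketched as follows. The script must run inside Blender (the bpy module exists only there), and the coordinate list is a hypothetical stand-in for points read off graph paper.

```python
# Sketch of the building-block construction step, to be run inside Blender.
import bpy

# (x, y, z) point primitives digitized from a 2D anchor illustration (made up)
point_primitives = [
    (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0),
    (2.0, 1.0, 0.0), (2.0, 2.0, 0.0),
]

# place one cube "building block" at each point primitive
for x, y, z in point_primitives:
    bpy.ops.mesh.primitive_cube_add(size=1.0, location=(x, y, z))

# the abstract then smooths the assembled shape with a Catmull-Clark
# subdivision surface modifier; for the active object that is:
obj = bpy.context.active_object
mod = obj.modifiers.new(name="Smooth", type='SUBSURF')
mod.levels = 2
```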

  12. PepMapper: a collaborative web tool for mapping epitopes from affinity-selected peptides.

    PubMed

    Chen, Wenhan; Guo, William W; Huang, Yanxin; Ma, Zhiqiang

    2012-01-01

    Epitope mapping from affinity-selected peptides has become popular in epitope prediction, and correspondingly many web-based tools have been developed in recent years. However, the performance of these tools varies in different circumstances. To address this problem, we employed an ensemble approach that incorporates two popular web tools, MimoPro and Pep-3D-Search, to take advantage of the strengths of both methods and give users more options for their specific purposes of epitope-peptide mapping. The combined operation of Union finds as many associated peptides as possible from both methods, which increases sensitivity in finding potential epitopic regions on a given antigen surface. The combined operation of Intersection achieves, to some extent, mutual verification by the two methods and hence increases the likelihood of locating the genuine epitopic region on a given antigen in relation to the interacting peptides. The Consistency between Intersection and Union is an indirect sufficient condition for assessing the likelihood of successful peptide-epitope mapping. On average, over 27 tests, the combined operations of PepMapper outperformed either MimoPro or Pep-3D-Search alone. Therefore, PepMapper is another multipurpose mapping tool for epitope prediction from affinity-selected peptides. The Web server can be freely accessed at: http://informatics.nenu.edu.cn/PepMapper/
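
    The Union/Intersection ensemble logic reduces to set operations over the peptide lists returned by the two tools, as in the sketch below. The peptide strings are made up, and the single consistency ratio shown is only one plausible way to quantify agreement, not necessarily the paper's exact definition.

```python
# Union (maximize sensitivity) and Intersection (mutual verification) of the
# peptide sets returned by two epitope-mapping tools; peptides are hypothetical.
mimopro_hits = {"DYKDAA", "HHLEPG", "QWNSTA"}    # peptides mapped by MimoPro
pep3d_hits = {"HHLEPG", "QWNSTA", "LLTRIV"}      # peptides mapped by Pep-3D-Search

union = mimopro_hits | pep3d_hits                # as many candidates as possible
intersection = mimopro_hits & pep3d_hits         # agreed on by both methods
consistency = len(intersection) / len(union)     # one simple agreement measure

print(sorted(union), sorted(intersection), f"consistency={consistency:.2f}")
```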

  13. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  14. Learning to merge: a new tool for interactive mapping

    NASA Astrophysics Data System (ADS)

    Porter, Reid B.; Lundquist, Sheng; Ruggiero, Christy

    2013-05-01

    The task of turning raw imagery into semantically meaningful maps and overlays is a key area of remote sensing activity. Image analysts, in applications ranging from environmental monitoring to intelligence, use imagery to generate and update maps of terrain, vegetation, road networks, buildings and other relevant features. Often these tasks can be cast as a pixel labeling problem, and several interactive pixel labeling tools have been developed. These tools exploit training data, which is generated by analysts using simple and intuitive paint-program annotation tools, in order to tailor the labeling algorithm for the particular dataset and task. In other cases, the task is best cast as a pixel segmentation problem. Interactive pixel segmentation tools have also been developed, but these tools typically do not learn from training data like the pixel labeling tools do. In this paper we investigate tools for interactive pixel segmentation that also learn from user input. The input has the form of segment merging (or grouping). Merging examples are 1) easily obtained from analysts using vector annotation tools, and 2) more challenging to exploit than traditional labels. We outline the key issues in developing these interactive merging tools, and describe their application to remote sensing.

  15. GDR (Genome Database for Rosaceae): integrated web-database for Rosaceae genomics and genetics data

    PubMed Central

    Jung, Sook; Staton, Margaret; Lee, Taein; Blenda, Anna; Svancara, Randall; Abbott, Albert; Main, Dorrie

    2008-01-01

    The Genome Database for Rosaceae (GDR) is a central repository of curated and integrated genetics and genomics data of Rosaceae, an economically important family which includes apple, cherry, peach, pear, raspberry, rose and strawberry. GDR contains annotated databases of all publicly available Rosaceae ESTs, the genetically anchored peach physical map, Rosaceae genetic maps and comprehensively annotated markers and traits. The ESTs are assembled to produce unigene sets of each genus and the entire Rosaceae. Other annotations include putative function, microsatellites, open reading frames, single nucleotide polymorphisms, gene ontology terms and anchored map position where applicable. Most of the published Rosaceae genetic maps can be viewed and compared through CMap, the comparative map viewer. The peach physical map can be viewed using WebFPC/WebChrom, and also through our integrated GDR map viewer, which serves as a portal to the combined genetic, transcriptome and physical mapping information. ESTs, BACs, markers and traits can be queried by various categories and the search result sites are linked to the mapping visualization tools. GDR also provides online analysis tools such as a batch BLAST/FASTA server for the GDR datasets, a sequence assembly server and microsatellite and primer detection tools. GDR is available at http://www.rosaceae.org. PMID:17932055

  16. Felyx : A Free Open Software Solution for the Analysis of Large Earth Observation Datasets

    NASA Astrophysics Data System (ADS)

    Piolle, Jean-Francois; Shutler, Jamie; Poulter, David; Guidetti, Veronica; Donlon, Craig

    2014-05-01

    The GHRSST project, by assembling large collections of earth observation data from various sources and agencies, has also raised the need to provide the user community with tools to inter-compare the data and to assess and monitor their quality. The ESA/Medspiration project, which implemented the first operating node of the GHRSST system for Europe, successfully paved the way towards such generic analytics tools by developing the High Resolution Diagnostic Dataset System (HR-DDS) and satellite-to-in-situ multi-sensor match-up databases. Building on this heritage, ESA is now funding the development by IFREMER, PML and Pelamis of felyx, a web tool merging the two capabilities into a single software solution. It will be a free, open-source solution, written in Python and JavaScript, whose aim is to provide Earth Observation data producers and users with a flexible and reusable tool that allows the quality and performance of data streams (satellite, in situ and model) to be easily monitored and studied. The primary concept of felyx is to work as an extraction tool, subsetting source data over predefined target areas (which can be static or moving); these data subsets, and associated metrics, can then be accessed by users or client applications either as raw files, as automatic alerts and reports generated periodically, or through a flexible web interface enabling statistical analysis and visualization. Felyx presents itself as an open-source suite of tools enabling: subsetting of large local or remote collections of Earth Observation data over predefined sites (geographical boxes) or moving targets (ship, buoy, hurricane), storing the extracted data (referred to as miniProds) locally, where the miniProds constitute a much smaller representative subset of the original collection on which any kind of processing or assessment can be performed without having to cope with heavy volumes of data; computation of statistical metrics over these miniProds, using for instance a set of usual statistical operators (mean, median, rms, ...), fully extensible and applicable to any variable of a dataset, with the metrics stored in a fast search engine queryable by humans and automated applications; reporting and alerting, based on user-defined inference rules, through various media (emails, Twitter feeds, ...) and devices (phones, tablets); and analysis of miniProds and metrics through a web interface that lets users dig into this base of information and extract useful knowledge through multidimensional interactive display functions (time series, scatterplots, histograms, maps). The services provided by felyx will be generic, deployable on users' own premises, and adaptable enough to integrate any kind of parameter. Users will be able to operate their own felyx instance at any location, on datasets and parameters of their own interest, and the various instances will be able to interact with each other, creating a web of felyx systems enabling aggregation and cross-comparison of miniProds and metrics from multiple sources. Initially, two instances will be operated simultaneously during a 6-month demonstration phase: at IFREMER, on sea surface temperature (for the GHRSST community) and ocean wave datasets, and at PML, on ocean colour. We will present results from the felyx project, demonstrate how the GHRSST community can exploit felyx, and show how the wider community can make use of GHRSST data within felyx.

  17. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
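
    The kind of triage the abstract describes can be sketched with ordinary dataframe operations: isolate the runs with a given failure type, then rank dispersed inputs by how far the failed subset departs from the whole population. The column names and CSV file are hypothetical, and the actual tool is a parallel GPU code rather than pandas.

```python
# Rank Monte Carlo input variables by their association with one failure type.
# File name, column names and failure label are hypothetical placeholders.
import pandas as pd

runs = pd.read_csv("monte_carlo_runs.csv")       # one row per dispersed run
failed = runs[runs["failure_type"] == "landing_velocity_exceeded"]

# how strongly does each dispersed input separate failed runs from the population?
inputs = [c for c in runs.columns if c.startswith("input_")]
separation = {
    c: abs(failed[c].mean() - runs[c].mean()) / runs[c].std() for c in inputs
}
for name, score in sorted(separation.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{name}: {score:.2f} standard deviations from the population mean")
```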

  18. iMindMap as an Innovative Tool in Teaching and Learning Accounting: An Exploratory Study

    ERIC Educational Resources Information Center

    Wan Jusoh, Wan Noor Hazlina; Ahmad, Suraya

    2016-01-01

    Purpose: The purpose of this study is to explore the use of iMindMap software as an interactive tool in the teaching and learning method and also to be able to consider iMindMap as an alternative instrument in achieving the ultimate learning outcome. Design/Methodology/Approach: Out of 268 students of the management accounting at the University of…

  19. Knowledge Mapping: A Multipurpose Task Analysis Tool.

    ERIC Educational Resources Information Center

    Esque, Timm J.

    1988-01-01

    Describes knowledge mapping, a tool developed to increase the objectivity and accuracy of task difficulty ratings for job design. Application in a semiconductor manufacturing environment is discussed, including identifying prerequisite knowledge for a given task; establishing training development priorities; defining knowledge levels; identifying…

  20. Study of Tools for Network Discovery and Network Mapping

    DTIC Science & Technology

    2003-11-01

    Only fragments of this report survive in the record: the evaluation criteria include accessibility of historical and event data (network discovery tools generally keep a history of the collected data); one surveyed tool depends on a Java virtual machine, Perl modules, RRD Tool, Tomcat and PostgreSQL; noted strengths include providing a simple view of the current network status, generating alarms on status changes, keeping a history of status changes, and offering a visual map of the network.

  1. Artificial Intelligence-Based Student Learning Evaluation: A Concept Map-Based Approach for Analyzing a Student's Understanding of a Topic

    ERIC Educational Resources Information Center

    Jain, G. Panka; Gurupur, Varadraj P.; Schroeder, Jennifer L.; Faulkenberry, Eileen D.

    2014-01-01

    In this paper, we describe a tool coined as artificial intelligence-based student learning evaluation tool (AISLE). The main purpose of this tool is to improve the use of artificial intelligence techniques in evaluating a student's understanding of a particular topic of study using concept maps. Here, we calculate the probability distribution of…

  2. Regional Geological Mapping in the Graham Land of Antarctic Peninsula Using LANDSAT-8 Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Pour, A. B.; Hashim, M.; Park, Y.

    2017-10-01

    Geological investigations in Antarctica face many difficulties due to its remoteness and extreme environmental conditions. In this study, Landsat-8 data were investigated as a means to extract geological information for lithological and alteration mineral mapping of poorly exposed lithologies in inaccessible domains such as Antarctica. The north-eastern Graham Land, Antarctic Peninsula (AP) was selected for this study to develop a satellite-based remote sensing mapping technique. The Continuum Removal (CR) spectral mapping tool and Independent Component Analysis (ICA) were applied to Landsat-8 spectral bands to map poorly exposed lithologies at regional scale. Pixels containing the distinctive absorption features of alteration mineral assemblages associated with poorly exposed lithological units were detected by applying the CR mapping tool to the VNIR and SWIR bands of Landsat-8. Pixels related to Si-O bond emission minima features were identified by applying the CR mapping tool to the TIR bands in poorly mapped and unmapped zones of north-eastern Graham Land at regional scale. Anomaly pixels in the ICA image maps related to the spectral features of Al-O-H, Fe, Mg-O-H and CO3 groups, with well-constrained lithological attributions from felsic to mafic rocks, were detected using the VNIR, SWIR and TIR datasets of Landsat-8. The approach used in this study performed very well for lithological and alteration mineral mapping with little available geological data or without prior information on the study region.
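
    Of the two techniques named above, ICA is straightforward to sketch with scikit-learn: reshape the band stack so each pixel is a sample, then extract statistically independent component images in which anomaly pixels may flag alteration minerals. The random cube below is a placeholder for a real Landsat-8 reflectance stack, and the number of components is a free choice.

```python
# ICA over a multispectral band stack; the random cube is a placeholder for a
# real (rows, cols, bands) Landsat-8 reflectance array.
import numpy as np
from sklearn.decomposition import FastICA

rows, cols, bands = 200, 200, 7                  # e.g. Landsat-8 VNIR/SWIR bands
cube = np.random.rand(rows, cols, bands)         # placeholder reflectance data

X = cube.reshape(-1, bands)                      # pixels as samples, bands as features
ica = FastICA(n_components=4, random_state=0)    # component count is a modelling choice
components = ica.fit_transform(X)                # per-pixel independent component scores

# reshape each component back into an image for visual anomaly inspection
component_maps = components.reshape(rows, cols, -1)
print(component_maps.shape)                      # (200, 200, 4)
```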

  3. Singular perturbations and vanishing passage through a turning point

    NASA Astrophysics Data System (ADS)

    De Maesschalck, P.; Dumortier, F.

    The paper deals with planar slow-fast cycles containing a unique generic turning point. We address the question of how to study canard cycles when the slow dynamics can be singular at the turning point. More precisely, we allow a generic saddle-node bifurcation to pass through the turning point. It turns out that in this case the slow divergence integral is no longer the right tool to use, but its derivative with respect to the layer variable still is. We provide general results as well as a number of applications. We show how to treat the open problems presented in Artés et al. (2009) [1] and Dumortier and Rousseau (2009) [13], dealing respectively with the graphics DI2a and DF1a from Dumortier et al. (1994) [14].

  4. Birational geometry of Fano double spaces of index two

    NASA Astrophysics Data System (ADS)

    Pukhlikov, Aleksandr V.

    2010-10-01

    We study the birational geometry of Fano varieties realized as double covers $\sigma\colon V\to\mathbb{P}^M$, $M\ge 5$, branched over generic smooth hypersurfaces $W=W_{2(M-1)}$ of degree $2(M-1)$. We prove that the only structures of a rationally connected fibre space on $V$ are pencil-subsystems of the free linear system $\vert -\frac{1}{2}K_V \vert$. The groups of birational and biregular self-maps of $V$ coincide: $\operatorname{Bir} V = \operatorname{Aut} V$.

  5. Enhanced STEM Learning with the GeoMapApp Data Exploration Tool

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2014-12-01

    GeoMapApp (http://www.geomapapp.org) is a free, map-based data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory. GeoMapApp provides casual and specialist users alike with access to hundreds of built-in geoscience data sets covering geology, geophysics, geochemistry, oceanography, climatology, cryospherics, and the environment. Users can also import their own data tables, spreadsheets, shapefiles, grids and images. Simple manipulation and analysis tools, combined with layering capabilities and engaging visualisations, provide a powerful platform with which to explore and interrogate geoscience data in its proper geospatial context, thus helping users to more easily gain insight into the meaning of the data. A global elevation base map covering the oceans as well as the continents forms the backbone of GeoMapApp. The multi-resolution base map is updated regularly and includes data sources ranging from Space Shuttle elevation data for land areas to ultra-high-resolution surveys of coral reefs and seafloor hydrothermal vent fields. Examples of built-in data sets that can be layered over the elevation model include interactive earthquake and volcano data, plate tectonic velocities, hurricane tracks, land and ocean temperature, water column properties, age of the ocean floor, and deep submersible bottom photos. A versatile profiling tool provides instant access to data cross-sections. Contouring and 3-D views are also offered; the attached image shows a 3-D view of East Africa's Ngorongoro Crater as an example. Tabular data, both imported and built-in, can be displayed in a variety of ways, and a lasso tool enables users to quickly select data points directly from the map. A range of STEM-based education material based upon GeoMapApp is already available, including a number of self-contained modules for school- and college-level students (http://www.geomapapp.org/education/contributed_material.html). More learning modules are planned, such as one on the effects of sea-level rise. GeoMapApp users include students, teachers, researchers, curriculum developers and outreach specialists.

  6. YouGenMap: a web platform for dynamic multi-comparative mapping and visualization of genetic maps

    Treesearch

    Keith Batesole; Kokulapalan Wimalanathan; Lin Liu; Fan Zhang; Craig S. Echt; Chun Liang

    2014-01-01

    Comparative genetic maps are used in examination of genome organization, detection of conserved gene order, and exploration of marker order variations. YouGenMap is an open-source web tool that offers dynamic comparative mapping capability of users' own genetic mapping between 2 or more map sets. Users' genetic map data and optional gene annotations are...

  7. Evaluation of the User Strategy on 2d and 3d City Maps Based on Novel Scanpath Comparison Method and Graph Visualization

    NASA Astrophysics Data System (ADS)

    Dolezalova, J.; Popelka, S.

    2016-06-01

    The paper deals with scanpath comparison of eye-tracking data recorded during a case study focused on the evaluation of 2D and 3D city maps. The experiment contained screenshots from three map portals. Two types of maps were used: a standard map and a 3D visualization. The respondents' task was to find a particular point symbol on the map as fast as possible. Scanpath comparison is one group of eye-tracking data analysis methods used for revealing respondents' strategies. In cartographic studies, the most commonly used application for scanpath comparison is eyePatterns, whose output is a hierarchical clustering and a tree graph representing the relationships between the analysed sequences. During an analysis of the algorithm generating the tree graph, it was found that the outputs do not correspond to reality. We therefore created a new tool called ScanGraph. This tool uses visualization of cliques in simple graphs and is freely available at www.eyetracking.upol.cz/scangraph. The results of the study proved the functionality of the tool and its suitability for analysing different strategies of map readers. Based on the results of the tool, similar scanpaths were selected, and groups of respondents with similar strategies were identified. With this knowledge, it is possible to analyse the relationship between membership in a group with a similar strategy and data gathered from the questionnaire (age, sex, cartographic knowledge, etc.) or the type of stimulus (2D or 3D map).
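
    The idea behind ScanGraph can be sketched in a few lines: connect respondents whose scanpath strings are sufficiently similar, then read groups off as cliques of the resulting simple graph. The similarity measure (difflib ratio) and the 0.6 threshold below are illustrative assumptions, not the tool's actual settings.

```python
# Group respondents by scanpath similarity via cliques in a simple graph.
# Scanpath strings (sequences of fixated areas of interest) are hypothetical.
import difflib
import networkx as nx

scanpaths = {"r1": "ABCCD", "r2": "ABCD", "r3": "AXXYD", "r4": "AXYD"}

g = nx.Graph()
g.add_nodes_from(scanpaths)
for a in scanpaths:
    for b in scanpaths:
        if a < b:  # each unordered pair once
            sim = difflib.SequenceMatcher(None, scanpaths[a], scanpaths[b]).ratio()
            if sim >= 0.6:
                g.add_edge(a, b)

for clique in nx.find_cliques(g):   # maximal cliques = candidate strategy groups
    print(sorted(clique))
```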

  8. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern, which will be used for the creation of different maps according to an (input) design specification for big geospatial data. The design specification is based on elements from ISO 19115-1:2014 (Geographic information - Metadata - Part 1: Fundamentals) that guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will help software developers build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others in making sense of big geospatial data.
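
    The paper describes the pattern only at a conceptual level. As a rough illustration of the underlying factory idea - with all class and map-type names invented for this sketch, not taken from MapFactory itself - a map factory might dispatch on a design specification like this:

        from dataclasses import dataclass

        @dataclass
        class MapSpec:
            """Simplified design specification; a real one would draw on ISO 19115-1 metadata."""
            map_type: str          # e.g. "choropleth" or "heatmap"
            title: str
            crs: str = "EPSG:4326"

        class ChoroplethMap:
            def __init__(self, spec: MapSpec):
                self.spec = spec
            def render(self) -> str:
                return f"choropleth '{self.spec.title}' in {self.spec.crs}"

        class HeatMap:
            def __init__(self, spec: MapSpec):
                self.spec = spec
            def render(self) -> str:
                return f"heatmap '{self.spec.title}' in {self.spec.crs}"

        class MapFactory:
            """Creates the appropriate map product for a given specification."""
            _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

            @classmethod
            def create(cls, spec: MapSpec):
                try:
                    return cls._registry[spec.map_type](spec)
                except KeyError:
                    raise ValueError(f"unsupported map type: {spec.map_type}")

        print(MapFactory.create(MapSpec("heatmap", "Taxi pickups")).render())

    The point of the pattern is that new map types can be registered without touching the code that consumes the design specification.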

  9. LACO-Wiki: A land cover validation tool and a new, innovative teaching resource for remote sensing and the geosciences

    NASA Astrophysics Data System (ADS)

    See, Linda; Perger, Christoph; Dresel, Christopher; Hofer, Martin; Weichselbaum, Juergen; Mondel, Thomas; Steffen, Fritz

    2016-04-01

    The validation of land cover products is an important step in the workflow of generating a land cover map from remotely-sensed imagery. Many students of remote sensing are given exercises on classifying a land cover map, followed by the validation process. Many algorithms exist for classification, embedded within proprietary image processing software or, increasingly, as open source tools. However, there is little standardization for land cover validation, and no set of open tools is available for implementing this process. The LACO-Wiki tool was developed as a way of filling this gap, bringing together standardized land cover validation methods and workflows into a single portal. This includes the storage and management of land cover maps and validation data; step-by-step instructions to guide users through the validation process; sound sampling designs; an easy-to-use environment for validation sample interpretation; and the generation of accuracy reports based on the validation process. The tool was developed for a range of users including producers of land cover maps, researchers, teachers and students. The use of such a tool could be embedded within the curriculum of remote sensing courses at the university level but is simple enough for use by students aged 13-18. A beta version of the tool is available for testing at: http://www.laco-wiki.net.
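
    The accuracy reports such a tool generates are conventionally built from a confusion matrix of map class versus validation-sample class. A minimal sketch of that standard calculation (generic remote-sensing accuracy measures, not LACO-Wiki's actual code):

        import numpy as np

        def accuracy_report(map_labels, ref_labels, classes):
            """Confusion matrix plus overall, user's and producer's accuracy."""
            idx = {c: i for i, c in enumerate(classes)}
            cm = np.zeros((len(classes), len(classes)), dtype=int)
            for m, r in zip(map_labels, ref_labels):
                cm[idx[m], idx[r]] += 1               # rows: map, columns: reference
            overall = np.trace(cm) / cm.sum()
            users = np.diag(cm) / cm.sum(axis=1)      # 1 - commission error per map class
            producers = np.diag(cm) / cm.sum(axis=0)  # 1 - omission error per reference class
            return cm, overall, users, producers

        cm, oa, ua, pa = accuracy_report(
            ["forest", "water", "forest", "urban"],
            ["forest", "water", "urban", "urban"],
            ["forest", "water", "urban"])
        print(cm)
        print(f"overall accuracy = {oa:.2f}")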

  10. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  11. Three-dimensional mapping of the local interstellar medium with composite data

    NASA Astrophysics Data System (ADS)

    Capitanio, L.; Lallement, R.; Vergely, J. L.; Elyajouri, M.; Monreal-Ibero, A.

    2017-10-01

    Context. Three-dimensional maps of the Galactic interstellar medium are general astrophysical tools. Reddening maps may be based on the inversion of color excess measurements for individual target stars or on statistical methods using stellar surveys. Three-dimensional maps based on diffuse interstellar bands (DIBs) have also been produced. All methods benefit from the advent of massive surveys and may benefit from Gaia data. Aims: All of the various methods and databases have their own advantages and limitations. Here we present a first attempt to combine different datasets and methods to improve the local maps. Methods: We first updated our previous local dust maps based on a regularized Bayesian inversion of individual color excess data by replacing Hipparcos or photometric distances with Gaia Data Release 1 values when available. Second, we complemented this database with a series of ≃5000 color excess values estimated from the strength of the λ15273 DIB toward stars possessing a Gaia parallax; the DIB strengths were extracted from SDSS/APOGEE spectra. Third, we computed a low-resolution map based on a grid of Pan-STARRS reddening measurements by means of a new hierarchical technique and used this map as the prior distribution during the inversion of the two other datasets. Results: The use of Gaia parallaxes introduces significant changes in some areas and globally increases the compactness of the structures. Additional DIB-based data make it possible to assign distances to clouds located behind closer opaque structures and do not introduce contradictory information for the close structures. A more realistic prior distribution, instead of a plane-parallel homogeneous distribution, helps better define the structures. We validated the results through comparisons with other maps and with soft X-ray data. Conclusions: Our study demonstrates that the combination of various tracers is a potential tool for more accurate maps. An online tool, available at http://stilism.obspm.fr, makes it possible to retrieve maps and reddening estimations.
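
    For reference, Gaia parallaxes enter this kind of work through the elementary inversion of parallax to distance, which is only reliable when the relative parallax error is small. A trivial sketch of that conversion:

        def parallax_to_distance_pc(parallax_mas: float) -> float:
            """Distance in parsecs from a parallax in milliarcseconds (d = 1000 / p).
            Naive inversion; only reasonable for small relative parallax errors."""
            if parallax_mas <= 0:
                raise ValueError("non-positive parallax cannot be inverted")
            return 1000.0 / parallax_mas

        print(parallax_to_distance_pc(2.5))  # 400 pc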

  12. NaviCell Web Service for network-based data visualization.

    PubMed

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei

    2015-07-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
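
    Because the server mode accepts plain HTTP, visualization sessions can in principle be scripted. The sketch below shows only the general shape of such a RESTful call - the URL, command name, and parameters are hypothetical placeholders rather than NaviCell's documented API, and in practice the project's own Python or R bindings would be used:

        import requests

        # Hypothetical endpoint and command vocabulary, for illustration only.
        SERVER = "https://example.org/navicell/api"

        def send_command(session_id: str, command: str, **params):
            """POST one command to a map session and return the parsed JSON reply."""
            payload = {"session": session_id, "command": command, **params}
            resp = requests.post(SERVER, data=payload, timeout=30)
            resp.raise_for_status()
            return resp.json()

        # e.g. ask the server to display expression values as a heatmap layer
        reply = send_command("demo-session", "display_heatmap", dataset="transcriptome")
        print(reply)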

  13. NaviCell Web Service for network-based data visualization

    PubMed Central

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei

    2015-01-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of ‘omics’ data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. PMID:25958393

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshmukh, Ranjit; Wu, Grace

    The MapRE (Multi-criteria Analysis for Planning Renewable Energy) GIS (Geographic Information Systems) Tools are a set of ArcGIS tools to a) conduct site suitability analyses for wind and solar resources using inclusion and exclusion criteria, and create resource maps; b) create project opportunity areas and compute various attributes such as cost, distances to existing and planned infrastructure, and environmental impact factors; and c) calculate and update various attributes for already processed renewable energy zones. In addition, the MapRE data sets are geospatial data of renewable energy project opportunity areas and zones with pre-calculated attributes for several countries. These tools and data are available at mapre.lbl.gov.

  15. Critical Incident Stress Management in Schools: Mental Health Component.

    ERIC Educational Resources Information Center

    Tortorici Luna, Joanne M.

    This manual provides a brief framework of organization that serves as a response tool for a wide spectrum of crisis circumstances encountered by schools. It is meant to be a generic guide for school teams and should be customized by each school that uses it. Even with emergency procedures in place, each crisis at a school needs to be evaluated as…

  16. Lessons Learnt from and Sustainability of Adopting a Personal Learning Environment & Network (Ple&N)

    ERIC Educational Resources Information Center

    Tsui, Eric; Sabetzadeh, Farzad

    2014-01-01

    This paper describes the feedback from the configuration and deployment of a Personal Learning Environment & Network (PLE&N) tool to support peer-based social learning for university students and graduates. An extension of an earlier project in which a generic PLE&N was deployed for all learners, the current PLE&N is a…

  17. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  18. Formative Evaluation of a Generic Decision Aid for Classroom Use.

    ERIC Educational Resources Information Center

    Freeman, Jared T.; Guillen, Julio

    Results of a formative evaluation of a decision aid for students of taxonomic domains such as statistics or biology are reported. The tool, XPT-EASE, is designed to allow a student to search a taxonomy by traversing its branches in an arbitrary order, presumably the order simplest for the student, rather than by starting from the root node and…

  19. Virtual Golden Foods Corporation: Generic Skills in a Virtual Crisis Environment (A Pilot Study)

    ERIC Educational Resources Information Center

    Godat, Meredith

    2007-01-01

    Workplace learning in a crisis-rich environment is often difficult if not impossible to integrate into programs so that students are able to experience and apply crisis management practices and principles. This study presents the results of a pilot project that examined the effective use of a virtual reality (VR) environment as a tool to teach…

  20. Raster Metafile And Raster Metafile Translator Programs

    NASA Technical Reports Server (NTRS)

    Randall, Donald P.; Gates, Raymond L.; Skeens, Kristi M.

    1994-01-01

    The Raster Metafile (RM) computer program implements a generic raster-image format, and the Raster Metafile Translator (RMT) program is an assortment of software tools for processing images prepared in this format. Processing includes reading, writing, and displaying RM images. Other image-manipulation features, such as a minimal compositing operator and a resizing option, are available under the RMT command structure. RMT is written in FORTRAN 77 and C.

  1. [Polish version of the ADOS (autism diagnostic observation schedule-generic)].

    PubMed

    Chojnicka, Izabela; Płoski, Rafał

    2012-01-01

    The article presents the Polish version of the autism diagnostic observation schedule-generic (ADOS), which together with the autism diagnostic interview-revised (ADI-R) is cited as the "gold standard" for the diagnosis of autism. The ADOS is a standardised, semistructured observation protocol appropriate for children and adults of differing ages and language levels. It is linked to ICD-10 and DSM-IV-TR criteria. The ADOS consists of four modules, ranging from module 1 for nonverbal individuals to module 4 for verbally fluent adults. Adequate inter-rater reliability for items has been established. The protocol has high discriminant validity and distinguishes children with pervasive developmental disorders from children outside the spectrum, although it does not distinguish individuals with pervasive developmental disorder, unspecified, from individuals with childhood autism. The paper presents the subsequent steps of the translation process from the original version into Polish, as well as the chosen adaptation strategy for the Polish version. The ADOS is a very useful tool both for clinical diagnosis and for diagnosis for scientific purposes; in the latter case it is extremely important to use a standardised method. Until now, there was no standardised diagnostic tool for autism in Poland.

  2. The CARMEN software as a service infrastructure.

    PubMed

    Weeks, Michael; Jessop, Mark; Fletcher, Martyn; Hodge, Victoria; Jackson, Tom; Austin, Jim

    2013-01-28

    The CARMEN platform allows neuroscientists to share data, metadata, services and workflows, and to execute these services and workflows remotely via a Web portal. This paper describes how we implemented a service-based infrastructure into the CARMEN Virtual Laboratory. A Software as a Service framework was developed to allow generic new and legacy code to be deployed as services on a heterogeneous execution framework. Users can submit analysis code typically written in Matlab, Python, C/C++ and R as non-interactive standalone command-line applications and wrap them as services in a form suitable for deployment on the platform. The CARMEN Service Builder tool enables neuroscientists to quickly wrap their analysis software for deployment to the CARMEN platform, as a service without knowledge of the service framework or the CARMEN system. A metadata schema describes each service in terms of both system and user requirements. The search functionality allows services to be quickly discovered from the many services available. Within the platform, services may be combined into more complicated analyses using the workflow tool. CARMEN and the service infrastructure are targeted towards the neuroscience community; however, it is a generic platform, and can be targeted towards any discipline.
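
    The core idea - wrapping a non-interactive command-line analysis so that a platform can invoke it uniformly - can be sketched in a few lines. This is a generic illustration of such a wrapper, not the actual output of the CARMEN Service Builder; the manifest format and names are invented for the example:

        import json
        import subprocess
        from pathlib import Path

        def run_service(manifest_path: str, input_file: str, output_dir: str) -> dict:
            """Run a wrapped command-line analysis described by a small JSON manifest.

            For this sketch the manifest is assumed to look like:
              {"name": "spike_sort", "command": ["./spike_sort", "{input}", "{outdir}"]}
            """
            manifest = json.loads(Path(manifest_path).read_text())
            cmd = [arg.format(input=input_file, outdir=output_dir)
                   for arg in manifest["command"]]
            proc = subprocess.run(cmd, capture_output=True, text=True, timeout=3600)
            return {"service": manifest["name"], "returncode": proc.returncode,
                    "stdout": proc.stdout, "stderr": proc.stderr}

    A manifest-style description along these lines is what lets an execution framework treat Matlab, Python, C/C++ and R analyses identically, which in turn makes service discovery and workflow composition possible.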

  3. Cool horizons lead to information loss

    NASA Astrophysics Data System (ADS)

    Chowdhury, Borun D.

    2013-10-01

    There are two pieces of evidence for information loss during black hole evaporation: (i) a pure state evolves to a mixed state and (ii) the map from the initial state to the final state is non-invertible. Any proposed resolution of the information paradox must address both these issues. The firewall argument focuses only on the first, and this leads to order one deviations from the Unruh vacuum for maximally entangled black holes. The nature of the argument does not extend to black holes in pure states. It was shown by Avery, Puhm and the author that requiring the initial-to-final-state map to be invertible mandates structure at the horizon even for pure states. The proof works if black holes can be formed in generic states, and in this paper we show that this is indeed the case. We also demonstrate how models proposed by Susskind, Papadodimas et al. and Maldacena et al. end up making the initial-to-final-state map non-invertible and thus make the horizon "cool" at the cost of unitarity.
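
    As a reminder of the standard statement of the paradox (textbook material, not this paper's notation): unitary evolution preserves purity and is invertible, whereas semiclassical evaporation sends a pure state to a thermal mixture,

        |\psi\rangle\langle\psi| \;\longrightarrow\; \rho = \sum_i p_i \, |i\rangle\langle i| ,

    so distinct initial pure states can end on the same mixed final state and the evolution map has no inverse. Points (i) and (ii) above are thus two faces of the same breakdown of unitarity.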

  4. Evaluation of equipment and methods to map lost circulation zones in geothermal wells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, W.J.; Leon, P.A.; Pittard, G.

    A study and evaluation of methods to locate, characterize, and quantify lost circulation zones are described. Twenty-five methods of mapping and quantifying lost circulation zones were evaluated, including electrical, acoustical, mechanical, radioactive, and optical systems. Each tool studied is described. The structured, numerical evaluation plan used as the basis for comparing the 25 tools, and the resulting ranking among the tools, are presented.

  5. Mapping healthcare systems: a policy relevant analytic tool.

    PubMed

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  6. Data and Tools | NREL

    Science.gov Websites

    NREL develops data sets, maps, models, and tools for energy analysis; these resources are presented in an alphabetical listing. Popular resources include the PVWatts Calculator and geospatial data sets.

  7. Unique Sensor Plane Maps Invisible Toxins for First Responders

    ScienceCinema

    Kroutil, Robert; Thomas, Mark; Aten, Keith

    2018-05-30

    A unique airborne emergency response tool, ASPECT is a Los Alamos/U.S. Environmental Protection Agency project that can put chemical and radiological mapping tools in the air over an accident scene. The name ASPECT is an acronym for Airborne Spectral Photometric Environmental Collection Technology.

  8. Teaching science with technology: Using EPA’s EnviroAtlas in the classroom

    EPA Science Inventory

    Background/Question/Methods U.S. EPA’s EnviroAtlas provides a collection of web-based, interactive tools and resources for exploring ecosystem goods and services. EnviroAtlas contains two primary tools: An Interactive Map, which provides access to 300+ maps at multiple exte...

  9. Mapping Skills and Activities with Children's Literature

    ERIC Educational Resources Information Center

    Gandy, S. Kay

    2006-01-01

    In the primary grades, maps are useful tools to help the young reader put stories into perspective. This article presents 18 quality children's books that contain maps or lessons about maps, as well as activities to use in the classroom to teach map skills. A table is included with ratings of the usability of the maps.

  10. From Pixels to Response Maps: Discriminative Image Filtering for Face Alignment in the Wild.

    PubMed

    Asthana, Akshay; Zafeiriou, Stefanos; Tzimiropoulos, Georgios; Cheng, Shiyang; Pantic, Maja

    2015-06-01

    We propose a face alignment framework that relies on the texture model generated by the responses of discriminatively trained part-based filters. Unlike standard texture models built from pixel intensities or responses generated by generic filters (e.g. Gabor), our framework has two important advantages. First, by virtue of discriminative training, invariance to external variations (like identity, pose, illumination and expression) is achieved. Second, we show that the responses generated by discriminatively trained filters (or patch-experts) are sparse and can be modeled using a very small number of parameters. As a result, the optimization methods based on the proposed texture model can better cope with unseen variations. We illustrate this point by formulating both part-based and holistic approaches for generic face alignment and show that our framework outperforms the state-of-the-art on multiple "wild" databases. The code and dataset annotations are available for research purposes from http://ibug.doc.ic.ac.uk/resources.
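
    A response map in this sense is simply the correlation of a learned filter with every location of an image patch, normalised so that peaks mark probable landmark positions. The following sketch illustrates the generic operation with arbitrary random arrays; it does not reproduce the paper's discriminatively trained patch-experts:

        import numpy as np
        from scipy.signal import correlate2d

        def response_map(patch: np.ndarray, filt: np.ndarray) -> np.ndarray:
            """Cross-correlate a filter with an image patch and normalise to [0, 1]."""
            resp = correlate2d(patch, filt, mode="same", boundary="symm")
            resp -= resp.min()
            return resp / (resp.max() + 1e-12)

        rng = np.random.default_rng(0)
        patch = rng.random((32, 32))       # stand-in for a grey-level image patch
        filt = rng.random((7, 7)) - 0.5    # stand-in for a trained part filter
        peak = np.unravel_index(response_map(patch, filt).argmax(), patch.shape)
        print("most likely landmark position:", peak)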

  11. The use of mental models in chemical risk protection: developing a generic workplace methodology.

    PubMed

    Cox, Patrick; Niewöhmer, Jörg; Pidgeon, Nick; Gerrard, Simon; Fischhoff, Baruch; Riley, Donna

    2003-04-01

    We adopted a comparative approach to evaluate and extend a generic methodology to analyze the different sets of beliefs held about chemical hazards in the workplace. Our study mapped existing knowledge structures about the risks associated with the use of perchloroethylene and rosin-based solder flux in differing workplaces. "Influence diagrams" were used to represent beliefs held by chemical experts; "user models" were developed from data elicited from open-ended interviews with the workplace users of the chemicals. The juxtaposition of expert and user understandings of chemical risks enabled us to identify knowledge gaps and misunderstandings and to reinforce appropriate sets of safety beliefs and behavior relevant to chemical risk communications. By designing safety information to be more relevant to the workplace context of users, we believe that employers and employees may gain improved knowledge about chemical hazards in the workplace, such that better chemical risk management, self-protection, and informed decision making develop over time.

  12. MODSNOW-Tool: an operational tool for daily snow cover monitoring using MODIS data

    NASA Astrophysics Data System (ADS)

    Gafurov, Abror; Lüdtke, Stefan; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Schöne, Tilo; Schmidt, Sebastian; Kalashnikova, Olga; Merz, Bruno

    2017-04-01

    Spatially distributed snow cover information in mountain areas is extremely important for water storage estimations, seasonal water availability forecasting, or the assessment of snow-related hazards (e.g. enhanced snow-melt following intensive rains, or avalanche events). Moreover, spatially distributed snow cover information can be used to calibrate and/or validate hydrological models. We present the MODSNOW-Tool, an operational, user-friendly application for catchment-based snow cover monitoring. The application automatically downloads and processes freely available daily Moderate Resolution Imaging Spectroradiometer (MODIS) snow cover data. The MODSNOW-Tool uses a step-wise approach for cloud removal and delivers cloud-free snow cover maps for the selected river basins, including basin-specific snow cover extent statistics. The accuracy of cloud-eliminated MODSNOW snow cover maps was validated for 84 almost cloud-free days in the Karadarya river basin in Central Asia, and an average accuracy of 94% was achieved. The MODSNOW-Tool can be used in operational and non-operational mode. In the operational mode, the tool is set up as a scheduled task on a local computer, allowing automatic execution without user interaction, and delivers snow cover maps on a daily basis. In the non-operational mode, the tool can be used to process historical time series of snow cover maps. The MODSNOW-Tool is currently implemented and in use at the national hydrometeorological services of four Central Asian states - Kazakhstan, Kyrgyzstan, Uzbekistan and Turkmenistan - and is used for seasonal water availability forecasts.
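
    Step-wise cloud removal schemes of this kind commonly begin with a temporal filter: a cloud-obscured pixel inherits the most recent cloud-free observation within a short look-back window. A minimal sketch of that first step only (illustrative; the MODSNOW-Tool's full procedure involves further stages):

        import numpy as np

        CLOUD = -1  # sentinel for cloud-obscured pixels; 0 = snow-free, 1 = snow

        def temporal_fill(stack: np.ndarray, max_lookback: int = 3) -> np.ndarray:
            """Fill cloudy pixels in the last map of a (days, rows, cols) stack
            with the most recent cloud-free value up to max_lookback days back."""
            filled = stack[-1].copy()
            for back in range(1, min(max_lookback, len(stack) - 1) + 1):
                mask = (filled == CLOUD) & (stack[-1 - back] != CLOUD)
                filled[mask] = stack[-1 - back][mask]
            return filled

        days = np.array([[[1, 0], [1, CLOUD]],
                         [[1, CLOUD], [CLOUD, 0]],
                         [[CLOUD, CLOUD], [1, 0]]])
        print(temporal_fill(days))  # clouds resolved from the two previous days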

  13. Arcmancer: Geodesics and polarized radiative transfer library

    NASA Astrophysics Data System (ADS)

    Pihajoki, Pauli; Mannerkoski, Matias; Nättilä, Joonas; Johansson, Peter H.

    2018-05-01

    Arcmancer computes geodesics and performs polarized radiative transfer in user-specified spacetimes. The library supports Riemannian and semi-Riemannian spaces of any dimension and metric; it also supports multiple simultaneous coordinate charts, embedded geometric shapes, local coordinate systems, and automatic parallel propagation. Arcmancer can be used to solve various problems in numerical geometry, such as solving the curve equation of motion using adaptive integration with configurable tolerances and differential equations along precomputed curves. It also provides support for curves with an arbitrary acceleration term and generic tools for generating ray initial conditions and performing parallel computation over the image, among other tools.
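
    For geodesics, the "curve equation of motion" integrated by such a library is the standard geodesic equation (textbook form; per the abstract, Arcmancer also admits an arbitrary extra acceleration term on the right-hand side):

        \frac{d^2 x^\mu}{d\lambda^2} + \Gamma^\mu_{\alpha\beta} \, \frac{dx^\alpha}{d\lambda} \frac{dx^\beta}{d\lambda} = 0 ,

    where \lambda is an affine parameter and \Gamma^\mu_{\alpha\beta} are the Christoffel symbols of the chosen metric.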

  14. Use of concurrent mixed methods combining concept mapping and focus groups to adapt a health equity tool in Canada.

    PubMed

    Guichard, Anne; Tardieu, Émilie; Dagenais, Christian; Nour, Kareen; Lafontaine, Ginette; Ridde, Valéry

    2017-04-01

    The aim of this project was to identify and prioritize a set of conditions to be considered for incorporating a health equity tool into public health practice. Concept mapping and focus groups were implemented as complementary methods to investigate the conditions of use of a health equity tool by public health organizations in Quebec. Using a hybrid integrated research design is a richer way to address the complexity of questions emerging from intervention and planning settings. This approach provides a deeper, operational, and contextualized understanding of research results involving different professional and organizational cultures, and thereby supports the decision-making process. Concept mapping served to identify and prioritize, in a limited timeframe, the conditions to be considered for incorporating a health equity tool into public health practices. Focus groups then provided a more refined understanding of the barriers, issues, and facilitating factors surrounding the tool's adoption, helped distinguish among participants' perspectives based on functional roles and organizational contexts, and clarified some apparently contradictory results from the concept map. The combined use of these two techniques brought the strengths of each approach to bear, thereby overcoming some of the respective limitations of concept mapping and focus groups. This design is appropriate for investigating targets with multiple levels of complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Google Maps offers a new way to evaluate claudication.

    PubMed

    Khambati, Husain; Boles, Kim; Jetty, Prasad

    2017-05-01

    Accurate determination of walking capacity is important for the clinical diagnosis and management plan for patients with peripheral arterial disease. The current "gold standard" of measurement is walking distance on a treadmill. However, treadmill testing is not always reflective of the patient's natural walking conditions, and it may not be fully accessible in every vascular clinic. The objective of this study was to determine whether Google Maps, the readily available GPS-based mapping tool, offers an accurate and accessible method of evaluating walking distances in vascular claudication patients. Patients presenting to the outpatient vascular surgery clinic between November 2013 and April 2014 at the Ottawa Hospital with vasculogenic calf, buttock, and thigh claudication symptoms were identified and prospectively enrolled in our study. Onset of claudication symptoms and maximal walking distance (MWD) were evaluated using four tools: history; Walking Impairment Questionnaire (WIQ), a validated claudication survey; Google Maps distance calculator (patients were asked to report their daily walking routes on the Google Maps-based tool runningmap.com, and walking distances were calculated accordingly); and treadmill testing for onset of symptoms and MWD, recorded in a double-blinded fashion. Fifteen patients were recruited for the study. Determination of walking distances using Google Maps proved to be more accurate than by both clinical history and WIQ, correlating highly with the gold standard of treadmill testing for both claudication onset (r = .805; P < .001) and MWD (r = .928; P < .0001). In addition, distances were generally under-reported on history and WIQ. The Google Maps tool was also efficient, with reporting times averaging below 4 minutes. For vascular claudicants with no other walking limitations, Google Maps is a promising new tool that combines the objective strengths of the treadmill test and incorporates real-world walking environments. It offers an accurate, efficient, inexpensive, and readily accessible way to assess walking distances in patients with peripheral vascular disease. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
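
    The agreement figures quoted appear to be ordinary product-moment correlations between the Google-Maps-derived distances and the treadmill distances. For illustration, such a validation could be computed as follows (the numbers here are made up, not the study's data):

        from scipy.stats import pearsonr

        # Hypothetical maximal walking distances in metres for five patients.
        treadmill_mwd = [180, 240, 320, 150, 410]
        googlemaps_mwd = [195, 230, 335, 160, 390]

        r, p = pearsonr(treadmill_mwd, googlemaps_mwd)
        print(f"r = {r:.3f}, P = {p:.4f}")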

  16. Preferred reporting items for studies mapping onto preference-based outcome measures: The MAPS statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-08-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprised of six health economists and one Delphi methodologist. Following a two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user-friendly 23-item checklist. They are presented numerically and categorised within six sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by eight health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in five years' time. This statement was published jointly in Applied Health Economics and Health Policy, Health and Quality of Life Outcomes, International Journal of Technology Assessment in Health Care, Journal of Medical Economics, Medical Decision Making, PharmacoEconomics, and Quality of Life Research.

  17. Enhanced representation of soil NO emissions in the ...

    EPA Pesticide Factsheets

    Modeling of soil nitric oxide (NO) emissions is highly uncertain and may misrepresent its spatial and temporal distribution. This study builds upon a recently introduced parameterization to improve the timing and spatial distribution of soil NO emission estimates in the Community Multiscale Air Quality (CMAQ) model. The parameterization considers soil parameters, meteorology, land use, and mineral nitrogen (N) availability to estimate NO emissions. We incorporate daily year-specific fertilizer data from the Environmental Policy Integrated Climate (EPIC) agricultural model to replace the annual generic data of the initial parameterization, and use a 12 km resolution soil biome map over the continental USA. CMAQ modeling for July 2011 shows slight differences in model performance in simulating fine particulate matter and ozone from Interagency Monitoring of Protected Visual Environments (IMPROVE) and Clean Air Status and Trends Network (CASTNET) sites and NO2 columns from Ozone Monitoring Instrument (OMI) satellite retrievals. We also simulate how the change in soil NO emissions scheme affects the expected O3 response to projected emissions reductions. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and

  18. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data.

    PubMed

    Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun

    2006-08-25

    We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and correctly yet efficiently map PETs to reference genome sequences. To accommodate and streamline data analysis of the large volumes of PET sequences generated by each PET experiment, an automated PET data processing pipeline is desirable. We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. On a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.

  19. A web-based tool for groundwater mapping and drought analysis

    NASA Astrophysics Data System (ADS)

    Christensen, S.; Burns, M.; Jones, N.; Strassberg, G.

    2012-12-01

    In 2011-2012, the state of Texas saw the worst one-year drought on record. Fluctuations in gravity measured by GRACE satellites indicate that as much as 100 cubic kilometers of water was lost during this period. Much of this came from reservoirs and shallow soil moisture, but a significant amount came from aquifers. In response to this crisis, a Texas Drought Technology Steering Committee (TDTSC) consisting of academics and water managers was formed to develop new tools and strategies to assist the state in monitoring, predicting, and responding to drought events. In this presentation, we describe one of the tools that was developed as part of this effort. When analyzing the impact of drought on groundwater levels, it is fairly common to examine time series data at selected monitoring wells. However, accurately assessing impacts and trends requires both spatial and temporal analysis involving the development of detailed water level maps at various scales. Creating such maps in a flexible and rapid fashion is critical for effective drought analysis, but can be challenging due to the massive amounts of data involved and the processing required to generate such maps. Furthermore, wells are typically not sampled at the same points in time, and so developing a water table map for a particular date requires both spatial and temporal interpolation of water elevations. To address this challenge, a Cloud-based water level mapping system was developed for the state of Texas. The system is based on the Texas Water Development Board (TWDB) groundwater database, but can be adapted to use other databases as well. The system involves a set of ArcGIS workflows running on a server with a web-based front end and a Google Earth plug-in. A temporal interpolation geoprocessing tool was developed to estimate the piezometric heads for all wells in a given region at a specific date using a regression analysis. This interpolation tool is coupled with other geoprocessing tools to filter data and interpolate point elevations spatially to produce water level, drawdown, and depth to groundwater maps. The web interface allows for users to generate these maps at locations and times of interest. A sequence of maps can be generated over a period of time and animated to visualize how water levels are changing. The time series regression analysis can also be used to do short-term predictions of future water levels.
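
    The temporal step - estimating each well's head at a common target date from irregularly timed measurements - can be done with an ordinary least-squares trend per well, along the lines of this sketch (a generic illustration with made-up numbers, not the ArcGIS workflow itself):

        import numpy as np

        def head_at_date(sample_days: np.ndarray, heads: np.ndarray,
                         target_day: float) -> float:
            """Fit head = a*t + b to one well's measurements (t in days since
            an epoch) and evaluate the fitted line at the target date."""
            a, b = np.polyfit(sample_days, heads, deg=1)
            return a * target_day + b

        # Hypothetical well: five measurements over roughly two years.
        days = np.array([0.0, 120.0, 300.0, 510.0, 640.0])
        heads = np.array([352.1, 351.4, 350.2, 348.9, 348.1])  # metres
        print(round(head_at_date(days, heads, 580.0), 2))

    Spatial interpolation of these per-well estimates then yields the water level surface for the requested date.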

  20. Exploring physics concepts among novice teachers through CMAP tools

    NASA Astrophysics Data System (ADS)

    Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.

    2018-03-01

    Concept maps are graphical tools for organising, elaborating and representing knowledge. Through the Cmap tools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between the lecturer's scoring and peer-teachers' scoring were also illustrated. The study offers some implications, especially for physics educators, in determining the hierarchical structure of physics concepts, constructing a physics focus question, and seeing how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.

  1. Geomorphic Unit Tool (GUT): Applications of Fluvial Mapping

    NASA Astrophysics Data System (ADS)

    Kramer, N.; Bangen, S. G.; Wheaton, J. M.; Bouwes, N.; Wall, E.; Saunders, C.; Bennett, S.; Fortney, S.

    2017-12-01

    Geomorphic units are the building blocks of rivers and represent distinct habitat patches for many fluvial organisms. We present the Geomorphic Unit Toolkit (GUT), a flexible GIS geomorphic unit mapping tool, to generate maps of fluvial landforms from topography. GUT applies attributes to landforms based on flow stage (Tier 1), topographic signatures (Tier 2), geomorphic characteristics (Tier 3) and patch characteristics (Tier 4) to derive attributed maps at the level of detail required by analysts. We hypothesize that if more rigorous and consistent geomorphic mapping is conducted, better correlations between physical habitat units and ecohydraulic model results will be obtained compared to past work. Using output from GUT for coarse bed tributary streams in the Columbia River Basin, we explore relationships between salmonid habitat and geomorphic spatial metrics. We also highlight case studies of how GUT can be used to showcase geomorphic impact from large wood restoration efforts. Provided high resolution topography exists, this tool can be used to quickly assess changes in fluvial geomorphology in watersheds impacted by human activities.

  2. V and V of Lexical, Syntactic and Semantic Properties for Interactive Systems Through Model Checking of Formal Description of Dialog

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Martinie, Celia; Palanque, Philippe

    2013-01-01

    During the early phases of the development of an interactive system, future system properties are identified (through interaction with end users in the brainstorming and prototyping phases of the application, or by other stakeholders), imposing requirements on the final system. They can be specific to the application under development or generic to all applications, such as usability principles. Instances of specific properties include visibility of the aircraft altitude, speed… in the cockpit and the continuous possibility of disengaging the autopilot in whatever state the aircraft is. Instances of generic properties include availability of undo (for undoable functions) and availability of a progression bar for functions lasting more than four seconds. While behavioral models of interactive systems using formal description techniques provide complete and unambiguous descriptions of states and state changes, they do not provide an explicit representation of the absence or presence of properties. Assessing that the system that has been built is the right system remains a challenge, usually met through extensive use and acceptance tests. With an explicit representation of properties and tools to support checking them, it becomes possible to provide developers with means for systematic exploration of the behavioral models and assessment of the presence or absence of these properties. This paper proposes the synergistic use of two tools for checking both generic and specific properties of interactive applications: Petshop and Java PathFinder. Petshop is dedicated to the description of interactive system behavior. Java PathFinder is dedicated to the runtime verification of Java applications and, through an extension, to user interfaces. The approach is exemplified on a safety-critical application in the area of interactive cockpits for large civil aircraft.

  3. Development and testing of a tool for assessing and resolving medication-related problems in older adults in an ambulatory care setting: the individualized medication assessment and planning (iMAP) tool.

    PubMed

    Crisp, Ginny D; Burkhart, Jena Ivey; Esserman, Denise A; Weinberger, Morris; Roth, Mary T

    2011-12-01

    Medication is one of the most important interventions for improving the health of older adults, yet it has great potential for causing harm. Clinical pharmacists are well positioned to engage in medication assessment and planning. The Individualized Medication Assessment and Planning (iMAP) tool was developed to aid clinical pharmacists in documenting medication-related problems (MRPs) and associated recommendations. The purpose of our study was to assess the reliability and usability of the iMAP tool in classifying MRPs and associated recommendations in older adults in the ambulatory care setting. Three cases, representative of older adults seen in an outpatient setting, were developed. Pilot testing was conducted and a "gold standard" key developed. Eight eligible pharmacists consented to participate in the study. They were instructed to read each case, make an assessment of MRPs, formulate a plan, and document the information using the iMAP tool. Inter-rater reliability was assessed for each case, comparing the pharmacists' identified MRPs and recommendations to the gold standard. Consistency of categorization across reviewers was assessed using the κ statistic or percent agreement. The mean κ across the 8 pharmacists in classifying MRPs compared with the gold standard was 0.74 (range, 0.54-1.00) for case 1 and 0.68 (range, 0.36-1.00) for case 2, indicating substantial agreement. For case 3, percent agreement was 63% (range, 40%-100%). The mean κ across the 8 pharmacists when classifying recommendations compared with the gold standard was 0.87 (range, 0.58-1.00) for case 1 and 0.88 (range, 0.75-1.00) for case 2, indicating almost perfect agreement. For case 3, percent agreement was 68% (range, 40%-100%). Clinical pharmacists found the iMAP tool easy to use. The iMAP tool provides a reliable and standardized approach for clinical pharmacists to use in the ambulatory care setting to classify MRPs and associated recommendations. Future studies will explore the predictive validity of the tool on clinical outcomes such as health care utilization. Copyright © 2011 Elsevier HS Journals, Inc. All rights reserved.
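
    For reference, the inter-rater statistic reported here is Cohen's kappa, which discounts the agreement expected by chance (the standard definition, not something specific to this study):

        \kappa = \frac{p_o - p_e}{1 - p_e} ,

    where p_o is the observed proportion of agreement with the gold standard and p_e the proportion expected by chance. \kappa = 1 indicates perfect agreement, and values between 0.61 and 0.80 are conventionally read as substantial agreement, matching the interpretation given above.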

  4. The Adversarial Route Analysis Tool: A Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries. It is a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  5. GuidosToolbox: universal digital image object analysis

    Treesearch

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, userfriendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  6. Linking the ACT ASPIRE Assessments to NWEA MAP Assessments

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Northwest Evaluation Association™ (NWEA™) is committed to providing partners with useful tools to help make inferences from Measures of Academic Progress® (MAP®) interim assessment scores. One important tool is the concordance table between MAP and state summative assessments. Concordance tables have been used for decades to relate scores on…

  7. Assessment and Application of National Environmental Databases and Mapping Tools at the Local Level to Two Community Case Studies

    EPA Science Inventory

    Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessm...

  8. Mind Maps as Facilitative Tools in Science Education

    ERIC Educational Resources Information Center

    Safar, Ammar H.; Jafer, Yaqoub J.; Alqadiri, Mohammad A.

    2014-01-01

    This study explored the perceptions, attitudes, and willingness of pre-service science teachers in the College of Education at Kuwait University about using concept/mind maps and its related application software as facilitative tools, for teaching and learning, in science education. The first level (i.e., reaction) of Kirkpatrick's/Phillips'…

  9. Incorporating Concept Mapping in Project-Based Learning: Lessons from Watershed Investigations

    NASA Astrophysics Data System (ADS)

    Rye, James; Landenberger, Rick; Warner, Timothy A.

    2013-06-01

    The concept map tool set forth by Novak and colleagues is underutilized in education. A meta-analysis has encouraged teachers to make extensive use of concept mapping, and researchers have advocated computer-based concept mapping applications that exploit hyperlink technology. Through an NSF sponsored geosciences education grant, middle and secondary science teachers participated in professional development to apply computer-based concept mapping in project-based learning (PBL) units that investigated local watersheds. Participants attended a summer institute, engaged in a summer through spring online learning academy, and presented PBL units at a subsequent fall science teachers' convention. The majority of 17 teachers who attended the summer institute had previously used the concept mapping strategy with students and rated it highly. Of the 12 teachers who continued beyond summer, applications of concept mapping ranged from collaborative planning of PBL projects to building students' vocabulary to students producing maps related to the PBL driving question. Barriers to the adoption and use of concept mapping included technology access at the schools, lack of time for teachers to advance their technology skills, lack of student motivation to choose to learn, and student difficulty with linking terms. In addition to mitigating the aforementioned barriers, projects targeting teachers' use of technology tools may enhance adoption by recruiting teachers as partners from schools as well as a small number that already are proficient in the targeted technology and emphasizing the utility of the concept map as a planning tool.

  10. Using intervention mapping to develop a work-related guidance tool for those affected by cancer.

    PubMed

    Munir, Fehmidah; Kalawsky, Katryna; Wallis, Deborah J; Donaldson-Feilder, Emma

    2013-01-05

    Working-aged individuals diagnosed and treated for cancer require support and assistance to make decisions regarding work. However, healthcare professionals do not consider the work-related needs of patients and employers do not understand the full impact cancer can have upon the employee and their work. We therefore developed a work-related guidance tool for those diagnosed with cancer that enables them to take the lead in stimulating discussion with a range of different healthcare professionals, employers, employment agencies and support services. The tool facilitates discussions through a set of questions individuals can utilise to find solutions and minimise the impact cancer diagnosis, prognosis and treatment may have on their employment, sick leave and return to work outcomes. The objective of the present article is to describe the systematic development and content of the tool using Intervention Mapping Protocol (IMP). The study used the first five steps of the intervention mapping process to guide the development of the tool. A needs assessment identified the 'gaps' in information/advice received from healthcare professionals and other stakeholders. The intended outcomes and performance objectives for the tool were then identified followed by theory-based methods and an implementation plan. A draft of the tool was developed and subjected to a two-stage Delphi process with various stakeholders. The final tool was piloted with 38 individuals at various stages of the cancer journey. The tool was designed to be a self-led tool that can be used by any person with a cancer diagnosis and working for most types of employers. The pilot study indicated that the tool was relevant and much needed. Intervention Mapping is a valuable protocol for designing complex guidance tools. The process and design of this particular tool can lend itself to other situations both occupational and more health-care based.

  11. Using intervention mapping to develop a work-related guidance tool for those affected by cancer

    PubMed Central

    2013-01-01

    Background Working-aged individuals diagnosed and treated for cancer require support and assistance to make decisions regarding work. However, healthcare professionals do not consider the work-related needs of patients and employers do not understand the full impact cancer can have upon the employee and their work. We therefore developed a work-related guidance tool for those diagnosed with cancer that enables them to take the lead in stimulating discussion with a range of different healthcare professionals, employers, employment agencies and support services. The tool facilitates discussions through a set of questions individuals can utilise to find solutions and minimise the impact cancer diagnosis, prognosis and treatment may have on their employment, sick leave and return to work outcomes. The objective of the present article is to describe the systematic development and content of the tool using Intervention Mapping Protocol (IMP). Methods The study used the first five steps of the intervention mapping process to guide the development of the tool. A needs assessment identified the ‘gaps’ in information/advice received from healthcare professionals and other stakeholders. The intended outcomes and performance objectives for the tool were then identified followed by theory-based methods and an implementation plan. A draft of the tool was developed and subjected to a two-stage Delphi process with various stakeholders. The final tool was piloted with 38 individuals at various stages of the cancer journey. Results The tool was designed to be a self-led tool that can be used by any person with a cancer diagnosis and working for most types of employers. The pilot study indicated that the tool was relevant and much needed. Conclusions Intervention Mapping is a valuable protocol for designing complex guidance tools. The process and design of this particular tool can lend itself to other situations both occupational and more health-care based. PMID:23289708

  12. VizieR Online Data Catalog: James Clerk Maxwell Telescope Science Archive (CADC, 2003)

    NASA Astrophysics Data System (ADS)

    Canadian Astronomy Data Centre

    2018-01-01

    The JCMT Science Archive (JSA), a collaboration between the CADC and EOA, is the official distribution site for observational data obtained with the James Clerk Maxwell Telescope (JCMT) on Mauna Kea, Hawaii. The JSA search interface is provided by the CADC Search tool, which provides generic access to the complete set of telescopic data archived at the CADC. Help on the use of this tool is provided via tooltips. For additional information on instrument capabilities and data reduction, please consult the SCUBA-2 and ACSIS instrument pages provided on the JAC maintained JCMT pages. JCMT-specific help related to the use of the CADC AdvancedSearch tool is available from the JAC. (1 data file).

  13. Evaluation of a color-coded Landsat 5/6 ratio image for mapping lithologic differences in western South Dakota

    USGS Publications Warehouse

    Raines, Gary L.; Bretz, R.F.; Shurr, George W.

    1979-01-01

    From analysis of a color-coded Landsat 5/6 ratio image, Raines has produced a map of the vegetation density distribution over 25,000 sq km of western South Dakota. The 5/6 ratio image is produced by digitally calculating the ratio of bands 5 and 6 of the Landsat data and then color coding these ratios in an image. Bretz and Shurr compared this vegetation density map with published and unpublished data, primarily of the U.S. Geological Survey and the South Dakota Geological Survey; good correspondence is seen between this map and existing geologic maps, especially the soils map. We believe that this Landsat ratio image can be used as a tool to refine existing maps of surficial geology and bedrock, where bedrock is exposed, and to improve mapping accuracy in areas of poor exposure common in South Dakota. In addition, this type of image could be a useful additional tool in mapping areas that are unmapped.
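
    The underlying operation is a per-pixel band ratio, which suppresses illumination differences between bands; for Landsat MSS, band 5 is red and band 6 near-infrared, so densely vegetated pixels (bright in the near-infrared) give low 5/6 values. A sketch with hypothetical arrays:

        import numpy as np

        def band_ratio(band5: np.ndarray, band6: np.ndarray) -> np.ndarray:
            """Per-pixel red/NIR ratio; low values indicate dense vegetation."""
            return band5 / np.maximum(band6, 1e-6)  # guard against divide-by-zero

        rng = np.random.default_rng(1)
        band5 = rng.uniform(10, 120, size=(4, 4))  # stand-ins for MSS digital numbers
        band6 = rng.uniform(10, 120, size=(4, 4))
        # Color coding then amounts to binning the ratio into discrete classes:
        classes = np.digitize(band_ratio(band5, band6), bins=[0.5, 1.0, 1.5])
        print(classes)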

  14. Development of a competency mapping tool for undergraduate professional degree programmes, using mechanical engineering as a case study

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.
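
    As an illustration of the kind of roll-up such a tool performs, the sketch below aggregates per-subject competency depth scores to programme level; the subject names, competency codes, and 0-3 depth scale are invented for the example, not taken from the paper.

      # Minimal sketch: aggregate subject-level competency depth scores
      # (hypothetical 0-3 scale) to programme level. All names and values
      # are illustrative, not from the case study.
      from collections import defaultdict

      subject_maps = {
          'Thermodynamics':   {'C1.1': 3, 'C2.2': 2},
          'Machine Design':   {'C1.1': 1, 'C2.2': 3, 'C3.4': 2},
          'Capstone Project': {'C2.2': 2, 'C3.4': 3},
      }

      programme_totals = defaultdict(int)
      for scores in subject_maps.values():
          for competency, depth in scores.items():
              programme_totals[competency] += depth

      for competency, total in sorted(programme_totals.items()):
          print(f'{competency}: cumulative depth {total}')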

  15. Transient performance of fan engine with water ingestion

    NASA Technical Reports Server (NTRS)

    Murthy, S. N. B.; Mullican, A.

    1993-01-01

    In a continuing investigation into developing and applying codes for predicting the performance of a turbine jet engine and its components during water ingestion in flight operation, including changes in power setting, flight altitude, and speed, an attempt was made to establish the effects of water ingestion through simulation of a generic high-bypass-ratio engine with a generic control. In view of the large effects arising in the air compression system and the prediffuser-combustor unit during water ingestion, attention was focused on those effects and the resulting changes in engine performance. Under all conditions of operation, whether ingestion is steady or not, it became evident that water ingestion causes a fan-compressor unit to operate in a time-dependent fashion with periodic features, particularly with respect to the state of water in the span and the film in the casing clearance space at the exit of the machine. On the other hand, the aerodynamic performance of the unit may be considered quasi-steady once the distribution of water has attained an equilibrium state with respect to its distribution and motion. For purposes of engine simulation, the performance maps for the generic fan-compressor unit were generated based on the attainment of a quasi-steady state (meaning steady except for long-period variations in performance) during ingestion and operation over a wide enough range of rotational speeds.
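
    To make the quasi-steady treatment concrete: once a performance map has been tabulated, the engine simulation reduces to interpolating the map at the instantaneous operating point. The sketch below shows the idea with an entirely invented speed-to-pressure-ratio table, not data from the study.

      # Minimal sketch: quasi-steady evaluation of a fan performance map.
      # Map values are invented for illustration only.
      import numpy as np
      from scipy.interpolate import interp1d

      corrected_speed = np.array([0.6, 0.7, 0.8, 0.9, 1.0])    # fraction of design
      pressure_ratio  = np.array([1.15, 1.25, 1.38, 1.52, 1.65])

      fan_map = interp1d(corrected_speed, pressure_ratio, kind='cubic')

      # During a transient, the simulation simply evaluates the map at the
      # instantaneous corrected speed, treating performance as quasi-steady.
      print(fan_map(0.83))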

  16. Toward a generic UGV autopilot

    NASA Astrophysics Data System (ADS)

    Moore, Kevin L.; Whitehorn, Mark; Weinstein, Alejandro J.; Xia, Junjun

    2009-05-01

    Much of the success of small unmanned air vehicles (UAVs) has arguably been due to the widespread availability of low-cost, portable autopilots. While the development of unmanned ground vehicles (UGVs) has led to significant achievements, as typified by recent grand challenge events, to date the UGV equivalent of the UAV autopilot is not available. In this paper we describe our recent research aimed at the development of a generic UGV autopilot. Assuming we are given a drive-by-wire vehicle that accepts as inputs steering, brake, and throttle commands, we present a system that adds sonar ranging sensors, GPS/IMU/odometry, stereo camera, and scanning laser sensors, together with a variety of interfacing and communication hardware. The system also includes a finite state machine-based software architecture as well as a graphical user interface for the operator control unit (OCU). Algorithms are presented that enable an end-to-end scenario whereby an operator can view stereo images as seen by the vehicle and can input GPS waypoints either from a map or in the vehicle's scene-view image, at which point the system uses the environmental sensors as inputs to a Kalman filter for pose estimation and then computes control actions to move through the waypoint list, while avoiding obstacles. The long-term goal of the research is a system that is generically applicable to any drive-by-wire unmanned ground vehicle.
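
    A minimal sketch of the waypoint-following portion of such a control loop is shown below, assuming the pose estimate is already supplied by an upstream Kalman filter; the proportional steering law and the thresholds are illustrative assumptions, not the authors' implementation.

      # Minimal sketch of waypoint following from a filtered pose estimate.
      # Steering law, gain, and arrival radius are illustrative assumptions.
      import math

      def steer_to_waypoints(pose, waypoints, arrive_radius=2.0, gain=1.5):
          """pose = (x, y, heading_rad), e.g. from a Kalman filter;
          returns (steering_cmd, done)."""
          x, y, heading = pose
          if not waypoints:
              return 0.0, True
          wx, wy = waypoints[0]
          if math.hypot(wx - x, wy - y) < arrive_radius:
              waypoints.pop(0)                       # waypoint reached, advance
              return 0.0, not waypoints
          bearing = math.atan2(wy - y, wx - x)
          error = math.atan2(math.sin(bearing - heading),
                             math.cos(bearing - heading))   # wrap to [-pi, pi]
          return gain * error, False

      cmd, done = steer_to_waypoints((0.0, 0.0, 0.0), [(10.0, 5.0)])
      print(cmd, done)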

  17. Evidence Arguments for Using Formal Methods in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh

    2013-01-01

    We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
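
    The pattern-instantiation step can be pictured as template filling: a generic argument skeleton is populated with the data a verification tool emits. The sketch below is a toy illustration of that idea; the pattern text and field names are invented, not the authors' notation.

      # Toy sketch of instantiating an evidence-argument pattern from formal
      # tool output. Pattern text and field names are invented.
      from string import Template

      pattern = Template(
          'Claim: $property holds for $component.\n'
          '  Strategy: argue over the output of $tool.\n'
          '  Evidence: $tool reported "$verdict" on $n_obligations obligations.'
      )

      tool_output = {'property': 'absence of runtime errors',
                     'component': 'autopilot attitude controller',
                     'tool': 'a static analyser',
                     'verdict': 'proved',
                     'n_obligations': 412}

      print(pattern.substitute(tool_output))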

  18. Transportable Manned and Robotic Digital Geophysical Mapping Tow Vehicle, Phase 1

    DTIC Science & Technology

    2007-08-01

    ...by using the UX PROCESS QC/QA tools to evaluate quality. Areas evaluated included induced noise, position and track accuracy, synchronization/latency... To gain additional data on productivity and the effect of alternate direction of travel, we mapped an unobstructed subset of the Grid 1-4 area...

  19. GIS based application tool -- history of East India Company

    NASA Astrophysics Data System (ADS)

    Phophaliya, Sudhir

    The emphasis of the thesis is to build an intuitive and robust GIS (Geographic Information Systems) tool that gives in-depth information on the history of the East India Company. The GIS tool also incorporates various achievements of the East India Company that helped to establish its business all over the world, especially in India. The user has the option to select these movements and acts by clicking on any of the marked states on the world map. The world map also incorporates key features for the East India Company, such as the landing of the East India Company in India, the Darjeeling tea establishment, and the East India Company Stock Redemption Act. The user can learn more about these features simply by clicking on each of them. The primary focus of the tool is to give the user a unique insight into the East India Company; for this, the tool has several HTML (Hypertext Markup Language) pages the user can select. These HTML pages give information on various topics, such as the first voyage, trade with China, and the 1857 revolt. The tool has been developed in Java. For the Indian map, MOJO (Map Objects Java Objects) is used; MOJO is developed by ESRI. The major features shown on the world map were designed using MOJO, which made it easy to incorporate the statistical data with these features. The user interface was intentionally kept simple and easy to use. To keep the user engaged, key aspects are explained using HTML pages. The idea is that pictures will help the user garner interest in the history of the East India Company.

  20. Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows

    NASA Astrophysics Data System (ADS)

    Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.

    2017-06-01

    The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.
