ERIC Educational Resources Information Center
Northwest Evaluation Association, 2013
2013-01-01
While many educators expect the Common Core State Standards (CCSS) to be more rigorous than previous state standards, some wonder if the transition to CCSS and to a Common Core aligned MAP test will have an impact on their students' RIT scores or the NWEA norms. MAP assessments use a proprietary scale known as the RIT (Rasch unit) scale to measure…
Construct Maps as a Foundation for Standard Setting
ERIC Educational Resources Information Center
Wyse, Adam E.
2013-01-01
Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…
Modular avionics packaging standardization
NASA Astrophysics Data System (ADS)
Austin, M.; McNichols, J. K.
The Modular Avionics Packaging (MAP) Program for packaging future military avionics systems, with the objectives of improving reliability, maintainability, and supportability and reducing equipment life cycle costs, is addressed. The basic MAP packaging concepts, called the Standard Avionics Module, the Standard Enclosure, and the Integrated Rack, are summarized, and the benefits of modular avionics packaging, including low-risk design, technology independence with common functions, improved maintainability, and reduced life cycle costs, are discussed. Progress made in MAP is briefly reviewed.
Automating the selection of standard parallels for conic map projections
NASA Astrophysics Data System (ADS)
Šavrič, Bojan; Jenny, Bernhard
2016-05-01
Conic map projections are appropriate for mapping regions at medium and large scales with east-west extents at intermediate latitudes. Conic projections are appropriate for these cases because they show the mapped area with less distortion than other projections. In order to minimize the distortion of the mapped area, the two standard parallels of conic projections need to be selected carefully. Rules of thumb exist for placing the standard parallels based on the width-to-height ratio of the map. These rules of thumb are simple to apply, but do not result in maps with minimum distortion. More sophisticated methods also exist that determine standard parallels such that distortion in the mapped area is minimized. These methods are computationally expensive and cannot be used for real-time web mapping and GIS applications where the projection is adjusted automatically to the displayed area. This article presents a polynomial model that quickly provides the standard parallels for the three most common conic map projections: the Albers equal-area, the Lambert conformal, and the equidistant conic projection. The model defines the standard parallels with polynomial expressions based on the spatial extent of the mapped area. The spatial extent is defined by the length of the mapped central meridian segment, the central latitude of the displayed area, and the width-to-height ratio of the map. The polynomial model was derived from 3825 maps, each with a different spatial extent and computationally determined standard parallels that minimize the mean scale distortion index. The resulting model is computationally simple and can be used for the automatic selection of the standard parallels of conic map projections in GIS software and web mapping applications.
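As an illustration of how such a model can be wired up, here is a minimal Python sketch that returns two standard parallels from the three extent parameters named in the abstract. The function shape follows the abstract's description; the coefficients are hypothetical placeholders, not the fitted values published by Šavrič and Jenny.

```python
# A minimal sketch of a polynomial model for standard parallels. The
# coefficients below are hypothetical placeholders, NOT the fitted values
# from Savric and Jenny (2016).

def standard_parallels(central_lat_deg, meridian_extent_deg, width_to_height):
    """Estimate the two standard parallels of a conic projection from the
    spatial extent of the mapped area (all angles in degrees)."""
    h = meridian_extent_deg   # length of the mapped central meridian segment
    r = width_to_height       # width-to-height ratio of the map
    # Hypothetical polynomial in the three extent parameters; a real model
    # would use coefficients fitted to distortion-minimizing maps.
    offset = 0.25 * h + 0.01 * h * r - 0.0001 * h * central_lat_deg
    return central_lat_deg - offset, central_lat_deg + offset

# Example: an extent roughly like the contiguous United States.
print(standard_parallels(central_lat_deg=39.0, meridian_extent_deg=25.0,
                         width_to_height=1.6))
```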
Collecting Data to Construct an Isoline Map
ERIC Educational Resources Information Center
Lohrengel, C. Frederick, II.; Larson, Paul R.
2017-01-01
National Geography Standard 1 requires that students learn: "How to use maps and other geographic representations, geospatial technologies, and spatial thinking to understand and communicate information" (Heffron and Downs 2012). These concepts have real-world applicability. For example, elevation contour maps are common in many…
Paterson, Trevor; Law, Andy
2009-08-14
Genomic analysis, particularly for less well-characterized organisms, is greatly assisted by performing comparative analyses between different types of genome maps and across species boundaries. Various providers publish a plethora of on-line resources collating genome mapping data from a multitude of species. Datasources range in scale and scope from small bespoke resources for particular organisms, through larger web-resources containing data from multiple species, to large-scale bioinformatics resources providing access to data derived from genome projects for model and non-model organisms. The heterogeneity of information held in these resources reflects both the technologies used to generate the data and the target users of each resource. Currently there is no common information exchange standard or protocol to enable access and integration of these disparate resources. Consequently data integration and comparison must be performed in an ad hoc manner. We have developed a simple generic XML schema (GenomicMappingData.xsd - GMD) to allow export and exchange of mapping data in a common lightweight XML document format. This schema represents the various types of data objects commonly described across mapping datasources and provides a mechanism for recording relationships between data objects. The schema is sufficiently generic to allow representation of any map type (for example genetic linkage maps, radiation hybrid maps, sequence maps and physical maps). It also provides mechanisms for recording data provenance and for cross-referencing external datasources (including, for example, ENSEMBL, PubMed and GenBank). The schema is extensible via the inclusion of additional datatypes, which can be achieved by importing further schemas, e.g. a schema defining relationship types. We have built demonstration web services that export data from our ArkDB database according to the GMD schema, facilitating the integration of data retrieval into Taverna workflows. The data exchange standard we present here provides a useful generic format for transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for inclusion of additional data and provides a mechanism for typing mapping objects via third party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically-controlled access to the data.
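A minimal Python sketch of what emitting mapping data in a lightweight generic XML format of this kind could look like; the element and attribute names below are hypothetical illustrations, not taken from GenomicMappingData.xsd itself.

```python
# Sketch of a GMD-style export: a map, a positioned map item, and a
# cross-reference to an external datasource. Names are hypothetical;
# consult GenomicMappingData.xsd for the real schema.
import xml.etree.ElementTree as ET

root = ET.Element("GenomicMappingData")
gmap = ET.SubElement(root, "Map", {"id": "map1", "type": "genetic_linkage"})
marker = ET.SubElement(gmap, "MapItem",
                       {"id": "mrk42", "name": "SW240", "position": "57.3"})
# Cross-reference to an external datasource, one of the provenance
# mechanisms the abstract describes.
ET.SubElement(marker, "ExternalReference",
              {"source": "ENSEMBL", "accession": "ENSSSCG00000000001"})

print(ET.tostring(root, encoding="unicode"))
```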
US EPA Nonattainment Areas and Designations-Annual PM2.5 (1997 NAAQS)
This web service contains the following layers: PM2.5 Annual 1997 NAAQS State Level and PM2.5 Annual 1997 NAAQS National. It also contains the following tables: maps99.FRED_MAP_VIEWER.%fred_area_map_data and maps99.FRED_MAP_VIEWER.%fred_area_map_view. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA1997PM25Annual/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there
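For readers who want to inspect the service programmatically, here is a small Python sketch against the endpoint quoted above, using the third-party requests library; f=pjson is the standard ArcGIS REST parameter for a JSON response, and the layers field is part of the usual MapServer service JSON.

```python
# Retrieve and print the layer listing from the EPA MapServer endpoint.
import requests

url = ("https://gispub.epa.gov/arcgis/rest/services/"
       "OAR_OAQPS/NAA1997PM25Annual/MapServer")
resp = requests.get(url, params={"f": "pjson"}, timeout=30)
resp.raise_for_status()
for layer in resp.json().get("layers", []):
    print(layer["id"], layer["name"])
```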
Rule-based mapping of fire-adapted vegetation and fire regimes for the Monongahela National Forest
Melissa A. Thomas-Van Gundy; Gregory J. Nowacki; Thomas M. Schuler
2007-01-01
A rule-based approach was employed in GIS to map fire-adapted vegetation and fire regimes within the proclamation boundary of the Monongahela National Forest. Spatial analyses and maps were generated using ArcMap 9.1. The resulting fire-adaptation scores were then categorized into standard fire regime groups. Fire regime group V (200+ yrs) was the most common, assigned...
Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
Progress of Interoperability in Planetary Research for Geospatial Data Analysis
NASA Astrophysics Data System (ADS)
Hare, T. M.; Gaddis, L. R.
2015-12-01
For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTiff, GeoJpeg2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access for higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well-supported by both the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many light-weight browsers, which facilitates large and small focused science applications and public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
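Since both abstracts above center on OGC service standards, a short sketch of the most basic of them, a WMS GetMap request, may help make the idea concrete. The query parameters are those defined by the OGC WMS 1.3.0 standard; the endpoint URL and layer name are hypothetical placeholders.

```python
# Construct a WMS 1.3.0 GetMap request URL, the "simple image maps"
# service mentioned above. Endpoint and layer are placeholders.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "mars_mola_shaded_relief",  # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",            # lat/lon axis order in WMS 1.3.0
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
url = "https://example.org/wms?" + urlencode(params)  # placeholder endpoint
print(url)
```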
Tuberculosis disease mapping in Kedah using standardized morbidity ratio
NASA Astrophysics Data System (ADS)
Diah, Ijlal Mohd; Aziz, Nazrina; Kasim, Maznah Mat
2017-10-01
This paper presents the results of relative risk estimation applied to TB data in Kedah using the most common approach, the Standardized Morbidity Ratio (SMR). Disease mapping has been recognized as one of the methods that governments and public health agencies can use to control diseases, since it gives a clear picture of the risk areas. To obtain a good disease map, relative risk estimation is an important issue that needs to be considered. TB risk areas can be recognized through the map. The results show that Kulim has the lowest risk of contracting TB, while Kota Setar has the highest.
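The SMR itself is simple arithmetic: observed cases divided by the cases expected if the district experienced the reference (overall) incidence rate. A minimal Python sketch with invented district figures:

```python
# Standardized morbidity ratio per district: observed / expected, where
# expected = overall rate * district population. Figures are invented.

districts = {          # district: (observed_cases, population)
    "Kulim":      (30, 300_000),
    "Kota Setar": (250, 400_000),
}
total_cases = sum(o for o, _ in districts.values())
total_pop = sum(p for _, p in districts.values())
overall_rate = total_cases / total_pop

for name, (observed, pop) in districts.items():
    expected = overall_rate * pop
    print(f"{name}: SMR = {observed / expected:.2f}")
```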
Mapping global health research investments, time for new thinking--a Babel Fish for research data.
Terry, Robert F; Allen, Liz; Gardner, Charles A; Guzman, Javier; Moran, Mary; Viergever, Roderik F
2012-09-01
Today we have an incomplete picture of how much the world is spending on health and disease-related research and development (R&D). As such it is difficult to align, or even begin to coordinate, health R&D investments with international public health priorities. Current efforts to track and map global health research investments are complex, resource-intensive, and caveat-laden. An ideal situation would be for all research funding to be classified using a set of common standards and definitions. However, the adoption of such a standard by everyone is not a realistic, pragmatic or even necessary goal. It is time for new thinking informed by the innovations in automated online translation - e.g. Yahoo's Babel Fish. We propose a feasibility study to develop a system that can translate and map the diverse research classification systems into a common standard, allowing the targeting of scarce research investments to where they are needed most.
A Body of Work Standard-Setting Method with Construct Maps
ERIC Educational Resources Information Center
Wyse, Adam E.; Bunch, Michael B.; Deville, Craig; Viger, Steven G.
2014-01-01
This article describes a novel variation of the Body of Work method that uses construct maps to overcome problems of transparency, rater inconsistency, and score gaps commonly occurring with the Body of Work method. The Body of Work method with construct maps was implemented to set cut-scores for two separate K-12 assessment programs in a large…
Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.
2004-01-01
These maps are not intended to be a substitute for site-specific studies for engineering design nor to replace standard maps commonly referenced in building codes. Rather, we hope that these maps will be used as a guide by government agencies; the engineering, urban planning, emergency preparedness, and response communities; and the general public as part of an overall program to reduce earthquake risk and losses in New Mexico.
Standardized Representation of Clinical Study Data Dictionaries with CIMI Archetypes
Sharma, Deepak K.; Solbrig, Harold R.; Prud’hommeaux, Eric; Pathak, Jyotishman; Jiang, Guoqian
2016-01-01
Researchers commonly use a tabular format to describe and represent clinical study data. The lack of standardization of data dictionary’s metadata elements presents challenges for their harmonization for similar studies and impedes interoperability outside the local context. We propose that representing data dictionaries in the form of standardized archetypes can help to overcome this problem. The Archetype Modeling Language (AML) as developed by the Clinical Information Modeling Initiative (CIMI) can serve as a common format for the representation of data dictionary models. We mapped three different data dictionaries (identified from dbGAP, PheKB and TCGA) onto AML archetypes by aligning dictionary variable definitions with the AML archetype elements. The near complete alignment of data dictionaries helped map them into valid AML models that captured all data dictionary model metadata. The outcome of the work would help subject matter experts harmonize data models for quality, semantic interoperability and better downstream data integration. PMID:28269909
Andersen, Flemming; Watanabe, Hideaki; Bjarkam, Carsten; Danielsen, Erik H; Cumming, Paul
2005-07-15
The analysis of physiological processes in brain by positron emission tomography (PET) is facilitated when images are spatially normalized to a standard coordinate system. Thus, PET activation studies of human brain frequently employ the common stereotaxic coordinates of Talairach. We have developed an analogous stereotaxic coordinate system for the brain of the Göttingen miniature pig, based on automatic co-registration of magnetic resonance (MR) images obtained in 22 male pigs. The origin of the pig brain stereotaxic space (0, 0, 0) was arbitrarily placed in the centroid of the pineal gland as identified on the average MRI template. The orthogonal planes were imposed using the line between stereotaxic zero and the optic chiasm. A series of mean MR images in the coronal, sagittal and horizontal planes were generated. To test the utility of the common coordinate system for functional imaging studies of minipig brain, we calculated cerebral blood flow (CBF) maps from normal minipigs and from minipigs with a syndrome of parkinsonism induced by 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) poisoning. These maps were transformed from the native space into the common stereotaxic space. After global normalization of these maps, an undirected search for differences between the groups was then performed using statistical parametric mapping. Using this method, we detected a statistically significant focal increase in CBF in the left cerebellum of the MPTP-lesioned group. We expect the present approach to be of general use in the statistical parametric mapping of CBF and other physiological parameters in living pig brain.
Common Bibliographic Standards for Baylor University Libraries. Revised.
ERIC Educational Resources Information Center
Scott, Sharon; And Others
Developed by a Baylor University (Texas) Task Force, the revised policies of bibliographic standards for the university libraries provide formats for: (1) archives and manuscript control; (2) audiovisual media; (3) books; (4) machine-readable data files; (5) maps; (6) music scores; (7) serials; and (8) sound recordings. The task force assumptions…
NASA Astrophysics Data System (ADS)
Yarnykh, V.; Korostyshevskaya, A.
2017-08-01
Macromolecular proton fraction (MPF) is a biophysical parameter describing the amount of macromolecular protons involved in magnetization exchange with water protons in tissues. MPF represents a significant interest as a magnetic resonance imaging (MRI) biomarker of myelin for clinical applications. A recent fast MPF mapping method enabled clinical translation of MPF measurements due to time-efficient acquisition based on the single-point constrained fit algorithm. However, previous MPF mapping applications utilized only 3 Tesla MRI scanners and modified pulse sequences, which are not commonly available. This study aimed to test the feasibility of MPF mapping implementation on a 1.5 Tesla clinical scanner using standard manufacturer's sequences and to compare the performance of this method between 1.5 and 3 Tesla scanners. MPF mapping was implemented on 1.5 and 3 Tesla MRI units of one manufacturer with either optimized custom-written or standard product pulse sequences. Whole-brain three-dimensional MPF maps obtained from a single volunteer were compared between field strengths and implementation options. MPF maps demonstrated similar quality at both field strengths. MPF values in segmented brain tissues and specific anatomic regions appeared in close agreement. This experiment demonstrates the feasibility of fast MPF mapping using standard sequences on 1.5 T and 3 T clinical scanners.
Cytogenetic map of common bean (Phaseolus vulgaris L.)
Fonsêca, Artur; Ferreira, Joana; dos Santos, Tiago Ribeiro Barros; Mosiolek, Magdalena; Bellucci, Elisa; Kami, James; Gepts, Paul; Geffroy, Valérie; Schweizer, Dieter; dos Santos, Karla G. B.
2010-01-01
A cytogenetic map of common bean was built by in situ hybridization of 35 bacterial artificial chromosomes (BACs) selected with markers mapping to eight linkage groups, plus two plasmids for 5S and 45S ribosomal DNA and one bacteriophage. Together with three previously mapped chromosomes (chromosomes 3, 4, and 7), 43 anchoring points between the genetic map and the cytogenetic map of the species are now available. Furthermore, a subset of four BAC clones was proposed to identify the 11 chromosome pairs of the standard cultivar BAT93. Three of these BACs labelled more than a single chromosome pair, indicating the presence of repetitive DNA in their inserts. A repetitive distribution pattern was observed for most of the BACs; for 38% of them, highly repetitive pericentromeric or subtelomeric signals were observed. These distribution patterns corresponded to pericentromeric and subtelomeric heterochromatin blocks observed with other staining methods. Altogether, the results indicate that around half of the common bean genome is heterochromatic and that genes and repetitive sequences are intermingled in the euchromatin and heterochromatin of the species. PMID:20449646
Lesson Planning with the Common Core
ERIC Educational Resources Information Center
Estes, Linda A.; McDuffie, Amy Roth; Tate, Cathie
2014-01-01
Planning a lesson can be similar to planning a road trip--a metaphor the authors use to describe how they applied research and theory to their lesson planning process. A map and mode of transportation, the Common Core State Standards for Mathematics (CCSSM) and textbooks as resources, can lead to desired destinations, such as students engaging in…
A critical assessment on the role of sentinel node mapping in endometrial cancer.
Bogani, Giorgio; Ditto, Antonino; Martinelli, Fabio; Signorelli, Mauro; Perotto, Stefania; Lorusso, Domenica; Raspagliesi, Francesco
2015-10-01
Endometrial cancer is the most common gynecologic malignancy in the developed countries. Despite the high incidence of this malignancy, however, no consensus exists about the role of retroperitoneal staging. Growing evidence supports the safety and efficacy of sentinel lymph node mapping. This technique is emerging as a new standard for endometrial cancer staging procedures. In the present paper, we discuss the role of sentinel lymph node mapping in endometrial cancer, highlighting the most controversial features.
Snake River Plain Geothermal Play Fairway Analysis - Phase 1 Raster Files
John Shervais
2015-10-09
Snake River Plain Play Fairway Analysis - Phase 1 CRS Raster Files. This dataset contains raster files created in ArcGIS. These raster images depict Common Risk Segment (CRS) maps for HEAT, PERMEABILITY, and SEAL, as well as selected maps of Evidence Layers. These evidence layers consist of either Bayesian krige functions or kernel density functions, and include: (1) HEAT: Heat flow (Bayesian krige map), heat flow standard error on the krige function (data confidence), volcanic vent distribution as a function of age and size, groundwater temperature (equivalue interval and natural breaks bins), and groundwater temperature standard error. (2) PERMEABILITY: Fault and lineament maps, both as mapped and as kernel density functions, processed for both dilational tendency (TD) and slip tendency (ST), along with data confidence maps for each data type. Data types include mapped surface faults from USGS and Idaho Geological Survey databases, as well as unpublished mapping; lineations derived from maximum gradients in magnetic, deep gravity, and intermediate depth gravity anomalies. (3) SEAL: Seal maps based on presence and thickness of lacustrine sediments and base of SRP aquifer. Raster size is 2 km. All files generated in ArcGIS.
Updated symbol catalogue for geologic and geomorphologic mapping in Planetary Sciences
NASA Astrophysics Data System (ADS)
Nass, Andrea; Fortezzo, Corey; Skinner, James, Jr.; Hunter, Marc; Hare, Trent
2017-04-01
Maps are one of the most powerful communication tools for spatial data. This is true for terrestrial data, as well as the many types of planetary data. Geologic and/or geomorphologic maps of planetary surfaces, in particular those of the Moon, Mars, and Venus, are standardized products and often prepared as part of hypothesis-driven science investigations. The NASA-funded Planetary Geologic Mapping program, coordinated by the USGS Astrogeology Science Center (ASC), produces high-quality, standardized, and refereed geologic maps and digital databases of planetary bodies. In this context, 242 geologic, geomorphologic, and thematic map sheets and map series have been published since 1962. However, outside of this program, numerous non-USGS maps are created as results of scientific investigations and published, e.g., as figures or supplemental materials within peer-reviewed journal articles. Due to the complexity of planetary surfaces, the diversity between different planetary surfaces, and the varied resolution of the data, geomorphologic and geologic mapping is a challenging task. Because of these limiting conditions, the mapping process is highly interpretative work and is mostly limited to remotely sensed satellite data, with a few exceptions from rover data. Uniform and unambiguous data are fundamental to making quality observations that lead to unbiased and supported interpretations, especially when there is no current ground-truthing. To allow for correlation between different map products (digital or analog), the most commonly used spatial objects are predefined cartographic symbols. The Federal Geographic Data Committee (FGDC) Digital Cartographic Standard for Geologic Map Symbolization (DCSGMS) defines the most commonly used symbols, colors, and hatch patterns in one comprehensive document. Chapter 25 of the DCSGMS defines the Planetary Geology Features based on the symbols defined in the Venus Mapper's Handbook. After reviewing the 242 planetary geological maps, we propose to 1) review standardized symbols for planetary maps, and 2) recommend an updated symbol collection for adoption by the planetary mapping community. Within these points, the focus is on the change of symbology over time and how it affects communication within and between maps. Two key questions to address are: 1) does Chapter 25 provide enough variability within the subcategories (e.g., faults) to represent the data within the maps? 2) how can recommendations be delivered to the mapping community and its steering committees to enhance a map's communicability and convey information succinctly but thoroughly? To determine the most representative symbol collection of existing maps and support future map results (within or outside of the USGS mapping program), we defined a stepwise task list: 1) statistically review existing symbol sets and collections, 2) establish a representative symbol set for planetary mapping, 3) update cartographic symbols, 4) implement the set in GIS-based mapping software (this implementation will mimic the 2010 application of the planetary symbol set in ArcGIS; more information at https://planetarymapping.wr.usgs.gov/Project), and 5) provide a platform to deliver the symbol set to the mapping community. This project was initiated within ongoing cooperative work between the USGS ASC and the German Aerospace Center (DLR), Dept. of Planetary Geology.
USDA-ARS's Scientific Manuscript database
Root rot diseases of bean (Phaseolus vulgaris L.) are a constraint to dry and snap bean production. We developed the RR138 RIL mapping population from the cross of OSU5446, a susceptible line that meets current snap bean processing industry standards, and RR6950, a root rot resistant dry bean in th...
Isolation and characterization of a cDNA clone specific for avian vitellogenin II.
Protter, A A; Wang, S Y; Shelness, G S; Ostapchuk, P; Williams, D L
1982-01-01
A clone for vitellogenin, a major avian, estrogen-responsive egg yolk protein, was isolated from the cDNA library of estrogen-induced rooster liver. Two forms of plasma vitellogenin, vitellogenin I (VTG I) and vitellogenin II (VTG II), distinguishable on the basis of their unique partial proteolysis maps, have been characterized and their corresponding hepatic precursor forms identified. We have used this criterion to specifically characterize which vitellogenin protein had been cloned. Partial proteolysis maps of VTG I and VTG II standards, synthesized in vivo, were compared to maps of protein synthesized in vitro using RNA hybrid-selected by the vitellogenin plasmid. Eight major digest fragments were found common to the in vitro synthesized vitellogenin and the VTG II standard, while no fragments were observed to correspond to the VTG I map. A restriction map of the VTG II cDNA clone permits comparison to previously described cDNA and genomic vitellogenin clones. PMID:6182527
Building perceptual color maps for visualizing interval data
NASA Astrophysics Data System (ADS)
Kalvin, Alan D.; Rogowitz, Bernice E.; Pelah, Adar; Cohen, Aron
2000-06-01
In visualization, a 'color map' maps a range of data values onto a scale of colors. However, unless a color map is carefully constructed, visual artifacts can be produced. This problem has stimulated considerable interest in creating perceptually based color maps, that is, color maps where equal steps in data value are perceived as equal steps in the color map [Robertson (1988); Pizer (1981); Green (1992); Lefkowitz and Herman (1992)]. In Rogowitz and Treinish (1996, 1998) and in Bergman, Treinish and Rogowitz (1995), we demonstrated that color maps based on luminance or saturation could be good candidates for satisfying this requirement. This work is based on the seminal work of S.S. Stevens (1966), who measured the perceived magnitude of different magnitudes of physical stimuli. He found that for many physical scales, including luminance (cd/m2) and saturation (the 'redness' of a long-wavelength light source), equal ratios in stimulus value produced equal ratios in perceptual magnitude. He interpreted this as indicating that there exists in human cognition a common scale for representing magnitude, and that we scale the effects of different physical stimuli to this internal scale. In Rogowitz, Kalvin, Pelah and Cohen (1999), we used a psychophysical technique to test this hypothesis as it applies to the creation of perceptually uniform color maps. We constructed color maps as trajectories through three color spaces: a common computer graphics standard (uncalibrated HSV), a common perceptually based engineering standard for creating visual stimuli (L*a*b*), and a space commonly used in the graphic arts (Munsell). For each space, we created color scales that varied linearly in hue, saturation, or luminance and measured the detectability of increments in hue, saturation or luminance for each of these color scales. We measured the amplitude of the just-detectable Gaussian increments at 20 different values along the range of each color map. For all three color spaces, we found that luminance-based color maps provided the most perceptually uniform representations of the data. The just-detectable increment was constant at all points in the color map, with the exception of the lowest-luminance values, where a larger increment was required. The saturation-based color maps provided less sensitivity than the luminance-based color maps, requiring much larger increments for detection. For the hue-based color maps, the size of the increment required for detection varied across the range. For example, for the standard 'rainbow' color map (uncalibrated HSV, hue-varying map), a step in the 'green' region required an increment 16 times the size of the increment required in the 'cyan' part of the range. That is, the rainbow color map would not successfully represent changes in the data in the 'green' region of this color map. In this paper, we extend this research by studying the detectability of spatially-modulated Gabor targets based on these hue, saturation and luminance scales. Since, in visualization, the user is called upon to detect and identify patterns that vary in their spatial characteristics, it is important to study how different types of color maps represent data with varying spatial properties. To do so, we measured modulation thresholds for low- (0.2 c/deg) and high-spatial-frequency (4.0 c/deg) Gabor patches and compared them with the Gaussian results. As before, we measured increment thresholds for hue, saturation, and luminance modulations.
These color scales were constructed as trajectories along the three perceptual dimensions of color (hue, saturation, and luminance) in two color spaces, uncalibrated HSV and calibrated L*a*b*. This allowed us to study how the three perceptual dimensions represent magnitude information for test patterns varying in spatial frequency. This design also allowed us to test the hypothesis that the luminance channel best carries high-spatial-frequency information while the saturation channel best represents low-spatial-frequency information (Mullen 1985; DeValois and DeValois 1988).
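To make the luminance-based color map idea concrete, here is a minimal Python sketch that maps normalized data values onto a gray ramp whose luminance grows monotonically with the data, so equal data steps give approximately equal perceptual steps. A production implementation would work in a calibrated space such as CIE L*a*b*; this is only the simplest illustration.

```python
# Simplest possible luminance-based color map: a monotone gray ramp.

def luminance_colormap(value):
    """Map a data value in [0, 1] to an 8-bit RGB gray whose luminance
    grows with the value."""
    v = min(max(value, 0.0), 1.0)   # clamp to [0, 1]
    level = round(255 * v)
    return (level, level, level)

print([luminance_colormap(x / 4) for x in range(5)])
```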
Transport properties in nontwist area-preserving maps
Szezech Jr., J. D.; Caldas, I. L.; Lopes, S. R.; ...
2009-10-23
Nontwist systems, common in the dynamical descriptions of fluids and plasmas, possess a shearless curve with a concomitant transport barrier that eliminates or reduces chaotic transport, even after its breakdown. In order to investigate the transport properties of nontwist systems, we analyze the barrier escape time and barrier transmissivity for the standard nontwist map, a paradigm of such systems. We interpret the sensitive dependence of these quantities upon map parameters by investigating chaotic orbit stickiness and the associated role played by the dominant crossing of stable and unstable manifolds.
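For readers unfamiliar with the system, the standard nontwist map is usually written in the two-parameter form of del Castillo-Negrete and Morrison; below is a minimal Python sketch iterating that form, with parameter values chosen arbitrarily for illustration.

```python
# Iterate the standard nontwist map in its commonly used form:
#   y' = y - b * sin(2*pi*x),  x' = x + a * (1 - y'**2)  (mod 1)
import math

def nontwist_map(x, y, a, b):
    y_new = y - b * math.sin(2 * math.pi * x)
    x_new = (x + a * (1 - y_new ** 2)) % 1.0
    return x_new, y_new

x, y = 0.3, 0.1
for _ in range(5):
    x, y = nontwist_map(x, y, a=0.615, b=0.4)  # arbitrary parameters
    print(f"x = {x:.4f}, y = {y:.4f}")
```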
The First Global Geological Map of Mercury
NASA Astrophysics Data System (ADS)
Prockter, L. M.; Head, J. W., III; Byrne, P. K.; Denevi, B. W.; Kinczyk, M. J.; Fassett, C.; Whitten, J. L.; Thomas, R.; Ernst, C. M.
2015-12-01
Geological maps are tools with which to understand the distribution and age relationships of surface geological units and structural features on planetary surfaces. Regional and limited global mapping of Mercury has already yielded valuable science results, elucidating the history and distribution of several types of units and features, such as regional plains, tectonic structures, and pyroclastic deposits. To date, however, no global geological map of Mercury exists, and there is currently no commonly accepted set of standardized unit descriptions and nomenclature. With MESSENGER monochrome image data, we are undertaking the global geological mapping of Mercury at the 1:15M scale, applying standard U.S. Geological Survey mapping guidelines. This map will enable the development of the first global stratigraphic column of Mercury, will facilitate comparisons among surface units distributed discontinuously across the planet, and will provide guidelines for mappers so that future mapping efforts will be consistent and broadly interpretable by the scientific community. To date we have incorporated three major datasets into the global geological map: smooth plains units, tectonic structures, and impact craters and basins >20 km in diameter. We have classified most of these craters by relative age on the basis of the state of preservation of morphological features and standard classification schemes first applied to Mercury by the Mariner 10 imaging team. Additional datasets to be incorporated include intercrater plains units and crater ejecta deposits. In some regions MESSENGER color data are used to supplement the monochrome data, to help elucidate different plains units. The final map will be published online, together with a peer-reviewed publication. Further, a digital version of the map, containing individual map layers, will be made publicly available for use within geographic information systems (GISs).
Common Calibration Source for Monitoring Long-term Ozone Trends
NASA Technical Reports Server (NTRS)
Kowalewski, Matthew
2004-01-01
Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.
Artemov, Gleb N; Gordeev, Mikhail I; Kokhanenko, Alina A; Moskaev, Anton V; Velichevskaya, Alena I; Stegniy, Vladimir N; Sharakhov, Igor V; Sharakhova, Maria V
2018-03-27
Anopheles beklemishevi is a member of the Maculipennis group of malaria mosquitoes that has the most northern distribution among other members of the group. Although a cytogenetic map for the larval salivary gland chromosomes of this species has been developed, a high-quality standard cytogenetic photomap that enables genomics and population genetics studies of this mosquito at the adult stage is still lacking. In this study, a cytogenetic map for the polytene chromosomes of An. beklemishevi from ovarian nurse cells was developed using high-resolution digital imaging from field collected mosquitoes. PCR-amplified DNA probes for fluorescence in situ hybridization (FISH) were designed based on the genome of An. atroparvus. The DNA probe obtained by microdissection procedures from the breakpoint region was labelled in a DOP-PCR reaction. Population analysis was performed on 371 specimens collected in 18 locations. We report the development of a high-quality standard photomap for the polytene chromosomes from ovarian nurse cells of An. beklemishevi. To confirm the suitability of the map for physical mapping, several PCR-amplified probes were mapped to the chromosomes of An. beklemishevi using FISH. In addition, we identified and mapped DNA probes to flanking regions of the breakpoints of two inversions on chromosome X of this species. Inversion polymorphism was determined in 13 geographically distant populations of An. beklemishevi. Four polymorphic inversions were detected. The positions of common chromosomal inversions were indicated on the map. The study constructed a standard photomap for ovarian nurse cell chromosomes of An. beklemishevi and tested its suitability for physical genome mapping and population studies. Cytogenetic analysis determined inversion polymorphism in natural populations of An. beklemishevi related to this species' adaptation.
in Mapping of Gastric Cancer Incidence in Iran
Asmarian, Naeimehossadat; Jafari-Koshki, Tohid; Soleimani, Ali; Taghi Ayatollahi, Seyyed Mohammad
2016-10-01
Background: In many countries gastric cancer has the highest incidence among the gastrointestinal cancers, and it is the second most common cancer in Iran. The aim of this study was to identify and map high-risk gastric cancer regions at the county level in Iran. Methods: In this study we analyzed gastric cancer data for Iran for the years 2003-2010. Area-to-area Poisson kriging and Besag, York and Mollié (BYM) spatial models were applied to smooth the standardized incidence ratios of gastric cancer for the 373 counties surveyed in this study. The two methods were compared in terms of accuracy and precision in identifying high-risk regions. Results: The highest smoothed standardized incidence rate (SIR) according to area-to-area Poisson kriging was in Meshkinshahr county in Ardabil province in north-western Iran (2.4, SD = 0.05), while the highest smoothed SIR according to the BYM model was in Ardabil, the capital of that province (2.9, SD = 0.09). Conclusion: Both methods of mapping, area-to-area Poisson kriging and BYM, showed the gastric cancer incidence rate to be highest in north and north-west Iran. However, area-to-area Poisson kriging was more precise than the BYM model and required less smoothing. According to the results obtained, preventive measures and treatment programs should be focused on particular counties of Iran.
Landscape features, standards, and semantics in U.S. national topographic mapping databases
Varanka, Dalia
2009-01-01
The objective of this paper is to examine the contrast between local, field-surveyed topographical representation and feature representation in digital, centralized databases and to clarify their ontological implications. The semantics of these two approaches are contrasted by examining the categorization of features by subject domains inherent to national topographic mapping. When comparing five USGS topographic mapping domain and feature lists, results indicate that multiple semantic meanings and ontology rules were applied to the initial digital database, but were lost as databases became more centralized at national scales, and common semantics were replaced by technological terms.
Dolin, Robert H.; Giannone, Gay; Schadow, Gunther
2007-01-01
We sought to determine how well the HL7 / ASTM Continuity of Care Document (CCD) standard supports the requirements underlying the Joint Commission medication reconciliation recommendations. In particular, the Joint Commission emphasizes that transition points in the continuum of care are vulnerable to communication breakdowns, and that these breakdowns are a common source of medication errors. These transition points are the focus of communication standards, suggesting that CCD can support and enable medication related patient safety initiatives. Data elements needed to support the Joint Commission recommendations were identified and mapped to CCD, and a detailed clinical scenario was constructed. The mapping identified minor gaps, and identified fields present in CCD not specifically identified by Joint Commission, but useful nonetheless when managing medications across transitions of care, suggesting that a closer collaboration between the Joint Commission and standards organizations will be mutually beneficial. The nationally recognized CCD specification provides a standards-based solution for enabling Joint Commission medication reconciliation objectives. PMID:18693823
Dolin, Robert H; Giannone, Gay; Schadow, Gunther
2007-10-11
We sought to determine how well the HL7/ASTM Continuity of Care Document (CCD) standard supports the requirements underlying the Joint Commission medication reconciliation recommendations. In particular, the Joint Commission emphasizes that transition points in the continuum of care are vulnerable to communication breakdowns, and that these breakdowns are a common source of medication errors. These transition points are the focus of communication standards, suggesting that CCD can support and enable medication related patient safety initiatives. Data elements needed to support the Joint Commission recommendations were identified and mapped to CCD, and a detailed clinical scenario was constructed. The mapping identified minor gaps, and identified fields present in CCD not specifically identified by Joint Commission, but useful nonetheless when managing medications across transitions of care, suggesting that a closer collaboration between the Joint Commission and standards organizations will be mutually beneficial. The nationally recognized CCD specification provides a standards-based solution for enabling Joint Commission medication reconciliation objectives.
OneGeology-Europe: architecture, portal and web services to provide a European geological map
NASA Astrophysics Data System (ADS)
Tellez-Arenas, Agnès.; Serrano, Jean-Jacques; Tertre, François; Laxton, John
2010-05-01
OneGeology-Europe is a large, ambitious project to make geological spatial data better known and more accessible. The OneGeology-Europe project develops an integrated system of data to create and make accessible for the first time through the internet the geological map of the whole of Europe. The architecture implemented by the project is web-services oriented, based on the OGC standards: the geological map is not a centralized database but is composed of several web services, each of them hosted by a European country involved in the project. Since geological data are elaborated differently from country to country, they are difficult to share. OneGeology-Europe, while providing more detailed and complete information, will foster, even beyond the geological community, an easier exchange of data within Europe and globally. This implies important work on the harmonization of the data, both the model and the content. OneGeology-Europe is characterised by the high technological capacity of the EU Member States, and has the final goal of achieving the harmonisation of European geological survey data according to common standards. As a direct consequence, Europe will make a further step in terms of innovation and information dissemination, continuing to play a world-leading role in the development of geosciences information. The scope of the common harmonized data model was defined primarily by the requirements of the geological map of Europe, but in addition users were consulted and the requirements of both INSPIRE and 'high-resolution' geological maps were considered. The data model is based on GeoSciML, developed since 2006 by a group of Geological Surveys. The data providers involved in the project implemented a new component that allows the web services to deliver the geological map expressed in GeoSciML. In order to capture the information describing the geological units of the map of Europe, the scope of the data model needs to include lithology, age, genesis and metamorphic character. For high-resolution maps, physical properties, bedding characteristics and weathering also need to be added. Furthermore, geological data held by national geological surveys are generally described in the national language of the country. The project has to deal with this multilingual issue, an important requirement of the INSPIRE directive. The project provides a list of harmonized vocabularies, a set of web services to work with them, and a web site to help geoscientists map the terms used in national datasets onto these vocabularies. The web services provided by each data provider, with the particular component that allows them to deliver the harmonised data model and to handle multilingualism, are the first part of the architecture. The project also implements a web portal that provides several functionalities. Thanks to the common data model implemented by each web service delivering a part of the geological map, and using OGC SLD standards, the portal offers the following option: a user can ask for a sub-selection of the map, for instance searching on a particular attribute such as "age is quaternary", and display only the parts of the map that match the filter. Using the web services on the common vocabularies, the data displayed are translated. The project started in September 2008 for two years, with 29 partners from 20 countries (20 partners are Geological Surveys). The budget is 3.25 M€, with a European Commission contribution of 2.6 M€.
The paper will describe the technical solutions used to implement the OneGeology-Europe components: the profile of the common data model to exchange geological data, the web services to view and access geological data, and a geoportal that provides the user with a user-friendly way to discover, view and access geological data.
Accuracy of CNV Detection from GWAS Data.
Zhang, Dandan; Qian, Yudong; Akula, Nirmala; Alliey-Rodriguez, Ney; Tang, Jinsong; Gershon, Elliot S; Liu, Chunyu
2011-01-13
Several computer programs are available for detecting copy number variants (CNVs) using genome-wide SNP arrays. We evaluated the performance of four CNV detection software suites--Birdsuite, Partek, HelixTree, and PennCNV-Affy--in the identification of both rare and common CNVs. Each program's performance was assessed in two ways. The first was its recovery rate, i.e., its ability to call 893 CNVs previously identified in eight HapMap samples by paired-end sequencing of whole-genome fosmid clones, and 51,440 CNVs identified by array Comparative Genome Hybridization (aCGH) followed by validation procedures, in 90 HapMap CEU samples. The second evaluation was program performance calling rare and common CNVs in the Bipolar Genome Study (BiGS) data set (1001 bipolar cases and 1033 controls, all of European ancestry) as measured by the Affymetrix SNP 6.0 array. Accuracy in calling rare CNVs was assessed by positive predictive value, based on the proportion of rare CNVs validated by quantitative real-time PCR (qPCR), while accuracy in calling common CNVs was assessed by false positive/false negative rates based on qPCR validation results from a subset of common CNVs. Birdsuite recovered the highest percentages of known HapMap CNVs containing >20 markers in two reference CNV datasets. The recovery rate increased with decreased CNV frequency. In the tested rare CNV data, Birdsuite and Partek had higher positive predictive values than the other software suites. In a test of three common CNVs in the BiGS dataset, Birdsuite's call was 98.8% consistent with qPCR quantification in one CNV region, but the other two regions showed an unacceptable degree of accuracy. We found relatively poor consistency between the two "gold standards," the sequence data of Kidd et al. and the aCGH data of Conrad et al. Algorithms for calling CNVs, especially common ones, need substantial improvement, and a "gold standard" for detection of CNVs remains to be established.
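As a reminder of the two accuracy measures used above, here is a tiny Python sketch computing positive predictive value and false positive/negative rates from validation counts; the counts are invented for illustration.

```python
# Accuracy measures for CNV calls against qPCR validation.

def ppv(true_positives, false_positives):
    """Positive predictive value: fraction of calls that validate."""
    return true_positives / (true_positives + false_positives)

def error_rates(fp, fn, negatives, positives):
    """False positive and false negative rates."""
    return fp / negatives, fn / positives

# e.g. 47 of 60 rare CNV calls confirmed by qPCR (invented counts)
print(f"PPV = {ppv(47, 13):.2f}")
print("FP rate = %.3f, FN rate = %.3f" % error_rates(5, 8, 900, 100))
```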
Design and implementation of a CORBA-based genome mapping system prototype.
Hu, J; Mungall, C; Nicholson, D; Archibald, A L
1998-01-01
CORBA (Common Object Request Broker Architecture), as an open standard, is considered to be a good solution for the development and deployment of applications in distributed heterogeneous environments. This technology can be applied in the bioinformatics area to enhance utilization, management and interoperation between biological resources. This paper investigates issues in developing CORBA applications for genome mapping information systems in the Internet environment with emphasis on database connectivity and graphical user interfaces. The design and implementation of a CORBA prototype for an animal genome mapping database are described. The prototype demonstration is available via: http://www.ri.bbsrc.ac.uk/ark_corba/. jian.hu@bbsrc.ac.uk
Guiding District Implementation of Common Core State Standards: Innovation Configuration Maps
ERIC Educational Resources Information Center
Roy, Patricia; Killion, Joellen
2011-01-01
Leadership Networks are regional and content-specific networks focused on the preparation of college- and career-ready students. Each network includes teacher leaders, school administrators, central office staff, regional cooperatives, and institutes of higher education. Network members work collaboratively to focus their efforts on regional needs…
Nicholson, Suzanne W.; Stoeser, Douglas B.; Wilson, Frederic H.; Dicken, Connie L.; Ludington, Steve
2007-01-01
The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and rock type information. Such spatial data can be conveniently used to generate derivative maps for purposes that include mineral-resource assessment, metallogenic studies, tectonic studies, and human health and environmental research. In 1997, the United States Geological Survey's Mineral Resources Program initiated an effort to develop national digital databases for use in mineral resource and environmental assessments. One primary activity of this effort was to compile a national digital geologic map database, utilizing state geologic maps, to support mineral resource studies in the range of 1:250,000 to 1:1,000,000 scale. Over the course of the past decade, state databases were prepared using a common standard for the database structure, fields, attributes, and data dictionaries. As of late 2006, standardized geological map databases for all conterminous (CONUS) states have been available on-line as USGS Open-File Reports. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:500,000-scale regional compilations. See below for a list of all published databases.
The Global Genome Biodiversity Network (GGBN) Data Standard specification
Droege, G.; Barker, K.; Seberg, O.; Coddington, J.; Benson, E.; Berendsohn, W. G.; Bunk, B.; Butler, C.; Cawsey, E. M.; Deck, J.; Döring, M.; Flemons, P.; Gemeinholzer, B.; Güntsch, A.; Hollowell, T.; Kelbert, P.; Kostadinov, I.; Kottmann, R.; Lawlor, R. T.; Lyal, C.; Mackenzie-Dodds, J.; Meyer, C.; Mulcahy, D.; Nussbeck, S. Y.; O'Tuama, É.; Orrell, T.; Petersen, G.; Robertson, T.; Söhngen, C.; Whitacre, J.; Wieczorek, J.; Yilmaz, P.; Zetzsche, H.; Zhang, Y.; Zhou, X.
2016-01-01
Genomic samples of non-model organisms are becoming increasingly important in a broad range of studies, from developmental biology and biodiversity analyses to conservation. Genomic sample definition, description, quality, voucher information and metadata all need to be digitized and disseminated across scientific communities. This information needs to be concise and consistent in today's ever-expanding bioinformatic era, so that complementary data aggregators can easily map databases to one another. In order to facilitate the exchange of information on genomic samples and their derived data, the Global Genome Biodiversity Network (GGBN) Data Standard is intended to provide a platform based on a documented agreement to promote the efficient sharing and usage of genomic sample material and associated specimen information in a consistent way. The new data standard presented here builds upon existing standards commonly used within the community, extending them with the capability to exchange data on tissue, environmental and DNA samples as well as sequences. The GGBN Data Standard will reveal and democratize the hidden contents of biodiversity biobanks, for the convenience of everyone in the wider biobanking community. Technical tools exist for data providers to easily map their databases to the standard. Database URL: http://terms.tdwg.org/wiki/GGBN_Data_Standard PMID:27694206
Collaborative Metadata Curation in Support of NASA Earth Science Data Stewardship
NASA Technical Reports Server (NTRS)
Sisco, Adam W.; Bugbee, Kaylin; le Roux, Jeanne; Staton, Patrick; Freitag, Brian; Dixon, Valerie
2018-01-01
A growing collection of NASA Earth science data is archived and distributed by EOSDIS's 12 Distributed Active Archive Centers (DAACs). Each collection and granule is described by a metadata record housed in the Common Metadata Repository (CMR). Multiple metadata standards are in use, and core elements of each are mapped to and from a common model, the Unified Metadata Model (UMM). This work was done by the Analysis and Review of CMR (ARC) team.
Current Approaches to Improving Marine Geophysical Data Discovery and Access
NASA Astrophysics Data System (ADS)
Jencks, J. H.; Cartwright, J.; Varner, J. D.; Anderson, C.; Robertson, E.; McLean, S. J.
2016-02-01
Exploring, understanding, and managing the global oceans is a challenge when hydrographic maps are available for only 5% of the world's oceans, even less of which has been mapped geologically or for benthic habitats. Seafloor mapping is expensive, and most government and academic budgets continue to tighten. The first step for any mapping program, before setting out to map uncharted waters, should be to identify whether data already exist in the area of interest. There are many reasons why this seemingly simple suggestion is not commonplace. While certain datasets are accessible online (e.g., NOAA's NCEI, EMODnet, IHO-DCDB), many are not. In some cases, data that are publicly available are difficult to discover and access. No single agency can successfully resolve the complex and pressing demands of ocean and coastal mapping and the associated data stewardship. NOAA partners with other federal agencies to provide an integrated approach to carry out a coordinated and comprehensive ocean and coastal mapping program. In order to maximize the return on their mapping investment, legacy and newly acquired data must be easily discoverable and readily accessible in numerous applications and formats, now and well into the future. At NOAA's National Centers for Environmental Information (NCEI), resources are focused on ensuring the security and widespread availability of the Nation's scientific marine geophysical data through long-term stewardship. The public value of these data and products is maximized by streamlining data acquisition and processing operations, minimizing redundancies, facilitating discovery, and developing common standards to promote re-use. For its part, NCEI draws on a variety of software technologies and adheres to international standards to meet this challenge. The result is a geospatial framework built on spatially-enabled databases, standards-based web services, and International Organization for Standardization (ISO) metadata. In order to maximize effectiveness in ocean and coastal mapping, we must be sure that limited funding is not being used to collect data in areas where data already exist. By making data more accessible, NCEI extends the use, and therefore the value, of these data. Working together, we can ensure that valuable data are made available to the broadest community.
A Servicewide Benthic Mapping Program for National Parks
Moses, Christopher S.; Nayegandhi, Amar; Beavers, Rebecca; Brock, John
2010-01-01
In 2007, the National Park Service (NPS) Inventory and Monitoring Program directed the initiation of a benthic habitat mapping program in ocean and coastal parks in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With 74 ocean and Great Lakes parks stretching over more than 5,000 miles of coastline across 26 States and territories, this Servicewide Benthic Mapping Program (SBMP) is essential. This program will deliver benthic habitat maps and their associated inventory reports to NPS managers in a consistent, servicewide format to support informed management and protection of 3 million acres of submerged National Park System natural and cultural resources. The NPS and the U.S. Geological Survey (USGS) convened a workshop June 3-5, 2008, in Lakewood, Colo., to discuss the goals and develop the design of the NPS SBMP with an assembly of experts (Moses and others, 2010) who identified park needs and suggested best practices for inventory and mapping of bathymetry, benthic cover, geology, geomorphology, and some water-column properties. The recommended SBMP protocols include servicewide standards (such as gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). SBMP Mapping Process. The SBMP calls for a multi-step mapping process for each park, beginning with a gap assessment and data mining to determine data resources and needs. An interagency announcement of intent to acquire new data will provide opportunities to leverage partnerships. Prior to new data acquisition, all involved parties should be included in a scoping meeting held at network scale. Data collection will be followed by processing and interpretation, and finally expert review and publication. After publication, all digital materials will be archived in a common format. SBMP Classification Scheme. The SBMP will map using the Coastal and Marine Ecological Classification Standard (CMECS) that is being modified to include all NPS needs, such as lacustrine ecosystems and submerged cultural resources. CMECS Version III (Madden and others, 2010) includes components for water column, biotic cover, surface geology, sub-benthic, and geoform. SBMP Data Archiving. The SBMP calls for the storage of all raw data and final products in common-use data formats. The concept of 'collect once, use often' is essential to efficient use of mapping resources. Data should also be shared with other agencies and the public through various digital clearing houses, such as Geospatial One-Stop (http://gos2.geodata.gov/wps/portal/gos). To be most useful for managing submerged resources, the SBMP advocates the inventory and mapping of the five components of marine ecosystems: surface geology, biotic cover, geoform, sub-benthic, and water column. A complete benthic inventory of a park would include maps of bathymetry and the five components of CMECS. The completion of mapping for any set of components, such as bathymetry and surface geology, or a particular theme (for example, submerged aquatic vegetation) should also include a printed report.
Smart "geomorphological" map browsing - a tale about geomorphological maps and the internet
NASA Astrophysics Data System (ADS)
Geilhausen, M.; Otto, J.-C.
2012-04-01
With the digital production of geomorphological maps, the dissemination of research outputs now extends beyond simple paper products. Internet technologies can contribute both to the dissemination of geomorphological maps and to access to geomorphological data, and can help make geomorphological knowledge available to a wider public. Indeed, many national geological surveys employ end-to-end digital workflows from data capture in the field to final map production and dissemination. This paper deals with the potential of web mapping applications and interactive, portable, georeferenced PDF maps for the distribution of geomorphological information. Web mapping applications such as Google Maps have become very popular and widespread, increasing interest in and access to mapping. They link the Internet with GIS technology and are a common way of presenting dynamic maps online. The GIS processing is performed online and maps are visualised in interactive web viewers characterised by capabilities such as zooming, panning or adding further thematic layers, with the map refreshed after each task. Depending on the system architecture and the components used, advanced symbology, map overlays from different applications and sources, and their integration into a desktop GIS are possible. This interoperability is achieved through the use of international open standards that include mechanisms for the integration and visualisation of information from multiple sources. The portable document format (PDF) is commonly used for printing and is a standard format that can be processed by many graphics programs and printers without loss of information. A GeoPDF enables the sharing of geospatial maps and data in PDF documents. Multiple, independent map frames with individual spatial reference systems are possible within a GeoPDF, for example for map overlays or insets. Geospatial functionality of a GeoPDF includes scalable map display, layer visibility control, access to attribute data, coordinate queries and spatial measurements. The full functionality of GeoPDFs requires free and user-friendly plug-ins for PDF readers and GIS software. A GeoPDF thus provides fundamental GIS functionality, turning the formerly static PDF map into an interactive, portable, georeferenced map. GeoPDFs are easy to create and provide an interesting and valuable way to disseminate geomorphological maps. Our motivation to engage with the online distribution of geomorphological maps originates in the increasing number of web mapping applications available today, indicating that the Internet has become a medium for displaying geographical information in rich forms and user-friendly interfaces. So why not use the Internet to distribute geomorphological maps and enhance their practical application? Web mapping and dynamic PDF maps can play a key role in the movement towards global dissemination of geomorphological information. This will be exemplified by live demonstrations of (i) existing geomorphological WebGIS applications, (ii) data merging from various sources using web map services, and (iii) freely downloadable GeoPDF maps during the presentations.
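As a minimal sketch of producing a GeoPDF-style product programmatically, the following uses GDAL's PDF driver, assuming it is available in the local GDAL build; the file names and creation options are illustrative, and this is one common route rather than the authors' workflow.

```python
"""Sketch: exporting a georeferenced raster map to a geospatial PDF with
GDAL's PDF driver. File names are placeholders; the PDF driver must be
compiled into the local GDAL installation for this to work."""
from osgeo import gdal

src = "geomorphological_map.tif"   # hypothetical georeferenced raster
dst = "geomorphological_map.pdf"

# gdal.Translate copies the raster and its spatial reference into a PDF;
# creation options control resolution and embedded document metadata.
gdal.Translate(
    dst, src, format="PDF",
    creationOptions=["DPI=300", "TITLE=Geomorphological map"],
)
print("wrote", dst)
```

The resulting PDF keeps its spatial reference, so readers with a suitable plug-in can query coordinates and measure distances as described above.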
Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-05-01
To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. The patients and observations excluded were due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria using the protocol, and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases.
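The mapping-rate figures above come from joining source codes to standard concepts through the OMOP vocabulary. The following is a toy sketch of that idea only; the code lists and concept IDs are invented placeholders, not real OMOP vocabulary content, and a production ETL would use the vocabulary tables rather than an in-memory dictionary.

```python
"""Sketch: mapping source condition/drug codes to a standard vocabulary and
measuring the mapping rate, in the spirit of an OMOP CDM ETL. All codes and
concept IDs below are made-up placeholders."""
from typing import Dict, List, Optional

# Placeholder source-to-standard lookup (real ETLs query the vocabulary tables).
SOURCE_TO_CONCEPT: Dict[str, int] = {
    "ICD9:410.1": 4329847,   # hypothetical standard concept IDs
    "ICD9:250.0": 201826,
    "NDC:00093726": 1112807,
}

def map_codes(source_codes: List[str]) -> List[Optional[int]]:
    """Return the standard concept ID for each source code (None if unmapped)."""
    return [SOURCE_TO_CONCEPT.get(code) for code in source_codes]

records = ["ICD9:410.1", "ICD9:250.0", "ICD9:799.9", "NDC:00093726"]
mapped = map_codes(records)
rate = sum(c is not None for c in mapped) / len(mapped)
print(f"mapped {rate:.0%} of records")  # the paper reports 90-99% by domain
```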
Lithologic mapping of silicate rocks using TIMS
NASA Technical Reports Server (NTRS)
Gillespie, A. R.
1986-01-01
Common rock-forming minerals have thermal infrared spectral features that are measured in the laboratory to infer composition. An airborne Daedalus scanner (TIMS) that collects six channels of thermal infrared radiance data (8 to 12 microns) may be used to measure these same features for rock identification. Previously, false-color composite pictures made from channels 1, 3, and 5, together with emittance spectra for small areas on these images, were used to make lithologic maps. The central wavelength, standard deviation, and amplitude of normal curves regressed on the emittance spectra are related to compositional information for crystalline igneous silicate rocks. As expected, the central wavelength varies systematically with silica content and with modal quartz content. The standard deviation is less sensitive to compositional changes, but large values may result from admixture of vegetation. Compression of the six TIMS channels into three image channels made from the regressed parameters may be effective in improving geologic mapping from TIMS data, and these synthetic images may form a basis for the remote assessment of rock composition.
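A minimal sketch of the regression described, assuming illustrative band-center wavelengths and an invented six-channel spectrum (calibrated TIMS channel wavelengths and real emittances would be needed in practice):

```python
"""Sketch: regressing a normal (Gaussian) curve on a six-channel emittance
spectrum to recover the central wavelength, standard deviation, and amplitude
parameters discussed above. All numbers are illustrative assumptions."""
import numpy as np
from scipy.optimize import curve_fit

def gaussian(wl, amplitude, center, sigma):
    return amplitude * np.exp(-((wl - center) ** 2) / (2.0 * sigma ** 2))

# Approximate band-center wavelengths (micrometers) for six thermal channels.
wavelengths = np.array([8.3, 8.7, 9.1, 9.8, 10.7, 11.7])
# Hypothetical emittance-feature depths for a silica-rich rock:
spectrum = np.array([0.06, 0.09, 0.11, 0.08, 0.04, 0.02])

(amp, center, sigma), _ = curve_fit(
    gaussian, wavelengths, spectrum, p0=[0.1, 9.0, 0.8])
print(f"central wavelength = {center:.2f} um, "
      f"sigma = {sigma:.2f}, amplitude = {amp:.3f}")
# The fitted center shifts with silica content; the three parameters can be
# packed into three image channels as the abstract suggests.
```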
Nicholson, Suzanne W.; Dicken, Connie L.; Horton, John D.; Foose, Michael P.; Mueller, Julia A.L.; Hon, Rudi
2006-01-01
The rapid growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional- and national-scale digital geologic maps that have standardized information about geologic age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. Although two digital geologic maps (Schruben and others, 1994; Reed and Bush, 2004) of the United States currently exist, their scales (1:2,500,000 and 1:5,000,000) are too general for many regional applications. Most states have digital geologic maps at scales of about 1:500,000, but the databases are not comparably structured and, thus, it is difficult to use the digital databases for more than one state at a time. This report describes, for a seven-state region, the results of an effort by the U.S. Geological Survey to produce a series of integrated and standardized state geologic map databases that cover the entire United States. In 1997, the United States Geological Survey's Mineral Resources Program initiated the National Surveys and Analysis (NSA) Project to develop national digital databases. One primary activity of this project was to compile a national digital geologic map database, utilizing state geologic maps, to support studies in the range of 1:250,000 to 1:1,000,000 scale. To accomplish this, state databases were prepared using a common standard for the database structure, fields, attribution, and data dictionaries. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:250,000-scale quadrangle reports. This document provides background information and documentation for the integrated geologic map databases of this report. This report is one of a series of such reports releasing preliminary standardized geologic map databases for the United States. The data products of the project consist of two main parts: the spatial databases and a set of supplemental tables relating to geologic map units. The datasets serve as a data resource to generate a variety of stratigraphic, age, and lithologic maps. This documentation is divided into four main sections: (1) description of the set of data files provided in this report, (2) specifications of the spatial databases, (3) specifications of the supplemental tables, and (4) an appendix containing the data dictionaries used to populate some fields of the spatial databases and supplemental tables.
NASA Astrophysics Data System (ADS)
Lewis, Donna L.; Phinn, Stuart
2011-01-01
Aerial photography interpretation is the most common mapping technique in the world. However, unlike algorithm-based classification of satellite imagery, the accuracy of maps generated by aerial photography interpretation is rarely assessed. Vegetation communities covering an area of 530 km2 on Bullo River Station, Northern Territory, Australia, were mapped using interpretation of 1:50,000 color aerial photography. Manual stereoscopic line-work was delineated at 1:10,000 and thematic maps were generated at 1:25,000 and 1:100,000. Multivariate and intuitive analysis techniques were employed to identify 22 vegetation communities within the study area. The accuracy assessment was based on 50% of a field dataset collected over a 4-year period (2006 to 2009); the remaining 50% of sites were used for map attribution. The overall accuracy and Kappa coefficient for both thematic maps were 66.67% and 0.63, respectively, calculated from standard error matrices. Our findings highlight the need for appropriate scales of mapping and for accuracy assessment of vegetation community maps generated by aerial photography interpretation.
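Figures like the 66.67% overall accuracy and 0.63 Kappa quoted above are derived from an error (confusion) matrix. This is a minimal sketch with an invented 3-class matrix, not the study's data:

```python
"""Sketch: overall accuracy and Cohen's Kappa from an error matrix.
The 3-class matrix is invented for illustration."""
import numpy as np

# Rows = mapped class, columns = reference (field) class.
error_matrix = np.array([
    [30,  5,  2],
    [ 4, 25,  6],
    [ 1,  7, 20],
], dtype=float)

n = error_matrix.sum()
observed = np.trace(error_matrix) / n                       # overall accuracy
# Chance agreement: sum over classes of (row total * column total) / n^2.
expected = (error_matrix.sum(axis=0) * error_matrix.sum(axis=1)).sum() / n**2
kappa = (observed - expected) / (1 - expected)              # chance-corrected
print(f"overall accuracy = {observed:.2%}, kappa = {kappa:.2f}")
```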
SearchGUI: A Highly Adaptable Common Interface for Proteomics Search and de Novo Engines.
Barsnes, Harald; Vaudel, Marc
2018-05-25
Mass-spectrometry-based proteomics has become the standard approach for identifying and quantifying proteins. A vital step consists of analyzing experimentally generated mass spectra to identify the underlying peptide sequences for later mapping to the originating proteins. We here present the latest developments in SearchGUI, a common open-source interface for the most frequently used freely available proteomics search and de novo engines that has evolved into a central component in numerous bioinformatics workflows.
Single-edition quadrangle maps
,
1998-01-01
In August 1993, the U.S. Geological Survey's (USGS) National Mapping Division and the U.S. Department of Agriculture's Forest Service signed an Interagency Agreement to begin a single-edition joint mapping program. This agreement established the coordination for producing and maintaining single-edition primary series topographic maps for quadrangles containing National Forest System lands. The joint mapping program saves money by eliminating duplication of effort by the agencies and results in a more frequent revision cycle for quadrangles containing national forests. Maps are revised on the basis of jointly developed standards and contain normal features mapped by the USGS, as well as additional features required for efficient management of National Forest System lands. Single-edition maps look slightly different but meet the content, accuracy, and quality criteria of other USGS products. The Forest Service is responsible for the land management of more than 191 million acres of land throughout the continental United States, Alaska, and Puerto Rico, including 155 national forests and 20 national grasslands. These areas make up the National Forest System lands and comprise more than 10,600 of the 56,000 primary series 7.5-minute quadrangle maps (15-minute in Alaska) covering the United States. The Forest Service has assumed responsibility for maintaining these maps, and the USGS remains responsible for printing and distributing them. Before the agreement, both agencies published similar maps of the same areas. The maps were used for different purposes, but had comparable types of features that were revised at different times. Now, the two products have been combined into one so that the revision cycle is stabilized and only one agency revises the maps, thus increasing the number of current maps available for National Forest System lands. This agreement has improved service to the public by requiring that the agencies share the same maps and that the maps meet a common standard, as well as by significantly reducing duplication of effort.
Mapping soil texture: targeting a predefined depth range or synthesizing from standard layers?
NASA Astrophysics Data System (ADS)
Laborczi, Annamária; Dezső Kaposi, András; Szatmári, Gábor; Takács, Katalin; Pásztor, László
2017-04-01
There are increasing demands nowadays for spatial soil information to support environment-related and land-use management decisions. Physical soil properties, especially particle size distribution, play an important role in this context. Some of the requirements can be satisfied by the sand, silt, and clay content maps compiled according to global standards such as GlobalSoilMap (GSM) or SoilGrids. Soil texture classes (e.g., according to the USDA classification) can be derived from these three fraction data, so a texture map can be compiled from the corresponding fraction maps. Soil texture class as well as fraction information represent direct inputs to crop, meteorological, and hydrological models. Model inputs frequently require maps representing soil features of the 0-30 cm depth range, which is covered by three consecutive depth intervals in the standard specifications: 0-5 cm, 5-15 cm, and 15-30 cm. As GSM and SoilGrids have become the most detailed freely available spatial soil data sources, typical model users (e.g., meteorologists, agronomists, or hydrologists) would produce the input map from (the weighted mean of) these three layers. However, if the underlying soil data and the proper expertise are available, a soil texture map targeting the 0-30 cm layer directly can be compiled independently. In our work we compared soil texture maps of Hungary compiled using the same reference data, auxiliary data, and inference methods, but with differing layer definitions. We produced 0-30 cm clay, silt, and sand maps as well as maps for the three standard layers (0-5 cm, 5-15 cm, and 15-30 cm). Maps of sand, silt, and clay percentages were computed through regression kriging (RK) applying the additive log-ratio (alr) transformation. In addition to the Hungarian Soil Information and Monitoring System as reference soil data, a digital elevation model and its derived components, soil physical property maps, remotely sensed images, and land use, geological, and meteorological data were applied as auxiliary variables. We compared the directly compiled and the synthesized clay content, sand content, and texture class maps with different tools. In addition to pairwise comparison of basic statistical features (histograms, scatter plots), we examined the spatial distribution of the differences. We quantified the taxonomic distances between the texture classes in order to investigate the differences between the map pairs. We concluded that the directly computed and the synthesized maps show various differences. In the case of the clay and sand content maps, the map pairs have to be considered statistically different. On the other hand, the differences between the texture class maps are not significant. In all cases, however, the differences mainly concern the extreme ranges and categories. Use of synthesized maps can amplify extremes through error propagation in models and scenarios. Based on our results, we suggest using the directly compiled maps.
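As a minimal sketch of the two operations discussed, the following (with illustrative values only) computes the depth-weighted 0-30 cm composition a typical model user would synthesize from the three standard layers, together with the additive log-ratio transform used in the regression kriging:

```python
"""Sketch: synthesizing a 0-30 cm soil property from the three standard
layers by thickness-weighted averaging, plus the additive log-ratio (alr)
transform and its inverse. All values are illustrative."""
import numpy as np

# Clay/silt/sand percentages for one location in the three standard layers.
layers = np.array([
    [22.0, 38.0, 40.0],   # 0-5 cm
    [24.0, 37.0, 39.0],   # 5-15 cm
    [28.0, 35.0, 37.0],   # 15-30 cm
])
thickness = np.array([5.0, 10.0, 15.0])          # layer thicknesses, cm

# Thickness-weighted mean: the "synthesized" 0-30 cm composition.
synthesized = (layers * thickness[:, None]).sum(axis=0) / thickness.sum()
print("0-30 cm clay/silt/sand:", synthesized.round(1))

def alr(composition):
    """Additive log-ratio transform, last part (sand) as the denominator."""
    c = np.asarray(composition, dtype=float)
    return np.log(c[:-1] / c[-1])

def alr_inverse(y):
    """Back-transform alr coordinates to a composition summing to 100."""
    e = np.append(np.exp(y), 1.0)
    return 100.0 * e / e.sum()

print("alr coordinates:", alr(synthesized).round(3))
print("back-transformed:", alr_inverse(alr(synthesized)).round(1))
```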
Lowry, Tina; Vreeman, Daniel J; Loo, George T; Delman, Bradley N; Thum, Frederick L; Slovis, Benjamin H; Shapiro, Jason S
2017-01-01
Background A health information exchange (HIE)–based prior computed tomography (CT) alerting system may reduce avoidable CT imaging by notifying ordering clinicians of prior relevant studies when a study is ordered. For maximal effectiveness, a system would alert not only for prior same CTs (exams mapped to the same code from an exam name terminology) but also for similar CTs (exams mapped to different exam name terminology codes but in the same anatomic region) and anatomically proximate CTs (exams in adjacent anatomic regions). Notification of previous same studies across an HIE requires mapping of local site CT codes to a standard terminology for exam names (such as Logical Observation Identifiers Names and Codes [LOINC]) to show that two studies with different local codes and descriptions are equivalent. Notifying of prior similar or proximate CTs requires an additional mapping of exam codes to anatomic regions, ideally coded by an anatomic terminology. Several anatomic terminologies exist, but no prior studies have evaluated how well they would support an alerting use case. Objective The aim of this study was to evaluate the fitness of five existing standard anatomic terminologies to support similar or proximate alerts of an HIE-based prior CT alerting system. Methods We compared five standard anatomic terminologies (Foundational Model of Anatomy, Systematized Nomenclature of Medicine Clinical Terms, RadLex, LOINC, and LOINC/Radiological Society of North America [RSNA] Radiology Playbook) to an anatomic framework created specifically for our use case (Simple ANatomic Ontology for Proximity or Similarity [SANOPS]), to determine whether the existing terminologies could support our use case without modification. On the basis of an assessment of optimal terminology features for our purpose, we developed an ordinal anatomic terminology utility classification. We mapped samples of 100 random and the 100 most frequent LOINC CT codes to anatomic regions in each terminology, assigned utility classes for each mapping, and statistically compared each terminology’s utility class rankings. We also constructed seven hypothetical alerting scenarios to illustrate the terminologies’ differences. Results Both RadLex and the LOINC/RSNA Radiology Playbook anatomic terminologies ranked significantly better (P<.001) than the other standard terminologies for the 100 most frequent CTs, but no terminology ranked significantly better than any other for 100 random CTs. Hypothetical scenarios illustrated instances where no standard terminology would support appropriate proximate or similar alerts, without modification. Conclusions LOINC/RSNA Radiology Playbook and RadLex’s anatomic terminologies appear well suited to support proximate or similar alerts for commonly ordered CTs, but for less commonly ordered tests, modification of the existing terminologies with concepts and relations from SANOPS would likely be required. Our findings suggest SANOPS may serve as a framework for enhancing anatomic terminologies in support of other similar use cases. PMID:29242174
Standardized unfold mapping: a technique to permit left atrial regional data display and analysis.
Williams, Steven E; Tobon-Gomez, Catalina; Zuluaga, Maria A; Chubb, Henry; Butakoff, Constantine; Karim, Rashed; Ahmed, Elena; Camara, Oscar; Rhode, Kawal S
2017-10-01
Left atrial arrhythmia substrate assessment can involve multiple imaging and electrical modalities, but visual analysis of data on 3D surfaces is time-consuming and suffers from limited reproducibility. Unfold maps (e.g., the left ventricular bull's eye plot) allow 2D visualization, facilitate multimodal data representation, and provide a common reference space for inter-subject comparison. The aim of this work is to develop a method for automatic representation of multimodal information on a left atrial standardized unfold map (LA-SUM). The LA-SUM technique was developed and validated using 18 electroanatomic mapping (EAM) LA geometries before being applied to ten cardiac magnetic resonance/EAM paired geometries. The LA-SUM was defined as an unfold template of an average LA mesh, and registration of clinical data to this mesh facilitated creation of new LA-SUMs by surface parameterization. The LA-SUM represents 24 LA regions on a flattened surface. Intra-observer variability of LA-SUMs for both EAM and CMR datasets was minimal: root-mean-square differences of 0.008 ± 0.010 and 0.007 ± 0.005 ms (local activation time maps), 0.068 ± 0.063 gs (force-time integral maps), and 0.031 ± 0.026 (CMR LGE signal intensity maps). Following validation, LA-SUMs were used for automatic quantification of post-ablation scar formation using CMR imaging, demonstrating a weak but significant relationship between ablation force-time integral and scar coverage (R² = 0.18, P < 0.0001). The proposed LA-SUM displays an integrated unfold map for multimodal information. The method is applicable to any LA surface, including those derived from imaging and EAM systems. The LA-SUM would facilitate standardization of future research studies involving segmental analysis of the LA.
GeoInquiries: Addressing a Grand Challenge for Teaching with GIS in Schools
NASA Astrophysics Data System (ADS)
DiBiase, D.; Baker, T.
2016-12-01
According to the National Research Council (2006), geographic information systems (GIS) are a powerful tool for expanding students' abilities to think spatially, a critical skill for future STEM professionals. However, educators in mainstream subjects in U.S. education have struggled for decades to use GIS effectively in classrooms. GeoInquiries are no-cost, standards-based (NGSS or AP), Creative Commons-licensed instructional activities that guide inquiry around map-based concepts found in key subjects like Earth and environmental science. Web maps developed for GeoInquiries expand upon printed maps in leading textbooks by taking advantage of 21st-century GIS capabilities. GeoInquiry collections consist of 15 activities, each chosen to offer a map-based activity every few weeks throughout the school year. GeoInquiries use a common inquiry instructional framework that many educators learned during their teacher preparation coursework. GeoInquiries are instructionally flexible, acting as much like building blocks for crafting custom activities as finished instructional materials. Over half a million GeoInquiries will be accessed in the next twelve months, serving an anticipated 15 million students. After a generation of outreach to educators, GIS is finally finding its way into the mainstream.
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
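The grouped linear regression method itself is not reproduced here. As a minimal illustration of the core problem the abstract describes, under assumed simulation parameters, the sketch below contrasts a naive linear regression that treats censored ages as true onsets (attenuating the QTL effect) with a Cox proportional hazards fit, here via the lifelines package rather than any of the paper's software:

```python
"""Sketch: why censoring matters when mapping a survival-type trait.
This does NOT implement the grouped linear regression method above; it only
compares a naive regression ignoring censoring with a Cox PH fit on
simulated data (all parameters are invented)."""
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
genotype = rng.integers(0, 3, size=n)            # 0/1/2 copies of QTL allele
true_onset = rng.exponential(scale=10.0 * np.exp(-0.3 * genotype))
censor_time = rng.uniform(5, 20, size=n)         # end-of-study censoring
observed = np.minimum(true_onset, censor_time)
event = (true_onset <= censor_time).astype(int)  # 1 = onset observed

# Naive linear regression treating censored times as real onsets (biased).
slope = np.polyfit(genotype, observed, 1)[0]
print(f"naive regression slope: {slope:.3f}")

# Cox proportional hazards model handling censoring properly.
df = pd.DataFrame({"T": observed, "E": event, "genotype": genotype})
cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
print(cph.summary[["coef", "p"]])
```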
CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.
Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng
2017-01-01
Standardized terminology is a prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidence model to determine how terminologies are mapped to a standard system, such as ICD-10. The model uses mappings from different healthcare organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
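The abstract does not publish the model's details. As a loose illustration of the crowdsourcing idea only, the sketch below aggregates candidate ICD-10 mappings from several hypothetical organizations and scores each consensus mapping by the fraction of sources that agree:

```python
"""Sketch: crowd-style aggregation of candidate terminology mappings.
This is an illustration of the general idea, not CrowdMapping's actual
model; the local terms and proposing organizations are invented."""
from collections import Counter
from typing import Dict, List, Tuple

def aggregate_mappings(candidates: Dict[str, List[str]]) -> Dict[str, Tuple[str, float]]:
    """For each local term, pick the most common target code and report a
    confidence score: the fraction of organizations agreeing with it."""
    result = {}
    for term, codes in candidates.items():
        code, votes = Counter(codes).most_common(1)[0]
        result[term] = (code, votes / len(codes))
    return result

# Mappings proposed by four hypothetical healthcare organizations.
proposed = {
    "heart attack":   ["I21", "I21", "I21", "I25"],
    "sugar diabetes": ["E11", "E11", "E10", "E11"],
}
for term, (code, conf) in aggregate_mappings(proposed).items():
    print(f"{term!r} -> {code} (confidence {conf:.2f})")
```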
NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information
,
2004-01-01
Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.
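As a minimal illustration of such a technology-neutral model being realized in RDF (one of the interchange technologies the summary names), the sketch below encodes one geologic unit with rdflib. The namespace URI and class/property names are invented for illustration and are not an official NADM vocabulary:

```python
"""Sketch: a fragment of a NADM-style conceptual model expressed as RDF.
The nadm: namespace and its class/property names are assumptions made for
this example, not official NADM terms."""
from rdflib import Graph, Literal, Namespace, RDF, RDFS

NADM = Namespace("http://example.org/nadm#")   # hypothetical namespace
g = Graph()
g.bind("nadm", NADM)

# A few of the top-level classes named in the model.
for cls in ("GeologicUnit", "EarthMaterial", "GeologicAge", "GeologicStructure"):
    g.add((NADM[cls], RDF.type, RDFS.Class))

# One geologic map unit described through the controlled-vocabulary pattern.
unit = NADM["unit/Qal"]
g.add((unit, RDF.type, NADM.GeologicUnit))
g.add((unit, RDFS.label, Literal("Quaternary alluvium")))
g.add((unit, NADM.hasAge, NADM["age/Quaternary"]))
g.add((unit, NADM.hasMaterial, NADM["material/alluvium"]))

print(g.serialize(format="turtle"))
```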
SCHeMA web-based observation data information system
NASA Astrophysics Data System (ADS)
Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise
2016-04-01
It is well recognized that the need to share ocean data among non-specialized users is constantly increasing. Initiatives built upon international standards will contribute to simplifying data processing and dissemination, improve user accessibility (including through web browsers), facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) project is developing an open and modular sensing solution for autonomous in situ high-resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled to master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access and download, as well as interoperability with other projects, through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach ensures data accessibility in compliance with major European directives and recommendations. Being modular, the system allows the plug-and-play of commercially available probes as well as new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as a SOS (Sensor Observation Service), provides standard, interoperable access to sensor observations through the O&M standard, as well as to sensor descriptions encoded in the Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, describing data in an already harmonized common standard, is a prerequisite for consistency and interoperability. The SCHeMA SOS has therefore adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration, a Core Profile interface for data access via OGC standards, external services such as web services, WMS and WFS, and a data download and query manager) will be presented and illustrated with examples of ongoing tests in coastal and open sea.
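A minimal sketch of retrieving observations from an SOS endpoint over the SOS 2.0 key-value-pair binding follows. The endpoint URL, offering identifier, and observed-property URI are placeholders; a real client would read the valid values from the service's GetCapabilities response:

```python
"""Sketch: an OGC SOS 2.0 GetObservation request over KVP. All identifiers
are assumed placeholders, not actual SCHeMA service values."""
import requests

SOS_URL = "https://example.org/schema/sos"   # hypothetical endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "mercury_probe_1",                        # assumed offering id
    "observedProperty": "http://example.org/vocab/mercury_concentration",
    "temporalFilter": "om:phenomenonTime,2016-01-01/2016-01-31",
}
resp = requests.get(SOS_URL, params=params, timeout=30)
print(resp.status_code, resp.headers.get("Content-Type"))
# The response is an O&M (Observations & Measurements) XML document whose
# om:result values can be parsed with any XML library.
```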
interPopula: a Python API to access the HapMap Project dataset
2010-01-01
Background: The HapMap project is a publicly available catalogue of common genetic variants that occur in humans, currently including several million SNPs across 1115 individuals spanning 11 different populations. This important database does not provide any programmatic access to the dataset; furthermore, no standard relational database interface is provided. Results: interPopula is a Python API to access the HapMap dataset. interPopula provides integration facilities with both the Python ecosystem of software (e.g. Biopython and matplotlib) and other relevant human population datasets (e.g. Ensembl gene annotation and UCSC Known Genes). A set of guidelines and code examples to address possible inconsistencies across heterogeneous data sources is also provided. Conclusions: interPopula is a straightforward and flexible Python API that facilitates the construction of scripts and applications that require access to the HapMap dataset. PMID:21210977
Stolzberg, Daniel; Wong, Carmen; Butler, Blake E; Lomber, Stephen G
2017-10-15
Brain atlases play an important role in effectively communicating results from neuroimaging studies in a standardized coordinate system. Furthermore, brain atlases extend analysis of functional magnetic resonance imaging (MRI) data by delineating regions of interest over which to evaluate the extent of functional activation as well as measures of inter-regional connectivity. Here, we introduce a three-dimensional atlas of the cat cerebral cortex based on established cytoarchitectonic and electrophysiological findings. In total, 71 cerebral areas were mapped onto the gray matter (GM) of an averaged T1-weighted structural MRI acquired at 7 T from eight adult domestic cats. In addition, a nonlinear registration procedure was used to generate a common template brain as well as GM, white matter, and cerebral spinal fluid tissue probability maps to facilitate tissue segmentation as part of the standard preprocessing pipeline for MRI data analysis. The atlas and associated files can also be used for planning stereotaxic surgery and for didactic purposes. © 2017 Wiley Periodicals, Inc.
Spatial Relation Predicates in Topographic Feature Semantics
Varanka, Dalia E.; Caro, Holly K.
2013-01-01
Topographic data are designed and widely used for base maps of diverse applications, yet the power of these information sources largely relies on the interpretive skills of map readers and relational database expert users once the data are in map or geographic information system (GIS) form. Advances in geospatial semantic technology offer data model alternatives for explicating concepts and articulating complex data queries and statements. To understand and enrich the vocabulary of topographic feature properties for semantic technology, English language spatial relation predicates were analyzed in three standard topographic feature glossaries. The analytical approach drew from disciplinary concepts in geography, linguistics, and information science. Five major classes of spatial relation predicates were identified from the analysis; representations for most of these are not widely available. The classes are: part-whole (which are commonly modeled throughout semantic and linked-data networks), geometric, processes, human intention, and spatial prepositions. These are commonly found in the ‘real world’ and support the environmental science basis for digital topographical mapping. The spatial relation concepts are based on sets of relation terms presented in this chapter, though these lists are not prescriptive or exhaustive. The results of this study make explicit the concepts forming a broad set of spatial relation expressions, which in turn form the basis for expanding the range of possible queries for topographical data analysis and mapping.
Reaction schemes visualized in network form: the syntheses of strychnine as an example.
Proudfoot, John R
2013-05-24
Representation of synthesis sequences in network form provides an effective method for the comparison of multiple reaction schemes and an opportunity to emphasize features, such as reaction scale, that are often relegated to experimental sections. An example of data formatting that allows construction of network maps in Cytoscape is presented, along with maps that illustrate the comparison of multiple reaction sequences, the comparison of scaffold changes within sequences, and consolidation to highlight common key intermediates used across sequences. The 17 different synthetic routes reported for strychnine are used as an example basis set. The reaction maps presented required significant data extraction and curation; a standardized tabular format for reporting reaction information, if applied in a consistent way, could allow the automated combination of reaction information across different sources.
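One concrete way to realize the tabular network formatting the abstract calls for is Cytoscape's simple interaction format (SIF), a real import format in which each line is a source-interaction-target triple. The sketch below writes a toy reaction sequence as a SIF edge list; the intermediates are placeholders, not the paper's curated strychnine data:

```python
"""Sketch: flattening a reaction sequence into a Cytoscape-readable SIF
edge list. Node names are invented placeholders."""
reactions = [
    # (reactant, product, annotation such as reaction type or scale)
    ("tryptamine-derivative", "intermediate-A", "Pictet-Spengler"),
    ("intermediate-A", "intermediate-B", "cyclization"),
    ("intermediate-B", "strychnine", "final-steps"),
]

with open("synthesis_network.sif", "w") as sif:
    for reactant, product, step in reactions:
        # SIF with tab delimiters: source <tab> interaction <tab> target
        sif.write(f"{reactant}\t{step}\t{product}\n")

# Importing this file in Cytoscape (File > Import > Network) reproduces the
# scheme as a directed graph; reusing node names across schemes consolidates
# common key intermediates automatically.
```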
Cunningham, S G; Carinci, F; Brillante, M; Leese, G P; McAlpine, R R; Azzopardi, J; Beck, P; Bratina, N; Bocquet, V; Doggen, K; Jarosz-Chobot, P K; Jecht, M; Lindblad, U; Moulton, T; Metelko, Ž; Nagy, A; Olympios, G; Pruna, S; Skeie, S; Storms, F; Di Iorio, C T; Massi Benedetti, M
2016-01-01
A set of core diabetes indicators was identified in a clinical review of current evidence for the EUBIROD project. In order to allow accurate comparisons of diabetes indicators, a standardised currency for data storage and aggregation was required. We aimed to define a robust European data dictionary with appropriate clinical definitions that can be used to analyse diabetes outcomes and provide the foundation for data collection from existing electronic health records for diabetes. Existing clinical datasets used by 15 partner institutions across Europe were collated, and common data items were analysed for consistency in terms of recording, data definition and units of measurement. Where necessary, data mappings and algorithms were specified in order to allow partners to meet the standard definitions. A series of descriptive elements was created to document metadata for each data item, including recording, consistency, completeness and quality. While datasets varied in terms of consistency, it was possible to create a common standard that could be used by all. The minimum dataset defined 53 data items that were classified according to their feasibility and validity. Mappings and standardised definitions were used to create an electronic directory for diabetes care, providing the foundation for the EUBIROD data analysis repository, also used to implement the diabetes registry and model of care for Cyprus. The development of data dictionaries and standards can be used to improve the quality and comparability of health information. A data dictionary has been developed to be compatible with other existing data sources for diabetes, within and beyond Europe.
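As an illustration of the mappings and unit algorithms such a dictionary must pin down, the sketch below normalizes HbA1c readings recorded in different units. The field names are assumptions, and the NGSP-to-IFCC conversion used (IFCC ≈ 10.93 × NGSP − 23.5) is the commonly cited master equation, not something specified in the abstract:

```python
"""Sketch: normalizing HbA1c readings from registries that record the value
in different units, the kind of algorithm a common data dictionary must
specify. Field names and records are invented placeholders."""

def hba1c_to_ifcc(value: float, unit: str) -> float:
    """Normalize an HbA1c reading to IFCC mmol/mol."""
    if unit == "mmol/mol":
        return value
    if unit == "%":                       # NGSP / DCCT percent
        # Commonly cited NGSP-to-IFCC master equation (assumption, see text).
        return 10.93 * value - 23.5
    raise ValueError(f"unknown HbA1c unit: {unit}")

# Records from two hypothetical partner registries with different conventions.
records = [
    {"site": "A", "hba1c": 7.0, "unit": "%"},
    {"site": "B", "hba1c": 53.0, "unit": "mmol/mol"},
]
for rec in records:
    print(rec["site"], round(hba1c_to_ifcc(rec["hba1c"], rec["unit"]), 1),
          "mmol/mol")
```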
MapEdit: solution to continuous raster map creation
NASA Astrophysics Data System (ADS)
Rančić, Dejan; Djordjevi-Kajan, Slobodanka
2003-03-01
The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as backgrounds in geographical information systems. The process of continuous raster map creation using MapEdit's "mosaicking" function is described, as are the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps of satisfactory quality for many purposes (±1 pixel). Quality assessment of several continuous raster maps at different scales created using our software and methodology has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy, and the US National Map Accuracy Standard. The results obtained during the quality assessment process show that our maps meet all three standards.
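A minimal numeric sketch of the kind of georeferencing step described, with invented control points (MapEdit's own algorithm, which uses two linear transformations per map part, is not reproduced here): a single 2x3 affine transformation is estimated from four control points by least squares, and nearest neighbor resampling corresponds to rounding the inverse-mapped pixel coordinates.

```python
"""Sketch: least-squares affine georeferencing from four control points,
plus nearest-neighbor lookup. All coordinate values are illustrative."""
import numpy as np

# (column, row) pixel coordinates of the four control points...
pixels = np.array([[10, 12], [980, 15], [975, 1190], [14, 1185]], dtype=float)
# ...and their known map coordinates (e.g., metres in the map projection).
world = np.array([[500000, 4650000], [510000, 4650100],
                  [509950, 4638000], [500050, 4637900]], dtype=float)

# Solve world = [col, row, 1] @ A for the 3x2 affine matrix A (least squares).
design = np.hstack([pixels, np.ones((4, 1))])
A, *_ = np.linalg.lstsq(design, world, rcond=None)

def pixel_to_world(col, row):
    return np.array([col, row, 1.0]) @ A

def world_to_pixel_nn(x, y):
    """Inverse mapping followed by nearest-neighbor rounding, matching the
    resampling method described above."""
    col, row = np.linalg.lstsq(A[:2].T, np.array([x, y]) - A[2], rcond=None)[0]
    return int(round(col)), int(round(row))

print(pixel_to_world(500, 600))
print(world_to_pixel_nn(505000, 4644000))
```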
Estimating an exchange rate between the EQ-5D-3L and ASCOT.
Stevens, Katherine; Brazier, John; Rowen, Donna
2018-06-01
The aim was to estimate an exchange rate between the EQ-5D-3L and the Adult Social Care Outcome Tool (ASCOT) using preference-based mapping via common time trade-off (TTO) valuations. EQ-5D and ASCOT are useful for examining cost-effectiveness within the health and social care sectors, respectively, but there is a policy need to understand overall benefits and to compare across sectors to assess relative value for money. Standard statistical mapping is unsuitable since it relies on conceptual overlap of the measures, but EQ-5D and ASCOT have different conceptualisations of quality of life. We use a preference-based mapping approach to estimate the exchange rate using common TTO valuations for both measures. A sample of health states from each measure was valued using TTO by 200 members of the UK adult general population. Regression analyses are used to generate separate equations between EQ-5D-3L and ASCOT values using their original value sets and the TTO values elicited here. These are solved as simultaneous equations to estimate the relationship between EQ-5D-3L and ASCOT. The relationship for moving from ASCOT to EQ-5D-3L is a linear transformation with an intercept of -0.0488 and a gradient of 0.978. This enables QALY gains generated by ASCOT and EQ-5D to be compared across different interventions. This paper estimated an exchange rate between ASCOT and EQ-5D-3L using a preference-based mapping approach that does not compromise the descriptive systems of the two measures. It contributes to the development of preference-based mapping through the use of TTO as the common metric to estimate the exchange rate between measures.
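Since the paper reports the transformation's coefficients, applying the exchange rate is a one-liner. The sketch below uses the published intercept and gradient; the sample ASCOT values are illustrative:

```python
"""Sketch: applying the reported ASCOT-to-EQ-5D-3L exchange rate
(intercept -0.0488, gradient 0.978). Sample inputs are illustrative."""

def ascot_to_eq5d(ascot_value: float) -> float:
    """Linear transformation from ASCOT to EQ-5D-3L utility, per the paper."""
    return -0.0488 + 0.978 * ascot_value

for v in (0.20, 0.55, 0.90):
    print(f"ASCOT {v:.2f} -> EQ-5D-3L {ascot_to_eq5d(v):.3f}")
```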
FGDC Digital Cartographic Standard for Geologic Map Symbolization (PostScript Implementation)
,
2006-01-01
PLEASE NOTE: This now-approved 'FGDC Digital Cartographic Standard for Geologic Map Symbolization (PostScript Implementation)' officially supersedes its earlier (2000) Public Review Draft version. In August 2006, the Digital Cartographic Standard for Geologic Map Symbolization was officially endorsed by the Federal Geographic Data Committee (FGDC) as the national standard for the digital cartographic representation of geologic map features (FGDC Document Number FGDC-STD-013-2006). Presented herein is the PostScript implementation of the standard, which will enable users to directly apply the symbols in the standard to geologic maps and illustrations prepared in desktop illustration and (or) publishing software. The FGDC Digital Cartographic Standard for Geologic Map Symbolization contains descriptions, examples, cartographic specifications, and notes on usage for a wide variety of symbols that may be used on typical, general-purpose geologic maps and related products such as cross sections. The standard can also be used for different kinds of special-purpose or derivative map products and databases that may be focused on a specific geoscience topic (for example, slope stability) or class of features (for example, a fault map). The standard is scale-independent, meaning that the symbols are appropriate for use with geologic mapping compiled or published at any scale. It will be useful to anyone who either produces or uses geologic map information, whether in analog or digital form. Please be aware that this standard is not intended to be used inflexibly or in a manner that will limit one's ability to communicate the observations and interpretations gained from geologic mapping. In certain situations, a symbol or its usage might need to be modified in order to better represent a particular feature on a geologic map or cross section. This standard allows the use of any symbol that does not conflict with others in the standard, provided that it is clearly explained on the map and in the database. In addition, modifying the size, color, and (or) lineweight of an existing symbol to suit the needs of a particular map or output device is also permitted, provided that the modified symbol's appearance is not too similar to another symbol on the map. Be aware, however, that reducing lineweights below 0.125 mm (0.005 inch) may cause symbols to plot incorrectly if output at higher resolutions (1800 dpi or higher). For guidelines on symbol usage, as well as on color design and map labeling, please refer to the standard's introductory text. Also found there are informational sections covering concepts of geologic mapping and some definitions of geologic map features, as well as sections on the newly defined concepts and terminology for the scientific confidence and locational accuracy of geologic map features. More information on both the past development and the future maintenance of the FGDC Digital Cartographic Standard for Geologic Map Symbolization can be found at the FGDC Geologic Data Subcommittee website (http://ngmdb.usgs.gov/fgdc_gds/).
Clinical data integration of distributed data sources using Health Level Seven (HL7) v3-RIM mapping
2011-01-01
Background: Health information exchange and integration has become one of the top priorities for healthcare systems across institutions and hospitals. Most organizations implement health information exchange and integration to support meaningful information retrieval among their disparate healthcare systems. The challenges that prevent efficient health information integration across heterogeneous data sources are the lack of a common standard for mapping across distributed data sources and the numerous and diverse healthcare domains. Health Level Seven (HL7) is a standards development organization whose technical committees develop the Reference Information Model (RIM), a standardized abstract representation of HL7 data across all domains of health care. In this article, we present a design and a prototype implementation of HL7 v3-RIM mapping for information integration of distributed clinical data sources. The implementation enables the user to retrieve and search information that has been integrated using HL7 v3-RIM technology from disparate health care systems. Methods and results: We designed and developed a prototype implementation of an HL7 v3-RIM mapping function to integrate distributed clinical data sources, using R-MIM classes from HL7 v3-RIM as a global view, along with a collaborative, centralized, web-based mapping tool to handle the evolution of both global and local schemas. Our prototype was implemented and integrated with a clinical database management system (CDMS) as a plug-in module. We tested the prototype with use-case scenarios for distributed clinical data sources across several legacy CDMSs. The results were effective in improving information delivery, completing tasks that would otherwise have been difficult to accomplish, and reducing the time required to finish tasks in collaborative information retrieval and sharing with other systems. Conclusions: We created a prototype implementation of HL7 v3-RIM mapping for information integration between distributed clinical data sources to promote collaborative healthcare and translational research. The prototype effectively and efficiently ensured the accuracy of information and knowledge extraction for the systems that were integrated. PMID:22104558
Lin, Yanhua; Staes, Catherine J; Shields, David E; Kandula, Vijay; Welch, Brandon M; Kawamoto, Kensaku
2015-01-01
When coupled with a common information model, a common terminology for clinical decision support (CDS) and electronic clinical quality measurement (eCQM) could greatly facilitate the distributed development and sharing of CDS and eCQM knowledge resources. To enable such scalable knowledge authoring and sharing, we systematically developed an extensible and standards-based terminology for CDS and eCQM in the context of the HL7 Virtual Medical Record (vMR) information model. The development of this terminology entailed three steps: (1) systematic, physician-curated concept identification from sources such as the Health Information Technology Standards Panel (HITSP) and the SNOMED-CT CORE problem list; (2) concept de-duplication leveraging the Unified Medical Language System (UMLS) MetaMap and Metathesaurus; and (3) systematic concept naming using standard terminologies and heuristic algorithms. This process generated 3,046 concepts spanning 68 domains. Evaluation against representative CDS and eCQM resources revealed approximately 50–70% concept coverage, indicating the need for continued expansion of the terminology. PMID:26958220
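The de-duplication step (2) can be illustrated with a toy sketch. This is not the authors' pipeline: the normalization rules and the concept-identifier lookup table below are hypothetical stand-ins for what UMLS MetaMap/Metathesaurus output would provide.

```python
# Toy concept de-duplication: map candidate concept names to UMLS-style
# concept identifiers (CUIs) and collapse names sharing a CUI.
# The lookup table is a hypothetical stand-in for MetaMap/Metathesaurus.
HYPOTHETICAL_CUI_LOOKUP = {
    "myocardial infarction": "C0027051",
    "heart attack": "C0027051",
    "diabetes mellitus": "C0011849",
}

def normalize(name: str) -> str:
    # Lowercase and collapse punctuation/whitespace before lookup.
    return " ".join(name.lower().replace(",", " ").split())

def deduplicate(names):
    by_cui = {}
    for name in names:
        cui = HYPOTHETICAL_CUI_LOOKUP.get(normalize(name))
        key = cui if cui else normalize(name)  # unmapped names kept verbatim
        by_cui.setdefault(key, []).append(name)
    return by_cui

print(deduplicate(["Myocardial infarction", "Heart attack", "Diabetes mellitus"]))
# {'C0027051': ['Myocardial infarction', 'Heart attack'],
#  'C0011849': ['Diabetes mellitus']}
```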
Modern Data Center Services Supporting Science
NASA Astrophysics Data System (ADS)
Varner, J. D.; Cartwright, J.; McLean, S. J.; Boucher, J.; Neufeld, D.; LaRocque, J.; Fischman, D.; McQuinn, E.; Fugett, C.
2011-12-01
The National Oceanic and Atmospheric Administration's National Geophysical Data Center (NGDC) World Data Center for Geophysics and Marine Geology provides scientific stewardship, products and services for geophysical data, including bathymetry, gravity, magnetics, seismic reflection, data derived from sediment and rock samples, as well as historical natural hazards data (tsunamis, earthquakes, and volcanoes). Although NGDC has long made many of its datasets available through map and other web services, it has now developed a second generation of services to improve the discovery and access to data. These new services use off-the-shelf commercial and open source software, and take advantage of modern JavaScript and web application frameworks. Services are accessible using both RESTful and SOAP queries as well as Open Geospatial Consortium (OGC) standard protocols such as WMS, WFS, WCS, and KML. These new map services (implemented using ESRI ArcGIS Server) are finer-grained than their predecessors, feature improved cartography, and offer dramatic speed improvements through the use of map caches. Using standards-based interfaces allows customers to incorporate the services without having to coordinate with the provider. Providing fine-grained services increases flexibility for customers building custom applications. The Integrated Ocean and Coastal Mapping program and Coastal and Marine Spatial Planning program are two examples of national initiatives that require common data inventories from multiple sources and benefit from these modern data services. NGDC is also consuming its own services, providing a set of new browser-based mapping applications which allow the user to quickly visualize and search for data. One example is a new interactive mapping application to search and display information about historical natural hazards. NGDC continues to increase the amount of its data holdings that are accessible and is augmenting the capabilities with modern web application frameworks such as Groovy and Grails. Data discovery is being improved and simplified by leveraging ISO metadata standards along with ESRI Geoportal Server.
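As an illustration of the standards-based access the abstract describes, the snippet below issues an OGC WMS 1.3.0 GetMap request with Python's requests library. The endpoint URL and layer name are hypothetical placeholders, not NGDC's actual service addresses.

```python
import requests

# Hypothetical WMS endpoint and layer; real NGDC service URLs will differ.
WMS_URL = "https://example.gov/arcgis/services/hazards/MapServer/WMSServer"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "historical_tsunami_events",  # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "-60,-180,60,180",  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()
with open("hazards.png", "wb") as f:
    f.write(response.content)  # rendered map image returned by the server
```

Because the interface is a published OGC standard, the same request works against any conforming WMS without coordinating with the provider, which is precisely the benefit the abstract highlights.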
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan; Jaumann, Ralf
2010-05-01
The Helmholtz Alliance and the European Planetary Network are research communities with different primary topics. One research topic shared by both is the geomorphological evolution of planetary surfaces, as well as the geological context of life. This research involves questions such as "Is there volcanic activity on a planet?" or "Where are possible landing sites?". To help answer such questions, analyses of surface features and morphometric measurements must be performed, ultimately leading to thematic maps (e.g. geological and geomorphologic maps) as a basis for further studies. With modern GIS techniques, the comparative work and generalisation carried out during mapping produce new information, and these insights are crucial for subsequent investigations. The aim is therefore to make these results available to the research community as a secondary data basis. To obtain a common and interoperable data collection, the results of different mapping projects have to follow a standardised data infrastructure, metadata definition, and map layout. We are therefore currently focussing on the generation of a database model that arranges all data and processes in a uniform mapping schema. With such a schema, the mapper will be able to use a predefined (but customisable) GIS environment with individual tool items, a standardised symbolisation, and a metadata environment. This environment is based on a data model that is currently at the conceptual level and provides the layout of the data infrastructure, including relations and topologies. One of the first tasks towards this data model is the definition of a consistent set of symbolisation standards for planetary mapping. The mapper/geologist will be able to access the pre-built signatures and use them, scale-dependently, within the mapping project. The symbolisation will be related to the data model in a next step. As a second task, we designed a concept for describing digital mapping results. We are creating a metadata template, based on existing standards, for the individual needs of the planetary sciences. The template is subdivided into (meta)data about the general map content (e.g. the data on which the mapping result is based) and metadata for each individual mapping element/layer, comprising information such as minimum mapping scale and interpretation hints. Such a metadata description, combined with the use of a predefined mapping schema, facilitates the efficient and traceable storage of data information on a network server and enables subsequent representation, e.g. as a mapserver data structure. Acknowledgement: This work is partly supported by DLR and the Helmholtz Alliance "Planetary Evolution and Life".
US EPA Nonattainment Areas and Designations
This web service contains the following state level layers: Ozone 8-hr (1997 standard), Ozone 8-hr (2008 standard), Lead (2008 standard), SO2 1-hr (2010 standard), PM2.5 24-hr (2006 standard), PM2.5 Annual (1997 standard), PM2.5 Annual (2012 standard), and PM10 (1987 standard). Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NonattainmentAreas/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each
A multi-factor designation method for mapping particulate-pollution control zones in China.
Qin, Y; Xie, S D
2011-09-01
A multi-factor designation method for mapping particulate-pollution control zones was developed by jointly considering PM10 pollution status, PM10 anthropogenic emissions, fine-particle pollution, long-range transport, and economic situation. Using this method, China was divided into four particulate-pollution control regions: the PM Suspended Control Region, the PM10 Pollution Control Region, the PM2.5 Pollution Control Region, and the PM10 and PM2.5 Common Control Region, which account for 69.55%, 9.66%, 4.67%, and 16.13% of China's territory, respectively. The PM10 and PM2.5 Common Control Region is mainly distributed in the Bohai Region, the Yangtze River Delta, the Pearl River Delta, the east of Sichuan province, and Chongqing municipality, calling for immediate control of both PM10 and PM2.5. Cost-effective control can be achieved by concentrating efforts on the PM10 and PM2.5 Common Control Region, which accounts for 60.32% of national PM10 anthropogenic emissions. Air quality in districts belonging to the PM2.5 Pollution Control Region suggests that the Chinese national ambient air quality standard for PM10 is not strict enough. The application to China shows that this approach is feasible for mapping pollution control regions for a country with a vast territory, complicated pollution characteristics, and limited available monitoring data.
Yuksel, Mustafa; Dogac, Asuman
2011-07-01
Medical devices are essential to the practice of modern healthcare services. Their benefits will increase if clinical software applications can seamlessly acquire the medical device data. The need to represent medical device observations in a format that is consumable by clinical applications has already been recognized by the industry. Yet, the solutions proposed involve bilateral mappings from the ISO/IEEE 11073 Domain Information Model (DIM) to specific message or document standards. Considering that there are many different types of clinical applications, such as electronic health record and personal health record systems, clinical workflows, and clinical decision support systems, each conforming to different standard interfaces, detailing a mapping mechanism for every one of them introduces significant work and thus limits the potential health benefits of medical devices. In this paper, to facilitate the interoperability of clinical applications and medical device data, we use the ISO/IEEE 11073 DIM to derive an HL7 v3 Refined Message Information Model (RMIM) of the medical device domain from the HL7 v3 Reference Information Model (RIM). This makes it possible to trace the medical device data back to a standard common denominator, that is, the HL7 v3 RIM, from which all other medical domains under HL7 v3 are derived. Hence, once the medical device data are obtained in the RMIM format, they can easily be transformed into HL7-based standard interfaces through XML transformations, because these interfaces all have their building blocks from the same RIM. To demonstrate this, we provide the mappings from the developed RMIM to some of the widely used HL7 v3-based standard interfaces.
A Bayesian approach to the creation of a study-customized neonatal brain atlas
Zhang, Yajing; Chang, Linda; Ceritoglu, Can; Skranes, Jon; Ernst, Thomas; Mori, Susumu; Miller, Michael I.; Oishi, Kenichi
2014-01-01
Atlas-based image analysis (ABA), in which an anatomical “parcellation map” is used for parcel-by-parcel image quantification, is widely used to analyze anatomical and functional changes related to brain development, aging, and various diseases. The parcellation maps are often created based on common MRI templates, which allow users to transform the template to target images, or vice versa, to perform parcel-by-parcel statistics, and report the scientific findings based on common anatomical parcels. The use of a study-specific template, which represents the anatomical features of the study population better than common templates, is preferable for accurate anatomical labeling; however, the creation of a parcellation map for a study-specific template is extremely labor intensive, and the definitions of anatomical boundaries are not necessarily compatible with those of the common template. In this study, we employed a Volume-based Template Estimation (VTE) method to create a neonatal brain template customized to a study population, while keeping the anatomical parcellation identical to that of a common MRI atlas. The VTE was used to morph the standardized parcellation map of the JHU-neonate-SS atlas to capture the anatomical features of a study population. The resultant “study-customized” T1-weighted and diffusion tensor imaging (DTI) template, with three-dimensional anatomical parcellation that defined 122 brain regions, was compared with the JHU-neonate-SS atlas, in terms of the registration accuracy. A pronounced increase in the accuracy of cortical parcellation and superior tensor alignment were observed when the customized template was used. With the customized atlas-based analysis, the fractional anisotropy (FA) detected closely approximated the manual measurements. This tool provides a solution for achieving normalization-based measurements with increased accuracy, while reporting scientific findings in a consistent framework. PMID:25026155
Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS
NASA Technical Reports Server (NTRS)
Long, S. M.; Grosfils, E. B.
2005-01-01
Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first order selfsimilarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
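A minimal sketch of the DN-to-backscatter conversion is given below. It assumes (a) the Muhleman scattering law commonly cited for Venus radar work, and (b) hypothetical placeholder constants for the DN-to-dB scaling and the latitude-to-incidence-angle profile; the actual Magellan calibration tables should be consulted for real analyses.

```python
import numpy as np

def muhleman_sigma0(incidence_deg):
    """Muhleman-law backscatter function used to scale Magellan images."""
    th = np.radians(incidence_deg)
    return 0.0118 * np.cos(th) / (np.sin(th) + 0.111 * np.cos(th)) ** 3

def incidence_from_latitude(lat_deg):
    # Hypothetical stand-in for the Magellan look-angle profile, which in
    # reality varied with latitude over the mapping cycles.
    return 45.0 - 0.25 * abs(lat_deg)

def dn_to_sigma0(dn, lat_deg, scale=0.2, offset=-20.0):
    # Hypothetical linear DN-to-dB scaling (scale/offset are placeholders).
    # DN encodes backscatter relative to the Muhleman law, in decibels.
    db_rel = scale * np.asarray(dn, dtype=float) + offset
    return muhleman_sigma0(incidence_from_latitude(lat_deg)) * 10.0 ** (db_rel / 10.0)

sigma0 = dn_to_sigma0(dn=[90, 120, 150], lat_deg=35.0)
print(sigma0)  # absolute backscatter coefficients, ready for unit statistics
```

Only after such a conversion do pixel statistics from geographically separated map units become directly comparable, which is the self-similarity test the abstract describes.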
A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration
NASA Astrophysics Data System (ADS)
Moosdorf, N.; Richard, S. M.
2012-12-01
A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium, and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers, and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned-map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp). Current categorization of regional units with a single lithology from the CGI SimpleLithology vocabulary (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) fails to capture the lithologic character of such units in a meaningful way. A lithogenetic unit category scheme accessible as a GeoSciML-Portrayal-based OGC Styled Layer Descriptor resource is key to enabling OneGeology (http://oneGeology.org) geologic map services to achieve a high degree of visual harmonization.
Coherent visualization of spatial data adapted to roles, tasks, and hardware
NASA Astrophysics Data System (ADS)
Wagner, Boris; Peinsipp-Byma, Elisabeth
2012-06-01
Modern crisis management requires that users with different roles and computer environments deal with a high volume of varied data from different sources. For this purpose, Fraunhofer IOSB has developed a geographic information system (GIS) which supports the user depending on the available data and the task to be solved. The system provides merging and visualization of spatial data from various civilian and military sources. It supports the most common spatial data standards (OGC, STANAG) as well as some proprietary interfaces, regardless of whether these are file-based or database-based. To set the visualization rules, generic Styled Layer Descriptors (SLDs) are used, an Open Geospatial Consortium (OGC) standard. SLDs specify which data are shown, when, and how. The defined SLDs take into account the users' roles and task requirements. In addition, different displays can be used, and the visualization adapts to the individual resolution of the display, so that excessively high or low information density is avoided. The system also enables users with different roles to work together simultaneously on the same database. Every user is provided with appropriate and coherent spatial data depending on his current task. These refined spatial data are served via the OGC services Web Map Service (WMS: server-side rendered raster maps) and Web Map Tile Service (WMTS: pre-rendered and cached raster maps).
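The role- and scale-dependent styling described above maps naturally onto SLD rules with scale denominators. Below is a minimal, hypothetical SLD fragment assembled in Python; the element names follow the OGC SLD 1.0 schema, but the layer name, property name, and scale threshold are invented for illustration.

```python
# Minimal SLD sketch: show detailed unit labels only at large scales, so the
# information density adapts to zoom level. Layer/property names are invented.
SLD = """<?xml version="1.0" encoding="UTF-8"?>
<StyledLayerDescriptor version="1.0.0"
    xmlns="http://www.opengis.net/sld" xmlns:ogc="http://www.opengis.net/ogc">
  <NamedLayer>
    <Name>units</Name>
    <UserStyle>
      <FeatureTypeStyle>
        <Rule>
          <!-- Only render labels when zoomed in past 1:50,000 -->
          <MaxScaleDenominator>50000</MaxScaleDenominator>
          <TextSymbolizer>
            <Label><ogc:PropertyName>unit_name</ogc:PropertyName></Label>
          </TextSymbolizer>
        </Rule>
      </FeatureTypeStyle>
    </UserStyle>
  </NamedLayer>
</StyledLayerDescriptor>"""

with open("units_style.sld", "w", encoding="utf-8") as f:
    f.write(SLD)
```

Per-role variants of such a document are one plausible way a server could deliver the same data with different information densities to different users.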
NASA Astrophysics Data System (ADS)
Lamarche, Geoffroy; Lurton, Xavier
2018-06-01
Multibeam echosounders are becoming widespread for the purposes of seafloor bathymetry mapping, but the acquisition and the use of seafloor backscatter measurements, acquired simultaneously with the bathymetric data, are still insufficiently understood, controlled and standardized. This presents an obstacle to well-accepted, standardized analysis and application by end users. The Marine Geological and Biological Habitat Mapping group (Geohab.org) has long recognized the need for better coherence and common agreement on acquisition, processing and interpretation of seafloor backscatter data, and established the Backscatter Working Group (BSWG) in May 2013. This paper presents an overview of this initiative, the mandate, structure and program of the working group, and a synopsis of the BSWG Guidelines and Recommendations to date. The paper includes (1) an overview of the current status in sensors and techniques available in seafloor backscatter data from multibeam sonars; (2) the presentation of the BSWG structure and results; (3) recommendations to operators, end-users, sonar manufacturers, and software developers using sonar backscatter for seafloor-mapping applications, for best practice methods and approaches for data acquisition and processing; and (4) a discussion on the development needs for future systems and data processing. We propose for the first time a nomenclature of backscatter processing levels that affords a means to accurately and efficiently describe the data processing status, and to facilitate comparisons of final products from various origins.
NASA Astrophysics Data System (ADS)
Lin, K.; Wald, D. J.
2007-12-01
ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for emergency managers and responders. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, provides overall information regarding the affected areas. When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. To this end, ShakeCast estimates the potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory, to provide a simple, hierarchical list and maps showing the structures or facilities most likely impacted. All ShakeMap and ShakeCast files and products are non-proprietary, to simplify interfacing with existing users' response tools and to encourage user-made enhancements to the software. ShakeCast uses standard RSS and HTTP requests to communicate with the USGS Web servers that host ShakeMaps, which are widely distributed and heavily mirrored. The RSS approach allows ShakeCast users to initiate and receive selected ShakeMap products and information on software updates. To assess facility damage estimates, ShakeCast users can combine measured or estimated ground motion parameters with damage relationships that can be pre-computed, use one of these ground motion parameters as input, and produce a multi-state discrete output of damage likelihood. Presently, three common approaches are used to provide users with an indication of damage: HAZUS-based, intensity-based, and customized damage functions. Intensity-based thresholds are for locations with poorly established damage relationships; custom damage levels are for advanced ShakeCast users such as Caltrans, which produces its own set of damage functions corresponding to the specific details of each California bridge or overpass in its jurisdiction. For users whose portfolio of structures is comprised of common, standard designs, ShakeCast offers a simplified structural damage-state estimation capability adapted from the HAZUS-MH earthquake module (NIBS and FEMA, 2003). Currently, the simplified fragility settings consist of 128 combinations of HAZUS model building types, construction materials, building heights, and building-code eras.
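The intensity-based approach lends itself to a compact sketch. The thresholds below are hypothetical placeholders, not ShakeCast's shipped defaults, and the facility list is invented.

```python
# Toy intensity-based damage screening in the spirit of ShakeCast:
# compare a ShakeMap intensity sampled at each facility against per-facility
# thresholds and produce a hierarchical list, most-impacted first.
FACILITIES = [
    {"name": "Bridge A", "mmi": 7.8, "thresholds": {"red": 7.5, "yellow": 6.0}},
    {"name": "Substation B", "mmi": 5.2, "thresholds": {"red": 8.0, "yellow": 6.5}},
    {"name": "Hospital C", "mmi": 6.9, "thresholds": {"red": 7.0, "yellow": 5.5}},
]

def damage_state(mmi, thresholds):
    if mmi >= thresholds["red"]:
        return "HIGH"
    if mmi >= thresholds["yellow"]:
        return "MODERATE"
    return "LOW"

ranked = sorted(FACILITIES, key=lambda f: f["mmi"], reverse=True)
for fac in ranked:
    state = damage_state(fac["mmi"], fac["thresholds"])
    print(f"{fac['name']:14s} MMI={fac['mmi']:.1f} -> {state}")
```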
GeneSigDB—a curated database of gene expression signatures
Culhane, Aedín C.; Schwarzl, Thomas; Sultana, Razvan; Picard, Kermshlise C.; Picard, Shaita C.; Lu, Tim H.; Franklin, Katherine R.; French, Simon J.; Papenhausen, Gerald; Correll, Mick; Quackenbush, John
2010-01-01
The primary objective of most gene expression studies is the identification of one or more gene signatures: lists of genes whose transcriptional levels are uniquely associated with a specific biological phenotype. Whilst thousands of experimentally derived gene signatures are published, their potential value to the community is limited by their computational inaccessibility. Gene signatures are embedded in published article figures, tables or supplementary materials, and are frequently presented using non-standard gene or probeset nomenclature. We present GeneSigDB (http://compbio.dfci.harvard.edu/genesigdb), a manually curated database of gene expression signatures. GeneSigDB release 1.0 focuses on cancer and stem cell gene signatures and was constructed from more than 850 publications, from which we manually transcribed 575 gene signatures. Most gene signatures (n = 560) were successfully mapped to the genome to extract standardized lists of EnsEMBL gene identifiers. GeneSigDB provides the original gene signature, the standardized gene list, and a fully traceable gene mapping history for each gene, from the original transcribed data table through to the standardized list of genes. The GeneSigDB web portal is easy to search, allows users to compare their own gene list to those in the database, and supports download of gene signatures in the most common gene identifier formats. PMID:19934259
A Case Study in Integrating Multiple E-commerce Standards via Semantic Web Technology
NASA Astrophysics Data System (ADS)
Yu, Yang; Hillman, Donald; Setio, Basuki; Heflin, Jeff
Internet business-to-business transactions present great challenges in merging information from different sources. In this paper we describe a project to integrate four representative commercial classification systems with the Federal Cataloging System (FCS). The FCS is used by the US Defense Logistics Agency to name, describe and classify all items under inventory control by the DoD. Our approach uses the ECCMA Open Technical Dictionary (eOTD) as a common vocabulary to accommodate all different classifications. We create a semantic bridging ontology between each classification and the eOTD to describe their logical relationships in OWL DL. The essential idea is that since each classification has formal definitions in a common vocabulary, we can use subsumption to automatically integrate them, thus mitigating the need for pairwise mappings. Furthermore our system provides an interactive interface to let users choose and browse the results and more importantly it can translate catalogs that commit to these classifications using compiled mapping results.
Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling
NASA Astrophysics Data System (ADS)
Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando
2013-04-01
In this work, a fully continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by the simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model, producing a discharge time series that is given directly as input to a two-dimensional hydraulic model. The main advantage of the proposed approach is that it avoids the design hyetograph and design hydrograph, which constitute the main sources of subjective analysis and uncertainty in standard methods. The proposed procedure is optimized for small, ungauged watersheds where empirical models are commonly applied. Results from a simple real case study confirm that this experimental fully continuous framework may pave the way for a less subjective and potentially automated procedure for flood hazard mapping.
Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley
2017-03-01
Cancer is among the most rapidly spreading diseases in the world, especially in developing countries, including Libya, and represents a significant burden on patients, families, and their societies. The disease can be controlled if detected early, so disease mapping has recently become an important method in public health research and disease epidemiology. The correct choice of statistical model is a crucial step in producing a good disease map. Libya was selected for this work in order to examine its geographical variation in the incidence of lung cancer, and the objective of this paper is to estimate the relative risk of lung cancer. Four statistical models for estimating relative risk, together with population censuses of the study area for the period 2006 to 2011, were used: the Standardized Morbidity Ratio (SMR), the most popular statistic in disease mapping; the Poisson-gamma model, one of the earliest applications of Bayesian methodology; the Besag, York and Mollié (BYM) model; and the Mixture model. As an initial step, this study reviews all the proposed models, which are then applied to lung cancer data in Libya. Maps, tables, graphs, and goodness-of-fit (GOF) measures, which are commonly used in statistical modelling to compare fitted models, were used to compare and present the preliminary results. The main results show that the Poisson-gamma, BYM, and Mixture models can overcome the problem of the first model (SMR) when there are no observed lung cancer cases in certain districts. The Mixture model is the most robust and provides better relative risk estimates across the range of models.
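As a reminder of the baseline statistic, the sketch below computes district-level SMRs by indirect standardization. The districts and counts are invented for illustration, and the example reproduces the SMR's known failure mode (an SMR of zero wherever no cases are observed), which the Bayesian models are designed to overcome.

```python
# Standardized Morbidity Ratio per district: SMR_i = O_i / E_i, with the
# expected count E_i = n_i * (total cases / total population).
# District names and counts are invented for illustration.
observed = {"District A": 30, "District B": 12, "District C": 0}
population = {"District A": 100_000, "District B": 60_000, "District C": 40_000}

overall_rate = sum(observed.values()) / sum(population.values())

for district, obs in observed.items():
    expected = population[district] * overall_rate
    smr = obs / expected
    print(f"{district}: O={obs}, E={expected:.1f}, SMR={smr:.2f}")
# District C gets SMR = 0 despite nonzero underlying risk -- the zero-count
# problem that the Poisson-gamma, BYM, and Mixture models address.
```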
Tong, Yubing; Udupa, Jayaram K.; Torigian, Drew A.
2014-01-01
Purpose: The quantification of body fat plays an important role in the study of numerous diseases. It is common current practice to use the fat area at a single abdominal computed tomography (CT) slice as a marker of the body fat content in studying various disease processes. This paper sets out to answer three questions related to this issue which have not been addressed in the literature. At what single anatomic slice location do the areas of subcutaneous adipose tissue (SAT) and visceral adipose tissue (VAT) estimated from the slice correlate maximally with the corresponding fat volume measures? How does one ensure that the slices used for correlation calculation from different subjects are at the same anatomic location? Are there combinations of multiple slices (not necessarily contiguous) whose area sum correlates better with volume than does single slice area with volume? Methods: The authors propose a novel strategy for mapping slice locations to a standardized anatomic space so that same anatomic slice locations are identified in different subjects. The authors then study the volume-to-area correlations and determine where they become maximal. To address the third issue, the authors carry out similar correlation studies by utilizing two and three slices for calculating area sum. Results: Based on 50 abdominal CT data sets, the proposed mapping achieves significantly improved consistency of anatomic localization compared to current practice. Maximum correlations are achieved at different anatomic locations for SAT and VAT which are both different from the L4-L5 junction commonly utilized currently for single slice area estimation as a marker. Conclusions: The maximum area-to-volume correlation achieved is quite high, suggesting that it may be reasonable to estimate body fat by measuring the area of fat from a single anatomic slice at the site of maximum correlation and use this as a marker. The site of maximum correlation is not at L4-L5 as commonly assumed, but is more superiorly located at T12-L1 for SAT and at L3-L4 for VAT. Furthermore, the optimal anatomic locations for SAT and VAT estimation are not the same, contrary to common assumption. The proposed standardized space mapping achieves high consistency of anatomic localization by accurately managing nonlinearities in the relationships among landmarks. Multiple slices achieve greater improvement in correlation for VAT than for SAT. The optimal locations in the case of multiple slices are not contiguous. PMID:24877839
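The slice-location standardization can be illustrated with a piecewise-linear mapping between shared vertebral landmarks. This is a schematic sketch under assumed landmark positions, not the authors' algorithm; the landmark z-coordinates are invented.

```python
import numpy as np

# Schematic standardized-space mapping: express a slice's z-position as a
# coordinate between shared anatomic landmarks, so that "the T12-L1 slice"
# means the same thing in every subject. Landmark z-values (mm) are invented.
landmarks_subject = {"T12": 310.0, "L1": 285.0, "L3": 230.0, "L5": 175.0}
landmarks_standard = {"T12": 0.0, "L1": 1.0, "L3": 3.0, "L5": 5.0}

names = list(landmarks_subject)
z_subj = np.array([landmarks_subject[n] for n in names])
z_std = np.array([landmarks_standard[n] for n in names])

def to_standard_space(z_mm):
    # np.interp needs ascending x; subject z decreases toward the feet here.
    order = np.argsort(z_subj)
    return np.interp(z_mm, z_subj[order], z_std[order])

print(to_standard_space(297.5))  # halfway between T12 and L1 -> 0.5
```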
Customised City Maps in Mobile Applications for Senior Citizens.
Reins, Frank; Berker, Frank; Heck, Helmut
2017-01-01
Map services should be used in mobile applications for senior citizens. But do the commonly used map services meet the needs of elderly people? As an example, the paper examines the contrast ratios of common maps in comparison with an optimized custom-rendered map.
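Contrast ratio has a standard definition (WCAG 2.x): the ratio of the relative luminances of the lighter and darker colors. The sketch below computes it for two map colors; the sample colors are arbitrary.

```python
def srgb_to_linear(c8):
    # WCAG 2.x: convert an 8-bit sRGB channel to linear light.
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    l1, l2 = sorted((relative_luminance(rgb1), relative_luminance(rgb2)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Arbitrary example: dark street label on a pale map background.
print(round(contrast_ratio((60, 60, 60), (235, 235, 220)), 2))
# WCAG asks for at least 4.5:1 for normal text; higher ratios help older eyes.
```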
Terrestrial Ecosystems-Surficial Lithology of the Conterminous United States
Cress, Jill; Soller, David; Sayre, Roger G.; Comer, Patrick; Warner, Harumi
2010-01-01
As part of an effort to map terrestrial ecosystems, the U.S. Geological Survey (USGS) has generated a new classification of the lithology of surficial materials to be used in creating maps depicting standardized, terrestrial ecosystem models for the conterminous United States. The ecosystems classification used in this effort was developed by NatureServe. A biophysical stratification approach, developed for South America and now being implemented globally, was used to model the ecosystem distributions. This ecosystem mapping methodology is transparent, replicable, and rigorous. Surficial lithology strongly influences the differentiation and distribution of terrestrial ecosystems, and is one of the key input layers in this biophysical stratification. These surficial lithology classes were derived from the USGS map 'Surficial Materials in the Conterminous United States,' which was based on texture, internal structure, thickness, and environment of deposition or formation of materials. This original map was produced from a compilation of regional surficial and bedrock geology source maps using broadly defined common map units for the purpose of providing an overview of the existing data and knowledge. For the terrestrial ecosystem effort, the 28 lithology classes of Soller and Reheis (2004) were generalized and then reclassified into a set of 17 lithologies that typically control or influence the distribution of vegetation types.
Standard map in magnetized relativistic systems: fixed points and regular acceleration.
de Sousa, M C; Steffens, F M; Pakter, R; Rizzato, F B
2010-08-01
We investigate the concept of a standard map for the interaction of relativistic particles and electrostatic waves of arbitrary amplitudes, under the action of external magnetic fields. The map is adequate for physical settings where waves and particles interact impulsively, and allows a series of analytical results to be obtained exactly. Unlike the traditional form of the standard map, the present map is nonlinear in the wave amplitude and displays a series of peculiar properties. Among these properties we discuss the relation between fixed points of the map and regimes of regular acceleration.
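For reference, the traditional (Chirikov) standard map that the abstract contrasts against is the area-preserving map below, which is linear in the kick amplitude $K$:

```latex
\begin{aligned}
p_{n+1} &= p_n + K \sin \theta_n ,\\
\theta_{n+1} &= \theta_n + p_{n+1} \pmod{2\pi} .
\end{aligned}
```

The map studied here replaces the $K \sin \theta_n$ kick with a term nonlinear in the wave amplitude, which is what produces the peculiar fixed-point structure the authors discuss.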
NASA Technical Reports Server (NTRS)
Hollyday, E. F. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Streamflow characteristics in the Delmarva Peninsula derived from the records of daily discharge of 20 gaged basins are representative of the full range in flow conditions and include all of those commonly used for design or planning purposes. They include annual flood peaks with recurrence intervals of 2, 5, 10, 25, and 50 years, mean annual discharge, standard deviation of the mean annual discharge, mean monthly discharges, standard deviation of the mean monthly discharges, low-flow characteristics, flood volume characteristics, and the discharge equalled or exceeded 50 percent of the time. Streamflow and basin characteristics were related by a technique of multiple regression using a digital computer. A control group of equations was computed using basin characteristics derived from maps and climatological records. An experimental group of equations was computed using basin characteristics derived from LANDSAT imagery as well as from maps and climatological records. Based on a reduction in standard error of estimate equal to or greater than 10 percent, the equations for 12 stream flow characteristics were substantially improved by adding to the analyses basin characteristics derived from LANDSAT imagery.
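The regression step pairs each streamflow characteristic with basin characteristics via ordinary least squares, commonly on log-transformed variables. The sketch below shows the pattern with invented data; the basin areas, forest fractions (standing in for an imagery-derived characteristic), and flows are hypothetical.

```python
import numpy as np

# Hypothetical basin characteristics: drainage area (sq mi) from maps and
# forested fraction estimated from imagery; sample values are invented.
area = np.array([12.0, 45.0, 88.0, 150.0, 210.0])
forest = np.array([0.35, 0.50, 0.20, 0.60, 0.40])
q50 = np.array([8.0, 30.0, 70.0, 95.0, 160.0])  # flow exceeded 50% of time, cfs

# Log-linear model: log Q50 = b0 + b1*log(area) + b2*forest
X = np.column_stack([np.ones_like(area), np.log(area), forest])
y = np.log(q50)
coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ coef
se = np.sqrt(np.sum((y - y_hat) ** 2) / (len(y) - X.shape[1]))
print("coefficients:", np.round(coef, 3))
print("standard error of estimate (log units):", round(se, 3))
```

Comparing the standard error of estimate with and without the imagery-derived predictor is exactly the control-versus-experimental comparison the abstract reports.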
Mapping of quantitative trait loci using the skew-normal distribution.
Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos
2007-11-01
In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the same model after data transformation. However, an appropriate transformation may not exist or may be difficult to find, and this approach can raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM; the resulting method is here denoted skew-normal IM. This flexible model, which includes the usual symmetric normal distribution as a special case, allows continuous variation from normality to non-normality. In this paper we briefly introduce the main peculiarities of the skew-normal distribution. The maximum likelihood estimates of the parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of skew-normal IM is assessed via stochastic simulation. The results indicate that skew-normal IM has higher power for QTL detection and better precision of QTL location than standard IM and nonparametric IM.
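For completeness, the skew-normal density referenced here (in Azzalini's form), with location $\xi$, scale $\omega$, and shape $\alpha$, is

```latex
f(x) = \frac{2}{\omega}\,
\phi\!\left(\frac{x-\xi}{\omega}\right)
\Phi\!\left(\alpha\,\frac{x-\xi}{\omega}\right),
```

where $\phi$ and $\Phi$ are the standard normal density and distribution function; $\alpha = 0$ recovers the symmetric normal distribution used in standard IM.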
Toward digital geologic map standards: a progress report
Ulrech, George E.; Reynolds, Mitchell W.; Taylor, Richard B.
1992-01-01
Establishing modern scientific and technical standards for geologic maps and their derivative map products is vital to both producers and users of such maps as we move into an age of digital cartography. Application of earth-science data in complex geographic information systems, acceleration of geologic map production, and reduction of publication costs require that national standards be developed for digital geologic cartography and computer analysis. Since December 1988, under commission of the Chief Geologist of the U.S. Geological Survey and the mandate of the National Geologic Mapping Program (with added representation from the Association of American State Geologists), a committee has been designing a comprehensive set of scientific map standards. Three primary issues were: (1) selecting scientific symbology and its digital representation; (2) creating an appropriate digital coding system that characterizes geologic features with respect to their physical properties, stratigraphic and structural relations, spatial orientation, and interpreted mode of origin; and (3) developing mechanisms for reporting levels of certainty for descriptive as well as measured properties. Approximately 650 symbols for geoscience maps, reflecting present usage of the U.S. Geological Survey, state geological surveys, industry, and academia, have been identified and tentatively adopted. A proposed coding system comprises four-character groupings of major and minor codes that can identify all attributes of a geologic feature. Such a coding system allows unique identification of as many as 10^5 geologic names and values on a given map. The new standard will track closely the latest developments of the Proposed Standard for Digital Cartographic Data soon to be submitted to the National Institute of Standards and Technology by the Federal Interagency Coordinating Committee on Digital Cartography. This standard will adhere generally to the accepted definitions and specifications for spatial data transfer. It will require separate specifications of digital cartographic quality relating to positional accuracy and to ranges of measured and interpreted values such as geologic age and rock composition. Provisional digital geologic map standards will be published for trial implementation. After approximately two years, when comments on the proposed standards have been solicited and modifications made, formal adoption of the standards will be recommended. Widespread acceptance of the new standards will depend on their applicability to the broadest range of earth-science map products and their adaptability to changing cartographic technology.
Astronomical Data Integration Beyond the Virtual Observatory
NASA Astrophysics Data System (ADS)
Lemson, G.; Laurino, O.
2015-09-01
"Data integration" generally refers to the process of combining data from different source data bases into a unified view. Much work has been devoted in this area by the International Virtual Observatory Alliance (IVOA), allowing users to discover and access databases through standard protocols. However, different archives present their data through their own schemas and users must still select, filter, and combine data for each archive individually. An important reason for this is that the creation of common data models that satisfy all sub-disciplines is fraught with difficulties. Furthermore it requires a substantial amount of work for data providers to present their data according to some standard representation. We will argue that existing standards allow us to build a data integration framework that works around these problems. The particular framework requires the implementation of the IVOA Table Access Protocol (TAP) only. It uses the newly developed VO data modelling language (VO-DML) specification, which allows one to define extensible object-oriented data models using a subset of UML concepts through a simple XML serialization language. A rich mapping language allows one to describe how instances of VO-DML data models are represented by the TAP service, bridging the possible mismatch between a local archive's schema and some agreed-upon representation of the astronomical domain. In this so called local-as-view approach to data integration, “mediators" use the mapping prescriptions to translate queries phrased in terms of the common schema to the underlying TAP service. This mapping language has a graphical representation, which we expose through a web based graphical “drag-and-drop-and-connect" interface. This service allows any user to map the holdings of any TAP service to the data model(s) of choice. The mappings are defined and stored outside of the data sources themselves, which allows the interface to be used in a kind of crowd-sourcing effort to annotate any remote database of interest. This reduces the burden of publishing one's data and allows a great flexibility in the definition of the views through which particular communities might wish to access remote archives. At the same time, the framework easies the user's effort to select, filter, and combine data from many different archives, so as to build knowledge bases for their analysis. We will present the framework and demonstrate a prototype implementation. We will discuss ideas for producing the missing elements, in particular the query language and the implementation of mediator tools to translate object queries to ADQL
Generalizing the Arden Syntax to a Common Clinical Application Language.
Kraus, Stefan
2018-01-01
The Arden Syntax for Medical Logic Systems is a standard for encoding and sharing knowledge in the form of Medical Logic Modules (MLMs). Although the Arden Syntax was designed to meet the requirements of data-driven clinical event monitoring, multiple studies suggest that its language constructs may be suitable for use outside the intended application area, and even as a common clinical application language. Such a broader context, however, requires reconsidering some language features. The purpose of this paper is to outline the related modifications on the basis of a generalized Arden Syntax version. The implemented prototype provides multiple adjustments to the standard, such as an option to use programming-language constructs without the frame-like MLM structure, a JSON-compliant data type system, a means to use MLMs as user-defined functions, and native support for RESTful web services with integrated data mapping. This study does not aim to promote a genuinely new language, but rather a more generic version of the proven Arden Syntax standard. Such an easy-to-understand domain-specific language for common clinical applications might cover multiple additional medical subdomains and serve as a lingua franca for arbitrary clinical algorithms, thereby avoiding a patchwork of multiple all-purpose languages between, and even within, institutions.
Elmer, Jonathan; Flickinger, Katharyn L; Anderson, Maighdlin W; Koller, Allison C; Sundermann, Matthew L; Dezfulian, Cameron; Okonkwo, David O; Shutter, Lori A; Salcido, David D; Callaway, Clifton W; Menegazzi, James J
2018-04-18
Brain tissue hypoxia may contribute to preventable secondary brain injury after cardiac arrest. We developed a porcine model of opioid overdose cardiac arrest and post-arrest care including invasive, multimodal neurological monitoring of regional brain physiology. We hypothesized that brain tissue hypoxia is common with usual post-arrest care and can be prevented by modifying mean arterial pressure (MAP) and arterial oxygen tension (PaO2). We induced opioid overdose and cardiac arrest in sixteen swine, attempted resuscitation after 9 min of apnea, and randomized resuscitated animals to three alternating 6-h blocks of standard or titrated care. We invasively monitored physiological parameters including brain tissue oxygen (PbtO2). During standard care blocks, we maintained MAP > 65 mmHg and oxygen saturation 94-98%. During titrated care, we targeted PbtO2 > 20 mmHg. Overall, 10 animals (63%) achieved ROSC after a median of 12.4 min (range 10.8-21.5 min). PbtO2 was higher during titrated care than standard care blocks (unadjusted β = 0.60, 95% confidence interval (CI) 0.42-0.78, P < 0.001). In an adjusted model controlling for MAP, vasopressors, sedation, and block sequence, PbtO2 remained higher during titrated care (adjusted β = 0.75, 95% CI 0.43-1.06, P < 0.001). At three predetermined thresholds, brain tissue hypoxia was significantly less common during titrated care blocks (44 vs 2% of the block duration spent below 20 mmHg, P < 0.001; 21 vs 0% below 15 mmHg, P < 0.001; and 7 vs 0% below 10 mmHg, P = 0.01). In this model of opioid overdose cardiac arrest, brain tissue hypoxia is common and treatable. Further work will elucidate the best strategies and the impact of titrated care on functional outcomes.
Geologic map of the Valjean Hills 7.5' quadrangle, San Bernardino County, California
Calzia, J.P.; Troxel, Bennie W.; digital database by Raumann, Christian G.
2003-01-01
FGDC-compliant metadata is provided for the ARC/INFO coverages. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3 above) or plotting the postscript file (2 above).
A Fast and Scalable Radiation Hybrid Map Construction and Integration Strategy
Agarwala, Richa; Applegate, David L.; Maglott, Donna; Schuler, Gregory D.; Schäffer, Alejandro A.
2000-01-01
This paper describes a fast and scalable strategy for constructing a radiation hybrid (RH) map from data on different RH panels. The maps on each panel are then integrated to produce a single RH map for the genome. Recurring problems in using maps from several sources are that the maps use different markers, the maps do not place the overlapping markers in same order, and the objective functions for map quality are incomparable. We use methods from combinatorial optimization to develop a strategy that addresses these issues. We show that by the standard objective functions of obligate chromosome breaks and maximum likelihood, software for the traveling salesman problem produces RH maps with better quality much more quickly than using software specifically tailored for RH mapping. We use known algorithms for the longest common subsequence problem as part of our map integration strategy. We demonstrate our methods by reconstructing and integrating maps for markers typed on the Genebridge 4 (GB4) and the Stanford G3 panels publicly available from the RH database. We compare map quality of our integrated map with published maps for GB4 panel and G3 panel by considering whether markers occur in the same order on a map and in DNA sequence contigs submitted to GenBank. We find that all of the maps are inconsistent with the sequence data for at least 50% of the contigs, but our integrated maps are more consistent. The map integration strategy not only scales to multiple RH maps but also to any maps that have comparable criteria for measuring map quality. Our software improves on current technology for doing RH mapping in areas of computation time and algorithms for considering a large number of markers for mapping. The essential impediments to producing dense high-quality RH maps are data quality and panel size, not computation. PMID:10720576
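The map-integration step leans on the classic longest-common-subsequence problem. Below is a minimal dynamic-programming sketch applied to two marker orders; the marker names are invented for illustration, not taken from the GB4 or G3 panels.

```python
# Longest common subsequence of two marker orders: the building block the
# integration strategy uses to find a consistent backbone across RH maps.
def lcs(a, b):
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    # Backtrack to recover one maximal common marker order.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Invented marker orders from two hypothetical panels.
panel1 = ["M1", "M2", "M3", "M4", "M5"]
panel2 = ["M2", "M4", "M3", "M5"]
print(lcs(panel1, panel2))  # a maximal marker order common to both maps
```

Markers on the common backbone anchor the integrated map; the remaining markers are then interleaved around it.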
Beyond the double banana: improved recognition of temporal lobe seizures in long-term EEG.
Rosenzweig, Ivana; Fogarasi, András; Johnsen, Birger; Alving, Jørgen; Fabricius, Martin Ejler; Scherg, Michael; Neufeld, Miri Y; Pressler, Ronit; Kjaer, Troels W; van Emde Boas, Walter; Beniczky, Sándor
2014-02-01
To investigate whether extending the 10-20 array with 6 electrodes in the inferior temporal chain and constructing computed montages increases the diagnostic value of ictal EEG activity originating in the temporal lobe. In addition, the accuracy of computer-assisted spectral source analysis was investigated. Forty EEG samples were reviewed by 7 EEG experts in various montages (longitudinal and transversal bipolar, common average, source derivation, source montage, current source density, and reference-free montages) using 2 electrode arrays (10-20 and the extended one). Spectral source analysis used source montage to calculate density spectral array, defining the earliest oscillatory onset. From this, phase maps were calculated for localization. The reference standard was the decision of the multidisciplinary epilepsy surgery team on the seizure onset zone. Clinical performance was compared with the double banana (longitudinal bipolar montage, 10-20 array). Adding the inferior temporal electrode chain, computed montages (reference free, common average, and source derivation), and voltage maps significantly increased the sensitivity. Phase maps had the highest sensitivity and identified ictal activity at earlier time-point than visual inspection. There was no significant difference concerning specificity. The findings advocate for the use of these digital EEG technology-derived analysis methods in clinical practice.
Conversion of KEGG metabolic pathways to SBGN maps including automatic layout
2013-01-01
Background Biologists make frequent use of databases containing large and complex biological networks. One popular database is the Kyoto Encyclopedia of Genes and Genomes (KEGG) which uses its own graphical representation and manual layout for pathways. While some general drawing conventions exist for biological networks, arbitrary graphical representations are very common. Recently, a new standard has been established for displaying biological processes, the Systems Biology Graphical Notation (SBGN), which aims to unify the look of such maps. Ideally, online repositories such as KEGG would automatically provide networks in a variety of notations including SBGN. Unfortunately, this is non‐trivial, since converting between notations may add, remove or otherwise alter map elements so that the existing layout cannot be simply reused. Results Here we describe a methodology for automatic translation of KEGG metabolic pathways into the SBGN format. We infer important properties of the KEGG layout and treat these as layout constraints that are maintained during the conversion to SBGN maps. Conclusions This allows for the drawing and layout conventions of SBGN to be followed while creating maps that are still recognizably the original KEGG pathways. This article details the steps in this process and provides examples of the final result. PMID:23953132
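As a rough illustration of the conversion task (a toy sketch, not the authors' tool), the snippet below reads KGML entries and emits simplified SBGN-ML glyphs, carrying the KEGG x/y coordinates over as layout constraints. The class mapping, default sizes, and the SBGN-ML namespace here are simplified assumptions.

    import xml.etree.ElementTree as ET

    # assumed, heavily simplified KGML-type -> SBGN-class mapping
    KGML_TO_SBGN_CLASS = {"gene": "macromolecule", "compound": "simple chemical",
                          "map": "submap"}

    def kgml_to_sbgn(kgml_path, out_path):
        pathway = ET.parse(kgml_path).getroot()
        sbgn = ET.Element("sbgn", xmlns="http://sbgn.org/libsbgn/0.2")
        smap = ET.SubElement(sbgn, "map", language="process description")
        for entry in pathway.findall("entry"):
            g = entry.find("graphics")
            if g is None or g.get("x") is None:
                continue
            cls = KGML_TO_SBGN_CLASS.get(entry.get("type"), "unspecified entity")
            glyph = ET.SubElement(smap, "glyph",
                                  {"class": cls, "id": "g" + entry.get("id")})
            ET.SubElement(glyph, "label",
                          text=(g.get("name") or "").split(",")[0])
            # reuse the KEGG coordinates as the SBGN layout constraint
            ET.SubElement(glyph, "bbox", x=g.get("x"), y=g.get("y"),
                          w=g.get("width", "46"), h=g.get("height", "17"))
        ET.ElementTree(sbgn).write(out_path)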
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Burk, Thomas E; Lime, Steve
2012-01-01
The components making up an Open Source GIS are explained in this chapter. A map server (Sect. 30.1) can broadly be defined as a software platform for dynamically generating spatially referenced digital map products. The University of Minnesota MapServer (UMN MapServer) is one such system. Its basic features are visualization, overlay, and query. Section 30.2 names and explains many of the geospatial open source libraries, such as GDAL and OGR. The other libraries are FDO, JTS, GEOS, JCS, MetaCRS, and GPSBabel. The application examples include derived GIS software and data format conversions. Quantum GIS, its origin, and its applications are explained in detail in Sect. 30.3. The features include a rich GUI, attribute tables, vector symbols, labeling, editing functions, projections, georeferencing, GPS support, analysis, and Web Map Server functionality. Future developments will address mobile applications, 3-D, and multithreading. The origins of PostgreSQL are outlined and PostGIS is discussed in detail in Sect. 30.4. It extends PostgreSQL by implementing the Simple Feature standard. Section 30.5 details the most important open source licenses, such as the GPL, the LGPL, the MIT License, and the BSD License, as well as the role of the Creative Commons.
Standardized morbidity ratio for leptospirosis mapping in Malaysia
NASA Astrophysics Data System (ADS)
Awang, Aznida Che; Samat, Nor Azah
2017-05-01
Leptospirosis is a worldwide zoonotic disease that affects human health in many parts of the world, including Malaysia. Leptospirosis is caused by infection with the pathogenic genus Leptospira, a member of the Spirochaetes. Leptospirosis can be transmitted directly or indirectly from rats to humans. Human infection is usually caused by contact with the urine or tissues of infected animals. The disease can be spread through mucous membranes such as the mouth, nose and eyes, through ingestion of contaminated food and water, and through exposure of injured skin to contaminated water or soil. There is still no vaccine available for the prevention or treatment of leptospirosis, but the disease can be treated if it is diagnosed early. Therefore, the aim of this study is to estimate the relative risk of leptospirosis based initially on the statistic most commonly used in disease mapping, the Standardized Morbidity Ratio (SMR). We then apply the SMR to leptospirosis data obtained in Malaysia. The results show that the state of Melaka is a very high risk area. The states of Kedah, Terengganu and Kelantan are identified as high risk areas. The states of Perak, Perlis, Sabah and Sarawak show medium risk, followed by low risk in the remaining states, except for Pahang, Johor and Labuan, which are very low risk areas. In conclusion, the SMR method is the best method for mapping leptospirosis because, by referring to the relative risk maps, the states that deserve a closer look and targeted disease prevention can be identified.
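The SMR itself is a one-line computation once observed counts and populations are in hand; the numbers below are invented for illustration, with expected counts derived from the national rate.

    # Sketch of the SMR calculation described above (illustrative numbers,
    # not the study's data); expected counts assume every state shares the
    # national incidence rate.
    cases      = {"Melaka": 120, "Kedah": 95, "Perak": 60}          # observed O_i
    population = {"Melaka": 0.9e6, "Kedah": 2.1e6, "Perak": 2.5e6}  # persons n_i

    national_rate = sum(cases.values()) / sum(population.values())
    for state in cases:
        expected = population[state] * national_rate    # E_i
        smr = cases[state] / expected                   # relative risk estimate
        print(f"{state}: SMR = {smr:.2f}")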
Disease Mapping for Stomach Cancer in Libya Based on Besag–York–Mollié (BYM) Model
Alhdiri, Maryam Ahmed Salem; Samat, Nor Azah; Mohamed, Zulkifley
2017-06-25
Globally, cancer is an ever-increasing health problem and one of the most common causes of death. In Libya, it is an important health concern, especially in the setting of an aging population and limited healthcare facilities. Therefore, the goal of this research is to map the country's cancer incidence rate using the Bayesian method and to identify the high-risk regions (for the first time in a decade). In the field of disease mapping, very little has been done to address the issue of analyzing sparse cancer data in Libya. The Standardized Morbidity Ratio (SMR) is the traditional approach to measuring the relative risk of a disease; it is the ratio of the observed to the expected number of cases in a region, and it has the greatest uncertainty when the disease is rare or the geographical region is small. Therefore, to solve some of the SMR's problems, we used statistical smoothing, i.e. Bayesian models, to estimate the relative risk of stomach cancer incidence in Libya in 2007 based on the BYM model. This research begins with a short presentation of the SMR and of the Bayesian approach with the BYM model, which we applied to stomach cancer incidence in Libya. We compared all of the results using maps and tables. We found that the BYM model is potentially beneficial, because it gives better relative risk estimates than the SMR method. Moreover, it can overcome the classical method's problem when there are no observed stomach cancer cases in a region.
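For reference, the BYM model extends the Poisson model underlying the SMR with spatially structured and unstructured random effects (standard textbook form, not notation taken from the paper):

    O_i \sim \mathrm{Poisson}(E_i\,\theta_i), \qquad
    \log \theta_i = \alpha + u_i + v_i,

    u_i \mid u_{j \neq i} \sim
    \mathcal{N}\!\left(\frac{1}{m_i}\sum_{j \in \partial i} u_j,\;
    \frac{\sigma_u^2}{m_i}\right), \qquad
    v_i \sim \mathcal{N}(0, \sigma_v^2),

where \partial i indexes the regions adjacent to region i and m_i is their number. The structured term u_i borrows strength from neighbouring regions, which is what stabilizes the relative risk estimates exactly where the SMR is unreliable.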
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web, and Web Imagery Enablement. Common Architecture was a cross-cutting theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: broker, requestor, and provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). Among the common services, a clear distinction was drawn between data services (e.g. WMS), application services (e.g. coordinate transformation), and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.
Ferrand, Guillaume; Luong, Michel; Cloos, Martijn A; Amadon, Alexis; Wackernagel, Hans
2014-08-01
Transmit arrays have been developed to mitigate the RF field inhomogeneity commonly observed in high field magnetic resonance imaging (MRI), typically above 3T. To this end, knowledge of the complex-valued B1 transmit sensitivity of each independent radiating element has become essential. This paper details a method to speed up a currently available B1-calibration method. The principle relies on slice undersampling, slice and channel interleaving, and kriging, an interpolation method developed in geostatistics and applicable in many domains. It has been demonstrated that, under certain conditions, kriging gives the best estimator of a field in a region of interest. The resulting accelerated sequence allows mapping a complete set of eight volumetric field maps of the human head in about 1 min. For validation, the accuracy of kriging is first evaluated against a well-known interpolation technique based on the Fourier transform, as well as against a B1-map interpolation method presented in the literature. This analysis is carried out on simulated and decimated experimental B1 maps. Finally, the accelerated sequence is compared to the standard sequence on a phantom and a volunteer. The new sequence provides B1 maps three times faster, with a loss of accuracy potentially limited to about 5%.
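Ordinary kriging of the kind described is available off the shelf in Python; below is a toy 2-D stand-in using the pykrige package (real B1 maps are complex-valued and volumetric, so magnitude and phase, or real and imaginary parts, would be interpolated separately; all data here are synthetic).

    import numpy as np
    from pykrige.ok import OrdinaryKriging   # pip install pykrige

    # scattered |B1| samples (arbitrary units) standing in for undersampled slices
    rng = np.random.default_rng(0)
    x, y = rng.uniform(0, 10, 50), rng.uniform(0, 10, 50)
    b1 = np.sin(x / 3) * np.cos(y / 4) + 0.02 * rng.standard_normal(50)

    ok = OrdinaryKriging(x, y, b1, variogram_model="gaussian")
    gridx = gridy = np.linspace(0, 10, 64)
    b1_grid, variance = ok.execute("grid", gridx, gridy)   # kriged map + uncertainty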
[Standardization of terminology in laboratory medicine I].
Yoon, Soo Young; Yoon, Jong Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Lee, Chang Kyu; Kwon, Jung Ah; Lee, Kap No
2007-04-01
Standardization of medical terminology is essential for data transmission between health-care institutions or clinical laboratories and for maximizing the benefits of information technology. The purpose of our study was to standardize the medical terms used in the clinical laboratory, such as test names, units, and terms used in result descriptions. During the first year of the study, we developed a standard database of concept names for laboratory terms, which covered the terms used in government health care centers, their branch offices, and primary health care units. Laboratory terms were collected from the electronic data interchange (EDI) codes of the National Health Insurance Corporation (NHIC), the Logical Observation Identifier Names and Codes (LOINC) database, community health centers and their branch offices, and the clinical laboratories of representative university medical centers. For standard expression, we referred to the English-Korean/Korean-English medical dictionary of the Korean Medical Association and the rules for foreign language translation. Programs for mapping between the LOINC database and EDI codes and for translating English to Korean were developed. A Korean standard laboratory terminology database containing six axial concept names (component, property, time aspect, system (specimen), scale type, and method type) was established for 7,508 test observations. Short names and a mapping table for EDI codes and the Unified Medical Language System (UMLS) were added. Synonym tables for concept names, words used in the database, and the six axial terms were prepared to make it easier to find the standard terminology from the common terms used in the field of laboratory medicine. Here we report for the first time a Korean standard laboratory terminology database for test names, result description terms, and result units covering most laboratory tests in primary healthcare centers.
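At its simplest, such a code mapping is a join of the two code tables on a shared concept name; the fragment below uses invented EDI codes (the LOINC-style codes are illustrative too) to show the shape of the mapping table.

    import pandas as pd

    # hypothetical fragments of the two code tables, joined on a shared concept
    loinc = pd.DataFrame({"loinc_code": ["2345-7", "718-7"],
                          "concept":    ["Glucose:SerPl", "Hemoglobin:Bld"]})
    edi   = pd.DataFrame({"edi_code":   ["D0010", "D0020"],
                          "concept":    ["Glucose:SerPl", "Hemoglobin:Bld"]})

    mapping = loinc.merge(edi, on="concept", how="outer", indicator=True)
    print(mapping)   # _merge flags concepts missing from either code system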
GridWise Standards Mapping Overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosquet, Mia L.
''GridWise'' is a concept of how advanced communications, information, and controls technology can transform the nation's energy system--across the spectrum from large-scale central generation to common consumer appliances and equipment--into a collaborative network, rich in the exchange of decision-making information and an abundance of market-based opportunities (Widergren and Bosquet 2003), carrying the electric transmission and distribution system fully into the information and telecommunication age. This report summarizes a broad review of standards efforts related to GridWise--those which could ultimately contribute significantly to advancements toward the GridWise vision, and those which represent today's technological basis upon which this vision must build.
Imprints of non-standard dark energy and dark matter models on the 21cm intensity map power spectrum
NASA Astrophysics Data System (ADS)
Carucci, Isabella P.; Corasaniti, Pier-Stefano; Viel, Matteo
2017-12-01
We study the imprint of non-standard dark energy (DE) and dark matter (DM) models on the 21cm intensity map power spectra from high-redshift neutral hydrogen (HI) gas. To this purpose we use halo catalogs from N-body simulations of dynamical DE models and DM scenarios which are as successful as the standard Cold Dark Matter model with Cosmological Constant (ΛCDM) at interpreting available cosmological observations. We limit our analysis to halo catalogs at redshift z=1 and 2.3 which are common to all simulations. For each catalog we model the HI distribution by using a simple prescription to associate the HI gas mass to N-body halos. We find that the DE models leave a distinct signature on the HI spectra across a wide range of scales, which correlates with differences in the halo mass function and the onset of the non-linear regime of clustering. In the case of the non-standard DM model significant differences of the HI spectra with respect to the ΛCDM model only arise from the suppressed abundance of low mass halos. These cosmological model dependent features also appear in the 21cm spectra. In particular, we find that future SKA measurements can distinguish the imprints of DE and DM models at high statistical significance.
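For readers unfamiliar with the summary statistic, here is a minimal numpy sketch (not the paper's pipeline) of a spherically averaged 3-D power spectrum of a simulated brightness box; the box size and the random field are placeholders.

    import numpy as np

    def power_spectrum(field, boxsize, nbins=20):
        n = field.shape[0]
        delta = field / field.mean() - 1.0                    # overdensity contrast
        pk3d = np.abs(np.fft.fftn(delta)) ** 2 * boxsize ** 3 / n ** 6
        freq = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)   # wavenumbers per axis
        kx, ky, kz = np.meshgrid(freq, freq, freq, indexing="ij")
        kmag = np.sqrt(kx ** 2 + ky ** 2 + kz ** 2).ravel()
        bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), nbins + 1)
        sums, _ = np.histogram(kmag, bins, weights=pk3d.ravel())
        counts, _ = np.histogram(kmag, bins)
        return 0.5 * (bins[1:] + bins[:-1]), sums / np.maximum(counts, 1)

    field = np.random.rand(64, 64, 64)       # stand-in for an HI-painted halo box
    k, pk = power_spectrum(field, boxsize=400.0)   # box size in Mpc/h, assumed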
Magnetic Resonance Imaging for Patellofemoral Chondromalacia: Is There a Role for T2 Mapping?
van Eck, Carola F; Kingston, R Scott; Crues, John V; Kharrazi, F Daniel
2017-11-01
Patellofemoral pain is common, and treatment is guided by the presence and grade of chondromalacia. To evaluate and compare the sensitivity and specificity in detecting and grading chondral abnormalities of the patella between proton density fat suppression (PDFS) and T2 mapping magnetic resonance imaging (MRI). Cohort study; Level of evidence, 2. A total of 25 patients who underwent MRI of the knee with both a PDFS sequence and T2 mapping and subsequently underwent arthroscopic knee surgery were included. The cartilage surface of the patella was graded on both MRI sequences by 2 independent, blinded radiologists. Cartilage was then graded during arthroscopic surgery by a sports medicine fellowship-trained orthopaedic surgeon. Reliability, sensitivity, specificity, and accuracy were determined for both MRI methods. The findings during arthroscopic surgery were considered the gold standard. Intraobserver and interobserver agreement for both PDFS (98.5% and 89.4%, respectively) and T2 mapping (99.4% and 91.3%, respectively) MRI were excellent. For T2 mapping, the sensitivity (61%) and specificity (64%) were comparable, whereas for PDFS there was a lower sensitivity (37%) but higher specificity (81%) in identifying cartilage abnormalities. This resulted in a similar accuracy for PDFS (59%) and T2 mapping (62%). Both PDFS and T2 mapping MRI were reliable but only moderately accurate in predicting patellar chondromalacia found during knee arthroscopic surgery.
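The reported statistics follow directly from the 2x2 table against the arthroscopic gold standard; the counts below are hypothetical numbers chosen only to reproduce the T2-mapping figures.

    # hypothetical T2-mapping confusion counts vs. arthroscopy
    tp, fn, tn, fp = 22, 14, 23, 13

    sensitivity = tp / (tp + fn)                    # ~0.61, as reported
    specificity = tn / (tn + fp)                    # ~0.64
    accuracy = (tp + tn) / (tp + fn + tn + fp)      # ~0.62
    print(sensitivity, specificity, accuracy)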
Charon Message-Passing Toolkit for Scientific Computations
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob F.; Yan, Jerry (Technical Monitor)
2000-01-01
Charon is a library, callable from C and Fortran, that aids the conversion of structured-grid legacy codes-such as those used in the numerical computation of fluid flows-into parallel, high- performance codes. Key are functions that define distributed arrays, that map between distributed and non-distributed arrays, and that allow easy specification of common communications on structured grids. The library is based on the widely accepted MPI message passing standard. We present an overview of the functionality of Charon, and some representative results.
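Charon itself is a C/Fortran library; purely as an illustration of the distributed-array idea it encapsulates, here is the classic one-dimensional ghost-cell exchange written with mpi4py (run under mpiexec; names and sizes are arbitrary).

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    local = np.full(10 + 2, float(rank))   # 10 interior cells + 2 ghost cells
    left  = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    # exchange boundary cells with neighbours: the "common communication on
    # structured grids" that such libraries hide behind a single call
    comm.Sendrecv(local[1:2],   dest=left,  recvbuf=local[-1:], source=right)
    comm.Sendrecv(local[-2:-1], dest=right, recvbuf=local[:1],  source=left)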
Common Sense Guide to Mitigating Insider Threats, Fifth Edition
2016-12-01
[Table-of-contents excerpt; the source text is garbled.] Recoverable fragments: a recommendation to run a "background investigation on its employees"; recurring per-practice sections "Case Studies", "Quick Wins and High-Impact Solutions" (for all organizations, with separate items for large organizations), and "Mapping to Standards"; and "Practice 2: Develop a formalized insider threat program".
NASA Technical Reports Server (NTRS)
Shiokari, T.
1975-01-01
The feasibility and cost savings of using flight-proven components in designing spacecraft were investigated. The programs analyzed were (1) the large space telescope, (2) stratospheric aerosol and gas equipment, (3) the mapping mission, (4) the solar maximum mission, and (5) Tiros-N. It is concluded that flight-proven hardware can be used without overly extensive modification, and significant savings can be realized. The cost savings for each program are presented.
Peng, Wenzhu; Xu, Jian; Zhang, Yan; Feng, Jianxin; Dong, Chuanju; Jiang, Likun; Feng, Jingyan; Chen, Baohua; Gong, Yiwen; Chen, Lin; Xu, Peng
2016-01-01
High density genetic linkage maps are essential for QTL fine mapping, comparative genomics and high quality genome sequence assembly. In this study, we constructed a high-density and high-resolution genetic linkage map with 28,194 SNP markers on 14,146 distinct loci for common carp, based on high-throughput genotyping with the carp 250 K single nucleotide polymorphism (SNP) array in a mapping family. The genetic length of the consensus map was 10,595.94 cM, with an average locus interval of 0.75 cM and an average marker interval of 0.38 cM. Comparative genomic analysis revealed a high level of conserved synteny between common carp and the closely related model species zebrafish and medaka. The genome scaffolds were anchored to the high-density linkage map, spanning 1,357 Mb of the common carp reference genome. QTL mapping and association analysis identified 22 QTLs for growth-related traits and 7 QTLs for sex dimorphism. Candidate genes underlying growth-related traits were identified, including important regulators such as KISS2, IGF1, SMTLB, NPFFR1 and CPE. Candidate genes associated with sex dimorphism were also identified, including 3KSR and DMRT2b. The high-density and high-resolution genetic linkage map provides an important tool for QTL fine mapping and positional cloning of economically important traits, and for improving the common carp genome assembly. PMID:27225429
Saturation of an intra-gene pool linkage map: toward unified consensus linkage map in common bean
USDA-ARS?s Scientific Manuscript database
Map-based cloning to find genes of interest and marker assisted selection (MAS) requires good genetic maps with high reproducible markers. In this study, we saturated the linkage map of the intra-gene pool population of common bean DOR364×BAT477 (DB) by evaluating 2,706 molecular markers in includin...
Competency frameworks for advanced practice nursing: a literature review.
Sastre-Fullana, P; De Pedro-Gómez, J E; Bennasar-Veny, M; Serrano-Gallardo, P; Morales-Asencio, J M
2014-12-01
This paper describes a literature review that identified common traits in advanced practice nursing that are specific to competency development worldwide. There is a lack of international agreement on the definition of advanced practice nursing and its core competencies. Despite the lack of consensus, there is an ongoing process worldwide to establish and outline the standards and competencies for advanced practice nursing roles. International agencies, such as the International Council of Nurses, have provided general definitions for advanced practice nursing. Additionally, a set of competency standards for this aim has been developed. A literature review and a directed search of institutional websites were performed to identify specific developments in advanced practice nursing competencies and standards of practice. To determine a competency map specific to international advanced practice nursing, key documents were analysed using a qualitative approach based on content analysis to identify common traits among documents and countries. The review process identified 119 relevant journal articles related to advanced practice nursing competencies. Additionally, 97 documents from grey literature that were related to advanced practice nursing competency mapping were identified. From the text analysis, 17 worldwide transversal competency domains emerged. Despite the variety of patterns in international advanced practice nursing development, essential competency domains can be found in most national frameworks for the role development of international advanced practice nursing. These 17 core competencies can be used to further develop instruments that assess the perceived competency of advanced practice nurses. The results of this review can help policy developers and researchers develop instruments to compare advanced practice nursing services in various contexts and to examine their association with related outcomes.
2013-01-01
Background The harmonization of European health systems brings with it a need for tools to allow the standardized collection of information about medical care. A common coding system and standards for the description of services are needed to allow local data to be incorporated into evidence-informed policy, and to permit equity and mobility to be assessed. The aim of this project has been to design such a classification and a related tool for the coding of services for Long Term Care (DESDE-LTC), based on the European Service Mapping Schedule (ESMS). Methods The development of DESDE-LTC followed an iterative process using nominal groups in 6 European countries. 54 researchers and stakeholders in health and social services contributed to this process. In order to classify services, we use the minimal organization unit or “Basic Stable Input of Care” (BSIC), coded by its principal function or “Main Type of Care” (MTC). The evaluation of the tool included an analysis of feasibility, consistency, ontology, inter-rater reliability, Boolean Factor Analysis, and a preliminary impact analysis (screening, scoping and appraisal). Results DESDE-LTC includes an alpha-numerical coding system, a glossary and an assessment instrument for mapping and counting LTC. It shows high feasibility, consistency, inter-rater reliability and face, content and construct validity. DESDE-LTC is ontologically consistent. It is regarded by experts as useful and relevant for evidence-informed decision making. Conclusion DESDE-LTC contributes to establishing a common terminology, taxonomy and coding of LTC services in a European context, and a standard procedure for data collection and international comparison. PMID:23768163
Sollmann, Nico; Ille, Sebastian; Boeckh-Behrens, Tobias; Ringel, Florian; Meyer, Bernhard; Krieg, Sandro M
2016-07-01
Functional magnetic resonance imaging (fMRI) is considered the standard method for non-invasive language mapping, but repetitive navigated transcranial magnetic stimulation (rTMS) is gaining importance for this purpose. Comparisons between the two methods, however, are sparse. We performed fMRI and rTMS language mapping of the left hemisphere in 40 healthy, right-handed subjects using the tasks most commonly employed in the neurosurgical context (fMRI: word-generation = WGEN task; rTMS: object-naming = ON task). Different rTMS error rate thresholds (ERTs) were calculated, and Cohen's kappa coefficient and the cortical parcellation system (CPS) were used for systematic comparison of the two techniques. Overall, mean kappa coefficients were low, revealing no distinct agreement. We found the highest agreement between the two techniques when using the 2-out-of-3 rule (a CPS region is defined as language-positive in terms of rTMS if at least 2 out of 3 stimulations led to a naming error). However, kappa for this threshold was only 0.24 (kappa of <0, 0.01-0.20, 0.21-0.40, 0.41-0.60, 0.61-0.80 and 0.81-0.99 indicate less than chance, slight, fair, moderate, substantial and almost perfect agreement, respectively). Because of the inherent differences in the underlying physiology of fMRI and rTMS, the different tasks used, and the impossibility of verifying the results via direct cortical stimulation (DCS) in a population of healthy volunteers, one must exercise caution in drawing conclusions about the relative usefulness of each technique for language mapping. Nevertheless, this study yields valuable insights into these two mapping techniques for the most common language tasks currently used in neurosurgical practice.
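The agreement statistic itself is easy to reproduce; with scikit-learn, per-region language-positive labels (toy data below, not the study's) give kappa directly.

    from sklearn.metrics import cohen_kappa_score

    # per-CPS-region labels: 1 = language-positive (invented for illustration)
    fmri = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
    rtms = [1, 1, 0, 0, 0, 1, 1, 0, 0, 0]
    print(cohen_kappa_score(fmri, rtms))   # ~0.17, in the "slight" agreement band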
Valorisation of Como Historical Cadastral Maps Through Modern Web Geoservices
NASA Astrophysics Data System (ADS)
Brovelli, M. A.; Minghini, M.; Zamboni, G.
2012-07-01
Cartographic cultural heritage preserved in archives worldwide is often stored in the original paper version only, restricting both the chances of utilization and the range of possible users. The Web C.A.R.T.E. system addressed this issue for the precious cadastral maps preserved at the State Archive of Como. The aim of the project was to improve the visibility and accessibility of this heritage using the latest free and open source tools for processing, cataloguing and web publishing the maps. The resulting architecture should therefore assist the State Archive of Como in managing its cartographic contents. After a pre-processing phase consisting of digitization and georeferencing steps, the maps were provided with metadata, compiled according to the current Italian standards and managed through an ad hoc version of the GeoNetwork Opensource geocatalog software. A dedicated MapFish-based webGIS client, with a version also optimized for mobile platforms, was built for map publication and 2D navigation. A module for 3D visualization of the cadastral maps was finally developed using the NASA World Wind virtual globe. Thanks to a temporal slidebar, time was also included in the system, producing a 4D graphical user interface. The overall architecture was built entirely with free and open source software and allows a direct and intuitive consultation of the historical maps. Besides the notable advantage of keeping the original paper maps intact, the system greatly simplifies the work of the State Archive of Como's regular users and widens the range of users, thanks to the modernization of the map consultation tools.
DOT National Transportation Integrated Search
1997-07-14
These standards represent a guideline for preparing digital data for inclusion in the National Pipeline Mapping System Repository. The standards were created with input from the pipeline industry and government agencies. They address the submission o...
1982-09-30
[Garbled OCR excerpt.] Recoverable fragments: "... not changed because they are not subject to a careful evaluation. The four job aids contained in this manual provide specific techniques ... lesson plans ... training design, or testing." A notice, repeated in the original, states that the manual was developed using the standards of the Information Mapping writing service, Information Mapping, Inc.
Robust, open-source removal of systematics in Kepler data
NASA Astrophysics Data System (ADS)
Aigrain, S.; Parviainen, H.; Roberts, S.; Reece, S.; Evans, T.
2017-10-01
We present ARC2 (Astrophysically Robust Correction 2), an open-source Python-based systematics-correction pipeline for the Kepler prime mission long-cadence light curves. The ARC2 pipeline identifies and corrects any isolated discontinuities in the light curves and then removes trends common to many light curves. These trends are modelled using the publicly available co-trending basis vectors, within an (approximate) Bayesian framework with 'shrinkage' priors to minimize the risk of overfitting and the injection of any additional noise into the corrected light curves, while keeping any astrophysical signals intact. We show that the ARC2 pipeline's performance matches that of the standard Kepler PDC-MAP data products using standard noise metrics, and demonstrate its ability to preserve astrophysical signals using injection tests with simulated stellar rotation and planetary transit signals. Although it is not identical, the ARC2 pipeline can thus be used as an open-source alternative to PDC-MAP whenever the ability to model the impact of the systematics-removal process on other kinds of signal is important.
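The core of such a correction is a penalized linear fit onto the co-trending basis vectors. Below is a toy version (ordinary ridge regression as a stand-in for the paper's Bayesian shrinkage priors; all data are synthetic, and none of this is the ARC2 code).

    import numpy as np

    def detrend(flux, cbvs, shrinkage=1.0):
        # cbvs: (n_cadences, n_basis_vectors); ridge = MAP fit under a
        # zero-mean Gaussian prior on the basis-vector weights
        A = cbvs.T @ cbvs + shrinkage * np.eye(cbvs.shape[1])
        w = np.linalg.solve(A, cbvs.T @ (flux - flux.mean()))
        return flux - cbvs @ w          # corrected light curve

    n, k = 1000, 4
    cbvs = np.cumsum(np.random.randn(n, k), axis=0) / 30   # smooth fake trends
    signal = 0.001 * np.sin(np.arange(n) / 50)             # "astrophysical" signal
    flux = 1.0 + signal + cbvs @ np.array([0.5, -0.2, 0.1, 0.0])
    clean = detrend(flux, cbvs, shrinkage=10.0)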
Evolution of tsunami warning systems and products.
Bernard, Eddie; Titov, Vasily
2015-10-28
Each year, about 60 000 people and $4 billion (US$) in assets are exposed to the global tsunami hazard. Accurate and reliable tsunami warning systems have been shown to provide a significant defence for this flooding hazard. However, the evolution of warning systems has been influenced by two processes: deadly tsunamis and available technology. In this paper, we explore the evolution of science and technology used in tsunami warning systems, the evolution of their products using warning technologies, and offer suggestions for a new generation of warning products, aimed at the flooding nature of the hazard, to reduce future tsunami impacts on society. We conclude that coastal communities would be well served by receiving three standardized, accurate, real-time tsunami warning products, namely (i) tsunami energy estimate, (ii) flooding maps and (iii) tsunami-induced harbour current maps to minimize the impact of tsunamis. Such information would arm communities with vital flooding guidance for evacuations and port operations. The advantage of global standardized flooding products delivered in a common format is efficiency and accuracy, which leads to effectiveness in promoting tsunami resilience at the community level. PMID:26392620
Caine, Jonathan S.; Manning, Andrew H.; Berger, Byron R.; Kremer, Yannick; Guzman, Mario A.; Eberl, Dennis D.; Schuller, Kathryn
2010-01-01
The Standard Mine Superfund Site is a source of mine drainage and associated heavy metal contamination of surface and groundwaters. The site contains Tertiary polymetallic quartz veins and fault zones that host precious and base metal sulfide mineralization common in Colorado. To assist the U.S. Environmental Protection Agency in its effort to remediate mine-related contamination, we characterized geologic structures, host rocks, and their potential hydraulic properties to better understand the sources of contaminants and the local hydrogeology. Real time kinematic and handheld global positioning systems were used to locate and map precisely the geometry of the surface traces of structures and mine-related features, such as portals. New reconnaissance geologic mapping, field and x-ray diffraction mineralogy, rock sample collection, thin-section analysis, and elemental geochemical analysis were completed to characterize hydrothermal alteration, mineralization, and subsequent leaching of metallic phases. Surface and subsurface observations, fault vein and fracture network characterization, borehole geophysical logging, and mercury injection capillary entry pressure data were used to document potential controls on the hydrologic system.
cudaMap: a GPU accelerated program for gene expression connectivity mapping.
McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong
2013-10-11
Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
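As a rough illustration of why this workload parallelizes so well (a toy score loosely following the sscMap-style definition, not cudaMap's code): each reference profile is reduced to gene ranks, and a signature's connection score is a signed sum over those ranks, computed independently for every profile.

    import numpy as np
    # import cupy as cp   # swapping np for cp would move the arrays to a GPU

    n_genes, n_refs = 20000, 200
    refs = np.random.randn(n_genes, n_refs)              # reference profiles
    ranks = refs.argsort(axis=0).argsort(axis=0) + 1     # per-profile gene ranks
    sig = np.random.choice(n_genes, 50, replace=False)   # signature gene indices
    signs = np.random.choice([-1, 1], 50)                # up-/down-regulated

    scores = signs @ ranks[sig, :]                       # one score per profile
    scores = scores / np.abs(scores).max()               # scale to [-1, 1]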
Construct Maps: A Tool to Organize Validity Evidence
ERIC Educational Resources Information Center
McClarty, Katie Larsen
2013-01-01
The construct map is a promising tool for organizing the data standard-setting panelists interpret. The challenge in applying construct maps to standard-setting procedures will be the judicious selection of data to include within this organizing framework. Therefore, this commentary focuses on decisions about what to include in the construct map.…
Galeano, Carlos H.; Fernandez, Andrea C.; Franco-Herrera, Natalia; Cichy, Karen A.; McClean, Phillip E.; Vanderleyden, Jos; Blair, Matthew W.
2011-01-01
Map-based cloning and fine mapping to find genes of interest and marker assisted selection (MAS) requires good genetic maps with reproducible markers. In this study, we saturated the linkage map of the intra-gene pool population of common bean DOR364×BAT477 (DB) by evaluating 2,706 molecular markers including SSR, SNP, and gene-based markers. On average the polymorphism rate was 7.7% due to the narrow genetic base between the parents. The DB linkage map consisted of 291 markers with a total map length of 1,788 cM. A consensus map was built using the core mapping populations derived from inter-gene pool crosses: DOR364×G19833 (DG) and BAT93×JALO EEP558 (BJ). The consensus map consisted of a total of 1,010 markers mapped, with a total map length of 2,041 cM across 11 linkage groups. On average, each linkage group on the consensus map contained 91 markers of which 83% were single copy markers. Finally, a synteny analysis was carried out using our highly saturated consensus maps compared with the soybean pseudo-chromosome assembly. A total of 772 marker sequences were compared with the soybean genome. A total of 44 syntenic blocks were identified. The linkage group Pv6 presented the most diverse pattern of synteny with seven syntenic blocks, and Pv9 showed the most consistent relations with soybean with just two syntenic blocks. Additionally, a co-linear analysis using common bean transcript map information against soybean coding sequences (CDS) revealed the relationship with 787 soybean genes. The common bean consensus map has allowed us to map a larger number of markers, to obtain a more complete coverage of the common bean genome. Our results, combined with synteny relationships provide tools to increase marker density in selected genomic regions to identify closely linked polymorphic markers for indirect selection, fine mapping or for positional cloning. PMID:22174773
Mean template for tensor-based morphometry using deformation tensors.
Leporé, Natasha; Brun, Caroline; Pennec, Xavier; Chou, Yi-Yu; Lopez, Oscar L; Aizenstein, Howard J; Becker, James T; Toga, Arthur W; Thompson, Paul M
2007-01-01
Tensor-based morphometry (TBM) studies anatomical differences between brain images statistically, to identify regions that differ between groups, over time, or in correlation with cognitive or clinical measures. Using a nonlinear registration algorithm, all images are mapped to a common space, and statistics are most commonly performed on the Jacobian determinant (local expansion factor) of the deformation fields. Previous work showed that the detection sensitivity of the standard TBM approach can be increased by using the full deformation tensors in a multivariate statistical analysis. Here we set out to improve the common space itself, by choosing the shape that minimizes a natural metric on the deformation tensors from that space to the population of control subjects. This method avoids statistical bias and should ease nonlinear registration of new subjects' data to a template that is 'closest' to all subjects' anatomies. As deformation tensors are symmetric positive-definite matrices and do not form a vector space, all computations are performed in the log-Euclidean framework. The control brain B that is already closest to 'average' is found. A gradient descent algorithm is then used to perform the minimization that iteratively deforms this template and obtains the mean shape. We apply our method to map the profile of anatomical differences in a dataset of 26 HIV/AIDS patients and 14 controls, via a log-Euclidean Hotelling's T2 test on the deformation tensors. These results are compared to those found using the 'best' control, B. Statistics on both shapes are evaluated using cumulative distribution functions of the p-values in maps of inter-group differences.
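The log-Euclidean computations reduce to matrix logarithms and exponentials: the mean of symmetric positive-definite tensors is the exponential of the arithmetic mean of their logarithms. A minimal sketch (toy tensors, not the paper's data):

    import numpy as np
    from scipy.linalg import logm, expm

    def log_euclidean_mean(tensors):
        # map SPD matrices into the (vector) space of symmetric matrices,
        # average there, then map back
        logs = [np.real(logm(t)) for t in tensors]
        return expm(np.mean(logs, axis=0))

    # toy 3x3 SPD deformation tensors
    a = np.array([[2.0, 0.3, 0.0], [0.3, 1.0, 0.0], [0.0, 0.0, 1.5]])
    b = np.array([[1.0, 0.0, 0.1], [0.0, 2.0, 0.0], [0.1, 0.0, 1.0]])
    print(log_euclidean_mean([a, b]))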
Multistrategy Self-Organizing Map Learning for Classification Problems
Hasan, S.; Shamsuddin, S. M.
2011-01-01
Multistrategy learning combining the Self-Organizing Map (SOM) and Particle Swarm Optimization (PSO) is commonly implemented in the clustering domain due to its capability to handle complex data characteristics. However, some of these multistrategy learning architectures have weaknesses, such as slow convergence and a tendency to become trapped in local minima. This paper proposes a multistrategy learning method combining an enhanced SOM lattice structure with Particle Swarm Optimisation, called ESOMPSO, for solving various classification problems. The enhancement of the SOM lattice structure is implemented by introducing a new hexagon formulation for better mapping quality in data classification and labeling. The weights of the enhanced SOM are optimised using PSO to obtain better output quality. The proposed method has been tested on various standard datasets, with substantial comparisons against existing SOM networks and various distance measures. The results show that our proposed method yields promising results, with better average accuracy and quantisation errors than the other methods, as well as convincing significance tests. PMID:21876686
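For orientation, here is a bare-bones SOM training step (a generic SOM, not the paper's ESOMPSO variant, whose hexagonal lattice formulation and PSO weight tuning are the contribution): the winner and its lattice neighbours move toward each sample.

    import numpy as np

    def som_step(weights, grid, x, lr=0.1, sigma=1.0):
        # weights: (n_nodes, dim); grid: (n_nodes, 2) lattice coordinates
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
        d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)          # lattice distances
        h = np.exp(-d2 / (2 * sigma ** 2))                  # neighbourhood kernel
        weights += lr * h[:, None] * (x - weights)
        return weights

    grid = np.array([(i, j) for i in range(5) for j in range(5)], float)
    weights = np.random.rand(25, 3)
    for x in np.random.rand(100, 3):
        weights = som_step(weights, grid, x)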
Coming Full Circle in Standard Setting: A Commentary on Wyse
ERIC Educational Resources Information Center
Skaggs, Gary
2013-01-01
The construct map is a particularly good way to approach instrument development, and this author states that he was delighted to read Adam Wyse's thoughts about how to use construct maps for standard setting. For a number of popular standard-setting methods, Wyse shows how typical feedback to panelists fits within a construct map framework.…
An Atlas of ShakeMaps for Landslide and Liquefaction Modeling
NASA Astrophysics Data System (ADS)
Johnson, K. L.; Nowicki, M. A.; Mah, R. T.; Garcia, D.; Harp, E. L.; Godt, J. W.; Lin, K.; Wald, D. J.
2012-12-01
The human consequences of a seismic event are often a result of subsequent hazards induced by the earthquake, such as landslides. While the United States Geological Survey (USGS) ShakeMap and Prompt Assessment of Global Earthquakes for Response (PAGER) systems are, in conjunction, capable of estimating the damage potential of earthquake shaking in near-real time, they do not currently provide estimates for the potential of further damage by secondary processes. We are developing a sound basis for providing estimates of the likelihood and spatial distribution of landslides for any global earthquake under the PAGER system. Here we discuss several important ingredients in this effort. First, we report on the development of a standardized hazard layer from which to calibrate observed landslide distributions; in contrast, prior studies have used a wide variety of means for estimating the hazard input. This layer now takes the form of a ShakeMap, a standardized approach for computing geospatial estimates for a variety of shaking metrics (both peak ground motions and shaking intensity) from any well-recorded earthquake. We have created ShakeMaps for about 20 historical landslide "case history" events, significant in terms of their landslide occurrence, as part of an updated release of the USGS ShakeMap Atlas. We have also collected digitized landslide data from open-source databases for many of the earthquake events of interest. When these are combined with up-to-date topographic and geologic maps, we have the basic ingredients for calibrating landslide probabilities for a significant collection of earthquakes. In terms of modeling, rather than focusing on mechanistic models of landsliding, we adopt a strictly statistical approach to quantify landslide likelihood. We incorporate geology, slope, peak ground acceleration, and landslide data as variables in a logistic regression, selecting the best explanatory variables given the standardized new hazard layers (see Nowicki et al., this meeting, for more detail on the regression). To make the ShakeMap and PAGER systems more comprehensive in terms of secondary losses, we are working to calibrate a similarly constrained regression for liquefaction estimation using a suite of well-studied earthquakes for which detailed, digitized liquefaction datasets are available; here variants of wetness index and soil strength replace geology and slope. We expect that this Atlas of ShakeMaps for landslide and liquefaction case history events, which will soon be publicly available via the internet, will aid in improving the accuracy of loss-modeling systems such as PAGER, as well as allow for a common framework for numerous other mechanistic and empirical studies.
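A schematic of the calibration step described above (random stand-in data, not the Atlas datasets): each mapped cell contributes a binary landslide label and its candidate explanatory variables, and the fitted model returns a per-cell probability.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    n = 5000
    pga   = np.random.gamma(2.0, 0.1, n)        # peak ground acceleration (g)
    slope = np.random.uniform(0.0, 45.0, n)     # slope (degrees)
    geol  = np.random.randint(0, 4, n)          # geology class (one-hot in practice)
    X = np.column_stack([pga, slope, geol])

    p_true = 1 / (1 + np.exp(-(-6 + 8 * pga + 0.08 * slope)))   # synthetic truth
    y = np.random.binomial(1, p_true)           # 1 = landslide mapped in cell

    model = LogisticRegression(max_iter=1000).fit(X, y)
    landslide_prob = model.predict_proba(X)[:, 1]   # probability map values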
Grigoryan, Marine; Shamshurin, Dmitry; Spicer, Victor; Krokhin, Oleg V
2013-11-19
As an initial step in our efforts to unify the expression of peptide retention times in proteomic liquid chromatography-mass spectrometry (LC-MS) experiments, we aligned the chromatographic properties of a number of peptide retention standards against a collection of peptides commonly observed in proteomic experiments. The standard peptide mixtures and tryptic digests of samples of different origins were separated under the identical chromatographic condition most commonly employed in proteomics: 100 Å C18 sorbent with 0.1% formic acid as an ion-pairing modifier. Following our original approach (Krokhin, O. V.; Spicer, V. Anal. Chem. 2009, 81, 9522-9530) the retention characteristics of these standards and collection of tryptic peptides were mapped into hydrophobicity index (HI) or acetonitrile percentage units. This scale allows for direct visualization of the chromatographic outcome of LC-MS acquisitions, monitors the performance of the gradient LC system, and simplifies method development and interlaboratory data alignment. Wide adoption of this approach would significantly aid understanding the basic principles of gradient peptide RP-HPLC and solidify our collective efforts in acquiring confident peptide retention libraries, a key component in the development of targeted proteomic approaches.
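The mapping into HI units amounts to a calibration line through the standards: fit retention time against the standards' known HI values, then convert any peptide's observed retention time. The numbers below are illustrative, not the paper's values.

    import numpy as np

    std_rt = np.array([12.1, 18.4, 25.0, 31.7, 38.2])   # standards, minutes
    std_hi = np.array([10.0, 17.5, 25.0, 32.5, 40.0])   # known HI, %ACN units

    slope, intercept = np.polyfit(std_rt, std_hi, 1)    # linear gradient assumed
    sample_rt = np.array([15.3, 28.9])
    print(slope * sample_rt + intercept)                # peptide HIs in %ACN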
Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning
2007-10-18
Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr. PMID:17945017
NCWin — A Component Object Model (COM) for processing and visualizing NetCDF data
Liu, Jinxun; Chen, J.M.; Price, D.T.; Liu, S.
2005-01-01
NetCDF (Network Common Data Form) is a data sharing protocol and library that is commonly used in large-scale atmospheric and environmental data archiving and modeling. The NetCDF tool described here, named NCWin and coded with Borland C++ Builder, was built as a standard executable as well as a COM (component object model) for the Microsoft Windows environment. COM is a powerful technology that enhances the reuse of applications (as components). Environmental model developers from different modeling environments, such as Python, Java, Visual Fortran, Visual Basic, Visual C++, and Delphi, can reuse NCWin in their models to read, write and visualize NetCDF data. Some Windows applications, such as ArcGIS and Microsoft PowerPoint, can also call NCWin within the application. NCWin has three major components: 1) the data conversion part is designed to convert binary raw data to and from NetCDF data; it can process six data types (unsigned char, signed char, short, int, float, double) and three spatial data formats (BIP, BIL, BSQ); 2) the visualization part is designed for displaying grid map series (playing forward or backward) with a simple map legend, and for displaying temporal trend curves for data on individual map pixels; and 3) the modeling interface is designed for environmental model development, providing a set of integrated NetCDF functions for processing NetCDF data. To demonstrate that NCWin can easily extend the functions of some current GIS software and Office applications, examples of calling NCWin within ArcGIS and MS PowerPoint to show NetCDF map animations are given.
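In Python, the same kind of read/write functionality that NCWin wraps for Windows COM clients is available through the netCDF4 package; a small illustrative round trip (unrelated to NCWin's own code):

    import numpy as np
    from netCDF4 import Dataset   # pip install netCDF4

    # write a small grid-map series to NetCDF ...
    with Dataset("demo.nc", "w") as nc:
        nc.createDimension("time", None)     # unlimited, for the map series
        nc.createDimension("y", 60)
        nc.createDimension("x", 80)
        var = nc.createVariable("ndvi", "f4", ("time", "y", "x"))
        var[0:3] = np.random.rand(3, 60, 80).astype("f4")

    # ... and read one time slice back for display
    with Dataset("demo.nc") as nc:
        first_map = nc.variables["ndvi"][0]   # a 2-D array ready for plotting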
NASA Astrophysics Data System (ADS)
Ebner, M.; Schiegl, M.; Stöckl, W.; Heger, H.
2012-04-01
The Geological Survey of Austria is legally obligated by the INSPIRE directive to provide the data that fall under this directive (geology, mineral resources and natural risk zones) to the European Commission in a semantically harmonized and technically interoperable way. Until recently the focus was entirely on the publication of high-quality printed cartographic products. These have a complex (carto-)graphic data model, which allows several thematic aspects, such as lithology, stratigraphy, tectonics, geologic age, mineral resources, mass movements, geomorphology etc., to be visualized in a single planar map/product. Nonetheless, these graphic data models do not allow individual thematic aspects to be retrieved, since these are coded in a complex portrayal scheme. Automatic information retrieval is thus impossible, and domain knowledge is necessary to interpret these "encrypted" datasets. With INSPIRE becoming effective and a variety of conceptual models (e.g. GeoSciML), built around a semantic framework (i.e. controlled vocabularies), being available, it is necessary to develop a strategy and workflow for the semantic harmonization of such datasets. In this contribution we demonstrate the development of a multistage workflow that will allow us to transform our printed maps into semantically enabled datasets and services, and discuss some prerequisites, foundations and problems. In the first step of our workflow we analyzed our maps and developed controlled vocabularies that describe the thematic content of our data. We then developed a physical data model which we use to attribute our spatial data with thematic information from our controlled vocabularies, forming core thematic datasets. This physical data model is geared towards use at an organizational level but builds upon existing standards (INSPIRE, GeoSciML) to allow transformation to international standards. In a final step we will develop a standardized mapping scheme to publish INSPIRE-conformant services from our core datasets. This two-step transformation is necessary since a direct mapping to international standards is not possible for traditional map-based data. Controlled vocabularies provide the foundation of semantic harmonization. For the encoding of the vocabularies we build upon the W3C standard SKOS (Simple Knowledge Organisation System), a thesaurus specification for the semantic web that is itself based on the Resource Description Framework (RDF) and RDF Schema; we added some Dublin Core and VoID terms for the metadata of our vocabularies and resources. For the development of these thesauri we use the commercial software PoolParty, a tool specially built to develop, manage and publish multilingual thesauri. The corporate thesauri of the Austrian Geological Survey are exposed via a web service that conforms to the linked data principles. This web service gives access to (1) an RDF/HTML representation of the resources via simple, robust and thus persistent HTTP URIs, (2) a download of the complete vocabularies in RDF format, and (3) a full-fledged SPARQL endpoint to query the thesaurus. With the development of physical data models (based on preexisting conceptual models) one must dismiss the classical schemes of map-based portrayal of data. For example, on traditional geological maps a single age range (e.g. the formation age) is usually given for each geological unit, but one might want to attribute several geologic ages (formation age, metamorphic age, cooling ages etc.) to individual units.
Such issues have to be taken into account when developing robust physical data models. Based on our experience we are convinced that individual institutions need to develop their own controlled vocabularies and individual data models that fit their specific needs at an organizational level. If externally developed vocabularies and data models are introduced into established workflows, newly generated and existing data may diverge, and it will be hard to achieve or maintain a common standard. We thus suggest that institutions should keep (or develop) their own organizational standards and vocabularies and map them to generally agreed international standards such as INSPIRE or GeoSciML in the fashion suggested by the linked data principles.
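A SPARQL endpoint of the kind described can be queried in a few lines; the endpoint URL below is a placeholder, not the survey's actual service address.

    from SPARQLWrapper import SPARQLWrapper, JSON   # pip install sparqlwrapper

    sparql = SPARQLWrapper("https://example.org/geology/sparql")  # placeholder
    sparql.setQuery("""
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?label WHERE {
            ?concept a skos:Concept ;
                     skos:prefLabel ?label .
            FILTER (lang(?label) = "en")
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["concept"]["value"], "->", row["label"]["value"])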
Standardization of Terminology in Laboratory Medicine II
Lee, Kap No; Yoon, Jong-Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Jang, Seongsoo; Ki, Chang-Seok; Bae, Sook Young; Kim, Jang Su; Kwon, Jung-Ah; Lee, Chang Kyu
2008-01-01
Standardization of medical terminology is essential in data transmission between health care institutions and in maximizing the benefits of information technology. The purpose of this study was to standardize medical terms for laboratory observations. During the second year of the study, a standard database of concept names for laboratory terms that covered those used in tertiary health care institutes and reference laboratories was developed. The laboratory terms in the Logical Observation Identifier Names and Codes (LOINC) database were adopted and matched with the electronic data interchange (EDI) codes in Korea. A public hearing and a workshop for clinical pathologists were held to collect the opinions of experts. The Korean standard laboratory terminology database, containing the six axial concept names (component, property, time aspect, system/specimen, scale type, and method type), was established for 29,340 test observations. Short names and mapping tables for EDI codes and UMLS were added. Synonym tables were prepared to help match concept names to common terms used in the field. We herein describe the Korean standard laboratory terminology database for test names, result description terms, and result units encompassing most of the laboratory tests in Korea. PMID:18756062
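A minimal sketch of the synonym-table matching described above, with toy entries standing in for the Korean synonym tables; the concept-name strings follow the LOINC naming style, but the mapping shown is illustrative only.

```python
# Toy synonym table: local test names -> standard concept names.
# The actual database maps Korean local terms to LOINC-style names and EDI codes.
from typing import Optional

SYNONYMS = {
    "blood sugar": "Glucose [Mass/volume] in Serum or Plasma",
    "glucose, serum": "Glucose [Mass/volume] in Serum or Plasma",
    "hb": "Hemoglobin [Mass/volume] in Blood",
}

def to_standard(local_term: str) -> Optional[str]:
    """Return the standard concept name for a local test name, if known."""
    return SYNONYMS.get(local_term.strip().lower())

print(to_standard("Blood Sugar"))  # -> the standardized glucose concept name
```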
Park, Sung-Hong; Wang, Danny J J; Duong, Timothy Q
2013-09-01
We implemented pseudo-continuous ASL (pCASL) with 2D and 3D balanced steady state free precession (bSSFP) readout for mapping blood flow in the human brain, retina, and kidney, free of distortion and signal dropout, which are typically observed in the most commonly used echo-planar imaging acquisition. High resolution functional brain imaging in the human visual cortex was feasible with 3D bSSFP pCASL. Blood flow of the human retina could be imaged with pCASL and bSSFP in conjunction with a phase cycling approach to suppress the banding artifacts associated with bSSFP. Furthermore, bSSFP based pCASL enabled us to map renal blood flow within a single breath hold. Control and test-retest experiments suggested that the measured blood flow values in retina and kidney were reliable. Because there is no specific imaging tool for mapping human retina blood flow and the standard contrast agent technique for mapping renal blood flow can cause problems for patients with kidney dysfunction, bSSFP based pCASL may provide a useful tool for the diagnosis of retinal and renal diseases and can complement existing imaging techniques. Copyright © 2013 Elsevier Inc. All rights reserved.
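The phase-cycling step mentioned above can be illustrated with a simple maximum-intensity combination across phase-cycled magnitude images, one common way to suppress bSSFP banding; this is a sketch on synthetic data, not necessarily the paper's exact combination method.

```python
# Sketch: combining phase-cycled bSSFP acquisitions to suppress banding.
import numpy as np

def combine_phase_cycles(images: np.ndarray) -> np.ndarray:
    """images: (n_cycles, ny, nx) magnitude images from different RF phase increments.
    Each pixel keeps the phase cycle in which it lies farthest from a band."""
    return images.max(axis=0)

rng = np.random.default_rng(0)
stack = rng.random((4, 64, 64))        # stand-in for 4 phase-cycled acquisitions
banding_free = combine_phase_cycles(stack)
```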
Endoscopic management of benign biliary strictures.
Rustagi, Tarun; Jamidar, Priya A
2015-01-01
Benign biliary strictures are a common indication for endoscopic retrograde cholangiopancreatography (ERCP). Endoscopic management has evolved over the last two decades into the current standard of care. The most common etiologies of the strictures encountered are postoperative injury and chronic pancreatitis. High-quality cross-sectional imaging provides a road map for endoscopic management. Currently, sequential placement of multiple plastic biliary stents represents the preferred approach. There is an increasing role for the treatment of these strictures using covered metal stents, but owing to conflicting reports of efficacy as well as cost and complications, this approach should only be entertained following careful consideration. Optimal management of strictures is best achieved using a team approach, with the surgeon and interventional radiologist playing important roles.
Lowry, J.; Ramsey, R.D.; Thomas, K.; Schrupp, D.; Sajwaj, T.; Kirby, J.; Waller, E.; Schrader, S.; Falzarano, S.; Langs, L.; Manis, G.; Wallace, C.; Schulz, K.; Comer, P.; Pohs, K.; Rieth, W.; Velasquez, C.; Wolk, B.; Kepner, W.; Boykin, K.; O'Brien, L.; Bradford, D.; Thompson, B.; Prior-Magee, J.
2007-01-01
Land-cover mapping efforts within the USGS Gap Analysis Program have traditionally been state-centered, with each state having the responsibility of implementing a project design for the geographic area within its boundaries. The Southwest Regional Gap Analysis Project (SWReGAP) was the first formal GAP project designed at a regional, multi-state scale. The project area comprises the southwestern states of Arizona, Colorado, Nevada, New Mexico, and Utah. The land-cover map/dataset was generated using regionally consistent geospatial data (Landsat ETM+ imagery (1999-2001) and DEM derivatives), similar field data collection protocols, a standardized land-cover legend, and a common modeling approach (decision tree classifier). Partitioning of mapping responsibilities amongst the five collaborating states was organized around ecoregion-based "mapping zones". Over the course of 2½ field seasons approximately 93,000 reference samples were collected directly, or obtained from other contemporary projects, for the land-cover modeling effort. The final map was made public in 2004 and contains 125 land-cover classes. An internal validation of 85 of the classes, representing 91% of the land area, was performed. Agreement between withheld samples and the validated dataset was 61% (KHAT = .60, n = 17,030). This paper presents an overview of the methodologies used to create the regional land-cover dataset and highlights issues associated with large-area mapping within a coordinated, multi-institutional management framework. © 2006 Elsevier Inc. All rights reserved.
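The common modeling approach named above, a decision tree classifier over per-pixel predictors, can be sketched as follows; the data are synthetic stand-ins for the project's Landsat ETM+ bands, DEM derivatives, and reference samples.

```python
# Sketch: per-pixel land-cover classification with a decision tree (synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
X = rng.random((5000, 9))          # e.g., 6 ETM+ bands + elevation, slope, aspect
y = rng.integers(0, 125, 5000)     # 125 land-cover classes (random stand-ins)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
clf = DecisionTreeClassifier(max_depth=12).fit(X_train, y_train)
print(f"withheld-sample agreement: {clf.score(X_test, y_test):.2f}")
```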
2011-01-01
Background Common carp is one of the most important aquaculture teleost fish in the world. Common carp and other closely related Cyprinidae species provide over 30% of aquaculture production in the world. However, common carp genomic resources are still relatively underdeveloped. BAC end sequences (BES) are important resources for genome research on BAC-anchored genetic marker development, linkage map and physical map integration, and whole genome sequence assembly and scaffolding. Result To develop such valuable resources in common carp (Cyprinus carpio), a total of 40,224 BAC clones were sequenced on both ends, generating 65,720 clean BES with an average read length of 647 bp after sequence processing, representing 42,522,168 bp or 2.5% of the common carp genome. The first survey of the common carp genome was conducted with various bioinformatics tools. The common carp genome contains over 17.3% repetitive elements, with a GC content of 36.8% and 518 transposon ORFs. To identify and develop BAC-anchored microsatellite markers, a total of 13,581 microsatellites were detected from 10,355 BES. The coding regions of 7,127 genes were recognized from 9,443 BES on 7,453 BACs, with 1,990 BACs having genes on both ends. To evaluate the similarity to the genome of the closely related zebrafish, BES of common carp were aligned against the zebrafish genome. A total of 39,335 BES of common carp have conserved homologs on the zebrafish genome, which demonstrates the high similarity between the zebrafish and common carp genomes and indicates the feasibility of comparative mapping between zebrafish and common carp once a physical map of common carp is available. Conclusion BAC end sequences are great resources for the first genome-wide survey of common carp. The repetitive DNA was estimated to be approximately 28% of the common carp genome, indicating the higher complexity of the genome. Comparative analysis mapped around 40,000 BES to the zebrafish genome and established over 3,100 microsyntenies, covering over 50% of the zebrafish genome. BES of common carp are tremendous tools for comparative mapping between the two closely related species, zebrafish and common carp, which should facilitate both structural and functional genome analysis in common carp. PMID:21492448
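A hedged sketch of microsatellite detection in BAC end sequences, using a simple regular expression over 2-6 bp motifs; production pipelines typically apply dedicated tools with stricter length and purity thresholds, so this is illustrative only.

```python
# Sketch: find simple sequence repeats (SSRs) in a BAC end sequence.
import re

def find_ssrs(seq: str, min_unit: int = 2, max_unit: int = 6, min_repeats: int = 5):
    """Yield (start, motif, count) for tandem repeats of 2-6 bp motifs."""
    pattern = re.compile(
        r"(([ACGT]{%d,%d}?)\2{%d,})" % (min_unit, max_unit, min_repeats - 1)
    )
    for m in pattern.finditer(seq.upper()):
        motif = m.group(2)
        yield m.start(), motif, len(m.group(1)) // len(motif)

bes = "TTGACACACACACACACAGGTCAGTAGTAGTAGTAGTAGTAACGT"  # toy sequence
for start, motif, count in find_ssrs(bes):
    print(start, motif, count)   # e.g., an (AC)n and an (AGT)n repeat
```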
A real-time standard parts inspection based on deep learning
NASA Astrophysics Data System (ADS)
Xu, Kuan; Li, XuDong; Jiang, Hongzhi; Zhao, Huijie
2017-10-01
Standard parts are necessary components in mechanical structures such as bogies and connectors, and these structures can shatter or loosen if standard parts are lost, so real-time standard parts inspection systems are essential to guarantee their safety. Researchers favor inspection systems based on deep learning because they work well on images with complex backgrounds, which are common in standard parts inspection. A typical inspection detection system contains two basic components: feature extractors and object classifiers. The Region Proposal Network (RPN) is one of the most essential architectures in most state-of-the-art object detection systems. However, in the basic RPN architecture, the Region of Interest (ROI) proposals have fixed sizes (9 anchors for each pixel); they are effective, but they waste considerable computing resources and time. In standard parts detection situations, standard parts have given sizes, so we can choose anchor sizes based on the ground truths through machine learning. The experiments prove that we can use 2 anchors to achieve almost the same accuracy and recall rate. Our standard parts detection system reaches 15 fps on an NVIDIA GTX 1080 GPU while achieving a detection accuracy of 90.01% mAP.
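One plausible way to derive a small anchor set from ground-truth box sizes, as the paper's 2-anchor RPN suggests, is k-means clustering over box widths and heights (the approach popularized by YOLOv2); the sketch below uses synthetic boxes and is not the authors' exact procedure.

```python
# Sketch: choose 2 anchor sizes by clustering ground-truth box dimensions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# widths/heights (pixels) of labeled standard-part boxes: two synthetic shape families
square_parts = np.column_stack([rng.normal(48, 4, 500), rng.normal(48, 4, 500)])
long_parts = np.column_stack([rng.normal(90, 6, 500), rng.normal(30, 3, 500)])
boxes = np.vstack([square_parts, long_parts])

anchors = KMeans(n_clusters=2, n_init=10).fit(boxes).cluster_centers_
print(np.round(anchors))  # two anchor (width, height) pairs
```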
Statistical density modification using local pattern matching
Terwilliger, Thomas C.
2007-01-23
A computer-implemented method modifies an experimental electron density map. A set of selected known experimental and model electron density maps is provided, and standard templates of electron density are created from the selected maps by clustering and averaging values of electron density in a spherical region about each point in the grid that defines each selected map. Histograms are also created from the selected maps; these relate the value of electron density at the center of each spherical region to a correlation coefficient between the density surrounding the corresponding grid point and each of the standard templates. The standard templates and the histograms are then applied to grid points on the experimental electron density map to form new estimates of electron density at each grid point in the experimental electron density map.
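The core comparison in the method, scoring the density in a spherical region against a standard template with a correlation coefficient, can be sketched as follows on toy arrays.

```python
# Sketch: correlation of a local density region with a standard template.
import numpy as np

def template_correlation(region: np.ndarray, template: np.ndarray) -> float:
    """Pearson correlation between a flattened local density region and a template."""
    r = region.ravel() - region.mean()
    t = template.ravel() - template.mean()
    return float(r @ t / (np.linalg.norm(r) * np.linalg.norm(t)))

rng = np.random.default_rng(3)
template = rng.random((7, 7, 7))                         # toy averaged template
region = template + 0.1 * rng.standard_normal((7, 7, 7)) # noisy local density
print(round(template_correlation(region, template), 3))  # close to 1.0
```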
Adaptation of video game UVW mapping to 3D visualization of gene expression patterns
NASA Astrophysics Data System (ADS)
Vize, Peter D.; Gerth, Victor E.
2007-01-01
Analysis of gene expression patterns within an organism plays a critical role in associating genes with biological processes in both health and disease. During embryonic development, the analysis and comparison of different gene expression patterns allows biologists to identify candidate genes that may regulate the formation of normal tissues and organs and to search for genes associated with congenital diseases. No two individual embryos, or organs, are exactly the same shape or size, so comparing spatial gene expression in one embryo to that in another is difficult. We will present our efforts in comparing gene expression data collected using both volumetric and projection approaches. Volumetric data are highly accurate but difficult to process and compare. Projection methods use UV mapping to align texture maps to standardized spatial frameworks. This approach is less accurate but is very rapid and requires very little processing. We have built a database of over 180 3D models depicting gene expression patterns mapped onto the surfaces of spline-based embryo models. Gene expression data in different models can easily be compared to determine common regions of activity. Visualization software, in both Java and OpenGL, optimized for viewing 3D gene expression data will also be demonstrated.
Interoperability In The New Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.
2015-12-01
As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly used globally. For this purpose, the development of the new Planetary Science Archive (PSA) by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC) is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA includes a GeoServer (an open-source map server), the goal of which is to support use cases such as the distribution of search results and the sharing and processing of data through an OGC Web Feature Service (WFS) and Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats such as Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
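A hedged sketch of how a client might pull features from such an OGC WFS in JSON output format; the endpoint and layer name below are hypothetical.

```python
# Sketch: WFS 2.0 GetFeature request with JSON output (hypothetical endpoint/layer).
import requests

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "psa:hrsc_footprints",    # hypothetical footprint layer
    "outputFormat": "application/json",
    "count": 10,
}
resp = requests.get("https://psa.example.org/geoserver/wfs", params=params)
features = resp.json()["features"]
print(len(features), "footprints returned")
```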
Brain-mapping projects using the common marmoset.
Okano, Hideyuki; Mitra, Partha
2015-04-01
Globally, there is an increasing interest in brain-mapping projects, including the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative in the USA, the Human Brain Project (HBP) in Europe, and the Brain Mapping by Integrated Neurotechnologies for Disease Studies (Brain/MINDS) project in Japan. These projects aim to map the structure and function of neuronal circuits to ultimately understand the vast complexity of the human brain. Brain/MINDS is focused on structural and functional mapping of the common marmoset (Callithrix jacchus) brain. This non-human primate has numerous advantages for brain mapping, including a well-developed frontal cortex and a compact brain size, as well as the availability of transgenic technologies. In the present review article, we discuss strategies for structural and functional mapping of the marmoset brain and the relation of the common marmoset to other animal models. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
2014-08-07
S95-16445 (13-22 July 1995) --- A wide-angle view from the rear shows activity in the new Mission Control Center (MCC), opened for operation and dedicated during the STS-70 mission. The Space Shuttle Discovery was just passing over Florida at the time this photo was taken (note the Mercator map and TV scene on the screens). The new MCC, developed at a cost of about $50 million, replaces the mainframe-based, NASA-unique design of the old Mission Control with a standard workstation-based, local area network system commonly in use today.
NASA Astrophysics Data System (ADS)
Dolezalova, J.; Popelka, S.
2016-06-01
The paper deals with scanpath comparison of eye-tracking data recorded during a case study focused on the evaluation of 2D and 3D city maps. The experiment contained screenshots from three map portals. Two types of maps were used - a standard map and a 3D visualization. The respondents' task was to find a particular point symbol on the map as fast as possible. Scanpath comparison is one group of eye-tracking data analysis methods used for revealing the strategies of respondents. In cartographic studies, the most commonly used application for scanpath comparison is eyePatterns, whose output is hierarchical clustering and a tree graph representing the relationships between analysed sequences. During an analysis of the algorithm generating the tree graph, it was found that the outputs do not correspond to reality. We therefore created a new tool called ScanGraph. This tool uses visualization of cliques in simple graphs and is freely available at www.eyetracking.upol.cz/scangraph. Results of the study proved the functionality of the tool and its suitability for analysing the different strategies of map readers. Based on the results of the tool, similar scanpaths were selected, and groups of respondents with similar strategies were identified. With this knowledge, it is possible to analyse the relationship between belonging to a group with a similar strategy and data gathered from the questionnaire (age, sex, cartographic knowledge, etc.) or the type of stimuli (2D, 3D map).
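The pairwise scanpath similarity underlying tools of this kind is often computed as an edit distance between AOI letter sequences; a minimal sketch (toy sequences, not ScanGraph's exact algorithm) follows.

```python
# Sketch: Levenshtein distance between two AOI visit sequences.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

s1, s2 = "ABBCDA", "ABCCDA"  # two respondents' AOI visit sequences
print(levenshtein(s1, s2))   # 1 -> very similar search strategies
```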
The bedrock electrical conductivity map of the UK
NASA Astrophysics Data System (ADS)
Beamish, David
2013-09-01
Airborne electromagnetic (AEM) surveys, when regionally extensive, may sample a wide range of geological formations. The majority of AEM surveys can provide estimates of apparent (half-space) conductivity, and such derived data provide a mapping capability. Depth discrimination of the geophysical mapping information is controlled by the bandwidth of each particular system. The objective of this study is to assess the geological information contained in accumulated frequency-domain AEM survey data from the UK, where existing geological mapping can be considered well-established. The methodology adopted involves a simple GIS-based spatial join of the AEM and geological databases. A lithology-based classification of bedrock is used to provide an inherent association with the petrophysical rock parameters controlling bulk conductivity. At a scale of 1:625k, the UK digital bedrock geological lexicon comprises just 86 lithological classifications compared with 244 standard lithostratigraphic assignments. The lowest common AEM survey frequency of 3 kHz is found to provide an 87% coverage (by area) of the UK formations. The conductivities of the unsampled classes have been assigned on the basis of inherent lithological associations between formations. The statistical analysis conducted uses over 8 million conductivity estimates and provides a new UK national-scale digital map of near-surface bedrock conductivity. The new baseline map, formed from central moments of the statistical distributions, allows assessments/interpretations of data exhibiting departures from the norm. The digital conductivity map developed here is believed to be the first such UK geophysical map compilation for over 75 years. The methodology described can also be applied to many existing AEM datasets.
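The GIS-based spatial join described above might look like the following geopandas sketch; the file names and attribute columns are assumptions, not the study's actual datasets.

```python
# Sketch: join AEM conductivity points to bedrock polygons, then summarise by class.
import geopandas as gpd

aem = gpd.read_file("aem_conductivity_points.gpkg")   # hypothetical point estimates (mS/m)
geology = gpd.read_file("bedrock_625k.gpkg")          # hypothetical lithology polygons

joined = gpd.sjoin(aem, geology[["LITH_CLASS", "geometry"]], predicate="within")
stats = joined.groupby("LITH_CLASS")["conductivity"].describe()
print(stats[["count", "mean", "50%", "std"]])         # central moments per class
```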
Two-trait-locus linkage analysis: A powerful strategy for mapping complex genetic traits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schork, N.J.; Boehnke, M.; Terwilliger, J.D.
1993-11-01
Nearly all diseases mapped to date follow clear Mendelian, single-locus segregation patterns. In contrast, many common familial diseases such as diabetes, psoriasis, several forms of cancer, and schizophrenia are familial and appear to have a genetic component but do not exhibit simple Mendelian transmission. More complex models are required to explain the genetics of these important diseases. In this paper, the authors explore two-trait-locus, two-marker-locus linkage analysis in which two trait loci are mapped simultaneously to separate genetic markers. The authors compare the utility of this approach to standard one-trait-locus, one-marker-locus linkage analysis with and without allowance for heterogeneity. The authors also compare the utility of the two-trait-locus, two-marker-locus analysis to two-trait-locus, one-marker-locus linkage analysis. For common diseases, pedigrees are often bilineal, with disease genes entering via two or more unrelated pedigree members. Since such pedigrees often are avoided in linkage studies, the authors also investigate the relative information content of unilineal and bilineal pedigrees. For the dominant-or-recessive and threshold models considered, the authors find that two-trait-locus, two-marker-locus linkage analysis can provide substantially more linkage information, as measured by expected maximum lod score, than standard one-trait-locus, one-marker-locus methods, even allowing for heterogeneity, while for a dominant-or-dominant generating model, one-locus models that allow for heterogeneity extract essentially as much information as the two-trait-locus methods. For these three models, the authors also find that bilineal pedigrees provide sufficient linkage information to warrant their inclusion in such studies. The authors discuss strategies for assessing the significance of the two linkages assumed in two-trait-locus, two-marker-locus models. 37 refs., 1 fig., 4 tabs.
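For reference, the expected maximum lod score criterion builds on the standard lod statistic; a schematic form consistent with the comparison above (not the authors' full likelihood machinery) is:

```latex
% One-trait-locus lod score at recombination fraction \theta:
Z(\theta) = \log_{10}\frac{L(\theta)}{L(1/2)}
% Schematic two-trait-locus, two-marker-locus analogue, maximized jointly over
% the two trait-locus-to-marker recombination fractions:
Z(\theta_1,\theta_2) = \log_{10}\frac{L(\theta_1,\theta_2)}{L(1/2,\,1/2)}
```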
Common Coupled Fixed Point Theorems for Two Hybrid Pairs of Mappings under φ-ψ Contraction
Handa, Amrish
2014-01-01
We introduce the concept of (EA) property and occasional w-compatibility for the hybrid pair F : X × X → 2^X and f : X → X. We also introduce the common (EA) property for two hybrid pairs F, G : X × X → 2^X and f, g : X → X. We establish some common coupled fixed point theorems for two hybrid pairs of mappings under φ-ψ contraction on noncomplete metric spaces. An example is also given to validate our results. We improve, extend and generalize several known results. The results of this paper generalize the common fixed point theorems for hybrid pairs of mappings and essentially contain fixed point theorems for hybrid pairs of mappings. PMID:27340688
Wolff, A P; Groen, G J; Crul, B J
2001-01-01
Selective spinal nerve infiltration blocks are used diagnostically in patients with chronic low back pain radiating into the leg. Generally, a segmental nerve block is considered successful if the pain is reduced substantially. Hypesthesia and elicited paresthesias coinciding with the presumed segmental level are used as controls. The interpretation depends on a standard dermatomal map. However, it is not clear whether this interpretation is reliable enough, because standard dermatomal maps do not show the overlap of neighboring dermatomes. The goal of the present study was to establish whether dissimilarities exist between areas of hypesthesia, spontaneous pain reported by the patient, pain reduction by local anesthetics, and paresthesias elicited by sensory electrostimulation. A secondary goal was to determine to what extent the interpretation is improved when the overlaps of neighboring dermatomes are taken into account. Patients suffering from chronic low back pain with pain radiating into the leg underwent lumbosacral segmental nerve root blocks at subsequent levels on separate days. Lidocaine (2%, 0.5 mL) mixed with radiopaque fluid (0.25 mL) was injected after verifying the target location using sensory and motor electrostimulation. Sensory changes (pinprick method), paresthesias (reported by the patient), and pain reduction (Numeric Rating Scale) were recorded. Hypesthesia and paresthesias were registered both in a standard dermatomal map and in an adapted map that included the overlap of neighboring dermatomes. The relationships between spinal level of injection, extent of hypesthesia, location of paresthesias, and corresponding dermatome were assessed quantitatively. Comparison of the results between both dermatomal maps was done by paired t-tests. After inclusion, data were processed for 40 segmental nerve blocks (L2-S1) performed in 29 patients. Pain reduction was achieved in 43%. Hypesthetic areas showed a large variability in size and location, and also in comparison to paresthesias. The mean hypesthetic area covered 2.7 ± 1.4 dermatomes (± SD; range, 0 to 6; standard map) and 3.6 ± 1.8 dermatomes (range, 0 to 6; adapted map; P < .001). Hypesthesia in the corresponding dermatome was found in 80% (standard map) and 88% of the cases (adapted map; not significant). Paresthesias occurring in the corresponding dermatome were found in 80% (standard map) compared with 98% (adapted map; P < .001). In 85% (standard map) and 88% (adapted map), spontaneous pain was present in the dermatome corresponding to the level of local anesthetic injection. In 55% (standard map) versus 75% (adapted map; P < .005), a combination of spontaneous pain, hypesthesia, and paresthesias was found in the corresponding dermatome. Hypesthetic areas determined after lumbosacral segmental nerve blocks show a large variability in size and location compared with elicited paresthesias. Confirmation of an adequately performed segmental nerve block, determined by coexistence of hypesthesia, elicited paresthesias, and pain in the presumed dermatome, is more reliable when the overlap of neighboring dermatomes is taken into account.
NASA Technical Reports Server (NTRS)
Wilson, C.; Dye, R.; Reed, L.
1982-01-01
The errors associated with planimetric mapping of the United States using satellite remote sensing techniques are analyzed. Assumptions concerning the state of the art achievable for satellite mapping systems and platforms in the 1995 time frame are made. An analysis of these performance parameters is made using an interactive cartographic satellite computer model, after first validating the model using LANDSAT 1 through 3 performance parameters. An investigation of current large scale (1:24,000) US National mapping techniques is made. Using the results of this investigation, and current national mapping accuracy standards, the 1995 satellite mapping system is evaluated for its ability to meet US mapping standards for planimetric and topographic mapping at scales of 1:24,000 and smaller.
Brun, Caroline; Leporé, Natasha; Pennec, Xavier; Lee, Agatha D.; Barysheva, Marina; Madsen, Sarah K.; Avedissian, Christina; Chou, Yi-Yu; de Zubicaray, Greig I.; McMahon, Katie; Wright, Margaret; Toga, Arthur W.; Thompson, Paul M.
2010-01-01
Genetic and environmental factors influence brain structure and function profoundly. The search for heritable anatomical features and their influencing genes would be accelerated with detailed 3D maps showing the degree to which brain morphometry is genetically determined. As part of an MRI study that will scan 1150 twins, we applied Tensor-Based Morphometry to compute morphometric differences in 23 pairs of identical twins and 23 pairs of same-sex fraternal twins (mean age: 23.8 ± 1.8 SD years). All 92 twins' 3D brain MRI scans were nonlinearly registered to a common space using a Riemannian fluid-based warping approach to compute volumetric differences across subjects. A multi-template method was used to improve volume quantification. Vector fields driving each subject's anatomy onto the common template were analyzed to create maps of local volumetric excesses and deficits relative to the standard template. Using a new structural equation modeling method, we computed the voxelwise proportion of variance in volumes attributable to additive (A) or dominant (D) genetic factors versus shared environmental (C) or unique environmental (E) factors. The method was also applied to various anatomical regions of interest (ROIs). As hypothesized, the overall volumes of the brain, basal ganglia, thalamus, and each lobe were under strong genetic control; local white matter volumes were mostly controlled by common environment. After adjusting for individual differences in overall brain scale, genetic influences were still relatively high in the corpus callosum and in early-maturing brain regions such as the occipital lobes, while environmental influences were greater in frontal brain regions, which have a more protracted maturational time-course. PMID:19446645
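The structural equation model referenced above decomposes each voxel's variance in the classical twin-design fashion; a schematic summary of the standard identities (not the authors' specific estimator) is:

```latex
% Per-voxel variance decomposition in the classical twin design (A/C/E, or
% A/D/E when dominance replaces shared environment):
\sigma^2_{\mathrm{total}} = \sigma^2_A + \sigma^2_C + \sigma^2_E
% Expected twin covariances that identify the components:
\mathrm{Cov}_{MZ} = \sigma^2_A + \sigma^2_C, \qquad
\mathrm{Cov}_{DZ} = \tfrac{1}{2}\sigma^2_A + \sigma^2_C
% Falconer's rough heritability estimate: h^2 = 2\,(r_{MZ} - r_{DZ})
```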
King, Julie; Thomas, Ann; James, Caron; King, Ian; Armstead, Ian
2013-07-03
Ryegrasses and fescues (genera Lolium and Festuca) are species of forage and turf grasses used widely in agricultural and amenity situations. They are classified within the sub-family Pooideae and so are closely related to Brachypodium distachyon, wheat, barley, rye and oats. Recently, a DArT array has been developed which can be used in generating marker and mapping information for ryegrasses and fescues. This represents a potential common marker set for ryegrass and fescue researchers which can be linked through to comparative genomic information for the grasses. An F2 perennial ryegrass genetic map was developed consisting of 7 linkage groups defined by 1316 markers and giving a total map length of 683 cM. The marker set included 866 DArT and 315 gene sequence-based markers. Comparison with previous DArT mapping studies in perennial and Italian ryegrass (L. multiflorum) identified 87 and 105 DArT markers in common, respectively, of which 94% and 87% mapped to homoeologous linkage groups. A similar comparison with meadow fescue (F. pratensis) identified only 28 DArT markers in common, of which c. 50% mapped to non-homoeologous linkage groups. In L. perenne, the genetic distance spanned by the DArT markers encompassed the majority of the regions that could be described in terms of comparative genomic relationships with rice, Brachypodium distachyon, and Sorghum bicolor. DArT markers are likely to be a useful common marker resource for ryegrasses and fescues, though success in aligning different populations through the mapping of common markers will be influenced by the degree of population interrelatedness. The detailed mapping of DArT and gene-based markers in this study potentially allows comparative relationships to be derived in future mapping populations characterised using solely DArT markers.
Neuhaus, Philipp; Doods, Justin; Dugas, Martin
2015-01-01
Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web-API is feasible. This framework can be easily enhanced due to its modular design.
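A hedged sketch of calling a framework of this kind over its HTTP/JSON interface; the host, path, parameter names, and response fields below are hypothetical.

```python
# Sketch: query a term-mapping web API and print candidate codes (hypothetical API).
import requests

resp = requests.get(
    "http://localhost:8080/mapper/map",   # hypothetical endpoint
    params={"term": "blood pressure", "strategy": "similarity", "target": "UMLS"},
)
for candidate in resp.json().get("matches", []):          # hypothetical JSON shape
    print(candidate.get("cui"), candidate.get("score"), candidate.get("label"))
```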
Mapping Ophthalmic Terms to a Standardized Vocabulary.
ERIC Educational Resources Information Center
Patrick, Timothy B.; Reid, John C.; Sievert, MaryEllen; Popescu, Mihail; Gigantelli, James W.; Shelton, Mark E.; Schiffman, Jade S.
2000-01-01
Describes work by the American Academy of Ophthalmology (AAO) to expand the standardized vocabulary, Systematized Nomenclature of Medicine (SNOMED), to accommodate a definitive ophthalmic standardized vocabulary. Mapped a practice-based clinical ophthalmic vocabulary to SNOMED and other vocabularies in the Metathesaurus of the Unified Medical…
Soller, David R.
1996-01-01
This report summarizes a technical review of USGS Open-File Report 95-525, 'Cartographic and Digital Standard for Geologic Map Information' and OFR 95-526 (diskettes containing digital representations of the standard symbols). If you are considering the purchase or use of those documents, you should read this report first. For some purposes, OFR 95-525 (the printed document) will prove to be an excellent resource. However, technical review identified significant problems with the two documents that will be addressed by various Federal and State committees composed of geologists and cartographers, as noted below. Therefore, the 2-year review period noted in OFR 95-525 is no longer applicable. Until those problems are resolved and formal standards are issued, you may consult the following World-Wide Web (WWW) site which contains information about development of geologic map standards: URL: http://ncgmp.usgs.gov/ngmdbproject/home.html
Yabe, Shiori; Hara, Takashi; Ueno, Mariko; Enoki, Hiroyuki; Kimura, Tatsuro; Nishimura, Satoru; Yasui, Yasuo; Ohsawa, Ryo; Iwata, Hiroyoshi
2014-01-01
For genetic studies and genomics-assisted breeding, particularly of minor crops, a genotyping system that does not require a priori genomic information is preferable. Here, we demonstrated the potential of a novel array-based genotyping system for the rapid construction of high-density linkage map and quantitative trait loci (QTL) mapping. By using the system, we successfully constructed an accurate, high-density linkage map for common buckwheat (Fagopyrum esculentum Moench); the map was composed of 756 loci and included 8,884 markers. The number of linkage groups converged to eight, which is the basic number of chromosomes in common buckwheat. The sizes of the linkage groups of the P1 and P2 maps were 773.8 and 800.4 cM, respectively. The average interval between adjacent loci was 2.13 cM. The linkage map constructed here will be useful for the analysis of other common buckwheat populations. We also performed QTL mapping for main stem length and detected four QTL. It took 37 days to process 178 samples from DNA extraction to genotyping, indicating the system enables genotyping of genome-wide markers for a few hundred buckwheat plants before the plants mature. The novel system will be useful for genomics-assisted breeding in minor crops without a priori genomic information. PMID:25914583
First Prototype of a Web Map Interface for ESA's Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Manaud, N.; Gonzalez, J.
2014-04-01
We present a first prototype of a Web Map Interface that will serve as a proof of concept and design for ESA's future fully web-based Planetary Science Archive (PSA) User Interface. The PSA is ESA's planetary science archiving authority and central repository for all scientific and engineering data returned by ESA's Solar System missions [1]. All data are compliant with NASA's Planetary Data System (PDS) Standards and are accessible through several interfaces [2]: in addition to serving all public data via FTP and the Planetary Data Access Protocol (PDAP), a Java-based User Interface provides advanced search, preview, download, notification and delivery-basket functionality. It allows the user to query and visualise instrument observation footprints using a map-based interface (currently only available for Mars Express HRSC and OMEGA instruments). During the last decade, the planetary mapping science community has increasingly been adopting Geographic Information System (GIS) tools and standards, originally developed for and used in Earth science. There is an ongoing effort to produce and share cartographic products through Open Geospatial Consortium (OGC) Web Services, or as standalone data sets, so that they can be readily used in existing GIS applications [3,4,5]. Previous studies conducted at ESAC [6,7] have helped identify the needs of Planetary GIS users, and define key areas of improvement for the future Web PSA User Interface. Its web map interface will provide access to the full geospatial content of the PSA, including (1) observation geometry footprints of all remote sensing instruments, and (2) all georeferenced cartographic products, such as HRSC map-projected data or OMEGA global maps from Mars Express. It shall aim to provide a rich user experience for search and visualisation of this content using modern and interactive web mapping technology. A comprehensive set of built-in context maps from external sources, such as MOLA topography, TES infrared maps or planetary surface nomenclature, provided in both simple cylindrical and polar stereographic projections, shall enhance this user experience. In addition, users should be able to import and export data in commonly used open GIS formats. It is also intended to serve all PSA geospatial data through OGC-compliant Web Services so that they can be captured, visualised and analysed directly from GIS software, along with data from other sources. The following figure illustrates how the PSA web map interface and services shall fit in a typical Planetary GIS user working environment.
US EPA Nonattainment Areas and Designations-PM10 (1987 NAAQS)
This web service contains the following layer: PM10 Nonattainment Areas (1987 NAAQS). Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA1987PM10/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
US EPA Nonattainment Areas and Designations-Lead (2008 NAAQS)
This web service contains the following layers: Lead NAA 2008 NAAQS and Lead NAA Centroids 2008 NAAQS. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2008Lead/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
US EPA Nonattainment Areas and Designations-8 Hour Ozone (2008 NAAQS)
This web service contains the following layers: Ozone 2008 NAAQS NAA State Level and Ozone 2008 NAAQS NAA National Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2008Ozone8hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
US EPA Nonattainment Areas and Designations-8 Hour Ozone (1997 NAAQS)
This web service contains the following layers: Ozone 1997 NAAQS NAA State Level and Ozone 1997 NAAQS NAA National Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA1997Ozone8hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
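Features from any of the MapServer endpoints listed above can be retrieved with the standard ArcGIS REST query operation; the sketch below assumes layer index 0, which should be checked against the actual service description.

```python
# Sketch: pull nonattainment-area records via the ArcGIS REST query operation.
import requests

base = ("https://gispub.epa.gov/arcgis/rest/services/"
        "OAR_OAQPS/NAA1987PM10/MapServer/0/query")      # layer index 0 assumed
params = {"where": "1=1", "outFields": "*", "returnGeometry": "false", "f": "json"}
areas = requests.get(base, params=params).json().get("features", [])
print(len(areas), "PM10 (1987 NAAQS) nonattainment records")
```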
A comprehensive map of the influenza A virus replication cycle
2013-01-01
Background Influenza is a common infectious disease caused by influenza viruses. Annual epidemics cause severe illnesses, deaths, and economic loss around the world. To better defend against influenza viral infection, it is essential to understand its mechanisms and associated host responses. Many studies have been conducted to elucidate these mechanisms; however, the overall picture remains incompletely understood. A systematic understanding of influenza viral infection in host cells is needed to facilitate the identification of influential host response mechanisms and potential drug targets. Description We constructed a comprehensive map of the influenza A virus ('IAV') life cycle ('FluMap') by undertaking a literature-based, manual curation approach. Based on information obtained from publicly available pathway databases, updated with literature-based information and input from expert virologists and immunologists, FluMap is currently composed of 960 factors (i.e., proteins, mRNAs etc.) and 456 reactions, and is annotated with ~500 papers and curation comments. In addition to detailing the type of molecular interactions, isolate/strain-specific data are also available. The FluMap was built with the pathway editor CellDesigner in standard SBML (Systems Biology Markup Language) format and visualized as an SBGN (Systems Biology Graphical Notation) diagram. It is also available as a web service (online map) based on the iPathways+ system to enable community discussion by influenza researchers. We also demonstrate computational network analyses to identify targets using the FluMap. Conclusion The FluMap is a comprehensive pathway map that can serve as a graphically presented knowledge-base and as a platform to analyze functional interactions between IAV and host factors. Publicly available web tools will allow continuous updating to ensure the most reliable representation of the host-virus interaction network. The FluMap is available at http://www.influenza-x.org/flumap/. PMID:24088197
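Because the FluMap is distributed in standard SBML, it can be loaded with python-libsbml; a minimal sketch (local file name assumed) follows.

```python
# Sketch: load an SBML pathway map such as the FluMap and report its size.
import libsbml

doc = libsbml.readSBML("flumap.xml")   # assumed local copy of the map
if doc.getNumErrors() == 0:
    model = doc.getModel()
    print(model.getNumSpecies(), "species;", model.getNumReactions(), "reactions")
```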
Template‐based field map prediction for rapid whole brain B0 shimming
Shi, Yuhang; Vannesjo, S. Johanna; Miller, Karla L.
2017-01-01
Purpose In typical MRI protocols, time is spent acquiring a field map to calculate the shim settings for best image quality. We propose a fast template-based field map prediction method that yields near-optimal shims without measuring the field. Methods The template-based prediction method uses prior knowledge of the B0 distribution in the human brain, based on a large database of field maps acquired from different subjects, together with subject-specific structural information from a quick localizer scan. The shimming performance of the template-based prediction is evaluated in comparison to a range of potential fast shimming methods. Results Static B0 shimming based on predicted field maps performed almost as well as shimming based on individually measured field maps. In experimental evaluations at 7 T, the proposed approach yielded an average residual field standard deviation in the brain of 59 Hz, compared with 50 Hz using measured field maps and 176 Hz using no subject-specific shim. Conclusions This work demonstrates that shimming based on predicted field maps is feasible. The field map prediction accuracy could potentially be further improved by generating the template from a subset of subjects, based on parameters such as head rotation and body mass index. Magn Reson Med 80:171-180, 2018. © 2017 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited. PMID:29193340
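One simple way to illustrate template-based field prediction is a least-squares combination of database field maps; this synthetic sketch omits the registration and subject-specific structural-information steps the method relies on.

```python
# Sketch: predict a subject's B0 field map from a database of template maps.
import numpy as np

rng = np.random.default_rng(7)
templates = rng.standard_normal((10, 5000))            # 10 database field maps (Hz)
# Unseen subject's field: a mixture of templates plus unexplained variation.
true_field = 0.7 * templates[0] + 0.3 * templates[3] + 5.0 * rng.standard_normal(5000)

weights, *_ = np.linalg.lstsq(templates.T, true_field, rcond=None)
predicted = templates.T @ weights
print(f"residual field SD: {np.std(true_field - predicted):.1f} Hz")
```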
Ogunyemi, Omolola I; Meeker, Daniella; Kim, Hyeon-Eui; Ashish, Naveen; Farzaneh, Seena; Boxwala, Aziz
2013-08-01
The need for a common format for electronic exchange of clinical data prompted federal endorsement of applicable standards. However, despite obvious similarities, a consensus standard has not yet been selected in the comparative effectiveness research (CER) community. Using qualitative metrics for data retrieval and information loss across a variety of CER topic areas, we compare several existing models from a representative sample of organizations associated with clinical research: the Observational Medical Outcomes Partnership (OMOP), Biomedical Research Integrated Domain Group, the Clinical Data Interchange Standards Consortium, and the US Food and Drug Administration. While the models examined captured a majority of the data elements that are useful for CER studies, data elements related to insurance benefit design and plans were most detailed in OMOP's CDM version 4.0. Standardized vocabularies that facilitate semantic interoperability were included in the OMOP and US Food and Drug Administration Mini-Sentinel data models, but are left to the discretion of the end-user in Biomedical Research Integrated Domain Group and Analysis Data Model, limiting reuse opportunities. Among the challenges we encountered was the need to model data specific to a local setting. This was handled by extending the standard data models. We found that the Common Data Model from the OMOP met the broadest complement of CER objectives. Minimal information loss occurred in mapping data from institution-specific data warehouses onto the data models from the standards we assessed. However, to support certain scenarios, we found a need to enhance existing data dictionaries with local, institution-specific information.
Cooley, Michael J.; Davis, Larry R.; Fishburn, Kristin A.; Lestinsky, Helmut; Moore, Laurence R.
2011-01-01
A full-size style sheet template in PDF that defines the placement of map elements, marginalia, and font sizes and styles accompanies this standard. The GeoPDF US Topo maps are fashioned to conform to this style sheet so that a user can print out a map at the 1:24,000-scale using the dimensions of the traditional standard 7.5-minute quadrangle. Symbology and type specifications for feature content are published separately. In addition, the GeoPDF design allows for custom printing, so that a user may zoom in and out, turn layers on and off, and view or print any combination of layers or any map portion at any desired scale.
Bien-Willner, Gabriel A.; López-Terrada, Dolores; Bhattacharjee, Meena B.; Patel, Kayuri U.; Stankiewicz, Paweł; Lupski, James R.; Pfeifer, John D.; Perry, Arie
2012-01-01
Medulloblastoma is diagnosed histologically; treatment depends on staging and age of onset. Whereas clinical factors identify a standard- and a high-risk population, these findings cannot differentiate which standard-risk patients will relapse and die. Outcome is thought to be influenced by tumor subtype and molecular alterations. Poor prognosis has been associated with isochromosome (i)17q in some but not all studies. In most instances, molecular investigations document that i17q is not a true isochromosome but rather an isodicentric chromosome, idic(17)(p11.2), with rearrangement breakpoints mapping within the REPA/REPB region on 17p11.2. This study explores the clinical utility of testing for idic(17)(p11.2) rearrangements using an assay based on fluorescent in situ hybridization (FISH). This test was applied to 58 consecutive standard- and high-risk medulloblastomas with a 5-year minimum of clinical follow-up. The presence of i17q (ie, including cases not involving the common breakpoint), idic(17)(p11.2), and histologic subtype was correlated with clinical outcome. Overall survival (OS) and disease-free survival (DFS) were consistent with literature reports. Fourteen patients (25%) had i17q, with 10 (18%) involving the common isodicentric rearrangement. The presence of i17q was associated with a poor prognosis. OS and DFS were poor in all cases with anaplasia (4), unresectable disease (7), and metastases at presentation (10); however, patients with standard-risk tumors fared better. Of these 44 cases, tumors with idic(17)(p11.2) were associated with significantly worse patient outcomes and shorter mean DFS. FISH detection of idic(17)(p11.2) may be useful for risk stratification in standard-risk patients. The presence of this abnormal chromosome is associated with early recurrence of medulloblastoma. PMID:22573308
Description of the U.S. Geological Survey Geo Data Portal data integration framework
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Lucido, Jessica M.
2012-01-01
The U.S. Geological Survey has developed an open-standard data integration framework for working efficiently and effectively with large collections of climate and other geoscience data. A web interface accesses catalog datasets to find data services. Data resources can then be rendered for mapping and dataset metadata are derived directly from these web services. Algorithm configuration and information needed to retrieve data for processing are passed to a server where all large-volume data access and manipulation takes place. The data integration strategy described here was implemented by leveraging existing free and open source software. Details of the software used are omitted; rather, emphasis is placed on how open-standard web services and data encodings can be used in an architecture that integrates common geographic and atmospheric data.
Digital Mapping of Soil Organic Carbon Contents and Stocks in Denmark
Adhikari, Kabindra; Hartemink, Alfred E.; Minasny, Budiman; Bou Kheir, Rania; Greve, Mette B.; Greve, Mogens H.
2014-01-01
Estimation of carbon contents and stocks are important for carbon sequestration, greenhouse gas emissions and national carbon balance inventories. For Denmark, we modeled the vertical distribution of soil organic carbon (SOC) and bulk density, and mapped its spatial distribution at five standard soil depth intervals (0-5, 5-15, 15-30, 30-60 and 60-100 cm) using 18 environmental variables as predictors. SOC distribution was influenced by precipitation, land use, soil type, wetland, elevation, wetness index, and multi-resolution index of valley bottom flatness. The highest average SOC content of 20 g kg⁻¹ was reported for 0-5 cm soil, whereas there was on average 2.2 g SOC kg⁻¹ at 60-100 cm depth. For SOC and bulk density, prediction precision decreased with soil depth, and a standard error of 2.8 g kg⁻¹ was found at 60-100 cm soil depth. The average SOC stock for 0-30 cm was 72 t ha⁻¹ and in the top 1 m there was 120 t SOC ha⁻¹. In total, the soils stored approximately 570 Tg C within the top 1 m. The soils under agriculture had the highest amount of carbon (444 Tg), followed by forest and semi-natural vegetation that contributed 11% of the total SOC stock. More than 60% of the total SOC stock was present in Podzols and Luvisols. Compared to previous estimates, our approach is more reliable as we adopted a robust quantification technique and mapped the spatial distribution of SOC stock and prediction uncertainty. The estimation was validated using common statistical indices and the data and high-resolution maps could be used for future soil carbon assessment and inventories. PMID:25137066
Standardized anatomic space for abdominal fat quantification
NASA Astrophysics Data System (ADS)
Tong, Yubing; Udupa, Jayaram K.; Torigian, Drew A.
2014-03-01
The ability to accurately measure subcutaneous adipose tissue (SAT) and visceral adipose tissue (VAT) from images is important for improved assessment and management of patients with various conditions such as obesity, diabetes mellitus, obstructive sleep apnea, cardiovascular disease, kidney disease, and degenerative disease. Although imaging and analysis methods to measure the volume of these tissue components have been developed [1, 2], in clinical practice, an estimate of the amount of fat is obtained from just one transverse abdominal CT slice typically acquired at the level of the L4-L5 vertebrae, for various reasons including decreased radiation exposure and cost [3-5]. It is generally assumed that such an estimate reliably depicts the burden of fat in the body. This paper sets out to answer two questions related to this issue that have not been addressed in the literature. How does one ensure that the slices used for correlation calculation from different subjects are at the same anatomic location? At what anatomic location do the volumes of SAT and VAT correlate maximally with the corresponding single-slice area measures? To answer these questions, we propose two approaches for slice localization: linear mapping, and non-linear mapping, a novel learning-based strategy for mapping slice locations to a standardized anatomic space so that the same anatomic slice locations are identified in different subjects. We then study the volume-to-area correlations and determine where they become maximal. We demonstrate on 50 abdominal CT data sets that this mapping achieves significantly improved consistency of anatomic localization compared to current practice. Our results also indicate that maximum correlations are achieved at different anatomic locations for SAT and VAT, both of which differ from the L4-L5 junction commonly utilized.
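The linear variant of the proposed slice localization reduces to normalizing each subject's slice positions between two anatomic landmarks. A hedged sketch, with the landmark coordinates and the 0-100 standardized scale assumed for illustration:

```python
import numpy as np

def to_standard_space(slice_z, z_top, z_bottom, n_std=100):
    """Linearly map a physical slice location to a standardized
    anatomic coordinate in [0, n_std], where 0 and n_std correspond
    to two anatomic landmarks (e.g., superior and inferior abdominal
    boundaries). The landmark choice here is illustrative."""
    frac = (slice_z - z_top) / (z_bottom - z_top)
    return float(np.clip(frac, 0.0, 1.0) * n_std)

# Two subjects with different fields of view: the same standardized
# location (here ~60) indexes anatomically comparable slices in both.
print(to_standard_space(slice_z=130.0, z_top=40.0, z_bottom=190.0))
```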
New false color mapping for image fusion
NASA Astrophysics Data System (ADS)
Toet, Alexander; Walraven, Jan
1996-03-01
A pixel-based color-mapping algorithm is presented that produces a fused false color rendering of two gray-level images representing different sensor modalities. The resulting images have a higher information content than each of the original images and retain sensor-specific image information. The unique component of each image modality is enhanced in the resulting fused color image representation. First, the common component of the two original input images is determined. Second, the common component is subtracted from the original images to obtain the unique component of each image. Third, the unique component of each image modality is subtracted from the image of the other modality. This step serves to enhance the representation of sensor-specific details in the final fused result. Finally, a fused color image is produced by displaying the images resulting from the last step through, respectively, the red and green channels of a color display. The method is applied to fuse thermal and visual images. The results show that the color mapping enhances the visibility of certain details and preserves the specificity of the sensor information. The fused images also have a fairly natural appearance. The fusion scheme involves only operations on corresponding pixels. The resolution of a fused image is therefore directly related to the resolution of the input images. Before fusing, the contrast of the images can be enhanced and their noise can be reduced by standard image-processing techniques. The color mapping algorithm is computationally simple. This implies that the investigated approaches can eventually be applied in real time and that the hardware needed is not too complicated or too voluminous (an important consideration when it has to fit in an airplane, for instance).
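The four fusion steps above translate directly into pixel arithmetic. A sketch under the assumption that the common component is taken as the pixel-wise minimum (a common choice for this family of schemes; the abstract does not fix the operator):

```python
import numpy as np

def fuse_false_color(visual, thermal):
    """Pixel-wise false-color fusion of two gray-level images,
    following the steps described above. The common component is
    assumed here to be the pixel-wise minimum."""
    a = visual.astype(float)
    b = thermal.astype(float)
    common = np.minimum(a, b)          # 1. common component
    ua, ub = a - common, b - common    # 2. unique components
    red = np.clip(a - ub, 0, 255)      # 3. enhance sensor-specific detail
    green = np.clip(b - ua, 0, 255)
    blue = np.zeros_like(red)
    return np.stack([red, green, blue], axis=-1).astype(np.uint8)  # 4. display

rgb = fuse_false_color(np.random.randint(0, 256, (64, 64)),
                       np.random.randint(0, 256, (64, 64)))
print(rgb.shape)  # (64, 64, 3)
```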
Interactive computer methods for generating mineral-resource maps
Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.
1980-01-01
Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. The Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.
cudaMap: a GPU accelerated program for gene expression connectivity mapping
2013-01-01
Background Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2 h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. Results cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Conclusion Emerging ‘omics’ technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy-duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap. PMID:24112435
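Connectivity mapping scores a query gene signature against ranked reference expression profiles. A toy signed-rank score illustrating the general idea, not sscMap's or cudaMap's exact statistic:

```python
import numpy as np

def connection_score(reference_profile, signature_genes, signature_signs):
    """Toy connectivity score: rank the genes of a reference profile
    and sum the signed ranks of the query signature's genes. This
    mirrors the general idea behind connectivity mapping, not the
    exact statistics of sscMap/cudaMap (an illustrative assumption)."""
    ranks = np.argsort(np.argsort(reference_profile))  # ranks 0..n-1
    score = sum(sign * ranks[g]
                for g, sign in zip(signature_genes, signature_signs))
    max_score = sum(sorted(ranks)[-len(signature_genes):])
    return score / max_score  # normalized to [-1, 1]

rng = np.random.default_rng(0)
profile = rng.standard_normal(1000)   # one reference expression profile
genes = [3, 10, 42, 99]               # signature gene indices
signs = [+1, +1, -1, -1]              # up-/down-regulated in the query
print(round(connection_score(profile, genes, signs), 3))
```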
Farmer, Andrew D.; Huang, Wei; Ambachew, Daniel; Penmetsa, R. Varma; Carrasquilla-Garcia, Noelia; Assefa, Teshale; Cannon, Steven B.
2018-01-01
Recombination (R) rate and linkage disequilibrium (LD) analyses are the basis for plant breeding. These vary by breeding system, by generation of inbreeding or outcrossing and by region in the chromosome. Common bean (Phaseolus vulgaris L.) is a favored food legume with a small sequenced genome (514 Mb) and n = 11 chromosomes. The goal of this study was to describe R and LD in the common bean genome using a 768-marker array of single nucleotide polymorphisms (SNP) based on Trans-legume Orthologous Group (TOG) genes along with an advanced-generation Recombinant Inbred Line reference mapping population (BAT93 x Jalo EEP558) and an internationally available diversity panel. A whole genome genetic map was created that covered all eleven linkage groups (LG). The LGs were linked to the physical map by sequence data of the TOGs compared to each chromosome sequence of common bean. The total genetic map length was smaller than for previous maps, reflecting the precision of allele calling and mapping with SNP technology as well as the use of gene-based markers. A total of 91.4% of TOG markers had singleton hits with annotated Pv genes and all mapped outside of regions of resistance gene clusters. LD levels were found to be stronger within the Mesoamerican genepool and to decay more rapidly within the Andean genepool. The recombination rate across the genome was 2.13 cM/Mb, but R was found to be strongly suppressed around centromeres and higher outside peri-centromeric regions. These results have important implications for association and genetic mapping and crop improvement in common bean. PMID:29522524
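The LD statistic underlying such decay analyses is r² between pairs of loci. A minimal sketch with synthetic phased haplotypes:

```python
import numpy as np

def r_squared(hap_a, hap_b):
    """LD statistic r^2 between two biallelic loci, from phased
    haplotypes coded 0/1: r^2 = D^2 / (pA qA pB qB)."""
    pa, pb = hap_a.mean(), hap_b.mean()
    pab = np.mean(hap_a * hap_b)   # frequency of the 1-1 haplotype
    d = pab - pa * pb              # linkage disequilibrium coefficient D
    return d**2 / (pa * (1 - pa) * pb * (1 - pb))

rng = np.random.default_rng(1)
locus1 = rng.integers(0, 2, 200)
# A nearby locus that agrees with locus1 90% of the time (strong LD):
locus2 = np.where(rng.random(200) < 0.9, locus1, 1 - locus1)
print(round(r_squared(locus1, locus2), 3))
```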
Vision-based mapping with cooperative robots
NASA Astrophysics Data System (ADS)
Little, James J.; Jennings, Cullen; Murray, Don
1998-10-01
Two stereo-vision-based mobile robots navigate and autonomously explore their environment safely while building occupancy grid maps of the environment. The robots maintain position estimates within a global coordinate frame using landmark recognition. This allows them to build a common map by sharing position information and stereo data. Stereo vision processing and map updates are done at 3 Hz and the robots move at speeds of 200 cm/s. Cooperative mapping is achieved through autonomous exploration of unstructured and dynamic environments. The map is constructed conservatively, so as to be useful for collision-free path planning. Each robot maintains a separate copy of a shared map, and then posts updates to the common map when it returns to observe a landmark at home base. Issues include synchronization, mutual localization, navigation, exploration, registration of maps, merging repeated views (fusion), centralized vs decentralized maps.
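Occupancy grid maps are typically maintained with Bayesian log-odds updates per cell; a sketch of that standard update (the abstract does not give the paper's exact rule, so the increments here are illustrative tuning constants):

```python
import numpy as np

def update_cell(logodds, hit, l_occ=0.85, l_free=-0.4):
    """Standard Bayesian log-odds occupancy update for one grid cell.
    'hit' is True if the stereo measurement says occupied; the
    increments l_occ and l_free are illustrative."""
    return logodds + (l_occ if hit else l_free)

grid = np.zeros((100, 100))   # log-odds 0 corresponds to P = 0.5
# A robot observes cell (50, 50) as occupied three times:
for _ in range(3):
    grid[50, 50] = update_cell(grid[50, 50], hit=True)

prob = 1.0 - 1.0 / (1.0 + np.exp(grid[50, 50]))
print(f"P(occupied) = {prob:.2f}")   # ~0.93 after three hits
```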
2009-01-01
Background Expressed sequence tags (ESTs) are an important source of gene-based markers such as those based on insertion-deletions (Indels) or single-nucleotide polymorphisms (SNPs). Several gel-based methods have been reported for the detection of sequence variants; however, they have not been widely exploited in common bean, an important legume crop of the developing world. The objectives of this project were to develop and map EST-based markers using analysis of single strand conformation polymorphisms (SSCPs), to create a transcript map for common bean and to compare synteny of the common bean map with sequenced chromosomes of other legumes. Results A set of 418 EST-based amplicons were evaluated for parental polymorphisms using the SSCP technique and 26% of these presented a clear conformational or size polymorphism between Andean and Mesoamerican genotypes. The amplicon-based markers were then used for genetic mapping with segregation analysis performed in the DOR364 × G19833 recombinant inbred line (RIL) population. A total of 118 new marker loci were placed into an integrated molecular map for common bean consisting of 288 markers. Of these, 218 were used for synteny analysis and 186 presented homology with segments of the soybean genome with an e-value lower than 7 × 10⁻¹². The synteny analysis with soybean showed a mosaic pattern of syntenic blocks with most segments of any one common bean linkage group associated with two soybean chromosomes. The analysis with Medicago truncatula and Lotus japonicus presented fewer syntenic regions consistent with the more distant phylogenetic relationship between the galegoid and phaseoloid legumes. Conclusion The SSCP technique is a useful and inexpensive alternative to other SNP or Indel detection techniques for saturating the common bean genetic map with functional markers that may be useful in marker assisted selection. In addition, the genetic markers based on ESTs allowed the construction of a transcript map and given their high conservation between species allowed synteny comparisons to be made to sequenced genomes. This synteny analysis may support positional cloning of target genes in common bean through the use of genomic information from these other legumes. PMID:20030833
Effectiveness of Mind Mapping in English Teaching among VIII Standard Students
ERIC Educational Resources Information Center
Hallen, D.; Sangeetha, N.
2015-01-01
The aim of the study is to find out the effectiveness of the mind mapping technique over the conventional method in teaching English at high school level (VIII), in terms of Control and Experimental groups. The sample of the study comprised 60 VIII Standard students in Tiruchendur Taluk. Mind Maps and Achievement Test (Pretest & Posttest) were…
ERIC Educational Resources Information Center
Andrews, Judith; Eade, Eleanor
2013-01-01
Birmingham City University's Library and Learning Resources' strategic aim is to improve student satisfaction. A key element is the achievement of the Customer Excellence Standard. An important component of the standard is the mapping of services to improve quality. Library and Learning Resources has developed a methodology to map these…
Map-IT! A Web-Based GIS Tool for Watershed Science Education.
ERIC Educational Resources Information Center
Curtis, David H.; Hewes, Christopher M.; Lossau, Matthew J.
This paper describes the development of a prototypic, Web-accessible GIS solution for K-12 science education and citizen-based watershed monitoring. The server side consists of ArcView IMS running on an NT workstation. The client is built around MapCafe. The client interface, which runs through a standard Web browser, supports standard MapCafe…
Validation of a standardized mapping system of the hip joint for radial MRA sequencing.
Klenke, Frank M; Hoffmann, Daniel B; Cross, Brian J; Siebenrock, Klaus A
2015-03-01
Intraarticular gadolinium-enhanced magnetic resonance arthrography (MRA) is commonly applied to characterize morphological disorders of the hip. However, the reproducibility of retrieving anatomic landmarks on MRA scans and their correlation with intraarticular pathologies is unknown. A precise mapping system for the exact localization of hip pathomorphologies with radial MRA sequences is lacking. Therefore, the purpose of the study was the establishment and validation of a reproducible mapping system for radial sequences of hip MRA. Sixty-nine consecutive intraarticular gadolinium-enhanced hip MRAs were evaluated. Radial sequencing consisted of 14 cuts orientated along the axis of the femoral neck. Three orthopedic surgeons read the radial sequences independently. Each MRI was read twice with a minimum interval of 7 days from the first reading. The intra- and inter-observer reliability of the mapping procedure was determined. A clockwise system for hip MRA was established. The teardrop figure served to determine the 6 o'clock position of the acetabulum; the center of the greater trochanter served to determine the 12 o'clock position of the femoral head-neck junction. The intra- and inter-observer ICCs to retrieve the correct 6/12 o'clock positions were 0.906-0.996 and 0.978-0.988, respectively. The established mapping system for radial sequences of hip joint MRA is reproducible and easy to perform.
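The clockwise convention reduces readings to mapping an angle around the acetabular rim, anchored by the teardrop (6 o'clock) landmark, onto a clock-face hour. A sketch with an assumed angle convention:

```python
def clock_position(angle_deg):
    """Map an angle around the acetabular rim to a clock-face hour.
    Convention (an assumption for illustration): 0 degrees is the
    12 o'clock reference and angles increase clockwise; in the study
    the teardrop landmark fixes the 6 o'clock direction."""
    hour = round((angle_deg % 360.0) / 30.0) % 12   # 30 degrees per hour
    return 12 if hour == 0 else hour

print(clock_position(0))     # 12 o'clock
print(clock_position(180))   # 6 o'clock (teardrop direction)
print(clock_position(95))    # ~3 o'clock
```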
Process mapping as a framework for performance improvement in emergency general surgery.
DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad
2017-12-01
Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winter, B.; King, S. J.; Vallance, C., E-mail: claire.vallance@chem.ox.ac.uk
2014-02-15
The time resolution achievable using standard position-sensitive ion detectors, consisting of a chevron pair of microchannel plates coupled to a phosphor screen, is primarily limited by the emission lifetime of the phosphor, around 70 ns for the most commonly used P47 phosphor. We demonstrate that poly-para-phenylene laser dyes may be employed extremely effectively as scintillators, exhibiting higher brightness and much shorter decay lifetimes than P47. We provide an extensive characterisation of the properties of such scintillators, with a particular emphasis on applications in velocity-map imaging and microscope-mode imaging mass spectrometry. The most promising of the new scintillators exhibits an electron-to-photon conversion efficiency double that of P47, with an emission lifetime an order of magnitude shorter. The new scintillator screens are vacuum stable and show no signs of signal degradation even over longer periods of operation.
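A short worked example of why the decay lifetime limits time resolution: for an exponential decay, the fraction of light collected within a gate of width t is 1 − exp(−t/τ). The dye lifetime below is assumed from the quoted "order of magnitude shorter":

```python
import math

def light_fraction(gate_ns, tau_ns):
    """Fraction of light from an exponential scintillator decay
    collected within a gate of width gate_ns: 1 - exp(-t/tau)."""
    return 1.0 - math.exp(-gate_ns / tau_ns)

# P47: ~70 ns decay (from the abstract).
# Dye scintillator: assumed ~7 ns ("an order of magnitude shorter").
for name, tau in [("P47", 70.0), ("PPP dye (assumed)", 7.0)]:
    print(f"{name}: {light_fraction(20.0, tau):.0%} of light within 20 ns")
```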
NASA Astrophysics Data System (ADS)
Lee, Donghoon; Kim, Ye-seul; Choi, Sunghoon; Lee, Haenghwa; Choi, Seungyeon; Kim, Hee-Joung
2016-03-01
Breast cancer is one of the most common malignancies in women. For years, mammography has been used as the gold standard for localizing breast cancer, despite its limitation in determining cancer composition. Therefore, the purpose of this simulation study is to confirm the feasibility of obtaining tumor composition using dual energy digital mammography. To simulate X-ray sources for dual energy mammography, 26 kVp and 39 kVp spectra were generated for the low and high energy beams, respectively. Additionally, the energy subtraction and inverse mapping functions were applied to provide compositional images. The resultant images showed that the breast composition obtained by the inverse mapping function with cubic fitting achieved the highest accuracy and least noise. Furthermore, breast density analysis with cubic fitting showed less than 10% error compared with true values. In conclusion, this study demonstrated the feasibility of creating individual compositional images and the capability of analyzing breast density effectively.
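An inverse mapping with cubic fitting can be sketched as a polynomial calibration from a dual-energy-derived signal to glandular fraction. The calibration data below are synthetic; the study derives its mapping from the simulated 26/39 kVp acquisitions:

```python
import numpy as np

# Sketch: calibrate a cubic inverse mapping from a dual-energy signal
# to breast glandular fraction. Phantom data below are synthetic.
rng = np.random.default_rng(2)
glandular_frac = np.linspace(0.0, 1.0, 20)           # known phantom truth
signal = (0.3 + 0.5 * glandular_frac - 0.1 * glandular_frac**2
          + rng.normal(0, 0.005, 20))                # measured DE quantity

coeffs = np.polyfit(signal, glandular_frac, deg=3)   # cubic inverse mapping
estimate = np.polyval(coeffs, 0.52)                  # apply to a new pixel
print(f"estimated glandular fraction: {estimate:.2f}")
```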
Sequential and parallel image restoration: neural network implementations.
Figueiredo, M T; Leitao, J N
1994-01-01
Sequential and parallel image restoration algorithms and their implementations on neural networks are proposed. For images degraded by linear blur and contaminated by additive white Gaussian noise, maximum a posteriori (MAP) estimation and regularization theory lead to the same high-dimensional convex optimization problem. The commonly adopted strategy (in using neural networks for image restoration) is to map the objective function of the optimization problem into the energy of a predefined network, taking advantage of its energy minimization properties. Departing from this approach, we propose neural implementations of iterative minimization algorithms which are first proved to converge. The developed schemes are based on modified Hopfield (1985) networks of graded elements, with both sequential and parallel updating schedules. An algorithm based on a fully standard Hopfield network (binary elements and zero autoconnections) is also considered. Robustness with respect to finite numerical precision is studied, and examples with real images are presented.
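The shared objective can be written as J(x) = ||y − Hx||² + λ||Dx||²; a sketch of its iterative minimization by plain gradient descent (the paper's neural implementations realize related, convergence-proven schemes):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 64
x_true = np.zeros(n)
x_true[20:40] = 1.0
H = np.eye(n)                                # simple symmetric blur matrix
for k in range(1, 4):
    H += np.eye(n, k=k) + np.eye(n, k=-k)
H /= H.sum(axis=1, keepdims=True)
y = H @ x_true + rng.normal(0, 0.01, n)      # blurred signal + Gaussian noise

D = np.eye(n) - np.eye(n, k=1)               # first-difference regularizer
lam = 0.05
x = np.zeros(n)
step = 0.5                                   # small enough for convergence
for _ in range(500):
    grad = 2 * H.T @ (H @ x - y) + 2 * lam * D.T @ D @ x
    x -= step * grad

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative restoration error: {err:.3f}")
```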
Orlando, Lori A.; Sperber, Nina R.; Voils, Corrine; Nichols, Marshall; Myers, Rachel A.; Wu, R. Ryanne; Rakhra-Burris, Tejinder; Levy, Kenneth D.; Levy, Mia; Pollin, Toni I.; Guan, Yue; Horowitz, Carol R.; Ramos, Michelle; Kimmel, Stephen E.; McDonough, Caitrin W.; Madden, Ebony B.; Damschroder, Laura J.
2017-01-01
Purpose Implementation research provides a structure for evaluating the clinical integration of genomic medicine interventions. This paper describes the Implementing GeNomics In PracTicE (IGNITE) Network’s efforts to promote: 1) a broader understanding of genomic medicine implementation research; and 2) the sharing of knowledge generated in the network. Methods To facilitate this goal the IGNITE Network Common Measures Working Group (CMG) members adopted the Consolidated Framework for Implementation Research (CFIR) to guide their approach to: identifying constructs and measures relevant to evaluating genomic medicine as a whole, standardizing data collection across projects, and combining data in a centralized resource for cross network analyses. Results CMG identified ten high-priority CFIR constructs as important for genomic medicine. Of those, eight did not have standardized measurement instruments. Therefore, we developed four survey tools to address this gap. In addition, we identified seven high-priority constructs related to patients, families, and communities that did not map to CFIR constructs. Both sets of constructs were combined to create a draft genomic medicine implementation model. Conclusion We developed processes to identify constructs deemed valuable for genomic medicine implementation and codified them in a model. These resources are freely available to facilitate knowledge generation and sharing across the field. PMID:28914267
Mapping between the OBO and OWL ontology languages.
Tirmizi, Syed Hamid; Aitken, Stuart; Moreira, Dilvan A; Mungall, Chris; Sequeda, Juan; Shah, Nigam H; Miranker, Daniel P
2011-03-07
Ontologies are commonly used in biomedicine to organize concepts to describe domains such as anatomies, environments, experiments, taxonomies, etc. NCBO BioPortal currently hosts about 180 different biomedical ontologies. These ontologies have been mainly expressed in either the Open Biomedical Ontology (OBO) format or the Web Ontology Language (OWL). OBO emerged from the Gene Ontology, and supports most of the biomedical ontology content. In comparison, OWL is a Semantic Web language, and is supported by the World Wide Web consortium together with integral query languages, rule languages and distributed infrastructure for information interchange. These features are highly desirable for the OBO content as well. A convenient method for leveraging these features for OBO ontologies is by transforming OBO ontologies to OWL. We have developed a methodology for translating OBO ontologies to OWL using the organization of the Semantic Web itself to guide the work. The approach reveals that the constructs of OBO can be grouped together to form a similar layer cake. Thus we were able to decompose the problem into two parts. Most OBO constructs have easy and obvious equivalence to a construct in OWL. A small subset of OBO constructs requires deeper consideration. We have defined transformations for all constructs in an effort to foster a standard common mapping between OBO and OWL. Our mapping produces OWL-DL, a Description Logics based subset of OWL with desirable computational properties for efficiency and correctness. Our Java implementation of the mapping is part of the official Gene Ontology project source. Our transformation system provides a lossless roundtrip mapping for OBO ontologies, i.e. an OBO ontology may be translated to OWL and back without loss of knowledge. In addition, it provides a roadmap for bridging the gap between the two ontology languages in order to enable the use of ontology content in a language independent manner.
Noguchi, Kyo; Itoh, Toshihide; Naruto, Norihito; Takashima, Shutaro; Tanaka, Kortaro; Kuroda, Satoshi
2017-01-01
We evaluated whether X-map, a novel imaging technique, can visualize ischemic lesions within 20 hours after onset in patients with acute ischemic stroke, using noncontrast dual-energy computed tomography (DECT). Six patients with acute ischemic stroke were included in this study. Noncontrast head DECT scans were acquired with 2 X-ray tubes operated at 80 kV and Sn150 kV between 32 minutes and 20 hours after onset. Using these DECT scans, the X-map was reconstructed based on 3-material decomposition and compared with a simulated standard (120 kV) computed tomography (CT) scan and diffusion-weighted imaging (DWI). In all 6 patients, the X-map was more sensitive than the simulated standard CT in identifying the lesions as areas of lower attenuation value. The lesions on the X-map correlated well with those on DWI. In 3 of 6 patients, the X-map detected a transient decrease in the attenuation value in the peri-infarct area within 1 day after onset. The X-map is a powerful tool to supplement a simulated standard CT and characterize acute ischemic lesions. However, the X-map cannot replace a simulated standard CT in diagnosing acute cerebral infarction.
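With two energies plus the constraint that volume fractions sum to one, a three-material decomposition is a 3×3 linear solve per voxel. A sketch with illustrative (uncalibrated) attenuation coefficients and assumed basis materials:

```python
import numpy as np

# Sketch of per-voxel three-material decomposition from dual-energy CT:
# two attenuation measurements (80 kV, Sn150 kV) plus the constraint
# that volume fractions sum to 1 give a 3x3 linear system. All
# coefficients below are illustrative, not calibrated values.
mu = np.array([
    # basis materials (assumed): gray matter, white matter, edema/water
    [0.225, 0.220, 0.210],   # mu at 80 kV (1/cm, illustrative)
    [0.160, 0.158, 0.152],   # mu at Sn150 kV (1/cm, illustrative)
    [1.0,   1.0,   1.0],     # volume fractions sum to 1
])

measured = np.array([0.2205, 0.1578, 1.0])   # voxel measurements + constraint
fractions = np.linalg.solve(mu, measured)
print(fractions)   # volume fraction of each basis material, ~[0.5 0.3 0.2]
```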
Uncertainty visualisation in the Model Web
NASA Astrophysics Data System (ADS)
Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.
2012-04-01
Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and uncertainty information are also included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but few tools exist for visualising the data in a standardised way. Furthermore, they are usually realised as thick clients and lack functionality for handling data from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan, zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. Uncertainty information considered for the tool are probabilistic and quantified attribute uncertainties which can be provided as realisations or samples, full probability distribution functions and statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation by separating value and uncertainty maps for non-experts and a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistic plots and time series in different windows and sliders to interactively move through time, space and uncertainty (thresholds).
Measures of outdoor play and independent mobility in children and youth: A methodological review.
Bates, Bree; Stone, Michelle R
2015-09-01
Declines in children's outdoor play have been documented globally, which are partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct. Creating a standardized methodological approach would improve study comparisons.
The History of Soil Mapping and Classification in Europe: The role of the European Commission
NASA Astrophysics Data System (ADS)
Montanarella, Luca
2014-05-01
Early systematic soil mapping in Europe dates back to the early times of soil science in the 19th Century and was developed at National scales mostly for taxation purposes. National soil classification systems emerged out of the various scientific communities active at that time in leading countries like Germany, Austria, France, Belgium, United Kingdom and many others. Different scientific communities were leading in the various countries, in some cases stemming from geological sciences, in others as a branch of agricultural sciences. Soil classification for the purpose of ranking soils for their capacity to be agriculturally productive emerged as the main priority, allowing in some countries for very detailed and accurate soil maps at 1:5,000 scale and larger. Detailed mapping was mainly driven by taxation purposes in the early times but evolved in several countries also as a planning and management tool for farms and local administrations. The need for pan-European soil mapping and classification efforts emerged only after World War II in the early 1950's under the auspices of FAO with the aim to compile a common European soil map as a contribution to the global soil mapping efforts of FAO at that time. These efforts evolved over the next decades, with the support of the European Commission, towards the establishment of a permanent network of National soil survey institutions (the European Soil Bureau Network). With the introduction of digital soil mapping technologies, the new European Soil Information System (EUSIS) was established, incorporating data at multiple scales for the EU member states and bordering countries. In more recent years, the formal establishment of the European Soil Data Centre (ESDAC) hosted by the European Commission, together with a formal legal framework for soil mapping and soil classification provided by the INSPIRE directive and the related standardization and harmonization efforts, has led to the operational development of advanced digital soil mapping techniques supporting the contribution of Europe to a common global soil information system under the coordination of the Global Soil Partnership (GSP) of FAO. Further information: http://eusoils.jrc.ec.europa.eu/ References: Mark G Kibblewhite, Ladislav Miko, Luca Montanarella, Legal frameworks for soil protection: current development and technical information requirements, Current Opinion in Environmental Sustainability, Volume 4, Issue 5, November 2012, Pages 573-577. Luca Montanarella, Ronald Vargas, Global governance of soil resources as a necessary condition for sustainable development, Current Opinion in Environmental Sustainability, Volume 4, Issue 5, November 2012, Pages 559-564.
Advanced Map For Real-Time Process Control
NASA Astrophysics Data System (ADS)
Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto
1987-10-01
MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
Landsat Image Map Production Methods at the U. S. Geological Survey
Kidwell, R.D.; Binnie, D.R.; Martin, S.
1987-01-01
To maintain consistently high quality in satellite image map production, the U. S. Geological Survey (USGS) has developed standard procedures for the photographic and digital production of Landsat image mosaics, and for lithographic printing of multispectral imagery. This paper gives a brief review of the photographic, digital, and lithographic procedures currently in use for producing image maps from Landsat data. It is shown that consistency in the printing of image maps is achieved by standardizing the materials and procedures that affect the image detail and color balance of the final product. Densitometric standards are established by printing control targets using the pressplates, inks, pre-press proofs, and paper to be used for printing.
NASA Technical Reports Server (NTRS)
Feng, Wanda; Evans, Cynthia; Gruener, John; Eppler, Dean
2014-01-01
Geologic mapping involves interpreting relationships between identifiable units and landforms to understand the formative history of a region. Traditional field techniques are used to accomplish this on Earth. Mapping proves more challenging for other planets, which are studied primarily by orbital remote sensing and, less frequently, by robotic and human surface exploration. Systematic comparative assessments of geologic maps created by traditional mapping versus photogeology together with data from planned traverses are limited. The objective of this project is to produce a geologic map from data collected on the Desert Research and Technology Studies (RATS) 2010 analog mission using Apollo-style traverses in conjunction with remote sensing data. This map is compared with a geologic map produced using standard field techniques.
Cartographic standards to improve maps produced by the Forest Inventory and Analysis program
Charles H. (Hobie) Perry; Mark D. Nelson
2009-01-01
The Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is incorporating an increasing number of cartographic products in reports, publications, and presentations. To create greater quality and consistency within the national FIA program, a Geospatial Standards team developed cartographic design standards for FIA map...
The OncoArray Consortium: A Network for Understanding the Genetic Architecture of Common Cancers.
Amos, Christopher I; Dennis, Joe; Wang, Zhaoming; Byun, Jinyoung; Schumacher, Fredrick R; Gayther, Simon A; Casey, Graham; Hunter, David J; Sellers, Thomas A; Gruber, Stephen B; Dunning, Alison M; Michailidou, Kyriaki; Fachal, Laura; Doheny, Kimberly; Spurdle, Amanda B; Li, Yafang; Xiao, Xiangjun; Romm, Jane; Pugh, Elizabeth; Coetzee, Gerhard A; Hazelett, Dennis J; Bojesen, Stig E; Caga-Anan, Charlisse; Haiman, Christopher A; Kamal, Ahsan; Luccarini, Craig; Tessier, Daniel; Vincent, Daniel; Bacot, François; Van Den Berg, David J; Nelson, Stefanie; Demetriades, Stephen; Goldgar, David E; Couch, Fergus J; Forman, Judith L; Giles, Graham G; Conti, David V; Bickeböller, Heike; Risch, Angela; Waldenberger, Melanie; Brüske-Hohlfeld, Irene; Hicks, Belynda D; Ling, Hua; McGuffog, Lesley; Lee, Andrew; Kuchenbaecker, Karoline; Soucy, Penny; Manz, Judith; Cunningham, Julie M; Butterbach, Katja; Kote-Jarai, Zsofia; Kraft, Peter; FitzGerald, Liesel; Lindström, Sara; Adams, Marcia; McKay, James D; Phelan, Catherine M; Benlloch, Sara; Kelemen, Linda E; Brennan, Paul; Riggan, Marjorie; O'Mara, Tracy A; Shen, Hongbing; Shi, Yongyong; Thompson, Deborah J; Goodman, Marc T; Nielsen, Sune F; Berchuck, Andrew; Laboissiere, Sylvie; Schmit, Stephanie L; Shelford, Tameka; Edlund, Christopher K; Taylor, Jack A; Field, John K; Park, Sue K; Offit, Kenneth; Thomassen, Mads; Schmutzler, Rita; Ottini, Laura; Hung, Rayjean J; Marchini, Jonathan; Amin Al Olama, Ali; Peters, Ulrike; Eeles, Rosalind A; Seldin, Michael F; Gillanders, Elizabeth; Seminara, Daniela; Antoniou, Antonis C; Pharoah, Paul D P; Chenevix-Trench, Georgia; Chanock, Stephen J; Simard, Jacques; Easton, Douglas F
2017-01-01
Common cancers develop through a multistep process often including inherited susceptibility. Collaboration among multiple institutions, and funding from multiple sources, has allowed the development of an inexpensive genotyping microarray, the OncoArray. The array includes a genome-wide backbone, comprising 230,000 SNPs tagging most common genetic variants, together with dense mapping of known susceptibility regions, rare variants from sequencing experiments, pharmacogenetic markers, and cancer-related traits. The OncoArray can be genotyped using a novel technology developed by Illumina to facilitate efficient genotyping. The consortium developed standard approaches for selecting SNPs for study, for quality control of markers, and for ancestry analysis. The array was genotyped at selected sites and with prespecified replicate samples to permit evaluation of genotyping accuracy among centers and by ethnic background. The OncoArray consortium genotyped 447,705 samples. A total of 494,763 SNPs passed quality control steps with a sample success rate of 97% of the samples. Participating sites performed ancestry analysis using a common set of markers and a scoring algorithm based on principal components analysis. Results from these analyses will enable researchers to identify new susceptibility loci, perform fine-mapping of new or known loci associated with either single or multiple cancers, assess the degree of overlap in cancer causation and pleiotropic effects of loci that have been identified for disease-specific risk, and jointly model genetic, environmental, and lifestyle-related exposures. Ongoing analyses will shed light on etiology and risk assessment for many types of cancer. Cancer Epidemiol Biomarkers Prev; 26(1); 126-35.
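A hedged sketch of PCA-based ancestry scoring on a common marker set, with synthetic 0/1/2 genotypes; the consortium's actual scoring algorithm and reference panels are not reproduced here:

```python
import numpy as np
from sklearn.decomposition import PCA

# Genotypes coded as minor-allele counts (0/1/2); data are synthetic.
rng = np.random.default_rng(4)
ref_a = rng.binomial(2, 0.1, size=(100, 500))   # reference population A
ref_b = rng.binomial(2, 0.4, size=(100, 500))   # reference population B
reference = np.vstack([ref_a, ref_b])

# Fit principal components on the reference panel, then project
# new study samples onto them to score their ancestry.
pca = PCA(n_components=2).fit(reference)
study = rng.binomial(2, 0.38, size=(5, 500))    # new study samples
scores = pca.transform(study)
print(scores[:, 0])   # PC1 separates A-like from B-like samples
```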
Field methods and data processing techniques associated with mapped inventory plots
William A. Bechtold; Stanley J. Zarnoch
1999-01-01
The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...
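A ratio-of-means estimator divides the total of an attribute by the total of the area actually sampled, which is what makes it suited to partially forested mapped plots. A minimal sketch with illustrative numbers:

```python
# Ratio-of-means estimation for mapped plots (illustrative numbers).
# Each plot may be only partly forested; the estimator divides the
# total of the attribute by the total of the sampled area:
#   R_hat = sum(y_i) / sum(x_i)

volumes = [120.0, 0.0, 80.0, 45.0]   # y_i: volume tallied on plot i (m^3)
areas = [1.0, 1.0, 0.6, 0.4]         # x_i: forested area sampled on plot i (ha)

r_hat = sum(volumes) / sum(areas)
print(f"estimated volume per hectare: {r_hat:.1f} m^3/ha")
```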
Evidence for basal distortion-product otoacoustic emission components.
Martin, Glen K; Stagner, Barden B; Lonsbury-Martin, Brenda L
2010-05-01
Distortion-product otoacoustic emissions (DPOAEs) were measured with traditional DP-grams and level/phase (L/P) maps in rabbits with either normal cochlear function or unique sound-induced cochlear losses that were characterized as either low-frequency or notched configurations. To demonstrate that emission generators distributed basal to the f(2) primary-tone contribute, in general, to DPOAE levels and phases, a high-frequency interference tone (IT) was presented at 1/3 of an octave (oct) above the f(2) primary-tone, and DPOAEs were re-measured as "augmented" DP-grams (ADP-grams) and L/P maps. The vector difference between the control and augmented functions was then computed to derive residual DP-grams (RDP-grams) and L/P maps. The resulting RDP-grams and L/P maps, which described the DPOAEs removed by the IT, supported the notion that basal DPOAE components routinely contribute to the generation of standard measures of DPOAEs. Separate experiments demonstrated that these components could not be attributed to the effects of the 1/3-oct IT on f(2), or DPOAEs generated by the addition of a third interfering tone. These basal components can "fill in" the lesion estimated by the commonly employed DP-gram. Thus, ADP-grams more accurately reveal the pattern of cochlear damage and may eventually lead to an improved DP-gram procedure.
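The residual DP-gram is a vector difference, which is natural to compute on complex pressures reconstructed from level and phase. A sketch with illustrative values:

```python
import numpy as np

def to_complex(level_db_spl, phase_deg):
    """A DPOAE as a complex pressure (Pa), from level and phase."""
    amp = 20e-6 * 10 ** (level_db_spl / 20.0)   # amplitude re 20 uPa
    return amp * np.exp(1j * np.deg2rad(phase_deg))

def residual(level_ctrl, phase_ctrl, level_aug, phase_aug):
    """Vector difference between control and augmented DPOAEs,
    returned as (level in dB SPL, phase in degrees) of the component
    removed by the interference tone."""
    diff = to_complex(level_ctrl, phase_ctrl) - to_complex(level_aug, phase_aug)
    level = 20 * np.log10(np.abs(diff) / 20e-6)
    return level, np.rad2deg(np.angle(diff))

# Illustrative values: the IT drops the level slightly and shifts phase.
print(residual(20.0, 30.0, 18.0, 55.0))
```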
Chemical Space Mapping and Structure-Activity Analysis of the ChEMBL Antiviral Compound Set.
Klimenko, Kyrylo; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre
2016-08-22
Curation, standardization and data fusion of the antiviral information present in the ChEMBL public database led to the definition of a robust data set, providing an association of antiviral compounds to seven broadly defined antiviral activity classes. Generative topographic mapping (GTM) subjected to evolutionary tuning was then used to produce maps of the antiviral chemical space, providing an optimal separation of compound families associated with the different antiviral classes. The ability to pinpoint the specific spots occupied (responsibility patterns) on a map by various classes of antiviral compounds opened the way for a GTM-supported search for privileged structural motifs, typical for each antiviral class. The privileged locations of antiviral classes were analyzed in order to highlight underlying privileged common structural motifs. Unlike in classical medicinal chemistry, where privileged structures are, almost always, predefined scaffolds, privileged structural motif detection based on GTM responsibility patterns has the decisive advantage of being able to automatically capture the nature ("resolution detail": scaffold, detailed substructure, pharmacophore pattern, etc.) of the relevant structural motifs. Responsibility patterns were found to represent underlying structural motifs of various natures: from very fuzzy (groups of various "interchangeable" similar scaffolds), to the classical scenario in medicinal chemistry (underlying motif actually being the scaffold), to very precisely defined motifs (specifically substituted scaffolds).
Feng, Xiu; Yu, Xiaomu; Fu, Beide; Wang, Xinhua; Liu, Haiyang; Pang, Meixia; Tong, Jingou
2018-04-02
A high-density genetic linkage map is essential for QTL fine mapping, comparative genome analysis, identification of candidate genes and marker-assisted selection for economic traits in aquaculture species. The Yangtze River common carp (Cyprinus carpio haematopterus) is one of the most important aquacultured strains in China. However, only limited genetic and genomic resources have been developed for genetic improvement of economic traits in this strain. A high-resolution genetic linkage map was constructed by using 7820 2b-RAD (2b-restriction site-associated DNA) and 295 microsatellite markers in a F2 family of the Yangtze River common carp (C. c. haematopterus). The length of the map was 4586.56 cM with an average marker interval of 0.57 cM. Comparative genome mapping revealed that a high proportion (70%) of markers with disagreed chromosome location was observed between C. c. haematopterus and another common carp strain (subspecies) C. c. carpio. A clear 2:1 relationship was observed between C. c. haematopterus linkage groups (LGs) and zebrafish (Danio rerio) chromosomes. Based on the genetic map, 21 QTLs for growth-related traits were detected on 12 LGs, and contributed values of phenotypic variance explained (PVE) ranging from 16.3 to 38.6%, with LOD scores ranging from 4.02 to 11.13. A genome-wide significant QTL (LOD = 10.83) and three chromosome-wide significant QTLs (mean LOD = 4.84) for sex were mapped on LG50 and LG24, respectively. A 1.4 cM confidence interval of QTL for all growth-related traits showed conserved synteny with a 2.06 Mb segment on chromosome 14 of D. rerio. Five potential candidate genes were identified by BLAST search in this genomic region, including a well-studied multi-functional growth-related gene, Apelin. We mapped a set of suggestive and significant QTLs for growth-related traits and sex based on a high-density genetic linkage map using SNP and microsatellite markers for Yangtze River common carp. Several candidate growth genes were also identified from the QTL regions by comparative mapping. This genetic map would provide a basis for genome assembly and comparative genomics studies, and those QTL-derived candidate genes and genetic markers are useful genomic resources for marker-assisted selection (MAS) of growth-related traits in the Yangtze River common carp.
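For a single-QTL model, PVE and LOD are linked by PVE = 1 − 10^(−2·LOD/n). A worked sketch using the reported LOD scores; the F2 family size n is an assumption, chosen to show the reported PVE range is consistent:

```python
# Relationship between LOD score and phenotypic variance explained (PVE)
# in a single-QTL model: PVE = 1 - 10 ** (-2 * LOD / n).
# The family size below is an assumption for illustration.

def pve_from_lod(lod, n):
    return 1.0 - 10.0 ** (-2.0 * lod / n)

n = 100  # hypothetical number of F2 individuals
for lod in (4.02, 10.83, 11.13):
    print(f"LOD {lod:5.2f} -> PVE {pve_from_lod(lod, n):.1%}")
# With n = 100 this reproduces roughly the 16-39% PVE range reported.
```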
Automated System for Early Breast Cancer Detection in Mammograms
NASA Technical Reports Server (NTRS)
Bankman, Isaac N.; Kim, Dong W.; Christens-Barry, William A.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.
1993-01-01
The increasing demand for mammographic screening for early breast cancer detection, and the subtlety of early breast cancer signs on mammograms, motivate an automated image processing system that can serve as a diagnostic aid in radiology clinics. We present a fully automated algorithm for detecting clusters of microcalcifications, the most common signs of early, potentially curable breast cancer. By using the contour map of the mammogram, the algorithm circumvents some of the difficulties encountered with standard image processing methods. The clinical implementation of an automated instrument based on this algorithm is also discussed.
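The paper's algorithm is not reproduced here; as a loose illustration of the contour-map idea, this sketch uses scikit-image's find_contours to flag small closed intensity contours as candidate bright spots. The threshold level and size cutoff are arbitrary assumptions.

    import numpy as np
    from skimage import measure

    def candidate_microcalcifications(image, level, max_perimeter=30.0):
        """Flag small, closed intensity contours as candidate bright spots.
        image: 2D float array (mammogram region); level: intensity threshold."""
        candidates = []
        for contour in measure.find_contours(image, level):
            closed = np.allclose(contour[0], contour[-1])
            perimeter = np.sum(np.linalg.norm(np.diff(contour, axis=0), axis=1))
            if closed and perimeter < max_perimeter:
                candidates.append(contour.mean(axis=0))  # centroid (row, col)
        return np.array(candidates)

    img = np.random.default_rng(0).random((128, 128))
    print(len(candidate_microcalcifications(img, level=0.99)))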
NASA Astrophysics Data System (ADS)
An, L.; Zhang, J.; Gong, L.
2018-04-01
Playing an important role in gathering information on damage to social infrastructure, Synthetic Aperture Radar (SAR) remote sensing is a useful tool for monitoring earthquake disasters. With the wide application of this technique, a standard method that compares post-seismic to pre-seismic data has become common. However, multi-temporal SAR processing is not always achievable, so developing a building damage detection method that uses post-seismic data only is of great importance. In this paper, the authors initiate an experimental investigation to establish an object-based, feature-analysis classification method for building damage recognition.
Toward standardized mapping for left atrial analysis and cardiac ablation guidance
NASA Astrophysics Data System (ADS)
Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.
2014-03-01
In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure, and for this reason have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population-based studies. Standardized maps are important tools for characterizing anatomic variability across subjects, with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects, whose maps are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the locations of the inferior ostia have higher variability than those of the superior ostia, and that the variability of the left atrial appendage is similar to that of the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.
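A minimal sketch of the ray-casting idea, assuming rays are cast from a single interior point and parameterized by spherical angles; the original method's exact camera-rotation scheme may differ.

    import numpy as np

    def flat_map(mesh_points, center):
        """Project surface points to a 2D map by casting rays from an interior
        center point and recording (azimuth, elevation) per point.
        Returns an (n, 2) array of (longitude, latitude) and the ray lengths."""
        v = mesh_points - center
        r = np.linalg.norm(v, axis=1)
        lon = np.arctan2(v[:, 1], v[:, 0])       # longitude in [-pi, pi]
        lat = np.arcsin(v[:, 2] / r)             # latitude in [-pi/2, pi/2]
        return np.column_stack([lon, lat]), r    # r can color the map by depth

    pts = np.random.default_rng(0).normal(size=(1000, 3)) * [30, 25, 20]
    coords, depth = flat_map(pts, center=np.zeros(3))
    print(coords.shape, depth.mean())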
Standards-based sensor interoperability and networking SensorWeb: an overview
NASA Astrophysics Data System (ADS)
Bolling, Sam
2012-06-01
The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard so that users can discover, access, observe, subscribe to, and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial-standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off-the-Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.
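A small sketch of the data-harmonization idea: adapting one legacy sensor's record format onto a common observation schema. The schema and the legacy field names are purely illustrative assumptions, not SensorWeb's actual interfaces.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class Observation:
        """A common observation schema, loosely modeled on OGC-style concepts."""
        sensor_id: str
        observed_property: str
        value: float
        unit: str
        time: datetime

    def from_legacy(record: dict) -> Observation:
        # Adapter: map one legacy sensor's field names onto the common schema.
        return Observation(
            sensor_id=record["station"],
            observed_property=record["param"],
            value=float(record["val"]),
            unit=record.get("units", ""),
            time=datetime.fromtimestamp(record["ts"], tz=timezone.utc),
        )

    print(from_legacy({"station": "S-17", "param": "radiance",
                       "val": "4.2", "ts": 1338508800}))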
NASA Astrophysics Data System (ADS)
Zeng, Lu-Chuan; Yao, Jen-Chih
2006-09-01
Recently, Agarwal, Cho, Li and Huang [R.P. Agarwal, Y.J. Cho, J. Li, N.J. Huang, Stability of iterative procedures with errors approximating common fixed points for a couple of quasi-contractive mappings in q-uniformly smooth Banach spaces, J. Math. Anal. Appl. 272 (2002) 435-447] introduced the new iterative procedures with errors for approximating the common fixed point of a couple of quasi-contractive mappings and showed the stability of these iterative procedures with errors in Banach spaces. In this paper, we introduce a new concept of a couple of q-contractive-like mappings (q>1) in a Banach space and apply these iterative procedures with errors for approximating the common fixed point of the couple of q-contractive-like mappings. The results established in this paper improve, extend and unify the corresponding ones of Agarwal, Cho, Li and Huang [R.P. Agarwal, Y.J. Cho, J. Li, N.J. Huang, Stability of iterative procedures with errors approximating common fixed points for a couple of quasi-contractive mappings in q-uniformly smooth Banach spaces, J. Math. Anal. Appl. 272 (2002) 435-447], Chidume [C.E. Chidume, Approximation of fixed points of quasi-contractive mappings in Lp spaces, Indian J. Pure Appl. Math. 22 (1991) 273-386], Chidume and Osilike [C.E. Chidume, M.O. Osilike, Fixed points iterations for quasi-contractive maps in uniformly smooth Banach spaces, Bull. Korean Math. Soc. 30 (1993) 201-212], Liu [Q.H. Liu, On Naimpally and Singh's open questions, J. Math. Anal. Appl. 124 (1987) 157-164; Q.H. Liu, A convergence theorem of the sequence of Ishikawa iterates for quasi-contractive mappings, J. Math. Anal. Appl. 146 (1990) 301-305], Osilike [M.O. Osilike, A stable iteration procedure for quasi-contractive maps, Indian J. Pure Appl. Math. 27 (1996) 25-34; M.O. Osilike, Stability of the Ishikawa iteration method for quasi-contractive maps, Indian J. Pure Appl. Math. 28 (1997) 1251-1265] and many others in the literature.
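A numerical illustration of an Ishikawa-type iteration with errors, using a scalar contraction in place of the Banach-space operators studied in the paper; the step and error sequences below are one common choice satisfying summability, not the paper's exact conditions.

    import numpy as np

    def ishikawa_with_errors(T, x0, steps=200, seed=0):
        """Ishikawa-type iteration with errors:
            y_n     = (1 - b_n) x_n + b_n T(x_n) + v_n
            x_{n+1} = (1 - a_n) x_n + a_n T(y_n) + u_n
        where the error terms u_n, v_n are summable."""
        rng = np.random.default_rng(seed)
        x = x0
        for n in range(1, steps + 1):
            a, b = 1.0 / (n + 1), 1.0 / (n + 1)
            u, v = rng.normal() / n**2, rng.normal() / n**2  # summable errors
            y = (1 - b) * x + b * T(x) + v
            x = (1 - a) * x + a * T(y) + u
        return x

    T = lambda x: 0.5 * np.cos(x)                 # a contraction on the reals
    print(ishikawa_with_errors(T, x0=2.0))        # approximates the fixed point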
Elementary maps on nest algebras
NASA Astrophysics Data System (ADS)
Li, Pengtong
2006-08-01
Let A and B be algebras and let M: A → B and M*: B → A be maps. An elementary map of A × B is an ordered pair (M, M*) such that M(a M*(b) a) = M(a) b M(a) and M*(b M(a) b) = M*(b) a M*(b) for all a ∈ A and b ∈ B. In this paper, the general form of surjective elementary maps on standard subalgebras of nest algebras is described. In particular, such maps are automatically additive.
Topsoil organic carbon content of Europe, a new map based on a generalised additive model
NASA Astrophysics Data System (ADS)
de Brogniez, Delphine; Ballabio, Cristiano; Stevens, Antoine; Jones, Robert J. A.; Montanarella, Luca; van Wesemael, Bas
2014-05-01
There is an increasing demand for up-to-date, spatially continuous organic carbon (OC) data for global environmental and climatic modeling. Whilst the current map of topsoil organic carbon content for Europe (Jones et al., 2005) was produced by applying expert-knowledge-based pedo-transfer rules on large soil mapping units, the aim of this study was to replace it by applying digital soil mapping techniques to the first European harmonised geo-referenced topsoil (0-20 cm) database, which arises from the LUCAS (land use/cover area frame statistical survey) survey. A generalized additive model (GAM) was calibrated on 85% of the dataset (ca. 17 000 soil samples) and a backward stepwise approach selected slope, land cover, temperature, net primary productivity, latitude and longitude as environmental covariates (500 m resolution). Validation of the model (applied to the remaining 15% of the dataset) gave an R2 of 0.27. We observed that most organic soils were under-predicted by the model and that soils of Scandinavia were also poorly predicted. The model showed an RMSE of 42 g kg-1 for mineral soils and of 287 g kg-1 for organic soils. The map of predicted OC content showed the lowest values in Mediterranean countries and in croplands across Europe, whereas the highest OC contents were predicted in wetlands, woodlands and mountainous areas. The map of the standard error of the OC model predictions showed high values in northern latitudes, wetlands, moors and heathlands, whereas low uncertainty was mostly found in croplands. A comparison of our results with the map of Jones et al. (2005) showed general agreement on the prediction of mineral soils' OC content, most probably because the models use some common covariates, namely land cover and temperature. Our model, however, failed to predict values of OC content greater than 200 g kg-1, which we explain by the imposed unimodal distribution of our model, whose mean is tilted towards the majority of soils, which are mineral. Finally, average OC content predictions for each land cover class compared well between models, with our model always showing smaller standard deviations. We concluded that the chosen model and covariates are appropriate for the prediction of OC content in European mineral soils. We present in this work the first map of topsoil OC content at the European scale based on a harmonised soil dataset. The associated uncertainty map should support end-users in making careful use of the predictions.
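A hedged sketch of fitting a comparable GAM in Python with the pygam package; the study does not specify this tool, and the covariate values below are synthetic stand-ins for slope, temperature, NPP, latitude, longitude, and a land-cover class code.

    import numpy as np
    from pygam import LinearGAM, s, f  # pip install pygam

    rng = np.random.default_rng(0)
    n = 2000
    # Synthetic stand-ins for the covariates named in the abstract.
    X = np.column_stack([rng.uniform(0, 30, n),      # slope
                         rng.uniform(-2, 18, n),     # temperature
                         rng.uniform(0, 1, n),       # net primary productivity
                         rng.uniform(35, 70, n),     # latitude
                         rng.uniform(-10, 30, n),    # longitude
                         rng.integers(0, 5, n)])     # land-cover class
    y = 20 + 1.5 * X[:, 1] + 30 * X[:, 2] + 5 * (X[:, 5] == 3) \
        + rng.normal(0, 8, n)                        # OC content, g/kg

    # Smooth terms for continuous covariates, a factor term for land cover.
    gam = LinearGAM(s(0) + s(1) + s(2) + s(3) + s(4) + f(5)).fit(X, y)
    pred = gam.predict(X)
    print(np.sqrt(np.mean((y - pred) ** 2)))         # RMSE on training data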
Standard for the U.S. Geological Survey Historical Topographic Map Collection
Allord, Gregory J.; Fishburn, Kristin A.; Walter, Jennifer L.
2014-01-01
This document defines the digital map product of the U.S. Geological Survey (USGS) Historical Topographic Map Collection (HTMC). The HTMC is a digital archive of about 190,000 printed topographic quadrangle maps published by the USGS from the inception of the topographic mapping program in 1884 until the last paper topographic map using lithographic printing technology was published in 2006. The HTMC provides a comprehensive digital repository of all scales and all editions of USGS printed topographic maps that is easily discovered, browsed, and downloaded by the public at no cost. Each printed topographic map is scanned “as is” and captures the content and condition of each map. The HTMC provides ready access to maps that are no longer available for distribution in print. A new generation of topographic maps called “US Topo” was defined in 2009. US Topo maps, though modeled on the legacy 7.5-minute topographic maps, conform to different standards. For more information on the HTMC, see the project Web site at: http://nationalmap.gov/historical/.
US EPA Nonattainment Areas and Designations-SO2 (2010 NAAQS)
This web service contains the following layer: SO2 2010 NAAQS State Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2010SO21hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database; this does not necessarily mean the data have changed. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, the EPA Office of Air Quality Planning and Standards (OAQPS) is required to set National Ambient Air Quality Standards for six common air pollutants, called criteria pollutants, which are found all over the United States: particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
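These map services follow the standard ArcGIS REST API, so layer attributes can be pulled programmatically. In this sketch the layer index 0 is an assumption to verify against the service endpoint; the query operation and parameters are the standard ones.

    import requests

    # Query layer 0 of the SO2 2010 NAAQS map service for feature attributes.
    url = ("https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/"
           "NAA2010SO21hour/MapServer/0/query")
    params = {
        "where": "1=1",            # no attribute filter: return all features
        "outFields": "*",          # all attribute fields
        "returnGeometry": "false",
        "f": "json",               # standard ArcGIS REST output format
    }
    features = requests.get(url, params=params, timeout=30).json()["features"]
    print(len(features), features[0]["attributes"] if features else None)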
US EPA Nonattainment Areas and Designations-24 Hour PM2.5 (2006 NAAQS)
This web service contains the following layers: PM2.5 24hr 2006 NAAQS State Level and PM2.5 24hr 2006 NAAQS National. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2006PM2524hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database; this does not necessarily mean the data have changed. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, the EPA Office of Air Quality Planning and Standards (OAQPS) is required to set National Ambient Air Quality Standards for six common air pollutants, called criteria pollutants, which are found all over the United States: particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
US EPA Nonattainment Areas and Designations-Annual PM2.5 (2012 NAAQS)
This web service contains the following layer: PM2.5 Annual 2012 NAAQS State Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2012PM25Annual/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database; this does not necessarily mean the data have changed. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, the EPA Office of Air Quality Planning and Standards (OAQPS) is required to set National Ambient Air Quality Standards for six common air pollutants, called criteria pollutants, which are found all over the United States: particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.
Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A
2012-08-01
To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
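A small sketch of how precision and recall against a gold-standard mapping, as reported above, can be computed. The term strings and RxNorm concept identifiers below are invented for illustration.

    def evaluate_mapping(automated: dict, gold: dict):
        """Precision/recall of an automated term mapping against a gold standard.
        automated, gold: {local_term: concept_id or None}.
        A term counts toward recall only if the gold standard maps it."""
        tp = sum(1 for t, c in automated.items()
                 if c is not None and gold.get(t) == c)
        proposed = sum(1 for c in automated.values() if c is not None)
        mappable = sum(1 for c in gold.values() if c is not None)
        precision = tp / proposed if proposed else 0.0
        recall = tp / mappable if mappable else 0.0
        return precision, recall

    print(evaluate_mapping(
        {"lisinopril 10 mg tab": "314076", "mystery elixir": None},
        {"lisinopril 10 mg tab": "314076", "mystery elixir": "999999"},
    ))  # -> (1.0, 0.5)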
Modeling visual problem solving as analogical reasoning.
Lovett, Andrew; Forbus, Kenneth
2017-01-01
We present a computational model of visual problem solving, designed to solve problems from the Raven's Progressive Matrices intelligence test. The model builds on the claim that analogical reasoning lies at the heart of visual problem solving, and intelligence more broadly. Images are compared via structure mapping, aligning the common relational structure in 2 images to identify commonalities and differences. These commonalities or differences can themselves be reified and used as the input for future comparisons. When images fail to align, the model dynamically rerepresents them to facilitate the comparison. In our analysis, we find that the model matches adult human performance on the Standard Progressive Matrices test, and that problems which are difficult for the model are also difficult for people. Furthermore, we show that model operations involving abstraction and rerepresentation are particularly difficult for people, suggesting that these operations may be critical for performing visual problem solving, and reasoning more generally, at the highest level. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
USGS standard quadrangle maps for emergency response
Moore, Laurence R.
2009-01-01
The 1:24,000-scale topographic quadrangle was the primary product of the U.S. Geological Survey's (USGS) National Mapping Program from 1947 to 1992. This map series includes about 54,000 map sheets for the conterminous United States and is the only uniform map series ever produced that covers this area at such a large scale. The series was partially revised under several programs, starting as early as 1968, but these programs were not adequate to keep the series current. Through the 1990s the emphasis of the USGS mapping program shifted away from topographic maps and toward more specialized digital data products. Topographic map revision dropped off rapidly after 1999, and stopped completely by 2004. Since 2001, emergency-response and homeland-security requirements have revived the question of whether a standard national topographic series is needed. Emergencies such as Hurricane Katrina in 2005 and the California wildfires of 2007-08 demonstrated that familiar maps are important to first responders. Maps that have a standard scale, extent, and grids help reduce confusion and save time in emergencies. Traditional maps are designed to allow the human brain to quickly process large amounts of information, and depend on artistic layout and design that cannot be fully automated. In spite of technical advances, creating a traditional, general-purpose topographic map is still expensive. Although the content and layout of traditional topographic maps probably are still desirable, the preferred packaging and delivery of maps has changed. Digital image files are now desired by most users, but to be useful to the emergency-response community, these files must be easy to view and easy to print without specialized geographic information system expertise or software.
Geologic map of the Sunnymead 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Matti, Jonathan C.
2001-01-01
a. This Readme, which includes, in Appendix I, the data contained in sun_met.txt
b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.)
The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbol, as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay. For example, Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf; grain size follows the f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the PostScript file (2 above).
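The grain-size suffix convention described above lends itself to simple parsing. A toy sketch, assuming a small list of known base unit symbols (the real map's unit inventory is much larger):

    GRAIN_SIZES = {"lg": "large boulders", "b": "boulder", "g": "gravel",
                   "a": "arenaceous", "s": "silt", "c": "clay"}

    def parse_unit(symbol: str, base_units=("Qyf", "Qyf2", "Qfy", "Qomf")):
        """Split a map-unit symbol like 'Qyfa' or 'Qyf2sc' into a known base
        unit and its grain-size suffix letters, per the convention above."""
        base = max((b for b in base_units if symbol.startswith(b)),
                   key=len, default=None)
        if base is None:
            return symbol, []
        suffix, sizes, i = symbol[len(base):], [], 0
        while i < len(suffix):
            if suffix[i:i + 2] == "lg":
                sizes.append(GRAIN_SIZES["lg"]); i += 2
            else:
                sizes.append(GRAIN_SIZES.get(suffix[i], "unknown")); i += 1
        return base, sizes

    print(parse_unit("Qyfa"))    # ('Qyf', ['arenaceous'])
    print(parse_unit("Qfysa"))   # ('Qfy', ['silt', 'arenaceous']): silty sand
    print(parse_unit("Qyf2sc"))  # ('Qyf2', ['silt', 'clay'])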
Kawakami, Eiryo; Singh, Vivek K; Matsubara, Kazuko; Ishii, Takashi; Matsuoka, Yukiko; Hase, Takeshi; Kulkarni, Priya; Siddiqui, Kenaz; Kodilkar, Janhavi; Danve, Nitisha; Subramanian, Indhupriya; Katoh, Manami; Shimizu-Yoshida, Yuki; Ghosh, Samik; Jere, Abhay; Kitano, Hiroaki
2016-01-01
Cellular stress responses require exquisite coordination between intracellular signaling molecules to integrate multiple stimuli and actuate specific cellular behaviors. Deciphering the web of complex interactions underlying stress responses is a key challenge in understanding robust biological systems and has the potential to lead to the discovery of targeted therapeutics for diseases triggered by dysregulation of stress response pathways. We constructed large-scale molecular interaction maps of six major stress response pathways in Saccharomyces cerevisiae (baker’s or budding yeast). Biological findings from over 900 publications were converted into standardized graphical formats and integrated into a common framework. The maps are posted at http://www.yeast-maps.org/yeast-stress-response/ for browse and curation by the research community. On the basis of these maps, we undertook systematic analyses to unravel the underlying architecture of the networks. A series of network analyses revealed that yeast stress response pathways are organized in bow–tie structures, which have been proposed as universal sub-systems for robust biological regulation. Furthermore, we demonstrated a potential role for complexes in stabilizing the conserved core molecules of bow–tie structures. Specifically, complex-mediated reversible reactions, identified by network motif analyses, appeared to have an important role in buffering the concentration and activity of these core molecules. We propose complex-mediated reactions as a key mechanism mediating robust regulation of the yeast stress response. Thus, our comprehensive molecular interaction maps provide not only an integrated knowledge base, but also a platform for systematic network analyses to elucidate the underlying architecture in complex biological systems. PMID:28725465
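A minimal networkx sketch of the bow-tie decomposition referred to above: the core is the largest strongly connected component, IN is what reaches the core, OUT is what the core reaches. The toy graph is illustrative only.

    import networkx as nx

    def bow_tie(g: nx.DiGraph):
        """Decompose a directed network into bow-tie components:
        CORE (largest strongly connected component), IN (reaches the core),
        OUT (reached from the core), OTHERS (everything else)."""
        core = max(nx.strongly_connected_components(g), key=len)
        seed = next(iter(core))
        in_set = nx.ancestors(g, seed) - core
        out_set = nx.descendants(g, seed) - core
        others = set(g) - core - in_set - out_set
        return core, in_set, out_set, others

    g = nx.DiGraph([("stimulus", "kinase"), ("kinase", "tf"),
                    ("tf", "kinase"), ("tf", "target"), ("x", "y")])
    print(bow_tie(g))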
Zeng, Ping; Mukherjee, Sayan; Zhou, Xiang
2017-01-01
Epistasis, commonly defined as the interaction between multiple genes, is an important genetic component underlying phenotypic variation. Many statistical methods have been developed to model and identify epistatic interactions between genetic variants. However, because of the large combinatorial search space of interactions, most epistasis mapping methods face enormous computational challenges and often suffer from low statistical power due to multiple test correction. Here, we present a novel, alternative strategy for mapping epistasis: instead of directly identifying individual pairwise or higher-order interactions, we focus on mapping variants that have non-zero marginal epistatic effects—the combined pairwise interaction effects between a given variant and all other variants. By testing marginal epistatic effects, we can identify candidate variants that are involved in epistasis without the need to identify the exact partners with which the variants interact, thus potentially alleviating much of the statistical and computational burden associated with standard epistatic mapping procedures. Our method is based on a variance component model, and relies on a recently developed variance component estimation method for efficient parameter inference and p-value computation. We refer to our method as the “MArginal ePIstasis Test”, or MAPIT. With simulations, we show how MAPIT can be used to estimate and test marginal epistatic effects, produce calibrated test statistics under the null, and facilitate the detection of pairwise epistatic interactions. We further illustrate the benefits of MAPIT in a QTL mapping study by analyzing the gene expression data of over 400 individuals from the GEUVADIS consortium. PMID:28746338
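MAPIT itself fits a variance component model; as a crude, hedged stand-in for the idea of a marginal epistatic test, the sketch below screens one variant by an F-test of all pairwise interaction terms involving it, which is only practical for small numbers of variants.

    import numpy as np
    from scipy import stats

    def marginal_epistasis_ftest(G, y, k):
        """Crude marginal-epistasis screen for variant k: F-test comparing an
        additive-only model against one adding all pairwise interactions
        between variant k and the other variants.
        G: (n, p) genotype matrix; y: (n,) phenotype."""
        n, p = G.shape
        X0 = np.hstack([np.ones((n, 1)), G])            # additive model
        inter = G[:, [k]] * np.delete(G, k, axis=1)     # x_k * x_j terms
        X1 = np.hstack([X0, inter])
        rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
        rss0, rss1 = rss(X0), rss(X1)
        df1, df2 = inter.shape[1], n - X1.shape[1]
        F = ((rss0 - rss1) / df1) / (rss1 / df2)
        return stats.f.sf(F, df1, df2)                  # p-value

    rng = np.random.default_rng(1)
    G = rng.integers(0, 3, size=(500, 10)).astype(float)
    y = G[:, 2] * G[:, 7] + rng.normal(size=500)        # true 2x7 interaction
    print(marginal_epistasis_ftest(G, y, 2))            # small p-value expected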
Geologic map of the Cucamonga Peak 7.5' quadrangle, San Bernardino County, California
Morton, D.M.; Matti, J.C.; Digital preparation by Koukladas, Catherine; Cossette, P.M.
2001-01-01
a. This Readme, which includes, in Appendix I, the data contained in fif_met.txt
b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.)
The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the PostScript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Cucamonga Peak 7.5' topographic quadrangle in conjunction with the geologic map.
Geologic map of the Telegraph Peak 7.5' quadrangle, San Bernardino County, California
Morton, D.M.; Woodburne, M.O.; Foster, J.H.; Morton, Gregory; Cossette, P.M.
2001-01-01
a. This Readme, which includes, in Appendix I, the data contained in fif_met.txt
b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.)
The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the PostScript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Telegraph Peak 7.5' topographic quadrangle in conjunction with the geologic map.
Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.
Ivory, Catherine H
2016-07-01
The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.
Grimley, D.A.; Wang, J.-S.; Liebert, D.A.; Dawson, J.O.
2008-01-01
Flooded, saturated, or poorly drained soils are commonly anaerobic, leading to microbially induced magnetite/maghemite dissolution and decreased soil magnetic susceptibility (MS). Thus, MS is considerably higher in well-drained soils (MS typically 40-80 × 10-5 standard international [SI] units) than in poorly drained soils (MS typically 10-25 × 10-5 SI) in Illinois, other soil-forming factors being equal. Following calibration to standard soil probings, MS values can be used to rapidly and precisely delineate hydric from nonhydric soils in areas with relatively uniform parent material. Furthermore, soil MS has a moderate to strong association with individual tree species' distribution across soil moisture regimes, correlating inversely with independently reported rankings of a tree species' flood tolerance. Soil MS mapping can thus provide a simple, rapid, and quantitative means for precisely guiding reforestation with respect to plant species' adaptations to soil drainage classes. For instance, in native woodlands of east-central Illinois, Quercus alba, Prunus serotina, and Liriodendron tulipifera predominantly occur in moderately well-drained soils (MS 40-60 × 10-5 SI), whereas Acer saccharinum, Carya laciniosa, and Fraxinus pennsylvanica predominantly occur in poorly drained soils (MS <20 × 10-5 SI). Using a similar method, an MS contour map was used to guide restoration of mesic, wet mesic, and wet prairie species to pre-settlement distributions at Meadowbrook Park (Urbana, IL, U.S.A.). Through use of soil MS maps calibrated to soil drainage class and native vegetation occurrence, restoration efforts can be conducted more successfully and species distributions more accurately reconstructed at the microecosystem level. © 2008 Society for Ecological Restoration International.
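A trivial sketch of how calibrated MS values could be binned into drainage classes for mapping; the thresholds follow the approximate ranges quoted above and would need site-specific calibration before any real use.

    def drainage_class(ms_si_1e5: float) -> str:
        """Classify soil drainage from magnetic susceptibility (MS) using the
        approximate Illinois ranges described above (MS in 10-5 SI units).
        Thresholds are illustrative, not a validated calibration."""
        if ms_si_1e5 < 20:
            return "poorly drained (candidate hydric soil)"
        if ms_si_1e5 < 40:
            return "somewhat poorly to moderately drained (transitional)"
        return "well drained"

    for ms in (12, 30, 55):
        print(ms, "->", drainage_class(ms))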
Towards drought risk mapping on a pan-European scale
NASA Astrophysics Data System (ADS)
Blauhut, Veit; Gudmundsson, Lukas; Stahl, Kerstin; Seneviratne, Sonia
2014-05-01
Drought is a very complex and multifarious natural hazard that causes a variety of direct and indirect environmental and socio-economic impacts. Over the last 30 years, droughts in Europe have caused over 100 billion Euros of losses from impacts in various sectors, e.g. agriculture, water quality, or energy production. Despite the apparent importance of this hazard, observed pan-European drought impacts have not yet been quantitatively related to the most important climatological drivers. Fundamentally, a common approach to describe drought risk on a pan-European scale is still missing. This contribution presents an approach for linking climatological drought indices with observed drought impacts at the European scale. The standardized precipitation index (SPI) and standardized precipitation and evapotranspiration index (SPEI) for different time scales were calculated based on E-OBS data and are used to describe the drought hazard. Data from the European Drought Impact Inventory (EDII), compiled by the EU FP7 DROUGHT-R&SPI (Fostering European Drought Research and Science-Policy Interfacing) project, are used as a proxy for multi-sectorial (impact-category) vulnerability, following the assumption that a reported impact reflects a region's vulnerability to the hazard. Drought risk is then modelled statistically by applying logistic regression to estimate the probability of impact-report occurrence as a function of SPI and SPEI. This approach finally allows mapping the probability of drought impact occurrence on a year-by-year basis. The emerging patterns compare well to many known European drought events. Such maps may become an essential component of drought risk management to foster resilience to this hazard at the large scale.
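A hedged sketch of the statistical core, logistic regression of impact-report occurrence on drought indices, using synthetic data in place of E-OBS-derived SPI/SPEI values and EDII impact reports.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Synthetic training data: one row per region-year, columns are drought
    # indices (stand-ins for, e.g., SPI-3 and SPEI-12); negative = drier.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 2))
    p_true = 1 / (1 + np.exp(-(-1.0 - 1.5 * X[:, 0] - 1.0 * X[:, 1])))
    y = rng.random(1000) < p_true              # True = impact reported

    model = LogisticRegression().fit(X, y)
    # Probability of an impact report in a dry year (SPI = SPEI = -1.5):
    print(model.predict_proba([[-1.5, -1.5]])[0, 1])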
Jung, Bo Kyeung; Kim, Jeeyong; Cho, Chi Hyun; Kim, Ju Yeon; Nam, Myung Hyun; Shin, Bong Kyung; Rho, Eun Youn; Kim, Sollip; Sung, Heungsup; Kim, Shinyoung; Ki, Chang Seok; Park, Min Jung; Lee, Kap No; Yoon, Soo Young
2017-04-01
The National Health Information Standards Committee was established in 2004 in Korea, and its practical subcommittee for laboratory test terminology was placed in charge of standardizing laboratory medicine terminology in Korean. We aimed to establish a standardized Korean laboratory terminology database, Korea-Logical Observation Identifier Names and Codes (K-LOINC), based on former products sponsored by this committee. The primary product was revised based on the opinions of specialists. Next, we mapped the electronic data interchange (EDI) codes that were revised in 2014 to the corresponding K-LOINC. We established a database of synonyms, including the laboratory codes of three reference laboratories and four tertiary hospitals in Korea. Furthermore, we supplemented the clinical microbiology section of K-LOINC using an alternative mapping strategy, and we examined other systems that utilize laboratory codes in order to investigate the compatibility of K-LOINC with statistical standards for a number of tests. A total of 48,990 laboratory codes were adopted (21,539 new and 16,330 revised). All of the LOINC synonyms were translated into Korean, and 39,347 Korean synonyms were added. Moreover, 21,773 synonyms were added from reference laboratories and tertiary hospitals. Alternative strategies were established for mapping within the microbiology domain; when we applied these to a smaller hospital, the mapping rate increased successfully. Finally, we confirmed K-LOINC compatibility with other statistical standards, including a newly proposed EDI code system. This project successfully established an up-to-date standardized Korean laboratory terminology database, as well as an updated EDI mapping, to facilitate the introduction of standard terminology into institutions. © 2017 The Korean Academy of Medical Sciences.
Development of two socioeconomic indices for Saudi Arabia.
AlOmar, Reem S; Parslow, Roger C; Law, Graham R
2018-06-26
Health and socioeconomic status (SES) are linked in studies worldwide. Measures of SES exist for many countries, but not for Saudi Arabia (SA). We describe two indices of area-based SES for SA. Routine census data were used to construct two indices of SES at the geographically delimited administrative level of Governorates in SA (n = 118). The data included indicators of educational status, employment status, and car and material ownership. A continuous measure of SES was constructed using exploratory factor analysis (EFA) and a categorical measure using latent class analysis (LCA). Both indices were mapped by Governorate. The EFA identified three factors: the first explained 51.58% of the common variance within the interrelated factors, the second 15.14%, and the third 14.26%. These proportions were used in the formulation of the standard index, with scores fixed to range from 100 for the most affluent Governorate to 0 for the most deprived. The LCA found a four-class model to be the best fit: class 1 was termed "affluent" and included 11.01% of Governorates, class 2 "upper middle class" (44.91%), class 3 "lower middle class" (33.05%), and class 4 "deprived" (11.01%). The populated, urbanised Governorates were found to be the most affluent, whereas the smaller rural Governorates were the most deprived. This is the first description of measures of SES in SA at a geographical level. Two measures have been successfully constructed and mapped; the maps show similar patterns, suggesting validity. Both indices support the common perception of SES in SA.
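A rough Python analogue of the EFA-based index construction: factor scores weighted by their reported shares of common variance, rescaled to 0-100. The data are synthetic and the weighting scheme is a simplification of the paper's formulation.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler, minmax_scale

    rng = np.random.default_rng(0)
    # Synthetic stand-in: 118 governorates x 8 census indicators
    # (e.g., education, employment, car and material ownership rates).
    latent = rng.normal(size=(118, 3))
    X = latent @ rng.normal(size=(3, 8)) + rng.normal(scale=0.5, size=(118, 8))

    Z = StandardScaler().fit_transform(X)
    scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(Z)

    # Weight the factors by their share of explained common variance (the
    # abstract reports 51.58%, 15.14%, 14.26%) and rescale to 0-100.
    w = np.array([51.58, 15.14, 14.26])
    index = minmax_scale(scores @ (w / w.sum())) * 100   # 100 = most affluent
    print(index[:5].round(1))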
NASA Technical Reports Server (NTRS)
Estefan, J. A.; Sovers, O. J.
1994-01-01
The standard tropospheric calibration model implemented in the operational Orbit Determination Program is the seasonal model developed by C. C. Chao in the early 1970s. The seasonal model has seen only slight modification since its release, particularly in the format and content of the zenith delay calibrations. Chao's most recent standard mapping tables, which are used to project the zenith delay calibrations along the station-to-spacecraft line of sight, have not been modified since they were first published in late 1972. This report focuses principally on proposed upgrades to the zenith delay mapping process, although modeling improvements to the zenith delay calibration process are also discussed. A number of candidate approximation models for the tropospheric mapping are evaluated, including the semi-analytic mapping function of Lanyi and the semi-empirical mapping functions of Davis et al. ('CfA-2.2'), of Ifadis (global solution model), of Herring ('MTT'), and of Niell ('NMF'). All of the candidate mapping functions are superior to the Chao standard mapping tables and approximation formulas when evaluated against the current Deep Space Network Mark 3 intercontinental very long baseline interferometry database.
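For concreteness, a sketch of a Chao-style continued-fraction mapping function, which scales a zenith tropospheric delay to a given elevation angle. The dry/wet coefficients below are the values commonly quoted in the literature for Chao's approximation formulas and should be checked against the original tables before any operational use.

    import numpy as np

    def chao_mapping(elevation_deg, wet=False):
        """Chao-style mapping function: ratio of line-of-sight delay to
        zenith delay at a given elevation. Coefficients are assumptions
        taken from commonly cited values."""
        a, b = (0.00035, 0.017) if wet else (0.00143, 0.0445)
        e = np.radians(elevation_deg)
        return 1.0 / (np.sin(e) + a / (np.tan(e) + b))

    for elev in (90, 30, 10, 6):
        print(elev, round(chao_mapping(elev), 3),
              round(chao_mapping(elev, wet=True), 3))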
NASA Astrophysics Data System (ADS)
Kosugi, Akito; Takemi, Mitsuaki; Tia, Banty; Castagnola, Elisa; Ansaldo, Alberto; Sato, Kenta; Awiszus, Friedemann; Seki, Kazuhiko; Ricci, Davide; Fadiga, Luciano; Iriki, Atsushi; Ushiba, Junichi
2018-06-01
Objective. The motor map has been widely used as an indicator of motor skills and learning, cortical injury, plasticity, and functional recovery. Cortical stimulation mapping using epidural electrodes has recently been adopted for animal studies. However, several technical limitations remain. Test-retest reliability of epidural cortical stimulation (ECS) mapping has not been examined in detail, and many previous studies defined evoked movements and motor thresholds by visual inspection and thus lacked quantitative measurement. A reliable and quantitative motor map is important for elucidating the mechanisms of motor cortical reorganization. The objective of the current study was to perform reliable ECS mapping of motor representations in common marmosets, based on motor thresholds stochastically estimated from motor evoked potentials recorded via chronically implanted micro-electrocorticographical (µECoG) electrode arrays. Approach. ECS was applied using the implanted µECoG electrode arrays in three adult common marmosets under awake conditions. Motor evoked potentials were recorded through electromyographical electrodes implanted in upper limb muscles. The motor threshold was calculated through a modified maximum-likelihood threshold-hunting algorithm fitted with the recorded data from marmosets, and a computer simulation confirmed the reliability of the algorithm. Main results. The computer simulation suggested that the modified maximum-likelihood threshold-hunting algorithm enabled estimation of the motor threshold with acceptable precision. In vivo ECS mapping showed high test-retest reliability with respect to the excitability and location of the cortical forelimb motor representations. Significance. Using implanted µECoG electrode arrays and a modified motor threshold-hunting algorithm, we were able to achieve reliable motor mapping in common marmosets with the ECS system.
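A schematic of maximum-likelihood threshold hunting, not the authors' modified algorithm: each trial's response updates a likelihood over candidate thresholds, and the next stimulus is placed at the current maximum-likelihood estimate. The logistic slope and intensity grid are arbitrary.

    import numpy as np

    def ml_threshold_hunting(respond, intensities, trials=30, slope=0.5):
        """Estimate a response threshold by stimulating, after each trial,
        at the current maximum-likelihood threshold under a logistic
        response model."""
        thetas = np.asarray(intensities, dtype=float)  # candidate thresholds
        loglik = np.zeros_like(thetas)
        stim = thetas[len(thetas) // 2]                # start mid-range
        for _ in range(trials):
            r = respond(stim)                          # True if MEP evoked
            p = 1.0 / (1.0 + np.exp(-slope * (stim - thetas)))
            loglik += np.log(p if r else 1.0 - p)
            stim = thetas[np.argmax(loglik)]           # next stimulus at MLE
        return stim

    rng = np.random.default_rng(0)
    true_threshold = 43.0
    respond = lambda s: rng.random() < 1 / (1 + np.exp(-0.5 * (s - true_threshold)))
    print(ml_threshold_hunting(respond, np.arange(20, 80)))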
A cytological-physical map of 22q11
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindsay, E.A.; Rizzu, P.; Gaddini, L.
Our laboratory is involved in the construction of a cytological-physical map of 22q11 and the isolation of expressed sequences from the region involved in DiGeorge syndrome (DGS) and Velo-Cardio-Facial syndrome (VCFS). One of the goals of the mapping is an understanding of the molecular mechanisms that generate the 22q11 microdeletions observed with high frequency in DGS and VCFS. Of over 60 deleted patients studied in our laboratory, all but one were deleted for two loci approximately 1-2 Mb apart. There is evidence from patients with balanced and unbalanced translocations that deletion of the whole region is not necessary for determination of the clinical phenotype. Therefore, it is possible that deletion breakpoints occur as a consequence of structural characteristics of the DNA that predispose to rearrangements. A striking characteristic of the 22q11 region is the abundance of low-copy repeat sequences, and it is reasonable to think that recombination between these repeats may lead to microdeletions. However, a direct demonstration of such a mechanism is not yet available. The presence of repeats often makes standard physical mapping techniques based on hybridization or STS mapping difficult to interpret. For example, we have found clones positive for the same STS that are located in different positions within 22q11. For this reason we have used high-resolution cytological mapping as a supporting technique for map validation. We present the current status of the map, which includes known polymorphic and non-polymorphic loci, newly isolated clones, and chromosomal deletion breakpoints. The map extends from the loci D22S9/D22S24 to TOP1P2. Extended chromatin hybridization experiments visually demonstrate the presence of at least two repeat islands flanking (or at) the region where chromosomal breakpoints of the commonly deleted region occur.
Preliminary geologic map of the Elsinore 7.5' Quadrangle, Riverside County, California
Morton, Douglas M.; Weber, F. Harold; Digital preparation: Alvarez, Rachel M.; Burns, Diane
2003-01-01
Open-File Report 03-281 contains a digital geologic map database of the Elsinore 7.5' quadrangle, Riverside County, California, that includes:
1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map.
2. A PostScript file to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map.
3. Portable Document Format (.pdf) files of:
a. This Readme, which includes, in Appendix I, the data contained in els_met.txt
b. The same graphic as plotted in 2 above. (Test plots have not produced precise 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.)
The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbol, as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay. For example, Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the PostScript file (2 above).
Preliminary geologic map of the northeast Dillingham quadrangle (D-1, D-2, C-1, and C-2), Alaska
Wilson, Frederic H.; Hudson, Travis L.; Grybeck, Donald; Stoeser, Douglas B.; Preller, Cindi C.; Bickerstaff, Damon; Labay, Keith A.; Miller, Martha L.
2003-01-01
The Correlation of Map Units and Description of Map Units are in a format similar to that of the USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the Stratigraphic Nomenclature of the U.S. Geological Survey. ARC/INFO symbolsets (shade and line) as used for these maps have been made available elsewhere as part of Geologic map of Central (Interior) Alaska, published as a USGS Open-File Report (Wilson and others, 1998, http://geopubs.wr.usgs.gov/open-file/of98-133-a/). This product does not include the digital topographic base or land-grid files used to produce the map, nor does it include the AML and related ancillary key and other files used to assemble the components of the map.
A population MRI brain template and analysis tools for the macaque.
Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam
2018-04-15
The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier. All rights reserved.
Tock, Andrew J.; Fourie, Deidré; Walley, Peter G.; Holub, Eric B.; Soler, Alvaro; Cichy, Karen A.; Pastor-Corrales, Marcial A.; Song, Qijian; Porch, Timothy G.; Hart, John P.; Vasconcellos, Renato C. C.; Vicente, Joana G.; Barker, Guy C.; Miklas, Phillip N.
2017-01-01
Pseudomonas syringae pv. phaseolicola (Psph) Race 6 is a globally prevalent and broadly virulent bacterial pathogen with devastating impact causing halo blight of common bean (Phaseolus vulgaris L.). Common bean lines PI 150414 and CAL 143 are known sources of resistance against this pathogen. We constructed high-resolution linkage maps for three recombinant inbred populations to map resistance to Psph Race 6 derived from the two common bean lines. This was complemented with a genome-wide association study (GWAS) of Race 6 resistance in an Andean Diversity Panel of common bean. Race 6 resistance from PI 150414 maps to a single major-effect quantitative trait locus (QTL; HB4.2) on chromosome Pv04 and confers broad-spectrum resistance to eight other races of the pathogen. Resistance segregating in a Rojo × CAL 143 population maps to five chromosome arms and includes HB4.2. GWAS detected one QTL (HB5.1) on chromosome Pv05 for resistance to Race 6 with significant influence on seed yield. The same HB5.1 QTL, found in both Canadian Wonder × PI 150414 and Rojo × CAL 143 populations, was effective against Race 6 but lacks broad resistance. This study provides evidence for marker-assisted breeding for more durable halo blight control in common bean by combining alleles of race-nonspecific resistance (HB4.2 from PI 150414) and race-specific resistance (HB5.1 from cv. Rojo). PMID:28736566
A HapMap harvest of insights into the genetics of common disease
Manolio, Teri A.; Brooks, Lisa D.; Collins, Francis S.
2008-01-01
The International HapMap Project was designed to create a genome-wide database of patterns of human genetic variation, with the expectation that these patterns would be useful for genetic association studies of common diseases. This expectation has been amply fulfilled with just the initial output of genome-wide association studies, identifying nearly 100 loci for nearly 40 common diseases and traits. These associations provided new insights into pathophysiology, suggesting previously unsuspected etiologic pathways for common diseases that will be of use in identifying new therapeutic targets and developing targeted interventions based on genetically defined risk. In addition, HapMap-based discoveries have shed new light on the impact of evolutionary pressures on the human genome, suggesting multiple loci important for adapting to disease-causing pathogens and new environments. In this review we examine the origin, development, and current status of the HapMap; its prospects for continued evolution; and its current and potential future impact on biomedical science. PMID:18451988
Italian Present-day Stress Indicators: IPSI Database
NASA Astrophysics Data System (ADS)
Mariucci, M. T.; Montone, P.
2017-12-01
In Italy, since the 1990s, research on the contemporary stress field has been under way at Istituto Nazionale di Geofisica e Vulcanologia (INGV) through local- and regional-scale studies. Over the years many data have been analysed and collected; they are now organized and available for easy end-use online. The IPSI (Italian Present-day Stress Indicators) database is the first geo-referenced repository of information on the crustal present-day stress field maintained at INGV, with web application and website development by Gabriele Tarabusi. Data consist of horizontal stress orientations analysed and compiled in a standardized format and quality-ranked for reliability and comparability on a global scale with other databases. Our first database release includes 855 data records updated to December 2015. Here we present an updated version, to be released in 2018 after new earthquake data entry up to December 2017. The IPSI web site (http://ipsi.rm.ingv.it/) provides access to data on a standard map viewer and makes it easy to choose which data (category and/or quality) to plot. The main information for each element (type, quality, orientation) can be viewed by hovering over the related symbol; all the information appears on clicking the element. At the same time, basic information on the different data types, tectonic regime assignment, and quality-ranking method is available through pop-up windows. Data records can be downloaded in several common formats; moreover, it is possible to download a file directly usable with SHINE, a web-based application for interpolating stress orientations (http://shine.rm.ingv.it). IPSI is mainly conceived for those interested in studying the characteristics of the Italian peninsula and its surroundings, although the Italian data are part of the World Stress Map (http://www.world-stress-map.org/), as evidenced by many links that redirect to that database for more details on standard practices in this field.
Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows
NASA Astrophysics Data System (ADS)
Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.
2017-06-01
The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.
Stoeser, Douglas B.; Green, Gregory N.; Morath, Laurie C.; Heran, William D.; Wilson, Anna B.; Moore, David W.; Van Gosen, Bradley S.
2005-01-01
The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and lithology information. Such maps can be conveniently used to generate derivative maps for purposes including mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This Open-File Report is a preliminary version of part of a series of integrated state geologic map databases that cover the entire United States. The only national-scale digital geologic maps that portray most or all of the conterminous United States are the digital version of the King and Beikman (1974a, b) map at a scale of 1:2,500,000, as digitized by Schruben and others (1994), and the digital version of the Geologic Map of North America (Reed and others, 2005a, b), compiled at a scale of 1:5,000,000, which is currently being prepared by the U.S. Geological Survey. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. In a few cases, new digital compilations were prepared (e.g., OH, SC, SD) or existing paper maps were digitized (e.g., KY, TX). For Alaska and Hawaii, new regional maps are being compiled, and ultimately new state maps will be produced. The digital geologic maps are presented in standardized formats as ARC/INFO (.e00) export files and as ArcView shape (.shp) files. Accompanying these spatial databases is a set of five supplemental data tables that relate the map units to detailed lithologic and age information. The maps for the CONUS have been fitted to a common set of state boundaries based on the 1:100,000 topographic map series of the United States Geological Survey (USGS). When the individual state maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps. No attempt has been made to reconcile differences in mapped geology across state lines. This is the first version of this product; it will subsequently be updated to include four additional states (North Dakota, South Dakota, Nebraska, and Iowa).
Kang, Ju-Hee; Vanderstichele, Hugo; Trojanowski, John Q; Shaw, Leslie M
2012-04-01
The xMAP-Luminex multiplex platform for measurement of Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers using Innogenetics AlzBio3 immunoassay reagents, which are for research use only, has been shown to be an effective tool for early detection of an AD-like biomarker signature based on concentrations of CSF Aβ(1-42), t-tau and p-tau(181). Among the several advantages of the xMAP-Luminex platform for AD CSF biomarkers are: a wide dynamic range of ready-to-use calibrators, time savings from the simultaneous analysis of three biomarkers in one analytical run, reduction of human error, potential for reduced reagent costs, and a modest reduction of sample volume as compared to conventional enzyme-linked immunosorbent assay (ELISA) methodology. Recent clinical studies support the use of CSF Aβ(1-42), t-tau and p-tau(181) measurement using the xMAP-Luminex platform for the early detection of AD pathology in cognitively normal individuals, and for prediction of progression to AD dementia in subjects with mild cognitive impairment (MCI). Studies that have shown the prediction of risk for progression to AD dementia in MCI patients provide the basis for the use of CSF Aβ(1-42), t-tau and p-tau(181) testing to assign risk for progression in patients enrolled in therapeutic trials. Furthermore, emerging study data suggest that these pathologic changes occur in cognitively normal subjects 20 or more years before the onset of clinically detectable memory changes, thus providing an objective measurement for use in the assessment of treatment effects in primary treatment trials. However, numerous previous ELISA and Luminex-based multiplex studies reported a wide range of absolute values of CSF Aβ(1-42), t-tau and p-tau(181), indicative of substantial inter-laboratory variability as well as varying degrees of intra-laboratory imprecision. In order to address these issues, a recent inter-laboratory investigation, which included a common set of CSF pool aliquots from controls as well as AD patients spanning a range of normal and pathological Aβ(1-42), t-tau and p-tau(181) values, together with agreed-on standard operating procedures (SOPs), assessed the reproducibility of the multiplex methodology and Innogenetics AlzBio3 immunoassay reagents. This study showed within-center precision values of 5% to a little more than 10% and good inter-laboratory %CV values (10-20%). There are several likely factors influencing the variability of CSF Aβ(1-42), t-tau and p-tau(181) measurements. In this review, we describe the pre-analytical, analytical and post-analytical sources of variability, including sources inherent to the kits, and describe procedures to decrease the variability. A CSF AD biomarker Quality Control program has been established and funded by the Alzheimer's Association, and global efforts are underway to further define optimal pre-analytical SOPs and best practices for the methodologies available or in development, including plans for production of a standard reference material that could provide a common standard against which manufacturers of immunoassay kits would assign calibration standard values. Copyright © 2012 Elsevier Inc. All rights reserved.
GIS-Based Crop Support System for Common Oat and Naked Oat in China
NASA Astrophysics Data System (ADS)
Wan, Fan; Wang, Zhen; Li, Fengmin; Cao, Huhua; Sun, Guojun
The identification of suitable areas for common oat (Avena sativa L.) and naked oat (Avena nuda L.) in China using a Multi-Criteria Evaluation (MCE) approach based on GIS is presented in the current article. Climate, topography, soil, land use and oat variety databases were created. Relevant criteria, suitability levels and their weights for each factor were defined. The criteria maps were then generated and fed into the MCE process, and suitability maps for common oat and naked oat were created. The land use and suitability maps were crossed to identify the suitable areas for each crop. The results identified 397,720 km2 of suitable areas for common oat for forage purposes, distributed over 744 counties in 17 provinces, and 556,232 km2 of suitable areas for naked oat for grain purposes, distributed over 779 counties in 19 provinces. This result is in accordance with the distribution of farming-pastoral ecozones located in the semi-arid regions of northern China. The mapped areas can help define the working limits and serve as indicative zones for oat in China. The created databases, mapped results, expert system interface and relevant hardware facilities together constitute a complete crop support system for oats.
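The abstract does not spell out the aggregation rule used; a common form for GIS-based MCE, given here only as an illustrative assumption, is a weighted linear combination of standardized factor scores masked by Boolean constraints:

S = \Big( \sum_{i=1}^{n} w_i x_i \Big) \prod_{j=1}^{m} c_j ,

where x_i is the standardized suitability score of factor i (climate, topography, soil, and so on), w_i its weight with \sum_i w_i = 1, and c_j \in \{0, 1\} a Boolean constraint such as an excluded land-use class.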
Geologic map of the Riverside East 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Cox, Brett F.
2001-01-01
a. This Readme; includes in Appendix I, data contained in rse_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Corona North 7.5' quadrangle, Riverside and San Bernardino counties, California
Morton, Douglas M.; Gray, C.H.; Bovard, Kelly R.; Dawson, Michael
2002-01-01
a. This Readme; includes in Appendix I, data contained in crn_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Corona South 7.5' quadrangle, Riverside and Orange counties, California
Gray, C.H.; Morton, Douglas M.; Weber, F. Harold; Digital preparation by Bovard, Kelly R.; O'Brien, Timothy
2002-01-01
a. A Readme file; includes in Appendix I, data contained in crs_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Lake Mathews 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Weber, F. Harold
2001-01-01
a. This Readme; includes in Appendix I, data contained in lkm_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Steele Peak 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; digital preparation by Alvarez, Rachel M.; Diep, Van M.
2001-01-01
a. This Readme; includes in Appendix I, data contained in stp_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Riverside West 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Cox, Brett F.
2001-01-01
a. This Readme; includes in Appendix I, data contained in rsw_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Common fixed point theorems for maps under a contractive condition of integral type
NASA Astrophysics Data System (ADS)
Djoudi, A.; Merghadi, F.
2008-05-01
Two common fixed point theorems for mappings of a complete metric space satisfying a general contractive inequality of integral type and minimal commutativity conditions are proved. These results extend and improve several previous results, particularly Theorem 4 of Rhoades [B.E. Rhoades, Two fixed point theorems for mappings satisfying a general contractive condition of integral type, Int. J. Math. Math. Sci. 63 (2003) 4007-4013] and Theorem 4 of Sessa [S. Sessa, On a weak commutativity condition of mappings in fixed point considerations, Publ. Inst. Math. (Beograd) (N.S.) 32 (46) (1982) 149-153].
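For context, the basic contractive condition of integral type (due to Branciari, and generalized by the results above) requires, for a self-map f of a complete metric space (X, d),

\int_0^{d(fx, fy)} \varphi(t)\, dt \;\le\; c \int_0^{d(x, y)} \varphi(t)\, dt , \qquad 0 \le c < 1,

for all x, y \in X, where \varphi : [0, \infty) \to [0, \infty) is Lebesgue-integrable, summable on each compact subset, and satisfies \int_0^{\varepsilon} \varphi(t)\, dt > 0 for every \varepsilon > 0. Taking \varphi \equiv 1 recovers the ordinary Banach contraction, which is why theorems of this type strictly generalize the classical ones.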
A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS
Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.
2011-01-01
Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features of planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be made entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data such that every researcher can now access digital mapping facilities on his desktop. The efforts to maintain planetary missions data accessible to the scientific community have led to the creation of standardized digital archives that facilitate the access to different datasets by software capable of processing these data from the raw level to the map projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, the analysis, and the retrieval of spatially referenced Earth-based environmental geodata; over the last decade these computer programs have become popular among the planetary science community, and recent mission data are starting to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey, represents the state of the art for processing planetary remote sensing data, from the raw unprocessed state to the map projected product. The second, the Geographic Resources Analysis Support System (GRASS), is a Geographic Information System developed by an international team of developers, and one of the core projects promoted by the Open Source Geospatial Foundation (OSGeo). We have worked on enabling the combined use of these software systems through the set-up of a common user interface, the unification of the cartographic reference system nomenclature and the minimization of data conversion. Both software packages are distributed with free open source licenses, as are the source code, scripts and configuration files presented hereafter. In this paper we describe our work done to merge these working environments into a common one, where the user benefits from the functionalities of both systems without the need to switch or transfer data from one software suite to the other. Thereafter we provide an example of its usage in the handling of planetary data and the crafting of a digital geologic map. © 2010 Elsevier Ltd. All rights reserved.
Rowan, L.C.; Mars, J.C.
2003-01-01
Evaluation of an Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the Mountain Pass, California area indicates that several important lithologic groups can be mapped in areas with good exposure by using spectral-matching techniques. The three visible and six near-infrared bands, which have 15-m and 30-m resolution, respectively, were calibrated by using in situ measurements of spectral reflectance. Calcitic rocks were distinguished from dolomitic rocks by using matched-filter processing in which image spectra were used as references for selected spectral categories. Skarn deposits and associated bright coarse marble were mapped in contact metamorphic zones related to intrusion of Mesozoic and Tertiary granodioritic rocks. Fe-muscovite, which is common in these intrusive rocks, was distinguished from Al-muscovite present in granitic gneisses and Mesozoic granite. Quartzose rocks were readily discriminated, and carbonate rocks were mapped as a single broad unit through analysis of the 90-m resolution, five-band surface emissivity data, which are produced as a standard product at the EROS Data Center. Three additional classes resulting from spectral-angle mapper processing ranged from (1) a broad granitic rock class, to (2) predominantly granodioritic rocks, and (3) a more mafic class consisting mainly of mafic gneiss, amphibolite and variable mixtures of carbonate rocks and silicate rocks. © 2002 Elsevier Science Inc. All rights reserved.
Identifying fMRI Model Violations with Lagrange Multiplier Tests
Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor
2013-01-01
The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665
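For reference, the generic form of the Lagrange multiplier (score) statistic on which such tests are built (a sketch of the general construction, not the paper's exact voxelwise statistics) is

LM = s(\hat\theta_0)^{\top} \, \mathcal{I}(\hat\theta_0)^{-1} \, s(\hat\theta_0) \;\sim\; \chi^2_q \quad \text{asymptotically under the null},

where s(\cdot) is the score (gradient of the log-likelihood) of the augmented model, \mathcal{I}(\cdot) is the Fisher information, both evaluated at the parameter estimate \hat\theta_0 of the restricted (null) model, and q is the number of added parameters being tested. Only the null model ever has to be fitted, which is what makes the voxelwise computation cheap enough to add to a conventional analysis.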
Standardization of mapping practices in the British Geological Survey
NASA Astrophysics Data System (ADS)
Allen, Peter M.
1997-07-01
Because the British Geological Survey (BGS) has had, since its foundation in 1835, a mandate to produce geological maps for the whole of Great Britain, there is a long history of introducing standard practices in the way rocks and rock units have been named, classified and illustrated on maps. The reasons for the failure of some of these practices are examined and assessed in relation to the needs of computerized systems for holding and disseminating geological information.
Geologic map and digital database of the Romoland 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Digital preparation by Bovard, Kelly R.; Morton, Gregory
2003-01-01
Portable Document Format (.pdf) files of: This Readme; includes in Appendix I, data contained in rom_met.txt The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). This Readme file describes the digital data, such as types and general contents of files making up the database, and includes information on how to extract and plot the map and accompanying graphic file. Metadata information can be accessed at http://geo-nsdi.er.usgs.gov/metadata/open-file/03-102 and is included in Appendix I of this Readme.
Peterson, Kevin J.; Pathak, Jyotishman
2014-01-01
Automated execution of electronic Clinical Quality Measures (eCQMs) from electronic health records (EHRs) on large patient populations remains a significant challenge, and the testability, interoperability, and scalability of measure execution are critical. The High Throughput Phenotyping (HTP; http://phenotypeportal.org) project aligns with these goals by using the standards-based HL7 Health Quality Measures Format (HQMF) and Quality Data Model (QDM) for measure specification, as well as Common Terminology Services 2 (CTS2) for semantic interpretation. The HQMF/QDM representation is automatically transformed into a JBoss® Drools workflow, enabling horizontal scalability via clustering and MapReduce algorithms. Using Project Cypress, automated verification metrics can then be produced. Our results show linear scalability for nine executed 2014 Center for Medicare and Medicaid Services (CMS) eCQMs for eligible professionals and hospitals for >1,000,000 patients, and verified execution correctness of 96.4% based on Project Cypress test data of 58 eCQMs. PMID:25954459
Harrison, Thomas C; Sigler, Albrecht; Murphy, Timothy H
2009-09-15
We describe a simple and low-cost system for intrinsic optical signal (IOS) imaging using stable LED light sources, basic microscopes, and commonly available CCD cameras. IOS imaging measures activity-dependent changes in the light reflectance of brain tissue, and can be performed with a minimum of specialized equipment. Our system uses LED ring lights that can be mounted on standard microscope objectives or video lenses to provide a homogeneous and stable light source, with less than 0.003% fluctuation across images averaged from 40 trials. We describe the equipment and surgical techniques necessary for both acute and chronic mouse preparations, and provide software that can create maps of sensory representations from images captured by inexpensive 8-bit cameras or by 12-bit cameras. The IOS imaging system can be adapted to commercial upright microscopes or custom macroscopes, eliminating the need for dedicated equipment or complex optical paths. This method can be combined with parallel high resolution imaging techniques such as two-photon microscopy.
MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit
Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer
2012-01-01
MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188
MethPrimer: designing primers for methylation PCRs.
Li, Long-Cheng; Dahiya, Rajvir
2002-11-01
DNA methylation is an epigenetic mechanism of gene regulation. Bisulfite-conversion-based PCR methods, such as bisulfite sequencing PCR (BSP) and methylation specific PCR (MSP), remain the most commonly used techniques for methylation mapping. Existing primer design programs developed for standard PCR cannot handle primer design for bisulfite-conversion-based PCRs due to changes in DNA sequence context caused by bisulfite treatment and the many special constraints on both the primers and the region to be amplified in such experiments. Therefore, the present study was designed to develop a program for such applications. MethPrimer, based on Primer3, is a program for designing PCR primers for methylation mapping. It first takes a DNA sequence as its input and searches the sequence for potential CpG islands. Primers are then picked around the predicted CpG islands or around regions specified by users. MethPrimer can design primers for BSP and MSP. Results of primer selection are delivered through a web browser in text and in graphic view.
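To illustrate the CpG-island search step, the sketch below tests a single sequence window against the classic Gardiner-Garden and Frommer criteria (GC content >= 50%, observed/expected CpG ratio >= 0.6, length >= 200 bp). The thresholds, window handling and function names are illustrative assumptions, not MethPrimer's actual internals.

```cpp
// Hypothetical CpG-island window test (Gardiner-Garden & Frommer criteria).
// Thresholds and structure are illustrative, not MethPrimer's implementation.
#include <iostream>
#include <string>

bool isCpGIslandWindow(const std::string& seq) {
    const double n = static_cast<double>(seq.size());
    if (n < 200) return false;                  // minimum island length
    double c = 0, g = 0, cg = 0;
    for (size_t i = 0; i < seq.size(); ++i) {
        if (seq[i] == 'C') ++c;
        if (seq[i] == 'G') ++g;
        if (seq[i] == 'C' && i + 1 < seq.size() && seq[i + 1] == 'G') ++cg;
    }
    const double gcContent = (c + g) / n;       // fraction of C and G bases
    // Observed/expected CpG ratio: CpG count vs. the (C*G)/N expectation.
    const double obsExp = (c > 0 && g > 0) ? cg / ((c * g) / n) : 0.0;
    return gcContent >= 0.5 && obsExp >= 0.6;
}

int main() {
    std::string window(200, 'A');
    for (size_t i = 0; i + 1 < window.size(); i += 2) {
        window[i] = 'C';                        // build a synthetic CG-rich
        window[i + 1] = 'G';                    // window for the demo
    }
    std::cout << (isCpGIslandWindow(window) ? "island-like" : "not island-like")
              << '\n';
    return 0;
}
```

In a full scanner one would slide such a window along the input sequence and merge overlapping qualifying windows into island calls before picking primers around them.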
Blind decomposition of Herschel-HIFI spectral maps of the NGC 7023 nebula
NASA Astrophysics Data System (ADS)
Berné, O.; Joblin, C.; Deville, Y.; Pilleri, P.; Pety, J.; Teyssier, D.; Gerin, M.; Fuente, A.
2012-12-01
Large spatial-spectral surveys are increasingly common in astronomy, which calls for new methods to analyze such mega- to giga-pixel data-cubes. In this paper we present a method to decompose such observations into a limited and comprehensive set of components; the original data can then be interpreted in terms of linear combinations of these components. The method uses non-negative matrix factorization (NMF) to extract latent spectral end-members in the data. The number of needed end-members is estimated based on the level of noise in the data. A Monte-Carlo scheme is adopted to estimate the optimal end-members and their standard deviations. Finally, the maps of linear coefficients are reconstructed using non-negative least squares. We apply this method to a set of hyperspectral data of the NGC 7023 nebula, obtained recently with the HIFI instrument onboard the Herschel space observatory, and provide a first interpretation of the results in terms of the 3-dimensional dynamical structure of the region.
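In generic terms (the paper's exact objective function may differ), NMF approximates a non-negative data matrix X, here with one row per pixel and one column per spectral channel, by a low-rank product of non-negative factors,

X \approx W H, \qquad W \ge 0, \; H \ge 0,

for example by minimizing \lVert X - W H \rVert_F^2 subject to the non-negativity constraints. The r rows of H hold the latent spectral end-members, and the columns of W give, pixel by pixel, the linear-combination coefficients whose spatial rearrangement yields the coefficient maps; repeating the factorization in a Monte-Carlo scheme then provides a mean and standard deviation for each end-member.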
Use Standards to Draw Curriculum Maps
ERIC Educational Resources Information Center
Franklin, Pat; Stephens, Claire Gatrell
2009-01-01
Specific curriculum is taught at every grade level and it is the job of library media specialists to know subject area content. Library media specialists should develop collections to meet the content associated with curriculum standards. To ensure that these collections meet school needs, collection mapping with specific curriculum related to…
The Role of Construct Maps in Standard Setting
ERIC Educational Resources Information Center
Kane, Michael T.; Tannenbaum, Richard J.
2013-01-01
The authors observe in this commentary that construct maps can help standard-setting panels to make realistic and internally consistent recommendations for performance-level descriptions (PLDs) and cut-scores, but the benefits may not be realized if policymakers do not fully understand the rationale for the recommendations provided by the…
Using Clouds for MapReduce Measurement Assignments
ERIC Educational Resources Information Center
Rabkin, Ariel; Reiss, Charles; Katz, Randy; Patterson, David
2013-01-01
We describe our experiences teaching MapReduce in a large undergraduate lecture course using public cloud services and the standard Hadoop API. Using the standard API, students directly experienced the quality of industrial big-data tools. Using the cloud, every student could carry out scalability benchmarking assignments on realistic hardware,…
Critical thinking in graduate medical education: A role for concept mapping assessment?
West, D C; Pomeroy, J R; Park, J K; Gerstenberger, E A; Sandoval, J
2000-09-06
Tools to assess the evolving conceptual framework of physicians-in-training are limited, despite their critical importance to physicians' evolving clinical expertise. Concept mapping assessment (CMA) enables teachers to view students' organization of their knowledge at various points in training. To assess whether CMA reflects expected differences and changes in the conceptual framework of resident physicians, whether concept maps can be scored reliably, and how well CMA scores relate to the results of standard in-training examination. A group of 21 resident physicians (9 first-year and 12 second- and third-year residents) from a university-based pediatric training program underwent concept map training, drew a preinstruction concept map about seizures, completed an education course on seizures, and then drew a postinstruction map. Maps were scored independently by 3 raters using a standardized method. The study was conducted in May and June 1999. Preinstruction map total scores and subscores in 4 categories compared with postinstruction map scores; map scores of second- and third-year residents compared with first-year residents; and interrater correlation of map scores. Total CMA scores increased after instruction from a mean (SD) preinstruction map score of 429 (119) to a mean postinstruction map score of 516 (196) (P =.03). Second- and third-year residents scored significantly higher than first-year residents before instruction (mean [SD] score of 472 [116] vs 371 [102], respectively; P =.04), but not after instruction (mean [SD] scores, 561 [203] vs 456 [179], respectively; P =.16). Second- and third-year residents had greater preinstruction map complexity as measured by cross-link score (P =.01) than first-year residents. The CMA score had a weak to no correlation with the American Board of Pediatrics In-training Examination score (r = 0.10-0.54). Interrater correlation of map scoring ranged from weak to moderate for the preinstruction map (r = 0.51-0.69) and moderate to strong for the postinstruction map (r = 0.74-0.88). Our data provide preliminary evidence that concept mapping assessment reflects expected differences and change in the conceptual framework of resident physicians. Concept mapping assessment and standardized testing may measure different cognitive domains. JAMA. 2000;284:1105-1110
Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley
2017-01-01
Cancer is the most rapidly spreading disease in the world, especially in developing countries, including Libya. Cancer represents a significant burden on patients, families, and their societies. This disease can be controlled if detected early. Therefore, disease mapping has recently become an important method in the fields of public health research and disease epidemiology. The correct choice of statistical model is a very important step towards producing a good map of a disease. Libya was selected for this work in order to examine the geographical variation in its incidence of lung cancer. The objective of this paper is to estimate the relative risk for lung cancer. Four statistical models for estimating the relative risk for lung cancer, together with population censuses of the study area for the period 2006 to 2011, were used in this work: the Standardized Morbidity Ratio (SMR), the most popular statistic used in the field of disease mapping; the Poisson-gamma model, one of the earliest applications of Bayesian methodology; the Besag, York and Mollié (BYM) model; and the Mixture model. As an initial step, this study begins by providing a review of all the proposed models, which we then apply to lung cancer data in Libya. Maps, tables, graphs and goodness-of-fit (GOF) criteria were used to compare and present the preliminary results; such GOF criteria are commonly used in statistical modelling to compare fitted models. The main general results presented in this study show that the Poisson-gamma model, the BYM model, and the Mixture model can overcome the problem of the first model (SMR) when there is no observed lung cancer case in certain districts. The results show that the Mixture model is the most robust and provides better relative risk estimates across the range of models. PMID:28440974
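For reference, the SMR for district i is simply the ratio of observed to expected cases,

\mathrm{SMR}_i = \frac{O_i}{E_i},

where O_i is the observed number of lung cancer cases in district i and E_i is the number expected from the standardized population at risk. It collapses to zero whenever O_i = 0, regardless of population size, which is exactly the weakness the three model-based estimators address.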
Mapping Norway - a Method to Register and Survey the Status of Accessibility
NASA Astrophysics Data System (ADS)
Michaelis, Sven; Bögelsack, Kathrin
2018-05-01
The Norwegian mapping authority has developed a standard method for mapping accessibility, primarily for people with limited or no walking ability, in urban and recreational areas. We chose an object-oriented approach in which points, lines and polygons represent objects in the environment. All data are stored in a geospatial database, so they can be presented as a web map and analyzed using GIS software. By the end of 2016, more than 160 municipalities had been mapped using this method. The aim of the project is to establish a national standard for mapping and to provide a geodatabase that shows the status of accessibility throughout Norway. The data provide a useful tool for national statistics, local planning authorities and private users. First results show that accessibility is low and that Norway still faces many challenges in meeting the government's goals for Universal Design.
NASA Astrophysics Data System (ADS)
Che Awang, Aznida; Azah Samat, Nor
2017-09-01
Leptospirosis is a disease caused by infection with pathogenic species of the genus Leptospira. Humans can be infected through direct or indirect exposure to the urine of infected animals. The excretion of urine from an animal host carrying pathogenic Leptospira contaminates soil and water. People can therefore become infected when exposed to contaminated soil or water through cuts or open wounds in the skin. The bacteria can also enter the human body through mucous membranes such as the nose, eyes and mouth, for example when contaminated water or urine is splashed into the eyes or contaminated water or food is swallowed. There is currently no vaccine available for the prevention of leptospirosis, but the disease can be treated if it is diagnosed early enough to avoid complications. Disease risk mapping is important for the control and prevention of disease, and a good choice of statistical model produces a good disease risk map. The aim of this study is therefore to estimate the relative risk of leptospirosis, based initially on the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and on the Poisson-gamma model. This paper begins with a review of the SMR method and the Poisson-gamma model, which we then apply to leptospirosis data for Kelantan, Malaysia. The results are displayed and compared using graphs, tables and maps. They show that the Poisson-gamma model produces better relative risk estimates than the SMR method, because it overcomes the drawback of the SMR, whose relative risk estimate becomes zero when there are no observed leptospirosis cases in certain regions (see the sketch below). However, the Poisson-gamma model also has problems: covariate adjustment is difficult, and there is no possibility of allowing for spatial correlation between risks in neighbouring areas. These problems have motivated many researchers to introduce alternative methods for estimating the risk.
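A minimal sketch of why the Poisson-gamma model avoids the zero-risk problem, assuming the usual shape-rate parameterization:

O_i \mid \theta_i \sim \mathrm{Poisson}(E_i \theta_i), \qquad \theta_i \sim \mathrm{Gamma}(\alpha, \beta) \;\Longrightarrow\; \theta_i \mid O_i \sim \mathrm{Gamma}(\alpha + O_i, \; \beta + E_i),

so the posterior mean relative risk \hat{\theta}_i = (\alpha + O_i)/(\beta + E_i) is shrunk towards the prior mean \alpha/\beta and remains strictly positive even in regions with O_i = 0, unlike the SMR.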
Mapping and monitoring High Nature Value farmlands: challenges in European landscapes.
Lomba, Angela; Guerra, Carlos; Alonso, Joaquim; Honrado, João Pradinho; Jongman, Rob; McCracken, David
2014-10-01
The importance of low intensity farming for the conservation of biodiversity throughout Europe was acknowledged early in the 1990s when the concept of 'High Nature Value farmlands' (HNVf) was devised. HNVf has subsequently been given high priority within the EU Rural Development Programme. This puts a requirement on each EU Member State not only to identify the extent and condition of HNVf within their borders but also to track trends in HNVf over time. However, the diversity of rural landscapes across the EU, the scarcity of (adequate) datasets on biodiversity, land cover and land use, and the lack of a common methodology for HNVf mapping currently represent obstacles to the implementation of the HNVf concept across Europe. This manuscript provides an overview of the characteristics of HNVf across Europe together with a description of the development of the HNVf concept. Current methodological approaches for the identification and mapping of HNVf across EU-27 and Switzerland are then reviewed, the main limitations of these approaches highlighted and recommendations made as to how the identification, mapping and reporting of HNVf state and trends across Europe can potentially be improved and harmonised. In particular, we propose a new framework that is built on the need for strategic HNVf monitoring based on a hierarchical, bottom-up structure of assessment units, coincident with the EU levels of political decision and devised indicators, and which is linked strongly to a collaborative European network that can provide the integration and exchange of data from different sources and scales under common standards. Such an approach is essential if the scale of the issues facing HNVf landscapes are to be identified and monitored properly at the European level. This would then allow relevant agri-environmental measures to be developed, implemented and evaluated at the scale(s) required to maintain the habitats and species of high nature conservation value that are intimately associated with those landscapes. Copyright © 2014 Elsevier Ltd. All rights reserved.
Ordered versus Unordered Map for Primitive Data Types
2015-09-01
mapped to some element. C++ provides two types of map containers within the standard template library, the std::map and the std::unordered_map classes. As the names imply, the containers' main functional difference is that the elements in the std::map are ordered by their key, while the elements in the std::unordered_map are not; the std::unordered_map places its elements into "buckets" based on a hash value computed for their key
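A minimal sketch of this behavioral difference (standard C++17; nothing here is specific to the report's benchmarks):

```cpp
// Contrast the two standard-library map containers.
// std::map keeps keys sorted (balanced tree, O(log n) lookup);
// std::unordered_map hashes keys into buckets (average O(1) lookup).
#include <iostream>
#include <map>
#include <unordered_map>

int main() {
    std::map<int, const char*> ordered{{3, "c"}, {1, "a"}, {2, "b"}};
    std::unordered_map<int, const char*> unordered{{3, "c"}, {1, "a"}, {2, "b"}};

    // Iterating a std::map visits keys in ascending order: 1 2 3.
    for (const auto& [key, value] : ordered)
        std::cout << key << " -> " << value << '\n';

    // Iteration order of std::unordered_map is unspecified; it depends
    // on the hash function and the current bucket count.
    for (const auto& [key, value] : unordered)
        std::cout << key << " -> " << value << '\n';

    // The bucket interface exists only on the unordered container.
    std::cout << "buckets: " << unordered.bucket_count() << '\n';
    return 0;
}
```

The ordered container supports in-order traversal and range queries at O(log n) per operation; the unordered one gives average O(1) hashed lookup at the cost of any key ordering, which is the trade-off such benchmarks of primitive key types measure.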
Optimized MLAA for quantitative non-TOF PET/MR of the brain
NASA Astrophysics Data System (ADS)
Benoit, Didier; Ladefoged, Claes N.; Rezaei, Ahmadreza; Keller, Sune H.; Andersen, Flemming L.; Højgaard, Liselotte; Hansen, Adam E.; Holm, Søren; Nuyts, Johan
2016-12-01
For quantitative tracer distribution in positron emission tomography, attenuation correction is essential. In a hybrid PET/CT system the CT images serve as a basis for generation of the attenuation map, but in PET/MR the MR images do not have a similarly simple relationship with the attenuation map, so attenuation correction in PET/MR systems is more challenging. Typically either of two MR sequences is used: the Dixon or the ultra-short echo time (UTE) technique. However, these sequences have some well-known limitations. In this study, a reconstruction technique based on a modified and optimized non-TOF MLAA is proposed for PET/MR brain imaging. The idea is to tune the parameters of the MLTR using some information from an attenuation image computed from the UTE sequences and a T1w MR image. In this MLTR algorithm, a parameter α_j is introduced and optimized in order to drive the algorithm to a final attenuation map most consistent with the emission data. Because the non-TOF MLAA is used, a technique to reduce the cross-talk effect is proposed. In this study, the proposed algorithm is compared to common reconstruction methods such as OSEM using a CT attenuation map, considered as the reference, and OSEM using the Dixon and UTE attenuation maps. To show the robustness and the reproducibility of the proposed algorithm, data from 204 [18F]FDG patients, 35 [11C]PiB patients and 1 [18F]FET patient are used. The results show that by choosing an optimized value of α_j in MLTR, the proposed algorithm improves on the standard MR-based attenuation correction methods (i.e. OSEM using the Dixon or the UTE attenuation maps), and the cross-talk and scale problems are limited.
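For orientation, the non-TOF forward model that MLAA-type methods fit (a generic sketch; the paper's modified MLTR update and the role of α_j are not reproduced here) is

\bar{y}_i \;=\; e^{-\sum_j l_{ij} \mu_j} \sum_j c_{ij} \lambda_j \;+\; \bar{s}_i ,

where \bar{y}_i is the expected count on line of response i, \lambda_j the activity and \mu_j the linear attenuation coefficient in voxel j, l_{ij} the intersection length of line i with voxel j, c_{ij} the geometric system matrix, and \bar{s}_i the scatter-plus-randoms estimate. MLAA alternates maximization of the Poisson log-likelihood over \lambda (activity update) and over \mu (attenuation update, e.g. with MLTR); without time-of-flight information this factorization is not unique, which is the source of the cross-talk and scale problems mentioned above.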
The National Map - Utah Transportation Pilot Project
,
2001-01-01
Governments depend on a common set of geographic base information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and defense operations rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy. Available geographic data often have the following problems:
* They do not align with each other because layers are frequently created or revised separately,
* They do not match across administrative boundaries because each producing organization uses different methods and standards, and
* They are not up to date because of the complexity and cost of revision.
The U.S. Geological Survey (USGS) is developing The National Map to be a seamless, continuously maintained, and nationally consistent set of online, public domain, geographic base information to address these issues. The National Map will serve as a foundation for integrating, sharing, and using other data easily and consistently. In collaboration with other government agencies, the private sector, academia, and volunteer groups, the USGS will coordinate, integrate, and, where needed, produce and maintain base geographic data. The National Map will include digital orthorectified imagery; elevation data; vector data for hydrography, transportation, boundary, and structure features; geographic names; and land cover information. The data will be the source of revised paper topographic maps. Many technical and institutional issues must be resolved as The National Map is implemented. To begin the refinement of this new paradigm, pilot projects are being designed to identify and investigate these issues. The pilots are the foundation upon which future partnerships for data sharing and maintenance will be built.
The National Map - Texas Pilot Project
,
2001-01-01
The National Map - Florida Pilot Project
,
2001-01-01
The National Map - Pennsylvania Pilot Project
,
2001-01-01
The National Map - Delaware Pilot Project
,
2001-01-01
The National Map - Lake Tahoe Area Pilot Project
,
2001-01-01
The National Map - Missouri Pilot Project
,
2001-01-01
The National Map - Washington-Idaho Pilot Project
2001-01-01
Landsat for practical forest type mapping - A test case
NASA Technical Reports Server (NTRS)
Bryant, E.; Dodge, A. G., Jr.; Warren, S. D.
1980-01-01
Computer classified Landsat maps are compared with a recent conventional inventory of forest lands in northern Maine. Over the 196,000 hectare area mapped, estimates of the areas of softwood, mixed wood and hardwood forest obtained by a supervised classification of the Landsat data and a standard inventory based on aerial photointerpretation, probability proportional to prediction, field sampling and a standard forest measurement program are found to agree to within 5%. The cost of the Landsat maps is estimated to be $0.065/hectare. It is concluded that satellite techniques are worth developing for forest inventories, although they are not yet refined enough to be incorporated into current practical inventories.
Generalized contractive mappings and weakly α-admissible pairs in G-metric spaces.
Hussain, N; Parvaneh, V; Hoseini Ghoncheh, S J
2014-01-01
The aim of this paper is to present some coincidence and common fixed point results for generalized (ψ, φ)-contractive mappings using partially weakly G-α-admissibility in the setup of G-metric space. As an application of our results, periodic points of weakly contractive mappings are obtained. We also derive certain new coincidence point and common fixed point theorems in partially ordered G-metric spaces. Moreover, some examples are provided here to illustrate the usability of the obtained results.
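For orientation, the classical weak (ψ, φ)-contraction condition that results of this kind generalize can be written, in an ordinary metric space (the paper's G-metric formulation is more elaborate; this form is given only as background):

```latex
\psi\big(d(Tx,Ty)\big) \le \psi\big(d(x,y)\big) - \varphi\big(d(x,y)\big)
\quad \text{for all } x, y \in X,
```

where ψ is continuous and non-decreasing with ψ(t) = 0 iff t = 0, and φ is lower semi-continuous with φ(t) = 0 iff t = 0.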
von Bary, Christian; Fredersdorf-Hahn, Sabine; Heinicke, Norbert; Jungbauer, Carsten; Schmid, Peter; Riegger, Günter A; Weber, Stefan
2011-08-01
Recently, new catheter technologies have been developed for atrial fibrillation (AF) ablation. We investigated the diagnostic accuracy of a circular mapping and pulmonary vein ablation catheter (PVAC) compared with a standard circular mapping catheter (Orbiter), and the influence of filter settings on signal quality. After reconstruction of the left atrium by three-dimensional atriography, baseline PV potentials (PVPs) were recorded consecutively with the PVAC and the Orbiter in 20 patients with paroxysmal AF. PVPs were compared and attributed to predefined anatomical PV segments. Ablation was performed in 80 PVs using the PVAC. When isolation of the PVs was assumed, signal assessment of each PV was repeated with the Orbiter. If residual PV potentials could be uncovered, different filter settings were tested to improve the mapping quality of the PVAC. Ablation was continued until complete PV isolation (PVI) was confirmed with the Orbiter. Baseline mapping demonstrated a good correlation between the Orbiter and the PVAC. Mapping accuracy using the PVAC for mapping and ablation was 94% (74 of 79 PVs). Additional mapping with the Orbiter improved the PV isolation rate to 99%. Adjustment of filter settings failed to improve the quality of the PV signals compared with standard filter settings. When using the PVAC as a stand-alone strategy for mapping and ablation, one should be aware that in some cases a different signal morphology can mimic PVI. Adjustment of filter settings failed to improve signal quality. The use of an additional mapping catheter is recommended to become familiar with the particular signal morphology during the first PVAC cases, or whenever there is doubt about successful isolation of the pulmonary veins.
Mapping Air Quality Index of Carbon Monoxide (CO) in Medan City
NASA Astrophysics Data System (ADS)
Suryati, I.; Khair, H.
2017-03-01
This study aims to map and analyze the air quality index of carbon monoxide (CO) in Medan City. Twelve sampling points around Medan were used, with a one-hour sampling duration at each point. CO concentration was measured using an NDIR CO analyzer. The measured CO concentrations ranged from 1 ppm to 23 ppm, with an average of 9.5 ppm. This is still below the national ambient air quality standard of 29 ppm set by Government Regulation of the Republic of Indonesia Number 41/1999. The measurements were converted into the air pollutant standard index, yielding index values of 58 to 204. Surfer 10 was used to create a map of the air pollutant standard index for CO. The map shows a very unhealthy area located in the Medan Belawan district. The main factors affecting CO concentration are transportation and meteorology.
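Such concentration-to-index conversions are typically piecewise-linear interpolations between regulatory breakpoints. A minimal Python sketch follows; the breakpoint table is purely illustrative and does not reproduce the official Indonesian ISPU breakpoints:

```python
# Piecewise-linear pollutant-index interpolation (illustrative breakpoints only).
# Each tuple: (C_lo, C_hi, I_lo, I_hi) for one index segment.
BREAKPOINTS = [
    (0.0, 5.0, 0, 50),       # "good"
    (5.0, 10.0, 51, 100),    # "moderate"
    (10.0, 17.0, 101, 199),  # "unhealthy"
    (17.0, 34.0, 200, 299),  # "very unhealthy"
]

def pollutant_index(c_ppm: float) -> float:
    """Map a CO concentration (ppm) to an index value by linear interpolation."""
    for c_lo, c_hi, i_lo, i_hi in BREAKPOINTS:
        if c_lo <= c_ppm <= c_hi:
            return i_lo + (i_hi - i_lo) * (c_ppm - c_lo) / (c_hi - c_lo)
    raise ValueError("concentration outside the breakpoint table")

print(round(pollutant_index(9.5)))  # the study's average measurement -> ~95 here
```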
Importance sampling with imperfect cloning for the computation of generalized Lyapunov exponents
NASA Astrophysics Data System (ADS)
Anteneodo, Celia; Camargo, Sabrina; Vallejos, Raúl O.
2017-12-01
We revisit the numerical calculation of generalized Lyapunov exponents, L(q), in deterministic dynamical systems. The standard method consists of adding noise to the dynamics in order to use importance sampling algorithms. Then L(q) is obtained by taking the limit noise-amplitude → 0 after the calculation. We focus on a particular method that involves periodic cloning and pruning of a set of trajectories. However, instead of considering a noisy dynamics, we implement an imperfect (noisy) cloning. This alternative method is compared with the standard one and, when possible, with analytical results. As a workbench we use the asymmetric tent map, the standard map, and a system of coupled symplectic maps. The general conclusion of this study is that the imperfect-cloning method performs as well as the standard one, with the advantage of preserving the deterministic dynamics.
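For context, a naive (noise-free, cloning-free) Monte Carlo estimator of L(q) for the asymmetric tent map is easy to write down; its variance grows quickly with q and the number of steps, which is exactly why the cloning and importance-sampling machinery discussed in the abstract is needed. A sketch, with the analytic value for comparison:

```python
import numpy as np

def tent_L_q(a=0.3, q=2.0, n_steps=20, n_traj=200_000, seed=0):
    """Naive Monte Carlo estimate of the generalized Lyapunov exponent
    L(q) = (1/n) ln < Lambda_n^q > for the asymmetric tent map
    f(x) = x/a if x < a, else (1 - x)/(1 - a)."""
    rng = np.random.default_rng(seed)
    x = rng.random(n_traj)
    log_growth = np.zeros(n_traj)   # accumulates ln|f'(x_k)| per trajectory
    for _ in range(n_steps):
        left = x < a
        log_growth += np.where(left, -np.log(a), -np.log(1.0 - a))
        x = np.where(left, x / a, (1.0 - x) / (1.0 - a))
    # The average is dominated by rare, strongly expanding trajectories as
    # q * n_steps grows -- the reason importance sampling is needed at all.
    return np.log(np.mean(np.exp(q * log_growth))) / n_steps

a, q = 0.3, 2.0
exact = np.log(a**(1 - q) + (1 - a)**(1 - q))  # analytic value for the tent map
print(tent_L_q(a, q), exact)
```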
Information Model Translation to Support a Wider Science Community
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Daniel; Ritschel, Bernd; Hardman, Sean; Joyner, Ronald
2014-05-01
The Planetary Data System (PDS), NASA's long-term archive for solar system exploration data, has just released PDS4, a modernization of the PDS architecture, data standards, and technical infrastructure. This next-generation system positions the PDS to meet the demands of the coming decade, including big data, international cooperation, distributed nodes, and multiple ways of analysing and interpreting data. It also addresses three fundamental project goals: providing more efficient data delivery by data providers to the PDS, enabling a stable, long-term usable planetary science data archive, and enabling services for the data consumer to find, access, and use the data they require in contemporary data formats. The PDS4 information architecture is used to describe all PDS data using a common model. Captured in an ontology modeling tool, it supports a hierarchy of data dictionaries built to the ISO/IEC 11179 standard and is designed to increase flexibility, enable complex searches at the product level, and promote interoperability that facilitates data sharing both nationally and internationally. A PDS4 information architecture design requirement stipulates that the content of the information model must be translatable to external data definition languages such as XML Schema, XMI/XML, and RDF/XML. To support Semantic Web standards we are now in the process of mapping the contents into RDF/XML to support SPARQL-capable databases. We are also building a terminological ontology to support virtually unified data retrieval and access. This paper provides an overview of the PDS4 information architecture, focusing on its domain information model and how the translation and mapping are being accomplished.
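As a flavor of what mapping archive metadata into RDF/XML looks like in practice, here is a generic rdflib sketch; the namespace and product attributes are invented for illustration and are not the actual PDS4 model:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical namespace and attributes, for illustration only.
PDS = Namespace("http://example.org/pds4/")

g = Graph()
g.bind("pds", PDS)

product = PDS["product/urn-nasa-pds-example-001"]
g.add((product, RDF.type, PDS.Product_Observational))
g.add((product, PDS.title, Literal("Example observational product")))
g.add((product, PDS.targetName, Literal("Mars")))
g.add((product, PDS.startDateTime, Literal("2014-05-01T00:00:00Z")))

# Serialize to RDF/XML, ready for loading into a SPARQL-capable store.
print(g.serialize(format="xml"))
```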
MAP Science for Use with Next Generation Science Standards. NWEA External FAQ
ERIC Educational Resources Information Center
Northwest Evaluation Association, 2016
2016-01-01
Measures of Academic Progress® (MAP®) Science for use with Next Generation Science Standards (NGSS) assessments are available for the 2016-17 school year. These new assessments measure student growth toward understanding of the multidimensional NGSS performance expectations. This report presents MAP Science for use with NGSS by presenting and…
"Understanding" medical school curriculum content using KnowledgeMap.
Denny, Joshua C; Smithers, Jeffrey D; Miller, Randolph A; Spickard, Anderson
2003-01-01
Objective: To describe the development and evaluation of computational tools to identify concepts within medical curricular documents, using information derived from the National Library of Medicine's Unified Medical Language System (UMLS). The long-term goal of the KnowledgeMap (KM) project is to provide faculty and students with an improved ability to develop, review, and integrate components of the medical school curriculum. Design: The KM concept identifier uses lexical resources partially derived from the UMLS (SPECIALIST lexicon and Metathesaurus), heuristic language-processing techniques, and an empirical scoring algorithm. KM differentiates among potentially matching Metathesaurus concepts within a source document. The authors manually identified important "gold standard" biomedical concepts within selected medical school full-content lecture documents and used these documents to compare KM concept recognition with that of a known state-of-the-art "standard", the National Library of Medicine's MetaMap program. Measurements: The number of "gold standard" concepts in each lecture document identified by either KM or MetaMap, and the cause of each failure or relative success in a random subset of documents. Results: For 4,281 "gold standard" concepts, MetaMap matched 78% and KM 82%. Precision for "gold standard" concepts was 85% for MetaMap and 89% for KM. The heuristics of KM accurately matched acronyms, concepts underspecified in the document, and ambiguous matches. The most frequent cause of matching failures was absence of target concepts from the UMLS Metathesaurus. Conclusion: The prototypic KM system provided an encouraging rate of concept extraction for representative medical curricular texts. Future versions of KM should be evaluated for their ability to allow administrators, lecturers, and students to navigate through the medical curriculum to locate redundancies, find interrelated information, and identify omissions. In addition, the ability of KM to meet specific, personal information needs should be assessed.
Nested association mapping of stem rust resistance in wheat using genotyping by sequencing
USDA-ARS?s Scientific Manuscript database
Nested association mapping is an approach to map trait loci in which families within populations are interconnected by a common parent. By implementing joint-linkage association analysis, this approach is able to map causative loci with higher power and resolution compared to biparental linkage mapp...
USDA-ARS?s Scientific Manuscript database
Indices derived from remotely-sensed imagery are commonly used to predict soil properties with digital soil mapping (DSM) techniques. The use of images from single dates or a small number of dates is most common for DSM; however, selection of the appropriate images is complicated by temporal variabi...
Modified atmosphere packaging for fresh-cut ‘Kent’ mango under common retail display conditions
USDA-ARS?s Scientific Manuscript database
A modified atmosphere package (MAP) was designed to optimize the quality and shelf-life of fresh-cut ‘Kent’ mango during exposure to common retail display conditions. The synergism between the MAP system and an antioxidant treatment (calcium ascorbate and citric acid) was also investigated. Mango sl...
CEOS WGISS Common Framework for WGISS Connected Data Assets
NASA Astrophysics Data System (ADS)
Enloe, Y.; Mitchell, A. E.; Albani, M.; Yapur, M.
2016-12-01
The Committee on Earth Observation Satellites (CEOS), established in 1984 to coordinate civil space-borne observations of the Earth, has been building, through its Working Group on Information Systems and Services (WGISS), a common data framework to identify and connect data assets at member agencies. Some of these data assets are federated systems such as the CEOS WGISS Integrated Catalog (CWIC), the European Space Agency's FedEO (Federated Earth Observations Missions Access) system, and the International Directory Network (IDN), an international effort developed by NASA to assist researchers in locating information on available data sets. A system-level team provides coordination and oversight to make this loosely coupled federated system function and evolve. WGISS has identified two search standards, the Open Geospatial Consortium (OGC) Catalog Services for the Web (CSW) and the CEOS OpenSearch Best Practices (which reference the OGC OpenSearch Geo and Time Extensions and the OGC OpenSearch Extension for Earth Observation), as well as an interoperable metadata standard (ISO 19115) for use within the WGISS Connected Assets. Data partners must register their data collections in the IDN using the Global Change Master Directory (GCMD) keywords. Data partners need to support one of the two search standards and be able to map their internal metadata to the ISO 19115 metadata elements. All searchable data must have a data access path. Clients can offer search and access to all or a subset of the satellite data available through the WGISS Connected Data Assets. Clients can offer support for a two-step search: (1) discovery through collection search using platform, instrument, science keywords, etc. at the IDN, and (2) granule metadata search at data partners through CWIC or FedEO. More than a dozen international agencies offer their data through the WGISS Federation or are working on developing their connections, including the European Space Agency, NASA, NOAA, USGS, the National Institute for Space Research (Brazil), the Canadian Center for Mapping and Earth Observations (CCMEO), the Academy for Opto-Electronics (China), the Indian Space Research Organization (ISRO), EUMETSAT, the Russian Federal Space Agency (ROSCOSMOS), and several agencies within Australia.
Juang, K W; Lee, D Y; Ellsworth, T R
2001-01-01
The spatial distribution of a pollutant in contaminated soils is usually highly skewed. As a result, the sample variogram often differs considerably from its regional counterpart and the geostatistical interpolation is hindered. In this study, rank-order geostatistics with standardized rank transformation was used for the spatial interpolation of pollutants with a highly skewed distribution in contaminated soils when commonly used nonlinear methods, such as logarithmic and normal-scored transformations, are not suitable. A real data set of soil Cd concentrations with great variation and high skewness in a contaminated site of Taiwan was used for illustration. The spatial dependence of ranks transformed from Cd concentrations was identified and kriging estimation was readily performed in the standardized-rank space. The estimated standardized rank was back-transformed into the concentration space using the middle point model within a standardized-rank interval of the empirical distribution function (EDF). The spatial distribution of Cd concentrations was then obtained. The probability of Cd concentration being higher than a given cutoff value also can be estimated by using the estimated distribution of standardized ranks. The contour maps of Cd concentrations and the probabilities of Cd concentrations being higher than the cutoff value can be simultaneously used for delineation of hazardous areas of contaminated soils.
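The standardized-rank transform and the EDF middle-point back-transform described here are simple to sketch (the kriging step itself is omitted and would be supplied by any geostatistics package):

```python
import numpy as np
from scipy.stats import rankdata

def to_standardized_ranks(z):
    """Transform observations to standardized ranks in (0, 1)."""
    n = len(z)
    return (rankdata(z) - 0.5) / n          # mid-rank convention

def from_standardized_ranks(r_est, z_obs):
    """Back-transform an estimated standardized rank to the data space using
    the middle point of the empirical distribution function (EDF)."""
    z_sorted = np.sort(z_obs)
    n = len(z_obs)
    edf_mid = (np.arange(1, n + 1) - 0.5) / n
    return np.interp(r_est, edf_mid, z_sorted)

cd = np.array([0.2, 0.3, 0.4, 0.6, 1.1, 2.5, 8.0, 21.0])  # skewed, like soil Cd
r = to_standardized_ranks(cd)
# ... krige r in standardized-rank space, then back-transform an estimate:
print(from_standardized_ranks(0.8, cd))
```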
Carswell, William J.
2011-01-01
The National Geospatial Program (NGP) increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network Liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure the quality control, integration, and availability to the public of the data acquired. The Emergency Operations Office provides the requirements to The National Map and, during emergencies and natural disasters, provides rapid dissemination of information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.
Bridging data models and terminologies to support adverse drug event reporting using EHR data.
Declerck, G; Hussain, S; Daniel, C; Yuksel, M; Laleci, G B; Twagirumukiza, M; Jaulent, M-C
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The SALUS project aims at building an interoperability platform and a dedicated toolkit to enable secondary use of electronic health record (EHR) data for post-marketing drug surveillance. An important component of this toolkit is a drug-related adverse event (AE) reporting system designed to facilitate and accelerate the reporting process using automatic prepopulation mechanisms. Objective: to demonstrate the SALUS approach for establishing syntactic and semantic interoperability for AE reporting. Methods: standard (e.g., HL7 CDA-CCD) and proprietary EHR data models are mapped to the E2B(R2) data model via the SALUS Common Information Model. Terminology mapping and terminology reasoning services are designed to ensure the automatic conversion of source EHR terminologies (e.g., ICD-9-CM, ICD-10, LOINC, or SNOMED-CT) to the target terminology MedDRA, which is expected in AE reporting forms. A validated set of terminology mappings is used to ensure the reliability of the reasoning mechanisms. Results: the percentage of data elements of a standard E2B report that can be completed automatically has been estimated for two pilot sites. In the best scenario (i.e., when the available fields in the EHR have actually been filled), only 36% (pilot site 1) and 38% (pilot site 2) of E2B data elements remain to be filled manually; moreover, most of these data elements need not be filled in every report. Conclusion: the SALUS platform's interoperability solutions enable partial automation of the AE reporting process, which could help improve current spontaneous reporting practices and reduce under-reporting, currently a major obstacle in the acquisition of pharmacovigilance data.
Smart POI: Open and linked spatial data
NASA Astrophysics Data System (ADS)
Cerba, Otakar; Berzins, Raitis; Charvat, Karel; Mildorf, Tomas
2016-04-01
The Smart Point of Interest (SPOI) represents a unique seamless spatial data set based on standards recommended for linked and open data, which are supported by scientists and researchers as well as by several government authorities and the European Union. This data set, developed in cooperation by the partners of the SDI4Apps project, contains almost 24 million points of interest focused mainly on tourism, natural features, transport, and citizen services. The SPOI data covers almost all countries and territories of the world. It was created as a harmonized combination of global data resources (selected points from OpenStreetMap, Natural Earth, and GeoNames.org) and several local data sets (for example, data published by the Citadel on the Move project, data from the Posumavi region in the Czech Republic, and experimental ontologies developed at the University of West Bohemia covering ski regions in Europe and historical sights in Rome). The added value of the SDI4Apps approach in comparison to other similar solutions consists in the implementation of a linked-data approach (several objects are connected to DBpedia or GeoNames.org), the use of the universal RDF format, the use of standardized and widely respected properties and vocabularies (for example, FOAF or GeoSPARQL), and the development of a completely harmonized data set with a uniform data model and common classification (not merely a copy of the original resources). The SPOI data is published as a SPARQL endpoint as well as in a map client. The SPOI dataset is a specific set of POIs that could serve as "data fuel" for applications and services related to tourism, local business, statistics, or landscape monitoring. It can also be used as a background data layer for thematic maps.
Fine mapping on chromosome 13q32-34 and brain expression analysis implicates MYO16 in schizophrenia.
Rodriguez-Murillo, Laura; Xu, Bin; Roos, J Louw; Abecasis, Gonçalo R; Gogos, Joseph A; Karayiorgou, Maria
2014-03-01
We previously reported linkage of schizophrenia and schizoaffective disorder to 13q32-34 in the European descent Afrikaner population from South Africa. The nature of genetic variation underlying linkage peaks in psychiatric disorders remains largely unknown and both rare and common variants may be contributing. Here, we examine the contribution of common variants located under the 13q32-34 linkage region. We used densely spaced SNPs to fine map the linkage peak region using both a discovery sample of 415 families and a meta-analysis incorporating two additional replication family samples. In a second phase of the study, we use one family-based data set with 237 families and independent case-control data sets for fine mapping of the common variant association signal using HapMap SNPs. We report a significant association with a genetic variant (rs9583277) within the gene encoding for the myosin heavy-chain Myr 8 (MYO16), which has been implicated in neuronal phosphoinositide 3-kinase signaling. Follow-up analysis of HapMap variation within MYO16 in a second set of Afrikaner families and additional case-control data sets of European descent highlighted a region across introns 2-6 as the most likely region to harbor common MYO16 risk variants. Expression analysis revealed a significant increase in the level of MYO16 expression in the brains of schizophrenia patients. Our results suggest that common variation within MYO16 may contribute to the genetic liability to schizophrenia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shih, Helen A.; Harisinghani, Mukesh; Zietman, Anthony L.
2005-11-15
Purpose: Toxicity from pelvic irradiation could be reduced if fields were limited to likely areas of nodal involvement rather than using the standard 'four-field box.' We employed a novel magnetic resonance lymphangiographic technique to highlight the likely sites of occult nodal metastasis from prostate cancer. Methods and Materials: Eighteen prostate cancer patients with pathologically confirmed node-positive disease had a total of 69 pathologic nodes identifiable by lymphotropic nanoparticle-enhanced MRI and semiquantitative nodal analysis. Fourteen of these nodes were in the para-aortic region, and 55 were in the pelvis. The position of each of these malignant nodes was mapped to a common template based on its relation to skeletal or vascular anatomy. Results: Relative to skeletal anatomy, nodes covered a diffuse volume from the mid lumbar spine to the superior pubic ramus and along the sacrum and pelvic side walls. In contrast, the nodal metastases mapped much more tightly relative to the large pelvic vessels. A proposed pelvic clinical target volume to encompass the region at greatest risk of containing occult nodal metastases would include a 2.0-cm radial expansion volume around the distal common iliac and proximal external and internal iliac vessels that would encompass 94.5% of the pelvic nodes at risk as defined by our node-positive prostate cancer patient cohort. Conclusions: Nodal metastases from prostate cancer are largely localized along the major pelvic vasculature. Defining nodal radiation treatment portals based on vascular rather than bony anatomy may allow for a significant decrease in normal pelvic tissue irradiation and its associated toxicities.
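Computationally, the proposed "2.0-cm radial expansion" around the vessels is a morphological dilation of a binary vessel mask. A sketch on a synthetic 3-D mask (the geometry and voxel spacing are made up):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Synthetic example: a 3-D binary mask of segmented pelvic vessels,
# 2 mm isotropic voxels (both the mask and the spacing are illustrative).
spacing_mm = (2.0, 2.0, 2.0)
vessels = np.zeros((100, 100, 100), dtype=bool)
vessels[20:80, 48:52, 48:52] = True          # a crude vessel segment

# Distance (in mm) from every voxel to the nearest vessel voxel.
dist_mm = distance_transform_edt(~vessels, sampling=spacing_mm)

# Clinical target volume: everything within a 20 mm radial expansion.
ctv = dist_mm <= 20.0
print(ctv.sum() * np.prod(spacing_mm) / 1000.0, "cm^3")
```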
30 CFR 75.1200-2 - Accuracy and scale of mine maps.
Code of Federal Regulations, 2010 CFR
2010-07-01
§ 75.1200-2 Accuracy and scale of mine maps (Mandatory Safety Standards-Underground Coal Mines, Maps). (a) The scale of mine maps submitted to the Secretary shall not be less than 100 or...
Thinking Connections: Concept Maps for Life Science. Book B.
ERIC Educational Resources Information Center
Burggraf, Frederick
The concept maps contained in this book (for grades 7-12) span 35 topics in life science. Topics were chosen using the National Science Education Standards as a guide. The practice exercise in concept mapping is included to give students an idea of what the tasks ahead will be in content rich maps. Two levels of concept maps are included for each…
Global transport in a nonautonomous periodic standard map
Calleja, Renato C.; del-Castillo-Negrete, D.; Martinez-del-Rio, D.; ...
2017-04-14
A non-autonomous version of the standard map with a periodic variation of the perturbation parameter is introduced and studied via an autonomous map obtained from the iteration of the nonautonomous map over a period. Symmetry properties in the variables and parameters of the map are found and used to find relations between rotation numbers of invariant sets. The role of the nonautonomous dynamics on period-one orbits, stability and bifurcation is studied. The critical boundaries for the global transport and for the destruction of invariant circles with fixed rotation number are studied in detail using direct computation and a continuation method. In the case of global transport, the critical boundary has a particular symmetrical horn shape. The results are contrasted with similar calculations found in the literature.
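A minimal sketch of the object under study: the Chirikov standard map with a perturbation parameter that varies periodically, iterated over one period to obtain the autonomous composed map. The period-2 alternation and the parameter values below are assumptions for illustration:

```python
import numpy as np

TWO_PI = 2.0 * np.pi

def standard_map_step(theta, p, k):
    """One step of the Chirikov standard map with perturbation k."""
    p_new = (p + k * np.sin(theta)) % TWO_PI
    theta_new = (theta + p_new) % TWO_PI
    return theta_new, p_new

def composed_map(theta, p, k_sequence):
    """Autonomous map obtained by iterating the nonautonomous map
    over one period of the parameter variation."""
    for k in k_sequence:
        theta, p = standard_map_step(theta, p, k)
    return theta, p

# Period-2 variation of the perturbation parameter (illustrative values).
k_seq = [0.6, 1.2]
theta, p = 0.1, 0.2
orbit = []
for _ in range(1000):
    theta, p = composed_map(theta, p, k_seq)
    orbit.append((theta, p))
```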
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats. We propose an adaptive mediation system called ARIEN (AdapteR Interoperability ENgine), which arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology-matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO was evaluated by measuring the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved an accuracy level of over 90% in conversion between the CDA and vMR standards using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
SNPConvert: SNP Array Standardization and Integration in Livestock Species.
Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra
2016-06-09
One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, in contrast to whole-genome sequence data analysis, SNP array data do not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources have become an obstacle, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present the SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
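The heart of any such standardization step is a per-marker recoding of allele calls against a reference table. A toy sketch, where the lookup table and the A/B coding layout are invented and do not reflect SNPConvert's actual file formats:

```python
# Toy recoding of Illumina-style A/B genotype calls to nucleotide calls,
# given a per-marker allele lookup. Tables and layout are illustrative only.
ALLELE_MAP = {                 # marker -> (allele A, allele B)
    "SNP0001": ("A", "G"),
    "SNP0002": ("C", "T"),
}

def recode(marker: str, ab_call: str) -> str:
    """Convert an 'AB'-coded genotype (e.g. 'AB') to nucleotides ('AG')."""
    a, b = ALLELE_MAP[marker]
    return "".join(a if c == "A" else b for c in ab_call)

print(recode("SNP0001", "AB"))   # -> 'AG'
print(recode("SNP0002", "BB"))   # -> 'TT'
```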
Reaction-Map of Organic Chemistry
ERIC Educational Resources Information Center
Murov, Steven
2007-01-01
The Reaction-Map of Organic Chemistry lists all the most commonly studied reactions in organic chemistry on one page. The Reaction-Map discussed here will act as another learning aid for students, making the study of organic chemistry much easier.
Unsupervised Domain Adaptation with Multiple Acoustic Models
2010-12-01
Discriminative MAP Adaptation. Standard ML-MAP has been extended to incorporate discriminative training criteria such as MMI and MPE [10]. Discriminative MAP ... smoothing variable \tau^{I}. For example, the MMI-MAP mean is given by

\[
\mu_{jm}^{(\mathrm{mmi\text{-}map})} =
\frac{\{\theta_{jm}^{\mathrm{num}}(O) - \theta_{jm}^{\mathrm{den}}(O)\} + D_{jm}\hat{\mu}_{jm} + \tau^{I}\mu_{jm}^{(\mathrm{ml\text{-}map})}}
     {\{\gamma_{jm}^{\mathrm{num}} - \gamma_{jm}^{\mathrm{den}}\} + D_{jm} + \tau^{I}}
\]

where \theta_{jm}^{\mathrm{num}}(O) and \theta_{jm}^{\mathrm{den}}(O) are the numerator and denominator statistics from MMI training, and D_{jm} is the Gaussian-dependent parameter for the extended Baum-Welch (EBW) algorithm. MMI-MAP has been successfully applied in
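A worked numeric instance of the update reconstructed above, with invented scalar statistics (a sketch of the formula only, not of any toolkit's implementation):

```python
def mmi_map_mean(theta_num, theta_den, gamma_num, gamma_den,
                 d_jm, tau_i, mu_hat, mu_ml_map):
    """MMI-MAP mean update for one Gaussian component, following the
    reconstructed formula above (all inputs are scalar toy values)."""
    numer = (theta_num - theta_den) + d_jm * mu_hat + tau_i * mu_ml_map
    denom = (gamma_num - gamma_den) + d_jm + tau_i
    return numer / denom

# Invented sufficient statistics for a single component:
print(mmi_map_mean(theta_num=52.0, theta_den=38.0,
                   gamma_num=10.0, gamma_den=8.0,
                   d_jm=4.0, tau_i=2.0, mu_hat=5.1, mu_ml_map=5.0))
```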
Length and area equivalents for interpreting wildland resource maps
Elliot L. Amidon; Marilyn S. Whitfield
1969-01-01
Map users must refer to an appropriate scale in interpreting wildland resource maps. Length and area equivalents for nine map scales commonly used have been computed. For each scale a 1-page table consists of map-to-ground equivalents, buffer strip or road widths, and cell dimensions required for a specified acreage. The conversion factors are stored in a Fortran...
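Such map-to-ground equivalents are simple proportional conversions. A sketch; the 1:24,000 scale and the 40-acre cell are examples, not necessarily among the nine scales tabulated in the paper:

```python
# Map-to-ground equivalents for representative-fraction map scales.
M_PER_INCH = 0.0254
ACRES_PER_M2 = 1.0 / 4046.8564224

def ground_meters(map_inches: float, scale_denominator: int) -> float:
    """Ground distance represented by a length measured on the map."""
    return map_inches * M_PER_INCH * scale_denominator

def cell_size_for_acreage(acres: float, scale_denominator: int) -> float:
    """Side (in map inches) of a square cell representing a given acreage."""
    side_m = (acres / ACRES_PER_M2) ** 0.5
    return side_m / (M_PER_INCH * scale_denominator)

# Example: at 1:24,000, one map inch covers 2,000 ft (609.6 m) on the ground.
print(ground_meters(1.0, 24_000))           # 609.6
print(cell_size_for_acreage(40.0, 24_000))  # map inches per side for 40 acres
```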
Thieler, E. Robert; Danforth, William W.
1994-01-01
A new, state-of-the-art method for mapping historical shorelines from maps and aerial photographs, the Digital Shoreline Mapping System (DSMS), has been developed. The DSMS is a freely available, public domain software package that meets the cartographic and photogrammetric requirements of precise coastal mapping, and provides a means to quantify and analyze different sources of error in the mapping process. The DSMS is also capable of resolving imperfections in aerial photography that commonly are assumed to be nonexistent. The DSMS utilizes commonly available computer hardware and software, and permits the entire shoreline mapping process to be executed rapidly by a single person in a small lab. The DSMS generates output shoreline position data that are compatible with a variety of Geographic Information Systems (GIS). A second suite of programs, the Digital Shoreline Analysis System (DSAS), has been developed to calculate shoreline rates-of-change from a series of shoreline data residing in a GIS. Four rate-of-change statistics are calculated simultaneously (end-point rate, average of rates, linear regression and jackknife) at a user-specified interval along the shoreline using a measurement baseline approach. An example of DSMS and DSAS application using historical maps and air photos of Punta Uvero, Puerto Rico provides a basis for assessing the errors associated with the source materials as well as the accuracy of computed shoreline positions and erosion rates. The maps and photos used here represent a common situation in shoreline mapping: marginal-quality source materials. The maps and photos are near the usable upper limit of scale and accuracy, yet the shoreline positions are still accurate to ±9.25 m when all sources of error are considered. This level of accuracy yields a resolution of ±0.51 m/yr for shoreline rates-of-change in this example, and is sufficient to identify the short-term trend (36 years) of shoreline change in the study area.
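The four rate-of-change statistics named here are easy to state concretely. A sketch for a single transect with simplified definitions (DSAS applies additional selection criteria, for example to the average of rates); the survey dates and positions are invented:

```python
import numpy as np
from itertools import combinations

def shoreline_rates(years, positions):
    """End-point rate, average of pairwise rates, linear-regression rate,
    and jackknifed regression rate (m/yr) for one shoreline transect."""
    years, positions = np.asarray(years, float), np.asarray(positions, float)
    epr = (positions[-1] - positions[0]) / (years[-1] - years[0])
    aor = np.mean([(positions[j] - positions[i]) / (years[j] - years[i])
                   for i, j in combinations(range(len(years)), 2)])
    lrr = np.polyfit(years, positions, 1)[0]
    jack = np.mean([np.polyfit(np.delete(years, k), np.delete(positions, k), 1)[0]
                    for k in range(len(years))])
    return epr, aor, lrr, jack

# Invented shoreline positions (m, relative to a baseline) at survey dates:
print(shoreline_rates([1936, 1951, 1963, 1972], [120.0, 111.5, 105.0, 98.0]))
```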
Field Guide to the Plant Community Types of Voyageurs National Park
Faber-Langendoen, Don; Aaseng, Norman; Hop, Kevin; Lew-Smith, Michael
2007-01-01
The objective of the U.S. Geological Survey-National Park Service Vegetation Mapping Program is to classify, describe, and map vegetation for most of the park units within the National Park Service (NPS). The program was created in response to the NPS Natural Resources Inventory and Monitoring Guidelines issued in 1992. Products for each park include digital files of the vegetation map and field data, keys and descriptions to the plant communities, reports, metadata, map accuracy verification summaries, and aerial photographs. Interagency teams work in each park and, following standardized mapping and field sampling protocols, develop products and vegetation classification standards that document the various vegetation types found in a given park. The use of a standard national vegetation classification system and mapping protocol facilitate effective resource stewardship by ensuring compatibility and widespread use of the information throughout the NPS as well as by other Federal and state agencies. These vegetation classifications and maps and associated information support a wide variety of resource assessment, park management, and planning needs, and provide a structure for framing and answering critical scientific questions about plant communities and their relation to environmental processes across the landscape. This field guide is intended to make the classification accessible to park visitors and researchers at Voyageurs National Park, allowing them to identify any stand of natural vegetation and showing how the classification can be used in conjunction with the vegetation map (Hop and others, 2001).
MIDAS: Regionally linear multivariate discriminative statistical mapping.
Varol, Erdem; Sotiras, Aristeidis; Davatzikos, Christos
2018-07-01
Statistical parametric maps formed via voxel-wise mass-univariate tests, such as the general linear model, are commonly used to test hypotheses about regionally specific effects in neuroimaging cross-sectional studies where each subject is represented by a single image. Despite being informative, these techniques remain limited as they ignore multivariate relationships in the data. Most importantly, the commonly employed local Gaussian smoothing, which is important for accounting for registration errors and making the data follow Gaussian distributions, is usually chosen in an ad hoc fashion. Thus, it is often suboptimal for the task of detecting group differences and correlations with non-imaging variables. Information mapping techniques, such as searchlight, which use pattern classifiers to exploit multivariate information and obtain more powerful statistical maps, have become increasingly popular in recent years. However, existing methods may lead to important interpretation errors in practice (i.e., misidentifying a cluster as informative, or failing to detect truly informative voxels), while often being computationally expensive. To address these issues, we introduce a novel efficient multivariate statistical framework for cross-sectional studies, termed MIDAS, seeking highly sensitive and specific voxel-wise brain maps, while leveraging the power of regional discriminant analysis. In MIDAS, locally linear discriminative learning is applied to estimate the pattern that best discriminates between two groups, or predicts a variable of interest. This pattern is equivalent to local filtering by an optimal kernel whose coefficients are the weights of the linear discriminant. By composing information from all neighborhoods that contain a given voxel, MIDAS produces a statistic that collectively reflects the contribution of the voxel to the regional classifiers as well as the discriminative power of the classifiers. Critically, MIDAS efficiently assesses the statistical significance of the derived statistic by analytically approximating its null distribution without the need for computationally expensive permutation tests. The proposed framework was extensively validated using simulated atrophy in structural magnetic resonance imaging (MRI) and further tested using data from a task-based functional MRI study as well as a structural MRI study of cognitive performance. The performance of the proposed framework was evaluated against standard voxel-wise general linear models and other information mapping methods. The experimental results showed that MIDAS achieves relatively higher sensitivity and specificity in detecting group differences. Together, our results demonstrate the potential of the proposed approach to efficiently map effects of interest in both structural and functional data.
Iehisa, Julio Cesar Masaru; Ohno, Ryoko; Kimura, Tatsuro; Enoki, Hiroyuki; Nishimura, Satoru; Okamoto, Yuki; Nasuda, Shuhei; Takumi, Shigeo
2014-01-01
The large genome and allohexaploidy of common wheat have complicated construction of a high-density genetic map. Although improvements in the throughput of next-generation sequencing (NGS) technologies have made it possible to obtain a large amount of genotyping data for an entire mapping population by direct sequencing, including hexaploid wheat, a significant number of missing data points are often apparent due to the low coverage of sequencing. In the present study, a microarray-based polymorphism detection system was developed using NGS data obtained from complexity-reduced genomic DNA of two common wheat cultivars, Chinese Spring (CS) and Mironovskaya 808. After design and selection of polymorphic probes, 13,056 new markers were added to the linkage map of a recombinant inbred mapping population between CS and Mironovskaya 808. On average, 2.49 missing data points per marker were observed in the 201 recombinant inbred lines, with a maximum of 42. Around 40% of the new markers were derived from genic regions and 11% from repetitive regions. The low number of retroelements indicated that the new polymorphic markers were mainly derived from the less repetitive region of the wheat genome. Around 25% of the mapped sequences were useful for alignment with the physical map of barley. Quantitative trait locus (QTL) analyses of 14 agronomically important traits related to flowering, spikes, and seeds demonstrated that the new high-density map showed improved QTL detection, resolution, and accuracy over the original simple sequence repeat map.
The GAAIN Entity Mapper: An Active-Learning System for Medical Data Mapping.
Ashish, Naveen; Dewan, Peehoo; Toga, Arthur W
2015-01-01
This work is focused on mapping biomedical datasets to a common representation, as an integral part of data harmonization for integrated biomedical data access and sharing. We present GEM, an intelligent software assistant for automated data mapping across different datasets or from a dataset to a common data model. The GEM system automates data mapping by providing precise suggestions for data element mappings. It leverages the detailed metadata about elements in associated dataset documentation such as data dictionaries that are typically available with biomedical datasets. It employs unsupervised text mining techniques to determine similarity between data elements and also employs machine-learning classifiers to identify element matches. It further provides an active-learning capability where the process of training the GEM system is optimized. Our experimental evaluations show that the GEM system provides highly accurate data mappings (over 90% accuracy) for real datasets of thousands of data elements each, in the Alzheimer's disease research domain. Further, the effort in training the system for new datasets is also optimized. We are currently employing the GEM system to map Alzheimer's disease datasets from around the globe into a common representation, as part of a global Alzheimer's disease integrated data sharing and analysis network called GAAIN. GEM achieves significantly higher data mapping accuracy for biomedical datasets compared to other state-of-the-art tools for database schema matching that have similar functionality. With the use of active-learning capabilities, the user effort in training the system is minimal.
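A much-simplified analogue of the text-similarity step (not GEM itself): scoring candidate element matches by TF-IDF cosine similarity over invented data-dictionary descriptions:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented data-dictionary descriptions for two datasets.
source = {"MMSE_TOT": "mini mental state examination total score",
          "SUBJ_AGE": "age of subject at baseline visit in years"}
target = {"mmse": "total score on the mini mental state exam",
          "age_baseline": "participant age in years at baseline"}

vec = TfidfVectorizer()
docs = list(source.values()) + list(target.values())
tfidf = vec.fit_transform(docs)
sim = cosine_similarity(tfidf[:len(source)], tfidf[len(source):])

# For each source element, report the best-scoring target element.
for i, s in enumerate(source):
    best = sim[i].argmax()
    print(f"{s} -> {list(target)[best]} (score {sim[i, best]:.2f})")
```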
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on the results of a multisite collaborative project launched by the MRI subgroup of the Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion-map Digital Imaging and Communications in Medicine (DICOM) object in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and with true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources of detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, and ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
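For reference, the mono-exponential ADC computation that such parametric maps encode, sketched for a two-b-value acquisition (the b-value and signals are illustrative):

```python
import numpy as np

def adc_map(s0, sb, b_value_s_per_mm2=1000.0, eps=1e-6):
    """Voxel-wise apparent diffusion coefficient (mm^2/s) from the
    mono-exponential model S_b = S_0 * exp(-b * ADC)."""
    s0 = np.clip(np.asarray(s0, float), eps, None)  # guard against log(0)
    sb = np.clip(np.asarray(sb, float), eps, None)
    return np.log(s0 / sb) / b_value_s_per_mm2

# Synthetic signals: a voxel with ADC = 1.1e-3 mm^2/s.
print(adc_map(1000.0, 1000.0 * np.exp(-1000.0 * 1.1e-3)))  # ~1.1e-3
```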
Patellar cartilage lesions: comparison of magnetic resonance imaging and T2 relaxation-time mapping.
Hannila, I; Nieminen, M T; Rauvala, E; Tervonen, O; Ojala, R
2007-05-01
To evaluate the detection and the size of focal patellar cartilage lesions on T2 maps as compared to standard clinical magnetic resonance imaging (MRI) at 1.5T. Fifty-five consecutive clinical patients referred for knee MRI were imaged both with a standard knee MRI protocol (proton-density-weighted sagittal and axial series, T2-weighted sagittal and coronal series, and T1-weighted coronal series) and with an axial multislice multi-echo spin-echo measurement to determine the T2 relaxation time of the patellar cartilage. MR images and T2 maps of patellar cartilage were evaluated for focal lesions. The lesions were evaluated for lesion width (mm), lesion depth (1/3, 2/3, or 3/3 of cartilage thickness), and T2 value (20-40 ms, 40-60 ms, or 60-80 ms) based on visual evaluation. Altogether, 36 focal patellar cartilage lesions were detected in 20 subjects (11 male, 9 female, mean age 40±15 years). Twenty-eight lesions were detected both on MRI and on T2 maps, while eight lesions were visible only on T2 maps. Cartilage lesions were significantly wider (P = 0.001) and deeper (P<0.001) on T2 maps as compared to standard knee MRI. Most lesions (27) had moderately increased T2 values (40-60 ms), while two lesions had slightly (20-40 ms) and seven remarkably (60-80 ms) increased T2 relaxation times. T2 mapping of articular cartilage is feasible in the clinical setting and may reveal early cartilage lesions not visible with standard clinical MRI.
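The computation underlying a T2 map is a per-voxel fit of S(TE) = S0·exp(−TE/T2) to the multi-echo data. A log-linear least-squares sketch for one voxel, on synthetic signals:

```python
import numpy as np

def t2_fit(echo_times_ms, signals):
    """Estimate T2 (ms) by a log-linear least-squares fit of
    S(TE) = S0 * exp(-TE / T2) for one voxel."""
    te = np.asarray(echo_times_ms, float)
    s = np.asarray(signals, float)
    slope, _ = np.polyfit(te, np.log(s), 1)   # ln S = ln S0 - TE / T2
    return -1.0 / slope

# Synthetic multi-echo signal for a voxel with T2 = 50 ms:
te = np.array([10, 20, 30, 40, 50, 60, 70, 80], float)
print(t2_fit(te, 1200.0 * np.exp(-te / 50.0)))  # ~50.0
```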
Habib, A.; Jarvis, A.; Al-Durgham, M. M.; Lay, J.; Quackenbush, P.; Stensaas, G.; Moe, D.
2007-01-01
The mapping community is witnessing significant advances in available sensors, such as medium format digital cameras (MFDC) and Light Detection and Ranging (LiDAR) systems. In this regard, the Digital Photogrammetry Research Group (DPRG) of the Department of Geomatics Engineering at the University of Calgary has been actively involved in the development of standards and specifications for regulating the use of these sensors in mapping activities. More specifically, the DPRG has been working on developing new techniques for the calibration and stability analysis of medium format digital cameras. This research is essential since these sensors have not been developed with mapping applications in mind. Therefore, prior to their use in Geomatics activities, new standards should be developed to ensure the quality of the developed products. On another front, the persistent improvement in direct geo-referencing technology has led to an expansion in the use of LiDAR systems for the acquisition of dense and accurate surface information. However, the processing of the raw LiDAR data (e.g., ranges, mirror angles, and navigation data) remains a non-transparent process that is proprietary to the manufacturers of LiDAR systems. Therefore, the DPRG has been focusing on the development of quality control procedures to quantify the accuracy of LiDAR output in the absence of initial system measurements. This paper presents a summary of the research conducted by the DPRG together with the British Columbia Base Mapping and Geomatic Services (BMGS) and the United States Geological Survey (USGS) for the development of quality assurance and quality control procedures for emerging mapping technologies. The outcome of this research will allow for the possibility of introducing North American Standards and Specifications to regulate the use of MFDC and LiDAR systems in the mapping industry.
Topographic Map of the West Candor Chasma Region of Mars, MTM 500k -05/282E OMKT
2004-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by the Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20°-wide zones. For the area covered by this map sheet the central meridian is at 290° E. (70° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric, as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.
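Two small computations are implicit in this description: locating the central meridian of a 20°-wide MTM zone, and converting planetocentric to planetographic latitude on the stated spheroid. A sketch; the zone-centering convention (centers at 10°, 30°, ..., 350° E.) is inferred from the quoted central meridians and should be treated as an assumption:

```python
import math

A_KM, B_KM = 3396.0, 3376.8   # equatorial and polar radii from the map sheet

def mtm_central_meridian(lon_east_deg: float) -> float:
    """Central meridian of the 20-degree-wide MTM zone containing a longitude
    (zone centers assumed at 10, 30, ..., 350 degrees east)."""
    return 20.0 * math.floor(lon_east_deg / 20.0) + 10.0

def planetographic_latitude(lat_centric_deg: float) -> float:
    """Convert planetocentric to planetographic latitude on the spheroid:
    tan(lat_graphic) = (a/b)^2 * tan(lat_centric)."""
    t = math.tan(math.radians(lat_centric_deg))
    return math.degrees(math.atan((A_KM / B_KM) ** 2 * t))

print(mtm_central_meridian(282.0))      # 290.0, matching this map sheet
print(planetographic_latitude(-5.0))    # slightly south of -5 degrees
```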
Topographic Map of the Ophir and Central Candor Chasmata Region of Mars MTM 500k -05/287E OMKT
2004-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by the Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20°-wide zones. For the area covered by this map sheet the central meridian is at 290° E. (70° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric, as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.
Topographic map of the Tithonium Chasma Region of Mars, MTM 500k -05/277E OMKT
2004-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by the Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20°-wide zones. For the area covered by this map sheet the central meridian is at 270° E. (90° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric, as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.
Connecticut Music Trace Map for Grades 6 and 8. Revised.
ERIC Educational Resources Information Center
Connecticut State Board of Education, Hartford.
These Connecticut Curriculum Trace Maps for music are designed to help curriculum developers and teachers translate Connecticut's K-12 performance standards into objectives and classroom practices. Trace Maps provide specific descriptions of what students should know and be able to do at smaller grade level clusters. Elements in the Trace Maps are…
Absolute color scale for improved diagnostics with wavefront error mapping.
Smolek, Michael K; Klyce, Stephen D
2007-11-01
Wavefront data are expressed in micrometers and referenced to the pupil plane, but current methods to map wavefront error lack standardization. Many use normalized or floating scales that may confuse the user by generating ambiguous, noisy, or varying information. An absolute scale that combines consistent clinical information with statistical relevance is needed for wavefront error mapping. The color contours should correspond better to current corneal topography standards to improve clinical interpretation. Retrospective analysis of wavefront error data. Historic ophthalmic medical records. Topographic Modeling System topographical examinations of 120 corneas across 12 categories were used. Corneal wavefront error data in micrometers from each topography map were extracted at 8 Zernike polynomial orders and for 3 pupil diameters expressed in millimeters (3, 5, and 7 mm). Both total aberrations (orders 2 through 8) and higher-order aberrations (orders 3 through 8) were expressed in the form of frequency histograms to determine the working range of the scale across all categories. The standard deviation of the mean error of normal corneas determined the map contour resolution. Map colors were based on corneal topography color standards and on the ability to distinguish adjacent color contours through contrast. Higher-order and total wavefront error contour maps for different corneal conditions. An absolute color scale was produced that encompassed a range of ±6.5 µm and a contour interval of 0.5 µm. All aberrations in the categorical database were plotted with no loss of clinical information necessary for classification. In the few instances where mapped information was beyond the range of the scale, the type and severity of aberration remained legible. When wavefront data are expressed in micrometers, this absolute scale facilitates the determination of the severity of aberrations present compared with a floating scale, particularly for distinguishing normal from abnormal levels of wavefront error. The new color palette makes it easier to identify disorders. The corneal mapping method can be extended to mapping whole eye wavefront errors. When refraction data are expressed in diopters, the previously published corneal topography scale is suggested.
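As a rough illustration of the fixed scale described (±6.5 µm in 0.5 µm steps, with out-of-range values kept legible), the contour binning can be sketched as follows. The bin edges follow the abstract; the clamping behaviour is an illustrative assumption, and no attempt is made to reproduce the published palette:

```python
import numpy as np

EDGES = np.arange(-6.5, 6.5 + 0.5, 0.5)   # 27 edges -> 26 contour bins

def contour_bin(wavefront_error_um):
    """Assign each wavefront error value (um) to a fixed contour bin.

    Values beyond +/-6.5 um are clamped into the outermost bins, a
    simple stand-in for the out-of-range handling the abstract describes.
    """
    x = np.clip(np.asarray(wavefront_error_um, dtype=float),
                EDGES[0], EDGES[-1] - 1e-9)
    return np.digitize(x, EDGES) - 1       # bin indices 0..25

print(contour_bin([-7.2, -0.3, 0.0, 5.9, 8.1]))  # bins: 0, 12, 13, 24, 25
```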
Strong Convergence of Iteration Processes for Infinite Family of General Extended Mappings
NASA Astrophysics Data System (ADS)
Hussein Maibed, Zena
2018-05-01
In this paper, we introduce the concept of a general extended mapping, which is independent of nonexpansive mappings, and give an iteration process for families of quasi-nonexpansive and general extended mappings. The existence of common fixed points for these processes in Hilbert spaces is also studied.
Kannry, J L; Wright, L; Shifman, M; Silverstein, S; Miller, P L
1996-01-01
OBJECTIVE: To examine the issues involved in mapping an existing structured controlled vocabulary, the Medical Entities Dictionary (MED) developed at Columbia University, to an institutional vocabulary, the laboratory and pharmacy vocabularies of the Yale New Haven Medical Center. DESIGN: 200 Yale pharmacy terms and 200 Yale laboratory terms were randomly selected from database files containing all of the Yale laboratory and pharmacy terms. These 400 terms were then mapped to the MED in three phases: mapping terms, mapping relationships between terms, and mapping attributes that modify terms. RESULTS: 73% of the Yale pharmacy terms mapped to MED terms. 49% of the Yale laboratory terms mapped to MED terms. After certain obsolete and otherwise inappropriate laboratory terms were eliminated, the latter rate improved to 59%. 23% of the unmatched Yale laboratory terms failed to match because of differences in granularity with MED terms. The Yale and MED pharmacy terms share 12 of 30 distinct attributes. The Yale and MED laboratory terms share 14 of 23 distinct attributes. CONCLUSION: The mapping of an institutional vocabulary to a structured controlled vocabulary requires that the mapping be performed at the level of terms, relationships, and attributes. The mapping process revealed the importance of standardization of local vocabulary subsets, standardization of attribute representation, and term granularity. PMID:8750391
Labeling Projections on Published Maps
Snyder, John P.
1987-01-01
To permit accurate scaling on a map, and to use the map as a source of accurate positions in the transfer of data, certain parameters - such as the standard parallels selected for a conic projection - must be stated on the map. This information is often missing on published maps. Three current major world atlases are evaluated with respect to map projection identification. The parameters essential for the projections used in these three atlases are discussed and listed. These parameters should be stated on any map based on the same projection.
NASA Astrophysics Data System (ADS)
Günther, Andreas; Duscher, Klaus; Broda, Stefan; Clos, Patrick; Reichling, Jörg
2017-04-01
Since the middle of the last century, pan-European hydrogeological overview mapping has been conducted at the scale of 1:1.5 million, following common standards and guidelines to interpret suitable geologic mapping units in terms of potential uppermost aquifer (or non-aquifer) characteristics. These comprise potential aquifer productivities and general hydrogeological aquifer conditions (fissured vs. porous). The printed IHME1500 dataset, successively elaborated and published from 1970 to 2013, consists of 25 individual map sheets. Besides the potential aquifer characterization grouped in six classes, IHME1500 offers a complete coverage of lithological material properties of potential shallow aquifer assemblages, and tracelines of major fault structures. Regional information on groundwater surfaces, aquifer thicknesses and depths, locations and characteristics of groundwater springs, and other punctual information related to European groundwater resources is present for some areas in selected map sheets, but is not digitally available. Synoptic IHME1500 vector data consists of a topographically corrected, seamless and harmonized polygon layer with attribute information on potential aquifer productivity and lithology. While the standardized aquifer classification is relatively easy to harmonize across the entire mapped area, the lithological information of IHME1500 is presented using sheet-specific legend information, resulting in more than 1000 aquifer lithology classes. An attempt was made to harmonize this information utilizing a specifically developed taxonomic scheme, treating consolidated, partly consolidated and unconsolidated materials separately. The translation of the original lithological information into this scheme allows for a hierarchical grouping of the mapping units into five generalization levels, where the highest aggregation level displays a ternary map showing the distribution of consolidated, partially consolidated and unconsolidated aquifer materials. The harmonized and hierarchically structured IHME1500 information based on the published map sheet data also allows for the extension of the mapped area in regions where only incomplete, unpublished IHME1500 draft information is available. IHME1500 now covers the entire European continent up to the Urals, the Caucasus region, and parts of the Middle East (Turkey, Cyprus, parts of Syria and Iraq). IHME1500 represents the only digitally available coherent overview information on potential groundwater resources and shallow aquifer characteristics across Europe. The data is therefore of great use for European policy support in terms of, e.g., transboundary aquifer identification and characterization, the harmonization of regional European groundwater bodies, and the delineation of hot-spot regions for aquifer systems under potential environmental stress with respect to climate change, natural hazards or migratory flows. Additionally, the lithological information of IHME1500 represents the only harmonized pan-European dataset on shallow subsurface geologic materials available and can be used for the spatial delineation of soil parent materials and as a spatial predictor for the evaluation of geomorphological hazards at overview scales. IHME1500 GIS data can be downloaded through BGR's product centre (http://produktcenter.bgr.de).
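The hierarchical grouping just described can be pictured as a fixed taxonomy over sheet-specific lithology classes, so that each mapping unit can be aggregated at any generalization level. A toy sketch; the class names and taxonomy entries are invented for illustration and are not the IHME1500 scheme:

```python
# Each sheet-specific class is assigned a path in a fixed taxonomy;
# level 1 corresponds to the ternary consolidated / partly consolidated /
# unconsolidated map described above.
TAXONOMY = {
    "karstified limestone": ("consolidated", "carbonate", "limestone"),
    "fissured sandstone":   ("consolidated", "clastic", "sandstone"),
    "fluvial gravel":       ("unconsolidated", "coarse clastic", "gravel"),
}

def generalize(lith_class: str, level: int) -> str:
    """Return the class aggregated to the requested taxonomy level."""
    return TAXONOMY[lith_class][level - 1]

print(generalize("fluvial gravel", 1))   # 'unconsolidated'
print(generalize("fluvial gravel", 3))   # 'gravel'
```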
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard and interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications include the Web Coverage Service (WCS) for data, the Web Map Service (WMS) for pictures of data, the Web Map Tile Service (WMTS) for pictures of data tiles, and Styled Layer Descriptors (SLD) for rendered styles.
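For readers unfamiliar with these services, here is a hedged sketch of what a request to one of them looks like. The endpoint and layer name below are hypothetical, but the parameter set follows the standard WMS 1.3.0 GetMap convention (including the lat/lon axis order for EPSG:4326):

```python
from urllib.parse import urlencode

BASE = "https://example.gov/wms"   # hypothetical endpoint

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "precipitation_monthly",   # hypothetical layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "-90,-180,90,180",           # lat/lon order in WMS 1.3.0
    "width": "1024",
    "height": "512",
    "format": "image/png",
}

print(f"{BASE}?{urlencode(params)}")     # URL returning a picture of the data
```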
Chiang, Ann-Shyn; Lin, Chih-Yung; Chuang, Chao-Chun; Chang, Hsiu-Ming; Hsieh, Chang-Huain; Yeh, Chang-Wei; Shih, Chi-Tin; Wu, Jian-Jheng; Wang, Guo-Tzau; Chen, Yung-Chang; Wu, Cheng-Chi; Chen, Guan-Yu; Ching, Yu-Tai; Lee, Ping-Chang; Lin, Chih-Yang; Lin, Hui-Hao; Wu, Chia-Chou; Hsu, Hao-Wei; Huang, Yun-Ann; Chen, Jing-Yi; Chiang, Hsin-Jung; Lu, Chun-Fang; Ni, Ru-Fen; Yeh, Chao-Yuan; Hwang, Jenn-Kang
2011-01-11
Animal behavior is governed by the activity of interconnected brain circuits. Comprehensive brain wiring maps are thus needed in order to formulate hypotheses about information flow and also to guide genetic manipulations aimed at understanding how genes and circuits orchestrate complex behaviors. To assemble this map, we deconstructed the adult Drosophila brain into approximately 16,000 single neurons and reconstructed them into a common standardized framework to produce a virtual fly brain. We have constructed a mesoscopic map and found that it consists of 41 local processing units (LPUs), six hubs, and 58 tracts covering the whole Drosophila brain. Despite individual local variation, the architecture of the Drosophila brain shows invariance for both the aggregation of local neurons (LNs) within specific LPUs and for the connectivity of projection neurons (PNs) between the same set of LPUs. An open-access image database, named FlyCircuit, has been constructed for online data archiving, mining, analysis, and three-dimensional visualization of all single neurons, brain-wide LPUs, their wiring diagrams, and neural tracts. We found that the Drosophila brain is assembled from families of multiple LPUs and their interconnections. This provides an essential first step in the analysis of information processing within and between neurons in a complete brain. Copyright © 2011 Elsevier Ltd. All rights reserved.
Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David
2015-08-01
Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Choi, Y; Jung, C; Chae, Y; Kang, M; Kim, J; Joung, K; Lim, J; Cho, S; Sung, S; Lee, E; Kim, S
2014-01-01
Mapping of drug indications to ICD-10 was undertaken in Korea by a public and a private institution for their own purposes. A different mapping approach was used by each institution, which presented a good opportunity to compare the validity of the two approaches. This study was undertaken to compare the validity of a direct mapping approach and an indirect terminology-based mapping approach of drug indications against a gold standard drawn from the results of the two mapping processes. Three hundred and seventy-five cardiovascular reference drugs were selected from all listed cardiovascular drugs for the study. In the direct approach, two experienced nurse coders mapped the free-text indications directly to ICD-10. In the indirect terminology-based approach, the indications were extracted and coded in the Korean Standard Terminology of Medicine. These terminology-coded indications were then manually mapped to ICD-10. The results of the two approaches were compared to the gold standard. A kappa statistic was calculated to assess the compatibility of the two mapping approaches. Recall, precision and F1 score of each mapping approach were calculated and analyzed using a paired t-test. The mean number of indications for the study drugs was 5.42. The mean number of ICD-10 codes that matched in the direct approach was 46.32, and that of the indirect terminology-based approach was 56.94. The agreement of the mapping results between the two approaches was poor (kappa = 0.19). The indirect terminology-based approach showed higher recall (86.78%) than the direct approach (p < 0.001). However, there was no difference in precision and F1 score between the two approaches. Considering no differences in the F1 scores, both approaches may be used in practice for mapping drug indications to ICD-10. However, in terms of consistency, time and manpower, better results are expected from the indirect terminology-based approach.
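The evaluation statistics named above (recall, precision, F1, and the kappa statistic) are easy to reproduce. A minimal sketch with toy data; the ICD-10 code sets and rating vectors below are invented, not the study's:

```python
def precision_recall_f1(predicted: set, gold: set):
    """Score one drug's predicted ICD-10 code set against the gold standard."""
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

def cohen_kappa(a: list, b: list) -> float:
    """Cohen's kappa for two raters' binary decisions on the same items."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)              # chance agreement
    return (po - pe) / (1 - pe) if pe != 1 else 1.0

# Toy example with hypothetical codes:
print(precision_recall_f1({"I10", "I25"}, {"I10", "I25", "I50"}))  # (1.0, 0.667, 0.8)
# Two approaches' yes/no decisions over the same candidate (drug, code) pairs:
print(round(cohen_kappa([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]), 2))     # ~0.17
```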
NASA Technical Reports Server (NTRS)
Halpern, D.; Zlotnicki, V.; Newman, J.; Brown, O.; Wentz, F.
1991-01-01
Monthly mean global distributions for 1988 are presented with a common color scale and geographical map. Distributions are included for sea surface height variation estimated from GEOSAT; surface wind speed estimated from the Special Sensor Microwave Imager on the Defense Meteorological Satellite Program spacecraft; sea surface temperature estimated from the Advanced Very High Resolution Radiometer on NOAA spacecraft; and the Cartesian components of the 10-m height wind vector computed by the European Center for Medium Range Weather Forecasting. Charts of monthly mean value, sampling distribution, and standard deviation value are displayed. Annual mean distributions are also shown.
An interactive environment for the analysis of large Earth observation and model data sets
NASA Technical Reports Server (NTRS)
Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.
1993-01-01
We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
An interactive environment for the analysis of large Earth observation and model data sets
NASA Technical Reports Server (NTRS)
Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.
1992-01-01
We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
Mapping of Outdoor Classrooms.
ERIC Educational Resources Information Center
Horvath, Victor G.
Mapping symbols adopted by the Michigan Department of Natural Resources are presented with their explanations. In an effort to provide standardization and familiarity, teachers and other school people involved in an outdoor education program are encouraged to utilize the same symbols in constructing maps. (DK)
The Iranian National Geodata Revision Strategy and Realization Based on Geodatabase
NASA Astrophysics Data System (ADS)
Haeri, M.; Fasihi, A.; Ayazi, S. M.
2012-07-01
In recent years, using spatial databases for storing and managing spatial data has become a hot topic in the field of GIS. Accordingly, the National Cartographic Center of Iran (NCC) produces, from time to time, spatial data which is usually included in databases. One of NCC's major projects was designing the National Topographic Database (NTDB). NCC decided to create a National Topographic Database of the entire country based on 1:25000 coverage maps. The standard of NTDB was published in 1994 and its database was created at the same time. In NTDB, geometric data was stored in MicroStation design format (DGN), in which each feature has a link to its attribute data (stored in a Microsoft Access file). NTDB files were produced in a sheet-wise mode and then stored in a file-based style. Besides map compilation, revision of existing maps has already started. Key problems for NCC are the revision strategy, the NTDB file-based storage style, and operator challenges (NCC operators mostly prefer to edit and revise geometric data in CAD environments). A GeoDatabase solution for national geodata, based on NTDB map files and operators' revision preferences, is introduced and released herein. The proposed solution extends the traditional methods to produce a seamless spatial database which can be revised in CAD and GIS environments simultaneously. The proposed system is the common data framework to create a central data repository for spatial data storage and management.
Nonlinear mapping of the luminance in dual-layer high dynamic range displays
NASA Astrophysics Data System (ADS)
Guarnieri, Gabriele; Ramponi, Giovanni; Bonfiglio, Silvio; Albani, Luigi
2009-02-01
It has long been known that the human visual system (HVS) has a nonlinear response to luminance. This nonlinearity can be quantified using the concept of just noticeable difference (JND), which represents the minimum amplitude of a specified test pattern an average observer can discern from a uniform background. The JND depends on the background luminance following a threshold versus intensity (TVI) function. It is possible to define a curve which maps physical luminances into a perceptually linearized domain. This mapping can be used to optimize a digital encoding, by minimizing the visibility of quantization noise. It is also commonly used in medical applications to display images adapting to the characteristics of the display device. High dynamic range (HDR) displays, which are beginning to appear on the market, can display luminance levels outside the range in which most standard mapping curves are defined. In particular, dual-layer LCD displays are able to extend the gamut of luminance offered by conventional liquid crystals towards the black region; in such areas suitable and HVS-compliant luminance transformations need to be determined. In this paper we propose a method, which is primarily targeted to the extension of the DICOM curve used in medical imaging, but also has a more general application. The method can be modified in order to compensate for the ambient light, which can be significantly greater than the black level of an HDR display and consequently reduce the visibility of the details in dark areas.
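To make the perceptual-linearization idea concrete, here is a minimal sketch under a pure Weber-law TVI assumption (threshold proportional to luminance). The Weber fraction and luminance range below are illustrative assumptions, and the actual DICOM grayscale standard display function uses a fitted curve rather than this closed form:

```python
import numpy as np

W = 0.01                      # assumed Weber fraction (1% JND)
L_MIN, L_MAX = 0.005, 600.0   # assumed display luminance range, cd/m^2

def jnd_index(L):
    """Map luminance to a JND index: integrating dJ = dL / (W * L)
    gives J = ln(L / L_MIN) / ln(1 + W)."""
    return np.log(np.asarray(L) / L_MIN) / np.log(1.0 + W)

def luminance_from_index(J):
    """Inverse mapping: luminances equally spaced in JND index."""
    return L_MIN * (1.0 + W) ** np.asarray(J)

# A perceptually linearized 8-bit tone curve for the assumed display:
codes = np.linspace(0.0, jnd_index(L_MAX), 256)
lut = luminance_from_index(codes)
print(lut[0], lut[128], lut[255])   # L_MIN ... L_MAX, equal JND steps
```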
NASA Astrophysics Data System (ADS)
Proctor, R.; Mancini, S.; Hoenner, X.; Tattersall, K.; Pasquer, B.; Galibert, G.; Moltmann, T.
2016-02-01
Salinity and temperature measurements from different sources have been assembled into a common data structure in a relational database. Quality control flags have been mapped to a common scheme and associated with each measurement. For datasets such as gliders, moorings, or ship underway data, which are sampled at high temporal resolution (e.g., one measurement per second), a binning and sub-sampling approach has been applied to reduce the number of measurements to hourly sampling. After averaging, approximately 25 million measurements are available in this dataset collection. A national shelf and coastal data atlas has been created using all the temperature and salinity measurements that pass various quality control checks. These observations have been binned spatially on a horizontal grid of ¼ degree with standard vertical levels (every 10 meters from the surface to 500 m depth) and temporally on a monthly time range over the period January 1995 to December 2014. The number of observations in each bin has been determined and additional statistics (the mean, the standard deviation, and the minimum and maximum values) have been calculated, enabling a degree of uncertainty to be associated with any measurement. The data atlas is available as a Web Feature Service.
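A compact sketch of the binning just described: ¼-degree cells, 10-m standard levels, monthly periods, and count/mean/std/min/max per bin. Column names and the few sample rows are assumptions for illustration:

```python
import numpy as np
import pandas as pd

obs = pd.DataFrame({
    "lon": [150.01, 150.12, 150.40],
    "lat": [-35.02, -35.05, -35.30],
    "depth_m": [4.0, 12.0, 18.0],
    "time": pd.to_datetime(["2010-06-03", "2010-06-20", "2010-07-01"]),
    "temperature": [16.2, 15.9, 15.1],
})

obs["lon_bin"] = np.floor(obs["lon"] / 0.25) * 0.25      # 1/4-degree cells
obs["lat_bin"] = np.floor(obs["lat"] / 0.25) * 0.25
obs["depth_bin"] = (obs["depth_m"] // 10) * 10           # 10 m standard levels
obs["month"] = obs["time"].dt.to_period("M")             # monthly bins

atlas = (obs.groupby(["lon_bin", "lat_bin", "depth_bin", "month"])["temperature"]
            .agg(["count", "mean", "std", "min", "max"]))
print(atlas)   # per-bin statistics, giving a measure of uncertainty
```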
Preliminary geologic map of the Perris 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Digital preparation by Bovard, Kelly R.; Alvarez, Rachel M.
2003-01-01
Open-File Report 03-270 contains a digital geologic map database of the Perris 7.5' quadrangle, Riverside County, California that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A Postscript file to plot the geologic map on a topographic base, and containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. Portable Document Format (.pdf) files of: a. A Readme file b. The same graphic as described in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size settings influence map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g., Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc.
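The grain-size suffix convention above is mechanical enough to decode in code. A toy sketch, assuming the suffix has already been split from the base unit symbol (splitting real symbols unambiguously would need the map's unit list):

```python
# Grain-size suffix letters as defined in the report text.
GRAIN = {
    "lg": "large boulders",
    "b": "boulder",
    "g": "gravel",
    "a": "arenaceous",
    "s": "silt",
    "c": "clay",
}

def decode_suffix(suffix: str) -> list[str]:
    """Greedily decode a grain-size suffix, matching 'lg' before single letters."""
    out, i = [], 0
    while i < len(suffix):
        if suffix.startswith("lg", i):
            out.append(GRAIN["lg"]); i += 2
        elif suffix[i] in GRAIN:
            out.append(GRAIN[suffix[i]]); i += 1
        else:
            raise ValueError(f"unknown grain-size letter {suffix[i]!r}")
    return out

print(decode_suffix("a"))    # Qyf + a   -> ['arenaceous']
print(decode_suffix("sc"))   # Qyf2 + sc -> ['silt', 'clay']
```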
Geologic map of the Devore 7.5' quadrangle, San Bernardino County, California
Morton, Douglas M.; Matti, Jonathan C.
2001-01-01
This Open-File Report contains a digital geologic map database of the Devore 7.5' quadrangle, San Bernardino County, California, that includes: 1. ARC/INFO (Environmental Systems Research Institute) version 7.2.1 coverages of the various components of the geologic map 2. A PostScript (.ps) file to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram, a Description of Map Units, an index map, and a regional structure map 3. Portable Document Format (.pdf) files of: a. This Readme; includes an Appendix, containing metadata details found in devre_met.txt b. The same graphic as plotted in 2 above. (Test plots from this .pdf do not produce 1:24,000-scale maps. Adobe Acrobat page-size settings control map scale.) The Correlation of Map Units and Description of Map Units are in the editorial format of USGS Miscellaneous Investigations Series maps (I-maps) but have not been edited to comply with I-map standards. Within the geologic-map data package, map units are identified by such standard geologic-map criteria as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Devore 7.5' topographic quadrangle in conjunction with the geologic map.
Interval data clustering using self-organizing maps based on adaptive Mahalanobis distances.
Hajjar, Chantal; Hamdan, Hani
2013-10-01
The self-organizing map is a kind of artificial neural network used to map high dimensional data into a low dimensional space. This paper presents a self-organizing map for interval-valued data based on adaptive Mahalanobis distances in order to do clustering of interval data with topology preservation. Two methods based on the batch training algorithm for the self-organizing maps are proposed. The first method uses a common Mahalanobis distance for all clusters. In the second method, the algorithm starts with a common Mahalanobis distance per cluster and then switches to use a different distance per cluster. This process allows a more adapted clustering for the given data set. The performances of the proposed methods are compared and discussed using artificial and real interval data sets. Copyright © 2013 Elsevier Ltd. All rights reserved.
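The assignment step at the core of such a method can be sketched briefly. This shows only a common (shared) Mahalanobis metric applied to interval centres, not the full adaptive batch-training algorithm with per-cluster covariances:

```python
import numpy as np

def mahalanobis_sq(x, w, cov_inv):
    """Squared Mahalanobis distance between a sample and a prototype."""
    d = x - w
    return float(d @ cov_inv @ d)

def assign_to_bmu(X, W, cov_inv):
    """Best-matching unit for each sample under the shared metric."""
    return np.array([np.argmin([mahalanobis_sq(x, w, cov_inv) for w in W])
                     for x in X])

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))             # interval centres (toy data)
W = rng.normal(size=(4, 2))              # 4 map prototypes
cov_inv = np.linalg.inv(np.cov(X.T))     # common metric estimated from the data
print(assign_to_bmu(X, W, cov_inv))      # cluster index per sample
```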
Wang, Mingjie; Ye, Yuzhen; Tang, Haixu
2012-06-01
The wide applications of next-generation sequencing (NGS) technologies in metagenomics have raised many computational challenges. One of the essential problems in metagenomics is to estimate the taxonomic composition of a microbial community, which can be approached by mapping shotgun reads acquired from the community to previously characterized microbial genomes followed by quantity profiling of these species based on the number of mapped reads. This procedure, however, is not as trivial as it appears at first glance. A shotgun metagenomic dataset often contains DNA sequences from many closely-related microbial species (e.g., within the same genus) or strains (e.g., within the same species), thus it is often difficult to determine which species/strain a specific read is sampled from when it can be mapped to a common region shared by multiple genomes at high similarity. Furthermore, high genomic variations are observed among individual genomes within the same species, which are difficult to be differentiated from the inter-species variations during reads mapping. To address these issues, a commonly used approach is to quantify taxonomic distribution only at the genus level, based on the reads mapped to all species belonging to the same genus; alternatively, reads are mapped to a set of representative genomes, each selected to represent a different genus. Here, we introduce a novel approach to the quantity estimation of closely-related species within the same genus by mapping the reads to their genomes represented by a de Bruijn graph, in which the common genomic regions among them are collapsed. Using simulated and real metagenomic datasets, we show the de Bruijn graph approach has several advantages over existing methods, including (1) it avoids redundant mapping of shotgun reads to multiple copies of the common regions in different genomes, and (2) it leads to more accurate quantification for the closely-related species (and even for strains within the same species).
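A toy illustration of the central data structure: identical k-mers from closely related genomes collapse onto shared nodes and edges of the de Bruijn graph, which is exactly the property the approach above exploits. The sequences and the k value are invented; real tools add error correction, path compaction, and much more:

```python
from collections import defaultdict

def build_de_bruijn(sequences, k=5):
    """Nodes are (k-1)-mers; each k-mer adds an edge prefix -> suffix.
    Regions shared by several genomes map onto the same nodes/edges."""
    graph = defaultdict(set)
    for seq in sequences:
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])
    return graph

genome_a = "ACGTACGGTCA"
genome_b = "ACGTACGGTTA"   # differs from genome_a only near the end
g = build_de_bruijn([genome_a, genome_b], k=5)
for node, succs in sorted(g.items()):
    print(node, "->", sorted(succs))   # shared prefix collapses to one path
```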
Manousaki, Tereza; Tsakogiannis, Alexandros; Taggart, John B.; Palaiokostas, Christos; Tsaparis, Dimitris; Lagnel, Jacques; Chatziplis, Dimitrios; Magoulas, Antonios; Papandroulakis, Nikos; Mylonas, Constantinos C.; Tsigenopoulos, Costas S.
2015-01-01
Common pandora (Pagellus erythrinus) is a benthopelagic marine fish belonging to the teleost family Sparidae, and a newly recruited species in Mediterranean aquaculture. The paucity of genetic information relating to sparids, despite their growing economic value for aquaculture, provides the impetus for exploring the genomics of this fish group. Genomic tool development, such as genetic linkage maps provision, lays the groundwork for linking genotype to phenotype, allowing fine-mapping of loci responsible for beneficial traits. In this study, we applied ddRAD methodology to identify polymorphic markers in a full-sib family of common pandora. Employing the Illumina MiSeq platform, we sampled and sequenced a size-selected genomic fraction of 99 individuals, which led to the identification of 920 polymorphic loci. Downstream mapping analysis resulted in the construction of 24 robust linkage groups, corresponding to the karyotype of the species. The common pandora linkage map showed varying degrees of conserved synteny with four other teleost genomes, namely the European seabass (Dicentrarchus labrax), Nile tilapia (Oreochromis niloticus), stickleback (Gasterosteus aculeatus), and medaka (Oryzias latipes), suggesting a conserved genomic evolution in Sparidae. Our work exploits the possibilities of genotyping by sequencing to gain novel insights into genome structure and evolution. Such information will boost the study of cultured species and will set the foundation for a deeper understanding of the complex evolutionary history of teleosts. PMID:26715088
Modelling and approaching pragmatic interoperability of distributed geoscience data
NASA Astrophysics Data System (ADS)
Ma, Xiaogang
2010-05-01
Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have noted that there are four levels of data interoperability issues: system, syntax, schematics and semantics, which relate respectively to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches to the schematic and semantic interoperability of geodata have been studied extensively in the last decade. Ontologies come in different types (e.g. top-level ontologies, domain ontologies and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even when they are derived from equivalent disciplines. In contrast, common ontologies are being developed in different geoscience disciplines (e.g., NADM, SWEET) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies and thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared with the context-independent character of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location, intention, procedure, consequence) of local pragmatic contexts and are thus context-dependent. Elimination of these elements will inevitably lead to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from that in its original context. Another problem is the dilemma of how to find a balance between flexibility and standardization of local ontologies, because ontologies are not fixed but continuously evolving. It is commonly realized that we cannot use a unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without coordination of standards, freely developed local ontologies and databases will require enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies, in a practical sense, requires negotiations (i.e. conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e. geodata sources), and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between local pragmatic contexts. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts.
The author uses them to derive a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata. The proposed conceptual model and semantic negotiation procedure were encoded in Description Logic, and then applied to analyze and manipulate semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.
ERIC Educational Resources Information Center
Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John
2018-01-01
Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that…
NASA Astrophysics Data System (ADS)
Peterson, James Preston, II
Unmanned Aerial Systems (UAS) are rapidly blurring the lines between traditional and close range photogrammetry, and between surveying and photogrammetry. UAS are providing an economic platform for performing aerial surveying on small projects. The focus of this research was to describe traditional photogrammetric imagery and Light Detection and Ranging (LiDAR) geospatial products, describe close range photogrammetry (CRP), introduce UAS and computer vision (CV), and investigate whether industry mapping standards for accuracy can be met using UAS collection and CV processing. A 120-acre site was selected and 97 aerial targets were surveyed for evaluation purposes. Four UAS flights of varying heights above ground level (AGL) were executed, and three different target patterns of varying distances between targets were analyzed for compliance with American Society for Photogrammetry and Remote Sensing (ASPRS) and National Standard for Spatial Data Accuracy (NSSDA) mapping standards. This analysis resulted in twelve datasets. Error patterns were evaluated and reasons for these errors were determined. The relationship between the AGL, ground sample distance, target spacing and the root mean square error of the targets is exploited by this research to develop guidelines that use the ASPRS and NSSDA map standard as the template. These guidelines allow the user to select the desired mapping accuracy and determine what target spacing and AGL is required to produce the desired accuracy. These guidelines also address how UAS/CV phenomena affect map accuracy. General guidelines and recommendations are presented that give the user helpful information for planning a UAS flight using CV technology.
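For readers unfamiliar with the NSSDA computation referenced here, a small sketch follows. The coordinates are invented; the 1.7308 factor is the NSSDA's 95%-confidence multiplier for horizontal radial RMSE, valid when RMSE_x and RMSE_y are roughly equal:

```python
import numpy as np

# surveyed = ground truth (check-point targets), mapped = UAS/CV product.
surveyed = np.array([[100.00, 200.00], [150.00, 250.00], [300.00, 120.00]])
mapped   = np.array([[100.04, 199.97], [150.06, 250.02], [299.95, 120.05]])

dx = mapped[:, 0] - surveyed[:, 0]
dy = mapped[:, 1] - surveyed[:, 1]
rmse_r = np.sqrt(np.mean(dx**2 + dy**2))   # horizontal (radial) RMSE

# NSSDA horizontal accuracy at 95% confidence:
accuracy_95 = 1.7308 * rmse_r
print(f"RMSE_r = {rmse_r:.3f} m, NSSDA horizontal accuracy = {accuracy_95:.3f} m")
```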
FEASIBILITY AND APPROACH FOR MAPPING RADON POTENTIALS IN FLORIDA
The report gives results of an analysis of the feasibility and approach for developing statewide maps of radon potentials in Florida. The maps would provide a geographic basis for implementing new radon-protective building construction standards to reduce public health risks from ...
Fine Mapping on Chromosome 13q32–34 and Brain Expression Analysis Implicates MYO16 in Schizophrenia
Rodriguez-Murillo, Laura; Xu, Bin; Roos, J Louw; Abecasis, Gonçalo R; Gogos, Joseph A; Karayiorgou, Maria
2014-01-01
We previously reported linkage of schizophrenia and schizoaffective disorder to 13q32–34 in the European descent Afrikaner population from South Africa. The nature of genetic variation underlying linkage peaks in psychiatric disorders remains largely unknown and both rare and common variants may be contributing. Here, we examine the contribution of common variants located under the 13q32–34 linkage region. We used densely spaced SNPs to fine map the linkage peak region using both a discovery sample of 415 families and a meta-analysis incorporating two additional replication family samples. In a second phase of the study, we use one family-based data set with 237 families and independent case–control data sets for fine mapping of the common variant association signal using HapMap SNPs. We report a significant association with a genetic variant (rs9583277) within the gene encoding for the myosin heavy-chain Myr 8 (MYO16), which has been implicated in neuronal phosphoinositide 3-kinase signaling. Follow-up analysis of HapMap variation within MYO16 in a second set of Afrikaner families and additional case–control data sets of European descent highlighted a region across introns 2–6 as the most likely region to harbor common MYO16 risk variants. Expression analysis revealed a significant increase in the level of MYO16 expression in the brains of schizophrenia patients. Our results suggest that common variation within MYO16 may contribute to the genetic liability to schizophrenia. PMID:24141571
Cartographic services contract...for everything geographic
2003-01-01
The U.S. Geological Survey's (USGS) Cartographic Services Contract (CSC) is used to award work for photogrammetric and mapping services under the umbrella of Architect-Engineer (A&E) contracting. The A&E contract is broad in scope and can accommodate any activity related to standard, nonstandard, graphic, and digital cartographic products. Services provided may include, but are not limited to, photogrammetric mapping and aerotriangulation; orthophotography; thematic mapping (for example, land characterization); analog and digital imagery applications; geographic information systems development; surveying and control acquisition, including ground-based and airborne Global Positioning System; analog and digital image manipulation, analysis, and interpretation; raster and vector map digitizing; data manipulations (for example, transformations, conversions, generalization, integration, and conflation); primary and ancillary data acquisition (for example, aerial photography, satellite imagery, multispectral, multitemporal, and hyperspectral data); image scanning and processing; metadata production, revision, and creation; and production or revision of standard USGS products defined by formal and informal specification and standards, such as those for digital line graphs, digital elevation models, digital orthophoto quadrangles, and digital raster graphics.
Calculating Lyapunov Exponents: Applying Products and Evaluating Integrals
ERIC Educational Resources Information Center
McCartney, Mark
2010-01-01
Two common examples of one-dimensional maps (the tent map and the logistic map) are generalized to cases where they have more than one control parameter. In the case of the tent map, this still allows the global Lyapunov exponent to be found analytically, and permits various properties of the resulting global Lyapunov exponents to be investigated…
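One common way to give the tent map a second parameter is to move its peak to x = c; for that skew tent map the global Lyapunov exponent has a closed form that a numerical estimate can be checked against. A sketch under that assumption (the abstract does not specify the authors' exact parameterization):

```python
import numpy as np

# Skew tent map on [0,1] with peak at x = c. Its invariant density is
# uniform, so lambda = -c*ln(c) - (1-c)*ln(1-c) analytically, while the
# numerical estimate is (1/n) * sum of ln|f'(x_i)| along an orbit.
def skew_tent(x, c):
    return x / c if x < c else (1.0 - x) / (1.0 - c)

def lyapunov(c, n=200_000, x0=0.1234):
    x, total = x0, 0.0
    for _ in range(n):
        slope = 1.0 / c if x < c else 1.0 / (1.0 - c)
        total += np.log(abs(slope))
        x = skew_tent(x, c)
    return total / n

c = 0.3
print(lyapunov(c))                                  # numerical estimate
print(-c * np.log(c) - (1 - c) * np.log(1 - c))     # analytic value, ~0.6109
```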
A Geophysical Atlas for Interpretation of Satellite-derived Data
NASA Technical Reports Server (NTRS)
Lowman, P. D., Jr. (Editor); Frey, H. V. (Editor); Davis, W. M.; Greenberg, A. P.; Hutchinson, M. K.; Langel, R. A.; Lowrey, B. E.; Marsh, J. G.; Mead, G. D.; Okeefe, J. A.
1979-01-01
A compilation of maps of global geophysical and geological data plotted on a common scale and projection is presented. The maps include satellite gravity, magnetic, seismic, volcanic, tectonic activity, and mantle velocity anomaly data. Bibliographic references for all maps are included.
Atlas of United States Trees, Volume 2: Alaska Trees and Common Shrubs.
ERIC Educational Resources Information Center
Viereck, Leslie A.; Little, Elbert L., Jr.
This volume is the second in a series of atlases describing the natural distribution or range of native tree species in the United States. The 82 species maps include 32 of trees in Alaska, 6 of shrubs rarely reaching tree size, and 44 more of common shrubs. More than 20 additional maps summarize environmental factors and furnish general…
Fusiform-Rust-Hazard Maps for Loblolly and Slash Pines
Robert L. Anderson; Thomas C. McCartney; Noel D. Cost; Hugh Devine; Martin Botkin
1988-01-01
Rust-hazard maps made from Forest Inventory and Analysis plot data show that fusiform rust on slash pine is most common in north-central Florida, in southeastern Georgia, and in areas north of slash pine's natural range. On loblolly pine, the disease is most common in central and southeastern Georgia and in portions of South Carolina. These maps show the general...
Arango-Sabogal, Juan C; Côté, Geneviève; Paré, Julie; Labrecque, Olivia; Roy, Jean-Philippe; Buczinski, Sébastien; Doré, Elizabeth; Fairbrother, Julie H; Bissonnette, Nathalie; Wellemans, Vincent; Fecteau, Gilles
2016-07-01
Mycobacterium avium ssp. paratuberculosis (MAP) is the etiologic agent of Johne's disease, a chronic contagious enteritis of ruminants that causes major economic losses. Several studies, most involving large free-stall herds, have found environmental sampling to be a suitable method for detecting MAP-infected herds. In eastern Canada, where small tie-stall herds are predominant, certain conditions and management practices may influence the survival, transmission, and recovery (isolation) of MAP. Our objective was to estimate the performance of a standardized environmental and targeted pooled sampling technique for the detection of MAP-infected tie-stall dairy herds. Twenty-four farms (19 MAP-infected and 5 non-infected) were enrolled, but only 20 were visited twice in the same year, to collect 7 environmental samples and 2 pooled samples (sick cows and cows with poor body condition). Concurrent individual sampling of all adult cows in the herds was also carried out. Isolation of MAP was achieved using the MGIT Para TB culture media and the BACTEC 960 detection system. Overall, MAP was isolated in 7% of the environmental cultures. The sensitivity of the environmental culture was 44% [95% confidence interval (CI): 20% to 70%] when combining results from 2 different herd visits and 32% (95% CI: 13% to 57%) when results from only 1 random herd visit were used. The best sampling strategy was to combine samples from the manure pit, gutter, sick cows, and cows with poor body condition. The standardized environmental sampling technique and the targeted pooled samples presented in this study are an alternative sampling strategy to costly individual cultures for detecting MAP-infected tie-stall dairies. Repeated samplings may improve the detection of MAP-infected herds.
Rose, Kathryn V.; Nayegandhi, Amar; Moses, Christopher S.; Beavers, Rebecca; Lavoie, Dawn; Brock, John C.
2012-01-01
The National Park Service (NPS) Inventory and Monitoring (I&M) Program initiated a benthic habitat mapping program in ocean and coastal parks in 2008-2009 in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With more than 80 ocean and Great Lakes parks encompassing approximately 2.5 million acres of submerged territory and approximately 12,000 miles of coastline (Curdts, 2011), this Servicewide Benthic Mapping Program (SBMP) is essential. This report presents an initial gap analysis of three pilot parks under the SBMP: Assateague Island National Seashore (ASIS), Channel Islands National Park (CHIS), and Sleeping Bear Dunes National Lakeshore (SLBE) (fig. 1). The recommended SBMP protocols include servicewide standards (for example, gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). The SBMP requires the inventory and mapping of critical components of coastal and marine ecosystems: bathymetry, geoforms, surface geology, and biotic cover. In order for a park unit benthic inventory to be considered complete, maps of bathymetry and other key components must be combined into a final report (Moses and others, 2010). By this standard, none of the three pilot parks are mapped (inventoried) to completion with respect to submerged resources. After compiling the existing benthic datasets for these parks, this report has concluded that CHIS, with 49 percent of its submerged area mapped, has the most complete benthic inventory of the three. The ASIS submerged inventory is 41 percent complete, and SLBE is 17.5 percent complete.
Fundamental procedures of geographic information analysis
NASA Technical Reports Server (NTRS)
Berry, J. K.; Tomlin, C. D.
1981-01-01
Analytical procedures common to most computer-oriented geographic information systems are composed of fundamental map processing operations. A conceptual framework for such procedures is developed and basic operations common to a broad range of applications are described. Among the major classes of primitive operations identified are those associated with: reclassifying map categories as a function of the initial classification, the shape, the position, or the size of the spatial configuration associated with each category; overlaying maps on a point-by-point, a category-wide, or a map-wide basis; measuring distance; establishing visual or optimal path connectivity; and characterizing cartographic neighborhoods based on the thematic or spatial attributes of the data values within each neighborhood. By organizing such operations in a coherent manner, the basis for a generalized cartographic modeling structure can be developed which accommodates a variety of needs in a common, flexible and intuitive manner. The use of each is limited only by the general thematic and spatial nature of the data to which it is applied.
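The primitive operation classes listed above map naturally onto array operations. A toy sketch of reclassification, point-by-point overlay, and a neighborhood (focal) statistic on two small registered rasters; the layers and weights are invented:

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Toy rasters standing in for registered map layers (categories 0..2).
landuse = np.array([[0, 0, 1], [1, 2, 2], [2, 2, 1]])
slope   = np.array([[3., 8., 12.], [2., 9., 15.], [1., 4., 7.]])

# Reclassify: map each land-use category to a weight.
weights = np.array([1.0, 0.5, 0.1])      # indexed by category
reclass = weights[landuse]

# Point-by-point overlay: combine layers cell by cell.
hazard = reclass * slope

# Neighborhood characterization: mean slope in a 3x3 window.
focal_mean = uniform_filter(slope, size=3, mode="nearest")

print(hazard)
print(focal_mean)
```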
NASA Astrophysics Data System (ADS)
Castillo, Carlos; Gomez, Jose Alfonso
2016-04-01
Standardization is the process of developing common conventions or proceedings to facilitate the communication, use, comparison and exchange of products or information among different parties. It has been a useful tool in fields from industry to statistics, for technical, economic and social reasons. In science, the need for standardization has been recognised in the definition of methods as well as in publication formats. With respect to gully erosion, a number of initiatives have been carried out to propose common methodologies, for instance, for gully delineation (Castillo et al., 2014) and geometrical measurements (Casalí et al., 2015). The main aims of this work are: 1) to examine previous proposals in the gully erosion literature implying standardization processes; 2) to contribute new approaches to improve the homogeneity of methodologies and presentation of results for better communication among the gully erosion community. For this purpose, we evaluated the basic information provided on environmental factors, discussed the delineation and measurement procedures proposed in previous works and, finally, analysed statistically the severity of degradation levels derived from different indicators at the world scale. As a result, we present suggestions aiming to serve as guidance for survey design as well as for the interpretation of vulnerability levels and degradation rates for future gully erosion studies. References Casalí, J., Giménez, R., and Campo-Bescós, M. A.: Gully geometry: what are we measuring?, SOIL, 1, 509-513, doi:10.5194/soil-1-509-2015, 2015. Castillo C., Taguas E. V., Zarco-Tejada P., James M. R., and Gómez J. A. (2014), The normalized topographic method: an automated procedure for gully mapping using GIS, Earth Surf. Process. Landforms, 39, 2002-2015, doi: 10.1002/esp.3595
A natural-color mapping for single-band night-time image based on FPGA
NASA Astrophysics Data System (ADS)
Wang, Yilun; Qian, Yunsheng
2018-01-01
A natural-color mapping method for single-band night-time images based on FPGA can transfer the color of a reference image to the single-band night-time image, which is consistent with human visual habits and can help observers identify targets. This paper introduces the processing flow of the natural-color mapping algorithm on FPGA. Firstly, the image is transformed based on histogram equalization, and the intensity and standard-deviation features of the reference image are stored in SRAM. Then, the intensity and standard-deviation features of the real-time digital images are calculated by the FPGA. Finally, the FPGA completes the color mapping by matching pixels between images using the features in the luminance channel.
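The statistics-matching step described (matching intensity mean and standard deviation to those of the reference) reduces to a per-pixel shift and scale. A minimal sketch in Python rather than HDL, with invented data; the same per-channel matching would apply in a colour space such as l-alpha-beta:

```python
import numpy as np

def match_stats(src: np.ndarray, ref_mean: float, ref_std: float) -> np.ndarray:
    """Shift/scale src so its mean and std match the reference statistics."""
    s_mean, s_std = src.mean(), src.std()
    out = (src - s_mean) * (ref_std / (s_std + 1e-12)) + ref_mean
    return np.clip(out, 0.0, 1.0)

night = np.random.rand(4, 4) * 0.2   # dark single-band image stand-in
print(match_stats(night, ref_mean=0.55, ref_std=0.18))
```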
A Numerical Study of New Logistic Map
NASA Astrophysics Data System (ADS)
Khmou, Youssef
In this paper, we propose a new logistic map based on the information entropy relation and study its bifurcation diagram in comparison with the standard logistic map. In the first part, we compare the diagram obtained by numerical simulations with that of the standard logistic map. It is found that the structures of both diagrams are similar, with the range of the growth parameter restricted to the interval [0,e]. In the second part, we present an application of the proposed map to traffic flow using a macroscopic model. It is found that the bifurcation diagram is an exact model of Greenberg's model of traffic flow, where the growth parameter corresponds to the optimal velocity and the random sequence corresponds to the density. In the last part, we present a second possible application of the proposed map, which consists of random number generation. The results of the analysis show that the excluded initial values of the sequences are (0,1).
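Since the entropy-based map itself is not reproduced in the abstract, the sketch below only generates the comparison baseline it mentions: the bifurcation diagram of the standard logistic map x -> r·x·(1-x):

```python
import numpy as np

def bifurcation_points(r_values, n_transient=500, n_keep=100):
    """Collect (r, x) attractor samples of the standard logistic map."""
    pts = []
    for r in r_values:
        x = 0.5
        for _ in range(n_transient):          # discard transient behaviour
            x = r * x * (1 - x)
        for _ in range(n_keep):               # record the attractor
            x = r * x * (1 - x)
            pts.append((r, x))
    return np.array(pts)

diagram = bifurcation_points(np.linspace(2.5, 4.0, 300))
print(diagram.shape)   # (30000, 2); plot r vs x to see the period-doubling cascade
```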
Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005
Soller, David R.
2005-01-01
Introduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database, and for the State and Federal geological surveys, to provide more high-quality digital maps to the public. At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; 6) continued development of the National Geologic Map Database; and 7) progress toward building and implementing a standard geologic map data model and standard science language for the U.S. and for North America.
Masojć, Piotr; Krajewski, Paweł; Stochmal, Anna; Kowalczyk, Mariusz; Angelov, Mihail; Ivanova, Valentina; Schollenberger, Małgorzata; Wakuliński, Wojciech; Banaszak, Zofia; Banaszak, Katarzyna; Rakoczy-Trojanowska, Monika
2017-01-01
A mapping population of recombinant inbred lines (RILs) representing the 541 × Ot1-3 cross exhibited wide variation in benzoxazinoid (BX) content in leaves and roots, brown rust resistance, α-amylase activity in the grain, and resistance to preharvest sprouting. QTL mapping of major BX species using a DArT-based map revealed a complex genetic architecture underlying the production of these main secondary metabolites engaged in stress and allelopathy responses. The synthesis of BX in leaves and roots was found to be regulated by different QTL. The QTL for BX content, rust resistance, α-amylase activity, and preharvest sprouting partially overlapped; this points to their common genetic regulation by a definite subset of genes. Only one QTL for BX, located on chromosome 7R, coincided with the loci of the ScBx genes, which were mapped as two clusters on chromosomes 5RS (Bx3-Bx5) and 7R (Bx1-Bx2). The QTL common to several BX species, rust resistance, preharvest sprouting, and α-amylase activity are interesting objects for further exploration aimed at developing common markers for these important agronomic traits. PMID:29267335
Fourth international circumpolar arctic vegetation mapping workshop
Raynolds, Martha K.; Markon, C.J.
2002-01-01
During the week of April 10, 2001, the Fourth International Circumpolar Arctic Vegetation Mapping Workshop was held in Moscow, Russia. The purpose of this meeting was to bring together the vegetation scientists working on the Circumpolar Arctic Vegetation Map (CAVM) to (1) review the progress of current mapping activities, (2) discuss and agree upon a standard set of arctic tundra subzones, (3) plan for the production and dissemination of a draft map, and (4) begin work on a legend for the final map.
Gene-Based Single Nucleotide Polymorphism Markers for Genetic and Association Mapping in Common Bean
2012-01-01
Background In common bean, expressed sequence tags (ESTs) are an underestimated source of gene-based markers such as insertion-deletions (Indels) or single-nucleotide polymorphisms (SNPs). However, because these sequences are conserved, markers are difficult to detect in them and show low levels of polymorphism. Therefore, development of intron-spanning EST-SNP markers can be a valuable resource for genetic experiments such as genetic mapping and association studies. Results In this study, a total of 313 new gene-based markers were developed at target genes. Intronic variation was explored in depth in order to capture more polymorphism. Introns were putatively identified by comparing the common bean ESTs with the soybean genome, and the primers were designed over intron-flanking regions. The intronic regions were evaluated for parental polymorphisms using the single-strand conformational polymorphism (SSCP) technique and the Sequenom MassARRAY system. A total of 53 new marker loci were placed on an integrated molecular map in the DOR364 × G19833 recombinant inbred line (RIL) population. The new linkage map was used to build a consensus map, merging the linkage maps of the BAT93 × JALO EEP558 and DOR364 × BAT477 populations. A total of 1,060 markers were mapped, with a total map length of 2,041 cM across 11 linkage groups. As a second application of the generated resource, a diversity panel of 93 genotypes was evaluated with 173 SNP markers using the MassARRAY platform and KASPar technology. These results were coupled with previous SSR evaluations and drought tolerance assays carried out on the same individuals. This agglomerative dataset was examined for marker-trait associations using a general linear model (GLM) and a mixed linear model (MLM). Some significant associations with yield components were identified and were consistent with previous findings. Conclusions In short, this study illustrates the power of intron-based markers for linkage and association mapping in common bean. The utility of these markers is discussed in relation to the usefulness of microsatellites, the molecular markers par excellence in this crop. PMID:22734675
Natural resources research and development in Lesotho using LANDSAT imagery
NASA Technical Reports Server (NTRS)
Jackson, A. A. (Principal Investigator)
1976-01-01
The author has identified the following significant results. A map of the drainage of the whole country, including at least third-order streams, was constructed from LANDSAT imagery. This was digitized and can be plotted at any required scale to provide base maps for other cartographic projects. A suite of programs for the interpretation of digital LANDSAT data is under development for a low-cost programmable calculator. Initial output from these programs has proved to have better resolution and detail than the standard photographic products, and was used to update the standard topographic map of a particular region.
Geological maps and models: are we certain how uncertain they are?
NASA Astrophysics Data System (ADS)
Mathers, Steve; Waters, Colin; McEvoy, Fiona
2014-05-01
Geological maps and latterly 3D models provide the spatial framework for geology at diverse scales or resolutions. As demands continue to rise for sustainable use of the subsurface, these maps and models inform decisions on the management of natural resources, hazards, and environmental change. Inaccuracies and uncertainties in geological maps and models can impact substantially on the perception, assessment, and management of opportunities and the associated risks. Lithostratigraphical classification schemes predominate and are used in most geological mapping and modelling. The definition of unit boundaries, as 2D lines or 3D surfaces, is the prime objective. The intervening area or volume is rarely described other than by its bulk attributes, those relating to the whole unit. Where sufficient data exist on the spatial and/or statistical distribution of properties, these can be gridded or voxelated with integrity. Here we discuss only the uncertainty involved in defining the boundary conditions. The primary uncertainty of any geological map or model is the accuracy of the geological boundaries, i.e. tops, bases, limits, fault intersections, etc. Traditionally these have been depicted on BGS maps using three line styles that reflect the uncertainty of the boundary, e.g. observed, inferred, conjectural. Most geological maps tend to neglect the subsurface expression (subcrops etc.). Models could also be built with subsurface geological boundaries (as digital node strings) tagged with levels of uncertainty; initial experience suggests three levels may again be practicable. Once tagged, these values could be used to auto-generate uncertainty plots. Whilst maps are predominantly explicit and based upon evidence and the conceptual understanding of the geologist, models of this type are less common and tend to be restricted to certain software methodologies. Many modelling packages are implicit, being driven by simple statistical interpolation or complex algorithms for building surfaces in ways that are invisible to, and so not controlled by, the working geologist. Such models have the advantage of being replicable within a software package and so can discount some interpretational differences between modellers. They can, however, create geologically implausible results unless good geological rules and controls are established prior to model calculation. Comparisons of results from varied software packages yield surprisingly diverse results. This is a significant and often overlooked source of uncertainty in models. Expert elicitation is commonly employed to establish values used in statistical treatments of model uncertainty. However, this introduces another possible source of uncertainty created by the differing judgements of the modellers. The pragmatic solution appears to be using panels of experienced geologists to elicit the values. Treatments of uncertainty in maps and models yield relative rather than absolute values, even though many of these are expressed numerically. This makes it extremely difficult to devise standard methodologies to determine uncertainty or to propose fixed numerical scales for expressing the results. Furthermore, these may give a misleading impression of greater certainty than actually exists. This contribution outlines general perceptions with regard to uncertainty in our maps and models and presents results from recent BGS studies.
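The boundary-tagging idea above maps naturally onto a small data structure. Below is a minimal sketch, with hypothetical names rather than actual BGS code, of digital node strings carrying one of three uncertainty levels, from which line styles for an auto-generated uncertainty plot can be looked up.

```python
from dataclasses import dataclass
from enum import Enum

# Three levels mirroring the traditional line styles named above;
# class and field names are illustrative, not from any BGS schema.
class Uncertainty(Enum):
    OBSERVED = 1
    INFERRED = 2
    CONJECTURAL = 3

@dataclass
class BoundaryNodeString:
    """A digital node string for one geological boundary segment."""
    unit: str                                  # geological unit name
    nodes: list[tuple[float, float, float]]    # (x, y, z) node positions
    level: Uncertainty

# Auto-generating an uncertainty plot then reduces to a style lookup.
LINE_STYLE = {
    Uncertainty.OBSERVED: "solid",
    Uncertainty.INFERRED: "dashed",
    Uncertainty.CONJECTURAL: "dotted",
}

segment = BoundaryNodeString(
    unit="Unit A base",
    nodes=[(0.0, 0.0, -12.5), (10.0, 0.0, -14.0)],
    level=Uncertainty.INFERRED,
)
print(LINE_STYLE[segment.level])  # "dashed"
```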
Rossi, G; Grohn, Y T; Schukken, Y H; Smith, R L
2017-09-01
Endemic diseases can be counted among the most serious sources of losses for livestock production. In dairy farms in particular, one of the most common diseases is Johne's disease, caused by Mycobacterium avium ssp. paratuberculosis (MAP). Infection with MAP causes direct costs because it affects milk production, but it has also been suspected to increase the risk of clinical mastitis (CM) among infected animals. This might contribute to further costs for farmers. We asked whether MAP infection represents a risk factor for CM and, in particular, whether CM occurrences were more common in MAP-infected animals. Our results, obtained by survival analysis, suggest that MAP-infected cows had an increased probability of experiencing CM during lactation. These results highlight the need to account for the interplay of infectious diseases and other health conditions in economic and epidemiological modeling. In this case, accounting for MAP-infected cows having an increased CM occurrence might have nonnegligible effects on the estimated benefit of MAP control. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Dorin, Ryan P; Daneshmand, Siamak; Eisenberg, Manuel S; Chandrasoma, Shahin; Cai, Jie; Miranda, Gus; Nichols, Peter W; Skinner, Donald G; Skinner, Eila C
2011-11-01
The value of lymph node dissection (LND) in the treatment of bladder urothelial carcinoma is well established. However, standards for the quality of LND remain controversial. We compared the distribution of lymph node (LN) metastases in a two-institution cohort of patients undergoing radical cystectomy (RC) using a uniformly applied extended LND template. Patients undergoing RC at the University of Southern California (USC) Institute of Urology and at Oregon Health Sciences University (OHSU) were included if they met the following criteria: (1) no prior pelvic radiotherapy or LND; (2) lymphatic tissue submitted from all nine predesignated regions, including the paracaval and para-aortic LNs; (3) bladder primary; and (4) category M0 disease. The number and location of LN metastases were prospectively entered into corresponding databases. LN maps were constructed and correlated with preoperative and pathologic characteristics. Kaplan-Meier curves were constructed to estimate overall survival (OS) and recurrence free survival (RFS) among LN-positive (LN+) patients. Inclusion criteria were met by 646 patients (439 USC, 207 OHSU), and 23% had LN metastases at time of cystectomy. Although there was a difference in the median per-patient LN count between institutions, there were no significant interinstitutional differences in the incidence or distribution of positive LNs, which were found in 11% of patients with ≤pT2b and in 44% of patients with ≥pT3a tumors. Among LN+ patients, 41% had positive LNs above the common iliac bifurcation. Estimated 5-yr RFS and OS rates for LN+ patients were 45% and 33%, respectively, and did not differ significantly between institutions. LN metastases in regions outside the boundaries of standard LND are common. Adherence to meticulous dissection technique within an extended template is likely more important than total LN count for achieving optimal oncologic outcomes. Copyright © 2011 European Association of Urology. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Saarinen, N.; Vastaranta, M.; Näsi, R.; Rosnell, T.; Hakala, T.; Honkavaara, E.; Wulder, M. A.; Luoma, V.; Tommaselli, A. M. G.; Imai, N. N.; Ribeiro, E. A. W.; Guimarães, R. B.; Holopainen, M.; Hyyppä, J.
2017-10-01
Biodiversity is commonly understood as species diversity, but in forest ecosystems variability in structural and functional characteristics can also be treated as a measure of biodiversity. Small unmanned aerial vehicles (UAVs) provide a means for characterizing a forest ecosystem with high spatial resolution, permitting measurement of its physical characteristics from a biodiversity viewpoint. The objective of this study is to examine the applicability of photogrammetric point clouds and hyperspectral imaging acquired with a small UAV helicopter for mapping biodiversity indicators, such as structural complexity and the amount of deciduous and dead trees, at plot level in southern boreal forests. The standard deviation of tree heights within a sample plot, used as a proxy for structural complexity, was the most accurately derived biodiversity indicator, resulting in a mean error of 0.5 m with a standard deviation of 0.9 m. The volume predictions for deciduous and dead trees were underestimated by 32.4 m3/ha and 1.7 m3/ha, respectively, with standard deviations of 50.2 m3/ha for deciduous and 3.2 m3/ha for dead trees. Spectral features describing brightness (i.e. higher reflectance values) predominated in feature selection, but several wavelengths were represented. Thus, structural complexity can be predicted reliably with photogrammetric point clouds obtained with a small UAV, but can also be expected to be underestimated. Additionally, plot-level volume of dead trees can be predicted with a small mean error, whereas identifying deciduous species was more challenging at plot level.
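As a concrete illustration of the headline indicator above, the plot-level proxy is simply the sample standard deviation of the tree heights detected within the plot; the sketch below assumes the heights have already been extracted from the photogrammetric point cloud (the values are invented).

```python
import numpy as np

# Hypothetical heights (m) of trees detected within one sample plot,
# e.g. taken from a canopy height model built from the point cloud.
tree_heights = np.array([21.3, 18.9, 24.1, 7.5, 15.2, 22.8, 11.4])

# Structural complexity proxy: sample standard deviation of tree heights.
structural_complexity = np.std(tree_heights, ddof=1)
print(f"plot-level std of tree heights: {structural_complexity:.2f} m")
```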
Cyberinfrastructure for the digital brain: spatial standards for integrating rodent brain atlases
Zaslavsky, Ilya; Baldock, Richard A.; Boline, Jyl
2014-01-01
Biomedical research entails capture and analysis of massive data volumes, and new discoveries arise from data integration and mining. This is only possible if data can be mapped onto a common framework, such as the genome for genomic data. In neuroscience, the framework is intrinsically spatial and based on a number of paper atlases. This cannot meet today's data-intensive analysis and integration challenges. A scalable and extensible software infrastructure that is standards-based but open to novel data and resources is required for integrating information such as signal distributions, gene expression, neuronal connectivity, electrophysiology, anatomy, and developmental processes. Therefore, the International Neuroinformatics Coordinating Facility (INCF) initiated the development of a spatial framework for neuroscience data integration with an associated Digital Atlasing Infrastructure (DAI). A prototype implementation of this infrastructure for the rodent brain is reported here. The infrastructure is based on a collection of reference spaces to which data are mapped at the required resolution, such as the Waxholm Space (WHS), a 3D reconstruction of the brain generated using high-resolution, multi-channel microMRI. The core standards of the digital atlasing service-oriented infrastructure include the Waxholm Markup Language (WaxML), an XML schema expressing a uniform information model for key elements such as coordinate systems, transformations, points of interest (POIs), labels, and annotations; and Atlas Web Services, interfaces for querying and updating atlas data. The services return WaxML-encoded documents with information about capabilities, spatial reference systems (SRSs), and structures, and execute coordinate transformations and POI-based requests. Key elements of the INCF-DAI cyberinfrastructure have been prototyped for both mouse and rat brain atlas sources, including the Allen Mouse Brain Atlas, the UCSD Cell-Centered Database, and the Edinburgh Mouse Atlas Project. PMID:25309417
Cartographic quality of ERTS-1 images
NASA Technical Reports Server (NTRS)
Welch, R. I.
1973-01-01
Analyses of simulated and operational ERTS images have provided initial estimates of resolution, ground resolution, detectability thresholds and other measures of image quality of interest to earth scientists and cartographers. Based on these values, including an approximate ground resolution of 250 meters for both RBV and MSS systems, the ERTS-1 images appear suited to the production and/or revision of planimetric and photo maps of 1:500,000 scale and smaller for which map accuracy standards are compatible with the imaged detail. Thematic mapping, although less constrained by map accuracy standards, will be influenced by measurement thresholds and errors which have yet to be accurately determined for ERTS images. This study also indicates the desirability of establishing a quantitative relationship between image quality values and map products which will permit both engineers and cartographers/earth scientists to contribute to the design requirements of future satellite imaging systems.
aMAP is a validated pipeline for registration and segmentation of high-resolution mouse brain data
Niedworok, Christian J.; Brown, Alexander P. Y.; Jorge Cardoso, M.; Osten, Pavel; Ourselin, Sebastien; Modat, Marc; Margrie, Troy W.
2016-01-01
The validation of automated image registration and segmentation is crucial for accurate and reliable mapping of brain connectivity and function in three-dimensional (3D) data sets. While validation standards are necessarily high and routinely met in the clinical arena, they have to date been lacking for high-resolution microscopy data sets obtained from the rodent brain. Here we present a tool for optimized automated mouse atlas propagation (aMAP) based on clinical registration software (NiftyReg) for anatomical segmentation of high-resolution 3D fluorescence images of the adult mouse brain. We empirically evaluate aMAP as a method for registration and subsequent segmentation by validating it against the performance of expert human raters. This study therefore establishes a benchmark standard for mapping the molecular function and cellular connectivity of the rodent brain. PMID:27384127
Risk maps for targeting exotic plant pest detection programs in the United States
R.D. Magarey; D.M. Borchert; J.S. Engle; M Garcia-Colunga; Frank H. Koch; et al
2011-01-01
In the United States, pest risk maps are used by the Cooperative Agricultural Pest Survey for spatial and temporal targeting of exotic plant pest detection programs. Methods are described to create standardized host distribution, climate and pathway risk maps for the top nationally ranked exotic pest targets. Two examples are provided to illustrate the risk mapping...
Thematic and positional accuracy assessment of digital remotely sensed data
Russell G. Congalton
2007-01-01
Accuracy assessment or validation has become a standard component of any land cover or vegetation map derived from remotely sensed data. Knowing the accuracy of the map is vital to any decision making performed using that map. The process of assessing the map accuracy is time consuming and expensive. It is very important that the procedure be well thought out and...
ERIC Educational Resources Information Center
Afify, Mohammed Kamal
2018-01-01
The present study aims to identify standards for interactive digital concept map design and their measurement indicators as a tool to develop, organize, and administer e-learning content in the light of Meaningful Learning Theory and Constructivist Learning Theory. To achieve the objective of the research, the author prepared a list of E-learning…
Topic Maps e-Learning Portal Development
ERIC Educational Resources Information Center
Olsevicova, Kamila
2006-01-01
Topic Maps, an ISO/IEC 13250 standard, are designed to facilitate the organization and navigation of large collections of information objects by creating meta-level perspectives of their underlying concepts and relationships. The underlying structure of concepts and relations is expressed by domain ontologies. Topic Maps technology can become…
DOT National Transportation Integrated Search
2004-09-01
The Federal Aviation Administration (FAA) has requested human factors guidance to support the new moving map Technical Standard Order (TSO)-C165, Electronic Map Display Equipment for Graphical Depiction of Aircraft Position. This document was develop...
Auto-Generated Semantic Processing Services
NASA Technical Reports Server (NTRS)
Davis, Rodney; Hupf, Greg
2009-01-01
Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
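To make the specification-over-implementation idea concrete, the sketch below auto-generates a tiny binary-message decoder (one "adapter") from a metadata specification. The message layout, field names, and spec format are invented for illustration and are not the AGSP metadata model.

```python
import struct

# Hypothetical metadata specification for one message type; field names,
# types, and byte order are illustrative, not the AGSP schema.
SPEC = {
    "message": "NAV_STATE",
    "byte_order": ">",  # big-endian on the wire
    "fields": [("time_s", "d"), ("lat_deg", "d"), ("lon_deg", "d"), ("mode", "H")],
}

def make_adapter(spec):
    """Auto-generate a decoder (a 'semantic adapter') from a metadata spec."""
    fmt = spec["byte_order"] + "".join(t for _, t in spec["fields"])
    names = [n for n, _ in spec["fields"]]
    size = struct.calcsize(fmt)

    def decode(buf: bytes) -> dict:
        # Enforce the declared message length before unpacking.
        if len(buf) != size:
            raise ValueError(f"{spec['message']}: expected {size} bytes")
        return dict(zip(names, struct.unpack(fmt, buf)))

    return decode

decode_nav = make_adapter(SPEC)
sample = struct.pack(">dddH", 12.5, 28.5, -80.6, 3)
print(decode_nav(sample))  # {'time_s': 12.5, 'lat_deg': 28.5, ...}
```

Changing the wire format then means editing the specification, not the decoder code, which is the cost saving the abstract describes.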
Workflow with pitfalls to derive a regional airborne magnetic compilation
NASA Astrophysics Data System (ADS)
Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg
2017-04-01
Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different sizes, resolutions, and vintages. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilation of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short to intermediate magnetic wavelength anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated countries like Sweden and Australia (AWAGS) to collect high-altitude, long-distance airborne magnetic data for the entire country to homogenize the high-resolution magnetic data before the merger with satellite data. We present the compilation of a regional magnetic map for an area in northern Europe and discuss the problems and pitfalls of the commonly applied workflow.
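The airborne-satellite merger described above can be pictured as a wavelength blend: keep the short wavelengths from the airborne compilation and take the long wavelengths from the satellite model. The sketch below does this with a hard Fourier-domain cutoff on synthetic, co-registered grids; the 400 km crossover wavelength is illustrative only.

```python
import numpy as np

def blend_grids(airborne, satellite, dx_km, cutoff_km):
    """Merge two co-registered magnetic grids: wavelengths longer than
    cutoff_km come from the satellite model, shorter ones from airborne."""
    ny, nx = airborne.shape
    ky = np.fft.fftfreq(ny, d=dx_km)
    kx = np.fft.fftfreq(nx, d=dx_km)
    k = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))  # radial wavenumber
    lowpass = k < (1.0 / cutoff_km)                    # long-wavelength mask
    A, S = np.fft.fft2(airborne), np.fft.fft2(satellite)
    merged = np.where(lowpass, S, A)
    return np.fft.ifft2(merged).real

# Synthetic example: 256 x 256 grids at 2 km spacing, 400 km crossover.
rng = np.random.default_rng(0)
air = rng.normal(size=(256, 256))
sat = rng.normal(size=(256, 256))
merged = blend_grids(air, sat, dx_km=2.0, cutoff_km=400.0)
```

A production merger would use a tapered transition band rather than a hard cutoff to avoid ringing, but the division of labor between the two data sources is the same.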
Augustinavicius, Jura L; Greene, M Claire; Lakin, Daniel P; Tol, Wietse A
2018-01-01
Monitoring and evaluation of mental health and psychosocial support (MHPSS) programs is critical to facilitating learning and providing accountability to stakeholders. As part of an inter-agency effort to develop recommendations on MHPSS monitoring and evaluation, this scoping review aimed to identify the terminology and focus of monitoring and evaluation frameworks in this field. We collected program documents (logical frameworks (logframes) and theories of change) from members of the Inter-Agency Standing Committee Reference Group on MHPSS, and systematically searched the peer-reviewed literature across five databases. We included program documents and academic articles that reported on monitoring and evaluation of MHPSS in low- and middle-income countries describing original data. Inclusion and data extraction were conducted in parallel by independent reviewers. Thematic analysis was used to identify common language in the description of practices and the focus of each monitoring and evaluation framework. Logframe outcomes were mapped to MHPSS activity categories. We identified 38 program documents and 89 peer-reviewed articles, describing monitoring and evaluation of a wide range of MHPSS activities. In both program documents and peer-reviewed literature there was a lack of specificity and overlap in language used for goals and outcomes. Well-validated, reliable instruments were reported in the academic literature, but rarely used in monitoring and evaluation practices. We identified six themes in the terminology used to describe goals and outcomes. Logframe outcomes were more commonly mapped to generic program implementation activities (e.g. "capacity building") and those related to family and community support, while outcomes from academic articles were most frequently mapped to specialized psychological treatments. Inconsistencies between the language used in research and practice and discrepancies in measurement have broader implications for monitoring and evaluation in MHPSS programs in humanitarian settings within low- and middle-income countries. This scoping review of the terminology commonly used to describe monitoring and evaluation practices and their focus within MHPSS programming highlights areas of importance for the development of a more standardized approach to monitoring and evaluation.
USDA-ARS?s Scientific Manuscript database
High-density linkage maps are fundamental to contemporary organismal research and scientific approaches to genetic improvement, especially in paleopolyploids with exceptionally complex genomes, e.g., Upland cotton (Gossypium hirsutum L., 2n=52). Using 3 full-sib intra-specific mapping populations fr...
Web Services as Building Blocks for an Open Coastal Observing System
NASA Astrophysics Data System (ADS)
Breitbach, G.; Krasemann, H.
2012-04-01
Coastal observing systems need to integrate different observing methods like remote sensing, in-situ measurements, and models into a synoptic view of the state of the observed region. This integration can be based solely on web services combining data and metadata. Such an approach is pursued for COSYNA (Coastal Observing System for Northern and Arctic seas). Data from satellite and radar remote sensing and measurements from buoys, stations, and FerryBoxes form the observation part of COSYNA. These data are assimilated into models to create pre-operational forecasts. For discovering data, an OGC Web Feature Service (WFS) is used by the COSYNA data portal. This Web Feature Service holds not only the metadata needed for finding data, but also the URLs of web services to view and download the data. To make the data from different resources comparable, a common vocabulary is needed. For COSYNA, the standard names from the CF conventions are stored within the metadata whenever possible. For the metadata, an INSPIRE- and ISO 19115-compatible data format is used. The WFS is fed from the metadata system using database views. Actual data are stored in two different formats: in NetCDF files for gridded data and in an RDBMS for time-series-like data. The web services are mostly standards-based; the standards are mainly OGC standards. Maps are created from NetCDF files with the help of the ncWMS tool, whereas a self-developed Java servlet is used for maps of moving measurement platforms; in this case, download of data is offered via OGC SOS. For NetCDF files, OPeNDAP is used for data download. The OGC CSW is used for accessing extended metadata. The concept of data management in COSYNA, which is independent of the specific services used, will be presented. This concept is parameter- and data-centric and might be useful for other observing systems.
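As a rough sketch of the discovery step, the snippet below queries a WFS with the OWSLib package; the endpoint URL and feature type name are placeholders, not COSYNA's actual service addresses.

```python
from owslib.wfs import WebFeatureService

# Placeholder endpoint and layer name -- substitute a real WFS catalogue.
WFS_URL = "https://example.org/geoserver/wfs"

wfs = WebFeatureService(url=WFS_URL, version="1.1.0")

# List the feature types the catalogue offers (e.g. observation series).
for name, meta in wfs.contents.items():
    print(name, meta.title)

# Fetch features for one type inside a bounding box; per feature, the
# response would carry the view/download service URLs described above.
response = wfs.getfeature(typename=["cosyna:timeseries"],
                          bbox=(6.0, 53.0, 9.0, 56.0))
print(response.read()[:200])
```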
Commowick, Olivier; Akhondi-Asl, Alireza; Warfield, Simon K.
2012-01-01
We present a new algorithm, called local MAP STAPLE, to estimate from a set of multi-label segmentations both a reference standard segmentation and spatially varying performance parameters. It is based on a sliding window technique to estimate the segmentation and the segmentation performance parameters for each input segmentation. In order to allow for optimal fusion from the small amount of data in each local region, and to account for the possibility of labels not being observed in a local region of some (or all) input segmentations, we introduce prior probabilities for the local performance parameters through a new Maximum A Posteriori formulation of STAPLE. Further, we propose an expression to compute confidence intervals in the estimated local performance parameters. We carried out several experiments with local MAP STAPLE to characterize its performance and value for local segmentation evaluation. First, with simulated segmentations with known reference standard segmentation and spatially varying performance, we show that local MAP STAPLE performs better than both STAPLE and majority voting. Then we present evaluations with data sets from clinical applications. These experiments demonstrate that spatial adaptivity in segmentation performance is an important property to capture. We compared the local MAP STAPLE segmentations to STAPLE and to previously published fusion techniques and demonstrate the superiority of local MAP STAPLE over other state-of-the-art algorithms. PMID:22562727
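For orientation, the MAP estimation named above can be written schematically as follows; the notation is chosen here and simplified to the binary case, not copied from the paper.

```latex
% Per window, rater j's performance is \theta_j = (p_j, q_j)
% (sensitivity, specificity); D are the observed rater decisions and
% T the hidden true segmentation. The MAP estimate is
\hat{\theta} = \arg\max_{\theta}\;
  \Big[ \log \textstyle\sum_{T} P(D, T \mid \theta) + \log P(\theta) \Big],
% solved by EM. With conjugate Beta priors, e.g. p_j \sim \mathrm{Beta}(\alpha, \beta),
% the M-step adds pseudo-counts (\alpha - 1, \beta - 1) to the usual STAPLE
% sufficient statistics, which stabilizes estimates in small local windows.
```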
Hemispherical map for the human brain cortex
NASA Astrophysics Data System (ADS)
Tosun, Duygu; Prince, Jerry L.
2001-07-01
Understanding the function of the human brain cortex is a primary goal in human brain mapping. Methods to unfold and flatten the cortical surface for visualization and measurement have been described in previous literature, but comparison across multiple subjects is still difficult because of the lack of a standard mapping technique. We describe a new approach that maps each hemisphere of the cortex to a portion of a sphere in a standard way, making comparison of anatomy and function across different subjects possible. Starting with a three-dimensional magnetic resonance image of the brain, the cortex is segmented and represented as a triangle mesh. Defining a cut around the corpus callosum identifies the left and right hemispheres. Together, the two hemispheres are mapped to the complex plane using a conformal mapping technique. A Möbius transformation, which is conformal, is used to transform the points on the complex plane so that a projective transformation maps each brain hemisphere onto a spherical segment comprising a sphere with a cap removed. We determined the best size of the spherical cap by minimizing the relative area distortion between hemispherical maps and original cortical surfaces. The relative area distortion between the hemispherical maps and the original cortical surfaces for fifteen human brains is analyzed.
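The two conformal ingredients mentioned above have standard closed forms; the composition below is schematic (symbols chosen here) rather than the paper's exact parametrization.

```latex
% Moebius transformation of the complex plane (conformal when ad - bc != 0):
w = \frac{az + b}{cz + d}, \qquad ad - bc \neq 0 .
% Inverse stereographic projection taking w = u + iv onto the unit sphere:
(x, y, z) = \frac{1}{1 + |w|^{2}} \left( 2u,\; 2v,\; |w|^{2} - 1 \right).
% The Moebius parameters control where the hemisphere boundary lands on the
% sphere, i.e. the size of the removed spherical cap.
```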
Normalization of metabolomics data with applications to correlation maps.
Jauhiainen, Alexandra; Madhu, Basetti; Narita, Masako; Narita, Masashi; Griffiths, John; Tavaré, Simon
2014-08-01
In metabolomics, the goal is to identify and measure the concentrations of different metabolites (small molecules) in a cell or a biological system. The metabolites form an important layer in the complex metabolic network, and the interactions between different metabolites are often of interest. It is crucial to perform proper normalization of metabolomics data, but current methods may not be applicable when estimating interactions in the form of correlations between metabolites. We propose a normalization approach based on a mixed model, with simultaneous estimation of a correlation matrix. We also investigate how the common use of a calibration standard in nuclear magnetic resonance (NMR) experiments affects the estimation of correlations. We show with both real and simulated data that our proposed normalization method is robust and has good performance when discovering true correlations between metabolites. The standardization of NMR data is shown in simulation studies to affect our ability to discover true correlations only to a small extent, and comparing standardized and non-standardized real data does not result in any large differences in correlation estimates. Source code is freely available at https://sourceforge.net/projects/metabnorm/. Contact: alexandra.jauhiainen@ki.se. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
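In schematic form (our notation, not the paper's), a normalization model of this kind is a mixed model with a sample-specific offset and a jointly estimated residual correlation matrix:

```latex
% Log-abundance of metabolite j in sample i: metabolite mean, sample-level
% normalization offset b_i (absorbing dilution effects), correlated residuals.
y_{ij} = \mu_j + b_i + \varepsilon_{ij}, \qquad
b_i \sim \mathcal{N}(0, \sigma_b^{2}), \qquad
(\varepsilon_{i1}, \ldots, \varepsilon_{iJ})^{\top} \sim \mathcal{N}(0, \Sigma),
% with the metabolite-metabolite correlations of interest read off from
% \Sigma rather than computed from naively normalized data.
```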
Mariano, John; Grauch, V.J.
1988-01-01
Aeromagnetic anomalies are produced by variations in the strength and direction of the magnetic field of rocks that include magnetic minerals, commonly magnetite. Patterns of anomalies on aeromagnetic maps can reveal structures - for example, faults which have juxtaposed magnetic rocks against non-magnetic rocks, or areas of alteration where magnetic minerals have been destroyed by hydrothermal activity. Tectonic features of regional extent may not become apparent until a number of aeromagnetic surveys have been compiled and plotted at the same scale. Commonly the compilation involves piecing together data from surveys that were flown at different times with widely disparate flight specifications and data reduction procedures. The data may be compiled into a composite map, where all the pieces are plotted onto one map without regard to the difference in flight elevation and datum, or they may be compiled into a merged map, where all survey data are analytically reduced to a common flight elevation and datum, and then digitally merged at the survey boundaries. The composite map retains the original resolution of all the survey data, but computer methods to enhance regional features crossing the survey boundaries may not be applied. On the other hand, computer methods can be applied to the merged data, but the accuracy of the data may be slightly diminished.
NASA Astrophysics Data System (ADS)
Gu, Ying; Lu, Cuiyun; Zhang, Xiaofeng; Li, Chao; Yu, Juhua; Sun, Xiaowen
2015-05-01
We report the genetic linkage map of Jian carp (Cyprinus carpio var. Jian). An F1 population comprising 94 Jian carp individuals was mapped using 254 microsatellite markers. The genetic map spanned 1381.592 cM and comprised 44 linkage groups, with an average marker distance of 6.58 cM. We identified eight quantitative trait loci (QTLs) for body weight (BW) in seven linkage groups, explaining 12.6% to 17.3% of the phenotypic variance. Comparative mapping was performed between Jian carp and mirror carp (Cyprinus carpio L.), which both have 50 chromosomes. One hundred and ninety-eight Jian carp marker loci were found in common with the mirror carp map, with 186 (93.94%) showing synteny. All 44 Jian carp linkage groups could be aligned one-to-one to the 44 mirror carp linkage groups, mostly sharing two or more common loci. Three QTLs for BW in Jian carp were conserved in mirror carp. QTL comparison suggested that the QTL confidence interval in mirror carp was more precise than the homologous interval in Jian carp, being contained within the QTL interval in Jian carp. The syntenic relationship and consensus QTLs between the two varieties provide a foundation for genomic research and genetic breeding in common carp.
Mapping Resource Selection Functions in Wildlife Studies: Concerns and Recommendations
Morris, Lillian R.; Proffitt, Kelly M.; Blackburn, Jason K.
2018-01-01
Predicting the spatial distribution of animals is an important and widely used tool with applications in wildlife management, conservation, and population health. Wildlife telemetry technology, coupled with the availability of spatial data and GIS software, has facilitated advancements in species distribution modeling. There are also challenges related to these advancements, including the accurate and appropriate implementation of species distribution modeling methodology. Resource Selection Function (RSF) modeling is a commonly used approach for understanding species distributions and habitat usage, and mapping the RSF results can enhance study findings and make them more accessible to researchers and wildlife managers. Currently, there is no consensus in the literature on the most appropriate method for mapping RSF results, methods are frequently not described, and mapping approaches are not always related to accuracy metrics. We conducted a systematic review of the RSF literature to summarize the methods used to map RSF outputs, discussed the relationship between mapping approaches and accuracy metrics, performed a case study on the implications of employing different mapping methods, and provide recommendations as to appropriate mapping techniques for RSF studies. We found extensive variability in methodology for mapping RSF results. Our case study revealed that the most commonly used approaches for mapping RSF results led to notable differences in the visual interpretation of RSF results, and there is a concerning disconnect between accuracy metrics and mapping methods. We make five recommendations for researchers mapping the results of RSF studies, focused on carefully selecting and describing the method used to map RSF results and on relating mapping approaches to accuracy metrics. PMID:29887652
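One frequently used, though, as the review stresses, far from standardized, way to map RSF output is to slice predictions into equal-count (quantile) classes. The sketch below shows ten such bins on synthetic values; the function name and data are invented for illustration.

```python
import numpy as np

def rsf_quantile_classes(rsf_values: np.ndarray, n_bins: int = 10) -> np.ndarray:
    """Classify RSF predictions into equal-count (quantile) bins 1..n_bins,
    a common convention for mapping relative selection strength."""
    edges = np.quantile(rsf_values, np.linspace(0, 1, n_bins + 1))
    # np.digitize against the interior edges yields indices 0..n_bins-1.
    return np.digitize(rsf_values, edges[1:-1]) + 1

rng = np.random.default_rng(1)
predicted = rng.lognormal(size=10_000)   # stand-in for RSF predictions
classes = rsf_quantile_classes(predicted)
print(np.bincount(classes)[1:])          # roughly 1000 values per class
```

Because the classes are relative ranks, any accuracy metric reported for the map should be computed on the same binning that is displayed, which is exactly the mapping-metric link the review finds missing.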
Morton, Douglas M.; Digital preparation by Bovard, Kelly R.
2003-01-01
Open-File Report 03-418 is a digital geologic data set that maps and describes the geology of the Fontana 7.5' quadrangle, Riverside and San Bernardino Counties, California. The Fontana quadrangle database is one of several 7.5' quadrangle databases that are being produced by the Southern California Areal Mapping Project (SCAMP). These maps and databases are, in turn, part of the nationwide digital geologic map coverage being developed by the National Cooperative Geologic Map Program of the U.S. Geological Survey (USGS). Open-File Report 03-418 contains a digital geologic map database of the Fontana 7.5' quadrangle, Riverside and San Bernardino Counties, California, that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A PostScript file (fon_map.ps) to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. An Encapsulated PostScript (EPS) file (fon_grey.eps), created in Adobe Illustrator 10.0, to plot the geologic map on a grey topographic base, containing a Correlation of Map Units (CMU), a Description of Map Units (DMU), and an index map. 4. Portable Document Format (.pdf) files of: a. the Readme file (including, in Appendix I, the data contained in fon_met.txt); b. the same graphics as plotted in 2 and 3 above. Test plots have not produced precise 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale. The Correlation of Map Units and the Description of Map Units are in the editorial format of USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g., Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (4b above) or plotting the PostScript files (2 or 3 above).
30 CFR 75.508 - Map of electrical system.
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 30, Mineral Resources; Mandatory Safety Standards-Underground Coal Mines; Electrical Equipment-General; § 75.508 Map of electrical system. [Statutory Provisions] The location and the electrical rating of all stationary electric...
30 CFR 75.508 - Map of electrical system.
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 30, Mineral Resources; Mandatory Safety Standards-Underground Coal Mines; Electrical Equipment-General; § 75.508 Map of electrical system. [Statutory Provisions] The location and the electrical rating of all stationary electric...
30 CFR 75.508 - Map of electrical system.
Code of Federal Regulations, 2012 CFR
2012-07-01
Title 30, Mineral Resources; Mandatory Safety Standards-Underground Coal Mines; Electrical Equipment-General; § 75.508 Map of electrical system. [Statutory Provisions] The location and the electrical rating of all stationary electric...
State Highway Maps: A Route to a Learning Adventure
ERIC Educational Resources Information Center
McDuffie, Thomas E.; Cifelli, Joseph
2006-01-01
Science within the folds of highway maps is explored through a series of hands-on experiences designed to reinforce and extend map-reading skills in grades 6-8. The increasingly sophisticated, standards-related activities include measuring distances between population centers, finding communities named after trees, animals, and geologic features,…
This document summarizes the process followed to utilize the fuel consumption map of a Ricardo-modeled engine, together with vehicle fuel consumption data, to generate a full engine fuel consumption map that can be used by EPA's ALPHA vehicle simulations.
The Development of Clinical Document Standards for Semantic Interoperability in China
Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping
2011-01-01
Objectives This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. Methods The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Results Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. Conclusions This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722
Manousaki, Tereza; Tsakogiannis, Alexandros; Taggart, John B; Palaiokostas, Christos; Tsaparis, Dimitris; Lagnel, Jacques; Chatziplis, Dimitrios; Magoulas, Antonios; Papandroulakis, Nikos; Mylonas, Constantinos C; Tsigenopoulos, Costas S
2015-12-29
Common pandora (Pagellus erythrinus) is a benthopelagic marine fish belonging to the teleost family Sparidae, and a newly recruited species in Mediterranean aquaculture. The paucity of genetic information relating to sparids, despite their growing economic value for aquaculture, provides the impetus for exploring the genomics of this fish group. Genomic tool development, such as genetic linkage maps provision, lays the groundwork for linking genotype to phenotype, allowing fine-mapping of loci responsible for beneficial traits. In this study, we applied ddRAD methodology to identify polymorphic markers in a full-sib family of common pandora. Employing the Illumina MiSeq platform, we sampled and sequenced a size-selected genomic fraction of 99 individuals, which led to the identification of 920 polymorphic loci. Downstream mapping analysis resulted in the construction of 24 robust linkage groups, corresponding to the karyotype of the species. The common pandora linkage map showed varying degrees of conserved synteny with four other teleost genomes, namely the European seabass (Dicentrarchus labrax), Nile tilapia (Oreochromis niloticus), stickleback (Gasterosteus aculeatus), and medaka (Oryzias latipes), suggesting a conserved genomic evolution in Sparidae. Our work exploits the possibilities of genotyping by sequencing to gain novel insights into genome structure and evolution. Such information will boost the study of cultured species and will set the foundation for a deeper understanding of the complex evolutionary history of teleosts. Copyright © 2016 Manousaki et al.
Generalized Weyl-Wigner map and Vey quantum mechanics
NASA Astrophysics Data System (ADS)
Dias, Nuno Costa; Prata, João Nuno
2001-12-01
The Weyl-Wigner map yields the entire structure of Moyal quantum mechanics directly from the standard operator formulation. The covariant generalization of Moyal theory, also known as Vey quantum mechanics, was presented in the literature many years ago. However, a derivation of the formalism directly from standard operator quantum mechanics, clarifying the relation between the two formulations, is still missing. In this article we present a covariant generalization of the Weyl order prescription and of the Weyl-Wigner map and use them to derive Vey quantum mechanics directly from the standard operator formulation. The procedure displays some interesting features: it yields all the key ingredients and provides a more straightforward interpretation of the Vey theory including a direct implementation of unitary operator transformations as phase space coordinate transformations in the Vey idiom. These features are illustrated through a simple example.
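For orientation, the flat (non-covariant) correspondence that the paper generalizes has the standard form below; this is textbook Moyal theory, not the covariant Vey construction itself.

```latex
% Standard Weyl-Wigner correspondence: an operator \hat{A} maps to a
% phase-space symbol via
A(x, p) = \int dy \; e^{-i p y/\hbar}\,
  \left\langle x + \tfrac{y}{2} \right| \hat{A} \left| x - \tfrac{y}{2} \right\rangle ,
% and operator products map to the Moyal star product,
(A \star B)(x, p) = A \,
  \exp\!\Big[ \tfrac{i\hbar}{2}
    \big( \overleftarrow{\partial}_x \overrightarrow{\partial}_p
        - \overleftarrow{\partial}_p \overrightarrow{\partial}_x \big) \Big] B ,
% whose covariant generalization is the subject of the Vey construction above.
```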
Charge to Road Map Development Sessions
NASA Technical Reports Server (NTRS)
Barth, Janet
2004-01-01
Develop a road map for new standard radiation belt models. Model applications: spacecraft and instruments. Reduce risk. Reduce cost. Improve performance. Increase system lifetime. Reduce risk to astronauts.
Biomedical Terminology Mapper for UML projects.
Thibault, Julien C; Frey, Lewis
2013-01-01
As the biomedical community collects and generates more and more data, the need to describe these datasets for exchange and interoperability becomes crucial. This paper presents a mapping algorithm that can help developers expose local implementations described with UML through standard terminologies. The input UML class or attribute name is first normalized and tokenized, then lookups in a UMLS-based dictionary are performed. For the evaluation of the algorithm 142 UML projects were extracted from caGrid and automatically mapped to National Cancer Institute (NCI) terminology concepts. Resulting mappings at the UML class and attribute levels were compared to the manually curated annotations provided in caGrid. Results are promising and show that this type of algorithm could speed-up the tedious process of mapping local implementations to standard biomedical terminologies.
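The normalize-tokenize-lookup pipeline described above can be sketched as follows; the tiny dictionary stands in for the UMLS-based lookup, and the concept identifiers shown are illustrative rather than real CUIs.

```python
import re

# Toy stand-in for the UMLS-based dictionary: normalized token phrases
# mapped to concept identifiers (illustrative values, not real CUIs).
DICTIONARY = {
    "patient": "C0000001",
    "blood pressure": "C0000002",
    "specimen collection": "C0000003",
}

def tokenize(uml_name: str) -> list[str]:
    """Split a UML class/attribute name on camelCase and underscores."""
    spaced = re.sub(r"([a-z0-9])([A-Z])", r"\1 \2", uml_name)
    return spaced.replace("_", " ").lower().split()

def map_name(uml_name: str) -> dict[str, str]:
    """Greedy longest-phrase lookup of normalized tokens in the dictionary."""
    tokens, mappings, i = tokenize(uml_name), {}, 0
    while i < len(tokens):
        for j in range(len(tokens), i, -1):  # try the longest phrase first
            phrase = " ".join(tokens[i:j])
            if phrase in DICTIONARY:
                mappings[phrase] = DICTIONARY[phrase]
                i = j
                break
        else:
            i += 1  # no phrase starting here matched; skip one token
    return mappings

print(map_name("PatientBloodPressure"))
# {'patient': 'C0000001', 'blood pressure': 'C0000002'}
```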
A high-resolution cattle CNV map by population-scale genome sequencing
USDA-ARS?s Scientific Manuscript database
Copy Number Variations (CNVs) are common genomic structural variations that have been linked to human diseases and phenotypic traits. Prior studies in cattle have produced low-resolution CNV maps. We constructed a draft, high-resolution map of cattle CNVs based on whole genome sequencing data from 7...
Joint QTL linkage mapping for multiple-cross mating design sharing one common parent
USDA-ARS?s Scientific Manuscript database
Nested association mapping (NAM) is a novel genetic mating design that combines the advantages of linkage analysis and association mapping. This design provides opportunities to study the inheritance of complex traits, but also requires more advanced statistical methods. In this paper, we present th...
Scoping of Flood Hazard Mapping Needs for Merrimack County, New Hampshire
2006-01-01
Abbreviations: DOQ, Digital Orthophoto Quadrangle; DOQQ, Digital Ortho Quarter Quadrangle; DTM, Digital Terrain Model; FBFM, Flood Boundary and Floodway Map; FEMA, Federal... The report discusses available data and coverages within New Hampshire (for example, 2003 National Agriculture Imagery Program (NAIP) color Digital Orthophoto...), with orthophotos providing improved base map accuracy. NH GRANIT is presently converting the standard, paper FIRMs and Flood Boundary and Floodway Maps (FBFMs...
MetaMapping the nursing procedure manual.
Peace, Jane; Brennan, Patricia Flatley
2006-01-01
Nursing procedure manuals are an important resource for practice, but ensuring that the correct procedure can be located when needed is an ongoing challenge. This poster presents an approach used to automatically index nursing procedures with standardized nursing terminology. Although indexing yielded a low number of mappings, examination of successfully mapped terms, incorrect mappings, and unmapped terms reveals important information about the reasons automated indexing fails.
NASA Astrophysics Data System (ADS)
Kowalewski, M. G.; Janz, S. J.
2015-02-01
Methods of absolute radiometric calibration of backscatter ultraviolet (BUV) satellite instruments are compared as part of an effort to minimize pre-launch calibration uncertainties. An internally illuminated integrating sphere source has been used for the Shuttle Solar BUV, the Total Ozone Mapping Spectrometer, the Ozone Monitoring Instrument, and the Global Ozone Monitoring Experiment 2, using standardized procedures traceable to national standards. These sphere-based spectral responsivities agree to within the derived combined standard uncertainty of 1.87% relative to calibrations performed using an external diffuser illuminated by standard irradiance sources, the customary spectral radiance responsivity calibration method for BUV instruments. The combined standard uncertainty for these calibration techniques, as implemented at the NASA Goddard Space Flight Center's Radiometric Calibration and Development Laboratory, is shown to be less than 2% at 250 nm when using a single traceable calibration standard.
Study on the standard architecture for geoinformation common services
NASA Astrophysics Data System (ADS)
Zha, Z.; Zhang, L.; Wang, C.; Jiang, J.; Huang, W.
2014-04-01
The construction of platforms for geoinformation common services has been completed or is ongoing in most provinces and cities in China in recent years, and these platforms play an important role in economic and social activities. Geoinformation and geoinformation-based services are the key issues in the platform. The standards on geoinformation common services serve as bridges among the users, systems, and designers of the platform. The standard architecture for geoinformation common services is the guideline for designing and using the standard system, in which the standards are integrated with each other to promote the development, sharing, and servicing of geoinformation resources. Establishing the standard architecture for geoinformation common services is one of the tasks of the project "Study on important standards for geoinformation common services and management of public facilities in city". The scope of the standard architecture is defined, covering, for example, data and information models, interoperability interfaces and services, and information management. Research was conducted on the status of international geoinformation common services standards in organizations such as ISO/TC 211 and OGC, and in countries or unions such as the USA, the EU, and Japan. Principles such as availability, suitability, and extensibility were set up to evaluate the standards. The development requirements and practical situation are then analyzed, and a framework of the standard architecture for geoinformation common services is proposed. Finally, a summary and outlook for geoinformation standards are presented.
A framework for evaluating and utilizing medical terminology mappings.
Hussain, Sajjad; Sun, Hong; Sinaci, Anil; Erturkmen, Gokce Banu Laleci; Mead, Charles; Gray, Alasdair J G; McGuinness, Deborah L; Prud'Hommeaux, Eric; Daniel, Christel; Forsberg, Kerstin
2014-01-01
Use of medical terminologies and mappings across them is considered a crucial pre-requisite for achieving interoperable eHealth applications. Built upon the outcomes of several research projects, we introduce a framework for evaluating and utilizing terminology mappings that offers a platform for i) performing various mapping strategies, ii) representing terminology mappings together with their provenance information, and iii) enabling terminology reasoning for inferring both new and erroneous mappings. We present the results of the introduced framework from the SALUS project, where we evaluated the quality of both existing and inferred terminology mappings among standard terminologies.
The Power Plant Mapping Student Project: Bringing Citizen Science to Schools
NASA Astrophysics Data System (ADS)
Tayne, K.; Oda, T.; Gurney, K. R.; O'Keeffe, D.; Petron, G.; Tans, P. P.; Frost, G. J.
2014-12-01
An emission inventory (EI) is a conventional tool to quantify and monitor anthropogenic emissions of greenhouse gases and air pollutants into the atmosphere. Gridded EIs can visually show geographical patterns of emissions and their changes over time. These patterns, when available, are often determined using location data collected by regional governments, industries, and researchers. Datasets such as Carbon Monitoring for Action (CARMA, www.carma.org) are particularly useful for mapping emissions from large point sources and have been widely used in the EI community. The EI community is aware of potentially significant errors in the geographical locations of point sources, including power plants. The big challenge, however, is to review tens of thousands of power plant locations around the world and correct them where needed. The Power Plant Mapping Student Project (PPMSP) is a platform designed for students in 4th through 12th grade to improve the geographical locations of power plants indicated in existing datasets, to the benefit of international EI research. In PPMSP, we use VENTUS, a web-based platform (http://ventus.project.asu.edu/) that invites citizens to contribute power plant location data. Using VENTUS, students view scenes in the vicinity of reported power plant coordinates on Google Maps. Students either verify the location of a power plant or search for it within a designated radius using various indicators, an e-guide, and a power plant photo gallery for assistance. If the power plant cannot be found, students mark the plant as unverified. To assure quality for research use, the project contains multiple checkpoints and levels of review. While participating in meaningful research that directly benefits the EI research community, students are engaged in relevant science curricula designed to meet each grade level's Next Generation Science Standards. Students study energy, climate change, the atmosphere, and geographical information systems. The curricula are integrated with math and writing, connecting to the Common Core Standards. PPMSP is designed to be accessible and relevant to all learners, including students learning English. With PPMSP, students are empowered to participate in relevant research and become future leaders in mitigating climate change.
Mapping groundwater quality distinguishing geogenic and anthropogenic contribution using NBL
NASA Astrophysics Data System (ADS)
Preziosi, Elisabetta; Ducci, Daniela; Condesso de Melo, Maria Teresa; Parrone, Daniele; Sellerino, Mariangela; Ghergo, Stefano; Oliveira, Joana; Ribeiro, Luis
2015-04-01
Groundwater is threatened by anthropic activities, and pollution affects a large number of aquifers worldwide. Qualitative and quantitative monitoring is required to assess groundwater status and track its evolution in time and space, especially where anthropic pressures are stronger. Up to now, groundwater quality mapping has been performed separately from the assessment of its natural status, i.e. the definition of the natural background level of a particular element in a particular area or groundwater body. The natural background level (NBL) of a substance or element makes it possible to distinguish anthropogenic pollution from contamination of natural origin in a population of groundwater samples. NBLs are the result of different atmospheric, geological, chemical and biological interaction processes during groundwater infiltration and circulation. There is an increasing need for water managers to have sound indications on where to exploit good quality groundwater. Indeed, the extent of a groundwater body is often very large, on the order of tens or hundreds of square kilometres. Selecting a proper location for good quality groundwater abstraction is often reduced to a question of ease of drilling (access, roads, authorizations, etc.) or, at most, to quantitative aspects driven by geophysical exploration (the site most promising from a transmissivity point of view). So how can administrators and water managers be advised on the exploitation of good quality drinking water? In the case of anthropic contamination, how should the area to be restored be defined, and to which threshold (e.g. the background level) should the concentration be lowered by the restoration measures? In the framework of a common project between research institutions in Italy (funded by CNR) and Portugal (funded by FCT), our objective is to establish a methodology that merges 1) the evaluation of NBLs and 2) the need to take drinking water standards into account, through spatial analysis. We compare diverse case studies using geochemical maps built by kriging, in which we interpolate the conditional probability of exceeding the reference value (i.e. the drinking water standard) or the local natural background level. The resulting maps provide a useful reference for management purposes.
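A common way to obtain such probability-of-exceedance maps is indicator kriging: threshold the observations and krige the resulting 0/1 indicator, so the interpolated surface approximates the conditional probability of exceedance. The abstract does not specify its exact kriging variant, so this is only one plausible implementation, sketched with synthetic data and the pykrige package:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

# Synthetic sample data standing in for measured concentrations (mg/L).
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 10, 60), rng.uniform(0, 10, 60)
conc = rng.lognormal(mean=2.0, sigma=0.6, size=60)

threshold = 10.0  # drinking water standard or local NBL for the element
indicator = (conc > threshold).astype(float)  # 1 = exceeds, 0 = complies

# Kriging the 0/1 indicator yields a surface that approximates the
# conditional probability of exceeding the threshold at unsampled points.
ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
gridx = np.linspace(0, 10, 100)
gridy = np.linspace(0, 10, 100)
prob, var = ok.execute("grid", gridx, gridy)
prob = np.clip(prob, 0.0, 1.0)  # kriged indicators can slightly leave [0, 1]
```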
Application of OpenStreetMap (OSM) to Support the Mapping Village in Indonesia
NASA Astrophysics Data System (ADS)
Swasti Kanthi, Nurin; Hery Purwanto, Taufik
2016-11-01
Geospatial information is important in this era, because location information is needed to know the condition of a region. In 2015 the Indonesian government released regulatory standards for detailed mapping at the village level and for the parent maps, set forth in the Norms, Standards, Procedures and Criteria for Village Mapping (NSPK). Over time, Web and Mobile GIS have been developed with a wide range of applications. The merger of detailed mapping and Web GIS is still rarely performed and not used optimally. OpenStreetMap (OSM) is a WebGIS that can be utilized as a Mobile GIS, provides information detailed enough to represent individual buildings, and can be used for mapping a village. Village mapping using OSM was conducted with a remote sensing approach and Geographic Information Systems (GIS), interpreting remote sensing imagery in OSM. The study analyzed how far OSM can support village mapping, by entering house number data, administrative boundaries, public facilities and land use into OSM with reference data and Village Plan imagery. The results of mapping part of a village in OSM serve as a reference for village map-making and were analyzed for conformity with the NSPK for detailed mapping of the Rukun Warga (RW), which is part of village mapping. The use of OSM greatly assists the process of mapping the details of a region, with image data sources and open access. However, continued maintenance and updating of the data source are needed to preserve the validity of the data.
NASA Astrophysics Data System (ADS)
Chiocci, F. L.; Gorini, C.; Ercilla, G.; Sakellariou, D.; Casalbore, D.; Ridente, D.
2017-12-01
46,000 km of densely settled coastlines characterise the Mediterranean Sea. The region connects three continents, where the population has doubled in the last 20 years, and among which trade, maritime transport and migratory fluxes have been increasing. Moreover, the Mediterranean is by far the world's largest tourist destination, attracting almost a third of international tourists and generating more than a quarter of tourism-related revenues worldwide. The Mediterranean area lies in a plate boundary zone that is highly active in terms of seismicity, volcanism and submarine geological processes, which in recent times have repeatedly demonstrated their ability to generate catastrophic events. As an example, 98 tsunamis were recorded in the Mediterranean in historical times (on average one every century). This census does not encompass small events, such as minor tsunamis generated by submarine landslides, which can produce serious damage in the near field. At the Stromboli volcanic island (Southern Tyrrhenian Sea), for instance, five such events occurred over the last century. Mapping the seafloor for geohazard assessment therefore becomes especially important for the sustainable development of marine and coastal areas, both economically and socially. The increasing amount of high-resolution seafloor mapping data allows geohazard features such as volcanic vents, active faults, submarine landslides, canyon heads, migrating bedforms and fluid expulsion structures to be defined with a level of detail able to highlight even locally dangerous situations. If the marine geoscience community is able to build common standards to interpret and cartographically represent marine geohazard features, private industry and public agencies will benefit from an invaluable tool that will help to better exploit marine resources and/or preserve the marine and coastal environment. This contribution will present spectacular examples of marine geohazards from the Mediterranean Sea, together with the results of the MAGIC (Marine Geohazards along the Italian Coasts) Project and the aims of EU-H2020 SHAREMED, a possible new initiative that would involve the whole Mediterranean geoscience community in realizing geohazard feature mapping (see figure).
Pathways of Knowing: Integrating Citizen Science and Critical Thinking in the Adult ELL Classroom
NASA Astrophysics Data System (ADS)
Basham, Melody
This action research study examines the common perceptions and constructs that currently exist in educating adult immigrants in Arizona and considers how the integration of citizen science into the current English curriculum might promote higher order thinking and educational equity in this population. A citizen science project called the Mastodon Matrix Project was introduced to a Level 2 ELAA (English Language Acquisition for Adults) classroom and aligned with the Arizona Adult Standards for ELAA education. Pre- and post-attitudinal surveys, level tests, and personal meaning maps were used to assess student attitudes towards science, views on technology, English skills, and knowledge gained as a result of doing citizen science over a period of 8 weeks.
New approach to estimating variability in visual field data using an image processing technique.
Crabb, D P; Edgar, D F; Fitzke, F W; McNaught, A I; Wynn, H P
1995-01-01
AIMS--A new framework for evaluating pointwise sensitivity variation in computerised visual field data is demonstrated. METHODS--A measure of local spatial variability (LSV) is generated using an image processing technique. Fifty five eyes from a sample of normal and glaucomatous subjects, examined on the Humphrey field analyser (HFA), were used to illustrate the method. RESULTS--Significant correlations between LSV and conventional estimates--namely, HFA pattern standard deviation and short term fluctuation--were found. CONCLUSION--LSV does not depend on normative reference data or repeated threshold determinations, thus potentially reducing test time. Also, the illustrated pointwise maps of LSV could provide a method for identifying areas of fluctuation commonly found in early glaucomatous field loss. PMID:7703196
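The abstract does not spell out the exact image processing operator, but a local spatial variability measure of this kind can be sketched as a moving-window standard deviation over the grid of threshold sensitivities; the window size and edge handling below are assumptions:

```python
import numpy as np
from scipy.ndimage import generic_filter

# Hypothetical 8x8 grid of pointwise threshold sensitivities (dB).
sensitivity = np.random.default_rng(1).normal(30.0, 2.0, size=(8, 8))

# Local spatial variability as the standard deviation in a 3x3 moving
# window; window size and edge handling are assumptions, not the paper's.
lsv_map = generic_filter(sensitivity, np.std, size=3, mode="nearest")
```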
SOHO EIT Carrington maps from synoptic full-disk data
NASA Technical Reports Server (NTRS)
Thompson, B. J.; Newmark, J. S.; Gurman, J. B.; Delaboudiniere, J. P.; Clette, F.; Gibson, S. E.
1997-01-01
The solar synoptic maps, obtained from observations carried out since May 1996 by the extreme-ultraviolet imaging telescope (EIT) onboard the Solar and Heliospheric Observatory (SOHO), are presented. The maps were constructed for each Carrington rotation with the calibrated data. Off-limb maps at 1.05 and 1.10 solar radii were generated for three coronal lines using the standard applied to coronagraph synoptic maps. The maps reveal several aspects of the solar structure over the entire rotation and are used in the Whole Sun Month modeling campaign.
Mapping QTLs for drought tolerance in a SEA 5 x AND 277 common bean cross with SSRs and SNP markers.
Briñez, Boris; Perseguini, Juliana Morini Küpper Cardoso; Rosa, Juliana Santa; Bassi, Denis; Gonçalves, João Guilherme Ribeiro; Almeida, Caléo; Paulino, Jean Fausto de Carvalho; Blair, Matthew Ward; Chioratto, Alisson Fernando; Carbonell, Sérgio Augusto Morais; Valdisser, Paula Arielle Mendes Ribeiro; Vianello, Rosana Pereira; Benchimol-Reis, Luciana Lasry
2017-01-01
The common bean is characterized by high sensitivity to drought and low productivity. Breeding for drought resistance in this species involves genes of different genetic groups. In this work, we used a SEA 5 x AND 277 cross to map quantitative trait loci associated with drought tolerance in order to assess the factors that determine the magnitude of drought response in common beans. A total of 438 polymorphic markers were used to genotype the F8 mapping population. Phenotyping was done in two greenhouses, one used to simulate drought and the other to simulate irrigated conditions. Fourteen traits associated with drought tolerance were measured to identify the quantitative trait loci (QTLs). The map was constructed with 331 markers that covered all 11 chromosomes and had a total length of 1515 cM. Twenty-two QTLs were discovered for chlorophyll, leaf and stem fresh biomass, leaf biomass dry weight, leaf temperature, number of pods per plant, number of seeds per plant, seed weight, days to flowering, dry pod weight and total yield under well-watered and drought (stress) conditions. All the QTLs detected under drought conditions showed positive effects of the SEA 5 allele. This study provides a better understanding of the genetic inheritance of drought tolerance in common bean.
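A single-marker scan of the kind underlying such QTL detection regresses each trait on each marker's genotype and reports a LOD score per marker. The sketch below is a generic illustration under assumed 0/1 genotype coding; the study's actual QTL analysis is more sophisticated than this:

```python
import numpy as np

def single_marker_scan(genotypes, phenotype):
    """LOD score per marker from regressing a trait on marker genotype.

    genotypes: (individuals, markers) array coded 0/1 for the two parental
    alleles of an inbred (e.g., F8) population; phenotype: trait values.
    """
    n, m = genotypes.shape
    ss_tot = np.sum((phenotype - phenotype.mean()) ** 2)
    lods = np.empty(m)
    for j in range(m):
        X = np.column_stack([np.ones(n), genotypes[:, j]])
        beta, *_ = np.linalg.lstsq(X, phenotype, rcond=None)
        ss_res = np.sum((phenotype - X @ beta) ** 2)
        # LOD compares the marker model against the null (mean-only) model.
        lods[j] = (n / 2.0) * np.log10(ss_tot / ss_res)
    return lods
```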
Maser: one-stop platform for NGS big data from analysis to visualization
Kinjo, Sonoko; Monma, Norikazu; Misu, Sadahiko; Kitamura, Norikazu; Imoto, Junichi; Yoshitake, Kazutoshi; Gojobori, Takashi; Ikeo, Kazuho
2018-01-01
A major challenge in analyzing the data from high-throughput next-generation sequencing (NGS) is how to handle the huge amounts of data and the variety of NGS tools, and how to visualize the resultant outputs. To address these issues, we developed a cloud-based data analysis platform, Maser (Management and Analysis System for Enormous Reads), and an original genome browser, Genome Explorer (GE). Maser enables users to manage up to 2 terabytes of data and to conduct analyses with easy graphical user interface operations, and offers analysis pipelines in which several individual tools are combined into a single pipeline for very common and standard analyses. GE automatically visualizes genome assembly and mapping results output from Maser pipelines, without requiring additional data upload. With this function, the Maser pipelines can graphically display the results output from all the embedded tools and the mapping results in a web browser. Maser thus provides a user-friendly analysis platform, especially for beginners, by improving the graphical display and providing selected standard pipelines that work with the built-in genome browser. In addition, all the analyses executed on Maser are recorded in the analysis history, helping users to trace and repeat analyses. The entire process of analysis and its history can be shared with collaborators or opened to the public. In conclusion, our system is useful for managing, analyzing, and visualizing NGS data and achieves traceability, reproducibility, and transparency of NGS analysis. Database URL: http://cell-innovation.nig.ac.jp/maser/ PMID:29688385
NASA Astrophysics Data System (ADS)
Pathak, Sayan D.; Haynor, David R.; Thompson, Carol L.; Lein, Ed; Hawrylycz, Michael
2009-02-01
Understanding the geography of genetic expression in the mouse brain has opened previously unexplored avenues in neuroinformatics. The Allen Brain Atlas (www.brain-map.org) (ABA) provides genome-wide colorimetric in situ hybridization (ISH) gene expression images at high spatial resolution, all mapped to a common three-dimensional 200 μm³ spatial framework defined by the Allen Reference Atlas (ARA), and is a unique data set for studying expression-based structural and functional organization of the brain. The goal of this study was to facilitate an unbiased, data-driven structural partitioning of the major structures in the mouse brain. We have developed an algorithm that uses nonnegative matrix factorization (NMF) to perform parts-based analysis of ISH gene expression images. The standard NMF approach and its variants are limited in their ability to flexibly integrate prior knowledge in the context of spatial data. In this paper, we introduce spatial connectivity as an additional regularization in the NMF decomposition via the use of Markov Random Fields (mNMF). The mNMF algorithm alternates neighborhood updates with iterations of the standard NMF algorithm to exploit spatial correlations in the data. We present the algorithm and show the subdivisions of the hippocampus and somatosensory cortex obtained via this approach. The results are compared with established neuroanatomic knowledge. We also highlight novel gene-expression-based subdivisions of the hippocampus identified by using the mNMF algorithm.
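The alternation the abstract describes, standard multiplicative NMF updates interleaved with a spatial neighborhood step, can be sketched as follows; the simple neighbor-averaging update and the weight lam are assumptions standing in for the paper's MRF formulation:

```python
import numpy as np

def nmf_spatial(V, rank, neighbors, n_iter=200, lam=0.1, seed=0):
    """Toy NMF with an interleaved spatial smoothing step on voxel loadings.

    V: (genes, voxels) nonnegative expression matrix.
    neighbors: neighbors[v] lists the voxels spatially adjacent to voxel v
    (assumed non-empty). The averaging step and weight lam stand in for
    the paper's Markov Random Field neighborhood update.
    """
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], rank)) + 1e-9
    H = rng.random((rank, V.shape[1])) + 1e-9
    for _ in range(n_iter):
        # Standard multiplicative updates (Lee-Seung).
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
        # Neighborhood update: pull each voxel's loadings toward the mean
        # of its spatial neighbors to exploit spatial correlation.
        H_smooth = np.stack([H[:, nb].mean(axis=1) for nb in neighbors], axis=1)
        H = (1 - lam) * H + lam * H_smooth
    return W, H
```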
Common Core State Standards 101
ERIC Educational Resources Information Center
Rothman, Robert
2013-01-01
The Common Core State Standards (CCSS) represent the first time that nearly every state has set common expectations for what students should know and be able to do. In the past, each state set its own standards, and the results varied widely. And while states collectively developed these common standards, decisions about the curriculum and…
Hydrologic Unit Map -- 1978, state of South Dakota
,
1978-01-01
This map and accompanying table show Hydrologic Units that are basically hydrographic in nature. The Cataloging Units shown supplant the Cataloging Units previously depicted on the 1974 State Hydrologic Unit Map. The boundaries as shown have been adapted from the 1974 State Hydrologic Unit Map, "The Catalog of Information on Water Data" (1972), "Water Resources Regions and Subregions for the National Assessment of Water and Related Land Resources" by the U.S. Water Resources Council (1970), "River Basins of the United States" by the U.S. Soil Conservation Service (1963, 1970), "River Basin Maps Showing Hydrologic Stations" by the Inter-Agency Committee on Water Resources, Subcommittee on Hydrology (1961), and State planning maps. The political subdivisions have been adopted from "Counties and County Equivalents of the States of the United States" presented in Federal Information Processing Standards Publication 6-2, issued by the National Bureau of Standards (1973), in which each county or county equivalent is identified by a 2-character State code and a 3-character county code. The Regions, Subregions and Accounting Units are aggregates of the Cataloging Units. The Regions and Subregions are currently (1978) used by the U.S. Water Resources Council for comprehensive planning, including the National Assessment, and as a standard geographical framework for more detailed water and related land-resources planning. The Accounting Units are those currently (1978) in use by the U.S. Geological Survey for managing the National Water Data Network. This map was revised to include a boundary realignment between Cataloging Units 10140103 and 10160009.
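The hierarchy named here (Regions aggregating Subregions, Accounting Units and Cataloging Units) is encoded in the 8-digit hydrologic unit codes themselves, two digits per level, so a code such as 10140103 can be decomposed mechanically; a minimal sketch:

```python
def parse_huc(code):
    """Split an 8-digit hydrologic unit code into its nested levels:
    Region (2 digits), Subregion (4), Accounting Unit (6),
    Cataloging Unit (8)."""
    assert len(code) == 8 and code.isdigit()
    return {
        "region": code[:2],
        "subregion": code[:4],
        "accounting_unit": code[:6],
        "cataloging_unit": code,
    }

# One of the realigned units from this map:
print(parse_huc("10140103"))
# {'region': '10', 'subregion': '1014', 'accounting_unit': '101401',
#  'cataloging_unit': '10140103'}
```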
Etalon (standard) for surface potential distribution produced by electric activity of the heart.
Szathmáry, V; Ruttkay-Nedecký, I
1981-01-01
The authors submit etalon (standard) equipotential maps as an aid in the evaluation of maps of surface potential distributions in living subjects. They were obtained by measuring potentials on the surface of an electrolytic tank shaped like the thorax. The individual etalon maps were determined in such a way that the parameters of the physical dipole forming the source of the electric field in the tank corresponded to the mean vectorcardiographic parameters measured in a healthy population sample. The technique also allows a quantitative estimate of the degree of non-dipolarity of the heart as the source of the electric field.
Mester, David; Ronin, Yefim; Schnable, Patrick; Aluru, Srinivas; Korol, Abraham
2015-01-01
Our aim was to develop a fast and accurate algorithm for constructing consensus genetic maps for chip-based SNP genotyping data with a high proportion of shared markers between mapping populations. Chip-based genotyping of SNP markers allows the production of high-density genetic maps with a relatively standardized set of marker loci for different mapping populations. The availability of a standard high-throughput mapping platform simplifies consensus analysis: unique markers can be ignored at the stage of consensus mapping, reducing the mathematical complexity of the problem and in turn allowing larger mapping data sets to be analyzed using global optimization criteria instead of local ones. Our three-phase analytical scheme includes automatic selection of ~100-300 of the most informative (resolvable by recombination) markers per linkage group, building a stable skeletal marker order for each data set and verifying it using jackknife re-sampling, and consensus mapping analysis based on a global optimization criterion. A novel Evolution Strategy optimization algorithm with a global optimization criterion presented in this paper is able to generate high quality, ultra-dense consensus maps with many thousands of markers per genome. This algorithm utilizes "potentially good orders" in the initial solution and in the new mutation procedures that generate trial solutions, enabling a consensus order to be obtained in reasonable time. The developed algorithm, tested on a wide range of simulated data and real-world data (Arabidopsis), outperformed two tested state-of-the-art algorithms in mapping accuracy and computation time. PMID:25867943
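The underlying ordering problem is TSP-like: find a marker order minimizing the total recombination distance between adjacent markers. A toy (1+1)-style evolution strategy with a segment-reversal mutation conveys the idea; the paper's actual algorithm is far more elaborate (population-based and seeded with "potentially good orders"):

```python
import random

def total_length(order, dist):
    # Objective: summed recombination distance between adjacent markers.
    return sum(dist[a][b] for a, b in zip(order, order[1:]))

def evolve_order(n_markers, dist, generations=20000, seed=0):
    """(1+1)-style evolution strategy over marker permutations.

    dist: 2-D array-like of pairwise recombination distances, indexed by
    marker number 0..n_markers-1.
    """
    rng = random.Random(seed)
    best = list(range(n_markers))
    rng.shuffle(best)
    best_cost = total_length(best, dist)
    for _ in range(generations):
        trial = best[:]
        i, j = sorted(rng.sample(range(n_markers), 2))
        trial[i:j + 1] = reversed(trial[i:j + 1])  # segment-reversal mutation
        cost = total_length(trial, dist)
        if cost < best_cost:  # keep the trial only if it improves the order
            best, best_cost = trial, cost
    return best, best_cost
```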
NASA Astrophysics Data System (ADS)
Jencks, J. H.; Cartwright, J.; Varner, J. D.
2016-12-01
Exploring, understanding, and managing the global oceans are a challenge when hydrographic maps are available for only 5% of the world's oceans. Seafloor mapping is expensive and most government and academic budgets continue to tighten. The first step for any mapping program, before setting out to map uncharted waters, should be to identify if data currently exist in the area of interest. There are many reasons why this seemingly simple suggestion is easier said than done. While certain datasets are accessible online (e.g., NOAA's NCEI, EMODnet, IHO-DCDB), many are not. In some cases, data that are publicly available are difficult to discover and access. No single agency can successfully resolve the complex and pressing demands of ocean and coastal mapping and the associated data stewardship. The National Oceanic and Atmospheric Administration (NOAA) is an active participant in numerous campaign mapping projects whose goals are to carry out coordinated and comprehensive ocean mapping efforts. One of these international programs is an outcome of the Galway Statement on Atlantic Ocean Cooperation signed by the European Union, Canada, and the United States in 2013. At NOAA's National Centers for Environmental Information (NCEI), resources are focused on ensuring the security and widespread availability of the Nation's scientific marine geophysical data through long-term stewardship. NCEI draws on a variety of software technologies and adheres to international standards to meet this challenge. The result is a geospatial framework built on spatially-enabled databases, standards-based web services, and International Standards Organization (ISO) metadata. Through the use of industry standards, the services are constructed such that they can be combined and re-used in a variety of contexts. For example, users may leverage the services in desktop analysis tools, web applications created by the hosting organizations (e.g. the North Atlantic Data Portal), or in custom applications they develop themselves. In order to maximize the return on campaign mapping investments, legacy and newly acquired data must be easily discoverable and readily accessible by numerous applications and formats now and well into the future. Working together, we can ensure that valuable data are made available to the broadest community.
Watanabe, Shota; Sakaguchi, Kenta; Hosono, Makoto; Ishii, Kazunari; Murakami, Takamichi; Ichikawa, Katsuhiro
The purpose of this study was to evaluate the effect of a hybrid-type iterative reconstruction method on Z-score mapping of hyperacute stroke in unenhanced computed tomography (CT) images. We used a hybrid-type iterative reconstruction [adaptive statistical iterative reconstruction (ASiR)] implemented in a CT system (Optima CT660 Pro advance, GE Healthcare). For 15 normal brain cases, we reconstructed CT images with filtered back projection (FBP) and with ASiR at a blending factor of 100% (ASiR100%). Two standardized normal brain datasets were created from the normal databases of FBP images (FBP-NDB) and ASiR100% images (ASiR-NDB), and standard deviation (SD) values in the basal ganglia were measured. Z-score mapping was performed for 12 hyperacute stroke cases using FBP-NDB and ASiR-NDB, and Z-score values in the hyperacute stroke area and normal area were compared between FBP-NDB and ASiR-NDB. By using ASiR-NDB, the SD value of the standardized brain was decreased by 16%. The Z-score value of ASiR-NDB in the hyperacute stroke area was significantly higher than that of FBP-NDB (p<0.05). Therefore, the use of images reconstructed with ASiR100% for Z-score mapping has the potential to improve the accuracy of Z-score mapping.
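Z-score mapping of this kind compares each voxel of an anatomically standardized patient scan against normal database statistics; a minimal sketch (the sign convention, with higher Z where attenuation is reduced as in early ischemia, is an assumption):

```python
import numpy as np

def z_score_map(patient, nd_mean, nd_sd):
    """Voxelwise Z-score of a patient CT volume against a normal database.

    All arrays share one shape after anatomic standardization; nd_mean and
    nd_sd come from the normal database (e.g., FBP-NDB or ASiR-NDB).
    Sign convention (assumed): higher Z where attenuation is reduced,
    as in early ischemic change. A smaller nd_sd, as reported for
    ASiR-NDB, directly amplifies the Z-scores of true lesions.
    """
    return (nd_mean - patient) / nd_sd
```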
Emadzadeh, Ehsan; Sarker, Abeed; Nikfarjam, Azadeh; Gonzalez, Graciela
2017-01-01
Social networks, such as Twitter, have become important sources for active monitoring of user-reported adverse drug reactions (ADRs). Automatic extraction of ADR information can be crucial for healthcare providers, drug manufacturers, and consumers. However, because of the non-standard nature of social media language, automatically extracted ADR mentions need to be mapped to standard forms before they can be used by operational pharmacovigilance systems. We propose a modular natural language processing pipeline for mapping (normalizing) colloquial mentions of ADRs to their corresponding standardized identifiers. We seek to accomplish this task and to enable customization of the pipeline, so that distinct unlabeled free-text resources can be incorporated and the system can be used for other normalization tasks. Our approach, which we call Hybrid Semantic Analysis (HSA), sequentially employs rule-based and semantic matching algorithms for mapping user-generated mentions to concept IDs in the Unified Medical Language System vocabulary. The semantic matching component of HSA is adaptive in nature and uses a regression model to combine various measures of semantic relatedness and resources to optimize normalization performance on the selected data source. On a publicly available corpus, our normalization method achieves 0.502 recall and 0.823 precision (F-measure: 0.624). Our proposed method outperforms a baseline based on latent semantic analysis and another that uses MetaMap.
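The sequential rule-based-then-semantic design can be sketched as a dictionary lookup with a string-similarity fallback. The lexicon, rules, and character n-gram similarity below are simplified stand-ins (HSA's semantic component is a learned combination of relatedness measures, not TF-IDF, and the CUIs shown are illustrative):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy lexicon of standard concepts; the UMLS CUIs here are illustrative.
lexicon = {"C0027497": "nausea", "C0018681": "headache", "C0015672": "fatigue"}
rules = {"cant sleep": "C0917801"}  # rule-based layer for known colloquialisms

vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
matrix = vectorizer.fit_transform(lexicon.values())
cuis = list(lexicon)

def normalize(mention):
    # 1) exact rule-based lookup; 2) fall back to character n-gram similarity.
    key = mention.lower().strip()
    if key in rules:
        return rules[key]
    sims = cosine_similarity(vectorizer.transform([key]), matrix)
    return cuis[sims.argmax()]

print(normalize("really bad head ache"))  # -> C0018681
```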
ERIC Educational Resources Information Center
Kansas State Univ., Manhattan. Dept. of Adult and Occupational Education.
This document contains recommended standards for quality vocational programs in agricultural/agribusiness education which are divided into (1) standards common to all programs, (2) standards specific to adult education in agriculture/agribusiness, and (3) standards specific to production agriculture, secondary. The sixty common standards are…
Schwab, David Emanuel; Lepski, Guilherme; Borchers, Christian; Trautmann, Katrin; Paulsen, Frank; Schittenhelm, Jens
2018-01-01
Immunohistochemistry is routinely used in the differential diagnosis of tumours of the central nervous system (CNS). The 2016 WHO revision now includes molecular data such as IDH mutation and 1p/19q codeletion, thus restructuring glioma classification. Direct comparative information between the commonly used immunohistochemical markers for glial tumours GFAP, MAP-2, NOGO-A, OLIG-2 and WT-1, concerning quality and quantity of expression and their relation to the new molecular markers, is lacking. We therefore compared the immunohistochemical staining results of all five antibodies in 34 oligodendrogliomas, 106 ependymomas and 423 astrocytic tumours. GFAP expression was reduced in cases with higher WHO grade, oligodendroglial differentiation and in IDH-wildtype diffuse astrocytomas. By contrast, MAP-2 expression was significantly increased in diffuse astrocytomas with IDH mutation, while NOGO-A expression was not associated with any molecular marker. WT-1 expression was significantly decreased in tumours with IDH mutation and ATRX loss. OLIG-2 was increased in IDH-mutant grade II astrocytomas and in cases with a higher proliferation rate. In univariate survival analysis, high WT-1 expression was significantly associated with worse outcome in diffuse astrocytic tumours (log rank p < 0.0001; n = 211; median time: 280 days vs 562 days). None of the markers was prognostic in multivariate survival analysis. Among the evaluated markers, MAP-2, OLIG-2 and WT-1 showed the best potential to separate between glioma entities and can be recommended for a standardized immunohistochemical panel. Copyright © 2017 Elsevier GmbH. All rights reserved.
Functional quantitative susceptibility mapping (fQSM).
Balla, Dávid Z; Sanchez-Panchuelo, Rosa M; Wharton, Samuel J; Hagberg, Gisela E; Scheffler, Klaus; Francis, Susan T; Bowtell, Richard
2014-10-15
Blood oxygenation level dependent (BOLD) functional magnetic resonance imaging (fMRI) is a powerful technique, typically based on the statistical analysis of the magnitude component of the complex time-series. Here, we additionally interrogated the phase data of the fMRI time-series and used quantitative susceptibility mapping (QSM) in order to investigate the potential of functional QSM (fQSM) relative to standard magnitude BOLD fMRI. High spatial resolution data (1 mm isotropic) were acquired every 3 seconds using zoomed multi-slice gradient-echo EPI collected at 7 T in single orientation (SO) and multiple orientation (MO) experiments, the latter involving 4 repetitions with the subject's head rotated relative to B0. Statistical parametric maps (SPM) were reconstructed for magnitude, phase and QSM time-series and each was subjected to detailed analysis. Several fQSM pipelines were evaluated and compared based on the relative number of voxels that were coincidentally found to be significant in QSM and magnitude SPMs (common voxels). We found that sensitivity and spatial reliability of fQSM relative to the magnitude data depended strongly on the arbitrary significance threshold defining "activated" voxels in SPMs, and on the efficiency of spatio-temporal filtering of the phase time-series. Sensitivity and spatial reliability depended slightly on whether MO or SO fQSM was performed and on the QSM calculation approach used for SO data. Our results present the potential of fQSM as a quantitative method of mapping BOLD changes. We also critically discuss the technical challenges and issues linked to this intriguing new technique. Copyright © 2014 Elsevier Inc. All rights reserved.
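The "common voxels" comparison the authors use can be expressed as a simple overlap statistic between thresholded statistical maps; a minimal sketch (the use of absolute values and the normalization by the magnitude-map count are assumptions):

```python
import numpy as np

def common_voxel_fraction(spm_mag, spm_qsm, threshold):
    """Fraction of magnitude-activated voxels also significant in fQSM.

    spm_mag, spm_qsm: statistical maps of equal shape (e.g., t-values);
    `threshold` is the arbitrary significance cutoff discussed above.
    """
    act_mag = np.abs(spm_mag) > threshold
    act_qsm = np.abs(spm_qsm) > threshold
    common = act_mag & act_qsm
    return common.sum() / max(act_mag.sum(), 1)  # avoid division by zero
```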
NASA Astrophysics Data System (ADS)
Shumchenia, Emily J.; Guarinello, Marisa L.; Carey, Drew A.; Lipsky, Andrew; Greene, Jennifer; Mayer, Larry; Nixon, Matthew E.; Weber, John
2015-06-01
Efforts are in motion globally to address coastal and marine management needs through spatial planning and concomitant seabed habitat mapping. Contrasting strategies are often evident in these processes among local, regional, national and international scientific approaches and policy needs. In answer to such contrasts among its member states, the United States Northeast Regional Ocean Council formed a Habitat Working Group to conduct a regional inventory and comparative evaluation of seabed characterization, classification, and modeling activities in New England. The goals of this effort were to advance regional understanding of ocean habitats and identify opportunities for collaboration. Working closely with the Habitat Working Group, we organized and led the inventory and comparative analysis with a focus on providing processes and tools that can be used by scientists and managers, updated and adapted for future use, and applied in other ocean management regions throughout the world. Visual schematics were a critical component of the comparative analysis and aided discussion among scientists and managers. Regional consensus was reached on a common habitat classification scheme (U.S. Coastal and Marine Ecological Classification Standard) for regional seabed maps. Results and schematics were presented at a region-wide workshop where further steps were taken to initiate collaboration among projects. The workshop culminated in an agreement on a set of future seabed mapping goals for the region. The work presented here may serve as an example to other ocean planning regions in the U.S., Europe or elsewhere seeking to integrate a variety of seabed characterization, classification and modeling activities.
Mapping of the chromosome 1p36 region surrounding the Charcot-Marie-Tooth disease type 2A locus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denton, P.; Gere, S.; Wolpert, C.
1994-09-01
Charcot-Marie-Tooth (CMT) disease is the most common inherited peripheral neuropathy. Although CMT2 is clinically indistinguishable from CMT1, the two forms can be differentiated by pathological and neurophysiological methods. We have established one locus, CMT2A, on chromosome 1p36, and have demonstrated genetic heterogeneity. This locus maps to the region of the deletions associated with neuroblastoma. We have now identified an additional 11 CMT2 families. Three families are linked to chromosome 1p36, while six families are excluded from this region. Another six families are currently under analysis and collection. To date the CMT2A families represent one third of the CMT2 families examined. We have established a microdissection library of the 1p36 region which is currently being characterized for microsatellite repeats and STSs using standard hybridization techniques and a modified degenerate primer method. In addition, new markers (D1S253, D1S450, D1S489, D1S503, GATA27E04, and GATA4H04) placed in this region are being mapped using critical recombinants in the CEPH reference pedigrees. Fluorescence in situ hybridization (FISH) has been used to confirm mapping. A YAC contig is being assembled from the CEPH megabase library using STSs to isolate key YACs, which are extended by vectorette end cloning and Alu-PCR. These findings suggest that the CMT2 phenotype is secondary to at least two different genes and demonstrate further heterogeneity in the CMT phenotype.
Schneider, Galen B; Cunningham-Ford, Marsha A; Johnsen, David C; Eckert, Mary Lynn; Mulder, Michael
2014-09-01
This project, utilizing a seldom-used approach to dental education, was designed to define the desired characteristics of a graduating dental student; convert those characteristics to educational outcomes; and use those outcomes to map a dental school's learning and assessment programs, based on outcomes rather than courses and disciplines. A detailed rubric of the outcomes expected of a graduating dental student from this school was developed, building on Commission on Dental Accreditation (CODA) standards and the school's competencies. The presence of each characteristic in the rubric was mapped within and across courses and disciplines. To assess implementation of the rubric, members of two faculty committees and all fourth-year students were asked to use it to rate 1) the importance of each characteristic, 2) the extent to which the school teaches and assesses each, and 3) the extent to which each counts toward overall assessment of competence. All thirty-three faculty members (100 percent) on the committees participated, as did forty-six of the fifty-five students (84 percent). The groups gave high scores to the importance of each characteristic, especially for knowledge and technical competence (then separate categories but merged in the final rubric) and for self-assessment, as well as the extent to which they are being taught and assessed. Respondents most commonly named critical thinking as the area that should be emphasized more. Mapping the curriculum and creating its related database allow the faculty and administration to more systematically coordinate learning and assessment than was possible with a course-based approach.
Gür Güngör, Sirel; Akman, Ahmet; Sarıgül Sezenöz, Almila; Tanrıaşıkı, Gülşah
2016-12-01
The presence of retinal nerve fiber layer (RNFL) split bundles was recently described in normal eyes scanned using scanning laser polarimetry and in histologic studies. Split bundles may resemble RNFL loss in healthy eyes. The aim of our study was to determine the prevalence of nerve fiber layer split bundles in healthy people. We imaged 718 eyes of 359 healthy persons with spectral domain optical coherence tomography in this cross-sectional study. All eyes had intraocular pressure of 21 mmHg or less, normal appearance of the optic nerve head, and normal visual fields (Humphrey Field Analyzer 24-2 full threshold program). In our study, a bundle was defined as 'split' when there was a localized defect not resembling a wedge defect in the RNFL deviation map, with a symmetrically divided RNFL appearance on the RNFL thickness map. The classification was performed by two independent observers who used an identical set of reference examples to standardize the classification. Inter-observer consensus was reached in all cases. Bilateral superior split bundles were seen in 19 cases (5.29%) and a unilateral superior split was observed in 15 cases (4.16%). In 325 cases (90.52%) there was no split bundle. Split nerve fiber layer bundles, in contrast to single nerve fiber layer bundles, are not a common finding in healthy eyes. In eyes with normal optic disc appearance, especially when a superior RNFL defect is observed in the RNFL deviation map, the RNFL thickness map and graphs should also be examined for split nerve fiber layer bundles.
Wollborn, Jakob; Ruetten, Eva; Schlueter, Bjoern; Haberstroh, Joerg; Goebel, Ulrich; Schick, Martin A
2018-01-22
Standardized modeling of cardiac arrest and cardiopulmonary resuscitation (CPR) is crucial to evaluate new treatment options. Experimental porcine models are ideal, closely mimicking human-like physiology. However, the anteroposterior chest diameter differs significantly, being larger in pigs, and thus poses a challenge to achieving the adequate perfusion pressures, and consequently hemodynamics, that are commonly achieved during human resuscitation. The aim was to prove that standardized resuscitation is feasible and renders adequate hemodynamics and perfusion in pigs, using a specifically designed resuscitation board for a pneumatic chest compression device. A "porcine-fit" resuscitation board was designed for our experiments to optimally use a pneumatic compression device (LUCAS® II, Physio-Control Inc.), which is widely employed in emergency medicine and ideal in an experimental setting due to its high standardization. Asphyxial cardiac arrest was induced in 10 German hybrid landrace pigs, and cardiopulmonary resuscitation was performed according to the ERC/AHA 2015 guidelines with mechanical chest compressions. Hemodynamics were measured in the carotid and pulmonary arteries. Furthermore, arterial blood gas was drawn to assess oxygenation and tissue perfusion. The custom-designed resuscitation board in combination with the LUCAS® device demonstrated highly sufficient performance regarding hemodynamics during CPR (mean arterial blood pressure, MAP, 46 ± 1 mmHg and mean pulmonary artery pressure, mPAP, 36 ± 1 mmHg over the course of CPR). MAP returned to baseline values 2 h after ROSC (80 ± 4 mmHg), requiring moderate doses of vasopressors. Furthermore, stroke volume and contractility were analyzed using pulse contour analysis (106 ± 3 ml and 1097 ± 22 mmHg/s during CPR). Blood gas analysis revealed CPR-typical changes, normalizing in due course. Thermodilution parameters did not show a persistent intravascular volume shift. Standardized cardiopulmonary resuscitation is thus feasible in a porcine model, achieving adequate hemodynamics and consequent tissue perfusion of consistent quality. Copyright © 2018 Elsevier Inc. All rights reserved.
NaviCell Web Service for network-based data visualization.
Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei
2015-07-01
Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
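The server mode described above means visualization tasks can be scripted over plain HTTP. The endpoint and command names below are hypothetical, intended only to illustrate the RESTful automation pattern; the real command vocabulary is defined in the NaviCell Web Service documentation and its Python/R bindings:

```python
import requests

# Hypothetical endpoint and command names -- consult the NaviCell Web
# Service documentation for the actual RESTful API.
SERVER = "https://example.org/navicell/api"

response = requests.post(SERVER, data={
    "session_id": "demo",          # assumed session handle
    "module": "datatable",
    "action": "import",
    "name": "transcriptome",
})
response.raise_for_status()
print(response.text)
```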
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-22
Contour maps; ownership reports and related materials; portions of the Equal Employment Opportunity file held immediately following the shortened license term. See 47 CFR 73.3526(e)(2), 73.3527(e)(2).
Code of Federal Regulations, 2010 CFR
2010-07-01
SAFETY STANDARDS, SURFACE COAL MINES AND SURFACE WORK AREAS OF UNDERGROUND COAL MINES, Maps, § 77.1200 Mine maps: ...the elevation of any body of water dammed or held back in any portion of the mine: Provided, however, such bodies of water may be shown on overlays or tracings attached to the mine maps; (g) all prospect drill...
Copyright | USDA Plant Hardiness Zone Map
Copyright: Map graphics. As a U.S. Government publication, the USDA Plant Hardiness Zone Map itself is not protected by copyright. Under a Specific Cooperative Agreement, Oregon State University agreed to supply the U.S. Government with unenhanced (standard resolution) GIS data in grid and shapefile formats. U.S. Government users may use these...
Connecticut Music Trace Map for Grades 10 and 12. Revised.
ERIC Educational Resources Information Center
Connecticut State Board of Education, Hartford.
The Connecticut Curriculum Trace Maps for music are designed to help curriculum developers and teachers translate Connecticut's K-12 performance standards into objectives and classroom practice. The Trace Maps provide specific descriptions of what students should know and be able to do at smaller grade level clusters. The elements in the Trace…
Connecticut Music Trace Map for Grades 2 and 4. Revised.
ERIC Educational Resources Information Center
Connecticut State Board of Education, Hartford.
These Connecticut Curriculum Trace Maps for music are designed to help curriculum developers and teachers translate Connecticut's K-12 performance standards into objectives and classroom practice. The music Trace Maps provide specific descriptions of what students should know and be able to do at smaller grade level clusters. Connecticut's Trace…
Network geometry inference using common neighbors
NASA Astrophysics Data System (ADS)
Papadopoulos, Fragkiskos; Aldecoa, Rodrigo; Krioukov, Dmitri
2015-08-01
We introduce and explore a method for inferring hidden geometric coordinates of nodes in complex networks based on the number of common neighbors between the nodes. We compare this approach to the HyperMap method, which is based only on the connections (and disconnections) between the nodes, i.e., on the links that the nodes have (or do not have). We find that for high degree nodes, the common-neighbors approach yields a more accurate inference than the link-based method, unless heuristic periodic adjustments (or "correction steps") are used in the latter. The common-neighbors approach is computationally intensive, requiring O(t⁴) running time to map a network of t nodes, versus O(t³) in the link-based method. But we also develop a hybrid method with O(t³) running time, which combines the common-neighbors and link-based approaches, and we explore a heuristic that reduces its running time further to O(t²), without significant reduction in the mapping accuracy. We apply this method to the autonomous systems (ASs) Internet, and we reveal how soft communities of ASs evolve over time in the similarity space. We further demonstrate the method's predictive power by forecasting future links between ASs. Taken altogether, our results advance our understanding of how to efficiently and accurately map real networks to their latent geometric spaces, which is an important necessary step toward understanding the laws that govern the dynamics of nodes in these spaces, and the fine-grained dynamics of network connections.
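The basic ingredient of the approach, counting common neighbors for every node pair, reduces to a matrix product; a minimal sketch (how those counts feed the likelihood-based embedding is the paper's contribution and is not reproduced here):

```python
import numpy as np

def common_neighbor_counts(adj):
    """Common-neighbor counts for all node pairs of a simple graph.

    adj: symmetric 0/1 adjacency matrix. (adj @ adj)[i, j] counts walks of
    length two from i to j, i.e., the neighbors that i and j share.
    """
    cn = adj @ adj
    np.fill_diagonal(cn, 0)  # a node's two-step walks to itself are not pairs
    return cn

# Pairs with many common neighbors are inferred to lie close together in
# the hidden similarity space. Note the dense matrix product itself costs
# O(t^3); the O(t^4) figure above refers to the full inference procedure.
```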
Jiang, Guoqian; Kiefer, Richard; Prud'hommeaux, Eric; Solbrig, Harold R
2017-01-01
The OHDSI Common Data Model (CDM) is a deep information model, in which the vocabulary component plays a critical role in enabling consistent coding and querying of clinical data. The objective of this study is to create methods and tools to expose the OHDSI vocabularies and mappings as vocabulary mapping services using two HL7 FHIR core terminology resources, ConceptMap and ValueSet. We discuss the benefits and challenges in building the FHIR-based terminology services.
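For orientation, a FHIR ConceptMap resource carries code-to-code mappings as groups of elements with targets; the fragment below is a minimal STU3-style instance with illustrative content, not an artifact from the study:

```python
# A minimal FHIR (STU3-style) ConceptMap resource with illustrative content:
# one source code mapped to one standard concept. A service like the one
# described would serve many such entries built from the OHDSI vocabularies.
concept_map = {
    "resourceType": "ConceptMap",
    "status": "draft",
    "group": [{
        "source": "http://hl7.org/fhir/sid/icd-9-cm",
        "target": "http://snomed.info/sct",
        "element": [{
            "code": "428.0",  # illustrative source code
            "target": [{"code": "84114007", "equivalence": "equivalent"}],
        }],
    }],
}
```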
NASA Technical Reports Server (NTRS)
Morrison, R. B. (Principal Investigator); Hallberg, G. R.
1973-01-01
The author has identified the following significant results. The main landform associations and larger landforms are readily identifiable on the better images and commonly the gross associations of surficial Quaternary deposits also can be determined primarily by information on landforms and soils (obtained by analysis of stream dissection and drainage and stream-divide patterns, land use patterns, etc.). Maps showing the Quaternary geologic-terrain units that can be distinguished on the ERTS-1 images are being prepared for study areas in Illinois, Iowa, Missouri, Kansas, Nebraska, and South Dakota. Preliminary maps of 1:1,000,000 scale are included for three of the study areas: the Grand Island and Fremont, Nebraska, and the Davenport, Iowa-Illinois, 1 deg x 2 deg quadrangles. These maps exemplify the first phase of investigations, which consists of identifying and mapping landform and land use characteristics and geologic-surficial materials directly from the ERTS-1 images alone, with no additional information. These maps show that commonly the boundaries of geologic-terrain units can be delineated more accurately on ERTS-1 images than on topographic maps at 1:250,000 scale.
Using Digital Video Production to Meet the Common Core Standards
ERIC Educational Resources Information Center
Nichols, Maura
2012-01-01
The implementation of the Common Core Standards has just begun and these standards will impact a generation that communicates with technology more than anything else. Texting, cell phones, Facebook, YouTube, Skype, etc. are the ways they speak with their friends and the world. The Common Core Standards recognize this. According to the Common Core…
ERIC Educational Resources Information Center
DeMeo, Stephen
2007-01-01
Common examples of graphic organizers include flow diagrams, concept maps, and decision trees. The author has created a novel type of graphic organizer called a decision map. A decision map is a directional heuristic that helps learners solve problems within a generic framework. It incorporates questions that the user must answer and contains…
NASA Technical Reports Server (NTRS)
Morrison, R. B. (Principal Investigator); Hallberg, G. R.
1973-01-01
The author has identified the following significant results. The main landform associations and larger landforms are readily identifiable on the better images, and commonly the gross associations of surficial Quaternary deposits can also be differentiated, primarily by information on landforms and soils. Maps showing the Quaternary geologic-terrain units that can be differentiated from the ERTS-1 images are being prepared for study areas in Illinois, Iowa, Missouri, Kansas, Nebraska, and South Dakota. Preliminary maps at 1:1 million scale are given for two of the study areas, the Peoria and Decatur, Illinois, 1 deg x 2 deg quadrangles. These maps exemplify the first phase of investigations, which consists of identifying and mapping landform and land use characteristics and geologic-surficial materials directly from ERTS-1 images alone, without input of additional data. These maps show that commonly the boundaries of geologic-terrain units can be identified more accurately on ERTS-1 images than on topographic maps of 1:250,000 scale. From analysis of drainage patterns, stream-divide relations, and tone and textural variations on the ERTS-1 images, the trends of numerous moraines of Wisconsinan and possibly some of Illinoian age were mapped. In the Peoria study area the trend of a buried valley of the Mississippi River is revealed.
An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns
NASA Astrophysics Data System (ADS)
Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.
2017-12-01
The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common for different instruments/techniques to measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that distinguishes them across various data sets. The lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments. This was also identified by data users as one of the top issues in data use. One effective approach to mitigating this problem is to enforce variable name standardization, which can effectively map the unique PI variable names to fixed standard names. In order to ensure consistency among the standard names, it is necessary to choose them from a controlled list. However, no such list currently exists, despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback and create a list of standard names acceptable to data providers and data users. This presentation will discuss the challenges and recommendations of standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest and improve interoperability and discoverability for a broader user community.
Classification criteria and probability risk maps: limitations and perspectives.
Saisana, Michaela; Dubois, Gregoire; Chaloulakou, Archontoula; Spyrellis, Nikolas
2004-03-01
Delineation of polluted zones with respect to regulatory standards, accounting at the same time for the uncertainty of the estimated concentrations, relies on classification criteria that can lead to significantly different pollution risk maps, which, in turn, can depend on the regulatory standard itself. This paper reviews four popular classification criteria related to the violation of a probability threshold or a physical threshold, using annual (1996-2000) nitrogen dioxide concentrations from 40 air monitoring stations in Milan. The relative advantages and practical limitations of each criterion are discussed, and it is shown that some of the criteria are more appropriate for the problem at hand and that the choice of the criterion can be supported by the statistical distribution of the data and/or the regulatory standard. Finally, the polluted area is estimated over the different years and concentration thresholds using the appropriate risk maps as an additional source of uncertainty.
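The core contrast among such criteria, comparing the estimated value itself against the standard versus comparing the probability of exceedance against a probability cutoff, can be sketched as follows; the Gaussian error assumption and the default 0.5 cutoff are illustrative choices, not the paper's prescriptions:

```python
import numpy as np
from scipy.stats import norm

def classify(estimate, est_sd, standard, criterion, p_cut=0.5):
    """Flag locations as polluted under two of the common criteria.

    estimate, est_sd: interpolated concentration and its standard error;
    standard: the regulatory threshold. The 'estimate' criterion compares
    the value itself; the 'probability' criterion compares the chance that
    the true value exceeds the standard, assuming Gaussian estimation
    error, so the two maps can disagree near the threshold.
    """
    estimate, est_sd = np.asarray(estimate), np.asarray(est_sd)
    if criterion == "estimate":
        return estimate > standard
    p_exceed = 1.0 - norm.cdf(standard, loc=estimate, scale=est_sd)
    return p_exceed > p_cut
```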
Constellation labeling optimization for bit-interleaved coded APSK
NASA Astrophysics Data System (ADS)
Xiang, Xingyu; Mo, Zijian; Wang, Zhonghai; Pham, Khanh; Blasch, Erik; Chen, Genshe
2016-05-01
This paper investigates constellation and mapping optimization for amplitude phase shift keying (APSK) modulation, which is deployed in the Digital Video Broadcasting - Satellite - Second Generation (DVB-S2) and Digital Video Broadcasting - Satellite services to Handhelds (DVB-SH) broadcasting standards due to its merits of power and spectral efficiency together with robustness against nonlinear distortion. The mapping optimization is performed for 32-APSK according to combined cost functions related to Euclidean distance and mutual information. A binary switching algorithm and its modified version are used to minimize the cost function and the estimated error between the original and received data. The optimized constellation mapping is tested by combining it with DVB-S2 standard Low-Density Parity-Check (LDPC) codes in both Bit-Interleaved Coded Modulation (BICM) and BICM with iterative decoding (BICM-ID) systems. The simulation results validate the proposed constellation labeling optimization scheme, which yields better performance than the conventional 32-APSK constellation mapping defined in the DVB-S2 standard.
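A binary switching algorithm of the kind named here greedily swaps pairs of labels while any swap lowers a labeling cost. The sketch below uses a simplified pairwise-error proxy for the cost (the exponential weighting stands in for the exact pairwise-error probability the paper's combined cost functions would use) and an 8-PSK demo constellation for speed:

```python
import numpy as np
from itertools import combinations

def labeling_cost(points, labels, snr=10.0):
    """Proxy for bit-error cost of a labeling: Hamming distance between the
    labels of each point pair, weighted by how easily the pair is confused
    (an exponential stand-in for the exact pairwise-error probability)."""
    cost = 0.0
    for i, j in combinations(range(len(points)), 2):
        d2 = abs(points[i] - points[j]) ** 2
        hamming = bin(labels[i] ^ labels[j]).count("1")
        cost += hamming * np.exp(-snr * d2 / 4.0)
    return cost

def binary_switching(points, labels):
    """Greedy pairwise label swapping until no swap lowers the cost."""
    labels = list(labels)
    best = labeling_cost(points, labels)
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(labels)), 2):
            labels[i], labels[j] = labels[j], labels[i]
            trial = labeling_cost(points, labels)
            if trial < best:
                best, improved = trial, True
            else:
                labels[i], labels[j] = labels[j], labels[i]  # undo the swap
    return labels, best

# Demo on 8-PSK rather than 32-APSK to keep the run fast:
points = np.exp(2j * np.pi * np.arange(8) / 8)
labels, cost = binary_switching(points, list(range(8)))
```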
NASA Astrophysics Data System (ADS)
Yashin, A. A.
1985-04-01
Based on Maxwell's electromagnetic-thermal similarity principle, a semiconductor or hybrid structure can be mapped into a calculable two-dimensional region by the Schwarz-Christoffel transformation, giving a universal algorithm for the engineering design of integrated-circuit elements. The design procedure involves conformal mapping of the original region into a polygon and then of the latter into a rectangle with a uniform field distribution, where conductances and capacitances are calculated using tabulated standard mapping functions. Subsequent synthesis of a device requires the inverse conformal mapping. Devices adaptable as integrated-circuit elements include high-resistance film resistors with periodic serration, distributed-resistance film attenuators with high transformation ratio, coplanar microstrip lines, bipolar transistors, directional couplers with distributed coupling to microstrip lines for microwave bulk devices, and quasiregular smooth matching transitions from asymmetric to coplanar microstrip lines.
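For reference, the Schwarz-Christoffel transformation mapping the upper half-plane onto a polygon with interior angles α_k π at prevertices z_k on the real axis takes the standard form (A and C are constants fixing the polygon's position, scale and orientation):

```latex
f(z) \;=\; A + C \int^{z} \prod_{k=1}^{n} \left(\zeta - z_k\right)^{\alpha_k - 1} \, d\zeta
```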
National Water Quality Standards Database (NWQSD)
The National Water Quality Standards Database (WQSDB) provides access to EPA and state water quality standards (WQS) information in text, tables, and maps. This data source was last updated in December 2007 and will no longer be updated.
NASA Astrophysics Data System (ADS)
Hatzopoulos, N.; Kim, S. H.; Kafatos, M.; Nghiem, S. V.; Myoung, B.
2016-12-01
Live Fuel Moisture is a dryness measure used by fire departments to assess how dry the fuels in forest areas currently are. To map Live Fuel Moisture, we conducted an analysis using a standardized regression approach on various vegetation indices derived from MODIS remote sensing data. After analyzing the results, we chose to map Live Fuel Moisture using a standardized NDVI product. From the mapped remotely sensed product, we observed that the appearance of extremely dry fuels is highly correlated with very dry years, as measured by total yearly precipitation. The appearance of extremely dry mapped fuels tends to be directly associated with fire events and was observed to be a post-fire indicator. In addition, we studied the appearance of extremely dry fuels during the critical months of the spring-to-summer transition, as well as their relation to fire events.
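A minimal sketch of the "standardized NDVI" idea described above, i.e., pixel-wise z-scores of NDVI relative to its multi-year record; the array shapes and names are illustrative, not the authors' processing chain.

    import numpy as np

    def standardized_ndvi(ndvi_stack):
        """Pixel-wise z-score of NDVI against its multi-year record.

        ndvi_stack : array of shape (years, rows, cols) of composited
                     NDVI for the same calendar period in each year.
        Strongly negative anomalies flag anomalously dry (low-greenness) fuels.
        """
        mean = ndvi_stack.mean(axis=0)
        std = ndvi_stack.std(axis=0, ddof=1)
        return (ndvi_stack - mean) / np.where(std > 0, std, np.nan)

    stack = np.random.default_rng(0).uniform(0.1, 0.8, size=(10, 4, 4))
    print(standardized_ndvi(stack)[-1])  # anomaly map for the latest year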
Research on Integrated Mapping——A Case Study of Integrated Land Use with Swamp Mapping
NASA Astrophysics Data System (ADS)
Zhang, S.; Yan, F.; Chang, L.
2015-12-01
The unified real estate registration system shows the attention, determination, and effort of the CPC Central Committee and State Council regarding real estate registration in China. Under the current situation, however, China's real estate registration work has made limited progress. One reason is that it is hard to express the property rights of real estate on one map under the multi-sector management system. Under the current multi-sector management system in China, different departments usually survey and map only the land types under their own jurisdiction. For example, a wetland investigation maps all kinds of wetland resources but no other resource types. As a result, integrating the different results from different departments produces problems of overlap or omission. As resources of the earth's surface, the total area of forest, grassland, wetland, and so on should equal the total surface area; under the current system, however, the summed area of all kinds of resources does not. It is therefore of great importance to express all resources on one map. On one hand, this helps determine the real area and distribution of resources and avoids the problems of overlap or omission during integration; on the other hand, it is helpful for studying the dynamic changes of different resources. We therefore first propose "integrated mapping" as a solution, and take integrated land use with swamp mapping in Northeast China as an example to investigate its feasibility and difficulties. The study showed that integrated land use with swamp mapping can be achieved by combining land use survey standards with swamp survey standards in a "second mapping" program. Based on the experience of integrated land use with swamp mapping, we point out its value as a reference for integrated mapping and for the unified real estate registration system. We conclude that: (1) comprehending and integrating the different survey standards of different resources is the premise of "integrated mapping"; (2) we put forward a "multiple code" and "multiple interpretation" scheme to solve the problem of "attribute overlap"; and (3) the area of "attribute overlap" can be segmented by a certain ratio to determine property rights in the unified real estate registration system.
Reinprecht, Yarmilla; Yadegari, Zeinab; Perry, Gregory E.; Siddiqua, Mahbuba; Wright, Lori C.; McClean, Phillip E.; Pauls, K. Peter
2013-01-01
Legumes contain a variety of phytochemicals derived from the phenylpropanoid pathway that have important effects on human health as well as seed coat color, plant disease resistance and nodulation. However, the information about the genes involved in this important pathway is fragmentary in common bean (Phaseolus vulgaris L.). The objectives of this research were to isolate genes that function in and control the phenylpropanoid pathway in common bean, determine their genomic locations in silico in common bean and soybean, and analyze sequences of the 4CL gene family in two common bean genotypes. Sequences of phenylpropanoid pathway genes available for common bean or other plant species were aligned, and the conserved regions were used to design sequence-specific primers. The PCR products were cloned and sequenced and the gene sequences along with common bean gene-based (g) markers were BLASTed against the Glycine max v.1.0 genome and the P. vulgaris v.1.0 (Andean) early release genome. In addition, gene sequences were BLASTed against the OAC Rex (Mesoamerican) genome sequence assembly. In total, fragments of 46 structural and regulatory phenylpropanoid pathway genes were characterized in this way and placed in silico on common bean and soybean sequence maps. The maps contain over 250 common bean g and SSR (simple sequence repeat) markers and identify the positions of more than 60 additional phenylpropanoid pathway gene sequences, plus the putative locations of seed coat color genes. The majority of cloned phenylpropanoid pathway gene sequences were mapped to one location in the common bean genome but had two positions in soybean. The comparison of the genomic maps confirmed previous studies, which show that common bean and soybean share genomic regions, including those containing phenylpropanoid pathway gene sequences, with conserved synteny. Indels identified in the comparison of Andean and Mesoamerican common bean 4CL gene sequences might be used to develop inter-pool phenylpropanoid pathway gene-based markers. We anticipate that the information obtained by this study will simplify and accelerate selections of common bean with specific phenylpropanoid pathway alleles to increase the contents of beneficial phenylpropanoids in common bean and other legumes. PMID:24046770
HIV antiretroviral medication stock-outs in Ghana: contributors and consequences.
Poku, Rebecca A; Owusu, Adobea Yaa; Mullen, Patricia Dolan; Markham, Christine; McCurdy, Sheryl A
2017-09-01
Drug stock-outs are an unfortunate yet common reality for patients living in low and middle income countries, particularly in sub-Saharan Africa where trouble with consistent stock of antiretroviral medications (ARVs) continues. Our study takes a snapshot of this problem in Ghana. Although the country launched its antiretroviral therapy (ART) programme in 2003, progress toward realising the full benefit of ART for treated individuals has been limited, in part, because of stock-outs. In Ghana's Greater Accra region, we conducted semi-structured interviews with 40 women living with HIV (WLHIV) and 15 individuals with a history of HIV-related work in government or non-governmental organisations, or healthcare facilities. We used repeated review with coding and mapping techniques to analyse the transcripts and identify common themes. Stock-outs of ARVs result in inconsistent administration of therapy, increased indirect medical costs for WLHIV, and negative labelling of patients. Inefficiencies in drug supply, poor coordination with port authorities, inadequate government funding and dependence on international aid contribute to the stock-outs experienced in Ghana. Although using ARVs produced in-country could reduce supply problems, the domestically-manufactured product currently does not meet World Health Organization (WHO) standards. We recommend focused efforts to produce WHO standard ARVs in Ghana, and a review of current supply chain management to identify and mend pitfalls in the system.
High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.
Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue
2010-11-13
Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale, data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it by mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, with this pharmacovigilance case as one specific example. The results demonstrate that the MapReduce programming model can improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computing environment at approximately linear speedup rates.
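The PRR itself comes from a 2x2 contingency table per (drug, event) pair: PRR = [a/(a+b)] / [c/(c+d)], where a counts reports containing both the drug and the event, b reports with the drug but not the event, c reports with the event but other drugs, and d the remainder. Below is a toy map/reduce-style sketch, with plain Python standing in for a MapReduce framework; the report data and field layout are invented for illustration.

    from collections import Counter
    from itertools import chain

    # Toy spontaneous reports: (drugs in report, events in report)
    reports = [
        ({"drugA"}, {"nausea"}),
        ({"drugA"}, {"nausea"}),
        ({"drugA"}, {"rash"}),
        ({"drugB"}, {"nausea"}),
        ({"drugB"}, {"headache"}),
    ]

    def mapper(report):
        """Emit one key per (drug, event) co-occurrence in a report."""
        drugs, events = report
        return [((d, e), 1) for d in drugs for e in events]

    # 'Shuffle and reduce': sum counts per (drug, event) key.
    pair_counts = Counter(k for k, _ in chain.from_iterable(map(mapper, reports)))
    drug_counts = Counter(d for drugs, _ in reports for d in drugs)
    event_counts = Counter(e for _, events in reports for e in events)
    n = len(reports)

    def prr(drug, event):
        a = pair_counts[(drug, event)]   # drug and event
        b = drug_counts[drug] - a        # drug, no event
        c = event_counts[event] - a      # event, other drugs
        d = n - drug_counts[drug] - c    # neither
        return (a / (a + b)) / (c / (c + d))

    print(round(prr("drugA", "nausea"), 2))  # 1.33 on the toy data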
Ensemble Learning of QTL Models Improves Prediction of Complex Traits
Bian, Yang; Holland, James B.
2015-01-01
Quantitative trait locus (QTL) models can provide useful insights into trait genetic architecture because of their straightforward interpretability but are less useful for genetic prediction because of the difficulty in including the effects of numerous small effect loci without overfitting. Tight linkage between markers introduces near collinearity among marker genotypes, complicating the detection of QTL and estimation of QTL effects in linkage mapping, and this problem is exacerbated by very high density linkage maps. Here we developed a thinning and aggregating (TAGGING) method as a new ensemble learning approach to QTL mapping. TAGGING reduces collinearity problems by thinning dense linkage maps, maintains aspects of marker selection that characterize standard QTL mapping, and by ensembling, incorporates information from many more markers-trait associations than traditional QTL mapping. The objective of TAGGING was to improve prediction power compared with QTL mapping while also providing more specific insights into genetic architecture than genome-wide prediction models. TAGGING was compared with standard QTL mapping using cross validation of empirical data from the maize (Zea mays L.) nested association mapping population. TAGGING-assisted QTL mapping substantially improved prediction ability for both biparental and multifamily populations by reducing both the variance and bias in prediction. Furthermore, an ensemble model combining predictions from TAGGING-assisted QTL and infinitesimal models improved prediction abilities over the component models, indicating some complementarity between model assumptions and suggesting that some trait genetic architectures involve a mixture of a few major QTL and polygenic effects. PMID:26276383
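A schematic of the thinning-and-aggregating idea only, not the authors' implementation: each ensemble member fits a sparse marker model on a thinned map (staggered offsets), and predictions are averaged across the ensemble. The Lasso here is a stand-in for the QTL-mapping step, and the simulated markers are independent, so they lack the linkage correlation that motivates thinning in real data.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)
    n, m = 200, 500                         # lines x markers (dense map)
    X = rng.integers(0, 2, size=(n, m)).astype(float)
    y = X[:, 10] - 0.5 * X[:, 200] + rng.normal(0, 1, n)

    def tagging_predict(X_train, y_train, X_test, n_models=10, keep_every=10):
        """Thin the marker map (every k-th marker, staggered offsets),
        fit a sparse model per thinning, and average predictions."""
        preds = []
        for offset in range(n_models):
            cols = np.arange(offset, X_train.shape[1], keep_every)
            model = Lasso(alpha=0.05).fit(X_train[:, cols], y_train)
            preds.append(model.predict(X_test[:, cols]))
        return np.mean(preds, axis=0)

    yhat = tagging_predict(X[:150], y[:150], X[150:])
    print(np.corrcoef(yhat, y[150:])[0, 1])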
An interactive method for digitizing zone maps
NASA Technical Reports Server (NTRS)
Giddings, L. E.; Thompson, E. J.
1975-01-01
A method is presented for digitizing maps that consist of zones, such as contour or climatic zone maps. A color-coded map is prepared by any convenient process. The map is then read into memory of an Image 100 computer by means of its table scanner, using colored filters. Zones are separated and stored in themes, using standard classification procedures. Thematic data are written on magnetic tape and these data, appropriately coded, are combined to make a digitized image on tape. Step-by-step procedures are given for digitization of crop moisture index maps with this procedure. In addition, a complete example of the digitization of a climatic zone map is given.
1996-01-01
Many people want maps that show an area of the United States as it existed many years ago. These are called historical maps, and there are two types. The most common type consists of special maps prepared by commercial firms to show such historical features as battlefields, military routes, or the paths taken by famous travelers. Typically, these maps are for sale to tourists at the sites of historical events. The other type is the truly old map--one compiled by a surveyor or cartographer many years ago. Lewis and Clark, for example, made maps of their journeys into the Northwest Territories in 1803-6, and originals of some of these maps still exist.
Jin, S B; Zhang, X F; Lu, J G; Fu, H T; Jia, Z Y; Sun, X W
2015-04-17
A group of 107 F1 hybrid common carp was used to construct a linkage map using JoinMap 4.0. A total of 4877 microsatellite and single nucleotide polymorphism (SNP) markers isolated from a genomic library (978 microsatellite and 3899 SNP markers) were assigned to construct the genetic map, which comprised 50 linkage groups. The total length of the linkage map for the common carp was 4775.90 cM with an average distance between markers of 0.98 cM. Ten quantitative trait loci (QTL) were associated with eye diameter, corresponding to 10.5-57.2% of the total phenotypic variation. Twenty QTL were related to eye cross, contributing to 10.8-36.9% of the total phenotypic variation. Two QTL for eye diameter and four QTL for eye cross each accounted for more than 20% of the total phenotypic variation and were considered to be major QTL. One growth factor related to eye diameter was observed on LG10 of the common carp genome, and three growth factors related to eye cross were observed on LG10, LG35, and LG44 of the common carp genome. The significant positive relationship of eye cross and eye diameter with other commercial traits suggests that eye diameter and eye cross can be used to assist in indirect selection for many commercial traits, particularly body weight. Thus, the growth factor for eye cross may also contribute to the growth of body weight, implying that aggregate breeding could have multiple effects. These findings provide information for future genetic studies and breeding of common carp.
Digital floodplain mapping and an analysis of errors involved
Hamblen, C.S.; Soong, D.T.; Cai, X.
2007-01-01
Mapping floodplain boundaries using geographical information systems (GIS) and digital elevation models (DEMs) was completed in a recent study. However convenient this method may appear at first, the resulting maps can have unaccounted errors. Mapping the floodplain using GIS is faster than mapping manually, and digital mapping is expected to become more common in the future. When mapping is done manually, the experience and judgment of the engineer or geographer completing the mapping and the contour resolution of the surface topography are critical in determining the floodplain and floodway boundaries between cross sections. When mapping is done digitally, discrepancies can result from the computing algorithm and the digital topographic datasets used. Understanding the possible sources of error and how the error accumulates through these processes is necessary for the validation of automated digital mapping. This study evaluates the procedure of floodplain mapping using GIS and a 3 m by 3 m resolution DEM, with a focus on the errors accumulated in the process. Within the GIS environment of this mapping method, the procedural steps of most interest include: (1) the accurate spatial representation of the stream centerline and cross sections, (2) properly using a triangulated irregular network (TIN) model for the flood elevations of the studied cross sections, the interpolated elevations between them, and the extrapolated flood elevations beyond the cross sections, and (3) the comparison of the flood elevation TIN with the ground elevation DEM, from which the appropriate inundation boundaries are delineated. The study area is of relatively low topographic relief, making it representative of common suburban development and a prime setting for the need for accurately mapped floodplains. This paper emphasizes the impacts of integrating supplemental digital terrain data between cross sections on floodplain delineation. © 2007 ASCE.
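A minimal sketch of step (3) above, comparing a flood water-surface elevation grid against the ground DEM; it assumes the TIN of flood elevations has already been interpolated onto the same grid as the DEM, and the arrays are illustrative.

    import numpy as np

    def delineate_floodplain(flood_elev, ground_dem):
        """Cells are inundated where the interpolated flood water-surface
        elevation exceeds the ground elevation (both on the same grid)."""
        depth = flood_elev - ground_dem
        inundated = depth > 0
        return inundated, np.where(inundated, depth, 0.0)

    dem = np.array([[10.0, 10.5], [11.2, 12.0]])
    wse = np.full_like(dem, 11.0)   # flat water surface for the toy case
    mask, depth = delineate_floodplain(wse, dem)
    print(mask)
    print(depth)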
Communications among elements of a space construction ensemble
NASA Technical Reports Server (NTRS)
Davis, Randal L.; Grasso, Christopher A.
1989-01-01
Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.
30 CFR 75.1204-1 - Places to give notice and file maps.
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 30 Mineral Resources (2014-07-01), MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR, COAL MINE SAFETY AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES, Maps, § 75.1204-1 Places to give notice and file maps.
ERIC Educational Resources Information Center
Wisconsin Department of Public Instruction, 2011
2011-01-01
Wisconsin's adoption of the Common Core State Standards provides an excellent opportunity for Wisconsin school districts and communities to define expectations from birth through preparation for college and work. By aligning the existing Wisconsin Model Early Learning Standards with the Wisconsin Common Core State Standards, expectations can be…
Preschool Literacy and the Common Core: A Professional Development Model
ERIC Educational Resources Information Center
Wake, Donna G.; Benson, Tammy Rachelle
2016-01-01
Many states have adopted the Common Core Standards for literacy and math and have begun enacting these standards in school curriculum. In states where these standards have been adopted, professional educators working in K-12 contexts have been working to create transition plans from existing state-based standards to the Common Core standards. A…
Crisp, Ginny D; Burkhart, Jena Ivey; Esserman, Denise A; Weinberger, Morris; Roth, Mary T
2011-12-01
Medication is one of the most important interventions for improving the health of older adults, yet it has great potential for causing harm. Clinical pharmacists are well positioned to engage in medication assessment and planning. The Individualized Medication Assessment and Planning (iMAP) tool was developed to aid clinical pharmacists in documenting medication-related problems (MRPs) and associated recommendations. The purpose of our study was to assess the reliability and usability of the iMAP tool in classifying MRPs and associated recommendations in older adults in the ambulatory care setting. Three cases, representative of older adults seen in an outpatient setting, were developed. Pilot testing was conducted and a "gold standard" key developed. Eight eligible pharmacists consented to participate in the study. They were instructed to read each case, make an assessment of MRPs, formulate a plan, and document the information using the iMAP tool. Inter-rater reliability was assessed for each case, comparing the pharmacists' identified MRPs and recommendations to the gold standard. Consistency of categorization across reviewers was assessed using the κ statistic or percent agreement. The mean κ across the 8 pharmacists in classifying MRPs compared with the gold standard was 0.74 (range, 0.54-1.00) for case 1 and 0.68 (range, 0.36-1.00) for case 2, indicating substantial agreement. For case 3, percent agreement was 63% (range, 40%-100%). The mean κ across the 8 pharmacists when classifying recommendations compared with the gold standard was 0.87 (range, 0.58-1.00) for case 1 and 0.88 (range, 0.75-1.00) for case 2, indicating almost perfect agreement. For case 3, percent agreement was 68% (range, 40%-100%). Clinical pharmacists found the iMAP tool easy to use. The iMAP tool provides a reliable and standardized approach for clinical pharmacists to use in the ambulatory care setting to classify MRPs and associated recommendations. Future studies will explore the predictive validity of the tool on clinical outcomes such as health care utilization. Copyright © 2011 Elsevier HS Journals, Inc. All rights reserved.
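For readers unfamiliar with the κ statistic reported above: it corrects the observed rater agreement for the agreement expected by chance. A minimal sketch with invented MRP category labels, not the study's actual coding scheme:

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters over the same items."""
        n = len(rater_a)
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        ca, cb = Counter(rater_a), Counter(rater_b)
        p_e = sum(ca[k] * cb[k] for k in ca) / n**2  # chance agreement
        return (p_o - p_e) / (1 - p_e)

    # Toy MRP classifications: a pharmacist vs. the gold-standard key
    gold = ["dose", "adherence", "dose", "interaction", "dose"]
    rater = ["dose", "adherence", "interaction", "interaction", "dose"]
    print(round(cohens_kappa(gold, rater), 2))  # 0.69 on the toy data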
Self-Consistent Chaotic Transport in a High-Dimensional Mean-Field Hamiltonian Map Model
Martínez-del-Río, D.; del-Castillo-Negrete, D.; Olvera, A.; ...
2015-10-30
We studied the self-consistent chaotic transport in a Hamiltonian mean-field model. This model provides a simplified description of transport in marginally stable systems, including vorticity mixing in strong shear flows and electron dynamics in plasmas. Self-consistency is incorporated through a mean field that couples all the degrees of freedom. The model is formulated as a large set of N coupled standard-like area-preserving twist maps in which the amplitude and phase of the perturbation, rather than being constant as in the standard map, are dynamical variables. Of particular interest is the study of the impact of periodic orbits on the chaotic transport and coherent structures. Furthermore, numerical simulations show that self-consistency leads to the formation of a coherent macro-particle trapped around the elliptic fixed point of the system, which appears together with an asymptotic periodic behavior of the mean field. To model this asymptotic state, we introduced a non-autonomous map that allows a detailed study of the onset of global transport. A turnstile-type transport mechanism that allows transport across instantaneous KAM invariant circles in non-autonomous systems is discussed. As a first step towards understanding transport, we study a special type of orbits referred to as sequential periodic orbits. Using symmetry properties, we show that, through replication, high-dimensional sequential periodic orbits can be generated starting from low-dimensional periodic orbits. We show that sequential periodic orbits in the self-consistent map can be continued from trivial (uncoupled) periodic orbits of standard-like maps using numerical and asymptotic methods. Normal forms are used to describe these orbits and to find the values of the map parameters that guarantee their existence. Numerical simulations are used to verify the predictions from the asymptotic methods.
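For context, the Chirikov standard map that the "standard-like" twist maps above generalize reads, for a constant perturbation amplitude K:

    p_{n+1} = p_n + K \sin \theta_n, \qquad \theta_{n+1} = \theta_n + p_{n+1} \pmod{2\pi}

In the self-consistent model, the amplitude and phase of the perturbation become dynamical variables coupled through the mean field, rather than the fixed K above.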
Linking late cognitive outcome with glioma surgery location using resection cavity maps.
Hendriks, Eef J; Habets, Esther J J; Taphoorn, Martin J B; Douw, Linda; Zwinderman, Aeilko H; Vandertop, W Peter; Barkhof, Frederik; Klein, Martin; De Witt Hamer, Philip C
2018-05-01
Patients with a diffuse glioma may experience cognitive decline or improvement upon resective surgery. To examine the impact of glioma location, cognitive alteration after glioma surgery was quantified and related to voxel-based resection probability maps. A total of 59 consecutive patients (range 18-67 years of age) who had resective surgery between 2006 and 2011 for a supratentorial nonenhancing diffuse glioma (grade I-III, WHO 2007) were included in this observational cohort study. Standardized neuropsychological examination and MRI were obtained before and after surgery. Intraoperative stimulation mapping guided resections towards neurological functions (language, sensorimotor function, and visual fields). Maps of resected regions were constructed in standard space. These resection cavity maps were compared between patients with and without new cognitive deficits (z-score difference >1.5 SD between baseline and one year after resection), using a voxel-wise randomization test and calculation of false discovery rates. Brain regions significantly associated with cognitive decline were classified in standard cortical and subcortical anatomy. Cognitive improvement in any domain occurred in 10 (17%) patients, cognitive decline in any domain in 25 (42%), and decline in more than one domain in 10 (17%). The most frequently affected subdomains were attention in 10 (17%) patients and information processing speed in 9 (15%). Resection regions associated with decline in more than one domain were predominantly located in the right hemisphere. For attention decline, no specific region could be identified. For decline in information speed, several regions were found, including the frontal pole and the corpus callosum. Cognitive decline after resective surgery of diffuse glioma is prevalent, in particular, in patients with a tumor located in the right hemisphere without cognitive function mapping. © The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
Cho, Chung Y; Oles, Carolyn; Nowatzke, William; Oliver, Kerry; Garber, Eric A E
2017-10-01
The homology between proteins in legumes and tree nuts makes it common for individuals with food allergies to be allergic to multiple legumes and tree nuts. This propensity for allergenic and antigenic cross-reactivity means that commonly employed commercial immunodiagnostic assays (e.g., dipsticks) for the detection of food allergens may not always accurately detect, identify, and quantitate legumes and tree nuts unless additional orthogonal analytical methods or secondary measures of analysis are employed. The xMAP® Multiplex Food Allergen Detection Assay (FADA) was used to determine the cross-reactivity patterns and the utility of multi-antibody antigenic profiling to distinguish between legumes and tree nuts. Pure legumes and tree nuts extracted using buffered detergent displayed a high level of cross-reactivity that decreased upon dilution or by using a buffer (UD buffer) designed to increase the stringency of binding conditions and reduce the occurrence of false positives due to plant-derived lectins. Testing for unexpected food allergens, or screening for multiple food allergens, often involves not knowing the identity of the allergen present, its concentration, or the degree of modification during processing. As such, the analytical response measured may represent multiple antigens of varying antigenicity (cross-reactivity). This problem of multiple potential analytes usually goes unresolved: the focus defaults to the primary analyte (the antigen the antibody was raised against), and quantitative interpretation of the content of the analytical sample becomes problematic. The alternative solution offered here is the use of an antigenic profile generated by the xMAP FADA using multiple antibodies (bead sets). By comparing the antigenic profile to standards, the allergen may be identified along with an estimate of the concentration present. Cluster analysis of the xMAP FADA data was also performed and agreed with the known phylogeny of the legumes and tree nuts being analyzed. Graphical abstract: The use of cluster analysis to compare the multi-antigen profiles of food allergens.
SoilInfo App: global soil information on your palm
NASA Astrophysics Data System (ADS)
Hengl, Tomislav; Mendes de Jesus, Jorge
2015-04-01
ISRIC - World Soil Information released in 2014 an app for mobile devices called 'SoilInfo' (http://soilinfo-app.org), which aims to provide free access to global soil data. The SoilInfo App (available for Android v4.0 Ice Cream Sandwich or higher, and Apple iOS v6.x and v7.x) currently serves the SoilGrids1km data: a stack of soil property and class maps at six standard depths at a resolution of 1 km (30 arc seconds), predicted using automated geostatistical mapping and global soil data models. The served soil data include: soil organic carbon, soil pH, sand, silt and clay fractions (%), bulk density (kg/m3), cation exchange capacity of the fine earth fraction (cmol+/kg), coarse fragments (%), World Reference Base soil groups, and USDA Soil Taxonomy suborders (DOI: 10.1371/journal.pone.0105992). New soil properties and classes will be continuously added to the system. SoilGrids1km data are available for download under a Creative Commons non-commercial license via http://soilgrids.org, and are also accessible via a Representational State Transfer (REST) API service (http://rest.soilgrids.org). The SoilInfo App mimics common weather apps, but is also largely inspired by crowdsourcing systems such as OpenStreetMap and Geo-wiki. Two development aspects of the SoilInfo App and SoilGrids are constantly being worked on: data quality, in terms of the accuracy of spatial predictions and derived information, and data usability, in terms of ease of access and ease of use (i.e., the flexibility of the cyberinfrastructure and its functionalities, such as the REST SoilGrids API and the SoilInfo App). The development focus in 2015 is on improving the thematic and spatial accuracy of SoilGrids predictions, primarily by using finer-resolution covariates (250 m) and machine learning algorithms (such as random forests) to improve spatial predictions.
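A sketch of querying the REST service mentioned above from Python; the query path, parameter names, and response layout are assumptions made for illustration, not the documented API contract.

    import requests  # pip install requests

    # Hypothetical point query: 'query', 'lon', and 'lat' are assumed names.
    resp = requests.get("http://rest.soilgrids.org/query",
                        params={"lon": 5.39, "lat": 52.12}, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    print(list(data.keys()))  # inspect which soil properties were returned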
NASA Astrophysics Data System (ADS)
Bai, Di; Messinger, David W.; Howell, David
2017-08-01
The Gough Map, one of the earliest surviving maps of Britain, was created and extensively revised over the 15th century. In 2015, the map was imaged using a hyperspectral imaging system while in the collection at the Bodleian Library, Oxford University. The goal of the collection of the hyperspectral image (HSI) of the Gough Map was to address questions such as enhancement of faded text for reading and analysis of the pigments used during its creation and revision. In particular, pigment analysis of the Gough Map will help historians understand the material diversity of its composition and potentially the timeline of, and methods used in, the creation and revision of the map. Multiple analysis methods are presented to analyze a particular pigment in the Gough Map with an emphasis on understanding the within-material diversity, i.e., the number and spatial layout of distinct red pigments. One approach for understanding the number of distinct materials in a scene (i.e., endmember selection and dimensionality estimation) is the Gram matrix approach. Here, this method is used to study the within-material differences of pigments in the map with common visual color. The application is a pigment analysis tool that extracts visually common pixels (here, the red pigments) from the Gough Map and estimates the material diversity of the pixels. Results show that the Gough Map is composed of at least five kinds of dominant red pigments with a particular spatial pattern. This research provides a useful tool for historical geographers and cartographic historians to analyze the material diversity of HSI of cultural heritage artifacts.
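A schematic of a Gram-matrix diagnostic for within-material diversity, in the spirit of the approach described above: count how many eigenvalues of the Gram matrix of visually similar pixel spectra are needed to capture most of the energy. The energy threshold, toy data, and names are illustrative, not the authors' algorithm; on the two-pigment toy scene below, the estimate is expected to be about 2.

    import numpy as np

    def estimate_material_diversity(pixels, energy=0.99):
        """Estimate how many distinct spectra (e.g., red pigments) are
        present among visually similar pixels via the Gram matrix.

        pixels : (n_pixels, n_bands) reflectance spectra
        """
        G = pixels @ pixels.T                      # Gram matrix (n x n)
        evals = np.clip(np.linalg.eigvalsh(G)[::-1], 0, None)
        frac = np.cumsum(evals) / evals.sum()      # cumulative energy
        return int(np.searchsorted(frac, energy) + 1)

    rng = np.random.default_rng(0)
    # Toy scene: mixtures of two true 'red' spectra plus noise, 30 bands
    ends = rng.uniform(0, 1, size=(2, 30))
    mix = rng.dirichlet([1, 1], size=100) @ ends + rng.normal(0, 0.01, (100, 30))
    print(estimate_material_diversity(mix))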
Plain, Karren M; Waldron, Anna M; Begg, Douglas J; de Silva, Kumudika; Purdie, Auriol C; Whittington, Richard J
2015-04-01
Pathogenic mycobacteria are difficult to culture, requiring specialized media and a long incubation time, and have complex and exceedingly robust cell walls. Mycobacterium avium subsp. paratuberculosis (MAP), the causative agent of Johne's disease, a chronic wasting disease of ruminants, is a typical example. Culture of MAP from the feces and intestinal tissues is a commonly used test for confirmation of infection. Liquid medium offers greater sensitivity than solid medium for detection of MAP; however, support for the BD Bactec 460 system commonly used for this purpose has been discontinued. We previously developed a new liquid culture medium, M7H9C, to replace it, with confirmation of growth reliant on PCR. Here, we report an efficient DNA isolation and quantitative PCR methodology for the specific detection and confirmation of MAP growth in liquid culture media containing egg yolk. The analytical sensitivity was at least 10(4)-fold higher than a commonly used method involving ethanol precipitation of DNA and conventional PCR; this may be partly due to the addition of a bead-beating step to manually disrupt the cell wall of the mycobacteria. The limit of detection, determined using pure cultures of two different MAP strains, was 100 to 1,000 MAP organisms/ml. The diagnostic accuracy was confirmed using a panel of cattle fecal (n=54) and sheep fecal and tissue (n=90) culture samples. This technique is directly relevant for diagnostic laboratories that perform MAP cultures but may also be applicable to the detection of other species, including M. avium and M. tuberculosis. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Mapping QTLs for drought tolerance in a SEA 5 x AND 277 common bean cross with SSRs and SNP markers
Briñez, Boris; Perseguini, Juliana Morini Küpper Cardoso; Rosa, Juliana Santa; Bassi, Denis; Gonçalves, João Guilherme Ribeiro; Almeida, Caléo; Paulino, Jean Fausto de Carvalho; Blair, Matthew Ward; Chioratto, Alisson Fernando; Carbonell, Sérgio Augusto Morais; Valdisser, Paula Arielle Mendes Ribeiro; Vianello, Rosana Pereira; Benchimol-Reis, Luciana Lasry
2017-01-01
The common bean is characterized by high sensitivity to drought and low productivity. Breeding for drought resistance in this species involves genes of different genetic groups. In this work, we used a SEA 5 x AND 277 cross to map quantitative trait loci associated with drought tolerance in order to assess the factors that determine the magnitude of drought response in common beans. A total of 438 polymorphic markers were used to genotype the F8 mapping population. Phenotyping was done in two greenhouses, one used to simulate drought and the other to simulate irrigated conditions. Fourteen traits associated with drought tolerance were measured to identify the quantitative trait loci (QTLs). The map was constructed with 331 markers that covered all 11 chromosomes and had a total length of 1515 cM. Twenty-two QTLs were discovered for chlorophyll, leaf and stem fresh biomass, leaf biomass dry weight, leaf temperature, number of pods per plant, number of seeds per plant, seed weight, days to flowering, dry pod weight and total yield under well-watered and drought (stress) conditions. All the QTLs detected under drought conditions showed positive effects of the SEA 5 allele. This study provides a better understanding of the genetic inheritance of drought tolerance in common bean. PMID:29064511
Digital mapping techniques '00, workshop proceedings - May 17-20, 2000, Lexington, Kentucky
Soller, David R.
2000-01-01
Introduction: The Digital Mapping Techniques '00 (DMT'00) workshop was attended by 99 technical experts from 42 agencies, universities, and private companies, including representatives from 28 state geological surveys (see Appendix A). This workshop was similar in nature to the first three meetings, held in June, 1997, in Lawrence, Kansas (Soller, 1997), in May, 1998, in Champaign, Illinois (Soller, 1998a), and in May, 1999, in Madison, Wisconsin (Soller, 1999). This year's meeting was hosted by the Kentucky Geological Survey, from May 17 to 20, 2000, on the University of Kentucky campus in Lexington. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. When, based on discussions at the workshop, an attendee adopts or modifies a newly learned technique, the workshop clearly has met that objective. Evidence of learning and cooperation among participating agencies continued to be a highlight of the DMT workshops (see example in Soller, 1998b, and various papers in this volume). The meeting's general goal was to help move the state geological surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and geographic information systems (GIS) analysis. Through oral and poster presentations and special discussion sessions, emphasis was given to: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) continued development of the National Geologic Map Database; 3) progress toward building a standard geologic map data model; 4) field data-collection systems; and 5) map citation and authorship guidelines. Four representatives of the GIS hardware and software vendor community were invited to participate. The four annual DMT workshops were coordinated by the AASG/USGS Data Capture Working Group, which was formed in August, 1996, to support the Association of American State Geologists and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ncgmp.usgs.gov/ngmdbproject/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed to help the Database, and the State and Federal geological surveys, provide more high-quality digital maps to the public.
Generalized logistic map and its application in chaos based cryptography
NASA Astrophysics Data System (ADS)
Lawnik, M.
2017-12-01
The logistic map is commonly used in, for example, chaos-based cryptography. However, its properties do not permit the construction of safe encryption algorithms. The scope of this paper is therefore a proposal for a generalization of the logistic map by means of a well-recognized family of chaotic maps. In the next step, the Lyapunov exponent and the distribution of the iterative variable are analyzed. The obtained results confirm that the analyzed model can safely and effectively replace the classic logistic map in applications involving chaotic cryptography.
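For concreteness, the classic logistic map is x_{n+1} = r x_n (1 - x_n), and its Lyapunov exponent can be estimated as the orbit average of log|f'(x)| = log|r(1 - 2x)|; a positive value indicates chaos. A minimal sketch (not the paper's generalized map):

    import numpy as np

    def lyapunov_logistic(r, x0=0.4, n=5000, burn=500):
        """Estimate the Lyapunov exponent of x_{n+1} = r x_n (1 - x_n)
        as the average of log|r (1 - 2 x_n)| along an orbit."""
        x, acc, count = x0, 0.0, 0
        for i in range(n):
            x = r * x * (1.0 - x)
            if i >= burn:
                acc += np.log(abs(r * (1.0 - 2.0 * x)))
                count += 1
        return acc / count

    print(round(lyapunov_logistic(4.0), 3))  # theory: ln 2 ~ 0.693 at r = 4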
A comparison of contour maps derived from independent methods of measuring lunar magnetic fields
NASA Technical Reports Server (NTRS)
Lichtenstein, B. R.; Coleman, P. J., Jr.; Russell, C. T.
1978-01-01
Computer-generated contour maps of strong lunar remanent magnetic fields are presented and discussed. The maps, obtained by previously described (Eliason and Soderblom, 1977) techniques, are derived from a variety of direct and indirect measurements from Apollo 15 and 16 and Explorer 35 magnetometer and electron reflection data. A common display format is used to facilitate comparison of the maps over regions of overlapping coverage. Most large scale features of either weak or strong magnetic field regions are found to correlate fairly well on all the maps considered.
Standards-Based Open-Source Planetary Map Server: Lunaserv
NASA Astrophysics Data System (ADS)
Estes, N. M.; Silva, V. H.; Bowley, K. S.; Lanjewar, K. K.; Robinson, M. S.
2018-04-01
Lunaserv is a planetary-capable Web Map Service developed by the LROC SOC. It enables researchers to serve their own planetary data to a wide variety of GIS clients without any additional processing or download steps.
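A Web Map Service such as Lunaserv is queried with standard WMS GetMap requests. The sketch below uses the standard WMS 1.1.1 parameter set; the host URL and layer name are placeholders, not Lunaserv's actual endpoint or published layers.

    import requests  # pip install requests

    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": "example_basemap",   # placeholder layer name
        "STYLES": "",
        "SRS": "EPSG:4326",            # WMS 1.1.1 uses SRS (1.3.0 uses CRS)
        "BBOX": "-180,-90,180,90",
        "WIDTH": "1024", "HEIGHT": "512",
        "FORMAT": "image/png",
    }
    r = requests.get("https://wms.example.org/wms", params=params, timeout=60)
    r.raise_for_status()
    with open("map.png", "wb") as f:
        f.write(r.content)  # rendered map tile from the service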
Zhang, Qianqian; Guldbrandtsen, Bernt; Calus, Mario P L; Lund, Mogens Sandø; Sahana, Goutam
2016-08-17
There is growing interest in the role of rare variants in the variation of complex traits due to increasing evidence that rare variants are associated with quantitative traits. However, association methods that are commonly used for mapping common variants are not effective to map rare variants. Besides, livestock populations have large half-sib families and the occurrence of rare variants may be confounded with family structure, which makes it difficult to disentangle their effects from family mean effects. We compared the power of methods that are commonly applied in human genetics to map rare variants in cattle using whole-genome sequence data and simulated phenotypes. We also studied the power of mapping rare variants using linear mixed models (LMM), which are the method of choice to account for both family relationships and population structure in cattle. We observed that the power of the LMM approach was low for mapping a rare variant (defined as those that have frequencies lower than 0.01) with a moderate effect (5 to 8 % of phenotypic variance explained by multiple rare variants that vary from 5 to 21 in number) contributing to a QTL with a sample size of 1000. In contrast, across the scenarios studied, statistical methods that are specialized for mapping rare variants increased power regardless of whether multiple rare variants or a single rare variant underlie a QTL. Different methods for combining rare variants in the test single nucleotide polymorphism set resulted in similar power irrespective of the proportion of total genetic variance explained by the QTL. However, when the QTL variance is very small (only 0.1 % of the total genetic variance), these specialized methods for mapping rare variants and LMM generally had no power to map the variants within a gene with sample sizes of 1000 or 5000. We observed that the methods that combine multiple rare variants within a gene into a meta-variant generally had greater power to map rare variants compared to LMM. Therefore, it is recommended to use rare variant association mapping methods to map rare genetic variants that affect quantitative traits in livestock, such as bovine populations.
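A minimal sketch of the meta-variant (burden) idea discussed above: collapse the rare-variant genotypes within a gene into a single carrier count per animal and test that score against the phenotype. The simple regression below ignores the half-sib family structure that, as the study stresses, a real livestock analysis would model with an LMM; all data are simulated.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n, k = 1000, 12                    # animals x rare variants in a gene
    maf = rng.uniform(0.001, 0.01, k)  # rare allele frequencies
    G = rng.binomial(2, maf, size=(n, k)).astype(float)
    y = G @ rng.normal(0.5, 0.2, k) + rng.normal(0, 1, n)

    # Burden / meta-variant: count of rare alleles carried per animal
    burden = G.sum(axis=1)
    slope, intercept, rvalue, pvalue, stderr = stats.linregress(burden, y)
    print(f"burden-test p-value: {pvalue:.2e}")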
NASA Technical Reports Server (NTRS)
Place, J. L.
1974-01-01
Changes in land use between 1970 and 1973 in the Phoenix (1:250,000 scale) Quadrangle in Arizona have been mapped using only the images from ERTS-1, tending to verify the utility of a standard land use classification system proposed for use with ERTS images. Types of changes detected have been: (1) new residential development of former cropland and rangeland; (2) new cropland from the desert; and (3) new reservoir fill-up. The seasonal changing of vegetation patterns in ERTS has complemented air photos in delimiting the boundaries of some land use types. ERTS images, in combination with other sources of information, can assist in mapping the generalized land use of the fifty states by the standard 1:250,000 quadrangles. Several states are already working cooperatively in this type of mapping.
Buda, Alessandro; Papadia, Andrea; Zapardiel, Ignacio; Vizza, Enrico; Ghezzi, Fabio; De Ponti, Elena; Lissoni, Andrea Alberto; Imboden, Sara; Diestro, Maria Dolores; Verri, Debora; Gasparri, Maria Luisa; Bussi, Beatrice; Di Martino, Giampaolo; de la Noval, Begoña Diaz; Mueller, Michael; Crivellaro, Cinzia
2016-09-01
The credibility of sentinel lymph node (SLN) mapping is becoming increasingly established in cervical cancer. We aimed to assess the sensitivity of SLN biopsy in terms of detection rate and bilateral mapping in women with cervical cancer by comparing technetium-99m radiocolloid (Tc-99m) with blue dye (BD) versus fluorescence mapping with indocyanine green (ICG). Data of patients with cervical cancer stage 1A2 to 1B1 from 5 European institutions were retrospectively reviewed. All centers used a laparoscopic approach with the same intracervical dye injection. The detection rate and bilateral mapping of ICG were compared with the results obtained by standard Tc-99m with BD. Overall, 76 of 144 women (53%) underwent preoperative SLN mapping with radiotracer and intraoperative BD, whereas 68 of 144 (47%) underwent mapping using intraoperative ICG. The detection rate of SLN mapping was 96% for Tc-99m with BD and 100% for ICG. Bilateral mapping was achieved in 98.5% for ICG and 76.3% for Tc-99m with BD; this difference was statistically significant (p < 0.0001). Fluorescence SLN mapping with ICG achieved a significantly higher detection rate and bilateral mapping compared to the standard radiocolloid and BD technique in women with early-stage cervical cancer. Nodal staging with an intracervical injection of ICG is accurate, safe, and reproducible in patients with cervical cancer. Before lymphadenectomy is replaced completely, the additional value of fluorescence SLN mapping for both perioperative morbidity and survival should be explored and confirmed by ongoing controlled trials.
Issues in Analyzing Alignment of Language Arts Common Core Standards with State Standards
ERIC Educational Resources Information Center
Beach, Richard W.
2011-01-01
This commentary on Porter, McMaken, Hwang, and Yang's "Common Core Standards: The New U.S. Intended Curriculum," which finds a lack of alignment between the Common Core State Standards and state standards and assessments, suggests possible reasons for the lack of alignment. It also offers possible reasons for Porter et al.'s finding of a…
Something in Common: The Common Core Standards and the Next Chapter in American Education
ERIC Educational Resources Information Center
Rothman, Robert
2011-01-01
"Something in Common" is the first book to provide a detailed look at the groundbreaking Common Core State Standards and their potential to transform American education. This book tells the story of the unfolding political drama around the making of the Common Core State Standards for math and English language arts, which were adopted by…
Mapping rare and common causal alleles for complex human diseases
Raychaudhuri, Soumya
2011-01-01
Advances in genotyping and sequencing technologies have revolutionized the genetics of complex disease by locating rare and common variants that influence an individual’s risk for diseases, such as diabetes, cancers, and psychiatric disorders. However, to capitalize on this data for prevention and therapies requires the identification of causal alleles and a mechanistic understanding for how these variants contribute to the disease. After discussing the strategies currently used to map variants for complex diseases, this Primer explores how variants may be prioritized for follow-up functional studies and the challenges and approaches for assessing the contributions of rare and common variants to disease phenotypes. PMID:21962507
The interoperability skill of the Geographic Portal of the ISPRA - Geological Survey of Italy
NASA Astrophysics Data System (ADS)
Pia Congi, Maria; Campo, Valentina; Cipolloni, Carlo; Delogu, Daniela; Ventura, Renato; Battaglini, Loredana
2010-05-01
The Geographic Portal of the Geological Survey of Italy (ISPRA), available at http://serviziogeologico.apat.it/Portal, was planned according to the standard criteria of the INSPIRE directive. ArcIMS services and, at the same time, WMS and WFS services were implemented to satisfy different clients. For each database and web service, the metadata were written in agreement with ISO 19115. The management architecture of the portal allows it to encode client input and output requests both in ArcXML and in GML. Web applications and web services were implemented for each database owned by the Land Protection and Georesources Department, covering the geological map at the scales 1:50,000 (CARG Project) and 1:100,000, the IFFI landslide inventory, the boreholes drilled under Law 464/84, the large-scale geological map, and all the raster-format maps. The portal published so far is at an experimental stage; the final version will be reached through the development of a new graphical interface. The WMS and WFS services, including metadata, will be re-designed. The validity of the methodology and of the applied standards makes it possible to look ahead to further developments. In addition, it must be borne in mind that the new geological standard language (GeoSciML), which is already incorporated in the deployed web services, will allow better display and querying of geological data in accordance with interoperability. The characteristics of geological data demand, for cartographic mapping, specific symbol libraries not yet available in a WMS service; this is another aspect concerning standards for geological information. Therefore, the following have been carried out so far: - a library of geological symbols to be used for printing, with a sketch of system colors, and a library for displaying data on screen, which almost completely solves the problems of point and area coverage data (including oriented data) but still presents problems for linear data (solutions: ArcIMS services from ArcMap projects or a specific SLD implementation for WMS services); - an update of the "Guidelines for the supply of geological data", to be published shortly; - the official involvement of the Geological Survey of Italy in the IUGS-CGI working group for the processing of, and experimentation on, the new GeoSciML language with WMS/WFS services. The availability of geographic information is achieved through metadata that can be distributed online so that search engines can find them through specialized searches. The metadata collected in catalogs are structured according to a standard (ISO 19135). The catalogs are a 'common' interface to locate, view, and query data, metadata, web services, and other resources. Working in a growing sector of environmental knowledge, the focus is on gathering the participation of other parties that can enrich the available information content, so as to arrive at a true portal of national interest, especially for disaster management.
NASA Astrophysics Data System (ADS)
Heijenk, R.; Geertsema, M.; Miller, B.; de Jong, S. M.
2015-12-01
Spreads and other low-gradient landslides are common in glacial lake sediments in northeastern British Columbia. Both pre- and post-glacial lake sediments, largely derived from shale bedrock, are susceptible to low-gradient landslides. Bank erosion by rivers and streams, together with high pore pressures, has contributed to the landslides. We used LiDAR to map the extent of the glaciolacustrine sediments and to map and characterise landslides in the Pine River valley, near Chetwynd, British Columbia. We included metrics such as travel angle, length, area, and elevation to distinguish rotational from translational landslides. We mapped 45 landslides in the Pine River valley, distinguishing between rotational and translational landslides. The rotational landslides commonly have a smaller area and shorter travel length than the translational landslides. Most rotational slides involved overlying alluvial fans, while most translational slides involved terraces.
Career Mapping for Professional Development and Succession Planning.
Webb, Tammy; Diamond-Wells, Tammy; Jeffs, Debra
Career mapping facilitates professional development of nurses by education specialists and nurse managers. On the basis of national Nursing Professional Development Scope and Standards, our education and professional development framework supports the organization's professional practice model and provides a foundation for the professional career map. This article describes development, implementation, and evaluation of the professional career map for nurses at a large children's hospital to support achievement of the nursing strategic goals for succession planning and professional development.
Rapid quantitative chemical mapping of surfaces with sub-2 nm resolution
NASA Astrophysics Data System (ADS)
Lai, Chia-Yun; Perri, Saverio; Santos, Sergio; Garcia, Ricardo; Chiesa, Matteo
2016-05-01
We present a theory that exploits four observables in bimodal atomic force microscopy to produce maps of the Hamaker constant H. The quantitative H maps may be employed by the broader community to directly interpret the high resolution of standard bimodal AFM images as chemical maps while simultaneously quantifying chemistry in the non-contact regime. We further provide a simple methodology to optimize a range of operational parameters for which H is in the closest agreement with the Lifshitz theory in order to (1) simplify data acquisition and (2) generalize the methodology to any set of cantilever-sample systems. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr00496b
Profiling structured product labeling with NDF-RT and RxNorm.
Zhu, Qian; Jiang, Guoqian; Chute, Christopher G
2012-12-20
Structured Product Labeling (SPL) is a document markup standard approved by Health Level Seven (HL7) and adopted by the United States Food and Drug Administration (FDA) as a mechanism for exchanging drug product information. SPL drug labels contain rich information about FDA-approved clinical drugs; however, the lack of linkage to standard drug ontologies hinders their meaningful use. NDF-RT (National Drug File Reference Terminology) and NLM RxNorm were used as standard drug ontologies to standardize and profile the product labels. In this paper, we present a framework that maps SPL drug labels to the existing drug ontologies NDF-RT and RxNorm, and applies existing categorical annotations from those ontologies to classify SPL drug labels into corresponding classes. We established the classification and relevant linkage for SPL drug labels using the following three approaches. First, we retrieved NDF-RT categorical information from the External Pharmacologic Class (EPC) indexing SPLs. Second, we used the RxNorm and NDF-RT mappings to classify and link SPLs with NDF-RT categories. Third, we profiled SPLs using RxNorm term type information. In the implementation, we employed a Semantic Web technology framework in which the NDF-RT and SPL data sets were stored in an RDF triple store and SPARQL queries were executed against customized SPARQL endpoints; RxNorm data were imported into a MySQL relational database. In total, 96.0% of SPL drug labels were mapped to NDF-RT categories and 97.0% were linked to RxNorm codes. We found that the majority of SPL drug labels map to chemical ingredient concepts in both drug ontologies, whereas a relatively small portion map to clinical drug concepts. The profiling outcomes provide useful insights on the meaningful use of FDA SPL drug labels in clinical applications through standard drug ontologies such as NDF-RT and RxNorm.
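The triple-store half of this framework can be sketched generically. The endpoint URL, prefix, and predicates below are hypothetical, since the abstract does not describe the actual graph layout; the sketch only shows the general SPARQL pattern of pulling an NDF-RT category for each SPL label:

```python
from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

# Hypothetical endpoint and vocabulary; the paper's real store is not described.
sparql = SPARQLWrapper("http://localhost:3030/spl/sparql")
sparql.setReturnFormat(JSON)
sparql.setQuery("""
PREFIX ex: <http://example.org/spl#>
SELECT ?setId ?category WHERE {
  ?spl ex:setId ?setId ;            # SPL document identifier
       ex:hasEPC ?epc .             # External Pharmacologic Class annotation
  ?epc ex:ndfrtCategory ?category . # NDF-RT category linked to that EPC
}
LIMIT 10
""")

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["setId"]["value"], "->", row["category"]["value"])
```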
Shultz, Mary
2006-01-01
Introduction: Given the common use of acronyms and initialisms in the health sciences, searchers may be entering these abbreviated terms rather than full phrases when searching online systems. The purpose of this study is to evaluate how various MEDLINE Medical Subject Headings (MeSH) interfaces map acronyms and initialisms to the MeSH vocabulary. Methods: The interfaces used in this study were the PubMed MeSH database, the PubMed Automatic Term Mapping feature, the NLM Gateway Term Finder, and Ovid MEDLINE. Acronyms and initialisms were randomly selected from two print sources. The test data set included 415 randomly selected acronyms and initialisms whose related meanings were found to be MeSH terms. Each acronym and initialism was entered into each MEDLINE MeSH interface to determine whether it mapped to the corresponding MeSH term. Separately, 46 commonly used acronyms and initialisms were tested. Results: While performance differed widely, success rates were low across all interfaces for the randomly selected terms. The commonly used acronyms and initialisms achieved higher success rates across the interfaces, but the differences between interfaces remained. Conclusion: Online interfaces do not always map medical acronyms and initialisms to their corresponding MeSH phrases. This may lead to inaccurate results and missed information when acronyms and initialisms are used in search strategies. PMID:17082832
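One of the tested behaviours, PubMed's Automatic Term Mapping, can be probed programmatically: the NCBI E-utilities esearch call returns, alongside the hit count, the translation it applied to the submitted term in a TranslationSet element. A minimal sketch (the acronym below is an arbitrary example, not one of the study's 415 test terms):

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def pubmed_translation(term: str) -> list[str]:
    """Ask PubMed E-utilities how it maps `term`; returns the translated
    query strings from the <Translation> elements of the esearch response."""
    url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
           + urllib.parse.urlencode({"db": "pubmed", "term": term}))
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    return [t.findtext("To") for t in tree.iter("Translation")]

# e.g. inspect what PubMed expands the initialism "CABG" into
print(pubmed_translation("CABG"))
```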
An Example of Unsupervised Networks Kohonen's Self-Organizing Feature Map
NASA Technical Reports Server (NTRS)
Niebur, Dagmar
1995-01-01
Kohonen's self-organizing feature map belongs to a class of unsupervised artificial neural networks commonly referred to as topographic maps. It serves two purposes: the quantization and the dimensionality reduction of data. A short description of its history and its biological context is given. We show that the inherent classification properties of the feature map make it a suitable candidate for solving the classification task in power system areas such as load forecasting, fault diagnosis, and security assessment.
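For readers unfamiliar with the algorithm: a self-organizing map places a grid of weight vectors in the input space and, for each sample, moves the best-matching unit and its grid neighbours toward that sample, so that nearby grid cells end up quantizing nearby inputs. A minimal NumPy sketch of the classic online update (the grid size, learning rate, and neighbourhood schedule are arbitrary choices, not values from the paper):

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Online Kohonen SOM: returns weights of shape (rows, cols, dim)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    w = rng.random((rows, cols, data.shape[1]))
    # (row, col) coordinates of every grid cell, for neighbourhood distances
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                  indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr = lr0 * (1 - t)              # decaying learning rate
            sigma = sigma0 * (1 - t) + 0.5  # shrinking neighbourhood radius
            # best-matching unit = grid cell whose weight is closest to x
            bmu = np.unravel_index(
                np.argmin(((w - x) ** 2).sum(axis=2)), (rows, cols))
            # Gaussian neighbourhood around the BMU on the grid
            d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-d2 / (2 * sigma ** 2))[..., None]
            w += lr * h * (x - w)           # pull the neighbourhood toward x
            step += 1
    return w

# Toy usage: quantize 2-D points onto a 10x10 topographic grid.
weights = train_som(np.random.default_rng(1).random((500, 2)))
```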
Prioritising coastal zone management issues through fuzzy cognitive mapping approach.
Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi
2012-04-30
Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built, ultimately resulting in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is proposed here for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), in which the participants drew cognitive maps depicting their views. The analysis of the cognitive maps revealed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions, and sectoral initiatives. Furthermore, common problems, expectations, and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perceptions and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
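Computationally, a fuzzy cognitive map is a signed weighted digraph over concepts: each concept's activation is repeatedly updated from the weighted sum of its neighbours through a squashing function until the state settles. A minimal sketch of one common update rule, with invented concepts and weights rather than the maps elicited in the Lebanese workshops:

```python
import numpy as np

def fcm_simulate(W, a0, steps=50, lam=1.0):
    """Iterate a fuzzy cognitive map: a[j] <- sigmoid(a[j] + sum_i a[i] * W[i, j]),
    where W[i, j] is the signed influence of concept i on concept j."""
    a = np.asarray(a0, dtype=float)
    for _ in range(steps):
        a = 1.0 / (1.0 + np.exp(-lam * (a + a @ W)))  # logistic squashing
    return a

# Invented 3-concept example: tourism, coastal pollution, fish stocks.
W = np.array([[0.0,  0.6,  0.0],   # tourism increases pollution
              [0.0,  0.0, -0.7],   # pollution depresses fish stocks
              [0.4,  0.0,  0.0]])  # fish stocks support tourism
print(fcm_simulate(W, a0=[1.0, 0.0, 0.5]))  # settled activation levels
```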
Crowdsourced Mapping - Letting Amateurs Into the Temple?
NASA Astrophysics Data System (ADS)
McCullagh, M.; Jackson, M.
2013-05-01
The rise of crowdsourced mapping data is well documented, and attempts to integrate such information within existing or potential NSDIs [National Spatial Data Infrastructures] are increasingly being examined. The results of these experiments, however, have been mixed and have left many researchers uncertain about the benefits of integration and about solutions to the problems of using such combined and potentially synergistic mapping tools. This paper reviews the development of the crowdsourced mapping movement and discusses the applications that have been developed and some of the successes achieved thus far. It also describes the problems of integration and ways of estimating success, based partly on a number of ongoing studies at the University of Nottingham that look at different aspects of the integration problem: iterative improvement of crowdsourced data quality, comparison between crowdsourced data and prior knowledge and models, development of trust in such data, and the alignment of variant ontologies. Questions of quality arise, particularly when crowdsourced data are combined with pre-existing NSDI data. The latter is usually stable, meets international standards, and often provides national coverage for use at a variety of scales. The former is often partial, without defined quality standards, and patchy in coverage, but frequently addresses themes very important to some grass-roots group and often to society as a whole. Such a group, whether of regional, national, or international importance, needs a mapping facility to express its views, and should therefore combine with local NSDI initiatives to provide valid mapping. Will both groups use ISO (International Organization for Standardization) and OGC (Open Geospatial Consortium) standards? Or might some extension or relaxation be required to accommodate the mostly less rigorous crowdsourced data? So, can crowdsourced data ever be safely and successfully merged into an NSDI? Should it simply be a separate mapping layer? Is full integration possible provided quality standards are fully met and methods of defining levels of quality are agreed? Frequently, crowdsourced data sets are anarchic in composition and based on new and sometimes unproven technologies. Can an NSDI exhibit the necessary flexibility and speed to deal with such rapid technological and societal change?
Powell, E S; Pyburn, R E; Hill, E; Smith, K S; Ribbands, M S; Mickelborough, J; Pomeroy, V M
2002-09-01
Evaluation of the effectiveness of therapy to improve sitting balance has been hampered by the limited number of sensitive, objective clinical measures. We developed the Manchester Active Position Seat (MAPS) to provide a portable system for tracking change in the position of the centre of force over time. Aims: (1) to investigate whether there is correspondence between the measurement of position change by a forceplate and by MAPS; (2) to explore whether and how MAPS measures changes in position when seated healthy adults change posture. Design: a feasibility study. Methods: (1) an adult subject sat on MAPS placed on top of a forceplate; the x and y coordinates of the centre of pressure recorded from the forceplate and of the centre of force from MAPS during movement were compared graphically. (2) Four adults sat on MAPS using a standardized starting position and moved into six sets of six standardized target postures in a predetermined randomized order; the absolute shift in centre of force from the starting position was calculated. Results: (1) the pattern of change of position over time was similar for the forceplate and for MAPS, although there was a measurement difference that increased with distance from the centre. (2) The direction of change of position corresponded to the direction of movement to the target postures, but the amount of change varied between subjects. MAPS shows promise as an objective clinical measure of sitting balance, but the peripheral accuracy of measurement needs to be improved.
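The outcome measure described, the absolute shift in centre of force from the starting position, reads as a plain Euclidean distance in the seat's x-y plane; that interpretation, and the sample values below, are assumptions for illustration:

```python
import math

def cof_shift(start_xy, current_xy):
    """Absolute shift of the centre of force from the starting position,
    as the Euclidean distance between two (x, y) seat coordinates."""
    dx = current_xy[0] - start_xy[0]
    dy = current_xy[1] - start_xy[1]
    return math.hypot(dx, dy)

# e.g. a lean moving the centre of force 3 cm right and 4 cm forward:
print(cof_shift((0.0, 0.0), (3.0, 4.0)))  # 5.0, in the same units as the inputs
```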