Sample records for gridwise standards mapping

  1. GridWise Standards Mapping Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosquet, Mia L.

    "GridWise" is a concept of how advanced communications, information, and controls technology can transform the nation's energy system--across the spectrum from large-scale central generation to common consumer appliances and equipment--into a collaborative network, rich in the exchange of decision-making information and an abundance of market-based opportunities (Widergren and Bosquet 2003), bringing the electric transmission and distribution system fully into the information and telecommunication age. This report summarizes a broad review of standards efforts related to GridWise--those which could ultimately contribute significantly to advancements toward the GridWise vision, or those which represent the current technological basis upon which this vision must build.

  2. Smart Grid Interoperability Maturity Model Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  3. Pattern Recognition Analysis of Age-Related Retinal Ganglion Cell Signatures in the Human Eye

    PubMed Central

    Yoshioka, Nayuta; Zangerl, Barbara; Nivison-Smith, Lisa; Khuu, Sieu K.; Jones, Bryan W.; Pfeiffer, Rebecca L.; Marc, Robert E.; Kalloniatis, Michael

    2017-01-01

    Purpose: To characterize macular ganglion cell layer (GCL) changes with age and provide a framework to assess changes in ocular disease. This study used data clustering to analyze macular GCL patterns from optical coherence tomography (OCT) in a large cohort of subjects without ocular disease. Methods: Single eyes of 201 patients evaluated at the Centre for Eye Health (Sydney, Australia) were retrospectively enrolled (age range, 20–85); 8 × 8 grid locations obtained from Spectralis OCT macular scans were analyzed with unsupervised classification into statistically separable classes sharing common GCL thickness and change with age. The resulting classes and gridwise data were fitted with linear and segmented linear regression curves. Additionally, normalized data were analyzed to determine regression as a percentage. Accuracy of each model was examined by comparing the predicted 50-year-old-equivalent macular GCL thickness for the entire cohort to a true 50-year-old reference cohort. Results: Pattern recognition clustered GCL thickness across the macula into five to eight spatially concentric classes. An F-test demonstrated segmented linear regression to be the most appropriate model for macular GCL change. The pattern recognition–derived and normalized models showed less difference between the predicted macular GCL thickness and the reference cohort (average ± SD, 0.19 ± 0.92 and −0.30 ± 0.61 μm) than a gridwise model (average ± SD, 0.62 ± 1.43 μm). Conclusions: Pattern recognition successfully identified statistically separable macular areas that undergo a segmented linear reduction with age. This regression model better predicted macular GCL thickness. The various unique spatial patterns revealed by pattern recognition, combined with core GCL thickness data, provide a framework to analyze GCL loss in ocular disease. PMID: 28632847
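
    The segmented linear regression referred to in record 3 can be illustrated with a minimal brute-force breakpoint search. The hinge parameterization, candidate breakpoints, and synthetic thickness data below are illustrative assumptions, not the study's actual model or data.

```python
import numpy as np

def segmented_linear_fit(x, y, breakpoints):
    """Fit y = a + b*x + c*max(0, x - t) for each candidate breakpoint t
    and keep the fit with the smallest sum of squared errors."""
    best = None
    for t in breakpoints:
        # Design matrix: intercept, slope, and hinge term active past t
        X = np.column_stack([np.ones_like(x), x, np.maximum(0.0, x - t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((X @ coef - y) ** 2))
        if best is None or sse < best[0]:
            best = (sse, t, coef)
    return best  # (sse, breakpoint, [a, b, c])

# Synthetic "GCL thickness vs. age": flat to age 50, then linear decline
ages = np.linspace(20.0, 85.0, 66)
thickness = np.where(ages < 50.0, 35.0, 35.0 - 0.2 * (ages - 50.0))
sse, bp, coef = segmented_linear_fit(ages, thickness, np.arange(30.0, 75.0))
```

    With this synthetic data, the search recovers the breakpoint at age 50 and the post-breakpoint slope of −0.2 μm per year.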

  4. Magnetic Resonance Imaging of Surgical Implants Made from Weak Magnetic Materials

    NASA Astrophysics Data System (ADS)

    Gogola, D.; Krafčík, A.; Štrbák, O.; Frollo, I.

    2013-08-01

    Materials with high magnetic susceptibility cause local inhomogeneities in the main field of the magnetic resonance (MR) tomograph. These inhomogeneities lead to loss of phase coherence, and thus to a rapid loss of signal in the image. In our research we investigated the inhomogeneous field of magnetic implants such as magnetic fibers designed for inner sutures during surgery. The magnetic field inhomogeneities were studied on a low-magnetic planar phantom made from four thin strips of magnetic tape arranged grid-wise. We optimized the properties of the imaging sequences with the aim of finding the best setup for magnetic fiber visualization. These fibers can potentially be exploited in surgery for internal stitches, which can then be visualized by magnetic resonance imaging (MRI) after surgery. This study shows that imaging of magnetic implants is possible using low-field MRI systems, without the use of complicated post-processing techniques (e.g., IDEAL).

  5. Smart Grid Interoperability Maturity Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Levinson, Alex; Mater, J.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  6. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    PubMed

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is a prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.

  7. Automating the selection of standard parallels for conic map projections

    NASA Astrophysics Data System (ADS)

    Šavrič, Bojan; Jenny, Bernhard

    2016-05-01

    Conic map projections are appropriate for mapping regions at medium and large scales with east-west extents at intermediate latitudes, because they show the mapped area with less distortion than other projections. In order to minimize the distortion of the mapped area, the two standard parallels of conic projections need to be selected carefully. Rules of thumb exist for placing the standard parallels based on the width-to-height ratio of the map. These rules of thumb are simple to apply, but do not result in maps with minimum distortion. More sophisticated methods also exist that determine standard parallels such that distortion in the mapped area is minimized, but they are computationally expensive and cannot be used for real-time web mapping and GIS applications where the projection is adjusted automatically to the displayed area. This article presents a polynomial model that quickly provides the standard parallels for the three most common conic map projections: the Albers equal-area, the Lambert conformal, and the equidistant conic projection. The model defines the standard parallels with polynomial expressions based on the spatial extent of the mapped area, which is defined by the length of the mapped central meridian segment, the central latitude of the displayed area, and the width-to-height ratio of the map. The polynomial model was derived from 3825 maps, each with a different spatial extent and computationally determined standard parallels that minimize the mean scale distortion index. The resulting model is computationally simple and can be used for the automatic selection of the standard parallels of conic map projections in GIS software and web mapping applications.
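
    As a point of reference for the rules of thumb mentioned in record 7, the classic "one-sixth rule" can be written in a few lines. This is the simple heuristic the article improves upon, not the article's polynomial model, whose coefficients are not reproduced here.

```python
def one_sixth_rule(lat_south, lat_north):
    """Classic rule-of-thumb placement of conic standard parallels:
    one sixth of the latitude span inside each edge of the mapped area."""
    span = lat_north - lat_south
    return lat_south + span / 6.0, lat_north - span / 6.0

# Roughly the conterminous United States (illustrative latitude limits)
phi1, phi2 = one_sixth_rule(24.0, 50.0)
```

    For a conic map of the conterminous U.S., this heuristic lands in the vicinity of the commonly used Albers parallels of 29.5° and 45.5°, but it does not minimize distortion the way the article's fitted model does.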

  8. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjević-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps that can be used as backgrounds in geographic information systems. The process of continuous raster map creation using MapEdit's "mosaicking" function is also described, as are the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest-neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps of satisfactory quality for many purposes (±1 pixel). Quality assessment of several continuous raster maps at different scales created using our software and methodology has been undertaken, and the results are presented in the paper. For quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy, and the US National Map Accuracy Standard. The results obtained during the quality assessment process are given in the paper and show that our maps meet all three standards.
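
    The georeferencing step described in record 8 (control points plus linear transformations) can be sketched with a single least-squares affine fit. MapEdit's actual two-transformation scheme per map part is not reproduced, and the control-point values below are hypothetical.

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares affine transform mapping pixel coords to map coords
    from n >= 3 control points (each an array of shape (n, 2))."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.column_stack([src, np.ones(len(src))])   # rows: [col, row, 1]
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)  # shape (3, 2)
    return coef

def apply_affine(coef, pts):
    pts = np.asarray(pts, dtype=float)
    return np.column_stack([pts, np.ones(len(pts))]) @ coef

# Four hypothetical control points: pixel (col, row) -> map (E, N)
src = [(0, 0), (100, 0), (0, 100), (100, 100)]
dst = [(500000, 4500000), (500250, 4500000),
       (500000, 4499750), (500250, 4499750)]
coef = fit_affine(src, dst)
center = apply_affine(coef, [(50, 50)])[0]  # map coords of pixel (50, 50)
```

    Nearest-neighbor resampling then simply reads the source pixel closest to each inverse-transformed output location, which is what keeps the approach fast.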

  9. FGDC Digital Cartographic Standard for Geologic Map Symbolization (PostScript Implementation)

    USGS Publications Warehouse

    ,

    2006-01-01

    PLEASE NOTE: This now-approved 'FGDC Digital Cartographic Standard for Geologic Map Symbolization (PostScript Implementation)' officially supersedes its earlier (2000) Public Review Draft version. In August 2006, the Digital Cartographic Standard for Geologic Map Symbolization was officially endorsed by the Federal Geographic Data Committee (FGDC) as the national standard for the digital cartographic representation of geologic map features (FGDC Document Number FGDC-STD-013-2006). Presented herein is the PostScript implementation of the standard, which will enable users to directly apply the symbols in the standard to geologic maps and illustrations prepared in desktop illustration and (or) publishing software. The FGDC Digital Cartographic Standard for Geologic Map Symbolization contains descriptions, examples, cartographic specifications, and notes on usage for a wide variety of symbols that may be used on typical, general-purpose geologic maps and related products such as cross sections. The standard also can be used for different kinds of special-purpose or derivative map products and databases that may be focused on a specific geoscience topic (for example, slope stability) or class of features (for example, a fault map). The standard is scale-independent, meaning that the symbols are appropriate for use with geologic mapping compiled or published at any scale. It will be useful to anyone who either produces or uses geologic map information, whether in analog or digital form. Please be aware that this standard is not intended to be used inflexibly or in a manner that will limit one's ability to communicate the observations and interpretations gained from geologic mapping. In certain situations, a symbol or its usage might need to be modified in order to better represent a particular feature on a geologic map or cross section.
    This standard allows the use of any symbol that doesn't conflict with others in the standard, provided that it is clearly explained on the map and in the database. In addition, modifying the size, color, and (or) lineweight of an existing symbol to suit the needs of a particular map or output device also is permitted, provided that the modified symbol's appearance is not too similar to another symbol on the map. Be aware, however, that reducing lineweights below 0.125 mm (0.005 inch) may cause symbols to plot incorrectly when output at higher resolutions (1800 dpi or higher). For guidelines on symbol usage, as well as on color design and map labeling, please refer to the standard's introductory text. Also found there are informational sections covering concepts of geologic mapping and some definitions of geologic map features, as well as sections on the newly defined concepts and terminology for the scientific confidence and locational accuracy of geologic map features. More information on both the past development and the future maintenance of the FGDC Digital Cartographic Standard for Geologic Map Symbolization can be found at the FGDC Geologic Data Subcommittee website (http://ngmdb.usgs.gov/fgdc_gds/).

  10. Construct Maps as a Foundation for Standard Setting

    ERIC Educational Resources Information Center

    Wyse, Adam E.

    2013-01-01

    Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…

  11. Standard map in magnetized relativistic systems: fixed points and regular acceleration.

    PubMed

    de Sousa, M C; Steffens, F M; Pakter, R; Rizzato, F B

    2010-08-01

    We investigate the concept of a standard map for the interaction of relativistic particles and electrostatic waves of arbitrary amplitudes, under the action of external magnetic fields. The map is adequate for physical settings where waves and particles interact impulsively, and allows a series of analytical results to be obtained exactly. Unlike the traditional form of the standard map, the present map is nonlinear in the wave amplitude and displays a series of peculiar properties. Among these properties we discuss the relation involving fixed points of the map and accelerator regimes.
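
    For contrast with the relativistic map in record 11, the "traditional form of the standard map" the abstract mentions is the Chirikov map, which iterates a kicked momentum and angle. A minimal sketch of that classical map (not the paper's amplitude-nonlinear variant):

```python
import math

def chirikov_standard_map(theta, p, K, n_steps):
    """Iterate the traditional (Chirikov) standard map:
        p_{n+1}     = p_n + K * sin(theta_n)   (mod 2*pi)
        theta_{n+1} = theta_n + p_{n+1}        (mod 2*pi)
    K is the kick strength."""
    two_pi = 2.0 * math.pi
    for _ in range(n_steps):
        p = (p + K * math.sin(theta)) % two_pi
        theta = (theta + p) % two_pi
    return theta, p

# (theta, p) = (pi, 0) is a fixed point of this map for any kick strength K
theta, p = chirikov_standard_map(math.pi, 0.0, K=0.9, n_steps=100)
```

    Fixed points like this one are exactly the kind of structure the abstract analyzes in its relativistic, magnetized generalization.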

  12. Toward digital geologic map standards: a progress report

    USGS Publications Warehouse

    Ulrich, George E.; Reynolds, Mitchell W.; Taylor, Richard B.

    1992-01-01

    Establishing modern scientific and technical standards for geologic maps and their derivative map products is vital to both producers and users of such maps as we move into an age of digital cartography. Application of earth-science data in complex geographic information systems, acceleration of geologic map production, and reduction of production costs require that national standards be developed for digital geologic cartography and computer analysis. Since December 1988, under commission of the Chief Geologist of the U.S. Geological Survey and the mandate of the National Geologic Mapping Program (with added representation from the Association of American State Geologists), a committee has been designing a comprehensive set of scientific map standards. Three primary issues were: (1) selecting scientific symbology and its digital representation; (2) creating an appropriate digital coding system that characterizes geologic features with respect to their physical properties, stratigraphic and structural relations, spatial orientation, and interpreted mode of origin; and (3) developing mechanisms for reporting levels of certainty for descriptive as well as measured properties. Approximately 650 symbols for geoscience maps, reflecting present usage of the U.S. Geological Survey, state geological surveys, industry, and academia, have been identified and tentatively adopted. A proposed coding system comprises four-character groupings of major and minor codes that can identify all attributes of a geologic feature. Such a coding system allows unique identification of as many as 10^5 geologic names and values on a given map. The new standard will track closely the latest developments of the Proposed Standard for Digital Cartographic Data soon to be submitted to the National Institute of Standards and Technology by the Federal Interagency Coordinating Committee on Digital Cartography.
This standard will adhere generally to the accepted definitions and specifications for spatial data transfer. It will require separate specifications of digital cartographic quality relating to positional accuracy and ranges of measured and interpreted values such as geologic age and rock composition. Provisional digital geologic map standards will be published for trial implementation. After approximately two years, when comments on the proposed standards have been solicited and modifications made, formal adoption of the standards will be recommended. Widespread acceptance of the new standards will depend on their applicability to the broadest range of earth-science map products and their adaptability to changing cartographic technology.

  13. Geologic map of the Valjean Hills 7.5' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Calzia, J.P.; Troxel, Bennie W.; digital database by Raumann, Christian G.

    2003-01-01

    FGDC-compliant metadata for the ARC/INFO coverages. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3 above) or plotting the postscript file (2 above).

  14. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes), or any data that can be tied to the surface of a planetary body (including moons, comets, or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under research within the planetary geospatial community.
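
    The OGC services listed in record 14 share a simple request pattern; a WMS GetMap request, for example, is just a parameterized URL. The parameter names below follow the OGC WMS 1.3.0 specification, while the endpoint and layer name are placeholders, not real services.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL. Parameter names are
    from the WMS spec; the endpoint and layer are placeholders."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # min/max per CRS axis order
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical planetary WMS endpoint and mosaic layer name
url = wms_getmap_url("https://example.org/wms", "mars_mosaic",
                     bbox=(-90, -180, 90, 180), width=512, height=256)
```

    The same pattern (service, version, request, plus operation-specific parameters) carries over to the WMTS, WFS, WCS, and CSW services the record lists.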

  15. National Pipeline Mapping System (NPMS) : standards for creating pipeline location data : standards for electronic data submissions, including metadata standards and examples

    DOT National Transportation Integrated Search

    1997-07-14

    These standards represent a guideline for preparing digital data for inclusion in the National Pipeline Mapping System Repository. The standards were created with input from the pipeline industry and government agencies. They address the submission o...

  16. How to Evaluate Training.

    DTIC Science & Technology

    1982-09-30

    not changed because they are not subject to a careful evaluation. The solution: the four job aids contained in this manual provide specific techniques ... lesson plans, training design, or testing. This manual has been developed using the standards of the Information Mapping® writing service (Information Mapping, Inc.).

  17. Construct Maps: A Tool to Organize Validity Evidence

    ERIC Educational Resources Information Center

    McClarty, Katie Larsen

    2013-01-01

    The construct map is a promising tool for organizing the data standard-setting panelists interpret. The challenge in applying construct maps to standard-setting procedures will be the judicious selection of data to include within this organizing framework. Therefore, this commentary focuses on decisions about what to include in the construct map.…

  18. Coming Full Circle in Standard Setting: A Commentary on Wyse

    ERIC Educational Resources Information Center

    Skaggs, Gary

    2013-01-01

    The construct map is a particularly good way to approach instrument development, and this author states that he was delighted to read Adam Wyse's thoughts about how to use construct maps for standard setting. For a number of popular standard-setting methods, Wyse shows how typical feedback to panelists fits within a construct map framework.…

  19. Statistical density modification using local pattern matching

    DOEpatents

    Terwilliger, Thomas C.

    2007-01-23

    A computer-implemented method modifies an experimental electron density map. A set of selected known experimental and model electron density maps is provided, and standard templates of electron density are created from these maps by clustering and averaging values of electron density in a spherical region about each point in the grid that defines each map. Histograms are also created from the selected experimental and model electron density maps; these relate the value of electron density at the center of each spherical region to a correlation coefficient of the density surrounding the corresponding grid point in each of the standard templates. The standard templates and the histograms are applied to grid points on the experimental electron density map to form new estimates of electron density at each grid point.
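
    The per-point operation at the heart of the method in record 19, correlating the spherical region around a grid point with a standard template, can be sketched as follows. The clustering/averaging that builds the templates and the histogram step are omitted, and the density map here is synthetic.

```python
import numpy as np

def spherical_offsets(radius):
    """Integer grid offsets lying within a sphere of the given radius."""
    r = int(radius)
    offs = [(i, j, k)
            for i in range(-r, r + 1)
            for j in range(-r, r + 1)
            for k in range(-r, r + 1)
            if i * i + j * j + k * k <= radius * radius]
    return np.array(offs)

def template_correlation(density, center, template, offsets):
    """Correlation coefficient between the spherical region about
    `center` and a template of matching shape."""
    idx = offsets + np.asarray(center)
    region = density[idx[:, 0], idx[:, 1], idx[:, 2]]
    return float(np.corrcoef(region, template)[0, 1])

rng = np.random.default_rng(0)
rho = rng.normal(size=(16, 16, 16))  # synthetic density map
offs = spherical_offsets(2.0)
# A template cut from the map itself correlates perfectly with its region
tmpl = rho[8 + offs[:, 0], 8 + offs[:, 1], 8 + offs[:, 2]]
score = template_correlation(rho, (8, 8, 8), tmpl, offs)
```

    In the patented method this correlation score, together with the center density, indexes the histograms that yield the new density estimate at each grid point.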

  20. Modular avionics packaging standardization

    NASA Astrophysics Data System (ADS)

    Austin, M.; McNichols, J. K.

    The Modular Avionics Packaging (MAP) Program, which aims to package future military avionics systems so as to improve reliability, maintainability, and supportability and to reduce equipment life-cycle costs, is addressed. The basic MAP packaging concepts, called the Standard Avionics Module, the Standard Enclosure, and the Integrated Rack, are summarized, and the benefits of modular avionics packaging, including low-risk design, technology independence with common functions, improved maintainability, and reduced life-cycle costs, are discussed. Progress made in MAP is briefly reviewed.

  1. Diagnostic lumbosacral segmental nerve blocks with local anesthetics: a prospective double-blind study on the variability and interpretation of segmental effects.

    PubMed

    Wolff, A P; Groen, G J; Crul, B J

    2001-01-01

    Selective spinal nerve infiltration blocks are used diagnostically in patients with chronic low back pain radiating into the leg. Generally, a segmental nerve block is considered successful if the pain is reduced substantially. Hypesthesia and elicited paresthesias coinciding with the presumed segmental level are used as controls. The interpretation depends on a standard dermatomal map. However, it is not clear if this interpretation is reliable enough, because standard dermatomal maps do not show the overlap of neighboring dermatomes. The goal of the present study is to establish if dissimilarities exist between areas of hypesthesia, spontaneous pain reported by the patient, pain reduction by local anesthetics, and paresthesias elicited by sensory electrostimulation. A secondary goal is to determine to what extent the interpretation is improved when the overlaps of neighboring dermatomes are taken into account. Patients suffering from chronic low back pain with pain radiating into the leg underwent lumbosacral segmental nerve root blocks at subsequent levels on separate days. Lidocaine (2%, 0.5 mL) mixed with radiopaque fluid (0.25 mL) was injected after verifying the target location using sensory and motor electrostimulation. Sensory changes (pinprick method), paresthesias (reported by the patient), and pain reduction (Numeric Rating Scale) were reported. Hypesthesia and paresthesias were registered in a standard dermatomal map and in an adapted map which included overlap of neighboring dermatomes. The relationships between spinal level of injection, extent of hypesthesia, location of paresthesias, and corresponding dermatome were assessed quantitatively. Comparison of the results between both dermatomal maps was done by paired t-tests. After inclusion, data were processed for 40 segmental nerve blocks (L2-S1) performed in 29 patients. Pain reduction was achieved in 43%. 
    Hypesthetic areas showed a large variability in size and location, and also in comparison to paresthesias. The mean hypesthetic area amounted to 2.7 ± 1.4 (± SD; range, 0 to 6; standard map) and 3.6 ± 1.8 (0 to 6; adapted map; P < .001) dermatomes. Hypesthesia in the corresponding dermatome was found in 80% (standard map) and 88% of cases (adapted map; not significant). Paresthesias occurring in the corresponding dermatome were found in 80% (standard map) compared with 98% (adapted map; P < .001). In 85% (standard map) and 88% (adapted map) of cases, spontaneous pain was present in the dermatome corresponding to the level of local anesthetic injection. In 55% (standard map) versus 75% (adapted map; P < .005), a combination of spontaneous pain, hypesthesia, and paresthesias was found in the corresponding dermatome. Hypesthetic areas determined after lumbosacral segmental nerve blocks show a large variability in size and location compared with elicited paresthesias. Confirmation of an adequately performed segmental nerve block, determined by the coexistence of hypesthesia, elicited paresthesias, and pain in the presumed dermatome, is more reliable when the overlap of neighboring dermatomes is taken into account.

  2. Cartographic mapping study

    NASA Technical Reports Server (NTRS)

    Wilson, C.; Dye, R.; Reed, L.

    1982-01-01

    The errors associated with planimetric mapping of the United States using satellite remote sensing techniques are analyzed. Assumptions concerning the state of the art achievable for satellite mapping systems and platforms in the 1995 time frame are made. An analysis of these performance parameters is made using an interactive cartographic satellite computer model, after first validating the model using LANDSAT 1 through 3 performance parameters. An investigation of current large scale (1:24,000) US National mapping techniques is made. Using the results of this investigation, and current national mapping accuracy standards, the 1995 satellite mapping system is evaluated for its ability to meet US mapping standards for planimetric and topographic mapping at scales of 1:24,000 and smaller.

  3. Robust feature matching via support-line voting and affine-invariant ratios

    NASA Astrophysics Data System (ADS)

    Li, Jiayuan; Hu, Qingwu; Ai, Mingyao; Zhong, Ruofei

    2017-10-01

    Robust image matching is crucial for many applications of remote sensing and photogrammetry, such as image fusion, image registration, and change detection. In this paper, we propose a robust feature matching method based on support-line voting and affine-invariant ratios. We first use popular feature matching algorithms, such as SIFT, to obtain a set of initial matches. A support-line descriptor based on multiple adaptive binning gradient histograms is subsequently applied in the support-line voting stage to filter outliers. In addition, we use affine-invariant ratios computed by a two-line structure to refine the matching results and estimate the local affine transformation. The local affine model is more robust to distortions caused by elevation differences than the global affine transformation, especially for high-resolution remote sensing images and UAV images. Thus, the proposed method is suitable for both rigid and non-rigid image matching problems. Finally, we extract as many high-precision correspondences as possible based on the local affine extension and build a grid-wise affine model for remote sensing image registration. We compare the proposed method with six state-of-the-art algorithms on several data sets and show that our method significantly outperforms the other methods. The proposed method achieves 94.46% average precision on 15 challenging remote sensing image pairs, while the second-best method, RANSAC, only achieves 70.3%. In addition, the number of detected correct matches of the proposed method is approximately four times the number of initial SIFT matches.

  4. Standardized mappings--a framework to combine different semantic mappers into a standardized web-API.

    PubMed

    Neuhaus, Philipp; Doods, Justin; Dugas, Martin

    2015-01-01

    Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web-API is feasible. This framework can be easily enhanced due to its modular design.
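
    As a rough illustration of the framework's idea (the real system is a Java Spring application; the mapper names and codes below are made up for illustration), different semantic mappers can sit behind one standardized interface that returns JSON:

```python
import json

def similarity_mapper(term):
    # Stand-in for similarity search in a large table of terminology codes.
    table = {"blood pressure": "LAB-001"}
    return {"mapper": "similarity", "term": term, "code": table.get(term.lower())}

def curated_mapper(term):
    # Stand-in for lookup in a manually curated repository.
    repo = {"heart rate": "LAB-002"}
    return {"mapper": "curated", "term": term, "code": repo.get(term.lower())}

# Registry of pluggable mapping strategies behind one interface.
MAPPERS = {"similarity": similarity_mapper, "curated": curated_mapper}

def map_term(term, strategy="similarity"):
    """Dispatch to the requested mapper and return a JSON string."""
    return json.dumps(MAPPERS[strategy](term))
```

    Because every mapper honors the same signature and JSON output, new strategies can be added or swapped without changing the web-API surface, which is the modularity the paper highlights.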

  5. Mapping Ophthalmic Terms to a Standardized Vocabulary.

    ERIC Educational Resources Information Center

    Patrick, Timothy B.; Reid, John C.; Sievert, MaryEllen; Popescu, Mihail; Gigantelli, James W.; Shelton, Mark E.; Schiffman, Jade S.

    2000-01-01

    Describes work by the American Academy of Ophthalmology (AAO) to expand the standardized vocabulary, Systematized Nomenclature of Medicine (SNOMED), to accommodate a definitive ophthalmic standardized vocabulary. Mapped a practice-based clinical ophthalmic vocabulary to SNOMED and other vocabularies in the Metathesaurus of the Unified Medical…

  6. Review of USGS Open-file Report 95-525 ("Cartographic and digital standard for geologic map information") and plans for development of Federal draft standards for geologic map information

    USGS Publications Warehouse

    Soller, David R.

    1996-01-01

    This report summarizes a technical review of USGS Open-File Report 95-525, 'Cartographic and Digital Standard for Geologic Map Information' and OFR 95-526 (diskettes containing digital representations of the standard symbols). If you are considering the purchase or use of those documents, you should read this report first. For some purposes, OFR 95-525 (the printed document) will prove to be an excellent resource. However, technical review identified significant problems with the two documents that will be addressed by various Federal and State committees composed of geologists and cartographers, as noted below. Therefore, the 2-year review period noted in OFR 95-525 is no longer applicable. Until those problems are resolved and formal standards are issued, you may consult the following World-Wide Web (WWW) site which contains information about development of geologic map standards: URL: http://ncgmp.usgs.gov/ngmdbproject/home.html

  7. US Topo Product Standard

    USGS Publications Warehouse

    Cooley, Michael J.; Davis, Larry R.; Fishburn, Kristin A.; Lestinsky, Helmut; Moore, Laurence R.

    2011-01-01

    A full-size style sheet template in PDF that defines the placement of map elements, marginalia, and font sizes and styles accompanies this standard. The GeoPDF US Topo maps are fashioned to conform to this style sheet so that a user can print out a map at the 1:24,000-scale using the dimensions of the traditional standard 7.5-minute quadrangle. Symbology and type specifications for feature content are published separately. In addition, the GeoPDF design allows for custom printing, so that a user may zoom in and out, turn layers on and off, and view or print any combination of layers or any map portion at any desired scale.

  8. Effectiveness of Mind Mapping in English Teaching among VIII Standard Students

    ERIC Educational Resources Information Center

    Hallen, D.; Sangeetha, N.

    2015-01-01

    The aim of the study is to find out the effectiveness of mind mapping technique over conventional method in teaching English at high school level (VIII), in terms of Control and Experimental group. The sample of the study comprised, 60 VIII Standard students in Tiruchendur Taluk. Mind Maps and Achievement Test (Pretest & Posttest) were…

  9. Listening to Students: Customer Journey Mapping at Birmingham City University Library and Learning Resources

    ERIC Educational Resources Information Center

    Andrews, Judith; Eade, Eleanor

    2013-01-01

    Birmingham City University's Library and Learning Resources' strategic aim is to improve student satisfaction. A key element is the achievement of the Customer Excellence Standard. An important component of the standard is the mapping of services to improve quality. Library and Learning Resources has developed a methodology to map these…

  10. Map-IT! A Web-Based GIS Tool for Watershed Science Education.

    ERIC Educational Resources Information Center

    Curtis, David H.; Hewes, Christopher M.; Lossau, Matthew J.

    This paper describes the development of a prototypic, Web-accessible GIS solution for K-12 science education and citizen-based watershed monitoring. The server side consists of ArcView IMS running on an NT workstation. The client is built around MapCafe. The client interface, which runs through a standard Web browser, supports standard MapCafe…

  11. A Novel Imaging Technique (X-Map) to Identify Acute Ischemic Lesions Using Noncontrast Dual-Energy Computed Tomography.

    PubMed

    Noguchi, Kyo; Itoh, Toshihide; Naruto, Norihito; Takashima, Shutaro; Tanaka, Kortaro; Kuroda, Satoshi

    2017-01-01

    We evaluated whether the X-map, a novel imaging technique, can visualize ischemic lesions within 20 hours of onset in patients with acute ischemic stroke, using noncontrast dual-energy computed tomography (DECT). Six patients with acute ischemic stroke were included in this study. Noncontrast head DECT scans were acquired with 2 X-ray tubes operated at 80 kV and Sn150 kV between 32 minutes and 20 hours after onset. From these DECT scans, the X-map was reconstructed based on 3-material decomposition and compared with a simulated standard (120 kV) computed tomography (CT) image and diffusion-weighted imaging (DWI). In all 6 patients, the X-map was more sensitive than the simulated standard CT in identifying the lesions as areas of lower attenuation. The lesions on the X-map correlated well with those on DWI. In 3 of 6 patients, the X-map detected a transient decrease in attenuation in the peri-infarct area within 1 day after onset. The X-map is a powerful tool to supplement a simulated standard CT and characterize acute ischemic lesions; however, it cannot replace a simulated standard CT for diagnosing acute cerebral infarction.

  12. Advanced Map For Real-Time Process Control

    NASA Astrophysics Data System (ADS)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of the data exchanged among devices for distributed control are important in real-time process control, with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicast communication voluntarily and periodically, in the priority order of the data to be exchanged.

  13. Landsat Image Map Production Methods at the U. S. Geological Survey

    USGS Publications Warehouse

    Kidwell, R.D.; Binnie, D.R.; Martin, S.

    1987-01-01

    To maintain consistently high quality in satellite image map production, the U. S. Geological Survey (USGS) has developed standard procedures for the photographic and digital production of Landsat image mosaics, and for lithographic printing of multispectral imagery. This paper gives a brief review of the photographic, digital, and lithographic procedures currently in use for producing image maps from Landsat data. It is shown that consistency in the printing of image maps is achieved by standardizing the materials and procedures that affect the image detail and color balance of the final product. Densitometric standards are established by printing control targets using the pressplates, inks, pre-press proofs, and paper to be used for printing.

  14. Comparing Geologic Data Sets Collected by Planetary Analog Traverses and by Standard Geologic Field Mapping: Desert Rats Data Analysis

    NASA Technical Reports Server (NTRS)

    Feng, Wanda; Evans, Cynthia; Gruener, John; Eppler, Dean

    2014-01-01

    Geologic mapping involves interpreting relationships between identifiable units and landforms to understand the formative history of a region. Traditional field techniques are used to accomplish this on Earth. Mapping proves more challenging for other planets, which are studied primarily by orbital remote sensing and, less frequently, by robotic and human surface exploration. Systematic comparative assessments of geologic maps created by traditional mapping versus photogeology together with data from planned traverses are limited. The objective of this project is to produce a geologic map from data collected on the Desert Research and Technology Studies (RATS) 2010 analog mission using Apollo-style traverses in conjunction with remote sensing data. This map is compared with a geologic map produced using standard field techniques.

  15. Cartographic standards to improve maps produced by the Forest Inventory and Analysis program

    Treesearch

    Charles H. (Hobie) Perry; Mark D. Nelson

    2009-01-01

    The Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is incorporating an increasing number of cartographic products in reports, publications, and presentations. To create greater quality and consistency within the national FIA program, a Geospatial Standards team developed cartographic design standards for FIA map...

  16. Field methods and data processing techniques associated with mapped inventory plots

    Treesearch

    William A. Bechtold; Stanley J. Zarnoch

    1999-01-01

    The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...

  17. Toward standardized mapping for left atrial analysis and cardiac ablation guidance

    NASA Astrophysics Data System (ADS)

    Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.

    2014-03-01

    In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure, and for this reason, have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population based studies. Standardized maps are important tools for characterizing anatomic variability across subjects with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects which are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the location of the inferior ostia have higher variability than the superior ostia and the variability of the left atrial appendage is similar to the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.
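
    A greatly simplified sketch of the flat-mapping idea follows. The paper's actual method casts a single ray through the volume and systematically rotates the camera viewpoint; this stand-in merely expresses each surface vertex in spherical angles about the centroid, and all names are illustrative.

```python
import numpy as np

def flat_map_coords(points):
    """Map 3D surface points to 2D (azimuth, elevation) flat-map coordinates.

    points: (N, 3) array of left atrial surface vertices.
    Returns an (N, 2) array of angles in radians, giving each vertex a
    location in a standardized 2D space.
    """
    p = points - points.mean(axis=0)          # center on the chamber centroid
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    az = np.arctan2(y, x)                     # azimuth in [-pi, pi]
    el = np.arctan2(z, np.hypot(x, y))        # elevation in [-pi/2, pi/2]
    return np.column_stack([az, el])
```

    Once every subject's anatomy lives in the same 2D space, landmark locations (such as the pulmonary vein ostia) can be overlaid across subjects to build the probabilistic maps described above.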

  18. Elementary maps on nest algebras

    NASA Astrophysics Data System (ADS)

    Li, Pengtong

    2006-08-01

    Let A and B be algebras and let M : A -> B and M* : B -> A be maps. An elementary map of A x B is an ordered pair (M, M*) such that M(aM*(b)a) = M(a)bM(a) and M*(bM(a)b) = M*(b)aM*(b) for all a in A and b in B. In this paper, the general form of surjective elementary maps on standard subalgebras of nest algebras is described. In particular, such maps are automatically additive.

  19. Standard for the U.S. Geological Survey Historical Topographic Map Collection

    USGS Publications Warehouse

    Allord, Gregory J.; Fishburn, Kristin A.; Walter, Jennifer L.

    2014-01-01

    This document defines the digital map product of the U.S. Geological Survey (USGS) Historical Topographic Map Collection (HTMC). The HTMC is a digital archive of about 190,000 printed topographic quadrangle maps published by the USGS from the inception of the topographic mapping program in 1884 until the last paper topographic map using lithographic printing technology was published in 2006. The HTMC provides a comprehensive digital repository of all scales and all editions of USGS printed topographic maps that is easily discovered, browsed, and downloaded by the public at no cost. Each printed topographic map is scanned “as is” and captures the content and condition of each map. The HTMC provides ready access to maps that are no longer available for distribution in print. A new generation of topographic maps called “US Topo” was defined in 2009. US Topo maps, though modeled on the legacy 7.5-minute topographic maps, conform to different standards. For more information on the HTMC, see the project Web site at: http://nationalmap.gov/historical/.

  20. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    PubMed

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from the Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and to evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way to conduct dynamic mapping.
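
    The precision/recall evaluation against a gold standard can be sketched as follows (an illustrative helper, not the MTERMS code): terms the mapper leaves uncoded count against recall but not against precision.

```python
def precision_recall(automated, gold):
    """Precision and recall of an automated term->code mapping vs a gold standard.

    automated, gold: dicts of {term: code}. A term mapped to None in
    `automated` is treated as unmapped (a miss, not a wrong proposal).
    """
    proposed = {t: c for t, c in automated.items() if c is not None}
    correct = sum(1 for t, c in proposed.items() if gold.get(t) == c)
    precision = correct / len(proposed) if proposed else 0.0
    recall = correct / len(gold) if gold else 0.0
    return precision, recall
```

    Under this convention a conservative mapper that proposes few codes can reach very high precision while recall stays modest, which matches the 99.8% / 73.9% pattern reported above.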

  1. USGS standard quadrangle maps for emergency response

    USGS Publications Warehouse

    Moore, Laurence R.

    2009-01-01

    The 1:24,000-scale topographic quadrangle was the primary product of the U.S. Geological Survey's (USGS) National Mapping Program from 1947 to 1992. This map series includes about 54,000 map sheets for the conterminous United States and is the only uniform map series ever produced that covers this area at such a large scale. The series was partially revised under several programs, starting as early as 1968, but these programs were not adequate to keep the series current. Through the 1990s the emphasis of the USGS mapping program shifted away from topographic maps and toward more specialized digital data products. Topographic map revision dropped off rapidly after 1999 and stopped completely by 2004. Since 2001, emergency-response and homeland-security requirements have revived the question of whether a standard national topographic series is needed. Emergencies such as Hurricane Katrina in 2005 and the California wildfires in 2007-08 demonstrated that familiar maps are important to first responders. Maps that have a standard scale, extent, and grids help reduce confusion and save time in emergencies. Traditional maps are designed to allow the human brain to quickly process large amounts of information, and depend on artistic layout and design that cannot be fully automated. In spite of technical advances, creating a traditional, general-purpose topographic map is still expensive. Although the content and layout of traditional topographic maps probably are still desirable, the preferred packaging and delivery of maps has changed. Digital image files are now desired by most users, but to be useful to the emergency-response community, these files must be easy to view and easy to print without specialized geographic information system expertise or software.

  2. Geologic map of the Sunnymead 7.5' quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Matti, Jonathan C.

    2001-01-01

    a. This Readme; includes in Appendix I, data contained in sun_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  3. Geologic map of the Cucamonga Peak 7.5' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Morton, D.M.; Matti, J.C.; Digital preparation by Koukladas, Catherine; Cossette, P.M.

    2001-01-01

    a. This Readme; includes in Appendix I, data contained in fif_met.txt b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat pagesize setting influences map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U. S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Cucamonga Peak 7.5’ topographic quadrangle in conjunction with the geologic map.

  4. Geologic map of the Telegraph Peak 7.5' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Morton, D.M.; Woodburne, M.O.; Foster, J.H.; Morton, Gregory; Cossette, P.M.

    2001-01-01

    a. This Readme; includes in Appendix I, data contained in fif_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat pagesize setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U. S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Telegraph Peak 7.5’ topographic quadrangle in conjunction with the geologic map.

  5. Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.

    PubMed

    Ivory, Catherine H

    2016-07-01

    The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.

  6. Fast multidimensional ensemble empirical mode decomposition for the analysis of big spatio-temporal datasets.

    PubMed

    Wu, Zhaohua; Feng, Jiaxin; Qiao, Fangli; Tan, Zhe-Min

    2016-04-13

    In this big data era, it is more urgent than ever to solve two major issues: (i) fast data transmission methods that can facilitate access to data from non-local sources and (ii) fast and efficient data analysis methods that can reveal the key information in the available data for particular purposes. Although approaches to these two questions may differ significantly across fields, the common part must involve data compression techniques and a fast algorithm. This paper introduces the recently developed adaptive and spatio-temporally local analysis method, namely the fast multidimensional ensemble empirical mode decomposition (MEEMD), for the analysis of a large spatio-temporal dataset. The original MEEMD uses ensemble empirical mode decomposition to decompose the time series at each spatial grid point and then pieces together the temporal-spatial evolution of climate variability and change on naturally separated timescales, which is computationally expensive. By taking advantage of the efficiency of principal component analysis/empirical orthogonal function analysis in representing spatio-temporally coherent data, we design a lossy compression method for climate data to facilitate its non-local transmission. We also explain the basic principles behind the fast MEEMD, which decomposes principal components instead of the original grid-wise time series to speed up the computation. Using a typical climate dataset as an example, we demonstrate that the newly designed methods can (i) compress data by one to two orders of magnitude and (ii) speed up the MEEMD algorithm by one to two orders of magnitude.
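
    The compression step can be sketched with a truncated PCA/EOF decomposition (an illustrative stand-in, not the authors' implementation): keeping only the leading principal components yields a lossy, low-rank representation, and the handful of PC time series can then be decomposed instead of every grid-point series.

```python
import numpy as np

def pca_compress(data, n_modes):
    """Compress a (time, space) dataset by truncated SVD (PCA/EOF analysis).

    Returns the rank-n_modes reconstruction, the PC time series, and the
    spatial patterns (EOFs). Decomposing the few PC series instead of
    every grid-point series is the source of the fast-MEEMD speedup.
    """
    mean = data.mean(axis=0)
    anomaly = data - mean                        # remove the time mean
    U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
    pcs = U[:, :n_modes] * s[:n_modes]           # (time, n_modes) PC series
    eofs = Vt[:n_modes]                          # (n_modes, space) patterns
    recon = mean + pcs @ eofs                    # lossy low-rank reconstruction
    return recon, pcs, eofs
```

    For spatio-temporally coherent fields, a small number of modes captures most of the variance, which is what makes compression rates of one to two orders of magnitude plausible.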

  7. Report on the Project for Establishment of the Standardized Korean Laboratory Terminology Database, 2015.

    PubMed

    Jung, Bo Kyeung; Kim, Jeeyong; Cho, Chi Hyun; Kim, Ju Yeon; Nam, Myung Hyun; Shin, Bong Kyung; Rho, Eun Youn; Kim, Sollip; Sung, Heungsup; Kim, Shinyoung; Ki, Chang Seok; Park, Min Jung; Lee, Kap No; Yoon, Soo Young

    2017-04-01

    The National Health Information Standards Committee was established in Korea in 2004. Its practical subcommittee for laboratory test terminology was placed in charge of standardizing laboratory medicine terminology in Korean. We aimed to establish a standardized Korean laboratory terminology database, Korea-Logical Observation Identifiers Names and Codes (K-LOINC), based on former products sponsored by this committee. The primary product was revised based on the opinions of specialists. Next, we mapped the electronic data interchange (EDI) codes, revised in 2014, to the corresponding K-LOINC entries. We established a database of synonyms, including the laboratory codes of three reference laboratories and four tertiary hospitals in Korea. Furthermore, we supplemented the clinical microbiology section of K-LOINC using an alternative mapping strategy, and we examined other systems that utilize laboratory codes to assess the compatibility of K-LOINC with statistical standards for a number of tests. A total of 48,990 laboratory codes were adopted (21,539 new and 16,330 revised). All of the LOINC synonyms were translated into Korean, and 39,347 Korean synonyms were added. Moreover, 21,773 synonyms were added from the reference laboratories and tertiary hospitals. Alternative strategies were established for mapping within the microbiology domain; when we applied these to a smaller hospital, the mapping rate increased successfully. Finally, we confirmed K-LOINC compatibility with other statistical standards, including a newly proposed EDI code system. This project successfully established an up-to-date standardized Korean laboratory terminology database, as well as an updated EDI mapping, to facilitate the introduction of standard terminology into institutions.

  8. A comparative survey of current and proposed tropospheric refraction-delay models for DSN radio metric data calibration

    NASA Technical Reports Server (NTRS)

    Estefan, J. A.; Sovers, O. J.

    1994-01-01

    The standard tropospheric calibration model implemented in the operational Orbit Determination Program is the seasonal model developed by C. C. Chao in the early 1970s. The seasonal model has seen only slight modification since its release, particularly in the format and content of the zenith delay calibrations. Chao's most recent standard mapping tables, which are used to project the zenith delay calibrations along the station-to-spacecraft line of sight, have not been modified since they were first published in late 1972. This report focuses principally on proposed upgrades to the zenith delay mapping process, although modeling improvements to the zenith delay calibration process are also discussed. A number of candidate approximation models for the tropospheric mapping are evaluated, including the semi-analytic mapping function of Lanyi and the semi-empirical mapping functions of Davis et al. ('CfA-2.2'), of Ifadis (global solution model), of Herring ('MTT'), and of Niell ('NMF'). All of the candidate mapping functions are superior to the Chao standard mapping tables and approximation formulas when evaluated against the current Deep Space Network Mark 3 intercontinental very long baseline interferometry database.
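
    For reference, a Chao-style mapping function projects a zenith delay down to an arbitrary elevation angle. The sketch below uses the commonly quoted Chao (1972) dry and wet coefficients; these values are assumptions that should be checked against the report before any use.

```python
import math

# Commonly quoted Chao (1972) coefficients (a, b); treat as assumptions.
DRY = (0.00143, 0.0445)
WET = (0.00035, 0.017)

def chao_mapping(elev_rad, a, b):
    """Chao-style mapping factor m(E) = 1 / (sin E + a / (tan E + b)).

    Multiplying a zenith delay by m(E) projects it to elevation E (radians).
    At zenith the factor is ~1; it grows rapidly at low elevations.
    """
    return 1.0 / (math.sin(elev_rad) + a / (math.tan(elev_rad) + b))

def slant_delay(zenith_dry_m, zenith_wet_m, elev_rad):
    """Total line-of-sight tropospheric delay in meters."""
    return (zenith_dry_m * chao_mapping(elev_rad, *DRY)
            + zenith_wet_m * chao_mapping(elev_rad, *WET))
```

    The semi-empirical alternatives evaluated in the report (CfA-2.2, MTT, NMF, and others) keep this continued-fraction shape but make the coefficients depend on meteorology, season, or site location.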

  9. Implementation of fast macromolecular proton fraction mapping on 1.5 and 3 Tesla clinical MRI scanners: preliminary experience

    NASA Astrophysics Data System (ADS)

    Yarnykh, V.; Korostyshevskaya, A.

    2017-08-01

    Macromolecular proton fraction (MPF) is a biophysical parameter describing the amount of macromolecular protons involved into magnetization exchange with water protons in tissues. MPF represents a significant interest as a magnetic resonance imaging (MRI) biomarker of myelin for clinical applications. A recent fast MPF mapping method enabled clinical translation of MPF measurements due to time-efficient acquisition based on the single-point constrained fit algorithm. However, previous MPF mapping applications utilized only 3 Tesla MRI scanners and modified pulse sequences, which are not commonly available. This study aimed to test the feasibility of MPF mapping implementation on a 1.5 Tesla clinical scanner using standard manufacturer’s sequences and compare the performance of this method between 1.5 and 3 Tesla scanners. MPF mapping was implemented on 1.5 and 3 Tesla MRI units of one manufacturer with either optimized custom-written or standard product pulse sequences. Whole-brain three-dimensional MPF maps obtained from a single volunteer were compared between field strengths and implementation options. MPF maps demonstrated similar quality at both field strengths. MPF values in segmented brain tissues and specific anatomic regions appeared in close agreement. This experiment demonstrates the feasibility of fast MPF mapping using standard sequences on 1.5 T and 3 T clinical scanners.

  10. Preliminary geologic map of the Elsinore 7.5' Quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Weber, F. Harold; Digital preparation: Alvarez, Rachel M.; Burns, Diane

    2003-01-01

    Open-File Report 03-281 contains a digital geologic map database of the Elsinore 7.5’ quadrangle, Riverside County, California that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A Postscript file to plot the geologic map on a topographic base, and containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. Portable Document Format (.pdf) files of: a. This Readme; includes in Appendix I, data contained in els_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  11. Preliminary geologic map of the northeast Dillingham quadrangle (D-1, D-2, C-1, and C-2), Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Hudson, Travis L.; Grybeck, Donald; Stoeser, Douglas B.; Preller, Cindi C.; Bickerstaff, Damon; Labay, Keith A.; Miller, Martha L.

    2003-01-01

    The Correlation of Map Units and Description of Map Units are in a format similar to that of the USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the Stratigraphic Nomenclature of the U.S. Geological Survey. ARC/INFO symbolsets (shade and line) as used for these maps have been made available elsewhere as part of Geologic map of Central (Interior) Alaska, published as a USGS Open-File Report (Wilson and others, 1998, http://geopubs.wr.usgs.gov/open-file/of98-133-a/). This product does not include the digital topographic base or land-grid files used to produce the map, nor does it include the AML and related ancillary key and other files used to assemble the components of the map.

  12. US EPA Nonattainment Areas and Designations-Annual PM2.5 (1997 NAAQS)

    EPA Pesticide Factsheets

This web service contains the following layers: PM2.5 Annual 1997 NAAQS State Level and PM2.5 Annual 1997 NAAQS National. It also contains the following tables: maps99.FRED_MAP_VIEWER.%fred_area_map_data and maps99.FRED_MAP_VIEWER.%fred_area_map_view. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA1997PM25Annual/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there

  13. Geologic map of the Riverside East 7.5' quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Cox, Brett F.

    2001-01-01

    a. This Readme; includes in Appendix I, data contained in rse_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  14. Geologic map of the Corona North 7.5' quadrangle, Riverside and San Bernardino counties, California

    USGS Publications Warehouse

    Morton, Douglas M.; Gray, C.H.; Bovard, Kelly R.; Dawson, Michael

    2002-01-01

a. This Readme; includes in Appendix I, data contained in crn_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  15. Geologic map of the Corona South 7.5' quadrangle, Riverside and Orange counties, California

    USGS Publications Warehouse

    Gray, C.H.; Morton, Douglas M.; Weber, F. Harold; Digital preparation by Bovard, Kelly R.; O'Brien, Timothy

    2002-01-01

    a. A Readme file; includes in Appendix I, data contained in crs_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  16. Geologic map of the Lake Mathews 7.5' quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Weber, F. Harold

    2001-01-01

a. This Readme; includes in Appendix I, data contained in lkm_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  17. Geologic map of the Steele Peak 7.5' quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; digital preparation by Alvarez, Rachel M.; Diep, Van M.

    2001-01-01

    a. This Readme; includes in Appendix I, data contained in stp_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  18. Geologic map of the Riverside West 7.5' quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Cox, Brett F.

    2001-01-01

a. This Readme; includes in Appendix I, data contained in rsw_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).

  19. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (i.e., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTiff, GeoJpeg2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access for higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications and public use.
Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.

  20. Standardization of mapping practices in the British Geological Survey

    NASA Astrophysics Data System (ADS)

    Allen, Peter M.

    1997-07-01

    Because the British Geological Survey (BGS) has had, since its foundation in 1835, a mandate to produce geological maps for the whole of Great Britain, there is a long history of introducing standard practices in the way rocks and rock units have been named, classified and illustrated on maps. The reasons for the failure of some of these practices are examined and assessed in relation to the needs of computerized systems for holding and disseminating geological information.

  1. RIT Stability through the Transition to Common Core-Aligned MAP® Tests. How Using MAP to Measure Student Learning Growth is Reliable Now and in 2014

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2013

    2013-01-01

    While many educators expect the Common Core State Standards (CCSS) to be more rigorous than previous state standards, some wonder if the transition to CCSS and to a Common Core aligned MAP test will have an impact on their students' RIT scores or the NWEA norms. MAP assessments use a proprietary scale known as the RIT (Rasch unit) scale to measure…

  2. Geologic map and digital database of the Romoland 7.5' quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Digital preparation by Bovard, Kelly R.; Morton, Gregory

    2003-01-01

Portable Document Format (.pdf) files of: This Readme; includes in Appendix I, data contained in rom_met.txt The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). This Readme file describes the digital data, such as types and general contents of files making up the database, and includes information on how to extract and plot the map and accompanying graphic file. Metadata information can be accessed at http://geo-nsdi.er.usgs.gov/metadata/open-file/03-102 and is included in Appendix I of this Readme.

  3. The First Global Geological Map of Mercury

    NASA Astrophysics Data System (ADS)

    Prockter, L. M.; Head, J. W., III; Byrne, P. K.; Denevi, B. W.; Kinczyk, M. J.; Fassett, C.; Whitten, J. L.; Thomas, R.; Ernst, C. M.

    2015-12-01

    Geological maps are tools with which to understand the distribution and age relationships of surface geological units and structural features on planetary surfaces. Regional and limited global mapping of Mercury has already yielded valuable science results, elucidating the history and distribution of several types of units and features, such as regional plains, tectonic structures, and pyroclastic deposits. To date, however, no global geological map of Mercury exists, and there is currently no commonly accepted set of standardized unit descriptions and nomenclature. With MESSENGER monochrome image data, we are undertaking the global geological mapping of Mercury at the 1:15M scale applying standard U.S. Geological Survey mapping guidelines. This map will enable the development of the first global stratigraphic column of Mercury, will facilitate comparisons among surface units distributed discontinuously across the planet, and will provide guidelines for mappers so that future mapping efforts will be consistent and broadly interpretable by the scientific community. To date we have incorporated three major datasets into the global geological map: smooth plains units, tectonic structures, and impact craters and basins >20 km in diameter. We have classified most of these craters by relative age on the basis of the state of preservation of morphological features and standard classification schemes first applied to Mercury by the Mariner 10 imaging team. Additional datasets to be incorporated include intercrater plains units and crater ejecta deposits. In some regions MESSENGER color data is used to supplement the monochrome data, to help elucidate different plains units. The final map will be published online, together with a peer-reviewed publication. Further, a digital version of the map, containing individual map layers, will be made publicly available for use within geographic information systems (GISs).

  4. A standard photomap of ovarian nurse cell chromosomes and inversion polymorphism in Anopheles beklemishevi.

    PubMed

    Artemov, Gleb N; Gordeev, Mikhail I; Kokhanenko, Alina A; Moskaev, Anton V; Velichevskaya, Alena I; Stegniy, Vladimir N; Sharakhov, Igor V; Sharakhova, Maria V

    2018-03-27

Anopheles beklemishevi is a member of the Maculipennis group of malaria mosquitoes with the most northern distribution of any member of the group. Although a cytogenetic map for the larval salivary gland chromosomes of this species has been developed, a high-quality standard cytogenetic photomap that enables genomics and population genetics studies of this mosquito at the adult stage is still lacking. In this study, a cytogenetic map for the polytene chromosomes of An. beklemishevi from ovarian nurse cells was developed using high-resolution digital imaging from field-collected mosquitoes. PCR-amplified DNA probes for fluorescence in situ hybridization (FISH) were designed based on the genome of An. atroparvus. The DNA probe obtained by microdissection procedures from the breakpoint region was labelled in a DOP-PCR reaction. Population analysis was performed on 371 specimens collected in 18 locations. We report the development of a high-quality standard photomap for the polytene chromosomes from ovarian nurse cells of An. beklemishevi. To confirm the suitability of the map for physical mapping, several PCR-amplified probes were mapped to the chromosomes of An. beklemishevi using FISH. In addition, we identified and mapped DNA probes to flanking regions of the breakpoints of two inversions on chromosome X of this species. Inversion polymorphism was determined in 13 geographically distant populations of An. beklemishevi. Four polymorphic inversions were detected. The positions of common chromosomal inversions were indicated on the map. The study constructed a standard photomap for ovarian nurse cell chromosomes of An. beklemishevi and tested its suitability for physical genome mapping and population studies. Cytogenetic analysis determined inversion polymorphism in natural populations of An. beklemishevi related to this species' adaptation.

  5. Use Standards to Draw Curriculum Maps

    ERIC Educational Resources Information Center

    Franklin, Pat; Stephens, Claire Gatrell

    2009-01-01

    Specific curriculum is taught at every grade level and it is the job of library media specialists to know subject area content. Library media specialists should develop collections to meet the content associated with curriculum standards. To ensure that these collections meet school needs, collection mapping with specific curriculum related to…

  6. The Role of Construct Maps in Standard Setting

    ERIC Educational Resources Information Center

    Kane, Michael T.; Tannenbaum, Richard J.

    2013-01-01

    The authors observe in this commentary that construct maps can help standard-setting panels to make realistic and internally consistent recommendations for performance-level descriptions (PLDs) and cut-scores, but the benefits may not be realized if policymakers do not fully understand the rationale for the recommendations provided by the…

  7. Using Clouds for MapReduce Measurement Assignments

    ERIC Educational Resources Information Center

    Rabkin, Ariel; Reiss, Charles; Katz, Randy; Patterson, David

    2013-01-01

    We describe our experiences teaching MapReduce in a large undergraduate lecture course using public cloud services and the standard Hadoop API. Using the standard API, students directly experienced the quality of industrial big-data tools. Using the cloud, every student could carry out scalability benchmarking assignments on realistic hardware,…

  8. An XML transfer schema for exchange of genomic and genetic mapping data: implementation as a web service in a Taverna workflow.

    PubMed

    Paterson, Trevor; Law, Andy

    2009-08-14

Genomic analysis, particularly for less well-characterized organisms, is greatly assisted by performing comparative analyses between different types of genome maps and across species boundaries. Various providers publish a plethora of on-line resources collating genome mapping data from a multitude of species. Datasources range in scale and scope from small bespoke resources for particular organisms, through larger web-resources containing data from multiple species, to large-scale bioinformatics resources providing access to data derived from genome projects for model and non-model organisms. The heterogeneity of information held in these resources reflects both the technologies used to generate the data and the target users of each resource. Currently there is no common information exchange standard or protocol to enable access and integration of these disparate resources. Consequently data integration and comparison must be performed in an ad hoc manner. We have developed a simple generic XML schema (GenomicMappingData.xsd - GMD) to allow export and exchange of mapping data in a common lightweight XML document format. This schema represents the various types of data objects commonly described across mapping datasources and provides a mechanism for recording relationships between data objects. The schema is sufficiently generic to allow representation of any map type (for example genetic linkage maps, radiation hybrid maps, sequence maps and physical maps). It also provides mechanisms for recording data provenance and for cross referencing external datasources (including, for example, ENSEMBL, PubMed and GenBank). The schema is extensible via the inclusion of additional datatypes, which can be achieved by importing further schemas, e.g. a schema defining relationship types. We have built demonstration web services that export data from our ArkDB database according to the GMD schema, facilitating the integration of data retrieval into Taverna workflows.
The data exchange standard we present here provides a useful generic format for transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for inclusion of additional data and provides a mechanism for typing mapping objects via third party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically-controlled access to the data.

  9. An XML transfer schema for exchange of genomic and genetic mapping data: implementation as a web service in a Taverna workflow

    PubMed Central

    Paterson, Trevor; Law, Andy

    2009-01-01

Background Genomic analysis, particularly for less well-characterized organisms, is greatly assisted by performing comparative analyses between different types of genome maps and across species boundaries. Various providers publish a plethora of on-line resources collating genome mapping data from a multitude of species. Datasources range in scale and scope from small bespoke resources for particular organisms, through larger web-resources containing data from multiple species, to large-scale bioinformatics resources providing access to data derived from genome projects for model and non-model organisms. The heterogeneity of information held in these resources reflects both the technologies used to generate the data and the target users of each resource. Currently there is no common information exchange standard or protocol to enable access and integration of these disparate resources. Consequently data integration and comparison must be performed in an ad hoc manner. Results We have developed a simple generic XML schema (GenomicMappingData.xsd – GMD) to allow export and exchange of mapping data in a common lightweight XML document format. This schema represents the various types of data objects commonly described across mapping datasources and provides a mechanism for recording relationships between data objects. The schema is sufficiently generic to allow representation of any map type (for example genetic linkage maps, radiation hybrid maps, sequence maps and physical maps). It also provides mechanisms for recording data provenance and for cross referencing external datasources (including, for example, ENSEMBL, PubMed and GenBank). The schema is extensible via the inclusion of additional datatypes, which can be achieved by importing further schemas, e.g. a schema defining relationship types. 
We have built demonstration web services that export data from our ArkDB database according to the GMD schema, facilitating the integration of data retrieval into Taverna workflows. Conclusion The data exchange standard we present here provides a useful generic format for transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for inclusion of additional data and provides a mechanism for typing mapping objects via third party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically-controlled access to the data. PMID:19682365

  10. Critical thinking in graduate medical education: A role for concept mapping assessment?

    PubMed

    West, D C; Pomeroy, J R; Park, J K; Gerstenberger, E A; Sandoval, J

    2000-09-06

    Tools to assess the evolving conceptual framework of physicians-in-training are limited, despite their critical importance to physicians' evolving clinical expertise. Concept mapping assessment (CMA) enables teachers to view students' organization of their knowledge at various points in training. To assess whether CMA reflects expected differences and changes in the conceptual framework of resident physicians, whether concept maps can be scored reliably, and how well CMA scores relate to the results of standard in-training examination. A group of 21 resident physicians (9 first-year and 12 second- and third-year residents) from a university-based pediatric training program underwent concept map training, drew a preinstruction concept map about seizures, completed an education course on seizures, and then drew a postinstruction map. Maps were scored independently by 3 raters using a standardized method. The study was conducted in May and June 1999. Preinstruction map total scores and subscores in 4 categories compared with postinstruction map scores; map scores of second- and third-year residents compared with first-year residents; and interrater correlation of map scores. Total CMA scores increased after instruction from a mean (SD) preinstruction map score of 429 (119) to a mean postinstruction map score of 516 (196) (P =.03). Second- and third-year residents scored significantly higher than first-year residents before instruction (mean [SD] score of 472 [116] vs 371 [102], respectively; P =.04), but not after instruction (mean [SD] scores, 561 [203] vs 456 [179], respectively; P =.16). Second- and third-year residents had greater preinstruction map complexity as measured by cross-link score (P =.01) than first-year residents. The CMA score had a weak to no correlation with the American Board of Pediatrics In-training Examination score (r = 0.10-0.54). 
Interrater correlation of map scoring ranged from weak to moderate for the preinstruction map (r = 0.51-0.69) and moderate to strong for the postinstruction map (r = 0.74-0.88). Our data provide preliminary evidence that concept mapping assessment reflects expected differences and change in the conceptual framework of resident physicians. Concept mapping assessment and standardized testing may measure different cognitive domains. JAMA. 2000;284:1105-1110

  11. Mapping Norway - a Method to Register and Survey the Status of Accessibility

    NASA Astrophysics Data System (ADS)

    Michaelis, Sven; Bögelsack, Kathrin

    2018-05-01

    The Norwegian mapping authority has developed a standard method for mapping accessibility, primarily for people with limited or no walking ability, in urban and recreational areas. We chose an object-oriented approach in which points, lines and polygons represent objects in the environment. All data are stored in a geospatial database so they can be presented as a web map and analyzed using GIS software. By the end of 2016, more than 160 municipalities had been mapped using this method. The aim of the project is to establish a national standard for mapping accessibility and to provide a geodatabase that shows the status of accessibility throughout Norway. The data provide a useful tool for national statistics, local planning authorities and private users. First results show that accessibility is low and that Norway still faces many challenges in meeting the government's goals for Universal Design.

  12. Ordered versus Unordered Map for Primitive Data Types

    DTIC Science & Technology

    2015-09-01

    mapped to some element. C++ provides two types of map containers within the standard template library, the std::map and the std::unordered_map...classes. As the name implies, the containers' main functional difference is that the elements in the std::map are ordered by the key, and the elements in the std::unordered_map are not ordered by their key. The std::unordered_map elements are placed into “buckets” based on a hash value computed for their key

  13. Tuberculosis disease mapping in Kedah using standardized morbidity ratio

    NASA Astrophysics Data System (ADS)

    Diah, Ijlal Mohd; Aziz, Nazrina; Kasim, Maznah Mat

    2017-10-01

    This paper presents the results of relative risk estimation applied to TB data in Kedah using the most common approach, the Standardized Morbidity Ratio (SMR). Disease mapping has been recognized as one of the methods that can be used by government and public health agencies to control diseases, since it can give a clear picture of risk areas. To produce a good disease map, relative risk estimation is an important issue that needs to be considered. TB risk areas can be recognized through the map. From the results, Kulim shows the lowest risk of contracting TB, while Kota Setar has the highest.

  14. Landsat for practical forest type mapping - A test case

    NASA Technical Reports Server (NTRS)

    Bryant, E.; Dodge, A. G., Jr.; Warren, S. D.

    1980-01-01

    Computer-classified Landsat maps are compared with a recent conventional inventory of forest lands in northern Maine. Over the 196,000-hectare area mapped, estimates of the areas of softwood, mixed wood and hardwood forest obtained by a supervised classification of the Landsat data and a standard inventory based on aerial photointerpretation, probability proportional to prediction, field sampling and a standard forest measurement program are found to agree to within 5%. The cost of the Landsat maps is estimated to be $0.065/hectare. It is concluded that satellite techniques are worth developing for forest inventories, although they are not yet refined enough to be incorporated into current practical inventories.

  15. Comparison of PV signal quality using a novel circular mapping and ablation catheter versus a standard circular mapping catheter.

    PubMed

    von Bary, Christian; Fredersdorf-Hahn, Sabine; Heinicke, Norbert; Jungbauer, Carsten; Schmid, Peter; Riegger, Günter A; Weber, Stefan

    2011-08-01

    Recently, new catheter technologies have been developed for atrial fibrillation (AF) ablation. We investigate the diagnostic accuracy of a circular mapping and pulmonary vein ablation catheter (PVAC) compared with a standard circular mapping catheter (Orbiter) and the influence of filter settings on signal quality. After reconstruction of the left atrium by three-dimensional atriography, baseline PV potentials (PVP) were recorded consecutively with PVAC and Orbiter in 20 patients with paroxysmal AF. PVPs were compared and attributed to predefined anatomical PV segments. Ablation was performed in 80 PVs using the PVAC. If isolation of the PVs was assumed, signal assessment of each PV was repeated with the Orbiter. If residual PV potentials could be uncovered, different filter settings were tested to improve mapping quality of the PVAC. Ablation was continued until complete PV isolation (PVI) was confirmed with the Orbiter. Baseline mapping demonstrated a good correlation between the Orbiter and PVAC. Mapping accuracy using the PVAC for mapping and ablation was 94% (74 of 79 PVs). Additional mapping with the Orbiter improved the PV isolation rate to 99%. Adjustment of filter settings failed to improve quality of the PV signals compared with standard filter settings. Using the PVAC as a stand-alone strategy for mapping and ablation, one should be aware that in some cases, different signal morphology mimics PV isolation. Adjustment of filter settings failed to improve signal quality. The use of an additional mapping catheter is recommended to become familiar with the particular signal morphology during the first PVAC cases or whenever there is doubt about successful isolation of the pulmonary veins.

  16. Mapping Air Quality Index of Carbon Monoxide (CO) in Medan City

    NASA Astrophysics Data System (ADS)

    Suryati, I.; Khair, H.

    2017-03-01

    This study aims to map and analyze the air quality index of carbon monoxide (CO) in Medan City. The research used 12 (twelve) sampling points around Medan, with a one-hour sampling duration at each point. CO concentration was analyzed using an NDIR CO Analyzer. The measured CO concentrations ranged from 1 ppm to 23 ppm, with an average of 9.5 ppm. This is still below the national ambient air quality standard set by Government Regulation of Indonesian Republic Number 41-1999, which amounts to 29 ppm. The CO concentration measurements were converted into the air pollutant standard index, yielding index values of 58 - 204. Surfer 10 was used to create the map of the air pollutant standard index for CO. The map shows a very unhealthy area located in the Medan Belawan district. The main factors affecting the concentration of CO are transportation and meteorological factors.

  17. Importance sampling with imperfect cloning for the computation of generalized Lyapunov exponents

    NASA Astrophysics Data System (ADS)

    Anteneodo, Celia; Camargo, Sabrina; Vallejos, Raúl O.

    2017-12-01

    We revisit the numerical calculation of generalized Lyapunov exponents, L (q ) , in deterministic dynamical systems. The standard method consists of adding noise to the dynamics in order to use importance sampling algorithms. Then L (q ) is obtained by taking the limit noise-amplitude → 0 after the calculation. We focus on a particular method that involves periodic cloning and pruning of a set of trajectories. However, instead of considering a noisy dynamics, we implement an imperfect (noisy) cloning. This alternative method is compared with the standard one and, when possible, with analytical results. As a workbench we use the asymmetric tent map, the standard map, and a system of coupled symplectic maps. The general conclusion of this study is that the imperfect-cloning method performs as well as the standard one, with the advantage of preserving the deterministic dynamics.

  18. MAP Science for Use with Next Generation Science Standards. NWEA External FAQ

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2016

    2016-01-01

    Measures of Academic Progress® (MAP®) Science for use with Next Generation Science Standards (NGSS) assessments are available for the 2016-17 school year. These new assessments measure student growth toward understanding of the multidimensional NGSS performance expectations. This report presents MAP Science for use with NGSS by presenting and…

  19. "Understanding" medical school curriculum content using KnowledgeMap.

    PubMed

    Denny, Joshua C; Smithers, Jeffrey D; Miller, Randolph A; Spickard, Anderson

    2003-01-01

    To describe the development and evaluation of computational tools to identify concepts within medical curricular documents, using information derived from the National Library of Medicine's Unified Medical Language System (UMLS). The long-term goal of the KnowledgeMap (KM) project is to provide faculty and students with an improved ability to develop, review, and integrate components of the medical school curriculum. The KM concept identifier uses lexical resources partially derived from the UMLS (SPECIALIST lexicon and Metathesaurus), heuristic language processing techniques, and an empirical scoring algorithm. KM differentiates among potentially matching Metathesaurus concepts within a source document. The authors manually identified important "gold standard" biomedical concepts within selected medical school full-content lecture documents and used these documents to compare KM concept recognition with that of a known state-of-the-art "standard"-the National Library of Medicine's MetaMap program. The number of "gold standard" concepts in each lecture document identified by either KM or MetaMap, and the cause of each failure or relative success in a random subset of documents. For 4,281 "gold standard" concepts, MetaMap matched 78% and KM 82%. Precision for "gold standard" concepts was 85% for MetaMap and 89% for KM. The heuristics of KM accurately matched acronyms, concepts underspecified in the document, and ambiguous matches. The most frequent cause of matching failures was absence of target concepts from the UMLS Metathesaurus. The prototypic KM system provided an encouraging rate of concept extraction for representative medical curricular texts. Future versions of KM should be evaluated for their ability to allow administrators, lecturers, and students to navigate through the medical curriculum to locate redundancies, find interrelated information, and identify omissions. 
In addition, the ability of KM to meet specific, personal information needs should be assessed.

  20. National Geospatial Program

    USGS Publications Warehouse

    Carswell, William J.

    2011-01-01

    The National Geospatial Program (NGP) increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network Liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure the quality control, integration, and availability to the public of the data acquired. The Emergency Operations Office provides requirements to The National Map and, during emergencies and natural disasters, provides rapid dissemination of information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.

  1. Updated symbol catalogue for geologic and geomorphologic mapping in Planetary Sciences

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; Fortezzo, Corey; Skinner, James, Jr.; Hunter, Marc; Hare, Trent

    2017-04-01

    Maps are one of the most powerful communication tools for spatial data. This is true for terrestrial data as well as the many types of planetary data. Geologic and/or geomorphologic maps of planetary surfaces, in particular those of the Moon, Mars, and Venus, are standardized products and often prepared as part of hypothesis-driven science investigations. The NASA-funded Planetary Geologic Mapping program, coordinated by the USGS Astrogeology Science Center (ASC), produces high-quality, standardized, and refereed geologic maps and digital databases of planetary bodies. In this context, 242 geologic, geomorphologic, and thematic map sheets and map series have been published since 1962. However, outside of this program, numerous non-USGS maps are created as results of scientific investigations and published, e.g., as figures or supplemental materials within peer-reviewed journal articles. Due to the complexity of planetary surfaces, the diversity between different planetary surfaces, and the varied resolution of the data, geomorphologic and geologic mapping is a challenging task. Because of these limiting conditions, the mapping process is highly interpretative work and is mostly limited to remotely sensed satellite data, with a few exceptions from rover data. Uniform and unambiguous data are fundamental for making quality observations that lead to unbiased and supported interpretations, especially when there is no current groundtruthing. To allow for correlation between different map products (digital or analog), the most commonly used spatial objects are predefined cartographic symbols. The Federal Geographic Data Committee (FGDC) Digital Cartographic Standard for Geologic Map Symbolization (DCSGMS) defines the most commonly used symbols, colors, and hatch patterns in one comprehensive document. Chapter 25 of the DCSGMS defines the Planetary Geology Features based on the symbols defined in the Venus Mapper's Handbook.
    After reviewing the 242 planetary geologic maps, we propose to 1) review standardized symbols for planetary maps, and 2) recommend an updated symbol collection for adoption by the planetary mapping community. Within these points, the focus is on how symbology has changed over time and how this affects communication within and between maps. Two key questions to address are: 1) does chapter 25 provide enough variability within the subcategories (e.g., faults) to represent the data within the maps? 2) how can recommendations be delivered to the mapping community and its steering committees to enhance a map's communicability and convey information succinctly but thoroughly? To determine the most representative symbol collection from existing maps and to support future map results (within or outside of the USGS mapping program), we defined a stepwise task list: 1) statistical review of existing symbol sets and collections; 2) establishment of a representative symbol set for planetary mapping; 3) update of the cartographic symbols; 4) implementation into GIS-based mapping software (this implementation will mimic the 2010 application of the planetary symbol set into ArcGIS; more information at https://planetarymapping.wr.usgs.gov/Project); 6) a platform to provide the symbol set to the mapping community. This project was initiated within an ongoing cooperation between the USGS ASC and the German Aerospace Center (DLR), Dept. of Planetary Geology.

  2. Mapping soil texture targeting predefined depth range or synthetizing from standard layers?

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Dezső Kaposi, András; Szatmári, Gábor; Takács, Katalin; Pásztor, László

    2017-04-01

    There are increasing demands nowadays on spatial soil information to support environment-related and land use management decisions. Physical soil properties, especially particle size distribution, play an important role in this context. Some of these requirements can be satisfied by the sand, silt, and clay content maps compiled according to global standards such as GlobalSoilMap (GSM) or SoilGrids. Soil texture classes (e.g., according to the USDA classification) can be derived from these three fraction data, so a texture map can be compiled from the separate fraction maps. Soil texture class as well as fraction information represent direct inputs of crop, meteorological and hydrological models. The model inputs frequently require maps representing soil features of the 0-30 cm depth, which is covered by three consecutive depth intervals according to the standard specifications: 0-5 cm, 5-15 cm, 15-30 cm. As GSM and SoilGrids have become the most detailed freely available spatial soil data sources, typical model users (e.g., meteorologists, agronomists, or hydrologists) would produce the input map from (the weighted mean of) these three layers. However, if the basic soil data and proper knowledge are available, a soil texture map targeting the 0-30 cm layer directly can be compiled independently. In our work we compared Hungary's soil texture maps compiled using the same reference and auxiliary data and inference methods but for differing layer distributions. We produced the 0-30 cm clay, silt and sand maps as well as the maps for the three standard layers (0-5 cm, 5-15 cm, 15-30 cm). Maps of sand, silt and clay percentage were computed through regression kriging (RK) applying the additive log-ratio (alr) transformation.
    In addition to the Hungarian Soil Information and Monitoring System as reference soil data, a digital elevation model and its derived components, soil physical property maps, remotely sensed images, and land use, geological, and meteorological data were applied as auxiliary variables. We compared the directly compiled and the synthesized clay content, sand content, and texture class maps with different tools. In addition to pairwise comparison of basic statistical features (histograms, scatter plots), we examined the spatial distribution of the differences. We quantified the taxonomic distances of the texture classes in order to investigate the differences between the map pairs. We concluded that the directly computed and the synthesized maps show various differences. In the case of the clay and sand content maps, the map pairs have to be considered statistically different. On the other hand, the differences between the texture class maps are not significant. However, in all cases, the differences mainly concern the extreme ranges and categories. Use of synthesized maps can amplify extremes through error propagation in models and scenarios. Based on our results, we suggest the usage of the directly compiled maps.

  3. 30 CFR 75.1200-2 - Accuracy and scale of mine maps.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Accuracy and scale of mine maps. 75.1200-2... SAFETY AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Maps § 75.1200-2 Accuracy and scale of mine maps. (a) The scale of mine maps submitted to the Secretary shall not be less than 100 or...

  4. Thinking Connections: Concept Maps for Life Science. Book B.

    ERIC Educational Resources Information Center

    Burggraf, Frederick

    The concept maps contained in this book (for grades 7-12) span 35 topics in life science. Topics were chosen using the National Science Education Standards as a guide. The practice exercise in concept mapping is included to give students an idea of what the tasks ahead will be in content rich maps. Two levels of concept maps are included for each…

  5. Global transport in a nonautonomous periodic standard map

    DOE PAGES

    Calleja, Renato C.; del-Castillo-Negrete, D.; Martinez-del-Rio, D.; ...

    2017-04-14

    A non-autonomous version of the standard map with a periodic variation of the perturbation parameter is introduced and studied via an autonomous map obtained from the iteration of the nonautonomous map over a period. Symmetry properties in the variables and parameters of the map are found and used to find relations between rotation numbers of invariant sets. The role of the nonautonomous dynamics on period-one orbits, stability and bifurcation is studied. The critical boundaries for the global transport and for the destruction of invariant circles with fixed rotation number are studied in detail using direct computation and a continuation method. In the case of global transport, the critical boundary has a particular symmetrical horn shape. Here, the results are contrasted with similar calculations found in the literature.

  7. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    PubMed

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity depends on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats to achieve data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in conversion between the CDA and vMR standards using a pattern-oriented approach from the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.

  8. Unsupervised Domain Adaptation with Multiple Acoustic Models

    DTIC Science & Technology

    2010-12-01

    Discriminative MAP Adaptation. Standard ML-MAP has been extended to incorporate discriminative training criteria such as MMI and MPE [10]. Discriminative MAP...smoothing variable I. For example, the MMI-MAP mean is given by mu_jm^(mmi-map) = ({theta_jm^num(O) - theta_jm^den(O)} + D_jm mu_jm + I mu_jm^(ml-map)) / ({gamma_jm^num - gamma_jm^den} + D_jm + I), where gamma_jm^num and gamma_jm^den are the occupancies from... MMI training, and D_jm is the Gaussian-dependent parameter for the extended Baum-Welch (EBW) algorithm. MMI-MAP has been successfully applied in

  9. Field Guide to the Plant Community Types of Voyageurs National Park

    USGS Publications Warehouse

    Faber-Langendoen, Don; Aaseng, Norman; Hop, Kevin; Lew-Smith, Michael

    2007-01-01

    INTRODUCTION The objective of the U.S. Geological Survey-National Park Service Vegetation Mapping Program is to classify, describe, and map vegetation for most of the park units within the National Park Service (NPS). The program was created in response to the NPS Natural Resources Inventory and Monitoring Guidelines issued in 1992. Products for each park include digital files of the vegetation map and field data, keys and descriptions to the plant communities, reports, metadata, map accuracy verification summaries, and aerial photographs. Interagency teams work in each park and, following standardized mapping and field sampling protocols, develop products and vegetation classification standards that document the various vegetation types found in a given park. The use of a standard national vegetation classification system and mapping protocol facilitate effective resource stewardship by ensuring compatibility and widespread use of the information throughout the NPS as well as by other Federal and state agencies. These vegetation classifications and maps and associated information support a wide variety of resource assessment, park management, and planning needs, and provide a structure for framing and answering critical scientific questions about plant communities and their relation to environmental processes across the landscape. This field guide is intended to make the classification accessible to park visitors and researchers at Voyageurs National Park, allowing them to identify any stand of natural vegetation and showing how the classification can be used in conjunction with the vegetation map (Hop and others, 2001).

  10. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    PubMed

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.

  11. Patellar cartilage lesions: comparison of magnetic resonance imaging and T2 relaxation-time mapping.

    PubMed

    Hannila, I; Nieminen, M T; Rauvala, E; Tervonen, O; Ojala, R

    2007-05-01

    To evaluate the detection and the size of focal patellar cartilage lesions with T2 mapping as compared to standard clinical magnetic resonance imaging (MRI) at 1.5T. Fifty-five consecutive clinical patients referred to knee MRI were imaged both with a standard knee MRI protocol (proton-density-weighted sagittal and axial series, T2-weighted sagittal and coronal series, and T1-weighted coronal series) and with an axial multislice multi-echo spin-echo measurement to determine the T2 relaxation time of the patellar cartilage. MR images and T2 maps of patellar cartilage were evaluated for focal lesions. The lesions were evaluated for lesion width (mm), lesion depth (1/3, 2/3, or 3/3 of cartilage thickness), and T2 value (20-40 ms, 40-60 ms, or 60-80 ms) based on visual evaluation. Altogether, 36 focal patellar cartilage lesions were detected in 20 subjects (11 male, nine female, mean age 40+/-15 years). Twenty-eight lesions were detected both on MRI and T2 maps, while eight lesions were visible only on T2 maps. Cartilage lesions were significantly wider (P = 0.001) and thicker (P<0.001) on T2 maps as compared to standard knee MRI. Most lesions (27) had moderately (T2 40-60 ms) increased T2 values, while two lesions had slightly (T2 20-40 ms) and seven lesions markedly (T2 60-80 ms) increased T2 relaxation times. T2 mapping of articular cartilage is feasible in the clinical setting and may reveal early cartilage lesions not visible with standard clinical MRI.

  12. Canadian and U.S. Cooperation for the development of standards and specifications for emerging mapping technologies

    USGS Publications Warehouse

    Habib, A.; Jarvis, A.; Al-Durgham, M. M.; Lay, J.; Quackenbush, P.; Stensaas, G.; Moe, D.

    2007-01-01

    The mapping community is witnessing significant advances in available sensors, such as medium format digital cameras (MFDC) and Light Detection and Ranging (LiDAR) systems. In this regard, the Digital Photogrammetry Research Group (DPRG) of the Department of Geomatics Engineering at the University of Calgary has been actively involved in the development of standards and specifications for regulating the use of these sensors in mapping activities. More specifically, the DPRG has been working on developing new techniques for the calibration and stability analysis of medium format digital cameras. This research is essential since these sensors have not been developed with mapping applications in mind. Therefore, prior to their use in Geomatics activities, new standards should be developed to ensure the quality of the derived products. On another front, the persistent improvement in direct geo-referencing technology has led to an expansion in the use of LiDAR systems for the acquisition of dense and accurate surface information. However, the processing of the raw LiDAR data (e.g., ranges, mirror angles, and navigation data) remains a non-transparent process that is proprietary to the manufacturers of LiDAR systems. Therefore, the DPRG has been focusing on the development of quality control procedures to quantify the accuracy of LiDAR output in the absence of initial system measurements. This paper presents a summary of the research conducted by the DPRG together with the British Columbia Base Mapping and Geomatic Services (BMGS) and the United States Geological Survey (USGS) for the development of quality assurance and quality control procedures for emerging mapping technologies. The outcome of this research will allow for the possibility of introducing North American Standards and Specifications to regulate the use of MFDC and LiDAR systems in the mapping industry.

  13. Topographic Map of the West Candor Chasma Region of Mars, MTM 500k -05/282E OMKT

    USGS Publications Warehouse

    ,

    2004-01-01

    This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20° wide zones. For the area covered by this map sheet the central meridian is at 290° E. (70° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.

  14. Topographic Map of the Ophir and Central Candor Chasmata Region of Mars MTM 500k -05/287E OMKT

    USGS Publications Warehouse

    ,

    2004-01-01

    This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20° wide zones. For the area covered by this map sheet the central meridian is at 290° E. (70° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.

  15. Topographic map of the Tithonium Chasma Region of Mars, MTM 500k -05/277E OMKT

    USGS Publications Warehouse

    ,

    2004-01-01

    This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20° wide zones. For the area covered by this map sheet the central meridian is at 270° E. (90° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.

  16. Connecticut Music Trace Map for Grades 6 and 8. Revised.

    ERIC Educational Resources Information Center

    Connecticut State Board of Education, Hartford.

    These Connecticut Curriculum Trace Maps for music are designed to help curriculum developers and teachers translate Connecticut's K-12 performance standards into objectives and classroom practices. Trace Maps provide specific descriptions of what students should know and be able to do at smaller grade level clusters. Elements in the Trace Maps are…

  17. Absolute color scale for improved diagnostics with wavefront error mapping.

    PubMed

    Smolek, Michael K; Klyce, Stephen D

    2007-11-01

    Wavefront data are expressed in micrometers and referenced to the pupil plane, but current methods to map wavefront error lack standardization. Many use normalized or floating scales that may confuse the user by generating ambiguous, noisy, or varying information. An absolute scale that combines consistent clinical information with statistical relevance is needed for wavefront error mapping. The color contours should correspond better to current corneal topography standards to improve clinical interpretation. Retrospective analysis of wavefront error data. Historic ophthalmic medical records. Topographic Modeling System topographical examinations of 120 corneas across 12 categories were used. Corneal wavefront error data in micrometers from each topography map were extracted at 8 Zernike polynomial orders and for 3 pupil diameters expressed in millimeters (3, 5, and 7 mm). Both total aberrations (orders 2 through 8) and higher-order aberrations (orders 3 through 8) were expressed in the form of frequency histograms to determine the working range of the scale across all categories. The standard deviation of the mean error of normal corneas determined the map contour resolution. Map colors were based on corneal topography color standards and on the ability to distinguish adjacent color contours through contrast. Higher-order and total wavefront error contour maps for different corneal conditions. An absolute color scale was produced that encompassed a range of ±6.5 µm and a contour interval of 0.5 µm. All aberrations in the categorical database were plotted with no loss of clinical information necessary for classification. In the few instances where mapped information was beyond the range of the scale, the type and severity of aberration remained legible.
When wavefront data are expressed in micrometers, this absolute scale facilitates the determination of the severity of aberrations present compared with a floating scale, particularly for distinguishing normal from abnormal levels of wavefront error. The new color palette makes it easier to identify disorders. The corneal mapping method can be extended to mapping whole eye wavefront errors. When refraction data are expressed in diopters, the previously published corneal topography scale is suggested.
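    The fixed scale described above (a ±6.5 µm range with 0.5 µm contour intervals, with out-of-range values clamped to the end contours so severe aberrations remain legible) is straightforward to reproduce in software. A minimal sketch of binning wavefront error values onto such an absolute scale, not the authors' implementation:

```python
import numpy as np

# Absolute scale from the abstract: range +/-6.5 um, 0.5 um contour interval.
SCALE_MIN, SCALE_MAX, STEP = -6.5, 6.5, 0.5
BIN_EDGES = np.arange(SCALE_MIN, SCALE_MAX + STEP, STEP)  # 27 edges -> 26 contours

def contour_index(wavefront_error_um):
    """Map a wavefront error (micrometers) to its fixed contour bin.
    Values beyond the scale are clamped to the end bins, so extreme
    aberrations still render in the outermost colors."""
    clipped = np.clip(wavefront_error_um, SCALE_MIN, SCALE_MAX - 1e-9)
    return np.digitize(clipped, BIN_EDGES) - 1
```

Because the bin edges never change, the same error value always receives the same color, which is the property that distinguishes an absolute scale from a floating one.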

  18. Preliminary Integrated Geologic Map Databases for the United States: Connecticut, Maine, Massachusetts, New Hampshire, New Jersey, Rhode Island and Vermont

    USGS Publications Warehouse

    Nicholson, Suzanne W.; Dicken, Connie L.; Horton, John D.; Foose, Michael P.; Mueller, Julia A.L.; Hon, Rudi

    2006-01-01

    The rapid growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national scale digital geologic maps that have standardized information about geologic age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. Although two digital geologic maps (Schruben and others, 1994; Reed and Bush, 2004) of the United States currently exist, their scales (1:2,500,000 and 1:5,000,000) are too general for many regional applications. Most states have digital geologic maps at scales of about 1:500,000, but the databases are not comparably structured and, thus, it is difficult to use the digital database for more than one state at a time. This report describes the results, for a seven-state region, of an effort by the U.S. Geological Survey to produce a series of integrated and standardized state geologic map databases that cover the entire United States. In 1997, the United States Geological Survey's Mineral Resources Program initiated the National Surveys and Analysis (NSA) Project to develop national digital databases. One primary activity of this project was to compile a national digital geologic map database, utilizing state geologic maps, to support studies at scales from 1:250,000 to 1:1,000,000. To accomplish this, state databases were prepared using a common standard for the database structure, fields, attribution, and data dictionaries. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:250,000-scale quadrangle reports. This document provides background information and documentation for the integrated geologic map databases of this report. This report is one of a series of such reports releasing preliminary standardized geologic map databases for the United States. 
The data products of the project consist of two main parts, the spatial databases and a set of supplemental tables relating to geologic map units. The datasets serve as a data resource to generate a variety of stratigraphic, age, and lithologic maps. This documentation is divided into four main sections: (1) description of the set of data files provided in this report, (2) specifications of the spatial databases, (3) specifications of the supplemental tables, and (4) an appendix containing the data dictionaries used to populate some fields of the spatial database and supplemental tables.

  19. Portability issues for a structured clinical vocabulary: mapping from Yale to the Columbia medical entities dictionary.

    PubMed Central

    Kannry, J L; Wright, L; Shifman, M; Silverstein, S; Miller, P L

    1996-01-01

    OBJECTIVE: To examine the issues involved in mapping an existing structured controlled vocabulary, the Medical Entities Dictionary (MED) developed at Columbia University, to an institutional vocabulary, the laboratory and pharmacy vocabularies of the Yale New Haven Medical Center. DESIGN: 200 Yale pharmacy terms and 200 Yale laboratory terms were randomly selected from database files containing all of the Yale laboratory and pharmacy terms. These 400 terms were then mapped to the MED in three phases: mapping terms, mapping relationships between terms, and mapping attributes that modify terms. RESULTS: 73% of the Yale pharmacy terms mapped to MED terms. 49% of the Yale laboratory terms mapped to MED terms. After certain obsolete and otherwise inappropriate laboratory terms were eliminated, the latter rate improved to 59%. 23% of the unmatched Yale laboratory terms failed to match because of differences in granularity with MED terms. The Yale and MED pharmacy terms share 12 of 30 distinct attributes. The Yale and MED laboratory terms share 14 of 23 distinct attributes. CONCLUSION: The mapping of an institutional vocabulary to a structured controlled vocabulary requires that the mapping be performed at the level of terms, relationships, and attributes. The mapping process revealed the importance of standardization of local vocabulary subsets, standardization of attribute representation, and term granularity. PMID:8750391

  20. Labeling Projections on Published Maps

    USGS Publications Warehouse

    Snyder, John P.

    1987-01-01

    To permit accurate scaling on a map, and to use the map as a source of accurate positions in the transfer of data, certain parameters - such as the standard parallels selected for a conic projection - must be stated on the map. This information is often missing on published maps. Three current major world atlases are evaluated with respect to map projection identification. The parameters essential for the projections used in these three atlases are discussed and listed. These parameters should be stated on any map based on the same projection.

  1. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard and interoperable services improve data discoverability, accessibility, and usability with metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications include the Web Coverage Service (WCS) for data, the Web Map Service (WMS) for pictures of data, the Web Map Tile Service (WMTS) for pictures of data tiles, and Styled Layer Descriptors (SLD) for rendered styles.
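    The OGC services listed above share a common key-value request convention. As a hedged illustration, the sketch below composes a WMS 1.3.0 GetMap request; the endpoint and layer name are placeholders, not actual GES DISC identifiers:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    """Compose an OGC WMS 1.3.0 GetMap request URL.
    Note: in WMS 1.3.0 with EPSG:4326 the BBOX axis order is
    minlat,minlon,maxlat,maxlon."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical usage -- "example.gov" and "demo_layer" are placeholders:
url = wms_getmap_url("https://example.gov/wms", "demo_layer",
                     (-90, -180, 90, 180))
```

A WCS GetCoverage request is built the same way with different parameters, which is why a single client stack can serve all of the listed interfaces.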

  2. Comparison of validity of mapping between drug indications and ICD-10. Direct and indirect terminology based approaches.

    PubMed

    Choi, Y; Jung, C; Chae, Y; Kang, M; Kim, J; Joung, K; Lim, J; Cho, S; Sung, S; Lee, E; Kim, S

    2014-01-01

    Mapping of drug indications to ICD-10 was undertaken in Korea by a public and a private institution for their own purposes. A different mapping approach was used by each institution, which presented a good opportunity to compare the validity of the two approaches. This study was undertaken to compare the validity of a direct mapping approach and an indirect terminology-based mapping approach of drug indications against the gold standard drawn from the results of the two mapping processes. Three hundred and seventy-five cardiovascular reference drugs were selected from all listed cardiovascular drugs for the study. In the direct approach, two experienced nurse coders mapped the free-text indications directly to ICD-10. In the indirect terminology-based approach, the indications were extracted and coded in the Korean Standard Terminology of Medicine. These terminology-coded indications were then manually mapped to ICD-10. The results of the two approaches were compared to the gold standard. A kappa statistic was calculated to assess the compatibility of the two mapping approaches. Recall, precision, and F1 score of each mapping approach were calculated and analyzed using a paired t-test. The mean number of indications for the study drugs was 5.42. The mean number of ICD-10 codes matched in the direct approach was 46.32 and that of the indirect terminology-based approach was 56.94. The agreement of the mapping results between the two approaches was poor (kappa = 0.19). The indirect terminology-based approach showed higher recall (86.78%) than the direct approach (p < 0.001). However, there was no difference in precision or F1 score between the two approaches. Considering no differences in the F1 scores, both approaches may be used in practice for mapping drug indications to ICD-10. However, in terms of consistency, time, and manpower, better results are expected from the indirect terminology-based approach.
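    The recall, precision, and F1 measures used in the comparison follow their standard set-overlap definitions. A small illustrative helper (not the study's code) that scores one drug's mapped ICD-10 codes against a gold-standard code set:

```python
def mapping_scores(mapped, gold):
    """Recall, precision, and F1 for one drug's ICD-10 mapping,
    judged against the gold-standard code set."""
    mapped, gold = set(mapped), set(gold)
    tp = len(mapped & gold)  # codes the approach got right
    recall = tp / len(gold) if gold else 0.0
    precision = tp / len(mapped) if mapped else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return recall, precision, f1
```

Averaging these per-drug scores over the 375 study drugs and comparing the paired means is what the paired t-test in the abstract operates on.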

  3. Mapping of Outdoor Classrooms.

    ERIC Educational Resources Information Center

    Horvath, Victor G.

    Mapping symbols adopted by the Michigan Department of Natural Resources are presented with their explanations. In an effort to provide standardization and familiarity, teachers and other school people involved in an outdoor education program are encouraged to utilize the same symbols in constructing maps. (DK)

  4. Snake River Plain Geothermal Play Fairway Analysis - Phase 1 Raster Files

    DOE Data Explorer

    John Shervais

    2015-10-09

    Snake River Plain Play Fairway Analysis - Phase 1 CRS Raster Files. This dataset contains raster files created in ArcGIS. These raster images depict Common Risk Segment (CRS) maps for HEAT, PERMEABILITY, and SEAL, as well as selected maps of evidence layers. These evidence layers consist of either Bayesian krige functions or kernel density functions, and include: (1) HEAT: heat flow (Bayesian krige map), heat-flow standard error on the krige function (data confidence), volcanic vent distribution as a function of age and size, groundwater temperature (equal-interval and natural-breaks bins), and groundwater temperature standard error. (2) PERMEABILITY: fault and lineament maps, both as mapped and as kernel density functions, processed for both dilational tendency (TD) and slip tendency (ST), along with data-confidence maps for each data type. Data types include mapped surface faults from USGS and Idaho Geological Survey databases, as well as unpublished mapping; lineations derived from maximum gradients in magnetic, deep gravity, and intermediate-depth gravity anomalies. (3) SEAL: seal maps based on presence and thickness of lacustrine sediments and base of the SRP aquifer. Raster cell size is 2 km. All files were generated in ArcGIS.

  5. Assessment of Seasonal Water Balance Components over India Using Macroscale Hydrological Model

    NASA Astrophysics Data System (ADS)

    Joshi, S.; Raju, P. V.; Hakeem, K. A.; Rao, V. V.; Yadav, A.; Issac, A. M.; Diwakar, P. G.; Dadhwal, V. K.

    2016-12-01

    Hydrological models provide water balance components which are useful for water resources assessment and for capturing the seasonal changes and impact of anthropogenic interventions and climate change. This study describes a national-level modeling framework for India that uses a wide range of geospatial and hydro-meteorological data sets to estimate daily Water Balance Components (WBCs) at 0.15° grid resolution using the Variable Infiltration Capacity model. The model parameters were optimized by calibrating model-computed streamflow against field observations, yielding Nash-Sutcliffe efficiencies between 0.5 and 0.7. The state variables, evapotranspiration (ET) and soil moisture, were also validated, obtaining R2 values of 0.57 and 0.69, respectively. Using long-term meteorological data sets, model computations were carried out to capture hydrological extremes. During the 2013, 2014 and 2015 monsoon seasons, WBCs were estimated and published on a web portal with a 2-day time lag. During disaster events, weather forecasts were ingested and high-surface-runoff zones were identified for forewarning and disaster preparedness. Cumulative monsoon-season rainfall in 2013, 2014 and 2015 was 105, 89 and 91% of the long period average (LPA), respectively (Source: India Meteorological Department). Analysis of WBCs indicated that the corresponding seasonal surface runoff was 116, 81 and 86% of LPA and evapotranspiration was 109, 104 and 90% of LPA. Using the grid-wise data, the spatial variation in WBCs among river basins/administrative regions was derived to capture the changes in surface runoff and ET between the years and in comparison with the LPA. The model framework is operational and provides a periodic account of national-level water balance fluxes, useful for quantifying spatial and temporal variation in basin/sub-basin scale water resources and for periodic water budgeting to form vital inputs for studies on water resources and climate change.
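    The Nash-Sutcliffe efficiency used to judge the streamflow calibration has a standard definition: one minus the ratio of the model's squared error to the variance of the observations. A minimal sketch, assuming paired observed and simulated series:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency (NSE).
    NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 means the model is no better than
    predicting the observed mean. The abstract reports 0.5-0.7."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

The same function applies to any state variable with paired observations, which is how the ET and soil-moisture validations could be scored alongside streamflow.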

  6. Preliminary geologic map of the Perris 7.5' quadrangle, Riverside County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Digital preparation by Bovard, Kelly R.; Alvarez, Rachel M.

    2003-01-01

    Open-File Report 03-270 contains a digital geologic map database of the Perris 7.5’ quadrangle, Riverside County, California that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A PostScript file to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. Portable Document Format (.pdf) files of: a. A Readme file b. The same graphic as described in 2 above. Test plots have not produced precise 1:24,000-scale map sheets; Adobe Acrobat page-size settings influence map scale. The Correlation of Map Units and Description of Map Units are in the editorial format of USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g., Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc.

  7. Geologic map of the Devore 7.5' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Morton, Douglas M.; Matti, Jonathan C.

    2001-01-01

    This Open-File Report contains a digital geologic map database of the Devore 7.5' quadrangle, San Bernardino County, California, that includes: 1. ARC/INFO (Environmental Systems Research Institute) version 7.2.1 coverages of the various components of the geologic map 2. A PostScript (.ps) file to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram, a Description of Map Units, an index map, and a regional structure map 3. Portable Document Format (.pdf) files of: a. This Readme; includes an Appendix, containing metadata details found in devre_met.txt b. The same graphic as plotted in 2 above. (Test plots from this .pdf do not produce 1:24,000-scale maps. Adobe Acrobat page-size settings control map scale.) The Correlation of Map Units and Description of Map Units are in the editorial format of USGS Miscellaneous Investigations Series maps (I-maps) but have not been edited to comply with I-map standards. Within the geologic-map data package, map units are identified by such standard geologic-map criteria as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Devore 7.5’ topographic quadrangle in conjunction with the geologic map.

  8. Development of a Competency Mapping Tool for Undergraduate Professional Degree Programmes, Using Mechanical Engineering as a Case Study

    ERIC Educational Resources Information Center

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that…

  9. Unmanned aircraft systems image collection and computer vision image processing for surveying and mapping that meets professional needs

    NASA Astrophysics Data System (ADS)

    Peterson, James Preston, II

    Unmanned Aerial Systems (UAS) are rapidly blurring the lines between traditional and close range photogrammetry, and between surveying and photogrammetry. UAS are providing an economic platform for performing aerial surveying on small projects. The focus of this research was to describe traditional photogrammetric imagery and Light Detection and Ranging (LiDAR) geospatial products, describe close range photogrammetry (CRP), introduce UAS and computer vision (CV), and investigate whether industry mapping standards for accuracy can be met using UAS collection and CV processing. A 120-acre site was selected and 97 aerial targets were surveyed for evaluation purposes. Four UAS flights of varying heights above ground level (AGL) were executed, and three different target patterns of varying distances between targets were analyzed for compliance with American Society for Photogrammetry and Remote Sensing (ASPRS) and National Standard for Spatial Data Accuracy (NSSDA) mapping standards. This analysis resulted in twelve datasets. Error patterns were evaluated and reasons for these errors were determined. The relationship between the AGL, ground sample distance, target spacing and the root mean square error of the targets is exploited by this research to develop guidelines that use the ASPRS and NSSDA map standard as the template. These guidelines allow the user to select the desired mapping accuracy and determine what target spacing and AGL is required to produce the desired accuracy. These guidelines also address how UAS/CV phenomena affect map accuracy. General guidelines and recommendations are presented that give the user helpful information for planning a UAS flight using CV technology.
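    The ASPRS/NSSDA evaluation described above hinges on checkpoint RMSE: the NSSDA reports accuracy at the 95% confidence level as a fixed multiple of RMSE (1.7308 for horizontal, 1.9600 for vertical). A sketch of that computation; the example values in the test are hypothetical, not the study's results:

```python
import math

def rmse(errors):
    """Root mean square error of checkpoint residuals
    (same units as the input residuals)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def nssda_accuracy(rmse_xy=None, rmse_z=None):
    """NSSDA accuracy at the 95% confidence level.
    Horizontal: Accuracy_r = 1.7308 * RMSE_r (combined x,y residuals).
    Vertical:   Accuracy_z = 1.9600 * RMSE_z."""
    out = {}
    if rmse_xy is not None:
        out["horizontal_95"] = 1.7308 * rmse_xy
    if rmse_z is not None:
        out["vertical_95"] = 1.9600 * rmse_z
    return out
```

Computing these figures per flight and per target pattern is how the twelve datasets in the study could each be checked against the chosen map standard.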

  10. FEASIBILITY AND APPROACH FOR MAPPING RADON POTENTIALS IN FLORIDA

    EPA Science Inventory

    The report gives results of an analysis of the feasibility and approach for developing statewide maps of radon potentials in Florida. The maps would provide a geographic basis for implementing new radon-protective building construction standards to reduce public health risks from ...

  11. Cartographic services contract...for everything geographic

    USGS Publications Warehouse

    ,

    2003-01-01

    The U.S. Geological Survey's (USGS) Cartographic Services Contract (CSC) is used to award work for photogrammetric and mapping services under the umbrella of Architect-Engineer (A&E) contracting. The A&E contract is broad in scope and can accommodate any activity related to standard, nonstandard, graphic, and digital cartographic products. Services provided may include, but are not limited to, photogrammetric mapping and aerotriangulation; orthophotography; thematic mapping (for example, land characterization); analog and digital imagery applications; geographic information systems development; surveying and control acquisition, including ground-based and airborne Global Positioning System; analog and digital image manipulation, analysis, and interpretation; raster and vector map digitizing; data manipulations (for example, transformations, conversions, generalization, integration, and conflation); primary and ancillary data acquisition (for example, aerial photography, satellite imagery, multispectral, multitemporal, and hyperspectral data); image scanning and processing; metadata production, revision, and creation; and production or revision of standard USGS products defined by formal and informal specification and standards, such as those for digital line graphs, digital elevation models, digital orthophoto quadrangles, and digital raster graphics.

  12. Detection of Mycobacterium avium subspecies paratuberculosis in tie-stall dairy herds using a standardized environmental sampling technique and targeted pooled samples.

    PubMed

    Arango-Sabogal, Juan C; Côté, Geneviève; Paré, Julie; Labrecque, Olivia; Roy, Jean-Philippe; Buczinski, Sébastien; Doré, Elizabeth; Fairbrother, Julie H; Bissonnette, Nathalie; Wellemans, Vincent; Fecteau, Gilles

    2016-07-01

    Mycobacterium avium ssp. paratuberculosis (MAP) is the etiologic agent of Johne's disease, a chronic contagious enteritis of ruminants that causes major economic losses. Several studies, most involving large free-stall herds, have found environmental sampling to be a suitable method for detecting MAP-infected herds. In eastern Canada, where small tie-stall herds are predominant, certain conditions and management practices may influence the survival and transmission of MAP and recovery (isolation). Our objective was to estimate the performance of a standardized environmental and targeted pooled sampling technique for the detection of MAP-infected tie-stall dairy herds. Twenty-four farms (19 MAP-infected and 5 non-infected) were enrolled, but only 20 were visited twice in the same year, to collect 7 environmental samples and 2 pooled samples (sick cows and cows with poor body condition). Concurrent individual sampling of all adult cows in the herds was also carried out. Isolation of MAP was achieved using the MGIT Para TB culture media and the BACTEC 960 detection system. Overall, MAP was isolated in 7% of the environmental cultures. The sensitivity of the environmental culture was 44% [95% confidence interval (CI): 20% to 70%] when combining results from 2 different herd visits and 32% (95% CI: 13% to 57%) when results from only 1 random herd visit were used. The best sampling strategy was to combine samples from the manure pit, gutter, sick cows, and cows with poor body condition. The standardized environmental sampling technique and the targeted pooled samples presented in this study are an alternative sampling strategy to costly individual cultures for detecting MAP-infected tie-stall dairies. Repeated samplings may improve the detection of MAP-infected herds.
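    The herd-level sensitivities and confidence intervals quoted above are binomial proportions (detected herds over truly infected herds). As an illustration only, a Wilson score interval can be computed as below; this is an approximation and may differ from the exact interval method the authors used:

```python
import math

def sensitivity_wilson(detected, infected, z=1.96):
    """Herd-level sensitivity (detected / truly infected) with a
    Wilson score interval at roughly 95% confidence."""
    p = detected / infected
    denom = 1 + z * z / infected
    center = (p + z * z / (2 * infected)) / denom
    half = z * math.sqrt(p * (1 - p) / infected
                         + z * z / (4 * infected ** 2)) / denom
    return p, center - half, center + half
```

With only 19 infected herds, any interval is necessarily wide, which matches the broad 20% to 70% range reported in the abstract.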

  13. Gap Analysis of Benthic Mapping at Three National Parks: Assateague Island National Seashore, Channel Islands National Park, and Sleeping Bear Dunes National Lakeshore

    USGS Publications Warehouse

    Rose, Kathryn V.; Nayegandhi, Amar; Moses, Christopher S.; Beavers, Rebecca; Lavoie, Dawn; Brock, John C.

    2012-01-01

    The National Park Service (NPS) Inventory and Monitoring (I&M) Program initiated a benthic habitat mapping program in ocean and coastal parks in 2008-2009 in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With more than 80 ocean and Great Lakes parks encompassing approximately 2.5 million acres of submerged territory and approximately 12,000 miles of coastline (Curdts, 2011), this Servicewide Benthic Mapping Program (SBMP) is essential. This report presents an initial gap analysis of three pilot parks under the SBMP: Assateague Island National Seashore (ASIS), Channel Islands National Park (CHIS), and Sleeping Bear Dunes National Lakeshore (SLBE) (fig. 1). The recommended SBMP protocols include servicewide standards (for example, gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). The SBMP requires the inventory and mapping of critical components of coastal and marine ecosystems: bathymetry, geoforms, surface geology, and biotic cover. In order for a park unit benthic inventory to be considered complete, maps of bathymetry and other key components must be combined into a final report (Moses and others, 2010). By this standard, none of the three pilot parks are mapped (inventoried) to completion with respect to submerged resources. After compiling the existing benthic datasets for these parks, this report has concluded that CHIS, with 49 percent of its submerged area mapped, has the most complete benthic inventory of the three. The ASIS submerged inventory is 41 percent complete, and SLBE is 17.5 percent complete.

  14. A natural-color mapping for single-band night-time image based on FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yilun; Qian, Yunsheng

    2018-01-01

    A natural-color mapping method for single-band night-time images based on FPGA can transfer the color of a reference image to a single-band night-time image, which is consistent with human visual habits and can help observers identify targets. This paper introduces the processing of the natural-color mapping algorithm based on FPGA. Firstly, the image is transformed based on histogram equalization, and the intensity features and standard-deviation features of the reference image are stored in SRAM. Then, the intensity features and standard-deviation features of the real-time digital images are calculated by the FPGA. Finally, the FPGA completes the color mapping by matching pixels between images using the features in the luminance channel.
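    The statistics the abstract says the FPGA computes and stores, intensity means and standard deviations, are the ingredients of global statistics matching in the luminance channel. A software sketch of that step, not the FPGA pipeline itself:

```python
import numpy as np

def match_luminance_stats(night, reference):
    """Shift and scale a single-band night-time image so its mean and
    standard deviation match those of the reference image's luminance
    channel, then clamp back to 8-bit range."""
    night = night.astype(float)
    ref = reference.astype(float)
    scaled = (night - night.mean()) * (ref.std() / night.std()) + ref.mean()
    return np.clip(scaled, 0, 255).astype(np.uint8)
```

On hardware, the per-frame means and standard deviations are the only statistics that must be accumulated, which is what makes this mapping amenable to a real-time FPGA implementation.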

  15. Collecting Data to Construct an Isoline Map

    ERIC Educational Resources Information Center

    Lohrengel, C. Frederick, II.; Larson, Paul R.

    2017-01-01

    National Geography Standard 1 requires that students learn:"How to use maps and other geographic representations, geospatial technologies, and spatial thinking to understand and communicate information" (Heffron and Downs 2012). These concepts have real-world applicability. For example, elevation contour maps are common in many…

  16. A Numerical Study of New Logistic Map

    NASA Astrophysics Data System (ADS)

    Khmou, Youssef

    In this paper, we propose a new logistic map based on an information-entropy relation, and we study its bifurcation diagram in comparison with that of the standard logistic map. In the first part, we compare the diagram obtained by numerical simulations with that of the standard logistic map. It is found that the structures of both diagrams are similar when the range of the growth parameter is restricted to the interval [0, e]. In the second part, we present an application of the proposed map to traffic flow using a macroscopic model. It is found that the bifurcation diagram exactly reproduces Greenberg’s model of traffic flow, where the growth parameter corresponds to the optimal velocity and the random sequence corresponds to the density. In the last part, we present a second possible application of the proposed map, random number generation. The analysis shows that the excluded initial values of the sequences are (0, 1).
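    The standard logistic map that serves as the paper's baseline is easy to reproduce: a bifurcation diagram plots the post-transient orbit against the growth parameter r. A minimal sketch of that baseline (the proposed entropy-based map itself is not reproduced here):

```python
def logistic_orbit(r, x0=0.5, transient=500, keep=100):
    """Iterate the standard logistic map x -> r*x*(1-x), discard a
    transient, and return samples of the attractor. Plotting the
    returned values against r for a sweep of r produces the
    familiar bifurcation diagram."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit
```

For r = 2 the orbit collapses to the fixed point 1 - 1/r = 0.5, while for r = 3.2 it settles into a period-2 cycle; sweeping r toward 4 reveals the period-doubling cascade that the paper compares against its new map.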

  17. Current Approaches to Improving Marine Geophysical Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Jencks, J. H.; Cartwright, J.; Varner, J. D.; Anderson, C.; Robertson, E.; McLean, S. J.

    2016-02-01

    Exploring, understanding, and managing the global oceans is a challenge when hydrographic maps are available for only 5% of the world's oceans, even less of which have been mapped geologically or to identify benthic habitats. Seafloor mapping is expensive and most government and academic budgets continue to tighten. The first step for any mapping program, before setting out to map uncharted waters, should be to identify if data currently exist in the area of interest. There are many reasons why this seemingly simple suggestion is not commonplace. While certain datasets are accessible online (e.g., NOAA's NCEI, EMODnet, IHO-DCDB), many are not. In some cases, data that are publicly available are difficult to discover and access. No single agency can successfully resolve the complex and pressing demands of ocean and coastal mapping and the associated data stewardship. NOAA partners with other federal agencies to provide an integrated approach to carry out a coordinated and comprehensive ocean and coastal mapping program. In order to maximize the return on their mapping investment, legacy and newly acquired data must be easily discoverable and readily accessible by numerous applications and formats now and well into the future. At NOAA's National Centers for Environmental Information (NCEI), resources are focused on ensuring the security and widespread availability of the Nation's scientific marine geophysical data through long-term stewardship. The public value of these data and products is maximized by streamlining data acquisition and processing operations, minimizing redundancies, facilitating discovery, and developing common standards to promote re-use. For its part, NCEI draws on a variety of software technologies and adheres to international standards to meet this challenge. The result is a geospatial framework built on spatially-enabled databases, standards-based web services, and International Standards Organization (ISO) metadata. 
In order to maximize effectiveness in ocean and coastal mapping, we must be sure that limited funding is not being used to collect data in areas where data already exist. By making data more accessible, NCEI extends the use of, and therefore the value of, these data. Working together, we can ensure that valuable data are made available to the broadest community.

  18. Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005

    USGS Publications Warehouse

    Soller, David R.

    2005-01-01

    Introduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database, and for the State and Federal geological surveys, to provide more high-quality digital maps to the public. 
At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; 6) continued development of the National Geologic Map Database; and 7) progress toward building and implementing a standard geologic map data model and standard science language for the U.S. and for North America.

  19. Fourth international circumpolar arctic vegetation mapping workshop

    USGS Publications Warehouse

    Raynolds, Martha K.; Markon, C.J.

    2002-01-01

    During the week of April 10, 2001, the Fourth International Circumpolar Arctic Vegetation Mapping Workshop was held in Moscow, Russia. The purpose of this meeting was to bring together the vegetation scientists working on the Circumpolar Arctic Vegetation Map (CAVM) to (1) review the progress of current mapping activities, (2) discuss and agree upon a standard set of arctic tundra subzones, (3) plan for the production and dissemination of a draft map, and (4) begin work on a legend for the final map.

  20. Natural resources research and development in Lesotho using LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Jackson, A. A. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A map of the drainage of the whole country, including at least third-order streams, was constructed from LANDSAT imagery. This was digitized and can be plotted at any required scale to provide base maps for other cartographic projects. A suite of programs for the interpretation of digital LANDSAT data is under development for a low-cost programmable calculator. Initial output from these programs has proved to have better resolution and detail than the standard photographic products, and was used to update the standard topographic map of a particular region.

  1. Cartographic quality of ERTS-1 images

    NASA Technical Reports Server (NTRS)

    Welch, R. I.

    1973-01-01

    Analyses of simulated and operational ERTS images have provided initial estimates of resolution, ground resolution, detectability thresholds and other measures of image quality of interest to earth scientists and cartographers. Based on these values, including an approximate ground resolution of 250 meters for both RBV and MSS systems, the ERTS-1 images appear suited to the production and/or revision of planimetric and photo maps of 1:500,000 scale and smaller for which map accuracy standards are compatible with the imaged detail. Thematic mapping, although less constrained by map accuracy standards, will be influenced by measurement thresholds and errors which have yet to be accurately determined for ERTS images. This study also indicates the desirability of establishing a quantitative relationship between image quality values and map products which will permit both engineers and cartographers/earth scientists to contribute to the design requirements of future satellite imaging systems.

  2. aMAP is a validated pipeline for registration and segmentation of high-resolution mouse brain data

    PubMed Central

    Niedworok, Christian J.; Brown, Alexander P. Y.; Jorge Cardoso, M.; Osten, Pavel; Ourselin, Sebastien; Modat, Marc; Margrie, Troy W.

    2016-01-01

    The validation of automated image registration and segmentation is crucial for accurate and reliable mapping of brain connectivity and function in three-dimensional (3D) data sets. While validation standards are necessarily high and routinely met in the clinical arena, they have to date been lacking for high-resolution microscopy data sets obtained from the rodent brain. Here we present a tool for optimized automated mouse atlas propagation (aMAP) based on clinical registration software (NiftyReg) for anatomical segmentation of high-resolution 3D fluorescence images of the adult mouse brain. We empirically evaluate aMAP as a method for registration and subsequent segmentation by validating it against the performance of expert human raters. This study therefore establishes a benchmark standard for mapping the molecular function and cellular connectivity of the rodent brain. PMID:27384127

  3. Risk maps for targeting exotic plant pest detection programs in the United States

    Treesearch

    R.D. Magarey; D.M. Borchert; J.S. Engle; M Garcia-Colunga; Frank H. Koch; et al

    2011-01-01

    In the United States, pest risk maps are used by the Cooperative Agricultural Pest Survey for spatial and temporal targeting of exotic plant pest detection programs. Methods are described to create standardized host distribution, climate and pathway risk maps for the top nationally ranked exotic pest targets. Two examples are provided to illustrate the risk mapping...

  4. Thematic and positional accuracy assessment of digital remotely sensed data

    Treesearch

    Russell G. Congalton

    2007-01-01

    Accuracy assessment or validation has become a standard component of any land cover or vegetation map derived from remotely sensed data. Knowing the accuracy of the map is vital to any decisionmaking performed using that map. The process of assessing the map accuracy is time consuming and expensive. It is very important that the procedure be well thought out and...

  5. E-Learning Content Design Standards Based on Interactive Digital Concepts Maps in the Light of Meaningful and Constructivist Learning Theory

    ERIC Educational Resources Information Center

    Afify, Mohammed Kamal

    2018-01-01

    The present study aims to identify standards of interactive digital concepts maps design and their measurement indicators as a tool to develop, organize and administer e-learning content in the light of Meaningful Learning Theory and Constructivist Learning Theory. To achieve the objective of the research, the author prepared a list of E-learning…

  6. Topic Maps e-Learning Portal Development

    ERIC Educational Resources Information Center

    Olsevicova, Kamila

    2006-01-01

    Topic Maps, the ISO/IEC 13250 standard, are designed to facilitate the organization and navigation of large collections of information objects by creating meta-level perspectives of their underlying concepts and relationships. The underlying structure of concepts and relations is expressed by domain ontologies. The Topic Maps technology can become…

  7. Human factors considerations in the design and evaluation of moving map displays of ownership on the airport surface

    DOT National Transportation Integrated Search

    2004-09-01

    The Federal Aviation Administration (FAA) has requested human factors guidance to support the new moving map Technical Standard Order (TSO)-C165, Electronic Map Display Equipment for Graphical Depiction of Aircraft Position. This document was develop...

  8. Estimating A Reference Standard Segmentation With Spatially Varying Performance Parameters: Local MAP STAPLE

    PubMed Central

    Commowick, Olivier; Akhondi-Asl, Alireza; Warfield, Simon K.

    2012-01-01

    We present a new algorithm, called local MAP STAPLE, to estimate from a set of multi-label segmentations both a reference standard segmentation and spatially varying performance parameters. It is based on a sliding window technique to estimate the segmentation and the segmentation performance parameters for each input segmentation. In order to allow for optimal fusion from the small amount of data in each local region, and to account for the possibility of labels not being observed in a local region of some (or all) input segmentations, we introduce prior probabilities for the local performance parameters through a new Maximum A Posteriori formulation of STAPLE. Further, we propose an expression to compute confidence intervals in the estimated local performance parameters. We carried out several experiments with local MAP STAPLE to characterize its performance and value for local segmentation evaluation. First, with simulated segmentations with known reference standard segmentation and spatially varying performance, we show that local MAP STAPLE performs better than both STAPLE and majority voting. Then we present evaluations with data sets from clinical applications. These experiments demonstrate that spatial adaptivity in segmentation performance is an important property to capture. We compared the local MAP STAPLE segmentations to STAPLE, and to previously published fusion techniques and demonstrate the superiority of local MAP STAPLE over other state-of-the-art algorithms. PMID:22562727
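Majority voting, the simplest baseline the abstract compares against, can be sketched in a few lines; the flat list-of-labels representation is an illustrative simplification of a voxel grid:

```python
from collections import Counter

def majority_vote(segmentations):
    """Fuse multiple label maps by per-voxel majority vote.
    Each segmentation is a sequence of labels over the same voxel grid."""
    fused = []
    for voxel_labels in zip(*segmentations):
        # most_common(1) returns [(label, count)] for the winning label
        fused.append(Counter(voxel_labels).most_common(1)[0][0])
    return fused
```

Unlike STAPLE-family methods, this weights every rater equally and ignores rater performance, which is exactly the limitation local MAP STAPLE is designed to address.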

  9. Hemispherical map for the human brain cortex

    NASA Astrophysics Data System (ADS)

    Tosun, Duygu; Prince, Jerry L.

    2001-07-01

    Understanding the function of the human brain cortex is a primary goal in human brain mapping. Methods to unfold and flatten the cortical surface for visualization and measurement have been described in previous literature; but comparison across multiple subjects is still difficult because of the lack of a standard mapping technique. We describe a new approach that maps each hemisphere of the cortex to a portion of a sphere in a standard way, making comparison of anatomy and function across different subjects possible. Starting with a three-dimensional magnetic resonance image of the brain, the cortex is segmented and represented as a triangle mesh. Defining a cut around the corpus callosum identifies the left and right hemispheres. Together, the two hemispheres are mapped to the complex plane using a conformal mapping technique. A Möbius transformation, which is conformal, is used to transform the points on the complex plane so that a projective transformation maps each brain hemisphere onto a spherical segment comprising a sphere with a cap removed. We determined the best size of the spherical cap by minimizing the relative area distortion between hemispherical maps and original cortical surfaces. The relative area distortion between the hemispherical maps and the original cortical surfaces for fifteen human brains is analyzed.
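The Möbius step can be illustrated directly: a Möbius transformation of the complex plane is determined by four coefficients and is conformal (angle-preserving) whenever ad - bc is nonzero. A minimal sketch of the map itself, not the authors' implementation:

```python
def mobius(z, a, b, c, d):
    """Apply the Mobius transformation z -> (a*z + b) / (c*z + d),
    conformal on the complex plane when a*d - b*c != 0."""
    if a * d - b * c == 0:
        raise ValueError("degenerate transformation: ad - bc must be nonzero")
    return (a * z + b) / (c * z + d)
```

For example, (a, b, c, d) = (1, 0, 0, 1) is the identity, and (0, 1, 1, 0) is the inversion z -> 1/z; in the pipeline above, such a transformation repositions the flattened cortex before the projective map onto the spherical segment.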

  10. Preliminary geologic map of the Fontana 7.5' quadrangle, Riverside and San Bernardino Counties, California

    USGS Publications Warehouse

    Morton, Douglas M.; Digital preparation by Bovard, Kelly R.

    2003-01-01

    Open-File Report 03-418 is a digital geologic data set that maps and describes the geology of the Fontana 7.5’ quadrangle, Riverside and San Bernardino Counties, California. The Fontana quadrangle database is one of several 7.5’ quadrangle databases that are being produced by the Southern California Areal Mapping Project (SCAMP). These maps and databases are, in turn, part of the nation-wide digital geologic map coverage being developed by the National Cooperative Geologic Map Program of the U.S. Geological Survey (USGS). Open-File Report 03-418 contains a digital geologic map database of the Fontana 7.5’ quadrangle, Riverside and San Bernardino Counties, California that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A PostScript file (fon_map.ps) to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. An Encapsulated PostScript (EPS) file (fon_grey.eps) created in Adobe Illustrator 10.0 to plot the geologic map on a grey topographic base, containing a Correlation of Map Units (CMU), a Description of Map Units (DMU), and an index map. 4. Portable Document Format (.pdf) files of: a. the Readme file, which includes in Appendix I the data contained in fon_met.txt; b. the same graphics as plotted in 2 and 3 above. Test plots have not produced precise 1:24,000-scale map sheets; the Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units are in the editorial format of USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. 
Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (4b above) or plotting the postscript files (2 or 3 above).

  11. 30 CFR 75.508 - Map of electrical system.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Map of electrical system. 75.508 Section 75.508... MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Electrical Equipment-General § 75.508 Map of electrical system. [Statutory Provisions] The location and the electrical rating of all stationary electric...

  12. 30 CFR 75.508 - Map of electrical system.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Map of electrical system. 75.508 Section 75.508... MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Electrical Equipment-General § 75.508 Map of electrical system. [Statutory Provisions] The location and the electrical rating of all stationary electric...

  13. 30 CFR 75.508 - Map of electrical system.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Map of electrical system. 75.508 Section 75.508... MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Electrical Equipment-General § 75.508 Map of electrical system. [Statutory Provisions] The location and the electrical rating of all stationary electric...

  14. State Highway Maps: A Route to a Learning Adventure

    ERIC Educational Resources Information Center

    McDuffie, Thomas E.; Cifelli, Joseph

    2006-01-01

    Science within the folds of highway maps is explored through a series of hands-on experiences designed to reinforce and extend map-reading skills in grades 6-8. The increasingly sophisticated, standards-related activities include measuring distances between population centers, finding communities named after trees, animals, and geologic features,…

  15. Process for Generating Engine Fuel Consumption Map: Ricardo Cooled EGR Boost 24-bar Standard Car Engine Tier 2 Fuel

    EPA Pesticide Factsheets

    This document summarizes the process followed to utilize the fuel consumption map of a Ricardo modeled engine and vehicle fuel consumption data to generate a full engine fuel consumption map which can be used by EPA's ALPHA vehicle simulations.

  16. Generalized Weyl-Wigner map and Vey quantum mechanics

    NASA Astrophysics Data System (ADS)

    Dias, Nuno Costa; Prata, João Nuno

    2001-12-01

    The Weyl-Wigner map yields the entire structure of Moyal quantum mechanics directly from the standard operator formulation. The covariant generalization of Moyal theory, also known as Vey quantum mechanics, was presented in the literature many years ago. However, a derivation of the formalism directly from standard operator quantum mechanics, clarifying the relation between the two formulations, is still missing. In this article we present a covariant generalization of the Weyl order prescription and of the Weyl-Wigner map and use them to derive Vey quantum mechanics directly from the standard operator formulation. The procedure displays some interesting features: it yields all the key ingredients and provides a more straightforward interpretation of the Vey theory including a direct implementation of unitary operator transformations as phase space coordinate transformations in the Vey idiom. These features are illustrated through a simple example.

  17. Charge to Road Map Development Sessions

    NASA Technical Reports Server (NTRS)

    Barth, Janet

    2004-01-01

    Develop a road map for new standard radiation belt models. Model applications: spacecraft and instruments. Goals: reduce risk, reduce cost, improve performance, increase system lifetime, and reduce risk to astronauts.

  18. Biomedical Terminology Mapper for UML projects.

    PubMed

    Thibault, Julien C; Frey, Lewis

    2013-01-01

    As the biomedical community collects and generates more and more data, the need to describe these datasets for exchange and interoperability becomes crucial. This paper presents a mapping algorithm that can help developers expose local implementations described with UML through standard terminologies. The input UML class or attribute name is first normalized and tokenized, then lookups in a UMLS-based dictionary are performed. For the evaluation of the algorithm 142 UML projects were extracted from caGrid and automatically mapped to National Cancer Institute (NCI) terminology concepts. Resulting mappings at the UML class and attribute levels were compared to the manually curated annotations provided in caGrid. Results are promising and show that this type of algorithm could speed-up the tedious process of mapping local implementations to standard biomedical terminologies.
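The normalize-tokenize-lookup pipeline the abstract describes can be sketched as follows. The toy dictionary and concept code below stand in for the UMLS-based dictionary and NCI concepts and are purely illustrative:

```python
import re

def map_uml_name(name, dictionary):
    """Sketch of the lookup step: split a camelCase or underscore-separated
    UML class/attribute name into tokens, normalize to lowercase, and look
    the resulting phrase up in a terminology dictionary."""
    # insert spaces at camelCase boundaries, e.g. "patientIdentifier"
    spaced = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", " ", name)
    # split on any non-alphanumeric separators and lowercase the tokens
    tokens = [t.lower() for t in re.split(r"[^A-Za-z0-9]+", spaced) if t]
    return dictionary.get(" ".join(tokens))

# toy dictionary mapping normalized phrases to concept identifiers
toy = {"patient identifier": "C12345"}
```

A production mapper would also handle synonyms and partial matches against the dictionary, which is where most of the unmapped terms the authors analyze would arise.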

  19. Biomedical Terminology Mapper for UML projects

    PubMed Central

    Thibault, Julien C.; Frey, Lewis

    As the biomedical community collects and generates more and more data, the need to describe these datasets for exchange and interoperability becomes crucial. This paper presents a mapping algorithm that can help developers expose local implementations described with UML through standard terminologies. The input UML class or attribute name is first normalized and tokenized, then lookups in a UMLS-based dictionary are performed. For the evaluation of the algorithm 142 UML projects were extracted from caGrid and automatically mapped to National Cancer Institute (NCI) terminology concepts. Resulting mappings at the UML class and attribute levels were compared to the manually curated annotations provided in caGrid. Results are promising and show that this type of algorithm could speed-up the tedious process of mapping local implementations to standard biomedical terminologies. PMID:24303278

  20. Scoping of Flood Hazard Mapping Needs for Merrimack County, New Hampshire

    DTIC Science & Technology

    2006-01-01

    DOQ Digital Orthophoto Quadrangle DOQQ Digital Ortho Quarter Quadrangle DTM Digital Terrain Model FBFM Flood Boundary and Floodway Map FEMA Federal...discussed available data and coverages within New Hampshire (for example, 2003 National Agriculture Imagery Program (NAIP) color Digital Orthophoto ... orthophotos providing improved base map accuracy. NH GRANIT is presently converting the standard, paper FIRMs and Flood Boundary and Floodway maps (FBFMs

  1. MetaMapping the nursing procedure manual.

    PubMed

    Peace, Jane; Brennan, Patricia Flatley

    2006-01-01

    Nursing procedure manuals are an important resource for practice, but ensuring that the correct procedure can be located when needed is an ongoing challenge. This poster presents an approach used to automatically index nursing procedures with standardized nursing terminology. Although indexing yielded a low number of mappings, examination of successfully mapped terms, incorrect mappings, and unmapped terms reveals important information about the reasons automated indexing fails.

  2. Rule-based mapping of fire-adapted vegetation and fire regimes for the Monongahela National Forest

    Treesearch

    Melissa A. Thomas-Van Gundy; Gregory J. Nowacki; Thomas M. Schuler

    2007-01-01

    A rule-based approach was employed in GIS to map fire-adapted vegetation and fire regimes within the proclamation boundary of the Monongahela National Forest. Spatial analyses and maps were generated using ArcMap 9.1. The resulting fire-adaptation scores were then categorized into standard fire regime groups. Fire regime group V (200+ yrs) was the most common, assigned...

  3. Comparison of spectral radiance responsivity calibration techniques used for backscatter ultraviolet satellite instruments

    NASA Astrophysics Data System (ADS)

    Kowalewski, M. G.; Janz, S. J.

    2015-02-01

    Methods of absolute radiometric calibration of backscatter ultraviolet (BUV) satellite instruments are compared as part of an effort to minimize pre-launch calibration uncertainties. An internally illuminated integrating sphere source has been used for the Shuttle Solar BUV, Total Ozone Mapping Spectrometer, Ozone Mapping Instrument, and Global Ozone Monitoring Experiment 2 using standardized procedures traceable to national standards. These sphere-based spectral responsivities agree to within the derived combined standard uncertainty of 1.87% relative to calibrations performed using an external diffuser illuminated by standard irradiance sources, the customary spectral radiance responsivity calibration method for BUV instruments. The combined standard uncertainty for these calibration techniques as implemented at the NASA Goddard Space Flight Center’s Radiometric Calibration and Development Laboratory is shown to be less than 2% at 250 nm when using a single traceable calibration standard.
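Combined standard uncertainties like the 1.87% figure quoted above are conventionally obtained by combining independent uncertainty components in quadrature (root sum of squares), per the GUM. A minimal sketch; the component values in the test are illustrative, not taken from the paper:

```python
import math

def combined_standard_uncertainty(components):
    """Combine independent standard-uncertainty components in quadrature
    (root sum of squares), the usual GUM-style combination rule.
    Components must share the same units (e.g., all in percent)."""
    return math.sqrt(sum(u * u for u in components))
```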

  4. A framework for evaluating and utilizing medical terminology mappings.

    PubMed

    Hussain, Sajjad; Sun, Hong; Sinaci, Anil; Erturkmen, Gokce Banu Laleci; Mead, Charles; Gray, Alasdair J G; McGuinness, Deborah L; Prud'Hommeaux, Eric; Daniel, Christel; Forsberg, Kerstin

    2014-01-01

    Use of medical terminologies and mappings across them is considered to be a crucial prerequisite for achieving interoperable eHealth applications. Built upon the outcomes of several research projects, we introduce a framework for evaluating and utilizing terminology mappings that offers a platform for i) performing various mapping strategies, ii) representing terminology mappings together with their provenance information, and iii) enabling terminology reasoning for inferring both new and erroneous mappings. We present the results of the introduced framework from the SALUS project, where we evaluated the quality of both existing and inferred terminology mappings among standard terminologies.

  5. Application of OpenStreetMap (OSM) to Support the Mapping Village in Indonesia

    NASA Astrophysics Data System (ADS)

    Swasti Kanthi, Nurin; Hery Purwanto, Taufik

    2016-11-01

    Geospatial information is important in this era because location information is needed to understand the condition of a region. In 2015 the Indonesian government released detailed mapping at the village level, with state regulatory standards set forth in the Norms, Standards, Procedures and Criteria for Village Mapping (NSPK). Over time, Web and Mobile GIS have been developed with a wide range of applications, but the combination of detailed mapping and Web GIS is still rarely performed and not used optimally. OpenStreetMap (OSM) is a WebGIS that can also be used as a Mobile GIS, providing information detailed enough to represent individual buildings, and can be used for village mapping. Village mapping with OSM was conducted using a remote sensing approach and Geographic Information Systems (GIS) to interpret remote sensing imagery from OSM. The study analyzed how far OSM can support village mapping by entering house-number data, administrative boundaries, public facilities, and land use into OSM, with reference data and Village Plan image data. The resulting village maps in OSM serve as reference maps and were analyzed for conformance with the NSPK for detailed mapping of the Rukun Warga (RW), a subdivision of the village. OSM greatly assists detailed mapping of a region, with image data sources accessible as Open Source, but the data source still requires maintenance and updating to preserve data validity.

  6. SOHO EIT Carrington maps from synoptic full-disk data

    NASA Technical Reports Server (NTRS)

    Thompson, B. J.; Newmark, J. S.; Gurman, J. B.; Delaboudiniere, J. P.; Clette, F.; Gibson, S. E.

    1997-01-01

    The solar synoptic maps, obtained from observations carried out since May 1996 by the extreme-ultraviolet imaging telescope (EIT) onboard the Solar and Heliospheric Observatory (SOHO), are presented. The maps were constructed for each Carrington rotation with the calibrated data. The off-limb maps at 1.05 and 1.10 solar radii were generated for three coronal lines using the standard applied to coronagraph synoptic maps. The maps reveal several aspects of the solar structure over the entire rotation and are used in the Whole Sun Month modeling campaign.

  7. Hydrologic Unit Map -- 1978, state of South Dakota

    USGS Publications Warehouse

    ,

    1978-01-01

    This map and accompanying table show Hydrologic Units that are basically hydrographic in nature. The Cataloging Units shown supplant the Cataloging Units previously depicted on the 1974 State Hydrologic Unit Map. The boundaries as shown have been adapted from the 1974 State Hydrologic Unit Map, "The Catalog of Information on Water Data" (1972), "Water Resources Regions and Subregions for the National Assessment of Water and Related Land Resources" by the U.S. Water Resources Council (1970), "River Basins of the United States" by the U.S. Soil Conservation Service (1963, 1970), "River Basin Maps Showing Hydrologic Stations" by the Inter-Agency Committee on Water Resources, Subcommittee on Hydrology (1961), and State planning maps. The Political Subdivisions have been adopted from "Counties and County Equivalents of the States of the United States" presented in Federal Information Processing Standards Publication 6-2, issued by the National Bureau of Standards (1973), in which each county or county equivalent is identified by a 2-character State code and a 3-character county code. The Regions, Subregions and Accounting Units are aggregates of the Cataloging Units. The Regions and Subregions are currently (1978) used by the U.S. Water Resources Council for comprehensive planning, including the National Assessment, and as a standard geographical framework for more detailed water and related land-resources planning. The Accounting Units are those currently (1978) in use by the U.S. Geological Survey for managing the National Water Data Network. This map was revised to include a boundary realignment between Cataloging Units 10140103 and 10160009.

  8. Etalon (standard) for surface potential distribution produced by electric activity of the heart.

    PubMed

    Szathmáry, V; Ruttkay-Nedecký, I

    1981-01-01

    The authors submit etalon (standard) equipotential maps as an aid in the evaluation of maps of surface potential distributions in living subjects. They were obtained by measuring potentials on the surface of an electrolytic tank shaped like the thorax. The individual etalon maps were determined in such a way that the parameters of the physical dipole forming the source of the electric field in the tank corresponded to the mean vectorcardiographic parameters measured in a healthy population sample. The technique also allows a quantitative estimate of the degree of non-dipolarity of the heart as the source of the electric field.
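The etalon maps above are built from a physical dipole source in a thorax-shaped tank. As a toy illustration of the underlying physics, the surface potential of a single current dipole in a homogeneous volume conductor is V = p·r / (4πσ|r|³); the geometry, conductivity, and dipole moment below are assumed for illustration, not taken from the paper:

```python
import math

# Surface potential of a current dipole in a homogeneous volume conductor,
# V = (p . r) / (4*pi*sigma*|r|^3), sampled on a ring of "thorax" electrodes.
# Conductivity, radius, and dipole moment are assumed values for illustration.
sigma = 0.2           # tissue conductivity (S/m), assumed
p = (0.0, 1e-6)       # current dipole moment (A*m), pointing "up"
ring_radius = 0.15    # m, circular thorax cross-section

potentials = []
for k in range(8):    # 8 electrodes evenly spaced around the ring
    ang = 2 * math.pi * k / 8
    r = (ring_radius * math.cos(ang), ring_radius * math.sin(ang))
    rr = math.hypot(*r)
    v = (p[0] * r[0] + p[1] * r[1]) / (4 * math.pi * sigma * rr ** 3)
    potentials.append(v)

# Largest potential at the electrode facing the dipole's positive pole.
print(["%.2e" % v for v in potentials])
```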

  9. Fast and Accurate Construction of Ultra-Dense Consensus Genetic Maps Using Evolution Strategy Optimization

    PubMed Central

    Mester, David; Ronin, Yefim; Schnable, Patrick; Aluru, Srinivas; Korol, Abraham

    2015-01-01

    Our aim was to develop a fast and accurate algorithm for constructing consensus genetic maps for chip-based SNP genotyping data with a high proportion of shared markers between mapping populations. Chip-based genotyping of SNP markers allows high-density genetic maps to be produced with a relatively standardized set of marker loci for different mapping populations. The availability of a standard high-throughput mapping platform simplifies consensus analysis by ignoring unique markers at the stage of consensus mapping, thereby reducing the mathematical complexity of the problem and, in turn, allowing larger mapping data sets to be analyzed using global optimization criteria instead of local ones. Our three-phase analytical scheme includes automatic selection of ~100-300 of the most informative (resolvable by recombination) markers per linkage group, building a stable skeletal marker order for each data set and verifying it using jackknife re-sampling, and consensus mapping analysis based on a global optimization criterion. A novel Evolution Strategy optimization algorithm with a global optimization criterion presented in this paper is able to generate high-quality, ultra-dense consensus maps with many thousands of markers per genome. This algorithm utilizes "potentially good orders" in the initial solution and in the new mutation procedures that generate trial solutions, enabling a consensus order to be obtained in reasonable time. The developed algorithm, tested on a wide range of simulated data and real-world data (Arabidopsis), outperformed two state-of-the-art algorithms in mapping accuracy and computation time. PMID:25867943
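The evolutionary search over marker orders can be illustrated with a deliberately tiny sketch: a (1+1)-ES-style hill climb over permutations with segment-reversal mutations, minimizing total adjacent recombination distance on a toy distance matrix. This illustrates the general idea only, not the authors' algorithm (which uses "potentially good orders" and far richer mutation procedures):

```python
import random

def order_cost(order, dist):
    # Total distance between adjacent markers; lower = better order.
    return sum(dist[order[i]][order[i + 1]] for i in range(len(order) - 1))

def es_order(dist, iters=5000, seed=0):
    # (1+1)-evolution-strategy-style search over marker permutations:
    # mutate by reversing a random segment, keep the child if not worse.
    rng = random.Random(seed)
    n = len(dist)
    best = list(range(n))
    rng.shuffle(best)
    best_cost = order_cost(best, dist)
    for _ in range(iters):
        i, j = sorted(rng.sample(range(n), 2))
        child = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
        c = order_cost(child, dist)
        if c <= best_cost:
            best, best_cost = child, c
    return best, best_cost

# Toy "recombination distance" matrix for 6 markers whose true order is 0..5.
true_pos = [0.0, 1.0, 2.5, 4.0, 6.0, 9.0]
dist = [[abs(a - b) for b in true_pos] for a in true_pos]
order, cost = es_order(dist)
print(order, cost)
```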

  10. The North Atlantic Data Portal: A Current Approach To Improving Marine Geophysical Data Discovery And Access

    NASA Astrophysics Data System (ADS)

    Jencks, J. H.; Cartwright, J.; Varner, J. D.

    2016-12-01

    Exploring, understanding, and managing the global oceans is a challenge when hydrographic maps are available for only 5% of the world's oceans. Seafloor mapping is expensive, and most government and academic budgets continue to tighten. The first step for any mapping program, before setting out to map uncharted waters, should be to identify whether data already exist in the area of interest. There are many reasons why this seemingly simple suggestion is easier said than done. While certain datasets are accessible online (e.g., NOAA's NCEI, EMODnet, IHO-DCDB), many are not. In some cases, data that are publicly available are difficult to discover and access. No single agency can successfully resolve the complex and pressing demands of ocean and coastal mapping and the associated data stewardship. The National Oceanic and Atmospheric Administration (NOAA) is an active participant in numerous campaign mapping projects whose goals are to carry out coordinated and comprehensive ocean mapping efforts. One of these international programs is an outcome of the Galway Statement on Atlantic Ocean Cooperation signed by the European Union, Canada, and the United States in 2013. At NOAA's National Centers for Environmental Information (NCEI), resources are focused on ensuring the security and widespread availability of the Nation's scientific marine geophysical data through long-term stewardship. NCEI draws on a variety of software technologies and adheres to international standards to meet this challenge. The result is a geospatial framework built on spatially enabled databases, standards-based web services, and International Organization for Standardization (ISO) metadata. Through the use of industry standards, the services are constructed such that they can be combined and re-used in a variety of contexts. For example, users may leverage the services in desktop analysis tools, in web applications created by the hosting organizations (e.g., the North Atlantic Data Portal), or in custom applications they develop themselves. To maximize the return on campaign mapping investments, legacy and newly acquired data must be easily discoverable and readily accessible by numerous applications and in numerous formats, now and well into the future. Working together, we can ensure that valuable data are made available to the broadest community.

  11. [Impact to Z-score Mapping of Hyperacute Stroke Images by Computed Tomography in Adaptive Statistical Iterative Reconstruction].

    PubMed

    Watanabe, Shota; Sakaguchi, Kenta; Hosono, Makoto; Ishii, Kazunari; Murakami, Takamichi; Ichikawa, Katsuhiro

    The purpose of this study was to evaluate the effect of a hybrid-type iterative reconstruction method on Z-score mapping of hyperacute stroke in unenhanced computed tomography (CT) images. We used a hybrid-type iterative reconstruction method [adaptive statistical iterative reconstruction (ASiR)] implemented in a CT system (Optima CT660 Pro advance, GE Healthcare). For 15 normal brain cases, we reconstructed CT images with filtered back projection (FBP) and with ASiR at a blending factor of 100% (ASiR100%). Two standardized normal brain data sets were created from normal databases of FBP images (FBP-NDB) and ASiR100% images (ASiR-NDB), and standard deviation (SD) values in the basal ganglia were measured. Z-score mapping was performed for 12 hyperacute stroke cases using FBP-NDB and ASiR-NDB, and the Z-score values in hyperacute stroke areas and normal areas were compared between FBP-NDB and ASiR-NDB. With ASiR-NDB, the SD value of the standardized brain data decreased by 16%. The Z-score value of ASiR-NDB in hyperacute stroke areas was significantly higher than that of FBP-NDB (p<0.05). Therefore, using images reconstructed with ASiR100% for Z-score mapping has the potential to improve the accuracy of Z-score mapping.
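The Z-score mapping in this record compares a patient image voxel-wise against a normal database (NDB): z = (patient − NDB mean) / NDB SD, so a lower-noise NDB (smaller SD, as with ASiR100%) yields a larger |z| for the same deviation. A toy sketch with made-up Hounsfield values:

```python
# Voxel-wise Z-score mapping against a normal database (NDB):
# z = (patient - NDB mean) / NDB standard deviation. A lower-noise NDB
# shrinks sigma, so the same CT deviation yields a larger |z|.
# Toy 1-D "image" with hypothetical numbers, not study data.
from statistics import mean, stdev

ndb = [  # CT values (HU) at each voxel across 5 normal subjects
    [34.0, 35.0, 36.0, 33.0, 35.5],   # voxel 0
    [30.0, 29.0, 31.0, 30.5, 29.5],   # voxel 1
]
patient = [28.0, 30.2]  # voxel 0 looks hypodense (possible early ischemia)

z_map = []
for voxel_values, x in zip(ndb, patient):
    mu, sigma = mean(voxel_values), stdev(voxel_values)
    z_map.append((x - mu) / sigma)

print([round(z, 2) for z in z_map])  # [-5.56, 0.25]
```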

  12. Hybrid Semantic Analysis for Mapping Adverse Drug Reaction Mentions in Tweets to Medical Terminology.

    PubMed

    Emadzadeh, Ehsan; Sarker, Abeed; Nikfarjam, Azadeh; Gonzalez, Graciela

    2017-01-01

    Social networks, such as Twitter, have become important sources for active monitoring of user-reported adverse drug reactions (ADRs). Automatic extraction of ADR information can be crucial for healthcare providers, drug manufacturers, and consumers. However, because of the non-standard nature of social media language, automatically extracted ADR mentions need to be mapped to standard forms before they can be used by operational pharmacovigilance systems. We propose a modular natural language processing pipeline for mapping (normalizing) colloquial mentions of ADRs to their corresponding standardized identifiers. We seek to accomplish this task and to enable customization of the pipeline so that distinct unlabeled free-text resources can be incorporated, allowing the system to be used for other normalization tasks. Our approach, which we call Hybrid Semantic Analysis (HSA), sequentially employs rule-based and semantic matching algorithms for mapping user-generated mentions to concept IDs in the Unified Medical Language System vocabulary. The semantic matching component of HSA is adaptive in nature and uses a regression model to combine various measures of semantic relatedness and resources to optimize normalization performance on the selected data source. On a publicly available corpus, our normalization method achieves 0.502 recall and 0.823 precision (F-measure: 0.624). Our proposed method outperforms a baseline based on latent semantic analysis and another that uses MetaMap.
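The sequential rule-based-then-semantic design can be sketched as a two-stage lookup: exact lexicon match first, then a fallback matcher (here a trivial token-overlap score stands in for the regression-combined semantic relatedness measures; the lexicon and concept IDs below are invented, not UMLS):

```python
# Two-stage normalization sketch: rule-based exact lookup, then a semantic
# fallback. The lexicon, mentions, and concept IDs are hypothetical.
lexicon = {
    "headache": "C-0001",
    "nausea": "C-0002",
    "sleepy": "C-0003",
}

def token_overlap(a, b):
    # Jaccard overlap of word tokens, a crude stand-in for semantic relatedness.
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb)

def normalize(mention):
    m = mention.lower().strip()
    if m in lexicon:                       # stage 1: rule-based exact match
        return lexicon[m]
    # stage 2: semantic fallback, closest lexicon entry by token overlap
    best = max(lexicon, key=lambda term: token_overlap(m, term))
    return lexicon[best] if token_overlap(m, best) > 0 else None

print(normalize("Headache"), normalize("really sleepy"))  # C-0001 C-0003
```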

  13. NaviCell Web Service for network-based data visualization.

    PubMed

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei

    2015-07-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. NaviCell Web Service for network-based data visualization

    PubMed Central

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei

    2015-01-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of ‘omics’ data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. PMID:25958393

  15. 76 FR 72144 - Standardized and Enhanced Disclosure Requirements for Television Broadcast Licensee Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-22

    ..., contour maps; ownership reports and related materials; portions of the Equal Employment Opportunity file... maps; ownership reports and related materials; portions of the Equal Employment Opportunity file held... immediately following the shortened license term. See 47 CFR 73.3526(e)(2), 73.3527(e)(2). Contour Maps (as...

  16. 30 CFR 77.1200 - Mine map.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... SAFETY STANDARDS, SURFACE COAL MINES AND SURFACE WORK AREAS OF UNDERGROUND COAL MINES Maps § 77.1200 Mine... elevation of any body of water dammed or held back in any portion of the mine: Provided, however, Such bodies of water may be shown on overlays or tracings attached to the mine maps; (g) All prospect drill...

  17. Copyright | USDA Plant Hardiness Zone Map

    Science.gov Websites

    Copyright: Map graphics. As a U.S. Government publication, the USDA Plant Hardiness Zone Map itself is not subject to copyright. Under a Specific Cooperative Agreement, Oregon State University agreed to supply the U.S. Government with unenhanced (standard resolution) GIS data in grid and shapefile formats. U.S. Government users may use these data.

  18. Connecticut Music Trace Map for Grades 10 and 12. Revised.

    ERIC Educational Resources Information Center

    Connecticut State Board of Education, Hartford.

    The Connecticut Curriculum Trace Maps for music are designed to help curriculum developers and teachers translate Connecticut's K-12 performance standards into objectives and classroom practice. The Trace Maps provide specific descriptions of what students should know and be able to do at smaller grade level clusters. The elements in the Trace…

  19. Connecticut Music Trace Map for Grades 2 and 4. Revised.

    ERIC Educational Resources Information Center

    Connecticut State Board of Education, Hartford.

    These Connecticut Curriculum Trace Maps for music are designed to help curriculum developers and teachers translate Connecticut's K-12 performance standards into objectives and classroom practice. The music Trace Maps provide specific descriptions of what students should know and be able to do at smaller grade level clusters. Connecticut's Trace…

  20. Classification criteria and probability risk maps: limitations and perspectives.

    PubMed

    Saisana, Michaela; Dubois, Gregoire; Chaloulakou, Archontoula; Spyrellis, Nikolas

    2004-03-01

    Delineation of polluted zones with respect to regulatory standards, accounting at the same time for the uncertainty of the estimated concentrations, relies on classification criteria that can lead to significantly different pollution risk maps, which, in turn, can depend on the regulatory standard itself. This paper reviews four popular classification criteria related to the violation of a probability threshold or a physical threshold, using annual (1996-2000) nitrogen dioxide concentrations from 40 air monitoring stations in Milan. The relative advantages and practical limitations of each criterion are discussed, and it is shown that some of the criteria are more appropriate for the problem at hand and that the choice of the criterion can be supported by the statistical distribution of the data and/or the regulatory standard. Finally, the polluted area is estimated over the different years and concentration thresholds using the appropriate risk maps as an additional source of uncertainty.
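The point that different classification criteria can yield different risk maps is easy to demonstrate: below, a station's simulated concentration estimates are classified once by comparing the mean estimate to a physical threshold and once by a probability-threshold criterion, and the two disagree. All numbers are illustrative, not from the Milan data:

```python
# Two classification criteria for delineating a polluted zone at one station:
# (1) mean estimate exceeds the regulatory (physical) standard;
# (2) probability of exceeding the standard exceeds a probability threshold.
# With a skewed uncertainty distribution, the criteria can disagree.
samples = [30, 32, 35, 38, 40, 42, 45, 55, 70, 90]  # simulated NO2 estimates (ug/m3)
standard = 44.0      # hypothetical regulatory standard
p_threshold = 0.5    # hypothetical probability threshold

mean_est = sum(samples) / len(samples)
prob_exceed = sum(1 for s in samples if s > standard) / len(samples)

polluted_by_mean = mean_est > standard        # criterion 1
polluted_by_prob = prob_exceed > p_threshold  # criterion 2
print(mean_est, prob_exceed, polluted_by_mean, polluted_by_prob)
# 47.7 0.4 True False  -> the two criteria classify the station differently
```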

  1. Constellation labeling optimization for bit-interleaved coded APSK

    NASA Astrophysics Data System (ADS)

    Xiang, Xingyu; Mo, Zijian; Wang, Zhonghai; Pham, Khanh; Blasch, Erik; Chen, Genshe

    2016-05-01

    This paper investigates constellation and mapping optimization for amplitude phase shift keying (APSK) modulation, which is deployed in the Digital Video Broadcasting Satellite - Second Generation (DVB-S2) and Digital Video Broadcasting - Satellite services to Handhelds (DVB-SH) broadcasting standards owing to its power and spectral efficiency together with its robustness against nonlinear distortion. The mapping optimization is performed for 32-APSK according to combined cost functions related to Euclidean distance and mutual information. A binary switching algorithm and a modified version of it are used to minimize the cost function and the estimated error between the original and received data. The optimized constellation mapping is tested by combining it with DVB-S2 standard Low-Density Parity-Check (LDPC) codes in both Bit-Interleaved Coded Modulation (BICM) and BICM with iterative decoding (BICM-ID) systems. The simulation results validate the proposed constellation labeling optimization scheme, which yields better performance than the conventional 32-APSK constellation defined in the DVB-S2 standard.
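A binary switching search for constellation labeling can be sketched as follows: pairwise label swaps are accepted whenever they reduce a cost that weights bit (Hamming) differences by inverse squared Euclidean distance, a common stand-in for the BICM error bound. This simplified first-improvement variant, demonstrated on 8-PSK rather than 32-APSK, only illustrates the idea, not the paper's exact algorithm:

```python
import cmath, itertools

def cost(labels, pts):
    # Labeling cost: Hamming distance between labels, weighted by inverse
    # squared Euclidean distance -- near neighbors should differ in few bits
    # (Gray-like mapping), so close symbol pairs dominate the sum.
    c = 0.0
    for i, j in itertools.combinations(range(len(pts)), 2):
        d2 = abs(pts[i] - pts[j]) ** 2
        c += bin(labels[i] ^ labels[j]).count("1") / d2
    return c

def binary_switching(labels, pts):
    # First-improvement switching: keep any pairwise label swap that lowers
    # the cost; repeat full passes until no swap improves.
    labels = labels[:]
    best = cost(labels, pts)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(labels)), 2):
            labels[i], labels[j] = labels[j], labels[i]
            c = cost(labels, pts)
            if c < best - 1e-12:
                best, improved = c, True
            else:
                labels[i], labels[j] = labels[j], labels[i]  # undo swap
    return labels, best

# 8-PSK toy constellation (the record optimizes 32-APSK; same idea).
pts = [cmath.exp(2j * cmath.pi * k / 8) for k in range(8)]
natural = list(range(8))  # natural binary labeling as the starting point
opt, c_opt = binary_switching(natural, pts)
print(c_opt <= cost(natural, pts))  # True: the search never accepts a worse labeling
```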

  2. Standardization of Schwarz-Christoffel transformation for engineering design of semiconductor and hybrid integrated-circuit elements

    NASA Astrophysics Data System (ADS)

    Yashin, A. A.

    1985-04-01

    On the basis of Maxwell's electromagnetic-thermal similarity principle, a universal algorithm can be constructed for the engineering design of integrated-circuit elements that maps a semiconductor or hybrid structure into a calculable two-dimensional region by the Schwarz-Christoffel transformation. The design procedure involves conformal mapping of the original region into a polygon and then of the latter into a rectangle with a uniform field distribution, where conductances and capacitances are calculated using tabulated standard mapping functions. Subsequent synthesis of a device requires inverse conformal mapping. Devices adaptable as integrated-circuit elements are high-resistance film resistors with periodic serration, distributed-resistance film attenuators with a high transformation ratio, coplanar microstrip lines, bipolar transistors, directional couplers with distributed coupling to microstrip lines for microwave bulk devices, and quasiregular smooth matching transitions from asymmetric to coplanar microstrip lines.

  3. National Water Quality Standards Database (NWQSD)

    EPA Pesticide Factsheets

    The National Water Quality Standards Database (WQSDB) provides access to EPA and state water quality standards (WQS) information in text, tables, and maps. This data source was last updated in December 2007 and will no longer be updated.

  4. Mapping Live Fuel Moisture and the relation to drought and post fire events for Southern California region

    NASA Astrophysics Data System (ADS)

    Hatzopoulos, N.; Kim, S. H.; Kafatos, M.; Nghiem, S. V.; Myoung, B.

    2016-12-01

    Live Fuel Moisture is a dryness measure used by fire departments to determine how dry forest fuels currently are. To map Live Fuel Moisture, we conducted an analysis with a standardized regression approach applied to various vegetation indices derived from MODIS remote sensing data. After analyzing the results, we settled on mapping Live Fuel Moisture using a standardized NDVI product. From the mapped remotely sensed product, we observed that the appearance of extremely dry fuels is highly correlated with very dry years as measured by overall yearly precipitation. The appearance of extremely dry mapped fuels tends to be directly associated with fire events and was observed to be a post-fire indicator. In addition, we studied the appearance of extremely dry fuels during the critical months when the season changes from spring to summer, as well as their relation to fire events.
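A standardized NDVI product of the kind mentioned above is typically a z-score of NDVI against its multi-year climatology for the same period; strongly negative values flag unusually dry fuels. A sketch with invented values:

```python
from statistics import mean, stdev

# Standardized NDVI anomaly as a dryness proxy: for a given month,
# z = (NDVI - multi-year mean) / multi-year std. Strongly negative z flags
# unusually dry fuels. All NDVI values are made up for illustration.
june_ndvi_by_year = {2010: 0.52, 2011: 0.55, 2012: 0.50, 2013: 0.48, 2014: 0.38}
years = sorted(june_ndvi_by_year)
baseline = [june_ndvi_by_year[y] for y in years[:-1]]  # 2010-2013 climatology
mu, sigma = mean(baseline), stdev(baseline)
z_2014 = (june_ndvi_by_year[2014] - mu) / sigma
print(round(z_2014, 2))  # -4.44: 2014 is an extreme dry anomaly
```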

  5. Isolation and characterization of a cDNA clone specific for avian vitellogenin II.

    PubMed Central

    Protter, A A; Wang, S Y; Shelness, G S; Ostapchuk, P; Williams, D L

    1982-01-01

    A clone for vitellogenin, a major avian, estrogen-responsive egg yolk protein, was isolated from a cDNA library of estrogen-induced rooster liver. Two forms of plasma vitellogenin, vitellogenin I (VTG I) and vitellogenin II (VTG II), distinguishable on the basis of their unique partial proteolysis maps, have been characterized and their corresponding hepatic precursor forms identified. We used this criterion to determine which vitellogenin protein had been cloned. Partial proteolysis maps of VTG I and VTG II standards, synthesized in vivo, were compared to maps of protein synthesized in vitro using RNA hybrid-selected by the vitellogenin plasmid. Eight major digest fragments were found common to the in vitro synthesized vitellogenin and the VTG II standard, while no fragments were observed to correspond to the VTG I map. A restriction map of the VTG II cDNA clone permits comparison to previously described cDNA and genomic vitellogenin clones. PMID:6182527

  6. Research on Integrated Mapping——A Case Study of Integrated Land Use with Swamp Mapping

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Yan, F.; Chang, L.

    2015-12-01

    The unified real estate registration system reflects the attention, determination, and effort of the CPC Central Committee and the State Council regarding real estate registration in China. Under the current situation, however, China's real estate registration work has made little progress. One of the reasons is that it is hard to express the property rights of real estate on one map under the multi-sector management system. Under the current multi-sector management system in China, different departments usually survey and map only the land types under their own jurisdiction. For example, wetland investigation maps all kinds of wetland resources but not other resource types. As a result, integrating the different results from different departments produces problems of overlap or omission. As resources of the earth's surface, the total area of forest, grassland, wetland, and so on should equal the total surface area; under the current system, however, the summed area of all resource types does not. Therefore, it is of great importance to express all resources on one map. On the one hand, this helps determine the real area and distribution of resources and avoids overlap or omission during integration; on the other hand, it helps in studying the dynamic change of different resources. We therefore proposed "integrated mapping" as a solution, and we take integrated land use with swamp mapping in Northeast China as an example to investigate its feasibility and difficulty. The study showed that integrated land use with swamp mapping can be achieved by combining land use survey standards with swamp survey standards in a "second mapping" program. Based on the experience of integrated land use with swamp mapping, we point out its reference value for integrated mapping and the unified real estate registration system. We conclude that: (1) comprehending and integrating the different survey standards of different resources is the premise of "integrated mapping"; (2) we put forward a "multiple code" and "multiple interpretation" scheme to solve the problem of "attribute overlap"; (3) the area of "attribute overlap" can be divided by a certain ratio to determine property rights in unified real estate registration.

  7. Ensemble Learning of QTL Models Improves Prediction of Complex Traits

    PubMed Central

    Bian, Yang; Holland, James B.

    2015-01-01

    Quantitative trait locus (QTL) models can provide useful insights into trait genetic architecture because of their straightforward interpretability but are less useful for genetic prediction because of the difficulty in including the effects of numerous small effect loci without overfitting. Tight linkage between markers introduces near collinearity among marker genotypes, complicating the detection of QTL and estimation of QTL effects in linkage mapping, and this problem is exacerbated by very high density linkage maps. Here we developed a thinning and aggregating (TAGGING) method as a new ensemble learning approach to QTL mapping. TAGGING reduces collinearity problems by thinning dense linkage maps, maintains aspects of marker selection that characterize standard QTL mapping, and by ensembling, incorporates information from many more marker-trait associations than traditional QTL mapping. The objective of TAGGING was to improve prediction power compared with QTL mapping while also providing more specific insights into genetic architecture than genome-wide prediction models. TAGGING was compared with standard QTL mapping using cross validation of empirical data from the maize (Zea mays L.) nested association mapping population. TAGGING-assisted QTL mapping substantially improved prediction ability for both biparental and multifamily populations by reducing both the variance and bias in prediction. Furthermore, an ensemble model combining predictions from TAGGING-assisted QTL and infinitesimal models improved prediction abilities over the component models, indicating some complementarity between model assumptions and suggesting that some trait genetic architectures involve a mixture of a few major QTL and polygenic effects. PMID:26276383
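The thinning-and-aggregating idea can be caricatured in a few lines: fit a crude single-marker model on each of several thinned subsets of a dense map, then average (ensemble) the predictions. Everything below, from the additive toy genetics to the genotype-contrast effect estimator, is an assumption for illustration, not the authors' implementation:

```python
import random
from statistics import mean

# Toy TAGGING-style ensemble: several "QTL models", each fit on a thinned
# subset of a dense marker map, are averaged into one prediction.
random.seed(1)
n_markers, n_ind = 60, 100
genotypes = [[random.choice((0, 1)) for _ in range(n_markers)]
             for _ in range(n_ind)]
true_effects = [0.0] * n_markers
true_effects[10], true_effects[40] = 1.5, -1.0   # two causal loci (assumed)
phenotype = [sum(g * e for g, e in zip(ind, true_effects)) + random.gauss(0, 0.5)
             for ind in genotypes]

def fit_thinned(offset, step=5):
    # Crude "QTL model" on a thinned map: keep every step-th marker and
    # estimate each kept marker's effect as the phenotype contrast between
    # its two genotype classes.
    effects = {}
    for m in range(offset, n_markers, step):
        g1 = [y for ind, y in zip(genotypes, phenotype) if ind[m] == 1]
        g0 = [y for ind, y in zip(genotypes, phenotype) if ind[m] == 0]
        effects[m] = (mean(g1) - mean(g0)) if g1 and g0 else 0.0
    return effects

def predict(ind, effects):
    return sum(ind[m] * e for m, e in effects.items())

models = [fit_thinned(off) for off in range(5)]  # 5 thinned maps cover all markers
ensemble = [mean(predict(ind, m) for m in models) for ind in genotypes]
print(len(models), round(ensemble[0], 3))
```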

  8. An interactive method for digitizing zone maps

    NASA Technical Reports Server (NTRS)

    Giddings, L. E.; Thompson, E. J.

    1975-01-01

    A method is presented for digitizing maps that consist of zones, such as contour or climatic zone maps. A color-coded map is prepared by any convenient process. The map is then read into memory of an Image 100 computer by means of its table scanner, using colored filters. Zones are separated and stored in themes, using standard classification procedures. Thematic data are written on magnetic tape and these data, appropriately coded, are combined to make a digitized image on tape. Step-by-step procedures are given for digitization of crop moisture index maps with this procedure. In addition, a complete example of the digitization of a climatic zone map is given.
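The zone-digitizing workflow above (classify colors into themes, then combine the coded themes into one digital image) can be sketched on a toy raster; the colors and zone codes are hypothetical:

```python
# Sketch of zone-map digitization: a color-coded zone map is classified into
# themes (one boolean mask per color), and the themes are combined into a
# single coded raster. Colors and zone codes are hypothetical.
color_map = [
    ["green", "green", "blue"],
    ["green", "blue",  "blue"],
    ["red",   "red",   "blue"],
]
zone_code = {"green": 1, "blue": 2, "red": 3}

# Theme extraction: one boolean mask per zone color.
themes = {
    color: [[pixel == color for pixel in row] for row in color_map]
    for color in zone_code
}

# Combine the coded themes into one digitized image.
digitized = [
    [next(zone_code[c] for c in zone_code if themes[c][r][p])
     for p in range(len(color_map[0]))]
    for r in range(len(color_map))
]
print(digitized)  # [[1, 1, 2], [1, 2, 2], [3, 3, 2]]
```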

  9. Communications among elements of a space construction ensemble

    NASA Technical Reports Server (NTRS)

    Davis, Randal L.; Grasso, Christopher A.

    1989-01-01

    Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.

  10. 30 CFR 75.1204-1 - Places to give notice and file maps.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Places to give notice and file maps. 75.1204-1 Section 75.1204-1 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Maps § 75.1204-1 Places to give...

  11. Development and testing of a tool for assessing and resolving medication-related problems in older adults in an ambulatory care setting: the individualized medication assessment and planning (iMAP) tool.

    PubMed

    Crisp, Ginny D; Burkhart, Jena Ivey; Esserman, Denise A; Weinberger, Morris; Roth, Mary T

    2011-12-01

    Medication is one of the most important interventions for improving the health of older adults, yet it has great potential for causing harm. Clinical pharmacists are well positioned to engage in medication assessment and planning. The Individualized Medication Assessment and Planning (iMAP) tool was developed to aid clinical pharmacists in documenting medication-related problems (MRPs) and associated recommendations. The purpose of our study was to assess the reliability and usability of the iMAP tool in classifying MRPs and associated recommendations in older adults in the ambulatory care setting. Three cases, representative of older adults seen in an outpatient setting, were developed. Pilot testing was conducted and a "gold standard" key developed. Eight eligible pharmacists consented to participate in the study. They were instructed to read each case, make an assessment of MRPs, formulate a plan, and document the information using the iMAP tool. Inter-rater reliability was assessed for each case, comparing the pharmacists' identified MRPs and recommendations to the gold standard. Consistency of categorization across reviewers was assessed using the κ statistic or percent agreement. The mean κ across the 8 pharmacists in classifying MRPs compared with the gold standard was 0.74 (range, 0.54-1.00) for case 1 and 0.68 (range, 0.36-1.00) for case 2, indicating substantial agreement. For case 3, percent agreement was 63% (range, 40%-100%). The mean κ across the 8 pharmacists when classifying recommendations compared with the gold standard was 0.87 (range, 0.58-1.00) for case 1 and 0.88 (range, 0.75-1.00) for case 2, indicating almost perfect agreement. For case 3, percent agreement was 68% (range, 40%-100%). Clinical pharmacists found the iMAP tool easy to use. The iMAP tool provides a reliable and standardized approach for clinical pharmacists to use in the ambulatory care setting to classify MRPs and associated recommendations. 
Future studies will explore the predictive validity of the tool on clinical outcomes such as health care utilization. Copyright © 2011 Elsevier HS Journals, Inc. All rights reserved.
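The inter-rater comparison above relies on Cohen's κ, which discounts chance agreement: κ = (p_o − p_e)/(1 − p_e). A small sketch with invented ratings against a gold-standard key:

```python
from collections import Counter

# Cohen's kappa for one rater vs. a gold standard: observed agreement p_o,
# chance agreement p_e from the marginal category frequencies.
def cohens_kappa(rater, gold):
    assert len(rater) == len(gold)
    n = len(gold)
    p_o = sum(a == b for a, b in zip(rater, gold)) / n  # observed agreement
    cr, cg = Counter(rater), Counter(gold)
    p_e = sum(cr[k] * cg[k] for k in set(rater) | set(gold)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Invented classifications (MRP identified vs. none), not study data.
gold  = ["MRP", "MRP", "none", "MRP", "none", "none", "MRP", "none"]
rater = ["MRP", "MRP", "none", "none", "none", "MRP", "MRP", "none"]
print(round(cohens_kappa(rater, gold), 2))  # 0.5: moderate agreement
```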

  12. Self-Consistent Chaotic Transport in a High-Dimensional Mean-Field Hamiltonian Map Model

    DOE PAGES

    Martínez-del-Río, D.; del-Castillo-Negrete, D.; Olvera, A.; ...

    2015-10-30

    We studied self-consistent chaotic transport in a Hamiltonian mean-field model. This model provides a simplified description of transport in marginally stable systems, including vorticity mixing in strong shear flows and electron dynamics in plasmas. Self-consistency is incorporated through a mean field that couples all the degrees of freedom. The model is formulated as a large set of N coupled standard-like area-preserving twist maps in which the amplitude and phase of the perturbation, rather than being constant as in the standard map, are dynamical variables. Of particular interest is the study of the impact of periodic orbits on the chaotic transport and coherent structures. Furthermore, numerical simulations show that self-consistency leads to the formation of a coherent macro-particle trapped around the elliptic fixed point of the system, which appears together with an asymptotic periodic behavior of the mean field. To model this asymptotic state, we introduced a non-autonomous map that allows a detailed study of the onset of global transport. A turnstile-type transport mechanism that allows transport across instantaneous KAM invariant circles in non-autonomous systems is discussed. As a first step toward understanding transport, we study a special type of orbits referred to as sequential periodic orbits. Using symmetry properties, we show that, through replication, high-dimensional sequential periodic orbits can be generated starting from low-dimensional periodic orbits. We show that sequential periodic orbits in the self-consistent map can be continued from trivial (uncoupled) periodic orbits of standard-like maps using numerical and asymptotic methods. Normal forms are used to describe these orbits and to find the values of the map parameters that guarantee their existence. Numerical simulations are used to verify the predictions of the asymptotic methods.
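The model can be sketched as N standard-map-like iterations whose kick amplitude and phase come from the instantaneous mean field rather than a fixed K; the parameters and initial condition below are illustrative, not from the paper:

```python
import math

# N coupled standard-like maps: the kick amplitude and phase are dynamical,
# derived each step from the mean field of all particles. The coupling
# constant, N, step count, and initial condition are illustrative only.
N, steps, coupling = 200, 50, 0.5
theta = [(2 * math.pi * i / N + 0.3 * math.sin(2 * math.pi * i / N)) % (2 * math.pi)
         for i in range(N)]   # slightly non-uniform angles seed a mean field
p = [0.1] * N                 # momenta

for _ in range(steps):
    mx = sum(math.cos(t) for t in theta) / N   # mean field, x component
    my = sum(math.sin(t) for t in theta) / N   # mean field, y component
    amp = coupling * math.hypot(mx, my)        # dynamical kick amplitude
    phase = math.atan2(my, mx)                 # dynamical kick phase
    for i in range(N):
        p[i] += amp * math.sin(theta[i] - phase)   # standard-map-like kick
        theta[i] = (theta[i] + p[i]) % (2 * math.pi)

print(round(math.hypot(mx, my), 3))  # mean-field magnitude at the last step
```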

  13. Self-Consistent Chaotic Transport in a High-Dimensional Mean-Field Hamiltonian Map Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martínez-del-Río, D.; del-Castillo-Negrete, D.; Olvera, A.

We studied self-consistent chaotic transport in a Hamiltonian mean-field model. This model provides a simplified description of transport in marginally stable systems, including vorticity mixing in strong shear flows and electron dynamics in plasmas. Self-consistency is incorporated through a mean field that couples all the degrees of freedom. The model is formulated as a large set of N coupled standard-like area-preserving twist maps in which the amplitude and phase of the perturbation, rather than being constant as in the standard map, are dynamical variables. Of particular interest is the impact of periodic orbits on chaotic transport and coherent structures. Numerical simulations show that self-consistency leads to the formation of a coherent macro-particle trapped around the elliptic fixed point of the system, which appears together with asymptotic periodic behavior of the mean field. To model this asymptotic state, we introduced a non-autonomous map that allows a detailed study of the onset of global transport. A turnstile-type mechanism that allows transport across instantaneous KAM invariant circles in non-autonomous systems is discussed. As a first step toward understanding transport, we study a special type of orbit referred to as sequential periodic orbits. Using symmetry properties, we show that, through replication, high-dimensional sequential periodic orbits can be generated from low-dimensional periodic orbits. We show that sequential periodic orbits in the self-consistent map can be continued from trivial (uncoupled) periodic orbits of standard-like maps using numerical and asymptotic methods. Normal forms are used to describe these orbits and to find the values of the map parameters that guarantee their existence. Numerical simulations verify the predictions of the asymptotic methods.
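The coupling scheme described above, in which the perturbation's amplitude and phase are driven by a mean field rather than held constant, can be sketched in a few lines. The update rule below is a minimal illustration under assumed forms for the mean field and kick, not the authors' exact equations:

```python
import numpy as np

def step(x, p, eps=0.1):
    """One iteration of N mean-field-coupled standard-like maps (sketch).

    Unlike the ordinary standard map, the kick amplitude k and phase theta
    are not constants: they are read off the instantaneous mean field."""
    M = np.mean(np.exp(1j * x))            # mean field over all degrees of freedom
    k, theta = eps * np.abs(M), np.angle(M)
    p = p + k * np.sin(x - theta)          # momentum kick from the collective field
    x = np.mod(x + p, 2 * np.pi)           # area-preserving twist update
    return x, p

# Iterate an ensemble of N = 1000 degrees of freedom from random initial data.
rng = np.random.default_rng(0)
N = 1000
x = rng.uniform(0, 2 * np.pi, N)
p = rng.uniform(-0.5, 0.5, N)
for _ in range(100):
    x, p = step(x, p)
```

In this toy form, trapping of a macro-particle would show up as a persistent cluster of the `x` values; the asymptotic periodic behavior of the mean field corresponds to `np.abs(M)` settling into oscillation.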

  14. Linking late cognitive outcome with glioma surgery location using resection cavity maps.

    PubMed

    Hendriks, Eef J; Habets, Esther J J; Taphoorn, Martin J B; Douw, Linda; Zwinderman, Aeilko H; Vandertop, W Peter; Barkhof, Frederik; Klein, Martin; De Witt Hamer, Philip C

    2018-05-01

Patients with a diffuse glioma may experience cognitive decline or improvement upon resective surgery. To examine the impact of glioma location, cognitive alteration after glioma surgery was quantified and related to voxel-based resection probability maps. A total of 59 consecutive patients (range 18-67 years of age) who had resective surgery between 2006 and 2011 for a supratentorial nonenhancing diffuse glioma (grade I-III, WHO 2007) were included in this observational cohort study. Standardized neuropsychological examination and MRI were obtained before and after surgery. Intraoperative stimulation mapping guided resections towards neurological functions (language, sensorimotor function, and visual fields). Maps of resected regions were constructed in standard space. These resection cavity maps were compared between patients with and without new cognitive deficits (z-score difference >1.5 SD between baseline and one year after resection), using a voxel-wise randomization test and calculation of false discovery rates. Brain regions significantly associated with cognitive decline were classified in standard cortical and subcortical anatomy. Cognitive improvement in any domain occurred in 10 (17%) patients, cognitive decline in any domain in 25 (42%), and decline in more than one domain in 10 (17%). The most frequently affected subdomains were attention in 10 (17%) patients and information processing speed in 9 (15%). Resection regions associated with decline in more than one domain were predominantly located in the right hemisphere. For attention decline, no specific region could be identified. For decline in information processing speed, several regions were found, including the frontal pole and the corpus callosum. Cognitive decline after resective surgery of diffuse glioma is prevalent, particularly in patients with a tumor located in the right hemisphere without cognitive function mapping. © The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
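The decline criterion used above (z-score difference greater than 1.5 SD between baseline and one-year follow-up) reduces to a simple per-domain comparison. The sketch below uses illustrative domain names and scores, not the study's data:

```python
def classify_change(z_before, z_after, threshold=1.5):
    """Label each cognitive domain as 'decline', 'improvement', or 'stable'
    from the z-score difference between baseline and follow-up.

    Sketch only: the 1.5 SD cut-off follows the abstract; the domain
    names and scores in the example are invented."""
    labels = {}
    for domain in z_before:
        diff = z_after[domain] - z_before[domain]
        if diff <= -threshold:
            labels[domain] = "decline"
        elif diff >= threshold:
            labels[domain] = "improvement"
        else:
            labels[domain] = "stable"
    return labels

# Hypothetical patient: baseline vs. one-year z-scores per domain.
example = classify_change(
    {"attention": 0.2, "processing_speed": 0.0, "memory": -0.1},
    {"attention": -1.5, "processing_speed": 1.6, "memory": 0.3},
)
```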

  15. Single-edition quadrangle maps

    USGS Publications Warehouse


    1998-01-01

    In August 1993, the U.S. Geological Survey's (USGS) National Mapping Division and the U.S. Department of Agriculture's Forest Service signed an Interagency Agreement to begin a single-edition joint mapping program. This agreement established the coordination for producing and maintaining single-edition primary series topographic maps for quadrangles containing National Forest System lands. The joint mapping program saves money by eliminating duplication of effort by the agencies and results in a more frequent revision cycle for quadrangles containing national forests. Maps are revised on the basis of jointly developed standards and contain normal features mapped by the USGS, as well as additional features required for efficient management of National Forest System lands. Single-edition maps look slightly different but meet the content, accuracy, and quality criteria of other USGS products. The Forest Service is responsible for the land management of more than 191 million acres of land throughout the continental United States, Alaska, and Puerto Rico, including 155 national forests and 20 national grasslands. These areas make up the National Forest System lands and comprise more than 10,600 of the 56,000 primary series 7.5-minute quadrangle maps (15-minute in Alaska) covering the United States. The Forest Service has assumed responsibility for maintaining these maps, and the USGS remains responsible for printing and distributing them. Before the agreement, both agencies published similar maps of the same areas. The maps were used for different purposes, but had comparable types of features that were revised at different times. Now, the two products have been combined into one so that the revision cycle is stabilized and only one agency revises the maps, thus increasing the number of current maps available for National Forest System lands. 
This agreement has improved service to the public by requiring that the agencies share the same maps and that the maps meet a common standard, as well as by significantly reducing duplication of effort.

  16. Digital mapping techniques '00, workshop proceedings - May 17-20, 2000, Lexington, Kentucky

    USGS Publications Warehouse

    Soller, David R.

    2000-01-01

    Introduction: The Digital Mapping Techniques '00 (DMT'00) workshop was attended by 99 technical experts from 42 agencies, universities, and private companies, including representatives from 28 state geological surveys (see Appendix A). This workshop was similar in nature to the first three meetings, held in June, 1997, in Lawrence, Kansas (Soller, 1997), in May, 1998, in Champaign, Illinois (Soller, 1998a), and in May, 1999, in Madison, Wisconsin (Soller, 1999). This year's meeting was hosted by the Kentucky Geological Survey, from May 17 to 20, 2000, on the University of Kentucky campus in Lexington. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. When, based on discussions at the workshop, an attendee adopts or modifies a newly learned technique, the workshop clearly has met that objective. Evidence of learning and cooperation among participating agencies continued to be a highlight of the DMT workshops (see example in Soller, 1998b, and various papers in this volume). The meeting's general goal was to help move the state geological surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and geographic information systems (GIS) analysis. Through oral and poster presentations and special discussion sessions, emphasis was given to: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) continued development of the National Geologic Map Database; 3) progress toward building a standard geologic map data model; 4) field data-collection systems; and 5) map citation and authorship guidelines. Four representatives of the GIS hardware and software vendor community were invited to participate. 
The four annual DMT workshops were coordinated by the AASG/USGS Data Capture Working Group, which was formed in August, 1996, to support the Association of American State Geologists and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ncgmp.usgs.gov/ngmdbproject/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed to help the Database, and the State and Federal geological surveys, provide more high-quality digital maps to the public.

  17. Standards-Based Open-Source Planetary Map Server: Lunaserv

    NASA Astrophysics Data System (ADS)

    Estes, N. M.; Silva, V. H.; Bowley, K. S.; Lanjewar, K. K.; Robinson, M. S.

    2018-04-01

    Lunaserv is a planetary capable Web Map Service developed by the LROC SOC. It enables researchers to serve their own planetary data to a wide variety of GIS clients without any additional processing or download steps.

  18. Change in land use in the Phoenix (1:250,000) Quadrangle, Arizona between 1970 and 1973: ERTS as an aid in a nationwide program for mapping general land use. [Phoenix Quadrangle, Arizona

    NASA Technical Reports Server (NTRS)

    Place, J. L.

    1974-01-01

Changes in land use between 1970 and 1973 in the Phoenix (1:250,000 scale) Quadrangle in Arizona were mapped using only ERTS-1 images, tending to verify the utility of a standard land use classification system proposed for use with ERTS imagery. The types of change detected were: (1) new residential development on former cropland and rangeland; (2) new cropland reclaimed from the desert; and (3) the filling of a new reservoir. The seasonal changing of vegetation patterns seen in ERTS imagery has complemented air photos in delimiting the boundaries of some land use types. ERTS images, in combination with other sources of information, can assist in mapping the generalized land use of the fifty states by the standard 1:250,000 quadrangles. Several states are already cooperating in this type of mapping.

  19. From Conventional Radiotracer Tc-99(m) with Blue Dye to Indocyanine Green Fluorescence: A Comparison of Methods Towards Optimization of Sentinel Lymph Node Mapping in Early Stage Cervical Cancer for a Laparoscopic Approach.

    PubMed

    Buda, Alessandro; Papadia, Andrea; Zapardiel, Ignacio; Vizza, Enrico; Ghezzi, Fabio; De Ponti, Elena; Lissoni, Andrea Alberto; Imboden, Sara; Diestro, Maria Dolores; Verri, Debora; Gasparri, Maria Luisa; Bussi, Beatrice; Di Martino, Giampaolo; de la Noval, Begoña Diaz; Mueller, Michael; Crivellaro, Cinzia

    2016-09-01

The credibility of sentinel lymph node (SLN) mapping is becoming increasingly established in cervical cancer. We aimed to assess the sensitivity of SLN biopsy in terms of detection rate and bilateral mapping in women with cervical cancer by comparing technetium-99 radiocolloid (Tc-99(m)) and blue dye (BD) versus fluorescence mapping with indocyanine green (ICG). Data of patients with cervical cancer stage 1A2 to 1B1 from 5 European institutions were retrospectively reviewed. All centers used a laparoscopic approach with the same intracervical dye injection. Detection rate and bilateral mapping of ICG were compared, respectively, with results obtained by standard Tc-99(m) with BD. Overall, 76 of 144 women (53%) underwent preoperative SLN mapping with radiotracer and intraoperative BD, whereas 68 of 144 (47%) underwent mapping with intraoperative ICG. The detection rate of SLN mapping was 96% for Tc-99(m) with BD and 100% for ICG. Bilateral mapping was achieved in 98.5% for ICG and 76.3% for Tc-99(m) with BD; this difference was statistically significant (p < 0.0001). Fluorescence SLN mapping with ICG achieved a significantly higher detection rate and bilateral mapping rate than the standard radiocolloid and BD technique in women with early stage cervical cancer. Nodal staging with an intracervical injection of ICG is accurate, safe, and reproducible in patients with cervical cancer. Before lymphadenectomy is replaced completely, the additional value of fluorescence SLN mapping for both perioperative morbidity and survival should be explored and confirmed by ongoing controlled trials.
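The reported difference in bilateral mapping rates can be checked with a standard two-proportion z-test. The counts below are reconstructed from the abstract's percentages (an assumption: 67/68 for ICG, 58/76 for Tc-99(m) with BD), and this is a sketch, not the study's actual statistical analysis:

```python
import math

def two_proportion_ztest(k1, n1, k2, n2):
    """Two-sided two-proportion z-test with a pooled variance estimate.

    Sketch for sanity-checking the reported difference in bilateral
    mapping rates; the input counts are reconstructed, not published."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# ICG: 67/68 bilateral (98.5%); Tc-99(m) + blue dye: 58/76 (76.3%)
z, p = two_proportion_ztest(67, 68, 58, 76)
```

With these reconstructed counts the test lands in the p < 0.0001 range, consistent with the significance level quoted in the abstract.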

  20. Earthquake scenario and probabilistic ground-shaking hazard maps for the Albuquerque-Belen-Santa Fe, New Mexico, corridor

    USGS Publications Warehouse

    Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.

    2004-01-01

    These maps are not intended to be a substitute for site-specific studies for engineering design nor to replace standard maps commonly referenced in building codes. Rather, we hope that these maps will be used as a guide by government agencies; the engineering, urban planning, emergency preparedness, and response communities; and the general public as part of an overall program to reduce earthquake risk and losses in New Mexico.

  1. Career Mapping for Professional Development and Succession Planning.

    PubMed

    Webb, Tammy; Diamond-Wells, Tammy; Jeffs, Debra

    Career mapping facilitates professional development of nurses by education specialists and nurse managers. On the basis of national Nursing Professional Development Scope and Standards, our education and professional development framework supports the organization's professional practice model and provides a foundation for the professional career map. This article describes development, implementation, and evaluation of the professional career map for nurses at a large children's hospital to support achievement of the nursing strategic goals for succession planning and professional development.

  2. Rapid quantitative chemical mapping of surfaces with sub-2 nm resolution

    NASA Astrophysics Data System (ADS)

    Lai, Chia-Yun; Perri, Saverio; Santos, Sergio; Garcia, Ricardo; Chiesa, Matteo

    2016-05-01

We present a theory that exploits four observables in bimodal atomic force microscopy to produce maps of the Hamaker constant H. The quantitative H maps may be employed by the broader community to directly interpret the high resolution of standard bimodal AFM images as chemical maps while simultaneously quantifying chemistry in the non-contact regime. We further provide a simple methodology to optimize a range of operational parameters for which H is in the closest agreement with the Lifshitz theory in order to (1) simplify data acquisition and (2) generalize the methodology to any set of cantilever-sample systems. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr00496b

  3. Profiling structured product labeling with NDF-RT and RxNorm

    PubMed Central

    2012-01-01

Background Structured Product Labeling (SPL) is a document markup standard approved by Health Level Seven (HL7) and adopted by the United States Food and Drug Administration (FDA) as a mechanism for exchanging drug product information. The SPL drug labels contain rich information about FDA-approved clinical drugs. However, the lack of linkage to standard drug ontologies hinders their meaningful use. NDF-RT (National Drug File Reference Terminology) and NLM RxNorm were used as standard drug ontologies to standardize and profile the product labels. Methods In this paper, we present a framework that maps SPL drug labels to existing drug ontologies: NDF-RT and RxNorm. We also applied existing categorical annotations from the drug ontologies to classify SPL drug labels into corresponding classes. We established the classification and relevant linkage for SPL drug labels using the following three approaches. First, we retrieved NDF-RT categorical information from the External Pharmacologic Class (EPC) indexing SPLs. Second, we used the RxNorm and NDF-RT mappings to classify and link SPLs with NDF-RT categories. Third, we profiled SPLs using RxNorm term type information. In the implementation, we employed a Semantic Web technology framework, in which we stored the data sets from NDF-RT and SPLs in an RDF triple store and executed SPARQL queries to retrieve data from customized SPARQL endpoints. Meanwhile, we imported RxNorm data into a MySQL relational database. Results In total, 96.0% of SPL drug labels were mapped to NDF-RT categories, whereas 97.0% were linked to RxNorm codes. We found that the majority of SPL drug labels map to chemical ingredient concepts in both drug ontologies, whereas a relatively small portion map to clinical drug concepts. 
Conclusions The profiling outcomes produced by this study would provide useful insights on meaningful use of FDA SPL drug labels in clinical applications through standard drug ontologies such as NDF-RT and RxNorm. PMID:23256517
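The core of the three-step linkage described above (SPL label to RxNorm code to NDF-RT class) amounts to a pair of lookup joins. Everything in the sketch below (identifiers, class names, the coverage figure) is illustrative, not real FDA or NLM data:

```python
# Hypothetical mapping tables; real pipelines would populate these from
# an RDF triple store (SPARQL) and the RxNorm relational tables.
spl_to_rxnorm = {           # SPL label set-id -> RxNorm code (invented)
    "spl-001": "rx-1049630",
    "spl-002": "rx-197361",
    "spl-003": None,        # unmapped label
}
rxnorm_to_ndfrt = {         # RxNorm code -> NDF-RT pharmacologic class (invented)
    "rx-1049630": "Opioid Analgesic [EPC]",
    "rx-197361": "Calcium Channel Blocker [EPC]",
}

def classify_spl(spl_ids):
    """Return {SPL id: NDF-RT class or None}, mirroring the second
    approach in the abstract (classification via RxNorm-to-NDF-RT maps)."""
    out = {}
    for spl in spl_ids:
        rx = spl_to_rxnorm.get(spl)
        out[spl] = rxnorm_to_ndfrt.get(rx) if rx else None
    return out

mapped = classify_spl(spl_to_rxnorm)
coverage = sum(v is not None for v in mapped.values()) / len(mapped)
```

The coverage percentages reported in the Results (96.0% and 97.0%) are exactly this kind of ratio, computed over the full SPL corpus.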

  4. Profiling structured product labeling with NDF-RT and RxNorm.

    PubMed

    Zhu, Qian; Jiang, Guoqian; Chute, Christopher G

    2012-12-20

Structured Product Labeling (SPL) is a document markup standard approved by Health Level Seven (HL7) and adopted by the United States Food and Drug Administration (FDA) as a mechanism for exchanging drug product information. The SPL drug labels contain rich information about FDA-approved clinical drugs. However, the lack of linkage to standard drug ontologies hinders their meaningful use. NDF-RT (National Drug File Reference Terminology) and NLM RxNorm were used as standard drug ontologies to standardize and profile the product labels. In this paper, we present a framework that maps SPL drug labels to existing drug ontologies: NDF-RT and RxNorm. We also applied existing categorical annotations from the drug ontologies to classify SPL drug labels into corresponding classes. We established the classification and relevant linkage for SPL drug labels using the following three approaches. First, we retrieved NDF-RT categorical information from the External Pharmacologic Class (EPC) indexing SPLs. Second, we used the RxNorm and NDF-RT mappings to classify and link SPLs with NDF-RT categories. Third, we profiled SPLs using RxNorm term type information. In the implementation, we employed a Semantic Web technology framework, in which we stored the data sets from NDF-RT and SPLs in an RDF triple store and executed SPARQL queries to retrieve data from customized SPARQL endpoints. Meanwhile, we imported RxNorm data into a MySQL relational database. In total, 96.0% of SPL drug labels were mapped to NDF-RT categories, whereas 97.0% were linked to RxNorm codes. We found that the majority of SPL drug labels map to chemical ingredient concepts in both drug ontologies, whereas a relatively small portion map to clinical drug concepts. 
The profiling outcomes produced by this study would provide useful insights on meaningful use of FDA SPL drug labels in clinical applications through standard drug ontologies such as NDF-RT and RxNorm.

  5. Crowdsourced Mapping - Letting Amateurs Into the Temple?

    NASA Astrophysics Data System (ADS)

    McCullagh, M.; Jackson, M.

    2013-05-01

    The rise of crowdsourced mapping data is well documented, and attempts to integrate such information within existing or potential NSDIs [National Spatial Data Infrastructures] are increasingly being examined. The results of these experiments, however, have been mixed and have left many researchers uncertain about the benefits of integration and about solutions to the problems of using such combined and potentially synergistic mapping tools. This paper reviews the development of the crowdsourced mapping movement and discusses the applications that have been developed and some of the successes achieved thus far. It also describes the problems of integration and ways of estimating success, based partly on a number of ongoing studies at the University of Nottingham that look at different aspects of the integration problem: iterative improvement of crowdsourced data quality, comparison between crowdsourced data and prior knowledge and models, development of trust in such data, and the alignment of variant ontologies. Questions of quality arise, particularly when crowdsourced data are combined with pre-existing NSDI data. The latter is usually stable, meets international standards, and often provides national coverage for use at a variety of scales. The former is often partial, without defined quality standards, patchy in coverage, but frequently addresses themes very important to some grass-roots group and often to society as a whole. Such a group might be of regional, national, or international importance, need a mapping facility to express its views, and therefore should combine with local NSDI initiatives to provide valid mapping. Will both groups use ISO (International Organisation for Standardisation) and OGC (Open Geospatial Consortium) standards? Or might some extension or relaxation be required to accommodate the mostly less rigorous crowdsourced data? So, can crowdsourced data ever be safely and successfully merged into an NSDI? 
Should it be simply a separate mapping layer? Is full integration possible providing quality standards are fully met, and methods of defining levels of quality agreed? Frequently crowdsourced data sets are anarchic in composition, and based on new and sometimes unproved technologies. Can an NSDI exhibit the necessary flexibility and speed to deal with such rapid technological and societal change?

  6. Measurement of sitting balance using the Manchester Active Position Seat (MAPS): a feasibility study.

    PubMed

    Powell, E S; Pyburn, R E; Hill, E; Smith, K S; Ribbands, M S; Mickelborough, J; Pomeroy, V M

    2002-09-01

    Evaluation of the effectiveness of therapy to improve sitting balance has been hampered by the limited number of sensitive objective clinical measures. We developed the Manchester Active Position Seat (MAPS) to provide a portable system to track change in the position of centre of force over time. (1) To investigate whether there is correspondence between the measurement of position change by a forceplate and by MAPS. (2) To explore whether and how MAPS measures changes in position when seated healthy adults change posture. A feasibility study. (1) An adult subject sat on MAPS placed on top of a forceplate. The x and y coordinates of the centre of pressure recorded from the forceplate and centre of force from MAPS during movement were compared graphically. (2) Four adults sat on MAPS using a standardized starting position and moving into six sets of six standardized target postures in a predetermined randomized order. The absolute shift in centre of force from the starting position was calculated. (1) The pattern of change of position over time was similar for the forceplate and for MAPS although there was a measurement difference, which increased with distance from the centre. (2) The direction of change of position corresponded to the direction of movement to the target postures but the amount of change varied between subjects. MAPS shows promise as an objective clinical measure of sitting balance, but peripheral accuracy of measurement needs to be improved.
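The "absolute shift in centre of force from the starting position" computed in the MAPS trials is a Euclidean distance in the seat's x/y plane. A minimal sketch with made-up coordinates:

```python
import math

def absolute_shift(start, positions):
    """Absolute shift of the centre of force from the starting position
    for each sampled (x, y) point.

    Sketch of the calculation named in the abstract; coordinates and
    units here are illustrative, not MAPS sensor output."""
    x0, y0 = start
    return [math.hypot(x - x0, y - y0) for x, y in positions]

# Illustrative trace: subject at rest, then leaning progressively to one side.
shifts = absolute_shift((0.0, 0.0), [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)])
```

Tracking this scalar over time gives the kind of per-posture displacement curve the feasibility study compared against the forceplate's centre-of-pressure record.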

  7. Mapping Applications Center, National Mapping Division, U.S. Geological Survey

    USGS Publications Warehouse


    1996-01-01

The Mapping Applications Center (MAC), National Mapping Division (NMD), is the eastern regional center for coordinating the production, distribution, and sale of maps and digital products of the U.S. Geological Survey (USGS). It is located in the John Wesley Powell Federal Building in Reston, Va. The MAC's major functions are to (1) establish and manage cooperative mapping programs with State and Federal agencies; (2) perform new research in preparing and applying geospatial information; (3) prepare digital cartographic data, special purpose maps, and standard maps from traditional and classified source materials; (4) maintain the domestic names program of the United States; (5) manage the National Aerial Photography Program (NAPP); (6) coordinate the NMD's publications and outreach programs; and (7) direct the USGS map-printing operations.

  8. Remote-sensing applications as utilized in Florida's coastal zone management program

    NASA Technical Reports Server (NTRS)

    Worley, D. R.

    1975-01-01

    Land use maps were developed from photomaps obtained by remote sensing in order to develop a comprehensive state plan for the protection, development, and zoning of coastal regions. Only photographic remote sensors have been used in support of the coastal council's planning/management methodology. Standard photointerpretation and cartographic application procedures for map compilation were used in preparing base maps.

  9. Mapping Patterns of Multiple Deprivation and Well-Being Using Self-Organizing Maps: An Application to Swiss Household Panel Data

    ERIC Educational Resources Information Center

    Lucchini, Mario; Assi, Jenny

    2013-01-01

The aim of this paper is to propose multidimensional measures of deprivation and wellbeing in contemporary Switzerland, in order to overcome the limitations of standard approaches. More precisely, we have developed self-organising maps (SOM) using data drawn from the 2009 Swiss Household Panel wave, in order to identify highly homogeneous clusters…

  10. A Body of Work Standard-Setting Method with Construct Maps

    ERIC Educational Resources Information Center

    Wyse, Adam E.; Bunch, Michael B.; Deville, Craig; Viger, Steven G.

    2014-01-01

This article describes a novel variation of the Body of Work method that uses construct maps to overcome problems of transparency, rater inconsistency, and score gaps commonly occurring with the Body of Work method. The Body of Work method with construct maps was implemented to set cut-scores for two separate K-12 assessment programs in a large…

  11. A New Map of Standardized Terrestrial Ecosystems of the Conterminous United States

    USGS Publications Warehouse

    Sayre, Roger G.; Comer, Patrick; Warner, Harumi; Cress, Jill

    2009-01-01

    A new map of standardized, mesoscale (tens to thousands of hectares) terrestrial ecosystems for the conterminous United States was developed by using a biophysical stratification approach. The ecosystems delineated in this top-down, deductive modeling effort are described in NatureServe's classification of terrestrial ecological systems of the United States. The ecosystems were mapped as physically distinct areas and were associated with known distributions of vegetation assemblages by using a standardized methodology first developed for South America. This approach follows the geoecosystems concept of R.J. Huggett and the ecosystem geography approach of R.G. Bailey. Unique physical environments were delineated through a geospatial combination of national data layers for biogeography, bioclimate, surficial materials lithology, land surface forms, and topographic moisture potential. Combining these layers resulted in a comprehensive biophysical stratification of the conterminous United States, which produced 13,482 unique biophysical areas. These were considered as fundamental units of ecosystem structure and were aggregated into 419 potential terrestrial ecosystems. The ecosystems classification effort preceded the mapping effort and involved the independent development of diagnostic criteria, descriptions, and nomenclature for describing expert-derived ecological systems. The aggregation and labeling of the mapped ecosystem structure units into the ecological systems classification was accomplished in an iterative, expert-knowledge-based process using automated rulesets for identifying ecosystems on the basis of their biophysical and biogeographic attributes. 
The mapped ecosystems, at a 30-meter base resolution, represent an improvement in spatial and thematic (class) resolution over existing ecoregionalizations and are useful for a variety of applications, including ecosystem services assessments, climate change impact studies, biodiversity conservation, and resource management.
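The "geospatial combination of national data layers" that produced the 13,482 unique biophysical areas is, at its core, an overlay that assigns one ID to each distinct tuple of layer codes. A toy sketch with 3x3 grids (the real inputs are national rasters at 30-meter resolution):

```python
import numpy as np

def combine_layers(*layers):
    """Overlay coded raster layers into unique-combination IDs.

    Toy sketch of the biophysical stratification step: each distinct
    tuple of layer codes (bioclimate, landform, ...) becomes one
    biophysical unit ID. Input grids below are invented."""
    stacked = np.stack(layers, axis=-1)
    flat = stacked.reshape(-1, stacked.shape[-1])
    _, ids = np.unique(flat, axis=0, return_inverse=True)
    return ids.reshape(layers[0].shape)

# Two illustrative coded layers (values stand in for class codes).
bioclimate = np.array([[1, 1, 2], [1, 2, 2], [3, 3, 2]])
landform   = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 1]])
units = combine_layers(bioclimate, landform)
n_units = units.max() + 1
```

In the national analysis the same operation ran over five layers (biogeography, bioclimate, surficial lithology, land surface forms, topographic moisture potential), and the resulting unique units were then aggregated into the 419 potential terrestrial ecosystems.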

  12. Developing Process Maps as a Tool for a Surgical Infection Prevention Quality Improvement Initiative in Resource-Constrained Settings.

    PubMed

    Forrester, Jared A; Koritsanszky, Luca A; Amenu, Demisew; Haynes, Alex B; Berry, William R; Alemu, Seifu; Jiru, Fekadu; Weiser, Thomas G

    2018-06-01

Surgical infections cause substantial morbidity and mortality in low- and middle-income countries (LMICs). To improve adherence to critical perioperative infection prevention standards, we developed Clean Cut, a checklist-based quality improvement program to improve compliance with best practices. We hypothesized that process mapping infection prevention activities can help clinicians identify strategies for improving surgical safety. We introduced Clean Cut at a tertiary hospital in Ethiopia. Infection prevention standards included skin antisepsis, ensuring a sterile field, instrument decontamination/sterilization, prophylactic antibiotic administration, routine swab/gauze counting, and use of a surgical safety checklist. Processes were mapped by a visiting surgical fellow and local operating theater staff to facilitate the development of contextually relevant solutions; processes were reassessed for improvements. Process mapping helped identify barriers to using alcohol-based hand solution due to skin irritation, inconsistent administration of prophylactic antibiotics due to variable delivery outside of the operating theater, inefficiencies in assuring sterility of surgical instruments through lack of confirmatory measures, and occurrences of retained surgical items through inappropriate guidelines, staffing, and training in proper routine gauze counting. Compliance with most processes improved significantly following organizational changes to align tasks with specific process goals. Enumerating the steps involved in surgical infection prevention using a process mapping technique helped identify opportunities for improving adherence and plotting contextually relevant solutions, resulting in superior compliance with antiseptic standards. Simplifying these process maps into an adaptable tool could be a powerful strategy for improving safe surgery delivery in LMICs. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  13. A new map of standardized terrestrial ecosystems of Africa

    USGS Publications Warehouse

    Sayre, Roger G.; Comer, Patrick; Hak, Jon; Josse, Carmen; Bow, Jacquie; Warner, Harumi; Larwanou, Mahamane; Kelbessa, Ensermu; Bekele, Tamrat; Kehl, Harald; Amena, Ruba; Andriamasimanana, Rado; Ba, Taibou; Benson, Laurence; Boucher, Timothy; Brown, Matthew; Cress, Jill J.; Dassering, Oueddo; Friesen, Beverly A.; Gachathi, Francis; Houcine, Sebei; Keita, Mahamadou; Khamala, Erick; Marangu, Dan; Mokua, Fredrick; Morou, Boube; Mucina, Ladislav; Mugisha, Samuel; Mwavu, Edward; Rutherford, Michael; Sanou, Patrice; Syampungani, Stephen; Tomor, Bojoi; Vall, Abdallahi Ould Mohamed; Vande Weghe, Jean Pierre; Wangui, Eunice; Waruingi, Lucy

    2013-01-01

    Terrestrial ecosystems and vegetation of Africa were classified and mapped as part of a larger effort and global protocol (GEOSS – the Global Earth Observation System of Systems), which includes an activity to map terrestrial ecosystems of the earth in a standardized, robust, and practical manner, and at the finest possible spatial resolution. To model the potential distribution of ecosystems, new continental datasets for several key physical environment datalayers (including coastline, landforms, surficial lithology, and bioclimates) were developed at spatial and classification resolutions finer than existing similar datalayers. A hierarchical vegetation classification was developed by African ecosystem scientists and vegetation geographers, who also provided sample locations of the newly classified vegetation units. The vegetation types and ecosystems were then mapped across the continent using a classification and regression tree (CART) inductive model, which predicted the potential distribution of vegetation types from a suite of biophysical environmental attributes including bioclimate region, biogeographic region, surficial lithology, landform, elevation and land cover. Multi-scale ecosystems were classified and mapped in an increasingly detailed hierarchical framework using vegetation-based concepts of class, subclass, formation, division, and macrogroup levels. The finest vegetation units (macrogroups) classified and mapped in this effort are defined using diagnostic plant species and diagnostic growth forms that reflect biogeographic differences in composition and sub-continental to regional differences in mesoclimate, geology, substrates, hydrology, and disturbance regimes (FGDC, 2008). The macrogroups are regarded as meso-scale (100s to 10,000s of hectares) ecosystems. A total of 126 macrogroup types were mapped, each with multiple, repeating occurrences on the landscape. The modeling effort was implemented at a base spatial resolution of 90 m. 
In addition to creating several rich, new continent-wide biophysical datalayers describing African vegetation and ecosystems, our intention was to explore feasible approaches to rapidly moving this type of standardized, continent-wide, ecosystem classification and mapping effort forward.

  14. Standardized unfold mapping: a technique to permit left atrial regional data display and analysis.

    PubMed

    Williams, Steven E; Tobon-Gomez, Catalina; Zuluaga, Maria A; Chubb, Henry; Butakoff, Constantine; Karim, Rashed; Ahmed, Elena; Camara, Oscar; Rhode, Kawal S

    2017-10-01

    Left atrial arrhythmia substrate assessment can involve multiple imaging and electrical modalities, but visual analysis of data on 3D surfaces is time-consuming and suffers from limited reproducibility. Unfold maps (e.g., the left ventricular bull's eye plot) allow 2D visualization, facilitate multimodal data representation, and provide a common reference space for inter-subject comparison. The aim of this work is to develop a method for automatic representation of multimodal information on a left atrial standardized unfold map (LA-SUM). The LA-SUM technique was developed and validated using 18 electroanatomic mapping (EAM) LA geometries before being applied to ten cardiac magnetic resonance/EAM paired geometries. The LA-SUM was defined as an unfold template of an average LA mesh, and registration of clinical data to this mesh facilitated creation of new LA-SUMs by surface parameterization. The LA-SUM represents 24 LA regions on a flattened surface. Intra-observer variability of LA-SUMs for both EAM and CMR datasets was minimal; root-mean square difference of 0.008 ± 0.010 and 0.007 ± 0.005 ms (local activation time maps), 0.068 ± 0.063 gs (force-time integral maps), and 0.031 ± 0.026 (CMR LGE signal intensity maps). Following validation, LA-SUMs were used for automatic quantification of post-ablation scar formation using CMR imaging, demonstrating a weak but significant relationship between ablation force-time integral and scar coverage (R 2  = 0.18, P < 0.0001). The proposed LA-SUM displays an integrated unfold map for multimodal information. The method is applicable to any LA surface, including those derived from imaging and EAM systems. The LA-SUM would facilitate standardization of future research studies involving segmental analysis of the LA.

  15. GIS-based realization of international standards for digital geological mapping - developments in planetary mapping

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan; Jaumann, Ralf

    2010-05-01

    The Helmholtz Alliance and the European Planetary Network are research communities with different main topics. One of the main research topics which are shared by these communities is the question about the geomorphological evolutions of planetary surfaces as well as the geological context of life. This research contains questions like "Is there volcanic activity on a planet?" or "Where are possible landing sites?". In order to help answering such questions, analyses of surface features and morphometric measurements need to be performed. This ultimately leads to the generation of thematic maps (e.g. geological and geomorphologic maps) as a basis for the further studies. By using modern GIS techniques the comparative work and generalisation during mapping processes results in new information. These insights are crucial for subsequent investigations. Therefore, the aim is to make these results available to the research community as a secondary data basis. In order to obtain a common and interoperable data collection results of different mapping projects have to follow a standardised data-infrastructure, metadata definition and map layout. Therefore, we are currently focussing on the generation of a database model arranging all data and processes in a uniform mapping schema. With the help of such a schema, the mapper will be able to utilise a predefined (but customisable) GIS environment with individual tool items as well as a standardised symbolisation and a metadata environment. This environment is based on a data model which is currently on a conceptual level and provides the layout of the data infrastructure including relations and topologies. One of the first tasks towards this data model is the definition of a consistent basis of symbolisation standards developed for planetary mapping. The mapper/geologist will be able to access the pre-built signatures and utilise these in scale dependence within the mapping project. 
The symbolisation will be related to the data model in the next step. As second task, we designed a concept for description of the digital mapping result. Therefore, we are creating a metadata template based on existing standards for individual needs in planetary sciences. This template is subdivided in (meta) data about the general map content (e.g. on which data the mapping result based on) and in metadata for each individual mapping element/layer comprising information like minimum mapping scale, interpretation hints, etc. The assignment of such a metadata description in combination with the usage of a predefined mapping schema facilitates the efficient and traceable storage of data information on a network server and enables a subsequent representation, e.g. as a mapserver data structure. Acknowledgement: This work is partly supported by DLR and the Helmholtz Alliance "Planetary Evolution and Life".

  16. Geographic Information System Software to Remodel Population Data Using Dasymetric Mapping Methods

    USGS Publications Warehouse

    Sleeter, Rachel; Gould, Michael

    2007-01-01

    The U.S. Census Bureau provides decadal demographic data collected at the household level and aggregated to larger enumeration units for anonymity purposes. Although this system is appropriate for the dissemination of large amounts of national demographic data, often the boundaries of the enumeration units do not reflect the distribution of the underlying statistical phenomena. Conventional mapping methods such as choropleth mapping, are primarily employed due to their ease of use. However, the analytical drawbacks of choropleth methods are well known ranging from (1) the artificial transition of population at the boundaries of mapping units to (2) the assumption that the phenomena is evenly distributed across the enumeration unit (when in actuality there can be significant variation). Many methods to map population distribution have been practiced in geographic information systems (GIS) and remote sensing fields. Many cartographers prefer dasymetric mapping to map population because of its ability to more accurately distribute data over geographic space. Similar to ?choropleth maps?, a dasymetric map utilizes standardized data (for example, census data). However, rather than using arbitrary enumeration zones to symbolize population distribution, a dasymetric approach introduces ancillary information to redistribute the standardized data into zones relative to land use and land cover (LULC), taking into consideration actual changing densities within the boundaries of the enumeration unit. Thus, new zones are created that correlate to the function of the map, capturing spatial variations in population density. The transfer of data from census enumeration units to ancillary-driven homogenous zones is performed by a process called areal interpolation.

  17. Intravenous lipid emulsion alters the hemodynamic response to epinephrine in a rat model.

    PubMed

    Carreiro, Stephanie; Blum, Jared; Jay, Gregory; Hack, Jason B

    2013-09-01

    Intravenous lipid emulsion (ILE) is an adjunctive antidote used in selected critically ill poisoned patients. These patients may also require administration of advanced cardiac life support (ACLS) drugs. Limited data is available to describe interactions of ILE with standard ACLS drugs, specifically epinephrine. Twenty rats with intra-arterial and intravenous access were sedated with isoflurane and split into ILE or normal saline (NS) pretreatment groups. All received epinephrine 15 μm/kg intravenously (IV). Continuous mean arterial pressure (MAP) and heart rate (HR) were monitored until both indices returned to baseline. Standardized t tests were used to compare peak MAP, time to peak MAP, maximum change in HR, time to maximum change in HR, and time to return to baseline MAP/HR. There was a significant difference (p = 0.023) in time to peak MAP in the ILE group (54 s, 95 % CI 44-64) versus the NS group (40 s, 95 % CI 32-48) and a significant difference (p = 0.004) in time to return to baseline MAP in ILE group (171 s, 95 % CI 148-194) versus NS group (130 s, 95 % CI 113-147). There were no significant differences in the peak change in MAP, peak change in HR, time to minimum HR, or time to return to baseline HR between groups. ILE-pretreated rats had a significant difference in MAP response to epinephrine; ILE delayed the peak effect and prolonged the duration of effect of epinephrine on MAP, but did not alter the peak increase in MAP or the HR response.

  18. Personalized-detailed clinical model for data interoperability among clinical standards.

    PubMed

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung

    2013-08-01

    Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.

  19. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    PubMed Central

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir

    2013-01-01

    Abstract Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. Materials and Methods: We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. Results: For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. Conclusions: The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems. 
PMID:23875730

  20. Mapping global health research investments, time for new thinking--a Babel Fish for research data.

    PubMed

    Terry, Robert F; Allen, Liz; Gardner, Charles A; Guzman, Javier; Moran, Mary; Viergever, Roderik F

    2012-09-01

    Today we have an incomplete picture of how much the world is spending on health and disease-related research and development (R&D). As such it is difficult to align, or even begin to coordinate, health R&D investments with international public health priorities. Current efforts to track and map global health research investments are complex, resource-intensive, and caveat-laden. An ideal situation would be for all research funding to be classified using a set of common standards and definitions. However, the adoption of such a standard by everyone is not a realistic, pragmatic or even necessary goal. It is time for new thinking informed by the innovations in automated online translation - e.g. Yahoo's Babel Fish. We propose a feasibility study to develop a system that can translate and map the diverse research classification systems into a common standard, allowing the targeting of scarce research investments to where they are needed most.

  1. Mapping global health research investments, time for new thinking - A Babel Fish for research data

    PubMed Central

    2012-01-01

    Today we have an incomplete picture of how much the world is spending on health and disease-related research and development (R&D). As such it is difficult to align, or even begin to coordinate, health R&D investments with international public health priorities. Current efforts to track and map global health research investments are complex, resource-intensive, and caveat-laden. An ideal situation would be for all research funding to be classified using a set of common standards and definitions. However, the adoption of such a standard by everyone is not a realistic, pragmatic or even necessary goal. It is time for new thinking informed by the innovations in automated online translation - e.g. Yahoo's Babel Fish. We propose a feasibility study to develop a system that can translate and map the diverse research classification systems into a common standard, allowing the targeting of scarce research investments to where they are needed most. PMID:22938160

  2. Geometric accuracy of Landsat-4 and Landsat-5 Thematic Mapper images.

    USGS Publications Warehouse

    Borgeson, W.T.; Batson, R.M.; Kieffer, H.H.

    1985-01-01

    The geometric accuracy of the Landsat Thematic Mappers was assessed by a linear least-square comparison of the positions of conspicuous ground features in digital images with their geographic locations as determined from 1:24 000-scale maps. For a Landsat-5 image, the single-dimension standard deviations of the standard digital product, and of this image with additional linear corrections, are 11.2 and 10.3 m, respectively (0.4 pixel). An F-test showed that skew and affine distortion corrections are not significant. At this level of accuracy, the granularity of the digital image and the probable inaccuracy of the 1:24 000 maps began to affect the precision of the comparison. The tested image, even with a moderate accuracy loss in the digital-to-graphic conversion, meets National Horizontal Map Accuracy standards for scales of 1:100 000 and smaller. Two Landsat-4 images, obtained with the Multispectral Scanner on and off, and processed by an interim software system, contain significant skew and affine distortions. -Authors

  3. Optimization of Brain T2 Mapping Using Standard CPMG Sequence In A Clinical Scanner

    NASA Astrophysics Data System (ADS)

    Hnilicová, P.; Bittšanský, M.; Dobrota, D.

    2014-04-01

    In magnetic resonance imaging, transverse relaxation time (T2) mapping is a useful quantitative tool enabling enhanced diagnostics of many brain pathologies. The aim of our study was to test the influence of different sequence parameters on calculated T2 values, including multi-slice measurements, slice position, interslice gap, echo spacing, and pulse duration. Measurements were performed using standard multi-slice multi-echo CPMG imaging sequence on a 1.5 Tesla routine whole body MR scanner. We used multiple phantoms with different agarose concentrations (0 % to 4 %) and verified the results on a healthy volunteer. It appeared that neither the pulse duration, the size of interslice gap nor the slice shift had any impact on the T2. The measurement accuracy was increased with shorter echo spacing. Standard multi-slice multi-echo CPMG protocol with the shortest echo spacing, also the smallest available interslice gap (100 % of slice thickness) and shorter pulse duration was found to be optimal and reliable for calculating T2 maps in the human brain.

  4. The United States Board on Geographic Names: Standardization or regulation?

    USGS Publications Warehouse

    Payne, R.L.

    2000-01-01

    The United States Board on Geographic Names was created in 1890 to standardize the use of geographic names on federal maps and documents, and was established in its present form in 1947 by public law. The Board is responsible for geographic name usage and application throughout the federal government and its members must approve a name change or new name before it can be applied to federal maps and publications. To accomplish its mission, the Board has developed principles, policies, and procedures for use in the standardization process. The Board is also responsible legally for the promulgation of standardized names, whether or not these names have ever been controversial, and today this is accomplished by the universal availability of electronic databases for domestic and foreign names. This paper examines the development of Board policies and the implementation of these policies to achieve standardization with a view to relating these policies and activities to questions of standardization or regulation. ?? 2000 by The American Name Society.

  5. Developing an Application to Increase the Accessibility of Planetary Geologic Maps

    NASA Astrophysics Data System (ADS)

    Jacobsen, R. E.; Fay, C.

    2018-06-01

    USGS planetary geologic maps are widely used digital products with text, raster, vector, and temporal data, within a highly standardized design. This tool will augment the user experience by improving accessibility among the various forms of data.

  6. Laser mobile mapping standards and applications in transportation.

    DOT National Transportation Integrated Search

    2015-11-01

    This report describes the work that was done to support the development of a chapter for the INDOT Survey Manual on Mobile : Mapping. The work includes experiments that were done, data that was collected, analysis that was carried out, and conclusion...

  7. Interagency Report: Astrogeology 58, television cartography

    USGS Publications Warehouse

    Batson, Raymond M.

    1973-01-01

    The purpose of this paper is to illustrate the processing of digital television pictures into base maps. In this context, a base map is defined as a pictorial representation of planetary surface morphology accurately reproduced on standard map projections. Topographic contour lines, albedo or geologic overprints may be super imposed on these base maps. The compilation of geodetic map controls, the techniques of mosaic compilation, computer processing and airbrush enhancement, and the compilation of con tour lines are discussed elsewhere by the originators of these techniques. A bibliography of applicable literature is included for readers interested in more detailed discussions.

  8. A Two-Layers Based Approach of an Enhanced-Map for Urban Positioning Support

    PubMed Central

    Piñana-Díaz, Carolina; Toledo-Moreo, Rafael; Toledo-Moreo, F. Javier; Skarmeta, Antonio

    2012-01-01

    This paper presents a two-layer based enhanced map that can support navigation in urban environments. One layer is dedicated to describe the drivable road with a special focus on the accurate description of its bounds. This feature can support positioning and advanced map-matching when compared with standard polyline-based maps. The other layer depicts building heights and locations, thus enabling the detection of non-line-of-sight signals coming from GPS satellites not in direct view. Both the concept and the methodology for creating these enhanced maps are shown in the paper. PMID:23202172

  9. Sampling intensity and normalizations: Exploring cost-driving factors in nationwide mapping of tree canopy cover

    Treesearch

    John Tipton; Gretchen Moisen; Paul Patterson; Thomas A. Jackson; John Coulston

    2012-01-01

    There are many factors that will determine the final cost of modeling and mapping tree canopy cover nationwide. For example, applying a normalization process to Landsat data used in the models is important in standardizing reflectance values among scenes and eliminating visual seams in the final map product. However, normalization at the national scale is expensive and...

  10. Countries of the World and International Organizations: Sources of Information

    DTIC Science & Technology

    2007-01-08

    provides business addresses worldwide by geographical areas and by standard industrial classification (SIC) codes. Maps Map Catalog (New York, Tilden...Francisco. These offices are often excellent sources of free printed matter — brochures, maps, posters, etc. A directory of tourism offices worldwide is...Peterson’s Guides, Inc.). Published irregularly, these sources offer information by field of work (agriculture, business and industry , teaching English

  11. Prevalence of Extracochlear Electrodes: Computerized Tomography Scans, Cochlear Implant Maps, and Operative Reports.

    PubMed

    Holder, Jourdan T; Kessler, David M; Noble, Jack H; Gifford, René H; Labadie, Robert F

    2018-06-01

    To quantify and compare the number of cochlear implant (CI) electrodes found to be extracochlear on postoperative computerized tomography (CT) scans, the number of basal electrodes deactivated during standard CI mapping (without knowledge of the postoperative CT scan), and the extent of electrode insertion noted by the surgeon. Retrospective. Academic Medical Center. Two hundred sixty-two patients underwent standard cochlear implantation and postoperative temporal bone CT scanning. Scans were analyzed to determine the number of extracochlear electrodes. Standard CI programming had been completed without knowledge of the extracochlear electrodes identified on the CT. These standard CI maps were reviewed to record the number of deactivated basal electrodes. Lastly, each operative report was reviewed to record the extent of reported electrode insertion. 13.4% (n = 35) of CIs were found to have at least one electrode outside of the cochlea on the CT scan. Review of CI mapping indicated that audiologists had deactivated extracochlear electrodes in 60% (21) of these cases. Review of operative reports revealed that surgeons correctly indicated the number of extracochlear electrodes in 6% (2) of these cases. Extracochlear electrodes were correctly identified audiologically in 60% of cases and in surgical reports in 6% of cases; however, it is possible that at least a portion of these cases involved postoperative electrode migration. Given these findings, postoperative CT scans can provide information regarding basal electrode location, which could help improve programming accuracy, associated frequency allocation, and audibility with appropriate deactivation of extracochlear electrodes.

  12. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

    Abstract Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and application of computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share our experience gained while creating the Atlas of Cancer Signalling Network. In the step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology using Google Maps API to explore comprehensive molecular maps similar to geographical maps and explain the advantages of semantic zooming principles for map navigation. We also provide the outline to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383

  13. The View from the Top of the Mountain: Building a Community of Practice with the GridWise Transactive Energy Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forfia, David; Knight, Mark; Melton, Ron

    The topic of “transactive energy” has received growing attention over the past eighteen months. It has been a part, for example, of the NY Reforming the Energy Vision discussions and the topic of activities such as the National Institute of Standards Transactive Energy Challenge. The growing discussion stems from the realization that new approaches are needed to efficiently and reliably integrate growing numbers of distributed energy resources. In anticipation of the applicability of transactive energy systems to emerging challenges in Grid integration, the GridWise® Architecture Council (GWAC) began to build a community of practice in the area of transactive energymore » systems. Starting with a workshop on the topic of “transactive energy” in May 2011, the Council brought together about a dozen interested researchers and practitioners from utilities, vendors, labs and academia to compare their approaches and experience in order to create common definitions and understanding within this topical area. This was followed in March 2012 with a second workshop during which about twice as many attendees continued the discussion. At this workshop the need for both a roadmap and a document documenting the foundations of transactive energy, common vocabulary and other definitional aspects was recognized. These two workshops led to the Council organizing the First International Conference and Workshop on Transactive Energy which took place May 23 – 24, 2013 in Portland, Oregon. The Council has continued this work with additional topical workshops, the Second International Conference and Workshop on Transactive Energy held in December 2014, and is currently organizing the Third International Conference and Workshop on Transactive Energy Systems to be held in May 2016. This article provides a summary of the Council’s work to build the community of practice through creation of a Transactive Energy Framework document and related activities. 
In addition to seeing transactive energy discussions on the agenda for many conferences there are also group activities relating to transactive energy being coordinated by both NIST and SGIP with which GWAC is also involved. The NIST work aims to develop and enhance modeling and simulation tools and integration into modeling and simulation platforms for Transactive Energy evaluation, as well as demonstrate how different transactive approaches may be used to improve reliability and efficiency of the electric grid. This will be accomplished through development of a set of scenarios that can serve as ongoing reference points for modeling and simulation. It is also an example of helping to develop a Transactive Energy community. The ongoing Transactive Energy Coordination Group formed by SGIP reviews the progress and directions of transactive energy activities in related parts of the SGIP and collaborating organizations such as GWAC. One of its activities is assembling a core set of transactive energy use cases as representative of the transactive energy interface requirements. This will enable assessment of interoperability requirements for transactive energy applications and an analysis of standards coverage, gaps, and future needs.« less

  14. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Standard Flood Hazard... MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND MAPPING OF SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination...

  15. Custom map projections for regional groundwater models

    USGS Publications Warehouse

    Kuniansky, Eve L.

    2017-01-01

    For regional groundwater flow models (areas greater than 100,000 km2), improper choice of map projection parameters can result in model error for boundary conditions dependent on area (recharge or evapotranspiration simulated by applying a rate using cell area from model discretization) and on length (rivers simulated with a head-dependent flux boundary). Smaller model areas can use local map coordinates, such as State Plane (United States) or Universal Transverse Mercator (correct zone), without introducing large errors. Map projections vary in order to preserve one or more of the following properties: area, shape, distance (length), or direction. Numerous map projections have been developed for different purposes, as all four properties cannot be preserved simultaneously. Preservation of area and length is most critical for groundwater models. The Albers equal-area conic projection with custom standard parallels, selected by dividing the north-south length of the model area by six and placing the standard parallels one-sixth of that length above the southern extent and one-sixth below the northern extent, preserves both area and length for continental areas in mid-latitudes oriented east-west. Custom map projection parameters can also minimize area and length error in non-ideal projections. One must also use consistent vertical and horizontal datums for all geographic data. The generalized polygon for the Floridan aquifer system study area (306,247.59 km2) is used to provide quantitative examples of the effect of map projections on length and area with different projections and parameter choices. Use of an improper map projection is one model-construction problem that is easily avoided.
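The "one-sixth" rule for custom standard parallels described above can be sketched in a few lines of Python. The helper names, the example latitude range (24°N to 36°N, roughly the latitude span of the Floridan aquifer study area), and the choice of NAD83 and a mid-extent latitude of origin are illustrative assumptions, not taken from the paper; the `+proj=aea` parameter names are standard PROJ syntax.

```python
# Hypothetical helpers illustrating the one-sixth rule for choosing custom
# standard parallels for an Albers equal-area conic projection: place each
# parallel 1/6 of the north-south extent inside the model area's limits.

def albers_standard_parallels(lat_south: float, lat_north: float):
    """Return (sp1, sp2) located one-sixth inside the latitude extent."""
    span = lat_north - lat_south
    return lat_south + span / 6.0, lat_north - span / 6.0

def albers_proj_string(lat_south, lat_north, lon_center):
    """Build a PROJ-style definition string from the custom parallels.
    Datum and latitude of origin are illustrative choices."""
    sp1, sp2 = albers_standard_parallels(lat_south, lat_north)
    lat_0 = (lat_south + lat_north) / 2.0
    return (f"+proj=aea +lat_1={sp1:.4f} +lat_2={sp2:.4f} "
            f"+lat_0={lat_0:.4f} +lon_0={lon_center:.4f} +datum=NAD83")

# Example: a study area spanning 24°N to 36°N
print(albers_standard_parallels(24.0, 36.0))  # (26.0, 34.0)
```

A string like this could then be handed to a projection library (e.g. PROJ/pyproj) to reproject model input data with minimal area and length distortion.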

  16. Fast or slow? Compressions (or not) in number-to-line mappings.

    PubMed

    Candia, Victor; Deprez, Paola; Wernery, Jannis; Núñez, Rafael

    2015-01-01

    We investigated, in a university student population, spontaneous (non-speeded) fast and slow number-to-line mapping responses using non-symbolic (dots) and symbolic (words) stimuli. Seeking less conventionalized responses, we used anchors 0-130 rather than the standard 0-100. Slow responses to both types of stimuli produced only linear mappings, with no evidence of non-linear compression. In contrast, fast responses revealed distinct patterns of non-linear compression for dots and words. A predicted logarithmic compression was observed in fast responses to dots in the 0-130 range, but not in the reduced 0-100 range, indicating compression in proximity to the upper anchor 130, not the standard 100. Moreover, fast responses to words revealed an unexpected significant negative compression in the reduced 0-100 range, but not in the 0-130 range, indicating compression in proximity to the lower anchor 0. Results show that fast responses help reveal the fundamentally distinct nature of symbolic and non-symbolic quantity representation. Whole number words, being intrinsically mediated by cultural phenomena such as language and education, emphasize the invariance of magnitude between numbers (essential for linear mappings) and therefore, unlike non-symbolic (psychophysical) stimuli, yield spatial mappings that do not seem to be influenced by the Weber-Fechner law of psychophysics. However, high levels of education (when combined with an absence of standard upper anchors) may lead fast responses to overestimate magnitude invariance at the lower end of word numerals.
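The contrast between linear and logarithmically compressed number-to-line mappings can be made concrete with a small sketch. This is an illustration of the two response models in general, not the authors' fitting procedure; the log1p normalization is an assumed convenience so both mappings share the anchors 0 and 130.

```python
import math

# Illustrative sketch: predicted positions on a 0-130 number line under a
# linear mapping versus a Weber-Fechner-style logarithmic compression.

def linear_position(x, upper=130.0):
    """Linear mapping: position proportional to magnitude."""
    return x / upper

def log_position(x, upper=130.0):
    """Logarithmic (compressed) mapping, normalized so that
    0 maps to 0 and the upper anchor maps to 1."""
    return math.log1p(x) / math.log1p(upper)

# The compressed mapping pushes mid-range numbers far to the right of
# their linear positions: small numbers are spread out, large compressed.
print(linear_position(65))          # 0.5
print(round(log_position(65), 3))   # 0.859
```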

  17. Using Standardized Lexicons for Report Template Validation with LexMap, a Web-based Application.

    PubMed

    Hostetter, Jason; Wang, Kenneth; Siegel, Eliot; Durack, Jeremy; Morrison, James J

    2015-06-01

    An enormous amount of data exists in unstructured diagnostic and interventional radiology reports. Free text or non-standardized terminologies limit the ability to parse, extract, and analyze these report data elements. Medical lexicons and ontologies contain standardized terms for relevant concepts including disease entities, radiographic technique, and findings. The use of standardized terms offers the potential to improve reporting consistency and facilitate computer analysis. The purpose of this project was to implement an interface to aid in the creation of standards-compliant reporting templates for use in interventional radiology. Non-standardized procedure report text was analyzed and referenced to RadLex, SNOMED-CT, and LOINC. Using JavaScript, a web application was developed which determined whether exact terms or synonyms in reports existed within these three reference resources. The NCBO BioPortal Annotator web service was used to map terms, and output from this application was used to create an interactive annotated version of the original report. The application was successfully used to analyze and modify five distinct reports for the Society of Interventional Radiology's standardized reporting project.
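The core matching step (exact term or synonym lookup against a standardized lexicon) can be sketched as follows. The tiny lexicon and the `annotate` helper are invented for illustration; the actual work referenced RadLex, SNOMED-CT, and LOINC through the NCBO BioPortal Annotator web service rather than a local table.

```python
# Minimal sketch of lexicon-based term validation: check whether each
# candidate term from a report matches a preferred term or a synonym in a
# standardized lexicon. The lexicon below is a toy example.

LEXICON = {
    "embolization": {"embolisation"},        # preferred term -> synonyms
    "inferior vena cava": {"ivc"},
    "stenosis": set(),
}

def annotate(terms):
    """Map each report term to a preferred lexicon term, or None."""
    result = {}
    for term in terms:
        key = term.lower().strip()
        if key in LEXICON:
            result[term] = key                        # exact match
        else:
            hits = [pref for pref, syns in LEXICON.items() if key in syns]
            result[term] = hits[0] if hits else None  # synonym or no match
    return result

print(annotate(["IVC", "Embolization", "catheter"]))
# {'IVC': 'inferior vena cava', 'Embolization': 'embolization', 'catheter': None}
```

Terms mapping to `None` are the ones that would be flagged for review when building a standards-compliant template.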

  18. Open Standards in Practice: An OGC China Forum Initiative

    NASA Astrophysics Data System (ADS)

    Yue, Peng; Zhang, Mingda; Taylor, Trevor; Xie, Jibo; Zhang, Hongping; Tong, Xiaochong; Yu, Jinsongdi; Huang, Juntao

    2016-11-01

    Open standards such as OGC standards can be used to improve interoperability and support machine-to-machine interaction over the Web. In the Big Data era, standards-based data and processing services from various vendors can be combined to automate the extraction of information and knowledge from heterogeneous and large volumes of geospatial data. This paper introduces an ongoing OGC China Forum initiative, which will demonstrate how OGC standards can benefit the interaction among multiple organizations in China. The ability to share data and processing functions across organizations using standard services could change the traditional manual interactions in their business processes and provide on-demand decision support results through online service integration. In the initiative, six organizations are involved in two “MashUp” scenarios on disaster management: one derives flood maps of Poyang Lake, Jiangxi, and the other generates turbidity maps on demand for East Lake, Wuhan, China. The two scenarios engage different organizations from the Chinese community by integrating their sensor observations, data, and processing services, and improve the automation of the data analysis process using open standards.
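The cross-organization sharing described above rests on standard request interfaces. As a minimal illustration, not one of the initiative's actual services, the sketch below builds a standard OGC WMS 1.3.0 GetMap request URL; the endpoint and layer name are placeholders.

```python
from urllib.parse import urlencode

# Sketch: constructing the kind of standard OGC WMS GetMap request that
# lets one organization pull a map layer rendered by another's server.
# Endpoint and layer name are invented placeholders.

def wms_getmap_url(endpoint, layer, bbox, size=(512, 512)):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # In WMS 1.3.0 with EPSG:4326, BBOX axis order is lat/lon
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "flood_extent",
                     (28.4, 115.8, 29.8, 116.8))
print(url)
```

Because every participant exposes the same interface, the same client code can fetch a flood-extent layer from one organization and a turbidity layer from another.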

  19. The NIST radioactivity measurement assurance program for the radiopharmaceutical industry.

    PubMed

    Cessna, Jeffrey T; Golas, Daniel B

    2012-09-01

    The National Institute of Standards and Technology (NIST) maintains a program for the establishment and dissemination of activity measurement standards in nuclear medicine. These standards are disseminated through Standard Reference Materials (SRMs), calibration services, radionuclide calibrator settings, and the NIST Radioactivity Measurement Assurance Program (NRMAP, formerly the NEI/NIST MAP). The MAP for the radiopharmaceutical industry is described here. Consolidated results show that, over more than 3600 comparisons, 96% of the participants' results differed from that of NIST by less than 10%, and 98% by less than 20%. Individual radionuclide results are presented from 214 to 439 comparisons per radionuclide for (67)Ga, (90)Y, (99m)Tc, (99)Mo, (111)In, (125)I, (131)I, and (201)Tl. The percentage of participants' results within 10% of NIST ranges from 88% to 98%. Published by Elsevier Ltd.
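The comparison statistic summarized above (percent difference from the NIST value, and the fraction of results within a tolerance) is simple to compute. The activity values below are invented for illustration, not NRMAP data.

```python
# Back-of-envelope sketch of the comparison statistic: each participant's
# percent difference from the NIST value, and the fraction of results
# falling within a given tolerance. Numbers are hypothetical.

def percent_difference(participant, nist):
    return 100.0 * (participant - nist) / nist

def fraction_within(results, nist_values, tolerance_pct):
    n_ok = sum(1 for r, n in zip(results, nist_values)
               if abs(percent_difference(r, n)) < tolerance_pct)
    return n_ok / len(results)

participant = [101.0, 95.0, 112.0, 99.5]   # hypothetical activities (MBq)
nist        = [100.0, 100.0, 100.0, 100.0]
print(fraction_within(participant, nist, 10.0))  # 0.75
```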

  20. Clinical data integration of distributed data sources using Health Level Seven (HL7) v3-RIM mapping

    PubMed Central

    2011-01-01

    Background Health information exchange and integration has become one of the top priorities for healthcare systems across institutions and hospitals. Most organizations and establishments implement health information exchange and integration in order to support meaningful information retrieval among their disparate healthcare systems. The challenges that prevent efficient health information integration for heterogeneous data sources are the lack of a common standard to support mapping across distributed data sources and the numerous and diverse healthcare domains. Health Level Seven (HL7) is a standards development organization whose technical committees develop the Reference Information Model (RIM), a standardized abstract representation of HL7 data across all domains of health care. In this article, we present a design and a prototype implementation of HL7 v3-RIM mapping for information integration of distributed clinical data sources. The implementation enables the user to retrieve and search information that has been integrated from disparate healthcare systems using HL7 v3-RIM technology. Method and results We designed and developed a prototype implementation of an HL7 v3-RIM mapping function to integrate distributed clinical data sources, using R-MIM classes from HL7 v3-RIM as a global view along with a collaborative, centralized, web-based mapping tool to handle the evolution of both global and local schemas. Our prototype was implemented and integrated with a clinical data management system (CDMS) as a plug-in module. We tested the prototype system with use-case scenarios for distributed clinical data sources across several legacy CDMSs.
The results have been effective in improving information delivery, completing tasks that would otherwise have been difficult to accomplish, and reducing the time required to finish tasks involved in collaborative information retrieval and sharing with other systems. Conclusions We created a prototype implementation of HL7 v3-RIM mapping for information integration between distributed clinical data sources to promote collaborative healthcare and translational research. The prototype effectively and efficiently ensured the accuracy of information and knowledge extraction for the systems that were integrated. PMID:22104558
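The global-as-view idea underlying this kind of integration can be sketched in a few lines. The field names, the two source systems, and the three-attribute "RIM-like" view below are invented for illustration and are far simpler than real HL7 v3 R-MIM classes; only the LOINC code 718-7 (hemoglobin) is a real identifier.

```python
# Highly simplified sketch of a global-as-view mapping: local fields from
# two disparate clinical systems are rewritten onto one shared schema so
# queries can run against a single view. All names here are toy examples.

RIM_VIEW = ("patient_id", "observation_code", "observation_value")

MAPPINGS = {
    "hospital_a": {"pid": "patient_id", "loinc": "observation_code",
                   "val": "observation_value"},
    "hospital_b": {"patientNumber": "patient_id", "testCode": "observation_code",
                   "result": "observation_value"},
}

def to_global(source, record):
    """Rewrite one local record into the global view's attribute names."""
    mapping = MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

rows = [to_global("hospital_a", {"pid": "P1", "loinc": "718-7", "val": "13.2"}),
        to_global("hospital_b", {"patientNumber": "P2", "testCode": "718-7",
                                 "result": "14.0"})]
print(rows[0]["observation_code"], rows[1]["observation_code"])  # 718-7 718-7
```

Once both records share the global attribute names, a single query over the view reaches data from both legacy systems.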

  1. Temperature-dependent instability of the cTnI subunit in NIST SRM2921 characterized by tryptic peptide mapping.

    PubMed

    van der Burgt, Yuri E M; Cobbaert, Christa M; Dalebout, Hans; Smit, Nico; Deelder, André M

    2012-08-01

    In this study, temperature-dependent instability of the cTnI subunit of the three-protein complex NIST SRM2921 was demonstrated using a mass spectrometric tryptic peptide mapping approach. The results were compared with those for a cTnI subunit obtained as a protein standard from Calbiochem with identical amino acid sequence. Both the three-protein complex from NIST and the cTnI subunit were incubated at elevated temperatures and then evaluated with respect to the primary sequence. The corresponding peptide maps were analyzed using LC-MS/MS. From a Mascot database search in combination with "semiTrypsin" tolerance, it was found that two peptide backbone cleavages had occurred in subunit cTnI of the NIST SRM2921 material upon incubation at 37°C, namely between the amino acids at positions 148/149 and 194/195. The Calbiochem standard did not show increased levels of "unexpected" peptides in tryptic peptide maps. One of the two peptide backbone cleavages could also be monitored using a "single-step" MALDI-MS approach, i.e. without the need for peptide separation. The amount of degradation appeared rather constant in replicate temperature-instability experiments. However, for accurate quantification, internal labelled standards are needed. Copyright © 2012 Elsevier B.V. All rights reserved.
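A tryptic peptide map starts from an in-silico digest: trypsin cleaves on the C-terminal side of K or R, except when the next residue is P. Comparing predicted peptides with those observed by LC-MS/MS is what reveals "unexpected" backbone cleavages such as the 148/149 and 194/195 sites reported above. The sequence below is a made-up toy, not cTnI.

```python
import re

# Sketch of an in-silico tryptic digest using the standard cleavage rule:
# split after K or R unless the following residue is P.

def tryptic_peptides(sequence):
    """Split a protein sequence at K/R not followed by P."""
    return re.split(r"(?<=[KR])(?!P)", sequence)

toy = "MADGSSDAARKPLELAGLGFAELQDLCRQLHARVDK"
peptides = [p for p in tryptic_peptides(toy) if p]
print(peptides)
# ['MADGSSDAAR', 'KPLELAGLGFAELQDLCR', 'QLHAR', 'VDK']
```

Note that the K followed by P stays inside its peptide, per the rule; a peptide observed with a terminus at any other position would count as an unexpected cleavage.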

  2. Building Information Modeling (BIM): A Road Map for Implementation to Support MILCON Transformation and Civil Works Projects within the U.S. Army Corps of Engineers

    DTIC Science & Technology

    2006-10-01

    … benefit from BIM, but the data that can be gleaned from the BIM model will also feed many systems and users. What is Needed and What Must be … ERDC TR-06-10, Building Information Modeling (BIM): A Road Map for Implementation to Support MILCON Transformation and Civil Works … compliant with National BIM Standard (NBIMS); 8 Centers of Standardization (COS) productive in BIM by 2008; all districts productive in NBIMS.

  3. 24 CFR 3285.103 - Site suitability with design zone maps.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HOUSING AND URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS Pre-Installation....305(c)(2) of the Manufactured Home Construction and Safety Standards in this chapter. (b) Roof load... § 3280.305(c)(3) of the Manufactured Home Construction and Safety Standards in this chapter. Refer to...

  4. 24 CFR 3285.103 - Site suitability with design zone maps.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... HOUSING AND URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS Pre-Installation....305(c)(2) of the Manufactured Home Construction and Safety Standards in this chapter. (b) Roof load... § 3280.305(c)(3) of the Manufactured Home Construction and Safety Standards in this chapter. Refer to...

  5. 24 CFR 3285.103 - Site suitability with design zone maps.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... HOUSING AND URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS Pre-Installation....305(c)(2) of the Manufactured Home Construction and Safety Standards in this chapter. (b) Roof load... § 3280.305(c)(3) of the Manufactured Home Construction and Safety Standards in this chapter. Refer to...

  6. Family Care Map: Sustaining family-centered care in Polytrauma Rehabilitation Centers

    PubMed Central

    Ford, James H.; Wise, Meg; Krahn, Dean; Oliver, Karen Anderson; Hall, Carmen; Sayer, Nina

    2015-01-01

    The study assessed sustainability of the Family Care Map, a family-centered approach to providing care for Veterans with polytrauma-related injuries, in four Department of Veterans Affairs Polytrauma Rehabilitation Centers. We applied a mixed-methods approach. Staff surveys used standardized measures of sustainability, commitment to change, information, and participation during implementation. Qualitative inquiry assessed Family Care Map implementation and facilitators and barriers to sustainability. Staff sustainability perceptions had a significant positive correlation with affective commitment to change, participation, and information received about the change process. Integration of the Family Care Map into standard practices and use of its concepts with patients and families related to staff perceptions about sustainability. The degree of use and integration of the Family Care Map in traumatic brain injury/polytrauma care varied among the Polytrauma Rehabilitation Centers. Successful sustainability strategies included integration into daily workflow and organizational culture. Barriers to sustainability included limited staff awareness and use, and outdated information. Some practices, such as measuring and documenting the use of the Family Care Map in treatment plans, may not routinely occur. The focus on family-centered care will require further evaluation of organization-, staff-, and innovation-level attributes that influence sustainability of changes designed to improve family-centered care. PMID:25671632

  7. Development of new mapping standards for geological surveys in Greenland

    NASA Astrophysics Data System (ADS)

    Mätzler, Eva; langley, Kirsty; Hollis, Julie; Heide-Jørgensen, Helene

    2017-04-01

    The current official topographic and geological maps of Greenland are at scales of 1:250,000 and 1:500,000, respectively, allowing only a very limited amount of detail. The maps are outdated, and periglacial landscapes have changed significantly since the acquisition date. Hence, affordable, high-quality new mapping products that can be made available within a restricted time frame are in demand. In order to fulfill those demands, a new mapping standard based on satellite imagery was developed, in which classifications are mainly carried out with algorithms suitable for automation. A Digital Elevation Model (ArcticDEM) was applied, allowing examination of topographic and geological structures and 3D visualization. Information on topographic features and lithology was extracted based on analysis of spectral characteristics from different multispectral data sources (Landsat 8, ASTER, WorldView-3), partly combined with the DEM. A first product is completed, and validation was carried out by field surveys. Field and remotely sensed data were integrated into a GIS database, and derived data will be freely available, providing a valuable tool for planning and carrying out mineral exploration and other field activities. This study offers a method for generating up-to-date, low-cost, and high-quality mapping products suitable for Arctic regions, where accessibility is restricted due to remoteness and lack of infrastructure.

  8. USGS national surveys and analysis projects: Preliminary compilation of integrated geological datasets for the United States

    USGS Publications Warehouse

    Nicholson, Suzanne W.; Stoeser, Douglas B.; Wilson, Frederic H.; Dicken, Connie L.; Ludington, Steve

    2007-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and rock type information. Such spatial data can be conveniently used to generate derivative maps for purposes that include mineral-resource assessment, metallogenic studies, tectonic studies, and human health and environmental research. In 1997, the United States Geological Survey’s Mineral Resources Program initiated an effort to develop national digital databases for use in mineral resource and environmental assessments. One primary activity of this effort was to compile a national digital geologic map database, utilizing state geologic maps, to support mineral resource studies in the 1:250,000- to 1:1,000,000-scale range. Over the course of the past decade, state databases were prepared using a common standard for the database structure, fields, attributes, and data dictionaries. As of late 2006, standardized geological map databases for all conterminous (CONUS) states have been available on-line as USGS Open-File Reports. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:500,000-scale regional compilations. See below for a list of all published databases.

  9. Topographic mapping data semantics through data conversion and enhancement: Chapter 7

    USGS Publications Warehouse

    Varanka, Dalia; Carter, Jonathan; Usery, E. Lynn; Shoberg, Thomas; Edited by Ashish, Naveen; Sheth, Amit P.

    2011-01-01

    This paper presents research on the semantics of topographic data for triples and ontologies to blend the capabilities of the Semantic Web and The National Map of the U.S. Geological Survey. Automated conversion of relational topographic data of several geographic sample areas to the triple data model standard resulted in relatively poor semantic associations. Further research employed vocabularies of feature type and spatial relation terms. A user interface was designed to model the capture of non-standard terms relevant to public users and to map those terms to existing data models of The National Map through the use of ontology. Server access for the study area triple stores was made publicly available, illustrating how the development of linked data may transform institutional policies to open government data resources to the public. This paper presents these data conversion and research techniques that were tested as open linked data concepts leveraged through a user-centered interface and open USGS server access to the public.

  10. Improved Topographic Mapping Through Multi-Baseline SAR Interferometry with MAP Estimation

    NASA Astrophysics Data System (ADS)

    Dong, Yuting; Jiang, Houjun; Zhang, Lu; Liao, Mingsheng; Shi, Xuguo

    2015-05-01

    There is an inherent contradiction between the sensitivity of height measurement and the accuracy of phase unwrapping for SAR interferometry (InSAR) over rough terrain. This contradiction can be resolved by multi-baseline InSAR analysis, which exploits multiple phase observations with different normal baselines to improve phase unwrapping accuracy, or even to avoid phase unwrapping. In this paper we propose a maximum a posteriori (MAP) estimation method, assisted by SRTM DEM data, for multi-baseline InSAR topographic mapping. Based on our method, a data processing flow is established and applied to a multi-baseline ALOS/PALSAR dataset. The accuracy of the resultant DEMs is evaluated using a standard Chinese national DEM at 1:10,000 scale as reference. The results show that multi-baseline InSAR can improve DEM accuracy compared with the single-baseline case. It is noteworthy that phase unwrapping is avoided and that the quality of the multi-baseline InSAR DEM can meet the DTED-2 standard.
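The MAP idea (a coarse external DEM supplies a prior on height, and multiple baselines supply wrapped-phase likelihoods, so no phase unwrapping is needed) can be illustrated with a toy grid search. This is not the authors' algorithm: the geometry is collapsed into a single phase-to-height constant per baseline, the likelihood is a von Mises-style cosine score, and all constants are invented.

```python
import math

# Toy sketch of multi-baseline MAP height estimation: each baseline i
# relates height h to a wrapped phase phi_i = wrap(alpha_i * h). A coarse
# prior height (standing in for SRTM) gives a Gaussian prior; the MAP
# height maximizes log-prior + phase likelihood over a search grid.

def wrap(phase):
    return (phase + math.pi) % (2 * math.pi) - math.pi

def map_height(phases, alphas, h_prior, sigma_prior, sigma_phase=0.5):
    best_h, best_score = None, -float("inf")
    h = h_prior - 100.0
    while h <= h_prior + 100.0:
        # cosine (von Mises-style) phase likelihood, summed over baselines
        score = sum(math.cos(wrap(p - a * h)) / sigma_phase**2
                    for p, a in zip(phases, alphas))
        score -= (h - h_prior) ** 2 / (2 * sigma_prior**2)  # Gaussian prior
        if score > best_score:
            best_h, best_score = h, score
        h += 0.5
    return best_h

alphas = [0.05, 0.11, 0.23]            # rad per meter, one per baseline
true_h = 842.0
phases = [wrap(a * true_h) for a in alphas]
print(map_height(phases, alphas, h_prior=850.0, sigma_prior=20.0))
```

Each single baseline is ambiguous (its phase repeats every 2π/alpha meters), but the ambiguities of the three baselines only coincide at the true height, which the prior then pins down.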

  11. Bridging the Gap between HL7 CDA and HL7 FHIR: A JSON Based Mapping.

    PubMed

    Rinner, Christoph; Duftschmid, Georg

    2016-01-01

    The Austrian electronic health record (EHR) system ELGA went live in December 2016. It is a document oriented EHR system and is based on the HL7 Clinical Document Architecture (CDA). The HL7 Fast Healthcare Interoperability Resources (FHIR) is a relatively new standard that combines the advantages of HL7 messages and CDA Documents. In order to offer easier access to information stored in ELGA we present a method based on adapted FHIR resources to map CDA documents to FHIR resources. A proof-of-concept tool using Java, the open-source FHIR framework HAPI-FHIR and publicly available FHIR servers was created to evaluate the presented mapping. In contrast to other approaches the close resemblance of the mapping file to the FHIR specification allows existing FHIR infrastructure to be reused. In order to reduce information overload and facilitate the access to CDA documents, FHIR could offer a standardized way to query CDA data on a fine granular base in Austria.
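The flavor of a JSON-driven CDA-to-FHIR mapping can be sketched briefly. The mapping table, paths, and field names below are invented for illustration and are far simpler than real CDA documents or adapted FHIR resources; the actual work used HAPI-FHIR and public FHIR servers.

```python
import json

# Sketch: a JSON mapping table states where a value found at a CDA-like
# path should land in a FHIR-like resource. All paths are toy examples.

MAPPING = json.loads("""
{
  "Patient.birthDate": "recordTarget.patientRole.patient.birthTime",
  "Patient.gender":    "recordTarget.patientRole.patient.administrativeGenderCode"
}
""")

def get_path(doc, dotted):
    """Walk a nested dict along a dotted path."""
    for key in dotted.split("."):
        doc = doc[key]
    return doc

def cda_to_fhir(cda):
    """Build a flat FHIR-like dict by applying the mapping table."""
    return {fhir_path: get_path(cda, cda_path)
            for fhir_path, cda_path in MAPPING.items()}

cda = {"recordTarget": {"patientRole": {"patient": {
    "birthTime": "1970-05-01", "administrativeGenderCode": "F"}}}}
print(cda_to_fhir(cda))
# {'Patient.birthDate': '1970-05-01', 'Patient.gender': 'F'}
```

Keeping the mapping itself in JSON is what lets it stay close to the FHIR specification and be reused by existing FHIR tooling.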

  12. Effect of temperature on the visualization by digital color mapping of latent fingerprint deposits on metal.

    PubMed

    Peel, Alicia; Bond, John W

    2014-03-01

    Visualization of fingerprint deposits by digital color mapping of light reflected from the surface of heated brass, copper, aluminum, and tin has been investigated using Adobe® Photoshop®. Metals were heated to a range of temperatures (T) between 50°C and 500°C in 50°C intervals, with enhancement being optimal when the metals are heated to 250°C, 350°C, 50°C, and 300°C, respectively, and the hue values adjusted to 247°, 245°, 5°, and 34°, respectively. Fingerprint visualization after color mapping was not degraded by subsequent washing of the metals, and color mapping did not compromise the visibility of the fingerprint for any value of T. The optimum value of T for fingerprint visibility is significantly dependent on the standard reduction potential of the metal, with Kendall’s Tau (τ) = 0.953 (p < 0.001). For brass, this correlation is obtained when considering the standard reduction potential of zinc rather than copper.

  13. Selection of vegetation indices for mapping the sugarcane condition around the oil and gas field of North West Java Basin, Indonesia

    NASA Astrophysics Data System (ADS)

    Muji Susantoro, Tri; Wikantika, Ketut; Saepuloh, Asep; Handoyo Harsolumakso, Agus

    2018-05-01

    Selection of vegetation indices in plant mapping is needed to provide the best information on plant conditions. The methods used in this research are standard deviation analysis and linear regression. This research aimed to determine the vegetation indices to be used for mapping sugarcane conditions around oil and gas fields. The data used in this study are Landsat 8 OLI/TIRS. Standard deviation analysis of the 23 vegetation indices with 27 samples identified the six indices with the highest standard deviations, namely GRVI, SR, NLI, SIPI, GEMI, and LAI, with standard deviation values of 0.47, 0.43, 0.30, 0.17, 0.16, and 0.13. Regression correlation analysis of the 23 vegetation indices with 280 samples identified six indices, namely NDVI, ENDVI, GDVI, VARI, LAI, and SIPI, selected on the basis of regression correlations with R2 values of at least 0.8. The combined analysis of the standard deviation and the regression correlation yielded five vegetation indices: NDVI, ENDVI, GDVI, LAI, and SIPI. The results of both methods show that combining the two methods is needed to produce a good analysis of sugarcane conditions. This was confirmed through field surveys, which showed good results for the prediction of microseepages.
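The first selection step above (ranking indices by the spread of their values across field samples) is straightforward to sketch. The sample values below are invented; the study computed 23 indices from Landsat 8 bands.

```python
from statistics import pstdev

# Sketch: rank vegetation indices by the standard deviation of their
# values across samples, keeping the most variable (and therefore most
# discriminating) indices first. Values are hypothetical.

samples = {
    "GRVI": [0.10, 0.55, 0.90, 1.02],
    "SIPI": [0.80, 0.95, 1.00, 1.10],
    "NDVI": [0.62, 0.66, 0.70, 0.71],
}

ranked = sorted(samples, key=lambda idx: pstdev(samples[idx]), reverse=True)
print(ranked)  # ['GRVI', 'SIPI', 'NDVI'] for these made-up numbers
```

A second, independent ranking (the regression step) would then be intersected with this one to produce the final shortlist.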

  14. From printed geological maps to web-based service oriented data products - strategies, foundations and problems.

    NASA Astrophysics Data System (ADS)

    Ebner, M.; Schiegl, M.; Stöckl, W.; Heger, H.

    2012-04-01

    The Geological Survey of Austria is legally obligated by the INSPIRE directive to provide data that fall under this directive (geology, mineral resources, and natural risk zones) to the European Commission in a semantically harmonized and technically interoperable way. Until recently the focus was entirely on the publication of high-quality printed cartographic products. These have a complex (carto-)graphic data-model, which allows several thematic aspects, such as lithology, stratigraphy, tectonics, geologic age, mineral resources, mass movements, and geomorphology, to be visualized in a single planar map/product. Nonetheless these graphic data-models do not allow individual thematic aspects to be retrieved, since these were coded in a complex portrayal scheme. Automatic information retrieval is thus impossible, and domain knowledge is necessary to interpret these "encrypted datasets". With INSPIRE becoming effective and a variety of conceptual models (e.g. GeoSciML), built around a semantic framework (i.e. controlled vocabularies), being available, it is necessary to develop a strategy and workflow for semantic harmonization of such datasets. In this contribution we demonstrate the development of a multistage workflow which will allow us to transform our printed maps into semantically enabled datasets and services, and we discuss some prerequisites, foundations, and problems. In a first step in our workflow we analyzed our maps and developed controlled vocabularies that describe the thematic content of our data. We then developed a physical data-model which we use to attribute our spatial data with thematic information from our controlled vocabularies to form core thematic datasets. This physical data-model is geared towards use on an organizational level but builds upon existing standards (INSPIRE, GeoSciML) to allow transformation to international standards.
In a final step we will develop a standardized mapping scheme to publish INSPIRE-conformant services from our core datasets. This two-step transformation is necessary since a direct mapping to international standards is not possible for traditional map-based data. Controlled vocabularies provide the foundation of semantic harmonization. For the encoding of the vocabularies we build upon the W3C standard SKOS (Simple Knowledge Organisation System), a thesaurus specification for the semantic web, which is itself based on the Resource Description Framework (RDF) and RDF Schema, and we added some Dublin Core and VoID for the metadata of our vocabularies and resources. For the development of these thesauri we use the commercial software PoolParty, a tool specially built to develop, manage, and publish multilingual thesauri. The corporate thesauri of the Austrian Geological Survey are exposed via a web service that conforms to the linked data principles. This web service gives access to (1) an RDF/HTML representation of the resources via simple, robust, and thus persistent HTTP URIs, (2) a download of the complete vocabularies in RDF format, and (3) a full-fledged SPARQL endpoint to query the thesaurus. In developing physical data-models (based on preexisting conceptual models) one must dismiss the classical schemes of map-based portrayal of data. For example, for individual geological units on traditional geological maps usually a single age range is given (e.g. formation age), but one might want to attribute several geologic ages (formation age, metamorphic age, cooling ages, etc.) to individual units. Such issues have to be taken into account when developing robust physical data-models. Based on our experience we are convinced that individual institutions need to develop their own controlled vocabularies and individual data-models that fit their specific needs at an organizational level.
If externally developed vocabularies and data-models are introduced into established workflows, newly generated and existing data may diverge and it will be hard to achieve or maintain a common standard. We thus suggest that it is necessary for institutions to keep (or develop) their own organizational standards and vocabularies and map them to generally agreed international standards such as INSPIRE or GeoSciML in the fashion suggested by the linked data principles.

  15. Gamut mapping in a high-dynamic-range color space

    NASA Astrophysics Data System (ADS)

    Preiss, Jens; Fairchild, Mark D.; Ferwerda, James A.; Urban, Philipp

    2014-01-01

    In this paper, we present a novel approach to tone mapping as gamut mapping in a high-dynamic-range (HDR) color space. High- and low-dynamic-range (LDR) images as well as device gamut boundaries can be represented simultaneously within such a color space. This enables a unified transformation of the HDR image into the gamut of an output device (in this paper called HDR gamut mapping). An additional aim of this paper is to investigate the suitability of a specific HDR color space to serve as a working color space for the proposed HDR gamut mapping. For the HDR gamut mapping, we use a recent approach that iteratively minimizes an image-difference metric subject to in-gamut images. A psychophysical experiment on an HDR display shows that the standard reproduction workflow of two subsequent transformations (tone mapping and then gamut mapping) may be improved by HDR gamut mapping.

  16. Validation Workshop of the DRDC Concept Map Knowledge Model: Issues in Intelligence Analysis

    DTIC Science & Technology

    2010-06-29

    … group noted problems with grammar, and a more standard approach to the grammar of the linking term (e.g., use only active tense) would certainly have … Knowledge Model is distinct from a Concept Map. A Concept Map is a single map, probably presented in one view, while a Knowledge Model is a set of … Agenda: The workshop followed the agenda presented in Table 2-3 (Workshop Agenda).

  17. Online, interactive assessment of geothermal energy potential in the U.S

    NASA Astrophysics Data System (ADS)

    Allison, M. L.; Richard, S. M.; Clark, R.; Coleman, C.; Love, D.; Pape, E.; Musil, L.

    2011-12-01

    Geothermal-relevant geoscience data from all 50 states (www.stategeothermaldata.org), federal agencies, national labs, and academic centers are being digitized and linked in a distributed network via the U.S. Department of Energy-funded National Geothermal Data System (NGDS) to foster geothermal energy exploration and development through the use of interactive online 'mashups,' data integration, and applications. The initial emphasis is to make as much information as possible accessible, with a long-range goal of making data interoperable through standardized services and interchange formats. Resources may be made available as documents (files) in whatever format they are currently in, converted to tabular files using standard content models, or published as Open Geospatial Consortium or ESRI Web services using standard XML schemas. An initial set of thirty geoscience data content models is in use or under development to define standardized interchange formats: aqueous chemistry, borehole temperature data, direct use feature, drill stem test, earthquake hypocenter, fault feature, geologic contact feature, geologic unit feature, thermal/hot spring description, metadata, quaternary fault, volcanic vent description, well header feature, borehole lithology log, crustal stress, gravity, heat flow/temperature gradient, permeability, and feature description data such as developed geothermal systems, geologic unit geothermal properties, production data, rock alteration description, rock chemistry, and thermal conductivity. Map services are also being developed for isopach maps (depth to bedrock) and aquifer temperature maps, and several states are working on geothermal resource overview maps. Content models are developed preferentially from existing community use in order to encourage widespread adoption and promulgate minimum metadata quality standards.
Geoscience data and maps from NGDS participating institutions (USGS, Southern Methodist University, Boise State University Geothermal Data Coalition) are being supplemented with extensive land management and land use resources from the Western Regional Partnership (15 federal agencies and 5 Western states) to provide access to a comprehensive, holistic set of data critical to geothermal energy development. As of August 2011, over 33,000 data resources have been registered in the system catalog, along with scores of Web services to deliver integrated data to the desktop for free downloading or online use. The data exchange mechanism is built on the U.S. Geoscience Information Network (USGIN, http://lab.usgin.org) protocols and standards developed in partnership with the U.S. Geological Survey.

  18. Social Network Map: Some Further Refinements on Administration.

    ERIC Educational Resources Information Center

    Tracy, Elizabeth M.; Abell, Neil

    1994-01-01

    Notes that social network mapping techniques have been advanced as means of assessing social and environmental resources. Addresses issue of convergent construct validity, correlations among dimensions of perceived social support as measured by social network data with other standardized social support instruments. Findings confirm that structural…

  19. An approach for mapping the number and distribution of Salmonella contamination on the poultry carcass.

    PubMed

    Oscar, T P

    2008-09-01

Mapping the number and distribution of Salmonella on poultry carcasses will help guide better design of processing procedures to reduce or eliminate this human pathogen from poultry. A selective plating medium with multiple antibiotics (xylose-lysine agar medium [XL] containing N-(2-hydroxyethyl)piperazine-N'-(2-ethanesulfonic acid) and the antibiotics chloramphenicol, ampicillin, tetracycline, and streptomycin [XLH-CATS]) and a multiple-antibiotic-resistant strain (ATCC 700408) of Salmonella Typhimurium definitive phage type 104 (DT104) were used to develop an enumeration method for mapping the number and distribution of Salmonella Typhimurium DT104 on the carcasses of young chickens in the Cornish game hen class. The enumeration method was based on the concept that the time to detection by drop plating on XLH-CATS during incubation of whole chicken parts in buffered peptone water would be inversely related to the initial log number (N0) of Salmonella Typhimurium DT104 on the chicken part. The sampling plan for mapping involved dividing the chicken into 12 parts, which ranged in average size from 36 to 80 g. To develop the enumeration method, whole parts were spot inoculated with 0 to 6 log Salmonella Typhimurium DT104, incubated in 300 ml of buffered peptone water, and detected on XLH-CATS by drop plating. An inverse relationship between detection time on XLH-CATS and N0 was found (r = -0.984). The standard curve was similar for the individual chicken parts, and therefore a single standard curve for all 12 chicken parts was developed. The final standard curve, which contained a 95% prediction interval for providing stochastic results for N0, had high goodness of fit (r2 = 0.968) and was N0 (log) = 7.78 +/- 0.61 - (0.995 x detection time). Ninety-five percent of N0 values were within +/- 0.61 log of the standard curve. 
The enumeration method and sampling plan will be used in future studies to map changes in the number and distribution of Salmonella on carcasses of young chickens fed the DT104 strain used in standard curve development and subjected to different processing procedures.
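Because the reported standard curve is a simple linear relationship, the estimation step can be sketched directly. This is an illustrative sketch using only the coefficients quoted in the abstract; the function name and example detection time are not from the study.

```python
# Sketch of the reported standard curve for estimating the initial log number
# (N0) of Salmonella Typhimurium DT104 from time to detection on XLH-CATS.
# Coefficients (7.78, 0.995, +/- 0.61) are quoted from the abstract; the
# function itself is illustrative.

def estimate_log_n0(detection_time):
    """Return (estimate, lower, upper) log N0 for a given detection time.

    Standard curve: N0 (log) = 7.78 - 0.995 * detection time,
    with a +/- 0.61 log 95% prediction interval.
    """
    estimate = 7.78 - 0.995 * detection_time
    return estimate, estimate - 0.61, estimate + 0.61

est, low, high = estimate_log_n0(5.0)
print(f"log N0 ~ {est:.2f} (95% PI {low:.2f} to {high:.2f})")
```

The +/- 0.61 log band is the 95% prediction interval reported for the final standard curve, which is what makes the estimates stochastic rather than point values.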

  20. Enabling Joint Commission Medication Reconciliation Objectives with the HL7 / ASTM Continuity of Care Document Standard

    PubMed Central

    Dolin, Robert H.; Giannone, Gay; Schadow, Gunther

    2007-01-01

    We sought to determine how well the HL7 / ASTM Continuity of Care Document (CCD) standard supports the requirements underlying the Joint Commission medication reconciliation recommendations. In particular, the Joint Commission emphasizes that transition points in the continuum of care are vulnerable to communication breakdowns, and that these breakdowns are a common source of medication errors. These transition points are the focus of communication standards, suggesting that CCD can support and enable medication related patient safety initiatives. Data elements needed to support the Joint Commission recommendations were identified and mapped to CCD, and a detailed clinical scenario was constructed. The mapping identified minor gaps, and identified fields present in CCD not specifically identified by Joint Commission, but useful nonetheless when managing medications across transitions of care, suggesting that a closer collaboration between the Joint Commission and standards organizations will be mutually beneficial. The nationally recognized CCD specification provides a standards-based solution for enabling Joint Commission medication reconciliation objectives. PMID:18693823

  1. Enabling joint commission medication reconciliation objectives with the HL7 / ASTM Continuity of Care Document standard.

    PubMed

    Dolin, Robert H; Giannone, Gay; Schadow, Gunther

    2007-10-11

    We sought to determine how well the HL7/ASTM Continuity of Care Document (CCD) standard supports the requirements underlying the Joint Commission medication reconciliation recommendations. In particular, the Joint Commission emphasizes that transition points in the continuum of care are vulnerable to communication breakdowns, and that these breakdowns are a common source of medication errors. These transition points are the focus of communication standards, suggesting that CCD can support and enable medication related patient safety initiatives. Data elements needed to support the Joint Commission recommendations were identified and mapped to CCD, and a detailed clinical scenario was constructed. The mapping identified minor gaps, and identified fields present in CCD not specifically identified by Joint Commission, but useful nonetheless when managing medications across transitions of care, suggesting that a closer collaboration between the Joint Commission and standards organizations will be mutually beneficial. The nationally recognized CCD specification provides a standards-based solution for enabling Joint Commission medication reconciliation objectives.

  2. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
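The Web Map Service standard mentioned above boils down to parameterized HTTP GET requests. Below is a minimal sketch of composing a WMS 1.1.1 GetMap URL; the server URL and layer name are hypothetical, not those of the Mission Support System.

```python
from urllib.parse import urlencode

# Minimal sketch of an OGC WMS 1.1.1 GetMap request of the kind a Web Map
# Service client issues. The base URL and layer name below are hypothetical.

def getmap_url(base_url, layer, bbox, width=800, height=600):
    """Build a GetMap URL for one layer over a lon/lat bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

url = getmap_url("http://example.org/wms", "ecmwf_temperature",
                 (-30.0, 30.0, 40.0, 80.0))
print(url)
```

Fetching the resulting URL would return a rendered map image that a client can overlay with other layers, which is what makes the WMS interface convenient for flight-planning displays.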

  3. Develop Probabilistic Tsunami Design Maps for ASCE 7

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.

    2014-12-01

A national standard for engineering design against tsunami effects has not previously existed, and this significant risk is largely ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard. Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure. The standard will apply to new designs as part of tsunami preparedness, and the provisions will also be significant as a post-tsunami recovery tool for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami model forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models, ensuring that the probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach consisting of a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights. 
The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map for the study site. ASCE's Energy Grade Line Analysis then uses these modeling constraints to derive hydrodynamic forces for structures within the tsunami design zone. The probabilistic tsunami design maps will be validated by comparison to state inundation maps under the coordination of the National Tsunami Hazard Mitigation Program.

  4. Vector Topographic Map Data over the BOREAS NSA and SSA in SIF Format

    NASA Technical Reports Server (NTRS)

    Knapp, David; Nickeson, Jaime; Hall, Forrest G. (Editor)

    2000-01-01

    This data set contains vector contours and other features of individual topographic map sheets from the National Topographic Series (NTS). The map sheet files were received in Standard Interchange Format (SIF) and cover the BOReal Ecosystem-Atmosphere Study (BOREAS) Northern Study Area (NSA) and Southern Study Area (SSA) at scales of 1:50,000 and 1:250,000. The individual files are stored in compressed Unix tar archives.

  5. High-Altitude Electromagnetic Pulse (HEMP) Testing

    DTIC Science & Technology

    2011-11-10

Security Classification Guide (SCG). b. The HEMP simulation facility shall have a measured map of the peak amplitude waveform of the...Quadripartite Standardization Agreements, sec second, SCG security classification guide, SN serial number, SOP Standard Operating Procedure

  6. Standardized reference ideogram for physical mapping in the saltwater crocodile (Crocodylus porosus).

    PubMed

    Dalzell, P; Miles, L G; Isberg, S R; Glenn, T C; King, C; Murtagh, V; Moran, C

    2009-01-01

    Basic cytogenetic data, such as diploid number and general chromosome morphology, are available for many reptilian species. Here we present a detailed cytogenetic examination of the saltwater crocodile (Crocodylus porosus) karyotype, including the creation of the first fully annotated G-band standard ideogram for any crocodilian species. The C. porosus karyotype contains macrochromosomes and has a diploid number of 34. This study presents a detailed description of each chromosome, permitting unambiguous chromosome identification. The fully annotated standardized C. porosus ideogram provides the backbone to a standard nomenclature system which can be used to accurately identify specific band locations. Seven microsatellite containing fosmid clones were fluorescently labeled and used as fluorescent in situ hybridization (FISH) probes for physical localization. Chromosome locations for each of these FISH probes were successfully assigned, demonstrating the utility of the fully annotated ideogram for genome mapping. Copyright 2010 S. Karger AG, Basel.

  7. NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information

    USGS Publications Warehouse

    ,

    2004-01-01

    Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.

  8. Hydrologic unit maps

    USGS Publications Warehouse

    Seaber, Paul R.; Kapinos, F. Paul; Knapp, George L.

    1987-01-01

A set of maps depicting approved boundaries of, and numerical codes for, river-basin units of the United States has been developed by the U.S. Geological Survey. These 'Hydrologic Unit Maps' are four-color maps that present information on drainage, culture, hydrography, and hydrologic boundaries and codes of (1) the 21 major water-resources regions and the 222 subregions designated by the U.S. Water Resources Council, (2) the 352 accounting units of the U.S. Geological Survey's National Water Data Network, and (3) the 2,149 cataloging units of the U.S. Geological Survey's 'Catalog of Information on Water Data.' The maps are plotted on the Geological Survey State base-map series at a scale of 1:500,000 and, except for Alaska, depict hydrologic unit boundaries for all drainage basins greater than 700 square miles (1,813 square kilometers). A complete list of all the hydrologic units, along with their drainage areas, their names, and the names of the States or outlying areas in which they reside, is contained in the report. These maps and associated codes provide a standardized base for use by water-resources organizations in locating, storing, retrieving, and exchanging hydrologic data, in indexing and inventorying hydrologic data and information, in cataloging water-data acquisition activities, and in a variety of other applications. Because the maps have undergone extensive review by all principal Federal, regional, and State water-resource agencies, they are widely accepted for use in planning and describing water-use and related land-use activities, and in geographically organizing hydrologic data. Examples of these uses are given in the report. The hydrologic unit codes shown on the maps have been approved as a Federal Information Processing Standard for use by the Federal establishment.
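The hydrologic unit codes described here are hierarchical: an 8-digit code carries two digits each for region, subregion, accounting unit, and cataloging unit. A minimal sketch of splitting a code into those four levels (the helper name and example code are illustrative):

```python
# Illustrative parser for the hierarchical 8-digit hydrologic unit codes
# described in the report: two digits per level, so each longer prefix
# identifies a progressively finer drainage unit.

def parse_huc(code):
    """Split an 8-digit hydrologic unit code into its four nested levels."""
    if len(code) != 8 or not code.isdigit():
        raise ValueError("expected an 8-digit hydrologic unit code")
    return {
        "region": code[:2],           # one of the 21 major regions
        "subregion": code[:4],        # one of the 222 subregions
        "accounting_unit": code[:6],  # one of the 352 accounting units
        "cataloging_unit": code,      # one of the 2,149 cataloging units
    }

print(parse_huc("02070010"))
```

Keeping the levels as string prefixes rather than integers preserves the leading zeros that the codes require.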

  9. Using remote sensing in support of environmental management: A framework for selecting products, algorithms and methods.

    PubMed

    de Klerk, Helen M; Gilbertson, Jason; Lück-Vogel, Melanie; Kemp, Jaco; Munch, Zahn

    2016-11-01

Traditionally, to map environmental features using remote sensing, practitioners use training data to develop models on various satellite data sets with a number of classification approaches, then use test data to select a single 'best performer' from which the final map is made. We instead use an omission/commission plot to evaluate the various results and compile a probability map based on consistently strong-performing models across a range of standard accuracy measures. We suggest that this easy-to-use approach can be applied in any study using remote sensing to map natural features for management action. We demonstrate the approach using optical remote sensing products of different spatial and spectral resolution to map the endemic and threatened flora of quartz patches in the Knersvlakte, South Africa. Quartz patches can be mapped using either SPOT 5 (used for its relatively fine spatial resolution) or Landsat 8 imagery (used because it is freely accessible and has higher spectral resolution). Of the variety of classification algorithms available, we tested maximum likelihood and support vector machine, and applied these to raw spectral data, the first three PCA summaries of the data, and the standard normalised difference vegetation index. We found that there is no 'one size fits all' solution to the choice of a 'best fit' model (i.e. combination of classification algorithm and data set), which agrees with the literature that classifier performance varies with data properties. This lends support to our suggestion that, rather than identifying a single best model and basing the map on that result alone, a probability map built from the range of consistently top-performing models provides a rigorous solution to environmental mapping. Copyright © 2016 Elsevier Ltd. All rights reserved.
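The probability-map idea can be sketched in a few lines: rather than keeping one 'best' classifier, average the binary presence/absence maps of the consistently strong models so that each pixel carries the fraction of models that map the feature. The tiny arrays below are synthetic stand-ins, not the study's data.

```python
import numpy as np

# Sketch of compiling a probability map from several strong-performing
# classification results. Each 2x2 array is a synthetic stand-in for one
# model's binary presence/absence map of the target feature.

maps = np.array([
    [[1, 0], [1, 1]],  # e.g. maximum likelihood on raw bands
    [[1, 0], [0, 1]],  # e.g. SVM on PCA components
    [[1, 1], [0, 1]],  # e.g. SVM on NDVI
])

# Per-pixel fraction of models that mapped the feature.
probability_map = maps.mean(axis=0)
print(probability_map)
```

Pixels where all models agree get probability 1.0, while disputed pixels get intermediate values that flag where field checking or management caution is most needed.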

  10. Developing Tsunami Evacuation Plans, Maps, And Procedures: Pilot Project in Central America

    NASA Astrophysics Data System (ADS)

    Arcos, N. P.; Kong, L. S. L.; Arcas, D.; Aliaga, B.; Coetzee, D.; Leonard, J.

    2015-12-01

In the End-to-End tsunami warning chain, once a forecast is provided and a warning alert issued, communities must know what to do and where to go. The 'where to go' answer should come from reliable, practical community-level tsunami evacuation maps. Following Exercise Pacific Wave 2011, a questionnaire was sent to the 46 Member States of the Pacific Tsunami Warning System (PTWS). The results revealed that over 42 percent of Member States lacked tsunami mass coastal evacuation plans. Additionally, a significant gap in mapping was exposed, as over 55 percent of Member States lacked tsunami evacuation maps, routes, signs, and assembly points. Thus a significant portion of countries in the Pacific lack appropriate tsunami planning and mapping for their at-risk coastal communities. While a variety of tools exist to establish tsunami inundation areas, they are inconsistent, and no methodology has been developed to assist countries in producing tsunami evacuation maps, plans, and procedures. The International Tsunami Information Center (ITIC) and partners are leading a Pilot Project in Honduras demonstrating that globally standardized tools and methodologies can be applied by a country with minimal tsunami warning and mitigation resources to determine tsunami inundation areas and, subsequently, community-owned tsunami evacuation maps and plans for at-risk communities. The Pilot involves a 1- to 2-year process centered on a series of linked tsunami training workshops on evacuation planning, evacuation map development, inundation modeling and map creation, tsunami warning and emergency response Standard Operating Procedures (SOPs), and conducting tsunami exercises (including evacuation). The Pilot concludes with a UNESCO/IOC document so that other countries can replicate the process in their tsunami-prone communities.

  11. Crowdsourcing-based evaluation of privacy in HDR images

    NASA Astrophysics Data System (ADS)

    Korshunov, Pavel; Nemoto, Hiromi; Skodras, Athanassios; Ebrahimi, Touradj

    2014-05-01

The ability of High Dynamic Range imaging (HDRi) to capture details in high-contrast environments, making both dark and bright regions clearly visible, has strong implications for privacy. However, the extent to which HDRi affects privacy when it is used instead of typical Standard Dynamic Range imaging (SDRi) is not yet clear. In this paper, we investigate the effect of HDRi on privacy via crowdsourcing evaluation using the Microworkers platform. Because no standard HDRi privacy evaluation dataset exists, we created one containing people of varying gender, race, and age, shot indoors and outdoors under a large range of lighting conditions. We evaluate the tone-mapped versions of these images, obtained with several representative tone-mapping algorithms, using a subjective privacy evaluation methodology. The evaluation was performed in a crowdsourcing-based framework, a popular and effective alternative to traditional lab-based assessment. The results of the experiments demonstrate a significant loss of privacy when even tone-mapped versions of HDR images are used compared to typical SDR images shot with a standard exposure.

  12. 30 CFR 75.1200-1 - Additional information on mine map.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... SAFETY AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Maps § 75.1200-1 Additional... symbols; (g) The location of railroad tracks and public highways leading to the mine, and mine buildings... permanent base line points coordinated with the underground and surface mine traverses, and the location and...

  13. Thematic Accuracy Assessment of the 2011 National Land Cover Database (NLCD)

    EPA Science Inventory

    Accuracy assessment is a standard protocol of National Land Cover Database (NLCD) mapping. Here we report agreement statistics between map and reference labels for NLCD 2011, which includes land cover for ca. 2001, ca. 2006, and ca. 2011. The two main objectives were assessment o...

  14. A Road Map for Improving Geography Assessment

    ERIC Educational Resources Information Center

    Wertheim, Jill A.; Edelson, Daniel C.; Hildebrant, Barbara; Hinde, Elizabeth; Kenney, Marianne; Kolvoord, Robert; Lanegran, David; Marcello, Jody Smothers; Morrill, Robert; Ruiz-Primo, Maria; Seixas, Peter; Shavelson, Richard

    2013-01-01

    In late 2012, both the second edition of the "Geography for Life: National Geography Standards" and the National Science Foundation-funded "Road Map for Geography Education Project" reports were released; the former document describes the conceptual goals for K-12 geography education, and the latter, a route to coordinating reform efforts to…

  15. Realizing the Full Potential of the Video Disc for Mapping Applications,

    DTIC Science & Technology

    1985-03-01

symbology, lettering and color usage are all factors that will be tested and evaluated for ease of recognition and visual communication when maps are...filmed and displayed on a standard television monitor and the images will then be evaluated for ease of recognition and visual communication. This

  16. Current trends in satellite based emergency mapping - the need for harmonisation

    NASA Astrophysics Data System (ADS)

    Voigt, Stefan

    2013-04-01

During the past years, the availability and use of satellite image data to support disaster management and humanitarian relief organisations has increased greatly. Automation and data processing techniques are improving, and the capacity to access and process satellite imagery is getting better globally. More and more global activities, via the internet and through organisations such as the United Nations or the International Charter Space and Major Disasters, engage in the topic, while at the same time more and more national or local centres take up rapid mapping operations and activities. To make even more effective use of this very positive increase in capacity, rapid mapping results in general need to be better harmonized, if not standardized: for the operational provision of analysis results, for fast validation of satellite-derived damage assessments, for better cooperation in the joint inter-agency generation of rapid mapping products, and for general scientific use. In this presentation, experiences from many years of rapid mapping gained by the DLR Center for Satellite Based Crisis Information (ZKI) within the context of national activities, the International Charter Space and Major Disasters, GMES/Copernicus, etc. are reported. Furthermore, an overview is given of how automation, quality assurance, and optimization can be achieved through standard operating procedures within a rapid mapping workflow. Building on this long-term rapid mapping experience, and on the DLR initiative to set in place an "International Working Group on Satellite Based Emergency Mapping", current trends in rapid mapping are discussed, along with thoughts on how the sharing of rapid mapping information can be optimized by harmonizing analysis results and data structures. 
Such harmonization of analysis procedures, nomenclatures, and representations of data and metadata is the basis for better cooperation within the global rapid mapping community across local/national, regional/supranational, and global scales.

  17. Cartographic research 1977

    USGS Publications Warehouse

    ,

    1978-01-01

    Two major subjects of the current research of the Topographic Division as reported here are related to policy decisions affecting the National Mapping Program of the Geological Survey. The adoption of a metric mapping policy has resulted in new cartographic products with associated changes in map design that require new looks in graphics and new equipment. The increasing use of digitized cartographic information has led to developments in data acquisition, processing, and storage and consequent changes in equipment and techniques. This report summarizes the activities in cartographic research and development for the 12-month period ending June 1977 and covers work done at the several facilities of the Topographic Division: the Western Mapping Center at Menlo Park, Calif., the Rocky Mountain Mapping Center at Denver, Colo., the Mid-Continent Mapping Center at Rolla, Mo., and the Eastern Mapping Center, the Special Mapping Center, the Office of Plans and Program Development, and the Office of Research and Technical Standards all at Reston, Va.

  18. Status and future of extraterrestrial mapping programs

    NASA Technical Reports Server (NTRS)

    Batson, R. M.

    1981-01-01

Extensive mapping programs have been completed for the Earth's Moon and for the planet Mercury. Mars, Venus, and the Galilean satellites of Jupiter (Io, Europa, Ganymede, and Callisto) are currently being mapped. The two Voyager spacecraft are expected to return data from which maps can be made of as many as six of the satellites of Saturn and two or more of the satellites of Uranus. The standard reconnaissance mapping scales used for the planets are 1:25,000,000 and 1:5,000,000; where the resolution of the data warrants, maps are compiled at the larger scales of 1:2,000,000, 1:1,000,000, and 1:250,000. Planimetric maps of a particular planet are compiled first, since the first spacecraft to visit a planet is not designed to return data from which elevations can be determined. As exploration becomes more intensive, more sophisticated missions return photogrammetric and other data that permit compilation of contour maps.

  19. WMS and WFS Standards Implementation of Weather Data

    NASA Astrophysics Data System (ADS)

    Armstrong, M.

    2005-12-01

CustomWeather is a private weather company that delivers global weather data products. CustomWeather has built a mapping platform according to OGC standards; currently, both a Web Map Service (WMS) and a Web Feature Service (WFS) are supported. Supporting open geospatial standards has led to a number of positive changes internally to the processes of CustomWeather, along with those of the clients accessing the data. Quite a number of challenges surfaced during this process, particularly with respect to combining a wide variety of raw modeling and sensor data into a single delivery platform. Open standards have, however, made the delivery of very different data products rather seamless. The discussion will address the issues faced in building an OGC-based mapping platform along with the limitations encountered. While the availability of these data products through open standards is still very young, there have already been many adopters in the utility and navigation industries. The discussion will take a closer look at the different approaches taken by these two industries as they apply interoperability standards to existing data. Insight will be given into applications already taking advantage of this new technology and how it is affecting decision-making processes. CustomWeather has observed considerable interest in, and potential benefit from, this technology in developing countries. Weather data is a key element in disaster management, and interoperability is literally opening up a world of data, with the potential to quickly enable functionality that would otherwise take considerable time to implement. The discussion will briefly touch on our experience.

  20. Colour segmentation of multi variants tuberculosis sputum images using self organizing map

    NASA Astrophysics Data System (ADS)

    Rulaningtyas, Riries; Suksmono, Andriyan B.; Mengko, Tati L. R.; Saptawati, Putri

    2017-05-01

Lung tuberculosis is still detected from Ziehl-Neelsen sputum smear images in low- and middle-income countries. Clinicians decide the grade of the disease by manually counting tuberculosis bacilli, which is very tedious given large numbers of patients and the lack of standardization in sputum staining. Because staining is not standardized, the sputum images vary widely in colour, making the bacilli difficult to identify. To help clinicians, this research examined the self-organizing map method for colour-based segmentation of sputum images via colour clustering. The method performed better than k-means clustering, which was also tried in this research. The self-organizing map segmented the sputum images with good results and clustered the colours adaptively.
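As a rough illustration of this kind of colour clustering, the sketch below trains a minimal one-dimensional self-organizing map over synthetic RGB pixel values and assigns each pixel to its nearest colour prototype. All data and parameters here are illustrative, not those of the study.

```python
import numpy as np

# Minimal 1-D self-organizing map over pixel colours: prototypes compete for
# each sample, and the winner and its grid neighbours move toward the sample.

rng = np.random.default_rng(0)
pixels = rng.random((500, 3))          # stand-in for RGB pixels in [0, 1]
nodes = rng.random((4, 3))             # 4 colour-cluster prototypes

steps = 200
for t in range(steps):
    lr = 0.5 * (1 - t / steps)             # decaying learning rate
    sigma = 2.0 * (1 - t / steps) + 0.1    # decaying neighbourhood width
    x = pixels[rng.integers(len(pixels))]  # random training sample
    winner = np.argmin(((nodes - x) ** 2).sum(axis=1))
    grid_dist = np.abs(np.arange(len(nodes)) - winner)
    h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))  # neighbourhood weights
    nodes += lr * h[:, None] * (x - nodes)

# Segment: label each pixel with its nearest prototype.
labels = np.argmin(((pixels[:, None, :] - nodes[None]) ** 2).sum(-1), axis=1)
print(np.bincount(labels, minlength=4))  # pixels per colour cluster
```

The neighbourhood update is what distinguishes the SOM from plain k-means: adjacent prototypes are pulled together, so the learned colour clusters vary smoothly along the map.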

  1. Optical frequency standard development in support of NASA's gravity-mapping missions

    NASA Technical Reports Server (NTRS)

    Klipstein, W. M.; Seidel, D. J.; White, J. A.; Young, B. C.

    2001-01-01

    We intend to combine the exquisite performance over short time scales coming from a cavity reference with the long-term stability of an atomic frequency standard with an eye towards reliability in a spaceflight application.

  2. National Pipeline Mapping System (NPMS) : repository standards

    DOT National Transportation Integrated Search

    1997-07-01

    This draft document contains 7 sections. They are as follows: 1. General Topics, 2. Data Formats, 3. Metadata, 4. Attribute Data, 5. Data Flow, 6. Descriptive Process, and 7. Validation and Processing of Submitted Data. These standards were created w...

  3. The Global Genome Biodiversity Network (GGBN) Data Standard specification

    PubMed Central

    Droege, G.; Barker, K.; Seberg, O.; Coddington, J.; Benson, E.; Berendsohn, W. G.; Bunk, B.; Butler, C.; Cawsey, E. M.; Deck, J.; Döring, M.; Flemons, P.; Gemeinholzer, B.; Güntsch, A.; Hollowell, T.; Kelbert, P.; Kostadinov, I.; Kottmann, R.; Lawlor, R. T.; Lyal, C.; Mackenzie-Dodds, J.; Meyer, C.; Mulcahy, D.; Nussbeck, S. Y.; O'Tuama, É.; Orrell, T.; Petersen, G.; Robertson, T.; Söhngen, C.; Whitacre, J.; Wieczorek, J.; Yilmaz, P.; Zetzsche, H.; Zhang, Y.; Zhou, X.

    2016-01-01

    Genomic samples of non-model organisms are becoming increasingly important in a broad range of studies, from developmental biology and biodiversity analyses to conservation. Genomic sample definition, description, quality, voucher information and metadata all need to be digitized and disseminated across scientific communities. This information needs to be concise and consistent in today's ever-expanding bioinformatics era, so that complementary data aggregators can easily map databases to one another. In order to facilitate exchange of information on genomic samples and their derived data, the Global Genome Biodiversity Network (GGBN) Data Standard is intended to provide a platform based on a documented agreement to promote the efficient sharing and usage of genomic sample material and associated specimen information in a consistent way. The new data standard presented here builds upon existing standards commonly used within the community, extending them with the capability to exchange data on tissue, environmental and DNA samples as well as sequences. The GGBN Data Standard will reveal and democratize the hidden contents of biodiversity biobanks, for the convenience of everyone in the wider biobanking community. Technical tools exist for data providers to easily map their databases to the standard. Database URL: http://terms.tdwg.org/wiki/GGBN_Data_Standard PMID:27694206

  4. The National Map seamless digital elevation model specifications

    USGS Publications Warehouse

    Archuleta, Christy-Ann M.; Constance, Eric W.; Arundel, Samantha T.; Lowe, Amanda J.; Mantey, Kimberly S.; Phillips, Lori A.

    2017-08-02

    This specification documents the requirements and standards used to produce the seamless elevation layers for The National Map of the United States. Seamless elevation data are available for the conterminous United States, Hawaii, Alaska, and the U.S. territories, in three different resolutions—1/3-arc-second, 1-arc-second, and 2-arc-second. These specifications include requirements and standards information about source data requirements, spatial reference system, distribution tiling schemes, horizontal resolution, vertical accuracy, digital elevation model surface treatment, georeferencing, data source and tile dates, distribution and supporting file formats, void areas, metadata, spatial metadata, and quality assurance and control.

  5. Integrating Map Algebra and Statistical Modeling for Spatio- Temporal Analysis of Monthly Mean Daily Incident Photosynthetically Active Radiation (PAR) over a Complex Terrain.

    PubMed

    Evrendilek, Fatih

    2007-12-12

    This study aims at quantifying spatio-temporal dynamics of monthly mean daily incident photosynthetically active radiation (PAR) over a vast and complex terrain such as Turkey. The spatial interpolation method of universal kriging, and the combination of multiple linear regression (MLR) models and map algebra techniques, were implemented to generate surface maps of PAR with a grid resolution of 500 x 500 m as a function of five geographical and 14 climatic variables. Performance of the geostatistical and MLR models was compared using mean prediction error (MPE), root-mean-square prediction error (RMSPE), average standard prediction error (ASE), mean standardized prediction error (MSPE), root-mean-square standardized prediction error (RMSSPE), and adjusted coefficient of determination (R² adj.). The best-fit MLR- and universal kriging-generated models of monthly mean daily PAR were validated against an independent 37-year observed dataset of 35 climate stations, derived from 160 stations across Turkey, by the Jackknifing method. The spatial variability patterns of monthly mean daily incident PAR were more accurately reflected in the surface maps created by the MLR-based models than in those created by the universal kriging method, in particular for spring (May) and autumn (November). The MLR-based spatial interpolation algorithms of PAR described in this study indicated the significance of the multifactor approach to understanding and mapping spatio-temporal dynamics of PAR for a complex terrain over meso-scales.
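    The paper's data and fitted coefficients are not available here; the following numpy sketch uses synthetic covariates to show how an MLR surface model of the kind described is fitted by ordinary least squares, and how two of the listed validation metrics, MPE and RMSPE, are computed:

    ```python
    import numpy as np

    # Synthetic stand-in data: each row is a grid cell, columns are covariates
    # (e.g. latitude, elevation, a few climate variables); values are assumed.
    rng = np.random.default_rng(1)
    X = rng.random((200, 5))
    beta_true = np.array([3.0, -1.5, 0.5, 2.0, 0.0])
    y = X @ beta_true + 10.0 + rng.normal(0.0, 0.1, 200)   # PAR-like response

    A = np.column_stack([np.ones(len(X)), X])              # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)           # ordinary least squares fit
    pred = A @ coef

    mpe = np.mean(pred - y)                    # mean prediction error (bias)
    rmspe = np.sqrt(np.mean((pred - y) ** 2))  # root-mean-square prediction error
    ```

    In the study's workflow the fitted equation would then be applied cell-by-cell to covariate rasters (the map algebra step) to produce the PAR surface.
    
    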

  6. A Servicewide Benthic Mapping Program for National Parks

    USGS Publications Warehouse

    Moses, Christopher S.; Nayegandhi, Amar; Beavers, Rebecca; Brock, John

    2010-01-01

    In 2007, the National Park Service (NPS) Inventory and Monitoring Program directed the initiation of a benthic habitat mapping program in ocean and coastal parks in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With 74 ocean and Great Lakes parks stretching over more than 5,000 miles of coastline across 26 States and territories, this Servicewide Benthic Mapping Program (SBMP) is essential. This program will deliver benthic habitat maps and their associated inventory reports to NPS managers in a consistent, servicewide format to support informed management and protection of 3 million acres of submerged National Park System natural and cultural resources. The NPS and the U.S. Geological Survey (USGS) convened a workshop June 3-5, 2008, in Lakewood, Colo., to discuss the goals and develop the design of the NPS SBMP with an assembly of experts (Moses and others, 2010) who identified park needs and suggested best practices for inventory and mapping of bathymetry, benthic cover, geology, geomorphology, and some water-column properties. The recommended SBMP protocols include servicewide standards (such as gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). SBMP Mapping Process. The SBMP calls for a multi-step mapping process for each park, beginning with a gap assessment and data mining to determine data resources and needs. An interagency announcement of intent to acquire new data will provide opportunities to leverage partnerships. Prior to new data acquisition, all involved parties should be included in a scoping meeting held at network scale. Data collection will be followed by processing and interpretation, and finally expert review and publication. After publication, all digital materials will be archived in a common format. SBMP Classification Scheme. 
The SBMP will map using the Coastal and Marine Ecological Classification Standard (CMECS) that is being modified to include all NPS needs, such as lacustrine ecosystems and submerged cultural resources. CMECS Version III (Madden and others, 2010) includes components for water column, biotic cover, surface geology, sub-benthic, and geoform. SBMP Data Archiving. The SBMP calls for the storage of all raw data and final products in common-use data formats. The concept of 'collect once, use often' is essential to efficient use of mapping resources. Data should also be shared with other agencies and the public through various digital clearing houses, such as Geospatial One-Stop (http://gos2.geodata.gov/wps/portal/gos). To be most useful for managing submerged resources, the SBMP advocates the inventory and mapping of the five components of marine ecosystems: surface geology, biotic cover, geoform, sub-benthic, and water column. A complete benthic inventory of a park would include maps of bathymetry and the five components of CMECS. The completion of mapping for any set of components, such as bathymetry and surface geology, or a particular theme (for example, submerged aquatic vegetation) should also include a printed report.

  7. Classification of fMRI resting-state maps using machine learning techniques: A comparative study

    NASA Astrophysics Data System (ADS)

    Gallos, Ioannis; Siettos, Constantinos

    2017-11-01

    We compare the efficiency of principal component analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and diffusion maps) for classifying brain maps between groups of schizophrenia patients and healthy controls from fMRI scans acquired during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial independent component analysis (ICA) to reduce (a) noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP and diffusion maps to find an embedded low-dimensional space. Finally, support vector machines (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
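    As a compact, self-contained stand-in for that pipeline (synthetic features in place of ICA cross-correlations, and a 1-NN rule in place of the SVM/k-NN evaluation), the embed-then-classify shape looks like this in plain numpy:

    ```python
    import numpy as np

    def pca_embed(X, k=2):
        """Project rows of X onto the top-k principal components."""
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T

    def loo_1nn_accuracy(Z, y):
        """Leave-one-out 1-nearest-neighbour classification accuracy."""
        hits = 0
        for i in range(len(Z)):
            d = np.linalg.norm(Z - Z[i], axis=1)
            d[i] = np.inf                      # exclude the held-out point itself
            hits += int(y[np.argmin(d)] == y[i])
        return hits / len(Z)

    rng = np.random.default_rng(0)
    # two synthetic groups ("patients" vs "controls") in a 40-D feature space
    X = np.vstack([rng.normal(0.0, 1.0, (30, 40)), rng.normal(1.0, 1.0, (30, 40))])
    y = np.array([0] * 30 + [1] * 30)
    acc = loo_1nn_accuracy(pca_embed(X, k=2), y)
    ```

    Swapping `pca_embed` for a nonlinear embedding (ISOMAP, diffusion maps) changes only the middle step; the classifier and evaluation loop stay the same, which is what makes the comparison in the paper clean.
    
    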

  8. Chesapeake Bay Low Freshwater Inflow Study. Phase II. MAP FOLIO. Biota Assessment.

    DTIC Science & Technology

    1982-05-01

    May 1982. Chesapeake Bay Low Freshwater Inflow Study, Phase II: Biota Assessment, Final Report Map Folio. The study examined freshwater inflow conditions, including: 1) Base Average, representing average freshwater inflow conditions; 2) a condition with inflow modified by increased water consumption projected for the year 2020; and 3) Base Drought.

  9. A universal method for automated gene mapping

    PubMed Central

    Zipperlen, Peder; Nairz, Knud; Rimann, Ivo; Basler, Konrad; Hafen, Ernst; Hengartner, Michael; Hajnal, Alex

    2005-01-01

    Small insertions or deletions (InDels) constitute a ubiquitous class of sequence polymorphisms found in eukaryotic genomes. Here, we present an automated high-throughput genotyping method that relies on the detection of fragment-length polymorphisms (FLPs) caused by InDels. The protocol utilizes standard sequencers and genotyping software. We have established genome-wide FLP maps for both Caenorhabditis elegans and Drosophila melanogaster that facilitate genetic mapping with a minimum of manual input and at comparatively low cost. PMID:15693948

  10. Emergency mapping and information management during Nepal Earthquake 2015 - Challenges and lesson learned

    NASA Astrophysics Data System (ADS)

    Joshi, G.; Gurung, D. R.

    2016-12-01

    A powerful 7.8-magnitude earthquake struck Nepal at 06:11 UTC on 25 April 2015. The earthquake and its subsequent aftershocks were the deadliest in Nepal's recent history. In total about 9,000 people died, 22,300 people were injured, and the lives of eight million people, almost one-third of the population of Nepal, were affected. The event led to a massive campaign to gather data and information on damage and loss using remote sensing, field inspection, and community surveys. Information on the distribution of relief materials was another important domain of information necessary for equitable relief distribution. Pre- and post-earthquake high-resolution satellite images helped in damage area assessment and mapping. Many national and international agencies became active to generate information and fill the information vacuum. The challenges included data-access bottlenecks due to a lack of good IT infrastructure; inconsistent products due to the absence of standard mapping guidelines; and dissemination challenges due to the absence of Standard Operating Protocols and a single information gateway. These challenges negated opportunities offered by improved earth observation data availability, increasing engagement of volunteers for emergency mapping, and centralized emergency coordination practice. This paper highlights critical practical challenges encountered during emergency mapping and information management during the earthquake in Nepal. There is a great need to address such challenges in order to effectively use the technological leverage that recent advances in space science, IT, and the mapping domain provide.

  11. Visualizing disease associations: graphic analysis of frequency distributions as a function of age using moving average plots (MAP) with application to Alzheimer's and Parkinson's disease.

    PubMed

    Payami, Haydeh; Kay, Denise M; Zabetian, Cyrus P; Schellenberg, Gerard D; Factor, Stewart A; McCulloch, Colin C

    2010-01-01

    Age-related variation in marker frequency can be a confounder in association studies, leading to both false-positive and false-negative findings and subsequently to inconsistent reproducibility. We have developed a simple method, based on a novel extension of moving average plots (MAP), which allows investigators to inspect the frequency data for hidden age-related variations. MAP uses the standard case-control association data and generates a birds-eye view of the frequency distributions across the age spectrum; a picture in which one can see if, how, and when the marker frequencies in cases differ from that in controls. The marker can be specified as an allele, genotype, haplotype, or environmental factor; and age can be age-at-onset, age when subject was last known to be unaffected, or duration of exposure. Signature patterns that emerge can help distinguish true disease associations from spurious associations due to age effects, age-varying associations from associations that are uniform across all ages, and associations with risk from associations with age-at-onset. Utility of MAP is illustrated by application to genetic and epidemiological association data for Alzheimer's and Parkinson's disease. MAP is intended as a descriptive method, to complement standard statistical techniques. Although originally developed for age patterns, MAP is equally useful for visualizing any quantitative trait.
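    The core of the MAP method is a sliding age window within which the marker frequency is computed separately for cases and controls. As a minimal numpy illustration (function name and window size are assumed, not the authors' code):

    ```python
    import numpy as np

    def moving_frequency(ages, carrier, window=10.0, grid=None):
        """Marker (carrier) frequency in a sliding age window, for one group."""
        ages = np.asarray(ages, dtype=float)
        carrier = np.asarray(carrier, dtype=bool)
        if grid is None:
            grid = np.arange(ages.min(), ages.max() + 1.0)
        freqs = []
        for a in grid:
            in_window = np.abs(ages - a) <= window / 2.0   # subjects inside the window
            freqs.append(carrier[in_window].mean() if in_window.any() else np.nan)
        return grid, np.array(freqs)
    ```

    Calling this once on cases and once on controls yields the two curves one would overlay to see if, how, and when the marker frequencies in cases diverge from those in controls across the age spectrum.
    
    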

  12. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    DTIC Science & Technology

    2016-05-01

    identifying and mapping flaw size distributions on glass surfaces for predicting mechanical response (International Journal of Applied Glass ...). ARL-TN-0756, May 2016, US Army Research Laboratory: Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation, by Clayton M Weiss, Oak Ridge Institute for Science and Education.

  13. Moving NSDC's Staff Development Standards into Practice: Innovation Configurations. Volume I

    ERIC Educational Resources Information Center

    National Staff Development Council, 2003

    2003-01-01

    NSDC's groundbreaking work in developing standards for staff development has now been joined by an equally important book that spells out exactly how those standards would look if they were being implemented by school districts. An Innovation Configuration map is a device that identifies and describes the major components of a new practice--in…

  14. Digital Mapping, Charting and Geodesy Data Standardization

    DTIC Science & Technology

    1994-12-19

    The primary objective of the audit was to evaluate DMA's implementation of the Defense Standardization Program. Specifically, the audit determined... interoperability of digital MC&G data. The audit also evaluated DMA's implementation of the DoD Internal Management Control Program as it pertains to DMA's implementation of the Defense Standardization Program.

  15. 77 FR 7489 - Small Business Size Standards: Professional, Technical, and Scientific Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... standards for the remaining industries in NAICS Sector 54. This rule also removes ``Map Drafting'' as the... for public comment in the Federal Register on March 16, 2011 (76 FR 14323), which proposed to increase the size standards for 35 industries and one sub-industry in NAICS Sector 54 and one industry in NAICS...

  16. A/E/C Graphics Standard: Release 2.0 (formerly titled CAD Drafting Standard)

    DTIC Science & Technology

    2015-08-01

    Tables: Table 2-1, ANSI and ISO sheet size comparison; Comparison of font types; Table 5-2, Inch... ...used (ISO A0 may be used for large maps). Table 2-1 lists the standard sizes of ANSI and ISO sheets.

  17. Development of a competency mapping tool for undergraduate professional degree programmes, using mechanical engineering as a case study

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.

  18. Specifications for updating USGS land use and land cover maps

    USGS Publications Warehouse

    Milazzo, Valerie A.

    1983-01-01

    To meet the increasing demands for up-to-date land use and land cover information, a primary goal of the U.S. Geological Survey's (USGS) national land use and land cover mapping program is to provide for periodic updating of maps and data in a timely and uniform manner. The technical specifications for updating existing USGS land use and land cover maps that are presented here cover both the interpretive aspects of detecting and identifying land use and land cover changes and the cartographic aspects of mapping and presenting the change data in conventional map format. They provide the map compiler with the procedures and techniques necessary to then use these change data to update existing land use and land cover maps in a manner that is both standardized and repeatable. Included are specifications for the acquisition of remotely sensed source materials, selection of compilation map bases, handling of data base corrections, editing and quality control operations, generation of map update products for USGS open file, and the reproduction and distribution of open file materials. These specifications are planned to become part of the National Mapping Division's Technical Instructions.

  19. iSOIL: Interactions between soil related sciences - Linking geophysics, soil science and digital soil mapping

    NASA Astrophysics Data System (ADS)

    Dietrich, Peter; Werban, Ulrike; Sauer, Uta

    2010-05-01

    High-resolution soil property maps are one major prerequisite for the specific protection of soil functions and restoration of degraded soils, as well as for sustainable land use, water and environmental management. To generate such maps, the combination of digital soil mapping approaches with remote and proximal soil sensing techniques is most promising. However, a feasible and reliable combination of these technologies for the investigation of large areas (e.g. catchments and landscapes) and the assessment of soil degradation threats is missing. Furthermore, there is insufficient dissemination of knowledge on digital soil mapping and proximal soil sensing in the scientific community, to relevant authorities, and to prospective users. One consequence is inadequate standardization of techniques. At the poster we present the EU collaborative project iSOIL, funded within the 7th Framework Programme of the European Commission. iSOIL focuses on improving fast and reliable mapping methods for soil properties, soil functions and soil degradation risks. This requires the improvement and integration of advanced soil sampling approaches, geophysical and spectroscopic measurement techniques, and pedometric and pedophysical approaches. The focus of the iSOIL project is to develop new, and improve existing, strategies and innovative methods for generating accurate, high-resolution soil property maps, while reducing costs compared to traditional soil mapping. iSOIL tackles these challenges by integrating three major components: (i) high-resolution, non-destructive geophysical (e.g. electromagnetic induction, EMI; ground-penetrating radar, GPR; magnetics; seismics) and spectroscopic (e.g. Near Surface Infrared, NIR) methods; (ii) concepts of digital soil mapping (DSM) and pedometrics; and (iii) optimized soil sampling with respect to profound soil-scientific and (geo)statistical strategies. 
A special focus of iSOIL lies on the sustainable dissemination of the technologies and concepts developed in the project, through workshops for stakeholders and the publication of a handbook, "Methods and Technologies for Mapping of Soil Properties, Function and Threat Risks". In addition, the CEN Workshop offers a new mechanism and approach to standardization. During the project we decided that the topic of the CEN Workshop should focus on voluntary standardization of electromagnetic induction measurements, to ensure that results can be evaluated and processed under uniform conditions and can be compared. At the poster we will also present the idea and objectives of our CEN Workshop, "Best Practice Approach for electromagnetic induction measurements of the near surface", and invite every interested person to participate.

  20. A comparison of two estimates of standard error for a ratio-of-means estimator for a mapped-plot sample design in southeast Alaska.

    Treesearch

    Willem W.S. van Hees

    2002-01-01

    Comparisons of estimated standard error for a ratio-of-means (ROM) estimator are presented for forest resource inventories conducted in southeast Alaska between 1995 and 2000. Estimated standard errors for the ROM were generated by using a traditional variance estimator and also approximated by bootstrap methods. Estimates of standard error generated by both...

  1. ScotlandsPlaces XML: Bespoke XML or XML Mapping?

    ERIC Educational Resources Information Center

    Beamer, Ashley; Gillick, Mark

    2010-01-01

    Purpose: The purpose of this paper is to investigate web services (in the form of parameterised URLs), specifically in the context of the ScotlandsPlaces project. This involves cross-domain querying, data retrieval and display via the development of a bespoke XML standard rather than existing XML formats and mapping between them.…

  2. Bridging Archival Standards: Building Software to Translate Metadata Between PDS3 and PDS4

    NASA Astrophysics Data System (ADS)

    De Cesare, C. M.; Padams, J. H.

    2018-04-01

    Transitioning datasets from PDS3 to PDS4 requires manual and detail-oriented work. To increase efficiency and reduce human error, we've built the Label Mapping Tool, which compares a PDS3 label to a PDS4 label template and outputs mappings between the two.
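    The abstract does not publish the tool's mapping table, so the keyword pairs below are illustrative assumptions only; they show the general shape of a keyword-to-field translation that flags anything it cannot map for manual review, which is the error-reduction idea behind the Label Mapping Tool:

    ```python
    # Hypothetical mapping entries (PDS3 keyword -> PDS4 attribute); the real
    # tool's table is not given in the abstract.
    PDS3_TO_PDS4 = {
        "PRODUCT_ID": "logical_identifier",
        "START_TIME": "start_date_time",
        "STOP_TIME": "stop_date_time",
    }

    def map_label(pds3_label: dict) -> dict:
        """Translate PDS3 label keywords; collect keywords needing manual review."""
        mapped, unmapped = {}, []
        for keyword, value in pds3_label.items():
            if keyword in PDS3_TO_PDS4:
                mapped[PDS3_TO_PDS4[keyword]] = value
            else:
                unmapped.append(keyword)   # left for a human to map by hand
        return {"mapped": mapped, "unmapped": unmapped}
    ```

    Keeping the translation in a single reviewed table, rather than re-deciding each field per dataset, is what makes the transition repeatable and less error-prone.
    
    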

  3. Construct Maps for the Road Ahead

    ERIC Educational Resources Information Center

    Bunch, Michael B.

    2013-01-01

    In this issue of "Measurement: Interdisciplinary Research and Perspectives," Adam E. Wyse provides a thorough review of research to date on the use of construct maps in standard setting. He juxtaposes concepts and methods in ways that make their connections to one another clearer and more obvious than they might otherwise have been. In…

  4. QTIMaps: A Model to Enable Web Maps in Assessment

    ERIC Educational Resources Information Center

    Navarrete, Toni; Santos, Patricia; Hernandez-Leo, Davinia; Blat, Josep

    2011-01-01

    Test-based e-Assessment approaches are mostly focused on the assessment of knowledge and not on that of other skills, which could be supported by multimedia interactive services. This paper presents the QTIMaps model, which combines the IMS QTI standard with web maps services enabling the computational assessment of geographical skills. We…

  5. 36 CFR 292.22 - Land category assignments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... availability of this map or maps in the local newspapers of record. (b) Changes in land category assignment.../grazing land so long as the intended use or development is consistent with the standards in § 292.23 and... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Land category assignments...

  6. Psychometric and Edumetric Validity of Dimensions of Geomorphological Knowledge Which Are Tapped by Concept Mapping.

    ERIC Educational Resources Information Center

    Hoz, Ron; Bowman, Dan; Chacham, Tova

    1997-01-01

    Students (N=14) in a geomorphology course took an objective geomorphology test, the tree construction task, and the Standardized Concept Structuring Analysis Technique (SConSAT) version of concept mapping. Results suggest that the SConSAT knowledge structure dimensions have moderate to good construct validity. Contains 82 references. (DDR)

  7. Curriculum Mapping in Higher Education: A Case Study and Proposed Content Scope and Sequence Mapping Tool

    ERIC Educational Resources Information Center

    Arafeh, Sousan

    2016-01-01

    Best practice in curriculum development and implementation requires that discipline-based standards or requirements embody both curricular and programme scopes and sequences. Ensuring these are present and aligned in course/programme content, activities and assessments to support student success requires formalised and systematised review and…

  8. Learning to Map and Mapping to Learn Our Students' Worlds

    ERIC Educational Resources Information Center

    Rubel, Laurie H.; Chu, Haiwen; Shookhoff, Lauren

    2011-01-01

    The National Council of Teachers of Mathematics (NCTM), through its Connections Standard, highlights the importance of "the opportunity for students to experience mathematics in a context." Seeing how mathematics can be used to describe real-world phenomena can motivate students to learn more mathematics. Connecting mathematics to the real world…

  9. Trail Orienteering: An Effective Way To Practice Map Interpretation.

    ERIC Educational Resources Information Center

    Horizons, 1999

    1999-01-01

    Discusses a type of orienteering developed in Great Britain to allow people with physical disabilities to compete on equal terms. Sites are viewed from a wheelchair-accessible main route. The main skill is interpreting the maps at each site, not finding the sites. Describes differences from standard orienteering, how sites work, and essential…

  10. Geospatial Augmented Reality for the interactive exploitation of large-scale walkable orthoimage maps in museums

    NASA Astrophysics Data System (ADS)

    Wüest, Robert; Nebiker, Stephan

    2018-05-01

    In this paper we present an app framework for augmenting large-scale walkable maps and orthoimages in museums or public spaces using standard smartphones and tablets. We first introduce a novel approach for using huge orthoimage mosaic floor prints covering several hundred square meters as natural Augmented Reality (AR) markers. We then present a new app architecture and subsequent tests in the Swissarena of the Swiss National Transport Museum in Lucerne demonstrating the capabilities of accurately tracking and augmenting different map topics, including dynamic 3D data such as live air traffic. The resulting prototype was tested with everyday visitors of the museum to get feedback on the usability of the AR app and to identify pitfalls when using AR in the context of a potentially crowded museum. The prototype is to be rolled out to the public after successful testing and optimization of the app. We were able to show that AR apps on standard smartphone devices can dramatically enhance the interactive use of large-scale maps for different purposes such as education or serious gaming in a museum context.

  11. Mapping of the corals around Hendorabi Island (Persian Gulf), using WorldView-2 standard imagery coupled with field observations.

    PubMed

    Kabiri, Keivan; Rezai, Hamid; Moradi, Masoud

    2018-04-01

    High-spatial-resolution WorldView-2 (WV2) satellite imagery coupled with field observations was utilized to map the coral reefs around Hendorabi Island in the northern Persian Gulf. Three standard multispectral bands (red, green, and blue) were selected to produce a classified map of benthic habitats. The in-situ observations included photo-transects taken by snorkeling at the water surface and the manta tow technique. The satellite image was classified using a support vector machine (SVM) classifier, with the information obtained from field measurements serving as both training and control points. The results obtained from the manta tow surveys demonstrated that the mean total live hard coral coverage was 29.04% ± 2.44% around the island. Massive poritiid corals (20.70%) and branching acroporiid corals (20.33%) showed higher live coral coverage than other corals. Moreover, the map produced from the satellite image illustrated the distribution of habitats with 78.1% overall accuracy.
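    The study's SVM classifier and imagery are not reproducible here; as a simplified numpy stand-in (nearest-centroid in place of the SVM, synthetic pixels in place of WV2 bands), supervised pixel classification from field training points and the overall-accuracy check look like this:

    ```python
    import numpy as np

    def train_centroids(pixels, labels):
        """Mean spectral signature (centroid) per benthic class from training pixels."""
        classes = np.unique(labels)
        centroids = np.array([pixels[labels == c].mean(axis=0) for c in classes])
        return classes, centroids

    def classify(pixels, classes, centroids):
        """Assign each pixel to the class with the nearest centroid in band space."""
        d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
        return classes[np.argmin(d, axis=1)]

    def overall_accuracy(predicted, truth):
        """Fraction of control pixels whose predicted class matches the field label."""
        return float(np.mean(predicted == truth))
    ```

    In the actual workflow the training pixels come from photo-transect and manta tow observations, and held-out field points play the role of `truth` in the accuracy computation.
    
    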

  12. Intra-operative multi-site stimulation: Expanding methodology for cortical brain mapping of language functions

    PubMed Central

    Korn, Akiva; Kirschner, Adi; Perry, Daniella; Hendler, Talma; Ram, Zvi

    2017-01-01

    Direct cortical stimulation (DCS) is considered the gold standard for functional cortical mapping during awake surgery for brain tumor resection. DCS is performed by stimulating one local cortical area at a time. We present a feasibility study of an intra-operative technique aimed at improving our ability to map brain functions that rely on activity in distributed cortical regions. Following standard DCS, multi-site stimulation (MSS) was performed in 15 patients by applying simultaneous cortical stimulations at multiple locations. Language functioning was chosen as the test-case cognitive domain due to its relatively well-known cortical organization. MSS, performed at sites that did not produce disruption when stimulated individually, revealed additional language dysfunction in 73% of the patients. Functional regions identified by this technique were presumed to be significant to language circuitry and were spared during surgery. No new neurological deficits were observed in any of the patients following surgery. Though the neuro-electrical effects of MSS need further investigation, this feasibility study may provide a first step towards the sophistication of intra-operative cortical mapping. PMID:28700619

  13. Mapping optical path length and image enhancement using quantitative orientation-independent differential interference contrast microscopy

    PubMed Central

    Shribak, Michael; Larkin, Kieran G.; Biggs, David

    2017-01-01

    We describe the principles of using orientation-independent differential interference contrast (OI-DIC) microscopy for mapping optical path length (OPL). Computation of the scalar two-dimensional OPL map is based on an experimentally obtained map of the OPL gradient vector field. Two methods of contrast enhancement for the OPL image, which reveal hardly visible structures and organelles, are presented. The results obtained can be used for reconstruction of a volume image. We have confirmed that a standard research grade light microscope equipped with the OI-DIC and a 100×/1.3 NA objective lens, which was not specially selected for minimum wavefront and polarization aberrations, provides an OPL noise level of ∼0.5 nm and a lateral resolution of ∼300 nm at a wavelength of 546 nm. The new technology is the next step in the development of DIC microscopy. It can replace standard DIC prisms on existing commercial microscope systems without modification. This will allow biological researchers who already have microscopy setups to expand the performance of their systems. PMID:28060991

  14. Seismicity of the Earth 1900-2007

    USGS Publications Warehouse

    Tarr, Arthur C.; Villaseñor, Antonio; Furlong, Kevin P.; Rhea, Susan; Benz, Harley M.

    2010-01-01

    This map illustrates more than one century of global seismicity in the context of global plate tectonics and the Earth's physiography. Primarily designed for use by earth scientists and engineers interested in earthquake hazards of the 20th and early 21st centuries, this map provides a comprehensive overview of strong earthquakes since 1900. The map clearly identifies the location of the 'great' earthquakes (M8.0 and larger) and the rupture area, if known, of the M8.3 or larger earthquakes. The earthquake symbols are scaled in proportion to the moment magnitude and therefore to the area of faulting, thus providing a better understanding of the relative sizes and distribution of earthquakes in the magnitude range 5.5 to 9.5. Plotting the known rupture area of the largest earthquakes also provides a better appreciation of the extent of some of the most famous and damaging earthquakes in modern history. All earthquakes shown on the map were carefully relocated using a standard earth reference model and standardized location procedures, thereby eliminating gross errors and biases in locations of historically important earthquakes that are often found in numerous seismicity catalogs.
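
    The symbol scaling described above rests on the standard moment-magnitude relation log10 M0 = 1.5 Mw + 9.1 (M0 in N·m). The sketch below uses it to size symbols, with a cube-root growth law chosen purely for illustration, not taken from the map's cartographic specification.

```python
# Sketch: scaling earthquake map symbols by seismic moment, which grows
# by a factor of 10**1.5 (~31.6x) per unit of moment magnitude.
def seismic_moment(mw: float) -> float:
    """Seismic moment M0 in N*m from moment magnitude: log10 M0 = 1.5*Mw + 9.1."""
    return 10 ** (1.5 * mw + 9.1)

def symbol_area(mw: float, ref_mw: float = 5.5, ref_area: float = 4.0) -> float:
    """Map-symbol area grown with the cube root of the moment ratio
    (an illustrative choice, not the USGS cartographic rule)."""
    return ref_area * (seismic_moment(mw) / seismic_moment(ref_mw)) ** (1.0 / 3.0)

# An M9.5 event carries roughly 178x the moment of an M8.0 event.
print(round(seismic_moment(9.5) / seismic_moment(8.0), 1))  # → 177.8
print(round(symbol_area(9.5) / symbol_area(5.5), 1))  # → 100.0
```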

  15. Intra-operative multi-site stimulation: Expanding methodology for cortical brain mapping of language functions.

    PubMed

    Gonen, Tal; Gazit, Tomer; Korn, Akiva; Kirschner, Adi; Perry, Daniella; Hendler, Talma; Ram, Zvi

    2017-01-01

    Direct cortical stimulation (DCS) is considered the gold-standard for functional cortical mapping during awake surgery for brain tumor resection. DCS is performed by stimulating one local cortical area at a time. We present a feasibility study using an intra-operative technique aimed at improving our ability to map brain functions which rely on activity in distributed cortical regions. Following standard DCS, Multi-Site Stimulation (MSS) was performed in 15 patients by applying simultaneous cortical stimulations at multiple locations. Language functioning was chosen as a case-cognitive domain due to its relatively well-known cortical organization. MSS, performed at sites that did not produce disruption when applied in a single stimulation point, revealed additional language dysfunction in 73% of the patients. Functional regions identified by this technique were presumed to be significant to language circuitry and were spared during surgery. No new neurological deficits were observed in any of the patients following surgery. Though the neuro-electrical effects of MSS need further investigation, this feasibility study may provide a first step towards sophistication of intra-operative cortical mapping.

  16. Smart "geomorphological" map browsing - a tale about geomorphological maps and the internet

    NASA Astrophysics Data System (ADS)

    Geilhausen, M.; Otto, J.-C.

    2012-04-01

    With the digital production of geomorphological maps, the dissemination of research outputs now extends beyond simple paper products. Internet technologies can contribute both to the dissemination of geomorphological maps and to access to geomorphological data, and help to make geomorphological knowledge available to a greater public. Indeed, many national geological surveys employ end-to-end digital workflows from data capture in the field to final map production and dissemination. This paper deals with the potential of web mapping applications and interactive, portable georeferenced PDF maps for the distribution of geomorphological information. Web mapping applications such as Google Maps have become very popular and widespread and have increased interest in, and access to, mapping. They link the Internet with GIS technology and are a common way of presenting dynamic maps online. The GIS processing is performed online and maps are visualised in interactive web viewers characterised by different capabilities such as zooming, panning or adding further thematic layers, with the map refreshed after each task. Depending on the system architecture and the components used, advanced symbology, map overlays from different applications and sources and their integration into a Desktop GIS are possible. This interoperability is achieved through the use of international open standards that include mechanisms for the integration and visualisation of information from multiple sources. The portable document format (PDF) is commonly used for printing and is a standard format that can be processed by many graphics programs and printers without loss of information. A GeoPDF enables the sharing of geospatial maps and data in PDF documents. Multiple, independent map frames with individual spatial reference systems are possible within a GeoPDF, for example, for map overlays or insets. 
Geospatial functionality of a GeoPDF includes scalable map display, layer visibility control, access to attribute data, coordinate queries and spatial measurements. The full functionality of GeoPDFs requires free and user-friendly plug-ins for PDF readers and GIS software. A GeoPDF enables fundamental GIS functionality, turning the formerly static PDF map into an interactive, portable georeferenced PDF map. GeoPDFs are easy to create and provide an interesting and valuable way to disseminate geomorphological maps. Our motivation to engage with the online distribution of geomorphological maps originates in the increasing number of web mapping applications available today, indicating that the Internet has become a medium for displaying geographical information in rich forms and user-friendly interfaces. So, why not use the Internet to distribute geomorphological maps and enhance their practical application? Web mapping and dynamic PDF maps can play a key role in the movement towards a global dissemination of geomorphological information. This will be exemplified by live demonstrations of i.) existing geomorphological WebGIS applications, ii.) data merging from various sources using web map services, and iii.) free-to-download GeoPDF maps during the presentations.
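
    The open-standards interoperability discussed above can be made concrete with an OGC WMS 1.3.0 GetMap request; the query parameters follow the WMS standard, while the endpoint and layer name below are hypothetical.

```python
# Sketch: building a WMS 1.3.0 GetMap URL for a hypothetical
# geomorphological layer. No network request is made; the point is the
# standardized parameter set that any compliant server understands.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, size=(800, 600), crs="EPSG:4326"):
    """Return a WMS GetMap URL; bbox is (min, min, max, max) in CRS axis order."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = getmap_url("https://example.org/wms", "geomorphology",
                 (46.8, 12.5, 47.2, 13.1))
print(url)
```

    Because the request is fully described by its parameters, a desktop GIS, a web viewer, or another server can each consume the same layer without coordination with the provider.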

  17. Targeted Recombinant Progeny: a design for ultra-high resolution mapping of Quantitative Trait Loci in crosses between inbred or pure lines.

    PubMed

    Heifetz, Eliyahu M; Soller, Morris

    2015-07-07

    High-resolution mapping of the loci (QTN) responsible for genetic variation in quantitative traits is essential for positional cloning of candidate genes, and for effective marker assisted selection. The confidence interval (QTL) flanking the point estimate of QTN-location is proportional to the number of individuals in the mapping population carrying chromosomes recombinant in the given interval. Consequently, many designs for high resolution QTN mapping are based on increasing the proportion of recombinants in the mapping population. The "Targeted Recombinant Progeny" (TRP) design is a new design for high resolution mapping of a target QTN in crosses between pure, or inbred lines. It is a three-generation procedure generating a large number of recombinant individuals within a QTL previously shown to contain a QTN. This is achieved by having individuals that carry chromosomes recombinant across the target QTL interval as parents of a large mapping population; most of whom will therefore carry recombinant chromosomes targeted to the given QTL. The TRP design is particularly useful for high resolution mapping of QTN that differentiate inbred or pure lines, and hence are not amenable to high resolution mapping by genome-wide association tests. In the absence of residual polygenic variation, population sizes required for achieving given mapping resolution by the TRP-F2 design relative to a standard F2 design ranged from 0.289 for a QTN with standardized allele substitution effect = 0.2, mapped to an initial QTL of 0.2 Morgan to 0.041 for equivalent QTN mapped to an initial QTL of 0.02 M. In the presence of residual polygenic variation, the relative effectiveness of the TRP design ranges from 1.068 to 0.151 for the same initial QTL intervals and QTN effect. Thus even in the presence of polygenic variation, the TRP can still provide major savings. 
Simulation showed that mapping by TRP should be based on 30-50 markers spanning the initial interval, and on at least 50 G2 families representing this number of recombination points. The TRP design can be an effective procedure for achieving high and ultra-high mapping resolution of a target QTN previously mapped to a known confidence interval (QTL).
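
    The dependence of resolution on recombinant counts can be made concrete with Haldane's mapping function, a standard genetics result not taken from this paper: for an interval of c Morgans with no interference, the recombinant gamete fraction is r = (1 − e^(−2c))/2, which is approximately c for small intervals.

```python
# Sketch: expected recombinants in a target interval under Haldane's
# mapping function (no interference assumed).
import math

def recomb_fraction(c_morgan: float) -> float:
    """Haldane mapping function: recombination fraction for an interval
    of length c in Morgans."""
    return 0.5 * (1.0 - math.exp(-2.0 * c_morgan))

def expected_recombinants(n_gametes: int, c_morgan: float) -> float:
    """Expected number of gametes recombinant in the target interval;
    the TRP design enriches the mapping population for exactly these."""
    return n_gametes * recomb_fraction(c_morgan)

# A 0.2 M initial QTL versus a 0.02 M one: ~10x fewer informative
# recombinants per gamete, which is why the relative advantage of TRP
# grows as the initial interval shrinks.
print(round(expected_recombinants(1000, 0.2), 1),
      round(expected_recombinants(1000, 0.02), 1))  # → 164.8 19.6
```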

  18. Numerical Aspects of Eigenvalue and Eigenfunction Computations for Chaotic Quantum Systems

    NASA Astrophysics Data System (ADS)

    Bäcker, A.

    Summary: We give an introduction to some of the numerical aspects in quantum chaos. The classical dynamics of two-dimensional area-preserving maps on the torus is illustrated using the standard map and a perturbed cat map. The quantization of area-preserving maps given by their generating function is discussed and for the computation of the eigenvalues a computer program in Python is presented. We illustrate the eigenvalue distribution for two types of perturbed cat maps, one leading to COE and the other to CUE statistics. For the eigenfunctions of quantum maps we study the distribution of the eigenvectors and compare them with the corresponding random matrix distributions. The Husimi representation allows for a direct comparison of the localization of the eigenstates in phase space with the corresponding classical structures. Examples for a perturbed cat map and the standard map with different parameters are shown. Billiard systems and the corresponding quantum billiards are another important class of systems (which are also relevant to applications, for example in mesoscopic physics). We provide a detailed exposition of the boundary integral method, which is one important method to determine the eigenvalues and eigenfunctions of the Helmholtz equation. We discuss several methods to determine the eigenvalues from the Fredholm equation and illustrate them for the stadium billiard. The occurrence of spurious solutions is discussed in detail and illustrated for the circular billiard, the stadium billiard, and the annular sector billiard. We emphasize the role of the normal derivative function to compute the normalization of eigenfunctions, momentum representations or autocorrelation functions in a very efficient and direct way. Some examples for these quantities are given and discussed.
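
    The classical dynamics referred to above can be reproduced in a few lines; this is the Chirikov standard map itself on the torus, with the starting point and kicking strength chosen only for illustration.

```python
# Sketch: iterating the classical Chirikov standard map,
#   p' = p + K*sin(x) (mod 2*pi),  x' = x + p' (mod 2*pi),
# the area-preserving torus map whose quantization the text discusses.
import math

def standard_map(x, p, K):
    p_new = (p + K * math.sin(x)) % (2 * math.pi)
    x_new = (x + p_new) % (2 * math.pi)
    return x_new, p_new

def orbit(x0, p0, K, n):
    pts = [(x0, p0)]
    for _ in range(n):
        pts.append(standard_map(*pts[-1], K))
    return pts

# For small K the dynamics is near-integrable; for K ~ 5 the phase space
# is dominated by a large chaotic zone.
traj = orbit(0.5, 0.5, K=5.0, n=1000)
print(len(traj))  # → 1001
```

    Plotting many such orbits for different initial conditions reproduces the familiar mixed phase-space portraits against which Husimi distributions of the eigenstates are compared.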

  19. A Web-based Visualization System for Three Dimensional Geological Model using Open GIS

    NASA Astrophysics Data System (ADS)

    Nemoto, T.; Masumoto, S.; Nonogaki, S.

    2017-12-01

    A three dimensional geological model is important information in various fields such as environmental assessment, urban planning, resource development, waste management and disaster mitigation. In this study, we have developed a web-based visualization system for 3D geological models using free and open source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer plays the role of mapping horizontal cross sections of the 3D geological model and a topographic map. GRASS provides the core components for management, analysis and image processing of the geological model. Online access to GRASS functions has been enabled using PyWPS, an implementation of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model. These images are delivered via the OGC WMS (Web Map Service) and WPS standards. Horizontal cross sections are overlaid on the topographic map. A vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram. The user can visualize them from various angles by mouse operation. WebGL is utilized for 3D visualization; it is a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software. The geological boundary surfaces can be downloaded to incorporate the geologic structure into CAD designs and models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.
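
    The vertical cross-section function described above reduces to sampling each geological boundary surface along the line between the two clicked points. The following numpy sketch uses nearest-neighbour sampling on a synthetic grid; it is an illustration of the idea, not the system's actual GRASS-based implementation.

```python
# Sketch: sampling a gridded boundary surface along a user-drawn line to
# build a vertical cross section, assuming a regular grid.
import numpy as np

def cross_section(surface, start, end, n=100):
    """surface: 2-D elevation array; start/end: (row, col) endpoints."""
    rows = np.linspace(start[0], end[0], n)
    cols = np.linspace(start[1], end[1], n)
    return surface[rows.round().astype(int), cols.round().astype(int)]

# Synthetic dipping surface on a 50x50 grid: elevation drops with row index.
r, c = np.mgrid[0:50, 0:50]
surface = 100.0 - 0.5 * r
profile = cross_section(surface, start=(0, 0), end=(49, 49))
print(profile[0], profile[-1])  # → 100.0 75.5
```

    Repeating this for every boundary surface in the model and stacking the profiles yields the layered vertical section that the system renders as an image.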

  20. Temporal similarity perfusion mapping: A standardized and model-free method for detecting perfusion deficits in stroke

    PubMed Central

    Song, Sunbin; Luby, Marie; Edwardson, Matthew A.; Brown, Tyler; Shah, Shreyansh; Cox, Robert W.; Saad, Ziad S.; Reynolds, Richard C.; Glen, Daniel R.; Cohen, Leonardo G.; Latour, Lawrence L.

    2017-01-01

    Introduction Interpretation of the extent of perfusion deficits in stroke MRI is highly dependent on the method used for analyzing the perfusion-weighted signal intensity time-series after gadolinium injection. In this study, we introduce a new model-free standardized method of temporal similarity perfusion (TSP) mapping for perfusion deficit detection and test its ability and reliability in acute ischemia. Materials and methods Forty patients with an ischemic stroke or transient ischemic attack were included. Two blinded readers compared real-time generated interactive maps and automatically generated TSP maps to traditional TTP/MTT maps for the presence of perfusion deficits. Lesion volumes were compared for volumetric inter-rater reliability, spatial concordance between perfusion deficits and healthy tissue, and contrast-to-noise ratio (CNR). Results Perfusion deficits were correctly detected in all patients with acute ischemia. Inter-rater reliability was higher for TSP than for TTP/MTT maps. The Pearson's correlation between lesion volumes calculated on TSP and traditional maps was high (r(18) = 0.73, p<0.0003); however, the effective CNR was greater for TSP than for TTP (352.3 vs 283.5, t(19) = 2.6, p<0.03) and MTT (228.3, t(19) = 2.8, p<0.03). Discussion TSP maps provide a reliable and robust model-free method for accurate perfusion deficit detection and improve lesion delineation compared to traditional methods. This simple method is also computationally faster and more easily automated than model-based methods. It can potentially improve the speed and accuracy of perfusion deficit detection for acute stroke treatment and clinical trial inclusion decision-making. PMID:28973000
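
    One way to realize a model-free temporal similarity measure, sketched here as an illustration rather than the authors' exact TSP algorithm, is to correlate each voxel's bolus-passage time-series with a reference curve; delayed passage then shows up as low similarity.

```python
# Sketch: voxelwise temporal similarity as Pearson correlation with a
# reference curve. The bolus-passage curves are synthetic Gaussians.
import numpy as np

def similarity_map(data, reference):
    """Pearson correlation of each voxel time-series (voxels x T) with a
    reference curve; low similarity suggests delayed or absent perfusion."""
    d = data - data.mean(axis=1, keepdims=True)
    r = reference - reference.mean()
    num = d @ r
    den = np.sqrt((d ** 2).sum(axis=1) * (r ** 2).sum())
    return num / den

t = np.arange(40)
healthy = np.exp(-0.5 * ((t - 15) / 4.0) ** 2)   # normal bolus passage
delayed = np.exp(-0.5 * ((t - 25) / 4.0) ** 2)   # delayed passage (deficit)
data = np.vstack([healthy, delayed])
sim = similarity_map(data, reference=healthy)
print(sim.round(2))
```

    No gamma-variate or deconvolution model is fitted anywhere, which is what makes such similarity measures fast and easy to automate relative to model-based TTP/MTT estimation.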

  1. Ergodic theory and visualization. II. Fourier mesochronic plots visualize (quasi)periodic sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levnajić, Zoran; Department of Mechanical Engineering, University of California Santa Barbara, Santa Barbara, California 93106; Mezić, Igor

    We present an application and analysis of a visualization method for measure-preserving dynamical systems introduced by I. Mezić and A. Banaszuk [Physica D 197, 101 (2004)], based on frequency analysis and Koopman operator theory. This extends our earlier work on visualization of the ergodic partition [Z. Levnajić and I. Mezić, Chaos 20, 033114 (2010)]. Our method employs the concept of the Fourier time average [I. Mezić and A. Banaszuk, Physica D 197, 101 (2004)], and is realized as a computational algorithm for visualization of periodic and quasi-periodic sets in the phase space. The complement of the periodic phase-space partition contains the chaotic zone, and we show how to identify it. The range of the method's applicability is illustrated using the well-known Chirikov standard map, while its potential in illuminating higher-dimensional dynamics is presented by studying the Froeschlé map and the Extended Standard Map.
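
    The Fourier time average at the heart of the method is f*_ω(x0) = (1/N) Σ_{n=0}^{N−1} f(Tⁿx0) e^{−2πiωn}. The sketch below evaluates it along orbits of the Chirikov standard map; the observable and parameter values are chosen for illustration only.

```python
# Sketch of the Fourier time average underlying mesochronic plots:
#   f*_w(x0) = (1/N) * sum_n f(T^n x0) * exp(-2j*pi*w*n),
# evaluated along orbits of the Chirikov standard map.
import cmath
import math

def step(x, p, K):
    p = (p + K * math.sin(x)) % (2 * math.pi)
    x = (x + p) % (2 * math.pi)
    return x, p

def fourier_time_average(x0, p0, K, w, N=2000, f=lambda x, p: math.cos(x)):
    x, p, acc = x0, p0, 0.0 + 0.0j
    for n in range(N):
        acc += f(x, p) * cmath.exp(-2j * math.pi * w * n)
        x, p = step(x, p, K)
    return acc / N

# Along a chaotic orbit the average with w != 0 decays toward zero as N
# grows; a large modulus instead flags (quasi)periodic structure at
# frequency w, which is what the mesochronic plots colour-code.
val = fourier_time_average(0.5, 0.5, K=5.0, w=0.25)
print(abs(val))
```

    Scanning a grid of initial conditions (x0, p0) and plotting |f*_ω| over the torus produces the visualizations of periodic and quasi-periodic sets described in the abstract.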

  2. Cross-mapping clinical notes between hospitals: an application of the LOINC Document Ontology.

    PubMed

    Li, Li; Morrey, C Paul; Baorto, David

    2011-01-01

    Standardization of document titles is essential for management as the volume of electronic clinical notes increases. The two campuses of the New York Presbyterian Hospital have over 2,700 distinct document titles. The LOINC Document Ontology (DO) provides a standard for the naming of clinical documents in a multi-axis structure. We have represented the latest LOINC DO structure in the MED, and developed an automated process mapping the clinical documents from both the West (Columbia) and East (Cornell) campuses to the LOINC DO. We find that the LOINC DO can represent the majority of our documents, and about half of the documents map between campuses using the LOINC DO as a reference. We evaluated the possibility of using current LOINC codes in document exchange between different institutions. While there is clear success in the ability of the LOINC DO to represent documents and facilitate exchange, we find there are granularity issues.
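
    The multi-axis DO structure suggests a simple cross-campus matching strategy: treat each title's axis values as a composite key and join on it. The axis subset and all titles below are invented illustrations, not NYP or LOINC content.

```python
# Sketch: using a multi-axis document code as a cross-campus join key.
# Axis names follow the LOINC Document Ontology in spirit; titles and
# axis values are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class DocCode:
    kind: str             # e.g. "Note"
    type_of_service: str  # e.g. "Consultation"
    setting: str          # e.g. "Inpatient"
    role: str             # e.g. "Physician"

west = {"Cardiology Consult Note":
        DocCode("Note", "Consultation", "Inpatient", "Physician")}
east = {"Inpt Cardiology Consultation":
        DocCode("Note", "Consultation", "Inpatient", "Physician")}

# Two locally distinct titles map to each other when their codes agree.
matches = [(w, e) for w, cw in west.items()
           for e, ce in east.items() if cw == ce]
print(matches)
```

    The granularity issues noted in the abstract arise precisely when one campus's title needs a finer axis value than the other campus records, so the composite keys no longer compare equal.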

  3. Ergodic theory and visualization. II. Fourier mesochronic plots visualize (quasi)periodic sets.

    PubMed

    Levnajić, Zoran; Mezić, Igor

    2015-05-01

    We present an application and analysis of a visualization method for measure-preserving dynamical systems introduced by I. Mezić and A. Banaszuk [Physica D 197, 101 (2004)], based on frequency analysis and Koopman operator theory. This extends our earlier work on visualization of the ergodic partition [Z. Levnajić and I. Mezić, Chaos 20, 033114 (2010)]. Our method employs the concept of the Fourier time average [I. Mezić and A. Banaszuk, Physica D 197, 101 (2004)], and is realized as a computational algorithm for visualization of periodic and quasi-periodic sets in the phase space. The complement of the periodic phase-space partition contains the chaotic zone, and we show how to identify it. The range of the method's applicability is illustrated using the well-known Chirikov standard map, while its potential in illuminating higher-dimensional dynamics is presented by studying the Froeschlé map and the Extended Standard Map.

  4. Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.
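
    Classification by predefined spectral limits amounts to simple band thresholding. The MSS band numbers below follow LANDSAT convention (band 5 red, band 7 near-infrared), but the threshold values are invented, not the package's calibrated limits.

```python
# Sketch: flagging surface water by predefined spectral limits, in the
# spirit of the DAM package. Water absorbs strongly in the near-IR, so
# pixels dark in both the red and near-IR bands are classified as water.
import numpy as np

def classify_water(band5, band7, limits=(30, 15)):
    """Return a boolean water mask; limits are (red, near-IR) DN ceilings."""
    return (band5 < limits[0]) & (band7 < limits[1])

b5 = np.array([[10, 80],
               [25, 90]])
b7 = np.array([[5, 60],
               [12, 70]])
print(classify_water(b5, b7))
```

    Shipping sensor-specific limits for LANDSAT-1 through LANDSAT-3, as the package does, removes the need for per-scene training while keeping the classification step a cheap per-pixel comparison.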

  5. Index of flood maps prepared by the U.S. Geological Survey through 1973

    USGS Publications Warehouse

    Carrigan, Philip Hadley

    1974-01-01

    A listing is presented of flood maps prepared by the U.S. Geological Survey through 1973. Maps are listed by State and county, and the list provides information on the type of flooding depicted and the reliability of the delineation. The list was prepared from a computer file, and an available program allows retrieval of data by land-line location, State and county, and Standard Metropolitan Statistical Area (SMSA). The file will be continuously updated.

  6. Investigation of the agricultural resources in Sri Lanka

    NASA Technical Reports Server (NTRS)

    Silva, A. T. M.; Nanayakkara, S. D. F. C.; Herath, L. S. K. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. It is observed that LANDSAT data is easily adaptable to photogrammetric techniques. With such adaptations, revision of topographic or thematic maps can be performed at very little cost. Revision of maps up to scale 1:100,000 (or better) can be performed. The LANDSAT image has definite advantages over the standard methods in areas of extensive development where the synoptic view of the LANDSAT image offers the required control in the form of distant mapped data in one frame.

  7. A critical assessment on the role of sentinel node mapping in endometrial cancer.

    PubMed

    Bogani, Giorgio; Ditto, Antonino; Martinelli, Fabio; Signorelli, Mauro; Perotto, Stefania; Lorusso, Domenica; Raspagliesi, Francesco

    2015-10-01

    Endometrial cancer is the most common gynecologic malignancy in the developed countries. Despite the high incidence of this disease, no consensus about the role of retroperitoneal staging yet exists. Growing evidence supports the safety and efficacy of sentinel lymph node mapping. This technique is emerging as a new standard for endometrial cancer staging procedures. In the present paper, we discuss the role of sentinel lymph node mapping in endometrial cancer, highlighting the most controversial features.

  8. Geodatabase model for global geologic mapping: concept and implementation in planetary sciences

    NASA Astrophysics Data System (ADS)

    Nass, Andrea

    2017-04-01

    One aim of the NASA Dawn mission is to generate global geologic maps of the asteroid Vesta and the dwarf planet Ceres. To accomplish this, the Dawn Science Team followed the technical recommendations for cartographic basemap production. The geological mapping campaign of Vesta was completed and published, but mapping of the dwarf planet Ceres is still ongoing. The tiling schema for the geological mapping is the same for both planetary bodies; for Ceres it is divided into two parts: four overview quadrangles (Survey Orbit, 415 m/pixel) and 15 more detailed quadrangles (High Altitude Mapping Orbit, HAMO, 140 m/pixel). The first global geologic map was based on Survey images (415 m/pixel). The four Survey quadrangles combined, complemented by HAMO data, served as the basis for generating a more detailed view of the geologic history and also for defining the chronostratigraphy and time scale of the dwarf planet. The most detailed view can be expected within the 15 mapping quadrangles based on HAMO resolution and complemented by the Low Altitude Mapping Orbit (LAMO) data at 35 m/pixel. One responsible mapper was assigned to the interpretative mapping process of each quadrangle. Unifying the geological mapping of each quadrangle and bringing this together into regionally and globally valid statements is already a very time intensive task. However, another challenge that has to be met is to consider how the 15 individual mappers can generate one homogeneous GIS-based project (w.r.t. geometrical and visual character) and thus produce a geologically-consistent final map. Our approach to this challenge was already discussed for the mapping of Vesta. To accommodate the map requirements regarding rules for data storage and database management, the computer-based GIS environment used for the interpretative mapping process must be designed so that it can be adjusted to the unique features of the individual investigation areas. 
Within this contribution, the template is presented that uses standards for digitizing, visualization, data merging and synchronization in the interpretative mapping process. Following new technological innovations within GIS software and the individual requirements for mapping Ceres, the template was developed based on the symbology and framework. The template for GIS-based mapping presented here directly links the generically descriptive attributes of planetary objects to a predefined and standardized symbology in one data structure. Using this template, the map results are more comparable and better controllable. Furthermore, merging and synchronization of the individual maps, map projects and sheets will be far more efficient. The template can be adapted to any other planetary body and to future discovery missions (e.g., Lucy and Psyche, which were selected by NASA to explore the early solar system) for generating reusable map results.

  9. Geological mapping goes 3-D in response to societal needs

    USGS Publications Warehouse

    Thorleifson, H.; Berg, R.C.; Russell, H.A.J.

    2010-01-01

    The transition to 3-D mapping has been made possible by technological advances in digital cartography, GIS, data storage, analysis, and visualization. Despite various challenges, technological advancements facilitated a gradual transition from 2-D maps to 2.5-D draped maps to 3-D geological mapping, supported by digital spatial and relational databases that can be interrogated horizontally or vertically and viewed interactively. Challenges associated with data collection, human resources, and information management are daunting due to their resource and training requirements. The exchange of strategies at the workshops has highlighted the use of basin analysis to develop a process-based predictive knowledge framework that facilitates data integration. Three-dimensional geological information meets a public demand that fills in the blanks left by conventional 2-D mapping. Two-dimensional mapping will, however, remain the standard method for extensive areas of complex geology, particularly where deformed igneous and metamorphic rocks defy attempts at 3-D depiction.

  10. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    PubMed

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.
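
    Representing cluster ties as a weighted directed graph makes network measures such as out-strength immediate; this networkx sketch uses invented cluster names and weights standing in for relationships derived from a concept-mapping exercise.

```python
# Sketch: quantifying the strength and directionality of ties among
# concept-map clusters as a weighted directed graph, assuming networkx.
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("Staffing", "Training", 0.8),
    ("Training", "Quality", 0.6),
    ("Staffing", "Quality", 0.3),
])

# Out-strength: total weight of ties a cluster sends to the others, one
# simple way to rank clusters as drivers in a strategic-planning map.
strength = {n: sum(d["weight"] for _, _, d in G.out_edges(n, data=True))
            for n in G.nodes}
print(round(strength["Staffing"], 2))  # → 1.1
```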

  11. Frameshift Suppression in SACCHAROMYCES CEREVISIAE VI. Complete Genetic Map of Twenty-Five Suppressor Genes

    PubMed Central

    Gaber, Richard F.; Mathison, Lorilee; Edelman, Irv; Culbertson, Michael R.

    1983-01-01

    Five previously unmapped frameshift suppressor genes have been located on the yeast genetic map. In addition, we have further characterized the map positions of two suppressors whose approximate locations were determined in an earlier study. These results represent the completion of genetic mapping studies on all 25 of the known frameshift suppressor genes in yeast.—The approximate location of each suppressor gene was initially determined through the use of a set of mapping strains containing 61 signal markers distributed throughout the yeast genome. Standard meiotic linkage was assayed in crosses between strains carrying the suppressors and the mapping strains. Subsequent to these approximate linkage determinations, each suppressor gene was more precisely located in multi-point crosses. The implications of these mapping results for the genomic distribution of frameshift suppressor genes, which include both glycine and proline tRNA genes, are discussed. PMID:17246112

  12. Integrating in-situ, Landsat, and MODIS data for mapping in Southern African savannas: experiences of LCCS-based land-cover mapping in the Kalahari in Namibia.

    PubMed

    Hüttich, Christian; Herold, Martin; Strohbach, Ben J; Dech, Stefan

    2011-05-01

    Integrated ecosystem assessment initiatives are important steps towards a global biodiversity observing system. Reliable earth observation data are key information for tracking biodiversity change on various scales. Regarding the establishment of standardized environmental observation systems, a key question is: what can be observed on each scale, and how can land cover information be transferred between scales? In this study, a land cover map of a dry semi-arid savanna ecosystem in Namibia was obtained based on the UN LCCS, in-situ data, and MODIS and Landsat satellite imagery. In situ botanical relevé samples were used as baseline data for the definition of a standardized LCCS legend. A standard LCCS code for savanna vegetation types is introduced. An object-oriented segmentation of Landsat imagery was used as an intermediate stage for downscaling in-situ training data to the coarse MODIS resolution. MODIS time series metrics of the growing season 2004/2005 were used to classify Kalahari vegetation types using a tree-based ensemble classifier (Random Forest). The prevailing Kalahari vegetation type based on the LCCS was open broadleaved deciduous shrubland with an herbaceous layer, which differs from the class assignments of the global and regional land-cover maps. The separability analysis based on Bhattacharyya distance measurements applied on two LCCS levels indicated a relationship between the spectral mapping dependencies of annual MODIS time series features and the thematic detail of the classification scheme. The analysis of LCCS classifiers showed an increased significance of life-form composition and soil conditions to the mapping accuracy. An overall accuracy of 92.48% was achieved. Woody plant associations proved to be most stable due to small omission and commission errors. The case study comprised a first suitability assessment of the LCCS classifier approach for a southern African savanna ecosystem.
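
    The time-series-metrics-to-classes step can be sketched with scikit-learn's Random Forest; the two phenology-like features and both classes below are synthetic stand-ins, not the study's MODIS metrics or LCCS legend.

```python
# Sketch: tree-ensemble classification of per-pixel time-series metrics,
# in the spirit of the MODIS land-cover mapping described above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Two invented metrics per pixel: NDVI amplitude and season length (days).
shrubland = rng.normal([0.25, 120], [0.03, 10], size=(60, 2))
grassland = rng.normal([0.45, 180], [0.03, 10], size=(60, 2))
X = np.vstack([shrubland, grassland])
y = np.array([0] * 60 + [1] * 60)  # 0 = shrubland, 1 = grassland

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
pred = clf.predict([[0.26, 118], [0.44, 182]])
print(pred)  # → [0 1]
```

    In the actual study the feature vectors were annual MODIS time-series metrics and the labels came from LCCS-coded relevé samples downscaled through the Landsat segmentation.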

  13. Modern Data Center Services Supporting Science

    NASA Astrophysics Data System (ADS)

    Varner, J. D.; Cartwright, J.; McLean, S. J.; Boucher, J.; Neufeld, D.; LaRocque, J.; Fischman, D.; McQuinn, E.; Fugett, C.

    2011-12-01

    The National Oceanic and Atmospheric Administration's National Geophysical Data Center (NGDC) World Data Center for Geophysics and Marine Geology provides scientific stewardship, products and services for geophysical data, including bathymetry, gravity, magnetics, seismic reflection, data derived from sediment and rock samples, as well as historical natural hazards data (tsunamis, earthquakes, and volcanoes). Although NGDC has long made many of its datasets available through map and other web services, it has now developed a second generation of services to improve data discovery and access. These new services use off-the-shelf commercial and open source software, and take advantage of modern JavaScript and web application frameworks. Services are accessible using both RESTful and SOAP queries as well as Open Geospatial Consortium (OGC) standard protocols such as WMS, WFS, WCS, and KML. These new map services (implemented using ESRI ArcGIS Server) are finer-grained than their predecessors, feature improved cartography, and offer dramatic speed improvements through the use of map caches. Using standards-based interfaces allows customers to incorporate the services without having to coordinate with the provider. Providing fine-grained services increases flexibility for customers building custom applications. The Integrated Ocean and Coastal Mapping program and the Coastal and Marine Spatial Planning program are two examples of national initiatives that require common data inventories from multiple sources and benefit from these modern data services. NGDC is also consuming its own services, providing a set of new browser-based mapping applications that allow the user to quickly visualize and search for data. One example is a new interactive mapping application to search and display information about historical natural hazards.
    NGDC continues to increase the amount of its data holdings that are accessible, and is augmenting its capabilities with modern web application frameworks such as Groovy and Grails. Data discovery is being improved and simplified by leveraging ISO metadata standards along with ESRI Geoportal Server.

  14. Fact Sheets and Additional Information Regarding the Primary National Ambient Air Quality Standard (NAAQS) for Sulfur Dioxide

    EPA Pesticide Factsheets

    Find tools for primary standards for Sulfur Dioxide, maps of nonattainment areas, an overview of the proposal, projected nonattainment areas for 2020, and a presentation on the 2011 SO2 primary NAAQS revision.

  15. A technique for determining the deuterium/hydrogen contrast map in neutron macromolecular crystallography.

    PubMed

    Chatake, Toshiyuki; Fujiwara, Satoru

    2016-01-01

    A difference in the neutron scattering length between hydrogen and deuterium leads to a high density contrast in neutron Fourier maps. In this study, a technique for determining the deuterium/hydrogen (D/H) contrast map in neutron macromolecular crystallography is developed and evaluated using ribonuclease A. The contrast map between the D2O-solvent and H2O-solvent crystals is calculated in real space, rather than in reciprocal space as in previous neutron D/H contrast crystallography. The present technique can thus utilize all of the amplitudes of the neutron structure factors for both D2O-solvent and H2O-solvent crystals. The neutron D/H contrast maps clearly demonstrate that H/D exchange in proteins can be detected with high sensitivity. In fact, alternative protonation states and alternative conformations of hydroxyl groups are observed at medium resolution (1.8 Å). Moreover, water molecules can be categorized into three types according to their tendency towards rotational disorder. These results directly indicate an improvement in neutron crystal structure analysis. This technique is suitable for incorporation into the standard structure-determination process used in neutron protein crystallography; consequently, more precise and efficient determination of the D-atom positions is possible using a combination of this D/H contrast technique and standard neutron structure-determination protocols.
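
    The real-space contrast calculation described here can be sketched in a few lines. The fragment below is illustrative only; the per-map normalization and array names are assumptions, not the authors' actual procedure, which operates on experimentally phased neutron density maps.

```python
import numpy as np

def dh_contrast_map(rho_d2o, rho_h2o):
    """Voxel-wise real-space D/H contrast between two neutron density
    maps sampled on the same grid (illustrative sketch only)."""
    rho_d2o = np.asarray(rho_d2o, dtype=float)
    rho_h2o = np.asarray(rho_h2o, dtype=float)
    if rho_d2o.shape != rho_h2o.shape:
        raise ValueError("maps must be sampled on one common grid")
    # Normalize each map to zero mean and unit sigma so the subtraction
    # is not dominated by overall scale differences between crystals.
    norm = lambda m: (m - m.mean()) / m.std()
    return norm(rho_d2o) - norm(rho_h2o)
```

    Positive voxels in the result then flag density stronger in the D2O-solvent map, i.e. candidate exchanged D sites.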

  16. High-dynamic range imaging techniques based on both color-separation algorithms used in conventional graphic arts and the human visual perception modeling

    NASA Astrophysics Data System (ADS)

    Lo, Mei-Chun; Hsieh, Tsung-Hsien; Perng, Ruey-Kuen; Chen, Jiong-Qiao

    2010-01-01

    The aim of this research is to derive illuminant-independent HDR imaging modules that can optimally reconstruct, multispectrally, every color of concern in high-dynamic-range original images for preferable cross-media color reproduction applications. Each module, based on either a broadband or a multispectral approach, incorporates models of perceptual HDR tone mapping and device characterization. In this study, an xvYCC-format HDR digital camera was used to capture HDR test scenes. A tone-mapping module was derived based on a multiscale representation of the human visual system, using equations similar to the Michaelis-Menten photoreceptor adaptation equation. Additionally, an adaptive bilateral gamut-mapping algorithm, using a previously derived multiple-converging-points approach, was incorporated with or without adaptive unsharp masking (USM) to optimize HDR image rendering. An LCD with the Adobe RGB (D65) standard color space was used as a soft-proofing platform to display HDR original RGB images and to evaluate both the rendition quality and the prediction performance of the derived modules. Another LCD with the sRGB standard color space was used to test the gamut-mapping algorithms integrated with the derived tone-mapping module.

  17. Evaluation of MRI sequences for quantitative T1 brain mapping

    NASA Astrophysics Data System (ADS)

    Tsialios, P.; Thrippleton, M.; Glatz, A.; Pernet, C.

    2017-11-01

    T1 mapping constitutes a quantitative MRI technique with significant application in brain imaging. It allows evaluation of contrast uptake, blood perfusion, and volume, providing a more specific biomarker of disease progression than conventional T1-weighted images. While there are many techniques for T1 mapping, there is a wide range of reported T1 values in tissues, raising the issue of protocol reproducibility and standardization. The gold standard for obtaining T1 maps is the IR-SE sequence. Widely used alternative sequences are IR-SE-EPI, VFA (DESPOT), DESPOT-HIFI and MP2RAGE, which speed up scanning and fitting procedures. A custom MRI phantom was used to assess the reproducibility and accuracy of the different methods. All scans were performed using a 3T Siemens Prisma scanner. The acquired data were processed using two different codes. The main difference was observed for VFA (DESPOT), which grossly overestimated the T1 relaxation time, by 214 ms [126, 270], compared to the IR-SE sequence. The MP2RAGE and DESPOT-HIFI sequences gave slightly shorter times than IR-SE (~20 to 30 ms) and can be considered alternative, time-efficient methods for acquiring accurate T1 maps of the human brain, while IR-SE-EPI gave identical results at the cost of lower image quality.
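
    The VFA (DESPOT1) method compared above rests on a linearization of the standard SPGR signal equation: S/sin(α) = E1·S/tan(α) + M0(1 − E1), with E1 = exp(−TR/T1), so T1 = −TR/ln(slope). A minimal sketch of that fit, on simulated noise-free signals with illustrative parameter values, might look like:

```python
import math

def spgr_signal(t1_ms, tr_ms, alpha_deg, m0=1.0):
    """Steady-state SPGR signal model used by the VFA (DESPOT1) method."""
    e1 = math.exp(-tr_ms / t1_ms)
    a = math.radians(alpha_deg)
    return m0 * math.sin(a) * (1 - e1) / (1 - e1 * math.cos(a))

def vfa_t1(signals, alphas_deg, tr_ms):
    """Estimate T1 via the DESPOT1 linearization: regress S/sin(a)
    against S/tan(a); the slope is E1 and T1 = -TR / ln(slope)."""
    xs = [s / math.tan(math.radians(a)) for s, a in zip(signals, alphas_deg)]
    ys = [s / math.sin(math.radians(a)) for s, a in zip(signals, alphas_deg)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -tr_ms / math.log(slope)
```

    With noiseless model signals the linearization is exact and T1 is recovered perfectly; in practice, noise and flip-angle (B1) errors are what bias VFA estimates in the way the study reports.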

  18. A comprehensive neuropsychological mapping battery for functional magnetic resonance imaging.

    PubMed

    Karakas, Sirel; Baran, Zeynel; Ceylan, Arzu Ozkan; Tileylioglu, Emre; Tali, Turgut; Karakas, Hakki Muammer

    2013-11-01

    Existing batteries for FMRI do not precisely meet the criteria for comprehensive mapping of cognitive functions within minimum data acquisition times using standard scanners and head coils. The goal was to develop a battery of neuropsychological paradigms for FMRI that can also be used in other brain imaging techniques and behavioural research. Participants were 61 healthy, young adult volunteers (48 females and 13 males, mean age: 22.25 ± 3.39 years) from the university community. The battery included 8 paradigms for basic (visual, auditory, sensory-motor, emotional arousal) and complex (language, working memory, inhibition/interference control, learning) cognitive functions. Imaging was performed using standard functional imaging capabilities (1.5-T MR scanner, standard head coil). Structural and functional data series were analysed using Brain Voyager QX2.9 and Statistical Parametric Mapping-8. For basic processes, activation centres for individuals were within a distance of 3-11 mm of the group centres of the target regions and for complex cognitive processes, between 7 mm and 15 mm. Based on fixed-effect and random-effects analyses, the distance between the activation centres was 0-4 mm. There was spatial variability between individual cases; however, as shown by the distances between the centres found with fixed-effect and random-effects analyses, the coordinates for individual cases can be used to represent those of the group. The findings show that the neuropsychological brain mapping battery described here can be used in basic science studies that investigate the relationship of the brain to the mind and also as functional localiser in clinical studies for diagnosis, follow-up and pre-surgical mapping. © 2013.

  19. A medical device-grade T1 and ECV phantom for global T1 mapping quality assurance-the T1 Mapping and ECV Standardization in cardiovascular magnetic resonance (T1MES) program.

    PubMed

    Captur, Gabriella; Gatehouse, Peter; Keenan, Kathryn E; Heslinga, Friso G; Bruehl, Ruediger; Prothmann, Marcel; Graves, Martin J; Eames, Richard J; Torlasco, Camilla; Benedetti, Giulia; Donovan, Jacqueline; Ittermann, Bernd; Boubertakh, Redha; Bathgate, Andrew; Royet, Celine; Pang, Wenjie; Nezafat, Reza; Salerno, Michael; Kellman, Peter; Moon, James C

    2016-09-22

    T1 mapping and extracellular volume (ECV) have the potential to guide patient care and serve as surrogate end-points in clinical trials, but measurements differ between cardiovascular magnetic resonance (CMR) scanners and pulse sequences. To help deliver T1 mapping to global clinical care, we developed a phantom-based quality assurance (QA) system for verification of measurement stability over time at individual sites, with further aims of generalization of results across sites, vendor systems, software versions and imaging sequences. We thus created T1MES: The T1 Mapping and ECV Standardization Program. A design collaboration consisting of a specialist MRI small-medium enterprise, clinicians, physicists and national metrology institutes was formed. A phantom was designed covering clinically relevant ranges of T1 and T2 in blood and myocardium, pre and post-contrast, for 1.5 T and 3 T. Reproducible mass manufacture was established. The device received regulatory clearance by the Food and Drug Administration (FDA) and Conformité Européenne (CE) marking. The T1MES phantom is an agarose gel-based phantom using nickel chloride as the paramagnetic relaxation modifier. It was reproducibly specified and mass-produced with a rigorously repeatable process. Each phantom contains nine differently-doped agarose gel tubes embedded in a gel/beads matrix. Phantoms were free of air bubbles and susceptibility artifacts at both field strengths, and T1 maps were free from off-resonance artifacts. The incorporation of high-density polyethylene beads in the main gel fill was effective at flattening the B1 field. T1 and T2 values measured in T1MES showed coefficients of variation of 1 % or less between repeat scans, indicating good short-term reproducibility. Temperature dependency experiments confirmed that over the range 15-30 °C the short-T1 tubes were more stable with temperature than the long-T1 tubes. 
    A batch of 69 phantoms was mass-produced, with random sampling of ten of these showing coefficients of variation for T1 of 0.64 ± 0.45 % and 0.49 ± 0.34 % at 1.5 T and 3 T respectively. The T1MES program has developed a T1 mapping phantom to CE/FDA manufacturing standards. An initial 69 phantoms with a multi-vendor user manual are now being scanned fortnightly in centers worldwide. Future results will explore T1 mapping sequences, platform performance, stability and the potential for standardization.

  20. A Comparative Study of Hawaii Middle School Science Student Academic Achievement

    NASA Astrophysics Data System (ADS)

    Askew Cain, Peggy

    The problem was that middle-grade students with specific learning disabilities (SWDs) in reading comprehension perform less well than their peers on standardized assessments. The purpose of this quantitative comparative study was to examine the effect of electronic concept maps on the reading comprehension of eighth grade students with SWDs in a Grade 8 science class at a Hawaii middle school on the island of Oahu. The target population consisted of Grade 8 science students for school year 2015-2016. The sampling method was purposeful sampling, with a final sample size of 338 Grade 8 science students. De-identified archival records of Grade 8 Hawaii standardized science test scores were analyzed using a one-way analysis of variance (ANOVA) in SPSS. The finding for hypothesis 1 indicated a significant difference in student achievement between SWDs and SWODs as measured by Hawaii State Assessment (HSA) science scores (p < 0.05), and for hypothesis 2, a significant difference in instructional modality between SWDs who used concept maps and those who did not, as measured by the HSA in science (p < 0.05). The implications of the findings were that (a) SWDs performed less well in science achievement than their peers, and (b) SWODs appeared to retain more science knowledge and answered more questions correctly than SWDs as a result of reading comprehension. Recommendations for practice were directed at educational leadership: (a) teachers should practice using concept maps with SWDs as a specific reading strategy to support reading comprehension in science classes, (b) concept map construction should involve a strong focus on vocabulary building and concept building, because constructing concept maps sometimes requires frontloading of vocabulary, and (c) leaders should model for teachers how concept maps are created and explain their educational purpose as a tool for learning. 
Recommendations for future research were to conduct (a) a quantitative comparative study between groups for academic achievement of subtests mean scores of SWDs and SWODs in physical science, earth science, and space science, and (b) a quantitative correlation study to examine relationships and predictive values for academic achievement of SWDs and concept map integration on standardized science assessments.

  1. A gene-signature progression approach to identifying candidate small-molecule cancer therapeutics with connectivity mapping.

    PubMed

    Wen, Qing; Kim, Chang-Sik; Hamilton, Peter W; Zhang, Shu-Dong

    2016-05-11

    Gene expression connectivity mapping has gained much popularity recently, with a number of successful applications in biomedical research testifying to its utility and promise. Previous methodological research in connectivity mapping has focused mainly on two of the key components in the framework, namely the reference gene expression profiles and the connectivity mapping algorithms. The other key component, the query gene signature, has been left to users to construct without much consensus on how this should be done, although it is the issue most relevant to end users. As a key input to the connectivity mapping process, the gene signature is crucially important for returning biologically meaningful and relevant results. This paper formulates a standardized procedure for constructing high quality gene signatures from a user's perspective. We describe a two-stage process for making quality gene signatures using gene expression data as initial inputs. First, a differential gene expression analysis compares two distinct biological states; only the genes that pass stringent statistical criteria are considered in the second stage of the process, which involves ranking genes based on statistical as well as biological significance. We introduce a "gene signature progression" method as a standard procedure in connectivity mapping. Starting from the highest ranked gene, we progressively determine the minimum length of the gene signature that allows connections to the reference profiles (drugs) to be established with a preset target false discovery rate. We use a lung cancer dataset and a breast cancer dataset as two case studies to demonstrate how this standardized procedure works, and we show that highly relevant and interesting biological connections are returned. Of particular note is gefitinib, identified as among the candidate therapeutics in our lung cancer case study. 
Our gene signature was based on gene expression data from Taiwan female non-smoker lung cancer patients, while there is evidence from independent studies that gefitinib is highly effective in treating women, non-smoker or former light smoker, advanced non-small cell lung cancer patients of Asian origin. In summary, we introduced a gene signature progression method into connectivity mapping, which enables a standardized procedure for constructing high quality gene signatures. This progression method is particularly useful when the number of differentially expressed genes identified is large, and when there is a need to prioritize them to be included in the query signature. The results from two case studies demonstrate that the approach we have developed is capable of obtaining pertinent candidate drugs with high precision.
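
    The progression idea lends itself to a very small sketch. Here `connection_fdr` is a hypothetical callback standing in for a full connectivity-mapping run against the reference profiles; its name, the minimum signature length, and the default FDR threshold are all assumptions for illustration, not the paper's implementation.

```python
def progress_signature(ranked_genes, connection_fdr, target_fdr=0.05,
                       min_len=5):
    """Grow the query signature from the top-ranked genes until a
    scoring routine reports connections at or below the target FDR.

    ranked_genes   -- genes ordered by statistical/biological rank
    connection_fdr -- callable(signature) -> FDR of the connections
                      established against the reference profiles
    """
    for length in range(min_len, len(ranked_genes) + 1):
        signature = ranked_genes[:length]
        if connection_fdr(signature) <= target_fdr:
            return signature  # minimum-length signature meeting target
    return None  # no signature length achieved the target FDR
```

    The loop makes explicit why the method helps when many genes are differentially expressed: the signature stops growing as soon as the connections it establishes are statistically trustworthy.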

  2. CytometryML, an XML format based on DICOM and FCS for analytical cytology data.

    PubMed

    Leif, Robert C; Leif, Suzanne B; Leif, Stephanie H

    2003-07-01

    Flow Cytometry Standard (FCS) was initially created to standardize the software researchers use to analyze, transmit, and store data produced by flow cytometers and sorters. Because of the clinical utility of flow cytometry, it is necessary to have a standard consistent with the requirements of medical regulatory agencies. We extended the existing mapping of FCS to the Digital Imaging and Communications in Medicine (DICOM) standard to include list-mode data produced by flow cytometry, laser scanning cytometry, and microscopic image cytometry. FCS list-mode was mapped to the DICOM Waveform Information Object. We created a collection of Extensible Markup Language (XML) schemas to express the DICOM analytical cytologic text-based data types except for large binary objects. We also developed a cytometry markup language, CytometryML, in an open environment subject to continuous peer review. The feasibility of expressing the data contained in FCS, including list-mode in DICOM, was demonstrated; and a preliminary mapping for list-mode data in the form of XML schemas and documents was completed. DICOM permitted the creation of indices that can be used to rapidly locate in a list-mode file the cells that are members of a subset. DICOM and its coding schemes for other medical standards can be represented by XML schemas, which can be combined with other relevant XML applications, such as Mathematical Markup Language (MathML). The use of XML format based on DICOM for analytical cytology met most of the previously specified requirements and appears capable of meeting the others; therefore, the present FCS should be retired and replaced by an open, XML-based, standard CytometryML. Copyright 2003 Wiley-Liss, Inc.

  3. Maps for the nation: The current federal mapping establishment

    USGS Publications Warehouse

    North, G.W.

    1983-01-01

    The U.S. Government annually produces an estimated 53,000 new maps and charts and distributes about 160 million copies. A large number of these maps are produced under the national mapping program, a decentralized Federal/State cooperative approach to mapping the country at standard scales. Circular A-16, issued by the Office of Management and Budget in 1953 and revised in 1967, delegates the mapping responsibilities to various federal agencies. The U.S. Department of the Interior's Geological Survey is the principal federal agency responsible for implementing the national mapping program. Other major federal map producing agencies include the Departments of Agriculture, Commerce, Defense, Housing and Urban Development, and Transportation, and the Tennessee Valley Authority. To make maps and mapping information more readily available, the National Cartographic Information Center was established in 1974 and an expanded National Map Library Depository Program in 1981. The most recent of many technological advances made under the mapping program are in the areas of digital cartography and video disc and optical disc information storage systems. Future trends and changes in the federal mapping program will involve expanded information and customer service operations, further developments in the production and use of digital cartographic data, and consideration of a Federal Mapping Agency. ?? 1983.

  4. SpaceWire Plug and Play

    NASA Technical Reports Server (NTRS)

    Rakow, Glenn; McGuirk, Patrick; Kimmery, Clifford; Jaffe, Paul

    2006-01-01

    The ability to rapidly deploy inexpensive satellites to meet tactical goals has become an important objective for military space systems. In fact, Operationally Responsive Space (ORS) has been in the spotlight at the highest levels. The Office of the Secretary of Defense (OSD) has identified that the critical next step is developing bus standards and modular interfaces. Historically, satellite components have been constructed based on bus standards and standardized interfaces. However, this has not been done to a degree that would allow the rapid deployment of a satellite. Advancements in plug-and-play (PnP) technologies for terrestrial applications can serve as a baseline model for a PnP approach for satellite applications. Since SpaceWire (SpW) has become a de facto standard for satellite high-speed (greater than 200 Mbps) on-board communications, it has become important for SpW to adapt to this PnP environment. Because SpW is simply a bulk transport protocol and lacks built-in PnP features, several changes are required to facilitate PnP with SpW. The first is for the host(s) to determine what the network looks like, i.e., how the pieces of the network (routers and nodes) are connected together (network mapping), and to receive notice of changes to the network. The second is for the components connected to the network to be understood so that they can communicate. The first element, network topology mapping and change-of-status indication, is being defined and is the topic of this paper. The second element, describing how components are to communicate, has been defined by AFRL with the electronic data sheets known as XTEDS. The first element, network mapping, is the subject of recent activities by the Air Force Research Lab (AFRL), the Naval Research Lab (NRL), NASA and US industry (Honeywell, Clearwater, FL, and others). 
    This work has resulted in the development of a protocol that will perform the lower-level functions of network mapping and change-of-status (COS) indication required by plug-and-play over SpaceWire. This work will be presented to the SpaceWire working group for standardization under the European Cooperation for Space Standardization (ECSS) and to obtain a permanent protocol ID (see "SpaceWire Protocol ID: What Does It Mean to You", IEEE Aerospace Conference 2006). The portion of the plug-and-play protocol described in this paper is how the host(s) of a SpaceWire network map the network and detect additions and deletions of devices on the network.
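
    At its core, host-driven network mapping is a breadth-first traversal outward from the host through routers and nodes. The sketch below illustrates only that generic idea; the `neighbors` probe is a hypothetical stand-in for SpaceWire port enumeration and is not the PnP protocol itself.

```python
from collections import deque

def map_network(root, neighbors):
    """Breadth-first discovery of a network topology starting at `root`.

    neighbors(node) -- hypothetical probe returning the identifiers of
                       devices reachable through each of `node`'s ports.
    Returns {node: sorted list of directly connected nodes}.
    """
    topology = {}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        if node in topology:
            continue  # already probed; avoids loops in meshed networks
        topology[node] = sorted(neighbors(node))
        queue.extend(topology[node])
    return topology
```

    Change-of-status detection can then be framed as re-running the probe and diffing the resulting topology dictionaries, which is conceptually what the COS indication automates in hardware.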

  5. TRENDS IN ENGINEERING GEOLOGIC AND RELATED MAPPING.

    USGS Publications Warehouse

    Varnes, David J.; Keaton, Jeffrey R.

    1983-01-01

    Progress made during the period 1972-1982 in producing medium- and small-scale engineering geologic maps with a variety of content is reviewed. Improved methods to obtain and present information are evolving. Standards concerning text and map content, soil and rock classification, and map symbols have been proposed. Application of geomorphological techniques in terrain evaluation has increased, as has the use of aerial photography and other remote sensing. Computers are being used to store, analyze, retrieve, and print both text and map information. Development of offshore resources, especially petroleum, has led to marked improvement and growth in marine engineering geology and geotechnology. Coordinated planning for societal needs has required broader scope and increased complexity of both engineering geologic and environmental geologic studies.

  6. A financial market model with two discontinuities: Bifurcation structures in the chaotic domain

    NASA Astrophysics Data System (ADS)

    Panchuk, Anastasiia; Sushko, Iryna; Westerhoff, Frank

    2018-05-01

    We continue the investigation of a one-dimensional piecewise linear map with two discontinuity points. Such a map may arise from a simple asset-pricing model with heterogeneous speculators, which can help explain the intricate bull and bear behavior of financial markets. Our focus is on bifurcation structures observed in the chaotic domain of the map's parameter space, which is associated with robust multiband chaotic attractors. Such structures, related to the map with two discontinuities, have not been studied before. We show that besides the standard bandcount adding and bandcount incrementing bifurcation structures associated with two partitions, there exist peculiar bandcount adding and bandcount incrementing structures involving all three partitions. Moreover, the map's three partitions may generate intriguing bistability phenomena.
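
    A one-dimensional piecewise linear map with two discontinuity points can be iterated in a few lines. The slopes and offsets below are illustrative stand-ins, not the paper's asset-pricing parameters; they are chosen only so that orbits remain bounded and the three branches (three partitions) are visible.

```python
def two_discontinuity_map(x, d=0.2, s=1.3):
    """A 1D piecewise linear map with discontinuities at -d and +d.
    The three linear branches correspond to the map's three partitions.
    (Hypothetical parameters for illustration only.)"""
    if x < -d:
        return s * x + 1.0   # left branch
    elif x <= d:
        return s * x         # middle branch
    else:
        return s * x - 1.0   # right branch

def orbit(x0, n, f=two_discontinuity_map):
    """Iterate the map n times from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(n):
        xs.append(f(xs[-1]))
    return xs
```

    Plotting long orbits of such a map over a sweep of one parameter is the usual way the multiband chaotic attractors and their bandcount adding/incrementing structures are visualized.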

  7. Extracting and standardizing medication information in clinical text - the MedEx-UIMA system.

    PubMed

    Jiang, Min; Wu, Yonghui; Shah, Anushi; Priyanka, Priyanka; Denny, Joshua C; Xu, Hua

    2014-01-01

    Extraction of medication information embedded in clinical text is important for research using electronic health records (EHRs). However, most current medication information extraction systems identify drug and signature entities without mapping them to a standard representation. In this study, we introduced an open source Java implementation of MedEx, an existing high-performance medication information extraction system, based on the Unstructured Information Management Architecture (UIMA) framework. In addition, we developed new encoding modules in the MedEx-UIMA system, which map an extracted drug name/dose/form to both generalized and specific RxNorm concepts and translate drug frequency information to the ISO standard. We processed 826 documents with both systems and verified that MedEx-UIMA and MedEx (the Python version) performed similarly by comparing their results. Using two manually annotated test sets that contained 300 drug entries from medication lists and 300 drug entries from narrative reports, the MedEx-UIMA system achieved F-measures of 98.5% and 97.5% respectively for encoding drug names to corresponding RxNorm generic drug ingredients, and F-measures of 85.4% and 88.1% respectively for mapping drug names/dose/form to the most specific RxNorm concepts. It also achieved an F-measure of 90.4% for normalizing frequency information to the ISO standard. The open source MedEx-UIMA system is freely available online at http://code.google.com/p/medex-uima/.
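
    Frequency normalization of the kind evaluated above can be illustrated with a small lookup from Latin dosing abbreviations to ISO 8601 repeating-interval strings. The table and function names are hypothetical and far coarser than MedEx-UIMA's actual encoding module; the ISO codes shown follow the standard's `R/PnD`/`R/PTnH` syntax.

```python
# Hypothetical normalization table in the spirit of MedEx-UIMA's
# frequency-encoding module (illustrative, not the system's output).
FREQ_TO_ISO = {
    "qd":  "R/P1D",    # once daily
    "bid": "R/PT12H",  # twice daily -> every 12 hours
    "tid": "R/PT8H",   # three times daily
    "qid": "R/PT6H",   # four times daily
    "qw":  "R/P1W",    # once weekly
}

def normalize_freq(text):
    """Map a raw frequency token like 'b.i.d.' to an ISO 8601
    repeating-interval string, or None if unrecognized."""
    key = text.lower().replace(".", "").replace(" ", "")
    return FREQ_TO_ISO.get(key)
```

    Real clinical text also carries free-form frequencies ("every other day", "prn"), which is why the published F-measure for this step (90.4%) is below that of drug-name encoding.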

  8. Moving NSDC's Staff Development Standards into Practice: Innovation Configurations, Volume II. [CD-ROMs

    ERIC Educational Resources Information Center

    National Staff Development Council, 2005

    2005-01-01

    The second volume of "Moving NSDC's Staff Development Standards into Practice: Innovation Configurations" builds on the work that began with the first volume published in 2003. An Innovation Configuration map is a device that identifies and describes the major components of a new practice such as the standards and details of how it would look in…

  9. The Times They Are A-Changing: The Influence of Railroad Technology on the Adoption of Standard Time Zones in 1883.

    ERIC Educational Resources Information Center

    Allen, Nathaniel

    2000-01-01

    Presents the story of the role railroad technology had in the adoption of Standard Time Zones in 1883 and also considers the influence of astronomers at the time. Includes the map of the standard railway time used by W. F. Allen and an annotated bibliography with primary and secondary sources. (CMK)

  10. The Impact of Stability Balls, Activity Breaks, and a Sedentary Classroom on Standardized Math Scores

    ERIC Educational Resources Information Center

    Mead, Tim; Scibora, Lesley

    2016-01-01

    The purpose of the study was to determine if standardized math test scores improve by administering different types of exercise during math instruction. Three sixth grade classes were assessed on the Measures of Academic Progress (MAP) and the Minnesota Comprehensive Assessment (MCA) standardized math tests during the 2012 and 2013 academic year.…

  11. Embracing Challenges in Times of Change: A Survey of the Readiness of Academic Librarians in New Jersey for Transition to the ACRL Framework

    ERIC Educational Resources Information Center

    Charles, Leslin H.

    2017-01-01

    Many academic librarians in the state of New Jersey (NJ) have successfully integrated information literacy (IL) into the curriculum using the ACRL IL Competency Standards for Higher Education ("Standards"). These "Standards" formed the underpinnings of IL curriculum mapping and assessment plans, and have been adopted by…

  12. Improving Low-Dose Blood-Brain Barrier Permeability Quantification Using Sparse High-Dose Induced Prior for Patlak Model

    PubMed Central

    Fang, Ruogu; Karlsson, Kolbeinn; Chen, Tsuhan; Sanelli, Pina C.

    2014-01-01

    Blood-brain-barrier permeability (BBBP) measurements extracted from perfusion computed tomography (PCT) using the Patlak model can be a valuable indicator to predict hemorrhagic transformation in patients with acute stroke. Unfortunately, standard Patlak model based PCT requires excessive radiation exposure, which has raised radiation safety concerns. Minimizing radiation dose is of high value in clinical practice but can degrade image quality due to the severe noise introduced. The purpose of this work is to construct high quality BBBP maps from low-dose PCT data by using the brain structural similarity between different individuals and the relations between the high- and low-dose maps. The proposed sparse high-dose induced (shd-Patlak) model works by building a high-dose induced prior for the Patlak model with a set of location-adaptive dictionaries, followed by an optimized estimation of the BBBP map with the prior-regularized Patlak model. Evaluation with simulated low-dose clinical brain PCT datasets clearly demonstrates that the shd-Patlak model achieves more significant gains than the standard Patlak model, with improved visual quality, higher fidelity to the gold standard and more accurate details for clinical analysis. PMID:24200529
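
    The standard Patlak step that the shd-Patlak model regularizes is an ordinary linear regression on the Patlak plot: the tissue-to-arterial concentration ratio is regressed against the cumulative arterial integral divided by the arterial concentration, and the slope estimates permeability. The sketch below is a minimal, assumption-laden version (no delay/dispersion correction, noise-free curves), not the paper's regularized estimator.

```python
import numpy as np

def patlak_fit(t, ca, ct):
    """Per-voxel Patlak plot fit.

    t  -- time points (s)
    ca -- arterial input concentration over time
    ct -- tissue concentration over time
    Returns (slope, intercept): slope ~ permeability, intercept ~
    fractional blood volume in the Patlak formulation.
    """
    ca = np.asarray(ca, float)
    ct = np.asarray(ct, float)
    # Patlak abscissa: running integral of ca, normalized by ca(t)
    x = np.array([np.trapz(ca[:i + 1], t[:i + 1])
                  for i in range(len(t))]) / ca
    y = ct / ca                       # Patlak ordinate
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept
```

    At low dose the noise in `ct` makes this per-voxel regression unstable, which is exactly the failure mode the high-dose induced prior is designed to suppress.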

  13. Rapid exploration of configuration space with diffusion-map-directed molecular dynamics.

    PubMed

    Zheng, Wenwei; Rohrdanz, Mary A; Clementi, Cecilia

    2013-10-24

    The gap between the time scale of interesting behavior in macromolecular systems and that which our computational resources can afford often limits molecular dynamics (MD) from understanding experimental results and predicting what is inaccessible in experiments. In this paper, we introduce a new sampling scheme, named diffusion-map-directed MD (DM-d-MD), to rapidly explore molecular configuration space. The method uses a diffusion map to guide MD on the fly. DM-d-MD can be combined with other methods to reconstruct the equilibrium free energy, and here, we used umbrella sampling as an example. We present results from two systems: alanine dipeptide and alanine-12. In both systems, we gain tremendous speedup with respect to standard MD both in exploring the configuration space and reconstructing the equilibrium distribution. In particular, we obtain 3 orders of magnitude of speedup over standard MD in the exploration of the configurational space of alanine-12 at 300 K with DM-d-MD. The method is reaction coordinate free and minimally dependent on a priori knowledge of the system. We expect wide applications of DM-d-MD to other macromolecular systems in which equilibrium sampling is not affordable by standard MD.
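
    The diffusion map at the heart of DM-d-MD can be sketched from pairwise distances alone. This fragment shows only the textbook construction (Gaussian kernel, Markov normalization, leading nontrivial eigenvectors as coordinates); the kernel bandwidth `eps` and the number of coordinates are arbitrary choices here, and none of the on-the-fly MD guidance from the paper is included.

```python
import numpy as np

def diffusion_map(X, eps, n_coords=2):
    """Textbook diffusion-map embedding of configurations X (n x d).

    Builds a Gaussian kernel on pairwise squared distances, normalizes
    rows into a Markov transition matrix, and returns the eigenvectors
    of the leading nontrivial eigenvalues as the new coordinates.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / eps)
    P = K / K.sum(axis=1, keepdims=True)   # row-stochastic matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # order[0] is the trivial eigenvalue 1 (constant eigenvector); skip it.
    return vecs[:, order[1:1 + n_coords]].real
```

    In DM-d-MD the slowest diffusion coordinates computed this way are what direct new MD launches toward poorly sampled regions of configuration space.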

  15. Discussion on the 3D visualizing of 1:200 000 geological map

    NASA Astrophysics Data System (ADS)

    Wang, Xiaopeng

    2018-01-01

    Using United States National Aeronautics and Space Administration Shuttle Radar Topography Mission (SRTM) terrain data as the digital elevation model (DEM), overlaying the scanned 1:200 000 scale geological map, and programming with Microsoft Direct3D in the C# language, the author realized a three-dimensional visualization of the standard-division geological map. Users can inspect the regional geological content from arbitrary angles, with rotation and roaming, and can examine the stratigraphic column, map sections, and legend at any moment. This provides an intuitive analysis tool for geological practitioners to perform structural analysis with the assistance of landforms, to plan field exploration routes, etc.

  16. Fact Sheets and Additional Information Regarding the Primary National Ambient Air Quality Standards (NAAQS) for Nitrogen Dioxide (NO2)

    EPA Pesticide Factsheets

    Find tools for primary standards for Nitrogen Dioxide, maps of monitoring areas, an overview of the proposal, monitor requirements, design values for counties, and a presentation on the 2010 NO2 primary NAAQS revision.

  17. 43 CFR 8343.1 - Standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... off-road vehicle may be operated on public lands unless equipped with brakes in good working condition... excessive noise exceeding Environmental Protection Agency standards, when established, may be operated on public lands. (c) By posting appropriate signs or by marking a map which shall be available for public...

  18. A Network Analysis of Concept Maps of Triangle Concepts

    ERIC Educational Resources Information Center

    Haiyue, Jin; Khoon Yoong, Wong

    2010-01-01

    Mathematics educators and mathematics curriculum standards have emphasised the importance of constructing interconnectedness among mathematical concepts ("conceptual understanding") instead of only the ability to carry out standard procedures in an isolated fashion. Researchers have attempted to assess the knowledge networks in…

  19. Using ASBO International's Standards To Map Your Professional Growth and Development Plan.

    ERIC Educational Resources Information Center

    Stratton, Susan

    2002-01-01

    Explores various definitions of what constitutes a profession and what characteristics determine a professional. Identifies the need for continued professional growth and development related to the new Association of School Business Officials, International, "Professional Standards" (2001). Examples illustrate how individual school…

  20. Interoperability of Medication Classification Systems: Lessons Learned Mapping Established Pharmacologic Classes (EPCs) to SNOMED CT

    PubMed Central

    Nelson, Scott D; Parker, Jaqui; Lario, Robert; Winnenburg, Rainer; Erlbaum, Mark S.; Lincoln, Michael J.; Bodenreider, Olivier

    2018-01-01

    Interoperability among medication classification systems is known to be limited. We investigated the mapping of the Established Pharmacologic Classes (EPCs) to SNOMED CT. We compared lexical and instance-based methods to an expert-reviewed reference standard to evaluate contributions of these methods. Of the 543 EPCs, 284 had an equivalent SNOMED CT class, 205 were more specific, and 54 could not be mapped. Precision, recall, and F1 score were 0.416, 0.620, and 0.498 for lexical mapping and 0.616, 0.504, and 0.554 for instance-based mapping. Each automatic method has strengths, weaknesses, and unique contributions in mapping between medication classification systems. In our experience, it was beneficial to consider the mapping provided by both automated methods for identifying potential matches, gaps, inconsistencies, and opportunities for quality improvement between classifications. However, manual review by subject matter experts is still needed to select the most relevant mappings. PMID:29295234
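The reported F1 scores follow directly from the precision and recall figures, since F1 is their harmonic mean; the abstract's numbers check out:

```python
def f1(precision, recall):
    """F1 score: the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Figures reported for the lexical and instance-based mappings:
print(round(f1(0.416, 0.620), 3))  # 0.498
print(round(f1(0.616, 0.504), 3))  # 0.554
```

Note also that the three mapping outcomes partition the classes: 284 + 205 + 54 = 543 EPCs.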

  1. Development of a 63K SNP array for Gossypium and high-density mapping of intra- and inter-specific populations of cotton (G. hirsutum L.)

    USDA-ARS?s Scientific Manuscript database

    High-throughput genotyping arrays provide a standardized resource for crop research communities that are useful for a breadth of applications including high-density genetic mapping, genome-wide association studies (GWAS), genomic selection (GS), candidate marker and quantitative trait loci (QTL) ide...

  2. Mapping School Types in England

    ERIC Educational Resources Information Center

    Courtney, Steven J.

    2015-01-01

    The number and range of school types in England is increasing rapidly in response to a neoliberal policy agenda aiming to expand choice of provision as a mechanism for raising educational standards. In this paper, I seek to undertake a mapping of these school types in order to describe and explain what is happening. I capture this busy terrain…

  3. An Investigation of the Standard Errors of Expected A Posteriori Ability Estimates.

    ERIC Educational Resources Information Center

    De Ayala, R. J.; And Others

    Expected a posteriori has a number of advantages over maximum likelihood estimation or maximum a posteriori (MAP) estimation methods. These include ability estimates (thetas) for all response patterns, less regression towards the mean than MAP ability estimates, and a lower average squared error. R. D. Bock and R. J. Mislevy (1982) state that the…

  4. A note on contagion indices for landscape analysis

    Treesearch

    Kurt H. Riitters; Robert V. O' Neill; James D. Wickham; K. Bruce Jones

    1996-01-01

    The landscape contagion index measures the degree of clumping of attributes on raster maps. The index is computed from the frequencies by which different pairs of attributes occur as adjacent pixels on a map. Because there are subtle differences in the way the attribute adjacencies may be tabulated, the standard index formula may not always apply, and published index...
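One widely used form of the contagion index (after O'Neill et al. 1988) can be sketched as below. As the abstract stresses, the value depends on how attribute adjacencies are tabulated (ordered vs. unordered pairs, whether diagonals count), so this is one convention, not necessarily the paper's:

```python
import numpy as np

def contagion(adjacency_counts):
    """Relative contagion index from a matrix of attribute-pair adjacency counts.

    C = 1 + sum(p_ij * ln p_ij) / (2 ln n), where p_ij is the fraction of pixel
    adjacencies joining attribute classes i and j and n is the number of classes.
    C -> 1 for a maximally clumped map, C -> 0 for a maximally interspersed one.
    """
    counts = np.asarray(adjacency_counts, dtype=float)
    n = counts.shape[0]
    p = counts / counts.sum()
    nz = p[p > 0]                     # avoid log(0) for absent pairs
    return 1.0 + (nz * np.log(nz)).sum() / (2.0 * np.log(n))
```

The two extremes behave as expected: a map where all adjacencies join like attributes gives C = 1, and uniform adjacency frequencies across all pairs give C = 0.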

  5. 1989 Exclusive Economic Zone Symposium: summary and recommendations

    USGS Publications Warehouse

    Lockwood, M.; Hill, G.W.

    1989-01-01

    Issues discussed relate to digital seafloor mapping projects, cooperative federal-state programs, and requirements for additional data and information. Symposium recommendations included the need for increased surveying in coastal waters, development of standards for digital data dissemination, increased coordination with coastal states and federal agencies, and additional geophysical measurement systems aboard all mapping ships. -from Authors

  6. Building perceptual color maps for visualizing interval data

    NASA Astrophysics Data System (ADS)

    Kalvin, Alan D.; Rogowitz, Bernice E.; Pelah, Adar; Cohen, Aron

    2000-06-01

    In visualization, a 'color map' maps a range of data values onto a scale of colors. However, unless a color map is carefully constructed, visual artifacts can be produced. This problem has stimulated considerable interest in creating perceptually based color maps, that is, color maps where equal steps in data value are perceived as equal steps in the color map [Robertson (1988); Pizer (1981); Green (1992); Lefkowitz and Herman (1992)]. In Rogowitz and Treinish (1996, 1998) and in Bergman, Treinish and Rogowitz (1995), we demonstrated that color maps based on luminance or saturation could be good candidates for satisfying this requirement. This work is based on the seminal work of S.S. Stevens (1966), who measured the perceived magnitude of different magnitudes of physical stimuli. He found that for many physical scales, including luminance (cd/m2) and saturation (the 'redness' of a long-wavelength light source), equal ratios in stimulus value produced equal ratios in perceptual magnitude. He interpreted this as indicating that there exists in human cognition a common scale for representing magnitude, and we scale the effects of different physical stimuli to this internal scale. In Rogowitz, Kalvin, Pelah and Cohen (1999), we used a psychophysical technique to test this hypothesis as it applies to the creation of perceptually uniform color maps. We constructed color maps as trajectories through three color spaces, a common computer graphics standard (uncalibrated HSV), a common perceptually-based engineering standard for creating visual stimuli (L*a*b*), and a space commonly used in the graphic arts (Munsell). For each space, we created color scales that varied linearly in hue, saturation, or luminance and measured the detectability of increments in hue, saturation or luminance for each of these color scales. We measured the amplitude of the just-detectable Gaussian increments at 20 different values along the range of each color map.
For all three color spaces, we found that luminance-based color maps provided the most perceptually uniform representations of the data. The just-detectable increment was constant at all points in the color map, with the exception of the lowest-luminance values, where a larger increment was required. The saturation-based color maps provided less sensitivity than the luminance-based color maps, requiring much larger increments for detection. For the hue-based color maps, the size of the increment required for detection varied across the range. For example, for the standard 'rainbow' color map (uncalibrated HSV, hue-varying map), a step in the 'green' region required an increment 16 times the size of the increment required in the 'cyan' part of the range. That is, the rainbow color map would not successfully represent changes in the data in the 'green' region of this color map. In this paper, we extend this research by studying the detectability of spatially-modulated Gabor targets based on these hue, saturation and luminance scales. Since, in visualization, the user is called upon to detect and identify patterns that vary in their spatial characteristics, it is important to study how different types of color maps represent data with varying spatial properties. To do so, we measured modulation thresholds for low- (0.2 c/deg) and high- (4.0 c/deg) spatial-frequency Gabor patches and compared them with the Gaussian results. As before, we measured increment thresholds for hue, saturation, and luminance modulations. These color scales were constructed as trajectories along the three perceptual dimensions of color (hue, saturation, and luminance) in two color spaces, uncalibrated HSV and calibrated L*a*b*. This allowed us to study how the three perceptual dimensions represent magnitude information for test patterns varying in spatial frequency.
This design also allowed us to test the hypothesis that the luminance channel best carries high-spatial frequency information while the saturation channel best represents low spatial-frequency information (Mullen 1985; DeValois and DeValois 1988).
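Stevens' equal-ratio finding suggests a simple recipe for a perceptually even luminance scale: space the entries geometrically rather than linearly, so every step multiplies luminance by the same factor. A minimal sketch of that idea (illustrative only; the study's actual stimuli were calibrated trajectories through HSV, L*a*b*, and Munsell space):

```python
import numpy as np

def equal_ratio_luminance_scale(l_min, l_max, n):
    """Luminance values (cd/m^2) spaced at equal ratios rather than equal steps.

    If equal stimulus ratios yield equal perceptual ratios (Stevens 1966),
    geometric spacing should produce perceptually even steps.
    """
    r = (l_max / l_min) ** (1.0 / (n - 1))   # common ratio between steps
    return l_min * r ** np.arange(n)
```

For example, a 5-entry scale from 1 to 100 cd/m^2 steps by a factor of 100^(1/4) ≈ 3.16 each time, rather than by a fixed additive increment.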

  7. The Global Genome Biodiversity Network (GGBN) Data Standard specification.

    PubMed

    Droege, G; Barker, K; Seberg, O; Coddington, J; Benson, E; Berendsohn, W G; Bunk, B; Butler, C; Cawsey, E M; Deck, J; Döring, M; Flemons, P; Gemeinholzer, B; Güntsch, A; Hollowell, T; Kelbert, P; Kostadinov, I; Kottmann, R; Lawlor, R T; Lyal, C; Mackenzie-Dodds, J; Meyer, C; Mulcahy, D; Nussbeck, S Y; O'Tuama, É; Orrell, T; Petersen, G; Robertson, T; Söhngen, C; Whitacre, J; Wieczorek, J; Yilmaz, P; Zetzsche, H; Zhang, Y; Zhou, X

    2016-01-01

    Genomic samples of non-model organisms are becoming increasingly important in a broad range of studies from developmental biology, biodiversity analyses, to conservation. Genomic sample definition, description, quality, voucher information and metadata all need to be digitized and disseminated across scientific communities. This information needs to be concise and consistent in today's ever-increasing bioinformatic era, for complementary data aggregators to easily map databases to one another. In order to facilitate exchange of information on genomic samples and their derived data, the Global Genome Biodiversity Network (GGBN) Data Standard is intended to provide a platform based on a documented agreement to promote the efficient sharing and usage of genomic sample material and associated specimen information in a consistent way. The new data standard presented here builds upon existing standards commonly used within the community, extending them with the capability to exchange data on tissue, environmental and DNA samples as well as sequences. The GGBN Data Standard will reveal and democratize the hidden contents of biodiversity biobanks, for the convenience of everyone in the wider biobanking community. Technical tools exist for data providers to easily map their databases to the standard. Database URL: http://terms.tdwg.org/wiki/GGBN_Data_Standard. © The Author(s) 2016. Published by Oxford University Press.

  8. Semi-automated quantification and neuroanatomical mapping of heterogeneous cell populations.

    PubMed

    Mendez, Oscar A; Potter, Colin J; Valdez, Michael; Bello, Thomas; Trouard, Theodore P; Koshy, Anita A

    2018-07-15

    Our group studies the interactions between cells of the brain and the neurotropic parasite Toxoplasma gondii. Using an in vivo system that allows us to permanently mark and identify brain cells injected with Toxoplasma protein, we have identified that Toxoplasma-injected neurons (TINs) are heterogeneously distributed throughout the brain. Unfortunately, standard methods to quantify and map heterogeneous cell populations onto a reference brain atlas are time consuming and prone to user bias. We developed a novel MATLAB-based semi-automated quantification and mapping program to allow the rapid and consistent mapping of heterogeneously distributed cells on to the Allen Institute Mouse Brain Atlas. The system uses two-threshold background subtraction to identify and quantify cells of interest. We demonstrate that we reliably quantify and neuroanatomically localize TINs with low intra- or inter-observer variability. In a follow up experiment, we show that specific regions of the mouse brain are enriched with TINs. The procedure we use takes advantage of simple immunohistochemistry labeling techniques, use of a standard microscope with a motorized stage, and low cost computing that can be readily obtained at a research institute. To our knowledge there is no other program that uses such readily available techniques and equipment for mapping heterogeneous populations of cells across the whole mouse brain. The quantification method described here allows reliable visualization, quantification, and mapping of heterogeneous cell populations in immunolabeled sections across whole mouse brains. Copyright © 2018 Elsevier B.V. All rights reserved.
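The two-threshold background subtraction idea can be sketched as hysteresis-style region growing: pixels above a high threshold seed objects, and each object then grows through connected pixels above a lower threshold, suppressing dim background speckle while capturing each cell's full extent. A minimal numpy/BFS illustration (the published pipeline is MATLAB-based and also maps detections onto the Allen atlas; this shows only the thresholding step):

```python
import numpy as np
from collections import deque

def two_threshold_segment(img, t_low, t_high):
    """Label objects seeded above t_high and grown through pixels above t_low.

    Returns (label image, object count). Isolated pixels that exceed t_low but
    never reach t_high are treated as background noise.
    """
    img = np.asarray(img, dtype=float)
    labels = np.zeros(img.shape, dtype=int)
    count = 0
    for seed in zip(*np.nonzero(img >= t_high)):
        if labels[seed]:
            continue                      # already absorbed into an object
        count += 1
        labels[seed] = count
        queue = deque([seed])
        while queue:                      # 4-connected breadth-first growth
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                        and not labels[rr, cc] and img[rr, cc] >= t_low):
                    labels[rr, cc] = count
                    queue.append((rr, cc))
    return labels, count
```

A bright cell with a dim halo is counted once, halo included, while an equally dim isolated speckle elsewhere is ignored.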

  9. Design of parallel transmission radiofrequency pulses robust against respiration in cardiac MRI at 7 Tesla.

    PubMed

    Schmitter, Sebastian; Wu, Xiaoping; Uğurbil, Kâmil; Van de Moortele, Pierre-François

    2015-11-01

    Two-spoke parallel transmission (pTX) radiofrequency (RF) pulses have been demonstrated in cardiac MRI at 7T. However, current pulse designs rely on a single set of B1(+)/B0 maps that may not be valid for subsequent scans acquired at another phase of the respiration cycle because of organ displacement. Such mismatches may yield severe excitation profile degradation. B1(+)/B0 maps were obtained, using 16 transmit channels at 7T, at three breath-hold positions: exhale, half-inhale, and inhale. Standard and robust RF pulses were designed using maps obtained at exhale only, and at multiple respiratory positions, respectively. Excitation patterns were analyzed for all positions using Bloch simulations. Flip-angle homogeneity was compared in vivo in cardiac CINE acquisitions. Standard one- and two-spoke pTX RF pulses are sensitive to breath-hold position, primarily due to B1(+) alterations, with high dependency on excitation trajectory for two spokes. In vivo excitation inhomogeneity varied from nRMSE = 8.2% (exhale) up to 32.5% (inhale) with the standard design; much more stable results were obtained with the robust design with nRMSE = 9.1% (exhale) and 10.6% (inhale). A new pTX RF pulse design robust against respiration induced variations of B1(+)/B0 maps is demonstrated and is expected to have a positive impact on cardiac MRI in breath-hold, free-breathing, and real-time acquisitions. © 2014 Wiley Periodicals, Inc.

  10. Use of sentinel node mapping for cancer of the colon: 'to map or not to map'.

    PubMed

    Thomas, Kristen A; Lechner, Jonathan; Shen, Perry; Waters, Gregory S; Geisinger, Kim R; Levine, Edward A

    2006-07-01

    Sentinel lymph node (SLN) mapping has become a cornerstone of oncologic surgery because it is a proven method for identifying nodal disease in melanoma and breast cancer. In addition, it can ameliorate the surgical morbidity secondary to lymphadenectomy. However, experience with SLN mapping for carcinoma of the colon and other visceral malignancies is limited. This study represents an update to our initial pilot experience with SLN mapping for carcinoma of the colon. Consenting patients over the age of 18 diagnosed with adenocarcinoma of the colon were included in this study. At the time of operation, 1 to 2 mL of isosulfan blue was injected with a 25-gauge needle into the subserosa at 4 sites around the edge of the palpable tumor. The SLN was identified visually and excised followed by a standard lymphadenectomy and surgical resection. SLNs were evaluated by standard hematoxylin and eosin (H&E) evaluation as well as immunohistochemical (IHC) techniques for carcinoembryonic antigen and cytokeratin if the H&E was negative. Sixty-nine patients underwent SLN mapping. A SLN was identified in 93 per cent (64 of 69) of patients. Nodal metastases were identified in 38 per cent (26 of 69) of patients overall. In 5 patients, the only positive node identified was the SLN, 2 of which were positive by IHC criteria alone. Therefore, 3 per cent (2 of 69) of patients were upstaged by SLN mapping. This technique was 100 per cent specific while being 46 per cent sensitive. Fourteen patients had false-negative SLNs. Metastasis to regional lymph nodes remains the key prognostic factor for colon cancer. SLN mapping is feasible for colon cancer and can identify a subset of patients who could benefit from adjuvant chemotherapy. Although SLN mapping did not alter the surgical management of colon cancer, it does make possible a more focused and cost-effective pathologic evaluation of nodal disease. 
We do not suggest routine utilization of SLN mapping for colon cancer, but we believe that the data supports proceeding with a national trial.
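Under one natural reading of the counts given, with 26 node-positive patients of whom 14 had false-negative sentinel nodes, the quoted 46 per cent sensitivity can be reproduced directly (this arithmetic is our reconstruction, not stated explicitly in the abstract):

```python
node_positive = 26     # patients with nodal metastases on full lymphadenectomy
false_negative = 14    # node-positive patients whose sentinel node was negative
true_positive = node_positive - false_negative

# Sensitivity = TP / (TP + FN)
sensitivity = true_positive / (true_positive + false_negative)
print(round(100 * sensitivity))  # 46
```

Specificity is 100 per cent by construction here, since a positive sentinel node always indicated true nodal disease.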

  11. An interoperable standard system for the automatic generation and publication of the fire risk maps based on Fire Weather Index (FWI)

    NASA Astrophysics Data System (ADS)

    Julià Selvas, Núria; Ninyerola Casals, Miquel

    2015-04-01

    An automatic system has been implemented to predict fire risk in the Principality of Andorra, a small country in the eastern Pyrenees mountain range, bordered by Catalonia and France; owing to its location, its landscape is a set of rugged mountains with an average elevation of around 2000 meters. The system is based on the Fire Weather Index (FWI), which consists of several components, each measuring a different aspect of fire danger, calculated from the values of the weather variables at midday. CENMA (Centre d'Estudis de la Neu i de la Muntanya d'Andorra) operates a network of around 10 automatic meteorological stations, located at different sites on peaks and in valleys, that measure weather data such as relative humidity, wind direction and speed, surface temperature, rainfall, and snow cover every ten minutes. These data are sent daily and automatically to the system, where they are processed to filter out incorrect measurements and to homogenize measurement units. The data are then used to calculate all components of the FWI at midday at the level of each station, creating a database of the homogenized measurements and the FWI components for each weather station. To extend and model these data over the whole Andorran territory and obtain a continuous map, an interpolation method based on multiple regression with spline interpolation of the residuals has been implemented. The interpolation considers the FWI data as well as other relevant predictors such as latitude, altitude, global solar radiation, and distance to the sea. The obtained values (maps) are validated using leave-one-out cross-validation. The discrete and continuous maps are rendered as tiled raster maps and published in a web portal conforming to the Web Map Service (WMS) standard of the Open Geospatial Consortium (OGC). Metadata and other reference maps (fuel maps, topographic maps, etc.) are also available from this geoportal.
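The interpolation scheme, a regression trend fitted on terrain predictors plus spatial interpolation of the station residuals, can be sketched with numpy. Inverse-distance weighting stands in here for the spline residual interpolation used in the actual system, and the predictor layout is a placeholder:

```python
import numpy as np

def regression_residual_interp(stations_xy, predictors, values,
                               grid_xy, grid_predictors, power=2.0):
    """Trend-plus-residual interpolation of station values onto a grid.

    A linear model ties station values (e.g. FWI) to predictors such as
    altitude, radiation, or sea distance; the spatial structure the model
    misses is carried over by interpolating the station residuals (here IDW,
    as a simple stand-in for a spline).
    """
    A = np.column_stack([np.ones(len(values)), predictors])
    coef, *_ = np.linalg.lstsq(A, values, rcond=None)     # fit the trend
    residuals = values - A @ coef
    trend = np.column_stack([np.ones(len(grid_xy)), grid_predictors]) @ coef
    d = np.linalg.norm(grid_xy[:, None, :] - stations_xy[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, 1e-9) ** power                # IDW weights
    return trend + (w * residuals).sum(axis=1) / w.sum(axis=1)
```

When the station values are an exact linear function of the predictors, the residuals vanish and the grid prediction reduces to the regression trend alone.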

  12. Standard Anatomic Terminologies: Comparison for Use in a Health Information Exchange–Based Prior Computed Tomography (CT) Alerting System

    PubMed Central

    Lowry, Tina; Vreeman, Daniel J; Loo, George T; Delman, Bradley N; Thum, Frederick L; Slovis, Benjamin H; Shapiro, Jason S

    2017-01-01

    Background A health information exchange (HIE)–based prior computed tomography (CT) alerting system may reduce avoidable CT imaging by notifying ordering clinicians of prior relevant studies when a study is ordered. For maximal effectiveness, a system would alert not only for prior same CTs (exams mapped to the same code from an exam name terminology) but also for similar CTs (exams mapped to different exam name terminology codes but in the same anatomic region) and anatomically proximate CTs (exams in adjacent anatomic regions). Notification of previous same studies across an HIE requires mapping of local site CT codes to a standard terminology for exam names (such as Logical Observation Identifiers Names and Codes [LOINC]) to show that two studies with different local codes and descriptions are equivalent. Notifying of prior similar or proximate CTs requires an additional mapping of exam codes to anatomic regions, ideally coded by an anatomic terminology. Several anatomic terminologies exist, but no prior studies have evaluated how well they would support an alerting use case. Objective The aim of this study was to evaluate the fitness of five existing standard anatomic terminologies to support similar or proximate alerts of an HIE-based prior CT alerting system. Methods We compared five standard anatomic terminologies (Foundational Model of Anatomy, Systematized Nomenclature of Medicine Clinical Terms, RadLex, LOINC, and LOINC/Radiological Society of North America [RSNA] Radiology Playbook) to an anatomic framework created specifically for our use case (Simple ANatomic Ontology for Proximity or Similarity [SANOPS]), to determine whether the existing terminologies could support our use case without modification. On the basis of an assessment of optimal terminology features for our purpose, we developed an ordinal anatomic terminology utility classification. 
We mapped samples of 100 random and the 100 most frequent LOINC CT codes to anatomic regions in each terminology, assigned utility classes for each mapping, and statistically compared each terminology’s utility class rankings. We also constructed seven hypothetical alerting scenarios to illustrate the terminologies’ differences. Results Both RadLex and the LOINC/RSNA Radiology Playbook anatomic terminologies ranked significantly better (P<.001) than the other standard terminologies for the 100 most frequent CTs, but no terminology ranked significantly better than any other for 100 random CTs. Hypothetical scenarios illustrated instances where no standard terminology would support appropriate proximate or similar alerts, without modification. Conclusions LOINC/RSNA Radiology Playbook and RadLex’s anatomic terminologies appear well suited to support proximate or similar alerts for commonly ordered CTs, but for less commonly ordered tests, modification of the existing terminologies with concepts and relations from SANOPS would likely be required. Our findings suggest SANOPS may serve as a framework for enhancing anatomic terminologies in support of other similar use cases. PMID:29242174

  13. 33 CFR 137.50 - Reviews of historical sources of information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... information. 137.50 Section 137.50 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND...: STANDARDS FOR CONDUCTING ALL APPROPRIATE INQUIRIES UNDER THE INNOCENT LAND-OWNER DEFENSE Standards and... insurance maps, building department records, chain of title documents, and land use records. (b) Historical...

  14. 33 CFR 137.50 - Reviews of historical sources of information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... information. 137.50 Section 137.50 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND...: STANDARDS FOR CONDUCTING ALL APPROPRIATE INQUIRIES UNDER THE INNOCENT LAND-OWNER DEFENSE Standards and... insurance maps, building department records, chain of title documents, and land use records. (b) Historical...

  15. 33 CFR 137.50 - Reviews of historical sources of information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... information. 137.50 Section 137.50 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND...: STANDARDS FOR CONDUCTING ALL APPROPRIATE INQUIRIES UNDER THE INNOCENT LAND-OWNER DEFENSE Standards and... insurance maps, building department records, chain of title documents, and land use records. (b) Historical...

  16. 33 CFR 137.50 - Reviews of historical sources of information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... information. 137.50 Section 137.50 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND...: STANDARDS FOR CONDUCTING ALL APPROPRIATE INQUIRIES UNDER THE INNOCENT LAND-OWNER DEFENSE Standards and... insurance maps, building department records, chain of title documents, and land use records. (b) Historical...

  17. 33 CFR 137.50 - Reviews of historical sources of information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information. 137.50 Section 137.50 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND...: STANDARDS FOR CONDUCTING ALL APPROPRIATE INQUIRIES UNDER THE INNOCENT LAND-OWNER DEFENSE Standards and... insurance maps, building department records, chain of title documents, and land use records. (b) Historical...

  18. Fact Sheets and Additional Information Regarding the 2006 Particulate Matter (PM) National Ambient Air Quality Standards (NAAQS)

    EPA Pesticide Factsheets

    This page contains a fact sheet, a presentation providing an overview of the rule, and graphs and maps pertaining to the new standards that are supplementary to the October 2006 revision for the Particulate Matter (PM) NAAQS

  19. Fact Sheets and Additional Information Regarding the 2010 Revision to the Primary National Ambient Air Quality Standards (NAAQS) for Sulfur Dioxide

    EPA Pesticide Factsheets

    Find tools for primary standards for Sulfur Dioxide, maps of nonattainment areas, an overview of the proposal, projected nonattainment areas for 2020, and a presentation on the 2011 SO2 primary NAAQS revision.

  20. Accuracy assessment of vegetation community maps generated by aerial photography interpretation: perspective from the tropical savanna, Australia

    NASA Astrophysics Data System (ADS)

    Lewis, Donna L.; Phinn, Stuart

    2011-01-01

    Aerial photography interpretation is the most common mapping technique in the world. However, unlike an algorithm-based classification of satellite imagery, the accuracy of maps generated by aerial photography interpretation is rarely assessed. Vegetation communities covering an area of 530 km2 on Bullo River Station, Northern Territory, Australia, were mapped through interpretation of 1:50,000 color aerial photography. Manual stereoscopic line-work was delineated at 1:10,000 and thematic maps were generated at 1:25,000 and 1:100,000. Multivariate and intuitive analysis techniques were employed to identify 22 vegetation communities within the study area. The accuracy assessment was based on 50% of a field dataset collected over a 4 year period (2006 to 2009); the remaining 50% of sites were used for map attribution. The overall accuracy and Kappa coefficient for both thematic maps were 66.67% and 0.63, respectively, calculated from standard error matrices. Our findings highlight the need for appropriate scales of mapping and for accuracy assessment of vegetation community maps generated by aerial photography interpretation.
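Overall accuracy and the Kappa coefficient are both derived from a map-versus-reference error (confusion) matrix. A small sketch of the computation (the 66.67%/0.63 figures above come from the authors' full 22-class matrices, which are not reproduced here, so the example matrix is hypothetical):

```python
import numpy as np

def overall_accuracy_and_kappa(error_matrix):
    """Overall accuracy and Cohen's kappa from an error matrix.

    Rows: mapped class; columns: reference (field) class. Kappa discounts
    the agreement expected by chance from the row/column marginals.
    """
    m = np.asarray(error_matrix, dtype=float)
    total = m.sum()
    p_o = np.trace(m) / total                                 # observed agreement
    p_e = (m.sum(axis=1) * m.sum(axis=0)).sum() / total ** 2  # chance agreement
    return p_o, (p_o - p_e) / (1.0 - p_e)
```

For a hypothetical two-class matrix [[20, 5], [10, 15]], observed agreement is 0.7 while kappa drops to 0.4 once chance agreement (0.5) is removed.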

  1. A Genetic Linkage Map of the Male Goat Genome

    PubMed Central

    Vaiman, D.; Schibler, L.; Bourgeois, F.; Oustry, A.; Amigues, Y.; Cribiu, E. P.

    1996-01-01

    This paper presents a first genetic linkage map of the goat genome. Primers derived from the flanking sequences of 612 bovine, ovine and goat microsatellite markers were gathered and tested for amplification with goat DNA under standardized PCR conditions. This screen made it possible to choose a set of 55 polymorphic markers that can be used in the three species and to define a panel of 223 microsatellites suitable for the goat. Twelve half-sib paternal goat families were then used to build a linkage map of the goat genome. The linkage analysis made it possible to construct a meiotic map covering 2300 cM, i.e., >80% of the total estimated length of the goat genome. Moreover, eight cosmids containing microsatellites were mapped by fluorescence in situ hybridization in goat and sheep. Together with 11 microsatellite-containing cosmids previously mapped in cattle (and supposing conservation of the banding pattern between this species and the goat) and data from the sheep map, these results made the orientation of 15 linkage groups possible. Furthermore, 12 coding sequences were mapped either genetically or physically, providing useful data for comparative mapping. PMID:8878693
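Map distances in centimorgans, as in the 2300 cM coverage figure, are derived from observed recombination fractions between markers via a mapping function that corrects for double crossovers. Haldane's function (which assumes no crossover interference) is the simplest and is shown here for illustration; the abstract does not state which function the linkage analysis actually used:

```python
import math

def haldane_cM(r):
    """Haldane map distance in centimorgans from a recombination fraction r.

    d = -50 * ln(1 - 2r); for small r this is approximately 100*r cM,
    and it diverges as r approaches 0.5 (unlinked markers).
    """
    return -50.0 * math.log(1.0 - 2.0 * r)
```

For example, a 1% recombination fraction maps to roughly 1 cM, while 20% maps to about 25.5 cM rather than 20 cM because of inferred double crossovers.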

  2. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
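A WMS GetMap request is just an HTTP URL with standardized parameters, which any OGC-compliant server such as a UMN MapServer instance answers with a rendered map image. A minimal URL builder (the base URL and layer name below are placeholders, not a real service):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, size=(600, 400), srs="EPSG:4326"):
    """Build a WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the given SRS; layers is a list of
    layer names advertised by the server's GetCapabilities response.
    """
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": ",".join(layers), "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)
```

Layering maps from multiple remote servers, as described above, amounts to issuing one such request per remote source and compositing the returned images.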

  3. Novel 3D ultrasound image-based biomarkers based on a feature selection from a 2D standardized vessel wall thickness map: a tool for sensitive assessment of therapies for carotid atherosclerosis

    NASA Astrophysics Data System (ADS)

    Chiu, Bernard; Li, Bing; Chow, Tommy W. S.

    2013-09-01

    With the advent of new therapies and management strategies for carotid atherosclerosis, there is a parallel need for measurement tools or biomarkers to evaluate the efficacy of these new strategies. 3D ultrasound has been shown to provide reproducible measurements of plaque area/volume and vessel wall volume. However, since carotid atherosclerosis is a focal disease that predominantly occurs at bifurcations, biomarkers based on local plaque change may be more sensitive than global volumetric measurements in demonstrating efficacy of new therapies. The ultimate goal of this paper is to develop a biomarker that is based on the local distribution of vessel-wall-plus-plaque thickness change (VWT-Change) that has occurred during the course of a clinical study. To allow comparison between different treatment groups, the VWT-Change distribution of each subject must first be mapped to a standardized domain. In this study, we developed a technique to map the 3D VWT-Change distribution to a 2D standardized template. We then applied a feature selection technique to identify regions on the 2D standardized map on which subjects in different treatment groups exhibit greater difference in VWT-Change. The proposed algorithm was applied to analyse the VWT-Change of 20 subjects in a placebo-controlled study of the effect of atorvastatin (Lipitor). The average VWT-Change for each subject was computed (i) over all points in the 2D map and (ii) over feature points only. For the average computed over all points, 97 subjects per group would be required to detect an effect size of 25% that of atorvastatin in a six-month study. The sample size is reduced to 25 subjects if the average were computed over feature points only. 
The introduction of this sensitive quantification technique for carotid atherosclerosis progression/regression would allow many proof-of-principle studies to be performed before a more costly and longer study involving a larger population is held to confirm the treatment efficacy.
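The reported reduction from 97 to 25 subjects per group reflects the usual two-sample power calculation, in which the required group size scales inversely with the square of the standardized effect size. The sketch below (standard library only) illustrates that generic calculation; it is not the authors' exact computation, and the effect sizes passed in are illustrative:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Approximate subjects per group needed to detect a standardized
    effect size (Cohen's d) with a two-sided two-sample z-test:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# Averaging over feature points reduces within-group variability,
# which raises the standardized effect size and shrinks n.
```

Halving the within-group standard deviation doubles the standardized effect size and cuts the required group size by roughly a factor of four, which is the mechanism behind the paper's 97-versus-25 comparison.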

  4. Weaving a Fabric of World History? An Analysis of U.S. State High School World History Standards

    ERIC Educational Resources Information Center

    Marino, Michael; Bolgatz, Jane

    2010-01-01

    Understanding world history is critical for our development as citizens in our interconnected society. Yet it is not clear that the standards for world history courses in the U.S. foster understanding of the whole world or of its history. The authors argue that the high school world history standards mapped out by various states promulgate a…

  5. Transport properties in nontwist area-preserving maps

    DOE PAGES

    Szezech Jr., J. D.; Caldas, I. L.; Lopes, S. R.; ...

    2009-10-23

    Nontwist systems, common in the dynamical descriptions of fluids and plasmas, possess a shearless curve with a concomitant transport barrier that eliminates or reduces chaotic transport, even after its breakdown. In order to investigate the transport properties of nontwist systems, we analyze the barrier escape time and barrier transmissivity for the standard nontwist map, a paradigm of such systems. We interpret the sensitive dependence of these quantities upon map parameters by investigating chaotic orbit stickiness and the associated role played by the dominant crossing of stable and unstable manifolds.
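For readers unfamiliar with the model, the standard nontwist map can be iterated in a few lines. The sketch below uses the common two-parameter (a, b) form; the escape threshold, initial condition, and parameter values are illustrative assumptions, not taken from the paper:

```python
import math

def nontwist_map(x, y, a, b):
    """One iteration of the standard nontwist map:
    y' = y - b*sin(2*pi*x),  x' = x + a*(1 - y'^2)  (x taken mod 1)."""
    y_next = y - b * math.sin(2 * math.pi * x)
    x_next = (x + a * (1 - y_next ** 2)) % 1.0
    return x_next, y_next

def escape_time(x0, y0, a, b, y_escape=2.0, max_iter=100000):
    """Iterations until the orbit crosses |y| > y_escape;
    returns max_iter if the barrier is never crossed."""
    x, y = x0, y0
    for n in range(max_iter):
        if abs(y) > y_escape:
            return n
        x, y = nontwist_map(x, y, a, b)
    return max_iter
```

With the perturbation b set to zero the map is integrable and the orbit never escapes, so the escape time saturates at the iteration cap; chaotic transport across the shearless region appears only for sufficiently large b.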

  6. The characteristic and changes of the event-related potentials (ERP) and brain topographic maps before and after treatment with rTMS in subjective tinnitus patients.

    PubMed

    Yang, Haidi; Xiong, Hao; Yu, Rongjun; Wang, Changming; Zheng, Yiqing; Zhang, Xueyuan

    2013-01-01

To compare the characteristics and changes of event-related potentials (ERPs) and brain topographic maps in normal controls and subjective tinnitus patients before and after repetitive transcranial magnetic stimulation (rTMS) treatment. The ERPs and brain topographic maps elicited by target stimuli were compared before and after 1-week treatment with rTMS in 20 subjective tinnitus patients and 16 healthy controls. Before rTMS, target stimuli elicited a larger N1 component than the standard stimuli (repeating sounds) in the control group but not in tinnitus patients. Instead, the tinnitus group pre-treatment exhibited a larger N1 amplitude in response to standard stimuli than to deviant stimuli. Furthermore, tinnitus patients had smaller mismatch negativity (MMN) and late discriminative negativity (LDN) components at Fz compared with the control group. After rTMS treatment, tinnitus patients showed an increased N1 response to deviant stimuli and larger MMN and LDN compared with pre-treatment. The topographic maps for the tinnitus group before rTMS treatment demonstrated global asymmetry between the left and right cerebral hemispheres, with more negative activity on the left side and more positive activity on the right side. In contrast, the brain topographic maps for patients after rTMS treatment and for controls appeared roughly symmetrical. The ERP amplitudes and brain topographic maps in the post-treatment patient group showed no significant difference from those in controls. The characteristic changes in ERPs and brain topographic maps in tinnitus patients may be related to the electrophysiological mechanisms of tinnitus induction and development, and could serve as an objective biomarker for evaluating central auditory function in subjective tinnitus patients. These findings support the notion that rTMS treatment in tinnitus patients may exert a beneficial effect.

  7. Spatial analysis of plutonium-239 + 240 and Americium-241 in soils around Rocky Flats, Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Litaor, M.I.

    1995-05-01

Plutonium and americium contamination of soils around Rocky Flats, Colorado resulted from past outdoor storage practices. Four previous studies produced four different Pu isopleth maps. Spatial estimation techniques were not used in the construction of these maps, which were also based on an extremely small number of soil samples. The purpose of this study was to elucidate the magnitude of Pu-239 + 240 and Am-241 dispersion in the soil environment east of Rocky Flats using robust spatial estimation techniques. Soils were sampled from 118 plots of 1.01 and 4.05 ha by compositing 25 evenly spaced samples in each plot from the top 0.64 cm. Plutonium-239 + 240 activity ranged from 1.85 to 53 560 Bq/kg with a mean of 1924 Bq/kg and a standard deviation of 6327 Bq/kg. Americium-241 activity ranged from 0.18 to 9990 Bq/kg with a mean of 321 Bq/kg and a standard deviation of 1143 Bq/kg. Geostatistical techniques were used to model the spatial dependency and construct isopleth maps showing Pu-239 + 240 and Am-241 distribution. The isopleth configuration was consistent with the hypothesis that the dominant dispersal mechanism of Pu-239 + 240 was wind dispersion from west to east. The Pu-239 + 240 isopleth map proposed in this study differed significantly in the direction and distance of dispersal from the previously published maps. This isopleth map, as well as the Am-241 map, should be used as the primary data for future risk assessment associated with public exposure to Pu-239 + 240 and Am-241. 37 refs., 7 figs., 2 tabs.

  8. Maps and grids of hydrogeologic information created from standardized water-well drillers’ records of the glaciated United States

    USGS Publications Warehouse

    Bayless, E. Randall; Arihood, Leslie D.; Reeves, Howard W.; Sperl, Benjamin J.S.; Qi, Sharon L.; Stipe, Valerie E.; Bunch, Aubrey R.

    2017-01-18

As part of the National Water Availability and Use Program established by the U.S. Geological Survey (USGS) in 2005, this study took advantage of about 14 million records from State-managed collections of water-well drillers’ records and created a database of hydrogeologic properties for the glaciated United States. The water-well drillers’ records were standardized to be relatively complete and error-free and to provide consistent variables and naming conventions that span all State boundaries. Maps and geospatial grids were developed for (1) total thickness of glacial deposits, (2) total thickness of coarse-grained deposits, (3) specific-capacity-based transmissivity and hydraulic conductivity, and (4) texture-based estimated equivalent horizontal and vertical hydraulic conductivity and transmissivity. The information included in these maps and grids is required for most assessments of groundwater availability, in addition to having applications to studies of groundwater flow and transport. The texture-based estimated equivalent horizontal and vertical hydraulic conductivity and transmissivity were based on an assumed range of hydraulic conductivity values for coarse- and fine-grained deposits and should only be used with complete awareness of the methods used to create them. However, the maps and grids of texture-based estimated equivalent hydraulic conductivity and transmissivity may be useful for application to areas where a range of measured values is available for re-scaling. Maps of hydrogeologic information for some States are presented as examples in this report, but maps and grids for all States are available electronically at the project Web site (USGS Glacial Aquifer System Groundwater Availability Study, http://mi.water.usgs.gov/projects/WaterSmart/Map-SIR2015-5105.html) and the Science Base Web site, https://www.sciencebase.gov/catalog/item/58756c7ee4b0a829a3276352.

  9. Application of the ERTS system to the study of Wyoming resources with emphasis on the use of basic data products

    NASA Technical Reports Server (NTRS)

    Houston, R. S.; Marrs, R. W.; Breckenridge, R. M.; Blackstone, D. L., Jr.

    1974-01-01

    Many potential users of ERTS data products and other aircraft and satellite imagery are limited to visual methods of analyses of these products. Illustrations are presented from Wyoming studies that have employed these standard data products for a variety of geologic and related studies. Possible economic applications of these studies are summarized. Studies include regional geologic mapping for updating and correcting existing maps and to supplement incomplete regional mapping; illustrations of the value of seasonal images in geologic mapping; specialized mapping of such features as sand dunes, playa lakes, lineaments, glacial features, regional facies changes, and their possible economic value; and multilevel sensing as an aid in mineral exploration. Examples of cooperative studies involving botanists, plant scientists, and geologists for the preparation of maps of surface resources that can be used by planners and for environmental impact studies are given.

  10. A Mathematical Model for Storage and Recall of Images using Targeted Synchronization of Coupled Maps.

    PubMed

    Palaniyandi, P; Rangarajan, Govindan

    2017-08-21

We propose a mathematical model for storage and recall of images using coupled maps. We start by theoretically investigating targeted synchronization in coupled map systems wherein only a desired (partial) subset of the maps is made to synchronize. A simple method is introduced to specify coupling coefficients such that targeted synchronization is ensured. The principle of this method is extended to storage/recall of images using coupled Rulkov maps. The process of adjusting coupling coefficients between Rulkov maps (often used to model neurons) for the purpose of storing a desired image mimics the process of adjusting synaptic strengths between neurons to store memories. Our method uses both synchronization and synaptic weight modification, as the human brain is thought to do. The stored image can be recalled by providing an initial random pattern to the dynamical system. The storage and recall of the standard image of Lena is explicitly demonstrated.
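The Rulkov map mentioned above is a standard two-variable neuron model. The sketch below shows one map step and a simple mean-field coupling of two maps; the parameter values and the coupling form are illustrative assumptions, not the targeted-synchronization scheme of the paper:

```python
def rulkov(x, y, alpha=4.1, mu=0.001, sigma=-1.0):
    """One step of the Rulkov map, a two-variable neuron model:
    fast variable x, slow variable y."""
    x_next = alpha / (1.0 + x * x) + y
    y_next = y - mu * (x - sigma)
    return x_next, y_next

def step_coupled(states, eps):
    """Advance two Rulkov maps coupled through the mean field of
    their fast variables; eps in [0, 1] is the coupling strength."""
    stepped = [rulkov(x, y) for x, y in states]
    mean_x = sum(x for x, _ in stepped) / len(stepped)
    return [((1 - eps) * x + eps * mean_x, y) for x, y in stepped]
```

At full coupling (eps = 1) the fast variables receive the identical mean-field update and synchronize immediately; the paper's contribution is choosing the coupling coefficients so that only a desired subset of the maps synchronizes, which is what encodes an image.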

  11. Representation of Nursing Terminologies in UMLS

    PubMed Central

    Kim, Tae Youn; Coenen, Amy; Hardiker, Nicholas; Bartz, Claudia C.

    2011-01-01

    There are seven nursing terminologies or classifications that are considered a standard to support nursing practice in the U.S. Harmonizing these terminologies will enhance the interoperability of clinical data documented across nursing practice. As a first step to harmonize the nursing terminologies, the purpose of this study was to examine how nursing problems or diagnostic concepts from select terminologies were cross-mapped in Unified Medical Language System (UMLS). A comparison analysis was conducted by examining whether cross-mappings available in UMLS through concept unique identifiers were consistent with cross-mappings conducted by human experts. Of 423 concepts from three terminologies, 411 (97%) were manually cross-mapped by experts to the International Classification for Nursing Practice. The UMLS semantic mapping among the 411 nursing concepts presented 33.6% accuracy (i.e., 138 of 411 concepts) when compared to expert cross-mappings. Further research and collaboration among experts in this field are needed for future enhancement of UMLS. PMID:22195127

  12. Topic maps for exploring nosological, lexical, semantic and HL7 structures for clinical data.

    PubMed

    Paterson, Grace I; Grant, Andrew M; Soroka, Steven D

    2008-12-01

A topic map is implemented for learning about clinical data associated with a hospital stay for patients diagnosed with chronic kidney disease, diabetes and hypertension. The question posed is: how might a topic map help bridge perspectival differences among communities of practice and help make commensurable the different classifications they use? The knowledge layer of the topic map was generated from existing ontological relationships in nosological, lexical, semantic and HL7 boundary objects. Discharge summaries, patient charts and clinical data warehouse entries rectified the clinical knowledge used in practice. These clinical data were normalized to the HL7 Clinical Document Architecture (CDA) markup standard and stored in the Clinical Document Repository. Each CDA entry was given a subject identifier and linked with the topic map. The ability of topic maps to function as the infostructure 'glue' is assessed using dimensions of semantic interoperability and commensurability.

  13. Improving Signal-to-Noise Ratio in Scanning Transmission Electron Microscopy Energy-Dispersive X-Ray (STEM-EDX) Spectrum Images Using Single-Atomic-Column Cross-Correlation Averaging.

    PubMed

    Jeong, Jong Seok; Mkhoyan, K Andre

    2016-06-01

    Acquiring an atomic-resolution compositional map of crystalline specimens has become routine practice, thus opening possibilities for extracting subatomic information from such maps. A key challenge for achieving subatomic precision is the improvement of signal-to-noise ratio (SNR) of compositional maps. Here, we report a simple and reliable solution for achieving high-SNR energy-dispersive X-ray (EDX) spectroscopy spectrum images for individual atomic columns. The method is based on standard cross-correlation aided by averaging of single-column EDX maps with modifications in the reference image. It produces EDX maps with minimal specimen drift, beam drift, and scan distortions. Step-by-step procedures to determine a self-consistent reference map with a discussion on the reliability, stability, and limitations of the method are presented here.
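The core idea, cross-correlation alignment followed by averaging, can be illustrated on 1-D signals. The FFT-based shift estimator and the synthetic Gaussian "column profile" below are generic illustrations, not the authors' EDX processing chain:

```python
import numpy as np

def estimate_shift(signal, reference):
    """Signed integer shift that best aligns `signal` to `reference`,
    found at the peak of their circular cross-correlation (via FFT)."""
    corr = np.fft.ifft(np.fft.fft(signal).conj() * np.fft.fft(reference)).real
    shift = int(np.argmax(corr))
    n = len(signal)
    return shift if shift <= n // 2 else shift - n

def align_and_average(maps, reference):
    """Shift each noisy map onto the reference and average them;
    the SNR grows roughly as sqrt(number of maps averaged)."""
    aligned = [np.roll(m, estimate_shift(m, reference)) for m in maps]
    return np.mean(aligned, axis=0)
```

In the paper the same principle is applied in 2-D to per-column EDX maps, with the reference image modified so that drift and scan distortion are suppressed before the average is taken.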

  14. The Miller Assessment for Preschoolers: A Longitudinal and Predictive Study. Final Report.

    ERIC Educational Resources Information Center

    Foundation for Knowledge in Development, Littleton, CO.

    The study reported here sought to establish the predictive validity of the Miller Assessment for Preschoolers (MAP), an instrument designed to identify preschool children at risk for school-related problems in the primary years. Children (N=338) in 11 states who were originally tested in 1980 as part of the MAP standardization project were given a…

  15. Middle Atmosphere Program. Handbook for MAP, Volume 5

    NASA Technical Reports Server (NTRS)

    Sechrist, C. F., Jr. (Editor)

    1982-01-01

    The variability of the stratosphere during the winter in the Northern Hemisphere is considered. Long term monthly mean 30-mbar maps are presented that include geopotential heights, temperatures, and standard deviations of 15 year averages. Latitudinal profiles of mean zonal winds and temperatures are given along with meridional time sections of derived quantities for the winters 1965/66 to 1980/81.

  16. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    NASA Astrophysics Data System (ADS)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area; it has the greatest uncertainty when the disease is rare or the geographical area is small. Therefore, Bayesian models or statistical smoothing based on the log-normal model are introduced, which might solve the SMR problem. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to data using WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates compared to the classical method and can overcome the SMR problem when there are no observed bladder cancer cases in an area.
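The SMR itself is straightforward to compute. The sketch below uses made-up area populations and counts, and derives expected counts from a single overall rate (a simplification of the usual standardization); it also shows the instability the log-normal model is meant to address, since an area with zero observed cases gets an SMR of exactly zero:

```python
def expected_cases(populations, observed):
    """Expected counts under a uniform overall rate:
    E_i = n_i * (total observed / total population)."""
    overall_rate = sum(observed) / sum(populations)
    return [n * overall_rate for n in populations]

def smr(observed, expected):
    """Standardized Morbidity Ratio per area, O_i / E_i;
    None where no cases are expected."""
    return [o / e if e > 0 else None for o, e in zip(observed, expected)]
```

A small area with E_i close to zero makes the ratio swing wildly with a single extra case, which is why smoothed (Bayesian or log-normal) estimates are preferred for rare diseases.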

  17. [A basic research to share Fourier transform near-infrared spectrum information resource].

    PubMed

    Zhang, Lu-Da; Li, Jun-Hui; Zhao, Long-Lian; Zhao, Li-Li; Qin, Fang-Li; Yan, Yan-Lu

    2004-08-01

A method to share the information resource in the database of Fourier transform near-infrared (FTNIR) spectrum information of agricultural products, and to utilize the spectrum information fully, is explored in this paper. Mapping spectrum information from one instrument to another is studied to express the spectrum information accurately between the instruments. The mapping spectrum information is then used to establish a mathematical model of quantitative analysis without including standard samples. The analysis shows that the correlation coefficient r is 0.941 and the relative error is 3.28% between the model estimates and the Kjeldahl values for the protein content of twenty-two wheat samples, while the correlation coefficient r is 0.963 and the relative error is 2.4% for the other model, which was established using standard samples. It is shown that the spectrum information can be shared by using the mapping spectrum information. It can thus be concluded that the spectrum information in one FTNIR spectrum information database can be transformed into another instrument's mapping spectrum information, which makes full use of the information resource in the database of FTNIR spectrum information to realize resource sharing between different instruments.

  18. Histological validation of near-infrared reflectance multispectral imaging technique for caries detection and quantification

    NASA Astrophysics Data System (ADS)

    Salsone, Silvia; Taylor, Andrew; Gomez, Juliana; Pretty, Iain; Ellwood, Roger; Dickinson, Mark; Lombardo, Giuseppe; Zakian, Christian

    2012-07-01

    Near infrared (NIR) multispectral imaging is a novel noninvasive technique that maps and quantifies dental caries. The technique has the ability to reduce the confounding effect of stain present on teeth. The aim of this study was to develop and validate a quantitative NIR multispectral imaging system for caries detection and assessment against a histological reference standard. The proposed technique is based on spectral imaging at specific wavelengths in the range from 1000 to 1700 nm. A total of 112 extracted teeth (molars and premolars) were used and images of occlusal surfaces at different wavelengths were acquired. Three spectral reflectance images were combined to generate a quantitative lesion map of the tooth. The maximum value of the map at the corresponding histological section was used as the NIR caries score. The NIR caries score significantly correlated with the histological reference standard (Spearman's Coefficient=0.774, p<0.01). Caries detection sensitivities and specificities of 72% and 91% for sound areas, 36% and 79% for lesions on the enamel, and 82% and 69% for lesions in dentin were found. These results suggest that NIR spectral imaging is a novel and promising method for the detection, quantification, and mapping of dental caries.

  20. Accuracy of MRI-based Magnetic Susceptibility Measurements

    NASA Astrophysics Data System (ADS)

    Russek, Stephen; Erdevig, Hannah; Keenan, Kathryn; Stupic, Karl

Magnetic Resonance Imaging (MRI) is increasingly used to map tissue susceptibility to identify microbleeds associated with brain injury and pathologic iron deposits associated with neurologic diseases such as Parkinson's and Alzheimer's disease. Field distortions with a resolution of a few parts per billion can be measured using MRI phase maps. The field distortion map can be inverted to obtain a quantitative susceptibility map. To determine the accuracy of MRI-based susceptibility measurements, a set of phantoms with paramagnetic salts and nano-iron gels were fabricated. The shapes and orientations of features were varied. The measured susceptibility of a 1.0 mM GdCl3 solution in water as a function of temperature agreed well with theoretical predictions, assuming Gd3+ is spin 7/2. The MRI susceptibility measurements were compared with SQUID magnetometry. The paramagnetic susceptibility sits on top of the much larger diamagnetic susceptibility of water (-9.04 x 10^-6), which leads to errors in the SQUID measurements. To extract the paramagnetic contribution using standard magnetometry, measurements must be made down to low temperature (2 K). MRI-based susceptometry is shown to be as accurate as, or more accurate than, standard magnetometry and susceptometry techniques.

  1. Unlocking the 9 Components of CSRD.

    ERIC Educational Resources Information Center

    Hansel, Lisa

    This guide provides a map that schools can follow when implementing the Comprehensive School Reform Demonstration (CSRD) program. It is hoped that CSRD will help schools foster higher standards and ensure that schools help students meet these standards. The booklet describes the goals and benchmarks that must be established for student…

  2. Common Bibliographic Standards for Baylor University Libraries. Revised.

    ERIC Educational Resources Information Center

    Scott, Sharon; And Others

    Developed by a Baylor University (Texas) Task Force, the revised policies of bibliographic standards for the university libraries provide formats for: (1) archives and manuscript control; (2) audiovisual media; (3) books; (4) machine-readable data files; (5) maps; (6) music scores; (7) serials; and (8) sound recordings. The task force assumptions…

  3. 76 FR 51985 - ICD-9-CM Coordination and Maintenance Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ... and Public Health Data Standards Staff, announces the following meeting. Name: ICD-9-CM Coordination.... 2012 ICD-10-PCS GEM and Reimbursement Map Updates. ICD-10-PCS Official Coding Guidelines. ICD-10 MS... Pickett, Medical Systems Administrator, Classifications and Public Health Data Standards Staff, NCHS, 3311...

  4. larvalign: Aligning Gene Expression Patterns from the Larval Brain of Drosophila melanogaster.

    PubMed

    Muenzing, Sascha E A; Strauch, Martin; Truman, James W; Bühler, Katja; Thum, Andreas S; Merhof, Dorit

    2018-01-01

    The larval brain of the fruit fly Drosophila melanogaster is a small, tractable model system for neuroscience. Genes for fluorescent marker proteins can be expressed in defined, spatially restricted neuron populations. Here, we introduce the methods for 1) generating a standard template of the larval central nervous system (CNS), 2) spatial mapping of expression patterns from different larvae into a reference space defined by the standard template. We provide a manually annotated gold standard that serves for evaluation of the registration framework involved in template generation and mapping. A method for registration quality assessment enables the automatic detection of registration errors, and a semi-automatic registration method allows one to correct registrations, which is a prerequisite for a high-quality, curated database of expression patterns. All computational methods are available within the larvalign software package: https://github.com/larvalign/larvalign/releases/tag/v1.0.

  5. Standardized Representation of Clinical Study Data Dictionaries with CIMI Archetypes

    PubMed Central

    Sharma, Deepak K.; Solbrig, Harold R.; Prud’hommeaux, Eric; Pathak, Jyotishman; Jiang, Guoqian

    2016-01-01

    Researchers commonly use a tabular format to describe and represent clinical study data. The lack of standardization of data dictionary’s metadata elements presents challenges for their harmonization for similar studies and impedes interoperability outside the local context. We propose that representing data dictionaries in the form of standardized archetypes can help to overcome this problem. The Archetype Modeling Language (AML) as developed by the Clinical Information Modeling Initiative (CIMI) can serve as a common format for the representation of data dictionary models. We mapped three different data dictionaries (identified from dbGAP, PheKB and TCGA) onto AML archetypes by aligning dictionary variable definitions with the AML archetype elements. The near complete alignment of data dictionaries helped map them into valid AML models that captured all data dictionary model metadata. The outcome of the work would help subject matter experts harmonize data models for quality, semantic interoperability and better downstream data integration. PMID:28269909

  7. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-07-01

In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult with such methods to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
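A minimal sketch of the dimensionality-reduction stage: the paper pairs PCA with Local Outlier Factor, but for brevity the example below scores anomalies by PCA reconstruction error, a simpler stand-in, on synthetic data with made-up dimensions:

```python
import numpy as np

def pca_fit(X, n_components):
    """Principal directions via SVD of the mean-centered data matrix."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def anomaly_scores(X, mean, components):
    """Reconstruction error after projecting onto the retained
    components; samples far from the 'normal' subspace score high.
    (A stand-in for the Local Outlier Factor stage of the paper.)"""
    Z = (X - mean) @ components.T
    recon = Z @ components + mean
    return np.linalg.norm(X - recon, axis=1)
```

In the loss-map setting each sample would be the vector of losses at the monitors around the ring; an abnormal collimation hierarchy shows up as a map that the low-dimensional "normal" model cannot reconstruct.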

  8. Geologic map of the San Bernardino North 7.5' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Miller, F.K.; Matti, J.C.

    2001-01-01

    3. Portable Document Format (.pdf) files of: a. This Readme; includes an Appendix, containing data found in sbnorth_met.txt . b. The Description of Map Units identical to that found on the plot of the PostScript file. c. The same graphic as plotted in 2 above. (Test plots from this .pdf do not produce 1:24,000-scale maps. Use Adobe Acrobat pagesize setting to control map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U. S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS San Bernardino North 7.5’ topographic quadrangle in conjunction with the geologic map.

  9. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as a database and PHP/HTML as script language (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server, and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. The access can be secured using a general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists standards defined in the database, and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. 
(2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of a mineral formula based on a fixed amount of oxygen or of cations (using an analysis in element or oxide weight-%); the latter includes re-calculation of H2O/CO2 based on stoichiometry, and oxygen correction for F and Cl. Another option offers a list of any available standards and possible peak or background interferences for a series of elements. (3) "X-ray maps" lists the different setups recommended for element mapping using WDS, and a map calculator to facilitate map setups and to estimate the total mapping time. (4) "X-ray data" lists all X-ray lines for a specific element (K, L, M, absorption edges, and satellite peaks) in terms of energy, wavelength, and peak position. A check for possible interferences on peak or background is also possible. Theoretical X-ray peak positions for each crystal are calculated based on the 2d spacing of each crystal and the wavelength of each line. (5) The "Agenda" menu displays the reservation dates for each month and for each EMP lab defined. It also offers a reservation request option, the request being sent by email to the EMP manager for approval. (6) Finally, "Admin" is password-restricted and contains all necessary options to manage the database through user-friendly forms. The installation of this database is made easy, and knowledge of HTML, PHP, or MySQL is unnecessary to install, configure, manage, or use it. A working database is accessible at http://cub.geoloweb.ch.
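The theoretical peak-position calculation mentioned in item (4) follows Bragg's law, sin(theta) = n * lambda / 2d. A minimal sketch (the crystal 2d spacing and wavelength in the usage note are standard reference values, not taken from this abstract):

```python
import math

def bragg_angle_deg(wavelength_nm, two_d_nm, order=1):
    """Bragg diffraction angle for an analyzing crystal,
    from sin(theta) = n * lambda / 2d.  Returns None when the
    line is out of range for the crystal (|sin(theta)| >= 1)."""
    s = order * wavelength_nm / two_d_nm
    if s >= 1.0:
        return None
    return math.degrees(math.asin(s))
```

For example, Fe K-alpha (lambda about 0.1937 nm) on an LiF crystal (2d about 0.4027 nm) diffracts near theta = 28.8 degrees; a line whose wavelength exceeds the crystal's 2d spacing is flagged as out of range.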

  10. Toward a national fuels mapping strategy: Lessons from selected mapping programs

    USGS Publications Warehouse

    Loveland, Thomas R.

    2001-01-01

    The establishment of a robust national fuels mapping program must be based on pertinent lessons from relevant national mapping programs. Many large-area mapping programs are under way in numerous Federal agencies. Each of these programs follows unique strategies to achieve its mapping goals and objectives. Implementation approaches range from highly centralized programs that use tightly integrated standards and dedicated staff to dispersed programs that permit considerable flexibility. One model facilitates national consistency, while the other accommodates locally relevant conditions and issues. An examination of the programmatic strategies of four national vegetation and land cover mapping initiatives can identify the unique approaches, accomplishments, and lessons of each that should be considered in the design of a national fuels mapping program. The first three programs are the U.S. Geological Survey Gap Analysis Program, the U.S. Geological Survey National Land Cover Characterization Program, and the U.S. Fish and Wildlife Service National Wetlands Inventory. A fourth program, the interagency Multiresolution Land Characterization Program, offers insights into the use of partnerships to accomplish mapping goals. Collectively, the programs provide lessons, guiding principles, and other basic concepts that can be used to design a successful national fuels mapping initiative.

  11. Research on Service Platform of Internet of Things for Smart City

    NASA Astrophysics Data System (ADS)

    Wang, W.; He, Z.; Huang, D.; Zhang, X.

    2014-04-01

    The application of the Internet of Things in the surveying and mapping industry is still largely at the exploration stage and has not yet formed a unified standard. The Chongqing Institute of Surveying and Mapping (CQISM) launched the research project "Research on the Technology of Internet of Things for Smart City". The project focuses on the key technologies of information transmission and exchange on an Internet of Things platform. Data standards for the Internet of Things are designed, and real-time acquisition, mass storage, and distributed data services for large numbers of sensors are realized. On this basis, CQISM deployed a prototype Internet of Things platform. A simulation application in connected cars demonstrates that the platform design is scientific and practical.

  12. Relation between thallium-201/iodine 123-BMIPP subtraction and fluorine 18 deoxyglucose polar maps in patients with hypertrophic cardiomyopathy.

    PubMed

    Ito, Y; Hasegawa, S; Yamaguchi, H; Yoshioka, J; Uehara, T; Nishimura, T

    2000-01-01

    Clinical studies have shown discrepancies between the distributions of thallium-201 and iodine 123-beta-methyl-iodophenylpentadecanoic acid (BMIPP) in patients with hypertrophic cardiomyopathy (HCM). Myocardial uptake of fluorine 18 deoxyglucose (FDG) is increased in the hypertrophic area in HCM. We examined whether the distribution of a Tl-201/BMIPP subtraction polar map correlates with that of an FDG polar map. We normalized each Tl-201 and BMIPP bull's-eye polar map of 6 volunteers to its maximum count and obtained a standard Tl-201/BMIPP subtraction polar map by subtracting the normalized BMIPP map from the normalized Tl-201 map. The Tl-201/BMIPP subtraction polar map was then applied to 8 patients with HCM (mean age 65+/-12 years) to evaluate the discrepancy between Tl-201 and BMIPP distributions. We compared the Tl-201/BMIPP subtraction polar map with an FDG polar map. In patients with HCM, the Tl-201/BMIPP subtraction polar map showed a focal uptake pattern in the hypertrophic area similar to that of the FDG polar map. By quantitative analysis, the severity score of the Tl-201/BMIPP subtraction polar map was significantly correlated with the percent dose uptake of the FDG polar map. These results suggest that this new quantitative method may be an alternative to FDG positron emission tomography for the routine evaluation of HCM.
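The normalize-and-subtract step described in the abstract can be sketched as follows; the four-segment toy maps are purely illustrative, not patient data:

```python
def normalize_to_max(polar_map):
    """Scale a bull's-eye polar map so its maximum count is 1.0."""
    peak = max(polar_map)
    return [v / peak for v in polar_map]

def subtraction_map(tl_map, bmipp_map):
    """Normalized Tl-201 map minus normalized BMIPP map, segment by
    segment; positive values flag a Tl-201/BMIPP discrepancy."""
    return [t - b for t, b in zip(normalize_to_max(tl_map),
                                  normalize_to_max(bmipp_map))]

# Toy 4-segment polar maps (counts invented for illustration).
tl = [80, 100, 90, 70]
bmipp = [40, 50, 30, 35]
diff = subtraction_map(tl, bmipp)
```

Here only the third segment shows a discrepancy, the pattern the method uses to localize hypertrophic regions.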

  13. BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources.

    PubMed

    Lim, Jeongheui; Kim, Sang-Yoon; Kim, Sungmin; Eo, Hae-Seok; Kim, Chang-Bae; Paek, Woon Kee; Kim, Won; Bhak, Jong

    2009-12-03

    DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly now that DNA sequencing technology is cheaply available. Many nations in Asia hold rich biodiversity resources that need to be mapped and registered in databases. We have built BioBarcode, a general-purpose DNA barcode database and server platform assembled from open-source software: MySQL RDBMS 5.0, BLAST2, and the Apache httpd server. An exemplary BioBarcode database has around 11,300 specimen entries (including GenBank data) and registers biological species to map their genetic relationships. The database also contains a chromatogram viewer, which improves the efficiency of DNA sequence analyses. Asia has a very high degree of biodiversity, and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards, and provides services and tools (standardization, deposition, management, and analysis of DNA barcode data) that will make barcoding cheaper and faster for the biodiversity community. The system can be downloaded upon request, and an exemplary server has been constructed as the basis of an Asian biodiversity system at http://www.asianbarcode.org.

  14. Extracting and standardizing medication information in clinical text – the MedEx-UIMA system

    PubMed Central

    Jiang, Min; Wu, Yonghui; Shah, Anushi; Priyanka, Priyanka; Denny, Joshua C.; Xu, Hua

    2014-01-01

    Extraction of medication information embedded in clinical text is important for research using electronic health records (EHRs). However, most current medication information extraction systems identify drug and signature entities without mapping them to a standard representation. In this study, we introduce an open-source Java implementation of MedEx, an existing high-performance medication information extraction system, based on the Unstructured Information Management Architecture (UIMA) framework. In addition, we developed new encoding modules in the MedEx-UIMA system, which map an extracted drug name/dose/form to both generalized and specific RxNorm concepts and translate drug frequency information to the ISO standard. We processed 826 documents with both systems and verified that MedEx-UIMA and MedEx (the Python version) performed similarly by comparing the results. Using two manually annotated test sets that contained 300 drug entries from medication lists and 300 drug entries from narrative reports, the MedEx-UIMA system achieved F-measures of 98.5% and 97.5%, respectively, for encoding drug names to the corresponding RxNorm generic drug ingredients, and F-measures of 85.4% and 88.1%, respectively, for mapping drug names/dose/form to the most specific RxNorm concepts. It also achieved an F-measure of 90.4% for normalizing frequency information to the ISO standard. The open-source MedEx-UIMA system is freely available online at http://code.google.com/p/medex-uima/. PMID:25954575
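The F-measures reported above combine precision and recall in the usual way. A minimal sketch with illustrative counts (not the paper's actual confusion matrix):

```python
def f_measure(true_pos, false_pos, false_neg):
    """Harmonic mean of precision and recall, as used to score drug-name
    encoding performance."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return 2 * precision * recall / (precision + recall)

# e.g. 285 of 300 drug entries correctly encoded, with 6 spurious encodings
score = f_measure(285, 6, 15)
```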

  15. Technical Note: A 3-D rendering algorithm for electromechanical wave imaging of a beating heart.

    PubMed

    Nauleau, Pierre; Melki, Lea; Wan, Elaine; Konofagou, Elisa

    2017-09-01

    Arrhythmias can be treated by ablating the heart tissue in the regions of abnormal contraction. The current clinical standard provides electroanatomic 3-D maps to visualize the electrical activation and locate the arrhythmogenic sources. However, the procedure is time-consuming and invasive. Electromechanical wave imaging is an ultrasound-based noninvasive technique that can provide 2-D maps of the electromechanical activation of the heart. In order to fully visualize the complex 3-D pattern of activation, several 2-D views are acquired and processed separately. They are then manually registered with a 3-D rendering software to generate a pseudo-3-D map. However, this last step is operator-dependent and time-consuming. This paper presents a method to generate a full 3-D map of the electromechanical activation using multiple 2-D images. Two canine models were considered to illustrate the method: one in normal sinus rhythm and one paced from the lateral region of the heart. Four standard echographic views of each canine heart were acquired. Electromechanical wave imaging was applied to generate four 2-D activation maps of the left ventricle. The radial positions and activation timings of the walls were automatically extracted from those maps. In each slice, from apex to base, these values were interpolated around the circumference to generate a full 3-D map. In both cases, a 3-D activation map and a cine-loop of the propagation of the electromechanical wave were automatically generated. The 3-D map showing the electromechanical activation timings overlaid on realistic anatomy assists with the visualization of the sources of earlier activation (which are potential arrhythmogenic sources). The earliest sources of activation corresponded to the expected ones: septum for the normal rhythm and lateral for the pacing case. The proposed technique provides, automatically, a 3-D electromechanical activation map with a realistic anatomy. 
This represents a step towards a noninvasive tool to efficiently localize arrhythmias in 3-D. © 2017 American Association of Physicists in Medicine.
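The slice-by-slice circumferential interpolation of activation timings can be sketched as below; the four angles stand in for the four echographic views, and the timings are invented for illustration:

```python
import math

def interpolate_circumference(angles_deg, timings_ms, n_points=360):
    """Linearly interpolate sparse activation timings around a slice
    circumference, wrapping at 360 degrees."""
    pts = sorted(zip(angles_deg, timings_ms))
    angles = [a for a, _ in pts] + [pts[0][0] + 360.0]  # close the loop
    values = [v for _, v in pts] + [pts[0][1]]
    out = []
    for k in range(n_points):
        a = k * 360.0 / n_points
        if a < angles[0]:
            a += 360.0
        for i in range(len(angles) - 1):
            if angles[i] <= a <= angles[i + 1]:
                f = (a - angles[i]) / (angles[i + 1] - angles[i])
                out.append(values[i] + f * (values[i + 1] - values[i]))
                break
    return out

# Four views give timings (ms) at four angles of one short-axis slice.
full = interpolate_circumference([0, 90, 180, 270], [20, 40, 60, 40])
```

Repeating this from apex to base yields the full 3-D activation map described in the abstract.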

  16. A method to preserve trends in quantile mapping bias correction of climate modeled temperature

    NASA Astrophysics Data System (ADS)

    Grillakis, Manolis G.; Koutroulis, Aristeidis G.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.

    2017-09-01

    Bias correction of climate variables is a standard practice in climate change impact (CCI) studies. Various methodologies have been developed within the framework of quantile mapping. However, it is well known that quantile mapping may significantly modify the long-term statistics due to the time dependency of the temperature bias. Here, a method to overcome this issue without compromising the day-to-day correction statistics is presented. The methodology separates the modeled temperature signal into a normalized and a residual component relative to the modeled reference-period climatology, in order to adjust the biases only for the former and preserve the signal of the latter. The results show that this method allows for the preservation of the originally modeled long-term signal in the mean, the standard deviation, and the higher and lower percentiles of temperature. To illustrate the improvements, the methodology is tested on daily time series obtained from five EURO-CORDEX regional climate models (RCMs).
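The trend-preserving idea, splitting off the long-term signal before correcting, can be sketched as follows. For brevity this sketch substitutes simple mean/variance scaling for full empirical quantile mapping, so it illustrates the decomposition rather than the authors' exact method; all values are invented:

```python
import statistics

def detrended_bias_correction(model_hist, model_fut, obs_hist):
    """Remove the modeled long-term mean change, bias-correct the residual
    against the historical reference, then restore the change."""
    shift = statistics.mean(model_fut) - statistics.mean(model_hist)
    detrended = [t - shift for t in model_fut]
    # Correct detrended values toward the observed reference climatology
    # (mean/variance scaling standing in for quantile mapping).
    mu_m, sd_m = statistics.mean(model_hist), statistics.stdev(model_hist)
    mu_o, sd_o = statistics.mean(obs_hist), statistics.stdev(obs_hist)
    corrected = [mu_o + (t - mu_m) * sd_o / sd_m for t in detrended]
    # Add the preserved long-term signal back.
    return [t + shift for t in corrected]

# Illustrative daily-mean temperatures (degrees C).
model_hist = [0.0, 1.0, 2.0, 3.0, 4.0]
model_fut = [2.0, 3.0, 4.0, 5.0, 6.0]   # same variability, warmed by 2 C
obs_hist = [10.0, 11.0, 12.0, 13.0, 14.0]
corrected_fut = detrended_bias_correction(model_hist, model_fut, obs_hist)
```

After correction, the future series matches the observed reference statistics while keeping the modeled 2-degree warming signal.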

  17. Lithologic mapping of silicate rocks using TIMS

    NASA Technical Reports Server (NTRS)

    Gillespie, A. R.

    1986-01-01

    Common rock-forming minerals have thermal infrared spectral features that are measured in the laboratory to infer composition. An airborne Daedalus scanner (TIMS) that collects six channels of thermal infrared radiance data (8 to 12 microns) may be used to measure these same features for rock identification. Previously, false-color composite pictures made from channels 1, 3, and 5, together with emittance spectra for small areas on these images, were used to make lithologic maps. The central wavelength, standard deviation, and amplitude of normal curves regressed on the emittance spectra are related to compositional information for crystalline igneous silicate rocks. As expected, the central wavelength varies systematically with silica content and with modal quartz content. The standard deviation is less sensitive to compositional changes, but large values may result from admixture of vegetation. Compression of the six TIMS channels into three image channels made from the regressed parameters may be effective in improving geologic mapping from TIMS data, and these synthetic images may form a basis for the remote assessment of rock composition.
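Extracting a central wavelength, standard deviation, and amplitude from a six-channel emittance spectrum can be sketched with a moment-based estimate, a simple stand-in for the regression of normal curves described above; the channel centers and emittance values below are illustrative:

```python
import math

def normal_curve_moments(wavelengths_um, emittance):
    """Moment-based estimate of the central wavelength, standard
    deviation, and amplitude of a spectral absorption feature, treating
    the emittance deficit (1 - e) as the curve of interest."""
    depth = [1.0 - e for e in emittance]
    total = sum(depth)
    center = sum(w * d for w, d in zip(wavelengths_um, depth)) / total
    var = sum(d * (w - center) ** 2
              for w, d in zip(wavelengths_um, depth)) / total
    return center, math.sqrt(var), max(depth)

# Invented six-channel emittance spectrum for a quartz-rich target;
# channel centers (microns) are roughly TIMS-like.
wl = [8.2, 8.7, 9.1, 9.8, 10.7, 11.7]
em = [0.85, 0.80, 0.82, 0.93, 0.97, 0.98]
center, sd, amp = normal_curve_moments(wl, em)
```

The short-wavelength feature center (~9 microns) is the kind of signature that shifts systematically with silica content.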

  18. Pedagogical efficiency of melodic contour mapping technology as it relates to vocal timbre in singers of classical music repertoire.

    PubMed

    Barnes-Burroughs, Kathryn; Anderson, Edward E; Hughes, Thomas; Lan, William Y; Dent, Karl; Arnold, Sue; Dolter, Gerald; McNeil, Kathy

    2007-11-01

    The purpose of this investigation was to ascertain the pedagogical viability of computer-generated melodic contour mapping systems in the classical singing studio, as assessed by their effect (if any) on vocal timbre when a singer's head and neck remain in a normal singing posture. The data gathered during the course of the study indicate that the development of consistent vocal timbre by the classical singing student may be enhanced through visual/kinesthetic response to melodic contour inversion mapping, as it balances the singer's perception of melodic intervals in standard musical notation. Unexpectedly, it was discovered that the system, in its natural melodic contour mode, may also be useful for teaching a student to sing a consistent legato line. The results also suggest that the continued development of this new technology for the general teaching studio, designed to address standard musical notation and a singer's visual/kinesthetic response to it, may indeed be useful.

  19. Towards a well-founded and reproducible snow load map for Austria

    NASA Astrophysics Data System (ADS)

    Winkler, Michael; Schellander, Harald

    2017-04-01

    "EN 1991-1-3 Eurocode 1: Part 1-3: Snow Loads" provides the standard for determining the snow loads to be used in the structural design of buildings. Since 2006, national specifications for Austria have defined a snow load map with four "load zones", allowing the calculation of the characteristic ground snow load sk for locations below 1500 m asl. A quadratic regression between altitude and sk is used, as suggested by EN 1991-1-3. The current snow load map is based on best meteorological practice, but it is still somewhat subjective and non-reproducible. The underlying snow data series often end in the 1980s; in the best case, data until about 2005 are used. Moreover, the extreme value statistics rely only on the Gumbel distribution, and the way in which snow depths were converted to snow loads is generally unknown. These are reasons enough to rethink the snow load standard for Austria, all the more since today's situation differs from that of some 15 years ago. Firstly, Austria is rich in multi-decadal, high-quality snow depth measurements; these data are not well represented in the current standard. Secondly, semi-empirical snow models allow sufficiently precise calculation of snow water equivalents and snow loads from snow depth measurements, without the need for other parameters (e.g., temperature) that are often unavailable at the snow measurement sites. With these tools, daily snow load series can be modelled from daily snow depth measurements. Finally, extreme value statistics nowadays offers convincing methods to calculate snow depths and loads with a return period of 50 years, which is the basis of sk, and allows reproducible spatial extrapolation. The project introduced here will investigate these issues in order to update the Austrian snow load standard by providing a well-founded and reproducible snow load map for Austria. 
Not least, we seek contact with the standards bodies of neighboring countries to find intersections and to avoid inconsistencies and duplication of effort.
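The 50-year return level underlying sk can be obtained from a Gumbel fit to a series of annual maxima. A method-of-moments sketch, with an invented 20-year load series:

```python
import math
import statistics

EULER_GAMMA = 0.5772156649

def gumbel_return_level(annual_maxima, return_period=50):
    """Characteristic value from a Gumbel fit (method of moments) to
    annual maxima, e.g. ground snow loads in kN/m^2."""
    mean = statistics.mean(annual_maxima)
    sd = statistics.stdev(annual_maxima)
    beta = sd * math.sqrt(6) / math.pi          # scale parameter
    mu = mean - EULER_GAMMA * beta              # location parameter
    p = 1.0 - 1.0 / return_period               # non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# Hypothetical 20-year series of annual maximum ground snow loads (kN/m^2).
loads = [1.1, 0.9, 1.4, 1.0, 1.6, 1.2, 0.8, 1.3, 1.1, 1.5,
         0.9, 1.2, 1.0, 1.7, 1.1, 1.3, 0.9, 1.4, 1.0, 1.2]
sk = gumbel_return_level(loads)
```

Repeating such a fit per station, then extrapolating spatially, is the reproducible workflow the abstract argues for.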

  20. Statistical parametric mapping of LORETA using high density EEG and individual MRI: application to mismatch negativities in schizophrenia.

    PubMed

    Park, Hae-Jeong; Kwon, Jun Soo; Youn, Tak; Pae, Ji Soo; Kim, Jae-Jin; Kim, Myung-Sun; Ha, Kyoo-Seob

    2002-11-01

    We describe a method for the statistical parametric mapping of low resolution electromagnetic tomography (LORETA) using high-density electroencephalography (EEG) and individual magnetic resonance images (MRI) to investigate the characteristics of the mismatch negativity (MMN) generators in schizophrenia. LORETA, using a realistic head model of the boundary element method derived from the individual anatomy, estimated the current density maps from the scalp topography of the 128-channel EEG. From the current density maps that covered the whole cortical gray matter (up to 20,000 points), volumetric current density images were reconstructed. Intensity normalization of the smoothed current density images was used to reduce the confounding effect of subject specific global activity. After transforming each image into a standard stereotaxic space, we carried out statistical parametric mapping of the normalized current density images. We applied this method to the source localization of MMN in schizophrenia. The MMN generators, produced by a deviant tone of 1,200 Hz (5% of 1,600 trials) under the standard tone of 1,000 Hz, 80 dB binaural stimuli with 300 msec of inter-stimulus interval, were measured in 14 right-handed schizophrenic subjects and 14 age-, gender-, and handedness-matched controls. We found that the schizophrenic group exhibited significant current density reductions of MMN in the left superior temporal gyrus and the left inferior parietal gyrus (P < 0.0005). This study is the first voxel-by-voxel statistical mapping of current density using individual MRI and high-density EEG. Copyright 2002 Wiley-Liss, Inc.

  1. Integration of Remote Sensing and Geographic Information System in Ground Water Quality Assessment and Management

    NASA Astrophysics Data System (ADS)

    Shakak, N.

    2015-04-01

    Spatial variations in groundwater quality in Khartoum state, Sudan, have been studied using a geographic information system (GIS) and remote sensing techniques. A GIS, a tool for storing, analyzing, and displaying spatial data, is also used here to investigate groundwater quality information. A Landsat mosaic image of Khartoum acquired in 2013 was used; ArcGIS software was applied to extract the boundary of the study area, and the image was classified to create a land use/land cover map. The land use, geological, and soil maps were used to correlate land use, geological formations, and soil types in order to understand the sources of natural pollution that can lower groundwater quality. For this study, the Global Positioning System (GPS) was used in the field to identify each borehole location in three-dimensional coordinates (latitude, longitude, and altitude). Water samples were collected from 156 borehole wells and analyzed for physico-chemical parameters such as electrical conductivity, total dissolved solids, chloride, nitrate, sodium, magnesium, calcium, and fluoride, using standard laboratory techniques, and the results were compared with the relevant standards. Groundwater quality maps of the entire study area were prepared for all the above parameters using spatial interpolation, and the resulting maps were used to visualize, analyze, and understand the relationships among the measured points. The maps were coded into potable and non-potable zones, in terms of water quality suitability for drinking and for irrigation. In general, satellite remote sensing in conjunction with GIS offers great potential for water resource development and management.
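The spatial interpolation step can be sketched with inverse distance weighting, one common choice for point water-quality data; the borehole coordinates and TDS values below are hypothetical:

```python
def idw(sample_points, query, power=2):
    """Inverse-distance-weighted estimate of a water-quality parameter
    (e.g. TDS in mg/L) at an unsampled location from borehole samples."""
    num = den = 0.0
    for (x, y, value) in sample_points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0:
            return value  # query coincides with a borehole
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# Hypothetical boreholes: (easting_km, northing_km, TDS mg/L).
wells = [(0, 0, 400), (4, 0, 800), (0, 3, 600), (4, 3, 1000)]
tds = idw(wells, (2.0, 1.5))
```

Evaluating this estimator on a regular grid, then thresholding against drinking-water limits, yields the potable/non-potable zoning described above.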

  2. Pig brain stereotaxic standard space: mapping of cerebral blood flow normative values and effect of MPTP-lesioning.

    PubMed

    Andersen, Flemming; Watanabe, Hideaki; Bjarkam, Carsten; Danielsen, Erik H; Cumming, Paul

    2005-07-15

    The analysis of physiological processes in the brain by positron emission tomography (PET) is facilitated when images are spatially normalized to a standard coordinate system. Thus, PET activation studies of the human brain frequently employ the common stereotaxic coordinates of Talairach. We have developed an analogous stereotaxic coordinate system for the brain of the Gottingen miniature pig, based on automatic co-registration of magnetic resonance (MR) images obtained in 22 male pigs. The origin of the pig brain stereotaxic space (0, 0, 0) was arbitrarily placed in the centroid of the pineal gland as identified on the average MRI template. The orthogonal planes were imposed using the line between stereotaxic zero and the optic chiasm. A series of mean MR images in the coronal, sagittal and horizontal planes were generated. To test the utility of the common coordinate system for functional imaging studies of minipig brain, we calculated cerebral blood flow (CBF) maps from normal minipigs and from minipigs with a syndrome of parkinsonism induced by 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) poisoning. These maps were transformed from the native space into the common stereotaxic space. After global normalization of these maps, an undirected search for differences between the groups was performed using statistical parametric mapping. Using this method, we detected a statistically significant focal increase in CBF in the left cerebellum of the MPTP-lesioned group. We expect the present approach to be of general use in the statistical parametric mapping of CBF and other physiological parameters in the living pig brain.

  3. Quantum theory of the generalised uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bruneton, Jean-Philippe; Larena, Julien

    2017-04-01

    We significantly extend previous work on the Hilbert space representations of the generalized uncertainty principle (GUP) in 3 + 1 dimensions of the form [X_i, P_j] = i F_{ij}, where F_{ij} = f(P^2) δ_{ij} + g(P^2) P_i P_j for any functions f and g, although we restrict our study to the case of commuting X's. We focus in particular on the symmetries of the theory, and on the minimal length that emerges in some cases. We first show that, at the algebraic level, there exists an unambiguous mapping between a GUP with a deformed quantum algebra and a quadratic Hamiltonian, and a standard Heisenberg algebra of operators with an aquadratic Hamiltonian, provided the boost sector of the symmetries is modified accordingly. The theory can also be mapped to a completely standard quantum mechanics with standard symmetries, but with momentum-dependent position operators. Next, we investigate the Hilbert space representations of these algebraically equivalent models, focusing specifically on whether they exhibit a minimal length. We carry out the functional analysis of the various operators involved and show that the appearance of a minimal length depends critically on the relationship between the generators of translations and the physical momenta. In particular, because this relationship is preserved by the algebraic mapping presented in this paper, when a minimal length is present in the standard GUP it is also present in the corresponding aquadratic Hamiltonian formulation, despite the perfectly standard algebra of this model. In general, a minimal length requires bounded generators of translations, i.e. a specific kind of quantization of space, and this depends on the precise shape of the function f defined above. This result provides an elegant and unambiguous classification of which universal quantum gravity corrections lead to the emergence of a minimal length.

  4. Interoperability of medical device information and the clinical applications: an HL7 RMIM based on the ISO/IEEE 11073 DIM.

    PubMed

    Yuksel, Mustafa; Dogac, Asuman

    2011-07-01

    Medical devices are essential to the practice of modern healthcare services. Their benefits will increase if clinical software applications can seamlessly acquire the medical device data. The need to represent medical device observations in a format that can be consumed by clinical applications has already been recognized by the industry. Yet the solutions proposed involve bilateral mappings from the ISO/IEEE 11073 Domain Information Model (DIM) to specific message or document standards. Considering that there are many different types of clinical applications, such as electronic health record and personal health record systems, clinical workflows, and clinical decision support systems, each conforming to different standard interfaces, detailing a mapping mechanism for every one of them introduces significant work and thus limits the potential health benefits of medical devices. In this paper, to facilitate the interoperability of clinical applications and medical device data, we use the ISO/IEEE 11073 DIM to derive an HL7 v3 Refined Message Information Model (RMIM) of the medical device domain from the HL7 v3 Reference Information Model (RIM). This makes it possible to trace the medical device data back to a standard common denominator, that is, the HL7 v3 RIM from which all the other medical domains under HL7 v3 are derived. Hence, once the medical device data are obtained in the RMIM format, they can easily be transformed into HL7-based standard interfaces through XML transformations, because these interfaces all have their building blocks from the same RIM. To demonstrate this, we provide the mappings from the developed RMIM to some of the widely used HL7 v3-based standard interfaces.

  5. Evolving geometrical heterogeneities of fault trace data

    NASA Astrophysics Data System (ADS)

    Wechsler, Neta; Ben-Zion, Yehuda; Christofferson, Shari

    2010-08-01

    We perform a systematic comparative analysis of geometrical fault zone heterogeneities using derived measures from digitized fault maps that are not very sensitive to mapping resolution. We employ the digital GIS map of California faults (version 2.0) and analyse the surface traces of active strike-slip fault zones with evidence of Quaternary and historic movements. Each fault zone is broken into segments that are defined as a continuous length of fault bounded by changes of angle larger than 1°. Measurements of the orientations and lengths of fault zone segments are used to calculate the mean direction and misalignment of each fault zone from the local plate motion direction, and to define several quantities that represent the fault zone disorder. These include circular standard deviation and circular standard error of segments, orientation of long and short segments with respect to the mean direction, and normal separation distances of fault segments. We examine the correlations between various calculated parameters of fault zone disorder and the following three potential controlling variables: cumulative slip, slip rate and fault zone misalignment from the plate motion direction. The analysis indicates that the circular standard deviation and circular standard error of segments decrease overall with increasing cumulative slip and increasing slip rate of the fault zones. The results imply that the circular standard deviation and error, quantifying the range or dispersion in the data, provide effective measures of the fault zone disorder, and that the cumulative slip and slip rate (or more generally slip rate normalized by healing rate) represent the fault zone maturity. The fault zone misalignment from plate motion direction does not seem to play a major role in controlling the fault trace heterogeneities. The frequency-size statistics of fault segment lengths can be fitted well by an exponential function over the entire range of observations.
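The circular statistics used above as disorder measures can be sketched as follows; the segment strikes are invented, and for simplicity the sketch treats orientations as directional rather than axial (mod 180) data:

```python
import math

def circular_stats(angles_deg):
    """Mean direction and circular standard deviation (both in degrees)
    of a set of segment orientations; dispersion uses sqrt(-2 ln R),
    where R is the mean resultant length."""
    n = len(angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg) / n
    s = sum(math.sin(math.radians(a)) for a in angles_deg) / n
    r = math.hypot(c, s)
    mean_dir = math.degrees(math.atan2(s, c)) % 360.0
    circ_sd = math.degrees(math.sqrt(-2.0 * math.log(r)))
    return mean_dir, circ_sd

# Invented segment strikes: a tightly aligned (mature) fault zone...
mature = circular_stats([40, 42, 41, 39, 40, 43])
# ...and a more disordered (immature) one.
immature = circular_stats([40, 55, 28, 49, 33, 61])
```

The lower circular standard deviation of the aligned zone is the signature the abstract associates with greater cumulative slip and slip rate.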

  6. Standardizing clinical laboratory data for secondary use.

    PubMed

    Abhyankar, Swapna; Demner-Fushman, Dina; McDonald, Clement J

    2012-08-01

    Clinical databases provide a rich source of data for answering clinical research questions. However, the variables recorded in clinical data systems are often identified by local, idiosyncratic, and sometimes redundant and/or ambiguous names (or codes) rather than unique, well-organized codes from standard code systems. This reality discourages research use of such databases, because researchers must invest considerable time in cleaning up the data before they can ask their first research question. Researchers at MIT developed MIMIC-II, a nearly complete collection of clinical data about intensive care patients. Because its data are drawn from existing clinical systems, it has many of the problems described above. In collaboration with the MIT researchers, we have begun a process of cleaning up the data and mapping the variable names and codes to LOINC codes. Our first step, which we describe here, was to map all of the laboratory test observations to LOINC codes. We were able to map 87% of the unique laboratory tests, covering 94% of the total number of laboratory test results. Of the 13% of tests that we could not map, nearly 60% were due to test names whose real meaning could not be discerned and 29% represented tests that were not yet included in the LOINC table. These results suggest that LOINC codes cover most of the laboratory tests used in critical care. We have delivered this work to the MIMIC-II researchers, who have included it in their standard MIMIC-II database release, so that researchers who use this database in the future will not have to repeat this work. Published by Elsevier Inc.

  7. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers with a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze, and annotate the data. The infrastructure and Portal are based on a web-service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth, and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC) Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). 
Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open-source data catalog, archive, file management, and data grid framework; OpenSSO, an open-source access management and federation platform; Solr, an open-source enterprise search platform; Redmine, an open-source project collaboration and management framework; GDAL, an open-source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
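Because the Portal's services follow OGC interface standards, any standards-aware client can retrieve map imagery with an ordinary WMS GetMap request. The sketch below builds such a request in Python; the endpoint URL and layer name are hypothetical stand-ins, not the actual LMMP service.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint; the actual LMMP service URL is not given in the record.
WMS_ENDPOINT = "https://example.org/lmmp/wms"

def getmap_url(layer, bbox, width=512, height=512, fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL for a map layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "CRS:84",  # lon/lat; a real lunar service may use a selenographic CRS
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return WMS_ENDPOINT + "?" + urlencode(params)

url = getmap_url("LRO_WAC_Mosaic", (-180, -90, 180, 90))
```

The companion WCS and WFS interfaces mentioned in the record follow the same key-value-pair pattern with different REQUEST and parameter names.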

  8. Google Maps offers a new way to evaluate claudication.

    PubMed

    Khambati, Husain; Boles, Kim; Jetty, Prasad

    2017-05-01

    Accurate determination of walking capacity is important for the clinical diagnosis and management plan for patients with peripheral arterial disease. The current "gold standard" of measurement is walking distance on a treadmill. However, treadmill testing is not always reflective of the patient's natural walking conditions, and it may not be fully accessible in every vascular clinic. The objective of this study was to determine whether Google Maps, the readily available GPS-based mapping tool, offers an accurate and accessible method of evaluating walking distances in vascular claudication patients. Patients presenting to the outpatient vascular surgery clinic between November 2013 and April 2014 at the Ottawa Hospital with vasculogenic calf, buttock, and thigh claudication symptoms were identified and prospectively enrolled in our study. Onset of claudication symptoms and maximal walking distance (MWD) were evaluated using four tools: history; Walking Impairment Questionnaire (WIQ), a validated claudication survey; Google Maps distance calculator (patients were asked to report their daily walking routes on the Google Maps-based tool runningmap.com, and walking distances were calculated accordingly); and treadmill testing for onset of symptoms and MWD, recorded in a double-blinded fashion. Fifteen patients were recruited for the study. Determination of walking distances using Google Maps proved to be more accurate than by both clinical history and WIQ, correlating highly with the gold standard of treadmill testing for both claudication onset (r = .805; P < .001) and MWD (r = .928; P < .0001). In addition, distances were generally under-reported on history and WIQ. The Google Maps tool was also efficient, with reporting times averaging below 4 minutes. For vascular claudicants with no other walking limitations, Google Maps is a promising new tool that combines the objective strengths of the treadmill test and incorporates real-world walking environments. 
It offers an accurate, efficient, inexpensive, and readily accessible way to assess walking distances in patients with peripheral vascular disease. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
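The correlations reported above (r = .805 for claudication onset, r = .928 for MWD) are Pearson coefficients between paired distance measurements. A minimal pure-Python sketch of that comparison, using synthetic illustrative distances rather than the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired measurement series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Illustrative (synthetic) walking distances in metres -- not the study's data.
treadmill = [120, 250, 300, 410, 520]
google_maps = [130, 240, 320, 400, 500]
r = pearson_r(treadmill, google_maps)
```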

  9. Participatory Development and Analysis of a Fuzzy Cognitive Map of the Establishment of a Bio-Based Economy in the Humber Region

    PubMed Central

    Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren

    2013-01-01

Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph) representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as for sense-checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole-systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out in a second session. Participants agreed on updates to the original map and described two alternative potential causal structures. In a novel analysis, all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study.
Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
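The linear versus sigmoidal distinction the authors analyze comes down to the squashing function applied at each map update. A minimal sketch of synchronous FCM iteration, using a hypothetical three-concept map rather than the Humber map:

```python
import math

def fcm_step(state, W, squash):
    """One synchronous FCM update: each concept's next activation is the
    squashed weighted sum of its causal inputs (W[j][i] = weight j -> i)."""
    n = len(state)
    return [squash(sum(W[j][i] * state[j] for j in range(n))) for i in range(n)]

def iterate(state, W, squash, steps=50):
    for _ in range(steps):
        state = fcm_step(state, W, squash)
    return state

sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
linear = lambda x: max(0.0, min(1.0, x))  # clamped linear mapping

# Illustrative 3-concept map; weights are hypothetical, not from the Humber study.
W = [[0.0, 0.6, 0.0],
     [0.0, 0.0, 0.8],
     [-0.3, 0.0, 0.0]]
s0 = [1.0, 0.5, 0.2]
fixed_sig = iterate(s0, W, sigmoid)
fixed_lin = iterate(s0, W, linear)
```

Even this toy map illustrates the paper's point: the two mappings drive the same graph to qualitatively different equilibria, which is why a sensitivity analysis over the choice of squashing function matters.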

  10. Importance of Calibration Method in Central Blood Pressure for Cardiac Structural Abnormalities.

    PubMed

    Negishi, Kazuaki; Yang, Hong; Wang, Ying; Nolan, Mark T; Negishi, Tomoko; Pathan, Faraz; Marwick, Thomas H; Sharman, James E

    2016-09-01

Central blood pressure (CBP) independently predicts cardiovascular risk, but calibration methods may affect the accuracy of central systolic blood pressure (CSBP). Standard central systolic blood pressure (Stan-CSBP) from peripheral waveforms is usually derived with calibration using brachial SBP and diastolic BP (DBP). However, calibration using oscillometric mean arterial pressure (MAP) and DBP (MAP-CSBP) is purported to provide a more accurate representation of true invasive CSBP. This study sought to determine which derived CSBP could more accurately discriminate cardiac structural abnormalities. A total of 349 community-based patients with risk factors (71±5 years, 161 males) had CSBP measured by brachial oscillometry (Mobil-O-Graph, IEM GmbH, Stolberg, Germany) using 2 calibration methods: MAP-CSBP and Stan-CSBP. Left ventricular hypertrophy (LVH) and left atrial dilatation (LAD) were measured based on standard guidelines. MAP-CSBP was higher than Stan-CSBP (149±20 vs. 128±15 mm Hg, P < 0.0001). Although they were modestly correlated (rho = 0.74, P < 0.001), the Bland-Altman plot demonstrated a large bias (21 mm Hg) and limits of agreement (24 mm Hg). In receiver operating characteristic (ROC) curve analyses, MAP-CSBP significantly better discriminated LVH compared with Stan-CSBP (area under the curve (AUC) 0.66 vs. 0.59, P = 0.0063) and brachial SBP (0.62, P = 0.027). Continuous net reclassification improvement (NRI) (P < 0.001) and integrated discrimination improvement (IDI) (P < 0.001) corroborated the superior discrimination of LVH by MAP-CSBP. Similarly, MAP-CSBP better distinguished LAD than Stan-CSBP (AUC 0.63 vs. 0.56, P = 0.005) and conventional brachial SBP (0.58, P = 0.006), whereas Stan-CSBP provided no better discrimination than conventional brachial BP (P = 0.09). CSBP is calibration dependent, and when oscillometric MAP and DBP are used, the derived CSBP is a better discriminator for cardiac structural abnormalities.
© American Journal of Hypertension, Ltd 2016. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
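The bias and limits of agreement cited in this record come from a standard Bland-Altman analysis of paired readings. A minimal sketch with synthetic CSBP values (not the study's data):

```python
import math

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative paired CSBP readings in mm Hg; synthetic, not the study's data.
map_csbp = [150, 142, 160, 148, 155]
stan_csbp = [128, 125, 138, 127, 133]
bias, (lo, hi) = bland_altman(map_csbp, stan_csbp)
```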

  11. Digital mapping techniques '06 - Workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2007-01-01

The Digital Mapping Techniques '06 (DMT'06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/).
The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public. At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  12. The effectiveness of texture analysis for mapping forest land using the panchromatic bands of Landsat 7, SPOT, and IRS imagery

    Treesearch

    Michael L. Hoppus; Rachel I. Riemann; Andrew J. Lister; Mark V. Finco

    2002-01-01

    The panchromatic bands of Landsat 7, SPOT, and IRS satellite imagery provide an opportunity to evaluate the effectiveness of texture analysis of satellite imagery for mapping of land use/cover, especially forest cover. A variety of texture algorithms, including standard deviation, Ryherd-Woodcock minimum variance adaptive window, low pass etc., were applied to moving...

  13. Mapping Fusarium solani and Aphanomyces euteiches root rot resistance and root architecture quantitative trait loci in common bean (Phaseolus vulgaris)

    USDA-ARS?s Scientific Manuscript database

    Root rot diseases of bean (Phaseolus vulgaris L.) are a constraint to dry and snap bean production. We developed the RR138 RIL mapping population from the cross of OSU5446, a susceptible line that meets current snap bean processing industry standards, and RR6950, a root rot resistant dry bean in th...

  14. US Topo Maps 2014: Program updates and research

    USGS Publications Warehouse

    Fishburn, Kristin A.

    2014-01-01

    The U. S. Geological Survey (USGS) US Topo map program is now in year two of its second three-year update cycle. Since the program was launched in 2009, the product and the production system tools and processes have undergone enhancements that have made the US Topo maps a popular success story. Research and development continues with structural and content product enhancements, streamlined and more fully automated workflows, and the evaluation of a GIS-friendly US Topo GIS Packet. In addition, change detection methodologies are under evaluation to further streamline product maintenance and minimize resource expenditures for production in the future. The US Topo map program will continue to evolve in the years to come, providing traditional map users and Geographic Information System (GIS) analysts alike with a convenient, freely available product incorporating nationally consistent data that are quality assured to high standards.

  15. The deegree framework - Spatial Data Infrastructure solution for end-users and developers

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Poth, Andreas

    2010-05-01

The open source software framework deegree is a comprehensive implementation of standards as defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: to provide a uniform framework for implementing Spatial Data Infrastructures (SDIs) and to adhere to standards as strictly as possible. Although it is open source software (Lesser GNU Public License, LGPL), deegree has been developed with a business model in mind: providing the general building blocks of SDIs without license fees, while specialized companies offer customization, consulting, and tailoring. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata, and coordinate reference systems. As a library, deegree can be, and has been, integrated as a core module inside spatial information systems. It is a reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree is shipped as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g., the Federal Agency for Cartography and Geodesy in Germany, the Ministry of Housing, Spatial Planning and the Environment in the Netherlands) as well as for research and development projects as an early adopter of standards, drafts, and discussion papers. Besides mature standards like the Web Map Service, Web Feature Service, and Catalogue Services, deegree also implements newer standards like the Sensor Observation Service, the Web Processing Service, and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers.
The added value comes from a sophisticated set of client software, desktop and web environments. One focus lies on dedicated client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions composed of multiple standards and enhanced by components for user management, security, and map client functionality show the demanding requirements of real-world solutions. The XPlan-GML standard, as defined by the German spatial planning authorities, is a good example of how complex real-world requirements can get. XPlan-GML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS), and Web Feature Services (WFS). This complex infrastructure is to be used by urban and spatial planners and therefore requires a user-friendly graphical interface hiding the complexity of the underlying infrastructure. The challenges faced in customer projects underline the importance of easy-to-use software components. SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.

  16. Product Review

    EPA Pesticide Factsheets

Resources and guidelines for developing and getting communications products approved at EPA, including print items, Web content standards, video production and submission, interactive maps, and social media policies.

  17. Soil pH Mapping with an On-The-Go Sensor

    PubMed Central

    Schirrmann, Michael; Gebbers, Robin; Kramer, Eckart; Seidel, Jan

    2011-01-01

Soil pH is a key parameter for crop productivity; therefore, its spatial variation should be adequately addressed to improve precision management decisions. Recently, the Veris pH Manager™, a sensor for high-resolution mapping of soil pH at the field scale, has been made commercially available in the US. While driving over the field, soil pH is measured on-the-go directly within the soil by ion-selective antimony electrodes. The aim of this study was to evaluate the Veris pH Manager™ under farming conditions in Germany. Sensor readings were compared with data obtained by standard protocols of soil pH assessment. Experiments took place under different scenarios: (a) controlled tests in the lab, (b) semicontrolled tests on transects in a stop-and-go mode, and (c) tests under practical conditions in the field with the sensor working in its typical on-the-go mode. Accuracy issues, problems, options, and potential benefits of the Veris pH Manager™ were addressed. The tests demonstrated a high degree of linearity between standard laboratory values and sensor readings. Under practical conditions in the field (scenario c), the measure of fit (r²) for the regression between the on-the-go measurements and the reference data was 0.71, 0.63, and 0.84, respectively. Field-specific calibration was necessary to reduce systematic errors. Accuracy of the on-the-go maps was considerably higher compared with the pH maps obtained by following the standard protocols, and the error in calculating lime requirements was reduced by about one half. However, the system showed some weaknesses due to blockage by residual straw and weed roots. If these problems were solved, the on-the-go sensor investigated here could be an efficient alternative to standard sampling protocols as a basis for liming in Germany. PMID:22346591

  18. Soil pH mapping with an on-the-go sensor.

    PubMed

    Schirrmann, Michael; Gebbers, Robin; Kramer, Eckart; Seidel, Jan

    2011-01-01

Soil pH is a key parameter for crop productivity; therefore, its spatial variation should be adequately addressed to improve precision management decisions. Recently, the Veris pH Manager™, a sensor for high-resolution mapping of soil pH at the field scale, has been made commercially available in the US. While driving over the field, soil pH is measured on-the-go directly within the soil by ion-selective antimony electrodes. The aim of this study was to evaluate the Veris pH Manager™ under farming conditions in Germany. Sensor readings were compared with data obtained by standard protocols of soil pH assessment. Experiments took place under different scenarios: (a) controlled tests in the lab, (b) semicontrolled tests on transects in a stop-and-go mode, and (c) tests under practical conditions in the field with the sensor working in its typical on-the-go mode. Accuracy issues, problems, options, and potential benefits of the Veris pH Manager™ were addressed. The tests demonstrated a high degree of linearity between standard laboratory values and sensor readings. Under practical conditions in the field (scenario c), the measure of fit (r²) for the regression between the on-the-go measurements and the reference data was 0.71, 0.63, and 0.84, respectively. Field-specific calibration was necessary to reduce systematic errors. Accuracy of the on-the-go maps was considerably higher compared with the pH maps obtained by following the standard protocols, and the error in calculating lime requirements was reduced by about one half. However, the system showed some weaknesses due to blockage by residual straw and weed roots. If these problems were solved, the on-the-go sensor investigated here could be an efficient alternative to standard sampling protocols as a basis for liming in Germany.
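The r² values reported for the on-the-go regressions can be computed from paired sensor and laboratory readings by ordinary least squares. A minimal sketch with synthetic pH pairs (not the study's data):

```python
def linfit_r2(x, y):
    """Least-squares line y ≈ a + b·x and coefficient of determination r²,
    as used to calibrate on-the-go sensor pH against laboratory pH."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Illustrative sensor vs. laboratory pH pairs (synthetic).
sensor = [5.2, 5.8, 6.1, 6.6, 7.0]
lab = [5.0, 5.7, 6.0, 6.7, 7.1]
a, b, r2 = linfit_r2(sensor, lab)
```

The field-specific calibration the authors describe amounts to applying the fitted intercept a and slope b to subsequent sensor readings.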

  19. High Versus Low Blood-Pressure Target in Experimental Ischemic Prolonged Cardiac Arrest Treated with Extra Corporeal Life Support.

    PubMed

    Fritz, Caroline; Kimmoun, Antoine; Vanhuyse, Fabrice; Trifan, Bogdan Florin; Orlowski, Sophie; Falanga, Aude; Marie, Vanessa; Groubatch, Frederique; Albuisson, Eliane; Tran, N'Guyen; Levy, Bruno

    2017-06-01

There is currently no recommendation for the mean arterial pressure target in the particular setting of Extracorporeal Cardiopulmonary Resuscitation (ECPR) in the first hours following cardiogenic shock complicated by cardiac arrest. This study aimed to assess the effects of two different levels of mean arterial pressure on macrocirculatory, microcirculatory, and metabolic functions. Randomized animal study. University research laboratory. Ventricular fibrillation was induced in 14 male pigs by surgical ligature of the interventricular coronary artery. After 20 min of cardiopulmonary resuscitation, Extracorporeal Life Support (ECLS) was initiated to restore circulatory flow. Thereafter, animals were randomly allocated to a high mean arterial pressure group (High-MAP, 80-85 mm Hg) or to a standard mean arterial pressure group (Standard-MAP, 65-70 mm Hg). Assessments conducted at baseline, immediately following, and 6 h after ECLS initiation were focused on lactate evolution, amount of infused fluid, and microcirculatory parameters. There was no significant difference between the two groups at the time of ECLS initiation and at 6 h with regard to lactate levels (High-MAP vs. Standard-MAP: 8.8 [6.7-12.9] vs. 9.6 [9.1-9.8] mmol/L, P = 0.779 and 8.9 [4.3-11.1] vs. 3.3 [2.4-11] mmol/L, P = 0.603). Infused fluid volume did not significantly differ between the two groups (4,000 [3,500-12,000] vs. 5,000 [2,500-18,000] mL, P = 0.977). There was also no significant difference between the two groups regarding renal and liver functions, or sublingual capillary microvascular flow index assessed by Sidestream Dark Field imaging. Compared with a standard mean arterial pressure regimen, targeting a high mean arterial pressure in the first hours of an experimental ECPR model did not result in any hemodynamic improvement or in a reduction of the amount of infused fluid.

  20. A Comparison of Readability in Science-Based Texts: Implications for Elementary Teachers

    ERIC Educational Resources Information Center

    Gallagher, Tiffany; Fazio, Xavier; Ciampa, Katia

    2017-01-01

    Science curriculum standards were mapped onto various texts (literacy readers, trade books, online articles). Statistical analyses highlighted the inconsistencies among readability formulae for Grades 2-6 levels of the standards. There was a lack of correlation among the readability measures, and also when comparing different text sources. Online…

  1. The Effect of International Financial Reporting Standards Convergence on U. S. Accounting Curriculum

    ERIC Educational Resources Information Center

    Bates, Homer L.; Waldrup, Bobby E.; Shea, Vincent

    2011-01-01

    Major changes are coming to U.S. financial accounting and accounting education as U. S. generally accepted accounting principles (GAAP) and international financial reporting standards (IFRS) converge within the next few years. In 2008, the U.S. Securities and Exchange Commission (SEC) published a proposed "road map" for the potential…

  2. Wavelength selection method with standard deviation: application to pulse oximetry.

    PubMed

    Vazquez-Jaccaud, Camille; Paez, Gonzalo; Strojnik, Marija

    2011-07-01

Near-infrared spectroscopy provides useful biological information after the radiation has penetrated the tissue within the therapeutic window. One significant shortcoming of current applications of spectroscopic techniques to a live subject is that the subject may be uncooperative and the sample undergoes significant temporal variations due to the subject's health status, which, from a radiometric point of view, introduce measurement noise. We describe a novel wavelength selection method for monitoring, based on a standard deviation map, that allows low noise sensitivity. It may be used with spectral transillumination, transmission, or reflection signals, including those corrupted by noise and unavoidable temporal effects. We apply it to the selection of two wavelengths for the case of pulse oximetry. Using spectroscopic data, we generate a map of standard deviation that we propose as a figure of merit in the presence of the noise introduced by the living subject. Even in the presence of diverse sources of noise, we identify four wavelength domains with standard deviation minimally sensitive to temporal noise, and two wavelength domains with low sensitivity to temporal noise.
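The core of the selection idea reduces to computing a per-wavelength standard deviation across repeated spectra and choosing the least variable bands. A minimal sketch with synthetic spectra; the real method operates on transillumination measurements from a live subject:

```python
import math

def std_map(spectra):
    """Per-wavelength sample standard deviation across repeated spectra;
    low-σ wavelengths are least sensitive to temporal (subject) noise."""
    n = len(spectra)
    n_wl = len(spectra[0])
    out = []
    for k in range(n_wl):
        vals = [s[k] for s in spectra]
        m = sum(vals) / n
        out.append(math.sqrt(sum((v - m) ** 2 for v in vals) / (n - 1)))
    return out

def least_noisy(spectra, count=2):
    """Indices of the `count` wavelengths with lowest temporal variability."""
    sigma = std_map(spectra)
    return sorted(range(len(sigma)), key=lambda k: sigma[k])[:count]

# Illustrative repeated spectra (rows = time points, columns = wavelengths).
spectra = [
    [0.50, 0.80, 0.30, 0.90],
    [0.51, 0.70, 0.31, 0.60],
    [0.49, 0.90, 0.29, 0.75],
]
picks = least_noisy(spectra)
```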

  3. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    PubMed

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.

  4. An Atlas of ShakeMaps for Landslide and Liquefaction Modeling

    NASA Astrophysics Data System (ADS)

    Johnson, K. L.; Nowicki, M. A.; Mah, R. T.; Garcia, D.; Harp, E. L.; Godt, J. W.; Lin, K.; Wald, D. J.

    2012-12-01

    The human consequences of a seismic event are often a result of subsequent hazards induced by the earthquake, such as landslides. While the United States Geological Survey (USGS) ShakeMap and Prompt Assessment of Global Earthquakes for Response (PAGER) systems are, in conjunction, capable of estimating the damage potential of earthquake shaking in near-real time, they do not currently provide estimates for the potential of further damage by secondary processes. We are developing a sound basis for providing estimates of the likelihood and spatial distribution of landslides for any global earthquake under the PAGER system. Here we discuss several important ingredients in this effort. First, we report on the development of a standardized hazard layer from which to calibrate observed landslide distributions; in contrast, prior studies have used a wide variety of means for estimating the hazard input. This layer now takes the form of a ShakeMap, a standardized approach for computing geospatial estimates for a variety of shaking metrics (both peak ground motions and shaking intensity) from any well-recorded earthquake. We have created ShakeMaps for about 20 historical landslide "case history" events, significant in terms of their landslide occurrence, as part of an updated release of the USGS ShakeMap Atlas. We have also collected digitized landslide data from open-source databases for many of the earthquake events of interest. When these are combined with up-to-date topographic and geologic maps, we have the basic ingredients for calibrating landslide probabilities for a significant collection of earthquakes. In terms of modeling, rather than focusing on mechanistic models of landsliding, we adopt a strictly statistical approach to quantify landslide likelihood. 
We incorporate geology, slope, peak ground acceleration, and landslide data as variables in a logistic regression, selecting the best explanatory variables given the standardized new hazard layers (see Nowicki et al., this meeting, for more detail on the regression). To make the ShakeMap and PAGER systems more comprehensive in terms of secondary losses, we are working to calibrate a similarly constrained regression for liquefaction estimation using a suite of well-studied earthquakes for which detailed, digitized liquefaction datasets are available; here variants of wetness index and soil strength replace geology and slope. We expect that this Atlas of ShakeMaps for landslide and liquefaction case history events, which will soon be publicly available via the internet, will aid in improving the accuracy of loss-modeling systems such as PAGER, as well as allow for a common framework for numerous other mechanistic and empirical studies.
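The "strictly statistical approach" described above is a logistic regression of landslide occurrence on shaking and terrain covariates. A minimal sketch with synthetic grid cells and only two illustrative covariates (slope and peak ground acceleration); the actual regression in Nowicki et al. also includes geology and uses real ShakeMap and inventory data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Batch gradient descent for P(landslide) = sigmoid(w·x + b).
    Features here are illustrative [normalized slope, PGA]; geology would
    enter as additional indicator columns."""
    w, b, n = [0.0] * len(X[0]), 0.0, len(X)
    for _ in range(epochs):
        gw, gb = [0.0] * len(w), 0.0
        for xi, yi in zip(X, y):
            e = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j, xj in enumerate(xi):
                gw[j] += e * xj
            gb += e
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Synthetic cells: [normalized slope, PGA in g]; 1 = landslide observed.
X = [[0.1, 0.1], [0.2, 0.3], [0.3, 0.2], [0.7, 0.8], [0.8, 0.7], [0.9, 0.9]]
y = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(X, y)
p_high = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.9, 0.8])) + b)
p_low = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.1, 0.2])) + b)
```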

  5. SU-F-J-93: Automated Segmentation of High-Resolution 3D WholeBrain Spectroscopic MRI for Glioblastoma Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreibmann, E; Shu, H; Cordova, J

Purpose: We report on an automated segmentation algorithm for defining radiation therapy target volumes using spectroscopic MR images (sMRI) acquired at a nominal voxel resolution of 100 microliters. Methods: Whole-brain sMRI combining 3D echo-planar spectroscopic imaging, generalized auto-calibrating partially-parallel acquisitions, and elliptical k-space encoding was conducted on a 3T MRI scanner with a 32-channel head coil array. Metabolite maps generated include choline (Cho), creatine (Cr), and N-acetylaspartate (NAA), as well as Cho/NAA, Cho/Cr, and NAA/Cr ratio maps. Automated segmentation was achieved by concomitantly considering sMRI metabolite maps with standard contrast-enhancing (CE) imaging in a pipeline that first uses the water signal for skull stripping. Subsequently, an initial blob of the tumor region is identified by searching for regions of FLAIR abnormalities that also display reduced NAA activity, using a mean ratio correlation and morphological filters. These regions are used as the starting point for a geodesic level-set refinement that adapts the initial blob to the fine details specific to each metabolite. Results: Accuracy of the segmentation model was tested on a cohort of 12 patients with sMRI datasets acquired pre-, mid-, and post-treatment, providing a broad range of enhancement patterns. Whereas heterogeneity in tumor appearance and shape posed a greater challenge to the algorithm in classical imaging, regions of abnormal activity were easily detected in the sMRI metabolite maps by combining the detail available in the standard imaging with the local enhancement produced by the metabolites. Results can be imported into treatment planning, leading in general to an increase in the target volumes (GTV60) when using sMRI+CE MRI compared with the standard CE MRI alone.
Conclusion: Integration of automated segmentation of sMRI metabolite maps into planning is feasible and will likely streamline acceptance of this new acquisition modality in clinical practice.

  6. Landscape features, standards, and semantics in U.S. national topographic mapping databases

    USGS Publications Warehouse

    Varanka, Dalia

    2009-01-01

    The objective of this paper is to examine the contrast between local, field-surveyed topographical representation and feature representation in digital, centralized databases and to clarify their ontological implications. The semantics of these two approaches are contrasted by examining the categorization of features by subject domains inherent to national topographic mapping. A comparison of five USGS topographic mapping domain and feature lists indicates that multiple semantic meanings and ontology rules were applied to the initial digital database but were lost as databases became more centralized at national scales, and common semantics were replaced by technological terms.

  7. Identification of mineral resources in Afghanistan-Detecting and mapping resource anomalies in prioritized areas using geophysical and remote sensing (ASTER and HyMap) data

    USGS Publications Warehouse

    King, Trude V.V.; Johnson, Michaela R.; Hubbard, Bernard E.; Drenth, Benjamin J.

    2011-01-01

    During the independent analysis of the geophysical, ASTER, and imaging spectrometer (HyMap) data by USGS scientists, previously unrecognized targets of potential mineralization were identified using evaluation criteria most suitable to the individual dataset. These anomalous zones offer targets of opportunity that warrant additional field verification. This report describes the standards used to define the anomalies, summarizes the results of the evaluations for each type of data, and discusses the importance and implications of regions of anomaly overlap between two or three of the datasets.

  8. Assessment of LANDSAT for rangeland mapping, Rush Valley, Utah

    NASA Technical Reports Server (NTRS)

    Ridd, M. K.; Price, K. P.; Douglass, G. E.

    1984-01-01

    The feasibility of using LANDSAT MSS (multispectral scanner) data to identify and map cover types for rangeland, and to determine comparative condition of the ecotypes was assessed. A supporting objective is to assess the utility of various forms of aerial photography in the process. If rangelands can be efficiently mapped with Landsat data, as supported by appropriate aerial photography and field data, then uniform standards of cover classification and condition may be applied across the rangelands of the state. Further, a foundation may be established for long-term monitoring of range trend, using the same satellite system over time.

  9. GIO-EMS and International Collaboration in Satellite based Emergency Mapping

    NASA Astrophysics Data System (ADS)

    Kucera, Jan; Lemoine, Guido; Broglia, Marco

    2013-04-01

    During the last decade, satellite-based emergency mapping has developed into a mature operational stage. The European Union's GMES Initial Operations - Emergency Management Service (GIO-EMS) has been operational since April 2012. Its setup differs from other mechanisms (for example, the International Charter "Space and Major Disasters") in that it extends fast satellite tasking and delivery with value-adding map production as a single service, available free of charge to the authorized users of the service. The initiative covers maps and vector datasets with standard characteristics and formats, ranging from post-disaster damage assessment to recovery and disaster prevention. The main users of the service are European civil protection authorities and international organizations active in humanitarian aid. All non-sensitive outputs of the service are accessible to the public. The European Commission's in-house science service, the Joint Research Centre (JRC), is the technical and administrative supervisor of GIO-EMS. The EC's DG ECHO Monitoring and Information Centre acts as the service's focal point, and DG ENTR is responsible for overall service governance. GIO-EMS also aims to contribute to synergy with similar existing mechanisms at the national and international level. The usage of satellite data for emergency mapping has increased in recent years, and this trend is expected to continue because of easier access to suitable satellite and other relevant data in the near future. Furthermore, the data and analyses coming from volunteer emergency mapping communities are expected to further enrich the content of such cartographic products. In the case of major disasters, the parallel activity of multiple providers is likely to generate non-optimal use of resources, e.g. unnecessary duplication, whereas coordination may reduce the time needed to cover the disaster area. 
Furthermore, the abundance of geospatial products of differing characteristics and quality can become confusing for users. The urgent need for better coordination has led to the establishment of the International Working Group on Satellite Based Emergency Mapping (IWG-SEM). Members of the IWG-SEM, which include JRC, USGS, DLR-ZKI, SERVIR, Sentinel Asia, UNOSAT, UN-SPIDER, GEO, ITHACA and SERTIT, have recognized the need to establish best practice among operational satellite-based emergency mapping programs. The group intends to: • work with the appropriate organizations on the definition of professional standards for emergency mapping and guidelines for product generation, and review relevant technical standards and protocols; • facilitate communication and collaboration during major emergencies; • stimulate coordination of expertise and capacities. The existence of the group and the cooperation among its members have already brought benefits during recent disasters in Africa and Europe in 2012, in terms of faster and more effective satellite data provision and better product generation.

  10. Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture

    NASA Astrophysics Data System (ADS)

    Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.

    2012-12-01

    The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consumes these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal), which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the use of WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal. 
This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.

  11. Stable Estimation of a Covariance Matrix Guided by Nuclear Norm Penalties

    PubMed Central

    Chi, Eric C.; Lange, Kenneth

    2014-01-01

    Estimation of a covariance matrix or its inverse plays a central role in many statistical methods. For these methods to work reliably, estimated matrices must not only be invertible but also well-conditioned. The current paper introduces a novel prior to ensure a well-conditioned maximum a posteriori (MAP) covariance estimate. The prior shrinks the sample covariance estimator towards a stable target and leads to a MAP estimator that is consistent and asymptotically efficient. Thus, the MAP estimator gracefully transitions towards the sample covariance matrix as the number of samples grows relative to the number of covariates. The utility of the MAP estimator is demonstrated in two standard applications – discriminant analysis and EM clustering – in this sampling regime. PMID:25143662
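    The general idea of shrinking a sample covariance toward a well-conditioned target can be sketched as below. This is a generic linear-shrinkage illustration (a convex combination with a scaled identity), not the paper's nuclear-norm-penalized MAP estimator; the function name and the mixing weight `alpha` are ours:

```python
import numpy as np

def shrunk_covariance(X, alpha):
    """Shrink the sample covariance of X (n samples x p covariates)
    toward a stable, well-conditioned target (scaled identity).
    Illustrative of shrinkage in general, not the paper's estimator."""
    n, p = X.shape
    S = np.cov(X, rowvar=False, bias=True)   # sample covariance
    target = (np.trace(S) / p) * np.eye(p)   # well-conditioned target
    return (1 - alpha) * S + alpha * target
```

When n is small relative to p the sample covariance is singular, while any positive weight on the identity target yields an invertible, better-conditioned estimate; as n grows, alpha can be driven to zero so the estimator transitions back to the sample covariance.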

  12. A maize map standard with sequenced core markers, grass genome reference points and 932 expressed sequence tagged sites (ESTs) in a 1736-locus map.

    PubMed Central

    Davis, G L; McMullen, M D; Baysdorfer, C; Musket, T; Grant, D; Staebell, M; Xu, G; Polacco, M; Koster, L; Melia-Hancock, S; Houchins, K; Chao, S; Coe, E H

    1999-01-01

    We have constructed a 1736-locus maize genome map containing 1156 loci probed by cDNAs, 545 probed by random genomic clones, 16 by simple sequence repeats (SSRs), 14 by isozymes, and 5 by anonymous clones. Sequence information is available for 56% of the loci with 66% of the sequenced loci assigned functions. A total of 596 new ESTs were mapped from a B73 library of 5-wk-old shoots. The map contains 237 loci probed by barley, oat, wheat, rice, or tripsacum clones, which serve as grass genome reference points in comparisons between maize and other grass maps. Ninety core markers selected for low copy number, high polymorphism, and even spacing along the chromosome delineate the 100 bins on the map. The average bin size is 17 cM. Use of bin assignments enables comparison among different maize mapping populations and experiments including those involving cytogenetic stocks, mutants, or quantitative trait loci. Integration of nonmaize markers in the map extends the resources available for gene discovery beyond the boundaries of maize mapping information into the expanse of map, sequence, and phenotype information from other grass species. This map provides a foundation for numerous basic and applied investigations including studies of gene organization, gene and genome evolution, targeted cloning, and dissection of complex traits. PMID:10388831

  13. Global Mapping Project - Applications and Development of Version 2 Dataset

    NASA Astrophysics Data System (ADS)

    Ubukawa, T.; Nakamura, T.; Otsuka, T.; Iimura, T.; Kishimoto, N.; Nakaminami, K.; Motojima, Y.; Suga, M.; Yatabe, Y.; Koarai, M.; Okatani, T.

    2012-07-01

    The Global Mapping Project aims to develop basic geospatial information of the whole land area of the globe, named Global Map, through the cooperation of National Mapping Organizations (NMOs) around the world. The Global Map data can be a base of global geospatial infrastructure and is composed of eight layers: Boundaries, Drainage, Transportation, Population Centers, Elevation, Land Use, Land Cover and Vegetation. Global Map Version 1 was released in 2008, and Version 2 will be released in 2013, as the data are to be updated every five years. In 2009, the International Steering Committee for Global Mapping (ISCGM) adopted new Specifications to develop the Global Map Version 2 with a change of its format so that it is compatible with international standards, namely ISO 19136 and ISO 19115. With the support of the secretariat of ISCGM, the project participating countries are accelerating their data development toward the completion of the global coverage in 2013, while some countries have already released their Global Map Version 2 datasets since 2010. Global Map data are available from the Internet free of charge for non-commercial purposes, and can be used to predict, assess, prepare for and cope with global issues by combining them with other spatial data. There are many Global Map applications in various fields, and further utilization of Global Map is expected. This paper summarises the activities toward the development of the Global Map Version 2 as well as some examples of the Global Map applications in various fields.

  14. Improved diagonal queue medical image steganography using Chaos theory, LFSR, and Rabin cryptosystem.

    PubMed

    Jain, Mamta; Kumar, Anil; Choudhary, Rishabh Charan

    2017-06-01

    In this article, we have proposed an improved diagonal queue medical image steganography for patient secret medical data transmission using chaotic standard map, linear feedback shift register, and Rabin cryptosystem, for improvement of previous technique (Jain and Lenka in Springer Brain Inform 3:39-51, 2016). The proposed algorithm comprises four stages, generation of pseudo-random sequences (pseudo-random sequences are generated by linear feedback shift register and standard chaotic map), permutation and XORing using pseudo-random sequences, encryption using Rabin cryptosystem, and steganography using the improved diagonal queues. Security analysis has been carried out. Performance analysis is observed using MSE, PSNR, maximum embedding capacity, as well as by histogram analysis between various Brain disease stego and cover images.
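    The pseudo-random sequence generation stage of the pipeline above can be illustrated with a minimal Fibonacci-style LFSR. The register length, seed, and tap positions below are hypothetical examples, not the parameters used in the paper:

```python
def lfsr_stream(seed, taps, nbits, length):
    """Generate a pseudo-random bit sequence from a linear feedback
    shift register: output the LSB, XOR the tapped bits (0-based
    positions) and feed the result back into the MSB. Illustrative
    sketch; the paper's exact LFSR configuration is not given."""
    state = seed
    out = []
    for _ in range(length):
        out.append(state & 1)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))
    return out
```

With a primitive feedback polynomial the register cycles through all 2^nbits - 1 nonzero states, giving a maximal-length sequence suitable for the permutation/XOR stage; such a stream is then typically XORed with the secret data before embedding.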

  15. Regulations in the field of Geo-Information

    NASA Astrophysics Data System (ADS)

    Felus, Y.; Keinan, E.; Regev, R.

    2013-10-01

    The geomatics profession has gone through a major revolution during the last two decades with the emergence of advanced GNSS, GIS and Remote Sensing technologies. These technologies have changed the core principles and working procedures of geomatics professionals. For this reason, surveying and mapping regulations, standards and specifications should be updated to reflect these changes. In Israel, the "Survey Regulations" is the principal document that regulates professional activities in four key areas: geodetic control, mapping, cadastre and geographic information systems. Licensed surveyors and mapping professionals in Israel are required to work according to those regulations. This year a new set of regulations has been published, including a few major amendments, as follows. In the Geodesy chapter, horizontal control is officially based on the Israeli network of Continuously Operating GNSS Reference Stations (CORS). The regulations were phrased in a manner that will allow minor datum changes to the CORS stations due to Earth crustal movements. Moreover, the regulations permit the use of GNSS for low-accuracy height measurements. In the Cadastre chapter, the most critical change is the move to Coordinate Based Cadastre (CBC). Each parcel corner point is ranked according to its quality (accuracy and clarity of definition). The highest ranking for a parcel corner is 1. A point with a rank of 1 is defined by its coordinates alone; any contradicting evidence is inferior to the coordinate values. Cadastral information is stored and managed via the National Cadastral Databases. In the Mapping and GIS chapter, the traditional paper maps (ranked by scale) are replaced by digital maps or spatial databases. These spatial databases are ranked by their quality level. Quality level is determined (similar to the ISO 19157 standard) by logical consistency, completeness, positional accuracy, attribute accuracy, temporal accuracy and usability. 
Metadata is another critical component of any spatial database. Every component in a map should have a metadata identification, even if the map was compiled from multiple resources. The regulations permit the use of advanced sensors and mapping techniques, including LIDAR and digital cameras, that have been certified and meet the defined criteria. The article reviews these new regulations and the decisions that led to them.

  16. Assessing the impact of graphical quality on automatic text recognition in digital maps

    NASA Astrophysics Data System (ADS)

    Chiang, Yao-Yi; Leyk, Stefan; Honarvar Nazari, Narges; Moghaddam, Sima; Tan, Tian Xiang

    2016-08-01

    Converting geographic features (e.g., place names) in map images into a vector format is the first step for incorporating cartographic information into a geographic information system (GIS). With the advancement in computational power and algorithm design, map processing systems have been considerably improved over the last decade. However, the fundamental map processing techniques such as color image segmentation, (map) layer separation, and object recognition are sensitive to minor variations in graphical properties of the input image (e.g., scanning resolution). As a result, most map processing results would not meet user expectations if the user does not "properly" scan the map of interest, pre-process the map image (e.g., using compression or not), and train the processing system, accordingly. These issues could slow down the further advancement of map processing techniques as such unsuccessful attempts create a discouraged user community, and less sophisticated tools would be perceived as more viable solutions. Thus, it is important to understand what kinds of maps are suitable for automatic map processing and what types of results and process-related errors can be expected. In this paper, we shed light on these questions by using a typical map processing task, text recognition, to discuss a number of map instances that vary in suitability for automatic processing. We also present an extensive experiment on a diverse set of scanned historical maps to provide measures of baseline performance of a standard text recognition tool under varying map conditions (graphical quality) and text representations (that can vary even within the same map sheet). Our experimental results help the user understand what to expect when a fully or semi-automatic map processing system is used to process a scanned map with certain (varying) graphical properties and complexities in map content.

  17. The evolution of internet-based map server applications in the United States Department of Agriculture, Veterinary Services.

    PubMed

    Maroney, Susan A; McCool, Mary Jane; Geter, Kenneth D; James, Angela M

    2007-01-01

    The internet is used increasingly as an effective means of disseminating information. For the past five years, the United States Department of Agriculture (USDA) Veterinary Services (VS) has published animal health information in internet-based map server applications, each oriented to a specific surveillance or outbreak response need. Using internet-based technology allows users to create dynamic, customised maps and perform basic spatial analysis without the need to buy or learn desktop geographic information systems (GIS) software. At the same time, access can be restricted to authorised users. The VS internet mapping applications to date are as follows: Equine Infectious Anemia Testing 1972-2005, National Tick Survey tick distribution maps, the Emergency Management Response System-Mapping Module for disease investigations and emergency outbreaks, and the Scrapie mapping module to assist with the control and eradication of this disease. These services were created using Environmental Systems Research Institute (ESRI)'s internet map server technology (ArcIMS). Other leading technologies for spatial data dissemination are ArcGIS Server, ArcEngine, and ArcWeb Services. VS is prototyping applications using these technologies, including the VS Atlas of Animal Health Information using ArcGIS Server technology and the Map Kiosk using ArcEngine for automating standard map production in the case of an emergency.

  18. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    NASA Astrophysics Data System (ADS)

    Lehrack, S.; Duckeck, G.; Ebke, J.

    2014-06-01

    The Apache Hadoop software is a Java-based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives in distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data quickly and reliably with MapReduce. In the evaluation of HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
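    The MapReduce processing model that Hadoop provides can be illustrated with a tiny in-process sketch: mappers emit (key, value) pairs, the framework groups values by key, and a reducer folds each group. This is the programming model only, not Hadoop's Java API:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal in-process illustration of the MapReduce model:
    map each record to (key, value) pairs, group values by key
    (the 'shuffle'), then reduce each group to a single result."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return {key: reducer(key, values) for key, values in groups.items()}

# Word count, the classic MapReduce demonstration
counts = map_reduce(
    ["hadoop maps data", "hadoop reduces data"],
    mapper=lambda line: [(w, 1) for w in line.split()],
    reducer=lambda key, values: sum(values),
)
```

In Hadoop proper, the records would be HDFS blocks distributed across the cluster, which is exactly why non-splittable binary ROOT files need special handling, as the abstract notes.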

  19. Diversity arrays technology (DArT) markers in apple for genetic linkage maps.

    PubMed

    Schouten, Henk J; van de Weg, W Eric; Carling, Jason; Khan, Sabaz Ali; McKay, Steven J; van Kaauwen, Martijn P W; Wittenberg, Alexander H J; Koehorst-van Putten, Herma J J; Noordijk, Yolanda; Gao, Zhongshan; Rees, D Jasper G; Van Dyk, Maria M; Jaccoud, Damian; Considine, Michael J; Kilian, Andrzej

    2012-03-01

    Diversity Arrays Technology (DArT) provides a high-throughput whole-genome genotyping platform for the detection and scoring of hundreds of polymorphic loci without any need for prior sequence information. The work presented here details the development and performance of a DArT genotyping array for apple. This is the first paper on DArT in horticultural trees. Genetic mapping of DArT markers in two mapping populations and their integration with other marker types showed that DArT is a powerful high-throughput method for obtaining accurate and reproducible marker data, despite the low cost per data point. This method appears to be suitable for aligning the genetic maps of different segregating populations. The standard complexity reduction method, based on the methylation-sensitive PstI restriction enzyme, resulted in a high frequency of markers, although there was 52-54% redundancy due to the repeated sampling of highly similar sequences. Sequencing of the marker clones showed that they are significantly enriched for low-copy, genic regions. The genome coverage using the standard method was 55-76%. For improved genome coverage, an alternative complexity reduction method was examined, which resulted in less redundancy and additional segregating markers. The DArT markers proved to be of high quality and were very suitable for genetic mapping at low cost for the apple, providing moderate genome coverage. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s11032-011-9579-5) contains supplementary material, which is available to authorized users.

  20. Knowledge synthesis with maps of neural connectivity.

    PubMed

    Tallis, Marcelo; Thompson, Richard; Russ, Thomas A; Burns, Gully A P C

    2011-01-01

    This paper describes software for neuroanatomical knowledge synthesis based on neural connectivity data. This software supports a mature methodology developed since the early 1990s. Over this time, the Swanson laboratory at USC has generated an account of the neural connectivity of the sub-structures of the hypothalamus, amygdala, septum, hippocampus, and bed nucleus of the stria terminalis. This is based on neuroanatomical data maps drawn into a standard brain atlas by experts. In earlier work, we presented an application for visualizing and comparing anatomical macro connections using the Swanson third edition atlas as a framework for accurate registration. Here we describe major improvements to the NeuARt application based on the incorporation of a knowledge representation of experimental design. We also present improvements in the interface and features of the data mapping components within a unified web-application. As a step toward developing an accurate sub-regional account of neural connectivity, we provide navigational access between the data maps and a semantic representation of area-to-area connections that they support. We do so using the "Knowledge Engineering from Experimental Design" (KEfED) model, which is based on experimental variables. We have extended the underlying KEfED representation of tract-tracing experiments by incorporating the definition of a neuroanatomical data map as a measurement variable in the study design. This paper describes the software design of a web-application that allows anatomical data sets to be described within a standard experimental context and thus indexed by non-spatial experimental design features.

  1. Empirical Model of Precipitating Ion Oval

    NASA Astrophysics Data System (ADS)

    Goldstein, Jerry

    2017-10-01

    In this brief technical report, published maps of ion integral flux are used to constrain an empirical model of the precipitating ion oval. The ion oval is modeled as a Gaussian function of ionospheric latitude that depends on local time and the Kp geomagnetic index. The three parameters defining this function are the centroid latitude, width, and amplitude. The local time dependences of these three parameters are approximated by Fourier series expansions whose coefficients are constrained by the published ion maps. The Kp dependence of each coefficient is modeled by a linear fit. Optimization of the number of terms in the expansion is achieved via minimization of the global standard deviation between the model and the published ion map at each Kp. The empirical model is valid near the peak flux of the auroral oval; inside its centroid region the model reproduces the published ion maps with standard deviations of less than 5% of the peak integral flux. On the subglobal scale, average local errors (measured as a fraction of the point-to-point integral flux) are below 30% in the centroid region. Outside its centroid region the model deviates significantly from the H89 integral flux maps. The model's performance is assessed by comparing it with both local and global data from a 17 April 2002 substorm event. The model can reproduce important features of the macroscale auroral region but none of its subglobal structure, and not immediately following a substorm.
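    The model structure described above, a Gaussian in latitude whose three parameters are each a Fourier expansion in local time, can be sketched as follows. The coefficient values in the test are hypothetical, not the paper's Kp-dependent fits:

```python
import numpy as np

def mlt_fourier(mlt, coeffs):
    """Fourier-series expansion in magnetic local time (MLT, hours)
    for one oval parameter. coeffs = (a0, [(a1, b1), (a2, b2), ...]);
    coefficient values here are hypothetical, not the paper's fits."""
    a0, harmonics = coeffs
    angle = 2.0 * np.pi * mlt / 24.0
    value = a0
    for n, (a, b) in enumerate(harmonics, start=1):
        value += a * np.cos(n * angle) + b * np.sin(n * angle)
    return value

def ion_oval_flux(mlat, mlt, centroid_coeffs, width_coeffs, amp_coeffs):
    """Gaussian model of the precipitating ion oval: flux as a
    Gaussian in ionospheric latitude, with centroid, width, and
    amplitude each given by a local-time Fourier expansion."""
    c = mlt_fourier(mlt, centroid_coeffs)
    w = mlt_fourier(mlt, width_coeffs)
    a = mlt_fourier(mlt, amp_coeffs)
    return a * np.exp(-0.5 * ((mlat - c) / w) ** 2)
```

In the full model each Fourier coefficient would additionally be a linear function of Kp; fitting then amounts to minimizing the standard deviation between this function and the published maps.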

  2. Detection of Brain Reorganization in Pediatric Multiple Sclerosis Using Functional MRI

    DTIC Science & Technology

    2014-10-01

    Research titled "Passive fMRI mapping of language function for pediatric epilepsy surgery: validation using Wada, ECS, and FMAER". The mapping of language is important in pediatric patients who will undergo resection surgery near cortical regions essential for language function.

  3. Impact of Gender, Ethnicity, Year in School, Social Economic Status, and State Standardized Assessment Scores on Student Content Knowledge Achievement when Using Vee Maps as a Formative Assessment Tool

    ERIC Educational Resources Information Center

    Thoron, Andrew C.; Myers, Brian E.

    2011-01-01

    The National Research Council has recognized the challenge of assessing laboratory investigation and called for the investigation of assessments that are proven through sound research-based studies. The Vee map provides a framework that allows the learners to conceptualize their previous knowledge as they develop success in meaningful learning…

  4. Schwarz maps of algebraic linear ordinary differential equations

    NASA Astrophysics Data System (ADS)

    Sanabria Malagón, Camilo

    2017-12-01

    A linear ordinary differential equation is called algebraic if all its solutions are algebraic over its field of definition. In this paper we solve the problem of finding closed-form solutions to algebraic linear ordinary differential equations in terms of standard equations. Furthermore, we obtain a method to compute all algebraic linear ordinary differential equations with rational coefficients by studying their associated Schwarz maps through Picard-Vessiot theory.

  5. Magician Simulator: A Realistic Simulator for Heterogenous Teams of Autonomous Robots. MAGIC 2010 Challenge

    DTIC Science & Technology

    2011-02-07

    Sensor UGVs (SUGV) or Disruptor UGVs, depending on their payload. The SUGVs included vision, GPS/IMU, and LIDAR systems for identifying and tracking...employed by all the MAGICian research groups. Objects of interest were tracked using standard LIDAR and Computer Vision template-based feature...tracking approaches. Mapping was solved through Multi-Agent particle-filter based Simultaneous Localization and Mapping (SLAM). Our system contains

  6. Topologies on directed graphs

    NASA Technical Reports Server (NTRS)

    Lieberman, R. N.

    1972-01-01

    Given a directed graph, a natural topology is defined and relationships between standard topological properties and graph theoretical concepts are studied. In particular, the properties of connectivity and separatedness are investigated. A metric is introduced which is shown to be related to separatedness. The topological notions of continuity and homeomorphism are also examined. A class of maps is studied which preserves both graph and topological properties. Applications involving strong maps and contractions are also presented.

  7. SBKF Modeling and Analysis Plan: Buckling Analysis of Compression-Loaded Orthogrid and Isogrid Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Hilburger, Mark W.

    2013-01-01

    This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.

  8. Estimating the Health and Economic Impacts of Changes in Local Air Quality

    PubMed Central

    Carvour, Martha L.; Hughes, Amy E.; Fann, Neal

    2018-01-01

    Objectives. To demonstrate the benefits-mapping software Environmental Benefits Mapping and Analysis Program-Community Edition (BenMAP-CE), which integrates local air quality data with previously published concentration–response and health–economic valuation functions to estimate the health effects of changes in air pollution levels and their economic consequences. Methods. We illustrate a local health impact assessment of ozone changes in the 10-county nonattainment area of the Dallas–Fort Worth region of Texas, estimating the short-term effects on mortality predicted by 2 scenarios for 3 years (2008, 2011, and 2013): an incremental rollback of the daily 8-hour maximum ozone levels of all area monitors by 10 parts per billion, and a rollback to a standard ambient level of 65 parts per billion only at monitors above that level. Results. Estimates of preventable premature deaths attributable to ozone air pollution obtained by the incremental rollback method varied little by year, whereas those obtained by the rollback-to-a-standard method varied by year and were sensitive to the choice of ordinality and the use of preloaded or imported data. Conclusions. BenMAP-CE allows local and regional public health analysts to generate timely, evidence-based estimates of the health impacts and economic consequences of potential policy options in their communities. PMID:29698094
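    The two rollback scenarios in the Methods can be sketched in a few lines. The 10 ppb decrement and the 65 ppb standard follow the abstract; the function names and sample monitor values are ours:

```python
def incremental_rollback(ozone_ppb, delta=10.0):
    """Incremental rollback: lower every monitor's daily 8-hour max
    ozone by a fixed amount (10 ppb in the abstract), floored at 0."""
    return [max(x - delta, 0.0) for x in ozone_ppb]

def rollback_to_standard(ozone_ppb, standard=65.0):
    """Rollback-to-a-standard: cap only monitors above the standard
    (65 ppb in the abstract) at that level; others are unchanged."""
    return [min(x, standard) for x in ozone_ppb]
```

BenMAP-CE would then feed the difference between baseline and rolled-back concentrations into concentration-response functions to estimate avoided deaths; these helpers only illustrate how the two scenarios treat monitor values differently.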

  9. Relative risk estimation of Chikungunya disease in Malaysia: An analysis based on Poisson-gamma model

    NASA Astrophysics Data System (ADS)

    Samat, N. A.; Ma'arof, S. H. Mohd Imam

    2015-05-01

    Disease mapping is a method to display the geographical distribution of disease occurrence, which generally involves the use and interpretation of a map to show the incidence of certain diseases. Relative risk (RR) estimation is one of the most important issues in disease mapping. This paper begins by providing a brief overview of Chikungunya disease. This is followed by a review of the classical model used in disease mapping, based on the standardized morbidity ratio (SMR), which we then apply to our Chikungunya data. We then fit an extension of the classical model, referred to as a Poisson-Gamma model, in which prior distributions for the relative risks are assumed known. Both sets of results are displayed and compared using maps; the Poisson-Gamma model yields a smoother map with fewer extreme values of estimated relative risk. Extensions of this work will consider other methods that overcome the drawbacks of the existing ones, in order to inform and direct government strategy for monitoring and controlling Chikungunya disease.
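    A minimal sketch of the two estimators compared above, assuming a conjugate Gamma(a, b) prior with illustrative hyperparameters (the paper's actual prior values are not reproduced here):

```python
def smr(observed, expected):
    # Classical standardized morbidity ratio per area: observed / expected counts.
    return [o / e for o, e in zip(observed, expected)]

def poisson_gamma_rr(observed, expected, a=1.0, b=1.0):
    # Posterior mean relative risk under O_i ~ Poisson(E_i * theta_i) with
    # theta_i ~ Gamma(a, b):  E[theta_i | O_i] = (a + O_i) / (b + E_i).
    # Areas with small counts are shrunk toward the prior mean a/b, which is
    # what smooths the map and removes the extreme SMR values.
    return [(a + o) / (b + e) for o, e in zip(observed, expected)]

obs, exp_ = [0, 2, 30], [0.5, 1.0, 25.0]
print(smr(obs, exp_))               # [0.0, 2.0, 1.2]
print(poisson_gamma_rr(obs, exp_))  # shrunk toward 1: [0.666..., 1.5, 1.192...]
```

    The first area illustrates the smoothing: its SMR is an extreme 0.0 from a single small expected count, while the Poisson-Gamma estimate is pulled toward the prior mean of 1.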

  10. Operational shoreline mapping with high spatial resolution radar and geographic processing

    USGS Publications Warehouse

    Rangoonwala, Amina; Jones, Cathleen E; Chi, Zhaohui; Ramsey, Elijah W.

    2017-01-01

    A comprehensive mapping technology was developed utilizing standard image processing and available GIS procedures to automate shoreline identification and mapping from 2 m synthetic aperture radar (SAR) HH amplitude data. The development used four NASA Uninhabited Aerial Vehicle SAR (UAVSAR) data collections between summer 2009 and 2012 and a fall 2012 collection of wetlands dominantly fronted by vegetated shorelines along the Mississippi River Delta that are beset by severe storms, toxic releases, and relative sea-level rise. In comparison to shorelines interpreted from 0.3 m and 1 m orthophotography, the automated GIS 10 m alongshore sampling found SAR shoreline mapping accuracy to be ±2 m, well within the lower range of reported shoreline mapping accuracies. The high comparability was obtained even though water levels differed between the SAR and photography image pairs and included all shorelines regardless of complexity. The SAR mapping technology is highly repeatable and extendable to other SAR instruments with similar operational functionality.

  11. Map based multimedia tool on Pacific theatre in World War II

    NASA Astrophysics Data System (ADS)

    Pakala Venkata, Devi Prasada Reddy

    Maps have been used for depicting data of all kinds in the educational community for many years. One of the most rapidly evolving teaching methods is the development of interactive and dynamic maps. The emphasis of this thesis is to develop an intuitive map-based multimedia tool that provides a timeline of battles and events in the Pacific theatre of World War II. The tool contains summaries of major battles and commanders and has multimedia content embedded in it. The primary advantage of this map tool is that users can quickly learn about all the battles and campaigns of the Pacific Theatre by accessing a timeline of battles in each region, individual battles in each region, or a summary of each battle in an interactive way. The tool can be accessed via any standard web browser and motivates the user to learn more about the battles of the Pacific Theatre. It was made responsive using the Google Maps API, JavaScript, HTML5 and CSS.

  12. Movie-maps of low-latitude magnetic storm disturbance

    NASA Astrophysics Data System (ADS)

    Love, Jeffrey J.; Gannon, Jennifer L.

    2010-06-01

    We present 29 movie-maps of low-latitude horizontal-intensity magnetic disturbance for the years 1999-2006: 28 recording magnetic storms and 1 magnetically quiescent period. The movie-maps are derived from magnetic vector time series data collected at up to 25 ground-based observatories. Using a technique similar to that used in the calculation of Dst, a quiet time baseline is subtracted from the time series from each observatory. The remaining disturbance time series are shown in a polar coordinate system that accommodates both Earth rotation and the universal time dependence of magnetospheric disturbance. Each magnetic storm recorded in the movie-maps is different. While some standard interpretations about the storm time equatorial ring current appear to apply to certain moments and certain phases of some storms, the movie-maps also show substantial variety in the local time distribution of low-latitude magnetic disturbance, especially during storm commencements and storm main phases. All movie-maps are available at the U.S. Geological Survey Geomagnetism Program Web site (http://geomag.usgs.gov).
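    The baseline-subtraction step described above can be sketched as follows; this is a simplified illustration of a Dst-like construction (the latitude correction and the local-time polar coordinate handling are omitted), with synthetic numbers:

```python
def disturbance(h_series, quiet_baseline):
    # Subtract the observatory's quiet-time baseline from its
    # horizontal-intensity (H) time series, leaving the storm disturbance.
    return [h - b for h, b in zip(h_series, quiet_baseline)]

def mean_disturbance(per_observatory):
    # Average the disturbance over observatories at each time step,
    # analogous to the Dst construction (latitude weighting omitted here).
    n_obs = len(per_observatory)
    n_t = len(per_observatory[0])
    return [sum(obs[t] for obs in per_observatory) / n_obs for t in range(n_t)]

obs_a = disturbance([30100.0, 30050.0, 29980.0], [30100.0, 30100.0, 30100.0])
obs_b = disturbance([29500.0, 29440.0, 29360.0], [29500.0, 29500.0, 29500.0])
print(mean_disturbance([obs_a, obs_b]))  # [0.0, -55.0, -130.0]
```

    The movie-maps keep the per-observatory disturbance series separate rather than averaging them, which is precisely what reveals the local time structure that a single Dst-like index hides.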

  13. From conceptual modeling to a map

    NASA Astrophysics Data System (ADS)

    Gotlib, Dariusz; Olszewski, Robert

    2018-05-01

    Nowadays almost every map is a component of an information system. The design and production of maps therefore require the use of established rules for modeling information systems: conceptual, application and data modelling. While analyzing the various stages of cartographic modeling, the authors ask: at what stage of this process does a map come into being? Can we say that the "life of the map" begins even before someone defines its form of presentation? This question is particularly important at a time when the number of new geoinformation products is growing exponentially. Drawing on the theory of cartography and the discipline's relations to other fields of knowledge, the authors attempt to define several properties of cartographic modeling that distinguish the process from other methods of spatial modeling. Assuming that the map is a model of reality (created in the process of cartographic modeling supported by domain modeling), the article proposes an analogy between the process of cartographic modeling and the scheme of conceptual modeling presented in the ISO 19101 standard.

  14. "X-Map 2.0" for Edema Signal Enhancement for Acute Ischemic Stroke Using Non-Contrast-Enhanced Dual-Energy Computed Tomography.

    PubMed

    Taguchi, Katsuyuki; Itoh, Toshihide; Fuld, Matthew K; Fournie, Eric; Lee, Okkyun; Noguchi, Kyo

    2018-03-14

    A novel imaging technique ("X-map") has been developed to identify acute ischemic lesions in stroke patients using non-contrast-enhanced dual-energy computed tomography (NE-DE-CT). Using the 3-material decomposition technique, the original X-map ("X-map 1.0") eliminates fat and bone from the images, suppresses the gray matter (GM)-white matter (WM) tissue contrast, and makes signals of edema induced by severe ischemia easier to detect. The aim of this study was to address the following 2 problems with X-map 1.0: (1) biases in CT numbers (or artifacts) near the skull in NE-DE-CT images and (2) large intrapatient and interpatient variations in X-map 1.0 values. We improved both an iterative beam-hardening correction (iBHC) method and the X-map algorithm. The new iBHC (iBHC2) modeled x-ray physics more accurately. The new X-map ("X-map 2.0") estimated regional GM values, thus maximizing the ability to suppress the GM-WM contrast, make edema signals quantitative, and enhance the edema signals that denote an increased water density for each pixel. We performed a retrospective study of 11 patients (3 men, 8 women; mean age, 76.3 years; range, 68-90 years) who presented to the emergency department with symptoms of acute stroke. Images were reconstructed with the old iBHC (iBHC1) and the new iBHC2, and biases in CT numbers near the skull were measured. Both X-map 2.0 and X-map 1.0 maps were computed from iBHC2 images, both with and without a material decomposition-based edema signal enhancement (ESE) process. X-map values were measured at 5 to 9 locations on GM without infarct per patient; the mean value was calculated for each patient (the patient-mean X-map value) and subtracted from the measured X-map values to generate zero-mean X-map values.
The standard deviation of the patient-mean X-map values over multiple patients denotes the interpatient variation; the standard deviation over multiple zero-mean X-map values denotes the intrapatient variation. The Levene F test was performed to assess the difference in the standard deviations between algorithms. Using data from the 5 patients who underwent diffusion-weighted imaging (DWI) within 2 hours of NE-DE-CT, mean values at and near ischemic lesions were measured at 7 to 14 locations per patient on X-map images, CT images (low kV and high kV), and DWI images. The Pearson correlation coefficient was calculated between a normalized increase in DWI signals and either X-map or CT. The bias in CT numbers was lower with iBHC2 than with iBHC1 in both high- and low-kV images (2.5 ± 2.0 HU [95% confidence interval (CI), 1.3-3.8 HU] for iBHC2 vs 6.9 ± 2.3 HU [95% CI, 5.4-8.3 HU] for iBHC1 with high-kV images, P < 0.01; 1.5 ± 3.6 HU [95% CI, -0.8 to 3.7 HU] vs 12.8 ± 3.3 HU [95% CI, 10.7-14.8 HU] with low-kV images, P < 0.01). The interpatient variation was smaller with X-map 2.0 than with X-map 1.0, both with and without ESE (4.3 [95% CI, 3.0-7.6] for X-map 2.0 vs 19.0 [95% CI, 13.3-22.4] for X-map 1.0, both with ESE, P < 0.01; 3.0 [95% CI, 2.1-5.3] vs 12.0 [95% CI, 8.4-21.0] without ESE, P < 0.01). The intrapatient variation was also smaller with X-map 2.0 than with X-map 1.0 (6.2 [95% CI, 5.3-7.3] vs 8.5 [95% CI, 7.3-10.1] with ESE, P = 0.0122; 4.1 [95% CI, 3.6-4.9] vs 6.3 [95% CI, 5.5-7.6] without ESE, P < 0.01). The best 3 correlation coefficients (R) with DWI signals were -0.733 (95% CI, -0.845 to -0.560, P < 0.001) for X-map 2.0 with ESE, -0.642 (95% CI, -0.787 to -0.429, P < 0.001) for high-kV CT, and -0.609 (95% CI, -0.766 to -0.384, P < 0.001) for X-map 1.0 with ESE. Both problems outlined in the objectives have been addressed by improving both the iBHC and the X-map algorithm. The iBHC2 reduced the bias in CT numbers and improved the visibility of GM-WM contrast throughout the brain. The combination of iBHC2 and X-map 2.0 with ESE significantly decreased both intrapatient and interpatient variations of edema signals and correlated strongly with DWI signals in terms of the strength of edema signals.
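    The inter- and intrapatient variation statistics defined in the abstract (the standard deviation of patient-mean values across patients, and the standard deviation of the pooled zero-mean values) can be sketched as follows, with synthetic readings in place of real X-map values:

```python
import statistics

def variation_components(per_patient_values):
    # per_patient_values: one list of X-map readings per patient.
    patient_means = [statistics.mean(v) for v in per_patient_values]
    # Zero-mean values: subtract each patient's own mean from their readings.
    zero_mean = [x - m for v, m in zip(per_patient_values, patient_means)
                 for x in v]
    inter = statistics.stdev(patient_means)  # interpatient variation
    intra = statistics.stdev(zero_mean)      # intrapatient variation
    return inter, intra

# Two hypothetical patients, three readings each.
inter, intra = variation_components([[48.0, 50.0, 52.0], [58.0, 60.0, 62.0]])
print(inter, intra)
```

    Separating the two components this way is what lets the study attribute the improvement of X-map 2.0 to reduced spread both between patients and within each patient's own readings.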

  15. A Fast and Scalable Radiation Hybrid Map Construction and Integration Strategy

    PubMed Central

    Agarwala, Richa; Applegate, David L.; Maglott, Donna; Schuler, Gregory D.; Schäffer, Alejandro A.

    2000-01-01

    This paper describes a fast and scalable strategy for constructing a radiation hybrid (RH) map from data on different RH panels. The maps from each panel are then integrated to produce a single RH map for the genome. Recurring problems in using maps from several sources are that the maps use different markers, the maps do not place the overlapping markers in the same order, and the objective functions for map quality are incomparable. We use methods from combinatorial optimization to develop a strategy that addresses these issues. We show that, by the standard objective functions of obligate chromosome breaks and maximum likelihood, software for the traveling salesman problem produces RH maps of better quality much more quickly than software specifically tailored for RH mapping. We use known algorithms for the longest common subsequence problem as part of our map integration strategy. We demonstrate our methods by reconstructing and integrating maps for markers typed on the Genebridge 4 (GB4) and Stanford G3 panels publicly available from the RH database. We compare the quality of our integrated map with published maps for the GB4 and G3 panels by considering whether markers occur in the same order on a map and in DNA sequence contigs submitted to GenBank. We find that all of the maps are inconsistent with the sequence data for at least 50% of the contigs, but our integrated maps are more consistent. The map integration strategy scales not only to multiple RH maps but also to any maps that have comparable criteria for measuring map quality. Our software improves on current technology for RH mapping in the areas of computation time and algorithms for handling a large number of markers. The essential impediments to producing dense high-quality RH maps are data quality and panel size, not computation. PMID:10720576
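    The map-integration step names the longest common subsequence problem; a standard dynamic-programming LCS over two marker orders looks like the sketch below (the marker identifiers follow the D-number naming style but are illustrative, not taken from the GB4 or G3 maps):

```python
def longest_common_subsequence(a, b):
    # Dynamic-programming LCS; here it finds the maximal set of markers
    # that two RH maps place in the same relative order.
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    # Backtrack through the table to recover one LCS.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

map_panel_1 = ["D1S468", "D1S214", "D1S450", "D1S2667", "D1S2697"]
map_panel_2 = ["D1S214", "D1S450", "D1S2845", "D1S2697"]
print(longest_common_subsequence(map_panel_1, map_panel_2))
# ['D1S214', 'D1S450', 'D1S2697']
```

    Markers in the LCS form a consistently ordered backbone; markers outside it are the order conflicts that an integration strategy must resolve.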

  16. Proposed U.S. Geological Survey standard for digital orthophotos

    USGS Publications Warehouse

    Hooper, David; Caruso, Vincent

    1991-01-01

    The U.S. Geological Survey has added the new category of digital orthophotos to the National Digital Cartographic Data Base. This differentially rectified digital image product enables users to take advantage of the properties of current photoimagery as a source of geographic information. The product and accompanying standard were implemented in spring 1991. The digital orthophotos will be quadrangle based and cast on the Universal Transverse Mercator projection and will extend beyond the 3.75-minute or 7.5-minute quadrangle area at least 300 meters to form a rectangle. The overedge may be used for mosaicking with adjacent digital orthophotos. To provide maximum information content and utility to the user, metadata (header) records exist at the beginning of the digital orthophoto file. Header information includes the photographic source type, date, instrumentation used to create the digital orthophoto, and information relating to the DEM that was used in the rectification process. Additional header information is included on transformation constants from the 1927 and 1983 North American Datums to the orthophoto internal file coordinates to enable the user to register overlays on either datum. The quadrangle corners in both datums are also imprinted on the image. Flexibility has been built into the digital orthophoto format for future enhancements, such as the provision to include the corresponding digital elevation model elevations used to rectify the orthophoto. The digital orthophoto conforms to National Map Accuracy Standards and provides valuable mapping data that can be used as a tool for timely revision of standard map products, for land use and land cover studies, and as a digital layer in a geographic information system.

  17. BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources

    PubMed Central

    2009-01-01

    Background DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly now that DNA sequencing technology is cheaply available. There are many nations in Asia with rich biodiversity resources that need to be mapped and registered in databases. Results We have built a general DNA barcode data processing system, BioBarcode, from open-source software: a general-purpose database and server platform that uses the MySQL 5.0 RDBMS, BLAST2, and the Apache httpd server. An exemplary BioBarcode database holds around 11,300 specimen entries (including GenBank data) and registers biological species to map their genetic relationships. The BioBarcode database includes a chromatogram viewer which improves the performance of DNA sequence analyses. Conclusion Asia has a very high degree of biodiversity, and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards by providing specialized services, and provides useful tools that will make barcoding cheaper and faster in the biodiversity community, such as the standardization, deposition, management, and analysis of DNA barcode data. The system can be downloaded upon request, and an exemplary server has been constructed with which to build an Asian biodiversity system http://www.asianbarcode.org. PMID:19958506

  18. Assessing the readiness of precision medicine interoperability: An exploratory study of the National Institutes of Health genetic testing registry.

    PubMed

    Ronquillo, Jay G; Weng, Chunhua; Lester, William T

    2017-11-17

    Precision medicine involves three major innovations currently taking place in healthcare: electronic health records, genomics, and big data. A major challenge for healthcare providers, however, is understanding the readiness for practical application of initiatives like precision medicine. Objective. To better understand the current state and challenges of precision medicine interoperability, using a national genetic testing registry as a starting point, placed in the context of established interoperability formats. Methods. We performed an exploratory analysis of the National Institutes of Health Genetic Testing Registry. Relevant standards included the Health Level Seven International Version 3 Implementation Guide for Family History, the Human Genome Organization Gene Nomenclature Committee (HGNC) database, and the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT). We analyzed the distribution of genetic testing laboratories, genetic test characteristics, and standardized genome/clinical code mappings, stratified by laboratory setting. Results. There were a total of 25,472 genetic tests from 240 laboratories testing for approximately 3,632 distinct genes. Most tests focused on diagnosis, mutation confirmation, and/or risk assessment of germline mutations that could be passed to offspring. Genes were successfully mapped to all HGNC identifiers, but less than half of the tests mapped to SNOMED CT codes, highlighting significant gaps when linking genetic tests to the standardized clinical codes that explain the medical motivations behind test ordering. Conclusion. While precision medicine could potentially transform healthcare, successful practical and clinical application will first require the comprehensive and responsible adoption of interoperable standards, terminologies, and formats across all aspects of the precision medicine pipeline.
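    The kind of mapping-coverage figure reported above (all genes mapped to HGNC identifiers, under half of tests mapped to SNOMED CT codes) reduces to a simple completeness ratio; the records and field names below are hypothetical, not drawn from the registry's actual schema:

```python
def mapping_coverage(tests, code_field):
    # Fraction of registry test records carrying a non-empty code
    # in the given standardized-terminology field.
    mapped = sum(1 for t in tests if t.get(code_field))
    return mapped / len(tests)

# Hypothetical registry records: every test has an HGNC gene identifier,
# but only one carries a SNOMED CT code for the clinical indication.
tests = [
    {"gene_hgnc": "HGNC:1100", "snomed": "405823003"},
    {"gene_hgnc": "HGNC:76",   "snomed": None},
    {"gene_hgnc": "HGNC:583",  "snomed": None},
]
print(mapping_coverage(tests, "gene_hgnc"))  # 1.0
print(mapping_coverage(tests, "snomed"))     # 0.333...
```

    Stratifying such ratios by laboratory setting, as the study does, is a matter of grouping the records before computing the same coverage statistic.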

  19. Estimation of Anthropogenic Heat Emissions in Delhi, India and Their Role in Urban Heat Island Effect

    NASA Astrophysics Data System (ADS)

    Bhati, S.; Mohan, M.

    2016-12-01

    Energy consumption in the urban environment alters the urban surface energy budget and leads to the emission of anthropogenic sensible heat into the atmosphere. Anthropogenic heat (AH) varies in both time and space and is not readily measured. In the present study, anthropogenic heat emissions have been estimated for Delhi using an inventory approach. The main sources considered are electricity consumption, vehicular emissions, fuel consumption in the domestic sector and waste heat from power plants. The total estimated anthropogenic heat is apportioned gridwise (2 km2) and incorporated in the WRF (version 3.5) model coupled with a single-layer urban canopy model (UCM) to assess the impact of these emissions on the urban heat island effect in Delhi. Vehicular emissions are the largest contributor to anthropogenic heat emissions (47%), followed by electricity consumption (28%), domestic fuel consumption (16%) and waste heat from power plants (9%). The highest annual average anthropogenic heat flux was estimated to be 25.2 Wm-2. High-flux zones are observed in east Delhi and in the densely occupied and commercial zones of Sitaram Bazar and Connaught Place. Including anthropogenic heat emissions in the model improves model performance for near-surface temperature as well as urban heat island intensities. The maximum simulated night-time UHI improves from 5.95°C (without AH) to 6.24°C (with AH) against an observed value of 6.68°C, indicating a positive contribution of anthropogenic heat emissions, along with the urban canopy, to the UHI effect in Delhi. Similarly, the spatial distribution and UHI hotspots are found to be comparatively closer to the corresponding observed distribution and hotspots when anthropogenic heat emissions are added to the WRF model. Overall, the relatively improved model performance is indicative of the impact of anthropogenic heat emissions on local urban meteorology and the urban heat island effect in Delhi. Hence, rising population, changes in land use and land cover, and the associated anthropogenic activities call for strategic mitigation measures in the city to prevent further strengthening of the heat island effect.
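    The inventory-to-flux conversion implied above (annual sector energy totals apportioned to grid cells and expressed in W m-2) can be sketched as follows; the cell energy totals are hypothetical, with only the sector shares taken from the study:

```python
def anthropogenic_heat_flux(energy_gj_per_year, cell_area_km2):
    # Convert an annual grid-cell energy inventory (GJ/yr)
    # to a mean anthropogenic heat flux (W m^-2).
    seconds_per_year = 365.0 * 24 * 3600
    joules = energy_gj_per_year * 1e9
    area_m2 = cell_area_km2 * 1e6
    return joules / (seconds_per_year * area_m2)

# Hypothetical 2 km^2 cell; sector totals in GJ/yr chosen to reflect the
# study's shares (vehicles 47%, electricity 28%, domestic 16%, waste heat 9%).
sectors = {"vehicles": 470000.0, "electricity": 280000.0,
           "domestic": 160000.0, "waste_heat": 90000.0}
flux = anthropogenic_heat_flux(sum(sectors.values()), 2.0)
print(round(flux, 1))  # ~15.9 W m^-2 for this hypothetical cell
```

    A gridded AH inventory is simply this calculation repeated per cell (and, for model input, per hour using diurnal profiles rather than an annual mean).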

  20. Yeast proteome map (last update).

    PubMed

    Perrot, Michel; Moes, Suzette; Massoni, Aurélie; Jenoe, Paul; Boucherie, Hélian

    2009-10-01

    The identification of proteins separated on 2-D gels is essential to exploit the full potential of 2-D gel electrophoresis for proteomic investigations. For this purpose we have undertaken the systematic identification of Saccharomyces cerevisiae proteins separated on 2-D gels. We report here the identification by mass spectrometry of 100 novel yeast protein spots that have so far not been tackled due to their scarcity on our standard 2-D gels. These identifications extend the number of protein spots identified on our yeast 2-D proteome map to 716. They correspond to 485 unique proteins. Among these, 154 were resolved into several isoforms. The present data set can now be expanded to report for the first time a map of 363 protein isoforms that significantly deepens our knowledge of the yeast proteome. The reference map and a list of all identified proteins can be accessed on the Yeast Protein Map server (www.ibgc.u-bordeaux2.fr/YPM).

  1. Use of linkage disequilibrium approaches to map genes for bipolar disorder in the Costa Rican population

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Escamilla, M.A.; Reus, V.I.; Smith, L.B.

    1996-05-31

    Linkage disequilibrium (LD) analysis provides a powerful means for screening the genome to map the location of disease genes, such as those for bipolar disorder (BP). As described in this paper, the population of the Central Valley of Costa Rica, which is descended from a small number of founders, should be suitable for LD mapping; this assertion is supported by reconstruction of extended haplotypes shared by distantly related individuals in this population suffering low-frequency hearing loss (LFHL1), which has previously been mapped by linkage analysis. A sampling strategy is described for applying LD methods to map genes for BP, and clinical and demographic characteristics of an initially collected sample are discussed. This sample will provide a complement to a previously collected set of Costa Rican BP families which is under investigation using standard linkage analysis. 42 refs., 4 figs., 2 tabs.

  2. Quality changes of fresh filled pasta during storage: influence of modified atmosphere packaging on microbial growth and sensory properties.

    PubMed

    Sanguinetti, A M; Del Caro, A; Mangia, N P; Secchi, N; Catzeddu, P; Piga, A

    2011-02-01

    This study evaluated the shelf life of fresh pasta filled with cheese subjected to modified atmosphere packaging (MAP) or air packaging (AP). After a pasteurization treatment, fresh pasta was packaged under a 50/50 N(2)/CO(2) atmosphere or in air (air batch). Changes in microbial growth, in-package gas composition, chemical-physical parameters and sensory attributes were monitored for 42 days at 4 °C. The pasteurization treatment resulted in a suitable microbiological reduction. MAP allowed a mold-free shelf life of the fresh filled pasta of 42 days, whereas air-packaged samples spoiled between 7 and 14 days. The hurdle approach used (MAP combined with low storage temperature) prevented the growth of pathogens and spoilage microorganisms. MAP samples maintained a high microbiological standard throughout the storage period. The panel judged MAP fresh pasta above the acceptability threshold throughout the shelf life.

  3. Probing the statistical properties of CMB B-mode polarization through Minkowski functionals

    NASA Astrophysics Data System (ADS)

    Santos, Larissa; Wang, Kai; Zhao, Wen

    2016-07-01

    The detection of the magnetic-type B-mode polarization is the main goal of future cosmic microwave background (CMB) experiments. In the standard model, the B-mode map is a strongly non-Gaussian field due to the CMB lensing component. Beyond the two-point correlation function, other statistics are therefore important for extracting the information in the polarization map. In this paper, we employ the Minkowski functionals to study the morphological properties of lensed B-mode maps. We find that the deviations from Gaussianity are very significant for both full- and partial-sky surveys. As an application of the analysis, we investigate the morphological imprints of foreground residuals in the B-mode map. We find that even very tiny foreground residuals leave effects on the map that can be detected by the Minkowski functional analysis. It therefore provides a complementary way to investigate foreground contamination in CMB studies.
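    Of the Minkowski functionals, the first (the area functional) is the easiest to illustrate: for a unit-variance map it is the fraction of pixels above a threshold, and for a Gaussian field it has a known analytic expectation; deviations from that curve signal non-Gaussianity. A minimal sketch, using an uncorrelated synthetic field rather than an actual B-mode map:

```python
import math
import random

def minkowski_v0(field, nu):
    # First Minkowski functional (area): fraction of the unit-variance map
    # whose value exceeds the threshold nu.
    flat = [x for row in field for x in row]
    return sum(1 for x in flat if x > nu) / len(flat)

def gaussian_v0(nu):
    # Analytic expectation for a Gaussian random field: 0.5 * erfc(nu / sqrt(2)).
    return 0.5 * math.erfc(nu / math.sqrt(2.0))

random.seed(0)
gauss_map = [[random.gauss(0.0, 1.0) for _ in range(100)] for _ in range(100)]
for nu in (-1.0, 0.0, 1.0):
    print(nu, round(minkowski_v0(gauss_map, nu), 3), round(gaussian_v0(nu), 3))
```

    In a real analysis the same comparison is made against the curve measured from Gaussian simulations of the lensed B-mode map, so that a residual foreground shows up as an excess or deficit at some thresholds.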

  4. ShakeMap manual: technical manual, user's guide, and software guide

    USGS Publications Warehouse

    Wald, David J.; Worden, Bruce C.; Quitoriano, Vincent; Pankow, Kris L.

    2005-01-01

    ShakeMap (http://earthquake.usgs.gov/shakemap) --rapidly, automatically generated shaking and intensity maps--combines instrumental measurements of shaking with information about local geology and earthquake location and magnitude to estimate shaking variations throughout a geographic area. The results are rapidly available via the Web through a variety of map formats, including Geographic Information System (GIS) coverages. These maps have become a valuable tool for emergency response, public information, loss estimation, earthquake planning, and post-earthquake engineering and scientific analyses. With the adoption of ShakeMap as a standard tool for a wide array of users and uses came an impressive demand for up-to-date technical documentation and more general guidelines for users and software developers. This manual is meant to address this need. ShakeMap, and associated Web and data products, are rapidly evolving as new advances in communications, earthquake science, and user needs drive improvements. As such, this documentation is organic in nature. We will make every effort to keep it current, but undoubtedly necessary changes in operational systems take precedence over producing and making documentation publishable.

  5. Understanding Rasch Measurement: A Mapmark Method of Standard Setting as Implemented for the National Assessment Governing Board

    ERIC Educational Resources Information Center

    Schulz, E. Matthew; Mitzel, Howard C.

    2011-01-01

    This article describes a Mapmark standard setting procedure, developed under contract with the National Assessment Governing Board (NAGB). The procedure enhances the bookmark method with spatially representative item maps, holistic feedback, and an emphasis on independent judgment. A rationale for these enhancements, and the bookmark method, is…

  6. Geologic map of the Fifteenmile Valley 7.5' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Miller, F.K.; Matti, J.C.

    2001-01-01

    Open-File Report OF 01-132 contains a digital geologic map database of the Fifteenmile Valley 7.5’ quadrangle, San Bernardino County, California that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A PostScript file to plot the geologic map on a topographic base, and containing a Correlation of Map Units diagram, a Description of Map Units, an index map, and a regional structure map. 3. Portable Document Format (.pdf) files of: a. This Readme; includes in Appendix I, data contained in fif_met.txt b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat pagesize setting influences map scale.) The Correlation of Map Units (CMU) and Description of Map Units (DMU) is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U. S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Fifteenmile Valley 7.5’ topographic quadrangle in conjunction with the geologic map.

  7. Impact of Indocyanine Green for Sentinel Lymph Node Mapping in Early Stage Endometrial and Cervical Cancer: Comparison with Conventional Radiotracer (99m)Tc and/or Blue Dye.

    PubMed

    Buda, Alessandro; Crivellaro, Cinzia; Elisei, Federica; Di Martino, Giampaolo; Guerra, Luca; De Ponti, Elena; Cuzzocrea, Marco; Giuliani, Daniela; Sina, Federica; Magni, Sonia; Landoni, Claudio; Milani, Rodolfo

    2016-07-01

    To compare the detection rate (DR) and bilateral optimal mapping (OM) of sentinel lymph nodes (SLNs) in women with endometrial and cervical cancer using indocyanine green (ICG) versus the standard technetium-99m radiocolloid ((99m)Tc) radiotracer plus methylene or isosulfan blue, or blue dye alone. From October 2010 to May 2015, 163 women with stage I endometrial or cervical cancer (118 endometrial and 45 cervical) underwent SLN mapping with (99m)Tc plus blue dye, blue dye alone, or ICG. The DR and bilateral OM of ICG were compared with the results obtained using the standard (99m)Tc radiotracer plus blue dye, or blue dye alone. SLN mapping was performed with (99m)Tc radiotracer plus blue dye in 77 of the 163 women, with blue dye only in 38, and with ICG in 48. The overall DR of SLN mapping was 97, 89, and 100 % for (99m)Tc with blue dye, blue dye alone, and ICG, respectively. The bilateral OM rate for ICG was 85 %, significantly higher than the 58 % obtained with (99m)Tc with blue dye (p = 0.003) and the 54 % for blue dye alone (p = 0.001). Thirty-one women (19 %) had positive SLNs. The sensitivity and negative predictive value of SLN mapping were 100 % for all techniques. SLN mapping using ICG demonstrated a higher DR compared with the other modalities. In addition, ICG was significantly superior to (99m)Tc with blue dye in terms of bilateral OM in women with early stage endometrial and cervical cancer. The higher rate of bilateral OM may consequently reduce the overall number of complete lymphadenectomies, reducing the duration and additional costs of surgical treatment.

  8. Bridging scales from satellite to grains: Structural mapping aided by tablet and photogrammetry

    NASA Astrophysics Data System (ADS)

    Hawemann, Friedrich; Mancktelow, Neil; Pennacchioni, Giorgio; Wex, Sebastian; Camacho, Alfredo

    2016-04-01

    A fundamental problem in small-scale mapping is linking outcrop observations to the large-scale deformation pattern. The evolution of handheld devices such as tablets with integrated GPS and the availability of airborne imagery allow precise localization of outcrops. Detailed structural geometries can be analyzed through ortho-rectified photo mosaics generated by photogrammetry software. In this study, we use an inexpensive standard Samsung tablet (< 300 Euro) to map individual, up to 60 m long shear zones with the tracking option offered by the program Locus Map. Even though GPS accuracy is about 3 m, the relative error from one point to another during tracking is on the order of only about 1 dm. Parts of the shear zone with excellent outcrop are photographed with a standard camera with a relatively wide angle in a mosaic array. An area of about 30 m² needs about 50 photographs with enough overlap to be used for photogrammetry. The software PhotoScan from Agisoft matches the photographs fully automatically, calculates a 3D model of the outcrop, and can project this as an orthophoto onto a flat surface. This allows original orientations of grain-scale structures to be recorded over areas up to tens to hundreds of metres across. The photo mosaics can then be georeferenced with the aid of the GPS tracks of the shear zones and included in a GIS. This provides inexpensive recording of the structures in high detail. The great advantages over mapping with UAVs (drones) are the resolution (<1 mm to >1 cm), the independence from weather and energy sources, and the low cost.

  9. Sentinel lymph node mapping in minimally invasive surgery: Role of imaging with color-segmented fluorescence (CSF).

    PubMed

    Lopez Labrousse, Maite I; Frumovitz, Michael; Guadalupe Patrono, M; Ramirez, Pedro T

    2017-09-01

    Sentinel lymph node mapping, alone or in combination with pelvic lymphadenectomy, is considered a standard approach in the staging of patients with cervical or endometrial cancer [1-3]. The goal of this video is to demonstrate the use of indocyanine green (ICG) and color-segmented fluorescence (CSF) when performing lymphatic mapping in patients with gynecologic malignancies. ICG is injected at two cervical sites, 1 mL each (0.5 mL superficial and 0.5 mL deep), at the 3 and 9 o'clock positions. Sentinel lymph nodes are identified intraoperatively using the Pinpoint near-infrared imaging system (Novadaq, Ontario, Canada). Color-segmented fluorescence is used to image different levels of ICG uptake, demonstrating higher levels of perfusion. A color key on the side of the monitor shows the colors that correspond to different levels of ICG uptake. Color-segmented fluorescence may help surgeons distinguish true sentinel nodes from fatty tissue that, although absorbing fluorescent dye, does not contain true nodal tissue. It is not intended to differentiate the primary sentinel node from secondary sentinel nodes. The key ranges from low levels of ICG uptake (gray) to the highest rate of ICG uptake (red). Bilateral sentinel lymph nodes are identified along the external iliac vessels using both standard and color-segmented fluorescence. No evidence of disease was noted after ultra-staging was performed on each of the sentinel nodes. Use of ICG in sentinel lymph node mapping allows for high bilateral detection rates. Color-segmented fluorescence may increase the accuracy of sentinel lymph node identification over standard fluorescence imaging.

  10. Detecting and Quantifying Topography in Neural Maps

    PubMed Central

    Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Seriès, Peggy

    2014-01-01

    Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279
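    One of the seven measures assessed above, the Pearson distance correlation, correlates pairwise cortical distances with pairwise differences in preferred stimulus, and its significance can be tested by permutation. The sketch below is a hypothetical minimal implementation of that idea, not the authors' code:

    ```python
    import math
    import random

    def pearson(xs, ys):
        """Plain Pearson correlation between two equal-length sequences."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    def distance_correlation_measure(positions, selectivities):
        """Correlate pairwise physical distances between recording sites with
        pairwise differences in their preferred stimulus value; values near 1
        indicate a smoothly topographic arrangement."""
        d_space, d_pref = [], []
        n = len(positions)
        for i in range(n):
            for j in range(i + 1, n):
                d_space.append(math.dist(positions[i], positions[j]))
                d_pref.append(abs(selectivities[i] - selectivities[j]))
        return pearson(d_space, d_pref)

    def permutation_p_value(positions, selectivities, n_perm=999, seed=0):
        """One-sided permutation test: shuffle selectivities over sites and ask
        how often a map at least as topographic as the observed one arises."""
        rng = random.Random(seed)
        observed = distance_correlation_measure(positions, selectivities)
        shuffled = list(selectivities)
        count = 0
        for _ in range(n_perm):
            rng.shuffle(shuffled)
            if distance_correlation_measure(positions, shuffled) >= observed:
                count += 1
        return observed, (count + 1) / (n_perm + 1)
    ```

    A perfectly linear map yields a distance correlation of 1; noisy or sparsely sampled maps yield smaller values, which is where the permutation test becomes useful.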

  11. Comparison of manually produced and automated cross country movement maps using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Wynn, L. K.

    1985-01-01

    The Image-Based Information System (IBIS) was used to automate the cross country movement (CCM) mapping model developed by the Defense Mapping Agency (DMA). Existing terrain factor overlays and a CCM map, produced by DMA for the Fort Lewis, Washington area, were digitized and reformatted into geometrically registered images. Terrain factor data from Slope, Soils, and Vegetation overlays were entered into IBIS, and were then combined utilizing IBIS-programmed equations to implement the DMA CCM model. The resulting IBIS-generated CCM map was then compared with the digitized manually produced map to test similarity. The numbers of pixels comprising each CCM region were compared between the two map images, and percent agreement between each pair of regional counts was computed. The mean percent agreement equalled 86.21%, with an areally weighted standard deviation of 11.11%. Calculation of Pearson's correlation coefficient yielded +0.997. In some cases, the IBIS-calculated map code differed from the DMA codes: analysis revealed that IBIS had calculated the codes correctly. These highly positive results demonstrate the power and accuracy of IBIS in automating models which synthesize a variety of thematic geographic data.
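    The region-by-region comparison described above can be sketched as follows. Taking the smaller of the two regional pixel counts over the larger as the "percent agreement" is an assumption, since the abstract does not give the exact formula:

    ```python
    def regional_agreement(map_a, map_b):
        """Percent agreement between regional pixel counts of two co-registered
        label maps: for each region code, 100 * min(count_a, count_b) /
        max(count_a, count_b). (Assumed definition of 'percent agreement'.)"""
        counts_a, counts_b = {}, {}
        for row_a, row_b in zip(map_a, map_b):
            for code in row_a:
                counts_a[code] = counts_a.get(code, 0) + 1
            for code in row_b:
                counts_b[code] = counts_b.get(code, 0) + 1
        result = {}
        for code in set(counts_a) | set(counts_b):
            a, b = counts_a.get(code, 0), counts_b.get(code, 0)
            result[code] = 100.0 * min(a, b) / max(a, b) if max(a, b) else 100.0
        return result
    ```

    An areally weighted mean of these per-region values, weighted by region size, would give a summary statistic of the kind reported (86.21%).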

  12. Computer-assisted map projection research

    USGS Publications Warehouse

    Snyder, John Parr

    1985-01-01

    Computers have opened up areas of map projection research which were previously too complicated to utilize, for example, using a least-squares fit to a very large number of points. One application has been in the efficient transfer of data between maps on different projections. While the transfer of moderate amounts of data is satisfactorily accomplished using the analytical map projection formulas, polynomials are more efficient for massive transfers. Suitable coefficients for the polynomials may be determined more easily for general cases using least squares instead of Taylor series. A second area of research is in the determination of a map projection fitting an unlabeled map, so that accurate data transfer can take place. The computer can test one projection after another, and include iteration where required. A third area is in the use of least squares to fit a map projection with optimum parameters to the region being mapped, so that distortion is minimized. This can be accomplished for standard conformal, equal-area, or other types of projections. Even less distortion can result if complex transformations of conformal projections are utilized. This bulletin describes several recent applications of these principles, as well as historical usage and background.
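    The polynomial-transfer idea can be sketched with an ordinary least-squares fit of bivariate polynomials mapping coordinates in one projection to coordinates in another. The polynomial degree and column ordering here are illustrative choices, not the bulletin's specific formulation:

    ```python
    import numpy as np

    def _design_matrix(pts, degree):
        """Columns x^i * y^j for all i + j <= degree."""
        x, y = pts[:, 0], pts[:, 1]
        cols = [x**i * y**j for i in range(degree + 1)
                            for j in range(degree + 1 - i)]
        return np.column_stack(cols)

    def fit_projection_polynomial(src, dst, degree=2):
        """Fit x' and y' as bivariate polynomials of (x, y) by least squares.
        src, dst: (N, 2) arrays of matching control points in the two
        projections. Returns the coefficient matrix."""
        coeffs, *_ = np.linalg.lstsq(_design_matrix(src, degree), dst,
                                     rcond=None)
        return coeffs

    def apply_projection_polynomial(coeffs, pts, degree=2):
        """Transfer arbitrary points using the fitted coefficients."""
        return _design_matrix(pts, degree) @ coeffs
    ```

    Once fitted on a grid of control points, the same coefficient matrix transfers any number of data points with two matrix products, which is the efficiency advantage over evaluating the analytical projection formulas point by point.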

  13. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    USGS Publications Warehouse

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.

  14. Development of a low cost, GPS-based upgrade to a standard handheld gamma detector for mapping environmental radioactive contamination.

    PubMed

    Paridaens, J

    2006-02-01

    A low-cost extension to a standard handheld radiation monitor was developed, allowing one to perform outdoor georeferenced gamma measurements. It consists of a commercial wireless Bluetooth GPS receiver and a commercial RS-232 to Bluetooth converter, combined with a standard Bluetooth-enabled pocket personal computer (PPC). The system is intended for use in difficult-to-access areas, typically for campaigns on foot. As the operator walks, a straightforward homemade Visual Basic program alternately reads the GPS position and gamma dose rate into the PPC, creating a data log. This allows a single operator on foot to map between 50 and 200 ha of environmental radiation per day, even in very rugged areas, depending on the accessibility of the terrain and the detail required. On a test field with known contamination, a spatial precision of about 5-10 m was obtainable. The device was also used to reveal complex contamination patterns in the flooding zones of a radioactively contaminated small river.
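    The logging loop described above alternately reads a GPS position and a dose-rate value. A consumer Bluetooth GPS receiver emits standard NMEA 0183 sentences, so the position-parsing step can be sketched as below (in Python rather than the authors' Visual Basic; the dose-rate read over the serial converter is omitted):

    ```python
    def parse_gga(sentence):
        """Parse a NMEA GGA sentence into (lat, lon) in decimal degrees.
        Returns None when the receiver has no fix (empty latitude field)."""
        fields = sentence.split(",")
        if not fields[0].endswith("GGA") or fields[2] == "":
            return None

        def dm_to_deg(dm, hemi):
            # NMEA encodes angles as ddmm.mmmm (degrees + decimal minutes).
            point = dm.find(".")
            degrees = float(dm[:point - 2])
            minutes = float(dm[point - 2:])
            sign = -1.0 if hemi in ("S", "W") else 1.0
            return sign * (degrees + minutes / 60.0)

        lat = dm_to_deg(fields[2], fields[3])
        lon = dm_to_deg(fields[4], fields[5])
        return lat, lon
    ```

    In the field, each parsed fix would be written to the data log together with the most recent dose-rate reading, giving the georeferenced record used for mapping.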

  15. Preferred reporting items for studies mapping onto preference-based outcome measures: The MAPS statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-01-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprising six health economists and one Delphi methodologist. Following a two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user-friendly 23-item checklist. They are presented numerically and categorised within six sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in five years' time.

  16. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite the publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprising six health economists and one Delphi methodologist. Following a two-round modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user-friendly 23-item checklist. They are presented numerically and categorised within six sections, namely: (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  17. Transfer of Technology for Cadastral Mapping in Tajikistan Using High Resolution Satellite Data

    NASA Astrophysics Data System (ADS)

    Kaczynski, R.

    2012-07-01

    European Commission funded project entitled "Support to the mapping and certification capacity of the Agency of Land Management, Geodesy and Cartography" in Tajikistan was run by FINNMAP FM-International and Human Dynamics from Nov. 2006 to June 2011. The Agency of Land Management, Geodesy and Cartography is the state agency responsible for development, implementation, monitoring and evaluation of state policies on land tenure and land management, including the on-going land reform and registration of land use rights. The specific objective was to support and strengthen the professional capacity of the "Fazo" Institute in the field of satellite geodesy, digital photogrammetry, advanced digital satellite image processing of high resolution satellite data and digital cartography. Lectures and on-the-job training for the personnel of "Fazo" and the Agency in satellite geodesy, digital photogrammetry, cartography and the use of high resolution satellite data for cadastral mapping were organized. Standards and a quality control system for all data and products have been elaborated and implemented in the production line. Technical expertise and training in geodesy, photogrammetry and satellite image processing for the World Bank project "Land Registration and Cadastre System for Sustainable Agriculture" have also been provided in Tajikistan. A new map projection was chosen and a new unclassified geodetic network has been established for the whole country, in which all agricultural parcel boundaries are being mapped. IKONOS, QuickBird and WorldView-1 panchromatic data have been used for orthophoto generation. An average space-triangulation accuracy on independent check points (ICPs) of RMSEx = 0.5 m and RMSEy = 0.5 m has been achieved for non-standard (up to 90 km long) QuickBird Pan and IKONOS Pan images. The accuracy of the digital orthophoto maps is RMSExy = 1.0 m. More than 2,500 digital orthophoto map sheets at a scale of 1:5000 with a pixel size of 0.5 m have been produced so far by the "Fazo" Institute in Tajikistan on the basis of the technology elaborated in the framework of this project. Digital cadastral maps are produced in "Fazo" and the Cadastral Regional Centers in Tajikistan using ArcMap software. These digital orthophotomaps will also be used for digital mapping of water resources and other needs of the country.
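    The accuracy figures quoted in this record (RMSEx, RMSEy on independent check points, and a combined planimetric RMSExy) follow the standard per-axis root-mean-square-error definitions, sketched here:

    ```python
    import math

    def rmse_xy(predicted, reference):
        """Planimetric RMSE of orthophoto coordinates against independent
        check points (ICPs). predicted/reference are sequences of (x, y)
        pairs in map units. Returns (RMSEx, RMSEy, combined RMSExy)."""
        n = len(predicted)
        sum_dx2 = sum((p[0] - r[0]) ** 2 for p, r in zip(predicted, reference))
        sum_dy2 = sum((p[1] - r[1]) ** 2 for p, r in zip(predicted, reference))
        rmse_x = math.sqrt(sum_dx2 / n)
        rmse_y = math.sqrt(sum_dy2 / n)
        return rmse_x, rmse_y, math.sqrt(rmse_x ** 2 + rmse_y ** 2)
    ```

    With RMSEx = RMSEy = 0.5 m, the combined planimetric error is √(0.5² + 0.5²) ≈ 0.71 m, consistent with the reported RMSExy = 1.0 m specification being an upper bound rather than the exact combination.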

  18. A geographic information system-based method for estimating cancer rates in non-census defined geographical areas.

    PubMed

    Freeman, Vincent L; Boylan, Emma E; Pugach, Oksana; Mclafferty, Sara L; Tossas-Milligan, Katherine Y; Watson, Karriem S; Winn, Robert A

    2017-10-01

    To address locally relevant cancer-related health issues, health departments frequently need data beyond that contained in standard census area-based statistics. We describe a geographic information system-based method for calculating age-standardized cancer incidence rates in non-census defined geographical areas using publicly available data. Aggregated records of cancer cases diagnosed from 2009 through 2013 in each of Chicago's 77 census-defined community areas were obtained from the Illinois State Cancer Registry. Areal interpolation through dasymetric mapping of census blocks was used to redistribute populations and case counts from community areas to Chicago's 50 politically defined aldermanic wards, and ward-level age-standardized 5-year cumulative incidence rates were calculated. Potential errors in redistributing populations between geographies were limited to <1.5% of the total population, and agreement between our ward population estimates and those from a frequently cited reference set of estimates was high (Pearson correlation r = 0.99, mean difference = -4 persons). A map overlay of safety-net primary care clinic locations and ward-level incidence rates for advanced-staged cancers revealed potential pathways for prevention. Areal interpolation through dasymetric mapping can estimate cancer rates in non-census defined geographies. This can address gaps in local cancer-related health data, inform health resource advocacy, and guide community-centered cancer prevention and control.
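    Once dasymetric mapping of census blocks has produced population-based shares linking each source zone to the target zones it overlaps, the redistribution itself is a weighted sum. A minimal sketch with hypothetical zone identifiers (the share table would come from the GIS overlay, not from this code):

    ```python
    def areal_interpolate(source_values, shares):
        """Redistribute counts from source zones to target zones.
        source_values: {source_zone: count}
        shares: {source_zone: {target_zone: fraction}}, where each source
        zone's fractions sum to 1 (derived from dasymetric block weights).
        Returns {target_zone: estimated count}."""
        targets = {}
        for zone, value in source_values.items():
            for target, share in shares[zone].items():
                targets[target] = targets.get(target, 0.0) + value * share
        return targets
    ```

    Because the per-source shares sum to one, total counts are preserved across the interpolation, which is why redistribution error shows up only in how cases are split among wards, not in the citywide total.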

  19. Mapping the Association of College and Research Libraries information literacy framework and nursing professional standards onto an assessment rubric.

    PubMed

    Willson, Gloria; Angell, Katelyn

    2017-04-01

    The authors developed a rubric for assessing undergraduate nursing research papers for information literacy skills critical to their development as researchers and health professionals. We developed a rubric mapping six American Nurses Association professional standards onto six related concepts of the Association of College & Research Libraries (ACRL) Framework for Information Literacy for Higher Education. We used this rubric to evaluate fifty student research papers and assess inter-rater reliability. Students tended to score highest on the "Information Has Value" dimension and lowest on the "Scholarship as Conversation" dimension. However, we found a discrepancy between the grading patterns of the two investigators, with inter-rater reliability being "fair" or "poor" for all six rubric dimensions. The development of a rubric that dually assesses information literacy skills and maps relevant disciplinary competencies holds potential. This study offers a template for a rubric inspired by the ACRL Framework and outside professional standards. However, the overall low inter-rater reliability demands further calibration of the rubric. Following additional norming, this rubric can be used to help students identify the key information literacy competencies that they need in order to succeed as college students and future nurses. These skills include developing an authoritative voice, determining the scope of their information needs, and understanding the ramifications of their information choices.
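    Inter-rater reliability of the kind reported above is commonly quantified with Cohen's kappa, whose value bands map onto qualitative labels such as "fair" and "poor". The abstract does not name the exact statistic used, so this is a sketch of the standard choice:

    ```python
    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters scoring the same items on a
        categorical (e.g. rubric-level) scale: chance-corrected agreement."""
        n = len(rater_a)
        categories = set(rater_a) | set(rater_b)
        # Observed proportion of exact agreements.
        p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Agreement expected by chance from each rater's marginal frequencies.
        p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                    for c in categories)
        return (p_obs - p_exp) / (1 - p_exp)
    ```

    Kappa runs from 1 (perfect agreement) down through 0 (chance-level agreement), so "fair" or "poor" values signal that the rubric needs the norming and calibration the authors call for.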

  20. Area-to-Area Poisson Kriging and Spatial Bayesian Analysis in Mapping of Gastric Cancer Incidence in Iran

    PubMed

    Asmarian, Naeimehossadat; Jafari-Koshki, Tohid; Soleimani, Ali; Taghi Ayatollahi, Seyyed Mohammad

    2016-10-01

    Background: In many countries gastric cancer has the highest incidence among the gastrointestinal cancers, and it is the second most common cancer in Iran. The aim of this study was to identify and map high-risk gastric cancer regions at the county level in Iran. Methods: In this study we analyzed gastric cancer data for Iran for the years 2003-2010. Area-to-area Poisson kriging and Besag, York and Mollie (BYM) spatial models were applied to smooth the standardized incidence ratios of gastric cancer for the 373 counties surveyed in this study. The two methods were compared in terms of accuracy and precision in identifying high-risk regions. Results: The highest smoothed standardized incidence ratio (SIR) according to area-to-area Poisson kriging was in Meshkinshahr county in Ardabil province in north-western Iran (2.4, SD = 0.05), while the highest smoothed SIR according to the BYM model was in Ardabil, the capital of that province (2.9, SD = 0.09). Conclusions: Both mapping methods, area-to-area (ATA) Poisson kriging and BYM, showed the gastric cancer incidence rate to be highest in north and north-west Iran. However, area-to-area Poisson kriging was more precise than the BYM model and required less smoothing. According to the results obtained, preventive measures and treatment programs should be focused on particular counties of Iran.
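    The standardized incidence ratio (SIR) being smoothed above is the ratio of observed to expected cases, with the expected count obtained by indirect standardization against age-specific reference rates. A minimal sketch with made-up age strata:

    ```python
    def standardized_incidence_ratio(observed_cases, person_years_by_age,
                                     reference_rates_by_age):
        """SIR = observed / expected. The expected count applies national
        age-specific reference rates (cases per person-year) to the county's
        person-years in each age stratum (indirect standardization)."""
        expected = sum(person_years_by_age[age] * reference_rates_by_age[age]
                       for age in person_years_by_age)
        return observed_cases / expected
    ```

    An SIR of 2.4, as reported for Meshkinshahr, means the county recorded 2.4 times the cases expected from national age-specific rates; kriging and BYM then smooth these raw ratios to stabilize estimates in sparsely populated counties.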

  1. Statistical characterization of the standard map

    NASA Astrophysics Data System (ADS)

    Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino

    2017-06-01

    The standard map, a paradigmatic conservative system in the (x, p) phase space, has recently been shown (Tirnakli and Borges, 2016, Sci. Rep. 6 23644) to exhibit interesting statistical behaviors directly related to the value of the standard map external parameter K. A comprehensive statistical numerical description is achieved in the present paper. More precisely, for large values of K (e.g. K = 10), where the Lyapunov exponents are neatly positive over virtually the entire phase space, consistently with Boltzmann-Gibbs (BG) statistics, we verify that the q-generalized indices related to the entropy production (q_ent), the sensitivity to initial conditions (q_sen), the distribution of a time-averaged (over successive iterations) phase-space coordinate (q_stat), and the relaxation to the equilibrium final state (q_rel) collapse onto a fixed point, i.e. q_ent = q_sen = q_stat = q_rel = 1. In remarkable contrast, for small values of K (e.g. K = 0.2), where the Lyapunov exponents are virtually zero over the entire phase space, we verify q_ent = q_sen = 0, q_stat ≃ 1.935, and q_rel ≃ 1.4. The situation corresponding to intermediate values of K, where both stable orbits and a chaotic sea are present, is discussed as well. The present results transparently illustrate when BG behavior and/or q-statistical behavior are observed.
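    The standard (Chirikov) map referred to above is the area-preserving map p_{n+1} = p_n + K sin(x_n), x_{n+1} = x_n + p_{n+1}, both taken modulo 2π. A minimal sketch for generating the orbits whose statistics the paper analyzes:

    ```python
    import math

    def standard_map_orbit(x0, p0, K, n_steps):
        """Iterate the Chirikov standard map
            p_{n+1} = p_n + K * sin(x_n)   (mod 2*pi)
            x_{n+1} = x_n + p_{n+1}        (mod 2*pi)
        and return the orbit as a list of (x, p) pairs of length n_steps + 1."""
        two_pi = 2.0 * math.pi
        x, p = x0 % two_pi, p0 % two_pi
        orbit = [(x, p)]
        for _ in range(n_steps):
            p = (p + K * math.sin(x)) % two_pi
            x = (x + p) % two_pi
            orbit.append((x, p))
        return orbit
    ```

    For K = 0 the map is integrable (p is conserved and x advances uniformly); large K produces the strongly chaotic regime where the paper finds BG statistics, and small nonzero K the weakly chaotic regime with q-statistical behavior.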

  2. Mapping the Association of College and Research Libraries information literacy framework and nursing professional standards onto an assessment rubric

    PubMed Central

    Willson, Gloria; Angell, Katelyn

    2017-01-01

    Objective The authors developed a rubric for assessing undergraduate nursing research papers for information literacy skills critical to their development as researchers and health professionals. Methods We developed a rubric mapping six American Nurses Association professional standards onto six related concepts of the Association of College & Research Libraries (ACRL) Framework for Information Literacy for Higher Education. We used this rubric to evaluate fifty student research papers and assess inter-rater reliability. Results Students tended to score highest on the “Information Has Value” dimension and lowest on the “Scholarship as Conversation” dimension. However, we found a discrepancy between the grading patterns of the two investigators, with inter-rater reliability being “fair” or “poor” for all six rubric dimensions. Conclusions The development of a rubric that dually assesses information literacy skills and maps relevant disciplinary competencies holds potential. This study offers a template for a rubric inspired by the ACRL Framework and outside professional standards. However, the overall low inter-rater reliability demands further calibration of the rubric. Following additional norming, this rubric can be used to help students identify the key information literacy competencies that they need in order to succeed as college students and future nurses. These skills include developing an authoritative voice, determining the scope of their information needs, and understanding the ramifications of their information choices. PMID:28377678

  3. Effect of Time-of-Flight Information on PET/MR Reconstruction Artifacts: Comparison of Free-breathing versus Breath-hold MR-based Attenuation Correction.

    PubMed

    Delso, Gaspar; Khalighi, Mohammed; Ter Voert, Edwin; Barbosa, Felipe; Sekine, Tetsuro; Hüllner, Martin; Veit-Haibach, Patrick

    2017-01-01

    Purpose: To evaluate the magnitude and anatomic extent of the artifacts introduced on positron emission tomographic (PET)/magnetic resonance (MR) images by respiratory-state mismatch in the attenuation map. Materials and Methods: The method was tested on 14 patients referred for an oncologic examination who underwent PET/MR imaging. The acquisition included standard PET and MR series for each patient, and an additional attenuation correction series was acquired by using breath hold. PET data were reconstructed with and without time-of-flight (TOF) information, first by using the standard free-breathing attenuation map and then again by using the additional breath-hold map. Two-tailed paired t testing and linear regression with zero intercept were performed on TOF versus non-TOF and free-breathing versus breath-hold data for all detected lesions. Results: Fluorodeoxyglucose-avid lesions were found in eight of the 14 patients included in the study. The uptake differences (maximum standardized uptake values) between PET reconstructions with free-breathing versus breath-hold attenuation ranged, for non-TOF reconstructions, from -18% to 26%. The corresponding TOF reconstructions yielded differences from -15% to 18%. Conclusion: TOF information was shown to reduce the artifacts caused at PET/MR by respiratory mismatch between emission and attenuation data. © RSNA, 2016. Online supplemental material is available for this article.

  4. Hazards, Disasters, and The National Map

    USGS Publications Warehouse

    Carswell, William J.; Newell, Mark R.

    2009-01-01

    Federal, State, and local response and management personnel must have current, reliable, and easily accessible geographic information and maps to prepare for, respond to, or recover from emergency situations. In life-threatening events, such as earthquakes, floods, or wildland fires, geographic information is essential for locating critical infrastructure and carrying out evacuation and rescue operations. The USGS promotes partnerships to ensure that base map data are up to date, readily available, and shareable among local, state, and national users. The National Map enables other government agencies, private industry, and the public to link and share additional data that provide even more information. These efforts with state and local governments have helped standardize the data by reducing data inconsistencies between neighboring jurisdictions and will help fill in the gaps for those places where data are lacking.

  5. Combining Semantic and Lexical Methods for Mapping MedDRA to VCM Icons.

    PubMed

    Lamy, Jean-Baptiste; Tsopra, Rosy

    2018-01-01

    VCM (Visualization of Concept in Medicine) is an iconic language that represents medical concepts, such as disorders, by icons. VCM has a formal semantics described by an ontology. The icons can be used in medical software for providing a visual summary or enriching texts. However, the use of VCM icons in user interfaces requires mapping standard medical terminologies to VCM. Here, we present a method combining semantic and lexical approaches for mapping MedDRA to VCM. The method takes advantage of the hierarchical relations in MedDRA. It also analyzes the groups of lemmas in the terms' labels, and relies on a manual mapping of these groups to the concepts in the VCM ontology. We evaluate the method on 50 terms. Finally, we discuss the method and suggest perspectives.
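    The lexical part of the method can be sketched as matching lemma groups extracted from a term's label against a hand-built table of group-to-concept mappings. Everything here (the table structure, the trivial lowercasing lemmatizer) is a hypothetical illustration, not the paper's actual resources:

    ```python
    def lexical_map(label, lemma_groups, lemmatize=str.lower):
        """Return the ontology concepts whose lemma group is fully contained
        in the lemmas of the term label.
        lemma_groups: {tuple_of_lemmas: set_of_concepts}, built manually."""
        lemmas = {lemmatize(word) for word in label.split()}
        matched = set()
        for group, concepts in lemma_groups.items():
            if set(group) <= lemmas:
                matched |= concepts
        return matched
    ```

    In the real method the hierarchical MedDRA relations supply additional candidate concepts from a term's ancestors, which this lexical sketch does not cover.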

  6. Awake surgery between art and science. Part II: language and cognitive mapping

    PubMed Central

    Talacchi, Andrea; Santini, Barbara; Casartelli, Marilena; Monti, Alessia; Capasso, Rita; Miceli, Gabriele

    Summary: Direct cortical and subcortical stimulation has been claimed to be the gold standard for exploring brain function. In this field, efforts are now being made to move from intraoperative naming-assisted surgical resection towards the use of other language and cognitive tasks. However, before relying on new protocols and new techniques, we need a multi-staged system of evidence (low and high) relating to each step of functional mapping and its clinical validity. In this article we examine the possibilities and limits of brain mapping with the aid of a visual object naming task and various other tasks used to date. The methodological aspects of intraoperative brain mapping, as well as the clinical and operative settings, were discussed in Part I of this review. PMID:24139658

  7. 33 CFR 332.6 - Monitoring.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...., forested wetlands, bogs). Following project implementation, the district engineer may reduce or waive the... performance standards, and may include plans (such as as-built plans), maps, and photographs to illustrate...

  8. Time-efficient high-resolution whole-brain three-dimensional macromolecular proton fraction mapping

    PubMed Central

    Yarnykh, Vasily L.

    2015-01-01

    Purpose: Macromolecular proton fraction (MPF) mapping is a quantitative MRI method that reconstructs parametric maps of a relative amount of macromolecular protons causing the magnetization transfer (MT) effect and provides a biomarker of myelination in neural tissues. This study aimed to develop a high-resolution whole-brain MPF mapping technique utilizing the minimal possible number of source images for scan time reduction. Methods: The described technique is based on replacement of an actually acquired reference image without MT saturation by a synthetic one reconstructed from R1 and proton density maps, thus requiring only three source images. This approach enabled whole-brain three-dimensional MPF mapping with isotropic 1.25×1.25×1.25 mm3 voxel size and a scan time of 20 minutes. The synthetic reference method was validated against standard MPF mapping with acquired reference images based on data from 8 healthy subjects. Results: Mean MPF values in segmented white and gray matter appeared in close agreement, with no significant bias and small within-subject coefficients of variation (<2%). High-resolution MPF maps demonstrated sharp white-gray matter contrast and clear visualization of anatomical details, including gray matter structures with high iron content. Conclusions: The synthetic reference method improves the resolution of MPF mapping and combines accurate MPF measurements with unique neuroanatomical contrast features. PMID:26102097
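    The synthetic-reference idea rests on the fact that, for an ideally spoiled gradient-echo (SPGR) acquisition, the no-MT reference intensity is predictable from the R1 and proton density maps via the SPGR signal equation. The sketch below illustrates that relationship; the sequence parameters are illustrative, not the protocol's actual values:

    ```python
    import math

    def synthetic_reference(pd_map, r1_map, tr_s, flip_angle_deg):
        """Synthesize a no-MT reference image from proton density (PD) and R1
        maps using the SPGR signal equation
            S = PD * sin(a) * (1 - E1) / (1 - cos(a) * E1),  E1 = exp(-TR * R1),
        assuming ideal spoiling. pd_map/r1_map are 2-D lists (R1 in 1/s)."""
        a = math.radians(flip_angle_deg)
        out = []
        for pd_row, r1_row in zip(pd_map, r1_map):
            row = []
            for pd, r1 in zip(pd_row, r1_row):
                e1 = math.exp(-tr_s * r1)
                row.append(pd * math.sin(a) * (1 - e1)
                           / (1 - math.cos(a) * e1))
            out.append(row)
        return out
    ```

    Replacing the acquired reference with this computed image is what removes one source acquisition from the protocol, cutting the requirement to three source images.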

  9. Children’s Mapping between Non-Symbolic and Symbolic Numerical Magnitudes and Its Association with Timed and Untimed Tests of Mathematics Achievement

    PubMed Central

    Brankaer, Carmen; Ghesquière, Pol; De Smedt, Bert

    2014-01-01

    The ability to map between non-symbolic numerical magnitudes and Arabic numerals has been put forward as a key factor in children’s mathematical development. This mapping ability has been mainly examined indirectly by looking at children’s performance on a symbolic magnitude comparison task. The present study investigated mapping in a more direct way by using a task in which children had to choose which of two choice quantities (Arabic digits or dot arrays) matched the target quantity (dot array or Arabic digit), thereby focusing on small quantities ranging from 1 to 9. We aimed to determine the development of mapping over time and its relation to mathematics achievement. Participants were 36 first graders (M = 6 years 8 months) and 46 third graders (M = 8 years 8 months) who all completed mapping tasks, symbolic and non-symbolic magnitude comparison tasks and standardized timed and untimed tests of mathematics achievement. Findings revealed that children are able to map between non-symbolic and symbolic representations and that this mapping ability develops over time. Moreover, we found that children’s mapping ability is related to timed and untimed measures of mathematics achievement, over and above the variance accounted for by their numerical magnitude comparison skills. PMID:24699664

  10. Utilization of Neurophysiological Protocols to Characterize Soldier Response to Irritant Gases. Phase 1.

    DTIC Science & Technology

    1990-02-15

    electrical activity mapping procedures. It is necessary to employ approximately 20 electrodes to conduct full-scale brain mapping procedures, using a ... animal groups, likewise, showed no observable differences in the animal's exploratory behavior, nuzzle response, lid-corneal and ear reflexes, pain ... SPECIFICATIONS FOR THE ENVIRONICS SERIES 100 GAS STANDARDS GENERATOR: Accuracy of Flow, 0.15% of Full Scale; Linearity, 0.15% of Full Scale; Repeatability, 0.10

  11. On integrability of the Yang-Baxter σ-model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klimcik, Ctirad

    2009-04-15

    We prove that the recently introduced Yang-Baxter σ-model can be considered as an integrable deformation of the principal chiral model. We also find an explicit one-to-one map transforming every solution of the principal chiral model into a solution of the deformed model. With the help of this map, the standard procedure of dressing the principal chiral solutions can be directly transferred to the deformed Yang-Baxter context.

  12. Baryon Acoustic Oscillations reconstruction with pixels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obuljen, Andrej; Villaescusa-Navarro, Francisco; Castorina, Emanuele

    2017-09-01

    Gravitational non-linear evolution induces a shift in the position of the baryon acoustic oscillation (BAO) peak, together with a damping and broadening of its shape, that bias and degrade the accuracy with which the position of the peak can be determined. BAO reconstruction is a technique developed to undo part of the effect of non-linearities. We present and analyse a reconstruction method that consists of displacing pixels instead of galaxies and whose implementation is easier than that of the standard reconstruction method. We show that this method is equivalent to the standard reconstruction technique in the limit where the number of pixels becomes very large. This method is particularly useful in surveys where individual galaxies are not resolved, as in 21 cm intensity mapping observations. We validate this method by reconstructing mock pixelated maps, which we build from the distribution of matter and halos in real and redshift space, from a large set of numerical simulations. We find that this method is able to decrease the uncertainty in the BAO peak position by 30-50% over the typical angular resolution scales of 21 cm intensity mapping experiments.
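    The core operation in standard reconstruction, here applied to pixels, is the Zel'dovich displacement field computed from the smoothed density contrast. A minimal 2-D sketch under textbook conventions (the paper's actual implementation, smoothing choice, and sign conventions may differ):

    ```python
    import numpy as np

    def pixel_displacement(delta, box_size, smooth_scale):
        """Zel'dovich displacement field on a 2-D pixel grid.
        In Fourier space: Psi(k) = (i k / k^2) * W(k) * delta(k),
        with W a Gaussian smoothing window; pixels are then moved by -Psi
        to undo part of the non-linear evolution."""
        n = delta.shape[0]
        k = 2.0 * np.pi * np.fft.fftfreq(n, d=box_size / n)
        kx, ky = np.meshgrid(k, k, indexing="ij")
        k2 = kx**2 + ky**2
        k2[0, 0] = 1.0  # avoid 0/0; the DC mode carries no displacement
        dk = np.fft.fft2(delta)
        w = np.exp(-0.5 * k2 * smooth_scale**2)  # Gaussian smoothing
        psi_x = np.fft.ifft2(1j * kx / k2 * w * dk).real
        psi_y = np.fft.ifft2(1j * ky / k2 * w * dk).real
        return psi_x, psi_y
    ```

    Because the displacement is evaluated on the grid itself, no individual galaxy positions are needed, which is why the scheme suits unresolved 21 cm intensity maps.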

  13. Catlas: A magnetic resonance imaging-based three-dimensional cortical atlas and tissue probability maps for the domestic cat (Felis catus).

    PubMed

    Stolzberg, Daniel; Wong, Carmen; Butler, Blake E; Lomber, Stephen G

    2017-10-15

    Brain atlases play an important role in effectively communicating results from neuroimaging studies in a standardized coordinate system. Furthermore, brain atlases extend analysis of functional magnetic resonance imaging (MRI) data by delineating regions of interest over which to evaluate the extent of functional activation as well as measures of inter-regional connectivity. Here, we introduce a three-dimensional atlas of the cat cerebral cortex based on established cytoarchitectonic and electrophysiological findings. In total, 71 cerebral areas were mapped onto the gray matter (GM) of an averaged T1-weighted structural MRI acquired at 7 T from eight adult domestic cats. In addition, a nonlinear registration procedure was used to generate a common template brain as well as GM, white matter, and cerebral spinal fluid tissue probability maps to facilitate tissue segmentation as part of the standard preprocessing pipeline for MRI data analysis. The atlas and associated files can also be used for planning stereotaxic surgery and for didactic purposes. © 2017 Wiley Periodicals, Inc.

  14. Dynamical emergence of Markovianity in local time scheme.

    PubMed

    Jeknić-Dugić, J; Arsenijević, M; Dugić, M

    2016-06-01

    Recently we pointed out the so-called local time scheme as a novel approach to quantum foundations that solves the preferred pointer-basis problem. In this paper, we introduce and analyse in depth a rather non-standard dynamical map that is imposed by the scheme. On the one hand, the map does not allow for a properly defined generator of the evolution, nor does it represent a quantum channel. On the other hand, the map is linear, positive, trace-preserving, unital, and completely positive, but it is not divisible and is therefore non-Markovian. Nevertheless, we provide quantitative criteria for the dynamical emergence of time-coarse-grained Markovianity, for the exact dynamics of an open system, as well as for an operationally defined approximation of a closed or open many-particle system. A closed system never reaches a steady state, whereas an open system may reach a unique steady state given by the Lüders-von Neumann formula: the smaller the open system, the faster a steady state is attained. These generic findings extend the standard theory of open quantum systems and bear substantially on certain cosmological issues.

  15. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprising six health economists and one Delphi methodologist. A two-round, modified Delphi survey, with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorized within six sections: (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS Explanation and Elaboration paper. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of the reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  16. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-08-01

    "Mapping" onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite the publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist that aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprised of 6 health economists and 1 Delphi methodologist. A 2-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies, and the biomedical journal editorial community was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorized within 6 sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency, and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by 7 health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years.

  17. Web Map Services (WMS) Global Mosaic

    NASA Technical Reports Server (NTRS)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes, geographically accurate at 30- and 15-meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in its geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.
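    Any WMS-compliant client retrieves such a layer with a GetMap request. A minimal sketch of building one against the OGC WMS 1.1.1 interface; the endpoint URL and layer name here are placeholders, not the actual service:

    ```python
    from urllib.parse import urlencode

    def getmap_url(base_url, layer, bbox, width, height):
        """Build an OGC WMS 1.1.1 GetMap request URL.
        bbox is (min_lon, min_lat, max_lon, max_lat) in EPSG:4326."""
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": layer,          # placeholder layer name
            "STYLES": "",
            "SRS": "EPSG:4326",
            "BBOX": ",".join(str(v) for v in bbox),
            "WIDTH": width,
            "HEIGHT": height,
            "FORMAT": "image/jpeg",
        }
        return base_url + "?" + urlencode(params)

    url = getmap_url("https://example.org/wms", "global_mosaic",
                     (-180, -90, 180, 90), 512, 256)
    ```

    Because the interface is standardized, the same request shape works against any conformant server, which is the interoperability point the mosaic demonstrates.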

  18. Mapping flexible protein domains at subnanometer resolution with the atomic force microscope.

    PubMed

    Müller, D J; Fotiadis, D; Engel, A

    1998-06-23

    The mapping of flexible protein domains with the atomic force microscope is reviewed. The examples discussed are bacteriorhodopsin from Halobacterium salinarum, the head-tail connector from phage phi29, and the hexagonally packed intermediate layer from Deinococcus radiodurans, all of which were recorded in physiological buffer solution. All three proteins undergo reversible structural changes that are reflected in standard deviation maps calculated from aligned topographs of individual protein complexes. Depending on the lateral resolution (up to 0.8 nm), flexible surface regions can ultimately be correlated with individual polypeptide loops. In addition, multivariate statistical classification revealed the major conformations of the protein surface.
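    The standard deviation map itself is straightforward: given a stack of aligned topographs, compute the per-pixel variability across the stack. A minimal sketch (the alignment step, which the real analysis requires first, is assumed done):

    ```python
    import numpy as np

    def std_map(topographs):
        """Per-pixel standard deviation over a stack of aligned topographs
        (shape: n_images x height x width). High values flag flexible
        surface regions; low values indicate rigid structure."""
        stack = np.asarray(topographs, dtype=float)
        return stack.std(axis=0)
    ```

    Pixels with large standard deviation mark where individual complexes differ from image to image, i.e. the flexible loops discussed above.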

  19. Automated mapping of pharmacy orders from two electronic health record systems to RxNorm within the STRIDE clinical data warehouse.

    PubMed

    Hernandez, Penni; Podchiyska, Tanya; Weber, Susan; Ferris, Todd; Lowe, Henry

    2009-11-14

    The Stanford Translational Research Integrated Database Environment (STRIDE) clinical data warehouse integrates medication information from two Stanford hospitals that use different drug representation systems. To merge this pharmacy data into a single, standards-based model supporting research, we developed an algorithm to map HL7 pharmacy orders to RxNorm concepts. A formal evaluation of this algorithm on 1.5 million pharmacy orders showed that the system could accurately assign pharmacy orders in over 96% of cases. This paper describes the algorithm and discusses some of the causes of failures in mapping to RxNorm.
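    The general shape of such a mapping, normalize the free-text order, then look it up in an index built from the RxNorm release, can be sketched as follows. This is an illustrative toy, not the STRIDE algorithm; the index entries and the `RX123` identifier are invented for the example.

    ```python
    def normalize(order_text):
        """Illustrative normalization of a free-text pharmacy order:
        lowercase, replace punctuation with spaces, collapse whitespace."""
        cleaned = "".join(c if c.isalnum() or c.isspace() else " "
                          for c in order_text)
        return " ".join(cleaned.lower().split())

    def map_to_rxnorm(order_text, rxnorm_index):
        """Look up a normalized order string in a name -> RxCUI index
        (in practice the index would be built from the RxNorm release)."""
        return rxnorm_index.get(normalize(order_text))

    # Hypothetical sample index entry
    index = {"acetaminophen 325 mg tablet": "RX123"}
    ```

    Real systems layer further strategies on top (dose-form synonyms, ingredient decomposition, fuzzy matching), which is where the residual ~4% of failures tends to arise.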

  20. A technique for the determination of Louisiana marsh salinity zone from vegetation mapped by multispectral scanner data: A comparison of satellite and aircraft data

    NASA Technical Reports Server (NTRS)

    Butera, M. K.

    1977-01-01

    Vegetation in selected study areas on the Louisiana coast was mapped using low-altitude aircraft and satellite (LANDSAT) multispectral scanner data. Fresh, brackish, and saline marshes were then determined from the remotely sensed presence of dominant indicator plant associations. These vegetational classifications were achieved from data processed through a standard pattern recognition computer program. The marsh salinity zone maps from the aircraft and satellite data compared favorably within the broad salinity regimes. The salinity zone boundaries determined by remote sensing also compared favorably with those interpolated from line-transect field observations from an earlier year.
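    A classic "standard pattern recognition program" for multispectral scanner data of this era is minimum-distance-to-means classification: each pixel's band vector is assigned to the class whose training-mean it is closest to. A minimal sketch (illustrative only; the study does not specify its actual classifier):

    ```python
    import numpy as np

    def nearest_centroid_classify(pixels, centroids):
        """Minimum-distance-to-means classification of multispectral pixels.
        pixels: (n, bands) array of band vectors.
        centroids: dict mapping class name -> (bands,) mean vector."""
        names = list(centroids)
        means = np.stack([np.asarray(centroids[c], float) for c in names])
        # Squared Euclidean distance from every pixel to every class mean
        d2 = ((pixels[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
        return [names[i] for i in d2.argmin(axis=1)]
    ```

    Mapping each class to an indicator plant association, and each association to a salinity regime, then yields the marsh salinity zone map described above.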
