Sample records for digital object repository

  1. The Research Library's Role in Digital Repository Services: Final Report of the ARL Digital Repository Issues Task Force

    ERIC Educational Resources Information Center

    Association of Research Libraries, 2009

    2009-01-01

    Libraries are making diverse contributions to the development of many types of digital repositories, particularly those housing locally created digital content, including new digital objects or digitized versions of locally held works. In some instances, libraries are managing a repository and its related services entirely on their own, but often…

  2. Preservation of Earth Science Data History with Digital Content Repository Technology

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.

    2011-12-01

    An increasing need for derived and on-demand data products in Earth Science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this need presents additional challenges in managing data processing history information and delivering it to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover product as one of the input driver datasets for participating terrestrial biospheric models. The global 1 km resolution SYNMAP data was created by harmonizing three remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated to half- and quarter-degree resolution and then enhanced with more detailed grassland and cropland types. Currently, no effective mechanism exists to convey this data processing information to the different modeling teams so that they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within a digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover the processing details involved in each derivation step.
    Coupled with the Drupal web content management system, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in the FGDC standard, and the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation: lineage information is a major factor in keeping digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository but also across multiple distributed digital repositories. Along with emerging identification mechanisms such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver data processing history information in the Earth Science research domain, with selected data archived at the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.
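
    The derivation tracking described in this record can be illustrated with a small sketch: data products as digital objects carrying derivation references, so a derived product (here the SYNMAP-based MsTMIP driver data) can be traced back to its source products. The class, identifiers, and structure are illustrative assumptions, not the ORNL DAAC's actual implementation.

```python
# Minimal sketch: data products as digital objects with derivation
# references, so lineage can be walked back to the original sources.
# Class and identifiers are hypothetical, for illustration only.

class DigitalObject:
    def __init__(self, pid, title, derived_from=None):
        self.pid = pid                          # persistent identifier
        self.title = title
        self.derived_from = derived_from or []  # references to parent objects

    def lineage(self):
        """Walk derivation references back to the original source products."""
        chain = [self.pid]
        for parent in self.derived_from:
            chain.extend(parent.lineage())
        return chain

glcc    = DigitalObject("obj:glcc", "GLCC land cover")
glc2000 = DigitalObject("obj:glc2000", "GLC2000 land cover")
modis   = DigitalObject("obj:modis-lc", "MODIS land cover")
synmap  = DigitalObject("obj:synmap", "SYNMAP 1 km harmonized land cover",
                        derived_from=[glcc, glc2000, modis])
mstmip  = DigitalObject("obj:mstmip-lc", "MsTMIP half-degree enhanced land cover",
                        derived_from=[synmap])

print(mstmip.lineage())
# ['obj:mstmip-lc', 'obj:synmap', 'obj:glcc', 'obj:glc2000', 'obj:modis-lc']
```

    In a real repository the parent references would point across repositories via DOIs rather than in-memory objects, but the traversal idea is the same.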

  3. Designing Learning Object Repositories as Systems for Managing Educational Communities Knowledge

    ERIC Educational Resources Information Center

    Sampson, Demetrios G.; Zervas, Panagiotis

    2013-01-01

    Over the past years, a number of international initiatives that recognize the importance of sharing and reusing digital educational resources among educational communities through the use of Learning Object Repositories (LORs) have emerged. Typically, these initiatives focus on collecting digital educational resources that are offered by their…

  4. Datasets2Tools, repository and search engine for bioinformatics datasets, tools and canned analyses

    PubMed Central

    Torre, Denis; Krawczuk, Patrycja; Jagodnik, Kathleen M.; Lachmann, Alexander; Wang, Zichen; Wang, Lily; Kuleshov, Maxim V.; Ma’ayan, Avi

    2018-01-01

    Biomedical data repositories such as the Gene Expression Omnibus (GEO) enable the search and discovery of relevant biomedical digital data objects. Similarly, resources such as OMICtools index bioinformatics tools that can extract knowledge from these digital data objects. However, systematic access to pre-generated ‘canned’ analyses applied by bioinformatics tools to biomedical digital data objects is currently not available. Datasets2Tools is a repository indexing 31,473 canned bioinformatics analyses applied to 6,431 datasets. The Datasets2Tools repository also indexes 4,901 published bioinformatics software tools and all the analyzed datasets. Datasets2Tools enables users to rapidly find datasets, tools, and canned analyses through an intuitive web interface, a Google Chrome extension, and an API. Furthermore, Datasets2Tools provides a platform for contributing canned analyses, datasets, and tools, as well as evaluating these digital objects according to their compliance with the findable, accessible, interoperable, and reusable (FAIR) principles. By incorporating community engagement, Datasets2Tools promotes sharing of digital resources to stimulate the extraction of knowledge from biomedical research data. Datasets2Tools is freely available from: http://amp.pharm.mssm.edu/datasets2tools. PMID:29485625
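
    The FAIR evaluation mentioned in this abstract can be sketched as a simple scoring function over the four principles. The questions and the equal-weight scoring scheme below are hypothetical assumptions for illustration, not Datasets2Tools's actual rubric.

```python
# Illustrative sketch of scoring a digital object against the four FAIR
# principles. The questions and equal weighting are assumptions, not the
# Datasets2Tools evaluation form.

FAIR_QUESTIONS = {
    "findable":      "Does the object have a persistent identifier and rich metadata?",
    "accessible":    "Is the object retrievable via a standard, open protocol?",
    "interoperable": "Does it use shared vocabularies and open formats?",
    "reusable":      "Is there a clear license and sufficient provenance?",
}

def fair_score(answers):
    """answers: dict mapping principle -> bool; returns fraction satisfied."""
    return sum(answers.get(p, False) for p in FAIR_QUESTIONS) / len(FAIR_QUESTIONS)

print(fair_score({"findable": True, "accessible": True,
                  "interoperable": False, "reusable": True}))  # 0.75
```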

  5. Datasets2Tools, repository and search engine for bioinformatics datasets, tools and canned analyses.

    PubMed

    Torre, Denis; Krawczuk, Patrycja; Jagodnik, Kathleen M; Lachmann, Alexander; Wang, Zichen; Wang, Lily; Kuleshov, Maxim V; Ma'ayan, Avi

    2018-02-27

    Biomedical data repositories such as the Gene Expression Omnibus (GEO) enable the search and discovery of relevant biomedical digital data objects. Similarly, resources such as OMICtools index bioinformatics tools that can extract knowledge from these digital data objects. However, systematic access to pre-generated 'canned' analyses applied by bioinformatics tools to biomedical digital data objects is currently not available. Datasets2Tools is a repository indexing 31,473 canned bioinformatics analyses applied to 6,431 datasets. The Datasets2Tools repository also indexes 4,901 published bioinformatics software tools and all the analyzed datasets. Datasets2Tools enables users to rapidly find datasets, tools, and canned analyses through an intuitive web interface, a Google Chrome extension, and an API. Furthermore, Datasets2Tools provides a platform for contributing canned analyses, datasets, and tools, as well as evaluating these digital objects according to their compliance with the findable, accessible, interoperable, and reusable (FAIR) principles. By incorporating community engagement, Datasets2Tools promotes sharing of digital resources to stimulate the extraction of knowledge from biomedical research data. Datasets2Tools is freely available from: http://amp.pharm.mssm.edu/datasets2tools.

  6. Quality Assurance for Digital Learning Object Repositories: Issues for the Metadata Creation Process

    ERIC Educational Resources Information Center

    Currier, Sarah; Barton, Jane; O'Beirne, Ronan; Ryan, Ben

    2004-01-01

    Metadata enables users to find the resources they require, therefore it is an important component of any digital learning object repository. Much work has already been done within the learning technology community to assure metadata quality, focused on the development of metadata standards, specifications and vocabularies and their implementation…

  7. History, Context, and Policies of a Learning Object Repository

    ERIC Educational Resources Information Center

    Simpson, Steven Marshall

    2016-01-01

    Learning object repositories, a form of digital libraries, are robust systems that provide educators new ways to search for educational resources, collaborate with peers, and provide instruction to students in unique and varied ways. This study examines a learning object repository created by a large suburban school district to increase teaching…

  8. Trends in the Evolution of the Public Web, 1998-2002; The Fedora Project: An Open-source Digital Object Repository Management System; State of the Dublin Core Metadata Initiative, April 2003; Preservation Metadata; How Many People Search the ERIC Database Each Day?

    ERIC Educational Resources Information Center

    O'Neill, Edward T.; Lavoie, Brian F.; Bennett, Rick; Staples, Thornton; Wayland, Ross; Payette, Sandra; Dekkers, Makx; Weibel, Stuart; Searle, Sam; Thompson, Dave; Rudner, Lawrence M.

    2003-01-01

    Includes five articles that examine key trends in the development of the public Web: size and growth, internationalization, and metadata usage; Flexible Extensible Digital Object and Repository Architecture (Fedora) for use in digital libraries; developments in the Dublin Core Metadata Initiative (DCMI); the National Library of New Zealand Te Puna…

  9. New directions in medical e-curricula and the use of digital repositories.

    PubMed

    Fleiszer, David M; Posel, Nancy H; Steacy, Sean P

    2004-03-01

    Medical educators involved in the growth of multimedia-enhanced e-curricula are increasingly aware of the need for digital repositories to catalogue, store and ensure access to learning objects that are integrated within their online material. The experience at the Faculty of Medicine at McGill University during initial development of a mainstream electronic curriculum reflects this growing recognition that repositories can facilitate the development of a more comprehensive and effective electronic curriculum. Digital repositories can also help to ensure efficient utilization of resources through the use, re-use, and reprocessing of multimedia learning objects, addressing the potential for collaboration among repositories and increasing available material exponentially. The authors review different approaches to the development of a digital repository application, as well as global and specific issues that should be examined in the initial requirements definition and development phase, to ensure that current initiatives meet long-term requirements. Often, decisions regarding the creation of e-curricula and associated digital repositories are left to interested faculty and their individual development teams. However, the development of an e-curriculum and digital repository is not predominantly a technical exercise, but rather one that affects global pedagogical strategies and curricular content and involves a commitment of large-scale resources. The outcomes of these decisions can have long-term consequences and, as such, should involve faculty at the highest levels, including the dean.

  10. Assuring the Quality of Agricultural Learning Repositories: Issues for the Learning Object Metadata Creation Process of the CGIAR

    NASA Astrophysics Data System (ADS)

    Zschocke, Thomas; Beniest, Jan

    The Consultative Group on International Agricultural Research (CGIAR) has established a digital repository to share its teaching and learning resources along with descriptive educational information based on the IEEE Learning Object Metadata (LOM) standard. Quality metadata are a critical component of any digital repository: they not only enable users to find the resources they require more easily, but also support the operation and interoperability of the repository itself. Studies show that repositories have difficulty obtaining good quality metadata from their contributors, especially when the process involves many different stakeholders, as is the case with an international organization such as the CGIAR. To address this issue, the CGIAR began investigating the Open ECBCheck and ISO/IEC 19796-1 standards to establish quality protocols for its training. The paper highlights the implications and challenges of strengthening the metadata creation workflow for disseminating learning objects of the CGIAR.
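
    A metadata creation workflow of the kind discussed here often includes an automated completeness check before a record is accepted. The sketch below tests a flat record against a required subset of IEEE LOM-style fields; the field names follow LOM category conventions, but the required set and record values are illustrative assumptions, not the CGIAR's actual application profile.

```python
# Sketch of a pre-submission completeness check for IEEE LOM-style metadata.
# The required-field set and the sample record are hypothetical.

REQUIRED_FIELDS = [
    "general.title",
    "general.description",
    "general.language",
    "lifecycle.contribute",
    "educational.learningResourceType",
    "rights.copyrightAndOtherRestrictions",
]

def missing_fields(record):
    """Return the required fields that are absent or empty in a flat record."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

record = {
    "general.title": "Agroforestry field methods",
    "general.language": "en",
    "rights.copyrightAndOtherRestrictions": "yes",
}
print(missing_fields(record))
# ['general.description', 'lifecycle.contribute', 'educational.learningResourceType']
```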

  11. Audit of a Scientific Data Center for Certification as a Trustworthy Digital Repository: A Case Study

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2011-12-01

    Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.

  12. Preservation of Digital Objects.

    ERIC Educational Resources Information Center

    Galloway, Patricia

    2004-01-01

    Presents a literature review that covers the following topics related to preservation of digital objects: practical examples; stakeholders; recordkeeping standards; genre-specific problems; trusted repository standards; preservation methods; preservation metadata standards; and future directions. (Contains 82 references.) (MES)

  13. Ontological Modeling of Educational Resources: A Proposed Implementation for Greek Schools

    ERIC Educational Resources Information Center

    Poulakakis, Yannis; Vassilakis, Kostas; Kalogiannakis, Michail; Panagiotakis, Spyros

    2017-01-01

    In eLearning context searching for suitable educational material is still a challenging issue. During the last two decades, various digital repositories, such as Learning Object Repositories, institutional repositories and latterly Open Educational Resources, have been developed to accommodate collections of learning material that can be used for…

  14. Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*

    PubMed Central

    Piorun, Mary; Palmer, Lisa A.

    2008-01-01

    Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648
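
    The reported figures in this record can be cross-checked with a few lines of arithmetic: 320 titles at 170 minutes each, and $23,562 total at $0.28 per page. The page count is an implied quantity derived from those two cost figures, not a number stated in the abstract.

```python
# Cross-checking the digitization project's reported figures.
titles = 320
minutes_per_title = 170
total_cost = 23_562      # USD
cost_per_page = 0.28     # USD

total_hours = titles * minutes_per_title / 60
implied_pages = total_cost / cost_per_page

print(round(total_hours))    # ~907 hours, consistent with the reported 906
print(round(implied_pages))  # ~84,150 pages implied by the per-page cost
```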

  15. A Digital Library for Education: The PEN-DOR Project.

    ERIC Educational Resources Information Center

    Fullerton, Karen; Greenberg, Jane; McClure, Maureen; Rasmussen, Edie; Stewart, Darin

    1999-01-01

    Describes Pen-DOR (Pennsylvania Education Network Digital Object Repository), a digital library designed to provide K-12 educators with access to multimedia resources and tools to create new lesson plans and modify existing ones via the World Wide Web. Discusses design problems of a distributed, object-oriented database architecture and describes…

  16. Using OAI-PMH and METS for Exporting Metadata and Digital Objects between Repositories

    ERIC Educational Resources Information Center

    Bell, Jonathan; Lewis, Stuart

    2006-01-01

    Purpose: To examine the relationship between deposit of electronic theses in institutional and archival repositories. Specifically the paper considers the automated export of theses for deposit in the archival repository in continuation of the existing arrangement in Wales for paper-based theses. Design/methodology/approach: The paper presents a…
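
    The automated export discussed in this record rests on OAI-PMH harvesting: the archival repository issues a ListRecords request against the institutional repository's OAI endpoint and parses Dublin Core records from the XML response. In the sketch below, the verb, metadataPrefix parameter, and namespaces are standard OAI-PMH/oai_dc; the endpoint URL and the truncated sample response are hypothetical.

```python
# Sketch of an OAI-PMH harvest: build a ListRecords request URL and parse
# dc:title elements from a response. Endpoint and response are illustrative.
import urllib.parse
import xml.etree.ElementTree as ET

BASE_URL = "https://repository.example.ac.uk/oai"  # hypothetical endpoint

def list_records_url(base, metadata_prefix="oai_dc", set_spec=None):
    """Build a standard OAI-PMH ListRecords request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec
    return base + "?" + urllib.parse.urlencode(params)

print(list_records_url(BASE_URL, set_spec="theses"))

# Extracting titles from a (truncated, illustrative) OAI-PMH response:
RESPONSE = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords><record><metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>An Electronic Thesis</dc:title>
    </oai_dc:dc>
  </metadata></record></ListRecords>
</OAI-PMH>"""

root = ET.fromstring(RESPONSE)
titles = [e.text for e in root.iter("{http://purl.org/dc/elements/1.1/}title")]
print(titles)  # ['An Electronic Thesis']
```

    A METS wrapper would carry the structural map and file references for the deposited thesis alongside this descriptive metadata.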

  17. iLOG: A Framework for Automatic Annotation of Learning Objects with Empirical Usage Metadata

    ERIC Educational Resources Information Center

    Miller, L. D.; Soh, Leen-Kiat; Samal, Ashok; Nugent, Gwen

    2012-01-01

    Learning objects (LOs) are digital or non-digital entities used for learning, education or training, commonly stored in repositories searchable by their associated metadata. Unfortunately, under the current standards, such metadata is often missing or incorrectly entered, making search difficult or impossible. In this paper, we investigate…

  18. A metadata-driven approach to data repository design.

    PubMed

    Harvey, Matthew J; McLean, Andrew; Rzepa, Henry S

    2017-01-01

    The design and use of a metadata-driven data repository for research data management is described. Metadata is collected automatically during the submission process whenever possible and is registered with DataCite in accordance with their current metadata schema, in exchange for a persistent digital object identifier. Two examples of data preview are illustrated, including the demonstration of a method for integration with commercial software that confers rich domain-specific data analytics without introducing customisation into the repository itself.
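
    The registration step described here involves sending DataCite a metadata record in exchange for a DOI. The sketch below assembles the core of such a record; the field names follow the DataCite metadata schema in outline, but the example values are invented and the exact payload shape and API call would need to be checked against DataCite's current documentation.

```python
# Sketch of assembling core DataCite-style metadata attributes ahead of DOI
# registration. Field names follow the DataCite schema in outline; all
# values here are hypothetical examples.
import json

def datacite_attributes(title, creators, publisher, year, resource_type):
    return {
        "titles": [{"title": title}],
        "creators": [{"name": c} for c in creators],
        "publisher": publisher,
        "publicationYear": year,
        "types": {"resourceTypeGeneral": resource_type},
    }

attrs = datacite_attributes(
    title="NMR spectra for compound 17",   # hypothetical dataset title
    creators=["Harvey, Matthew J.", "Rzepa, Henry S."],
    publisher="Imperial College London",
    year=2017,
    resource_type="Dataset",
)
print(json.dumps(attrs, indent=2))
```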

  19. Personal Name Identification in the Practice of Digital Repositories

    ERIC Educational Resources Information Center

    Xia, Jingfeng

    2006-01-01

    Purpose: To propose improvements to the identification of authors' names in digital repositories. Design/methodology/approach: Analysis of current name authorities in digital resources, particularly in digital repositories, and analysis of some features of existing repository applications. Findings: This paper finds that the variations of authors'…

  20. Content and Knowledge Management in a Digital Library and Museum.

    ERIC Educational Resources Information Center

    Yeh, Jian-Hua; Chang, Jia-Yang; Oyang, Yen-Jen

    2000-01-01

    Discusses the design of the National Taiwan University Digital Library and Museum that addresses both content and knowledge management. Describes a two-tier repository architecture that facilitates content management, includes an object-oriented model to facilitate the management of temporal information, and eliminates the need to manually…

  21. Digital Library Storage using iRODS Data Grids

    NASA Astrophysics Data System (ADS)

    Hedges, Mark; Blanke, Tobias; Hasan, Adil

    Digital repository software provides a powerful and flexible infrastructure for managing and delivering complex digital resources and metadata. However, issues can arise in managing the very large, distributed data files that may constitute these resources. This paper describes an implementation approach that combines the Fedora digital repository software with a storage layer implemented as a data grid, using the iRODS middleware developed by DICE (Data Intensive Cyber Environments) as the successor to SRB. This approach allows us to use Fedora's flexible architecture to manage the structure of resources and to provide application-layer services to users. The grid-based storage layer provides efficient support for managing and processing the underlying distributed data objects, which may be very large (e.g. audio-visual material). The Rule Engine built into iRODS is used to integrate complex workflows at the data level that need not be visible to users, e.g. digital preservation functionality.

  22. Criteria for the evaluation and certification of long-term digital archives in the earth sciences

    NASA Astrophysics Data System (ADS)

    Klump, Jens

    2010-05-01

    Digital information has become an indispensable part of our cultural and scientific heritage. Scientific findings, historical documents and cultural achievements are to a rapidly increasing extent being presented in electronic form - in many cases exclusively so. However, besides the invaluable advantages offered by this form, it also carries a serious disadvantage: users need to invest a great deal of technical effort in accessing the information. Also, the underlying technology is still undergoing further development at an exceptionally fast pace. The rapid obsolescence of the technology required to read the information combined with the frequently imperceptible physical decay of the media themselves represents a serious threat to preservation of the information content. Many data sets in earth science research are from observations that cannot be repeated. This makes these digital assets particularly valuable. Therefore, these data should be kept and made available for re-use long after the end of the project from which they originated. Since research projects only run for a relatively short period of time, it is advisable to shift the burden of responsibility for long-term data curation from the individual researcher to a trusted data repository or archive. But what makes a trusted data repository? Each trusted digital repository has its own targets and specifications. The trustworthiness of digital repositories can be tested and assessed on the basis of a criteria catalogue. This is the main focus of the work of the nestor working group "Trusted repositories - Certification". It identifies criteria which permit the trustworthiness of a digital repository to be evaluated, both at the organisational and technical levels. The criteria are defined in close collaboration with a wide range of different memory organisations, producers of information, experts and other interested parties. 
    This open approach ensures a high degree of universal validity, suitability for daily practical use and broad-based acceptance of the results. The criteria catalogue is also intended to present the option of documenting trustworthiness by means of certification in a standardised national or international process. The criteria catalogue is based on the Reference Model for an Open Archival Information System (OAIS, ISO 14721:2003). With its broad approach, the nestor criteria catalogue for trusted digital repositories has to remain at a high level of abstraction. For application in the earth sciences, the evaluation criteria need to be transferred into the context of earth science data and their designated user community. This presentation offers a brief introduction to the problems surrounding the long-term preservation of digital objects, followed by a proposed application of the criteria catalogue for trusted digital repositories to the context of earth science data and their long-term preservation.

  23. Digital Repositories and the Question of Data Usefulness

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Downs, R. R.

    2017-12-01

    The advent of ISO standards for trustworthy long-term digital repositories provides both a set of principles for developing long-term data repositories and the instruments to assess them for trustworthiness. Such mandatory high-level requirements are broad enough to be achievable, to some extent, by many scientific data centers, archives, and other repositories. But the requirement that the data be useful in the future, usually considered the most relevant to the value of the repository for its user communities, largely remains subject to varying interpretations and misunderstandings. However, current and future users will be relying on repositories to preserve and disseminate the data and information needed to discover, understand, and utilize these resources to support their research, learning, and decision-making objectives. Therefore, further study is needed to determine the approaches that repositories can adopt to make data useful to future communities of users. This presentation will describe approaches for enabling scientific data and related information, such as software, to be useful for current and potential future user communities and will present the methodology chosen to make one science discipline's data useful for both current and future users. The method uses an ontology-based information model to define and capture the information necessary to make the data useful for contemporary and future users.

  24. The development of digital library system for drug research information.

    PubMed

    Kim, H J; Kim, S R; Yoo, D S; Lee, S H; Suh, O K; Cho, J H; Shin, H T; Yoon, J P

    1998-01-01

    The sophistication of computer technology and information transmission on the internet has made various cyber information repositories available to information consumers. In the era of the information super-highway, the digital library, which can be accessed from remote sites at any time, is considered the prototype of the information repository. Using an object-oriented DBMS, the very first model of a digital library for pharmaceutical researchers and related professionals in Korea has been developed. The published research papers and researchers' personal information were included in the database. For the database of research papers, 13 domestic journals were abstracted and scanned into full-text image files that can be viewed with internet web browsers. The database of researchers' personal information was also developed and interlinked with the database of research papers. These databases will be continuously updated and will be combined with worldwide information as a unique digital library in the field of pharmacy.

  25. Managing and Evaluating Digital Repositories

    ERIC Educational Resources Information Center

    Zuccala, Alesia; Oppenheim, Charles; Dhiensa, Rajveen

    2008-01-01

    Introduction: We examine the role of the digital repository manager, discuss the future of repository management and evaluation and suggest that library and information science schools develop new repository management curricula. Method: Face-to-face interviews were carried out with managers of five different types of repositories and a Web-based…

  26. My Three Wishes for Digital Repositories. Building Digital Libraries

    ERIC Educational Resources Information Center

    Huwe, Terence K.

    2005-01-01

    In this column on digital repository management, the author defines three areas within the sphere of digital repositories that need work. The first two pertain to information architecture, while the last one pertains to taking action. The author's first "wish" is for top-notch library Web sites that act as a gateway to any sphere of knowledge. He…

  27. [Subject repositories as a strategy of the Open Access initiative].

    PubMed

    Soares Guimarães, M C; da Silva, C H; Horsth Noronha, I

    2012-11-01

    Subject repositories are defined as sets of digital objects resulting from research in a specific disciplinary field; they still occupy a restricted space on the discussion agenda of the Open Access movement compared with the breadth of discussion of institutional repositories. Although subject repositories have come to prominence in the field, especially through the success of initiatives such as arXiv, PubMed and E-prints, the literature on the subject is recognized as very limited. Despite their roots in library and information science and their focus on the management of disciplinary collections (subject-area literature), there is little information available about the development and management of subject repositories. The following text offers a brief summary of the topic as a way to present the potential of developing subject repositories to strengthen the open access initiative.

  28. Digital Rocks Portal: a Sustainable Platform for Data Management, Analysis and Remote Visualization of Volumetric Images of Porous Media

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Ketcham, R. A.

    2017-12-01

    Nanometer- to centimeter-scale imaging, such as (focused ion beam) scattered electron microscopy, magnetic resonance imaging and X-ray (micro)tomography, has since the 1990s produced 2D and 3D datasets of rock microstructure that allow investigation of nonlinear flow and mechanical phenomena on length scales that are otherwise impervious to laboratory measurements. The numerical approaches that use such images produce various upscaled parameters required by subsurface flow and deformation simulators. All of this has revolutionized our knowledge of grain-scale phenomena. However, a lack of data-sharing infrastructure among research groups makes it difficult to integrate different length scales. We have developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (https://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials and (2) improves access to them for a wider community of engineering and geosciences researchers not necessarily trained in computer science or data analysis. The Digital Rocks Portal (NSF EarthCube Grant 1541008) is the first repository for imaged porous microstructure data. It is implemented within the reliable, 24/7-maintained high performance computing infrastructure supported by the Texas Advanced Computing Center (University of Texas at Austin). Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative. We show how the data can be documented, referenced in publications via digital object identifiers (see Figure below for examples), visualized, searched for and linked to other repositories. We show the recently implemented integration of remote parallel visualization and bulk upload for large datasets, as well as a preliminary flow simulation workflow with the pore structures currently stored in the repository.
We discuss the issues of collecting correct metadata, data discoverability and repository sustainability.

  9. A University Library Creates a Digital Repository for Documenting and Disseminating Community Engagement

    ERIC Educational Resources Information Center

    Miller, William A.; Billings, Marilyn

    2012-01-01

    Digital repositories are new tools for documenting the accumulated scholarly work produced at academic institutions and disseminating that material broadly via the internet. Digital repositories support all file types and can be adapted to meet the custom design specifications of individual institutions. A section for community engagement…

  10. Digital Libraries and Repositories in India: An Evaluative Study

    ERIC Educational Resources Information Center

    Mittal, Rekha; Mahesh, G.

    2008-01-01

    Purpose: The purpose of this research is to identify and evaluate the collections within digital libraries and repositories in India available in the public domain. Design/methodology/approach: The digital libraries and repositories were identified through a study of the literature, as well as internet searching and browsing. The resulting digital…

  11. Digital Rocks Portal: Preservation, Sharing, Remote Visualization and Automated Analysis of Imaged Datasets

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Ketcham, R. A.; Hanlon, M.; Pettengill, M.; Ranganath, A.; Venkatesh, A.

    2016-12-01

    Due to advances in imaging modalities such as X-ray microtomography and scanning electron microscopy, 2D and 3D imaged datasets of rock microstructure on nanometer to centimeter length scales allow investigation of nonlinear flow and mechanical phenomena using numerical approaches. This in turn produces various upscaled parameters required by subsurface flow and deformation simulators. However, a single research group typically specializes in one imaging modality and/or related modeling on a single length scale, and a lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (http://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geosciences and engineering researchers not necessarily trained in computer science or data analysis. Our objective is to enable scientific inquiry and engineering decisions founded on a data-driven basis. We show how the data loaded in the portal can be documented, referenced in publications via digital object identifiers, visualized, and linked to other repositories. We then show preliminary results on integrating remote parallel visualization and a flow-simulation workflow with the pore structures currently stored in the repository. We finally discuss the issues of collecting correct metadata, data discoverability and repository sustainability. This is the first repository for this particular type of data, but it is part of the wider ecosystem of geoscience data and model cyberinfrastructure called "EarthCube" (http://earthcube.org/) sponsored by the National Science Foundation.
For data sustainability and continuous access, the portal is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.

  12. Photogrammetric Analysis of Historical Image Repositories for Virtual Reconstruction in the Field of Digital Humanities

    NASA Astrophysics Data System (ADS)

    Maiwald, F.; Vietze, T.; Schneider, D.; Henze, F.; Münster, S.; Niebling, F.

    2017-02-01

    Historical photographs contain a high density of information and are of great importance as sources in humanities research. In addition to the semantic indexing of historical images based on metadata, it is also possible to reconstruct geometric information about the depicted objects or the camera position at the time of the recording by employing photogrammetric methods. The approach presented here investigates (semi-)automated photogrammetric reconstruction methods for heterogeneous collections of historical (city) photographs and photographic documentation for use in the humanities, urban research and historical sciences. From a photogrammetric point of view, these images are mostly digitized photographs. For a photogrammetric evaluation, therefore, the characteristics of scanned analog images with mostly unknown camera geometry, missing or minimal object information, and low radiometric and geometric resolution have to be considered. In addition, these photographs were not created specifically for documentation purposes, so the focus of these images is often not on the object to be evaluated. The image repositories must therefore be subjected to a preprocessing analysis of their photogrammetric usability. Investigations are carried out on the basis of a repository containing historical images of the Kronentor ("crown gate") of the Dresden Zwinger. The initial step was to assess the quality and condition of available images, determining their appropriateness for generating three-dimensional point clouds from historical photos using a structure-from-motion (SfM) evaluation. Then, the generated point clouds were assessed by comparing them with current measurement data of the same object.

  13. Enhancing the Reuse of Digital Resources for Integrated Systems to Represent, Understand and Dynamize Complex Interactions in Architectural Cultural Heritage Environments

    NASA Astrophysics Data System (ADS)

    Delgado, F. J.; Martinez, R.; Finat, J.; Martinez, J.; Puche, J. C.; Finat, F. J.

    2013-07-01

    In this work we develop a multiply interconnected system involving objects, agents and the interactions between them, based on the use of ICT applied to open repositories, user communities and web services. Our approach is applied to Architectural Cultural Heritage Environments (ACHE). It includes components for digital accessibility (of augmented ACHE repositories), content management (ontologies for the semantic web), semi-automatic recognition (to ease the reuse of materials) and serious video games (for interaction in urban environments). Their combination provides support for local real/remote virtual tourism (including tools for low-level real-time display of rendering on portable devices), mobile-based smart interactions (with special regard to monitored environments) and CH-related games (as extended web services). The main contributions to AR models on conventional GIS applied to architectural environments concern interactive support performed directly on digital files, which allows access to CH contents referenced to GIS of urban districts (involving facades and historical or pre-industrial buildings) and/or CH repositories in a ludic and transversal way, so as to acquire cognitive, medial and social abilities in collaborative environments.

  14. Worth the Work

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2010-01-01

    Even in the age of Google, digital repositories can add tremendous value to an institution. Yet creating and maintaining these collections is no small task. Digital repository advocates will concede that the challenges in building and maintaining these collections can daunt even the most intrepid supporters. Three repository directors share their…

  15. Keeping Research Data from the Continental Deep Drilling Programme (KTB) Accessible and Taking First Steps Towards Digital Preservation

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Ulbricht, D.; Conze, R.

    2014-12-01

    The Continental Deep Drilling Programme (KTB) was a scientific drilling project from 1987 to 1995 near Windischeschenbach, Bavaria. The main super-deep borehole reached a depth of 9,101 meters into the Earth's continental crust. The project used the most current equipment for data capture and processing. After the end of the project key data were disseminated through the web portal of the International Continental Scientific Drilling Program (ICDP). The scientific reports were published as printed volumes. As similar projects have also experienced, it becomes increasingly difficult to maintain a data portal over a long time. Changes in software and underlying hardware make a migration of the entire system inevitable. Around 2009 the data presented on the ICDP web portal were migrated to the Scientific Drilling Database (SDDB) and published through DataCite using Digital Object Identifiers (DOI) as persistent identifiers. The SDDB portal used a relational database with a complex data model to store data and metadata. A PHP-based Content Management System with custom modifications made it possible to navigate and browse datasets using the metadata and then download datasets. The data repository software eSciDoc allows storing self-contained packages consistent with the OAIS reference model. Each package consists of binary data files and XML metadata. Using a REST API the packages can be stored in the eSciDoc repository and searched using the XML metadata. During the last maintenance cycle of the SDDB the data and metadata were migrated into the eSciDoc repository. Discovery metadata were generated following the GCMD-DIF, ISO 19115 and DataCite schemas. The eSciDoc repository can store an arbitrary number of XML metadata records with each data object.
In addition to descriptive metadata each data object may contain pointers to related materials, such as IGSN-metadata to link datasets to physical specimens, or identifiers of literature interpreting the data. Datasets are presented by XSLT-stylesheet transformation using the stored metadata. The presentation shows several migration cycles of data and metadata, which were driven by aging software systems. Currently the datasets reside as self-contained entities in a repository system that is ready for digital preservation.
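    The self-contained packages described above pair binary data files with XML metadata and fixity information. The following is a minimal sketch of that idea in Python; the function names and package layout are illustrative, not the actual eSciDoc API.

    ```python
    import hashlib
    import xml.etree.ElementTree as ET

    def make_package(object_id, files, metadata_records):
        """Bundle binary files and XML metadata records into one
        self-contained package with a per-file checksum (OAIS-style)."""
        return {
            "id": object_id,
            "files": {name: {"bytes": data,
                             "sha256": hashlib.sha256(data).hexdigest()}
                      for name, data in files.items()},
            # any number of XML metadata records may accompany an object
            "metadata": {schema: ET.tostring(rec, encoding="unicode")
                         for schema, rec in metadata_records.items()},
        }

    def verify(package):
        """Recompute checksums to confirm the binary content is intact."""
        return all(hashlib.sha256(f["bytes"]).hexdigest() == f["sha256"]
                   for f in package["files"].values())

    # Hypothetical dataset from the KTB borehole
    dc = ET.Element("dc")
    ET.SubElement(dc, "title").text = "KTB core log"
    pkg = make_package("ktb:0001",
                       {"log.csv": b"depth,temp\n9101,265\n"},
                       {"datacite": dc})
    assert verify(pkg)
    ```

    Keeping data, metadata, and fixity together in one entity is what makes such packages survive the repeated system migrations the abstract describes.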

  16. Digitizing dissertations for an institutional repository: a process and cost analysis.

    PubMed

    Piorun, Mary; Palmer, Lisa A

    2008-07-01

    This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions.
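    The cost figures in this abstract can be cross-checked with simple arithmetic. A quick sketch (the implied page count is my inference from the reported totals, not a number stated in the abstract):

    ```python
    # Figures reported in the abstract
    total_cost = 23_562        # USD
    cost_per_page = 0.28       # USD per page
    titles = 320
    minutes_per_title = 170

    # Implied total page count from cost figures
    pages = total_cost / cost_per_page
    # Total processing time; the abstract reports 906 hours (rounded down)
    hours = titles * minutes_per_title / 60

    print(round(pages))   # 84150
    print(round(hours))   # 907
    ```

    The per-title time and total hours are mutually consistent (320 × 170 min ≈ 906.7 h), which supports the paper's claim that small, well-defined digitization projects have predictable costs.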

  17. Evolution of a Digital Repository: One Institution's Experience

    ERIC Educational Resources Information Center

    Owen, Terry M.

    2011-01-01

    In this article, the development of a digital repository is examined, specifically how the focus on acquiring content for the repository has transitioned from faculty-published research to include the gray literature produced by the research centers on campus, including unpublished technical reports and undergraduate research from honors programs.…

  18. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    NASA Astrophysics Data System (ADS)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked, internally and externally, on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly hard to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated at the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as audit trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use the Fedora Commons framework and its digital object abstraction as the repository, the Drupal CMS as the user interface, and the Islandora module as the connector from Drupal to the Fedora repository. With the digital object model, descriptive and provenance metadata can be associated with data content in a formal manner, as can external references and other arbitrary auxiliary information. Changes to an object are formally audited, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed with RDF triples. Data replication, recovery, and metadata export are supported via standard protocols, such as OAI-PMH.
We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
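    The RDF triples mentioned above let a user walk a derived product back to its origin. A minimal sketch of that traversal, using illustrative object identifiers (patterned on the SYNMAP aggregation example) rather than Fedora's actual relationship vocabulary:

    ```python
    # Each (subject, predicate, object) triple records a formal
    # relationship between digital objects, as in Fedora's RELS-EXT.
    # The identifiers and the predicate name here are hypothetical.
    triples = [
        ("obj:synmap_qtr_deg",  "isDerivationOf", "obj:synmap_half_deg"),
        ("obj:synmap_half_deg", "isDerivationOf", "obj:synmap_1km"),
    ]

    def provenance_chain(obj, triples):
        """Follow 'isDerivationOf' links back to the original product."""
        lookup = {s: o for s, p, o in triples if p == "isDerivationOf"}
        chain = [obj]
        while chain[-1] in lookup:
            chain.append(lookup[chain[-1]])
        return chain

    print(provenance_chain("obj:synmap_qtr_deg", triples))
    # ['obj:synmap_qtr_deg', 'obj:synmap_half_deg', 'obj:synmap_1km']
    ```

    Because the relationships are stored as data on the objects themselves rather than in application code, the same chain survives migrations between front-end systems.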

  19. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently, a repository should be evaluated on whether or not it is effective in its data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archival Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  20. Discovery and Use of Online Learning Resources: Case Study Findings

    ERIC Educational Resources Information Center

    Recker, Mimi M.; Dorward, James; Nelson, Laurie Miller

    2004-01-01

    Much recent research and funding have focused on building Internet-based repositories that contain collections of high-quality learning resources, often called "learning objects." Yet little is known about how non-specialist users, in particular teachers, find, access, and use digital learning resources. To address this gap, this article…

  1. So You Want to Be Trustworthy: A Repository's Guide to Taking Reasonable Steps Towards Achieving ISO 16363

    NASA Astrophysics Data System (ADS)

    Stall, S.

    2016-12-01

    To be trustworthy is to be reliable, dependable, honest, principled, ethical, incorruptible, and more. A trustworthy person demonstrates these qualities over time and under all circumstances. A trustworthy repository demonstrates these qualities through the team that manages the repository and its responsible organization. The requirements of a Trusted Digital Repository (TDR) in ISO 16363 can be tough to reach and tough to maintain. Challenges include: limited funds, limited resources and/or skills, and an unclear path to successfully achieving the requirements. The ISO standard defines each requirement separately, but a successful certification recognizes that there are many cross-dependencies among the requirements. Understanding these dependencies leads to a more efficient path towards success. At AGU we recognize that reaching the goal of the TDR ISO standard, or any set of data management objectives defined by an organization, has a better chance at success if the organization clearly knows its current capability, the improvements that are needed, and the best way to make (and maintain) those changes. AGU has partnered with the CMMI® Institute to adapt their Data Management Maturity (DMM)SM model within the Earth and space sciences. Using the DMM, AGU developed a new Data Management Assessment Program aimed at helping data repositories, large and small, domain-specific to general, assess and improve data management practices to meet their goals - including becoming a Trustworthy Digital Repository. The requirements to achieve the TDR ISO standard are aligned with the data management best practices defined in the DMM model. Using the DMM as a process improvement tool in conjunction with the Data Management Assessment method, a team seeking the objective of the TDR ISO standard receives a clear road map to achieving their goal as an outcome of the assessment. 
    Publishers and agencies are beginning to recommend or even require that repositories demonstrate that they follow best practices or meet certain standards. Data preserved in a data facility that is working toward a TDR standard will have the level of care desired by the publishing community as well as the science community. Better data management results in better science.

  2. Task-Based Navigation of a Taxonomy Interface to a Digital Repository

    ERIC Educational Resources Information Center

    Khoo, Christopher S. G.; Wang, Zhonghong; Chaudhry, Abdus Sattar

    2012-01-01

    Introduction: This is a study of hierarchical navigation; how users browse a taxonomy-based interface to an organizational repository to locate information resources. The study is part of a project to develop a taxonomy for a library and information science department to organize resources and support user browsing in a digital repository.…

  3. Transcribing and digitizing eighteenth- and nineteenth-century letters for a historical digital repository.

    PubMed

    Dunster, Emily S; Kipnis, Daniel G; Angelo, F Michael

    2014-01-01

    In fall 2011, the Scott Memorial Library purchased 53 letters belonging to an 1841 graduate of Jefferson Medical College, John Plimpton Green. The library staff transcribed and digitized the letters, creating an online collection in the university's institutional repository, Jefferson Digital Commons. This article will detail the process of transcribing and digitizing the collection along with sharing statistics and the benefits of this project to global researchers.

  4. Transportation plan repository and archive.

    DOT National Transportation Integrated Search

    2011-04-01

    This project created a repository and archive for transportation planning documents in Texas within the established Texas A&M Repository (http://digital.library.tamu.edu). This transportation planning archive and repository provides ready access ...

  5. Visualizing research collections in the National Transportation Library's digital repository : ROSA P.

    DOT National Transportation Integrated Search

    2017-01-01

    The National Transportation Library's (NTL) Repository and Open Science Portal (ROSA P) is a digital library for transportation, including U.S. Department of Transportation-sponsored research results and technical publications, other documents a...

  6. Data repositories for medical education research: issues and recommendations.

    PubMed

    Schwartz, Alan; Pappas, Cleo; Sandlow, Leslie J

    2010-05-01

    The authors explore issues surrounding digital repositories with the twofold intention of clarifying their creation, structure, content, and use, and considering the implementation of a global digital repository for medical education research data sets - an online site where medical education researchers would be encouraged to deposit their data in order to facilitate the reuse and reanalysis of the data by other researchers. By motivating data sharing and reuse, investigators, medical schools, and other stakeholders might see substantial benefits to their own endeavors and to the progress of the field of medical education. The authors review digital repositories in medicine, social sciences, and education, describe the contents and scope of repositories, and present extant examples. The authors describe the potential benefits of a medical education data repository and report results of a survey of the Society of Directors of Research in Medical Education, in which participants responded to questions about data sharing and a potential data repository. Respondents strongly endorsed data sharing, with the caveat that principal investigators should choose whether or not to share data they collect. A large majority believed that a repository would benefit their unit and the field of medical education. Few reported using existing repositories. Finally, the authors consider challenges to the establishment of such a repository, including taxonomic organization, intellectual property concerns, human subjects protection, technological infrastructure, and evaluation standards. The authors conclude with recommendations for how a medical education data repository could be successfully developed.

  7. Access and preservation of digital research content: Linked open data services - A research library perspective

    NASA Astrophysics Data System (ADS)

    Kraft, Angelina; Sens, Irina; Löwe, Peter; Dreyer, Britta

    2016-04-01

    Globally resolvable, persistent digital identifiers have become an essential tool to enable unambiguous links between published research results and their underlying digital resources. In addition, this unambiguous identification allows citation. In an ideal research world, any scientific content should be citable and the coherent content, as well as the citation itself, should be persistent. However, today's scientists do not just produce traditional research papers - they produce comprehensive digital collections of objects which, alongside digital texts, include digital resources such as research data, audiovisual media, digital lab journals, images, statistics and software code. Researchers start to look for services which allow management of these digital resources with minimum time investment. In light of this, we show how the German National Library of Science and Technology (TIB) develops supportive frameworks to accompany the life cycle of scientific knowledge generation and transfer. This includes technical infrastructures for • indexing, cataloguing, digital preservation, DOI names and licencing for text and digital objects (the TIB DOI registration, active since 2004) and • a digital repository for the deposition and provision of accessible, traceable and citeable research data (RADAR). One particular problem for the management of data originating from (collaborating) research infrastructures is their dynamic nature in terms of growth, access rights and quality. On a global scale, systems for access and preservation are in place for the big data domains (e.g. environmental sciences, space, climate). However, the stewardship for disciplines without a tradition of data sharing, including the fields of the so-called long tail, remains uncertain. The RADAR - Research Data Repository - project establishes a generic end-point data repository, which can be used in a collaborative way. 
RADAR enables clients to upload, edit, structure and describe their (collaborative) data in an organizational workspace. In such a workspace, administrators and curators can manage access and editorial rights before the data enters the preservation and optional publication phase. RADAR applies different PID strategies for closed vs. open data. For closed datasets, RADAR uses handles as identifiers and offers format-independent data preservation for between 5 and 15 years, which can also be prolonged. By default, preserved data are only available to the respective data curators, who may selectively grant other researchers access to preserved data. For open datasets, RADAR provides a Digital Object Identifier (DOI) to enable researchers to clearly reference and reuse data and to guarantee data accessibility. RADAR offers the publication service of research data together with format-independent data preservation for an unlimited time period. Each published dataset can be enriched with discipline-specific metadata, and an optional embargo period can be specified. With these two services, RADAR aims to meet demands from a broad range of specialized research disciplines: to provide secure, citable data storage for researchers who need to retain restricted access to data on the one hand, and an e-infrastructure which allows research data to be stored, found, managed, annotated, cited, curated and published in a digital platform available 24/7, on the other.
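    The two-track identifier policy described above can be summarized as a simple decision rule. The sketch below restates it in Python; the function and field names are illustrative, not RADAR's actual API.

    ```python
    def pid_policy(is_open):
        """Return the identifier scheme and preservation terms RADAR
        applies, as described in the abstract: handles for closed data
        with finite (extendable) retention, DOIs for open data with
        unlimited retention."""
        if is_open:
            return {"identifier": "DOI",
                    "preservation": "unlimited",
                    "access": "public, optional embargo"}
        return {"identifier": "Handle",
                "preservation": "5-15 years, extendable",
                "access": "curators only, access grantable"}

    assert pid_policy(True)["identifier"] == "DOI"
    assert pid_policy(False)["identifier"] == "Handle"
    ```

    Separating the two tracks lets the repository guarantee citability for published data without forcing premature publication of restricted datasets.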

  8. Information Technology and the Evolution of the Library

    DTIC Science & Technology

    2009-03-01

    Resource Commons/ Repository/ Federated Search ILS (GLADIS/Pathfinder - Millenium)/ Catalog/ Circulation/ Acquisitions/ Digital Object Content...content management services to help centralize and distribute digital content from across the institution, software to allow for seamless federated searching across multiple databases, and imaging software to allow for daily reimaging of terminals to reduce security concerns that otherwise

  9. Preparing for a Trustworthiness Assessment of the National Transportation Library’s Digital Repository ROSA P

    DOT National Transportation Integrated Search

    2018-01-01

    The National Transportation Library (NTL) is an all-digital repository of transportation knowledge that falls under federal mandates to serve as a central clearinghouse for transportation data and information of the Federal Government, as well ...

  10. ERM Ideas and Innovations: Digital Repository Management as ERM

    ERIC Educational Resources Information Center

    Pinkas, María M.; Lin, Na

    2014-01-01

    This article describes the application of electronic resources management (ERM) to digital repository management at the Health Sciences and Human Services Library at the University of Maryland, Baltimore. The authors discuss electronic resources management techniques, through the application of "Techniques for Electronic Management,"…

  11. A Digital Broadcast Item (DBI) enabling metadata repository for digital, interactive television (digiTV) feedback channel networks

    NASA Astrophysics Data System (ADS)

    Lugmayr, Artur R.; Mailaparampil, Anurag; Tico, Florina; Kalli, Seppo; Creutzburg, Reiner

    2003-01-01

    Digital television (digiTV) is an additional multimedia environment, where metadata is one key element for the description of arbitrary content. This implies adequate structures for content description, which is provided by XML metadata schemes (e.g. MPEG-7, MPEG-21). Content and metadata management is the task of a multimedia repository, from which digiTV clients - equipped with an Internet connection - can access rich additional multimedia types over an "All-HTTP" protocol layer. Within this research work, we focus on conceptual design issues of a metadata repository for the storage of metadata, accessible from the feedback channel of a local set-top box. Our concept describes the whole heterogeneous life-cycle chain of XML metadata from the service provider to the digiTV equipment, device independent representation of content, accessing and querying the metadata repository, management of metadata related to digiTV, and interconnection of basic system components (http front-end, relational database system, and servlet container). We present our conceptual test configuration of a metadata repository that is aimed at a real-world deployment, done within the scope of the future interaction (fiTV) project at the Digital Media Institute (DMI) Tampere (www.futureinteraction.tv).
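    The metadata chain described above ends with a client querying XML content descriptions held in the repository. A minimal sketch of that last step; the element names are simplified illustrations, not actual MPEG-7 descriptors.

    ```python
    import xml.etree.ElementTree as ET

    # An XML content description of the kind a digiTV client would
    # retrieve from the repository's HTTP front-end.
    record = ET.fromstring(
        "<Description>"
        "  <Title>Evening news</Title>"
        "  <Genre>news</Genre>"
        "</Description>"
    )

    def titles_by_genre(records, genre):
        """Select content titles matching a genre from parsed records."""
        return [r.findtext("Title") for r in records
                if r.findtext("Genre") == genre]

    print(titles_by_genre([record], "news"))  # ['Evening news']
    ```

    In the architecture sketched in the abstract, the parsing and selection would happen server-side (servlet container plus relational database), with the set-top box receiving only the matching descriptions over HTTP.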

  12. Embracing the Future: Embedding Digital Repositories in the University of London. Technical Report

    ERIC Educational Resources Information Center

    Hoorens, Stijn; van Dijk, Lidia Villalba; van Stolk, Christian

    2008-01-01

    Digital repositories can help Higher Education Institutions (HEIs) to develop coherent and coordinated approaches to capture, identify, store and retrieve intellectual assets such as datasets, course material and research papers. With the advances of technology, an increasing number of Higher Education Institutions are implementing digital…

  13. Developing criteria to establish Trusted Digital Repositories

    USGS Publications Warehouse

    Faundeen, John L.

    2017-01-01

    This paper details the drivers, methods, and outcomes of the U.S. Geological Survey’s quest to establish criteria by which to judge its own digital preservation resources as Trusted Digital Repositories. Drivers included recent U.S. legislation focused on data and asset management conducted by federal agencies spending $100M USD or more annually on research activities. The methods entailed seeking existing evaluation criteria from national and international organizations such as International Standards Organization (ISO), U.S. Library of Congress, and Data Seal of Approval upon which to model USGS repository evaluations. Certification, complexity, cost, and usability of existing evaluation models were key considerations. The selected evaluation method was derived to allow the repository evaluation process to be transparent, understandable, and defensible; factors that are critical for judging competing, internal units. Implementing the chosen evaluation criteria involved establishing a cross-agency, multi-disciplinary team that interfaced across the organization. 

  14. Data Publication: The Evolving Lifecycle

    NASA Astrophysics Data System (ADS)

    Studwell, S.; Elliott, J.; Anderson, A.

    2015-12-01

    Datasets are recognized as valuable information entities in their own right that, now and in the future, need to be available for citation, discovery, retrieval and reuse. The U.S. Department of Energy's Office of Scientific and Technical Information (OSTI) provides Digital Object Identifiers (DOIs) to DOE-funded data through a partnership with DataCite. The Geothermal Data Repository (GDR) has been using OSTI's Data ID Service since summer 2014 and is a success story for data publishing in several different ways. This presentation attributes the initial success to the insistence of DOE's Geothermal Technologies Office on detailed planning, robust data curation, and submitter participation. OSTI widely disseminates these data products across both U.S. and international platforms and continually enhances the Data ID Service to facilitate better linkage between published literature, supplementary data components, and the underlying datasets within the structure of the GDR repository. Issues of granularity in DOI assignment, the role of new federal government guidelines on public access to digital data, and the challenges still ahead will be addressed.

  15. Training and Best Practice Guidelines: Implications for Metadata Creation

    ERIC Educational Resources Information Center

    Chuttur, Mohammad Y.

    2012-01-01

    In response to the rapid development of digital libraries over the past decade, researchers have focused on the use of metadata as an effective means to support resource discovery within online repositories. With the increasing involvement of libraries in digitization projects and the growing number of institutional repositories, it is anticipated…

  16. The Use of Digital Repositories for Enhancing Teacher Pedagogical Performance

    ERIC Educational Resources Information Center

    Cohen, Anat; Kalimi, Sharon; Nachmias, Rafi

    2013-01-01

    This research examines the usage of local learning-material repositories at school, as well as teachers' related attitudes and training. The study investigates the use of these repositories for enhancing teacher performance and assesses whether assimilation of the local repositories increases teachers' usage of and contribution to them. One…

  17. Learning Object Repositories

    ERIC Educational Resources Information Center

    Lehman, Rosemary

    2007-01-01

    This chapter looks at the development and nature of learning objects, meta-tagging standards and taxonomies, learning object repositories, learning object repository characteristics, and types of learning object repositories, with type examples. (Contains 1 table.)

  18. The Emperor's New Repository

    ERIC Educational Resources Information Center

    Chudnov, Daniel

    2008-01-01

    The author does not know the first thing about building digital repositories. Maybe that is a strange thing to say, given that he works in a repository development group now, worked on the original DSpace project years ago, and worked on a few repository research projects in between. Given how long he has been around people and projects aiming to…

  19. The NCAR Digital Asset Services Hub (DASH): Implementing Unified Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Stott, D.; Worley, S. J.; Hou, C. Y.; Nienhouse, E.

    2017-12-01

    The National Center for Atmospheric Research (NCAR) Directorate created the Data Stewardship Engineering Team (DSET) to plan and implement an integrated single entry point for uniform digital asset discovery and access across the organization, in order to improve the efficiency of access, reduce costs, and establish the foundation for interoperability with other federated systems. This effort supports new policies included in federal funding mandates, NSF data management requirements, and journal citation recommendations. An inventory during the early planning stage identified diverse asset types across the organization, including publications, datasets, metadata, models, images, and software tools and code. The NCAR Digital Asset Services Hub (DASH) is being developed and phased in this year to improve the quality of users' experiences in finding and using these assets. DASH provides engagement, training, search, and support through the following four nodes (see figure). DASH Metadata: DASH provides resources for creating and cataloging metadata in the NCAR Dialect, a subset of ISO 19115. NMDEdit, an editor based on a European open source application, has been configured for manual entry of NCAR metadata. CKAN, an open source data portal platform, harvests these XML records (along with records output directly from databases) from a Web Accessible Folder (WAF) on GitHub for validation. DASH Search: The NCAR Dialect metadata drives cross-organization search and discovery through CKAN, which provides the display interface for search results. DASH search will establish interoperability by facilitating metadata sharing with other federated systems. DASH Consulting: The DASH Data Curation & Stewardship Coordinator assists with Data Management (DM) Plan preparation and advises on Digital Object Identifiers. The coordinator arranges training sessions on the DASH metadata tools and DM planning, and provides one-on-one assistance as requested. 
DASH Repository: A repository is under development for NCAR datasets not currently held in existing lab-managed archives. The DASH repository will be under NCAR governance and meet Trustworthy Repositories Audit & Certification (TRAC) requirements. This poster will highlight the processes, lessons learned, and current status of the DASH effort at NCAR.
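
    The harvest-and-validate step described above (CKAN pulling XML metadata records from a Web Accessible Folder for validation) can be sketched in miniature. This is an illustrative sketch only: the element names and the required-field list below are simplified assumptions, not the actual NCAR Dialect / ISO 19115 schema.

```python
# Sketch of validating one harvested XML metadata record, loosely modeled on
# the DASH workflow. Element names are simplified placeholders, not the real
# NCAR Dialect / ISO 19115 structure.
import xml.etree.ElementTree as ET

REQUIRED_FIELDS = ["title", "abstract", "contact"]  # assumed minimal subset

def validate_record(xml_text):
    """Return the list of required fields missing from one harvested record."""
    root = ET.fromstring(xml_text)
    return [f for f in REQUIRED_FIELDS if root.find(f) is None]

record = """<metadata>
  <title>Example NCAR dataset</title>
  <abstract>A short description.</abstract>
</metadata>"""

missing = validate_record(record)
print(missing)  # the sample record lacks a contact element
```

    A real harvester would fetch each record's URL from the WAF listing and validate against the full schema; the check above only shows the shape of the validation step.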

  20. ROSA P : The National Transportation Library’s Repository and Open Science Access Portal

    DOT National Transportation Integrated Search

    2018-01-01

    The National Transportation Library (NTL) was founded as an all-digital repository of US DOT research reports, technical publications and data products. NTL's primary public offering is ROSA P, the Repository and Open Science Access Portal. An open...

  1. Trustworthy Digital Repositories: Building Trust the Old Fashion Way, EARNING IT.

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Shepherd, A.; Rauch, S.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Allison, M. D.; Copley, N. J.; Ake, H.; York, A.

    2016-12-01

    There are several drivers increasing the importance of high-quality data management and curation in today's research process (e.g., the OSTP PARR memo, journal publishers, funders, academic and private institutions), and proper management is necessary throughout the data lifecycle to enable reuse and reproducibility of results. Many digital data repositories can satisfy the basic management needs of an investigator looking to share data (i.e., publish data in the public domain), but repository services vary greatly, and not all provide mature services that facilitate discovery, access, and reuse of research data. Domain-specific repositories play a vital role in the data curation process by working closely with investigators to create robust metadata, perform first-order QC, and assemble and publish research data. In addition, they may employ technologies and services that enable increased discovery, access, and long-term archiving. However, smaller domain facilities operate with varying levels of capacity and curation ability. Within this repository environment, individual investigators (driven by publishers, funders, or institutions) need to find trustworthy repositories for their data, and funders need to direct investigators to quality repositories to ensure a return on their investment. So how can one determine the best home for valuable research data? Metrics can be applied to varying aspects of data curation, and many credentialing organizations offer services that assess and certify the trustworthiness of a given data management facility. Unfortunately, many of these certifications can be inaccessible to a small repository in terms of cost, time, or scope. Are there alternatives? This presentation will discuss methods and approaches used by the Biological and Chemical Oceanography Data Management Office (BCO-DMO; a domain-specific, intermediate digital data repository) to demonstrate trustworthiness in the face of a daunting accreditation landscape.

  2. Persistent Identifiers for Field Expeditions: A Next Step for the US Oceanographic Research Fleet

    NASA Astrophysics Data System (ADS)

    Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen

    2016-04-01

    Oceanographic research cruises are complex affairs, typically requiring an extensive effort to secure the funding, plan the experiment, and mobilize the field party. Yet cruises are not typically published online as first-class digital objects with persistent, citable identifiers linked to the scientific literature. The Rolling Deck to Repository (R2R; info@rvdata.us) program maintains a master catalog of oceanographic cruises for the United States research fleet, currently documenting over 6,000 expeditions on 37 active and retired vessels. In 2015, R2R started routinely publishing a Digital Object Identifier (DOI) for each completed cruise. Cruise DOIs, in turn, are linked to related persistent identifiers where available, including the Open Researcher and Contributor ID (ORCID) for members of the science party, the International Geo Sample Number (IGSN) for physical specimens collected during the cruise, Open Funder Registry (FundRef) identifiers for the funders that supported the experiment, and additional DOIs for datasets, journal articles, and other products resulting from the cruise. Publishing a persistent identifier for each field expedition will facilitate interoperability between the many repositories that hold research products from cruises; will give credit to the investigators who secured the funding and carried out the experiment; and will facilitate the gathering of fleet-wide altmetrics that demonstrate the broad impact of oceanographic research.
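
    The linking pattern described above, one cruise DOI referencing several kinds of persistent identifiers, can be illustrated with a small record. All identifiers, field names, and the helper function below are invented for illustration and do not reflect R2R's actual schema.

```python
# Illustrative cruise record (not R2R's real data model): a cruise DOI linked
# to ORCID iDs, IGSNs, funder identifiers, and related product DOIs.
cruise = {
    "doi": "10.7284/EXAMPLE",                                    # hypothetical cruise DOI
    "vessel": "R/V Example",
    "science_party": ["https://orcid.org/0000-0000-0000-0000"],  # ORCID iDs
    "samples": ["IGSN:EXAMPLE0001"],                             # physical specimens
    "funders": ["10.13039/100000001"],                           # funder registry code
    "related_products": ["10.1000/example-dataset"],             # dataset DOI
}

def related_identifiers(record):
    """Flatten every linked persistent identifier, e.g. for citation-graph building."""
    keys = ("science_party", "samples", "funders", "related_products")
    return [pid for k in keys for pid in record[k]]

print(len(related_identifiers(cruise)))  # 4 linked identifiers
```

    The value of the pattern is that any repository holding one of the linked products can resolve back to the cruise DOI and from there to every sibling product.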

  3. Asset Reuse of Images from a Repository

    ERIC Educational Resources Information Center

    Herman, Deirdre

    2014-01-01

    According to Markus's theory of reuse, when digital repositories are deployed to collect and distribute organizational assets, they supposedly help ensure accountability, extend information exchange, and improve productivity. Such repositories require a large investment due to the continuing costs of hardware, software, user licenses, training,…

  4. Developing an Automatic Crawling System for Populating a Digital Repository of Professional Development Resources: A Pilot Study

    ERIC Educational Resources Information Center

    Park, Jung-ran; Yang, Chris; Tosaka, Yuji; Ping, Qing; Mimouni, Houda El

    2016-01-01

    This study is a part of the larger project that develops a sustainable digital repository of professional development resources on emerging data standards and technologies for data organization and management in libraries. Toward that end, the project team developed an automated workflow to crawl for, monitor, and classify relevant web objects…

  5. Loose, Falling Characters and Sentences: The Persistence of the OCR Problem in Digital Repository E-Books

    ERIC Educational Resources Information Center

    Kichuk, Diana

    2015-01-01

    The electronic conversion of scanned image files to readable text using optical character recognition (OCR) software and the subsequent migration of raw OCR text to e-book text file formats are key remediation or media conversion technologies used in digital repository e-book production. Despite real progress, the OCR problem of reliability and…

  6. Collaboration Nation: The Building of the Welsh Repository Network

    ERIC Educational Resources Information Center

    Knowles, Jacqueline

    2010-01-01

    Purpose: The purpose of this paper is to disseminate information about the Welsh Repository Network (WRN), innovative work being undertaken to build an integrated network of institutional digital repositories. A collaborative approach, in particular through the provision of centralised technical and organisational support, has demonstrated…

  7. TRAC Searchable Research Library

    DTIC Science & Technology

    2016-05-01

    …network accessible document repository for technical documents and similar document artifacts. We used a model-based approach using the Vector… demonstration and model refinement. Subject terms: Knowledge Management, Document Repository, Digital Library, Vector Directional Data Model…

  8. GENESI-DR - A single access point to Earth Science data

    NASA Astrophysics Data System (ADS)

    Cossu, R.; Goncalves, P.; Pacini, F.

    2009-04-01

    The amount of information being generated about our planet is increasing at an exponential rate, but it must be easily accessible in order to apply it to global needs relating to the state of the Earth. Currently, information about the state of the Earth, relevant services, analysis results, applications, and tools is accessible in a very scattered and uncoordinated way, often through individual initiatives from Earth Observation mission operators, scientific institutes dealing with ground measurements, service companies, data catalogues, etc. A dedicated infrastructure providing transparent access to all of this will support Earth Science communities by allowing them to easily and quickly derive objective information and share knowledge across all environmentally sensitive domains. The use of high-speed networks (GÉANT) and experimentation with new technologies, such as BitTorrent, will also contribute to better services for the Earth Science communities. GENESI-DR (Ground European Network for Earth Science Interoperations - Digital Repositories), an ESA-led, European Commission (EC)-funded two-year project, is taking the lead in providing reliable, easy, long-term access to Earth Science data via the Internet. The project will allow scientists from different Earth Science disciplines located across Europe to locate, access, combine, and integrate historical and fresh Earth-related data from space, airborne, and in-situ sensors archived in large distributed repositories. To this end, GENESI-DR builds a federated collection of heterogeneous digital Earth Science repositories. 
The federated digital repositories, seen as service and data providers, will share access to their resources (catalogue functions, data access, processing services, etc.) and will adhere to a common set of standards, policies, and interfaces. End users will be provided with a virtual collection of digital Earth Science data, irrespective of its location among the various federated repositories. The GENESI-DR objectives have led to the identification of the basic infrastructure requirements: • the capability, for Earth Science users, to discover data from different European Earth Science Digital Repositories through the same interface, in a transparent and homogeneous way; • easy and fast access to large volumes of coherently maintained distributed data; • the capability, for DR owners, to easily make their data available to a significantly larger audience with no need to duplicate it in a different storage system. Data discovery is based on a Central Discovery Service, which allows users and applications to easily query information about data collections and products existing in heterogeneous catalogues at federated DR sites. This service can be accessed by users via a web interface, the GENESI-DR Web Portal, or by external applications via open standardized interfaces exposed by the system. The Central Discovery Service identifies the DRs providing products that comply with the user's search criteria and returns the corresponding access points to the requester. By supporting different and efficient data transfer technologies such as HTTPS, GridFTP, and BitTorrent, the infrastructure provides easy and fast access. Conversely, for data publishing, GENESI-DR provides several mechanisms to assist DR owners in producing metadata catalogues. In order to reach its objectives, the GENESI-DR e-Infrastructure will be validated against user needs for accessing and sharing Earth Science data. 
Initially, four specific applications in the land, atmosphere, and marine domains have been selected: • near-real-time orthorectification for agricultural crop monitoring; • urban area mapping in support of emergency response; • data assimilation in GlobModel, addressing major environmental and health issues in Europe, with a particular focus on air quality; • SeaDataNet, to aid environmental assessments and to forecast the physical state of the oceans in near real time. Other applications will complement these during the second half of the project. GENESI-DR also aims to develop common approaches to preserve the historical archives and the ability to access the derived user information as both software and hardware transformations occur. Ensuring access to Earth Science data for future generations is of utmost importance because it enables continuity in knowledge generation. For instance, scientists accessing today's climate change data in 50 years will be able to better understand and detect trends in global warming and apply this knowledge to ongoing natural phenomena. GENESI-DR will work towards harmonising operations and applying approved standards, policies, and interfaces at key Earth Science data repositories. To help with this undertaking, GENESI-DR will establish links with relevant organisations and programmes such as space agencies, institutional environmental programmes, international Earth Science programmes, and standardisation bodies.
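
    The fan-out behaviour of a central discovery service over federated catalogues, as described above, can be sketched with a toy in-memory model. The repository names, records, and query interface below are invented for illustration; the real service queries heterogeneous remote catalogues through standardized interfaces.

```python
# Toy model of a central discovery service: a keyword query is fanned out to
# every federated catalogue, and matching (repository, product) access points
# are returned to the requester. All data here is invented.
CATALOGUES = {
    "eo-archive": [{"product": "SAR scene", "keywords": {"radar", "land"}}],
    "in-situ-repo": [{"product": "Buoy series", "keywords": {"marine", "temperature"}}],
}

def discover(keyword):
    """Return (repository, product) pairs whose records match the search term."""
    hits = []
    for repo, records in CATALOGUES.items():
        for rec in records:
            if keyword in rec["keywords"]:
                hits.append((repo, rec["product"]))
    return hits

print(discover("marine"))  # [('in-situ-repo', 'Buoy series')]
```

    The key design point is that the service returns access points rather than copies of the data, so DR owners never need to duplicate holdings into a central store.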

  9. Building Connections, Collections, and Communities: Increasing the Visibility and Impact of Extension through Institutional Repositories

    ERIC Educational Resources Information Center

    Inefuku, Harrison W.; Franz, Nancy K.

    2015-01-01

    Over the past 20 years, university libraries have developed and manage institutional repositories--digital libraries that provide free, public access to the research, scholarship, and publications of their university's faculty, staff, and students. Although underused by Extension professionals, institutional repositories are powerful tools that…

  10. Embracing the Future: Embedding Digital Repositories in Higher Education Institutions. Research Brief

    ERIC Educational Resources Information Center

    Hoorens, Stijn; van Dijk, Lidia Villalba; van Stolk, Christian

    2009-01-01

    This briefing paper captures the key findings and recommendations of a study commissioned by the Joint Information Systems Committee on aspects of the strategic commitment of institutions to repository sustainability. This project, labelled EMBRACE (EMBedding Repositories And Consortial Enhancement), is aimed at enhancing the functionality,…

  11. Geoscience Digital Data Resource and Repository Service

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.; Schuster, D.; Hou, C. Y.

    2017-12-01

    The open availability and wide accessibility of digital data sets is becoming the norm for geoscience research. The National Science Foundation (NSF) instituted a data management planning requirement in 2011, and many scientific publishers, including the American Geophysical Union and the American Meteorological Society, have recently implemented data archiving and citation policies. Many disciplinary data facilities exist around the community to provide a high level of technical support and expertise for archiving data of particular kinds, or for particular projects. However, a significant number of geoscience research projects do not have the same level of data facility support, due to a combination of factors including the research project's size, funding limitations, or a topic scope without a clear facility match. These projects typically manage data on an ad hoc basis, with limited long-term management and preservation procedures. The NSF is supporting a workshop, to be held in the summer of 2018, to develop requirements and expectations for a Geoscience Digital Data Resource and Repository Service (GeoDaRRS). The vision for the prospective GeoDaRRS is to complement existing NSF-funded data facilities by providing: 1) data management planning support resources for the general community, and 2) repository services for researchers whose data do not fit in any existing repository. Functionally, the GeoDaRRS would support NSF-funded researchers in meeting data archiving requirements set by the NSF and by publishers for the geosciences, thereby ensuring the availability of digital data for use and reuse in scientific research going forward. This presentation will engage the AGU community in discussion about the need for a new digital data repository service, specifically to inform the forthcoming GeoDaRRS workshop.

  12. Extending Digital Repository Architectures to Support Disk Image Preservation and Access

    DTIC Science & Technology

    2011-06-01

    Extending Digital Repository Architectures to Support Disk Image Preservation and Access. Kam Woods, School of Information and Library Science, University of North Carolina, 216 Lenoir Drive, CB #3360, 1-(919)-966-3598, kamwoods@email.unc.edu; Christopher A. Lee, School of Information and Library Science, University of North Carolina, 216 Lenoir Drive, CB #3360, 1-(919)-962-7204, callee@ils.unc.edu; Simson Garfinkel, Graduate School of…

  13. Push and pull models to manage patient consent and licensing of multimedia resources in digital repositories for case-based reasoning.

    PubMed

    Kononowicz, Andrzej A; Zary, Nabil; Davies, David; Heid, Jörn; Woodham, Luke; Hege, Inga

    2011-01-01

    Patient consents for the distribution of multimedia constitute a significant element of case-based repositories in medicine. A technical challenge is posed by the right of patients to withdraw permission to disseminate their images or videos, so a technical mechanism for spreading information about changes in multimedia usage licenses is sought. The authors gained their experience by developing and managing a large (>340 cases) repository of virtual patients within the European project eViP. The solution for disseminating license status should reuse and extend existing metadata standards in medical education. Two methods, PUSH and PULL, are described, differing in the moment of update and in the division of responsibilities between parties in the learning object exchange process. The authors recommend the PUSH scenario because it is better adapted to legal requirements in many countries. It should be stressed that the solution is based on mutual trust between the exchange partners and is therefore most appropriate for educational alliances and consortia. It is hoped that the proposed models for exchanging consent and licensing information will become a crucial part of the technical frameworks for building case-based repositories.
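
    The difference between the two propagation models can be illustrated with a toy sketch: under PUSH, partners are notified at the moment a consent is withdrawn; under PULL, each partner queries the license status when it uses the material. The class and method names below are invented, not taken from the eViP implementation.

```python
# Toy illustration of PUSH vs. PULL license-status propagation for a
# case-based repository. All names are invented for illustration.
class Registry:
    """Holds license status for each case and a list of exchange partners."""
    def __init__(self):
        self.licensed = {"case-42": True}
        self.subscribers = []

    def withdraw(self, case_id):
        """PUSH: mark the case unlicensed and notify partners immediately."""
        self.licensed[case_id] = False
        for partner in self.subscribers:
            partner.revoked.add(case_id)

    def is_licensed(self, case_id):
        """PULL: a partner calls this at use time instead of being notified."""
        return self.licensed[case_id]

class Partner:
    def __init__(self):
        self.revoked = set()

registry, partner = Registry(), Partner()
registry.subscribers.append(partner)
registry.withdraw("case-42")
print("case-42" in partner.revoked, registry.is_licensed("case-42"))
```

    The trade-off mirrors the abstract: PUSH ensures withdrawal takes effect promptly across partners (favoured for legal compliance), while PULL shifts responsibility to the consuming side and tolerates partners that are offline at withdrawal time.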

  14. Citing and Reading Behaviors of High-Energy Physics or How a Community Stopped Worrying about Journals and Learned to Love Repositories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentil-Beccot, Anne; Mele, Salvatore; /CERN

    Contemporary scholarly discourse follows many alternative routes in addition to the three-century-old tradition of publication in peer-reviewed journals. The field of High-Energy Physics (HEP) has explored alternative communication strategies for decades, initially via the mass mailing of paper copies of preliminary manuscripts, then via the inception of the first online repositories and digital libraries. This field is uniquely placed to answer recurrent questions raised by the current trends in scholarly communication: is there an advantage for scientists to make their work available through repositories, often in preliminary form? Is there an advantage to publishing in Open Access journals? Do scientists still read journals or do they use digital repositories? The analysis of citation data demonstrates that free and immediate online dissemination of preprints creates an immense citation advantage in HEP, whereas publication in Open Access journals presents no discernible advantage. In addition, the analysis of clickstreams in the leading digital library of the field shows that HEP scientists seldom read journals, preferring preprints instead.

  15. PGP repository: a plant phenomics and genomics data publication infrastructure

    PubMed Central

    Arend, Daniel; Junker, Astrid; Scholz, Uwe; Schüler, Danuta; Wylie, Juliane; Lange, Matthias

    2016-01-01

    Plant genomics and phenomics represent the most promising tools for accelerating yield gains and overcoming emerging crop productivity bottlenecks. However, accessing this wealth of plant diversity requires the characterization of material using state-of-the-art genomic, phenomic, and molecular technologies and the release of the resulting research data via a long-term stable, open-access portal. Although several international consortia and public resource centres offer services for plant research data management, valuable digital assets remain unpublished and thus inaccessible to the scientific community. Recently, the Leibniz Institute of Plant Genetics and Crop Plant Research and the German Plant Phenotyping Network jointly initiated the Plant Genomics and Phenomics Research Data Repository (PGP) as an infrastructure to comprehensively publish plant research data. This covers in particular cross-domain datasets that are not published in central repositories because of their volume or unsupported data scope, such as image collections from plant phenotyping and microscopy, unfinished genomes, genotyping data, visualizations of morphological plant models, data from mass spectrometry, as well as software and documents. The repository is hosted at the Leibniz Institute of Plant Genetics and Crop Plant Research, using e!DAL as its software infrastructure and a Hierarchical Storage Management system as its archival backend. A newly developed data submission tool was made available to the consortium; it features a high level of automation to lower the barriers to data publication. After an internal review process, data are published under citable digital object identifiers, and a core set of technical metadata is registered at DataCite. The e!DAL-embedded web frontend generates a landing page for each dataset and supports interactive exploration. 
PGP is registered as a research data repository at BioSharing.org, re3data.org, and OpenAIRE as a valid EU Horizon 2020 open data archive. These features, together with the programmatic interface and the support of standard metadata formats, enable PGP to fulfil the FAIR data principles: findable, accessible, interoperable, reusable. Database URL: http://edal.ipk-gatersleben.de/repos/pgp/ PMID:27087305
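
    The publication step described above, minting a citable DOI with a core set of technical metadata registered at DataCite, can be sketched with a small completeness check. The field names below follow the DataCite kernel's mandatory properties; the values and the helper function are invented for illustration and are not part of e!DAL.

```python
# Minimal sketch of the core metadata a repository might register with DataCite
# when minting a dataset DOI. Values are invented; field names follow the
# DataCite kernel's mandatory properties.
MANDATORY = ["identifier", "creators", "title", "publisher",
             "publicationYear", "resourceType"]

dataset = {
    "identifier": "10.5447/EXAMPLE",          # hypothetical DOI
    "creators": ["Doe, Jane"],
    "title": "Example phenotyping image collection",
    "publisher": "Example Institute",
    "publicationYear": 2016,
    "resourceType": "Dataset",
}

def ready_to_register(md):
    """A record is registrable only when every mandatory property is filled."""
    return all(md.get(k) not in (None, "", []) for k in MANDATORY)

print(ready_to_register(dataset))  # True
```

    Gating publication on such a check is one way a submission pipeline can guarantee that every minted DOI resolves to a landing page with citable core metadata.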

  16. Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences

    NASA Astrophysics Data System (ADS)

    Smith, P., II

    2015-12-01

    Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process across the data lifecycle improves data access while also addressing interoperability challenges for the geosciences, such as data description and representation. A lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Researchers on some EarthCube-funded projects have identified these issues as gaps, which can contribute to interoperability issues in data access, discovery, and integration between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and the provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice, contributing to improved understanding of the Earth's physical and biological systems. 
The USGS CDI SSF can be used as a reference model to map to EarthCube-funded projects, with academic research libraries facilitating the data and information asset components of the USGS CDI SSF via institutional repositories and/or digital content management. This session will explore the USGS CDI SSF for cross-discipline collaboration considerations from a library perspective.

  17. 10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 10 (Energy), Vol. 2, revised as of 2010-01-01: Performance objectives for the geologic repository after permanent closure. DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA; Technical Criteria; Postclosure Performance Objectives; § 63.113 Performance objectives for the geologic repository after permanent closure.

  18. Secure remote access to a clinical data repository using a wireless personal digital assistant (PDA).

    PubMed

    Duncan, R G; Shabot, M M

    2000-01-01

    TCP/IP and World-Wide-Web (WWW) technology have become the universal standards for networking and delivery of information. Personal digital assistants (PDAs), cellular telephones, and alphanumeric pagers are rapidly converging on a single pocket device that will leverage wireless TCP/IP networks and WWW protocols and can be used to deliver clinical information and alerts anytime, anywhere. We describe a wireless interface to clinical information for physicians based on Palm Corp.'s Palm VII pocket computer, a wireless digital network, encrypted data transmission, secure web servers, and a clinical data repository (CDR).

  19. Secure remote access to a clinical data repository using a wireless personal digital assistant (PDA).

    PubMed Central

    Duncan, R. G.; Shabot, M. M.

    2000-01-01

    TCP/IP and World-Wide-Web (WWW) technology have become the universal standards for networking and delivery of information. Personal digital assistants (PDAs), cellular telephones, and alphanumeric pagers are rapidly converging on a single pocket device that will leverage wireless TCP/IP networks and WWW protocols and can be used to deliver clinical information and alerts anytime, anywhere. We describe a wireless interface to clinical information for physicians based on Palm Corp.'s Palm VII pocket computer, a wireless digital network, encrypted data transmission, secure web servers, and a clinical data repository (CDR). PMID:11079875

  20. Building Scientific Data's list of recommended data repositories

    NASA Astrophysics Data System (ADS)

    Hufton, A. L.; Khodiyar, V.; Hrynaszkiewicz, I.

    2016-12-01

    When Scientific Data launched in 2014, we provided our authors with a list of recommended data repositories to help them identify data hosting options likely to meet the journal's requirements. This list has grown in size and scope, and is now a central resource for authors across the Nature-titled journals. It has also been used in the development of data deposition policies and recommended repository lists across Springer Nature and at other publishers. Each new addition to the list is assessed against a series of criteria that emphasize the stability of the resource, its commitment to principles of open science, and its implementation of relevant community standards and reporting guidelines. A preference is expressed for repositories that issue digital object identifiers (DOIs) through the DataCite system and that share data under the Creative Commons CC0 waiver. Scientific Data currently lists fourteen repositories that focus on specific areas within the Earth and environmental sciences, as well as the broad-scope repositories Dryad and figshare. Readers can browse and filter datasets published at the journal by host repository using ISA-explorer, a demo tool built by the ISA-tools team at Oxford University [1]. We believe that well-maintained lists like this one help publishers build a network of trust with community data repositories and provide an important complement to more comprehensive data repository indices and more formal certification efforts. In parallel, Scientific Data has also improved its policies to better support submissions from authors using institutional and project-specific repositories, without requiring each to apply for listing individually. 
Online resources:
Journal homepage: http://www.nature.com/scientificdata
Data repository criteria: http://www.nature.com/sdata/policies/data-policies#repo-criteria
Recommended data repositories: http://www.nature.com/sdata/policies/repositories
Archived copies of the list: https://dx.doi.org/10.6084/m9.figshare.1434640.v6
Reference:
[1] Gonzalez-Beltran, A. ISA-explorer: A demo tool for discovering and exploring Scientific Data's ISA-tab metadata. Scientific Data Updates http://blogs.nature.com/scientificdata/2015/12/17/isa-explorer/ (2015).
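
    The record above notes a preference for repositories that issue DOIs through DataCite. As an illustrative aside, the syntax of a bare DOI can be checked against the loose "10.<registrant>/<suffix>" pattern from the DOI Handbook; the regex and function name below are hypothetical, not part of any system cited in the record:

```python
import re

# Loose syntactic check based on the DOI Handbook's "10.<registrant>/<suffix>"
# shape. The regex and function name are illustrative sketches only.
DOI_RE = re.compile(r"^10\.\d{4,9}/\S+$")

def is_doi(candidate: str) -> bool:
    """Return True if the string looks like a bare DOI (no 'doi:' scheme)."""
    return bool(DOI_RE.match(candidate))

# The archived-list DOI from the record passes; a 'doi:'-prefixed form does not.
print(is_doi("10.6084/m9.figshare.1434640.v6"))
```

Note this validates form only; whether a DOI actually resolves is a separate question answered by the resolver.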

  1. The public cancer radiology imaging collections of The Cancer Imaging Archive.

    PubMed

    Prior, Fred; Smith, Kirk; Sharma, Ashish; Kirby, Justin; Tarbox, Lawrence; Clark, Ken; Bennett, William; Nolan, Tracy; Freymann, John

    2017-09-19

    The Cancer Imaging Archive (TCIA) is the U.S. National Cancer Institute's repository for cancer imaging and related information. TCIA contains 30.9 million radiology images representing data collected from approximately 37,568 subjects. This data is organized into collections by tumor-type with many collections also including analytic results or clinical data. TCIA staff carefully de-identify and curate all incoming collections prior to making the information available via web browser or programmatic interfaces. Each published collection within TCIA is assigned a Digital Object Identifier that references the collection. Additionally, researchers who use TCIA data may publish the subset of information used in their analysis by requesting a TCIA generated Digital Object Identifier. This data descriptor is a review of a selected subset of existing publicly available TCIA collections. It outlines the curation and publication methods employed by TCIA and makes available 15 collections of cancer imaging data.

  2. Identifying & Inventorying Legacy Materials for Digitization at the National Transportation Library

    DOT National Transportation Integrated Search

    2018-01-01

    As an all-digital repository of transportation knowledge, the National Transportation Library (NTL) has undertaken several digitization projects over the years to preserve legacy print materials and make them accessible to stakeholders, researchers, ...

  3. Buckets: Smart Objects for Digital Libraries

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.

    2001-01-01

    Current discussion of digital libraries (DLs) is often dominated by the merits of the respective storage, search and retrieval functionality of archives, repositories, search engines, search interfaces and database systems. While these technologies are necessary for information management, the information content is more important than the systems used for its storage and retrieval. Digital information should have the same long-term survivability prospects as traditional hardcopy information and should be protected to the extent possible from evolving search engine technologies and vendor vagaries in database management systems. Information content and information retrieval systems should progress on independent paths and make limited assumptions about the status or capabilities of the other. Digital information can achieve independence from archives and DL systems through the use of buckets. Buckets are an aggregative, intelligent construct for publishing in DLs. Buckets allow the decoupling of information content from information storage and retrieval. Buckets exist within the Smart Objects and Dumb Archives model for DLs in that many of the functionalities and responsibilities traditionally associated with archives are pushed down (making the archives dumber) into the buckets (making them smarter). Some of the responsibilities imbued to buckets are the enforcement of their terms and conditions, and maintenance and display of their contents.

  4. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    ERIC Educational Resources Information Center

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  5. Crowdsourcing Content to Promote Community and Collection Development in Public Libraries

    ERIC Educational Resources Information Center

    Carr, Melissa Eleftherion

    2013-01-01

    With the Poetry Center at San Francisco State University, the author has begun to build an open-access digital repository for poetry chapbooks. The repository is essentially a chapbook exchange: a place for poets to share their current works. Users are invited to share their chapbooks via upload and as such gain access to the chapbook repository.…

  6. Where Will All Your Samples Go?

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

    Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. 
We also need to define standards that institutions must comply with to function as a trustworthy sample repository similar to trustworthy digital repositories. The iSamples Research Coordination Network of the EarthCube program aims to address some of these questions in workshops planned for 2018. This panel session offers an opportunity to ignite the discussion.

  7. 10 CFR 60.112 - Overall system performance objective for the geologic repository after permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... repository after permanent closure. 60.112 Section 60.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Performance Objectives § 60.112 Overall system performance objective for the geologic repository after permanent closure...

  8. NASA's Big Earth Data Initiative Accomplishments

    NASA Technical Reports Server (NTRS)

    Klene, Stephan A.; Pauli, Elisheva; Pressley, Natalie N.; Cechini, Matthew F.; McInerney, Mark

    2017-01-01

    The goal of NASA's effort for the Big Earth Data Initiative (BEDI) is to improve the usability, discoverability, and accessibility of Earth Observation data in support of societal benefit areas. Accomplishments: In support of BEDI goals, datasets have been entered into the Common Metadata Repository (CMR) and made available via the Open-source Project for a Network Data Access Protocol (OPeNDAP), a Digital Object Identifier (DOI) has been registered for each dataset, and, to support fast visualization, many layers have been added to the Global Imagery Browse Services (GIBS).

  9. NASA's Big Earth Data Initiative Accomplishments

    NASA Astrophysics Data System (ADS)

    Klene, S. A.; Pauli, E.; Pressley, N. N.; Cechini, M. F.; McInerney, M.

    2017-12-01

    The goal of NASA's effort for the Big Earth Data Initiative (BEDI) is to improve the usability, discoverability, and accessibility of Earth Observation data in support of societal benefit areas. Accomplishments: In support of BEDI goals, datasets have been entered into the Common Metadata Repository (CMR) and made available via the Open-source Project for a Network Data Access Protocol (OPeNDAP), a Digital Object Identifier (DOI) has been registered for each dataset, and, to support fast visualization, many layers have been added to the Global Imagery Browse Services (GIBS).

  10. What is the Value Proposition of Persistent Identifiers?

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Huber, Robert

    2017-04-01

    Persistent identifiers (PIDs) are widely used today in scientific communication and documentation. Globally unique identification plus persistent resolution of links to referenced digital research objects have been strong selling points for PID systems as enabling technical infrastructures. Novel applications of PID systems in research now go beyond the identification of file-based objects such as literature or data sets and include the identification of dynamically changing datasets accessed through web services, physical objects, persons, and organisations. Not only do we see more use cases, but also a proliferation of identifier systems. An analysis of the PID systems used by 1381 repositories listed in the Registry of Research Data Repositories (re3data.org, status of 14 Dec 2015) showed that many disciplinary data repositories make use of PIDs that are not among the systems promoted by libraries and publishers (DOI, PURL, ARK). This indicates that a number of communities have developed their own PID systems. This raises the question: do we need more identifier systems? What makes their value proposition more appealing than those of already existing systems? On the other hand, some of these new use cases deal with entities outside the digital domain, the original scope of application for PIDs. It is therefore necessary to critically appraise the value propositions of available PID systems and compare them against the requirements of new use cases for PIDs. Undoubtedly, the DOI is the most widely used persistent identifier in scholarly communication. It was originally designed "to link customers with publishers, facilitate electronic commerce, and enable copyright management systems." Today, the DOI system is described as providing "a technical and social infrastructure for the registration and use of persistent interoperable identifiers for use on digital networks". This example shows how value propositions can change over time.
Additional value can be gained by cross-linking between PID systems, enabling new scholarly documentation and evaluation methods: documenting the track record of researchers in publications and successful funding proposals, applying advanced bibliometric approaches, estimating the output and impact of funding, assessing the reuse and subsequent impact of data publications, demonstrating the efficient use of research infrastructures, and so on. This recombination of systems raises a series of new expectations, and each stakeholder group may have its own vision of the benefits and value proposition of PIDs, which might conflict with the visions of others. New PID applications will arise from applying PID systems to semantic web technologies and to the Internet of Things, which extend PID applications beyond digital objects to concepts and things, respectively, raising yet again their own expectations and value propositions. What are we trying to identify? What is the purpose served by identifying it? What are the implications for semantic web technologies? How certain can we be about the identity of an object and its state changes over time (the Ship of Theseus paradox)? In this presentation we will discuss a number of PID use cases and match these against the value propositions offered by a selection of PID systems.

  11. Digital asset management.

    PubMed

    Humphrey, Clinton D; Tollefson, Travis T; Kriet, J David

    2010-05-01

    Facial plastic surgeons are accumulating massive digital image databases with the evolution of photodocumentation and widespread adoption of digital photography. Managing and maximizing the utility of these vast data repositories, or digital asset management (DAM), is a persistent challenge. Developing a DAM workflow that incorporates a file naming algorithm and metadata assignment will increase the utility of a surgeon's digital images. Copyright 2010 Elsevier Inc. All rights reserved.
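
    The record above recommends a DAM workflow built on a file naming algorithm and metadata assignment. The article's actual scheme is not described here, so the following is a hypothetical sketch of one common pattern: a pseudonymized patient hash, capture date, view, and sequence number, which keeps names de-identified, groupable, and chronologically sortable:

```python
import hashlib
from datetime import date

def dam_filename(patient_id: str, captured: date, view: str, seq: int,
                 ext: str = "jpg") -> str:
    """Build a sortable, de-identified image file name.

    The hash_date_view_seq.ext scheme is an illustrative assumption,
    not the authors' published algorithm.
    """
    # Pseudonymize the patient identifier so the file name carries no PHI.
    pid_hash = hashlib.sha256(patient_id.encode("utf-8")).hexdigest()[:8]
    return f"{pid_hash}_{captured.isoformat()}_{view}_{seq:03d}.{ext}"

# Deterministic names mean the same patient's images always share a prefix.
name = dam_filename("MRN-12345", date(2010, 5, 1), "frontal", 7)
```

Because the hash is deterministic, re-importing the same image series reproduces the same prefix, which simplifies de-duplication across the repository.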

  12. Correcting names of bacteria deposited in National Microbial Repositories: an analysed sequence data necessary for taxonomic re-categorization of misclassified bacteria-ONE example, genus Lysinibacillus.

    PubMed

    Rekadwad, Bhagwan N; Gonzalez, Juan M

    2017-08-01

    A re-analysis and digitalization of 16S rRNA gene sequences is presented using Lysinibacillus species (as one example) deposited in National Microbial Repositories in India. Lysinibacillus 16S rRNA gene sequences were digitalized to produce quick response (QR) codes, Chaos Game Representation (CGR), and Frequency of Chaos Game Representation (FCGR) matrices. GC percentage, phylogenetic analysis, and principal component analysis (PCA) were used to differentiate and reclassify the strains under investigation. Seven reasons supporting the authors' assessment that these Lysinibacillus species deposited in National Microbial Repositories are misclassified are given in this paper. On this basis, bacteria deposited in National Microbial Repositories, Lysinibacillus and many others, need re-analysis to confirm their identity. Levels of sequence identity with type strains of related species differ by 2 to 8%, suggesting that reclassification is needed to correctly assign species names to the analyzed Lysinibacillus strains available in National Microbial Repositories.
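
    The record above applies Chaos Game Representation (CGR) and its frequency form (FCGR) to 16S rRNA sequences. A minimal sketch of the classic CGR construction follows; the corner assignment and function names are illustrative, and the paper's exact parameters are not given in the abstract:

```python
# Chaos Game Representation (CGR) of a DNA sequence: each base pulls the
# current point halfway toward its assigned corner of the unit square.
# This corner convention is one common choice; conventions vary by author.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(seq):
    """Return the CGR point for every base, starting from the square's center."""
    x, y = 0.5, 0.5
    points = []
    for base in seq.upper():
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points.append((x, y))
    return points

def fcgr(seq, k=2):
    """Frequency CGR: count CGR points in each cell of a 2^k x 2^k grid."""
    n = 1 << k
    grid = [[0] * n for _ in range(n)]
    for x, y in cgr_points(seq):
        # Clamp to the last cell so a coordinate of exactly 1.0 stays in range.
        grid[min(int(y * n), n - 1)][min(int(x * n), n - 1)] += 1
    return grid
```

Each FCGR cell at resolution k counts occurrences of one k-mer, which is why FCGR matrices serve as alignment-free features for PCA and clustering of sequences.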

  13. Building and Using Digital Repository Certifications across Science

    NASA Astrophysics Data System (ADS)

    McIntosh, L.

    2017-12-01

    When scientific recommendations are made based upon research, the data should be of sufficient quality and integrity to verify the claims, and should reside in a trusted location. Reproducibility, which is key to ensuring the transparency and verifiability of research, hinges not only on the availability of the documentation, analyses, and data, but also on the ongoing accessibility and viability of the files and documents, enhanced through a process of curation. The Research Data Alliance (RDA) is an international, community-driven, action-oriented, virtual organization committed to enabling the open sharing of data by building social and technical bridges. Within the RDA, multiple groups are working on consensus-building around the certification of digital repositories across scientific domains. For this section of the panel, we will discuss the work to date on repository certification from this RDA perspective.

  14. A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.

    PubMed

    Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos

    2016-01-01

    Web-based technologies have been increasingly used in picture archiving and communication systems (PACS), in services related to the storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on the splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results obtained proved that it outperforms conventional approaches, reducing both remote access latency and the required cache storage space.
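
    The record above describes dynamic management of cached DICOM objects in front of a cloud archive. The paper's actual splitting and eviction policy is not given in the abstract; purely as one plausible sketch, a byte-bounded least-recently-used cache keyed by SOP instance UID (class and method names hypothetical) might look like:

```python
from collections import OrderedDict

class LRUObjectCache:
    """Byte-bounded least-recently-used cache keyed by SOP instance UID.

    A hypothetical illustration of dynamic cache management, not the
    mechanism published in the cited article.
    """

    def __init__(self, capacity_bytes: int):
        self.capacity = capacity_bytes
        self.used = 0
        self._items = OrderedDict()  # uid -> object bytes, oldest first

    def get(self, uid: str):
        if uid not in self._items:
            return None  # cache miss: caller would fetch from the cloud archive
        self._items.move_to_end(uid)  # mark as most recently used
        return self._items[uid]

    def put(self, uid: str, blob: bytes) -> None:
        if uid in self._items:
            self.used -= len(self._items.pop(uid))
        self._items[uid] = blob
        self.used += len(blob)
        while self.used > self.capacity:  # evict least recently used objects
            _, evicted = self._items.popitem(last=False)
            self.used -= len(evicted)
```

Bounding by bytes rather than object count matters for imaging workloads, where a single study can dwarf hundreds of small objects.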

  15. JavaScript Access to DICOM Network and Objects in Web Browser.

    PubMed

    Drnasin, Ivan; Grgić, Mislav; Gogić, Goran

    2017-10-01

    The Digital Imaging and Communications in Medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media created demand for non-DICOM access to PACS systems. The ever-increasing use of web browsers, laptops, and handheld devices, as opposed to desktop applications and static organizational computers, led to the development of different web technologies, which the DICOM standards body subsequently accepted as alternative means of access. This paper provides an overview of the current state of development of web access technology to DICOM repositories. It presents a different approach, using the HTML5 features of web browsers through the JavaScript language and the WebSocket protocol to enable real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy, and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.

  16. The Digital Sample: Metadata, Unique Identification, and Links to Data and Publications

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Vinayagamoorthy, S.; Djapic, B.; Klump, J.

    2006-12-01

    A significant part of digital data in the Geosciences refers to physical samples of Earth materials, from igneous rocks to sediment cores to water or gas samples. The application and long-term utility of these sample-based data in research is critically dependent on (a) the availability of information (metadata) about the samples, such as geographical location and time of sampling, or sampling method, (b) links between the different data types available for individual samples that are dispersed in the literature and in digital data repositories, and (c) access to the samples themselves. Major obstacles include incomplete documentation of samples in publications, use of ambiguous sample names, and the lack of a central catalog for finding a sample's archiving location. The International Geo Sample Number (IGSN), managed by the System for Earth Sample Registration (SESAR), provides solutions for these problems. The IGSN is a unique persistent identifier for samples and other GeoObjects that can be obtained by submitting sample metadata to SESAR (www.geosamples.org). If data in a publication are referenced to an IGSN (rather than an ambiguous sample name), sample metadata can readily be extracted from the SESAR database, which is evolving into a Global Sample Catalog that also allows users to locate the owner or curator of a sample. Use of the IGSN in digital data systems allows linkages to be built between distributed data. SESAR is contributing to the development of sample metadata standards. SESAR will integrate the IGSN in persistent, resolvable identifiers based on the handle.net service to advance direct linkages between the digital representation of samples in SESAR (sample profiles) and their related data in the literature and in web-accessible digital data repositories. Technologies outlined by Klump et al. (this session), such as the automatic creation of ontologies by text-mining applications, will be explored for harvesting identifiers of publications and datasets that contain information about a specific sample, in order to establish comprehensive data profiles for samples.
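
    The record above mentions embedding the IGSN in persistent, resolvable identifiers based on the handle.net service. As a minimal sketch of what such resolution could look like, assuming the 10273 handle prefix commonly associated with IGSN (the prefix and function name here are assumptions, not stated in the record):

```python
# Sketch of turning an IGSN into a resolvable Handle System URL.
# The 10273 prefix is an assumed registration detail; verify before relying on it.
IGSN_HANDLE_PREFIX = "10273"

def igsn_to_handle_url(igsn: str) -> str:
    """Build a hdl.handle.net resolver URL for an IGSN.

    IGSNs are case-insensitive and conventionally written in upper case,
    so the identifier is normalized before embedding it in the URL.
    """
    return f"https://hdl.handle.net/{IGSN_HANDLE_PREFIX}/{igsn.upper()}"
```

Publishing such URLs in data systems, rather than bare sample names, is what makes the distributed linkages the record describes machine-followable.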

  17. PGP repository: a plant phenomics and genomics data publication infrastructure.

    PubMed

    Arend, Daniel; Junker, Astrid; Scholz, Uwe; Schüler, Danuta; Wylie, Juliane; Lange, Matthias

    2016-01-01

    Plant genomics and phenomics represent the most promising tools for accelerating yield gains and overcoming emerging crop productivity bottlenecks. However, accessing this wealth of plant diversity requires the characterization of this material using state-of-the-art genomic, phenomic, and molecular technologies, and the release of the resulting research data via a long-term stable, open-access portal. Although several international consortia and public resource centres offer services for plant research data management, valuable digital assets remain unpublished and thus inaccessible to the scientific community. Recently, the Leibniz Institute of Plant Genetics and Crop Plant Research and the German Plant Phenotyping Network have jointly initiated the Plant Genomics and Phenomics Research Data Repository (PGP) as an infrastructure for comprehensively publishing plant research data. This covers in particular cross-domain datasets that are not published in central repositories because of their volume or unsupported data scope, such as image collections from plant phenotyping and microscopy, unfinished genomes, genotyping data, visualizations of morphological plant models, data from mass spectrometry, as well as software and documents. The repository is hosted at the Leibniz Institute of Plant Genetics and Crop Plant Research, using e!DAL as software infrastructure and a hierarchical storage management system as the data archival backend. A newly developed data submission tool featuring a high level of automation was made available to the consortium to lower the barriers to data publication. After an internal review process, data are published under citable digital object identifiers, and a core set of technical metadata is registered at DataCite. The e!DAL-embedded web frontend generates a landing page for each dataset and supports interactive exploration. PGP is registered as a research data repository at BioSharing.org, re3data.org, and OpenAIRE as a valid EU Horizon 2020 open data archive. These features, together with the programmatic interface and the support of standard metadata formats, enable PGP to fulfil the FAIR data principles: findable, accessible, interoperable, reusable. Database URL: http://edal.ipk-gatersleben.de/repos/pgp/. © The Author(s) 2016. Published by Oxford University Press.

  18. Digital Libraries on the Internet.

    ERIC Educational Resources Information Center

    Sharon, Taly; Frank, Ariel J.

    This paper discusses digital libraries on the Internet. The resource repository hierarchy, consisting of two major paradigms, search engines (SEs) and digital libraries, is presented. SEs are classified into three categories: basic-SE, directory, and meta-SE. The following six major characteristics of a library are summarized: collection of data…

  19. Motivations of Faculty Self-Archiving in Institutional Repositories

    ERIC Educational Resources Information Center

    Kim, Jihyun

    2011-01-01

    Professors contribute to Institutional Repositories (IRs) to make their materials widely accessible in keeping with the benefits of Open Access. However, universities' commitment to IRs depends on building trust with faculty and solving copyright concerns. Digital preservation and copyright management in IRs should be strengthened to increase…

  20. Metadata Effectiveness in Internet Discovery: An Analysis of Digital Collection Metadata Elements and Internet Search Engine Keywords

    ERIC Educational Resources Information Center

    Yang, Le

    2016-01-01

    This study analyzed digital item metadata and keywords from Internet search engines to learn what metadata elements actually facilitate discovery of digital collections through Internet keyword searching and how significantly each metadata element affects the discovery of items in a digital repository. The study found that keywords from Internet…

  1. Digital Authenticity and Integrity: Digital Cultural Heritage Documents as Research Resources

    ERIC Educational Resources Information Center

    Bradley, Rachael

    2005-01-01

    This article presents the results of a survey addressing methods of securing digital content and ensuring the content's authenticity and integrity, as well as the perceived importance of authenticity and integrity. The survey was sent to 40 digital repositories in the United States and Canada between June 30 and July 19, 2003. Twenty-two…

  2. Repository Profiles for Atmospheric and Climate Sciences: Capabilities and Trends in Data Services

    NASA Astrophysics Data System (ADS)

    Hou, C. Y.; Thompson, C. A.; Palmer, C. L.

    2014-12-01

    As digital research data proliferate and expectations for open access escalate, the landscape of data repositories is becoming more complex. For example, DataBib currently identifies 980 data repositories across the disciplines, with 117 categorized under Geosciences. In the atmospheric and climate sciences, there are great expectations for the integration and reuse of data for advancing science. To realize this potential, resources are needed that explicate the range of repository options available for locating and depositing open data, their conditions of access and use, and the services and tools they provide. This study profiled 38 open digital repositories in the atmospheric and climate sciences, analyzing each on 55 criteria through content analysis of their websites. The results provide a systematic way to assess and compare capabilities, services, and institutional characteristics and to identify trends across repositories. Selected results from the more detailed outcomes to be presented: Most repositories offer guidance on data format(s) for submission and dissemination. 42% offer authorization-free access. More than half use some type of data identification system such as DOIs. Nearly half offer some data processing, with a similar number providing software or tools. 78.9% request that users cite or acknowledge datasets used and the data center. Only 21.1% recommend specific metadata standards, such as ISO 19115 or Dublin Core, with more than half utilizing a customized metadata scheme. Information on repository certification and accreditation was rarely provided, and information on transfer of rights and data security was uneven. Few provided policy information on preservation, migration, reappraisal, disposal, or long-term sustainability. As repository use increases, it will be important for institutions to make their procedures and policies explicit, to build trust with user communities and improve efficiencies in data sharing.
Resources such as repository profiles will be essential for scientists to weigh options and understand trends in data services across the evolving network of repositories.

  3. 10 CFR 63.111 - Performance objectives for the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Performance objectives for the geologic repository... (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA... repository operations area through permanent closure. (a) Protection against radiation exposures and releases...

  4. Inventory of Shale Formations in the US, Including Geologic, Hydrological, and Mechanical Characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobson, Patrick; Houseworth, James

    2013-11-22

    The objective of this report is to build upon previous compilations of shale formations within many of the major sedimentary basins in the US by developing GIS data delineating isopach and structural depth maps for many of these units. These data are being incorporated into the LANL digital GIS database being developed for determining host rock distribution and depth/thickness parameters consistent with repository design. Methods were developed to assess hydrological and geomechanical properties and conditions for shale formations based on sonic velocity measurements.

  5. Health Professional Learner Attitudes and Use of Digital Learning Resources

    PubMed Central

    Chamberlain, Michael; Morrison, Shane; Kotsanas, George; Keating, Jennifer L; Ilic, Dragan

    2013-01-01

    Background Web-based digital repositories allow educational resources to be accessed efficiently and conveniently from diverse geographic locations, hold a variety of resource formats, enable interactive learning, and facilitate targeted access for the user. Unlike some other learning management systems (LMS), resources can be retrieved through search engines and meta-tagged labels, and content can be streamed, which is particularly useful for multimedia resources. Objective The aim of this study was to examine usage and user experiences of an online learning repository (Physeek) in a population of physiotherapy students. The secondary aim of this project was to examine how students prefer to access resources and which resources they find most helpful. Methods The following data were examined using an audit of the repository server: (1) number of online resources accessed per day in 2010, (2) number of each type of resource accessed, (3) number of resources accessed during business hours (9 am to 5 pm) and outside business hours (years 1-4), (4) session length of each log-on (years 1-4), and (5) video quality (bit rate) of each video accessed. An online questionnaire and 3 focus groups assessed student feedback and self-reported experiences of Physeek. Results Students preferred the support provided by Physeek to other sources of educational material primarily because of its efficiency. Peak usage commonly occurred at times of increased academic need (ie, examination times). Students perceived online repositories as a potential tool to support lifelong learning and health care delivery. Conclusions The results of this study indicate that today’s health professional students welcome the benefits of online learning resources because of their convenience and usability. 
This represents a transition away from traditional learning styles and toward technological learning support and may indicate a growing link between social immersions in Internet-based connections and learning styles. The true potential for Web-based resources to support student learning is as yet unknown. PMID:23324800

  6. Measuring Trust: Standards for Trusted Digital Repositories

    ERIC Educational Resources Information Center

    Dryden, Jean

    2011-01-01

    Ensuring the long-term preservation and use of born-digital and digitized records of enduring value has preoccupied archivists and their cultural heritage siblings for several decades. The professional literature of the 1980s and 1990s bemoans the challenges posed by rapid technological change (and the concomitant obsolescence of hardware and…

  7. Community Stories and Institutional Stewardship: Digital Curation's Dual Roles of Story Creation and Resource Preservation

    ERIC Educational Resources Information Center

    Kunda, Sue; Anderson-Wilk, Mark

    2011-01-01

    Our institutions of record are facing a new digital knowledge management challenge: stakeholder communities are now expecting customized Web interfaces to institutional knowledge repositories, online environments where community members can contribute content and see themselves represented, as well as access archived resources. Digital curation…

  8. Preservation Health Check: Monitoring Threats to Digital Repository Content

    ERIC Educational Resources Information Center

    Kool, Wouter; van der Werf, Titia; Lavoie, Brian

    2014-01-01

    The Preservation Health Check (PHC) project, undertaken as a joint effort by Open Planets Foundation (OPF) and OCLC Research, aims to evaluate the usefulness of the preservation metadata created and maintained by operational repositories for assessing basic preservation properties. The PHC project seeks to develop an implementable logic to support…

  9. Microsoft Repository Version 2 and the Open Information Model.

    ERIC Educational Resources Information Center

    Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David

    1999-01-01

    Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…

  10. High Res at High Speed: Automated Delivery of High-Resolution Images from Digital Library Collections

    ERIC Educational Resources Information Center

    Westbrook, R. Niccole; Watkins, Sean

    2012-01-01

    As primary source materials in the library are digitized and made available online, the focus of related library services is shifting to include new and innovative methods of digital delivery via social media, digital storytelling, and community-based and consortial image repositories. Most images on the Web are not of sufficient quality for most…

  11. 3D Cultural Heritage Documentation: a Comparison Between Different Photogrammetric Software and Their Products

    NASA Astrophysics Data System (ADS)

    Gagliolo, S.; Ausonio, E.; Federici, B.; Ferrando, I.; Passoni, D.; Sguerso, D.

    2018-05-01

    The conservation of Cultural Heritage depends on the availability of means and resources and, consequently, on the possibility of making effective data acquisition operations. In fact, on the one hand, the creation of data repositories allows the description of the present state of the art, in order to preserve the testimonial value and to permit fruition. On the other hand, data acquisition grants metrical knowledge, which is particularly useful for the direct restoration of the surveyed objects through the analysis of their 3D digital models. In the last decades, the continuous increase and improvement of 3D survey techniques and of tools for geometric and digital data management have greatly supported the development of documentation activities. In particular, Photogrammetry is a survey technique highly appropriate for the creation of data repositories in the field of Cultural Heritage, thanks to its cheapness, flexibility, speed, and the opportunity to ensure operators' safety even in hazardous areas. In order to obtain complete documentation, the high precision of the on-site operations must be coupled with an effective post-processing phase. Hence, the authors performed a comparison among some of the photogrammetric software packages currently available, with particular attention to the completeness of the workflow and the quality of the final products.

  12. Oldies, Music Rights, and the Digital Age

    ERIC Educational Resources Information Center

    McDonald, Peter

    2005-01-01

    The author discusses the issue of copyright, oldies, and digital preservation. He examines efforts being made to create digital sound repositories for music recorded prior to 1970 at such places as Yale, Syracuse, the New York Public Library, and the Library of Congress. These issues are explored by contrasting the music industry's concern for loss…

  13. Do You Hear What I See? Assessing Accessibility of Digital Commons and CONTENTdm

    ERIC Educational Resources Information Center

    Walker, Wendy; Keenan, Teressa

    2015-01-01

    This article discusses the accessibility of two content management systems, Berkeley Electronic Press's Digital Commons and OCLC's CONTENTdm, widely used in libraries to host institutional repository and digital collections content. Based on observations by a visually impaired student who used the JAWS screen reader to view the design and display…

  14. Enhancing Scientific Practice and Education through Collaborative Digital Libraries.

    ERIC Educational Resources Information Center

    Maini, Gaurav; Leggett, John J.; Ong, Teongjoo; Wilson, Hugh D.; Reed, Monique D.; Hatch, Stephan L.; Dawson, John E.

    The need for accurate and current scientific information in the fast-paced, Internet-aware world has prompted the scientific community to develop tools that reduce the scientist's time and effort to make digital information available to all interested parties. The availability of such tools has made the Internet a vast digital repository of…

  15. Semantic Linking of Learning Object Repositories to DBpedia

    ERIC Educational Resources Information Center

    Lama, Manuel; Vidal, Juan C.; Otero-Garcia, Estefania; Bugarin, Alberto; Barro, Senen

    2012-01-01

    Large-sized repositories of learning objects (LOs) are difficult to create and also to maintain. In this paper we propose a way to reduce this drawback by improving the classification mechanisms of the LO repositories. Specifically, we present a solution to automate the LO classification of the Universia repository, a collection of more than 15…

  16. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    NASA Astrophysics Data System (ADS)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a real problem in the 21st-century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be made openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase of the Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a Java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful both for practical applications, as a teaching tool and case study for groundwater management, and for informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception through development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. 
Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance of the original case study to serve as a live demonstration of the decision support tool, ensuring the original version remains usable, and 2) an open version of the codebase, executable installation files, and a developer guide available via an open repository, ensuring the source for the application is accessible with version control and the potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.

  17. Components for the Global Digital Object Cloud

    NASA Astrophysics Data System (ADS)

    Glaves, Helen; Hanahoe, Hilary; Weigel, Tobias; Lannom, Larry; Wittenburg, Peter; Koureas, Dimitris; Almas, Bridget

    2017-04-01

    We are at a tipping point in the development of a common conceptual framework and set of tools and components which will revolutionize the management of scientific data. It is widely acknowledged that the current volumes and complexity of data now being collected, and the inevitable and enormous increase in that volume and complexity, have reached the point where action is required. Around 80% of the data generated is being lost after short time periods, and a corresponding amount of time is being wasted by researchers on routine data management tasks. At the same time, and largely in response to this perceived crisis, a number of principles (G8, RDA DFT, FAIR) for the management of scientific data have arisen and been widely endorsed. The danger now is that agreement will stop at the level of principles and that multiple non-interoperable domain- and technology-specific silos will continue to arise, all based on the abstract principles. If this happens, we will lose the opportunity to create a common set of low-level tools and components based on an agreed conceptual approach. The Research Data Alliance (RDA) is now combining recommendations from its individual working and interest groups, such as suggestions for proper citation of dynamic data or how to assess the quality of repositories, to design configurations of core components (as specified by RDA and other initiatives such as W3C) and stimulate their implementation. Together with a few global communities such as climate modeling, biodiversity, and materials science, experts involved in RDA are developing a concept called the Global Digital Object Cloud (GDOC), which has the potential to overcome the huge fragmentation that hampers efficient data management and re-use. 
It is compliant with the FAIR principles insofar as a) it puts Digital Objects (DOs) at its center, b) all DOs are assigned PIDs that are resolvable to useful state information, c) all DOs are associated with metadata, and d) all DO bit sequences are stored in trustworthy repositories. The presentation will give an overview of the types of components involved, the corresponding specifications of RDA, and the concept of the GDOC.
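The PID resolution described above can be sketched against the Handle System's public REST proxy at hdl.handle.net; the record structure below mirrors the shape of the proxy's JSON responses, but the PID and locations are hypothetical examples, not real identifiers.

```python
HANDLE_PROXY = "https://hdl.handle.net/api/handles/"

def resolution_url(pid: str) -> str:
    """Build the REST resolution URL for a Handle-style PID (e.g. '10.1594/XYZ')."""
    return HANDLE_PROXY + pid

def extract_locations(record: dict) -> list:
    """Pull the URL-typed values out of a resolved Handle record, i.e. where
    the digital object's bit sequences or landing page can be found."""
    return [v["data"]["value"] for v in record.get("values", [])
            if v.get("type") == "URL"]

# A sample response in the shape returned by the Handle proxy (hypothetical PID):
sample = {
    "responseCode": 1,
    "handle": "21.T11998/example-do",
    "values": [
        {"type": "URL",
         "data": {"format": "string", "value": "https://repo.example.org/do/42"}},
        {"type": "CHECKSUM",
         "data": {"format": "string", "value": "sha256:abc..."}},
    ],
}
print(extract_locations(sample))  # ['https://repo.example.org/do/42']
```

In a live setting, an HTTP GET on `resolution_url(pid)` would return such a record, so any DO can be tracked from its PID to its current locations and state information.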

  18. ESDORA: A Data Archive Infrastructure Using Digital Object Model and Open Source Frameworks

    NASA Astrophysics Data System (ADS)

    Shrestha, Biva; Pan, Jerry; Green, Jim; Palanisamy, Giriprakash; Wei, Yaxing; Lenhardt, W.; Cook, R. Bob; Wilson, B. E.; Leggott, M.

    2011-12-01

    There is an array of challenges associated with preserving, managing, and using contemporary scientific data. Large volumes, multiple formats and data services, and the lack of a coherent mechanism for metadata/data management are common issues across data centers. It is often difficult to preserve data history and lineage information, along with other descriptive metadata, hindering the true science value of the archived data products. In this project, we use a digital object abstraction architecture as the information/knowledge framework to address these challenges. We have used the following open-source frameworks: the Fedora Commons repository, the Drupal content management system, Islandora (a Drupal module), and the Apache Solr search engine. The system is an active archive infrastructure for Earth Science data resources, with ingestion, archiving, distribution, and discovery functionalities. We use an ingestion workflow to ingest the data and metadata, in which many different aspects of the data descriptions (including structured and non-structured metadata) are reviewed. The data and metadata are staged during the review phase and published after multiple rounds of review. Each digital object is encoded in XML for long-term preservation of the content and of the relations among the digital items. The software architecture provides a flexible, modularized framework for adding pluggable user-oriented functionality. Solr enables keyword search as well as faceted search, and a home-grown spatial search module is plugged in to allow users to make a spatial selection in a map view. An RDF semantic store within the Fedora Commons repository holds information on data lineage, dissemination services, and text-based metadata. We use the semantic notion "isViewerFor" to register internally or externally referenced URLs, which are rendered within the same web browser when possible. 
With appropriate mapping of content into digital objects, many different data descriptions, including structured metadata, data history, and auditing trails, are captured and coupled with the data content. The semantic store provides a foundation for further uses, including a full-fledged Earth Science ontology for data interpretation or lineage tracking. Datasets from the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) as well as from the Modeling and Synthesis Thematic Data Center (MAST-DC) are used in a testing deployment of the system, which allows us to validate the features and values described here. Overall, we believe that the integrated system is valid, reusable data archive software that provides digital stewardship for Earth Science data content, now and in the future. References: [1] Devarakonda, Ranjeet, and Harold Shanafield. "Drupal: Collaborative framework for science research." Collaboration Technologies and Systems (CTS), 2011 International Conference on. IEEE, 2011. [2] Devarakonda, Ranjeet, et al. "Semantic search integration to climate data." Collaboration Technologies and Systems (CTS), 2014 International Conference on. IEEE, 2014.
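As a rough illustration of the digital object abstraction described above, the sketch below assembles a minimal XML object carrying an identifier, descriptive metadata, and an "isViewerFor" relation. The element names are simplified stand-ins for illustration only, not the actual FOXML encoding used by Fedora Commons.

```python
import xml.etree.ElementTree as ET

def make_digital_object(pid, title, viewer_url):
    """Assemble an illustrative digital object: identifier, descriptive
    metadata, and a semantic relation akin to the 'isViewerFor' notion
    described above (simplified element names, not real FOXML)."""
    obj = ET.Element("digitalObject", {"pid": pid})
    meta = ET.SubElement(obj, "metadata")
    ET.SubElement(meta, "title").text = title
    rels = ET.SubElement(obj, "relations")
    ET.SubElement(rels, "isViewerFor", {"resource": viewer_url})
    return obj

# Hypothetical PID and viewer URL, for illustration:
obj = make_digital_object("esdora:1001", "SYNMAP Land Cover (modified)",
                          "https://daac.example.org/viewer/1001")
xml_text = ET.tostring(obj, encoding="unicode")
print(xml_text)
```

Serializing every object this way keeps the content and its relations in one self-describing XML record, which is what makes the long-term preservation and lineage tracking described above possible.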

  19. Open Access to Physics and Astronomy Theses: A Case Study of the Raman Research Institute Digital Repository

    NASA Astrophysics Data System (ADS)

    Nagaraj, M. N.; Manjunath, M.; Savanur, K. P.; Sheshadri, G.

    2010-10-01

    With the introduction of information technology (IT) and its applications, libraries have started looking for ways to promote their institutes' research output. At the Raman Research Institute (RRI), we have showcased research output such as research papers, newspaper clippings, annual reports, technical reports, and the entire collection of C.V. Raman through the RRI digital repository, using DSpace. Recently, we have added doctoral dissertations to the repository and have made them accessible with the author's permission. In this paper, we describe the challenges and problems encountered in this project. The various stages including policy decisions, the scanning process, getting permissions, metadata standards and other related issues are described. We conclude by making a plea to other institutions also to make their theses available open-access so that this valuable information resource is accessible to all.

  20. The National Geological and Geophysical Data Preservation Program

    NASA Astrophysics Data System (ADS)

    Dickinson, T. L.; Steinmetz, J. C.; Gundersen, L. C.; Pierce, B. S.

    2006-12-01

    The ability to preserve and maintain geoscience data and collections has not kept pace with the growing need for accessible digital information and the technology to make it so. The Nation has lost valuable and unique geologic records and is in danger of losing much more. Many federal and state geological repositories are currently at their capacity for maintaining and storing data or samples. Some repositories are gaining additional, but temporary and substandard space, using transport containers or offsite warehouses where access is limited and storage conditions are poor. Over the past several years, there has been an increasing focus on the state of scientific collections in the United States. For example, the National Geological and Geophysical Data Preservation Program (NGGDPP) Act was passed as part of the Energy Policy Act of 2005, authorizing $30 million in funding for each of five years. The Act directs the U.S. Geological Survey to administer this program that includes a National Digital Catalog and Federal assistance to support our nation's repositories. Implementation of the Program awaits federal appropriations. The NGGDPP is envisioned as a national network of cooperating geoscience materials and data repositories that are operated independently yet guided by unified standards, procedures, and protocols for metadata. The holdings will be widely accessible through a common and mirrored Internet-based catalog (National Digital Catalog). The National Digital Catalog will tie the observations and analyses to the physical materials they come from. Our Nation's geological and geophysical data are invaluable and in some instances irreplaceable due to the destruction of outcrops, urbanization and restricted access. These data will enable the next generation of scientific research and education, enable more effective and efficient research, and may have future economic benefits through the discovery of new oil and gas accumulations, and mineral deposits.

  1. Exploring Characterizations of Learning Object Repositories Using Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Segura, Alejandra; Vidal, Christian; Menendez, Victor; Zapata, Alfredo; Prieto, Manuel

    Learning object repositories provide a platform for the sharing of Web-based educational resources. As these repositories evolve independently, it is difficult for users to have a clear picture of the kind of contents they give access to. Metadata can be used to automatically extract a characterization of these resources by using machine learning techniques. This paper presents an exploratory study carried out on the contents of four public repositories that uses clustering and association rule mining algorithms to extract characterizations of repository contents. The results of the analysis include potential relationships between different attributes of learning objects that may be useful for gaining an understanding of the kinds of resources available and eventually for developing search mechanisms that consider repository descriptions as criteria in federated search.
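The association-rule characterization described above can be sketched over toy learning-object metadata records; the attribute values and thresholds below are invented for illustration, and the rule miner is a minimal one-to-one version of the general technique.

```python
from itertools import combinations
from collections import Counter

# Toy learning-object metadata records (attribute=value items); hypothetical data.
records = [
    {"format=video", "interactivity=high", "level=school"},
    {"format=video", "interactivity=high", "level=university"},
    {"format=text", "interactivity=low", "level=university"},
    {"format=video", "interactivity=high", "level=school"},
]

def association_rules(records, min_support=0.5, min_confidence=0.8):
    """Mine simple one-to-one rules A -> B with support and confidence,
    in the spirit of the repository characterizations described above."""
    n = len(records)
    item_counts = Counter(i for r in records for i in r)
    pair_counts = Counter(p for r in records for p in combinations(sorted(r), 2))
    rules = []
    for (a, b), c in pair_counts.items():
        support = c / n
        if support < min_support:
            continue
        for x, y in ((a, b), (b, a)):          # try both rule directions
            conf = c / item_counts[x]
            if conf >= min_confidence:
                rules.append((x, y, support, conf))
    return rules

for x, y, s, c in association_rules(records):
    print(f"{x} -> {y}  support={s:.2f} confidence={c:.2f}")
```

On this toy data the miner surfaces rules such as "format=video -> interactivity=high", the kind of attribute relationship the study extracts from real repository metadata.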

  2. An Examination of the Adoption of Preservation Metadata in Cultural Heritage Institutions: An Exploratory Study Using Diffusion of Innovations Theory

    ERIC Educational Resources Information Center

    Alemneh, Daniel Gelaw

    2009-01-01

    Digital preservation is a significant challenge for cultural heritage institutions and other repositories of digital information resources. Recognizing the critical role of metadata in any successful digital preservation strategy, the Preservation Metadata Implementation Strategies (PREMIS) has been extremely influential on providing a "core" set…

  3. Streaming the Archives: Repurposing Systems to Advance a Small Media Digitization and Dissemination Program

    ERIC Educational Resources Information Center

    Anderson, Talea

    2015-01-01

    In 2013-2014, Brooks Library at Central Washington University (CWU) launched library content in three systems: a digital asset-management system, an institutional repository (IR), and a web-based discovery layer. In early 2014, the archives at the library began to use these systems to disseminate media recently digitized from legacy formats. As…

  4. Collaboration, Coherence and Capacity-Building: The Role of DSpace in Supporting and Understanding the TLRP

    ERIC Educational Resources Information Center

    Procter, Richard

    2007-01-01

    This paper describes how the Teaching and Learning Research Programme (TLRP) has implemented and applied DSpace as a digital repository for project and programme outputs, including published articles, conference papers, research reports, briefings and press releases. The DSpace repository has become a major element in the user engagement strategy…

  5. Use of Digital Repositories by Chemistry Researchers: Results of a Survey

    ERIC Educational Resources Information Center

    Polydoratou, Panayiota

    2007-01-01

    Purpose: This paper aims to present findings from a survey that aimed to identify the issues around the use and linkage of source and output repositories and the chemistry researchers' expectations about their use. Design/methodology/approach: This survey was performed by means of an online questionnaire and structured interviews with academic and…

  6. re3data.org - a global registry of research data repositories

    NASA Astrophysics Data System (ADS)

    Pampel, Heinz; Vierkant, Paul; Elger, Kirsten; Bertelmann, Roland; Witt, Michael; Schirmbacher, Peter; Rücknagel, Jessika; Kindling, Maxi; Scholze, Frank; Ulrich, Robert

    2016-04-01

    re3data.org - the registry of research data repositories - lists over 1,400 research data repositories from all over the world, making it the largest and most comprehensive online catalog of research data repositories on the web. The registry is a valuable tool for researchers, funding organizations, publishers, and libraries. re3data.org provides detailed information about research data repositories, and its distinctive icons help researchers to easily identify relevant repositories for accessing and depositing data sets [1]. Funding agencies like the European Commission [2] and research institutions like Bielefeld University [3] already recommend the use of re3data.org in their guidelines and policies. Several publishers and journals, such as Copernicus Publications, PeerJ, and Nature's Scientific Data, recommend re3data.org in their editorial policies as a tool for the easy identification of appropriate data repositories to store research data. Project partners in re3data.org are the Library and Information Services department (LIS) of the GFZ German Research Centre for Geosciences, the Computer and Media Service at the Humboldt-Universität zu Berlin, the Purdue University Libraries, and the KIT Library at the Karlsruhe Institute of Technology (KIT). After its merger with the U.S.-based DataBib in 2014, re3data.org continues as a service of DataCite from 2016 onward. DataCite is the international organization for the registration of Digital Object Identifiers (DOIs) for research data and aims to improve their citation. The poster describes the current status and future plans of re3data.org. [1] Pampel H, et al. (2013) Making Research Data Repositories Visible: The re3data.org Registry. PLoS ONE 8(11): e78080. doi:10.1371/journal.pone.0078080. [2] European Commission (2015): Guidelines on Open Access to Scientific Publications and Research Data in Horizon 2020. 
Available: http://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pilot/h2020-hi-oa-pilot-guide_en.pdf Accessed 11 January 2016. [3] Bielefeld University (2013): Resolution on Research Data Management. Available: http://data.uni-bielefeld.de/en/resolution Accessed 11 January 2016.

  7. DSpace and customized controlled vocabularies

    NASA Astrophysics Data System (ADS)

    Skourlas, C.; Tsolakidis, A.; Kakoulidis, P.; Giannakopoulos, G.

    2015-02-01

    DSpace is an open-source repository platform used to provide access to digital resources; it is installed and used by more than 1,000 organizations worldwide. A predefined taxonomy of keywords, called a controlled vocabulary, can be used for describing and accessing the information items stored in the repository. In this paper, we describe how users can create and customize their own vocabularies. Various heterogeneous items, such as research papers, videos, articles, and educational material in the repository, can be indexed to provide advanced search functionality using new controlled vocabularies.
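A hierarchical vocabulary of the kind described above can be written in the node/isComposedBy XML shape used by DSpace's controlled-vocabulary files (simplified here from memory; verify the exact schema against your DSpace version) and flattened into full term paths of the form parent::child for indexing and search.

```python
import xml.etree.ElementTree as ET

# A small hierarchical vocabulary in the node/isComposedBy shape used by
# DSpace controlled-vocabulary files (labels and ids are invented examples).
vocab_xml = """
<node id="edu" label="Technological Education">
  <isComposedBy>
    <node id="edu-se" label="Software Engineering">
      <isComposedBy>
        <node id="edu-se-repo" label="Repositories"/>
      </isComposedBy>
    </node>
    <node id="edu-db" label="Databases"/>
  </isComposedBy>
</node>
"""

def flatten(node, prefix=()):
    """Yield full 'parent::child' term paths, the way hierarchical terms
    are typically displayed and matched in search."""
    path = prefix + (node.get("label"),)
    yield "::".join(path)
    composed = node.find("isComposedBy")
    if composed is not None:
        for child in composed.findall("node"):
            yield from flatten(child, path)

terms = list(flatten(ET.fromstring(vocab_xml)))
print(terms)
```

Each flattened path can then be attached to repository items as a metadata value, so that searching for a broad term also surfaces items indexed under its narrower children.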

  8. GENESI-DR: Discovery, Access and on-Demand Processing in Federated Repositories

    NASA Astrophysics Data System (ADS)

    Cossu, Roberto; Pacini, Fabrizio; Parrini, Andrea; Santi, Eliana Li; Fusco, Luigi

    2010-05-01

    GENESI-DR (Ground European Network for Earth Science Interoperations - Digital Repositories) is a European Commission (EC)-funded project, kicked off in early 2008 and led by ESA; partners include space agencies (DLR, ASI, CNES), both space and non-space data providers such as ENEA (I), Infoterra (UK), K-SAT (N), NILU (N), and JRC (EU), and industry partners such as Elsag Datamat (I), CS (F), and TERRADUE (I). GENESI-DR intends to meet the challenge of shortening the "time to science" for different Earth Science disciplines in the discovery, access, and use (combining, integrating, processing, …) of historical and recent Earth-related data from space, airborne, and in-situ sensors, which are archived in large distributed repositories. A common dedicated infrastructure such as GENESI-DR permits the Earth Science communities to derive objective information and to share knowledge in all environmentally sensitive domains over a continuum of time and a variety of geographical scales, addressing urgent challenges such as global change. GENESI-DR federates data, information, and knowledge for the management of our fragile planet, in line with the major goals of international environmental programmes such as GMES and GEO/GEOSS. As of today, 12 different digital repositories hosting more than 60 heterogeneous dataset series are federated in GENESI-DR. The series include satellite data, in-situ data, images acquired by airborne sensors, digital elevation models, and model outputs. ESA has started providing access to: Category-1 data systematically available on the Internet; level 3 data (e.g., the GlobCover map and the MERIS Global Vegetation Index); and ASAR products available in the ESA Virtual Archive and related to the Supersites initiative. In all cases, existing data policies and security constraints are fully respected. GENESI-DR also gives access to Grid and Cloud computing resources, allowing authorized users to run a number of different processing services on the available data. 
The GENESI-DR operational platform is currently being validated against several applications from different domains, such as: automatic orthorectification of SPOT data; SAR interferometry; GlobModel result visualization and verification by comparison with satellite observations; ozone estimation from ERS-GOME products and comparison with in-situ LIDAR measurements; and access to ocean-related heterogeneous data and on-the-fly generated products. The project is adopting ISO 19115, ISO 19139, and OGC standards for geospatial metadata discovery and processing, is compliant with the basis of the INSPIRE Implementing Rules for Metadata and Discovery, and uses the OpenSearch protocol with Geo extensions for data and services discovery. OpenSearch is now considered by OGC to be a mass-market standard for providing a machine-accessible search interface to data repositories. GENESI-DR is gaining momentum in the Earth Science community thanks to its active participation in the GEO task force "Data Integration and Analysis Systems" and to several collaborations with EC projects. It is now extending international cooperation agreements, specifically with NASA (Goddard Earth Sciences Data and Information Services), with CEODE (the Center for Earth Observation and Digital Earth in Beijing), with the APN (Asia-Pacific Network), and with the University of Tokyo (the Japanese GeoGrid and Data Integration and Analysis System).
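An OpenSearch query using the Geo ("box") and Time ("start"/"end") extension parameters, as adopted by GENESI-DR, might be assembled along these lines. The endpoint is hypothetical, and a real service advertises its exact parameter template in its OpenSearch description document, so treat the parameter names as illustrative.

```python
from urllib.parse import urlencode

def opensearch_query(endpoint, terms, bbox, start, end):
    """Build an illustrative OpenSearch query URL with Geo and Time
    extension parameters (endpoint and names are assumptions, not a
    documented GENESI-DR interface)."""
    params = {
        "q": terms,
        "box": ",".join(str(c) for c in bbox),  # west,south,east,north
        "start": start,
        "end": end,
    }
    return endpoint + "?" + urlencode(params)

url = opensearch_query("https://catalogue.example.org/search",
                       "SAR interferometry",
                       (7.0, 43.5, 9.0, 45.0),
                       "2009-01-01", "2009-12-31")
print(url)
```

Because every federated repository exposes the same machine-readable search interface, a client can fan the same query URL pattern out across all of them, which is what makes federated discovery practical.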

  9. Design and Development of an Institutional Repository at the Indian Institute of Technology Kharagpur

    ERIC Educational Resources Information Center

    Sutradhar, B.

    2006-01-01

    Purpose: To describe how an institutional repository (IR) was set up, using open source software, at the Indian Institute of Technology (IIT) in Kharagpur. Members of the IIT can publish their research documents in the IR for online access as well as digital preservation. Material in this IR includes instructional materials, records, data sets,…

  10. An open repositories network development for medical teaching resources.

    PubMed

    Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius

    2010-01-01

    The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing, and reuse of those data. We present the development of an open repositories network that takes into account both the syntactic and semantic interoperability of the different repositories and is based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on the one hand, on the LOM standard for describing resources and on MeSH for indexing them and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
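An OAI-PMH harvest of the kind described above boils down to fetching a ListRecords response (e.g. `https://host/oai?verb=ListRecords&metadataPrefix=oai_dc`) and extracting Dublin Core fields. The sketch below parses a minimal sample response; the repository identifier and record content are invented.

```python
import xml.etree.ElementTree as ET

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

# A minimal ListRecords response of the kind a harvester receives (sample data).
response = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example.org:lom-101</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Auscultation basics</dc:title>
          <dc:subject>Cardiology</dc:subject>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def harvest_titles(xml_text):
    """Extract (identifier, title) pairs from a ListRecords response."""
    root = ET.fromstring(xml_text)
    out = []
    for rec in root.findall(".//oai:record", NS):
        ident = rec.findtext("oai:header/oai:identifier", namespaces=NS)
        title = rec.findtext(".//dc:title", namespaces=NS)
        out.append((ident, title))
    return out

print(harvest_titles(response))
```

Since every repository in the network answers the same protocol verbs with the same metadata envelope, one harvester can aggregate all of them without repository-specific code, which is the syntactic interoperability the article describes.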

  11. Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems

    ERIC Educational Resources Information Center

    Mason, Robert T.

    2011-01-01

    An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…

  12. Automatic Target Recognition Based on Cross-Plot

    PubMed Central

    Wong, Kelvin Kian Loong; Abbott, Derek

    2011-01-01

    Automatic target recognition that relies on rapid feature extraction of real-time target from photo-realistic imaging will enable efficient identification of target patterns. To achieve this objective, Cross-plots of binary patterns are explored as potential signatures for the observed target by high-speed capture of the crucial spatial features using minimal computational resources. Target recognition was implemented based on the proposed pattern recognition concept and tested rigorously for its precision and recall performance. We conclude that Cross-plotting is able to produce a digital fingerprint of a target that correlates efficiently and effectively to signatures of patterns having its identity in a target repository. PMID:21980508

  13. An ontology based information system for the management of institutional repository's collections

    NASA Astrophysics Data System (ADS)

    Tsolakidis, A.; Kakoulidis, P.; Skourlas, C.

    2015-02-01

    In this paper we discuss a simple methodological approach to creating and customizing institutional repositories for the domain of technological education. The open-source software platform DSpace is proposed to build the repository application and provide access to digital resources including research papers, dissertations, administrative documents, educational material, etc. The use of OWL ontologies is also proposed for indexing and accessing the various heterogeneous items stored in the repository. The customization and operation of a platform for selecting and using terms, or parts, of similar existing OWL ontologies is also described. This platform could be based on the open-source software Protégé, which supports OWL, is widely used, and also offers visualization, SPARQL querying, etc. The combined use of the OWL platform and the DSpace repository forms a basis for creating customized ontologies, accommodating the semantic metadata of items, and facilitating search.

  14. Semantics-Based Intelligent Indexing and Retrieval of Digital Images - A Case Study

    NASA Astrophysics Data System (ADS)

    Osman, Taha; Thakker, Dhavalkumar; Schaefer, Gerald

The proliferation of digital media has led to a huge interest in classifying and indexing media objects for generic search and usage. In particular, we are witnessing colossal growth in digital image repositories that are difficult to navigate using free-text search mechanisms, which often return inaccurate matches as they typically rely on statistical analysis of query-keyword recurrence in the image annotation or surrounding text. In this chapter we present a semantically enabled image annotation and retrieval engine designed to satisfy the requirements of the commercial image collection market in terms of both accuracy and efficiency of the retrieval process. Our search engine relies on methodically structured ontologies for image annotation, thus allowing for more intelligent reasoning about the image content and subsequently obtaining a more accurate set of results and a richer set of alternatives matching the original query. We also show how our well-analysed and carefully designed domain ontology contributes to the implicit expansion of user queries, and present our initial thoughts on exploiting lexical databases for explicit semantics-based query expansion.

  15. A Unique Digital Electrocardiographic Repository for the Development of Quantitative Electrocardiography and Cardiac Safety: The Telemetric and Holter ECG Warehouse (THEW)

    PubMed Central

    Couderc, Jean-Philippe

    2010-01-01

The sharing of scientific data reinforces open scientific inquiry; it encourages diversity of analysis and opinion while promoting new research and facilitating the education of the next generations of scientists. In this article, we present an initiative for the development of a repository containing continuous electrocardiographic information and its associated clinical information. This information is shared with the worldwide scientific community in order to improve quantitative electrocardiology and cardiac safety. First, we present the objectives of the initiative and its mission. Then, we describe the resources available in this initiative along three components: data, expertise, and tools. The data available in the Telemetric and Holter ECG Warehouse (THEW) include continuous ECG signals and associated clinical information. The initiative attracted various academic and private partners whose expertise covers a wide range of research areas related to quantitative electrocardiography; their contribution to the THEW promotes cross-fertilization of scientific knowledge, resources, and ideas that will advance the field of quantitative electrocardiography. Finally, the tools of the THEW include software and servers to access and review the data available in the repository. To conclude, the THEW is an initiative developed to benefit the scientific community and to advance the field of quantitative electrocardiography and cardiac safety. It is a new repository designed to complement existing ones such as Physionet, the AHA-BIH Arrhythmia Database, and the CSE database. The THEW hosts unique datasets from clinical trials and drug safety studies that, so far, were not available to the worldwide scientific community. PMID:20863512

  16. Personal Spaces in Public Repositories as a Facilitator for Open Educational Resource Usage

    ERIC Educational Resources Information Center

    Cohen, Anat; Reisman, Sorel; Sperling, Barbra Bied

    2015-01-01

    Learning object repositories are a shared, open and public space; however, the possibility and ability of personal expression in an open, global, public space is crucial. The aim of this study is to explore personal spaces in a big learning object repository as a facilitator for adoption of Open Educational Resources (OER) into teaching practices…

  17. Development and Implementation of a Learning Object Repository for French Teaching and Learning: Issues and Promises

    ERIC Educational Resources Information Center

    Caws, Catherine

    2008-01-01

    This paper discusses issues surrounding the development of a learning object repository (FLORE) for teaching and learning French at the postsecondary level. An evaluation based on qualitative and quantitative data was set up in order to better assess how second-language (L2) students in French perceived the integration of this new repository into…

  18. 12 CFR 7.5005 - National bank acting as digital certification authority.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... persons associated with a particular public/private key pair. As part of this service, the bank may also maintain a listing or repository of public keys. (b) A national bank may issue digital certificates verifying attributes in addition to identity of persons associated with a particular public/private key pair...

  19. 12 CFR 7.5005 - National bank acting as digital certification authority.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... persons associated with a particular public/private key pair. As part of this service, the bank may also maintain a listing or repository of public keys. (b) A national bank may issue digital certificates verifying attributes in addition to identity of persons associated with a particular public/private key pair...

  20. 12 CFR 7.5005 - National bank acting as digital certification authority.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... persons associated with a particular public/private key pair. As part of this service, the bank may also maintain a listing or repository of public keys. (b) A national bank may issue digital certificates verifying attributes in addition to identity of persons associated with a particular public/private key pair...

  1. 12 CFR 7.5005 - National bank acting as digital certification authority.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... persons associated with a particular public/private key pair. As part of this service, the bank may also maintain a listing or repository of public keys. (b) A national bank may issue digital certificates verifying attributes in addition to identity of persons associated with a particular public/private key pair...

  2. ASK-LDT 2.0: A Web-Based Graphical Tool for Authoring Learning Designs

    ERIC Educational Resources Information Center

    Zervas, Panagiotis; Fragkos, Konstantinos; Sampson, Demetrios G.

    2013-01-01

    During the last decade, Open Educational Resources (OERs) have gained increased attention for their potential to support open access, sharing and reuse of digital educational resources. Therefore, a large amount of digital educational resources have become available worldwide through web-based open access repositories which are referred to as…

  3. Supporting Student Research with Semantic Technologies and Digital Archives

    ERIC Educational Resources Information Center

    Martinez-Garcia, Agustina; Corti, Louise

    2012-01-01

    This article discusses how the idea of higher education students as producers of knowledge rather than consumers can be operationalised by means of student research projects, in which processes of research archiving and analysis are enabled through the use of semantic technologies. It discusses how existing digital repository frameworks can be…

  4. Electronic Scientific Data & Literature Aggregation: A Review for Librarians

    ERIC Educational Resources Information Center

    Losoff, Barbara

    2009-01-01

    The advent of large-scale digital repositories, along with the need for sharing useful data world-wide, demands change to the current information structure. The merging of digital scientific data with scholarly literature has the potential to fulfill the Semantic Web design principles. This paper will identify factors leading to integration of…

  5. Re-Using Today's Metadata for Tomorrow's Research: Five Practical Examples for Enhancing Access to Digital Collections

    ERIC Educational Resources Information Center

    Tzoc, Elias

    2011-01-01

    According to the "Framework of Guidance for Building Good Digital Collections," a good collection is broadly available and avoids unnecessary impediments to use. Two challenges, however, are the constant change in users' expectations and the increasing volume of information in local repositories. Therefore, as academic and research…

  6. Ondigita: A Platform for the Management and Delivery of Digital Documents

    ERIC Educational Resources Information Center

    Mazza, Riccardo; Baldassari, Andrea; Guidi, Roberto

    2013-01-01

    This paper presents Ondigita, a platform developed at the University of Applied Sciences of Southern Switzerland for the management and delivery of digital documents to students enrolled in bachelor's courses in various curricula within the field of engineering. Ondigita allows our organization to have a cloud-based repository of educational…

  7. Standards-based metadata procedures for retrieving data for display or mining utilizing persistent (data-DOI) identifiers.

    PubMed

    Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S

    2015-01-01

We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
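    The entry above describes retrieving a dataset by its persistent identifier with an optional media-type attribute. A loose sketch of that idea, using DOI-resolver content negotiation via an Accept header, might look as follows; the DOI and media type are placeholder values, not ones from the paper, and no network request is made.

```python
from urllib.parse import quote

DOI_RESOLVER = "https://doi.org/"

def build_request(doi, media_type=None):
    """Build the URL and headers for a content-negotiated retrieval of a
    dataset identified by a DOI. media_type, if given, asks the repository
    for a specific representation of the data document."""
    url = DOI_RESOLVER + quote(doi, safe="/.")
    headers = {}
    if media_type:
        headers["Accept"] = media_type
    return url, headers

# Placeholder DOI and media type, purely for illustration.
url, headers = build_request("10.xxxx/example-dataset",
                             media_type="chemical/x-mdl-molfile")
```

    The returned URL and headers could then be passed to any HTTP client; the key point is that the persistent identifier, not a repository-specific path, drives the retrieval.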

  8. 10 CFR 60.134 - Design of seals for shafts and boreholes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60... the geologic repository's ability to meet the performance objectives or the period following permanent...

  9. 10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Performance of the geologic repository operations area... OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Performance Objectives § 60.111 Performance of the geologic repository operations area through permanent closure. (a...

  10. Linking Big and Small Data Across the Social, Engineering, and Earth Sciences

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; de Sherbinin, A. M.; Levy, M. A.; Downs, R. R.

    2014-12-01

The challenges of sustainable development cut across the social, health, ecological, engineering, and Earth sciences, across a wide range of spatial and temporal scales, and across the spectrum from basic to applied research and decision making. The rapidly increasing availability of data and information in digital form from a variety of data repositories, networks, and other sources provides new opportunities to link and integrate both traditional data holdings and emerging "big data" resources in ways that enable interdisciplinary research and facilitate the use of objective scientific data and information in society. Taking advantage of these opportunities not only requires improved technical and scientific data interoperability across disciplines, scales, and data types, but also concerted efforts to bridge gaps and barriers between key communities, institutions, and networks. Given the long time perspectives required in planning sustainable approaches to development, it is also imperative to address user requirements for long-term data continuity and stewardship by trustworthy repositories. We report here on lessons learned by CIESIN working on a range of sustainable development issues to integrate data across multiple repositories and networks. This includes CIESIN's roles in developing policy-relevant climate and environmental indicators, soil data for African agriculture, and exposure and risk measures for hazards, disease, and conflict, as well as CIESIN's participation in a range of national and international initiatives related both to sustainable development and to open data access, interoperability, and stewardship.

  11. Evaluation of web-based annotation of ophthalmic images for multicentric clinical trials.

    PubMed

    Chalam, K V; Jain, P; Shah, V A; Shah, Gaurav Y

    2006-06-01

An Internet browser-based annotation system can be used to identify and describe features in digitized retinal images, in multicentric clinical trials, in real time. In this web-based annotation system, the user employs a mouse to draw and create annotations on a transparent layer that encapsulates the observations and interpretations of a specific image. Multiple annotation layers may be overlaid on a single image. These layers may correspond to annotations by different users on the same image, or to annotations of a temporal sequence of images of a disease process over a period of time. In addition, geometrical properties of annotated figures may be computed and measured. The annotations are stored in a central repository database on a server and can be retrieved by multiple users in real time. This system facilitates objective evaluation of digital images and comparison of double-blind readings of digital photographs, with an identifiable audit trail. Annotation of ophthalmic images allowed clinically feasible and useful interpretation, tracking properties of an area of fundus pathology. This provided an objective method to monitor properties of pathologies over time, an essential component of multicentric clinical trials. The annotation system also allowed users to view stereo image pairs stereoscopically. This web-based annotation system is useful and valuable in monitoring patient care, in multicentric clinical trials, telemedicine, teaching and routine clinical settings.
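    The entry above mentions computing geometrical properties of annotated figures on a transparent layer. As an illustrative sketch only, a layer could be modelled as a list of polygon outlines whose area is measured with the shoelace formula; the data structure and field names are invented here, not the system's actual format.

```python
def polygon_area(points):
    """Shoelace formula for the area of a simple polygon given as (x, y) pairs."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# One hypothetical transparent layer holding a single rectangular annotation,
# 10 px wide and 5 px tall.
layer = {
    "author": "reader-1",  # invented user id
    "annotations": [
        {"label": "lesion", "outline": [(0, 0), (10, 0), (10, 5), (0, 5)]},
    ],
}

areas = [polygon_area(a["outline"]) for a in layer["annotations"]]
```

    Comparing such computed areas across layers from different readers, or across a temporal image sequence, is one way the double-blind and follow-up comparisons described above could be quantified.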

  12. Trends in academic health sciences libraries and their emergence as the “knowledge nexus” for their academic health centers*

    PubMed Central

    Kronenfeld, Michael R.

    2005-01-01

    Objectives: The objective of this study was to identify trends in academic health sciences libraries (AHSLs) as they adapt to the shift from a print knowledgebase to an increasingly digital knowledgebase. This research was funded by the 2003 David A. Kronick Traveling Fellowship. Methods: The author spent a day and a half interviewing professional staff at each library. The questionnaire used was sent to the directors of each library in advance of the visit, and the directors picked the staff to be interviewed and set up the schedule. Results: Seven significant trends were identified. These trends are part of the shift of AHSLs from being facility and print oriented with a primary focus on their role as repositories of a print-based knowledgebase to a new focus on their role as the center or “nexus” for the organization, access, and use of an increasingly digital-based knowledgebase. Conclusion: This paper calls for a national effort to develop a new model or structure for health sciences libraries to more effectively respond to the challenges of access and use of a digital knowledgebase, much the same way the National Library of Medicine did in the 1960s and 1970s in developing and implementing the National Network of Libraries of Medicine. The paper then concludes with some examples or ideas for research to assist in this process. PMID:15685271

  13. A Room with a View: Observations on "Unanticipated" Licensing Agreements and Born Digital Content

    ERIC Educational Resources Information Center

    Lapinski, P. Scott

    2012-01-01

    One of the many challenges that content creators and repository administrators are both struggling with in this "born digital" information environment is the "ownership" of content. After several years of engaging directly with researchers across their campus and providing seminars on the National Institutes of Health (NIH) Public Access Policy,…

  14. A Rising Tide of Digitization--The Ohio Memory Project

    ERIC Educational Resources Information Center

    Kupfer, Shannon

    2010-01-01

    In 2009, after a year of planning and preparation, the second generation of Ohio Memory was launched. A collaborative effort of the Ohio Historical Society (OHS) and the State Library of Ohio, Ohio Memory is a repository for more than 75,000 digital items, including photographs, journals, and other manuscript materials, as well as print documents…

  15. FILILAB: Creation and Use of a Learning Object Repository for EFL

    ERIC Educational Resources Information Center

    Litzler, Mary Frances; Garcia Laborda, Jesus; Halbach, Ana

    2012-01-01

    Background: Students at the Universidad de Alcala need batteries of learning objects and exercises. Although student textbooks tend to include a wide range of additional exercises, students in advanced linguistics and language courses require learning objects to obtain additional practice. Online repositories offer excellent opportunities for…

  16. Object links in the repository

    NASA Technical Reports Server (NTRS)

    Beck, Jon; Eichmann, David

    1991-01-01

    Some of the architectural ramifications of extending the Eichmann/Atkins lattice-based classification scheme to encompass the assets of the full life-cycle of software development are explored. In particular, we wish to consider a model which provides explicit links between objects in addition to the edges connecting classification vertices in the standard lattice. The model we consider uses object-oriented terminology. Thus, the lattice is viewed as a data structure which contains class objects which exhibit inheritance. A description of the types of objects in the repository is presented, followed by a discussion of how they interrelate. We discuss features of the object-oriented model which support these objects and their links, and consider behavior which an implementation of the model should exhibit. Finally, we indicate some thoughts on implementing a prototype of this repository architecture.

  17. Understand your Algorithm: Drill Down to Sample Visualizations in Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Mapes, B. E.; Ho, Y.; Cheedela, S. K.; McWhirter, J.

    2017-12-01

    Statistics are the currency of climate dynamics, but the space of all possible algorithms is fathomless - especially for 4-dimensional weather-resolving data that many "impact" variables depend on. Algorithms are designed on data samples, but how do you know if they measure what you expect when turned loose on Big Data? We will introduce the year-1 prototype of a 3-year scientist-led, NSF-supported, Unidata-quality software stack called DRILSDOWN (https://brianmapes.github.io/EarthCube-DRILSDOWN/) for automatically extracting, integrating, and visualizing multivariate 4D data samples. Based on a customizable "IDV bundle" of data sources, fields and displays supplied by the user, the system will teleport its space-time coordinates to fetch Cases of Interest (edge cases, typical cases, etc.) from large aggregated repositories. These standard displays can serve as backdrops to overlay with your value-added fields (such as derived quantities stored on a user's local disk). Fields can be readily pulled out of the visualization object for further processing in Python. The hope is that algorithms successfully tested in this visualization space will then be lifted out and added to automatic processing toolchains, lending confidence in the next round of processing, to seek the next Cases of Interest, in light of a user's statistical measures of "Interest". To log the scientific work done in this vein, the visualizations are wrapped in iPython-based Jupyter notebooks for rich, human-readable documentation (indeed, quasi-publication with formatted text, LaTex math, etc.). Such notebooks are readable and executable, with digital replicability and provenance built in. The entire digital object of a case study can be stored in a repository, where libraries of these Case Study Notebooks can be examined in a browser. 
Model data (the session topic) are of course especially convenient for this system, but observations of all sorts can also be brought in, overlain, and differenced or otherwise co-processed. The system is available in various tiers, from minimal-install GUI visualizations only, to GUI+Notebook system, to the full system with the repository software. We seek interested users, initially in a "beta tester" mode with the goodwill to offer reports and requests to help drive improvements in project years 2 and 3.

  18. Virtual patient repositories--a comparative analysis.

    PubMed

    Küfner, Julia; Kononowicz, Andrzej A; Hege, Inga

    2014-01-01

Virtual Patients (VPs) are an important component of medical education. One way to reduce the costs of creating VPs is sharing through repositories. We conducted a literature review to identify existing repositories and analyzed the 17 included repositories with regard to the search functions and metadata they provide. Most repositories provided some metadata, such as title or description, whereas other data, such as educational objectives, were less frequent. Future research could, in cooperation with the repository providers, investigate user expectations and usage patterns.

  19. DataONE: Gateway to Earth and Environmental Data Repositories

    NASA Astrophysics Data System (ADS)

    Koskela, R.; Michener, W. K.; Vieglais, D.; Budden, A. E.

    2017-12-01

DataONE (Data Observation Network for Earth) is a National Science Foundation DataNet project that enables universal access to data and helps researchers fulfill their data management needs while providing secure and permanent access to their data. DataONE offers the scientific community a suite of tools and training materials that cover all aspects of the data life cycle, from data collection to management, analysis and publication. Data repositories affiliated with DataONE are referred to as Member Nodes and represent large regional, national and international research networks, agencies, and other institutions. As part of the DataONE Federation, the repositories gain access to a range of value-added services to support their users. These services include usage tracking and reporting, content replication, and the ability to register the services created by the repository. In addition, DataONE and the California Digital Library manage ONEShare, a repository that accepts content submitted through Dash, a platform allowing researchers to easily describe, deposit and share their research data.

  20. ESGF and WDCC: The Double Structure of the Digital Data Storage at DKRZ

    NASA Astrophysics Data System (ADS)

    Toussaint, F.; Höck, H.

    2016-12-01

For several years now, digital repositories in climate science have faced new challenges: international projects are global collaborations, and data storage has in parallel moved to federated, distributed storage systems such as ESGF. For long-term archival (LTA) storage, on the other hand, communities, funders, and data users make stronger demands on data and metadata quality to facilitate data use and reuse. At DKRZ, this situation led to a twofold data dissemination system, which affects the administration, workflows, and sustainability of the data. The ESGF system focuses on the needs of users as partners in global projects; it includes replication tools, detailed global project standards, and efficient search for the data to download. In contrast, DKRZ's classical CERA LTA storage aims at long-term data holding and curation as well as data reuse, requiring high metadata quality standards. In addition, for LTA data a Digital Object Identifier publication service has been implemented for the direct integration of research data into scientific publications. The editorial process at DKRZ-LTA ensures the quality of metadata and research data; the DOI and a citation code are provided and afterwards registered under DataCite's (datacite.org) regulations. In the overall data life cycle, continuous reliability of data and metadata quality is essential to allow for data handling at the petabyte level, long-term data usability, and adequate publication of the results. These considerations lead to the question "What is quality?", with respect to the data, the repository itself, the publisher, and the user. Global consensus is needed for these assessments, as the phases of the end-to-end workflow interlock: for data and metadata, checks need to go hand in hand with the processes of production and storage. The results can be judged following a Quality Maturity Matrix (QMM). 
Repositories can be certified according to their trustworthiness. For the publication of any scientific conclusions, the scientific community, funders, media, and policy makers ask about the publisher's impact in terms of readers' credit, run, and presentation quality. The paper describes the data life cycle, with emphasis on the different levels of quality assessment that ensure data and metadata quality at DKRZ.

  1. An Assistant for Loading Learning Object Metadata: An Ontology Based Approach

    ERIC Educational Resources Information Center

    Casali, Ana; Deco, Claudia; Romano, Agustín; Tomé, Guillermo

    2013-01-01

    In the last years, the development of different Repositories of Learning Objects has been increased. Users can retrieve these resources for reuse and personalization through searches in web repositories. The importance of high quality metadata is key for a successful retrieval. Learning Objects are described with metadata usually in the standard…

  2. XDS in healthcare: Could it lead to a duplication problem? Field study from GVR Sweden

    NASA Astrophysics Data System (ADS)

    Wintell, M.; Lundberg, N.; Lindsköld, L.

    2011-03-01

Managing different registries and repositories within healthcare regions increases the risk of holding almost the same information but with different status and different content. This is because medical information is created in a dynamic process, so the information changes its content over its lifetime within the "active" healthcare phase. The information needs to be easily accessible, serving as the platform that makes medical decisions transparent. In the Region Västra Götaland (VGR), Sweden, data is shared from 29 X-ray departments with different Picture Archiving and Communication System (PACS) and Radiology Information System (RIS) installations through the Infobroker solution, which acts as a broker between the actors involved. Requests/reports from RIS are stored as Digital Imaging and Communications in Medicine (DICOM) Structured Report (SR) objects, together with the images. Every status change within these activities is updated within the Information Infrastructure, following the Integrating the Healthcare Enterprise (IHE) mission, in which Cross-enterprise Document Sharing for Imaging (XDS-I) provides the registry and the central repository as the components used for sharing medical documentation. The VGR strategy was not to deploy one regional XDS-I registry and repository; instead, VGR applied an Enterprise Architecture (EA) intertwined with the Information Infrastructure for dynamic delivery to consumers. The emerging use of different regional XDS registries and repositories could lead to new ways of carrying out shared work, but it can also lead to problems: XDS and XDS-I implemented without a strategy could increase the number of statuses/versions and also duplicate information in the Information Infrastructure.

  3. Rolling Deck to Repository (R2R): Collaborative Development of Linked Data for Oceanographic Research

    NASA Astrophysics Data System (ADS)

    Arko, Robert; Chandler, Cynthia; Stocks, Karen; Smith, Shawn; Clark, Paul; Shepherd, Adam; Moore, Carla; Beaulieu, Stace

    2013-04-01

    The Rolling Deck to Repository (R2R) program is developing infrastructure to ensure the underway sensor data from U.S. academic oceanographic research vessels are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. The entire R2R Catalog is published online as a Linked Data collection, making it easily accessible to encourage discovery and integration with data at other repositories. We are developing the R2R Linked Data collection with specific goals in mind: 1.) We facilitate data access and reuse by publishing the richest possible collection of resources to describe vessels, cruises, instruments, and datasets from the U.S. academic fleet, including data quality assessment results and clean trackline navigation; 2.) We facilitate data citation through the entire lifecycle from field acquisition to shoreside archiving to journal articles and global syntheses, by publishing Digital Object Identifiers (DOIs) for datasets and encoding them directly into our Linked Data resources; and 3.) We facilitate federation with other repositories such as the Biological and Chemical Oceanography Data Management Office (BCO-DMO), InterRidge Vents Database, and Index to Marine and Lacustrine Geological Samples (IMLGS), by reciprocal linking between RDF resources and supporting the RDF Query Language. R2R participates in the Ocean Data Interoperability Platform (ODIP), a joint European-U.S.-Australian partnership to facilitate the sharing of data and documentation across international borders. We publish our controlled vocabularies as a Simple Knowledge Organization System (SKOS) concept collection, and are working toward alignment with SeaDataNet and other community-standard terms using the NERC Vocabulary Server (NVS). http://rvdata.us/

  4. Wikipedia Lover, Not a Hater: Harnessing Wikipedia to Increase the Discoverability of Library Resources

    ERIC Educational Resources Information Center

    Elder, Danielle; Westbrook, R. Niccole; Reilly, Michele

    2012-01-01

    During the spring of 2010, the University of Houston Libraries Digital Services Department began an initiative to promote existing and upcoming collections in the University of Houston Digital Library and drive traffic to the online repository. Spurred by an OCLC report (De Rosa et al. 2005) that only two percent of college and university students…

  5. A digital repository with an extensible data model for biobanking and genomic analysis management.

    PubMed

    Izzo, Massimiliano; Mortola, Francesco; Arnulfo, Gabriele; Fato, Marco M; Varesio, Luigi

    2014-01-01

Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research is evolving into international multi-disciplinary collaborations and increasing data sharing among institutions. Single standardization is not feasible and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process, building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data is described by a set of user-defined metadata, and may have one or more associated files. We integrated the model in a web-based digital repository with data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples from over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. 
Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata for information sharing in specific research projects and purposes. This approach can significantly improve interdisciplinary research collaboration and allows tracking of patients' clinical records, sample management information, and genomic data. The web interface allows operators to easily manage, query, and annotate files without dealing with the technicalities of the data grid.
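The two-level process/event model described in this record can be sketched as a small JSON document. The field names below are illustrative assumptions, not the authors' actual schema:

```python
# Minimal sketch of a process/event data model for biobank metadata.
# Field names and values are illustrative, not the BIT-Gaslini schema.
import json

biopsy_event = {
    "type": "SampleCollection",          # a user-defined data type
    "metadata": {"tissue": "neuroblastoma", "preservation": "frozen"},
    "files": [],
}
microarray_event = {
    "type": "MicroarrayAnalysis",
    "metadata": {"platform": "microarray", "replicates": 2},
    "files": ["analysis_0001.dat"],      # data produced by the event
}

# A process groups sequential events, building up the hierarchical
# structure that tracks patient and sample history.
process = {
    "study": "biobank-study-001",
    "patient": "P-0042",
    "events": [biopsy_event, microarray_event],
}

print(json.dumps(process, indent=2))
```

Because every event carries free-form, user-defined metadata, new data types can be added without changing the repository schema, which is the flexibility the abstract emphasizes.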

  6. A digital repository with an extensible data model for biobanking and genomic analysis management

    PubMed Central

    2014-01-01

Motivation: Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research evolves into international, multi-disciplinary collaborations with increasing data sharing among institutions. A single standard is not feasible, so it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. Results: We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped into a process, building up a hierarchical structure that tracks patient and sample history. Each event can produce new data. Data are described by a set of user-defined metadata and may have one or more associated files. We integrated the model into a web-based digital repository with data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients, and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples from over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. 
Conclusions: Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata for information sharing in specific research projects and purposes. This approach can significantly improve interdisciplinary research collaboration and allows tracking of patients' clinical records, sample management information, and genomic data. The web interface allows operators to easily manage, query, and annotate files without dealing with the technicalities of the data grid. PMID:25077808

  7. A global snapshot of the state of digital collections in the health sciences, 2013*

    PubMed Central

    Pickett, Keith M.; Knapp, Maureen M.

    2014-01-01

    Two hundred twenty-nine health sciences libraries (HSLs) worldwide were surveyed regarding the availability of digital collections, evidence of the type of digital collections, level of access, software used, and HSL type. Of the surveyed libraries, 69% (n = 157) had digital collections, with an average of 1,531 items in each collection; 49% (n = 112) also had institutional repositories. In most cases (n = 147), these collections were publicly available. The predominant platforms for disseminating these digital collections were CONTENTdm and library web pages. Only 50% (n = 77) of these collections were managed by the health sciences library itself. PMID:24860271

  8. A global snapshot of the state of digital collections in the health sciences, 2013.

    PubMed

    Pickett, Keith M; Knapp, Maureen M

    2014-04-01

    Two hundred twenty-nine health sciences libraries (HSLs) worldwide were surveyed regarding the availability of digital collections, evidence of the type of digital collections, level of access, software used, and HSL type. Of the surveyed libraries, 69% (n = 157) had digital collections, with an average of 1,531 items in each collection; 49% (n = 112) also had institutional repositories. In most cases (n = 147), these collections were publicly available. The predominant platforms for disseminating these digital collections were CONTENTdm and library web pages. Only 50% (n = 77) of these collections were managed by the health sciences library itself.

  9. A digital library for medical imaging activities

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcelo; Furuie, Sérgio S.

    2007-03-01

This work presents the development of an electronic infrastructure that makes available a free, online, multipurpose, and multimodality medical image database. The proposed infrastructure implements a distributed architecture for the medical image database, authoring tools, and a repository for multimedia documents. It also includes a peer-review model that assures the quality of the dataset. This public repository provides a single point of access to medical images and related information to facilitate retrieval tasks. The proposed approach has also been used as an electronic teaching system in radiology.

  10. Open Access to Geophysical Data

    NASA Astrophysics Data System (ADS)

    Sergeyeva, Nataliya A.; Zabarinskaya, Ludmila P.

    2017-04-01

Russian World Data Centers (WDCs) for Solar-Terrestrial Physics and Solid Earth Physics, hosted by the Geophysical Center of the Russian Academy of Sciences, are Regular Members of the ICSU World Data System (WDS). Guided by the principles of the WDS Constitution and the WDS Data Sharing Principles, the WDCs provide full and open access to data, long-term data stewardship, compliance with agreed-upon data standards and conventions, and mechanisms to facilitate and improve access to data. Historical and current geophysical data on different media, in the form of digital data sets, analog records, collections of maps, and descriptions, are stored and collected in the Centers. The WDCs regularly add new data to their repositories and databases, keeping them up to date. The WDCs now focus on four new projects, aimed at: increasing the data available online through retrospective data collection and digital preservation; creating a modern system for registering and publishing data with digital object identifier (DOI) assignment, and promoting a culture of data citation; creating databases in place of the file system for more convenient access to data; and participating in the WDS Metadata Catalogue and Data Portal by creating metadata for the WDCs' information resources.

  11. NCI and the Precision Medicine Initiative®

    Cancer.gov

NCI's activities related to precision medicine focus on new and expanded precision medicine clinical trials; mechanisms to overcome drug resistance to cancer treatments; and the development of a shared digital repository of precision medicine trial data.

  12. The Index to Marine and Lacustrine Geological Samples (IMLGS): Linking Digital Data to Physical Samples for the Marine Community

    NASA Astrophysics Data System (ADS)

    Stroker, K. J.; Jencks, J. H.; Eakins, B.

    2016-12-01

The Index to Marine and Lacustrine Geological Samples (IMLGS) is a community-designed and community-maintained resource enabling researchers to locate and request seafloor and lakebed geologic samples curated by partner institutions. The Index was conceived at the dawn of the digital age by representatives from U.S. academic and government marine core repositories and the NOAA National Geophysical Data Center, now the National Centers for Environmental Information (NCEI), at a 1977 meeting convened by the National Science Foundation (NSF). The Index is based on core concepts of community oversight, common vocabularies, consistent metadata, and a shared interface. The Curators Consortium, international in scope, meets biennially to share ideas and discuss best practices. NCEI serves the group by providing database access and maintenance, a list server, digitizing support, and long-term archival of sample metadata, data, and imagery. Over three decades, participating curators have performed the laborious task of creating and contributing metadata for over 205,000 seafloor and lakebed cores, grabs, and dredges archived in their collections. Some partners use the Index for primary web access to their collections, while others use it to increase exposure of more in-depth institutional systems. The IMLGS has a persistent URL/Digital Object Identifier (DOI), as well as DOIs assigned to partner collections for citation and to provide a persistent link to curator collections. The Index is currently a geospatially enabled relational database, publicly accessible via Web Feature and Web Map Services and via text- and ArcGIS map-based web interfaces. To provide as much knowledge as possible about each sample, the Index includes curatorial contact information and links to related data, information, and images: 1) at participating institutions, 2) in the NCEI archive, and 3) through a Linked Data interface maintained by the Rolling Deck to Repository (R2R) program. 
Over 43,000 International GeoSample Numbers (IGSNs) linking to the System for Earth Sample Registration (SESAR) are included in anticipation of opportunities for interconnectivity with Integrated Earth Data Applications (IEDA) systems. The paper discusses the database with the goal of increasing the connections and links to related data at partner institutions.

  13. An Analysis on Usage Preferences of Learning Objects and Learning Object Repositories among Pre-Service Teachers

    ERIC Educational Resources Information Center

    Yeni, Sabiha; Ozdener, Nesrin

    2014-01-01

    The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…

  14. Standards-based curation of a decade-old digital repository dataset of molecular information.

    PubMed

    Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Murray-Rust, Peter; Rzepa, Henry S; Stewart, James J P

    2015-01-01

The curation of 158,122 molecular geometries derived from the NCI set of reference molecules, together with associated properties computed using the MOPAC semi-empirical quantum mechanical method and originally deposited in 2005 into the Cambridge DSpace repository as a data collection, is reported. The procedures involved in the curation included annotation of the original data using new MOPAC methods, updating the syntax of the CML documents used to express the data to ensure schema conformance, and adding new metadata describing the entries, together with an XML schema transformation to map the metadata schema to that used by the DataCite organisation. We have adopted a granularity model in which a DataCite persistent identifier (DOI) is created for each individual molecule to enable data discovery and data metrics at this level using DataCite tools. We recommend that the future research data management (RDM) of the scientific and chemical data components associated with journal articles (the "supporting information") be conducted in a manner that facilitates automatic periodic curation. Graphical abstract: Standards- and metadata-based curation of a decade-old digital repository dataset of molecular information.
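The per-molecule DOI granularity described in this record implies mapping each molecule entry to a minimal DataCite-style metadata record. A rough sketch follows; the DOI prefix, helper name, and all values are invented for illustration (10.5072 is the conventional test prefix), and this is not the authors' actual pipeline:

```python
# Hypothetical sketch: map one curated molecule entry to the mandatory
# DataCite metadata properties so each molecule can receive its own DOI.
# All identifiers and values below are invented placeholders.

def to_datacite(molecule_id: str, name: str, year: int) -> dict:
    return {
        "identifier": {
            "identifierType": "DOI",
            # 10.5072 is the standard test prefix, not a real registration
            "identifier": f"10.5072/example.{molecule_id}",
        },
        "creators": [{"creatorName": "Example curation team"}],
        "titles": [{"title": f"Semi-empirical (MOPAC) geometry for {name}"}],
        "publisher": "Example repository",
        "publicationYear": str(year),
        "resourceType": {"resourceTypeGeneral": "Dataset"},
    }

record = to_datacite("nci-000001", "benzene", 2015)
print(record["identifier"]["identifier"])  # 10.5072/example.nci-000001
```

Minting one DOI per molecule, rather than one per collection, is what makes discovery and citation metrics possible at the level of individual data items.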

  15. Places to Go: Connexions

    ERIC Educational Resources Information Center

    Downes, Stephen

    2005-01-01

    When compared with, say, blogging, the deployment of learning objects has been slow indeed. While blog aggregation services are recording millions of blogs and hundreds of millions of blog posts, academic learning object repositories number their resources only in the thousands, and even major corporate repositories have only one or two million…

  16. Architecture Studio Archive: A Case Study in the Comprehensive Digital Capture and Repository of Student Design Work as an Aid to Teaching, Research, and Accreditation

    ERIC Educational Resources Information Center

    Anderson, Ross; Arndell, Michael; Christensen, Sten

    2009-01-01

    The "Architecture Studio Archive" pilot sought to form a comprehensive digital archive of the diverse student work conducted in the first year of the Bachelor of Design in Architecture Degree at the University of Sydney. The design studio is the primary vehicle for teaching architectural design. It is a locus for creative activity, with…

  17. 49 CFR 195.59 - Abandonment or deactivation of facilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....phmsa.dot.gov or contact the NPMS National Repository at 703-317-3073. A digital data format is preferred, but hard copy submissions are acceptable if they comply with the NPMS Standards. In addition to...

  18. Digital data preservation for scholarly publications in astronomy

    NASA Astrophysics Data System (ADS)

    Choudhury, Sayeed; di Lauro, Tim; Szalay, Alex; Vishniac, Ethan; Hanisch, Robert; Steffen, Julie; Milkey, Robert; Ehling, Teresa; Plante, Ray

    2007-11-01

    Astronomy is similar to other scientific disciplines in that scholarly publication relies on the presentation and interpretation of data. But although astronomy now has archives for its primary research telescopes and associated surveys, the highly processed data that is presented in the peer-reviewed journals and is the basis for final analysis and interpretation is generally not archived and has no permanent repository. We have initiated a project whose goal is to implement an end-to-end prototype system which, through a partnership of a professional society, that society's scholarly publications/publishers, research libraries, and an information technology substrate provided by the Virtual Observatory, will capture high-level digital data as part of the publication process and establish a distributed network of curated, permanent data repositories. The data in this network will be accessible through the research journals, astronomy data centers, and Virtual Observatory data discovery portals.

  19. 10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...

  20. 10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...

  1. 10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...

  2. 10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...

  3. Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality and for enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and are designed to support the advancement of its science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as reliable facilities that curate and preserve data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining reliable and robust operation of the repository infrastructure on the other. Staying abreast of evolving repository standards to certify as a trustworthy repository, and conducting regular self-assessment and certification, alone requires resources that compete with the demands for improving data holdings or the usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value to the science community and have demonstrated considerable scientific impact. Balancing investments in the growth and utility of the syntheses with the resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate. 
IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on the domain-specific data curation and value-added services.

  4. Electronic theses and dissertations: a review of this valuable resource for nurse scholars worldwide.

    PubMed

    Goodfellow, L M

    2009-06-01

A worldwide repository of electronic theses and dissertations (ETDs) could provide worldwide access to the most up-to-date research generated by master's and doctoral students. Until that international repository is established, it is possible to access some of these valuable knowledge resources. ETDs provide a technologically advanced medium with endless multimedia capabilities that far exceed the print-and-bound copies of theses and dissertations traditionally housed in individual university libraries. CURRENT USE: A growing trend exists for universities worldwide to require graduate students to submit theses or dissertations as electronic documents. However, nurse scholars underutilize ETDs, as evidenced by perusing the bibliographic citation lists in many of the research journals. ETDs can be searched for and retrieved through several digital resources such as the Networked Digital Library of Theses and Dissertations (http://www.ndltd.org), ProQuest Dissertations and Theses (http://www.umi.com), the Australasian Digital Theses Program (http://adt.caul.edu.au/), and individual university web sites and online catalogues. An international repository of ETDs benefits the community of nurse scholars in many ways. The ability to access recent graduate students' research electronically from anywhere in the world is advantageous. For scholars residing in developing countries, access to these ETDs may prove even more valuable. In some cases, ETDs are not available for worldwide access and can only be accessed through the library of the university from which the student graduated. Public access to university library ETD collections is not always permitted. Nurse scholars from both developing and developed countries could benefit from ETDs.

  5. Repositories for Deep, Dark, and Offline Data - Building Grey Literature Repositories and Discoverability

    NASA Astrophysics Data System (ADS)

    Keane, C. M.; Tahirkheli, S.

    2017-12-01

Data repositories, especially in the geosciences, have focused on managing large quantities of born-digital data and facilitating its discovery and use. Unfortunately, born-digital data, even at its immense scale today, represents only the most recent data acquisitions, leaving a large proportion of the historical data record of the science out in the cold. Additionally, the data record in the peer-reviewed literature, whether captured directly in the literature or through journal data archives, represents only a fraction of the reliable data collected in the geosciences. Federal and state agencies, state surveys, and private companies collect vast amounts of geoscience information and data that is not only reliable and robust but often the only data representative of specific spatial and temporal conditions. Likewise, even some academic publications, such as senior theses, are unique sources of data but generally have neither wide discoverability nor guarantees of longevity. As more of these 'grey' sources of information and data are born digital, they become increasingly at risk of permanent loss, not to mention poor discoverability. Numerous studies have shown that grey literature across all disciplines, including the geosciences, disappears at a rate of about 8% per year. AGI has been working to develop systems to improve both the discoverability and the preservation of the geoscience grey literature by coupling several open-source platforms from the information science community. We will detail the rationale, the technical and legal frameworks for these systems, and the long-term strategies for improving access, use, and stability of these critical data sources.

  6. Rolling Deck to Repository (R2R): Products and Services for the U.S. Research Fleet Community

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Smith, S. R.; Stocks, K. I.

    2016-02-01

    The Rolling Deck to Repository (R2R) program is working to ensure open access to environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 TB/year of data to R2R from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R ensures these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R has recently expanded to include the vessels Sikuliaq, operated by the University of Alaska; Falkor, operated by the Schmidt Ocean Institute; and Ronald H. Brown and Okeanos Explorer, operated by NOAA. R2R maintains a master catalog of U.S. research cruises, currently holding over 4,670 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. Standard post-field cruise products are published including shiptrack navigation, near-real-time MET/TSG data, underway geophysical profiles, and CTD profiles. Software tools available to users include the R2R Event Logger and the R2R Nav Manager. A Digital Object Identifier (DOI) is published for each cruise, original field sensor dataset, standard post-field product, and document (e.g. cruise report) submitted by the science party. Scientists are linked to personal identifiers such as ORCIDs where available. Using standard identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. R2R collaborates in the Ocean Data Interoperability Platform (ODIP) to strengthen links among regional and national data systems, populates U.S. cruises in the POGO global catalog, and is working toward membership in the DataONE alliance. 
It is a lead partner in the EarthCube GeoLink project, developing Semantic Web technologies to share data and documentation between repositories, and in the newly-launched EarthCube SeaView project, delivering data from R2R and other ocean data facilities to scientists using the Ocean Data View (ODV) software tool.
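The identifier scheme this record describes, a DOI for each cruise, dataset, and product, with ORCIDs for the science party, can be sketched as a simple linked record. All identifiers below are invented placeholders (10.5072 is the DOI test prefix), not real R2R DOIs or ORCIDs:

```python
# Illustrative sketch (not R2R's actual schema) of how persistent
# identifiers tie a cruise, its datasets, and its science party together.
# Every identifier below is a placeholder invented for this example.

cruise = {
    "cruise_id": "EX-0001",                       # hypothetical cruise ID
    "doi": "10.5072/demo.cruise.EX-0001",         # one DOI per cruise
    "datasets": [
        {"instrument": "multibeam", "doi": "10.5072/demo.data.0001"},
        {"instrument": "CTD",       "doi": "10.5072/demo.data.0002"},
    ],
    "science_party": [
        {"name": "A. Researcher", "orcid": "0000-0000-0000-0000"},
    ],
}

# With every object carrying a DOI (and every person an ORCID), citation
# metrics reduce to counting references to these identifiers in journals.
all_dois = [cruise["doi"]] + [d["doi"] for d in cruise["datasets"]]
print(all_dois)
```

Resolving any one of these identifiers gives a stable link back to the cruise record, which is what makes the journal-to-repository linking described above feasible.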

  7. A digital future for the history of psychology?

    PubMed

    Green, Christopher D

    2016-08-01

    This article discusses the role that digital approaches to the history of psychology are likely to play in the near future. A tentative hierarchy of digital methods is proposed. A few examples are briefly described: a digital repository, a simple visualization using ready-made online database and tools, and more complex visualizations requiring the assembly of the database and, possibly, the analytic tools by the researcher. The relationship of digital history to the old "New Economic History" (Cliometrics) is considered. The question of whether digital history and traditional history need be at odds or, instead, might complement each other is woven throughout. The rapidly expanding territory of digital humanistic research outside of psychology is briefly discussed. Finally, the challenging current employment trends in history and the humanities more broadly are considered, along with the role that digital skills might play in mitigating those factors for prospective academic workers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. RiPLE: Recommendation in Peer-Learning Environments Based on Knowledge Gaps and Interests

    ERIC Educational Resources Information Center

    Khosravi, Hassan; Kitto, Kirsty; Cooper, Kendra

    2017-01-01

    Various forms of Peer-Learning Environments are increasingly being used in post-secondary education, often to help build repositories of student generated learning objects. However, large classes can result in an extensive repository, which can make it more challenging for students to search for suitable objects that both reflect their interests…

  9. Through Efficient Use of LORs: Prospective Teachers' Views on Operational Aspects of Learning Object Repositories

    ERIC Educational Resources Information Center

    Yalcinalp, Serpil; Emiroglu, Bulent

    2012-01-01

    Although many developments have been made in the design and development of learning object repositories (LORs), the efficient use of such systems is still questionable. Without realising the functional use of such systems or considering the involvement of their dynamic users, these systems would probably become obsolete. This study includes both…

  10. Collaborative Learning Utilizing a Domain-Based Shared Data Repository to Enhance Learning Outcomes

    ERIC Educational Resources Information Center

    Lubliner, David; Widmeyer, George; Deek, Fadi P.

    2009-01-01

    The objective of this study was to determine whether there was a quantifiable improvement in learning outcomes by integrating course materials in a 4-year baccalaureate program, utilizing a knowledge repository with a conceptual map that spans a discipline. Two new models were developed to provide the framework for this knowledge repository. A…

  11. Looking for Skeletons in the Data Centre `Cupboard': How Repository Certification Can Help

    NASA Astrophysics Data System (ADS)

    Sorvari, S.; Glaves, H.

    2017-12-01

There has been a national geoscience repository at the British Geological Survey (or one of its previous incarnations) almost since its inception in 1835. This longevity has resulted in vast amounts of analogue material and, more recently, digital data, some of which has been collected by our scientists but much more of which has been acquired either through various legislative obligations or donated from various sources. However, the role and operation of the UK National Geoscience Data Centre (NGDC) in the 21st century is very different from that of the past, with new systems and procedures dealing with predominantly digital data. A web-based ingestion portal allows users to submit their data directly to the NGDC, while online services provide discovery and access to data and derived products. Increasingly we are also required to implement an array of standards (e.g. ISO, OGC, W3C), best practices (e.g. FAIR), and legislation (e.g. the EU INSPIRE Directive), whilst at the same time needing to justify our very existence to our funding agency and hosting organisation. External pressures to demonstrate that we can be recognised as a trusted repository by researchers, funding agencies, publishers, and other related entities have forced us to look at how we function and to benchmark our operations against those of other organisations and current relevant standards, such as those laid down by the various repository certification processes. Following an assessment of the options, the WDS/DSA certification process was selected as the most appropriate route for accreditation of the NGDC as a trustworthy repository. It provided a suitable framework for reviewing current systems, procedures, and best practices. Undertaking this process allowed us to identify where the NGDC already has robust systems in place and where there were gaps and deficiencies in current practices. 
The WDS/DSA assessment process also helped to reinforce best practice throughout the NGDC and demonstrated that many of the recognised and required procedures and standards for recognition as a trusted repository were already in place, even if they were not always followed!

  12. Beams of particles and papers: How digital preprint archives shape authorship and credit.

    PubMed

    Delfanti, Alessandro

    2016-08-01

    In high energy physics, scholarly papers circulate primarily through online preprint archives based on a centralized repository, arXiv, that physicists simply refer to as 'the archive'. The archive is not just a tool for preservation and memory but also a space of flows where written objects are detected and their authors made available for scrutiny. In this article, I analyze the reading and publishing practices of two subsets of high energy physicists: theorists and experimentalists. In order to be recognized as legitimate and productive members of their community, they need to abide by the temporalities and authorial practices structured by the archive. Theorists live in a state of accelerated time that shapes their reading and publishing practices around precise cycles. Experimentalists turn to tactics that allow them to circumvent the slowed-down time and invisibility they experience as members of large collaborations. As digital platforms for the exchange of scholarly articles emerge in other fields, high energy physics could help shed light on general transformations of contemporary scholarly communication systems.

  13. Environmental digital data repositories project : final report, June 22, 2009.

    DOT National Transportation Integrated Search

    2010-06-22

    This research body of work addresses two outstanding needs of the FDOT. The first need is to support the FDOT's Strategic Intermodal System (SIS) initiative and their efforts to define and manage existing and proposed multimodal centers, modes (air, ...

  14. The Athabasca University eduSource Project: Building an Accessible Learning Object Repository

    ERIC Educational Resources Information Center

    Cleveland-Innes, Martha; McGreal, Rory; Anderson, Terry; Friesen, Norm; Ally, Mohamed; Tin, Tony; Graham, Rodger; Moisey, Susan; Petrinjak, Anita; Schafer, Steve

    2005-01-01

    Athabasca University--Canada's Open University (AU) made the commitment to put all of its courses online as part of its Strategic University Plan. In pursuit of this goal, AU participated in the eduSource project, a pan-Canadian effort to build the infrastructure for an interoperable network of learning object repositories. AU acted as a leader in…

  15. Flight Data Entry, Descent, and Landing (EDL) Repository

    NASA Technical Reports Server (NTRS)

    Martinez, Elmain M.; Winterhalter, Daniel

    2012-01-01

    Dr. Daniel Winterhalter, NASA Engineering and Safety Center Chief Engineer at the Jet Propulsion Laboratory, requested that the NASA Engineering and Safety Center sponsor a 3-year effort to collect entry, descent, and landing material and to establish a NASA-wide archive to serve the material. The principal focus of this task was to identify entry, descent, and landing repository material that was at risk of being permanently lost due to damage, decay, and undocumented storage. To provide NASA-wide access to this material, a web-based digital archive was created. This document contains the outcome of the effort.

  16. TraitBank: An Open Digital Repository for Organism Traits

    USDA-ARS?s Scientific Manuscript database

    TraitBank currently serves over 11 million measurements and facts for more than 1.7 million taxa. These data are mobilized from major biodiversity information systems (e.g., International Union for Conservation of Nature, Ocean Biogeographic Information System, Paleobiology Database), literature sup...

  17. Digital Rocks Portal: a sustainable platform for imaged dataset sharing, translation and automated analysis

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Hanlon, M.; Nanda, G.; Agarwal, P.

    2015-12-01

    Recent advances in imaging have provided a wealth of 3D datasets that reveal pore space microstructure (nm to cm length scale) and allow investigation of nonlinear flow and mechanical phenomena from first principles using numerical approaches. This framework has popularly been called "digital rock physics". Researchers, however, have trouble storing and sharing the datasets both due to their size and the lack of standardized image types and associated metadata for volumetric datasets. This impedes scientific cross-validation of the numerical approaches that characterize large scale porous media properties, as well as development of multiscale approaches required for correct upscaling. A single research group typically specializes in an imaging modality and/or related modeling on a single length scale, and lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geoscience and engineering researchers not necessarily trained in computer science or data analysis. Once widely accepted, the repository will jumpstart productivity and enable scientific inquiry and engineering decisions founded on a data-driven basis. This is the first repository of its kind. We show initial results on incorporating essential software tools and pipelines that make it easier for researchers to store and reuse data, and for educators to quickly visualize and illustrate concepts to a wide audience. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. 
Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.

  18. Improving the Discoverability and Availability of Sample Data and Imagery in NASA's Astromaterials Curation Digital Repository Using a New Common Architecture for Sample Databases

    NASA Technical Reports Server (NTRS)

    Todd, N. S.; Evans, C.

    2015-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (JSC) is the designated facility for curating all of NASA's extraterrestrial samples. The suite of collections includes the lunar samples from the Apollo missions, cosmic dust particles falling into the Earth's atmosphere, meteorites collected in Antarctica, comet and interstellar dust particles from the Stardust mission, asteroid particles from the Japanese Hayabusa mission, and solar wind atoms collected during the Genesis mission. To support planetary science research on these samples, NASA's Astromaterials Curation Office hosts the Astromaterials Curation Digital Repository, which provides descriptions of the missions and collections, and critical information about each individual sample. Our office is implementing several informatics initiatives with the goal of better serving the planetary research community. One of these initiatives aims to increase the availability and discoverability of sample data and images through the use of a newly designed common architecture for Astromaterials Curation databases.

  19. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant is a lack of knowledge about best practices for data management, metadata standards, and appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
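Step (1) of the workflow above, checking whether a file is "CSV compatible", can be sketched in a few lines. The exact rules DataUp applies are not spelled out in the abstract, so the checks below (a non-empty header and consistent column counts) are illustrative assumptions, not the tool's actual validation logic:

```python
import csv
import io

def check_csv_compatible(text):
    """Return a list of problems that would block a clean CSV export.

    Illustrative checks only: non-empty file, no blank header names,
    and every data row matching the header's column count.
    """
    problems = []
    rows = list(csv.reader(io.StringIO(text)))
    if not rows:
        return ["file is empty"]
    header = rows[0]
    if any(not name.strip() for name in header):
        problems.append("header contains empty column names")
    width = len(header)
    for i, row in enumerate(rows[1:], start=2):
        if len(row) != width:
            problems.append(f"row {i} has {len(row)} fields, expected {width}")
    return problems
```

A file that passes returns an empty problem list; a ragged row is reported with its line number so the researcher can fix it before metadata generation and deposit.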

  20. A hybrid organic-inorganic perovskite dataset

    NASA Astrophysics Data System (ADS)

    Kim, Chiho; Huan, Tran Doan; Krishnan, Sridevi; Ramprasad, Rampi

    2017-05-01

    Hybrid organic-inorganic perovskites (HOIPs) have been attracting a great deal of attention due to their versatility of electronic properties and fabrication methods. We prepare a dataset of 1,346 HOIPs, which features 16 organic cations, 3 group-IV cations and 4 halide anions. Using a combination of an atomic structure search method and density functional theory calculations, the optimized structures, the bandgap, the dielectric constant, and the relative energies of the HOIPs are uniformly prepared and validated by comparing with relevant experimental and/or theoretical data. We make the dataset available at Dryad Digital Repository, NoMaD Repository, and Khazana Repository (http://khazana.uconn.edu/), hoping that it could be useful for future data-mining efforts that can explore possible structure-property relationships and phenomenological models. Progressive extension of the dataset is expected as new organic cations become appropriate within the HOIP framework, and as additional properties are calculated for the new compounds found.

  1. eNOSHA, a Free, Open and Flexible Learning Object Repository--An Iterative Development Process for Global User-Friendliness

    ERIC Educational Resources Information Center

    Mozelius, Peter; Hettiarachchi, Enosha

    2012-01-01

    This paper describes the iterative development process of a Learning Object Repository (LOR), named eNOSHA. Discussions on a project for a LOR started at the e-Learning Centre (eLC) at The University of Colombo, School of Computing (UCSC) in 2007. The eLC has during the last decade been developing learning content for a nationwide e-learning…

  2. Utilizing online resources for taxonomy: a cybercatalog of Afrotropical apiocerid flies (Insecta: Diptera: Apioceridae).

    PubMed

    Dikow, Torsten; Agosti, Donat

    2015-01-01

    A cybercatalog of the Apioceridae (apiocerid flies) of the Afrotropical Region is provided. Each taxon entry includes links to open-access, online repositories such as ZooBank, BHL/BioStor/BLR, Plazi, GBIF, Morphbank, EoL, and a research web-site to access taxonomic information, digitized literature, morphological descriptions, specimen occurrence data, and images. Cybercatalogs such as the one presented here will need to become the future of taxonomic catalogs, taking advantage of the growing number of online repositories and linked data while remaining easily updatable. Comments on the deposition of the holotype of Apiocera braunsi Melander, 1907 are made.

  3. Shared Geospatial Metadata Repository for Ontario University Libraries: Collaborative Approaches

    ERIC Educational Resources Information Center

    Forward, Erin; Leahey, Amber; Trimble, Leanne

    2015-01-01

    Successfully providing access to special collections of digital geospatial data in academic libraries relies upon complete and accurate metadata. Creating and maintaining metadata using specialized standards is a formidable challenge for libraries. The Ontario Council of University Libraries' Scholars GeoPortal project, which created a shared…

  4. Retrieving Online Language Learning Resources: Classification and Quality

    ERIC Educational Resources Information Center

    Krajcso, Zita; Frimmel, Ulrike

    2017-01-01

    Foreign language teachers and learners use digital repositories frequently to find appropriate activities for their teaching and learning activities. The question is: How can content providers support them in finding exactly what they need and in retrieving high quality resources? This question has been discussed in the literature and in the…

  5. Extending the ARIADNE Web-Based Learning Environment.

    ERIC Educational Resources Information Center

    Van Durm, Rafael; Duval, Erik; Verhoeven, Bart; Cardinaels, Kris; Olivie, Henk

    One of the central notions of the ARIADNE learning platform is a share-and-reuse approach toward the development of digital course material. The ARIADNE infrastructure includes a distributed database called the Knowledge Pool System (KPS), which acts as a repository of pedagogical material, described with standardized IEEE LTSC Learning Object…

  6. National Pipeline Mapping System (NPMS) : standards for creating pipeline location data : standards for electronic data submissions, including metadata standards and examples

    DOT National Transportation Integrated Search

    1997-07-14

    These standards represent a guideline for preparing digital data for inclusion in the National Pipeline Mapping System Repository. The standards were created with input from the pipeline industry and government agencies. They address the submission o...

  7. Trends in academic health sciences libraries and their emergence as the "knowledge nexus" for their academic health centers.

    PubMed

    Kronenfeld, Michael R

    2005-01-01

    The objective of this study was to identify trends in academic health sciences libraries (AHSLs) as they adapt to the shift from a print knowledgebase to an increasingly digital knowledgebase. This research was funded by the 2003 David A. Kronick Traveling Fellowship. The author spent a day and a half interviewing professional staff at each library. The questionnaire used was sent to the directors of each library in advance of the visit, and the directors picked the staff to be interviewed and set up the schedule. Seven significant trends were identified. These trends are part of the shift of AHSLs from being facility and print oriented with a primary focus on their role as repositories of a print-based knowledgebase to a new focus on their role as the center or "nexus" for the organization, access, and use of an increasingly digital-based knowledgebase. This paper calls for a national effort to develop a new model or structure for health sciences libraries to more effectively respond to the challenges of access and use of a digital knowledgebase, much the same way the National Library of Medicine did in the 1960s and 1970s in developing and implementing the National Network of Libraries of Medicine. The paper then concludes with some examples or ideas for research to assist in this process.

  8. Applications for unique identifiers in the geological sciences

    NASA Astrophysics Data System (ADS)

    Klump, J.; Lehnert, K. A.

    2012-12-01

    Even though geology has always been a generalist discipline in many parts, approaches towards questions about Earth's past have become increasingly interdisciplinary. At the same time, a wealth of samples has been collected, the resulting data have been stored in disciplinary databases, and the interpretations published in the scientific literature. In the past these resources have existed alongside each other, semantically linked only by the knowledge of the researcher and his peers. One of the main drivers towards the inception of the world wide web was the ability to link scientific sources over the internet. The Uniform Resource Locator (URL) used to locate resources on the web soon turned out to be ephemeral in nature. A more reliable way of addressing objects was needed, a way of persistent identification to make digital objects, or digital representations of objects, part of the record of science. With their high degree of centralisation, the scientific publishing houses were quick to implement and adopt a system for unique and persistent identification, the Digital Object Identifier (DOI). At the same time, other identifier systems exist alongside the DOI, e.g. URN, ARK, Handle, and others. There are many uses for persistent identification in science other than the identification of journal articles. DOIs are already used for the identification of data, thus making data citable. There are several initiatives to assign identifiers to authors and institutions to allow unique identification. A recent development is the application of persistent identifiers to geological samples. As most data in the geosciences are derived from samples, it is crucial to be able to uniquely identify the samples from which a set of data were derived. Incomplete documentation of samples in publications and the use of ambiguous sample names are major obstacles for synthesis studies and re-use of data. 
Access to samples for re-analysis and re-appraisal is limited due to the lack of a central catalogue for finding a sample's archiving location. The International Geo Sample Number (IGSN) provides solutions to the questions of unique sample identification and discovery. Use of the IGSN in digital data systems allows building linkages between the digital representation of samples in sample registries, e.g. SESAR, and their related data in the literature and in web-accessible digital data repositories. Persistent identifiers are now available for literature, data, samples, and authors. More applications, e.g. identification of methods or instruments, will follow. In conjunction with semantic web technology, the application of unique and persistent identifiers in the geosciences will aid discovery through systematic data mining, exploratory data analysis, and serendipity. This talk will discuss existing and emerging applications for persistent identifiers in the geological sciences.
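The resolution mechanics behind these identifiers are simple to sketch: DOIs resolve through the doi.org proxy, and IGSNs are registered as handles (under prefix 10273 at hdl.handle.net). The identifier values below are placeholders, not real registrations:

```python
# Sketch: constructing resolver URLs for the persistent identifiers
# discussed above. doi.org and the IGSN handle prefix 10273 are real
# resolver conventions; the example identifiers are made up.

def doi_url(doi):
    """DOIs resolve via the central doi.org proxy."""
    return f"https://doi.org/{doi}"

def igsn_url(igsn):
    """IGSNs are registered in the Handle System under prefix 10273."""
    return f"https://hdl.handle.net/10273/{igsn}"
```

Because both schemes route through a central resolver rather than a hard-coded host, the landing page can move without breaking the link, which is exactly the ephemerality problem of plain URLs described above.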

  9. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology

    PubMed Central

    Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.

    2017-01-01

    Abstract The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625

  10. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology.

    PubMed

    Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M

    2017-04-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.

  11. Rock and Core Repository Coming Digital

    NASA Astrophysics Data System (ADS)

    Maicher, Doris; Fleischer, Dirk; Czerniak, Andreas

    2016-04-01

    At a time when whole city centres are available at a mouse click, to be walked through virtually in 3D, reality sometimes becomes neglected. That scientific sample collections have not been digitised down to the essence of molecules, isotopes and electrons seems unbelievable to the rising generation of scientists. Like any other geological institute, the Helmholtz Centre for Ocean Research GEOMAR has accumulated thousands of specimens. The samples, collected mainly during marine expeditions, date back as far as 1964. Today GEOMAR houses a central geological sample collection of at least 17 000 m of sediment core and more than 4 500 boxes of hard rock samples and refined sample specimens. This repository, having been dormant, missed the onset of the interconnected digital age. Physical samples without barcodes, QR codes or RFID tags need to be migrated and reconnected, urgently. In our use case, GEOMAR opted for the International Geo Sample Number (IGSN) as the persistent identifier. Consequently, the software CurationDIS by smartcube GmbH was selected as the central component of this project. The software is designed to handle acquisition and administration of sample material and sample archiving in storage places. In addition, the software allows direct embedding of IGSNs. We plan to adopt the IGSN as a future asset, while for the initial inventory-taking of our sample material, simple but unique QR codes act as "bridging identifiers" during the process. Currently we are compiling an overview of the broad variety of sample types and their associated data. QR-coding of the boxes of rock samples and sediment cores is near completion, delineating their location in the repository and linking a particular sample to any information available about the object. Planning is in progress to streamline the flow from receiving new samples, to their curation, to sharing samples and information publicly. 
Additionally, interface planning for linkage to the GEOMAR databases OceanRep (publications) and OSIS (expeditions), as well as for external data retrieval, is in the pipeline. Looking ahead, implementing the IGSN, and taking on board lessons learned from earlier generations, will enable us to comply with our institute's open science policy. It will also allow newly collected samples to be registered during ship expeditions, so that they receive their "birth certificate" at the time of collection in this ever faster revolving scientific world.
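The "bridging identifier" idea reduces to a simple lookup table: each box gets a locally unique QR code now, and the IGSN column is filled in once registration happens later. A minimal sketch follows; the table, column names, and all identifier values are assumptions for illustration, not GEOMAR's actual CurationDIS schema:

```python
import sqlite3

# In-memory table mapping local QR "bridging identifiers" to shelf
# locations, with a nullable IGSN column to be filled in on
# registration. All values are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sample (
    qr_code  TEXT PRIMARY KEY,  -- printed on the box today
    igsn     TEXT,              -- assigned later; NULL until registered
    location TEXT               -- shelf/rack in the repository
)""")
con.execute("INSERT INTO sample VALUES ('GEO-000123', NULL, 'hall B, rack 7')")

# Later, once the sample is registered, the bridging code is linked
# to its persistent IGSN without reprinting any labels:
con.execute("UPDATE sample SET igsn = ? WHERE qr_code = ?",
            ("IEXXX00AB", "GEO-000123"))
```

Because the QR code is the primary key, scanning a box always resolves to the same row, whether or not the IGSN has been assigned yet.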

  12. A data library management system for midwest FreightView and its data repository.

    DOT National Transportation Integrated Search

    2011-03-01

    Midwest FreightView (MWFV) and its associated data repository is part of a large multifaceted effort to promote regional economic development throughout the Great Lakes system. The main objective for the system is to promote sustainable maritime ...

  13. Rolling Deck to Repository (R2R): Standards and Semantics for Open Access to Research Data

    NASA Astrophysics Data System (ADS)

    Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen

    2015-04-01

    In recent years, a growing number of funding agencies and professional societies have issued policies calling for open access to research data. The Rolling Deck to Repository (R2R) program is working to ensure open access to the environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 terabytes of data to R2R each year, acquired from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R is working to ensure these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R maintains a master catalog of cruises for the U.S. academic research fleet, currently holding essential documentation for over 3,800 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. A Digital Object Identifier (DOI) is published for 1) each cruise, 2) each original field sensor dataset, 3) each post-field data product such as quality-controlled shiptrack navigation produced by the R2R program, and 4) each document such as a cruise report submitted by the science party. Scientists are linked to personal identifiers, such as the Open Researcher and Contributor ID (ORCID), where known. Using standard global identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. Since its inception, the R2R program has worked in close collaboration with other data repositories in the development of shared semantics for oceanographic research. 
The R2R cruise catalog uses community-standard terms and definitions hosted by the NERC Vocabulary Server, and publishes ISO metadata records for each cruise that use community-standard profiles developed with the NOAA Data Centers and the EU SeaDataNet project. R2R is a partner in the Ocean Data Interoperability Platform (ODIP), working to strengthen links among regional and national data systems, as well as a lead partner in the EarthCube "GeoLink" project, developing a standard set of ontology design patterns for publishing research data using Semantic Web protocols.
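The per-entity DOI pattern described above (one identifier each for the cruise, its field datasets, its derived products, and its documents, with ORCIDs for the science party) can be sketched as a single catalog record. All identifier values below are placeholders, not real R2R entries:

```python
import json

# Hypothetical cruise record illustrating the R2R linkage pattern:
# separate DOIs for the cruise, each dataset, each product, and each
# document, plus ORCIDs for people.
cruise = {
    "cruise_id": "EX0000",
    "doi": "10.0000/cruise-placeholder",
    "science_party": [
        {"name": "A. Researcher", "orcid": "0000-0000-0000-0000"},
    ],
    "datasets": [
        {"type": "field_sensor", "doi": "10.0000/nav-raw-placeholder"},
        {"type": "product", "doi": "10.0000/nav-qc-placeholder"},
    ],
    "documents": [
        {"type": "cruise_report", "doi": "10.0000/report-placeholder"},
    ],
}

# Every citable object carries its own DOI, so citation metrics can be
# computed per dataset rather than only per cruise.
all_dois = [cruise["doi"]] + [
    d["doi"] for d in cruise["datasets"] + cruise["documents"]
]
record_json = json.dumps(cruise, indent=2)
```

Granting each object its own DOI, rather than one DOI for the whole cruise, is what makes the links to journal publications and the citation metrics mentioned above possible at the dataset level.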

  14. Data Publication: A Partnership between Scientists, Data Managers and Librarians

    NASA Astrophysics Data System (ADS)

    Raymond, L.; Chandler, C.; Lowry, R.; Urban, E.; Moncoiffe, G.; Pissierssens, P.; Norton, C.; Miller, H.

    2012-04-01

    Current literature on the topic of data publication suggests that success is best achieved when there is a partnership between scientists, data managers, and librarians. The Marine Biological Laboratory/Woods Hole Oceanographic Institution (MBLWHOI) Library and the Biological and Chemical Oceanography Data Management Office (BCO-DMO) have developed tools and processes to automate the ingestion of metadata from BCO-DMO for deposit with datasets into the Institutional Repository (IR) Woods Hole Open Access Server (WHOAS). The system also incorporates functionality for BCO-DMO to request a Digital Object Identifier (DOI) from the Library. This partnership allows the Library to work with a trusted data repository to ensure high quality data while the data repository utilizes library services and is assured of a permanent archive of the copy of the data extracted from the repository database. The assignment of persistent identifiers enables accurate data citation. The Library can assign a DOI to appropriate datasets deposited in WHOAS. A primary activity is working with authors to deposit datasets associated with published articles. The DOI would ideally be assigned before submission and be included in the published paper so readers can link directly to the dataset, but DOIs are also being assigned to datasets related to articles after publication. WHOAS metadata records link the article to the datasets and the datasets to the article. The assignment of DOIs has enabled another important collaboration with Elsevier, publisher of educational and professional science journals. Elsevier can now link from articles in the Science Direct database to the datasets available from WHOAS that are related to that article. The data associated with the article are freely available from WHOAS and accompanied by a Dublin Core metadata record. 
In addition, the Library has worked with researchers to deposit datasets in WHOAS that are not appropriate for national, international, or domain-specific data repositories. These datasets currently include audio, text and image files. This research is being conducted by a team of librarians, data managers and scientists who are collaborating with representatives from the Scientific Committee on Oceanic Research (SCOR) and the International Oceanographic Data and Information Exchange (IODE) of the Intergovernmental Oceanographic Commission (IOC). The goal is to identify best practices for tracking data provenance and clearly attributing credit to data collectors/providers.
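A Dublin Core record of the kind described above, linking a deposited dataset to its related article via DOIs, can be sketched with the standard library. The element names are standard Dublin Core; the titles and identifier values are placeholders, and the actual WHOAS records may use additional fields:

```python
import xml.etree.ElementTree as ET

# Minimal Dublin Core record: dc:identifier carries the dataset DOI,
# dc:relation points at the related published article.
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

record = ET.Element("record")
for name, value in [
    ("title", "Example dataset deposited with a published article"),
    ("creator", "Example, A."),
    ("identifier", "doi:10.0000/dataset-placeholder"),
    ("relation", "doi:10.0000/article-placeholder"),  # the article
    ("type", "Dataset"),
]:
    ET.SubElement(record, f"{{{DC}}}{name}").text = value

xml_text = ET.tostring(record, encoding="unicode")
```

The symmetric pairing (the article's record pointing at the dataset, and vice versa) is what lets a publisher platform such as Science Direct resolve from an article to its data and back.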

  15. Advancing the science of forensic data management

    NASA Astrophysics Data System (ADS)

    Naughton, Timothy S.

    2002-07-01

    Many individual elements comprise a typical forensics process. Collecting evidence, analyzing it, and using results to draw conclusions are all mutually distinct endeavors. Different physical locations and personnel are involved, juxtaposed against an acute need for security and data integrity. Using digital technologies and the Internet's ubiquity, these diverse elements can be conjoined using digital data as the common element. The result is a new data management process that can be applied to serve all elements of the community. The first step is recognition of a forensics lifecycle. Evidence gathering, analysis, storage, and use in legal proceedings are actually just distinct parts of a single end-to-end process; thus, it is hypothesized that a single data system can accommodate each constituent phase using common network and security protocols. This paper introduces the idea of a web-based Central Data Repository. Its cornerstone is anywhere, anytime Internet upload, viewing, and report distribution. Archives exist indefinitely after being created, and high-strength security and encryption protect data and ensure subsequent case file additions do not violate chain-of-custody or other handling provisions. Several legal precedents have been established for using digital information in courts of law, and in fact, effective prosecution of cyber crimes absolutely relies on its use. An example is a US Department of Agriculture division's use of digital images to back up its inspection process, with pictures and information retained on secure servers to enforce the Perishable Agricultural Commodities Act. Forensics is a cumulative process. Secure, web-based data management solutions, such as the Central Data Repository postulated here, can support each process step. 
Logically, marrying digital technologies with Internet accessibility should help nurture a thought process that explores alternatives for making forensics data accessible to authorized individuals, whenever and wherever they need it.
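One standard way to make "subsequent case file additions" tamper evident, as the repository described above requires, is a hash-chained log: each entry's hash covers the previous entry's hash, so editing or reordering any earlier record breaks the chain. This is an illustrative technique under stated assumptions, not the actual Central Data Repository design:

```python
import hashlib
import json

def append_entry(log, payload):
    """Append a case-file event whose hash chains to the previous entry."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"prev": prev, "payload": payload}, sort_keys=True)
    log.append({"prev": prev, "payload": payload,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(log):
    """Recompute every hash; any edit to an earlier entry fails the walk."""
    prev = "0" * 64
    for entry in log:
        body = json.dumps({"prev": prev, "payload": entry["payload"]},
                          sort_keys=True)
        if entry["prev"] != prev or \
           hashlib.sha256(body.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

# Hypothetical case-file events for illustration:
log = []
append_entry(log, {"case": "X-1", "event": "evidence uploaded"})
append_entry(log, {"case": "X-1", "event": "analysis report added"})
```

Because later additions only ever extend the chain, reviewers can verify that earlier evidence records were not altered without trusting the storage layer itself.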

  16. A Global Registry for Scientific Collections: Striking a Balance Between Disciplinary Detail and Interdisciplinary Discoverability

    NASA Astrophysics Data System (ADS)

    Graham, E.; Schindel, D. E.

    2014-12-01

    The Global Registry of Scientific Collections (GRSciColl) is an online information resource developed to gather and disseminate basic information on scientific collections. Building on initiatives started for biological collections, GRSciColl expands this framework to encompass all scientific disciplines, including earth and space sciences, anthropology, archaeology, biomedicine, and applied fields such as agriculture and technology. The goals of this registry are to (1) provide a single source of synoptic information about the repositories, their component collections, access and use policies, and staff contact information; and (2) facilitate the assignment of identifiers for repositories and their collections that are globally unique across all disciplines. As digitization efforts continue, the importance of globally unique identifiers is paramount to ensuring interoperability across datasets. Search capabilities and web services will significantly increase the web visibility and accessibility of these collections. Institutional records include categorization by governance (e.g., national, state or local governmental, private non-profit) and by scientific discipline (e.g., earth science, biomedical, agricultural). Collection-level metadata categorize the types of contained specimens/samples and modes of preservation. In selecting the level of granularity for these categories, designers sought a compromise that would capture enough information to be useful in searches and inquiries and would complement the detailed, increasingly digital archives in specimen-level databases hosted by discipline-specific groups (e.g. SESAR) or by the repositories themselves (e.g. KE EMu).

  17. The visualization and availability of experimental research data at Elsevier

    NASA Astrophysics Data System (ADS)

    Keall, Bethan

    2014-05-01

    In the digital age, the visualization and availability of experimental research data is an increasingly prominent aspect of the research process and of the scientific output that researchers generate. We expect that the importance of data will continue to grow, driven by technological advancements, requirements from funding bodies to make research data available, and a developing research data infrastructure that is supported by data repositories, science publishers, and other stakeholders. Elsevier is actively contributing to these efforts, for example by setting up bidirectional links between online articles on ScienceDirect and relevant data sets on trusted data repositories. A key aspect of Elsevier's "Article of the Future" program, these links enrich the online article and make it easier for researchers to find relevant data and articles and help place data in the right context for re-use. Recently, we have set up such links with some of the leading data repositories in Earth Sciences, including the British Geological Survey, Integrated Earth Data Applications, the UK Natural Environment Research Council, and the Oak Ridge National Laboratory DAAC. Building on these links, Elsevier has also developed a number of data integration and visualization tools, such as an interactive map viewer that displays the locations of relevant data from PANGAEA next to articles on ScienceDirect. In this presentation we will give an overview of these and other capabilities of the Article of the Future, focusing on how they help advance communication of research in the digital age.

  18. DataUp: Helping manage and archive data within the researcher's workflow

    NASA Astrophysics Data System (ADS)

    Strasser, C.

    2012-12-01

There are many barriers to data management and sharing among earth and environmental scientists; among the most significant is a lack of knowledge about best practices for data management, metadata standards, and appropriate data repositories for archiving and sharing data. We have developed an open-source add-in for Excel and an open-source web application intended to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. The researcher does not need a prior relationship with a data repository to use DataUp; the newly implemented ONEShare repository, a DataONE member node, is available for any researcher to archive and share their data. By meeting researchers where they already work, in spreadsheets, DataUp becomes part of the researcher's workflow, and data management and sharing become easier. Future enhancement of DataUp will rely on members of the community adopting and adapting the DataUp tools to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between Microsoft Research Connections, the University of California's California Digital Library, the Gordon and Betty Moore Foundation, and DataONE.
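
The first two DataUp steps can be sketched as follows. This is an illustrative stand-in, not DataUp's actual code: the compatibility test here only checks that text parses as CSV with a consistent column count, and the metadata record is a generic JSON placeholder rather than DataUp's standard schema.

```python
import csv
import io
import json

def is_csv_compatible(text: str) -> bool:
    """Rough compatibility check: the text parses as CSV, has a header
    plus at least one data row, and every row has the same column count.
    (The real DataUp tool inspects Excel workbooks for non-tabular
    features; this is only an approximation of the idea.)"""
    rows = list(csv.reader(io.StringIO(text)))
    return len(rows) > 1 and len({len(r) for r in rows}) == 1

def minimal_metadata(title: str, creator: str, identifier: str) -> str:
    """Emit a minimal descriptive metadata record as JSON (illustrative;
    DataUp generates metadata in a standard format)."""
    return json.dumps({"title": title, "creator": creator,
                       "identifier": identifier})

good = "site,year,value\nA,2010,1.2\nB,2011,3.4\n"
bad = "site,year,value\nA,2010\n"   # ragged row: missing a column
assert is_csv_compatible(good)
assert not is_csv_compatible(bad)
```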

  19. LingoBee--Crowd-Sourced Mobile Language Learning in the Cloud

    ERIC Educational Resources Information Center

    Petersen, Sobah Abbas; Procter-Legg, Emma; Cacchione, Annamaria

    2013-01-01

    This paper describes three case studies, where language learners were invited to use "LingoBee" as a means of supporting their language learning. LingoBee is a mobile app that provides user-generated language content in a cloud-based shared repository. Assuming that today's students are mobile savvy and "Digital Natives" able…

  20. LingoBee: Engaging Mobile Language Learners through Crowd-Sourcing

    ERIC Educational Resources Information Center

    Petersen, Sobah Abbas; Procter-Legg, Emma; Cacchione, Annamaria

    2014-01-01

    This paper describes three case studies, where language learners were invited to use "LingoBee" as a means of supporting their language learning. LingoBee is a mobile app that provides user-generated language content in a cloud-based shared repository. Assuming that today's students are mobile savvy and "Digital Natives" able…

  1. Reuse, Repurposing and Learning Design--Lessons from the DART Project

    ERIC Educational Resources Information Center

    Bond, Stephen T.; Ingram, Caroline; Ryan, Steve

    2008-01-01

    Digital Anthropological Resources for Teaching (DART) is a major project examining ways in which the use of online learning activities and repositories can enhance the teaching of anthropology and, by extension, other disciplines. This paper reports on one strand of DART activity, the development of customisable learning activities that can be…

  2. Geospatial Data Curation at the University of Idaho

    ERIC Educational Resources Information Center

    Kenyon, Jeremy; Godfrey, Bruce; Eckwright, Gail Z.

    2012-01-01

    The management and curation of digital geospatial data has become a central concern for many academic libraries. Geospatial data is a complex type of data critical to many different disciplines, and its use has become more expansive in the past decade. The University of Idaho Library maintains a geospatial data repository called the Interactive…

  3. Integrating a Learning Management System with a Student Assignments Digital Repository. A Case Study

    ERIC Educational Resources Information Center

    Díaz, Javier; Schiavoni, Alejandra; Osorio, María Alejandra; Amadeo, Ana Paola; Charnelli, María Emilia

    2013-01-01

    The integration of different platforms and information Systems in the academic environment is highly important and quite a challenge within the field of Information Technology. This integration allows for higher resource availability and improved interaction among intervening actors. In the field of e-Learning, where Learning Management Systems…

  4. A large scale GIS geodatabase of soil parameters supporting the modeling of conservation practice alternatives in the United States

    USDA-ARS?s Scientific Manuscript database

    Water quality modeling requires across-scale support of combined digital soil elements and simulation parameters. This paper presents the unprecedented development of a large spatial scale (1:250,000) ArcGIS geodatabase coverage designed as a functional repository of soil-parameters for modeling an...

  5. The G-Portal Digital Repository as a Potentially Disruptive Pedagogical Innovation

    ERIC Educational Resources Information Center

    Hedberg, John G.; Chang, Chew-Hung

    2007-01-01

    Christensen defined a disruptive innovation or technology as one that eventually takes over the existing dominant technology in the market, despite the fact that the disruptive technology is both radically different to the leading technology and often initially performs worse than the leading technology according to existing measures of…

  6. Using a Combination of UML, C2RM, XML, and Metadata Registries to Support Long-Term Development/Engineering

    DTIC Science & Technology

    2003-01-01

Key XML specifications: Authentication (XCBF), Authorization (XACML, SAML), Privacy (P3P), Digital Rights Management (XrML), Content Management (DASL, WebDAV), Content Syndication…, Registry/Repository, BPSS, eCommerce XML/EDI, Universal Business Language (UBL), Internet & Computing, Human Resources (HR-XML), Semantic.

  7. Partner Resources at CBE

    Science.gov Websites

CBE's Industry Advisory Board (IAB) provides guidance. CBE maintains a partner website that allows partner access to resources, searchable by keyword and author on the eRepository, a service of the California Digital Library.

  8. Opportunistic Collaboration: Unlocking the Archives of the Birmingham Institute of Art and Design

    ERIC Educational Resources Information Center

    Everitt, Sian

    2005-01-01

    Purpose: To review a small specialist repository's strategic and opportunistic approach to utilising collaborative regional and national digital initiatives to increase access. The Birmingham Institute of Art and Design (BIAD) Archives activity is evaluated to determine whether a project-based approach recognises and meets the needs of historians,…

  9. Metadata Dictionary Database: A Proposed Tool for Academic Library Metadata Management

    ERIC Educational Resources Information Center

    Southwick, Silvia B.; Lampert, Cory

    2011-01-01

    This article proposes a metadata dictionary (MDD) be used as a tool for metadata management. The MDD is a repository of critical data necessary for managing metadata to create "shareable" digital collections. An operational definition of metadata management is provided. The authors explore activities involved in metadata management in…

  10. An open repository of earthquake-triggered ground-failure inventories

    USGS Publications Warehouse

    Schmitt, Robert G.; Tanyas, Hakan; Nowicki Jessee, M. Anna; Zhu, Jing; Biegel, Katherine M.; Allstadt, Kate E.; Jibson, Randall W.; Thompson, Eric M.; van Westen, Cees J.; Sato, Hiroshi P.; Wald, David J.; Godt, Jonathan W.; Gorum, Tolga; Xu, Chong; Rathje, Ellen M.; Knudsen, Keith L.

    2017-12-20

Earthquake-triggered ground failure, such as landsliding and liquefaction, can contribute significantly to losses, but our current ability to accurately include them in earthquake-hazard analyses is limited. The development of robust and widely applicable models requires access to numerous inventories of ground failures triggered by earthquakes that span a broad range of terrains, shaking characteristics, and climates. We present an openly accessible, centralized earthquake-triggered ground-failure inventory repository in the form of a ScienceBase Community to provide open access to these data with the goal of accelerating research progress. The ScienceBase Community hosts digital inventories created by both U.S. Geological Survey (USGS) and non-USGS authors. We present the original digital inventory files (when available) as well as an integrated database with uniform attributes. We also summarize the mapping methodology and level of completeness as reported by the original author(s) for each inventory. This document describes the steps taken to collect, process, and compile the inventories and the process for adding additional ground-failure inventories to the ScienceBase Community in the future.

  11. CELLPEDIA: a repository for human cell information for cell studies and differentiation analyses.

    PubMed

    Hatano, Akiko; Chiba, Hirokazu; Moesa, Harry Amri; Taniguchi, Takeaki; Nagaie, Satoshi; Yamanegi, Koji; Takai-Igarashi, Takako; Tanaka, Hiroshi; Fujibuchi, Wataru

    2011-01-01

    CELLPEDIA is a repository database for current knowledge about human cells. It contains various types of information, such as cell morphologies, gene expression and literature references. The major role of CELLPEDIA is to provide a digital dictionary of human cells for the biomedical field, including support for the characterization of artificially generated cells in regenerative medicine. CELLPEDIA features (i) its own cell classification scheme, in which whole human cells are classified by their physical locations in addition to conventional taxonomy; and (ii) cell differentiation pathways compiled from biomedical textbooks and journal papers. Currently, human differentiated cells and stem cells are classified into 2260 and 66 cell taxonomy keys, respectively, from which 934 parent-child relationships reported in cell differentiation or transdifferentiation pathways are retrievable. As far as we know, this is the first attempt to develop a digital cell bank to function as a public resource for the accumulation of current knowledge about human cells. The CELLPEDIA homepage is freely accessible except for the data submission pages that require authentication (please send a password request to cell-info@cbrc.jp). Database URL: http://cellpedia.cbrc.jp/
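
The parent-child relationships CELLPEDIA compiles can be queried as a simple tree walk, retrieving the differentiation pathway leading to a given cell. The sketch below uses generic hematopoietic cell names as examples; they are not CELLPEDIA taxonomy keys, and CELLPEDIA's actual data model is not described in the abstract.

```python
# Illustrative parent links: each cell maps to the cell it differentiates
# from, mirroring the parent-child relationships stored in CELLPEDIA.
PARENT = {
    "myeloid progenitor": "hematopoietic stem cell",
    "erythroblast": "myeloid progenitor",
    "erythrocyte": "erythroblast",
}

def differentiation_path(cell: str) -> list[str]:
    """Walk parent links back to the root and return the pathway in
    root-to-cell order."""
    path = [cell]
    while path[-1] in PARENT:
        path.append(PARENT[path[-1]])
    return list(reversed(path))

assert differentiation_path("erythrocyte") == [
    "hematopoietic stem cell", "myeloid progenitor",
    "erythroblast", "erythrocyte",
]
```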

  12. CTN summary of DSREDS, EDCARS, EDMICS CALS readiness testing. [Computer-aided Acquisition and Logistic Support (CALS) CALS Test Network (CTN), Digital Storage Retrieval Eng. Data System (DSREDS), Eng. Data Computer Assisted Retrieval System (EDCARS), Eng. Data Management Information and Control System (EDMICS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitschkowetz, N.; Vickers, D.L.

    This report provides a summary of the Computer-aided Acquisition and Logistic Support (CALS) Test Network (CTN) Laboratory Acceptance Test (LAT) and User Application Test (UAT) activities undertaken to evaluate the CALS capabilities being implemented as part of the Department of Defense (DOD) engineering repositories. Although the individual testing activities provided detailed reports for each repository, a synthesis of the results, conclusions, and recommendations is offered to provide a more concise presentation of the issues and the strategies, as viewed from the CTN perspective.

  13. HEPData: a repository for high energy physics data

    NASA Astrophysics Data System (ADS)

    Maguire, Eamonn; Heinrich, Lukas; Watt, Graeme

    2017-10-01

    The Durham High Energy Physics Database (HEPData) has been built up over the past four decades as a unique open-access repository for scattering data from experimental particle physics papers. It comprises data points underlying several thousand publications. Over the last two years, the HEPData software has been completely rewritten using modern computing technologies as an overlay on the Invenio v3 digital library framework. The software is open source with the new site available at https://hepdata.net now replacing the previous site at http://hepdata.cedar.ac.uk. In this write-up, we describe the development of the new site and explain some of the advantages it offers over the previous platform.

  14. Historic Bim: a New Repository for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Banfi, F.; Barazzetti, L.; Previtali, M.; Roncoroni, F.

    2017-05-01

Recent developments in Building Information Modelling (BIM) technologies are facilitating the management of historic complex structures using new applications. This paper proposes a generative method combining the morphological and typological aspects of historic buildings (H-BIM) with a set of monitoring information. This combination of 3D digital survey, parametric modelling and monitoring datasets allows for the development of a system for archiving and visualizing structural health monitoring (SHM) data (Fig. 1). The availability of a BIM database allows one to integrate different kinds of data stored in different ways (e.g. reports, tables, graphs, etc.) with a representation directly connected to the 3D model of the structure with appropriate levels of detail (LoD). Data can be interactively accessed by selecting specific objects of the BIM, i.e. connecting the 3D position of the sensors installed with additional digital documentation. Such innovative BIM objects, which form a new BIM family for SHM, can then be reused in other projects, facilitating the archiving and exploitation of data acquired and processed. The application of advanced modeling techniques reduces the time and costs of the generation process and supports cooperation between different disciplines using a central workspace. However, it also reveals new challenges for parametric software and exchange formats. The case study presented is the medieval bridge Azzone Visconti in Lecco (Italy), in which multi-temporal vertical movements during load testing were integrated into H-BIM.

  15. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data.

    PubMed

    Vanfretti, Luigi; Olsen, Svein H; Arava, V S Narasimham; Laera, Giuseppe; Bidadfar, Ali; Rabuzin, Tin; Jakobsen, Sigurd H; Lavenius, Jan; Baudette, Maxime; Gómez-López, Francisco J

    2017-04-01

This article presents an open data repository, the methodology used to generate it, and the associated data processing software developed to consolidate an hourly snapshot historical data set for the year 2015 into an equivalent Nordic power grid model (aka Nordic 44). The consolidation was achieved by matching the model's physical response to historical power flow records in the bidding regions of the Nordic grid that are available from the Nordic electricity market agent, Nord Pool. The model is made available in the form of CIM v14, Modelica and PSS/E (Siemens PTI) files. The Nordic 44 model in Modelica and PSS/E was first presented in the paper titled "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations" (Vanfretti et al., 2016) [1] for a single snapshot. In the digital repository being made available with the submission of this paper (SmarTSLab_Nordic44 Repository at Github, 2016) [2], a total of 8760 snapshots (for the year 2015) are provided that can be used to initialize and execute dynamic simulations using tools compatible with CIM v14, the Modelica language and the proprietary PSS/E tool. The Python scripts used to generate the snapshots (processed data) are also available, with all the data, in the GitHub repository (SmarTSLab_Nordic44 Repository at Github, 2016) [2]. This Nordic 44 equivalent model was also used in the iTesla project (iTesla) [3] to carry out simulations within a dynamic security assessment toolset (iTesla, 2016) [4], and has been further enhanced during the ITEA3 OpenCPS project (iTEA3) [5]. The raw data, processed data, and output models utilized within the iTesla platform (iTesla, 2016) [4] are also available in the repository. The CIM and Modelica snapshots of the "Nordic 44" model for the year 2015 are available in a Zenodo repository.
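
With one snapshot per hour of 2015, a natural way to address the 8760 snapshots is by hour-of-year index. The mapping below is a small sketch of that bookkeeping; the index-to-filename convention used by the actual repository is not described in the abstract, so this is an assumption for illustration only.

```python
from datetime import datetime

def snapshot_index(ts: datetime) -> int:
    """Map a 2015 timestamp to its hour-of-year index (0..8759),
    one index per hourly snapshot."""
    start = datetime(2015, 1, 1)
    return int((ts - start).total_seconds() // 3600)

# First hour of the year and last hour of the year:
assert snapshot_index(datetime(2015, 1, 1, 0, 30)) == 0
assert snapshot_index(datetime(2015, 12, 31, 23)) == 8759
```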

  16. Exposing exposure: automated anatomy-specific CT radiation exposure extraction for quality assurance and radiation monitoring.

    PubMed

    Sodickson, Aaron; Warden, Graham I; Farkas, Cameron E; Ikuta, Ichiro; Prevedello, Luciano M; Andriole, Katherine P; Khorasani, Ramin

    2012-08-01

To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine attributes. This institutional review board-approved HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct 563 in 600 CT encounters; 95% CI: 92%, 96%).
Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality control and optimization, and cumulative patient- and anatomy-specific radiation exposure monitoring. Large-scale anatomy-specific radiation exposure data repositories can be created with high fidelity from existing digital image archives by using open-source informatics tools.
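
The binomial confidence intervals reported for the discrete proportions can be reproduced with a normal-approximation (Wald) interval. This is a simple stand-in for whatever exact method the authors used; note that for proportions this close to 1, an exact or Wilson interval would usually be preferred.

```python
import math

def binomial_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% confidence interval for a
    binomial proportion: p +/- z * sqrt(p * (1 - p) / n)."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# Dose screen retrieval rate: 597 of 605 encounters.
lo, hi = binomial_ci(597, 605)
assert round(lo, 2) == 0.98 and round(hi, 2) == 1.00  # matches the reported 98%, 100%
```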

  17. Promoting Academic Physicists, Their Students, and Their Research through Library Partnerships

    NASA Astrophysics Data System (ADS)

    Rozum, B.; Wesolek, A.

    2012-12-01

    At many institutions, attracting and mentoring quality students is of key importance. Through their developing careers, typically under the tutelage of one primary faculty member, students build portfolios, prepare for graduate school, and apply to post-doc programs or faculty positions. Often though, the corpus of that primary faculty member's work is not available in a single location. This is a disadvantage both for current students, who wish to highlight the importance of their work within the context of a research group and for the department, which can miss opportunities to attract high-quality future students. Utah State University Libraries hosts a thriving institutional repository, DigitalCommons@USU, which provides open access to scholarly works, research, reports, publications, and journals produced by Utah State University faculty, staff, and students. The Library and the Physics Department developed a partnership to transcend traditional library repository architecture and emphasize faculty research groups within the department. Previously, only student theses and dissertations were collected, and they were not associated with the department in any way. Now student presentations, papers, and posters appear with other faculty works all in the same research work space. This poster session highlights the features of the University's repository and describes what is required to establish a similar structure at other academic institutions. We anticipate several long-term benefits of this new structure. Students are pleased with the increased visibility of their research and with having an online presence through their "Selected Works" personal author site. Faculty are pleased with the opportunity to highlight their research and the potential to attract new students to their research groups. This new repository model also allows the library to amplify the existing scientific outreach initiatives of the physics department. 
One example of this is a recent exhibit created in the Library showcasing a student research group's 30-year history of sending payloads into space. The exhibit was a direct result of archiving the work of student researchers in the institutional repository. From the perspective of the Library, the benefits are also impressive. The Library is able to build its institutional repository, develop strong relations with faculty in the Physics Department, and have access to unpublished reports that otherwise might be lost. Establishing research groups' presence in DigitalCommons@USU provided an opportunity to meet with the Physics graduate students to discuss setting up online web portfolios, archiving their publications, and understanding publisher contracts. Developing partnerships between academic units and libraries is one more method to reach out to potential students, promote research, and showcase the talents of faculty and students. Using the Library's institutional repository to do this is beneficial for everyone.

  18. Object linking in repositories

    NASA Technical Reports Server (NTRS)

    Eichmann, David (Editor); Beck, Jon; Atkins, John; Bailey, Bill

    1992-01-01

    This topic is covered in three sections. The first section explores some of the architectural ramifications of extending the Eichmann/Atkins lattice-based classification scheme to encompass the assets of the full life cycle of software development. A model is considered that provides explicit links between objects in addition to the edges connecting classification vertices in the standard lattice. The second section gives a description of the efforts to implement the repository architecture using a commercially available object-oriented database management system. Some of the features of this implementation are described, and some of the next steps to be taken to produce a working prototype of the repository are pointed out. In the final section, it is argued that design and instantiation of reusable components have competing criteria (design-for-reuse strives for generality, design-with-reuse strives for specificity) and that providing mechanisms for each can be complementary rather than antagonistic. In particular, it is demonstrated how program slicing techniques can be applied to customization of reusable components.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manteufel, R.D.; Ahola, M.P.; Turner, D.R.

A literature review has been conducted to determine the state of knowledge available in the modeling of coupled thermal (T), hydrologic (H), mechanical (M), and chemical (C) processes relevant to the design and/or performance of the proposed high-level waste (HLW) repository at Yucca Mountain, Nevada. The review focuses on identifying coupling mechanisms between individual processes and assessing their importance (i.e., whether the coupling is important, potentially important, or negligible). The significance of considering THMC-coupled processes lies in whether or not the processes impact the design and/or performance objectives of the repository. A review such as reported here is useful in identifying which coupled effects will be important, hence which coupled effects will need to be investigated by the US Nuclear Regulatory Commission in order to assess the assumptions, data, analyses, and conclusions in the design and performance assessment of a geologic repository. Although this work stems from regulatory interest in the design of the geologic repository, it should be emphasized that the repository design implicitly considers all of the repository performance objectives, including those associated with the time after permanent closure. The scope of this review goes beyond previous assessments in that it attempts (with the current state of knowledge) to determine which couplings are important, and identifies which computer codes are currently available to model coupled processes.

  20. Health professional learner attitudes and use of digital learning resources.

    PubMed

    Maloney, Stephen; Chamberlain, Michael; Morrison, Shane; Kotsanas, George; Keating, Jennifer L; Ilic, Dragan

    2013-01-16

    Web-based digital repositories allow educational resources to be accessed efficiently and conveniently from diverse geographic locations, hold a variety of resource formats, enable interactive learning, and facilitate targeted access for the user. Unlike some other learning management systems (LMS), resources can be retrieved through search engines and meta-tagged labels, and content can be streamed, which is particularly useful for multimedia resources. The aim of this study was to examine usage and user experiences of an online learning repository (Physeek) in a population of physiotherapy students. The secondary aim of this project was to examine how students prefer to access resources and which resources they find most helpful. The following data were examined using an audit of the repository server: (1) number of online resources accessed per day in 2010, (2) number of each type of resource accessed, (3) number of resources accessed during business hours (9 am to 5 pm) and outside business hours (years 1-4), (4) session length of each log-on (years 1-4), and (5) video quality (bit rate) of each video accessed. An online questionnaire and 3 focus groups assessed student feedback and self-reported experiences of Physeek. Students preferred the support provided by Physeek to other sources of educational material primarily because of its efficiency. Peak usage commonly occurred at times of increased academic need (ie, examination times). Students perceived online repositories as a potential tool to support lifelong learning and health care delivery. The results of this study indicate that today's health professional students welcome the benefits of online learning resources because of their convenience and usability. This represents a transition away from traditional learning styles and toward technological learning support and may indicate a growing link between social immersions in Internet-based connections and learning styles. 
The true potential for Web-based resources to support student learning is as yet unknown.

  1. Knowledge Management & Its Applications in Distance Education

    ERIC Educational Resources Information Center

    Saxena, Anurag

    2007-01-01

We are presently living in the age of the digital economy. Traditional thinking is proving inadequate, and newer methods are replacing the older ones. To achieve developmental goals, one has to build a knowledge repository. The success of any system today is defined by its knowledge capital. For example, for a university, knowledge…

  2. Online Law School Video Repository: The Flash Way

    ERIC Educational Resources Information Center

    Fang, Wei

    2009-01-01

    As with many other libraries in the U.S., the Rutgers University Law Library-Newark (RULLN), where the author works as digital services librarian, offers VHS tapes and CDs to its patrons. In 2006, some faculty members asked if their lectures could be posted online with protection so that only authorized users of the university's NetID system could…

  3. Scanning and georeferencing historical USGS quadrangles

    USGS Publications Warehouse

    Davis, Larry R.; Allord, G.J.

    2011-01-01

    The USGS Historical Quadrangle Scanning Project (HQSP) is scanning all scales and all editions of approximately 250,000 topographic maps published by the U.S. Geological Survey (USGS) since the inception of the topographic mapping program in 1884. This scanning will provide a comprehensive digital repository of USGS topographic maps, available to the public at no cost. This project serves the dual purpose of creating a master catalog and digital archive copies of the irreplaceable collection of topographic maps in the USGS Reston Map Library as well as making the maps available for viewing and downloading from the USGS Store and The National Map Viewer.

  4. Incorporating the Last Four Digits of Social Security Numbers Substantially Improves Linking Patient Data from De-identified Hospital Claims Databases

    PubMed Central

    Naessens, James M; Visscher, Sue L; Peterson, Stephanie M; Swanson, Kristi M; Johnson, Matthew G; Rahman, Parvez A; Schindler, Joe; Sonneborn, Mark; Fry, Donald E; Pine, Michael

    2015-01-01

Objective: Assess algorithms for linking patients across de-identified databases without compromising confidentiality. Data Sources/Study Setting: Hospital discharges from 11 Mayo Clinic hospitals during January 2008–September 2012 (assessment and validation data); Minnesota death certificates and hospital discharges from 2009 to 2012 for the entire state (application data). Study Design: Cross-sectional assessment of sensitivity and positive predictive value (PPV) for four linking algorithms, tested by identifying readmissions and posthospital mortality on the assessment data, with application to statewide data. Data Collection/Extraction Methods: De-identified claims included patient gender, birthdate, and zip code. Assessment records were matched with institutional sources containing unique identifiers and the last four digits of the Social Security number (SSNL4). Principal Findings: Gender, birthdate, and five-digit zip code identified readmissions with a sensitivity of 98.0 percent and a PPV of 97.7 percent, and identified postdischarge mortality with 84.4 percent sensitivity and 98.9 percent PPV. Inclusion of SSNL4 produced nearly perfect identification of readmissions and deaths. When applied statewide, regions bordering states with unavailable hospital discharge data had lower rates. Conclusion: Addition of SSNL4 to administrative data, accompanied by appropriate data use and data release policies, can enable trusted repositories to link data with nearly perfect accuracy without compromising patient confidentiality. States maintaining centralized de-identified databases should add SSNL4 to data specifications. PMID:26073819
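
The linking idea reduces to building a match key from the de-identified fields and optionally tightening it with SSNL4. The sketch below illustrates why the extra four digits help: two different patients can share gender, birthdate, and zip code, but rarely also share SSNL4. Field names and records are illustrative, not the study's actual data layout.

```python
# Match key construction for de-identified record linkage: gender +
# birthdate + 5-digit zip, optionally extended with the last four
# digits of the Social Security number (SSNL4).
def link_key(rec: dict, use_ssnl4: bool = False) -> tuple:
    key = (rec["gender"], rec["birthdate"], rec["zip5"])
    return key + (rec["ssnl4"],) if use_ssnl4 else key

# Two distinct (hypothetical) patients who happen to share demographics:
a = {"gender": "F", "birthdate": "1950-03-02", "zip5": "55905", "ssnl4": "1234"}
b = {"gender": "F", "birthdate": "1950-03-02", "zip5": "55905", "ssnl4": "9876"}

# Without SSNL4 the records collide (a potential false match);
# adding SSNL4 separates them.
assert link_key(a) == link_key(b)
assert link_key(a, use_ssnl4=True) != link_key(b, use_ssnl4=True)
```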

  5. [Self-archiving of biomedical papers in open access repositories].

    PubMed

    Abad-García, M Francisca; Melero, Remedios; Abadal, Ernest; González-Teruel, Aurora

    2010-04-01

    Open-access literature is digital, online, free of charge, and free of most copyright and licensing restrictions. Self-archiving, the deposit of scholarly outputs in institutional repositories (the open-access "green route"), is increasingly present in the activities of the scientific community. Beyond the benefits of open access for the visibility and dissemination of science, funding agencies increasingly require the deposit of papers and other documents in repositories. In the biomedical environment this is even more relevant because of the impact scientific literature can have on public health. However, for self-archiving to be feasible, authors must understand what it means and the terms under which they are allowed to archive their works. Tools such as Sherpa/RoMEO and DULCINEA (directories of the copyright licences of scientific journals) show which rights authors retain when they publish a paper and whether self-archiving is permitted. PubMed Central and its British and Canadian counterparts are the main thematic repositories for the biomedical fields. Spain has no repository of a similar nature, but most universities and the CSIC have already created their own institutional repositories. The increased visibility of research results, with its impact on greater and earlier citation, is one of the most frequently cited advantages of open access; removing economic barriers to information access is also a benefit, breaking down borders between groups.

  6. The role of the Jotello F. Soga Library in the digital preservation of South African veterinary history.

    PubMed

    Breytenbach, Amelia; Lourens, Antoinette; Marsh, Susan

    2013-04-26

    The history of veterinary science in South Africa can only be appreciated, studied, researched and passed on to coming generations if historical sources are readily available. In most countries, material and sources with historical value are often difficult to locate, dispersed over a large area and not part of the conventional book and journal literature. The Faculty of Veterinary Science of the University of Pretoria and its library have access to a large collection of historical sources, consisting of photographs, photographic slides, documents, proceedings, posters, audio-visual material, postcards and other memorabilia. Other institutions in the country are also approached if relevant sources are identified in their collections. The University of Pretoria's institutional repository, UPSpace, was launched in 2006. This provided the Jotello F. Soga Library with the opportunity to fill the repository with relevant digitised collections of diverse heritage and learning resources that can contribute to the long-term preservation and accessibility of historical veterinary sources. These collections are available for use not only by historians and researchers in South Africa but also elsewhere in Africa and the rest of the world. Important historical collections such as the Arnold Theiler collection, the Jotello F. Soga collection and collections of the Onderstepoort Journal of Veterinary Research and the Journal of the South African Veterinary Association are highlighted. The benefits of an open access digital repository, the importance of collaboration across the veterinary community, other prerequisites for the sustainability of a digitisation project, and the importance of metadata in enhancing accessibility are also covered.

  7. Significance of clustering and classification applications in digital and physical libraries

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Ioannis; Koulouris, Alexandros; Zervos, Spiros; Dendrinos, Markos; Giannakopoulos, Georgios

    2015-02-01

    Applications of clustering and classification techniques can prove very significant in both digital and physical (paper-based) libraries. The most essential application, document classification and clustering, is crucial for the content produced and maintained in digital libraries, repositories, databases, social media, blogs etc., based on various tags and ontology elements, transcending the traditional library-oriented classification schemes. Other applications that play a useful role in the new digital library environment include document routing, summarization and query expansion. Paper-based libraries can benefit as well, since classification combined with advanced material characterization techniques such as FTIR (Fourier Transform InfraRed spectroscopy) can be vital for the study and prevention of material deterioration. An improved two-level self-organizing clustering architecture is proposed to enhance the discrimination capacity of the learning space prior to classification, yielding promising results when applied to the above-mentioned library tasks.

  8. Specification for the U.S. Geological Survey Historical Topographic Map Collection

    USGS Publications Warehouse

    Allord, Gregory J.; Walter, Jennifer L.; Fishburn, Kristin A.; Shea, Gale A.

    2014-01-01

    This document provides the detailed requirements for producing, archiving, and disseminating a comprehensive digital collection of topographic maps for the U.S. Geological Survey (USGS) Historical Topographic Map Collection (HTMC). The HTMC is a digital archive of about 190,000 printed topographic maps published by the USGS from the inception of the topographic mapping program in 1884 until the last paper topographic map using lithographic printing technology was published in 2006. The HTMC provides a comprehensive digital repository of all scales and all editions of USGS printed topographic maps that is easily discovered, browsed, and downloaded by the public at no cost. The HTMC provides ready access to maps that are no longer available for distribution in print. A digital file representing the original paper historical topographic map is produced for each historical map in the HTMC in georeferenced PDF (GeoPDF) format (a portable document format [PDF] with a geospatial extension).

  9. SharedCanvas: A Collaborative Model for Medieval Manuscript Layout Dissemination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanderson, Robert D.; Albritton, Benjamin; Schwemmer, Rafael

    2011-01-01

    In this paper we present a model based on the principles of Linked Data that can be used to describe the interrelationships of images, texts and other resources to facilitate the interoperability of repositories of medieval manuscripts or other culturally important handwritten documents. The model is designed from a set of requirements derived from the real-world use cases of some of the largest digitized medieval content holders, and instantiations of the model are intended as the input to collection-independent page turning and scholarly presentation interfaces. A canvas painting paradigm, such as in PDF and SVG, was selected based on the lack of a one-to-one correlation between image and page, and to fulfill complex requirements such as when the full text of a page is known, but only fragments of the physical object remain. The model is implemented using technologies such as OAI-ORE Aggregations and OAC Annotations, as the fundamental building blocks of emerging Linked Digital Libraries. The model and implementation are evaluated through prototypes of both content providing and consuming applications. Although the system was designed from requirements drawn from the medieval manuscript domain, it is applicable to any layout-oriented presentation of images of text.

  10. TRENCADIS - secure architecture to share and manage DICOM objects in an ontological framework based on OGSA.

    PubMed

    Blanquer, Ignacio; Hernandez, Vicente; Segrelles, Damià; Torres, Erik

    2007-01-01

    Today most European healthcare centers keep their image databases in digital format. TRENCADIS is a software architecture comprising a set of services for interconnecting, managing and sharing selected parts of medical DICOM data for the development of training and decision support tools. The organization of the distributed information in virtual repositories is based on semantic criteria. Different groups of researchers can organize themselves into a Virtual Organization (VO). These VOs will be interested in specific target areas and will share information concerning each area. Although the private part of the shared information will be removed, special precautions are taken to prevent access by non-authorized users. This paper describes the security model implemented as part of TRENCADIS. The paper is organized as follows: after an introduction to the problem and our motivations, Section 1 defines the objectives, Section 2 presents an overview of the existing proposals per objective, Section 3 outlines the overall architecture, and Section 4 describes how TRENCADIS is architected to realize the security goals discussed in the previous sections. The different security services and components of the infrastructure are briefly explained, as well as the exposed interfaces. Finally, Section 5 concludes and gives some remarks on future work.

  11. Geoscience parameter data base handbook: granites and basalts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-12-01

    The Department of Energy has the responsibility for selecting and constructing Federal repositories for radioactive waste. The Nuclear Regulatory Commission must license such repositories prior to construction. The basic requirement in the geologic disposal of radioactive waste is stated as: placement in a geologic host whereby the radioactive waste is not in mechanical, thermal or chemical equilibrium, with the object of preventing physical or chemical migration of radionuclides into the biosphere or hydrosphere in hazardous concentration (USGS, 1977). The object of this report is to document the known geologic parameters of large granite and basalt occurrences in the coterminous United States, for future evaluation in the selection and licensing of radioactive waste repositories. The description of the characteristics of certain potential igneous hosts has been limited to existing data pertaining to the general geologic character, geomechanics, and hydrology of identified occurrences. A description of the geochemistry is the subject of a separate report.

  12. An Assessment of a Science Discipline Archive Against ISO 16363

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Downs, R. R.

    2016-12-01

    The Planetary Data System (PDS) is a federation of science discipline nodes formed in response to the findings of the Committee on Data Management and Computing (CODMAC 1986) that a "wealth of science data would ultimately cease to be useful and probably lost if a process was not developed to ensure that the science data were properly archived." Starting operations in 1990, the stated mission of the PDS is to "facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data and documentation produced by or relevant to NASA's planetary missions, research programs, and data analysis programs." In 2008 the PDS initiated a transition to a more modern system based on key principles found in the Open Archival Information System (OAIS) Reference Model (ISO 14721), a set of functional requirements provided by the designated community, and about twenty years of lessons learned. With science digital data now being archived under the new PDS4, the PDS is a good use case to be assessed as a trusted repository against ISO 16363, a recommended practice for assessing the trustworthiness of digital repositories. This presentation will summarize the OAIS principles adopted for PDS4 and the findings of a desk assessment of the PDS against ISO 16363. Also presented will be specific items of evidence, for example the PDS mission statement above, and how they affect the level of certainty that the ISO 16363 metrics are being met.

  13. Practical management of heterogeneous neuroimaging metadata by global neuroimaging data repositories

    PubMed Central

    Neu, Scott C.; Crawford, Karen L.; Toga, Arthur W.

    2012-01-01

    Rapidly evolving neuroimaging techniques are producing unprecedented quantities of digital data at the same time that many research studies are evolving into global, multi-disciplinary collaborations between geographically distributed scientists. While networked computers have made it almost trivial to transmit data across long distances, collecting and analyzing this data requires extensive metadata if the data is to be maximally shared. Though it is typically straightforward to encode text and numerical values into files and send content between different locations, it is often difficult to attach context and implicit assumptions to the content. As the number of and geographic separation between data contributors grows to national and global scales, the heterogeneity of the collected metadata increases and conformance to a single standardization becomes implausible. Neuroimaging data repositories must then not only accumulate data but must also consolidate disparate metadata into an integrated view. In this article, using specific examples from our experiences, we demonstrate how standardization alone cannot achieve full integration of neuroimaging data from multiple heterogeneous sources and why a fundamental change in the architecture of neuroimaging data repositories is needed instead. PMID:22470336
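The consolidation the authors call for (one concept, many source vocabularies) can be pictured as a mapping layer that normalizes each contributor's metadata into one integrated view; all site names and field names below are invented for illustration:

```python
# Hypothetical sketch: two contributing sites encode the same concepts
# under different metadata keys, and the repository maps each source
# vocabulary into a single integrated schema.
FIELD_MAPS = {
    "site_a": {"subj_id": "subject", "scan_date": "acquired", "tesla": "field_strength"},
    "site_b": {"participant": "subject", "date": "acquired", "B0": "field_strength"},
}

def consolidate(source, record):
    """Rename a source record's keys into the integrated vocabulary,
    passing through any keys the mapping does not cover."""
    mapping = FIELD_MAPS[source]
    return {mapping.get(k, k): v for k, v in record.items()}

unified = consolidate("site_b", {"participant": "P07", "B0": 3.0, "notes": "ok"})
```

The pass-through of unmapped keys is the crux of the authors' argument: full standardization is implausible at global scale, so the repository architecture itself must absorb the residual heterogeneity.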

  14. Practical management of heterogeneous neuroimaging metadata by global neuroimaging data repositories.

    PubMed

    Neu, Scott C; Crawford, Karen L; Toga, Arthur W

    2012-01-01

    Rapidly evolving neuroimaging techniques are producing unprecedented quantities of digital data at the same time that many research studies are evolving into global, multi-disciplinary collaborations between geographically distributed scientists. While networked computers have made it almost trivial to transmit data across long distances, collecting and analyzing this data requires extensive metadata if the data is to be maximally shared. Though it is typically straightforward to encode text and numerical values into files and send content between different locations, it is often difficult to attach context and implicit assumptions to the content. As the number of and geographic separation between data contributors grows to national and global scales, the heterogeneity of the collected metadata increases and conformance to a single standardization becomes implausible. Neuroimaging data repositories must then not only accumulate data but must also consolidate disparate metadata into an integrated view. In this article, using specific examples from our experiences, we demonstrate how standardization alone cannot achieve full integration of neuroimaging data from multiple heterogeneous sources and why a fundamental change in the architecture of neuroimaging data repositories is needed instead.

  15. Implementing DSpace at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Lowe, Greta

    2007-01-01

    This presentation looks at the implementation of the DSpace institutional repository system at the NASA Langley Technical Library. The library implemented DSpace software as a replacement for the Langley Technical Report Server (LTRS) and also used DSpace to develop the Langley Technical Library Digital Repository (LTLDR). LTLDR contains archival copies of core technical reports in the aeronautics area dating back to the NACA era, along with other specialized collections relevant to the NASA Langley community. Extensive metadata crosswalks were created to facilitate moving data from various systems and formats into DSpace, and the Dublin Core metadata screens were customized. The OpenURL standard and Ex Libris MetaLib are used in this environment to assist customers either with discovering full-text content or with initiating a request for the item.

  16. Challenges and Best Practices for the Curation and Publication of Long-Tail Data with GFZ Data Services

    NASA Astrophysics Data System (ADS)

    Elger, Kirsten; Ulbricht, Damian; Bertelmann, Roland

    2017-04-01

    Open access to research data is an increasing international demand, covering not only data underlying scholarly publications but also raw and curated data. Especially given the shift in many scientific fields towards data science and data mining, data repositories are becoming important players as data archives and access points to curated research data. While general and institutional data repositories are available across all scientific disciplines, domain-specific data repositories are specialised for particular disciplines, such as the bio- or geosciences, and can use richer, more discipline-specific metadata models than general repositories. Data publication is increasingly regarded as an important scientific achievement, and datasets with a digital object identifier (DOI) are now fully citable in journal articles. Moreover, following their signature of the "Statement of Commitment of the Coalition on Publishing Data in the Earth and Space Sciences" (COPDESS), many publishers have adapted their data policies and recommend, or even require, that data underlying scholarly publications be stored and published in (domain-specific) data repositories rather than as classical supplementary material attached directly to the respective article. The curation of large dynamic data from global networks in, e.g., seismology, magnetics or geodesy has always required a high grade of professional, IT-supported data management, simply to store and access the huge number of files and manage dynamic datasets. In contrast, the vast amount of research data acquired by individual investigators or small teams, known as 'long-tail data', has seldom been the focus of data curation infrastructures. Nevertheless, even though these datasets are small in size and highly variable, in total they represent a significant portion of the total scientific output. The curation of long-tail data requires more individual approaches and the personal involvement of the data curator, especially regarding the data description. Here we introduce best practices for the publication of long-tail data that help reduce the individual effort and improve the quality of the data description. The data repository of GFZ Data Services, hosted at the GFZ German Research Centre for Geosciences in Potsdam, is a domain-specific data repository for the geosciences. In addition to large dynamic datasets from different disciplines, it has a strong focus on the DOI-referenced publication of long-tail data, with the aim of achieving a high grade of reusability through comprehensive data description while at the same time providing and distributing standardised, machine-actionable metadata for data discovery (FAIR data). Templates for data reports, metadata provision by scientists via an XML metadata editor, and discipline-specific DOI landing pages help the data curators handle all kinds of datasets and enable the scientists, i.e. users, to decide quickly whether a published dataset fulfils their needs. In addition, GFZ Data Services has developed DOI-registration services for several international networks (e.g. ICGEM, World Stress Map, IGETS) as well as project- or network-specific designs of the DOI landing pages carrying the logo or design of the network or project.

  17. The Scholarly Communication Process within the University Research Corridor (Michigan State University, the University of Michigan, and Wayne State University): A Case Study in Cooperation

    ERIC Educational Resources Information Center

    Utter, Timothy; Holley, Robert P.

    2009-01-01

    The growth of open access publishing, the development of institutional repositories, and the availability of millions of digitized monographs and journals are rapidly changing scholarly communication. This case study looks at the current and possible uses of these tools by Michigan's three largest universities: Michigan State University, the…

  18. Exposing Exposure: Automated Anatomy-specific CT Radiation Exposure Extraction for Quality Assurance and Radiation Monitoring

    PubMed Central

    Warden, Graham I.; Farkas, Cameron E.; Ikuta, Ichiro; Prevedello, Luciano M.; Andriole, Katherine P.; Khorasani, Ramin

    2012-01-01

    Purpose: To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine attributes. Materials and Methods: This institutional review board–approved HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Results: Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct 563 in 600 CT encounters; 95% CI: 92%, 96%). 
Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality control and optimization, and cumulative patient- and anatomy-specific radiation exposure monitoring. Conclusion: Large-scale anatomy-specific radiation exposure data repositories can be created with high fidelity from existing digital image archives by using open-source informatics tools. ©RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12111822/-/DC1 PMID:22668563
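The 95% binomial CIs quoted for the discrete proportions are consistent with a standard normal-approximation interval; the sketch below is a plausible reconstruction of that calculation using the published retrieval counts (597 of 605), not the paper's actual code:

```python
# Normal-approximation 95% CI for a binomial proportion, clipped to [0, 1].
import math

def binomial_ci(successes, n, z=1.96):
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Dose screen retrieval rate from the abstract: 597 of 605 encounters.
p, lo, hi = binomial_ci(597, 605)
```

Rounded to whole percentages this reproduces the reported "99% (95% CI: 98%, 100%)".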

  19. Data publication, documentation and user friendly landing pages - improving data discovery and reuse

    NASA Astrophysics Data System (ADS)

    Elger, Kirsten; Ulbricht, Damian; Bertelmann, Roland

    2016-04-01

    Research data are the basis of scientific research and are often irreplaceable (e.g. observational data). Storage of such data in appropriate theme-specific or institutional repositories is an essential part of ensuring their long-term preservation and access. Free and open access to research data for reuse and scrutiny has been identified as a key issue by the scientific community as well as by research agencies and the public. To make datasets intelligible and usable for others, they must be accompanied by comprehensive data descriptions and standardized metadata for data discovery, and should ideally be published with a digital object identifier (DOI). DOIs make datasets citable, ensure their long-term accessibility, and are accepted in the reference lists of journal articles (http://www.copdess.org/statement-of-commitment/). The GFZ German Research Centre for Geosciences is the national laboratory for geosciences in Germany and part of the Helmholtz Association, Germany's largest scientific organization. The development and maintenance of data systems is a key component of 'GFZ Data Services' to support state-of-the-art research. The datasets archived in and published by the GFZ Data Repository cover all geoscientific disciplines and range from large dynamic datasets derived from global seismic or geodetic monitoring networks with real-time data acquisition, to remotely sensed satellite products, automatically generated data publications from a database of micro-meteorological stations, various model results, geochemical and rock-mechanical analyses from various labs, and field observations. The user-friendly presentation of published datasets via a DOI landing page is as important for reuse as the storage itself, and the required information is highly specific to each scientific discipline. If dataset descriptions are too general, or a dataset must be downloaded before its suitability can be judged, researchers often decide not to reuse a published dataset. In contrast to large data repositories without thematic specification, theme-specific data repositories have extensive expertise in data discovery and the opportunity to develop usable, discipline-specific formats and layouts for specific datasets, including consultation on different formats for the data description (e.g. via a data report or an article in a data journal) with full consideration of international metadata standards.

  20. Decision Tree Repository and Rule Set Based Mingjiang River Estuarine Wetlands Classification

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Li, X.; Xiao, W.

    2018-05-01

    Increasing urbanization and industrialization have led to wetland losses in the estuarine area of the Mingjiang River over the past three decades, and increasing attention has been given to producing wetland inventories using remote sensing and GIS technology. Because of inconsistent training sites and training samples, traditional pixel-based image classification methods cannot achieve comparable results across different organizations. Object-oriented image classification shows great potential to solve this problem, and Landsat moderate-resolution remote sensing images are widely used to fulfill this requirement. First, standardized atmospheric correction and spectrally high-fidelity texture feature enhancement were conducted before implementing the object-oriented wetland classification method in eCognition. Second, we performed the multi-scale segmentation procedure, taking the scale, hue, shape, compactness and smoothness of the image into account to obtain the appropriate parameters; using a region-merge algorithm starting from the single-pixel level, the optimal texture segmentation scale for different types of features was confirmed. The segmented objects were then used as classification units to calculate spectral information such as the mean, maximum, minimum, brightness and normalized values. Spatial features such as the area, length, tightness and shape rule of the image objects, and texture features such as the mean, variance and entropy of the image objects, were used as classification features of the training samples. Based on reference images and on-the-spot sampling points, typical training samples were selected uniformly and randomly for each type of ground object. The value ranges of the spectral, texture and spatial characteristics of each feature type in each feature layer were used to create the decision tree repository. Finally, with the help of high-resolution reference images, a field investigation using random sampling achieved an overall accuracy of 90.31 %, with a Kappa coefficient of 0.88. The classification method based on decision tree threshold values and the rule set developed from the repository outperforms the traditional methodology. Our decision-tree-repository and rule-set based object-oriented classification technique is an effective method for producing comparable and consistent wetland data sets.
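The decision-tree repository described above amounts to a set of threshold rules over per-object features. A minimal sketch of that idea follows; the class names, feature names and threshold values are hypothetical, not those derived in the paper:

```python
# Illustrative rule-set classifier: each segmented image object carries
# spectral/shape/texture features, and an ordered repository of threshold
# rules assigns a wetland class (first matching rule wins).
RULES = [
    ("open_water",       lambda o: o["mean_nir"] < 0.08 and o["brightness"] < 0.15),
    ("mudflat",          lambda o: o["brightness"] > 0.30 and o["entropy"] < 2.0),
    ("marsh_vegetation", lambda o: o["mean_nir"] >= 0.08 and o["variance"] < 0.02),
]

def classify(obj, rules=RULES, default="other"):
    """Apply the first matching threshold rule, decision-tree style."""
    for label, predicate in rules:
        if predicate(obj):
            return label
    return default
```

Because the rules live in a shared repository rather than in ad hoc per-analyst training, two organizations applying the same rule set to the same segmentation obtain the same map, which is the consistency argument the abstract makes.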

  1. GeoTrust Hub: A Platform For Sharing And Reproducing Geoscience Applications

    NASA Astrophysics Data System (ADS)

    Malik, T.; Tarboton, D. G.; Goodall, J. L.; Choi, E.; Bhatt, A.; Peckham, S. D.; Foster, I.; Ton That, D. H.; Essawy, B.; Yuan, Z.; Dash, P. K.; Fils, G.; Gan, T.; Fadugba, O. I.; Saxena, A.; Valentic, T. A.

    2017-12-01

    Recent requirements of scholarly communication emphasize the reproducibility of scientific claims. Text-based research papers are considered a poor medium for establishing reproducibility; papers must be accompanied by "research objects", aggregations of digital artifacts that, together with the paper, provide an authoritative record of a piece of research. We will present GeoTrust Hub (http://geotrusthub.org), a platform for creating, sharing, and reproducing reusable research objects. GeoTrust Hub provides tools for scientists to create 'geounits'--reusable research objects. Geounits are self-contained, annotated, and versioned containers that describe and package computational experiments in an efficient and light-weight manner. Geounits can be shared on public repositories such as HydroShare and FigShare and, using their respective APIs, reproduced on provisioned clouds. The latter feature gives science applications a lifetime beyond sharing, wherein they can be independently verified and trust established as they are repeatedly reused. Through research use cases from several geoscience laboratories across the United States, we will demonstrate how the tools provided by GeoTrust Hub, along with HydroShare as its public repository for geounits, are advancing the state of reproducible research in the geosciences. For each use case, we will address different computational reproducibility requirements. Our first use case is an example of setup reproducibility, which enables a scientist to set up and reproduce an output from a model with complex configuration and development environments. Our second use case is an example of algorithm/data reproducibility, wherein a shared data science model/dataset can be substituted with an alternate one to verify model output results; the final use case is an example of interactive reproducibility, in which an experiment depends on specific versions of data to produce the result. Toward this we will use software and data used in preparing data for the MODFLOW model in hydrology, JupyterHub as used in HydroShare, PyLith as used in the Computational Infrastructure for Geodynamics, and GeoSpace Collaborative Observations and Assimilative Modeling as used in space science. GeoTrust Hub is funded through the National Science Foundation EarthCube program.

  2. Processes in scientific workflows for information seeking related to physical sample materials

    NASA Astrophysics Data System (ADS)

    Ramdeen, S.

    2014-12-01

    The majority of State Geological Surveys have repositories containing cores, cuttings, fossils or other physical sample material. State surveys maintain these collections to support their own research as well as research conducted by external users from other organizations, including government agencies (state and federal), academia, industry and the public. The preliminary results presented in this paper look at the research processes of these external users, in particular how they discover, access and use the digital surrogates through which they evaluate and access physical items in these collections. Data such as physical samples are materials that cannot be completely replaced with digital surrogates. Digital surrogates may be represented as metadata, which enable discovery of and ultimately access to these samples; surrogates may be found in records, databases, publications, etc. But surrogates do not eliminate the need for access to the physical item, as surrogates cannot be subjected to chemical testing or similar analysis. The goal of this research is to document the various processes external users perform in order to access physical materials. Data for this study will be collected through interviews with these external users. During the interviews, participants will be asked to describe the workflow that led them to interact with state survey repositories, and what steps they took afterward. High-level processes/categories of behavior will be identified and used in the development of an information-seeking behavior model. This model may be used to facilitate the development of management tools and other aspects of cyberinfrastructure related to physical samples.

  3. Native Americans and state and local governments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rusco, E.R.

    1991-10-01

Native Americans' concerns arising from the possibility of establishment of a nuclear repository for high-level wastes at Yucca Mountain fall principally into two main categories. First, the strongest objection to the repository comes from traditional Western Shoshones. Their objections are based on a claim that the Western Shoshones still own Yucca Mountain and also on the assertion that putting high-level nuclear wastes into the ground is a violation of their religious views regarding nature. Second, there are several reservations around the Yucca Mountain site that might be affected in various ways by building of the repository. There is a question about how many such reservations there are, which can only be decided when more information is available. This report discusses two questions: the bearing of the continued vigorous assertion by traditionalist Western Shoshones of their land claim; and the extent to which Nevada state and local governments are able to understand and represent Indian viewpoints about Yucca Mountain.

  4. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites, which offer many advantages over sites built from static HTML pages. It discusses how these OSS tools are used and what benefits they bring, and how, after successfully implementing them, the library took the initiative of building an institutional repository using the DSpace open source software.

  5. AWARE: Adaptive Software Monitoring and Dynamic Reconfiguration for Critical Infrastructure Protection

    DTIC Science & Technology

    2015-04-29

in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address...collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model...invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software

  6. Emerging Challenges and Opportunities in Building Information Modeling for the US Army Installation Management Command

    DTIC Science & Technology

    2012-07-01

Information Modeling (BIM) is the process of generating and managing building data during a facility’s entire life cycle. New BIM standards for...cycle Building Information Modeling (BIM) as a new standard for building information data repositories can serve as the foundation for automation and...Building Information Modeling (BIM) is defined as “a digital representation of physical and functional

  7. DSA-WDS Common Requirements: Developing a New Core Data Repository Certification

    NASA Astrophysics Data System (ADS)

    Minster, J. B. H.; Edmunds, R.; L'Hours, H.; Mokrane, M.; Rickards, L.

    2016-12-01

    The Data Seal of Approval (DSA) and the International Council for Science - World Data System (ICSU-WDS) have both developed minimally intensive core certification standards whereby digital repositories supply evidence that they are trustworthy and have a long-term outlook. Both DSA and WDS applicants have found core certification to be beneficial: it builds stakeholder confidence, enhances the repository's reputation, and demonstrates that the repository is following good practices; it also stimulates the repository to focus on processes and procedures, thereby achieving ever higher levels of professionalism over time. The DSA and WDS core certifications evolved independently, initially serving different communities, but both initiatives are multidisciplinary, with catalogues of criteria and review procedures based on the same principles. Hence, to realize efficiencies, simplify assessment options, stimulate more certifications, and increase impact on the community, the Repository Audit and Certification DSA-WDS Partnership Working Group (WG) was established under the umbrella of the Research Data Alliance (RDA). The WG conducted a side-by-side analysis of both frameworks to unify the wording and criteria, ultimately leading to a harmonized Catalogue of Common Requirements for core certification of repositories, as well as a set of Common Procedures for their assessment. This presentation will focus on the collaborative effort by DSA and WDS to establish (1) a testbed comprising DSA- and WDS-certified data repositories to validate both the new Catalogue and Procedures, and (2) a joint Certification Board for their practical implementation.
    We will describe:
    • The purpose and methodology of the testbed, including selection of repositories to be assessed against the common standard.
    • The results of the testbed, with an in-depth look at some of the comments received and issues highlighted.
    • General insights gained from evaluating the testbed results, the subsequent changes to the Common Requirements and Procedures, and an assessment of the success of these enhancements.
    • Steps taken by the two organizations to integrate the Common Certification into their tools and systems, in particular the creation of Terms of Reference for the nascent DSA-WDS Certification Board.

  8. CyVerse Data Commons: lessons learned in cyberinfrastructure management and data hosting from the Life Sciences

    NASA Astrophysics Data System (ADS)

    Swetnam, T. L.; Walls, R.; Merchant, N.

    2017-12-01

CyVerse is a US National Science Foundation funded initiative "to design, deploy, and expand a national cyberinfrastructure for life sciences research, and to train scientists in its use," supporting and enabling cross-disciplinary collaborations across institutions. CyVerse's free, open-source cyberinfrastructure is being adopted in biogeoscience and space sciences research. CyVerse's data-science-agnostic platforms provide shared data storage, high performance computing, and cloud computing that allow analysis of very large data sets (including incomplete or work-in-progress data sets). Part of CyVerse's success has been in addressing the handling of data through its entire lifecycle, from creation to final publication in a digital data repository to reuse in new analyses. CyVerse developers and user communities have learned many lessons that are germane to Earth and environmental science. We present an overview of the tools and services available through CyVerse, including: interactive computing with the Discovery Environment (https://de.cyverse.org/), an interactive data science workbench featuring data storage and transfer via the Data Store; cloud computing with Atmosphere (https://atmo.cyverse.org); and access to HPC via the Agave API (https://agaveapi.co/). Each CyVerse service emphasizes access to long-term data storage, including our own Data Commons (http://datacommons.cyverse.org) as well as external repositories. The Data Commons service manages, organizes, preserves, and publishes data, and allows for its discovery and reuse. All data published to CyVerse's Curated Data receive a permanent identifier (PID) in the form of a DOI (Digital Object Identifier) or ARK (Archival Resource Key). Data that are more fluid can also be published in the Data Commons as Community Collaborated data. The Data Commons provides landing pages, permanent DOIs or ARKs, and supports data reuse and citation through features such as open data licenses and downloadable citations.
The ability to access and compute on data within the CyVerse framework, or with external compute resources when necessary, has proven highly beneficial to our user community, which has grown continuously since the inception of CyVerse nine years ago.
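    The downloadable citations mentioned above can be illustrated with a small sketch. All field names, the example record, and the DOI below are hypothetical, not the Data Commons' actual schema:

```python
# Minimal sketch (hypothetical field names): assembling a data citation
# from a published dataset's metadata, as a repository landing page might.

def format_citation(meta):
    """Build a simple 'Authors (Year). Title. Publisher. Identifier' string."""
    authors = "; ".join(meta["authors"])
    pid = meta["identifier"]
    # A bare DOI is rendered as a resolvable URL; an ARK is shown verbatim.
    if pid.startswith("10."):
        pid = f"https://doi.org/{pid}"
    return f'{authors} ({meta["year"]}). {meta["title"]}. {meta["publisher"]}. {pid}'

record = {
    "authors": ["Doe, J.", "Roe, R."],        # invented example metadata
    "year": 2017,
    "title": "Example canopy height dataset",
    "publisher": "CyVerse Data Commons",
    "identifier": "10.12345/example",          # hypothetical DOI
}
print(format_citation(record))
```

    The same record could carry an ARK instead of a DOI; only the identifier branch changes.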

  9. A Web 2.0 Application for Executing Queries and Services on Climatic Data

    NASA Astrophysics Data System (ADS)

    Abad-Mota, S.; Ruckhaus, E.; Garboza, A.; Tepedino, G.

    2007-12-01

For many years countries have collected data in order to understand climate, to study its effects on living species, and to predict future behavior. Nowadays, terabytes of data are collected by governmental agencies and academic institutions, and the current challenge is how to provide appropriate access to this vast amount of climatic data. Each country has a different situation with respect to the collection and use of these data. In Venezuela in particular, a few institutions have systematically gathered observational and hydrology data, but the data are mostly registered in non-digital media which have been lost or have deteriorated over the years, all of which restricts data availability. In 2006 a joint project between two major Venezuelan universities, Universidad Simón Bolívar (USB) and Universidad Central de Venezuela (UCV), was initiated. The goal of the project is to develop a digital repository of the country's climatic and hydrology data, and to build an application that provides querying and service execution capabilities over these data. The repository has been conceptually modeled as a database which integrates observational data and metadata. Among the metadata are an inventory of all the stations where data have been collected, and descriptions of the measurements themselves: for instance, the instruments used for the collection, the time granularity of the measurements, and their units of measure. The resulting data model combines traditional entity-relationship concepts with star and snowflake schemas from data warehouses. The model allows the inclusion of historic or current data, and each kind of data requires a different loading process. A special emphasis has been given to representing the quality of the data stored in the repository. Quality attributes can be attached to each individual value or to sets of values; these attributes can represent the statistical or semantic quality of the data. 
Values can be stored at any level of aggregation (hourly, daily, monthly), so that they can be provided to the user at the desired level. This means that additional caution has to be exercised in query answering, in order to distinguish between primary and derived data. In parallel, a Web 2.0 application is being designed to provide a front-end to the repository. This design focuses on two important aspects: the use of metadata structures, and the definition of collaborative Web 2.0 features that can be integrated into a project of this nature. Metadata descriptors capture, for a set of measurements, its quality, granularity, and other dimension information. With these descriptors it is possible to establish relationships between different sets of measurements and provide scientists with efficient searching mechanisms that determine the related sets of measurements contributing to a query answer. Unlike traditional applications for climatic data, our approach not only satisfies the requirements of researchers specialized in this domain, but also those of anyone interested in the area; one of the objectives is to build an informal knowledge base that can be improved and consolidated through use of the system.
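    The modeling choices described here (a station dimension, measurements with units and time granularity, per-value quality attributes, and the primary-versus-derived distinction in query answering) can be sketched with an in-memory database. All table and column names below are hypothetical, not the project's actual schema:

```python
import sqlite3

# Illustrative star-schema sketch: a station dimension table and a
# measurement fact table carrying unit, aggregation level, a per-value
# quality attribute, and a flag separating primary from derived data.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE station (
    station_id INTEGER PRIMARY KEY,
    name TEXT, latitude REAL, longitude REAL
);
CREATE TABLE measurement (
    station_id INTEGER REFERENCES station(station_id),
    variable   TEXT,     -- e.g. 'precipitation'
    unit       TEXT,     -- e.g. 'mm'
    level      TEXT,     -- aggregation level: 'hourly', 'daily', 'monthly'
    derived    INTEGER,  -- 0 = primary observation, 1 = computed aggregate
    quality    TEXT,     -- per-value quality attribute
    value      REAL
);
""")
con.execute("INSERT INTO station VALUES (1, 'Example Station', 10.5, -66.9)")
rows = [(1, "precipitation", "mm", "hourly", 0, "good",    1.2),
        (1, "precipitation", "mm", "hourly", 0, "suspect", 0.4),
        (1, "precipitation", "mm", "daily",  1, "good",   14.0)]
con.executemany("INSERT INTO measurement VALUES (?, ?, ?, ?, ?, ?, ?)", rows)

# Query answering must distinguish primary observations from derived values:
primary = con.execute(
    "SELECT COUNT(*) FROM measurement WHERE derived = 0").fetchone()[0]
print(primary)
```

    A real snowflake design would further normalize the variable/unit and time dimensions into their own tables; the flag-based separation above is the simplest form of the primary/derived distinction.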

  10. Quarterly Report - May through July 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Laniece E.

    2012-08-09

The first quarter of my postgraduate internship has been an extremely varied one, in which I have tackled several different aspects of the project. Because this is the beginning of a new investigation for the Research Library, I think it is appropriate that I explore data management at LANL from multiple perspectives. I have spent a considerable amount of time doing a literature search and taking notes on what I've been reading in preparation for potential writing activities later. The Research Library is not the only research library exploring the possibility of providing such services to its user base. The Joint Information Systems Committee (JISC) and the Digital Curation Centre (DCC) in the UK are actively pursuing possibilities to preserve the scientific record. DataONE is a U.S. National Science Foundation (NSF) initiative aimed at helping to curate bioscience data. This is just a tiny sample of the organizations actively looking into the issues surrounding data management on an organizational, cultural, or technical level. I have included a partial bibliography of some papers I have read. Based on what I read, various discussions, and previous library training, I have begun to document the services I feel I could provide researchers in the context of my internship. This is still very much a work in progress as I learn more about the landscape in libraries and at the Laboratory. I have detailed this process and my thoughts on the issue below. As data management is such a complex and interconnected activity, it is impossible to investigate the organizational and cultural needs of the researchers without familiarizing myself with technologies that could facilitate the local cataloging and preservation of data sets. I have spent some time investigating the repository software DSpace. 
The library has long maintained the digital object repository aDORe, but the differences in features and the lack of a user interface compared to DSpace have made DSpace a good test bed for this project. However, my internship is not about repository software, and DSpace is just one potential tool for supporting researchers and their data. More details of my repository investigation are given below. The most exciting aspect of the project thus far has been meeting with researchers, some of whom are potential collaborators. Some people I have talked with have been very interested and enthusiastic about the possibility of collaborating, while others have not wanted to discuss the issue at all. I have had discussions with individual researchers managing their own labs as well as with researchers who are part of much larger collaborations. Three research groups whom I feel are of particular interest are detailed below. I have added an appendix which goes into more detail about the protein crystallography community, which has addressed the complete data life cycle within their field end to end. The issue of data management is much bigger than just my internship, and there are several people and organizations exploring the issues at the Laboratory. I am making every effort to stay focused on small science data sets and to ensure that my activities use standards-based approaches and are sustainable.

  11. Developing a University Course for Online Delivery Based on Learning Objects: From Ideals to Compromises

    ERIC Educational Resources Information Center

    Wilhelm, Pierre; Wilde, Russ

    2005-01-01

    A course instructor and his assistant at Athabasca University investigated whether the process of transferring interoperable learning objects from online repositories facilitated course production, both pedagogically and economically. They examined the efficiency of the objects-assembly method from several perspectives while developing an online…

  12. Online 4d Reconstruction Using Multi-Images Available Under Open Access

    NASA Astrophysics Data System (ADS)

    Ioannides, M.; Hadjiprocopi, A.; Doulamis, N.; Doulamis, A.; Protopapadakis, E.; Makantasis, K.; Santos, P.; Fellner, D.; Stork, A.; Balet, O.; Julien, M.; Weinlinger, G.; Johnson, P. S.; Klein, M.; Fritsch, D.

    2013-07-01

The advent of digital camera technology and its incorporation into virtually any smart mobile device has led to an explosion in the number of photographs taken every day. Today, the number of images stored online and available freely has reached unprecedented levels. It is estimated that in 2011 there were over 100 billion photographs stored in just one of the major social media sites, and this number is growing exponentially. Moreover, advances in the fields of photogrammetry and computer vision have led to significant breakthroughs such as the Structure from Motion algorithm, which creates 3D models of objects from their two-dimensional photographs. The existence of powerful and affordable computational machinery enables the reconstruction not only of complex structures but also of entire cities. This paper presents an overview of our methodology for producing 3D models of Cultural Heritage structures such as monuments and artefacts from 2D data (pictures, video) available on internet repositories, social media, Google Maps, Bing, etc. We also present new approaches to semantic enrichment of the end results and their subsequent export to Europeana, the European digital library, for integrated, interactive 3D visualisation within regular web browsers using WebGL and X3D. Our main goal is to enable historians, architects, archaeologists, urban planners, and affiliated professionals to reconstruct views of historical structures from the millions of images floating around the web and to interact with them.
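    The Structure from Motion algorithm mentioned above rests on triangulation: once camera poses are known, each matched pair of 2D image points determines a 3D point. The toy, noise-free sketch below uses the direct linear transform; the cameras and point are invented, and a real pipeline would solve a least-squares system over many views (e.g. via SVD) rather than three hand-picked equations:

```python
# Triangulation sketch: recover a 3D point from its projections in two
# calibrated views. Each view contributes rows u*P[2]-P[0] and v*P[2]-P[1]
# of a homogeneous linear system A X = 0 (the DLT formulation).

def solve3(M, b):
    """Solve a 3x3 linear system M x = b by Cramer's rule."""
    def det(a):
        return (a[0][0]*(a[1][1]*a[2][2] - a[1][2]*a[2][1])
              - a[0][1]*(a[1][0]*a[2][2] - a[1][2]*a[2][0])
              + a[0][2]*(a[1][0]*a[2][1] - a[1][1]*a[2][0]))
    d = det(M)
    out = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = b[i]
        out.append(det(Mj) / d)
    return out

def triangulate(P1, x1, P2, x2):
    rows = []
    for P, (u, v) in ((P1, x1), (P2, x2)):
        rows.append([u*P[2][k] - P[0][k] for k in range(4)])
        rows.append([v*P[2][k] - P[1][k] for k in range(4)])
    # Fix the homogeneous scale (w = 1) and solve three of the four
    # equations; this assumes they are independent, true for this toy setup.
    M = [r[:3] for r in rows[:3]]
    b = [-r[3] for r in rows[:3]]
    return solve3(M, b)

# Two toy cameras: identity intrinsics, second camera shifted 1 unit along x.
P1 = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
P2 = [[1, 0, 0, -1], [0, 1, 0, 0], [0, 0, 1, 0]]
X = (1.0, 2.0, 10.0)                    # ground-truth 3D point
x1 = (X[0] / X[2], X[1] / X[2])         # its projection in view 1
x2 = ((X[0] - 1) / X[2], X[1] / X[2])   # and in view 2
print(triangulate(P1, x1, P2, x2))
```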

  13. Geosamples.org: Shared Cyberinfrastructure for Geoscience Samples

    NASA Astrophysics Data System (ADS)

    Lehnert, Kerstin; Allison, Lee; Arctur, David; Klump, Jens; Lenhardt, Christopher

    2014-05-01

    Many scientific domains, specifically in the geosciences, rely on physical samples as basic elements for study and experimentation. Samples are collected to analyze properties of natural materials and features that are key to our knowledge of Earth's dynamical systems and evolution, and to preserve a record of our environment over time. Huge volumes of samples have been acquired over decades or even centuries and stored in a large number and variety of institutions including museums, universities and colleges, state geological surveys, federal agencies, and industry. All of these collections represent highly valuable, often irreplaceable records of nature that need to be accessible so that they can be re-used in future research and for educational purposes. Many sample repositories are keen to use cyberinfrastructure capabilities to enhance access to their collections on the internet and to support and streamline collection management (accessioning of new samples, labeling, handling sample requests, etc.), but encounter substantial challenges and barriers to integrate digital sample management into their daily routine. They lack the resources (staff, funding) and infrastructure (hardware, software, IT support) to develop and operate web-enabled databases, to migrate analog sample records into digital data management systems, and to transfer paper- or spreadsheet-based workflows to electronic systems. Use of commercial software is often not an option as it incurs high costs for licenses, requires IT expertise for installation and maintenance, and often does not match the needs of the smaller repositories, being designed for large museums or different types of collections (art, archeological, biological). 
Geosamples.org is an alliance of sample repositories (academic, US federal and state surveys, industry) and data facilities that aims to develop a cyberinfrastructure that will dramatically advance access to physical samples for the research community, government agencies, students, educators, and the general public, while supporting, simplifying, and standardizing the work of curators in repositories, museums, and universities, and even of individual investigators who manage personal or project-based sample collections in their labs. Geosamples.org builds upon best practices and cyberinfrastructure for sample identification, registration, and documentation developed by the IGSN e.V., an international organization that governs the International Geosample Number, a persistent unique identifier for physical samples. Geosamples.org will develop a Digital Environment for Sample Curation (DESC) that will facilitate the creation, identification, and registration of 'virtual samples' and network them into an 'Internet of Samples' that will make it possible to discover, access, and track online physical samples, the data derived from their study, and the publications that contain these data. DESC will provide easy-to-use software tools for curators to maintain digital catalogs of their collections, to provide online access to those catalogs for searching and requesting samples, to manage sample requests and users, and to track collection usage and impact. Geosamples.org will also work toward joint practices for the recognition of intellectual property, build mechanisms to create sustainable business models for the continuing maintenance and evolution of sample resources, and integrate the sample management life cycle into the professional and cultural practice of science.
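    The registration workflow DESC describes can be sketched in miniature. This is a hedged illustration: the catalog structure, field names, and namespace/serial scheme below are hypothetical, loosely modeled on IGSN-style alphanumeric sample codes; real IGSN registration goes through allocating agents rather than a local dictionary:

```python
import re

# Hypothetical sketch of a digital sample catalog entry: each physical
# sample gets a persistent IGSN-like code under a curator's namespace,
# plus minimal metadata linking sample, derived data, and publications.

def mint_code(namespace, serial):
    """Compose an IGSN-like sample code: namespace + zero-padded serial."""
    code = f"{namespace.upper()}{serial:04d}"
    if not re.fullmatch(r"[A-Z0-9]{5,9}", code):
        raise ValueError("namespace+serial out of range for this sketch")
    return code

catalog = {}

def register_sample(namespace, serial, name, material, collection):
    code = mint_code(namespace, serial)
    catalog[code] = {
        "name": name,
        "material": material,
        "collection": collection,
        "datasets": [],       # data derived from studying the sample
        "publications": [],   # papers containing those data
    }
    return code

code = register_sample("IECUR", 17, "basalt core section 3",
                       "rock", "Example State Survey")
print(code)
```

    Networking such entries into an 'Internet of Samples' then amounts to resolving each code to its catalog record and following the dataset and publication links.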

  14. BCO-DMO: Enabling Access to Federally Funded Research Data

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Allison, M. D.; Chandler, C. L.; Groman, R. C.; Rauch, S.; Shepherd, A.; Gegg, S. R.; Wiebe, P. H.; Glover, D. M.

    2013-12-01

In a February 2013 memo [1], the White House Office of Science and Technology Policy (OSTP) outlined principles and objectives to increase public access to federally funded research publications and data. Such access is intended to drive innovation by allowing private and commercial efforts to take full advantage of existing resources, thereby maximizing Federal research dollars and efforts. The Biological and Chemical Oceanography Data Management Office (BCO-DMO; bco-dmo.org) serves as a model resource for organizations seeking compliance with the OSTP policy. BCO-DMO works closely with scientific investigators to publish their data from research projects funded by the National Science Foundation (NSF), within the Biological and Chemical Oceanography Sections (OCE) and the Division of Polar Programs Antarctic Organisms & Ecosystems Program (PLR). BCO-DMO addresses many of the OSTP objectives for public access to digital scientific data: (1) Marine biogeochemical and ecological data and metadata are disseminated via a public website and curated on intermediate time frames; (2) Preservation needs are met by collaborating with appropriate national data facilities for data archive; (3) The cost and administrative burden associated with data management are minimized by the use of one dedicated office providing hundreds of NSF investigators with support for data management plan development, data organization, metadata generation, and deposition of data and metadata into the BCO-DMO repository; (4) Recognition of intellectual property is reinforced through the office's citation policy and the use of digital object identifiers (DOIs); (5) Education and training in data stewardship and use of the BCO-DMO system are provided by office staff through a variety of venues. Oceanographic research data and metadata from thousands of datasets generated by hundreds of investigators are now available through BCO-DMO. 
[1] White House Office of Science and Technology Policy, Memorandum for the Heads of Executive Departments and Agencies: Increasing Access to the Results of Federally Funded Scientific Research, February 23, 2013. http://www.whitehouse.gov/sites/default/files/microsites/ostp/ostp_public_access_memo_2013.pdf

  15. Patterns of Learning Object Reuse in the Connexions Repository

    ERIC Educational Resources Information Center

    Duncan, S. M.

    2009-01-01

    Since the term "learning object" was first published, there has been either an explicit or implicit expectation of reuse. There has also been a lot of speculation about why learning objects are, or are not, reused. This study quantitatively examined the actual amount and type of learning object use, to include reuse, modification, and translation,…

  16. Building a Trustworthy Environmental Science Data Repository: Lessons Learned from the ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S. K.; Boyer, A.; Beaty, T.; Deb, D.; Hook, L.

    2017-12-01

The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, https://daac.ornl.gov) for biogeochemical dynamics is one of NASA's Earth Observing System Data and Information System (EOSDIS) data centers. The mission of the ORNL DAAC is to assemble, distribute, and provide data services for a comprehensive archive of terrestrial biogeochemistry and ecological dynamics observations and models to facilitate research, education, and decision-making in support of NASA's Earth Science. Since its establishment in 1994, the ORNL DAAC has been continuously building itself into a trustworthy environmental science data repository, not only by ensuring the quality and usability of its data holdings but also by optimizing its data publication and management processes. This paper describes the lessons learned from the ORNL DAAC's effort toward this goal. The ORNL DAAC has proactively implemented international community standards throughout its data management life cycle, including data publication, preservation, discovery, visualization, and distribution. Data files in standard formats, detailed documentation, and metadata following standard models are prepared to improve the usability and longevity of data products. Assignment of a Digital Object Identifier (DOI) ensures the identifiability and accessibility of every data product, including the different versions and revisions in its life cycle. The ORNL DAAC's data citation policy ensures that data producers receive appropriate recognition for use of their products. Web service standards, such as OpenSearch and those of the Open Geospatial Consortium (OGC), promote the discovery, visualization, distribution, and integration of the ORNL DAAC's data holdings. Recently, the ORNL DAAC began efforts to optimize and standardize its data archival and data publication workflows to improve the efficiency and transparency of its data archival and management processes.
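    OpenSearch, one of the web service standards mentioned above, works by publishing a description document containing a URL template whose placeholders a client substitutes before issuing the query. A minimal sketch of that substitution step, with a hypothetical endpoint (not the ORNL DAAC's actual service URL):

```python
import re
from urllib.parse import quote

# OpenSearch URL templates use {name} for required parameters and {name?}
# for optional ones; a client fills them in with percent-encoded values.

def fill_template(template, **params):
    """Replace {name} / {name?} placeholders; optional ones default to ''."""
    def sub(m):
        name, optional = m.group(1), m.group(2) == "?"
        if name in params:
            return quote(str(params[name]), safe="")
        if optional:
            return ""
        raise KeyError(f"required parameter {name!r} missing")
    return re.sub(r"\{(\w+)(\??)\}", sub, template)

# Hypothetical template of the kind an OpenSearch description might publish.
template = ("https://daac.example.org/opensearch?"
            "q={searchTerms}&page={startPage?}&format=atom")
print(fill_template(template, searchTerms="leaf area index", startPage=2))
```

    Omitting the optional `startPage` simply leaves its slot empty, which is how OpenSearch clients handle unsupported optional parameters.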

  17. NeuroMorpho.Org implementation of digital neuroscience: dense coverage and integration with the NIF

    PubMed Central

    Halavi, Maryam; Polavaram, Sridevi; Donohue, Duncan E.; Hamilton, Gail; Hoyt, Jeffrey; Smith, Kenneth P.; Ascoli, Giorgio A.

    2009-01-01

Neuronal morphology affects network connectivity, plasticity, and information processing. Uncovering the design principles and functional consequences of dendritic and axonal shape necessitates quantitative analysis and computational modeling of detailed experimental data. Digital reconstructions provide the required neuromorphological descriptions in a parsimonious, comprehensive, and reliable numerical format. NeuroMorpho.Org is the largest web-accessible repository service for digitally reconstructed neurons and one of the integrated resources in the Neuroscience Information Framework (NIF). Here we describe the NeuroMorpho.Org approach as an exemplary experience in designing, creating, populating, and curating a neuroscience digital resource. The simple three-tier architecture of NeuroMorpho.Org (web client, web server, and relational database) encompasses all necessary elements to support a large-scale, integrate-able repository. The data content, while heterogeneous in scientific scope and experimental origin, is unified in format and presentation by an in-house standardization protocol. The server application (MRALD) is secure, customizable, and developer-friendly. Centralized processing and expert annotation yield a comprehensive set of metadata that enriches and complements the raw data. The thoroughly tested interface design allows for optimal and effective data search and retrieval. Availability of data in both original and standardized formats ensures compatibility with existing resources and fosters further tool development. Other key functions enable extensive exploration and discovery, including 3D and interactive visualization of branching, frequently measured morphometrics, and reciprocal links to the original PubMed publications. 
The integration of NeuroMorpho.Org with version-1 of the NIF (NIFv1) provides the opportunity to access morphological data in the context of other relevant resources and diverse subdomains of neuroscience, opening exciting new possibilities in data mining and knowledge discovery. The outcome of such coordination is the rapid and powerful advancement of neuroscience research at both the conceptual and technological level. PMID:18949582
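    Digitally reconstructed neurons of the kind NeuroMorpho.Org serves are commonly exchanged in the plain-text SWC format, one sample point per line (`id type x y z radius parent`, with parent -1 marking the root and `#` starting a comment). As a sketch of the "frequently measured morphometrics" mentioned above, the function below computes total cable length; the tiny example neuron is invented:

```python
import math

# Total cable length of a reconstruction: sum of Euclidean distances
# between each SWC point and its parent point.

def total_length(swc_text):
    points = {}
    for line in swc_text.splitlines():
        line = line.split("#")[0].strip()   # drop comments and blanks
        if not line:
            continue
        pid, ptype, x, y, z, radius, parent = line.split()
        points[int(pid)] = (float(x), float(y), float(z), int(parent))
    length = 0.0
    for x, y, z, parent in points.values():
        if parent != -1:
            px, py, pz, _ = points[parent]
            length += math.dist((x, y, z), (px, py, pz))
    return length

sample = """
# toy neuron: soma plus two dendritic segments (invented data)
1 1 0 0 0 1.0 -1
2 3 3 4 0 0.5 1
3 3 3 4 12 0.5 2
"""
print(total_length(sample))  # → 17.0
```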

  18. NeuroMorpho.Org implementation of digital neuroscience: dense coverage and integration with the NIF.

    PubMed

    Halavi, Maryam; Polavaram, Sridevi; Donohue, Duncan E; Hamilton, Gail; Hoyt, Jeffrey; Smith, Kenneth P; Ascoli, Giorgio A

    2008-09-01

Neuronal morphology affects network connectivity, plasticity, and information processing. Uncovering the design principles and functional consequences of dendritic and axonal shape necessitates quantitative analysis and computational modeling of detailed experimental data. Digital reconstructions provide the required neuromorphological descriptions in a parsimonious, comprehensive, and reliable numerical format. NeuroMorpho.Org is the largest web-accessible repository service for digitally reconstructed neurons and one of the integrated resources in the Neuroscience Information Framework (NIF). Here we describe the NeuroMorpho.Org approach as an exemplary experience in designing, creating, populating, and curating a neuroscience digital resource. The simple three-tier architecture of NeuroMorpho.Org (web client, web server, and relational database) encompasses all necessary elements to support a large-scale, integrate-able repository. The data content, while heterogeneous in scientific scope and experimental origin, is unified in format and presentation by an in-house standardization protocol. The server application (MRALD) is secure, customizable, and developer-friendly. Centralized processing and expert annotation yield a comprehensive set of metadata that enriches and complements the raw data. The thoroughly tested interface design allows for optimal and effective data search and retrieval. Availability of data in both original and standardized formats ensures compatibility with existing resources and fosters further tool development. Other key functions enable extensive exploration and discovery, including 3D and interactive visualization of branching, frequently measured morphometrics, and reciprocal links to the original PubMed publications. 
The integration of NeuroMorpho.Org with version-1 of the NIF (NIFv1) provides the opportunity to access morphological data in the context of other relevant resources and diverse subdomains of neuroscience, opening exciting new possibilities in data mining and knowledge discovery. The outcome of such coordination is the rapid and powerful advancement of neuroscience research at both the conceptual and technological level.

  19. Semantic Repositories for eGovernment Initiatives: Integrating Knowledge and Services

    NASA Astrophysics Data System (ADS)

    Palmonari, Matteo; Viscusi, Gianluigi

In recent years, public sector investments in eGovernment initiatives have depended on making existing governmental ICT systems and infrastructures more reliable. Furthermore, we are witnessing a change in the focus of public sector management, from the disaggregation, competition, and performance measurement typical of the New Public Management (NPM) to new models of governance aiming at the reintegration of services under a new perspective on bureaucracy, namely a holistic approach to policy making which exploits the extensive digitalization of administrative operations. In this scenario, major challenges relate to supporting effective access to information both at the front-end level, by means of highly modular and customizable content provision, and at the back-end level, by means of information integration initiatives. Repositories of information about data and services that exploit semantic models and technologies can support these goals by bridging the gap between data-level representations and the human-level knowledge involved in accessing information and in searching for services. Moreover, semantic repository technologies can reach a new level of automation for different tasks involved in interoperability programs, related both to data integration techniques and to service-oriented computing approaches. In this chapter, we discuss the above topics by referring to techniques and experiences in which repositories based on conceptual models and ontologies are used at different levels in eGovernment initiatives: at the back-end level to produce a comprehensive view of the information managed in public administrations' (PA) information systems, and at the front-end level to support effective service delivery.

  20. The Hemiptera type-material housed in the "Museu de Ciências Naturais, Fundação Zoobotânica do Rio Grande do Sul" of Porto Alegre, Brazil.

    PubMed

    Ruschel, Tatiana Petersen; Guidoti, Marcus; Barcellos, Aline

    2013-01-01

    We provide a commented and referenced list of the type material deposited in the "Museu de Ciências Naturais, Fundação Zoobotânica do Rio Grande do Sul", Porto Alegre, Brazil. Geographic coordinates are freely accessible in a digital repository. High-resolution images of the specimens are available upon request.

  1. Developing the Tools for Geologic Repository Monitoring - Andra's Monitoring R and D Program - 12045

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buschaert, S.; Lesoille, S.; Bertrand, J.

    2012-07-01

    The French Safety Guide recommends that Andra develop a monitoring program to be implemented during repository construction and conducted until (and possibly after) closure, in order to confirm expected behavior and enhance knowledge of relevant processes. To achieve this, Andra has developed an overall monitoring strategy and identified specific technical objectives to inform disposal process management on evolutions relevant to both the long-term safety and the reversible, pre-closure management of the repository. Andra has launched an ambitious R and D program to ensure that reliable, durable, metrologically qualified and tested monitoring systems will be available at the time of repository construction in order to respond to monitoring objectives. After four years of this R and D program, first observations are described and recommendations are proposed. The results allow three main observations to be shared. First, while other industries also invest in monitoring equipment, their emphasis will always be on their own specific requirements and needs, often providing only a partial match with repository requirements. Examples can be found for all available sensors, which are generally not resistant to radiation. Second, the very close scrutiny anticipated for the geologic disposal process is likely to place an unprecedented emphasis on the quality of monitoring results. It therefore seems important to emphasize specific developments aimed at providing metrologically qualified systems. Third, adapting existing technology to specific repository needs, and providing adequate proof of its worth, is a lengthy process. In conclusion, it therefore seems prudent to plan ahead and to invest wisely in the adequate development of those monitoring tools that will likely be needed in the repository to respond to the implementers' and regulators' requirements, including those agreed and developed to respond to potential stakeholder expectations. (authors)

  2. The Effects of Using Learning Objects in Two Different Settings

    ERIC Educational Resources Information Center

    Cakiroglu, Unal; Baki, Adnan; Akkan, Yasar

    2012-01-01

    The study compared the effects of Learning Objects (LOs) in two different settings: in the classroom and in extracurricular activities. First, a Learning Object Repository (LOR) was designed in parallel with the 9th grade school mathematics curriculum. One of the two treatment groups was named the "classroom group" (n…

  3. Anticipatory planning and control of grasp positions and forces for dexterous two-digit manipulation.

    PubMed

    Fu, Qiushi; Zhang, Wei; Santello, Marco

    2010-07-07

    Dexterous object manipulation requires anticipatory control of digit positions and forces. Despite extensive studies on sensorimotor learning of digit forces, how humans learn to coordinate digit positions and forces has never been addressed. Furthermore, the functional role of anticipatory modulation of digit placement to object properties remains to be investigated. We addressed these questions by asking human subjects (12 females, 12 males) to grasp and lift an inverted T-shaped object using precision grip at constrained or self-chosen locations. The task requirement was to minimize object roll during lift. When digit position was not constrained, subjects could have implemented many equally valid digit position-force coordination patterns. However, choice of digit placement might also have resulted in large trial-to-trial variability of digit position, hence challenging the extent to which the CNS could have relied on sensorimotor memories for anticipatory control of digit forces. We hypothesized that subjects would modulate digit placement for optimal force distribution and digit forces as a function of variable digit positions. All subjects learned to minimize object roll within the first three trials, and the unconstrained device was associated with significantly smaller grip forces but larger variability of digit positions. Importantly, however, digit load force modulation compensated for position variability, thus ensuring consistent object roll minimization on each trial. This indicates that subjects learned object manipulation by integrating sensorimotor memories with sensory feedback about digit positions. These results are discussed in the context of motor equivalence and sensorimotor integration of grasp kinematics and kinetics.

  4. Design training activity for teachers and students on environmental science topic in the frame of ENVRIPLUS project

    NASA Astrophysics Data System (ADS)

    D'Addezio, G.; Beranzoli, L.; Antonella, M.

    2016-12-01

    We elaborated actions to improve the content of the ENVRIPLUS e-Training Platform for multimedia education of secondary school level teachers and students. The purpose is to support teacher training, and consequently student training, on selected scientific themes addressed within the ENVRIPLUS Research Infrastructures. In particular, we address major thematic research areas and challenges in Biodiversity and Ecosystem Services, the Greenhouse effect and Earth Warming, Ocean acidification, and Environmental sustainability. First, we identified "best practices" that could positively impact students by providing motivation to pursue scientific research and by increasing awareness of the Earth System's complexity and of the environmental challenges to its preservation and sustainability. Best-practice teaching strategies represent an inherent part of a curriculum that exemplifies the connection and relevance identified in education research. To realize the training platform, we started a detailed study and analysis of the teaching and multimedia information materials already available. We plan the realization of a digital repository giving teachers and students access and opportunities to develop original content, with standardization of the design methods for scientific and technical content, classification/cataloging of information in digital form, and definition of a logical model for the provision of thematic content in a single digital environment. To better design the actions and capture teacher needs, we prepared a questionnaire to be administered to a large sample of international secondary school level teachers. The first part focuses on objective information about the formal, quantitative and qualitative position of science classes in schools and the content and methods of teaching in different countries. The second part investigates subjective teacher experiences and their views on what can improve the training offer for environmental science lessons and courses.

  5. Reviving legacy clay mineralogy data and metadata through the IEDA-CCNY Data Internship Program

    NASA Astrophysics Data System (ADS)

    Palumbo, R. V.; Randel, C.; Ismail, A.; Block, K. A.; Cai, Y.; Carter, M.; Hemming, S. R.; Lehnert, K.

    2016-12-01

    Reconstruction of past climate and ocean circulation using ocean sediment cores relies on the use of multiple climate proxies measured on well-studied cores. Preserving all the information collected on a sediment core is crucial for the success of future studies using these unique and important samples. Clay mineralogy is a powerful tool for studying weathering processes and sedimentary provenance. In his pioneering dissertation, Pierre Biscaye (1964, Yale University) established the X-Ray Diffraction (XRD) method for quantitative clay mineralogy analyses in ocean sediments and presented data for 500 core-top samples throughout the Atlantic Ocean and its neighboring seas. Unfortunately, the data existed only in analog format, which has discouraged scientists from reusing them, apart from replication of the published maps. Archiving and preserving this dataset and making it publicly available in digital format, linked with the metadata from the core repository, will allow the scientific community to use these data to generate new findings. Under the supervision of Sidney Hemming and members of the Interdisciplinary Earth Data Alliance (IEDA) team, IEDA-CCNY interns digitized the data and metadata from Biscaye's dissertation and linked them with additional sample metadata using IGSN (International Geo-Sample Number). After compilation and proper documentation, the dataset was published in the EarthChem Library, where it is openly accessible and citable with a persistent DOI (Digital Object Identifier). During the internship, the students read peer-reviewed articles, interacted with active scientists in the field, and acquired knowledge about XRD methods and the data generated, as well as its applications. They also learned about existing and emerging best practices in data publication and preservation. Data rescue projects are a fun and interactive way for students to become engaged in the field.

  6. Co-production of Health enabled by next generation personal health systems.

    PubMed

    Boye, Niels

    2012-01-01

    This paper describes the theoretical principles for the establishment of a parallel and complementary modality of healthcare delivery, named Co-production of Health (CpH). This service model activates digital data, information, and knowledge about health, healthy choices, and the individual's health state, and computes context-aware communication and advice through personalized models. "Lightweight technologies" (smartphones, tablets, application stores) would serve as the technology closest to the end-users (citizens, patients, clients, customers), connecting them with "big data" in conventionally and non-conventionally organized data repositories. The CpH modality aims at providing synergies between professional healthcare, self-care, and informal care, and fuses data from several sources, such as the health characteristics of consumer goods, sensors, actuators, and health-related data repositories, turning this into "health added value" for the individual. A theoretical business model respecting healthcare values, ethics, and legal foundations is also sketched out.

  7. Challenges in Developing XML-Based Learning Repositories

    NASA Astrophysics Data System (ADS)

    Auksztol, Jerzy; Przechlewski, Tomasz

    There is no doubt that modular design has many advantages, including the most important ones: reusability and cost-effectiveness. In e-learning community parlance, the modules are termed Learning Objects (LOs) [11]. An increasing number of learning objects have been created and published online, several standards have been established, and multiple repositories have been developed for them. For example, Cisco Systems, Inc., "recognizes a need to move from creating and delivering large inflexible training courses, to database-driven objects that can be reused, searched, and modified independent of their delivery media" [6]. The learning object paradigm of educational resource authoring is promoted mainly to reduce the cost of content development and to increase its quality. A frequently used metaphor compares Learning Objects to Lego blocks or to objects in object-oriented program design [25]. However, a metaphor is only an abstract idea, which must be turned into something more concrete to be usable. The problem is that many papers on LOs end up solely in metaphors. In our opinion, the Lego and OO metaphors are gross oversimplifications of the problem, as it is much easier to build a Lego set or design objects in an OO program than to develop truly interoperable, context-free learning content.

  8. OWLing Clinical Data Repositories With the Ontology Web Language

    PubMed Central

    Pastor, Xavier; Lozano, Esther

    2014-01-01

    Background The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. Objective The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. Methods We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Results Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All required information to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. Conclusions OntoCRF is a complete framework to build data repositories with a solid relational storage. 
    Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems, and it requires less specialized technical skill, thereby facilitating the engineering of clinical software systems. PMID:25599697
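    The "generic repository" that OntoCRF describes (store data without per-project database design, deploy schema changes instantly) is commonly realized with an entity-attribute-value (EAV) store. The sketch below illustrates that general pattern only; the table, function, and field names are invented for illustration and are not taken from OntoCRF.

```python
import sqlite3

# Minimal entity-attribute-value (EAV) store: one generic table holds
# values for any form field defined elsewhere (e.g., in an ontology),
# so adding a new field needs no schema migration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE eav (
        entity    TEXT,  -- e.g., a case-report-form record id
        attribute TEXT,  -- a field name taken from the ontology
        value     TEXT
    )
""")

def store(entity, attribute, value):
    conn.execute("INSERT INTO eav VALUES (?, ?, ?)",
                 (entity, attribute, str(value)))

def fetch(entity):
    rows = conn.execute(
        "SELECT attribute, value FROM eav WHERE entity = ?", (entity,))
    return dict(rows.fetchall())

store("patient-001", "systolic_bp", 128)
store("patient-001", "diagnosis", "hypertension")
print(fetch("patient-001"))
```

    The trade-off of this design is that all values are stored untyped in one table, which is exactly why an external model (here, an ontology) is needed to interpret and validate them.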

  9. Implementing a Community-Driven Cyberinfrastructure Platform for the Paleo- and Rock Magnetic Scientific Fields that Generalizes to Other Geoscience Disciplines

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Jarboe, N.; Koppers, A. A.; Tauxe, L.; Constable, C.

    2013-12-01

    EarthRef.org is a geoscience umbrella website for several databases and data and model repository portals. These portals, unified in the mandate to preserve their respective data and promote scientific collaboration in their fields, are also disparate in their schemata. The Magnetics Information Consortium (http://earthref.org/MagIC/) is a grass-roots cyberinfrastructure effort envisioned by the paleo- and rock magnetic scientific community to archive their wealth of peer-reviewed raw data and interpretations from studies on natural and synthetic samples and relies on a partially strict subsumptive hierarchical data model. The Geochemical Earth Reference Model (http://earthref.org/GERM/) portal focuses on the chemical characterization of the Earth and relies on two data schemata: a repository of peer-reviewed reservoir geochemistry, and a database of partition coefficients for rocks, minerals, and elements. The Seamount Biogeosciences Network (http://earthref.org/SBN/) encourages the collaboration between the diverse disciplines involved in seamount research and includes the Seamount Catalog (http://earthref.org/SC/) of bathymetry and morphology. All of these portals also depend on the EarthRef Reference Database (http://earthref.org/ERR/) for publication reference metadata and the EarthRef Digital Archive (http://earthref.org/ERDA/), a generic repository of data objects and their metadata. The development of the new MagIC Search Interface (http://earthref.org/MagIC/search/) centers on a reusable platform designed to be flexible enough for largely heterogeneous datasets and to scale up to datasets with tens of millions of records. The HTML5 web application and Oracle 11g database residing at the San Diego Supercomputer Center (SDSC) support the online contribution and editing of complex datasets in a spreadsheet environment and the browsing and filtering of these contributions in the context of thousands of other datasets. 
EarthRef.org is in the process of implementing this platform across all of its data portals in spite of the wide variety of data schemata and is dedicated to serving the geoscience community with as little effort from the end-users as possible.

  10. The digital library: an oxymoron?

    PubMed Central

    Guédon, J C

    1999-01-01

    "Virtual libraries" and "digital libraries" have become stock phrases of our times. But what do they really mean? While digital refers to a new form of document encoding and must be approached from that perspective, virtual resonates with aspects that modern philosophy treats with benign neglect at best. The word virtual harbors the notion of potential, and therein lies its hidden strength. Although strong commercial interests try to use the shift to a digital environment to redefine the political economy of knowledge, and thus virtualize libraries into a state of almost complete impotence, all hope is not lost. Librarians of virtualized libraries may well discover that they have re-empowered institutions if they place human interaction at the heart of their operations. In other words, rather than envisioning themselves as knowledge bankers sitting on treasure vaults of knowledge, they should see themselves as "hearts" dynamizing human communities. They should also see themselves as an essential part of these communities, and not as external repositories of knowledge. In this fashion, they will avoid the fate of becoming an oxymoron. PMID:9934524

  11. Integrating digital information for coastal and marine sciences

    USGS Publications Warehouse

    Marincioni, Fausto; Lightsom, Frances L.; Riall, Rebecca L.; Linck, Guthrie A.; Aldrich, Thomas C.; Caruso, Michael J.

    2004-01-01

    A pilot distributed geolibrary, the Marine Realms Information Bank (MRIB), was developed by the U.S. Geological Survey Coastal and Marine Geology Program and the Woods Hole Oceanographic Institution, to classify, integrate, and facilitate access to scientific information about oceans, coasts, and lakes. The MRIB is composed of a categorization scheme, a metadata database, and a specialized software backend, capable of drawing together information from remote sources without modifying their original format or content. Twelve facets are used to classify information: location, geologic time, feature type, biota, discipline, research method, hot topics, project, agency, author, content type, and file type. The MRIB approach allows easy and flexible organization of large or growing document collections for which centralized repositories would be impractical. Geographic searching based on the gazetteer and map interface is the centerpiece of the MRIB distributed geolibrary. The MRIB is one of a very few digital libraries that employ georeferencing -- a fundamentally different way to structure information from the traditional author/title/subject/keyword approach employed by most digital libraries. Lessons learned in developing the MRIB will be useful as other digital libraries confront the challenges of georeferencing.

  12. The Use of Underground Research Laboratories to Support Repository Development Programs. A Roadmap for the Underground Research Facilities Network.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacKinnon, Robert J.

    2015-10-26

    Under the auspices of the International Atomic Energy Agency (IAEA), nationally developed underground research laboratories (URLs) and associated research institutions are being offered for use by other nations. These facilities form an Underground Research Facilities (URF) Network for training in and demonstration of waste disposal technologies and the sharing of knowledge and experience related to geologic repository development, research, and engineering. In order to achieve its objectives, the URF Network regularly sponsors workshops and training events related to the knowledge base that is transferable between existing URL programs and to nations with an interest in developing a new URL. This report describes the role of URLs in the context of a general timeline for repository development. This description includes identification of key phases and activities that contribute to repository development as a repository program evolves from an early research and development phase to later phases such as construction, operations, and closure. This information is cast in the form of a matrix, with the entries in this matrix forming the basis of the URF Network roadmap that will be used to identify and plan future workshops and training events.

  13. Metadata management and semantics in microarray repositories.

    PubMed

    Kocabaş, F; Can, T; Baykal, N

    2011-12-01

    The number of microarray and other high-throughput experiments in primary repositories keeps increasing, as do the size and complexity of the results, in response to biomedical investigations. Initiatives have been started on the standardization of content, object models, exchange formats and ontologies. However, there are backlogs and an inability to exchange data between microarray repositories, which indicate a great need for a standard format and data management. We have introduced a metadata framework that includes a metadata card and semantic nets that make experimental results visible, understandable and usable. These are encoded in syntax encoding schemes and represented in RDF (Resource Description Framework), can be integrated with other metadata cards and semantic nets, and can be exchanged, shared and queried. We demonstrated the performance and potential benefits through a case study on a selected microarray repository. We concluded that the backlogs can be reduced and that the exchange of information and the asking of knowledge-discovery questions can become possible with the use of this metadata framework.
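    An RDF representation of a metadata card, as described above, reduces to subject-predicate-object triples. The sketch below serializes a toy card as N-Triples with only the standard library; the base URI and predicate vocabulary are invented placeholders, not the paper's actual schema.

```python
# Serialize a simple microarray "metadata card" as RDF N-Triples.
# The URIs and field names below are illustrative placeholders.
BASE = "http://example.org/"

card = {
    "experiment": "exp-42",          # used as the subject identifier
    "organism": "Homo sapiens",
    "platform": "hypothetical-array-v1",
}

def to_ntriples(card):
    subject = f"<{BASE}experiment/{card['experiment']}>"
    lines = []
    for key, value in card.items():
        if key == "experiment":      # the id names the subject, not a property
            continue
        predicate = f"<{BASE}vocab/{key}>"
        lines.append(f'{subject} {predicate} "{value}" .')
    return "\n".join(lines)

print(to_ntriples(card))
```

    Because every statement is an independent triple, cards serialized this way can be merged with other cards or semantic nets simply by concatenating their triples, which is what makes the exchange and querying described in the abstract possible.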

  14. Exploiting the HASH Planetary Nebula Research Platform

    NASA Astrophysics Data System (ADS)

    Parker, Quentin A.; Bojičić, Ivan; Frew, David J.

    2017-10-01

    The HASH (Hong Kong/ AAO/ Strasbourg/ Hα) planetary nebula research platform is a unique data repository with a graphical interface and SQL capability that offers the community powerful, new ways to undertake Galactic PN studies. HASH currently contains multi-wavelength images, spectra, positions, sizes, morphologies and other data whenever available for 2401 true, 447 likely, and 692 possible Galactic PNe, for a total of 3540 objects. An additional 620 Galactic post-AGB stars, pre-PNe, and PPN candidates are included. All objects were classified and evaluated following the precepts and procedures established and developed by our group over the last 15 years. The complete database contains over 6,700 Galactic objects including the many mimics and related phenomena previously mistaken or confused with PNe. Curation and updating currently occurs on a weekly basis to keep the repository as up to date as possible until the official release of HASH v1 planned in the near future.

  15. On implementing clinical decision support: achieving scalability and maintainability by combining business rules and ontologies.

    PubMed

    Kashyap, Vipul; Morales, Alfredo; Hongsermeier, Tonya

    2006-01-01

    We present an approach and architecture for implementing scalable and maintainable clinical decision support at the Partners HealthCare System. The architecture integrates a business rules engine that executes declarative if-then rules stored in a rule base referencing objects and methods in a business object model. The rules engine executes object methods by invoking services implemented on the clinical data repository. Specialized inferences that support the classification of data and instances into classes are identified, and an approach to implementing these inferences using an OWL-based ontology engine is presented. Alternative representations of these specialized inferences as if-then rules or OWL axioms are explored, and their impact on the scalability and maintenance of the system is presented. Architectural alternatives for integrating clinical decision support functionality with the invoking application and the underlying clinical data repository, and their associated trade-offs, are also discussed.
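    The declarative if-then pattern the abstract describes can be sketched as a tiny forward-chaining loop over a dictionary of facts: rules fire when their condition holds, assert new facts, and the loop repeats until nothing changes. The rule content, thresholds, and fact names below are invented for illustration; the actual system uses a production rules engine and an OWL reasoner, not this toy.

```python
# Tiny forward-chaining if-then rule engine over a dict of facts.
# Rules, thresholds, and fact names are illustrative only.
rules = [
    # (condition over facts, (fact key, value) to assert)
    (lambda f: f.get("hba1c", 0) >= 6.5,
     ("classification", "diabetic")),
    (lambda f: f.get("classification") == "diabetic"
               and "metformin" not in f.get("medications", []),
     ("alert", "consider first-line therapy review")),
]

def run(facts):
    changed = True
    while changed:                  # iterate to a fixed point
        changed = False
        for condition, (key, value) in rules:
            if condition(facts) and facts.get(key) != value:
                facts[key] = value  # assert the conclusion as a new fact
                changed = True
    return facts

result = run({"hba1c": 7.1, "medications": []})
print(result["classification"], "/", result["alert"])
```

    Note how the second rule depends on a fact asserted by the first; this chaining is what distinguishes rule execution from the one-shot class-membership inferences that the paper delegates to the ontology engine.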

  16. Microservices in Web Objects Enabled IoT Environment for Enhancing Reusability

    PubMed Central

    Chong, Ilyoung

    2018-01-01

    In the ubiquitous Internet of Things (IoT) environment, reusing objects instead of creating new ones has become important in academia and industry. The situation becomes complex due to the availability of a huge number of connected IoT objects, where each individual service creates a new object instead of reusing an existing one to fulfill a requirement. A well-standardized mechanism not only improves the reusability of objects but also improves service modularity and extensibility, and reduces cost. A Web Objects enabled IoT environment applies the principle of reusability of objects across multiple IoT application domains through a central objects repository and microservices. To reuse objects with microservices and to maintain relationships with them, this study presents an architecture for the Web of Objects platform. In the case of a similar request for an object, an already instantiated object that exists in the same or another domain can be reused. Reuse of objects through microservices avoids duplication and reduces the time to search for and instantiate them from their registries. Further, this article presents an algorithm for the discovery of microservices and related objects that considers the reusability of objects through the central objects repository. To support the reusability of objects, the necessary algorithm for object matching is also presented. To realize the reusability of objects in a Web Objects enabled IoT environment, a prototype has been designed and implemented based on a use case scenario. Finally, the results of the prototype have been analyzed and discussed to validate the proposed approach. PMID:29373491

  17. Microservices in Web Objects Enabled IoT Environment for Enhancing Reusability.

    PubMed

    Jarwar, Muhammad Aslam; Kibria, Muhammad Golam; Ali, Sajjad; Chong, Ilyoung

    2018-01-26

    In the ubiquitous Internet of Things (IoT) environment, reusing objects instead of creating new ones has become important in academia and industry. The situation becomes complex due to the availability of a huge number of connected IoT objects, where each individual service creates a new object instead of reusing an existing one to fulfill a requirement. A well-standardized mechanism not only improves the reusability of objects but also improves service modularity and extensibility, and reduces cost. A Web Objects enabled IoT environment applies the principle of reusability of objects across multiple IoT application domains through a central objects repository and microservices. To reuse objects with microservices and to maintain relationships with them, this study presents an architecture for the Web of Objects platform. In the case of a similar request for an object, an already instantiated object that exists in the same or another domain can be reused. Reuse of objects through microservices avoids duplication and reduces the time to search for and instantiate them from their registries. Further, this article presents an algorithm for the discovery of microservices and related objects that considers the reusability of objects through the central objects repository. To support the reusability of objects, the necessary algorithm for object matching is also presented. To realize the reusability of objects in a Web Objects enabled IoT environment, a prototype has been designed and implemented based on a use case scenario. Finally, the results of the prototype have been analyzed and discussed to validate the proposed approach.
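    The reuse-before-create policy described in the abstract amounts to a registry lookup: return an existing instance when a request matches one already in the repository, and instantiate only on a miss. The sketch below shows that policy in miniature; the class, keys, and object representation are hypothetical, not the Web of Objects API.

```python
# Central object registry that reuses an already-instantiated object
# when a request matches, instantiating only on a miss.
# All class and attribute names are illustrative.
class ObjectRegistry:
    def __init__(self):
        self._objects = {}           # (domain, object_type) -> instance
        self.instantiations = 0      # counts actual object creations

    def acquire(self, domain, object_type):
        key = (domain, object_type)
        if key not in self._objects:             # miss: create and register
            self._objects[key] = {"domain": domain, "type": object_type}
            self.instantiations += 1
        return self._objects[key]                # hit: reuse existing object

registry = ObjectRegistry()
a = registry.acquire("smart-home", "temperature-sensor")
b = registry.acquire("smart-home", "temperature-sensor")   # reused, not recreated
print(a is b, registry.instantiations)
```

    A real matching algorithm would of course compare richer object metadata than an exact key, and could also return candidates registered under other domains, but the miss/hit structure is the same.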

  18. Scanning for Digitization Projects

    ERIC Educational Resources Information Center

    Wentzel, Larry

    2007-01-01

    Librarians and archivists find themselves facing the prospect of digitization. Everyone is doing it, everyone needs it. Discussions rage nationally and internationally concerning what to digitize and the best means to present and retain digital objects. Digitization is the act of making something digital, expressing a physical object "in numerical…

  19. New Catalog of Resources Enables Paleogeosciences Research

    NASA Astrophysics Data System (ADS)

    Lingo, R. C.; Horlick, K. A.; Anderson, D. M.

    2014-12-01

    The 21st century promises a new era for scientists of all disciplines, the age where cyber infrastructure enables research and education and fuels discovery. EarthCube is a working community of over 2,500 scientists and students of many Earth Science disciplines who are looking to build bridges between disciplines. The EarthCube initiative will create a digital infrastructure that connects databases, software, and repositories. A catalog of resources (databases, software, repositories) has been produced by the Research Coordination Network for Paleogeosciences to improve the discoverability of resources. The Catalog is currently made available within the larger-scope CINERGI geosciences portal (http://hydro10.sdsc.edu/geoportal/catalog/main/home.page). Other distribution points and web services are planned, using linked data, content services for the web, and XML descriptions that can be harvested using metadata protocols. The databases provide searchable interfaces to find data sets that would otherwise remain dark data, hidden in drawers and on personal computers. The software will be described in catalog entries so just one click will lead users to methods and analytical tools that many geoscientists were unaware of. The repositories listed in the Paleogeosciences Catalog contain physical samples found all across the globe, from natural history museums to the basements of university buildings. EarthCube has over 250 databases, 300 software systems, and 200 repositories which will grow in the coming year. When completed, geoscientists across the world will be connected into a productive workflow for managing, sharing, and exploring geoscience data and information that expedites collaboration and innovation within the paleogeosciences, potentially bringing about new interdisciplinary discoveries.

  20. De-identification of Medical Images with Retention of Scientific Research Value

    PubMed Central

    Maffitt, David R.; Smith, Kirk E.; Kirby, Justin S.; Clark, Kenneth W.; Freymann, John B.; Vendt, Bruce A.; Tarbox, Lawrence R.; Prior, Fred W.

    2015-01-01

    Online public repositories for sharing research data allow investigators to validate existing research or perform secondary research without the expense of collecting new data. Patient data made publicly available through such repositories may constitute a breach of personally identifiable information if not properly de-identified. Imaging data are especially at risk because some intricacies of the Digital Imaging and Communications in Medicine (DICOM) format are not widely understood by researchers. If imaging data still containing protected health information (PHI) were released through a public repository, a number of different parties could be held liable, including the original researcher who collected and submitted the data, the original researcher’s institution, and the organization managing the repository. To minimize these risks through proper de-identification of image data, one must understand what PHI exists and where that PHI resides, and one must have the tools to remove PHI without compromising the scientific integrity of the data. DICOM public elements are defined by the DICOM Standard. Modality vendors use private elements to encode acquisition parameters that are not yet defined by the DICOM Standard, or the vendor may not have updated an existing software product after DICOM defined new public elements. Because private elements are not standardized, a common de-identification practice is to delete all private elements, removing scientifically useful data as well as PHI. Researchers and publishers of imaging data can use the tools and process described in this article to de-identify DICOM images according to current best practices. ©RSNA, 2015 PMID:25969931
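    The blanket deletion of private elements that the article describes as common practice follows from a rule in the DICOM standard: private data elements occupy odd-numbered groups, while standard public elements occupy even-numbered groups. The stdlib-only sketch below applies that rule to a fabricated tag-to-value mapping; real pipelines would operate on DICOM files with a library such as pydicom.

```python
# Drop all DICOM private elements. Per the DICOM standard, private
# data elements live in odd-numbered groups, so filtering on group
# parity removes them all. The toy dataset below is fabricated.
def strip_private_elements(dataset):
    return {
        (group, element): value
        for (group, element), value in dataset.items()
        if group % 2 == 0            # even group -> standard public element
    }

dataset = {
    (0x0008, 0x0060): "MR",                   # Modality (public)
    (0x0010, 0x0010): "DOE^JANE",             # PatientName (public; this PHI
                                              # needs a separate removal step)
    (0x0029, 0x1010): "vendor-private-blob",  # private group 0x0029
}

public_only = strip_private_elements(dataset)
print(sorted(public_only))
```

    As the abstract emphasizes, this crude filter also discards scientifically useful private acquisition parameters, and it does nothing about PHI held in public elements; that is exactly the tension the article's more selective de-identification process addresses.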

  1. Repository Drift Backfilling Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Londe, I.; Dubois, J.Ph.; Bauer, C.

    2008-07-01

    The 'Backfilling Demonstrator' is one of the technological demonstrators developed by ANDRA in the framework of the feasibility studies for a geological repository for high-level long-lived (HL-LL) waste within a clay formation. The demonstrator concerns the standard and supporting backfills as defined in Andra's 2005 design. The standard backfill is intended to fill up almost all drifts of the underground repository in order to limit any deformation of the rock after the degradation of the drift lining. The supporting backfill only concerns a small portion of the volume to be backfilled, in order to counter the swelling pressure of the swelling clay contained in the sealing structures. The first objective of the demonstrator was to show the possibility of manufacturing a satisfactory backfill, in spite of the exiguity of the underground structures, and of reusing as much as possible the argillite muck. For the purpose of this experiment, the argillite muck was collected on Andra's work-site for the implementation of an underground research laboratory. Still ongoing, the second objective is to follow up the long-term evolution of the backfill. Approximately 200 m³ of compacted backfill material have been gathered in a large concrete tube simulating a repository drift. The standard backfill was manufactured exclusively with argillite. The supporting backfill was made by forming a mixture of argillite and sand. Operations were carried out mostly at Richwiller, close to Mulhouse, France. The objectives of the demonstrator were met: an application method was tested and proven satisfactory. The resulting dry densities are relatively high, although the moduli of deformation do not always reach the set goal. The selected objective for the demonstrator was a dry density corresponding to a relatively high compaction level (95% of the standard Proctor optimum [SPO]), for both pure argillite and the argillite-sand mixture.
The plate-percussion compaction technique was used and proved satisfactory. The measured dry densities are higher than the 95%-SPO objective. The implementation rates remain very low due to the experimental conditions involved. The metal supply mode would need to be revised before any industrial application is contemplated. The Demonstrator Program started in August 2004 and is followed up today over the long term. With that objective in mind, sensors and a water-saturation system have been installed. (author)

  2. The Internet of Scientific Research Things

    NASA Astrophysics Data System (ADS)

    Chandler, Cynthia; Shepherd, Adam; Arko, Robert; Leadbetter, Adam; Groman, Robert; Kinkade, Danie; Rauch, Shannon; Allison, Molly; Copley, Nancy; Gegg, Stephen; Wiebe, Peter; Glover, David

    2016-04-01

    The sum of the parts is greater than the whole, but for scientific research how do we identify the parts when they are curated at distributed locations? Results from environmental research represent an enormous investment and constitute essential knowledge required to understand our planet in this time of rapid change. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) curates data from US NSF Ocean Sciences funded research awards, but BCO-DMO is only one repository in a landscape that includes many other sites that carefully curate results of scientific research. Recent efforts to use persistent identifiers (PIDs), most notably Open Researcher and Contributor ID (ORCiD) for person, Digital Object Identifier (DOI) for publications including data sets, and Open Funder Registry (FundRef) codes for research grants and awards are realizing success in unambiguously identifying the pieces that represent results of environmental research. This presentation uses BCO-DMO as a test case for adding PIDs to the locally-curated information published out as standards compliant metadata records. We present a summary of progress made thus far; what has worked and why, and thoughts on logical next steps.
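
A sketch of what embedding PIDs in a locally-curated metadata record can look like, here as a plain JSON record (all identifier values and field names below are invented for illustration and are not BCO-DMO's actual schema):

```python
import json

# Every identifier below is a made-up placeholder, not a real DOI/ORCID/FundRef ID.
record = {
    "title": "Example cruise dataset",
    "identifier": "https://doi.org/10.0000/example-dataset",   # DOI for the data set
    "creator": {
        "name": "A. Researcher",
        "orcid": "https://orcid.org/0000-0000-0000-0000",      # person PID
    },
    "funder": {
        "name": "Example Funding Agency",
        "fundref": "https://doi.org/10.13039/000000000",       # funder registry PID
    },
}
serialized = json.dumps(record, indent=2)
```

Because each party (person, dataset, funder) carries a globally resolvable identifier, harvesters can link the pieces unambiguously across repositories.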

  3. A Safety Case Approach for Deep Geologic Disposal of DOE HLW and DOE SNF in Bedded Salt - 13350

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevougian, S. David; MacKinnon, Robert J.; Leigh, Christi D.

    2013-07-01

    The primary objective of this study is to investigate the feasibility and utility of developing a defensible safety case for disposal of United States Department of Energy (U.S. DOE) high-level waste (HLW) and DOE spent nuclear fuel (SNF) in a conceptual deep geologic repository that is assumed to be located in a bedded salt formation of the Delaware Basin [1]. A safety case is a formal compilation of evidence, analyses, and arguments that substantiate and demonstrate the safety of a proposed or conceptual repository. We conclude that a strong initial safety case for potential licensing can be readily compiled by capitalizing on the extensive technical basis that exists from prior work on the Waste Isolation Pilot Plant (WIPP), other U.S. repository development programs, and the work published through international efforts in salt repository programs such as in Germany. The potential benefits of developing a safety case include leveraging previous investments in WIPP to reduce future new repository costs, enhancing the ability to effectively plan for a repository and its licensing, and possibly expediting a schedule for a repository. A safety case will provide the necessary structure for organizing and synthesizing existing salt repository science and identifying any issues and gaps pertaining to safe disposal of DOE HLW and DOE SNF in bedded salt. The safety case synthesis will help DOE to plan its future R&D activities for investigating salt disposal using a risk-informed approach that prioritizes test activities that include laboratory, field, and underground investigations. It should be emphasized that the DOE has not made any decisions regarding the disposition of DOE HLW and DOE SNF. Furthermore, the safety case discussed herein is not intended to either site a repository in the Delaware Basin or preclude siting in other media at other locations.
Rather, this study simply presents an approach for accelerated development of a safety case for a potential DOE HLW and DOE SNF repository using the currently available technical basis for bedded salt. This approach includes a summary of the regulatory environment relevant to disposal of DOE HLW and DOE SNF in a deep geologic repository, the key elements of a safety case, the evolution of the safety case through the successive phases of repository development and licensing, and the existing technical basis that could be used to substantiate the safety of a geologic repository if it were to be sited in the Delaware Basin. We also discuss the potential role of an underground research laboratory (URL). (authors)

  4. Organization of marine phenology data in support of planning and conservation in ocean and coastal ecosystems

    USGS Publications Warehouse

    Thomas, Kathryn A.; Fornwall, Mark D.; Weltzin, Jake F.; Griffis, R.B.

    2014-01-01

    Among the many effects of climate change is its influence on the phenology of biota. In marine and coastal ecosystems, phenological shifts have been documented for multiple life forms; however, biological data related to marine species' phenology remain difficult to access and are under-used. We conducted an assessment of potential sources of biological data for marine species and their availability for use in phenological analyses and assessments. Our evaluations showed that data potentially related to understanding marine species' phenology are available through online resources of governmental, academic, and non-governmental organizations, but appropriate datasets are often difficult to discover and access, presenting opportunities for scientific infrastructure improvement. The developing Federal Marine Data Architecture, when fully implemented, will improve data flow and standardization for marine data within major federal repositories and provide an archival repository for collaborating academic and public data contributors. Another opportunity, largely untapped, is the engagement of citizen scientists in standardized collection of marine phenology data and contribution of these data to established data flows. Use of metadata with marine phenology related keywords could improve discovery and access to appropriate datasets. When data originators choose to self-publish, publication of research datasets with a digital object identifier, linked to metadata, will also improve subsequent discovery and access. Phenological changes in the marine environment will affect human economics, food systems, and recreation. No one source of data will be sufficient to understand these changes.
The collective attention of marine data collectors is needed—whether with an agency, an educational institution, or a citizen scientist group—toward adopting the data management processes and standards needed to ensure availability of sufficient and useable marine data to understand marine phenology.

  5. New Incentives to Stimulate Data Publication

    NASA Astrophysics Data System (ADS)

    Urban, E. R.; Lowry, R.; Pissierssens, P.

    2008-12-01

    Data from ocean observations and experiments often are not submitted to appropriate data centers, or if they are submitted, may not be easily retrievable. These problems arise for a variety of reasons. Data are not always submitted, even when required by the agency funding the research, because the rewards for submitting data are not strong enough. Once data are submitted, the typical data center disaggregates the data into its component parameters, so it is difficult to get all the data related to a particular experiment back out of the system. With the advent of persistent identifiers, like digital object identifiers, the rapid evolution of the high-speed Internet, and the availability of large digital storage capacities that enable the transfer and storage of comprehensive data sets, it is now possible to restructure data management in a way that will create new incentives for ocean scientists to submit their data, for others to use it, and for the originating scientists to get credit for their effort and creativity in collecting the data. This presentation will report on a new activity of the Scientific Committee on Oceanic Research and the International Ocean Data and Information Exchange of UNESCO's Intergovernmental Oceanographic Commission that is mapping out new ways to (1) submit the data underlying the figures and tables in traditionally published papers to a recognized repository and link it to the publication, and (2) stimulate the submission of data publications that can be cited on originating scientists' CVs.

  6. Learning Objects, Repositories, Sharing and Reusability

    ERIC Educational Resources Information Center

    Koppi, Tony; Bogle, Lisa; Bogle, Mike

    2005-01-01

    The online Learning Resource Catalogue (LRC) Project has been part of an international consortium for several years and currently includes 25 institutions worldwide. The LRC Project has evolved for several pragmatic reasons into an academic network whereby members can identify and share reusable learning objects as well as collaborate in a number…

  7. The Effects of Discipline on the Application of Learning Object Metadata in UK Higher Education: The Case of the Jorum Repository

    ERIC Educational Resources Information Center

    Balatsoukas, Panos; O'Brien, Ann; Morris, Anne

    2011-01-01

    Introduction: This paper reports on the findings of a study investigating the potential effects of discipline (sciences and engineering versus humanities and social sciences) on the application of the Institute of Electrical and Electronic Engineers learning object metadata elements for the description of learning objects in the Jorum learning…

  8. Site characterization progress report: Yucca Mountain, Nevada. Number 15, April 1--September 30, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-04-01

    During the second half of fiscal year 1996, activities at the Yucca Mountain Site Characterization Project (Project) supported the objectives of the revised Program Plan released this period by the Office of Civilian Radioactive Waste Management of the US Department of Energy (Department). Outlined in the revised plan is a focused, integrated program of site characterization, design, engineering, environmental, and performance assessment activities that will achieve key Program and statutory objectives. The plan will result in the development of a license application for repository construction at Yucca Mountain, if the site is found suitable. Activities this period focused on two of the three near-term objectives of the revised plan: updating in 1997 the regulatory framework for determining the suitability of the site for the proposed repository concept and providing information for a 1998 viability assessment of continuing toward the licensing of a repository. The Project has also developed a new design approach that uses the advanced conceptual design published during the last reporting period as a base for developing a design that will support the viability assessment. The initial construction phase of the Thermal Testing Facility was completed and the first phase of the in situ heater tests began on schedule. In addition, phase-one construction was completed for the first of two alcoves that will provide access to the Ghost Dance fault.

  9. Representing Value as Digital Object: A Discussion of Transferability and Anonymity; Digital Library Initiatives of the Deutsche Forschungsgemeinschaft; CrossRef Turns One; Fermi National Accelerator Laboratory (Fermilab).

    ERIC Educational Resources Information Center

    Kahn, Robert E.; Lyons, Patrice A.; Brahms, Ewald; Brand, Amy; van den Bergen, Mieke

    2001-01-01

    Includes four articles that discuss the use of digital objects to represent value in a network environment; digital library initiatives at the central public funding organization for academic research in Germany; an application of the Digital Object Identifier System; and the Web site of the Fermi National Accelerator Laboratory. (LRW)

  10. Open access: changing global science publishing.

    PubMed

    Gasparyan, Armen Yuri; Ayvazyan, Lilit; Kitas, George D

    2013-08-01

    The article reflects on open access as a strategy of changing the quality of science communication globally. Successful examples of open-access journals are presented to highlight implications of archiving in open digital repositories for the quality and citability of research output. Advantages and downsides of gold, green, and hybrid models of open access operating in diverse scientific environments are described. It is assumed that open access is a global trend which influences the workflow in scholarly journals, changing their quality, credibility, and indexability.

  11. Joint Information Theoretic and Differential Geometrical Approach for Robust Automated Target Recognition

    DTIC Science & Technology

    2012-02-29

    surface and Swiss roll) and real-world data sets (UCI Machine Learning Repository [12] and USPS digit handwriting data). In our experiments, we use...less than µn (say µ = 0.8), we can first use a screening technique to select µn candidate nodes, and then apply BIPS on them for further selection and...identified from node j to node i. So we can say the probability for the existence of this connection is approximately 82%. Given the probability matrix

  12. Image microarrays (IMA): Digital pathology's missing tool

    PubMed Central

    Hipp, Jason; Cheng, Jerome; Pantanowitz, Liron; Hewitt, Stephen; Yagi, Yukako; Monaco, James; Madabhushi, Anant; Rodriguez-canales, Jaime; Hanson, Jeffrey; Roy-Chowdhuri, Sinchita; Filie, Armando C.; Feldman, Michael D.; Tomaszewski, John E.; Shih, Natalie NC.; Brodsky, Victor; Giaccone, Giuseppe; Emmert-Buck, Michael R.; Balis, Ulysses J.

    2011-01-01

    Introduction: The increasing availability of whole slide imaging (WSI) data sets (digital slides) from glass slides offers new opportunities for the development of computer-aided diagnostic (CAD) algorithms. With the all-digital pathology workflow that these data sets will enable in the near future, literally millions of digital slides will be generated and stored. Consequently, the field in general and pathologists, specifically, will need tools to help extract actionable information from this new and vast collective repository. Methods: To address this limitation, we designed and implemented a tool (dCORE) to enable the systematic capture of image tiles with constrained size and resolution that contain desired histopathologic features. Results: In this communication, we describe a user-friendly tool that will enable pathologists to mine digital slides archives to create image microarrays (IMAs). IMAs are to digital slides as tissue microarrays (TMAs) are to cell blocks. Thus, a single digital slide could be transformed into an array of hundreds to thousands of high-quality digital images, with each containing key diagnostic morphologies and appropriate controls. Current manual digital image cut-and-paste methods that allow for the creation of a grid of images (such as an IMA) of matching resolutions are tedious. Conclusion: The ability to create IMAs representing hundreds to thousands of vetted morphologic features has numerous applications in education, proficiency testing, consensus case review, and research. Lastly, in a manner analogous to the way conventional TMA technology has significantly accelerated in situ studies of tissue specimens, the use of IMAs has similar potential to significantly accelerate CAD algorithm development. PMID:22200030

  13. Optical digitizing and strategies to combine different views of an optical sensor

    NASA Astrophysics Data System (ADS)

    Duwe, Hans P.

    1997-09-01

    Non-contact digitization of objects and surfaces with optical sensors based on fringe or pattern projection in combination with a CCD camera allows a representation of surfaces as pointclouds (x, y, z data points). To digitize the total surface of an object, it is necessary to combine the measurement data obtained by the optical sensor from different views. Depending on the size of the object and the required accuracy of the measured data, different sensor set-ups with a handling system or a combination of linear and rotation axes are described. Furthermore, strategies to match the overlapping pointclouds of a digitized object are introduced. This is especially important for the digitization of large objects such as 1:1 car models. With different sensor sizes, it is possible to digitize small objects such as teeth, crowns, and inlays with an overall accuracy of 20 micrometers, as well as large objects such as car models with a total accuracy of 0.5 mm. The various applications in the field of optical digitization are described.
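
Combining views as described requires transforming each view's pointcloud into a common object frame using the known sensor pose. A minimal sketch, assuming a turntable set-up with a known rotation angle about the vertical axis (a pure rigid transform; real systems additionally refine the pose by matching the overlapping pointclouds):

```python
import math

def transform_view(points, angle_deg, translation):
    """Rotate (about the z axis) and translate one view's (x, y, z) points
    into the common object frame; angle and translation come from the
    known pose of the handling system or rotation axis."""
    a = math.radians(angle_deg)
    tx, ty, tz = translation
    out = []
    for x, y, z in points:
        out.append((x * math.cos(a) - y * math.sin(a) + tx,
                    x * math.sin(a) + y * math.cos(a) + ty,
                    z + tz))
    return out

# A second view of the same surface patch, acquired 90 degrees later
# on the turntable, mapped back into the first view's frame.
view_b = transform_view([(1.0, 0.0, 0.5)], 90.0, (0.0, 0.0, 0.0))
```

Once all views share one frame, the overlapping regions can be merged into a single pointcloud of the whole object.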

  14. Spatial-Heterodyne Interferometry For Reflection And Transmission (SHIRT) Measurements

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN; Tobin, Ken W [Harriman, TN

    2006-02-14

    Systems and methods are described for spatial-heterodyne interferometry for reflection and transmission (SHIRT) measurements. A method includes digitally recording a first spatially-heterodyned hologram using a first reference beam and a first object beam; digitally recording a second spatially-heterodyned hologram using a second reference beam and a second object beam; Fourier analyzing the digitally recorded first spatially-heterodyned hologram to define a first analyzed image; Fourier analyzing the digitally recorded second spatially-heterodyned hologram to define a second analyzed image; digitally filtering the first analyzed image to define a first result; digitally filtering the second analyzed image to define a second result; performing a first inverse Fourier transform on the first result; and performing a second inverse Fourier transform on the second result. The first object beam is transmitted through an object that is at least partially translucent, and the second object beam is reflected from the object.
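
The Fourier-analyze / digitally-filter / inverse-transform chain in the claim can be illustrated on a one-dimensional toy fringe pattern (a sketch of the general technique, not the patented method; the carrier position and filter width below are chosen to suit the synthetic signal):

```python
import numpy as np

def demodulate_heterodyne(signal, carrier_bin, halfwidth):
    """Sketch of the recorded steps: Fourier transform the spatially
    heterodyned record, keep only a band around the heterodyne carrier
    (the digital filter), and inverse transform the result."""
    spec = np.fft.fft(signal)
    mask = np.zeros_like(spec)
    mask[carrier_bin - halfwidth : carrier_bin + halfwidth + 1] = 1.0
    return np.fft.ifft(spec * mask)

n = 256
x = np.arange(n)
# Toy hologram: DC background plus a fringe pattern at carrier bin 32.
hologram = 1.0 + 0.5 * np.cos(2 * np.pi * 32 * x / n)
recovered = demodulate_heterodyne(hologram, carrier_bin=32, halfwidth=2)
```

Selecting one sideband of the cosine leaves a complex exponential of constant magnitude 0.25 (half the fringe amplitude), from which amplitude and phase can be read off directly.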

  15. A recommendation module to help teachers build courses through the Moodle Learning Management System

    NASA Astrophysics Data System (ADS)

    Limongelli, Carla; Lombardi, Matteo; Marani, Alessandro; Sciarrone, Filippo; Temperini, Marco

    2016-01-01

    In traditional e-learning, teachers design sets of Learning Objects (LOs) and organize their sequencing; the material implementing the LOs could be either built anew or adopted from elsewhere (e.g. from standard-compliant repositories) and reused. This task is applicable also when the teacher works in a system for personalized e-learning. In this case, the burden actually increases: for instance, the LOs may need adaptation to the system, through additional metadata. This paper presents a module that gives some support to the operations of retrieving, analyzing, and importing LOs from a set of standard Learning Objects Repositories, acting as a recommending system. In particular, it is designed to support the teacher in the phases of (i) retrieval of LOs, through a keyword-based search mechanism applied to the selected repositories; (ii) analysis of the returned LOs, whose information is enriched by a concept of relevance metric, based on both the results of the searching operation and the data related to the previous use of the LOs in the courses managed by the Learning Management System; and (iii) LO importation into the course under construction.
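
The relevance metric in step (ii) combines the keyword-search result with data on the LOs' previous use in the LMS. A hypothetical weighted blend in that spirit (the paper's actual formula may differ):

```python
def relevance(search_score, times_used, max_used, alpha=0.7):
    """Hypothetical relevance blend: `search_score` is a 0..1 keyword-match
    score from the repository search; `times_used` / `max_used` normalizes
    how often the LO was reused in previously managed courses."""
    usage = times_used / max_used if max_used else 0.0
    return alpha * search_score + (1 - alpha) * usage

# A frequently reused LO can outrank one with a slightly better text match.
ranked = sorted(
    [("lo-1", relevance(0.9, times_used=2, max_used=10)),
     ("lo-2", relevance(0.6, times_used=10, max_used=10))],
    key=lambda t: t[1], reverse=True)
```

The weight `alpha` controls how much the recommender trusts the search engine versus the teachers' revealed preferences.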

  16. US EPA Digital Science: An Evolution

    NASA Astrophysics Data System (ADS)

    Ziegler, C. R.; Burch, K.; Laniak, G.; Vega, A.; Harten, P.; Kremer, J.; Brookes, A.; Yuen, A.; Subramanian, B.

    2015-12-01

    The United States Environmental Protection Agency's (US EPA) digital science "enterprise" plays a critical role in US EPA's efforts to achieve its mission to protect human health and the environment. This enterprise is an evolving cross-disciplinary research and development construct, with social and institutional dimensions. It has an active development community and produces a portfolio of digital science products including decision support tools, data repositories, Web interfaces, and more. Earth sciences and sustainable development organizations from around the world - including US government agencies - have achieved various levels of success in taking advantage of the rapidly-evolving digital age. Efficiency, transparency and ability to innovate are tied to an organization's digital maturity and related social characteristics. Concepts like participatory web, data and software interoperability, global technology transfer, ontological harmonization, big data, scaling, re-use and open science are no longer "new and emerging." They have emerged and - in some cases - are tied to US government directives. We assess maturity, describe future scenarios, discuss new initiatives and outline steps for better leveraging the information age to more effectively and efficiently achieve US EPA's mission. The views expressed herein are those of the authors and do not necessarily reflect the views or policies of the organizations for which they work and/or represent.

  17. Web-of-Objects (WoO)-Based Context Aware Emergency Fire Management Systems for the Internet of Things

    PubMed Central

    Shamszaman, Zia Ush; Ara, Safina Showkat; Chong, Ilyoung; Jeong, Youn Kwae

    2014-01-01

    Recent advancements in the Internet of Things (IoT) and the Web of Things (WoT) accompany a smart life where real world objects, including sensing devices, are interconnected with each other. The Web representation of smart objects empowers innovative applications and services for various domains. To accelerate this approach, Web of Objects (WoO) focuses on the implementation aspects of bringing the assorted real world objects to the Web applications. In this paper, we propose an emergency fire management system in the WoO infrastructure. Consequently, we integrate the formation and management of Virtual Objects (ViO) which are derived from real world physical objects and are virtually connected with each other into the semantic ontology model. The charm of using the semantic ontology is that it allows information reusability, extensibility and interoperability, which enable ViOs to uphold orchestration, federation, collaboration and harmonization. Our system is context aware, as it receives contextual environmental information from distributed sensors and detects emergency situations. To handle a fire emergency, we present a decision support tool for the emergency fire management team. The previous fire incident log is the basis of the decision support system. A log repository collects all the emergency fire incident logs from ViOs and stores them in a repository. PMID:24531299

  18. Web-of-Objects (WoO)-based context aware emergency fire management systems for the Internet of Things.

    PubMed

    Shamszaman, Zia Ush; Ara, Safina Showkat; Chong, Ilyoung; Jeong, Youn Kwae

    2014-02-13

    Recent advancements in the Internet of Things (IoT) and the Web of Things (WoT) accompany a smart life where real world objects, including sensing devices, are interconnected with each other. The Web representation of smart objects empowers innovative applications and services for various domains. To accelerate this approach, Web of Objects (WoO) focuses on the implementation aspects of bringing the assorted real world objects to the Web applications. In this paper, we propose an emergency fire management system in the WoO infrastructure. Consequently, we integrate the formation and management of Virtual Objects (ViO) which are derived from real world physical objects and are virtually connected with each other into the semantic ontology model. The charm of using the semantic ontology is that it allows information reusability, extensibility and interoperability, which enable ViOs to uphold orchestration, federation, collaboration and harmonization. Our system is context aware, as it receives contextual environmental information from distributed sensors and detects emergency situations. To handle a fire emergency, we present a decision support tool for the emergency fire management team. The previous fire incident log is the basis of the decision support system. A log repository collects all the emergency fire incident logs from ViOs and stores them in a repository.

  19. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
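
The intensity-centering half of ICHE can be sketched as shifting each image's intensities so they center on a common target (a simplification: the published method scales histogram centroids and follows with modified CLAHE, neither of which is reproduced here; the mean stands in for the centroid):

```python
def center_intensities(pixels, target=128.0):
    """Shift a flattened image's intensities so the mean (standing in for
    the histogram centroid) lands on a common target value, clipping to
    the valid 8-bit range. This removes a per-batch brightness offset."""
    mean = sum(pixels) / len(pixels)
    shift = target - mean
    return [min(255.0, max(0.0, p + shift)) for p in pixels]

# Two "batches" of the same tissue scanned under different conditions:
batch_a = center_intensities([100, 110, 120])   # dim batch, mean 110
batch_b = center_intensities([150, 160, 170])   # bright batch, mean 160
```

After centering, both batches share the same mean intensity, so downstream features no longer encode the staining-condition offset.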

  20. The application of digital image plane holography technology to identify Chinese herbal medicine

    NASA Astrophysics Data System (ADS)

    Wang, Huaying; Guo, Zhongjia; Liao, Wei; Zhang, Zhihui

    2012-03-01

    In this paper, the imaging technology of digital image plane holography, used to identify Chinese herbal medicine, is studied. An optical experimental system for digital image plane holography, a special case of pre-magnification digital holography, was built. In the recording system, the object is illuminated with a plane wave, and the hologram is recorded using a spherical wave as the reference light. A microscope objective is placed behind the object. The secondary phase factor introduced by the microscope objective can be eliminated by choosing the proper position of the reference point source when the hologram is recorded with spherical light. In the experiment, Lygodium cells and onion cells were used as objects. The results show that digital image plane holography avoids the search for the recording distance by an auto-focusing approach, and that the phase information of the object can be reconstructed more accurately. Digital image plane holography is thus well suited to the microscopic imaging of cells and to the identification of Chinese herbal medicine, promoting the application of digital holography in practice.

  1. Multiple Criteria Evaluation of Quality and Optimisation of e-Learning System Components

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Dagiene, Valentina

    2010-01-01

    The main research object of the paper is investigation and proposal of the comprehensive Learning Object Repositories (LORs) quality evaluation tool suitable for their multiple criteria decision analysis, evaluation and optimisation. Both LORs "internal quality" and "quality in use" evaluation (decision making) criteria are analysed in the paper.…

  2. Ontology-Based Annotation of Learning Object Content

    ERIC Educational Resources Information Center

    Gasevic, Dragan; Jovanovic, Jelena; Devedzic, Vladan

    2007-01-01

    The paper proposes a framework for building ontology-aware learning object (LO) content. Previously ontologies were exclusively employed for enriching LOs' metadata. Although such an approach is useful, as it improves retrieval of relevant LOs from LO repositories, it does not enable one to reuse components of a LO, nor to incorporate an explicit…

  3. Model of Distributed Learning Objects Repository for a Heterogenic Internet Environment

    ERIC Educational Resources Information Center

    Kaczmarek, Jerzy; Landowska, Agnieszka

    2006-01-01

    In this article, an extension of the existing structure of learning objects is described. The solution addresses the problem of the access and discovery of educational resources in the distributed Internet environment. An overview of e-learning standards, reference models, and problems with educational resources delivery is presented. The paper…

  4. A Content Standard for Computational Models; Digital Rights Management (DRM) Architectures; A Digital Object Approach to Interoperable Rights Management: Finely-Grained Policy Enforcement Enabled by a Digital Object Infrastructure; LOCKSS: A Permanent Web Publishing and Access System; Tapestry of Time and Terrain.

    ERIC Educational Resources Information Center

    Hill, Linda L.; Crosier, Scott J.; Smith, Terrence R.; Goodchild, Michael; Iannella, Renato; Erickson, John S.; Reich, Vicky; Rosenthal, David S. H.

    2001-01-01

    Includes five articles. Topics include requirements for a content standard to describe computational models; architectures for digital rights management systems; access control for digital information objects; LOCKSS (Lots of Copies Keep Stuff Safe) that allows libraries to run Web caches for specific journals; and a Web site from the U.S.…

  5. OntoVIP: an ontology for the annotation of object models used for medical image simulation.

    PubMed

    Gibaud, Bernard; Forestier, Germain; Benoit-Cattin, Hugues; Cervenansky, Frédéric; Clarysse, Patrick; Friboulet, Denis; Gaignard, Alban; Hugonnard, Patrick; Lartizien, Carole; Liebgott, Hervé; Montagnat, Johan; Tabary, Joachim; Glatard, Tristan

    2014-12-01

    This paper describes the creation of a comprehensive conceptualization of object models used in medical image simulation, suitable for major imaging modalities and simulators. The goal is to create an application ontology that can be used to annotate the models in a repository integrated in the Virtual Imaging Platform (VIP), to facilitate their sharing and reuse. Annotations make the anatomical, physiological and pathophysiological content of the object models explicit. In such an interdisciplinary context we chose to rely on a common integration framework provided by a foundational ontology, that facilitates the consistent integration of the various modules extracted from several existing ontologies, i.e. FMA, PATO, MPATH, RadLex and ChEBI. Emphasis is put on methodology for achieving this extraction and integration. The most salient aspects of the ontology are presented, especially the organization in model layers, as well as its use to browse and query the model repository. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Analyzing Hidden Semantics in Social Bookmarking of Open Educational Resources

    NASA Astrophysics Data System (ADS)

    Minguillón, Julià

    Web 2.0 services such as social bookmarking allow users to manage and share the links they find interesting, adding their own tags to describe them. This is especially interesting in the field of open educational resources, as Delicious offers a simple way to bridge the institutional point of view (i.e. learning object repositories) with the individual one (i.e. personal collections), thus promoting the discovery and sharing of such resources by other users. In this paper we propose a methodology for analyzing such tags in order to discover hidden semantics (i.e. taxonomies and vocabularies) that can be used to improve descriptions of learning objects and make learning object repositories more visible and discoverable. We propose the use of a simple statistical analysis tool, principal component analysis, to discover which tags form clusters that can be semantically interpreted. We compare the results obtained against a collection of resources related to open educational resources, in order to better understand the real needs of people searching for such resources.
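    The tag-cluster analysis described above can be sketched as a principal component analysis over a resource-by-tag matrix. The tag vocabulary and bookmark data below are invented for illustration; this is not the paper's dataset or code.

```python
import numpy as np

# Hypothetical bookmark data: rows are resources, columns are binary
# tag assignments for a small, made-up tag vocabulary.
tags = ["oer", "opencourseware", "repository", "metadata", "physics"]
X = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
    [1, 0, 0, 0, 1],
], dtype=float)

# PCA via SVD of the mean-centred matrix: rows of Vt are the principal
# components, expressed as weights over the tags.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Tags that load strongly on the same component tend to co-occur and can
# be read as a candidate semantic cluster (0.4 is an arbitrary cutoff).
clusters = [[t for t, w in zip(tags, comp) if abs(w) > 0.4]
            for comp in Vt[:2]]
print(clusters)
```

    In practice the matrix would be built from real bookmarking data and the components inspected manually to decide which clusters are semantically meaningful.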

  7. Long-Term Information Management (LTIM) of Safeguards Data at Repositories: Phase II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haddal, Risa N.

    One of the challenges of implementing safeguards for geological repositories will be the long-term preservation of safeguards-related data for 100 years or more. While most countries considering the construction and operation of such facilities agree that safeguards information should be preserved, there are gaps with respect to standardized requirements, guidelines, timescales, and approaches. This study analyzes those gaps, conducts research to clarify stakeholder needs, identifies current policies, approaches, best practices and international standards, and explores existing safeguards information management infrastructure. The study also attempts to clarify what a safeguards data classification system might look like, how long data should be retained, and how information should be exchanged between stakeholders at different phases of a repository’s life cycle. The analysis produced a variety of recommendations on what information to preserve, how to preserve it, where to store it, retention options and how to exchange information in the long term. Key findings include the use of the globally recognized international records management standard, ISO 15489, for guidance on the development of information management systems, and the development of a Key Information File (KIF). The KIF could be used to identify only the most relevant, high-level safeguards information and the history of decision making about the repository. The study also suggests implementing on-site and off-site records storage in digital and physical form; developing a safeguards data classification system; long-term records retention with periodic reviews every 5 to 10 years during each phase of the repository life cycle; and establishing transition procedures well in advance so that data shepherds and records officers can transfer information to incoming facility managers effectively and efficiently. These and other recommendations are further analyzed in this study.

  8. Hybrid Multiagent System for Automatic Object Learning Classification

    NASA Astrophysics Data System (ADS)

    Gil, Ana; de La Prieta, Fernando; López, Vivian F.

    The rapid evolution within the context of e-learning is closely linked to international efforts on the standardization of learning object metadata, which provides learners in a web-based educational system with ubiquitous access to multiple distributed repositories. This article presents a hybrid agent-based architecture that enables the recovery of learning objects tagged in Learning Object Metadata (LOM) and provides individualized help with selecting learning materials to make the most suitable choice among many alternatives.

  9. An analysis of packaging formats for complex digital objects: review of principles

    NASA Astrophysics Data System (ADS)

    Bekaert, Jeroen L.; Hochstenbach, Patrick; De Kooning, Emiel; Van de Walle, Rik

    2003-11-01

    During recent years, the number of organizations making digital information available has massively increased. This evolution encouraged the development of standards for packaging and encoding digital representations of complex objects (such as digital music albums or digitized books and photograph albums). The primary goal of this article is to offer a method to compare these packaging standards and best practices tailored to the needs of the digital library community and the emerging digital preservation programs. The contribution of this paper is the definition of an integrated reference model, based on both the OAIS framework and some additional significant properties that affect the quality, usability, encoding and behavior of the digital objects.

  10. Building a semantic web-based metadata repository for facilitating detailed clinical modeling in cancer genome studies.

    PubMed

    Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian

    2017-06-05

    Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capturing and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of this study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both the ISO 11179 metadata standard and the Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model was used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated that the enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.
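    The repository's Semantic Web representation can be illustrated with a minimal, self-contained sketch: data-element metadata held as RDF-style triples and matched against a simple pattern, in the spirit of a SPARQL basic graph pattern. All identifiers, prefixes and element names below are hypothetical placeholders, not actual caDSR or TCGA content.

```python
# A tiny in-memory "triple store" of ISO 11179-style data-element
# metadata. URIs are compact placeholder strings, not real identifiers.
TRIPLES = [
    ("cde:2192199", "rdf:type", "iso11179:DataElement"),
    ("cde:2192199", "iso11179:designation", "Patient Gender Code"),
    ("cde:2192199", "mdr:domain", "clinical_pharmaceutical"),
    ("cde:64145", "rdf:type", "iso11179:DataElement"),
    ("cde:64145", "iso11179:designation", "Drug Name"),
    ("cde:64145", "mdr:domain", "clinical_pharmaceutical"),
]

def match(pattern, triples):
    """Match a single (s, p, o) pattern; None acts as a wildcard."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Roughly: SELECT ?cde WHERE { ?cde mdr:domain "clinical_pharmaceutical" }
hits = match((None, "mdr:domain", "clinical_pharmaceutical"), TRIPLES)
print([s for s, _, _ in hits])
```

    A real deployment would of course use an RDF store with full SPARQL support rather than list scans; the sketch only shows how domain membership queries over CDE metadata are structured.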

  11. Digitization of Blocks and Virtual Anastylosis of an Antique Facade in Pont-Sainte-Maxence (France)

    NASA Astrophysics Data System (ADS)

    Alby, E.; Grussenmeyer, P.; Bitard, L.; Guillemin, S.; Brunet-Gaston, V.; Gaston, C.; Rougier, R.

    2017-08-01

    This paper is dedicated to the digitization of blocks and virtual anastylosis of an antique façade in Pont-Sainte-Maxence (France). In 2014, during the construction of a shopping center, the National Institute for Preventive Archaeological Research (INRAP) discovered a Gallo-Roman site from the 2nd century AD. The most interesting part of the site for the study is a façade 70 meters long by nearly 10 meters high. The state of conservation of the façade blocks is exceptional, and raises the question of how the structure collapsed. Representative and symbolic blocks of this building have been selected for a virtual anastylosis study. The blocks discovered belong to different types: decorated architectural blocks, monumental statuary elements and details of very fine decorations. The digital reproduction of the façade will facilitate the formulation of hypotheses for the collapse of the structure. The Photogrammetry and Geomatics Group of INSA Strasbourg is in charge of the digitization, the anastylosis and the development of exploratory methods for understanding the ruin of the façade. To develop the three-dimensional model of the façade, approximately 70 blocks of various dimensions were chosen by the archaeologists. The digitization technique is chosen according to the following pragmatic criterion: the movable objects are acquired with a scan-arm or a hand-held scanner in the laboratory, while the largest blocks are recorded by photogrammetry at the repository near Paris. Several types of deliverables are expected: very accurate 3D models with the most faithful representation to document the objects in the best way, and models of optimized size allowing easy handling during anastylosis tests. The visual aspect of the models is also a very important issue.
Indeed, textures from photos are an excellent way to bring about the realism of the virtual model, but fine details of the object are sometimes blurred by the uniformity of the color of the original material. Acquisition by hand-held scanner does not provide textures (they must be acquired in a complementary process). The data types therefore differ depending on the acquisition, and the type of rendering of the models depends on choices that must be defined carefully. After the acquisition, hypotheses for the construction of the façade must be validated and/or adapted through the anastylosis of the digitized blocks. Different cases must be taken into account. First, the reconstruction of broken blocks is done by adjusting the recovered fragments. If all the fragments discovered are close to the initial shape of the block, the process resembles a puzzle of complex surfaces. If the fragments have no contact but are an integral part of the block, the proportion of hypotheses in relation to the contact pieces changes. And finally, if the blocks are to be assembled by superposition along a common plane, as assumed during the construction, the restitution could be based on the positions of discovery and on hypotheses drawn from the architectural knowledge of this period. Each of these three methods of reconstruction involves different processes. The three-dimensional model will be validated by the positioning of the blocks and extended according to the actual dimensions of the façade. Different collapse scenarios will result from this study.

  12. The environmental constraint needs for design improvements to the Saligny I/LLW-repository near Cernavoda NPP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barariu, Gheorghe

    2007-07-01

    The paper presents new perspectives on the development of the L/ILW Final Repository Project to be built near Cernavoda NPP. The Repository is designed to satisfy the main performance objectives in accordance with IAEA recommendations. In October 1996, Romania became a country with an operating nuclear power plant. Reactor 2 reached criticality on May 6, 2007 and was to enter commercial operation in September 2007. The Ministry of Economy and Finance has decided to proceed with the commissioning of Units 3 and 4 of Cernavoda NPP by 2014. The Strategy for radioactive waste management was elaborated by the National Agency for Radioactive Waste (ANDRAD), the jurisdictional authority for definitive disposal and the coordination of nuclear spent fuel and radioactive waste management (Order 844/2004), with attributions established by Governmental Decision (GO) 31/2006. The Strategy specifies the commissioning of the Saligny L/IL Radwaste Repository near Cernavoda NPP in 2014. When designing the L/IL Radwaste Repository, the following prerequisites have been taken into account: 1) Cernavoda NPP will be equipped with 4 CANDU 6 units; 2) national legislation on radwaste management will be reviewed and/or completed to harmonize with EU standards; 3) the selected site is now in the process of confirmation after a comprehensive set of interdisciplinary investigations. (author)

  13. Fused off-axis object illumination direct-to-digital holography with a plurality of illumination sources

    DOEpatents

    Price, Jeffery R.; Bingham, Philip R.

    2005-11-08

    Systems and methods are described for rapid acquisition of fused off-axis illumination direct-to-digital holography. A method of recording a plurality of off-axis object illuminated spatially heterodyne holograms, each of the off-axis object illuminated spatially heterodyne holograms including spatially heterodyne fringes for Fourier analysis, includes digitally recording, with a first illumination source of an interferometer, a first off-axis object illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; and digitally recording, with a second illumination source of the interferometer, a second off-axis object illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis.

  14. mHealthApps: A Repository and Database of Mobile Health Apps.

    PubMed

    Xu, Wenlong; Liu, Yin

    2015-03-18

    The market of mobile health (mHealth) apps has rapidly evolved in the past decade. With more than 100,000 mHealth apps currently available, there is no centralized resource that collects information on these health-related apps for researchers in this field to effectively evaluate their strengths and weaknesses. The objective of this study was to create a centralized mHealth app repository. We expect the analysis of information in this repository to provide insights for future mHealth research developments. We focused on apps from the two most established app stores, the Apple App Store and the Google Play Store. We extracted detailed information on each health-related app from these two app stores via our Python crawling program, and then stored the information in both a user-friendly array format and a standard JavaScript Object Notation (JSON) format. We have developed a centralized resource that provides detailed information on more than 60,000 health-related apps from the Apple App Store and the Google Play Store. Using this information resource, we analyzed thousands of apps systematically and provide an overview of the trends for mHealth apps. This unique database allows the meta-analysis of health-related apps and provides guidance for research designs of future apps in the mHealth field.
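    The JSON storage format mentioned above might look like the following sketch. The app records and field names are invented examples, not entries from the actual database.

```python
import json

# Hypothetical crawled records, one dict per app-store listing.
apps = [
    {"store": "Apple App Store", "name": "StepTracker",
     "category": "Health & Fitness", "price": 0.0, "rating": 4.2},
    {"store": "Google Play", "name": "GlucoseLog",
     "category": "Medical", "price": 1.99, "rating": 4.6},
]

# Serialize to standard JSON, as the repository does for each record.
blob = json.dumps(apps, indent=2)

# A trivial "meta-analysis" over the repository: count apps per store.
counts = {}
for app in json.loads(blob):
    counts[app["store"]] = counts.get(app["store"], 0) + 1
print(counts)
```

    Because each record is plain JSON, downstream analyses can load the whole repository with any standard JSON parser and aggregate over fields such as category, price or rating.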

  15. Recording multiple spatially-heterodyned direct to digital holograms in one digital image

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-03-25

    Systems and methods are described for recording multiple spatially-heterodyned direct to digital holograms in one digital image. A method includes digitally recording, at a first reference beam-object beam angle, a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram to sit on top of a first spatial-heterodyne carrier frequency defined by the first reference beam-object beam angle; digitally recording, at a second reference beam-object beam angle, a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram to sit on top of a second spatial-heterodyne carrier frequency defined by the second reference beam-object beam angle; applying a first digital filter to cut off signals around the first original origin and define a first result; performing a first inverse Fourier transform on the first result; applying a second digital filter to cut off signals around the second original origin and define a second result; and performing a second inverse Fourier transform on the second result, wherein the first reference beam-object beam angle is not equal to the second reference beam-object beam angle and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
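    The Fourier demultiplexing described in the claim can be illustrated numerically. The sketch below is a toy model, not the patented method: simple cosine fringes stand in for real holograms, and the carrier frequencies, image size and Gaussian "objects" are invented. The essential steps match the claim: transform, shift each spatial-heterodyne carrier to the origin, apply a digital filter around the origin, and inverse-transform.

```python
import numpy as np

N = 128
y, x = np.mgrid[0:N, 0:N]

# Two made-up carrier frequencies (cycles per frame), standing in for
# the two distinct reference beam-object beam angles.
k1, k2 = (10, 0), (0, 25)

# Stand-in "object" amplitudes: a Gaussian spot and a shifted copy.
obj1 = np.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2) / 200.0)
obj2 = np.roll(obj1, 20, axis=1)

# One digital image holding both spatially-heterodyned fringe patterns.
image = (obj1 * np.cos(2 * np.pi * (k1[0] * x + k1[1] * y) / N)
         + obj2 * np.cos(2 * np.pi * (k2[0] * x + k2[1] * y) / N))

def extract(img, carrier, radius=6):
    """Shift one carrier peak onto the Fourier origin, low-pass filter,
    and inverse-transform to recover that multiplexed signal."""
    F = np.fft.fft2(img)
    F = np.roll(F, (-carrier[1], -carrier[0]), axis=(0, 1))
    fy, fx = np.mgrid[0:N, 0:N]
    fy = np.minimum(fy, N - fy)  # wrapped frequency distance
    fx = np.minimum(fx, N - fx)
    F[fx ** 2 + fy ** 2 > radius ** 2] = 0  # digital filter around origin
    return 2 * np.real(np.fft.ifft2(F))

rec1 = extract(image, k1)
rec2 = extract(image, k2)
```

    Because the two carriers sit at well-separated points of the Fourier plane, each low-pass window isolates one hologram's sideband, so both signals are recovered from the single recorded image.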

  16. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
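    A much-simplified sketch of the two ICHE stages might look as follows. The centering step scales each image so its histogram centroid lands on a common target; a plain global histogram equalization then stands in for the modified contrast-limited adaptive variant the authors actually use. The batch statistics are simulated, not taken from the paper's datasets.

```python
import numpy as np

def intensity_center(img, target=128.0):
    """Scale pixel values so the histogram centroid moves to `target`."""
    return np.clip(img * (target / img.mean()), 0, 255)

def hist_equalize(img, bins=256):
    """Global histogram equalization over 8-bit intensities (a stand-in
    for the modified CLAHE stage of ICHE)."""
    hist, edges = np.histogram(img.ravel(), bins=bins, range=(0, 255))
    cdf = hist.cumsum() / hist.sum()
    return np.interp(img.ravel(), edges[:-1], cdf * 255).reshape(img.shape)

# Simulated "batch effect": two slides scanned under different staining
# conditions, one systematically darker than the other.
rng = np.random.default_rng(0)
batch1 = rng.normal(90, 20, (64, 64)).clip(0, 255)
batch2 = rng.normal(160, 20, (64, 64)).clip(0, 255)

n1 = hist_equalize(intensity_center(batch1))
n2 = hist_equalize(intensity_center(batch2))
# After normalization the two batches have nearly identical intensity
# statistics, which is the point of the preprocessing.
print(round(abs(float(n1.mean()) - float(n2.mean())), 1))
```

    A faithful reimplementation would operate per color channel on RGB slides and use the adaptive, contrast-limited equalization described in the paper.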

  17. The importance of metadata to assess information content in digital reconstructions of neuronal morphology.

    PubMed

    Parekh, Ruchi; Armañanzas, Rubén; Ascoli, Giorgio A

    2015-04-01

    Digital reconstructions of axonal and dendritic arbors provide a powerful representation of neuronal morphology in formats amenable to quantitative analysis, computational modeling, and data mining. Reconstructed files, however, require adequate metadata to identify the appropriate animal species, developmental stage, brain region, and neuron type. Moreover, experimental details about tissue processing, neurite visualization and microscopic imaging are essential to assess the information content of digital morphologies. Typical morphological reconstructions only partially capture the underlying biological reality. Tracings are often limited to certain domains (e.g., dendrites and not axons), may be incomplete due to tissue sectioning, imperfect staining, and limited imaging resolution, or can disregard aspects irrelevant to their specific scientific focus (such as branch thickness or depth). Gauging these factors is critical in subsequent data reuse and comparison. NeuroMorpho.Org is a central repository of reconstructions from many laboratories and experimental conditions. Here, we introduce substantial additions to the existing metadata annotation aimed to describe the completeness of the reconstructed neurons in NeuroMorpho.Org. These expanded metadata form a suitable basis for effective description of neuromorphological data.

  18. The digital revolution in phenotyping

    PubMed Central

    Oellrich, Anika; Collier, Nigel; Groza, Tudor; Rebholz-Schuhmann, Dietrich; Shah, Nigam; Bodenreider, Olivier; Boland, Mary Regina; Georgiev, Ivo; Liu, Hongfang; Livingston, Kevin; Luna, Augustin; Mallon, Ann-Marie; Manda, Prashanti; Robinson, Peter N.; Rustici, Gabriella; Simon, Michelle; Wang, Liqin; Winnenburg, Rainer; Dumontier, Michel

    2016-01-01

    Phenotypes have gained increased notoriety in the clinical and biological domain owing to their application in numerous areas such as the discovery of disease genes and drug targets, phylogenetics and pharmacogenomics. Phenotypes, defined as observable characteristics of organisms, can be seen as one of the bridges that lead to a translation of experimental findings into clinical applications and thereby support ‘bench to bedside’ efforts. However, to build this translational bridge, a common and universal understanding of phenotypes is required that goes beyond domain-specific definitions. To achieve this ambitious goal, a digital revolution is ongoing that enables the encoding of data in computer-readable formats and the data storage in specialized repositories, ready for integration, enabling translational research. While phenome research is an ongoing endeavor, the true potential hidden in the currently available data still needs to be unlocked, offering exciting opportunities for the forthcoming years. Here, we provide insights into the state-of-the-art in digital phenotyping, by means of representing, acquiring and analyzing phenotype data. In addition, we provide visions of this field for future research work that could enable better applications of phenotype data. PMID:26420780

  19. The National Institutes of Health Clinical Center Digital Imaging Network, Picture Archival and Communication System, and Radiology Information System.

    PubMed

    Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V

    2001-06-01

    In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and expected high bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing fiber-channel technology for immediate delivery of imaging data. On demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ADTs)/demographics, orders, appointment notifications, doctors update, and results.

  20. Standard for the U.S. Geological Survey Historical Topographic Map Collection

    USGS Publications Warehouse

    Allord, Gregory J.; Fishburn, Kristin A.; Walter, Jennifer L.

    2014-01-01

    This document defines the digital map product of the U.S. Geological Survey (USGS) Historical Topographic Map Collection (HTMC). The HTMC is a digital archive of about 190,000 printed topographic quadrangle maps published by the USGS from the inception of the topographic mapping program in 1884 until the last paper topographic map using lithographic printing technology was published in 2006. The HTMC provides a comprehensive digital repository of all scales and all editions of USGS printed topographic maps that is easily discovered, browsed, and downloaded by the public at no cost. Each printed topographic map is scanned “as is” and captures the content and condition of each map. The HTMC provides ready access to maps that are no longer available for distribution in print. A new generation of topographic maps called “US Topo” was defined in 2009. US Topo maps, though modeled on the legacy 7.5-minute topographic maps, conform to different standards. For more information on the HTMC, see the project Web site at: http://nationalmap.gov/historical/.

  1. Application Examples for Handle System Usage

    NASA Astrophysics Data System (ADS)

    Toussaint, F.; Weigel, T.; Thiemann, H.; Höck, H.; Stockhause, M.; Lautenschlager, M.

    2012-12-01

    Besides the well-known DOIs (Digital Object Identifiers), a special form of Handles that resolve to scientific publications, various other applications are in use, and others are perhaps yet to come. We present some examples of the existing ones and some ideas for the future. The German national project C3-Grid provides a framework to implement a first solution for provenance tracing and to explore unforeseen implications. Though project-specific, the high-level architecture is generic and represents well a common notion of data derivation. Users select one or more input datasets and a workflow software module (an agent in this context) to execute on the data. The output data is deposited in a repository to be delivered to the user. All data is accompanied by an XML metadata document. All input and output data, the metadata and the workflow module receive Handles and are linked together to establish a directed acyclic graph of derived data objects and involved agents. Data that has been modified by a workflow module is linked to its predecessor data and to the workflow module involved. Version control systems such as svn or git provide Internet access to software repositories using URLs. To refer to a specific state of the source code of, for instance, a C3 workflow module, it is sufficient to reference the URL of the svn revision or git hash. In consequence, individual revisions and the repository as a whole receive PIDs. Moreover, the revision-specific PIDs are linked to their respective predecessors and become part of the provenance graph. Another example of PID usage in a current major project is given by EUDAT (European Data Infrastructure), which will link the scientific data of several research communities together. In many fields it is necessary to provide data objects at multiple locations for a variety of applications. To ensure consistency, not only the master of a data object but also its copies shall be provided with a PID. 
To verify transaction safety and to keep all copies consistent, the chain from master to copy, and vice versa, has to be resolvable, preferably directly through PIDs. As part of EUDAT, the necessary services are created on the basis of iRODS. These form the core structure of the data infrastructure developed within EUDAT. Though many implementations of PID systems already exist, many valuable web-accessible data sources come with unresolvable identifiers like UUIDs, with unstable recognition patterns like URLs, or even with proprietary implementations. However, other data collections would like to link to them in the data descriptions of their metadata. In addition, by using PIDs one can decouple the responsibilities for data and metadata in projects where necessary. For some metadata entities like persons or even institutes it makes sense to give them single PIDs that point to contact and/or location information. ORCID (Open Researcher & Contributor ID), e.g., keeps track of persons working in scholarly fields, independent of name changes and linguistic variances. The ISO 27729-based International Standard Name Identifier (ISNI) also identifies legal entities and fictional characters besides natural persons. Other systems exist that, e.g., reference geographic localities. IDs of this kind may resolve to a URL where detailed information is given.
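    The provenance linking described for C3-Grid can be sketched as a small directed acyclic graph keyed by PIDs. The Handle values below are invented placeholders, not resolvable identifiers.

```python
# Each PID maps to (kind, predecessor PIDs). An output dataset links back
# to its input datasets and to the workflow module (agent) that made it.
provenance = {
    "hdl:21.T100/in-a":   ("dataset", []),
    "hdl:21.T100/in-b":   ("dataset", []),
    "hdl:21.T100/wf-r42": ("workflow_module", []),
    "hdl:21.T100/out-1":  ("dataset",
                           ["hdl:21.T100/in-a", "hdl:21.T100/in-b",
                            "hdl:21.T100/wf-r42"]),
}

def lineage(pid, graph):
    """Walk predecessor links back to the original inputs and agents."""
    seen, stack = [], [pid]
    while stack:
        cur = stack.pop()
        if cur in seen:
            continue
        seen.append(cur)
        stack.extend(graph[cur][1])
    return seen

print(lineage("hdl:21.T100/out-1", provenance))
```

    In the real system each node would additionally carry its XML metadata document, and the links would be stored as Handle values so the graph can be traversed by resolving PIDs rather than by reading a local dictionary.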

  2. Mont Terri Underground Rock Laboratory, Switzerland-Research Program And Key Results

    NASA Astrophysics Data System (ADS)

    Nussbaum, C. O.; Bossart, P. J.

    2012-12-01

    Argillaceous formations generally act as aquitards because of their low hydraulic conductivities. This property, together with the large retention capacity of clays for cationic contaminants and the potential for self-sealing, has brought clay formations into focus as potential host rocks for the geological disposal of radioactive waste. Excavated in the Opalinus Clay formation, the Mont Terri underground rock laboratory in the Jura Mountains of NW Switzerland is an important international test site for researching clay formations. Research is carried out in the underground facility, which is located adjacent to the security gallery of the Mont Terri motorway tunnel. Fifteen partners from European countries, USA, Canada and Japan participate in the project. The objectives of the research program are to analyze the hydrogeological, geochemical and rock mechanical properties of the Opalinus Clay, to determine the changes induced by the excavation of galleries and by heating of the rock formation, to test sealing and container emplacement techniques and to evaluate and improve suitable investigation techniques. For the safety of deep geological disposal, it is of key importance to understand the processes occurring in the undisturbed argillaceous environment, as well as the processes in a disturbed system, during the operation of the repository. The objectives are related to: 1. Understanding processes and mechanisms in undisturbed clays and 2. Experiments related to repository-induced perturbations. Experiments of the first group are dedicated to: i) Improvement of drilling and excavation technologies and sampling methods; ii) Estimation of hydrogeological, rock mechanical and geochemical parameters of the undisturbed Opalinus Clay;
    upscaling of parameters from laboratory to in situ scale; iii) Geochemistry of porewater and natural gases; evolution of porewater over time scales; iv) Assessment of long-term hydraulic transients associated with erosion and thermal scenarios and v) Evaluation of diffusion and retention parameters for long-lived radionuclides. Experiments related to repository-induced perturbations are focused on: i) Influence of rock liner on the disposal system and the buffering potential of the host rock; ii) Self-sealing processes in the excavation damaged zone; iii) Hydro-mechanical coupled processes (e.g. stress redistributions and pore pressure evolution during excavation); iv) Thermo-hydro-mechanical-chemical coupled processes (e.g. heating of bentonite and host rock) and v) Gas-induced transport of radionuclides in porewater and along interfaces in the engineered barrier system. A third research direction is to demonstrate the feasibility of repository construction and long-term safety after repository closure. Demonstration experiments can contribute to improving the reliability of the scientific basis for the safety assessment of future geological repositories, particularly if they are performed on a large scale and with a long duration. These experiments include the construction and installation of engineered barriers on a 1:1 scale: i) Horizontal emplacement of canisters; ii) Evaluation of the corrosion of container materials; repository re-saturation; iii) Sealing of boreholes and repository access tunnels and iv) Long-term monitoring of the repository. References Bossart, P. & Thury, M. (2008): Mont Terri Rock Laboratory. Project, Programme 1996 to 2007 and Results. - Rep. Swiss Geol. Surv. 3.

  3. A Note on Interfacing Object Warehouses and Mass Storage Systems for Data Mining Applications

    NASA Technical Reports Server (NTRS)

    Grossman, Robert L.; Northcutt, Dave

    1996-01-01

    Data mining is the automatic discovery of patterns, associations, and anomalies in data sets. It requires numerically and statistically intensive queries. Our assumption is that data mining requires a specialized data management infrastructure to support these intensive queries but that, because of the sizes of the data involved, this infrastructure must be layered over a hierarchical storage system. In this paper, we discuss the architecture of a system which is layered for modularity but exploits specialized lightweight services to maintain efficiency. Rather than use a fully functional database, for example, we use lightweight object services specialized for data mining. We propose using information repositories between layers so that components on either side of a layer can access information in the repositories to assist in making decisions about data layout, the caching and migration of data, the scheduling of queries, and related matters.
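
A minimal sketch of the inter-layer information repository idea (all names hypothetical, not the authors' API): the query layer records object accesses into a shared repository, and the storage layer consults those hints when planning caching and migration.

```python
from collections import Counter

class HintRepository:
    """Shared between layers: the query layer records access patterns,
    and the storage layer consults them when choosing what to cache."""
    def __init__(self):
        self.access_counts = Counter()

    def record_access(self, object_id: str) -> None:
        self.access_counts[object_id] += 1

    def hot_objects(self, top_n: int = 3) -> list:
        return [oid for oid, _ in self.access_counts.most_common(top_n)]

class StorageLayer:
    """Decides which objects belong on fast storage, using the shared hints."""
    def __init__(self, hints: HintRepository, cache_size: int = 2):
        self.hints, self.cache_size = hints, cache_size

    def plan_migration(self) -> list:
        return self.hints.hot_objects(self.cache_size)

hints = HintRepository()
for oid in ["obj-a", "obj-b", "obj-a", "obj-c", "obj-a", "obj-b"]:
    hints.record_access(oid)                 # query layer side
print(StorageLayer(hints).plan_migration())  # storage layer side
```

The point of the shared repository is that neither layer needs to know the other's internals; they only agree on the hint format.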

  4. Building a genome database using an object-oriented approach.

    PubMed

    Barbasiewicz, Anna; Liu, Lin; Lang, B Franz; Burger, Gertraud

    2002-01-01

    GOBASE is a relational database that integrates data associated with mitochondria and chloroplasts. The most important data in GOBASE, i.e., molecular sequences and taxonomic information, are obtained from the public sequence data repository at the National Center for Biotechnology Information (NCBI) and are validated by our experts. Maintaining a curated genomic database comes with a towering labor cost, due to the sheer volume of available genomic sequences and the plethora of annotation errors and omissions in records retrieved from public repositories. Here we describe our approach to increasing automation of the database population process, thereby reducing manual intervention. As a first step, we used the Unified Modeling Language (UML) to construct a list of potential errors. Each case was evaluated independently, and an expert solution was devised and represented as a diagram. Subsequently, the UML diagrams were used as templates for writing object-oriented automation programs in the Java programming language.

  5. Probalistic Criticality Consequence Evaluation (SCPB:N/A)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Gottlieb; J.W. Davis; J.R. Massari

    1996-09-04

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development (WPD) department with the objective of providing a comprehensive, conservative estimate of the consequences of a criticality which could possibly occur as a result of commercial spent nuclear fuel emplaced in the underground repository at Yucca Mountain. The consequences of criticality are measured principally in terms of the resulting changes in radionuclide inventory as a function of the power level and duration of the criticality. The purpose of this analysis is to extend the prior estimates of increased radionuclide inventory (Refs. 5.52 and 5.54) for both internal and external criticality. This analysis, and similar estimates and refinements to be completed before the end of fiscal year 1997, will be provided as input to Total System Performance Assessment-Viability Assessment (TSPA-VA) to demonstrate compliance with the repository performance objectives.

  6. Using Object Storage Technology vs Vendor Neutral Archives for an Image Data Repository Infrastructure.

    PubMed

    Bialecki, Brian; Park, James; Tilkin, Mike

    2016-08-01

    The intent of this project was to use object storage and its database, which has the ability to add custom extensible metadata to an imaging object being stored within the system, to harness the power of its search capabilities, and to close the technology gap that healthcare faces. This creates a non-disruptive tool that can be used natively by both legacy systems and the healthcare systems of today which leverage more advanced storage technologies. The base infrastructure can be populated alongside current workflows without any interruption to the delivery of services. In certain use cases, this technology can be seen as a true alternative to the VNA (Vendor Neutral Archive) systems implemented by healthcare today. The scalability, security, and ability to process complex objects makes this more than just storage for image data and a commodity to be consumed by PACS (Picture Archiving and Communication System) and workstations. Object storage is a smart technology that can be leveraged to create vendor independence, standards compliance, and a data repository that can be mined for truly relevant content by adding additional context to search capabilities. This functionality can lead to efficiencies in workflow and a wealth of minable data to improve outcomes into the future.
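
The abstract does not name a particular product or API, but the core mechanism (objects carrying extensible, searchable metadata alongside their payload) can be sketched as follows; the class and metadata field names are invented for illustration.

```python
import uuid

class ObjectStore:
    """Toy sketch of object storage: each stored object carries extensible
    key-value metadata that can later be searched, alongside the raw bytes."""
    def __init__(self):
        self._objects = {}

    def put(self, data: bytes, **metadata) -> str:
        """Store data plus arbitrary metadata; return the object's id."""
        oid = str(uuid.uuid4())
        self._objects[oid] = {"data": data, "meta": metadata}
        return oid

    def search(self, **criteria) -> list:
        """Return ids of objects whose metadata matches all criteria."""
        return [oid for oid, obj in self._objects.items()
                if all(obj["meta"].get(k) == v for k, v in criteria.items())]

store = ObjectStore()
store.put(b"...pixels...", modality="CT", body_part="CHEST", study="S1")
store.put(b"...pixels...", modality="MR", body_part="HEAD", study="S2")
print(len(store.search(modality="CT")))
```

Because the metadata schema is open-ended, context (modality, body part, outcome codes, etc.) can be added after ingestion without touching the stored images, which is what makes the repository minable later.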

  7. Tug-of-war lacunarity—A novel approach for estimating lacunarity

    NASA Astrophysics Data System (ADS)

    Reiss, Martin A.; Lemmerer, Birgit; Hanslmeier, Arnold; Ahammer, Helmut

    2016-11-01

    Modern instrumentation provides us with massive repositories of digital images that will likely only grow in the future. It has therefore become increasingly important to automate the analysis of digital images, e.g., with methods from pattern recognition. These methods aim to quantify the visual appearance of captured textures with quantitative measures. As such, lacunarity is a useful multi-scale measure of textural heterogeneity, but it demands high computational effort. Here we investigate a novel approach based on the tug-of-war algorithm, which estimates lacunarity in a single pass over the image. We computed lacunarity for theoretical and real-world sample images, and found that the investigated approach is able to estimate lacunarity with low uncertainty. We conclude that the proposed method combines low computational effort with high accuracy, and that it may have utility in the analysis of high-resolution images.
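
The paper's exact estimator is not reproduced here, but the underlying tug-of-war technique (the AMS sketch) estimates a second moment in a single pass by correlating counts with random ±1 signs; since lacunarity is essentially a normalized second moment of box masses, a minimal sketch of that ingredient looks like this.

```python
import random

def tug_of_war_second_moment(stream, n_estimators=400, seed=0):
    """One-pass AMS estimate of F2 = sum_i count(i)^2 for a stream of items."""
    rng = random.Random(seed)
    signs = [{} for _ in range(n_estimators)]  # lazily drawn ±1 sign per item
    sketch = [0] * n_estimators
    for item in stream:                        # a single pass over the data
        for j in range(n_estimators):
            s = signs[j].setdefault(item, rng.choice((-1, 1)))
            sketch[j] += s
    # Each Z^2 is an unbiased estimate of F2; averaging reduces the variance.
    return sum(z * z for z in sketch) / n_estimators

# Box masses of a toy "image": box i contributes mass_i items to the stream.
stream = [0] * 5 + [1] * 3 + [2] * 2
true_f2 = 5**2 + 3**2 + 2**2  # = 38
est = tug_of_war_second_moment(stream)
print(true_f2, round(est, 1))
```

The accuracy/effort trade-off is controlled by `n_estimators`: more parallel sketches lower the variance while the pass over the image stays single.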

  8. Scanning and georeferencing historical USGS quadrangles

    USGS Publications Warehouse

    Fishburn, Kristin A.; Davis, Larry R.; Allord, Gregory J.

    2017-06-23

    The U.S. Geological Survey (USGS) National Geospatial Program is scanning published USGS 1:250,000-scale and larger topographic maps printed between 1884, the inception of the topographic mapping program, and 2006. The goal of this project, which began publishing the Historical Topographic Map Collection in 2011, is to provide access to a digital repository of USGS topographic maps that is available to the public at no cost. For more than 125 years, USGS topographic maps have accurately portrayed the complex geography of the Nation. The USGS is the Nation’s largest producer of traditional topographic maps, and, prior to 2006, USGS topographic maps were created using traditional cartographic methods and printed using a lithographic process. The next generation of topographic maps, US Topo, is being released by the USGS in digital form, and newer technologies make it possible to also deliver historical maps in the same electronic format that is more publicly accessible.

  9. The EGS Data Collaboration Platform: Enabling Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weers, Jonathan D; Johnston, Henry; Huggins, Jay V

    Collaboration in the digital age has been stifled in recent years. Reasonable responses to legitimate security concerns have created a virtual landscape of silos and fortified castles incapable of sharing information efficiently. This trend unfortunately runs counter to the geothermal scientific community's migration toward larger, more collaborative projects. To facilitate efficient sharing of information between team members from multiple national labs, universities, and private organizations, the 'EGS Collab' team has developed a universally accessible, secure data collaboration platform and has fully integrated it with the U.S. Department of Energy's (DOE) Geothermal Data Repository (GDR) and the National Geothermal Data System (NGDS). This paper will explore some of the challenges of collaboration in the modern digital age, highlight strategies for active data management, and discuss the integration of the EGS Collab data management platform with the GDR to enable scientific discovery through the timely dissemination of information.

  10. Digital data for quick response (QR) codes of alkalophilic Bacillus pumilus to identify and to compare bacilli isolated from Lonar Crator Lake, India.

    PubMed

    Rekadwad, Bhagwan N; Khobragade, Chandrahasya N

    2016-06-01

    Microbiologists are routinely engaged in the isolation, identification and comparison of isolated bacteria to assess their novelty. 16S rRNA sequences of Bacillus pumilus were retrieved from the NCBI repository, and QR codes were generated for the sequences (FASTA format and full GenBank information). These 16S rRNA sequences were used to generate quick response (QR) codes for Bacillus pumilus isolates from Lonar Crator Lake (19° 58' N; 76° 31' E), India. The Bacillus pumilus 16S rRNA gene sequences were also used to generate CGR, FCGR and PCA representations, which can be used for visual comparison and evaluation, respectively. The hyperlinked QR codes, CGR, FCGR and PCA of all the isolates are made available to users on the portal https://sites.google.com/site/bhagwanrekadwad/. This digital data helps users evaluate and compare any Bacillus pumilus strain, minimizes laboratory effort and avoids misinterpretation of the species.
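
As a hedged sketch of one of the representations mentioned: a chaos game representation (CGR) maps a DNA sequence into the unit square by repeatedly walking halfway toward the corner assigned to the current base. The corner assignment below is one common convention, not necessarily the authors'.

```python
def cgr_points(sequence):
    """Chaos game representation: step halfway toward the corner of each
    base in turn; the resulting point cloud fingerprints the sequence."""
    corners = {"A": (0.0, 0.0), "C": (0.0, 1.0),
               "G": (1.0, 1.0), "T": (1.0, 0.0)}  # corner choice is a convention
    x, y = 0.5, 0.5                               # start at the centre
    points = []
    for base in sequence:
        cx, cy = corners[base]
        x, y = (x + cx) / 2, (y + cy) / 2
        points.append((x, y))
    return points

pts = cgr_points("ACGTACGT")
print(pts[0])  # first step: halfway from the centre toward the A corner
```

Plotting the points for a full 16S rRNA sequence yields the fractal-like image that allows visual comparison between strains; FCGR simply bins these points into a frequency grid.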

  11. Content-based fused off-axis object illumination direct-to-digital holography

    DOEpatents

    Price, Jeffery R.

    2006-05-02

    Systems and methods are described for content-based fused off-axis illumination direct-to-digital holography. A method includes calculating an illumination angle with respect to an optical axis defined by a focusing lens as a function of data representing a Fourier analyzed spatially heterodyne hologram; reflecting a reference beam from a reference mirror at a non-normal angle; reflecting an object beam from an object, the object beam being incident upon the object at the illumination angle; focusing the reference beam and the object beam at a focal plane of a digital recorder to form the content-based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; and digitally recording the content-based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis.

  12. A Survey of Complex Object Technologies for Digital Libraries

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Argue, Brad; Efron, Miles; Denn, Sheila; Pattuelli, Maria Cristina

    2001-01-01

    Many early web-based digital libraries (DLs) had implicit assumptions reflected in their architecture that the unit of focus in the DL (frequently "reports" or "e-prints") would be manifested in only a single, or at most a few, common file formats such as PDF or PostScript. DLs have now matured to the point where their contents are commonly no longer simple files. Complex objects in DLs have emerged in response to various requirements, including: simple aggregation of formats and supporting files, bundling additional information to aid digital preservation, creating opaque digital objects for e-commerce applications, and the incorporation of dynamic services with the traditional data files. We examine a representative (but not necessarily exhaustive) number of current and recent historical web-based complex object technologies and projects that are applicable to DLs: Aurora, Buckets, ComMentor, Cryptolopes, Digibox, Document Management Alliance, FEDORA, Kahn-Wilensky Framework Digital Objects, Metadata Encoding & Transmission Standard, Multivalent Documents, Open eBooks, VERS Encapsulated Objects, and the Warwick Framework.

  13. Geohydrologic aspects for siting and design of low-level radioactive-waste disposal

    USGS Publications Warehouse

    Bedinger, M.S.

    1989-01-01

    The objective for siting and design of low-level radioactive-waste repository sites is to isolate the waste from the biosphere until the waste no longer poses an unacceptable hazard as a result of radioactive decay. Low-level radioactive waste commonly is isolated at shallow depths with various engineered features to stabilize the waste and to reduce its dissolution and transport by ground water. The unsaturated zone generally is preferred for isolating the waste. Low-level radioactive waste may need to be isolated for 300 to 500 years. Maintenance and monitoring of the repository site are required by Federal regulations for only the first 100 years. Therefore, geohydrology of the repository site needs to provide natural isolation of the waste for the hazardous period following maintenance of the site. Engineering design of the repository needs to be compatible with the natural geohydrologic conditions at the site. Studies at existing commercial and Federal waste-disposal sites provide information on the problems encountered and the basis for establishing siting guidelines for improved isolation of radioactive waste, engineering design of repository structures, and surveillance needs to assess the effectiveness of the repositories and to provide early warning of problems that may require remedial action. Climate directly affects the hydrology of a site and probably is the most important single factor that affects the suitability of a site for shallow-land burial of low-level radioactive waste. Humid and subhumid regions are not well suited for shallow isolation of low-level radioactive waste in the unsaturated zone; arid regions with zero to small infiltration from precipitation, great depths to the water table, and long flow paths to natural discharge areas are naturally well suited to isolation of the waste. The unsaturated zone is preferred for isolation of low-level radioactive waste.
The guiding rationale is to minimize contact of water with the waste and to minimize transport of waste from the repository. The hydrology of a flow system containing a repository is greatly affected by the engineering of the repository site. Prediction of the performance of the repository is a complex problem, hampered by problems of characterizing the natural and manmade features of the flow system and by the limitations of models to predict flow and geochemical processes in the saturated and unsaturated zones. Disposal in low-permeability unfractured clays in the saturated zone may be feasible where the radionuclide transport is controlled by diffusion rather than advection.

  14. The Evolution of Digital Chemistry at Southampton.

    PubMed

    Bird, Colin; Coles, Simon J; Frey, Jeremy G

    2015-09-01

    In this paper we take a historical view of e-Science and e-Research developments within the Chemical Sciences at the University of Southampton, showing the development of several stages of the evolving data ecosystem as Chemistry moves into the digital age of the 21st century. We cover our research on aspects of the representation of chemical information in the context of the world wide web (WWW) and its semantic enhancement (the Semantic Web), and illustrate this with the example of the representation of quantities and units within the Semantic Web. We explore the changing nature of laboratories as computing power becomes increasingly powerful and pervasive, and specifically look at the function and role of electronic or digital notebooks. Having focussed on the creation of chemical data and information in context, we finish the paper by following the use and reuse of these data as facilitated by digital repositories, and their importance in the exchange of chemical information, touching on the issues of open and/or intelligent access to the data. © 2015 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.

  15. Mining Very High Resolution INSAR Data Based On Complex-GMRF Cues And Relevance Feedback

    NASA Astrophysics Data System (ADS)

    Singh, Jagmal; Popescu, Anca; Soccorsi, Matteo; Datcu, Mihai

    2012-01-01

    With the increase in the number of remote sensing satellites, the number of image-data scenes in our repositories is also increasing, and a large proportion of these scenes are never retrieved and used. Thus automatic retrieval of desired image data using query by image content, to fully utilize the huge repository volume, is becoming of great interest. Generally, different users are interested in scenes containing different kinds of objects and structures, so it is important to analyze all image information mining (IIM) methods so that it is easier for a user to select a method depending upon his/her requirements. We concentrate our study on high-resolution SAR images, and we propose to use InSAR observations instead of single look complex (SLC) images alone for mining scenes containing coherent objects such as high-rise buildings. However, in the case of objects with less coherence, such as areas with vegetation cover, SLC images exhibit better performance. We demonstrate an IIM performance comparison using complex Gauss-Markov random fields as texture descriptors for image patches and SVM relevance feedback.

  16. Citing geospatial feature inventories with XML manifests

    NASA Astrophysics Data System (ADS)

    Bose, R.; McGarva, G.

    2006-12-01

    Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.
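
A minimal sketch of such a manifest (element and attribute names are invented for illustration; OS MasterMap actually identifies features by TOIDs, and the ids below are made up):

```python
import xml.etree.ElementTree as ET

def build_manifest(source, features):
    """Build a citable XML manifest listing versioned feature ids.
    `features` is a list of (feature_id, version) pairs."""
    root = ET.Element("featureManifest", source=source)
    for fid, version in features:
        ET.SubElement(root, "feature", id=fid, version=str(version))
    return ET.tostring(root, encoding="unicode")

# Hypothetical TOID-style identifiers, each pinned to a specific version.
xml_text = build_manifest(
    "OS MasterMap Topography Layer",
    [("osgb1000000000001", 3), ("osgb1000000000002", 1)],
)
print(xml_text)
```

Because each feature id is paired with a version, the manifest stays citable even after the source database is updated, provided the repository curates historic versions as the paper argues.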

  17. Management of Object Histories in the SWALLOW Repository,

    DTIC Science & Technology

    1980-07-01

    time of this future version. Since the end time of the current version should not be automatically extended up to the start time of the token until...and T is determined by the speed with which the available online version storage fills up. Unfortunately, since versions of different objects are...of these images is accessible by following the chain of pointers in the object history. The other images use up storage, but do not have an adverse

  18. Creation of a digital slide and tissue microarray resource from a multi-institutional predictive toxicology study in the rat: an initial report from the PredTox group.

    PubMed

    Mulrane, Laoighse; Rexhepaj, Elton; Smart, Valerie; Callanan, John J; Orhan, Diclehan; Eldem, Türkan; Mally, Angela; Schroeder, Susanne; Meyer, Kirstin; Wendt, Maria; O'Shea, Donal; Gallagher, William M

    2008-08-01

    The widespread use of digital slides has only recently come to the fore with the development of high-throughput scanners and high-performance viewing software. This development, along with the optimisation of compression standards and image transfer techniques, has allowed the technology to be used in wide-reaching applications including the integration of images into hospital information systems and histopathological training, as well as the development of automated image analysis algorithms for prediction of histological aberrations and quantification of immunohistochemical stains. Here, the use of this technology in the creation of a comprehensive library of images of preclinical toxicological relevance is demonstrated. The images, acquired using the Aperio ScanScope CS and XT slide acquisition systems, form part of the ongoing EU FP6 Integrated Project, Innovative Medicines for Europe (InnoMed). In more detail, PredTox (abbreviation for Predictive Toxicology) is a subproject of InnoMed and comprises a consortium of 15 industrial (13 large pharma, 1 technology provider and 1 SME) and three academic partners. The primary aim of this consortium is to assess the value of combining data generated from 'omics technologies (proteomics, transcriptomics, metabolomics) with the results from more conventional toxicology methods, to facilitate further informed decision making in preclinical safety evaluation. A library of 1709 scanned images was created of full-face sections of liver and kidney tissue specimens from male Wistar rats treated with 16 proprietary and reference compounds of known toxicity; additional biological materials from these treated animals were separately used to create 'omics data that will ultimately be used to populate an integrated toxicological database. 
In respect to assessment of the digital slides, a web-enabled digital slide management system, Digital SlideServer (DSS), was employed to enable integration of the digital slide content into the 'omics database and to facilitate remote viewing by pathologists connected with the project. DSS also facilitated manual annotation of digital slides by the pathologists, specifically in relation to marking particular lesions of interest. Tissue microarrays (TMAs) were constructed from the specimens for the purpose of creating a repository of tissue from animals used in the study with a view to later-stage biomarker assessment. As the PredTox consortium itself aims to identify new biomarkers of toxicity, these TMAs will be a valuable means of validation. In summary, a large repository of histological images was created enabling the subsequent pathological analysis of samples through remote viewing and, along with the utilisation of TMA technology, will allow the validation of biomarkers identified by the PredTox consortium. The population of the PredTox database with these digitised images represents the creation of the first toxicological database integrating 'omics and preclinical data with histological images.

  19. Endoscopic pulsed digital holography for 3D measurements

    NASA Astrophysics Data System (ADS)

    Saucedo, A. Tonatiuh; Mendoza Santoyo, Fernando; de La Torre-Ibarra, Manuel; Pedrini, Giancarlo; Osten, Wolfgang

    2006-02-01

    A rigid endoscope and three different object illumination source positions are used in pulsed digital holography to measure the three orthogonal displacement components from hidden areas of a harmonically vibrating metallic cylinder. In order to obtain simultaneous 3D information from the optical setup, it is necessary to match the optical paths within each reference-object beam pair, but to incoherently mismatch the three reference-object beam pairs, such that three pulsed digital holograms are incoherently recorded within a single frame of the CCD sensor. The phase difference is obtained using the Fourier method and by subtracting two digital holograms captured for two different object positions.

  20. Connecting the pieces: Using ORCIDs to improve research impact and repositories.

    PubMed

    Baessa, Mohamed; Lery, Thibaut; Grenz, Daryl; Vijayakumar, J K

    2015-01-01

    Quantitative data are crucial in the assessment of research impact in the academic world. However, as a young university created in 2009, King Abdullah University of Science and Technology (KAUST) needs to aggregate bibliometrics from researchers coming from diverse origins, not necessarily with the proper affiliations. In this context, the University launched an institutional repository in September 2012 with the objective of creating a home for the intellectual outputs of KAUST researchers. Later, the university adopted the first mandated institutional open access policy in the Arab region, effective June 31, 2014. Several projects were then initiated in order to accurately identify the research being done by KAUST authors and bring it into the repository in accordance with the open access policy. Integration with ORCID has been a key element in this process and the best way to ensure data quality for researchers' scientific contributions. It included the systematic inclusion and creation, if necessary, of ORCID identifiers in the existing repository system, an institutional membership in ORCID, and the creation of dedicated integration tools. In addition, and in cooperation with the Office of Research Evaluation, the Library worked on implementing a Current Research Information System (CRIS) as a standardized common resource to monitor KAUST research outputs. We will present our findings about the CRIS implementation, the ORCID API, and the repository statistics, as well as our approach in conducting the assessment of research impact in terms of usage by the global research community.
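
ORCID iDs carry an ISO 7064 MOD 11-2 check character, so a repository can validate an iD before linking it to author records. The sketch below implements only that documented checksum and is independent of KAUST's actual integration tools.

```python
def orcid_checksum(base_digits: str) -> str:
    """ISO 7064 MOD 11-2 check character, as used by ORCID iDs."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate the checksum of an ORCID iD like 0000-0002-1825-0097."""
    digits = orcid.replace("-", "")
    return orcid_checksum(digits[:-1]) == digits[-1]

# 0000-0002-1825-0097 is ORCID's published example iD.
print(is_valid_orcid("0000-0002-1825-0097"))  # → True
```

Catching malformed iDs at ingest time is a cheap first line of defense for the data quality the abstract emphasizes; full verification still goes through the ORCID API.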

  1. A Spectrum of Interoperability: The Site for Science Prototype for the NSDL; Re-Inventing the Wheel? Standards, Interoperability and Digital Cultural Content; Preservation Risk Management for Web Resources: Virtual Remote Control in Cornell's Project Prism; Safekeeping: A Cooperative Approach to Building a Digital Preservation Resource; Object Persistence and Availability in Digital Libraries; Illinois Digital Cultural Heritage Community-Collaborative Interactions among Libraries, Museums and Elementary Schools.

    ERIC Educational Resources Information Center

    Arms, William Y.; Hillmann, Diane; Lagoze, Carl; Krafft, Dean; Marisa, Richard; Saylor, John; Terizzi, Carol; Van de Sompel, Herbert; Gill, Tony; Miller, Paul; Kenney, Anne R.; McGovern, Nancy Y.; Botticelli, Peter; Entlich, Richard; Payette, Sandra; Berthon, Hilary; Thomas, Susan; Webb, Colin; Nelson, Michael L.; Allen, B. Danette; Bennett, Nuala A.; Sandore, Beth; Pianfetti, Evangeline S.

    2002-01-01

    Discusses digital libraries, including interoperability, metadata, and international standards; Web resource preservation efforts at Cornell University; digital preservation at the National Library of Australia; object persistence and availability; collaboration among libraries, museums and elementary schools; Asian digital libraries; and a Web…

  2. A Digital Repository and Execution Platform for Interactive Scholarly Publications in Neuroscience.

    PubMed

    Hodge, Victoria; Jessop, Mark; Fletcher, Martyn; Weeks, Michael; Turner, Aaron; Jackson, Tom; Ingram, Colin; Smith, Leslie; Austin, Jim

    2016-01-01

    The CARMEN Virtual Laboratory (VL) is a cloud-based platform which allows neuroscientists to store, share, develop, execute, reproduce and publicise their work. This paper describes new functionality in the CARMEN VL: an interactive publications repository. This new facility allows users to link data and software to publications. This enables other users to examine data and software associated with the publication and execute the associated software within the VL using the same data as the authors used in the publication. The cloud-based architecture and SaaS (Software as a Service) framework allows vast data sets to be uploaded and analysed using software services. Thus, this new interactive publications facility allows others to build on research results through reuse. This aligns with recent developments by funding agencies, institutions, and publishers with a move to open access research. Open access provides reproducibility and verification of research resources and results. Publications and their associated data and software will be assured of long-term preservation and curation in the repository. Further, analysing research data and the evaluations described in publications frequently requires a number of execution stages many of which are iterative. The VL provides a scientific workflow environment to combine software services into a processing tree. These workflows can also be associated with publications and executed by users. The VL also provides a secure environment where users can decide the access rights for each resource to ensure copyright and privacy restrictions are met.

  3. Where Do Data Go When They Die? Attaining Data Salvation Through the Establishment of a Solar Dynamo Dataverse

    NASA Astrophysics Data System (ADS)

    Munoz-Jaramillo, Andres

    2016-05-01

    The arrival of a highly interconnected digital age with practically limitless data storage capacity has brought with it a significant shift in how scientific data are stored and distributed (i.e., from being in the hands of a small group of scientists to being openly and freely distributed for anyone to use). However, the vertiginous speed at which hardware, software, and the nature of the internet change has also sped up the rate at which data are lost to formatting obsolescence and loss of access. This poster is meant to advertise the creation of a highly permanent data repository (within the context of Harvard's Dataverse), curated to contain datasets of high relevance for the study and prediction of the solar dynamo, solar cycle, and long-term solar variability. This repository has many advantages over traditional data storage, such as the assignment of unique DOI identifiers to each database (making it easier for scientists to cite them directly) and the automatic versioning of each database, so that all data are able to attain salvation.

  4. Development of a user-centered radiology teaching file system

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcelo; Fujino, Asa

    2011-03-01

    Learning radiology requires systematic and comprehensive study of a large knowledge base of medical images. This work presents the development of a digital radiology teaching file system. The proposed system offers a set of services customized to users' contexts and informational needs. This is done by means of an electronic infrastructure that provides easy, integrated access to all relevant patient data at the time of image interpretation, so that radiologists and researchers can examine all available data to reach well-informed conclusions while protecting patient data privacy and security. The system is presented as an environment implementing a distributed clinical database, including medical images, authoring tools, a repository for multimedia documents, and a peer-review model that assures dataset quality. The current implementation has shown that creating clinical data repositories in networked computer environments is a good solution for reviewing information management practices in electronic environments and for creating customized, context-based tools for users connected to the system through electronic interfaces.

  5. Web Based Autonomous Geophysical/Hydrological Monitoring of the Gilt Edge Mine Site: Implementation and Results

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Wangerud, K.; Mattson, E.; Ankeny, M.; Richardson, A.; Heath, G.

    2005-05-01

    The Ruby Gulch repository at the Gilt Edge Mine Superfund site is a capped waste rock repository. Early in the system design, EPA and its subcontractor, the Bureau of Reclamation, recognized the need for a long-term monitoring system to provide information on the repository behavior, with the following objectives: (1) provide information on the integrity of the newly constructed surface cover and diversion system; (2) continually assess the waste's hydrological and geochemical behavior, so that rational decisions can be made for the operation of this cover and liner system; (3) give stakeholders easy access to information on system performance; and (4) integrate a variety of data sources to produce information that could be used to enhance future cover designs. Through discussions between EPA, the Bureau of Reclamation, and Idaho National Laboratory (INL), a long-term monitoring system was designed and implemented that allows EPA to meet these objectives. The system was designed to provide a cost-effective way to deal with massive amounts of data and information, subject to the following specifications: (1) data acquisition should occur autonomously and automatically; (2) data management, processing, and presentation should be automated as much as possible; and (3) users should be able to access all data and information remotely through a web browser. The INL long-term monitoring system integrates data from a set of 522 resistivity electrodes, consisting of 462 surface electrodes and 60 borehole electrodes (in 4 wells with 15 electrodes each); an outflow meter at the toe of the repository; an autonomous, remotely accessible weather station; and four wells (average depth 250 feet) with thermocouples, pressure transducers, and sampling ports for water and air. The monitoring system has been in operation for over a year and has collected data continuously over this period. 
Results from this system have shown the diurnal variation in rock-mass behavior and the movement of water through the waste (allowing estimates of residence time), and they are leading to a comprehensive model of the repository behavior. Because of the sheer volume of data, a user-driven interface allows users to create their own views of the different datasets.

  6. Effects of Semantic Web Based Learning on Pre-Service Teachers' ICT Learning Achievement and Satisfaction

    ERIC Educational Resources Information Center

    Karalar, Halit; Korucu, Agah Tugrul

    2016-01-01

    Although the Semantic Web offers many opportunities for learners, its effects in the classroom are not well known. Therefore, this study explains how learning objects, defined using the terminology of a developed ontology and kept in an object repository, should be presented to learners with the aim of…

  7. Digital holographic image fusion for a larger size object using compressive sensing

    NASA Astrophysics Data System (ADS)

    Tian, Qiuhong; Yan, Liping; Chen, Benyong; Yao, Jiabao; Zhang, Shihua

    2017-05-01

    Digital holographic image fusion for a larger-size object using compressive sensing is proposed. In this method, the high-frequency component of the digital hologram under the discrete wavelet transform is represented sparsely using compressive sensing, so that the data redundancy of digital holographic recording can be reduced effectively, while the low-frequency component is retained in full to preserve image quality; multiple reconstructed images, each with a different clear part corresponding to the laser spot size, are then fused to produce a high-quality reconstructed image of a larger object. In addition, a filter combining high-pass and low-pass filters is designed to remove the zero-order term from the digital hologram effectively. A digital holographic experimental setup based on off-axis Fresnel digital holography was constructed, and feasibility and comparison experiments were carried out. The fused image was evaluated using Tamura texture features. The experimental results demonstrate that the proposed method improves the processing efficiency and visual characteristics of the fused image and effectively enlarges the size of object that can be measured.
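
    The fusion step above (combining reconstructions that are each sharp in a different region) can be illustrated with a much simpler stand-in than the paper's method: select, block by block, the input with the higher local variance as a crude sharpness proxy. This is not the authors' algorithm (they evaluate sharpness with Tamura texture features); the block size and synthetic images are assumptions for the sketch.

```python
import numpy as np

def fuse(img_a, img_b, block=8):
    """Per-block fusion: keep the block with higher variance (sharper)."""
    fused = np.empty_like(img_a)
    for i in range(0, img_a.shape[0], block):
        for j in range(0, img_a.shape[1], block):
            a = img_a[i:i + block, j:j + block]
            b = img_b[i:i + block, j:j + block]
            fused[i:i + block, j:j + block] = a if a.var() >= b.var() else b
    return fused

rng = np.random.default_rng(0)
sharp = rng.normal(size=(16, 16))            # high-variance "in-focus" content
blurry = np.zeros((16, 16))                  # flat "out-of-focus" content
left_sharp = np.hstack([sharp[:, :8], blurry[:, 8:]])
right_sharp = np.hstack([blurry[:, :8], sharp[:, 8:]])
fused = fuse(left_sharp, right_sharp)        # recovers the sharp content everywhere
```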

  8. Best practices for fungal germplasm repositories and perspectives on their implementation.

    PubMed

    Wiest, Aric; Schnittker, Robert; Plamann, Mike; McCluskey, Kevin

    2012-02-01

    Over more than 50 years, the Fungal Genetics Stock Center has grown to become a world-recognized biological resource center. Along with this growth has come the development and implementation of myriad practices for the management and curation of a diverse collection of filamentous fungi, yeast, and molecular genetic tools for working with the fungi. These practices include techniques for the testing, manipulation, and preservation of individual fungal isolates as well as for processing thousands of isolates in parallel. In addition to providing accurate record keeping, an electronic management system allows the observation of trends in strain distribution and in sample characteristics. Because many ex situ fungal germplasm repositories around the world share similar objectives, best-practice guidelines have been developed by a number of organizations, such as the Organization for Economic Cooperation and Development and the International Society for Biological and Environmental Repositories. These best-practice guidelines provide a framework for the successful operation of collections and promote the development of, and interactions among, biological resource centers around the world.

  9. Preservation Environments

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    2004-01-01

    The long-term preservation of digital entities requires mechanisms to manage the authenticity of massive data collections written to archival storage systems. Preservation environments impose authenticity constraints and manage the evolution of storage system technology by building infrastructure-independent solutions. This seeming paradox, the need for large archives while avoiding dependence upon vendor-specific solutions, is resolved through the use of data grid technology. Data grids provide the storage repository abstractions that make it possible to migrate collections between vendor-specific products while ensuring the authenticity of the archived data, and they provide the software infrastructure that interfaces vendor-specific storage archives to preservation environments.

  10. Object-oriented structures supporting remote sensing databases

    NASA Technical Reports Server (NTRS)

    Wichmann, Keith; Cromp, Robert F.

    1995-01-01

    Object-oriented databases show promise for modeling the complex interrelationships pervasive in scientific domains. To examine the utility of this approach, we have developed an Intelligent Information Fusion System based on this technology, and applied it to the problem of managing an active repository of remotely-sensed satellite scenes. The design and implementation of the system is compared and contrasted with conventional relational database techniques, followed by a presentation of the underlying object-oriented data structures used to enable fast indexing into the data holdings.

  11. Development of anomaly detection models for deep subsurface monitoring

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.

    2017-12-01

    Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring them for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or, for that matter, the system's nominal state), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration site were used, including both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, a separate online anomaly detection algorithm was developed using the baseline (no-leak) experiment data and then tested on the leak experiment data, and the performance of a number of different online algorithms was compared. The results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.
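
    The scheme described above, fitting a nominal model on baseline data and flagging deviations in the test data, can be sketched with a minimal baseline z-score detector. The class name, threshold, and synthetic "pressure" values are illustrative assumptions; the study's actual algorithms are not specified here.

```python
import numpy as np

class ZScoreDetector:
    """Flag samples that deviate too far from a learned nominal band."""

    def __init__(self, threshold=4.0):
        self.threshold = threshold

    def fit(self, baseline):
        # Learn the nominal operating band from leak-free data.
        self.mean = float(np.mean(baseline))
        self.std = float(np.std(baseline))

    def detect(self, sample):
        # Anomalous if the z-score exceeds the threshold.
        return abs(sample - self.mean) / self.std > self.threshold

rng = np.random.default_rng(1)
baseline = 10.0 + 0.1 * rng.standard_normal(500)   # synthetic nominal pressure
detector = ZScoreDetector()
detector.fit(baseline)

flag_nominal = detector.detect(10.05)   # within the nominal band
flag_leak = detector.detect(11.0)       # large pressure excursion
```

    A contextual version, as the study's results motivate, would additionally condition the nominal band on covariates such as injection rate, which this single-variable sketch omits.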

  12. Strike Up the Score: Deriving Searchable and Playable Digital Formats from Sheet Music; Smart Objects and Open Archives; Building the Archives of the Future: Advances in Preserving Electronic Records at the National Archives and Records Administration; From the Digitized to the Digital Library.

    ERIC Educational Resources Information Center

    Choudhury, G. Sayeed; DiLauro, Tim; Droettboom, Michael; Fujinaga, Ichiro; MacMillan, Karl; Nelson, Michael L.; Maly, Kurt; Thibodeau, Kenneth; Thaller, Manfred

    2001-01-01

    These articles describe the experiences of the Johns Hopkins University library in digitizing their collection of sheet music; motivation for buckets, Smart Object, Dumb Archive (SODA) and the Open Archives Initiative (OAI), and initial experiences using them in digital library (DL) testbeds; requirements for archival institutions, the National…

  13. Arc-An OAI Service Provider for Digital Library Federation; Kepler-An OAI Data/Service Provider for the Individual; Information Objects and Rights Management: A Mediation-Based Approach to DRM Interoperability; Automated Name Authority Control and Enhanced Searching in the Levy Collection; Renardus Project Developments and the Wider Digital Library Context.

    ERIC Educational Resources Information Center

    Liu, Xiaoming; Maly, Kurt; Zubair, Mohammad; Nelson, Michael L.; Erickson, John S.; DiLauro, Tim; Choudhury, G. Sayeed; Patton, Mark; Warner, James W.; Brown, Elizabeth W.; Heery, Rachel; Carpenter, Leona; Day, Michael

    2001-01-01

    Includes five articles that discuss the OAI (Open Archive Initiative), an interface between data providers and service providers; information objects and digital rights management interoperability; digitizing library collections, including automated name authority control, metadata, and text searching engines; and building digital library services…

  14. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  15. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  16. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  17. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  18. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  19. GENESI-DR Portal: a scientific gateway to distributed repositories

    NASA Astrophysics Data System (ADS)

    Goncalves, Pedro; Brito, Fabrice; D'Andria, Fabio; Cossu, Roberto; Fusco, Luigi

    2010-05-01

    GENESI-DR (Ground European Network for Earth Science Interoperations - Digital Repositories) is a European Commission (EC)-funded project, kicked off in early 2008 and led by ESA; partners include space agencies (DLR, ASI, CNES), space and non-space data providers such as ENEA (I), Infoterra (UK), K-SAT (N), NILU (N), and JRC (EU), and industry partners such as Elsag Datamat (I), CS (F), and TERRADUE (I). GENESI-DR intends to meet the challenge of shortening the "time to science" for different Earth Science disciplines in the discovery, access, and use (combining, integrating, processing, …) of historical and recent Earth-related data from space, airborne, and in-situ sensors, which are archived in large distributed repositories. "Discovering" which data are available on a "geospatial web" is one of the main challenges Earth Science researchers face today. Some well-known data sets are referred to in many places and available from many sources; for core information with a common purpose, many copies are distributed, e.g., VMap0, Landsat, and SRTM. Other data sets in low or local demand may only be found in a few places and niche communities. Relevant services, results of analysis, applications, and tools are accessible in a very scattered and uncoordinated way, often through individual initiatives from Earth Observation mission operators, scientific institutes dealing with ground measurements, service companies, or data catalogues. In the discourse of Spatial Data Infrastructures, there are "catalogue services" (directories containing information on where spatial data and services can be found) and, for metadata "records" describing spatial data and services, there are "registries". The geospatial industry coins its own specifications for search interfaces, where it might do better to reach out to other information retrieval and Internet communities. 
These considerations are the basis for the GENESI-DR scientific portal, which adopts a simple model allowing the geospatial classification and discovery of information as a loosely connected federation of nodes. This network, however, had to be resilient to node failures and able to scale with the growing addition of new information about data and services. The GENESI-DR scientific portal is still evolving as the project deploys the different components among the different partners, but the aim is to provide the connection to information, establish access rights, access the information, and in some cases apply algorithms using the computing power available on the infrastructure, all through simple interfaces. As information is discovered in the network, it can be further exploited, filtered, or enhanced according to the user's goals. To implement this vision, two specialized graphical interfaces were designed for the portal. The first concentrates on text-based search of information, while the second provides command and control of submission and order status in a distributed processing environment. The text search uses natural language features to extract the spatial and temporal components from the user query; these are propagated to the nodes by mapping them to OpenSearch extensions, and the results are returned to the user as an aggregated list of resources. These can be either access points to dataset series or services that can be further analysed and processed. At this stage, the user is presented with dedicated interfaces that correspond to the context of the action being performed; be it bulk data download, data processing, or data mining, the different services offer specialized interfaces that are integrated into the portal. Overall, the GENESI-DR project identifies best practices and a supporting context for the use of a minimal abstract model to loosely connect a federation of Digital Repositories. 
The apparent lack of cost-effectiveness of the Spatial Data Infrastructures effort in developing "catalogue services" is overcome by trimming the use cases to the most common and relevant. The GENESI-DR scientific portal is, as such, the visible front-end of a dedicated infrastructure providing transparent access to information and allowing Earth Science communities to easily and quickly derive objective information and share knowledge across all environmentally sensitive domains.
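
    The query-mapping step described in this record, binding extracted spatial and temporal components to OpenSearch extension parameters before propagating the query to federation nodes, can be sketched as below. The endpoint URL and parameter handling are simplified assumptions; `geo:box`, `time:start`, and `time:end` follow the OpenSearch Geo and Time extensions, but the actual GENESI-DR node templates are not reproduced here.

```python
from urllib.parse import urlencode

def build_opensearch_url(endpoint, terms, bbox=None, start=None, end=None):
    """Bind free-text terms plus extracted spatial/temporal components
    to OpenSearch Geo/Time extension parameters on a node endpoint."""
    params = {"q": " ".join(terms)}
    if bbox:
        # OpenSearch Geo extension bounding box: west,south,east,north
        params["geo:box"] = ",".join(f"{v:g}" for v in bbox)
    if start:
        params["time:start"] = start
    if end:
        params["time:end"] = end
    return endpoint + "?" + urlencode(params)

url = build_opensearch_url(
    "https://example.org/search",              # hypothetical node endpoint
    ["aerosol", "optical", "depth"],
    bbox=(-10.0, 35.0, 30.0, 60.0),            # rough Europe bounding box
    start="2008-01-01", end="2008-12-31",
)
```

    The portal would issue such a URL to every node in the federation and aggregate the returned result lists for the user.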

  20. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…

  1. Watermarking 3D Objects for Verification

    DTIC Science & Technology

    1999-01-01

    signal (audio/image/video) processing and steganography fields, and even newer to the computer graphics community. Inherently, digital watermarking of…quality images, and digital video. The field of digital watermarking is relatively new, and many of its terms have not been well defined. Among the different media types, watermarking of 2D still images is comparatively better studied. Inherently, digital watermarking of 3D objects remains a

  2. Persistent Identifiers for Data Products: Adoption, Enhancement, and Use

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Schumacher, J.; Scialdone, J.; Hansen, M.

    2016-12-01

    Persistent identifiers offer value for science and for various science community stakeholders, such as data producers, data distributors, science article authors, scientific journal publishers, research sponsors, libraries, and affiliated institutions. However, to attain the benefits of persistent identifiers, they should be assigned to disseminated data products and included in the references of publications that describe the studies in which the data were used. Scientific data centers, archives, digital repositories, and other data publishers also need to determine the level of aggregation, or granularity, at which data products are assigned persistent identifiers, as well as the elements to be included in the landing pages to which the identifiers resolve. Similarly, policies and procedures should be clear on maintenance decisions, including the versioning of data products and how persistent identifiers to previous versions and new locations will be maintained. With persistent identifiers such as Digital Object Identifiers (DOIs), which provide capabilities to link to related identifiers of other works, decisions on the establishment of links must also be clear, including links between early and subsequent versions of data products, links between data products and associated documentation, and links between data products and other publications that describe the data. We describe decisions that enabled the adoption and assignment of DOIs as persistent identifiers for data products disseminated by the NASA Socioeconomic Data and Applications Center (SEDAC), along with considerations for policy decisions, testing, implementation, and enhancement. The prevalence of DOIs in citations of Earth science data disseminated by SEDAC is also described, providing insight into how interdisciplinary data users have engaged in the use of DOIs in their publications, along with the implications of such use.

  3. Interpretation of medical imaging data with a mobile application: a mobile digital imaging processing environment.

    PubMed

    Lin, Meng Kuan; Nicolini, Oliver; Waxenegger, Harald; Galloway, Graham J; Ullmann, Jeremy F P; Janke, Andrew L

    2013-01-01

    Digital Imaging Processing (DIP) requires data extraction and output from a visualization tool to be consistent. Data handling and transmission between the server and a user is a systematic process in service interpretation. The use of integrated medical services for the management and viewing of imaging data, in combination with a mobile visualization tool, can greatly facilitate data analysis and interpretation. This paper presents an integrated mobile application and DIP service, called M-DIP. The objectives of the system are to (1) automate the direct tiling, conversion, and pre-tiling of brain images from Medical Imaging NetCDF (MINC) and Neuroimaging Informatics Technology Initiative (NIfTI) to RAW formats; (2) speed up the querying of imaging measurements; and (3) display images at high levels of detail in three dimensions using real-world coordinates. In addition, M-DIP works on a mobile or tablet device without any software installation, using web-based protocols. M-DIP implements a three-level architecture with a relational middle-layer database, a stand-alone DIP server, and a mobile application logic layer that supports direct querying and communication. The software can display biological imaging data at multiple zoom levels and at a quality that meets users' expectations. Interpretation of bioimaging data is facilitated by an interface analogous to online mapping services, using real-world coordinate browsing; this allows mobile devices to display multiple datasets simultaneously from a remote site. M-DIP can be used as a measurement repository accessible from any network environment, such as a portable mobile or tablet device. In addition, this system, in combination with mobile applications, establishes a visualization tool in the neuroinformatics field to speed interpretation services.
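
    The pre-tiling idea in this record, cutting an image into fixed-size tiles at successive zoom levels so a client fetches only what is in view, as online map services do, can be sketched as follows. The tile size, downsampling scheme, and synthetic image are assumptions for illustration, not M-DIP's actual implementation.

```python
import numpy as np

def build_pyramid(image, tile=64):
    """Build a map-style tile pyramid: each level is the previous one
    downsampled 2x, cut into fixed-size tiles keyed by (row, col)."""
    levels = []
    current = image
    while min(current.shape) >= tile:
        tiles = {
            (i // tile, j // tile): current[i:i + tile, j:j + tile]
            for i in range(0, current.shape[0], tile)
            for j in range(0, current.shape[1], tile)
        }
        levels.append(tiles)
        current = current[::2, ::2]   # next (coarser) zoom level
    return levels

image = np.arange(256 * 256, dtype=float).reshape(256, 256)
pyramid = build_pyramid(image)   # 3 levels: 16, 4, and 1 tile(s)
```

    A viewer then maps a requested real-world region and zoom level to a handful of `(row, col)` keys and transmits only those tiles.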

  4. Interpretation of Medical Imaging Data with a Mobile Application: A Mobile Digital Imaging Processing Environment

    PubMed Central

    Lin, Meng Kuan; Nicolini, Oliver; Waxenegger, Harald; Galloway, Graham J.; Ullmann, Jeremy F. P.; Janke, Andrew L.

    2013-01-01

    Digital Imaging Processing (DIP) requires data extraction and output from a visualization tool to be consistent. Data handling and transmission between the server and a user is a systematic process in service interpretation. The use of integrated medical services for the management and viewing of imaging data, in combination with a mobile visualization tool, can greatly facilitate data analysis and interpretation. This paper presents an integrated mobile application and DIP service, called M-DIP. The objectives of the system are to (1) automate the direct tiling, conversion, and pre-tiling of brain images from Medical Imaging NetCDF (MINC) and Neuroimaging Informatics Technology Initiative (NIfTI) to RAW formats; (2) speed up the querying of imaging measurements; and (3) display images at high levels of detail in three dimensions using real-world coordinates. In addition, M-DIP works on a mobile or tablet device without any software installation, using web-based protocols. M-DIP implements a three-level architecture with a relational middle-layer database, a stand-alone DIP server, and a mobile application logic layer that supports direct querying and communication. The software can display biological imaging data at multiple zoom levels and at a quality that meets users' expectations. Interpretation of bioimaging data is facilitated by an interface analogous to online mapping services, using real-world coordinate browsing; this allows mobile devices to display multiple datasets simultaneously from a remote site. M-DIP can be used as a measurement repository accessible from any network environment, such as a portable mobile or tablet device. In addition, this system, in combination with mobile applications, establishes a visualization tool in the neuroinformatics field to speed interpretation services. PMID:23847587

  5. A robust, high-throughput method for computing maize ear, cob, and kernel attributes automatically from images.

    PubMed

    Miller, Nathan D; Haase, Nicholas J; Lee, Jonghyun; Kaeppler, Shawn M; de Leon, Natalia; Spalding, Edgar P

    2017-01-01

    Grain yield of the maize plant depends on the sizes, shapes, and numbers of ears and the kernels they bear. An automated pipeline that can measure these components of yield from easily-obtained digital images is needed to advance our understanding of this globally important crop. Here we present three custom algorithms designed to compute such yield components automatically from digital images acquired by a low-cost platform. One algorithm determines the average space each kernel occupies along the cob axis using a sliding-window Fourier transform analysis of image intensity features. A second counts individual kernels removed from ears, including those in clusters. A third measures each kernel's major and minor axis after a Bayesian analysis of contour points identifies the kernel tip. Dimensionless ear and kernel shape traits that may interrelate yield components are measured by principal components analysis of contour point sets. Increased objectivity and speed compared to typical manual methods are achieved without loss of accuracy as evidenced by high correlations with ground truth measurements and simulated data. Millimeter-scale differences among ear, cob, and kernel traits that ranged more than 2.5-fold across a diverse group of inbred maize lines were resolved. This system for measuring maize ear, cob, and kernel attributes is being used by multiple research groups as an automated Web service running on community high-throughput computing and distributed data storage infrastructure. Users may create their own workflow using the source code that is staged for download on a public repository. © 2016 The Authors. The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
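
    The first algorithm described above rests on a simple signal-processing idea: the average space each kernel occupies along the cob axis appears as the dominant spatial frequency of the image intensity profile. The toy below recovers that period with one global FFT on a synthetic profile; the published pipeline uses a sliding-window Fourier analysis of real image intensity features, which this sketch does not reproduce.

```python
import numpy as np

def dominant_period(profile):
    """Estimate the dominant spatial period (pixels) of a 1D profile."""
    spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
    freq = np.fft.rfftfreq(len(profile))      # cycles per pixel
    k = np.argmax(spectrum[1:]) + 1           # skip the DC bin
    return 1.0 / freq[k]                      # pixels per kernel row

pixels = np.arange(1024)
profile = np.sin(2 * np.pi * pixels / 16.0)   # synthetic: kernels every 16 px
period = dominant_period(profile)
```

    Dividing cob length by the estimated period then gives a kernel-row count, one of the yield components the pipeline measures.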

  6. Deep Boreholes Seals Subjected to High P,T conditions - Proposed Experimental Studies

    NASA Astrophysics Data System (ADS)

    Caporuscio, F.

    2015-12-01

    Deep borehole experimental work will constrain the P,T conditions that "seal" material will experience in deep borehole crystalline rock repositories. The rocks of interest to this study include mafic (amphibolite) and silicic (granitic gneiss) end members. The experiments will systematically add components to capture discrete changes in both water and EBS component chemistries. Experiments in the system wall rock-clay-concrete-groundwater will evaluate interactions among components, including mineral phase stability, metal corrosion rates, and thermal limits. Based on engineered barrier studies, the experimental investigations will move forward with three foci. First, evaluation of the interaction between "seal" materials and repository wall rock (crystalline) under fluid-saturated conditions in long-term (i.e., six-month) experiments that reproduce the thermal pulse event of a repository. Second, experiments to determine the stability of zeolite minerals (analcime-wairakite) under repository conditions. Both sets of experiments are critically important for understanding the mineral paragenesis (zeolite and/or clay transformations) associated with "seals" in contact with wall rock at elevated temperatures. Third, mineral growth at the metal interface, a principal control on the survivability (i.e., corrosion) of waste canisters in a repository. The objective of this planned experimental work is to evaluate physico-chemical processes for "seal" components and materials relevant to deep borehole disposal. These evaluations will encompass multi-laboratory efforts to develop seal concepts and to apply Thermal-Mechanical-Chemical (TMC) modeling to assess barrier material interactions with subsurface fluids and other barrier materials, their stability at high temperatures, and the implications of these processes for the evaluation of thermal limits.

  7. Studies of Global Solar Magnetic Field Patterns Using a Newly Digitized Archive

    NASA Astrophysics Data System (ADS)

    Hewins, I.; Webb, D. F.; Gibson, S. E.; McFadden, R.; Emery, B. A.; Malanushenko, A. V.

    2017-12-01

    The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly Hα, He 10830 Å, and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced polarity inversion lines (PILs), filaments, sunspots, and plage and, later, coronal holes, yielding a unique 45-year record of features associated with the large-scale organization of the solar magnetic field. We discuss our efforts to preserve and digitize this archive: the original hand-drawn maps have been scanned, a method for processing these scans into a digital, searchable format has been developed, and a website and an archival repository at NOAA's National Centers for Environmental Information (NCEI) have been created. The archive is complete for solar cycle (SC) 23 and partially complete for SCs 21 and 22. In this paper we show examples of how the database can be used for scientific applications: we compare the evolution of the areas and boundaries of coronal holes with other recent results, and we use the maps to track the global, solar-cycle evolution of filaments, large-scale positive and negative polarity regions, PILs, and sunspots.

  8. Tomato Expression Database (TED): a suite of data presentation and analysis tools

    PubMed Central

    Fei, Zhangjun; Tang, Xuemei; Alba, Rob; Giovannoni, James

    2006-01-01

    The Tomato Expression Database (TED) includes three integrated components. The Tomato Microarray Data Warehouse serves as a central repository for raw gene expression data derived from the public tomato cDNA microarray. In addition to expression data, TED stores experimental design and array information in compliance with the MIAME guidelines and provides web interfaces for researchers to retrieve data for their own analysis and use. The Tomato Microarray Expression Database contains normalized and processed microarray data for ten time points, with nine pair-wise comparisons, during fruit development and ripening in a normal tomato variety and nearly isogenic single-gene mutants impacting fruit development and ripening. Finally, the Tomato Digital Expression Database contains raw and normalized digital expression (EST abundance) data derived from analysis of the complete public tomato EST collection containing >150,000 ESTs from 27 different non-normalized EST libraries. This last component also includes tools for the comparison of tomato and Arabidopsis digital expression data. A set of query interfaces and analysis and visualization tools has been developed and incorporated into TED to aid users in identifying and deciphering biologically important information from our datasets. TED can be accessed at . PMID:16381976

  9. Tomato Expression Database (TED): a suite of data presentation and analysis tools.

    PubMed

    Fei, Zhangjun; Tang, Xuemei; Alba, Rob; Giovannoni, James

    2006-01-01

    The Tomato Expression Database (TED) includes three integrated components. The Tomato Microarray Data Warehouse serves as a central repository for raw gene expression data derived from the public tomato cDNA microarray. In addition to expression data, TED stores experimental design and array information in compliance with the MIAME guidelines and provides web interfaces for researchers to retrieve data for their own analysis and use. The Tomato Microarray Expression Database contains normalized and processed microarray data for ten time points, with nine pair-wise comparisons, during fruit development and ripening in a normal tomato variety and nearly isogenic single-gene mutants impacting fruit development and ripening. Finally, the Tomato Digital Expression Database contains raw and normalized digital expression (EST abundance) data derived from analysis of the complete public tomato EST collection containing >150,000 ESTs derived from 27 different non-normalized EST libraries. This last component also includes tools for the comparison of tomato and Arabidopsis digital expression data. A set of query interfaces and analysis and visualization tools has been developed and incorporated into TED to aid users in identifying and deciphering biologically important information from our datasets. TED can be accessed at http://ted.bti.cornell.edu.

  10. Data Stewardship throughout the Ocean Research Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Chandler, Cynthia; Groman, Robert; Allison, Molly; Wiebe, Peter; Glover, David

    2013-04-01

    The Biological and Chemical Oceanography Data Management Office (BCO-DMO) works in partnership with ocean science investigators to publish data from research projects funded by the Biological and Chemical Oceanography Sections and the Office of Polar Programs Antarctic Organisms & Ecosystems Program (OPP ANT) at the U.S. National Science Foundation. Since 2006, researchers have been contributing data to the BCO-DMO data system, and it has developed into a rich repository of data from ocean, coastal and Great Lakes research programs. The end goals of BCO-DMO are to ensure preservation of NSF-funded project data and to provide open access to those data; those goals are achieved through successful completion of a series of related phases. BCO-DMO has developed an end-to-end data stewardship process that includes all phases of the data life cycle: (1) providing data management advice to investigators during the proposal-writing stage; (2) registering their funded projects at BCO-DMO; (3) adding data and supporting documentation to the BCO-DMO data repository; (4) providing geospatial and text-based data access systems that support data discovery, access, display, assessment, integration, and export of data resources; (5) exploring mechanisms for exchange of data with complementary repositories; (6) publishing data sets to provide the peer-reviewed literature with citable references (Digital Object Identifiers) and to encourage proper citation and attribution of data sets in the future; and (7) submitting final data sets for preservation in the appropriate long-term data archive. Strategic development of collaborative partnerships with complementary data management organizations is essential to sustainable coverage of the full data life cycle, from research proposal through preservation of the final data products. 
Development and incorporation of controlled vocabularies, domain-specific ontologies and globally unique, persistent identifiers to unambiguously identify resources of interest curated by and available from BCO-DMO have significantly enabled progress toward interoperability with partner systems. Several important components have emerged from early collaborative relationships: (1) identifying a trusted authoritative source of complementary content and the appropriate contact; (2) determining the globally unique, persistent identifier for resources of interest and (3) negotiating the requisite syntactic and semantic exchange systems. An added benefit is the ability to use globally unique, persistent resource identifiers to identify and compare related content in other repositories, thus enabling us to improve the accuracy of content in the BCO-DMO data collection. Results from a recent community discussion at the January 2013 Federation of Earth Science Information Partners (ESIP) meeting will be presented. Mindful of the NSF EarthCube initiative in the United States, the ESIP discussion was an effort to identify commonalities and differences in the way different communities meet the challenges of data stewardship throughout the full data life cycle and to determine any gaps that currently exist. BCO-DMO: http://bco-dmo.org ESIP: http://esipfed.org/

  11. Laboratory E-Notebooks: A Learning Object-Based Repository

    ERIC Educational Resources Information Center

    Abari, Ilior; Pierre, Samuel; Saliah-Hassane, Hamadou

    2006-01-01

    During distributed virtual laboratory experiment sessions, a major problem is to be able to collect, store, manage and share heterogeneous data (intermediate results, analysis, annotations, etc) manipulated simultaneously by geographically distributed teammates composing a virtual team. The electronic notebook is a possible response to this…

  12. Optimal operation management of fuel cell/wind/photovoltaic power sources connected to distribution networks

    NASA Astrophysics Data System (ADS)

    Niknam, Taher; Kavousifard, Abdollah; Tabatabaei, Sajad; Aghaei, Jamshid

    2011-10-01

    In this paper a new multiobjective modified honey bee mating optimization (MHBMO) algorithm is presented to investigate the distribution feeder reconfiguration (DFR) problem considering renewable energy sources (RESs) (photovoltaic, fuel cell, and wind energy) connected to the distribution network. The objective functions to be minimized are the electrical active power losses, the voltage deviations, the total electrical energy costs, and the total emissions of RESs and substations. During the optimization process, the proposed algorithm finds a set of non-dominated (Pareto) optimal solutions, which are stored in an external memory called the repository. Since the objective functions investigated are incommensurable, a fuzzy clustering algorithm is utilized to keep the size of the repository within the specified limits. Moreover, a fuzzy-based decision maker is adopted to select the 'best' compromise solution from among the non-dominated optimal solutions of the multiobjective optimization problem. To demonstrate the feasibility and effectiveness of the proposed algorithm, two standard distribution test systems are used as case studies.
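
The repository maintenance described above, keeping only non-dominated solutions in an external memory, can be sketched as follows. This is a minimal illustration of Pareto-dominance bookkeeping, not the authors' MHBMO implementation; the function names and sample objective vectors are invented:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b:
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_repository(repository, candidate):
    """Offer a candidate objective vector to the external repository,
    keeping the repository non-dominated and duplicate-free."""
    if candidate in repository or any(dominates(m, candidate) for m in repository):
        return repository  # duplicate or dominated: repository unchanged
    # drop any members the candidate dominates, then add it
    kept = [m for m in repository if not dominates(candidate, m)]
    kept.append(candidate)
    return kept

# Hypothetical objective vectors: (losses, voltage deviation, cost, emissions)
repo = []
for sol in [(4, 2, 7, 1), (3, 3, 6, 2), (5, 1, 8, 1), (3, 3, 6, 2), (2, 2, 5, 1)]:
    repo = update_repository(repo, sol)
```

In a full implementation, a fuzzy clustering step would then prune the repository whenever it grows beyond the specified size limit.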

  13. NELS 2.0 - A general system for enterprise wide information management

    NASA Technical Reports Server (NTRS)

    Smith, Stephanie L.

    1993-01-01

    NELS, the NASA Electronic Library System, is an information management tool for creating distributed repositories of documents, drawings, and code for use and reuse by the aerospace community. The NELS retrieval engine can load metadata and source files of full-text objects, perform natural language queries to retrieve ranked objects, and create links to connect user interfaces. For flexibility, the NELS architecture has layered interfaces between the application program and the stored library information. The session manager provides the interface functions for development of NELS applications. The data manager is an interface between the session manager and the structured data system. At the center of the structured data system is the Wide Area Information Server. This architecture provides access to information across heterogeneous platforms in a distributed environment. There are presently three user interfaces that connect to the NELS engine: an X Window System interface, an ASCII interface, and the Spatial Data Management System. This paper describes the design and operation of NELS as an information management tool and repository.

  14. Digital fabrication of multi-material biomedical objects.

    PubMed

    Cheung, H H; Choi, S H

    2009-12-01

    This paper describes a multi-material virtual prototyping (MMVP) system for modelling and digital fabrication of discrete and functionally graded multi-material objects for biomedical applications. The MMVP system consists of a DMMVP module, an FGMVP module and a virtual reality (VR) simulation module. The DMMVP module is used to model discrete multi-material (DMM) objects, while the FGMVP module is for functionally graded multi-material (FGM) objects. The VR simulation module integrates these two modules to perform digital fabrication of multi-material objects, which can be subsequently visualized and analysed in a virtual environment to optimize MMLM processes for fabrication of product prototypes. Using the MMVP system, two biomedical objects, including a DMM human spine and an FGM intervertebral disc spacer are modelled and digitally fabricated for visualization and analysis in a VR environment. These studies show that the MMVP system is a practical tool for modelling, visualization, and subsequent fabrication of biomedical objects of discrete and functionally graded multi-materials for biomedical applications. The system may be adapted to control MMLM machines with appropriate hardware for physical fabrication of biomedical objects.

  15. Storytelling in the digital world: achieving higher-level learning objectives.

    PubMed

    Schwartz, Melissa R

    2012-01-01

    Nursing students are not passive media consumers but instead live in a technology ecosystem where digital is the language they speak. To prepare the next generation of nurses, educators must incorporate multiple technologies to improve higher-order learning. The author discusses the evolution and use of storytelling as part of the digital world and how digital stories can be aligned with Bloom's Taxonomy so that students achieve higher-level learning objectives.

  16. A Prototype Semantic Web-Based Digital Content Exchange for Schools in Singapore

    ERIC Educational Resources Information Center

    Shabajee, Paul; McBride, Brian; Steer, Damian; Reynolds, Dave

    2006-01-01

    Singapore has many large and educationally valuable digital collections and is planning the development of many more. These digital collections contain historical, cultural and scientific multimedia objects, along with learning objects. At present, school teachers and pupils find it hard to locate many of these resources using traditional search…

  17. Object-based neglect in number processing

    PubMed Central

    2013-01-01

    Recent evidence suggests that neglect patients seem to have particular problems representing relatively smaller numbers corresponding to the left part of the mental number line. However, while this indicates space-based neglect for representational number space, little is known about whether and, if so, how object-based neglect influences number processing. To evaluate influences of object-based neglect in numerical cognition, a group of neglect patients and two control groups had to compare two-digit numbers to an internally represented standard. Conceptualizing two-digit numbers as objects of which the left part (i.e., the tens digit) should be specifically neglected, we were able to evaluate object-based neglect for number magnitude processing. Object-based neglect was indicated by a larger unit-decade compatibility effect, actually reflecting impaired processing of the leftward tens digits. Additionally, faster processing of within- as compared to between-decade items provided further evidence suggesting particular difficulties in integrating tens and units into the place-value structure of the Arabic number system. In summary, the present study indicates that, in addition to the spatial representation of number magnitude, the processing of place-value information of multi-digit numbers also seems specifically impaired in neglect patients. PMID:23343126

  18. Prevention of data duplication for high throughput sequencing repositories

    PubMed Central

    Gabdank, Idan; Chan, Esther T; Davidson, Jean M; Hilton, Jason A; Davis, Carrie A; Baymuradov, Ulugbek K; Narayanan, Aditi; Onate, Kathrina C; Graham, Keenan; Miyasato, Stuart R; Dreszer, Timothy R; Strattan, J Seth; Jolanki, Otto; Tanaka, Forrest Y; Hitz, Benjamin C

    2018-01-01

    Abstract Prevention of unintended duplication is one of the ongoing challenges many databases have to address. For high-throughput sequencing data, the complexity of that challenge increases with the complexity of the definition of a duplicate. In a computational data model, a data object represents a real entity such as a reagent or a biosample, much as a card represents a book in a paper library catalog. Duplicated data objects not only waste storage, they can mislead users into assuming the model represents more than the single entity. Even when it is clear that two objects represent a single entity, duplication opens the door to potential inconsistencies, since the content of the duplicated objects can be updated independently, allowing the metadata associated with the objects to diverge. This is analogous to a paper library catalog that mistakenly contains two cards for a single copy of a book: if the cards simultaneously list two different individuals as the current borrower, it is difficult to determine which of the two actually has the book. Unfortunately, in a large database with multiple submitters, unintended duplication is to be expected. In this article, we present three principal guidelines the Encyclopedia of DNA Elements (ENCODE) Portal follows in order to prevent unintended duplication of both actual files and data objects: definition of identifiable data objects (I), object uniqueness validation (II) and a de-duplication mechanism (III). In addition to explaining our modus operandi, we elaborate on the methods used for identification of sequencing data files. A comparison of the approach taken by the ENCODE Portal with other widely used biological data repositories is provided. Database URL: https://www.encodeproject.org/ PMID:29688363
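
Uniqueness validation for actual files of the kind described here typically rests on content digests: two files with the same digest are flagged as candidate duplicates regardless of their names or submitters. A generic sketch (not the ENCODE Portal's validation code; the function names are invented):

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its MD5 hex digest."""
    h = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(paths):
    """Group file paths by content digest; return only groups with
    more than one member, i.e. candidate duplicate files."""
    by_digest = {}
    for p in paths:
        by_digest.setdefault(file_digest(p), []).append(p)
    return {d: ps for d, ps in by_digest.items() if len(ps) > 1}
```

Streaming the digest keeps memory use constant, which matters for multi-gigabyte sequencing files.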

  19. Towards the use of computationally inserted lesions for mammographic CAD assessment

    NASA Astrophysics Data System (ADS)

    Ghanian, Zahra; Pezeshk, Aria; Petrick, Nicholas; Sahiner, Berkman

    2018-03-01

    Computer-aided detection (CADe) devices used for breast cancer detection on mammograms are typically first developed and assessed for a specific "original" acquisition system, e.g., a specific image detector. When CADe developers are ready to apply their CADe device to a new mammographic acquisition system, they typically assess the CADe device with images acquired using the new system. Collecting large repositories of clinical images containing verified cancer locations and acquired by the new image acquisition system is costly and time-consuming. Our goal is to develop a methodology to reduce the clinical data burden in the assessment of a CADe device for use with a different image acquisition system. We are developing an image blending technique that allows users to seamlessly insert lesions imaged using an original acquisition system into normal images or regions acquired with a new system. In this study, we investigated the insertion of microcalcification clusters imaged using an original acquisition system into normal images acquired with that same system, utilizing our previously developed image blending technique. We first performed a reader study to assess whether experienced observers could distinguish between computationally inserted and native clusters. For this purpose, we applied our insertion technique to clinical cases taken from the University of South Florida Digital Database for Screening Mammography (DDSM) and the Breast Cancer Digital Repository (BCDR). Regions of interest containing microcalcification clusters from one breast of a patient were inserted into the contralateral breast of the same patient. The reader study included 55 native clusters and their 55 inserted counterparts. Analysis of the reader ratings using receiver operating characteristic (ROC) methodology indicated that inserted clusters cannot be reliably distinguished from native clusters (area under the ROC curve, AUC=0.58±0.04). 
Furthermore, CADe sensitivity was evaluated on mammograms with native and inserted microcalcification clusters using a commercial CADe system. For this purpose, we used full field digital mammograms (FFDMs) from 68 clinical cases, acquired at the University of Michigan Health System. The average sensitivities for native and inserted clusters were equal, 85.3% (58/68). These results demonstrate the feasibility of using the inserted microcalcification clusters for assessing mammographic CAD devices.
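
The reader-study AUC can be obtained directly from the ratings via the Mann-Whitney statistic: the probability that a randomly chosen native cluster is rated higher than a randomly chosen inserted one, with ties counted as one half. A minimal sketch with made-up ratings (not the study's data):

```python
def auc_from_ratings(native, inserted):
    """Area under the ROC curve via the Mann-Whitney statistic:
    P(rating of a native cluster > rating of an inserted one),
    counting ties as 0.5."""
    wins = sum((n > i) + 0.5 * (n == i) for n in native for i in inserted)
    return wins / (len(native) * len(inserted))

# Hypothetical 5-point confidence ratings; an AUC near 0.5 means the
# two classes cannot be told apart, as reported for inserted clusters.
auc = auc_from_ratings([3, 2, 4, 3], [3, 2, 3, 4])
```

For confidence intervals on the AUC (the ±0.04 above), a full analysis would use an ROC-fitting package rather than this point estimate alone.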

  20. Effect of sway on image fidelity in whole-body digitizing

    NASA Astrophysics Data System (ADS)

    Corner, Brian D.; Hu, Anmin

    1998-03-01

    For 3D digitizers to be useful data collection tools in scientific and human factors engineering applications, the models created from scan data must match the original object very closely. Factors such as ambient light, characteristics of the object's surface, and object movement, among others, can affect the quality of the image produced by any 3D digitizing system. Recently, Cyberware has developed a whole-body digitizer for collecting data on human size and shape. With a digitizing time of about 15 seconds, the effect of subject movement, or sway, on model fidelity is an important issue to be addressed. The effect of sway is best measured by comparing the dimensions of an object of known geometry to the model of the same object captured by the digitizer. Since it is difficult to know the geometry of a human body accurately, it was decided to compare an object of simple geometry to its digitized counterpart. Preliminary analysis showed that a single cardboard tube would provide the best artifact for detecting sway. A tube was attached to the subjects using supports that allowed the cylinder to stand away from the body; the stand-off was necessary to minimize occluded areas. Multiple scans were taken of one subject, and the cylinder was extracted from the images. Comparison of the actual cylinder dimensions to those extracted from the whole-body images found the effect of sway to be minimal. This follows earlier findings that anthropometric dimensions extracted from whole-body scans are very close to the same dimensions measured using standard manual methods. Recommendations for subject preparation and stabilization are discussed.

  1. Building Specialized Multilingual Lexical Graphs Using Community Resources

    NASA Astrophysics Data System (ADS)

    Daoud, Mohammad; Boitet, Christian; Kageura, Kyo; Kitamoto, Asanobu; Mangeot, Mathieu; Daoud, Daoud

    We describe methods for compiling domain-dedicated multilingual terminological data from various resources. We focus on collecting data from online community users as the main source; our approach therefore depends both on acquiring contributions from volunteers (the explicit approach) and on analyzing users' behavior to extract interesting patterns and facts (the implicit approach). As a generic repository that can handle the collected multilingual terminological data, we describe the concept of dedicated Multilingual Preterminological Graphs (MPGs) and some automatic approaches for constructing them by analyzing the behavior of online community users. A Multilingual Preterminological Graph is a special lexical resource that contains a massive number of terms related to a particular domain. We call it preterminological because it is raw material that can be used to build a standardized terminological repository. Building such a graph is difficult using traditional approaches, as it requires huge efforts by domain specialists and terminologists. In our approach, we build the graph by analyzing the access log files of the community's website, finding the important terms that have been used to search that website and their associations with each other. We aim to make this graph a seed repository to which multilingual volunteers can contribute. We are experimenting with this approach on the Digital Silk Road Project. We have used its access log files since its beginning in 2003 and obtained an initial graph of around 116,000 terms. As an application, we used this graph to build a preterminological multilingual database that serves a CLIR system for the DSR project.
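
The core of the implicit approach, turning logged search queries into a weighted term co-occurrence graph, can be sketched as follows. This is a toy illustration with invented queries, not the DSR project's code:

```python
import re
from collections import Counter
from itertools import combinations

def build_term_graph(queries):
    """Build a weighted co-occurrence graph from search queries:
    nodes count how often each term was searched, and edge weights
    count how often two terms appeared in the same query."""
    nodes = Counter()
    edges = Counter()
    for q in queries:
        # lowercase, extract alphabetic terms, dedupe within a query
        terms = sorted(set(re.findall(r"[a-z]+", q.lower())))
        nodes.update(terms)
        edges.update(combinations(terms, 2))  # sorted pairs as edge keys
    return nodes, edges

# Hypothetical queries extracted from a website access log
queries = ["silk road caravan", "Silk Road map", "caravan route map"]
nodes, edges = build_term_graph(queries)
```

In a real pipeline, the queries would first be parsed out of the access log entries, and the resulting graph would then be seeded with volunteer-contributed translations for each node.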

  2. 3D WebGIS and Visualization Issues for Architectures and Large Sites

    NASA Astrophysics Data System (ADS)

    De Amicis, R.; Conti, G.; Girardi, G.; Andreolli, M.

    2011-09-01

    Traditionally, within the field of archaeology and, more generally, within the cultural heritage domain, Geographical Information Systems (GIS) have mostly been used to support cataloguing activities, essentially operating as gateways to large geo-referenced archives of specialised cultural heritage information. Additionally, GIS have proved essential in helping cultural heritage institutions improve the management of their historical information, providing the means to detect otherwise hard-to-discover spatial patterns and supplying the computational tools necessary to perform spatial clustering and proximity and orientation analysis. This paper presents a platform developed to address both of the aforementioned issues by allowing geo-referenced cataloguing of multimedia resources of cultural relevance as well as user-friendly access through an interactive 3D geobrowser that operates as a single point of access to the available digital repositories. The solution was showcased in the context of the "Festival dell'Economia" (Festival of Economics), a major event recently held in Trento, Italy, where it allowed visitors to interactively access an extremely large repository of information, and the associated metadata, available across the area of the Autonomous Province of Trento. Within the event, the repository was made accessible via the network, through web services, from a 3D interactive geobrowser developed by the authors. The 3D scene was enriched with a number of Points of Interest (POIs) linking to information available within various databases. The software package was deployed on a complex hardware set-up composed of a large composite panoramic screen covering a horizontal field of view of 240 degrees.

  3. 42 CFR 37.42 - Chest radiograph specifications-digital radiography systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... resolution, modulation transfer function (MTF), image signal-to-noise and detective quantum efficiency must... Information Object Definitions, sections: Computed Radiography Image Information Object Definition; Digital X...

  4. Linear encoding device

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B. (Inventor)

    1993-01-01

    A Linear Motion Encoding device for measuring the linear motion of a moving object is disclosed in which a light source is mounted on the moving object and a position sensitive detector such as an array photodetector is mounted on a nearby stationary object. The light source emits a light beam directed towards the array photodetector such that a light spot is created on the array. An analog-to-digital converter, connected to the array photodetector, is used for reading the position of the spot on the array photodetector. A microprocessor and memory are connected to the analog-to-digital converter to hold and manipulate the data it provides on the position of the spot and to compute the linear displacement of the moving object based upon that data.
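
The readout such a device relies on, locating the light spot on the array and converting its shift into a displacement, can be illustrated with a simple centroid calculation. This is a hedged sketch with hypothetical pixel intensities and scale factor, not the patented device's firmware:

```python
def spot_centroid(intensities):
    """Sub-pixel centroid (in pixel units) of the light spot
    on a linear photodetector array."""
    total = sum(intensities)
    return sum(i * v for i, v in enumerate(intensities)) / total

def displacement_mm(before, after, mm_per_pixel):
    """Linear displacement of the moving object inferred from the
    shift of the spot centroid between two digitized readouts."""
    return (spot_centroid(after) - spot_centroid(before)) * mm_per_pixel

# Hypothetical A/D readouts before and after the object moves
moved = displacement_mm([0, 1, 2, 1, 0], [0, 0, 1, 2, 1], mm_per_pixel=0.05)
```

The centroid gives sub-pixel resolution, which is why an array detector can resolve displacements finer than its pixel pitch.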

  5. Research Data Management Self-Education for Librarians: A Webliography

    ERIC Educational Resources Information Center

    Goben, Abigail; Raszewski, Rebecca

    2015-01-01

    As data as a scholarly object continues to grow in importance in the research community, librarians are undertaking increasing responsibilities regarding data management and curation. New library initiatives include assisting researchers in finding data sets for reuse; locating and hosting repositories for required archiving; consultations on…

  6. The Speed of Serial Attention Shifts in Visual Search: Evidence from the N2pc Component.

    PubMed

    Grubert, Anna; Eimer, Martin

    2016-02-01

    Finding target objects among distractors in a visual search display is often assumed to be based on sequential movements of attention between different objects. However, the speed of such serial attention shifts is still under dispute. We employed a search task that encouraged the successive allocation of attention to two target objects in the same search display and measured N2pc components to determine how fast attention moved between these objects. Each display contained one digit in a known color (fixed-color target) and another digit whose color changed unpredictably across trials (variable-color target) together with two gray distractor digits. Participants' task was to find the fixed-color digit and compare its numerical value with that of the variable-color digit. N2pc components to fixed-color targets preceded N2pc components to variable-color digits, demonstrating that these two targets were indeed selected in a fixed serial order. The N2pc to variable-color digits emerged approximately 60 msec after the N2pc to fixed-color digits, which shows that attention can be reallocated very rapidly between different target objects in the visual field. When search display durations were increased, thereby relaxing the temporal demands on serial selection, the two N2pc components to fixed-color and variable-color targets were elicited within 90 msec of each other. Results demonstrate that sequential shifts of attention between different target locations can operate very rapidly at speeds that are in line with the assumptions of serial selection models of visual search.

  7. Neural Representations of Sensorimotor Memory- and Digit Position-Based Load Force Adjustments Before the Onset of Dexterous Object Manipulation.

    PubMed

    Marneweck, Michelle; Barany, Deborah A; Santello, Marco; Grafton, Scott T

    2018-05-16

    Anticipatory load forces for dexterous object manipulation in humans are modulated based on visual object property cues, sensorimotor memories of previous experiences with the object, and, when digit positioning varies from trial to trial, the integrating of this sensed variability with force modulation. Studies of the neural representations encoding these anticipatory mechanisms have not considered these mechanisms separately from each other or from feedback mechanisms emerging after lift onset. Here, representational similarity analyses of fMRI data were used to identify neural representations of sensorimotor memories and the sensing and integration of digit position. Cortical activity and movement kinematics were measured as 20 human subjects (11 women) minimized tilt of a symmetrically shaped object with a concealed asymmetric center of mass (CoM, left and right sided). This task required generating compensatory torques in opposite directions, which, without helpful visual CoM cues, relied primarily on sensorimotor memories of the same object and CoM. Digit position was constrained or unconstrained, the latter of which required modulating forces beyond what can be recalled from sensorimotor memories to compensate for digit position variability. Ventral premotor (PMv), somatosensory, and cerebellar lobule regions (CrusII, VIIIa) were sensitive to anticipatory behaviors that reflect sensorimotor memory content, as shown by larger voxel pattern differences for unmatched than matched CoM conditions. Cerebellar lobule I-IV, Broca area 44, and PMv showed greater voxel pattern differences for unconstrained than constrained grasping, which suggests their sensitivity to monitor the online coincidence of planned and actual digit positions and correct for a mismatch by force modulation. SIGNIFICANCE STATEMENT To pick up a water glass without slipping, tipping, or spilling requires anticipatory planning of fingertip load forces before the lift commences. 
    This anticipation relies on object visual properties (e.g., mass/mass distribution), sensorimotor memories built from previous experiences (especially when object properties cannot be inferred visually), and online sensing of where the digits are positioned. There is limited understanding of how the brain represents each of these anticipatory mechanisms. We used fMRI measures of regional brain patterns and digit position kinematics before lift onset of an object with nonsalient visual cues specifically to isolate sensorimotor memories and integration of sensed digit position with force modulation. In doing so, we localized neural representations encoding these anticipatory mechanisms for dexterous object manipulation.

  8. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information.

    PubMed

    Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L

    2013-02-12

    Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a JPEG image of the desired quality. The image is linked to the patient's clinical and treatment information in customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.
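
The splitting step such a tool performs amounts to covering a large image with fixed-size crop boxes, with smaller tiles at the right and bottom edges. The box arithmetic can be sketched in plain Python (a generic illustration, not NDPI-Splitter's Java code):

```python
def tile_boxes(width, height, tile_w, tile_h):
    """Yield (left, upper, right, lower) crop boxes that cover a
    width x height image with tiles of at most tile_w x tile_h pixels;
    tiles on the right and bottom edges may be smaller."""
    for top in range(0, height, tile_h):
        for left in range(0, width, tile_w):
            yield (left, top,
                   min(left + tile_w, width),
                   min(top + tile_h, height))

# A hypothetical 1000 x 700 slide region cut into 512 x 512 tiles
boxes = list(tile_boxes(1000, 700, 512, 512))
```

Each box could then be passed to an imaging library's crop routine and saved as a separate TIFF, optionally skipping tiles whose pixels are empty.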

  9. Full-color digitized holography for large-scale holographic 3D imaging of physical and nonphysical objects.

    PubMed

    Matsushima, Kyoji; Sonobe, Noriaki

    2018-01-01

    Digitized holography techniques are used to reconstruct three-dimensional (3D) images of physical objects using large-scale computer-generated holograms (CGHs). The object field is captured at three wavelengths over a wide area at high densities. Synthetic aperture techniques using single sensors are used for image capture in phase-shifting digital holography. The captured object field is incorporated into a virtual 3D scene that includes nonphysical objects, e.g., polygon-meshed CG models. The synthetic object field is optically reconstructed as a large-scale full-color CGH using red-green-blue color filters. The CGH has a wide full-parallax viewing zone and reconstructs a deep 3D scene with natural motion parallax.

  10. Apparatus for direct-to-digital spatially-heterodyned holography

    DOEpatents

    Thomas, Clarence E.; Hanson, Gregory R.

    2006-12-12

    An apparatus operable to record a spatially low-frequency heterodyne hologram including spatially heterodyne fringes for Fourier analysis includes: a laser; a beamsplitter optically coupled to the laser; an object optically coupled to the beamsplitter; a focusing lens optically coupled to both the beamsplitter and the object; a digital recorder optically coupled to the focusing lens; and a computer that performs a Fourier transform, applies a digital filter, and performs an inverse Fourier transform. A reference beam and an object beam are focused by the focusing lens at a focal plane of the digital recorder to form a spatially low-frequency heterodyne hologram including spatially heterodyne fringes for Fourier analysis which is recorded by the digital recorder, and the computer transforms the recorded spatially low-frequency heterodyne hologram including spatially heterodyne fringes and shifts axes in Fourier space to sit on top of a heterodyne carrier frequency defined by an angle between the reference beam and the object beam and cuts off signals around an original origin before performing the inverse Fourier transform.
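The Fourier-domain steps the claim describes (transform, shift onto the heterodyne carrier, cut off signals around the original origin, inverse transform) can be sketched roughly as follows. This is an illustrative reading of the claim, not code from the patent: the function name, the order of masking and shifting (equivalent here), and the assumption that the carrier offsets (kx, ky) in pixels are known from the reference-beam angle are all this sketch's own.

```python
import numpy as np

def demodulate_hologram(hologram, kx, ky, dc_radius=2):
    """Recover a complex object wave from a spatially heterodyned
    hologram, given the carrier frequency (kx, ky) in pixel units."""
    spectrum = np.fft.fft2(hologram)
    ny, nx = spectrum.shape
    # cut off signals around the original origin (DC/autocorrelation
    # term); done before the shift, which is equivalent to after it
    yy = np.minimum(np.arange(ny), ny - np.arange(ny))
    xx = np.minimum(np.arange(nx), nx - np.arange(nx))
    spectrum *= (yy[:, None] ** 2 + xx[None, :] ** 2) > dc_radius ** 2
    # shift axes in Fourier space so the sideband centred on the
    # carrier frequency sits at the origin
    spectrum = np.roll(spectrum, shift=(-ky, -kx), axis=(0, 1))
    return np.fft.ifft2(spectrum)
```

For a pure cosine fringe at the carrier frequency, the demodulated field has a constant 0.5 term (the recovered sideband) plus an oscillating conjugate term that averages to zero.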

  11. Acquisition and replay systems for direct-to-digital holography and holovision

    DOEpatents

    Thomas, Clarence E.; Hanson, Gregory R.

    2003-02-25

    Improvements to the acquisition and replay systems for direct-to-digital holography and holovision are described. A method of recording an off-axis hologram includes: splitting a laser beam into an object beam and a reference beam; reflecting the reference beam from a reference beam mirror; reflecting the object beam from an illumination beamsplitter; passing the object beam through an objective lens; reflecting the object beam from an object; focusing the reference beam and the object beam at a focal plane of a digital recorder to form an off-axis hologram; digitally recording the off-axis hologram; and transforming the off-axis hologram in accordance with a Fourier transform to obtain a set of results. A method of writing an off-axis hologram includes: passing a laser beam through a spatial light modulator; and focusing the laser beam at a focal plane of a photorefractive crystal to impose a holographic diffraction grating pattern on the photorefractive crystal. A method of replaying an off-axis hologram includes: illuminating a photorefractive crystal having a holographic diffraction grating with a replay beam.

  12. Chemical markup, XML and the World-Wide Web. 3. Toward a signed semantic chemical web of trust.

    PubMed

    Gkoutos, G V; Murray-Rust, P; Rzepa, H S; Wright, M

    2001-01-01

We describe how a collection of documents expressed in XML-conforming languages such as CML and XHTML can be authenticated and validated against digital signatures that make use of established X.509 certificate technology. These can be associated either with specific nodes in the XML document or with the entire document. We illustrate this with two examples. An entire journal article expressed in XML has its individual components digitally signed by separate authors, and the collection is placed in an envelope and again signed. The second example involves using a software robot agent to acquire a collection of documents from a specified URL, to perform various operations and transformations on the content, including expressing molecules in CML, and to automatically sign the various components and deposit the result in a repository. We argue that these operations can be used as components for building what we term an authenticated and semantic chemical web of trust.
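The core idea of signing an individual XML node, as opposed to the whole document, can be sketched in a greatly simplified form: serialise the node and digest it. Real XML signatures of the kind the paper uses additionally require canonicalisation (C14N) and X.509-based public-key signing, neither of which this stdlib-only fragment attempts; the digest step below only illustrates why tampering with one node invalidates that node's signature.

```python
# Greatly simplified sketch: digest of a single XML node (real XML-DSig
# adds canonicalisation and X.509 signing on top of such a digest).

import hashlib
import xml.etree.ElementTree as ET

def node_digest(xml_text, tag):
    """Return a SHA-256 hex digest of the serialised child node <tag>.
    NB: the serialisation here is not canonicalised."""
    root = ET.fromstring(xml_text)
    node = root.find(tag)
    payload = ET.tostring(node)
    return hashlib.sha256(payload).hexdigest()
```

Changing the node's content changes the digest, so a signature computed over the original digest no longer verifies.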

13. Digital data for quick response (QR) codes of alkalophilic Bacillus pumilus to identify and to compare bacilli isolated from Lonar Crater Lake, India

    PubMed Central

    Rekadwad, Bhagwan N.; Khobragade, Chandrahasya N.

    2016-01-01

Microbiologists are routinely engaged in the isolation, identification, and comparison of bacteria to assess their novelty. 16S rRNA gene sequences of Bacillus pumilus isolated from Lonar Crater Lake (19° 58′ N; 76° 31′ E), India, were retrieved from the NCBI repository, and quick response (QR) codes were generated for the sequences (in FASTA format and with full GenBank information). The 16S rRNA gene sequences were also used to generate chaos game representation (CGR), frequency CGR (FCGR), and principal component analysis (PCA) plots, which can be used for visual comparison and evaluation, respectively. The hyperlinked QR codes, CGR, FCGR and PCA of all the isolates are made available to users on a portal: https://sites.google.com/site/bhagwanrekadwad/. This digital data set helps to evaluate and compare any Bacillus pumilus strain, minimizes laboratory effort, and avoids misinterpretation of the species. PMID:27141529
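The chaos game representation mentioned in the record maps a DNA sequence to a trajectory in the unit square: each base is assigned a corner, and the point repeatedly moves halfway toward the corner of the next base. A minimal sketch, assuming the common corner convention (the record does not state which assignment the authors used):

```python
# Minimal chaos game representation (CGR) sketch; the corner
# assignment is a common convention, not necessarily the authors'.

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0),
           "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(sequence):
    """Map a DNA sequence to its CGR trajectory in the unit square."""
    x, y = 0.5, 0.5                          # start at the centre
    points = []
    for base in sequence.upper():
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2, (y + cy) / 2    # halfway to the corner
        points.append((x, y))
    return points
```

Because every subsequence steers the point into a distinct sub-square, plotting these points gives the fractal pattern used for visual comparison of sequences.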

  14. Shape priors for segmentation of the cervix region within uterine cervix images

    NASA Astrophysics Data System (ADS)

    Lotenberg, Shelly; Gordon, Shiri; Greenspan, Hayit

    2008-03-01

    The work focuses on a unique medical repository of digital Uterine Cervix images ("Cervigrams") collected by the National Cancer Institute (NCI), National Institute of Health, in longitudinal multi-year studies. NCI together with the National Library of Medicine is developing a unique web-based database of the digitized cervix images to study the evolution of lesions related to cervical cancer. Tools are needed for the automated analysis of the cervigram content to support the cancer research. In recent works, a multi-stage automated system for segmenting and labeling regions of medical and anatomical interest within the cervigrams was developed. The current paper concentrates on incorporating prior-shape information in the cervix region segmentation task. In accordance with the fact that human experts mark the cervix region as circular or elliptical, two shape models (and corresponding methods) are suggested. The shape models are embedded within an active contour framework that relies on image features. Experiments indicate that incorporation of the prior shape information augments previous results.

  15. Beyond PubMed: Searching the "Grey Literature" for Clinical Trial Results.

    PubMed

    Citrome, Leslie

    2014-07-01

    Clinical trial results have been traditionally communicated through the publication of scholarly reports and reviews in biomedical journals. However, this dissemination of information can be delayed or incomplete, making it difficult to appraise new treatments, or in the case of missing data, evaluate older interventions. Going beyond the routine search of PubMed, it is possible to discover additional information in the "grey literature." Examples of the grey literature include clinical trial registries, patent databases, company and industrywide repositories, regulatory agency digital archives, abstracts of paper and poster presentations on meeting/congress websites, industry investor reports and press releases, and institutional and personal websites.

  16. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    PubMed

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

Digital radiology departments could benefit from the ability to integrate and visualize data (e.g., information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. The concept of managing and analyzing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  17. A preliminary checklist of the freshwater snails of Sabah (Malaysian Borneo) deposited in the BORNEENSIS collection, Universiti Malaysia Sabah

    PubMed Central

Ng, Ting Hui; Dulipat, Jasrul; Foon, Junn Kitt; Lopes-Lima, Manuel; Zieritz, Alexandra; Liew, Thor-Seng

    2017-01-01

    Abstract Sabah, a Malaysian state at the north-eastern tip of Borneo, is situated in one of the Earth’s biodiversity hotspots yet its freshwater gastropod diversity remains poorly known. An annotated checklist of the freshwater gastropods is presented, based on specimens deposited in the BORNEENSIS collection of the Institute for Tropical Biology and Conservation at Universiti Malaysia Sabah, Malaysia. A KMZ file is also provided, which acts as a repository of digital images and complete collection data of all examined material, so that it can be shared and adapted to facilitate future research. PMID:28769673

  18. [The future of scientific libraries].

    PubMed

    De Fiore, Luca

    2013-10-01

    "Making predictions is always very difficult, especially about the future". Niels Bohr's quote is very appropriate when looking into the future of libraries. If the Web is now the richest library in the world, it is also the most friendly and therefore the most convenient. The evolution of libraries in the coming years - both traditional and online - will probably depend on their ability to meet the information needs of users: improved ease of use and better reliability of the information. These are objectives that require money and - given the general reduction in budgets - it is not obvious that the results will be achieved. However, there are many promising experiences at the international level that show that the world of libraries is populated by projects and creativity. Traditional or digital, libraries will increasingly present themselves more as a sharing tool than as a repository of information: it is the sharing that translates data into knowledge. In the healthcare field, the integration of online libraries with the epidemiological information systems could favor the fulfillment of unconscious information needs of health personnel; libraries will therefore be a key tool for an integrated answer to the challenge of continuing education in medicine. The Internet is no longer a library but an information ecosystem where the data are transformed into knowledge by sharing and discussion.

  19. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory.

    PubMed

    Martiník, Ivo

    2015-01-01

Rich-media describes a broad range of digital interactive media that is increasingly used on the Internet, including in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room needed to implement presentation recordings based on rich-media technologies and to publish them online or on demand, with access to all its elements in automated mode, including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of properties. These original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent net theory was applied in the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality.

  20. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory

    PubMed Central

    Martiník, Ivo

    2015-01-01

Rich-media describes a broad range of digital interactive media that is increasingly used on the Internet, including in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room needed to implement presentation recordings based on rich-media technologies and to publish them online or on demand, with access to all its elements in automated mode, including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of properties. These original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent net theory was applied in the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality. PMID:26258164

  1. Evaluation on radiation protection aspect and radiological risk at Mukim Belanja repository

    NASA Astrophysics Data System (ADS)

    Azmi, Siti Nur Aisyah; Kenoh, Hamiza; Majid, Amran Ab.

    2016-01-01

Asian Rare Earth (ARE) is a locally incorporated company that operated a mineral processing plant to extract rare earth elements. ARE has received much public attention from the beginning of its operation through the decommissioning and decontamination of the plant. Because the residue contains Naturally Occurring Radioactive Material (NORM), decommissioning and disposal were carried out by the company in collaboration with the Perak State Government and the Atomic Energy Licensing Board (AELB). The main objective of this study was to review the level of compliance with the existing Radiation Protection Regulations enforced by AELB, particularly with respect to the permitted exposure dose limits. A further objective was to study the impact of the construction of the Mukim Belanja Repository on workers and the public. The study was conducted by analyzing the documents issued and by area monitoring using Geiger-Müller (GM) and sodium iodide (NaI(Tl)) survey meters. Measurements were made at 5 cm and 1 m above the ground surface at 27 measurement stations. The external doses measured were within the background levels of the surrounding area. The annual effective dose calculated from the highest GM readings at 5 cm and 1 m above the ground surface was 1.36 mSv/year and 1.21 mSv/year, respectively; the corresponding values from the highest NaI(Tl) readings were 3.31 mSv/year and 2.83 mSv/year. The calculated cancer risks were small compared with the risks from natural radiation, based on the global annual radiation dose to humans. The study therefore indicated that the repository is able to constrain the dose exposure from the disposed NORM waste, and that its construction complied with all the applicable rules and regulations. The doses received by radiation workers and by members of the public during construction of the repository were below the respective annual limits of 20 mSv/year and 1 mSv/year.
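The conversion from a survey-meter dose rate to an annual effective dose used in studies like this one is simple arithmetic. The abstract reports only the resulting annual doses, so the reading and the continuous-occupancy assumption in this sketch are hypothetical:

```python
# Hedged arithmetic sketch: survey-meter dose rate (µSv/h) to annual
# effective dose (mSv/year), assuming continuous occupancy unless an
# occupancy factor is supplied. The 0.155 µSv/h example reading is
# hypothetical, not a value reported by the study.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_dose_mSv(dose_rate_uSv_per_h, occupancy_factor=1.0):
    return dose_rate_uSv_per_h * HOURS_PER_YEAR * occupancy_factor / 1000.0
```

Under these assumptions a sustained reading of about 0.155 µSv/h corresponds to roughly 1.36 mSv/year, the order of magnitude the study reports for its highest GM reading.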

  2. Evaluation on radiation protection aspect and radiological risk at Mukim Belanja repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azmi, Siti Nur Aisyah, E-mail: nuclear.aisyahazmi@gmail.com; Kenoh, Hamiza; Majid, Amran Ab.

    2016-01-22

Asian Rare Earth (ARE) is a locally incorporated company that operated a mineral processing plant to extract rare earth elements. ARE has received much public attention from the beginning of its operation through the decommissioning and decontamination of the plant. Because the residue contains Naturally Occurring Radioactive Material (NORM), decommissioning and disposal were carried out by the company in collaboration with the Perak State Government and the Atomic Energy Licensing Board (AELB). The main objective of this study was to review the level of compliance with the existing Radiation Protection Regulations enforced by AELB, particularly with respect to the permitted exposure dose limits. A further objective was to study the impact of the construction of the Mukim Belanja Repository on workers and the public. The study was conducted by analyzing the documents issued and by area monitoring using Geiger-Müller (GM) and sodium iodide (NaI(Tl)) survey meters. Measurements were made at 5 cm and 1 m above the ground surface at 27 measurement stations. The external doses measured were within the background levels of the surrounding area. The annual effective dose calculated from the highest GM readings at 5 cm and 1 m above the ground surface was 1.36 mSv/year and 1.21 mSv/year, respectively; the corresponding values from the highest NaI(Tl) readings were 3.31 mSv/year and 2.83 mSv/year. The calculated cancer risks were small compared with the risks from natural radiation, based on the global annual radiation dose to humans. The study therefore indicated that the repository is able to constrain the dose exposure from the disposed NORM waste, and that its construction complied with all the applicable rules and regulations. The doses received by radiation workers and by members of the public during construction of the repository were below the respective annual limits of 20 mSv/year and 1 mSv/year.

  3. Application of comparative digital holography for distant shape control

    NASA Astrophysics Data System (ADS)

    Baumbach, Torsten; Osten, Wolfgang; von Kopylow, Christoph; Juptner, Werner P. O.

    2004-09-01

The comparison of two objects is of great importance in industrial production. Comparing shapes is of particular interest for maintaining calibration tools or controlling the tolerated deviation between a sample and a master. Outsourcing and globalization of production sites can put large distances between co-operating partners and cause problems for maintaining quality standards. Consequently, new challenges arise for optical measurement techniques, especially in the field of industrial shape control. In this paper we describe the progress of implementing a novel technique for directly comparing two objects with different microstructures. The technique is based on the combination of comparative holography and digital holography. The objects can be compared in two ways: digitally in the computer, or by using the analogue reconstruction of a master hologram with a spatial light modulator (SLM) as a coherent mask for illuminating the test object. Since this mask is stored digitally, it can be transmitted via telecommunication networks, which enables access to the full optical information of the master object at any desired place. Besides the basic principle of comparative digital holography (CDH), we show the set-up for performing the analogue comparison of two objects with increased sensitivity compared to former measurements, and the calibration of the SLM used for the experiments. We give examples of the digital and the analogue comparison of objects, including a verification of our results by another optical measurement technique.

  4. AstroCV: Astronomy computer vision library

    NASA Astrophysics Data System (ADS)

    González, Roberto E.; Muñoz, Roberto P.; Hernández, Cristian A.

    2018-04-01

AstroCV processes and analyzes big astronomical datasets, and is intended to provide a community repository of high-performance Python and C++ algorithms used for image processing and computer vision. The library offers methods for object recognition, segmentation, and classification, with emphasis on the automatic detection and classification of galaxies.

  5. At the Creation: Chaos, Control, and Automation--Commercial Software Development for Archives.

    ERIC Educational Resources Information Center

Dürr, W. Theodore

    1988-01-01

    An approach to the design of flexible text-based management systems for archives includes tiers for repository, software, and user management systems. Each tier has four layers--objective, program, result, and interface. Traps awaiting software development companies involve the market, competition, operations, and finance. (10 references) (MES)

  6. OER Use in Intermediate Language Instruction: A Case Study

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2017-01-01

    This paper reports on a case study in the experimental use of Open Educational Resources (OERs) in intermediate level language instruction. The resources come from three sources: the instructor, the students, and open content repositories. The objective of this action research project was to provide student-centered learning materials, enhance…

  7. The Big Challenge in Big Earth Science Data: Maturing to Transdisciplinary Data Platforms that are Relevant to Government, Research and Industry

    NASA Astrophysics Data System (ADS)

    Wyborn, Lesley; Evans, Ben

    2016-04-01

Collecting data for the Earth Sciences has a particularly long history, going back centuries. Initially, scientific data came only from simple human observations recorded by pen on paper. Scientific instruments soon supplemented data capture, and as these instruments became more capable (e.g., automation, more information captured, generation of digitally-born outputs), Earth Scientists entered the 'Big Data' era, where data progressively became too big to store and process locally in the old-style vaults. To date, most funding initiatives for the collection and storage of large-volume data sets in the Earth Sciences have been specialised within a single discipline (e.g., climate, geophysics, and Earth Observation) or specific to an individual institution. For interdisciplinary research, it is hard for users to integrate data from these individual repositories, mainly due to limitations on physical access to or movement of the data, and/or the data being organised without enough information to make sense of it without specialised discipline knowledge. Smaller repositories have also gradually been seen as inefficient in terms of the cost to manage and access (including scarce skills) and the effective implementation of new technology and techniques. Within the last decade, the trend has been towards fewer and larger data repositories that are increasingly collocated with HPC/cloud resources. There has also been a growing recognition that digital data can be a valuable resource that can be reused and repurposed: publicly funded data from either the academic or government sector is seen as a shared resource, and efficiencies can be gained by co-location. These new, highly capable, 'transdisciplinary' data repositories are emerging as a fundamental 'infrastructure' both for research and for other innovation.
The sharing of academic and government data resources on the same infrastructures is enabling new research programmes that support integration beyond the traditional physical-science domain silos, including into the humanities and social sciences. Furthermore, there is an increasing desire for these 'Big Data' infrastructures to prove their value not only as platforms for scientific discovery, but also in supporting the development of evidence-based government policies, economic growth, and private-sector opportunities. The capacity of these transdisciplinary data repositories leads to many exciting opportunities for the next generation of large-scale data integration, but an emerging suite of data challenges now needs to be tackled. Many large-volume data sets have historically been developed within traditional domain silos, and issues such as differences in standards (informal and formal), data conventions, the lack of controlled or even uniform vocabularies, non-existent or not machine-accessible semantic information, and bespoke or unclear copyrights and licensing are becoming apparent. The different perspectives and approaches of the various communities have also started to come to the fore: particularly the dominant file-based approach of the big-data-generating science communities versus the database approach of the point-observational communities, and the multidimensional approach of the climate and oceans community versus the traditional 2D approach of the GIS/spatial community. Addressing such challenges is essential to fully unlock online access to all relevant data and enable the maturing of research to the transdisciplinary paradigm.

  8. Uranium (VI) solubility in carbonate-free ERDA-6 brine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucchini, Jean-francois; Khaing, Hnin; Reed, Donald T

    2010-01-01

When present, uranium is usually an element of importance in a nuclear waste repository. In the Waste Isolation Pilot Plant (WIPP), uranium is the most prevalent actinide component by mass, with about 647 metric tons to be placed in the repository. Therefore, the chemistry of uranium, and especially its solubility under WIPP conditions, needs to be well determined. Long-term experiments were performed to measure the solubility of uranium (VI) in carbonate-free ERDA-6 brine, a simulated WIPP brine, at pC(H+) values between 8 and 12.5. These data, obtained from the over-saturation approach, were the first repository-relevant data for the VI actinide oxidation state. The solubility trends observed pointed towards low uranium solubility in WIPP brines and a lack of amphotericity. At the expected pC(H+) in the WIPP (~9.5), measured uranium solubility approached 10^-7 M. The objective of these experiments was to establish a baseline solubility for further investigation of the effects of carbonate complexation on uranium solubility in WIPP brines.

  9. Credit where credit is due: indexing and exposing data citations in international data repository networks

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Vieglais, D.; Cruse, P.; Chodacki, J.; Budden, A. E.; Fenner, M.; Lowenberg, D.; Abrams, S.

    2017-12-01

Research data are fundamental to the success of the academic enterprise, and yet the practice of citing data in academic and applied works is not widespread among researchers. Researchers need credit for their contributions, yet current citation infrastructure focuses primarily on citations to the research literature; some citation indexing systems even systematically exclude citations to data from their corpus. The Making Data Count (MDC) project will enable measuring the impact of research data much as is currently done for publications, the primary vehicle for scholarly credit and accountability. The MDC team (including the California Digital Library, COUNTER, DataCite, and DataONE) is working to publish a new COUNTER recommendation on data usage statistics; launch a DataCite-hosted MDC service for aggregated data-level metrics (DLM) based on the open-source Lagotto platform; and build tools for data repository and discovery services to integrate easily with the new MDC service. In providing such DLM, the MDC project augments existing measures of scholarly success and so offers an important incentive promoting open data principles and quality research data through adoption of research data management best practices.

  10. Direct-to-digital holography and holovision

    DOEpatents

    Thomas, Clarence E.; Baylor, Larry R.; Hanson, Gregory R.; Rasmussen, David A.; Voelkl, Edgar; Castracane, James; Simkulet, Michelle; Clow, Lawrence

    2000-01-01

Systems and methods for direct-to-digital holography are described. An apparatus includes a laser; a beamsplitter optically coupled to the laser; a reference beam mirror optically coupled to the beamsplitter; an object optically coupled to the beamsplitter; a focusing lens optically coupled to both the reference beam mirror and the object; and a digital recorder optically coupled to the focusing lens. A reference beam is incident upon the reference beam mirror at a non-normal angle, and the reference beam and an object beam are focused by the focusing lens at a focal plane of the digital recorder to form an image. The systems and methods provide advantages in that computer assisted holographic measurements can be made.

  11. Virtual mask digital electron beam lithography

    DOEpatents

    Baylor, L.R.; Thomas, C.E.; Voelkl, E.; Moore, J.A.; Simpson, M.L.; Paulus, M.J.

    1999-04-06

Systems and methods for direct-to-digital holography are described. An apparatus includes a laser; a beamsplitter optically coupled to the laser; a reference beam mirror optically coupled to the beamsplitter; an object optically coupled to the beamsplitter; a focusing lens optically coupled to both the reference beam mirror and the object; and a digital recorder optically coupled to the focusing lens. A reference beam is incident upon the reference beam mirror at a non-normal angle, and the reference beam and an object beam are focused by the focusing lens at a focal plane of the digital recorder to form an image. The systems and methods provide advantages in that computer assisted holographic measurements can be made. 5 figs.

  12. Virtual mask digital electron beam lithography

    DOEpatents

    Baylor, Larry R.; Thomas, Clarence E.; Voelkl, Edgar; Moore, James A.; Simpson, Michael L.; Paulus, Michael J.

    1999-01-01

Systems and methods for direct-to-digital holography are described. An apparatus includes a laser; a beamsplitter optically coupled to the laser; a reference beam mirror optically coupled to the beamsplitter; an object optically coupled to the beamsplitter; a focusing lens optically coupled to both the reference beam mirror and the object; and a digital recorder optically coupled to the focusing lens. A reference beam is incident upon the reference beam mirror at a non-normal angle, and the reference beam and an object beam are focused by the focusing lens at a focal plane of the digital recorder to form an image. The systems and methods provide advantages in that computer assisted holographic measurements can be made.

  13. Food entries in a large allergy data repository

    PubMed Central

    Plasek, Joseph M.; Goss, Foster R.; Lai, Kenneth H.; Lau, Jason J.; Seger, Diane L.; Blumenthal, Kimberly G.; Wickner, Paige G.; Slight, Sarah P.; Chang, Frank Y.; Topaz, Maxim; Bates, David W.

    2016-01-01

    Objective: Accurate food adverse sensitivity documentation in electronic health records (EHRs) is crucial to patient safety. This study examined, encoded, and grouped foods that caused any adverse sensitivity in a large allergy repository using natural language processing and standard terminologies. Methods: Using the Medical Text Extraction, Reasoning, and Mapping System (MTERMS), we processed both structured and free-text entries stored in an enterprise-wide allergy repository (Partners’ Enterprise-wide Allergy Repository), normalized diverse food allergen terms into concepts, and encoded these concepts using the Systematized Nomenclature of Medicine – Clinical Terms (SNOMED-CT) and Unique Ingredient Identifiers (UNII) terminologies. Concept coverage was also assessed for these two terminologies. We further categorized allergen concepts into groups and calculated the frequencies of these concepts by group. Finally, we conducted an external validation of MTERMS’s performance when identifying food allergen terms, using a randomized sample from a different institution. Results: We identified 158 552 food allergen records (2140 unique terms) in the Partners repository, corresponding to 672 food allergen concepts. High-frequency groups included shellfish (19.3%), fruits or vegetables (18.4%), dairy (9.0%), peanuts (8.5%), tree nuts (8.5%), eggs (6.0%), grains (5.1%), and additives (4.7%). Ambiguous, generic concepts such as “nuts” and “seafood” accounted for 8.8% of the records. Discussion: SNOMED-CT covered more concepts than UNII in terms of exact (81.7% vs 68.0%) and partial (14.3% vs 9.7%) matches. Adverse sensitivities to food are diverse, and existing standard terminologies have gaps in their coverage of the breadth of allergy concepts. Conclusion: New strategies are needed to represent and standardize food adverse sensitivity concepts, to improve documentation in EHRs. PMID:26384406
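
The normalization-and-coverage idea in this abstract can be sketched as follows (toy terms and concept codes standing in for MTERMS and SNOMED-CT, which are far richer in practice):

```python
def normalize(term):
    """Lowercase, trim, and collapse simple plural variants (illustrative only)."""
    t = term.strip().lower()
    return t[:-1] if t.endswith("s") and not t.endswith("ss") else t

# Toy concept dictionary standing in for a SNOMED-CT subset.
concepts = {"peanut": "91935009", "shellfish": "44027008", "egg": "102263004"}

def coverage(entries):
    """Count exact vs partial matches of free-text entries against the terminology."""
    exact = sum(1 for e in entries if normalize(e) in concepts)
    partial = sum(1 for e in entries
                  if normalize(e) not in concepts
                  and any(c in normalize(e) for c in concepts))
    return exact, partial

entries = ["Peanuts", "shellfish", "peanut oil", "kiwi"]
print(coverage(entries))  # (exact matches, partial matches)
```

Here "peanut oil" counts only as a partial match and "kiwi" is uncovered, mirroring the exact/partial/no-match breakdown the study reports.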

  14. OWLing Clinical Data Repositories With the Ontology Web Language.

    PubMed

    Lozano-Rubí, Raimundo; Pastor, Xavier; Lozano, Esther

    2014-08-01

    The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and rearrangements in the relationships among concepts. This scenario makes it especially difficult to provide tools that answer the professional needs of researchers, and it constitutes a barrier that calls for innovative solutions. The objective was to design and implement a framework for the development of clinical data repositories capable of coping with continuous change in the biomedical domain while minimizing the technical knowledge required of end users. We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient in dealing with complexity and change, integrated with solid relational storage and a Web graphical user interface. Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming: all the information required to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows immediate deployment and population of the database, as well as instant online availability of any modification. OntoCRF is a complete framework for building data repositories with solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems, and it does not require highly skilled technical staff, which facilitates the engineering of clinical software systems.
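
The ontology-driven pattern OntoCRF describes, a form definition held as data, a UI generated on the fly, and generic storage, can be sketched as follows (hypothetical structures, not OntoCRF's actual format):

```python
# The form definition lives in data (standing in for an ontology describing a
# CRF), so adding a field requires no schema change and no programming.
form_definition = {
    "name": "baseline_visit",
    "fields": [
        {"id": "age", "label": "Age (years)", "type": "integer"},
        {"id": "diagnosis", "label": "Diagnosis", "type": "text"},
    ],
}

generic_store = []  # generic repository: (form, record_id, field, value) rows

def render(form):
    """'Build the UI on the fly' from the definition."""
    return [f"{f['label']} [{f['type']}]" for f in form["fields"]]

def save(form, record_id, values):
    """Store each field as an atomic row in the generic repository."""
    for f in form["fields"]:
        generic_store.append((form["name"], record_id, f["id"], values[f["id"]]))

print(render(form_definition))
save(form_definition, "r1", {"age": 54, "diagnosis": "T2DM"})
print(generic_store)
```

Editing `form_definition` immediately changes both the rendered form and what gets stored, which is the "instant online availability of any modification" property the abstract claims.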

  15. Images of a place and vacation preferences: Implications of the 1989 surveys for assessing the economic impacts of a nuclear waste repository in Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slovic, P.; Layman, M.; Flynn, J.H.

    1990-11-01

    In July 1989, the authors produced a report titled Perceived Risk, Stigma, and Potential Economic Impacts of a High-Level Nuclear-Waste Repository in Nevada (Slovic et al., 1989). That report described a program of research designed to assess the potential impacts of a high-level nuclear waste repository at Yucca Mountain, Nevada upon tourism, retirement and job-related migration, and business development in Las Vegas and the state. It concluded that adverse economic impacts may result from two related social processes. Specifically, the study by Slovic et al. employed analyses of imagery in order to overcome concerns about the validity of direct questions regarding the influence of a nuclear-waste repository at Yucca Mountain upon a person's future behaviors. During the latter months of 1989, data were collected in three major telephone surveys designed to achieve the following objectives: (1) to replicate the results from the Phoenix, Arizona, surveys using samples from other populations that contribute to tourism, migration, and development in Nevada; (2) to retest the original Phoenix respondents to determine the stability of their images across an 18-month period and to determine whether their vacation choices subsequent to the first survey were predictable from the images they produced in that original survey; (3) to elicit additional word-association images for the stimulus "underground nuclear waste repository" in order to determine whether the extreme negative images generated by the Phoenix respondents would occur with other samples of respondents; and (4) to develop and test a new method for imagery elicitation, based upon a rating technique rather than on word associations. 2 refs., 8 figs., 13 tabs.

  16. An Investigation into Digital Media: Characteristics of Learning Objects Which K-12 Teachers Determine Meet Their Instructional Needs

    ERIC Educational Resources Information Center

    Guthrie, Patricia Ann

    2010-01-01

    In recent years, learning objects have emerged as an instructional tool for teachers. Digital libraries and collections provide teachers with free or fee-based access to a variety of learning objects, from photos and famous speeches to Flash animations and interactive Java applets. Learning objects offer opportunities for students to interact with…

  17. Proposal of a Framework for Internet Based Licensing of Learning Objects

    ERIC Educational Resources Information Center

    Santos, Osvaldo A.; Ramos, Fernando M. S.

    2004-01-01

    This paper presents a proposal of a framework whose main objective is to manage the delivery and rendering of learning objects in a digital rights controlled environment. The framework is based on a digital licensing scheme that requires each learning object to have the proper license in order to be rendered by a trusted player. A conceptual model…

  18. High aperture off-axis parabolic mirror applied in digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kalenkov, Georgy S.; Kalenkov, Sergey G.; Shtanko, Alexander E.

    2018-04-01

    An optical scheme of recording digital holograms of micro-objects based on high numerical aperture off-axis parabolic mirror forming a high aperture reference wave is suggested. Registration of digital holograms based on the proposed optical scheme is confirmed experimentally. Application of the proposed approach for hyperspectral holograms registration of micro-objects in incoherent light is discussed.

  19. Smart Objects, Dumb Archives: A User-Centric, Layered Digital Library Framework

    NASA Technical Reports Server (NTRS)

    Maly, Kurt; Nelson, Michael L.; Zubair, Mohammad

    1999-01-01

    Currently, there exist a large number of superb digital libraries, all of which are, unfortunately, vertically integrated, each presenting a monolithic interface to its users. Ideally, a user would want to locate resources from a variety of digital libraries while dealing with only one interface. A number of approaches to this interoperability issue exist, including defining a universal protocol for all libraries to adhere to, or developing mechanisms to translate between protocols. The approach we illustrate in this paper is to push the universal protocols down to the level of digital object communication and communication with simple archives. This approach creates the opportunity for digital library service providers to create digital libraries tailored to the needs of user communities, drawing from available archives and individual publishers who adhere to this standard. We have created a reference implementation based on the Hypertext Transfer Protocol (HTTP), with the protocols derived from the Dienst protocol. We have created a special class of digital objects called buckets and a number of archives based on a NASA collection and NSF-funded projects. Starting from NCSTRL, we have developed a set of digital library services called NCSTRL+ and have created digital libraries for researchers, educators, and students that can each draw on all the archives and individually created buckets.
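
The bucket idea, objects that carry their own metadata and dissemination methods so the archive can stay "dumb", might be sketched like this (illustrative class and method names, not the actual bucket API or the Dienst protocol):

```python
class Bucket:
    """A 'smart object': data packaged with the methods to disseminate it."""
    def __init__(self, object_id, metadata, payloads):
        self.object_id = object_id
        self.metadata = metadata      # e.g. a Dublin-Core-like dict
        self.payloads = payloads      # payload name -> bytes

    def list_payloads(self):
        return sorted(self.payloads)

    def disseminate(self, name):
        return self.payloads[name]

class DumbArchive:
    """Only stores and locates buckets; all behavior lives in the objects."""
    def __init__(self):
        self._store = {}
    def put(self, bucket):
        self._store[bucket.object_id] = bucket
    def get(self, object_id):
        return self._store[object_id]

archive = DumbArchive()
archive.put(Bucket("naca-tr-1", {"title": "Sample report"},
                   {"report.pdf": b"(pdf bytes)", "metadata.xml": b"<dc/>"}))
print(archive.get("naca-tr-1").list_payloads())
```

Because every behavior is answered by the object itself, a service layer (the NCSTRL+-style services in the abstract) can be built over any archive that merely stores and returns buckets.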

  20. Surface Model and Tomographic Archive of Fossil Primate and Other Mammal Holotype and Paratype Specimens of the Ditsong National Museum of Natural History, Pretoria, South Africa.

    PubMed

    Adams, Justin W; Olah, Angela; McCurry, Matthew R; Potze, Stephany

    2015-01-01

    Nearly a century of paleontological excavation and analysis from the cave deposits of the Cradle of Humankind UNESCO World Heritage Site in northeastern South Africa underlies much of our understanding of the evolutionary history of hominins, other primates and other mammal lineages in the late Pliocene and early Pleistocene of Africa. As one of few designated fossil repositories, the Plio-Pleistocene Palaeontology Section of the Ditsong National Museum of Natural History (DNMNH; the former Transvaal Museum) curates much of the mammalian faunas recovered from the fossil-rich deposits of major South African hominin-bearing localities, including the holotype and paratype specimens of many primate, carnivore, and other mammal species (Orders Primates, Carnivora, Artiodactyla, Eulipotyphla, Hyracoidea, Lagomorpha, Perissodactyla, and Proboscidea). Here we describe an open-access digital archive of high-resolution, full-color three-dimensional (3D) surface meshes of all 89 non-hominin holotype, paratype and significant mammalian specimens curated in the Plio-Pleistocene Section vault. Surface meshes were generated using a commercial surface scanner (Artec Spider, Artec Group, Luxembourg), are provided in formats that can be opened in both open-source and commercial software, and can be readily downloaded either via an online data repository (MorphoSource) or via direct request from the DNMNH. In addition to providing surface meshes for each specimen, we also provide tomographic data (both computerized tomography [CT] and microfocus [microCT]) for a subset of these fossil specimens. This archive of the DNMNH Plio-Pleistocene collections represents the first research-quality 3D datasets of African mammal fossils to be made openly available. 
This simultaneously provides the paleontological community with essential baseline information (e.g., updated listing and 3D record of specimens in their current state of preservation) and serves as a single resource of high-resolution digital data that improves collections accessibility, reduces unnecessary duplication of efforts by researchers, and encourages ongoing imaging-based paleobiological research across a range of South African non-hominin fossil faunas. Because the types, paratypes, and key specimens include globally-distributed mammal taxa, this digital archive not only provides 3D morphological data on taxa fundamental to Neogene and Quaternary South African palaeontology, but also lineages critical to research on African, other Old World, and New World paleocommunities. With such a broader impact of the DNMNH 3D data, we hope that establishing open access to this digital archive will encourage other researchers and institutions to provide similar resources that increase accessibility to paleontological collections and support advanced paleobiological analyses.

  1. Remembering Math: The Design of Digital Learning Objects to Spark Professional Learning

    ERIC Educational Resources Information Center

    Halverson, Richard; Wolfenstein, Moses; Williams, Caroline C.; Rockman, Charles

    2009-01-01

    This article describes how the design of digital learning objects can spark professional learning. The challenge was to build learning objects that would help experienced special education teachers, who had been teaching in math classes, to demonstrate their proficiency in middle and secondary school mathematics on the PRAXIS examination. While…

  2. Incoherent digital holograms acquired by interferenceless coded aperture correlation holography system without refractive lenses.

    PubMed

    Kumar, Manoj; Vijayakumar, A; Rosen, Joseph

    2017-09-14

    We present a lensless, interferenceless incoherent digital holography technique based on the principle of coded aperture correlation holography. The digital hologram acquired by this technique contains a three-dimensional image of the observed scene. Light diffracted by a point object (pinhole) is modulated by a random-like coded phase mask (CPM), and the intensity pattern is recorded and composed as a point spread hologram (PSH). A library of PSHs is created using the same CPM by moving the pinhole to all possible axial locations. Intensity diffracted through the same CPM from an object placed within the axial limits of the PSH library is recorded by a digital camera; this recorded intensity is composed as the object hologram. The image of the object at any axial plane is reconstructed by cross-correlating the object hologram with the corresponding component of the PSH library. The reconstruction noise attached to the image is suppressed by various methods. The reconstruction results of multiplane and thick objects by this technique are compared with regular lens-based imaging.
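
The cross-correlation reconstruction step can be illustrated with a 1-D toy (plain Python, circular correlation; not the authors' code): a point at shift s reproduces the PSH shifted by s, so the correlation with the PSH peaks at s.

```python
def cross_correlate(a, b):
    """Circular cross-correlation of two equal-length sequences."""
    n = len(a)
    return [sum(a[(i + k) % n] * b[i] for i in range(n)) for k in range(n)]

# Toy PSH recorded for a pinhole, and an object hologram for a point at shift 3.
psh = [0.0, 1.0, 0.5, 0.2, 0.0, 0.0, 0.0, 0.0]
shift = 3
obj_hologram = [psh[(i - shift) % len(psh)] for i in range(len(psh))]

corr = cross_correlate(obj_hologram, psh)
print(corr.index(max(corr)))  # peak location recovers the point's position
```

In the real system the correlation is 2-D and done per axial plane against the matching PSH-library entry, with additional noise-suppression steps.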

  3. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  4. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  5. Creating a Framework of Guidance for Building Good Digital Collections.

    ERIC Educational Resources Information Center

    Cole, Timothy W.

    2002-01-01

    Presents the Framework of Guidance for Building Good Digital Collections that was developed by the Institute of Museum and Library Services with other organizations to guide museums and libraries in digitization collection practices. Highlights digital collections, digital objects, and metadata, and discusses reusability, persistence,…

  6. The 3D Digital Story-telling Media on Batik Learning in Vocational High Schools

    NASA Astrophysics Data System (ADS)

    Widiaty, I.; Achdiani, Y.; Kuntadi, I.; Mubaroq, S. R.; Zakaria, D.

    2018-02-01

    The aim of this research is to create 3D digital story-telling media for batik learning in vocational high schools. The digital story-telling developed in this research is focused on 3D-based story-telling. In contrast to the digital story-telling already developed for existing learning, this research is expected to improve vocational students' understanding of the value of the local wisdom of batik, making it more meaningful and "alive". The process of making the 3D digital story-telling media consists of two parts: the creation of the 3D objects and the creation of the 3D object viewer.

  7. Northeast Artificial Intelligence Consortium (NAIC) Review of Technical Tasks. Volume 2, Part 1.

    DTIC Science & Technology

    1987-07-01

    6.2 Transformation Invariant Attributes for Digitized Object Outlines ... 469; 6.3 Design of an Inference Engine for an ...; ... Attributes for Digital Object Outlines ... 597; 7 Speech Understanding Research (Rochester Institute of Technology); ... versatile maintenance expert system (ES) for trouble-shooting digital circuits. Some diagnosis systems, such as MYCIN [19] for medical diagnosis and CRIB

  8. In-Factory Learning - Qualification For The Factory Of The Future

    NASA Astrophysics Data System (ADS)

    Quint, Fabian; Mura, Katharina; Gorecky, Dominic

    2015-07-01

    The Industry 4.0 vision anticipates that internet technologies will find their way into future factories, replacing traditional components with dynamic and intelligent cyber-physical systems (CPS) that combine physical objects with their digital representation. Reducing the gap between the real and digital world makes the factory environment more flexible and more adaptive, but also more complex for human workers. Future workers require interdisciplinary competencies from engineering, information technology, and computer science in order to understand and manage the diverse interrelations between physical objects and their digital counterparts. This paper proposes a mixed-reality-based learning environment which combines physical objects and visualisation of digital content via Augmented Reality. It uses reality-based interaction in order to make the dynamic interrelations between the real and digital factory visible and tangible. We argue that our learning system is not intended as a stand-alone solution, but should fit into existing academic and advanced training curricula.

  9. Collection Metadata Solutions for Digital Library Applications

    NASA Technical Reports Server (NTRS)

    Hill, Linda L.; Janee, Greg; Dolin, Ron; Frew, James; Larsgaard, Mary

    1999-01-01

    Within a digital library, collections may range from an ad hoc set of objects that serve a temporary purpose to established library collections intended to persist through time. The objects in these collections vary widely, from library and data center holdings to pointers to real-world objects, such as geographic places, and the various metadata schemas that describe them. The key to integrated use of such a variety of collections in a digital library is collection metadata that represents the inherent and contextual characteristics of a collection. The Alexandria Digital Library (ADL) Project has designed and implemented collection metadata for several purposes: in XML form, the collection metadata "registers" the collection with the user interface client; in HTML form, it is used for user documentation; eventually, it will be used to describe the collection to network search agents; and it is used for internal collection management, including mapping the object metadata attributes to the common search parameters of the system.

  10. Spatial-heterodyne interferometry for transmission (SHIFT) measurements

    DOEpatents

    Bingham, Philip R.; Hanson, Gregory R.; Tobin, Ken W.

    2006-10-10

    Systems and methods are described for spatial-heterodyne interferometry for transmission (SHIFT) measurements. A method includes digitally recording a spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis using a reference beam, and an object beam that is transmitted through an object that is at least partially translucent; Fourier analyzing the digitally recorded spatially-heterodyned hologram, by shifting an original origin of the digitally recorded spatially-heterodyned hologram to sit on top of a spatial-heterodyne carrier frequency defined by an angle between the reference beam and the object beam, to define an analyzed image; digitally filtering the analyzed image to cut off signals around the original origin to define a result; and performing an inverse Fourier transform on the result.
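
The Fourier steps in the abstract, transform, shift the origin onto the heterodyne carrier, filter around the new origin, inverse transform, can be sketched numerically (a 1-D toy assuming NumPy; the patent concerns 2-D holograms and this is not the patented implementation):

```python
import numpy as np

n = 256
x = np.arange(n)
carrier = 32                      # spatial-heterodyne carrier frequency (cycles)
envelope = np.exp(-((x - n / 2) ** 2) / (2 * 20.0 ** 2))   # toy object signal
hologram = envelope * np.cos(2 * np.pi * carrier * x / n)  # recorded fringes

spectrum = np.fft.fft(hologram)
shifted = np.roll(spectrum, -carrier)        # move the carrier peak to the origin
freqs = np.fft.fftfreq(n, d=1.0 / n)
shifted[np.abs(freqs) > carrier / 2] = 0.0   # digital filter around the new origin
recovered = 2 * np.fft.ifft(shifted)         # complex envelope (factor 2: one sideband)

print(np.allclose(np.abs(recovered), envelope, atol=0.05))
```

The same shift-and-filter sequence appears in the off-axis holography patent below; only the geometry of how the fringes are formed differs.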

  11. Off-axis illumination direct-to-digital holography

    DOEpatents

    Thomas, Clarence E.; Price, Jeffery R.; Voelkl, Edgar; Hanson, Gregory R.

    2004-06-08

    Systems and methods are described for off-axis illumination direct-to-digital holography. A method of recording an off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis, includes: reflecting a reference beam from a reference mirror at a non-normal angle; reflecting an object beam from an object at an angle with respect to an optical axis defined by a focusing lens; focusing the reference beam and the object beam at a focal plane of a digital recorder to form the off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; digitally recording the off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; Fourier analyzing the recorded off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes by transforming axes of the recorded off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes in Fourier space to sit on top of a heterodyne carrier frequency defined as an angle between the reference beam and the object beam; applying a digital filter to cut off signals around an original origin; and then performing an inverse Fourier transform.

  12. Survey of Staff Perceptions of the AEL Resource Center.

    ERIC Educational Resources Information Center

    Cowley, Kimberly S.

    The Resource Center at the Appalachia Educational Laboratory (AEL), Inc., provides direct services to clients both within and outside AEL, as well as serving as a repository and distribution center for educational materials. Three main objectives were identified: to discover the extent to which staff use current components of the Resource Center;…

  13. Unified Database Development Program. Final Report.

    ERIC Educational Resources Information Center

    Thomas, Everett L., Jr.; Deem, Robert N.

    The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…

  14. Discourses of the Contemporary Urban Campus in Europe: Intimations of Americanisation?

    ERIC Educational Resources Information Center

    McEldowney, Malachy; Gaffikin, Frank; Perry, David C.

    2009-01-01

    This article studies major structural changes in both the urban context and the internal objectives of universities in Europe. While they enjoy expanded student demand and an elevated role in their city-region economy as significant creators and repositories of knowledge, they simultaneously confront a funding gap in accommodating these higher…

  15. Seamless lesion insertion in digital mammography: methodology and reader study

    NASA Astrophysics Data System (ADS)

    Pezeshk, Aria; Petrick, Nicholas; Sahiner, Berkman

    2016-03-01

    Collection of large repositories of clinical images containing verified cancer locations is costly and time consuming due to difficulties associated with both the accumulation of data and establishment of the ground truth. This problem poses a significant challenge to the development of machine learning algorithms that require large amounts of data to properly train and avoid overfitting. In this paper we expand the methods in our previous publications by making several modifications that significantly increase the speed of our insertion algorithms, thereby allowing them to be used for inserting lesions that are much larger in size. These algorithms have been incorporated into an image composition tool that we have made publicly available. This tool allows users to modify or supplement existing datasets by seamlessly inserting a real breast mass or micro-calcification cluster extracted from a source digital mammogram into a different location on another mammogram. We demonstrate examples of the performance of this tool on clinical cases taken from the University of South Florida Digital Database for Screening Mammography (DDSM). Finally, we report the results of a reader study evaluating the realism of inserted lesions compared to clinical lesions. Analysis of the radiologist scores in the study using receiver operating characteristic (ROC) methodology indicates that inserted lesions cannot be reliably distinguished from clinical lesions.

  16. An efficient architecture to support digital pathology in standard medical imaging repositories.

    PubMed

    Marques Godinho, Tiago; Lebre, Rui; Silva, Luís Bastião; Costa, Carlos

    2017-07-01

    In the past decade, digital pathology and whole-slide imaging (WSI) have been gaining momentum with the proliferation of digital scanners from different manufacturers. The literature reports significant advantages associated with the adoption of digital images in pathology, namely, improvements in diagnostic accuracy and better support for telepathology. Moreover, it also offers new clinical and research applications. However, numerous barriers have been slowing the adoption of WSI, among which the most important are performance issues associated with storage and distribution of huge volumes of data, and lack of interoperability with other hospital information systems, most notably Picture Archiving and Communication Systems (PACS) based on the DICOM standard. This article proposes an architecture for a Web Pathology PACS fully compliant with DICOM standard communications and data formats. The solution includes a PACS Archive responsible for storing whole-slide imaging data in DICOM WSI format, offering a communication interface based on the most recent DICOM Web services. The second component is a zero-footprint viewer that runs in any web browser. It consumes data using the PACS archive's standard web services. Moreover, it features a tiling engine especially suited to dealing with WSI image pyramids. These components were designed with a special focus on efficiency and usability. The performance of our system was assessed through a comparative analysis against state-of-the-art solutions. The results demonstrate that it is possible to have a very competitive solution based on standard workflows. Copyright © 2017 Elsevier Inc. All rights reserved.
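
A tiling engine like the one mentioned above has to reason about image pyramids; the basic arithmetic can be sketched as follows (illustrative, assuming 256-pixel tiles and a halving of resolution per level, which is the common WSI convention):

```python
from math import ceil, log2

def pyramid_levels(width, height, tile=256):
    """Number of levels needed so the coarsest level fits in one tile."""
    return max(1, ceil(log2(max(width, height) / tile)) + 1)

def tiles_at_level(width, height, level, tile=256):
    """Tile grid (cols, rows) at a level, where level 0 is full resolution."""
    w, h = ceil(width / 2 ** level), ceil(height / 2 ** level)
    return ceil(w / tile), ceil(h / tile)

# An 80k x 60k slide, typical of a 40x scan:
levels = pyramid_levels(80000, 60000)
print(levels, tiles_at_level(80000, 60000, 0), tiles_at_level(80000, 60000, levels - 1))
```

The steep drop from tens of thousands of tiles at level 0 to a single tile at the top level is what makes pyramid-aware viewers responsive despite the huge native image size.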

  17. Metadata mapping and reuse in caBIG.

    PubMed

    Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis

    2009-02-05

    This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes, and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-gram) and Dynamic algorithms are compared; both show similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
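
The Dice di-gram similarity named in the abstract has a standard formulation that can be sketched directly (the paper's exact implementation may differ): similarity = 2 × |shared bigrams| / (|bigrams of a| + |bigrams of b|).

```python
def bigrams(s):
    """Set of overlapping two-character substrings, case-folded."""
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a, b):
    ba, bb = bigrams(a), bigrams(b)
    if not ba and not bb:
        return 1.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

# Matching a UML class-attribute name against candidate CDE property names
# (hypothetical names, for illustration):
candidates = ["patientBirthDate", "patientName", "tumorGrade"]
query = "birthDate"
best = max(candidates, key=lambda c: dice(query, c))
print(best, round(dice(query, best), 2))
```

Because it compares character bigrams rather than whole tokens, the measure tolerates the prefixing and compounding typical of model attribute names.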

  18. Quality evaluation of value sets from cancer study common data elements using the UMLS semantic groups

    PubMed Central

    Solbrig, Harold R; Chute, Christopher G

    2012-01-01

    Objective The objective of this study is to develop an approach to evaluate the quality of terminological annotations on the value set (ie, enumerated value domain) components of the common data elements (CDEs) in the context of clinical research using both unified medical language system (UMLS) semantic types and groups. Materials and methods The CDEs of the National Cancer Institute (NCI) Cancer Data Standards Repository, the NCI Thesaurus (NCIt) concepts and the UMLS semantic network were integrated using a semantic web-based framework for a SPARQL-enabled evaluation. First, the set of CDE-permissible values with corresponding meanings in external controlled terminologies were isolated. The corresponding value meanings were then evaluated against their NCI- or UMLS-generated semantic network mapping to determine whether all of the meanings fell within the same semantic group. Results Of the enumerated CDEs in the Cancer Data Standards Repository, 3093 (26.2%) had elements drawn from more than one UMLS semantic group. A random sample (n=100) of this set of elements indicated that 17% of them were likely to have been misclassified. Discussion The use of existing semantic web tools can support a high-throughput mechanism for evaluating the quality of large CDE collections. This study demonstrates that the involvement of multiple semantic groups in an enumerated value domain of a CDE is an effective anchor to trigger an auditing point for quality evaluation activities. Conclusion This approach produces a useful quality assurance mechanism for a clinical study CDE repository. PMID:22511016
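
The auditing rule described above, flag any enumerated value domain whose member concepts map to more than one semantic group, reduces to a small check (a toy mapping stands in for the UMLS semantic network; the study ran this at scale via SPARQL):

```python
# Hypothetical concept -> UMLS semantic group mapping:
semantic_group = {
    "Lung": "Anatomy",
    "Liver": "Anatomy",
    "Biopsy": "Procedures",
}

def audit_value_set(concepts):
    """Return (flagged, groups): flagged is True when the set spans >1 group."""
    groups = {semantic_group[c] for c in concepts}
    return len(groups) > 1, groups

flagged, groups = audit_value_set(["Lung", "Liver", "Biopsy"])
print(flagged, sorted(groups))
```

A flagged value set is only a trigger for manual review, which matches the study's finding that a minority (17% in the audited sample) of flagged elements were actually misclassified.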

  19. How Elsevier is supporting the value and usefulness of data with Cross-linking and Research Data Services.

    NASA Astrophysics Data System (ADS)

    Keall, Bethan; Koers, Hylke; Marques, David

    2013-04-01

    Research in the Earth & Planetary Sciences is characterized by a wealth of observational data - ranging from observations by satellites orbiting the Earth, to borehole measurements at the bottom of the ocean, and also includes data from projects like the Rover Curiosity Landing. Thanks to technological advancements, it has become much easier for researchers over the last few decades to gather large volumes of data, analyze, and share with other researchers inside and outside the lab. With data serving such an important role in the way research is carried out, it becomes a crucial task to archive, maintain, organize, and disseminate research data in a dependable and structured manner. Subject-specific data repositories, often driven by the scientific community, are taking an increasingly prominent role in this domain, getting traction amongst researchers as the go-to place to deposit raw research data. At the same time, the scientific article remains an essential resource of scientific information. At Elsevier, we strive to continuously adapt the article format to meet the needs of modern-day researchers. This includes better support for digital content (see, e.g., http://www.elsevier.com/googlemaps), but also bidirectional linking between online articles and data repositories. In this spirit, Elsevier is collaborating with several leading data repositories, such as PANGAEA, IEDA, and NERC, to interlink articles and data for improved visibility and discoverability of both primary research data and research articles. 
In addition, Elsevier has formed a new group, Research Data Services, with three primary goals: • help increase the sharing and archiving of research data in discipline-specific repositories • help increase the value of shared data, particularly by adding annotation and provenance metadata and linking discipline-specific datasets together • help create a credit and impact assessment infrastructure that makes research data count as a research output in its own right. We are working on several initiatives at Elsevier to enhance the online article format and to make it easier for researchers to share, find, access, link together, and analyze relevant research data. This helps to increase the value of both articles and data, and enables researchers to gain full credit for their research data output.

  20. Building a diabetes screening population data repository using electronic medical records.

    PubMed

    Tuan, Wen-Jan; Sheehy, Ann M; Smith, Maureen A

    2011-05-01

    There has been a rapid advancement of information technology in the area of clinical and population health data management since 2000. However, with the fast growth of electronic medical records (EMRs) and the increasing complexity of information systems, it has become challenging for researchers to effectively access, locate, extract, and analyze information critical to their research. This article introduces an outpatient encounter data framework designed to construct an EMR-based population data repository for diabetes screening research. The outpatient encounter data framework is developed on a hybrid data structure of entity-attribute-value models, dimensional models, and relational models. This design preserves a small number of subject-specific tables essential to key clinical constructs in the data repository. It enables atomic information to be maintained in a transparent and meaningful way to researchers and health care practitioners who need to access data and still achieve the same performance level as conventional data warehouse models. A six-layer information processing strategy is developed to extract and transform EMRs to the research data repository. The data structure also complies with both Health Insurance Portability and Accountability Act regulations and the institutional review board's requirements. Although developed for diabetes screening research, the design of the outpatient encounter data framework is suitable for other types of health service research. It may also provide organizations a tool to improve health care quality and efficiency, consistent with the "meaningful use" objectives of the Health Information Technology for Economic and Clinical Health Act. © 2011 Diabetes Technology Society.
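The entity-attribute-value (EAV) side of such a hybrid design stores one row per clinical fact, so new attributes require no schema change, while researchers can still pivot the atomic rows into a conventional wide view. A minimal SQLite sketch; the table, attribute, and value names are invented for illustration and are not the authors' actual schema:

```python
import sqlite3

# One row per clinical fact: adding a new attribute needs no schema change.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE encounter_eav (
    encounter_id INTEGER,
    attribute    TEXT,
    value        TEXT)""")

conn.executemany("INSERT INTO encounter_eav VALUES (?, ?, ?)", [
    (1, "hba1c", "6.9"),
    (1, "fasting_glucose", "112"),
    (2, "hba1c", "5.4"),
])

# Pivot the atomic rows back into a conventional wide (one-row-per-entity) view.
wide = conn.execute("""
    SELECT encounter_id,
           MAX(CASE WHEN attribute = 'hba1c' THEN value END) AS hba1c
    FROM encounter_eav GROUP BY encounter_id ORDER BY encounter_id""").fetchall()
print(wide)  # [(1, '6.9'), (2, '5.4')]
```

The dimensional and relational layers of the hybrid design would sit alongside this, holding the subject-specific tables the abstract describes.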

  1. Analysis of space systems for the space disposal of nuclear waste follow-on study. Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The space option, disposal of certain high-level nuclear wastes in space as a complement to mined geological repositories, is studied. A brief overview of the study background, scope, objective, guidelines and assumptions, and contents is presented. The effects of variations in the waste mix on the space systems concept are determined, to allow assessment of the space system's effect on total system risk and benefit when used as a complement to the DOE reference mined geological repository. The waste payload system, launch site, launch system, and orbit transfer system are all addressed. Rescue mission requirements are studied. The characteristics of waste forms suitable for space disposal are identified. Trajectories and performance requirements are discussed.

  2. Systems and Methods for Imaging of Falling Objects

    NASA Technical Reports Server (NTRS)

    Fallgatter, Cale (Inventor); Garrett, Tim (Inventor)

    2014-01-01

    Imaging of falling objects is described. Multiple images of a falling object can be captured substantially simultaneously using multiple cameras located at multiple angles around the falling object. An epipolar geometry of the captured images can be determined. The images can be rectified to parallelize epipolar lines of the epipolar geometry. Correspondence points between the images can be identified. At least a portion of the falling object can be digitally reconstructed using the identified correspondence points to create a digital reconstruction.
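The final reconstruction step described above, recovering 3D structure from matched correspondence points in calibrated views, rests on linear triangulation. A minimal NumPy sketch under synthetic assumptions: the `triangulate` helper, the two unit-focal camera matrices, and the 3D point are all invented for illustration, not taken from the patented system:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: matched (u, v) image points."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]           # dehomogenize

# Synthetic check: two unit-focal cameras one unit apart along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
x1 = (0.0, 0.0)    # projection of the point (0, 0, 5) in camera 1
x2 = (-0.2, 0.0)   # its projection in camera 2
print(triangulate(P1, P2, x1, x2))
```

A full pipeline would run this per correspondence point after rectification, with projection matrices recovered from the epipolar geometry.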

  3. Interoperability, Scaling, and the Digital Libraries Research Agenda.

    ERIC Educational Resources Information Center

    Lynch, Clifford; Garcia-Molina, Hector

    1996-01-01

    Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…

  4. The 3D scanner prototype utilize object profile imaging using line laser and octave software

    NASA Astrophysics Data System (ADS)

    Nurdini, Mugi; Manunggal, Trikarsa Tirtadwipa; Samsi, Agus

    2016-11-01

    A three-dimensional scanner, or 3D Scanner, is a device to reconstruct a real object into digital form on a computer. The 3D Scanner is a technology still being developed, especially in developed countries, where current 3D Scanner devices are advanced versions with very expensive prices. This study is basically a simple prototype of a 3D Scanner with very low investment cost. The 3D Scanner prototype device consists of a webcam, a rotating desk system controlled by a stepper motor and an Arduino UNO, and a line laser. The research is limited to objects with the same radius from their center point (axially symmetric about the pivot). Scanning is performed by imaging the object profile with the line laser, which is then captured by the camera and processed by a computer (image processing) using Octave software. On each image acquisition, the scanned object on the rotating desk is rotated by a certain degree, so that for one full turn multiple images covering all sides are finally obtained. Then, the profile of the entire set of images is extracted in order to obtain the digital object dimensions. The digital dimensions are calibrated against a length standard, called a gage block. The overall dimensions are then digitally reconstructed into a three-dimensional object. Validation of the scanned object reconstruction against the original object dimensions is expressed as a percentage error. Based on the results of data validation, the horizontal dimension error is about 5% to 23% and the vertical dimension error is about +/- 3%.
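The calibration against a gage block and the percentage-error validation reduce to simple arithmetic. A sketch with made-up pixel counts and lengths (the paper does not report these intermediate values):

```python
def calibrate(pixels, gage_pixels, gage_length_mm):
    """Convert a span measured in pixels to millimetres, using a gage block
    of known length imaged under the same conditions as the length standard."""
    return pixels * gage_length_mm / gage_pixels

def percent_error(measured, actual):
    """Validation metric: deviation from the true dimension, in percent."""
    return abs(measured - actual) / actual * 100.0

# Hypothetical numbers: a 50 mm gage block spans 400 px; the object spans 120 px.
measured_mm = calibrate(120, 400, 50.0)
print(measured_mm)                                  # 15.0
print(round(percent_error(measured_mm, 14.0), 1))   # 7.1, within the reported 5-23% horizontal range
```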

  5. Beyond PubMed: Searching the “Grey Literature” for Clinical Trial Results

    PubMed Central

    2014-01-01

    Clinical trial results have been traditionally communicated through the publication of scholarly reports and reviews in biomedical journals. However, this dissemination of information can be delayed or incomplete, making it difficult to appraise new treatments, or in the case of missing data, evaluate older interventions. Going beyond the routine search of PubMed, it is possible to discover additional information in the “grey literature.” Examples of the grey literature include clinical trial registries, patent databases, company and industrywide repositories, regulatory agency digital archives, abstracts of paper and poster presentations on meeting/congress websites, industry investor reports and press releases, and institutional and personal websites. PMID:25337445

  6. Rendering an archive in three dimensions

    NASA Astrophysics Data System (ADS)

    Leiman, David A.; Twose, Claire; Lee, Teresa Y. H.; Fletcher, Alex; Yoo, Terry S.

    2003-05-01

    We examine the requirements for a publicly accessible, online collection of three-dimensional biomedical image data, including those yielded by radiological processes such as MRI, ultrasound and others. Intended as a repository and distribution mechanism for such medical data, we created the National Online Volumetric Archive (NOVA) as a case study aimed at identifying the multiple issues involved in realizing a large-scale digital archive. In the paper we discuss factors such as the current legal and health information privacy policies affecting the collection of human medical images, the retrieval and management of information, and technical implementation. This project culminated in the launching of a website that includes downloadable datasets and a prototype data submission system.

  7. Being there: the library as place.

    PubMed

    Weise, Frieda

    2004-01-01

    The value of the library as place is examined in this Janet Doe Lecture. The lecture, which is intended to focus on the history or philosophy of health sciences librarianship, presents an overview of the library as a place in society from ancient times to the present. The impact of information technology and changes in the methods of scholarly publication from print to digital are addressed as well as the role of the library as the repository of the written historical record of cultures. Functions and services of libraries are discussed in light of the physical library facility of the future. Finally, librarians are asked to remember the enduring values of librarianship in planning libraries of the future.

  8. Being there: the library as place*

    PubMed Central

    Weise, Frieda

    2004-01-01

    The value of the library as place is examined in this Janet Doe Lecture. The lecture, which is intended to focus on the history or philosophy of health sciences librarianship, presents an overview of the library as a place in society from ancient times to the present. The impact of information technology and changes in the methods of scholarly publication from print to digital are addressed as well as the role of the library as the repository of the written historical record of cultures. Functions and services of libraries are discussed in light of the physical library facility of the future. Finally, librarians are asked to remember the enduring values of librarianship in planning libraries of the future. PMID:14762459

  9. Automatic Topography Using High Precision Digital Moire Methods

    NASA Astrophysics Data System (ADS)

    Yatagai, T.; Idesawa, M.; Saito, S.

    1983-07-01

    Three types of moire topographic methods using digital techniques are proposed. Deformed gratings obtained by projecting a reference grating onto an object under test are subjected to digital analysis. The electronic analysis procedures of deformed gratings described here enable us to distinguish between depression and elevation of the object, so that automatic measurement of 3-D shapes and automatic moire fringe interpolation are performed. Based on the digital moire methods, we have developed a practical measurement system, with a linear photodiode array on a micro-stage as a scanning image sensor. Examples of fringe analysis in medical applications are presented.

  10. Active Exploration of Large 3D Model Repositories.

    PubMed

    Gao, Lin; Cao, Yan-Pei; Lai, Yu-Kun; Huang, Hao-Zhi; Kobbelt, Leif; Hu, Shi-Min

    2015-12-01

    With broader availability of large-scale 3D model repositories, the need for efficient and effective exploration becomes more and more urgent. Existing model retrieval techniques do not scale well with the size of the database since often a large number of very similar objects are returned for a query, and the possibilities to refine the search are quite limited. We propose an interactive approach where the user feeds an active learning procedure by labeling either entire models or parts of them as "like" or "dislike" such that the system can automatically update an active set of recommended models. To provide an intuitive user interface, candidate models are presented based on their estimated relevance for the current query. From the methodological point of view, our main contribution is to exploit not only the similarity between a query and the database models but also the similarities among the database models themselves. We achieve this by an offline pre-processing stage, where global and local shape descriptors are computed for each model and a sparse distance metric is derived that can be evaluated efficiently even for very large databases. We demonstrate the effectiveness of our method by interactively exploring a repository containing over 100 K models.
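The like/dislike feedback loop described above can be illustrated with a toy scoring pass over a precomputed distance matrix: liked models pull similar database models up the ranking, disliked ones push them down. This is a simplification for illustration, not the paper's actual active-learning procedure, and the 5x5 distance matrix and labels are invented:

```python
import numpy as np

# Invented symmetric pairwise distances among 5 database models. In the paper
# this would be the sparse metric derived offline from shape descriptors.
D = np.array([
    [0., 1., 4., 5., 3.],
    [1., 0., 4., 5., 3.],
    [4., 4., 0., 1., 3.],
    [5., 5., 1., 0., 3.],
    [3., 3., 3., 3., 0.],
])

def recommend(D, liked, disliked, k=2):
    sim = np.exp(-D)                       # turn distances into similarities
    score = sim[liked].sum(axis=0) - sim[disliked].sum(axis=0)
    score[liked + disliked] = -np.inf      # never re-recommend labeled models
    return [int(i) for i in np.argsort(score)[::-1][:k]]

# Model 1 is near the liked model 0; model 3 is near the disliked model 2.
print(recommend(D, liked=[0], disliked=[2]))  # [1, 4]
```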

  11. Current Status of the Nuclear Waste Management Programme in Finland - 13441

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehto, Kimmo; Vuorio, Petteri

    2013-07-01

    Pursuant to the Decision-in-Principle of 2001, the Finnish programme for geologic disposal of spent fuel has now moved to the phase of applying for a construction licence to build the encapsulation plant and underground repository. The main objective of the former programme phase, the underground characterisation phase, was to confirm - or refute - the suitability of the Olkiluoto site by investigations conducted underground at the actual depth of the repository. The construction work on the access tunnel to the rock characterisation facility (ONKALO) started in the late summer of 2004. The site research and investigations work aimed at the maturity needed for submission of the application for a construction licence for the actual repository at the end of 2012. This requires, however, that the technology has also reached the needed maturity. The design and technical plans form the necessary platform for the development of the safety case for spent fuel disposal. A plan, a 'road map', has been produced for the portfolio of reports that demonstrates the safety of disposal as required by the criteria set by the government and further detailed by the safety authority, STUK. (authors)

  12. [Digital learning object for diagnostic reasoning in nursing applied to the integumentary system].

    PubMed

    da Costa, Cecília Passos Vaz; Luz, Maria Helena Barros Araújo

    2015-12-01

    To describe the creation of a digital learning object for diagnostic reasoning in nursing applied to the integumentary system at a public university of Piaui. A methodological study applied to technological production based on the pedagogical framework of problem-based learning. The methodology for creating the learning object observed the stages of analysis, design, development, implementation and evaluation recommended for contextualized instructional design. The revised taxonomy of Bloom was used to list the educational goals. The four modules of the developed learning object were inserted into the educational platform Moodle. The theoretical assumptions allowed the design of an important online resource that promotes effective learning in the scope of nursing education. This study should add value to nursing teaching practices through the use of digital learning objects for teaching diagnostic reasoning applied to skin and skin appendages.

  13. Digital holographic microscopy combined with optical tweezers

    NASA Astrophysics Data System (ADS)

    Cardenas, Nelson; Yu, Lingfeng; Mohanty, Samarendra K.

    2011-02-01

    While optical tweezers have been widely used for the manipulation and organization of microscopic objects in three dimensions, observing the manipulated objects along the axial direction has been quite challenging. In order to visualize the organization and orientation of objects along the axial direction, we report the development of digital holographic microscopy combined with optical tweezers. Digital holography is achieved by use of a modified Mach-Zehnder interferometer with digital recording of the interference pattern of the reference and sample laser beams by a single CCD camera. In this method, quantitative phase information is retrieved dynamically with high temporal resolution, limited only by the frame rate of the CCD. Digital focusing, phase unwrapping, as well as online analysis and display of the quantitative phase images, were performed with software developed on the LabVIEW platform. Since the phase changes observed with digital holographic optical tweezers (DHOT) are very sensitive to the optical thickness of the trapped volume, estimation of the number of particles trapped along the axial direction as well as the orientation of non-spherical objects could be achieved with high precision. Since in diseases such as malaria and diabetes a change in the refractive index of red blood cells occurs, this system can be employed to map such disease-specific changes in biological samples upon immobilization with optical tweezers.
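The link between retrieved phase and optical thickness is the standard relation OPD = phi * lambda / (2*pi). A one-line sketch; the 633 nm wavelength is illustrative, as the abstract does not state the laser wavelength used:

```python
import math

def optical_path_difference(phase_rad, wavelength_nm):
    """Convert a measured phase shift (radians) to optical path difference (nm):
    OPD = phi * lambda / (2*pi). OPD equals refractive-index contrast times
    physical thickness, which is why phase tracks trapped-volume thickness."""
    return phase_rad * wavelength_nm / (2 * math.pi)

# A full 2*pi phase shift corresponds to one wavelength of optical path.
print(optical_path_difference(2 * math.pi, 633.0))  # 633.0
```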

  14. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A [Livermore, CA; Brinkerhoff, David L [Antioch, CA

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  15. Envoicing Silent Objects: Art and Literature at the Site of the Canadian Landscape

    ERIC Educational Resources Information Center

    Brock, Richard

    2008-01-01

    In this article, the author examines some of the ways in which art and literature converge upon the site of the Canadian landscape, generating an "ekphrastic" conception of place which reminds everyone constantly that every framed, static view of a landscape represents a story house, a repository of narratives concerning all those…

  16. Student and Staff Perceptions of a Learning Management System for Blended Learning in Teacher Education

    ERIC Educational Resources Information Center

    Holmes, Kathryn A.; Prieto-Rodriguez, Elena

    2018-01-01

    Higher education institutions routinely use Learning Management Systems (LMS) for multiple purposes; to organise coursework and assessment, to facilitate staff and student interactions, and to act as repositories of learning objects. The analysis reported here involves staff (n = 46) and student (n = 470) responses to surveys as well as data…

  17. MTF Database: A Repository of Students' Academic Performance Measurements for the Development of Techniques for Evaluating Team Functioning

    ERIC Educational Resources Information Center

    Hsiung, Chin-Min; Zheng, Xiang-Xiang

    2015-01-01

    The Measurements for Team Functioning (MTF) database contains a series of student academic performance measurements obtained at a national university in Taiwan. The measurements are acquired from unit tests and homework tests performed during a core mechanical engineering course, and provide an objective means of assessing the functioning of…

  18. Associations between genetic polymorphisms of insulin-like growth factor axis genes and risk for age-related macular degeneration

    USDA-ARS?s Scientific Manuscript database

    Purpose: Our objective was to investigate if insulin-like growth factor (IGF) axis genes affect the risk for age-related macular degeneration (AMD). Methods: 864 Caucasian non-diabetic participants from the Age-Related Eye Disease Study (AREDS) Genetic Repository were used in this case control st...

  19. Digital implementation of a neural network for imaging

    NASA Astrophysics Data System (ADS)

    Wood, Richard; McGlashan, Alex; Yatulis, Jay; Mascher, Peter; Bruce, Ian

    2012-10-01

    This paper outlines the design and testing of a digital imaging system that utilizes an artificial neural network with unsupervised and supervised learning to convert streaming (real-time) input image space into parameter space. The primary objective of this work is to investigate the effectiveness of using a neural network to significantly reduce the information density of streaming images so that objects can be readily identified by a limited set of primary parameters and act as an enhanced human-machine interface (HMI). Many applications are envisioned, including use in biomedical imaging, anomaly detection, and as an assistive device for the visually impaired. A digital circuit was designed and tested using a Field Programmable Gate Array (FPGA) and an off-the-shelf digital camera. Our results indicate that the networks can be readily trained when subject to limited sets of objects such as the alphabet. We can also separate limited object sets with rotational and positional invariance. The results also show that limited visual fields form with only local connectivity.

  20. Nicephor[e]: a web-based solution for teaching forensic and scientific photography.

    PubMed

    Voisard, R; Champod, C; Furrer, J; Curchod, J; Vautier, A; Massonnet, G; Buzzini, P

    2007-04-11

    Nicephor[e] is a project funded by the "Swiss Virtual Campus" and aims at creating a distance or blended web-based learning system in forensic and scientific photography and microscopy. The practical goal is to organize series of on-line modular courses corresponding to the educational requirements of undergraduate academic programs. Additionally, this program could be used in the context of continuing education programs. The architecture of the project is designed to guarantee a high level of knowledge in forensic and scientific photographic techniques, to allow easy content production, and to provide the ability to create a number of different courses sharing the same content. The e-learning system Nicephor[e] consists of three different parts. The first one is a repository of learning objects that gathers all the theoretical subject matter of the project, such as texts, animations, images, and films. This repository is a web content management system (Typo3) that permits creating, publishing, and administering dynamic content via a web browser, as well as storing it in a database. The flexibility of the system's architecture allows for easy updating of the content to follow the development of photographic technology. The instructor of a course can decide which modular contents need to be included in the course, and in which order they will be accessed by students. All the modular courses are developed in a learning management system (WebCT or Moodle) that can deal with complex learning scenarios, content distribution, students, tests, and interaction with the instructor. Each course has its own learning scenario based on the goals of the course and the student's profile. The content of each course is taken from the content management system. It is then structured in the learning management system according to the pedagogical goals defined by the instructor. The modular courses are created in a highly interactive setting and offer self-assessment tests to the students. 
The last part of the system is a digital asset management system (Extensis Portfolio). The practical portion of each course is to produce images of different marks or objects. The collection of all this material, produced and indexed by the students and corrected by the instructor, is essential to the development of a knowledge base of photographic techniques applied to a specific forensic subject. It also represents an extensible collection of different marks from known sources obtained under various conditions, and allows these images to be reused to create image-based case files.

  1. Rolling Deck to Repository (R2R): Big Data and Standard Services for the Fleet Community

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Smith, S. R.; Stocks, K. I.

    2014-12-01

    The Rolling Deck to Repository (R2R; http://rvdata.us/) program curates underway environmental sensor data from the U.S. academic oceanographic research fleet, ensuring data sets are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. Currently 25 in-service vessels contribute 7 terabytes of data to R2R each year, acquired from a full suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. To accommodate this large volume and variety of data, R2R has developed highly efficient stewardship procedures. These include scripted "break out" of cruise data packages from each vessel based on standard filename and directory patterns; automated harvest of cruise metadata from the UNOLS Office via Web Services and from OpenXML-based forms submitted by vessel operators; scripted quality assessment routines that calculate statistical summaries and standard ratings for selected data types; adoption of community-standard controlled vocabularies for vessel codes, instrument types, etc., provided by the NERC Vocabulary Server, in lieu of maintaining custom local term lists; and a standard package structure based on the IETF BagIt format for delivering data to long-term archives. Documentation and standard post-field products, including quality-controlled shiptrack navigation data for every cruise, are published in multiple services and formats to satisfy a diverse range of clients. These include Catalog Service for Web (CSW), GeoRSS, and OAI-PMH discovery services via a GeoNetwork portal; OGC Web Map and Feature Services for GIS clients; a citable Digital Object Identifier (DOI) for each dataset; ISO 19115-2 standard geospatial metadata records suitable for submission to long-term archives as well as the POGO global catalog; and Linked Open Data resources with a SPARQL query endpoint for Semantic Web clients. 
R2R participates in initiatives such as the Ocean Data Interoperability Platform (ODIP) and the NSF EarthCube OceanLink project to promote community-standard formats, vocabularies, and services among ocean data providers.
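The BagIt package structure R2R uses for archive delivery (standardized as RFC 8493) is simple enough to sketch: a bag declaration file, a `data/` payload directory, and a checksum manifest. The helper name, payload, and file contents below are invented placeholders, not R2R's actual packaging code:

```python
import hashlib
import os
import tempfile

def make_bag(bag_dir, payload):
    """Write a minimal BagIt bag: bagit.txt declaration, data/ payload
    directory, and an MD5 manifest with one line per payload file.
    payload: dict mapping relative filename -> bytes."""
    data_dir = os.path.join(bag_dir, "data")
    os.makedirs(data_dir, exist_ok=True)
    with open(os.path.join(bag_dir, "bagit.txt"), "w") as f:
        f.write("BagIt-Version: 1.0\nTag-File-Character-Encoding: UTF-8\n")
    with open(os.path.join(bag_dir, "manifest-md5.txt"), "w") as f:
        for name, blob in payload.items():
            with open(os.path.join(data_dir, name), "wb") as pf:
                pf.write(blob)
            f.write(f"{hashlib.md5(blob).hexdigest()}  data/{name}\n")

bag = tempfile.mkdtemp()
make_bag(bag, {"navigation.csv": b"time,lat,lon\n"})
print(sorted(os.listdir(bag)))  # ['bagit.txt', 'data', 'manifest-md5.txt']
```

The manifest lets the receiving archive verify every payload file's checksum on ingest, which is the point of the format for long-term delivery.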

  2. Using Digital Learning Objects to Introduce Students to the Nature of Models and the Nature of Matter

    ERIC Educational Resources Information Center

    Gustafson, Brenda; Mahaffy, Peter; Martin, Brian

    2011-01-01

    This article reports a subset of findings from a larger study centered on designing a series of six digital learning objects to help Grade 5 (age 10-12) students begin to consider the nature of models (understood as the physical or mental representation of objects, phenomena, or processes), the particle nature of matter, and the behavior of…

  3. The Galileo Teacher Training Programme

    NASA Astrophysics Data System (ADS)

    Doran, Rosa

    The Galileo Teacher Training Program is a global effort to empower teachers all over the world to embark on a new trend in science teaching, using new technologies and real research methods to teach curriculum content. The GTTP goal is to create a worldwide network of "Galileo Ambassadors", promoters of GTTP training sessions, and a legion of "Galileo Teachers", educators engaged in the use of innovative resources, sharing experiences, and supporting their peers worldwide. Through workshops, online training tools and resources, the products and techniques promoted by this program can be adapted to reach locations with few resources of their own, as well as network-connected areas that can take advantage of access to robotic, optical and radio telescopes, webcams, astronomy exercises, cross-disciplinary resources, image processing and digital universes (web and desktop planetariums). Promoters of GTTP are expert astronomy educators connected to universities or EPO institutions who facilitate the consolidation of active support for newcomers and act as a 24-hour helpdesk for teachers all over the world. GTTP will also engage in the creation of a repository of astronomy education resources and science research projects, ViRoS (Virtual Repository of resources and Science Projects), in order to simplify the task of educators willing to enrich classroom activities.

  4. FY94 CAG trip reports, CAG memos and other products: Volume 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-15

    The Yucca Mountain Site Characterization Project (YMP) of the US DOE is tasked with designing, constructing, and operating an Exploratory Studies Facility (ESF) at Yucca Mountain, Nevada. The purpose of the YMP is to provide detailed characterization of the Yucca Mountain site for the potential mined geologic repository for permanent disposal of high-level radioactive waste. Detailed characterization of properties of the site is to be conducted through a wide variety of short-term and long-term in-situ tests. Testing methods require the installation of a large number of test instruments and sensors with a variety of functions. These instruments produce analog and digital data that must be collected, processed, stored, and evaluated in an attempt to predict performance of the repository. The Integrated Data and Control System (IDCS) is envisioned as a distributed data acquisition system that electronically acquires and stores data from these test instruments. IDCS designers are responsible for designing and overseeing the procurement of the system, IDCS Operation and Maintenance operates and maintains the installed system, and the IDCS Data Manager is responsible for distribution of IDCS data to participants. This report is a compilation of trip reports, interoffice memos, and other memos relevant to Computer Applications Group, Inc., work on this project.

  5. Determination of diffusivities in the Rustler Formation from exploratory-shaft construction at the Waste Isolation Pilot Plant in southeastern New Mexico

    USGS Publications Warehouse

    Stevens, Ken; Beyeler, Walt

    1985-01-01

    The construction of an exploratory shaft 12 feet in diameter into the Salado Formation (the repository horizon for transuranic waste material) at the Waste Isolation Pilot Plant site in southeastern New Mexico affected water levels in water-bearing zones above the repository horizon. From the construction history of the exploratory shaft, an approximation of construction-generated hydraulic stresses at the shaft was made. The magnitude of the construction-generated stresses was calibrated using the hydrographs from one hydrologic test pad. Because flow rates from the Magenta Dolomite and Culebra Dolomite Members of the Rustler Formation into the exploratory shaft were unknown, the ratio of transmissivity to storage (diffusivity) was determined by mathematically simulating the aquifers and the hydrologic stresses with a flood-wave-response digital model. These results indicate that the Magenta Dolomite and Culebra Dolomite Members of the Rustler Formation can be modeled as homogeneous, isotropic, and confined water-bearing zones. One simple and consistent explanation, but by no means the only one, for the lack of a single diffusivity value in the Culebra aquifer is that the open-hole observation wells at the hydrologic test pads dampen the amplitude of water-level changes. (USGS)

  6. Grid Application Meta-Repository System: Repository Interconnectivity and Cross-domain Application Usage in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen

    Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independently of its underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VMs) and can therefore be run in their native environments and easily deployed on virtualized infrastructures, allowing interoperability with new-generation technologies such as cloud computing, application-on-demand, automatic service/application deployment and automatic VM generation.
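OAI compliance of the kind GAMRS exposes means any harvester can pull its records with standard OAI-PMH requests. A sketch of building such a request; the base URL is hypothetical (GAMRS's actual endpoint is not given in the abstract), but the `verb`, `metadataPrefix`, and `resumptionToken` parameters are standard OAI-PMH v2.0:

```python
from urllib.parse import urlencode

BASE = "https://gamrs.example.org/oai"  # hypothetical endpoint

def list_records_url(metadata_prefix="oai_dc", resumption_token=None):
    """Build an OAI-PMH ListRecords request URL. A paged harvest starts with a
    metadataPrefix and continues with the resumptionToken the server returns."""
    params = {"verb": "ListRecords"}
    if resumption_token:
        params["resumptionToken"] = resumption_token
    else:
        params["metadataPrefix"] = metadata_prefix
    return BASE + "?" + urlencode(params)

print(list_records_url())
# https://gamrs.example.org/oai?verb=ListRecords&metadataPrefix=oai_dc
```

A harvester would GET this URL, parse the XML response, and repeat with the returned resumption token until the record list is exhausted.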

  7. DAM-ing the Digital Flood

    ERIC Educational Resources Information Center

    Raths, David

    2008-01-01

    With the widespread digitization of art, photography, and music, plus the introduction of streaming video, many colleges and universities are realizing that they must develop or purchase systems to preserve their school's digitized objects; that they must create searchable databases so that researchers can find and share copies of digital files;…

  8. Digital Initiatives and Metadata Use in Thailand

    ERIC Educational Resources Information Center

    SuKantarat, Wichada

    2008-01-01

    Purpose: This paper aims to provide information about various digital initiatives in libraries in Thailand and especially use of Dublin Core metadata in cataloguing digitized objects in academic and government digital databases. Design/methodology/approach: The author began researching metadata use in Thailand in 2003 and 2004 while on sabbatical…
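    A Dublin Core record of the kind used to catalogue digitized objects can be produced with a few lines of standard-library code. In this sketch the field values are invented examples; only the `dc` element names and namespace are the standard Dublin Core 1.1 set:

    ```python
    # Minimal sketch: serialize a dict of Dublin Core fields to XML.
    # Field values below are invented examples, not records from any
    # actual Thai library database.
    import xml.etree.ElementTree as ET

    DC_NS = "http://purl.org/dc/elements/1.1/"


    def make_dc_record(fields):
        """Serialize a dict of Dublin Core fields to an XML string."""
        ET.register_namespace("dc", DC_NS)
        root = ET.Element("record")
        for name, value in fields.items():
            elem = ET.SubElement(root, f"{{{DC_NS}}}{name}")
            elem.text = value
        return ET.tostring(root, encoding="unicode")


    record = make_dc_record({
        "title": "Palm-leaf manuscript, digitized",
        "creator": "Unknown scribe",
        "language": "tha",
        "type": "Image",
    })
    print(record)  # <record xmlns:dc="..."><dc:title>...</dc:title>...</record>
    ```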

  9. The McIntosh Archive: A solar feature database spanning four solar cycles

    NASA Astrophysics Data System (ADS)

    Gibson, S. E.; Malanushenko, A. V.; Hewins, I.; McFadden, R.; Emery, B.; Webb, D. F.; Denig, W. F.

    2016-12-01

    The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly H-alpha, He I 10830, and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced coronal holes, polarity inversion lines, filaments, sunspots, and plage, yielding a unique 45-year record of the features associated with the large-scale solar magnetic field. We will present the results of recent efforts to preserve and digitize this archive. Most of the original hand-drawn maps have been scanned; a method for processing these scans into a digital, searchable format has been developed and streamlined; and an archival repository at NOAA's National Centers for Environmental Information (NCEI) has been created. We will demonstrate how Solar Cycle 23 data may now be accessed and how they may be utilized for scientific applications. In addition, we will discuss how this database of human-recognized features, which overlaps with the onset of high-resolution, continuous modern solar data, may act as a training set for computer feature-recognition algorithms.

  10. Sharing Neuron Data: Carrots, Sticks, and Digital Records.

    PubMed

    Ascoli, Giorgio A

    2015-10-01

    Routine data sharing is greatly benefiting several scientific disciplines, such as molecular biology, particle physics, and astronomy. Neuroscience data, in contrast, are still rarely shared, greatly limiting the potential for secondary discovery and the acceleration of research progress. Although the attitude toward data sharing is non-uniform across neuroscience subdomains, widespread adoption of data sharing practice will require a cultural shift in the community. Digital reconstructions of axonal and dendritic morphology constitute a particularly "sharable" kind of data. The popularity of the public repository NeuroMorpho.Org demonstrates that data sharing can benefit both users and contributors. Increased data availability is also catalyzing the grassroots development and spontaneous integration of complementary resources, research tools, and community initiatives. Even in this rare successful subfield, however, more data are still unshared than shared. Our experience as developers and curators of NeuroMorpho.Org suggests that greater transparency regarding the expectations and consequences of sharing (or not sharing) data, combined with public disclosure of which datasets are shared and which are not, may expedite the transition to community-wide data sharing.

  11. ECHO Data Partners Join Forces to Federate Access to Resources

    NASA Astrophysics Data System (ADS)

    Kendall, J.; Macie, M.

    2003-12-01

    During the past year, NASA's Earth Science Data and Information System (ESDIS) project has been collaborating with various Earth science data and client providers to design and implement the EOS Clearinghouse (ECHO). ECHO is an open, interoperable metadata clearinghouse and order broker system. It functions as a repository of information intended to streamline access to digital data and services provided by NASA's Earth Science Enterprise and the extended Earth science community. In a unique partnership, ECHO data providers are working to extend their services in the digital era to reflect current trends in scientific and educational communications. The multi-organization, interdisciplinary content of ECHO provides a valuable new service to a growing number of Earth science applications and interdisciplinary research efforts; as such, ECHO is expected to attract a wide audience. In this poster, we highlight the contributions of current ECHO data partners and provide information for prospective data partners on how the project supports the incorporation of new collections and effective long-term asset management that remains directly under the control of the organizations that contribute resources to ECHO.

  12. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  13. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  14. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  15. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  16. The Hawaiian Freshwater Algal Database (HfwADB): a laboratory LIMS and online biodiversity resource

    PubMed Central

    2012-01-01

    Background Biodiversity databases serve the important role of highlighting species-level diversity from defined geographical regions. Databases that are specially designed to accommodate the types of data gathered during regional surveys are valuable in allowing full data access and display to researchers not directly involved with the project, while serving as a Laboratory Information Management System (LIMS). The Hawaiian Freshwater Algal Database, or HfwADB, was modified from the Hawaiian Algal Database to showcase non-marine algal specimens collected from the Hawaiian Archipelago by accommodating the additional level of organization required for samples including multiple species. Description The Hawaiian Freshwater Algal Database is a comprehensive and searchable database containing photographs and micrographs of samples and collection sites, geo-referenced collecting information, taxonomic data and standardized DNA sequence data. All data for individual samples are linked through unique 10-digit accession numbers (“Isolate Accession”), the first five of which correspond to the collection site (“Environmental Accession”). Users can search online for sample information by accession number, various levels of taxonomy, habitat or collection site. HfwADB is hosted at the University of Hawaii, and was made publicly accessible in October 2011. At the present time the database houses data for over 2,825 samples of non-marine algae from 1,786 collection sites from the Hawaiian Archipelago. These samples include cyanobacteria, red and green algae and diatoms, as well as lesser representation from some other algal lineages. Conclusions HfwADB is a digital repository that acts as a Laboratory Information Management System for Hawaiian non-marine algal data. 
Users can interact with the repository through the web to view relevant habitat data (including geo-referenced collection locations) and download images of collection sites, specimen photographs and micrographs, and DNA sequences. It is publicly available at http://algae.manoa.hawaii.edu/hfwadb/. PMID:23095476
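    The accession scheme described above — a 10-digit Isolate Accession whose first five digits form the Environmental Accession identifying the collection site — can be sketched in a few lines. The helper functions and sample numbers are our illustration, not code or data from HfwADB itself:

    ```python
    # Sketch of the HfwADB-style accession scheme: a 10-digit isolate
    # accession whose first five digits identify the collection site.
    # Function names and the sample accessions are hypothetical.
    from collections import defaultdict


    def environmental_accession(isolate_accession: str) -> str:
        """Return the collection-site part of a 10-digit isolate accession."""
        if len(isolate_accession) != 10 or not isolate_accession.isdigit():
            raise ValueError("expected a 10-digit accession number")
        return isolate_accession[:5]


    def group_by_site(isolates):
        """Group isolate accessions by their shared collection site."""
        sites = defaultdict(list)
        for acc in isolates:
            sites[environmental_accession(acc)].append(acc)
        return dict(sites)


    samples = ["0000100001", "0000100002", "0000200001"]
    # Two isolates share site "00001"; one was collected at site "00002".
    print(group_by_site(samples))
    ```

    Embedding the site code in the sample identifier is what lets the database link every isolate back to its geo-referenced collection record.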

  17. Granite disposal of U.S. high-level radioactive waste.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, Geoffrey A.; Mariner, Paul E.; Lee, Joon H.

    This report evaluates the feasibility of disposing of U.S. high-level radioactive waste in granite several hundred meters below the surface of the earth. The U.S. has many granite formations with positive attributes for permanent disposal. Similar crystalline formations have been extensively studied by international programs, two of which, in Sweden and Finland, are the host rocks of submitted or imminent repository license applications. This report is enabled by the advanced work of the international community to establish functional and operational requirements for disposal of a range of waste forms in granite media. In this report we develop scoping performance analyses, based on the applicable features, events, and processes (FEPs) identified by international investigators, to support generic conclusions regarding post-closure safety. Unlike the safety analyses for disposal in salt, shale/clay, or deep boreholes, the safety analysis for a mined granite repository depends largely on waste package preservation. In crystalline rock, waste packages are preserved by the high mechanical stability of the excavations, the diffusive barrier of the buffer, and favorable chemical conditions. The buffer is preserved by low groundwater fluxes, favorable chemical conditions, backfill, and the rigid confines of the host rock. An added advantage of a mined granite repository is that waste packages would be fairly easy to retrieve, should retrievability be an important objective. The results of the safety analyses performed in this study are consistent with the results of comprehensive safety assessments performed for sites in Sweden, Finland, and Canada. They indicate that a granite repository would satisfy established safety criteria and suggest that a small number of FEPs would largely control the release and transport of radionuclides. In the event the U.S. decides to pursue a potential repository in granite, a detailed evaluation of these FEPs would be needed to inform site selection and safety assessment.

  18. Organizing the Present, Looking to the Future: An Online Knowledge Repository to Facilitate Collaboration

    PubMed Central

    Burchill, Charles; Fergusson, Patricia; Jebamani, Laurel; Turner, Ken; Dueck, Stephen

    2000-01-01

    Background Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. Objective The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. Methods This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Sub-Heading (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Results Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. 
More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. Conclusions This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated. PMID:11720929

  19. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information

    PubMed Central

    2013-01-01

    Background Virtual microscopy includes the digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, and website publishing of such digital images, are hampered by their large file sizes. Results We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a JPEG image of the desired quality. The image is linked to the patient's clinical and treatment information in Caisis, customised open source cancer data management software in use at the Australian Breast Cancer Tissue Bank (ABCTB), and is then published on the ABCTB website (http://www.abctb.org.au) using the open source Deep Zoom technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type or biomarkers expressed. NDPI-Splitter splits a large image file into smaller TIFF sections so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter can also filter out empty images. Conclusions Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository can also be used as a teaching resource, and these tools enable large files to be sectioned for image analysis. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934 PMID:23402499
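    The splitting step such a tool performs can be illustrated by the tiling arithmetic alone: computing the bounding boxes of fixed-size tiles that cover a large image, with edge tiles clipped to the image bounds. This is a minimal Python sketch of the general technique, not NDPI-Splitter's actual (Java) implementation:

    ```python
    # Sketch of image tiling: enumerate (left, top, right, bottom) boxes
    # of at most tile x tile pixels that together cover a width x height
    # image. Each box could then be cropped and saved as a smaller file.
    def tile_boxes(width, height, tile):
        """Yield clipped bounding boxes covering the image in row order."""
        for top in range(0, height, tile):
            for left in range(0, width, tile):
                yield (left, top,
                       min(left + tile, width),   # clip right edge
                       min(top + tile, height))   # clip bottom edge


    boxes = list(tile_boxes(2500, 1000, 1024))
    print(len(boxes))   # 3 (one row of three columns)
    print(boxes[-1])    # (2048, 0, 2500, 1000) -- clipped edge tile
    ```

    Because every box stays within the image bounds, downstream analysis software can process each tile independently without padding artifacts.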

  20. Progress and challenges associated with digitizing and serving up Hawaii's geothermal data

    NASA Astrophysics Data System (ADS)

    Thomas, D. M.; Lautze, N. C.; Abdullah, M.

    2012-12-01

    This presentation will report on the status of our effort to digitize and serve up Hawaii's geothermal information, an undertaking that commenced in 2011 and will continue through at least 2013. This work is part of a national project that is funded by the Department of Energy and managed by the Arizona Geological Survey (AZGS). The data submitted to AZGS are being entered into the National Geothermal Data System (see http://www.stategeothermaldata.org/overview). We are also planning to host the information locally. The main facets of this project are to:
    - digitize and generate metadata for non-published geothermal documents relevant to the State of Hawaii;
    - digitize ~100 years of paper records relevant to well permitting and water resources development, and serve up information on the ~4500 water wells in the state;
    - digitize, organize, and serve up information on research and geothermal exploratory drilling conducted from the 1980s to the present;
    - work with AZGS and OneGeology to contribute a geologic map for Hawaii that integrates geologic and geothermal resource data.
    By December 2012, we anticipate that the majority of the digitization will be complete, the geologic map will be approved, and over 1000 documents will be hosted online through the University of Hawaii's library system (in the "Geothermal Collection" within the "Scholar Space" repository, see http://scholarspace.manoa.hawaii.edu/handle/10125/21320). Developing a user-friendly web interface for the water well and drilling data will be a main task in the coming year. Challenges we have faced and anticipate include: 1) ensuring that no personally identifiable information (e.g. SSN, private telephone numbers, bank or credit account numbers) is contained in the geothermal documents and well files; 2) Homeland Security regulations regarding release of information on critical infrastructure related to municipal water supply systems; 3) maintenance of the well database as future well data are developed with the state's expanding inventory of wells to meet private and public needs. Feedback is welcome.
