Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul
2016-02-15
The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.
Space Telecommunications Radio System (STRS) Application Repository Design and Analysis
NASA Technical Reports Server (NTRS)
Handler, Louis M.
2013-01-01
The Space Telecommunications Radio System (STRS) Application Repository Design and Analysis document describes the STRS application repository for software-defined radio (SDR) applications intended to be compliant with the STRS Architecture Standard. The document explains how artifacts are submitted to the repository, informs potential users of that information, and helps the systems engineer understand the requirements, concepts, and approach behind the repository. The repository is intended to capture knowledge, documents, and other artifacts for each waveform application or other application outside of its project, so that the knowledge is retained when the project ends. The document also describes how technology is transferred from mission to mission, capturing lessons learned for continuous improvement across projects and supporting NASA Procedural Requirements (NPRs) for software engineering projects and NASA's release process.
A Novel Navigation Paradigm for XML Repositories.
ERIC Educational Resources Information Center
Azagury, Alain; Factor, Michael E.; Maarek, Yoelle S.; Mandler, Benny
2002-01-01
Discusses data exchange over the Internet and describes the architecture and implementation of an XML document repository that promotes a navigation paradigm for XML documents based on content and context. Topics include information retrieval and semistructured documents; and file systems as information storage infrastructure, particularly XMLFS.…
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2010 CFR
2010-10-01
... repositories. 227.7207 Section 227.7207 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to...
AASG State Geothermal Data Repository for the National Geothermal Data System.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-01-01
This Drupal-based metadata and document capture and management system is a repository used to maintain metadata describing resources contributed to the AASG State Geothermal Data System. The repository also provides an archive for files that are not hosted by the agency contributing the resource. Data from all 50 state geological surveys are represented here and are contributed in turn to the National Geothermal Data System.
Monitored Geologic Repository Project Description Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. M. Curry
2001-01-30
The primary objective of the Monitored Geologic Repository Project Description Document (PDD) is to allocate the functions, requirements, and assumptions to the systems at Level 5 of the Civilian Radioactive Waste Management System (CRWMS) architecture identified in Section 4. It provides traceability of the requirements to those contained in Section 3 of the "Monitored Geologic Repository Requirements Document" (MGR RD) (YMP 2000a) and other higher-level requirements documents. In addition, the PDD allocates design related assumptions to work products of non-design organizations. The document provides Monitored Geologic Repository (MGR) technical requirements in support of design and performance assessment in preparing for the Site Recommendation (SR) and License Application (LA) milestones. The technical requirements documented in the PDD are to be captured in the System Description Documents (SDDs) which address each of the systems at Level 5 of the CRWMS architecture. The design engineers obtain the technical requirements from the SDDs and by reference from the SDDs to the PDD. The design organizations and other organizations will obtain design related assumptions directly from the PDD. These organizations may establish additional assumptions for their individual activities, but such assumptions are not to conflict with the assumptions in the PDD. The PDD will serve as the primary link between the technical requirements captured in the SDDs and the design requirements captured in US Department of Energy (DOE) documents. The approved PDD is placed under Level 3 baseline control by the CRWMS Management and Operating Contractor (M and O), and the following portions of the PDD constitute the Technical Design Baseline for the MGR: the design characteristics listed in Table 1-1, the MGR Architecture (Section 4.1), the Technical Requirements (Section 5), and the Controlled Project Assumptions (Section 6).
A proposed application programming interface for a physical volume repository
NASA Technical Reports Server (NTRS)
Jones, Merritt; Williams, Joel; Wrenn, Richard
1996-01-01
The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is also underway on APIs for the Physical Volume Library and the Mover. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This document describes a model that defines a Physical Volume Repository and gives a brief summary of the Application Programming Interface (API) that the IEEE Storage Systems Standards Working Group (SSSWG) is proposing as the standard interface for the PVR.
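To make the PVR's role concrete, here is a minimal, purely illustrative Python analogue of the two responsibilities named in the abstract above: tracking cartridge storage locations and mounting/dismounting cartridges onto drives. All class and method names are hypothetical; the actual SSSWG API is defined by the standard itself and is not reproduced here.

```python
# Hypothetical sketch of a Physical Volume Repository's core operations.
class PhysicalVolumeRepository:
    def __init__(self, slots, drives):
        self.slots = dict.fromkeys(slots)    # slot id -> cartridge id or None
        self.drives = dict.fromkeys(drives)  # drive id -> cartridge id or None

    def store(self, cartridge_id, slot_id):
        """Place a removable media cartridge into an empty storage slot."""
        if self.slots[slot_id] is not None:
            raise ValueError(f"slot {slot_id} is occupied")
        self.slots[slot_id] = cartridge_id

    def mount(self, cartridge_id, drive_id):
        """Move a cartridge from its storage slot onto a drive."""
        slot_id = next(s for s, c in self.slots.items() if c == cartridge_id)
        if self.drives[drive_id] is not None:
            raise ValueError(f"drive {drive_id} is busy")
        self.slots[slot_id] = None
        self.drives[drive_id] = cartridge_id

    def dismount(self, drive_id, slot_id):
        """Return the mounted cartridge to an empty storage slot."""
        self.store(self.drives[drive_id], slot_id)
        self.drives[drive_id] = None

pvr = PhysicalVolumeRepository(slots=["S1", "S2"], drives=["D1"])
pvr.store("TAPE-001", "S1")
pvr.mount("TAPE-001", "D1")
pvr.dismount("D1", "S2")
```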
ENVIRONMENTAL INFORMATION MANAGEMENT SYSTEM (EIMS)
The Environmental Information Management System (EIMS) organizes descriptive information (metadata) for data sets, databases, documents, models, projects, and spatial data. The EIMS design provides a repository for scientific documentation that can be easily accessed with standar...
A Remote Knowledge Repository System for Teaching and Learning.
ERIC Educational Resources Information Center
Martins, Protasio D.; Maidantchik, Carmen; Lemos, Leandro T.; Manoel de Seixas, Jose
Changes in the global economy and the extensive use of the Internet have prompted a conceptual redefinition of working and social structures and, consequently, an enhancement of the educational systems that train engineers. This paper presents a repository of remote multimedia information such as formatted or non-formatted documents, hypertext pages,…
An XML-based system for the flexible classification and retrieval of clinical practice guidelines.
Ganslandt, T.; Mueller, M. L.; Krieglstein, C. F.; Senninger, N.; Prokosch, H. U.
2002-01-01
Beneficial effects of clinical practice guidelines (CPGs) have not yet reached expectations due to limited routine adoption. Electronic distribution and reminder systems have the potential to overcome implementation barriers. Existing electronic CPG repositories like the National Guideline Clearinghouse (NGC) provide individual access but lack standardized computer-readable interfaces necessary for automated guideline retrieval. The aim of this paper was to facilitate automated context-based selection and presentation of CPGs. Using attributes from the NGC classification scheme, an XML-based metadata repository was successfully implemented, providing document storage, classification and retrieval functionality. Semi-automated extraction of attributes was implemented for the import of XML guideline documents using XPath. A hospital information system interface was exemplarily implemented for diagnosis-based guideline invocation. Limitations of the implemented system are discussed and possible future work is outlined. Integration of standardized computer-readable search interfaces into existing CPG repositories is proposed. PMID:12463831
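As an illustration of the semi-automated attribute extraction described above, the sketch below pulls classification attributes out of an XML guideline document with XPath expressions, using only the Python standard library (which supports an XPath subset). The element names and paths are invented for illustration; the NGC classification scheme defines the real attributes.

```python
# Hedged sketch: extract NGC-style classification attributes via XPath.
import xml.etree.ElementTree as ET

ATTRIBUTE_PATHS = {  # attribute name -> XPath (subset) query; both invented
    "disease_category":   ".//classification/disease",
    "intended_users":     ".//classification/intendedUsers/user",
    "clinical_specialty": ".//classification/specialty",
}

def extract_attributes(guideline_xml: str) -> dict:
    """Return each attribute with the list of values found in the document."""
    root = ET.fromstring(guideline_xml)
    return {attr: [el.text for el in root.findall(path)]
            for attr, path in ATTRIBUTE_PATHS.items()}

doc = ("<guideline><classification><disease>ICD-10 K80</disease>"
       "<specialty>Surgery</specialty></classification></guideline>")
print(extract_attributes(doc))
```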
TRAC Searchable Research Library
2016-05-01
network accessible document repository for technical documents and similar document artifacts. We used a model-based approach using the Vector... demonstration and model refinement. Subject terms: Knowledge Management, Document Repository, Digital Library, Vector Directional Data Model.
Social tagging in the life sciences: characterizing a new metadata resource for bioinformatics.
Good, Benjamin M; Tennis, Joseph T; Wilkinson, Mark D
2009-09-25
Academic social tagging systems, such as Connotea and CiteULike, provide researchers with a means to organize personal collections of online references with keywords (tags) and to share these collections with others. One of the side-effects of the operation of these systems is the generation of large, publicly accessible metadata repositories describing the resources in the collections. In light of the well-known expansion of information in the life sciences and the need for metadata to enhance its value, these repositories present a potentially valuable new resource for application developers. Here we characterize the current contents of two scientifically relevant metadata repositories created through social tagging. This investigation helps to establish how such socially constructed metadata might be used as it stands currently and to suggest ways that new social tagging systems might be designed that would yield better aggregate products. We assessed the metadata that users of CiteULike and Connotea associated with citations in PubMed with the following metrics: coverage of the document space, density of metadata (tags) per document, rates of inter-annotator agreement, and rates of agreement with MeSH indexing. CiteULike and Connotea were very similar on all of the measurements. In comparison to PubMed, document coverage and per-document metadata density were much lower for the social tagging systems. Inter-annotator agreement within the social tagging systems and agreement between the aggregated social tagging metadata and MeSH indexing were low, though the latter could be increased through voting. The most promising uses of metadata from current academic social tagging repositories will be those that find ways to utilize the novel relationships between users, tags, and documents exposed through these systems. For more traditional kinds of indexing-based applications (such as keyword-based search) to benefit substantially from socially generated metadata in the life sciences, more documents need to be tagged and more tags are needed for each document. These issues may be addressed both by finding ways to attract more users to current systems and by creating new user interfaces that encourage more collectively useful individual tagging behaviour.
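The metrics reported in this study can be illustrated with a toy computation. The sketch below, on invented data, computes document coverage, per-document tag density, and a simple pairwise inter-annotator agreement (Jaccard overlap of tag sets); the study's actual corpora and agreement measure may differ.

```python
# Toy illustration of the study's metrics on invented data.
tagged = {  # document id -> {annotator: set of tags}
    "pmid:1": {"alice": {"genomics", "bioinformatics"},
               "bob":   {"genomics"}},
    "pmid:2": {"alice": {"proteomics"}},
}
corpus_size = 10  # pretend size of the PubMed subset under study

coverage = len(tagged) / corpus_size                      # fraction tagged
density = sum(len(tags) for users in tagged.values()
              for tags in users.values()) / len(tagged)   # tags per document

def jaccard(a, b):
    """Overlap of two tag sets; 1.0 when both are empty."""
    return len(a & b) / len(a | b) if a | b else 1.0

# Pairwise agreement on documents both annotators tagged.
agreements = [jaccard(u["alice"], u["bob"]) for u in tagged.values()
              if {"alice", "bob"} <= u.keys()]
print(coverage, density, agreements)
```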
Preliminary Concept of Operations for the Spent Fuel Management System--WM2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cumberland, Riley M; Adeniyi, Abiodun Idowu; Howard, Rob L
The Nuclear Fuels Storage and Transportation Planning Project (NFST) within the U.S. Department of Energy's Office of Nuclear Energy is tasked with identifying, planning, and conducting activities to lay the groundwork for developing interim storage and transportation capabilities in support of an integrated waste management system. The system will provide interim storage for commercial spent nuclear fuel (SNF) from reactor sites and deliver it to a repository. The system will also include multiple subsystems, potentially including: one or more interim storage facilities (ISF); one or more repositories; facilities to package and/or repackage SNF; and transportation systems. The project team is analyzing options for an integrated waste management system. To support analysis, the project team has developed a Concept of Operations document that describes both the potential integrated system and the inter-dependencies between system components. The goal of this work is to aid systems analysts in the development of consistent models across the project, which involves multiple investigators. The Concept of Operations document will be updated periodically as new developments emerge. At a high level, SNF is expected to travel from reactors to a repository. SNF is first unloaded from reactors and placed in spent fuel pools for wet storage at utility sites. After the SNF has cooled enough to satisfy loading limits, it is placed in a container at reactor sites for storage and/or transportation. After transportation requirements are met, the SNF is transported to an ISF, which stores the SNF until a repository is developed, or directly to a repository if one is available. While the high-level operation of the system is straightforward, analysts must evaluate numerous alternative options. Alternative options include the number of ISFs (if any), ISF design, the stage at which SNF repackaging occurs (if any), repackaging technology, the types of containers used, repository design, component sizing, and timing of events. These alternative options arise due to technological, economic, or policy considerations. As new developments regularly emerge, the operational concepts will be periodically updated. This paper gives an overview of the different potential alternatives identified in the Concept of Operations document at a conceptual level.
NASA Technical Reports Server (NTRS)
Mckay, Charles
1991-01-01
This is the configuration management plan for the AdaNet Repository Based Software Engineering (RBSE) contract. This document establishes the requirements and activities needed to ensure that the products developed for the AdaNet RBSE contract are accurately identified, that proposed changes to the product are systematically evaluated and controlled, that the status of all change activity is known at all times, and that the product achieves its functional performance requirements and is accurately documented.
The Compliance Assurance and Enforcement Division Document Repository (CAEDDOCRESP) provides all CAED staff with internal and external access to Inspection Records, Enforcement Actions, and National Environmental Protection Act (NEPA) documents. The repository will also include supporting documents, images, etc.
Repository-based software engineering program: Concept document
NASA Technical Reports Server (NTRS)
1992-01-01
This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.
NASA Astrophysics Data System (ADS)
Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.
2017-12-01
In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository - the higher the documentation standards, the greater the effort required by the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required by a scientist to upload and archive data. As an outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process termed "metadata and data congruence"; incongruent data packages are forbidden in the repository. EDI reduces the burden of data documentation on scientists in two ways: first, EDI provides hands-on assistance in data documentation best practices, along with tools, written in R and being developed in Python, for generating EML. These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or reduced to a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data. The outcomes of quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support for scientists and its comprehensive set of data quality tests for metadata and data congruency provide an ideal archive for environmental and ecological data.
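As a sketch of what one "metadata and data congruence" test might look like (this is not PASTA's actual implementation), the example below checks that the columns declared in an EML attributeList match the header of the uploaded CSV, and rejects the package otherwise. The toy EML omits namespaces for brevity.

```python
# Minimal congruence check: declared EML attributes vs. actual CSV header.
import csv
import io
import xml.etree.ElementTree as ET

def declared_columns(eml_xml: str):
    """Column names declared in the EML <attributeList>."""
    root = ET.fromstring(eml_xml)
    return [a.findtext("attributeName") for a in root.iter("attribute")]

def congruent(eml_xml: str, csv_text: str) -> bool:
    """True when metadata and data agree on the column structure."""
    header = next(csv.reader(io.StringIO(csv_text)))
    return declared_columns(eml_xml) == header

eml = ("<eml><dataset><attributeList>"
       "<attribute><attributeName>site</attributeName></attribute>"
       "<attribute><attributeName>temp_c</attributeName></attribute>"
       "</attributeList></dataset></eml>")
print(congruent(eml, "site,temp_c\nA,21.5\n"))  # True -> package accepted
```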
DOE Office of Scientific and Technical Information (OSTI.GOV)
Browne, S.V.; Green, S.C.; Moore, K.
1994-04-01
The Netlib repository, maintained by the University of Tennessee and Oak Ridge National Laboratory, contains freely available software, documents, and databases of interest to the numerical, scientific computing, and other communities. This report includes both the Netlib User's Guide and the Netlib System Manager's Guide, and contains information about Netlib's databases, interfaces, and system implementation. The Netlib repository's databases include the Performance Database, the Conferences Database, and the NA-NET mail forwarding and Whitepages Databases. A variety of user interfaces enable users to access the Netlib repository in the manner most convenient and compatible with their networking capabilities. These interfaces include the Netlib email interface, the Xnetlib X Windows client, the netlibget command-line TCP/IP client, anonymous FTP, anonymous RCP, and gopher.
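For historical flavor, the email interface accepted plain-text commands in the message body, such as a request for a library's index. The sketch below only composes such a message with Python's standard library and does not send it; the address and the "send index from lapack" command reflect commonly documented Netlib usage, but should be verified against current Netlib instructions before use.

```python
# Compose (but do not send) a Netlib-style email request.
from email.message import EmailMessage

msg = EmailMessage()
msg["To"] = "netlib@ornl.gov"          # historical Netlib address (assumed)
msg["Subject"] = "netlib request"
msg.set_content("send index from lapack\n")  # ask for the LAPACK index
print(msg)
```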
NASA Astrophysics Data System (ADS)
Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.
2017-12-01
The Environmental Data Initiative (EDI) is an outgrowth of more than 30 years of information management experience and technology from LTER Network data practitioners. EDI builds upon the PASTA data repository software used by the LTER Network Information System and manages more than 42,000 data packages containing tabular data, imagery, and other formats. Development of the repository was a community process beginning in 2009 that included numerous working groups for generating use cases, system requirements, and testing of completed software, thereby creating a vested interest in its success and transparency in design. All software is available for review on GitHub, and refinements and new features are ongoing. Documentation is also available on Read-the-docs, including a comprehensive description of all web-service API methods. PASTA is metadata driven and uses the Ecological Metadata Language (EML) standard for describing environmental and ecological data; a simplified Dublin Core document is also available for each data package. Data are aggregated into packages consisting of metadata and other related content described by an OAI-ORE document. Once archived, each data package becomes immutable and permanent; updates are possible through the addition of new revisions. Components of each data package are accessible through a unique identifier, while the entire data package receives a DOI that is registered in DataCite. Preservation occurs through a combination of DataONE synchronization/replication and a series of local and remote backup strategies, including daily uploads to AWS Glacier storage. Checksums are computed for all data at initial upload, with random verification occurring on a continuous basis, thus ensuring the integrity of data. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML before data are archived; data packages that fail any test are forbidden in the repository. These tests are a measure of data fitness, which ultimately increases confidence in data reuse and synthesis. The EDI data repository is recognized by multiple organizations, including EarthCube's Council of Data Facilities, the United States Geological Survey, FAIRsharing.org, and re3data.org, and is a PLOS- and Nature-recommended data repository.
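The checksum regime described above can be sketched as follows. This is an assumption-laden illustration rather than PASTA's code: the hash algorithm (SHA-1 here) and the scheduling of random re-verification are stand-ins.

```python
# Sketch: checksum at upload, then continuous random re-verification.
import hashlib
import random

archive = {}  # entity id -> (bytes, checksum recorded at upload)

def archive_entity(entity_id: str, data: bytes):
    """Store the entity together with its upload-time checksum."""
    archive[entity_id] = (data, hashlib.sha1(data).hexdigest())

def verify_random_entity() -> bool:
    """Recompute the checksum of a randomly chosen entity."""
    entity_id = random.choice(list(archive))
    data, recorded = archive[entity_id]
    return hashlib.sha1(data).hexdigest() == recorded

archive_entity("knb-lter-sev.1.1", b"site,temp_c\nA,21.5\n")  # hypothetical id
assert verify_random_entity()
```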
ERIC Educational Resources Information Center
Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia
2002-01-01
Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…
ERIC Educational Resources Information Center
Miller, William A.; Billings, Marilyn
2012-01-01
Digital repositories are new tools for documenting the accumulated scholarly work produced at academic institutions and disseminating that material broadly via the internet. Digital repositories support all file types and can be adapted to meet the custom design specifications of individual institutions. A section for community engagement…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.
1983-10-01
Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
Transportation plan repository and archive.
DOT National Transportation Integrated Search
2011-04-01
This project created a repository and archive for transportation planning documents in Texas within the : established Texas A&M Repository (http://digital.library.tamu.edu). This transportation planning archive : and repository provides ready access ...
Concept document of the repository-based software engineering program: A constructive appraisal
NASA Technical Reports Server (NTRS)
1992-01-01
A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.
Borrego, Sofía; Perdomo, Ivette
2016-02-01
The quality of the indoor air can provide very useful information for artwork conservation. The aim of the study was to evaluate the microbial concentration inside six document repositories of the National Archive of the Republic of Cuba in two months of one year. The repositories are large, high, and have a natural cross-ventilation system. The microbial sampling was done in July 2010 (summer, or rainy, month) and February 2011 (winter, or dry, month) using the SAS Super 100 biocollector at 100 L/min. Appropriate selective culture media were used to isolate fungi and bacteria. A high total microbial concentration was observed on the north side of the building in both months studied. The fungal concentrations were significantly higher in July 2010 in all repositories, while the bacterial concentrations were significantly higher, mostly in February 2011, only in repositories located on the first and second floors of the building. Eight fungal genera were isolated from the indoor air of all environments. Regardless of the side of the building analyzed, Penicillium, Aspergillus, and Cladosporium were the predominant genera. Aspergillus flavus and Aspergillus niger were the species isolated in almost all of the analyzed repositories in the studied months. Gram-positive bacteria prevailed among the bacterial groups isolated from the repositories' indoor air, with some fraction corresponding to the genera Bacillus and Streptomyces. In Cuba, temperature and relative humidity are high throughout the year, but natural ventilation plays an important role in retarding microbial growth on materials.
Computational knowledge integration in biopharmaceutical research.
Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim
2003-09-01
An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.
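One of the listed techniques, a document repository with lightweight metadata tied to a shared ontology, can be sketched as follows. Everything here (the ontology terms, fields, and functions) is invented for illustration; the initiative's actual systems are proprietary.

```python
# Illustrative sketch: documents deposited with lightweight, ontology-
# controlled metadata, so findings can later be joined with other data.
ontology = {"GO:0006915": "apoptosis", "CHEBI:15365": "aspirin"}

repository = []

def deposit(doc_id, title, terms):
    """Store a document record; reject terms outside the shared ontology."""
    unknown = [t for t in terms if t not in ontology]
    if unknown:
        raise ValueError(f"terms not in ontology: {unknown}")
    repository.append({"id": doc_id, "title": title, "terms": set(terms)})

def find_by_term(term):
    """Retrieve document ids annotated with the given ontology term."""
    return [d["id"] for d in repository if term in d["terms"]]

deposit("rpt-001", "Aspirin-induced apoptosis study",
        ["GO:0006915", "CHEBI:15365"])
print(find_by_term("GO:0006915"))
```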
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Definition. 19.1501 Section 19.1501 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC... Repository means a secure, Web-based application that collects, stores, and disseminates documents to the...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Definition. 19.1501 Section 19.1501 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC... Repository means a secure, Web-based application that collects, stores, and disseminates documents to the...
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Definition. 19.1501 Section 19.1501 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC... Repository means a secure, Web-based application that collects, stores, and disseminates documents to the...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Definition. 19.1501 Section 19.1501 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC... Repository means a secure, Web-based application that collects, stores, and disseminates documents to the...
Repository of not readily available documents for project W-320
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conner, J.C.
1997-04-18
The purpose of this document is to provide a readily available source of the technical reports needed for the development of the safety documentation provided for the waste retrieval sluicing system (WRSS), designed to remove the radioactive and chemical sludge from tank 241-C-106, and transport that material to double-shell tank 241-AY-102 via a new, temporary, shielded, encased transfer line.
An ontology based information system for the management of institutional repository's collections
NASA Astrophysics Data System (ADS)
Tsolakidis, A.; Kakoulidis, P.; Skourlas, C.
2015-02-01
In this paper we discuss a simple methodological approach to creating and customizing institutional repositories for the domain of technological education. The use of the open-source software platform DSpace is proposed to build the repository application and provide access to digital resources including research papers, dissertations, administrative documents, educational material, etc. The use of OWL ontologies is also proposed for indexing and accessing the various heterogeneous items stored in the repository. Customization and operation of a platform for the selection and use of terms, or parts, of similar existing OWL ontologies is also described. This platform could be based on the open-source software Protégé, which supports OWL, is widely used, and also supports visualization, SPARQL, etc. The combined use of the OWL platform and the DSpace repository forms a basis for creating customized ontologies, accommodating the semantic metadata of items, and facilitating searching.
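A small sketch of the proposed ontology-backed retrieval: repository items annotated with terms from an OWL ontology can be fetched with a SPARQL query that walks the term hierarchy. This uses the third-party rdflib package as a stand-in query engine; the vocabulary is invented, not the authors' actual ontology.

```python
# Hedged sketch of SPARQL retrieval over ontology-annotated items (rdflib).
from rdflib import Graph

ttl = """
@prefix ex: <http://example.org/repo#> .
ex:thesis42 ex:hasSubject ex:SignalProcessing .
ex:SignalProcessing ex:broader ex:ElectronicsEngineering .
"""
g = Graph().parse(data=ttl, format="turtle")

# Find items whose subject falls under ElectronicsEngineering,
# directly or via the broader-term hierarchy (SPARQL 1.1 property path).
q = """
PREFIX ex: <http://example.org/repo#>
SELECT ?item WHERE {
  ?item ex:hasSubject ?s .
  ?s ex:broader* ex:ElectronicsEngineering .
}
"""
for row in g.query(q):
    print(row.item)
```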
17 CFR 49.26 - Disclosure requirements of swap data repositories.
Code of Federal Regulations, 2013 CFR
2013-04-01
... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Disclosure requirements of...
17 CFR 49.26 - Disclosure requirements of swap data repositories.
Code of Federal Regulations, 2012 CFR
2012-04-01
... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Disclosure requirements of...
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2011-12-01
Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.
Method and system of integrating information from multiple sources
Alford, Francine A [Livermore, CA; Brinkerhoff, David L [Antioch, CA
2006-08-15
A system and method of integrating information from multiple sources in a document-centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.
A digital library for medical imaging activities
NASA Astrophysics Data System (ADS)
dos Santos, Marcelo; Furuie, Sérgio S.
2007-03-01
This work presents the development of an electronic infrastructure to make available a free, online, multipurpose and multimodality medical image database. The proposed infrastructure implements a distributed architecture for the medical image database, authoring tools, and a repository for multimedia documents. It also includes a peer-review model that assures the quality of the dataset. This public repository provides a single point of access for medical images and related information to facilitate retrieval tasks. The proposed approach has also been used as an electronic teaching system in radiology.
Privacy Impact Assessment for the eDiscovery Service
This system collects Logical Evidence Files, which include data from workstations, laptops, SharePoint and document repositories. Learn how the data is collected, used, who has access, the purpose of data collection, and record retention policies.
Bytautas, Jessica P; Gheihman, Galina; Dobrow, Mark J
2017-04-01
Quality improvement (QI) is becoming an important focal point for health systems. There is increasing interest among health system stakeholders to learn from and share experiences on the use of QI methods and approaches in their work. Yet there are few easily accessible, online repositories dedicated to documenting QI activity. We conducted a scoping review of publicly available, web-based QI repositories to (i) identify current approaches to sharing information on QI practices; (ii) categorise these approaches based on hosting, scope and size, content acquisition and eligibility, content format and search, and evaluation and engagement characteristics; and (iii) review evaluations of the design, usefulness and impact of their online QI practice repositories. The search strategy consisted of traditional database and grey literature searches, as well as expert consultation, with the ultimate aim of identifying and describing QI repositories of practices undertaken in a healthcare context. We identified 13 QI repositories and found substantial variation across the five categories. The QI repositories used different terminology (eg, practices vs case studies) and approaches to content acquisition, and varied in terms of primary areas of focus. All provided some means for organising content according to categories or themes and most provided at least rudimentary keyword search functionality. Notably, none of the QI repositories included evaluations of their impact. With growing interest in sharing and spreading best practices and increasing reliance on QI as a key contributor to health system performance, the role of QI repositories is likely to expand. Designing future QI repositories based on knowledge of the range and type of features available is an important starting point for improving their usefulness and impact. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
National Pipeline Mapping System (NPMS) : repository standards
DOT National Transportation Integrated Search
1997-07-01
This draft document contains 7 sections. They are as follows: 1. General Topics, 2. Data Formats, 3. Metadata, 4. Attribute Data, 5. Data Flow, 6. Descriptive Process, and 7. Validation and Processing of Submitted Data. These standards were created w...
Organizing Diverse, Distributed Project Information
NASA Technical Reports Server (NTRS)
Keller, Richard M.
2003-01-01
SemanticOrganizer is a software application designed to organize and integrate information generated within a distributed organization or as part of a project that involves multiple, geographically dispersed collaborators. SemanticOrganizer incorporates the capabilities of database storage, document sharing, hypermedia navigation, and semantic interlinking into a system that can be customized to satisfy the specific information-management needs of different user communities. The program provides a centralized repository of information that is both secure and accessible to project collaborators via the World Wide Web. SemanticOrganizer's repository can be used to collect diverse information (including forms, documents, notes, data, spreadsheets, images, and sounds) from computers at collaborators' work sites. The program organizes the information using a unique network-structured conceptual framework, wherein each node represents a data record that contains not only the original information but also metadata (in effect, standardized data that characterize the information). Links among nodes express semantic relationships among the data records. The program features a Web interface through which users enter, interlink, and/or search for information in the repository. By use of this repository, the collaborators have immediate access to the most recent project information, as well as to archived information. A key advantage of SemanticOrganizer is its ability to interlink information together in a natural fashion using customized terminology and concepts that are familiar to a user community.
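A conjectural sketch of the network-structured framework described above: each node holds an original item plus standardized metadata, and typed links carry the semantic relationships. Field and relation names are invented for illustration.

```python
# Hypothetical node/link structure for a semantically interlinked repository.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    content: object                              # form, document, image, ...
    metadata: dict = field(default_factory=dict) # standardized descriptors
    links: list = field(default_factory=list)    # (relation, target node id)

repo = {}
repo["img-7"] = Node("img-7", "turbine-blade.png",
                     {"creator": "jsmith", "site": "Ames"})
repo["note-3"] = Node("note-3", "Crack visible near root.",
                      links=[("describes", "img-7")])

# Follow semantic links outward from a node.
for relation, target in repo["note-3"].links:
    print(relation, "->", repo[target].metadata)
```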
NASA Astrophysics Data System (ADS)
Benedict, K. K.; Servilla, M. S.; Vanderbilt, K.; Wheeler, J.
2015-12-01
The growing volume, variety and velocity of production of Earth science data magnifies the impact of inefficiencies in data acquisition, processing, analysis, and sharing workflows, potentially to the point of impairing the ability of researchers to accomplish their desired scientific objectives. The adaptation of agile software development principles (http://agilemanifesto.org/principles.html) to data curation processes has significant potential to lower barriers to effective scientific data discovery and reuse - barriers that otherwise may force the development of new data to replace existing but unusable data, or require substantial effort to make data usable in new research contexts. This paper outlines a data curation process that was developed at the University of New Mexico that provides a cross-walk of data and associated documentation between the data archive developed by the Long Term Ecological Research (LTER) Network Office (PASTA - http://lno.lternet.edu/content/network-information-system) and UNM's institutional repository (LoboVault - http://repository.unm.edu). The developed automated workflow enables the replication of versioned data objects and their associated standards-based metadata between the LTER system and LoboVault - providing long-term preservation for those data/metadata packages within LoboVault while maintaining the value-added services that the PASTA platform provides. The relative ease with which this workflow was developed is a product of the capabilities independently developed on both platforms - including the simplicity of providing a well-documented application programming interface (API) for each platform enabling scripted interaction and the use of well-established documentation standards (EML in the case of PASTA, Dublin Core in the case of LoboVault) by both systems. These system characteristics, when combined with an iterative process of interaction between the Data Curation Librarian (on the LoboVault side of the process), the Sevilleta LTER Information Manager and the LTER Network Information System developer, yielded a rapid and relatively streamlined process for targeted replication of data and metadata between the two systems - increasing the discoverability and usability of the LTER data assets.
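The PASTA side of such a cross-walk can be sketched with scripted API calls. The endpoint path below is an assumption modeled on PASTA's published web-service API and should be checked against that documentation; the LoboVault (DSpace) deposit step is stubbed out entirely, and the package identifier is hypothetical. Uses the third-party requests package.

```python
# Hedged sketch of one leg of the PASTA -> LoboVault cross-walk.
import requests

PASTA = "https://pasta.lternet.edu/package"  # base URL (assumed)

def fetch_package_metadata(scope: str, identifier: int, revision: int) -> str:
    """Retrieve the EML metadata for one versioned data package."""
    pid = f"{scope}/{identifier}/{revision}"
    resp = requests.get(f"{PASTA}/metadata/eml/{pid}", timeout=30)
    resp.raise_for_status()
    return resp.text  # data entities would be fetched similarly

def deposit_to_lobovault(eml_xml: str):
    """Placeholder: map EML to Dublin Core and deposit via the
    institutional repository's ingest interface (e.g., SWORD)."""
    raise NotImplementedError

eml = fetch_package_metadata("knb-lter-sev", 1, 1)  # hypothetical package id
```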
XDS in healthcare: Could it lead to a duplication problem? Field study from GVR Sweden
NASA Astrophysics Data System (ADS)
Wintell, M.; Lundberg, N.; Lindsköld, L.
2011-03-01
Managing multiple registries and repositories within healthcare regions increases the risk of holding nearly the same information with different status and different content. This is because medical information is created in a dynamic process, so its content changes over its lifetime within the "active" healthcare phase. The information needs to be easily accessible, serving as the platform that makes medical decisions transparent. In the Region Västra Götaland (VGR), Sweden, data are shared among 29 X-ray departments with different Picture Archive and Communication Systems (PACS) and Radiology Information Systems (RIS) through the Infobroker solution, which acts as a broker between the actors involved. Requests/reports from RIS are stored as Digital Imaging and Communications in Medicine (DICOM) Structured Report (SR) objects, together with the images. Every status change within these activities is updated within the information infrastructure, following the Integrating the Healthcare Enterprise (IHE) mission, using Cross-enterprise Document Sharing for Imaging (XDS-I), where the registry and the central repository are the components used for sharing medical documentation. The VGR strategy was not to deploy one regional XDS-I registry and repository; instead, VGR applied an Enterprise Architecture (EA) intertwined with the information infrastructure for dynamic delivery to consumers. The use of different regional XDS registries and repositories could lead to new ways of carrying out shared work, but it can also lead to problems: XDS and XDS-I implemented without a strategy could increase the number of statuses/versions and duplicate information across the information infrastructure.
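The duplication risk flagged in this field study can be shown with a toy merge, below. This is not IHE XDS code; it merely illustrates how the same document registered in two regional registries can drift apart in status and version, so that a consumer querying both sees conflicting "duplicates".

```python
# Toy illustration of status/version drift across two document registries.
registry_a = {"report-123": {"version": 2, "status": "final"}}
registry_b = {"report-123": {"version": 1, "status": "preliminary"}}

def merged_view(*registries):
    """Merge registries, reporting entries that conflict across them."""
    seen = {}
    for registry in registries:
        for doc_id, entry in registry.items():
            if doc_id in seen and seen[doc_id] != entry:
                print(f"conflict for {doc_id}: {seen[doc_id]} vs {entry}")
            seen.setdefault(doc_id, entry)
    return seen

merged_view(registry_a, registry_b)
```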
Documenting Climate Models and Their Simulations
Guilyardi, Eric; Balaji, V.; Lawrence, Bryan; ...
2013-05-01
The results of climate models are of increasing and widespread importance. No longer is climate model output of sole interest to climate scientists and researchers in the climate change impacts and adaptation fields. Now nonspecialists such as government officials, policy makers, and the general public all have an increasing need to access climate model output and understand its implications. For this host of users, accurate and complete metadata (i.e., information about how and why the data were produced) is required to document the climate modeling results. We describe a pilot community initiative to collect and make available documentation of climate models and their simulations. In an initial application, a metadata repository is being established to provide information of this kind for a major internationally coordinated modeling activity known as CMIP5 (Coupled Model Intercomparison Project, Phase 5). We expect that for a wide range of stakeholders, this and similar community-managed metadata repositories will spur development of analysis tools that facilitate discovery and exploitation of Earth system simulations.
Save medical personnel's time by improved user interfaces.
Kindler, H
1997-01-01
Common objectives in the industrial countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case-mix systems for reimbursement by social-security institutions. More data are required to enable quality improvement and increased clinical effectiveness, and for juridical reasons. At first glance, this documentation effort appears to contradict cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort for documentation should be decreased by providing a co-operative working environment for healthcare professionals applying sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow forms an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend toward client/server systems with relational databases or object-oriented databases as the repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.
Generic Argillite/Shale Disposal Reference Case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Liange; Colon, Carlos Jové; Bianchi, Marco
Radioactive waste disposal in a deep subsurface repository hosted in clay/shale/argillite is a subject of widespread interest given the desirable isolation properties, geochemically reduced conditions, and widespread geologic occurrence of this rock type (Hansen 2010; Bianchi et al. 2013). Bianchi et al. (2013) provides a description of diffusion in a clay-hosted repository based on single-phase flow and full saturation using parametric data from documented studies in Europe (e.g., ANDRA 2005). The predominance of diffusive transport and sorption phenomena in this clay media are key attributes to impede radionuclide mobility making clay rock formations target sites for disposal of high-level radioactive waste. The reports by Hansen et al. (2010) and those from numerous studies in clay-hosted underground research laboratories (URLs) in Belgium, France and Switzerland outline the extensive scientific knowledge obtained to assess long-term clay/shale/argillite repository isolation performance of nuclear waste. In the past several years under the UFDC, various kinds of models have been developed for argillite repository to demonstrate the model capability, understand the spatial and temporal alteration of the repository, and evaluate different scenarios. These models include the coupled Thermal-Hydrological-Mechanical (THM) and Thermal-Hydrological-Mechanical-Chemical (THMC) models (e.g. Liu et al. 2013; Rutqvist et al. 2014a, Zheng et al. 2014a) that focus on THMC processes in the Engineered Barrier System (EBS) bentonite and argillite host hock, the large scale hydrogeologic model (Bianchi et al. 2014) that investigates the hydraulic connection between an emplacement drift and surrounding hydrogeological units, and Disposal Systems Evaluation Framework (DSEF) models (Greenberg et al. 2013) that evaluate thermal evolution in the host rock approximated as a thermal conduction process to facilitate the analysis of design options. However, the assumptions and the properties (parameters) used in these models are different, which not only make inter-model comparisons difficult, but also compromise the applicability of the lessons learned from one model to another model. The establishment of a reference case would therefore be helpful to set up a baseline for model development. A generic salt repository reference case was developed in Freeze et al. (2013) and the generic argillite repository reference case is presented in this report. The definition of a reference case requires the characterization of the waste inventory, waste form, waste package, repository layout, EBS backfill, host rock, and biosphere. This report mainly documents the processes in EBS bentonite and host rock that are potentially important for performance assessment and properties that are needed to describe these processes, with brief description other components such as waste inventory, waste form, waste package, repository layout, aquifer, and biosphere. A thorough description of the generic argillite repository reference case will be given in Jové Colon et al. (2014).
Rolling Deck to Repository I: Designing a Database Infrastructure
NASA Astrophysics Data System (ADS)
Arko, R. A.; Miller, S. P.; Chandler, C. L.; Ferrini, V. L.; O'Hara, S. H.
2008-12-01
The NSF-supported academic research fleet collectively produces a large and diverse volume of scientific data, which are increasingly being shared across disciplines and contributed to regional and global syntheses. As both Internet connectivity and storage technology improve, it becomes practical for ships to routinely deliver data and documentation for a standard suite of underway instruments to a central shoreside repository. Routine delivery will facilitate data discovery and integration, quality assessment, cruise planning, compliance with funding agency and clearance requirements, and long-term data preservation. We are working collaboratively with ship operators and data managers to develop a prototype "data discovery system" for NSF-supported research vessels. Our goal is to establish infrastructure for a central shoreside repository, and to develop and test procedures for the routine delivery of standard data products and documentation to the repository. Related efforts are underway to identify tools and criteria for quality control of standard data products, and to develop standard interfaces and procedures for maintaining an underway event log. Development of a shoreside repository infrastructure will include: 1. Deployment and testing of a central catalog that holds cruise summaries and vessel profiles. A cruise summary will capture the essential details of a research expedition (operating institution, ports/dates, personnel, data inventory, etc.), as well as related documentation such as event logs and technical reports. A vessel profile will capture the essential details of a ship's installed instruments (manufacturer, model, serial number, reference location, etc.), with version control as the profile changes through time. The catalog's relational database schema will be based on the UNOLS Data Best Practices Committee's recommendations, and published as a formal XML specification. 2. Deployment and testing of a central repository that holds navigation and routine underway data. Based on discussion with ship operators and data managers at a workgroup meeting in September 2008, we anticipate that a subset of underway data could be delivered from ships to the central repository in near- realtime - enabling the integrated display of ship tracks at a public Web portal, for example - and a full data package could be delivered post-cruise by network transfer or disk shipment. Once ashore, data sets could be distributed to assembly centers such as the Shipboard Automated Meteorological and Oceanographic System (SAMOS) for routine processing, quality assessment, and synthesis efforts - as well as transmitted to national data centers such as NODC and NGDC for permanent archival. 3. Deployment and testing of a basic suite of Web services to make cruise summaries, vessel profiles, event logs, and navigation data easily available. A standard set of catalog records, maps, and navigation features will be published via the Open Archives Initiative (OAI) and Open Geospatial Consortium (OGC) protocols, which can then be harvested by partner data centers and/or embedded in client applications.
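Item 1's relational core might look like the following speculative sketch: a cruise summary table plus a vessel profile table versioned through time. Column names are assumptions for illustration, not the UNOLS committee's actual schema.

```python
# Speculative relational sketch of cruise summaries and vessel profiles.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE vessel_profile (
    vessel_id   TEXT NOT NULL,
    valid_from  TEXT NOT NULL,          -- version control through time
    instruments TEXT,                   -- manufacturer/model/serial, etc.
    PRIMARY KEY (vessel_id, valid_from)
);
CREATE TABLE cruise_summary (
    cruise_id   TEXT PRIMARY KEY,
    vessel_id   TEXT NOT NULL,          -- joins to vessel_profile
    operator    TEXT,
    port_start  TEXT, port_end TEXT,
    date_start  TEXT, date_end  TEXT
);
""")
db.execute("INSERT INTO vessel_profile VALUES ('AT', '2008-01-01', 'ADCP;CTD')")
db.execute("""INSERT INTO cruise_summary VALUES
              ('AT15-20', 'AT', 'WHOI', 'Woods Hole', 'Bermuda',
               '2008-06-01', '2008-06-20')""")
print(db.execute("SELECT * FROM cruise_summary").fetchall())
```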
Accident/Mishap Investigation System
NASA Technical Reports Server (NTRS)
Keller, Richard; Wolfe, Shawn; Gawdiak, Yuri; Carvalho, Robert; Panontin, Tina; Williams, James; Sturken, Ian
2007-01-01
InvestigationOrganizer (IO) is a Web-based collaborative information system that integrates the generic functionality of a database, a document repository, a semantic hypermedia browser, and a rule-based inference system with specialized modeling and visualization functionality to support accident/mishap investigation teams. This accessible, online structure is designed to support investigators by allowing them to make explicit, shared, and meaningful links among evidence, causal models, findings, and recommendations.
Finet, Philippe; Gibaud, Bernard; Dameron, Olivier; Le Bouquin Jeannès, Régine
2016-03-01
The number of patients with complications associated with chronic diseases increases with the ageing population. In particular, complex chronic wounds raise the re-admission rate in hospitals. In this context, the implementation of a telemedicine application in Basse-Normandie, France, helps reduce hospital stays and transport. This application requires a new collaboration among general practitioners, private duty nurses and the hospital staff. However, the main constraint mentioned by the users of this system is the lack of interoperability between the information system of this application and the various partners' information systems. To improve medical data exchanges, the authors propose a new implementation based on the introduction of interoperable clinical documents and a digital document repository for managing the sharing of documents among the telemedicine application users. They then show that this technical solution is suitable for any telemedicine application and any document sharing system in a healthcare facility or network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engeman, J. K.; Girardot, C. L.; Harlow, D. G.
2012-12-20
This report contains reference materials cited in RPP-ASMT-53793, Tank 241-AY-102 Leak Assessment Report, that were obtained from the National Archives Federal Records Repository in Seattle, Washington, or from other sources including the Hanford Site's Integrated Data Management System database (IDMS).
DR Argillite Disposal R&D at LBNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Liange; Kim, Kunhwi; Xu, Hao
2016-08-12
Within the Natural Barrier System (NBS) group of the Used Fuel Disposition (UFD) Campaign at the Department of Energy's (DOE) Office of Nuclear Energy, LBNL's research activities have focused on understanding and modeling EDZ evolution and the associated coupled processes, and the impacts of high temperature on parameters and processes relevant to the performance of a clay repository, to establish the technical basis for the maximum allowable temperature. This report documents results from some of these activities. These activities address key Features, Events, and Processes (FEPs), which have been ranked in importance from medium to high, as listed in Table 7 of the Used Fuel Disposition Campaign Disposal Research and Development Roadmap (FCR&D-USED-2011-000065 REV0) (Nutt, 2011). Specifically, they address FEP 2.2.01, Excavation Disturbed Zone, for clay/shale, by investigating how coupled processes affect EDZ evolution; and FEPs 2.2.05, Flow and Transport Pathways; 2.2.08, Hydrologic Processes; 2.2.07, Mechanical Processes; and 2.2.09, Chemical Process—Transport, by studying near-field coupled THMC processes in clay/shale repositories. The activities documented in this report also address a number of research topics identified in the Research & Development (R&D) Plan for Used Fuel Disposition Campaign (UFDC) Natural System Evaluation and Tool Development (Wang 2011), including Topics S3, Disposal system modeling – Natural System; P1, Development of discrete fracture network (DFN) model; P14, Technical basis for thermal loading limits; and P15, Modeling of disturbed rock zone (DRZ) evolution (clay repository).
Schlue, Danijela; Mate, Sebastian; Haier, Jörg; Kadioglu, Dennis; Prokosch, Hans-Ulrich; Breil, Bernhard
2017-01-01
Heterogeneous tumor documentation and the challenges of interpreting medical terms lead to problems in analyzing data from clinical and epidemiological cancer registries. The objective of this project was to design, implement and improve a national content delivery portal for oncological terms. Data elements of existing handbooks and documentation sources were analyzed, combined and summarized by medical experts from different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototypical tumor wiki was designed and implemented. Requirements engineering techniques were applied to optimize this platform. It targets user groups such as documentation officers, physicians and patients. Linkage to other information sources like PubMed and MeSH was realized.
Food entries in a large allergy data repository
Plasek, Joseph M.; Goss, Foster R.; Lai, Kenneth H.; Lau, Jason J.; Seger, Diane L.; Blumenthal, Kimberly G.; Wickner, Paige G.; Slight, Sarah P.; Chang, Frank Y.; Topaz, Maxim; Bates, David W.
2016-01-01
Objective: Accurate food adverse sensitivity documentation in electronic health records (EHRs) is crucial to patient safety. This study examined, encoded, and grouped foods that caused any adverse sensitivity in a large allergy repository using natural language processing and standard terminologies. Methods: Using the Medical Text Extraction, Reasoning, and Mapping System (MTERMS), we processed both structured and free-text entries stored in an enterprise-wide allergy repository (Partners’ Enterprise-wide Allergy Repository), normalized diverse food allergen terms into concepts, and encoded these concepts using the Systematized Nomenclature of Medicine – Clinical Terms (SNOMED-CT) and Unique Ingredient Identifiers (UNII) terminologies. Concept coverage also was assessed for these two terminologies. We further categorized allergen concepts into groups and calculated the frequencies of these concepts by group. Finally, we conducted an external validation of MTERMS’s performance when identifying food allergen terms, using a randomized sample from a different institution. Results: We identified 158 552 food allergen records (2140 unique terms) in the Partners repository, corresponding to 672 food allergen concepts. High-frequency groups included shellfish (19.3%), fruits or vegetables (18.4%), dairy (9.0%), peanuts (8.5%), tree nuts (8.5%), eggs (6.0%), grains (5.1%), and additives (4.7%). Ambiguous, generic concepts such as “nuts” and “seafood” accounted for 8.8% of the records. SNOMED-CT covered more concepts than UNII in terms of exact (81.7% vs 68.0%) and partial (14.3% vs 9.7%) matches. Discussion: Adverse sensitivities to food are diverse, and existing standard terminologies have gaps in their coverage of the breadth of allergy concepts. Conclusion: New strategies are needed to represent and standardize food adverse sensitivity concepts, to improve documentation in EHRs. PMID:26384406
Food entries in a large allergy data repository.
Plasek, Joseph M; Goss, Foster R; Lai, Kenneth H; Lau, Jason J; Seger, Diane L; Blumenthal, Kimberly G; Wickner, Paige G; Slight, Sarah P; Chang, Frank Y; Topaz, Maxim; Bates, David W; Zhou, Li
2016-04-01
Accurate food adverse sensitivity documentation in electronic health records (EHRs) is crucial to patient safety. This study examined, encoded, and grouped foods that caused any adverse sensitivity in a large allergy repository using natural language processing and standard terminologies. Using the Medical Text Extraction, Reasoning, and Mapping System (MTERMS), we processed both structured and free-text entries stored in an enterprise-wide allergy repository (Partners' Enterprise-wide Allergy Repository), normalized diverse food allergen terms into concepts, and encoded these concepts using the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED-CT) and Unique Ingredient Identifiers (UNII) terminologies. Concept coverage also was assessed for these two terminologies. We further categorized allergen concepts into groups and calculated the frequencies of these concepts by group. Finally, we conducted an external validation of MTERMS's performance when identifying food allergen terms, using a randomized sample from a different institution. We identified 158 552 food allergen records (2140 unique terms) in the Partners repository, corresponding to 672 food allergen concepts. High-frequency groups included shellfish (19.3%), fruits or vegetables (18.4%), dairy (9.0%), peanuts (8.5%), tree nuts (8.5%), eggs (6.0%), grains (5.1%), and additives (4.7%). Ambiguous, generic concepts such as "nuts" and "seafood" accounted for 8.8% of the records. SNOMED-CT covered more concepts than UNII in terms of exact (81.7% vs 68.0%) and partial (14.3% vs 9.7%) matches. Adverse sensitivities to food are diverse, and existing standard terminologies have gaps in their coverage of the breadth of allergy concepts. New strategies are needed to represent and standardize food adverse sensitivity concepts, to improve documentation in EHRs. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
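The exact/partial coverage figures reported in both versions of this abstract follow from a simple matching procedure; the sketch below illustrates the idea with an invented three-entry terminology standing in for SNOMED-CT or UNII (all codes and records are hypothetical):

```python
# Sketch of the concept-coverage measurement described above: free-text
# allergen entries are normalized and looked up in a terminology, and
# exact vs. partial matches are tallied. The tiny terminology dict is
# invented for illustration.
TERMINOLOGY = {
    "peanut": "SCTID:256349002",
    "shellfish": "SCTID:44027008",
    "egg protein": "SCTID:102263004",
}

def match(term: str):
    t = term.strip().lower()
    if t in TERMINOLOGY:                       # exact concept match
        return "exact", TERMINOLOGY[t]
    for name, code in TERMINOLOGY.items():     # partial (substring) match
        if t in name or name in t:
            return "partial", code
    return "none", None

records = ["Peanut", "peanuts", "egg", "kiwi"]
counts = {"exact": 0, "partial": 0, "none": 0}
for r in records:
    kind, _ = match(r)
    counts[kind] += 1

total = len(records)
for kind, n in counts.items():
    print(f"{kind}: {n / total:.1%}")
```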
DOT National Transportation Integrated Search
2017-01-01
The National Transportation Library's (NTL) Repository and Open Science Portal (ROSA P) : is a digital library for transportation, including U. S. Department of Transportation : sponsored research results and technical publications, other documents a...
Rolling Deck to Repository (R2R): Building the Data Pipeline - Initial Results
NASA Astrophysics Data System (ADS)
Arko, R. A.; Clark, P. D.; Rioux, M. A.; McGovern, T. M.; Deering, T. W.; Hagg, R. K.; Payne, A. A.; Fischman, D. E.; Ferrini, V.
2009-12-01
The NSF-funded Rolling Deck to Repository (R2R) project is working with U.S. academic research vessel operators to ensure the documentation and preservation of data from routine “underway” (meteorological, geophysical, and oceanographic) sensor systems. A standard pipeline is being developed in which data are submitted by vessel operators directly to a central repository; inventoried in an integrated fleet-wide catalog; organized into discrete data sets with persistent unique identifiers; associated with essential cruise-level metadata; and delivered to the National Data Centers for archiving and dissemination. Several vessels including Atlantis, Healy, Hugh R. Sharp, Ka'imikai-O-Kanaloa, Kilo Moana, Knorr, Marcus G. Langseth, Melville, Oceanus, Roger Revelle, and Thomas G. Thompson began submitting data and documentation to R2R during the project’s pilot phase, and a repository infrastructure has been established. Cruise metadata, track maps, and data inventories are published at the R2R Web portal, with controlled vocabularies drawn from community standards (e.g. International Council for the Exploration of the Sea (ICES) ship codes). A direct connection has been established to the University-National Oceanographic Laboratory System (UNOLS) Ship Time Request and Scheduling System (STRS) via Web services to synchronize port codes and cruise schedules. A secure portal is being developed where operators may login to upload sailing orders, review data inventories, and create vessel profiles. R2R has established a standard procedure for submission of data to the National Geophysical Data Center (NGDC) that incorporates persistent unique identifiers for cruises, data sets, and individual files, using multibeam data as a test bed. Once proprietary holds are cleared and a data set is delivered to NGDC, the R2R catalog record is updated with the URL for direct download and it becomes immediately available to integration and synthesis projects such as the NSF-funded Global Multi-Resolution Topography (GMRT) synthesis. Similar procedures will be developed for delivery of data to other National Data Centers as appropriate.
Fukushima Daiichi Information Repository FY13 Status
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis; Phelan, Cherie; Schwieder, Dave
The accident at the Fukushima Daiichi nuclear power station in Japan is one of the most serious in commercial nuclear power plant operating history. Much will be learned that may be applicable to the U.S. reactor fleet, nuclear fuel cycle facilities, and supporting systems, and the international reactor fleet. For example, lessons from Fukushima Daiichi may be applied to emergency response planning, reactor operator training, accident scenario modeling, human factors engineering, radiation protection, and accident mitigation; as well as influence U.S. policies towards the nuclear fuel cycle including power generation, and spent fuel storage, reprocessing, and disposal. This document describesmore » the database used to establish a centralized information repository to store and manage the Fukushima data that has been gathered. The data is stored in a secured (password protected and encrypted) repository that is searchable and available to researchers at diverse locations.« less
Rolling Deck to Repository (R2R): Products and Services for the U.S. Research Fleet Community
NASA Astrophysics Data System (ADS)
Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Smith, S. R.; Stocks, K. I.
2016-02-01
The Rolling Deck to Repository (R2R) program is working to ensure open access to environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 TB/year of data to R2R from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R ensures these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R has recently expanded to include the vessels Sikuliaq, operated by the University of Alaska; Falkor, operated by the Schmidt Ocean Institute; and Ronald H. Brown and Okeanos Explorer, operated by NOAA. R2R maintains a master catalog of U.S. research cruises, currently holding over 4,670 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. Standard post-field cruise products are published including shiptrack navigation, near-real-time MET/TSG data, underway geophysical profiles, and CTD profiles. Software tools available to users include the R2R Event Logger and the R2R Nav Manager. A Digital Object Identifier (DOI) is published for each cruise, original field sensor dataset, standard post-field product, and document (e.g. cruise report) submitted by the science party. Scientists are linked to personal identifiers such as ORCIDs where available. Using standard identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. R2R collaborates in the Ocean Data Interoperability Platform (ODIP) to strengthen links among regional and national data systems, populates U.S. cruises in the POGO global catalog, and is working toward membership in the DataONE alliance. It is a lead partner in the EarthCube GeoLink project, developing Semantic Web technologies to share data and documentation between repositories, and in the newly-launched EarthCube SeaView project, delivering data from R2R and other ocean data facilities to scientists using the Ocean Data View (ODV) software tool.
KiMoSys: a web-based repository of experimental data for KInetic MOdels of biological SYStems
2014-01-01
Background: The kinetic modeling of biological systems is mainly composed of three steps that proceed iteratively: model building, simulation and analysis. In the first step, it is usually required to set initial metabolite concentrations and to assign kinetic rate laws, estimating parameter values from kinetic data through optimization when these are not known. Although the rapid development of high-throughput methods has generated much omics data, experimentalists present only a summary of the obtained results for publication; the experimental data files are usually not submitted to any public repository, or are simply not available at all. In order to automate as much as possible the steps of building kinetic models, there is a growing requirement in the systems biology community for easily exchanging data in combination with models, which is the main motivation for KiMoSys's development. Description: KiMoSys is a user-friendly platform that includes a public data repository of published experimental data, containing concentration data for metabolites and enzymes and flux data. It was designed to ensure data management, storage and sharing for the wider systems biology community. This community repository offers a web-based interface and upload facility to turn available data into publicly accessible, centralized and structured-format data files. Moreover, it compiles and integrates available kinetic models associated with the data. KiMoSys also integrates tools to facilitate the construction of kinetic models of large-scale metabolic networks, especially when systems biologists perform computational research. Conclusions: KiMoSys is a web-based system that integrates a public data and associated model(s) repository with computational tools, providing the systems biology community with a novel application facilitating data storage and sharing, thus supporting the construction of ODE-based kinetic models and collaborative research projects. The web application, implemented using the Ruby on Rails framework, is freely available at http://kimosys.org, along with its full documentation. PMID:25115331
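To make the "ODE-based kinetic model" concrete, here is a minimal sketch of the modeling step such a repository supports, using a single Michaelis-Menten reaction with invented parameter values:

```python
# Minimal sketch of an ODE-based kinetic model of the kind KiMoSys
# supports: one enzymatic reaction S -> P with a Michaelis-Menten rate
# law. Parameter values and initial concentrations are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

Vmax, Km = 1.2, 0.5          # rate-law parameters (mM/min, mM)

def rhs(t, y):
    s, p = y
    v = Vmax * s / (Km + s)  # Michaelis-Menten rate law
    return [-v, v]           # substrate consumed, product formed

sol = solve_ivp(rhs, (0.0, 10.0), [2.0, 0.0], t_eval=np.linspace(0, 10, 6))
for t, s, p in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"t={t:4.1f} min  S={s:.3f} mM  P={p:.3f} mM")
```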
Huang, Haiyan; Liu, Chun-Chi; Zhou, Xianghong Jasmine
2010-04-13
The rapid accumulation of gene expression data has offered unprecedented opportunities to study human diseases. The National Center for Biotechnology Information Gene Expression Omnibus is currently the largest database that systematically documents the genome-wide molecular basis of diseases. However, thus far, this resource has been far from fully utilized. This paper describes the first study to transform public gene expression repositories into an automated disease diagnosis database. Particularly, we have developed a systematic framework, including a two-stage Bayesian learning approach, to achieve the diagnosis of one or multiple diseases for a query expression profile along a hierarchical disease taxonomy. Our approach, including standardizing cross-platform gene expression data and heterogeneous disease annotations, allows analyzing both sources of information in a unified probabilistic system. A high level of overall diagnostic accuracy was shown by cross validation. It was also demonstrated that the power of our method can increase significantly with the continued growth of public gene expression repositories. Finally, we showed how our disease diagnosis system can be used to characterize complex phenotypes and to construct a disease-drug connectivity map.
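The two-stage, taxonomy-aware classification described here can be sketched in a few lines. The toy below substitutes Gaussian naive Bayes on synthetic data for the authors' Bayesian treatment of expression profiles, so it illustrates the staging only, not their method:

```python
# Sketch of two-stage classification along a disease taxonomy: stage 1
# predicts the disease class, stage 2 refines to a leaf disease within
# that class. Data and model choice are illustrative.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
# Synthetic "expression profiles": 4 leaf diseases under 2 disease classes.
X = rng.normal(size=(200, 20)) + np.repeat(np.arange(4), 50)[:, None] * 0.8
leaf = np.repeat([0, 1, 2, 3], 50)
top = leaf // 2                      # taxonomy: leaves 0,1 -> class 0; 2,3 -> class 1

stage1 = GaussianNB().fit(X, top)    # stage 1: diagnose the disease class
stage2 = {c: GaussianNB().fit(X[top == c], leaf[top == c]) for c in (0, 1)}

query = X[:5]
cls = stage1.predict(query)          # stage 2: refine within the predicted class
pred = [stage2[c].predict(q[None, :])[0] for c, q in zip(cls, query)]
print("predicted classes:", cls, "predicted leaves:", pred)
```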
Supplier Assessment System (SAS)
NASA Technical Reports Server (NTRS)
Dietrich, Kristen
2016-01-01
Background: Sponsored by NASA Headquarters; Charter - provide information to assist the quality assurance community in evaluating and determining supplier risk; Comprehensive on-line repository of supplier information; Available to approved civil service personnel from all NASA Centers, other U.S. Government Agencies, Prime Contractors, and NASA direct support contractors; User access to specific data types or documents is controlled as needed.
Design guide for high pressure oxygen systems
NASA Technical Reports Server (NTRS)
Bond, A. C.; Pohl, H. O.; Chaffee, N. H.; Guy, W. W.; Allton, C. S.; Johnston, R. L.; Castner, W. L.; Stradling, J. S.
1983-01-01
A repository for critical and important detailed design data and information, hitherto unpublished, along with significant data on oxygen reactivity phenomena with metallic and nonmetallic materials in moderate to very high pressure environments is documented. These data and information provide a ready and easy-to-use reference for the guidance of designers of propulsion, power, and life support systems for use in space flight. The document is also applicable to designs for industrial and civilian uses of high pressure oxygen systems. The information presented herein is derived from data and design practices involving oxygen usage at pressures ranging from about 20 psia to 8000 psia, with thermal conditions ranging from room temperature up to 500 F.
NELS 2.0 - A general system for enterprise wide information management
NASA Technical Reports Server (NTRS)
Smith, Stephanie L.
1993-01-01
NELS, the NASA Electronic Library System, is an information management tool for creating distributed repositories of documents, drawings, and code for use and reuse by the aerospace community. The NELS retrieval engine can load metadata and source files of full-text objects, perform natural language queries to retrieve ranked objects, and create links to connect user interfaces. For flexibility, the NELS architecture has layered interfaces between the application program and the stored library information. The session manager provides the interface functions for development of NELS applications. The data manager is an interface between the session manager and the structured data system. At the center of the structured data system is the Wide Area Information Server. This system architecture provides access to information across heterogeneous platforms in a distributed environment. There are presently three user interfaces that connect to the NELS engine: an X-Windows interface, an ASCII interface, and the Spatial Data Management System. This paper describes the design and operation of NELS as an information management tool and repository.
Framework and prototype for a secure XML-based electronic health records system.
Steele, Robert; Gardner, William; Chandra, Darius; Dillon, Tharam S
2007-01-01
Security of personal medical information has always been a challenge for the advancement of Electronic Health Records (EHRs) initiatives. eXtensible Markup Language (XML), is rapidly becoming the key standard for data representation and transportation. The widespread use of XML and the prospect of its use in the Electronic Health (e-health) domain highlights the need for flexible access control models for XML data and documents. This paper presents a declarative access control model for XML data repositories that utilises an expressive XML role control model. The operational semantics of this model are illustrated by Xplorer, a user interface generation engine which supports search-browse-navigate activities on XML repositories.
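A role-based filter over XML records of the general kind described here can be sketched briefly; the policy format, roles, and element paths below are invented for illustration and are not the paper's model:

```python
# Illustrative sketch of declarative, role-based access control over XML
# records: a rule table maps roles to the element paths they may read,
# and a filter strips everything else.
import xml.etree.ElementTree as ET

POLICY = {"nurse": {"record/vitals"},
          "physician": {"record/vitals", "record/notes"}}

doc = ET.fromstring(
    "<record><vitals><bp>120/80</bp></vitals><notes>confidential</notes></record>")

def filter_view(root, role):
    allowed = POLICY.get(role, set())
    view = ET.Element(root.tag)
    for child in root:
        if f"{root.tag}/{child.tag}" in allowed:
            view.append(child)       # expose only permitted subtrees
    return view

print(ET.tostring(filter_view(doc, "nurse")).decode())
# -> <record><vitals><bp>120/80</bp></vitals></record>
```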
Region-to-area screening methodology for the Crystalline Repository Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1985-04-01
The purpose of this document is to describe the Crystalline Repository Project's (CRP) process for region-to-area screening of exposed and near-surface crystalline rock bodies in the three regions of the conterminous United States where crystalline rock is being evaluated as a potential host for the second nuclear waste repository (i.e., in the North Central, Northeastern, and Southeastern Regions). This document indicates how the US Department of Energy's (DOE) General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories (10 CFR 960) were used to select and apply factors and variables for the region-to-area screening, explains how these factors andmore » variable are to be applied in the region-to-area screening, and indicates how this methodology relates to the decision process leading to the selection of candidate areas. A brief general discussion of the screening process from the national survey through area screening and site recommendation is presented. This discussion sets the scene for detailed discussions which follow concerning the region-to-area screening process, the guidance provided by the DOE Siting Guidelines for establishing disqualifying factors and variables for screening, and application of the disqualifying factors and variables in the screening process. This document is complementary to the regional geologic and environmental characterization reports to be issued in the summer of 1985 as final documents. These reports will contain the geologic and environmental data base that will be used in conjunction with the methodology to conduct region-to-area screening.« less
Haarbrandt, Birger; Wilschko, Andreas; Marschollek, Michael
2016-01-01
In order to integrate operative report documents from two operating room management systems into a data warehouse, we investigated the application of the two-level modelling approach of openEHR to create a shared data model. Based on the systems' analyses, a template consisting of 13 archetypes has been developed. Of these 13 archetypes, 3 have been obtained from the international archetype repository of the openEHR foundation. The remaining 10 archetypes have been newly created. The template was evaluated by an application system expert and through conducting a first test mapping of real-world data from one of the systems. The evaluation showed that by using the two-level modelling approach of openEHR, we succeeded to represent an integrated and shared information model for operative report documents. More research is needed to learn about the limitations of this approach in other data integration scenarios.
Building and Using Digital Repository Certifications across Science
NASA Astrophysics Data System (ADS)
McIntosh, L.
2017-12-01
When scientific recommendations are made based upon research, the quality and integrity of the data should be rigorous enough to verify the claims, and the data should reside in a trusted location. Key to ensuring the transparency and verifiability of research, reproducibility hinges not only on the availability of the documentation, analyses, and data, but also on the ongoing accessibility and viability of the files and documents, enhanced through a process of curation. The Research Data Alliance (RDA) is an international, community-driven, action-oriented, virtual organization committed to enabling the open sharing of data by building social and technical bridges. Within the RDA, multiple groups are working on consensus-building around the certification of digital repositories across scientific domains. For this section of the panel, we will discuss the work to date on repository certification from this RDA perspective.
Releasing Reservations from Isolation.
ERIC Educational Resources Information Center
Ambler, Marjane
1994-01-01
Discusses the role of the emerging tribal libraries, designed to serve as repositories of Native American history and knowledge. Indicates that the tribal libraries and archives have provided a means for recentralizing documents important to Native American history that had previously been moved to distant repositories by outside elements. (MAB)
Unifying Access to National Hydrologic Data Repositories via Web Services
NASA Astrophysics Data System (ADS)
Valentine, D. W.; Jennings, B.; Zaslavsky, I.; Maidment, D. R.
2006-12-01
The CUAHSI hydrologic information system (HIS) is designed to be a live, multiscale web portal system for accessing, querying, visualizing, and publishing distributed hydrologic observation data and models for any location or region in the United States. The HIS design follows the principles of open service-oriented architecture, i.e. system components are represented as web services with well-defined standard service APIs. WaterOneFlow web services are the main component of the design. The currently available services have been completely re-written compared to the previous version, and provide programmatic access to USGS NWIS (stream flow, groundwater, and water quality repositories), DAYMET daily observations, NASA MODIS, and Unidata NAM streams, with several additional web service wrappers being added (EPA STORET, NCDC, and others). Different repositories of hydrologic data use different vocabularies and support different types of query access. Resolving semantic and structural heterogeneities across different hydrologic observation archives and distilling a generic set of service signatures is one of the main scalability challenges in this project, and a requirement in our web service design. To accomplish the uniformity of the web services API, data repositories are modeled following the CUAHSI Observation Data Model. The web service responses are document-based, and use an XML schema to express the semantics in a standard format. Access to station metadata is provided via the web service methods GetSites, GetSiteInfo and GetVariableInfo. These methods form the foundation of the CUAHSI HIS discovery interface and may execute over locally-stored metadata or request the information from remote repositories directly. Observation values are retrieved via a generic GetValues method which is executed against national data repositories. The service is implemented in ASP.Net, and other providers are implementing WaterOneFlow services in Java. A reference implementation of WaterOneFlow web services is available. More information about the ongoing development of CUAHSI HIS is available from http://www.cuahsi.org/his/.
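A client-side call against such a service might look like the following sketch, which assumes the `zeep` SOAP library; the endpoint URL and the site/variable codes are placeholders, and the exact argument list should be checked against the service's WSDL, which varies between WaterOneFlow versions:

```python
# Hypothetical sketch of calling a WaterOneFlow-style GetValues method
# with the zeep SOAP client. Endpoint and codes are placeholders.
from zeep import Client

client = Client("http://hydroportal.example.org/nwisuv/cuahsi_1_1.asmx?WSDL")

# GetValues(location, variable, startDate, endDate, authToken) returns
# a WaterML document with the requested observation time series.
waterml = client.service.GetValues(
    "NWIS:01646500",      # site code (placeholder)
    "NWIS:00060",         # variable code, e.g. discharge (placeholder)
    "2006-01-01", "2006-01-31", "")
print(waterml[:300])      # beginning of the raw WaterML response
```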
Transportation needs assessment: Emergency response section
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The transportation impacts of moving high level nuclear waste (HLNW) to a repository at Yucca Mountain in Nevada are of concern to the residents of the State as well as to the residents of other states through which the nuclear wastes might be transported. The projected volume of the waste suggests that shipments will occur on a daily basis for some period of time. This will increase the risk of accidents, including a catastrophic incident. Furthermore, as the likelihood of repository construction and operation and waste shipments increases, so will the attention given by the national media. This document is not to be construed as a willingness to accept the HLNW repository on the part of the State. Rather, it is an initial step in ensuring that the safety and well-being of Nevada residents and visitors and the State's economy will be adequately addressed in federal decision-making pertaining to the transportation of HLNW into and across Nevada for disposal in the proposed repository. The Preferred Transportation System Needs Assessment identifies critical system design elements and technical and social issues that must be considered in conducting a comprehensive transportation impact analysis. Development of the needs assessment and the impact analysis is especially complex because of the absence of information and experience with shipping HLNW and because of the "low probability, high consequence" aspect of the transportation risk.
NA-42 TI Shared Software Component Library FY2011 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.
The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long-term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy-to-use web-based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed from two different geographic locations, and continues to be used. The knowledge gained from the collaboration and hosting of this repository, in conjunction with PNNL software development and systems engineering capabilities, was used in the selection of a package for the implementation of the software component library on behalf of NA-42 TI. The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI document their development activities. When sufficiently completed, the questionnaire illustrates that the recorded software development activities incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams, and revised versions are distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS-specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and install the software needed to host the component library. Additionally, it defines the process by which users request access to contribute and retrieve library content.
Rolling Deck to Repository (R2R): Standards and Semantics for Open Access to Research Data
NASA Astrophysics Data System (ADS)
Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen
2015-04-01
In recent years, a growing number of funding agencies and professional societies have issued policies calling for open access to research data. The Rolling Deck to Repository (R2R) program is working to ensure open access to the environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 terabytes of data to R2R each year, acquired from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R is working to ensure these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R maintains a master catalog of cruises for the U.S. academic research fleet, currently holding essential documentation for over 3,800 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. A Digital Object Identifier (DOI) is published for 1) each cruise, 2) each original field sensor dataset, 3) each post-field data product such as quality-controlled shiptrack navigation produced by the R2R program, and 4) each document such as a cruise report submitted by the science party. Scientists are linked to personal identifiers, such as the Open Researcher and Contributor ID (ORCID), where known. Using standard global identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. Since its inception, the R2R program has worked in close collaboration with other data repositories in the development of shared semantics for oceanographic research. The R2R cruise catalog uses community-standard terms and definitions hosted by the NERC Vocabulary Server, and publishes ISO metadata records for each cruise that use community-standard profiles developed with the NOAA Data Centers and the EU SeaDataNet project. R2R is a partner in the Ocean Data Interoperability Platform (ODIP), working to strengthen links among regional and national data systems, as well as a lead partner in the EarthCube "GeoLink" project, developing a standard set of ontology design patterns for publishing research data using Semantic Web protocols.
INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.J. Garrett
2005-02-17
The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.
NASA Astrophysics Data System (ADS)
Arko, Robert; Chandler, Cynthia; Stocks, Karen; Smith, Shawn; Clark, Paul; Shepherd, Adam; Moore, Carla; Beaulieu, Stace
2013-04-01
The Rolling Deck to Repository (R2R) program is developing infrastructure to ensure the underway sensor data from U.S. academic oceanographic research vessels are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. The entire R2R Catalog is published online as a Linked Data collection, making it easily accessible to encourage discovery and integration with data at other repositories. We are developing the R2R Linked Data collection with specific goals in mind: 1.) We facilitate data access and reuse by publishing the richest possible collection of resources to describe vessels, cruises, instruments, and datasets from the U.S. academic fleet, including data quality assessment results and clean trackline navigation; 2.) We facilitate data citation through the entire lifecycle from field acquisition to shoreside archiving to journal articles and global syntheses, by publishing Digital Object Identifiers (DOIs) for datasets and encoding them directly into our Linked Data resources; and 3.) We facilitate federation with other repositories such as the Biological and Chemical Oceanography Data Management Office (BCO-DMO), InterRidge Vents Database, and Index to Marine and Lacustrine Geological Samples (IMLGS), by reciprocal linking between RDF resources and supporting the RDF Query Language. R2R participates in the Ocean Data Interoperability Platform (ODIP), a joint European-U.S.-Australian partnership to facilitate the sharing of data and documentation across international borders. We publish our controlled vocabularies as a Simple Knowledge Organization System (SKOS) concept collection, and are working toward alignment with SeaDataNet and other community-standard terms using the NERC Vocabulary Server (NVS). http://rvdata.us/
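As a sketch of what one such Linked Data resource might look like, the snippet below builds a minimal cruise description with rdflib; the r2r: namespace, property names, identifiers, and DOI are hypothetical stand-ins for the vocabularies the catalog actually publishes:

```python
# Sketch of publishing a cruise description as Linked Data with rdflib.
# Namespace, properties, and identifiers are invented for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, DCTERMS

R2R = Namespace("http://example.org/r2r/vocab#")
g = Graph()
g.bind("r2r", R2R)
g.bind("dcterms", DCTERMS)

cruise = URIRef("http://example.org/r2r/cruise/KN195-05")
g.add((cruise, RDF.type, R2R.Cruise))
g.add((cruise, DCTERMS.title, Literal("Knorr cruise KN195-05")))
g.add((cruise, DCTERMS.identifier, Literal("doi:10.9999/example")))  # dataset DOI (placeholder)
g.add((cruise, R2R.hasVessel, URIRef("http://example.org/r2r/vessel/KNORR")))

print(g.serialize(format="turtle"))
```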
Hartman, Victoria; Castillo-Pelayo, Tania; Babinszky, Sindy; Dee, Simon; Leblanc, Jodi; Matzke, Lise; O'Donoghue, Sheila; Carpenter, Jane; Carter, Candace; Rush, Amanda; Byrne, Jennifer; Barnes, Rebecca; Mes-Messons, Anne-Marie; Watson, Peter
2018-02-01
Ongoing quality management is an essential part of biobank operations and the creation of high quality biospecimen resources. Adhering to the standards of a national biobanking network is a way to reduce variability between individual biobank processes, resulting in cross biobank compatibility and more consistent support for health researchers. The Canadian Tissue Repository Network (CTRNet) implemented a set of required operational practices (ROPs) in 2011 and these serve as the standards and basis for the CTRNet biobank certification program. A review of these 13 ROPs covering 314 directives was conducted after 5 years to identify areas for revision and update, leading to changes to 7/314 directives (2.3%). A review of all internal controlled documents (including policies, standard operating procedures and guides, and forms for actions and processes) used by the BC Cancer Agency's Tumor Tissue Repository (BCCA-TTR) to conform to these ROPs was then conducted. Changes were made to 20/106 (19%) of BCCA-TTR documents. We conclude that a substantial fraction of internal controlled documents require updates at regular intervals to accommodate changes in best practices. Reviewing documentation is an essential aspect of keeping up to date with best practices and ensuring the quality of biospecimens and data managed by biobanks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guyer, H.B.; McChesney, C.A.
The overall primary objective of HDAR is to create a repository of historical personnel security documents and provide the functionality needed for archival and retrieval use by other software modules and application users of the DISS/ET system. The software product to be produced from this specification is the Historical Document Archival and Retrieval Subsystem. The product will provide the functionality to capture, retrieve and manage documents currently contained in the personnel security folders in DOE Operations Offices vaults at various locations across the United States. The long-term plan for DISS/ET includes the requirement to allow for capture and storage of arbitrary, currently undefined, clearance-related documents that fall outside the scope of the "cradle-to-grave" electronic processing provided by DISS/ET. However, this requirement is not within the scope of the requirements specified in this document.
Comparing the hierarchy of author given tags and repository given tags in a large document archive
NASA Astrophysics Data System (ADS)
Tibély, Gergely; Pollner, Péter; Palla, Gergely
2016-10-01
Folksonomies - large databases arising from collaborative tagging of items by independent users - are becoming an increasingly important way of categorizing information. In these systems users can tag items with free words, resulting in a tripartite item-tag-user network. Although there are no prescribed relations between tags, the way users think about the different categories presumably has some built in hierarchy, in which more special concepts are descendants of some more general categories. Several applications would benefit from the knowledge of this hierarchy. Here we apply a recent method to check the differences and similarities of hierarchies resulting from tags given by independent individuals and from tags given by a centrally managed repository system. The results from our method showed substantial differences between the lower part of the hierarchies, and in contrast, a relatively high similarity at the top of the hierarchies.
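One simple heuristic in this spirit (not the authors' exact algorithm) derives a parent for each tag from co-occurrence statistics: a tag's parent is the more frequent tag it most often appears together with. A toy sketch:

```python
# Toy hierarchy induction from collaborative tagging data: each tag's
# parent is the strongest co-occurring tag of higher overall frequency.
# The tagged items are invented for illustration.
from collections import Counter
from itertools import combinations

items = [{"music", "rock", "guitar"}, {"music", "rock"},
         {"music", "jazz"}, {"music", "guitar"}]   # tags per item

freq = Counter(t for tags in items for t in tags)
co = Counter()
for tags in items:
    for a, b in combinations(sorted(tags), 2):
        co[(a, b)] += 1
        co[(b, a)] += 1

parent = {}
for t in freq:
    cands = [(co[(t, u)], freq[u], u) for u in freq if freq[u] > freq[t]]
    if cands:
        parent[t] = max(cands)[2]   # strongest co-occurring, more general tag

print(parent)   # e.g. {'rock': 'music', 'guitar': 'music', 'jazz': 'music'}
```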
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-13
... SECURITIES AND EXCHANGE COMMISSION 17 CFR Parts 240 and 249 [Release No. 34-63347; File No. S7-35-10] RIN 3235-AK79 Security-Based Swap Data Repository Registration, Duties, and Core Principles; Correction Correction In proposed rule document C1-2010-29719 beginning on page 79320 in the issue of...
75 FR 79320 - Security-Based Swap Data Repository Registration, Duties, and Core Principles
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-20
... SECURITIES AND EXCHANGE COMMISSION 17 CFR Parts 240 and 249 [Release No. 34-63347; File No. S7-35-10] RIN 3235-AK79 Security-Based Swap Data Repository Registration, Duties, and Core Principles Correction In proposed rule document 2010-29719 beginning on page 77306 in the issue of December 10, 2010...
DOE Office of Scientific and Technical Information (OSTI.GOV)
St. John, C.M.
1977-04-01
An underground repository containing heat-generating High Level Waste or Spent Unreprocessed Fuel may be approximated as a finite number of heat sources distributed across the plane of the repository. The resulting temperature, displacement and stress changes may be calculated using analytical solutions, provided linear thermoelasticity is assumed. This report documents a computer program based on this approach and gives results that form the basis for a comparison between the effects of disposing of High Level Waste and Spent Unreprocessed Fuel.
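The superposition approach described here is straightforward to sketch: the temperature rise from each package is taken from the classical continuous point-source solution, dT = q/(4*pi*k*r) * erfc(r / (2*sqrt(kappa*t))) (Carslaw and Jaeger), and summed over the repository plane. The sketch below uses illustrative property and power values, not the report's:

```python
# Temperature rise from a plane of discrete heat sources, superposed
# from the continuous point-source solution in an infinite medium.
# All parameter values are illustrative.
import numpy as np
from scipy.special import erfc

k, kappa = 2.0, 1.0e-6        # rock conductivity (W/m/K), diffusivity (m^2/s)
q = 500.0                     # constant power per waste package (W)

# 5 x 5 grid of packages on the repository plane, 20 m spacing.
xs, ys = np.meshgrid(np.arange(5) * 20.0, np.arange(5) * 20.0)
sources = np.column_stack([xs.ravel(), ys.ravel()])

def delta_T(point, t):
    r = np.sqrt(((sources - point[:2])**2).sum(axis=1) + point[2]**2)
    return np.sum(q / (4.0 * np.pi * k * r) * erfc(r / (2.0 * np.sqrt(kappa * t))))

t = 100.0 * 3.1557e7          # 100 years in seconds
print("dT 10 m above grid centre:", delta_T(np.array([40.0, 40.0, 10.0]), t), "K")
```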
Skentzos, Stephen; Shubina, Maria; Plutzky, Jorge; Turchin, Alexander
2011-01-01
Adverse reactions to medications to which the patient was known to be intolerant are common. Electronic decision support can prevent them, but only if the history of adverse reactions to medications is recorded in a structured format. We conducted a retrospective study of 31,531 patients with adverse reactions to statins documented in the notes, as identified with natural language processing. The software identified statin adverse reactions with a sensitivity of 86.5% and a precision of 91.9%. Only 9,020 of these patients had an adverse reaction to a statin recorded in structured format. In multivariable analysis, the strongest predictor of structured documentation was utilization of EMR functionality that integrated the medication list with the structured medication adverse reaction repository (odds ratio 48.6, p < 0.0001). Integration of information flow between EMR modules can help improve documentation and potentially prevent adverse drug events. PMID:22195188
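For reference, the two quoted performance figures follow directly from the standard confusion-matrix definitions; the counts below are invented purely to reproduce the reported rates:

```python
# Sensitivity and precision reconstructed from their definitions.
# The confusion-matrix counts are hypothetical, chosen only so that the
# resulting rates match the figures quoted in the abstract.
tp, fn, fp = 865, 135, 76           # true positives, false negatives, false positives
sensitivity = tp / (tp + fn)        # recall: found / all true reactions
precision = tp / (tp + fp)          # correct / all flagged reactions
print(f"sensitivity={sensitivity:.1%}, precision={precision:.1%}")
```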
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Charles R.; Weck, Philippe F.; Vaughn, Palmer
Report RWEV-REP-001, Analysis of Postclosure Groundwater Impacts for a Geologic Repository for the Disposal of Spent Nuclear Fuel and High Level Radioactive Waste at Yucca Mountain, Nye County, Nevada was issued by the DOE in 2009 and is currently being updated. Sandia National Laboratories (SNL) provided support for the original document, performing calculations and extracting data from the Yucca Mountain Performance Assessment Model that were used as inputs to the contaminant transport and dose calculations by Jason Associates Corporation, the primary developers of the DOE report. The inputs from SNL were documented in LSA-AR-037, Inputs to Jason Associates Corporation in Support of the Postclosure Repository Supplemental Environmental Impact Statement. To support the updating of the original Groundwater Impacts document, SNL has reviewed the inputs provided in LSA-AR-037 to verify that they are current and appropriate for use. The results of that assessment are documented here.
Implementation of Medical Information Exchange System Based on EHR Standard
Han, Soon Hwa; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-01-01
Objectives: To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. Methods: To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents, and network security. Results: The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. Conclusions: This research chooses transfer items and defines document standards that follow CDA standards, such that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information. PMID:21818447
Implementation of Medical Information Exchange System Based on EHR Standard.
Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-12-01
To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To assure that patient's medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. This research chooses transfer items and defines document standards that follow CDA standards, such that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.
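The registry/repository split that both versions of this abstract describe is a common pattern: a registry holds document metadata and the location of the holding repository, and retrieval resolves the registry entry first. A minimal illustrative sketch (class names, fields, and URLs are invented):

```python
# Illustrative registry/repository pattern for CDA documents: register
# metadata plus location, then resolve and fetch. Names are invented.
from dataclasses import dataclass

@dataclass
class RegistryEntry:
    doc_id: str          # CDA document identifier
    patient_id: str
    repository_url: str  # where a transfer service would fetch from

REGISTRY = {}
REPOSITORIES = {"http://hospital-a.example/cda": {}}

def register(entry: RegistryEntry, cda_xml: str):
    REGISTRY[entry.doc_id] = entry
    REPOSITORIES[entry.repository_url][entry.doc_id] = cda_xml

def retrieve(doc_id: str) -> str:
    entry = REGISTRY[doc_id]                           # locate via the registry
    return REPOSITORIES[entry.repository_url][doc_id]  # fetch from the repository

register(RegistryEntry("CDA-001", "P-42", "http://hospital-a.example/cda"),
         "<ClinicalDocument>...</ClinicalDocument>")
print(retrieve("CDA-001"))
```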
Managing operational documentation in the ALICE Detector Control System
NASA Astrophysics Data System (ADS)
Lechman, M.; Augustinus, A.; Bond, P.; Chochula, P.; Kurepin, A.; Pinazza, O.; Rosinsky, P.
2012-12-01
ALICE (A Large Ion Collider Experiment) is one of the big LHC (Large Hadron Collider) experiments at CERN in Geneva, Switzerland. The experiment is composed of 18 sub-detectors controlled by an integrated Detector Control System (DCS) that is implemented using the commercial SCADA package PVSSII. The DCS includes over 1,200 network devices, over 1,000,000 monitored parameters and numerous custom-made software components that are prepared by over 100 developers from all around the world. This complex system is controlled by a single operator via a central user interface. One of his/her main tasks is the recovery of anomalies and errors that may occur during operation. Therefore, clear, complete and easily accessible documentation is essential to guide the shifter through the expert interfaces of the different subsystems. This paper describes the idea of the management of operational documentation in ALICE using a generic repository that is built on a relational database and is integrated with the control system. The experience gained and the conclusions drawn from the project are also presented.
Development of a user-centered radiology teaching file system
NASA Astrophysics Data System (ADS)
dos Santos, Marcelo; Fujino, Asa
2011-03-01
Learning radiology requires systematic and comprehensive study of a large knowledge base of medical images. This work presents the development of a digital radiology teaching file system. The proposed system offers a set of customized services tailored to users' contexts and informational needs, by means of an electronic infrastructure that provides easy, integrated access to all relevant patient data at the time of image interpretation, so that radiologists and researchers can examine all available data to reach well-informed conclusions while protecting patient data privacy and security. The system is presented as an environment that implements a distributed clinical database, including medical images, authoring tools, a repository for multimedia documents, and a peer-reviewed model that assures dataset quality. The current implementation has shown that creating clinical data repositories on networked computer environments is a good solution for reviewing information management practices in electronic environments and for creating customized, context-based tools for users connected to the system through electronic interfaces.
Phoenix: Service Oriented Architecture for Information Management - Abstract Architecture Document
2011-09-01
implementation logic and policy if and which Information Brokering and Repository Services the information is going to be forwarded to. These service chains...descriptions are going to be retrieved. Raised Exceptions: • Exception getConsumers(sessionTrack : SessionTrack, information : Information...that extend the usefulness of the IM system as a whole. • Client • Event Notification • Filter • Information Discovery • Security • Service
Tool Integration and Environment Architectures
1991-05-01
include the Interactive Development Environment (IDE) Software Through Pictures (STP), Sabre-C and FrameMaker coalition, and the Verdix Ada Development...System (VADS) APSE, which includes the VADS compiler and choices of CADRE Teamwork or STP and FrameMaker or Interleaf. The key characteristic of...remote procedure execution to achieve a simulation of a homogeneous repository (i.e., a simulation that the data in a FrameMaker document resides in one
M-Learning and Augmented Reality: A Review of the Scientific Literature on the WoS Repository
ERIC Educational Resources Information Center
Fombona, Javier; Pascual-Sevillano, Maria-Angeles; González-Videgara, MariCarmen
2017-01-01
Augmented reality is emerging as a tool whose real educational value needs to be examined. This paper shows the results of a bibliometric analysis performed on documents collected from the Web of Science repository, an Internet service that concentrates bibliographic information from more than 7,000 institutions. Our analysis included an…
ERIC Educational Resources Information Center
Sutradhar, B.
2006-01-01
Purpose: To describe how an institutional repository (IR) was set up, using open source software, at the Indian Institute of Technology (IIT) in Kharagpur. Members of the IIT can publish their research documents in the IR for online access as well as digital preservation. Material in this IR includes instructional materials, records, data sets,…
Open Scenario Study: IDA Open Scenario Repository User’s Manual
2010-01-01
Thomason (Study Co-Lead), Zachary S. Rabold (Sub-Task Lead), Ylli Bajraktari, Rachel D. Dubin, Mary Catherine Flythe. Open Scenario Study: IDA Open Scenario Repository User's Manual. Preface: This document reports the... Appendices: A. Identifying Scenario Components; B. Acronyms
Criteria for the evaluation and certification of long-term digital archives in the earth sciences
NASA Astrophysics Data System (ADS)
Klump, Jens
2010-05-01
Digital information has become an indispensable part of our cultural and scientific heritage. Scientific findings, historical documents and cultural achievements are to a rapidly increasing extent being presented in electronic form - in many cases exclusively so. However, besides the invaluable advantages offered by this form, it also carries a serious disadvantage: users need to invest a great deal of technical effort in accessing the information. Also, the underlying technology is still undergoing further development at an exceptionally fast pace. The rapid obsolescence of the technology required to read the information combined with the frequently imperceptible physical decay of the media themselves represents a serious threat to preservation of the information content. Many data sets in earth science research are from observations that cannot be repeated. This makes these digital assets particularly valuable. Therefore, these data should be kept and made available for re-use long after the end of the project from which they originated. Since research projects only run for a relatively short period of time, it is advisable to shift the burden of responsibility for long-term data curation from the individual researcher to a trusted data repository or archive. But what makes a trusted data repository? Each trusted digital repository has its own targets and specifications. The trustworthiness of digital repositories can be tested and assessed on the basis of a criteria catalogue. This is the main focus of the work of the nestor working group "Trusted repositories - Certification". It identifies criteria which permit the trustworthiness of a digital repository to be evaluated, both at the organisational and technical levels. The criteria are defined in close collaboration with a wide range of different memory organisations, producers of information, experts and other interested parties. This open approach ensures a high degree of universal validity, suitability for daily practical use and also broad-based acceptance of the results. The criteria catalogue is also intended to present the option of documenting trustworthiness by means of certification in a standardised national or international process. The criteria catalogue is based on the Reference Model for an Open Archival Information System (OAIS, ISO 14721:2003). With its broad approach, the nestor criteria catalogue for trusted digital repositories has to remain on a high level of abstraction. For application in the earth sciences the evaluation criteria need to be transferred into the context of earth science data and their designated user community. This presentation offers a brief introduction to the problems surrounding the long-term preservation of digital objects. This introduction is followed by a proposed application of the criteria catalogue for trusted digital repositories to the context of earth science data and their long-term preservation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1985-12-31
In 1982, Congress enacted the Nuclear Waste Policy Act (Public Law 97-425), which established a comprehensive national program directed toward siting, constructing, and operating geologic repositories for the permanent disposal of high-level radioactive waste. In February 1983, the United States Department of Energy (DOE) identified nine locations as potentially acceptable sites for a mined geologic repository. These sites have been evaluated in accordance with the DOE's General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories. The DOE findings and determinations are based on the evaluations contained in the draft Environmental Assessments (EA). A final EA will be prepared after considering the comments received on the draft EA. The purpose of this document is to provide the public with specific site information on each potential repository location.
HPGMG 1.0: A Benchmark for Ranking High Performance Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Mark; Brown, Jed; Shalf, John
2014-05-05
This document provides an overview of the benchmark, HPGMG, for ranking large-scale general-purpose computers for use on the Top500 list [8]. We provide a rationale for replacing the current metric, HPL; some background on the Top500 list and the challenges of developing such a metric; a discussion of our design philosophy and methodology; and an overview of the specification of the benchmark. The primary documentation, with maintained details on the specification, can be found at hpgmg.org; the Wiki and the benchmark code itself can be found in the repository https://bitbucket.org/hpgmg/hpgmg.
Use of a Knowledge Management System in Waste Management Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruendler, D.; Boetsch, W.U.; Holzhauer, U.
2006-07-01
In Germany the knowledge management system 'WasteInfo' about waste management and disposal issues has been developed and implemented. Beneficiaries of 'WasteInfo' are official decision makers having access to a large information pool. The information pool is fed by experts, so-called authors. This means compiling information, evaluating it, and assigning appropriate properties (metadata) to this information. The knowledge management system 'WasteInfo' was introduced at WM04, and its operation at WM05. The present contribution describes the additional advantage of the KMS when used as a tool for dealing with waste management projects. This specific aspect will be demonstrated using a project concerning a comparative analysis of the implementation of repositories in six countries using nuclear power as examples: The information in 'WasteInfo' is assigned to categories and structured according to its origin and type of publication. To use 'WasteInfo' as a tool for processing projects, a suitable set of categories has to be developed for each project. Apart from technical and scientific aspects, the selected project deals with repository strategies and policies in various countries, with the roles of applicants and authorities in licensing procedures, with safety philosophy and with socio-economic concerns. This new point of view has to be modelled in the categories. Similarly, new sources of information, such as local and regional dailies or particular websites, have to be taken into consideration. In this way 'WasteInfo' represents an open document which reflects the current status of the respective repository policy in several countries. Information with particular meaning for German repository planning is marked and may thus influence the German strategy. (authors)
2011-03-28
particular topic of interest. Paper-based documents require the availability of a physical instance of a document, involving the transport of documents...repository of documents via the World Wide Web and search engines offer support in locating documents that are likely to contain relevant information. The... Web, with news agencies, newspapers, various organizations, and individuals as sources. Clearly the analysis, interpretation, and integration of
NASA Astrophysics Data System (ADS)
Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen
Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independently of its underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archives Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VMs); they can therefore be run in their native environments and easily deployed on virtualized infrastructures, allowing interoperability with new-generation technologies such as cloud computing, application-on-demand, automatic service/application deployment and automatic VM generation.
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2012 CFR
2012-01-01
... characteristic, of a specific or technical nature. It may, for example, document research, experimental... computer software documentation). Examples of technical data include research and engineering data... repository, to take title to the spent nuclear fuel or high-level radioactive waste involved as expeditiously...
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2011 CFR
2011-01-01
... characteristic, of a specific or technical nature. It may, for example, document research, experimental... computer software documentation). Examples of technical data include research and engineering data... repository, to take title to the spent nuclear fuel or high-level radioactive waste involved as expeditiously...
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S
2015-01-01
We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
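The retrieval procedure described here can be approximated with plain HTTP content negotiation against a handle resolver. The Python sketch below assumes a DataCite-style DOI resolver that honours the Accept header; the DOI shown is a placeholder, not one taken from the paper.

import requests

DOI = "10.5061/dryad.example"  # hypothetical dataset DOI, for illustration only

def fetch(doi: str, accept: str) -> bytes:
    """Resolve a persistent identifier and request a specific media type
    via HTTP content negotiation on the handle resolver."""
    resp = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": accept},
        allow_redirects=True,  # follow the resolver redirect to the repository
        timeout=30,
    )
    resp.raise_for_status()
    return resp.content

# Request citation metadata; DataCite resolvers support this media type.
metadata = fetch(DOI, "application/vnd.datacite.datacite+json")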
NASA Technical Reports Server (NTRS)
Carvalho, Robert F.; Williams, James; Keller, Richard; Sturken, Ian; Panontin, Tina
2004-01-01
InvestigationOrganizer (IO) is a collaborative web-based system designed to support the conduct of mishap investigations. IO provides a common repository for a wide range of mishap related information, and allows investigators to make explicit, shared, and meaningful links between evidence, causal models, findings and recommendations. It integrates the functionality of a database, a common document repository, a semantic knowledge network, a rule-based inference engine, and causal modeling and visualization. Thus far, IO has been used to support four mishap investigations within NASA, ranging from a small property damage case to the loss of the Space Shuttle Columbia. This paper describes how the functionality of IO supports mishap investigations and the lessons learned from the experience of supporting two of the NASA mishap investigations: the Columbia Accident Investigation and the CONTOUR Loss Investigation.
Repository-Based Software Engineering Program: Working Program Management Plan
NASA Technical Reports Server (NTRS)
1993-01-01
Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL), in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marsha Keister; Kathryn McBride
The Nuclear Waste Policy Act of 1982 (NWPA), as amended, assigned the Department of Energy (DOE) responsibility for developing and managing a Federal system for the disposal of spent nuclear fuel (SNF) and high-level radioactive waste (HLW). The Office of Civilian Radioactive Waste Management (OCRWM) is responsible for accepting, transporting, and disposing of SNF and HLW at the Yucca Mountain repository in a manner that protects public health, safety, and the environment; enhances national and energy security; and merits public confidence. OCRWM faces a near-term challenge: to develop and demonstrate a transportation system that will sustain safe and efficient shipments of SNF and HLW to a repository. To better inform and improve its current planning, OCRWM has extensively reviewed plans and other documents related to past high-visibility shipping campaigns of SNF and other radioactive materials within the United States. This report summarizes the results of this review and, where appropriate, lessons learned.
An overview of platforms for cloud based development.
Fylaktopoulos, G; Goumas, G; Skolarikis, M; Sotiropoulos, A; Maglogiannis, I
2016-01-01
This paper provides an overview of state-of-the-art technologies for software development in cloud environments. The surveyed systems cover the whole spectrum of cloud-based development, including integrated programming environments, code repositories, software modeling, composition and documentation tools, and application management and orchestration. In this work we evaluate the existing cloud development ecosystem based on a wide range of characteristics, such as applicability (e.g. programming and database technologies supported), productivity enhancement (e.g. editor capabilities, debugging tools), support for collaboration (e.g. repository functionality, version control) and post-development application hosting, and we compare the surveyed systems. The survey indicates that software engineering in the cloud era has taken its initial steps, showing the potential to provide concrete implementation and execution environments for cloud-based applications. However, a number of important challenges need to be addressed for this approach to be viable. These challenges are discussed in the article, and we conclude that although several steps have been made, a compact and reliable solution does not yet exist.
Evolution of computational models in BioModels Database and the Physiome Model Repository.
Scharm, Martin; Gebhardt, Tom; Touré, Vasundra; Bagnacani, Andrea; Salehzadeh-Yazdi, Ali; Wolkenhauer, Olaf; Waltemath, Dagmar
2018-04-12
A useful model is one that is being (re)used. The development of a successful model does not finish with its publication. During reuse, models are being modified, i.e. expanded, corrected, and refined. Even small changes in the encoding of a model can, however, significantly affect its interpretation. Our motivation for the present study is to identify changes in models and make them transparent and traceable. We analysed 13,734 models from BioModels Database and the Physiome Model Repository. For each model, we studied the frequencies and types of updates between its first and latest release. To demonstrate the impact of changes, we explored the history of a Repressilator model in BioModels Database. We observed continuous updates in the majority of models. Surprisingly, even the early models are still being modified. We furthermore detected that many updates target annotations, which improves the information one can gain from models. To support the analysis of changes in model repositories we developed MoSt, an online tool for visualisations of changes in models. The scripts used to generate the data and figures for this study are available from GitHub at https://github.com/binfalse/BiVeS-StatsGenerator and as a Docker image at https://hub.docker.com/r/binfalse/bives-statsgenerator/. The website https://most.bio.informatik.uni-rostock.de/ provides interactive access to model versions and their evolutionary statistics. The reuse of models is still impeded by a lack of trust and documentation. A detailed and transparent documentation of all aspects of the model, including its provenance, will improve this situation. Knowledge about a model's provenance can avoid the repetition of mistakes that others already faced. More insights are gained into how the system evolves from initial findings to a profound understanding. We argue that it is the responsibility of the maintainers of model repositories to offer transparent model provenance to their users.
SPECIATE 4.0: SPECIATION DATABASE DEVELOPMENT DOCUMENTATION--FINAL REPORT
SPECIATE is the U.S. EPA's repository of total organic compounds (TOC) and particulate matter (PM) speciation profiles of air pollution sources. This report documents how EPA developed the SPECIATE 4.0 database that replaces the prior version, SPECIATE 3.2. SPECIATE 4.0 includes ...
Rolling Deck to Repository (R2R): Technical Design - Experiences and Lessons (Invited)
NASA Astrophysics Data System (ADS)
Arko, R. A.; Carbotte, S. M.; Miller, S. P.; Chandler, C. L.; Ferrini, V.; Stocks, K.; Maffei, A. R.; Smith, S. R.; Bourassa, M. A.; McLean, S. J.; Alberts, J. C.
2009-12-01
The NSF-funded Rolling Deck to Repository (R2R) project envisions the academic research fleet as an integrated global observing system, with routine “underway” sensor data flowing directly from research vessels to a central shore-side repository. It is a complex endeavor involving many stakeholders - technicians at sea, data managers on shore, ship schedulers, clearance officers, funding agencies, National Data Centers, data synthesis projects, the science community, and the public - working toward a common goal of acquiring, documenting, archiving, evaluating, and disseminating high-quality scientific data. The technical design for R2R is guided by several key principles: 1) The data pipeline is modular, so that initial stages (e.g. inventory and review of data shipments, posting of catalog records and track maps) may proceed routinely for every cruise, while later stages (e.g. quality assessment and production of file-level metadata) may proceed at different rates for different data types; 2) Cruise documentation (e.g. sailing orders, review/release of data inventories, vessel profiles) is gathered primarily via an authenticated Web portal, linked with the UNOLS scheduling database to synchronize vocabularies and eliminate redundancies; and 3) Every data set will be documented and delivered to the appropriate National Data Center for long-term archiving and dissemination after proprietary holds are cleared, while R2R maintains a master cruise catalog that links all the data sets together. This design accommodates the diversity of instrument types, data volumes, and shipment schedules among fleet operators. During its pilot development period, R2R has solicited feedback at community workshops, UNOLS meetings, and conference presentations, including fleet-wide surveys of current practices and instrument inventories. Several vessel operators began submitting cruise data and documentation during the pilot, providing a test bed for database development and Web portal design as well as feedback on delivery formats and data policies. Visits to operating institutions, including time at sea, have been critical to understanding the full range of vessel classes, capabilities, and concerns, and will continue to be an integral component of the R2R project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marietta, Melvin Gary; Anderson, D. Richard; Bonano, Evaristo J.
2011-11-01
Sandia National Laboratories (SNL) is the world leader in the development of the detailed science underpinning the application of a probabilistic risk assessment methodology, referred to in this report as performance assessment (PA), for (1) understanding and forecasting the long-term behavior of a radioactive waste disposal system, (2) estimating the ability of the disposal system and its various components to isolate the waste, (3) developing regulations, (4) implementing programs to estimate the safety that the system can afford to individuals and to the environment, and (5) demonstrating compliance with the attendant regulatory requirements. This report documents the evolution of the SNL PA methodology from its inception in the mid-1970s, summarizing major SNL PA applications including: the Subseabed Disposal Project PAs for high-level radioactive waste; the Waste Isolation Pilot Plant PAs for disposal of defense transuranic waste; the Yucca Mountain Project total system PAs for deep geologic disposal of spent nuclear fuel and high-level radioactive waste; PAs for the Greater Confinement Borehole Disposal boreholes at the Nevada National Security Site; and PA evaluations for disposal of high-level wastes and Department of Energy spent nuclear fuels stored at Idaho National Laboratory. In addition, the report summarizes smaller PA programs for long-term cover systems implemented for the Monticello, Utah, mill-tailings repository; a PA for the SNL Mixed Waste Landfill in support of environmental restoration; PA support for radioactive waste management efforts in Egypt, Iraq, and Taiwan; and, most recently, PAs for analysis of alternative high-level radioactive waste disposal strategies, including deep borehole disposal and geologic repositories in shale and granite. Finally, this report summarizes the extension of the PA methodology for radioactive waste disposal toward development of an enhanced PA system for carbon sequestration and storage systems. These efforts have produced a generic PA methodology for the evaluation of waste management systems that has gained wide acceptance within the international community. This report documents how this methodology has been used as an effective management tool to evaluate different disposal designs and sites; inform development of regulatory requirements; identify, prioritize, and guide research aimed at reducing uncertainties for objective estimations of risk; and support safety assessments.
System and method for responding to ground and flight system malfunctions
NASA Technical Reports Server (NTRS)
Anderson, Julie J. (Inventor); Fussell, Ronald M. (Inventor)
2010-01-01
A system for on-board anomaly resolution for a vehicle has a data repository. The data repository stores data related to the different systems, subsystems, and components of the vehicle, encoded in a tree-based structure. A query engine coupled to the data repository provides user and automated interfaces and supports contextual queries against the repository. An inference engine coupled to the query engine compares current anomaly data to contextual data stored in the data repository using inference rules, and generates a potential solution to the current anomaly by referencing the data stored in the repository.
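As a rough illustration of this architecture (not the patented implementation; the node names, rule conditions and anomaly format below are invented), a tree-encoded repository queried by rule-based inference might look like this in Python:

from dataclasses import dataclass, field

@dataclass
class Node:
    """One vehicle system, subsystem, or component in the tree-encoded repository."""
    name: str
    data: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def find(self, name):
        """Depth-first search of the tree for a named node."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit:
                return hit
        return None

# Inference rules map an observed anomaly to a candidate resolution,
# using context looked up in the repository tree (hypothetical rules).
RULES = [
    (lambda a, n: a["kind"] == "overcurrent" and n.data.get("redundant"),
     "switch to redundant unit"),
    (lambda a, n: a["kind"] == "overcurrent",
     "shed load and isolate circuit"),
]

def resolve(root, anomaly):
    node = root.find(anomaly["component"])
    for condition, solution in RULES:
        if node and condition(anomaly, node):
            return solution
    return "no rule matched; escalate to ground"

bus = Node("power_bus", {"redundant": True})
root = Node("vehicle", children=[bus])
print(resolve(root, {"component": "power_bus", "kind": "overcurrent"}))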
The Amistad Research Center: Documenting the African American Experience.
ERIC Educational Resources Information Center
Chepesiuk, Ron
1993-01-01
Describes the Amistad Research Center housed at Tulane University which is a repository of primary documents on African-American history. Topics addressed include the development and growth of the collection; inclusion of the American Missionary Association archives; sources of support; civil rights; and collecting for the future. (LRW)
Burchill, C; Roos, L L; Fergusson, P; Jebamani, L; Turner, K; Dueck, S
2000-01-01
Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Sub-Heading (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated.
ERIC Educational Resources Information Center
McBride, Lawrence W., Ed.; Drake, Frederick D., Ed.
This curriculum considers the social history of Illinois during the years 1836-1861 by studying Abraham Lincoln's legal papers from his time as a lawyer. Nearly 100,000 documents have been discovered in the archives of local, county, state, and federal courts, libraries, and other repositories. The documents include detailed information about the…
The Biological Reference Repository (BioR): a rapid and flexible system for genomics annotation.
Kocher, Jean-Pierre A; Quest, Daniel J; Duffy, Patrick; Meiners, Michael A; Moore, Raymond M; Rider, David; Hossain, Asif; Hart, Steven N; Dinu, Valentin
2014-07-01
The Biological Reference Repository (BioR) is a toolkit for annotating variants. BioR stores public and user-specific annotation sources in indexed JSON-encoded flat files (catalogs). The BioR toolkit provides the functionality to combine and retrieve annotation from these catalogs via the command-line interface. Several catalogs from commonly used annotation sources and instructions for creating user-specific catalogs are provided. Commands from the toolkit can be combined with other UNIX commands for advanced annotation processing. We also provide instructions for the development of custom annotation pipelines. The package is implemented in Java and makes use of external tools written in Java and Perl. The toolkit can be executed on Mac OS X 10.5 and above or any Linux distribution. The BioR application, quickstart, and user guide documents and many biological examples are available at http://bioinformaticstools.mayo.edu. © The Author 2014. Published by Oxford University Press.
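To illustrate the catalog idea, the Python sketch below mimics a JSON-encoded flat-file catalog and a positional lookup. The field names are modeled on BioR's golden-attribute convention as best we recall it (treat them as assumptions), the entries are invented, and the linear scan stands in for the real indexed retrieval.

import json

# Illustrative catalog lines: one JSON document per variant, as in a
# BioR-style flat-file catalog (field names and values are hypothetical).
CATALOG = [
    '{"_landmark":"17","_minBP":41244435,"_maxBP":41244435,"ID":"rs55770810","gene":"BRCA1"}',
    '{"_landmark":"13","_minBP":32906729,"_maxBP":32906729,"ID":"rs80357382","gene":"BRCA2"}',
]

def lookup(chrom: str, pos: int):
    """Linear-scan stand-in for the indexed retrieval a real catalog would use."""
    for line in CATALOG:
        entry = json.loads(line)
        if entry["_landmark"] == chrom and entry["_minBP"] <= pos <= entry["_maxBP"]:
            yield entry

for hit in lookup("17", 41244435):
    print(hit["ID"], hit["gene"])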
76 FR 81950 - Privacy Act; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... "Consolidated Data Repository" (09-90-1000). This system of records is being amended to include records... Repository" (SORN 09-90-1000). OIG is adding record sources to the system. This system fulfills our..., and investigations of the Medicare and Medicaid programs. SYSTEM NAME: Consolidated Data Repository...
Identifying Tensions in the Use of Open Licenses in OER Repositories
ERIC Educational Resources Information Center
Amiel, Tel; Soares, Tiago Chagas
2016-01-01
We present an analysis of 50 repositories for educational content conducted through an "audit system" that helped us classify these repositories, their software systems, promoters, and how they communicated their licensing practices. We randomly accessed five resources from each repository to investigate the alignment of licensing…
Code of Federal Regulations, 2010 CFR
2010-01-01
... repository after permanent closure. 60.112 Section 60.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Performance Objectives § 60.112 Overall system performance objective for the geologic repository after permanent closure...
Semantic Document Model to Enhance Data and Knowledge Interoperability
NASA Astrophysics Data System (ADS)
Nešić, Saša
To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents: uniquely identified and semantically annotated composite resources that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF- and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and distant semantic document repositories for document content units (CUs) over Semantic Web protocols.
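A minimal sketch of what an MP document instance could look like as RDF, using the Python rdflib library. The vocabulary below is hypothetical, standing in for the SDM ontology, which the abstract does not reproduce.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

# Hypothetical namespace standing in for the SDM ontology.
SDM = Namespace("http://example.org/sdm#")

g = Graph()
doc = URIRef("http://example.org/docs/report-42")    # the composite document
cu = URIRef("http://example.org/docs/report-42#s1")  # one content unit within it

g.add((doc, RDF.type, SDM.SemanticDocument))
g.add((doc, DC.title, Literal("Quarterly report")))
g.add((cu, RDF.type, SDM.ContentUnit))
g.add((cu, SDM.partOf, doc))
g.add((cu, SDM.annotatedWith, Literal("budget summary")))

# Serialize the machine-processable instance as Turtle.
print(g.serialize(format="turtle"))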
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowe, B.
1980-12-31
This document summarizes an oral presentation that described the potential for volcanic activity at the proposed Yucca Mountain, Nevada repository site. Yucca Mountain is located in a broad zone of volcanic activity known as the Death Valley-Pancake Range volcanic zone. The probability estimate for the likelihood that some future volcanic event will intersect a buried repository at Yucca Mountain is low. Additionally, the radiological consequences of penetration of a repository by basaltic magma, followed by eruption of the magma at the surface, are limited. The combination of low probability and limited consequence suggests that the risk posed by waste storage at this site is low. (TEM)
Annual Historical Summary, Defense Documentation Center, 1 July 1968 to 30 June 1969.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
This summary describes the more significant activities and achievements of the Defense Documentation Center (DDC) including: DDC and the scientific and technical community. The DDC role in the Department of Defense Scientific and Technical Information Program continued to shift from the traditional concept of an archival repository and a…
Ondigita: A Platform for the Management and Delivery of Digital Documents
ERIC Educational Resources Information Center
Mazza, Riccardo; Baldassari, Andrea; Guidi, Roberto
2013-01-01
This paper presents Ondigita, a platform developed at the University of Applied Sciences of Southern Switzerland for the management and delivery of digital documents to students enrolled in bachelor's courses in various curricula within the field of engineering. Ondigita allows our organization to have a cloud-based repository of educational…
Sandia National Laboratories Hazardous Waste (RCRA) Information Repository
Albuquerque, New Mexico. Hard copies of the documents are available for review at Zimmerman Library. Those who prefer to receive hard-copy notices of updates may download, complete, and mail the subscription form.
Proceedings of the 8th US/German Workshop on Salt Repository Research Design and Operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Francis D.; Steininger, Walter; Bollingerfehr, Wilhelm
This document records the Proceedings of the 2017 gathering of salt repository nations. In a spirit of mutual support, technical issues are dissected, led capably by subject matter experts. As before, it is not possible to explore all contemporary issues regarding nuclear waste disposal in salt formations. Instead, the group focused on a few selected issues to be pursued in depth, while at the same time acknowledging and recording ancillary issues.
Poltronieri, Elisabetta; Truccolo, Ivana; Di Benedetto, Corrado; Castelli, Mauro; Mazzocut, Mauro; Cognetti, Gaetana
2010-12-20
The Open Archive Initiative (OAI) refers to a movement started around the '90s to guarantee free access to scientific information by removing the barriers to research results, especially those related to ever-increasing journal subscription prices. This new paradigm has reshaped the scholarly communication system and is closely connected to the build-up of institutional repositories (IRs), conceived for the benefit of scientists and research bodies as a means to keep possession of their own literary production. IRs are high-value tools which permit authors to gain visibility by enabling rapid access to scientific material (not only publications), thus increasing impact (citation rate) and permitting a multidimensional assessment of research findings. A survey was conducted in March 2010 to explore the managing systems for archiving research findings adopted by the Italian Scientific Institutes for Research, Hospitalization and Health Care (IRCCS) of the oncology area within the Italian National Health Service (Servizio Sanitario Nazionale, SSN). They were asked to respond to a questionnaire intended to collect data about institutional archives, metadata formats and the posting of full-text documents. The enquiry also concerned the perceived role of the institutional repository DSpace ISS, built by the Istituto Superiore di Sanità (ISS) and based on an XML scheme for encoding metadata. Such a repository aims at acting as a unique reference point for the biomedical information produced by Italian research institutions. An in-depth analysis has also been performed on the collection of information material addressed to patients produced by the institutions surveyed. The survey respondents were 6 out of 9. The results reveal the use of different practices and standards among the institutions concerning the type of documentation collected, the software adopted, the use and format of metadata, and the conditions of accessibility to the IRs. The Italian research institutions in the field of oncology are taking their first steps towards the philosophy of OA. The main effort should be the implementation of common procedures, also in order to connect scientific publications to researchers' curricula. In this framework, an important effort is represented by the ISS project aimed at setting up a common interface able to allow migration of data from partner institutions to the OA-compliant repository DSpace ISS.
NASA Astrophysics Data System (ADS)
Prodanovic, M.; Esteva, M.; Ketcham, R. A.
2017-12-01
Nanometer- to centimeter-scale imaging, such as (focused ion beam) scanning electron microscopy, magnetic resonance imaging and X-ray (micro)tomography, has since the 1990s produced 2D and 3D datasets of rock microstructure that allow investigation of nonlinear flow and mechanical phenomena on length scales that are otherwise inaccessible to laboratory measurements. The numerical approaches that use such images produce various upscaled parameters required by subsurface flow and deformation simulators. All of this has revolutionized our knowledge of grain-scale phenomena. However, a lack of data-sharing infrastructure among research groups makes it difficult to integrate different length scales. We have developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (https://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of engineering and geosciences researchers not necessarily trained in computer science or data analysis. The Digital Rocks Portal (NSF EarthCube Grant 1541008) is the first repository for imaged porous-microstructure data. It is implemented within the reliable, 24/7-maintained High Performance Computing infrastructure supported by the Texas Advanced Computing Center (University of Texas at Austin). Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative. We show how the data can be documented, referenced in publications via digital object identifiers, visualized, searched for, and linked to other repositories. We show the recently implemented integration of remote parallel visualization, bulk upload for large datasets, and a preliminary flow-simulation workflow with the pore structures currently stored in the repository. We discuss the issues of collecting correct metadata, data discoverability and repository sustainability.
Overview of the TREC 2008 Legal Track
2008-11-01
CDIP) Test Collection, version 1.0 (referred to here as "IIT CDIP 1.0"), which is based on documents released under the tobacco "Master Settlement...repository, the Legacy Tobacco Documents Library (LTDL), for tobacco documents [9]. The IIT CDIP 1.0 collection is based on a snapshot, generated between November 2005 and January 2006, of the MSA subcollection of the LTDL. The IIT CDIP 1.0 collection consists of 6,910,192 document records in the form of
Scharm, Martin; Wolkenhauer, Olaf; Waltemath, Dagmar
2016-02-15
Repositories support the reuse of models and ensure transparency about results in publications linked to those models. With thousands of models available in repositories, such as the BioModels database or the Physiome Model Repository, a framework to track the differences between models and their versions is essential to compare and combine models. Difference detection not only allows users to study the history of models but also helps in the detection of errors and inconsistencies. Existing repositories lack algorithms to track a model's development over time. Focusing on SBML and CellML, we present an algorithm to accurately detect and describe differences between coexisting versions of a model with respect to (i) the models' encoding, (ii) the structure of biological networks and (iii) mathematical expressions. This algorithm is implemented in a comprehensive and open source library called BiVeS. BiVeS helps to identify and characterize changes in computational models and thereby contributes to the documentation of a model's history. Our work facilitates the reuse and extension of existing models and supports collaborative modelling. Finally, it contributes to better reproducibility of modelling results and to the challenge of model provenance. The workflow described in this article is implemented in BiVeS. BiVeS is freely available as source code and binary from sems.uni-rostock.de. The web interface BudHat demonstrates the capabilities of BiVeS at budhat.sems.uni-rostock.de. © The Author 2015. Published by Oxford University Press.
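BiVeS's actual algorithm is format-aware (SBML/CellML) and far more sophisticated. Purely to illustrate what detecting differences between coexisting versions of a model means at the XML level, here is a naive Python sketch that classifies changes to species elements as inserts, deletes, or updates; the toy model encoding is invented.

import xml.etree.ElementTree as ET

V1 = '<model><species id="A" initial="10"/><species id="B" initial="5"/></model>'
V2 = '<model><species id="A" initial="12"/><species id="C" initial="1"/></model>'

def index(xml_text):
    """Map each species id to its attributes (a toy stand-in for a model tree)."""
    root = ET.fromstring(xml_text)
    return {e.get("id"): dict(e.attrib) for e in root.iter("species")}

old, new = index(V1), index(V2)
for sid in sorted(old.keys() | new.keys()):
    if sid not in new:
        print("deleted:", sid)
    elif sid not in old:
        print("inserted:", sid)
    elif old[sid] != new[sid]:
        print("updated:", sid, old[sid], "->", new[sid])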
Collaborative Information Retrieval Method among Personal Repositories
NASA Astrophysics Data System (ADS)
Kamei, Koji; Yukawa, Takashi; Yoshida, Sen; Kuwabara, Kazuhiro
In this paper, we describe a collaborative information retrieval method among personal repositories and an implementation of the method on a personal agent framework. We propose a framework for personal agents that aims to enable the sharing and exchange of information resources that are distributed unevenly among individuals. The kernel of the personal agent framework is an RDF (Resource Description Framework)-based information repository for storing, retrieving and manipulating privately collected information, such as documents the user read and/or wrote, email he/she exchanged, and web pages he/she browsed. The repository also collects annotations to information resources that describe relationships among them, and records of interaction between the user and information resources. Since the information resources in a personal repository and their structure are personalized, information retrieval from other users' repositories is an important application of the personal agent. A vector space model with a personalized concept-base is employed as the information retrieval mechanism in a personal repository. Since a personalized concept-base is constructed from the information resources in a personal repository, it reflects its user's knowledge and interests. On the other hand, this leads to a problem when querying other users' personal repositories: simply transferring query requests does not provide desirable results. To solve this problem, we propose a query equalization scheme based on a relevance feedback method for collaborative information retrieval between personalized concept-bases. In this paper, we describe an implementation of the collaborative information retrieval method and its user interface on the personal agent framework.
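The query-equalization step is described only abstractly here; a common relevance-feedback formulation it resembles is Rocchio's, sketched below in Python on a toy vector space model. The weights, parameters, and corpus are illustrative, not the paper's.

import math
from collections import Counter

def tfidf_vectors(docs):
    """Toy vector space: term frequency weighted by inverse document frequency."""
    df = Counter(t for d in docs for t in set(d.split()))
    n = len(docs)
    def vec(text):
        tf = Counter(text.split())
        return {t: tf[t] * math.log((n + 1) / (df[t] + 1)) for t in tf}
    return vec

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rocchio(query, relevant, alpha=1.0, beta=0.75):
    """Shift the query toward documents judged relevant in the peer repository,
    approximating a 'query equalization' step before cross-repository search."""
    out = {t: alpha * w for t, w in query.items()}
    for doc in relevant:
        for t, w in doc.items():
            out[t] = out.get(t, 0.0) + beta * w / len(relevant)
    return out

docs = ["personal agent framework", "rdf repository of documents", "email the user exchanged"]
vec = tfidf_vectors(docs)
q = rocchio(vec("repository"), [vec(docs[1])])
print(max(docs, key=lambda d: cosine(q, vec(d))))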
Enabling outsourcing XDS for imaging on the public cloud.
Ribeiro, Luís S; Rodrigues, Renato P; Costa, Carlos; Oliveira, José Luís
2013-01-01
Picture Archiving and Communication Systems (PACS) have been the main paradigm supporting medical imaging workflows during the last decades. Despite this consolidation, the appearance of Cross-Enterprise Document Sharing for Imaging (XDS-I) within the IHE initiative constitutes a great opportunity to readapt the PACS workflow for inter-institutional data exchange. XDS-I provides centralized discovery of medical imaging and associated reports. However, the centralized XDS-I actors (document registry and repository) must be deployed in a trustworthy node in order to safeguard patient privacy, data confidentiality and integrity. This paper presents XDS for Protected Imaging (XDS-p), a new approach to XDS-I that can be outsourced (e.g., to cloud computing) while addressing privacy, confidentiality, integrity and legal concerns about patients' medical information.
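One way to make a repository safe to outsource, which the abstract implies but does not detail, is to encrypt document content before it leaves the institution, so the cloud host stores only opaque ciphertext. A minimal Python sketch using the cryptography package's Fernet primitive follows; this illustrates the principle only and is not the actual XDS-p scheme.

from cryptography.fernet import Fernet

# The key stays inside the healthcare institution; only ciphertext and
# opaque identifiers leave for the cloud-hosted registry/repository.
key = Fernet.generate_key()
f = Fernet(key)

report = b"Patient 123: chest CT, no acute findings."
envelope = f.encrypt(report)    # what the cloud repository would store
restored = f.decrypt(envelope)  # only the key holder can read it back
assert restored == report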
What is the Value Proposition of Persistent Identifiers?
NASA Astrophysics Data System (ADS)
Klump, Jens; Huber, Robert
2017-04-01
Persistent identifiers (PID) are widely used today in scientific communication and documentation. Global unique identification plus persistent resolution of links to referenced digital research objects have been strong selling points for PID Systems as enabling technical infrastructures. Novel applications of PID Systems in research now go beyond the identification of file-based objects such as literature or data sets and include the identification of dynamically changing datasets accessed through web services, physical objects, persons and organisations. But not only do we see more use cases but also a proliferation of identifier systems. An analysis of PID Systems used by 1,381 repositories listed in the Registry of Research Data Repositories (re3data.org, status of 14 Dec 2015) showed that many disciplinary data repositories make use of PIDs that are not among the systems promoted by the libraries and publishers (DOI, PURL, ARK). This indicates that a number of communities have developed their own PID Systems. This raises the question: do we need more identifier systems? What makes their value proposition more appealing than those of already existing systems? On the other hand, some of these new use cases deal with entities outside the digital domain, the original scope of application for PIDs. It is therefore necessary to critically appraise the value propositions of available PID Systems and compare these against the requirements of new use cases for PID. Undoubtedly, DOI are the most used persistent identifier in scholarly communication. It was originally designed "to link customers with publishers, facilitate electronic commerce, and enable copyright management systems." Today, the DOI system is described as providing "a technical and social infrastructure for the registration and use of persistent interoperable identifiers for use on digital networks". This example shows how value propositions can change over time. Additional value can be gained by cross-linking between PID Systems, thus allowing new scholarly documentation and evaluation methods such as documenting the track record of researchers in publications and successful funding proposals, applying advanced bibliometric approaches, estimating the output and impact of funding, assessing the reuse and subsequent impact of data publications, demonstrating the efficient use of research infrastructures, etc. This recombination of systems raises a series of new expectations, and each stakeholder group may have its own vision of the benefits and value proposition of PIDs, which might be in conflict with others. New PID applications will arise with the application of PID Systems to semantic web technologies and to the Internet of Things, which extend PID applications beyond digital objects to concepts and things, respectively, raising yet again their own expectations and value propositions. What are we trying to identify? What is the purpose served by identifying it? What are the implications for semantic web technologies? How certain can we be about the identity of an object and its state changes over time (Ship of Theseus Paradox)? In this presentation we will discuss a number of PID use cases and match these against the value propositions offered by a selection of PID Systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nutt, M.; Nuclear Engineering Division
2010-05-25
The activity of Phase I of the Waste Management Working Group under the United States - Japan Joint Nuclear Energy Action Plan started in 2007. The US-Japan JNEAP is a bilateral collaborative framework to support the global implementation of safe, secure, and sustainable nuclear fuel cycles (referred to in this document as fuel cycles). The Waste Management Working Group was established through the strong interest of both parties, arising from the recognition that development and optimization of waste management and disposal system(s) are central issues of present and future nuclear fuel cycles. This report summarizes the activity of the Waste Management Working Group that focused on consolidation of the existing technical basis between the U.S. and Japan and the joint development of a plan for future collaborative activities. Firstly, the political/regulatory frameworks related to nuclear fuel cycles in both countries were reviewed. The various advanced fuel cycle scenarios that have been considered in both countries were then surveyed and summarized. The working group established the working reference scenario for the future cooperative activity that corresponds to a fuel cycle scenario being considered both in Japan and the U.S. This working scenario involves transitioning from a once-through fuel cycle utilizing light water reactors to a one-pass uranium-plutonium fuel recycle in light water reactors to a combination of light water reactors and fast reactors with plutonium, uranium, and minor actinide recycle, ultimately concluding with multiple recycle passes primarily using fast reactors. Considering the scenario, current and future expected waste streams, treatment and inventory were discussed, and the relevant information was summarized. Second, the waste management/disposal system optimization was discussed. Repository system concepts were reviewed, repository design concepts for the various classifications of nuclear waste were summarized, and the factors to consider in repository design and optimization were then discussed. Japan is considering various alternatives and options for the geologic disposal facility, and the framework for future analysis of repository concepts was discussed. Regarding advanced waste and storage form development, waste form technologies developed in both countries were surveyed and compared. Potential collaboration areas and activities were next identified. Disposal system optimization processes and techniques were reviewed, and factors to consider in future repository design optimization activities were also discussed. Then the potential collaboration areas and activities related to the optimization problem were extracted.
75 FR 73095 - Privacy Act of 1974; Report of New System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-29
... Repository" System No. 09-70-0587. The final rule for the Medicare and Medicaid EHR Incentive Program... primary purpose of this system, called the National Level Repository or NLR, is to collect, maintain, and... Maintenance of Data in the System The National Level Repository (NLR) contains information on eligible...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barr, G.E.; Borns, D.J.; Fridrich, C.
A comprehensive collection of scenarios is presented that connects initiating tectonic events with radionuclide releases through logical and physically possible combinations or sequences of features, events, and processes. The initiating tectonic events include both discrete faulting and distributed rock deformation developed through the repository and adjacent to it, as well as earthquake-induced ground motion and changes in tectonic stress at the site. The effects of these tectonic events include impacts on the engineered-barrier system, such as container rupture and failure of repository tunnels. They also include a wide range of hydrologic effects, such as changes in pathways and flow rates in the unsaturated and saturated zones, changes in the water-table configuration, and the development of perched-water systems. These scenarios are intended to guide performance-assessment analyses and to assist principal investigators in determining how essential field, laboratory, and calculational studies are used. This suite of scenarios will help ensure that all important aspects of system disturbance related to a tectonic scenario are captured in numerical analyses. It also provides a record of all options considered by project analysts, supplying documentation required for licensing. The final portion of this report discusses issues remaining to be addressed with respect to tectonic activity. 105 refs.
Difficult to Document: The History of Physics and Allied Fields in Industrial and Government Labs
ERIC Educational Resources Information Center
Anderson, R. Joseph
2005-01-01
Approximately thirty years ago archivists began formulating new models to guide archival collecting, creating a literature that continues to grow. In the mid-1980s, the introduction of the documentation strategy collection model put new emphasis on cooperation between repositories and among stakeholders. The model initially focused on the history…
In Search of a Better Search Engine
ERIC Educational Resources Information Center
Kolowich, Steve
2009-01-01
Early this decade, the number of Web-based documents stored on the servers of the University of Florida hovered near 300,000. By the end of 2006, that number had leapt to four million. Two years later, the university hosts close to eight million Web documents. Web sites for colleges and universities everywhere have become repositories for data…
Foster Town History and Documents Located at the Tyler Free Library.
ERIC Educational Resources Information Center
McDonough, Leslie B.
This annotated bibliography attempts to make the collection of the Tyler Free Library in Foster, Rhode Island, more accessible to anyone interested in the history of the town. The library has long been an unofficial repository of historical information and town documents for the community of Foster, Rhode Island. The library also houses the files…
KAT: A Flexible XML-based Knowledge Authoring Environment
Hulse, Nathan C.; Rocha, Roberto A.; Del Fiol, Guilherme; Bradshaw, Richard L.; Hanna, Timothy P.; Roemer, Lorrie K.
2005-01-01
As part of an enterprise effort to develop new clinical information systems at Intermountain Health Care, the authors have built a knowledge authoring tool that facilitates the development and refinement of medical knowledge content. At present, users of the application can compose order sets and an assortment of other structured clinical knowledge documents based on XML schemas. The flexible nature of the application allows the immediate authoring of new types of documents once an appropriate XML schema and accompanying Web form have been developed and stored in a shared repository. The need for a knowledge acquisition tool stems largely from the desire for medical practitioners to be able to write their own content for use within clinical applications. We hypothesize that medical knowledge content for clinical use can be successfully created and maintained through XML-based document frameworks containing structured and coded knowledge. PMID:15802477
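To make the schema-driven approach concrete, the sketch below parses a minimal, hypothetical order-set document using only Python's standard library. The element names and codes are invented for illustration and do not come from the Intermountain Health Care system or its XML schemas.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical order-set document of the kind a schema-driven
# authoring tool might produce (element and attribute names are illustrative).
ORDER_SET = """
<orderSet name="Community-Acquired Pneumonia">
  <order code="RXNORM:723" description="amoxicillin 500 mg oral capsule"/>
  <order code="LOINC:6690-2" description="WBC count"/>
</orderSet>
"""

root = ET.fromstring(ORDER_SET)
print(f"Order set: {root.attrib['name']}")
for order in root.findall("order"):
    print(f"  {order.attrib['code']}: {order.attrib['description']}")
```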
Scaling an expert system data mart: more facilities in real-time.
McNamee, L A; Launsby, B D; Frisse, M E; Lehmann, R; Ebker, K
1998-01-01
Clinical data repositories are being rapidly adopted by large healthcare organizations as a method of centralizing and unifying clinical data currently stored in diverse and isolated information systems. Once data are stored in a clinical data repository, healthcare organizations seek to use this centralized resource to analyze, interpret, and influence clinical care, quality, and outcomes. A recent trend in the repository field has been the adoption of data marts: specialized subsets of enterprise-wide data taken from a larger repository and designed specifically to answer highly focused questions. A data mart exploits the data stored in the repository but can use unique structures or summary statistics generated specifically for an area of study. Thus, data marts benefit from the existence of a repository, are less general than a repository, but provide more effective and efficient support for an enterprise-wide data analysis task. In previous work, we described the use of batch processing for populating data marts directly from legacy systems. In this paper, we describe an architecture that uses both primary data sources and an evolving enterprise-wide clinical data repository to create real-time data sources for a clinical data mart supporting highly specialized clinical expert systems.
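The repository-to-mart relationship described above can be sketched in a few lines of Python with the standard sqlite3 module; the table and column names are hypothetical, not from the authors' system, and a real mart would be refreshed continuously rather than built once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A general-purpose repository table: one row per clinical observation.
conn.execute("CREATE TABLE repository (patient_id TEXT, facility TEXT, "
             "loinc_code TEXT, value REAL, observed_at TEXT)")
conn.executemany(
    "INSERT INTO repository VALUES (?, ?, ?, ?, ?)",
    [("p1", "north", "2345-7", 110.0, "1998-01-02"),
     ("p2", "north", "2345-7", 160.0, "1998-01-03"),
     ("p3", "south", "2345-7", 95.0, "1998-01-03")])

# A data mart: a specialized summary derived from the repository,
# pre-aggregated to answer one focused question efficiently
# (here, serum glucose results per facility; 2345-7 is the LOINC code).
conn.execute("""
    CREATE TABLE mart_glucose_by_facility AS
    SELECT facility, COUNT(*) AS n_results, AVG(value) AS mean_value
    FROM repository WHERE loinc_code = '2345-7'
    GROUP BY facility
""")
for row in conn.execute("SELECT * FROM mart_glucose_by_facility"):
    print(row)
```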
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. Keister; K. McBride
The Nuclear Waste Policy Act of 1982 (NWPA), as amended, assigned the Department of Energy (DOE) responsibility for developing and managing a Federal system for the disposal of spent nuclear fuel (SNF) and high-level radioactive waste (HLW). The Office of Civilian Radioactive Waste Management (OCRWM) is responsible for accepting, transporting, and disposing of SNF and HLW at the Yucca Mountain repository (if licensed) in a manner that protects public health, safety, and the environment; enhances national and energy security; and merits public confidence. OCRWM faces a near-term challenge: to develop and demonstrate a transportation system that will sustain safe and efficient shipments of SNF and HLW to a repository. To better inform and improve its current planning, OCRWM has extensively reviewed plans and other documents related to past high-visibility shipping campaigns of SNF and other radioactive materials within the United States. This report summarizes the results of this review and, where appropriate, lessons learned. The objective of this lessons learned study was to identify successful, best-in-class trends and commonalities from past shipping campaigns that OCRWM could consider when planning for the development and operation of a repository transportation system. Note: this paper is for analytical and discussion purposes only, and is not an endorsement of, or commitment by, OCRWM to follow any of the comments or trends. If OCRWM elects to make such commitments at a future time, they will be appropriately documented in formal programmatic policy statements, plans, and procedures. Reviewers examined an extensive study completed in 2003 by DOE's National Transportation Program (NTP), Office of Environmental Management (EM), as well as plans and documents related to SNF shipments since issuance of the NTP report. OCRWM examined specific planning, business, institutional, and operating practices that have been identified by DOE, its transportation contractors, and stakeholders as important issues that arise repeatedly. In addition, the review identifies lessons learned or activities/actions that were found not to be productive to the planning and conduct of SNF shipments (i.e., negative impacts). This paper is a 'looking back' summary of lessons learned across multiple transportation campaigns. Not all lessons learned are captured here, and participants in some of the campaigns have divergent opinions and perspectives about which lessons are most critical. This analysis is part of a larger OCRWM benchmarking effort to identify best practices to consider in future transportation of radioactive materials ('looking forward'). Initial findings from this comprehensive benchmarking analysis are expected to be available in late fall 2006.
Giardine, Belinda; Borg, Joseph; Higgs, Douglas R; Peterson, Kenneth R; Philipsen, Sjaak; Maglott, Donna; Singleton, Belinda K; Anstee, David J; Basak, A Nazli; Clark, Barnaby; Costa, Flavia C; Faustino, Paula; Fedosyuk, Halyna; Felice, Alex E; Francina, Alain; Galanello, Renzo; Gallivan, Monica V E; Georgitsi, Marianthi; Gibbons, Richard J; Giordano, Piero C; Harteveld, Cornelis L; Hoyer, James D; Jarvis, Martin; Joly, Philippe; Kanavakis, Emmanuel; Kollia, Panagoula; Menzel, Stephan; Miller, Webb; Moradkhani, Kamran; Old, John; Papachatzopoulou, Adamantia; Papadakis, Manoussos N; Papadopoulos, Petros; Pavlovic, Sonja; Perseu, Lucia; Radmilovic, Milena; Riemer, Cathy; Satta, Stefania; Schrijver, Iris; Stojiljkovic, Maja; Thein, Swee Lay; Traeger-Synodinos, Jan; Tully, Ray; Wada, Takahito; Waye, John S; Wiemann, Claudia; Zukic, Branka; Chui, David H K; Wajcman, Henri; Hardison, Ross C; Patrinos, George P
2011-03-20
We developed a series of interrelated locus-specific databases to store all published and unpublished genetic variation related to hemoglobinopathies and thalassemia and implemented microattribution to encourage submission of unpublished observations of genetic variation to these public repositories. A total of 1,941 unique genetic variants in 37 genes, encoding globins and other erythroid proteins, are currently documented in these databases, with reciprocal attribution of microcitations to data contributors. Our project provides the first example of implementing microattribution to incentivise submission of all known genetic variation in a defined system. It has demonstrably increased the reporting of human variants, leading to a comprehensive online resource for systematically describing human genetic variation in the globin genes and other genes contributing to hemoglobinopathies and thalassemias. The principles established here will serve as a model for other systems and for the analysis of other common and/or complex human genetic diseases.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-08
... States Citizenship and Immigration Services-012 Citizenship and Immigration Data Repository System of... and Immigration Data Repository System of Records system of records and this proposed rulemaking. In... Repository (CIDR). The Privacy Act embodies fair information principles in a statutory framework governing...
NASA Technical Reports Server (NTRS)
Werdell, P. Jeremy; Fargion, Giulietta S. (Editor); McClain, Charles R. (Editor); Bailey, Sean W.
2002-01-01
Satellite ocean color missions require an abundance of high-quality in situ measurements for bio-optical and atmospheric algorithm development and post-launch product validation and sensor calibration. To facilitate the assembly of a global data set, the NASA Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Project developed the SeaWiFS Bio-optical Archive and Storage System (SeaBASS), a local repository for in situ data regularly used in their scientific analyses. The system has since been expanded to contain data sets collected by the NASA Sensor Intercalibration and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project, as part of NASA Research Announcements NRA-96-MTPE-04 and NRA-99-OES-99. SeaBASS is a well-moderated and documented archive for bio-optical data with a simple, secure mechanism for locating and extracting data based on user inputs. Its holdings are available to the general public with the exception of the most recently collected data sets. Extensive quality assurance protocols, comprehensive data and system documentation, and the continuation of an archive and relational database management system (RDBMS) suitable for bio-optical data all contribute to the continued success of SeaBASS. This document provides an overview of the current operational SeaBASS system.
NASA Astrophysics Data System (ADS)
Yang, Liang-Chih; Lu, Hsi-Peng
This paper presents a longitudinal investigation of knowledge management system (KMS) development from an industrial perspective. Three snapshot surveys (2002, 2006, and 2010) of Taiwanese companies were conducted and compared to explore perceived understandings of, and requirements for, knowledge management system applications. From the surveys, it was found that the most useful applications were document management, knowledge search and retrieval, and knowledge repository and map. The emerging applications were expert management, document security, and knowledge automation such as auto-classification, auto-abstract, and auto-keyword generation. The most wanted services alongside a KMS were consulting services, success-story sharing, and modularization when deploying a knowledge management system in the enterprise. The trends and transformations of KM systems were also collected and analyzed. We suggest that a company should adopt a different knowledge management approach according to its main business function. Combining intellectual capital theories proposed by other researchers, we categorize knowledge management focus as staff-centric, system-centric, or customer-centric from an industrial perspective.
10 CFR 960.5-1 - System guidelines.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Preclosure Guidelines § 960.5-1 System guidelines. (a) Qualifying conditions—(1) Preclosure... radioactive materials to restricted and unrestricted areas during repository operation and closure shall meet... repository siting, construction, operation, closure, and decommissioning the public and the environment shall...
An Inter-Personal Information Sharing Model Based on Personalized Recommendations
NASA Astrophysics Data System (ADS)
Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji
In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations for documents and their contents. In other words, each user profile is represented as a matrix of credibility to the other users' evaluations on each domain of interests. We extended the content-based collaborative filtering method to distinguish other users to whom the documents should be recommended. We also applied a concept-based vector space model to represent the domain of interests instead of the previous method which represented them by a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve the information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because of different concept-spaces, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different view points, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-base. Then we estimated the user profiles according to personalized concept-bases and sets of documents which others evaluated. We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the result of the content-based collaborative filtering.
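The flavor of such credibility-weighted, content-based collaborative filtering can be sketched in a few lines of Python. The scoring rule, weights, and all names below are simplified assumptions for illustration, not the authors' algorithm; in particular, per-user concept spaces are only hinted at in a comment.

```python
import math

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity between two sparse concept vectors."""
    dot = sum(u[c] * v[c] for c in set(u) & set(v))
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

# Hypothetical data: documents as concept vectors. In the paper's model each
# user has a personalized concept-base, so the same document would be
# vectorized differently for each user; one shared space is used here.
docs = {"d1": {"retrieval": 0.9, "filtering": 0.4},
        "d2": {"cooking": 1.0}}
interest = {"alice": {"retrieval": 1.0, "filtering": 0.6}}
# Credibility alice assigns to identifiable acquaintances' evaluations.
credibility = {"alice": {"bob": 0.8, "carol": 0.2}}
# Acquaintances' explicit evaluations of documents (1.0 = important).
evaluations = {"bob": {"d1": 1.0}, "carol": {"d2": 1.0}}

def score(user: str, doc_id: str) -> float:
    """Blend content similarity with credibility-weighted peer evaluations."""
    content = cosine(interest[user], docs[doc_id])
    social = sum(credibility[user].get(peer, 0.0) * evals.get(doc_id, 0.0)
                 for peer, evals in evaluations.items())
    return 0.5 * content + 0.5 * social  # arbitrary illustrative weighting

for d in docs:
    print(d, round(score("alice", d), 3))
```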
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-08
... Repository System of Records AGENCY: Privacy Office, DHS. ACTION: Notice of Privacy Act system of records... Immigration Services 012 Citizenship and Immigration Data Repository System of Records.'' Citizenship and Immigration Data Repository is a mirror copy of USCIS's major immigrant and non-immigrant benefits databases...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... Citizenship and Immigration Services- 012 Citizenship and Immigration Data Repository System of Records AGENCY... Immigration Data Repository System of Records'' from certain provisions of the Privacy Act. Specifically, the... Immigration Services-012 Citizenship and Immigration Data Repository System of Records'' from one or more...
Mineralogic Model (MM3.0) Analysis Model Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. Lum
2002-02-12
The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0), with regard to data input, modeling methods, assumptions, uncertainties, limitations, validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M&O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M&O 2000). The purpose of this ICN is to record changes in the classification of input status resulting from resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) the Geologic Framework Model (GFM); (2) the Rock Properties Model (RPM); and (3) the Mineralogic Model (MM). The ISM merges the detailed stratigraphy and structural features of the site into a 3-D model that will be useful in primary downstream models and repository design. These downstream models include the hydrologic flow models and the radionuclide transport models. All the models and the repository design will, in turn, be incorporated into the Total System Performance Assessment (TSPA) of the potential nuclear waste repository block and vicinity to determine the suitability of Yucca Mountain as a host for a repository. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 1. The lateral boundaries of the ISM and its three component models are shown in Figure 2.
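Computationally, obtaining a "calculated mineral abundance at any position" amounts to interpolating a gridded 3-D property model at an arbitrary (x, y, z) point. The sketch below shows one standard way to do this with SciPy's trilinear interpolation; the grid, spacing, and values are invented placeholders, not MM3.0 data.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical regular grid of easting (x), northing (y), and elevation (z)
# coordinates, with a matching block of mineral-abundance values (fractions).
x = np.linspace(0.0, 1000.0, 11)
y = np.linspace(0.0, 2000.0, 21)
z = np.linspace(700.0, 1500.0, 9)
rng = np.random.default_rng(0)
abundance = rng.uniform(0.0, 0.3, size=(x.size, y.size, z.size))

# Trilinear interpolation returns an abundance estimate at any (x, y, z)
# position inside the modeled volume.
lookup = RegularGridInterpolator((x, y, z), abundance)
print(lookup([[430.0, 1210.0, 1020.0]]))  # abundance at one query point
```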
Review of CBRN Medical and Operational Terminologies in NATO CBRN Publications
2016-08-01
repository for NATO terminology and is used to search terms, abbreviations, and definitions found in NATO documents, communications, and activities of all... 21 is a compilation of terminology used in NATO chemical, biological, radiological, and nuclear defense activities, documentation, and communications... informational constructs, activities, and functionality necessary for: 1. Reporting of all chemical, biological or radiological incidents and nuclear...
Assessing repository technology. Where do we go from here?
NASA Technical Reports Server (NTRS)
Eichmann, David
1992-01-01
Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.
Wieland, L. Susan; Rutkow, Lainie; Vedula, S. Swaroop; Kaufmann, Christopher N.; Rosman, Lori M.; Twose, Claire; Mahendraratnam, Nirosha; Dickersin, Kay
2014-01-01
Objective To describe the sources of internal company documents used in public health and healthcare research. Methods We searched PubMed and Embase for articles using internal company documents to address a research question about a health-related topic. Our primary interest was where authors obtained internal company documents for their research. We also extracted information on type of company, type of research question, type of internal documents, and funding source. Results Our searches identified 9,305 citations of which 357 were eligible. Scanning of reference lists and consultation with colleagues identified 4 additional articles, resulting in 361 included articles. Most articles examined internal tobacco company documents (325/361; 90%). Articles using documents from pharmaceutical companies (20/361; 6%) were the next most common. Tobacco articles used documents from repositories; pharmaceutical documents were from a range of sources. Most included articles relied upon internal company documents obtained through litigation (350/361; 97%). The research questions posed were primarily about company strategies to promote or position the company and its products (326/361; 90%). Most articles (346/361; 96%) used information from miscellaneous documents such as memos or letters, or from unspecified types of documents. When explicit information about study funding was provided (290/361 articles), the most common source was the US-based National Cancer Institute. We developed an alternative and more sensitive search targeted at identifying additional research articles using internal pharmaceutical company documents, but the search retrieved an impractical number of citations for review. Conclusions Internal company documents provide an excellent source of information on health topics (e.g., corporate behavior, study data) exemplified by articles based on tobacco industry documents. Pharmaceutical and other industry documents appear to have been less used for research, indicating a need for funding for this type of research and well-indexed and curated repositories to provide researchers with ready access to the documents. PMID:24800999
Medical image informatics infrastructure design and applications.
Huang, H K; Wong, S T; Pietka, E
1997-01-01
Picture archiving and communication systems (PACS) is a system integration of multimodality images and health information systems designed for improving the operation of a radiology department. As it evolves, PACS becomes a hospital image document management system with a voluminous image and related data file repository. A medical image informatics infrastructure can be designed to take advantage of existing data, providing PACS with add-on value for health care service, research, and education. A medical image informatics infrastructure (MIII) consists of the following components: medical images and associated data (including PACS database), image processing, data/knowledge base management, visualization, graphic user interface, communication networking, and application oriented software. This paper describes these components and their logical connection, and illustrates some applications based on the concept of the MIII.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, Frank Vinton; Kelley, Richard E.
The DOE Spent Fuel and Waste Technology (SWFT) R&D Campaign is supporting research on crystalline rock, shale (argillite), and salt as potential host rocks for disposal of HLW and SNF in a mined geologic repository. The distribution of these three potential repository host rocks is limited to specific regions of the US and to different geologic and hydrologic environments (Perry et al., 2014), many of which may be technically suitable as sites for mined geologic disposal. This report documents a regional geologic evaluation of the Pierre Shale as an example of evaluating a potentially suitable shale for siting a geologic HLW repository. This report follows a similar report completed in 2016 on a regional evaluation of crystalline rock that focused on the Superior Province of the north-central US (Perry et al., 2016).
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-29
... System or the Supplemental Complex Repository for Examiners AGENCY: United States Patent and Trademark... been scanned into the Image File Wrapper system (IFW) or the Supplemental Complex Repository for..., the USPTO had fully deployed SCORE, a data repository system designed to augment IFW with the capture...
Downgrade of the Savannah River Site's FB-Line
DOE Office of Scientific and Technical Information (OSTI.GOV)
SADOWSKI, ED; YOURCHAK, RANDY; PRETZELLO, MARJI
2005-07-05
This paper discusses the Safeguards & Security (S&S) activities that resulted in the downgrade of the Savannah River Site's FB-Line (FBL) from a Category I Material Balance Area (MBA) in a Material Access Area (MAA) to a Category IV MBA in a Property Protection Area (PPA). The Safeguards activities included measurement of final product items, transfer of nuclear material to other Savannah River Site (SRS) facilities, discard of excess nuclear material items, and final measurements of holdup material. The Security activities included relocation and destruction of classified documents and repositories, decertification of a classified computer, access control changes, updates to planning documents, deactivation and removal of security systems, Human Reliability Program (HRP) removals, and information security training for personnel who will remain in the FBL PPA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roger Mayes; Sera White; Randy Lee
2005-04-01
Selenium is present in waste rock/overburden that is removed during phosphate mining in southeastern Idaho. Waste rock piles or rock used during reclamation can be a source of selenium (and other metals) to streams and vegetation. Instances (in 1996) of selenium toxicity in grazing sheep and horses caused public health and environmental concerns, leading to Idaho Department of Environmental Quality (DEQ) involvement. The Selenium Information System Project is a collaboration among the DEQ, the United States Forest Service (USFS), the Bureau of Land Management (BLM), the Idaho Mining Association (IMA), Idaho State University (ISU), and the Idaho National Laboratory (INL). The Selenium Information System is a centralized data repository for southeastern Idaho selenium data. The data repository combines information that was previously scattered across numerous agency, mining company, and consultants' databases and web sites. These data include selenium concentrations in soil, water, sediment, vegetation, and other environmental media, as well as comprehensive mine information. The Idaho DEQ spearheaded a selenium area-wide investigation through voluntary agreements with the mining companies and interagency participants. The Selenium Information System contains the results of that area-wide investigation and many other background documents. As studies are conducted and remedial action decisions are made, the resulting data and documentation will be stored within the information system. Potential users of the information system are agency officials, students, lawmakers, mining company personnel, teachers, researchers, and the general public. The system, available from a central website, consists of a database that contains the area-wide sampling information and an ESRI ArcIMS map server. The user can easily acquire information pertaining to the area-wide study as well as the final area-wide report. Future work on this project includes creating custom tools to increase the simplicity of the website and increasing the amount of information available from site-specific studies at 15 mines.
DOT National Transportation Integrated Search
2015-12-29
The LTPP program was initiated in 1987 to satisfy a wide range of pavement information needs. Over the years, the program has accumulated a vast repository of research quality data, extensive documentation, and related tools, which compose LTPPs c...
Migration of the Gaudi and LHCb software repositories from CVS to Subversion
NASA Astrophysics Data System (ADS)
Clemencic, M.; Degaudenzi, H.; LHCb Collaboration
2011-12-01
A common code repository is of primary importance in a distributed development environment such as large HEP experiments. CVS (Concurrent Versions System) has been used in the past years at CERN for the hosting of shared software repositories, among which were the repositories for the Gaudi Framework and the LHCb software projects. Many developers around the world produced alternative systems to share code and revisions among several developers, mainly to overcome the limitations in CVS, and CERN has recently started a new service for code hosting based on the version control system Subversion. The differences between CVS and Subversion and the way the code was organized in Gaudi and LHCb CVS repositories required careful study and planning of the migration. Special care was used to define the organization of the new Subversion repository. To avoid as much as possible disruption in the development cycle, the migration has been gradual with the help of tools developed explicitly to hide the differences between the two systems. The principles guiding the migration steps, the organization of the Subversion repository and the tools developed will be presented, as well as the problems encountered both from the librarian and the user points of view.
Subsurface Contamination Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Y. Yuan
This report, "Subsurface Contamination Control", has two objectives. The first is to provide a technical basis for recommending limiting radioactive contamination levels (LRCL) on the external surfaces of waste packages (WP) for acceptance into the subsurface repository. The second is to provide an evaluation of the magnitude of potential releases from a defective WP and the detectability of the released contents. The technical basis for deriving LRCL was established in "Retrieval Equipment and Strategy for WP on Pallet" (CRWMS M&O 2000g, 6.3.1). This report updates the derivation by incorporating the latest design information for the subsurface repository for site recommendation. The derived LRCL on the external surfaces of WPs therefore supersede those described in CRWMS M&O 2000g. The derived LRCL represent the average concentrations of contamination on the external surfaces of each WP that must not be exceeded before the WP is transported to the subsurface facility for emplacement. The evaluation of potential releases is necessary to control the potential contamination of the subsurface repository and to detect prematurely failed WPs. The detection of failed WPs is required in order to provide reasonable assurance that the integrity of each WP is intact prior to MGR closure. An emplaced WP may become breached due to manufacturing defects or an improper weld combined with failure to detect the defect, by corrosion, or by mechanical penetration due to accidents or rockfall. A breached WP may release its gaseous and volatile radionuclide content to the subsurface environment and contaminate the subsurface facility. The scope of this analysis is limited to radioactive contaminants resulting from breached WPs during the preclosure period of the subsurface repository. This report: (1) documents a method for deriving LRCL on the external surfaces of WPs for acceptance into the subsurface repository; (2) provides a table of derived LRCL for nuclides of radiological importance; (3) provides an as-low-as-is-reasonably-achievable (ALARA) evaluation of the derived LRCL by comparing potential onsite and offsite doses to documented ALARA requirements; (4) provides a method for estimating potential releases from a defective WP; (5) provides an evaluation of potential radioactive releases from a defective WP that may become airborne and result in contamination of the subsurface facility; and (6) provides a preliminary analysis of the detectability of a potential WP leak to support the design of an airborne release monitoring system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harwell, M. A.; Brandstetter, A.; Benson, G. L.
1982-06-01
As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has since been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and Waste Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes that would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would result from such a scenario even if it occurred a million years into the future. The way to preclude such an intrusion is continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the groundwater system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology and, thus, in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.
Rasooly, Rebekah S; Akolkar, Beena; Spain, Lisa M; Guill, Michael H; Del Vecchio, Corey T; Carroll, Leslie E
2015-04-07
The National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) Central Repositories, part of the National Institutes of Health (NIH), are an important resource available to researchers and the general public. The Central Repositories house samples, genetic data, phenotypic data, and study documentation from >100 NIDDK-funded clinical studies, in areas such as diabetes, digestive disease, and liver disease research. The Central Repositories also have an exceptionally rich collection of studies related to kidney disease, including the Modification of Diet in Renal Disease landmark study and recent data from the Chronic Renal Insufficiency Cohort and CKD in Children Cohort studies. The data are carefully curated and linked to the samples from the study. The NIDDK is working to make the materials and data accessible to researchers. The Data Repositories continue to improve flexible online searching tools that help researchers identify the samples or data of interest, and NIDDK has created several different paths to access the data and samples, including some funding initiatives. Over the past several years, the Central Repositories have seen steadily increasing interest and use of the stored materials. NIDDK plans to make more collections available and do more outreach and education about use of the datasets to the nephrology research community in the future to enhance the value of this resource. Copyright © 2015 by the American Society of Nephrology.
Clinical results of HIS, RIS, PACS integration using data integration CASE tools
NASA Astrophysics Data System (ADS)
Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.
1995-05-01
Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly and have delivered limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, straightforward changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middleware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.
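As a rough illustration of the client-mediator-server idea, the Python sketch below shows a mediator that fans a patient query out to heterogeneous HIS/RIS/PACS stubs and unifies the results behind one interface. All class and field names are hypothetical, not from the authors' system, and the stubs stand in for real network services.

```python
from dataclasses import dataclass

# Stub "servers", each with its own native record shape, standing in for
# heterogeneous HIS / RIS / PACS systems reached over different protocols.
class HIS:
    def demographics(self, patient_id):
        return {"name": "DOE^JANE", "dob": "1950-01-01"}

class RIS:
    def reports(self, patient_id):
        return [{"accession": "A123", "text": "Chest X-ray: normal."}]

class PACS:
    def image_refs(self, patient_id):
        return [{"series": "S1", "modality": "CR"}]

@dataclass
class UnifiedRecord:
    demographics: dict
    reports: list
    images: list

class Mediator:
    """Hides source heterogeneity behind one patient-centric query."""
    def __init__(self, his, ris, pacs):
        self.his, self.ris, self.pacs = his, ris, pacs

    def patient_record(self, patient_id: str) -> UnifiedRecord:
        return UnifiedRecord(
            demographics=self.his.demographics(patient_id),
            reports=self.ris.reports(patient_id),
            images=self.pacs.image_refs(patient_id),
        )

print(Mediator(HIS(), RIS(), PACS()).patient_record("MRN-0001"))
```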
A Python library for FAIRer access and deposition to the Metabolomics Workbench Data Repository.
Smelter, Andrey; Moseley, Hunter N B
2018-01-01
The Metabolomics Workbench Data Repository is a public repository of mass spectrometry and nuclear magnetic resonance data and metadata derived from a wide variety of metabolomics studies. The data and metadata for each study are deposited, stored, and accessed via files in the domain-specific 'mwTab' flat file format. In order to improve the accessibility, reusability, and interoperability of the data and metadata stored in 'mwTab' formatted files, we implemented a Python library and package. This Python package, named 'mwtab', is a parser for the domain-specific 'mwTab' flat file format that provides facilities for reading, accessing, and writing 'mwTab' formatted files. Furthermore, the package provides facilities to validate both the format and the required metadata elements of a given 'mwTab' formatted file. In order to develop the 'mwtab' package we used the official 'mwTab' format specification. We used Git version control along with the Python unit-testing framework and a continuous integration service to run those tests on multiple versions of Python. Package documentation was developed using the Sphinx documentation generator. The 'mwtab' package provides both Python programmatic library interfaces and command-line interfaces for reading, writing, and validating 'mwTab' formatted files. Data and associated metadata are stored within Python dictionary- and list-based data structures, enabling straightforward, 'pythonic' access and manipulation of data and metadata. The package also provides facilities to convert 'mwTab' files into a JSON formatted equivalent, enabling easy reuse of the data by all modern programming languages that implement JSON parsers. The 'mwtab' package implements its metadata validation functionality based on a pre-defined JSON schema that can be easily specialized for specific types of metabolomics studies. The library also provides a command-line interface for interconversion between 'mwTab' and JSONized formats in raw text and a variety of compressed binary file formats. The 'mwtab' package is an easy-to-use Python package that provides FAIRer utilization of the Metabolomics Workbench Data Repository. The source code is freely available on GitHub and via the Python Package Index. Documentation includes a 'User Guide', 'Tutorial', and 'API Reference'. The GitHub repository also provides 'mwtab' package unit-tests via a continuous integration service.
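A minimal usage sketch follows. It reflects the package's documented read-and-convert workflow as best I recall it, so treat the exact names (mwtab.read_files, the dict-like MWTabFile and its study_id/analysis_id attributes) as assumptions to check against the package's 'User Guide'; the file name is a placeholder.

```python
import json
import mwtab  # pip install mwtab

# read_files() is documented to yield MWTabFile objects from local paths
# (the path below is a placeholder, not a real deposited file).
for mwfile in mwtab.read_files("ST000001_AN000001.txt"):
    print(mwfile.study_id, mwfile.analysis_id)  # record identifiers
    print(list(mwfile.keys()))                  # top-level mwTab sections

    # MWTabFile behaves like an ordered dictionary, so the standard json
    # module suffices to produce a JSONized equivalent of the record.
    with open("ST000001_AN000001.json", "w") as outfile:
        json.dump(mwfile, outfile, indent=4)
```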
Geotechnical support and topical studies for nuclear waste geologic repositories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1989-01-01
The present report lists the technical reviews and comments made during fiscal year 1988 and summarizes the technical progress of the topical studies. In the area of technical assistance, there were numerous activities, detailed in the next section. These included 24 geotechnical support activities, among them reviews of 6 Study Plans (SP) and participation in 6 SP Review Workshops; review of one whole-document Site Characterization Plan (SCP) and participation in the Assembled Document SCP Review Workshops by 6 LBL reviewers; the hosting of a DOE program review; the rewriting of the project statement of work; 2 trips to technical and planning meetings; preparation of proposed work statements for two new topics for DOE; and 5 instances of technical assistance to DOE. These activities are described in a table in the following section, entitled "Geoscience Technical Support for Nuclear Waste Geologic Repositories."
Lessons from Natural Analog Studies for Geologic Disposal of High-Level Nuclear Waste (Invited)
NASA Astrophysics Data System (ADS)
Murphy, W. M.
2009-12-01
For over fifty years natural analog studies have provided lessons addressing scientific, technical, and social problems concerning geologic disposal of high-level nuclear waste. Idealized concepts for permanent disposal environments evolved from an understanding of the geological, geochemical and hydrological characteristics of analogous rocks including natural salt deposits (as advocated by the US National Academy of Sciences in 1957), ancient cratonic rocks (as investigated at Lac du Bonnet, Canada, Aspö, Sweden, and Vienne, France), and marine sedimentary rock formations (as studied at Mol, Belgium, and Bure, France). Additional multidisciplinary studies have been conducted at natural sites that bear characteristics analogous to potential repository systems, notably at natural uranium (and thorium) deposits including Poços de Caldas, Brazil, Alligator Rivers, Australia, Peña Blanca, Mexico, and Oklo, Gabon. Researchers of natural analogs for geologic disposal have addressed technical uncertainties regarding processes that have transpired over large time and space scales, which are generally inaccessible to laboratory studies. Principal questions for nuclear waste disposal include the geochemical stability and alteration rates of radionuclide bearing minerals and the mechanisms and rates of transport of radionuclides in groundwater. In their most direct applications, natural analogs studies have been devoted to testing specific models for repository performance and the experimental data that support those models. Parameters used in predictive performance assessment modeling have been compared to natural system data, including mineral solubilities, sorption coefficients, diffusion rates, and colloid transport properties. For example, the rate of uraninite oxidation and the natural paragenesis of uranium mineral alteration at Peña Blanca have been compared favorably to results of experimental studies of spent fuel alteration related to the proposed repository at Yucca Mountain, Nevada, USA. These results generally bracket repository conditions between natural and experimental systems providing confidence in the understanding of expected processes. Also, the conceptual bases and numerical techniques for modeling unsaturated zone contaminant transport over periods of thousands of years at Yucca Mountain were tested by modeling the observable record of metal transport from archaeological artifacts buried in Holocene tuff at Akrotiri, Greece. Geologically episodic mineral alteration and contaminant transport have been documented using radioisotope data in numerous analog systems providing insights for the interpretation and validity of predictive models for long term repository performance. The applicability and value of natural analog studies to understanding geologic disposal systems is a persistent question. As proposed disposal sites become increasingly well defined by site characterization and engineering design, the strengths and weaknesses of analogies can be assessed. Confidence in predictive models for complex geologic and engineered phenomena can be enhanced through multiple lines of investigation including studies of natural analog systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedrichs, D.R.
1980-01-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. The CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System, a storage and retrieval system for model input and output data, including graphical interpretation and display, is described. This is the first of four volumes of the description of the CIRMIS Data System.
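The chained structure described above (hydrologic outputs feeding transport codes, whose concentration histories feed dose models after dilution) can be sketched in a few lines. All function names, units, and numbers below are illustrative assumptions, not AEGIS or CIRMIS code.

```python
# Minimal sketch of the AEGIS-style model chain described above.
# Every name, unit, and number is an illustrative assumption.

def hydrologic_model(scenario):
    """Return groundwater flow direction, rate, and velocity for a release scenario."""
    return {"direction_deg": 90.0, "rate_m3_per_yr": 1.0e4, "velocity_m_per_yr": 3.0}

def transport_model(flow, source_ci):
    """Return radionuclide concentration in groundwater vs. time (schematic)."""
    years = range(0, 10001, 1000)
    return [(t, source_ci * min(1.0, t / 5000.0) / flow["rate_m3_per_yr"]) for t in years]

def dilute(concentrations, river_flow_m3_per_yr):
    """Dilute groundwater concentrations in the receiving surface-water body."""
    return [(t, c / river_flow_m3_per_yr) for t, c in concentrations]

def dose_model(source_terms):
    """Convert diluted source terms into a schematic individual dose history."""
    dose_factor = 1.0e6  # hypothetical concentration-to-dose conversion
    return [(t, c * dose_factor) for t, c in source_terms]

flow = hydrologic_model("loss-of-containment")
conc = transport_model(flow, source_ci=100.0)
dose = dose_model(dilute(conc, river_flow_m3_per_yr=1.0e9))
print(dose[-1])  # (time, dose) at the end of the simulated period
```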
NSSDC and WDC-A-R/S document availability and distribution services
NASA Technical Reports Server (NTRS)
1980-01-01
Documents available from the National Space Science Data Center (NSSDC) and the World Data Center A for Rockets Satellites are described. The availability, costs, ordering procedures for documents presently available, and the procedures for obtaining future documents are given. NSSDC, established by NASA to further the widest practicable use of reduced data obtained from space science investigations and to provide investigators with an active repository for such data, is responsible for the active collection, organization, storage, announcement, retrieval, dissemination, and exchange of data received from satellite experiments. Information on sounding rocket investigations is also collected.
2012-04-01
Third Edition. Printed with permission from the International Society for Biological and Environmental Repositories (ISBER) © 2011 ISBER All Rights Reserved. Editor-in-Chief: Lori D. Campbell, PhD. Associate Editors: Fay Betsou, PhD; Debra Leiolani Garcia, MPA; Judith G. Giri, PhD; Karen E. Pitt, PhD; Rebecca S. Pugh, MS; Katherine C. Sexton, MBA; Amy P.N. Skubitz, PhD; Stella B. Somiari, PhD. Individual Contributors to the Third Edition: Jonas Astrin, Susan Baker, Thomas J. Barr, Erica Benson, Mark Cada, Lori Campbell, Antonio Hugo Jose Froes Marques Campos, David Carpentieri, Omoshile Clement, Domenico Coppola, Yvonne De Souza, Paul Fearn, Kelly Feil, Debra Garcia, Judith Giri, William E. Grizzle, Kathleen Groover, Keith Harding, Edward Kaercher, Joseph Kessler, Sarah Loud, Hannah Maynor, Kevin McCluskey, Kevin Meagher, Cheryl Michels, Lisa Miranda, Judy Muller-Cohn, Rolf Muller, James O'Sullivan, Karen Pitt, Rebecca Pugh, Rivka Ravid, Katherine Sexton, Ricardo Luis A. Silva, Frank Simione, Amy Skubitz, Stella Somiari, Frans van der Horst, Gavin Welch, Andy Zaayenga. 2012 Best Practices for Repositories: Collection, Storage, Retrieval and Distribution of Biological Materials for Research. INTERNATIONAL SOCIETY FOR BIOLOGICAL AND ENVIRONMENTAL REPOSITORIES (ISBER).
INTRODUCTION: The availability of high quality biological and environmental specimens for research purposes requires the development of standardized methods for collection, long-term storage, retrieval and distribution of specimens that will enable their future use. Sharing successful strategies for accomplishing this goal is one of the driving forces for the International Society for Biological and Environmental Repositories (ISBER). For more information about ISBER see www.isber.org. ISBER's Best Practices for Repositories (Best Practices) reflect the collective experience of its members and have received broad input from other repository professionals. Throughout this document effective practices are presented for the management of specimen collections and repositories. The term "Best Practice" is used in cases where a level of operation is indicated that is above the basic recommended practice or more specifically designates the most effective practice. It is understood that repositories in certain locations or with particular financial constraints may not be able to adhere to each of the items designated as "Best Practices". Repositories fitting into either of these categories will need to decide how they might best adhere to these recommendations within their particular circumstances. While adherence to ISBER Best Practices is strictly voluntary, it is important to note that some aspects of specimen management are governed by national/federal, regional and local regulations. The reader should refer directly to the regulations for their national/federal, regional and local requirements, as appropriate. ISBER has strived to include terminology appropriate to the various specimen types covered under these practices, but here too the reader should take steps to ensure the appropriateness of the recommendations to their particular repository type prior to the implementation of any new approaches. Important terms within the document are italicized when first used in a section and defined in the glossary. The ISBER Best Practices are periodically reviewed and revised to reflect advances in research and technology.
The third edition of the Best Practices builds on the foundation established in the first and second editions which were published in 2005 and 2008, respectively.
NASA Astrophysics Data System (ADS)
Prodanovic, M.; Esteva, M.; Ketcham, R. A.; Hanlon, M.; Pettengill, M.; Ranganath, A.; Venkatesh, A.
2016-12-01
Due to advances in imaging modalities such as X-ray microtomography and scanning electron microscopy, 2D and 3D imaged datasets of rock microstructure on nanometer-to-centimeter length scales allow investigation of nonlinear flow and mechanical phenomena using numerical approaches. This in turn produces various upscaled parameters required by subsurface flow and deformation simulators. However, a single research group typically specializes in one imaging modality and/or related modeling on a single length scale, and the lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (http://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geoscience and engineering researchers not necessarily trained in computer science or data analysis. Our objective is to enable scientific inquiry and engineering decisions founded on a data-driven basis. We show how data loaded into the portal can be documented, referenced in publications via digital object identifiers, visualized, and linked to other repositories. We then show preliminary results on integrating a remote parallel visualization and flow simulation workflow with the pore structures currently stored in the repository. We finally discuss the issues of collecting correct metadata, data discoverability and repository sustainability. This is the first repository for this particular kind of data, but it is part of the wider ecosystem of geoscience data and model cyber-infrastructure called EarthCube (http://earthcube.org/), sponsored by the National Science Foundation. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7-maintained high-performance computing infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.
PGP repository: a plant phenomics and genomics data publication infrastructure
Arend, Daniel; Junker, Astrid; Scholz, Uwe; Schüler, Danuta; Wylie, Juliane; Lange, Matthias
2016-01-01
Plant genomics and phenomics represent the most promising tools for accelerating yield gains and overcoming emerging crop productivity bottlenecks. However, accessing this wealth of plant diversity requires the characterization of this material using state-of-the-art genomic, phenomic and molecular technologies and the release of subsequent research data via a long-term stable, open-access portal. Although several international consortia and public resource centres offer services for plant research data management, valuable digital assets remain unpublished and thus inaccessible to the scientific community. Recently, the Leibniz Institute of Plant Genetics and Crop Plant Research and the German Plant Phenotyping Network jointly initiated the Plant Genomics and Phenomics Research Data Repository (PGP) as an infrastructure to comprehensively publish plant research data. This covers in particular cross-domain datasets that are not being published in central repositories because of their volume or unsupported data scope, such as image collections from plant phenotyping and microscopy, unfinished genomes, genotyping data, visualizations of morphological plant models, data from mass spectrometry, as well as software and documents. The repository is hosted at the Leibniz Institute of Plant Genetics and Crop Plant Research, using e!DAL as software infrastructure and a hierarchical storage management system as the data archival backend. A newly developed data submission tool was made available to the consortium; it features a high level of automation to lower the barriers to data publication. After an internal review process, data are published with citable digital object identifiers, and a core set of technical metadata is registered at DataCite. The e!DAL-embedded web frontend generates a landing page for each dataset and supports interactive exploration. PGP is registered as a research data repository at BioSharing.org, re3data.org and OpenAIRE as a valid EU Horizon 2020 open data archive. These features, together with the programmatic interface and the support of standard metadata formats, enable PGP to fulfil the FAIR data principles: findable, accessible, interoperable, reusable. Database URL: http://edal.ipk-gatersleben.de/repos/pgp/ PMID:27087305
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. Gaffiney
2004-11-23
This report presents and documents the model components and analyses that represent potential processes associated with propagation of a magma-filled crack (dike) migrating upward toward the surface, intersection of the dike with repository drifts, flow of magma in the drifts, and post-magma emplacement effects on repository performance. The processes that describe upward migration of a dike and magma flow down the drift are referred to as the dike intrusion submodel. The post-magma emplacement processes are referred to as the post-intrusion submodel. Collectively, these submodels are referred to as a conceptual model for dike/drift interaction. The model components and analyses of the dike/drift interaction conceptual model provide the technical basis for assessing the potential impacts of an igneous intrusion on repository performance, including those features, events, and processes (FEPs) related to dike/drift interaction (Section 6.1).
Automatic indexing and retrieval of encounter-specific evidence for point-of-care support.
O'Sullivan, Dympna M; Wilk, Szymon A; Michalowski, Wojtek J; Farion, Ken J
2010-08-01
Evidence-based medicine relies on repositories of empirical research evidence that can be used to support clinical decision making for improved patient care. However, retrieving evidence from such repositories at local sites presents many challenges. This paper describes a methodological framework for automatically indexing and retrieving empirical research evidence in the form of the systematic reviews and associated studies from The Cochrane Library, where retrieved documents are specific to a patient-physician encounter and thus can be used to support evidence-based decision making at the point of care. Such an encounter is defined by three pertinent groups of concepts - diagnosis, treatment, and patient, and the framework relies on these three groups to steer indexing and retrieval of reviews and associated studies. An evaluation of the indexing and retrieval components of the proposed framework was performed using documents relevant for the pediatric asthma domain. Precision and recall values for automatic indexing of systematic reviews and associated studies were 0.93 and 0.87, and 0.81 and 0.56, respectively. Moreover, precision and recall for the retrieval of relevant systematic reviews and associated studies were 0.89 and 0.81, and 0.92 and 0.89, respectively. With minor modifications, the proposed methodological framework can be customized for other evidence repositories. Copyright 2010 Elsevier Inc. All rights reserved.
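The precision and recall figures quoted above follow the standard set-based definitions. A minimal sketch of those two metrics; the document identifiers are invented for illustration:

```python
# Precision and recall as used in the evaluation above: a minimal sketch.
# Document identifiers are invented for illustration.

def precision_recall(retrieved: set, relevant: set) -> tuple[float, float]:
    hits = retrieved & relevant  # correctly retrieved documents
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

retrieved = {"rev-001", "rev-002", "rev-007"}  # documents the system returned
relevant = {"rev-001", "rev-002", "rev-003"}   # documents judged relevant to the encounter
p, r = precision_recall(retrieved, relevant)
print(f"precision={p:.2f} recall={r:.2f}")     # precision=0.67 recall=0.67
```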
A Repository of Codes of Ethics and Technical Standards in Health Informatics
Zaïane, Osmar R.
2014-01-01
We present a searchable repository of codes of ethics and standards in health informatics. It is built using state-of-the-art search algorithms and technologies. The repository will be potentially beneficial for public health practitioners, researchers, and software developers in finding and comparing ethics topics of interest. Public health clinics, clinicians, and researchers can use the repository platform as a one-stop reference for various ethics codes and standards. In addition, the repository interface is built for easy navigation, fast search, and side-by-side comparative reading of documents. Our selection criteria for codes and standards are twofold: first, to respect intellectual property rights, we index only codes and standards freely available on the internet; second, major international, regional, and national health informatics bodies across the globe were surveyed with the aim of understanding the landscape in this domain. We also look at prevalent technical standards in health informatics from major bodies such as the International Standards Organization (ISO) and the U.S. Food and Drug Administration (FDA). Our repository contains codes of ethics from the International Medical Informatics Association (IMIA), the iHealth Coalition (iHC), the American Health Information Management Association (AHIMA), the Australasian College of Health Informatics (ACHI), the British Computer Society (BCS), and the UK Council for Health Informatics Professions (UKCHIP), with room for adding more in the future. Our major contribution is enhancing the findability of codes and standards related to health informatics ethics by compiling them and providing unified access through the health informatics ethics repository. PMID:25422725
LTPP InfoPave Release 2017: What's New
DOT National Transportation Integrated Search
2017-01-01
The LTPP program was initiated in 1987 to satisfy a wide range of pavement information needs. Over the years, the program has accumulated a vast repository of research quality data, extensive documentation, and related tools, which compose LTPPs c...
Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.
2011-01-01
The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
The new on-line Czech Food Composition Database.
Machackova, Marie; Holasova, Marie; Maskova, Eva
2013-10-01
The new on-line Czech Food Composition Database (FCDB) was launched at http://www.czfcdb.cz in December 2010 as the main freely available channel for dissemination of Czech food composition data. The application is based on a compiled FCDB documented according to the EuroFIR standardised procedure for full value documentation and indexing of foods by the LanguaL™ Thesaurus. A content management system was implemented for administration of the website and for performing data export (comma-separated values or EuroFIR XML transport package formats) by a compiler. References are provided for each published value, with links to freely accessible on-line sources of data (e.g. full texts, the EuroFIR Document Repository, on-line national FCDBs). LanguaL™ codes are displayed within each food record as searchable keywords of the database. A photo (or a photo gallery) serves as a visual descriptor of each food item. The application is searchable by foods, components, food groups, alphabet and a multi-field advanced search. Copyright © 2013 Elsevier Ltd. All rights reserved.
Final Inventory Work-Off Plan for ORNL transuranic wastes (1986 version)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickerson, L.S.
1988-05-01
The Final Inventory Work-Off Plan (IWOP) for ORNL Transuranic Wastes addresses ORNL's strategy for retrieval, certification, and shipment of its stored and newly generated contact-handled (CH) and remote-handled (RH) transuranic (TRU) wastes to the Waste Isolation Pilot Plant (WIPP), the proposed geologic repository near Carlsbad, New Mexico. This document considers certification compliance with the WIPP waste acceptance criteria (WAC) and is consistent with the US Department of Energy's Long-Range Master Plan for Defense Transuranic Waste Management. This document characterizes Oak Ridge National Laboratory's (ORNL's) TRU waste by type and estimates the number of shipments required to dispose of it; describes the methods, facilities, and systems required for its certification and shipment; presents work-off strategies and schedules for retrieval, certification, and transportation; discusses the resource needs and additions that will be required for the effort and forecasts costs for the long-term TRU waste management program; and lists public documentation required to support certification facilities and strategies. 22 refs., 6 figs., 10 tabs.
NASA Astrophysics Data System (ADS)
Gaich, A.; Deák, F.; Pötsch, M.
2012-12-01
The Hungarian National Radioactive Waste Repository is being built near the village of Bátaapáti. The program for the new disposal facility for low- and intermediate-level wastes (L/ILW) is conducted by PURAM (Public Limited Company for Radioactive Waste Management). The Bátaapáti underground research program began in February 2005 with the excavation of two inclined exploratory tunnels. These tunnels, with 30 m between their axes, 10% inclination and 1.7 km length, have reached the 0 m Baltic sea level in the Mórágy Granite Formation. The safety of a nuclear repository is influenced mainly by ground behaviour and fracturing, so mapping of the geological features is of great importance. Because of the less stable ground, the cavern walls were shotcreted after every tunnelling advance. The site geologists were required to map the tunnel after every drill-and-blast cycle. The time interval was short and the documentation work was unrepeatable once the walls were shotcrete-supported, so it was very important to use a modern, precise system to create photorealistic 3D models of the rock surfaces on the excavated tunnel walls. We chose the photogrammetric method because it offers adequate resolution and quality for photo-combined 3D models. Initially we used the JointMetriX3D (JMX) system, and subsequently ShapeMetriX3D (SMX) in the repository chamber excavation phase. Geological mapping is performed from the acquired 3D images, as the system allows direct measurement of geometric information on visible discontinuities, such as dip and dip direction. Descriptive rock mass parameters such as spacing, area and roughness are instantly available. In this article we continue that research using a JMX model of a tunnel face from the "TSZV" access tunnel and an SMX model of a tunnel face from the "DEK" chamber. Our studies were carried out by field engineering geologists to further investigate the reproducibility of the photorealistic 3D models in both the JMX and SMX cases. Standard geotechnical rock mass classifications (Q, RMR and GSI) were applied on the basis of the 3D models, without field experience of the given tunnel faces. All documentation was analysed with statistical methods, considering the circumstances of scanning and picturing. The orientations of the main characteristic discontinuities were defined by each geologist, although some differences occurred. These discrepancies did not appear in the results of the geotechnical evaluation. In several cases the information provided by the 3D modelling systems proved very useful in different phases of the excavation works. This information was applied in geoscience research, for example in surface roughness determination, fracture system modelling of the host rock, and locating geological or technical objects behind the shotcrete layer. Besides the above-mentioned advantages, we emphasize that the JMX and SMX systems provide contact-free acquisition and assessment of rock and terrain surfaces via metric, high-resolution 3D images within a very short time period.
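Dip and dip direction, the discontinuity measurements mentioned above, follow directly from the orientation of a plane fitted to the 3D surface model. A minimal sketch of that geometry, assuming an east-north-up coordinate frame; this is not JMX/SMX vendor code:

```python
import math

# Dip and dip direction from a fitted plane's normal vector: a minimal sketch
# of the geometry behind measurements like those reported by JMX/SMX. Assumes
# an east-north-up (E, N, U) frame; this is not vendor code.

def dip_and_dip_direction(normal):
    e, n, u = normal
    length = math.sqrt(e * e + n * n + u * u)
    e, n, u = e / length, n / length, u / length
    if u < 0:  # force the normal to point upward
        e, n, u = -e, -n, -u
    dip = math.degrees(math.acos(u))                        # 0 = horizontal plane
    dip_direction = math.degrees(math.atan2(e, n)) % 360.0  # azimuth from north
    return dip, dip_direction

# A plane dipping 45 degrees toward the east (azimuth 090):
normal = (math.sin(math.radians(45)), 0.0, math.cos(math.radians(45)))
print(dip_and_dip_direction(normal))  # (45.0, 90.0)
```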
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Keating; W. Statham
2004-02-12
The purpose of this model report is to provide documentation of the conceptual and mathematical model (ASHPLUME) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. The ASHPLUME conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The ASHPLUME mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report will improve and clarify the previous documentation of the ASHPLUME mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model.
ScienceOrganizer System and Interface Summary
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Norvig, Peter (Technical Monitor)
2001-01-01
ScienceOrganizer is a specialized knowledge management tool designed to enhance the information storage, organization, and access capabilities of distributed NASA science teams. Users access ScienceOrganizer through an intuitive Web-based interface that enables them to upload, download, and organize project information - including data, documents, images, and scientific records associated with laboratory and field experiments. Information in ScienceOrganizer is "threaded", or interlinked, to enable users to locate, track, and organize interrelated pieces of scientific data. Linkages capture important semantic relationships among information resources in the repository, and these assist users in navigating through the information related to their projects.
Caliver: An R package for CALIbration and VERification of forest fire gridded model outputs.
Vitolo, Claudia; Di Giuseppe, Francesca; D'Andrea, Mirko
2018-01-01
The name caliver stands for CALIbration and VERification of forest fire gridded model outputs. This is a package developed for the R programming language and available under an APACHE-2 license from a public repository. In this paper we describe the functionalities of the package and give examples using publicly available datasets. Fire danger model outputs are taken from the modeling components of the European Forest Fire Information System (EFFIS) and observed burned areas from the Global Fire Emission Database (GFED). Complete documentation, including a vignette, is also available within the package.
Flight Data Entry, Descent, and Landing (EDL) Repository
NASA Technical Reports Server (NTRS)
Martinez, Elmain M.; Winterhalter, Daniel
2012-01-01
Dr. Daniel Winterhalter, NASA Engineering and Safety Center Chief Engineer at the Jet Propulsion Laboratory, requested the NASA Engineering and Safety Center sponsor a 3-year effort to collect entry, descent, and landing material and to establish a NASA-wide archive to serve the material. The principal focus of this task was to identify entry, descent, and landing repository material that was at risk of being permanently lost due to damage, decay, and undocumented storage. To provide NASA-wide access to this material, a web-based digital archive was created. This document contains the outcome of the effort.
Description of a Website Resource for Turbulence Modeling Verification and Validation
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.
2010-01-01
The activities of the Turbulence Model Benchmarking Working Group, a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee, are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.
Preservation of Earth Science Data History with Digital Content Repository Technology
NASA Astrophysics Data System (ADS)
Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.
2011-12-01
An increasing need for derived and on-demand data products in Earth science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. In particular, this increasing need presents additional challenges in managing data processing history information and delivering it to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover dataset as one of the input driver datasets for participating terrestrial biospheric models. The global 1 km resolution SYNMAP land cover data was created by harmonizing three remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated into half- and quarter-degree resolution. It was then enhanced with more detailed grassland and cropland types. Currently, there is no effective mechanism to convey this data processing information to the different modeling teams so they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within the digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover processing details involved in each derivation step. Coupled with the Drupal web content management system, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in the FGDC standard, and the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation. Lineage information is a major factor in making digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository, but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in the Earth science research domain, with selected data archived at the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.
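The SYNMAP chain described above is a concrete case of derivation links between digital objects. A minimal sketch of tracing a derived product back to its sources; the identifiers and the graph are invented to mirror the example, not the DAAC's implementation:

```python
# Minimal sketch of derivation links between digital objects, as in the
# repository described above. Identifiers are invented to mirror the SYNMAP
# example; this is not the DAAC's implementation.

derived_from = {
    "synmap-mstmip-0.25deg": ["synmap-0.5deg"],
    "synmap-0.5deg": ["synmap-1km"],
    "synmap-1km": ["glcc", "glc2000", "modis-landcover"],
}

def trace_lineage(product, graph, depth=0):
    """Walk derivation links from a product back to its original sources."""
    print("  " * depth + product)
    for source in graph.get(product, []):
        trace_lineage(source, graph, depth + 1)

trace_lineage("synmap-mstmip-0.25deg", derived_from)
```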
Initial Radionuclide Inventories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, H
The purpose of this analysis is to provide an initial radionuclide inventory (in grams per waste package) and associated uncertainty distributions for use in the Total System Performance Assessment for the License Application (TSPA-LA) in support of the license application for the repository at Yucca Mountain, Nevada. This document is intended for use in postclosure analysis only. Bounding waste stream information and data were collected that capture probable limits. For commercially generated waste, this analysis considers alternative waste stream projections to bound the characteristics of wastes likely to be encountered, using arrival scenarios that potentially impact the commercial spent nuclear fuel (CSNF) waste stream. For TSPA-LA, this radionuclide inventory analysis considers U.S. Department of Energy (DOE) high-level radioactive waste (DHLW) glass and two types of spent nuclear fuel (SNF): CSNF and DOE-owned (DSNF). These wastes are placed in two groups of waste packages: the CSNF waste package and the codisposal waste package (CDSP), which are designated to contain DHLW glass and DSNF, or DHLW glass only. The radionuclide inventory for naval SNF is provided separately in the classified "Naval Nuclear Propulsion Program Technical Support Document" for the License Application. As noted previously, the radionuclide inventory data presented here is intended only for TSPA-LA postclosure calculations. It is not applicable to preclosure safety calculations. Safe storage, transportation, and ultimate disposal of these wastes require safety analyses to support the design and licensing of repository equipment and facilities. These analyses will require radionuclide inventories to represent the radioactive source term that must be accommodated during handling, storage and disposition of these wastes. This analysis uses the best available information to identify the radionuclide inventory that is expected at the last year of last emplacement, currently identified as 2030 and 2033, depending on the type of waste. TSPA-LA uses the results of this analysis to decay the inventory to the year of repository closure, projected for the year 2060.
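Decaying an emplacement-time inventory to the projected year of repository closure uses the standard decay law N(t) = N0 exp(-t ln 2 / t_half). A minimal sketch; the per-package gram values below are invented, while the half-lives are standard reference figures:

```python
import math

# Decaying an emplacement-time inventory to the year of repository closure:
# a minimal sketch of N(t) = N0 * exp(-ln2 * t / t_half). The per-package
# gram values are invented; half-lives are standard reference figures.

HALF_LIFE_YEARS = {"Cs-137": 30.1, "Sr-90": 28.8, "Tc-99": 2.11e5}

def decay(grams: float, half_life: float, years: float) -> float:
    return grams * math.exp(-math.log(2.0) * years / half_life)

inventory_2033 = {"Cs-137": 50.0, "Sr-90": 40.0, "Tc-99": 300.0}  # g per package
years = 2060 - 2033
inventory_2060 = {n: decay(g, HALF_LIFE_YEARS[n], years)
                  for n, g in inventory_2033.items()}
print(inventory_2060)  # Cs-137 and Sr-90 fall to roughly half; Tc-99 barely changes
```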
Use of Narrative Nursing Records for Nursing Research
Park, Hyeoun-Ae; Cho, InSook; Ahn, Hee-Jung
2012-01-01
To explore the usefulness of narrative nursing records documented using a standardized terminology-based electronic nursing records system, we conducted three different studies on (1) the gaps between the required nursing care time and the actual nursing care time, (2) the practice variations in pressure ulcer care, and (3) the surveillance of adverse drug events. The narrative nursing notes, documented at the point of care using standardized nursing statements, were extracted from the clinical data repository at a teaching hospital in Korea and analyzed. Our findings were: the pediatric and geriatric units showed relatively high staffing needs; the overall incidence rate of pressure ulcers among intensive-care patients was 15.0%, and the nursing interventions provided for pressure-ulcer care varied depending on the nursing unit; and at least one adverse drug event was noted in 53.0% of the cancer patients who were treated with cisplatin. A standardized nursing terminology-based electronic nursing record system allowed us to explore answers to a variety of research questions. PMID:24199111
IEDA Integrated Services: Improving the User Experience for Interdisciplinary Earth Science Research
NASA Astrophysics Data System (ADS)
Carter-Orlando, M.; Ferrini, V. L.; Lehnert, K.; Carbotte, S. M.; Richard, S. M.; Morton, J. J.; Shane, N.; Ash, J.; Song, L.
2017-12-01
The Interdisciplinary Earth Data Alliance (IEDA) is an NSF-funded data facility that provides data tools and services to support the Ocean, Earth, and Polar Sciences. IEDA systems, developed and maintained primarily by the IEDA partners EarthChem and the Marine Geoscience Data System (MGDS), serve as primary community data collections for global geochemistry and marine geoscience research and support the preservation, discovery, retrieval, and analysis of a wide range of observational field and analytical data types. Individual IEDA systems originated independently and differ from one another in purpose and scope. Some IEDA systems are data repositories (EarthChem Library, Marine Geo-Digital Library), while others are actively maintained data syntheses (GMRT, PetDB, EarthChem Portal, Geochron). Still others are data visualization and analysis tools (GeoMapApp). Although the diversity of IEDA's data types, tools, and services is a major strength and of high value to investigators, it can be a source of confusion. And while much of the data managed in IEDA systems is appropriate for interdisciplinary research, investigators may be unfamiliar with the user interfaces and services of each system, especially if it is not in their primary discipline. This presentation will highlight new ways in which IEDA helps researchers to more efficiently navigate data submission and data access. It will also discuss how IEDA promotes discovery and access within and across its systems, to serve interdisciplinary science while remaining aware of and responsive to the more specific needs of its disciplinary user communities. The IEDA Data Submission Hub (DaSH), currently under development, aims to streamline the submission process for both the science data contributor and the repository data curator. Instead of users deciding a priori which system they should contribute their data to, the DaSH helps route them to the appropriate repository based primarily on data type, and helps efficiently gather the necessary documentation for data accession. Similarly, for those looking for data, the IEDA Data Browser provides cross-system browse and discovery of data in a map interface presented in both Mercator and South Polar projections.
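The routing idea behind the DaSH can be sketched as a lookup from data type to destination repository. The mapping below is invented for illustration and is not IEDA's actual routing table:

```python
# Minimal sketch of routing a submission by data type, in the spirit of the
# DaSH described above. The mapping is invented, not IEDA's routing table.

ROUTING = {
    "geochemistry-analysis": "EarthChem Library",
    "bathymetry-grid": "Marine Geo-Digital Library",
    "geochronology": "Geochron",
}

def route_submission(data_type: str) -> str:
    # Unrecognized types fall through to a human curator.
    return ROUTING.get(data_type, "needs curator review")

print(route_submission("bathymetry-grid"))     # Marine Geo-Digital Library
print(route_submission("seismic-reflection"))  # needs curator review
```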
Taking advantage of continuity of care documents to populate a research repository.
Klann, Jeffrey G; Mendis, Michael; Phillips, Lori C; Goodson, Alyssa P; Rocha, Beatriz H; Goldberg, Howard S; Wattanasin, Nich; Murphy, Shawn N
2015-03-01
Clinical data warehouses have accelerated clinical research, but even with available open source tools, there is a high barrier to entry due to the complexity of normalizing and importing data. The Office of the National Coordinator for Health Information Technology's Meaningful Use Incentive Program now requires that electronic health record systems produce standardized consolidated clinical document architecture (C-CDA) documents. Here, we leverage this data source to create a low-volume, standards-based import pipeline for the Informatics for Integrating Biology and the Bedside (i2b2) clinical research platform. We validate this approach by creating a small repository at Partners Healthcare automatically from C-CDA documents. We designed an i2b2 extension to import C-CDAs into i2b2. It is extensible to other sites with variances in C-CDA format without requiring custom code. We also designed new ontology structures for querying the imported data. We implemented our methodology at Partners Healthcare, where we developed an adapter to retrieve C-CDAs from Enterprise Services. Our current implementation supports demographics, encounters, problems, and medications. We imported approximately 17 000 clinical observations on 145 patients into i2b2 in about 24 min. We were able to perform i2b2 cohort-finding queries and view patient information through SMART apps on the imported data. This low-volume import approach can serve small practices with local access to C-CDAs and will allow patient registries to import patient-supplied C-CDAs. These components will soon be available open source on the i2b2 wiki. Our approach will lower barriers to entry in implementing i2b2 where informatics expertise or data access are limited. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
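C-CDA documents are XML in the HL7 v3 namespace, with each section identified by a LOINC code (for example, 11450-4 for the problem list). A minimal sketch of listing a document's coded sections with the Python standard library; this is not the i2b2 extension itself, and the file name is a placeholder:

```python
import xml.etree.ElementTree as ET

# Minimal sketch of reading coded sections from a C-CDA document, in the
# spirit of the import pipeline above. Not the i2b2 extension itself; the
# file name is a placeholder. C-CDA uses the HL7 v3 namespace, and each
# section carries a LOINC code (e.g., 11450-4 = problem list).

NS = {"hl7": "urn:hl7-org:v3"}

def list_sections(path):
    root = ET.parse(path).getroot()
    for section in root.findall(".//hl7:section", NS):
        code = section.find("hl7:code", NS)
        title = section.find("hl7:title", NS)
        yield (code.get("code") if code is not None else None,
               title.text if title is not None else None)

for loinc, title in list_sections("patient_ccda.xml"):
    print(loinc, title)
```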
SPECIATE Version 4.4 Database Development Documentation
SPECIATE is the U.S. Environmental Protection Agency’s (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. Some of the many uses of these source profiles include: (1) creating speciated emissions inventories for regi...
SPECIATE 4.2: speciation Database Development Documentation
SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. Among the many uses of speciation data, these source profiles are used to: (1) create speciated emissions inve...
USDA-ARS?s Scientific Manuscript database
The National Clonal Germplasm Repository (NCGR) in Davis is one among the nine repositories in the National Plant Germplasm System, USDA-ARS that is responsible for conservation of clonally propagated woody perennial subtropical and temperate fruit and nut crop germplasm. Currently the repository ho...
Wynn, J.C.; Roseboom, E.H.
1987-01-01
Evaluation of potential high-level nuclear waste repository sites is an area where geophysical capabilities and limitations may significantly impact a major governmental program. Since there is concern that extensive exploratory drilling might degrade most potential disposal sites, geophysical methods become crucial as the only nondestructive means to examine large volumes of rock in three dimensions. Characterization of potential sites requires geophysicists to alter their usual mode of thinking: no longer are anomalies being sought, as in mineral exploration, but rather their absence. Thus the size of features that might go undetected by a particular method takes on new significance. Legal and regulatory considerations that stem from this different outlook, most notably the requirements of quality assurance (necessary for any data used in support of a repository license application), are forcing changes in the manner in which geophysicists collect and document their data. -Authors
An Infrastructure for Indexing and Organizing Best Practices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Staples, Mark; Gorton, Ian
Industry best practices are widely held but not necessarily empirically verified software engineering beliefs. Best practices can be documented in distributed web-based public repositories as pattern catalogues or practice libraries. There is a need to systematically index and organize these practices to enable their better practical use and scientific evaluation. In this paper, we propose a semi-automatic approach to index and organise best practices. A central repository acts as an information overlay on top of other pre-existing resources to facilitate organization, navigation, annotation and meta-analysis while maintaining synchronization with those resources. An initial population of the central repository is automated using Yahoo! contextual search services. The collected data is organized using semantic web technologies so that the data can be more easily shared and used for innovative analyses. A prototype has demonstrated the capability of the approach.
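Organizing the collected practices with semantic web technologies amounts to storing them as RDF triples. A minimal sketch using the rdflib library; the namespace, class, and property names are invented, not the paper's actual schema:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

# Minimal sketch of indexing a best practice as RDF triples, in the spirit of
# the semantic-web organization described above. The namespace, class, and
# property names are invented; this is not the paper's actual schema.

EX = Namespace("http://example.org/practices#")
g = Graph()
g.bind("ex", EX)

practice = URIRef("http://example.org/practices/continuous-integration")
g.add((practice, RDF.type, EX.BestPractice))
g.add((practice, DC.title, Literal("Continuous integration")))
g.add((practice, DC.source, Literal("collected via contextual web search")))
g.add((practice, EX.topic, Literal("software engineering")))

print(g.serialize(format="turtle"))  # shareable, machine-readable form
```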
Geoscience parameter data base handbook: granites and basalts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-12-01
The Department of Energy has the responsibility for selecting and constructing Federal repositories for radioactive waste. The Nuclear Regulatory Commission must license such repositories prior to construction. The basic requirement in the geologic disposal of radioactive waste is stated as: placement in a geologic host whereby the radioactive waste is not in mechanical, thermal or chemical equilibrium, with the object of preventing physical or chemical migration of radionuclides into the biosphere or hydrosphere in hazardous concentration (USGS, 1977). The object of this report is to document the known geologic parameters of large granite and basalt occurrences in the coterminous United States, for future evaluation in the selection and licensing of radioactive waste repositories. The description of the characteristics of certain potential igneous hosts has been limited to existing data pertaining to the general geologic character, geomechanics, and hydrology of identified occurrences. A description of the geochemistry is the subject of a separate report.
NASA Astrophysics Data System (ADS)
Thomas, V. I.; Yu, E.; Acharya, P.; Jaramillo, J.; Chowdhury, F.
2015-12-01
Maintaining and archiving accurate site metadata is critical for seismic network operations. The Advanced National Seismic System (ANSS) Station Information System (SIS) is a repository of seismic network field equipment, equipment response, and other site information. Currently, there are 187 different sensor models and 114 data-logger models in SIS. SIS has a web-based user interface that allows network operators to enter information about seismic equipment and assign response parameters to it. It allows users to log entries for sites, equipment, and data streams. Users can also track when equipment is installed, updated, and/or removed from sites. When seismic equipment configurations change for a site, SIS computes the overall gain of a data channel by combining the response parameters of the underlying hardware components. Users can then distribute this metadata in standardized formats such as FDSN StationXML or dataless SEED. One powerful advantage of SIS is that existing data in the repository can be leveraged: e.g., new instruments can be assigned response parameters from the Incorporated Research Institutions for Seismology (IRIS) Nominal Response Library (NRL), or from a similar instrument already in the inventory, thereby reducing the amount of time needed to determine parameters when new equipment (or models) are introduced into a network. SIS is also useful for managing field equipment that does not produce seismic data (e.g., power systems, telemetry devices, or GPS receivers) and gives the network operator a comprehensive view of site field work. SIS allows users to generate field logs to document activities and inventory at sites. Thus, operators can also use SIS reporting capabilities to improve planning and maintenance of the network. Queries such as how many sensors of a certain model are installed, or which pieces of equipment have active problem reports, are just a few examples of the type of information that is available to SIS users.
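For response stages expressed as simple scalar sensitivities, combining the hardware components into an overall channel gain reduces to a product of stage gains. A minimal sketch; the stage values below are hypothetical, not SIS data:

```python
from math import prod

# Combining hardware response stages into an overall channel gain, as SIS
# does when a site's equipment configuration changes. For stages expressed
# as scalar sensitivities the combination is a product. The values below
# are hypothetical, not SIS data.

stages = {
    "sensor (V/(m/s))": 1500.0,      # broadband seismometer sensitivity
    "preamp (V/V)": 1.0,
    "digitizer (counts/V)": 419430.0,
}

overall_gain = prod(stages.values())  # counts per (m/s)
print(f"overall gain = {overall_gain:.6g} counts/(m/s)")
```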
DOE Office of Scientific and Technical Information (OSTI.GOV)
F. Perry; R. Youngs
The purpose of this scientific analysis report is threefold: (1) Present a conceptual framework of igneous activity in the Yucca Mountain region (YMR) consistent with the volcanic and tectonic history of this region and the assessment of this history by experts who participated in the probabilistic volcanic hazard analysis (PVHA) (CRWMS M&O 1996 [DIRS 100116]). Conceptual models presented in the PVHA are summarized and applied in areas in which new information has been presented. Alternative conceptual models are discussed, as well as their impact on probability models. The relationship between volcanic source zones defined in the PVHA and structural features of the YMR are described based on discussions in the PVHA and studies presented since the PVHA. (2) Present revised probability calculations based on PVHA outputs for a repository footprint proposed in 2003 (BSC 2003 [DIRS 162289]), rather than the footprint used at the time of the PVHA. This analysis report also calculates the probability of an eruptive center(s) forming within the repository footprint using information developed in the PVHA. Probability distributions are presented for the length and orientation of volcanic dikes located within the repository footprint and for the number of eruptive centers (conditional on a dike intersecting the repository) located within the repository footprint. (3) Document sensitivity studies that analyze how the presence of potentially buried basaltic volcanoes may affect the computed frequency of intersection of the repository footprint by a basaltic dike. These sensitivity studies are prompted by aeromagnetic data collected in 1999, indicating the possible presence of previously unrecognized buried volcanoes in the YMR (Blakely et al. 2000 [DIRS 151881]; O'Leary et al. 2002 [DIRS 158468]). The results of the sensitivity studies are for informational purposes only and are not to be used for purposes of assessing repository performance.
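Converting an annual frequency of dike intersection into a probability over an assessment period uses the standard Poisson relation P = 1 - exp(-lambda * T). A minimal sketch; the frequency below is illustrative, not a PVHA output:

```python
import math

# Converting an annual frequency of dike intersection into a probability over
# an assessment period, via the Poisson relation P = 1 - exp(-lambda * T).
# The frequency below is illustrative, not a PVHA output.

annual_frequency = 1.0e-8  # dike intersections per year (illustrative)
period_years = 1.0e4       # length of the assessment period

p_at_least_one = 1.0 - math.exp(-annual_frequency * period_years)
print(f"P(>=1 intersection in {period_years:.0f} yr) = {p_at_least_one:.3e}")
```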
Schematic designs for penetration seals for a reference repository in bedded salt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelsall, P.C.; Case, J.B.; Meyer, D.
1982-11-01
The isolation of radioactive wastes in geologic repositories requires that man-made penetrations such as shafts, tunnels, or boreholes are adequately sealed. This report describes schematic seal designs for a repository in bedded salt, referenced to the stratigraphy of southeastern New Mexico. The designs are presented for extensive peer review and will be updated as site-specific conceptual designs when a site for a repository in salt has been selected. The principal material used in the seal system is crushed salt obtained from excavating the repository. It is anticipated that crushed salt will consolidate as the repository rooms creep closed, to the degree that mechanical and hydrologic properties will eventually match those of undisturbed, intact salt. For southeastern New Mexico salt, analyses indicate that this process will require approximately 1000 years for a seal located at the base of one of the repository shafts (where there is little increase in temperature due to waste emplacement) and approximately 400 years for a seal located in an access tunnel within the repository. Bulkheads composed of concrete or salt bricks are also included in the seal system as components which will have low permeability during the period required for salt consolidation.
Native American Art and Culture: Documentary Resources.
ERIC Educational Resources Information Center
Lawrence, Deirdre
1992-01-01
Presents a brief overview of the evolution of documentary material of Native American cultures and problems confronted by researchers in locating relevant information. Bibliographic sources for research are discussed and a directory of major repositories of Native American art documentation is provided. (EA)
NASA Astrophysics Data System (ADS)
Kwon, N.; Gentle, J.; Pierce, S. A.
2015-12-01
Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a valid problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful for both practical applications as a teaching tool and case study for groundwater management, as well as informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the application is accessible with version control and potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.
Fifth NASA Goddard Conference on Mass Storage Systems and Technologies. Volume 1
NASA Technical Reports Server (NTRS)
Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)
1996-01-01
This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.
Revision history aware repositories of computational models of biological systems.
Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F
2011-01-14
Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. We have extended the Physiome Model Repository software to be fully revision history aware, by building it on top of Mercurial, an existing DVCS. We have demonstrated the utility of this approach, when used in conjunction with the model composition facilities in CellML, to build and understand more complex models. We have also demonstrated the ability of the repository software to present version history to casual users over the web, and to highlight specific versions which are likely to be useful to users. Providing facilities for maintaining and using revision history information is an important part of building a useful repository of computational models, as this information is useful both for understanding the source of and justification for parts of a model, and to facilitate automated processes such as merges. The availability of fully revision history aware repositories, and associated tools, will therefore be of significant benefit to the community.
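Because the repository described above is built on Mercurial, the merge-aware revision history it gains can be inspected with ordinary hg commands. A minimal sketch driving the hg client from Python; the repository URL is a placeholder, not a real model workspace:

```python
import subprocess

# Minimal sketch of the DVCS operations the repository builds on: Mercurial
# records branched and merged model histories that a linear version list
# cannot. Requires the `hg` client; the repository URL is a placeholder.

def hg(*args, cwd=None):
    return subprocess.run(["hg", *args], cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

hg("clone", "https://models.example.org/workspace/my-model", "my-model")
# Full revision graph, including merges between divergent model variants:
print(hg("log", "--graph", "--template", "{node|short} {desc|firstline}\n",
         cwd="my-model"))
```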
Saadawi, Gilan M; Harrison, James H
2006-10-01
Clinical laboratory procedure manuals are typically maintained as word processor files and are inefficient to store and search, require substantial effort for review and updating, and integrate poorly with other laboratory information. Electronic document management systems could improve procedure management and utility. As a first step toward building such systems, we have developed a prototype electronic format for laboratory procedures using Extensible Markup Language (XML). Representative laboratory procedures were analyzed to identify document structure and data elements. This information was used to create a markup vocabulary, CLP-ML, expressed as an XML Document Type Definition (DTD). To determine whether this markup provided advantages over generic markup, we compared procedures structured with CLP-ML or with the vocabulary of the Health Level Seven, Inc. (HL7) Clinical Document Architecture (CDA) narrative block. CLP-ML includes 124 XML tags and supports a variety of procedure types across different laboratory sections. When compared with a general-purpose markup vocabulary (CDA narrative block), CLP-ML documents were easier to edit and read, less complex structurally, and simpler to traverse for searching and retrieval. In combination with appropriate software, CLP-ML is designed to support electronic authoring, reviewing, distributing, and searching of clinical laboratory procedures from a central repository, decreasing procedure maintenance effort and increasing the utility of procedure information. A standard electronic procedure format could also allow laboratories and vendors to share procedures and procedure layouts, minimizing duplicative word processor editing. Our results suggest that laboratory-specific markup such as CLP-ML will provide greater benefit for such systems than generic markup.
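As an illustration of why procedure-specific markup is simpler to traverse than generic narrative markup, here is a small Python sketch that queries a structured procedure document for one section by name. The element names are invented placeholders, not the actual CLP-ML vocabulary.

import xml.etree.ElementTree as ET

doc = """<procedure id="glucose-poc">
  <title>Point-of-Care Glucose Testing</title>
  <section name="specimen"><step>Collect capillary whole blood.</step></section>
  <section name="quality-control"><step>Run two QC levels daily.</step></section>
</procedure>"""

root = ET.fromstring(doc)
# Retrieve only the quality-control steps, without scanning free text.
for step in root.findall("./section[@name='quality-control']/step"):
    print(step.text)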
e!DAL - a framework to store, share and publish research data
2014-01-01
Background: The life-science community faces a major challenge in handling “big data”, highlighting the need for high quality infrastructures capable of sharing and publishing research data. Data preservation, analysis, and publication are the three pillars in the “big data life cycle”. The infrastructures currently available for managing and publishing data are often designed to meet domain-specific or project-specific requirements, resulting in the repeated development of proprietary solutions and lower quality data publication and preservation overall. Results: e!DAL is a lightweight software framework for publishing and sharing research data. Its main features are version tracking, metadata management, information retrieval, registration of persistent identifiers (DOI), an embedded HTTP(S) server for public data access, access as a network file system, and a scalable storage backend. e!DAL is available as an API for local non-shared storage and as a remote API featuring distributed applications. It can be deployed “out-of-the-box” as an on-site repository. Conclusions: e!DAL was developed based on experiences coming from decades of research data management at the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK). Initially developed as a data publication and documentation infrastructure for the IPK’s role as a data center in the DataCite consortium, e!DAL has grown towards being a general data archiving and publication infrastructure. The e!DAL software has been deployed into the Maven Central Repository. Documentation and Software are also available at: http://edal.ipk-gatersleben.de. PMID:24958009
e!DAL--a framework to store, share and publish research data.
Arend, Daniel; Lange, Matthias; Chen, Jinbo; Colmsee, Christian; Flemming, Steffen; Hecht, Denny; Scholz, Uwe
2014-06-24
The life-science community faces a major challenge in handling "big data", highlighting the need for high quality infrastructures capable of sharing and publishing research data. Data preservation, analysis, and publication are the three pillars in the "big data life cycle". The infrastructures currently available for managing and publishing data are often designed to meet domain-specific or project-specific requirements, resulting in the repeated development of proprietary solutions and lower quality data publication and preservation overall. e!DAL is a lightweight software framework for publishing and sharing research data. Its main features are version tracking, metadata management, information retrieval, registration of persistent identifiers (DOI), an embedded HTTP(S) server for public data access, access as a network file system, and a scalable storage backend. e!DAL is available as an API for local non-shared storage and as a remote API featuring distributed applications. It can be deployed "out-of-the-box" as an on-site repository. e!DAL was developed based on experiences coming from decades of research data management at the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK). Initially developed as a data publication and documentation infrastructure for the IPK's role as a data center in the DataCite consortium, e!DAL has grown towards being a general data archiving and publication infrastructure. The e!DAL software has been deployed into the Maven Central Repository. Documentation and Software are also available at: http://edal.ipk-gatersleben.de.
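The DOI-registration step that e!DAL automates can be illustrated with a direct call to the DataCite REST API. This is a hedged sketch, not e!DAL's actual (Java) API: the repository account, password, prefix, and URLs below are placeholders, and the payload follows the public DataCite JSON:API format as best understood.

import requests

payload = {
    "data": {
        "type": "dois",
        "attributes": {
            "doi": "10.5072/example-dataset-001",   # 10.5072 = test prefix
            "titles": [{"title": "Barley field trial images, 2013"}],
            "creators": [{"name": "Doe, Jane"}],
            "publisher": "Example Plant Research Institute",
            "publicationYear": 2014,
            "types": {"resourceTypeGeneral": "Dataset"},
            "url": "https://repo.example.org/datasets/001",
            "event": "publish",                     # register and make findable
        },
    }
}
resp = requests.post(
    "https://api.test.datacite.org/dois",           # DataCite test endpoint
    json=payload,
    auth=("EXAMPLE.REPO", "password"),              # placeholder credentials
    headers={"Content-Type": "application/vnd.api+json"},
)
print(resp.status_code, resp.json().get("data", {}).get("id"))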
10 CFR 960.5-2-5 - Environmental quality.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Preclosure Guidelines Environment, Socioeconomics, and Transportation § 960.5-2-5 Environmental... repository siting, construction, operation, closure, and decommissioning, and projected environmental impacts... of the repository or its support facilities on, a component of the National Park System, the National...
Automated Rocket Propulsion Test Management
NASA Technical Reports Server (NTRS)
Walters, Ian; Nelson, Cheryl; Jones, Helene
2007-01-01
The Rocket Propulsion Test-Automated Management System provides a central location for managing activities associated with the Rocket Propulsion Test Management Board, the National Rocket Propulsion Test Alliance, and the Senior Steering Group business management activities. A set of authorized users, both on-site and off-site with regard to Stennis Space Center (SSC), can access the system through a Web interface. Web-based forms are used for user input, and generation and electronic distribution of reports are easily accessible. Major functions managed by this software include meeting agenda management, meeting minutes, action requests, action items, directives, and recommendations. Additional functions include electronic review, approval, and signatures. A repository/library of documents is available for users, and all items are tracked in the system by unique identification numbers and status (open, closed, percent complete, etc.). The system also provides queries and version control for input of all items.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meike, A.; Stroes-Gascoyne, S.
2000-08-01
A workshop on Microbial Activities at Yucca Mountain (May 1995, Lafayette, CA) was held with the intention of compiling information on all pertinent aspects of microbial activity for application to a potential repository at Yucca Mountain. The findings of this workshop set off a number of efforts intended to eventually incorporate the impacts of microbial behavior into performance assessment models. One effort was to expand an existing modeling approach to include the distinctive characteristics of a repository at Yucca Mountain (e.g., unsaturated conditions and a significant thermal load). At the same time, a number of experimental studies were initiated as well as a compilation of relevant literature to more thoroughly study the physical, chemical and biological parameters that would affect microbial activity under Yucca Mountain-like conditions. This literature search (completed in 1996) is the subject of the present document. The collected literature can be divided into four categories: (1) abiotic factors, (2) community dynamics and in-situ considerations, (3) nutrient considerations and (4) transport of radionuclides. The complete bibliography represents a considerable resource, but is too large to be discussed in one document. Therefore, the present report focuses on the first category, abiotic factors, and a discussion of these factors in order to facilitate the development of a model for Yucca Mountain.
NASA Technical Reports Server (NTRS)
Milroy, Audrey; Hale, Joe
2006-01-01
NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers with information on a model's fidelity, credibility, and quality, including verification, validation and accreditation information. The NASA MSRR will be implemented leveraging M&S industry best practices. This presentation will discuss the requirements that will enable NASA to capture and make available the "meta data" or "simulation biography" data associated with a model. The presentation will also describe the requirements that drive how NASA will collect and document relevant information for models or suites of models in order to facilitate use and reuse of relevant models and provide visibility across NASA organizations and the larger M&S community.
Ten Thousand Years of Solitude
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benford, G.; Kirkwood, C.W.; Harry, O.
1991-03-01
This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Plant (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs.
In-Drift Microbial Communities
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Jolley
2000-11-09
As directed by written work direction (CRWMS M and O 1999f), Performance Assessment (PA) developed a model for microbial communities in the engineered barrier system (EBS) as documented here. The purpose of this model is to assist Performance Assessment and its Engineered Barrier Performance Section in modeling the geochemical environment within a potential repository drift for TSPA-SR/LA, thus allowing PA to provide a more detailed and complete near-field geochemical model and to answer the key technical issues (KTI) raised in the NRC Issue Resolution Status Report (IRSR) for the Evolution of the Near Field Environment (NFE) Revision 2 (NRC 1999). This model and its predecessor (the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document, CRWMS M and O 1998a) were developed to respond to the applicable KTIs. Additionally, because of the previous development of the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document (CRWMS M and O 1998a), the M and O was effectively able to resolve a previous KTI concern regarding the effects of microbial processes on seepage and flow (NRC 1998). This document supersedes the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document (CRWMS M and O 1998a). This document provides the conceptual framework of the revised in-drift microbial communities model to be used in subsequent performance assessment (PA) analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, S.K.; Cole, C.R.; Bond, F.W.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document describes the FE3DGW (Finite Element, Three-Dimensional Groundwater) hydrologic model, a third-level (high-complexity), three-dimensional finite element approach (Galerkin formulation) for saturated groundwater flow.
Working paper : the ITS cost data repository at Mitretek Systems
DOT National Transportation Integrated Search
1998-11-30
Mitretek Systems has been tasked by the Intelligent Transportation Systems (ITS) Joint Program Office (JPO) to collect available information on ITS costs and maintain the information in a cost database, which serves as the ITS Cost Data Repository. T...
Cieslewicz, Artur; Dutkiewicz, Jakub; Jedrzejek, Czeslaw
2018-01-01
Information retrieval from biomedical repositories has become a challenging task because of their increasing size and complexity. To facilitate the research aimed at improving the search for relevant documents, various information retrieval challenges have been launched. In this article, we present the improved medical information retrieval systems designed by Poznan University of Technology and Poznan University of Medical Sciences as a contribution to the bioCADDIE 2016 challenge, a task focusing on information retrieval from a collection of 794,992 datasets generated from 20 biomedical repositories. The system developed by our team utilizes the Terrier 4.2 search platform enhanced by a query expansion method using word embeddings. This approach, after post-challenge modifications and improvements (with particular regard to assigning proper weights for original and expanded terms), allowed us to achieve the second-best infNDCG measure (0.4539) among the challenge results, with an infAP of 0.3978. This demonstrates that proper utilization of word embeddings can be a valuable addition to the information retrieval process. Some analysis is provided on related work involving other bioCADDIE contributions. We discuss the possibility of improving our results by using better word embedding schemes to find candidates for query expansion. Database URL: https://biocaddie.org/benchmark-data PMID:29688372
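The weighting idea described above (full weight for original query terms, reduced weight for embedding-derived expansions) can be sketched as follows. The toy vectors and the 0.3 expansion weight are illustrative assumptions, not the team's tuned Terrier configuration.

import numpy as np

embeddings = {            # toy word vectors; a real system loads trained ones
    "tumor":    np.array([0.9, 0.1, 0.0]),
    "cancer":   np.array([0.85, 0.15, 0.05]),
    "genome":   np.array([0.1, 0.9, 0.2]),
    "sequence": np.array([0.15, 0.85, 0.3]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand(query_terms, k=1, expansion_weight=0.3):
    weighted = {t: 1.0 for t in query_terms}   # original terms keep full weight
    for term in query_terms:
        if term not in embeddings:
            continue
        neighbours = sorted(
            (w for w in embeddings if w != term),
            key=lambda w: cosine(embeddings[term], embeddings[w]),
            reverse=True,
        )[:k]
        for w in neighbours:
            weighted.setdefault(w, expansion_weight)
    return weighted

print(expand(["tumor", "genome"]))
# {'tumor': 1.0, 'genome': 1.0, 'cancer': 0.3, 'sequence': 0.3}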
Geoengineering properties of potential repository units at Yucca Mountain, southern Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tillerson, J.R.; Nimick, F.B.
1984-12-01
The Nevada Nuclear Waste Storage Investigations (NNWSI) Project is currently evaluating volcanic tuffs at the Yucca Mountain site, located on and adjacent to the Nevada Test Site, for possible use as a host rock for a radioactive waste repository. The behavior of tuff as an engineering material must be understood to design, license, construct, and operate a repository. Geoengineering evaluations and measurements are being made to develop confidence in both the analysis techniques for thermal, mechanical, and hydrothermal effects and the supporting data base of rock properties. The analysis techniques and the data base are currently used for repository design, waste package design, and performance assessment analyses. This report documents the data base of geoengineering properties used in the analyses that aided the selection of the waste emplacement horizon and in analyses synopsized in the Environmental Assessment Report prepared for the Yucca Mountain site. The strategy used for the development of the data base relies primarily on data obtained in laboratory tests that are then confirmed in field tests. Average thermal and mechanical properties (and their anticipated variations) are presented. Based upon these data, analyses completed to date, and previous excavation experience in tuff, it is anticipated that existing mining technology can be used to develop stable underground openings and that repository operations can be carried out safely.
wayGoo: a platform for geolocating and managing indoor and outdoor spaces
NASA Astrophysics Data System (ADS)
Thomopoulos, Stelios C. A.; Karafylli, Christina; Karafylli, Maria; Motos, Dionysis; Lampropoulos, Vassilis; Dimitros, Kostantinos; Margonis, Christos
2016-05-01
wayGoo is a platform for geolocating and managing indoor and outdoor spaces and content, with multidimensional indoor and outdoor navigation and guidance. Its main components are a Geographic Information System, a back-end server, front-end applications and a web-based Content Management System (CMS). It constitutes a fully integrated 2D/3D space and content management system whose repository consists of a database, content components and administrative data. wayGoo can connect to any third-party database and event-management data source. The platform is secure, as the data are only available through a RESTful web service over HTTPS in conjunction with an API key used for authentication. To enhance the user experience, wayGoo makes content available by extracting components from the repository and constructing targeted applications. The platform supports geo-referencing of indoor and outdoor information and the use of metadata, and allows the use of existing information such as maps and databases. It enables planning through integration of content that is connected spatially, temporally or contextually, and provides immediate access to all spatial data through interfaces and interactive 2D and 3D representations. wayGoo constitutes a means to document and preserve assets through computerized techniques and, combined with the wayGoo notification and alert system, enhances the protection of a site, its people and its guests. It is also a strong marketing tool, providing staff and visitors with an immersive tool for navigation in indoor spaces and allowing users to organize their agenda and discover events through the wayGoo event scheduler and recommendation system. Furthermore, the platform can be used in security applications and event management, e.g. CBRNE incidents and man-made or natural disasters, to document and geolocate information and sensor data (offline and in real time) on one end and to offer navigation capabilities in indoor and outdoor spaces on the other. Finally, wayGoo can be used to create immersive environments and experiences in conjunction with VR/AR (Virtual and Augmented Reality) technologies.
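The stated access pattern (content served only through a RESTful service over HTTPS, authenticated with an API key) might look like the following from a client's perspective. The endpoint, parameter names, and key-passing header are hypothetical, not the documented wayGoo API.

import requests

API_KEY = "YOUR-API-KEY"                       # placeholder credential
resp = requests.get(
    "https://waygoo.example.org/api/events",   # placeholder endpoint
    params={"building": "conference-center", "floor": 2},
    headers={"X-Api-Key": API_KEY},            # assumed key-passing convention
    timeout=10,
)
resp.raise_for_status()
for event in resp.json():
    print(event.get("title"), event.get("room"))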
10 CFR 960.4 - Postclosure guidelines.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Postclosure Guidelines § 960.4 Postclosure guidelines. The guidelines in this subpart specify the factors to be considered in evaluating and comparing sites on the basis of expected repository performance... NRC and EPA regulations. These requirements must be met by the repository system, which contains...
USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15
2017-05-31
AFRL-SA-WP-SR-2017-0014. USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15. Williams, Daniel A. Data are drawn from the Defense Occupational and Environmental Health Readiness System-Hearing Conservation Data Repository (DOEHRS-HC DR); major command- and installation-level reports are available quarterly.
A data library management system for midwest FreightView and its data repository.
DOT National Transportation Integrated Search
2011-03-01
Midwest FreightView (MWFV) and its associated data repository is part of a large multifaceted : effort to promote regional economic development throughout the Great Lakes : system. The main objective for the system is to promote sustainable maritime ...
Privacy-Preserving Accountable Accuracy Management Systems (PAAMS)
NASA Astrophysics Data System (ADS)
Thomas, Roshan K.; Sandhu, Ravi; Bertino, Elisa; Arpinar, Budak; Xu, Shouhuai
We argue for the design of “Privacy-preserving Accountable Accuracy Management Systems (PAAMS)”. The designs of such systems recognize from the onset that accuracy, accountability, and privacy management are intertwined. As such, these systems have to dynamically manage the tradeoffs between these (often conflicting) objectives. For example, accuracy in such systems can be improved by providing better accountability links between structured and unstructured information. Further, accuracy may be enhanced if access to private information is allowed in controllable and accountable ways. Our proposed approach involves three key elements. First, a model to link unstructured information such as that found in email, image and document repositories with structured information such as that in traditional databases. Second, a model for accuracy management and entity disambiguation by proactively preventing, detecting and tracing errors in information bases. Third, a model to provide privacy-governed operation as accountability and accuracy are managed.
Semantic Document Library: A Virtual Research Environment for Documents, Data and Workflows Sharing
NASA Astrophysics Data System (ADS)
Kotwani, K.; Liu, Y.; Myers, J.; Futrelle, J.
2008-12-01
The Semantic Document Library (SDL) was driven by use cases from the environmental observatory communities and is designed to provide conventional document repository features of uploading, downloading, editing and versioning of documents as well as value-adding features of tagging, querying, sharing, annotating, ranking, provenance, social networking and geo-spatial mapping services. It allows users to organize a catalogue of watershed observation data, model output, workflows, as well as publications and documents related to the same watershed study through the tagging capability. Users can tag all relevant materials using the same watershed name and find all of them easily later using this tag. The underpinning semantic content repository can store materials from other cyberenvironments such as workflow or simulation tools, and SDL provides an effective interface to query and organize materials from various sources. Advanced features of the SDL allow users to visualize the provenance of the materials, such as the source and how the output data were derived. Other novel features include visualizing all geo-referenced materials on a geospatial map. SDL, as a component of a cyberenvironment portal (the NCSA Cybercollaboratory), has the goal of efficient management of information and relationships between published artifacts (validated models, vetted data, workflows, annotations, best practices, reviews and papers) produced from raw research artifacts (data, notes, plans etc.) through agents (people, sensors etc.). The tremendous scientific potential of artifacts is achieved through mechanisms of sharing, reuse and collaboration, empowering scientists to spread their knowledge and protocols and to benefit from the knowledge of others. SDL implements web 2.0 technologies and design patterns along with a semantic content management approach that enables the use of multiple ontologies and dynamic evolution (e.g. folksonomies) of terminology. Scientific documents involved with many interconnected entities (artifacts or agents) are represented as RDF triples using the semantic content repository middleware Tupelo in one or many data/metadata RDF stores. Queries over the RDF enable discovery of relations among data, processes and people, digging out valuable aspects and making recommendations to users, such as what tools are typically used to answer certain kinds of questions or with certain types of dataset. This concept brings out coherent information about entities from four different perspectives: the social context (Who: human relations and interactions), the causal context (Why: provenance and history), the geo-spatial context (Where: location or spatially referenced information) and the conceptual context (What: domain-specific relations, ontologies etc.).
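A minimal sketch of the RDF representation described above, using the widely available rdflib library rather than the Tupelo middleware named in the abstract; the ex: vocabulary is invented for illustration. It stores a few triples linking a dataset, a paper, and an author, then runs a SPARQL query spanning the causal (provenance) and social (creator) contexts.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import DC, FOAF

EX = Namespace("http://example.org/sdl/")
g = Graph()

dataset = EX["watershed-2008-flow"]
paper = EX["paper-42"]
author = EX["alice"]

g.add((dataset, DC.title, Literal("2008 watershed flow observations")))
g.add((paper, EX.derivedFrom, dataset))   # causal context: provenance
g.add((paper, DC.creator, author))        # social context: who
g.add((author, FOAF.name, Literal("Alice Example")))

# "What was this paper derived from, and who wrote it?"
q = """
SELECT ?src ?name WHERE {
  ?paper <http://example.org/sdl/derivedFrom> ?src ;
         <http://purl.org/dc/elements/1.1/creator> ?a .
  ?a <http://xmlns.com/foaf/0.1/name> ?name .
}
"""
for row in g.query(q):
    print(row.src, row.name)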
SPECIATE 4.3: Addendum to SPECIATE 4.2--Speciation database development documentation
SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. Among the many uses of speciation data, these source profiles are used to: (1) create speciated emissions inve...
Repository Planning, Design, and Engineering: Part II-Equipment and Costing.
Baird, Phillip M; Gunter, Elaine W
2016-08-01
Part II of this article discusses and provides guidance on the equipment and systems necessary to operate a repository. The various types of storage equipment and monitoring and support systems are presented in detail. While the material focuses on the large repository, the requirements for a small-scale startup are also presented. Cost estimates and a cost model for establishing a repository are presented. The cost model presents an expected range of acquisition costs for the large capital items in developing a repository. A constructed area of 5,000-7,000 ft² with 50 frozen storage units has been assumed, to reflect a successful operation with growth potential. No design or engineering costs, permit or regulatory costs, or smaller items such as the computers, software, furniture, phones, and barcode readers required for operations have been included.
ERIC Educational Resources Information Center
White, Hollie C.
2012-01-01
Background: According to Salo (2010), the metadata entered into repositories are "disorganized" and metadata schemes underlying repositories are "arcane". This creates a challenging repository environment in regards to personal information management (PIM) and knowledge organization systems (KOSs). This dissertation research is…
Data Storing Proposal from Heterogeneous Systems into a Specialized Repository
NASA Astrophysics Data System (ADS)
Václavová, Andrea; Tanuška, Pavol; Jánošík, Ján
2016-12-01
The aim of this paper is to analyze and to propose an appropriate system for processing and simultaneously storing a vast volume of structured and unstructured data. The paper consists of three parts. The first part addresses the issue of structured and unstructured data. The second part provides a detailed analysis of data repositories and a subsequent evaluation indicating which system would be optimal for the given type and volume of data. The third part focuses on the use of the gathered information to transfer data to the proposed repository.
78 FR 23938 - Privacy Act of 1974; Report of a New Routine Use for Selected CMS Systems of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
...-0558; Medicare Integrated Data Repository (IDR), System No. 09-70-0571; Common Working Files (CWF... Integrated Data Repository (IDR), System No. 09-70- 0571, published at 71 Fed. Reg., 74915 (December 13, 2006...
PGP repository: a plant phenomics and genomics data publication infrastructure.
Arend, Daniel; Junker, Astrid; Scholz, Uwe; Schüler, Danuta; Wylie, Juliane; Lange, Matthias
2016-01-01
Plant genomics and phenomics represent the most promising tools for accelerating yield gains and overcoming emerging crop productivity bottlenecks. However, accessing this wealth of plant diversity requires the characterization of this material using state-of-the-art genomic, phenomic and molecular technologies and the release of subsequent research data via a long-term stable, open-access portal. Although several international consortia and public resource centres offer services for plant research data management, valuable digital assets remain unpublished and thus inaccessible to the scientific community. Recently, the Leibniz Institute of Plant Genetics and Crop Plant Research and the German Plant Phenotyping Network have jointly initiated the Plant Genomics and Phenomics Research Data Repository (PGP) as an infrastructure to comprehensively publish plant research data. This covers in particular cross-domain datasets that are not being published in central repositories because of their volume or unsupported data scope, like image collections from plant phenotyping and microscopy, unfinished genomes, genotyping data, visualizations of morphological plant models, data from mass spectrometry as well as software and documents. The repository is hosted at the Leibniz Institute of Plant Genetics and Crop Plant Research using e!DAL as software infrastructure and a Hierarchical Storage Management System as data archival backend. A newly developed data submission tool was made available for the consortium that features a high level of automation to lower the barriers of data publication. After an internal review process, data are published as citable digital object identifiers (DOIs) and a core set of technical metadata is registered at DataCite. The e!DAL-embedded Web frontend generates a landing page for each dataset and supports interactive exploration. PGP is registered as a research data repository at BioSharing.org, re3data.org and OpenAIRE as a valid EU Horizon 2020 open data archive. The above features, the programmatic interface and the support of standard metadata formats enable PGP to fulfil the FAIR data principles: findable, accessible, interoperable, reusable. Database URL: http://edal.ipk-gatersleben.de/repos/pgp/. © The Author(s) 2016. Published by Oxford University Press.
Mountain-Scale Coupled Processes (TH/THC/THM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. Dixon
The purpose of this Model Report is to document the development of the Mountain-Scale Thermal-Hydrological (TH), Thermal-Hydrological-Chemical (THC), and Thermal-Hydrological-Mechanical (THM) Models and evaluate the effects of coupled TH/THC/THM processes on mountain-scale UZ flow at Yucca Mountain, Nevada. This Model Report was planned in "Technical Work Plan (TWP) for: Performance Assessment Unsaturated Zone" (BSC 2002 [160819], Section 1.12.7), and was developed in accordance with AP-SIII.10Q, Models. In this Model Report, any reference to "repository" means the nuclear waste repository at Yucca Mountain, and any reference to "drifts" means the emplacement drifts at the repository horizon. This Model Report provides the necessary framework to test conceptual hypotheses for analyzing mountain-scale hydrological/chemical/mechanical changes and predict flow behavior in response to heat release by radioactive decay from the nuclear waste repository at the Yucca Mountain site. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH Model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH Model captures mountain-scale three-dimensional (3-D) flow effects, including lateral diversion at the PTn/TSw interface and mountain-scale flow patterns. The Mountain-Scale THC Model evaluates TH effects on water and gas chemistry, mineral dissolution/precipitation, and the resulting impact to UZ hydrological properties, flow and transport. The THM Model addresses changes in permeability due to mechanical and thermal disturbances in stratigraphic units above and below the repository host rock. The Mountain-Scale THM Model focuses on evaluating the changes in 3-D UZ flow fields arising out of thermal stress and rock deformation during and after the thermal periods.
Kim, Hwa Sun; Cho, Hune; Lee, In Keun
2011-06-01
We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed to be used for ambulatory care by office-based physicians in the United States. This is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medicaid Services (CMS)-1500 form based on information regarding the patients and physicians' clinical activities. It supports electronic insurance claims by creating reimbursement charges. It also contains an HL7 interface engine to exchange clinical messages between heterogeneous devices. The system partially prevents physician malpractice by suggesting proper treatments according to patient diagnoses, and supports physicians by easily preparing documents for reimbursement and submitting claim documents to insurance organizations electronically, without additional effort by the user. To show the usability of the developed system, we performed an experiment that compares the time spent filling out the CMS-1500 form directly with the time required to create electronic claim data using the developed system. From the experimental results, we conclude that the system could save considerable time for physicians in making claim documents. The developed system might be particularly useful for those who need a reimbursement-specialized EHR system, even though the proposed system does not completely satisfy all criteria requested by the CMS and the Office of the National Coordinator for Health Information Technology (ONC), because those criteria are a necessary but not sufficient condition for the implementation of EHR systems. The system will be upgraded continuously to implement the criteria and to offer more stable and transparent transmission of electronic claim data.
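The form-filling step can be sketched as a simple mapping from already-captured encounter data onto CMS-1500 field numbers, so the physician does not re-enter them. The field numbers follow the public CMS-1500 layout; the Encounter record is an invented stand-in for the system's EHR data model.

from dataclasses import dataclass

@dataclass
class Encounter:
    patient_name: str
    patient_dob: str
    diagnosis_codes: list      # ICD codes recorded during the visit
    procedure_code: str        # CPT/HCPCS code for the service
    charge_usd: float

def fill_cms1500(enc: Encounter) -> dict:
    # Map captured clinical data onto the corresponding CMS-1500 fields.
    return {
        "2. Patient's Name": enc.patient_name,
        "3. Patient's Birth Date": enc.patient_dob,
        "21. Diagnosis or Nature of Illness": enc.diagnosis_codes,
        "24d. Procedures, Services, or Supplies": enc.procedure_code,
        "24f. Charges": f"{enc.charge_usd:.2f}",
    }

enc = Encounter("Doe, John", "1970-01-01", ["E11.9"], "99213", 120.0)
print(fill_cms1500(enc))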
76 FR 53454 - Privacy Act System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... statutory responsibilities of the OIG; and Acting as a repository and source for information necessary to... in matters relating to the statutory responsibilities of the OIG; and 7. Acting as a repository and.... Acting as a repository and source for information necessary to fulfill the reporting requirements of the...
ERIC Educational Resources Information Center
Hoorens, Stijn; van Dijk, Lidia Villalba; van Stolk, Christian
2009-01-01
This briefing paper captures the key findings and recommendations of a study commissioned by the Joint Information Systems Committee on aspects of the strategic commitment of institutions to repository sustainability. This project, labelled EMBRACE (EMBedding Repositories And Consortial Enhancement), is aimed at enhancing the functionality,…
Developing an Integrated Institutional Repository at Imperial College London
ERIC Educational Resources Information Center
Afshari, Fereshteh; Jones, Richard
2007-01-01
Purpose: This paper aims to demonstrate how a highly integrated approach to repository development and deployment can be beneficial in producing a successful archive. Design/methodology/approach: Imperial College London undertook a significant specifications process to gather and formalise requirements for its repository system. This was done…
SITE GENERATED RADIOLOGICAL WASTE HANDLING SYSTEM DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. C. Khamankar
2000-06-20
The Site Generated Radiological Waste Handling System handles radioactive waste products that are generated at the geologic repository operations area. The waste is collected, treated if required, packaged for shipment, and shipped to a disposal site. Waste streams include low-level waste (LLW) in solid and liquid forms, as well as mixed waste that contains hazardous and radioactive constituents. Liquid LLW is segregated into two streams, non-recyclable and recyclable. The non-recyclable stream may contain detergents or other non-hazardous cleaning agents and is packaged for shipment. The recyclable stream is treated to recycle a large portion of the water while the remaining concentrated waste is packaged for shipment; this greatly reduces the volume of waste requiring disposal. There will be no liquid LLW discharge. Solid LLW consists of wet solids such as ion exchange resins and filter cartridges, as well as dry active waste such as tools, protective clothing, and poly bags. Solids will be sorted, volume reduced, and packaged for shipment. The generation of mixed waste at the Monitored Geologic Repository (MGR) is not planned; however, if it does come into existence, it will be collected and packaged for disposal at its point of occurrence, temporarily staged, then shipped to government-approved off-site facilities for disposal. The Site Generated Radiological Waste Handling System has equipment located in both the Waste Treatment Building (WTB) and in the Waste Handling Building (WHB). All types of liquid and solid LLW are processed in the WTB, while wet solid waste from the Pool Water Treatment and Cooling System is packaged where received in the WHB. There is no installed hardware for mixed waste. The Site Generated Radiological Waste Handling System receives waste from locations where water is used for decontamination functions. In most cases the water is piped back to the WTB for processing. The WTB and WHB provide staging areas for storing and shipping LLW packages as well as any mixed waste packages. The buildings house the system and provide shielding and support for the components. The system is ventilated by and connects to the ventilation systems in the buildings to prevent buildup and confine airborne radioactivity via the high efficiency particulate air filters. The Monitored Geologic Repository Operations Monitoring and Control System will provide monitoring and supervisory control facilities for the system.
10 CFR 2.1004 - Amendments and additions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Waste at a Geologic Repository § 2.1004 Amendments and additions. Any document that has not been provided to other parties in electronic form must be identified in an electronic notice and made available... for the high-level waste proceeding. The time allowed under this paragraph will be stayed pending...
10 CFR 2.1004 - Amendments and additions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Waste at a Geologic Repository § 2.1004 Amendments and additions. Any document that has not been provided to other parties in electronic form must be identified in an electronic notice and made available... for the high-level waste proceeding. The time allowed under this paragraph will be stayed pending...
10 CFR 2.1004 - Amendments and additions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Waste at a Geologic Repository § 2.1004 Amendments and additions. Any document that has not been provided to other parties in electronic form must be identified in an electronic notice and made available... for the high-level waste proceeding. The time allowed under this paragraph will be stayed pending...
Supporting Social Awareness in Collaborative E-Learning
ERIC Educational Resources Information Center
Lambropoulos, Niki; Faulkner, Xristine; Culwin, Fintan
2012-01-01
In the last decade, we have seen the emergence of virtual learning environments. Initially, these environments were little more than document repositories that tutors used to unicast material to the students. Informed in part by social constructivist theories of education, later environments included capabilities for tutor-student and student-student,…
Utilizing the Antarctic Master Directory to find orphan datasets
NASA Astrophysics Data System (ADS)
Bonczkowski, J.; Carbotte, S. M.; Arko, R. A.; Grebas, S. K.
2011-12-01
While most Antarctic data are housed at an established disciplinary-specific data repository, there are data types for which no suitable repository exists. In some cases, these "orphan" data, without an appropriate national archive, are served from local servers by the principal investigators who produced the data. There are many pitfalls with data served privately, including the frequent lack of adequate documentation to ensure the data can be understood by others for re-use, and the impermanence of personal web sites. For example, if an investigator leaves an institution and the data moves, the published link is no longer accessible. To ensure continued availability of data, submission to long-term national data repositories is needed. As stated in the National Science Foundation Office of Polar Programs (NSF/OPP) Guidelines and Award Conditions for Scientific Data, investigators are obligated to submit their data for curation and long-term preservation; this includes the registration of a dataset description into the Antarctic Master Directory (AMD), http://gcmd.nasa.gov/Data/portals/amd/. The AMD is a Web-based, searchable directory of thousands of dataset descriptions, known as DIF records, submitted by scientists from over 20 countries. It serves as a node of the International Directory Network/Global Change Master Directory (IDN/GCMD). The US Antarctic Program Data Coordination Center (USAP-DCC), http://www.usap-data.org/, funded through NSF/OPP, was established in 2007 to help streamline the process of data submission and DIF record creation. When data do not quite fit within any existing disciplinary repository, they can be registered within the USAP-DCC as the fallback data repository. Within the scope of the USAP-DCC we undertook the challenge of discovering and "rescuing" orphan datasets currently registered within the AMD. In order to find which DIF records led to data served privately, all records relating to US data within the AMD were parsed. After identifying the records containing a URL leading to a national data center or other disciplinary data repository, we individually inspected the remaining records for data type, format, and quality of metadata, and then assessed how best to preserve them. Of the records reviewed, those for which appropriate repositories could be identified were submitted. An additional 35 were deemed acceptable in quality of metadata to register in the USAP-DCC. The content of these datasets was varied in nature, ranging from penguin counts to paleo-geologic maps to results of meteorological models, all of which are discoverable through our search interface, http://www.usap-data.org/search.php. The remaining 40 records linked to either no data or had inadequate documentation for preservation, highlighting the danger of serving datasets on local servers where minimal metadata standards cannot be enforced and long-term access cannot be ensured.
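The triage step described above (deciding which DIF records point at an established archive and which lead to privately served data) can be sketched as a URL classification pass. The repository domain list and record structure below are simplified placeholders.

from urllib.parse import urlparse

KNOWN_REPOSITORIES = {"usap-data.org", "ncei.noaa.gov", "pangaea.de"}

def classify(dif_records):
    archived, orphaned, broken = [], [], []
    for rec in dif_records:
        url = rec.get("data_url")
        if not url:
            broken.append(rec)
            continue
        host = urlparse(url).netloc.lower()
        # Treat sub-domains of a known archive as archived.
        if any(host == d or host.endswith("." + d) for d in KNOWN_REPOSITORIES):
            archived.append(rec)
        else:
            orphaned.append(rec)   # candidate for rescue/curation
    return archived, orphaned, broken

records = [
    {"entry_id": "PENGUIN_COUNTS", "data_url": "http://www.usap-data.org/dataset/12"},
    {"entry_id": "PALEO_MAPS", "data_url": "http://myuniv.edu/~smith/maps"},
    {"entry_id": "MET_MODEL", "data_url": None},
]
print([len(x) for x in classify(records)])   # [1, 1, 1]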
NASA Astrophysics Data System (ADS)
Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.
2017-12-01
Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as results of scientific output. Additionally, publications, conference contributions, reports, pictures etc. should also be managed. Both knowledge and data sharing are essential to create synergies. Within the coordinated programme 'Collaborative Research Center' (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, an RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) 'Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation' (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 'Earth - Evolution at the Dry Limit' (www.crc1211.de), funded since 2016. In both projects scientists from various disciplines collect heterogeneous data in field campaigns or by modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) has been designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211 needs (www.crc1211db.uni-koeln.de) in 2016. Both repositories support secure and sustainable data storage, backup, documentation, publication with DOIs, search, download, statistics as well as web mapping features. Moreover, RDM consulting and support services as well as training sessions are carried out regularly.
Protocols for Scholarly Communication
NASA Astrophysics Data System (ADS)
Pepe, A.; Yeomans, J.
2007-10-01
CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation and open access to scholarly material, CERN is implementing a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should guide authors naturally towards OA publication, and CERN wants to help reach a full open access publishing environment for the particle physics community and related sciences in the next few years.
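One of the value-adding services mentioned, reference extraction, can be illustrated with a toy regular-expression pass over preprint text for arXiv-style identifiers and journal citations. Real extraction pipelines are far more robust; the patterns here are illustrative assumptions only.

import re

TEXT = """... as shown earlier [1] J. Doe, Phys. Lett. B 123, 45 (1999),
and in the preprint hep-ph/0101001, later extended in arXiv:1001.2345."""

# New-style (arXiv:YYMM.NNNNN) and old-style (archive/YYMMNNN) identifiers.
ARXIV = re.compile(r"\b(?:arXiv:\d{4}\.\d{4,5}|[a-z\-]+(?:\.[A-Z]{2})?/\d{7})\b")
# Rough "Journal vol, page (year)" citation shape.
JOURNAL = re.compile(r"[A-Z][\w.\s]+?\b\d+,\s*\d+\s*\(\d{4}\)")

print(ARXIV.findall(TEXT))     # ['hep-ph/0101001', 'arXiv:1001.2345']
print(JOURNAL.findall(TEXT))   # ['Phys. Lett. B 123, 45 (1999)']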
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Arnold, Bill W.; Swift, Peter N.
2012-07-01
A deep borehole repository is one of the four geologic disposal system options currently under study by the U.S. DOE to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic deep borehole repository study is to develop the necessary modeling tools to evaluate and improve the understanding of the repository system response and processes relevant to long-term disposal of UNF and HLW in a deep borehole. A prototype performance assessment model for a generic deep borehole repository has been developed using the approach for a mined geological repository. The preliminary results from the simplified deep borehole generic repository performance assessment indicate that soluble, non-sorbing (or weakly sorbing) fission product radionuclides, such as I-129, Se-79 and Cl-36, are the likely major dose contributors, and that the annual radiation doses to hypothetical future humans associated with those releases may be extremely small. While much work needs to be done to validate the model assumptions and parameters, these preliminary results highlight the importance of a robust seal design in assuring long-term isolation, and suggest that deep boreholes may be a viable alternative to mined repositories for disposal of both HLW and UNF.
Proceedings of the 6th US/German Workshop on Salt Repository Research, Design, and Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Francis D.; Walter Steininger; Wilhelm Bollingerfehr
The 6th US/German Workshop on Salt Repository Research, Design, and Operation was held in Dresden, Germany, on September 7-9, 2015. Over seventy participants helped advance the technical basis for salt disposal of radioactive waste. The number of collaborative efforts continues to grow and to produce useful documentation, as well as to define the state of the art for research areas. These Proceedings are divided into Chapters, and a list of authors is included in the Acknowledgement Section. Also in this document are the Technical Agenda, List of Participants, Biographical Information, Abstracts, and Presentations. Proceedings of all workshops and other pertinent information are posted on websites hosted by Sandia National Laboratories and the Nuclear Energy Agency Salt Club. The US/German workshops provide continuity for long-term research, summarize and publish the status of mature areas, and develop appropriate research by consensus in a workshop environment. As before, major areas and findings are highlighted, which constitute topical Chapters in these Proceedings. In total, the scientific breadth is substantial and, while not all subject matter is elaborated into chapter format, all presentations and abstracts are published in this document. In the following Proceedings, six selected topics are developed in detail.
SIRSALE: integrated video database management tools
NASA Astrophysics Data System (ADS)
Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.
2002-07-01
Video databases became an active field of research during the last decade. The main objective in such systems is to provide users with capabilities to search, access and play back distributed stored video data in a user-friendly way, as they do for traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the contents of video data are very hard to extract automatically and need to be annotated manually. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad-hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structures (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotating interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. So, we present how dedicated active services allow optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news video and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.
Monte Carlo simulations for generic granite repository studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, Shaoping; Lee, Joon H; Wang, Yifeng
In a collaborative study between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL) for the DOE-NE Office of Fuel Cycle Technologies Used Fuel Disposition (UFD) Campaign project, we have conducted preliminary system-level analyses to support the development of a long-term strategy for geologic disposal of high-level radioactive waste. A general modeling framework consisting of a near-field and a far-field submodel for a granite GDSE was developed. A representative far-field transport model for a generic granite repository was merged with an integrated systems (GoldSim) near-field model. Integrated Monte Carlo model runs with the combined near- and far-field transport models were performed, and the parameter sensitivities were evaluated for the combined system. In addition, a subset of radionuclides that are potentially important to repository performance were identified and evaluated for a series of model runs. The analyses were conducted with different waste inventory scenarios. Analyses were also conducted for different repository radionuclide release scenarios. While the results to date are for a generic granite repository, the work establishes the method to be used in the future to provide guidance on the development of a strategy for long-term disposal of high-level radioactive waste in a granite repository.
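The Monte Carlo approach described above can be sketched in a few lines: sample uncertain transport parameters, push them through a simplified far-field response for a non-sorbing radionuclide such as I-129, and summarize the spread of outcomes. The parameter ranges and the decay-only response function are invented for illustration; the actual study couples GoldSim near-field and far-field transport models.

import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Sampled epistemic parameters (log-uniform ranges are placeholders).
velocity = 10 ** rng.uniform(-3, -1, N)     # groundwater velocity [m/yr]
path_len = rng.uniform(500.0, 5000.0, N)    # flow path length [m]
release = 10 ** rng.uniform(-6, -4, N)      # fractional release rate [1/yr]

half_life = 1.57e7                          # I-129 half-life [yr]
lam = np.log(2) / half_life
travel_time = path_len / velocity

# Simplified response: release attenuated only by decay in transit
# (I-129 treated as non-sorbing, consistent with the abstract).
peak_flux = release * np.exp(-lam * travel_time)

print(f"median peak flux proxy: {np.median(peak_flux):.2e} /yr")
print(f"95th percentile:        {np.percentile(peak_flux, 95):.2e} /yr")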
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-25
... File Wrapper System or the Supplemental Complex Repository for Examiners AGENCY: United States Patent... (IFW) or the Supplemental Complex Repository for Examiners (SCORE). The USPTO has considered the... Supplemental Complex Repository for Examiners, 76 FR 53667 (August 29, 2011), 1370 Off. Gaz. Pat. Office 211...
USDA-ARS?s Scientific Manuscript database
In the early 1980s the USDA-ARS established the National Plant Germplasm System (NPGS), a network of gene banks to preserve genetic resources of importance to national and international agriculture. The National Clonal Germplasm Repository (NCGR) in Miami, Florida is one of these repositories. This repo...
ERIC Educational Resources Information Center
Cullen, Rowena; Chawner, Brenda
2011-01-01
The Open Access movement of the past decade, and institutional repositories developed by universities and academic libraries as a part of that movement, have openly challenged the traditional scholarly communication system. This article examines the growth of repositories around the world, and summarizes a growing body of evidence of the response…
77 FR 22632 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-16
...--Repository (SCIDO-R)-VA'' (108VA11S) as set forth in the Federal Register 74 FR 11185-11186 dated March 16.... SUPPLEMENTARY INFORMATION: The Spinal Cord Injury and Disorders Outcomes--Repository (SCIDO-R) provides a registry of veterans with spinal cord injury and disorders (SCI&D). This repository contains pertinent...
History, Context, and Policies of a Learning Object Repository
ERIC Educational Resources Information Center
Simpson, Steven Marshall
2016-01-01
Learning object repositories, a form of digital libraries, are robust systems that provide educators new ways to search for educational resources, collaborate with peers, and provide instruction to students in unique and varied ways. This study examines a learning object repository created by a large suburban school district to increase teaching…
DOE Office of Scientific and Technical Information (OSTI.GOV)
W. L. Poe, Jr.; P.F. Wise
The U.S. Department of Energy (DOE) is preparing a proposal to construct, operate and monitor, and eventually close a repository at Yucca Mountain in Nye County, Nevada, for the geologic disposal of spent nuclear fuel (SNF) and high-level radioactive waste (HLW). As part of this effort, DOE has prepared a viability assessment and an assessment of potential consequences that may exist if the repository is not constructed. The assessment of potential consequences if the repository is not constructed assumes that all SNF and HLW would be left at the generator sites. These include 72 commercial generator sites (three commercial facility pairs--Salem and Hope Creek, Fitzpatrick and Nine Mile Point, and Dresden and Morris--would share common storage due to their close proximity to each other) and five DOE sites across the country. DOE analyzed the environmental consequences of the effects of the continued storage of these materials at these sites in a report titled Continued Storage Analysis Report (CSAR; Reference 1). The CSAR analysis includes a discussion of the degradation of these materials when exposed to the environment. This document describes the environmental parameters that influence the degradation analyzed in the CSAR. These include temperature, relative humidity, precipitation chemistry (pH and chemical composition), annual precipitation rates, annual number of rain-days, and annual freeze/thaw cycles. The document also tabulates weather conditions for each storage site, evaluates the degradation of concrete storage modules and vaults in different regions of the country, and provides a thermal analysis of commercial SNF in storage.
NASA Astrophysics Data System (ADS)
Maiwald, F.; Vietze, T.; Schneider, D.; Henze, F.; Münster, S.; Niebling, F.
2017-02-01
Historical photographs contain a high density of information and are of great importance as sources in humanities research. In addition to the semantic indexing of historical images based on metadata, it is also possible to reconstruct geometric information about the depicted objects or the camera position at the time of the recording by employing photogrammetric methods. The approach presented here investigates (semi-)automated photogrammetric reconstruction methods for heterogeneous collections of historical (city) photographs and photographic documentation for use in the humanities, urban research and the historical sciences. From a photogrammetric point of view, these images are mostly digitized photographs. For a photogrammetric evaluation, therefore, the characteristics of scanned analog images with mostly unknown camera geometry, missing or minimal object information and low radiometric and geometric resolution have to be considered. In addition, these photographs were not created specifically for documentation purposes, so the focus of these images is often not on the object to be evaluated. The image repositories must therefore be subjected to a preprocessing analysis of their photogrammetric usability. Investigations are carried out on the basis of a repository containing historical images of the Kronentor ("crown gate") of the Dresden Zwinger. The initial step was to assess the quality and condition of the available images, determining their appropriateness for generating three-dimensional point clouds from historical photos using a structure-from-motion (SfM) evaluation. Then, the generated point clouds were assessed by comparing them with current measurement data of the same object.
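As a concrete illustration of the preprocessing the authors describe, the sketch below (not the authors' code) shows the pairwise feature-matching step that typically precedes an SfM reconstruction, using OpenCV; the image file names are hypothetical stand-ins for digitized Kronentor photographs.

```python
# Minimal sketch of the pairwise feature matching that precedes an SfM
# reconstruction; a full pipeline (camera self-calibration, triangulation,
# bundle adjustment) is assumed to follow. File names are hypothetical.
import cv2

img1 = cv2.imread("kronentor_scan_01.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("kronentor_scan_02.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Lowe's ratio test rejects ambiguous correspondences, which are common
# in low-resolution digitized analog photographs.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} tentative correspondences found")
```

The number of surviving correspondences gives a first, crude indicator of whether a pair of scans is photogrammetrically usable at all.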
[Tissue repositories for research at Sheba Medical Center (SMC)].
Cohen, Yehudit; Barshack, Iris; Onn, Amir
2013-06-01
Cancer is the leading cause of death in both genders. Breakthroughs in the understanding of cancer biology, the identification of prognostic factors, and the development of new treatments are increasingly dependent on access to human cancer tissues with linked clinicopathological data. Access to human tumor samples and a large investment in translational research are needed to advance this research. The SMC tissue repositories provide researchers with biological materials, which are essential tools for cancer research. The SMC tissue repositories for research aim to collect, document and preserve human biospecimens from patients with cancerous diseases, in order to provide high-quality, well-annotated biospecimens for the growing demands of scientific research. Such repositories are partners in accelerating biomedical research and medical product development through clinical resources, so that the best options can be offered to patients. Following Institutional Review Board approval and the signing of an Informed Consent Form, tumor and tumor-free specimens are collected by a designated pathologist in the operating room, and only when the amount of tumor is sufficient, in excess of routine needs. Blood samples are collected prior to the procedure. Other types of specimens collected include ascites fluid, pleural effusion, tissues for Optimal Cutting Temperature (OCT) embedding, primary culture, etc. Demographic, clinical, pathological, and follow-up data are collected in a designated database. SMC has already established several organ- or disease-specific tissue repositories within different departments. The foundation of tissue repositories requires the concentrated effort of a multidisciplinary team composed of paramedical, medical and scientific professionals. Research projects using these specimens facilitate the development of 'targeted therapy', accelerate basic research aimed at clarifying molecular mechanisms involved in cancer, and support the development of novel diagnostic tools.
[Self-archiving of biomedical papers in open access repositories].
Abad-García, M Francisca; Melero, Remedios; Abadal, Ernest; González-Teruel, Aurora
2010-04-01
Open-access literature is digital, online, free of charge, and free of most copyright and licensing restrictions. Self-archiving, or the deposit of scholarly outputs in institutional repositories (the open-access "green route"), is increasingly present in the activities of the scientific community. Besides the benefits of open access for the visibility and dissemination of science, funding agencies increasingly require papers and other documents to be deposited in repositories. In the biomedical environment this is even more relevant because of the impact the scientific literature can have on public health. However, to make self-archiving feasible, authors should be aware of what it means and of the terms under which they are allowed to archive their works. Tools such as Sherpa/RoMEO and DULCINEA (directories of the copyright licences of scientific journals at different levels) help authors find out what rights they retain when they publish a paper and whether self-archiving is permitted. PubMed Central and its British and Canadian counterparts are the main thematic repositories for the biomedical fields. Spain has no repository of a similar nature, but most universities and the CSIC have already created their own institutional repositories. Greater visibility of research results, and the resulting earlier and more frequent citation, is one of the most frequently cited advantages of open access, but the removal of economic barriers to information access is also a benefit that breaks down borders between groups.
Using artificial intelligence to automate remittance processing.
Adams, W T; Snow, G M; Helmick, P M
1998-06-01
The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.
An Automated Acquisition System for Media Exploitation
2008-06-01
on the acquisition station, AcqMan will pull out the SHA256 image hash, and the device's model, serial number, and manufacturer. 2. Query the ADOMEX... Repository. Using the data collected above, AcqMan will query the ADOMEX repository. The ADOMEX repository will respond to the query with the SHA256s of... whose SHA256s do not match. The last category will be a list of images that the ADOMEX repository already has and that the acquisition station can
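The hash-then-compare workflow this snippet describes is straightforward to sketch: compute a SHA256 image hash and test it against hashes already known to a repository. In the Python sketch below the repository lookup is mocked with a set, since the ADOMEX interface is not publicly documented; file names and hash values are hypothetical.

```python
# Sketch of the hash-then-compare step: hash an acquired disk image in
# chunks (images are large) and check membership against known hashes.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical stand-in for the repository's response to a hash query.
known_hashes = {"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"}

digest = sha256_of("acquired_image.dd")
print("already in repository" if digest in known_hashes else "new image")
```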
New Features of the re3data Registry of Research Data Repositories
NASA Astrophysics Data System (ADS)
Elger, K.; Pampel, H.; Vierkant, P.; Witt, M.
2016-12-01
re3data is a registry of research data repositories that lists over 1,600 repositories from around the world, making it the largest and most comprehensive online catalog of data repositories on the web. The registry offers researchers, funding agencies, libraries and publishers a comprehensive overview of the heterogeneous landscape of data repositories. The repositories are described following the "Metadata Schema for the Description of Research Data Repositories". re3data summarises the properties of a repository into a user-friendly icon system, helping users easily identify an adequate repository for the storage of their datasets. The re3data entries are curated by an international, multi-disciplinary editorial board. An application programming interface (API) enables other information systems to list and fetch metadata for integration and interoperability. Funders like the European Commission (2015) and publishers like Springer Nature (2016) recommend the use of re3data.org in their policies. The original re3data project partners are the GFZ German Research Centre for Geosciences, the Humboldt-Universität zu Berlin, the Purdue University Libraries and the Karlsruhe Institute of Technology (KIT). Since 2015, re3data has been operated as a service of DataCite, a global non-profit organisation that provides persistent identifiers (DOIs) for research data. At the 2016 AGU Fall Meeting we will describe the current status of re3data, give an overview of the major developments and new features, and present our plans to increase the quality of the re3data entries.
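A minimal sketch of using the API mentioned above; the endpoint and XML element names shown reflect re3data's public API as commonly documented, but should be verified against the current documentation before use.

```python
# Sketch: fetch the repository list from the re3data API and print names.
# Endpoint and element names are assumptions to be checked against the docs.
import urllib.request
import xml.etree.ElementTree as ET

url = "https://www.re3data.org/api/v1/repositories"
with urllib.request.urlopen(url) as resp:
    root = ET.fromstring(resp.read())

# Each <repository> entry is assumed to carry an id, a DOI, and a name.
for repo in root.iter("repository"):
    print(repo.findtext("name"))
```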
48 CFR 227.7108 - Contractor data repositories.
Code of Federal Regulations, 2014 CFR
2014-10-01
... repositories. 227.7108 Section 227.7108 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in... such data; (3) When required by the contracting officer, deliver data to the Government on paper or in...
48 CFR 227.7108 - Contractor data repositories.
Code of Federal Regulations, 2013 CFR
2013-10-01
... repositories. 227.7108 Section 227.7108 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in... such data; (3) When required by the contracting officer, deliver data to the Government on paper or in...
48 CFR 227.7108 - Contractor data repositories.
Code of Federal Regulations, 2011 CFR
2011-10-01
... repositories. 227.7108 Section 227.7108 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in... such data; (3) When required by the contracting officer, deliver data to the Government on paper or in...
48 CFR 227.7108 - Contractor data repositories.
Code of Federal Regulations, 2012 CFR
2012-10-01
... repositories. 227.7108 Section 227.7108 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in... such data; (3) When required by the contracting officer, deliver data to the Government on paper or in...
The Tropical and Subtropical Germplasm Repositories of The National Germplasm System
USDA-ARS?s Scientific Manuscript database
Germplasm collections are viewed as a source of genetic diversity to support crop improvement, agricultural research, and germplasm conservation efforts. The United States Department of Agriculture's National Plant Germplasm System (NPGS) is responsible for administering plant genetic ...
Malaysian Education Index (MEI): An Online Indexing and Repository System
ERIC Educational Resources Information Center
Kabilan, Muhammad Kamarul; Ismail, Hairul Nizam; Yaakub, Rohizani; Yusof, Najeemah Mohd; Idros, Sharifah Noraidah Syed; Umar, Irfan Naufal; Arshad, Muhammad Rafie Mohd.; Idrus, Rosnah; Rahman, Habsah Abdul
2010-01-01
This "Project Sheet" describes an on-going project that is being carried out by a group of educational researchers, computer science researchers and librarians from Universiti Sains Malaysia, Penang. The Malaysian Education Index (MEI) has two main functions--(1) Online Indexing System, and (2) Online Repository System. In this brief…
Implementation of SAP Waste Management System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frost, M.L.; LaBorde, C.M.; Nichols, C.D.
2008-07-01
The Y-12 National Security Complex (Y-12) assumed responsibility for newly generated waste on October 1, 2005. To ensure effective management and accountability of newly generated waste, Y-12 has opted to utilize SAP, Y-12's Enterprise Resource Planning (ERP) tool, to track low-level radioactive waste (LLW), mixed waste (MW), hazardous waste, and non-regulated waste from generation through acceptance and disposal. SAP Waste will include the functionality of the current waste tracking system and integrate with the applicable modules of SAP already in use. The functionality of two legacy systems, the Generator Entry System (GES) and the Waste Information Tracking System (WITS), and peripheral spreadsheets, databases, and e-mail/fax communications will be replaced by SAP Waste. Fundamentally, SAP Waste will promote waste acceptance for certification and disposal, not storage. SAP Waste will provide a one-time data entry location where waste generators can enter waste container information, track the status of their waste, and maintain documentation. A benefit of the new system is that it will provide a single data repository where Y-12's Waste Management organization can establish waste profiles, verify and validate data, maintain inventory control utilizing hand-held data transfer devices, schedule and ship waste, manage project accounting, and report on waste handling activities. This single data repository will facilitate the production of detailed waste generation reports for use in forecasting and budgeting, provide the data for required regulatory reports, and generate metrics to evaluate the performance of the Waste Management organization and its subcontractors. SAP Waste will replace the outdated and expensive legacy system, establish tools the site needs to manage newly generated waste, and optimize the use of the site's ERP tool for integration with related business processes while promoting disposition of waste. (authors)
76 FR 57644 - Privacy Act of 1974; Implementation
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-16
... in DMDC 13, entitled ``Investigative Records Repository'', when investigatory material is compiled... exemptions. * * * * * (c) * * * (17) System identifier and name: DMDC 13, Investigative Records Repository...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huff, Kathryn D.
Component level and system level abstraction of detailed computational geologic repository models have resulted in four rapid computational models of hydrologic radionuclide transport at varying levels of detail. Those models are described, as is their implementation in Cyder, a software library of interchangeable radionuclide transport models appropriate for representing natural and engineered barrier components of generic geologic repository concepts. A proof-of-principle demonstration was also conducted in which these models were used to represent the natural and engineered barrier components of a repository concept in a reducing, homogeneous, generic geology. This base case demonstrates integration of the Cyder open source library with the Cyclus computational fuel cycle systems analysis platform to facilitate calculation of repository performance metrics with respect to fuel cycle choices. (authors)
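As a toy illustration (not Cyder code) of the kind of rapid, abstracted transport model the abstract describes, the sketch below computes the fraction of a radionuclide surviving plug-flow transit through a barrier under first-order decay; the parameter values are illustrative.

```python
# Toy model: plug-flow advection with first-order radioactive decay,
# giving the fraction of a nuclide surviving transit through a barrier.
import math

def surviving_fraction(length_m: float, velocity_m_per_yr: float,
                       half_life_yr: float) -> float:
    travel_time = length_m / velocity_m_per_yr   # residence time [yr]
    decay_const = math.log(2) / half_life_yr     # decay constant [1/yr]
    return math.exp(-decay_const * travel_time)

# Example: Tc-99 (half-life ~2.11e5 yr) crossing a 100 m barrier at 1 mm/yr.
print(surviving_fraction(100.0, 1e-3, 2.11e5))
```

Rapid models like this trade spatial detail for speed, which is what makes repository performance metrics tractable inside a fuel-cycle simulation loop.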
SeisCode: A seismological software repository for discovery and collaboration
NASA Astrophysics Data System (ADS)
Trabant, C.; Reyes, C. G.; Clark, A.; Karstens, R.
2012-12-01
SeisCode is a community repository for software used in seismological and related fields. The repository is intended to increase discoverability of such software and to provide a long-term home for software projects. Other places exist where seismological software may be found, but none meet the requirements necessary for an always-current, easy-to-search, well-documented, and citable resource for projects. Organizations such as IRIS, ORFEUS, and the USGS have websites with lists of available or contributed seismological software. Since the authors themselves often do not maintain these lists, the documentation often consists of a sentence or paragraph, and the available software may be outdated. Repositories such as GoogleCode and SourceForge, which are directly maintained by the authors, provide version control and issue tracking but do not provide a unified way of locating geophysical software scattered among countless unrelated projects. Additionally, projects are hosted at language-specific sites such as MathWorks and PyPI, in FTP directories, and in websites strewn across the Web. Search engines are only partially effective discovery tools, as the desired software is often hidden deep within the results. SeisCode provides software authors a place to present their software, codes, scripts, tutorials, and examples to the seismological community. Authors can choose their own level of involvement. At one end of the spectrum, the author might simply create a web page that points to an existing site. At the other extreme, an author may choose to leverage the many tools provided by SeisCode, such as a source code management tool with integrated issue tracking, forums, news feeds, downloads, wikis, and more. For software development projects with multiple authors, SeisCode can also be used as a central site for collaboration. SeisCode provides the community with an easy way to discover software, while providing authors a way to build a community around their software packages. IRIS invites the seismological community to browse and to submit projects to https://seiscode.iris.washington.edu/
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and it has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic; their names and brief overviews of their topics are given. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.
Characterize Eruptive Processes at Yucca Mountain, Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Krier
2004-10-04
The purpose of this scientific analysis report, ''Characterize Eruptive Processes at Yucca Mountain, Nevada'', is to present information about natural volcanic systems and the parameters that can be used to model their behavior. This information is used to develop parameter-value distributions appropriate for analysis of the consequences of volcanic eruptions through a repository at Yucca Mountain. This scientific analysis report provides information to four other reports: ''Number of Waste Packages Hit by Igneous Intrusion'' (BSC 2004 [DIRS 170001]); ''Atmospheric Dispersal and Deposition of Tephra from Potential Volcanic Eruption at Yucca Mountain, Nevada'' (BSC 2004 [DIRS 170026]); ''Dike/Drift Interactions'' (BSC 2004 [DIRS 170028]); and ''Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV'' (BSC 2004 [DIRS 170027], Section 6.5). This report is organized into seven major sections. This section addresses the purpose of this document. Section 2 addresses quality assurance, Section 3 the use of software, Section 4 identifies the requirements that constrain this work, and Section 5 lists assumptions and their rationale. Section 6 presents the details of the scientific analysis and Section 7 summarizes the conclusions reached.
An Assessment of a Science Discipline Archive Against ISO 16363
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Downs, R. R.
2016-12-01
The Planetary Data System (PDS) is a federation of science discipline nodes formed in response to the finding of the Committee on Data Management and Computing (CODMAC 1986) that a "wealth of science data would ultimately cease to be useful and probably lost if a process was not developed to ensure that the science data were properly archived." Starting operations in 1990, the stated mission of the PDS is to "facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data and documentation produced by or relevant to NASA's planetary missions, research programs, and data analysis programs." In 2008 the PDS initiated a transition to a more modern system based on key principles found in the Open Archival Information System (OAIS) Reference Model (ISO 14721), a set of functional requirements provided by the designated community, and about twenty years of lessons learned. With science digital data now being archived under the new PDS4, the PDS is a good use case to be assessed as a trusted repository against ISO 16363, a recommended practice for assessing the trustworthiness of digital repositories. This presentation will summarize the OAIS principles adopted for PDS4 and the findings of a desk assessment of the PDS against ISO 16363. Also presented will be specific items of evidence, for example the PDS mission statement above, and how they impact the level of certainty that the ISO 16363 metrics are being met.
Integration of HTML documents into an XML-based knowledge repository.
Roemer, Lorrie K; Rocha, Roberto A; Del Fiol, Guilherme
2005-01-01
The Emergency Patient Instruction Generator (EPIG) is an electronic content compiler/viewer/editor developed by Intermountain Health Care. The content is vendor-licensed HTML patient discharge instructions. This work describes the process by which discharge instructions were converted from ASCII-encoded HTML to XML, then loaded to a database for use by EPIG.
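A minimal sketch of the kind of conversion-and-load pipeline described, assuming "conversion to XML" means normalizing vendor HTML into well-formed markup before the database load; the file and table names are hypothetical, and lxml stands in for whatever parser EPIG actually used.

```python
# Sketch: normalize tag-soup HTML into well-formed XML, then load it
# into a database table. File, database, and table names are hypothetical.
import sqlite3
from lxml import html, etree  # lxml tolerates malformed vendor HTML

doc = html.parse("discharge_instruction.html")
xml_text = etree.tostring(doc, method="xml", encoding="unicode")

conn = sqlite3.connect("epig_content.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS instructions (id INTEGER PRIMARY KEY, body TEXT)"
)
conn.execute("INSERT INTO instructions (body) VALUES (?)", (xml_text,))
conn.commit()
```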
On-line remote monitoring of radioactive waste repositories
NASA Astrophysics Data System (ADS)
Calì, Claudio; Cosentino, Luigi; Litrico, Pietro; Pappalardo, Alfio; Scirè, Carlotta; Scirè, Sergio; Vecchio, Gianfranco; Finocchiaro, Paolo; Alfieri, Severino; Mariani, Annamaria
2014-12-01
A low-cost array of modular sensors for the online monitoring of radioactive waste was developed at INFN-LNS. We implemented a new kind of gamma counter, based on Silicon PhotoMultipliers and scintillating fibers, that behaves like an inexpensive scintillating Geiger-Muller counter. It can be placed, in the shape of a fine grid, around each single waste drum in a repository. Front-end electronics and an FPGA-based counting system were developed to handle the field data, also implementing data transmission, a graphical user interface and a data storage system. A test of four sensors in a real radwaste storage site was performed with promising results. Following the tests, an agreement was signed between INFN and Sogin for the joint development and installation of a prototype DMNR (Detector Mesh for Nuclear Repository) system inside the Garigliano radwaste repository in Sessa Aurunca (CE, Italy). This development is currently under way, with installation foreseen within 2014.
Indexing and retrieving DICOM data in disperse and unstructured archives.
Costa, Carlos; Freitas, Filipe; Pereira, Marco; Silva, Augusto; Oliveira, José L
2009-01-01
This paper proposes an indexing and retrieval solution to gather information from distributed DICOM documents, allowing searches and access to the virtual data repository through a Google-like process. Medical imaging modalities are becoming more powerful and less expensive, resulting in a proliferation of equipment acquisition by imaging centers, including small ones. With this dispersion of data, it is not easy to take advantage of all the information that can be retrieved from these studies. Furthermore, many of these small centers do not have requirements large enough to justify the acquisition of a traditional PACS. We present a peer-to-peer PACS platform to index and query DICOM files over a set of distributed repositories that are logically viewed as a single federated unit. The solution is based on a public-domain document-indexing engine and extends traditional PACS query and retrieval mechanisms. This proposal deals well with complex searching requirements, from a single desktop environment to distributed scenarios. The solution's performance and robustness were demonstrated in trials. The characteristics of the presented PACS platform make it particularly valuable for small institutions, including educational and research groups.
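A minimal sketch of the indexing side of such a platform: DICOM header fields are extracted with pydicom into one searchable "document" per file, with a plain dictionary standing in for the document-indexing engine the paper describes; the file path is hypothetical.

```python
# Sketch: extract searchable header fields from a DICOM file and run a
# crude keyword match over them. A real system would feed these documents
# to a full-text index rather than matching in memory.
import pydicom

ds = pydicom.dcmread("study_0001.dcm", stop_before_pixels=True)

document = {
    "PatientID": str(ds.get("PatientID", "")),
    "StudyDescription": str(ds.get("StudyDescription", "")),
    "Modality": str(ds.get("Modality", "")),
    "StudyDate": str(ds.get("StudyDate", "")),
}

def matches(doc: dict, keyword: str) -> bool:
    # A Google-like keyword query: hit if any field contains the term.
    return any(keyword.lower() in value.lower() for value in doc.values())

print(matches(document, "mr"))
```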
Developing Integrated Taxonomies for a Tiered Information Architecture
NASA Technical Reports Server (NTRS)
Dutra, Jayne E.
2006-01-01
This viewgraph presentation reviews the development of taxonomies for an information architecture. To assist people with information access and retrieval, including cross-repository searching, a system of nested taxonomies is being developed. Another facet of this developmental project is collecting and documenting attributes about people, to allow for several uses: access management, i.e., who are you and what can you see?; targeted content delivery, i.e., what content helps you get your work done?; workforce planning, i.e., what skill sets do you have that we can apply to work?; and IT services, i.e., how can we provision you with the proper IT services?
Lettuce germplasm collection in the National Plant Germplasm System
USDA-ARS?s Scientific Manuscript database
The National Plant Germplasm System (NPGS) holds more than half a million accessions of crop plants and their related species, which are coordinately assigned to four major Regional Plant Introduction Stations and an additional 21 crop-specific repositories. These Stations and repositories acquire, main...
The Frictionless Data Package: Data Containerization for Automated Scientific Workflows
NASA Astrophysics Data System (ADS)
Shepherd, A.; Fils, D.; Kinkade, D.; Saito, M. A.
2017-12-01
As cross-disciplinary geoscience research increasingly relies on machines to discover and access data, one of the critical questions facing data repositories is how data and supporting materials should be packaged for consumption. Traditionally, data repositories have relied on a human's involvement throughout discovery and access workflows. This human could assess fitness for purpose by reading loosely coupled, unstructured information from web pages and documentation. In attempts to shorten the time to science and to access data resources across many disciplines, the expectation that machines will mediate the process of discovery and access is challenging data repository infrastructure. The challenge is to find ways to deliver data and information that enable machines to make better decisions, by enabling them to understand the data and metadata of many data types. Additionally, once machines have recommended a data resource as relevant to an investigator's needs, the data resource should be easy to integrate into that investigator's toolkits for analysis and visualization. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) supports NSF-funded OCE and PLR investigators with their projects' data management needs. These needs involve a number of varying data types, some of which require multiple files with differing formats. Presently, BCO-DMO has described these data types, and the important relationships between each type's data files, through human-readable documentation on web pages. Machines directly accessing data files from BCO-DMO could overlook this documentation and misinterpret the data. Instead, BCO-DMO is exploring the idea of data containerization, or packaging data and related information for easier transport, interpretation, and use. In researching the landscape of data containerization, the Frictionless Data Package (http://frictionlessdata.io/) provides a number of valuable advantages over similar solutions. This presentation will focus on these advantages and on how the Frictionless Data Package addresses a number of real-world use cases for data discovery, access, analysis and visualization.
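Because a Data Package is, at bottom, just a datapackage.json descriptor placed alongside the data files, the idea can be sketched with the standard library alone; the dataset, file, and field names below are hypothetical BCO-DMO-style examples.

```python
# Sketch: write a datapackage.json descriptor so a machine can learn the
# dataset's files and schema without reading web-page documentation.
import json

descriptor = {
    "name": "cruise-bottle-data",          # hypothetical dataset name
    "profile": "tabular-data-package",
    "resources": [
        {
            "name": "bottle-samples",
            "path": "bottle_samples.csv",
            "schema": {
                "fields": [
                    {"name": "station", "type": "string"},
                    {"name": "depth_m", "type": "number"},
                    {"name": "nitrate_umol_kg", "type": "number"},
                ]
            },
        },
        {"name": "event-log", "path": "event_log.csv"},
    ],
}

with open("datapackage.json", "w") as f:
    json.dump(descriptor, f, indent=2)
```

A consumer that understands the Data Package convention can then open the container, discover both files, and type-check the tabular resource automatically.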
Nuclear waste disposal: Gambling on Yucca Mountain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ginsburg, S.
1995-05-01
This document describes the historical aspects of nuclear energy, nuclear weapons usage, and the development of the nuclear bureaucracy in the United States, and discusses the selection and siting of Yucca Mountain, Nevada for a federal nuclear waste repository. Litigation regarding the site selection, and the resulting battles in the political arena and in the Nevada State Legislature, are also presented. Alternative radioactive waste disposal options, risk assessments of the Yucca Mountain site, and logistics regarding the transportation and storage of nuclear waste are also discussed. This document also contains an extensive bibliography.
ERIC Educational Resources Information Center
Yalcinalp, Serpil; Emiroglu, Bulent
2012-01-01
Although many developments have been made in the design and development of learning object repositories (LORs), the efficient use of such systems is still questionable. Without realising the functional use of such systems or considering the involvement of their dynamic users, these systems would probably become obsolete. This study includes both…
Fifth NASA Goddard Conference on Mass Storage Systems and Technologies. Volume 2
NASA Technical Reports Server (NTRS)
Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)
1996-01-01
This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies held September 17 - 19, 1996, at the University of Maryland, University Conference Center in College Park, Maryland. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.
Benge, James; Beach, Thomas; Gladding, Connie; Maestas, Gail
2008-01-01
The Military Health System (MHS) deployed its electronic health record (EHR), AHLTA, to Military Treatment Facilities (MTFs) around the world. This paper focuses on the approach and barriers to using structured text in AHLTA to document care encounters and illustrates the direct correlation between the use of structured text and the achievement of expected benefits. AHLTA uses commercially available products, a health data dictionary and standardized medical terminology, enabling the capture of structured, computable data. With structured text stored in the AHLTA Clinical Data Repository (CDR), the MHS has seen a return on its EHR investment through improvements in the accuracy and completeness of coding and the documentation of care provided. Determining the aspects of documentation where structured text is most beneficial, as well as the degree of structure needed, has been a significant challenge. This paper describes how the economic value framework aligns the enterprise strategic objectives with the EHR investment features, performance metrics and expected benefits. The framework analyses focus on return-on-investment calculations, baseline assessment and post-implementation benefits validation. Cost avoidance, revenue enhancements and operational improvements, such as evidence-based medicine and medical surveillance, can be directly attributed to the use of structured text.
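A toy example of the return-on-investment arithmetic at the core of such an economic value framework; all figures are invented for illustration and do not come from the paper.

```python
# Toy ROI calculation: net benefit expressed as a fraction of cost.
def roi(total_benefits: float, total_costs: float) -> float:
    return (total_benefits - total_costs) / total_costs

benefits = 4_200_000.0  # e.g., coding accuracy gains plus cost avoidance ($)
costs = 3_000_000.0     # e.g., licenses, training, implementation ($)
print(f"ROI = {roi(benefits, costs):.0%}")  # -> ROI = 40%
```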
ERIC Educational Resources Information Center
Bates, Melanie; Loddington, Steve; Manuel, Sue; Oppenheim, Charles
2007-01-01
In the United Kingdom over the past few years there has been a dramatic growth of national and regional repositories to collect and disseminate resources related to teaching and learning. Most notable of these are the Joint Information Systems Committee's Online Repository for [Learning and Teaching] Materials as well as the Higher Education…
NASA Technical Reports Server (NTRS)
Merwarth, P., D.
1983-01-01
The Common Software Module Repository (CSMR) is a computerized library system with high product and service visibility to potential users. The online capabilities of the system allow both librarian and user to interact with the library. The librarian is responsible for maintaining information in the CSMR library. Users search the library to locate software modules that meet their current needs.
MO/DSD online information server and global information repository access
NASA Technical Reports Server (NTRS)
Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William
1994-01-01
Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.
FAIRDOMHub: a repository and collaboration environment for sharing systems biology research.
Wolstencroft, Katherine; Krebs, Olga; Snoep, Jacky L; Stanford, Natalie J; Bacall, Finn; Golebiewski, Martin; Kuzyakiv, Rostyk; Nguyen, Quyen; Owen, Stuart; Soiland-Reyes, Stian; Straszewski, Jakub; van Niekerk, David D; Williams, Alan R; Malmström, Lars; Rinn, Bernd; Müller, Wolfgang; Goble, Carole
2017-01-04
The FAIRDOMHub is a repository for publishing FAIR (Findable, Accessible, Interoperable and Reusable) Data, Operating procedures and Models (https://fairdomhub.org/) for the Systems Biology community. It is a web-accessible repository for storing and sharing systems biology research assets. It enables researchers to organize, share and publish data, models and protocols, interlink them in the context of the systems biology investigations that produced them, and to interrogate them via API interfaces. By using the FAIRDOMHub, researchers can achieve more effective exchange with geographically distributed collaborators during projects, ensure results are sustained and preserved and generate reproducible publications that adhere to the FAIR guiding principles of data stewardship. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
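A minimal sketch of programmatic access via the API interfaces mentioned above; the endpoint and media type follow the JSON:API conventions of the SEEK software underlying the FAIRDOMHub, but both should be checked against the current API documentation.

```python
# Sketch: list data files from the FAIRDOMHub's JSON:API-style interface.
# Endpoint and response shape are assumptions to verify against the docs.
import json
import urllib.request

req = urllib.request.Request(
    "https://fairdomhub.org/data_files",
    headers={"Accept": "application/vnd.api+json"},
)
with urllib.request.urlopen(req) as resp:
    listing = json.load(resp)

# JSON:API responses carry resources under a top-level "data" array.
for item in listing.get("data", []):
    print(item["id"], item.get("attributes", {}).get("title"))
```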
Architecture for the Interdisciplinary Earth Data Alliance
NASA Astrophysics Data System (ADS)
Richard, S. M.
2016-12-01
The Interdisciplinary Earth Data Alliance (IEDA) is leading an EarthCube (EC) Integrative Activity to develop a governance structure and technology framework that enables partner data systems to share technology, infrastructure, and practice for documenting, curating, and accessing heterogeneous geoscience data. The IEDA data facility provides capabilities in an extensible framework that enables domain-specific requirements for each partner system in the Alliance to be integrated into standardized cross-domain workflows. The shared technology infrastructure includes a data submission hub, a domain-agnostic file-based repository, an integrated Alliance catalog and a Data Browser for data discovery across all partner holdings, as well as services for registering identifiers for datasets (DOI) and samples (IGSN). The submission hub will be a platform that facilitates acquisition of cross-domain resource documentation and channels users into domain- and resource-specific workflows tailored for each partner community. We are exploring an event-based message bus architecture with a standardized plug-in interface for adding capabilities. This architecture builds on the EC CINERGI metadata pipeline as well as the message-based architecture of the SEAD project. Plug-in components will perform file introspection to match entities to a data type registry (extending EC Digital Crust and Research Data Alliance work) and will extract standardized keywords (using CINERGI components) as well as location, cruise, personnel and other metadata linkage information (building on GeoLink and existing IEDA partner components). The submission hub will feed submissions to appropriate partner repositories and service endpoints, targeted by domain and resource type, for distribution. The Alliance governance will adopt patterns (vocabularies, operations, resource types) for self-describing data services using standard HTTP protocol for simplified data access (building on EC GeoWS and other 'RESTful' approaches). Exposure of resource descriptions (datasets and service distributions) for harvesting by commercial search engines as well as geoscience-data-focused crawlers (like the EC B-Cube crawler) will increase the discoverability of IEDA resources with minimal effort by curators.
Software Tools Streamline Project Management
NASA Technical Reports Server (NTRS)
2009-01-01
Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query-Based Document Management (QBDM), is a tool that enables content or context searches, either simple or hierarchical, across a variety of databases. The system enables users to specify notification subscriptions where they associate "contexts of interest" and "events of interest" to one or more documents or collections of documents. Based on these subscriptions, users receive notification when the events of interest occur within the contexts of interest for the associated documents or collections. Users can also associate at least one notification time as part of the notification subscription, with at least one option for the time period of notifications.
The open research system: a web-based metadata and data repository for collaborative research
Charles M. Schweik; Alexander Stepanov; J. Morgan Grove
2005-01-01
Beginning in 1999, a web-based metadata and data repository we call the "open research system" (ORS) was designed and built to assist geographically distributed scientific research teams. The purpose of this innovation was to promote the open sharing of data within and across organizational lines and across geographic distances. As the use of the system...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bandt, G.; Spicher, G.; Steyer, St.
2008-07-01
Since the 1998 termination of LLW and ILW emplacement in the Morsleben repository (ERAM), Germany, the treatment, conditioning and documentation of radioactive waste products and packages have been continued on the basis of the waste acceptance requirements as of 1995, prepared for the Konrad repository near Salzgitter in Lower Saxony, Germany. The resulting waste products and packages are stored in interim storage facilities. Due to the Konrad license issued in 2002, the waste acceptance requirements have to be completed by additional requirements imposed by the licensing authority, e.g. for the declaration of chemical waste package constituents. Therefore, documentation of waste products and packages which have been checked by independent experts and are in part approved by the responsible authority (the Federal Office for Radiation Protection, BfS) will have to be checked again for fulfilment of the final waste acceptance requirements prior to disposal. In order to simplify these additional checks, databases are used to ensure easy access to all known facts about the waste packages. A short balance of the existing waste products and packages which are already checked and partly approved by BfS, as well as an overview of the established databases ensuring fast access to the known facts about the conditioning processes, is presented. (authors)
Integration of HTML Documents into an XML-Based Knowledge Repository
Roemer, Lorrie K; Rocha, Roberto A; Del Fiol, Guilherme
2005-01-01
The Emergency Patient Instruction Generator (EPIG) is an electronic content compiler/viewer/editor developed by Intermountain Health Care. The content is vendor-licensed HTML patient discharge instructions. This work describes the process by which discharge instructions were converted from ASCII-encoded HTML to XML, then loaded to a database for use by EPIG. PMID:16779384
Digital Authenticity and Integrity: Digital Cultural Heritage Documents as Research Resources
ERIC Educational Resources Information Center
Bradley; Rachael
2005-01-01
This article presents the results of a survey addressing methods of securing digital content and ensuring the content's authenticity and integrity, as well as the perceived importance of authenticity and integrity. The survey was sent to 40 digital repositories in the United States and Canada between June 30 and July 19, 2003. Twenty-two…
The SoRReL papers: Recent publications of the Software Reuse Repository Lab
NASA Technical Reports Server (NTRS)
Eichmann, David A. (Editor)
1992-01-01
Presented here is the entire text of some of the papers recently published by the Software Reuse Repository Lab (SoRReL). Some typical titles are as follows: Design of a Lattice-Based Faceted Classification System; A Hybrid Approach to Software Reuse Repository Retrieval; Selecting Reusable Components Using Algebraic Specifications; Neural Network-Based Retrieval from Reuse Repositories; and A Neural Net-Based Approach to Software Metrics.
Doing Your Science While You're in Orbit
NASA Astrophysics Data System (ADS)
Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.
2010-11-01
Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-clients and their supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user-configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) providing a prototype incorporating major instrument simulation packages, and (4) facilitating neutron science community access and collaboration. Secure Orbiter SOA authentication and authorization are achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts with 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated with the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) data repository NeXus file field/class name full-text search capabilities within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group defined and shared metadata for data repository files, and (e) user, group, repository, and web 2.0 based global positioning with additional service capabilities are currently available. The SNS-based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best-practice implementations is presented.
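Since NeXus files are HDF5 containers, the kind of field/class-name metadata harvest described above can be sketched with h5py: walk the file tree and record each node's path and NX_class attribute. The file name is hypothetical and this is not the Orbiter implementation.

```python
# Sketch: harvest (path, NX_class) metadata records from a NeXus/HDF5 file.
import h5py

records = []

def harvest(name, obj):
    # NX_class may be stored as bytes or str depending on the writer.
    nx_class = obj.attrs.get("NX_class", b"")
    if isinstance(nx_class, bytes):
        nx_class = nx_class.decode()
    records.append((name, nx_class))

with h5py.File("SNS_run_12345.nxs", "r") as f:
    f.visititems(harvest)  # visits every group and dataset exactly once

print(f"{len(records)} metadata records harvested")
```

Records harvested this way are what a full-text index (like the Google-style search described above) would ingest, one row per field/class name.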
Implementation of an Online Database for Chemical Propulsion Systems
NASA Technical Reports Server (NTRS)
Owen, David B., II; McRight, Patrick S.; Cardiff, Eric H.
2009-01-01
The Johns Hopkins University, Chemical Propulsion Information Analysis Center (CPIAC) has been working closely with NASA Goddard Space Flight Center (GSFC); NASA Marshall Space Flight Center (MSFC); the University of Alabama at Huntsville (UAH); The Johns Hopkins University, Applied Physics Laboratory (APL); and NASA Jet Propulsion Laboratory (JPL) to capture satellite and spacecraft propulsion system information for an online database tool. The Spacecraft Chemical Propulsion Database (SCPD) is a new online central repository containing general and detailed system and component information on a variety of spacecraft propulsion systems. This paper only uses data that have been approved for public release with unlimited distribution. The data, supporting documentation, and ability to produce reports on demand, enable a researcher using SCPD to compare spacecraft easily, generate information for trade studies and mass estimates, and learn from the experiences of others through what has already been done. This paper outlines the layout and advantages of SCPD, including a simple example application with a few chemical propulsion systems from various NASA spacecraft.
Waste retrieval sluicing system data acquisition system acceptance test report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevins, R.R.
1998-07-31
This document describes the test procedure for the Project W-320 Tank C-106 Sluicing Data Acquisition System (W-320 DAS). The Software Test portion will test items identified in the WRSS DAS System Description (SD), HNF-2115. Traceability to HNF-2115 will be via a reference that follows, in parentheses, after the test section title. The Field Test portion will test sensor operability, analog-to-digital conversion, and alarm setpoints for field instrumentation. The W-320 DAS supplies data to assist thermal modeling of tanks 241-C-106 and 241-AY-102. It is designed to be a central repository for information from sources that would otherwise have to be read, recorded, and integrated manually. Thus, completion of the DAS requires communication with several different data collection devices and output to usable PC data formats. This test procedure will demonstrate that the DAS functions as required by the project requirements stated in Section 3 of the W-320 DAS System Description, HNF-2115.
Samwald, Matthias; Lim, Ernest; Masiar, Peter; Marenco, Luis; Chen, Huajun; Morse, Thomas; Mutalik, Pradeep; Shepherd, Gordon; Miller, Perry; Cheung, Kei-Hoi
2009-01-01
The amount of biomedical data available in Semantic Web formats has been rapidly growing in recent years. While these formats are machine-friendly, user-friendly web interfaces allowing easy querying of these data are typically lacking. We present "Entrez Neuron", a pilot neuron-centric interface that allows for keyword-based queries against a coherent repository of OWL ontologies. These ontologies describe neuronal structures, physiology, mathematical models and microscopy images. The returned query results are organized hierarchically according to brain architecture. Where possible, the application makes use of entities from the Open Biomedical Ontologies (OBO) and the 'HCLS knowledgebase' developed by the W3C Interest Group for Health Care and Life Science. It makes use of the emerging RDFa standard to embed ontology fragments and semantic annotations within its HTML-based user interface. The application and underlying ontologies demonstrate how Semantic Web technologies can be used for information integration within a curated information repository and between curated information repositories. It also demonstrates how information integration can be accomplished on the client side, through simple copying and pasting of portions of documents that contain RDFa markup.
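A minimal illustration of what "embedding ontology fragments via RDFa" looks like in practice: semantic attributes woven into ordinary HTML so that copied fragments stay machine-readable. The vocabulary URI and terms below are illustrative placeholders, not the ontology terms Entrez Neuron actually uses.

```python
# Sketch: an HTML fragment carrying RDFa annotations for a neuron record.
# The ex: vocabulary is a hypothetical placeholder namespace.
NEURON_HTML = """
<div xmlns:ex="http://example.org/neuro#"
     about="http://example.org/neuro#PurkinjeCell" typeof="ex:Neuron">
  <span property="ex:label">Purkinje cell</span>
  located in <span property="ex:brainRegion">cerebellar cortex</span>
</div>
"""

# An RDFa-aware consumer can recover machine-readable triples from this
# fragment even after it is copied and pasted into another document,
# which is the client-side integration pattern the paper describes.
print(NEURON_HTML)
```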
Conservation and retrieval of information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, M.
This is a summary of the findings of a Nordic working group formed in 1990 and given the task of establishing a basis for a common Nordic view of the need for information conservation for nuclear waste repositories by investigating the following: (1) the type of information that should be conserved; (2) the form in which the information should be kept; (3) the quality of the information as regards both type and form; and (4) the problems of future retrieval of information, including retrieval after very long periods of time. High-level waste from nuclear power generation will remain radioactive for very long times, even though the major part of the radioactivity will have decayed within 1000 yr. Certain information about the waste must be kept for long time periods because future generations may, intentionally or inadvertently, come into contact with the radioactive waste. Current-day waste management would benefit from an early identification of documents to be part of an archive for radioactive waste repositories. The same reasoning is valid for repositories for other toxic wastes.
Gómez, Alberto; Nieto-Díaz, Manuel; Del Águila, Ángela; Arias, Enrique
2018-05-01
Transparency in science is increasingly a hot topic. Scientists are required to show not only results but also evidence of how they have achieved these results. In experimental studies of spinal cord injury, there are a number of standardized tests, such as the Basso-Beattie-Bresnahan locomotor rating scale for rats and Basso Mouse Scale for mice, which researchers use to study the pathophysiology of spinal cord injury and to evaluate the effects of experimental therapies. Although the standardized data from the Basso-Beattie-Bresnahan locomotor rating scale and the Basso Mouse Scale are particularly suited for storage and sharing in databases, systems of data acquisition and repositories are still lacking. To the best of our knowledge, both tests are usually conducted manually, with the data being recorded on a paper form, which may be documented with video recordings, before the data is transferred to a spreadsheet for analysis. The data thus obtained is used to compute global scores, which is the information that usually appears in publications, with a wealth of information being omitted. This information may be relevant to understand locomotion deficits or recovery, or even important aspects of the treatment effects. Therefore, this paper presents a mobile application to record and share Basso Mouse Scale tests, meeting the following criteria: i) user-friendly; ii) few hardware requirements (only a smartphone or tablet with a camera running under Android Operating System); and iii) based on open source software such as SQLite, XML, Java, Android Studio and Android SDK. The BAMOS app can be downloaded and installed from the Google Market repository and the app code is available at the GitHub repository. The BAMOS app demonstrates that mobile technology constitutes an opportunity to develop tools for aiding spinal cord injury scientists in recording and sharing experimental data. Copyright © 2018 Elsevier Ltd. All rights reserved.
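A hypothetical sketch of how a BMS test record might be stored locally with SQLite, the engine the app is built on; the schema is illustrative rather than the actual BAMOS schema (BMS scores range from 0 to 9, assessed per hind limb).

```python
# Sketch: a local SQLite store for BMS test records. Table and column
# names are illustrative, not the real BAMOS schema.
import sqlite3

conn = sqlite3.connect("bamos_local.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS bms_test (
        id INTEGER PRIMARY KEY,
        animal_id TEXT NOT NULL,
        test_date TEXT NOT NULL,  -- ISO 8601 date string
        left_score INTEGER CHECK (left_score BETWEEN 0 AND 9),
        right_score INTEGER CHECK (right_score BETWEEN 0 AND 9),
        video_path TEXT           -- optional link to the recording
    )
""")
conn.execute(
    "INSERT INTO bms_test (animal_id, test_date, left_score, right_score) "
    "VALUES (?, ?, ?, ?)",
    ("mouse-042", "2018-05-01", 4, 5),
)
conn.commit()
```

Storing the per-limb scores rather than only a global average is what makes the richer, normally unpublished detail shareable downstream.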
Geologic Framework Model Analysis Model Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, the qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the repository design. These downstream models include the hydrologic flow models and the radionuclide transport models. All the models and the repository design, in turn, will be incorporated into the Total System Performance Assessment (TSPA) of the potential radioactive waste repository block and vicinity to determine the suitability of Yucca Mountain as a host for the repository. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 2.
NASA Astrophysics Data System (ADS)
Rahman, Fuad; Tarnikova, Yuliya; Hartono, Rachmat; Alam, Hassan
2006-01-01
This paper presents a novel automatic web publishing solution, PageView (R). PageView (R) is a complete working solution for document processing and management. The principal aim of this tool is to allow workgroups to share, access and publish documents on-line on a regular basis. Consider, for example, a person working on some documents: the user will, in some fashion, organize the work either in a local directory or on a shared network drive. Now extend that concept to a workgroup. Within a workgroup, some users are working together on some documents, and they are saving them in a directory structure somewhere on a document repository. The next stage of this reasoning is a workgroup that wants to publish its documents routinely on-line. The members may be using different editing tools, different software, and different graphics tools, so the resultant documents may be in PDF, Microsoft Office (R), HTML, or Word Perfect format, to name a few. Ordinarily, the documents must first be converted to HTML, and a web designer must then work on the collection to make it available on-line. PageView (R) takes care of this whole process automatically, making the document workflow clean and easy to follow. PageView (R) Server publishes documents, complete with the directory structure, for online use. The documents are automatically converted to HTML and PDF so that users can view the content without downloading the original files, or having to download browser plug-ins. Once published, other users can access the documents as if they were accessing them from their local folders. The paper describes the complete working system and discusses possible applications within document management research.
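The core workflow, walking a shared directory tree and converting each document for web viewing, can be suggested in a few lines. The converter functions below are stubs standing in for whatever conversion engine a PageView-like system would use; none of this is PageView's actual code:

    from pathlib import Path

    # Stub converters standing in for a real HTML/PDF conversion engine.
    def to_html(path):  return path.with_suffix(".html")
    def to_pdf(path):   return path.with_suffix(".pdf")

    SOURCE_TYPES = {".doc", ".docx", ".wpd", ".pdf", ".htm", ".html"}

    def publish_tree(root):
        """Mirror a workgroup's directory structure as web-viewable files."""
        for path in Path(root).rglob("*"):
            if path.suffix.lower() in SOURCE_TYPES:
                html, pdf = to_html(path), to_pdf(path)
                print(f"publish {path} -> {html.name}, {pdf.name}")

    publish_tree("shared_drive/projects")  # hypothetical shared location

The point of the sketch is the automation boundary: the system, not a web designer, discovers documents and produces the viewable renditions.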
Natural geochemical analogues of the near field of high-level nuclear waste repositories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apps, J.A.
1995-09-01
United States practice has been to design high-level nuclear waste (HLW) geological repositories with waste densities sufficiently high that repository temperatures surrounding the waste will exceed 100 °C and could reach 250 °C. Basalt and devitrified vitroclastic tuff are among the host rocks considered for waste emplacement. Near-field repository thermal behavior and chemical alteration in such rocks is expected to be similar to that observed in many geothermal systems. Therefore, the predictive modeling required for performance assessment studies of the near field could be validated and calibrated using geothermal systems as natural analogues. Examples are given which demonstrate the need for refinement of the thermodynamic databases used in geochemical modeling of near-field natural analogues and the extent to which present models can predict conditions in geothermal fields.
Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems
ERIC Educational Resources Information Center
Mason, Robert T.
2011-01-01
An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…
10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 2 2014-01-01 2014-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...
10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 2 2013-01-01 2013-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...
10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 2 2011-01-01 2011-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...
10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Performance objectives for the geologic repository after permanent closure. 63.113 Section 63.113 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH... and an engineered barrier system. (b) The engineered barrier system must be designed so that, working...
SureChEMBL: a large-scale, chemically annotated patent document database.
Papadatos, George; Davies, Mark; Dedman, Nathan; Chambers, Jon; Gaulton, Anna; Siddle, James; Koks, Richard; Irvine, Sean A; Pettersson, Joe; Goncharoff, Nicko; Hersey, Anne; Overington, John P
2016-01-04
SureChEMBL is a publicly available large-scale resource containing compounds extracted from the full text, images and attachments of patent documents. The data are extracted from the patent literature according to an automated text and image-mining pipeline on a daily basis. SureChEMBL provides access to a previously unavailable, open and timely set of annotated compound-patent associations, complemented with sophisticated combined structure and keyword-based search capabilities against the compound repository and patent document corpus; given the wealth of knowledge hidden in patent documents, analysis of SureChEMBL data has immediate applications in drug discovery, medicinal chemistry and other commercial areas of chemical science. Currently, the database contains 17 million compounds extracted from 14 million patent documents. Access is available through a dedicated web-based interface and data downloads at: https://www.surechembl.org/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML database--Xindice.
Li, Feng; Li, Maoyu; Xiao, Zhiqiang; Zhang, Pengfei; Li, Jianling; Chen, Zhuchu
2006-01-11
Many proteomics initiatives require integration of all information, with uniform criteria, from collection of samples and data display to publication of experimental results. The integration and exchange of these data of different formats and structures poses a great challenge. XML technology presents a promise in handling this task due to its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, with marked geographic and racial differences in incidence. Although some cancer proteome databases now exist, there is still no NPC proteome database. The raw NPC proteome experiment data were captured into one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP and Xindice to provide access to the database via the Internet. On our website, two query methods, keyword query and click query, are provided to access the entries of the NPC proteome database. Our 2D/MS repository can be used to share the raw NPC proteomics data that are generated from gel-based proteomics experiments. The database, as well as the PHP source code for constructing users' own proteome repositories, can be accessed at http://www.xyproteomics.org/.
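To make the keyword-query idea concrete, here is a minimal sketch of matching entries in an XML proteome document. The element names are invented for illustration and are not the actual HUP-ML vocabulary; a production system would query Xindice through its XML:DB interface rather than search an in-memory tree:

    from xml.etree import ElementTree as ET

    # Toy stand-in for a proteome XML document; element names are invented.
    doc = ET.fromstring("""
    <proteome>
      <spot id="S1"><protein>Keratin 8</protein><mw>53700</mw></spot>
      <spot id="S2"><protein>Annexin A1</protein><mw>38700</mw></spot>
    </proteome>""")

    keyword = "annexin"
    for spot in doc.findall("spot"):
        name = spot.findtext("protein", default="")
        if keyword.lower() in name.lower():
            print(spot.get("id"), name, spot.findtext("mw"))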
CARRIER PREPARATION BUILDING MATERIALS HANDLING SYSTEM DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
E.F. Loros
2000-06-28
The Carrier Preparation Building Materials Handling System receives rail and truck shipping casks from the Carrier/Cask Transport System, and inspects and prepares the shipping casks for return to the Carrier/Cask Transport System. Carrier preparation operations for carriers/casks received at the surface repository include performing a radiation survey of the carrier and cask, removing/retracting the personnel barrier, measuring the cask temperature, removing/retracting the impact limiters, removing the cask tie-downs (if any), and installing the cask trunnions (if any). The shipping operations for carriers/casks leaving the surface repository include removing the cask trunnions (if any), installing the cask tie-downs (if any), installing the impact limiters, performing a radiation survey of the cask, and installing the personnel barrier. There are four parallel carrier/cask preparation lines installed in the Carrier Preparation Building with two preparation bays in each line, each of which can accommodate carrier/cask shipping and receiving. The lines are operated concurrently to handle the waste shipping throughputs and to allow system maintenance operations. One remotely operated overhead bridge crane and one remotely operated manipulator are provided for each pair of carrier/cask preparation lines servicing four preparation bays. Remotely operated support equipment includes a manipulator and tooling and fixtures for removing and installing personnel barriers, impact limiters, cask trunnions, and cask tie-downs. Remote handling equipment is designed to facilitate maintenance, dose reduction, and replacement of interchangeable components where appropriate. Semi-automatic, manual, and backup control methods support normal, abnormal, and recovery operations. Laydown areas and equipment are included as required for transportation system components (e.g., personnel barriers and impact limiters), fixtures, and tooling to support abnormal and recovery operations. The Carrier Preparation Building Materials Handling System interfaces with the Cask/Carrier Transport System to move the carriers to and from the system. The Carrier Preparation Building System houses the equipment and provides the facility, utility, safety, communications, and auxiliary systems supporting operations and protecting personnel.
Wilbanks, Bryan A; Geisz-Everson, Marjorie; Boust, Rebecca R
2016-09-01
Clinical documentation is a critical tool in supporting care provided to patients. Sound documentation provides a picture of clinical events that can be used to improve patient care. However, many other uses for clinical documentation are equally important. Such documentation informs clinical decision support tools, creates a legal record of patient care, assists in financial reimbursement of services, and serves as a repository for secondary data analysis. Conversely, poor documentation can impair patient safety and increase malpractice risk exposure by reflecting poor or inaccurate information that ultimately may guide patient care decisions. Through an examination of anesthesia-related closed claims, a descriptive qualitative study emerged, which explored the antecedents and consequences of documentation quality in the claims reviewed. A secondary data analysis utilized a database generated by the American Association of Nurse Anesthetists Foundation closed claim review team. Four major themes emerged from the analysis. Themes 1, 2, and 4 primarily describe how poor documentation quality can have negative consequences for clinicians. The third theme primarily describes how poor documentation quality can negatively affect patient safety.
A Computational Workflow for the Automated Generation of Models of Genetic Designs.
Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil
2018-06-05
Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
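The first step of the workflow, pulling an SBOL description of a design from a SynBioHub instance, reduces to an HTTP request. The URL shape below is an assumption based on public SynBioHub conventions, not a detail taken from the paper:

    import requests

    # Sketch of fetching an SBOL document from SynBioHub. The /sbol suffix
    # on a part URI is an assumption about the REST interface; verify
    # against the SynBioHub documentation before relying on it.
    uri = "https://synbiohub.org/public/igem/BBa_B0034/1/sbol"
    resp = requests.get(uri, headers={"Accept": "application/rdf+xml"})
    resp.raise_for_status()
    with open("BBa_B0034.xml", "wb") as fh:
        fh.write(resp.content)
    # In the paper's workflow, a tool such as iBioSim would then enrich
    # and convert this SBOL description into an SBML model for simulation.
    print(len(resp.content), "bytes of SBOL RDF/XML retrieved")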
Adapting a Clinical Data Repository to ICD-10-CM through the use of a Terminology Repository
Cimino, James J.; Remennick, Lyubov
2014-01-01
Clinical data repositories frequently contain patient diagnoses coded with the International Classification of Diseases, Ninth Revision (ICD-9-CM). These repositories now need to accommodate data coded with the Tenth Revision (ICD-10-CM). Database users wish to retrieve relevant data regardless of the system by which they are coded. We demonstrate how a terminology repository (the Research Entities Dictionary or RED) serves as an ontology relating terms of both ICD versions to each other to support seamless version-independent retrieval from the Biomedical Translational Research Information System (BTRIS) at the National Institutes of Health. We make use of the Centers for Medicare and Medicaid Services’ General Equivalence Mappings (GEMs) to reduce the modeling effort required to determine whether ICD-10-CM terms should be added to the RED as new concepts or as synonyms of existing concepts. A divide-and-conquer approach is used to develop integration heuristics that offer a satisfactory interim solution and facilitate additional refinement of the integration as time and resources allow. PMID:25954344
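For readers unfamiliar with the GEMs, the core data structure is a many-to-many code mapping. The sketch below assumes a simplified two-column layout with invented sample rows; real GEMs files also carry approximate, no-map, and combination flags that the integration heuristics must interpret:

    from collections import defaultdict

    # Illustrative GEMs-style rows: invented (icd9, icd10) pairs.
    gems_rows = [
        ("0030", "A020"),
        ("25000", "E119"),   # one ICD-9 code mapping to...
        ("25000", "E139"),   # ...two candidate ICD-10 codes
    ]

    forward_map = defaultdict(list)
    for icd9, icd10 in gems_rows:
        forward_map[icd9].append(icd10)

    # One-to-many rows are why modeling effort is needed: a single ICD-9
    # concept may correspond to several ICD-10 concepts, or merely to
    # synonyms of an existing concept.
    print(forward_map["25000"])   # ['E119', 'E139']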
Burn Incidence and Treatment in the U.S.
... state health data systems, and the National Burn Repository (NBR) of the American Burn Association (ABA). ABA ... Burn Admissions to Burn Centers (ABA National Burn Repository 2015) Survival Rate: 96.8% Gender: 68% Male, ...
Core Certification of Data Repositories: Trustworthiness and Long-Term Stewardship
NASA Astrophysics Data System (ADS)
de Sherbinin, A. M.; Mokrane, M.; Hugo, W.; Sorvari, S.; Harrison, S.
2017-12-01
Scientific integrity and norms dictate that data created and used by scientists should be managed, curated, and archived in trustworthy data repositories, thus ensuring that science is verifiable and reproducible while preserving the initial investment in collecting data. Research stakeholders including researchers, science funders, librarians, and publishers must also be able to establish the trustworthiness of the data repositories they use, to confirm that the data they submit and use remain useful and meaningful in the long term. Data repositories are increasingly recognized as a key element of the global research infrastructure, and establishing their trustworthiness is a prerequisite for efficient scientific research and data sharing. The Core Trustworthy Data Repository Requirements are a set of universal requirements for certification of data repositories at the core level (see: https://goo.gl/PYsygW). They were developed by the ICSU World Data System (WDS: www.icsu-wds.org) and the Data Seal of Approval (DSA: www.datasealofapproval.org), the two authoritative organizations responsible for developing and implementing this standard, which will be further developed under the CoreTrustSeal branding. CoreTrustSeal certification of data repositories involves a minimally intensive process whereby repositories supply evidence that they are sustainable and trustworthy. Repositories conduct a self-assessment which is then reviewed by community peers. Based on this review, CoreTrustSeal certification is granted by the CoreTrustSeal Standards and Certification Board. Certification helps data communities (producers, repositories, and consumers) to improve the quality and transparency of their processes, and to increase awareness of and compliance with established standards. This presentation will introduce the CoreTrustSeal certification requirements for repositories and offer an opportunity to discuss ways to improve the contribution of certified data repositories to sustaining open data for open scientific research.
An e-consent-based shared EHR system architecture for integrated healthcare networks.
Bergmann, Joachim; Bott, Oliver J; Pretschner, Dietrich P; Haux, Reinhold
2007-01-01
Virtual integration of distributed patient data promises advantages over a consolidated health record, but raises questions mainly about practicability and authorization concepts. Our work aims at the specification and development of a virtual shared health record architecture using a patient-centred integration and authorization model. A literature survey summarizes considerations of current architectural approaches. Complemented by a methodical analysis in two regional settings, a formal architecture model was specified and implemented. The results presented in this paper are a survey of architectural approaches for shared health records and an architecture model for a virtual shared EHR, which combines a patient-centred integration policy with provider-oriented document management. An electronic consent system assures that access to the shared record remains under the control of the patient. A corresponding system prototype has been developed and is currently being introduced and evaluated in a regional setting. The proposed architecture is capable of partly replacing message-based communications. Operating highly available provider repositories for the virtual shared EHR requires advanced technology and probably means additional costs for care providers. Acceptance of the proposed architecture depends on transparently embedding document validation and digital signature into the work processes. The paradigm shift from paper-based messaging to a "pull model" needs further evaluation.
Multimedia Health Records: user-centered design approach for a multimedia uploading service.
Plazzotta, Fernando; Mayan, John C; Storani, Fernando D; Ortiz, Juan M; Lopez, Gastón E; Gimenez, Gastón M; Luna, Daniel R
2015-01-01
Multimedia elements add value to text documents by transmitting information that is difficult to express in words. In healthcare, many professionals and services keep these elements in their own repositories. This fragments the information into silos and hinders access by other healthcare professionals. Patients, in turn, hold clinical data of their own, in different formats and generated in different healthcare organizations, that is not accessible to professionals within our healthcare network. This paper describes the design, development and implementation processes of a service that allows media elements to be loaded into a patient clinical data repository (CDR), either by professionals through an electronic health record (EHR) or by patients through a personal health record (PHR), in order to avoid fragmentation of the information.
Understand your Algorithm: Drill Down to Sample Visualizations in Jupyter Notebooks
NASA Astrophysics Data System (ADS)
Mapes, B. E.; Ho, Y.; Cheedela, S. K.; McWhirter, J.
2017-12-01
Statistics are the currency of climate dynamics, but the space of all possible algorithms is fathomless - especially for 4-dimensional weather-resolving data that many "impact" variables depend on. Algorithms are designed on data samples, but how do you know if they measure what you expect when turned loose on Big Data? We will introduce the year-1 prototype of a 3-year scientist-led, NSF-supported, Unidata-quality software stack called DRILSDOWN (https://brianmapes.github.io/EarthCube-DRILSDOWN/) for automatically extracting, integrating, and visualizing multivariate 4D data samples. Based on a customizable "IDV bundle" of data sources, fields and displays supplied by the user, the system will teleport its space-time coordinates to fetch Cases of Interest (edge cases, typical cases, etc.) from large aggregated repositories. These standard displays can serve as backdrops to overlay with your value-added fields (such as derived quantities stored on a user's local disk). Fields can be readily pulled out of the visualization object for further processing in Python. The hope is that algorithms successfully tested in this visualization space will then be lifted out and added to automatic processing toolchains, lending confidence in the next round of processing, to seek the next Cases of Interest, in light of a user's statistical measures of "Interest". To log the scientific work done in this vein, the visualizations are wrapped in IPython-based Jupyter notebooks for rich, human-readable documentation (indeed, quasi-publication with formatted text, LaTeX math, etc.). Such notebooks are readable and executable, with digital replicability and provenance built in. The entire digital object of a case study can be stored in a repository, where libraries of these Case Study Notebooks can be examined in a browser. Model data (the session topic) are of course especially convenient for this system, but observations of all sorts can also be brought in, overlain, and differenced or otherwise co-processed. The system is available in various tiers, from minimal-install GUI visualizations only, to GUI+Notebook system, to the full system with the repository software. We seek interested users, initially in a "beta tester" mode with the goodwill to offer reports and requests to help drive improvements in project years 2 and 3.
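The case-extraction step can be illustrated generically. The sketch below is not the DRILSDOWN or IDV API; it uses the xarray library against a hypothetical OPeNDAP aggregation, with invented coordinate and variable names, to show what "teleporting" to a space-time Case of Interest amounts to:

    import xarray as xr

    # Generic sketch (not the DRILSDOWN API): extract a space-time "case
    # of interest" from an aggregated repository exposed via OPeNDAP.
    # The URL, coordinate names, and variable name are hypothetical.
    url = "https://example.org/thredds/dodsC/reanalysis/aggregated"
    ds = xr.open_dataset(url)

    case = (ds.sel(time="2017-08-25T12:00", method="nearest")
              .sel(lat=slice(20, 35), lon=slice(-100, -80)))

    # The extracted sample can now be overlaid with user-derived fields,
    # differenced, or logged in a Jupyter notebook for provenance.
    print(case["air_temperature"].mean().values)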
DServO: A Peer-to-Peer-based Approach to Biomedical Ontology Repositories.
Mambone, Zakaria; Savadogo, Mahamadi; Some, Borlli Michel Jonas; Diallo, Gayo
2015-01-01
We present in this poster an extension of the ServO ontology server system, which adopts a decentralized peer-to-peer approach to managing multiple heterogeneous knowledge organization systems. It relies on the JXTA protocol coupled with information retrieval techniques to provide a decentralized infrastructure for managing multiple instances of ontology repositories.
Social Influences on User Behavior in Group Information Repositories
ERIC Educational Resources Information Center
Rader, Emilee Jeanne
2009-01-01
Group information repositories are systems for organizing and sharing files kept in a central location that all group members can access. These systems are often assumed to be tools for storage and control of files and their metadata, not tools for communication. The purpose of this research is to better understand user behavior in group…
Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
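Once records are normalized to a common key, the sorting, comparison and aggregation the study's computer program performs reduce to set operations. A minimal sketch, with invented identifiers standing in for matched bibliographic records:

    # Each system's holdings as a set of normalized record keys
    # (the DOIs here are invented placeholders).
    systems = {
        "GoogleScholar": {"10.1/a", "10.1/b", "10.1/c"},
        "arXiv":         {"10.1/b", "10.1/c"},
        "ADS":           {"10.1/a", "10.1/c", "10.1/d"},
    }

    union = set().union(*systems.values())
    for name, records in systems.items():
        completeness = len(records) / len(union)
        print(f"{name}: {completeness:.0%} of the union of {len(union)} records")

    overlap = set.intersection(*systems.values())
    print("in all systems:", sorted(overlap))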
State Assessment Program Item Banks: Model Language for Request for Proposals (RFP) and Contracts
ERIC Educational Resources Information Center
Swanson, Leonard C.
2010-01-01
This document provides recommendations for request for proposal (RFP) and contract language that state education agencies can use to specify their requirements for access to test item banks. An item bank is a repository for test items and data about those items. Item banks are used by state agency staff to view items and associated data; to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaller, A.; Skanata, D.
1995-12-31
The site selection approach for a radioactive waste disposal facility, which is under way in Croatia, is presented in the paper. This approach is based on the application of relevant terrestrial and technical criteria in the site selection process. The basic documentation used for this purpose comprises regional planning documents prepared by the Regional Planning Institute of Croatia. The basic result of the research described in the paper is the proposal of several potential areas which are suitable for siting a radioactive waste repository. All relevant conclusions are based on both groups of data: generic and field-measured. Out of a dozen potential areas, four have been chosen as representative by the authors. The presented comparative analysis was made by means of the VISA II computer code, developed by V. Belton and SPV Software Products. The code was donated to the APO by the IAEA. The main objective of the paper is to initiate and facilitate further discussions on possible ways of evaluating and comparing potential areas for siting a radioactive waste repository in this country, as well as to provide additional contributions to the current site selection process in the Republic of Croatia.
Harvesting NASA's Common Metadata Repository (CMR)
NASA Technical Reports Server (NTRS)
Shum, Dana; Durbin, Chris; Norton, James; Mitchell, Andrew
2017-01-01
As part of NASA's Earth Observing System Data and Information System (EOSDIS), the Common Metadata Repository (CMR) stores metadata for over 30,000 datasets from both NASA and international providers along with over 300M granules. This metadata enables sub-second discovery and facilitates data access. While the CMR offers a robust temporal, spatial and keyword search functionality to the general public and international community, it is sometimes more desirable for international partners to harvest the CMR metadata and merge the CMR metadata into a partner's existing metadata repository. This poster will focus on best practices to follow when harvesting CMR metadata to ensure that any changes made to the CMR can also be updated in a partner's own repository. Additionally, since each partner has distinct metadata formats they are able to consume, the best practices will also include guidance on retrieving the metadata in the desired metadata format using CMR's Unified Metadata Model translation software.
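One of the best practices the poster implies, harvesting only records changed since the last run, can be sketched against the public CMR search API. The endpoint and parameters below reflect our understanding of that API (and of the umm_json response shape) and should be verified against current CMR documentation:

    import requests

    # Hedged sketch of incremental CMR harvesting in UMM JSON format.
    CMR = "https://cmr.earthdata.nasa.gov/search/collections.umm_json"
    params = {"updated_since": "2017-01-01T00:00:00Z",
              "page_size": 100, "page_num": 1}

    harvested = []
    while True:
        resp = requests.get(CMR, params=params)
        resp.raise_for_status()
        items = resp.json().get("items", [])
        if not items:
            break              # no more changed records
        harvested.extend(items)
        params["page_num"] += 1

    print(len(harvested), "updated collection records harvested")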
SemanticOrganizer: A Customizable Semantic Repository for Distributed NASA Project Teams
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Berrios, Daniel C.; Carvalho, Robert E.; Hall, David R.; Rich, Stephen J.; Sturken, Ian B.; Swanson, Keith J.; Wolfe, Shawn R.
2004-01-01
SemanticOrganizer is a collaborative knowledge management system designed to support distributed NASA projects, including diverse teams of scientists, engineers, and accident investigators. The system provides a customizable, semantically structured information repository that stores work products relevant to multiple projects of differing types. SemanticOrganizer is one of the earliest and largest semantic web applications deployed at NASA to date, and has been used in diverse contexts ranging from the investigation of Space Shuttle Columbia's accident to the search for life on other planets. Although the underlying repository employs a single unified ontology, access control and ontology customization mechanisms make the repository contents appear different for each project team. This paper describes SemanticOrganizer, its customization facilities, and a sampling of its applications. The paper also summarizes some key lessons learned from building and fielding a successful semantic web application across a wide-ranging set of domains with diverse users.
Army Hearing Program Talking Points Calendar Year 2015
2016-12-14
outside the range of normal hearing sensitivity (greater than 25 dB), CY15 data. Data: DOEHRS-HC Data Repository, Soldiers who had a DD2215 or... Data: Defense Occupational and Environmental Health Readiness System-Hearing Conservation (DOEHRS-HC) Data Repository, CY15, Army Profile... Soldiers have a hearing loss that required a fit-for-duty (Readiness) evaluation: An H-3 Hearing Profile. Data: DOEHRS-HC Data Repository
Engineered Barrier System: Physical and Chemical Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. Dixon
2004-04-26
The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.
Wei, Wei; Ji, Zhanglong; He, Yupeng; Zhang, Kai; Ha, Yuanchi; Li, Qi; Ohno-Machado, Lucila
2018-01-01
The number and diversity of biomedical datasets grew rapidly in the last decade. A large number of datasets are stored in various repositories, with different formats. Existing dataset retrieval systems lack the capability of cross-repository search. As a result, users spend time searching datasets in known repositories, and they typically do not find new repositories. The biomedical and healthcare data discovery index ecosystem (bioCADDIE) team organized a challenge to solicit new indexing and searching strategies for retrieving biomedical datasets across repositories. We describe the work of one team that built a retrieval pipeline and examined its performance. The pipeline used online resources to supplement dataset metadata, automatically generated queries from users’ free-text questions, produced high-quality retrieval results and achieved the highest inferred Normalized Discounted Cumulative Gain among competitors. The results showed that it is a promising solution for cross-database, cross-domain and cross-repository biomedical dataset retrieval. Database URL: https://github.com/w2wei/dataset_retrieval_pipeline PMID:29688374
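For reference, the standard (non-inferred) form of the evaluation metric is easy to state in code. The challenge scored an inferred NDCG variant over incomplete relevance judgments, which this minimal sketch does not reproduce:

    import math

    # Standard NDCG@k over a ranked list of graded relevance judgments.
    def dcg(rels):
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels))

    def ndcg(rels, k=10):
        rels = rels[:k]
        ideal = dcg(sorted(rels, reverse=True))
        return dcg(rels) / ideal if ideal > 0 else 0.0

    # Graded judgments (2 = highly relevant, 1 = partly, 0 = not) for the
    # top results of one hypothetical query.
    print(ndcg([2, 0, 1, 2, 0]))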
Repository contributions to Rubus research
USDA-ARS?s Scientific Manuscript database
The USDA National Plant Germplasm System is a nation-wide source for global genetic resources. The National Clonal Germplasm Repository (NCGR) in Corvallis, OR, maintains crops and crop wild relatives for the Willamette Valley including pear, raspberry and blackberry, strawberry, blueberry, gooseber...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appavoo, Jonathan
Exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today's machines. Systems software for exascale machines must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. The FOX project explored systems software and runtime support for a new approach to the data and work distribution for fault oblivious application execution. Our major OS work at Boston University focused on developing a new light-weight operating systems model that provides an appropriate context for both multi-core and multi-node application development. This work is discussed in section 1. Early on in the FOX project BU developed infrastructure for prototyping dynamic HPC environments in which the sets of nodes that an application is run on can be dynamically grown or shrunk. This work was an extension of the Kittyhawk project and is discussed in section 2. Section 3 documents the publications and software repositories that we have produced. To put our work in context of the complete FOX project contribution we include in section 4 an extended version of a paper that documents the complete work of the FOX team.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-01-01
The US Department of Energy (DOE) Yucca Mountain Site Characterization Project Office (YMPO) assigned Science Applications International Corporation (SAIC), the Technical and Management Support Services (T&MSS) contractor to the YMPO, the task of conducting an Early Site Suitability Evaluation (ESSE) of the Yucca Mountain site as a potential site for a high-level radioactive waste repository. First, the assignment called for the development of a method to evaluate a single site against the DOE General Guidelines for Recommendation of Sites for Nuclear Waste Repositories, 10 CFR Part 960. Then, using this method, an evaluation team, the ESSE Core Team, of senior YMP scientists, engineers, and technical experts, evaluated new information obtained about the site since publication of the final Environmental Assessment (DOE, 1986) to determine if new suitability/unsuitability findings could be recommended. Finally, the Core Team identified further information and analyses needed to make final determinations for each of the guidelines. As part of the task, an independent peer review of the ESSE report has been conducted. Expertise was solicited that covered the entire spectrum of siting guidelines in 10 CFR Part 960 in order to provide a complete, in-depth critical review of the data evaluated and cited in the ESSE report, the methods used to evaluate the data, and the conclusions and recommendations offered by the report. Fourteen nationally recognized technical experts (Table 2) served on the Peer Review Panel. The comments from the Panel and the responses prepared by the ESSE Core Team, documented on formal Comment Response Forms, constitute the body of this document.
NASA Technical Reports Server (NTRS)
Khayat, Mohammad G.; Kempler, Steven J.
2015-01-01
As geospatial missions age, one of the challenges for the usability of data is the availability of relevant and updated metadata with sufficient documentation that can be used by future generations of users to gain knowledge from the original data. Given that remote sensing data undergo many intermediate processing steps, an understanding of the exact algorithms employed and the quality of the data produced could be key considerations for these users. As interest in global climate data increases, documentation about older data, their origins, and their provenance is valuable to first-time users attempting to perform historical climate research or comparative analysis of global change. Incomplete or missing documentation could be what stands in the way of a new researcher attempting to use the data. Therefore, preservation of documentation and related metadata is sometimes just as critical as the preservation of the original observational data. The Goddard Earth Sciences - Data and Information Service Center (GES DISC), a NASA Earth science Distributed Active Archive Center (DAAC) that falls under the management structure of the Earth Science Data and Information System (ESDIS), is actively pursuing the preservation of all necessary artifacts needed by future users. In this paper we detail the data custodial planning and the data lifecycle process developed for content preservation, our implementation of a Preservation System to safeguard documents and associated artifacts from legacy (older) missions, and the lessons learned regarding access rights and confidentiality of information. We also elaborate on the key points that made our preservation effort successful, chief among them the drafting of a governing baseline for historical data preservation from satellite missions and the use of that baseline as a guide for filtering which documents to preserve. The Preservation System currently archives documentation content for the High Resolution Dynamics Limb Sounder (HIRDLS), the Upper Atmosphere Research Satellite (UARS), the Total Ozone Mapping Spectrometer (TOMS), and the 1960s-era Nimbus mission. Documentation from other missions, such as the Tropical Rainfall Measuring Mission (TRMM), the Ozone Monitoring Instrument (OMI), and the Atmospheric Infra-Red Sounder (AIRS), is also slated to be added to this repository, as are the other mission datasets to be preserved at the GES DISC.
Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture
NASA Astrophysics Data System (ADS)
Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel
2003-11-01
Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.
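As a rough illustration of the data and pipeline parallelism described above, the toy sketch below runs stubbed acquisition, analysis, and indexing stages over a batch of media items with a worker pool. It is a generic pattern, not the paper's scheduler or its web-service deployment:

    from concurrent.futures import ThreadPoolExecutor

    # Stubbed application-specific stages (acquire -> analyze -> index).
    def acquire(item):  return f"raw({item})"
    def analyze(blob):  return f"features({blob})"
    def index(feats):   return f"indexed({feats})"

    items = ["clip1.mp3", "clip2.mp3", "clip3.mp3"]
    with ThreadPoolExecutor(max_workers=4) as pool:
        raw      = list(pool.map(acquire, items))   # data parallelism
        features = list(pool.map(analyze, raw))     # pipeline stage 2
        indexed  = list(pool.map(index, features))  # pipeline stage 3
    print(indexed)

In the real system each stage would be a web service on its own nodes, with the task-flow manager and scheduler handling load balancing and priorities.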
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
...-end repository to manage various reporting, pooling, and risk management activities associated with... records is to serve as a central back-end repository to house loan origination and servicing, security...
2012-01-01
Objectives: This study demonstrates the feasibility of using expert system shells for rapid clinical decision support module development. Methods: A readily available expert system shell was used to build a simple rule-based system for the crude diagnosis of vaginal discharge. Pictures and 'canned text explanations' are extensively used throughout the program to enhance its intuitiveness and educational dimension. All the steps involved in developing the system are documented. Results: The system runs under Microsoft Windows and is available as a free download at http://healthcybermap.org/vagdisch.zip (the distribution archive includes both the program's executable and the commented knowledge base source as a text document). The limitations of the demonstration system, such as the lack of provisions for assessing uncertainty or various degrees of severity of a sign or symptom, are discussed in detail. Ways of improving the system, such as porting it to the Web and packaging it as an app for smartphones and tablets, are also presented. Conclusions: An easy-to-use expert system shell enables clinicians to rapidly become their own 'knowledge engineers' and develop concise evidence-based decision support modules of simple to moderate complexity, targeting clinical practitioners, medical and nursing students, as well as patients, their lay carers and the general public (where appropriate). In the spirit of the social Web, it is hoped that an online repository can be created to peer review, share and re-use knowledge base modules covering various clinical problems and algorithms, as a service to the clinical community. PMID:23346475
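The rule-based core of such a module is simple to convey. The sketch below is a heavily simplified matching loop with invented placeholder rules; it is illustrative only, is not clinical guidance, and is not the knowledge base the paper distributes:

    # Toy rule-based module in the spirit of an expert system shell.
    # Each rule pairs a set of required findings with a conclusion.
    rules = [
        ({"discharge": "thick white", "itching": True}, "rule A fires"),
        ({"discharge": "thin grey", "odor": "fishy"},   "rule B fires"),
    ]

    def consult(findings):
        for conditions, conclusion in rules:
            if all(findings.get(k) == v for k, v in conditions.items()):
                return conclusion
        return "no rule fired; refer for further assessment"

    print(consult({"discharge": "thin grey", "odor": "fishy"}))

A real shell adds what this sketch lacks: explanation facilities ('canned text'), pictures, and, ideally, handling of uncertainty and symptom severity.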
NASA Astrophysics Data System (ADS)
Lugmayr, Artur R.; Mailaparampil, Anurag; Tico, Florina; Kalli, Seppo; Creutzburg, Reiner
2003-01-01
Digital television (digiTV) is an additional multimedia environment, where metadata is one key element for the description of arbitrary content. This implies adequate structures for content description, which are provided by XML metadata schemes (e.g. MPEG-7, MPEG-21). Content and metadata management is the task of a multimedia repository, from which digiTV clients - equipped with an Internet connection - can access rich additional multimedia types over an "All-HTTP" protocol layer. Within this research work, we focus on the conceptual design issues of a repository for the storage of metadata, accessible from the feedback channel of a local set-top box. Our concept describes the whole heterogeneous life-cycle chain of XML metadata from the service provider to the digiTV equipment, device-independent representation of content, accessing and querying the metadata repository, management of metadata related to digiTV, and the interconnection of basic system components (HTTP front-end, relational database system, and servlet container). We present our conceptual test configuration of a metadata repository that is aimed at a real-world deployment, done within the scope of the future interaction (fiTV) project at the Digital Media Institute (DMI) Tampere (www.futureinteraction.tv).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voelzke, Holger; Nieslony, Gregor; Ellouz, Manel
Since the license for the Konrad repository was finally confirmed by legal decision in 2007, the Federal Institute for Radiation Protection (BfS) has been performing further planning and preparation work to prepare the repository for operation. Waste conditioning and packaging has been continued by different waste producers, such as the nuclear industry and federal research institutes, on the basis of the official disposal requirements. The necessary prerequisites for this are approved containers as well as certified waste conditioning and packaging procedures. The Federal Institute for Materials Research and Testing (BAM) is responsible for container design testing and evaluation of quality assurance measures on behalf of BfS under consideration of the Konrad disposal requirements. Besides assessing the container handling stability (stacking tests, handling loads), design testing procedures are performed that include fire tests (800 °C, 1 hour) and drop tests from different heights and drop orientations. This paper presents the current state of BAM design testing experience with relevant container types (box-shaped, cylindrical) made of steel sheets, ductile cast iron or concrete. It explains the usual testing and evaluation methods, which range from experimental testing to analytical and numerical calculations. Another focus has been laid on already existing containers and packages. The question arises as to how they can be evaluated properly, especially with respect to incomplete safety assessment and fabrication documentation. At present BAM works on numerous applications for container design testing for the Konrad repository. Some licensing procedures were successfully completed in the past, and BfS certified several container types, such as steel sheet, concrete and cast iron containers, which are now available for waste packaging for final disposal. However, large quantities of radioactive waste had been placed into interim storage using containers which are not already licensed for the Konrad repository. Safety assessment of these so-called 'old' containers is a big challenge for all parties because documentation sheets about container design testing and fabrication often contain gaps or have not yet been completed. Appropriate solution strategies are currently under development and discussion. Furthermore, BAM has successfully initiated and established an information forum, called 'ERFA QM Konrad Containers', which facilitates discussions on various issues of common interest with respect to Konrad container licensing procedures as well as the interpretation of disposal requirements under consideration of operational needs. Thus, it provides additional, valuable support for container licensing procedures. (authors)
ERIC Educational Resources Information Center
National Archives and Records Administration, Washington, DC. Office of Public Programs.
This publication is intended for teachers bringing a class to visit the National Archives in Washington, D.C., for a workshop on primary documents. The National Archives serves as the repository for all federal records of enduring value. Primary sources are vital teaching tools because they actively engage the student's imagination so that he or…
Distributed Episodic Exploratory Planning (DEEP)
2012-04-01
Much of the work is documented in detail in other papers, reports and presentations, as referenced in the bibliography, and is only summarized in this report. ...repository feature space. ... released in May 2006 a revolutionary vision paper titled "C2 Enabling Concepts" (Braun, 2006) depicting what a potential
Evolution in Metadata Quality: Common Metadata Repository's Role in NASA Curation Efforts
NASA Technical Reports Server (NTRS)
Gilman, Jason; Shum, Dana; Baynes, Katie
2016-01-01
Metadata Quality is one of the chief drivers of discovery and use of NASA EOSDIS (Earth Observing System Data and Information System) data. Issues with metadata such as lack of completeness, inconsistency, and use of legacy terms directly hinder data use. As the central metadata repository for NASA Earth Science data, the Common Metadata Repository (CMR) has a responsibility to its users to ensure the quality of CMR search results. This poster covers how we use humanizers, a technique for dealing with the symptoms of metadata issues, as well as our plans for future metadata validation enhancements. The CMR currently indexes 35K collections and 300M granules.
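A plausible reading of a "humanizer" is a normalization pass applied at index time, so that search is not hindered by legacy or inconsistently cased terms. The mapping entries below are invented for illustration and are not the CMR's actual rules:

    # Sketch of a humanizer-style normalization pass; the preferred-term
    # table is illustrative, not the CMR's real rule set.
    PREFERRED = {
        "AQUA":  "Aqua",        # capitalization cleanup
        "TERRA": "Terra",
        "NPP":   "Suomi-NPP",   # legacy short name -> current name
    }

    def humanize(record):
        platform = record.get("platform", "")
        record["platform"] = PREFERRED.get(platform, platform)
        return record

    print(humanize({"platform": "AQUA"}))  # {'platform': 'Aqua'}

Because such a pass treats the symptoms at index time, the underlying provider metadata still needs the validation enhancements the poster goes on to describe.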
Bleda, Marta; Tarraga, Joaquin; de Maria, Alejandro; Salavert, Francisco; Garcia-Alonso, Luz; Celma, Matilde; Martin, Ainoha; Dopazo, Joaquin; Medina, Ignacio
2012-07-01
During the past years, advances in high-throughput technologies have produced an unprecedented growth in the number and size of repositories and databases storing relevant biological data. Today there is more biological information than ever but, unfortunately, the current status of many of these repositories is far from optimal. Some of the most common problems are that the information is spread out over many small databases; that standards frequently differ among repositories; and that some databases are no longer supported or contain information that is too specific and unconnected. In addition, data size is increasingly becoming an obstacle when accessing or storing biological data. All these issues make it very difficult to extract and integrate information from different sources, to analyze experiments, or to access and query this information in a programmatic way. CellBase provides a solution to the growing necessity of integration by easing access to biological data. CellBase implements a set of RESTful web services that query a centralized database containing the most relevant biological data sources. The database is hosted on our servers and is regularly updated. CellBase documentation can be found at http://docs.bioinfo.cipf.es/projects/cellbase.
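Programmatic access to a service of this kind reduces to simple HTTP calls. The host and path in the sketch below are assumptions chosen for illustration, not CellBase's documented endpoints, which should be taken from the project documentation:

    import requests

    # Hypothetical CellBase-style REST call; the base URL and path shape
    # are assumptions, not the documented CellBase API.
    base = "http://example.org/cellbase/webservices/rest"
    url = f"{base}/v4/hsapiens/feature/gene/BRCA2/info"
    resp = requests.get(url, headers={"Accept": "application/json"})
    if resp.ok:
        print(resp.json())
    else:
        print("request failed:", resp.status_code)

The appeal of the REST approach is exactly this: one URL per biological question, answered from a centrally maintained and regularly updated database.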
Site characterization report for the basalt waste isolation project. Volume II
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1982-11-01
The reference location for a repository in basalt for the terminal storage of nuclear wastes on the Hanford Site and the candidate horizons within this reference repository location have been identified, and the preliminary characterization work in support of the site screening process has been completed. Fifteen technical questions regarding the qualification of the site were identified to be addressed during the detailed site characterization phase of the US Department of Energy-National Waste Terminal Storage Program site selection process. Resolution of these questions will be provided in the final site characterization progress report, currently planned to be issued in 1987, and in the safety analysis report to be submitted with the License Application. The additional information needed to resolve these questions and the plans for obtaining the information have been identified. This Site Characterization Report documents the results of the site screening process, the preliminary site characterization data, the technical issues that need to be addressed, and the plans for resolving these issues. Volume 2 contains chapters 6 through 12: geochemistry; surface hydrology; climatology, meteorology, and air quality; environmental, land-use, and socioeconomic characteristics; repository design; waste package; and performance assessment.
Standards-based curation of a decade-old digital repository dataset of molecular information.
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Murray-Rust, Peter; Rzepa, Henry S; Stewart, James J P
2015-01-01
The desirable curation of 158,122 molecular geometries derived from the NCI set of reference molecules together with associated properties computed using the MOPAC semi-empirical quantum mechanical method and originally deposited in 2005 into the Cambridge DSpace repository as a data collection is reported. The procedures involved in the curation included annotation of the original data using new MOPAC methods, updating the syntax of the CML documents used to express the data to ensure schema conformance and adding new metadata describing the entries together with a XML schema transformation to map the metadata schema to that used by the DataCite organisation. We have adopted a granularity model in which a DataCite persistent identifier (DOI) is created for each individual molecule to enable data discovery and data metrics at this level using DataCite tools. We recommend that the future research data management (RDM) of the scientific and chemical data components associated with journal articles (the "supporting information") should be conducted in a manner that facilitates automatic periodic curation.
Samwald, Matthias; Lim, Ernest; Masiar, Peter; Marenco, Luis; Chen, Huajun; Morse, Thomas; Mutalik, Pradeep; Shepherd, Gordon; Miller, Perry; Cheung, Kei-Hoi
2013-01-01
The amount of biomedical data available in Semantic Web formats has been rapidly growing in recent years. While these formats are machine-friendly, user-friendly web interfaces allowing easy querying of these data are typically lacking. We present “Entrez Neuron”, a pilot neuron-centric interface that allows for keyword-based queries against a coherent repository of OWL ontologies. These ontologies describe neuronal structures, physiology, mathematical models and microscopy images. The returned query results are organized hierarchically according to brain architecture. Where possible, the application makes use of entities from the Open Biomedical Ontologies (OBO) and the ‘HCLS knowledgebase’ developed by the W3C Interest Group for Health Care and Life Science. It makes use of the emerging RDFa standard to embed ontology fragments and semantic annotations within its HTML-based user interface. The application and underlying ontologies demonstrate how Semantic Web technologies can be used for information integration within a curated information repository and between curated information repositories. They also demonstrate how information integration can be accomplished on the client side, through simple copying and pasting of portions of documents that contain RDFa markup. PMID:19745321
Borrego Alonso, Sofía; Perdomo Amistad, Ivette
2014-01-01
The high relative humidity and temperatures in tropical countries create favorable conditions for the development of fungi that are not only a risk to human health but can also colonize documentary supports. The aims of this study were to determine the concentration of airborne fungi in two repositories of the National Archives of the Republic of Cuba, to study the mycobiota deposited on different photographic supports and maps preserved in these repositories, and to taxonomically characterize the fungi isolated. The air sampling was performed using a sedimentation method, and the supports (6 photographs and 7 maps) were analyzed using moistened sterile swabs. The Cladosporium genus was predominant, followed by the Aspergillus and Penicillium genera. Filamentous fungi were isolated from all the photographs and maps, and yeasts were isolated only from one photographic support and one map. We identified several species of the Aspergillus and Penicillium genera, but Aspergillus niger and Aspergillus flavus predominated. Candida and Rhodotorula were the yeast genera isolated. The fungal concentration of the air demonstrated that the environments were not contaminated. Of the 26 species of filamentous fungi isolated, only 5 were detected both in the indoor air of the repositories and on one or more of the document supports analyzed (representing 19.3%). This shows that there is a low ecological relationship between the fungi detected in the indoor air and those isolated from the supports studied. Copyright © 2012 Revista Iberoamericana de Micología. Published by Elsevier Espana. All rights reserved.
The MIMIC Code Repository: enabling reproducibility in critical care research.
Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J
2018-01-01
Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
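The core idea of the code repository is that a shared, versioned query or script defines each clinical concept once, so every study derives it identically. The sketch below shows the pattern with an in-memory SQLite table; the table layout, thresholds, and flag are simplified stand-ins, not the actual MIMIC-III schema or any published severity score.

```python
# Sketch of the "code for concepts" idea: derive a simple physiologic flag
# from raw records with one reusable query. Table and thresholds are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE vitals (stay_id INTEGER, heart_rate REAL, sbp REAL);
    INSERT INTO vitals VALUES (1, 118, 85), (2, 72, 120);
""")

# A shared, versioned query like this keeps a derived concept identical
# across every study that reuses it.
TACHYCARDIA_HYPOTENSION = """
    SELECT stay_id,
           MAX(heart_rate > 100 AND sbp < 90) AS unstable
    FROM vitals GROUP BY stay_id
"""
for stay_id, unstable in conn.execute(TACHYCARDIA_HYPOTENSION):
    print(stay_id, bool(unstable))
```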
Interoperability Across the Stewardship Spectrum in the DataONE Repository Federation
NASA Astrophysics Data System (ADS)
Jones, M. B.; Vieglais, D.; Wilson, B. E.
2016-12-01
Thousands of earth and environmental science repositories serve many researchers and communities, each with their own community and legal mandates, sustainability models, and historical infrastructure. These repositories span the stewardship spectrum from highly curated collections that employ large numbers of staff members to review and improve data, to small, minimal-budget repositories that accept data caveat emptor and where all responsibility for quality lies with the submitter. Each repository fills a niche, providing services that meet the stewardship tradeoffs of one or more communities. We have reviewed these stewardship tradeoffs for several DataONE member repositories, ranging from minimally curated (KNB) to highly curated (Arctic Data Center), as well as from general purpose (Dryad) to highly discipline- or project-specific (NEON). The rationale behind different levels of stewardship reflects the resolution of these tradeoffs. Some repositories aim to encourage extensive uptake by keeping processes simple and minimizing the amount of information collected, but this limits the long-term utility of the data and the search, discovery, and integration systems that are possible. Other repositories require extensive metadata input, review, and assessment, allowing for excellent preservation, discovery, and integration, but at the cost of significant time for submitters and expense for curatorial staff. DataONE recognizes these different levels of curation and attempts to embrace them to create a federation that is useful across the stewardship spectrum. DataONE provides a tiered model for repositories, with growing utility of DataONE services at higher tiers of curation. The lowest tier supports read-only access to data and requires little more than title and contact metadata. Repositories can gradually phase in support for higher levels of metadata and services as needed. These tiered capabilities are possible through flexible support for multiple metadata standards and services, where repositories can incrementally increase their requirements as they want to satisfy more use cases. Within DataONE, metadata search services support minimal metadata models, but significantly expanded precision and recall become possible when repositories provide more extensively curated metadata.
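A hypothetical reading of the tiered model is a simple check of which metadata fields a repository supplies. The sketch below invents two tiers and their field requirements purely for illustration; DataONE's actual tier definitions are richer than this.

```python
# Toy illustration of a tiered-curation check: the field requirements per
# tier are invented for this sketch and are not DataONE's actual rules.
MINIMAL = {"title", "contact"}
CURATED = MINIMAL | {"abstract", "spatial_coverage", "methods"}

def curation_tier(metadata: dict) -> str:
    """Classify a metadata record by which (hypothetical) tier it satisfies."""
    fields = {k for k, v in metadata.items() if v}
    if CURATED <= fields:
        return "higher tier: rich search and integration services"
    if MINIMAL <= fields:
        return "lowest tier: read-only discovery via title/contact"
    return "below minimum: not indexable"

print(curation_tier({"title": "Soil cores 2014", "contact": "pi@example.org"}))
```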
Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories
NASA Astrophysics Data System (ADS)
Lehnert, K.
2017-12-01
Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality, enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and designed to support the advancement of their science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as reliable facilities that curate and preserve data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining a reliable and robust operation of the repository infrastructure on the other. Staying abreast of evolving repository standards to certify as a trustworthy repository, and conducting regular self-assessment and certification, alone requires resources that compete with the demands for improving data holdings or the usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value for the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses with the resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate. IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on domain-specific data curation and value-added services.
Whitney, J.W.; Keefer, W.R.
2000-01-01
In recognition of a critical national need for permanent radioactive-waste storage, Yucca Mountain in southwestern Nevada has been investigated by Federal agencies since the 1970s as a potential geologic disposal site. In 1987, Congress selected Yucca Mountain for an expanded and more detailed site characterization effort. As an integral part of this program, the U.S. Geological Survey began a series of detailed geologic, geophysical, and related investigations designed to characterize the tectonic setting, fault behavior, and seismicity of the Yucca Mountain area. This document presents the results of 13 studies of the tectonic environment of Yucca Mountain, in support of a broad goal to assess the effects of future seismic and fault activity in the area on the design, long-term performance, and safe operation of the potential surface and subsurface repository facilities.
10 CFR 960.5-2-10 - Hydrology.
Code of Federal Regulations, 2010 CFR
2010-01-01
... the host rock and the land surface. (2) Absence of surface-water systems that could potentially cause flooding of the repository. (3) Availability of the water required for repository construction, operation, and closure. (c) Potentially adverse condition. Ground-water conditions that could require complex...
Linking User Identities Across the DataONE Federation of Data Repositories
NASA Astrophysics Data System (ADS)
Jones, M. B.; Mecum, B.; Leinfelder, B.; Jones, C. S.; Walker, L.
2016-12-01
DataONE provides services for identifying, authenticating, and authorizing researchers to access and contribute data to repositories within the DataONE federation. In the earth sciences, thousands of institutional and disciplinary repositories have created their own user identity and authentication systems, with their own user directories based on databases or web content management systems. Thus, researchers have many identities that are neither linked nor interoperable, making it difficult to reference the identity of these users across systems. Key user information is hidden, and often only a non-disambiguated name is available. From a sample of 160,000 data sets within DataONE, a super-majority of references to the data creators lack even an email address. In an attempt to disambiguate these people via the GeoLink project, we conservatively estimate they represent at least 57,000 unique identities, but without a clear user identifier, there could be as many as 223,000. Interoperability among repositories is critical to improving the scope of scientific synthesis and capabilities for research collaboration. While many have focused on the convenience of Single Sign-On (SSO), we have found that sharing user identifiers is far more useful for interoperability. With an unambiguous user identity in incoming metadata, DataONE has built user profiles that present a user's data across repositories, that link users and their organizational affiliations, and that allow users to work collaboratively in private groups that span repository systems. DataONE's user identity solution leverages existing systems such as InCommon, CILogon, Google, and ORCID to avoid further proliferating user identities. DataONE provides a core service allowing users to link their multiple identities so that authenticating with one identity (e.g., ORCID) can authorize access to data protected via another identity (e.g., InCommon). Currently, DataONE is using ORCID identities to link and identify users, but challenges must still be overcome to support historical records for which ORCIDs cannot be used because the associated people are unavailable to confirm their identity. DataONE's identity systems facilitate crosslinking between user identities and scientific metadata to accelerate collaboration and synthesis.
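Identity linking of this kind can be modeled as equivalence classes over identity strings: once two identities are linked, authorization granted to one applies to all. The sketch below uses a small union-find structure; the identifiers and the IdentityLinker API are hypothetical, not DataONE's implementation.

```python
# Sketch of identity linking: several login identities resolve to one
# researcher, so a grant to any linked identity covers all of them.
class IdentityLinker:
    def __init__(self):
        self._parent = {}

    def _find(self, identity: str) -> str:
        # Walk to the representative identity of this equivalence class.
        self._parent.setdefault(identity, identity)
        while self._parent[identity] != identity:
            identity = self._parent[identity]
        return identity

    def link(self, a: str, b: str) -> None:
        """Record that two identities belong to the same person."""
        self._parent[self._find(a)] = self._find(b)

    def same_person(self, a: str, b: str) -> bool:
        return self._find(a) == self._find(b)

linker = IdentityLinker()
linker.link("https://orcid.org/0000-0000-0000-0000", "uid=jsmith@university.edu")
# Data shared with the campus identity is now readable after an ORCID login.
print(linker.same_person("uid=jsmith@university.edu",
                         "https://orcid.org/0000-0000-0000-0000"))
```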
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.; de Sherbinin, A. M.
2017-12-01
Growing recognition of the importance of sharing scientific data more widely and openly has refocused attention on the state of data repositories, including both discipline- or topic-oriented data centers and institutional repositories. Data creators often have several alternatives for depositing and disseminating their natural, social, health, or engineering science data. In selecting a repository for their data, data creators and other stakeholders such as their funding agencies may wish to consider the user community or communities served, the type and quality of data products already offered, and the degree of data stewardship and associated services provided. Some data repositories serve general communities, e.g., those in their host institution or region, whereas others tailor their services to particular scientific disciplines or topical areas. Some repositories are selective when acquiring data and conduct extensive curation and reviews to ensure that data products meet quality standards. Many repositories have secured credentials and established a track record for providing trustworthy, high quality data and services. The NASA Socioeconomic Data and Applications Center (SEDAC) serves users interested in human-environment interactions, including researchers, students, and applied users from diverse sectors. SEDAC is selective when choosing data for dissemination, conducting several reviews of data products and services prior to release. SEDAC works with data producers to continually improve the quality of its open data products and services. As a Distributed Active Archive Center (DAAC) of the NASA Earth Observing System Data and Information System, SEDAC is committed to improving the accessibility, interoperability, and usability of its data in conjunction with data available from other DAACs, as well as other relevant data sources. SEDAC is certified as a Regular Member of the International Council for Science World Data System (ICSU-WDS).
Simmons, Ardyth M.; Stuckless, John S.; with a Foreword by Abraham Van Luik, U.S. Department of Energy
2010-01-01
Natural analogues are defined for this report as naturally occurring or anthropogenic systems in which processes similar to those expected to occur in a nuclear waste repository are thought to have taken place over time periods of decades to millennia and on spatial scales as much as tens of kilometers. Analogues provide an important temporal and spatial dimension that cannot be tested by laboratory or field-scale experiments. Analogues provide one of the multiple lines of evidence intended to increase confidence in the safe geologic disposal of high-level radioactive waste. Although the work in this report was completed specifically for Yucca Mountain, Nevada, as the proposed geologic repository for high-level radioactive waste under the U.S. Nuclear Waste Policy Act, the applicability of the science, analyses, and interpretations is not limited to a specific site. Natural and anthropogenic analogues have provided and can continue to provide value in understanding features and processes of importance across a wide variety of topics in addressing the challenges of geologic isolation of radioactive waste and also as a contribution to scientific investigations unrelated to waste disposal. Isolation of radioactive waste at a mined geologic repository would be through a combination of natural features and engineered barriers. In this report we examine analogues to many of the various components of the Yucca Mountain system, including the preservation of materials in unsaturated environments, flow of water through unsaturated volcanic tuff, seepage into repository drifts, repository drift stability, stability and alteration of waste forms and components of the engineered barrier system, and transport of radionuclides through unsaturated and saturated rock zones.
Wu, Tai-luan; Tseng, Ling-li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open access sources (arXiv.org and the Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data analysis collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
Recent technology products from Space Human Factors research
NASA Technical Reports Server (NTRS)
Jenkins, James P.
1991-01-01
The goals of the NASA Space Human Factors program and the research carried out concerning human factors are discussed, with emphasis given to the development of human performance models, data, and tools. The major products from this program are described, which include the Laser Anthropometric Mapping System; a model of the human body for evaluating the kinematics and dynamics of human motion and strength in a microgravity environment; an operational experience database for verifying and validating the data repository of manned space flights; the Operational Experience Database Taxonomy; and a human-computer interaction laboratory whose products are the display software and requirements and the guideline documents and standards for applications on human-computer interaction. Special attention is given to the 'Convoltron', a prototype version of a signal processor for synthesizing the head-related transfer functions.
NASA Astrophysics Data System (ADS)
Òtúlàjà, Fẹ́mi S.; Cameron, Ann; Msimanga, Audrey
2011-09-01
Our response to Hewson and Ogunniyi's paper focuses, on the one hand, on some of the underlying tensions associated with aligning indigenous knowledge systems with westernized science in South African science classrooms, as suggested by the new, post-apartheid curriculum. On the other hand, the use of argumentation as a vehicle to accomplish the alignment, when the jury is still out on the appropriateness of argumentation as a pedagogical and research tool, heightens the tension. We argue that the need for education stakeholders from indigenous heritages to value, know and document their own indigenous knowledge becomes paramount. The textualizing of indigenous knowledge, as has been done in western science, will create repositories for teachers to access and may help with argumentation strategies such as those advocated by the authors.
Building Scientific Data's list of recommended data repositories
NASA Astrophysics Data System (ADS)
Hufton, A. L.; Khodiyar, V.; Hrynaszkiewicz, I.
2016-12-01
When Scientific Data launched in 2014, we provided our authors with a list of recommended data repositories to help them identify data hosting options that were likely to meet the journal's requirements. This list has grown in size and scope, and is now a central resource for authors across the Nature-titled journals. It has also been used in the development of data deposition policies and recommended repository lists across Springer Nature and at other publishers. Each new addition to the list is assessed according to a series of criteria that emphasize the stability of the resource, its commitment to principles of open science, and its implementation of relevant community standards and reporting guidelines. A preference is expressed for repositories that issue digital object identifiers (DOIs) through the DataCite system and that share data under the Creative Commons CC0 waiver. Scientific Data currently lists fourteen repositories that focus on specific areas within the Earth and environmental sciences, as well as the broad-scope repositories Dryad and figshare. Readers can browse and filter datasets published at the journal by the host repository using ISA-explorer, a demo tool built by the ISA-tools team at Oxford University [1]. We believe that well-maintained lists like this one help publishers build a network of trust with community data repositories and provide an important complement to more comprehensive data repository indices and more formal certification efforts. In parallel, Scientific Data has also improved its policies to better support submissions from authors using institutional and project-specific repositories, without requiring each to apply for listing individually. Online resources: Journal homepage: http://www.nature.com/scientificdata Data repository criteria: http://www.nature.com/sdata/policies/data-policies#repo-criteria Recommended data repositories: http://www.nature.com/sdata/policies/repositories Archived copies of the list: https://dx.doi.org/10.6084/m9.figshare.1434640.v6 Reference: [1] Gonzalez-Beltran, A. ISA-explorer: A demo tool for discovering and exploring Scientific Data's ISA-tab metadata. Scientific Data Updates http://blogs.nature.com/scientificdata/2015/12/17/isa-explorer/ (2015).
The role of natural analogs in the repository licensing process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, W.M.
1995-09-01
The concept of a permanent geologic repository for high-level nuclear waste (HLW) is implicitly based on analogy to natural systems that have been stable for millions or billions of years. The time of radioactive and chemical toxicity of HLW exceeds the duration of human civilization, and it is impossible to demonstrate the accuracy of predictions of the behavior of engineered or social systems over such long periods.
Knowledge Management Systems: Linking Contribution, Refinement and Use
ERIC Educational Resources Information Center
Chung, Ting-ting
2009-01-01
Electronic knowledge repositories represent one of the fundamental tools for knowledge management (KM) initiatives. Existing research, however, has largely focused on supply-side driven research questions, such as employee motivation to contribute knowledge to a repository. This research turns attention to the dynamic relationship between the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutton, M; Blink, J A; Greenberg, H R
2012-04-25
The Used Fuel Disposition (UFD) Campaign within the Department of Energy's Office of Nuclear Energy (DOE-NE) Fuel Cycle Technology (FCT) program has been tasked with investigating the disposal of the nation's spent nuclear fuel (SNF) and high-level nuclear waste (HLW) for a range of potential waste forms and geologic environments. The planning, construction, and operation of a nuclear disposal facility is a long-term process that involves engineered barriers that are tailored to both the geologic environment and the waste forms being emplaced. The UFD Campaign is considering a range of fuel cycles that in turn produce a range of waste forms. The UFD Campaign is also considering a range of geologic media. These ranges could be thought of as adding uncertainty to what the disposal facility design will ultimately be; however, it may be preferable to think about the ranges as adding flexibility to the design of a disposal facility. For example, as the overall DOE-NE program and industrial actions result in the fuel cycles that will produce waste to be disposed, and the characteristics of those wastes become clear, the disposal program retains flexibility in both the choice of geologic environment and the specific repository design. Of course, other factors also play a major role, including local and State-level acceptance of the specific site that provides the geologic environment. In contrast, the Yucca Mountain Project (YMP) repository license application (LA) is based on waste forms from an open fuel cycle (PWR and BWR assemblies from an open fuel cycle). These waste forms were about 90% of the total waste, and they were the determining waste form in developing the engineered barrier system (EBS) design for the Yucca Mountain Repository design. About 10% of the repository capacity was reserved for waste from a full recycle fuel cycle in which some actinides were extracted for weapons use, and the remaining fission products and some minor actinides were encapsulated in borosilicate glass. Because the heat load of the glass was much less than the PWR and BWR assemblies, the glass waste form was able to be co-disposed with the open cycle waste, by interspersing glass waste packages among the spent fuel assembly waste packages. In addition, the Yucca Mountain repository was designed to include some research reactor spent fuel and naval reactor spent fuel, within the envelope that was set using the commercial reactor assemblies as the design basis waste form. This milestone report supports Sandia National Laboratory milestone M2FT-12SN0814052, and is intended to be a chapter in that milestone report. The independent technical review of this LLNL milestone was performed at LLNL and is documented in the electronic Information Management (IM) system at LLNL. The objective of this work is to investigate what aspects of quantifying, characterizing, and representing the uncertainty associated with the engineered barrier are affected by implementing different advanced nuclear fuel cycles (e.g., partitioning and transmutation scenarios) together with corresponding designs and thermal constraints.
Repository Profiles for Atmospheric and Climate Sciences: Capabilities and Trends in Data Services
NASA Astrophysics Data System (ADS)
Hou, C. Y.; Thompson, C. A.; Palmer, C. L.
2014-12-01
As digital research data proliferate and expectations for open access escalate, the landscape of data repositories is becoming more complex. For example, DataBib currently identifies 980 data repositories across the disciplines, with 117 categorized under Geosciences. In atmospheric and climate sciences, there are great expectations for the integration and reuse of data for advancing science. To realize this potential, resources are needed that explicate the range of repository options available for locating and depositing open data, their conditions of access and use, and the services and tools they provide. This study profiled 38 open digital repositories in the atmospheric and climate sciences, analyzing each on 55 criteria through content analysis of their websites. The results provide a systematic way to assess and compare capabilities, services, and institutional characteristics and identify trends across repositories. Selected results from the more detailed outcomes to be presented: most repositories offer guidance on data format(s) for submission and dissemination; 42% offer authorization-free access; more than half use some type of data identification system, such as DOIs; nearly half offer some data processing, with a similar number providing software or tools; 78.9% request that users cite or acknowledge the datasets used and the data center; and only 21.1% recommend specific metadata standards, such as ISO 19115 or Dublin Core, with more than half utilizing a customized metadata scheme. Information was rarely provided on repository certification and accreditation, and was uneven for transfer of rights and data security. Few provided policy information on preservation, migration, reappraisal, disposal, or long-term sustainability. As repository use increases, it will be important for institutions to make their procedures and policies explicit, to build trust with user communities and improve efficiencies in data sharing. Resources such as repository profiles will be essential for scientists to weigh options and understand trends in data services across the evolving network of repositories.
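Profile data of this shape, repositories scored against boolean criteria, reduces naturally to a percentage table. A minimal pandas sketch, with invented example rows rather than the study's actual 38-repository dataset, might look like this:

```python
# Sketch of the content-analysis tabulation: score repositories against
# boolean criteria and report the share meeting each one. Rows are invented.
import pandas as pd

profiles = pd.DataFrame(
    {
        "open_access": [True, True, False, True],
        "uses_dois": [True, False, True, True],
        "metadata_standard": [False, False, True, False],
    },
    index=["repo_a", "repo_b", "repo_c", "repo_d"],
)

# Percentage of repositories satisfying each criterion, in the style of
# the "42% offer authorization-free access" results above.
print((profiles.mean() * 100).round(1).astype(str) + "%")
```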
Introducing the slime mold graph repository
NASA Astrophysics Data System (ADS)
Dirnberger, M.; Mehlhorn, K.; Mehlhorn, T.
2017-07-01
We introduce the slime mold graph repository or SMGR, a novel data collection promoting the visibility, accessibility and reuse of experimental data revolving around network-forming slime molds. By making data readily available to researchers across multiple disciplines, the SMGR promotes novel research as well as the reproduction of original results. While SMGR data may take various forms, we stress the importance of graph representations of slime mold networks due to their ease of handling and their large potential for reuse. Data added to the SMGR stands to gain impact beyond initial publications or even beyond its domain of origin. We initiate the SMGR with the comprehensive Kist Europe data set focusing on the slime mold Physarum polycephalum, which we obtained in the course of our original research. It contains sequences of images documenting growth and network formation of the organism under constant conditions. Suitable image sequences depicting the typical P. polycephalum network structures are used to compute sequences of graphs faithfully capturing them. Given such sequences, node identities are computed, tracking the development of nodes over time. Based on this information we demonstrate two out of many possible ways to begin exploring the data. The entire data set is well-documented, self-contained and ready for inspection at http://smgr.mpi-inf.mpg.de.
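Given graph snapshots with persistent node identities, longitudinal questions reduce to simple set and attribute operations. The networkx sketch below uses toy two-snapshot graphs, not actual SMGR data, to show the idea:

```python
# Sketch of tracking node identities across successive graph snapshots of
# a growing network; graphs and the matching rule are toy stand-ins for
# the SMGR's image-derived P. polycephalum graph sequences.
import networkx as nx

t0 = nx.Graph([("n1", "n2"), ("n2", "n3")])
t1 = nx.Graph([("n1", "n2"), ("n2", "n3"), ("n3", "n4")])  # one new node

persisted = set(t0.nodes) & set(t1.nodes)
appeared = set(t1.nodes) - set(t0.nodes)
print("persisted:", sorted(persisted))
print("appeared:", sorted(appeared))

# Per-node degree history, the kind of longitudinal quantity that node
# tracking makes possible.
for node in sorted(persisted):
    print(node, t0.degree[node], "->", t1.degree[node])
```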
NASA Astrophysics Data System (ADS)
Gagliolo, S.; Ausonio, E.; Federici, B.; Ferrando, I.; Passoni, D.; Sguerso, D.
2018-05-01
The conservation of Cultural Heritage depends on the availability of means and resources and, consequently, on the possibility of making effective operations of data acquisition. In fact, on the one hand, the creation of data repositories allows the description of the present state of the art, in order to preserve the testimonial value and to permit fruition. On the other hand, data acquisition provides metrical knowledge, which is particularly useful for the direct restoration of the surveyed objects through the analysis of their 3D digital models. In the last decades, the continuous increase and improvement of 3D survey techniques and of tools for geometric and digital data management have represented a great support to the development of documentary activities. In particular, Photogrammetry is a survey technique highly appropriate for the creation of data repositories in the field of Cultural Heritage, thanks to its advantages of cheapness, flexibility, speed, and the opportunity to ensure the operators' safety in hazardous areas too. In order to obtain complete documentation, the high precision of the on-site operations must be coupled with an effective post-processing phase. Hence, a comparison among some of the photogrammetric software packages currently available was performed by the authors, with particular attention to the completeness of the workflow and the quality of the final products.
Assessment of Self-Archiving in Institutional Repositories: Across Disciplines
ERIC Educational Resources Information Center
Xia, Jingfeng
2007-01-01
This research examined self-archiving practices by four disciplines in seven institutional repositories. By checking each individual item for its metadata and deposition status, the research found that a disciplinary culture is not obviously presented. Rather, self-archiving is regulated by a liaison system and a mandate policy.
THE TROPICAL AND SUBTROPICAL GERMPLASM COLLECTIONS AT THE NATIONAL GERMPLASM REPOSITORY IN MIAMI, FL
USDA-ARS?s Scientific Manuscript database
The Subtropical and Tropical USDA, ARS, National Germplasm Repositories (NGR) in Miami, FL; Mayaguez, PR; and Hilo, HI are responsible for the collections of subtropical and tropical fruits, nuts, grasses, and ornamentals for the USDA, ARS, National Plant Germplasm System (NPGS). The NPGS is respons...
Designing Learning Object Repositories as Systems for Managing Educational Communities Knowledge
ERIC Educational Resources Information Center
Sampson, Demetrios G.; Zervas, Panagiotis
2013-01-01
Over the past years, a number of international initiatives that recognize the importance of sharing and reusing digital educational resources among educational communities through the use of Learning Object Repositories (LORs) have emerged. Typically, these initiatives focus on collecting digital educational resources that are offered by their…
ALES: An Innovative Argument-Learning Environment
ERIC Educational Resources Information Center
Abbas, Safia; Sawamura, Hajime
2010-01-01
This paper presents the development of an Argument-Learning System (ALES). The idea is based on the AIF (argumentation interchange format) ontology using "Walton theory". ALES uses different mining techniques to manage a highly structured arguments repository. This repository was designed, developed and implemented by the authors. The aim is to…
NASA Astrophysics Data System (ADS)
Versteeg, R. J.; Wangerud, K.; Mattson, E.; Ankeny, M.; Richardson, A.; Heath, G.
2005-05-01
The Ruby Gulch repository at the Gilt Edge Mine Superfund site is a capped waste rock repository. Early in the system design, EPA and its subcontractor, the Bureau of Reclamation, recognized the need for a long-term monitoring system to provide information on the repository behavior, with the following objectives: 1) provide information on the integrity of the newly constructed surface cover and diversion system; 2) continually assess the waste's hydrological and geochemical behavior, so that rational decisions can be made for the operation of this cover and liner system; 3) provide stakeholders with easy access to information pertaining to system performance; and 4) integrate a variety of data sources to produce information that could be used to enhance future cover designs. Through discussions between EPA, the Bureau of Reclamation, and Idaho National Laboratory, a long-term monitoring system was designed and implemented allowing EPA to meet these objectives. This system was designed to provide a cost-effective way to deal with massive amounts of data and information, subject to the following specifications: 1) data acquisition should occur autonomously and automatically; 2) data management, processing, and presentation should be automated as much as possible; and 3) users should be able to access all data and information remotely through a web browser. The INL long-term monitoring system integrates the data from a set of 522 resistivity electrodes, consisting of 462 surface electrodes and 60 borehole electrodes (in 4 wells with 15 electrodes each), an outflow meter at the toe of the repository, an autonomous, remotely accessible weather station, and four wells (average depths of 250 feet) with thermocouples, pressure transducers, and sampling ports for water and air. The monitoring system has been in operation for over a year and has collected data continuously over this period. Results from this system have shown both the diurnal variation in rockmass behavior and the movement of water through the waste (allowing residence time to be estimated), and are leading to a comprehensive model of the repository behavior. Because of the sheer volume of data, a user-driven interface allows users to create their own views of the different datasets.
Building a Trustworthy Environmental Science Data Repository: Lessons Learned from the ORNL DAAC
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S. K.; Boyer, A.; Beaty, T.; Deb, D.; Hook, L.
2017-12-01
The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, https://daac.ornl.gov) for biogeochemical dynamics is one of NASA's Earth Observing System Data and Information System (EOSDIS) data centers. The mission of the ORNL DAAC is to assemble, distribute, and provide data services for a comprehensive archive of terrestrial biogeochemistry and ecological dynamics observations and models to facilitate research, education, and decision-making in support of NASA's Earth Science. Since its establishment in 1994, the ORNL DAAC has been continuously building itself into a trustworthy environmental science data repository by not only ensuring the quality and usability of its data holdings, but also optimizing its data publication and management process. This paper describes the lessons learned from the ORNL DAAC's effort toward this goal. The ORNL DAAC has been proactively implementing international community standards throughout its data management life cycle, including data publication, preservation, discovery, visualization, and distribution. Data files in standard formats, detailed documentation, and metadata following standard models are prepared to improve the usability and longevity of data products. Assignment of a Digital Object Identifier (DOI) ensures the identifiability and accessibility of every data product, including the different versions and revisions of its life cycle. The ORNL DAAC's data citation policy assures that data producers receive appropriate recognition for the use of their products. Web service standards, such as OpenSearch and Open Geospatial Consortium (OGC) services, promote the discovery, visualization, distribution, and integration of the ORNL DAAC's data holdings. Recently, the ORNL DAAC began efforts to optimize and standardize its data archival and data publication workflows, to improve the efficiency and transparency of its data archival and management processes.
Wolbarst, A B; Forinash, E K; Byrum, C O; Peake, R T; Marcinowski, F; Kruger, M U
2001-02-01
In March of 1999, the Waste Isolation Pilot Plant (WIPP) in southeast New Mexico, the world's first deep geological repository for radioactive materials, began receiving defense-related transuranic waste. The WIPP was designed and constructed by the U.S. Department of Energy, but critical to its opening was certification by the U.S. Environmental Protection Agency that the repository complies with the radioactive waste disposal regulations set forth as environmental radiation protection standards (40 CFR Part 191) and compliance criteria (40 CFR Part 194). This paper provides a summary of the regulatory process, including the Environmental Protection Agency's waste containment, groundwater protection, and individual dose regulations for the WIPP; the Department of Energy's performance assessment and the other parts of its compliance certification application; and the Environmental Protection Agency's review and analysis of the compliance certification application and related documentation.
BigMouth: a multi-institutional dental data repository
Walji, Muhammad F; Kalenderian, Elsbeth; Stark, Paul C; White, Joel M; Kookal, Krishna K; Phan, Dat; Tran, Duong; Bernstam, Elmer V; Ramoni, Rachel
2014-01-01
Few oral health databases are available for research and the advancement of evidence-based dentistry. In this work we developed a centralized data repository derived from electronic health records (EHRs) at four dental schools participating in the Consortium of Oral Health Research and Informatics. A multi-stakeholder committee developed a data governance framework that encouraged data sharing while allowing control of contributed data. We adopted the i2b2 data warehousing platform and mapped data from each institution to a common reference terminology. We realized that dental EHRs urgently need to adopt common terminologies. While all used the same treatment code set, only three of the four sites used a common diagnostic terminology, and there were wide discrepancies in how medical and dental histories were documented. BigMouth was successfully launched in August 2012 with data on 1.1 million patients, and made available to users at the contributing institutions. PMID:24993547
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Nuclear Waste Policy Act of 1982 (42 USC sections 10101-10226) requires the environmental assessment of a proposed site to include a statement of the basis for nominating a site as suitable for characterization. Volume 2 provides a detailed statement evaluating the site suitability of the Deaf Smith County Site under DOE siting guidelines, as well as a comparison of the Deaf Smith County Site to the other sites under consideration. The evaluation of the Deaf Smith County Site is based on the impacts associated with the reference repository design, but the evaluation will not change if based on the Mission Plan repository concept. The second part of this document compares the Deaf Smith County Site to Davis Canyon, Hanford, Richton Dome and Yucca Mountain. This comparison is required under DOE guidelines and is not intended to directly support subsequent recommendation of three sites for characterization as candidate sites. 259 refs., 29 figs., 66 tabs. (MHB)
NASA Astrophysics Data System (ADS)
Dunagan, S. C.; Herrick, C. G.; Lee, M. Y.
2008-12-01
The Waste Isolation Pilot Plant (WIPP) is located at a depth of 655 m in bedded salt in southeastern New Mexico and is operated by the U.S. Department of Energy as a deep underground disposal facility for transuranic (TRU) waste. The WIPP must comply with the EPA's environmental regulations, which require a probabilistic risk analysis of releases of radionuclides due to inadvertent human intrusion into the repository at some time during the 10,000-year regulatory period. Sandia National Laboratories conducts performance assessments (PAs) of the WIPP using a system of computer codes representing the evolution of the underground repository and emplaced TRU waste in order to demonstrate compliance. One of the important features modeled in a PA is the disturbed rock zone (DRZ) surrounding the emplacement rooms in the repository. The extent and permeability of the DRZ play a significant role in the potential radionuclide release scenarios. We evaluated the phenomena occurring in the repository that affect the DRZ and their potential effects on the extent and permeability of the DRZ. Furthermore, we examined the DRZ's role in determining the performance of the repository. Pressure in the completely sealed repository will be increased by creep closure of the salt and degradation of TRU waste contents by microbial activity in the repository. An increased pressure in the repository will reduce the extent and permeability of the DRZ. The reduced DRZ extent and permeability will decrease the amount of brine that is available to interact with the waste. Furthermore, the potential for radionuclide release from the repository is dependent on the amount of brine that enters the repository. As a result of these coupled biological-geomechanical-geochemical phenomena, the extent and permeability of the DRZ have a significant impact on the potential radionuclide releases from the repository and, in turn, the repository performance. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.
ACToR: Aggregated Computational Toxicology Resource
We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.
10 Steps to Building an Architecture for Space Surveillance Projects
NASA Astrophysics Data System (ADS)
Gyorko, E.; Barnhart, E.; Gans, H.
Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first- time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well defined, or well architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques, based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines. Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavioral; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Mcclain, Charles R.; Firestone, James K.; Westphal, Todd L.; Yeh, Eueng-Nan; Ge, Yuntao; Firestone, Elaine R.
1994-01-01
This document provides an overview of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Bio-Optical Archive and Storage System (SeaBASS), which will serve as a repository for numerous data sets of interest to the SeaWiFS Science Team and other approved investigators in the oceanographic community. The data collected will be those data sets suitable for the development and evaluation of bio-optical algorithms which include results from SeaWiFS Intercalibration Round-Robin Experiments (SIRREXs), prelaunch characterization of the SeaWiFS instrument by its manufacturer -- Hughes/Santa Barbara Research Center (SBRC), Marine Optical Characterization Experiment (MOCE) cruises, Marine Optical Buoy (MOBY) deployments and refurbishments, and field studies of other scientists outside of NASA. The primary goal of the data system is to provide a simple mechanism for querying the available archive and requesting specific items, while assuring that the data is made available only to authorized users. The design, construction, and maintenance of SeaBASS is the responsibility of the SeaWiFS Calibration and Validation Team (CVT). This report is concerned with documenting the execution of this task by the CVT and consists of a series of chapters detailing the various data sets involved. The topics presented are as follows: 1) overview of the SeaBASS file architecture, 2) the bio-optical data system, 3) the historical pigment database, 4) the SIRREX database, and 5) the SBRC database.
Data Stewardship throughout the Ocean Research Data Life Cycle
NASA Astrophysics Data System (ADS)
Chandler, Cynthia; Groman, Robert; Allison, Molly; Wiebe, Peter; Glover, David
2013-04-01
The Biological and Chemical Oceanography Data Management Office (BCO-DMO) works in partnership with ocean science investigators to publish data from research projects funded by the Biological and Chemical Oceanography Sections and the Office of Polar Programs Antarctic Organisms & Ecosystems Program (OPP ANT) at the U.S. National Science Foundation. Since 2006, researchers have been contributing data to the BCO-DMO data system, and it has developed into a rich repository of data from ocean, coastal and Great Lakes research programs. The end goals of the BCO-DMO are to ensure preservation of NSF funded project data and to provide open access to those data; achievement of those goals is attained through successful completion of a series of related phases. BCO-DMO has developed an end-to-end data stewardship process that includes all phases of the data life cycle: (1) providing data management advice to investigators during the proposal writing stage; (2) registering their funded project at BCO-DMO; (3) adding data and supporting documentation to the BCO-DMO data repository; (4) providing geospatial and text-based data access systems that support data discovery, access, display, assessment, integration, and export of data resources; (5) exploring mechanisms for exchange of data with complementary repositories; (6) publication of data sets to provide publishers of the peer-reviewed literature with citable references (Digital Object Identifiers) and to encourage proper citation and attribution of data sets in the future and (7) submission of final data sets for preservation in the appropriate long-term data archive. Strategic development of collaborative partnerships with complementary data management organizations is essential to sustainable coverage of the full data life cycle from research proposal through preservation of the final data products. Development and incorporation of controlled vocabularies, domain-specific ontologies and globally unique, persistent identifiers to unambiguously identify resources of interest curated by and available from BCO-DMO have significantly enabled progress toward interoperability with partner systems. Several important components have emerged from early collaborative relationships: (1) identifying a trusted authoritative source of complementary content and the appropriate contact; (2) determining the globally unique, persistent identifier for resources of interest and (3) negotiating the requisite syntactic and semantic exchange systems. An added benefit is the ability to use globally unique, persistent resource identifiers to identify and compare related content in other repositories, thus enabling us to improve the accuracy of content in the BCO-DMO data collection. Results from a recent community discussion at the January 2013 Federation of Earth Science Information Partners (ESIP) meeting will be presented. Mindful of the NSF EarthCube initiative in the United States, the ESIP discussion was an effort to identify commonalities and differences in the way different communities meet the challenges of data stewardship throughout the full data life cycle and to determine any gaps that currently exist. BCO-DMO: http://bco-dmo.org ESIP: http://esipfed.org/
SATORI: a system for ontology-guided visual exploration of biomedical data repositories.
Lekschas, Fritz; Gehlenborg, Nils
2018-04-01
The ever-increasing number of biomedical datasets provides tremendous opportunities for re-use, but current data repositories provide limited means of exploration apart from text-based search. Ontological metadata annotations provide context by semantically relating datasets. Visualizing this rich network of relationships can improve the explorability of large data repositories and help researchers find datasets of interest. We developed SATORI, an integrative search and visual exploration interface for the exploration of biomedical data repositories. The design is informed by a requirements analysis conducted through a series of semi-structured interviews. We evaluated the implementation of SATORI in a field study on a real-world data collection. SATORI enables researchers to seamlessly search, browse and semantically query data repositories via two visualizations that are highly interconnected with a powerful search interface. SATORI is an open-source web application, which is freely available at http://satori.refinery-platform.org and integrated into the Refinery Platform. nils@hms.harvard.edu. Supplementary data are available at Bioinformatics online.
Information and image integration: project spectrum
NASA Astrophysics Data System (ADS)
Blaine, G. James; Jost, R. Gilbert; Martin, Lori; Weiss, David A.; Lehmann, Ron; Fritz, Kevin
1998-07-01
The BJC Health System (BJC) and the Washington University School of Medicine (WUSM) formed a technology alliance with industry collaborators to develop and implement an integrated, advanced clinical information system. The industry collaborators include IBM, Kodak, SBC and Motorola. The activity, called Project Spectrum, provides an integrated clinical repository for the multiple hospital facilities of the BJC. The BJC System consists of 12 acute care hospitals serving over one million patients in Missouri and Illinois. An interface engine manages transactions from each of the hospital information systems, lab systems and radiology information systems. Data is normalized to provide a consistent view for the primary care physician. Access to the clinical repository is supported by web-based server/browser technology which delivers patient data to the physician's desktop. An HL7 based messaging system coordinates the acquisition and management of radiological image data and sends image keys to the clinical data repository. Access to the clinical chart browser currently provides radiology reports, laboratory data, vital signs and transcribed medical reports. A chart metaphor provides tabs for the selection of the clinical record for review. Activation of the radiology tab facilitates a standardized view of radiology reports and provides an icon used to initiate retrieval of available radiology images. The selection of the image icon spawns an image browser plug-in and utilizes the image key from the clinical repository to access the image server for the requested image data. The Spectrum system is collecting clinical data from five hospital systems and imaging data from two hospitals. Domain specific radiology imaging systems support the acquisition and primary interpretation of radiology exams. The spectrum clinical workstations are deployed to over 200 sites utilizing local area networks and ISDN connectivity.
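As a rough illustration of routing an image key from an HL7-style message into a repository lookup, the sketch below does a plain string parse of a pipe-delimited segment; the segment layout, field positions, and key format are invented for this example and are not Project Spectrum's actual message specification.

```python
# Toy parse of a pipe-delimited HL7-style segment carrying an image key.
# The segment content and field positions are invented for illustration.
message = "OBX|1|RP|IMAGE^Radiology Image||IMGKEY:RAD-1998-004521"

fields = message.split("|")
segment_type, observation_value = fields[0], fields[5]
if segment_type == "OBX" and observation_value.startswith("IMGKEY:"):
    image_key = observation_value.split(":", 1)[1]
    # The key would be stored in the clinical repository and later used
    # by the chart browser to request the image from the image server.
    print("retrieve image for key:", image_key)
```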
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Robert Charles; Lukens, Wayne W.
The proposed Yucca Mountain repository, located in southern Nevada, is to be the first facility for permanent disposal of spent reactor fuel and high-level radioactive waste in the United States. Total Systems Performance Assessment (TSPA) analysis has indicated that among the major radionuclides contributing to dose are technetium, iodine, and neptunium, all of which are highly mobile in the environment. Containment of these radionuclides within the repository is a priority for the Yucca Mountain Project (YMP). These proceedings review current research and technology efforts for sequestration of the radionuclides with a focus on technetium, iodine, and neptunium. This workshop also covered issues concerning the Yucca Mountain environment and getter characteristics required for potential placement into the repository.
Analysis of model output and science data in the Virtual Model Repository (VMR).
NASA Astrophysics Data System (ADS)
De Zeeuw, D.; Ridley, A. J.
2014-12-01
Big scientific data not only includes large repositories of data from scientific platforms like satellites and ground observation, but also the vast output of numerical models. The Virtual Model Repository (VMR) provides scientific analysis and visualization tools for many numerical models of the Earth-Sun system. Individual runs can be analyzed in the VMR and compared to relevant data through associated metadata, and larger collections of runs can now also be studied and statistics generated on the accuracy and tendencies of model output. The vast model repository at the CCMC with over 1000 simulations of the Earth's magnetosphere was used to look at overall trends in accuracy when compared to satellites such as GOES, Geotail, and Cluster. Methodology for this analysis as well as case studies will be presented.
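The kind of model-versus-satellite accuracy statistics described can be sketched as follows; the data are synthetic and the VMR's actual tooling is not specified in the abstract:

```python
# Sketch of a model-data comparison: interpolate model output to observation
# times, then compute error statistics. All values here are invented.
import numpy as np

def skill(model_t, model_v, obs_t, obs_v):
    """Interpolate model output to observation times, then score it."""
    m = np.interp(obs_t, model_t, model_v)         # model sampled at obs times
    err = m - np.asarray(obs_v)
    rmse = float(np.sqrt(np.mean(err ** 2)))       # root-mean-square error
    corr = float(np.corrcoef(m, obs_v)[0, 1])      # linear correlation
    return rmse, corr

model_t = np.linspace(0.0, 24.0, 97)               # model output times (hours)
model_v = 40.0 * np.sin(model_t / 24.0 * 2 * np.pi)  # e.g. a field in nT
obs_t = np.arange(0.5, 24.0, 1.0)                  # satellite sample times
obs_v = np.interp(obs_t, model_t, model_v) + np.random.normal(0, 5, obs_t.size)

print(skill(model_t, model_v, obs_t, obs_v))
```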
Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian
2017-06-05
Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capturing and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of the study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both the ISO 11179 metadata standard and the Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated that the enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.
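As a hedged sketch of how CDE metadata rendered in RDF can be queried, the following uses rdflib with an invented namespace and file name; it does not reproduce the caDSR/CIMI vocabulary used in the study:

```python
# Querying CDE metadata in RDF with rdflib's SPARQL engine. The file name,
# namespace and predicate names are invented stand-ins for illustration.
from rdflib import Graph

g = Graph()
g.parse("tcga_cdes.ttl", format="turtle")  # hypothetical local RDF dump

QUERY = """
PREFIX ex: <http://example.org/cde#>
SELECT ?cde ?label WHERE {
    ?cde a ex:CommonDataElement ;
         ex:domain "clinical pharmaceutical" ;
         ex:label ?label .
}
"""

for row in g.query(QUERY):
    print(row.cde, row.label)
```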
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-06-18
This document provides a base case description of the rural Clark County community of Indian Springs in anticipation of change associated with the proposed high-level nuclear waste repository at Yucca Mountain. As the community closest to the proposed site, Indian Springs may be seen by site characterization workers, as well as workers associated with later repository phases, as a logical place to live. This report develops and updates information relating to a broad spectrum of socioeconomic variables, thereby providing a "snapshot" or "base case" look at Indian Springs in early 1992. With this as a background, future repository-related developments may be analytically separated from changes brought about by other factors, thus allowing for the assessment of the magnitude of local changes associated with the proposed repository. Given the size of the community, changes that may be considered small in an absolute sense may have relatively large impacts at the local level. Indian Springs is, in many respects, a unique community and a community of contrasts. An unincorporated town, it is a small yet important enclave of workers on large federal projects and home to employees of small-scale businesses and services. It is a rural community, but it is also close to the urbanized Las Vegas Valley. It is a desert community, but has good water resources. It is on flat terrain, but it is located within 20 miles of the tallest mountains in Nevada. It is a town in which various interest groups diverge on issues of local importance, but a sense of community remains an important feature of life. Finally, it has a sociodemographic history of both surface transience and underlying stability. If local land becomes available, Indian Springs has some room for growth but must first consider the historical effects of growth on the town and its desired direction for the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faybishenko, Boris; Birkholzer, Jens; Persoff, Peter
2016-09-01
The goal of the Fifth Worldwide Review is to document evolution in the state-of-the-art of approaches for nuclear waste disposal in geological formations since the Fourth Worldwide Review, which was released in 2006. The ten years since the previous Worldwide Review have seen major developments in a number of nations throughout the world pursuing geological disposal programs, both in preparing and in reviewing safety cases for the operational and long-term safety of proposed and operating repositories. The countries that are approaching implementation of geological disposal will increasingly focus on the feasibility of safely constructing and operating their repositories in the short and long terms on the basis of existing regulations. The WWR-5 will also address a number of specific technical issues in safety case development along with the interplay among stakeholder concerns, technical feasibility, engineering design issues, and operational and post-closure safety. Preparation and publication of the Fifth Worldwide Review on nuclear waste disposal facilitates assessing the lessons learned and developing future cooperation between the countries. The Report provides scientific and technical experiences on preparing for and developing scientific and technical bases for nuclear waste disposal in deep geologic repositories in terms of requirements, societal expectations and the adequacy of cases for long-term repository safety. The Chapters include potential issues that may arise as repository programs mature, and identify techniques that demonstrate the safety cases and aid in promoting and gaining societal confidence. The report will also be used to exchange experience with other fields of industry and technology, in which concepts similar to the design and safety cases are applied, as well as to facilitate the public perception and understanding of the safety of the disposal approaches relative to risks that may increase over long time frames in the absence of a successful implementation of final dispositioning.
Modeling Coupled Processes in Clay Formations for Radioactive Waste Disposal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Hui-Hai; Rutqvist, Jonny; Zheng, Liange
As a result of the termination of the Yucca Mountain Project, the United States Department of Energy (DOE) has started to explore various alternative avenues for the disposition of used nuclear fuel and nuclear waste. The overall scope of the investigation includes temporary storage, transportation issues, permanent disposal, various nuclear fuel types, processing alternatives, and resulting waste streams. Although geologic disposal is not the only alternative, it is still the leading candidate for permanent disposal. The realm of geologic disposal also offers a range of geologic environments that may be considered, among them clay/shale formations. Figure 1-1 presents the distribution of clay/shale formations within the USA. Clay rock/shale has been considered as potential host rock for geological disposal of high-level nuclear waste throughout the world, because of its low permeability, low diffusion coefficient, high retention capacity for radionuclides, and capability to self-seal fractures induced by tunnel excavation. For example, Callovo-Oxfordian argillites at the Bure site, France (Fouche et al., 2004), Toarcian argillites at the Tournemire site, France (Patriarche et al., 2004), Opalinus clay at the Mont Terri site, Switzerland (Meier et al., 2000), and Boom clay at the Mol site, Belgium (Barnichon et al., 2005) have all been under intensive scientific investigation (at both field and laboratory scales) for understanding a variety of rock properties and their relations with flow and transport processes associated with geological disposal of nuclear waste. Clay/shale formations may be generally classified as indurated and plastic clays (Tsang et al., 2005). The latter (including Boom clay) is a softer material without high cohesion; its deformation is dominantly plastic. For both clay rocks, coupled thermal, hydrological, mechanical and chemical (THMC) processes are expected to have a significant impact on the long-term safety of a clay repository. For example, the excavation-damaged zone (EDZ) near repository tunnels can modify local permeability (resulting from induced fractures), potentially leading to less confinement capability (Tsang et al., 2005). Because of clay's swelling and shrinkage behavior (depending on whether the clay is in imbibition or drainage processes), fracture properties in the EDZ are quite dynamic and evolve over time as hydromechanical conditions change. Understanding and modeling the coupled processes and their impact on repository performance is critical for the defensible performance assessment of a clay repository. Within the Natural Barrier System (NBS) group of the Used Fuel Disposition (UFD) Campaign at DOE's Office of Nuclear Energy, LBNL's research activities have focused on understanding and modeling such coupled processes. LBNL provided a report this April on a literature survey of studies on coupled processes in clay repositories and identification of technical issues and knowledge gaps (Tsang et al., 2010). This report documents other LBNL research activities within the natural system work package, including the development of constitutive relationships for elastic deformation of clay rock (Section 2), a THM modeling study (Section 3) and a THC modeling study (Section 4). The purpose of the THM and THC modeling studies is to demonstrate the current modeling capabilities in dealing with coupled processes in a potential clay repository. In Section 5, we discuss potential future R&D work based on the identified knowledge gaps. The linkage between these activities and related FEPs is presented in Section 6.
Usability Evaluation of a Research Repository and Collaboration Web Site
ERIC Educational Resources Information Center
Zhang, Tao; Maron, Deborah J.; Charles, Christopher C.
2013-01-01
This article reports results from an empirical usability evaluation of Human-Animal Bond Research Initiative Central as part of the effort to develop an open access research repository and collaboration platform for human-animal bond researchers. By repurposing and altering key features of the original HUBzero system, Human-Animal Bond Research…
Combining computational models, semantic annotations and simulation experiments in a graph database
Henkel, Ron; Wolkenhauer, Olaf; Waltemath, Dagmar
2015-01-01
Model repositories such as the BioModels Database, the CellML Model Repository or JWS Online are frequently accessed to retrieve computational models of biological systems. However, their storage concepts support only restricted types of queries and not all data inside the repositories can be retrieved. In this article we present a storage concept that meets this challenge. It is grounded in a graph database, reflects the models' structure, incorporates semantic annotations and simulation descriptions and ultimately connects different types of model-related data. The connections between heterogeneous model-related data and bio-ontologies enable efficient search via biological facts and grant access to new model features. The introduced concept notably improves access to computational models and associated simulations in a model repository. This has positive effects on tasks such as model search, retrieval, ranking, matching and filtering. Furthermore, our work for the first time enables CellML- and Systems Biology Markup Language-encoded models to be effectively maintained in one database. We show how these models can be linked via annotations and queried. Database URL: https://sems.uni-rostock.de/projects/masymos/ PMID:25754863
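A hedged sketch of the kind of query such a graph store enables, using the Neo4j Python driver: find models whose components carry a given ontology annotation. The node labels, relationship types, URI and credentials are invented and do not reproduce the actual schema; a running Neo4j instance is assumed:

```python
# Illustrative graph query over a model repository. Schema and credentials
# are hypothetical; this is not the database's documented layout.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
MATCH (m:Model)-[:HAS_COMPONENT]->(c)-[:ANNOTATED_WITH]->(t:OntologyTerm)
WHERE t.name = $term
RETURN m.name AS model, c.name AS component
"""

with driver.session() as session:
    # Find every model containing a component annotated with the given term.
    for record in session.run(CYPHER, term="glucose"):
        print(record["model"], record["component"])
driver.close()
```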
Documenting the use of computers in Swedish Health Care up to 1980.
Peterson, H E; Lundin, P
2011-01-01
This paper describes a documentation project to create, collect and preserve previously unavailable sources on informatics in Sweden (including health care as one of 16 subgroups), and to make them available on the Web. Time was critical, as the personal documentation and artifacts of early pioneers could be irretrievably lost. The criteria for participation were that a person had developed a system in a clinical environment which was used by others prior to 1980. Participants were interviewed and asked for early documentation such as notes, minutes from meetings, drawings, test results and early models, together with related artifacts. The approach included traditional oral history interviews, collection of autobiographies and new self-structuring and time-saving methods, such as witness seminars and an Internet-based repository of their recollections (the Writers' Web). The combination of methods yielded new information on system errors and on the challenges in reaching the goals, due partly to inadequacies of the early technology and partly to insufficient understanding of the complexity of the many problems which needed to be solved before a useful electronic patient record could be realized. A very important result was the development of a method to collect information in an easier, faster and much less expensive way than the traditional scientific method, while still reaching results that are qualitative and quantitative for the purpose of documenting the early period of computer-based health care technology. The witness seminars and the Writers' Web yielded especially large amounts of hitherto-unknown information. With all material in one database available to everyone on the Web, it is accessed very frequently, especially by students, researchers, journalists and teachers. Study of the materials explains and clarifies the reasons behind the delays and difficulties that have been encountered in developing electronic patient records, as described in an article [3] published in the IMIA Yearbook 2006.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faybishenko, Boris; Birkholzer, Jens; Sassani, David
The overall objective of the Fifth Worldwide Review (WWR-5) is to document the current state-of-the-art of major developments in a number of nations throughout the world pursuing geological disposal programs, and to summarize challenging problems and experience that have been obtained in siting, preparing and reviewing cases for the operational and long-term safety of proposed and operating nuclear waste repositories. The scope of the Review is to address current specific technical issues and challenges in safety case development along with the interplay of technical feasibility, siting, engineering design issues, and operational and post-closure safety. In particular, the chapters included in the report present the following types of information: the current status of the deep geological repository programs for high level nuclear waste and low- and intermediate level nuclear waste in each country, concepts of siting and radioactive waste and spent nuclear fuel management in different countries (with the emphasis on nuclear waste disposal under different climatic conditions and different geological formations), progress in repository site selection and site characterization, technology development, buffer/backfill materials studies and testing, support activities, programs, and projects, international cooperation, and future plans, as well as regulatory issues and transboundary problems.
Geohydrologic aspects for siting and design of low-level radioactive-waste disposal
Bedinger, M.S.
1989-01-01
The objective for siting and design of low-level radioactive-waste repository sites is to isolate the waste from the biosphere until the waste no longer poses an unacceptable hazard as a result of radioactive decay. Low-level radioactive waste commonly is isolated at shallow depths with various engineered features to stabilize the waste and to reduce its dissolution and transport by ground water. The unsaturated zone generally is preferred for isolating the waste. Low-level radioactive waste may need to be isolated for 300 to 500 years. Maintenance and monitoring of the repository site are required by Federal regulations for only the first 100 years. Therefore, geohydrology of the repository site needs to provide natural isolation of the waste for the hazardous period following maintenance of the site. Engineering design of the repository needs to be compatible with the natural geohydrologic conditions at the site. Studies at existing commercial and Federal waste-disposal sites provide information on the problems encountered and the basis for establishing siting guidelines for improved isolation of radioactive waste, engineering design of repository structures, and surveillance needs to assess the effectiveness of the repositories and to provide early warning of problems that may require remedial action. Climate directly affects the hydrology of a site and probably is the most important single factor that affects the suitability of a site for shallow-land burial of low-level radioactive waste. Humid and subhumid regions are not well suited for shallow isolation of low-level radioactive waste in the unsaturated zone; arid regions with zero to small infiltration from precipitation, great depths to the water table, and long flow paths to natural discharge areas are naturally well suited to isolation of the waste. The unsaturated zone is preferred for isolation of low-level radioactive waste. The guiding rationale is to minimize contact of water with the waste and to minimize transport of waste from the repository. The hydrology of a flow system containing a repository is greatly affected by the engineering of the repository site. Prediction of the performance of the repository is a complex problem, hampered by problems of characterizing the natural and manmade features of the flow system and by the limitations of models to predict flow and geochemical processes in the saturated and unsaturated zones. Disposal in low-permeability unfractured clays in the saturated zone may be feasible where the radionuclide transport is controlled by diffusion rather than advection.
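The closing contrast between diffusion- and advection-controlled transport can be made quantitative with standard textbook transport theory (an aside, not taken from this report): the one-dimensional advection-dispersion equation and its Peclet number,

```latex
\frac{\partial C}{\partial t}
  = D\,\frac{\partial^{2} C}{\partial x^{2}}
  - v\,\frac{\partial C}{\partial x},
\qquad
\mathrm{Pe} = \frac{v\,L}{D},
```

where C is concentration, v the average pore-water velocity, D the effective diffusion-dispersion coefficient, and L a characteristic length. Pe << 1 marks the diffusion-controlled regime sought in low-permeability unfractured clays; Pe >> 1 marks advection control.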
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1986-02-01
This Environmental Assessment (EA) supports the DOE proposal to Congress to construct and operate a facility for monitored retrievable storage (MRS) of spent fuel at a site on the Clinch River in the Roane County portion of Oak Ridge, Tennessee. The first part of this document is an assessment of the value of, need for, and feasibility of an MRS facility as an integral component of the waste management system. The second part is an assessment and comparison of the potential environmental impacts projected for each of six site-design combinations. The MRS facility would be centrally located with respect to existing reactors, and would receive spent fuel and seal it in canisters in preparation for shipment to and disposal in a geologic repository. 207 refs., 57 figs., 132 tabs.
Duncan, R G; Saperia, D; Dulbandzhyan, R; Shabot, M M; Polaschek, J X; Jones, D T
2001-01-01
The advent of the World-Wide-Web protocols and client-server technology has made it easy to build low-cost, user-friendly, platform-independent graphical user interfaces to health information systems and to integrate the presentation of data from multiple systems. The authors describe a Web interface for a clinical data repository (CDR) that was moved from concept to production status in less than six months using a rapid prototyping approach, multi-disciplinary development team, and off-the-shelf hardware and software. The system has since been expanded to provide an integrated display of clinical data from nearly 20 disparate information systems.
Wu, Huiqun; Wei, Yufang; Shang, Yujuan; Shi, Wei; Wang, Lei; Li, Jingjing; Sang, Aimin; Shi, Lili; Jiang, Kui; Dong, Jiancheng
2018-06-06
Type 2 diabetes mellitus (T2DM) is a common chronic disease, and the fragmented data collected through separate vendors make continuous management of DM patients difficult. The lack of standardization of these fragmented data also makes further phenotyping based on the diabetic data difficult. Traditional T2DM data repositories only support data collection from T2DM patients, lack phenotyping ability and rely on standalone database designs, limiting the secondary use of these valuable data. To solve these issues, we proposed a novel, standards-based T2DM data repository framework. This repository can integrate data from various sources and can be used as a standardized record for further data transfer as well as integration. Phenotyping was conducted based on clinical guidelines with a KNIME workflow. To evaluate the phenotyping performance of the proposed system, data were collected from the local community by healthcare providers and then tested using the phenotyping algorithms. The results indicated that the proposed system could detect diabetic retinopathy (DR) cases with an average accuracy of about 82.8%. Furthermore, these results show promising potential for addressing fragmented data. The proposed system has integration and phenotyping abilities, which could be used for diabetes research in future studies.
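A rule-based phenotyping step of the kind run in such a guideline-driven workflow can be illustrated in a few lines of Python. The thresholds follow widely used ADA-style laboratory cut-offs (HbA1c >= 6.5%, fasting plasma glucose >= 7.0 mmol/L); the record layout is invented and this is not the authors' KNIME algorithm:

```python
# Illustrative rule-based T2DM phenotyping check. Record fields are invented;
# thresholds follow commonly cited ADA diagnostic criteria.

def t2dm_phenotype(record):
    """Flag a patient record that meets either laboratory criterion for T2DM."""
    hba1c = record.get("hba1c_percent")
    fpg = record.get("fasting_glucose_mmol_l")
    by_hba1c = hba1c is not None and hba1c >= 6.5
    by_fpg = fpg is not None and fpg >= 7.0
    return by_hba1c or by_fpg

patients = [
    {"id": "p1", "hba1c_percent": 7.1},
    {"id": "p2", "fasting_glucose_mmol_l": 5.4},
]
print([p["id"] for p in patients if t2dm_phenotype(p)])  # ['p1']
```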
Thermal Analysis of a Nuclear Waste Repository in Argillite Host Rock
NASA Astrophysics Data System (ADS)
Hadgu, T.; Gomez, S. P.; Matteo, E. N.
2017-12-01
Disposal of high-level nuclear waste in a geological repository requires analysis of heat distribution as a result of decay heat. Such an analysis supports design of the repository layout to define the repository footprint, as well as providing information of importance to the overall design. The analysis is also used in the study of potential migration of radionuclides to the accessible environment. In this study, thermal analysis for high-level waste and spent nuclear fuel in a generic repository in argillite host rock is presented. The thermal analysis utilized both semi-analytical and numerical modeling in the near field of a repository. The semi-analytical method treats heat transport by conduction in the repository and surroundings. The results of this method are temperature histories at selected radial distances from the waste package. A 3-D thermal-hydrologic numerical simulation was also conducted to study fluid and heat distribution in the near field. The thermal analysis assumed a generic geological repository at 500 m depth. For the semi-analytical method, a backfilled closed repository was assumed with basic design and material properties. For the thermal-hydrologic numerical method, a repository layout with disposal in horizontal boreholes was assumed. The 3-D modeling domain covers a limited portion of the repository footprint to enable a detailed thermal analysis. A highly refined unstructured mesh was used with increased discretization near heat sources and at intersections of different materials. All simulations considered different parameter values for properties of components of the engineered barrier system (i.e. buffer, disturbed rock zone and the host rock), and different surface storage times. Results of the different modeling cases are presented and include temperature and fluid flow profiles in the near field at different simulation times. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8295 A.
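A minimal sketch of the conduction-only, semi-analytical idea is the classical infinite-line-source solution for the temperature rise at radial distance r from a heater of constant power per unit length; real decay heat would be handled by superposing time-shifted sources. The property values are generic argillite-like placeholders, not the paper's inputs:

```python
# Classical line-source conduction solution (constant power for simplicity).
import numpy as np
from scipy.special import exp1  # exponential integral E1

def line_source_dT(r, t, q_per_m=100.0, k=1.8, alpha=6.0e-7):
    """Temperature rise (K) at radius r (m) and time t (s).

    q_per_m : heater power per unit length, W/m (placeholder value)
    k       : thermal conductivity, W/(m K)  (placeholder value)
    alpha   : thermal diffusivity, m^2/s     (placeholder value)
    """
    return q_per_m / (4.0 * np.pi * k) * exp1(r**2 / (4.0 * alpha * t))

years = np.array([1.0, 10.0, 100.0]) * 3.15576e7   # times in seconds
for r in (0.5, 2.0, 10.0):
    print(r, np.round(line_source_dT(r, years), 1))  # temperature histories
```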
Chemical markup, XML and the World-Wide Web. 3. Toward a signed semantic chemical web of trust.
Gkoutos, G V; Murray-Rust, P; Rzepa, H S; Wright, M
2001-01-01
We describe how a collection of documents expressed in XML-conforming languages such as CML and XHTML can be authenticated and validated against digital signatures which make use of established X.509 certificate technology. These can be associated either with specific nodes in the XML document or with the entire document. We illustrate this with two examples. An entire journal article expressed in XML has its individual components digitally signed by separate authors, and the collection is placed in an envelope and again signed. The second example involves using a software robot agent to acquire a collection of documents from a specified URL, to perform various operations and transformations on the content, including expressing molecules in CML, and to automatically sign the various components and deposit the result in a repository. We argue that these operations can be used as components for building what we term an authenticated and semantic chemical web of trust.
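The signing primitive underlying such a scheme can be sketched with the Python cryptography package: hash a serialized XML fragment and sign it with an RSA private key. Real XML-DSig with X.509 certificates, as used in the paper, additionally requires canonicalization and a signature envelope, which are omitted here:

```python
# Sign and verify a serialized XML fragment. The fragment is invented; a real
# deployment would use a certificate-backed key and full XML-DSig processing.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

fragment = b"<molecule id='m1'><formula>C6H6</formula></molecule>"
signature = key.sign(fragment, padding.PKCS1v15(), hashes.SHA256())

# Verification raises InvalidSignature if the fragment was altered.
key.public_key().verify(signature, fragment, padding.PKCS1v15(), hashes.SHA256())
print("signature verified,", len(signature), "bytes")
```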
Roadmap to a Comprehensive Clinical Data Warehouse for Precision Medicine Applications in Oncology
Foran, David J; Chen, Wenjin; Chu, Huiqi; Sadimin, Evita; Loh, Doreen; Riedlinger, Gregory; Goodell, Lauri A; Ganesan, Shridar; Hirshfield, Kim; Rodriguez, Lorna; DiPaola, Robert S
2017-01-01
Leading institutions throughout the country have established Precision Medicine programs to support personalized treatment of patients. A cornerstone for these programs is the establishment of enterprise-wide Clinical Data Warehouses. Working shoulder-to-shoulder, a team of physicians, systems biologists, engineers, and scientists at Rutgers Cancer Institute of New Jersey have designed, developed, and implemented the Warehouse with information originating from data sources, including Electronic Medical Records, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology and Pathology archives, and Next Generation Sequencing services. Innovative solutions were implemented to detect and extract unstructured clinical information that was embedded in paper/text documents, including synoptic pathology reports. Supporting important precision medicine use cases, the growing Warehouse enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information of patient tumors individually or as part of large cohorts to identify changes and patterns that may influence treatment decisions and potential outcomes. PMID:28469389
Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2015-12-01
Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.
Potter, C.J.; Day, W.C.; Sweetkind, D.S.; Dickerson, R.P.
2004-01-01
Geologic mapping and fracture studies have documented the fundamental patterns of joints and faults in the thick sequence of rhyolite tuffs at Yucca Mountain, Nevada, the proposed site of an underground repository for high-level radioactive waste. The largest structures are north-striking, block-bounding normal faults (with a subordinate left-lateral component) that divide the mountain into numerous 1-4-km-wide panels of gently east-dipping strata. Block-bounding faults, which underwent Quaternary movement as well as earlier Neogene movement, are linked by dominantly northwest-striking relay faults, especially in the more extended southern part of Yucca Mountain. Intrablock faults are commonly short and discontinuous, except those on the more intensely deformed margins of the blocks. Lithologic properties of the local tuff stratigraphy strongly control the mesoscale fracture network, and locally the fracture network has a strong influence on the nature of intrablock faulting. The least faulted part of Yucca Mountain is the north-central part, the site of the proposed repository. Although bounded by complex normal-fault systems, the 4-km-wide central block contains only sparse intrablock faults. Locally intense jointing appears to be strata-bound. The complexity of deformation and the magnitude of extension increase in all directions away from the proposed repository volume, especially in the southern part of the mountain where the intensity of deformation and the amount of vertical-axis rotation increase markedly. Block-bounding faults were active at Yucca Mountain during and after eruption of the 12.8-12.7 Ma Paintbrush Group, and significant motion on these faults postdated the 11.6 Ma Rainier Mesa Tuff. Diminished fault activity continued into Quaternary time. Roughly half of the stratal tilting in the site area occurred after 11.6 Ma, probably synchronous with the main pulse of vertical-axis rotation, which occurred between 11.6 and 11.45 Ma. Studies of sequential formation of tectonic joints, in the context of regional paleostress studies, indicate that north- and northwest-striking joint sets formed coevally with the main faulting episode during regional east-northeast-west-southwest extension and that a prominent northeast-striking joint set formed later, probably after 9 Ma. These structural analyses contribute to the understanding of several important issues at Yucca Mountain, including potential hydrologic pathways, seismic hazards, and fault-displacement hazards. © 2004 Geological Society of America.
Data publication, documentation and user friendly landing pages - improving data discovery and reuse
NASA Astrophysics Data System (ADS)
Elger, Kirsten; Ulbricht, Damian; Bertelmann, Roland
2016-04-01
Research data are the basis for scientific research and are often irreplaceable (e.g. observational data). Storage of such data in appropriate, theme-specific or institutional repositories is an essential part of ensuring their long-term preservation and access. Free and open access to research data for reuse and scrutiny has been identified as a key issue by the scientific community as well as by research agencies and the public. To ensure that datasets are intelligible and usable by others, they must be accompanied by comprehensive data descriptions and standardized metadata for data discovery, and ideally should be published using digital object identifiers (DOIs). DOIs make datasets citable, ensure their long-term accessibility, and are accepted in reference lists of journal articles (http://www.copdess.org/statement-of-commitment/). The GFZ German Research Centre for Geosciences is the national laboratory for Geosciences in Germany and part of the Helmholtz Association, Germany's largest scientific organization. The development and maintenance of data systems is a key component of 'GFZ Data Services' to support state-of-the-art research. The datasets archived in and published by the GFZ Data Repository cover all geoscientific disciplines and range from large dynamic datasets deriving from global monitoring seismic or geodetic networks with real-time data acquisition, to remotely sensed satellite products, to automatically generated data publications from a database for data from micrometeorological stations, to various model results, to geochemical and rock mechanical analyses from various labs, and field observations. The user-friendly presentation of published datasets via a DOI landing page is as important for reuse as the storage itself, and the required information is highly specific to each scientific discipline. If dataset descriptions are too general, or require the download of a dataset before its suitability can be judged, many researchers decide not to reuse a published dataset. In contrast to large data repositories without thematic specialization, theme-specific data repositories have extensive expertise in data discovery and the opportunity to develop usable, discipline-specific formats and layouts for specific datasets, including consultation on different formats for the data description (e.g., via a Data Report or an article in a Data Journal) with full consideration of international metadata standards.
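One concrete way landing-page metadata becomes machine-actionable is DOI content negotiation, where the DOI resolver returns structured metadata instead of the HTML landing page. A short sketch follows; the DOI below is a made-up placeholder, so substitute a real dataset DOI before running:

```python
# Fetch citation metadata for a dataset DOI via content negotiation.
import requests

doi = "10.5880/EXAMPLE.2016.001"  # hypothetical placeholder DOI
resp = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
resp.raise_for_status()           # a placeholder DOI will fail here
meta = resp.json()
print(meta.get("title"), meta.get("publisher"), meta.get("issued"))
```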
Exploring a New Model for Preprint Server: A Case Study of CSPO
ERIC Educational Resources Information Center
Hu, Changping; Zhang, Yaokun; Chen, Guo
2010-01-01
This paper describes the introduction of an open-access preprint server in China covering 43 disciplines. The system includes mandatory deposit for state-funded research and reports on the repository and its effectiveness and outlines a novel process of peer-review of preprints in the repository, which can be incorporated into the established…
NASA Astrophysics Data System (ADS)
McWhirter, J.; Boler, F. M.; Bock, Y.; Jamason, P.; Squibb, M. B.; Noll, C. E.; Blewitt, G.; Kreemer, C. W.
2010-12-01
Three geodesy Archive Centers, Scripps Orbit and Permanent Array Center (SOPAC), NASA's Crustal Dynamics Data Information System (CDDIS) and UNAVCO are engaged in a joint effort to define and develop a common Web Service Application Programming Interface (API) for accessing geodetic data holdings. This effort is funded by the NASA ROSES ACCESS Program to modernize the original GPS Seamless Archive Centers (GSAC) technology which was developed in the 1990s. A new web service interface, the GSAC-WS, is being developed to provide uniform and expanded mechanisms through which users can access our data repositories. In total, our respective archives hold tens of millions of files and contain a rich collection of site/station metadata. Though we serve similar user communities, we currently provide a range of different access methods, query services and metadata formats. This leads to a lack of consistency in the user's experience and a duplication of engineering efforts. The GSAC-WS API and its reference implementation in an underlying Java-based GSAC Service Layer (GSL) support metadata and data queries into site/station oriented data archives. The general nature of this API makes it applicable to a broad range of data systems. The overall goals of this project include providing consistent and rich query interfaces for end users and client programs, the development of enabling technology to facilitate third party repositories in developing these web service capabilities and to enable the ability to perform data queries across a collection of federated GSAC-WS enabled repositories. A fundamental challenge faced in this project is to provide a common suite of query services across a heterogeneous collection of data yet enabling each repository to expose their specific metadata holdings. To address this challenge we are developing a "capabilities" based service where a repository can describe its specific query and metadata capabilities. Furthermore, the architecture of the GSL is based on a model-view paradigm that decouples the underlying data model semantics from particular representations of the data model. This will allow for the GSAC-WS enabled repositories to evolve their service offerings to incorporate new metadata definition formats (e.g., ISO-19115, FGDC, JSON, etc.) and new techniques for accessing their holdings. Building on the core GSAC-WS implementations the project is also developing a federated/distributed query service. This service will seamlessly integrate with the GSAC Service Layer and will support data and metadata queries across a collection of federated GSAC repositories.
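A client for such a capabilities-based repository service might look like the following sketch. The base URL, endpoint paths and parameter names are invented placeholders, since the abstract does not define the final GSAC-WS URL scheme:

```python
# Hypothetical client for a capabilities-style repository web service.
import requests

BASE = "https://repository.example.org/gsacws"  # invented deployment URL

def capabilities():
    """Ask the repository which query and metadata capabilities it supports."""
    return requests.get(f"{BASE}/capabilities", timeout=30).json()

def find_sites(north, south, east, west):
    """Query site/station metadata inside a bounding box."""
    params = {"minlat": south, "maxlat": north,
              "minlon": west, "maxlon": east, "output": "json"}
    return requests.get(f"{BASE}/sites/search", params=params, timeout=30).json()

print(capabilities())
print(find_sites(42.0, 32.0, -110.0, -125.0))
```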
WASTE HANDLING BUILDING VENTILATION SYSTEM DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.A. Kumar
2000-06-21
The Waste Handling Building Ventilation System provides heating, ventilation, and air conditioning (HVAC) for the contaminated, potentially contaminated, and uncontaminated areas of the Monitored Geologic Repository's (MGR) Waste Handling Building (WHB). In the uncontaminated areas, the non-confinement area ventilation system maintains the proper environmental conditions for equipment operation and personnel comfort. In the contaminated and potentially contaminated areas, in addition to maintaining the proper environmental conditions for equipment operation and personnel comfort, the contamination confinement area ventilation system directs potentially contaminated air away from personnel in the WHB and confines the contamination within high-efficiency particulate air (HEPA) filtration units. The contamination confinement areas ventilation system creates airflow paths and pressure zones to minimize the potential for spreading contamination within the building. The contamination confinement ventilation system also protects the environment and the public by limiting airborne releases of radioactive or other hazardous contaminants from the WHB. The Waste Handling Building Ventilation System is designed to perform its safety functions under accident conditions and other Design Basis Events (DBEs) (such as earthquakes, tornadoes, fires, and loss of the primary electric power). Additional system design features (such as compartmentalization with independent subsystems) limit the potential for cross-contamination within the WHB. The system provides status of important system parameters and equipment operation, and provides audible and/or visual indication of off-normal conditions and equipment failures. The Waste Handling Building Ventilation System confines the radioactive and hazardous material within the building such that the release rates comply with regulatory limits. The system design, operations, and maintenance activities incorporate ALARA (as low as is reasonably achievable) principles to maintain personnel radiation doses to all occupational workers below regulatory limits and as low as is reasonably achievable. The Waste Handling Building Ventilation System interfaces with the Waste Handling Building System by being located within the WHB and by maintaining specific pressures, temperatures, and humidity within the building. The system also depends on the WHB for water supply. The system interfaces with the Site Radiological Monitoring System for continuous monitoring of the exhaust air; the Waste Handling Building Fire Protection System for detection of fire and smoke; the Waste Handling Building Electrical System for normal, emergency, and standby power; and the Monitored Geologic Repository Operations Monitoring and Control System for monitoring and control of the system.
The effect of iron on montmorillonite stability. (I) Background and thermodynamic considerations
NASA Astrophysics Data System (ADS)
Wilson, James; Savage, David; Cuadros, Javier; Shibata, Masahiro; Ragnarsdottir, K. Vala
2006-01-01
It is envisaged that high-level nuclear waste (HLW) will be disposed of in underground repositories. Many proposed repository designs include steel waste canisters and bentonite backfill. Natural analogues and experimental data indicate that the montmorillonite component of the backfill could react with steel corrosion products to produce non-swelling Fe-rich phyllosilicates such as chamosite, berthierine, or Fe-rich smectite. In K-bearing systems, the alteration of montmorillonite to illite/glauconite could also be envisaged. If montmorillonite were altered to non-swelling minerals, the swelling capacity and self-healing properties of the bentonite backfill could be reduced, thereby diminishing backfill performance. The main aim of this paper was to investigate Fe-rich phyllosilicate mineral stability at the canister-backfill interface using thermodynamic modelling. Estimates of thermodynamic properties were made for Fe-rich clay minerals in order to construct approximate phase relations for end-member/simplified mineral compositions in logarithmic activity space. Logarithmic activity diagrams (for the system Al2O3-FeO-Fe2O3-MgO-Na2O-SiO2-H2O) suggest that if pore waters are supersaturated with respect to magnetite in HLW repositories, Fe(II)-rich saponite is the most likely montmorillonite alteration product (if fO2 values are significantly lower than magnetite-hematite equilibrium). Therefore, the alteration of montmorillonite may not be detrimental to nuclear waste repositories that include Fe, as long as the swelling behaviour of the Fe-rich smectite produced is maintained. If fO2 exceeds magnetite-hematite equilibrium, and solutions are saturated with respect to magnetite in HLW repositories, berthierine is likely to be more stable than smectite minerals. The alteration of montmorillonite to berthierine could be detrimental to the performance of HLW repositories.
The Victor C++ library for protein representation and advanced manipulation.
Hirsh, Layla; Piovesan, Damiano; Giollo, Manuel; Ferrari, Carlo; Tosatto, Silvio C E
2015-04-01
Protein sequence and structure representation and manipulation require dedicated software libraries to support methods of increasing complexity. Here, we describe the VIrtual Construction TOol for pRoteins (Victor) C++ library, an open source platform dedicated to enabling inexperienced users to develop advanced tools and to gathering contributions from the community. The provided application examples cover statistical energy potentials, profile-profile sequence alignments and ab initio loop modeling. Victor was used over the last 15 years in several publications and optimized for efficiency. It is provided as a GitHub repository with source files and unit tests, plus extensive online documentation, including a Wiki with help files and tutorials, examples and Doxygen documentation. The C++ library and online documentation, distributed under a GPL license, are available from URL: http://protein.bio.unipd.it/victor/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Mobilization Studies List: 1978-1988. Volume 1. Main Document
1989-03-14
[Abstract not available: the scanned source pages consist of a distribution list of agency mailing addresses. The only recoverable text reads "... overemphasize the benefits of establishing a continuing link with the appropriate data repositories," signed Dale F. Means, Colonel, Corps of Engineers, Commander.]
Theory and Modelling Resources Cookbook
NASA Astrophysics Data System (ADS)
Gray, Norman
This cookbook is intended to assemble references to resources likely to be of interest to theorists and modellers. It's not a collection of standard recipes, but instead a repository of brief introductions to facilities. It includes references to sources of authoritative information, including those Starlink documents most likely to be of interest to theorists. Although the topics are chosen for their relevance to theoretical work, a good proportion of the information should be of interest to all of the astronomical computing community.
Signaling Network Map of Endothelial TEK Tyrosine Kinase
Sandhya, Varot K.; Singh, Priyata; Parthasarathy, Deepak; Kumar, Awinav; Gattu, Rudrappa; Mathur, Premendu Prakash; Mac Gabhann, F.; Pandey, Akhilesh
2014-01-01
TEK tyrosine kinase is primarily expressed on endothelial cells and is most commonly referred to as TIE2. TIE2 is a receptor tyrosine kinase modulated by its ligands, angiopoietins, to regulate the development and remodeling of vascular system. It is also one of the critical pathways associated with tumor angiogenesis and familial venous malformations. Apart from the vascular system, TIE2 signaling is also associated with postnatal hematopoiesis. Despite the involvement of TIE2-angiopoietin system in several diseases, the downstream molecular events of TIE2-angiopoietin signaling are not reported in any pathway repository. Therefore, carrying out a detailed review of published literature, we have documented molecular signaling events mediated by TIE2 in response to angiopoietins and developed a network map of TIE2 signaling. The pathway information is freely available to the scientific community through NetPath, a manually curated resource of signaling pathways. We hope that this pathway resource will provide an in-depth view of TIE2-angiopoietin signaling and will lead to identification of potential therapeutic targets for TIE2-angiopoietin associated disorders. PMID:25371820
Audit and Certification Process for Science Data Digital Repositories
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.
2011-12-01
Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently a repository should be evaluated on whether or not they are effective in their data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archive Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.
Continuous integration and quality control for scientific software
NASA Astrophysics Data System (ADS)
Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.
2013-08-01
Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central "Makefile". This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.
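One nightly quality-control step of the kind described, running a static analyzer and turning its findings into a small HTML report, can be sketched as follows. cppcheck is a real tool; the paths and report layout are illustrative, not the Wettzell group's actual infrastructure:

```python
# Run a static analyzer over the code base and write an HTML summary.
import html
import subprocess

result = subprocess.run(
    ["cppcheck", "--enable=all",
     "--template={file}:{line}: {severity}: {message}", "src/"],
    capture_output=True, text=True,
)
# cppcheck reports its findings on stderr by default.
findings = [ln for ln in result.stderr.splitlines() if ln.strip()]

rows = "\n".join(f"<li>{html.escape(ln)}</li>" for ln in findings)
with open("cppcheck_report.html", "w") as fh:
    fh.write(f"<html><body><h1>cppcheck findings: {len(findings)}</h1>"
             f"<ul>{rows}</ul></body></html>")
print(f"{len(findings)} findings written to cppcheck_report.html")
```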
Working More Productively: Tools for Administrative Data
Roos, Leslie L; Soodeen, Ruth-Ann; Bond, Ruth; Burchill, Charles
2003-01-01
Objective This paper describes a web-based resource that contains a series of tools for working with administrative data. This work in knowledge management represents an effort to document, find, and transfer concepts and techniques, both within the local research group and to a more broadly defined user community. Concepts and associated computer programs are made as "modular" as possible to facilitate easy transfer from one project to another. Study Setting/Data Sources Tools to work with a registry, longitudinal administrative data, and special files (survey and clinical) from the Province of Manitoba, Canada in the 1990–2003 period. Data Collection Literature review and analyses of web site utilization were used to generate the findings. Principal Findings The Internet-based Concept Dictionary and SAS macros developed in Manitoba are being used in a growing number of research centers. Nearly 32,000 hits from more than 10,200 hosts in a recent month demonstrate broad interest in the Concept Dictionary. Conclusions The tools, taken together, make up a knowledge repository and research production system that aid local work and have great potential internationally. Modular software provides considerable efficiency. The merging of documentation and researcher-to-researcher dissemination keeps costs manageable. PMID:14596394
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruegg, Janine; Gries, Corinna; Bond-Lamberty, Benjamin
An important goal of macrosystems ecology research is to advance understanding of ecological systems at both fine and broad temporal and spatial scales. Our premise in this paper is that such projects require information management that is integrated into projects from their inception. Such efforts will lead to improved communication and sharing of knowledge among diverse project participants, better science outcomes, and more open science. We promote "closing the data life cycle" by publishing well-documented data sets, which allows for re-use of data to answer new and different questions from the ones conceived by the original projects. The practice of documenting and submitting data sets to publicly accessible data repositories ensures that research results and data are accessible to and useable by other researchers, thus fostering open science. Ecologists are often not familiar with the information management tools and requirements to effectively preserve data, however, and receive little institutional or professional incentive to do so. This paper describes recommended steps to these ends, and gives examples from current macrosystem ecology projects of why information management is so critical to ensuring that scientific results can be both reproduced and data shared for future use.
Waste Form and Indrift Colloids-Associated Radionuclide Concentrations: Abstraction and Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Aguilar
This Model Report describes the analysis and abstractions of the colloids process model for the waste form and engineered barrier system components of the total system performance assessment calculations to be performed with the Total System Performance Assessment-License Application model. Included in this report is a description of (1) the types and concentrations of colloids that could be generated in the waste package from degradation of waste forms and the corrosion of the waste package materials, (2) types and concentrations of colloids produced from the steel components of the repository and their potential role in radionuclide transport, and (3) types and concentrations of colloids present in natural waters in the vicinity of Yucca Mountain. Additionally, attachment/detachment characteristics and mechanisms of colloids anticipated in the repository are addressed and discussed. The abstraction of the process model is intended to capture the most important characteristics of radionuclide-colloid behavior for use in predicting the potential impact of colloid-facilitated radionuclide transport on repository performance.
Functions of an engineered barrier system for a nuclear waste repository in basalt
NASA Astrophysics Data System (ADS)
Coons, W. E.; Moore, E. L.; Smith, M. J.; Kaser, J. D.
1980-01-01
The functions of components selected for an engineered barrier system for a nuclear waste repository in basalt are defined, providing a focal point for barrier material research and development by delineating the purpose and operative lifetime of each component of the engineered system. A five-component system (comprising waste form, canister, buffer, overpack, and tailored backfill) is discussed. Redundancy is provided by subsystems of physical and chemical barriers which act in concert with the geology to provide a formidable barrier to transport of hazardous materials to the biosphere. The barrier system is clarified by examples pertinent to storage in basalt, and a technical approach to barrier design and material selection is proposed.
A Fruit of Yucca Mountain: The Remote Waste Package Closure System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Skinner; Greg Housley; Colleen Shelton-Davis
2011-11-01
Was the death of the Yucca Mountain repository the fate of a technical lemon or a political lemon? Without caution, this debate could lure us away from capitalizing on the fruits of the project. In March 2009, Idaho National Laboratory (INL) successfully demonstrated the Waste Package Closure System, a full-scale prototype system for closing waste packages that were to be entombed in the now abandoned Yucca Mountain repository. This article describes the system, which INL designed and built, to weld the closure lids on the waste packages, nondestructively examine the welds using four different techniques, repair the welds if necessary, mitigate crack initiating stresses in the surfaces of the welds, evacuate and backfill the packages with an inert gas, and perform all of these tasks remotely. As a nation, we now have a proven method for securely sealing nuclear waste packages for long term storage—regardless of whether or not the future destination for these packages will be an underground repository. Additionally, many of the system's features and concepts may benefit other remote nuclear applications.
ERIC Educational Resources Information Center
Park, Jung-ran; Yang, Chris; Tosaka, Yuji; Ping, Qing; Mimouni, Houda El
2016-01-01
This study is part of a larger project that develops a sustainable digital repository of professional development resources on emerging data standards and technologies for data organization and management in libraries. Toward that end, the project team developed an automated workflow to crawl for, monitor, and classify relevant web objects…
The Challenges of Releasing Human Data for Analysis
NASA Technical Reports Server (NTRS)
Fitts, Mary; Van Baalen, Mary; Johnson-Throop, Kathy; Lee, Lesley; Havelka, Jacque; Wear, Mary; Thomas, Diedre M.
2011-01-01
The NASA Johnson Space Center's (NASA JSC) Committee for the Protection of Human Subjects (CPHS) recently approved the formation of two human data repositories: the Lifetime Surveillance of Astronaut Health Repository (LSAH-R) for clinical data and the Life Sciences Data Archive Repository (LSDA-R) for research data. The establishment of these repositories forms the foundation for the release of data and information beyond the scope for which the data were originally collected. The release of clinical and research data and information is primarily managed by two NASA groups: the Evidence Base Working Group (EBWG), consisting of members of both repositories, and the LSAH Policy Board. The goal of unifying these repositories and their processes is to provide a mutually supportive approach to handling medical and research data, to enhance the use of medical and research data to reduce risk, and to promote the understanding of space physiology, countermeasures and other mitigation strategies. Over the past year, both repositories have received over 100 data and information requests from a wide variety of requesters. The disposition of these requests has highlighted the challenges faced when attempting to make data collected on a unique set of subjects available beyond the original intent for which the data were collected. As the EBWG works through each request, many considerations must be taken into account when deciding what data can be shared and how: from the Privacy Act of 1974 and the Health Insurance Portability and Accountability Act (HIPAA), to NASA's Health Information Management System (10HIMS) and Human Experimental and Research Data Records (10HERD) access requirements. Additional considerations include the presence of the data in the repositories and vetting requesters for legitimacy of their use of the data. Additionally, fair access must be ensured for intramural as well as extramural investigators. All of this must be considered in the formulation of the charters, policies and workflows for the human data repositories at NASA.
PIMS-Universal Payload Information Management
NASA Technical Reports Server (NTRS)
Elmore, Ralph; McNair, Ann R. (Technical Monitor)
2002-01-01
As the overall manager and integrator of International Space Station (ISS) science payloads and experiments, the Payload Operations Integration Center (POIC) at Marshall Space Flight Center had a critical need to provide an information management system for the exchange and management of ISS payload files as well as to coordinate ISS payload-related operational changes. The POIC's information management system has a fundamental requirement to provide secure operational access not only to users physically located at the POIC, but also collaborative access to remote experimenters and International Partners. The Payload Information Management System (PIMS) is a ground-based electronic document configuration management and workflow system that was built to serve that need. Functionally, PIMS provides the following document management capabilities: 1. File access control, storage and retrieval from a central repository vault. 2. Collection of supplemental data about files in the vault. 3. File exchange with a PIMS GUI client, or any FTP connection. 4. File placement into an FTP-accessible dropbox for pickup by interfacing facilities, including files transmitted for spacecraft uplink. 5. Transmission of email messages notifying users of new version availability. 6. Polling of intermediate facility dropboxes for files that will automatically be processed by PIMS. 7. An API that allows other POIC applications to access PIMS information. Functionally, PIMS provides the following Change Request processing capabilities: 1. The ability to create, view, manipulate, and query information about Operations Change Requests (OCRs). 2. An adaptable workflow approval of OCRs with routing through developers, facility leads, POIC leads, reviewers, and implementers. Email messages can be sent to users either involving them in the workflow process or simply notifying them of OCR approval progress. All PIMS document management and OCR workflow controls are coordinated through and routed to individual users' "to do" list tasks. A user is given a task when it is their turn to perform some action relating to the approval of the document or OCR. The user's available actions are restricted to only the functions available for the assigned task. Certain actions, such as review or action implementation by non-PIMS users, can also be coordinated through automated emails.
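To make the dropbox-polling capability above (item 6) concrete, here is a minimal sketch of that pattern; the directory paths, the ingest step, and the function name are invented for illustration and are not PIMS code.

```python
# Hypothetical sketch of PIMS-style dropbox polling (capability 6 above):
# scan an FTP-accessible dropbox directory and move new files into the vault.
# Paths and the ingest logic are invented; real PIMS would also record
# supplemental metadata and send notification emails.
import shutil
from pathlib import Path

DROPBOX = Path("/data/pims/dropbox")  # placeholder path
VAULT = Path("/data/pims/vault")      # placeholder path

def poll_once():
    """One polling pass: ingest every file currently in the dropbox."""
    for f in DROPBOX.glob("*"):
        if f.is_file():
            shutil.move(str(f), VAULT / f.name)
            print(f"ingested {f.name}")

if __name__ == "__main__":
    poll_once()  # a real service would repeat this on a timer
```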
NASA Astrophysics Data System (ADS)
Pilone, D.; Gilman, J.; Baynes, K.; Shum, D.
2015-12-01
This talk introduces a new NASA Earth Observing System Data and Information System (EOSDIS) capability to automatically generate and maintain derived, Virtual Product information, allowing DAACs and Data Providers to create tailored and more discoverable variations of their products. After this talk the audience will be aware of the new EOSDIS Virtual Product capability, applications of it, and how to take advantage of it. Much of the data made available in EOSDIS is organized for generation and archival rather than for discovery and use. The EOSDIS Common Metadata Repository (CMR) is launching a new capability providing automated generation and maintenance of user-oriented Virtual Product information. DAACs can easily surface variations on established data products tailored to specific use cases and users, leveraging DAAC-exposed services such as custom ordering or access services like OPeNDAP for on-demand product generation and distribution. Virtual Data Products enjoy support for spatial and temporal information, keyword discovery, association with imagery, and are fully discoverable by tools such as NASA Earthdata Search, Worldview, and Reverb. Virtual Product generation has applicability across many use cases: - Describing derived products such as Surface Kinetic Temperature information (AST_08) from source products (ASTER L1A) - Providing streamlined access to data products (e.g. AIRS) containing many (>800) data variables covering an enormous variety of physical measurements - Attaching additional EOSDIS offerings such as Visual Metadata, external services, and documentation metadata - Publishing alternate formats for a product (e.g. netCDF for HDF products) with the actual conversion happening on request - Publishing granules to be modified by on-the-fly services, like GES-DISC's Data Quality Screening Service - Publishing "bundled" products where granules from one product correspond to granules from one or more other related products
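For readers unfamiliar with the CMR, the sketch below shows how collections (virtual or conventional) can be discovered through its public keyword search; the endpoint, parameters, and JSON layout follow the public CMR search interface as best understood here and should be treated as assumptions rather than part of the talk.

```python
# Minimal sketch: keyword search against the EOSDIS Common Metadata
# Repository (CMR). Endpoint, parameters, and response structure are
# assumptions based on the public CMR search interface.
import requests

CMR_SEARCH = "https://cmr.earthdata.nasa.gov/search/collections.json"

def find_collections(keyword, page_size=5):
    """Return (concept id, title) pairs for collections matching a keyword."""
    resp = requests.get(CMR_SEARCH,
                        params={"keyword": keyword, "page_size": page_size})
    resp.raise_for_status()
    return [(e["id"], e.get("title", ""))
            for e in resp.json()["feed"]["entry"]]

if __name__ == "__main__":
    for concept_id, title in find_collections("surface kinetic temperature"):
        print(concept_id, title)
```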
Rahman, Mahabubur; Watabe, Hiroshi
2018-05-01
Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are either used individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement of reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to addressing these challenges, we developed a flexible, transparent, and secure infrastructure, named MIRA (Molecular Imaging Repository and Analysis), built primarily with the Python programming language and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image-archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. Its capabilities for handling metadata, normalizing image file formats, and storing and viewing different types of documents and multimedia files make MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an Electronic Laboratory Notebook for effective knowledge management. In addition, the centralized approach facilitates on-the-fly access to all of MIRA's features remotely through any web browser, and the open-source approach provides the opportunity for sustainable continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform supporting rapid advances in cancer diagnosis and therapeutics. Copyright © 2018 Elsevier Ltd. All rights reserved.
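As an illustration of the kind of centralized metadata-plus-audit design the abstract describes, here is a deliberately simplified schema sketch; the table and column names are invented (and SQLite stands in for MIRA's MySQL server), so this is not MIRA's actual schema.

```python
# Hypothetical sketch of a centralized image-metadata store with an audit
# trail, in the spirit of MIRA's design. Schema is invented for illustration;
# sqlite3 stands in for the MySQL server the paper describes.
import sqlite3

conn = sqlite3.connect("mira_demo.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS image_record (
    id          INTEGER PRIMARY KEY,
    file_path   TEXT NOT NULL,   -- archived, format-normalized image file
    modality    TEXT,            -- e.g. PET, SPECT, MRI
    uploaded_by TEXT,
    uploaded_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS audit_log (
    id        INTEGER PRIMARY KEY,
    record_id INTEGER REFERENCES image_record(id),
    action    TEXT,              -- 'view', 'comment', 'share', ...
    actor     TEXT,
    at        TEXT DEFAULT CURRENT_TIMESTAMP
);
""")
conn.commit()
```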
Goloborodko, Anton A; Levitsky, Lev I; Ivanov, Mark V; Gorshkov, Mikhail V
2013-02-01
Pyteomics is a cross-platform, open-source Python library providing a rich set of tools for MS-based proteomics. It provides modules for reading LC-MS/MS data, search engine output, and protein sequence databases, as well as for theoretical prediction of retention times, calculation of electrochemical properties of polypeptides, mass and m/z calculations, and sequence parsing. Pyteomics is available under the Apache license; release versions are available at the Python Package Index (http://pypi.python.org/pyteomics), the source code repository at http://hg.theorchromo.ru/pyteomics, and documentation at http://packages.python.org/pyteomics. Pyteomics.biolccc documentation is available at http://packages.python.org/pyteomics.biolccc/. Questions on installation and usage can be addressed to the pyteomics mailing list: pyteomics@googlegroups.com.
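A minimal usage sketch of two of the modules named in the abstract, mass calculation and sequence parsing, follows; the calls reflect the published Pyteomics API, though exact keyword arguments should be checked against the documentation linked above.

```python
# Minimal Pyteomics usage: in-silico mass/charge calculation and sequence
# parsing, two of the capabilities listed in the abstract.
from pyteomics import mass, parser

peptide = "PEPTIDE"
neutral_mass = mass.calculate_mass(sequence=peptide)  # monoisotopic mass
mz = mass.calculate_mass(sequence=peptide, charge=2)  # m/z of the 2+ ion
residues = parser.parse(peptide)                      # split into residues
print(neutral_mass, mz, residues)
```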
Detecting people of interest from internet data sources
NASA Astrophysics Data System (ADS)
Cardillo, Raymond A.; Salerno, John J.
2006-04-01
In previous papers, we have documented success in determining the key people of interest from a large corpus of real-world evidence. Our recent efforts focus on exploring additional domains and data sources. Internet data sources such as email, web pages, and news feeds make it easier to gather a large corpus of documents for various domains, but detecting people of interest in these sources introduces new challenges. Analyzing these massive sources magnifies entity resolution problems, and demands a storage management strategy that supports efficient algorithmic analysis and visualization techniques. This paper discusses the techniques we used in order to analyze the ENRON email repository, which are also applicable to analyzing web pages returned from our "Buddy" meta-search engine.
Numerical modeling of perched water under Yucca Mountain, Nevada
Hinds, J.J.; Ge, S.; Fridrich, C.J.
1999-01-01
The presence of perched water near the potential high-level nuclear waste repository area at Yucca Mountain, Nevada, has important implications for waste isolation. Perched water occurs because of sharp contrasts in rock properties, in particular between the strongly fractured repository host rock (the Topopah Spring welded tuff) and the immediately underlying vitrophyric (glassy) subunit, in which fractures are sealed by clays that were formed by alteration of the volcanic glass. The vitrophyre acts as a vertical barrier to unsaturated flow throughout much of the potential repository area. Geochemical analyses (Yang et al. 1996) indicate that perched water is relatively young, perhaps younger than 10,000 years. Given the low permeability of the rock matrix, fractures and perhaps fault zones must play a crucial role in unsaturated flow. The geologic setting of the major perched water bodies under Yucca Mountain suggests that faults commonly form barriers to lateral flow at the level of the repository horizon, but may also form important pathways for vertical infiltration from the repository horizon down to the water table. Two factors believed to influence the perched water system at Yucca Mountain, climate and fault-zone permeability, are explored using the numerical code UNSAT2. The two-dimensional model predicts that the volume of water held within the perched water system may greatly increase under wetter climatic conditions, and that perched water bodies may drain to the water table along fault zones. Modeling results also show fault flow to be significantly attenuated in the Paintbrush Tuff non-welded hydrogeologic unit.
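For context, UNSAT2 belongs to the family of finite-element codes that solve Richards' equation for variably saturated flow; a standard two-dimensional pressure-head form is reproduced below for reference (this statement is textbook material, not quoted from the paper).

```latex
% Richards' equation, 2-D pressure-head form (standard statement for context):
\[
  C(\psi)\,\frac{\partial \psi}{\partial t}
  = \frac{\partial}{\partial x}\!\left[K(\psi)\,\frac{\partial \psi}{\partial x}\right]
  + \frac{\partial}{\partial z}\!\left[K(\psi)\left(\frac{\partial \psi}{\partial z} + 1\right)\right]
\]
% \psi: pressure head; C(\psi): specific moisture capacity;
% K(\psi): unsaturated hydraulic conductivity; z positive upward.
```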
Connecting the pieces: Using ORCIDs to improve research impact and repositories.
Baessa, Mohamed; Lery, Thibaut; Grenz, Daryl; Vijayakumar, J K
2015-01-01
Quantitative data are crucial in the assessment of research impact in the academic world. However, as a young university created in 2009, King Abdullah University of Science and Technology (KAUST) needs to aggregate bibliometrics from researchers of diverse origins, not all of whom carry the proper affiliations. In this context, the University launched an institutional repository in September 2012 with the objective of creating a home for the intellectual outputs of KAUST researchers. Later, the university adopted the first mandated institutional open access policy in the Arab region, effective June 2014. Several projects were then initiated in order to accurately identify the research being done by KAUST authors and bring it into the repository in accordance with the open access policy. Integration with ORCID has been a key element in this process and the best way to ensure data quality for researchers' scientific contributions. It included the systematic inclusion and creation, if necessary, of ORCID identifiers in the existing repository system, an institutional membership in ORCID, and the creation of dedicated integration tools. In addition, and in cooperation with the Office of Research Evaluation, the Library worked to implement a Current Research Information System (CRIS) as a standardized common resource to monitor KAUST research outputs. We will present our findings about the CRIS implementation, the ORCID API, and the repository statistics, as well as our approach to conducting the assessment of research impact in terms of usage by the global research community.
Superfund Public Information System (SPIS), January 1999
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-01-01
The Superfund Public Information System (SPIS) on CD-ROM contains Superfund data for the United States Environmental Protection Agency. The Superfund data is a collection of three databases: Records of Decision (RODS); the Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS); and Archive (NFRAP). Descriptions of these databases and the CD contents are listed below. Data content: The CD contains the complete text of the official ROD documents signed and issued by EPA from fiscal years 1982-1996; 147 RODs for fiscal year 1997; and seven RODs for fiscal year 1998. The CD also contains 89 Explanation of Significant Difference (ESD) documents, as well as 48 ROD Amendments. CERCLIS and Archive (NFRAP) data are current through January 19, 1999. RODS is the Records Of Decision System. RODS is used to track site clean-ups under the Superfund program and to justify the type of treatment chosen at each site. RODS contains information on technology justification, site history, community participation, enforcement activities, site characteristics, scope and role of response action, and remedy. Explanations of Significant Differences (ESDs) are also available on the CD. CERCLIS is the Comprehensive Environmental Response, Compensation, and Liability Information System. It is the official repository for all Superfund site and incident data. It contains comprehensive information on hazardous waste sites, site inspections, preliminary assessments, and remedial status. The system is sponsored by the EPA's Office of Emergency and Remedial Response, Information Management Center. Archive (NFRAP) consists of hazardous waste sites that have no further remedial action planned; only basic identifying information is provided for archive sites. The sites found in the Archive database were originally in the CERCLIS database, but were removed beginning in the fall of 1995.
Czarnecki, J.B.
1984-01-01
A study was performed to assess the potential effects of changes in future climatic conditions on the groundwater system in the vicinity of Yucca Mountain, the site of a potential mined geologic repository for high-level nuclear wastes. These changes probably would result in greater rates of precipitation and, consequently, greater rates of recharge. The study was performed by simulating the groundwater system, using a two-dimensional, finite-element, groundwater flow model. The simulated position of the water table rose as much as 130 meters near the U.S. Department of Energy's preferred repository area at Yucca Mountain for a simulation involving a 100-percent increase in precipitation compared to modern-day conditions. Despite the water table rise, no flooding of the potential repository would occur at its currently proposed location. According to the simulation, springs would discharge south and west of Timber Mountain, along Fortymile Canyon, in the Amargosa Desert near Lathrop Wells and Franklin Lake playa, and near Furnace Creek Ranch in Death Valley, where they presently discharge. Simulated directions of groundwater flow paths near the potential repository area generally would be the same for the baseline (modern-day climate) and the increased-recharge simulations, but the magnitude of flow would increase to 2 to 4 times that of the baseline-simulation flow. (USGS)
Relevance similarity: an alternative means to monitor information retrieval systems
Dong, Peng; Loh, Marie; Mondry, Adrian
2005-01-01
Background: Relevance assessment is a major problem in the evaluation of information retrieval systems. The work presented here introduces a new parameter, "Relevance Similarity", for the measurement of the variation of relevance assessment. In a situation where individual assessments can be compared with a gold standard, this parameter is used to study the effect of such variation on the performance of a medical information retrieval system. In such a setting, Relevance Similarity is the ratio of assessors who rank a given document the same as the gold standard over the total number of assessors in the group. Methods: The study was carried out on a collection of Critically Appraised Topics (CATs). Twelve volunteers were divided into two groups according to their domain knowledge. They assessed the relevance of retrieved topics obtained by querying a meta-search engine with ten keywords related to medical science. Their assessments were compared to the gold-standard assessment, and Relevance Similarities were calculated as the ratio of positive concordance with the gold standard for each topic. Results: The similarity comparison among groups showed that a higher degree of agreement exists among evaluators with more subject knowledge. The performance of the retrieval system was not significantly different as a result of the variations in relevance assessment in this particular query set. Conclusion: In assessment situations where evaluators can be compared to a gold standard, Relevance Similarity provides an alternative evaluation technique to the commonly used kappa scores, which may give paradoxically low scores in highly biased situations such as document repositories containing large quantities of relevant data. PMID:16029513
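Because the abstract defines Relevance Similarity as a simple ratio, it can be stated directly in code; the function and example values below are illustrative, not the authors'.

```python
# Relevance Similarity as defined in the abstract: the fraction of assessors
# whose judgement for a document matches the gold standard.
def relevance_similarity(assessments, gold):
    """assessments: each assessor's judgement for one document;
    gold: the gold-standard judgement for that document."""
    if not assessments:
        raise ValueError("need at least one assessor")
    return sum(1 for a in assessments if a == gold) / len(assessments)

# Example: 9 of 12 assessors agree with the gold standard -> 0.75.
print(relevance_similarity(["relevant"] * 9 + ["not relevant"] * 3, "relevant"))
```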
NASA Astrophysics Data System (ADS)
Müller, W.; Alkan, H.; Xie, M.; Moog, H.; Sonnenthal, E. L.
2009-12-01
The release and migration of toxic contaminants from disposed wastes is one of the main issues in the long-term safety assessment of geological repositories. In the engineered and geological barriers around nuclear waste emplacements, chemical interactions between the components of the system may affect the isolation properties considerably. As chemical processes change the transport properties in the near and far field of a nuclear repository, modelling of the transport should also take the chemistry into account. Reactive transport modelling consists of two main components: a code that interactively combines the possible chemical reactions with thermo-hydrogeological processes, and a thermodynamic databank supplying the parameters required for the calculation of the chemical reactions. In the last decade many thermo-hydrogeological codes were upgraded to include the modelling of chemical processes. TOUGHREACT is one of these codes. It is an extension of the well-known simulator TOUGH2 for modelling geoprocesses. The code was developed by LBNL (Lawrence Berkeley National Laboratory, Univ. of California) for the simulation of the multi-phase transport of gas and liquid in porous media, including heat transfer. After the release of its first version in 1998, this code has been applied and improved many times in conjunction with considerations for nuclear waste emplacement. A recent version has been extended to calculate ion activities in concentrated salt solutions by applying the Pitzer model. In TOUGHREACT, the incorporated equation-of-state module ECO2N is applied for non-isothermal multiphase flow in a fluid system of H2O-NaCl-CO2. The partitioning of H2O and CO2 between liquid and gas phases is modelled as a function of temperature, pressure, and salinity. This module is applicable to waste repositories that are expected to generate CO2 or that originally contain CO2 in the fluid system. The enhanced TOUGHREACT uses an EQ3/6-formatted database for both Pitzer ion-interaction parameters and thermodynamic equilibrium constants. The reliability of the parameters is as important as the accuracy of the modelling tool. For this purpose the project THEREDA (www.thereda.de) was set up. The project aims at a comprehensive and internally consistent thermodynamic reference database for geochemical modelling of near- and far-field processes occurring in repositories for radioactive wastes in various host rock formations. In the framework of the project, all data necessary to perform thermodynamic equilibrium calculations for elevated temperatures in the system of oceanic salts are under revision, and it is expected that related data will be available for download by March 2010. In this paper the geochemical issues that can play an essential role in the transport of radioactive contaminants within and around waste repositories are discussed. Some generic calculations are given to illustrate the geochemical interactions and their probable effects on the transport properties around HLW emplacements and on CO2-generating and/or CO2-containing repository systems.
A recommendation module to help teachers build courses through the Moodle Learning Management System
NASA Astrophysics Data System (ADS)
Limongelli, Carla; Lombardi, Matteo; Marani, Alessandro; Sciarrone, Filippo; Temperini, Marco
2016-01-01
In traditional e-learning, teachers design sets of Learning Objects (LOs) and organize their sequencing; the material implementing the LOs can either be built anew or adopted from elsewhere (e.g. from standards-compliant repositories) and reused. This task also applies when the teacher works in a system for personalized e-learning; in this case, the burden actually increases, since, for instance, the LOs may need adaptation to the system through additional metadata. This paper presents a module that supports the operations of retrieving, analyzing, and importing LOs from a set of standard Learning Object Repositories, acting as a recommending system. In particular, it is designed to support the teacher in the phases of (i) retrieval of LOs, through a keyword-based search mechanism applied to the selected repositories; (ii) analysis of the returned LOs, whose information is enriched by a relevance metric based on both the results of the searching operation and data on the previous use of the LOs in courses managed by the Learning Management System; and (iii) importation of LOs into the course under construction.
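The abstract does not give the relevance metric's formula, so the sketch below is only an illustrative reading of phase (ii): combining a keyword-search score with a learning object's prior reuse in LMS courses. Names and weights are invented.

```python
# Illustrative (not the authors') relevance metric for phase (ii): blend the
# keyword-search score with how often a Learning Object was reused before.
from collections import namedtuple

LO = namedtuple("LO", "title score reused")  # score in [0, 1]; reused = count

def lo_relevance(lo, max_reuse, w_search=0.7, w_reuse=0.3):
    reuse_score = lo.reused / max_reuse if max_reuse else 0.0
    return w_search * lo.score + w_reuse * reuse_score

candidates = [LO("Intro to SQL", 0.9, 2), LO("SQL joins", 0.6, 20)]
max_reuse = max(lo.reused for lo in candidates)
ranked = sorted(candidates, key=lambda lo: lo_relevance(lo, max_reuse),
                reverse=True)
print([lo.title for lo in ranked])
```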
Cohen, K Bretonnel; Xia, Jingbo; Roeder, Christophe; Hunter, Lawrence E
2016-05-01
There is currently a crisis in science related to highly publicized failures to reproduce large numbers of published studies. The current work proposes, by way of case studies, a methodology for moving the study of reproducibility in computational work a full stage beyond that of earlier work. Specifically, it presents a case study in attempting to reproduce the reports of two R libraries for text mining of the PubMed/MEDLINE repository of scientific publications. The main findings are that a rational paradigm for reproduction of natural language processing papers can be established; that the advertised functionality was difficult, but not impossible, to reproduce; and that reproducibility studies can produce additional insights into the functioning of the published system. Additionally, the work on reproducibility led to the production of novel user-centered documentation that has been accessed 260 times since its publication, an average of once a day per library.
The international phosphate resource data base; development and maintenance
Bridges, Nancy J.
1983-01-01
The IPRDB (International Phosphate Resource Data Base) was developed to provide a single computerized source of geologic information about phosphate deposits worldwide. It is expected that this data base will encourage more thorough scientific analyses of phosphate deposits and assessments of undiscovered phosphate resources, and that methods of data collection and storage will be streamlined. Because the database was intended to serve as a repository for diverse and detailed data, a large amount of the early research effort was devoted to the design and development of the system. To date (1982), the file remains incomplete. All development work and file maintenance work on IPRDB was suspended as of October 1, 1982; this paper is intended to document the steps taken up to that date. The computer programs listed in the appendices were written specifically for the IPRDB phosbib file and are of limited future use.
Semantic Analysis of Email Using Domain Ontologies and WordNet
NASA Technical Reports Server (NTRS)
Berrios, Daniel C.; Keller, Richard M.
2005-01-01
The problem of capturing and accessing knowledge in paper form has been supplanted by the problem of providing structure to vast amounts of electronic information. Systems that can automatically construct semantic links for natural language documents like email messages will be a crucial element of semantic email tools. We have designed an information extraction process that can leverage the knowledge already contained in an existing semantic web, recognizing references in email to existing nodes in a network of ontology instances by using linguistic knowledge and knowledge of the structure of the semantic web. We developed a heuristic score that uses several forms of evidence to detect references in email to existing nodes in the SemanticOrganizer repository's network. While these scores cannot directly support automated probabilistic inference, they can be used to rank nodes by relevance and link those deemed most relevant to email messages.
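The paper's heuristic score is not spelled out in the abstract; the sketch below only illustrates the general idea of combining several forms of evidence into one rankable value, with evidence names and weights invented for illustration.

```python
# Illustrative evidence-combination score for detecting a reference to an
# ontology node in an email; features and weights are invented.
def node_reference_score(email_tokens, node):
    evidence = {
        "name_match": node["name"] in email_tokens,
        "synonym_match": any(s in email_tokens for s in node["synonyms"]),
        "neighbor_mention": any(n in email_tokens for n in node["neighbors"]),
    }
    weights = {"name_match": 0.6, "synonym_match": 0.3, "neighbor_mention": 0.1}
    return sum(w for k, w in weights.items() if evidence[k])

email = set("please update the rover telemetry node before friday".split())
node = {"name": "telemetry", "synonyms": ["downlink"], "neighbors": ["rover"]}
print(node_reference_score(email, node))  # 0.6 + 0.1 = 0.7
```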
Cloud-based image sharing network for collaborative imaging diagnosis and consultation
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Gu, Yiping; Wang, Mingqing; Sun, Jianyong; Li, Ming; Zhang, Weiqiang; Zhang, Jianguo
2018-03-01
In this presentation, we present a new approach to designing a cloud-based image sharing network for collaborative imaging diagnosis and consultation over the Internet, which enables radiologists, specialists and physicians located at different sites to collaboratively and interactively perform imaging diagnosis or consultation for difficult or emergency cases. The designed network combines a regional RIS, grid-based image distribution management, an integrated video conferencing system and multi-platform interactive image display devices, together with secured messaging and data communication. There are three kinds of components in the network: edge servers, a grid-based imaging document registry and repository, and multi-platform display devices. This network has been deployed on Alibaba's public cloud platform since March 2017 and used for small lung nodule and early-stage lung cancer diagnosis services between the radiology departments of Huadong Hospital in Shanghai and the First Hospital of Jiaxing in Zhejiang Province.
Enabling FAIR and Open Data - The Importance of Communities on Influencing Change
NASA Astrophysics Data System (ADS)
Stall, S.; Lehnert, K.; Robinson, E.; Parsons, M. A.; Hanson, B.; Cutcher-Gershenfeld, J.; Nosek, B.
2017-12-01
Our research ecosystem is diverse and dependent on many interacting stakeholders that influence and support the process of science. These include funders, institutions, libraries, publishers, researchers, data managers, repositories, archives and communities. Process improvement in this ecosystem thus usually needs support from more than one of these stakeholders. For example, mandates for open data extend across the whole ecosystem, and solutions require these stakeholders to come together and agree upon improvements. Recently, the value of FAIR and Open Data has encouraged funders to sponsor discussions with tangible agreements that include the steps needed to move the ecosystem towards results. Work by many of these stakeholders over the past years has produced pilot efforts that are ready to be scaled with broader engagement. A partnership of the AGU, Earth Science Information Partners (ESIP), Research Data Alliance (RDA), Center for Open Science, and key publishers including Science, Nature, and the Proceedings of the National Academy of Sciences (PNAS) have agreed to work together to develop integrated processes, leveraging these pilots, to make FAIR and open data the default for Earth and space science publications. This effort will build on the work of COPDESS.org, ESIP, RDA, the scientific journals, and domain repositories to ensure that well-documented data, preserved in a repository with community-agreed metadata and supported by persistent identifiers, become part of the expected research products submitted in support of each publication.
The repository-based software engineering program: Redefining AdaNET as a mainstream NASA source
NASA Technical Reports Server (NTRS)
1993-01-01
The Repository-based Software Engineering Program (RBSE) is described to inform and update senior NASA managers about the program. Background and historical perspective on software reuse and RBSE for NASA managers who may not be familiar with these topics are provided. The paper draws upon and updates information from the RBSE Concept Document, baselined by NASA Headquarters, Johnson Space Center, and the University of Houston - Clear Lake in April 1992. Several of NASA's software problems and what RBSE is now doing to address those problems are described. Also, next steps to be taken to derive greater benefit from this Congressionally-mandated program are provided. The section on next steps describes the need to work closely with other NASA software quality, technology transfer, and reuse activities and focuses on goals and objectives relative to this need. RBSE's role within NASA is addressed; however, there is also the potential for systematic transfer of technology outside of NASA in later stages of the RBSE program. This technology transfer is discussed briefly.
Allen Brain Atlas-Driven Visualizations: A Web-Based Gene Expression Energy Visualization Tool
2014-05-21
Rocca-Serra, Philippe; Brandizi, Marco; Maguire, Eamonn; Sklyar, Nataliya; Taylor, Chris; Begley, Kimberly; Field, Dawn; Harris, Stephen; Hide, Winston; Hofmann, Oliver; Neumann, Steffen; Sterk, Peter; Tong, Weida; Sansone, Susanna-Assunta
2010-01-01
Summary: This is the first open source software suite for experimentalists and curators that (i) assists in the annotation and local management of experimental metadata from high-throughput studies employing one or a combination of omics and other technologies; (ii) empowers users to take up community-defined checklists and ontologies; and (iii) facilitates submission to international public repositories. Availability and Implementation: Software, documentation, case studies and implementations are available at http://www.isa-tools.org. Contact: isatools@googlegroups.com PMID:20679334
Looking for Skeletons in the Data Centre `Cupboard': How Repository Certification Can Help
NASA Astrophysics Data System (ADS)
Sorvari, S.; Glaves, H.
2017-12-01
There has been a national geoscience repository at the British Geological Survey (or one of its previous incarnations) almost since its inception in 1835. This longevity has resulted in vast amounts of analogue material and, more recently, digital data, some of which has been collected by our scientists but much more of which has been acquired through various legislative obligations or donated from various sources. However, the role and operation of the UK National Geoscience Data Centre (NGDC) in the 21st century are very different from those of the past, with new systems and procedures dealing with predominantly digital data. A web-based ingestion portal allows users to submit their data directly to the NGDC, while online services provide discovery and access to data and derived products. Increasingly we are also required to implement an array of standards (e.g. ISO, OGC, W3C), best practices (e.g. FAIR) and legislation (e.g. the EU INSPIRE Directive), whilst at the same time needing to justify our very existence to our funding agency and hosting organisation. External pressures to demonstrate that we can be recognised as a trusted repository by researchers, funding agencies, publishers and other related entities have forced us to look at how we function, and to benchmark our operations against those of other organisations and current relevant standards such as those laid down by the various repository certification processes. Following an assessment of the options, the WDS/DSA certification process was selected as the most appropriate route for accreditation of the NGDC as a trustworthy repository. It provided a suitable framework for reviewing current systems, procedures and best practices. Undertaking this process allowed us to identify where the NGDC already has robust systems in place and where there were gaps and deficiencies in current practices. The WDS/DSA assessment process also helped to reinforce best practice throughout the NGDC and demonstrated that many of the procedures and standards required for recognition as a trusted repository were already in place, even if they were not always followed!
Fundamental Data Standards for Science Data System Interoperability and Data Correlation
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Gopala Krishna, Barla; Rye, Elizabeth; Crichton, Daniel
The advent of the Web and languages such as XML have brought an explosion of online science data repositories and the promise of correlated data and interoperable systems. However, there have been relatively few successes in meeting the expectations of science users in the internet age. For example, a Google-like search for images of Mars will return many highly derived and appropriately tagged images but largely ignore the majority of images in most online image repositories. Once data are retrieved, users are further frustrated by poor data descriptions, arcane formats, and badly organized ancillary information. A wealth of research indicates that shared information models are needed to enable system interoperability and data correlation. However, at a more fundamental level, data correlation and system interoperability are dependent on a relatively few shared data standards. A common data dictionary standard, for example, allows the controlled vocabulary used in a science repository to be shared with potential collaborators. Common data registry and product identification standards enable systems to efficiently find, locate, and retrieve data products and their metadata from remote repositories. Information content standards define categories of descriptive data that help make data products scientifically useful to users who were not part of the original team that produced the data. The Planetary Data System (PDS) has a plan to move the PDS to a fully online, federated system. This plan addresses new demands on the system, including increasing data volume, numbers of missions, and complexity of missions. A key component of this plan is the upgrade of the PDS Data Standards. The adoption of the core PDS data standards by the International Planetary Data Alliance (IPDA) adds the element of international cooperation to the plan. This presentation will provide an overview of the fundamental data standards being adopted by the PDS that transcend science domains and that will help to meet the PDS's and IPDA's system interoperability and data correlation requirements.
NASA Technical Reports Server (NTRS)
1982-01-01
The option of disposing of certain high-level nuclear wastes in space as a complement to mined geological repositories is studied. A brief overview of the study background, scope, objective, guidelines and assumptions, and contents is presented. The effects of variations in the waste mix on the space systems concept are determined, to allow assessment of the space system's effect on total system risks and benefits when used as a complement to the DOE reference mined geological repository. The waste payload system, launch site, launch system, and orbit transfer system are all addressed. Rescue mission requirements are studied. The characteristics of waste forms suitable for space disposal are identified. Trajectories and performance requirements are discussed.
DSA-WDS Common Requirements: Developing a New Core Data Repository Certification
NASA Astrophysics Data System (ADS)
Minster, J. B. H.; Edmunds, R.; L'Hours, H.; Mokrane, M.; Rickards, L.
2016-12-01
The Data Seal of Approval (DSA) and the International Council for Science - World Data System (ICSU-WDS) have both developed minimally intensive core certification standards whereby digital repositories supply evidence that they are trustworthy and have a long-term outlook. Both DSA and WDS applicants have found core certification to be beneficial: building stakeholder confidence, enhancing the repository's reputation, and demonstrating that it is following good practices, as well as stimulating the repository to focus on processes and procedures, thereby achieving ever higher levels of professionalism over time. The DSA and WDS core certifications evolved independently, initially serving different communities, but both initiatives are multidisciplinary, with catalogues of criteria and review procedures based on the same principles. Hence, to realize efficiencies, simplify assessment options, stimulate more certifications, and increase impact on the community, the Repository Audit and Certification DSA-WDS Partnership Working Group (WG) was established under the umbrella of the Research Data Alliance (RDA). The WG conducted a side-by-side analysis of both frameworks to unify the wording and criteria, ultimately leading to a harmonized Catalogue of Common Requirements for core certification of repositories, as well as a set of Common Procedures for their assessment. This presentation will focus on the collaborative effort by DSA and WDS to establish (1) a testbed comprising DSA- and WDS-certified data repositories to validate both the new Catalogue and Procedures, and (2) a joint Certification Board for their practical implementation. We will describe:
• The purpose and methodology of the testbed, including the selection of repositories to be assessed against the common standard.
• The results of the testbed, with an in-depth look at some of the comments received and issues highlighted.
• General insights gained from evaluating the testbed results, the subsequent changes to the Common Requirements and Procedures, and an assessment of the success of these enhancements.
• Steps by the two organizations to integrate the Common Certification into their tools and systems, in particular the creation of Terms of Reference for the nascent DSA-WDS Certification Board.
Comparison of selected foreign plans and practices for spent fuel and high-level waste management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, K.J.; Mitchell, S.J.; Lakey, L.T.
1990-04-01
This report describes the major parameters for management of spent nuclear fuel and high-level radioactive wastes in selected foreign countries as of December 1989 and compares them with those in the United States. The foreign countries included in this study are Belgium, Canada, France, the Federal Republic of Germany, Japan, Sweden, Switzerland, and the United Kingdom. All the countries are planning for disposal of spent fuel and/or high-level wastes in deep geologic repositories. Most countries (except Canada and Sweden) plan to reprocess their spent fuel and vitrify the resultant high-level liquid wastes; in comparison, the US plans direct disposal of spent fuel. The US is planning to use a container for spent fuel as the primary engineered barrier. The US has the most developed repository concept and has one of the earliest scheduled repository startup dates. The repository environment presently being considered in the US is unique, being located in tuff above the water table. The US also has the most prescriptive regulations and performance requirements for the repository system and its components. 135 refs., 8 tabs.
Rolling Deck to Repository (R2R): Linking and Integrating Data for Oceanographic Research
NASA Astrophysics Data System (ADS)
Arko, R. A.; Chandler, C. L.; Clark, P. D.; Shepherd, A.; Moore, C.
2012-12-01
The Rolling Deck to Repository (R2R) program is developing infrastructure to ensure the underway sensor data from NSF-supported oceanographic research vessels are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. We have published the entire R2R Catalog as a Linked Data collection, making it easily accessible to encourage linking and integration with data at other repositories. We are developing the R2R Linked Data collection with specific goals in mind: 1.) We facilitate data access and reuse by providing the richest possible collection of resources to describe vessels, cruises, instruments, and datasets from the U.S. academic fleet, including data quality assessment results and clean trackline navigation. We are leveraging or adopting existing community-standard concepts and vocabularies, particularly concepts from the Biological and Chemical Oceanography Data Management Office (BCO-DMO) ontology and terms from the pan-European SeaDataNet vocabularies, and continually re-publish resources as new concepts and terms are mapped. 2.) We facilitate data citation through the entire data lifecycle from field acquisition to shoreside archiving to (ultimately) global syntheses and journal articles. We are implementing globally unique and persistent identifiers at the collection, dataset, and granule levels, and encoding these citable identifiers directly into the Linked Data resources. 3.) We facilitate linking and integration with other repositories that publish Linked Data collections for the U.S. academic fleet, such as BCO-DMO and the Index to Marine and Lacustrine Geological Samples (IMLGS). We are initially mapping datasets at the resource level, and plan to eventually implement rule-based mapping at the concept level. We work collaboratively with partner repositories to develop best practices for URI patterns and consensus on shared vocabularies. The R2R Linked Data collection is implemented as a lightweight "virtual RDF graph" generated on-the-fly from our SQL database using the D2RQ (http://d2rq.org) package. In addition to the default SPARQL endpoint for programmatic access, we are developing a Web-based interface from open-source software components that offers user-friendly browse and search.
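Since the catalog is exposed as Linked Data with a SPARQL endpoint, it can be queried programmatically; the sketch below uses a placeholder endpoint URL and an invented vocabulary, so only the SPARQL-over-HTTP pattern itself should be taken as standard.

```python
# Sketch of querying a Linked Data collection such as R2R's via SPARQL.
# The endpoint URL and the r2r: vocabulary below are placeholders.
import requests

ENDPOINT = "https://example.org/r2r/sparql"  # placeholder endpoint
QUERY = """
PREFIX r2r: <https://example.org/r2r/vocab#>
SELECT ?cruise ?vessel WHERE {
  ?cruise a r2r:Cruise ;
          r2r:operatedBy ?vessel .
} LIMIT 10
"""

resp = requests.get(ENDPOINT, params={"query": QUERY},
                    headers={"Accept": "application/sparql-results+json"})
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:  # standard SPARQL JSON results
    print(row["cruise"]["value"], row["vessel"]["value"])
```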
EVA Wiki - Transforming Knowledge Management for EVA Flight Controllers and Instructors
NASA Technical Reports Server (NTRS)
Johnston, Stephanie S.; Alpert, Brian K.; Montalvo, Edwin James; Welsh, Lawrence Daren; Wray, Scott; Mavridis, Costa
2016-01-01
The EVA Wiki was recently implemented as the primary knowledge database to retain critical knowledge and skills in the EVA Operations group at NASA's Johnson Space Center by ensuring that information is recorded in a common, easy to search repository. Prior to the EVA Wiki, information required for EVA flight controllers and instructors was scattered across different sources, including multiple file share directories, SharePoint, individual computers, and paper archives. Many documents were outdated, and data was often difficult to find and distribute. In 2011, a team recognized that these knowledge management problems could be solved by creating an EVA Wiki using MediaWiki, a free and open-source software developed by the Wikimedia Foundation. The EVA Wiki developed into an EVA-specific Wikipedia on an internal NASA server. While the technical implementation of the wiki had many challenges, one of the biggest hurdles came from a cultural shift. Like many enterprise organizations, the EVA Operations group was accustomed to hierarchical data structures and individually-owned documents. Instead of sorting files into various folders, the wiki searches content. Rather than having a single document owner, the wiki harmonized the efforts of many contributors and established an automated revision controlled system. As the group adapted to the wiki, the usefulness of this single portal for information became apparent. It transformed into a useful data mining tool for EVA flight controllers and instructors, as well as hundreds of others that support the EVA. Program managers, engineers, astronauts, flight directors, and flight controllers in differing disciplines now have an easier-to-use, searchable system to find EVA data. This paper presents the benefits the EVA Wiki has brought to NASA's EVA community, as well as the cultural challenges it had to overcome.
EVA Wiki - Transforming Knowledge Management for EVA Flight Controllers and Instructors
NASA Technical Reports Server (NTRS)
Johnston, Stephanie
2016-01-01
The EVA (Extravehicular Activity) Wiki was recently implemented as the primary knowledge database to retain critical knowledge and skills in the EVA Operations group at NASA's Johnson Space Center by ensuring that information is recorded in a common, searchable repository. Prior to the EVA Wiki, information required for EVA flight controllers and instructors was scattered across different sources, including multiple file share directories, SharePoint, individual computers, and paper archives. Many documents were outdated, and data was often difficult to find and distribute. In 2011, a team recognized that these knowledge management problems could be solved by creating an EVA Wiki using MediaWiki, a free and open-source software developed by the Wikimedia Foundation. The EVA Wiki developed into an EVA-specific Wikipedia on an internal NASA server. While the technical implementation of the wiki had many challenges, one of the biggest hurdles came from a cultural shift. Like many enterprise organizations, the EVA Operations group was accustomed to hierarchical data structures and individually-owned documents. Instead of sorting files into various folders, the wiki searches content. Rather than having a single document owner, the wiki harmonized the efforts of many contributors and established an automated revision control system. As the group adapted to the wiki, the usefulness of this single portal for information became apparent. It transformed into a useful data mining tool for EVA flight controllers and instructors, and also for hundreds of other NASA and contract employees. Program managers, engineers, astronauts, flight directors, and flight controllers in differing disciplines now have an easier-to-use, searchable system to find EVA data. This paper presents the benefits the EVA Wiki has brought to NASA's EVA community, as well as the cultural challenges it had to overcome.
Automated population of an i2b2 clinical data warehouse from an openEHR-based data repository.
Haarbrandt, Birger; Tute, Erik; Marschollek, Michael
2016-10-01
Detailed Clinical Model (DCM) approaches have recently seen wider adoption. More specifically, openEHR-based application systems are now used in production in several countries, serving diverse fields of application such as health information exchange, clinical registries and electronic medical record systems. However, approaches to efficiently provide openEHR data to researchers for secondary use have not yet been investigated or established. We developed an approach to automatically load openEHR data instances into the open source clinical data warehouse i2b2. We evaluated query capabilities and the performance of this approach in the context of the Hanover Medical School Translational Research Framework (HaMSTR), an openEHR-based data repository. Automated creation of i2b2 ontologies from archetypes and templates and the integration of openEHR data instances from 903 patients of a paediatric intensive care unit has been achieved. In total, it took an average of ~2527 s to create 2,311,624 facts from 141,917 XML documents. Using the imported data, we conducted sample queries to compare the performance with two openEHR systems and to investigate if this representation of data is feasible to support cohort identification and record-level data extraction. We found the automated population of an i2b2 clinical data warehouse to be a feasible approach to make openEHR data instances available for secondary use. Such an approach can facilitate timely provision of clinical data to researchers. It complements analytics based on the Archetype Query Language by allowing querying on both legacy clinical data sources and openEHR data instances at the same time and by providing an easy-to-use query interface. However, due to different levels of expressiveness in the data models, not all semantics could be preserved during the ETL process. Copyright © 2016 Elsevier Inc. All rights reserved.
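To give a flavor of the ETL step the paper automates, the sketch below flattens values from an openEHR-style XML instance into rows shaped like i2b2's observation_fact table; the XML layout and concept-code mapping here are invented, and only the observation_fact target is i2b2's real fact table.

```python
# Illustrative ETL sketch: flatten an openEHR-style XML instance into
# i2b2 observation_fact-like rows. The XML element layout and the
# concept-code scheme are invented for illustration.
import xml.etree.ElementTree as ET

def xml_to_facts(xml_text, patient_num):
    root = ET.fromstring(xml_text)
    facts = []
    for el in root.iter("element"):          # assumed element layout
        code = el.get("archetype_node_id")   # archetype node id as concept
        value = el.findtext("value")
        if code and value is not None:
            facts.append({"patient_num": patient_num,
                          "concept_cd": f"OPENEHR:{code}",
                          "tval_char": value})
    return facts

doc = ('<composition>'
       '<element archetype_node_id="at0004"><value>120</value></element>'
       '</composition>')
print(xml_to_facts(doc, 42))
```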
Georgitsi, Marianthi; Viennas, Emmanouil; Gkantouna, Vassiliki; Christodoulopoulou, Elena; Zagoriti, Zoi; Tafrali, Christina; Ntellos, Fotios; Giannakopoulou, Olga; Boulakou, Athanassia; Vlahopoulou, Panagiota; Kyriacou, Eva; Tsaknakis, John; Tsakalidis, Athanassios; Poulas, Konstantinos; Tzimas, Giannis; Patrinos, George P
2011-01-01
Population- and ethnic group-specific allele frequencies of pharmacogenomic markers are poorly documented and not systematically collected in structured data repositories. We developed the Frequency of Inherited Disorders Pharmacogenomics database (FINDbase-PGx), a separate module of FINDbase, aiming to systematically document pharmacogenomic allele frequencies in various populations and ethnic groups worldwide. We critically collected and curated 214 scientific articles reporting allele frequencies of pharmacogenomic markers in various populations and ethnic groups worldwide. Subsequently, in order to host the curated data and support data visualization and data mining, we developed a website application utilizing Microsoft™ PivotViewer software. Curated allelic frequency data pertaining to 144 pharmacogenomic markers across 14 genes, representing approximately 87,000 individuals from 150 populations worldwide, are currently included in FINDbase-PGx. A user-friendly query interface allows for easy data querying based on numerous content criteria, such as population, ethnic group, geographical region, gene, drug and rare allele frequency. FINDbase-PGx is a comprehensive database which, unlike other pharmacogenomic knowledgebases, fulfills the much-needed requirement to systematically document pharmacogenomic allele frequencies in various populations and ethnic groups worldwide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jewett, J.R.
1997-09-17
In a geological repository for long-lived radioactive wastes, such as actinides and certain fission products, most of the stored radionuclides remain immobile in the particular geological formation. If any of these could possibly become mobile, only trace concentrations of a few radionuclides would result. Nevertheless, with an inventory in the repository of many tonnes of transuranic elements, the amounts that could disperse cannot be neglected. A critical assessment of the chemical behavior of these nuclides, especially their migration properties in the aquifer system around the repository site, is mandatory for analysis of the long-term safety. The chemistry required for this includes many geochemical multicomponent reactions that are so far only partially understood and therefore can be quantified only incompletely. A few of these reactions have been discussed in this paper based on present knowledge. If a comprehensive discussion of the subject is impossible because of this lack of information, then an attempt to emphasize the importance of the predominant geochemical reactions of the transuranic elements in various aquifer systems should be made.
The Pig PeptideAtlas: A resource for systems biology in animal production and biomedicine.
Hesselager, Marianne O; Codrea, Marius C; Sun, Zhi; Deutsch, Eric W; Bennike, Tue B; Stensballe, Allan; Bundgaard, Louise; Moritz, Robert L; Bendixen, Emøke
2016-02-01
Biological research of Sus scrofa, the domestic pig, is of immediate relevance for food production sciences, and for developing pig as a model organism for human biomedical research. Publicly available data repositories play a fundamental role for all biological sciences, and protein data repositories are in particular essential for the successful development of new proteomic methods. Cumulative proteome data repositories, including the PeptideAtlas, provide the means for targeted proteomics, system-wide observations, and cross-species observational studies, but pigs have so far been underrepresented in existing repositories. We here present a significantly improved build of the Pig PeptideAtlas, which includes pig proteome data from 25 tissues and three body fluid types mapped to 7139 canonical proteins. The content of the Pig PeptideAtlas reflects actively ongoing research within the veterinary proteomics domain, and this article demonstrates how the expression of isoform-unique peptides can be observed across distinct tissues and body fluids. The Pig PeptideAtlas is a unique resource for use in animal proteome research, particularly biomarker discovery and for preliminary design of SRM assays, which are equally important for progress in research that supports farm animal production and veterinary health, as for developing pig models with relevance to human health research. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
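The isoform-unique peptide idea lends itself to a short sketch: a peptide observed in a tissue is evidence for a specific isoform only if it maps to exactly one protein. The peptides, isoform names, and observations below are invented examples, not PeptideAtlas data.

```python
# Collect tissue evidence per isoform from isoform-unique peptides only;
# shared peptides are uninformative and are skipped.
from collections import defaultdict

peptide_to_proteins = {
    "LVNELTEFAK": {"ALB_ISOFORM1"},                   # unique to one isoform
    "AEFVEVTK":   {"ALB_ISOFORM1", "ALB_ISOFORM2"},   # shared, not informative
}
observations = [("liver", "LVNELTEFAK"), ("plasma", "LVNELTEFAK"),
                ("liver", "AEFVEVTK")]

isoform_evidence = defaultdict(set)
for tissue, peptide in observations:
    proteins = peptide_to_proteins[peptide]
    if len(proteins) == 1:                 # isoform-unique peptide
        isoform_evidence[next(iter(proteins))].add(tissue)

print(dict(isoform_evidence))  # e.g. {'ALB_ISOFORM1': {'liver', 'plasma'}}
```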
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leigh, Christi D.; Hansen, Francis D.
This report summarizes the state of salt repository science, reviews many of the technical issues pertaining to disposal of heat-generating nuclear waste in salt, and proposes several avenues for future science-based activities to further the technical basis for disposal in salt. There are extensive salt formations in the forty-eight contiguous states, and many of them may be worthy of consideration for nuclear waste disposal. The United States has extensive experience in salt repository sciences, including an operating facility for disposal of transuranic wastes. The scientific background for salt disposal is discussed, including laboratory and field tests at ambient and elevated temperature, principles of salt behavior, potential for fracture damage and its mitigation, seal systems, chemical conditions, advanced modeling capabilities and near-future developments, performance assessment processes, and international collaboration. The discussion of salt disposal issues is brought current, including a summary of recent international workshops dedicated to high-level waste disposal in salt. Lessons learned from Sandia National Laboratories' experience on the Waste Isolation Pilot Plant and the Yucca Mountain Project as well as related salt experience with the Strategic Petroleum Reserve are applied in this assessment. Disposal of heat-generating nuclear waste in a suitable salt formation is attractive because the material is essentially impermeable, self-sealing, and thermally conductive. Conditions are chemically beneficial, and a significant experience base exists in understanding this environment. Within the period of institutional control, overburden pressure will seal fractures and provide a repository setting that limits radionuclide movement. A salt repository could potentially achieve total containment, with no releases to the environment in undisturbed scenarios for as long as the region is geologically stable. Much of the experience gained from United States repository development, such as seal system design, coupled process simulation, and application of performance assessment methodology, helps define a clear strategy for a heat-generating nuclear waste repository in salt.
Developing the Tools for Geologic Repository Monitoring - Andra's Monitoring R and D Program - 12045
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buschaert, S.; Lesoille, S.; Bertrand, J.
2012-07-01
The French Safety Guide recommends that Andra develop a monitoring program to be implemented during repository construction and conducted until (and possibly after) closure, in order to confirm expected behavior and enhance knowledge of relevant processes. To achieve this, Andra has developed an overall monitoring strategy and identified specific technical objectives to inform disposal process management on evolutions relevant to both the long-term safety and reversible, pre-closure management of the repository. Andra has launched an ambitious R and D program to ensure that reliable, durable, metrologically qualified and tested monitoring systems will be available at the time of repository construction in order to respond to monitoring objectives. After four years of a specific R and D program, first observations are described and recommendations are proposed. The results derived from four years of Andra's R and D program allow three main observations to be shared. First, while other industries also invest in monitoring equipment, their obvious emphasis will always be on their specific requirements and needs, thus often only providing a partial match with repository requirements. Examples can be found for all available sensors, which are generally not resistant to radiation. Second, the very close scrutiny anticipated for the geologic disposal process is likely to place an unprecedented emphasis on the quality of monitoring results. It therefore seems important to emphasize specific developments with the aim of providing metrologically qualified systems. Third, adapting existing technology to specific repository needs, and providing adequate proof of their worth, is a lengthy process. In conclusion, it therefore seems prudent to plan ahead and to invest wisely in the adequate development of those monitoring tools that will likely be needed in the repository to respond to the implementers' and regulators' requirements, including those agreed and developed to respond to potential stakeholder expectations. (authors)
NASA Astrophysics Data System (ADS)
Versteeg, R.; Heath, G.; Richardson, A.; Paul, D.; Wangerud, K.
2003-12-01
At a cyanide heap-leach open-pit mine, 15 million cubic yards of acid-generating sulfides were dumped at the head of a steep-walled mountain valley, where 30 inches/year of precipitation generates 60 gallons/minute of ARD leachate. Remediation has reshaped the dump to a 70-acre, 3.5:1-sloped geometry, installed drainage benches and runoff diversions, and capped the repository and lined diversions with a polyethylene geomembrane and cover system. Monitoring was needed to evaluate (a) long-term geomembrane integrity, (b) diversion liner integrity and long-term effectiveness, (c) ARD geochemistry, kinetics and pore-gas dynamics within the repository mass, and (d) groundwater interactions. Observation wells were paired with a 600-electrode resistivity survey system. Using near-surface and down-hole electrodes and automated data collection and post-processing, periodic two- and three-dimensional resistivity images are developed to reflect current and changed conditions in moisture, temperature, geochemical components, and flow-direction analysis. Examination of total resistivity values and time variances between images allows direct observation of liner and cap integrity with precise identification and location of leaks; likewise, if runoff migrates from degraded diversion ditches into the repository zone, there is an accompanying and noticeable change in resistivity values. Used in combination with monitoring wells containing borehole resistivity electrodes (calibrated with direct sampling of dump water/moisture, temperature and pore-gas composition), the resistivity arrays allow at-depth imaging of geochemical conditions within the repository mass. The information provides early indications of progress or deficiencies in the de-watering and ARD mitigation that constitute the remedial intent. If emerging technologies present opportunities for secondary treatment, deep resistivity images may assist in developing application methods and evaluating the effectiveness of any reagents introduced into the repository mass to further effect changes in oxidation/reduction reactions.
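The time-lapse differencing described above can be illustrated with a toy example: subtract a baseline resistivity image from a later survey and flag cells whose drop exceeds a threshold as possible moisture ingress. The grid values and the threshold are synthetic.

```python
# Difference a later resistivity survey against a baseline image and flag
# large resistivity drops (moisture) as possible leak locations.
import numpy as np

baseline = np.full((4, 6), 120.0)          # ohm-m, dry cap conditions
survey = baseline.copy()
survey[2, 3] = 40.0                        # wetted zone beneath a leak

percent_change = 100.0 * (survey - baseline) / baseline
leak_mask = percent_change < -30.0         # large drop = moisture ingress
for row, col in zip(*np.nonzero(leak_mask)):
    print(f"possible leak near cell ({row}, {col}): "
          f"{percent_change[row, col]:.0f}% change")
```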
Performance Confirmation Data Acquisition System
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.W. Markman
2000-10-27
The purpose of this analysis is to identify and analyze concepts for the acquisition of data in support of the Performance Confirmation (PC) program at the potential subsurface nuclear waste repository at Yucca Mountain. The scope and primary objectives of this analysis are to: (1) Review the criteria for design as presented in the Performance Confirmation Data Acquisition/Monitoring System Description Document, by way of the Input Transmittal, Performance Confirmation Input Criteria (CRWMS M&O 1999c). (2) Identify and describe existing and potential new trends in data acquisition system software and hardware that would support the PC plan. The data acquisition software and hardware will support the field instruments and equipment that will be installed for the observation and perimeter drift borehole monitoring, and in-situ monitoring within the emplacement drifts. The exhaust air monitoring requirements will be supported by a data communication network interface with the ventilation monitoring system database. (3) Identify the concepts and features that a data acquisition system should have in order to support the PC process and its activities. (4) Based on PC monitoring needs and available technologies, further develop concepts of a potential data acquisition system network in support of the PC program and the Site Recommendation and License Application.
Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study
NASA Astrophysics Data System (ADS)
Saliceti, Jose A.
The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was administered to participants to build a descriptive and robust account. Analysis of responses revealed themes related to each research question. Findings revealed operational testers utilized requirements documents to extrapolate measures for testing UAS technology and develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission areas. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers. The processes in place are not standardized; thus, test plan preparation and reporting differ among participants. A standard method should therefore be adopted for preparing and reporting UAS technology tests. Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine system engineering designs for a system requirements traceability matrix of mission tasks and subtasks while using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.
Improved Access to NSF Funded Ocean Research Data
NASA Astrophysics Data System (ADS)
Chandler, C. L.; Groman, R. C.; Kinkade, D.; Shepherd, A.; Rauch, S.; Allison, M. D.; Gegg, S. R.; Wiebe, P. H.; Glover, D. M.
2015-12-01
Data from NSF-funded, hypothesis-driven research comprise an essential part of the research results upon which we base our knowledge and improved understanding of the impacts of climate change. Initially funded in 2006, the Biological and Chemical Oceanography Data Management Office (BCO-DMO) works with marine scientists to ensure that data from NSF-funded ocean research programs are fully documented and freely available for future use. BCO-DMO works in partnership with information technology professionals, other marine data repositories and national data archive centers to ensure long-term preservation of these valuable environmental research data. Data contributed to BCO-DMO by the original investigators are enhanced with sufficient discipline-specific documentation and published in a variety of standards-compliant forms designed to enable discovery and support accurate re-use.
Godah, Mohammad W; Abdul Khalek, Rima A; Kilzar, Lama; Zeid, Hiba; Nahlawi, Acile; Lopes, Luciane Cruz; Darzi, Andrea J; Schünemann, Holger J; Akl, Elie A
2016-12-01
Low- and middle-income countries adapt World Health Organization (WHO) guidelines instead of developing guidelines de novo for financial, epidemiologic, sociopolitical, cultural, organizational, and other reasons. Our objective was to systematically evaluate reported processes used in the adaptation of WHO guidelines for human immunodeficiency virus (HIV) and tuberculosis (TB). We searched three online databases/repositories: the United States Agency for International Development (USAID) AIDS Support and Technical Resources - Sector One program (AIDSTAR-One) National Treatment Database, the AIDSspace Guideline Repository, and the WHO database of national HIV and TB guidelines. We assessed the rigor and quality of reported adaptation methodology using the ADAPTE process as a benchmark. Of 170 eligible guidelines, only 32 (19%) reported documentation on the adaptation process. The median (interquartile range) number of ADAPTE steps fulfilled by the eligible guidelines was 11.5 (10, 13.5) out of 23 steps. The median number of guidelines (out of 32) fulfilling each ADAPTE step was 18 (interquartile range, 5-27). Seventeen of 32 guidelines (53%) met all steps relevant to the setup phase, whereas none met all steps relevant to the adaptation phase. The number of well-documented adaptation methodologies in national HIV and/or TB guidelines is very low. There is a need for the use of a standardized and systematic framework for guideline adaptation and for improved reporting of the processes used. Copyright © 2016 Elsevier Inc. All rights reserved.
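The median/interquartile-range summaries quoted above are ordinary percentile calculations; the sketch below reproduces the computation on invented per-guideline scores, not the study data.

```python
# Median and interquartile range of ADAPTE steps fulfilled per guideline;
# the scores are illustrative, not the study's data.
import numpy as np

steps_fulfilled = [9, 10, 10, 11, 12, 13, 14, 15]  # per-guideline counts
median = np.median(steps_fulfilled)
q1, q3 = np.percentile(steps_fulfilled, [25, 75])
print(f"median {median}, IQR ({q1}, {q3}) of 23 possible ADAPTE steps")
```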
Vanfretti, Luigi; Olsen, Svein H; Arava, V S Narasimham; Laera, Giuseppe; Bidadfar, Ali; Rabuzin, Tin; Jakobsen, Sigurd H; Lavenius, Jan; Baudette, Maxime; Gómez-López, Francisco J
2017-04-01
This article presents an open data repository, the methodology used to generate it, and the associated data processing software developed to consolidate an hourly snapshot historical data set for the year 2015 into an equivalent Nordic power grid model (a.k.a. Nordic 44). The consolidation was achieved by matching the model's physical response with respect to historical power flow records in the bidding regions of the Nordic grid, which are available from the Nordic electricity market agent, Nord Pool. The model is made available in the form of CIM v14, Modelica and PSS/E (Siemens PTI) files. The Nordic 44 model in Modelica and PSS/E was first presented, for a single snapshot, in the paper titled "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations" (Vanfretti et al., 2016) [1]. The digital repository made available with the submission of this paper (SmarTSLab_Nordic44 Repository at Github, 2016) [2] provides a total of 8760 snapshots (for the year 2015) that can be used to initialize and execute dynamic simulations using tools compatible with CIM v14, the Modelica language and the proprietary PSS/E tool. The Python scripts used to generate the snapshots (processed data) are also available, together with all the data, in the GitHub repository (SmarTSLab_Nordic44 Repository at Github, 2016) [2]. This Nordic 44 equivalent model was also used in the iTesla project (iTesla) [3] to carry out simulations within a dynamic security assessment toolset (iTesla, 2016) [4], and has been further enhanced during the ITEA3 OpenCPS project (iTEA3) [5]. The raw data, processed data and output models utilized within the iTesla platform (iTesla, 2016) [4] are also available in the repository. The CIM and Modelica snapshots of the "Nordic 44" model for the year 2015 are available in a Zenodo repository.
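A hedged sketch of the consolidation loop the abstract describes: read hourly historical records, derive a per-region target flow, and emit one snapshot file per hour. Field names and the output format are simplifications; the actual Python scripts live in the GitHub repository [2].

```python
# Loose stand-in for the snapshot-generation scripts: one output file per
# hourly historical record, carrying the target flow the model must match.
import csv
import json

def consolidate(history_csv, out_prefix):
    with open(history_csv, newline="") as fh:
        for i, row in enumerate(csv.DictReader(fh)):
            snapshot = {
                "hour": row["timestamp"],
                "region": row["bidding_region"],
                # the model's response is matched to this recorded flow
                "target_flow_mw": float(row["flow_mw"]),
            }
            with open(f"{out_prefix}_{i:04d}.json", "w") as out:
                json.dump(snapshot, out)
```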
NASA Astrophysics Data System (ADS)
Wieland, E.; Bradbury, M. H.; van Loon, L.
2003-01-01
The migration of radionuclides within a repository for radioactive waste is retarded by interaction with the engineered barrier system. Sorption processes play a decisive role in the retardation of radionuclides in the repository environment, and thus the development of sorption databases (SDBs) is an important task and an integral part of performance assessment. The methodology applied in the development of an SDB for the cementitious near-field of a repository for long-lived intermediate-level waste is presented in this study. The development of such an SDB requires knowledge of the chemical conditions of the near-field and information on the uptake process of radionuclides by hardened cement paste. The principles upon which the selection of the “best available” laboratory sorption values is based are outlined. The influence of cellulose degradation products, cement additives and cement-derived colloids on the sorption behaviour of radionuclides is addressed in conjunction with the development of the SDB.
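Kd values selected for such a sorption database typically feed retardation estimates in performance assessment. The sketch below applies the standard linear-sorption retardation relation R = 1 + (rho_b/theta)Kd, which is not stated in the abstract; the parameter values are illustrative, not taken from the cementitious SDB.

```python
# Standard linear-sorption retardation factor: R = 1 + (rho_b / theta) * Kd.
# Illustrative numbers only.
def retardation_factor(kd_m3_per_kg, bulk_density_kg_m3, porosity):
    return 1.0 + (bulk_density_kg_m3 / porosity) * kd_m3_per_kg

# e.g. a nuclide with Kd = 0.001 m3/kg in hardened cement paste
print(retardation_factor(0.001, bulk_density_kg_m3=1600.0, porosity=0.25))
# -> 7.4: transport is roughly 7x slower than the pore water
```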
Scientific information repository assisting reflectance spectrometry in legal medicine.
Belenki, Liudmila; Sterzik, Vera; Bohnert, Michael; Zimmermann, Klaus; Liehr, Andreas W
2012-06-01
Reflectance spectrometry is a fast and reliable method for the characterization of human skin if the spectra are analyzed with respect to a physical model describing the optical properties of human skin. For a field study performed at the Institute of Legal Medicine and the Freiburg Materials Research Center of the University of Freiburg, a scientific information repository has been developed, which is a variant of an electronic laboratory notebook and assists in the acquisition, management, and high-throughput analysis of reflectance spectra in heterogeneous research environments. At the core of the repository is a database management system hosting the master data. It is filled with primary data via a graphical user interface (GUI) programmed in Java, which also enables the user to browse the database and access the results of data analysis. The latter is carried out via Matlab, Python, and C programs, which retrieve the primary data from the scientific information repository, perform the analysis, and store the results in the database for further usage.
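A minimal sketch of the analysis loop described above, assuming a simplified schema: pull primary spectra from the repository database, run a placeholder analysis, and write results back for later browsing. The real system dispatches to Matlab, Python, and C codes against the master database.

```python
# Fetch primary spectra, analyze, and store results back in the repository.
# The schema and the "analysis" are placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")        # stand-in for the master database
conn.execute("CREATE TABLE spectra (id INTEGER, csv TEXT)")
conn.execute("CREATE TABLE results (spectrum_id INTEGER, mean REAL)")
conn.execute("INSERT INTO spectra VALUES (1, '0.41,0.44,0.47')")

rows = conn.execute("SELECT id, csv FROM spectra").fetchall()
for spec_id, csv_blob in rows:
    values = [float(v) for v in csv_blob.split(",")]
    mean_reflectance = sum(values) / len(values)   # placeholder analysis
    conn.execute("INSERT INTO results VALUES (?, ?)",
                 (spec_id, mean_reflectance))
conn.commit()
print(conn.execute("SELECT * FROM results").fetchall())
```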
Mont Terri Underground Rock Laboratory, Switzerland-Research Program And Key Results
NASA Astrophysics Data System (ADS)
Nussbaum, C. O.; Bossart, P. J.
2012-12-01
Argillaceous formations generally act as aquitards because of their low hydraulic conductivities. This property, together with the large retention capacity of clays for cationic contaminants and the potential for self-sealing, has brought clay formations into focus as potential host rocks for the geological disposal of radioactive waste. Excavated in the Opalinus Clay formation, the Mont Terri underground rock laboratory in the Jura Mountains of NW Switzerland is an important international test site for researching clay formations. Research is carried out in the underground facility, which is located adjacent to the security gallery of the Mont Terri motorway tunnel. Fifteen partners from European countries, USA, Canada and Japan participate in the project. The objectives of the research program are to analyze the hydrogeological, geochemical and rock mechanical properties of the Opalinus Clay, to determine the changes induced by the excavation of galleries and by heating of the rock formation, to test sealing and container emplacement techniques and to evaluate and improve suitable investigation techniques. For the safety of deep geological disposal, it is of key importance to understand the processes occurring in the undisturbed argillaceous environment, as well as the processes in a disturbed system, during the operation of the repository. The objectives are related to: 1. Understanding processes and mechanisms in undisturbed clays and 2. Experiments related to repository-induced perturbations. Experiments of the first group are dedicated to: i) Improvement of drilling and excavation technologies and sampling methods; ii) Estimation of hydrogeological, rock mechanical and geochemical parameters of the undisturbed Opalinus Clay. Upscaling of parameters from laboratory to in situ scale; iii) Geochemistry of porewater and natural gases; evolution of porewater over time scales; iv) Assessment of long-term hydraulic transients associated with erosion and thermal scenarios and v) Evaluation of diffusion and retention parameters for long-lived radionuclides. Experiments related to repository-induced perturbations are focused on: i) Influence of rock liner on the disposal system and the buffering potential of the host rock; ii) Self-sealing processes in the excavation damaged zone; iii) Hydro-mechanical coupled processes (e.g. stress redistributions and pore pressure evolution during excavation); iv) Thermo-hydro-mechanical-chemical coupled processes (e.g. heating of bentonite and host rock) and v) Gas-induced transport of radionuclides in porewater and along interfaces in the engineered barrier system. A third research direction is to demonstrate the feasibility of repository construction and long-term safety after repository closure. Demonstration experiments can contribute to improving the reliability of the scientific basis for the safety assessment of future geological repositories, particularly if they are performed on a large scale and with a long duration. These experiments include the construction and installation of engineered barriers on a 1:1 scale: i) Horizontal emplacement of canisters; ii) Evaluation of the corrosion of container materials; repository re-saturation; iii) Sealing of boreholes and repository access tunnels and iv) Long-term monitoring of the repository.
NASA Astrophysics Data System (ADS)
Pohlmann, K. F.; Zhu, J.; Ye, M.; Carroll, R. W.; Chapman, J. B.; Russell, C. E.; Shafer, D. S.
2006-12-01
Yucca Mountain (YM), Nevada has been recommended as a deep geological repository for the disposal of spent fuel and high-level radioactive waste. If YM is licensed as a repository by the Nuclear Regulatory Commission, it will be important to identify the potential for radionuclides to migrate from underground nuclear testing areas located on the Nevada Test Site (NTS) to the hydraulically downgradient repository area to ensure that monitoring does not incorrectly attribute repository failure to radionuclides originating from other sources. In this study, we use the Death Valley Regional Flow System (DVRFS) model developed by the U.S. Geological Survey to investigate potential groundwater migration pathways and associated travel times from the NTS to the proposed YM repository area. Using results from the calibrated DVRFS model and the particle tracking post-processing package MODPATH we modeled three-dimensional groundwater advective pathways in the NTS and YM region. Our study focuses on evaluating the potential for groundwater pathways between the NTS and YM withdrawal area and whether travel times for advective flow along these pathways coincide with the prospective monitoring time frame at the proposed repository. We include uncertainty in effective porosity as this is a critical variable in the determination of time for radionuclides to travel from the NTS region to the YM withdrawal area. Uncertainty in porosity is quantified through evaluation of existing site data and expert judgment and is incorporated in the model through Monte Carlo simulation. Since porosity information is limited for this region, the uncertainty is quite large and this is reflected in the results as a large range in simulated groundwater travel times.
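The porosity uncertainty treatment can be sketched as a simple Monte Carlo: sample effective porosity, convert a fixed Darcy flux along a path to seepage velocity, and accumulate a distribution of advective travel times. All numbers below are illustrative, not DVRFS values.

```python
# Monte Carlo over effective porosity: travel time = L / (q / n) = L * n / q.
import numpy as np

rng = np.random.default_rng(42)
path_length_m = 40_000.0          # NTS to YM withdrawal area, notional
darcy_flux_m_per_yr = 0.05        # from a calibrated flow model, notional

porosity = rng.uniform(0.001, 0.05, size=10_000)   # wide range: sparse data
travel_time_yr = path_length_m * porosity / darcy_flux_m_per_yr

print(f"5th-95th percentile travel time: "
      f"{np.percentile(travel_time_yr, 5):,.0f} to "
      f"{np.percentile(travel_time_yr, 95):,.0f} years")
```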
NASA Astrophysics Data System (ADS)
Massmann, J.; Nagel, T.; Bilke, L.; Böttcher, N.; Heusermann, S.; Fischer, T.; Kumar, V.; Schäfers, A.; Shao, H.; Vogel, P.; Wang, W.; Watanabe, N.; Ziefle, G.; Kolditz, O.
2016-12-01
As part of the German site selection process for a high-level nuclear waste repository, different repository concepts in the geological candidate formations rock salt, claystone and crystalline rock are being discussed. An open assessment of these concepts using numerical simulations requires physical models capturing the individual particularities of each rock type and associated geotechnical barrier concept to a comparable level of sophistication. In a joint work group of the Helmholtz Centre for Environmental Research (UFZ) and the German Federal Institute for Geosciences and Natural Resources (BGR), scientists of the UFZ are developing and implementing multiphysical process models while BGR scientists apply them to large-scale analyses. The advances in simulation methods for waste repositories are incorporated into the open-source code OpenGeoSys. Here, recent application-driven progress in this context is highlighted. A robust implementation of visco-plasticity with temperature-dependent properties into a framework for the thermo-mechanical analysis of rock salt will be shown. The model enables the simulation of heat transport along with its consequences on the elastic response as well as on primary and secondary creep or the occurrence of dilatancy in the repository near field. Transverse isotropy, non-isothermal hydraulic processes and their coupling to mechanical stresses are taken into account for the analysis of repositories in claystone. These processes are also considered in the near-field analyses of engineered barrier systems, including the swelling/shrinkage of the bentonite material. The temperature-dependent saturation evolution around the heat-emitting waste container is described by different multiphase flow formulations. For all mentioned applications, we illustrate the workflow from model development and implementation, over verification and validation, to repository-scale application simulations using methods of high performance computing.
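Temperature-dependent creep of rock salt is commonly represented by a Norton power law with Arrhenius temperature dependence; the generic form below (not necessarily OpenGeoSys's actual constitutive model, and with invented parameters) illustrates why heat-emitting waste accelerates near-field creep.

```python
# Generic Norton-Arrhenius secondary creep law: rate = A * exp(-Q/RT) * sigma^n.
# Parameters are illustrative, not calibrated salt properties.
import math

def steady_creep_rate(stress_mpa, temperature_k,
                      a=0.18, n=5.0, activation_q=54_000.0, r=8.314):
    """Secondary creep rate (1/day) for the given stress and temperature."""
    return a * math.exp(-activation_q / (r * temperature_k)) * stress_mpa**n

for t_k in (300.0, 350.0):   # ambient vs. heated near-field salt
    print(t_k, steady_creep_rate(stress_mpa=10.0, temperature_k=t_k))
```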
Rolling Deck to Repository (R2R): A "Linked Data" Approach for the U.S. Academic Research Fleet
NASA Astrophysics Data System (ADS)
Arko, R.; Chandler, C.; Clark, P.; Milan, A.; Mize, J.
2012-04-01
The Rolling Deck to Repository (R2R; http://rvdata.us/) program is developing infrastructure to routinely document, assess, and preserve the underway sensor data from U.S. academic research vessels. The R2R master catalog of vessels, instrument systems, operating institutions, cruises, personnel, data sets, event logs, and field reports has grown to over 2,200 cruises in less than two years, and is now accessible via Web services. This catalog is of great value to peer data systems, ranging from large national and international data centers to small disciplinary data offices, as an aid in quality controlling their own collections and finding related data from authoritative sources. R2R breaks with the tradition of stovepipe portals built around complex search interfaces tightly bound to backend databases. Instead, we have adopted a Linked Data approach to publish our catalog content, based on the W3C Resource Description Framework (RDF) and Uniform Resource Identifiers (URIs). Our data model is published as a collection of W3C Simple Knowledge Organization System (SKOS) concepts, mapped to partner vocabularies such as those developed by the Global Change Master Directory (GCMD) and the pan-European SeaDataNet partnership, and our catalog content is published as collections of RDF resources with globally unique and persistent identifiers. The combination of exposing our data model, mapping local terms to community-wide vocabularies, and using reliable identifiers improves interoperability and reduces ambiguity. R2R's metric of success is the degree to which peer data systems harvest and reuse our content. R2R is working collaboratively with the NOAA National Data Centers and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) on a range of Linked Data pilot applications, including production of ISO-compliant metadata and deployment of an RDF Query Language (SPARQL) interface. Our objective is to support a distributed, loosely federated network of complementary systems that collectively manage the vast body of ocean science data. We will present results and lessons learned.
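A small rdflib sketch of the Linked Data pattern described above: a cruise gets a persistent URI, its metadata becomes RDF triples, and a local term is mapped to a community vocabulary via SKOS. All URIs and terms below are illustrative, not R2R's actual identifiers.

```python
# Publish a cruise record as RDF and map a local concept with SKOS.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, SKOS

R2R = Namespace("http://rvdata.us/resource/")   # illustrative namespace
g = Graph()

cruise = URIRef(R2R["cruise/AT26-01"])          # persistent identifier
g.add((cruise, RDF.type, R2R.Cruise))
g.add((cruise, R2R.vessel, Literal("Atlantis")))

# map a local instrument concept to a community vocabulary term
local_term = URIRef(R2R["concept/ctd"])
g.add((local_term, SKOS.exactMatch,
       URIRef("http://vocab.nerc.ac.uk/collection/L05/current/130/")))

print(g.serialize(format="turtle"))
```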
NASA Astrophysics Data System (ADS)
Gil, Y.; Duffy, C.
2015-12-01
This paper proposes the concept of a "Computable Catchment", which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a PDF, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier, allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future land use or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document, maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future, representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitschkowetz, N.; Vickers, D.L.
This report provides a summary of the Computer-aided Acquisition and Logistic Support (CALS) Test Network (CTN) Laboratory Acceptance Test (LAT) and User Application Test (UAT) activities undertaken to evaluate the CALS capabilities being implemented as part of the Department of Defense (DOD) engineering repositories. Although the individual testing activities provided detailed reports for each repository, a synthesis of the results, conclusions, and recommendations is offered to provide a more concise presentation of the issues and the strategies, as viewed from the CTN perspective.
YUCCA MOUNTAIN SITE DESCRIPTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
A.M. Simmons
The "Yucca Mountain Site Description" summarizes, in a single document, the current state of knowledge and understanding of the natural system at Yucca Mountain. It describes the geology; geochemistry; past, present, and projected future climate; regional hydrologic system; and flow and transport within the unsaturated and saturated zones at the site. In addition, it discusses factors affecting radionuclide transport, the effect of thermal loading on the natural system, and tectonic hazards. The "Yucca Mountain Site Description" is broad in nature. It summarizes investigations carried out as part of the Yucca Mountain Project since 1988, but it also includes work done at the site in earlier years, as well as studies performed by others. The document has been prepared under the Office of Civilian Radioactive Waste Management quality assurance program for the Yucca Mountain Project. Yucca Mountain is located in Nye County in southern Nevada. The site lies in the north-central part of the Basin and Range physiographic province, within the northernmost subprovince commonly referred to as the Great Basin. The basin and range physiography reflects the extensional tectonic regime that has affected the region during the middle and late Cenozoic Era. Yucca Mountain was initially selected for characterization, in part, because of its thick unsaturated zone, its arid to semiarid climate, and the existence of a rock type that would support excavation of stable openings. In 1987, the United States Congress directed that Yucca Mountain be the only site characterized to evaluate its suitability for development of a geologic repository for high-level radioactive waste and spent nuclear fuel.
Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences
NASA Astrophysics Data System (ADS)
Smith, P., II
2015-12-01
Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences, such as data description and representation. Lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Some researchers of EarthCube-funded projects have identified these issues as gaps. These gaps can contribute to interoperability issues in data access, discovery, and integration between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice for contribution to improved understanding of the Earth's physical and biological systems. The USGS CDI SSF can be used as a reference model to map to EarthCube-funded projects, with academic research libraries facilitating the data and information asset components of the USGS CDI SSF via institutional repositories and/or digital content management. This session will explore the USGS CDI SSF for cross-discipline collaboration considerations from a library perspective.
A standard-enabled workflow for synthetic biology.
Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach
2017-06-15
A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.
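The workflow's key property, that every tool boundary exchanges a standard document rather than a tool-specific one, can be sketched with stand-in stages; the three functions below are hypothetical placeholders for real repository, design, and modeling tools that read and write SBOL and SBML.

```python
# Hypothetical pipeline stand-ins: each stage consumes/produces a design
# document so tools can be chained regardless of vendor.
def fetch_parts_from_repository(part_ids):
    """Stand-in for querying a parts repository; returns SBOL-like records."""
    return [{"id": p, "role": "promoter" if p.startswith("P") else "cds"}
            for p in part_ids]

def compose_circuit(parts):
    """Stand-in for a sequence-level design tool producing an SBOL design."""
    return {"design": "my_circuit", "parts": parts}

def export_model(design):
    """Stand-in for generating an SBML model from the SBOL design."""
    return f"<sbml><!-- model generated from {design['design']} --></sbml>"

design = compose_circuit(fetch_parts_from_repository(["P_tet", "gfp"]))
print(export_model(design))
```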
Modeling Guru: Knowledge Base for NASA Modelers
NASA Astrophysics Data System (ADS)
Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.
2009-05-01
Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but users must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the same document, even allowing the author to select those who may edit and approve the document. To maintain knowledge integrity, all documents are moderated before they are visible to the public. Modeling Guru, running on Clearspace by Jive Software, has been an active resource for the NASA modeling and HEC communities for more than a year and currently has more than 100 active users. SIVO will soon install live instant messaging support, as well as a user-customizable homepage with social-networking features. In addition, SIVO plans to implement a large dataset/file storage capability so that users can quickly and easily exchange datasets and files with one another. Continued active community participation combined with periodic software updates and improved features will ensure that Modeling Guru remains a vibrant, effective, easy-to-use tool for the NASA scientific community.
Progress and challenges associated with digitizing and serving up Hawaii's geothermal data
NASA Astrophysics Data System (ADS)
Thomas, D. M.; Lautze, N. C.; Abdullah, M.
2012-12-01
This presentation will report on the status of our effort to digitize and serve up Hawaii's geothermal information, an undertaking that commenced in 2011 and will continue through at least 2013. This work is part of a national project that is funded by the Department of Energy and managed by the Arizona Geological Survey (AZGS). The data submitted to AZGS are being entered into the National Geothermal Data System (see http://www.stategeothermaldata.org/overview). We are also planning to host the information locally. Main facets of this project are to: - digitize and generate metadata for non-published geothermal documents relevant to the State of Hawaii - digitize ~100 years of paper records relevant to well permitting and water resources development and serve up information on the ~4500 water wells in the state - digitize, organize, and serve up information on research and geothermal exploratory drilling conducted from the 1980s to the present - work with AZGS and OneGeology to contribute a geologic map for Hawaii that integrates geologic and geothermal resource data. By December 2012, we anticipate that the majority of the digitization will be complete, the geologic map will be approved, and that over 1000 documents will be hosted online through the University of Hawaii's library system (in the "Geothermal Collection" within the "ScholarSpace" repository, see http://scholarspace.manoa.hawaii.edu/handle/10125/21320). Developing a user-friendly web interface for the water well and drilling data will be a main task in the coming year. Challenges we have faced and anticipate include: 1) ensuring that no personally identifiable information (e.g. SSN, private telephone numbers, bank or credit account numbers) is contained in the geothermal documents and well files; 2) Homeland Security regulations regarding release of information on critical infrastructure related to municipal water supply systems; and 3) maintenance of the well database as future well data are developed with the state's expanding inventory of wells to meet private and public needs. Feedback is welcome.
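One screening step mentioned above, checking digitized documents for personally identifiable information before release, can be approximated with pattern matching; the sketch below flags obvious SSN- and phone-like strings in OCR'd text (real screening would still require human review).

```python
# Flag candidate PII patterns in digitized document text before release.
import re

PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\(?\d{3}\)?[-. ]\d{3}[-.]\d{4}\b"),
}

def flag_pii(text):
    """Return a dict of pattern name -> matches found in the text."""
    return {name: pat.findall(text) for name, pat in PII_PATTERNS.items()
            if pat.search(text)}

print(flag_pii("Driller contact 808-555-0148, permit filed 1987."))
# {'phone': ['808-555-0148']}
```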
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Gentle, J.
2015-12-01
The multi-criteria decision support system (MCSDSS) is a newly completed application for touch-enabled group decision support that uses D3 data visualization tools, a geojson conversion utility that we developed, and Paralelex to create an interactive tool. The MCSDSS is a prototype system intended to demonstrate the potential capabilities of a single page application (SPA) running atop a web and cloud based architecture utilizing open source technologies. The application is implemented on current web standards while supporting human interface design that targets both traditional mouse/keyboard interactions and modern touch/gesture enabled interactions. The technology stack for MCSDSS was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range between simple data display and complex scientific simulation-based modelling and analytics. The application integrates current frameworks for highly performant agile development with unit testing, statistical analysis, data visualization, mapping technologies, geographic data manipulation, and cloud infrastructure while retaining support for traditional HTML5/CSS3 web standards. The software lifecycle for MCSDSS has followed best practices for developing, sharing, and documenting the codebase and application. Code is documented and shared via an online repository with the option for programmers to see, contribute, or fork the codebase. Example data files and tutorial documentation have been shared with clear descriptions and data object identifiers. The metadata about the application has also been incorporated into an OntoSoft entry to ensure that MCSDSS is searchable and clearly described. MCSDSS is a flexible platform that allows for data fusion and inclusion of large datasets in an interactive front-end application capable of connecting with other science-based applications and advanced computing resources. In addition, MCSDSS offers functionality that enables communication with non-technical users for policy, education, or engagement with groups around scientific topics with societal relevance.
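An illustrative sketch of a geojson conversion step like the utility mentioned above: turn tabular site records into a GeoJSON FeatureCollection that D3 or any web map can render. The input records and field names are invented.

```python
# Convert tabular lat/lon records into a GeoJSON FeatureCollection.
import json

def to_geojson(records):
    return {
        "type": "FeatureCollection",
        "features": [{
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [r["lon"], r["lat"]]},
            "properties": {k: v for k, v in r.items()
                           if k not in ("lon", "lat")},
        } for r in records],
    }

sites = [{"name": "well_A", "lon": -97.74, "lat": 30.27, "score": 0.82}]
print(json.dumps(to_geojson(sites), indent=2))
```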
Implementation of an OAIS Repository Using Free, Open Source Software
NASA Astrophysics Data System (ADS)
Flathers, E.; Gessler, P. E.; Seamon, E.
2015-12-01
The Northwest Knowledge Network (NKN) is a regional data repository located at the University of Idaho that focuses on the collection, curation, and distribution of research data. To support our home institution and others in the region, we offer services to researchers at all stages of the data lifecycle—from grant application and data management planning to data distribution and archive. In this role, we recognize the need to work closely with other data management efforts at partner institutions and agencies, as well as with larger aggregation efforts such as our state geospatial data clearinghouses, data.gov, DataONE, and others. In the past, one of our challenges with monolithic, prepackaged data management solutions is that customization can be difficult to implement and maintain, especially as new versions of the software are released that are incompatible with our local codebase. Our solution is to break the monolith up into its constituent parts, which offers us several advantages. First, any customizations that we make are likely to fall into areas that can be accessed through Application Program Interfaces (API) that are likely to remain stable over time, so our code stays compatible. Second, as components become obsolete or insufficient to meet new demands that arise, we can replace the individual components with minimal effect on the rest of the infrastructure, causing less disruption to operations. Other advantages include increased system reliability, staggered rollout of new features, enhanced compatibility with legacy systems, reduced dependence on a single software company as a point of failure, and the separation of development into manageable tasks. In this presentation, we describe our application of the Service Oriented Architecture (SOA) design paradigm to assemble a data repository that conforms to the Open Archival Information System (OAIS) Reference Model primarily using a collection of free and open-source software. We detail the design of the repository, based upon open standards to support interoperability with other institutions' systems and with future versions of our own software components. We also describe the implementation process, including our use of GitHub as a collaboration tool and code repository.
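The component-swapping argument can be made concrete with a small interface sketch: repository services depend on a narrow storage contract, so the backing implementation can be replaced without touching the rest of the stack. The class and function names are invented, not NKN's actual code.

```python
# Services depend on an abstract storage contract; backends are swappable.
from abc import ABC, abstractmethod

class StorageService(ABC):
    @abstractmethod
    def put(self, object_id: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, object_id: str) -> bytes: ...

class InMemoryStorage(StorageService):
    """Toy backend; a filesystem or object-store backend plugs in the same way."""
    def __init__(self):
        self._store = {}
    def put(self, object_id, data):
        self._store[object_id] = data
    def get(self, object_id):
        return self._store[object_id]

def archive(dataset_id: str, payload: bytes, storage: StorageService):
    storage.put(dataset_id, payload)   # the rest of the repository is unchanged

archive("doi-10.0000-example", b"...data...", InMemoryStorage())
```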
Solar Sail Propulsion Technology Readiness Level Database
NASA Technical Reports Server (NTRS)
Adams, Charles L.
2004-01-01
The NASA In-Space Propulsion Technology (ISPT) Projects Office has been sponsoring two solar sail system design and development hardware demonstration activities over the past 20 months. Able Engineering Company (AEC) of Goleta, CA is leading one team and L'Garde, Inc. of Tustin, CA is leading the other team. Component, subsystem and system fabrication and testing have been completed successfully. The goal of these activities is to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by 2006. These activities will culminate in the deployment and testing of 20-meter solar sail system ground demonstration hardware in the 30-meter-diameter thermal-vacuum chamber at NASA Glenn's Plum Brook Station in 2005. This paper will describe the features of a computer database system that documents the results of the solar sail development activities to date. Illustrations of the hardware components and systems, test results, analytical models, relevant space environment definition and current TRL assessment, as stored and manipulated within the database, are presented. This database could serve as a central repository for all data related to the advancement of solar sail technology sponsored by the ISPT, providing an up-to-date assessment of the TRL of this technology. Current plans are to eventually make the database available to the solar sail community through the Space Transportation Information Network (STIN).
Goss, Elizabeth; Link, Michael P; Bruinooge, Suanna S; Lawrence, Theodore S; Tepper, Joel E; Runowicz, Carolyn D; Schilsky, Richard L
2009-08-20
The American Society of Clinical Oncology (ASCO) Cancer Research Committee designed a qualitative research project to assess the attitudes of cancer researchers and compliance officials regarding compliance with the US Privacy Rule and to identify potential strategies for eliminating perceived or real barriers to achieving compliance. A team of three interviewers asked 27 individuals (13 investigators and 14 compliance officials) from 13 institutions to describe the anticipated approach of their institutions to Privacy Rule compliance in three hypothetical research studies. The interviews revealed that although researchers and compliance officials share the view that patients' cancer diagnoses should enjoy a high level of privacy protection, there are significant tensions between the two groups related to the proper standards for compliance necessary to protect patients. The disagreements are seen most clearly with regard to the appropriate definition of a "future research use" of protected health information in biospecimen and data repositories and the standards for a waiver of authorization for disclosure and use of such data. ASCO believes that disagreements related to compliance and the resulting delays in certain projects and abandonment of others might be eased by additional institutional training programs and consultation on Privacy Rule issues during study design. ASCO also proposes the development of best practices documents to guide 1) creation of data repositories, 2) disclosure and use of data from such repositories, and 3) the design of survivorship and genetics studies.
Sharing Site-Based Research Data: Standardizing and Packaging for Reuse
NASA Astrophysics Data System (ADS)
Gordon, S.; DiLauro, T.; Jett, J. G.; Thomer, A.
2014-12-01
One of the key aims of the Institute of Museum and Library Services-funded Site-Based Data Curation (SBDC) Project[1] is to increase the reuse of data gathered or generated through research at sites like Yellowstone National Park (YNP) by improving its usefulness, discoverability, and accessibility. Toward this goal, SBDC worked closely with a geobiologist conducting fieldwork at YNP to explore existing data practices and held a two-day stakeholder workshop at the park with some of the scientists who study it and the National Park Service (NPS) staff who support research activities there. The resulting workshop report[2] recommends, among other things, improvements to the level of detail and consistency of documentation of data and of its sampling and analysis methods. A set of core metadata elements and domain-specific extension elements is proposed (Appendix 9) to provide a more coherent view into the data. Armed with these findings, we are pursuing approaches that will reduce the effort, complexity, and risk tied to adoption of these recommendations. During our investigation, we discovered the EarthChem templates[3], into which we began mapping the geobiologist's data. We find the Vent Fluids template particularly appropriate and adaptable, as many of the high-interest features at YNP are shallow water vents. We are currently building an EarthChem-compatible template that will capture the environmental context of microbes, tracing their identities from water sample through to GenBank entry. Given the variety of potential targets (e.g., site, institutional, and domain repositories; visualization and presentation tools), we decided to record the data in a structured package, which we can transform for a given target. We are using the Data Conservancy's Packaging Tool[4], which provides an intuitive file system view, stores file checksums, and serializes a graph of relationships. This permits a researcher to conveniently group desired data products into a single self-documenting TAR or Zip file. Initial target repositories are NPS's IRMA, Data Conservancy, and SEAD/Medici. [1] http://cirssweb.lis.illinois.edu/SBDC [2] https://www.ideals.illinois.edu/handle/2142/47070 [3] http://www.earthchem.org/data/templates [4] http://dataconservancy.org/dcs-packaging-specification/
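A hedged sketch of the packaging idea: compute a checksum manifest for a set of data files and bundle the files plus manifest into one archive. The Data Conservancy tool also serializes a relationship graph, which is omitted here; the manifest layout below is a BagIt-like convention, not the tool's exact format.

```python
# Build a self-documenting package: sha256 manifest plus a gzipped tarball.
import hashlib
import tarfile
from pathlib import Path

def package(data_dir, out_tar):
    data_dir = Path(data_dir)
    files = sorted(p for p in data_dir.rglob("*")
                   if p.is_file() and p.name != "manifest-sha256.txt")
    manifest_lines = []
    for path in files:
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        manifest_lines.append(f"{digest}  {path.relative_to(data_dir)}")
    (data_dir / "manifest-sha256.txt").write_text("\n".join(manifest_lines))
    with tarfile.open(out_tar, "w:gz") as tar:
        tar.add(data_dir, arcname=data_dir.name)
```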
Geosamples.org: Shared Cyberinfrastructure for Geoscience Samples
NASA Astrophysics Data System (ADS)
Lehnert, Kerstin; Allison, Lee; Arctur, David; Klump, Jens; Lenhardt, Christopher
2014-05-01
Many scientific domains, specifically in the geosciences, rely on physical samples as basic elements for study and experimentation. Samples are collected to analyze properties of natural materials and features that are key to our knowledge of Earth's dynamical systems and evolution, and to preserve a record of our environment over time. Huge volumes of samples have been acquired over decades or even centuries and stored in a large number and variety of institutions including museums, universities and colleges, state geological surveys, federal agencies, and industry. All of these collections represent highly valuable, often irreplaceable records of nature that need to be accessible so that they can be re-used in future research and for educational purposes. Many sample repositories are keen to use cyberinfrastructure capabilities to enhance access to their collections on the internet and to support and streamline collection management (accessioning of new samples, labeling, handling sample requests, etc.), but encounter substantial challenges and barriers to integrating digital sample management into their daily routine. They lack the resources (staff, funding) and infrastructure (hardware, software, IT support) to develop and operate web-enabled databases, to migrate analog sample records into digital data management systems, and to transfer paper- or spreadsheet-based workflows to electronic systems. Use of commercial software is often not an option as it incurs high costs for licenses, requires IT expertise for installation and maintenance, and often does not match the needs of the smaller repositories, being designed for large museums or different types of collections (art, archeological, biological). Geosamples.org is an alliance of sample repositories (academic, US federal and state surveys, industry) and data facilities that aims to develop a cyberinfrastructure that will dramatically advance access to physical samples for the research community, government agencies, students, educators, and the general public, while supporting, simplifying, and standardizing the work of curators in repositories, museums, and universities, and even of individual investigators who manage personal or project-based sample collections in their labs. Geosamples.org builds upon best practices and cyberinfrastructure for sample identification, registration, and documentation developed by the IGSN e.V., an international organization that governs the International Geosample Number, a persistent unique identifier for physical samples. Geosamples.org will develop a Digital Environment for Sample Curation (DESC) that will facilitate the creation, identification, and registration of 'virtual samples' and network them into an 'Internet of Samples' that will allow users to discover, access, and track online the physical samples, the data derived from their study, and the publications that contain these data. DESC will provide easy-to-use software tools for curators to maintain digital catalogs of their collections, to provide online access to those catalogs for searching and requesting samples, to manage sample requests and users, and to track collection usage and impact. Geosamples.org will also work toward joint practices for the recognition of intellectual property, build mechanisms to create sustainable business models for the continuing maintenance and evolution of sample resource management, and integrate the sample management life cycle into the professional and cultural practice of science.
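Sample registration against an IGSN-style service can be sketched as minting a locally unique code and posting minimal metadata to a registration endpoint. The endpoint URL and payload schema below are hypothetical; the actual IGSN services define their own APIs.

```python
# Hypothetical IGSN-style registration call; the endpoint and schema are
# invented for illustration and no real service is contacted here.
import json
import urllib.request

def register_sample(prefix, local_code, metadata,
                    endpoint="https://registration.example.org/igsn"):
    igsn = f"{prefix}{local_code}"        # e.g. 'IEABC' + '000001'
    payload = json.dumps({"igsn": igsn, "metadata": metadata}).encode()
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:   # would fail without a server
        return igsn, resp.status

# register_sample("IEABC", "000001", {"material": "basalt", "lat": 19.4})
```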
OWLing Clinical Data Repositories With the Ontology Web Language
Pastor, Xavier; Lozano, Esther
2014-01-01
Background: The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools that answer the professional needs of researchers, and it constitutes a barrier whose removal demands innovative solutions. Objective: The objective was to design and implement a framework for the development of clinical data repositories capable of coping with continuous change in the biomedical domain while minimizing the technical knowledge required of end users. Methods: We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient in dealing with complexity and change, integrated with solid relational storage and a Web graphical user interface. Results: Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It requires no database design or programming: all the information needed to define a new project is stated explicitly in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, while data are stored in a generic repository. This allows immediate deployment and population of the database, as well as instant online availability of any modification. Conclusions: OntoCRF is a complete framework for building data repositories with solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems, and it does not require highly skilled technical staff, which facilitates the engineering of clinical software systems. PMID:25599697
OWLing Clinical Data Repositories With the Ontology Web Language.
Lozano-Rubí, Raimundo; Pastor, Xavier; Lozano, Esther
2014-08-01
The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools that answer the professional needs of researchers, and it constitutes a barrier whose removal demands innovative solutions. The objective was to design and implement a framework for the development of clinical data repositories capable of coping with continuous change in the biomedical domain while minimizing the technical knowledge required of end users. We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient in dealing with complexity and change, integrated with solid relational storage and a Web graphical user interface. Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It requires no database design or programming: all the information needed to define a new project is stated explicitly in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, while data are stored in a generic repository. This allows immediate deployment and population of the database, as well as instant online availability of any modification. OntoCRF is a complete framework for building data repositories with solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems, and it does not require highly skilled technical staff, which facilitates the engineering of clinical software systems.
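To make the ontology-driven approach concrete, here is a minimal sketch, assuming a project ontology stored in an OWL file, of how form fields could be derived from datatype properties. This is an illustration in the spirit of OntoCRF, not its actual code; the file name is an assumption.

    # Sketch: derive web-form fields from an OWL ontology with rdflib,
    # in the spirit of OntoCRF (not its actual code). The file name is
    # hypothetical.
    from rdflib import Graph, RDF, RDFS, OWL

    g = Graph()
    g.parse("study_definition.owl")  # hypothetical project ontology

    fields = []
    for prop in g.subjects(RDF.type, OWL.DatatypeProperty):
        label = g.value(prop, RDFS.label) or prop.split("#")[-1]
        rng = g.value(prop, RDFS.range)  # e.g. xsd:string, xsd:date
        fields.append({"name": str(label), "datatype": str(rng)})

    # A UI layer could render 'fields' as web-form inputs on the fly, so
    # editing the ontology changes the form without any reprogramming.
    for field in fields:
        print(field)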
Enriching text with images and colored light
NASA Astrophysics Data System (ADS)
Sekulovski, Dragan; Geleijnse, Gijs; Kater, Bram; Korst, Jan; Pauws, Steffen; Clout, Ramon
2008-01-01
We present an unsupervised method to enrich textual applications with relevant images and colors. The images are collected by querying large image repositories, and the colors are subsequently computed using image processing. A prototype system based on this method is presented in which the method is applied to song lyrics; in combination with a lyrics synchronization algorithm, the system produces a rich multimedia experience. To identify terms within the text that may be associated with images and colors, we select noun phrases using a part-of-speech tagger. Large image repositories are queried with these terms. Per term, representative colors are extracted from the collected images using either a histogram-based or a mean shift-based algorithm. The representative color extraction exploits the non-uniform distribution of the colors found in the large repositories. The images ranked best by the search engine are displayed on a screen, while the extracted representative colors are rendered on controllable lighting devices in the living room. We evaluate our method by comparing the computed colors to standard color representations of a set of English color terms. A second evaluation focuses on the distance in color between a term queried in English and its translation in a foreign language. Based on results from three sets of terms, a KL divergence-based measure of a term's suitability for color extraction is proposed. Finally, we compare the performance of the algorithm using either the automatically indexed repository of Google Images or the manually annotated Flickr.com. Based on the results of these experiments, we conclude that the presented method can compute the relevant color for a term using a large image repository and image processing.
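A minimal sketch of the histogram-based variant, assuming images have already been retrieved and decoded as RGB arrays; the bin count and smoothing constant are illustrative choices, not the authors' parameters.

    # Sketch: histogram-based representative-color extraction plus a
    # KL-divergence comparison of two terms' hue distributions. Assumes
    # images are already fetched as (H, W, 3) uint8 RGB arrays; the bin
    # count and smoothing constant are illustrative, not the paper's.
    import numpy as np
    from matplotlib.colors import rgb_to_hsv

    def hue_histogram(images, bins=36):
        """Pool all pixels into a normalized hue histogram."""
        hues = np.concatenate(
            [rgb_to_hsv(img.reshape(-1, 3) / 255.0)[:, 0] for img in images]
        )
        hist, _ = np.histogram(hues, bins=bins, range=(0.0, 1.0))
        return hist / hist.sum()

    def representative_hue(hist):
        """Use the modal bin's center as the term's representative hue."""
        return (np.argmax(hist) + 0.5) / len(hist)

    def kl_divergence(p, q, eps=1e-9):
        """D_KL(p || q), smoothed to avoid division by zero."""
        p, q = (p + eps) / (p + eps).sum(), (q + eps) / (q + eps).sum()
        return float(np.sum(p * np.log(p / q)))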
ENGINEERED BARRIER SYSTEM: PHYSICAL AND CHEMICAL ENVIRONMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Jarek
2005-08-29
The purpose of this model report is to describe the evolution of the physical and chemical environmental conditions within the waste emplacement drifts of the repository, including the drip shield and waste package surfaces. The resulting seepage evaporation and gas abstraction models are used in the total system performance assessment for the license application (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. This report develops and documents a set of abstraction-level models that describe the engineered barrier system physical and chemical environment. Where possible, these models use information directly from other reports as input, which promotes integration among process models used for TSPA-LA. Specific tasks and activities of modeling the physical and chemical environment are included in "Technical Work Plan for: Near-Field Environment and Transport In-Drift Geochemistry Model Report Integration" (BSC 2005 [DIRS 173782], Section 1.2.2). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system reports. To be consistent with other project documents that address features, events, and processes (FEPs), Table 6.14-1 of the current report includes updates to FEP numbers and FEP subjects for two FEPs identified in the technical work plan (TWP) governing this report (BSC 2005 [DIRS 173782]). FEP 2.1.09.06.0A (Reduction-oxidation potential in EBS), as listed in Table 2 of the TWP (BSC 2005 [DIRS 173782]), has been updated in the current report to FEP 2.1.09.06.0B (Reduction-oxidation potential in Drifts; see Table 6.14-1). FEP 2.1.09.07.0A (Reaction kinetics in EBS), as listed in Table 2 of the TWP (BSC 2005 [DIRS 173782]), has been updated in the current report to FEP 2.1.09.07.0B (Reaction kinetics in Drifts; see Table 6.14-1). These deviations from the TWP are justified because they improve integration with the FEPs documents, and the updates have no impact on the model developed in this report.
Burke, Lauri
2010-01-01
This document serves as the repository for the unprocessed data used in the investigation of temperature and overpressure relations within the deep Tuscaloosa Formation in Judge Digby field. It is a compilation of all the publicly accessible wellbore temperature and pressure data for Judge Digby field, a prolific natural gas field producing from the Upper Cretaceous lower part of the Tuscaloosa Formation in the Gulf Coast region. This natural gas field is in Pointe Coupee Parish in the southern part of onshore Louisiana.
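As a brief illustration of the kind of analysis such a compilation supports (an illustrative sketch, not part of the report): overpressured intervals can be flagged by comparing the measured pressure gradient at each depth with a hydrostatic reference. The 0.465 psi/ft figure is a commonly cited Gulf Coast brine gradient, and the data values below are hypothetical.

    # Sketch: flag overpressure by comparing measured pressure gradients
    # with a hydrostatic reference. 0.465 psi/ft is a commonly cited
    # Gulf Coast brine gradient; depths and pressures are hypothetical.
    HYDROSTATIC_PSI_PER_FT = 0.465

    measurements = [  # (depth_ft, pressure_psi) - hypothetical values
        (15000, 7200),
        (18000, 12400),
        (20500, 17800),
    ]

    for depth_ft, pressure_psi in measurements:
        gradient = pressure_psi / depth_ft
        flag = "overpressured" if gradient > HYDROSTATIC_PSI_PER_FT else "normal"
        print(f"{depth_ft} ft: {gradient:.3f} psi/ft -> {flag}")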
NASA Technical Reports Server (NTRS)
Scholz, A. L.; Hart, M. T.; Lowry, D. J.
1987-01-01
The Technology Information Sheet was assembled in database format during Phase I. This document was designed to provide a repository for information pertaining to 144 Operations and Maintenance Instructions (OMI)-controlled operations in the Orbiter Processing Facility (OPF), Vehicle Assembly Building (VAB), and PAD. It provides a way to accumulate information about required crew sizes, operations task time duration (serial and/or parallel), special Ground Support Equipment (GSE) required, and identification of either a potential application of existing technology or the need for the development of a new technology item.
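A minimal sketch of how one record of such a sheet might be structured; the field names are inferred from the description above and are illustrative, not the original database schema.

    # Sketch: an illustrative record structure for a Technology
    # Information Sheet entry; field names are inferred from the
    # abstract, not taken from the original database schema.
    from dataclasses import dataclass, field

    @dataclass
    class TechInfoSheetEntry:
        omi_number: str                # OMI-controlled operation identifier
        facility: str                  # "OPF", "VAB", or "PAD"
        crew_size: int                 # required crew size
        serial_hours: float            # task duration performed serially
        parallel_hours: float          # task duration performed in parallel
        special_gse: list = field(default_factory=list)  # special GSE needed
        technology_item: str = ""      # existing or candidate new technology

    entry = TechInfoSheetEntry(
        omi_number="OMI-V1234",        # hypothetical identifier
        facility="OPF",
        crew_size=4,
        serial_hours=6.5,
        parallel_hours=2.0,
        special_gse=["access platform"],
        technology_item="automated inspection (candidate)",
    )
    print(entry)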
Framework Development Supporting the Safety Portal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven Ralph; Kvarfordt, Kellie Jean; Vang, Leng
2015-07-01
In a collaborative scientific research arena, it is important to have an environment where analysts have access to a shared repository of information, documents, and software tools, and are able to accurately maintain and track historical changes in models. The new Safety Portal cloud-based environment will be accessible remotely from anywhere and on any computing platform, provided the platform has Internet access and adequate browser capabilities. Information stored in this environment is restricted based on user-assigned credentials. This report discusses the current development of a cloud-based web portal for PRA tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allison, Lee; Chickering, Cathy; Anderson, Arlene
2013-09-23
Since the 2009 American Recovery and Reinvestment Act, the U.S. Department of Energy's Geothermal Technologies Office has funded $33.7 million for multiple data digitization and aggregation projects focused on making vast amounts of geothermal-relevant data available to industry for advancing geothermal exploration. These projects are collectively part of the National Geothermal Data System (NGDS), a distributed, networked system for maintaining, sharing, and accessing data in an effort to lower the levelized cost of electricity (LCOE). Who owns and who maintains the NGDS and its data nodes (repositories in the distributed system) is yet to be determined. However, the investment in building and populating the NGDS has been substantial, both in dollars and in time; it is critical that this investment be protected by ensuring the sustainability of the data, the software and systems, and the accessibility of the data. Only then will the benefits be fully realized. Keeping this operational system sustainable will require four core elements: continued serving of data and applications; maintenance of system operations; a governance structure; and an effective business model. Each of these presents a number of challenges. Data being added to the NGDS are not strictly geothermal but include data considered relevant to geothermal exploration and development, such as vast numbers of oil and gas wells and groundwater wells, among other data. These are relevant to a broader base of users. By diversifying the client base to other users and other fields, the cost of maintaining core infrastructure can be spread across an array of stakeholders and clients. It is presumed that NGDS will continue to provide free and open access to its data resources. The next-phase NGDS operation should be structured to eventually pursue revenue streams to help offset sustainability expenses as necessary and appropriate, potentially including income from: grants and contracts (agencies, foundations, private sector); membership; fees for services (consulting, training, customization, 'app' development); repository services (data, services, apps, models, documents, multimedia); advertisements; fees for premier services or applications; subscriptions to value-added services; licenses; contributions and donations; endowments; and sponsorships.
The BioHub Knowledge Base: Ontology and Repository for Sustainable Biosourcing.
Read, Warren J; Demetriou, George; Nenadic, Goran; Ruddock, Noel; Stevens, Robert; Winter, Jerry
2016-06-01
The motivation for the BioHub project is to create an Integrated Knowledge Management System (IKMS) that will enable chemists to source ingredients from bio-renewables, rather than from non-sustainable sources such as fossil oil and its derivatives. The BioHubKB is the data repository of the IKMS; it employs Semantic Web technologies, especially OWL, to host data about chemical transformations, bio-renewable feedstocks, co-product streams and their chemical components. Access to this knowledge base is provided to other modules within the IKMS through a set of RESTful web services, driven by SPARQL queries to a Sesame back-end. The BioHubKB re-uses several bio-ontologies and bespoke extensions, primarily for chemical feedstocks and products, to form its knowledge organisation schema. Parts of plants form feedstocks, while various processes generate co-product streams that contain certain chemicals. Both chemicals and transformations are associated with certain qualities, which the BioHubKB also attempts to capture. Of immediate commercial and industrial importance is estimating the cost of particular sets of chemical transformations (leading to candidate surfactants) performed in sequence, and these costs too are captured. Data are sourced from companies' internal knowledge and document stores, and from the publicly available literature. Both text analytics and manual curation play their part in populating the ontology. We describe the prototype IKMS, the BioHubKB, and the services that it supports for the IKMS. The BioHubKB can be found via http://biohub.cs.manchester.ac.uk/ontology/biohub-kb.owl.
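As an illustration of the SPARQL-driven access pattern described above; the endpoint URL, prefix, and property names in this sketch are assumptions, not BioHub's published interface.

    # Sketch: query a SPARQL endpoint in the style described above. The
    # endpoint URL, prefix, and property names are assumptions, not
    # BioHub's actual published interface.
    from SPARQLWrapper import SPARQLWrapper, JSON

    endpoint = SPARQLWrapper("http://biohub.example.org/sparql")  # hypothetical
    endpoint.setQuery("""
        PREFIX bh: <http://biohub.example.org/ontology#>
        SELECT ?feedstock ?chemical WHERE {
            ?feedstock a bh:Feedstock ;
                       bh:yieldsCoProductStream ?stream .
            ?stream bh:containsChemical ?chemical .
        } LIMIT 10
    """)
    endpoint.setReturnFormat(JSON)

    results = endpoint.query().convert()
    for row in results["results"]["bindings"]:
        print(row["feedstock"]["value"], "->", row["chemical"]["value"])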
NASA Astrophysics Data System (ADS)
Buscheck, T.; Glascoe, L.; Sun, Y.; Gansemer, J.; Lee, K.
2003-12-01
For the proposed Yucca Mountain geologic repository for high-level nuclear waste, the planned method of disposal involves the emplacement of cylindrical packages containing the waste inside horizontal tunnels, called emplacement drifts, bored several hundred meters below the ground surface. The emplacement drifts reside in highly fractured, partially saturated volcanic tuff. An important phenomenological consideration for the licensing of the proposed repository at Yucca Mountain is the generation of decay heat by the emplaced waste and its consequences: changes in temperature will affect the hydrologic and chemical environment at Yucca Mountain. A thermohydrologic modeling tool is necessary to support the performance assessment of the Engineered Barrier System (EBS) of the proposed repository. This modeling tool must simultaneously account for processes occurring at a scale of a few tens of centimeters around individual waste packages, for processes occurring around the emplacement drifts themselves, and for processes occurring at the multi-kilometer scale of the mountain. Many other features must also be considered, including non-isothermal, multiphase flow in fractured porous rock of variable liquid-phase saturation, and thermal radiation and convection in open cavities. The Multiscale Thermohydrologic Model (MSTHM) calculates the following thermohydrologic (TH) variables: temperature, relative humidity, liquid-phase saturation, evaporation rate, air-mass fraction, gas-phase pressure, capillary pressure, and liquid- and gas-phase fluxes. The TH variables are determined as a function of position along each of the emplacement drifts in the repository and as a function of waste-package (WP) type. These variables are determined every 20 m along each emplacement drift, at various generic locations within the drifts (including the waste package and drip-shield surfaces and the invert) and at various generic locations in the adjoining host rock. The MSTHM accounts for 3-D drift-scale and mountain-scale heat flow and captures the influence of the key engineering-design variables and natural-system factors affecting TH conditions in the emplacement drifts and adjoining host rock. Presented is a synopsis of recent MSTHM calculations conducted to support the Total System Performance Assessment for the License Application (TSPA-LA). This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
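To illustrate the shape of the output described above, here is a sketch of the data layout only, not the MSTHM itself; the drift name, waste-package type, and values are hypothetical placeholders.

    # Sketch of the output layout only (not the MSTHM itself): TH
    # variables keyed by drift, station (20 m spacing), and waste-package
    # type. Names and values are hypothetical placeholders.
    import numpy as np

    stations_m = np.arange(0, 600, 20)      # stations along one drift

    th_output = {
        ("drift-07", int(x), "CSNF"): {     # hypothetical drift and WP type
            "temperature_C": 95.0,          # placeholder values
            "relative_humidity": 0.30,
            "liquid_saturation": 0.12,
            "evaporation_rate_kg_per_yr": 1.8,
        }
        for x in stations_m
    }

    # Look up TH conditions at the 100 m station for one WP type:
    print(th_output[("drift-07", 100, "CSNF")]["temperature_C"])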